I need to be able to capture an image of a GLSurfaceView at a certain moment in time. I have the following code:
relative.setDrawingCacheEnabled(true);
screenshot = Bitmap.createBitmap(relative.getDrawingCache());
relative.setDrawingCacheEnabled(false);
Log.v(TAG, "Screenshot height: " + screenshot.getHeight());
image.setImageBitmap(screenshot);
The GLSurfaceView is contained within a RelativeLayout, but I have also tried capturing directly from the GLSurfaceView itself. As far as I can tell, this captures a transparent image, i.e. there is nothing there. Any help would be appreciated.
SurfaceView and GLSurfaceView punch holes in their windows to allow their surfaces to be displayed. In other words, they have transparent areas.
So you cannot capture an image by calling GLSurfaceView.getDrawingCache().
If you want to get an image from GLSurfaceView, you should invoke gl.glReadPixels() in GLSurfaceView.onDrawFrame().
I patched the createBitmapFromGLSurface() method and call it in onDrawFrame().
(The original code might be from skuld's answer.)
private Bitmap createBitmapFromGLSurface(int x, int y, int w, int h, GL10 gl)
throws OutOfMemoryError {
int bitmapBuffer[] = new int[w * h];
int bitmapSource[] = new int[w * h];
IntBuffer intBuffer = IntBuffer.wrap(bitmapBuffer);
intBuffer.position(0);
try {
gl.glReadPixels(x, y, w, h, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, intBuffer);
int offset1, offset2;
for (int i = 0; i < h; i++) {
offset1 = i * w;
offset2 = (h - i - 1) * w;
for (int j = 0; j < w; j++) {
int texturePixel = bitmapBuffer[offset1 + j];
int blue = (texturePixel >> 16) & 0xff;
int red = (texturePixel << 16) & 0x00ff0000;
int pixel = (texturePixel & 0xff00ff00) | red | blue;
bitmapSource[offset2 + j] = pixel;
}
}
} catch (GLException e) {
return null;
}
return Bitmap.createBitmap(bitmapSource, w, h, Bitmap.Config.ARGB_8888);
}
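For completeness, here is a hedged sketch of how this method might be called from the renderer's onDrawFrame(). It is not part of the original answer, and the names drawScene(), takeScreenshot, surfaceWidth, surfaceHeight and lastScreenshot are illustrative.
@Override
public void onDrawFrame(GL10 gl) {
    drawScene(gl); // your normal rendering
    if (takeScreenshot) {
        takeScreenshot = false;
        // onDrawFrame() runs on the GL thread, so glReadPixels() is safe here
        lastScreenshot = createBitmapFromGLSurface(0, 0, surfaceWidth, surfaceHeight, gl);
    }
}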
Here is a complete solution if you are using a third-party library to which you just 'pass in' a GLSurfaceView defined in your layout. In that case you won't have a handle on the renderer's onDrawFrame(), which can be a problem.
To work around this, you need to queue the capture as an event for the GLSurfaceView to handle.
private GLSurfaceView glSurfaceView; // findViewById() in onCreate
private Bitmap snapshotBitmap;
private interface BitmapReadyCallbacks {
void onBitmapReady(Bitmap bitmap);
}
/* Usage code
captureBitmap(new BitmapReadyCallbacks() {
@Override
public void onBitmapReady(Bitmap bitmap) {
someImageView.setImageBitmap(bitmap);
}
});
*/
// supporting methods
private void captureBitmap(final BitmapReadyCallbacks bitmapReadyCallbacks) {
glSurfaceView.queueEvent(new Runnable() {
@Override
public void run() {
EGL10 egl = (EGL10) EGLContext.getEGL();
GL10 gl = (GL10)egl.eglGetCurrentContext().getGL();
snapshotBitmap = createBitmapFromGLSurface(0, 0, glSurfaceView.getWidth(), glSurfaceView.getHeight(), gl);
runOnUiThread(new Runnable() {
@Override
public void run() {
bitmapReadyCallbacks.onBitmapReady(snapshotBitmap);
}
});
}
});
}
// from other answer in this question
private Bitmap createBitmapFromGLSurface(int x, int y, int w, int h, GL10 gl) {
int bitmapBuffer[] = new int[w * h];
int bitmapSource[] = new int[w * h];
IntBuffer intBuffer = IntBuffer.wrap(bitmapBuffer);
intBuffer.position(0);
try {
gl.glReadPixels(x, y, w, h, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, intBuffer);
int offset1, offset2;
for (int i = 0; i < h; i++) {
offset1 = i * w;
offset2 = (h - i - 1) * w;
for (int j = 0; j < w; j++) {
int texturePixel = bitmapBuffer[offset1 + j];
int blue = (texturePixel >> 16) & 0xff;
int red = (texturePixel << 16) & 0x00ff0000;
int pixel = (texturePixel & 0xff00ff00) | red | blue;
bitmapSource[offset2 + j] = pixel;
}
}
} catch (GLException e) {
Log.e(TAG, "createBitmapFromGLSurface: " + e.getMessage(), e);
return null;
}
return Bitmap.createBitmap(bitmapSource, w, h, Config.ARGB_8888);
}
Note: In this code, when I click the button it takes a screenshot as an image and saves it to an SD card location. I used a boolean flag and an if statement in onDrawFrame(), because the renderer may call onDrawFrame() at any time, and without the flag this code could save lots of images to the memory card.
MainActivity class:
protected boolean printOptionEnable = false;
saveImageButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
Log.v("hari", "pan button clicked");
isSaveClick = true;
myRenderer.printOptionEnable = isSaveClick;
}
});
MyRenderer class:
int width_surface , height_surface ;
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
Log.i("JO", "onSurfaceChanged");
// Adjust the viewport based on geometry changes,
// such as screen rotation
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
width_surface = width ;
height_surface = height ;
}
@Override
public void onDrawFrame(GL10 gl) {
try {
if (printOptionEnable) {
printOptionEnable = false ;
Log.i("hari", "printOptionEnable if condition:" + printOptionEnable);
int w = width_surface ;
int h = height_surface ;
Log.i("hari", "w:"+w+"-----h:"+h);
int b[]=new int[(int) (w*h)];
int bt[]=new int[(int) (w*h)];
IntBuffer buffer=IntBuffer.wrap(b);
buffer.position(0);
GLES20.glReadPixels(0, 0, w, h,GLES20.GL_RGBA,GLES20.GL_UNSIGNED_BYTE, buffer);
for(int i=0; i<h; i++)
{
//remember, that OpenGL bitmap is incompatible with Android bitmap
//and so, some correction need.
for(int j=0; j<w; j++)
{
int pix=b[i*w+j];
int pb=(pix>>16)&0xff;
int pr=(pix<<16)&0x00ff0000;
int pix1=(pix&0xff00ff00) | pr | pb;
bt[(h-i-1)*w+j]=pix1;
}
}
Bitmap inBitmap = null ;
if (inBitmap == null || !inBitmap.isMutable()
|| inBitmap.getWidth() != w || inBitmap.getHeight() != h) {
inBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
}
//Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
inBitmap.copyPixelsFromBuffer(buffer);
//return inBitmap ;
// return Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
inBitmap = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
inBitmap.compress(CompressFormat.JPEG, 90, bos);
byte[] bitmapdata = bos.toByteArray();
ByteArrayInputStream fis = new ByteArrayInputStream(bitmapdata);
final Calendar c=Calendar.getInstance();
long mytimestamp=c.getTimeInMillis();
String timeStamp=String.valueOf(mytimestamp);
String myfile="hari"+timeStamp+".jpeg";
dir_image = new File(Environment.getExternalStorageDirectory()+File.separator+
"printerscreenshots"+File.separator+"image");
dir_image.mkdirs();
try {
File tmpFile = new File(dir_image,myfile);
FileOutputStream fos = new FileOutputStream(tmpFile);
byte[] buf = new byte[1024];
int len;
while ((len = fis.read(buf)) > 0) {
fos.write(buf, 0, len);
}
fis.close();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
Log.v("hari", "screenshots:"+dir_image.toString());
}
} catch(Exception e) {
e.printStackTrace();
}
}
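As an aside (a hedged simplification, not part of the original answer), the ByteArrayOutputStream/ByteArrayInputStream round trip could be skipped by compressing the bitmap straight into the FileOutputStream:
File tmpFile = new File(dir_image, myfile);
FileOutputStream fos = new FileOutputStream(tmpFile);
inBitmap.compress(Bitmap.CompressFormat.JPEG, 90, fos); // writes the JPEG directly to the file
fos.close();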
You can use a GLTextureView, which extends TextureView, instead of a GLSurfaceView to show your OpenGL data.
See: http://stackoverflow.com/questions/12061419/converting-from-glsurfaceview-to-textureview-via-gltextureview
As GLTextureView extends TextureView, it has a getBitmap() method that should do the job:
myGlTextureView.getBitmap(int width, int height)
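As a hedged sketch (the view and ImageView names are illustrative), grabbing the current frame could look like this:
// getBitmap() copies the current contents of the TextureView's surface texture
Bitmap snapshot = myGlTextureView.getBitmap(
        myGlTextureView.getWidth(), myGlTextureView.getHeight());
if (snapshot != null) {
    someImageView.setImageBitmap(snapshot); // or compress() it to a file
}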
I have a .png image that I need to tint a custom color with the RGB values of 207, 173, 23. (https://github.com/pret/pokecrystal/blob/master/gfx/tilesets/players_room.png?raw=true)
I did some research and found the following code:
public BufferedImage getBufferedImage(String source, int redPercent, int greenPercent, int bluePercent) throws IOException{
BufferedImage img = null;
File f = null;
try{
f = new File(source);
img = ImageIO.read(f);
}catch(IOException e){
System.out.println(e);
}
int width = img.getWidth();
int height = img.getHeight();
for(int y = 0; y < height; y++){
for(int x = 0; x < width; x++){
int p = img.getRGB(x,y);
int a = (p>>24)&0xff;
int r = (p>>16)&0xff;
int g = (p>>8)&0xff;
int b = p&0xff;
p = (a<<24) | (redPercent*r/100<<16) | (greenPercent*g/100<<8) | (bluePercent*b/100);
img.setRGB(x, y, p);
}
}
return img;
}
This method is supposed to return a buffered image with the entered RGB values. However, whenever I use it, it only returns a lighter or darker version of the image with no color. I am wondering whether the problem lies in the image itself, perhaps having to do with the transparency, or whether the problem is the code.
The problem is that the PNG image is set up to hold greyscale data only, so the BufferedImage img is also only capable of holding greyscale data. To fix this, just create an output BufferedImage in RGB colour mode.
I also tidied up your exception handling.
import java.awt.image.BufferedImage;
import java.io.*;
import javax.imageio.ImageIO;
class SOQuestion {
public static BufferedImage getBufferedImage(String source,
int redPercent, int greenPercent, int bluePercent) {
BufferedImage img = null;
File f = null;
try {
f = new File(source);
img = ImageIO.read(f);
} catch (IOException e) {
System.out.println(e);
return null;
}
int width = img.getWidth();
int height = img.getHeight();
BufferedImage out = new BufferedImage(width, height,
BufferedImage.TYPE_INT_RGB);
for (int y = 0; y < height; y++) {
for (int x = 0; x < width; x++) {
int p = img.getRGB(x,y);
int a = (p>>24) & 0xff;
int r = (p>>16) & 0xff;
int g = (p>>8) & 0xff;
int b = p & 0xff;
p = (a<<24) | (redPercent*r/100<<16) |
(greenPercent*g/100<<8) | (bluePercent*b/100);
out.setRGB(x, y, p);
}
}
return out;
}
public static void main(String[] args) {
BufferedImage result = SOQuestion.getBufferedImage(args[0], 81, 68, 9);
File outputfile = new File("output.png");
try {
ImageIO.write(result, "png", outputfile);
} catch (IOException e) {
System.out.println(e);
}
}
}
I want my app to generate a different QR code for the same input string. The QR code should change every so often. Is it possible to achieve this using the ZXing library on Android?
onClick method{
try {
encriptionString=editText.getText().toString();
Bitmap bitmap = encodeAsBitmap(encriptionString);
imageView.setImageBitmap(bitmap);
} catch (WriterException e) {
e.printStackTrace();
}
}
Bitmap encodeAsBitmap(String str) throws WriterException {
BitMatrix result;
try {
result = new MultiFormatWriter().encode(str,
BarcodeFormat.QR_CODE, 250, 250, null);
} catch (IllegalArgumentException iae) {
// Unsupported format
return null;
}
int w = result.getWidth();
int h = result.getHeight();
int[] pixels = new int[w * h];
for (int y = 0; y < h; y++) {
int offset = y * w;
for (int x = 0; x < w; x++) {
pixels[offset + x] = result.get(x, y) ? BLACK : WHITE;
}
}
Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, w, 0, 0, w, h); // stride should be the bitmap width rather than a hard-coded 250
return bitmap;
}
Currently I enter a simple string and I get my QR code, but I want to know whether it is possible to get a different QR code for the same string.
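One possible approach, sketched here as an assumption rather than part of the original question: encode the input together with a value that changes over time, such as a timestamp rounded to a fixed interval, so ZXing produces a different code even though the logical string stays the same. Whatever scans the code would then need to strip that suffix off again.
// Illustrative helper; the "|" separator and the 30-second interval are assumptions
String rotatingPayload(String input, long intervalMillis) {
    long slot = System.currentTimeMillis() / intervalMillis; // changes once per interval
    return input + "|" + slot; // same input, different payload per interval
}

// used with the existing encodeAsBitmap():
Bitmap bitmap = encodeAsBitmap(rotatingPayload(encriptionString, 30000L));
imageView.setImageBitmap(bitmap);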
I am working on a 2D platform game and I have a sprite sheet which includes the sprites of tiles and blocks.
I noticed that there was a pink-ish background behind the transparent sprites, so I thought that Java wasn't loading the sprites as PNG. I tried to re-draw each sprite onto a new BufferedImage, checking pixel by pixel whether the pixel was R=255, G=63, B=52, but unfortunately the code wasn't able to detect that either, and at this point I have no more options left to try.
I made sure that the "pink" color values are correct by using a color picker.
original spritesheet (transparent):
The class that loads the sprite(s) is:
public class SpriteSheet {
private BufferedImage image;
public SpriteSheet(BufferedImage image) {
this.image = image;
}
public BufferedImage grabImage(int col, int row, int width, int height) {
BufferedImage alpha = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
BufferedImage img = image.getSubimage(
(col * width) - width,
(row * height) - height,
width,
height);
int w = img.getWidth();
int h = img.getHeight();
for(int y = 0; y < h; y++) {
for(int x = 0; x < w; x++) {
int pixel = img.getRGB(x, y);
int red, green, blue;
red = (pixel >> 16) & 0xff;
green = (pixel >> 8) & 0xff;
blue = (pixel) & 0xff;
if(red == 255 && green == 63 && blue == 52)
alpha.setRGB(x, y, new Color(0, 0, 0, 0).getRGB());
else
alpha.setRGB(x, y, pixel);
}
}
return alpha;
}
}
the class that loads the sprite sheet is:
public class Texture {
SpriteSheet bs, ss;
private BufferedImage block_sheet = null;
public BufferedImage[] block = new BufferedImage[3];
public Texture() {
BufferedImageLoader loader = new BufferedImageLoader();
try {
block_sheet = loader.loadImage("/tiles.png");
} catch(Exception e) {
e.printStackTrace();
}
bs = new SpriteSheet(block_sheet);
getTextures();
}
private void getTextures() {
block[0] = bs.grabImage(1, 1, 32, 32);
block[1] = bs.grabImage(2, 1, 32, 32);
block[2] = bs.grabImage(4, 1, 32, 32);
}
}
How do I get rid of the pink-ish background and keep transparency?
I don't understand why you're using getSubimage().
try {
BufferedImage img = ImageIO.read(new File("D:/image.png"));
for (int i = 0; i < img.getWidth(); i++) {
for (int j = 0; j < img.getHeight(); j++) {
Color pixelcolor = new Color(img.getRGB(i, j));
int r = pixelcolor.getRed();
int g = pixelcolor.getGreen();
int b = pixelcolor.getBlue();
if (r == 255 && g == 63 && b == 52) {
int rgb = new Color(255, 255, 255).getRGB();
img.setRGB(i, j, rgb);
}
}
}
ImageIO.write(img, "png", new File("D:/transparent.png"));
} catch (Exception e) {
System.err.println(e.getMessage());
}
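If the goal is transparency rather than white, a hedged variation (assuming the pixels are copied into a TYPE_INT_ARGB image, since an image without an alpha channel cannot hold transparent pixels) would set the matching pixels to alpha zero:
BufferedImage src = ImageIO.read(new File("D:/image.png"));
BufferedImage out = new BufferedImage(src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_ARGB);
for (int x = 0; x < src.getWidth(); x++) {
    for (int y = 0; y < src.getHeight(); y++) {
        int argb = src.getRGB(x, y);
        Color c = new Color(argb, true);
        if (c.getRed() == 255 && c.getGreen() == 63 && c.getBlue() == 52) {
            out.setRGB(x, y, 0x00000000); // fully transparent
        } else {
            out.setRGB(x, y, argb);
        }
    }
}
ImageIO.write(out, "png", new File("D:/transparent.png"));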
Cough... it worked all along; I had forgotten to disable the test rectangles that were representing the blocks. I realized this after some time.
So the transparency was working fine; I just saw the rectangle I was drawing behind it.
I am trying to take a screenshot of my game following this article:
https://github.com/libgdx/libgdx/wiki/Taking-a-Screenshot
It seems that there is a problem with the black color in the PNG conversion.
My screenshots look as follows (expected vs. actual images omitted).
Here is a detailed view:
There is a strange color instead of shadow around a leaf.
Did anyone have a similar problem?
I resolved this problem by implementing a platform-specific mechanism; it is different for Android and desktop applications.
You can find more about platform-specific code in libgdx here.
Here is the interface:
public interface ScreenshotPixmap {
public void saveScreenshot(FileHandle fileHandle);
}
And the Android implementation:
public class AndroidScreenshotPixmap implements ScreenshotPixmap {
public Pixmap getScreenshot( int x, int y, int w, int h, boolean flipY ) {
Gdx.gl.glPixelStorei( GL20.GL_PACK_ALIGNMENT, 1 );
final Pixmap pixmap = new Pixmap( w, h, Pixmap.Format.RGBA8888 );
ByteBuffer pixels = pixmap.getPixels();
Gdx.gl.glReadPixels( x, y, w, h, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, pixels );
final int numBytes = w * h * 4;
byte[] lines = new byte[numBytes];
if ( flipY ) {
final int numBytesPerLine = w * 4;
for ( int i = 0; i < h; i++ ) {
pixels.position( (h - i - 1) * numBytesPerLine );
pixels.get( lines, i * numBytesPerLine, numBytesPerLine );
}
pixels.clear();
pixels.put( lines );
} else {
pixels.clear();
pixels.get( lines );
}
return pixmap;
}
public int[] pixmapToIntArray( Pixmap pixmap ) {
int w = pixmap.getWidth();
int h = pixmap.getHeight();
int dest = 0;
int[] raw = new int[w * h];
for ( int y = 0; y < h; y++ ) {
for ( int x = 0; x < w; x++ ) {
int rgba = pixmap.getPixel( x, y );
raw[dest++] = 0xFF000000 | ( rgba >> 8 );
}
}
return raw;
}
public void savePNG( int[] colors, int width, int height, OutputStream stream ) {
Bitmap bitmap = Bitmap.createBitmap( colors, width, height, Bitmap.Config.ARGB_8888 );
bitmap.compress( Bitmap.CompressFormat.PNG, 100, stream );
}
@Override
public void saveScreenshot(FileHandle fileHandle) {
Pixmap pixmap = getScreenshot(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), false);
OutputStream stream = fileHandle.write(false);
savePNG(pixmapToIntArray(pixmap), Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), stream);
}
}
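The desktop counterpart is not shown in this answer. As a minimal, untested sketch (assuming the same ScreenshotPixmap interface and reusing a getScreenshot() helper like the one above), it could simply hand the Pixmap to libGDX's PixmapIO:
public class DesktopScreenshotPixmap implements ScreenshotPixmap {
    @Override
    public void saveScreenshot(FileHandle fileHandle) {
        // getScreenshot() is assumed to be the same glReadPixels helper shown above
        Pixmap pixmap = getScreenshot(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), true);
        PixmapIO.writePNG(fileHandle, pixmap); // PixmapIO does the PNG encoding on desktop
        pixmap.dispose();
    }
}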
Good luck.
My class for saving screenshots works fine:
public class ScreenshotSaver {
private static int counter = 1;
public static void saveScreenshot() {
FileHandle fh;
do {
fh = new FileHandle("screenshot" + counter++ + ".png");
} while (fh.exists());
Pixmap pixmap = getScreenshot(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), false);
PixmapIO.writePNG(fh, pixmap);
pixmap.dispose();
}
private static Pixmap getScreenshot(int x, int y, int w, int h, boolean flipY) {
Gdx.gl.glPixelStorei(GL10.GL_PACK_ALIGNMENT, 1);
final Pixmap pixmap = new Pixmap(w, h, Format.RGBA8888);
ByteBuffer pixels = pixmap.getPixels();
Gdx.gl.glReadPixels(x, y, w, h, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, pixels);
final int numBytes = w * h * 4;
byte[] lines = new byte[numBytes];
if (flipY) {
pixels.clear();
pixels.get(lines);
} else {
final int numBytesPerLine = w * 4;
for (int i = 0; i < h; i++) {
pixels.position((h - i - 1) * numBytesPerLine);
pixels.get(lines, i * numBytesPerLine, numBytesPerLine);
}
pixels.clear();
pixels.put(lines);
}
return pixmap;
}
}
The problem is that when I run the program in a write-protected area, the program simply shuts down. I'd prefer it to just not save the screenshot instead. How do I do that?
Why don't you just use a try/catch?
try{
PixmapIO.writePNG(fh, pixmap);
} catch (Exception e) {
// save it somewhere else
}
pixmap.dispose();
For runtime file writing, use Gdx.files.local(...). See the file handling page in the libgdx wiki. It works everywhere except with GWT/in the browser.
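Putting the two suggestions together, a hedged sketch of the saving part (keeping the file-name pattern from the question) might look like this:
FileHandle fh;
do {
    fh = Gdx.files.local("screenshot" + counter++ + ".png");
} while (fh.exists());
try {
    PixmapIO.writePNG(fh, pixmap);
} catch (GdxRuntimeException e) {
    // e.g. a write-protected location: log the failure and carry on instead of crashing
    Gdx.app.error("ScreenshotSaver", "could not save screenshot", e);
}
pixmap.dispose();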