I have a Canvas I draw on. I'm trying to take the bitmap out, convert it to a byte array, and save it serialized into a file, then later open the file, deserialize it, and apply the bitmap back to the canvas. In the code below everything seems to work well, except that nothing appears when I apply the bitmap back to the canvas. Can someone please show me where I'm going wrong?
public byte[] getCanvasData(){
ByteArrayOutputStream bos = new ByteArrayOutputStream();
mBitmap.compress(CompressFormat.PNG, 0, bos);
byte[] bitmapdata = bos.toByteArray();
return bitmapdata;
}
public void setCanvasData(byte[] canvasData, int w, int h){
mBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
mBitmap.eraseColor(0x00000000);
mCanvas = new Canvas(mBitmap);
mCanvas.drawBitmap(BitmapFactory.decodeByteArray(canvasData , 0, canvasData.length).copy(Bitmap.Config.ARGB_8888, true), 0, 0, null);
}
Some extra code that might help a little:
public void readInSerialisable() throws IOException
{
FileInputStream fileIn = new FileInputStream("/sdcard/theBKup.ser");
ObjectInputStream in = new ObjectInputStream(fileIn);
try
{
BookData book = (BookData) in.readObject();
pages.clear();
canvasContainer.removeAllViews();
for (int i = 0; i < book.getBook().size(); i++){
Log.d("CREATION", "LOADING PAGE " + i);
pages.add(new Canvas2(context, book.getPageAt(i), canvasContainer.getWidth(), canvasContainer.getHeight()));
}
canvasContainer.addView(pages.get(page), new AbsoluteLayout.LayoutParams(AbsoluteLayout.LayoutParams.FILL_PARENT, AbsoluteLayout.LayoutParams.FILL_PARENT, 0, 0));
updatePagination();
Log.d("CREATION", "Updated Pagination");
}
catch (Exception exc)
{
System.out.println("didnt work");
exc.printStackTrace();
}
}
BookData is a Serializable class containing all my data, with simple getters/setters in there.
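For context, a minimal sketch of what that class might look like; only getBook() and getPageAt() appear in the posted code, so the backing list and addPage() are assumptions:
import java.io.Serializable;
import java.util.ArrayList;

public class BookData implements Serializable {
    private static final long serialVersionUID = 1L;
    // each page is the PNG byte array produced by getCanvasData()
    private ArrayList<byte[]> book = new ArrayList<byte[]>();

    public ArrayList<byte[]> getBook() { return book; }
    public byte[] getPageAt(int i) { return book.get(i); }
    public void addPage(byte[] page) { book.add(page); }
}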
onDraw Method
@Override
protected void onDraw(Canvas canvas) {
Log.d("DRAWING", "WE ARE DRAWING");
canvas.drawColor(0x00AAAAAA); //MAKE CANVAS TRANSPARENT
canvas.drawBitmap(mBitmap, 0, 0, mBitmapPaint);
canvas.drawPath(mPath, mPaint);
}
I would do the following 2 tests.
Log some of the byte stream to make sure that it was loaded correctly, something like Log.v("TAG", "" + canvasData[0] + canvasData[1]); (a fuller sketch follows this list), or put a breakpoint there, just to make sure the data is correct.
Draw a bitmap that you know is valid, using the same code, and see if it appears correctly.
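A hedged sketch of the first check (the log tag is arbitrary):
if (canvasData == null || canvasData.length == 0) {
    Log.v("CANVAS_DEBUG", "canvasData is empty");
} else {
    Log.v("CANVAS_DEBUG", "length=" + canvasData.length + ", first bytes: " + canvasData[0] + ", " + canvasData[1]);
}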
I'm not sure exactly what's going on, but I strongly suspect one of the following.
The byte stream is not being read in correctly.
The bitmap is not being updated to the screen, or is using a trivially small size.
If your byte stream does contain data, you will want to take a look at the Canvas documentation, specifically the part about how a Canvas actually gets onto the screen.
In order to see a Canvas, it has to be put onto a View. Once it is on a View, onDraw() must be called for it to become visible. I would make sure that you are in fact doing an onDraw(), and that the Canvas is associated with the View correctly. If you are using onDraw() already, please post the bits of code associated with it.
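For example, one hedged sketch (assuming Canvas2 is a custom View) is to invalidate the view right after the bitmap is swapped in, so onDraw() runs again with the restored bitmap:
public void setCanvasData(byte[] canvasData, int w, int h) {
    mBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    mCanvas = new Canvas(mBitmap);
    Bitmap restored = BitmapFactory.decodeByteArray(canvasData, 0, canvasData.length);
    mCanvas.drawBitmap(restored, 0, 0, null);
    invalidate(); // schedule another onDraw() so the restored bitmap actually appears
}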
Related
In my activity, I want to modify an image by adding text to it.
The image is selected from the gallery or taken with the camera and then stored in a file in a previous activity. The URI of that file is then passed through extras.
Now I try to add a string on top of the image like so:
try {
modifyThePic(imageUri);
} catch (IOException e) {
e.printStackTrace();
}
here is the function's body:
public void modifyThePic(Uri imageUri) throws IOException {
ImageDecoder.Source source = ImageDecoder.createSource(this.getContentResolver(), imageUri);
Bitmap bitmap = ImageDecoder.decodeBitmap(source);
Canvas canvas = new Canvas(bitmap);
Paint paint = new Paint();
paint.setColor(Color.BLACK);
paint.setTextSize(10);
canvas.drawText("Some Text here", 0, 0, paint);
image.setImageBitmap(bitmap); //image is the imageView to control the result
}
The expected behaviour would be to display the image with "Some Text here" on top of it, but instead nothing is displayed (the app doesn't crash, though).
While debugging, I come across an error that appears between the
Bitmap bitmap = ImageDecoder.decodeBitmap(source);
and the
Canvas canvas = new Canvas(bitmap);
here is the error:
java.io.FileNotFoundException: No content provider: /storage/emulated/0/Android/data/com.emergence.pantherapp/files/Pictures/JPEG_20200829_181926_7510981182141824841.jpg
I suspect that I misuse ImageDecoder, as it's my first time using it. More precisely, I was not able to leave the decodeBitmap call in onCreate: Android Studio told me it could not run on the main thread, and I am not familiar with threading at all. Moving it into a dedicated function fixed this, but maybe I'm supposed to do something else and that's the root of the problem.
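For reference, a hedged sketch of moving the decode onto a plain background thread (the helper name is made up; it assumes the same Activity and the image ImageView from above, and setMutableRequired makes the decoded bitmap usable with a Canvas):
private void modifyThePicAsync(Uri imageUri) {
    new Thread(() -> {
        try {
            ImageDecoder.Source source = ImageDecoder.createSource(getContentResolver(), imageUri);
            // decodeBitmap does disk I/O, so it must not run on the main thread
            Bitmap bitmap = ImageDecoder.decodeBitmap(source,
                    (decoder, info, src) -> decoder.setMutableRequired(true));
            Canvas canvas = new Canvas(bitmap);
            Paint paint = new Paint();
            paint.setColor(Color.BLACK);
            paint.setTextSize(10);
            canvas.drawText("Some Text here", 0, 0, paint);
            // views may only be touched on the UI thread
            runOnUiThread(() -> image.setImageBitmap(bitmap));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }).start();
}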
My questions:
Am I using the right tools to modify the file and add the text on it ?
If yes, what do I do wrong ?
If no, what library/tools should I look into to do this task ?
Thank you,
EDIT: ADDITIONAL ANSWER ELEMENTS
As both @blackapps and @rmunge pointed out, I was not getting a legitimate URI but a plain file path. The easy fix for my problem was to build the URI from the path with this code:
Uri realuri = Uri.fromFile(new File("insert path here"));
Additionally, to edit the bitmap it must be made mutable, which is covered here for example.
The final function to extract the bitmap from the URI and adding text on top of it is this one:
public void modifyThePic(Uri imageUri) throws IOException {
ContentResolver cr = this.getContentResolver();
InputStream input = cr.openInputStream(imageUri);
Bitmap bitmap = BitmapFactory.decodeStream(input).copy(Bitmap.Config.ARGB_8888,true);
Canvas canvas = new Canvas(bitmap);
Paint paint = new Paint();
paint.setColor(Color.BLACK);
paint.setTextSize(300);
int xPos = (canvas.getWidth() / 2); //just some operations to center the text
int yPos = (int) ((canvas.getHeight() / 2) - ((paint.descent() + paint.ascent()) / 2)) ;
canvas.drawText("SOME TEXT TO TRY IT OUT", xPos, yPos, paint);
image.setImageBitmap(bitmap);
}
Seems your URI is wrong. It has to start with file:///
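For illustration, a hedged example using the path from the stack trace above:
// wrapping the raw path in a File yields a Uri that starts with file:///
Uri imageUri = Uri.fromFile(new File("/storage/emulated/0/Android/data/com.emergence.pantherapp/files/Pictures/JPEG_20200829_181926_7510981182141824841.jpg"));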
This post is a follow-up to:
ImageIO.read can't read ByteArrayInputStream (image processing)
Similar to the OP, I am getting a null pointer whenever I try to read from my ByteArrayInputStream (as it should, as explained by the top answer). Noticing this, I implemented the code from @haraldK's answer in the post above to correct the issue, but I have run into another problem. I have the following code:
byte[] imageInByteArr = ...
// convert byte array back to BufferedImage
int width = 1085;
int height = 696;
BufferedImage convertedGrayScale = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
convertedGrayScale.getRaster().setDataElements(0, 0, width, height, imageInByteArr );
try {
ImageIO.write(convertedGrayScale, "jpg", new File("C:\\test.jpg"));
}
catch (IOException e) {
System.err.println("IOException: " + e);
}
Upon execution, I run into a java.lang.ArrayIndexOutOfBoundsException: null on the line right before the try/catch block. My first thought was that this was caused by not having a file called test.jpg on my C drive. I adjusted to rule that out, yet I still get the same exception at convertedGrayScale.getRaster().setDataElements(0, 0, width, height, imageInByteArr);. Why is this happening?
On another note, aside from writing the file using ImageIO, is there ANY other way for me to convert the byte[] into a visual representation of an image? I have tried to just print the array to a file and save it as a '.jpg', but the file will not open. Any suggestions will help. To summarize, I am looking to convert a byte[] into an image and save it OR render it in a browser, whichever is easier/doable.
It appears that your imageInByteArr is too short. I was able to reproduce the same error with this:
public static void main(String[] args) {
int width = 1085;
int height = 696;
byte[] imageInByteArr = new byte[width];
BufferedImage convertedGrayScale = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
convertedGrayScale.getRaster().setDataElements(0, 0, width, height, imageInByteArr);
}
When I use width*height (or anything larger) for the size of imageInByteArr I get no error, but when the array is smaller than the data you are trying to set, it throws the exception.
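For reference, a hedged sketch of the same snippet with the array sized to width*height; it assumes the same imports as the question's code (java.awt.image.BufferedImage, javax.imageio.ImageIO, java.io.File):
public static void main(String[] args) throws IOException {
    int width = 1085;
    int height = 696;
    // TYPE_BYTE_GRAY uses exactly one byte per pixel, so the raster expects width * height bytes
    byte[] imageInByteArr = new byte[width * height];
    BufferedImage convertedGrayScale = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
    convertedGrayScale.getRaster().setDataElements(0, 0, width, height, imageInByteArr);
    ImageIO.write(convertedGrayScale, "jpg", new File("test.jpg"));
}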
I want to take only a part of the screen data from the preview video callback to reduce processing time. The problem is that I only know how to grab the whole frame with onPreviewFrame:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
myData = data;
// +get camera resolution x, y
}
And then with this data I get the image:
private Bitmap getBitmapFromYUV(byte[] data, int width, int height)
{
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, out);
byte[] imageBytes = out.toByteArray();
Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
return image;
}
And then I take the part of the image I want:
cutImage = Bitmap.createBitmap(image, xOffset, yOffset, customWidth, customHeight);
The problem is that I need to take lots of images and apply some image processing to them, which is why I want to reduce the time it takes to get them. Instead of grabbing the whole frame and then cropping it, I want to get the cropped image immediately. Is there a way to get only a part of the frame data?
OK, I finally found something. I still record all the data from the camera, but when calling compressToJpeg I crop the picture with a custom Rect. Maybe there is something better to do before this, but it is still a good improvement. Here are my changes:
yuvImage.compressToJpeg(new Rect(offsetX, offsetY, sizeCaptureX + offsetX, sizeCaptureY + offsetY ), 100, out);
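For completeness, a hedged sketch of the whole helper with the crop applied; the offset/size parameters are the same ones used above and are assumed to stay inside the frame:
private Bitmap getCroppedBitmapFromYUV(byte[] data, int width, int height,
        int offsetX, int offsetY, int sizeCaptureX, int sizeCaptureY) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    // only the region inside the Rect is compressed, so the decoded bitmap is already cropped
    yuvImage.compressToJpeg(new Rect(offsetX, offsetY, offsetX + sizeCaptureX, offsetY + sizeCaptureY), 100, out);
    byte[] imageBytes = out.toByteArray();
    return BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
}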
I am trying to load a texture for a cube and I am having trouble with the dimensions I use. The texture dimensions are a power of two (256x256). When I use 256 as width and height it throws an exception:
java.lang.IndexOutOfBoundsException: Required 262144 remaining bytes in buffer, only had 68998
at com.jogamp.common.nio.Buffers.rangeCheckBytes(Buffers.java:828)
The code:
private void initTexture(GL2ES2 gl) {
try {
BufferedImage bufferedImage = ImageIO.read(new URI("http://192.168.0.39/images/box.gif").toURL());
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "gif", byteArrayOutputStream);
byte[] imageData = byteArrayOutputStream.toByteArray();
imageBuffer = ByteBuffer.wrap(imageData);
} catch (Exception e) {
e.printStackTrace();
}
imageBuffer.rewind();
gl.glGenTextures(1, textureIds, 0);
gl.glBindTexture(GL2ES2.GL_TEXTURE_2D, textureIds[0]);
gl.glTexImage2D(GL2ES2.GL_TEXTURE_2D, 0, GL2ES2.GL_RGBA, 256, 256, 0, GL2ES2.GL_RGBA, GL2ES2.GL_UNSIGNED_BYTE, imageBuffer);
gl.glTexParameteri(GL2ES2.GL_TEXTURE_2D, GL2ES2.GL_TEXTURE_MAG_FILTER, GL2ES2.GL_LINEAR);
gl.glTexParameteri(GL2ES2.GL_TEXTURE_2D, GL2ES2.GL_TEXTURE_MIN_FILTER, GL2ES2.GL_LINEAR_MIPMAP_NEAREST);
gl.glGenerateMipmap(GL2ES2.GL_TEXTURE_2D);
gl.glBindTexture(GL2ES2.GL_TEXTURE_2D, 0);
}
When I change the width/height parameters to 128 the exception disappears, but the cubes show the wrong colors.
As bestsss mentioned, the reason might be that some raw format is expected. The problem: I can't fix this. I tried multiple images and formats, created with GIMP (I'm working on Ubuntu), but the exception is always the same. So I guess I am reading the image in the wrong way. Any ideas?
Update
My solution (which uses JOGL classes TextureIO and Texture):
Texture texture;
private void initTexture(GL2ES2 gl) {
try {
texture = TextureIO.newTexture(new URI("http://192.168.0.39/images/box.gif").toURL(),true,null);
texture.setTexParameterf(GL2ES2.GL_TEXTURE_MIN_FILTER, GL2ES2.GL_LINEAR_MIPMAP_LINEAR);
texture.setTexParameterf(GL2ES2.GL_TEXTURE_MAG_FILTER, GL2ES2.GL_LINEAR);
} catch (Exception e) {
e.printStackTrace();
}
}
public void display(GL2ES2 gl) {
// code snipped
if (texture != null) {
texture.enable();
texture.bind();
}
// code snipped
}
I have zero clue about the API, however I can bet the expected format is some raw one, NOT GIF, since 262144 = 2^18 (or 256*256*4): RGB plus alpha is 4 bytes per pixel.
Edit: again, look at this call:
gl.glTexImage2D(GL2ES2.GL_TEXTURE_2D, 0, GL2ES2.GL_RGBA, 256, 256, 0, GL2ES2.GL_RGBA, GL2ES2.GL_UNSIGNED_BYTE, imageBuffer);
Just guessing, but look at the constants: GL2ES2.GL_RGBA, GL2ES2.GL_RGBA, GL2ES2.GL_UNSIGNED_BYTE. They all describe an RGBA layout for the bytes in the byte buffer; see what other constants are available. As I understand it, using NIO here only has a point with direct buffers containing the raster in the format specified by those constants (i.e. image file formats like JPEG/GIF/PNG will not help).
So read the documentation again, look for tutorials and examples, and proceed (the way you load the image is not very good either).
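For illustration, a hedged sketch of producing such a raw RGBA buffer from the decoded BufferedImage instead of wrapping the GIF file bytes (it assumes the image has already been read with ImageIO.read as in the question; row flipping and mipmapping concerns are ignored):
private static ByteBuffer toRGBABuffer(BufferedImage img) {
    int w = img.getWidth();
    int h = img.getHeight();
    // 4 bytes per pixel (R, G, B, A), matching GL_RGBA / GL_UNSIGNED_BYTE: 256*256*4 = 262144
    ByteBuffer buffer = ByteBuffer.allocateDirect(w * h * 4);
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int argb = img.getRGB(x, y); // packed as 0xAARRGGBB
            buffer.put((byte) ((argb >> 16) & 0xFF)); // R
            buffer.put((byte) ((argb >> 8) & 0xFF));  // G
            buffer.put((byte) (argb & 0xFF));         // B
            buffer.put((byte) ((argb >> 24) & 0xFF)); // A
        }
    }
    buffer.rewind();
    return buffer;
}
In practice the TextureIO route shown in the update above handles all of this for you, which is why it works where the hand-rolled buffer did not.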
I've been trying to load a BMP picture to use as a texture in my program. I use an IOStream class that extends DataInputStream to read the pixels of the image, with code based on a C++ texture loader:
//class Data members
public static int BMPtextures[];
public static int BMPtexCount = 30;
public static int currentTextureID = 0;
//loading methode
static int loadBMPTexture(int index, String fileName, GL gl)
{
try
{
IOStream wdis = new IOStream(fileName);
wdis.skipBytes(18);
int width = wdis.readIntW();
int height = wdis.readIntW();
wdis.skipBytes(28);
byte buf[] = new byte[wdis.available()];
wdis.read(buf);
wdis.close();
gl.glBindTexture(GL.GL_TEXTURE_2D, BMPtextures[index]);
gl.glTexImage2D(GL.GL_TEXTURE_2D, 0, 3, width, height, 0, GL.GL_BGR, GL.GL_UNSIGNED_BYTE, buf);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
currentTextureID = index;
return currentTextureID;
}
catch (IOException ex)
{
// Utils.msgBox("File Error\n" + fileName, "Error", Utils.MSG_WARN);
return -1;
}
}
and IOStream code :
public class IOStream extends DataInputStream {
public IOStream(String file) throws FileNotFoundException {
super(new FileInputStream(file));
}
public short readShortW() throws IOException {
return (short)(readUnsignedByte() + readUnsignedByte() * 256);
}
public int readIntW() throws IOException {
return readShortW() + readShortW() * 256 * 256;
}
void read(Buffer[] buf) {
}
}
and the calling:
GTexture.loadBMPTexture(1,"/BasicJOGL/src/basicjogl/data/Font.bmp",gl);
After debugging, I figured out that when it comes to this line:
IOStream wdis = new IOStream(fileName);
an IOException occurs, and it's a DispatchException. What is this supposed to mean, and how can I solve it?
I tried to:
use \ and \\ and / and //
change the path of the photo, spelling out the full path from C:\ down to photoname.bmp
rename the photo using numbers like 1.bmp
None worked.
Judging by your latest comment, you are no longer getting the IOException but are still having troubles getting the texture to actually render (just getting a white square).
I noticed the following are not in the code you posted here (but could be elsewhere):
gl.glGenTextures
You need to generate places for your textures before binding them. Also, make sure you have enabled texturing:
gl.glEnable(GL.GL_TEXTURE_2D);
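A hedged sketch of how those two calls could fit into the fields from the question (BMPtextures and BMPtexCount come from the posted code):
BMPtextures = new int[BMPtexCount];
gl.glGenTextures(BMPtexCount, BMPtextures, 0); // allocate texture ids before glBindTexture is called
gl.glEnable(GL.GL_TEXTURE_2D);                 // texturing must be enabled, otherwise you get a plain white quad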
For additional information and tutorials on getting started with OpenGL texturing, I recommend reading NeHe Productions: OpenGL Lesson #06. At the bottom of that page you will find JOGL sample code to help you convert the concepts from C to Java.
Anyway, hope this gives a few new ideas to try.
You probably don't need help with this anymore, but I noticed that IOStream extends DataInputStream, yet the read() it declares has been left blank. So you may never actually be reading anything into buf, which might explain why your texture is blank but you don't get any other problems.
Here is a simple way of loading a texture in JOGL. It works with BMP as well.
public static Texture loadTexture(String file) throws GLException, IOException
{
ByteArrayOutputStream os = new ByteArrayOutputStream();
ImageIO.write(ImageIO.read(new File(file)), "png", os);
InputStream fis = new ByteArrayInputStream(os.toByteArray());
return TextureIO.newTexture(fis, true, TextureIO.PNG);
}
Also, don't forget to enable and bind the texture, and to set texture coordinates:
...
gl.glEnableClientState(GL2ES1.GL_TEXTURE_COORD_ARRAY);
if(myTexture == null)
myTexture = loadTexture("filename.png");
myTexture.enable(gl);
myTexture.bind(gl);
gl.glTexCoordPointer(2, GL2ES1.GL_FLOAT, 0, textureCoords);
...