I am trying to print a bitmap image. In order to print the image, my printer needs the bitmap as a byte array. The picture size is 128x128 pixels.
Here is the code I use to read the image and convert it to a byte array.
BufferedImage image = ImageIO.read(new File("test.bmp"));
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, "bmp", baos);
byte[] imageInByte = baos.toByteArray();
System.out.println(imageInByte.length);
After the code executes, the imageInByte array length is 2110. What am I missing here? Shouldn't the array length be 16384 (128 x 128)?
You are assuming one byte per pixel and no header information. The "bitsPerPixel" header will play a large part in determining how much space the image data takes up. See the structure of a bitmap file here.
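For comparison, here is a minimal sketch (reusing the test.bmp path from the question) that pulls the raw pixel storage out of the BufferedImage instead of re-encoding it as a BMP stream. The cast only works when the raster is backed by a byte buffer, and the length still depends on bits per pixel: a 1-bit image packs 8 pixels per byte and a 24-bit image uses 3 bytes per pixel, so it only comes out to 16384 for a genuine 8-bits-per-pixel 128x128 image.
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.io.File;
import javax.imageio.ImageIO;

public class RawPixelLength {
    public static void main(String[] args) throws Exception {
        BufferedImage image = ImageIO.read(new File("test.bmp"));

        // Raw pixel storage, without any BMP file/DIB headers.
        byte[] pixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();

        System.out.println("image type: " + image.getType());
        System.out.println("raw pixel bytes: " + pixels.length);
    }
}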
I am trying to pass an image from Java to Flutter. The image is a Bitmap on the Java side, so I first try to convert it to a byte[] using:
On the Java side:
int byteCount = transferredImage.getAllocationByteCount(); // transferredImage is the bitmap
ByteBuffer buffer = ByteBuffer.allocate(byteCount); //Create a new buffer
transferredImage.copyPixelsToBuffer(buffer);
byte[] myByte = buffer.array();
Then, on the Flutter side, I read the image as a Uint8List and create an img.Image object from it:
img.Image styleImage = img.Image.fromBytes(
imgWidth,
imgHeight,
transferredBytes,
);
This works without any errors, but the image displayed using Image.memory(img.encodeJpg(styleImage)) always has a blue tint. The bitmap on the Java side is fine, but once it is converted to a byte[] and displayed in Flutter it has a blue tint.
So I assumed some important information may have been dropped during the conversion to bytes with the method above, and I then tried compressing the image to an image format before transferring the bytes:
On the Java side:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
transferredImage.compress(Bitmap.CompressFormat.PNG, 100, baos);
byte[] bytes = baos.toByteArray();
On the Flutter side:
img.Image styleImage = img.Image.fromBytes(
imgWidth,
imgHeight,
transferredBytes,
);
But this does not work and creates an error:
RangeError (index): Index out of range: index should be less than
321512: 322560
If possible, I would prefer to get my first attempt working, as it is really fast. Is there any way I can do that?
Thanks
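A blue tint after moving raw pixels between two systems usually points to a red/blue channel swap: for an ARGB_8888 bitmap, copyPixelsToBuffer() copies the pixels as R, G, B, A bytes, and a reader that assumes the opposite order tints the image. As a rough, hypothetical workaround on the Java side (the ARGB_8888 assumption and the target B, G, R, A order are guesses, not something confirmed by the code above), the channels could be reordered before sending:
// Hypothetical sketch: assumes the Bitmap is ARGB_8888, so the copied buffer
// holds R, G, B, A per pixel, and assumes the receiving side expects B, G, R, A.
// Swaps the first and third byte of every 4-byte pixel.
static byte[] swapRedAndBlue(byte[] rgba) {
    byte[] out = new byte[rgba.length];
    for (int i = 0; i + 3 < rgba.length; i += 4) {
        out[i]     = rgba[i + 2]; // B
        out[i + 1] = rgba[i + 1]; // G
        out[i + 2] = rgba[i];     // R
        out[i + 3] = rgba[i + 3]; // A
    }
    return out;
}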
I'm sorry that I'm not good at English.
I want to make a real-time image manipulation app (binarization, color inversion, etc.).
It needs to be fast, so I want to manipulate the frame as a byte[] without converting it to an Image or Bitmap.
The format of the image I get from the camera is YUV (NV21), but because I don't know this format, I convert it to JPEG.
That also doesn't work as I expected (I thought it would be one byte, or three bytes, per pixel).
So:
How can I do such manipulation (binarization, color inversion) on a JPEG byte array?
Or how can I convert an NV21-format byte array to an RGB byte array?
Here is the method I use to convert NV21 to JPEG:
YuvImage yuvimage = new YuvImage(bytes, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
yuvimage.compressToJpeg(new Rect(0, 0, width, height), 100, outputStream);
I get the YUV image byte array from onPreviewFrame (Camera.PreviewCallback).
I think you can use setPreviewFormat(int) to set a format other than the default NV21 for camera preview captures.
https://developer.android.com/reference/android/graphics/ImageFormat.html#NV21
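If the frames have to stay in NV21, a manual conversion to a packed RGB byte array is another option. This is a generic sketch using the common integer YUV-to-RGB approximation, not something from the answer above; width and height must match the preview size and the input must really be NV21:
// Sketch: NV21 -> packed RGB (3 bytes per pixel), using the usual integer
// YUV-to-RGB approximation. NV21 stores a full-resolution Y plane followed by
// an interleaved V/U plane at quarter resolution.
static byte[] nv21ToRgb(byte[] nv21, int width, int height) {
    byte[] rgb = new byte[width * height * 3];
    int frameSize = width * height;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int yIndex = y * width + x;
            int uvIndex = frameSize + (y / 2) * width + (x & ~1); // V first, then U
            int Y = Math.max(0, (nv21[yIndex] & 0xFF) - 16);
            int V = (nv21[uvIndex] & 0xFF) - 128;
            int U = (nv21[uvIndex + 1] & 0xFF) - 128;
            int r = (1192 * Y + 1634 * V) >> 10;
            int g = (1192 * Y - 833 * V - 400 * U) >> 10;
            int b = (1192 * Y + 2066 * U) >> 10;
            int o = yIndex * 3;
            rgb[o]     = (byte) Math.max(0, Math.min(255, r));
            rgb[o + 1] = (byte) Math.max(0, Math.min(255, g));
            rgb[o + 2] = (byte) Math.max(0, Math.min(255, b));
        }
    }
    return rgb;
}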
I first parse the image and index it in a database as a byte[], which means the byte[] variable holds that image in byte form.
I am going to use the images in a search list.
Is it possible to pass a byte[] variable to File()?
ImageIO.read(new File(byteImage+".png"));
Your image is stored as a PNG? If so, you can read it directly from the byte array with:
BufferedImage img = ImageIO.read(new ByteArrayInputStream(imageBytes));
Alternatively, use new FileOutputStream(fileVariable).write(bytes); to write the byte array out to a file first.
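If the bytes really need to go through a File first, a minimal sketch (using a hypothetical imageBytes array and output path, not names from the question, and assuming the usual java.io and javax.imageio imports) could look like this; otherwise, the ByteArrayInputStream approach above avoids the temporary file entirely:
// Sketch: write the stored byte[] to a file, then read it back as an image.
File outFile = new File("fromDb.png");
try (FileOutputStream fos = new FileOutputStream(outFile)) {
    fos.write(imageBytes); // imageBytes is the byte[] loaded from the database
}
BufferedImage img = ImageIO.read(outFile);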
I am using the code below to upload an image.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
BufferedImage img = ImageIO.read(new File("abc.tiff"));
ImageIO.write(img, "tiff", baos);
img.flush();
img = null;
byte[] bytes2 = baos.toByteArray();
if (baos != null) {
    baos.flush();
    baos.close();
}
System.out.println("bytes2 size::::" + bytes2.length);
My original TIFF image is 119 KB, but when I print the length of the image bytes I get 800974.
Why does the image size increase, and what could the issue be?
Obviously your abc.tiff file is compressed (the TIFF format supports compressed storage), while img contains the uncompressed, unpacked pixel data. That is why your output is 782 KB instead of 119 KB.
The TIFF format, like most image formats, supports many image modes and compression strategies.
ImageIO.write() might not be very clever about optimizing, or it might need some tweaking. See e.g. here.
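If the goal is a smaller byte[], one option is to ask a TIFF ImageWriter for an explicit compression type instead of calling plain ImageIO.write(). The following is a sketch assuming a TIFF writer is actually registered (built into Java 9+, or supplied by a plugin); the "LZW" compression name depends on what that writer supports:
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.util.Iterator;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;

public class TiffCompressionSketch {
    public static void main(String[] args) throws Exception {
        BufferedImage img = ImageIO.read(new File("abc.tiff"));

        Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("tiff");
        if (!writers.hasNext()) {
            throw new IllegalStateException("No TIFF writer registered");
        }
        ImageWriter writer = writers.next();

        // Ask the writer to compress the output; the supported type names
        // depend on the writer that is actually installed.
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionType("LZW");

        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(baos)) {
            writer.setOutput(ios);
            writer.write(null, new IIOImage(img, null, null), param);
        } finally {
            writer.dispose();
        }
        System.out.println("compressed bytes: " + baos.toByteArray().length);
    }
}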
I need to know how to get an array of bytes from a loaded image in Java. BufferedImage does not seem to supply any method that produces an array of bytes, so what do I use?
BufferedImage bufferedImage; // assumed that you have created it already
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "jpg", byteStream);
byte[] byteArray = byteStream.toByteArray();
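Note that this gives the bytes of a JPEG-encoded (lossy) file held in memory, not raw pixel values, so the array length will not be width x height x 3. If the raw pixel values are what is needed, one alternative (a sketch, not part of the answer above) is getRGB(), which returns one packed ARGB int per pixel:
// Sketch: raw pixel values instead of an encoded JPEG; each int holds one
// pixel as packed ARGB (0xAARRGGBB).
int w = bufferedImage.getWidth();
int h = bufferedImage.getHeight();
int[] argbPixels = bufferedImage.getRGB(0, 0, w, h, null, 0, w);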