Apologies in advance for my English.
I want to make a real-time image-manipulation app (binarization, color inversion, etc.). It needs to be fast, so I want to manipulate the frames as a byte[], without converting them to an Image or Bitmap.
The format I get from the camera is YUV (NV21), but because I didn't understand that format, I converted it to JPEG. That didn't work as I expected either (I had assumed it would be one byte per pixel, or three bytes per pixel).
So:
How can I do such manipulation (binarization, color inversion) on a JPEG byte array?
Or: how can I convert an NV21 byte array to an RGB byte array?
Here is the code I used to convert NV21 to JPEG:
YuvImage yuvimage = new YuvImage(bytes, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
yuvimage.compressToJpeg(new Rect(0, 0, width, height), 100, outputStream);
I get the YUV byte array from onPreviewFrame(Camera.PreviewCallback).
I think you can use setPreviewFormat(int) to request a format other than the default NV21 for camera preview frames.
https://developer.android.com/reference/android/graphics/ImageFormat.html#NV21
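To answer the second question directly: NV21 stores a full-resolution Y (luma) plane followed by interleaved V/U samples at quarter resolution, so each 2×2 block of pixels shares one V byte and one U byte. The conversion to packed ARGB can be sketched in plain Java; the class and method names below are made up for illustration, and the coefficients are the common integer approximation of the BT.601 YUV-to-RGB transform:

```java
public class Nv21ToRgb {

    // Converts an NV21 frame (full Y plane, then interleaved V/U at
    // quarter resolution) into an array of packed 0xAARRGGBB ints.
    public static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int y = nv21[row * width + col] & 0xFF;
                // Each 2x2 pixel block shares one V and one U sample.
                int uvIndex = frameSize + (row / 2) * width + (col & ~1);
                int v = (nv21[uvIndex] & 0xFF) - 128;
                int u = (nv21[uvIndex + 1] & 0xFF) - 128;
                // BT.601 YUV -> RGB, float-approximated
                int r = clamp(y + (int) (1.370705f * v));
                int g = clamp(y - (int) (0.698001f * v) - (int) (0.337633f * u));
                int b = clamp(y + (int) (1.732446f * u));
                argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }
}
```

With RGB ints in hand, binarization or color inversion is a simple per-element loop; no JPEG round-trip is needed.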
I'm in the draw loop of an android view:
Bitmap bitmap = Bitmap.createBitmap(this.getWidth(),
this.getHeight(), Bitmap.Config.ARGB_4444);
Canvas newCanvas = new Canvas(bitmap);
super.draw(newCanvas);
Log.d("AndroidUnity","Canvas Drawn!");
mImageView.setImageBitmap(bitmap);
The above code shows me the correct drawing in the attached ImageView.
When I convert the bitmap to a byte array:
ByteBuffer byteBuffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(byteBuffer);
byte[] bytes = byteBuffer.array();
but importing the bytes into Unity does not work (it shows a black image on my RawImage):
imageTexture2D = new Texture2D(width, height, TextureFormat.ARGB4444, false);
imageTexture2D.LoadRawTextureData(bytes);
imageTexture2D.Apply();
RawImage.texture = imageTexture2D;
Any ideas on how to get the Java byte[] to display as a texture/image in Unity? I've verified that the bytes transfer correctly, i.e. when I push a byte array of {1,2,3,4} from Android, I get {1,2,3,4} on the Unity side.
I should add that Unity throws an error when transferring the bytes as a byte[] directly, so instead I have to follow this advice on the C# side:
void ReceieveAndroidBytes(AndroidJavaObject jo) {
    AndroidJavaObject bufferObject = jo.Get<AndroidJavaObject>("Buffer");
    byte[] bytes = AndroidJNIHelper.ConvertFromJNIArray<byte[]>(bufferObject.GetRawObject());
}
and a trivial byte[] container class "Buffer" on the java side
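For reference, that Java-side container can be as small as a public byte[] field whose name matches the string passed to Get<AndroidJavaObject>("Buffer") in the C# snippet. A minimal sketch, assuming that layout:

```java
// Minimal sketch of the Java-side holder. The field name must match the
// string used in jo.Get<AndroidJavaObject>("Buffer") on the C# side.
public class Buffer {
    public byte[] Buffer;

    public Buffer(byte[] data) {
        this.Buffer = data;
    }
}
```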
I was trying to do the exact same thing, and my initial attempts also produced a black texture. I do the array conversion with AndroidJNIHelper.ConvertFromJNIArray as you do, except I use sbyte[] instead of byte[]. To set the actual image data I ended up using:
imageTexture2D.SetPixelData(bytes, 0);
If I'm not mistaken, LoadRawTextureData expects something even rawer than an array of pixel values: it may expect the texture in the compressed layout the graphics card uses internally. If that is true, plain pixel data is not in the right format and cannot be decoded.
I am trying to print a bitmap image. To print the image, my printer needs the byte array of the bitmap. The picture is 128x128 pixels.
Here is the code I use to read the image and convert it to a byte array:
BufferedImage image = ImageIO.read(new File("test.bmp"));
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, "bmp", baos);
byte[] imageInByte = baos.toByteArray();
System.out.println(imageInByte.length);
After execution, imageInByte.length is 2110. What am I missing here? Shouldn't the length be 16384 (128 x 128)?
You are assuming one byte per pixel and no header information. ImageIO.write() produces a complete BMP file, headers included, and the "bitsPerPixel" header field plays a large part in determining how much space the pixel data takes up. See the structure of a bitmap file here.
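The arithmetic actually checks out for a monochrome BMP: at 1 bit per pixel, 128 × 128 / 8 = 2048 bytes of pixel data, plus a 62-byte header block (14-byte file header, 40-byte info header, 8-byte two-color palette) gives exactly 2110. You can verify the header fields straight from the byte array; BMP headers are little-endian, and the class name here is a made-up helper for illustration:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BmpProbe {

    // BMP layout: bytes 10-13 hold the offset where pixel data begins;
    // bytes 28-29 hold bits per pixel. Both are little-endian.
    public static int bitsPerPixel(byte[] bmp) {
        return ByteBuffer.wrap(bmp).order(ByteOrder.LITTLE_ENDIAN).getShort(28) & 0xFFFF;
    }

    public static int pixelDataOffset(byte[] bmp) {
        return ByteBuffer.wrap(bmp).order(ByteOrder.LITTLE_ENDIAN).getInt(10);
    }
}
```

Running these on your imageInByte array should show bitsPerPixel well below 8, which explains the "short" length.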
I have a camera sending frames to a SurfaceView. I would like to get these frames out of the surface view and send them elsewhere. In their final form, the images must be in JPEG format. To accomplish this currently, I am creating a YUV image from the byte[] and then calling compressToJpeg. However, when I invoke compressToJpeg on every frame rather than doing nothing but displaying it, my FPS goes from ~30 to ~4. I commented out the other lines and this function appears to be the culprit.
public void onNewRawImage(byte[] data, Size size) {
// Convert to JPG
YuvImage yuvimage=new YuvImage(data,
ImageFormat.NV21, size.width, size.height, null);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
yuvimage.compressToJpeg(new Rect(0, 0, yuvimage.getWidth(),
yuvimage.getHeight()), 80, baos);
byte[] jdata = baos.toByteArray();
// Convert to Bitmap
Bitmap bitmap = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
}
Is it possible to start in the JPEG format rather than having to convert to it? I am hoping I am making a mistake somewhere. Any help is greatly appreciated, thank you.
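compressToJpeg is CPU-bound, so short of switching to a hardware encoder, a common workaround is to move compression off the camera callback thread and drop frames that arrive while the encoder is still busy; the preview then stays near 30 FPS even though JPEGs are produced at a lower rate. A sketch of that frame-dropping pattern in plain Java (FrameCompressor and JpegSink are made-up names, and compress() is a stub standing in for the YuvImage.compressToJpeg call):

```java
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicBoolean;

public class FrameCompressor {
    private final Executor worker;
    private final AtomicBoolean busy = new AtomicBoolean(false);

    public interface JpegSink {
        void onJpeg(byte[] jpeg);
    }

    public FrameCompressor() {
        this(Executors.newSingleThreadExecutor());
    }

    // Injectable executor, mainly so the pattern can be tested synchronously.
    public FrameCompressor(Executor worker) {
        this.worker = worker;
    }

    // Returns true if the frame was queued, false if it was dropped
    // because the previous frame is still being compressed.
    public boolean submit(final byte[] nv21, final int width, final int height,
                          final JpegSink sink) {
        if (!busy.compareAndSet(false, true)) {
            return false; // encoder busy: drop this frame, keep the preview fast
        }
        worker.execute(new Runnable() {
            @Override public void run() {
                try {
                    sink.onJpeg(compress(nv21, width, height));
                } finally {
                    busy.set(false);
                }
            }
        });
        return true;
    }

    // On a device this would wrap YuvImage.compressToJpeg; stubbed here
    // so the sketch stays runnable off-device.
    protected byte[] compress(byte[] nv21, int width, int height) {
        return nv21;
    }
}
```

Lowering the JPEG quality from 80 also cuts encoding time, but threading usually makes the bigger difference.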
When I use copyPixelsFromBuffer and copyPixelsToBuffer, the bitmap does not display the same content. I have tried the code below:
Bitmap bm = BitmapFactory.decodeByteArray(a, 0, a.length);
int[] pixels = new int[bm.getWidth() * bm.getHeight()];
bm.getPixels(pixels, 0, bm.getWidth(), 0, 0,bm.getWidth(),bm.getHeight());
ByteBuffer buffer = ByteBuffer.allocate(bm.getRowBytes()*bm.getHeight());
bm.copyPixelsToBuffer(buffer);//I copy the pixels from Bitmap bm to the buffer
ByteBuffer buffer1 = ByteBuffer.wrap(buffer.array());
newbm = Bitmap.createBitmap(160, 160,Config.RGB_565);
newbm.copyPixelsFromBuffer(buffer1);//I read pixels from the Buffer and put the pixels to the Bitmap newbm.
imageview1.setImageBitmap(newbm);
imageview2.setImageBitmap(bm);
Why don't the bitmaps bm and newbm display the same content?
In your code, you are copying the pixels into a bitmap with RGB_565 format, whereas the original bitmap from which you got the pixels must be in a different format.
The problem is clear from the documentation of copyPixelsFromBuffer():
The data in the buffer is not changed in any way (unlike setPixels(),
which converts from unpremultipled 32bit to whatever the bitmap's
native format is).
So either use the same bitmap format, use setPixels(), or draw the original bitmap onto the new one with a Canvas.drawBitmap() call.
Also, use bm.getWidth() and bm.getHeight() to specify the size of the new bitmap instead of hard-coding 160.
I need to know how to get an array of bytes from a loaded image in Java. BufferedImage doesn't seem to supply any method that produces an array of bytes, so what do I use?
BufferedImage bufferedImage; //assumed that you have created it already
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage,"jpg", byteStream);
byte[] byteArray = byteStream.toByteArray();
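Note that ImageIO.write(..., "jpg", ...) gives you the bytes of an encoded JPEG file, not raw pixel values. If what you actually need is the uncompressed pixel data, you can reach the backing array through the image's raster instead. This works when the image type is byte-backed (e.g. TYPE_3BYTE_BGR or TYPE_BYTE_GRAY); the class name is a made-up helper:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public class RawPixels {

    // Returns the raw pixel bytes backing a BufferedImage, with no
    // file-format encoding. Requires a byte-backed image type such as
    // TYPE_3BYTE_BGR; other types use int- or short-backed buffers.
    public static byte[] rawBytes(BufferedImage image) {
        return ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
    }
}
```

For a TYPE_3BYTE_BGR image the returned array has width × height × 3 entries, in B, G, R order per pixel.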