Android bitmap image to Unity C# texture - java

I'm in the draw loop of an android view:
Bitmap bitmap = Bitmap.createBitmap(this.getWidth(), this.getHeight(), Bitmap.Config.ARGB_4444);
Canvas newCanvas = new Canvas(bitmap);
super.draw(newCanvas);
Log.d("AndroidUnity","Canvas Drawn!");
mImageView.setImageBitmap(bitmap);
And the above code shows me the correct drawing on the attached ImageView.
When I convert the bitmap to a byte array:
ByteBuffer byteBuffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(byteBuffer);
byte[] bytes = byteBuffer.array();
Importing the bytes into Unity does not work (it shows a black image on my RawImage):
imageTexture2D = new Texture2D(width, height, TextureFormat.ARGB4444, false);
imageTexture2D.LoadRawTextureData(bytes);
imageTexture2D.Apply();
RawImage.texture = imageTexture2D;
Any ideas on how to get the Java byte[] to display as a texture/image in Unity? I've tested that the bytes are sending correctly, i.e. when I push a byte array of {1,2,3,4} from Android, I get {1,2,3,4} on the Unity side.
I should mention that Unity throws an error when I try to transfer the bytes directly as a byte[], so instead I follow this advice on the C# side:
void ReceiveAndroidBytes(AndroidJavaObject jo) {
    AndroidJavaObject bufferObject = jo.Get<AndroidJavaObject>("Buffer");
    byte[] bytes = AndroidJNIHelper.ConvertFromJNIArray<byte[]>(bufferObject.GetRawObject());
}
and a trivial byte[] container class "Buffer" on the Java side.
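The post doesn't show that container, but here is a minimal sketch of what it might look like on the Java side (the class and field names are assumptions, chosen only to match the jo.Get<AndroidJavaObject>("Buffer") call above):
public class ImageBuffer {
    // Public field read from C# via jo.Get<AndroidJavaObject>("Buffer")
    public byte[] Buffer;

    public ImageBuffer(byte[] data) {
        Buffer = data;
    }
}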

I was trying to do the exact same thing and my initial attempts also had a black texture. I do the array conversion with AndroidJNIHelper.ConvertFromJNIArray like you do, except I use sbyte[] instead of byte[]. To set the actual image data I ended up using
imageTexture2D.SetPixelData(bytes, 0);
If I'm not mistaken, LoadRawTextureData expects the data already laid out in the texture's exact internal format (which can even be a compressed GPU format), so raw pixel data that isn't in that format can't be decoded.
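For what it's worth, one way to sidestep the format question entirely is to produce bytes on the Java side whose layout matches an uncompressed Unity format. A minimal sketch, assuming the Unity side creates the texture as TextureFormat.RGBA32 (an assumption, not what the original posts used): an ARGB_8888 bitmap dumped with copyPixelsToBuffer() comes out as R,G,B,A bytes per pixel, which is the order RGBA32 expects.
// Render the view into an ARGB_8888 bitmap and dump it to a byte[] whose
// per-pixel byte order (R,G,B,A) matches Unity's TextureFormat.RGBA32.
// The image may still appear vertically flipped in Unity, because Unity's
// texture origin is bottom-left while Android draws top-down.
Bitmap bitmap = Bitmap.createBitmap(getWidth(), getHeight(), Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
super.draw(canvas);

ByteBuffer byteBuffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(byteBuffer);
byte[] rgbaBytes = byteBuffer.array();  // hand these to Unity via the Buffer container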

Related

How to pass an image from java to flutter?

I am trying to pass an image from Java to Flutter. The image is a Bitmap on the Java side, so I first try to convert it to a byte[].
On the Java side:
int byteCount = transferredImage.getAllocationByteCount(); // transferredImage is the bitmap
ByteBuffer buffer = ByteBuffer.allocate(byteCount); //Create a new buffer
transferredImage.copyPixelsToBuffer(buffer);
byte[] myByte = buffer.array();
Then on the Flutter side I read the image as a Uint8List and create an img.Image object from it:
img.Image styleImage = img.Image.fromBytes(
  imgWidth,
  imgHeight,
  transferredBytes,
);
This works without any errors, but the image displayed using Image.memory(img.encodeJpg(styleImage)) always has a blue tint. The Bitmap looks fine on the Java side, but once it is converted to a byte[] and displayed in Flutter it is tinted blue.
So I assumed some important information was being dropped by the conversion above, and I instead compressed the image to an encoded format before transferring the bytes.
On the Java side:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
transferredImage.compress(Bitmap.CompressFormat.PNG, 100, baos);
byte[] bytes = baos.toByteArray();
On the Flutter side:
img.Image styleImage = img.Image.fromBytes(
  imgWidth,
  imgHeight,
  transferredBytes,
);
But this does not work and creates an error:
RangeError (index): Index out of range: index should be less than 321512: 322560
If possible, I would prefer to get my first attempt working, as it is really fast. Is there any way I can do that?
Thanks
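Not part of the original question, but a quick way to test whether the blue tint is just a red/blue channel swap (a common symptom of a channel-order mismatch between sender and receiver) is to reorder the raw bytes on the Java side before transferring them. A minimal sketch under that assumption, reusing myByte from the code above:
// Swap the R and B bytes of the raw RGBA buffer in place. If the tint disappears
// on the Flutter side, the problem is channel ordering rather than lost data.
// Assumes 4 bytes per pixel (an ARGB_8888 bitmap).
for (int i = 0; i + 3 < myByte.length; i += 4) {
    byte r = myByte[i];
    myByte[i] = myByte[i + 2];  // move B into the R slot
    myByte[i + 2] = r;          // move R into the B slot
}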

Image manipulation as byte[]

Apologies if my English is not great.
I want to make a real-time image manipulation app (binarization, color inversion, etc.).
It needs to be fast, so I want to manipulate the frame as a byte[] without converting it to an Image or Bitmap.
The format of the image I get from the camera is YUV (NV21), but because I don't know this format, I convert it to JPEG.
That doesn't work as I expected either (I thought it would be one byte per pixel, or three bytes per pixel).
So:
How can I do this kind of manipulation (binarization, color inversion) on a JPEG byte array?
Or, how can I convert an NV21 byte array to an RGB byte array?
This is the method I use to convert NV21 to JPEG:
YuvImage yuvimage = new YuvImage(bytes, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
yuvimage.compressToJpeg(new Rect(0, 0, width, height), 100, outputStream);
I get the YUV byte array from onPreviewFrame(Camera.PreviewCallback).
I think you can use setPreviewFormat(int) to set a format other than the default NV21 for camera preview frames.
https://developer.android.com/reference/android/graphics/ImageFormat.html#NV21
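The answer above only covers changing the preview format; for the second part of the question (converting NV21 to RGB on the CPU), here is a minimal sketch, not from the original answer, of the standard BT.601 conversion, assuming the usual NV21 layout (a full-resolution Y plane followed by an interleaved V/U plane at half resolution):
public static byte[] nv21ToRgb(byte[] nv21, int width, int height) {
    byte[] rgb = new byte[width * height * 3];
    int frameSize = width * height;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int yIndex = y * width + x;
            // Each 2x2 block of pixels shares one V and one U sample (V comes first in NV21).
            int uvIndex = frameSize + (y / 2) * width + (x & ~1);
            int yVal = nv21[yIndex] & 0xFF;
            int v = (nv21[uvIndex] & 0xFF) - 128;
            int u = (nv21[uvIndex + 1] & 0xFF) - 128;
            // BT.601 YUV -> RGB
            int r = (int) (yVal + 1.402f * v);
            int g = (int) (yVal - 0.344f * u - 0.714f * v);
            int b = (int) (yVal + 1.772f * u);
            int out = yIndex * 3;
            rgb[out]     = (byte) Math.max(0, Math.min(255, r));
            rgb[out + 1] = (byte) Math.max(0, Math.min(255, g));
            rgb[out + 2] = (byte) Math.max(0, Math.min(255, b));
        }
    }
    return rgb;
}
Binarization or color inversion can then be done directly on the returned byte[] (one byte per channel, three bytes per pixel).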

Confusion with the ordering of the color values in the bitmap buffer obtained from BitmapFactory

I decoded a JPEG image with the Android BitmapFactory class, and it decoded fine with the ARGB_8888 color format.
Bitmap bitmap = BitmapFactory.decodeFile(Environment.getExternalStorageDirectory().toString() + "/camel.jpg",
        new BitmapFactory.Options());
Log.d(TAG,"Color format of the bitmap : "+bitmap.getConfig());
I dumped the raw buffer from the Bitmap class
ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(buffer);
Log.d(TAG,"Color format of the bitmap : "+bitmap.getConfig());
FileOutputStream outputStream = null;
try {
    outputStream = new FileOutputStream(Environment.getExternalStorageDirectory().toString() + "/buffer.raw");
    outputStream.write(buffer.array());
    outputStream.close();
} catch (Exception e) {
    e.printStackTrace();
}
and examined the dump in a hex editor. It looks like the ordering of the color components is not A, R, G, B.
Instead, the ordering appears to be R, G, B, A.
In the dump, the alpha byte (FF) sits at the end of each group of four bytes for a pixel. To corroborate this I opened the dump in 7yuv, and the image displays properly with the color format RGBA and little-endian encoding.
My confusion is: why does Android report the format of the Bitmap as ARGB_8888 when the actual byte ordering is R, G, B, A?
I wondered whether this might be an endianness mismatch, but in that case the whole ordering would simply be reversed (B, G, R, A).
I might also be doing something wrong while dumping the bitmap to a raw buffer (and I am not very good at Java), but I am not sure.
Thanks in advance!
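The question is left unanswered here, but a small check (not from the original post) makes the distinction visible: getPixel() always returns a packed ARGB int, while copyPixelsToBuffer() writes the bitmap's underlying storage, which for ARGB_8888 is R,G,B,A bytes per pixel, matching the hex dump described above.
// Compare the packed-int view of pixel (0,0) with the first four raw bytes
// written by copyPixelsToBuffer(). The int is packed as ARGB; the raw storage
// of an ARGB_8888 bitmap is R,G,B,A per pixel.
int pixel = bitmap.getPixel(0, 0);
Log.d(TAG, String.format("getPixel  -> A=%02X R=%02X G=%02X B=%02X",
        Color.alpha(pixel), Color.red(pixel), Color.green(pixel), Color.blue(pixel)));

ByteBuffer check = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(check);
Log.d(TAG, String.format("raw bytes -> %02X %02X %02X %02X  (R, G, B, A)",
        check.get(0) & 0xFF, check.get(1) & 0xFF, check.get(2) & 0xFF, check.get(3) & 0xFF));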

Convert byte[] to Image in EmguCV without knowing width and height

I need to convert a byte array (byte[], coming from a Java BufferedImage) to an EmguCV Image without knowing the width and height.
byte[] bytes = [data];
Image<Gray, byte> image = new Image<Gray, byte>();
image.Bytes = bytes;
Can you help me?
I found this on the Emgu CV forum ( http://www.emgu.com/forum/viewtopic.php?t=1057 ):
public Image<Bgr, Byte> byteArrayToImage(byte[] byteArrayIn)
{
    MemoryStream ms = new MemoryStream(byteArrayIn);
    Bitmap returnImage = (Bitmap)Image.FromStream(ms);
    return new Image<Bgr, byte>(returnImage);
    // you probably need to clean up stuff like the bitmap and the stream...
}
I retyped it to remove a variable-name typo. Assuming your byte array holds a standard encoded image, it looks like you can load it into a standard Bitmap, which figures out the size automatically. From there you can pass that Bitmap to the constructor of your Emgu image.
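The original question mentions that the bytes come from a Java BufferedImage; here is a minimal sketch of that side (not from the original post), encoding the image into a standard container (PNG here) so the C# side can decode it with Image.FromStream without knowing the width or height in advance:
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public static byte[] encodeToPng(BufferedImage image) throws IOException {
    // PNG is lossless and carries the dimensions in its header,
    // so the receiver does not need to know width/height separately.
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    ImageIO.write(image, "png", baos);
    return baos.toByteArray();
}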

New Bitmap Changed on Copy Using Buffer

When I use copyPixelsFromBuffer and copyPixelsToBuffer, the new bitmap does not display the same as the original. I have tried the code below:
Bitmap bm = BitmapFactory.decodeByteArray(a, 0, a.length);
int[] pixels = new int[bm.getWidth() * bm.getHeight()];
bm.getPixels(pixels, 0, bm.getWidth(), 0, 0,bm.getWidth(),bm.getHeight());
ByteBuffer buffer = ByteBuffer.allocate(bm.getRowBytes()*bm.getHeight());
bm.copyPixelsToBuffer(buffer);//I copy the pixels from Bitmap bm to the buffer
ByteBuffer buffer1 = ByteBuffer.wrap(buffer.array());
newbm = Bitmap.createBitmap(160, 160,Config.RGB_565);
newbm.copyPixelsFromBuffer(buffer1);//I read pixels from the Buffer and put the pixels to the Bitmap newbm.
imageview1.setImageBitmap(newbm);
imageview2.setImageBitmap(bm);
Why do the Bitmaps bm and newbm not display the same content?
In your code, you are copying the pixels into a bitmap with RGB_565 format, whereas the original bitmap from which you got the pixels must be in a different format.
The problem is clear from the documentation of copyPixelsFromBuffer():
The data in the buffer is not changed in any way (unlike setPixels(), which converts from unpremultiplied 32bit to whatever the bitmap's native format is).
So either use the same bitmap format, use setPixels(), or draw the original bitmap onto the new one with a Canvas.drawBitmap() call.
Also use bm.getWidth() and bm.getHeight() to specify the size of the new bitmap instead of hard-coding 160.
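A minimal sketch of the first suggestion (not the original answerer's code): create the destination bitmap with the same dimensions and the same Config as the source, and rewind the buffer before reading it back, so copyPixelsFromBuffer() interprets the data exactly as copyPixelsToBuffer() wrote it.
Bitmap bm = BitmapFactory.decodeByteArray(a, 0, a.length);

ByteBuffer buffer = ByteBuffer.allocate(bm.getRowBytes() * bm.getHeight());
bm.copyPixelsToBuffer(buffer);  // advances the buffer's position
buffer.rewind();                // reset the position before reading back

// Same size and same pixel format as the source, so the raw bytes line up.
Bitmap newbm = Bitmap.createBitmap(bm.getWidth(), bm.getHeight(), bm.getConfig());
newbm.copyPixelsFromBuffer(buffer);

imageview1.setImageBitmap(newbm);
imageview2.setImageBitmap(bm);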
