How to pass an image from Java to Flutter?

I am trying to pass an image from Java to Flutter. The image is a Bitmap on the Java side, so I first tried converting it to a byte[].
On the Java side:
int byteCount = transferredImage.getAllocationByteCount(); // transferredImage is the bitmap
ByteBuffer buffer = ByteBuffer.allocate(byteCount); //Create a new buffer
transferredImage.copyPixelsToBuffer(buffer);
byte[] myByte = buffer.array();
Then, on the Flutter side, I read the bytes as a Uint8List and create an img.Image object from them:
img.Image styleImage = img.Image.fromBytes(
  imgWidth,
  imgHeight,
  transferredBytes,
);
This works without causing any errors, but the image displayed using Image.memory(img.encodeJpg(styleImage)) always has a blue tint. The bitmap is fine on the Java side; it is only after the conversion to byte[] and display in Flutter that the blue tint appears.
So, assuming some important information was being dropped by the conversion above, I next compressed the image to an encoded format and transferred those bytes instead.
On the Java side:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
transferredImage.compress(Bitmap.CompressFormat.PNG, 100, baos);
byte[] bytes = baos.toByteArray();
On the Flutter side:
img.Image styleImage = img.Image.fromBytes(
  imgWidth,
  imgHeight,
  transferredBytes,
);
But this does not work and throws an error:
RangeError (index): Index out of range: index should be less than 321512: 322560
If possible, I would prefer to get my first attempt working, as it is really fast. Is there any way I can do that?
Thanks
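A blue tint after a raw-pixel transfer usually points to swapped red and blue channels rather than dropped data: the byte order that copyPixelsToBuffer produces depends on the Bitmap config, while img.Image.fromBytes interprets the bytes with a fixed channel order (RGBA by default in the image package). The second attempt fails for a different reason: fromBytes expects raw pixels (width × height × 4 bytes), but the PNG output is a compressed file, which would need to be decoded (e.g. with img.decodePng) instead. A minimal sketch of a Java-side workaround, assuming an ARGB_8888 bitmap (the helper name toRgbaBytes is made up), that forces an explicit RGBA byte order so the Flutter side can keep using fromBytes:

import android.graphics.Bitmap;

// Sketch: extract pixels in a guaranteed RGBA byte order.
static byte[] toRgbaBytes(Bitmap bitmap) {
    int width = bitmap.getWidth();
    int height = bitmap.getHeight();
    int[] pixels = new int[width * height];
    bitmap.getPixels(pixels, 0, width, 0, 0, width, height); // packed ARGB ints
    byte[] rgba = new byte[width * height * 4];
    for (int i = 0; i < pixels.length; i++) {
        int p = pixels[i];
        rgba[i * 4]     = (byte) ((p >> 16) & 0xFF); // R
        rgba[i * 4 + 1] = (byte) ((p >> 8) & 0xFF);  // G
        rgba[i * 4 + 2] = (byte) (p & 0xFF);         // B
        rgba[i * 4 + 3] = (byte) ((p >> 24) & 0xFF); // A
    }
    return rgba;
}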

Related

Android bitmap image to Unity C# texture

I'm in the draw loop of an Android view:
Bitmap bitmap = Bitmap.createBitmap(this.getWidth(),
this.getHeight(), Bitmap.Config.ARGB_4444);
Canvas newCanvas = new Canvas(bitmap);
super.draw(newCanvas);
Log.d("AndroidUnity","Canvas Drawn!");
mImageView.setImageBitmap(bitmap);
And the above code shows me the correct drawing in the attached ImageView.
When I convert the bitmap to a byte array:
ByteBuffer byteBuffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(byteBuffer);
byte[] bytes = byteBuffer.array();
importing the bytes into Unity does not work (it shows a black image on my RawImage):
imageTexture2D = new Texture2D(width, height, TextureFormat.ARGB4444, false);
imageTexture2D.LoadRawTextureData(bytes);
imageTexture2D.Apply();
RawImage.texture = imageTexture2D;
Any ideas on how to get the Java byte[] to display as a texture/image in Unity? I've verified that the bytes are sent correctly, i.e. when I push a byte array of {1,2,3,4} from Android, I get {1,2,3,4} on the Unity side.
I should also mention that Unity throws an error when trying to transfer the bytes as a byte[] directly, so instead I have to follow this advice on the C# side:
void ReceiveAndroidBytes(AndroidJavaObject jo) {
    AndroidJavaObject bufferObject = jo.Get<AndroidJavaObject>("Buffer");
    byte[] bytes = AndroidJNIHelper.ConvertFromJNIArray<byte[]>(bufferObject.GetRawObject());
}
and a trivial byte[] container class "Buffer" on the Java side.
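For reference, a minimal sketch of what such a container might look like (only the public field name "Buffer" is fixed by the jo.Get<AndroidJavaObject>("Buffer") lookup above; the rest is an assumption):

// Java-side sketch: the field name must match the string the C# side
// passes to jo.Get<AndroidJavaObject>("Buffer").
public class Buffer {
    public byte[] Buffer;

    public Buffer(byte[] bytes) {
        this.Buffer = bytes;
    }
}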
I was trying to do the exact same thing, and my initial attempts also produced a black texture. I do the array conversion with AndroidJNIHelper.ConvertFromJNIArray just as you do, except I used sbyte[] instead of byte[]. To set the actual image data I ended up using
imageTexture2D.SetPixelData(bytes, 0);
If I'm not mistaken, LoadRawTextureData expects something even rawer than an array of pixel data; it may be the compressed layout in which graphics cards store textures. If that is true, raw pixel data isn't in the right format and can't be decoded.

Android - Error when uploading a bitmap to server C#

When I upload an image from Android to the server (C#), an error is sometimes generated while converting the image to a byte array. This is very strange because the error does not always occur: depending on the image, the conversion either succeeds or fails.
The conversion code:
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmapPhoto.compress(Bitmap.CompressFormat.JPEG, 50, stream);
final byte[] img_byte = stream.toByteArray();
final String imageString = Base64.encodeToString(img_byte, Base64.DEFAULT);
On the C# side I convert the received string back into a byte array and then wrap it in a MemoryStream:
byte[] novaFoto = Convert.FromBase64String(NovaFoto.FotoString);
using (MemoryStream stream = new MemoryStream(novaFoto))
{
    Image image = Image.FromStream(stream);
}
Depending on the image, the error appears in either Convert.FromBase64String or Image.FromStream.
Can anyone help me with this? If you have another approach, I'd like to hear it. Thanks!
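One possible culprit, offered as an assumption since the transport code isn't shown: Base64.DEFAULT inserts line breaks into the encoded string, and if those get mangled in transit the decode fails on the server; likewise, if the string travels in a URL or form post, '+' characters become spaces unless escaped. Encoding without wrapping on the Java side removes the line-break issue:

import android.util.Base64;

// NO_WRAP omits the line breaks that Base64.DEFAULT inserts,
// which some transports corrupt and Convert.FromBase64String then rejects.
final String imageString = Base64.encodeToString(img_byte, Base64.NO_WRAP);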

Convert byte[] to Image in EmguCV without knowing width and height

I need to convert a byte array (byte[], coming from a Java BufferedImage) to an EmguCV Image without knowing the width and height.
byte[] bytes = [data]; // pseudocode: the incoming image bytes
Image<Gray, byte> image = new Image<Gray, byte>();
image.Bytes = bytes;
Can you help me?
I found this on the Emgu forum (http://www.emgu.com/forum/viewtopic.php?t=1057) and retyped it to remove a variable-name typo:
public Image<Bgr, byte> byteArrayToImage(byte[] byteArrayIn)
{
    // The using blocks dispose of the stream and bitmap once the Emgu image
    // (which copies the pixel data) has been constructed.
    using (MemoryStream ms = new MemoryStream(byteArrayIn))
    using (Bitmap returnImage = (Bitmap) Image.FromStream(ms))
    {
        return new Image<Bgr, byte>(returnImage);
    }
}
Assuming your byte array holds a standard encoded image (PNG, JPEG, etc.), it looks like you can load it into a standard Bitmap, which figures out the sizing automatically, and then pass that bitmap to the constructor of your Emgu image.
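And since the byte array in the question comes from a Java BufferedImage, here is a sketch of the sending side (assuming you control it): encoding to a standard format such as PNG is what makes the bytes self-describing, so the C# side never needs the dimensions:

import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

// Encode the BufferedImage as PNG so the width and height travel
// inside the byte stream itself.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "png", baos);
byte[] bytes = baos.toByteArray();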

Convert BufferedImage to aws...rekognition.model.Image

I'm playing around with Amazon Rekognition. I found a really nice/easy library to take an image from my webcam, which works like this:
BufferedImage bufImg = webcam.getImage();
I'm then trying to convert this BufferedImage to a com.amazonaws.services.rekognition.model.Image, which is what must be submitted to the Rekognition library. This is what I'm doing:
byte[] imgBytes = ((DataBufferByte) bufImg.getData().getDataBuffer()).getData();
ByteBuffer byteBuffer = ByteBuffer.wrap(imgBytes);
return new Image().withBytes(byteBuffer);
However, when I make an API call to Rekognition with the Image, I get an exception:
com.amazonaws.services.rekognition.model.InvalidImageFormatException: Invalid image encoding (Service: AmazonRekognition; Status Code: 400; Error Code: InvalidImageFormatException; Request ID: X)
The docs state that the Java SDK automatically base64-encodes the bytes. In case something weird was happening, I tried base64-encoding the bytes myself before converting:
imgBytes = Base64.getEncoder().encode(imgBytes);
However, the same Exception ensues.
Any ideas? :)
I tried encoding the image as a JPG (Rekognition supports PNG and JPG formats) and it solved the problem:
BufferedImage bufImg = webcam.getImage();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(bufImg, "jpg", baos); // encode as JPEG; Rekognition rejects raw pixel buffers
ByteBuffer byteBuffer = ByteBuffer.wrap(baos.toByteArray());
return new Image().withBytes(byteBuffer);
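For context, a sketch of how the resulting Image might then be used; the client setup is assumed, and rekognitionClient is a made-up name for a configured AmazonRekognition instance:

// Hypothetical call with a configured AmazonRekognition client.
DetectLabelsRequest request = new DetectLabelsRequest()
        .withImage(new Image().withBytes(byteBuffer))
        .withMaxLabels(10);
DetectLabelsResult result = rekognitionClient.detectLabels(request);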

Getting an in-memory representation of the JPEG rendering of an Image

I need to know how to get an array of bytes from a loaded image in Java. BufferedImage does not seem to supply any method that produces an array of bytes, so what do I use?
BufferedImage bufferedImage; // assumed that you have created it already
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "jpg", byteStream); // encode as JPEG into the stream
byte[] byteArray = byteStream.toByteArray();
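As a quick sanity check, the bytes can be decoded back into a BufferedImage; a sketch, assuming the encode above succeeded:

import java.io.ByteArrayInputStream;

// ImageIO.read returns null if the bytes are not a readable image format.
BufferedImage decoded = ImageIO.read(new ByteArrayInputStream(byteArray));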
