Android RenderScript ScriptIntrinsicColorMatrix setRGBtoYUV example usage - Java

As the title says: how do I convert a bitmap's RGB data back to a YUV byte[] using ScriptIntrinsicColorMatrix? Below is my sample code (its output cannot be decoded by zxing):
public byte[] getYUVBytes(Bitmap src, boolean initOutAllocOnce) {
    if (!initOutAllocOnce) {
        outYUV = null;
    }
    if (outYUV == null) {
        outYUV = Allocation.createSized(rs, Element.U8(rs), src.getByteCount());
    }

    byte[] yuvData = new byte[src.getByteCount()];
    Allocation in = Allocation.createFromBitmap(rs, src,
            Allocation.MipmapControl.MIPMAP_NONE,
            Allocation.USAGE_SCRIPT);

    scriptColor.setRGBtoYUV();
    scriptColor.forEach(in, outYUV);
    outYUV.copyTo(yuvData);

    return yuvData;
}
One thing I notice: the original camera YUV frame is 3,110,400 bytes, but after the ScriptIntrinsicColorMatrix conversion it becomes 8,294,400 bytes, which I think is wrong. (3,110,400 is 1920 x 1080 x 1.5 bytes, the compact NV21 layout; 8,294,400 is 1920 x 1080 x 4 bytes, i.e. the full ARGB_8888 getByteCount() used to size the output allocation.)
The reason for the YUV -> BW -> YUV round trip is that I want to convert the image to black and white (not grayscale) and back to YUV, so that zxing can decode it while the black-and-white frame is shown in a SurfaceView at the same time (like a custom camera filter).
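As an aside, if the source frames are NV21, the black-and-white step may not need an RGB round trip at all: the first width * height bytes of an NV21 buffer are the luma plane, so thresholding that plane (and neutralizing the chroma) yields black and white directly. A minimal sketch of that idea, with a hypothetical helper name of my own:

// Hypothetical helper: binarize an NV21 frame in place.
public static void binarizeNV21(byte[] nv21, int width, int height, int threshold) {
    int ySize = width * height;
    // Threshold the luma (Y) plane: every pixel becomes pure black or white.
    for (int i = 0; i < ySize; i++) {
        int y = nv21[i] & 0xff;
        nv21[i] = (byte) (y > threshold ? 255 : 0);
    }
    // Set the interleaved V/U bytes to 128 (neutral chroma) so no color remains.
    for (int i = ySize; i < nv21.length; i++) {
        nv21[i] = (byte) 128;
    }
}

zxing's planar YUV luminance source only reads the Y plane, so the thresholded buffer stays decodable while also displaying as black and white.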
I tried the code below, but it is a bit slow (its output can be decoded by zxing):
int[] intArray = new int[bmp.getWidth() * bmp.getHeight()];
bmp.getPixels(intArray, 0, bmp.getWidth(), 0, 0, bmp.getWidth(), bmp.getHeight());
LuminanceSource source = new RGBLuminanceSource(cameraResolution.x, cameraResolution.y, intArray);
data = source.getMatrix();
Is there any other fast alternative for converting RGB to YUV, in case it cannot be done with the ScriptIntrinsicColorMatrix class? Please and thank you.
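If the intrinsic route cannot be made to produce the compact layout, one plain-Java fallback (a sketch of my own using the standard BT.601 coefficients, not tied to ScriptIntrinsicColorMatrix) is to convert the ARGB int[] from Bitmap.getPixels() directly to NV21, which is the width * height * 1.5 layout zxing expects:

// Hypothetical helper: packs ARGB pixels into NV21 (full-resolution Y plane
// followed by 2x2-subsampled interleaved V/U bytes), BT.601 coefficients.
// Assumes even width and height, which is true for camera preview sizes.
public static byte[] argbToNV21(int[] argb, int width, int height) {
    byte[] nv21 = new byte[width * height * 3 / 2];
    int yIndex = 0;
    int uvIndex = width * height;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {
            int c = argb[j * width + i];
            int r = (c >> 16) & 0xff;
            int g = (c >> 8) & 0xff;
            int b = c & 0xff;
            // Luma for every pixel.
            nv21[yIndex++] = (byte) (((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
            // One V/U pair per 2x2 block of pixels.
            if ((j & 1) == 0 && (i & 1) == 0) {
                nv21[uvIndex++] = (byte) (((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                nv21[uvIndex++] = (byte) (((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
            }
        }
    }
    return nv21;
}

A single pass over the int[] like this is usually much faster than going through RGBLuminanceSource, though still slower than a RenderScript kernel.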

Related

How to create Bitmap from Android MediaImage in OUTPUT_IMAGE_FORMAT_RGBA_8888 format?

I am trying to use this new feature of CameraX Image Analysis (version 1.1.0-alpha08): with setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888), images sent to the analyzer will be in RGBA format.
See this for reference: https://developer.android.com/reference/androidx/camera/core/ImageAnalysis#OUTPUT_IMAGE_FORMAT_RGBA_8888
I need to turn the image sent to the analyzer into a Bitmap so that I can input it to a TensorFlow classifier.
Without this new feature I would receive the image in the standard YUV_420_888 format then I would have to use one of the several solutions that can be googled in order to turn YUV_420_888 to RGBA then to Bitmap. Like this: https://blog.minhazav.dev/how-to-convert-yuv-420-sp-android.media.Image-to-Bitmap-or-jpeg/.
I assume getting the Media Image directly in RGBA format should help me avoid implementing those painful solutions (which I have actually tried, and which do not seem to work very well for me so far).
The problem is that I don't know how to turn this RGBA Media Image into a Bitmap. I have noticed that calling mediaImage.getFormat() returns 1, which is not an ImageFormat value but a PixelFormat one, the one logically corresponding to RGBA_8888 format, which is in line with the documentation: "All ImageProxy sent to ImageAnalysis.Analyzer.analyze(ImageProxy) will have format PixelFormat.RGBA_8888".
I have tried this:
private Bitmap toBitmapRGBA(Image image) {
    Image.Plane[] planes = image.getPlanes();
    ByteBuffer buffer = planes[0].getBuffer();
    buffer.rewind();
    int size = buffer.remaining();
    byte[] bytes = new byte[size];
    buffer.get(bytes);
    Bitmap bitmapImage = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, null);
    return bitmapImage;
}
This returns null, indicating that decodeByteArray does not work (it expects compressed image data such as JPEG or PNG, not raw pixels). I also notice the image has only one plane. So I tried this instead:
private Bitmap toBitmapRGBA2(Image image) {
    Image.Plane[] planes = image.getPlanes();
    ByteBuffer buffer = planes[0].getBuffer();
    Bitmap bitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888);
    buffer.rewind();
    bitmap.copyPixelsFromBuffer(buffer);
    return bitmap;
}
This returns a Bitmap, but it looks like nothing but noise.
Please help!
Kind regards
Mickael
I actually found a solution myself, so I am posting it here in case anyone is interested:
private Bitmap toBitmap(Image image) {
    Image.Plane[] planes = image.getPlanes();
    ByteBuffer buffer = planes[0].getBuffer();
    int pixelStride = planes[0].getPixelStride();
    int rowStride = planes[0].getRowStride();
    // Each row may be padded beyond width * pixelStride; make the bitmap
    // wide enough to absorb the padding pixels, or the rows shift (noise).
    int rowPadding = rowStride - pixelStride * image.getWidth();
    Bitmap bitmap = Bitmap.createBitmap(image.getWidth() + rowPadding / pixelStride,
            image.getHeight(), Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(buffer);
    return bitmap;
}
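Note that when rowPadding is non-zero, the resulting Bitmap is slightly wider than the image; if the padding column matters for your classifier, you can crop it off (a small follow-up of my own, not part of the original answer):

// Crop away the padding pixels so the bitmap matches the image dimensions.
Bitmap cropped = Bitmap.createBitmap(bitmap, 0, 0,
        image.getWidth(), image.getHeight());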
If you want to process the pixel array further without creating a bitmap object, you can do something like this:
// Note: toByteArray() is not a Kotlin standard-library function on ByteBuffer;
// it is assumed to be a small extension such as
// fun ByteBuffer.toByteArray() = ByteArray(remaining()).also { get(it) }
val data = imageProxy.planes[0].buffer.toByteArray()
val pixels = IntArray(data.size / imageProxy.planes[0].pixelStride) {
    var index = it * imageProxy.planes[0].pixelStride
    // Repack the R, G, B, A bytes as a single ARGB_8888 int.
    (data[index++].toInt() and 0xff).shl(16) or
            (data[index++].toInt() and 0xff).shl(8) or
            (data[index++].toInt() and 0xff).shl(0) or
            (data[index].toInt() and 0xff).shl(24)
}
And then you can create the bitmap this way:
Bitmap.createBitmap(
    pixels,
    0,
    imageProxy.planes[0].rowStride / imageProxy.planes[0].pixelStride,
    imageProxy.width,
    imageProxy.height,
    Bitmap.Config.ARGB_8888
)

Android bitmap image to Unity C# texture

I'm in the draw loop of an android view:
Bitmap bitmap = Bitmap.createBitmap(this.getWidth(),
this.getHeight(), Bitmap.Config.ARGB_4444);
Canvas newCanvas = new Canvas(bitmap);
super.draw(newCanvas);
Log.d("AndroidUnity","Canvas Drawn!");
mImageView.setImageBitmap(bitmap);
The above code shows me the correct drawing in the attached ImageView.
When I convert the bitmap to a byte array:
ByteBuffer byteBuffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(byteBuffer);
byte[] bytes = byteBuffer.array();
Importing the bytes into Unity does not work, however (it shows a black image on my RawImage):
imageTexture2D = new Texture2D(width, height, TextureFormat.ARGB4444, false);
imageTexture2D.LoadRawTextureData(bytes);
imageTexture2D.Apply();
RawImage.texture = imageTexture2D;
Any ideas on how to get the Java byte[] to display as a texture/image in Unity? I've tested that the bytes are sent across correctly, i.e. when I push a byte array of {1,2,3,4} from Android, I get {1,2,3,4} on the Unity side.
All this is without mentioning that Unity throws an error when trying to transfer the bytes as a byte[], so instead I have to follow this advice on the C# side:
void ReceieveAndroidBytes(AndroidJavaObject jo) {
    AndroidJavaObject bufferObject = jo.Get<AndroidJavaObject>("Buffer");
    byte[] bytes = AndroidJNIHelper.ConvertFromJNIArray<byte[]>(bufferObject.GetRawObject());
}
and a trivial byte[] container class "Buffer" on the Java side.
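For reference, the post does not show that container class; a minimal sketch of what it might look like (class name assumed, field name taken from the C# snippet above):

// Hypothetical Java-side container. Unity reads the public "Buffer" field
// via jo.Get<AndroidJavaObject>("Buffer") and then converts it with
// AndroidJNIHelper.ConvertFromJNIArray.
public class BytesContainer {
    public byte[] Buffer;

    public BytesContainer(byte[] data) {
        this.Buffer = data;
    }
}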
I was trying to do the exact same thing, and my initial attempts also produced a black texture. I do the array conversion with AndroidJNIHelper.ConvertFromJNIArray like you do, except I used sbyte[] instead of byte[]. To set the actual image data I ended up using
imageTexture2D.SetPixelData(bytes, 0);
If I'm not mistaken, LoadRawTextureData expects something even rawer than an array of pixel values: it seems to take the texture in the layout the graphics card stores it in, which may be compressed. If that is true, plain pixel data is not in the right format and cannot be decoded.

camera2 output to Bitmap

I'm trying to use Google Mobile Vision API with the camera2 module and I'm having a lot of trouble.
I'm using Google's android-Camera2Video example code as a base. I've modified it to include the following callback:
Camera2VideoFragment.java
OnCameraImageAvailable mCameraImageCallback;

public interface OnCameraImageAvailable {
    void onCameraImageAvailable(Image image);
}

ImageReader.OnImageAvailableListener mImageAvailable = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        if (image == null)
            return;

        mCameraImageCallback.onCameraImageAvailable(image);
        image.close();
    }
};
That way any fragment including Camera2VideoFragment.java can get access to its images.
Now, the Barcode API only accepts Bitmap images, but I'm unable to convert YUV_420_888 to a Bitmap. Instead, I changed the ImageReader's format to JPEG and ran the following conversion code:
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
buffer.rewind();
byte[] data = new byte[buffer.capacity()];
buffer.get(data);
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
This worked but the framerate drop of feeding JPEG data to the imageReader was significant. I'm wondering if anyone has worked around this issue before.
A late answer, but hopefully still helpful.
As Ezequiel Adrian explained in his example of converting YUV_420_888 into one of the supported formats (in his case NV21), you can do a similar thing to get your Bitmap output:
private byte[] convertYUV420888ToNV21(Image imgYUV420) {
    // Converting YUV_420_888 data to YUV_420_SP (NV21).
    // Note: this relies on the U and V planes being interleaved in memory
    // (pixelStride == 2), which is common on devices but not guaranteed
    // by the YUV_420_888 format.
    byte[] data;
    ByteBuffer buffer0 = imgYUV420.getPlanes()[0].getBuffer();
    ByteBuffer buffer2 = imgYUV420.getPlanes()[2].getBuffer();
    int buffer0_size = buffer0.remaining();
    int buffer2_size = buffer2.remaining();
    data = new byte[buffer0_size + buffer2_size];
    buffer0.get(data, 0, buffer0_size);
    buffer2.get(data, buffer0_size, buffer2_size);
    return data;
}
Then you can convert the result into a Bitmap. One catch, though: BitmapFactory.decodeByteArray only decodes compressed image data (JPEG, PNG, and so on), so calling it on these raw NV21 bytes will just return null.
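A route that does work (a sketch of my own; it assumes the Image the NV21 bytes came from is still available for its dimensions) is to wrap the bytes in a YuvImage and compress to JPEG in memory first:

// Wrap the raw NV21 bytes in a YuvImage, compress to JPEG in memory,
// then let BitmapFactory decode the JPEG.
byte[] nv21 = convertYUV420888ToNV21(image);
YuvImage yuvImage = new YuvImage(nv21, ImageFormat.NV21,
        image.getWidth(), image.getHeight(), null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(0, 0, image.getWidth(), image.getHeight()), 100, out);
byte[] jpegBytes = out.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);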

JAVA : How to create .PNG image from a byte[]?

I have seen some source code, but I do not understand it...
I am using Java 7.
Please, how can I convert an RGB (red, green, blue) byte array (or something similar) to a .PNG file?
Example of an array that could represent one RGB pixel:
byte[] aByteArray={0xa,0x2,0xf};
Important aspect: I am trying to generate the .PNG file purely from a byte[], not from a previously existing file.
Is it possible with an existing API? ;)
Here is my first attempt:
byte[] aByteArray = {0xa, 0x2, 0xf};
ByteArrayInputStream bais = new ByteArrayInputStream(aByteArray);
File outputfile = new File("image.png");
ImageIO.write(bais, "png", outputfile);
...Error: no suitable method found (ImageIO.write has no overload that accepts an InputStream).
Here is the other version, modified from Jeremy's suggestion, but it looks similar:
byte[] aByteArray = {0xa, 0x2, 0xf};
ByteArrayInputStream bais = new ByteArrayInputStream(aByteArray);
final BufferedImage bufferedImage = ImageIO.read(bais);
ImageIO.write(bufferedImage, "png", new File("image.png"));
...multiple errors: image == null! (ImageIO.read returns null here because the three raw bytes are not an encoded image.) Note: I am not trying to read from an existing source file.
The Image I/O API deals with images, so you need to make an image from your byte array first before you write it out.
byte[] aByteArray = {0xa,0x2,0xf,(byte)0xff,(byte)0xff,(byte)0xff};
int width = 1;
int height = 2;
DataBuffer buffer = new DataBufferByte(aByteArray, aByteArray.length);
//3 bytes per pixel: red, green, blue
WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, 3 * width, 3, new int[] {0, 1, 2}, (Point)null);
ColorModel cm = new ComponentColorModel(ColorModel.getRGBdefault().getColorSpace(), false, true, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
BufferedImage image = new BufferedImage(cm, raster, true, null);
ImageIO.write(image, "png", new File("image.png"));
This assumes the byte array has three bytes per pixel (red, green then blue) and the range of values is 0-255.
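If you need the PNG bytes in memory rather than in a file (as in the map tile question below, for example), ImageIO can also write to a stream; a small variant under the same assumptions:

// Encode the BufferedImage to PNG bytes in memory instead of a file.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, "png", baos);
byte[] pngBytes = baos.toByteArray();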

How to make custom Tiles in android google maps

Android's Google Maps API v2 provides a TileOverlay to which TileProviders can be added. A TileProvider generates a Tile object for a given tile x, y, and zoom level. To make a Tile, one must give it a width (easy), a height (easy), and an image represented as a byte array (this is what confuses me). If I wanted to 'draw' a simple object and then turn it into a byte array, how would I do this?
For instance, I am looking for something like:
Canvas canvas = new Canvas();
...
canvas.drawRect(); // Or something like this (just an example)
...
byte[] bytes = canvas.SomeConversionFunctionOrProcessThatIDontKnow();
return new Tile(1, 1, bytes);
public byte[] getByteArray(String image) throws IOException {
    File yourImg = new File(image);
    BufferedImage bufferedImage = ImageIO.read(yourImg);
    // Note: this returns the raw raster bytes, not encoded PNG data.
    WritableRaster wRaster = bufferedImage.getRaster();
    DataBufferByte data = (DataBufferByte) wRaster.getDataBuffer();
    return data.getData();
}
This should do the trick ; )
Tile#data should be compressed image data in one of the supported image formats. In other words, the raw contents of an image file. If you already have a decoded Bitmap, use Bitmap#compress(...) to write it to a ByteArrayOutputStream, then get the byte[] from that.
@Override
public Tile getTile(int x, int y, int zoom) {
    Bitmap bitmap = Bitmap.createBitmap(TILE_DIMENSION, TILE_DIMENSION, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    ......
    // draw something
    ......
    Tile tile = convertBitmap(bitmap);
    bitmap.recycle();
    return tile;
}
private Tile convertBitmap(Bitmap bitmap) {
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
    byte[] bitmapData = stream.toByteArray();
    return new Tile(TILE_DIMENSION, TILE_DIMENSION, bitmapData);
}
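For completeness, a provider built around these two methods would be registered on the map roughly like this (a sketch; MyTileProvider is an assumed name for the class containing the code above):

// Hypothetical wiring: back a TileOverlay with the provider above.
TileOverlay overlay = map.addTileOverlay(
        new TileOverlayOptions().tileProvider(new MyTileProvider()));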
