How to convert mat(OpenCV) to image(JavaFX)?
I think this isn't the best method:
MatOfByte byteMat = new MatOfByte();
Highgui.imencode(".bmp", mat, byteMat);
return new Image(new ByteArrayInputStream(byteMat.toArray()));
P.S.
Image - import javafx.scene.image.Image;
One way to do it would be this. I do not remember the source I got it from:
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import javafx.embed.swing.SwingFXUtils;
import javafx.scene.image.Image;
import org.opencv.core.Mat;

private Image mat2Image(Mat frame) {
    // Single-channel Mats map to grayscale, multi-channel ones to 3-byte BGR
    int type = BufferedImage.TYPE_BYTE_GRAY;
    if (frame.channels() > 1) {
        type = BufferedImage.TYPE_3BYTE_BGR;
    }
    int bufferSize = frame.channels() * frame.cols() * frame.rows();
    byte[] b = new byte[bufferSize];
    frame.get(0, 0, b); // get all the pixels
    BufferedImage image = new BufferedImage(frame.cols(), frame.rows(), type);
    final byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
    System.arraycopy(b, 0, targetPixels, 0, b.length);
    return SwingFXUtils.toFXImage(image, null);
}
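The arraycopy-into-the-raster step above can be exercised without OpenCV at all. Here is a minimal pure-JDK sketch that fills a TYPE_BYTE_GRAY BufferedImage the same way; the pixel values are made up for the demo, where with OpenCV they would come from frame.get(0, 0, b):

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public class GrayCopyDemo {
    public static void main(String[] args) {
        int w = 4, h = 2;
        // Fake grayscale pixels; with OpenCV these would come from frame.get(0, 0, b)
        byte[] b = new byte[w * h];
        for (int i = 0; i < b.length; i++) {
            b[i] = (byte) (i * 10);
        }
        BufferedImage image = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY);
        byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
        System.arraycopy(b, 0, targetPixels, 0, b.length);
        // Pixel (1,0) is index 1 in row-major order, so its sample is 10
        System.out.println(image.getRaster().getSample(1, 0, 0));
    }
}
```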
There may be a way to make it neater using the Converters class from OpenCV along with JDK 8. I will update this if I find any such thing.
Paritosh, the issue with your method is that it only applies to Mats of type CvType.CV_8U or CvType.CV_8S, since only those Mats can be held in a byte array. If the type of the Mat is, let's say, CvType.CV_32F, you would need a float array to hold the data, and a float array cannot be copied into the byte[] targetPixels array with System.arraycopy.
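As a pure-JDK illustration of that point (no OpenCV involved): float samples have to go through a FloatBuffer view of a ByteBuffer rather than System.arraycopy. For a CV_32F Mat, the float[] below would be filled with mat.get(0, 0, data); here it is hard-coded sample data:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FloatToBytesDemo {
    public static void main(String[] args) {
        // With OpenCV this would be filled via mat.get(0, 0, data) on a CV_32F Mat
        float[] data = {0.5f, 1.0f, 2.0f};
        ByteBuffer bb = ByteBuffer.allocate(data.length * Float.BYTES)
                .order(ByteOrder.nativeOrder());
        bb.asFloatBuffer().put(data); // writes 4 bytes per float
        byte[] bytes = bb.array();
        System.out.println(bytes.length);
    }
}
```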
I have tried the code below, but I am not sure if I am right:
Mat originalImage = Highgui.imread(path);
int[] imageInByte = new int[(int) (originalImage.total() * originalImage.channels())];
I also want to know how to get a Mat back from an integer array.
After allocating the array,
byte imageInByte[] = new byte[(int) (originalImage.total() * originalImage.channels())];
You can copy the array from C++/JNI,
originalImage.get(0, 0, imageInByte);
To update the array in C++/JNI
originalImage.put(0, 0, imageInByte);
Mat mRgb = Imgcodecs.imread("test.jpg");
MatOfInt iRgb = new MatOfInt(CvType.CV_32S); // intermediate type
mRgb.convertTo(iRgb, CvType.CV_32S); // copy mRgb's data into iRgb
int[] dataArray = new int[(int) (iRgb.total() * iRgb.channels())];
iRgb.get(0, 0, dataArray); // iRgb data as int[]
As the title says: how do I convert an RGB bitmap back to a YUV byte[] using ScriptIntrinsicColorMatrix? Below is my sample code (its output cannot be decoded by ZXing):
public byte[] getYUVBytes(Bitmap src, boolean initOutAllocOnce) {
    if (!initOutAllocOnce) {
        outYUV = null;
    }
    if (outYUV == null) {
        outYUV = Allocation.createSized(rs, Element.U8(rs), src.getByteCount());
    }
    byte[] yuvData = new byte[src.getByteCount()];
    Allocation in = Allocation.createFromBitmap(rs, src,
            Allocation.MipmapControl.MIPMAP_NONE,
            Allocation.USAGE_SCRIPT);
    scriptColor.setRGBtoYUV();
    scriptColor.forEach(in, outYUV);
    outYUV.copyTo(yuvData);
    return yuvData;
}
One thing I notice is that the original camera YUV buffer is 3110400 bytes, but after the ScriptIntrinsicColorMatrix conversion it becomes 8294400 bytes, which I think is wrong.
The reason for YUV -> BW -> YUV is that I want to convert the image to black and white (not grayscale) and back to YUV for ZXing to decode, while at the same time showing the black and white frame in a SurfaceView (like a custom camera filter).
I tried the code below, but it is a bit slow (its output can be decoded by ZXing).
int[] intArray = new int[bmp.getWidth() * bmp.getHeight()];
bmp.getPixels(intArray, 0, bmp.getWidth(), 0, 0, bmp.getWidth(), bmp.getHeight());
LuminanceSource source = new RGBLuminanceSource(cameraResolution.x, cameraResolution.y, intArray);
data = source.getMatrix();
Is there any other fast alternative for RGB to YUV, if it cannot be done with the ScriptIntrinsicColorMatrix class?
Please and thank you.
This question already has answers here: OpenCV Mat object serialization in java (4 answers). Closed 5 years ago.
I have a Mat image in my system and I want to be able to store it in my SQLite database. I am thinking I need to convert it to a byte array to be able to store it. But I am not sure that is right, because I do not know how to take the value I would get back from the db and turn it into the original Mat image again.
Below is what I have come up with so far:
static byte[] matToByte(Mat mat) throws SQLException {
    int length = (int) (mat.total() * mat.elemSize());
    byte buffer[] = new byte[length];
    int converted = mat.get(0, 0, buffer);
    IrisInitialDatabase.addFeatures(converted);
    return buffer;
}

static Mat byteToMat(byte[] value) {
    Mat m = new Mat();
    m.put(0, 0, value);
    return m;
}
thanks :)
Save it to the database in Base64 format.
Mat to bitmap
Bitmap image = Bitmap.createBitmap(rgba.cols(),
rgba.rows(), Bitmap.Config.RGB_565);
Utils.matToBitmap(rgba, image);
Bitmap bitmap = (Bitmap) image;
bitmap = Bitmap.createScaledBitmap(bitmap, 600, 450, false);
bitmap to byte array
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, byteArrayOutputStream);
byte[] byteArray = byteArrayOutputStream.toByteArray();
---save to database---
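The save step itself is just text once the PNG bytes are Base64-encoded. The decode line below uses android.util.Base64; a plain-JDK equivalent with java.util.Base64 (made-up sample bytes standing in for byteArray) might look like this:

```java
import java.util.Base64;

public class Base64RoundTrip {
    public static void main(String[] args) {
        byte[] byteArray = {1, 2, 3, 4}; // stand-in for the PNG bytes from the stream above
        // Encode for storage as TEXT, then decode on the way back out
        String base64Image = Base64.getEncoder().encodeToString(byteArray);
        byte[] restored = Base64.getDecoder().decode(base64Image);
        System.out.println(base64Image + " " + (restored.length == byteArray.length));
    }
}
```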
get mat back
m = Highgui.imdecode(new MatOfByte(Base64.decode(base64ImageFromDB,0)),Highgui.IMREAD_UNCHANGED);
I have some C code that decodes a video frame by frame. I get to a point where I have an AVFrame in BGR32 and would like to send it back to Java for editing.
I have a ByteBuffer object in my C code that was created in Java using allocateDirect, but I struggle to write the content of AVFrame->data[0] (of uint8_t type) to it and read it back. I have tried memcpy with no luck. Does anyone have an idea how to achieve this?
UPDATE
I followed Will's comment below and wrote this in C:
char *buf = (*pEnv)->GetDirectBufferAddress(pEnv, byteBuffer);
memcpy(buf, rgb_frame->data[0], output_width*output_height*4);
The buffer does contain some data in Java, but doing the following returns a null bitmap:
BufferedImage frame = ImageIO.read(bitmapStream);
Where bitmapStream is a ByteBufferInputStream defined here:
https://code.google.com/p/kryo/source/browse/trunk/src/com/esotericsoftware/kryo/io/ByteBufferInputStream.java?r=205
I am not sure whether I am writing things correctly into this buffer.
UPDATE 2
Got pretty close now thanks to the latest snippet. I am using the BGR32 format in my C code, i.e. 4 bytes per pixel, so I modified things a bit in Java:
final byte[] dataArray = new byte[width*height*4];
imageData.get(dataArray);
final BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_BGR);
final DataBuffer buffer = new DataBufferByte(dataArray, dataArray.length);
Raster raster = Raster.createRaster(sampleModel, buffer, null);
image.setData(raster);
I get the image correctly, but there seems to be an issue with the color channels. I tried different formats with no luck.
From Oracle's JNI Functions Documentation at
https://docs.oracle.com/javase/8/docs/technotes/guides/jni/spec/functions.html#GetDirectBufferAddress
GetDirectBufferAddress
void* GetDirectBufferAddress(JNIEnv* env, jobject buf);
Fetches and returns the starting address of the memory region referenced by the given direct java.nio.Buffer. This function allows native code to access the same memory region that is accessible to Java code via the buffer object.
LINKAGE: Index 230 in the JNIEnv interface function table.
PARAMETERS:
env: the JNIEnv interface pointer
buf: a direct java.nio.Buffer object (must not be NULL)
RETURNS: Returns the starting address of the memory region referenced by the buffer. Returns NULL if the memory region is undefined, if the given object is not a direct java.nio.Buffer, or if JNI access to direct buffers is not supported by this virtual machine.
SINCE: JDK/JRE 1.4
I tested with this C++ code:
#include <jni.h>
#include <cstring>

static char framebuf[100];

JNIEXPORT void JNICALL Java_javaapplication45_UseByteBuffer_readBuf
  (JNIEnv *env, jobject usebb, jobject bb) {
    void *addr = env->GetDirectBufferAddress(bb);
    framebuf[0] = 77;
    memcpy(addr, framebuf, 100); // copy 100 bytes into the direct buffer's memory
}
and this Java Code:
public class UseByteBuffer {
    public native void readBuf(ByteBuffer bb);
}
...
public static void main(String[] args) {
    System.load("/home/shackle/NetBeansProjects/usebb/dist/Debug/GNU-Linux-x86/libusebb.so");
    ByteBuffer bb = ByteBuffer.allocateDirect(100);
    new UseByteBuffer().readBuf(bb);
    byte first_byte = bb.get(0);
    System.out.println("first_byte = " + first_byte);
}
And it printed first_byte = 77, indicating the data was copied correctly.
Update
ImageIO.read() will not accept just any set of bytes; the data has to be in a format that one of the installed ImageReaders can recognize, such as JPEG or PNG.
Here is an example of getting raw (3-byte r,g,b) bytes into an image:
int width = 256;
int height = 256;
ByteBuffer bb = ByteBuffer.allocateDirect(height*width*3);
byte[] raw = new byte[width * height * 3];
bb.get(raw);
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
DataBuffer buffer = new DataBufferByte(raw, raw.length);
SampleModel sampleModel = new ComponentSampleModel(DataBuffer.TYPE_BYTE, width, height, 3, width * 3, new int[]{0,1,2});
Raster raster = Raster.createRaster(sampleModel, buffer, null);
image.setData(raster);
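The point about ImageIO.read() needing a recognized container format can be verified with a quick round trip: wrap the pixels in a real format such as PNG first, then read them back. A small self-contained check (the 8x8 image size is arbitrary):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class ImageIoRoundTrip {
    public static void main(String[] args) throws Exception {
        BufferedImage src = new BufferedImage(8, 8, BufferedImage.TYPE_3BYTE_BGR);
        // Encode to a format an installed ImageReader understands (PNG)
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(src, "png", out);
        // Now ImageIO.read() succeeds, where raw pixel bytes would return null
        BufferedImage decoded = ImageIO.read(new ByteArrayInputStream(out.toByteArray()));
        System.out.println(decoded != null && decoded.getWidth() == 8);
    }
}
```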
Update 2
For BGR32 I believe this would be closer:
ByteBuffer imageData = ByteBuffer.allocateDirect(height * width * 4);
byte[] raw = new byte[width * height * 4];
imageData.get(raw);
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_BGR);
DataBuffer buffer = new DataBufferByte(raw, raw.length);
SampleModel sampleModel = new ComponentSampleModel(
DataBuffer.TYPE_BYTE, width, height, 4, width * 4,
new int[]{2,1,0} // Try {1,2,3}, {3,2,1}, {0,1,2}
);
Raster raster = Raster.createRaster(sampleModel, buffer, null);
image.setData(raster);
Notice where I have commented, where I suspect you may need to experiment with the array of bandOffsets in the third argument of the ComponentSampleModel constructor to fix the color model.
Update 3
One can reuse the sampleModel to get data out of the image by using BufferedImage.copyData() to a WritableRaster instead of using getRaster().
SampleModel sampleModel = new ComponentSampleModel(
DataBuffer.TYPE_BYTE, width, height, 4, width * 4,
new int[]{2, 1, 0}
);
...
BufferedImage newImage = ImageIO.read(new File("test.png"));
byte newRaw[] = new byte[height*width*4];
DataBuffer newBuffer = new DataBufferByte(newRaw, newRaw.length);
WritableRaster newRaster = Raster.createWritableRaster(sampleModel, newBuffer, null);
newImage.copyData(newRaster);
I'm trying to convert a BufferedImage into a ByteBuffer, but I get this exception:
java.awt.image.DataBufferInt cannot be cast to java.awt.image.DataBufferByte
Can someone please help me out and suggest a good method of conversion?
Source:
public static ByteBuffer convertImageData(BufferedImage bi) {
    byte[] pixelData = ((DataBufferByte) bi.getRaster().getDataBuffer()).getData();
    // return ByteBuffer.wrap(pixelData);
    ByteBuffer buf = ByteBuffer.allocateDirect(pixelData.length);
    buf.order(ByteOrder.nativeOrder());
    buf.put(pixelData);
    buf.flip();
    return buf;
}
this is my object
ByteBuffer buf = convertImageData(image);
You can't just cast an arbitrary DataBuffer to DataBufferByte; you need to make sure it actually is the right type:
ByteBuffer byteBuffer;
DataBuffer dataBuffer = bi.getRaster().getDataBuffer();

if (dataBuffer instanceof DataBufferByte) {
    byte[] pixelData = ((DataBufferByte) dataBuffer).getData();
    byteBuffer = ByteBuffer.wrap(pixelData);
}
else if (dataBuffer instanceof DataBufferUShort) {
    short[] pixelData = ((DataBufferUShort) dataBuffer).getData();
    byteBuffer = ByteBuffer.allocate(pixelData.length * 2);
    byteBuffer.asShortBuffer().put(ShortBuffer.wrap(pixelData));
}
else if (dataBuffer instanceof DataBufferShort) {
    short[] pixelData = ((DataBufferShort) dataBuffer).getData();
    byteBuffer = ByteBuffer.allocate(pixelData.length * 2);
    byteBuffer.asShortBuffer().put(ShortBuffer.wrap(pixelData));
}
else if (dataBuffer instanceof DataBufferInt) {
    int[] pixelData = ((DataBufferInt) dataBuffer).getData();
    byteBuffer = ByteBuffer.allocate(pixelData.length * 4);
    byteBuffer.asIntBuffer().put(IntBuffer.wrap(pixelData));
}
else {
    throw new IllegalArgumentException("Not implemented for data buffer type: " + dataBuffer.getClass());
}
If your BufferedImage is one of the standard types (BufferedImage.TYPE_* other than TYPE_CUSTOM) the above should work.
Note that special DataBuffer subclasses may exist, and may store pixels in multiple banks, with different byte order, might be channel-interleaved (rather than the standard pixel-interleaved) etc. So the above code is still not completely general.
If you are going to pass these ByteBuffers to native code, using allocateDirect(..) and copying the pixels over might be faster; otherwise I think using wrap(..) will make for simpler code and be more efficient.
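A quick self-contained check of the DataBufferInt branch: a 2x2 TYPE_INT_RGB image (sizes chosen so the numbers are easy to follow) is backed by ints, which is exactly what caused the original ClassCastException:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferInt;
import java.nio.ByteBuffer;
import java.nio.IntBuffer;

public class IntBufferDemo {
    public static void main(String[] args) {
        BufferedImage bi = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        DataBuffer dataBuffer = bi.getRaster().getDataBuffer();
        // TYPE_INT_RGB is backed by DataBufferInt, not DataBufferByte
        int[] pixelData = ((DataBufferInt) dataBuffer).getData();
        ByteBuffer byteBuffer = ByteBuffer.allocate(pixelData.length * 4);
        byteBuffer.asIntBuffer().put(IntBuffer.wrap(pixelData));
        System.out.println(byteBuffer.capacity()); // 4 pixels * 4 bytes each
    }
}
```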
If you want to encode the image data, you may want to use the ImageIO here. Something like this:
public static ByteBuffer convertImageData(BufferedImage bi) {
ByteArrayOutputStream out = new ByteArrayOutputStream();
try {
ImageIO.write(bi, "png", out);
return ByteBuffer.wrap(out.toByteArray());
} catch (IOException ex) {
//TODO
}
return null;
}
Here is the list of supported formats.
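The formats available on a given JRE can also be listed at runtime; PNG and JPEG writers are present on standard desktop JREs:

```java
import java.util.Arrays;
import javax.imageio.ImageIO;

public class ListFormats {
    public static void main(String[] args) {
        // Informal format names the local ImageIO registry can encode with
        System.out.println(Arrays.toString(ImageIO.getWriterFormatNames()));
    }
}
```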