YUV to JPEG causing massive fps loss - java

I have a camera sending frames to a SurfaceView. I would like to get these frames out of the SurfaceView and send them elsewhere. In their final form, the images must be in JPEG format. Currently, to accomplish this, I am creating a YuvImage from the byte[] and then calling compressToJpeg. However, when I invoke compressToJpeg on every frame, rather than doing nothing but displaying it, my FPS drops from ~30 to ~4. I commented out the other lines, and this call appears to be the culprit.
public void onNewRawImage(byte[] data, Size size) {
    // Convert to JPEG
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    yuvimage.compressToJpeg(new Rect(0, 0, yuvimage.getWidth(), yuvimage.getHeight()), 80, baos);
    byte[] jdata = baos.toByteArray();

    // Convert to Bitmap
    Bitmap bitmap = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
}
Is it possible to start in the JPEG format rather than having to convert to it? I am hoping I am making a mistake somewhere. Any help is greatly appreciated, thank you.
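For reference (this is not from the original post), one common way to recover most of the frame rate is to leave the camera/display path untouched and push the compression onto a background thread. A minimal sketch, assuming the frame arrives as NV21 and the byte[] can be handed off safely:

private final ExecutorService jpegExecutor = Executors.newSingleThreadExecutor();

public void onNewRawImage(final byte[] data, final Size size) {
    // Hand the frame to a background thread so the preview keeps running at full speed.
    jpegExecutor.execute(new Runnable() {
        @Override
        public void run() {
            YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, baos);
            byte[] jpeg = baos.toByteArray();
            // send jpeg wherever it needs to go
        }
    });
}

If frames arrive faster than they can be compressed the work queue will grow, so in practice you would drop frames or use a bounded queue.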

Related

image manipulation as byte[]

I'm sorry that I'm not good at English.
I want to make a real-time image manipulation app (binarization, color inversion, etc.).
It needs to be fast, so I want to manipulate the frames as a byte[], without converting them to an Image or Bitmap.
The format of the image I get from the camera is YUV (NV21), but because I don't know this format, I converted it to JPEG.
That also doesn't work as I expected (I thought it would be one byte per pixel, or three bytes per pixel).
So:
How can I do such manipulation (binarization, color inversion) on a JPEG byte array?
Or how can I convert an NV21 byte array to an RGB byte array?
This is the method I used to convert NV21 to JPEG:
YuvImage yuvimage = new YuvImage(bytes, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
yuvimage.compressToJpeg(new Rect(0, 0, width, height), 100, outputStream);
I got the YUV image byte array from onPreviewFrame (Camera.PreviewCallback).
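As an aside (not from the original thread): in NV21 the first width * height bytes are the luma (Y) plane, exactly one byte per pixel, followed by the interleaved V/U chroma at quarter resolution. That means binarization can run directly on the preview byte[] with no conversion at all. A rough sketch:

// Sketch only: threshold the Y plane in place and neutralize the chroma so the
// frame renders as pure black and white if it is converted or previewed later.
void binarizeNv21InPlace(byte[] nv21, int width, int height, int threshold) {
    int pixelCount = width * height;
    for (int i = 0; i < pixelCount; i++) {
        nv21[i] = (byte) ((nv21[i] & 0xff) >= threshold ? 255 : 0);
    }
    for (int i = pixelCount; i < nv21.length; i++) {
        nv21[i] = (byte) 128; // neutral chroma
    }
}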
I think you can use setPreviewFormat(int) to request a preview format other than the default NV21 for camera captures.
https://developer.android.com/reference/android/graphics/ImageFormat.html#NV21
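A small sketch of that suggestion, assuming the old android.hardware.Camera API (check which preview formats the device actually supports before requesting one):

Camera.Parameters params = camera.getParameters();
List<Integer> formats = params.getSupportedPreviewFormats();
if (formats.contains(ImageFormat.YV12)) {
    // YV12 is planar YUV, which can be easier to index per pixel than NV21.
    params.setPreviewFormat(ImageFormat.YV12);
    camera.setParameters(params);
}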

Android Camera Preview - take only a part of the screen data

I want to take only a part of the screen data from a preview video callback, to reduce the processing time. The problem is that I only know how to take the whole frame with onPreviewFrame:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    myData = data;
    // + get camera resolution x, y
}
And then I get the image from this data:
private Bitmap getBitmapFromYUV(byte[] data, int width, int height) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, out);
    byte[] imageBytes = out.toByteArray();
    Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
    return image;
}
And then I take the part of the image I want:
cutImage = Bitmap.createBitmap(image, xOffset, yOffset, customWidth, customHeight);
The problem is that I need to take lots of images to apply some image processing on them, and that's why I want to reduce the time it takes to get each image. Instead of taking the whole screen and then cropping it, I want to immediately get the cropped image. Is there a way to get only part of the screen data?
OK, I finally found something. I still receive all of the camera data, but when calling compressToJpeg I crop the picture with a custom Rect. Maybe there is something better to do before this, but it is still a good improvement. Here is my change:
yuvImage.compressToJpeg(new Rect(offsetX, offsetY, sizeCaptureX + offsetX, sizeCaptureY + offsetY ), 100, out);
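If you want to avoid touching the full frame at all, another option (not from the original answer, and it assumes the standard NV21 layout with even offsets and sizes) is to copy only the wanted region out of the preview buffer before doing any JPEG work:

static byte[] cropNv21(byte[] src, int srcWidth, int srcHeight,
                       int x, int y, int cropWidth, int cropHeight) {
    byte[] dst = new byte[cropWidth * cropHeight * 3 / 2];
    // Y plane: one byte per pixel, copied row by row.
    for (int row = 0; row < cropHeight; row++) {
        System.arraycopy(src, (y + row) * srcWidth + x, dst, row * cropWidth, cropWidth);
    }
    // Interleaved VU plane: half vertical resolution, one V/U pair per 2x2 pixel block.
    int srcUvStart = srcWidth * srcHeight;
    int dstUvStart = cropWidth * cropHeight;
    for (int row = 0; row < cropHeight / 2; row++) {
        System.arraycopy(src, srcUvStart + (y / 2 + row) * srcWidth + x,
                         dst, dstUvStart + row * cropWidth, cropWidth);
    }
    return dst;
}

The cropped array can then be wrapped in a YuvImage of cropWidth x cropHeight and compressed as before.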

Java: How To Convert Image to Byte[]

I have an image stored in the database as a byte array. I want to display it in the browser, but I don't know how to write the Image using an OutputStream. Here is my code:
byte[] imageInBytes = (byte[]) obj; // from Database
InputStream in = new ByteArrayInputStream(imageInBytes);
Image img = ImageIO.read(in).getScaledInstance(50, -1, Image.SCALE_SMOOTH);
OutputStream o = resp.getOutputStream(); // HttpServletResponse
o.write(imgByte);
You may try something like this:
File f=new File("image.jpg");
BufferedImage o=ImageIO.read(f);
ByteArrayOutputStream b=new ByteArrayOutputStream();
ImageIO.write(o, "jpg", b);
byte[] img=b.toByteArray();
You have to set the content type of the response to the type of image that you are sending.
Suppose your image was stored as a JPEG; then
resp.setContentType("image/jpeg");
OutputStream o = resp.getOutputStream(); // HttpServletResponse
o.write(img); // the JPEG byte[] (imgByte in your code)
would send the browser an image. (The browser understands from the header information that the data you just sent is a JPEG image.)
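Putting those pieces together, a minimal sketch of the whole path (doGet and the loadFromDatabase() helper are assumed names): decode the stored bytes, scale, set the content type, and stream the JPEG back.

protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
    byte[] imageInBytes = loadFromDatabase(); // assumed helper returning the stored bytes
    BufferedImage original = ImageIO.read(new ByteArrayInputStream(imageInBytes));
    int targetWidth = 50;
    int targetHeight = original.getHeight() * targetWidth / original.getWidth(); // keep aspect ratio
    BufferedImage output = new BufferedImage(targetWidth, targetHeight, BufferedImage.TYPE_INT_RGB);
    Graphics2D g2d = output.createGraphics();
    g2d.drawImage(original, 0, 0, targetWidth, targetHeight, null);
    g2d.dispose();
    resp.setContentType("image/jpeg");
    ImageIO.write(output, "jpg", resp.getOutputStream());
}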
You could try using ImageIO.write...
ImageIO.write(img, "jpg", o);
But this will require you to use BufferedImage when reading...
BufferedImage img = ImageIO.read(in);
You could then use AffineTransform to scale the image...
BufferedImage scaled = new BufferedImage(img.getWidth() / 2, img.getHeight() / 2, img.getType());
Graphics2D g2d = scaled.createGraphics();
g2d.setTransform(AffineTransform.getScaleInstance(0.5, 0.5)); // scale by 50% on both axes
g2d.drawImage(img, 0, 0, null);
g2d.dispose();
img = scaled;
This, obviously, only scales the image by 50%, so you'll need to calculate the required scaling factor based on the original size of the image against your desired size...
Take a look at Java: maintaining aspect ratio of JPanel background image for some ideas on scaling images...
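For example, a quick sketch (targetWidth and targetHeight are assumed names) of deriving that factor so the image fits inside a target box while keeping its aspect ratio, reusing the AffineTransform approach above:

double scale = Math.min((double) targetWidth / img.getWidth(),
                        (double) targetHeight / img.getHeight());
int newWidth = (int) Math.round(img.getWidth() * scale);
int newHeight = (int) Math.round(img.getHeight() * scale);
BufferedImage scaled = new BufferedImage(newWidth, newHeight, BufferedImage.TYPE_INT_RGB);
Graphics2D g2d = scaled.createGraphics();
g2d.setTransform(AffineTransform.getScaleInstance(scale, scale));
g2d.drawImage(img, 0, 0, null);
g2d.dispose();
img = scaled;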

Streaming JPEGs from a Cellphone Camera

I am trying to stream JPEG frames from the camera to my PC using a UDP socket, but I am running into some issues.
So I set up a camera and added a callback for the preview frame event:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    final YuvImage image = new YuvImage(data, mPreviewFormat, mPreviewWidth, mPreviewHeight, null); // Create the YUV image
    image.compressToJpeg(mPreviewRect, 80, stream); // Compress to JPEG
    Bitmap b = BitmapFactory.decodeByteArray(stream.toByteArray(), 0, stream.size()); // Convert to Bitmap
    Bitmap resizedBitmap = Bitmap.createScaledBitmap(b, 320, 240, false); // Scale to 320x240
    stream.reset(); // Drop the full-size JPEG before re-encoding
    resizedBitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream); // Compress back to JPEG
    byte[] byteArray = stream.toByteArray();
    DatagramPacket sendPacket = new DatagramPacket(byteArray, byteArray.length, IPAddress, 37654);
    try {
        socket.send(sendPacket); // Send frame to address
    } catch (IOException e) {
        e.printStackTrace();
    }
    stream.reset();
}
My problem is that this is taking about 0.2 seconds per frame, so my frame rate is about 5 FPS. Is there any way I can speed this up? My target is anywhere from 15 to 20 FPS. From my timing tests I believe the biggest cost is Bitmap b = BitmapFactory.decodeByteArray(stream.toByteArray(), 0, stream.size()); it seems to take about 0.1 seconds. Is there a way to scale a YuvImage directly?
Thanks!
I've done something similar and ended up moving that work into native code, making a JNI call to convert from NV21 and pushing the result onto a circular buffer.
Have a second thread read off the buffer and do the network I/O, so you can return from onPreviewFrame as quickly as possible.
Even better, copy the NV21 data to the circular buffer and have that second thread do the JPEG conversion before sending the data over the network.
By the way, I did this for a video chat client. It worked well, but I've since moved to sending H.263 frames; that is more efficient than sending JPEGs if you are also building a video chat client.
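A rough sketch of that pattern (all names are assumed, and it reuses the fields from the question): the preview callback only copies the frame into a small queue, and a worker thread does the compression and the network I/O.

private final BlockingQueue<byte[]> frameQueue = new ArrayBlockingQueue<byte[]>(2);

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // The camera reuses the data buffer, so copy it before handing it off.
    frameQueue.offer(Arrays.copyOf(data, data.length)); // drops the frame if the queue is full
}

// Worker thread: compress and send, off the camera thread.
new Thread(new Runnable() {
    @Override
    public void run() {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while (mStreaming) { // assumed volatile flag
            try {
                byte[] frame = frameQueue.take();
                YuvImage yuv = new YuvImage(frame, mPreviewFormat, mPreviewWidth, mPreviewHeight, null);
                yuv.compressToJpeg(mPreviewRect, 80, out);
                byte[] jpeg = out.toByteArray();
                socket.send(new DatagramPacket(jpeg, jpeg.length, IPAddress, 37654));
                out.reset();
            } catch (InterruptedException e) {
                return;
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}).start();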

New Bitmap Changed on Copy Using Buffer

When I use copyPixelsToBuffer and copyPixelsFromBuffer, the copied bitmap does not display the same content as the original. I have tried the code below:
Bitmap bm = BitmapFactory.decodeByteArray(a, 0, a.length);
int[] pixels = new int[bm.getWidth() * bm.getHeight()];
bm.getPixels(pixels, 0, bm.getWidth(), 0, 0,bm.getWidth(),bm.getHeight());
ByteBuffer buffer = ByteBuffer.allocate(bm.getRowBytes()*bm.getHeight());
bm.copyPixelsToBuffer(buffer);//I copy the pixels from Bitmap bm to the buffer
ByteBuffer buffer1 = ByteBuffer.wrap(buffer.array());
newbm = Bitmap.createBitmap(160, 160,Config.RGB_565);
newbm.copyPixelsFromBuffer(buffer1);//I read pixels from the Buffer and put the pixels to the Bitmap newbm.
imageview1.setImageBitmap(newbm);
imageview2.setImageBitmap(bm);
Why do the bitmaps bm and newbm not display the same content?
In your code, you are copying the pixels into a bitmap with RGB_565 format, whereas the original bitmap from which you got the pixels must be in a different format.
The problem is clear from the documentation of copyPixelsFromBuffer():
The data in the buffer is not changed in any way (unlike setPixels(), which converts from unpremultiplied 32-bit to whatever the bitmap's native format is).
So either use the same bitmap format, use setPixels(), or draw the original bitmap onto the new one with a Canvas.drawBitmap() call.
Also use bm.getWidth() & bm.getHeight() to specify the size of the new bitmap instead of hard-coding as 160.
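A minimal sketch of that fix: give the destination bitmap the same size and config as the source, and rewind the buffer before reading it back.

Bitmap bm = BitmapFactory.decodeByteArray(a, 0, a.length);
ByteBuffer buffer = ByteBuffer.allocate(bm.getRowBytes() * bm.getHeight());
bm.copyPixelsToBuffer(buffer);
buffer.rewind(); // copyPixelsToBuffer advanced the position; rewind before reading back
Bitmap newbm = Bitmap.createBitmap(bm.getWidth(), bm.getHeight(), bm.getConfig());
newbm.copyPixelsFromBuffer(buffer);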
