Java: How to control lossless compression in JPEG 2000

I'm new to Java. I've been able to read a "raw" image file with short data, display it, and save it as a .jp2 file, but the 150,000-byte file only gets compressed to just over 50,000 bytes. Some years ago I used a proprietary library in C/C++ to do this and achieved lossless compression of the same type of images in the range of 10:1.
My image data is in a BufferedImage and I save it thus:
...
// biGray is the BufferedImage and dest the destination file
Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("JPEG2000");
ImageWriter writer = writers.next();
try {
    ImageIO.write(biGray, "JPEG 2000", dest);
} catch (IOException e) {
    e.printStackTrace();
}
Is it possible to get greater lossless, reversible compression? How?
Thanks, Nate.
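One way to get more control (a sketch, not a guaranteed recipe, assuming a JPEG 2000 ImageWriter such as the JAI Image I/O Tools plugin is registered) is to actually use the ImageWriter you look up instead of calling ImageIO.write(), which ignores it, and hand it an explicit ImageWriteParam. Only standard javax.imageio calls are used below; the plugin's own param subclass (e.g. J2KImageWriteParam) may expose JPEG 2000-specific lossless settings.
// Sketch: write biGray through the looked-up JPEG 2000 writer with an explicit write param.
// Needs imports from javax.imageio and javax.imageio.stream, like the snippet above.
Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("JPEG2000");
if (!writers.hasNext()) {
    throw new IOException("No JPEG 2000 writer registered");
}
ImageWriter writer = writers.next();
ImageWriteParam param = writer.getDefaultWriteParam();
if (param.canWriteCompressed()) {
    param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    String[] types = param.getCompressionTypes();   // plugin-specific type names, may be null
    if (types != null && types.length > 0) {
        param.setCompressionType(types[0]);
    }
    param.setCompressionQuality(1.0f);              // favor quality; true lossless tuning may
                                                    // require the plugin's own J2KImageWriteParam
}
ImageOutputStream ios = ImageIO.createImageOutputStream(dest);
try {
    writer.setOutput(ios);
    writer.write(null, new IIOImage(biGray, null, null), param);
} finally {
    ios.close();
    writer.dispose();
}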

Related

Compress JPEG image to specific quality level without using Bitmap#compress() method

I want to compress a Bitmap (JPEG image) to quality 64; the result will be stored in a byte array. Previously I could achieve this with the Bitmap#compress() method:
public static byte[] compressJPEG(Bitmap bmp, int quality) {
    if (quality <= 0)
        return new byte[0];
    byte[] array = new byte[0];
    try {
        ByteArrayOutputStream bmpStream = new ByteArrayOutputStream();
        bmp.compress(Bitmap.CompressFormat.JPEG, quality, bmpStream);
        array = bmpStream.toByteArray();
        bmpStream.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return array;
}
But the resulting byte array (for the same input image file) is different when I run this on different devices (maybe because it uses nativeCompress() internally).
After checking, I found that the difference is in the Huffman table(s), the SOS segment, and the image data; see this:
Diff check between 2 image photos
So how can I compress a JPEG image to a specific quality level without using the Bitmap#compress() method? I really want the resulting byte arrays to be similar in structure.
There is no concept of "quality level" in JPEG. It is a shorthand that some encoders use to select quantization tables. There are a number of variables in JPEG compression, including the type and breakdown of the scans, the sampling, Huffman table selection, and quantization table selection. If any of those differ in the encoder you use, you are going to get different output. It's the nature of the beast.
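Not part of the original answer, but a small diagnostic sketch can make those device-to-device differences concrete: dump the segment markers and lengths of each output and compare which parts (DQT quantization tables, DHT Huffman tables, SOS scan header, entropy-coded data) actually differ. This is plain Java and can be run on a desktop JVM against the two saved files.
import java.io.IOException;
import java.io.PrintStream;
import java.nio.file.Files;
import java.nio.file.Paths;

// Lists the JPEG segments (marker code + payload length) of a file, e.g. DQT=0xDB, DHT=0xC4, SOS=0xDA.
public class JpegSegmentDump {

    public static void dump(byte[] jpeg, PrintStream out) {
        int i = 0;
        while (i + 1 < jpeg.length) {
            if ((jpeg[i] & 0xFF) != 0xFF) { i++; continue; }          // entropy-coded data, skip
            int marker = jpeg[i + 1] & 0xFF;
            if (marker == 0xFF || marker == 0x00) { i++; continue; }  // fill byte or stuffed 0xFF00
            if (marker == 0xD8 || marker == 0xD9 || marker == 0x01
                    || (marker >= 0xD0 && marker <= 0xD7)) {
                out.printf("0x%02X (standalone)%n", marker);          // SOI, EOI, TEM, RSTn
                i += 2;
            } else {
                if (i + 3 >= jpeg.length) break;                      // truncated file
                int len = ((jpeg[i + 2] & 0xFF) << 8) | (jpeg[i + 3] & 0xFF);
                out.printf("0x%02X length=%d%n", marker, len);
                i += 2 + len;
            }
        }
    }

    public static void main(String[] args) throws IOException {
        dump(Files.readAllBytes(Paths.get(args[0])), System.out);
    }
}
Running this against the two devices' outputs shows whether they differ only in the DQT/DHT tables and the entropy-coded data, or already in the frame/scan layout.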

Confusion with ordering of the color values in the bitmap buffer got from Bitmapfactory

I decoded an image (JPEG) with Android BitmapFactory class and it decoded fine with color format ARGB_8888.
Bitmap bitmap = BitmapFactory.decodeFile(
        Environment.getExternalStorageDirectory().toString() + "/camel.jpg",
        new BitmapFactory.Options());
Log.d(TAG, "Color format of the bitmap : " + bitmap.getConfig());
I dumped the raw buffer from the Bitmap class
ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(buffer);
Log.d(TAG, "Color format of the bitmap : " + bitmap.getConfig());
FileOutputStream outputStream = null;
try {
    outputStream = new FileOutputStream(Environment.getExternalStorageDirectory().toString() + "/buffer.raw");
    outputStream.write(buffer.array());
    outputStream.close();
}
catch (Exception e) {
    e.printStackTrace();
}
and examined the dump in a hex editor. It looks like the ordering of the color components is not A, R, G, B.
Instead, the ordering appears to be R, G, B, A.
As seen above, the alpha (FF) is at the end of the set of four bytes for a pixel. To corroborate this, I opened the image in 7yuv and it displays properly with the color format RGBA and little-endian encoding.
My confusion is why Android reports the format of the Bitmap as ARGB_8888 when the actual byte ordering is R, G, B, A.
I was wondering whether this might be due to an endian mismatch, but in that case the whole ordering would simply be reversed (B, G, R, A).
I might also be doing something wrong while dumping the bitmap to a raw buffer (I am not very good at Java), but I am not sure.
Thanks in advance!
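No answer is recorded here, but a quick check (a sketch, not from the thread) is to compare the packed int that getPixel() returns with the first four bytes that copyPixelsToBuffer() produced for the same pixel. getPixel() documents its result as a color int packed as 0xAARRGGBB, so any mismatch is purely about the in-memory byte order of the buffer, not about the pixel values themselves.
// Sketch: compare the packed-int view with the raw buffer bytes for pixel (0, 0).
int p = bitmap.getPixel(0, 0);                          // packed as 0xAARRGGBB
ByteBuffer check = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(check);
check.rewind();
byte[] first = new byte[4];
check.get(first);
Log.d(TAG, String.format("getPixel: A=%02X R=%02X G=%02X B=%02X",
        p >>> 24, (p >> 16) & 0xFF, (p >> 8) & 0xFF, p & 0xFF));
Log.d(TAG, String.format("buffer:   %02X %02X %02X %02X",
        first[0] & 0xFF, first[1] & 0xFF, first[2] & 0xFF, first[3] & 0xFF));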

Improving file save performance in android

I have 100 images on external storage, and in a for loop I do the following two tasks for each one:
Loading every item as a bitmap and merging it with another bitmap
Saving result in as a new file in memory
For 100 images this takes too much time. Merging the bitmaps is quite fast, but saving the result to a file takes too long. Is there any way to speed this up? Keeping the bitmaps in memory and batch-saving the files later can cause an OutOfMemoryError.
This is how I merge bitmaps:
how to merge to two bitmap one over another
This is how I save the bitmap to a file:
File imageFileFolder = new File(Statics.TEMP_PATH);
imageFileFolder.mkdirs();
FileOutputStream out = null;
File imageFileName = new File(imageFileFolder, imageName);
try {
    out = new FileOutputStream(imageFileName);
    imageBitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
    out.flush();
    out.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Note that all of this runs in an AsyncTask.
Finally, I settled on saving the Bitmap as JPEG. Saving a 250 KB bitmap as PNG took almost 500 milliseconds; the same bitmap as JPEG took about 50 milliseconds. The quality won't be the same, but it looks like a trade-off between quality and performance.
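For reference, the change amounts to swapping the compress format in the save code above (a minimal sketch; the quality value of 90 is an assumption, tune it to taste):
// JPEG instead of PNG: much faster to encode, at the cost of lossy output
imageBitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);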

How to reduce size of a multipart file in java

I have a Java Spring MVC application in which there is an option to upload an image and save it to the server. I have the following method:
@RequestMapping(value = "/uploaddocimagecontentsubmit", method = RequestMethod.POST)
public String createUpdateFileImageContentSubmit(@RequestParam("file") MultipartFile file, ModelMap model)
{
    // methods to handle file upload
}
I am now trying to reduce the size of the image, referring to the following:
increasing-resolution-and-reducing-size-of-an-image-in-java and decrease-image-resolution-in-java
The problem I am facing is that in the above examples, we are dealing with java.io.File objects which are saved to a specified location. I don't want to save the image to disk. Is there any way I can do something similar to compress my MultipartFile image and continue with the upload?
Why don't you resize it on the client before uploading? That will save some bandwidth.
BlueImp jQuery Upload can do this.
This was my first time taking a deep dive into the ImageIO package. I came across MemoryCacheImageOutputStream, which allows you to write an image output stream to a regular OutputStream, e.g. a ByteArrayOutputStream. From there, the data can be retrieved using toByteArray() or toString() after compression. I used toByteArray(), as I am storing the images in PostgreSQL as byte arrays. I know this is late, but I hope it helps someone.
private byte[] compressImage(MultipartFile mpFile) {
    float quality = 0.3f;
    String imageName = mpFile.getOriginalFilename();
    String imageExtension = imageName.substring(imageName.lastIndexOf(".") + 1);
    // Returns an Iterator containing all currently registered ImageWriters that claim to be able to encode the named format.
    // You don't have to register one yourself; some are provided.
    ImageWriter imageWriter = ImageIO.getImageWritersByFormatName(imageExtension).next();
    ImageWriteParam imageWriteParam = imageWriter.getDefaultWriteParam();
    imageWriteParam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT); // Check the API value that suits your needs.
    // A compression quality setting of 0.0 is most generically interpreted as "high compression is important,"
    // while a setting of 1.0 is most generically interpreted as "high image quality is important."
    imageWriteParam.setCompressionQuality(quality);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    // MemoryCacheImageOutputStream: An implementation of ImageOutputStream that writes its output to a regular
    // OutputStream, i.e. the ByteArrayOutputStream.
    ImageOutputStream imageOutputStream = new MemoryCacheImageOutputStream(baos);
    // Sets the destination to the given ImageOutputStream or other Object.
    imageWriter.setOutput(imageOutputStream);
    BufferedImage originalImage = null;
    try (InputStream inputStream = mpFile.getInputStream()) {
        originalImage = ImageIO.read(inputStream);
    } catch (IOException e) {
        String info = String.format("compressImage - bufferedImage (file %s) - IOException - message: %s", imageName, e.getMessage());
        logger.error(info);
        return baos.toByteArray();
    }
    IIOImage image = new IIOImage(originalImage, null, null);
    try {
        imageWriter.write(null, image, imageWriteParam);
        imageOutputStream.close(); // flush the cached output into the ByteArrayOutputStream before reading it back
    } catch (IOException e) {
        String info = String.format("compressImage - imageWriter (file %s) - IOException - message: %s", imageName, e.getMessage());
        logger.error(info);
    } finally {
        imageWriter.dispose();
    }
    return baos.toByteArray();
}
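A minimal usage sketch from the controller method in the question (the persistence step and the return value are assumptions, not part of the original answer):
@RequestMapping(value = "/uploaddocimagecontentsubmit", method = RequestMethod.POST)
public String createUpdateFileImageContentSubmit(@RequestParam("file") MultipartFile file, ModelMap model) {
    byte[] compressed = compressImage(file);  // method shown above
    // ... persist `compressed` (e.g. into a PostgreSQL bytea column) and continue the upload flow
    return "uploadSuccess";                   // placeholder view name
}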

streaming a jpeg using com.sun.image.codec.jpeg.JPEGImageEncoder vs javax.imageio.ImageIO

I have a BufferedImage object of a JPEG which needs to be streamed as a servlet response.
The existing code streams the JPEG using JPEGImageEncoder and looks like this:
JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(resp.getOutputStream());
resp.reset();
resp.setContentType("image/jpg");
resp.setHeader("Content-disposition", "inline;filename=xyz.jpg");
JPEGEncodeParam param = encoder.getDefaultJPEGEncodeParam(image);
param.setQuality(jpegQuality, false);
encoder.setJPEGEncodeParam(param);
encoder.encode(image);
I have noticed that this results in the file size of the streamed JPEG being tripled, and I am unable to figure out why. So I tried using ImageIO to stream the JPEG instead:
ImageIO.write(image, "jpg", out);
This works just fine. I am unable to decide why my predecessor chose JPEGImageEncoder, and I am wondering what issues would arise if I switched to ImageIO. I have compared both JPEGs and couldn't really spot differences. Any thoughts?
To be clear: you already have a concrete JPEG image somewhere on disk or in a database and you just need to send it unmodified to the client? Then there's indeed absolutely no reason to use JPEGImageEncoder (and ImageIO).
Just stream it unmodified to the response body.
E.g.
File file = new File("/path/to/image.jpg");
response.setContentType("image/jpeg");
response.setHeader("Content-Length", String.valueOf(file.length()));
InputStream input = new FileInputStream(file);
OutputStream output = response.getOutputStream();
byte[] buffer = new byte[8192];
try {
    for (int length = 0; (length = input.read(buffer)) > 0;) {
        output.write(buffer, 0, length);
    }
} finally {
    try { input.close(); } catch (IOException ignore) {}
    try { output.close(); } catch (IOException ignore) {}
}
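On Java 7+ the same copy loop can be collapsed into a single call (a sketch, not from the original answer; java.nio.file.Files is part of the JDK, and the container closes the response stream):
response.setContentType("image/jpeg");
response.setHeader("Content-Length", String.valueOf(file.length()));
Files.copy(file.toPath(), response.getOutputStream());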
You often see this mistake of unnecessarily using JPEGImageEncoder (and ImageIO) to stream image files in code from beginners who are ignorant about the nature of bits and bytes. Those tools are only useful if you want to convert between JPEG and a different image format, or want to manipulate the image (crop, skew, rotate, resize, etc.).
