I want to compress a Bitmap (JPEG image) to quality 64 and store the result in a byte array. Previously I could achieve this with the Bitmap#compress() method:
public static byte[] compressJPEG(Bitmap bmp, int quality) {
    if (quality <= 0)
        return new byte[0];
    byte[] array = new byte[0];
    try {
        ByteArrayOutputStream bmpStream = new ByteArrayOutputStream();
        bmp.compress(Bitmap.CompressFormat.JPEG, quality, bmpStream);
        array = bmpStream.toByteArray();
        bmpStream.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return array;
}
But the resulting byte array (for the same input image file) is different when I run this on different devices. (Maybe because it uses nativeCompress() internally.)
After checking, I found out the difference is in the Huffman table(s), SOS, and image data segments, see this:
[Image: diff check between the two output photos]
So how can I compress a JPEG image to a specific quality level without using the Bitmap#compress() method? I really want the resulting byte arrays to have the same structure.
There is no concept of a "quality level" in JPEG itself. It is a shorthand that some encoders use to select quantization tables. There are a number of variables in JPEG compression, including the type and breakdown of scans, sampling, Huffman table selection, and quantization table selection. If any of those are different in the encoder you use, you are going to get different output. It's the nature of the beast.
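If deterministic output matters more than using the platform encoder, the usual workaround is to bundle your own encoder instead of going through Bitmap#compress(), so every device runs identical code. As a hedged sketch of the idea on a desktop JVM (ImageIO is not part of the Android SDK, so on Android you would need an equivalent bundled library), where quality 64 maps roughly to 0.64f:

import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.plugins.jpeg.JPEGImageWriteParam;
import javax.imageio.stream.ImageOutputStream;

public static byte[] compressJPEG(BufferedImage image, float quality) throws IOException {
    ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
    JPEGImageWriteParam params = new JPEGImageWriteParam(null);
    params.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    params.setCompressionQuality(quality); // e.g. 0.64f for "quality 64"
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    try (ImageOutputStream stream = ImageIO.createImageOutputStream(out)) {
        writer.setOutput(stream);
        writer.write(null, new IIOImage(image, null, null), params);
    } finally {
        writer.dispose();
    }
    return out.toByteArray();
}

As long as every installation ships the same encoder version, the same input pixels and the same quality value produce the same bytes.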
I decoded an image (JPEG) with the Android BitmapFactory class, and it decoded fine with color format ARGB_8888.
Bitmap bitmap = BitmapFactory.decodeFile(
        Environment.getExternalStorageDirectory().toString() + "/camel.jpg",
        new BitmapFactory.Options());
Log.d(TAG, "Color format of the bitmap : " + bitmap.getConfig());
I dumped the raw buffer from the Bitmap class:
ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(buffer);
Log.d(TAG, "Color format of the bitmap : " + bitmap.getConfig());

FileOutputStream outputStream = null;
try {
    outputStream = new FileOutputStream(Environment.getExternalStorageDirectory().toString() + "/buffer.raw");
    outputStream.write(buffer.array());
    outputStream.close();
} catch (Exception e) {
    e.printStackTrace();
}
and examined the dump in a hex editor; it looks like the ordering of the color components is not A, R, G, B.
Instead, the ordering appears to be R, G, B, A.
As seen above, the alpha (FF) is at the end of the set of four bytes for a pixel. To corroborate this, I opened the image in 7yuv, and it is displayed properly with color format RGBA and little-endian encoding.
My confusion is why Android reports the format of the Bitmap as ARGB_8888 when the actual byte ordering is R, G, B, A.
I was wondering whether this might be due to an endianness mismatch, but in that case the whole ordering would simply be reversed (B, G, R, A).
I might also be doing something wrong while dumping the bitmap to a raw buffer (I am not very good at Java), but I am not sure.
Thanks in advance!
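For what it's worth, the packed int returned by Bitmap#getPixel() is ordered ARGB (0xAARRGGBB), while copyPixelsToBuffer() copies the raw in-memory representation, which for ARGB_8888 is one byte each of R, G, B, A. A small sketch to see both orderings side by side (bitmap here is the one from the question):

int argb = bitmap.getPixel(0, 0); // packed as 0xAARRGGBB

ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(buffer);
byte[] raw = buffer.array();

// The raw bytes come out as R, G, B, A for an ARGB_8888 bitmap.
Log.d(TAG, String.format("packed int: A=%02x R=%02x G=%02x B=%02x",
        Color.alpha(argb), Color.red(argb), Color.green(argb), Color.blue(argb)));
Log.d(TAG, String.format("raw bytes:  %02x %02x %02x %02x",
        raw[0], raw[1], raw[2], raw[3]));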
I am converting an image into a byte[] using the following code.
public static byte[] extractBytes(String imageName) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    BufferedImage img = ImageIO.read(new File(imageName));
    ImageIO.write(img, "jpg", baos);
    return baos.toByteArray();
}
Now when I am testing my code:
public static void main(String[] args) throws IOException {
    String filepath = "image_old.jpg";
    File outp = new File(filepath);
    System.out.println("Size of original image=" + outp.length());
    byte[] data = extractBytes(filepath);
    System.out.println("size of byte[] data=" + data.length);
    // converting the byte[] array into an image again
    BufferedImage img = ImageIO.read(new ByteArrayInputStream(data));
    File outputfile = new File("image_new.jpg");
    ImageIO.write(img, "jpeg", outputfile);
    System.out.println("size of converted image=" + outputfile.length());
}
I am getting very strange results:
Size of original image=78620
size of byte[] data=20280
size of converted image=20244
After converting the image into a byte[], its size decreases to around a quarter of the original, and when I convert the byte[] back to an image its size changes again. The output image is still successfully created in the desired location. I can see a slight difference in quality between the original image and the new image at 500-600% zoom; the new image is a little blurred.
Here is the image on which I am doing the testing http://pbrd.co/1BrOVbf
Please explain the reason for this change in size. Is there any method to get the same size back after this conversion?
The image you have is compressed with the maximum quality setting ("100%", or 1.0 in ImageIO terms). JPEG compression isn't very effective at such high settings, and the file is thus quite a bit larger than usual. When using ImageIO.write(..., "JPEG", ...), the default quality setting will be used. This default is 0.75 (the exact meaning of such a value is encoder dependent, though, and isn't an exact science), and thus lower quality, resulting in a smaller file size.
(Another likely cause for such a significant decrease in file size between the original and the re-compressed image, is the removal of meta data. When reading using ImageIO.read(file) you are effectively stripping away any meta data in the JPEG file, like XMP, Exif or ICC profiles. In extreme cases (yes, I'm talking mainly about Photoshop here ;-)) this meta data can take up more space than the image data itself (ie. megabytes of meta data is possible). This is however, not the case for your file.)
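(If you do need to carry the metadata across a re-compression, ImageIO can do it when you read it explicitly. A hedged sketch, assuming the standard JPEG plugin and the file names from the question:

ImageReader reader = ImageIO.getImageReadersByFormatName("jpeg").next();
ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
try (ImageInputStream in = ImageIO.createImageInputStream(new File("image_old.jpg"));
     ImageOutputStream out = ImageIO.createImageOutputStream(new File("image_new.jpg"))) {
    reader.setInput(in);
    IIOImage imageAndMetadata = reader.readAll(0, null); // pixels + per-image metadata
    writer.setOutput(out);
    writer.write(null, imageAndMetadata, null); // null param = default quality
} finally {
    reader.dispose();
    writer.dispose();
}

The pixels are still decoded and re-encoded here, so this preserves metadata, not image data.)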
As you can see from the second re-compression (from byte[] to final output file), the output is just slightly smaller than the input. This is because the quality setting (unspecified, so still the default) is the same in both cases (and any metadata was already stripped in the first step, so it no longer adds to the file size). The minor difference is likely due to small losses (rounding errors etc.) in the JPEG decompression/re-compression.
While slightly counter-intuitive, the least data loss (in terms of change from the original image, not in file size) when re-compressing a JPEG is always achieved by re-compressing with the same quality setting as the original (using the exact same tables should be virtually lossless, but small rounding errors might still occur). Increasing the quality setting will make the output file larger, but the quality will actually degrade.
The only way to be 100% sure not to lose any data or image quality is to not decode/encode the image in the first place, but rather just copy the file byte by byte, for instance like this:
File in = ...;
File out = ...;

InputStream input = new FileInputStream(in);
try {
    OutputStream output = new FileOutputStream(out);
    try {
        copy(input, output);
    } finally {
        output.close();
    }
} finally {
    input.close();
}
And the copy method:
public void copy(final InputStream in, final OutputStream out) throws IOException {
    byte[] buffer = new byte[1024];
    int count;
    while ((count = in.read(buffer)) != -1) {
        out.write(buffer, 0, count);
    }
    // Flush the output stream, to write any remaining buffered data
    out.flush();
}
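On Java 7 and later, the same byte-for-byte copy can be done in a single call; a minimal equivalent (file names borrowed from the question):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

Files.copy(Paths.get("image_old.jpg"), Paths.get("image_new.jpg"),
        StandardCopyOption.REPLACE_EXISTING);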
When you call ImageIO.write(img, "jpeg", outputfile);, the ImageIO library writes a JPEG image using its own compression parameters, and the output image appears to be more compressed than the input image. You can adjust the level of compression by changing the parameter in the call to jpegParams.setCompressionQuality below. The resulting file may be bigger or smaller than the original, depending on the relative compression levels of each.
public static ImageWriter getImageWriter() throws IOException {
    IIORegistry registry = IIORegistry.getDefaultInstance();
    Iterator<ImageWriterSpi> services = registry.getServiceProviders(ImageWriterSpi.class, (provider) -> {
        if (provider instanceof ImageWriterSpi) {
            return Arrays.stream(((ImageWriterSpi) provider).getFormatNames())
                    .anyMatch(formatName -> formatName.equalsIgnoreCase("JPEG"));
        }
        return false;
    }, true);
    ImageWriterSpi writerSpi = services.next();
    ImageWriter writer = writerSpi.createWriterInstance();
    return writer;
}

public static void main(String[] args) throws IOException {
    String filepath = "old.jpg";
    File outp = new File(filepath);
    System.out.println("Size of original image=" + outp.length());
    byte[] data = extractBytes(filepath);
    System.out.println("size of byte[] data=" + data.length);
    BufferedImage img = ImageIO.read(new ByteArrayInputStream(data));
    File outputfile = new File("new.jpg");
    JPEGImageWriteParam jpegParams = new JPEGImageWriteParam(null);
    jpegParams.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    jpegParams.setCompressionQuality(1f);
    ImageWriter writer = getImageWriter();
    outputfile.delete();
    // createImageOutputStream is ImageIO.createImageOutputStream (static import)
    try (final ImageOutputStream stream = createImageOutputStream(outputfile)) {
        writer.setOutput(stream);
        try {
            writer.write(null, new IIOImage(img, null, null), jpegParams);
        } finally {
            writer.dispose();
            stream.flush();
        }
    }
    System.out.println("size of converted image=" + outputfile.length());
}
This solution is adapted from the answer by JeanValjean given here: Setting jpg compression level with ImageIO in Java.
I have created the following method, which builds a base64-encoded string of an image. The issue I am having is that the source image it grabs is high quality, but after it is saved into the byte array and encoded, the image is pixelated and of fairly low quality. What can I do to get a 100% quality image?
public String getImageString(String img) {
    String image = "";
    try {
        BufferedImage bufferedImage = ImageIO.read(HelpPage.class.getResource(img));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(bufferedImage, "JPG", out);
        String base64bytes = Base64.encode(out.toByteArray());
        image = "data:image/jpeg;base64," + base64bytes;
    } catch (IOException ex) {
        Logger.getLogger(HomePage.class.getName()).log(Level.SEVERE, null, ex);
    }
    return image;
}
You don't need to use ImageIO at all here. Just read the bytes from the resource and base64-encode them.
You're converting a JPEG to another JPEG, which is an inherently lossy process, although it shouldn't be as bad as 'low quality'. But you don't need to do it at all.
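A minimal sketch of that approach, reusing the names from the question (HelpPage, HomePage) and swapping in java.util.Base64 for whatever Base64 class the original used:

public String getImageString(String img) {
    try (InputStream in = HelpPage.class.getResourceAsStream(img);
         ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        byte[] buffer = new byte[4096];
        int count;
        while ((count = in.read(buffer)) != -1) {
            out.write(buffer, 0, count);
        }
        // The resource bytes pass through untouched: no decode, no re-encode.
        return "data:image/jpeg;base64," + Base64.getEncoder().encodeToString(out.toByteArray());
    } catch (IOException ex) {
        Logger.getLogger(HomePage.class.getName()).log(Level.SEVERE, null, ex);
        return "";
    }
}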
I have a data URL from an image file and have to pass it through to another function. Along the path from data URL to BufferedImage it needs to be a byte array.
My approach was the following:
String dataUrl;
byte[] imageData = dataUrl.getBytes();

// pass the byteArray along the path

// create BufferedImage from byteArray
BufferedImage inputImage = ImageIO.read(new ByteArrayInputStream(imageData));

// If the picture is null, then throw an unsupported image exception.
if (inputImage == null) {
    throw new UnknownImageFormatException();
}
The problem is that it always throws the UnknownImageFormatException, which means inputImage is null, which means ImageIO.read did not recognize the image type.
I've used ImageIO.getReaderFormatNames() to get the supported format names and got the following list:
Supported Formats:
jpg, BMP, bmp, JPG, jpeg, wbmp, png, JPEG, PNG, WBMP, GIF, gif
The data URLs I try to pass look like data:image/png;base64,... or data:image/jpg;base64,....
As far as I understand, those formats are in the supported list and should therefore be recognized.
What else could cause the inputImage to be null in this case? And more interestingly, how do I solve it?
As the comments already said, the image data is Base64 encoded. To retrieve the binary data, you have to strip the type/encoding header, then decode the Base64 content to binary data.
String encodingPrefix = "base64,";
int contentStartIndex = dataUrl.indexOf(encodingPrefix) + encodingPrefix.length();
byte[] imageData = Base64.decodeBase64(dataUrl.substring(contentStartIndex));
I use org.apache.commons.codec.binary.Base64 from Apache's commons-codec; other Base64 decoders should work as well.
The one problem with an RFC 2397 string is that, per the specification, everything before the <data> part is optional except the data: prefix and the comma:
data:[<mediatype>][;base64],<data>
So a pure Java 8 solution accounting for this would be:
final int dataStartIndex = dataUrl.indexOf(",") + 1;
final String data = dataUrl.substring(dataStartIndex);
byte[] decoded = java.util.Base64.getDecoder().decode(data);
Of course, dataStartIndex should be checked: indexOf returns -1 when no comma is found, which would make dataStartIndex 0.
I think a simple regex replace would be better and conform more closely to RFC 2397:
java.util.Base64.getDecoder().decode(b64DataString.replaceFirst("data:.+,", ""))
The RFC states that data: and the , are the only required parts of a data URL, so it is wise to match on them.
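Putting it together with the question's original goal, a short usage sketch (using the regex variant above):

byte[] imageData = java.util.Base64.getDecoder()
        .decode(dataUrl.replaceFirst("data:.+,", ""));
BufferedImage inputImage = ImageIO.read(new ByteArrayInputStream(imageData));
// inputImage should now be non-null for any format ImageIO supports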
I'm new to Java. I've been able to read a "raw" image file with short data, display it, and save it as a .jp2 file, but the 150,000-byte file only gets compressed to just over 50,000 bytes. Some years ago I used a proprietary library in C/C++ to do this and achieved lossless compression of the same type of images in the range of 10:1.
My image data is in a BufferedImage and I save it thus:
...
// biGray is the BufferedImage and dest the file name
Iterator writers = ImageIO.getImageWritersByFormatName("JPEG2000");
ImageWriter writer = (ImageWriter) writers.next();
try {
    ImageIO.write(biGray, "JPEG 2000", dest);
} catch (IOException e) {
    e.printStackTrace();
}
Is it possible to get greater lossless, reversible compression? How?
Thanks, Nate.
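For what it's worth, the JAI Image I/O JPEG 2000 plugin exposes a J2KImageWriteParam with a lossless flag, and whether a plain ImageIO.write(biGray, "JPEG 2000", dest) uses lossless settings depends on the plugin's defaults. A hedged sketch that drives the writer obtained above explicitly (assuming the com.sun.media.imageio JPEG 2000 plugin is on the classpath and that your version has this API):

// import com.sun.media.imageio.plugins.jpeg2000.J2KImageWriteParam;
J2KImageWriteParam param = (J2KImageWriteParam) writer.getDefaultWriteParam();
param.setLossless(true); // request the reversible (integer) wavelet transform

try (ImageOutputStream stream = ImageIO.createImageOutputStream(dest)) {
    writer.setOutput(stream);
    writer.write(null, new IIOImage(biGray, null, null), param);
} finally {
    writer.dispose();
}

Note that roughly 2:1 to 3:1 is a typical lossless ratio for natural images; a 10:1 lossless ratio generally only happens for very compressible content (smooth, low-noise images).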