I have created the following method, which builds a base64-encoded string of an image. The issue I am having is that the image it grabs is high quality, but once it is saved into the byte array and then encoded, the result is pixelated and fairly low quality. What can I do to get a 100%-quality image?
public String getImageString(String img) {
    String image = "";
    try {
        BufferedImage bufferedImage = ImageIO.read(HelpPage.class.getResource(img));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(bufferedImage, "JPG", out);
        String base64bytes = Base64.encode(out.toByteArray());
        image = "data:image/jpeg;base64," + base64bytes;
    } catch (IOException ex) {
        Logger.getLogger(HomePage.class.getName()).log(Level.SEVERE, null, ex);
    }
    return image;
}
You don't need to use ImageIO at all here. Just read the bytes from the resource and base64-encode them.
You're converting a JPEG to another JPEG, which is an inherently lossy process, although it shouldn't be as bad as 'low quality'. But you don't need to do it at all.
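A minimal sketch of that approach, assuming Java 8's java.util.Base64 and the same HelpPage/HomePage classes as in the question:

public String getImageString(String img) {
    // Needs java.io.InputStream, java.io.ByteArrayOutputStream, java.util.Base64,
    // plus the existing java.util.logging imports.
    try (InputStream in = HelpPage.class.getResourceAsStream(img);
         ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        byte[] buffer = new byte[8192];
        int read;
        // Copy the resource bytes verbatim -- no decode/re-encode, so no quality loss.
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return "data:image/jpeg;base64," + Base64.getEncoder().encodeToString(out.toByteArray());
    } catch (IOException ex) {
        Logger.getLogger(HomePage.class.getName()).log(Level.SEVERE, null, ex);
        return "";
    }
}

Because the bytes are copied verbatim, the data URI contains exactly the JPEG stored in the resource, with no generational loss.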
Well, I'm trying to send a picture from Android to Java. If I do it with compression it works really well, but I need to do it without compression because I need good or normal quality.
FixBitmap is my current Bitmap picture
//Android
FixBitmap.compress(Bitmap.CompressFormat.PNG, 100, byteArrayOutputStream);
byteArray = byteArrayOutputStream.toByteArray();
ConvertImage = Base64.encodeToString(byteArray, Base64.DEFAULT);
Log.e(TAG,"LENGTH"+ConvertImage.length());
Length 43388
//Java
try {
    String file = request.getParameter("image_data");
    String filename = rt.getId() + "_" + rt.getName() + ".png";
    BufferedImage image = null;
    byte[] imageByte;
    BASE64Decoder decoder = new BASE64Decoder();
    imageByte = decoder.decodeBuffer(file);
    ByteArrayInputStream bis = new ByteArrayInputStream(imageByte);
    image = ImageIO.read(bis);
    bis.close();
    File outputfile = new File(filename);
    ImageIO.write(image, "png", outputfile);
    Path destinationFile = Paths.get(getServletContext().getRealPath("/") + "uploads\\", filename);
    Files.write(destinationFile, imageByte);
} catch (Exception ex) {
    System.out.println("Error :" + ex.getMessage());
}
This code actually works, as I said, but the compression makes it look very low quality, so I tried to do it without compression, just converting my bitmap to a byte array, like this:
ByteBuffer buffer = ByteBuffer.allocate(FixBitmap.getRowBytes() * FixBitmap.getHeight());
FixBitmap.copyPixelsToBuffer(buffer);
byteArray = buffer.array();
ConvertImage = Base64.encodeToString(byteArray, Base64.DEFAULT);
Log.e(TAG,"LENGTH"+ConvertImage.length());
Length 252107
The code on my Java side is the same, but now it doesn't work; it just shows me this error:
java.lang.IllegalArgumentException: image == null!
So I decided to print the length, because maybe there are some restrictions on this...
I hope you can help me with this (just send/get the picture without compression).
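For what it's worth, a likely reason for that error: ImageIO.read only recognizes encoded image containers (PNG, JPEG, ...) and returns null for anything else, so the raw pixel bytes produced by copyPixelsToBuffer cannot be decoded on the Java side, and the later ImageIO.write call throws image == null!. A minimal, self-contained sketch (the byte array is just a stand-in for the copied pixels):

import javax.imageio.ImageIO;
import java.io.ByteArrayInputStream;
import java.io.IOException;

public class RawPixelCheck {
    public static void main(String[] args) throws IOException {
        // Stand-in for the output of copyPixelsToBuffer(): raw ARGB pixels,
        // with no PNG/JPEG header for ImageIO to recognize.
        byte[] rawPixels = new byte[4 * 100 * 100];
        System.out.println(ImageIO.read(new ByteArrayInputStream(rawPixels))); // prints "null"
    }
}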
I'm playing around with Amazon Rekognition. I found a really nice/easy library that takes an image from my webcam, which works like this:
BufferedImage bufImg = webcam.getImage();
I'm then trying to convert this BufferedImage to a com.amazonaws.services.rekognition.model.Image, which is what must be submitted to the Rekognition library. This is what I'm doing:
byte[] imgBytes = ((DataBufferByte) bufImg.getData().getDataBuffer()).getData();
ByteBuffer byteBuffer = ByteBuffer.wrap(imgBytes);
return new Image().withBytes(byteBuffer);
However, when I try to make an API call to Rekognition with the Image, I get an exception:
com.amazonaws.services.rekognition.model.InvalidImageFormatException: Invalid image encoding (Service: AmazonRekognition; Status Code: 400; Error Code: InvalidImageFormatException; Request ID: X)
The docs state that the Java SDK will automatically base64-encode the bytes. In case something weird was happening, I tried base64-encoding the bytes before converting:
imgBytes = Base64.getEncoder().encode(imgBytes);
However, the same Exception ensues.
Any ideas? :)
I tried encoding the image to a JPG (Rekognition supports PNG or JPG formats) and it solved the problem.
BufferedImage bufImg = webcam.getImage();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(bufImg, "jpg", baos);
ByteBuffer byteBuffer = ByteBuffer.wrap(baos.toByteArray());
return new Image().withBytes(byteBuffer);
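If the lossy JPEG re-encode is a concern, PNG (also accepted by Rekognition, as noted above) would preserve the webcam frame exactly; a minimal variant of the same sketch:

BufferedImage bufImg = webcam.getImage();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(bufImg, "png", baos); // PNG is lossless, so the frame is preserved exactly
ByteBuffer byteBuffer = ByteBuffer.wrap(baos.toByteArray());
return new Image().withBytes(byteBuffer);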
I am converting an image into a byte[] using the following code.
public static byte[] extractBytes(String ImageName) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    BufferedImage img = ImageIO.read(new File(ImageName));
    ImageIO.write(img, "jpg", baos);
    return baos.toByteArray();
}
Now when I am testing my code:
public static void main(String[] args) throws IOException {
    String filepath = "image_old.jpg";
    File outp = new File(filepath);
    System.out.println("Size of original image=" + outp.length());
    byte[] data = extractBytes(filepath);
    System.out.println("size of byte[] data=" + data.length);
    BufferedImage img = ImageIO.read(new ByteArrayInputStream(data));
    // converting the byte[] array into an image again
    File outputfile = new File("image_new.jpg");
    ImageIO.write(img, "jpeg", outputfile);
    System.out.println("size of converted image=" + outputfile.length());
}
I am getting very strange results:
Size of original image=78620
size of byte[] data=20280
size of converted image=20244
After converting the image into a byte[], its size shrinks to roughly a quarter of the original, and when I convert the byte[] back into an image the size changes again. The output image is still created successfully in the desired location; I can only see a slight difference in quality between the original and the new image at 500-600% zoom, where the new image is a little blurred.
Here is the image I am testing with: http://pbrd.co/1BrOVbf
Please explain the reason for this change in size, and also let me know of any way to keep the size the same.
The image you have was compressed with the maximum quality setting ("100%", or 1.0 in ImageIO terms). JPEG compression isn't very effective at such high settings, so the file is quite a bit larger than usual. When you use ImageIO.write(..., "JPEG", ...), the default quality setting is applied. This default is 0.75 (the exact meaning of such a value is encoder-dependent, though, and isn't exact science), hence lower quality and a smaller file size.
(Another likely cause of such a significant decrease in file size between the original and the re-compressed image is the removal of metadata. Reading with ImageIO.read(file) effectively strips away any metadata in the JPEG file, such as XMP, Exif or ICC profiles. In extreme cases (yes, I'm talking mainly about Photoshop here ;-)) this metadata can take up more space than the image data itself (i.e. megabytes of metadata are possible). That is, however, not the case for your file.)
As you can see from the second re-compression (from byte[] to the final output file), the output is only slightly smaller than the input. This is because the quality setting (unspecified, so still the default) is the same in both cases (and any metadata is already gone at this point, so it doesn't add to the file size). The minor difference is likely due to small losses (rounding errors etc.) in the JPEG decompression/re-compression.
While slightly counter-intuitive, the least data loss (in terms of change from the original image, not file size) when re-compressing a JPEG is always achieved by re-compressing with the same quality setting as the original (using the exact same tables should be virtually lossless, although small rounding errors may still occur). Increasing the quality setting will make the output file larger, but the quality will actually degrade.
The only way to be 100% sure you don't lose any data or image quality is not to decode/encode the image in the first place, but rather to just copy the file byte by byte, for instance like this:
File in = ...;
File out = ...;

InputStream input = new FileInputStream(in);
try {
    OutputStream output = new FileOutputStream(out);
    try {
        copy(input, output);
    } finally {
        output.close();
    }
} finally {
    input.close();
}
And the copy method:
public void copy(final InputStream in, final OutputStream out) throws IOException {
    byte[] buffer = new byte[1024];
    int count;
    while ((count = in.read(buffer)) != -1) {
        out.write(buffer, 0, count);
    }
    // Flush out stream, to write any remaining buffered data
    out.flush();
}
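On Java 7 and later, the same byte-for-byte copy can also be done in one call with java.nio.file.Files (a minimal sketch, assuming the same in and out files and the java.nio.file.Files/StandardCopyOption imports):

// Copies the file verbatim, overwriting any existing output file.
Files.copy(in.toPath(), out.toPath(), StandardCopyOption.REPLACE_EXISTING);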
When you call ImageIO.write(img, "jpeg", outputfile), the ImageIO library writes a JPEG image using its own default compression parameters, and the output image ends up more compressed than the input image. You can adjust the level of compression by changing the parameter passed to jpegParams.setCompressionQuality below. The resulting file may be bigger or smaller than the original, depending on the relative compression levels of each.
public static ImageWriter getImageWriter() throws IOException {
    IIORegistry registry = IIORegistry.getDefaultInstance();
    Iterator<ImageWriterSpi> services = registry.getServiceProviders(ImageWriterSpi.class,
            (provider) -> {
                if (provider instanceof ImageWriterSpi) {
                    return Arrays.stream(((ImageWriterSpi) provider).getFormatNames())
                            .anyMatch(formatName -> formatName.equalsIgnoreCase("JPEG"));
                }
                return false;
            }, true);
    ImageWriterSpi writerSpi = services.next();
    ImageWriter writer = writerSpi.createWriterInstance();
    return writer;
}
public static void main(String[] args) throws IOException {
    String filepath = "old.jpg";
    File outp = new File(filepath);
    System.out.println("Size of original image=" + outp.length());

    byte[] data = extractBytes(filepath);
    System.out.println("size of byte[] data=" + data.length);

    BufferedImage img = ImageIO.read(new ByteArrayInputStream(data));
    File outputfile = new File("new.jpg");

    JPEGImageWriteParam jpegParams = new JPEGImageWriteParam(null);
    jpegParams.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    jpegParams.setCompressionQuality(1f);

    ImageWriter writer = getImageWriter();
    outputfile.delete();
    try (final ImageOutputStream stream = createImageOutputStream(outputfile)) {
        writer.setOutput(stream);
        try {
            writer.write(null, new IIOImage(img, null, null), jpegParams);
        } finally {
            writer.dispose();
            stream.flush();
        }
    }
    System.out.println("size of converted image=" + outputfile.length());
}
This solution is adapted from the answer by JeanValjean given here Setting jpg compression level with ImageIO in Java
I'm designing a program that stores geography data on the JavaFX platform. Whenever I convert a JavaFX Image into a BufferedImage and then a byte array (for the purpose of serialization), and then back into a BufferedImage and a JavaFX Image again, it gets slightly corrupted. Here's the code I'm using to convert back and forth:
private byte[] loadImageData(Image image) {
    try {
        // creating a byte array output stream from the Image
        BufferedImage bi = SwingFXUtils.fromFXImage(image, null);
        ByteArrayOutputStream baos = new ByteArrayOutputStream(1000);
        ImageIO.write(bi, "png", baos);
        baos.flush();
        byte[] imageData = baos.toByteArray();
        baos.close();
        return imageData;
    } catch (Exception e) {
        e.printStackTrace();
        return null; // the catch branch needs a return for the method to compile
    }
}

public Image restoreMapData(byte[] data) {
    try {
        // converting back to an image
        InputStream in = new ByteArrayInputStream(data);
        BufferedImage bi = ImageIO.read(in);
        return SwingFXUtils.toFXImage(bi, null);
    } catch (Exception e) {
        e.printStackTrace();
        return null;
    }
}
Could there be an error elsewhere? I've attached a corrupted and an uncorrupted picture of the data.
I also noticed that if I convert to a BufferedImage of type TYPE_INT_ARGB, it greatly diminishes the effect.
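For what it's worth, a sketch of forcing that TYPE_INT_ARGB conversion inside loadImageData before the PNG encode (the argb and g variables are illustrative additions, not part of the original code):

BufferedImage bi = SwingFXUtils.fromFXImage(image, null);
// Redraw into a plain TYPE_INT_ARGB raster before encoding, so the PNG writer
// sees a predictable pixel layout (needs java.awt.Graphics2D).
BufferedImage argb = new BufferedImage(bi.getWidth(), bi.getHeight(), BufferedImage.TYPE_INT_ARGB);
Graphics2D g = argb.createGraphics();
g.drawImage(bi, 0, 0, null);
g.dispose();
ImageIO.write(argb, "png", baos);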
I'm using the code below to upload an image.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
BufferedImage img = ImageIO.read(new File("abc.tiff"));
ImageIO.write(img, "tiff", baos);
img.flush();
img=null;
byte[] bytes2 = baos.toByteArray();
if (baos != null) {
    baos.flush();
    baos.close();
}
System.out.println("bytes2 size::::" + bytes2.length);
My original TIFF image is 119 KB, but when I print the bytes of the image I get a length of 800974 bytes.
Why does the image size increase, and what could be the issue?
Obviously your abc.tiff file is stored compressed (TIFF supports compressed variants), while img holds the decoded, unpacked image data, which then gets written back out without that compression. That is why your output is ~782 KB instead of 119 KB.
The TIFF format, like most image formats, supports lots of image modes and compression strategies.
ImageIO.write() might not be very clever at optimizing, or it might need some tweaking. See e.g. here.
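As an illustration (not a definitive fix), an explicit compression type can be requested from a TIFF ImageWriter, assuming one is registered (built into ImageIO since Java 9, or available via plugins such as TwelveMonkeys); the "LZW" choice here is just an example:

// Needs javax.imageio.* and javax.imageio.stream.ImageOutputStream.
BufferedImage img = ImageIO.read(new File("abc.tiff"));

ImageWriter writer = ImageIO.getImageWritersByFormatName("tiff").next();
ImageWriteParam param = writer.getDefaultWriteParam();
param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
param.setCompressionType("LZW"); // supported names vary by writer; "Deflate" and "PackBits" are common alternatives

ByteArrayOutputStream baos = new ByteArrayOutputStream();
try (ImageOutputStream ios = ImageIO.createImageOutputStream(baos)) {
    writer.setOutput(ios);
    writer.write(null, new IIOImage(img, null, null), param);
} finally {
    writer.dispose();
}
System.out.println("bytes2 size::::" + baos.toByteArray().length);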