I have a question out of curiosity. When a JPG image is decoded into a Bitmap, it takes a lot of memory, since JPG is a compressed format. So for a 0.5 MB JPG image, the bitmap is almost 4 MB.
My question: if I want to upload an image, can I just read it from the file and send it to the server, i.e. do I NOT need to load it into a Bitmap?
Has anybody ever tried this?
Does it make sense?
Thanks for the help.
Yes, this is possible. You don't need to convert the JPG to a bitmap to upload it to a server.
You can consider using a higher-level API like http://loopj.com/android-async-http/. Please see the section "Uploading Files with RequestParams" for more details.
JPG is a format for storing compressed images in files. It is practical because it takes less memory than the raw bitmap format, in which the color values of all pixels in the image are explicitly present in the data.
You do need to decompress the JPG file in order to display it; however, the best way to send it is as the compressed JPG file itself (no need to load a bitmap, just send the bytes in the file).
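For example, here is a minimal sketch that streams the raw file bytes with the standard HttpURLConnection. The URL, file path, and content type are placeholders, and many real servers will expect a multipart form body instead; on Android, run this off the main thread:

import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

// Stream the jpg file's bytes straight to the server; the image is never
// decoded into a Bitmap, so memory use stays at the size of one small buffer.
File jpg = new File("/sdcard/DCIM/photo.jpg");
HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/upload").openConnection();
conn.setDoOutput(true);
conn.setRequestMethod("POST");
conn.setRequestProperty("Content-Type", "image/jpeg");
conn.setFixedLengthStreamingMode(jpg.length());
try (InputStream in = new FileInputStream(jpg);
     OutputStream out = conn.getOutputStream()) {
    byte[] buffer = new byte[8192];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read); // raw compressed bytes, no decompression anywhere
    }
}
int status = conn.getResponseCode(); // check the server's answer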
Note that JPEG compression is lossy, meaning that the reduction in size comes with a degradation in quality from the original source image (see the thorough and well-illustrated Wikipedia entry: https://en.wikipedia.org/wiki/Jpg).
Currently I am running out of RAM when rendering images. While I have optimized for memory efficiency as much as I can, I have realized that for images as large as I want, it will never be enough, so instead I want to write to a file as I render. I have no idea how I could do this, and I'm almost certain that most image formats won't be suitable for it, because to recalculate the compression they would need to hold the image being appended to in RAM.
This is my current code. I'm happy to completely switch libraries to get this to work.
panel.logic.Calculate(false); // Renders the image - I can make this render in steps
Graphics2D g2d = bufferedImage.createGraphics();
panel.paint(g2d); // Paints the full rendered image into the in-memory BufferedImage
g2d.dispose();
SimpleDateFormat formatter = new SimpleDateFormat("dd-MM-yyyy_HH-mm-ss");
Date date = new Date();
File file = new File(formatter.format(date) + ".png"); // Timestamped output file name
try {
    ImageIO.write(bufferedImage, "png", file); // Encodes the whole in-memory image to PNG
} catch (IOException ioException) {
    ioException.printStackTrace();
}
Yes, this is possible in several image formats, but it won't be easy with the standard Java Graphics2D API. In particular, the class java.awt.image.BufferedImage explicitly represents an image whose entire bitmap is held in memory.
I would start by asking: how large are the images you have in mind here? Unless your generating program is unusually memory constrained, any image that is too big to hold in memory during generation will also be too big to hold in memory during display, so it would be useless, I think?
In order to write an image file in a "streaming" style, you will need a format that allows you to write pixels or regions incrementally. This will be hard in more sophisticated image formats like JPEG, but easier in more pixel-oriented formats like BMP or PNG.
For BMP, you would write the file header, then stream the pixel array out into the file, which you can do row by row (or even pixel by pixel) without holding the whole image in memory; there is no footer to write. The format is described here: https://en.wikipedia.org/wiki/BMP_file_format
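To make that concrete, here is a minimal sketch of a streaming 24-bit BMP writer. The PixelSource callback is hypothetical; it stands in for whatever way your renderer can produce one pixel at a time:

import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Streams a 24-bit BMP to disk one row at a time; only one row is ever in memory.
public class StreamingBmpWriter {

    interface PixelSource {
        int rgbAt(int x, int y); // returns 0xRRGGBB for the requested pixel
    }

    static void writeLE32(OutputStream out, int v) throws IOException {
        out.write(v & 0xFF); out.write((v >> 8) & 0xFF);
        out.write((v >> 16) & 0xFF); out.write((v >> 24) & 0xFF);
    }

    static void writeLE16(OutputStream out, int v) throws IOException {
        out.write(v & 0xFF); out.write((v >> 8) & 0xFF);
    }

    public static void write(String path, int width, int height, PixelSource src) throws IOException {
        int rowSize = (width * 3 + 3) & ~3;           // rows are padded to 4-byte boundaries
        int fileSize = 14 + 40 + rowSize * height;
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream(path))) {
            // --- 14-byte file header ---
            out.write('B'); out.write('M');
            writeLE32(out, fileSize);
            writeLE32(out, 0);                         // reserved fields
            writeLE32(out, 54);                        // offset to pixel data
            // --- 40-byte BITMAPINFOHEADER ---
            writeLE32(out, 40);
            writeLE32(out, width);
            writeLE32(out, height);                    // positive height = bottom-up rows
            writeLE16(out, 1);                         // color planes
            writeLE16(out, 24);                        // bits per pixel
            writeLE32(out, 0);                         // BI_RGB, no compression
            writeLE32(out, rowSize * height);          // image data size
            writeLE32(out, 2835); writeLE32(out, 2835); // ~72 DPI in pixels/metre
            writeLE32(out, 0); writeLE32(out, 0);      // palette fields, unused at 24bpp
            // --- pixel data, one row at a time, bottom row first ---
            byte[] row = new byte[rowSize];            // padding bytes stay zero
            for (int y = height - 1; y >= 0; y--) {
                for (int x = 0; x < width; x++) {
                    int rgb = src.rgbAt(x, y);
                    row[x * 3]     = (byte) (rgb & 0xFF);         // blue
                    row[x * 3 + 1] = (byte) ((rgb >> 8) & 0xFF);  // green
                    row[x * 3 + 2] = (byte) ((rgb >> 16) & 0xFF); // red
                }
                out.write(row);
            }
        }
    }
}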
For PNG, it would be much the same, except that the file format is quite a bit more complicated and involves a compression layer (which can still be handled in a streaming fashion).
There aren't many libraries that take this approach, because of the limitation I outlined above and that other commenters have pointed out: if an image is so large that it needs this, then it will usually be too large ever to display.
I think you might be able to persuade ImageIO to write a BMP or PNG in a streaming fashion if you implement a custom RenderedImage class. I might see if I can get that to work; I'll update here if so.
Example code
I had a go at writing a PNG image in a streaming fashion using ImageIO.
Here's the code:
https://gist.github.com/RichardBradley/e7326ec777faccb9579ad4e0b0358f87
I found that the PNG encoder will request the image one scanline at a time, regardless of the Tile settings of the Image.
See com/sun/imageio/plugins/png/PNGImageWriter.java:818
(This may in fact be a bug in PNGImageWriter, but no-one has noticed because no-one writes images in a streaming style in real world use.)
If you want to stream the data pixel-by-pixel instead of line-by-line, you could fork PNGImageWriter and refactor that section. I think it should be quite possible.
(I am not 100% certain that something inside the ImageIO / PNGImageWriter pipeline will not just buffer the image to memory anyway. You could turn the image size right up and retest to be sure.)
Your problem is not that the final image may not fit in memory.
The problem is that the rendering process takes too much memory.
This means that you have to modify the rendering process in such a way that it will write its intermediate results to disk instead of keeping it in memory.
This may mean that you use the BMP format and write it bit by bit to disk as described in the answer provided by Rich (OK, in larger chunks, not literally each single bit …), or that you write an intermediate format of your own, or that you use disk space as cache memory.
But if your current rendering process already finishes without an OOME, writing the resulting image to disk cannot be the real issue. Only if writing means that the given data structure has to be converted again into a particular format could that cause a problem (for example, the renderer returns a byte array holding the image as BMP, but the output should be a JPEG; in that case, you may have to hold the image in memory twice, and that could cause the OOME).
But without knowing details about what panel.logic.Calculate() and panel.paint() are really doing (in detail!), the question is difficult to answer.
First, I think you can assign more memory to the JVM by setting the -Xmx JVM parameter.
Second, use a lazy-load strategy: you can try splitting the image so that each part is loaded only when it is displayed on the panel.
Maybe you can take the following steps:
Downsize the image
Change image format
Assign more memory to the JVM (see the example after this list)
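For the last point, the flag itself is standard JVM syntax; the jar name below is just a placeholder:

java -Xmx4g -jar your-renderer.jar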
I am trying to implement attachments in my application, where users can upload image files (png, jpg, jpeg). I have read the OWASP recommendations for image uploads, and one of the tips was to convert the input image to a bitmap (keeping only the bitmap data, and throwing away all the extra annotations), then convert the bitmap to your desired output format. One reasonable way to do this is to convert to PBM format, then convert to PNG.
The image is saved as a byte array.
I am trying to rewrite the uploaded image using ImageTranscoder from the ImageIO library, but I am not really sure what it is doing, or whether all the possibly malicious code is removed from the image, because it seems that only the metadata is rewritten.
Are there any suggestions or best practices for achieving the desired goal of removing all possibly malicious code from an image file?
You do not need an intermediate file format like PBM, as BufferedImage (which is the standard way of representing an in-memory bitmap in Java) is just plain pixel data. You can just go from encoded "anything" to decoded bitmap to encoded PNG.
The simplest way you could possibly do what you describe is:
ImageIO.write(ImageIO.read(input), "PNG", output);
This is rather naive code and will break for many real-world files, or possibly just silently output nothing. You probably want to handle at least the most common error cases, so something like the below:
BufferedImage image = ImageIO.read(input);
if (image == null) {
    // TODO: Handle image not read (decoded)
}
else if (!ImageIO.write(image, "PNG", output)) {
    // TODO: Handle image not written (could not be encoded as PNG)
}
Other things to consider: the above will remove malicious code hidden in the metadata. However, there may be images specially crafted for DoS (small files that decode to huge in-memory representations, TIFF IFD loops, and much more). Those problems need to be addressed in the image decoders for the various input formats, but at least your output files should be safe from them.
In addition, malicious code could be stored in the ICC profile, which might be carried over to the output image. You can probably avoid this by force converting all images to the built-in sRGB color space, or writing the images without ICC profiles.
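A minimal sketch of that forced sRGB conversion using the standard ColorConvertOp (whether this is sufficient for your threat model is an assumption you should verify; 'image' is the BufferedImage from the snippet above):

import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ColorConvertOp;

// Convert whatever color space the decoded image uses into plain sRGB,
// so no embedded ICC profile is carried over into the re-encoded output.
ColorConvertOp toSRGB = new ColorConvertOp(ColorSpace.getInstance(ColorSpace.CS_sRGB), null);
BufferedImage safeImage = toSRGB.filter(image, null);
// Then encode safeImage (not the original image) as PNG, as shown above.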
PS: The ImageTranscoder interface is intended for situations where you want to keep as much metadata as possible (that is why it has methods only for metadata), and it allows transforming metadata from one file format to another (one could argue the name should have been MetadataTranscoder).
I was using Java's ImageIO and BufferedImage for some image operations and wanted to see how they behave when I output the image as a JPG. There I found some interesting behaviour that I can't quite explain. I have the following code, which reads an image and then outputs the same image as "copy.jpg" in the same folder. The code is in Kotlin, but the functions used are Java functions:
val image = File("some/image/path.jpg")
val bufImage = ImageIO.read(image.inputStream())
FileOutputStream(File(image.parentFile, "copy.jpg")).use { os ->
    ImageIO.write(bufImage, "jpg", os)
}
I would expect it to output exactly the same file, except perhaps for the meta information. However, the resulting file was almost a tenth of the size of the original. I doubt the meta information accounts for that much. The exact size difference varied depending on which image file I used, but every time the output image was smaller. Yet I could not see a quality difference from the old file; when zooming in, I saw the same pixels.
Why is the file size reduced so dramatically?
JPEG is lossy compression: it throws away lots of information in order to keep the file small. (An uncompressed image file could be orders of magnitude larger.)
It's intended to throw away information that you're not likely to see or care about, of course; but it still loses some image data.
And the loss is generational: if you have an image that came from a JPEG file, and then recompress it to a JPEG file, it will usually lose more data, giving a worse-quality result than the first JPEG file — even if the compression settings are exactly the same. (Trying to approximate an already-compressed image won't work the same as trying to approximate the original source image. And there's no way to recover information which is already lost!)
That's almost certainly what's happening here. Your code reads a JPEG file and expands it into a BufferedImage (which holds the uncompressed image data), and then compresses it again into a new JPEG file, which loses further quality. It's probably using a lot higher compression than the first file used, hence the smaller size.
I'd be surprised if you couldn't see any difference between the two JPEG files in an image viewer or editor, when magnified. (JPEG artefacts are most obvious around sharp edges and boundaries, but if you know what to look for you can sometimes see them elsewhere. Subtle changes can be easier to see if you can line up both images on the exact same area of screen and flip directly between them.)
You can control how much information is lost when creating a JPEG — but the ImageIO.write() method you're using doesn't provide a way to do that. See this question for how to do it. (It's in Java, but you should be able to follow it.)
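For reference, a minimal sketch of the usual approach with an explicit ImageWriter (this is the standard javax.imageio API; the 0.9f quality value and the bufImage variable from your question are just for illustration):

import java.io.File;
import javax.imageio.*;
import javax.imageio.stream.ImageOutputStream;

// Write bufImage as JPEG with an explicit compression quality,
// instead of whatever default ImageIO.write() would pick.
ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
ImageWriteParam param = writer.getDefaultWriteParam();
param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
param.setCompressionQuality(0.9f); // 0.0f = smallest file, 1.0f = best quality
try (ImageOutputStream out = ImageIO.createImageOutputStream(new File("copy.jpg"))) {
    writer.setOutput(out);
    writer.write(null, new IIOImage(bufImage, null, null), param);
} finally {
    writer.dispose();
}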
Obviously, the more information you're prepared to lose, the smaller file you can end up with. But note that if you choose a high-quality setting, the result could be a lot larger than the first JPEG, even though it will probably still lose slightly more quality.
(That's why, if you're doing any sort of processing on an image, it's best to keep it in lossless formats until the very end, and compress to a lossy format like JPEG only once, to avoid losing quality each time you save and reload.)
As you indicate, another reason could be the loss of non-image data — you're unlikely to notice the loss of metadata such as camera settings, but the file could have had a sizeable thumbnail image too.
I am using Java's javax.imageio.ImageIO to read content from my FlashAir 32GB (2nd gen) card and write it to my local hard drive. I can read and save the files, but I noticed that the ones on the hard drive are much smaller than the originals on the SD card. Any ideas why this is happening?
This is the code that I use:
URL imageURL = new URL("http://192.168.0.1/DCIM/100__TSB/IMG_0001.JPG");
File dest = new File("/home/me/Pictures/FlashAir/IMG_0001.JPG");
BufferedImage image = ImageIO.read(imageURL);
ImageIO.write(image, "JPG", dest);
The images saved to the local hard drive are fine, but they are about 1/3 of the original size.
That code isn't just copying (reading and saving) files. It's decoding the images, and then re-encoding them at the default JPEG compression rate (which, judging by the JPEGImageWriteParam documentation, is 0.75).
If you want to change the compression level, check out this question.
If you're trying to copy the files exactly, don't use ImageIO at all.
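For an exact, byte-for-byte copy, plain streams are enough. A minimal sketch using the standard java.nio.file API (paths taken from the question):

import java.io.InputStream;
import java.net.URL;
import java.nio.file.*;

// Copy the file's bytes verbatim; nothing is decoded or re-encoded,
// so the copy is identical to the original on the card.
URL imageURL = new URL("http://192.168.0.1/DCIM/100__TSB/IMG_0001.JPG");
Path dest = Paths.get("/home/me/Pictures/FlashAir/IMG_0001.JPG");
try (InputStream in = imageURL.openStream()) {
    Files.copy(in, dest, StandardCopyOption.REPLACE_EXISTING);
}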
I use sockets to send JPG images from a server (Android) to a client. I want to attach timestamps to these images, which are of type long. Since these images have already been processed by image filters, I don't want to save them before transmission, so using ExifInterface seems impossible. I therefore tried to use IIOMetadata but never got it to work. I don't want to use external libs like Sanselan.
What is the easiest way to do it? If using IIOMetadata is the best way, could you please provide a working example of how to attach this to my byte[] and extract it later?
You can send a jpg file and then add 8 bytes encoding the long timestamp, then another jpg and 8 bytes of timestamp, and so on.
You can detect the end of each JPEG using what is described here:
Detect Eof for JPG images
Okay, I did what Pablo suggested, but attached the timestamp to the front of the image.
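For what it's worth, a minimal sketch of that framing (timestamp in front of each image, as in the comment above). The length prefix is this sketch's own addition, to avoid scanning for the JPEG end-of-image marker; DataOutputStream and DataInputStream are standard Java:

import java.io.*;

// One framed message: 8-byte timestamp, then 4-byte length, then the JPEG bytes.
// The length prefix is an assumption of this sketch; alternatively, the reader
// could scan the stream for the JPEG end-of-image marker (0xFF 0xD9) instead.
class TimestampedJpeg {
    final long timestamp;
    final byte[] jpegBytes;

    TimestampedJpeg(long timestamp, byte[] jpegBytes) {
        this.timestamp = timestamp;
        this.jpegBytes = jpegBytes;
    }

    // Sender side: the timestamp goes in front of the image.
    void writeTo(DataOutputStream out) throws IOException {
        out.writeLong(timestamp);       // 8 bytes, big-endian
        out.writeInt(jpegBytes.length); // 4 bytes
        out.write(jpegBytes);           // the filtered image, exactly as produced
        out.flush();
    }

    // Receiver side: read the fields back in the same order.
    static TimestampedJpeg readFrom(DataInputStream in) throws IOException {
        long ts = in.readLong();
        byte[] data = new byte[in.readInt()];
        in.readFully(data);
        return new TimestampedJpeg(ts, data);
    }
}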