I have some code which uses the ImageReader class to read in a large number of TIF images. The imageReader object is final and created in the constructor.
synchronized (imageReader) {
    LOG.debug(file);
    FileInputStream fin = new FileInputStream(file);
    ImageInputStream iis = ImageIO.createImageInputStream(fin);
    imageReader.setInput(iis, false);
    int sourceXSubSampling = targetSize == null ?
            1 : Math.max(1, imageReader.getWidth(0) / targetSize.width);
    int sourceYSubSampling = targetSize == null ?
            1 : Math.max(1, imageReader.getHeight(0) / targetSize.height);
    ImageReadParam subSamplingParam = new ImageReadParam();
    subSamplingParam.setSourceSubsampling(sourceXSubSampling, sourceYSubSampling, 0, 0);
    return imageReader.read(0, subSamplingParam);
}
About one instance in four, the ImageReader gets "stuck" on the first image it loaded and keeps loading that same image over and over again, even though it is provided with different ImageInputStreams. This is evidenced by the output to the logger.
How do I solve this? I was thinking of taking a "fingerprint" of the image and getting a different ImageReader from the iterator when this occurs, but that seems like overkill. Does anyone know how to solve this problem?
As @MadProgrammer says in the comments, the typical pattern for reading multiple images is to obtain a new ImageReader for each image, and afterwards dispose() it. The time/memory spent on creating a reader instance is very small compared to actually reading an image, so any performance penalty should be negligible.
In theory it should, however, be sufficient to invoke reset() on the ImageReader before/after each read.
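For illustration, here is a minimal sketch of that per-image pattern; the method name readTif and the error handling are assumptions of this sketch, not the original code:

import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

BufferedImage readTif(File file) throws IOException {
    // ImageInputStream is Closeable since Java 7, so try-with-resources applies.
    try (ImageInputStream iis = ImageIO.createImageInputStream(file)) {
        Iterator<ImageReader> readers = ImageIO.getImageReaders(iis);
        if (!readers.hasNext()) {
            throw new IOException("No ImageReader for " + file);
        }
        ImageReader reader = readers.next(); // a fresh reader per image...
        try {
            reader.setInput(iis, false);
            return reader.read(0);
        } finally {
            reader.dispose(); // ...disposed as soon as the read is done
        }
    }
}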
Related
I am working with OpenCV in Java, but I don't understand part of a class that loads pictures:
public class ImageProcessor {
    public BufferedImage toBufferedImage(Mat matrix) {
        int type = BufferedImage.TYPE_BYTE_GRAY;
        if (matrix.channels() > 1) {
            type = BufferedImage.TYPE_3BYTE_BGR;
        }
        int bufferSize = matrix.channels() * matrix.cols() * matrix.rows();
        byte[] buffer = new byte[bufferSize];
        matrix.get(0, 0, buffer); // get all the pixels
        BufferedImage image = new BufferedImage(matrix.cols(), matrix.rows(), type);
        final byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
        System.arraycopy(buffer, 0, targetPixels, 0, buffer.length);
        return image;
    }
}
The main class sends a Mat object to this class.
The method returns a BufferedImage, but I don't understand targetPixels because the class doesn't use it anywhere else. Yet whenever I comment out targetPixels and System.arraycopy, the result shows a black picture.
I want to know what targetPixels is, and what it does.
The point is less about that array than about the methods that get you there.
You start here: getRaster(). That is supposed to return a WritableRaster ... and so on.
That class uses getDataBuffer() from the Raster class; and there we find:
A class representing a rectangular array of pixels. A Raster encapsulates a DataBuffer that stores the sample values and a SampleModel that describes how to locate a given sample value in a DataBuffer.
What happens here, in essence: that image object, in the end, has an array of bytes that is supposed to contain certain information.
That assignment:
final byte[] targetPixels = ...
retrieves a reference to that internal data, and then System.arraycopy() is used to copy data into that array.
For the record: that doesn't look like a good approach, as it only works when this copy action really affects the internals of that image object. But what if that last call, getData(), returned a copy of the internal data instead?
In other words: this code tries to gain direct access to the internal data of some object, and then manipulates that internal data.
Even if that works today, it is not robust and might break easily in the future. The other thing: note that this code does an unconditional cast to (DataBufferByte). That cast throws a ClassCastException (a RuntimeException) if the buffer doesn't have exactly that type.
Probably that is "all fine", since it all relates to AWT classes which have existed for ages and will probably never change. But as said, this code has various potential issues.
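If you want to avoid aliasing the image's internals at all, here is a minimal alternative sketch (using the same matrix, type, and buffer as above): setDataElements() copies the pixels through the public API and works no matter which DataBuffer subclass backs the image.

BufferedImage image = new BufferedImage(matrix.cols(), matrix.rows(), type);
// Let the raster copy the pixels instead of exposing its backing array.
image.getRaster().setDataElements(0, 0, matrix.cols(), matrix.rows(), buffer);
return image;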
targetPixels is the main image data (i.e. the pixels) of your new image. The actual image is created when the pixel data is copied from buffer into targetPixels.
targetPixels is the array of bytes from your newly created BufferedImage; those bytes are initially empty, thus you need to copy the bytes from the buffer into it with System.arraycopy. :)
Nearest neighbor scaling works: The entire picture stays intact when I use TYPE_NEAREST_NEIGHBOR.
Even though this is Scala code, all the libraries used are standard Java libraries.
Functions:
def getBufferedImage(imageFile: java.io.File): BufferedImage = {
  ImageIO.read(imageFile)
}

def scaleImage(image: BufferedImage, minSize: Double): BufferedImage = {
  val before: BufferedImage = image
  val w = before.getWidth()
  val h = before.getHeight()
  val affit = new AffineTransform()
  var scale = 1.0
  if (h < w) {
    if (h > 0) {
      scale = minSize / h
    }
  } else {
    if (w > 0) {
      scale = minSize / w
    }
  }
  affit.scale(scale, scale)
  val affitop = new AffineTransformOp(affit, AffineTransformOp.TYPE_BICUBIC)
  affitop.filter(before, null)
}

def getImageJpegByteArray(image: BufferedImage): Array[Byte] = {
  val baos = new java.io.ByteArrayOutputStream()
  val mcios = new MemoryCacheImageOutputStream(baos)
  ImageIO.write(image, "jpeg", mcios)
  mcios.close()
  baos.toByteArray
}
Calling code snippet:
val img = getBufferedImage(imageFile)
val scaledImg = scaleImage(img, 512)
val result = getImageJpegByteArray(scaledImg)
// result is written to SQLite database
result is written to an SQLite database. If I download it from the database and save it as a JPEG file, the resulting JPEG is
as expected if I use AffineTransformOp.TYPE_NEAREST_NEIGHBOR
completely black if I use AffineTransformOp.TYPE_BILINEAR
completely black if I use AffineTransformOp.TYPE_BICUBIC
Consequently, I accuse AffineTransformOp of being buggy...
How can I solve this problem?
The file magic number of result is always ff d8 ff, as expected for JPEG.
Details
Java version: Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71
Operating System: Apple, OS X 10.9.5
Test image: http://www.photos-public-domain.com/wp-content/uploads/2012/05/thundercloud-plum-blossoms.jpg
I was able to reproduce your issue on Java 1.7.0_71 on OS X 10.10.4 (I rewrote your code in Java; I can post the full code if you are interested).
In any case, the problem is not that AffineTransformOp is buggy in itself. In my test program I displayed the image using a minimal Swing JFrame, and the scaled image looked all good there. This is likely why most people in the comments did not understand the problem.
Part of the issue is that when you don't provide a destination image to the filter method (the second parameter, null in your case), AffineTransformOp will create one for you. This image will get type BufferedImage.TYPE_INT_ARGB. Here is the relevant code from AffineTransformOp.createCompatibleDestImage() (lines 456-468; I kept the formatting to make it easier to spot):
ColorModel cm = src.getColorModel();
if (interpolationType != TYPE_NEAREST_NEIGHBOR &&
    (cm instanceof IndexColorModel ||
     cm.getTransparency() == Transparency.OPAQUE))
{
    image = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
}
else {
    image = new BufferedImage(cm,
            src.getRaster().createCompatibleWritableRaster(w, h),
            cm.isAlphaPremultiplied(), null);
}
Notice the special case for TYPE_NEAREST_NEIGHBOR, which explains why you'll get different behavior when using nearest neighbor algorithm.
Normally this is all good, however (as I said, the image displays just fine in a Swing component).
The problem arises when you try to store this image as a JPEG. Over the years, there has been a lot of confusion and many issues related to the ImageIO JPEG plugin and whether it allows you to write images with an alpha channel (like your TYPE_INT_ARGB image). It does allow that. But most often, ARGB JPEGs will get misinterpreted as CMYK JPEGs (as they have 4 channels, and storing ARGB data in JPEG is very exotic) and will be displayed in all funky colors. In your case, though, it seems to be all black...
So, there are two possible solutions:
Either write your image in a file format that supports an alpha channel, like PNG or TIFF (TIFF requires an extra plugin, so it might not be the best choice). Like this:
ImageIO.write(image, "PNG", mcios);
Or, make sure your BufferedImage is in a pixel format without an alpha channel before storing it as JPEG. You can do this after the scaling, but the easiest (and fastest) is to provide the AffineTransformOp with an explicit destination image, like this:
Rectangle newSize = affitop.getBounds2D(before).getBounds();
return affitop.filter(before,
new BufferedImage(newSize.width, newSize.height, BufferedImage.TYPE_3BYTE_BGR));
Here is your image, scaled by the program, using JPEG format and the TYPE_3BYTE_BGR:
I'm sure you can rewrite my Java code back to Scala. :-)
We have an application which serves images. To speed up the response time, we cache the BufferedImage directly in memory.
class Provider {
    @Override
    public IData render(String coordinate, String... layers) {
        int rwidth = 256, rheight = 256;
        ArrayList<BufferedImage> result = new ArrayList<BufferedImage>();
        for (String layer : layers) {
            String lkey = layer + "-" + coordinate;
            BufferedImage imageData = cacher.get(lkey);
            if (imageData == null) {
                try {
                    imageData = generateImage(layer, coordinate, rwidth, rheight);
                    cacher.put(lkey, imageData);
                } catch (IOException e) {
                    e.printStackTrace();
                    continue;
                }
            }
            if (imageData != null) {
                result.add(imageData);
            }
        }
        return new Data(rwidth, rheight, rwidth, result);
    }

    private BufferedImage generateImage(String layer, String coordinate, int rwidth, int rheight) throws IOException {
        BufferedImage image = new BufferedImage(rwidth, rheight, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = image.createGraphics();
        g.setColor(Color.RED);
        g.drawString(layer + "-" + coordinate, new Random().nextInt(rwidth), new Random().nextInt(rheight));
        g.dispose();
        return image;
    }
}
class Data implements IData {
    public Data(int imageWidth, int imageHeight, int originalWidth, ArrayList<BufferedImage> images) {
        // Copy the constructor arguments into the fields before they are used.
        this.imageWidth = imageWidth;
        this.imageHeight = imageHeight;
        this.imageResult = new BufferedImage(this.imageWidth, this.imageHeight, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = imageResult.createGraphics();
        for (BufferedImage imgData : images) {
            g.drawImage(imgData, 0, 0, null);
        }
        imageResult.flush();
        g.dispose();
        images.clear();
    }

    @Override
    public void save(OutputStream out, String format) throws IOException {
        ImageIO.write(this.imageResult, format, out);
        out.flush();
        this.imageResult = null;
    }
}
usage:
class ImageServlet extends HttpServlet {
    void doGet(req, res) {
        IData data = provider.render(req.getParameter("coordinate"), req.getParameter("layers").split(","));
        OutputStream out = res.getOutputStream();
        data.save(out, "png");
        out.flush();
    }
}
Note: the provider field is a single instance.
However, it seems that there is a possible memory leak, because I get an OutOfMemoryError when the application keeps running for about 2 minutes.
Then I used VisualVM to check the memory usage:
Even if I perform GC manually, the memory cannot be released.
And though there are only 300+ BufferedImages cached, and 20MB+ of memory is used, 1.3GB+ of memory is retained. In fact, through Firebug I can verify that a generated image is less than 1KB. So I think the memory usage is not healthy.
Once I do not use the cache (commenting out the following line):
//cacher.put(lkey, imageData);
The memory usage looks good:
So it seems that the cached BufferedImages cause the memory leak.
Then I tried to transform the BufferedImage to byte[] and cache the byte[] instead of the object itself, and the memory usage was still normal. However, I found that the serialization and deserialization of the BufferedImage cost too much time.
So I wonder if you have any experience with image caching?
Update:
Since so many people have said that there is no memory leak but that my cacher uses too much memory, I am not sure, but I have tried to cache byte[] instead of BufferedImage directly, and the memory use looks good. I cannot imagine 322 images taking up 1.5GB+ of memory; as @BrettOkken said, the total size should be (256 * 256 * 4 bytes) * 322 / 1024 / 1024 = 80MB, far less than 1GB.
And just now, I changed to caching the bytes and monitored the memory again; the code changed like this:
BufferedImage ig = generateImage(layer, coordinate, rwidth, rheight);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
ImageIO.write(ig, "png", bos);
imageData = bos.toByteArray();
tileCacher.put(lkey, imageData);
And the memory usage:
Same code, same operation.
Note from both VisualVM screenshots that the 97.5% of memory consumed by 4,313 instances of int[] (which I assume belong to the cached buffered images) is not consumed in the non-cached version.
Although you have a less-than-1KB PNG image (which is compressed, as per the PNG format), this single image is generated out of multiple instances of BufferedImage (which are not compressed). Hence you cannot directly correlate the image size seen in the browser with the memory occupied on the server. So the issue here is not a memory leak, but the amount of memory required to cache the uncompressed layers of buffered images.
The strategy to resolve this is to tweak your caching mechanism:
If possible, cache a compressed version of the layers instead of raw images (see the sketch after this list).
Ensure that you never run out of memory by limiting the cache size, either by instance count or by amount of memory used; use either an LRU or LIRS cache eviction policy.
Use a custom key object with coordinate and layer as two separate variables, overriding equals/hashCode, to use as the key.
Observe the behavior: if you have too many cache misses, you will need a better caching strategy, or the cache may be unnecessary overhead.
I believe you are caching layers because you expect combinations of layers and coordinates, and hence cannot cache final images; but depending on the kind of request pattern you expect, you may want to consider that option if possible.
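As a rough sketch of the first two points, here is a bounded LRU cache of compressed tile bytes built on LinkedHashMap; the class name TileCache and the 500-entry limit are assumptions of this sketch, not part of the question's code:

import java.util.LinkedHashMap;
import java.util.Map;

// Bounded LRU cache: once the limit is reached, the least recently
// accessed entry is evicted, so memory use stays capped.
class TileCache extends LinkedHashMap<String, byte[]> {
    private static final int MAX_ENTRIES = 500;

    TileCache() {
        super(16, 0.75f, true); // true = access order, i.e. LRU
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, byte[]> eldest) {
        return size() > MAX_ENTRIES;
    }
}

A production cache would also need thread safety (e.g. Collections.synchronizedMap, or a library cache such as Guava's).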
Not sure what caching API you are using or what the actual values in your requests are. However, based on VisualVM it looks to me like String objects are leaking. Also, as you mentioned, if you turn off caching the problem is resolved.
Consider the below snippet extracted from your code.
String lkey = layer + "-" + coordinate;
BufferedImage imageData = cacher.get(lkey);
Now here are a few things for you to consider in this code.
You are possibly getting new String objects each time for lkey.
Your cache has no upper limit and no eviction policy (e.g. LRU).
If the cacher compares keys with == instead of String.equals(), these new String objects never match, causing a new entry each time (a value-based key class, sketched below, sidesteps this).
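For the key issue, a minimal value-based key sketch (the class name TileKey is an assumption of this sketch):

// A key that compares by value, so equal layer/coordinate pairs
// always hit the same cache entry regardless of String identity.
final class TileKey {
    private final String layer;
    private final String coordinate;

    TileKey(String layer, String coordinate) {
        this.layer = layer;
        this.coordinate = coordinate;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof TileKey)) return false;
        TileKey other = (TileKey) o;
        return layer.equals(other.layer) && coordinate.equals(other.coordinate);
    }

    @Override
    public int hashCode() {
        return 31 * layer.hashCode() + coordinate.hashCode();
    }
}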
VisualVM is a start but it doesn't give the complete picture.
You need to trigger a heap dump while the application is using a high amount of memory.
You can trigger a heap dump from VisualVM. It can also be done automatically on an OOME if you add this VM argument to the java process:
-XX:+HeapDumpOnOutOfMemoryError
Use the Memory Analyzer Tool (MAT) to open and inspect the heap dump.
The tool is quite capable and can help you walk the object references to discover:
What is actually using your memory.
Why the objects from #1 aren't being garbage collected.
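If you prefer triggering dumps from code rather than from VisualVM, here is a sketch using the HotSpot diagnostic MXBean (a HotSpot-specific API; the method name dumpHeap is an assumption of this sketch):

import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

// Writes a heap dump of live objects to the given .hprof file (HotSpot JVMs only).
void dumpHeap(String path) throws Exception {
    HotSpotDiagnosticMXBean mx = ManagementFactory.newPlatformMXBeanProxy(
            ManagementFactory.getPlatformMBeanServer(),
            "com.sun.management:type=HotSpotDiagnostic",
            HotSpotDiagnosticMXBean.class);
    mx.dumpHeap(path, true); // true = dump only live (reachable) objects
}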
This is where I called the program to read the SpriteSheet, which was in another class.
private SpriteSheet spriteSheet = new SpriteSheet("/sprite_sheet.png");
This is where I tried to read the pixel colors from the sprite sheet, which is colored with black, dark grey, light grey, and white. It is meant to print out color details for the first row of pixels.
public SpriteSheet(String path) {
    BufferedImage image = null;
    try {
        image = ImageIO.read(SpriteSheet.class.getResourceAsStream(path));
    } catch (IOException e) {
        e.printStackTrace();
    }
    if (image == null) {
        return;
    }
    this.path = path;
    this.width = image.getWidth();
    this.height = image.getHeight();
    pixels = image.getRGB(0, 0, width, height, null, 0, width);
    for (int i = 0; i < pixels.length; i++) {
        // keep only the low (blue) byte and quantize 0..255 down to 0..3
        pixels[i] = (pixels[i] & 0xff) / 64;
    }
    for (int i = 0; i < 8; i++) {
        System.out.println(pixels[i]);
    }
}
When I run this, it does not print the numbers as I coded. How can I fix this? And how does the reading of colors work?
If you aren't getting any output, then either your println lines aren't being reached, or (far less likely) something is wrong with your console configuration that hides output. We will assume, hopefully correctly, that all methods called here always either return or throw (i.e. nothing hangs). Presuming the former, the only opportunity I see for that to happen is if the image fails to load and you return early.
Since you print the stack trace of any exceptions but do not report seeing a stack trace, that means that ImageIO.read() must be returning null. For that to happen, according to the documentation:
The InputStream is wrapped in an ImageInputStream. If no registered ImageReader claims to be able to read the resulting stream, null is returned.
I do know that PNG is supported generally, but perhaps your specific flavor of PNG is not (e.g. a compressed, 4-color palette or something; I don't know what the limitations of ImageIO are here). I suggest you try saving the image in a different format, or testing with a more baseline PNG format (e.g. uncompressed 24-bit, no transparency, no interlacing).
If my analysis here is correct, the moral of this story is: Improve your error handling. Don't silently fail; especially when performing tasks that are critical to the correct functioning of your program.
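As a minimal sketch of that fail-fast advice, applied to the constructor above (the choice of IllegalStateException is an assumption of this sketch):

BufferedImage image;
try {
    image = ImageIO.read(SpriteSheet.class.getResourceAsStream(path));
} catch (IOException e) {
    throw new IllegalStateException("Could not read sprite sheet: " + path, e);
}
if (image == null) {
    // ImageIO.read() returns null when no registered reader claims the stream.
    throw new IllegalStateException("No ImageReader could decode: " + path);
}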
If I am not correct, I suggest you either drop some debugging printouts in your method or step through in a debugger to see why those println lines aren't getting executed.
I'm receiving a Bitmap as a byte array through a socket; I read it and then want to set it (os.toByteArray()) on an ImageView in my application. The code I use is:
try {
    //bmp = BitmapFactory.decodeByteArray(result, 0, result.length);
    bitmap_tmp = Bitmap.createBitmap(540, 719, Bitmap.Config.ARGB_8888);
    ByteBuffer buffer = ByteBuffer.wrap(os.toByteArray());
    bitmap_tmp.copyPixelsFromBuffer(buffer);
    Log.d("Server", result + " Length:" + result.length);
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            imageView.setImageBitmap(bitmap_tmp);
        }
    });
    return bmp;
} finally {
}
When I run my application and start receiving the byte[], I expect the ImageView to change, but it doesn't.
LogCat says:
java.lang.RuntimeException: Buffer not large enough for pixels at
android.graphics.Bitmap.copyPixelsFromBuffer
I searched in similar questions but couldn't find a solution to my problem.
Take a look at the source (version 2.3.4_r1, the last time Bitmap was updated on Grepcode prior to 4.4) for Bitmap::copyPixelsFromBuffer().
The wording of the error is a bit unclear, but the code clarifies: it means that your buffer is calculated as not having enough data to fill the pixels of your bitmap.
This is (possibly) because they use the buffer's remaining() method to figure out the capacity of the buffer, which takes into account the current value of its position attribute. If you call rewind() on your buffer before you invoke copyPixelsFromBuffer(), you might see the runtime exception disappear. I say 'might' because the ByteBuffer::wrap() method should set the position attribute to zero, removing the need to call rewind(); but judging by similar questions and my own experience, resetting the position explicitly may do the trick.
Try
ByteBuffer buffer = ByteBuffer.wrap(os.toByteArray());
buffer.rewind();
bitmap_tmp.copyPixelsFromBuffer(buffer);
The buffer size should be exactly 1,553,040 bytes (540 × 719 pixels × 4 bytes per ARGB_8888 pixel).
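If the socket can deliver partial frames, a defensive sketch (using the same bitmap_tmp and os as in the question) is to check the buffer against the bitmap's own byte count before copying:

ByteBuffer buffer = ByteBuffer.wrap(os.toByteArray());
// Bytes the bitmap expects: rowBytes * height (1,553,040 for 540x719 ARGB_8888).
int required = bitmap_tmp.getRowBytes() * bitmap_tmp.getHeight();
if (buffer.remaining() < required) {
    Log.w("Server", "Incomplete frame: " + buffer.remaining() + " of " + required + " bytes");
} else {
    bitmap_tmp.copyPixelsFromBuffer(buffer);
}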