Before you all say 'that's already on here somewhere...' :-)
PLEASE let me say I have looked and not found a simple example of using JAI to tile multiple JPGs and save them to disk without Java heap errors or other memory problems.
I can't find a complete working set of code anywhere - the examples all seem to be incomplete, untested, or simply don't work...
Help me Some-BiWan Kenobi - you're my only hope!
How big are the images? You might be able to just increase the memory allocated to the JVM. If they are huge, then JPG might not be the right format, because you need to load the whole image into memory to compress it and write it out. You might have better luck writing a tiled TIFF, using JAI.
I asked a similar question here: Write swing component to large TIFF image using JAI
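If you go the tiled TIFF route, here is a minimal sketch of writing one through ImageIO. It assumes the JAI Image I/O Tools TIFF plugin is on the classpath (or Java 9+, which ships a TIFF writer); the 512x512 tile size is just a placeholder. Note that this still holds the source BufferedImage in memory; the gain is that the output file is tiled, so it can later be read back piecewise.

import java.awt.image.BufferedImage;
import java.io.File;
import java.util.Iterator;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;

public class TiledTiffWriter {
    public static void writeTiled(BufferedImage image, File target) throws Exception {
        // Look up a TIFF writer (JAI Image I/O Tools, or built into Java 9+)
        Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("TIFF");
        if (!writers.hasNext()) {
            throw new IllegalStateException("No TIFF writer found - is the plugin on the classpath?");
        }
        ImageWriter writer = writers.next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        // Ask for a tiled layout so readers can later decode individual tiles
        param.setTilingMode(ImageWriteParam.MODE_EXPLICIT);
        param.setTiling(512, 512, 0, 0); // tile width, tile height, x/y grid offsets
        try (ImageOutputStream out = ImageIO.createImageOutputStream(target)) {
            writer.setOutput(out);
            writer.write(null, new IIOImage(image, null, null), param);
        } finally {
            writer.dispose();
        }
    }
}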
How reliable is ZXing's barcode localization for DataMatrix decoding compared to libdmtx?
I have a set of PNG image files of stickers (proprietary, so unfortunately I'm not able to share them) containing DataMatrix barcodes. These stickers sit on flat surfaces, have very nice quiet zones and are generally centered in the image, but suffer from unequal lighting conditions and slight dust, likely the largest obstacles to reliable decoding.
I'd like to use a modifiable Java library to decode them, and it seems that ZXing is the only open-source option (open to other suggestions). However, upon running these images through the ZXing online decoder, I consistently get NO BARCODE FOUND, even on the cleanest images. In contrast, when I run the same images through proprietary online decoders, like Inlite's Free Online Barcode Reader, I get reliable decodes for all the images. My company has implemented a library in C that also reliably decodes the barcode images by processing them and calling libdmtx. Similarly, this online DataMatrix decoder built on libdmtx can also reliably read my image files.
Is the barcode localization in ZXing significantly inferior to libdmtx?
If I attempt the same preprocessing on the image files before I run them through ZXing, could I achieve similar results? I have a strong preference for a Java library (ZXing), but I may have no choice but to use libdmtx. Would appreciate any insight, thanks!
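For context, the programmatic equivalent of the online decoder would be something along these lines (a minimal sketch, assuming ZXing's core and javase artifacts are available; the TRY_HARDER hint and restricting the format to DATA_MATRIX are the obvious first knobs to turn):

import java.awt.image.BufferedImage;
import java.io.File;
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;
import javax.imageio.ImageIO;
import com.google.zxing.BarcodeFormat;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.DecodeHintType;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.NotFoundException;
import com.google.zxing.Result;
import com.google.zxing.client.j2se.BufferedImageLuminanceSource;
import com.google.zxing.common.HybridBinarizer;

public class DataMatrixDecode {
    public static String decode(File pngFile) throws Exception {
        BufferedImage image = ImageIO.read(pngFile);
        // Wrap the image the way ZXing expects it
        BinaryBitmap bitmap = new BinaryBitmap(
                new HybridBinarizer(new BufferedImageLuminanceSource(image)));
        // Hints: spend more time searching, and only look for DataMatrix
        Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
        hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);
        hints.put(DecodeHintType.POSSIBLE_FORMATS, EnumSet.of(BarcodeFormat.DATA_MATRIX));
        try {
            Result result = new MultiFormatReader().decode(bitmap, hints);
            return result.getText();
        } catch (NotFoundException e) {
            return null; // the programmatic version of "NO BARCODE FOUND"
        }
    }
}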
I had a similar problem to yours, but on the encoding side. From my findings, ZXing is certainly inferior to libdmtx. We are using both libraries in-house, in C++ and Java projects.
There is a case where ZXing breaks while generating a barcode; take a look at my comments here:
https://github.com/zxing/zxing/issues/624
libdmtx, however, works flawlessly. The other free options you have in the Java world (they are for encoding) are listed below, with a rough barcode4j sketch after the list:
barcode4j
OkapiBarcode
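As an illustration of the barcode4j option, a rough encoding sketch might look like this (class and constructor names are taken from the barcode4j bitmap-output API as I recall it, so verify them against the project's documentation; the file name, DPI and message are placeholders):

import java.awt.image.BufferedImage;
import java.io.FileOutputStream;
import java.io.OutputStream;
import org.krysalis.barcode4j.impl.datamatrix.DataMatrixBean;
import org.krysalis.barcode4j.output.bitmap.BitmapCanvasProvider;

public class DataMatrixEncode {
    public static void main(String[] args) throws Exception {
        DataMatrixBean bean = new DataMatrixBean();
        int dpi = 300;
        try (OutputStream out = new FileOutputStream("datamatrix.png")) {
            // Render the symbol as a monochrome PNG at the given resolution
            BitmapCanvasProvider canvas = new BitmapCanvasProvider(
                    out, "image/x-png", dpi, BufferedImage.TYPE_BYTE_BINARY, false, 0);
            bean.generateBarcode(canvas, "HELLO-12345");
            canvas.finish();
        }
    }
}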
Another alternative is the relatively new ZXing cpp port here: https://github.com/nu-book/zxing-cpp.
It contains a completely new DataMatrix detector that was meant to fix serious limitations of the upstream Java version. It was specifically designed to deal with low-resolution images (module sizes as low as around 2 pixels) and symbols that have just the required 1-module quiet zone and a busy background.
The following comparison is certainly not 'fair', but I just had the dmtxread utility of libdmtx try my test set of images: it missed 3 of 17 samples and took a whopping 300 times as long as my code :).
I am currently working on a website which involves a lot of images. The problem is that all the images are uploaded by the user, so I can't do anything to alter them. The website runs quite OK on my local system, but the speed drops too much on the server; it becomes too slow.
I'd suggest using TimThumb. It creates thumbnails on the fly via a URL and uses very minimal disk space.
If the users of your website are uploading the images, then I presume there must be an upload script. Inside that script, or directly after its execution, you could compress or rescale the image to the size needed on the website, shortening loading time. There is a PHP extension for the ImageMagick image processing library (Imagick) here:
http://php.net/manual/en/book.imagick.php
There is the PHP GD image processing library here:
http://php.net/manual/en/book.image.php
I don't have much personal experience with them, but from what I know it looks like either one will do the job. Off the top of my head, that's the best solution I can think of, and hopefully it works. There is not a lot you can do about your problem if you don't compress/scale the images, and these are probably your best options. Wish you the best.
We run an ecommerce website and have thousands of product images. On checking PageSpeed on Google, it shows me something like this:
I was wondering if there is any built-in feature in Java, or any third-party library, with which we can losslessly compress all the images that we host, so we can save our customers a few KBs.
Searching the internet, I found a few services like PunyPNG and Kraken, which are paid; since we do not have heavy image uploads every month, subscribing to them is not worth it. I would prefer a built-in feature in Java or an open-source third-party library.
I came across JAI, but I am not sure whether it addresses this problem or not. Anyone with hands-on experience with this?
P.S. We are using Java 8
Have you looked at classes in the javax.imageio package (http://docs.oracle.com/javase/7/docs/api/javax/imageio/package-summary.html) ?
You can do decoding and re-encoding of the images. The class ImageWriteParam (http://docs.oracle.com/javase/7/docs/api/javax/imageio/ImageWriteParam.html) lets you customize the compression settings.
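To make that concrete, a minimal re-encoding sketch with ImageWriteParam could look like this (the quality value and file names are placeholder assumptions; note that JPEG recompression is lossy, so for strictly lossless savings on PNGs you would need a dedicated optimizer instead):

import java.awt.image.BufferedImage;
import java.io.File;
import java.util.Iterator;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;

public class Recompress {
    public static void recompressJpeg(File source, File target, float quality) throws Exception {
        BufferedImage image = ImageIO.read(source);
        Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpg");
        ImageWriter writer = writers.next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        // Choose the compression level explicitly instead of the writer's default
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality); // 0.0f = smallest, 1.0f = best quality
        try (ImageOutputStream out = ImageIO.createImageOutputStream(target)) {
            writer.setOutput(out);
            writer.write(null, new IIOImage(image, null, null), param);
        } finally {
            writer.dispose();
        }
    }
}

For example, recompressJpeg(new File("product.jpg"), new File("product-small.jpg"), 0.85f) would usually shave a noticeable amount off a camera-quality JPEG.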
~600 KB JPEG images are quite large for screen use, though not for print. Having several images on a page means you should make smaller "thumb" views, and provide the higher resolution (say, the 600 KB JPEG) on an individual product page.
Standard ImageIO suffices for the conversion; see #NicolaFerraro.
Faster page loading can be achieved on the overview page with multiple images by storing the smaller views in one large image. PNG might then be appropriate to prevent JPEG artifacts.
To provide a higher resolution for print, one can use the CSS media setting.
Check out Thumbnailator. It is great at making smaller images from larger ones.
Besides, you should consider when you are going to make these smaller images: at each call, at the first call (and then keeping a cache), ...
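A minimal Thumbnailator sketch, assuming the net.coobird:thumbnailator dependency and placeholder file names and sizes:

import java.io.File;
import net.coobird.thumbnailator.Thumbnails;

public class MakeThumb {
    public static void main(String[] args) throws Exception {
        // Scale down to fit within 400x400 while keeping the aspect ratio,
        // and re-encode as JPEG at 80% quality
        Thumbnails.of(new File("product.png"))
                  .size(400, 400)
                  .outputFormat("jpg")
                  .outputQuality(0.8f)
                  .toFile(new File("product-thumb.jpg"));
    }
}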
I was wondering if it is possible to load a huge JPEG (see my previous question there) and save it / use it in a quickly-loadable format, without loading its fully decoded contents into memory.
The main idea is to have a file format (or a function I'm not aware of that would do this natively in Java) that allows a given area of the image to be loaded dynamically, without loading 800 MB of data.
I'm currently using a tiling algorithm to render the image (thanks to mKorbel and Gerard Le Blanc), but it needs to load the full image first, and this "preloading" step remains quite long.
I can't figure out how to do that (and, first of all, is it possible?), since Google did not help me a lot (maybe my English search terms were off?).
I found what I needed there:
very large image manipulation and tiling
The Java JAI library has a lot of features to handle this kind of problem, and it is supported by Oracle, which means it is (theoretically) stable and sustainable.
Thanks to #BryanD!
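For completeness: plain ImageIO can also decode just a region of a file via ImageReadParam.setSourceRegion. How much work the decoder still does under the hood depends on the format (a tiled TIFF behaves much better here than a baseline JPEG). A minimal sketch with placeholder coordinates:

import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class RegionReader {
    public static BufferedImage readRegion(File file, Rectangle region) throws Exception {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) {
                throw new IllegalStateException("No reader for " + file);
            }
            ImageReader reader = readers.next();
            try {
                reader.setInput(in, true, true);
                ImageReadParam param = reader.getDefaultReadParam();
                // Only this rectangle is returned; for tiled formats only the
                // overlapping tiles need to be decoded
                param.setSourceRegion(region);
                return reader.read(0, param);
            } finally {
                reader.dispose();
            }
        }
    }
}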
I am writing a web application where I need to send an image from a servlet to the client. The image is generated dynamically and is quite big (±2 MB). It might be JPEG, PNG, or GIF.
Now, I am using ImageIO.write() to write the image to the output stream, but it's veeeery slow. It takes up to 6 seconds until the client sees the image. I need to speed it up.
Any suggestions?
BTW, I am aware of the "Looking for a faster alternative to ImageIO" topic, but it didn't help me.
Since it's slow with PNG, ImageMagick is not a solution, and I have tested JAI and it was even worse.
Thanks in advance
Edit:
To show you some code:
BufferedImage bi = [code to generate Image];
response.setContentType(mime);
ServletOutputStream out = response.getOutputStream();
ImageIO.write(bi, "png", out);
I stripped down Exception handling for readability.
Image encoding in Java is pretty slow in general, but you may also want to ensure you have the native libraries installed, as they make quite a noticeable difference in performance.
http://download.java.net/media/jai-imageio/builds/release/1.1/INSTALL-jai_imageio.html
Be aware that ImageIO by default uses temporary files as cache when creating ImageInputStreams and ImageOutputStreams. This can be switched off by calling ImageIO.setUseCache(false).
For a more detailed explanation see this answer.
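For reference, disabling the disk cache is a one-time, JVM-wide switch; a minimal sketch of the write path with it turned off:

import java.awt.image.BufferedImage;
import java.io.OutputStream;
import javax.imageio.ImageIO;

public class ImageStreaming {
    static {
        // Keep ImageOutputStream buffers in memory instead of temporary files on disk
        ImageIO.setUseCache(false);
    }

    public static void writePng(BufferedImage image, OutputStream out) throws Exception {
        ImageIO.write(image, "png", out);
    }
}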
Are you sure that ImageIO takes so long? Maybe there is another problem, e.g.:
slow (network) connection to the client
the generation (calculation) of the image takes a lot of time