Speed optimization of Website with a lot of images - java

I am currently working on a website which involves a lot of images. The problem is that all the images are uploaded by users, so I can't do anything to alter them beforehand. The website runs quite OK on my local system, but on the server the speed drops too much and it becomes very slow.

I'd suggest you use TimThumb. It generates thumbnails on the fly via a URL and uses very little disk space.

If the users of your website are uploading the images, then presumably there is an upload script. Inside that script, or directly after it runs, you could compress or rescale each image to the size actually needed on the website, shortening loading time. There is a PHP extension for the ImageMagick image processing library, called Imagick, here:
http://php.net/manual/en/book.imagick.php
There is the PHP GD image processing library here:
http://php.net/manual/en/book.image.php
I don't have much personal experience with either, but from what I know, one of them will do the job. Off the top of my head that's the best solution I can think of, and hopefully it works. There isn't much you can change about your problem if you don't compress or scale the images, so these are probably your best options. Wish you the best.
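Since the question is tagged java, here is a minimal server-side sketch of the same idea (resize at upload time) using only the standard javax.imageio and java.awt APIs; the file names and the 1024-pixel target size are assumptions for illustration:

import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class UploadResizer {

    // Scale an uploaded image so its longest side is at most maxSide pixels.
    public static void resize(File source, File target, int maxSide) throws IOException {
        BufferedImage original = ImageIO.read(source);
        double scale = Math.min(1.0,
                (double) maxSide / Math.max(original.getWidth(), original.getHeight()));
        int w = (int) Math.round(original.getWidth() * scale);
        int h = (int) Math.round(original.getHeight() * scale);

        BufferedImage scaled = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = scaled.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(original, 0, 0, w, h, null);
        g.dispose();

        ImageIO.write(scaled, "jpg", target); // re-encode as JPEG for the web page
    }

    public static void main(String[] args) throws IOException {
        // hypothetical paths; call this from your upload handler instead
        resize(new File("upload.jpg"), new File("upload_small.jpg"), 1024);
    }
}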

Related

Optimize images - Losslessly compress images in Java

We run an ecommerce website and have thousands of product images. When I check the page with Google PageSpeed, it shows me something like this:
I was wondering if there is any built-in feature in Java, or any third-party library, with which we can losslessly compress all the images we host, so we can save our customers a few KBs.
Searching the internet I found a few services like PunyPNG and Kraken, but they are paid; since we do not have a heavy image upload volume every month, subscribing to them is not worth it. I would prefer a built-in Java feature or an open-source third-party library.
I came across JAI, but I am not sure whether it addresses this problem. Does anyone have hands-on experience with this?
P.S. We are using Java 8
Have you looked at the classes in the javax.imageio package (http://docs.oracle.com/javase/7/docs/api/javax/imageio/package-summary.html)?
You can do decoding and re-encoding of the images. The class ImageWriteParam (http://docs.oracle.com/javase/7/docs/api/javax/imageio/ImageWriteParam.html) lets you customize the compression settings.
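As a rough illustration of that API (the quality value and file names are placeholders), re-encoding a JPEG with an explicit compression setting could look like the sketch below. Note that lowering the JPEG quality is lossy, so this trades some fidelity for size rather than being a strictly lossless optimization:

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;

public class Recompress {

    // Re-encode a JPEG at the given quality (0.0f = max compression, 1.0f = best quality).
    public static void recompressJpeg(File in, File out, float quality) throws IOException {
        BufferedImage image = ImageIO.read(in);

        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);

        try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
            writer.setOutput(ios);
            writer.write(null, new IIOImage(image, null, null), param);
        } finally {
            writer.dispose();
        }
    }
}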
JPEG images of ~600 KB are quite large for screen use, though not for print. Having several images on a page would mean generating smaller "thumb" views for the overview, and providing the individual product page with a higher-resolution image, say the 600 KB JPEG.
Standard ImageIO suffices for the conversion; see @NicolaFerraro's answer.
Faster page loading can be achieved on the overview page with multiple images by storing the smaller views in one large image (a sprite). PNG might then be appropriate to prevent JPEG artifacts.
To provide a higher resolution for print, one can use a CSS media query.
Check out Thumbnailator. It is great at making smaller images from larger ones.
Besides, you should consider when you are going to create these smaller images: on each request, on the first request (then keeping a cache), and so on.
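A minimal sketch of how Thumbnailator is typically used via its fluent Thumbnails builder (file names, size, and quality here are made-up values for illustration):

import java.io.File;
import java.io.IOException;
import net.coobird.thumbnailator.Thumbnails;

public class ThumbDemo {
    public static void main(String[] args) throws IOException {
        // Produce a thumbnail no larger than 200x200, keeping the aspect ratio.
        Thumbnails.of(new File("product.jpg"))
                  .size(200, 200)
                  .outputQuality(0.8)          // JPEG quality for the thumbnail
                  .toFile(new File("product_thumb.jpg"));
    }
}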

Adding Many Saved Images in Android App

The application I'm trying to build will have a lot of images displayed (in ImageViews), and I'm not fetching them from a server/online service as it will need to be used offline. I know I can just dump them in the res/drawable directories, but I was wondering if there's any way to optimize this. Is there a way to somehow compress these images (besides making them smaller, they're already as small as I need) or use some other sort of android tool to better store them locally on the device?
I could just be overlooking a well-known feature; if so, it'd be great if someone could point me to it.
Edit: If I were to compress the images somehow, I would need to decompress them at runtime, which would cost another thread/loading time. I'm not sure how to do that either; I'm just brainstorming various ways, and I thought someone here would have come across this at some point.
If you haven't already, this is a good read - http://developer.android.com/guide/practices/ui_guidelines/icon_design.html#design-tips
When saving image assets, remove unnecessary metadata
Although the Android SDK tools will automatically compress PNGs when packaging application resources into the application binary, a good practice is to remove unnecessary headers and metadata from your PNG assets. Tools such as OptiPNG or Pngcrush can ensure that this metadata is removed and that your image asset file sizes are optimized.
Outside of all other compression logic, the above would be the place to start. Also, when you say "optimize", do you mean optimizing the way images/drawables are loaded in your app, or just the amount of space (on disk) the app will consume?
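If it is the former (how the drawables are loaded at runtime rather than their on-disk size), a common sketch using the standard BitmapFactory options is to decode a downsampled bitmap instead of the full-resolution one; the class and method names here are hypothetical, only the BitmapFactory calls are from the framework:

import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public final class BitmapLoader {

    // Decode a drawable resource at (roughly) the requested size instead of full resolution.
    public static Bitmap decodeSampled(Resources res, int resId, int reqWidth, int reqHeight) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;          // first pass: read dimensions only
        BitmapFactory.decodeResource(res, resId, options);

        int sample = 1;
        while (options.outWidth / (sample * 2) >= reqWidth
                && options.outHeight / (sample * 2) >= reqHeight) {
            sample *= 2;                            // power-of-two subsampling
        }

        options.inJustDecodeBounds = false;
        options.inSampleSize = sample;              // second pass: decode the downsampled bitmap
        return BitmapFactory.decodeResource(res, resId, options);
    }
}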

Securing images used in java applets

I am using a Java applet in my web project, and some images are shown in this applet.
I want to prevent users from pulling images off the web page. What techniques could a client user employ to save the images used in a Java applet running in a browser session, other than "print screen"?
Any ideas are welcome.
Thanks.
..What are possible techniques for a client user to save images used in a java applet
I don't know about typical end-users, but I'd do these things to circumvent several security strategies:
1. Hide them in the archives
Look at the source of the page
Discover the location of the Jars
Download each one by direct fetch
Rename them to .zip and expand them (quick & dirty), then..
Sell your images for causing me that much inconvenience to get them in the form I want.
That last part was sarcasm (mostly), but just wanted to make the point that if you put something in a Jar, people can get it out.
2. Hide them on the server
Use a packet sniffer to discover the image locations by URL.
Pull each image directly (etc.)
3. Encrypt the images
You might use techniques to encrypt the images, then obfuscate the code that decrypts them, but that would also fail against a determined hacker.
4. Screen grab
As mentioned already. 'Last resort' - crude, but effective.
..Any ideas are welcome.
Don't pursue such strategies. You won't achieve any form of security worth having; it will just irritate the user.
If the image is a bitmap drawn inside a JFrame, it will be hard to capture without resorting to a screen grab. Just as with Flash, the image file itself won't actually exist anywhere on the client.

Storing lots of small files: archive vs. filesystem

I am creating an application that requires a lot of image thumbnails (~3000, 5-25KB). Because speed is essential I plan on loading these images into memory when the application starts. At runtime, new thumbnails will be downloaded and added to the collective.
I could store them all in a folder, but reading thousands of files into memory when a program starts hardly seems efficient.
My second option would be to save them in some kind of (compressed) archive. This would make storage itself and loading more efficient (I think). However, new files will be added regularly, and that will probably not go as smoothly as just saving them in a folder.
Is storing a cache of small files in a (compressed) archive a bad idea or not? Are ZIP files the way to go? Would I be better off using uncompressed archives (and if so, what kind)?
All image files will be JPEG's.
Thanks in advance!
EDIT: I am considering dropping the "load everything into memory on application start" idea. This would simplify my question a little. My initial idea of putting everything in one big file now seems less beneficial, since the problem of many files in one directory can be solved by hashing into subdirectories.
Small files don't compress especially well, so you may not gain much compression.
While loading the files will be fast because they are smaller, decompression adds time. You'd have to experiment to see which is faster.
I would think the real issues would relate to the efficiency of the file system when it comes to iterating over all the little files, especially if they are all in one folder. Windows is notorious for being pretty inefficient when folders contain lots of files.
I would consider doing something like writing them out into one file, uncompressed, that could be streamed into memory -- maybe not necessarily contiguous memory, as that might be a problem. But the idea would be to put them all in one file. Then write some kind of index that ties a file name or other identifier to an offset from which the location of the image in memory could be determined.
New images could be added at the end, and the index updated appropriately.
It isn't fancy, but fancy is what you're trying to avoid: an archive or even a file system gives you lots of power and flexibility, but at the cost of efficiency. When you know what you want to do, sometimes simple is better.
I would consider implementing a solution that reads files from a folder, another that divides the files into subfolders and subsubfolders so there are no more than 100 or so files in any given folder, then time those solutions so you have something to compare to. I would think a simple indexed file would be fast enough that you wouldn't even need to pre-load the images like you're suggesting -- just retrieve them as you need them and keep them around once they're in memory.
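To make the "one big file plus index" idea concrete, here is a rough, hypothetical Java sketch (no real library behind it): it appends each JPEG to a single data file and records name, offset, and length in a simple in-memory index, which could also be persisted alongside the data file.

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.HashMap;
import java.util.Map;

public class PackedThumbnails {

    private static final class Entry {
        final long offset;
        final int length;
        Entry(long offset, int length) { this.offset = offset; this.length = length; }
    }

    private final RandomAccessFile data;
    private final Map<String, Entry> index = new HashMap<>();

    public PackedThumbnails(String dataFile) throws IOException {
        this.data = new RandomAccessFile(dataFile, "rw");
    }

    // Append a thumbnail to the end of the pack file and remember where it lives.
    public synchronized void add(String name, byte[] jpegBytes) throws IOException {
        long offset = data.length();
        data.seek(offset);
        data.write(jpegBytes);
        index.put(name, new Entry(offset, jpegBytes.length));
    }

    // Read a thumbnail back by name using the recorded offset and length.
    public synchronized byte[] get(String name) throws IOException {
        Entry e = index.get(name);
        if (e == null) return null;
        byte[] buf = new byte[e.length];
        data.seek(e.offset);
        data.readFully(buf);
        return buf;
    }
}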
All disk-based storage, and most databases, allocate space in chunks. The chunks on large-capacity disks can be large. If you have 5 KB files and a 32 KB disk chunk, you end up with about 85% wasted space on your storage.
Using an archive won't compress JPEGs much, because the JPEG encoding algorithm already does that. It will, however, save you the wasted space on the storage medium. It does make things more complicated and perhaps a little slower.
In my opinion, the ZIP-file approach is a bad idea, because you will slow everything down with the process of loading the ZIP file and unzipping it to extract each image.
The purpose of a thumbnail is that it is small by nature, so your app and hardware can load it as fast as possible. So I believe it is a better idea to load each image as you need it.
Well, if you have small, "geometric" pictures, you may implement them as objects of type javax.swing.Icon rather than as images loaded from the filesystem.
http://download.oracle.com/javase/6/docs/api/javax/swing/Icon.html
http://download.oracle.com/javase/tutorial/uiswing/components/icon.html
So you will implement one or more objects which draw themselves onto a Graphics surface using the Graphics drawing primitives, instead of copying pixels.
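As a small illustration of that idea (the shape, size, and colors are purely hypothetical), an Icon that draws itself with Graphics primitives could look like this:

import java.awt.Color;
import java.awt.Component;
import java.awt.Graphics;
import javax.swing.Icon;
import javax.swing.JLabel;
import javax.swing.JOptionPane;

// An icon that paints itself with drawing primitives instead of loading a bitmap.
class DotIcon implements Icon {
    private final int size;
    private final Color color;

    DotIcon(int size, Color color) {
        this.size = size;
        this.color = color;
    }

    @Override
    public void paintIcon(Component c, Graphics g, int x, int y) {
        g.setColor(color);
        g.fillOval(x, y, size, size);
        g.setColor(Color.DARK_GRAY);
        g.drawOval(x, y, size, size);
    }

    @Override
    public int getIconWidth() { return size; }

    @Override
    public int getIconHeight() { return size; }
}

public class IconDemo {
    public static void main(String[] args) {
        JOptionPane.showMessageDialog(null,
                new JLabel("Vector icon", new DotIcon(24, Color.ORANGE), JLabel.LEFT));
    }
}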
If this is a web application, then the best performance boost you can get is setting good HTTP caching headers. Having a unique URL for every image (and different URLs for different versions of the same image) makes it possible to set VERY far-future expiry headers, because changing the image changes the URL, leading to a refetch.
I wouldn't compress, because JPEG cannot be compressed much further and it only costs CPU time.
I would recommend simply storing the images in the filesystem, and considering a library like jawr or implementing your own caching strategy.
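A rough sketch of that far-future caching idea as a Servlet filter (the one-year lifetime and the /img/* URL pattern are assumptions; versioned URLs make this safe because a changed image gets a new URL):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletResponse;

// Map this filter to the versioned image URLs, e.g. /img/v123/product.jpg.
@WebFilter("/img/*")
public class LongCacheFilter implements Filter {

    private static final long ONE_YEAR_SECONDS = 365L * 24 * 60 * 60;

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse response = (HttpServletResponse) res;
        // Far-future expiry: the browser will not refetch until the URL changes.
        response.setHeader("Cache-Control", "public, max-age=" + ONE_YEAR_SECONDS);
        response.setDateHeader("Expires", System.currentTimeMillis() + ONE_YEAR_SECONDS * 1000);
        chain.doFilter(req, res);
    }

    @Override public void init(FilterConfig config) {}
    @Override public void destroy() {}
}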
I know this question has already been answered, but I think you need more options than just zipping.
While ZIP is good, it doesn't really help much for JPEG, since JPEG is already compressed.
Other things you may want to consider:
Put the images on a Content Delivery Network (CDN).
Compress components with gzip (meaning the server will automatically compress every response); you don't need to write any code to unzip it later, as it's handled by the browser automatically.
Since you mention JPEG, you may want to use jpegtran. Run jpegtran on all your JPEGs.
This tool does lossless JPEG operations such as rotation and can also be used to optimize and remove comments and other useless information (such as EXIF information) from your images.
jpegtran -copy none -optimize -perfect src.jpg dest.jpg
Use image sprites. Instead of asking the browser to download many images at the same time, ask it to download only one.
For the details read : http://developer.yahoo.com/performance/rules.html#opt_images
For a basic examination of how to improve your website's performance, you can try installing YSlow (a plugin that detects inefficient code) in Firefox.
Hope that helps.

What's a good client-side, resize-before-upload, flash/java uploader?

Does anybody know a good client-side, resize-before-upload, Flash/Java uploader?
We badly need this on our project, since we have a lot of people uploading photos, most of which are bigger than 3 MB. We want to be able to resize these photos before they get sent to us.
I've googled for client-side java/flash uploaders, but I wasn't impressed with the results.
Has anyone here already purchased a Java/Flash uploader? Which ones are good?
Thanks!
http://www.shift8creative.com/projects/agile-uploader/index.html
I just finished this little project. Resizing before uploading is a very tricky thing to do, and to make matters worse there are a few gotchas, such as wanting to use a submit button outside of Flash (via ExternalInterface) to submit: you could end up in some trouble there. I released my project for free and it should take away all the headaches for you, though it only handles single file uploads. Some other projects like SWFUpload and Plupload aim to provide multiple file uploading, but they are much larger projects; I feel they are too bulky, take too long to set up, and really aren't that seamless, so you'll be jumping through hoops on the backend.
http://www.plupload.com/
Allows you to upload files using HTML5, Gears, Silverlight, Flash, BrowserPlus or normal forms, providing some unique features such as upload progress, image resizing and chunked uploads.