Adding Many Saved Images in Android App - java

The application I'm trying to build will display a lot of images (in ImageViews), and I'm not fetching them from a server/online service because the app needs to work offline. I know I can just dump them in the res/drawable directories, but I was wondering if there's any way to optimize this. Is there a way to somehow compress these images (besides making them smaller; they're already as small as I need them), or some other Android tool to store them locally on the device more efficiently?
I could just be overlooking a well used feature, and if so, it'd be great if someone could point me to that.
Edit: If I were to compress the images somehow, I would need to decompress them at runtime, which would mean another thread and extra loading time. I'm not sure how to do that either, so I'm just brainstorming various approaches, and I thought someone here might have come across this at some point.
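Something like this is what I have in mind - a rough, hypothetical sketch of decoding a gzip-compressed asset off the main thread (the asset name "pic_0.png.gz" and the imageView field are placeholders, and it assumes the images were gzipped at build time):

// Rough sketch, inside an Activity. Assumes imports of
// android.graphics.Bitmap, android.graphics.BitmapFactory,
// java.io.IOException, java.io.InputStream, java.util.zip.GZIPInputStream.
new Thread(new Runnable() {
    public void run() {
        try {
            // Open the hypothetical pre-gzipped asset and decode it.
            InputStream in = new GZIPInputStream(getAssets().open("pic_0.png.gz"));
            final Bitmap bmp = BitmapFactory.decodeStream(in);
            in.close();
            // Hand the decoded bitmap back to the UI thread.
            runOnUiThread(new Runnable() {
                public void run() {
                    imageView.setImageBitmap(bmp);
                }
            });
        } catch (IOException e) {
            // Handle/log the failure appropriately.
        }
    }
}).start();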

If you haven't already, this is a good read - http://developer.android.com/guide/practices/ui_guidelines/icon_design.html#design-tips
When saving image assets, remove unnecessary metadata
Although the Android SDK tools will automatically compress PNGs when packaging application resources into the application binary, a good practice is to remove unnecessary headers and metadata from your PNG assets. Tools such as OptiPNG or Pngcrush can ensure that this metadata is removed and that your image asset file sizes are optimized.
Outside of any other compression logic, the above would be the place to start. Also, when you say "optimize" - do you mean optimize the way images/drawables are loaded in your app, or just the amount of space (on disk) the app will consume?
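For example, a typical pngcrush invocation might look like this (flags vary between versions, so treat it as illustrative rather than gospel):
pngcrush -rem alla -reduce input.png output.png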

Related

Speed optimization of Website with a lot of images

I am currently working on a website which involves a lot of images. The problem is that all the images are uploaded by the user, so I can't do anything to alter them beforehand. The website runs quite okay on my local system, but the speed drops too much on the server; it becomes too slow.
I'd suggest you use TimThumb. It creates thumbnails on the fly via a generated URL and uses very minimal disk space.
If the users of your website are uploading the images, then I presume there must be an upload script. Inside that script, or directly after it runs, you could compress or rescale the image to the size needed on the website, shortening loading time. There is a PHP image-processing library called ImageMagick here:
http://php.net/manual/en/book.imagick.php
There is the PHP GD image processing library here:
http://php.net/manual/en/book.image.php
I don't have much personal experience with either, but from what I know it looks like one of them will do the job. Off the top of my head, that's the best solution I can think of, and hopefully it works. There is not a lot you can change about your problem if you don't compress/scale the images, and these are probably your best options. Wish you the best.
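For what it's worth, here is the same resize-on-upload idea sketched in Java with javax.imageio, purely for illustration (this thread is otherwise Java-centric, and I can't vouch for the exact PHP calls from memory; the file names and target width are made up):

import java.awt.Graphics2D;
import java.awt.Image;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

// Sketch: scale an uploaded image down to a fixed width, preserving
// aspect ratio, before it is stored. Paths are hypothetical.
public class ResizeOnUpload {
    public static void main(String[] args) throws IOException {
        BufferedImage src = ImageIO.read(new File("upload.jpg"));
        int targetW = 800;
        int targetH = src.getHeight() * targetW / src.getWidth();

        Image scaled = src.getScaledInstance(targetW, targetH, Image.SCALE_SMOOTH);
        BufferedImage out = new BufferedImage(targetW, targetH, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(scaled, 0, 0, null);
        g.dispose();

        ImageIO.write(out, "jpg", new File("upload_small.jpg"));
    }
}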

Android GTFS app

I'm trying to work on an app which uses GTFS. This may seem like a stupid question, but I couldn't find any answer to it.
The GTFS feed for Israel, a rather small country without that much bus infrastructure, is a zipped file of around 120 MB.
Right now the only way I can think of to get it working is to download the file, but downloading 120 MB on a phone could take quite a long time. Sure, you can do this only once and save it in a database on the phone, but it still requires downloading 120 MB.
Since it is zipped, I can't unzip it on the server and then just fetch the txt files.
So basically I'm asking: how can I get the information onto the phone without downloading the zipped file?
I've seen and used apps which use that same GTFS file, and they load up really fast, even on the first launch.
I hope you understand my issue; I'm not sure how to explain it better.
Thanks!
P.S. I would make an iPhone app too, and it's the same issue there, hence the iPhone tag.
One approach might be to preprocess the GTFS data during your app development. You could load it into a SQLite database (and use Core Data on iOS) to get the data you need out of it at runtime. This also gives you an opportunity to include only the data that you actually need for your app - it doesn't make sense to ask users to download extra data that they won't need.
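A rough sketch of that preprocessing step on the desktop side, assuming the sqlite-jdbc driver is on the classpath and using GTFS's stops.txt (the column indices are illustrative - real feeds need a proper CSV parser and a look at the header line):

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Build-time sketch: load GTFS stops.txt into a SQLite file you ship
// with the app, instead of parsing 120 MB of text on the device.
public class GtfsToSqlite {
    public static void main(String[] args) throws Exception {
        Connection db = DriverManager.getConnection("jdbc:sqlite:gtfs.db");
        db.createStatement().execute(
            "CREATE TABLE stops (stop_id TEXT PRIMARY KEY, stop_name TEXT, lat REAL, lon REAL)");

        BufferedReader in = new BufferedReader(new FileReader("stops.txt"));
        in.readLine(); // skip the CSV header line
        PreparedStatement ins = db.prepareStatement("INSERT INTO stops VALUES (?,?,?,?)");
        String line;
        while ((line = in.readLine()) != null) {
            String[] f = line.split(",");       // simplified; no quoted-field support
            ins.setString(1, f[0]);
            ins.setString(2, f[1]);
            ins.setDouble(3, Double.parseDouble(f[2]));
            ins.setDouble(4, Double.parseDouble(f[3]));
            ins.executeUpdate();
        }
        in.close();
        db.close();
    }
}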
Use the Protocol Buffers binary format (pbf), formerly Google-internal and now open source. It is compact and very fast to search, so there is no need to decompress it on the device and load it into a database there, because the pbf file itself acts as the database. Just include a pbf library in your code to query it. Of course, you have to build the compressed file once before distributing the data online.

Ways to handle huge XML/JSON files

I'm looking to create an Android application (although for iOS the problem will be the same) which will function pretty much as a webshop.
It will contain a lot of products, which can be accessed any way we want, since that part still has to be built.
The problem is, we created a plain-text file to test the size, and it turns out that even a selection of the products, with no structure (XML, JSON..), is already 300 MB.
Once we add structure, this will logically only add overhead and increase the size further.
Like I said, pretty much anything is possible as far as receiving the data goes.
They can build an API to fetch products one at a time when needed, or one big file to parse in a background process...
However, one of the wishes is to be offline as much as possible. This would normally mean saving all the data in a database on the phone, but if that results in 300 MB on the user's SD card, that's no good.
To sum up what exactly I want to know:
Are there any other ways to handle big data like this, without having to keep a constant internet connection or download 300 MB onto someone's phone?
Some kind of compression, a special way of saving it in the database... any ideas are welcome.

Storing lots of small files: archive vs. filesystem

I am creating an application that requires a lot of image thumbnails (~3000 files, 5-25 KB each). Because speed is essential, I plan on loading these images into memory when the application starts. At runtime, new thumbnails will be downloaded and added to the collection.
I could store them all in a folder, but reading thousands of files into memory when a program starts hardly seems efficient.
My second option would be to save them in some kind of (compressed) archive. This would make storage itself and loading more efficient (I think). However, new files will be added regularly, and that will probably not go as smoothly as just saving them in a folder.
Is storing a cache of small files in a (compressed) archive a bad idea or not? Are ZIP files the way to go? Would I be better off using uncompressed archives (and if so, what kind)?
All image files will be JPEGs.
Thanks in advance!
EDIT: I am considering dropping the "load everything into memory on application start" idea. That would simplify my question a little. My initial idea of putting everything in one big file now seems less beneficial, since the problem of many files in one directory can be solved by hashing into subdirectories.
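For example, a tiny sketch of that hashing scheme (the names are made up):

import java.security.MessageDigest;

// Sketch of the "hash into subdirectories" idea from the edit above:
// derive a two-level directory prefix from a hash of the file name so
// that no single folder accumulates thousands of files.
public class ShardedPath {
    static String shardPath(String fileName) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5").digest(fileName.getBytes("UTF-8"));
        return String.format("%02x/%02x/%s", d[0], d[1], fileName);
        // e.g. "a3/f1/thumb_0042.jpg"
    }

    public static void main(String[] args) throws Exception {
        System.out.println(shardPath("thumb_0042.jpg"));
    }
}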
Small files don't compress especially well, so you may not gain much compression.
While loading the files will be fast because they are smaller, decompression adds time. You'd have to experiment to see which is faster.
I would think the real issues would relate to the efficiency of the file system when it comes to iterating over all the little files, especially if they are all in one folder. Windows is notorious for being pretty inefficient when folders contain lots of files.
I would consider doing something like writing them out into one file, uncompressed, that could be streamed into memory -- maybe not necessarily contiguous memory, as that might be a problem. But the idea would be to put them all in one file. Then write some kind of index that ties a file name or other identifier to an offset from which the location of the image in memory could be determined.
New images could be added at the end, and the index updated appropriately.
It isn't fancy, but fancy is what you're trying to avoid: an archive or even a file system gives you lots of power and flexibility, but at the cost of efficiency. When you know exactly what you want to do, sometimes simple is better.
I would consider implementing one solution that reads files from a single folder, and another that divides the files into subfolders and sub-subfolders so there are no more than 100 or so files in any given folder, then timing both so you have something to compare against. I would think a simple indexed file would be fast enough that you wouldn't even need to pre-load the images as you're suggesting - just retrieve them as you need them and keep them around once they're in memory.
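To make the indexed-file idea concrete, here is a minimal sketch in Java (the class and method names are mine, and persisting the index itself is left out):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.HashMap;
import java.util.Map;

// Sketch of the single-file-plus-index idea described above: append each
// JPEG to one big file and remember (offset, length) per name.
public class PackedThumbnails {
    private final RandomAccessFile pack;
    private final Map<String, long[]> index = new HashMap<String, long[]>();

    public PackedThumbnails(String path) throws IOException {
        pack = new RandomAccessFile(path, "rw");
    }

    // Append a new image at the end and record where it landed.
    public void add(String name, byte[] jpegBytes) throws IOException {
        long offset = pack.length();
        pack.seek(offset);
        pack.write(jpegBytes);
        index.put(name, new long[] { offset, jpegBytes.length });
    }

    // Read an image back by name using the recorded offset and length.
    public byte[] get(String name) throws IOException {
        long[] entry = index.get(name);
        byte[] buf = new byte[(int) entry[1]];
        pack.seek(entry[0]);
        pack.readFully(buf);
        return buf;
    }
}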
All disk-based storage, and most databases, allocate space in chunks. The chunks on large-capacity disks can be large. If you have 5 KB files and a 32 KB disk chunk, you end up with roughly 85% wasted space on your storage.
Using an archive won't compress JPEGs much, because the JPEG encoding algorithm already does that. It will, however, save you the wasted space on the storage media. It does make things more complicated and perhaps a little slower.
In my opinion, the ZIP-file approach is a bad idea, because you will slow everything down with the process of loading the ZIP file and unzipping it to extract each image.
I think the purpose of a thumbnail is that it is small by nature, so your app plus the hardware can load it as fast as possible. So I believe it is a better idea to load each image as you need it.
Well, if you have small, "geometric" pictures, you may implement them as objects of type javax.swing.Icon rather than as images loaded from the filesystem.
http://download.oracle.com/javase/6/docs/api/javax/swing/Icon.html
http://download.oracle.com/javase/tutorial/uiswing/components/icon.html
So you would implement one or more objects that draw themselves onto a Graphics surface using the Graphics drawing primitives, instead of copying pixels.
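A minimal sketch of such an Icon (the shape and names are just for illustration):

import java.awt.Color;
import java.awt.Component;
import java.awt.Graphics;
import javax.swing.Icon;

// Hypothetical example: a small "geometric" thumbnail drawn with Graphics
// primitives instead of pixels read from the filesystem.
class CircleIcon implements Icon {
    private final int size;
    private final Color color;

    CircleIcon(int size, Color color) {
        this.size = size;
        this.color = color;
    }

    public int getIconWidth()  { return size; }
    public int getIconHeight() { return size; }

    public void paintIcon(Component c, Graphics g, int x, int y) {
        g.setColor(color);
        g.fillOval(x, y, size, size); // pure drawing, no file I/O
    }
}
// Usage: new JLabel(new CircleIcon(16, Color.RED))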
If this is a web application, then the best performance boost you can get is setting good HTTP caching headers. Having a unique URL for every image (and different URLs for different versions of the same image) makes it possible to set VERY far-future expiry headers, because changing the image changes the URL, which naturally triggers a refetch.
I wouldn't compress, because JPEG cannot be compressed much further, and it only costs CPU time.
I would recommend simply storing the images in the filesystem and considering the use of libraries like jawr, or implementing your own caching strategy.
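A minimal sketch of those far-future headers as a servlet filter (assuming the Servlet API; mapping it to something like /images/* would live in web.xml):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Hypothetical filter that sets far-future expiry headers on image
// responses; safe because each image version gets a unique URL.
public class ImageCacheFilter implements Filter {
    private static final long ONE_YEAR_MS = 365L * 24 * 60 * 60 * 1000;

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse resp = (HttpServletResponse) res;
        resp.setHeader("Cache-Control", "public, max-age=31536000");
        resp.setDateHeader("Expires", System.currentTimeMillis() + ONE_YEAR_MS);
        chain.doFilter(req, res);
    }

    public void init(FilterConfig config) {}
    public void destroy() {}
}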
I know this question has already been answered, but I think you need more options besides zipping.
While ZIP is good, it won't really have much effect on JPEGs, since JPEG data is already compressed.
Other things you may want to consider:
Put the images on a Content Delivery Network (CDN).
Compress components with gzip (meaning the server will automatically compress every response); you don't need to write any code to decompress it later - it's handled by the browser automatically.
Since you mention JPEG, you may want to use jpegtran. Run jpegtran on all your JPEGs.
This tool does lossless JPEG operations such as rotation and can also be used to optimize and remove comments and other useless information (such as EXIF information) from your images.
jpegtran -copy none -optimize -perfect src.jpg dest.jpg
Use image sprites. Instead of asking the browser to download many images at the same time, ask it to download only one.
For the details read : http://developer.yahoo.com/performance/rules.html#opt_images
For a basic examination of how to improve your website's performance, you can try installing YSlow (a Firefox plugin that detects inefficient code).
Hope that helps.

What's a good client-side, resize-before-upload, flash/java uploader?

Does anybody know a good client-side, resize-before-upload, Flash/Java uploader?
We need this badly on our project, since we have a lot of people uploading photos, and most of their photos are bigger than 3 MB. We want to be able to resize these photos before they get sent to us.
I've googled for client-side Java/Flash uploaders, but I wasn't impressed with the results.
Is there anyone here who has purchased a Java/Flash uploader before? Which ones are good?
Thanks!
http://www.shift8creative.com/projects/agile-uploader/index.html
I just finished this little project. Resizing before uploading is a very tricky thing to do, and to make matters worse there are a few gotchas: for example, if you want to use a submit button outside of Flash (via ExternalInterface) to submit, you can end up in some trouble. I released my project for free, and it should take away all the headaches for you. It only handles single-file uploads, though. Some other projects, like SWFUpload and Plupload, aim to provide multiple-file uploading, but they are much larger projects that I feel are too bulky and take too long to set up, on top of the fact that they really aren't that seamless - you'll be jumping through hoops on the backend.
http://www.plupload.com/
Allows you to upload files using HTML5, Gears, Silverlight, Flash, BrowserPlus or normal forms, providing some unique features such as upload progress, image resizing and chunked uploads.
