WAR optimization - Java

We are using the GWT Eclipse plugin to develop a web application in Java, and Lightstreamer (third-party software) for login and data streaming.
In the production environment it takes around 6 minutes to load the page and about 2 minutes to log in.
Previously the browser folder (WAR) was 70.5 MB. We reduced it by removing some folders such as WEB-INF, js, and pieces, some files from SC (the deleted subfolders in SC are modules, modules-debug, and system), and some files from SmartG, and we also optimized the images in the Image folder.
The Image folder shrank from 40KB to 17KB.
Now the total size of the WAR is 15.6 MB and the application still works, but it still takes a long time to load the page.
Please advise us on how to optimize the WAR folder and suggest ways of optimizing it.

The first thing to do is analyse your client application's HTTP requests to your web server using an HTTP monitoring tool like Firebug:
http://getfirebug.com

Deleting stuff from your WAR has no impact on performance; it's not as if the browser loads all of it. If your application still works as before after deleting all that garbage, it just means those files were unused.
Now, about your actual issue: as others told you, you should find out where the time is spent. Firebug can be a great help here because its Net tab shows how long each request takes. Speed Tracer is also a great tool when using GWT because it shows exactly where the time goes, and it can even display the time spent on the server side.
Also, some tips:
GWT should compile to obfuscated JS, not pretty or detailed
make sure your images are not too big and are loaded only when necessary (image bundle, anyone? see the sketch after this list)
compress your CSS and external JS (YUI Compressor)
have a good caching policy (far-future expiry for the GWT-generated files, since their names are unique per compile)
and lots of others...
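On the image bundle tip above: a minimal sketch of a GWT ClientBundle, assuming hypothetical image files logo.png and login-icon.png sitting next to the interface; GWT sprites or inlines the bundled images so they do not each cost a separate HTTP request:
import com.google.gwt.core.client.GWT;
import com.google.gwt.resources.client.ClientBundle;
import com.google.gwt.resources.client.ImageResource;

// Bundles small images so GWT can sprite or inline them instead of issuing
// one HTTP request per image.
public interface AppImages extends ClientBundle {
    AppImages INSTANCE = GWT.create(AppImages.class);

    @Source("logo.png")        // hypothetical file in the same package as this interface
    ImageResource logo();

    @Source("login-icon.png")  // hypothetical
    ImageResource loginIcon();
}

// Usage in a widget:
// Image logo = new Image(AppImages.INSTANCE.logo());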

Related

Why do images included in a Thymeleaf layout take a long time to be displayed?

I am working on a Java app; it works fine and all, but for the frontend I am using Thymeleaf (first time using it, moving on from PrimeFaces).
What I've noticed is that the page loads pretty fast (working locally), as expected, since the information is not relevant and no DB is being used for now.
What surprises me is that the images load roughly 2 seconds later. I have no idea why; they are stored locally in the app's assets folder.
I am including them this way:
<img th:src="#{/assets/images/myimage.png}" />
Is there any way to make it faster? (Of course, I will later give more memory to my JVM, but this sort of thing shouldn't take that long...)
Any caching or faster ways?
I am using Spring 4 MVC.
Thanks.
There are so many variables to this that I am not going to list all of them, but here are a few:
Distance between Client and Server
Efficiency of your application
Size of images
Browser and all extensions/plugins attached to it
Power of server
Power of client machine
Delivery method
Potential JavaScript that manages the loading of images
Malware
As mentioned, there is a lot more that I haven't listed, and it really is a process of elimination.
Some things we did on our application to avoid the delay in images include:
increasing server memory from 512 to 1024
moving the server to a more local location
changing the location from which the application sources the images (faster RAID disks)
delaying full page display until everything is preloaded on the client machine (look into Flash of Unstyled Content fixes)
performance improvements in the web application itself
What you need to do is explore each option and start improving on it, and eventually you will get the results you need.
FYI, if you want to avoid loading images from your own server, host them on a CDN for speed of transfer.
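On the caching side, a minimal sketch of a Spring 4 MVC resource handler that serves the assets with a long cache period; the classpath location and the 30-day lifetime are assumptions to adapt to where the assets actually live:
import java.util.concurrent.TimeUnit;

import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

// Serves /assets/** straight from the classpath with a long cache lifetime,
// so images such as /assets/images/myimage.png are cached by the browser
// after the first request.
@Configuration
@EnableWebMvc
public class StaticResourceConfig extends WebMvcConfigurerAdapter {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/assets/**")
                .addResourceLocations("classpath:/static/assets/")   // hypothetical location
                .setCachePeriod((int) TimeUnit.DAYS.toSeconds(30));
    }
}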

Running a program on a server

I want to build an Android application that downloads an XML file from a web server and displays its contents in a readable format.
My problem is generating that XML file. Basically I want to run a program, say, every 30 minutes that downloads a web page (as that data is not easily accessible), parses it, generates said XML file and puts it somewhere for the Android application to download.
Now, I was writing a Java application to do this, but it came to me: where am I going to run this? I thought of having a laptop permanently running at home, but there must be a better alternative.
I have online hosting, but it is very simple. It does not even include SSH.
Any ideas?
Edit: as per your suggestions, I checked and yes, my cPanel does have a "Cron Jobs" section. I will now investigate it. Thank you so much for your help.
http://www.setcronjob.com/ allows you to trigger a web page request once every hour, which might be good enough.
I have not actually tried it, but it sounds like a good solution.
You need to rent a server that will generate your XML and also serve the content to your app. It's not expensive if you get a VPS or cloud server.
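If you do rent a VPS (or keep the laptop running), the periodic job itself is small. A minimal sketch, with a hypothetical source URL, output path, and placeholder parsing:
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Runs on the rented server: every 30 minutes it downloads the source page,
// turns it into XML, and writes the file into the web server's document root,
// where the Android app can fetch it over plain HTTP.
public class FeedGenerator {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(FeedGenerator::regenerate, 0, 30, TimeUnit.MINUTES);
    }

    private static void regenerate() {
        try (InputStream in = new URL("http://example.com/source-page.html").openStream()) {
            ByteArrayOutputStream page = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            int read;
            while ((read = in.read(chunk)) != -1) {
                page.write(chunk, 0, read);
            }
            String xml = convertToXml(new String(page.toByteArray(), StandardCharsets.UTF_8));
            Files.write(Paths.get("/var/www/html/feed.xml"),          // hypothetical docroot path
                        xml.getBytes(StandardCharsets.UTF_8));
        } catch (Exception e) {
            e.printStackTrace();                                       // log and wait for the next run
        }
    }

    private static String convertToXml(String html) {
        // Placeholder: parse the page (e.g. with jsoup) and build the XML here.
        return "<items/>";
    }
}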

Challenge in Asynchronous Javascript Loading

Background: I'm working on a GWT based application which is deployed on Google App Engine. My application uses a custom library, which requires 6-7 big core JavaScript files (~3 MB total size) to be loaded up front. So even loading my application's home page takes anywhere from 20-40 seconds depending on the bandwidth. Recently I took on the task of addressing the slow page load.
I made a plain HTML home page, without any GWT or custom library's components. This home page contains a login button and a simple jQuery based image slider. I'm using Google Accounts based authentication. On the home page, I use "asynchronous ordered" Javascript loading technique presented here: http://stevesouders.com/efws/loadscript.php?t=1303979716 to load all the core .js files. So while the user is looking at the home page or interacting with the image slider, the .js files quietly download in the background. After the user logs in, the control is redirected to the Main HTML file, which retrieves the .js files from the cache, and loads the app. This time the loading of the app is significantly faster. So, the home page loads up in 2-5 secs and the main app also loads up pretty fast.
It's exactly what I need. But there's an issue.
ISSUE: If the user clicks on the Login button too fast (i.e. before all the asynchronous .js file downloads are complete), the asynchronous downloads break. In fact, as soon as the user navigates away from the home page (to the Google Accounts sign-in page or any other page), all the asynchronous downloads break. Now when the main app is displayed, it retrieves the partially downloaded .js files from the cache and reports "Unexpected end of line" errors.
I could do one thing to address the problem - disable the login button until the asynchronous downloads complete. This is slightly better than the current situation because it allows the user to at least see the home page and interact with the image slider. But it's not ideal.
Ideally I'm looking for a way to check the hash (or file size) of the .js files in the Main HTML, to see whether they are correct. If there's a mismatch, download the files again. But I'm not a Javascript expert by any means and not even sure whether that's possible. So, I'm looking for help from experts here. This might be a common problem that people have addressed in other ways. So, I'm also open to other alternatives.
Thanks.
UPDATES:
I have multiple (6-7) .js files to be loaded.
The .js files are not independent and need to be loaded in a particular order.
3 MB of JavaScript files is pretty crazy. Do you really need that much? If so, you should look at concatenating / minifying them using jsmin or uglifyjs. This will reduce the size by at least 50%. Next, ensure gzipping is enabled on the server hosting these files. That should reduce the size by another 50%.
Lastly, you should use an Expires header to ensure the file is cached client-side forever. To bust the cache, update a query param on the src:
<script src="compressed.js?v=1"></script>
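If the scripts are served by your own servlet container (on App Engine you would typically configure expiration in appengine-web.xml instead), a minimal sketch of a far-future caching filter; the one-year lifetime and the *.js mapping are assumptions:
import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletResponse;

// Adds a far-future Cache-Control header to the versioned scripts
// (e.g. compressed.js?v=1). Because the ?v= parameter changes on every
// release, clients re-download only when the version is bumped.
@WebFilter("*.js")
public class LongCacheFilter implements Filter {

    private static final long ONE_YEAR_SECONDS = 365L * 24 * 60 * 60;

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse resp = (HttpServletResponse) response;
        resp.setHeader("Cache-Control", "public, max-age=" + ONE_YEAR_SECONDS);
        chain.doFilter(request, response);
    }

    @Override public void init(FilterConfig filterConfig) { }
    @Override public void destroy() { }
}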
As for checking hashes or file sizes, that's not really possible, though you could check for the existence of certain variables your libs introduce into the global namespace. For example:
jQueryLoaded = !!window.jQuery;
In this case, considering we have around 3 MB of JavaScript to be loaded, I would construct a loading page with a progress bar that downloads all the JavaScript and shows the progress before showing the actual site. This does not limit the user experience to any significant degree; rather, it gives you an opportunity to show updates, news (even feeds), and help messages while loading. Also have a timer on the loading page to detect if it takes too long, and give users an option to do something else if it does, for example showing the login page with login disabled until the scripts are downloaded. I have found scripts that do this here and here.
You can actually compress, version, minify and bundle the JS files; this reduces the footprint by more than 80%. Have a look at Yahoo's Coding Horror. You can do all of this using jawr.

Deployment time of "ear" annoying using JBoss, Ant, JSPs and a prehistoric PC

I am developing a web-based Java app running on JBoss and SQL Server.
I seem to find myself spending an inordinate amount of time recompiling/deploying just to tweak the interface in jQuery/JavaScript/CSS/HTML.
Any tips for reducing the turnaround?
It's deployed as an EAR file, so I cannot alter the JSPs/JavaScript after deployment(?). Yes, I have created static versions of the web page frontends, but they do not give me the full functionality - none of the data from DB/JSTL processing.
To clarify, it's not so much the actual compile time itself (30 seconds), as the Ant builds are set up well and are very modular; it's the subsequent deployment to JBoss and accessing the application that cause the real headache.
If you do not work directly in an exploded war inside the hotdeploy folder of JBoss, then strongly consider it.
When developing with an application server I've used this product in the past: JRebel from ZeroTurnaround.
It prevents having to restart and redeploy an application running within an application server. It works for most scenarios; however, I found that there were a few occasions when a server restart was required (in my case, making changes to the application initialisation). But if you're only working on the interface, this product will save you a great number of deployments and restarts.
I have not used JRebel in combination with JBoss, but they mention it as a supported container, so that shouldn't be a problem.
I am an average web designer (at best!) and writing complicated HTML and CSS is a pain for me. A lot of what I do with styles and layout is trial and error and involves a lot of tweaking. I also change my mind frequently about exactly what shade of color I want things. Basically, I'm in the same boat as you.
Long ago I abandoned the tweak-deploy-test iteration cycle (mvn clean tomcat:deploy takes 2 minutes on my current project), as by the 10th iteration of trying to sort out a simple layout problem, waiting for the deployment would drive me round the bend. I now use two strategies:
Get a static copy of the HTML I want to work with. This usually means deploying the app, navigating to the page and saving it to a work directory somewhere. This saves the static HTML as well as any images. Next I copy the CSS files from my workspace into the work directory and hand edit the saved HTML file to point to these CSS files.
Open the static HTML page in Firefox. Now I can tweak the CSS or HTML and simply refresh Firefox to show the changes. Iteration time is now down to about 1 second. I can further improve my tweaking using the Firebug addon. This allows you to manipulate the CSS and HTML from within Firefox. This is especially useful for getting margin and padding size right. Once I've tweaked it in Firebug I hand edit the saved HTML and CSS then refresh Firefox to make sure I'm happy with the result.
At certain key stages I then make the changes to my workspace to reflect my tweaking on the static files. I then redeploy and test to make sure I got it right. As I use Firefox for all my development I have to pay special attention to browser compatibility, especially with IE, but this usually comes at a later stage.
Edit:
I didn't mention Javascript, but this process works great for JS too!

Large File Download

Internet Explorer has a file download limit of 4 GB (2 GB on IE6). Firefox does not have this problem (haven't tested Safari yet).
(More info here: http://support.microsoft.com/kb/298618)
I am working on a site that will allow the user to download very large files (up to and exceeding 100 GB).
What is the best way to do this without using FTP? The end user must be able to download the file from their browser using HTTP. I don't think Flash or Silverlight can save files to the client, so as far as I know they won't cut it.
I'm guessing we will need an ActiveX or Java applet to pull this off. Something like the download manager that MSDN uses.
Does anyone know of a commercial (or free) component that will do that? We do not want the user to have to install a "browser wide" download manager (like GetRight), we want it to only work with downloading on our site.
Update: Here is some additional info to help clarify what I'm trying to do. Most of the files above the 4 GB limit would be large HD video files (it's for a video editing company). These will be downloaded by users across the internet; this isn't going to be people on a local network. We want the files to be available via HTTP (some users are going to be behind firewalls that aren't going to allow FTP, BitTorrent, etc.). There will be a library of files the end user could download, so we aren't talking about a one-time large download. They will be downloading different large files on a semi-regular basis.
So far Vault, which #Edmund-Tay suggested, is the closest solution. The only problem is that it doesn't work for files larger than 4 GB (it instantly fails before starting the download; they are probably using a 32-bit integer somewhere which is overflowed by the content length of the file).
The best solution would be a Java applet or ActiveX component that works like the article #spoulson linked to, since the problem only exists in IE. However, so far I haven't had any luck finding a solution that does anything like that (multipart downloads, resume, etc.).
It looks like we might have to write our own. Another option would be to write a .NET application (maybe ClickOnce) that is associated with an extension or MIME type. Then the user would actually be downloading a small file from the web server that opens in the exe/ClickOnce app, which tells the application what file to download. That is how the MSDN downloader works. The end user would then only have to download/install an EXE once. That would be better than downloading an exe every time they wanted to download a large file.
#levand:
My actual preference, as a user, in these situations is to download a lightweight .exe file that downloads the file for you.
That's a dealbreaker for many, many sites. Users either are or should be extremely reluctant to download .exe files from websites and run them willy-nilly. Even if they're not always that cautious, incautious behaviour is not something we should encourage as responsible developers.
If you're working on something along the lines of a company intranet, a .exe is potentially an okay solution, but for the public web? No way.
#TonyB:
What is the best way to do this without using FTP.
I'm sorry, but I have to ask why the requirement. Your question reads to me along the lines of "what's the best way to cook a steak without any meat or heat source?" FTP was designed for this sort of thing.
BitTorrent?
There have been a few web-based versions already (BitLet, w3btorrent), and Azureus was built using Java, so it's definitely possible.
Edit: #TonyB is it limited to port 80?
Please don't use ActiveX... I am so sick of sites that are only viewable in IE.
My actual preference, as a user, in these situations is to download a lightweight .exe file that downloads the file for you.
Can you split the files into pieces and then rejoin them after the download?
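If you go the split-and-rejoin route, reassembling the pieces is trivial; a minimal sketch assuming hypothetical piece names like bigfile.mov.part0, bigfile.mov.part1, and so on:
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Rejoins pieces named bigfile.mov.part0, bigfile.mov.part1, ... into one file.
// Splitting on the server keeps every individual download under IE's 4 GB limit.
public class FileJoiner {

    public static void join(Path directory, String baseName, int partCount) throws IOException {
        Path target = directory.resolve(baseName);
        try (OutputStream out = Files.newOutputStream(target)) {
            for (int i = 0; i < partCount; i++) {
                Path part = directory.resolve(baseName + ".part" + i);
                Files.copy(part, out);   // append each piece in order
            }
        }
    }

    public static void main(String[] args) throws IOException {
        join(Paths.get("C:/downloads"), "bigfile.mov", 26);   // hypothetical names and count
    }
}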
If you don't want to write Java code in-house, there are commercial applet solutions available:
Vault
MyDownloder
Both of them have eval versions that you can download and test.
A few ideas:
Blizzard use a light-weight .exe BitTorrent wrapper for their patches. I'm not entirely sure how it is done, but it looks like a branded version of the official BitTorrent client.
Upload to Amazon S3, provide the torrent link of the file (all S3 files are automatically BitTorrent-enabled), plus the full HTTP download link as alternative. See S3 documentation
What about saying "We recommend that you install Free Download Manager to download this file. You will have the added benefit of being able to resume the file and accelerate the download."
Personally, I never download anything using the built-in browser download tool unless I have to (e.g. Gmail attachments).
#travis
Unfortunately it has to be over HTTP inside the user's browser.
I'll update the question to be more clear about that.
#levand
The problem only exists in IE (it works in Firefox), so while ActiveX would only work on IE, IE is the only one we need the workaround for.
#travis - interesting idea. Not sure if it will work for what I need, but I'll keep it in mind. I'm hoping to find something to integrate with the existing site instead of having to go out to a third party. It would also require me to set up a BitTorrent tracker, which wouldn't be as easy as it sounds for this application because different users will have different access to different files.
#jjnguy
I'm looking for a Java applet or ActiveX component that will do that for me. These are non-technical users, so we really just want to have them click download and have the full file end up in the specified location.
#ceejayoz
I totally agree, but it's part of the requirements from our client. There will be FTP access, but each user will have the option of downloading via HTTP or FTP. There are some users that will be behind corporate firewalls that don't permit FTP.
I have seen other sites do this in the past (MSDN, Adobe), so I was hoping there is something out there already instead of having to make one in-house (and learning Java and/or ActiveX).
I say a ClickOnce-installed download manager, similar to MSDN's.
But becoming a CDN without a more optimized protocol for the job is no easy task. I can't imagine a business model that can be worthwhile enough to have such large file downloads as a core competency unless you are doing something like MSDN. If you create a thick client, you at least get the chance to get some more face time with the users, for advertising or some other revenue model, since you will probably be paying in the hundreds of thousands of dollars to host such a service.
The problem with the applet approach mentioned is that unless you have the end user modify their Java security properties, your applet will not have permission to save to the hard drive.
It may be possible using Java Web Start (aka JNLP). I think that if it is a signed app it can get the extra permission to write to the hard drive. This is not too different from the download-an-exe approach. The problem with this is that the user has to have the correct version of Java installed and has to have the Java Web Start capability set up correctly.
I would recommend the exe approach as it will be the easiest for the non-technical users to use.
There are some users that will be behind corporate firewalls that don't permit FTP...
Are users with restrictive firewalls like that likely to be permitted to install and run a .exe file from your website?
Take a look at cURL. This article describes how to do a multi-part simultaneous download via HTTP. I've used cURL in the past to manage FTP downloads of files over 300GB.
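The multi-part and resume tricks boil down to HTTP Range requests. A minimal Java sketch of resuming a partial download that way (the URL and local path are hypothetical):
import java.io.IOException;
import java.io.InputStream;
import java.io.RandomAccessFile;
import java.net.HttpURLConnection;
import java.net.URL;

// Resumes a large download using an HTTP Range request, the same mechanism
// cURL and download managers rely on.
public class RangeDownloader {

    public static void resume(String url, String localPath) throws IOException {
        try (RandomAccessFile file = new RandomAccessFile(localPath, "rw")) {
            long alreadyDownloaded = file.length();

            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestProperty("Range", "bytes=" + alreadyDownloaded + "-");

            // 206 Partial Content means the server honoured the Range header.
            if (conn.getResponseCode() != HttpURLConnection.HTTP_PARTIAL) {
                return;   // server does not support ranges; fall back to a full download
            }

            file.seek(alreadyDownloaded);
            try (InputStream in = conn.getInputStream()) {
                byte[] buffer = new byte[64 * 1024];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    file.write(buffer, 0, read);
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        resume("http://example.com/files/video.mov", "C:/downloads/video.mov"); // hypothetical
    }
}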
Another tip: You can boost download times even more if you increase the size of the TCP Window on the client's NIC configuration. Set it as high as the OS allows and you should see up to 2x improvement depending on your physical network. This worked for me on Windows 2000 and 2003 when FTPing over a WAN. The down side is it may increase overhead for all other network traffic that wants only a few KB for a network packet, but is now forced to send/recv in 64KB packets. Your mileage may vary.
Edit: What exactly is it you're trying to accomplish? Who is the audience? I assumed for a bit that you're looking to do this over your own network, but you seem to imply the client side is someone on the internet. I think we need clearer requirements.
Create a folder for the files to be downloaded on the server where the document service is running (either using Linux commands or by using Java to execute shell commands).
Write the file to be downloaded into this folder (a Linux command or a Java shell command is fine). For efficiency of execution, the wget command is used here.
Package the folder as a zip file (using a shell command), configure an nginx proxy, return the nginx access path for the file to the front end, and then download it from the front end.
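A minimal Java sketch of those steps using ProcessBuilder; the wget/zip commands, folder layout, and nginx mapping are assumptions:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Sketch of the steps above: create a working folder, fetch the file with
// wget, zip the folder, and return the path that nginx exposes to the front end.
public class DownloadPackager {

    public static String prepare(String fileUrl, String jobId) throws IOException, InterruptedException {
        File workDir = new File("/data/downloads/" + jobId);        // hypothetical nginx-served root
        Files.createDirectories(workDir.toPath());

        run(workDir, "wget", "-q", fileUrl);                        // fetch the file into the folder
        run(new File("/data/downloads"), "zip", "-r", jobId + ".zip", jobId);

        return "/downloads/" + jobId + ".zip";                      // path nginx maps to /data/downloads
    }

    private static void run(File workingDir, String... command) throws IOException, InterruptedException {
        Process process = new ProcessBuilder(command)
                .directory(workingDir)
                .inheritIO()
                .start();
        if (process.waitFor() != 0) {
            throw new IOException("Command failed: " + String.join(" ", command));
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(prepare("http://example.com/big-file.bin", "job-42"));  // hypothetical
    }
}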
