We're developing a web application that must tolerate a noticeable amount of load. I'm running my tests on an HP ProLiant DL380 server (two 3.6 GHz Xeon CPUs, 16 GB of RAM, ...). I'm using Apache JMeter and Pylot to run the load tests (they show roughly similar results).
In one scenario, I configured the load testing program to hit my index page, using just one thread, as fast as it can. The index page is about 60 KB and consists of about 10 Ajax calls, lots of JavaScript and jQuery code, the required CSS, etc.
The results I got were, well, disappointing.
Full index.jsp page:
Throughput (req/sec): 3.567
Response Time (secs): 0.278
So I removed every Ajax call and got rid of the charts and the CSS (but not the JS):
Throughput (req/sec): 6.082
Response Time (secs): 0.161
Still very low! So I built a static HTML index page of the same size that contains all the data (without any server-side or client-side computation):
Throughput (req/sec): 20.787
Response Time (secs): 0.046
Wow, that was a breakthrough! Then I added some of the JavaScript code back to the index.html page:
Throughput (req/sec): 9.617
Response Time (secs): 0.103
Well, I think the bottleneck has been found: the JavaScript code. I need to find out how many req/sec "the server" can handle, and since JavaScript runs client side, I don't think I should include it in this test. So should load testing tools process JS code? (They seem to be doing this.)
Another key question: given the hardware, the content size, and the configuration described above, is this amount of throughput plausible? Shouldn't I expect more? My expectation is 500 req/sec! Is the only solution adding hardware?
BTW, the webapp has been built using Java + Struts2 + JSP + Hibernate + MySQL. It is also distributed over multiple servers using HAProxy, but the tests above were run against a single server.
If what you are looking for is a raw number for how fast the server can serve up the content, then I would say yes, ignore the JavaScript and focus on how long it takes to transfer all the various bits of the page (HTML, images, script files, CSS, etc.).
However, if you are trying to test user experience, JS is part of that, so you need to take it into account. From your description, you are not worried about user experience but rather about load on the server.
You might want to consider setting up JMeter to make direct calls to the pieces of your pages, as described in the first sentence above.
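If you want to sanity-check those numbers outside of JMeter, even a crude timing loop over a single resource will show the raw server-plus-network rate with no JavaScript processing in the way. A minimal sketch (the URL and request count are placeholders, not from your setup):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Crude raw-throughput check: fetch one resource repeatedly and time it.
// No JavaScript is executed, so this measures only the server and network.
public class RawFetchTimer {
    public static void main(String[] args) throws Exception {
        String target = "http://your-server/index.jsp"; // placeholder URL
        int runs = 100;
        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) {
            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            try (InputStream in = conn.getInputStream()) {
                byte[] buf = new byte[8192];
                while (in.read(buf) != -1) { /* drain the response body */ }
            }
        }
        double secs = (System.nanoTime() - start) / 1e9;
        System.out.printf("%d requests in %.2f s = %.1f req/sec%n", runs, secs, runs / secs);
    }
}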
Is your JavaScript (and CSS, images) embedded directly in the page or loaded from a script tag? The latter forces the browser to download the file from the server, instantly halving your pages per second. That's one of the reasons you should load jQuery from a different server (e.g. Google's CDN): it has a minor hit on the user-perceived page load time (one additional DNS query) but really takes the load off your server.
Writing your application so that pages are cacheable and then putting an HTTP proxy in front of your application is often a good strategy.
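On the Java side this can be as simple as a servlet filter that marks responses cacheable, so the proxy can answer repeat requests without touching the application. A minimal sketch using the standard Servlet API (the filter name and max-age value are made up for illustration):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Marks responses cacheable so an HTTP proxy in front of the app
// (e.g. Varnish or a caching nginx tier) can serve repeat requests itself.
public class CacheHeaderFilter implements Filter {
    public void init(FilterConfig cfg) {}

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        // Let shared caches keep the page for 60 seconds
        ((HttpServletResponse) res).setHeader("Cache-Control", "public, max-age=60");
        chain.doFilter(req, res);
    }

    public void destroy() {}
}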
I am working on a Java app; it works fine and all, but for the frontend I am using Thymeleaf (the first time I've used it, moving on from PrimeFaces).
What I've noticed is that the page loads (working locally) pretty fast, as expected, since the information is not relevant and no DB is being used for now.
What surprises me is that the images load maybe 2 seconds later, and I have no idea why; they are stored locally in the app's assets folder.
I am including them this way:
<img th:src="@{/assets/images/myimage.png}" />
Is there any way to make it faster? (Of course, I will later give more memory to my JVM, but this sort of thing shouldn't take that long...)
Any caching or faster ways?
I am using Spring 4 MVC.
Thanks.
There are so many variables to this that I am not going to list them all, but here are a few:
Distance between Client and Server
Efficiency of your application
Size of images
Browser and all extensions/plugins attached to it
Power of server
Power of client machine
Delivery method
JavaScript that manages the loading of images
Malware
As mentioned, there is a lot more that I haven't listed, and it really is a process of elimination.
Some things we did on our application to avoid the delay in images include:
Increasing server memory from 512 MB to 1024 MB
Moving the server to a more local location
Changing the location from which the application sources the images (faster RAID disks)
Delaying full page display until everything is preloaded on the client machine (look into fixes for Flash of Unstyled Content)
Performance improvements on the web application itself
What you need to do is explore each option and improve on it, and eventually you will get the results you need.
FYI, if you want to avoid serving images from your own server, host them on a CDN for speed of transfer.
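Since you mentioned Spring 4 MVC: one cheap win is to serve /assets/** through a resource handler with a long cache period, so the browser fetches each image only once. A minimal sketch, assuming Java config and that the assets live on the classpath (adjust the location to your actual folder):

import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

@Configuration
@EnableWebMvc
public class WebConfig extends WebMvcConfigurerAdapter {
    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        // Serve static images with a one-year cache period so repeat
        // page loads never re-download them.
        registry.addResourceHandler("/assets/**")
                .addResourceLocations("classpath:/assets/")
                .setCachePeriod(31536000);
    }
}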
I want to extract HTML data from a website using Java. The problem is that the webpage keeps scrolling down once the user reaches the bottom of the page. The number of times it scrolls down is fixed. My Java code can extract only the first part. How do I extract the content from the remaining scrolls? Is there a way to load the whole page at once with Java? Any help would be appreciated :)
This might be the type of thing that PhantomJS (http://phantomjs.org/) was designed for. It will crawl entire web pages and even execute JavaScript, using a "real" browser in headless mode. I suggest pausing what you're doing with Java and taking a look at PhantomJS instead. It could save you a LOT of time. :)
This type of behavior is implemented in the browser, which interprets the user's scrolling actions to load more content via Ajax and dynamically modifies the in-memory DOM. Consider that your Java runs in a web container on the server, and that web container (e.g. Tomcat, JBoss, etc.) provides a huge amount of underlying code so your app doesn't have to worry about the plumbing.
Conceptually, a similar thing occurs at the client, with the DHTML web page running in its own "container" (the browser), which provides a wealth of functionality, from UI to networking to the DOM. If you remove the browser from the equation and replace it with a Java program, you will need to provide the equivalent of the browser in which the DHTML/JavaScript can execute.
I believe that HtmlUnit may fit the bill, but I have not worked with it personally.
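For what it's worth, here is a minimal HtmlUnit sketch of the idea (untested; the URL, scroll count, and wait time are assumptions). HtmlUnit executes the page's JavaScript, so you can drive the scrolling yourself and then read the fully populated DOM:

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class InfiniteScrollScraper {
    public static void main(String[] args) throws Exception {
        WebClient client = new WebClient();
        client.getOptions().setJavaScriptEnabled(true);
        client.getOptions().setThrowExceptionOnScriptError(false);

        HtmlPage page = client.getPage("http://example.com/endless"); // placeholder URL

        int scrolls = 5; // the site's fixed number of scrolls
        for (int i = 0; i < scrolls; i++) {
            // Simulate the user reaching the bottom of the page
            page.executeJavaScript("window.scrollTo(0, document.body.scrollHeight);");
            client.waitForBackgroundJavaScript(2000); // let the Ajax calls finish
        }

        System.out.println(page.asXml()); // the DOM after all content has loaded
        client.closeAllWindows();
    }
}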
I wish to monitor my internet bandwidth from a web page I create. Is it possible to handle or read data packets from the network adapter in a web page using Java (.jsp) code? It should be dynamic and should be able to update the page as soon as possible.
EDIT:
Maybe I was not clear. What I want is to monitor the total upload and download speed, and also the speed to each connected server, in real time. So, as uploads and downloads occur to and from each server, the page should update automatically.
Somewhat like this:
http://stackoverflow.com upload: 50KBps download: 35KBps
http://www.facebook.com upload: 100KBps download: 10KBps
C:\program files\yahoo messenger.exe upload: 25KBps download: 12KBps
And the computer processes using the internet in the same way, but in a web page, not in a desktop application.
Is it possible to handle or read data packets from the network adapter in a web page using Java
Leaving aside whether it is actually possible for the moment, in order to achieve this you'd need to implement a full TCP/IP stack within your Java code.
You also need to answer the question of what it is that you actually want to measure. For most web-based applications, latency, caching, and parallelism are far more relevant to page response times than bandwidth.
and should be able to update the page as soon as possible.
This makes me even more confused about what you're trying to achieve, and whether you'd be able to deliver something that meets that requirement.
Most hosting companies limit the amount of data you can transfer within a given time period - and usually refer to this as a bandwidth limit - but that's quite different from how much data the hardware can process and the rate at which it can process it (i.e. what you'd see in the TCP/IP packets). If you want to measure this, then you'd need an exact definition of how the hosting company measures it - and the best place to source the data from is your webserver logs.
If you want to measure how responsive your pages are client-side and make allowances for the limitations of their network connection, then have a look at Yahoo Boomerang.
If you simply want to provide a self-service tool for users to measure their own bandwidth, then a Java applet paired with a server-side feed is probably the way to go - but remember to make multiple measurements with different-sized 'files' and use a regression algorithm to separate speed from latency.
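The regression part is just fitting time = latency + size / speed over the measurements. A minimal sketch with made-up numbers, using ordinary least squares over (size, time) pairs:

// Fit time = latency + size / bandwidth by ordinary least squares.
public class SpeedEstimator {
    public static void main(String[] args) {
        // Hypothetical measurements: payload size in KB, observed time in seconds
        double[] sizeKb  = { 100, 1024, 5120 };
        double[] timeSec = { 0.35, 1.10, 4.60 };

        int n = sizeKb.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx  += sizeKb[i];
            sy  += timeSec[i];
            sxx += sizeKb[i] * sizeKb[i];
            sxy += sizeKb[i] * timeSec[i];
        }
        double slope     = (n * sxy - sx * sy) / (n * sxx - sx * sx); // seconds per KB
        double intercept = (sy - slope * sx) / n;                     // fixed latency

        System.out.printf("Latency   ~ %.0f ms%n", intercept * 1000);
        System.out.printf("Bandwidth ~ %.0f KB/s%n", 1 / slope);
    }
}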
I have to tell the user of a web application (using Struts) their download speed.
What is the best way?
For the moment I simply have a .jsp filled with JavaScript, and I calculate the time it takes the user to load this .jsp (I perform this action 3 times and take the average speed). It worked OK when I had a slow connection (under 1 Mb), but for faster connections the values were wrong (too small; and if I have somebody from another country test it, the connection speed results are even smaller).
The .jsp size (measured with the YSlow add-on for Mozilla Firefox) was small: 70 KB gzipped, so I raised it to 260 KB and it still doesn't show the right value.
I should mention that I cannot have the user download a file, because what I'm really doing is a small compatibility test of the application with the user's network, browser, etc.; the application involves a lot of JavaScript on the client side, and the average size of a JSP file is 70 KB (gzipped), which is why I started with that size.
Could you please give me some advice? I'd appreciate it.
Luisa
I think the best way to do it is client side (JavaScript, for example).
If you want more specific data, you need to do several downloads in a loop (100 KB, 1 MB, 5 MB).
http://ditio.net/2008/08/06/detect-connection-speed-with-javascript/
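If you control the server, one way to produce those different-sized test downloads is a small servlet that streams an uncacheable payload of a requested size, which the client-side code then times. A rough sketch (the servlet name and parameter are hypothetical):

import java.io.IOException;
import java.io.OutputStream;
import java.util.Random;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Streams the requested number of kilobytes with caching disabled, so
// every request really travels over the wire and can be timed client side.
public class PayloadServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        int sizeKb = Integer.parseInt(req.getParameter("kb")); // e.g. ?kb=1024
        resp.setContentType("application/octet-stream");
        resp.setContentLength(sizeKb * 1024);
        resp.setHeader("Cache-Control", "no-store, no-cache, must-revalidate");
        resp.setHeader("Pragma", "no-cache");

        byte[] chunk = new byte[1024];
        new Random().nextBytes(chunk); // junk data, same 1 KB block repeated
        OutputStream out = resp.getOutputStream();
        for (int i = 0; i < sizeKb; i++) {
            out.write(chunk);
        }
        out.flush();
    }
}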
I want to know the best way to calculate the download speed of clients using Java.
I have searched on the internet and found that I should download a file from the server and calculate the time before and after the download completes; to make it more accurate, repeat this 3 times and take the average time.
For me, the approach above doesn't work well: for localhost I get results close to reality, but from the server the results are a lot lower. I made a page filled with JS and CSS (of the size I want to test); before it is served, I record the start date in my extended Action class (using Struts), and on the page's 'onload' event I make an Ajax submit from which I get the end date. EndDate - StartDate gives me the time... I forgot to mention, I also set my JSP page to not be cached.
Do you know other solutions?
Thanks,
Luisa
On Linux you could just run the following from the command line:
time wget <some-url>
This will give you the elapsed time for the GET request as the "real" time. Of course, this gives you just the time taken to fetch the page contents. It won't give you the time for loading any images, stylesheets, scripts, etc. that the page pulls in, or the time taken to render the page or execute any embedded JavaScript.
... for localhost I get results close to reality, but from the server the results are a lot lower.
Your previous method may actually have been giving you true results. It is not unusual for access via "localhost" to be faster than remote access, especially if there are slow network links, web proxies, etc. in the route from the client to the server.