What is the best way to calculate a client's download speed using Java?
I have searched the internet and found that I should download a file from the server, record the time before and after the download completes and, to be more accurate, repeat this three times so I have an average time.
The approach above doesn't work well for me: for localhost I get results reasonably close to reality, but from the server the results are a lot lower. I made a page filled with JS and CSS (of the size I want to test); before it is served I record the start date in my extended Action class (using Struts), and on the 'onLoad()' event I make an AJAX submit from which I get the end date. EndDate - StartDate gives me the time... I forgot to mention, I also set my JSP page not to be cached.
Do you know of other solutions?
Thanks,
Luisa
On Linux you could just run the following from the command line.
time wget <some-url>
This will give you the elapsed time for the GET request as the "real" time. Of course, this is just the time taken to fetch the page contents. It won't give you the time for loading any images, stylesheets, scripts, etc. that the page pulls in, nor the time taken to render the page or execute any embedded JavaScript.
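If you want to make the same measurement from Java rather than from the shell, a minimal sketch might look like this (the URL is a placeholder):

import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class DownloadTimer {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/some-page"); // placeholder
        long start = System.nanoTime();

        URLConnection conn = url.openConnection();
        conn.setUseCaches(false); // avoid timing a cached copy
        long bytes = 0;
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                bytes += n;
            }
        }

        double seconds = (System.nanoTime() - start) / 1e9;
        System.out.printf("Read %d bytes in %.3f s (%.1f KB/s)%n",
                bytes, seconds, bytes / 1024.0 / seconds);
    }
}

As with wget, this measures only the raw page fetch, not images, stylesheets, or script execution.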
... for localhost I get results reasonably close to reality, but from the server the results are a lot lower.
Your previous method may actually have been giving you true results. It is not unusual for access via localhost to be faster than remote access, especially if there are slow network links, web proxies, etc. on the route from the client to the server.
I need advice on how to plan and design a project. My assignment is to take data from a user through a webpage and pass it as input to a program that runs locally on the computer. I imagine it as an application (to be installed) where the user types input into specific boxes; after clicking Run, the data is first validated and lightly transformed (to ensure correct input and some basic transformations), then passed to the program (written in Java). Once the result is computed, it is processed further to make it easily readable (graphs, known distributions, some intervals, etc.) and displayed to the user, again in the form of a webpage.
I am not sure what the process is or which programs I need to use (preferring free and open-source tools) to develop such a thing. I have not done this before, so I need advice on which topics and areas I should research.
I imagine the steps roughly as follows:
1. Have an HTML page with CSS, containing JS functionality (input boxes and interactive controls for commands, such as "send data").
2. Make a connection between the HTML page and a local server.
3. Send data from the HTML form to the program running on the server.
4. Let the program compute the result from the HTML input.
5. Process the result into a user-friendly format.
6. Send the data back to be displayed on the HTML page (graphically readable).
7. Return to the HTML page from step 1.
Is something like this workable? How can I make the Java program run locally? A friend of mine mentioned having it run on a port (how can I accomplish that)?
I cannot have it run on the internet; it needs to be as secure as possible (distributed to people but used separately). I have read about sockets; would they be a good fit in this case?
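To make the question concrete, here is roughly what I imagine "running on a port" to mean, using Java's built-in com.sun.net.httpserver (just a sketch based on my reading, so the path and port are arbitrary; I am not sure this is the right approach):

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class LocalAppServer {
    public static void main(String[] args) throws Exception {
        // Bind only to localhost so the server is unreachable from the network.
        HttpServer server = HttpServer.create(
                new InetSocketAddress("127.0.0.1", 8080), 0);

        server.createContext("/compute", exchange -> {
            // Read the raw data the HTML form submitted.
            String input = new String(
                    exchange.getRequestBody().readAllBytes(),
                    StandardCharsets.UTF_8);

            // Placeholder for the real computation.
            String result = "You sent: " + input;

            byte[] body = result.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start();
        System.out.println("Listening on http://127.0.0.1:8080/compute");
    }
}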
I would really appreciate tips on how to proceed further.
We're developing a web application that must tolerate a noticeable amount of load. I'm running my tests on an HP ProLiant DL380 server (two 3.6 GHz Xeon CPUs, 16 GB of RAM, ...). I'm using Apache JMeter and Pylot to run load tests (they show broadly similar results).
In one scenario, I configured the load-testing program to hit my index page, using just one thread, as fast as it can. The index page is about 60 KB and consists of about 10 AJAX calls, lots of JavaScript and jQuery code, the required CSS, and so on.
The results I got were, well, disappointing.
Full index.jsp page:
Throughput (req/sec): 3.567
Response Time (secs): 0.278
So I removed every AJAX call and got rid of the charts and the CSS (but not the JS):
Throughput (req/sec): 6.082
Response Time (secs): 0.161
Still very low! So I built a static index page in plain HTML containing all the data at the same size (without any server-side or client-side computation):
Throughput (req/sec): 20.787
Response Time (secs): 0.046
Wow, that was a breakthrough! Then I added some of the JavaScript code back to the index.html page:
Throughput (req/sec): 9.617
Response Time (secs): 0.103
Well, I think the bottleneck has been found: the JavaScript code. I need to find out how many req/sec "the server" can handle, and since JavaScript runs client-side, I don't think I should include it in this test. So should load-testing tools process JS code? (They seem to be doing this.)
Another key question: given the hardware, the content size, and the aforementioned configuration, is this amount of throughput plausible? Shouldn't I expect more? My expectation is 500 req/sec! Is the only solution adding hardware?!
BTW, the webapp has been built using Java + Struts2 + JSP + Hibernate + MySQL. It is also distributed over multiple servers using HAProxy, but the aforementioned tests were run on a single server.
If what you are looking for is a raw number for how fast the server can serve up the content, then I would say yes, ignore the JavaScript and focus on how long it takes to transfer all the various bits of the page (HTML, images, script files, CSS, etc.).
However, if you are trying to test user experience, well, JS is part of that, so you need to take it into account. From your description, you are not worried about user experience but rather about load on the server.
You might want to consider setting up JMeter to make direct calls to the pieces of your pages, as described in the first sentence above.
Is your JavaScript (CSS, images) directly embedded in the page or loaded from a script tag? The latter forces the browser to download the file from the server, instantly halving your pages per second. That's one of the reasons you should load jQuery from a different server (e.g. Google): it has a minor cost in user-perceived page load time (one additional DNS query) but really takes the load off your server.
Another key question: given the hardware, the content size, and the aforementioned configuration, is this amount of throughput plausible? Shouldn't I expect more? My expectation is 500 req/sec! Is the only solution adding hardware?!
Writing your application so that pages are cacheable and then putting an HTTP proxy in front of it is often a good strategy.
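As a sketch of the first half of that in a Java web app (the max-age value is arbitrary; the right value depends on how often your pages change), a servlet filter could mark responses cacheable like this:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class CacheHeaderFilter implements Filter {
    public void init(FilterConfig config) { }

    public void doFilter(ServletRequest req, ServletResponse res,
                         FilterChain chain) throws IOException, ServletException {
        // Allow browsers and intermediate proxies to cache for 5 minutes.
        ((HttpServletResponse) res).setHeader(
                "Cache-Control", "public, max-age=300");
        chain.doFilter(req, res);
    }

    public void destroy() { }
}

The HTTP proxy in front can then answer repeated requests itself instead of hitting your JSP/Struts stack every time.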
I am trying to download the contents of a site. The site is a Magento site where one can filter results by selecting properties in the sidebar. See zennioptical.com for a good example.
So, using zennioptical.com as an example, I need to download all the rectangular glasses, or all the plastic ones, etc.
So how do I send a request to the server to display only the rectangular frames and so on?
Thanks so much
The basic answer is that you need to do an HTTP GET request with the correct query params. I'm not totally sure how you are trying to do this based on your question, so here are two options.
If you are trying to do this from JavaScript, you can look at this question. It has a bunch of answers that show how to perform AJAX GETs with the built-in XMLHttpRequest or with jQuery.
If you are trying to download the page from a Java application, this really doesn't involve AJAX at all. You'll still need to do a GET request, but now you can look at this other question for some ideas.
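For instance, a bare-bones GET from Java might look like this (just a sketch; the query string is the kind of thing you would copy from the Web Console output shown in the edit below):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class FilteredDownload {
    public static void main(String[] args) throws Exception {
        // Filter parameters copied from what the browser sends.
        URL url = new URL("http://www.zennioptical.com/"
                + "?prescription_type=single&frm_shape%5B%5D=724");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}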
Whether you are using JavaScript or Java, the hard part is going to be figuring out the right URLs to query. If you are trying to scrape someone else's site, you will have to see what URLs your browser requests when you filter the results. One of the easiest ways to see that info in Firefox is the Web Console, found at Tools -> Web Developer -> Web Console. You could also download something like Wireshark, which is a good tool to have around but probably overkill for what you need.
EDIT
For example, when I clicked the "rectangle frames" option at zenni optical, this is the query that fired off in the Web Console:
[16:34:06.976] GET http://www.zennioptical.com/?prescription_type=single&frm_shape%5B%5D=724&nav_cat_id=2&isAjax=true&makeAjaxSearch=true [HTTP/1.1 200 OK 2328ms]
You'll have to do a sufficient number of these to figure out how to generate the URLs to get the results you want.
DISCLAIMER
If you are downloading someone else's data, it would be best to check with them first. The owner of the server may not appreciate what they might consider stealing their data/work. And then, depending on how you use the data you pull down, you could be venturing into all sorts of ethical issues... Then again, if you are downloading from your own site, go for it.
I have to tell the user of a web application (using Struts) their download speed.
What is the best way?
For the moment I simply have a .jsp filled with JavaScript, and I measure the time it takes the user to load it (I perform this action 3 times and take the average speed). It worked OK when I had a slow connection (under 1 Mb), but for faster connections the values were wrong (too small; and if I have somebody from another country test it, the connection-speed results are even smaller).
The JSP's size (measured with the YSlow add-on for Firefox) was small, 70 KB (gzipped), so I raised it to 260 KB and it still doesn't show the right value.
I should mention that I cannot make the user download a file, because I have to do a little compatibility test of the application with the user's network, browser, etc. The application involves a lot of JavaScript on the client side, and the average size of a JSP page is 70 KB (gzipped), which is why I started with that size.
Could you please give me some advice? I'd appreciate it.
Luisa
I think the best way to do it is on the client side (with JavaScript, for example).
If you want more precise data, you need to do several downloads in a loop (100 KB, 1 MB, 5 MB).
http://ditio.net/2008/08/06/detect-connection-speed-with-javascript/
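If you need a server-side endpoint for those client-side loops to time, here is a sketch of a servlet that streams a payload of a requested size (the URL pattern and parameter name are made up for illustration):

import java.io.IOException;
import java.io.OutputStream;
import java.util.Arrays;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SpeedTestServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // e.g. /speedtest?bytes=102400 for a 100 KB payload
        int size = Integer.parseInt(req.getParameter("bytes"));

        resp.setContentType("application/octet-stream");
        resp.setContentLength(size);
        // Disable caching, otherwise repeat measurements are meaningless.
        resp.setHeader("Cache-Control", "no-store");

        byte[] chunk = new byte[8192];
        Arrays.fill(chunk, (byte) 'x');
        OutputStream out = resp.getOutputStream();
        for (int sent = 0; sent < size; sent += chunk.length) {
            out.write(chunk, 0, Math.min(chunk.length, size - sent));
        }
    }
}

The JavaScript side then requests increasing sizes and divides bytes received by elapsed time, as the linked article describes.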
I'm looking for an elegant way to create a queue for serving up batch-compiled applets.
I have hacked together a SQL and PHP script to handle this, but it chokes under mild load.
Is there an existing system that can take a list from SQL and serve the applets in descending order each time one is requested? I'm also trying to handle all of this server-side.
The trick is getting file001, then file002, and so on, served each time a web page is loaded. I'm batch-creating applets with slightly modified backgrounds, and I'm trying to serve a never-been-used applet from the queue each time a page is requested.
Is there an applet server I can tweak, or does this look like something that needs to be built?
No, I have never heard of a "batch compile applet server".
Could you maybe explain in more detail why you feel this is necessary?
Why don't you just use the same class and pass parameters to it?
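For instance, rather than batch-compiling one applet per background, a single applet class could read the background from a param tag in the embedding page (a sketch; the parameter name "bg" is illustrative):

import java.applet.Applet;
import java.awt.Color;
import java.awt.Graphics;

public class VariantApplet extends Applet {
    private Color background = Color.WHITE;

    public void init() {
        // e.g. <param name="bg" value="ff0000"> in the embedding page
        String bg = getParameter("bg");
        if (bg != null) {
            background = new Color(Integer.parseInt(bg, 16));
        }
    }

    public void paint(Graphics g) {
        setBackground(background);
        g.drawString("Variant " + getParameter("bg"), 20, 20);
    }
}

Then the server only has to vary the param value per page load, which is far cheaper than recompiling.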
That said, you can do compilation on demand quite well with e.g. Ant and/or CruiseControl. You could put the pre-compiled applets into a directory; then your PHP frontend just needs to keep track of which applet it delivered last and fetch the next one the next time.
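If you do want to keep the pre-compiled-queue approach but the PHP script is the bottleneck, a sketch of the same bookkeeping as a Java servlet might look like this (the directory layout and file naming are assumptions):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.concurrent.atomic.AtomicLong;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class AppletQueueServlet extends HttpServlet {
    // Thread-safe counter: each request gets the next jar in the directory.
    private final AtomicLong next = new AtomicLong(1);

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Assumed layout: /applets/file001.jar, /applets/file002.jar, ...
        String name = String.format("file%03d.jar", next.getAndIncrement());
        File jar = new File("/applets", name);
        if (!jar.isFile()) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND, "queue exhausted");
            return;
        }
        resp.setContentType("application/java-archive");
        resp.setContentLengthLong(jar.length());
        Files.copy(jar.toPath(), resp.getOutputStream());
    }
}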
Still, this sounds rather complicated to me; so maybe you could explain your motivation.
In particular, why do you want a new applet on every reload? Would that not be rather confusing? Why not offer a link for each variant, so the user can choose?