In my project, I have some JavaScript responsible for tracking user actions in order to optimize the page's layout. These calls are executed when the user clicks something, including the links leading to further pages.
I have the whole flow covered by automated tests written in Java and based on Selenium WebDriver. I'm using a BrowserMob proxy to capture the requests and verify that the correct data is being passed to the user tracking service.
In certain situations, the requests hitting the service do not get recorded by the proxy. This happens because the browser navigates to the next page before receiving the response from the tracking service. The request actually reaches its destination, which I can confirm from the state of the database. But because the browser does not wait for the response, the proxy never records it, despite the default 5-second wait, which seems to be ignored in this case. This only happens once in a while, causing false negatives in my test reports.
I can't force the browser to actually wait for those requests because I don't want the tracking to impede the user journey. What I'd like to do is somehow configure the proxy to distinguish between requests that were never sent and requests that were cancelled midway, so I could attach this information to my reports.
Is this possible to achieve with the BrowserMob proxy, or would some other tool do a better job?
Try the PhantomJS WebDriver implementation: you don't need to start the Jetty proxy server, and you can capture all requests, even those that never get a response.
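If I remember correctly, the PhantomJSDriver (GhostDriver) bindings from 1.1.0 onwards expose an executePhantomJS() method, which lets you register PhantomJS's onResourceRequested callback and log every outgoing request regardless of whether a response ever arrives. A rough sketch, with the URL and the logging as placeholders:

```java
// Sketch only: assumes the PhantomJSDriver/GhostDriver bindings (1.1.0+),
// where executePhantomJS() runs a script in the PhantomJS page context.
import org.openqa.selenium.phantomjs.PhantomJSDriver;

public class TrackingRequestCapture {
    public static void main(String[] args) {
        PhantomJSDriver driver = new PhantomJSDriver();

        // Register a callback on the PhantomJS side that fires for every outgoing
        // request, even ones the browser abandons before a response arrives.
        driver.executePhantomJS(
            "var page = this;" +
            "page.onResourceRequested = function(requestData, networkRequest) {" +
            "    console.log('Requested: ' + requestData.url);" +
            "};");

        driver.get("http://example.com/page-under-test"); // placeholder URL

        // ...drive the user journey here, then check the PhantomJS console/log
        // output for the tracking calls you expect...

        driver.quit();
    }
}
```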
I have a JSP/servlet-based web app.
I have a "Clean Up" button which calls a servlet, and the request goes down to a DAO class. The DAO class performs various DB activities, such as moving data from the master table to a backup table and then deleting the data from the master table.
As of now this activity is synchronous and the user needs to wait until a response is sent.
I want to implement the same scenario as an asynchronous task, with the user just getting the message "Clean Up Activities Triggered".
What would be the best/easiest way to perform this task? I cannot use a scheduler.
My container is Tomcat.
The simplest, though different, solution would be to use some AJAX behavior on the client side. There are lots of simple, powerful frameworks (JS files) to help you add AJAX to your page. With AJAX, you just submit the request asynchronously and display the client-side message "Clean Up Activities Triggered" while the request is being processed on the server side. If the user waits, the server process will return and display a "success" message; otherwise the user is free to navigate to other pages or perform other actions.
ExecutorService is the most robust solution; creating a simple thread would do as well. The bigger problem, however, is synchronization: use a Semaphore to make sure two users aren't cleaning up simultaneously.
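A minimal sketch of that idea, using a single-permit Semaphore around a hypothetical CleanUpDao; only the first caller actually triggers the work:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;

public class CleanUpService {

    // Hypothetical DAO interface standing in for the real data-access class.
    public interface CleanUpDao {
        void cleanUp();
    }

    private final ExecutorService executor = Executors.newSingleThreadExecutor();
    // One permit: at most one clean-up may be running at any time.
    private final Semaphore permit = new Semaphore(1);

    /** Returns true if a clean-up was triggered, false if one is already running. */
    public boolean triggerCleanUp(final CleanUpDao dao) {
        if (!permit.tryAcquire()) {
            return false; // another user's clean-up is still in progress
        }
        executor.submit(new Runnable() {
            public void run() {
                try {
                    dao.cleanUp(); // move data to backup tables, delete from master, etc.
                } finally {
                    permit.release();
                }
            }
        });
        return true;
    }

    public void shutdown() {
        executor.shutdown();
    }
}
```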
We did this for our project once and it worked pretty well.
We sent a 200 OK to the user as long as there were no issues processing the request, and used Java's ExecutorService to do the clean-up.
If something went wrong, we notified the user separately.
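Roughly, the servlet side of that pattern looks like the sketch below; the clean-up body is a placeholder for the actual DAO calls, and the executor lives for the servlet's lifetime:

```java
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CleanUpServlet extends HttpServlet {

    private ExecutorService executor;

    public void init() throws ServletException {
        executor = Executors.newSingleThreadExecutor();
    }

    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Hand the long-running work off and answer the request immediately.
        executor.submit(new Runnable() {
            public void run() {
                try {
                    performCleanUp();
                } catch (Exception e) {
                    // The response is long gone, so notify the user separately
                    // (mail, status page, admin log, ...).
                }
            }
        });
        resp.getWriter().write("Clean Up Activities Triggered");
    }

    private void performCleanUp() {
        // Placeholder: call the DAO here (move data to backup, delete from master, ...).
    }

    public void destroy() {
        executor.shutdownNow();
    }
}
```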
I have a Java crawler application which is supposed to connect to some HTTP servers, download the HTML content of their pages, and then move on to other HTTP servers. For this task, I've used the Apache HTTP library.
At the first few hours of the run, things seem to work rather smoothly (there are some connection-related exceptions thrown around from time to time, but that's to be expected).
Yet after a while, it seems like I keep getting SocketTimeoutException on every request I send out. The exception does not occur on the HttpClient class's "execute" method, but rather when I try to get the content of the Entity (which I retrieve from the HttpResponse object), or when I try to write that content to a file.
Then, if I stop the application and start it over again, things seem to go back to working fine, even though it picks up from where it stopped, meaning it's interacting with the same servers that were throwing SocketTimeoutException before.
I tried looking for all kinds of possible clean-ups that I might be missing and might be essential when using this library, but couldn't find anything.
Any help would be greatly appreciated.
Thanks.
This sounds like the kind of thing that could be caused by connection pooling when you're not closing things once you're done with them: the timeout then occurs while the client library waits to retrieve a pooled connection. Are you sure you're closing everything properly (in finally blocks)?
If you run Wireshark to monitor your traffic, what network traffic occurs while it's "broken"?
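For reference, this is roughly what "closing everything properly" looks like with Apache HttpClient 4.3+ (adjust to the version you are actually using): consume the entity fully and close the response even when reading it throws.

```java
import java.io.IOException;

import org.apache.http.HttpEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class CrawlerFetch {

    /** Fetches a page and guarantees the pooled connection is released afterwards. */
    public static String fetch(CloseableHttpClient client, String url) throws IOException {
        HttpGet get = new HttpGet(url);
        CloseableHttpResponse response = client.execute(get);
        try {
            HttpEntity entity = response.getEntity();
            // EntityUtils.toString() reads the entity to the end, which lets the
            // connection go back to the pool instead of being leaked.
            return entity != null ? EntityUtils.toString(entity) : null;
        } finally {
            // Always close the response, even if reading the content threw; a leaked
            // connection means later requests block waiting for a free one.
            response.close();
        }
    }

    public static void main(String[] args) throws IOException {
        CloseableHttpClient client = HttpClients.createDefault();
        try {
            System.out.println(fetch(client, "http://example.com/")); // placeholder URL
        } finally {
            client.close();
        }
    }
}
```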
Make sure you're not issuing too many HTTP requests at the same time. For example, send 5 requests, wait for the first response, and only then make another request, and so on. It looks like your HTTP requests open too many sockets.
I am trying to create a JSP page that will show the status of a group of local servers.
Currently I have a scheduler class that constantly polls the servers at a 30-second interval, with a 5-second delay waiting for each server's reply, and provides the JSP page with the information. However, I find this approach inaccurate, as it takes some time before the scheduler class's information is updated. Do you have a better way to check the status of several servers within a local network?
-- Update --
Thanks to Romain Hippeau and dbyrne for their answers.
Currently I am trying to move more of the code to the server side, that is, to constantly check the status of the group of servers asynchronously, so as to make the page more responsive.
However, I forgot to add that the client has the ability to change the server status. This causes a problem: for example, when the client changes a server's status and then refreshes the page, the page retrieves information from the not-yet-updated scheduler class and shows the server's previous status instead.
You can use Tomcat Comet; here is an article: http://www.ibm.com/developerworks/web/library/wa-cometjava/index.html
This technology (which is part of the Servlet 3.0 spec) allows you to push notifications to the clients. There are issues with running it behind a firewall, but if you are within an intranet this should not be too big of a problem.
Make sure you poll the servers asynchronously. You don't want to wait for a response from one server before polling the next. This will dramatically cut down the amount of time it takes to poll all the servers. It was unclear to me from your question whether or not you are already doing this.
Asynchronous http requests in Java
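To illustrate the asynchronous polling, here is a rough sketch that probes all servers in parallel and gives each one at most 5 seconds to answer; the isAlive() probe and the pool size are placeholders:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class StatusPoller {

    private final ExecutorService pool = Executors.newFixedThreadPool(10);

    /** Polls every server in parallel; any server not answering in 5 seconds counts as down. */
    public Map<String, Boolean> pollAll(List<String> servers) throws InterruptedException {
        Map<String, Future<Boolean>> futures = new HashMap<String, Future<Boolean>>();
        for (final String server : servers) {
            futures.put(server, pool.submit(new Callable<Boolean>() {
                public Boolean call() throws Exception {
                    return isAlive(server);
                }
            }));
        }
        Map<String, Boolean> statuses = new HashMap<String, Boolean>();
        for (Map.Entry<String, Future<Boolean>> entry : futures.entrySet()) {
            try {
                statuses.put(entry.getKey(), entry.getValue().get(5, TimeUnit.SECONDS));
            } catch (ExecutionException e) {
                statuses.put(entry.getKey(), Boolean.FALSE); // probe itself failed
            } catch (TimeoutException e) {
                statuses.put(entry.getKey(), Boolean.FALSE); // no reply within 5 seconds
            }
        }
        return statuses;
    }

    private boolean isAlive(String server) {
        // Placeholder probe: open a socket, ping a health URL, etc.
        return false;
    }
}
```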
I have a shell script which I'd like to trigger from a J2EE web app.
The script does lots of things - processing, FTPing, etc - it's a legacy thing.
It takes a long time to run.
I'm wondering what is the best approach to this. I want a user to be able to click on a link, trigger the script, and display a message to the user saying that the script has started. I'd like the HTTP request/response cycle to be instantaneous, irrespective of the fact that my script takes a long time to run.
I can think of three options:
1. Spawn a new thread during the processing of the user's click. However, I don't think this is compliant with the J2EE spec.
2. Send some output down the HTTP response stream and commit it before triggering the script. This gives the illusion that the HTTP request/response cycle has finished, but actually the thread processing the request is still sitting there waiting for the shell script to finish. So I've basically hijacked the container's HTTP processing thread for my own purposes.
3. Create a wrapper script which starts my main script in the background. This would let the request/response cycle finish normally in the container.
All the above would be using a servlet and Runtime.getRuntime().exec().
This is running on Solaris using Oracle's OC4J app server, on Java 1.4.2.
Does anyone have opinions on which is the least hacky solution, and why?
Or does anyone have a better approach? We've got Quartz available, but we don't want to have to reimplement the shell script as a Java process.
Thanks.
You mentioned Quartz, so let's go for option #4 (which is IMO the best, of course):
Use the Quartz Scheduler and an org.quartz.jobs.NativeJob.
PS: The biggest problem may be finding documentation; this is the best source I've been able to find: How to use NativeJob?
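A rough sketch of what that might look like with the Quartz 1.x API; the script path and job names are placeholders, and the NativeJob property keys are worth double-checking against your Quartz version:

```java
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SimpleTrigger;
import org.quartz.impl.StdSchedulerFactory;
import org.quartz.jobs.NativeJob;

public class LegacyScriptLauncher {

    public static void launch() throws Exception {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.start();

        // NativeJob exec()s the configured command on one of Quartz's worker threads,
        // so the HTTP request thread returns immediately.
        JobDetail job = new JobDetail("runLegacyScript", Scheduler.DEFAULT_GROUP, NativeJob.class);
        job.getJobDataMap().put(NativeJob.PROP_COMMAND, "/opt/legacy/longRunning.sh"); // placeholder
        job.getJobDataMap().put(NativeJob.PROP_WAIT_FOR_PROCESS, false);

        // A SimpleTrigger with no repeat fires once, right away.
        SimpleTrigger trigger = new SimpleTrigger("runLegacyScriptNow", Scheduler.DEFAULT_GROUP);
        scheduler.scheduleJob(job, trigger);
    }
}
```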
I'd go with option 3, especially if you don't actually need to know when the script finishes (or have some other way of finding out other than waiting for the process to end).
Option 1 wastes a thread that's just going to be sitting around waiting for the script to finish. Option 2 seems like a bad idea. I wouldn't hijack servlet container threads.
Is it necessary for your application to evaluate output from the script you are starting, or is this a simple fire-and-forget job? If it's not required, you can 'abuse' the fact that Runtime.getRuntime().exec() will return immediately with the process continuing to run in the background. If you actually wanted to wait for the script/process to finish, you would have to invoke waitFor() on the Process object returned by exec().
If the process you are starting writes anything to stdout or stderr, be sure to redirect these to either log files or /dev/null; otherwise the process will block after a while, since stdout and stderr are exposed as InputStreams with limited buffering capabilities through the Process object.
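Putting those two points together, a fire-and-forget sketch that runs the script through a shell so its output is redirected and the Java side never has to read the streams (paths are placeholders):

```java
import java.io.IOException;

public class FireAndForgetLauncher {

    public static void launch() throws IOException {
        // Run the legacy script through a shell so stdout/stderr go to a log file
        // instead of the Process buffers; exec() returns immediately and we never
        // call waitFor(), so the request thread is not tied up.
        String[] cmd = {
            "/bin/sh", "-c",
            "/opt/legacy/longRunning.sh >> /var/log/longRunning.log 2>&1" // placeholder paths
        };
        Runtime.getRuntime().exec(cmd);
    }
}
```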
My approach to this would probably be something like the following:
Set up an ExecutorService within the servlet to perform the actual execution.
Create an implementation of Callable with an appropriate return type, that wraps the actual script execution (using Runtime.exec()) to translate Java input variables to shell script arguments, and the script output to an appropriate Java object.
When a request comes in, create an appropriate Callable object, submit it to the executor service and put the resulting Future somewhere persistent (e.g. user's session, or UID-keyed map returning the key to the user for later lookups, depending on requirements). Then immediately send an HTTP response to the user implying that the script was started OK (including the lookup key if required).
Add some mechanism for the user to poll the progress of their task, returning either a "still running" response, a "failed" response or a "succeeded + result" response depending on the state of the Future that you just looked up.
It's a bit handwavy but depending on how your webapp is structured you can probably fit these general components in somewhere.
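To make that a little more concrete, here is a trimmed-down sketch of the submit-and-poll part. It assumes Java 5+ for java.util.concurrent, and the script path, attribute name and messages are illustrative only; real code should also drain the process's output streams.

```java
import java.io.IOException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ScriptServlet extends HttpServlet {

    private ExecutorService executor;

    public void init() throws ServletException {
        executor = Executors.newFixedThreadPool(2);
    }

    /** Kick the script off and return immediately. */
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Future<Integer> future = executor.submit(new Callable<Integer>() {
            public Integer call() throws Exception {
                // Placeholder invocation; real code should also consume stdout/stderr.
                Process p = Runtime.getRuntime().exec("/opt/legacy/longRunning.sh");
                return p.waitFor(); // exit code becomes the Future's result
            }
        });
        req.getSession().setAttribute("scriptFuture", future);
        resp.getWriter().write("Script started");
    }

    /** Poll endpoint: report the current state of the submitted job. */
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Future<?> future = (Future<?>) req.getSession().getAttribute("scriptFuture");
        if (future == null) {
            resp.getWriter().write("no script submitted");
        } else if (!future.isDone()) {
            resp.getWriter().write("still running");
        } else {
            try {
                resp.getWriter().write("finished, exit code " + future.get());
            } catch (Exception e) {
                resp.getWriter().write("failed: " + e.getMessage());
            }
        }
    }

    public void destroy() {
        executor.shutdownNow();
    }
}
```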
If your HTTP response / the user does not need to see the output of the script, or be aware of when the script completes, then your best option is to launch the script via some sort of wrapper script, as you mention, so that it runs outside of the servlet container environment altogether. This means you absolve yourself of needing to manage threads within the container, hijacking a thread as you mention, and so on.
Only if the user needs to be informed of when the script completes and/or monitor the script's output would I consider options 1 or 2.
For the second option, you can use a servlet, and after you've responded to the HTTP request, use java.lang.Runtime.exec() to execute your script. I'd also recommend that you look here: http://www.javaworld.com/javaworld/jw-12-2000/jw-1229-traps.html for some of the problems and pitfalls of using it.
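One of the main traps that article describes is not consuming the process's output, which can make the child process hang once its buffers fill. A sketch of draining stdout and stderr in background threads after the response has been committed:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ExecWithGobblers {

    /** Reads a stream to exhaustion so the child process never blocks on a full buffer. */
    static class StreamGobbler extends Thread {
        private final InputStream in;
        private final String prefix;

        StreamGobbler(InputStream in, String prefix) {
            this.in = in;
            this.prefix = prefix;
        }

        public void run() {
            try {
                BufferedReader reader = new BufferedReader(new InputStreamReader(in));
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(prefix + line); // or write to a log file
                }
            } catch (IOException ignored) {
            }
        }
    }

    public static void run(String command) throws IOException {
        Process process = Runtime.getRuntime().exec(command);
        new StreamGobbler(process.getInputStream(), "OUT> ").start();
        new StreamGobbler(process.getErrorStream(), "ERR> ").start();
        // Deliberately no waitFor() here: the HTTP response has already been committed.
    }
}
```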
The most robust solution for asynchronous backend processes is using a message queue, IMO. Recently I implemented this using a Spring-embedded ActiveMQ broker and rigging up a producing and a consuming bean. When a job needs to be started, my code calls the producer, which puts a message on the queue. The consumer is subscribed to the queue and gets kicked into action by the message in a separate thread. This approach neatly separates the UI from the queueing mechanism (via the producer) and from the asynchronous process (handled by the consumer).
Note this was a Java 5, Spring-configured environment running on a Tomcat server on developer machines, and deployed to Weblogic on the test/production machines.
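Stripped of the Spring wiring, the core of that setup looks roughly like this with the plain JMS API and an ActiveMQ connection factory; the broker URL, queue name and payload are placeholders:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

import org.apache.activemq.ActiveMQConnectionFactory;

public class JobQueueSketch {

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ActiveMQConnectionFactory("vm://localhost"); // embedded broker
        Connection connection = factory.createConnection();
        connection.start();

        // Consumer side: gets kicked into action on its own thread when a message arrives.
        Session consumerSession = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Queue queue = consumerSession.createQueue("background.jobs"); // placeholder queue name
        consumerSession.createConsumer(queue).setMessageListener(new MessageListener() {
            public void onMessage(Message message) {
                try {
                    String jobId = ((TextMessage) message).getText();
                    // ...perform the long-running backend work for jobId here...
                } catch (Exception e) {
                    // log / retry / dead-letter as appropriate
                }
            }
        });

        // Producer side: this is what the UI layer calls to trigger a job.
        Session producerSession = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageProducer producer = producerSession.createProducer(producerSession.createQueue("background.jobs"));
        producer.send(producerSession.createTextMessage("job-42")); // placeholder payload
    }
}
```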
Your problem stems from the fact that you are trying to go against the 'single response per request' model in J2EE, and have the end-user's page dynamically update as the backend task executes.
Unless you want to go down the road of introducing an Ajax-based solution, you will have to force the rendered page in the user's browser to 'poll' the server for information periodically until the back-end task completes.
This can be achieved by:
1. When the J2EE container receives the request, spawn a thread which takes a reference to the session object (which will be used to store the output of your script).
2. Have the response servlet write an HTML page containing a JavaScript function that reloads the page from the server at regular intervals (every 10 seconds or so).
3. On each request, poll the session object to display the output stored by the thread spawned in step 1.
[Clean-up logic can be added to delete the stored content from the session once the thread completes, if needed; you can also set additional flags in the session to mark state transitions of your script's execution.]
This is one way to achieve what you want. It isn't the most elegant of approaches, but that is essentially down to needing to asynchronously update your page content from the server within a request/response model.
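A rough servlet sketch of the spawn-and-poll approach described above; the script path is a placeholder, and a StringBuffer is used because it is safe to append from the background thread while requests read it:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class ScriptProgressServlet extends HttpServlet {

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        HttpSession session = req.getSession();
        StringBuffer output = (StringBuffer) session.getAttribute("scriptOutput");

        if (output == null) {
            // First hit: start the script and stash a thread-safe buffer in the session.
            final StringBuffer buffer = new StringBuffer();
            session.setAttribute("scriptOutput", buffer);
            new Thread(new Runnable() {
                public void run() {
                    try {
                        Process p = Runtime.getRuntime().exec("/opt/legacy/longRunning.sh"); // placeholder
                        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
                        String line;
                        while ((line = r.readLine()) != null) {
                            buffer.append(line).append('\n');
                        }
                        buffer.append("--- script finished ---");
                    } catch (IOException e) {
                        buffer.append("script failed: ").append(e.getMessage());
                    }
                }
            }).start();
            output = buffer;
        }

        // Re-render the page every 10 seconds with whatever output we have so far.
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><head>");
        out.println("<script>setTimeout(function(){ location.reload(); }, 10000);</script>");
        out.println("</head><body><pre>" + output + "</pre></body></html>");
    }
}
```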
There are other ways to achieve this, but it really depends on how inflexible your constraints are. I have heard of Direct Web Remoting (although I haven't played with it yet); it might be worth taking a look at Developing Applications using Reverse-Ajax.