How do I stop Safari from caching my Servlet response? - java

I'm having trouble testing a web app with Safari. My app returns wave audio data. The problem happens when I change the application and hit it again from Safari: Safari caches the original response, so no matter how many times I hit refresh it seems like I haven't updated anything. I can almost get around this with a force refresh in Firefox, but because I'm having trouble generating the wave headers using the javax.sound API, Firefox only plays the first second of the audio returned. A few weeks ago I tried setting HTTP headers in my servlet to prevent caching, but I don't think I was setting them correctly. (What are the headers for browser cache control?) This is becoming a real pain and I'm looking for any ideas, comments, or alternative approaches. I'm getting ready to try again, but I figured I'd ask here in the interim to see if someone can provide help.

I found my answer. I just added a combination of "Pragma: no-cache", "Cache-Control: no-cache", and "Expires: -1" to my response headers, and the caching problem is gone. I still need a solution for my broken wave headers; I really hate to engineer a wave/IO package for something that should be straightforward.
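For reference, a minimal sketch of setting those headers from a servlet (the servlet name and content type here are only illustrative, and this assumes the audio is written directly to the response):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class WaveServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // HTTP/1.1 clients and proxies: do not serve this response from cache.
        resp.setHeader("Cache-Control", "no-cache");
        // HTTP/1.0 fallback.
        resp.setHeader("Pragma", "no-cache");
        // Invalid/expired date; browsers treat the response as already stale.
        resp.setHeader("Expires", "-1");

        resp.setContentType("audio/wav");
        // ... write the wave bytes to resp.getOutputStream() here ...
    }
}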

Related

Truncation of web socket chunk while uploading a file

I have tried searching for this strange issue I am facing but could not find anything on the web.
Here is what I am trying to do:
Upload a file from the user's browser to a Play server running in a different environment.
Here is the issue I am facing:
The chunk is getting truncated before it reaches the Play server.
Observations:
The chunk is created properly on the client side in JavaScript, and websocket.send() passes the correct chunk to Play.
On the Play server, the chunk arrives as a String event object that is truncated.
The very strange thing about this problem is that it happens only from some machines/networks; from all others it works fine.
When trying different chunk sizes, we observed that with smaller chunks many of the initial chunks are received properly and later ones fail.
We have also tried bypassing firewalls and proxies on some networks to check what happens when there are no such restrictions, but it still fails.
Please give your input on anything that could help me debug and fix this. I can provide whatever additional details you need; I am not pasting code because it works on the majority of machines and networks and fails only on a few, so it does not seem to be a code issue.
PS: This question could draw many opinion-based answers, so to all SO users: I just need help figuring out what could be going wrong here, so please do not flag this as inappropriate.
I have figured it out: the issue was with the latest update of Google Chrome. I downloaded Chrome from here (version 37) and it started working fine.
I later learned that this was caused by an implementation change in Chrome 38: a single message may now be sent across multiple frames. Previously the whole message was transferred in a single frame, so the server-side implementation also needs to change to handle this.
I was using an older version of the Play framework that did not have this multi-frame handling, so it was breaking.
After updating Play to 2.2.3 it started working properly, as multi-frame handling was implemented in that version. Some useful links are below, followed by a rough sketch of frame reassembly.
Issue With Latest Chrome
Play Changelog
Changes for Continuation frame handling for WebSockets in Play 2.2.3
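As a rough illustration of what "handling multiple frames for a single message" means on the server, here is a minimal sketch using the standard JSR-356 javax.websocket API (not Play-specific; the endpoint path and method names are made up). The container delivers each fragment as it arrives, and the endpoint buffers fragments until the final one:

import javax.websocket.OnMessage;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/upload")
public class UploadEndpoint {

    // Fragments of the message currently being received.
    private final StringBuilder buffer = new StringBuilder();

    // The boolean "last" parameter tells the container to deliver partial
    // messages (continuation frames) instead of waiting for the whole message.
    @OnMessage
    public void onMessage(String fragment, boolean last, Session session) {
        buffer.append(fragment);
        if (last) {
            handleCompleteChunk(buffer.toString());
            buffer.setLength(0);
        }
    }

    private void handleCompleteChunk(String chunk) {
        // Process the fully reassembled upload chunk here.
    }
}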

Logout flow in play! framework 2.0, java

I ran into this problem yesterday and haven't been able to find a solution to it.
Once a user logs out how do I prevent them from hitting the back button and loading the cached, previous page?
I ran into this post and read the suggested article, but I'm unsure if any of these suggestions are the correct way to handle this problem.
I even ran the sample apps from Play!, notably the Forms app, and it has the same problem. I thought their apps would at least show how to handle this.
Any help would be appreciated. Thanks.
You can disable caching in the response headers (no-cache or must-revalidate) for every page that needs to check the credentials.
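In Play 2.x Java, a minimal sketch might look like the following (the action name, redirect target, and session key are made up; the point is just to set the headers on every authenticated page so the browser re-requests it after Back):

import play.mvc.Controller;
import play.mvc.Result;

public class Secured extends Controller {

    public static Result dashboard() {
        // Tell the browser not to cache this page, so Back after logout
        // triggers a fresh request (and a redirect to the login page).
        response().setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
        response().setHeader("Pragma", "no-cache");
        response().setHeader("Expires", "0");

        if (session("email") == null) {
            return redirect("/login");
        }
        return ok("protected content");
    }
}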

Interacting with an AJAX site from Java

I am trying to download the contents of a site. It is a Magento site where you can filter results by selecting properties in the sidebar; see zennioptical.com for a good example.
Using zennioptical.com as the example, I need to download all the rectangular glasses, or all the plastic ones, etc.
So how do I send a request to the server to return only the rectangular frames, etc.?
Thanks so much
The basic answer is that you need to make an HTTP GET request with the correct query params. I'm not totally sure how you are trying to do this based on your question, so here are two options.
If you are trying to do this from JavaScript, you can look at this question. It has a bunch of answers that show how to perform AJAX GETs with the built-in XMLHttpRequest or with jQuery.
If you are trying to download the page from a Java application, this really doesn't involve AJAX at all. You'll still need to do a GET request, but now you can look at this other question for some ideas.
Whether you are using JavaScript or Java, the hard part is going to be figuring out the right URLs to query. If you are scraping someone else's site, you will have to see what URLs your browser requests when you filter the results. One of the easiest ways to see that info in Firefox is the Web Console, found at Tools->Web Developer->Web Console. You could also download something like Wireshark, which is a good tool to have around but probably overkill for what you need.
EDIT
For example, when I clicked the "rectangle frames" option at zenni optical, this is the query that fired off in the Web Console:
[16:34:06.976] GET http://www.zennioptical.com/?prescription_type=single&frm_shape%5B%5D=724&nav_cat_id=2&isAjax=true&makeAjaxSearch=true [HTTP/1.1 200 OK 2328ms]
You'll have to do a sufficient number of these to figure out how to generate the URLs to get the results you want.
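If you end up issuing those requests from Java, a bare-bones sketch with HttpURLConnection might look like this (the URL is the one captured above; the X-Requested-With header is an assumption, since some sites only return the AJAX variant of the page when it is present):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class FilteredDownload {
    public static void main(String[] args) throws Exception {
        // Query string copied from the Web Console capture above.
        String url = "http://www.zennioptical.com/?prescription_type=single"
                + "&frm_shape%5B%5D=724&nav_cat_id=2&isAjax=true&makeAjaxSearch=true";

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("GET");
        // Many sites check this header to decide whether to return the AJAX fragment.
        conn.setRequestProperty("X-Requested-With", "XMLHttpRequest");

        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line).append('\n');
            }
        }
        System.out.println(body);
    }
}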
DISCLAIMER
If you are downloading someone else's data, it would be best to check with them first. The owner of the server may not appreciate what they might consider stealing their data/work. And depending on how you use the data you pull down, you could be venturing into all sorts of ethical issues... Then again, if you are downloading from your own site, go for it.

Java and HttpUnit having GZip bugs

I've been trying to find workarounds to this for nearly 6 straight hours now but to no avail.
I have a simple HttpUnit program that signs in to a website. The problem is that I get this error when I finally submit the login form:
Exception in thread "main" java.io.EOFException: Unexpected end of ZLIB input stream
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:223)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:141)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:92)
at com.meterware.httpunit.WebResponse.readFromStream(WebResponse.java:967)
..and so on..
It appears that a bug in the way the JDK handles gzip streams was reported back in 2002 (!). Java apparently still has this bug, and the only workaround is to tell HttpUnit not to accept gzip-encoded data, like so:
WebConversation.getClientProperties().setAcceptGzip(false);
Unfortunately, I cannot do this. For the sign-in to be successful, I need to have gzip enabled; otherwise I'm just redirected to the generic home page (without being signed in).
I suspect the only way to make this work would be to change the source code of HttpUnit and handle the EOF exception in its gzip parsing. I desperately need help. Does anyone have a solution to this?
If not, could you suggest an equally simple Java API, like HttpUnit, for tasks such as filling in forms and signing in?
I suspect the only way to make this work would be to change the source
code of HttpUnit and handle the EOF exception in its gzip parsing.
This is often a solution for developers when there are bugs in third party libraries. If you know what the problem is in HttpUnit, why not fix it locally? You can even push the fix back to the project and hope they adopt it.
If not, then could you suggest some equivalently simple API in Java, such as HttpUnit, for tasks such as filling forms and signing in.
If you mean a library to traverse websites, there are plenty. If you want a "headless" HTTP client, try Apache HttpComponents HttpClient. If you want something that actually uses a browser, try Selenium.
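For the headless route, a minimal Apache HttpClient (4.x) fetch looks roughly like the following (the URL is a placeholder; whether the default cookie handling is enough for your sign-in flow depends on the site):

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class FetchPage {
    public static void main(String[] args) throws Exception {
        // The default client keeps cookies across requests made with the same instance.
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpGet get = new HttpGet("https://example.com/login");
            try (CloseableHttpResponse response = client.execute(get)) {
                String html = EntityUtils.toString(response.getEntity());
                System.out.println(html);
            }
        }
    }
}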
It seems this problem was fixed in HttpUnit 1.7 (I was using 1.6).
Anyway, I'm shifting to HttpComponents as suggested by #ShaggyFrog, since I'm now running into bigger, unknown problems with cookies and sessions.
Edit: It turned out HttpComponents was overly complicated; at one point I had to handle HTTP 302 redirects manually. So I decided to switch to HtmlUnit, which is extremely similar to HttpUnit but without any cookie or gzip problems. Worked like a charm.
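For anyone who lands here later, the HtmlUnit sign-in boiled down to something like the following sketch (the URL, form name, and field names are placeholders for the real site's values):

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlForm;
import com.gargoylesoftware.htmlunit.html.HtmlPage;
import com.gargoylesoftware.htmlunit.html.HtmlSubmitInput;

public class SignIn {
    public static void main(String[] args) throws Exception {
        // HtmlUnit manages cookies and gzip decoding on its own.
        WebClient webClient = new WebClient();

        HtmlPage loginPage = webClient.getPage("https://example.com/login");

        HtmlForm form = loginPage.getFormByName("login");
        form.getInputByName("username").setValueAttribute("me@example.com");
        form.getInputByName("password").setValueAttribute("secret");

        HtmlSubmitInput submit = form.getInputByName("submit");
        HtmlPage home = submit.click();
        System.out.println(home.getTitleText());
    }
}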

System requirements for Cometd/Bayeux Usage on Android

I'm trying to implement a Cometd/Bayeux server on Android using iJetty. The Jetty implementation itself works just fine serving static pages along with servlets. I am trying to up the ante a bit and create a Bayeux application on the phone, but I'm having some trouble. I can hit the page that has the Dojo cometd scripts on it, but I am unable to subscribe to the channel. When I watch Firebug/Chrome developer tools, I see a series of POSTs/GETs that last a couple of milliseconds (~14 ms). However, when I run a cometd application on a normal machine, the POSTs/GETs last several seconds (~14 s) before timing out and reopening the connection. This second scenario makes sense to me given my understanding of how continuation in HTTP works. So I'm thinking that something is not allowing those connections to hang open, prematurely returning a value and consequently closing the connection. I would post my source, but I'm not sure what to post short of posting everything... (it is open source, though, so if you want to have a look it's at http://webtext-android.googlecode.com).
So my question is: does anybody think there could be some underlying limitation imposed by the Android system that is preventing these servlets from working? Are there assumptions made by the Jetty Bayeux implementation with regard to the underlying system? Or is it more likely that I somehow have a bad implementation of the ContinuationCometdServlet? I should note that all of the POSTs/GETs from the client return 200 OK messages, so I'm not inclined to think that the Android system is simply terminating the connection.
I know this is a bit off the wall and I'm definitely trying to do something a bit out of the ordinary but any suggestions or tips would be greatly appreciated.
In case anybody discovers this and has similar problems (this applies to all cometd implementations regardless of host), I found that the issue was with using the Google-hosted JS library. For some reason, the Dojo scripts I was loading from Google (1.4) didn't have a valid implementation of cometd. I switched my Dojo script to the one used by the jetty-1.6.23 example and it works perfectly.
