I'm writing because I cannot solve the following problem.
I have a servlet that processes some information.
In the response I put both text and binary content.
How do I get two responses, and thus two HTML pages, from the same request?
Is such a thing possible?
The first response should continue to do what it does now, while the second would appear as a popup window prompting the user to save an image.
Are there easier ways to achieve the same result?
Many thanks in advance
As answered in your previous question, you can send only one HTTP response per HTTP request. This is not a servlet restriction, it is an HTTP restriction. The server is not supposed to send data to the client unasked. That would have made the Internet extremely annoying and unusable, as if you were being bombarded with a continuous stream of spam.
To be able to return two responses, the client has to fire two requests itself. If you want to do this automagically on a single click, then you can (ab)use some JavaScript for this. E.g.
<a href="page.jsp" onclick="window.open('downloadservlet/file.ext');">click</a>
This will fire two requests: one to page.jsp using normal HTML in the current window, and another one to downloadservlet/file.ext in a new window using JavaScript. That window will however disappear if the response has Content-Disposition: attachment, as answered in your previous question.
You only need to take into account that this won't work when the client has JavaScript disabled.
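For reference, the download servlet behind downloadservlet/file.ext might look roughly like this. This is a minimal sketch only: the storage path, servlet mapping, and class name are assumptions, not taken from the question.

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Streams the requested file back with Content-Disposition: attachment,
// which makes the browser show a "save as" dialog instead of rendering it.
@WebServlet("/downloadservlet/*")
public class DownloadServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Hypothetical storage directory; request.getPathInfo() is e.g. "/file.ext".
        File file = new File("/path/to/files", request.getPathInfo());
        response.setContentType(getServletContext().getMimeType(file.getName()));
        response.setContentLengthLong(file.length());
        response.setHeader("Content-Disposition",
                "attachment; filename=\"" + file.getName() + "\"");
        Files.copy(file.toPath(), response.getOutputStream());
    }
}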
Is it possible to send extra data attached to an HTTP response via Java or PHP?
My website is a homework platform: one user enters homeworks into a database, and all users can then see the homeworks on the website. The current loading is very inefficient, as the browser makes two requests for everything to load: one for the index file and one for the homeworks. For the homeworks request the client also sends the user's settings to the server, based on which the returned homeworks are generated by a PHP script.
Now I wonder: is it possible to combine those two requests into one? Is it maybe possible to detect the HTTP request with Java or PHP on the server, read the cookies (where the settings are saved), then get the homeworks from the database and send the data attached to the HTTP response to the client? Or, even better, first return only the index file as soon as possible and the homework data afterwards as a second response, because the client needs some time to parse the HTML and build the DOM tree, during which it can't show the homeworks anyway.
While browsing the web I stumbled across terms like "Server-side rendering" and "SPDY", but I don't know if those are the right starting points.
Any help is highly appreciated, as I'm personally very interested in a solution and it would greatly improve the load time of my website.
A simple solution to your problem is to initialize your data in the index file.
You would create a JavaScript object and embed it right into the HTML rendered by your server. You could place this object in the global namespace (such as under window.initData), so that it can be accessed by the code in your script.
<script>
window.initData = {
    someVariable: 23
}; // you could use json_encode if you use PHP, or Jackson if you use Java
</script>
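If the back end were a Java servlet, for example, the embedded object could be produced with Jackson along these lines. This is a rough sketch: the placeholder data, the attribute name, and the JSP path are assumptions for illustration.

import java.io.IOException;
import java.util.List;
import java.util.Map;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.fasterxml.jackson.databind.ObjectMapper;

// Renders the index page with the homework data already embedded,
// so the browser does not need a second request to fetch it.
public class IndexServlet extends HttpServlet {
    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Placeholder data; in reality this would come from the database,
        // filtered by the settings read from the user's cookies.
        List<Map<String, Object>> homeworks = List.of(
                Map.of("subject", "Math", "due", "2024-01-15"));
        request.setAttribute("initData", mapper.writeValueAsString(homeworks));
        request.getRequestDispatcher("/WEB-INF/index.jsp").forward(request, response);
    }
}

The view would then print something like window.initData = ${initData}; inside a script tag, so the client-side code can read the data immediately without a second round trip.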
However, it is not a huge problem if your data is fetched in a separate request. Especially when it takes more time to retrieve the data from the database or web services, you can provide a better user experience by first serving the static content very quickly and displaying a spinner while the (slower) data is being loaded.
I visit this link daily to find my lectures at school. Every time I have to scroll down the list to find my own class, and then post the form so I can view the result. Is there any way I could make a direct link to the preferred content? I'm looking to create a simple WebView app in Android showing individual form categories.
EDIT: Really, any method for converting the aspx info into another format would do the trick. Preferably a direct link to each form item. But if I can convert every single item to a .xml file or anything else, I could work with it. It has to be automated, though.
You can capture the outgoing request and write a simple application to POST the data back to the page. The WebClient class is useful for this.
Looking at the request in Chrome's developer tools, I see that the form posts back to itself and then redirects to the result page. Presumably, you should POST the form data to the initial page, which will then cause it to perform the redirect.
The form contains a large amount of ViewState data which may or may not need to be included in the request to make it work.
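If you end up doing this from Java (e.g. for the Android app) rather than with WebClient, the same POST can be made with HttpURLConnection. A rough sketch; the URL and form field names below are placeholders, and the real ones (including the __VIEWSTATE value) have to be copied from the captured request.

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FormPoster {

    // Posts an application/x-www-form-urlencoded body and returns the response body.
    public static String post(String pageUrl, String formBody) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(pageUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(formBody.getBytes(StandardCharsets.UTF_8));
        }
        return new String(conn.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // Placeholder field names and URL; take the real ones from the request
        // captured in the browser's developer tools.
        String body = "__VIEWSTATE=" + URLEncoder.encode("...", StandardCharsets.UTF_8)
                + "&classDropDown=" + URLEncoder.encode("myClass", StandardCharsets.UTF_8);
        System.out.println(post("https://example.com/schedule.aspx", body));
    }
}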
A completely different approach would be to find a browser extension, such as a macro recorder, which emulates your actions. This plugin (I haven't tried it myself) appears to do exactly that.
Okay, this is going to be hard to explain but here goes nothing:
Lately I've been working a lot with POST and GET requests, but now I want to send a POST/GET request to this site called: http://www.mangareader.net/
The main problem I'm facing is that I want to use the search function of this site. Normally I would send a GET request or something like that, but apparently this search function doesn't work that way; it works with some kind of JavaScript code. I don't know exactly what it is, but try typing "Elf" in the search bar, and you'll get a drop-down list of all the mangas (Japanese comics) with the word "Elf" in them. I want to know what this process is called, and how I can implement it in a Java program. For instance:
Login into a website
- > Send an HTTP post request. Get HTML data back. Process the HTML data. Get the information I need from the HTML source.
Using a search function on a regular site like google.com or bing.com
- > Send get request. Get HTML data back. Process the HTML data. Get the information I need from the HTML source.
Using search function on mangareader.net
- > ??????????
How would I achieve this? A theoretic explanation is enough, but a practical example would be great as well.
If you analyse the JavaScript that runs when you search, you get the following:
GET http://www.mangareader.net/actions/search/?q=test&limit=100 [HTTP/1.1 200 OK 113ms]
In other words, you can search on the site with a GET request to
http://www.mangareader.net/actions/search/?q=test&limit=100
where the q parameter contains your search word.
This site uses an AJAX call to get a pipe-separated ( | ) list from the page
/actions/search?q=term
It parses this list using string split and then turns it into a combo box.
I have little experience with Java, but a simple GET request to this page should work.
Replace {term} with your search term:
http://www.mangareader.net/actions/search/?q={term}&limit=100
You can use Chrome's network monitor to see it for yourself.
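A rough Java sketch of that request, using the HttpClient that ships with Java 11+. The URL format is taken from the answers above, while the exact layout of the pipe-separated fields is an assumption.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class MangaSearch {
    public static void main(String[] args) throws Exception {
        String term = "Elf";
        String url = "http://www.mangareader.net/actions/search/?q="
                + URLEncoder.encode(term, StandardCharsets.UTF_8) + "&limit=100";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // Each line of the body is assumed to be one result, with its fields
        // separated by the pipe symbol described above.
        for (String line : response.body().split("\n")) {
            String[] fields = line.split("\\|");
            System.out.println(fields[0]);
        }
    }
}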
I understand that this question is asked over and over again, but I want a way to handle back button clicks on the server side (like banking applications).
When the user clicks on the back button, the page should be invalid and the user should be asked to start all over again.
Can you direct me to some tutorials on this?
The simplest way I've seen this solved is as follows.
Every page is served up with a unique ID/token.
That unique ID is always submitted when submitting any forms, and tracked on the server as being "used".
If the user ever clicks "back" and re-submits the same form, the server checks the unique ID... notices that it is a duplicate and then ignores the submission.
Note this won't physically stop a user from going "back", but if the last action was "transfer $1,000,000!", the user won't accidentally transfer two million.
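A minimal sketch of that idea in a servlet, with the token kept in the session rather than a database for brevity; the class name, parameter name, and JSP path are made up for illustration.

import java.io.IOException;
import java.util.UUID;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class TransferServlet extends HttpServlet {
    private static final String TOKEN_ATTR = "formToken";

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Issue a fresh token with every rendering of the form.
        String token = UUID.randomUUID().toString();
        req.getSession().setAttribute(TOKEN_ATTR, token);
        req.setAttribute("token", token); // printed as a hidden field in the form
        req.getRequestDispatcher("/WEB-INF/transfer.jsp").forward(req, resp);
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        HttpSession session = req.getSession();
        String expected = (String) session.getAttribute(TOKEN_ATTR);
        String submitted = req.getParameter("token");
        if (expected == null || !expected.equals(submitted)) {
            // Token missing or already used: a duplicate/back-button submission.
            resp.sendError(HttpServletResponse.SC_CONFLICT, "Duplicate submission");
            return;
        }
        session.removeAttribute(TOKEN_ATTR); // mark the token as used
        // ... perform the actual transfer here ...
    }
}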
Make pages not cacheable.
Track the user's route server side, perhaps in session data.
Check whether she is requesting a previously visited page that she isn't supposed to revisit via "back", and handle it accordingly.
A Filter is the best place to do this; see the sketch below.
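A rough sketch of such a Filter that marks responses as non-cacheable; the URL pattern and class name are assumptions.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletResponse;

// Marks every response as non-cacheable so the browser has to hit the
// server again instead of replaying the page from cache on "back".
@WebFilter("/*")
public class NoCacheFilter implements Filter {
    @Override
    public void init(FilterConfig config) {
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse resp = (HttpServletResponse) response;
        resp.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
        resp.setHeader("Pragma", "no-cache");
        resp.setDateHeader("Expires", 0);
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
    }
}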
Instruct the page to use no cache.
Add the following to the head element of the page:
<meta http-equiv="Pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
There are two problems you need to solve here.
The first is how browsers typically handle the back button. You should use a POST request to get to the page that the back button should not have access to. Most browsers will use a local cache for GET requests, so if you do a GET, your server simply won't be accessed at all. A POST request will, however, typically perform a new request. Many browsers will also warn the user and show a dialog box saying, for example, "Are you sure you want to send the form again?". So by using a POST, you increase the likelihood that every load of that page will perform a new request to your server.
You may also be able to use a GET request where your server returns HTTP headers that makes browsers not load the page from the cache. Experiment with this.
The second problem is to make sure you invalidate duplicate requests server side. The first solution I can think of is to generate a token that you submit with the form and store in a database on every request. If a request is performed with a token that already is stored, you can invalidate the request. Perhaps there are better techniques, but I'll leave that as an exercise for the reader ;)
I also searched for this, and in the end I found a little trick that I think may work for you.
On every page you have a JavaScript function that calls the server with AJAX to check whether this page is available at that time.
On the server side you keep the availability (in the session).
If it is not available, redirect the page as you wish.
I don't know a lot about this area so please excuse me if my question is vague or stupid.
I have a webpage which uses javascript and AJAX to display live data. Every few seconds, a request is made and a JSON response is returned and the data on the webpage is updated.
What I want to do is create a program in Java that will basically capture every response and interpret the data. I have found libraries which handle the JSON format already. However, I don't know how to get the response using Java.
So for example, a live news feed. I would like to log the data as it appears.
Thanks
Basically what you need to do is make an HTTP GET request to the page that hosts the JSON. You can do this with a Java HTTP client. The one in the link is from Apache Commons, but I believe there is also one built into Java that is relatively straightforward to use. When you make a request, it returns a result object that you can then use to access the response body and information such as response headers.
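For instance, with the HttpClient built into Java 11+ you could poll the same endpoint the page's AJAX code calls. A minimal sketch; the URL and polling interval below are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FeedPoller {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Placeholder URL: the same endpoint the page's JavaScript requests.
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://example.com/live-feed.json")).GET().build();

        while (true) {
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            // The body is the raw JSON; hand it to your JSON library for parsing/logging.
            System.out.println(response.body());
            Thread.sleep(5_000); // poll every few seconds, like the page does
        }
    }
}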