I don't know a lot about this area, so please excuse me if my question is vague or stupid.
I have a webpage which uses JavaScript and AJAX to display live data. Every few seconds, a request is made, a JSON response is returned, and the data on the webpage is updated.
What I want to do is create a program in Java that will basically capture every response and interpret the data. I have found libraries which handle the JSON format already. However, I don't know how to get the response using Java.
So for example, a live news feed. I would like to log the data as it appears.
Thanks
Basically what you need to do is make an HTTP GET request to the page that hosts the JSON. You can do this by using a Java HTTP client. The one in the link is from Apache Commons, but I believe there is actually one built into Java that is relatively straightforward to use. When you make a request, it will return a result object that you can then use to access the response data and information such as response headers.
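As a rough sketch, here is what polling that endpoint could look like with the HttpClient built into Java 11+; the endpoint URL and polling interval are placeholders, not taken from the question:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: poll the same JSON endpoint the page's AJAX code calls and print each response.
public class JsonPoller {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/live/feed.json")) // placeholder endpoint
                .header("Accept", "application/json")
                .GET()
                .build();

        while (true) {
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + ": " + response.body()); // hand the body to your JSON library
            Thread.sleep(5_000); // poll every few seconds, like the page does
        }
    }
}

The JSON body comes back as a plain string, which you can then feed to whichever JSON library you already found.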
Is it possible to send extra data attached to an HTTP response via Java or PHP?
My website is a homework platform: one user enters homework entries into a database, and all users can then see them on the website. The current page load is very inefficient, as the browser makes two requests for everything to load: one for the index file and one for the homework data. For the homework request the client also sends the user's settings to the server, based on which the returned homework list is generated by a PHP script.
Now I wonder if it is possible to combine those two requests into one. Is it maybe possible to detect the HTTP request with Java or PHP on the server, read the cookies (where the settings are saved), then get the homework entries from the database and send that data attached to the HTTP response to the client? Or, even better, first return only the index file as soon as possible and the homework data afterwards as a second response, because the client needs some time to parse the HTML and build the DOM tree anyway, during which it can't show the homework.
While browsing the web I stumbled across terms like "Server-side rendering" and "SPDY", but I don't know if those are the right starting points.
Any help is highly appreciated, as I'm personally very interested in a solution and it would greatly improve the load time of my website.
A simple solution to your problem is to initialize your data in the index file.
You would create a JavaScript object and embed it right into the HTML rendered by your server. You could place this object in the global namespace (for example under window.initData), so that it can be accessed by the code in your script.
<script>
window.initData = {
    someVariable: 23
}; // you could use json_encode if you use PHP, or Jackson if you use Java
</script>
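For instance, if the page were rendered by a Java servlet, a rough sketch of producing that snippet with Jackson might look like the following; the class and method names here are only illustrative, not part of the original answer:

import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.IOException;
import java.io.PrintWriter;
import java.util.Map;

// Hypothetical helper: serialize the initial data and embed it in the rendered page.
public class InitDataWriter {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Writes a <script> block that exposes the data under window.initData.
    public static void writeInitData(PrintWriter out, Map<String, Object> data) throws IOException {
        String json = MAPPER.writeValueAsString(data); // e.g. {"someVariable":23}
        out.println("<script>");
        out.println("window.initData = " + json + ";");
        out.println("</script>");
    }
}

Your client-side code can then read window.initData immediately, without a second request.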
However, it is not a huge problem if your data is fetched in a separate server request. Especially when it takes more time to retrieve the data from the database or web services, you can provide a better user experience by first fetching the static content very quickly and displaying a spinner while the (slower) data is being loaded.
I am trying to parse a website, specifically this one. It does not provide an API for that, like it does for BF4 or other titles, but the owner said that I should just parse the data.
The problem I have is that jsoup retrieves the data, but if you look carefully, the website makes a new HTTP GET and only after that is the search completed.
From what I could gather, I think it sends some parameters in the header too.
If I just use jsoup to call that link, I get some data, but where the search results should be I get the message:
Please activate Javascript to see the search results.
Is there a way to get the data? I really need this; any help is very much appreciated.
Please help
You need a JavaScript-capable client, e.g. HtmlUnit or Selenium.
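A minimal sketch with HtmlUnit, assuming a reasonably recent version with the classic com.gargoylesoftware package names; the URL and wait time are placeholders, not taken from the question:

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

// Sketch: load a page, let its JavaScript run, then read the resulting HTML.
public class JsCapableFetch {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            webClient.getOptions().setJavaScriptEnabled(true);
            webClient.getOptions().setThrowExceptionOnScriptError(false);

            HtmlPage page = webClient.getPage("https://example.com/search?q=term"); // placeholder URL
            webClient.waitForBackgroundJavaScript(5_000); // give the AJAX search call time to finish

            System.out.println(page.asXml()); // HTML after the scripts have run
        }
    }
}

Unlike jsoup, HtmlUnit actually executes the page's scripts, so the search results filled in by the AJAX call end up in the HTML you read.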
In my GWT application I am retrieving XML data from a REST server. I am using the Piriti XML parser https://code.google.com/p/piriti/wiki/Xml to deserialize the objects and display them in a table. As long as we return up to 1000 records everything is fine, but with a big result the page just hangs and the browser asks the user to stop the script running in the background. Could someone please help me find the best way to handle big data in GWT, or, more precisely, the best approach to parsing a big XML file in GWT?
Thanks a lot for all your suggestions.
The problem is that parsing a big XML document slows down the browser, and you need enough memory to hold the whole DOM plus your mapped objects in memory. The only solution is to avoid that situation. You have to adapt your REST service to be able to send only small chunks of data to the browser. So if you already have a paged table, you only retrieve the data for the first page at the beginning. If the user wants to change the page, you do another REST call to retrieve the data for the next page.
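A minimal sketch of such a paged request with GWT's RequestBuilder; the URL pattern and the page/size query parameters are assumptions your REST service would have to support:

import com.google.gwt.http.client.Request;
import com.google.gwt.http.client.RequestBuilder;
import com.google.gwt.http.client.RequestCallback;
import com.google.gwt.http.client.RequestException;
import com.google.gwt.http.client.Response;

// Fetch one page of records at a time instead of the whole XML document.
public class PagedLoader {

    public void loadPage(int page, int pageSize) {
        String url = "/rest/records?page=" + page + "&size=" + pageSize; // assumed query parameters
        RequestBuilder builder = new RequestBuilder(RequestBuilder.GET, url);
        try {
            builder.sendRequest(null, new RequestCallback() {
                @Override
                public void onResponseReceived(Request request, Response response) {
                    // Parse only this page's XML with Piriti and update the table.
                    String xml = response.getText();
                    // reader.read(xml) ... (Piriti call omitted)
                }

                @Override
                public void onError(Request request, Throwable exception) {
                    // Handle the failure, e.g. show an error message.
                }
            });
        } catch (RequestException e) {
            // sendRequest can throw if the request cannot be initiated
        }
    }
}

Each call only pulls one page's worth of XML, so the browser never has to hold the whole dataset at once.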
If you cannot change the REST service itself, you can create another server-side service (on a server controlled by you) as a proxy. On first access you call the original REST service, store the XML on your own server, and allow the client to retrieve only parts of that XML.
Okay, this is going to be hard to explain but here goes nothing:
Lately I've been working a lot with POST and GET requests, but now I want to send a POST/GET request to this site: http://www.mangareader.net/
The main problem I'm facing is that I want to use the search function of this site. Normally I would send a GET request or something like that, but apparently this search function doesn't work that way; it works with some kind of JavaScript code? I don't know exactly what it is, but try typing "Elf" in the search bar, and you'll get a drop-down list of all the mangas (Japanese comics) with the word "Elf" in them. I want to know what this process is called, and how I can implement it in a Java program. For instance:
Logging into a website
-> Send an HTTP POST request. Get HTML data back. Process the HTML data. Get the information I need from the HTML source.
Using a search function on a regular site like google.com or bing.com
-> Send a GET request. Get HTML data back. Process the HTML data. Get the information I need from the HTML source.
Using the search function on mangareader.net
-> ??????????
How would I achieve this? A theoretical explanation is enough, but a practical example would be great as well.
If you analyse the JavaScript that runs when you search, you get the following:
GET http://www.mangareader.net/actions/search/?q=test&limit=100 [HTTP/1.1 200 OK 113ms]
In other words, you can search on the site with a GET request to
http://www.mangareader.net/actions/search/?q=test&limit=100
where the q parameter contains your search word.
This site uses an AJAX call to get a pipe-separated ( | ) list from the page
/actions/search?q=term
It parses this list using a string split and then turns it into a combobox.
I have little experience with Java, but a simple GET request to this page should work;
replace {term} with your search term:
http://www.mangareader.net/actions/search/?q={term}&limit=100
You can use the Chrome network monitor to see it for yourself.
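A minimal Java sketch of that GET request using the built-in HttpURLConnection; the search term "Elf" is hard-coded here just for illustration:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Sketch: call the site's search endpoint and print the pipe-separated result list.
public class MangaSearch {
    public static void main(String[] args) throws Exception {
        String term = URLEncoder.encode("Elf", StandardCharsets.UTF_8.name());
        URL url = new URL("http://www.mangareader.net/actions/search/?q=" + term + "&limit=100");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Each line is a pipe-separated record; split on "|" to get the fields.
                String[] fields = line.split("\\|");
                System.out.println(String.join(" / ", fields));
            }
        } finally {
            conn.disconnect();
        }
    }
}

The exact layout of the pipe-separated fields is up to the site, so inspect one response in the network monitor before deciding which field you need.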
I want to write an application in Java that will communicate with a Google App Engine app written in Go, sending and receiving dynamic data. The data is not human-readable (as in, not ASCII, Unicode or the like) and ranges from a couple of bytes to about 1 MB.
I am wondering if it is possible to send such data to and from GAE using the POST method directly, or is it better to just encode it as a hex dump and transfer it as text (thus increasing its size a couple of times)?
Yes, this is possible, of course. Just like an HTTP response, an HTTP request can contain a payload of any sort (unless it's a GET or another method that doesn't permit a body); just set the Content-Type appropriately and send the data in the body of the HTTP request.
If people can POST 10 MB photos to Facebook, then I don't see why you can't do that with your data :)
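As a rough sketch from the Java side, a binary POST with the HttpClient built into Java 11+ could look like this; the endpoint URL is a placeholder, not from the question:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: send raw bytes in the request body and read raw bytes back, no hex encoding needed.
public class BinaryPost {
    public static void main(String[] args) throws Exception {
        byte[] payload = new byte[]{0x00, 0x01, 0x02, (byte) 0xFF}; // your binary data, up to ~1 MB

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://your-app.appspot.com/data")) // placeholder GAE endpoint
                .header("Content-Type", "application/octet-stream")
                .POST(HttpRequest.BodyPublishers.ofByteArray(payload))
                .build();

        HttpResponse<byte[]> response = client.send(request, HttpResponse.BodyHandlers.ofByteArray());
        System.out.println("Got " + response.body().length + " bytes back");
    }
}

On the Go side the handler would simply read the request body as raw bytes, so nothing is gained by hex-encoding the data first.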