I want to create a homepage that shows live stock charts, plus a screener function for some indicators. For that I need live stock data for a few thousand companies. The data should arrive within a really short time period (something like 5 seconds), and the hard part is that I want to receive it all at once in that window and save it in a table for some other functions. I have found some ways to get CSV data from Yahoo or similar sources, but that method is too slow for the time frame I have in mind.
I don't know if there is a general term for this kind of setup, but I would be really happy to get some pointers toward a solution for this problem.
At least for the front end, you'll need to implement an AJAX routine that pulls stock data and populates your page accordingly; you would wrap this routine in a setInterval JavaScript call (set to every 5 seconds). That way you'll get near-real-time updates without having to refresh the page.
As for the back end: I'm not up to date with stock-quote websites, but I wouldn't be surprised if one of the bigger ones had a free API you can access via a server-side language such as PHP. I'm not sure what minimum interval they would provide for free, though. In any case, this server-side script is what your JavaScript/AJAX routine would be calling.
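The setInterval-plus-AJAX shape can be sketched like this. The endpoint name (`/quotes`) and the render function are assumptions, and the fetch function is passed in as a parameter so the sketch is easy to test or swap out:

```javascript
// Minimal polling sketch: fetch once right away, then on a fixed interval.
// fetchQuotes returns a Promise for the quote data; onData consumes it.
function startPolling(fetchQuotes, onData, intervalMs) {
  const tick = () => fetchQuotes().then(onData).catch(console.error);
  tick(); // first fetch immediately
  return setInterval(tick, intervalMs); // then every intervalMs
}

// In the browser, polling every 5 seconds might look like:
// startPolling(() => fetch('/quotes').then(r => r.json()),
//              quotes => renderTable(quotes), // hypothetical render function
//              5000);
```

Because the fetch function is injected, the same routine works whether the server-side script is PHP, Node, or anything else returning JSON.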
You need to implement a WebSocket server, or use a WebSocket API, to fetch the data. There are several APIs that provide stock exchange data. Here is my suggestion:
A Node.js server with Socket.io
A socket client in iOS, Android, JavaScript, Angular, or Java
On the Node.js server you can retrieve stock data from an API (check this link for an example). Once you have the data, you can emit it via the socket, and on the client side you can listen for the events.
Example use case using Bittrex and a Node.js server:
// requires the node-bittrex-api package
var bittrex = require('node-bittrex-api');

bittrex.websockets.client(function() {
    console.log('Websocket connected');
    bittrex.websockets.subscribe(['BTC-ETH'], function(data) {
        if (data.M === 'updateExchangeState') {
            data.A.forEach(function(data_for) {
                console.log('Market Update for ' + data_for.MarketName, data_for);
            });
        }
    });
});
Technology Used:
In the question below, "frontend" refers to Android and "backend" to Node.js.
Constraints:
We have rural users in a developing market, so the internet may be slow and/or jittery/unstable. Because of the jitter, we need a solution where we can use whatever data is transmitted, even if it is not all of it.
We have quite large data (a huge list of objects) which we cannot simply transmit as JSON (via plain REST APIs), because until the whole payload is downloaded we get nothing (we are using Retrofit, and its onResponse is not called until then).
Goal:
1. Convert the list of objects (in the backend) to binary data, so that when we receive the data in the frontend we can access the serialized data without unpacking. We achieve this with FlatBuffers.
2. Transmit this data as a stream when triggered from the frontend. I want to stream the data because I want to use (show in the UI in real time) whatever data (list of objects) the user has received, even if the user gets disconnected during transmission. I am having issues here, as I am unable to achieve this through the REST API / Retrofit combination. I need help with what to use for trigger-based streaming.
3. Reconvert the list of objects in the frontend to Java objects and show them in the user's UI. I am using FlatBuffers here, as it is fast and able to use/deserialize whatever objects are transmitted, with no need for the entire transmission to complete.
I was able to successfully implement steps 1 and 3 of the goal, but I cannot sort out step 2.
Please suggest a good and easy way to achieve this (streaming binary data from the backend to the frontend). It would be better if we could trigger and consume the stream using Retrofit (if possible) in the frontend.
Achieved by a different method:
On the Node side, use the fs.createReadStream function for streaming.
On the Android side, use URLConnection, BufferedReader, and InputStreamReader to consume the stream.
P.S. I didn't find any way to do it via Retrofit.
Is it possible to send extra data attached to an HTTP response via Java or PHP?
My website is a homework platform: one user enters homework into a database, and all users can then see the homework on the website. The current page load is very inefficient, as the browser makes two requests for everything to load: one for the index file and one for the homework data. For the homework request the client also sends the user's settings to the server, based on which the returned homework list is generated by a PHP script.
Now I wonder if it is possible to combine those two requests into one. Is it maybe possible to detect the HTTP request with Java or PHP on the server, read the cookies (where the settings are saved), then get the homework from the database and send that data attached to the HTTP response to the client? Or, even better, first return only the index file as soon as possible and the homework data afterwards as a second response, because the client needs some time to parse the HTML and build the DOM tree, during which it can't show the homework anyway.
While browsing the web I stumbled across terms like "Server-side rendering" and "SPDY", but I don't know if those are the right starting points.
Any help is highly appreciated, as I'm personally very interested in a solution and it would greatly improve the load time of my website.
A simple solution to your problem is to initialize your data in the index file.
You would create a JavaScript object and embed it right into the HTML rendered by your server. You could place this object in the global namespace (for example under window.initData) so that it can be accessed by the code in your script.
<script>
  window.initData = {
    someVariable: 23
  }; // you could use json_encode if you use PHP, or Jackson if you use Java
</script>
However, it is not a huge problem if your data is fetched in a separate server request. Especially when it takes more time to retrieve the data from the database or web services, you can provide a better user experience by first delivering the static content very quickly and displaying a spinner while the (slower) data is being loaded.
In my GWT application I retrieve XML data from a REST server. I am using the Piriti XML parser (https://code.google.com/p/piriti/wiki/Xml) to deserialize the objects and display them in a table. As long as we return up to 1,000 records everything is fine, but with bigger results the browser just hangs and prompts the user to stop the script running in the background. Could someone please help me find the best way to handle big data in GWT, or, more precisely, the best approach to parsing a big XML file in GWT?
Thanks a lot for all your suggestions.
The problem is that parsing a big XML document slows down the browser, and you need enough memory to hold the whole DOM plus your mapped objects. The only solution is to avoid that situation: adapt your REST service so it can send only small chunks of data to the browser. If you already have a paged table, you retrieve only the data for the first page at the beginning; if the user changes the page, you make another REST call to retrieve the data for the next page.
If you cannot change the REST service itself, you can create another server-side service (on a server you control) as a proxy. On first access you call the original REST service, store the XML on your own server, and allow the client to retrieve only parts of that XML.
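The paging logic itself is small. Here is a sketch (in JavaScript for brevity; in a GWT setup the same slicing would live in the Java servlet or proxy service), with `records` standing in for the parsed result of the original XML and all names illustrative:

```javascript
// Server-side paging sketch: the client asks for one page at a time
// (?page=N&size=M) instead of the whole record set.
function getPage(records, page, size) {
  const start = page * size;
  return {
    page: page,
    totalPages: Math.ceil(records.length / size),
    items: records.slice(start, start + size),
  };
}
```

Each response now carries at most `size` records, so the browser only ever parses and maps a small chunk, which is what keeps the UI responsive.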
I have a PHP page that grabs a variable via GET and then pulls in some information from a database based on that variable. Once finished with the server-side stuff, there is some javascript that runs and takes the data supplied and creates a .png image using a 3rd party API and then saves that image to my server using an AJAX POST call to another PHP page.
That all works fine, but what I'd like to do now is automate some calls to that PHP page. Namely, say I have 100 such variables to go through, I want to, preferably in Java with a for loop, call that PHP page with each variable in turn.
The problem is the client-side JavaScript: it won't execute with a simple URLConnection in Java. It seems like I need some sort of browser replicator, or some way to have Java act as if it were calling the PHP page in a browser.
Alternatively, I could make do with having a third PHP page act in place of the Java as the controller, but I'm faced with the same problem of getting the javascript to execute.
Am I missing something easy? Is this set up not possible? I'd really prefer to do it in Java if possible to fold it into other code I already have running.
Let me try to add some more specifics without bogging it down too much. There's a PHP file getData.php that takes in an ID number via GET. So I call it like ./getData.php?id=someId
That PHP file takes the ID, goes to my DB and retrieves some data and pastes it into the HTML source. Then once the page is finished, I have some javascript within getData.php that retrieves that data, formats it into a DataTable and passes it off to Google Visualization API in order to make a SVG chart.
Then I have more JS that runs that takes that SVG object, turns it into a Canvas object, grabs the base64 image data from it and finally POSTs to saveTo.php with the following array:
{'id' : id, 'data' : imgData}
saveTo.php simply takes in that POST data, creates a file on my server based on id and pastes the imgData into it. The end result is that I can pass in an ID to getData.php and end up with saved image of a Visualization chart that I want made based on data in my DB tied to that ID.
That all works by hand. But I have ~1,000 of these IDs and I'd like to have it so this whole process is run each morning so that I can have updated images based on yesterday's data.
I should mention that I tried the third-party toolkit HtmlUnit (http://htmlunit.sourceforge.net/) but kept getting these errors:
com.gargoylesoftware.htmlunit.IncorrectnessListenerImpl notify
WARNING: Obsolete content type encountered: 'text/javascript'.
Some more searching around and hitting upon the right keywords to try finally led me to Selenium, which is exactly what I was looking for.
http://docs.seleniumhq.org/projects/webdriver/
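Whatever browser automation you end up with, the nightly batch reduces to a sequential loop over the IDs. A sketch (in JavaScript; everything here is illustrative, and `loadPage` is injectable so Selenium WebDriver, HtmlUnit, or anything else can be plugged in behind it):

```javascript
// Visit getData.php?id=... for each ID in turn, waiting for each page's
// JavaScript (chart render + AJAX POST to saveTo.php) to finish before
// moving on. loadPage(url) returns a Promise that resolves when done.
async function runBatch(ids, loadPage) {
  const results = [];
  for (const id of ids) {
    // Sequential on purpose: each page performs its own save before the next starts
    results.push(await loadPage('getData.php?id=' + id));
  }
  return results;
}
```

With Selenium, `loadPage` would drive the browser to the URL and wait for a "done" signal (e.g. an element the page adds after the POST completes) before resolving.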
At the moment I am using a third-party library called Android Query. I use it to download the web page, then I parse the HTML for the bits that I want. These bits are mostly in one small part of the page, so the rest is discarded.
Each page is about 100 KB and I am fetching 200-300 pages, which takes a while, especially on a slow connection.
Is there any method or library that would allow me to fetch only a certain div?
The pages I am fetching are from the Google Play market.
Example code I am using:
String url = "https://play.google.com/store/apps/details?id=com.touchtype.swiftkey";
aq.ajax(url, String.class, new AjaxCallback<String>() {
    @Override
    public void callback(String url, String html, AjaxStatus status) {
        parseHtml(html);
    }
});
Edit: if it is not possible, is there a lightweight version of the Google Play pages that I can access and download instead?
Looking at the other answer here:
Is there a way to download partial part of a webpage, rather than the whole HTML body, programmatically?
It looks like there are ways to do it, but you'd need to know the byte range. This would be very error-prone, as the content could easily change over time if Google changes different parts of the page.
You could set up your own web server to sit between your app and Google Play and return only the data you query.
There is a Hacker News thread about this:
https://news.ycombinator.com/item?id=4634259
Apparently Apple has a JSON API, but Google does not.
Here are two server side libraries you could use to make the task easier:
https://github.com/chadrem/market_bot (Ruby)
https://github.com/pastfuture/MarketBot (PHP)
You could return only the data you need from your web app, making the payload as slim as possible. You could also cache it on your server so that you don't have to hit Google for every request, and send several app IDs in one request to minimize round trips.
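The server-side caching piece might be sketched like this (names and the TTL are illustrative; `fetchFromPlay` would wrap market_bot, MarketBot, or your own scraper):

```javascript
// In-memory cache with a TTL, so repeated requests for the same app
// don't trigger repeated scrapes of Google Play.
const cache = new Map();
const TTL_MS = 60 * 60 * 1000; // refresh app details hourly (assumption)

async function getAppDetails(appId, fetchFromPlay) {
  const hit = cache.get(appId);
  if (hit && Date.now() - hit.at < TTL_MS) {
    return hit.data; // served from cache, no round trip to Google
  }
  const data = await fetchFromPlay(appId);
  cache.set(appId, { at: Date.now(), data });
  return data;
}
```

Combined with returning only the fields the app actually displays, this cuts both the per-request payload and the number of hits against Google Play.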