I have two RESTful web services:
getMarketData
stopMarketData
getMarketData pulls data from an external service. stopMarketData stops the process that pulls data from the external service.
Now the problem: when I fire getMarketData, it creates a connection to the external service and starts fetching data (it is a continuous process; it keeps fetching until we call stopMarketData).
If I then call the stopMarketData web service, it doesn't stop the fetching process, because the connection is not in the context of getMarketData. How can I persist the connection between the getMarketData and stopMarketData calls in a RESTful web service?
I don't think you're supposed to maintain state in RESTful services. How would you scale this solution out to run on multiple machines or even multiple processes?
If you really do want to do this, you will have to somehow place the connection in a global area (such as in the Application object if you're doing JSP or ASP) that is available from multiple requests. Then, the stopMarketData call could get the connection from that global area and close it. This approach is definitely not very scalable.
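For example, a minimal sketch of that "global area" idea in plain Java: an application-wide registry keyed by some client or subscription id. The class and method names here are assumptions, not part of your existing services.

```java
// Hypothetical application-scoped registry shared by both endpoints.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public final class MarketDataSessions {

    private static final Map<String, Future<?>> ACTIVE = new ConcurrentHashMap<>();
    private static final ExecutorService POOL = Executors.newCachedThreadPool();

    // getMarketData would call this with some client/subscription id
    public static void start(String clientId, Runnable fetchLoop) {
        ACTIVE.computeIfAbsent(clientId, id -> POOL.submit(fetchLoop));
    }

    // stopMarketData would call this with the same id
    public static void stop(String clientId) {
        Future<?> task = ACTIVE.remove(clientId);
        if (task != null) {
            task.cancel(true);   // interrupts the fetch loop so it can shut down
        }
    }

    private MarketDataSessions() { }
}
```

Note that this only works as long as both calls land on the same JVM, which is exactly the scalability problem mentioned above.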
Another option would be to use an asynchronous technology like message-driven EJBs (MDBs). The getMarketData and stopMarketData calls would simply post messages to these EJBs to start and stop the fetching, respectively.
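As a rough illustration of the MDB idea (the queue name and the message format are assumptions):

```java
// Hypothetical message-driven bean listening for start/stop commands.
import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
    @ActivationConfigProperty(propertyName = "destinationLookup", propertyValue = "jms/marketDataControl")
})
public class MarketDataControlBean implements MessageListener {

    @Override
    public void onMessage(Message message) {
        try {
            String command = ((TextMessage) message).getText();
            if ("START".equals(command)) {
                // open the connection to the external service and begin fetching
            } else if ("STOP".equals(command)) {
                // close the connection and stop the fetch loop
            }
        } catch (JMSException e) {
            throw new RuntimeException(e);
        }
    }
}
```

The REST endpoints then only enqueue "START"/"STOP" messages and return immediately.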
Hope this helps.
Nate
I need suggestions regarding creating a scheduled task in a Spring Boot application. Our application is an order management portal that helps users with details about an order or a list of orders.
Currently there is a manual process that works as follows: when a new order comes into the database, the user picks up some information from the order, prepares a doc file from it, and sends it to an external portal by email. The portal sends a response by email and the user saves that response in the system through our application. We now want to automate this process: instead of going the email route, we will make a SOAP call to the external portal (sending the doc as an attachment in the XML request) and get the response back. For this I want to write a scheduled process (@Scheduled) which will pick up the new orders and make a SOAP call for each of them.
My question is: should I create the scheduled process in the same application, or create a new module (a separate Spring Boot app) for it? I appreciate your suggestions.
Spring Boot scheduling is very handy; go with it if you can. However, if you have multiple instances of the application, scheduling will be enabled on all of them, making it hard to coordinate which instance calls what.
If you are running multiple instances, and assuming you have some sort of load balancer in front of them, you can instead create a separate cron job that calls an endpoint, and the load balancer will route each request to one particular instance.
With multiple instances, though, probably the cleanest way is to use some sort of messaging, such as queues. All of the instances could subscribe to a topic and your cron job could simply push notifications to it.
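If you do go with Spring scheduling, a minimal sketch might look like the following. The repository and SOAP client names (OrderRepository, ExternalPortalClient) are placeholders for whatever you already have, and the fixed delay is arbitrary.

```java
// Sketch of a scheduled job that picks up new orders and sends them via SOAP.
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class NewOrderDispatcher {

    private final OrderRepository orderRepository;    // assumed data access for orders
    private final ExternalPortalClient portalClient;  // assumed wrapper around the SOAP call

    public NewOrderDispatcher(OrderRepository orderRepository, ExternalPortalClient portalClient) {
        this.orderRepository = orderRepository;
        this.portalClient = portalClient;
    }

    // Runs on one scheduler thread; remember @EnableScheduling on a configuration class.
    @Scheduled(fixedDelay = 60_000)
    public void dispatchNewOrders() {
        for (Order order : orderRepository.findByStatus("NEW")) {
            String response = portalClient.send(order);  // SOAP request with the doc attached
            order.setPortalResponse(response);
            order.setStatus("SENT");
            orderRepository.save(order);
        }
    }
}
```

Whether this lives in the same application or a separate module, the code stays the same; a separate module mainly buys you independent deployment and sidesteps the multiple-instance issue described above.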
I have a Java JSON-RPC web service application (we can also think of it as a client application that uses other web services). This application has a SQLite database.
This web service is doing a number of tasks (such as calling web services, sending transactions, querying balance, etc.).
These calls are transactions made by the user.
Another important task of this web service is to update the local database (sqlite) by making a web service call at regular intervals (10 seconds).
The process that runs continuously with this 10 second time interval should not interfere with other read and write operations.
How can I find a solution for this problem? Should I create a child thread in the java main thread?
Yes, creating another thread will work, but a cleaner approach is to use Quartz with Spring.
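If you stick with plain Java, a single-threaded scheduler keeps the periodic refresh off your main request path. This is only a sketch; RemoteClient and LocalStore are placeholder names for your existing web service client and SQLite access code.

```java
// Background refresh of the local SQLite store every 10 seconds.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class LocalDbRefresher {

    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    public void start(RemoteClient remote, LocalStore store) {
        scheduler.scheduleWithFixedDelay(() -> {
            try {
                // Keep each write transaction short so user reads/writes are not blocked for long;
                // SQLite allows only one writer at a time.
                store.update(remote.fetchLatest());
            } catch (Exception e) {
                e.printStackTrace();   // never let an uncaught exception kill the task silently
            }
        }, 0, 10, TimeUnit.SECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```

Enabling SQLite's WAL mode can also help readers and the background writer coexist.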
A client connects to a JAX-RS Endpoint to retrieve data. This endpoint uses an EJB to access the database and perform logic then returns the data to the endpoint.
Client <----> Rest Endpoint <----> EJB
Before data can be returned to the Client, I need to pass data to another Endpoint instance (another Client that is connected to this Rest Endpoint). I would like to keep this a REST Service but I'm faced with two problems:
REST Endpoints are stateless. So no Sessions are linked with an endpoint. However, this could be fixed with context injection (not as efficient) or passing a userID as a parameter.
No communication method. There are WebSockets, but how would those help me communicate between sessions on the server? There's JMS, but from my understanding that works application to application, not between sessions.
What I'm asking: Is there a way to communicate between different REST (or EJB) instances/sessions? If so, how?
You would have to persist sessions to some external storage (like Redis) and pass the session identifier with each request between various system components. When such request is handled, you could load session state from the storage and proceed accordingly.
Most of the EJB/Servlet containers allow you to turn session persistence on.
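As a rough sketch of externalized session state using Redis via the Jedis client (the key scheme, TTL, and JSON payload are assumptions):

```java
// Hypothetical session store shared by all REST/EJB instances.
import redis.clients.jedis.Jedis;

public class SharedSessionStore {

    private static final int TTL_SECONDS = 1800;   // expire idle sessions after 30 minutes

    public void save(String sessionId, String stateJson) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            jedis.setex("session:" + sessionId, TTL_SECONDS, stateJson);
        }
    }

    public String load(String sessionId) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            return jedis.get("session:" + sessionId);
        }
    }
}
```

Each request carries the session identifier (as a header or parameter), and any endpoint instance can load the state, act on it, and save it back.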
I have a data-push web service implemented in REST which pushes data into a database.
Now I want to create one more web service which will take input from the data-push web service and run some business logic to check for alerts. If an alert is present, it will call an alert service. In this case the data-push web service should detach as soon as it posts the required data.
My question is: if there are many requests to the data-push web service (let's say one every second), how will it handle the threading and post to the new web service?
If you are worried about the throughput of the service pushing data, you can queue the data-push requests and have a pool of worker threads process them as time and system resources permit (an in-memory sketch follows the list below).
The queuing mechanism could be any number of solutions depending on your scalability and throughput requirements:
In-Memory
JMS Messaging Middleware
Relational Database
Distributed Cache
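Here is a minimal in-memory sketch of the queue-plus-worker-pool idea; the queue bound, pool size, and payload type are arbitrary choices for illustration.

```java
// Data-push requests are queued and processed by a fixed pool of workers.
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PushProcessor {

    private final BlockingQueue<String> queue = new ArrayBlockingQueue<>(10_000);
    private final ExecutorService workers = Executors.newFixedThreadPool(8);

    public PushProcessor() {
        for (int i = 0; i < 8; i++) {
            workers.submit(() -> {
                while (!Thread.currentThread().isInterrupted()) {
                    try {
                        String payload = queue.take();   // blocks until work is available
                        evaluateAlerts(payload);         // business logic; calls the alert service if needed
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }
    }

    // Called from the data-push endpoint; returns immediately once the payload is queued.
    public boolean accept(String payload) {
        return queue.offer(payload);
    }

    private void evaluateAlerts(String payload) {
        // placeholder for the alert-checking logic
    }
}
```

The JMS, database, and distributed-cache variants follow the same shape, but survive restarts and scale across instances.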
In one of our applications we need to call the Yahoo SOAP web service to get weather and other related info.
I used the wsdl2java tool from Axis 1.4 to generate the required stubs and wrote a client. I use jsp:useBean to include the client bean and call methods defined in the client, which in turn call the Yahoo web service.
Now the problem: when users make calls to the JSP, the response time of the web service differs greatly; for one user it took less than 10 seconds, while another user on the same network took more than a minute.
I was just wondering whether Axis 1.4 queues the requests even though the JSPs are multithreaded.
And finally, is there an efficient way of calling the web service (Yahoo weather)? Typically I get around 200 simultaneous requests from my users.
Why don't you schedule one thread to get the weather every minute or so, and expose that to the JSPs, instead of letting each JSP get its own weather report?
That's a lot more efficient for both you and Yahoo, and the JSPs only need to look up a local object (almost instantaneous) instead of connecting to a web service.
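A minimal sketch of that approach, where one background thread refreshes a shared report and the JSPs only read it (WeatherClient and WeatherReport are placeholder names):

```java
// One scheduled thread talks to Yahoo; all page requests read the cached value.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

public class WeatherCache {

    private final AtomicReference<WeatherReport> latest = new AtomicReference<>();
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    public WeatherCache(WeatherClient client) {
        scheduler.scheduleAtFixedRate(() -> {
            try {
                latest.set(client.fetchWeather());   // the only place the remote service is called
            } catch (Exception e) {
                // keep serving the last good report if a refresh fails
            }
        }, 0, 1, TimeUnit.MINUTES);
    }

    // Called from the JSP via a bean; never blocks on the remote service.
    public WeatherReport getLatest() {
        return latest.get();
    }
}
```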
EDIT
Some new requirements in the comments of this answer suggest a different way of choosing solutions.
It seems that it's not only weather (which doesn't change that often and is the same for every user) that is requested via the web service, but also other data such as flight data.
The requirements for retrieving flight data are very different from those for weather data, so I think you should define a few types of (remote) data and choose a different solution for each category.
As basis for the requirements I'd use something simple:
Users like their information promptly, they do not like waiting
The amount of data stored on the web server is finite
Remote web services have an EULA of sorts and are probably not happy with 200 concurrent requests of the same data by the same source (you)
Fast data access to users is best achieved by having the data locally, be it transient (kept in a bean) or persistent (a local database). That can be done by periodically requesting data from the remote source, and using the cached data in the JSP. That would also keep you in the clear with the third point.
A finite amount of data stored on the web server means that not everything can be cached. Data that differs per user, or large data sets that can vary over short periods of time, cannot readily be cached. It's not really a good idea to load data on all flights at all airports in the US every minute or so. That kind of request is better served by running a specific web service query when necessary.
The trick is now to identify when caching data is feasible. If it is, do that; otherwise run the web service query in the background. That can be done by presenting the JSP immediately and starting the web service query in the background. The JSP can include an AJAX script that asks your web server whether the data is ready and inserts it into the page when it is.
I'd use Google tools to monitor how long the call to the web service is taking.
There are several things going on here:
Map Java beans to the XML request.
Send the XML request to the web service.
Unmarshal the XML request on the web service side.
The web service processes the request.
The web service marshals the XML response.
The web service sends the XML response to the Java client.
Unmarshal the XML response and display it on the client.
You can't see inside the Yahoo web service, but do break out what you can see on the client side to see where the time is spent.
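A crude but effective way to break this out on the client side is to time each stage yourself and log it; the helper and the usage below are only illustrative.

```java
// Tiny timing helper to separate the remote call from local processing.
import java.util.function.Supplier;

public final class Timing {

    public static <T> T timed(String label, Supplier<T> work) {
        long start = System.nanoTime();
        T result = work.get();
        long millis = (System.nanoTime() - start) / 1_000_000;
        System.out.println(label + " took " + millis + " ms");
        return result;
    }

    private Timing() { }
}

// Usage inside the client (the stub call and render step stand for your existing code):
//   WeatherResponse response = Timing.timed("yahoo call", () -> stub.getWeather(request));
//   String html = Timing.timed("render", () -> render(response));
```

If the "yahoo call" number dominates and varies wildly between users, the bottleneck is on the wire or at Yahoo, not in your marshalling code.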
Check memory as well. If Axis is generating .class files, maybe your perm space is being consumed. VisualVM ships with the JDK; attach it to the PID of your client process to see what's going on in memory on your app server.
Maybe this would be a good place for an AJAX call. This will be a good solution if you can get the weather in the background while users are doing other things.
I would recommend local caching and data pooling. Instead of sending out 200 separate requests for similar or identical locations, run a background thread that pulls the weather only for the locations your users are interested in and caches it locally; the cache refreshes every minute or so. When users request their personal preferences, the requests hit the cache and only refetch if the location is new or the cached data is stale. This way the user gets a more seamless experience and you won't hit Yahoo's throttles and get denied service.
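A minimal sketch of such a per-location cache with a staleness check (Weather and WeatherClient are placeholder types, and the one-minute age limit is arbitrary):

```java
// Cache weather per location; refetch only when an entry is missing or stale.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LocationWeatherCache {

    private static final long MAX_AGE_MILLIS = 60_000;

    private final WeatherClient client;
    private final Map<String, CachedWeather> cache = new ConcurrentHashMap<>();

    public LocationWeatherCache(WeatherClient client) {
        this.client = client;
    }

    public Weather get(String location) {
        CachedWeather entry = cache.get(location);
        if (entry == null || entry.isStale()) {
            entry = new CachedWeather(client.fetch(location));   // one remote call per stale location
            cache.put(location, entry);
        }
        return entry.weather;
    }

    private static final class CachedWeather {
        final Weather weather;
        final long fetchedAt = System.currentTimeMillis();

        CachedWeather(Weather weather) {
            this.weather = weather;
        }

        boolean isStale() {
            return System.currentTimeMillis() - fetchedAt > MAX_AGE_MILLIS;
        }
    }
}
```

Under heavy load you might still want to serialize refetches per location (for example with computeIfAbsent on a loader), but the basic idea is the same.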