Web Application Database Mem-Cache Suggestions - java

My website serves live info to users, and this information can change dynamically (you can think of it as stock prices). Each query to get this information from the DB takes about 3-5 seconds, and fetching everything takes about 3 minutes in total. I serve this information to 6000 users. I am using a HashMap to store and serve the information: every 5 minutes I get all the information from the DB and store it in the HashMap. Everything works, but I want to use a more advanced cache system. What is your suggestion? Can I use HSQLDB for that? INFO: I am using Spring MVC + Hibernate, so I don't want to use non-Java solutions such as Redis.

You may use Ehcache as a second-level cache for Hibernate or as a "self-managed" cache. The Guava library also offers efficient cache capabilities.
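For instance, a Guava LoadingCache can replace the hand-rolled HashMap and handle the periodic reload for you. This is only a sketch: the PriceRepository interface, the findPrice() method and the 5-minute refresh interval are assumptions standing in for your existing data-access code.

```java
import java.math.BigDecimal;
import java.util.concurrent.TimeUnit;

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

public class PriceCache {

    /** Hypothetical DAO interface wrapping the slow (3-5 s) database query. */
    public interface PriceRepository {
        BigDecimal findPrice(String symbol);
    }

    private final PriceRepository repository;

    // Entries are refreshed at most once every 5 minutes, mirroring the current HashMap reload
    private final LoadingCache<String, BigDecimal> cache;

    public PriceCache(final PriceRepository repository) {
        this.repository = repository;
        this.cache = CacheBuilder.newBuilder()
                .maximumSize(10_000)
                .refreshAfterWrite(5, TimeUnit.MINUTES)
                .build(new CacheLoader<String, BigDecimal>() {
                    @Override
                    public BigDecimal load(String symbol) {
                        return repository.findPrice(symbol);
                    }
                });
    }

    public BigDecimal getPrice(String symbol) {
        return cache.getUnchecked(symbol);
    }
}
```

With refreshAfterWrite, the first read after the interval triggers a reload while other callers keep getting the previously cached value, so your 6000 users are never all blocked on the 3-5 second query at once.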

Related

How to initialize and fill Apache Ignite database by first node?

I would like to use Apache Ignite as a failover read-only storage, so my application will be able to access the most sensitive data if the main storage (Oracle) is down.
So I need to
Start nodes
Create schema (execute DDL queries)
Load data from Oracle to Ignite
Seems like it's not the same as database caching and I don't need to use Cache. However, this page says that I need to implement a store to load a large amount of data from 3rd parties.
So, my questions are:
How to effectively transfer data from Oracle to Ignite? Data Streamers?
Who should init this transfer? First started node? How to do that? (tutorials explain how to achieve that via clients, should I follow this advice?)
Actually, I think using a cache store without read/write-through would be a suitable option here. You can configure a CacheJdbcPojoStore, for example, and call IgniteCache#loadCache(...) on your cache once the cluster is up. More on this topic: https://apacheignite.readme.io/docs/3rd-party-store
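A minimal sketch of that approach, assuming a cache named "personCache" whose CacheJdbcPojoStore is already configured (for example in an ignite-config.xml); both names are placeholders:

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

public class WarmUpFromOracle {
    public static void main(String[] args) {
        // Hypothetical config file that defines "personCache" with a CacheJdbcPojoStore
        Ignite ignite = Ignition.start("ignite-config.xml");

        IgniteCache<Long, Object> cache = ignite.cache("personCache");

        // Runs the configured store's loadCache() on every server node;
        // a null predicate means "load everything"
        cache.loadCache(null);
    }
}
```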
If you don't want to use a cache store, then IgniteDataStreamer could be a good choice. This is the fastest way to upload a large amount of data to the cluster. Data loading is usually performed from a client node, once all server nodes are up and running.
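And a sketch of the data-streamer route, assuming a plain JDBC read from Oracle on a client node; the cache name "persons", the connection settings and the SELECT statement are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamFromOracle {
    public static void main(String[] args) throws Exception {
        Ignite ignite = Ignition.start();
        // The target cache must exist before streaming into it
        ignite.getOrCreateCache("persons");

        try (IgniteDataStreamer<Long, String> streamer = ignite.dataStreamer("persons");
             Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//db-host:1521/ORCL", "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, payload FROM persons")) {

            while (rs.next()) {
                // Entries are buffered and shipped to the owning nodes in batches
                streamer.addData(rs.getLong("id"), rs.getString("payload"));
            }
        } // closing the streamer flushes any remaining buffered entries
    }
}
```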

MySQL to Redis and Redis to MySQL

I want to optimize my Minecraft game servers. I have 150k users in the database, and about 15k users join my servers daily.
I have read about Redis, and I also read that Redis is faster than MySQL. I know I can't give up MySQL, because my websites use the same database.
But what if I load all the MySQL data into Redis every 15 minutes, let all my server plugins work on that data, and then after the next 15 minutes export it from Redis back to MySQL? I load the same data on 4 servers and 3 plugins per server, so maybe loading it all into one Redis server would be faster than sending requests to MySQL from 4 servers * 3 plugins?
Thanks for help.
Redis is an effective way to cache data from a MySQL database. Even though Redis has persistence options, many will still favor keeping MySQL as the system of record. Because Redis operates in memory, it will be much faster than a MySQL database, which (for the most part) does not operate in memory. People often store cache data in HashMaps, but since you run multiple servers, a shared Redis instance is a much better option: you wouldn't have to maintain a near-identical in-memory cache on every server.
Hi, as far as I can understand you have 4 servers and 3 plugins each.
Redis is extremely fast, no doubt, but its use case is different from MySQL's. My advice is to load into Redis the data you use most frequently; it will be much faster than MySQL. But to make it faster you have to design your keys intelligently, so that Redis can look them up quickly. You can refresh keys and values after a certain interval, and your system's performance will definitely improve.

Using EhCache as the main Datasource instead of Database

Is it reliable to use Ehcache as a datasource instead of a database?
My business functionality will be to periodically collect information from a running application, store it in an Ehcache cache, and then retrieve and display statistics about the collected information by querying the cache with the Ehcache Search API. The cache will only need to keep the last 30-45 days of data.
What do you think about this approach?
Ehcache could be an acceptable solution, assuming TTI, TTL and other parameters are set according to your business needs. There shouldn't be any reliability issue per se. A SQL database offers transactional commits, complex queries and relational support, which Ehcache by itself of course does not.
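A minimal sketch of such a cache using the Ehcache 2.x programmatic API; the cache name, the heap size limit and the exact 45-day TTL are assumptions, not something the answer prescribes:

```java
import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.config.CacheConfiguration;

public class StatsCacheSetup {
    public static void main(String[] args) {
        CacheManager cacheManager = CacheManager.create();

        // Keep entries for at most 45 days, regardless of how often they are read
        CacheConfiguration config = new CacheConfiguration("statsCache", 100_000)
                .timeToLiveSeconds(45L * 24 * 60 * 60)
                .eternal(false);

        Cache statsCache = new Cache(config);
        cacheManager.addCache(statsCache);
    }
}
```

Note that unless disk persistence is configured, a restart wipes the cache, so how acceptable this is depends on how costly it would be to recollect 30-45 days of data.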

How to fetch real-time changing data into a local server?

We have to develop a local server that will load itself with real-time data from an industrial plant (in particular, time-stamped data points like boiler temperature, pressure values, etc.). These values are stored on an industrial server, and we want to fetch them and populate our server with them. The data is not streamed at the server end, so how can we fetch it continuously and populate our server?
We would like to store only the past 2-3 days of history data as time advances. Any recommendations about the server and the back-end process to be used to fetch the data are welcome; we don't have any idea where to start.
Please help...
As others have stated, you need to provide more information on how you intend to populate your server. What API do you have for the "real-time server"?
I worked on a management system for solar energy devices (i.e. devices that produce electricity from solar energy; they are called photovoltaic cells, if I remember correctly). In my case these devices offered FTP access, which provided me with files of time-based information.
I constructed a Java server that used the following technologies:
A. Apache Tomcat web container - this allowed me, on the one hand, to host the Java logic, and on the other hand to expose an HTTP-based interface to the customer.
The Java logic was located in a servlet, which exposes methods to handle HTTP requests (and allows writing returned data using response objects).
B. The servlet has an init method; I used it to perform some initialization, such as starting a Quartz periodic task to probe the FTP servers of the devices (see the sketch after this answer).
C. I used a database (PostgreSQL, which is an open-source database) to store configuration for the application and also to store results.
D. I used another periodic task to archive old data into an archiving table, so the main data table would hold only relatively new data.
I ran the archiving task once every few days; it simply looked for records that were too old, inserted them into the archiving table, and deleted them from the main data table. To perform this efficiently, I decided to use a function that I coded on the database.
E. In order to access the database from the application, I used the Hibernate object-relational mapping technology.
This technology allowed me to define mappings between tables and their relations to Java objects, and gave me generated create, read (by id), update and delete SQL statements.
Using the HQL query language, I wrote some more complex queries.
F. For presentation/client side - I used plain JSP.
You may choose other alternatives, such as GWT, Apache Wicket or JSF.
You may also consider using an MVC framework to get some separation between the logic and the presentation. Such frameworks include Spring MVC, Struts, and many others.
To conclude, you must understand that Java offers you a variety of technologies; you must define your requirements well, and then start investigating which technology can meet your needs.
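A minimal sketch of point B above: a servlet whose init() schedules a Quartz job that periodically polls the data source. The FtpPollJob class, the 60-second interval and the empty polling body are placeholders rather than anything the answer specified.

```java
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;

import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.SimpleScheduleBuilder;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class DataCollectorServlet extends HttpServlet {

    /** Hypothetical job that polls the industrial data source (e.g. over FTP). */
    public static class FtpPollJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            // Fetch the latest time-stamped data points and store them in the database
        }
    }

    @Override
    public void init() throws ServletException {
        try {
            Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();

            JobDetail job = JobBuilder.newJob(FtpPollJob.class)
                    .withIdentity("ftpPollJob")
                    .build();

            // Poll every 60 seconds, forever
            Trigger trigger = TriggerBuilder.newTrigger()
                    .withIdentity("ftpPollTrigger")
                    .startNow()
                    .withSchedule(SimpleScheduleBuilder.simpleSchedule()
                            .withIntervalInSeconds(60)
                            .repeatForever())
                    .build();

            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        } catch (SchedulerException e) {
            throw new ServletException("Failed to start the polling scheduler", e);
        }
    }
}
```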

Empty Hibernate cache on demand

I'm writing a SOAP web service: JBoss + Hibernate + Java. The database is PostgreSQL. After publishing the web service, it runs perfectly.
For testing purposes I change data in the database by opening pgAdmin and editing values in the rows by hand. Now, the problem is, Hibernate is not aware of those changes, not until I re-publish the web service.
Is there any way to tell Hibernate to empty the cache or reload the data from the database, so it picks up the latest values?
Thanks!
I'm assuming you're talking about the second-level cache...
The Cache API exposes several methods for evicting various regions; check all the evictXxx() methods. Use SessionFactory#getCache() to obtain the Cache.
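For example, assuming a Hibernate 4.x-era Cache API, something along these lines evicts the second-level cache on demand (the SessionFactory reference is whatever your application already has):

```java
import org.hibernate.Cache;
import org.hibernate.SessionFactory;

public class SecondLevelCacheEvictor {

    private final SessionFactory sessionFactory;

    public SecondLevelCacheEvictor(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    /** Evicts all entity, collection and query regions from the second-level cache. */
    public void evictAll() {
        Cache cache = sessionFactory.getCache();
        cache.evictEntityRegions();
        cache.evictCollectionRegions();
        cache.evictQueryRegions();
    }
}
```

Keep in mind that any Session that is already open still has its own first-level cache; you'd need Session#refresh(entity) or a fresh Session to see rows changed by hand in pgAdmin.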
