Dynamic Multi-tenant WebApp (Spring Hibernate) - java

I've come up with a working dynamic multi-tenant application using:
Java 8
Java Servlet 3.1
Spring 3.0.7-RELEASE (can't change the version)
Hibernate 3.6.0.Final (can't change the version)
Commons dbcp2
This is the 1st time I've had to instantiate Spring objects myself so I'm wondering if I've done everything correctly or if the app will blow up in my face at an unspecified future date during production.
Basically, the single DataBase schema is known, but the database details will be specified at runtime by the user. They are free to specify any hostname/port/DB name/username/password.
Here's the workflow:
The user logs in to the web app then either chooses a database from a known list, or specifies a custom database (hostname/port/etc.).
If the Hibernate SessionFactory is built successfully (or is found in the cache), then it's persisted for the user's session using SourceContext#setSourceId(SourceId) then the user can work with this database.
If anybody chooses/specifies the same database, the same cached AnnotationSessionFactoryBean is returned
The user can switch databases at any point.
When the user switches away from a custom DB (or logs off), the cached AnnotationSessionFactoryBeans are removed/destroyed
So will the following work as intended? Help and pointers are most welcome.
web.xml
<web-app version="3.1" ...>
<context-param>
<param-name>contextConfigLocation</param-name>
<param-value>classpath:applicationContext.xml</param-value>
</context-param>
<listener>
<listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
<listener> <!-- Needed for SourceContext -->
<listener-class>org.springframework.web.context.request.RequestContextListener</listener-class>
</listener>
</web-app>
applicationContext.xml
<beans ...>
<tx:annotation-driven />
<util:properties id="db" location="classpath:db.properties" /> <!-- driver/url prefix -->
<context:component-scan base-package="com.example.basepackage" />
</beans>
UserDao.java
@Service
public class UserDao implements UserDaoImpl {
@Autowired
private TemplateFactory templateFactory;
@Override
public void addTask() {
final HibernateTemplate template = templateFactory.getHibernateTemplate();
final User user = (User) DataAccessUtils.uniqueResult(
template.find("select distinct u from User u left join fetch u.tasks where u.id = ?", 1)
);
final Task task = new Task("Do something");
user.getTasks().add(task);
TransactionTemplate txTemplate = templateFactory.getTxTemplate(template);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus status) {
template.save(task);
template.update(user);
}
});
}
}
TemplateFactory.java
@Service
public class TemplateFactory {
@Autowired
private SourceSessionFactory factory;
@Resource(name = "SourceContext")
private SourceContext srcCtx; // session scope, proxied bean
public HibernateTemplate getHibernateTemplate() {
LocalSessionFactoryBean sessionFactory = factory.getSessionFactory(srcCtx.getSourceId());
return new HibernateTemplate(sessionFactory.getObject());
}
public TransactionTemplate getTxTemplate(HibernateTemplate template) {
HibernateTransactionManager txManager = new HibernateTransactionManager();
txManager.setSessionFactory(template.getSessionFactory());
return new TransactionTemplate(txManager);
}
}
SourceContext.java
@Component("SourceContext")
@Scope(value="session", proxyMode = ScopedProxyMode.INTERFACES)
public class SourceContext {
private static final long serialVersionUID = -124875L;
private SourceId id;
@Override
public SourceId getSourceId() {
return id;
}
@Override
public void setSourceId(SourceId id) {
this.id = id;
}
}
SourceId.java
public interface SourceId {
String getHostname();
int getPort();
String getSID();
String getUsername();
String getPassword();
// concrete class has proper hashCode/equals/toString methods
// which use all of the SourceIds properties above
}
SourceSessionFactory.java
@Service
public class SourceSessionFactory {
private static final Log LOG = LogFactory.getLog(SourceSessionFactory.class); // commons-logging
private static Map<SourceId, AnnotationSessionFactoryBean> cache = new HashMap<SourceId, AnnotationSessionFactoryBean>();
@Resource(name = "db")
private Properties db;
public LocalSessionFactoryBean getSessionFactory(SourceId id) {
synchronized (cache) {
AnnotationSessionFactoryBean sessionFactory = cache.get(id);
if (sessionFactory == null) {
return createSessionFactory(id);
}
else {
return sessionFactory;
}
}
}
private AnnotationSessionFactoryBean createSessionFactory(SourceId id) {
AnnotationSessionFactoryBean sessionFactory = new AnnotationSessionFactoryBean();
sessionFactory.setDataSource(new CustomDataSource(id, db));
sessionFactory.setPackagesToScan(new String[] { "com.example.basepackage" });
try {
sessionFactory.afterPropertiesSet();
}
catch (Exception e) {
throw new SourceException("Unable to build SessionFactory for: " + id, e);
}
cache.put(id, sessionFactory);
return sessionFactory;
}
public void destroy(SourceId id) {
synchronized (cache) {
AnnotationSessionFactoryBean sessionFactory = cache.remove(id);
if (sessionFactory != null) {
if (LOG.isInfoEnabled()) {
LOG.info("Releasing SessionFactory for: " + id);
}
try {
sessionFactory.destroy();
}
catch (HibernateException e) {
LOG.error("Unable to destroy SessionFactory for: " + id, e);
}
}
}
}
}
CustomDataSource.java
public class CustomDataSource extends BasicDataSource { // commons-dbcp2
public CustomDataSource(SourceId id, Properties db) {
setDriverClassName(db.getProperty("driverClassName"));
setUrl(db.getProperty("url") + id.getHostname() + ":" + id.getPort() + ":" + id.getSID());
setUsername(id.getUsername());
setPassword(id.getPassword());
}
}
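One thing the code above does not show is what actually calls SourceSessionFactory#destroy(SourceId) when a user logs off or switches away from a custom DB. Below is a minimal, hypothetical sketch of such a hook; the listener class, the "sourceId" session attribute and the context lookup are assumptions, not part of the original code.

import javax.servlet.http.HttpSessionEvent;
import javax.servlet.http.HttpSessionListener;
import org.springframework.web.context.WebApplicationContext;
import org.springframework.web.context.support.WebApplicationContextUtils;

// Hypothetical cleanup hook: releases the cached SessionFactory when the HTTP session ends.
// Assumes the user's current SourceId was stored as a session attribute named "sourceId".
public class SourceCleanupListener implements HttpSessionListener {

    @Override
    public void sessionCreated(HttpSessionEvent se) {
        // nothing to do on creation
    }

    @Override
    public void sessionDestroyed(HttpSessionEvent se) {
        SourceId id = (SourceId) se.getSession().getAttribute("sourceId");
        if (id != null) {
            WebApplicationContext ctx = WebApplicationContextUtils
                    .getWebApplicationContext(se.getSession().getServletContext());
            if (ctx != null) {
                ctx.getBean(SourceSessionFactory.class).destroy(id);
            }
        }
    }
}

The listener would also need its own <listener> entry in web.xml. Note that because the cache is shared, destroying a factory on one user's logout also affects any other session currently pointed at the same database.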

In the end I extended Spring's AbstractRoutingDataSource so that I can create datasources on the fly. I'll update this answer with the full code as soon as everything is working correctly. I have a couple of last things to sort out, but the crux of it is as follows:
@Service
public class DynamicRoutingDataSource extends AbstractRoutingDataSource {
// this is pretty much the same as the above SourceSessionFactory
// but with a map of CustomDataSources instead of
// AnnotationSessionFactoryBeans
@Autowired
private DynamicDataSourceFactory dataSourceFactory;
// This is the sticky part. I currently have a workaround instead.
// Hibernate needs an actual connection upon Spring startup & there's
// also no session in place during Spring initialization. TBC.
// @Resource(name = "UserContext") // scope session, proxy bean
private UserContext userCtx; // something that returns the DB config
@Override
protected SourceId determineCurrentLookupKey() {
return userCtx.getSourceId();
}
@Override
protected CustomDataSource determineTargetDataSource() {
SourceId id = determineCurrentLookupKey();
return dataSourceFactory.getDataSource(id);
}
@Override
public void afterPropertiesSet() {
// we don't need to resolve any data sources
}
// Inherited methods copied here to show what's going on
// @Override
// public Connection getConnection() throws SQLException {
// return determineTargetDataSource().getConnection();
// }
//
// @Override
// public Connection getConnection(String username, String password)
// throws SQLException {
// return determineTargetDataSource().getConnection(username, password);
// }
}
So I just wire up the DynamicRoutingDataSource as the DataSource for Spring's SessionFactoryBean, along with a TransactionManager and all the rest as usual. As I said, more code to follow.
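For reference, that wiring could look roughly like the sketch below, assuming Java-based @Configuration (the equivalent XML bean definitions work the same way); the class and bean names are placeholders, not the original code.

import org.hibernate.SessionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.hibernate3.HibernateTransactionManager;
import org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean;

// Sketch of the "as usual" wiring: the routing DataSource is handed to a single
// SessionFactory and transaction manager, exactly like a plain DataSource would be.
@Configuration
public class PersistenceConfig {

    @Bean
    public AnnotationSessionFactoryBean sessionFactory(DynamicRoutingDataSource routingDataSource) {
        AnnotationSessionFactoryBean factoryBean = new AnnotationSessionFactoryBean();
        factoryBean.setDataSource(routingDataSource);
        factoryBean.setPackagesToScan(new String[] { "com.example.basepackage" });
        // note: Hibernate may still try to grab a connection while building the
        // SessionFactory at startup, which is the "sticky part" mentioned above
        return factoryBean;
    }

    @Bean
    public HibernateTransactionManager transactionManager(SessionFactory sessionFactory) {
        HibernateTransactionManager txManager = new HibernateTransactionManager();
        txManager.setSessionFactory(sessionFactory);
        return txManager;
    }
}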

Related

Apache Ignite Cache Store + HikariCP DataSource

I am trying to set up an Apache Ignite cache store using PostgreSQL as external storage.
public class MyCacheStore extends CacheStoreAdapter<String, MyCache> {
private static final String GET_QUERY = "SELECT * FROM ..";
private static final String UPDATE_QUERY = "UPDATE ...";
private static final String DELETE_QUERY = "DELETE FROM ..";
@CacheStoreSessionResource
private CacheStoreSession session;
@Override
public MyCache load(String key) throws CacheLoaderException {
Connection connection = session.attachment();
try (PreparedStatement preparedStatement = connection.prepareStatement(GET_QUERY)) {
// some stuff
}
}
@Override
public void loadCache(IgniteBiInClosure<String, MyCache> clo, Object... args) {
super.loadCache(clo, args);
}
@Override
public void write(Cache.Entry<? extends String, ? extends MyCache> entry) throws CacheWriterException {
Connection connection = session.attachment();
try (PreparedStatement preparedStatement = connection.prepareStatement(UPDATE_QUERY)) {
// some stuff
}
}
@Override
public void delete(Object key) throws CacheWriterException {
Connection connection = session.attachment();
try (PreparedStatement preparedStatement = connection.prepareStatement(DELETE_QUERY)) {
// some stuff
}
}
}
MyCache is a standard class:
public class MyCache implements Serializable {
@QuerySqlField(index = true, name = "id")
private String id;
public MyCache() {
}
public MyCache(String id) {
this.id = id;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
}
Here is a configuration class
import javax.cache.configuration.Factory;
import javax.cache.configuration.FactoryBuilder;
@Configuration
public class ServiceConfig {
// no problems here
@Bean
@ConfigurationProperties(prefix = "postgre")
DataSource dataSource() {
return DataSourceBuilder
.create()
.build();
}
@Bean
public Ignite igniteInstance(IgniteConfiguration igniteConfiguration) {
return Ignition.start(igniteConfiguration);
}
@Bean
public IgniteConfiguration igniteCfg() {
// some other stuff here
IgniteConfiguration cfg = new IgniteConfiguration();
cfg.setClientMode(true);
CacheConfiguration myCacheConfiguration = new CacheConfiguration("MY_CACHE")
.setIndexedTypes(String.class, MyCache.class)
.setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL)
.setReadThrough(true)
// call the @Bean method so the factory gets the managed DataSource
.setCacheStoreSessionListenerFactories(new MyCacheStoreSessionListenerFactory(dataSource()))
.setCacheStoreFactory(FactoryBuilder.factoryOf(MyCacheStore.class));
cfg.setCacheConfiguration(myCacheConfiguration);
return cfg;
}
private static class MyCacheStoreSessionListenerFactory implements Factory {
DataSource dataSource;
MyCacheStoreSessionListenerFactory(DataSource dataSource) {
this.dataSource = dataSource;
}
@Override
public CacheStoreSessionListener create() {
// Data Source
CacheJdbcStoreSessionListener listener = new CacheJdbcStoreSessionListener();
listener.setDataSource(dataSource);
return listener;
}
}
}
And this is what I get in logs:
...
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to validate cache configuration
(make sure all objects in cache configuration are serializable): MyCache
at org.apache.ignite.internal.processors.cache.GridCacheProcessor$11.applyx(GridCacheProcessor.java:4766)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor$11.applyx(GridCacheProcessor.java:4743)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.withBinaryContext(GridCacheProcessor.java:4788)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.cloneCheckSerializable(GridCacheProcessor.java:4743)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.addCacheOnJoin(GridCacheProcessor.java:818)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.addCacheOnJoinFromConfig(GridCacheProcessor.java:891)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.startCachesOnStart(GridCacheProcessor.java:753)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.start(GridCacheProcessor.java:795)
at org.apache.ignite.internal.IgniteKernal.startProcessor(IgniteKernal.java:1700)
... 77 more
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to serialize object: CacheConfiguration [name=MyCache, grpName=null, memPlcName=null, storeConcurrentLoadAllThreshold=5, rebalancePoolSize=2, rebalanceTimeout=10000, evictPlc=null, evictPlcFactory=null, onheapCache=false, sqlOnheapCache=false, sqlOnheapCacheMaxSize=0, evictFilter=null, eagerTtl=true, dfltLockTimeout=0, nearCfg=null, writeSync=null, storeFactory=javax.cache.configuration.FactoryBuilder$ClassFactory#d87782a1, storeKeepBinary=false, loadPrevVal=false, aff=null, cacheMode=PARTITIONED, atomicityMode=TRANSACTIONAL, backups=0, invalidate=false, tmLookupClsName=null, rebalanceMode=ASYNC, rebalanceOrder=0, rebalanceBatchSize=524288, rebalanceBatchesPrefetchCnt=2, maxConcurrentAsyncOps=500, sqlIdxMaxInlineSize=-1, writeBehindEnabled=false, writeBehindFlushSize=10240, writeBehindFlushFreq=5000, writeBehindFlushThreadCnt=1, writeBehindBatchSize=512, writeBehindCoalescing=true, maxQryIterCnt=1024, affMapper=null, rebalanceDelay=0, rebalanceThrottle=0, interceptor=null, longQryWarnTimeout=3000, qryDetailMetricsSz=0, readFromBackup=true, nodeFilter=null, sqlSchema=null, sqlEscapeAll=false, cpOnRead=true, topValidator=null, partLossPlc=IGNORE, qryParallelism=1, evtsDisabled=false, encryptionEnabled=false]
at org.apache.ignite.marshaller.jdk.JdkMarshaller.marshal0(JdkMarshaller.java:103)
at org.apache.ignite.marshaller.AbstractNodeNameAwareMarshaller.marshal(AbstractNodeNameAwareMarshaller.java:70)
at org.apache.ignite.marshaller.jdk.JdkMarshaller.marshal0(JdkMarshaller.java:117)
at org.apache.ignite.marshaller.AbstractNodeNameAwareMarshaller.marshal(AbstractNodeNameAwareMarshaller.java:58)
at org.apache.ignite.internal.util.IgniteUtils.marshal(IgniteUtils.java:10250)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor$11.applyx(GridCacheProcessor.java:4762)
... 85 more
Caused by: java.io.NotSerializableException: com.zaxxer.hikari.HikariDataSource
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1185)
at java.base/java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1553)
I have read all the official documentation about it and examined many other examples, but I can't make it run.
HikariCP is the most popular connection pool library, so I can't understand why Ignite throws an exception about not being able to serialize the DataSource.
Any advice or idea would be appreciated, thank you!
Since your cache store setup is not serializable, you should not use FactoryBuilder.factoryOf (which is just a thin wrapper around the class) but instead supply a real, serializable factory implementation that acquires the local HikariCP DataSource on each node and then constructs the cache store.
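For illustration, such a factory could look roughly like the sketch below. javax.cache.configuration.Factory is already Serializable, so only the plain string fields travel with the cache configuration, and the non-serializable Hikari pool is created on the node inside create(). The MyCacheStore constructor taking a DataSource is an assumption made for the sketch, not part of the original code.

import javax.cache.configuration.Factory;
import javax.sql.DataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

// Sketch: the factory carries only serializable connection settings;
// the HikariDataSource itself is built locally on each Ignite node.
public class MyCacheStoreFactory implements Factory<MyCacheStore> {

    private final String jdbcUrl;
    private final String username;
    private final String password;

    public MyCacheStoreFactory(String jdbcUrl, String username, String password) {
        this.jdbcUrl = jdbcUrl;
        this.username = username;
        this.password = password;
    }

    @Override
    public MyCacheStore create() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(jdbcUrl);
        config.setUsername(username);
        config.setPassword(password);
        DataSource dataSource = new HikariDataSource(config);
        // assumes MyCacheStore is given a constructor (or setter) that accepts a DataSource
        return new MyCacheStore(dataSource);
    }
}

The configuration would then use setCacheStoreFactory(new MyCacheStoreFactory(url, user, password)) instead of FactoryBuilder.factoryOf(MyCacheStore.class). The same reasoning applies to MyCacheStoreSessionListenerFactory above, which holds the HikariDataSource field directly and is what the NotSerializableException in the stack trace is actually complaining about; it would need to obtain or build the DataSource inside create() as well.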

failed to create a child event loop/failed to open a new selector/Too many open files

I am getting errors like "failed to create a child event loop", "failed to open a new selector", and "Too many open files" when there are 30 or more concurrent requests. How do I solve these errors? Am I doing anything wrong? I am using Spring Boot and the Java Cassandra driver. Below is the connection file:
public class Connection {
public static Session getConnection() {
final Cluster cluster = Cluster.builder().addContactPoint(ConnectionBean.getCASSANDRA_DB_IP())
.withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE))
.withCredentials(ConnectionBean.getCASSANDRA_USER(), ConnectionBean.getCASSANDRA_PASSWORD())
.withPoolingOptions(poolingOptions)
.build();
final Session session = cluster.connect(ConnectionBean.getCASSANDRA_DB_NAME());
return session;
}
}
Below is the ConnectionBean file which I used in Connection file:
public class ConnectionBean {
public static String CASSANDRA_DB_IP;
public static String CASSANDRA_DB_NAME;
public static String CASSANDRA_USER;
public static String CASSANDRA_PASSWORD;
public ConnectionBean() {
}
public ConnectionBean(String CASSANDRA_DB_IP,String CASSANDRA_DB_NAME,String CASSANDRA_USER,String CASSANDRA_PASSWORD) {
this.CASSANDRA_DB_IP=CASSANDRA_DB_IP;
this.CASSANDRA_DB_NAME=CASSANDRA_DB_NAME;
this.CASSANDRA_USER=CASSANDRA_USER;
this.CASSANDRA_PASSWORD=CASSANDRA_PASSWORD;
}
public static String getCASSANDRA_DB_IP() {
return CASSANDRA_DB_IP;
}
public static void setCASSANDRA_DB_IP(String cASSANDRA_DB_IP) {
CASSANDRA_DB_IP = cASSANDRA_DB_IP;
}
public static String getCASSANDRA_DB_NAME() {
return CASSANDRA_DB_NAME;
}
public static void setCASSANDRA_DB_NAME(String cASSANDRA_DB_NAME) {
CASSANDRA_DB_NAME = cASSANDRA_DB_NAME;
}
public static String getCASSANDRA_USER() {
return CASSANDRA_USER;
}
public static void setCASSANDRA_USER(String cASSANDRA_USER) {
CASSANDRA_USER = cASSANDRA_USER;
}
public static String getCASSANDRA_PASSWORD() {
return CASSANDRA_PASSWORD;
}
public static void setCASSANDRA_PASSWORD(String cASSANDRA_PASSWORD) {
CASSANDRA_PASSWORD = cASSANDRA_PASSWORD;
}
}
Below is the class from where ConnectionBean variables are initialized :
public class SecurityConfiguration extends WebSecurityConfigurerAdapter {
private static final String LOGIN_PROCESSING_URL = "/login";
private static final String LOGIN_FAILURE_URL = "/login?error";
private static final String LOGIN_URL = "/login";
@Autowired
private BCryptPasswordEncoder bCryptPasswordEncoder;
@Autowired
private DataSource dataSource;
@Value("${spring.queries.users-query}")
private String usersQuery;
@Value("${spring.queries.roles-query}")
private String rolesQuery;
@Value("${CASSANDRA_DB_IP}")
public String CASSANDRA_DB_IP;
@Value("${CASSANDRA_DB_NAME}")
public String CASSANDRA_DB_NAME;
@Value("${CASSANDRA_USER}")
public String CASSANDRA_USER;
@Value("${CASSANDRA_PASSWORD}")
public String CASSANDRA_PASSWORD;
@Override
protected void configure(AuthenticationManagerBuilder auth) throws Exception {
ConnectionBean cb = new ConnectionBean(CASSANDRA_DB_IP, CASSANDRA_DB_NAME, CASSANDRA_USER, CASSANDRA_PASSWORD);
auth.jdbcAuthentication().usersByUsernameQuery(usersQuery).authoritiesByUsernameQuery(rolesQuery)
.dataSource(dataSource).passwordEncoder(bCryptPasswordEncoder);
}
@Override
protected void configure(HttpSecurity http) throws Exception {
// Not using Spring CSRF here to be able to use plain HTML for the login page
http.csrf().disable()
// Register our CustomRequestCache, that saves unauthorized access attempts, so
// the user is redirected after login.
.requestCache().requestCache(new CustomRequestCache())
// Restrict access to our application.
.and().authorizeRequests()
// Allow all flow internal requests.
.requestMatchers(SecurityUtils::isFrameworkInternalRequest).permitAll()
// Allow all requests by logged in users.
.anyRequest().authenticated()
// Configure the login page.
.and().formLogin().loginPage(LOGIN_URL).permitAll().loginProcessingUrl(LOGIN_PROCESSING_URL)
.failureUrl(LOGIN_FAILURE_URL)
// Register the success handler that redirects users to the page they last tried
// to access
.successHandler(new SavedRequestAwareAuthenticationSuccessHandler())
// Configure logout
.and().logout().logoutSuccessUrl(LOGOUT_SUCCESS_URL);
}
/**
* Allows access to static resources, bypassing Spring security.
*/
@Override
public void configure(WebSecurity web) throws Exception {
web.ignoring().antMatchers(
// Vaadin Flow static resources
"/VAADIN/**",
// the standard favicon URI
"/favicon.ico",
// web application manifest
"/manifest.json", "/sw.js", "/offline-page.html",
// icons and images
"/icons/**", "/images/**",
// (development mode) static resources
"/frontend/**",
// (development mode) webjars
"/webjars/**",
// (development mode) H2 debugging console
"/h2-console/**",
// (production mode) static resources
"/frontend-es5/**", "/frontend-es6/**");
}
}
And finally, below is the class through which I am querying cassandra data:
public class getData {
Session session;
public getData(){
session = Connection.getConnection();
getDataTable();
}
private void getDataTable() {
try {
String query = "SELECT * FROM tableName";
ResultSet rs = session.execute(query);
for (Row row : rs) {
/*Do some stuff here using row*/
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
If getConnection() is being invoked for every request, you are creating a new Cluster instance each time.
This is discouraged because one connection is created between your client and a C* node for each Cluster instance, and for each Session a connection pool of at least one connection is created for each C* node.
If you are not closing your Cluster instances after a request completes, these connections will remain open. After a number of requests, you'll have so many connections open that you will run out of file descriptors in your OS.
To resolve this issue, create only one Cluster and Session instance and reuse it between requests. This strategy is outlined in 4 simple rules when using the DataStax drivers for Cassandra:
Use one Cluster instance per (physical) cluster (per application lifetime)
Use at most one Session per keyspace, or use a single Session and explicitly specify the keyspace in your queries
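Since the application already runs on Spring Boot, one way to apply this is to expose the Cluster and Session as singleton beans and inject the Session wherever it is needed, instead of calling Connection.getConnection() per request. The sketch below is illustrative only; it reuses the property keys already present in SecurityConfiguration, and you would add back the same QueryOptions/PoolingOptions as before.

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Sketch: one Cluster and one Session for the whole application lifetime,
// closed automatically when the Spring context shuts down.
@Configuration
public class CassandraConfig {

    @Value("${CASSANDRA_DB_IP}")
    private String contactPoint;
    @Value("${CASSANDRA_DB_NAME}")
    private String keyspace;
    @Value("${CASSANDRA_USER}")
    private String user;
    @Value("${CASSANDRA_PASSWORD}")
    private String password;

    @Bean(destroyMethod = "close")
    public Cluster cassandraCluster() {
        return Cluster.builder()
                .addContactPoint(contactPoint)
                .withCredentials(user, password)
                .build();
    }

    @Bean(destroyMethod = "close")
    public Session cassandraSession(Cluster cluster) {
        return cluster.connect(keyspace);
    }
}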

CDI #Inject won't work, object stays null

I am trying to use CDI, using @Inject for dependency injection, but my object stays null and won't initialize. More precisely:
I have a web application with a WeatherController which uses a Java application containing all my modules. In the Java application I have a ForecastService where I try to initialize my repositories with CDI, without success.
I have tried and searched a lot. Hopefully somebody can help me here?
I have a web application which uses this controller:
@Path("/weather")
public class WeatherController {
private ForecastService forecastService;
//private ForecastRepository forecastRepository = new ForecastFakeDB();
//private ObservationRepository observationRepository = new ObservationFakeDB();
public WeatherController() {
//this.forecastService.setForecastRepository(forecastRepository);
//forecastService.setObservationRepository(observationRepository);
forecastService = new ForecastService();
}
//localhost:8080/DA_project_weatherPredictions/api/weather/observation/Leuven
@GET
@Produces({"application/json"})
@Path("/observation/{location}")
public Response getObservation(@PathParam("location") String location) {
try {
ObjectMapper mapper = new ObjectMapper();
Observation observation = forecastService.getCurrentObservation(location);
//Object to JSON in String
String jsonInString = mapper.writeValueAsString(observation);
return Response.status(200).entity(jsonInString).build();
} catch (Exception ex) {
System.out.println("error");
System.out.println(ex.getMessage());
ex.printStackTrace();
return null;
}
}
This works perfectly. This is my forecastService:
public class ForecastService implements Service {
@Inject
ForecastRepository forecastRepository;
@Inject
ObservationRepository observationRepository;
private Client client;
private WebTarget webTargetObservation, webTargetForecast;
public ForecastService() {
// WeatherRepositoryFactory weatherRepositoryFactory = new WeatherRepositoryFactory();
// forecastRepository = weatherRepositoryFactory.getForecastRepository(repository);
// observationRepository = weatherRepositoryFactory.getObservationRepository(repository);
loadWeather();
}
public void setForecastRepository(ForecastRepository forecastRepository) {
this.forecastRepository = forecastRepository;
}
public void setObservationRepository(ObservationRepository observationRepository) {
this.observationRepository = observationRepository;
}
public void loadWeather() {
//http://api.openweathermap.org/data/2.5/weather?units=metric&appid=12fa8f41738b72d954b6758d48e129aa&q=BE,Leuven
//http://api.openweathermap.org/data/2.5/forecast?units=metric&appid=12fa8f41738b72d954b6758d48e129aa&q=BE,Leuven
client = ClientBuilder.newClient();
webTargetObservation = client.target("http://api.openweathermap.org/data/2.5/weather")
.queryParam("mode", "json")
.queryParam("units", "metric")
.queryParam("appid", "12fa8f41738b72d954b6758d48e129aa");
webTargetForecast = client.target("http://api.openweathermap.org/data/2.5/forecast")
.queryParam("mode", "json")
.queryParam("units", "metric")
.queryParam("appid", "12fa8f41738b72d954b6758d48e129aa");
}
public Observation getCurrentObservation(String location) throws Exception {
Observation observation;
observation = observationRepository.getObservation(location);
if (observation == null) {
try {
//observation = webTargetObservation.queryParam("q", location).request(MediaType.APPLICATION_JSON).get(Observation.class);
Response response = webTargetObservation.queryParam("q", location).request(MediaType.APPLICATION_JSON).get();
String json = response.readEntity(String.class);
//System.out.println(json);
response.close();
observation = new ObjectMapper().readValue(json, Observation.class);
//System.out.println(observation.getWeather().getDescription());
}
catch (Exception e){
StringBuilder sb = new StringBuilder(e.toString());
for (StackTraceElement ste : e.getStackTrace()) {
sb.append("\n\tat ");
sb.append(ste);
}
String trace = sb.toString();
throw new Exception (trace);
//throw new Exception("Location not found");
}
this.observationRepository.addObservation(observation, location);
}
return observation;
}
So the problem is that my repositories stay null
@Alternative
public class ObservationDB implements ObservationRepository{
//as ID we can use the ASCI value of the String key .. example uklondon to ASCII
public ObservationDB(String name) {
}
@Override
public Observation getObservation(String location) {
throw new UnsupportedOperationException("Not supported yet.");
}
@Override
public void addObservation(Observation observation, String location) {
throw new UnsupportedOperationException("Not supported yet.");
}
}
In-memory DB:
@Default
public class ObservationFakeDB implements ObservationRepository {
//example String key : beleuven, uklondon
private static Map<String, Observation> observations;
public ObservationFakeDB() {
observations = new HashMap<>();
}
@Override
public Observation getObservation(String location) {
return observations.get(location);
}
@Override
public void addObservation(Observation observation, String location) {
observations.put(location, observation);
}
}
I have a beans.xml; I thought beans.xml, @Inject, @Default and @Alternative would make this work. I also tried @Dependent and @ApplicationScoped.
EDIT:
I also often get this warning in NetBeans.
My beans.xml
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
bean-discovery-mode="all">
</beans>
You need to let your CDI container manage the lifecycle of all your beans so that it can properly resolve and inject their dependencies.
So, in your case you should not create the instance of ForecastService yourself; rather, delegate that to the CDI container by simply annotating the forecastService field with @Inject. This way its dependencies will be automatically resolved and set by the container.
public class WeatherController {
@Inject
private ForecastService forecastService;
...
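The same applies one level down: ForecastService itself has to be a container-managed bean for its @Inject fields to be populated. A minimal sketch follows; the scope chosen here is just an example.

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

// Sketch: because the container instantiates this bean, the repositories are injected
// before any business method runs; there is no 'new ForecastService()' anywhere.
@ApplicationScoped
public class ForecastService implements Service {

    @Inject
    ForecastRepository forecastRepository;

    @Inject
    ObservationRepository observationRepository;

    // ... rest of the class unchanged
}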

Hibernate can not write data to mysql correctly

I am trying to record some information about the user who opens the web application I made. The web app is simple: when you open it, it shows the user's IP, user agent, and the time of the last login to the app. I built this app with Spring, Maven, Tomcat, Hibernate and MySQL. Everything seems to work, but when I open the app in the browser it does not make a record in the database; I must manually run Main.java to make a record, and even then the record is not correct because it does not store the real information but writes NULL everywhere. I have two questions: how do I do all this automatically (so that opening the web app creates a record in the DB), and why are my records always NULL?
public class Main {
public static void main(String[] args) {
UserInfo user = new UserInfo();
user.setRollNo(7);
user.setIp(UserController.ip);
user.setUserAgent(UserController.userAgent);
user.setDate((Date) UserController.date);
SessionFactory sessionFactory = new AnnotationConfiguration()
.configure().buildSessionFactory();
Session session = sessionFactory.openSession();
session.beginTransaction();
session.save(user);
session.getTransaction().commit();
session.close();
}
}
UserController.java
@Controller
public class UserController {
public static String ip;
public static String userAgent;
public static Date date;
@Autowired
private HttpServletRequest request;
@RequestMapping(value = "/connection")
public ModelAndView getUserInfo() {
ModelAndView modelandView = new ModelAndView("UserInfo");
ip = request.getRemoteAddr();
modelandView.addObject("msg1", "Your IP is " + ip);
userAgent = request.getHeader("User-Agent");
modelandView.addObject("msg2", "Your User agent is " + userAgent);
date = new java.util.Date();
/*
* Calendar calendar = Calendar.getInstance(); java.sql.Date startDate =
* new java.sql.Date(calendar.getTime() .getTime());
*/
modelandView.addObject("msg3", "Your last log was at " + date);
return modelandView;
}
}
UserInfo.java
@Entity
@Table(name = "USER_INFORMATION")
public class UserInfo {
@Id
private int rollNo;
private String ip;
private String userAgent;
private Date date;
public int getRollNo() {
return rollNo;
}
public void setRollNo(int rollNo) {
this.rollNo = rollNo;
}
public String getIp() {
return ip;
}
public void setIp(String ip) {
this.ip = ip;
}
public String getUserAgent() {
return userAgent;
}
public void setUserAgent(String userAgent) {
this.userAgent = userAgent;
}
public Date getDate() {
return date;
}
public void setDate(Date date) {
this.date = date;
}
}
hibernate.cfg.xml
<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE hibernate-configuration PUBLIC
"-//Hibernate/Hibernate Configuration DTD 3.0//EN"
"http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
<session-factory>
<!-- Database connection settings -->
<property name="connection.driver_class">com.mysql.jdbc.Driver</property>
<property name="connection.url">jdbc:mysql://localhost:3306/browserinfo</property>
<property name="connection.username">root</property>
<property name="connection.password"></property>
<property name="connection.pool_size">1</property>
<property name="dialect">org.hibernate.dialect.MySQLDialect</property>
<property name="cache.provider_class">org.hibernate.cache.NoCacheProvider</property>
<property name="show_sql">true</property>
<property name="hbm2ddl.auto">validate</property>
<mapping class="com.nikola.hellocontroller.UserInfo" />
</session-factory>
</hibernate-configuration>
According to this comment:
@novy154 I am trying to make a record in the DB with the user's IP, user-agent info and the time when the user last logged in to the app. After that I want to display all records in the app.
First of all, you need to have some DataAccessLayer. Something like
public interface UserInfoRepository {
void save(UserInfo userInfo);
Collection<UserInfo> findAll();
}
and its implementation.
Then maybe some service over the Repository, but it doesn't matter in this case, so let's skip it.
You need to inject your repository into controller and:
1) invoke userInfoRepository.save(newlyCreatedUserInfo)
2) pass all UserInfos to your view (by invoking userInfoRepository.findAll() and putting the result in modelAndView) and display it somehow.
Let me know if you need help implementing this.
But this is just for the purpose of learning how to make a simple CRUD-like application, because as it stands it doesn't make much sense: you can't even identify your user, you just track each visitor's IP, user agent and so on.
// edit:
Repository implementation would be something like:
@Repository
@Transactional
public class UserInfoRepositoryImpl implements UserInfoRepository {
@Autowired
private SessionFactory sessionFactory;
@Override
public void save(UserInfo userInfo) {
getCurrentSession().save(userInfo);
}
@Override
public Collection<UserInfo> findAll() {
final Criteria root = getCurrentSession().createCriteria(UserInfo.class);
return Collections.checkedList(root.list(), UserInfo.class);
}
private Session getCurrentSession() {
return sessionFactory.getCurrentSession();
}
}
Then in your controller method you need to invoke save(newlyCreatedUserInfo) to store your info in the database and
modelAndView.addObject("userInfos", repository.findAll());
to retrieve all data and pass it to a view.
Of course remember about
...
@Autowired
private UserInfoRepository userInfoRepository;
...
in controller.
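Putting the pieces together, the controller method could then look roughly like the sketch below; it reuses the existing UserInfo fields and the repository described above, and is only meant to illustrate the two steps.

import javax.servlet.http.HttpServletRequest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.servlet.ModelAndView;

// Sketch: save the visit, then pass all stored visits to the view.
@Controller
public class UserController {

    @Autowired
    private UserInfoRepository userInfoRepository;

    @Autowired
    private HttpServletRequest request;

    @RequestMapping(value = "/connection")
    public ModelAndView getUserInfo() {
        ModelAndView modelAndView = new ModelAndView("UserInfo");

        // 1) build and persist the visit record
        // (UserInfo would also need a generated id instead of the hard-coded rollNo)
        UserInfo info = new UserInfo();
        info.setIp(request.getRemoteAddr());
        info.setUserAgent(request.getHeader("User-Agent"));
        info.setDate(new java.util.Date());
        userInfoRepository.save(info);

        // 2) pass all stored records to the view
        modelAndView.addObject("userInfos", userInfoRepository.findAll());
        return modelAndView;
    }
}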
Note that spring-data-jpa provides more powerful repositories out of the box, so this is reinventing the wheel.

java.sql.SQLException: This function is not supported using HSQL and Spring

Can someone please tell me why I am getting java.sql.SQLException: This function is not supported when using HSQL and Spring? I am trying to insert a new row into my database.
Below is my DAO; I get the error on the mySession.save(message) line:
@Transactional
@Repository
public class MessageDaoImpl implements MessageDao
{
private Log log = null;
@Autowired
private SessionFactory sessionFactory;
public MessageDaoImpl()
{
super();
log = LogFactory.getLog(MessageDaoImpl.class);
}
@SuppressWarnings("unchecked")
@Transactional(readOnly = true, propagation = Propagation.REQUIRED)
public List<Message> listMessages()
{
try
{
return (List<Message>) sessionFactory.getCurrentSession()
.createCriteria(Message.class).list();
} catch (Exception e)
{
log.fatal(e.getMessage());
return null;
}
}
@SuppressWarnings("unchecked")
@Transactional(readOnly = false, propagation = Propagation.REQUIRED)
public void SaveOrUpdateMessage(Message message)
{
try
{
Session mySession = sessionFactory.getCurrentSession();
mySession.save(message);
mySession.flush();
} catch (Exception e)
{
log.fatal(e.getMessage());
}
}
}
Here is my main class:
public static void main(String[] args)
{
ApplicationContext context = new AnnotationConfigApplicationContext(HelloWorldConfig.class);
MessageService mService = context.getBean(MessageService.class);
HelloWorld helloWorld = context.getBean(HelloWorld.class);
/**
* Date: 4/26/13 / 9:26 AM
* Comments:
*
* I added Log4J to the example.
*/
LOGGER.debug("Message from HelloWorld Bean: " + helloWorld.getMessage());
Message message = new Message();
message.setMessage(helloWorld.getMessage());
//
mService.SaveMessage(message);
helloWorld.setMessage("I am in Staten Island, New York");
LOGGER.debug("Message from HelloWorld Bean: " + helloWorld.getMessage());
}
}
Here is my DatabaseConfig:
public class DatabaseConfig
{
private static final Logger LOGGER = getLogger(DatabaseConfig.class);
@Autowired
Environment env;
@Bean
public DataSource dataSource() {
EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
EmbeddedDatabase db = builder.setType(EmbeddedDatabaseType.HSQL).
addScript("schema.sql").build();
return db;
}
@Bean
public DataSource hsqlDataSource() {
BasicDataSource ds = new BasicDataSource();
try {
ds.setDriverClassName("org.hsqldb.jdbcDriver");
ds.setUsername("sa");
ds.setPassword("");
ds.setUrl("jdbc:hsqldb:mem:mydb");
}
catch (Exception e)
{
LOGGER.error(e.getMessage());
}
return ds;
}
@Bean
public SessionFactory sessionFactory()
{
LocalSessionFactoryBean factoryBean = new LocalSessionFactoryBean();
factoryBean.setDataSource(hsqlDataSource());
factoryBean.setHibernateProperties(getHibernateProperties());
factoryBean.setPackagesToScan(new String[]{"com.xxxxx.HelloSpringJavaBasedJavaConfig.model"});
try
{
factoryBean.afterPropertiesSet();
} catch (IOException e)
{
LOGGER.error(e.getMessage());
e.printStackTrace(); //To change body of catch statement use File | Settings | File Templates.
}
return factoryBean.getObject();
}
@Bean
public Properties getHibernateProperties()
{
Properties hibernateProperties = new Properties();
hibernateProperties.setProperty("hibernate.dialect", env.getProperty("hibernate.dialect"));
hibernateProperties.setProperty("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
hibernateProperties.setProperty("hibernate.use_sql_comments", env.getProperty("hibernate.use_sql_comments"));
hibernateProperties.setProperty("hibernate.format_sql", env.getProperty("hibernate.format_sql"));
hibernateProperties.setProperty("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
hibernateProperties.setProperty("hibernate.generate_statistics", env.getProperty("hibernate.generate_statistics"));
hibernateProperties.setProperty("javax.persistence.validation.mode", env.getProperty("javax.persistence.validation.mode"));
//Audit History flags
hibernateProperties.setProperty("org.hibernate.envers.store_data_at_delete", env.getProperty("org.hibernate.envers.store_data_at_delete"));
hibernateProperties.setProperty("org.hibernate.envers.global_with_modified_flag", env.getProperty("org.hibernate.envers.global_with_modified_flag"));
return hibernateProperties;
}
@Bean
public HibernateTransactionManager hibernateTransactionManager()
{
HibernateTransactionManager htm = new HibernateTransactionManager();
htm.setSessionFactory(sessionFactory());
htm.afterPropertiesSet();
return htm;
}
}
and here is what I am getting to the console:
Exception in thread "main" org.hibernate.AssertionFailure: null id in com.xxx.HelloSpringJavaBasedJavaConfig.model.Message entry (don't flush the Session after an exception occurs)
Here is my message model bean:
@Entity
@Table(name = "messages")
public class Message
{
@Id
@Column(name = "id")
@GeneratedValue(strategy = GenerationType.AUTO)
private String id;
@Column(name = "message")
private String message;
public String getId()
{
return id;
}
public void setId(String id)
{
this.id = id;
}
public String getMessage()
{
return message;
}
public void setMessage(String message)
{
this.message = message;
}
@Override
public String toString()
{
return "Message{" +
"id='" + id + '\'' +
", message='" + message + '\'' +
'}';
}
}
This relates to the version of HSQLDB being used. The version causing the issue was probably 1.8 together with Hibernate 4; you need to use 2.2.9 instead.
You can't use a String id with @GeneratedValue and the strategy GenerationType.AUTO, since that uses a sequence generator and sequence generators can't be used with non-numerical values. You have to set the id yourself, or use an Integer or Long if you want it to be generated for you.
Hibernate docs
Or use an id generator that produces string values:
@Id
@GeneratedValue(generator = "system-uuid")
@GenericGenerator(name = "system-uuid", strategy = "uuid")
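Alternatively, if a generated numeric key is acceptable, the field can simply be switched to a numeric type as mentioned above (sketch):

// Sketch: a numeric primary key works with GenerationType.AUTO out of the box.
@Id
@Column(name = "id")
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;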
It was a version issue. I updated the versions and now everything works.
I had the same issue after I upgraded Hibernate to version 4.2.8. Looking in the logs, I noticed that the SQL query generated by Hibernate tried to insert a record with a null primary key. The field was annotated just with @Id and @GeneratedValue.
Upgrading HSQLDB to version 2.2.9 made this disappear, just as Denis said, and I am very thankful to him for the reply.
It seems very likely that this issue is related to attempting to use a Session which has already signaled an error. As Sotirios mentioned, it is a bad idea to catch exceptions in your DAO, and if you do, it is critical that you re-throw them.
Normally, once you catch a Hibernate exception, you must roll back your transaction and close the session as the session state may no longer be valid (Hibernate core documentation).
If the Session throws an exception, including any SQLException, immediately rollback the database transaction, call Session.close() and discard the Session instance. Certain methods of Session will not leave the session in a consistent state. No exception thrown by Hibernate can be treated as recoverable. Ensure that the Session will be closed by calling close() in a finally block.
Since you're using Spring, most of that doesn't apply to you, but the exception message you are seeing indicates that the actual cause of your problem was probably related to catching an exception and then continuing to use the same session.
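In practice that means the DAO should not swallow the exception inside the transactional method. A trimmed sketch of SaveOrUpdateMessage without the try/catch:

// Sketch: let exceptions propagate so Spring's transaction manager can roll back,
// instead of flushing a session that has already failed.
@Transactional(readOnly = false, propagation = Propagation.REQUIRED)
public void SaveOrUpdateMessage(Message message)
{
    Session mySession = sessionFactory.getCurrentSession();
    mySession.save(message); // any HibernateException now reaches the caller and triggers a rollback
    // no manual flush needed: Spring flushes on commit
}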
