Issues with Hibernate and C3P0 pool - Java

I'm testing a web application with Hibernate 4.3.0 and a C3P0 pool, and sometimes after reloading the context I get this error:
INFO: Reloading Context with name [/AppName] is completed
янв 07, 2015 12:07:14 AM org.apache.catalina.loader.WebappClassLoader loadClass
INFO: Illegal access: this web application instance has been stopped already. Could not load com.mchange.v2.resourcepool.BasicResourcePool$AsyncTestIdleResourceTask. The eventual following stack trace is caused by an error thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access, and has no functional impact.
java.lang.IllegalStateException
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1612)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1571)
at com.mchange.v2.resourcepool.BasicResourcePool.checkIdleResources(BasicResourcePool.java:1584)
at com.mchange.v2.resourcepool.BasicResourcePool.access$2000(BasicResourcePool.java:44)
at com.mchange.v2.resourcepool.BasicResourcePool$CheckIdleResourcesTask.run(BasicResourcePool.java:2116)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Exception in thread "C3P0PooledConnectionPoolManager[identityToken->1hge136961hmilhm17ylc97|17d425e]-AdminTaskTimer" java.lang.NoClassDefFoundError: com/mchange/v2/resourcepool/BasicResourcePool$AsyncTestIdleResourceTask
at com.mchange.v2.resourcepool.BasicResourcePool.checkIdleResources(BasicResourcePool.java:1584)
at com.mchange.v2.resourcepool.BasicResourcePool.access$2000(BasicResourcePool.java:44)
at com.mchange.v2.resourcepool.BasicResourcePool$CheckIdleResourcesTask.run(BasicResourcePool.java:2116)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Caused by: java.lang.ClassNotFoundException: com.mchange.v2.resourcepool.BasicResourcePool$AsyncTestIdleResourceTask
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1720)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1571)
... 5 more
Dependencies (maybe the hibernate-c3p0 dependency alone is enough?):
<dependency>
<groupId>com.mchange</groupId>
<artifactId>c3p0</artifactId>
<version>0.9.2.1</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-c3p0</artifactId>
<version>4.3.0.Final</version>
</dependency>
Hibernate settings:
<property name="connection.provider_class">org.hibernate.connection.C3P0ConnectionProvider</property>
<property name="hibernate.c3p0.min_size">5</property>
<property name="hibernate.c3p0.max_size">20</property>
<property name="hibernate.c3p0.timeout">3000</property>
<property name="hibernate.c3p0.max_statements">50</property>
<property name="hibernate.c3p0.idle_test_period">300</property>
What is the reason for this error and how do I fix it?
All dependencies related to hibernate-c3p0 are on the classpath.

Please refer to the links I added as comments. During redeployment, classes are unloaded; if you fail to close Hibernate's SessionFactory, you get these ClassNotFound errors. One of the answers suggests adding a custom ServletContextListener to handle the contextDestroyed event.
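A minimal sketch of such a listener, assuming the SessionFactory is reachable through a hypothetical HibernateUtil holder class (adapt this to however your application actually builds it):

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

@WebListener
public class HibernateShutdownListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // nothing to do at startup
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // Closing the SessionFactory shuts down the C3P0 pool it owns,
        // so the pool's timer threads stop before Tomcat unloads the webapp's classes.
        HibernateUtil.getSessionFactory().close(); // HibernateUtil is a placeholder for your own holder
    }
}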


Getting Hazelcast Native Client working with Hibernate 5.2.x

I'm trying to get Hazelcast working with Hibernate, but unless I use the super_client option, it doesn't start up.
According to the docs, Super Client should only be used if your app is in the same RAC or data center. Locally this will be the case, but in production they will most definitely be separated, so Native Client is the only option that will work for us.
Super Client is a member of the cluster, it has socket connection to every member in the cluster and it knows where the data is so it will get to the data much faster. But Super Client has the clustering overhead and it must be on the same data center even on the same RAC. However Native client is not member and relies on one of the cluster members. Native Clients can be anywhere in the LAN or WAN. It scales much better and overhead is quite less. So if your clients are less than Hazelcast nodes then Super client can be an option; otherwise definitely try Native Client. As a rule of thumb: Try Native client first, if it doesn't perform well enough for you, then consider Super client.
The best option for starting up Hazelcast seems to be using Docker:
docker pull hazelcast/hazelcast:3.10.4
docker run --name=hazelcast -d=true -p 5701:5701 hazelcast/hazelcast:3.10.4
And this is what it looks like once it is up and running. I double-checked that the Hazelcast port, 5701, is exposed, which it clearly is:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
77a5a0bed5eb hazelcast/hazelcast:3.10.4 "bash -c 'set -euo p…" 3 days ago Up 6 hours 0.0.0.0:5701->5701/tcp hazelcast
The Docker Hub docs also mention how to pass in JAVA_OPTS. I'm not sure whether this is required or optional and what its purpose is, but it didn't help me get up and running:
-e JAVA_OPTS="-Dhazelcast.local.publicAddress=127.0.0.1:5701"
telnet 127.0.0.1 5701 successfully connects to localhost:5701, so I know the port is open. The Docker docs don't mention what the default password is for this running Hazelcast instance; my assumption is that it's empty, or that the password is dev-pass as mentioned in several older tutorials.
I'm using Hibernate 5.2.13.Final
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>${hibernate.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${hibernate.version}</version>
<exclusions>
<exclusion>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
</exclusion>
<exclusion>
<groupId>dom4j</groupId>
<artifactId>dom4j</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>${hibernate-validator.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-c3p0</artifactId>
<version>${hibernate.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-java8</artifactId>
<version>${hibernate.version}</version>
</dependency>
For Hazelcast, according to the docs, two dependencies are required:
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>3.10.4</version>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast-hibernate52</artifactId>
<version>1.2.3</version>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast-client</artifactId>
<version>3.10.4</version>
</dependency>
The docs show several links. Clicking Hibernate 5 shows that hazelcast-hibernate52 is the correct dependency.
When I click See here for details, I'm greeted with docs that look somewhat outdated.
Assuming there's only a typo, I go through to the example:
<!DOCTYPE hibernate-configuration SYSTEM "http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
<session-factory>
<property name="hibernate.cache.use_second_level_cache">true</property>
<property name="hibernate.cache.use_query_cache">false</property>
<property name="hibernate.cache.use_minimal_puts">true</property>
<property name="hibernate.cache.region.factory_class">com.hazelcast.hibernate.HazelcastCacheRegionFactory</property>
<property name="hibernate.cache.hazelcast.use_native_client">false</property>
<property name="hibernate.cache.hazelcast.native_client_hosts">127.0.0.1</property>
<property name="hibernate.cache.hazelcast.native_client_group">hibernate</property>
<property name="hibernate.cache.hazelcast.native_client_password">password</property>
<property name="hibernate.connection.driver_class">org.apache.derby.jdbc.EmbeddedDriver</property>
<property name="hibernate.connection.url">jdbc:derby:hibernateDB</property>
<mapping resource="Employee.hbm.xml"/>
</session-factory>
</hibernate-configuration>
In the example, use_native_client is set to false, yet the native client is still being configured. Is this a typo, or is this the correct configuration?
I'm trying out these settings on a standard Hibernate/Postgres/C3P0 setup. Here's my persistence.xml:
<properties>
<!-- Hibernate Config -->
<property name="hibernate.dialect" value="org.hibernate.dialect.PostgreSQL95Dialect" />
<property name="hibernate.generate_statistics" value="false" />
<property name="hibernate.hbm2ddl.auto" value="validate"/>
<property name="hibernate.physical_naming_strategy" value="za.co.convirt.util.CustomApplicationNamingStrategy"/>
<property name="hibernate.connection.charSet" value="UTF-8"/>
<property name="hibernate.show_sql" value="false" />
<property name="hibernate.format_sql" value="false"/>
<property name="hibernate.use_sql_comments" value="false"/>
<!-- JDBC Config -->
<property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver" />
<property name="javax.persistence.jdbc.time_zone" value="UTC" />
<property name="hibernate.jdbc.time_zone" value="UTC"/>
<!-- Connection Pool -->
<property name="hibernate.connection.provider_class" value="org.hibernate.connection.C3P0ConnectionProvider" />
<property name="hibernate.c3p0.max_size" value="5" />
<property name="hibernate.c3p0.min_size" value="1" />
<property name="hibernate.c3p0.acquire_increment" value="1" />
<property name="hibernate.c3p0.idle_test_period" value="300" />
<property name="hibernate.c3p0.max_statements" value="0" />
<property name="hibernate.c3p0.timeout" value="100" />
<!-- Batch writing -->
<property name="hibernate.jdbc.batch_size" value = "50"/>
<property name="hibernate.order_updates" value = "true"/>
<property name="hibernate.jdbc.batch_versioned_data" value = "true"/>
</properties>
Some params are entered programmatically via the following (this has been in use for ages, so I know it works, but I'm adding it here in case it makes the code clearer):
fun paramsFromArgs(args: Array<String>): Map<String, String> {
val hibernateMap = mutableMapOf<String, String>()
args.forEach {
if (it.isNotBlank()) {
if (it.startsWith("hibernate") || it.startsWith("javax.persistence")) {
val split = it.split("=", limit = 2)
hibernateMap.put(split.get(0), split.get(1))
}
}
}
return hibernateMap
}
Now I set up the second-level cache with Hazelcast:
paramsDefault.add("hibernate.cache.use_query_cache=true")
paramsDefault.add("hibernate.cache.use_second_level_cache=true")
paramsDefault.add("hibernate.cache.region.factory_class=com.hazelcast.hibernate.HazelcastCacheRegionFactory")
paramsDefault.add("hibernate.cache.provider_configuration_file_resource_path=hazelcast.xml")
paramsDefault.add("hibernate.cache.hazelcast.use_native_client=false")
paramsDefault.add("hibernate.cache.hazelcast.native_client_address=127.0.0.1")
paramsDefault.add("hibernate.cache.hazelcast.native_client_group=dev")
paramsDefault.add("hibernate.cache.hazelcast.native_client_password=dev-pass22222asfasdf")
paramsDefault.add("hibernate.cache.hazelcast.client.statistics.enabled=true")
Database.setupEntityManagerFactory("default",
Database.paramsFromArgs(paramsDefault.toTypedArray()))
Setting use_native_client to false like in the example doesn't seem to do anything; with the logs in debug mode, I'm not seeing anything Hazelcast-related.
Switching it to true (which makes more sense, considering it's being configured with a password and IP address), it bombs out on startup.
hibernate.cache.hazelcast.use_native_client=true
hibernate.cache.hazelcast.native_client_address=127.0.0.1
hibernate.cache.hazelcast.native_client_group=dev
hibernate.cache.hazelcast.native_client_password=dev-pass
DEB [16:18:26.531] setup org.hibernate.jpa.internal.util.LogHelper PersistenceUnitInfo [
name: default
persistence provider classname: org.hibernate.jpa.HibernatePersistenceProvider
classloader: null
excludeUnlistedClasses: false
JTA datasource: null
Non JTA datasource: null
Transaction type: RESOURCE_LOCAL
PU root URL: file:/Users/vlad/Code/.../...
Shared Cache Mode: null
Validation Mode: null
Jar files URLs []
Managed classes names []
Mapping files names []
Properties [
...
hibernate.jdbc.time_zone: UTC
javax.persistence.jdbc.password:
hibernate.cache.region.factory_class: com.hazelcast.hibernate.HazelcastCacheRegionFactory
hibernate.c3p0.idle_test_period: 300
hibernate.cache.hazelcast.use_native_client: true
...
hibernate.cache.hazelcast.native_client_group: dev
...
javax.persistence.jdbc.driver: org.postgresql.Driver
hibernate.use_sql_comments: false
hibernate.cache.hazelcast.native_client_address: 127.0.0.1
...
hibernate.cache.hazelcast.client.statistics.enabled: true
hibernate.dialect: org.hibernate.dialect.PostgreSQL95Dialect
hibernate.cache.provider_configuration_file_resource_path: hazelcast.xml]
HazelcastCacheRegionFactory is being used according to the logs:
DEB [16:18:26.884] setup org.hibernate.cache.internal.RegionFactoryInitiator
Cache region factory : com.hazelcast.hibernate.HazelcastCacheRegionFactory
Followed by two log entries that don't follow my logging standard (I'm guessing it's not using SLF4J?):
Sep 12, 2018 2:18:29 PM com.hazelcast.hibernate.HazelcastCacheRegionFactory
INFO: Starting up HazelcastCacheRegionFactory
... and then Unable to build Hibernate Session Factory:
ERR [16:18:29.802] setup ApplicationApi [PersistenceUnit: default] Unable to build Hibernate SessionFactory
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.persistenceException (EntityManagerFactoryBuilderImpl.java:970)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build (EntityManagerFactoryBuilderImpl.java:895)
at org.hibernate.jpa.HibernatePersistenceProvider.createEntityManagerFactory (HibernatePersistenceProvider.java:58)
at javax.persistence.Persistence.createEntityManagerFactory (Persistence.java:55)
at za.co.convirt.util.Database.setupEntityManagerFactory (Database.kt:20)
at za.co.convirt.util.Database.setupEntityManagerFactory$default (Database.kt:19)
at ApplicationApi$main$hibernateThread$1.invoke (ApplicationApi.kt:171)
at ApplicationApi$main$hibernateThread$1.invoke (ApplicationApi.kt:26)
at kotlin.concurrent.ThreadsKt$thread$thread$1.run (Thread.kt:30)
To make sure that it's not failing due to missing annotations on entities, I added a few @Cache annotations to entities, but it makes no difference.
@Table
@Entity
@EntityListeners(AuditListener::class)
@PersistenceContext(unitName = "default")
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE, region = "Seat")
class Seat(
name: String,
...
I've also added a hazelcast.xml; I'm not sure whether this is needed:
<hazelcast
xmlns="http://www.hazelcast.com/schema/config"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="
http://www.hazelcast.com/schema/config
http://www.hazelcast.com/schema/config/hazelcast-config-3.11.xsd">
<services enable-defaults="true"/>
</hazelcast>
Is Hibernate 5.2.x supported? (This ticket shows that the problems with Hibernate 5.2 were fixed, so my assumption is that it should work.)
I want to run a stand-alone Hazelcast instance on one server and have multiple instances of the application use it as a central cache location. What am I missing from the setup to make it work?
Update 1:
I've written a small piece of code that successfully connects to the local Hazelcast instance (this is on my dev machine, same as the rest of the code):
import com.hazelcast.client.HazelcastClient
import com.hazelcast.client.config.ClientConfig
import java.util.*
fun main(args: Array<String>) {
val config = ClientConfig()
config.getNetworkConfig().addAddress("127.0.0.1:5701")
val hazelcastInstance = HazelcastClient.newHazelcastClient(config)
val map = hazelcastInstance.getMap<String, String>("blah")
map.forEach { t, u ->
println(" $t -> $u ")
}
map.put("${Random().nextInt()}", "${Random().nextInt()}")
hazelcastInstance.shutdown()
}
To prove that it's storing and retrieving from the cache, I rerun the main method several times, and each time the number of entries in blah increases.
Run1:
No printlns
Run2:
1498523740 -> -1418154711
Run3:
1498523740 -> -1418154711
-248583979 -> -940621527
So Hazelcast is working correctly ...
Update 2:
I can now connect to Hazelcast from Hibernate, but it's throwing an exception for every lookup it has to do.
After removing hazelcast.xml from my classpath and then removing the group and password options, Hibernate starts up and connects.
paramsDefault.add("hibernate.cache.use_query_cache=true")
paramsDefault.add("hibernate.cache.use_second_level_cache=true")
paramsDefault.add("hibernate.cache.region.factory_class=com.hazelcast.hibernate.HazelcastCacheRegionFactory")
paramsDefault.add("hibernate.cache.hazelcast.use_native_client=true")
paramsDefault.add("hibernate.cache.hazelcast.native_client_address=127.0.0.1")
Outputs:
Sep 13, 2018 6:02:37 PM com.hazelcast.hibernate.HazelcastCacheRegionFactory
INFO: Starting up HazelcastCacheRegionFactory
Sep 13, 2018 6:02:37 PM com.hazelcast.core.LifecycleService
INFO: hz.client_0 [dev] [3.10.4] HazelcastClient 3.10.4 (20180727 - 0f51fcf) is STARTING
Sep 13, 2018 6:02:38 PM com.hazelcast.client.spi.ClientInvocationService
INFO: hz.client_0 [dev] [3.10.4] Running with 2 response threads
Sep 13, 2018 6:02:38 PM com.hazelcast.core.LifecycleService
INFO: hz.client_0 [dev] [3.10.4] HazelcastClient 3.10.4 (20180727 - 0f51fcf) is STARTED
Sep 13, 2018 6:02:38 PM com.hazelcast.client.connection.ClientConnectionManager
INFO: hz.client_0 [dev] [3.10.4] Trying to connect to [127.0.0.1]:5701 as owner member
Sep 13, 2018 6:02:38 PM com.hazelcast.client.connection.ClientConnectionManager
INFO: hz.client_0 [dev] [3.10.4] Setting ClientConnection{alive=true, connectionId=1, channel=NioChannel{/127.0.0.1:61191->/127.0.0.1:5701}, remoteEndpoint=[127.0.0.1]:5701, lastReadTime=2018-09-13 18:02:38.356, lastWriteTime=2018-09-13 18:02:38.352, closedTime=never, lastHeartbeatRequested=never, lastHeartbeatReceived=never, connected server version=3.10.4} as owner with principal ClientPrincipal{uuid='532bf500-e03e-4620-a9c2-14bb55c07166', ownerUuid='2fb66fa1-a17f-49fe-ba2b-bf585d43906d'}
Sep 13, 2018 6:02:38 PM com.hazelcast.client.connection.ClientConnectionManager
INFO: hz.client_0 [dev] [3.10.4] Authenticated with server [127.0.0.1]:5701, server version:3.10.4 Local address: /127.0.0.1:61191
Sep 13, 2018 6:02:38 PM com.hazelcast.client.spi.impl.ClientMembershipListener
INFO: hz.client_0 [dev] [3.10.4]
Members [1] {
Member [127.0.0.1]:5701 - 2fb66fa1-a17f-49fe-ba2b-bf585d43906d
}
Sep 13, 2018 6:02:38 PM com.hazelcast.core.LifecycleService
INFO: hz.client_0 [dev] [3.10.4] HazelcastClient 3.10.4 (20180727 - 0f51fcf) is CLIENT_CONNECTED
Sep 13, 2018 6:02:38 PM com.hazelcast.internal.diagnostics.Diagnostics
INFO: hz.client_0 [dev] [3.10.4] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments.
However, any entity retrieval that makes a call to Hazelcast just stalls.
I've restarted Hazelcast with JAVA_OPTS to see if it makes a difference; it doesn't seem to:
docker run --name=hazelcast -d=true -p 5701:5701 -e JAVA_OPTS="-Dhazelcast.local.publicAddress=127.0.0.1:5701" hazelcast/hazelcast:3.10.4
Digging into Hazelcast logs using:
docker logs -f hazelcast
I'm seeing the following:
Sep 13, 2018 6:02:11 PM com.hazelcast.client.ClientEndpointManager
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Destroying ClientEndpoint{connection=Connection[id=2, /172.17.0.2:5701->/172.17.0.1:56514, endpoint=[172.17.0.1]:56514, alive=false, type=JAVA_CLIENT], principal='ClientPrincipal{uuid='d8a9b730-c5fd-458c-9ab6-671aece99305', ownerUuid='2fb66fa1-a17f-49fe-ba2b-bf585d43906d'}, ownerConnection=true, authenticated=true, clientVersion=3.10.4, creationTime=1536861657874, latest statistics=null}
Sep 13, 2018 6:02:38 PM com.hazelcast.nio.tcp.TcpIpAcceptor
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Accepting socket connection from /172.17.0.1:56516
Sep 13, 2018 6:02:38 PM com.hazelcast.nio.tcp.TcpIpConnectionManager
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Established socket connection between /172.17.0.2:5701 and /172.17.0.1:56516
Sep 13, 2018 6:02:38 PM com.hazelcast.client.impl.protocol.task.AuthenticationMessageTask
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Received auth from Connection[id=3, /172.17.0.2:5701->/172.17.0.1:56516, endpoint=null, alive=true, type=JAVA_CLIENT], successfully authenticated, principal: ClientPrincipal{uuid='532bf500-e03e-4620-a9c2-14bb55c07166', ownerUuid='2fb66fa1-a17f-49fe-ba2b-bf585d43906d'}, owner connection: true, client version: 3.10.4
Sep 13, 2018 6:03:11 PM com.hazelcast.transaction.TransactionManagerService
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Committing/rolling-back live transactions of client, UUID: d8a9b730-c5fd-458c-9ab6-671aece99305
Upon hitting the cache:
Sep 13, 2018 6:05:43 PM com.hazelcast.map.impl.operation.EntryOperation
SEVERE: [127.0.0.1]:5701 [dev] [3.10.4] java.lang.ClassNotFoundException: org.hibernate.cache.spi.entry.StandardCacheEntryImpl
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: org.hibernate.cache.spi.entry.StandardCacheEntryImpl
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:86)
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:75)
at com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:48)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.readObject(AbstractSerializationService.java:269)
at com.hazelcast.internal.serialization.impl.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:574)
at com.hazelcast.hibernate.serialization.Value.readData(Value.java:78)
at com.hazelcast.internal.serialization.impl.DataSerializableSerializer.readInternal(DataSerializableSerializer.java:160)
at com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:106)
at com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:51)
at com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:48)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.toObject(AbstractSerializationService.java:187)
at com.hazelcast.query.impl.CachedQueryEntry.getValue(CachedQueryEntry.java:75)
at com.hazelcast.hibernate.distributed.LockEntryProcessor.process(LockEntryProcessor.java:49)
at com.hazelcast.hibernate.distributed.LockEntryProcessor.process(LockEntryProcessor.java:32)
at com.hazelcast.map.impl.operation.EntryOperator.process(EntryOperator.java:319)
at com.hazelcast.map.impl.operation.EntryOperator.operateOnKeyValueInternal(EntryOperator.java:182)
at com.hazelcast.map.impl.operation.EntryOperator.operateOnKey(EntryOperator.java:167)
at com.hazelcast.map.impl.operation.EntryOperation.runVanilla(EntryOperation.java:384)
at com.hazelcast.map.impl.operation.EntryOperation.call(EntryOperation.java:188)
at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.call(OperationRunnerImpl.java:202)
at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:191)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationExecutorImpl.run(OperationExecutorImpl.java:406)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationExecutorImpl.runOrExecute(OperationExecutorImpl.java:433)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.doInvokeLocal(Invocation.java:581)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.doInvoke(Invocation.java:566)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.invoke0(Invocation.java:525)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.invoke(Invocation.java:215)
at com.hazelcast.spi.impl.operationservice.impl.InvocationBuilderImpl.invoke(InvocationBuilderImpl.java:60)
at com.hazelcast.client.impl.protocol.task.AbstractPartitionMessageTask.processMessage(AbstractPartitionMessageTask.java:67)
at com.hazelcast.client.impl.protocol.task.AbstractMessageTask.initializeAndProcessMessage(AbstractMessageTask.java:123)
at com.hazelcast.client.impl.protocol.task.AbstractMessageTask.doRun(AbstractMessageTask.java:111)
at com.hazelcast.client.impl.protocol.task.AbstractMessageTask.run(AbstractMessageTask.java:101)
at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:155)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.process(OperationThread.java:125)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.run(OperationThread.java:100)
Caused by: java.lang.ClassNotFoundException: org.hibernate.cache.spi.entry.StandardCacheEntryImpl
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.hazelcast.nio.ClassLoaderUtil.tryLoadClass(ClassLoaderUtil.java:173)
at com.hazelcast.nio.ClassLoaderUtil.loadClass(ClassLoaderUtil.java:147)
at com.hazelcast.nio.IOUtil$ClassLoaderAwareObjectInputStream.resolveClass(IOUtil.java:615)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1749)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2040)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:82)
... 34 more
Do I need to include some kind of JAR inside my Hazelcast Docker setup, or what is happening here?
It looks like you are attempting to use the loopback address from another server, outside of the Docker network.
You may want to try bridged networking to eliminate the Docker network address translation.
Also, since 0.0.0.0 is bound, all IP addresses should have Hazelcast listeners installed.
I would simplify and first validate Hazelcast on its own. If you have the Enterprise edition, use the console application; otherwise write a simple start-server Java main, then attempt to connect with a client using a real IP address. Once this works, then work on the Hibernate configuration.
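A minimal sketch of that validation step with the plain Hazelcast 3.x API (no Hibernate involved; replace 192.168.1.10 with your server's real address, and the map name is arbitrary):

import com.hazelcast.client.HazelcastClient;
import com.hazelcast.client.config.ClientConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class HazelcastSmokeTest {
    public static void main(String[] args) {
        // Start a plain member (stand-in for the Docker container).
        HazelcastInstance member = Hazelcast.newHazelcastInstance();

        // Connect a native client using a real, routable address rather than the loopback.
        ClientConfig config = new ClientConfig();
        config.getNetworkConfig().addAddress("192.168.1.10:5701");
        HazelcastInstance client = HazelcastClient.newHazelcastClient(config);

        // Round-trip a value through the cluster to prove connectivity.
        client.getMap("smoke-test").put("hello", "world");
        System.out.println(member.getMap("smoke-test").get("hello"));

        client.shutdown();
        member.shutdown();
    }
}

Once this round-trips, point Hibernate's native client settings at the same address.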
Finally got it working; here's what was needed:
1: Make sure you have all three of these dependencies. Initially I was missing the first one but, for some reason, wasn't getting a ClassNotFoundException as expected. It doesn't seem to be a transitive dependency of either hazelcast-client or hazelcast-hibernate52.
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>${hazelcast.version}</version>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast-hibernate52</artifactId>
<version>${hazelcast-hibernate.version}</version>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast-client</artifactId>
<version>${hazelcast.version}</version>
</dependency>
2: If your dev instance of Hazelcast doesn't have a password, don't specify one. 127.0.0.1 works fine; there's no need to run on an external server while developing.
paramsDefault.add("hibernate.cache.use_query_cache=true")
paramsDefault.add("hibernate.cache.use_second_level_cache=true")
paramsDefault.add("hibernate.cache.region.factory_class=com.hazelcast.hibernate.HazelcastCacheRegionFactory")
paramsDefault.add("hibernate.cache.hazelcast.use_native_client=true")
paramsDefault.add("hibernate.cache.hazelcast.native_client_address=127.0.0.1")
// paramsDefault.add("hibernate.cache.hazelcast.native_client_group=$ENV")
// paramsDefault.add("hibernate.cache.hazelcast.native_client_password=dev-pass")
3: Get rid of hazelcast.xml. After removing that file, Hibernate actually started up, even though my hazelcast.xml only had a one-liner in it saying to use the default config.
4: Make sure all entities implement Serializable; otherwise entities won't cache and will cause an exception on the Hazelcast server itself.
@Table
@Entity
@BatchSize(size = 50)
@PersistenceContext(unitName = "default")
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE, region = "Tag")
class Tag(
name: String,
) : Serializable {
5: If your entity has @OneToMany, @ManyToOne or other entities inside of it, make sure those entities are Serializable as well.
6: Write a little script to ensure that your entities are caching:
import com.hazelcast.client.HazelcastClient
import com.hazelcast.client.config.ClientConfig
fun main(args: Array<String>) {
val config = ClientConfig()
config.getNetworkConfig().addAddress("127.0.0.1:5701")
val hazelcastInstance = HazelcastClient.newHazelcastClient(config)
val map = hazelcastInstance.getMap<Any, Any>("Tag")
println("=================")
map.forEach { t, u ->
println(" $t -> $u ")
}
println("=================")
hazelcastInstance.shutdown()
}
The above script will print all Tag entities currently in the cache.
7: When starting up the Docker instance, make sure you've exposed the port; without the -p option, nothing will work.
docker run --name=hazelcast -d=true -p 5701:5701 -e JAVA_OPTS="-Dhazelcast.local.publicAddress=127.0.0.1:5701" hazelcast/hazelcast:3.10.4
8: Check the Hazelcast logs to see if your Java / Kotlin client is connecting:
docker logs -f hazelcast
You should see something like this when there's a connection:
Sep 13, 2018 9:05:06 PM com.hazelcast.client.ClientEndpointManager
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Destroying ClientEndpoint{connection=Connection[id=32, /172.17.0.2:5701->/172.17.0.1:56574, endpoint=[172.17.0.1]:56574, alive=false, type=JAVA_CLIENT], principal='ClientPrincipal{uuid='99cbf1b4-d11c-462d-bd87-4c069bc9b2ef', ownerUuid='2fb66fa1-a17f-49fe-ba2b-bf585d43906d'}, ownerConnection=true, authenticated=true, clientVersion=3.10.4, creationTime=1536872631771, latest statistics=null}
Sep 13, 2018 9:05:19 PM com.hazelcast.nio.tcp.TcpIpAcceptor
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Accepting socket connection from /172.17.0.1:56576
Sep 13, 2018 9:05:19 PM com.hazelcast.nio.tcp.TcpIpConnectionManager
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Established socket connection between /172.17.0.2:5701 and /172.17.0.1:56576
Sep 13, 2018 9:05:19 PM com.hazelcast.client.impl.protocol.task.AuthenticationMessageTask
INFO: [127.0.0.1]:5701 [dev] [3.10.4] Received auth from Connection[id=33, /172.17.0.2:5701->/172.17.0.1:56576, endpoint=null, alive=true, type=JAVA_CLIENT], successfully authenticated, principal: ClientPrincipal{uuid='ff51de39-fd9c-4ecf-bdd4-bbdb6ec6c79e', ownerUuid='2fb66fa1-a17f-49fe-ba2b-bf585d43906d'}, owner connection: true, client version: 3.10.4
These seem to be the crux of getting Hazelcast and Hibernate working together.

Why are we getting an exception when we enable the query cache in hibernate.cfg.xml with the Hibernate 5.3.1.Final dependency?

I created a small second-level cache program using the Hibernate 5.3.1.Final dependency. I used the dependencies below to work with the second-level cache.
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>5.3.1.Final</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-ehcache</artifactId>
<version>5.3.1.Final</version>
</dependency>
In this version the package name of EhcacheRegionFactory was changed, so we need to use the property below in our hibernate.cfg.xml file.
<property name="hibernate.cache.region.factory_class">
org.hibernate.cache.ehcache.internal.EhcacheRegionFactory
</property>
In Hibernate 5.3 the second-level cache is working fine, but the query cache is not. I am getting the exception below when I enable the query cache in hibernate.cfg.xml:
Exception in thread "main" java.lang.ExceptionInInitializerError
Caused by: org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.cache.spi.CacheImplementor]
Caused by: org.hibernate.cache.CacheException: On-the-fly creation of JCache Cache objects is not supported [org.hibernate.cache.spi.TimestampsRegion]
How do I solve this?
When I tried the same program with the Hibernate 5.2.17 dependency, the query cache worked fine.
Use the Hibernate ORM Core 5.3.4.Final dependency and it will work.
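For example (a sketch; this assumes both Hibernate artifacts are bumped to the same release):

<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-core</artifactId>
    <version>5.3.4.Final</version>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-ehcache</artifactId>
    <version>5.3.4.Final</version>
</dependency>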

Spring: Java heap space OutOfMemoryError on bean creation exception

I'm migrating my Spring dependency injection from XML declarations to the annotation style.
But now that I've migrated the DAOs, services and controllers, the server won't launch anymore:
java.lang.OutOfMemoryError: Java heap space
I can see in the stack trace (see below) that there is a BeanCreationException, but I can't tell which one, because the OutOfMemoryError occurs while the BeanCreationException's message is being built.
I already tried increasing the -Xmx and -XX:MaxPermSize JVM arguments, but I believe it's something like an infinite loop. The application worked perfectly before the migration to annotations.
Do you have any idea how to find out where the problem comes from?
Versions: Hibernate 3.1, Spring 2.5, Tomcat 7, JDK 7
Typical applicationContext.xml content (before):
<bean id="fooDAO" class="foo.bar.fooDAO">
<property name="sessionFactory">
<ref bean="foo_factory" /> //extends LocalSessionFactoryBean
</property>
</bean>
<bean id="fooService" class="foo.bar.fooService">
<property name="foodDAO" ref="fooDAO" />
</bean>
<bean id="fooControler" class="foo.bar.fooControler" autowire="byName">
<property name="fooService" ref="fooService" />
</bean>
After migration: @Repository on the DAOs, @Service on the services and @Controller on the controllers (a sketch of the result follows below).
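Roughly, the annotated equivalents look like this (a sketch only; the class and field names mirror the XML above, the @Autowired wiring is my assumption about how the migration was done, and each class lives in its own file):

import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;

@Repository
class fooDAO {
    @Autowired
    private SessionFactory sessionFactory; // replaces <property name="sessionFactory"><ref bean="foo_factory"/></property>
}

@Service
class fooService {
    @Autowired
    private fooDAO fooDAO; // replaces <property name="foodDAO" ref="fooDAO"/>
}

@Controller
class fooControler {
    @Autowired
    private fooService fooService; // replaces <property name="fooService" ref="fooService"/>
}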
Stack trace:
mars 12, 2018 1:37:51 PM org.apache.catalina.core.ContainerBase startInternal
GRAVE: A child container failed during start
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:188)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:1239)
at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:819)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1700)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1690)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2367)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415)
at java.lang.StringBuffer.append(StringBuffer.java:232)
at org.springframework.beans.factory.BeanCreationException.toString(BeanCreationException.java:154)
at java.lang.String.valueOf(String.java:2849)
at java.lang.StringBuffer.append(StringBuffer.java:232)
at org.springframework.beans.factory.BeanCreationException.toString(BeanCreationException.java:154)
at java.lang.String.valueOf(String.java:2849)
at java.lang.StringBuffer.append(StringBuffer.java:232)
at org.springframework.beans.factory.BeanCreationException.toString(BeanCreationException.java:154)
at java.lang.String.valueOf(String.java:2849)
at java.lang.StringBuffer.append(StringBuffer.java:232)
at org.springframework.core.NestedExceptionUtils.buildMessage(NestedExceptionUtils.java:47)
at org.springframework.core.NestedRuntimeException.getMessage(NestedRuntimeException.java:67)
at java.lang.Throwable.getLocalizedMessage(Throwable.java:391)
at java.lang.Throwable.toString(Throwable.java:480)
at org.springframework.beans.factory.BeanCreationException.toString(BeanCreationException.java:149)
at java.lang.String.valueOf(String.java:2849)
at java.lang.StringBuffer.append(StringBuffer.java:232)
at org.springframework.core.NestedExceptionUtils.buildMessage(NestedExceptionUtils.java:47)
at org.springframework.core.NestedRuntimeException.getMessage(NestedRuntimeException.java:67)
at java.lang.Throwable.getLocalizedMessage(Throwable.java:391)
at java.lang.Throwable.toString(Throwable.java:480)
at org.springframework.beans.factory.BeanCreationException.toString(BeanCreationException.java:149)
at java.lang.String.valueOf(String.java:2849)
at java.lang.StringBuffer.append(StringBuffer.java:232)
at org.springframework.core.NestedExceptionUtils.buildMessage(NestedExceptionUtils.java:47)
at org.springframework.core.NestedRuntimeException.getMessage(NestedRuntimeException.java:67)
at java.lang.Throwable.getLocalizedMessage(Throwable.java:391)
at java.lang.Throwable.toString(Throwable.java:480)
mars 12, 2018 1:37:51 PM org.apache.catalina.core.ContainerBase startInternal
GRAVE: A child container failed during start
java.util.concurrent.ExecutionException: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost]]

Spring, Tomcat, Java

When I run the .war file in Tomcat, the logs show:
ERROR [com.configleon.configurer.WebPropertyConfigurer] - The 'configLocation' variable is not specified in the JVM settings!
ERROR [org.springframework.web.context.ContextLoader] - Context initialization failed
And this is my code:
<!-- configlion property configurator -->
<bean class="com.configleon.configurer.WebPropertyConfigurer">
<property name="propertyResources">
<bean class="com.configleon.resource.WebPropertyResources"/>
</property>
</bean>
Can anyone help me, please?
For the first error, ERROR [com.configleon.configurer.WebPropertyConfigurer], see here.
For the second one,
ERROR [org.springframework.web.context.ContextLoader] - Context initialization failed
make sure the server classpath in your deployment environment includes the Spring JAR library (e.g. spring-2.5.6.jar).
For Spring 3, ContextLoaderListener moved to spring-web.jar; you can get the library from the Maven central repository.
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
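For reference, a minimal web.xml registration of the listener looks roughly like this (the context file path is a placeholder, adjust it to your project layout):

<context-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>/WEB-INF/applicationContext.xml</param-value>
</context-param>
<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>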

Java Eclipse: Hibernate Configuration to AnnotationConfiguration does not work and gives a runtime error [duplicate]

This question already has answers here:
What causes java.lang.IncompatibleClassChangeError?
(21 answers)
Closed 9 years ago.
Recently, I created a Maven project in which I created the following files:
Model class
Calling class (instantiates the model class object and creates the session factory)
hibernate.cfg.xml file
<property name="connection.driver_class">org.postgresql.Driver</property>
<property name="connection.url">jdbc:postgresql://localhost:8080/dev</property>
<property name="connection.username">temp</property>
<property name="connection.password">temp</property>
<property name="connection.pool_size">1</property>
<!-- SQL Dialect -->
<property name="hibernate.dialect">org.hibernate.dialect.PostgreSQLDialect</property>
<property name="hbm2ddl.auto">create</property>
<!-- Names the annotated entity class -->
<!-- List of XML mapping files -->
<mapping class="org.test.livejava.machine.UserDetails" />
Added the following dependencies in Maven.
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<repositories>
    <repository>
        <id>JBoss repository</id>
        <url>http://repository.jboss.org/nexus/content/groups/public/</url>
    </repository>
</repositories>
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
</dependency>
<dependency>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>hibernate3-maven-plugin</artifactId>
    <version>2.0</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.6</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <version>9.3-1100-jdbc41</version>
</dependency>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
<version>1.0.2</version>
</dependency>
<!-- Hibernate annotation -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>3.6.3.Final</version>
</dependency>
<dependency>
<groupId>javassist</groupId>
<artifactId>javassist</artifactId>
<version>3.12.1.GA</version>
</dependency>
Now, when I run the program, it gives me the following error:
19:04:12,784 INFO org.hibernate.cfg.Environment - Hibernate 3.2.0.cr5
19:04:12,786 INFO org.hibernate.cfg.Environment - hibernate.properties not found
19:04:12,787 INFO org.hibernate.cfg.Environment - Bytecode provider name : cglib
19:04:12,789 INFO org.hibernate.cfg.Environment - using JDK 1.4 java.sql.Timestamp handling
19:04:12,818 INFO org.hibernate.cfg.Configuration - configuring from resource: /hibernate.cfg.xml
19:04:12,818 INFO org.hibernate.cfg.Configuration - Configuration resource: /hibernate.cfg.xml
org.hibernate.MappingException: An AnnotationConfiguration instance is required to use <mapping class="org.test.livejava.machine.UserDetails"/>
at org.hibernate.cfg.Configuration.parseMappingElement(Configuration.java:1524)
at org.hibernate.cfg.Configuration.parseSessionFactory(Configuration.java:1479)
at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1458)
at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1432)
at org.hibernate.cfg.Configuration.configure(Configuration.java:1352)
at org.hibernate.cfg.Configuration.configure(Configuration.java:1338)
at org.test.livejava.machine.machine.main(machine.java:29)
After reading about the deprecated Configuration approach, I used AnnotationConfiguration() and it started giving me the following error:
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.test.livejava.machine.machine.main(machine.java:29)
Has anyone had the same issue before? I am not sure why it is not able to find and load the class correctly, even though all the Maven dependencies seem to be included.
I appreciate your time and help looking into it!
Thanks!
Your Maven dependency lists a Hibernate version of 3.6.3.Final, but your logs indicate a Hibernate version of 3.2.0.cr5 (note that even if you need to stay with Hibernate 3, the bugfixes are up to 3.6.10).
The problem is that an ancient version of Hibernate is being used at runtime: either you aren't using Maven to launch the program, or you're not packaging it into a WAR or fat JAR that contains the Maven-resolved jars.
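One way to check which Hibernate version Maven actually resolves (run from the project directory):

mvn dependency:tree -Dincludes=org.hibernate

If an old Hibernate 3.2.x jar also sits on the Eclipse build path or in the server's lib directory, it can shadow the 3.6.3.Final artifact declared in the pom, which would explain the version reported in the logs.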
