I am getting the error "No Persistence provider for EntityManager named" and cannot proceed.
I am learning Hibernate and trying some hands-on exercises.
I tried all the methods mentioned on this forum, but I still get the same error. Specifically, I tried the following:
Saved the persistence.xml file in src/main/resources/META-INF/persistence.xml.
Updated the provider to "org.hibernate.jpa.HibernatePersistenceProvider".
Still no luck. Can anyone kindly help me resolve this? Below are a screenshot and the code.
POM:
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.hibernate.javax.persistence/hibernate-jpa-2.0-api -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.1-api</artifactId>
<version>1.0.0.Final</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.hibernate/hibernate-core -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>6.1.0.Final</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.44</version>
</dependency>
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>org.glassfish.jaxb</groupId>
<artifactId>jaxb-runtime</artifactId>
<version>2.3.1</version>
</dependency>
</dependencies>
persistence.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
version="2.0">
<persistence-unit name="concretepage">
<description>JPA Demo</description>
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.MySQLDialect"/>
<property name="hibernate.hbm2ddl.auto" value="update"/>
<property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver"/>
<property name="javax.persistence.jdbc.url" value="jdbc:mysql://localhost/testDB1"/>
<property name="javax.persistence.jdbc.user" value="xxxxx"/>
<property name="javax.persistence.jdbc.password" value="yyyyy"/>
</properties>
</persistence-unit>
</persistence>
Java:
package database.hibernate;

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class JPAUtility {

    private static final EntityManagerFactory emFactory;

    static {
        emFactory = Persistence.createEntityManagerFactory("concretepage");
    }

    public static EntityManager getEntityManager() {
        return emFactory.createEntityManager();
    }

    public static void close() {
        emFactory.close();
    }
}
Java Program calling JPAUtility:
public class App2 {
    public static void main(String[] args) {
        EntityManager entityManager = JPAUtility.getEntityManager();
        entityManager.getTransaction().begin();
        ....
        entityManager.getTransaction().commit();
    }
}
Error Message:
Exception in thread "main" java.lang.ExceptionInInitializerError
at database.hibernate.App2.main(App2.java:9)
Caused by: javax.persistence.PersistenceException: No Persistence provider for EntityManager named concretepage
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:61)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:39)
at database.hibernate.JPAUtility.<clinit>(JPAUtility.java:9)
... 1 more
Project Structure:
You need to change the Maven dependency version for <artifactId>hibernate-core</artifactId> from <version>6.1.0.Final</version> to <version>5.2.11.Final</version>. There have been significant changes from Hibernate 5 to 6 (it moved from the javax.persistence API to jakarta.persistence), so staying on 6 means upgrading the corresponding dependencies and the schema/namespace declarations in persistence.xml as well.
You can read another thread here: "Why is the dependency on javax.persistence-api removed in hibernate-core 6.0.2?"
When I tried with 6.1.0.Final, I also got the same error as mentioned above.
But when I changed the version back to 5.2.11.Final, it connected to the database.
PS: I used MariaDB instead of MySQL, but that should not matter.
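If you would rather stay on Hibernate 6 instead of downgrading, the change is to move from the javax.persistence API to jakarta.persistence, drop the old hibernate-jpa-2.1-api dependency (hibernate-core 6.x already brings the Jakarta Persistence API), and switch persistence.xml to the https://jakarta.ee/xml/ns/persistence namespace. A minimal sketch of JPAUtility against the Jakarta API, assuming the persistence unit is still named "concretepage":
package database.hibernate;

// Hibernate 6 implements JPA 3.x, so the Jakarta packages replace javax.persistence
import jakarta.persistence.EntityManager;
import jakarta.persistence.EntityManagerFactory;
import jakarta.persistence.Persistence;

public class JPAUtility {

    // jakarta.persistence.Persistence discovers the HibernatePersistenceProvider supplied by hibernate-core 6.x
    private static final EntityManagerFactory emFactory =
            Persistence.createEntityManagerFactory("concretepage");

    public static EntityManager getEntityManager() {
        return emFactory.createEntityManager();
    }

    public static void close() {
        emFactory.close();
    }
}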
So I'm developing an API and trying to connect from Java to a local Docker instance of DB2 using the JPA entity manager. After running Apache Maven and getting a successful build, I send a GET request from Postman to test an endpoint that should return a list of all users from DB2. I receive the error below saying that the entity manager is null. I've tried researching online and can't seem to find a solution.
The DAO class is virtually unchanged from a previous iteration that used Derby instead of DB2 and worked fine.
I'd like to know where I've gone wrong, what needs changing, and whether anything is unnecessary. Provided below are the XML files and the DAO Java class containing the entity manager. All the DB2 properties are correct, as they are unchanged from a successful connection I made from Eclipse's Database Development perspective. The driver used there was db2jcc4, which I think is different from the driver recommended for the Maven dependency, so I'm not sure if that's an issue.
Postman:
Error 500: java.lang.NullPointerException: Cannot invoke
"javax.persistence.EntityManager.createNamedQuery(String, java.lang.Class)" because
"this.em" is null
Maven:
[INFO] [ERROR ] CWWJP0015E: An error occurred in the org.eclipse.persistence.jpa.PersistenceProvider persistence provider when it attempted to create the container entity manager factory for the jpa-unit persistence unit. The following error occurred: Exception [EclipseLink-28018] (Eclipse Persistence Services - 2.7.9.v20210604-2c549e2208): org.eclipse.persistence.exceptions.EntityManagerSetupException
[INFO] Exception Description: Predeployment of PersistenceUnit [jpa-unit] failed.
[INFO] Internal Exception: javax.persistence.PersistenceException: CWWJP0013E: The server cannot locate the java:comp/DefaultDataSource data source for the jpa-unit persistence unit because it has encountered the following exception: javax.naming.NameNotFoundException: javax.naming.NameNotFoundException: java:comp/DefaultDataSource.
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.obdoblock</groupId>
<artifactId>hyperledger-api</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>war</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<!-- Liberty configuration -->
<liberty.var.default.http.port>3500</liberty.var.default.http.port>
<liberty.var.default.https.port>9443</liberty.var.default.https.port>
<liberty.var.app.context.root>api</liberty.var.app.context.root>
<!-- TestDB Configuration -->
<version.ibm.db2>11.5.6.0</version.ibm.db2>
</properties>
<dependencies>
<!-- Provided dependencies -->
<dependency>
<groupId>jakarta.platform</groupId>
<artifactId>jakarta.jakartaee-api</artifactId>
<version>8.0.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.eclipse.microprofile</groupId>
<artifactId>microprofile</artifactId>
<version>3.3</version>
<type>pom</type>
<scope>provided</scope>
</dependency>
<!-- For tests -->
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter</artifactId>
<version>5.6.2</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-rs-client</artifactId>
<version>3.3.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-rs-extension-providers</artifactId>
<version>3.3.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse</groupId>
<artifactId>yasson</artifactId>
<version>1.0.7</version>
<scope>test</scope>
</dependency>
<!-- DB2 connector -->
<dependency>
<groupId>com.ibm.db2</groupId>
<artifactId>jcc</artifactId>
<version>11.5.6.0</version>
</dependency>
<!-- Hyperledger Fabric Gateway for Blockchain -->
<dependency>
<groupId>org.hyperledger.fabric</groupId>
<artifactId>fabric-gateway-java</artifactId>
<version>2.2.0</version>
</dependency>
</dependencies>
<build>
<finalName>${project.artifactId}</finalName>
<plugins>
<!-- Enable liberty-maven plugin -->
<plugin>
<groupId>io.openliberty.tools</groupId>
<artifactId>liberty-maven-plugin</artifactId>
<version>3.3.4</version>
<configuration>
<copyDependencies>
<location>${project.build.directory}/liberty/wlp/usr/shared/resources</location>
<dependency>
<groupId>com.ibm.db2</groupId>
<artifactId>jcc</artifactId>
<version>11.5.6.0</version>
</dependency>
</copyDependencies>
</configuration>
</plugin>
<!-- Plugin to run functional tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.22.2</version>
<configuration>
<systemPropertyVariables>
<http.port>${liberty.var.default.http.port}</http.port>
<context.root>${liberty.var.app.context.root}</context.root>
</systemPropertyVariables>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>3.2.3</version>
</plugin>
<!-- Plugin to run unit tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.2</version>
</plugin>
</plugins>
</build>
</project>
server.xml
<server description="Obdoblock REST Server">
<featureManager>
<feature>jaxrs-2.1</feature>
<feature>openapi-3.1</feature>
<feature>jpa-2.2</feature>
<feature>cdi-2.0</feature>
</featureManager>
<httpEndpoint
httpPort="${default.http.port}"
httpsPort="${default.https.port}"
id="defaultHttpEndpoint"
host="*"
/>
<webApplication
location="hyperledger-api.war"
contextRoot="${app.context.root}"
/>
<!-- DB2 Library Configuration -->
<library id="DB2JCCLib">
<fileset dir="${shared.resource.dir}" includes="*.jar" />
</library>
<dataSource jndiName="jdbc/db2">
<jdbcDriver libraryRef="jdbcLib"/>
<properties
databaseName="testdb"
serverName="localhost"
portNumber="50000"
user="****" password="****"
/>
</dataSource>
</server>
persistence.xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- TODO: This will have to be configured by ENV as well -->
<!-- https://www.eclipse.org/eclipselink/documentation/2.5/jpa/extensions/p_ddl_generation.htm -->
<persistence version="2.2"
xmlns="http://xmlns.jcp.org/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence
http://xmlns.jcp.org/xml/ns/persistence/persistence_2_2.xsd">
<persistence-unit name="jpa-unit" transaction-type="JTA">
<properties>
<!-- Connection Specific -->
<property name="hibernate.dialect" value="org.hibernate.dialect.DB2Dialect"/>
<property name="javax.persistence.jdbc.driver" value="com.ibm.db2.jcc.DB2Driver" />
<property name="javax.persistence.jdbc.url" value="jdbc:db2://localhost:50000/testdb" />
<property name="javax.persistence.jdbc.user" value="****" />
<property name="javax.persistence.jdbc.password" value="****" />
<property name="show_sql" value="true"/>
<property name="hibernate.temp.use_jdbc_metadata_defaults" value="false"/>
</properties>
</persistence-unit>
</persistence>
UserDao.java
package dao;

import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.enterprise.context.RequestScoped;

import models.*;

@RequestScoped
public class UserDao {

    @PersistenceContext(name = "jpa-unit")
    private EntityManager em;

    public void createUser(Users user) {
        em.persist(user);
    }

    public Users readUser(int userId) {
        return em.find(Users.class, userId);
    }

    public List<Users> readAllUsers() {
        return em.createNamedQuery("Users.findAll", Users.class).getResultList();
    }

    public void updateUser(Users user) {
        em.merge(user);
    }

    public void deleteUser(Users userId) {
        em.remove(userId);
    }

    public List<Users> findUser(String email) {
        return em.createNamedQuery("Users.findUser", Users.class)
                 .setParameter("email", email)
                 .getResultList();
    }

    public void createHistory(History hist) {
        em.persist(hist);
    }

    // wait, this doesn't do anything?
    public Users readHistory(int id) {
        return em.find(Users.class, id);
    }

    public List<History> readAllHistory() {
        return em.createNamedQuery("History.findAll", History.class).getResultList();
    }
}
Versions:
Docker: 20.10.8, build 3967b7d
DB2: ibm/db2 docker image version 11.5.6
Maven: 3.8.3
Java: JDK 14.0.2
If you need any more details, I'm happy to provide them.
Thanks, Dylan
Your persistence.xml is incorrect. It should point to the datasource configured in server.xml rather than specify driver properties directly.
Like this:
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.2"
xmlns="http://xmlns.jcp.org/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence
http://xmlns.jcp.org/xml/ns/persistence/persistence_2_2.xsd">
<persistence-unit name="jpa-unit" transaction-type="JTA">
<jta-data-source>jdbc/guestbookDS</jta-data-source>
<properties>
<property name="javax.persistence.schema-generation.database.action" value="create" />
<property name="javax.persistence.schema-generation.create-database-schemas" value="true" />
<property name="javax.persistence.schema-generation.scripts.action" value="create" />
<property name="javax.persistence.schema-generation.scripts.create-target" value="create.ddl"/>
<!--
<property name="eclipselink.ddl-generation" value="create-or-extend-tables"/>
<property name="eclipselink.ddl-generation.output-mode" value="both" />
-->
</properties>
</persistence-unit>
</persistence>
You can check a very simple Liberty project that uses JPA and a database here: https://github.com/stocktrader-ops/db-sat-demo (it uses PostgreSQL instead of DB2, but you can simply substitute your own datasource setup).
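Also note that whatever value you put in <jta-data-source> has to match the jndiName of the <dataSource> element in your server.xml, so for this project it would be jdbc/db2 rather than the jdbc/guestbookDS used in the sample above. As a quick diagnostic that Liberty really bound the datasource under that JNDI name, a small lookup like the one below can be dropped into any servlet or CDI bean (the class and method names are made up for illustration):
import java.sql.Connection;
import java.sql.SQLException;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.sql.DataSource;

public class DataSourceCheck {

    // Looks up the JNDI name declared in server.xml and opens one connection to prove the binding works
    public static void verify() {
        try {
            DataSource ds = InitialContext.doLookup("jdbc/db2");
            try (Connection con = ds.getConnection()) {
                System.out.println("Connected to: " + con.getMetaData().getDatabaseProductName());
            }
        } catch (NamingException | SQLException e) {
            e.printStackTrace();
        }
    }
}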
I have a working application with an H2 database. There is no connection problem with the DB, as I can read values from it. However, when I try to create an EntityManagerFactory with EntityManagerFactory emf = Persistence.createEntityManagerFactory("EmployeeService"); I see the following in the logs:
2020-06-04 19:22:17.901 INFO 22496 --- [ main] o.hibernate.jpa.internal.util.LogHelper : HHH000204: Processing PersistenceUnitInfo [name: EmployeeService]
2020-06-04 19:22:17.946 WARN 22496 --- [ main] o.h.e.j.c.i.ConnectionProviderInitiator : HHH000181: No appropriate connection provider encountered, assuming application will be supplying connections
2020-06-04 19:22:17.946 WARN 22496 --- [ main] o.h.e.j.e.i.JdbcEnvironmentInitiator : HHH000342: Could not obtain connection to query metadata : The application must supply JDBC connections
Exception in thread "main" org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:275)
Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set
at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.determineDialect(DialectFactoryImpl.java:100)
in application.properties I have:
#H2
spring.h2.console.enabled=true
spring.h2.console.path=/h2-console/
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
#Flyway
flyway.user=sa
flyway.password=
flyway.url=jdbc:h2:mem:testdb
flyway.locations=filesystem:db/migration
spring.flyway.baseline-on-migrate = true
And in persistence.xml under META-INF:
<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
<persistence-unit name="EmployeeService">
<class>com.example.demo.Employee.Employee</class>
<properties>
<property name="javax.persistence.jdbc.url" value="jdbc:h2:mem:testdb"/>
<property name="javax.persistence.jdbc.driver" value="org.h2.Driver"/>
<property name="javax.persistence.jdbc.user" value="sa"/>
<property name="javax.persistence.jdbc.password" value=""/>
</properties>
</persistence-unit>
</persistence>
The problem appears only when I try to create the EntityManagerFactory mentioned above. When I don't, I can go to the H2 web console and see that Flyway created the DB correctly and ran its inserts, and I can do selects/inserts and so on.
Edit:
Adding pom.xml:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
</dependency>
<dependency>
<groupId>org.flywaydb</groupId>
<artifactId>flyway-core</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
If you check the exception, it's complaining that the Hibernate dialect is not set in your persistence.xml.
Please add:
<property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
I am using Cassandra 3.0.3 and Spark 1.6.0, and I am trying to run code that combines the old documentation at http://www.datastax.com/dev/blog/accessing-cassandra-from-spark-in-java with the new documentation at https://github.com/datastax/spark-cassandra-connector/blob/master/doc/7_java_api.md.
Here is my pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>muhrafifm</groupId>
<artifactId>spark-cass-twitterdw</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.0</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1.1</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.10</artifactId>
<version>1.6.0-M1</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector-java_2.10</artifactId>
<version>1.6.0-M1</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.0</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.6.0</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.thrift</groupId>
<artifactId>libthrift</artifactId>
<version>0.9.1</version>
</dependency>
</dependencies>
The changes I made are basically to the javaFunctions calls. Here is one of the methods after changing them according to the new documentation. I've also included import static com.datastax.spark.connector.japi.CassandraJavaUtil.*;
private void generateData(JavaSparkContext sc) {
    CassandraConnector connector = CassandraConnector.apply(sc.getConf());

    // Prepare the schema
    try (Session session = connector.openSession()) {
        session.execute("DROP KEYSPACE IF EXISTS java_api");
        session.execute("CREATE KEYSPACE java_api WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}");
        session.execute("CREATE TABLE java_api.products (id INT PRIMARY KEY, name TEXT, parents LIST<INT>)");
        session.execute("CREATE TABLE java_api.sales (id UUID PRIMARY KEY, product INT, price DECIMAL)");
        session.execute("CREATE TABLE java_api.summaries (product INT PRIMARY KEY, summary DECIMAL)");
    }

    // Prepare the products hierarchy
    List<Product> products = Arrays.asList(
        new Product(0, "All products", Collections.<Integer>emptyList()),
        new Product(1, "Product A", Arrays.asList(0)),
        new Product(4, "Product A1", Arrays.asList(0, 1)),
        new Product(5, "Product A2", Arrays.asList(0, 1)),
        new Product(2, "Product B", Arrays.asList(0)),
        new Product(6, "Product B1", Arrays.asList(0, 2)),
        new Product(7, "Product B2", Arrays.asList(0, 2)),
        new Product(3, "Product C", Arrays.asList(0)),
        new Product(8, "Product C1", Arrays.asList(0, 3)),
        new Product(9, "Product C2", Arrays.asList(0, 3))
    );

    JavaRDD<Product> productsRDD = sc.parallelize(products);
    javaFunctions(productsRDD).writerBuilder("java_api", "products", mapToRow(Product.class)).saveToCassandra();

    JavaRDD<Sale> salesRDD = productsRDD.filter(new Function<Product, Boolean>() {
        @Override
        public Boolean call(Product product) throws Exception {
            return product.getParents().size() == 2;
        }
    }).flatMap(new FlatMapFunction<Product, Sale>() {
        @Override
        public Iterable<Sale> call(Product product) throws Exception {
            Random random = new Random();
            List<Sale> sales = new ArrayList<>(1000);
            for (int i = 0; i < 1000; i++) {
                sales.add(new Sale(UUID.randomUUID(), product.getId(), BigDecimal.valueOf(random.nextDouble())));
            }
            return sales;
        }
    });
    javaFunctions(salesRDD).writerBuilder("java_api", "sales", mapToRow(Sale.class)).saveToCassandra();
}
And here is the error that I got.
16/03/04 13:29:06 INFO Cluster: New Cassandra host /127.0.0.1:9042 added
16/03/04 13:29:06 INFO CassandraConnector: Connected to Cassandra cluster: Test Cluster
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/package$ScalaReflectionLock$
at org.apache.spark.sql.catalyst.ReflectionLock$.<init>(ReflectionLock.scala:5)
at org.apache.spark.sql.catalyst.ReflectionLock$.<clinit>(ReflectionLock.scala)
at com.datastax.spark.connector.mapper.ReflectionColumnMapper.<init>(ReflectionColumnMapper.scala:38)
at com.datastax.spark.connector.mapper.JavaBeanColumnMapper.<init>(JavaBeanColumnMapper.scala:10)
at com.datastax.spark.connector.util.JavaApiHelper$.javaBeanColumnMapper(JavaApiHelper.scala:93)
at com.datastax.spark.connector.util.JavaApiHelper.javaBeanColumnMapper(JavaApiHelper.scala)
at com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow(CassandraJavaUtil.java:1204)
at com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow(CassandraJavaUtil.java:1222)
at muhrafifm.spark.cass.twitterdw.Demo.generateData(Demo.java:69)
at muhrafifm.spark.cass.twitterdw.Demo.run(Demo.java:35)
at muhrafifm.spark.cass.twitterdw.Demo.main(Demo.java:181)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.catalyst.package$ScalaReflectionLock$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 11 more
16/03/04 13:29:40 INFO CassandraConnector: Disconnected from Cassandra cluster: Test Cluster
16/03/04 13:29:41 INFO SparkContext: Invoking stop() from shutdown hook
16/03/04 13:29:41 INFO SparkUI: Stopped Spark web UI at http://10.144.233.28:4040
16/03/04 13:29:41 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/03/04 13:29:42 INFO MemoryStore: MemoryStore cleared
16/03/04 13:29:42 INFO BlockManager: BlockManager stopped
16/03/04 13:29:42 INFO BlockManagerMaster: BlockManagerMaster stopped
16/03/04 13:29:42 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/03/04 13:29:42 INFO SparkContext: Successfully stopped SparkContext
16/03/04 13:29:42 INFO ShutdownHookManager: Shutdown hook called
16/03/04 13:29:42 INFO ShutdownHookManager: Deleting directory /tmp/spark-16fd2ae2-b61b-4411-a776-1e578caabba6
------------------------------------------------------------------------
BUILD FAILURE
Is there anything I did wrong? It seems to need a package that I don't even use directly. Is there anything to fix that, or should I use a previous version of the spark-cassandra-connector?
Any response is appreciated. Thank you.
The code is looking for
org/apache/spark/sql/catalyst/package$ScalaReflectionLock$
so you should include the spark-sql library (here, spark-sql_2.10 at the same 1.6.0 version as spark-core), which brings in the spark-catalyst classes it needs.
I had the same problem, and the issue was compatibility between the Spark version and the Spark Cassandra connector.
I was using Spark 2.3 and the Cassandra connector was an older version.
The version compatibility matrix is available here:
https://github.com/datastax/spark-cassandra-connector
This is the POM I used for this application; it ran without any issues with Java 1.8.0_131 (javac 1.8.0_131).
The complete application can be found here:
https://github.com/sunone5/BigData/tree/master/spark-cassandra-streaming
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>spark-cassandra-streaming</groupId>
<artifactId>spark-cassandra-streaming</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.0</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>2.2.0</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.11 -->
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.11</artifactId>
<version>2.0.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-java_2.10 -->
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector-java_2.10</artifactId>
<version>1.6.0-M1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.datastax.cassandra/cassandra-driver-core -->
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.3.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-catalyst_2.10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-catalyst_2.10</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>2.2.0</version>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<directory>${basedir}/src/main/resources</directory>
</resource>
</resources>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
I was able to do it successfully.
My Scala version is 2.11.12.
Below is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sashi</groupId>
<artifactId>SalesAnalysis</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>SalesAnalysis</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.11 -->
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.11</artifactId>
<version>2.0.5</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.3.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-catalyst_2.10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-catalyst_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.17.Final</version>
</dependency>
</dependencies>
</project>
This is my spark-submit script:
spark-submit --class com.sashi.SalesAnalysis.CassandraSparkSalesAnalysis --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.0 /home/cloudera/Desktop/spark_ex/Cassandra/sales-analysis.jar
I have never used Maven before, but I have always wanted to. I updated one Eclipse project I'm working on to be entirely Maven-based.
This was my WEB-INF/lib folder with all my JARs.
Now I wanted to do the same with Maven, adding the dependencies one by one, but I'm getting errors.
When I access a servlet I get this:
exception
javax.servlet.ServletException: Servlet execution threw an exception
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
root cause
java.lang.ExceptionInInitializerError
util.HibernateUtil.<clinit>(HibernateUtil.java:38)
app.Model.getProvincias(Model.java:74)
servlet.FormS.doGet(FormS.java:22)
javax.servlet.http.HttpServlet.service(HttpServlet.java:620)
javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
root cause
java.lang.NoSuchMethodError: org.hibernate.cfg.Configuration.addAnnotatedClass(Ljava/lang/Class;)Lorg/hibernate/cfg/Configuration;
util.HibernateUtil.<clinit>(HibernateUtil.java:25)
app.Model.getProvincias(Model.java:74)
servlet.FormS.doGet(FormS.java:22)
javax.servlet.http.HttpServlet.service(HttpServlet.java:620)
javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
Here is my Hibernate configuration file, hibernate.cfg.xml:
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE hibernate-configuration PUBLIC
"-//Hibernate/Hibernate Configuration DTD 3.0//EN"
"http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
<session-factory>
<property name="dialect">org.hibernate.dialect.MySQLDialect</property>
<property name="connection.driver_class">com.mysql.jdbc.Driver</property>
<property name="connection.password">pass</property>
<property name="connection.url">jdbc:mysql://obram2.com/obram2_base</property>
<!--
<property name="connection.url">jdbc:mysql://192.168.200.41/obram2_base</property>
-->
<property name="hibernate.c3p0.min_size">5</property>
<property name="hibernate.c3p0.max_size">20</property>
<property name="hibernate.c3p0.timeout">300</property>
<property name="hibernate.c3p0.max_statements">50</property>
<property name="hibernate.c3p0.idle_test_period">3000</property>
<property name="connection.username">obram2_root</property>
<property name="cache.provider_class">org.hibernate.cache.internal.NoCacheProvider</property>
<property name="show_sql">false</property>
<property name="hibernate.hbm2ddl.auto">update</property>
</session-factory>
</hibernate-configuration>
And this is my HibernateUtil:
package util;

import org.hibernate.SessionFactory;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.cfg.Configuration;
import org.hibernate.service.ServiceRegistry;

import bean.Business;
import bean.Client;
import bean.Form;
import bean.Pago;

public class HibernateUtil
{
    //private static final SessionFactory sessionFactory;
    private static SessionFactory sessionFactory;
    private static ServiceRegistry serviceRegistry;

    static
    {
        try
        {
            Configuration configuration = new Configuration().configure()
                .addAnnotatedClass(Localidad.class)
                .addAnnotatedClass(Form.class)
                .addAnnotatedClass(Client.class)
                .addAnnotatedClass(Business.class)
                .addAnnotatedClass(Provincia.class)
                .addAnnotatedClass(Pago.class);
            serviceRegistry = new StandardServiceRegistryBuilder().applySettings(configuration.getProperties()).build();
            sessionFactory = configuration.buildSessionFactory(serviceRegistry);
        }
        catch (Throwable ex)
        {
            System.err.println("Initial SessionFactory creation failed." + ex);
            throw new ExceptionInInitializerError(ex);
        }
    }

    public static SessionFactory getSessionFactory()
    {
        return sessionFactory;
    }
}
And finally, here are all my new Maven dependencies, which I tried to keep at the same versions as before.
And my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>OBRAM2</groupId>
<artifactId>OBRAM2</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>war</packaging>
<build>
<sourceDirectory>src</sourceDirectory>
<resources>
<resource>
<directory>src</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.3</version>
<configuration>
<warSourceDirectory>WebContent</warSourceDirectory>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>javax.mail</groupId>
<artifactId>javax.mail-api</artifactId>
<version>1.5.3</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>org.apache.velocity</groupId>
<artifactId>velocity</artifactId>
<version>1.7</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.17</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>4.3.8.Final</version>
</dependency>
<dependency>
<groupId>org.hibernate.common</groupId>
<artifactId>hibernate-commons-annotations</artifactId>
<version>4.0.5.Final</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.4</version>
</dependency>
</dependencies>
</project>
Remember, the exact same code was working before Maven; I didn't make any changes.
Thanks!
I finally solved it by creating the Maven folder structure. It seems that Eclipse's Configure -> Convert to Maven Project doesn't do that.
src\main\java (my source files go here)
src\main\resources
src\main\webapp
When you define a Maven project, Maven resolves all the sub-dependencies of your artifacts and makes decisions about which versions to use. You can troubleshoot these conflicts by going to target/{yourapp}/WEB-INF/lib/ and looking at the exact JARs that Maven is packaging into the WAR (running mvn dependency:tree shows the same resolved versions). If some of them don't match your original /lib folder, you can use the Eclipse Maven dependency manager (or edit the POM directly) to exclude the ones you don't want.
The error you are getting is very common when different versions of your artifacts get mixed.