ServiceConfigurationError when using Hibernate SPI MetadataContributor - java

Hi, does someone have a clue how to integrate a MetadataContributor into a Spring Boot application without errors?
I got:
java.util.ServiceConfigurationError: org.hibernate.boot.spi.MetadataContributor: Provider com.example.demo.TestContributor not found
at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:588) ~[na:na]
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.nextProviderClass(ServiceLoader.java:1211) ~[na:na]
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1220) ~[na:na]
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1264) ~[na:na]
at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1299) ~[na:na]
at java.base/java.util.ServiceLoader$ProviderSpliterator.tryAdvance(ServiceLoader.java:1483) ~[na:na]
at java.base/java.util.Spliterators$1Adapter.hasNext(Spliterators.java:681) ~[na:na]
at org.hibernate.boot.registry.classloading.internal.AggregatedServiceLoader$ClassPathAndModulePathAggregatedServiceLoader.hasNextIgnoringServiceConfigurationError(AggregatedServiceLoader.java:241) ~[hibernate-core-5.4.27.Final.jar!/:5.4.27.Final]
at org.hibernate.boot.registry.classloading.internal.AggregatedServiceLoader$ClassPathAndModulePathAggregatedServiceLoader.loadAll(AggregatedServiceLoader.java:219) ~[hibernate-core-5.4.27.Final.jar!/:5.4.27.Final]
at org.hibernate.boot.registry.classloading.internal.AggregatedServiceLoader$ClassPathAndModulePathAggregatedServiceLoader.getAll(AggregatedServiceLoader.java:187) ~[hibernate-core-5.4.27.Final.jar!/:5.4.27.Final]
at org.hibernate.boot.registry.classloading.internal.ClassLoaderServiceImpl.loadJavaServices(ClassLoaderServiceImpl.java:251) ~[hibernate-core-5.4.27.Final.jar!/:5.4.27.Final]
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:290) ~[hibernate-core-5.4.27.Final.jar!/:5.4.27.Final]
...
I have a fresh Spring Boot project with just:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
</dependency>
Added a contributor:
package com.example.demo;

import org.hibernate.boot.spi.InFlightMetadataCollector;
import org.hibernate.boot.spi.MetadataContributor;
import org.jboss.jandex.IndexView;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class TestContributor implements MetadataContributor {
    private static final Logger logger = LoggerFactory.getLogger(TestContributor.class);

    @Override
    public void contribute(InFlightMetadataCollector metadataCollector, IndexView jandexIndex) {
        logger.info("Still works");
    }
}
And added a src/main/resources/META-INF/services/org.hibernate.boot.spi.MetadataContributor file containing the provider class name:
com.example.demo.TestContributor
When I start the project via IntelliJ, everything works.
But when I build a jar with mvn clean package and try to start it via java -jar demo-0.0.1-SNAPSHOT.jar, I get the problem from above.
The contributor actually still works, because the other class loaders seem to be fine. I tried to debug into Hibernate, but I don't understand what the problem is. Does someone have an idea?
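For anyone wanting to narrow this down: the lookup can be reproduced outside of Hibernate with a minimal sketch like the one below (my own diagnostic, not part of the project). Run it once from the IDE and once from inside the packaged jar to see whether plain java.util.ServiceLoader resolves the provider in each environment.
import java.util.ServiceLoader;

import org.hibernate.boot.spi.MetadataContributor;

public class ServiceLoaderCheck {
    public static void main(String[] args) {
        // Same discovery mechanism Hibernate relies on: scan META-INF/services
        // entries visible to the given class loader, then load each named class.
        ServiceLoader<MetadataContributor> loader = ServiceLoader.load(
                MetadataContributor.class, Thread.currentThread().getContextClassLoader());
        for (MetadataContributor contributor : loader) {
            System.out.println("Found provider: " + contributor.getClass().getName());
        }
    }
}
In an executable Spring Boot jar the application classes live under BOOT-INF/classes and are only visible to Spring Boot's LaunchedURLClassLoader. One plausible explanation is that Hibernate's aggregated service loader sees the service file through one class loader but tries to instantiate the class through another that cannot see BOOT-INF/classes; either way, the sketch tells you which loaders can actually resolve the provider.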

Related

Gremlin query on Janusgraph through Spark. Error: Provider org.janusgraph.hadoop.serialize.JanusGraphKryoShimService could not be instantiated

Current Architecture
Description
I am using JanusGraph 0.6.2 for graph processing.
GCP Bigtable as the JanusGraph backend/database.
Spark 3.0.0 with Hadoop 2.7 for data processing, set up locally (planning to set up the environment in GCP after the POC).
Gremlin Client and Java 11 as clients to run Spark jobs, for queries such as traversals and node lookups through SparkGraphComputer.
Problem
I am able to trigger a query job to do the node count on Spark using the Gremlin Client, but I am facing issues triggering a query job using the Java APIs.
Expectation
Trigger a Query Job using Java APIs.
The Apache Spark setup is done.
Configuration working for the Gremlin Client:
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Hadoop Graph Configuration
#
gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
gremlin.hadoop.graphReader=org.janusgraph.hadoop.formats.hbase.HBaseInputFormat
gremlin.hadoop.graphWriter=org.apache.hadoop.mapreduce.lib.output.NullOutputFormat
gremlin.hadoop.jarsInDistributedCache=true
gremlin.hadoop.inputLocation=none
gremlin.hadoop.outputLocation=output
gremlin.spark.persistContext=true
#
# JanusGraph HBase InputFormat configuration
#
#janusgraphmr.ioformat.conf.storage.backend=hbase
#janusgraphmr.ioformat.conf.storage.hostname=localhost
#janusgraphmr.ioformat.conf.storage.port=8586
#janusgraphmr.ioformat.conf.storage.hbase.table=janusgraph
janusgraphmr.ioformat.conf.storage.backend=hbase
janusgraphmr.ioformat.conf.storage.hbase.ext.hbase.client.connection.impl=com.google.cloud.bigtable.hbase2_x.BigtableConnection
janusgraphmr.ioformat.conf.storage.hbase.ext.google.bigtable.project.id= *****
janusgraphmr.ioformat.conf.storage.hbase.ext.google.bigtable.instance.id= *****
janusgraphmr.ioformat.conf.storage.hbase.table= ******
janusgraphmr.ioformat.conf.storage.hbase.ext.hbase.regionsizecalculator.enable=false
# This defines the indexing backend configuration used while writing data to JanusGraph.
janusgraphmr.ioformat.conf.index.search.backend=elasticsearch
janusgraphmr.ioformat.conf.index.search.hostname=localhost
#
# SparkGraphComputer Configuration
#
spark.master=spark://RINMAC1714:7077
spark.executor.memory=1g
spark.executor.extraClassPath=/Users/rohit.pahan/portables/janusgraph-0.6.2/lib/*
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator=org.janusgraph.hadoop.serialize.JanusGraphKryoRegistrator
The above config works and I get the result (screenshot not reproduced here).
The Java API configuration, which is not working for me:
GraphTraversalProvider.java
import org.apache.commons.configuration.BaseConfiguration;
import org.apache.commons.configuration.Configuration;
import org.apache.tinkerpop.gremlin.hadoop.Constants;

public class GraphTraversalProvider {

    public static Configuration makeLocal() {
        return make(true);
    }

    public static Configuration makeRemote() {
        return make(false);
    }

    private static Configuration make(boolean local) {
        final Configuration hadoopConfig = new BaseConfiguration();
        hadoopConfig.setProperty("gremlin.graph", "org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph");
        hadoopConfig.setProperty(Constants.GREMLIN_HADOOP_GRAPH_READER, "org.janusgraph.hadoop.formats.hbase.HBaseInputFormat");
        hadoopConfig.setProperty(Constants.GREMLIN_HADOOP_GRAPH_WRITER, "org.apache.hadoop.mapreduce.lib.output.NullOutputFormat");
        hadoopConfig.setProperty(Constants.GREMLIN_HADOOP_JARS_IN_DISTRIBUTED_CACHE, true);
        hadoopConfig.setProperty(Constants.GREMLIN_HADOOP_INPUT_LOCATION, "none");
        hadoopConfig.setProperty(Constants.GREMLIN_HADOOP_OUTPUT_LOCATION, "output");
        hadoopConfig.setProperty(Constants.GREMLIN_SPARK_PERSIST_CONTEXT, true);
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.storage.backend", "hbase");
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.storage.hbase.ext.hbase.client.connection.impl", "com.google.cloud.bigtable.hbase2_x.BigtableConnection");
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.storage.hbase.ext.google.bigtable.project.id", "******");
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.storage.hbase.ext.google.bigtable.instance.id", "*******");
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.storage.hbase.table", "******");
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.storage.hbase.ext.hbase.regionsizecalculator.enable", false);
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.index.search.backend", "elasticsearch");
        hadoopConfig.setProperty("janusgraphmr.ioformat.conf.index.search.hostname", "localhost");

        if (local) {
            hadoopConfig.setProperty("spark.master", "local[*]"); // run Spark locally with as many worker threads as logical cores
        } else {
            hadoopConfig.setProperty("spark.master", "spark://MAC1714:7077");
        }
        hadoopConfig.setProperty("spark.executor.memory", "1g");
        hadoopConfig.setProperty(Constants.SPARK_SERIALIZER, "org.apache.spark.serializer.KryoSerializer");
        hadoopConfig.setProperty("spark.kryo.registrator", "org.janusgraph.hadoop.serialize.JanusGraphKryoRegistrator");
        hadoopConfig.setProperty("spark.kryo.registrationRequired", "false");
        return hadoopConfig;
    }
}
Main Class (RunSparkJob.java, per the stack trace below):
package com.janus.app.services;

import org.apache.commons.configuration.Configuration;
import org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer;
import org.apache.tinkerpop.gremlin.structure.Graph;
import org.apache.tinkerpop.gremlin.structure.util.GraphFactory;

public class RunSparkJob {

    public static void main(String[] args) throws Exception {
        runSpark();
    }

    private static void runSpark() throws Exception {
        Configuration config = GraphTraversalProvider.makeRemote();
        Graph hadoopGraph = GraphFactory.open(config);
        Long totalVertices = hadoopGraph.traversal().withComputer(SparkGraphComputer.class).V().count().next();
        System.out.println("IT WORKED: " + totalVertices);
        hadoopGraph.close();
    }
}
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.6.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.janus</groupId>
    <artifactId>janus-spark</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>janus-spark</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <janus.version>0.6.2</janus.version>
        <spark.version>3.0.0</spark.version>
        <gremlin.version>3.4.6</gremlin.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.janusgraph/janusgraph-bigtable -->
        <dependency>
            <groupId>org.janusgraph</groupId>
            <artifactId>janusgraph-bigtable</artifactId>
            <version>${janus.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.janusgraph/janusgraph-hadoop -->
        <dependency>
            <groupId>org.janusgraph</groupId>
            <artifactId>janusgraph-hadoop</artifactId>
            <version>${janus.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.janusgraph/janusgraph-hbase -->
        <dependency>
            <groupId>org.janusgraph</groupId>
            <artifactId>janusgraph-hbase</artifactId>
            <version>${janus.version}</version>
        </dependency>
        <dependency>
            <groupId>org.janusgraph</groupId>
            <artifactId>janusgraph-solr</artifactId>
            <version>${janus.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.esotericsoftware.kryo/kryo -->
        <dependency>
            <groupId>com.esotericsoftware.kryo</groupId>
            <artifactId>kryo</artifactId>
            <version>2.16</version>
        </dependency>
        <!--
        <dependency>
            <groupId>com.twitter</groupId>
            <artifactId>chill_2.13</artifactId>
            <version>0.10.0</version>
        </dependency>
        -->
        <!-- GREMLIN -->
        <dependency>
            <groupId>org.apache.tinkerpop</groupId>
            <artifactId>spark-gremlin</artifactId>
            <version>${gremlin.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-databind</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.tinkerpop</groupId>
            <artifactId>hadoop-gremlin</artifactId>
            <version>${gremlin.version}</version>
        </dependency>
        <!-- SPARK -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-databind</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>27.0-jre</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
Error Logs
SLF4J: Found binding in [jar:file:/Users/rohit.pahan/portables/janusgraph-0.6.2/lib/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/rohit.pahan/portables/janusgraph-0.6.2/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/rohit.pahan/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/rohit.pahan/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
0 [main] WARN org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer - class org.apache.hadoop.mapreduce.lib.output.NullOutputFormat does not implement PersistResultGraphAware and thus, persistence options are unknown -- assuming all options are possible
Exception in thread "main" java.lang.IllegalStateException: java.util.ServiceConfigurationError: org.apache.tinkerpop.gremlin.structure.io.gryo.kryoshim.KryoShimService: Provider org.janusgraph.hadoop.serialize.JanusGraphKryoShimService could not be instantiated
at org.apache.tinkerpop.gremlin.process.computer.traversal.step.map.VertexProgramStep.processNextStart(VertexProgramStep.java:88)
at org.apache.tinkerpop.gremlin.process.traversal.step.util.AbstractStep.hasNext(AbstractStep.java:150)
at org.apache.tinkerpop.gremlin.process.traversal.step.util.ExpandableStepIterator.next(ExpandableStepIterator.java:55)
at org.apache.tinkerpop.gremlin.process.computer.traversal.step.map.ComputerResultStep.processNextStart(ComputerResultStep.java:68)
at org.apache.tinkerpop.gremlin.process.traversal.step.util.AbstractStep.next(AbstractStep.java:135)
at org.apache.tinkerpop.gremlin.process.traversal.step.util.AbstractStep.next(AbstractStep.java:40)
at org.apache.tinkerpop.gremlin.process.traversal.util.DefaultTraversal.next(DefaultTraversal.java:240)
at com.janus.app.services.RunSparkJob.runSpark(RunSparkJob.java:20)
at com.janus.app.services.RunSparkJob.main(RunSparkJob.java:14)
Caused by: java.util.concurrent.ExecutionException: java.util.ServiceConfigurationError: org.apache.tinkerpop.gremlin.structure.io.gryo.kryoshim.KryoShimService: Provider org.janusgraph.hadoop.serialize.JanusGraphKryoShimService could not be instantiated
at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
at org.apache.tinkerpop.gremlin.process.computer.traversal.step.map.VertexProgramStep.processNextStart(VertexProgramStep.java:68)
... 8 more
Caused by: java.util.ServiceConfigurationError: org.apache.tinkerpop.gremlin.structure.io.gryo.kryoshim.KryoShimService: Provider org.janusgraph.hadoop.serialize.JanusGraphKryoShimService could not be instantiated
at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:582)
at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:804)
at java.base/java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722)
at java.base/java.util.ServiceLoader$3.next(ServiceLoader.java:1393)
at org.apache.tinkerpop.gremlin.structure.io.gryo.kryoshim.KryoShimServiceLoader.load(KryoShimServiceLoader.java:97)
at org.apache.tinkerpop.gremlin.structure.io.gryo.kryoshim.KryoShimServiceLoader.applyConfiguration(KryoShimServiceLoader.java:58)
at org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer.lambda$submitWithExecutor$1(SparkGraphComputer.java:248)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
at java.base/java.lang.Thread.run(Thread.java:831)
Caused by: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.tinkerpop.shaded.kryo.serializers.FieldSerializer" for class: java.util.concurrent.atomic.AtomicLong
at org.apache.tinkerpop.shaded.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:67)
at org.apache.tinkerpop.shaded.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:45)
at org.apache.tinkerpop.shaded.kryo.Kryo.newDefaultSerializer(Kryo.java:380)
at org.apache.tinkerpop.shaded.kryo.Kryo.getDefaultSerializer(Kryo.java:364)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoTypeReg.registerWith(GryoTypeReg.java:122)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoMapper.createMapper(GryoMapper.java:101)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoMapper.createMapper(GryoMapper.java:75)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoReader.<init>(GryoReader.java:71)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoReader.<init>(GryoReader.java:64)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoReader$Builder.create(GryoReader.java:302)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoPool.createPool(GryoPool.java:126)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoPool.access$100(GryoPool.java:40)
at org.apache.tinkerpop.gremlin.structure.io.gryo.GryoPool$Builder.create(GryoPool.java:227)
at org.apache.tinkerpop.gremlin.hadoop.structure.io.HadoopPools.initialize(HadoopPools.java:51)
at org.janusgraph.hadoop.serialize.JanusGraphKryoShimService.<init>(JanusGraphKryoShimService.java:30)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:78)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780)
... 9 more
Caused by: java.lang.reflect.InvocationTargetException
at jdk.internal.reflect.GeneratedConstructorAccessor3.newInstance(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
at org.apache.tinkerpop.shaded.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:54)
... 29 more
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field private volatile long java.util.concurrent.atomic.AtomicLong.value accessible: module java.base does not "opens java.util.concurrent.atomic" to unnamed module #1d9b7cce
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:177)
at java.base/java.lang.reflect.Field.setAccessible(Field.java:171)
at org.apache.tinkerpop.shaded.kryo.serializers.FieldSerializer.buildValidFields(FieldSerializer.java:306)
at org.apache.tinkerpop.shaded.kryo.serializers.FieldSerializer.rebuildCachedFields(FieldSerializer.java:239)
at org.apache.tinkerpop.shaded.kryo.serializers.FieldSerializer.rebuildCachedFields(FieldSerializer.java:182)
at org.apache.tinkerpop.shaded.kryo.serializers.FieldSerializer.<init>(FieldSerializer.java:155)
... 34 more
Process finished with exit code 1
I am still exploring JanusGraph and its processing capabilities with Spark. I have given all the details here; let me know if any more details are required. It is a very new tech stack for me, and I would be grateful for any help.
<properties>
    <janus.version>0.6.2</janus.version>
    <spark.version>3.0.0</spark.version>
    <gremlin.version>3.4.6</gremlin.version>
</properties>
JanusGraph 0.6.2 depends on TinkerPop 3.5.3.
Mixing in other TinkerPop versions can easily lead to this kind of problem.
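If in doubt about which TinkerPop actually wins on the classpath, you can ask at runtime. A minimal sketch (my addition, using TinkerPop's Gremlin.version() utility from gremlin-core):
import org.apache.tinkerpop.gremlin.util.Gremlin;

public class TinkerPopVersionCheck {
    public static void main(String[] args) {
        // JanusGraph 0.6.2 expects 3.5.3 here; seeing 3.4.6 (the version
        // pinned in the pom above) would confirm the mismatch.
        System.out.println("TinkerPop on classpath: " + Gremlin.version());
    }
}
Separately, the deepest cause in the log (InaccessibleObjectException on java.util.concurrent.atomic) shows the JVM is enforcing strong module encapsulation (the default from JDK 16 onward), so once the versions are aligned you may still need to run with --add-opens java.base/java.util.concurrent.atomic=ALL-UNNAMED, or use an older JDK.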

Why get java.lang.ClassNotFoundException when run as jar, but work well with IntelliJ IDEA

Spring Boot version: 2.4.1
Spring Cloud version: 2020.0.0
My code:
@Configuration
public class BaseConfig {
    @Bean
    public Module sortJacksonModule() {
        return new SortJacksonModule();
    }
}
My pom.xml dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-commons</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-openfeign-core</artifactId>
</dependency>
My pom.xml plugin:
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
    </plugins>
</build>
When run from IntelliJ IDEA, it works well.
But when run as a jar (built by mvn clean package), it shows:
Caused by: java.lang.NoClassDefFoundError: feign/codec/EncodeException
at org.springframework.cloud.openfeign.support.SortJacksonModule.setupModule(SortJacksonModule.java:47) ~[spring-cloud-openfeign-core-3.0.0.jar!/:3.0.0]
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:819) ~[jackson-databind-2.11.3.jar!/:2.11.3]
at com.fasterxml.jackson.databind.ObjectMapper.registerModules(ObjectMapper.java:1021) ~[jackson-databind-2.11.3.jar!/:2.11.3]
at org.springframework.http.converter.json.Jackson2ObjectMapperBuilder.configure(Jackson2ObjectMapperBuilder.java:712) ~[spring-web-5.3.2.jar!/:5.3.2]
at org.springframework.http.converter.json.Jackson2ObjectMapperBuilder.build(Jackson2ObjectMapperBuilder.java:680) ~[spring-web-5.3.2.jar!/:5.3.2]
at org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration$JacksonObjectMapperConfiguration.jacksonObjectMapper(JacksonAutoConfiguration.java:101) ~[spring-boot-autoconfigure-2.4.1.jar!/:2.4.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_232]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_232]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_232]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_232]
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) ~[spring-beans-5.3.2.jar!/:5.3.2]
... 113 common frames omitted
Caused by: java.lang.ClassNotFoundException: feign.codec.EncodeException
at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[na:1.8.0_232]
at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_232]
at org.springframework.boot.loader.LaunchedURLClassLoader.loadClass(LaunchedURLClassLoader.java:151) ~[demo-spring-core-11-0.0.1-SNAPSHOT.jar:0.0.1-SNAPSHOT]
at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_232]
... 124 common frames omitted
After studying the error log, I found that feign.codec.EncodeException is an optional dependency of spring-cloud-openfeign-core, so the ClassNotFoundException is correct behavior (optional dependencies are not included in the final jar).
So my question is: why can IntelliJ IDEA run without any error? I tried both the IntelliJ IDEA run configuration and mvn spring-boot:run; both work fine.
update: add example
After more study, I found that this only happens when the class is never actually called:
try {
    System.out.println("not important code");
} catch (Exception e) {
    throw new EncodeException("not exist class");
}
In this example, the try/catch never throws an exception, and the EncodeException class is in an optional dependency.
This code runs fine in IntelliJ IDEA, but fails when run as java -jar xxx.jar.
========== update again with minimal demo
I created a minimal demo to reproduce this issue.
A standalone demo-module:
<dependency>
    <groupId>io.github.openfeign</groupId>
    <artifactId>feign-core</artifactId>
    <version>10.10.1</version>
    <optional>true</optional>
</dependency>
import feign.codec.EncodeException;

/**
 * Hello world!
 */
public class App {
    public void testOptional() {
        try {
            System.out.println("test");
        } catch (Exception e) {
            throw new EncodeException("never throw this");
        }
    }
}
Demo Spring project (created by Spring Initializr, with one added dependency):
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>com.example</groupId>
    <artifactId>demo-module</artifactId>
    <version>0.0.1-SNAPSHOT</version>
</dependency>
@Component
public class MyMain implements ApplicationRunner {
    @Override
    public void run(ApplicationArguments args) throws Exception {
        new App().testOptional();
    }
}
Inspect your project classpath in IDEA (Ctrl-Alt-Shift-S) - I daresay the optional jar is somewhere on the module compile classpath, and that is enough to run your class in the IDE, but not from the standalone jar. In Maven, optional means the dependency is present on the classpath while compiling but is not packed into the resulting artifact.
There is an option called Enable launch optimization in the IntelliJ IDEA run configuration; uncheck it and everything works as expected.
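If you want the dependency to stay optional at runtime as well, a common pattern (a sketch of my own, not from the question) is to probe for the class once and guard every code path that references it:
public final class FeignSupport {
    // Class.forName only needs the class name as a String, so this type
    // compiles and loads even when feign-core is absent from the jar.
    private static final boolean FEIGN_PRESENT =
            isPresent("feign.codec.EncodeException");

    private static boolean isPresent(String className) {
        try {
            Class.forName(className, false, FeignSupport.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static boolean available() {
        return FEIGN_PRESENT;
    }
}
Spring itself uses the same idea internally (ClassUtils.isPresent) to guard auto-configuration against missing optional classes.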

Hadoop distcp from a spring boot application - ClassNotFoundException

I am trying to submit a DistCp job from a Spring Boot application on a REST API call.
version of spring: 1.5.13.RELEASE
hadoop version: 2.7.3
Below is the code I am using to instantiate DistCp:
List<Path> srcPathList = new ArrayList<Path>();
srcPathList.add(new Path("hdfs://<cluster>/tmp/<user>/source"));
Path targetPath = new Path("hdfs://<cluster>/tmp/<user>/destination");
DistCpOptions distCpOptions = new DistCpOptions(srcPathList, targetPath);
DistCp distCp = new DistCp(configuration, distCpOptions);
Job job = distCp.execute();
The job is submitted successfully to the cluster, however the job fails due to ClassNotFoundException on the cluster. Below is the exception:
INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED;
cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException:
java.lang.RuntimeException: java.lang.ClassNotFoundException:
Class org.apache.hadoop.tools.mapred.CopyOutputFormat not found
Why does this happen? Any pointers around this would be very helpful!! Thanks!
I found the reason by viewing the job.jar on the NodeManager machine. The structure of job.jar is:
BOOT-INF/classes/xxx
which Hadoop cannot load classes from (it expects them at the root of the jar).
I tried to replace the jar packaging with war, and it works!
<packaging>war</packaging>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <!-- exclude embedded Tomcat -->
    <exclusions>
        <exclusion>
            <artifactId>spring-boot-starter-tomcat</artifactId>
            <groupId>org.springframework.boot</groupId>
        </exclusion>
    </exclusions>
</dependency>
<!-- include tomcat -->
<dependency>
    <groupId>org.apache.tomcat</groupId>
    <artifactId>tomcat-servlet-api</artifactId>
    <version>7.0.47</version>
    <scope>provided</scope>
</dependency>
...
and then add a start class:
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.support.SpringBootServletInitializer;

public class SpringBootStartApplication extends SpringBootServletInitializer {
    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
        return builder.sources(xxxPortalApplication.class);
    }
}
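An alternative that avoids repackaging (a sketch of my own, untested against this cluster) is to point Hadoop at a flat, non-Boot jar before constructing DistCp, so the job.jar shipped to the NodeManagers has a normal layout:
import org.apache.hadoop.conf.Configuration;

public class DistCpJarConfig {
    public static Configuration withPlainJobJar() {
        Configuration configuration = new Configuration();
        // Ship a plain client jar (the hadoop-distcp artifact, for example)
        // instead of the Spring Boot fat jar; this path is illustrative and
        // must point at a real jar on the application host.
        configuration.set("mapreduce.job.jar", "/opt/app/lib/hadoop-distcp-2.7.3.jar");
        return configuration;
    }
}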

com.amazonaws.transform.JsonErrorUnmarshaller: method <init>(Ljava/lang/Class;)V not found

We are using a library called logback-ext-cloudwatch-appender to send our logback-based logs to AWS Cloudwatch. This is what the dependency looks like in our pom.xml file.
<dependency>
    <groupId>org.eluder.logback</groupId>
    <artifactId>logback-ext-cloudwatch-appender</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
A few days ago these errors started appearing in our logs.
java.lang.NoSuchMethodError: com.amazonaws.transform.JsonErrorUnmarshaller: method <init>(Ljava/lang/Class;)V not found
at com.amazonaws.services.logs.model.transform.InvalidParameterExceptionUnmarshaller.<init>(InvalidParameterExceptionUnmarshaller.java:26)
at com.amazonaws.services.logs.AWSLogsClient.init(AWSLogsClient.java:280)
at com.amazonaws.services.logs.AWSLogsClient.<init>(AWSLogsClient.java:275)
at com.amazonaws.services.logs.AWSLogsClient.<init>(AWSLogsClient.java:248)
at org.eluder.logback.ext.cloudwatch.appender.AbstractCloudWatchAppender.doStart(AbstractCloudWatchAppender.java:100)
at org.eluder.logback.ext.aws.core.AbstractAwsEncodingStringAppender.start(AbstractAwsEncodingStringAppender.java:123)
at org.eluder.logback.ext.cloudwatch.appender.AbstractCloudWatchAppender.start(AbstractCloudWatchAppender.java:95)
at ch.qos.logback.ext.spring.DelegatingLogbackAppender.getDelegate(Unknown Source)
at ch.qos.logback.ext.spring.DelegatingLogbackAppender.append(Unknown Source)
at ch.qos.logback.ext.spring.DelegatingLogbackAppender.append(Unknown Source)
at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84)
at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:48)
at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270)
at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257)
at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421)
at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
at ch.qos.logback.classic.Logger.info(Logger.java:579)
(truncated)
I found out that we had upgraded to AWS 1.11.5, but I couldn't find any evidence of such a bug in that release.
It turns out that the current version of logback-ext-cloudwatch-appender has a transitive dependency on a specific version of aws-java-sdk-logs (1.10.2), which is not compatible with aws-java-sdk libraries 1.11.0 and above, and we use a number of other aws-java-sdk libraries. We excluded the dependency like this:
<dependency>
    <groupId>org.eluder.logback</groupId>
    <artifactId>logback-ext-cloudwatch-appender</artifactId>
    <version>1.0-SNAPSHOT</version>
    <exclusions>
        <exclusion>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-logs</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-core</artifactId>
    <version>1.11.5</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-logs</artifactId>
    <version>1.11.5</version>
</dependency>
At that point we started getting another error.
Exception in thread "org.myorg.task.MyTask working" java.lang.NoSuchMethodError: com.amazonaws.services.logs.AWSLogsClient.createLogGroup(Lcom/amazonaws/services/logs/model/CreateLogGroupRequest;)V
at org.eluder.logback.ext.cloudwatch.appender.AbstractCloudWatchAppender.createLogGroup(AbstractCloudWatchAppender.java:171)
at org.eluder.logback.ext.cloudwatch.appender.AbstractCloudWatchAppender.doStart(AbstractCloudWatchAppender.java:107)
at org.eluder.logback.ext.aws.core.AbstractAwsEncodingStringAppender.start(AbstractAwsEncodingStringAppender.java:123)
at org.eluder.logback.ext.cloudwatch.appender.AbstractCloudWatchAppender.start(AbstractCloudWatchAppender.java:95)
at ch.qos.logback.core.joran.action.AppenderAction.end(AppenderAction.java:90)
at ch.qos.logback.core.joran.spi.Interpreter.callEndAction(Interpreter.java:309)
at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:193)
at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:179)
at ch.qos.logback.core.joran.spi.EventPlayer.play(EventPlayer.java:62)
at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:155)
at ch.qos.logback.core.sift.SiftingJoranConfiguratorBase.doConfigure(SiftingJoranConfiguratorBase.java:82)
at ch.qos.logback.core.sift.AbstractAppenderFactoryUsingJoran.buildAppender(AbstractAppenderFactoryUsingJoran.java:51)
at ch.qos.logback.core.sift.AppenderTracker.buildComponent(AppenderTracker.java:56)
at ch.qos.logback.core.sift.AppenderTracker.buildComponent(AppenderTracker.java:32)
at ch.qos.logback.core.spi.AbstractComponentTracker.getOrCreate(AbstractComponentTracker.java:124)
at ch.qos.logback.core.sift.SiftingAppenderBase.append(SiftingAppenderBase.java:104)
at ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:82)
at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:48)
at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270)
at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257)
at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421)
at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
at ch.qos.logback.classic.Logger.info(Logger.java:579)
Merely excluding the dependencies did not work. I had to fork and rebuild the logback-ext-cloudwatch-appender jar with a dependency on the current 1.11.5 aws-java-sdk libraries. Using the new AWS dependencies with a logback-ext-cloudwatch-appender jar that had been built against the old libraries caused a mismatch in the method signature (return type) of createLogGroup, which caused the runtime error. To get this to run, I only had to change the pom.xml files in my forked version, not the source code.
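When chasing this kind of NoSuchMethodError it also helps to log which SDK core version actually won dependency mediation at runtime. A small sketch (my addition; VersionInfoUtils ships in aws-java-sdk-core):
import com.amazonaws.util.VersionInfoUtils;

public class AwsSdkVersionCheck {
    public static void main(String[] args) {
        // Prints the aws-java-sdk-core version on the runtime classpath;
        // a 1.10.x value here would explain the missing constructor.
        System.out.println("AWS SDK core: " + VersionInfoUtils.getVersion());
    }
}
The build-time equivalent is mvn dependency:tree -Dincludes=com.amazonaws, which shows which aws-java-sdk versions Maven actually selected.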

exception: java.lang.NoSuchMethodError: com.lowagie.text.pdf.PdfWriter.setRgbTransparencyBlending(Z)V

Guys, for a long time I haven't been able to fix the exception: java.lang.NoSuchMethodError: com.lowagie.text.pdf.PdfWriter.setRgbTransparencyBlending(Z)V
I've added all the needed jars to the classpath:
commons-beanutils-1.8.0
commons-collections-2.1.1
commons-digester-2.1.0
commons-javaflow-20060411
commons-logging-1.1.1
itext - 2.1.5
jasperreports - 5.1.0
I saw the requirements for JasperReports here, so I have all the needed libraries, but I still can't fix the bug.
My code:
import net.sf.jasperreports.engine.*
import net.sf.jasperreports.engine.export.JRPdfExporter

class ForIReport {
    public static void main(String[] args) {
        // def conn = Sql.newInstance(
        //         "jdbc:sqlserver://localhost:1433;databaseName=twitter",
        //         'sa',
        //         'sunrise123',
        //         'com.microsoft.sqlserver.jdbc.SQLServerDriver')
        // Class.forName("com.microsoft.jdbc.SQLServerDriver").newInstance();
        // Connection conn = DriverManager.getConnection("jdbc:microsoft:sqlserver://localhost:1433", 'sa', 'sunrise123');
        def fileName = "C:/Users/avalev/Documents/iReport/First.jasper"
        def outFileName = "First.pdf"
        HashMap hm = new HashMap()
        JasperPrint print = JasperFillManager.fillReport(fileName, hm, new JREmptyDataSource())
        JRExporter exporter = new JRPdfExporter()
        exporter.setParameter(JRExporterParameter.OUTPUT_FILE_NAME, outFileName)
        exporter.setParameter(JRExporterParameter.JASPER_PRINT, print)
        exporter.exportReport()
        println("Created file: " + outFileName)
    }
}
and the description of the exception:
log4j:WARN No appenders could be found for logger (net.sf.jasperreports.extensions.ExtensionsEnvironment).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoSuchMethodError: com.lowagie.text.pdf.PdfWriter.setRgbTransparencyBlending(Z)V
at net.sf.jasperreports.engine.export.JRPdfExporter.exportReportToStream(JRPdfExporter.java:596)
at net.sf.jasperreports.engine.export.JRPdfExporter.exportReport(JRPdfExporter.java:419)
at net.sf.jasperreports.engine.JRExporter$exportReport.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
at ForIReport.main(One.groovy:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
I can create an instance of the PdfWriter class (I checked that myself).
Thank you for your help.
jasperreports-5.1.0 needs itext-2.1.7.
You can see it in the pom of the jasperreports-5.1.0 project:
<dependency>
    <groupId>com.lowagie</groupId>
    <artifactId>itext</artifactId>
    <version>2.1.7.js2</version>
    <scope>compile</scope>
</dependency>
You need to upgrade itext to version 2.1.7 at minimum.
I had the same [runtime] error. What I realized was that I had the wrong jars for the batik library; I got all the version 1.7 jars from org.apache.xmlgraphics. I'm using Jasper this way:
<dependency>
    <groupId>net.sf.jasperreports</groupId>
    <artifactId>jasperreports</artifactId>
    <version>4.0.0</version>
</dependency>
The batik, for example:
<dependency>
    <groupId>org.apache.xmlgraphics</groupId>
    <artifactId>batik-anim</artifactId>
    <version>1.7</version>
</dependency>
Also, I made sure I only had one instance of iText in the pom:
<dependency>
    <groupId>com.lowagie</groupId>
    <artifactId>iText</artifactId>
    <version>2.1.7</version>
</dependency>
Hope that helps.
I had the same issue when retrieving data from a grid and writing to a PDF using flying-saucer-pdf.
The issue was that the com.lowagie (itext) and org.xhtmlrenderer (flying-saucer-pdf) versions were incompatible. Use the following:
<dependency>
    <groupId>com.lowagie</groupId>
    <artifactId>itext</artifactId>
    <version>2.1.7</version>
</dependency>
<dependency>
    <groupId>org.xhtmlrenderer</groupId>
    <artifactId>flying-saucer-pdf</artifactId>
    <version>9.0.7</version>
</dependency>
I have also come across the same situation but finally succeeded in resolving it.
If you are using Maven, then add the dependency below:
<dependency>
    <groupId>org.eclipse.birt.runtime.3_7_1</groupId>
    <artifactId>com.lowagie.text</artifactId>
    <version>2.1.7</version>
</dependency>
or download the jar from the link below and add it to your build path:
com.lowagie.text_2.1.7
It is of no use to add itext-2.1.7.jar; also, the latest version of that library is itextpdf-5.5.9.jar.
If M. Abbas's answer does not work, then please use this dependency:
<dependency>
    <groupId>com.lowagie</groupId>
    <artifactId>itext</artifactId>
    <version>2.1.7</version>
</dependency>
It works for me.
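Since most of these answers come down to "the wrong iText jar is on the classpath", it can save time to ask the JVM which jar PdfWriter was actually loaded from. A small sketch (my addition, plain JDK APIs):
import com.lowagie.text.pdf.PdfWriter;

public class WhichJar {
    public static void main(String[] args) {
        // Prints the location of the jar that supplied PdfWriter at runtime;
        // an iText jar older than 2.1.7 here explains the NoSuchMethodError.
        System.out.println(PdfWriter.class.getProtectionDomain()
                .getCodeSource().getLocation());
    }
}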
