I'm trying to index a real-time stream using Storm and Elasticsearch and I'm getting this exception. I'm using the following version of Elasticsearch:
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>0.90.1</version>
</dependency>
Any pointers on where I should start looking?
org.elasticsearch.transport.RemoteTransportException: Failed to deserialize exception response from stream
Caused by: org.elasticsearch.transport.TransportSerializationException: Failed to deserialize exception response from stream
at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:171)
at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:125)
at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:107)
at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:88)
at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.StreamCorruptedException: unexpected end of block data
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1370)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:499)
at java.lang.Throwable.readObject(Throwable.java:913)
at sun.reflect.GeneratedMethodAccessor59.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1891)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:499)
at java.lang.Throwable.readObject(Throwable.java:913)
at sun.reflect.GeneratedMethodAccessor59.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1891)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:169)
... 23 more
I upgraded to 0.90.2 on all my clients and servers and it works now. I'm not sure if this was the root cause or something else; if I see the error again on the new version I'll post an update here.
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>0.90.2</version>
</dependency>
I found that the problem was something I was doing with the data: I was creating an index with the document type _tweet, and Elasticsearch doesn't allow type names that begin with an underscore.
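For example, with the Java API (a minimal sketch, assuming a connected 0.90.x Client named client), use a type name without the leading underscore:
// "_tweet" is rejected as a type name; a plain "tweet" works
IndexResponse response = client.prepareIndex("tweets", "tweet")   // index, type
        .setSource("{\"user\":\"kimchy\",\"message\":\"indexing from Storm\"}")
        .execute()
        .actionGet();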
I suggest you run elasticsearch -f -D es.config=config/elasticsearch.yml to start the service in the foreground. Then you can read the error output and see why you get this problem.
It might be useful to check whether the Elasticsearch client and server are running the same version. I was using a 0.90.5 server with a 0.90.3 client, and got rid of the exception by upgrading my Java client to 0.90.5.
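A quick way to verify the client side (a sketch; the server's version appears in the "version.number" field of a plain curl http://localhost:9200):
// Version of the elasticsearch jar on the client classpath
System.out.println("Client version: " + org.elasticsearch.Version.CURRENT);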
I am trying to create a table in Redshift from a Spark Dataset. I am using the spark-redshift driver over JDBC to achieve this locally.
The code snippet to do this:
data.write()
    .format("com.databricks.spark.redshift")
    .option("url", "jdbc:redshift://..")                  // Redshift JDBC endpoint
    .option("dbtable", "test_table")                      // target table in Redshift
    .option("tempdir", "s3://temp")                       // S3 staging area used by COPY
    .option("aws_iam_role", "arn:aws:iam::..")            // IAM role Redshift assumes to read the staged files
    .option("extracopyoptions", "region 'us-west-1'")
    .mode(SaveMode.Append).save();
My Maven pom.xml has the following dependency:
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-redshift_2.11</artifactId>
<version>2.0.1</version>
</dependency>
I am using Java 1.8.
I am getting the following error:
java.io.IOException: No FileSystem for scheme: s3
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2660)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
at com.databricks.spark.redshift.Utils$.assertThatFileSystemIsNotS3BlockFileSystem(Utils.scala:156)
at com.databricks.spark.redshift.RedshiftWriter.saveToRedshift(RedshiftWriter.scala:340)
at com.databricks.spark.redshift.DefaultSource.createRelation(DefaultSource.scala:106)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
at com.peak.spark.jobs.SparkDataIngestJob.writeData(SparkDataIngestJob.java:196)
at com.peak.spark.jobs.SparkDataIngestJob.exec(SparkDataIngestJob.java:123)
at com.peak.spark.core.AbstractSparkJob.run(AbstractSparkJob.java:74)
at com.peak.spark.core.SparkAppLauncher.onApplicationEvent(SparkAppLauncher.java:40)
at com.peak.spark.core.SparkAppLauncher.onApplicationEvent(SparkAppLauncher.java:16)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:151)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:128)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:331)
at org.springframework.context.support.AbstractApplicationContext.start(AbstractApplicationContext.java:1174)
at com.peak.spark.core.SparkApp.launch(SparkApp.java:38)
at com.peak.spark.core.SparkApp.main(SparkApp.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.RuntimeException: Spark Job FailedNo FileSystem for scheme: s3
at com.peak.spark.jobs.SparkDataIngestJob.exec(SparkDataIngestJob.java:162)
at com.peak.spark.core.AbstractSparkJob.run(AbstractSparkJob.java:74)
at com.peak.spark.core.SparkAppLauncher.onApplicationEvent(SparkAppLauncher.java:40)
at com.peak.spark.core.SparkAppLauncher.onApplicationEvent(SparkAppLauncher.java:16)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:151)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:128)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:331)
at org.springframework.context.support.AbstractApplicationContext.start(AbstractApplicationContext.java:1174)
at com.peak.spark.core.SparkApp.launch(SparkApp.java:38)
at com.peak.spark.core.SparkApp.main(SparkApp.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Please help me figure out what's wrong here.
I suppose you forgot to include the hadoop-aws package in your project. This package will allow you to work with the s3:// scheme:
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-aws</artifactId>
<version>2.6.0</version>
</dependency>
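If the error persists after adding the dependency, the bare s3:// scheme may also need to be mapped to a concrete FileSystem implementation and given credentials. A hypothetical sketch (the property names assume the S3A connector shipped in hadoop-aws):
SparkSession spark = SparkSession.builder().appName("redshift-ingest").getOrCreate();
org.apache.hadoop.conf.Configuration hadoopConf = spark.sparkContext().hadoopConfiguration();
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem");   // map s3:// to S3A
hadoopConf.set("fs.s3a.access.key", "<your-access-key>");                 // placeholder
hadoopConf.set("fs.s3a.secret.key", "<your-secret-key>");                 // placeholder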
Since you are trying to execute this code on your local system, your code won't know how to reach the S3 file system.
You can do one of two things to solve this problem:
Configure AWS credentials on your system so that your code can reach the S3 bucket. I would not recommend this approach, for various reasons.
Keep your file path in a configuration file. Use two configuration files: one for testing and one for the production environment. In the test environment use paths like c:\path\to\your\dummy\folder\, and in the production configuration file use paths like s3://your_bucket_name/path/in/bucket, as in the sketch below.
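A hypothetical sketch of the second option (the file name and key are made up):
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;

Properties props = new Properties();
try (InputStream in = Files.newInputStream(Paths.get("env.properties"))) {   // hypothetical per-environment file
    props.load(in);
}
String tempDir = props.getProperty("tempdir");   // dummy folder in test, s3://bucket/... in production
// then pass it to the writer shown above: .option("tempdir", tempDir)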
Hope it helps.
When I try to connect to my Hive metastore, I get this exception:
javax.jdo.JDOFatalInternalException: Unexpected exception caught.
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:412)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:441)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:292)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:60)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:69)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:681)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:659)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:708)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:507)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6286)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:207)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:187)
at com.bouygtel.jupiter.dic.catalogue.AspirationMetastore.aspirer(AspirationMetastore.java:146)
at com.bouygtel.jupiter.dic.integration.adaptateurs.AdaptateurUpdate.lambda$updateDatabase$0(AdaptateurUpdate.java:68)
at com.bouygtel.commun.summer.socle.coeur.threads.SummerExecutorServiceFactory$3.run(SummerExecutorServiceFactory.java:160)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
... 25 common frames omitted
Caused by: org.datanucleus.exceptions.NucleusUserException: Error : Could not find API definition for name "JDO". Perhaps you dont have the requisite datanucleus-api-XXX jar in the CLASSPATH?
at org.datanucleus.api.ApiAdapterFactory.getApiAdapter(ApiAdapterFactory.java:93)
at org.datanucleus.AbstractNucleusContext.<init>(AbstractNucleusContext.java:118)
at org.datanucleus.PersistenceNucleusContextImpl.<init>(PersistenceNucleusContextImpl.java:170)
at org.datanucleus.PersistenceNucleusContextImpl.<init>(PersistenceNucleusContextImpl.java:159)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:436)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:314)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:223)
... 33 common frames omitted
But I have this in my pom.xml
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-api-jdo</artifactId>
<version>5.0.3</version>
</dependency>
The weird thing is that this only happens when I run my code from the jar; when I run it from Eclipse, it works perfectly.
I double-checked with "jar tf", and the jar that I run does contain "datanucleus-api-jdo-5.0.3.jar".
I've searched for conflicts in my dependency hierarchy. There is one dependency that also pulls in an older version of datanucleus-api-jdo, but it is "omitted for conflict" according to Eclipse, and excluding it does not solve the problem.
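To see which jars actually win at runtime, a diagnostic sketch (run it both from the packaged jar and from Eclipse and compare):
// Print where the JDO API and DataNucleus classes are loaded from
System.out.println(javax.jdo.JDOHelper.class.getProtectionDomain().getCodeSource().getLocation());
System.out.println(org.datanucleus.api.ApiAdapterFactory.class.getProtectionDomain().getCodeSource().getLocation());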
Not sure if it helps, but the run configuration within Eclipse is "clean package spring-boot:run".
Any ideas what is causing this?
If you think any information is missing, please ask.
I am receiving an error when running my Spring Boot application with embedded Tomcat (I haven't tried with a standalone Tomcat server yet).
java.lang.IllegalStateException: Tomcat 7 reflection failed
at org.springframework.boot.context.embedded.tomcat.SkipPatternJarScanner.scan(SkipPatternJarScanner.java:77)
at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:271)
at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:590)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5522)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1571)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1561)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.context.embedded.tomcat.SkipPatternJarScanner.scan(SkipPatternJarScanner.java:73)
... 12 common frames omitted
Caused by: java.lang.NoClassDefFoundError: oracle/i18n/util/LocaleMapper
at oracle.xml.parser.v2.XMLReader.setEncoding(XMLReader.java:980)
at oracle.xml.parser.v2.XMLReader.checkXMLDecl(XMLReader.java:3284)
at oracle.xml.parser.v2.XMLReader.pushXMLReader(XMLReader.java:570)
at oracle.xml.parser.v2.XMLReader.pushXMLReader(XMLReader.java:274)
at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:232)
at org.apache.tomcat.util.digester.Digester.parse(Digester.java:1576)
at org.apache.catalina.startup.TldConfig.tldScanStream(TldConfig.java:565)
at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:513)
at org.apache.catalina.startup.TldConfig.access$200(TldConfig.java:61)
at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:300)
at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:259)
at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:221)
... 17 common frames omitted
Caused by: java.lang.ClassNotFoundException: oracle.i18n.util.LocaleMapper
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 29 common frames omitted
On further inspection, it appears that the orai18n.jar pulled in by the ojdbc7 Maven dependency contained this class in version 11.2.0.3, but the newer (12.1.0.2) jar does not contain it.
After much hassle, I've found the root cause (Edit: actually the better answer provides a more appropriate cause/solution; see Serri's answer):
According to this bug comment in spring-boot, the Oracle SAXParserFactory implementation, and (run into later, after solving the first) the Oracle DocumentBuilderFactory implementation, are picked up instead of the default Xerces implementations. Overriding the implementations via META-INF/services/<class-name> files solved the issue:
in my.jar:
META-INF/services/javax.xml.parsers.DocumentBuilderFactory
META-INF/services/javax.xml.parsers.SAXParserFactory
and each file names the default JDK implementation:
// META-INF/services/javax.xml.parsers.DocumentBuilderFactory
com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl
// META-INF/services/javax.xml.parsers.SAXParserFactory
com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl
This makes each service finder resolve the default Xerces implementation bundled with the standard Java library.
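A quick diagnostic sketch to confirm which implementations the service finders resolve at runtime:
System.out.println(javax.xml.parsers.DocumentBuilderFactory.newInstance().getClass().getName());
System.out.println(javax.xml.parsers.SAXParserFactory.newInstance().getClass().getName());
// With the service files above in place, both should print com.sun.org.apache.xerces.internal names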
It's a conflict with xmlparserv2.
Try this:
<dependency>
<groupId>com.oracle.jdbc</groupId>
<artifactId>ojdbc7</artifactId>
<version>12.1.0.2</version>
<exclusions>
<exclusion>
<groupId>com.oracle.jdbc</groupId>
<artifactId>xmlparserv2</artifactId>
</exclusion>
</exclusions>
</dependency>
In my case I had the Oracle jars installed under $CATALINA_HOME/lib to be able to set up a JNDI DB connection in Tomcat; simply removing xmlparserv2.jar from the $CATALINA_HOME/lib directory resolved the problem.
Newer releases changed the coordinates: for xdb, use the com.oracle.ojdbc group instead of com.oracle.jdbc:
<dependency>
<groupId>com.oracle.ojdbc</groupId>
<artifactId>xdb</artifactId>
<version>19.3.0.0</version>
<exclusions>
<exclusion>
<groupId>com.oracle.ojdbc</groupId>
<artifactId>xmlparserv2</artifactId>
</exclusion>
</exclusions>
</dependency>
The cause of the problem is as Serri said. In addition to Serri's solution, if you use IntelliJ IDEA as your coding tool, check Project Structure > Artifacts and remove the dependency on xmlparserv2 there; this worked for me.
Remove conflicting dependencies
I have a web application which sends mail. To send mail I use a library which is itself based on Apache Commons Email 1.4. When I used WebSphere Liberty 8.5.5.7 everything worked fine, but after I upgraded to 8.5.5.9 it stopped working and now throws javax.mail.NoSuchProviderException; the full stack trace is posted at the end of the question.
Again, I've tried to run it on both installations, WebSphere Liberty 8.5.5.7 and 8.5.5.9: on the first it works fine, on the second it's broken. Looking for the difference, it appears that Liberty itself ships a com.ibm.ws.javax.mail-1.5_1.5.xx.jar package which interferes with the javax.mail-1.5.2 jar used by commons-email 1.4 (see its pom.xml here).
After some debugging I noticed that Liberty 8.5.5.9 uses com.ibm.ws.javax.mail-1.5_1.5.12.jar while Liberty 8.5.5.7 uses com.ibm.ws.javax.mail-1.5_1.5.10.jar; possibly this is the cause of my bug.
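A diagnostic sketch (hypothetical, just for debugging) to confirm which javax.mail implementation wins at runtime and which providers it registers:
// Where javax.mail.Session was actually loaded from (Liberty's jar vs. javax.mail-1.5.2)
System.out.println(javax.mail.Session.class.getProtectionDomain().getCodeSource().getLocation());
// List the registered providers; "smtp" should be among them
javax.mail.Session session = javax.mail.Session.getInstance(new java.util.Properties());
for (javax.mail.Provider p : session.getProviders()) {
    System.out.println(p);
}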
So, my question is: how can I eliminate the influence of the com.ibm.ws.javax.mail package that comes with Liberty, or is there another workaround?
Also, where can I submit a bug report to the Liberty team?
Update:
It looks like Liberty has a JavaMail 1.5 feature which can be enabled or disabled. I didn't enable it explicitly, so one of my features must have enabled it implicitly. This is my feature list:
<featureManager>
<feature>jdbc-4.1</feature>
<feature>adminCenter-1.0</feature>
<feature>jndi-1.0</feature>
<feature>localConnector-1.0</feature>
<feature>servlet-3.1</feature>
<feature>ssl-1.0</feature>
<feature>jaxrs-2.0</feature>
<feature>cdi-1.2</feature>
<feature>ejbLite-3.2</feature>
<feature>jsf-2.2</feature>
</featureManager>
Update 2
I've solved this issue; see my answer below, based on the article Overriding a provided API with an alternative version.
Full stack trace of the exception:
org.apache.commons.mail.EmailException: Sending the email to the following server failed : relay.******.com:25
at org.apache.commons.mail.Email.sendMimeMessage(Email.java:1421) ~[commons-email-1.4.jar:1.4]
at org.apache.commons.mail.Email.send(Email.java:1448) ~[commons-email-1.4.jar:1.4]
at my.package.mailer.sendHtmlMail(IBMMail.java:96) ~[classes/:?]
at my.package.mailer.EmailGateway.sendMail(EmailGateway.java:24) [classes/:?]
at my.package.mailer.EventMailGateway.eventCreatedNotification(EventMailGateway.java:52) [classes/:?]
at **********.restful.EventsAPI.saveEvent(EventsAPI.java:99) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_45]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_45]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_45]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_45]
at com.ibm.ws.jaxrs20.server.LibertyJaxRsServerFactoryBean.performInvocation(LibertyJaxRsServerFactoryBean.java:636) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at com.ibm.ws.jaxrs20.server.LibertyJaxRsInvoker.performInvocation(LibertyJaxRsInvoker.java:115) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:97) [cxf-core-3.0.3.jar:3.0.3]
at com.ibm.ws.jaxrs20.server.LibertyJaxRsInvoker.invoke(LibertyJaxRsInvoker.java:210) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:200) [cxf-rt-frontend-jaxrs-3.0.3.jar:3.0.3]
at com.ibm.ws.jaxrs20.server.LibertyJaxRsInvoker.invoke(LibertyJaxRsInvoker.java:381) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:99) [cxf-rt-frontend-jaxrs-3.0.3.jar:3.0.3]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:59) [cxf-core-3.0.3.jar:3.0.3]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:96) [cxf-core-3.0.3.jar:3.0.3]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) [cxf-core-3.0.3.jar:3.0.3]
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:124) [com.ibm.ws.jaxrs-2.0.common_1.0.12.jar:3.0.3]
at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:256) [cxf-rt-transports-http-3.0.3.jar:3.0.3]
at com.ibm.ws.jaxrs20.endpoint.AbstractJaxRsWebEndpoint.invoke(AbstractJaxRsWebEndpoint.java:134) [com.ibm.ws.jaxrs-2.0.common_1.0.12.jar:?]
at com.ibm.websphere.jaxrs.server.IBMRestServlet.handleRequest(IBMRestServlet.java:149) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at com.ibm.websphere.jaxrs.server.IBMRestServlet.doPost(IBMRestServlet.java:107) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) [com.ibm.ws.javaee.servlet.3.1_1.0.12.jar:?]
at com.ibm.websphere.jaxrs.server.IBMRestServlet.service(IBMRestServlet.java:99) [com.ibm.ws.jaxrs-2.0.server_1.0.12.jar:?]
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1290) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:778) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:475) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1161) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.webcontainer.servlet.CacheServletWrapper.handleRequest(CacheServletWrapper.java:82) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:928) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost$2.run(DynamicVirtualHost.java:262) [com.ibm.ws.webcontainer_1.1.12.jar:?]
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink$TaskWrapper.run(HttpDispatcherLink.java:955) [com.ibm.ws.transport.http_1.0.12.jar:?]
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink.ready(HttpDispatcherLink.java:341) [com.ibm.ws.transport.http_1.0.12.jar:?]
at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:471) [com.ibm.ws.transport.http_1.0.12.jar:?]
at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.handleNewRequest(HttpInboundLink.java:405) [com.ibm.ws.transport.http_1.0.12.jar:?]
at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.processRequest(HttpInboundLink.java:285) [com.ibm.ws.transport.http_1.0.12.jar:?]
at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.ready(HttpInboundLink.java:256) [com.ibm.ws.transport.http_1.0.12.jar:?]
at com.ibm.ws.tcpchannel.internal.NewConnectionInitialReadCallback.sendToDiscriminators(NewConnectionInitialReadCallback.java:174) [com.ibm.ws.channelfw_1.0.12.jar:?]
at com.ibm.ws.tcpchannel.internal.NewConnectionInitialReadCallback.complete(NewConnectionInitialReadCallback.java:83) [com.ibm.ws.channelfw_1.0.12.jar:?]
at com.ibm.ws.tcpchannel.internal.WorkQueueManager.requestComplete(WorkQueueManager.java:504) [com.ibm.ws.channelfw_1.0.12.jar:?]
at com.ibm.ws.tcpchannel.internal.WorkQueueManager.attemptIO(WorkQueueManager.java:574) [com.ibm.ws.channelfw_1.0.12.jar:?]
at com.ibm.ws.tcpchannel.internal.WorkQueueManager.workerRun(WorkQueueManager.java:929) [com.ibm.ws.channelfw_1.0.12.jar:?]
at com.ibm.ws.tcpchannel.internal.WorkQueueManager$Worker.run(WorkQueueManager.java:1018) [com.ibm.ws.channelfw_1.0.12.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_45]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_45]
Caused by: javax.mail.NoSuchProviderException: smtp
at javax.mail.Session.getService(Session.java:820) ~[javax.mail-1.5.2.jar:1.5.2]
at javax.mail.Session.getTransport(Session.java:742) ~[javax.mail-1.5.2.jar:1.5.2]
at javax.mail.Session.getTransport(Session.java:682) ~[javax.mail-1.5.2.jar:1.5.2]
at javax.mail.Session.getTransport(Session.java:662) ~[javax.mail-1.5.2.jar:1.5.2]
at javax.mail.Session.getTransport(Session.java:719) ~[javax.mail-1.5.2.jar:1.5.2]
at javax.mail.Transport.send0(Transport.java:248) ~[javax.mail-1.5.2.jar:1.5.2]
at javax.mail.Transport.send(Transport.java:124) ~[javax.mail-1.5.2.jar:1.5.2]
at org.apache.commons.mail.Email.sendMimeMessage(Email.java:1411) ~[commons-email-1.4.jar:1.4]
... 48 more
I've solved this issue by using this article: Overriding a provided API with an alternative version.
Just make the following changes in your server.xml:
<application id="" name="Scholar" type="ear" location="scholar.ear">
<classloader delegation="parentLast" />
</application>
Now your libraries will take precedence over Liberty's libraries.
I hit javax.mail.NoSuchProviderException: smtp after updating from Liberty 8.5.5.8 to 8.5.5.9. Adding <feature>javaMail-1.5</feature> to my server.xml worked for me.
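For reference, the feature goes inside the featureManager block shown earlier:
<featureManager>
    <feature>javaMail-1.5</feature>
    <!-- ...existing features... -->
</featureManager>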
How do I use the endorsed standards mechanism to use Xerces in a GlassFish 4.0 WAR application? According to the documentation*, you should put it into domain-dir/lib/endorsed. However, when I put xercesImpl, xml-apis, and xml-resolver there, the classes are not available to the classes in my WAR. This is most probably due to an inconsistency between the docs and the default config: the domain.xml does not list domain-dir/lib/endorsed.
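For reference, the directories the JVM actually scans for endorsed overrides can be printed from any class in the WAR (a diagnostic sketch; the mechanism exists only pre-Java 9):
System.out.println(System.getProperty("java.endorsed.dirs"));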
When I include it or put the files into glassfish-install-dir/lib/endorsed, the startup fails with the following exception:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain.main(GlassFishMain.java:97)
at com.sun.enterprise.glassfish.bootstrap.ASMain.main(ASMain.java:54)
Caused by: A MultiException has 5 exceptions. They are:
1. javax.xml.stream.FactoryConfigurationError: Provider com.ctc.wstx.stax.WstxOutputFactory not found
2. java.lang.IllegalStateException: Unable to perform operation: create on com.sun.enterprise.v3.server.DomainXmlPersistence
3. javax.xml.stream.FactoryConfigurationError: Provider com.ctc.wstx.stax.WstxInputFactory not found
4. java.lang.IllegalArgumentException: While attempting to resolve the dependencies of com.sun.enterprise.v3.server.GFDomainXml errors were found
5. java.lang.IllegalStateException: Unable to perform operation: resolve on com.sun.enterprise.v3.server.GFDomainXml
at org.jvnet.hk2.internal.Collector.throwIfErrors(Collector.java:88)
at org.jvnet.hk2.internal.ClazzCreator.resolveAllDependencies(ClazzCreator.java:252)
at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:360)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:461)
at org.jvnet.hk2.internal.SingletonContext$1.compute(SingletonContext.java:114)
at org.jvnet.hk2.internal.SingletonContext$1.compute(SingletonContext.java:102)
at org.glassfish.hk2.utilities.cache.Cache$OriginThreadAwareFuture$1.call(Cache.java:97)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.glassfish.hk2.utilities.cache.Cache$OriginThreadAwareFuture.run(Cache.java:154)
at org.glassfish.hk2.utilities.cache.Cache.compute(Cache.java:199)
at org.jvnet.hk2.internal.SingletonContext.findOrCreate(SingletonContext.java:153)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2258)
at org.jvnet.hk2.internal.ServiceLocatorImpl.internalGetAllServiceHandles(ServiceLocatorImpl.java:1372)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getAllServices(ServiceLocatorImpl.java:726)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getAllServices(ServiceLocatorImpl.java:714)
at org.jvnet.hk2.config.ConfigurationPopulator.populateConfig(ConfigurationPopulator.java:56)
at org.glassfish.hk2.bootstrap.HK2Populator.populateConfig(HK2Populator.java:106)
at com.sun.enterprise.module.common_impl.AbstractModulesRegistryImpl.populateConfig(AbstractModulesRegistryImpl.java:211)
at com.sun.enterprise.module.bootstrap.Main.createServiceLocator(Main.java:273)
at org.jvnet.hk2.osgiadapter.HK2Main.createServiceLocator(HK2Main.java:120)
at com.sun.enterprise.glassfish.bootstrap.osgi.EmbeddedOSGiGlassFishRuntime.newGlassFish(EmbeddedOSGiGlassFishRuntime.java:95)
at com.sun.enterprise.glassfish.bootstrap.GlassFishRuntimeDecorator.newGlassFish(GlassFishRuntimeDecorator.java:68)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntime.newGlassFish(OSGiGlassFishRuntime.java:91)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain$Launcher.launch(GlassFishMain.java:113)
... 6 more
Caused by: javax.xml.stream.FactoryConfigurationError: Provider com.ctc.wstx.stax.WstxOutputFactory not found
at javax.xml.stream.XMLOutputFactory.newInstance(Unknown Source)
at com.sun.enterprise.v3.server.DomainXmlPersistence.<init>(DomainXmlPersistence.java:85)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.glassfish.hk2.utilities.reflection.ReflectionHelper.makeMe(ReflectionHelper.java:1107)
at org.jvnet.hk2.internal.ClazzCreator.createMe(ClazzCreator.java:274)
at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:368)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:461)
at org.jvnet.hk2.internal.SingletonContext$1.compute(SingletonContext.java:114)
at org.jvnet.hk2.internal.SingletonContext$1.compute(SingletonContext.java:102)
at org.glassfish.hk2.utilities.cache.Cache$OriginThreadAwareFuture$1.call(Cache.java:97)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.glassfish.hk2.utilities.cache.Cache$OriginThreadAwareFuture.run(Cache.java:154)
at org.glassfish.hk2.utilities.cache.Cache.compute(Cache.java:199)
at org.jvnet.hk2.internal.SingletonContext.findOrCreate(SingletonContext.java:153)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2258)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getService(ServiceLocatorImpl.java:638)
at org.jvnet.hk2.internal.ThreeThirtyResolver.resolve(ThreeThirtyResolver.java:77)
at org.jvnet.hk2.internal.ClazzCreator.resolve(ClazzCreator.java:214)
at org.jvnet.hk2.internal.ClazzCreator.resolveAllDependencies(ClazzCreator.java:237)
... 28 more
How do I fix this? Downloading the Woodstox StAX parser into lib/endorsed, as suggested in this answer, did not work.
--
* That's the documentation for GlassFish 3, but it's exactly the same as the documentation for GlassFish 4, which I could only find as a PDF.