I am trying to integrate relayrides/pushy, but am getting the following runtime error:
Jun 28, 2017 2:06:58 PM com.turo.pushy.apns.SslUtil getSslProvider
INFO: Native SSL provider not available; will use JDK SSL provider.
Exception in thread "main" java.lang.NoClassDefFoundError: io/netty/handler/ssl/SslContextBuilder
at com.turo.pushy.apns.ApnsClientBuilder.build(ApnsClientBuilder.java:396)
at com.jobs.spring.service.NotificationServiceImpl.sendIOSPushNotification(NotificationServiceImpl.java:122)
Caused by: java.lang.ClassNotFoundException: io.netty.handler.ssl.SslContextBuilder
pom.xml
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.2</version>
</dependency>
<!-- Push Notifications -->
<dependency>
<groupId>com.turo</groupId>
<artifactId>pushy</artifactId>
<version>0.10</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative</artifactId>
<version>2.0.5.Final</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-handler</artifactId>
<version>4.0.27.Final</version>
</dependency>
<dependency>
<groupId>com.ning</groupId>
<artifactId>async-http-client</artifactId>
<version>1.9.40</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.1</version>
</dependency>
java
final ApnsClient apnsClient = new ApnsClientBuilder()
        .setClientCredentials(new File(PATH_TO_P12_CERT), CERT_PASSWORD) // P12 certificate file and its password
        .build();
I am guessing my Maven dependencies are incorrect. Any help appreciated.
UPDATE
I updated my dependencies to:
<dependency>
<groupId>com.turo</groupId>
<artifactId>pushy</artifactId>
<version>0.10</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.11.Final</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative</artifactId>
<version>2.0.1.Final</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.6</version>
</dependency>
But now I get:
Jun 28, 2017 2:40:18 PM com.turo.pushy.apns.SslUtil getSslProvider
INFO: Native SSL provider not available; will use JDK SSL provider.
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.bootstrap.Bootstrap.config()Lio/netty/bootstrap/BootstrapConfig;
at com.turo.pushy.apns.ApnsClient.<init>(ApnsClient.java:172)
at com.turo.pushy.apns.ApnsClientBuilder.build(ApnsClientBuilder.java:420)
at com.jobs.spring.service.NotificationServiceImpl.sendIOSPushNotification(NotificationServiceImpl.java:121)
It seems like you are mixing Netty 4.1 and 4.0. You need to use only 4.1 if you want to use Pushy.
Did you clean your classpath in between? I created a test project containing only the dependency
<dependency>
<groupId>com.turo</groupId>
<artifactId>pushy</artifactId>
<version>0.10</version>
</dependency>
and a simple main class that just creates the client with the snippet you are using above, and it works out fine.
To me it looks like there might be two versions of BootstrapConfig on the classpath. Try removing all dependencies besides Pushy and cleaning/refreshing the Maven dependencies; a minimal sketch follows.
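As a starting point, a dependencies section as minimal as the test project described above (version taken from the question; Pushy 0.10 pulls in its own matching Netty 4.1 artifacts transitively, so no explicit Netty entries should be needed) avoids mixing 4.0 and 4.1:
<dependencies>
    <dependency>
        <groupId>com.turo</groupId>
        <artifactId>pushy</artifactId>
        <version>0.10</version>
    </dependency>
</dependencies>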
Related
I'm building a batch with Pentaho Kettle to process some files. This batch is started by a servlet. However, when I try to start it, I get the following warning, which actually blocks the process:
WARN [org.jboss.modules] (Batch Thread - 2) Failed to define class org.pentaho.di.core.attributes.metastore.EmbeddedMetaStore in Module "deployment.my-program-1.0.0.0-SNAPSHOT.war:main" from Service Module Loader: java.lang.NoClassDefFoundError: Failed to link org/pentaho/di/core/attributes/metastore/EmbeddedMetaStore (Module "deployment.my-program-1.0.0.0-SNAPSHOT.war:main" from Service Module Loader): org/pentaho/metastore/api/BaseMetaStore
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.jboss.modules.ModuleClassLoader.defineClass(ModuleClassLoader.java:446)
at org.jboss.modules.ModuleClassLoader.loadClassLocal(ModuleClassLoader.java:274)
at org.jboss.modules.ModuleClassLoader$1.loadClassLocal(ModuleClassLoader.java:78)
at org.jboss.modules.Module.loadModuleClass(Module.java:605)
at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:190)
at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:363)
at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:351)
at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:93)
at org.pentaho.di.base.AbstractMeta.<init>(AbstractMeta.java:162)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:201)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:755)
However, when I look in my lib folder, the jar containing this class (kettle-engine-8.2.0.0-342.jar) is there and the class is also in it.
I'm using a JBoss EAP 7.0 with Java 1.8.0.161 on my computer for development. I start it with my IDE (Eclipse Photon). Here are my dependencies in the pom.xml:
<dependencies>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-api</artifactId>
<version>7.0</version>
</dependency>
<!-- vf2 debug -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-vfs2</artifactId>
<version>2.2</version>
</dependency>
<!-- maven-war-plugin-->
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>3.2.2</version>
</dependency>
<!-- pentaho kettle -->
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-core</artifactId>
<version>8.2.0.0-342</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-engine</artifactId>
<version>8.2.0.0-342</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>janino</artifactId>
<version>2.5.16</version>
</dependency>
<dependency>
<groupId>org.mozilla</groupId>
<artifactId>rhino</artifactId>
<version>1.7R3</version>
</dependency>
<!-- this corrects an error -->
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.6</version>
</dependency>
<!-- netty version override to correct a netty 3.7 bug / FYI, pentaho doesn't use netty anyway -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
<version>3.9.0.Final</version>
</dependency>
</dependencies>
Important: this issue does not occur when using the previous version of kettle-core and kettle-engine (tested with 7.0).
It's an embedded call so I do not directly call this class in my code.
I do not have multiple versions of this jar in my classpath.
Another person in my company found the cause, so I'll just answer here to close the topic.
I didn't show it here, but among my dependencies I had a dependency on a custom framework from work. This framework actually included Pentaho, but the 7.0 version. So #suresh was right when he talked about multiple JAR versions.
So the answer is:
Check whether any dependency transitively pulls in a duplicate of a dependency you declare elsewhere. To avoid this issue, use the exclusion syntax in the dependency that pulls in the wrong versions (complete example after the snippet):
<exclusions>
<exclusion>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-engine</artifactId>
</exclusion>
<exclusion>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-core</artifactId>
</exclusion>
</exclusions>
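For completeness, the exclusions element has to sit inside the dependency that drags in the old Pentaho artifacts; the coordinates below are placeholders for the internal framework mentioned above:
<dependency>
    <groupId>com.example</groupId> <!-- placeholder for the internal framework's groupId -->
    <artifactId>custom-framework</artifactId> <!-- placeholder artifactId -->
    <version>1.0.0</version> <!-- placeholder version -->
    <exclusions>
        <exclusion>
            <groupId>pentaho-kettle</groupId>
            <artifactId>kettle-engine</artifactId>
        </exclusion>
        <exclusion>
            <groupId>pentaho-kettle</groupId>
            <artifactId>kettle-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>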
I am getting all these warnings from Tika when I try to use it:
Feb 24, 2018 9:24:35 PM org.apache.tika.config.InitializableProblemHandler$3 handleInitializableProblem
WARNING: JBIG2ImageReader not loaded. jbig2 files will be ignored. See https://pdfbox.apache.org/2.0/dependencies.html#jai-image-io for optional dependencies.
TIFFImageWriter not loaded. tiff files will not be processed. See https://pdfbox.apache.org/2.0/dependencies.html#jai-image-io for optional dependencies.
J2KImageReader not loaded. JPEG2000 files will not be processed. See https://pdfbox.apache.org/2.0/dependencies.html#jai-image-io for optional dependencies.
Feb 24, 2018 9:24:35 PM org.apache.tika.config.InitializableProblemHandler$3 handleInitializableProblem
WARNING: org.xerial's sqlite-jdbc is not loaded. Please provide the jar on your classpath to parse sqlite files. See tika-parsers/pom.xml for the correct version.
I tried adding this (in Tika pom.xml):
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcprov-jdk15on</artifactId>
<version>1.57</version>
</dependency>
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcmail-jdk15on</artifactId>
<version>1.57</version>
</dependency>
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcpkix-jdk15on</artifactId>
<version>1.57</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>com.levigo.jbig2</groupId>
<artifactId>levigo-jbig2-imageio</artifactId>
<version>2.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.github.jai-imageio</groupId>
<artifactId>jai-imageio-core</artifactId>
<version>1.3.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.github.jai-imageio</groupId>
<artifactId>jai-imageio-jpeg2000</artifactId>
<version>1.3.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.xerial</groupId>
<artifactId>sqlite-jdbc</artifactId>
<version>3.20.1</version>
</dependency>
But I still get the same warnings.
How do I resolve this?
UPDATE 1
My dependencies were added here: https://github.com/apache/tika/blob/1.17/pom.xml#L164-L170
Also, I did try without the scope set to test. It did not make any difference.
The dependencies that I added seem to be for PDFBox, a Tika dependency.
I added the following dependencies and I didn't get any further warnings:
<dependency>
<groupId>org.apache.tika</groupId>
<artifactId>tika-core</artifactId>
<version>1.18</version>
</dependency>
<dependency>
<groupId>org.apache.tika</groupId>
<artifactId>tika-parsers</artifactId>
<version>1.18</version>
</dependency>
<dependency>
<groupId>org.apache.pdfbox</groupId>
<artifactId>jbig2-imageio</artifactId>
<version>3.0.1</version>
</dependency>
<dependency>
<groupId>com.github.jai-imageio</groupId>
<artifactId>jai-imageio-jpeg2000</artifactId>
<version>1.3.0</version>
</dependency>
It's hard to see exactly what is happening because you did not include your entire <dependencies>...</dependencies> section of your pom.xml, but I suspect it is due to optional Maven dependencies. According to the Maven docs, you need to declare optional dependencies in your own pom or they will not be pulled in.
Additionally, all of your imageio dependencies have <scope>test</scope>, making them usable only during unit testing; see the sketch below for the same dependencies with default compile scope.
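For instance, redeclaring the image-related dependencies with the default compile scope (coordinates and versions taken from the question's own snippet) would make them available at runtime rather than only in tests:
<dependency>
    <groupId>com.levigo.jbig2</groupId>
    <artifactId>levigo-jbig2-imageio</artifactId>
    <version>2.0</version>
</dependency>
<dependency>
    <groupId>com.github.jai-imageio</groupId>
    <artifactId>jai-imageio-core</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>com.github.jai-imageio</groupId>
    <artifactId>jai-imageio-jpeg2000</artifactId>
    <version>1.3.0</version>
</dependency>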
For Clojure visitors: I fixed it with:
(System/setProperty "tika.config" "tika-config.xml")
in my config.clj file. The xml is just:
<?xml version="1.0" encoding="UTF-8"?>
<properties>
<service-loader initializableProblemHandler="ignore"/>
</properties>
This XML file is in the "resources" dir, and that dir must be on your classpath.
The pointer to the optional dependencies is now documented in the warning itself:
Feb 19, 2019 3:18:44 PM org.apache.tika.config.InitializableProblemHandler$3 handleInitializableProblem
WARNING: J2KImageReader not loaded. JPEG2000 files will not be processed. See https://pdfbox.apache.org/2.0/dependencies.html#jai-image-io for optional dependencies.
However, I'd prefer to have a version of Tika (e.g., with a classifier) that does not include OCR/image processing when I only want to parse text, or an option to turn off the error logging (and only log an error when actually trying to load an unsupported format).
I'm getting an exception whilst using Google Pub/Sub to list topics; my web application is running on Tomcat.
public static List<String> listTopics(GcpCredentials gcCredentials, String project) throws GCPException, IOException
{
    List<String> topics = new ArrayList<>();
    // build an admin client from the supplied credentials and page through every topic in the project
    TopicAdminClient client = getTopicClient(gcCredentials);
    ProjectName projectName = ProjectName.create(project);
    ListTopicsPagedResponse response = client.listTopics(projectName);
    for (Topic topic : response.iterateAll())
    {
        topics.add(topic.getNameAsTopicName().getTopic());
    }
    return topics;
}
Exception:
java.lang.IllegalArgumentException: Jetty ALPN/NPN has not been properly configured.
at io.grpc.netty.GrpcSslContexts.selectApplicationProtocolConfig(GrpcSslContexts.java:174)
at io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:151)
at io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:139)
at io.grpc.netty.GrpcSslContexts.forClient(GrpcSslContexts.java:109)
at io.grpc.netty.NettyChannelBuilder.createProtocolNegotiatorByType(NettyChannelBuilder.java:335)
at io.grpc.netty.NettyChannelBuilder.createProtocolNegotiator(NettyChannelBuilder.java:308)
at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory$DynamicNettyTransportParams.getProtocolNegotiator(NettyChannelBuilder.java:499)
at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.newClientTransport(NettyChannelBuilder.java:448)
at io.grpc.internal.CallCredentialsApplyingTransportFactory.newClientTransport(CallCredentialsApplyingTransportFactory.java:61)
at io.grpc.internal.InternalSubchannel.startNewTransport(InternalSubchannel.java:209)
at io.grpc.internal.InternalSubchannel.obtainActiveTransport(InternalSubchannel.java:186)
at io.grpc.internal.ManagedChannelImpl$SubchannelImplImpl.obtainActiveTransport(ManagedChannelImpl.java:806)
at io.grpc.internal.GrpcUtil.getTransportFromPickResult(GrpcUtil.java:568)
at io.grpc.internal.DelayedClientTransport.reprocess(DelayedClientTransport.java:296)
at io.grpc.internal.ManagedChannelImpl$LbHelperImpl$5.run(ManagedChannelImpl.java:724)
at io.grpc.internal.ChannelExecutor.drain(ChannelExecutor.java:87)
at io.grpc.internal.ManagedChannelImpl$LbHelperImpl.runSerialized(ManagedChannelImpl.java:715)
at io.grpc.internal.ManagedChannelImpl$NameResolverListenerImpl.onUpdate(ManagedChannelImpl.java:752)
at io.grpc.internal.DnsNameResolver$1.run(DnsNameResolver.java:174)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
I have observed this problem with Netty version 4.1.15.Final but not with 4.1.13.Final. Check your transitive dependencies; e.g., Spring Boot references Netty.
This is what I added to the POM to make it work with the Spanner API version 0.22.0-beta:
<properties>
<v.netty>4.1.13.Final</v.netty>
</properties>
...
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec-http</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec-http2</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-handler</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-common</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-handler-proxy</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-transport</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-resolver</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-codec-socks</artifactId>
<version>${v.netty}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-buffer</artifactId>
<version>${v.netty}</version>
</dependency>
</dependencies>
</dependencyManagement>
If the problem persists, or if that is not an option, please run your JVM with an appropriate bootclasspath entry, like:
java -Xbootclasspath/p:/tmp/alpn-boot-8.1.11.v20170118.jar -cp ...
Make sure to replace /tmp/alpn-boot-8.1.11.v20170118.jar with location of alpn jar that matches your JVM version as listed on this page: https://www.eclipse.org/jetty/documentation/9.4.x/alpn-chapter.html
There is a possible solution on GitHub: "Exception when configuring SSL: Jetty ALPN/NPN has not been properly configured".
The recommended steps, quoting from the above document:
1) Edit catalina.sh
/usr/local/Cellar/tomcat/8.5.3/libexec/bin/catalina.sh
2) At line 103, add the following
CATALINA_OPTS="-javaagent:/Library/Tomcat/lib/jetty-alpn-agent-2.0.6.jar"
3) Restart tomcat
../bin/shutdown.sh ../bin/startup.sh
I've embedded Neo4j 3.0.1 into a Java 8 application, but I've been running into SPI issues.
Running from within IntelliJ produces the correct results as expected, but as soon as I build the artifact to a JAR, run it and attempt to write to Neo4j, I get the following exception:
Caused by: org.neo4j.kernel.impl.store.UnderlyingStorageException: java.lang.IllegalArgumentException: An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'BlockTreeOrds' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [Lucene50]
at org.neo4j.kernel.impl.transaction.command.LabelUpdateWork.apply(LabelUpdateWork.java:62)
at org.neo4j.kernel.impl.transaction.command.LabelUpdateWork.apply(LabelUpdateWork.java:33)
at org.neo4j.concurrent.WorkSync.doSynchronizedWork(WorkSync.java:121)
at org.neo4j.concurrent.WorkSync.apply(WorkSync.java:90)
at org.neo4j.kernel.impl.transaction.command.IndexBatchTransactionApplier.close(IndexBatchTransactionApplier.java:105)
at org.neo4j.kernel.impl.api.BatchTransactionApplierFacade.close(BatchTransactionApplierFacade.java:70)
at org.neo4j.kernel.impl.storageengine.impl.recordstorage.RecordStorageEngine.apply(RecordStorageEngine.java:336)
at org.neo4j.kernel.impl.api.TransactionRepresentationCommitProcess.applyToStore(TransactionRepresentationCommitProcess.java:78)
... 25 more
There seems to be no exception starting Neo4j, so I'm assuming that certain dependencies are not being resolved in the Maven build.
I have the following in my pom.xml file:
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j</artifactId>
<version>3.0.1</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.21</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-codecs</artifactId>
<version>5.5.0</version>
</dependency>
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-slf4j</artifactId>
<version>3.0.0-M02</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.6.2</version>
</dependency>
<dependency>
<groupId>com.github.jknack</groupId>
<artifactId>handlebars</artifactId>
<version>4.0.5</version>
</dependency>
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>3.7</version>
</dependency>
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.10</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>${jackson.version}</version>
</dependency>
How do I resolve this issue?
UPDATE:
I've re-created this issue with a really simple blank project, source can be found here if you'd like to run it on your end: https://github.com/SeanNieuwoudt/neo4j-spi
I took your project and executed it in my Eclipse. I have no issues. Below are the console logs after starting:
[Thread-0] INFO org.eclipse.jetty.util.log - Logging initialized #473ms
[Thread-0] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - == Spark has ignited ...
[Thread-0] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - >> Listening on 0.0.0.0:8080
[Thread-0] INFO org.eclipse.jetty.server.Server - jetty-9.3.6.v20151106
[Thread-0] INFO org.eclipse.jetty.server.ServerConnector - Started ServerConnector#5aa07ed2{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
[Thread-0] INFO org.eclipse.jetty.server.Server - Started #772ms
While accessing http://localhost:8080/
I got the output printed as 'Hello World'.
Steps I followed:
1) Downloaded your project from your GitHub URL
2) Imported it as a Maven project pointing to Java 8
3) Maven install was successful
4) Ran the main class
5) Saw the output in the browser.
Am I missing any steps to replicate the actual issue?
Or is it something to do with your Maven setup? Maybe check your Maven dependencies after running the install to see whether the expected jars are downloaded.
Good luck.
Try to add the following dependency:
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-backward-codecs</artifactId>
<version>5.5.0</version>
</dependency>
This is my solution: https://stackoverflow.com/a/38623295/6579110. You can also check https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html#ServicesResourceTransformer for more details; a sketch of that configuration follows.
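For reference, here is a minimal sketch of a maven-shade-plugin configuration using the ServicesResourceTransformer, which merges the META-INF/services SPI registration files (the mechanism Lucene uses to find codecs such as BlockTreeOrds) instead of letting one JAR's file overwrite another's when building the fat JAR. The plugin version shown is an assumption; use whatever your build already pins:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <!-- version is an assumption; pick the one your build already uses -->
    <version>2.4.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- merges META-INF/services entries from all shaded JARs -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>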
My job writes each record to DynamoDB in the Hadoop map step.
I cannot make it run with Hadoop 2.6, which ships httpclient-4.2.5.jar and httpcore-4.2.5.jar.
The AWS SDK which I am using was built against httpclient-4.5.2.jar and httpcore-4.4.4.jar.
When I put the newer jar files on the classpath, it gives the following exception:
java.lang.Exception: java.lang.NoSuchFieldError: INSTANCE
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:144)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.getPreferredSocketFactory(ApacheConnectionManagerFactory.java:87)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:65)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:58)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:49)
To me, it looks like Hadoop was built using the old libraries and something has changed in the syntax.
What's a reasonable solution other than recompiling the AWS SDK against the older libraries?
As an update, I had to switch to Maven and play around with the versions a bit.
<!-- http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.6.0</version>
</dependency>
<!-- http://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.3.4</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-dynamodb</artifactId>
<version>1.9.13</version>
</dependency>
<dependency>
<groupId>org.netpreserve.commons</groupId>
<artifactId>webarchive-commons</artifactId>
<version>1.1.4</version>
</dependency>
Finally, it works.