How to use Spark MLlib in a web project - Java

I am trying to use the Spark MLlib jar in a web project. I downloaded spark-1.1.0-bin-hadoop2.4 and unzipped it. It contains the following jars:
datanucleus-api-jdo-3.2.1.jar
datanucleus-core-3.2.2.jar
datanucleus-rdbms-3.2.1.jar
spark-assembly-1.1.0-hadoop2.4.0.jar
spark-examples-1.1.0-hadoop2.4.0.jar
I then used spark-assembly-1.1.0-hadoop2.4.0.jar to import the classification methods, and it runs successfully in a plain Java project. However, when I add the jar to SomeWebProject/web-inf/lib, I get this error:
validateJarFile ...\web-inf\lib\spark-assembly-1.1.0-hadoop2.4.0.jar jar not loaded. Offending class: javax/servlet/Servlet.class
I know this is because the javax.servlet classes in my web project are duplicated by the ones bundled in the Spark assembly. I tried deleting javax.servlet from the Spark jar, but it still does not work.
Could you please tell me how to resolve this?
Also, can I use a different Spark jar to run MLlib in local mode? The assembly jar is very large (about 132 MB), and I suspect much of it is unnecessary, but I cannot find any other suitable jars. Is this assembly the only way to get spark-mllib?
P.S. For various reasons I cannot deploy Spark on my servers, so I cannot use a Hadoop environment.
Thanks very much!

If you can use Maven, then just add these dependencies to your pom.xml:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>1.3.0</version>
</dependency>
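For the local-mode part of the question: with those two artifacts on the classpath, MLlib can run entirely inside the JVM by using a local master, so no Spark or Hadoop deployment is needed. A minimal sketch (the class name, the toy data and the choice of LogisticRegressionWithLBFGS are only illustrative):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.classification.LogisticRegressionModel;
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.mllib.regression.LabeledPoint;

public class LocalMLlibExample {
    public static void main(String[] args) {
        // "local[*]" runs Spark inside this JVM; no cluster or Hadoop install required
        SparkConf conf = new SparkConf().setAppName("mllib-local").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Tiny in-memory training set: label plus feature vector
        JavaRDD<LabeledPoint> training = sc.parallelize(Arrays.asList(
                new LabeledPoint(1.0, Vectors.dense(1.0, 0.5)),
                new LabeledPoint(0.0, Vectors.dense(-1.0, -0.5))));

        // Train a binary classifier and score a new point
        LogisticRegressionModel model = new LogisticRegressionWithLBFGS().run(training.rdd());
        System.out.println(model.predict(Vectors.dense(0.8, 0.4)));

        sc.stop();
    }
}

Note that spark-core may still pull in a servlet API transitively, so it is worth checking for the same javax.servlet clash when packaging the WAR.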

Related

How to download files from a GitLab account using Java

I am writing a utility job in Java to download JSON files from a particular URL of a GitLab account and then process them as required. I tried to do this with the java-gitlab-api dependency. However, even after including the Maven dependency below, I get these errors:
Missing artifact org.gitlab:java-gitlab-api:jar:1.1.8-
The import org.gitlab cannot be resolved.
The Maven dependency I am using is:
<dependency>
    <groupId>org.gitlab</groupId>
    <artifactId>java-gitlab-api</artifactId>
    <version>1.1.8-SNAPSHOT</version>
</dependency>
I tried updating and cleaning the Maven project, but nothing worked. Does anyone have an idea of how I can fix this and download files from a GitLab account?
Use an appropriate java-gitlab-api version; the released 1.1.8 artifact is available on Maven Central, and a more recent version is available too:
https://mvnrepository.com/artifact/org.gitlab/java-gitlab-api/1.1.8
https://mvnrepository.com/artifact/org.gitlab/java-gitlab-api
<!-- https://mvnrepository.com/artifact/org.gitlab/java-gitlab-api -->
<dependency>
    <groupId>org.gitlab</groupId>
    <artifactId>java-gitlab-api</artifactId>
    <version>1.1.8</version>
</dependency>
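Once a resolvable version is in place, the download itself can stay small. A rough sketch with java-gitlab-api (the host URL, token, project coordinates and file path below are placeholders; double-check the method names against the Javadoc of the version you settle on):

import java.nio.file.Files;
import java.nio.file.Paths;

import org.gitlab.api.GitlabAPI;
import org.gitlab.api.models.GitlabProject;

public class GitlabJsonDownload {
    public static void main(String[] args) throws Exception {
        // Connect with a personal access token (placeholder values)
        GitlabAPI api = GitlabAPI.connect("https://gitlab.example.com", "YOUR_PRIVATE_TOKEN");

        // Look up the project by namespace and name
        GitlabProject project = api.getProject("my-group", "my-project");

        // Fetch the raw bytes of a JSON file from a branch and save it locally
        byte[] content = api.getRawFileContent(project, "master", "config/data.json");
        Files.write(Paths.get("data.json"), content);
    }
}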

Using BoringSSL With Vertx (ALPN)

I am trying to use BoringSSL with Vert.x so that I can use ALPN. I am running Vert.x 3.7.1 and have included the following in my pom.xml:
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-tcnative-boringssl-static</artifactId>
    <version>2.0.23.Final</version>
</dependency>
When I run my application I get the message "ALPN not available for JDK SSL/TLS engine". If I add <scope>runtime</scope> or <scope>compile</scope> to the dependency, I no longer get this message when I run the application through my IDE.
However, when I build a jar with either of those scopes (or with no scope defined), the resulting jar does not contain the BoringSSL piece. More specifically, there is an io.netty package, but io.netty.internal.tcnative is missing.
How can I include this package in the final jar I'm building?
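One way to confirm whether the tcnative/BoringSSL classes actually made it into the packaged jar is Netty's own OpenSsl probe. A small diagnostic sketch (the class name is made up; it assumes netty-handler is on the classpath, which Vert.x already brings in):

import io.netty.handler.ssl.OpenSsl;

public class AlpnCheck {
    public static void main(String[] args) {
        // If the tcnative/BoringSSL classes were left out of the build,
        // isAvailable() returns false and unavailabilityCause() explains why.
        System.out.println("OpenSSL available: " + OpenSsl.isAvailable());
        System.out.println("ALPN supported:    " + OpenSsl.isAlpnSupported());
        if (!OpenSsl.isAvailable()) {
            System.out.println("Cause: " + OpenSsl.unavailabilityCause());
        }
    }
}

Running it once against the IDE classpath and once against the built jar should show whether the problem is the dependency itself or how the jar is assembled.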

import org.apache cannot be resolved error in Eclipse

I'm writing my MapReduce program in Eclipse, using Hadoop version 2.6.0.
I downloaded hadoop-2.6.0.tar.gz, unzipped it, and placed the hadoop-2.6.0.jar file in /opt/eclipse/plugins with 777 permissions.
I then added the location of the jar file to the Java build path in Eclipse as an external library, but I am still getting the "import org.apache cannot be resolved" error.
Can anyone tell me why this is happening or what I'm missing?
Thanks
The Eclipse plugins folder is not "the Hadoop library": copying the jar into /opt/eclipse/plugins does nothing for your project's build path.
I suggest you start over with a Maven project.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0</version>
</dependency>
Keep in mind - version 2.6.0 was released in 2014, so you might want to upgrade your cluster to at least a 2.7.x version
Also, if running that on the cluster as a JAR file, you'd want to add <scope>provided</scope>
If you insist on using the JAR files, then you need the hadoop-client-2.6.0.jar
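For reference, a bare-bones mapper that should compile as soon as hadoop-client is on the build path (the class name and logic are only illustrative):

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Emit (token, 1) for every whitespace-separated token in the input line
        for (String token : value.toString().split("\\s+")) {
            word.set(token);
            context.write(word, ONE);
        }
    }
}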

Jetty wants Jersey, but I'm not using it

I'm building a relatively simple Jetty app, following the instructions here:
http://www.eclipse.org/jetty/documentation/9.4.x/maven-and-jetty.html
I'm not using Jersey, but mvn jetty:run complains about:
Provider com.sun.jersey.server.impl.container.servlet.JerseyServletContainerInitializer not found
My pom.xml does not include any reference to Jersey. In fact, it is quite simple:
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-server</artifactId>
    <version>${jettyVersion}</version>
</dependency>
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>6.0.1</version>
</dependency>
What is making jetty look for Jersey?
Search all of your dependencies for META-INF/services/javax.servlet.ServletContainerInitializer files.
The one that has the entry for com.sun.jersey.server.impl.container.servlet.JerseyServletContainerInitializer is the one causing you problems.
Look at your project dependencies (aka <project><dependencies>) and your project's configuration of jetty-maven-plugin to see if that <plugin> has any extra dependencies added to it (aka <plugin><dependencies>).
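If it helps, that search can also be done from Java by asking the classloader for every copy of the service file; a small sketch (the class name is made up; run it with the same classpath as the webapp):

import java.net.URL;
import java.util.Enumeration;

public class FindInitializers {
    public static void main(String[] args) throws Exception {
        // Each URL printed points at the jar that registers a ServletContainerInitializer;
        // the one mentioning Jersey is the jar causing the error.
        Enumeration<URL> resources = Thread.currentThread().getContextClassLoader()
                .getResources("META-INF/services/javax.servlet.ServletContainerInitializer");
        while (resources.hasMoreElements()) {
            System.out.println(resources.nextElement());
        }
    }
}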
Well, after much machination and gnashing of teeth, I think I stumbled upon the answer. While learning about Maven, I had been playing with shaded uber-jars. I had built one of my packages as an uber-jar and installed it locally. When Maven put everything together, that jar pulled in a bit too much and broke my dependencies. Removing the shaded jar from the local installation and using it only for distribution worked just fine.

JFreeChart NoClassDefFoundError on Windows Tomcat

java.lang.NoClassDefFoundError: Could not initialize class org.jfree.chart.JFreeChart
On Windows, a GUI system is always available, so why does this error still occur on Windows, and how can I solve it?
I am using Tomcat 7 with JFreeChart 1.0.14.
Please make sure you have the JFreeChart library file on your classpath.
If you are using Maven, adding this dependency should work:
<dependency>
    <groupId>org.jfree</groupId>
    <artifactId>jfreechart</artifactId>
    <version>1.0.14</version>
</dependency>
It's not a Windows issue. Make sure you are including the JFreeChart jar on your classpath; if you are using Maven, use:
<dependency>
    <groupId>jfree</groupId>
    <artifactId>jfreechart</artifactId>
    <version>1.0.13</version> <!-- Use any valid version -->
</dependency>
If you are not using Maven, download the jar and add it to your lib folder.
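A quick way to tell a missing jar apart from a genuine rendering problem is a standalone smoke test outside Tomcat; a minimal sketch (the chart contents and output file are arbitrary):

import java.io.File;

import org.jfree.chart.ChartFactory;
import org.jfree.chart.ChartUtilities;
import org.jfree.chart.JFreeChart;
import org.jfree.data.general.DefaultPieDataset;

public class ChartSmokeTest {
    public static void main(String[] args) throws Exception {
        // Build a trivial dataset and chart, then render it to a PNG.
        // If this works outside Tomcat, the JFreeChart jars are fine and the
        // webapp's problem is its classpath or an earlier initialization failure.
        DefaultPieDataset dataset = new DefaultPieDataset();
        dataset.setValue("A", 60);
        dataset.setValue("B", 40);
        JFreeChart chart = ChartFactory.createPieChart("Test", dataset, true, true, false);
        ChartUtilities.saveChartAsPNG(new File("test.png"), chart, 400, 300);
    }
}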
