Spark & Java COMPILATION ERROR

When I used Maven to compile my Spark Java program, I got a compilation error like this:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/spark/java/src/main/java/SimpleApp.java:[9,36] cannot find symbol
symbol: variable read
location: variable spark of type org.apache.spark.sql.SparkSession
Here is my Java program:
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.Dataset;

public class SimpleApp {
    public static void main(String[] args) {
        String logFile = "/home/spark/spark-2.2.0-bin-hadoop2.7/README.md"; // Should be some file on your system
        SparkSession spark = SparkSession.builder().appName("Simple Application").getOrCreate();
        Dataset<String> logData = spark.read.textFile(logFile).cache();
        // Dataset<String> logData = SparkSession.builder().appName("Simple Application").getOrCreate().read.textFile(logFile).cache();
        long numAs = logData.filter(s -> s.contains("a")).count();
        long numBs = logData.filter(s -> s.contains("b")).count();
        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        spark.stop();
    }
}
I wrote the program following the official website:
http://spark.apache.org/docs/latest/quick-start.html
Where does this error come from?
Here is my pom.xml:
<project>
    <groupId>edu.berkeley</groupId>
    <artifactId>simple-project</artifactId>
    <modelVersion>4.0.0</modelVersion>
    <name>Simple Project</name>
    <packaging>jar</packaging>
    <version>1.0</version>
    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.0</version>
        </dependency>
    </dependencies>
</project>

OK... I found the cause.
The example on the official website is wrong for Java:
spark.read.textFile(logFile).cache(); --> spark.read().textFile(logFile).cache();
read should be the method call read(): in Scala, read is a parameterless method that can be used like a field, but in Java it must be invoked with parentheses.
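For reference, here is the corrected line in context; everything else in SimpleApp stays the same:

// In Java, the DataFrameReader is obtained through the read() method,
// not a field, so the call needs parentheses.
Dataset<String> logData = spark.read().textFile(logFile).cache();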

Related

Exception in thread "main" java.lang.UnsatisfiedLinkError: no jniTest in java.library.path Eclipse

I am trying out a Maven project with JavaCPP in Eclipse. The project builds, but when I try to run it, it throws an exception and terminates. I am pretty new to Maven and managing pom.xml files, and I don't know what to do. The exception is:
Exception in thread "main" java.lang.UnsatisfiedLinkError: no jniTest in java.library.path:
at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2447)
at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:809)
at java.base/java.lang.System.loadLibrary(System.java:1893)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1683)
at org.bytedeco.javacpp.Loader.load(Loader.java:1300)
at org.bytedeco.javacpp.Loader.load(Loader.java:1123)
at com.berk.maventest.Test.<clinit>(Test.java:13)
Below is my pom.xml file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.berk</groupId>
    <artifactId>maventest</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>javacpp-platform</artifactId>
            <version>1.5.4</version>
        </dependency>
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>ffmpeg-platform</artifactId>
            <version>4.3.1-1.5.4</version>
        </dependency>
    </dependencies>
</project>
And my Test.java file:
package com.berk.maventest;

import org.bytedeco.javacpp.*;
import org.bytedeco.javacpp.annotation.*;

@Platform(include = "Test.cpp")
@Namespace("TestLibrary")
public class Test extends Pointer {
    static {
        Loader.load();
    }
    public Test() {
        allocate();
    }
    private native void allocate();
    public native int testMethod(int a);
    public static void main(String[] args) {
        // Calling the constructor
        Test test = new Test();
        // Calling the function
        System.out.println("The answer is: " + test.testMethod(2));
        // Calling the destructor
        test.close();
    }
}

Cannot resolve symbol 'TestUnit' in maven project

I'm currently trying to learn JUnit testing. I want to test my first custom 'TestUnit' class, but the IDE tells me Cannot resolve symbol 'TestUnit'. What should I do to fix this error?
TestUnit class:
package test.lessons.purejava;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class TestUnit {
    @Test
    public void testAdd() {
        String string = "Test unit works";
        assertEquals("Test unit works", string);
    }
    @Test
    public void concatTest() {
        Concatenation concatenation = new Concatenation();
        String result = concatenation.concat("Hello", " World");
        assertEquals("Hello World", result);
    }
}
and my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>test.lessons</groupId>
    <artifactId>purejava</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <dependencies>
        <!-- https://mvnrepository.com/artifact/com.microsoft.sqlserver/mssql-jdbc -->
        <!-- https://mvnrepository.com/artifact/junit/junit -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <!--<scope>test</scope>-->
        </dependency>
    </dependencies>
</project>

Spring RestTemplate & AsyncRestTemplate with Netty4 hangs forever

Very simple setup:
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>demo-rest-client</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>demo-rest-client</name>
    <description>Demo project for Spring Boot</description>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>1.4.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.5.Final</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-buffer</artifactId>
            <version>4.1.5.Final</version>
        </dependency>
        <dependency>
            <groupId>com.squareup.okhttp3</groupId>
            <artifactId>okhttp</artifactId>
            <version>3.4.1</version>
        </dependency>
        <dependency>
            <groupId>io.github.openfeign</groupId>
            <artifactId>feign-core</artifactId>
            <version>9.3.1</version>
        </dependency>
        <dependency>
            <groupId>io.github.openfeign</groupId>
            <artifactId>feign-hystrix</artifactId>
            <version>9.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
And a test case to demonstrate different usages of AsyncRestTemplate:
SampleTests.java
package com.example;

import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandProperties;
import feign.RequestLine;
import feign.hystrix.HystrixFeign;
import feign.hystrix.SetterFactory;
import org.junit.Test;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.Netty4ClientHttpRequestFactory;
import org.springframework.http.client.OkHttp3ClientHttpRequestFactory;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.web.client.AsyncRestTemplate;
import org.springframework.web.client.RestTemplate;

public class SampleTests {
    private static final String URL = "https://api.github.com/users/octocat";
    private static final int DEFAULT_SLEEP_MILLIS = 20;
    private static final int DEFAULT_TIMEOUT = 10000;

    @Test(timeout = DEFAULT_TIMEOUT)
    public void syncRestNetty() throws Exception {
        RestTemplate restTemplate = new RestTemplate(new Netty4ClientHttpRequestFactory());
        ResponseEntity<String> response = restTemplate.getForEntity(URL, String.class);
        System.out.println("response = " + response);
    }

    @Test(timeout = DEFAULT_TIMEOUT)
    public void asyncRestNetty() throws Exception {
        AsyncRestTemplate restTemplate = new AsyncRestTemplate(new Netty4ClientHttpRequestFactory());
        ListenableFuture<ResponseEntity<String>> listenableFuture = restTemplate.getForEntity(URL, String.class);
        listenableFuture.addCallback(result -> System.out.println("result = " + result), Throwable::printStackTrace);
        while (!listenableFuture.isDone()) {
            Thread.sleep(DEFAULT_SLEEP_MILLIS);
        }
        System.out.println("the end");
    }

    @Test
    public void asyncRestOkHttp() throws Exception {
        AsyncRestTemplate restTemplate = new AsyncRestTemplate(new OkHttp3ClientHttpRequestFactory());
        ListenableFuture<ResponseEntity<String>> listenableFuture = restTemplate.getForEntity(URL, String.class);
        listenableFuture.addCallback(result -> System.out.println("result = " + result), Throwable::printStackTrace);
        while (!listenableFuture.isDone()) {
            Thread.sleep(DEFAULT_SLEEP_MILLIS);
        }
        System.out.println("the end");
    }

    @Test
    public void asyncRestHystrixFeign() throws Exception {
        GitHub gitHub = HystrixFeign.builder()
                .setterFactory((target, method) -> new SetterFactory.Default().create(target, method).andCommandPropertiesDefaults(HystrixCommandProperties.defaultSetter().withExecutionTimeoutInMilliseconds(10000)))
                .target(GitHub.class, "https://api.github.com");
        HystrixCommand<String> command = gitHub.octocatAsync();
        command.toObservable().subscribe(result -> System.out.println("result = " + result), Throwable::printStackTrace);
        while (!command.isExecutionComplete()) {
            Thread.sleep(DEFAULT_SLEEP_MILLIS);
        }
        System.out.println("command.getExecutionTimeInMilliseconds() = " + command.getExecutionTimeInMilliseconds());
        System.out.println("the end");
    }

    interface GitHub {
        @RequestLine("GET /users/octocat")
        HystrixCommand<String> octocatAsync();
    }
}
When I try to run the tests that use Netty, they just hang forever (to see this, remove the JUnit timeout constraint). But if I run the exact same code with the other clients, everything works as expected.
I have tried different versions of Spring Boot and Netty without success, and from the logs everything looks fine.
What am I missing here?
EDIT:
I opened a ticket, https://jira.spring.io/browse/SPR-14744, as suggested on the Spring Gitter.
EDIT 2:
The answer from Brian Clozel helped me find the issue, which is related to Netty not realizing the server sent an empty response (a particular case with the GitHub API over plain HTTP), so I am marking it as accepted.
Can you try configuring your request factory with a Netty SslContext?
import io.netty.handler.ssl.SslContextBuilder;

Netty4ClientHttpRequestFactory nettyFactory = new Netty4ClientHttpRequestFactory();
// Give the factory a client SslContext so requests to the https URL use TLS.
nettyFactory.setSslContext(SslContextBuilder.forClient().build());
AsyncRestTemplate restTemplate = new AsyncRestTemplate(nettyFactory);
Without that context, the client is trying to send plaintext requests to the https endpoint; in that case, you're probably getting an HTTP 400 response.
In your example code, the throwable should be an instance of HttpClientErrorException, and you could get that information by logging the response status or its body with exception.getResponseBodyAsString().
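A minimal sketch of that logging, assuming the failure callback from the asyncRestNetty test above and that the Throwable really is an HttpClientErrorException as described:

import org.springframework.web.client.HttpClientErrorException;

listenableFuture.addCallback(
        result -> System.out.println("result = " + result),
        ex -> {
            // If the server rejected the plaintext request, the failure should
            // surface as an HttpClientErrorException carrying the response.
            if (ex instanceof HttpClientErrorException) {
                HttpClientErrorException httpEx = (HttpClientErrorException) ex;
                System.out.println("status = " + httpEx.getStatusCode());
                System.out.println("body = " + httpEx.getResponseBodyAsString());
            } else {
                ex.printStackTrace();
            }
        });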

java.lang.AbstractMethodError in Java Spark and Cassandra connection

I am working with Spark 1.6.0 and Cassandra 3.1.1, and I tried to connect to the Cassandra database using Java Spark. There is no error while building, but I get the following error when I run the application:
Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:51)
at com.datastax.spark.connector.cql.CassandraConnector$.log(CassandraConnector.scala:144)
at org.apache.spark.Logging$class.logDebug(Logging.scala:62)
at com.datastax.spark.connector.cql.CassandraConnector$.logDebug(CassandraConnector.scala:144)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:151)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:151)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:72)
at com.test.cassandra.spark.Main.generateData(Main.java:30)
at com.test.cassandra.spark.Main.run(Main.java:21)
at com.test.cassandra.spark.Main.main(Main.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
My code:
import com.datastax.driver.core.Session;
import com.datastax.spark.connector.cql.CassandraConnector;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import java.io.Serializable;

public class Main implements Serializable {
    private transient SparkConf sconf;
    private static final String keySpaceName = "java_api";
    private static final String primaryTableName = "test_cassandra";

    private Main(SparkConf conf) {
        this.sconf = conf;
    }

    private void run() {
        JavaSparkContext sc = new JavaSparkContext(sconf);
        generateData(sc);
        sc.stop();
    }

    private void generateData(JavaSparkContext sc) {
        CassandraConnector connector = CassandraConnector.apply(sc.getConf());
        try (Session session = connector.openSession()) {
            System.out.println("connected to cassandra");
            session.execute("DROP KEYSPACE IF EXISTS java_api");
            session.execute("CREATE KEYSPACE java_api WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}");
            session.execute("CREATE TABLE java_api.sales (id UUID PRIMARY KEY, product INT, price DECIMAL)");
            session.execute("CREATE TABLE java_api.summaries (product INT PRIMARY KEY, summary DECIMAL)");
            System.out.println("connected");
        }
    }

    public static void main(String[] args) {
        if (args.length != 2) {
            System.err.println("Syntax: com.datastax.spark.demo.Main <Spark Master URL> <Cassandra contact point>");
            System.exit(1);
        }
        SparkConf conf = new SparkConf()
                .set("spark.cassandra.connection.host", "localhost")
                .set("spark.cassandra.connection.native.port", "9042");
        conf.setAppName("Java API demo");
        conf.setMaster(args[0]);
        //conf.set("spark.cassandra.connection.host", "127.0.0.1");
        Main app = new Main(conf);
        app.run();
    }
}
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.test</groupId>
    <artifactId>cassandra-spark</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.7</maven.compiler.source>
        <maven.compiler.target>1.7</maven.compiler.target>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <!-- Spark Cassandra Connector -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.5.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.5.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.0.0-rc1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
    </dependencies>
</project>
This may come from the fact that some class has incompatibly changed since the currently executing method was last compiled; a mismatched Java version, for example, can cause this.
See the response to this question:
Spark streaming StreamingContext.start() - Error starting receiver 0
This issue seems to be caused by a conflict between the logging in Spark and the Cassandra connector. I was getting this error while using the dependency below:
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.2"
I used the Cassandra connector below to resolve the issue:
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.5"
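Those lines are sbt syntax; since this question builds with Maven, the equivalent dependency would look roughly like this (a sketch: the _2.10 suffix is an assumption chosen to match the spark-core_2.10 artifact already in the pom.xml above):

<!-- Assumed Maven translation of the sbt dependency above; the Scala
     suffix _2.10 matches the other Spark artifacts in this pom. -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.6.5</version>
</dependency>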

java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration

I am using Spark 1.6.0 and I am trying to code a very simple "word count" project. I am getting this error:
java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
This is my code:
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import java.util.Arrays;
import org.apache.spark.SparkConf;

public class WordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> lines = sc.textFile("scrittura.txt");
        JavaRDD<Integer> lineLengths = lines.map(s -> s.length());
        int totalLength = lineLengths.reduce((a, b) -> a + b);
        System.out.println("TOTAL: " + totalLength);
        JavaRDD<String> flat = lines
                .flatMap(x -> Arrays.asList(x.replaceAll("[^A-Za-z ]", "").split(" ")));
        JavaPairRDD<String, Integer> map = flat
                .mapToPair(x -> new Tuple2<String, Integer>(x, 1));
        JavaPairRDD<String, Integer> reduce = map
                .reduceByKey((x, y) -> x + y);
        System.out.println(reduce.collect());
        sc.stop();
        sc.close();
    }
}
This is my log:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:110)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:101)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:466)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at WordCount.main(WordCount.java:16)
Caused by: java.lang.ClassNotFoundException: javax.servlet.FilterRegistration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 18 more
This is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <artifactId>examples</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>examples</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.2</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.jetty.orbit</groupId>
            <artifactId>javax.servlet</artifactId>
            <version>3.0.0.v201112011016</version>
        </dependency>
    </dependencies>
</project>
How can I solve it?
Thank you!
