I am working with Spark 1.6.0 and Cassandra 3.1.1, and I tried to connect to the Cassandra database from Java Spark. There is no error while building, but I get the following error when I run the application:
Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:51)
at com.datastax.spark.connector.cql.CassandraConnector$.log(CassandraConnector.scala:144)
at org.apache.spark.Logging$class.logDebug(Logging.scala:62)
at com.datastax.spark.connector.cql.CassandraConnector$.logDebug(CassandraConnector.scala:144)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:151)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:151)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:72)
at com.test.cassandra.spark.Main.generateData(Main.java:30)
at com.test.cassandra.spark.Main.run(Main.java:21)
at com.test.cassandra.spark.Main.main(Main.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
My code:
import com.datastax.driver.core.Session;
import com.datastax.spark.connector.cql.CassandraConnector;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import java.io.Serializable;

public class Main implements Serializable {
    private transient SparkConf sconf;
    private static final String keySpaceName = "java_api";
    private static final String primaryTableName = "test_cassandra";

    private Main(SparkConf conf) {
        this.sconf = conf;
    }

    private void run() {
        JavaSparkContext sc = new JavaSparkContext(sconf);
        generateData(sc);
        sc.stop();
    }

    private void generateData(JavaSparkContext sc) {
        CassandraConnector connector = CassandraConnector.apply(sc.getConf());
        try (Session session = connector.openSession()) {
            System.out.println("connected to cassandra");
            session.execute("DROP KEYSPACE IF EXISTS java_api");
            session.execute("CREATE KEYSPACE java_api WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}");
            session.execute("CREATE TABLE java_api.sales (id UUID PRIMARY KEY, product INT, price DECIMAL)");
            session.execute("CREATE TABLE java_api.summaries (product INT PRIMARY KEY, summary DECIMAL)");
            System.out.println("connected");
        }
    }

    public static void main(String[] args) {
        if (args.length != 2) {
            System.err.println("Syntax: com.datastax.spark.demo.Main <Spark Master URL> <Cassandra contact point>");
            System.exit(1);
        }
        SparkConf conf = new SparkConf()
                .set("spark.cassandra.connection.host", "localhost")
                .set("spark.cassandra.connection.native.port", "9042");
        conf.setAppName("Java API demo");
        conf.setMaster(args[0]);
        //conf.set("spark.cassandra.connection.host", "127.0.0.1");
        Main app = new Main(conf);
        app.run();
    }
}
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.test</groupId>
    <artifactId>cassandra-spark</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.7</maven.compiler.source>
        <maven.compiler.target>1.7</maven.compiler.target>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <!-- Spark Cassandra Connector -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.5.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.5.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.0.0-rc1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
    </dependencies>
</project>
This may come from the fact that some class has changed incompatibly since the currently executing method was last compiled; a Java version mismatch, for example, can cause this.
See the response to this question:
Spark streaming StreamingContext.start() - Error starting receiver 0
This issue seems to be caused by a conflict between the logging in Spark and the Cassandra connector. I was getting this error while using the dependency below:
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.2"
I used the Cassandra connector below to resolve this issue:
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.5"
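For a Maven build like the one in the question, the equivalent change would be roughly the following (a sketch, assuming the Scala 2.10 artifacts; the general rule is to keep the connector's major.minor version in line with the Spark version, so a 1.6.x connector for Spark 1.6.x):

<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.6.5</version>
</dependency>

It is also worth letting the connector pull in its own matching cassandra-driver-core instead of pinning a release-candidate driver version explicitly.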
When I run the STS (Spring Boot) application I get the below error:
The attempt was made from the following location: org.apache.catalina.authenticator.AuthenticatorBase.startInternal(AuthenticatorBase.java:1321)
The following method did not exist:
javax.servlet.ServletContext.getVirtualServerName()Ljava/lang/String;
The method's class, javax.servlet.ServletContext, is available from the following locations:
jar:file:/home/talha/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar!/javax/servlet/ServletContext.class
jar:file:/home/talha/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.33/tomcat-embed-core-9.0.33.jar!/javax/servlet/ServletContext.class
It was loaded from the following location:
file:/home/talha/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar
The suggestion in the IDE is:
Action:
Correct the classpath of your application so that it contains a single, compatible version of javax.servlet.ServletContext
I guess there is something wrong with my pom.xml; the code is below:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.6.RELEASE</version>
        <relativePath /> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.uni</groupId>
    <artifactId>authorize</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>war</packaging>
    <name>authorize</name>
    <description>API for user registration and login with validation</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.sun.mail/javax.mail -->
        <dependency>
            <groupId>com.sun.mail</groupId>
            <artifactId>javax.mail</artifactId>
            <version>1.6.2</version>
        </dependency>
        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-tomcat</artifactId>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.springframework.boot/spring-boot-starter-data-mongodb -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-mongodb</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-mongodb</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.security</groupId>
            <artifactId>spring-security-core</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/joda-time/joda-time -->
        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
        </dependency>
        <dependency>
            <groupId>com.googlecode.jsontoken</groupId>
            <artifactId>jsontoken</artifactId>
            <version>1.0</version>
        </dependency>
        <!-- Thanks for using https://jar-download.com -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
    </dependencies>
    <build>
        <finalName>authrize</finalName>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
The main dependencies causing the error are:
<dependency>
    <groupId>joda-time</groupId>
    <artifactId>joda-time</artifactId>
</dependency>
<dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
</dependency>
<dependency>
    <groupId>com.googlecode.jsontoken</groupId>
    <artifactId>jsontoken</artifactId>
    <version>1.0</version>
</dependency>
I get the error only after adding the above dependencies; they are needed by the class below:
package com.uni.authorize.service;

import java.security.InvalidKeyException;
import java.security.SignatureException;
import java.util.List;
import org.joda.time.DateTime;
import org.joda.time.Instant;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;
import com.google.gson.JsonObject;
import com.uni.authorize.model.TokenKeys;
import com.uni.authorize.pojo.TokenObject;
import com.uni.authorize.repository.TokenKeysRepository;
import net.oauth.jsontoken.JsonToken;
import net.oauth.jsontoken.crypto.HmacSHA256Signer;

@Configuration
public class CreateToken {
    private static final Logger LOGGER = LoggerFactory.getLogger(CreateToken.class);
    private static final String ISSUER = "UnI United Tech";

    @Autowired
    TokenKeysRepository tokenKeyRepository;

    public TokenObject createToken(String userId) {
        List<TokenKeys> tokenList = tokenKeyRepository.findAll();
        TokenKeys tokenKeys = tokenList.get(0);
        long accessTokenValidity = tokenKeys.getAccessTokenValidity();
        long refreshTokenValidiry = tokenKeys.getRefreshTokenValidity();
        String accessTokenKey = tokenKeys.getAccessKey();
        String refreshTokenKey = tokenKeys.getRefreshKey();
        String accessToken = generateAccessToken(userId, accessTokenKey, accessTokenValidity);
        String refreshToken = generateRefreshToken(userId, refreshTokenKey, refreshTokenValidiry);
        TokenObject tokenObject = new TokenObject();
        tokenObject.setAccess_token(accessToken);
        tokenObject.setRefresh_token(refreshToken);
        tokenObject.setToken_type("bearer");
        tokenObject.setExpires_in(accessTokenValidity);
        tokenObject.setScope("read write trust");
        return tokenObject;
    }

    private String generateAccessToken(String userId, String accessTokenKey, long accessTokenValidity) {
        HmacSHA256Signer signer;
        try {
            signer = new HmacSHA256Signer(ISSUER, null, accessTokenKey.getBytes());
        } catch (InvalidKeyException e) {
            throw new RuntimeException(e);
        }
        // Configure JSON token
        JsonToken token = new net.oauth.jsontoken.JsonToken(signer);
        // token.setAudience(AUDIENCE);
        DateTime dateTime = new DateTime();
        long dateTimeMillis = dateTime.getMillis();
        // DateTime currentTimeDateTime = new DateTime(dateTimeMillis);
        // DateTime expiryTimeDateTime = new DateTime(dateTimeMillis + accessTokenValidity);
        Instant currentTimeInstant = new org.joda.time.Instant(dateTimeMillis);
        Instant expirationTimeInstant = new org.joda.time.Instant(dateTimeMillis + accessTokenValidity);
        LOGGER.debug("Current Time Instant " + currentTimeInstant);
        LOGGER.debug("Expiration Time Instant " + expirationTimeInstant);
        token.setIssuedAt(currentTimeInstant);
        token.setExpiration(expirationTimeInstant);
        // Configure request object, which provides information of the item
        JsonObject request = new JsonObject();
        request.addProperty("userId", userId);
        JsonObject payload = token.getPayloadAsJsonObject();
        payload.add("info", request);
        try {
            return token.serializeAndSign();
        } catch (SignatureException e) {
            throw new RuntimeException(e);
        }
    }

    private String generateRefreshToken(String userId, String refreshTokenKey, long refreshTokenValidiry) {
        HmacSHA256Signer signer;
        try {
            signer = new HmacSHA256Signer(ISSUER, null, refreshTokenKey.getBytes());
        } catch (InvalidKeyException e) {
            throw new RuntimeException(e);
        }
        // Configure JSON token
        JsonToken token = new net.oauth.jsontoken.JsonToken(signer);
        // token.setAudience(AUDIENCE);
        DateTime dateTime = new DateTime();
        long dateTimeMillis = dateTime.getMillis();
        // DateTime currentTimeDateTime = new DateTime(dateTimeMillis);
        // DateTime expiryTimeDateTime = new DateTime(dateTimeMillis + refreshTokenValidiry);
        Instant currentTimeInstant = new org.joda.time.Instant(dateTimeMillis);
        Instant expirationTimeInstant = new org.joda.time.Instant(dateTimeMillis + refreshTokenValidiry);
        LOGGER.debug("Current Time Instant " + currentTimeInstant);
        LOGGER.debug("Expiration Time Instant " + expirationTimeInstant);
        token.setIssuedAt(currentTimeInstant);
        token.setExpiration(expirationTimeInstant);
        // Configure request object, which provides information of the item
        JsonObject request = new JsonObject();
        request.addProperty("userId", userId);
        JsonObject payload = token.getPayloadAsJsonObject();
        payload.add("info", request);
        try {
            return token.serializeAndSign();
        } catch (SignatureException e) {
            throw new RuntimeException(e);
        }
    }
}
Please help me resolve this issue.
Thanks in advance.
getVirtualServerName() was added in Servlet 3.1, but you included servlet-api-2.5.jar in your application.
Options:
Change your dependencies to include servlet-api-3.1.jar (or later).
Remove the servlet-api-2.5.jar dependency, since the correct version is included in the Embedded Tomcat file (tomcat-embed-core-9.0.33.jar).
Actually, you should never ship servlet-api.jar with your application, since it will be provided by the servlet container. It seems you're missing <scope>provided</scope> in your dependency tag for the servlet-api file.
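If the 2.5 jar is arriving transitively through one of your other dependencies (mvn dependency:tree will show the path), an exclusion is one way to keep it off the classpath. A sketch, where the outer dependency is a hypothetical placeholder for whichever artifact actually drags servlet-api in:

<dependency>
    <groupId>some.group</groupId> <!-- hypothetical: replace with the artifact that pulls in servlet-api 2.5 -->
    <artifactId>some-artifact</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>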
I solved a similar issue by adding a dependencyManagement section.
Here is an example of my code where a version of org.springframework.cloud was the problem:
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-dependencies</artifactId>
            <version>2021.0.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
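With the BOM imported this way, Maven resolves every artifact the BOM manages to one consistent version, so transitive conflicts like the one above disappear without having to pin each dependency by hand.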
Very simple setup:
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>demo-rest-client</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>demo-rest-client</name>
    <description>Demo project for Spring Boot</description>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>1.4.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.5.Final</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-buffer</artifactId>
            <version>4.1.5.Final</version>
        </dependency>
        <dependency>
            <groupId>com.squareup.okhttp3</groupId>
            <artifactId>okhttp</artifactId>
            <version>3.4.1</version>
        </dependency>
        <dependency>
            <groupId>io.github.openfeign</groupId>
            <artifactId>feign-core</artifactId>
            <version>9.3.1</version>
        </dependency>
        <dependency>
            <groupId>io.github.openfeign</groupId>
            <artifactId>feign-hystrix</artifactId>
            <version>9.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
And a test case to demonstrate different usages of AsyncRestTemplate:
SampleTests.java
package com.example;

import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandProperties;
import feign.RequestLine;
import feign.hystrix.HystrixFeign;
import feign.hystrix.SetterFactory;
import org.junit.Test;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.Netty4ClientHttpRequestFactory;
import org.springframework.http.client.OkHttp3ClientHttpRequestFactory;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.web.client.AsyncRestTemplate;
import org.springframework.web.client.RestTemplate;

public class SampleTests {
    private static final String URL = "https://api.github.com/users/octocat";
    private static final int DEFAULT_SLEEP_MILLIS = 20;
    private static final int DEFAULT_TIMEOUT = 10000;

    @Test(timeout = DEFAULT_TIMEOUT)
    public void syncRestNetty() throws Exception {
        RestTemplate restTemplate = new RestTemplate(new Netty4ClientHttpRequestFactory());
        ResponseEntity<String> response = restTemplate.getForEntity(URL, String.class);
        System.out.println("response = " + response);
    }

    @Test(timeout = DEFAULT_TIMEOUT)
    public void asyncRestNetty() throws Exception {
        AsyncRestTemplate restTemplate = new AsyncRestTemplate(new Netty4ClientHttpRequestFactory());
        ListenableFuture<ResponseEntity<String>> listenableFuture = restTemplate.getForEntity(URL, String.class);
        listenableFuture.addCallback(result -> System.out.println("result = " + result), Throwable::printStackTrace);
        while (!listenableFuture.isDone()) {
            Thread.sleep(DEFAULT_SLEEP_MILLIS);
        }
        System.out.println("the end");
    }

    @Test
    public void asyncRestOkHttp() throws Exception {
        AsyncRestTemplate restTemplate = new AsyncRestTemplate(new OkHttp3ClientHttpRequestFactory());
        ListenableFuture<ResponseEntity<String>> listenableFuture = restTemplate.getForEntity(URL, String.class);
        listenableFuture.addCallback(result -> System.out.println("result = " + result), Throwable::printStackTrace);
        while (!listenableFuture.isDone()) {
            Thread.sleep(DEFAULT_SLEEP_MILLIS);
        }
        System.out.println("the end");
    }

    @Test
    public void asyncRestHystrixFeign() throws Exception {
        GitHub gitHub = HystrixFeign.builder()
                .setterFactory((target, method) -> new SetterFactory.Default().create(target, method)
                        .andCommandPropertiesDefaults(HystrixCommandProperties.defaultSetter().withExecutionTimeoutInMilliseconds(10000)))
                .target(GitHub.class, "https://api.github.com");
        HystrixCommand<String> command = gitHub.octocatAsync();
        command.toObservable().subscribe(result -> System.out.println("result = " + result), Throwable::printStackTrace);
        while (!command.isExecutionComplete()) {
            Thread.sleep(DEFAULT_SLEEP_MILLIS);
        }
        System.out.println("command.getExecutionTimeInMilliseconds() = " + command.getExecutionTimeInMilliseconds());
        System.out.println("the end");
    }

    interface GitHub {
        @RequestLine("GET /users/octocat")
        HystrixCommand<String> octocatAsync();
    }
}
When trying to run the tests that use Netty, they just hang forever (to see this, remove the JUnit timeout constraint). But if I run the exact same code with the other clients, everything works as expected.
I have tried different versions of Spring Boot and Netty but did not succeed, and from the logs everything looks OK.
What am I missing here?
EDIT:
Opened a ticket (https://jira.spring.io/browse/SPR-14744) as suggested on the Spring Gitter.
EDIT-2:
The answer from Brian Clozel helped me find the issue, which is related to Netty not realizing the server sent an empty response (a particular case with the GitHub API and plain HTTP), so I am marking it as accepted.
Can you try to configure your request factory with a Netty SslContext?
import io.netty.handler.ssl.SslContextBuilder;
import org.springframework.http.client.Netty4ClientHttpRequestFactory;
import org.springframework.web.client.AsyncRestTemplate;

Netty4ClientHttpRequestFactory nettyFactory = new Netty4ClientHttpRequestFactory();
nettyFactory.setSslContext(SslContextBuilder.forClient().build());
AsyncRestTemplate restTemplate = new AsyncRestTemplate(nettyFactory);
Without that context, the client is trying to send plaintext requests to the HTTPS endpoint; in that case, you're probably getting an HTTP 400 response.
In your example code, the throwable should be an instance of HttpClientErrorException, and you could get that information by logging the response status or its body with exception.getResponseBodyAsString().
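For illustration, a failure callback along those lines could look like the sketch below (not part of the original test class; it only assumes Spring's HttpClientErrorException, imported from org.springframework.web.client):

listenableFuture.addCallback(
        result -> System.out.println("result = " + result),
        ex -> {
            // A rejected plaintext request surfaces as an HttpClientErrorException
            // carrying the raw HTTP status and response body.
            if (ex instanceof HttpClientErrorException) {
                HttpClientErrorException httpEx = (HttpClientErrorException) ex;
                System.out.println("status = " + httpEx.getStatusCode());
                System.out.println("body = " + httpEx.getResponseBodyAsString());
            } else {
                ex.printStackTrace();
            }
        });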
I have built Spark using Scala 2.11. I ran the following steps:
./dev/change-scala-version.sh 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
After building Spark successfully, I tried to initialize Spark via the Akka model.
So my Main class looks like:
ActorSystem system = ActorSystem.create("ClusterSystem");
Inbox inbox = Inbox.create(system);
ActorRef sparkActorRef = system.actorOf(SparkActor.props(mapOfArguments), "sparkActor");
inbox.send(sparkActorRef, "start");
The spark actor looks like:
public class SparkActor extends UntypedActor {
    private static Logger logger = LoggerFactory.getLogger(SparkActor.class);
    final Map<String, Object> configurations;
    final SparkConf sparkConf;
    private int sparkBatchDuration;

    public static Props props(final Map<String, Object> configurations) {
        return Props.create(new Creator<SparkActor>() {
            private static final long serialVersionUID = 1L;

            @Override
            public SparkActor create() throws Exception {
                return new SparkActor(configurations);
            }
        });
    }

    public SparkActor(Map<String, Object> configurations) {
        this.configurations = configurations;
        this.sparkConf = initializeSparkConf(configurations);
        ActorRef mediator = DistributedPubSub.get(getContext().system()).mediator();
        mediator.tell(new DistributedPubSubMediator.Subscribe("data", getSelf()), getSelf());
    }

    private SparkConf initializeSparkConf(Map<String, Object> mapOfArgs) {
        SparkConf conf = new SparkConf();
        Configuration sparkConf = (Configuration) mapOfArgs.get(StreamingConstants.MAP_SPARK_CONFIGURATION);
        Iterator it = sparkConf.getKeys();
        while (it.hasNext()) {
            String propertyKey = (String) it.next();
            String propertyValue = sparkConf.getString(propertyKey);
            conf.set(propertyKey.trim(), propertyValue.trim());
        }
        conf.setMaster(sparkConf.getString(StreamingConstants.SET_MASTER));
        return conf;
    }

    @Override
    public void onReceive(Object arg0) throws Exception {
        if ((arg0 instanceof String) && (arg0.toString().equalsIgnoreCase("start"))) {
            logger.info("Going to start");
            sparkConf.setAppName(StreamingConstants.APP_NAME);
            logger.debug("App name set to {}. Beginning spark execution", StreamingConstants.APP_NAME);
            Configuration kafkaConfiguration = (Configuration) configurations.get(StreamingConstants.MAP_KAFKA_CONFIGURATION);
            sparkBatchDuration = Integer.parseInt((String) configurations.get(StreamingConstants.MAP_SPARK_DURATION));
            // Initializing Kafka configurations.
            String[] eplTopicsAndThreads = kafkaConfiguration.getString(StreamingConstants.EPL_QUEUE).split(",");
            Map<String, Integer> mapofeplTopicsAndThreads = new TreeMap<>();
            for (String item : eplTopicsAndThreads) {
                String topic = item.split(StreamingConstants.EPL_QUEUE_SEPARATOR)[0];
                Integer numberOfThreads = Integer.parseInt(item.split(StreamingConstants.EPL_QUEUE_SEPARATOR)[1]);
                mapofeplTopicsAndThreads.put(topic, numberOfThreads);
            }
            // Creating a receiver stream in spark
            JavaPairReceiverInputDStream<String, String> receiverStream = null;
            JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, Durations.seconds(sparkBatchDuration));
            receiverStream = KafkaUtils.createStream(ssc,
                    kafkaConfiguration.getString(StreamingConstants.ZOOKEEPER_SERVER_PROPERTY),
                    kafkaConfiguration.getString(StreamingConstants.KAFKA_GROUP_NAME),
                    mapofeplTopicsAndThreads);
            JavaDStream<String> javaRdd = receiverStream.map(new SparkTaskTupleHelper());
            javaRdd.foreachRDD(new Function<JavaRDD<String>, Void>() {
                @Override
                public Void call(JavaRDD<String> jsonData) throws Exception {
                    // Code to process some data from kafka
                    return null;
                }
            });
            ssc.start();
            ssc.awaitTermination();
        }
    }
}
I start my Spark application as:
./spark-submit --class com.sample.Main --master local[8] ../executables/spark-akka.jar
I get the following exception on startup:
Uncaught error from thread [ClusterSystem-akka.actor.default-dispatcher-3] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[ClusterSystem]
java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at akka.cluster.pubsub.protobuf.DistributedPubSubMessageSerializer.<init>(DistributedPubSubMessageSerializer.scala:42)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:161)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:200)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.serialization.Serialization.serializerOf(Serialization.scala:165)
at akka.serialization.Serialization$$anonfun$3.apply(Serialization.scala:174)
at akka.serialization.Serialization$$anonfun$3.apply(Serialization.scala:174)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:224)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:403)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:403)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
at akka.serialization.Serialization.<init>(Serialization.scala:174)
at akka.serialization.SerializationExtension$.createExtension(SerializationExtension.scala:15)
at akka.serialization.SerializationExtension$.createExtension(SerializationExtension.scala:12)
at akka.actor.ActorSystemImpl.registerExtension(ActorSystem.scala:713)
at akka.actor.ExtensionId$class.apply(Extension.scala:79)
at akka.serialization.SerializationExtension$.apply(SerializationExtension.scala:12)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:175)
at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:253)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:450)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:864)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:134)
at com.sample.SparkActor.onReceive(SparkActor.java:106)
at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
A list of options that I have already tried:
1) Rebuilt Spark with Akka version 2.4.4 and got a NoSuchMethodError for toRootLowerCase.
2) Tried to reuse the Akka 2.3.11 built into Spark and still got the same exception, at ClusterSettings.scala.
I have looked at similar errors on Stack Overflow and found that they were due to a Scala version mismatch. But having built everything with 2.11 and using Akka 2.4.4, I thought all jars would be on the same Scala version.
Am I missing any particular step?
My pom file, for your reference:
<packaging>jar</packaging>
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <slf4j.version>1.7.6</slf4j.version>
    <log4j.version>2.0-rc1</log4j.version>
    <commons.cli.version>1.2</commons.cli.version>
    <kafka.version>0.8.2.2</kafka.version>
    <akka.version>2.4.4</akka.version>
    <akka.version.old>2.4.4</akka.version.old>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-actor_2.11</artifactId>
        <version>${akka.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-cluster_2.11</artifactId>
        <version>${akka.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-kernel_2.11</artifactId>
        <version>${akka.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-cluster-tools_2.11</artifactId>
        <version>${akka.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-remote_2.11</artifactId>
        <version>2.4.4</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-slf4j_2.11</artifactId>
        <version>2.4.4</version>
    </dependency>
If I remove the cluster jars and the DistributedPubSub code and use plain remoting, i.e. akka.tcp, then no errors are shown; it works fine in that scenario. I wish to know why DistributedPubSub throws this error.
I am using Spark 1.6.0 and I am trying to code a very simple "word count" project. I am getting this error:
java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
This is my code:
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import java.util.Arrays;
import org.apache.spark.SparkConf;

public class WordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> lines = sc.textFile("scrittura.txt");
        JavaRDD<Integer> lineLengths = lines.map(s -> s.length());
        int totalLength = lineLengths.reduce((a, b) -> a + b);
        System.out.println("TOTAL: " + totalLength);
        JavaRDD<String> flat = lines
                .flatMap(x -> Arrays.asList(x.replaceAll("[^A-Za-z ]", "").split(" ")));
        JavaPairRDD<String, Integer> map = flat
                .mapToPair(x -> new Tuple2<String, Integer>(x, 1));
        JavaPairRDD<String, Integer> reduce = map
                .reduceByKey((x, y) -> x + y);
        System.out.println(reduce.collect());
        sc.stop();
        sc.close();
    }
}
This is my log:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:110)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:101)
at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:466)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at WordCount.main(WordCount.java:16)
Caused by: java.lang.ClassNotFoundException: javax.servlet.FilterRegistration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 18 more
This is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <artifactId>examples</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>examples</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.2</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.jetty.orbit</groupId>
            <artifactId>javax.servlet</artifactId>
            <version>3.0.0.v201112011016</version>
        </dependency>
    </dependencies>
</project>
How can I solve it?
Thank you!
I am trying to write an application for real-time processing with Apache Storm, Kafka, and Trident, but during initialization of TridentKafkaConfig I see this error:
Exception in thread "main" java.lang.NoClassDefFoundError: kafka/api/OffsetRequest
at storm.kafka.KafkaConfig.<init>(KafkaConfig.java:43)
at storm.kafka.trident.TridentKafkaConfig.<init>(TridentKafkaConfig.java:30)
at spout.TestSpout.<clinit>(TestSpout.java:22)
at IOTTridentTopology.initializeTridentTopology(IOTTridentTopology.java:31)
at IOTTridentTopology.main(IOTTridentTopology.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: kafka.api.OffsetRequest
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 10 more
My spout class is:
public class TestSpout extends OpaqueTridentKafkaSpout {
    private static TridentKafkaConfig config;
    private static BrokerHosts HOSTS = new ZkHosts(TridentConfig.ZKHOSTS);
    private static String TOPIC = "test";
    private static int BUFFER_SIZE = TridentConfig.BUFFER_SIZE;

    static {
        config = new TridentKafkaConfig(HOSTS, TOPIC);
        config.scheme = new SchemeAsMultiScheme(new RawScheme());
        config.bufferSizeBytes = BUFFER_SIZE;
    }

    public TestSpout(TridentKafkaConfig config) {
        super(config);
    }

    public TestSpout() {
        super(config);
    }
}
Main class:
public static void main(String[] args) {
    initializeTridentTopology();
}

private static void initializeTridentTopology() {
    TridentTopology topology = new TridentTopology();
    TestSpout spout = new TestSpout();
    //////////////// test //////////////////////
    topology.newStream("testspout", spout).each(spout.getOutputFields(), new TestFunction(), new Fields());
    /////////////// end test ///////////////////
    LocalCluster cluster = new LocalCluster();
    Config config = new Config();
    config.setDebug(false);
    config.setMaxTaskParallelism(1);
    config.registerSerialization(storm.kafka.trident.GlobalPartitionInformation.class);
    config.registerSerialization(java.util.TreeMap.class);
    config.setNumWorkers(5);
    config.setFallBackOnJavaSerialization(true);
    cluster.submitTopology("KafkaTrident", config, topology.build());
}
And my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>IOT</groupId>
    <artifactId>ver0.1</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-core</artifactId>
            <version>0.9.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-kafka</artifactId>
            <version>0.9.3</version>
        </dependency>
    </dependencies>
</project>
I have tried different versions of storm-kafka (0.9.3, 0.9.4, 0.9.5, 0.9.6, and 0.10.0) and storm-core (0.9.3, 0.9.4, and 0.9.6), but I still see the same error.
By googling I found this link, but ...
ClassNotFoundException: kafka.api.OffsetRequest
After some googling I found this link:
https://github.com/wurstmeister/storm-kafka-0.8-plus-test
There I found my answer in the pom.xml file. By adding the code below and using a compatible version of Kafka, all problems were resolved:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.9.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
If you deploy a Storm topology with LocalCluster, you need to add the Kafka library to your dependencies (for Storm 0.10.0):
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.9.2</artifactId>
    <version>0.8.1.1</version>
</dependency>
The kafka.api.OffsetRequest class is missing because org.apache.kafka is a provided dependency of storm-kafka: http://mvnrepository.com/artifact/org.apache.storm/storm-kafka/0.10.0. Please see the Provided Dependencies section there for details.
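To confirm what actually ends up on the classpath, running mvn dependency:tree -Dincludes=org.apache.kafka against the project shows whether, and through which path, the Kafka artifact is resolved.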