Error at jssc.start() - Java

Here are my Spark code and pom.xml. The problem with this program is that the JavaStreamingContext streams only the first batch of records; it doesn't stream any further, and I am getting an error at jssc.start().
Can anyone give me a clue why this is happening? Is there anything wrong with my Spark dependencies?
17/04/11 10:32:20 ERROR StreamingContext: Error starting the context, marking it as stopped
java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.streaming.DStreamGraph.validate(DStreamGraph.scala:163)
at org.apache.spark.streaming.StreamingContext.validate(StreamingContext.scala:513)
at org.apache.spark.streaming.StreamingContext.liftedTree1$1(StreamingContext.scala:573)
at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:572)
at org.apache.spark.streaming.api.java.JavaStreamingContext.start(JavaStreamingContext.scala:554)
at com.comcast.emm.vodip.WholeTextLocal.WholeTextLocal.main(WholeTextLocal.java:72)
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.streaming.DStreamGraph.validate(DStreamGraph.scala:163)
at org.apache.spark.streaming.StreamingContext.validate(StreamingContext.scala:513)
at org.apache.spark.streaming.StreamingContext.liftedTree1$1(StreamingContext.scala:573)
at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:572)
at org.apache.spark.streaming.api.java.JavaStreamingContext.start(JavaStreamingContext.scala:554)
at com.comcast.emm.vodip.WholeTextLocal.WholeTextLocal.main(WholeTextLocal.java:72)
Code: pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.abcd.emm.mpp</groupId>
<artifactId>WholeTextLocal</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>WholeTextLocal</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<build>
<plugins>
<!-- Maven Shade Plugin -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<excludeScope>system</excludeScope>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<excludeGroupIds>junit,org.mockito,org.hamcrest</excludeGroupIds>
<transformers>
<!-- add Main-Class to manifest file -->
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.abcd.emm.mpp.WholeTextLocal.WholeTextLocal</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
<configuration>
<finalName>${project.artifactId}</finalName>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-compress -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
<version>1.5</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.6.4</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- <dependency> <groupId>com.fasterxml.jackson.core</groupId> <artifactId>jackson-databind</artifactId>
<version>2.4.4</version> </dependency> -->
<dependency>
<groupId>jdk.tools</groupId>
<artifactId>jdk.tools</artifactId>
<version>1.8.0_101</version>
<scope>system</scope>
<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.6.4</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- required at run time by hive -->
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>3.2.2</version>
</dependency>
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-api-jdo</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-rdbms</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.sparkjava/spark-core -->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.0.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.11 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.0.2</version>
</dependency>
<!-- <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-streaming_2.10</artifactId>
<version>2.0.2</version> <scope>provided</scope> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11 -->
<!-- <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-sql_2.11</artifactId>
<version>2.0.2</version> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.10 -->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.11 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.0.2</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-metastore</artifactId>
<version>0.14.0</version>
</dependency>
<dependency>
<groupId>org.mariadb.jdbc</groupId>
<artifactId>mariadb-java-client</artifactId>
<version>1.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive-thriftserver_2.11 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive-thriftserver_2.11</artifactId>
<version>2.0.2</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.6.0</version>
<!-- <scope>provided</scope> -->
</dependency>
<!-- http://mvnrepository.com/artifact/org.apache.avro/avro-tools -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.6.0</version>
<!--<scope>provided</scope> -->
</dependency>
<!-- <dependency> <groupId>log4j</groupId> <artifactId>log4j</artifactId>
<version>1.2.17</version> </dependency> -->
<!-- <dependency> <groupId>org.scalatest</groupId> <artifactId>scalatest_2.10</artifactId>
<version>2.2.4</version> <scope>test</scope> </dependency> -->
</dependencies>
</project>
Java code:
package com.comcast.emm.vodip.WholeTextLocal;
import java.util.Arrays;
import java.util.Iterator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import scala.Tuple2;
public class WholeTextLocal
{
public static void main(String[] args) throws InterruptedException
{
SparkConf sparkConf = new SparkConf().setAppName("My app").setMaster("local")
.set("spark.driver.allowMultipleContexts", "true");
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.seconds(15));
JavaPairRDD<String, String> WholeTextLocalFiles = jssc.sparkContext()
.wholeTextFiles("C:/Users/aaa/files/abcd7/simple/*.txt");
// Pick the content.
JavaRDD<String> myRDD = WholeTextLocalFiles.map(new Function<Tuple2<String, String>, String>()
{
private static final long serialVersionUID = -551872585218963131L;
public String call(Tuple2<String, String> v1) throws Exception
{
System.out.println("v1._2=" + v1._2);
return v1._2;
}
});
// Split the content in each file with /n character.
JavaRDD<String> myRDD2 = myRDD.flatMap(new FlatMapFunction<String, String>()
{
public Iterator<String> call(String t) throws Exception
{
return Arrays.asList(t.split("\\r?\\n")).iterator();
}
});
// Loop through and print.
myRDD2.foreachPartition(new VoidFunction<Iterator<String>>()
{
private static final long serialVersionUID = -4895942417886562330L;
public void call(Iterator<String> t) throws Exception
{
while (t.hasNext())
{
System.out.println(t.next());
}
}
});
jssc.start();
jssc.awaitTermination();
}
}
UPDATED CODE: this streams, but System.out is not printing anything, even though the file is being recognized by Spark.
public class WholeTextLocal
{
public static void main(String[] args) throws InterruptedException
{
SparkConf sparkConf = new SparkConf().setAppName("My app").setMaster("local")
.set("spark.driver.allowMultipleContexts", "true");
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.seconds(10));
JavaPairInputDStream<Text, Text> dStream = jssc.fileStream("C:/Users/aaa/files/abcd7/simple/", Text.class, Text.class, WholeTextFileInputFormat.class);
// Pick the content.
JavaDStream<String> myRDD = dStream.map(new Function<Tuple2<Text, Text>, String>()
{
private static final long serialVersionUID = -551872585218963131L;
@Override
public String call(Tuple2<Text, Text> v1) throws Exception
{
System.out.println("v1._2=" + v1._1);
return v1._2.toString();
}
});
// Split the content in each file with /n character.
JavaDStream<String> myRDD2 = myRDD.flatMap(new FlatMapFunction<String, String>()
{
@Override
public Iterator<String> call(String t) throws Exception
{
return Arrays.asList(t.split("\\r?\\n")).iterator();
}
});
// Loop through and print.
myRDD2.foreachRDD(itr -> new VoidFunction<Iterator<String>>()
{
private static final long serialVersionUID = -4895942417886562330L;
@Override
public void call(Iterator<String> t) throws Exception
{
while (t.hasNext())
{
System.out.println(t.next());
}
}
});
myRDD2.count();
jssc.start();
jssc.awaitTermination();
}
}

I think the error message tells you everything:
No output operations registered, so nothing to execute
You must add some output action at the end, i.e. foreachRDD instead of foreachPartition:
myRDD2.foreachRDD(new VoidFunction<JavaRDD<String>>() {
    private static final long serialVersionUID = -4895942417886562330L;
    public void call(JavaRDD<String> rdd) throws Exception {
        rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
            private static final long serialVersionUID = -1L;
            public void call(Iterator<String> t) throws Exception {
                while (t.hasNext()) {
                    System.out.println(t.next());
                }
            }
        });
    }
});
Note: you can use foreachPartition inside foreachRDD
Second:
you are reading files using JavaSparkContext.wholeTextFiles. It's not a streaming function. You should read using JavaStreamingContext.textFileStream and operate on DStreams.
You can also use fileStream with an input format like WholeTextFileInputFormat:
JavaPairInputDStream<Text, Text> dStream = jssc.fileStream("dir", Text.class, Text.class, WholeTextFileInputFormat.class);
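For reference, here is a minimal runnable sketch of a file-based streaming job with a registered output operation (the class name is mine; the path and batch interval simply mirror the question). Note that file streams only process files created in the directory after the context starts, which can make it look like nothing prints for pre-existing files:
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
public class TextFileStreamExample
{
    public static void main(String[] args) throws InterruptedException
    {
        SparkConf sparkConf = new SparkConf().setAppName("My app").setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.seconds(15));
        // Watch the directory; each record of the DStream is one line of text.
        JavaDStream<String> lines = jssc.textFileStream("C:/Users/aaa/files/abcd7/simple/");
        // Split lines into words.
        JavaDStream<String> words = lines.flatMap(s -> Arrays.asList(s.split("\\s+")).iterator());
        // print() registers an output operation, so DStreamGraph validation passes.
        words.print();
        jssc.start();
        jssc.awaitTermination();
    }
}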

Related

Some cucumber methods do not work correctly (Cucumber Feature Wrapper)

I have a question about the Cucumber library. I was taking a course on Selenium with Cucumber and TestNG, but I have run into some problems because some methods no longer exist.
For example:
I can't use "CucumberFeatureWrapper"; what is the replacement for it? Image: https://i.stack.imgur.com/3JMcD.png
The same thing happens when I need to use .provideFeatures() in testNgCucumberRunner.provideFeatures. Image: https://i.stack.imgur.com/L76xQ.png
POM:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>groupId</groupId>
<artifactId>CRMFramework</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<source>10</source>
<target>10</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M5</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
<suiteXmlFiles>
<suiteXmlFile>src/test/java/TestNg.xml</suiteXmlFile>
</suiteXmlFiles>
</configuration>
</plugin>
<plugin>
<groupId>net.masterthought</groupId>
<artifactId>maven-cucumber-reporting</artifactId>
<version>5.4.0</version>
<executions>
<execution>
<id>execution</id>
<phase>verify</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<projectName>ExecuteAutomation</projectName>
<!-- output directory for the generated report -->
<outputDirectory>${project.build.directory}/cucumber-reports</outputDirectory>
<inputDirectory>${project.build.directory}/cucumber-json-report.json</inputDirectory>
<jsonFiles>
<!-- supports wildcard or name pattern -->
<param>**/*.json</param>
</jsonFiles>
<mergeFeaturesWithRetest>true</mergeFeaturesWithRetest>
<mergeFeaturesById>true</mergeFeaturesById>
<checkBuildResult>false</checkBuildResult>
<skipEmptyJSONFiles>true</skipEmptyJSONFiles>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<properties>
<maven.compiler.source>15</maven.compiler.source>
<maven.compiler.target>15</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.141.59</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/net.sourceforge.jexcelapi/jxl -->
<dependency>
<groupId>net.sourceforge.jexcelapi</groupId>
<artifactId>jxl</artifactId>
<version>2.6.12</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>6.9.1</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-core</artifactId>
<version>6.9.1</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>6.9.1</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-testng</artifactId>
<version>6.9.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.testng/testng -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>7.3.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/datatable-dependencies -->
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>datatable-dependencies</artifactId>
<version>1.0.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.aventstack/extentreports -->
<dependency>
<groupId>com.aventstack</groupId>
<artifactId>extentreports</artifactId>
<version>5.0.6</version>
</dependency>
<dependency>
<groupId>org.jetbrains</groupId>
<artifactId>annotations</artifactId>
<version>RELEASE</version>
<scope>compile</scope>
</dependency>
</dependencies>
TestRunner
import com.aventstack.extentreports.gherkin.model.Feature;
import com.crm.framework.utilities.ExtentReport;
import io.cucumber.testng.*;
import org.testng.annotations.Test;
import org.testng.ITestContext;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.DataProvider;
import java.util.List;
//json:target/cucumber-reports/cucumberTestReport.json
@CucumberOptions(
features = {"src/test/java/features/"},
glue = {"steps"},
plugin = {"json:target/cucumber-json-report.json",
"pretty", "html:target/cucumber-report-html"})
public class TestRunner {
private TestNGCucumberRunner testNGCucumberRunner;
@BeforeClass(alwaysRun = true)
public void setUpClass() {
testNGCucumberRunner = new TestNGCucumberRunner(this.getClass());
}
@Test(dataProvider = "features")
public void LoginTest(CucumberFeatureWrapper cucumberFeatureWrapper) throws ClassNotFoundException {
//Insert the Feature Name
//ExtentReport.startFeature("login").assignAuthor("DiegoHM").assignDevice("Chrome").assignCategory("Regression");
}
@DataProvider
public Object[] features(ITestContext context) {
// return testNGCucumberRunner.
return testNGCucumberRunner.provideFeatures();
}
@AfterClass(alwaysRun = true)
public void afterClass() {
testNGCucumberRunner.finish();
}
}
You're using classes that don't exist anymore. You can find out which classes a library contains by ctrl/cmd-clicking a package name in most modern IDEs.
So try using:
package io.cucumber.examples.testng;
import io.cucumber.testng.CucumberOptions;
import io.cucumber.testng.FeatureWrapper;
import io.cucumber.testng.PickleWrapper;
import io.cucumber.testng.TestNGCucumberRunner;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
@CucumberOptions(....)
public class RunCucumberByCompositionTest {
private TestNGCucumberRunner testNGCucumberRunner;
@BeforeClass(alwaysRun = true)
public void setUpClass() {
testNGCucumberRunner = new TestNGCucumberRunner(this.getClass());
}
@Test(groups = "cucumber", description = "Runs Cucumber Scenarios", dataProvider = "scenarios")
public void scenario(PickleWrapper pickle, FeatureWrapper cucumberFeature) {
testNGCucumberRunner.runScenario(pickle.getPickle());
}
@DataProvider
public Object[][] scenarios() {
return testNGCucumberRunner.provideScenarios();
}
@AfterClass(alwaysRun = true)
public void tearDownClass() {
testNGCucumberRunner.finish();
}
}
From:
https://github.com/cucumber/cucumber-jvm/blob/main/examples/java-calculator-testng/src/test/java/io/cucumber/examples/testng

Spring Boot Can't See My Mapstruct Mapper

I defined the mapper, but Spring Boot can't detect it. I couldn't find the problem for days. Please help. I tried on IDEA and NetBeans, and tried adding some annotations to the main class from this thread.
Description:
Parameter 1 of constructor in
com.example.springmysqlelastic.service.impl.UserService required a
bean of type 'com.example.springmysqlelastic.mapper.UserMapper' that
could not be found.
Action:
Consider defining a bean of type
'com.example.springmysqlelastic.mapper.UserMapper' in your
configuration.
UserMapper.java
package com.example.springmysqlelastic.mapper;
import com.example.springmysqlelastic.model.Food;
import com.example.springmysqlelastic.model.FoodModel;
import com.example.springmysqlelastic.model.User;
import com.example.springmysqlelastic.model.UserModel;
import com.example.springmysqlelastic.model.dto.FoodDTO;
import com.example.springmysqlelastic.model.dto.UserDTO;
import org.mapstruct.Mapper;
import org.springframework.stereotype.Component;
import java.util.List;
//@Component
@Mapper(componentModel = "spring")
public interface UserMapper {
UserDTO toUserDTO(User user);
List<UserDTO> toUserDtos(List<User> users);
User toUser(UserDTO userDTO);
List<User> toUsers(List<UserDTO> userDTOS);
UserModel toUserModel(User user);
/*
FoodDTO toFoodDTO(Food food);
List<FoodDTO> toFoodDtos(List<Food> foods);
Food toFood(FoodDTO foodDTO);
List<Food> toFoods(List<FoodDTO> foodDTOS);
FoodModel toFoodModel(Food food);*/
}
UserService
package com.example.springmysqlelastic.service.impl;
import com.example.springmysqlelastic.mapper.UserMapper;
import com.example.springmysqlelastic.model.User;
import com.example.springmysqlelastic.model.dto.UserDTO;
import com.example.springmysqlelastic.repo.IUserDAO;
import com.example.springmysqlelastic.service.IUserService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;
import java.util.List;
@Service
public class UserService implements IUserService {
private IUserDAO userDAO;
private UserMapper userMapper;
@Autowired
public UserService(IUserDAO userDAO, UserMapper userMapper) {
this.userDAO = userDAO;
this.userMapper = userMapper;
}
@Override
public UserDTO save(UserDTO userDTO) {
User user = this.userDAO.save(this.userMapper.toUser(userDTO));
return this.userMapper.toUserDTO(user);
}
@Override
public UserDTO findById(Long id) {
return this.userMapper.toUserDTO(this.userDAO.findById(id).orElse(null));
}
@Override
public List<UserDTO> findAll() {
return this.userMapper.toUserDtos(this.userDAO.findAll());
}
}
UserController.java
package com.example.springmysqlelastic.rest;
import com.example.springmysqlelastic.model.dto.UserDTO;
import com.example.springmysqlelastic.service.IUserService;
import com.example.springmysqlelastic.utils.PathResources;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;
@RestController
@RequestMapping("/user") //PathResources.USER
public class UserController {
private final IUserService userService;
@Autowired
public UserController(IUserService userService) {
this.userService = userService;
}
@PostMapping("/save") //PathResources.SAVE
public ResponseEntity<UserDTO> saveUser(@RequestBody UserDTO userDTO) {
return new ResponseEntity<>(this.userService.save(userDTO), HttpStatus.OK);
}
@GetMapping("/find-one/{id}") //PathResources.FIND_ONE + "/{" + PathResources.ID + "}"
public ResponseEntity<UserDTO> findById(@PathVariable Long id) {
return new ResponseEntity<>(this.userService.findById(id), HttpStatus.OK);
}
@GetMapping("/find-all") //PathResources.FIND_ALL
public ResponseEntity<List<UserDTO>> findById() {
return new ResponseEntity<>(this.userService.findAll(), HttpStatus.OK);
}
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.1.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.example</groupId>
<artifactId>spring-mysql-elastic</artifactId>
<packaging>jar</packaging>
<version>0.0.1-SNAPSHOT</version>
<name>spring-mysql-elastic</name>
<description>Demo project for Mysql and ElasticSearch Synchronization in Spring</description>
<properties>
<java.version>1.8</java.version>
<mapstruct.version>1.4.0.Beta3</mapstruct.version>
<org.json.version>20190722</org.json.version>
<swagger.version>2.9.2</swagger.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct-jdk8</artifactId>
<version>${mapstruct.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.json/json -->
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>${org.json.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.alibaba/fastjson
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.73</version>
</dependency>-->
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>${swagger.version}</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>${swagger.version}</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<!--<version>1.18.12</version>-->
<type>jar</type>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.code.gson/gson -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.6</version>
</dependency>
<dependency>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<version>2.0.2</version>
<type>jar</type>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
<configuration>
<mainClass>com.example.springmysqlelastic.SpringMysqlElasticApplication</mainClass>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
<annotationProcessorPaths>
<path>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct-processor</artifactId>
<version>${mapstruct.version}</version>
</path>
</annotationProcessorPaths>
</configuration>
</plugin>
</plugins>
</build>
</project>
Main class
package com.example.springmysqlelastic;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.scheduling.annotation.EnableScheduling;
@SpringBootApplication
//@SpringBootApplication(scanBasePackages={"com.example.springmysqlelastic"})
@EnableElasticsearchRepositories("com.example.springmysqlelastic.repo.elastic")
@EnableScheduling
//@ComponentScan(scanBasePackages = {"com.example.springmysqlelastic"})
//@EntityScan("com.example.springmysqlelastic.model")
//@EnableJpaRepositories("com.example.springmysqlelastic")
//@ComponentScan(basePackages = {"com.example.springmysqlelastic"})
//@EnableAutoConfiguration
public class SpringMysqlElasticApplication {
public static void main(String[] args) {
SpringApplication.run(SpringMysqlElasticApplication.class, args);
}
}
The documentation says that, in addition to the main dependency, you need to add the annotation processor.
Example for Gradle:
dependencies {
implementation 'org.mapstruct:mapstruct:1.4.2.Final'
annotationProcessor 'org.mapstruct:mapstruct-processor:1.4.2.Final'
}
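For Maven, the equivalent is an annotationProcessorPaths entry in the compiler plugin; your pom already declares this, so the sketch below mostly serves to point out that the processor version must match your mapstruct dependency version:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <annotationProcessorPaths>
            <path>
                <groupId>org.mapstruct</groupId>
                <artifactId>mapstruct-processor</artifactId>
                <version>${mapstruct.version}</version>
            </path>
        </annotationProcessorPaths>
    </configuration>
</plugin>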
I think it's an issue with Swagger conflicting with the MapStruct library.
I modified your POM as below and it started working for me. There is some documentation here that talks about it: https://www.programmersought.com/article/6208308983/
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.1.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.example</groupId>
<artifactId>spring-mysql-elastic</artifactId>
<packaging>jar</packaging>
<version>0.0.1-SNAPSHOT</version>
<name>spring-mysql-elastic</name>
<description>Demo project for Mysql and ElasticSearch Synchronization in Spring</description>
<properties>
<java.version>1.8</java.version>
<mapstruct.version>1.4.0.Beta3</mapstruct.version>
<org.json.version>20190722</org.json.version>
<swagger.version>2.9.2</swagger.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct-jdk8</artifactId>
<version>${mapstruct.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.json/json -->
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>${org.json.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.alibaba/fastjson
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.73</version>
</dependency>-->
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>${swagger.version}</version>
<exclusions>
<exclusion>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>${swagger.version}</version>
<exclusions>
<exclusion>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<!--<version>1.18.12</version>-->
<type>jar</type>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.code.gson/gson -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.6</version>
</dependency>
<dependency>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<version>2.0.2</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct</artifactId>
<version>1.4.0.Beta3</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
<configuration>
<mainClass>com.example.springmysqlelastic.SpringMysqlElasticApplication</mainClass>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
<annotationProcessorPaths>
<path>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct-processor</artifactId>
<version>1.4.0.Beta3</version>
</path>
</annotationProcessorPaths>
</configuration>
</plugin>
</plugins>
</build>
</project>
PS: I removed the JPA dependency just to make it run on my machine without the database; ignore that change.
You can see that I changed the Swagger dependencies to look like below:
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>${swagger.version}</version>
<exclusions>
<exclusion>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>${swagger.version}</version>
<exclusions>
<exclusion>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct</artifactId>
</exclusion>
</exclusions>
</dependency>
Can you check if it works for you?
Update
After the codegen step, this is what it generates:
package com.example.springmysqlelastic.mapper;
import java.util.ArrayList;
import java.util.List;
import javax.annotation.Generated;
import org.springframework.stereotype.Component;
@Generated(
value = "org.mapstruct.ap.MappingProcessor",
date = "2020-08-11T16:40:57+0530",
comments = "version: 1.4.0.Beta3, compiler: javac, environment: Java 1.8.0_265 (Private Build)"
)
@Component
public class UserMapperImpl implements UserMapper {
@Override
public UserDTO toUserDTO(User user) {
if ( user == null ) {
return null;
}
UserDTO userDTO = new UserDTO();
return userDTO;
}
@Override
public List<UserDTO> toUserDtos(List<User> users) {
if ( users == null ) {
return null;
}
List<UserDTO> list = new ArrayList<UserDTO>( users.size() );
for ( User user : users ) {
list.add( toUserDTO( user ) );
}
return list;
}
@Override
public User toUser(UserDTO userDTO) {
if ( userDTO == null ) {
return null;
}
User user = new User();
return user;
}
@Override
public List<User> toUsers(List<UserDTO> userDTOS) {
if ( userDTOS == null ) {
return null;
}
List<User> list = new ArrayList<User>( userDTOS.size() );
for ( UserDTO userDTO : userDTOS ) {
list.add( toUser( userDTO ) );
}
return list;
}
}

Spring Executable jar not saving in DB, It works with Eclipse

I have a Spring Boot app that works as expected when I run it with Eclipse. I create a runnable jar and test it. Everything works except one thing: it does not save anything to the DB, and there is no log or exception to check. I assume it has something to do with the export, because it does not store anything, not even when running it locally.
I have two projects, one depending on the other.
The project pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>BatchCare</groupId>
<artifactId>BatchCare</artifactId>
<version>0.0.1-SNAPSHOT</version>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.3.RELEASE</version>
</parent>
<build>
<sourceDirectory>src</sourceDirectory>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>1.4.2.RELEASE</version>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-io</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>org.simpleframework</groupId>
<artifactId>simple-xml</artifactId>
<version>2.6.9</version>
</dependency>
<dependency>
<groupId>com.oracle</groupId>
<artifactId>ojdbc6</artifactId>
<version>11.1.0.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-io</artifactId>
<version>2.5</version>
<!-- <version>1.3.2</version> -->
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>3.9</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.9</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml-schemas</artifactId>
<version>3.9</version>
</dependency>
<dependency>
<groupId>jaxen</groupId>
<artifactId>jaxen</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>CareLibrary</groupId>
<artifactId>CareLibrary</artifactId>
<version>0.0.1-SNAPSHOT</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.3.156</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-releases</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
<repository>
<id>local-repo</id>
<name>Local Project Repo</name>
<url>file://${basedir}/lib/</url>
<layout>default</layout>
</repository>
</repositories>
Classes:
@ComponentScan({"ar.com.tr.latam.care.utils", "ar.com.tr.latam.care.unification.process", "ar.com.tr.latam.care.repository"})
@EnableJpaRepositories
//@Component
@Repository
public class RequestLoggerImpl implements RequestLogger {
/**
* Log.
*/
private static final Logger LOG = LoggerFactory.getLogger(RequestLoggerImpl.class);
@Autowired
private RequestLogDao requestLogDao;
@Override
public void log() {
RequestLog requestLog = new RequestLog();
requestLog.setDate(new Date());
requestLog.setSource("jar");
try {
requestLogDao.save(requestLog);
} catch (Exception ex) {
System.out.println("-------------ERROR-------------------");
ex.printStackTrace();
}
}
}
RequestLogDao
public interface RequestLogDao extends CrudRepository<RequestLog, Integer> {
}
RequestLog
@Entity
@Table(name = "REQUEST_LOG")
public class RequestLog {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "RequestLogSeq")
@SequenceGenerator(name = "RequestLogSeq", sequenceName = "REQUEST_LOG_SEQ")
private Integer id;
@Column
@NotNull
private Date date;
@NotNull
private String source;
public void setDate(Date date) {
this.date = date;
}
public void setSource(String source) {
this.source = source;
}
}
My main Class:
@ComponentScan({"ar.com.tr.latam.care", "ar.com.tr.latam.care.repository", "ar.com.tr.latam.care.filtro"})
@EnableJpaRepositories
@SpringBootApplication
public class InitBatch implements CommandLineRunner {
@Autowired
private Batch batch;
@Override
public void run(String... args) throws Exception {
batch.processFiles();
}
public static void main(String[] args) throws Exception {
long startTime = System.nanoTime();
SpringApplication.run(InitBatch.class, args).close();
System.out.println(String.format("Total process time (%s) seconds",(System.nanoTime() - startTime) / 1000000000));
}
}
How do I export the project? First, I run mvn clean install for the library project. After that I go to my project and run mvn clean package.
This creates a runnable jar that works, except for this problem that it can't save anything to the DB.
Also, the RequestLogDao is made with JPA.
Any help? Thanks in advance!

storm hdfs connector ... trying to write data into hdfs using storm

I am trying to write data into HDFS using the "storm-hdfs connector 0.1.3".
The GitHub URL is https://github.com/ptgoetz/storm-hdfs, and I have added this dependency to my Maven project:
<dependency>
<groupId>com.github.ptgoetz</groupId>
<artifactId>storm-hdfs</artifactId>
<version>0.1.3-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
The sample topology to write data into HDFS is provided in the storm-hdfs project itself; I just modified it to match my file locations. The HdfsFileTopology is:
package my.company.app;
import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.StormSubmitter;
import backtype.storm.spout.SpoutOutputCollector;
import backtype.storm.task.OutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseRichBolt;
import backtype.storm.topology.base.BaseRichSpout;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import org.apache.storm.hdfs.bolt.HdfsBolt;
import org.apache.storm.hdfs.bolt.AbstractHdfsBolt;
import org.apache.storm.hdfs.bolt.SequenceFileBolt;
import org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat;
import org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat;
import org.apache.storm.hdfs.bolt.format.FileNameFormat;
import org.apache.storm.hdfs.bolt.format.RecordFormat;
import org.apache.storm.hdfs.bolt.rotation.FileRotationPolicy;
import org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy;
import org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy.Units;
import org.apache.storm.hdfs.bolt.rotation.TimedRotationPolicy;
import org.apache.storm.hdfs.bolt.sync.CountSyncPolicy;
import org.apache.storm.hdfs.bolt.sync.SyncPolicy;
import org.apache.storm.hdfs.common.rotation.MoveFileAction;
import org.yaml.snakeyaml.Yaml;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
public class HdfsFileTopology {
static final String SENTENCE_SPOUT_ID = "sentence-spout";
static final String BOLT_ID = "my-bolt";
static final String TOPOLOGY_NAME = "test-topology";
public static void main(String[] args) throws Exception {
Config config = new Config();
config.setNumWorkers(1);
SentenceSpout spout = new SentenceSpout();
// sync the filesystem after every 1k tuples
SyncPolicy syncPolicy = new CountSyncPolicy(1000);
// rotate files when they reach 5MB
FileRotationPolicy rotationPolicy = new TimedRotationPolicy(1.0f, TimedRotationPolicy.TimeUnit.MINUTES);
FileNameFormat fileNameFormat = new DefaultFileNameFormat()
.withPath("/users/storm/")
.withExtension(".txt");
// use "|" instead of "," for field delimiter
RecordFormat format = new DelimitedRecordFormat()
.withFieldDelimiter("|");
Yaml yaml = new Yaml();
InputStream in = new FileInputStream(args[1]);
Map<String, Object> yamlConf = (Map<String, Object>) yaml.load(in);
in.close();
config.put("hdfs.config", yamlConf);
HdfsBolt bolt = new HdfsBolt()
.withConfigKey("hdfs.config")
.withFsUrl(args[0])
.withFileNameFormat(fileNameFormat)
.withRecordFormat(format)
.withRotationPolicy(rotationPolicy)
.withSyncPolicy(syncPolicy)
.addRotationAction(new MoveFileAction().toDestination("/dest2/"));
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout(SENTENCE_SPOUT_ID, spout, 1);
// SentenceSpout --> MyBolt
builder.setBolt(BOLT_ID, bolt, 4)
.shuffleGrouping(SENTENCE_SPOUT_ID);
if (args.length == 2) {
LocalCluster cluster = new LocalCluster();
cluster.submitTopology(TOPOLOGY_NAME, config, builder.createTopology());
waitForSeconds(120);
cluster.killTopology(TOPOLOGY_NAME);
cluster.shutdown();
System.exit(0);
} else if (args.length == 3) {
StormSubmitter.submitTopology(args[0], config, builder.createTopology());
} else{
System.out.println("Usage: HdfsFileTopology [topology name] <yaml config file>");
}
}
public static void waitForSeconds(int seconds) {
try {
Thread.sleep(seconds * 1000);
} catch (InterruptedException e) {
}
}
public static class SentenceSpout extends BaseRichSpout {
private ConcurrentHashMap<UUID, Values> pending;
private SpoutOutputCollector collector;
private String[] sentences = {
"my dog has fleas",
"i like cold beverages",
"the dog ate my homework",
"don't have a cow man",
"i don't think i like fleas"
};
private int index = 0;
private int count = 0;
private long total = 0L;
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields("sentence", "timestamp"));
}
public void open(Map config, TopologyContext context,
SpoutOutputCollector collector) {
this.collector = collector;
this.pending = new ConcurrentHashMap<UUID, Values>();
}
public void nextTuple() {
Values values = new Values(sentences[index], System.currentTimeMillis());
UUID msgId = UUID.randomUUID();
this.pending.put(msgId, values);
this.collector.emit(values, msgId);
index++;
if (index >= sentences.length) {
index = 0;
}
count++;
total++;
if(count > 20000){
count = 0;
System.out.println("Pending count: " + this.pending.size() + ", total: " + this.total);
}
Thread.yield();
}
public void ack(Object msgId) {
this.pending.remove(msgId);
}
public void fail(Object msgId) {
System.out.println("**** RESENDING FAILED TUPLE");
this.collector.emit(this.pending.get(msgId), msgId);
}
}
public static class MyBolt extends BaseRichBolt {
private HashMap<String, Long> counts = null;
private OutputCollector collector;
public void prepare(Map config, TopologyContext context, OutputCollector collector) {
this.counts = new HashMap<String, Long>();
this.collector = collector;
}
public void execute(Tuple tuple) {
collector.ack(tuple);
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
// this bolt does not emit anything
}
@Override
public void cleanup() {
}
}
}
I compile the project using Maven (group id: my.company.app) and the build is successful, but when I submit the jar file to Storm it throws an error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/storm/hdfs/bolt/format/FileNameFormat
Caused by: java.lang.ClassNotFoundException: org.apache.storm.hdfs.bolt.format.FileNameFormat
Even though I have included the class, why is it throwing an error that the class was not found?
Any help on how to write data to HDFS using Storm would be appreciated.
As requested, the pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany.app</groupId>
<artifactId>hdfs_example</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>hdfs_example</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<repositories>
<repository>
<id>Codehaus</id>
<url>http://repository.codehaus.org</url>
</repository>
<repository>
<id>Codehaus.Snapshots</id>
<url>http://snapshots.repository.codehaus.org</url>
<snapshots><enabled>true</enabled></snapshots>
</repository>
<repository>
<id>github-releases</id>
<url>http://oss.sonatype.org/content/repositories/github-releases/</url>
</repository>
<repository>
<id>clojars.org</id>
<url>http://clojars.org/repo</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.8.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.9.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.easytesting</groupId>
<artifactId>fest-assert-core</artifactId>
<version>2.0M8</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jmock</groupId>
<artifactId>jmock</artifactId>
<version>2.6.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-core</artifactId>
<version>0.9.2-incubating</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>15.0</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.2.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.github.ptgoetz</groupId>
<artifactId>storm-hdfs</artifactId>
<version>0.1.3-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/java</sourceDirectory>
<testSourceDirectory>test/main/java</testSourceDirectory>
<resources>
<resource>
<directory>${basedir}/multilang</directory>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.4</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass></mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
<!--
<plugin>
<groupId>com.theoryinpractise</groupId>
<artifactId>clojure-maven-plugin</artifactId>
<version>1.3.12</version>
<extensions>true</extensions>
<configuration>
<sourceDirectories>
<sourceDirectory>src/clj</sourceDirectory>
</sourceDirectories>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>test</id>
<phase>test</phase>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
-->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
<configuration>
<executable>java</executable>
<includeProjectDependencies>true</includeProjectDependencies>
<includePluginDependencies>false</includePluginDependencies>
<classpathScope>compile</classpathScope>
<mainClass>${storm.topology}</mainClass>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
I had the same issue, and after checking the pom.xml in detail I realized that the storm-hdfs dependency with <version>0.1.3-SNAPSHOT</version> has its scope defined as provided, which I think means that we have to add the jar to the storm jar ourselves and Maven won't do it during packaging.
I changed the version to what is available in the Maven repo and removed the scope, which forced Maven to download the jar and include it in the storm jar during the build.
Here is my pom.xml for reference (with some basic details removed):
<resources>
<resource>
<directory>src/main/resources/</directory>
<filtering>false</filtering>
<includes>
<include>core-site.xml</include>
<include>hdfs-site.xml</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.4</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.company.main</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-core</artifactId>
<version>0.9.2-incubating</version>
<!-- keep storm out of the jar-with-dependencies -->
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-kafka</artifactId>
<version>0.9.2-incubating</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<!-- Utilities -->
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>15.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.8.1.1</version>
<exclusions>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jdmk</groupId>
<artifactId>jmxtools</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jmx</groupId>
<artifactId>jmxri</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.zookeeper</groupId>
<artifactId>zookeeper</artifactId>
</exclusion>
<exclusion>
<groupId>com.101tec</groupId>
<artifactId>zkclient</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- our cluster hadoop version -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.4.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- our cluster hadoop version -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.4.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- apache hdfs-bolt related dependencies -->
<dependency>
<groupId>com.github.ptgoetz</groupId>
<artifactId>storm-hdfs</artifactId>
<version>0.1.2</version>
<exclusions>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
</exclusion>
</exclusions>
</dependency>

UnsupportedSignatureMethodException: HMAC-SHA1 exception in tomcat with jerseyoauth

This is my pom
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mkyong.common</groupId>
<artifactId>RESTfulExample</artifactId>
<packaging>war</packaging>
<version>1.0-SNAPSHOT</version>
<name>RESTfulExample Maven Webapp</name>
<url>http://maven.apache.org</url>
<repositories>
<repository>
<id>maven2-repository.java.net</id>
<name>Java.net Repository for Maven</name>
<url>http://download.java.net/maven/2/</url>
<layout>default</layout>
</repository>
</repositories>
<properties>
<project.build.java.target>1.6</project.build.java.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.8.2</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-server</artifactId>
<version>1.8</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>4.3.0.Final</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.28</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>1.6.5</version>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.9.5</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs.jersey-oauth</groupId>
<artifactId>oauth-server</artifactId>
<version>1.18</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs.jersey-oauth</groupId>
<artifactId>oauth-signature</artifactId>
<version>1.18</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs.jersey-oauth</groupId>
<artifactId>oauth-client</artifactId>
<version>1.18</version>
</dependency>
</dependencies>
<build>
<finalName>RESTfulExample</finalName>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.5</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.3.1</version>
<configuration>
<complianceLevel>1.6</complianceLevel>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<!--<goal>test-compile</goal> -->
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
This is my OAuth filter class:
public class OAuthAuthenticationFilter1 implements ContainerRequestFilter {
public ContainerRequest filter(ContainerRequest containerRequest) {
OAuthServerRequest request = new OAuthServerRequest(containerRequest);
OAuthParameters params = new OAuthParameters();
params.readRequest(request);
OAuthSecrets secrets = new OAuthSecrets();
OAuthParameters param=new OAuthParameters();
secrets.consumerSecret("OwnAccount");
param.consumerKey("OwnAccount");
try {
if(!OAuthSignature.verify(request, params, secrets)) {
throw new WebApplicationException(401);
}
} catch (OAuthSignatureException e) {
throw new WebApplicationException(e, 401);
}
return containerRequest;
}
}
This is my client
public class Sample1 {
public static final String HOSTNAME = "http://localhost:8080/RESTfulExample/rest/hello/planetext";
public static final String CONSUMER_KEY = "OwnAccount";
public static final String CONSUMER_SECRET = "OwnAccount";
/**
* @param args
*/
public static void main(String[] args) {
Client client = Client.create();
OAuthParameters params = new OAuthParameters().signatureMethod("HMAC-SHA1").consumerKey(CONSUMER_KEY);
OAuthSecrets secrets = new OAuthSecrets().consumerSecret(CONSUMER_SECRET);
OAuthClientFilter filter = new OAuthClientFilter(client.getProviders(), params, secrets);
WebResource res = client.resource(HOSTNAME );
res.addFilter(filter);
String responseString = res.get(String.class);
System.out.println(responseString);
}
}
I am using the following OAuth and Jersey dependencies, referring to the site
http://search.maven.org/#search%7Cga%7C1%7Cg%3A%22com.sun.jersey.contribs.jersey-oauth%22
<dependency>
<groupId>com.sun.jersey.contribs.jersey-oauth</groupId>
<artifactId>oauth-server</artifactId>
<version>1.18</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs.jersey-oauth</groupId>
<artifactId>oauth-signature</artifactId>
<version>1.18</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs.jersey-oauth</groupId>
<artifactId>oauth-client</artifactId>
<version>1.18</version>
</dependency>
When the following code in the filter class is called:
try {
// exception is thrown from here
if(!OAuthSignature.verify(request, params, secrets)) {
throw new WebApplicationException(401);
}
} catch (OAuthSignatureException e) {
throw new WebApplicationException(e, 401);
}
I am getting this exception:
com.sun.jersey.oauth.signature.UnsupportedSignatureMethodException: HMAC-SHA1
at com.sun.jersey.oauth.signature.OAuthSignature.getSignatureMethod(OAuthSignature.java:257)
I am using Tomcat 7.
I searched on the net but could not find the cause for Tomcat. What am I missing? Has anyone faced this with Tomcat?
I solved this problem when I updated the jersey-server jar to 1.18. I was using the jersey-server 1.8 jar before.
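In pom terms, that just means aligning jersey-server with the 1.18 jersey-oauth artifacts, something like:
<dependency>
    <groupId>com.sun.jersey</groupId>
    <artifactId>jersey-server</artifactId>
    <version>1.18</version>
</dependency>
Presumably the 1.8 server jar did not match the 1.18 oauth-signature module, so the HMAC-SHA1 signature method could not be resolved.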
