I'm trying to write simple data into a table with Apache Iceberg 0.9.1, but I get error messages. I want to CRUD data through Hadoop directly.
I create a Hadoop table and try to read from it; after that I try to write data into the table.
I prepared a JSON file containing one line. My code reads the JSON object and arranges the order of the fields, but the final step of writing the data always fails. I've changed the versions of some dependency packages, but then other error messages appear.
Is something wrong with my package versions?
Please help me.
This is my source code:
import static org.apache.iceberg.types.Types.NestedField.optional;

import org.apache.hadoop.conf.Configuration;
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.Schema;
import org.apache.iceberg.Table;
import org.apache.iceberg.hadoop.HadoopTables;
import org.apache.iceberg.types.Types;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class IcebergTest {
    public static void main(String[] args) {
        testWithoutCatalog();
        readDataWithouCatalog();
        writeDataWithoutCatalog();
    }

    public static void testWithoutCatalog() {
        // Define the table schema and an identity partition on "title".
        Schema bookSchema = new Schema(
                optional(1, "title", Types.StringType.get()),
                optional(2, "price", Types.LongType.get()),
                optional(3, "author", Types.StringType.get()),
                optional(4, "genre", Types.StringType.get()));
        PartitionSpec bookspec = PartitionSpec.builderFor(bookSchema).identity("title").build();
        Configuration conf = new Configuration();
        String warehousePath = "hdfs://hadoop01:9000/warehouse_path/xgfying/books3";
        // Create the table directly on HDFS, without a catalog.
        HadoopTables tables = new HadoopTables(conf);
        Table table = tables.create(bookSchema, bookspec, warehousePath);
    }

    public static void readDataWithouCatalog() {
        .......
    }

    public static void writeDataWithoutCatalog() {
        SparkSession spark = SparkSession.builder().master("local[2]").getOrCreate();
        Dataset<Row> df = spark.read().json("src/test/data/books3.json");
        System.out.println(" this is the writing data : " + df.select("title", "price", "author", "genre")
                .first().toString());
        // Append the one-row dataframe to the Hadoop table created above.
        df.select("title", "price", "author", "genre")
                .write().format("iceberg").mode("append")
                .save("hdfs://hadoop01:9000/warehouse_path/xgfying/books3");
        // System.out.println(df.write().format("iceberg").mode("append").toString());
    }
}
These are the error messages:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/11/18 15:51:36 INFO SparkContext: Running Spark version 2.4.5
.......
file:///C:/tmp/icebergtest1/src/test/data/books3.json, range: 0-75, partition values: [empty row]
20/11/18 15:51:52 ERROR Utils: Aborting task
java.lang.ExceptionInInitializerError
at org.apache.iceberg.parquet.Parquet$WriteBuilder.build(Parquet.java:232)
at org.apache.iceberg.spark.source.SparkAppenderFactory.newAppender(SparkAppenderFactory.java:61)
at org.apache.iceberg.spark.source.BaseWriter.openCurrent(BaseWriter.java:105)
at org.apache.iceberg.spark.source.PartitionedWriter.write(PartitionedWriter.java:63)
at org.apache.iceberg.spark.source.Writer$Partitioned24Writer.write(Writer.java:271)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$$anonfun$run$3.apply(WriteToDataSourceV2Exec.scala:118)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$$anonfun$run$3.apply(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:146)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec$$anonfun$doExecute$2.apply(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec$$anonfun$doExecute$2.apply(WriteToDataSourceV2Exec.scala:66)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Cannot find constructor for interface org.apache.parquet.column.page.PageWriteStore
Missing org.apache.parquet.hadoop.ColumnChunkPageWriteStore(org.apache.parquet.hadoop.CodecFactory$BytesCompressor,org.apache.parquet.schema.MessageType,org.apache.parquet.bytes.ByteBufferAllocator,int) [java.lang.NoSuchMethodException: org.apache.parquet.hadoop.ColumnChunkPageWriteStore.<init>(org.apache.parquet.hadoop.CodecFactory$BytesCompressor, org.apache.parquet.schema.MessageType, org.apache.parquet.bytes.ByteBufferAllocator, int)]
at org.apache.iceberg.common.DynConstructors$Builder.build(DynConstructors.java:235)
at org.apache.iceberg.parquet.ParquetWriter.<clinit>(ParquetWriter.java:55)
... 19 more
20/11/18 15:51:52 ERROR DataWritingSparkTask: Aborting commit for partition 0 (task 2, attempt 0, stage 2.0)
20/11/18 15:51:52 ERROR DataWritingSparkTask: Aborted commit for partition 0 (task 2, attempt 0, stage 2.0)
This is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>icebergtest</groupId>
<artifactId>icebergtest1</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>icebergtest1</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<iceberg.version>0.9.1</iceberg.version>
<hadoop.version>2.7.0</hadoop.version>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<!-- org.apache.hadoop BEGIN-->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
<!-- Exclude the netty package -->
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- Fixes the io.netty.buffer.PooledByteBufAllocator.defaultNumHeapArena()I error -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.18.Final</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-jobclient</artifactId>
<version>${hadoop.version}</version>
</dependency>
<!-- org.apache.hadoop END-->
<!-- org.apache.iceberg BEGIN-->
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-core</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-api</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-parquet</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-common</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-orc</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-data</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-hive</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-arrow</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-spark</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-bundled-guava</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-spark-runtime</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-spark2</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-flink</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-pig</artifactId>
<version>${iceberg.version}</version>
</dependency>
<dependency>
<groupId>org.apache.iceberg</groupId>
<artifactId>iceberg-mr</artifactId>
<version>${iceberg.version}</version>
</dependency>
<!-- org.apache.iceberg END-->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.5</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.5</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>2.4.5</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.4.5</version>
<exclusions>
<exclusion>
<groupId>org.codehaus.janino</groupId>
<artifactId>commons-compiler</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>2.11.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<!--<version>2.7.9</version>-->
<version>2.6.6</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<!--<version>2.7.9.4</version>-->
<version>2.6.5</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<!--<version>2.7.9</version>-->
<version>2.6.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.alibaba/fastjson -->
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.56</version>
</dependency>
<dependency>
<groupId>org.apache.parquet</groupId>
<artifactId>parquet-avro</artifactId>
<version>1.11.1</version>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>1.10.0</version>
</dependency>
<dependency>
<groupId>org.apache.parquet</groupId>
<artifactId>parquet-column</artifactId>
<version>1.11.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.4.0</version>
<scope>provided</scope>
</dependency>
</dependencies>
</project>
Missing org.apache.parquet.hadoop.ColumnChunkPageWriteStore(org.apache.parquet.hadoop.CodecFactory$BytesCompressor, org.apache.parquet.schema.MessageType, org.apache.parquet.bytes.ByteBufferAllocator, int) [java.lang.NoSuchMethodException: org.apache.parquet.hadoop.ColumnChunkPageWriteStore.<init>(org.apache.parquet.hadoop.CodecFactory$BytesCompressor, org.apache.parquet.schema.MessageType, org.apache.parquet.bytes.ByteBufferAllocator, int)]
means Iceberg is reflectively calling a constructor of ColumnChunkPageWriteStore that takes four parameters, of types (org.apache.parquet.hadoop.CodecFactory$BytesCompressor, org.apache.parquet.schema.MessageType, org.apache.parquet.bytes.ByteBufferAllocator, int).
It can't find that constructor on your classpath. That's why you get the NoSuchMethodException.
According to https://jar-download.com/artifacts/org.apache.parquet/parquet-hadoop/1.8.1/source-code/org/apache/parquet/hadoop/ColumnChunkPageWriteStore.java , you need version 1.8.1 of parquet-hadoop.
Change your Maven import to an older version. I looked at the 1.8.1 source code and it has the constructor you need.
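For example, a minimal sketch of the pom.xml change, using the 1.8.1 coordinates from the link above (verify the exact Parquet version against the Iceberg release you use, and keep the other Parquet artifacts on the same version so they stay consistent):

<!-- Pin Parquet to a version whose ColumnChunkPageWriteStore still has the
     4-argument constructor that Iceberg 0.9.1 looks up reflectively
     (see org.apache.iceberg.common.DynConstructors in the stack trace). -->
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-column</artifactId>
    <version>1.8.1</version>
</dependency>
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-avro</artifactId>
    <version>1.8.1</version>
</dependency>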
Related
When I'm trying to update my spring-boot project from 2.0.1 to 2.7.x, I'm getting the following error.
Also, I'm using the flapdoodle embedded dependency with version 3.0.0 for my test cases.
nested exception is java.lang.IllegalStateException: Error processing condition on org.springframework.boot.autoconfigure.mongo.embedded.EmbeddedMongoAutoConfiguration.embeddedMongoServer
DEPENDENCIES
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>org.yaml</groupId>
<artifactId>snakeyaml</artifactId>
<version>1.33</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.10</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-validation</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-sync</artifactId>
<version>4.3</version>
</dependency>
<dependency>
<groupId>com.github.ulisesbocchio</groupId>
<artifactId>jasypt-spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-to-slf4j</artifactId>
<version>2.17.1</version>
</dependency>
<!-- Test -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<version>2.7.5</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter</artifactId>
</exclusion>
<exclusion>
<groupId>org.mockito</groupId>
<artifactId>mockito-junit-jupiter</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-spring</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>de.flapdoodle.embed</groupId>
<artifactId>de.flapdoodle.embed.mongo</artifactId>
<version>3.0.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
</dependency>
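A note on the likely cause, stated as an assumption to verify: Spring Boot 2.7.x manages de.flapdoodle.embed.mongo in the 3.4.x line, and since Boot 2.6 the embedded-Mongo auto-configuration condition also requires the spring.mongodb.embedded.version property, which matches the condition error above. A minimal sketch of an aligned test setup, assuming the project inherits spring-boot-starter-parent (or imports spring-boot-dependencies) at 2.7.5:

<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <!-- no <version>: let the Boot 2.7.x BOM pick a compatible 3.4.x release -->
    <scope>test</scope>
</dependency>

and in src/test/resources/application.properties:

# required by the embedded-Mongo auto-configuration since Spring Boot 2.6;
# the MongoDB version itself is a placeholder, adjust as needed
spring.mongodb.embedded.version=4.0.21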
I have newly created a Spring Boot project and added Spring Cloud, Azure Active Directory, and lots of other dependencies. Now when I try to run my application I get the error below.
Error:
APPLICATION FAILED TO START
***************************
Description:
An attempt was made to call a method that does not exist. The attempt was made from the following location:
com.azure.spring.cloud.autoconfigure.aad.AadWebSecurityConfigurerAdapter.configure(AadWebSecurityConfigurerAdapter.java:83)
The following method did not exist:
org.springframework.security.config.annotation.web.configurers.oauth2.client.OAuth2LoginConfigurer$AuthorizationEndpointConfig.authorizationRequestResolver(Lorg/springframework/security/oauth2/client/web/OAuth2AuthorizationRequestResolver;)Lorg/springframework/security/config/annotation/web/configurers/oauth2/client/OAuth2LoginConfigurer$AuthorizationEndpointConfig;
The calling method's class, com.azure.spring.cloud.autoconfigure.aad.AadWebSecurityConfigurerAdapter, was loaded from the following location:
jar:file:/C:/Users/ARIMO/.m2/repository/com/azure/spring/spring-cloud-azure-autoconfigure/4.5.0/spring-cloud-azure-autoconfigure-4.5.0.jar!/com/azure/spring/cloud/autoconfigure/aad/AadWebSecurityConfigurerAdapter.class
The called method's class, org.springframework.security.config.annotation.web.configurers.oauth2.client.OAuth2LoginConfigurer$AuthorizationEndpointConfig, is available from the following locations:
jar:file:/C:/Users/ARIMO/.m2/repository/org/springframework/security/spring-security-config/5.0.0.RELEASE/spring-security-config-5.0.0.RELEASE.jar!/org/springframework/security/config/annotation/web/configurers/oauth2/client/OAuth2LoginConfigurer$AuthorizationEndpointConfig.class
The called method's class hierarchy was loaded from the following locations:
org.springframework.security.config.annotation.web.configurers.oauth2.client.OAuth2LoginConfigurer.AuthorizationEndpointConfig: file:/C:/Users/ARIMO/.m2/repository/org/springframework/security/spring-security-config/5.0.0.RELEASE/spring-security-config-5.0.0.RELEASE.jar
Action:
Correct the classpath of your application so that it contains compatible versions of the classes com.azure.spring.cloud.autoconfigure.aad.AadWebSecurityConfigurerAdapter and org.springframework.security.config.annotation.web.configurers.oauth2.client.OAuth2LoginConfigurer$AuthorizationEndpointConfig
Process finished with exit code 1
POM.xml: Below is the pom file; it integrates authentication via Active Directory along with the other functionality.
<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.demo.test</groupId>
<artifactId>test</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>demo</name>
<description>Spring boot project for test</description>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.7.1</version>
<relativePath />
</parent>
<properties>
<java.version>1.8</java.version>
<cxf.version>3.1.5</cxf.version>
<olingo.version>2.0.11</olingo.version>
<commons.version>0.0.1-SNAPSHOT</commons.version>
<log4j2.version>2.17.0</log4j2.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<spring-cloud-azure.version>4.5.0</spring-cloud-azure.version>
</properties>
<dependencies>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.springframework.security/spring-security-core -->
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-web</artifactId>
<version>5.0.0.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-core</artifactId>
<version>5.0.0.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-config</artifactId>
<version>5.0.0.RELEASE</version>
</dependency>
<dependency>
<groupId>com.microsoft.sqlserver</groupId>
<artifactId>mssql-jdbc</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
<exclusions>
<exclusion>
<groupId>org.apache.tomcat</groupId>
<artifactId>tomcat-jdbc</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<!-- Cloud dependencies -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-cloud-connectors</artifactId>
<version>2.1.4.RELEASE</version>
</dependency>
<!-- Cloud dependencies -->
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
<version>2.2.10.RELEASE</version>
<exclusions>
<exclusion>
<groupId>javax.ws.rs</groupId>
<artifactId>jsr311-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-openfeign</artifactId>
</dependency>
<!-- Apache Olingo dependencies -->
<dependency>
<groupId>org.apache.olingo</groupId>
<artifactId>olingo-odata2-jpa-processor-api</artifactId>
<version>${olingo.version}</version>
<exclusions>
<exclusion>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.olingo</groupId>
<artifactId>olingo-odata2-jpa-processor-core</artifactId>
<version>${olingo.version}</version>
<exclusions>
<exclusion>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-frontend-jaxrs</artifactId>
<version>${cxf.version}</version>
</dependency>
<dependency>
<groupId>org.apache.olingo</groupId>
<artifactId>olingo-odata2-api</artifactId>
<version>${olingo.version}</version>
</dependency>
<dependency>
<groupId>org.apache.olingo</groupId>
<artifactId>olingo-odata2-core</artifactId>
<version>${olingo.version}</version>
</dependency>
<dependency>
<groupId>org.apache.olingo</groupId>
<artifactId>olingo-odata2-api-annotation</artifactId>
<version>${olingo.version}</version>
</dependency>
<!-- Utilities -->
<dependency>
<groupId>org.flywaydb</groupId>
<artifactId>flyway-sqlserver</artifactId>
</dependency>
<!-- Test tools -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.10</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<version>3.4.10</version>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-core</artifactId>
<version>0.9.8</version>
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.cloudfoundry.identity</groupId>
<artifactId>cloudfoundry-identity-client-lib</artifactId>
<version>4.10.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.json/json -->
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20080701</version>
</dependency>
<!-- Utilities -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.13.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.13.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.13.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
<version>2.13.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>2.13.2</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-oauth2-client</artifactId>
</dependency>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-active-directory</artifactId>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dependencies</artifactId>
<version>2021.0.3</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-dependencies</artifactId>
<version>${spring-cloud-azure.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.1.4.RELEASE</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version><!--$NO-MVN-MAN-VER$-->
</plugin>
</plugins>
</build>
</project>
Tried multiple times
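For what it's worth, the stack trace already names the conflict: spring-cloud-azure-autoconfigure 4.5.0 calls OAuth2LoginConfigurer$AuthorizationEndpointConfig.authorizationRequestResolver(...), a method added in Spring Security 5.1, while the pom pins spring-security-web/core/config to 5.0.0.RELEASE. A minimal sketch of the fix, assuming spring-boot-starter-parent 2.7.1 is left to manage Spring Security (it brings in the 5.7.x line):

<!-- Drop the explicit 5.0.0.RELEASE pins so the Boot parent supplies a
     Spring Security version compatible with spring-cloud-azure 4.5.0. -->
<dependency>
    <groupId>org.springframework.security</groupId>
    <artifactId>spring-security-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.security</groupId>
    <artifactId>spring-security-core</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.security</groupId>
    <artifactId>spring-security-config</artifactId>
</dependency>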
I'm trying to write a DataFrame to an ORC file, but to no avail. I'm using Spark 1.6 with Java.
I am running on my local machine; I tried adding some dependencies, but without success.
My POM is this:
<properties>
<spark.version>1.6.0</spark.version>
<scala.short.version>2.10</scala.short.version>
<slf4j.version>1.7.25</slf4j.version>
<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.scalatest/scalatest_${scala.short.version} -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.9.0.0</version>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.10</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.10</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-avro_2.10</artifactId>
<version>3.2.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>RELEASE</version>
</dependency>
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.11</version>
<!--<scope>provided</scope>-->
</dependency>
<!-- https://mvnrepository.com/artifact/com.typesafe.play/play-json -->
<dependency>
<groupId>com.typesafe.play</groupId>
<artifactId>play-json_2.11</artifactId>
<version>2.7.0-M1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-aws</artifactId>
<version>2.7.3</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-parser-combinators</artifactId>
<version>2.11.0-M4</version>
</dependency>
</dependencies>
I have a Spark job that should write to an ORC file, but it returns this error:
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: orc. Please find packages at http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:219)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:139)
at Confiaveis.main(Confiaveis.java:96)
Caused by: java.lang.ClassNotFoundException: orc.DefaultSource
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
at scala.util.Try.orElse(Try.scala:84)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:62)
... 4 more
I used this command to write:
df.write().mode("append").format("orc").save("path");
Does anyone know how I can solve this?
From what little I know of Spark, I understand that there's a library it can't find, but I can't find anything that clarifies which library that would be.
Try
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_*your_version*</artifactId>
<version>*your_version*</version>
<scope>provided</scope>
</dependency>
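A short usage note to go with it: in Spark 1.x, ORC support is only available together with Hive support, so the DataFrame has to come from a HiveContext rather than a plain SQLContext (and for a local run the spark-hive classes must actually be on the runtime classpath, so you may need to drop <scope>provided</scope>). A minimal sketch in Java, with illustrative paths:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class OrcWriteExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("orc-write").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // HiveContext registers the "orc" data source; a plain SQLContext does not.
        HiveContext sqlContext = new HiveContext(sc.sc());
        DataFrame df = sqlContext.read().json("src/test/data/example.json"); // illustrative path
        df.write().mode("append").format("orc").save("target/orc-out"); // illustrative path
        sc.stop();
    }
}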
I am attempting to use Apache Beam from Java as a data pipeline of sorts. I have written a simple class that sources from Google Pubsub and sinks to Google Bigquery, but I cannot get it to build for the life of me. I am using Maven to build and have added every Beam package I could find, but I still get "class file not found" errors.
Specifically:
[ERROR] /X:/Work/pipeline/backup-pipeline/src/main/java/PassthroughPipeline.java:[28,16] cannot access org.apache.beam.sdk.options.GcpOptions
class file for org.apache.beam.sdk.options.GcpOptions not found
[ERROR] /X:/Work/pipeline/backup-pipeline/src/main/java/PassthroughPipeline.java:[29,16] cannot access org.apache.beam.sdk.options.BigQueryOptions
class file for org.apache.beam.sdk.options.BigQueryOptions not found
[ERROR] /X:/Work/pipeline/backup-pipeline/src/main/java/PassthroughPipeline.java:[31,16] cannot access org.apache.beam.sdk.options.GcsOptions
class file for org.apache.beam.sdk.options.GcsOptions not found
Does anyone know what packages I need to add to resolve these? Google has unfortunately been no help.
The POM file that I have is based on the example POM given by Apache for WordCount, but with extra dependencies added. Below are the dependencies I put in it. I can provide the full file if needed, but it is quite monolithic.
<dependencies>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-apex</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>
<!--
Apex depends on httpclient version 4.3.5, project has a transitive dependency to httpclient 4.0.1 from
google-http-client. Apex dependency version being specified explicitly so that it gets picked up. This
can be removed when the project no longer has a dependency on a different httpclient version.
-->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.3.5</version>
<scope>runtime</scope>
<exclusions>
<exclusion>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
</profile>
<profile>
<id>dataflow-runner</id>
<!-- Makes the DataflowRunner available when running a pipeline. -->
<dependencies>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>
</dependencies>
</profile>
<profile>
<id>flink-runner</id>
<!-- Makes the FlinkRunner available when running a pipeline. -->
<dependencies>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-flink_2.10</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>
</dependencies>
</profile>
<profile>
<id>spark-runner</id>
<!-- Makes the SparkRunner available when running a pipeline. Additionally,
overrides some Spark dependencies to Beam-compatible versions. -->
<dependencies>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-spark</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-hadoop-file-system</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>${spark.version}</version>
<scope>runtime</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-scala_2.10</artifactId>
<version>${jackson.version}</version>
<scope>runtime</scope>
</dependency>
</dependencies>
</profile>
</profiles>
<dependencies>
<!-- Adds a dependency on the Beam SDK. -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-core</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-io-google-cloud-platform -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-common-fn-api -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-common-fn-api</artifactId>
<version>2.2.0</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-extensions-google-cloud-platform-core -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-extensions-google-cloud-platform-core</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-io-common -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-common</artifactId>
<version>2.2.0</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-parent -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-gcp-parent -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-gcp-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-extensions-parent -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-extensions-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-parent -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-common-parent -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-common-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.cloud.dataflow/google-cloud-dataflow-java-sdk-parent -->
<dependency>
<groupId>com.google.cloud.dataflow</groupId>
<artifactId>google-cloud-dataflow-java-sdk-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-reference -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-reference</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-parent -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-parent</artifactId>
<version>2.2.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-build-tools -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-build-tools</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-direct-java -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-direct-java</artifactId>
<version>2.2.0</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-core-construction-java -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-core-construction-java</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>com.google.cloud.dataflow</groupId>
<artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
<version>[2.1.0, 2.99)</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-common-runner-api -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-common-runner-api</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
<version>0.4.0</version>
</dependency>
<dependency>
<groupId>com.google.api-client</groupId>
<artifactId>google-api-client</artifactId>
<version>${google-clients.version}</version>
<exclusions>
<!-- Exclude an old version of guava that is being pulled
in by a transitive dependency of google-api-client -->
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava-jdk5</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-bigquery</artifactId>
<version>${bigquery.version}</version>
<exclusions>
<!-- Exclude an old version of guava that is being pulled
in by a transitive dependency of google-api-client -->
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava-jdk5</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.http-client</groupId>
<artifactId>google-http-client</artifactId>
<version>${google-clients.version}</version>
<exclusions>
<!-- Exclude an old version of guava that is being pulled
in by a transitive dependency of google-api-client -->
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava-jdk5</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-pubsub</artifactId>
<version>${pubsub.version}</version>
<exclusions>
<!-- Exclude an old version of guava that is being pulled
in by a transitive dependency of google-api-client -->
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava-jdk5</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${joda.version}</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>${guava.version}</version>
</dependency>
<!-- Add slf4j API frontend binding with JUL backend -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-jdk14</artifactId>
<version>${slf4j.version}</version>
<!-- When loaded at runtime this will wire up slf4j to the JUL backend -->
<scope>runtime</scope>
</dependency>
<!-- Hamcrest and JUnit are required dependencies of PAssert,
which is used in the main code of DebuggingWordCount example. -->
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>${hamcrest.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
</dependencies>
These classes:
org.apache.beam.sdk.options.GcpOptions
org.apache.beam.sdk.options.GcsOptions
org.apache.beam.sdk.options.BigQueryOptions
... are all in an earlier version of Apache Beam.
Given the dependencies in your pom.xml (specifically, the dependency on v2.2.0 of Apache Beam) the correct imports are:
org.apache.beam.sdk.extensions.gcp.options.GcpOptions
org.apache.beam.sdk.extensions.gcp.options.GcsOptions
org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
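For illustration, a minimal sketch of obtaining those options with the 2.2.0 package locations (the project id is a placeholder):

import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
import org.apache.beam.sdk.extensions.gcp.options.GcsOptions;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsExample {
    public static void main(String[] args) {
        // Parse the command line once, then view the same options
        // through whichever interface a given transform needs.
        BigQueryOptions options = PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(BigQueryOptions.class);
        options.as(GcpOptions.class).setProject("my-gcp-project"); // placeholder
        System.out.println(options.as(GcsOptions.class).getProject());
    }
}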
From Angular I am sending an update request to Spring. In the controller I have a method to map this request, but the @RequestBody in it is not able to map to my class and gives this error:
Error:
2017-10-06 12:42:43.496 ERROR 5856 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler processing failed; nested exception is java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.type.TypeFactory.constructType(Ljava/lang/reflect/Type;Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/JavaType;] with root cause
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.type.TypeFactory.constructType(Ljava/lang/reflect/Type;Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/JavaType;
at org.springframework.http.converter.json.AbstractJackson2HttpMessageConverter.getJavaType(AbstractJackson2HttpMessageConverter.java:319) ~[spring-web-4.2.7.RELEASE.jar:4.2.7.RELEASE]
at org.springframework.hateoas.mvc.TypeConstrainedMappingJackson2HttpMessageConverter.canRead(TypeConstrainedMappingJackson2HttpMessageConverter.java:63) ~[spring-hateoas-0.19.0.RELEASE.jar:na]
But I have checked, and in the com.fasterxml.jackson.databind.type.TypeFactory class there is this method:
@Deprecated
public JavaType constructType(Type type, Class<?> contextClass) {
    TypeBindings bindings = (contextClass == null)
            ? TypeBindings.emptyBindings() : constructType(contextClass).getBindings();
    return _fromAny(null, type, bindings);
}
Angular Request Code
function updateActivityByActivityID(ENDPOINTS, ActivityID, bodyJSON) {
var url = ENDPOINTS.SERVICE_ADDRESS
+ ENDPOINTS.UPDATE_ACTIVITY_BY_ACTIVITY_ID
+ ActivityID;
return HttpHandler.PUT(url, bodyJSON);
}
url : http://localhost:8080/Integration/Activity/updateCron/14
bodyJSON : cronExpression: "0 0/1 * 1/1 * ? *"
Controller Code
@ApiOperation(value = "update Activity scheduled expression")
@RequestMapping(method = RequestMethod.PUT, path = "/Integration/Activity/updateCron/{activityID}", produces = "application/json")
@ApiResponses(value = {
        @ApiResponse(code = 200, message = "Success", response = String.class),
        @ApiResponse(code = 401, message = "Unauthorized"),
        @ApiResponse(code = 403, message = "Forbidden"),
        @ApiResponse(code = 404, message = "Not Found"),
        @ApiResponse(code = 500, message = "Failure") })
@ResponseBody
public IntegrationProcessResultVO updateActivityScheduledExpression(
        @PathVariable("activityID") Long activityID,
        @RequestBody ScheduledSetupVO scheduledSetupVO)
        throws ActivityNotFoundException, IOException,
        ActivitySchedulerException {
    System.out.println("Activity ID : " + activityID);
    System.out.println("Request Body : " + scheduledSetupVO);
    IntegrationProcessResultVO integrationProcessResultVO = activityService
            .updateActivityCronExpression(activityID, scheduledSetupVO);
    return integrationProcessResultVO;
}
ScheduledSetupVO Class
package com.data.integration.service.vo;
public class ScheduledSetupVO {
    private String cronExpression;

    public ScheduledSetupVO(String cronExpression) {
        super();
        this.cronExpression = cronExpression;
    }

    public ScheduledSetupVO() {
        super();
    }

    public String getCronExpression() {
        return cronExpression;
    }

    public void setCronExpression(String cronExpression) {
        this.cronExpression = cronExpression;
    }

    @Override
    public String toString() {
        return super.toString();
    }
}
Pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<name>dataintegrationservice</name>
<description>Data Integration</description>
<url>http://iquantifi.com/</url>
<groupId>com.data.integration</groupId>
<artifactId>dataintegrationservice</artifactId>
<version>1.0.0</version>
<packaging>jar</packaging>
<!-- Configures repository location for pentaho libraries -->
<repositories>
<repository>
<id>pentaho-releases</id>
<url>https://public.nexus.pentaho.org/content/groups/omni/</url> <!-- Location changed Previous - http://repository.pentaho.org/artifactory/repo/ -->
</repository>
<repository>
<id>clojars-releases</id>
<url>http://clojars.org/repo/</url>
</repository>
</repositories>
<!-- lookup parent from repository -->
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.6.RELEASE</version>
<relativePath />
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
<pentaho.kettle.version>6.1.0.0-192</pentaho.kettle.version>
<vertx.version>3.3.0</vertx.version>
<commons.lang.version>3.4</commons.lang.version>
<jtds.version>1.3.1</jtds.version>
<pentaho-reporting-version>6.1.0.1-196</pentaho-reporting-version>
<quartz.version>2.2.3</quartz.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.security.oauth</groupId>
<artifactId>spring-security-oauth2</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</dependency>
<dependency>
<groupId>org.apache.tomcat</groupId>
<artifactId>tomcat-dbcp</artifactId>
<version>${tomcat.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>net.sourceforge.jtds</groupId>
<artifactId>jtds</artifactId>
<version>${jtds.version}</version>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
</dependency>
<dependency>
<groupId>net.logstash.logback</groupId>
<artifactId>logstash-logback-encoder</artifactId>
<version>4.7</version>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>3.7</version><!--$NO-MVN-MAN-VER$ -->
</dependency>
<dependency>
<groupId>net.lingala.zip4j</groupId>
<artifactId>zip4j</artifactId>
<version>1.3.2</version>
</dependency>
<!-- pentaho libraries -->
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-core</artifactId>
<version>${pentaho.kettle.version}</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-engine</artifactId>
<version>${pentaho.kettle.version}</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-ui-swt</artifactId>
<version>${pentaho.kettle.version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libformula</artifactId>
<version>${pentaho.kettle.version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libbase</artifactId>
<version>${pentaho.kettle.version}</version>
</dependency>
<!-- Jar required for Pentaho Transformation/job -->
<dependency>
<groupId>net.sourceforge.jexcelapi</groupId>
<artifactId>jxl</artifactId>
<version>2.6.12</version>
</dependency>
<dependency>
<groupId>jsonpath</groupId>
<artifactId>jsonpath</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-client</artifactId>
<version>1.19</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs</groupId>
<artifactId>jersey-apache-client</artifactId>
<version>1.16</version>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>janino</artifactId>
<version>2.5.15</version><!--$NO-MVN-MAN-VER$ -->
</dependency>
<dependency>
<groupId>com.enterprisedt</groupId>
<artifactId>edtFTPj</artifactId>
<version>2.1.0</version>
</dependency>
<!-- pentaho reporting -->
<!-- Start pentaho reporting jar dependencies -->
<dependency>
<groupId>pentaho-reporting-engine</groupId>
<artifactId>pentaho-reporting-engine-classic-core</artifactId>
<version>${pentaho-reporting-version}</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>pentaho-reporting-engine</groupId>
<artifactId>pentaho-reporting-engine-classic-extensions</artifactId>
<version>${pentaho-reporting-version}</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libdocbundle</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libfonts</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libformat</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libloader</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>librepository</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libserializer</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libxml</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libswing</artifactId>
<version>${pentaho-reporting-version}</version>
</dependency>
<dependency>
<groupId>com.lowagie</groupId>
<artifactId>itext</artifactId>
<version>2.1.7</version>
</dependency>
<dependency>
<groupId>net.sourceforge.stripes</groupId>
<artifactId>stripes</artifactId>
<version>1.5</version>
</dependency>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.6</version>
</dependency>
<dependency>
<groupId>rhino</groupId>
<artifactId>js</artifactId>
<version>1.7R3</version>
</dependency>
<dependency>
<groupId>org.jfree</groupId>
<artifactId>jfreechart</artifactId>
<version>1.0.15</version>
</dependency>
<dependency>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.8</version>
</dependency>
<dependency>
<groupId>net.sf.ehcache</groupId>
<artifactId>ehcache-core</artifactId>
<version>2.5.1</version>
</dependency>
<dependency>
<groupId>bsh</groupId>
<artifactId>bsh</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>net.sourceforge.barbecue</groupId>
<artifactId>barbecue</artifactId>
<version>1.0.6d</version>
</dependency>
<dependency>
<groupId>bsf</groupId>
<artifactId>bsf</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>pentaho-library</groupId>
<artifactId>libsparkline</artifactId>
<version>5.0.3</version>
</dependency>
<!-- end of Jar required for Pentaho Transformation/job -->
<!-- Apache commons util libraries -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>${commons.lang.version}</version>
</dependency>
<dependency>
<groupId>com.jayway.jsonpath</groupId>
<artifactId>json-path</artifactId>
</dependency>
<!-- Apache commons util libraries end -->
<!-- Swagger UI -->
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>2.3.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>2.3.0</version>
<scope>compile</scope>
</dependency>
<!-- end Swagger UI -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.7</version><!--$NO-MVN-MAN-VER$ -->
</dependency>
<!-- Quartz Scheduler -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context-support</artifactId>
</dependency>
<dependency>
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz</artifactId>
<version>${quartz.version}</version>
</dependency>
<!-- Neo4j JDBC Driver -->
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-jdbc-driver</artifactId>
<version>3.0</version>
</dependency>
<!-- MySQL connector -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<!-- for NoSuchMethod Error while executing jar -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.7.3</version><!--$NO-MVN-MAN-VER$-->
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.7.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.7.3</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<excludeDevtools>true</excludeDevtools>
<executable>true</executable>
</configuration>
</plugin>
</plugins>
</build>
<scm>
<url>https://bitbucket.org/kscloud/dataintegrationservice.git</url>
<connection>scm:git:https://kaniket_kanaka@bitbucket.org/kscloud/dataintegrationservice.git</connection>
<developerConnection>scm:git:https://kaniket_kanaka@bitbucket.org/kscloud/dataintegrationservice.git</developerConnection>
</scm>
If I remove the @RequestBody from the controller method, then it works fine.
You are using an incompatible Jackson version because you are managing the version yourself; don't do that.
Either remove the following dependencies from your pom.xml (spring-boot-starter-web already adds them):
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.7.3</version><!--$NO-MVN-MAN-VER$-->
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.7.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.7.3</version>
</dependency>
Or at least remove the <version> tag from the dependencies so that Spring Boot can manage the version.
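For example, the version-less form of the first dependency (a sketch; the same applies to jackson-core and jackson-annotations) defers to the Boot parent's dependency management:

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <!-- no <version> and no exclusions: spring-boot-starter-parent
         1.3.6.RELEASE supplies a Jackson version that matches spring-web -->
</dependency>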