Maven cucumber reporting failure - java.io.FileNotFoundException: cucumber.json - java

I'm experiencing the following error when trying to run Cucumber-JVM tests using the Maven reporting mojo:
[INFO] --- maven-cucumber-reporting:0.0.5:generate (execution) @ testproject ---
About to generate
java.io.FileNotFoundException: /Users//IdeaProjects/testproject/target/cucumber.json (No such file or directory)
I have the following dependencies in my pom.xml file:
<dependency>
<groupId>net.masterthought</groupId>
<artifactId>maven-cucumber-reporting</artifactId>
<version>${maven.cucumber.reporting.version}</version>
</dependency>
<dependency>
<groupId>net.masterthought</groupId>
<artifactId>cucumber-reporting</artifactId>
<version>${cucumber.reporting.version}</version>
</dependency>
<dependency>
<groupId>com.googlecode.totallylazy</groupId>
<artifactId>totallylazy</artifactId>
<version>1077</version>
</dependency>
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>1.25</version>
</dependency>
The repositories are set to the following:
<repositories>
<repository>
<id>repo.bodar.com</id>
<url>http://repo.bodar.com</url>
</repository>
<repository>
<id>sonatype-releases</id>
<url>https://oss.sonatype.org/content/repositories/releases/</url>
</repository>
</repositories>
The plugins are set to the following:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.16</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
</configuration>
</plugin>
<plugin>
<groupId>net.masterthought</groupId>
<artifactId>maven-cucumber-reporting</artifactId>
<version>${maven.cucumber.reporting.version}</version>
<executions>
<execution>
<id>execution</id>
<phase>verify</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<projectName>${project.name}</projectName>
<outputDirectory>${project.build.directory}/cucumber-html-reports</outputDirectory>
<cucumberOutput>${project.build.directory}/cucumber.json</cucumberOutput>
<enableFlashCharts>false</enableFlashCharts>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
I'm using the standard Maven project structure, with feature files under /src/test/resources and step definitions under /src/test/main.
Could you please advise how to resolve this issue? I'm seeing it on 64-bit macOS.
Thanks in advance.
Best Regards,
Raja

The plugin generates the HTML report from JSON, so the first step is to have Cucumber generate the JSON.
When I ran into the same issue, I found the cause: the cucumber.json file does not exist.
I addressed this in my project by adding a json:target/cucumber.json entry under the @CucumberOptions annotation, which then generated the JSON. For example:
@CucumberOptions(features = "src/test/resources/features/some.feature",
monochrome = false, format = {"pretty", "json:target/cucumber.json"})
The masterthought plugin then looks for the generated JSON at ${project.build.directory}/cucumber.json and generates the HTML report from it.
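For reference, here is a minimal JUnit runner sketch that wires this up. The package and class names are hypothetical, and it assumes a cucumber-jvm 1.x API where the format attribute is still accepted:

package org.example.runners; // hypothetical package name

import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

// Running this class through Surefire during the test phase writes
// target/cucumber.json, which the masterthought plugin picks up at verify.
@RunWith(Cucumber.class)
@CucumberOptions(features = "src/test/resources/features",
        monochrome = false,
        format = {"pretty", "json:target/cucumber.json"})
public class RunCucumberTest {
}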

Related

Spark: ClassNotFoundException while reading/writing CSV

I'm trying to write a DataFrame to a CSV file on HDFS as follows:
df.write()
.format("com.databricks.spark.csv")
.option("header", "true")
.save("/user/cloudera/csv");
but I get the following error
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/csv/CSVFormat
...
Caused by: java.lang.ClassNotFoundException: org.apache.commons.csv.CSVFormat
... 21 more
My pom.xml has the following dependencies
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.10</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.5</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.6.0</version>
</dependency>
I use Spark 1.6.0 with Scala 2.10.5, and use the following command to submit the job:
spark-submit --jars /path/spark-csv_2.10-1.5.0.jar --class com.iris.Begin /path/CsvSolver.jar
I also have commons-csv/1.1 and commons-csv/1.5 in my .m2 repository.
Could someone help me with this?
Just try adding the needed jars to the jars folder located inside the Spark folder, ...\spark\jars\
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.5</version>
</dependency>
Try adding this to the POM. If that doesn't work, manually download the JAR from https://mvnrepository.com/artifact/org.apache.commons/commons-csv/1.5 and add it using --jars on spark-submit. That should solve the problem.
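If you stay with the --jars route, note that multiple jars are passed as a comma-separated list; a sketch based on the submit command from the question (paths are placeholders):

spark-submit --jars /path/spark-csv_2.10-1.5.0.jar,/path/commons-csv-1.5.jar --class com.iris.Begin /path/CsvSolver.jar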
It's better to build a fat jar that includes all your dependencies (spark-core should be marked as provided) and submit only this jar, without any additional --jars options.
In Maven you can generate a fat jar using the Maven Assembly Plugin with the predefined descriptor jar-with-dependencies. Something like:
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.1.0</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
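With this in place, mvn clean package should additionally produce an artifact with the jar-with-dependencies suffix (the exact file name depends on your artifactId and version), which can then be submitted without any --jars option, for example:

spark-submit --class com.iris.Begin /path/CsvSolver-0.1-jar-with-dependencies.jar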

QueryDSL & Maven: classes not generated, but only on the command line (in Eclipse it works fine)

We are using a setup with Spring Boot, Hibernate, QueryDSL, and Maven with Java 1.8.
Recently, I added QueryDSL to the project with the configuration listed below. To make it work, I had to configure the Java compiler in the Eclipse project settings to allow annotation processing, and also add the QueryDSL .jar file to the Eclipse annotation factory path.
This setup worked as expected: it generated the custom Q classes and I could use them in my code. However, when running mvn clean install on the command line, every class that uses them fails with the error cannot find symbol, because the generated class is missing. Is there anything else I need to configure - similar to the .jar file in the Eclipse settings - to make the build process work?
EDIT: This question is not a duplicate of this question, because I did not ask why the error (cannot find symbol) occurs, but rather how to configure QueryDSL to also work on the command line.
EDIT 2: I have now tried integrating the build-helper-maven-plugin to use multiple source paths as input; this did not help. I also tried generating the files into a src folder, which did not help either.
If I first compile the project in Eclipse, mvn compile goes through on the command line (because it reuses the class files Eclipse compiled), but mvn clean compile still fails. The apt-maven-plugin is executed, as can be seen just before the build process fails:
[INFO] --- apt-maven-plugin:1.1.3:process (default) @ project1 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ project1 ---
[INFO] Source directory: C:\Users\user1\git\project1\src\main\generated added.
[INFO]
[INFO] --- maven-processor-plugin:2.2.4:process (process) @ project1 ---
[ERROR] diagnostic: [...]
EDIT 3: When I remove every import statement that refers to the Q classes, the build process goes through (obviously). Remarkably, though, the Q classes do get compiled correctly in that case: they appear in the target folder as .class files, as they should. Could it be that the Q classes are compiled too late?
Here is an excerpt of the pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
[...]
<prerequisites>
<maven>3.0.0</maven>
</prerequisites>
<dependencies>
[...]
<dependency>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>4.1.3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-jpa</artifactId>
<version>4.1.3</version>
</dependency>
</dependencies>
<build>
<defaultGoal>spring-boot:run</defaultGoal>
<plugins>
[...]
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
<version>1.1.3</version>
<executions>
<execution>
<goals>
<goal>process</goal>
</goals>
<configuration>
<outputDirectory>target/generated-sources/java</outputDirectory>
<processor>com.querydsl.apt.jpa.JPAAnnotationProcessor</processor>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
[...]
</build>
</project>
This is the configuration of the Eclipse project settings (screenshot omitted).
This is the error message displayed in the console:
[INFO] --- maven-processor-plugin:2.2.4:process (process) @ project1 ---
[ERROR] diagnostic: C:\Users\user1\git\project1\src\main\java\com\project1\repository\UserRepositoryImpl.java:3: error: cannot find symbol
import static com.project1.domain.QUser.user;
^
symbol: class QUser
location: package com.project1.domain
[ERROR] diagnostic: C:\Users\user1\git\project1\src\main\java\com\project1\repository\UserRepositoryImpl.java:3: error: static import only from classes and interfaces
import static com.project.domain.QUser.user;
^
This is an old question, but this is how I found my solution: I added a classifier to the JPA dependency:
<!-- BEGIN: 'querydsl-jpa' -->
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-jpa</artifactId>
<version>${querydsl-jpa.version}</version>
<classifier>apt</classifier>
</dependency>
<!-- END: 'querydsl-jpa' -->
My complete pom:
<!-- BEGIN: BUILD -->
<build>
<!-- BEGIN: PLUGINS -->
<plugins>
<!-- BEGIN: apt-maven-plugin -->
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
<version>${apt.version}</version>
<executions>
<execution>
<goals>
<goal>process</goal>
</goals>
<configuration>
<outputDirectory>target/generated-sources/apt</outputDirectory>
<processor>com.mysema.query.apt.jpa.JPAAnnotationProcessor</processor>
</configuration>
</execution>
</executions>
</plugin>
<!-- END: apt-maven-plugin -->
</plugins>
<!-- END: PLUGINS -->
</build>
<!-- END: BUILD -->
<!-- BEGIN: DEPENDENCIES -->
<dependencies>
<!-- *********************************************** -->
<!-- BEGIN: 'QUERYDSL DEPENDENCIES' -->
<!-- BEGIN: 'querydsl-apt' -->
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl-apt.version}</version>
</dependency>
<!-- END: 'querydsl-apt' -->
<!-- BEGIN: 'querydsl-jpa' -->
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-jpa</artifactId>
<version>${querydsl-jpa.version}</version>
<classifier>apt</classifier>
</dependency>
<!-- END: 'querydsl-jpa' -->
<!-- *********************************************** -->
<!-- END: 'QUERYDSL DEPENDENCIES' -->
</dependencies>
For me, it didn't work because apt-maven-plugin conflicted with the annotation processors already configured in maven-compiler-plugin. I simply deleted the use of apt-maven-plugin and added its annotation processor to maven-compiler-plugin instead.
<build>
<plugins>
<!-- related to issues:-->
<!-- - https://github.com/querydsl/querydsl/issues/2654 -->
<!-- - https://github.com/querydsl/querydsl/issues/2242 -->
<!-- Using apt-maven-plugin conflicts with other annotation processors (like mapStruct) -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<annotationProcessorPaths>
<path>
<groupId>org.mapstruct</groupId>
<artifactId>mapstruct-processor</artifactId>
<version>${org.mapstruct.version}</version>
</path>
<path>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>5.0.0</version>
<classifier>jpa</classifier>
</path>
<path>
<groupId>jakarta.persistence</groupId>
<artifactId>jakarta.persistence-api</artifactId>
<version>2.2.3</version>
</path>
<path>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
<version>1.3.2</version>
</path>
<!-- other annotation processors -->
</annotationProcessorPaths>
</configuration>
</plugin>
<!-- <plugin>-->
<!-- <groupId>com.mysema.maven</groupId>-->
<!-- <artifactId>apt-maven-plugin</artifactId>-->
<!-- <version>1.1.3</version>-->
<!-- <executions>-->
<!-- <execution>-->
<!-- <goals>-->
<!-- <goal>process</goal>-->
<!-- </goals>-->
<!-- <configuration>-->
<!-- <outputDirectory>target/generated-sources</outputDirectory>-->
<!-- <processor>com.mysema.query.apt.jpa.JPAAnnotationProcessor</processor>-->
<!-- </configuration>-->
<!-- </execution>-->
<!-- </executions>-->
<!-- </plugin>-->
</plugins>
</build>
But there is one catch with using the QueryDSL annotation processor in maven-compiler-plugin: you also have to add jakarta.persistence-api and javax.annotation-api to the processor path.
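Assuming default maven-compiler-plugin settings, a plain compile then runs the processors and writes the generated Q classes to target/generated-sources/annotations, which Maven compiles together with your own sources:

mvn clean compile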
I would rather use a profile to generate these Q classes only when a DB change occurs.
Pros:
- Your diff in pull requests stays clean when you don't change the DB schema, because each generation tends to produce slightly different files for some reason (at least in my case).
- You can control which of the tables present in your DB get Q classes (though it is sometimes a pain when you forget to regenerate them after changing the DB schema).
- Builds are faster when you don't change the schema and the profile is turned off (not that it saves a lot of time).
Try something like this, and turn the profile on when you want to regenerate the Q classes for a changed schema:
<properties>
<whitelisted.tables>
user_accunt,
other
</whitelisted.tables>
</properties>
<profiles>
<profile>
<id>generate</id>
<build>
<plugins>
<plugin>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-maven-plugin</artifactId>
<version>${querydsl.version}</version>
<executions>
<execution>
<goals>
<goal>export</goal>
</goals>
</execution>
</executions>
<configuration>
<jdbcDriver>org.postgresql.Driver</jdbcDriver>
<jdbcUrl>jdbc:postgresql://localhost:port/dbname</jdbcUrl>
<packageName>your.package.name.for.q</packageName>
<jdbcUser>dbusername</jdbcUser>
<jdbcPassword>dbpassword</jdbcPassword>
<targetFolder>${project.basedir}/src/main/java/</targetFolder>
<spatial>true</spatial>
<tableNamePattern>${whitelisted.tables}</tableNamePattern>
</configuration>
<dependencies>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.1.1</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
</profile>
</profiles>
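Assuming the export goal keeps its default binding to the generate-sources phase, regenerating the Q classes is then an explicit, opt-in step:

mvn generate-sources -Pgenerate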
The generated-sources directory is not automatically included in the build.
You need to use the Maven build-helper plugin to fix this issue, for example:
https://github.com/alexec/javahelp-skeleton/blob/master/pom.xml
You can try out a few things:
1. Put <clearOutputDir>false</clearOutputDir> in your configuration tag.
2. Sometimes the classes might not be generated before the compile phase, so try setting a phase for your plugin execution:
<execution>
<phase>generate-sources</phase>
<goals>
<goal>...</goal>
</goals>
</execution>
By convention, Maven assumes all source code is in src/main/java, compiles it, and puts the *.class files in target.
So if you have a class Alien.java in <project-root>/alice/in/wonderland, code in src/main/java won't be able to access it, because Maven only puts src/main/java on the compiler's source path, and the compiler is unaware of any source code (*.java) anywhere else.
In your case you are generating your source code into the directory target/generated-sources/java, so you have to tell Maven about it. As mentioned in some other answers, you can use the build-helper-maven-plugin for this and let Maven know that your sources also reside in target/generated-sources/java:
<project>
...
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<id>add-source</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>target/generated-sources/java</source>
<source>alice/in/wonderland</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Edit: you have configured the wrong path in the build-helper plugin.
I see that although you are using the build-helper plugin, you are pointing it at the wrong path:
--- build-helper-maven-plugin:1.9.1:add-source (add-source) @ project1 ---
[INFO] Source directory: C:\Users\user1\git\project1\src\main\generated added.
Instead of src\main\generated you should use <source>target/generated-sources/java</source>.

class file for org.apache.flink.api.common.serialization.DeserializationSchema not found

I'm trying to develop a Flink streaming job. The job should read from a Kafka topic.
I've tried to update the example at https://github.com/dataArtisans/kafka-example/blob/master/src/main/java/com/dataartisans/ReadFromKafka.java
I want to use Flink 1.4 and Kafka 0.11.
When I try to build the (Maven) project I get the following error:
[ERROR] /quickstart/src/main/java/org/myorg/quickstart/StreamingJob.java:[20,66] cannot access org.apache.flink.api.common.serialization.DeserializationSchema
class file for org.apache.flink.api.common.serialization.DeserializationSchema not found
[INFO] 1 error
...
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project quickstart: Compilation failure
[ERROR] /quickstart/src/main/java/org/myorg/quickstart/StreamingJob.java:[20,66] cannot access org.apache.flink.api.common.serialization.DeserializationSchema
[ERROR] class file for org.apache.flink.api.common.serialization.DeserializationSchema not found
Are there any ideas how to resolve this error? So far, I haven't been able to find a solution.
Files
StreamingJob.java
package org.myorg.quickstart;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
import org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema;
import org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema;
public class StreamingJob {
public static void main(String[] args) throws Exception {
// set up the streaming execution environment
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// parse user parameters
ParameterTool parameterTool = ParameterTool.fromArgs(args);
DataStream<String> messageStream = env.addSource(new FlinkKafkaConsumer08<String>(parameterTool.getRequired("topic"), (KeyedDeserializationSchema) new JSONKeyValueDeserializationSchema(true), parameterTool.getProperties()));
// print() will write the contents of the stream to the TaskManager's standard out stream
// the rebalance() call causes a repartitioning of the data so that all machines
// see the messages (for example, in cases where "num kafka partitions" < "num flink operators")
messageStream.rebalance().map(new MapFunction<String, String>() {
private static final long serialVersionUID = -6867736771747690202L;
@Override
public String map(String value) throws Exception {
return "Kafka and Flink says: " + value;
}
}).print();
// execute program
env.execute("Flink Streaming Java API Skeleton");
}
}
pom.xml
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.myorg.quickstart</groupId>
<artifactId>quickstart</artifactId>
<version>0.1</version>
<packaging>jar</packaging>
<name>Flink Quickstart Job</name>
<url>http://www.myorganization.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.3.0</flink.version>
<slf4j.version>1.7.7</slf4j.version>
<log4j.version>1.2.17</log4j.version>
<scala.binary.version>2.10</scala.binary.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<!--
Execute "mvn clean package -Pbuild-jar"
to build a jar file out of this project!
How to use the Flink Quickstart pom:
a) Adding new dependencies:
You can add dependencies to the list below.
Please check if the maven-shade-plugin below is filtering out your dependency
and remove the exclude from there.
b) Build a jar for running on the cluster:
There are two options for creating a jar from this project
b.1) "mvn clean package" -> this will create a fat jar which contains all
dependencies necessary for running the jar created by this pom in a cluster.
The "maven-shade-plugin" excludes everything that is provided on a running Flink cluster.
b.2) "mvn clean package -Pbuild-jar" -> This will also create a fat-jar, but with much
nicer dependency exclusion handling. This approach is preferred and leads to
much cleaner jar files.
-->
<dependencies>
<!-- Apache Flink dependencies -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.10</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.10</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- explicitly add a standard logging framework, as Flink does not have
a hard dependency on one specific framework by default -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.8_2.11</artifactId>
<version>1.4-SNAPSHOT</version>
</dependency>
</dependencies>
<profiles>
<profile>
<!-- Profile for packaging correct JAR files -->
<id>build-jar</id>
<activation>
<activeByDefault>false</activeByDefault>
</activation>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.10</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.10</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<!-- disable the exclusion rules -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes combine.self="override"></excludes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
except flink and its transitive dependencies. The resulting fat-jar can be executed
on a cluster. Change the value of Program-Class if your program entry point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<!-- This list contains all dependencies of flink-dist
Everything else will be packaged into the fat-jar
-->
<exclude>org.apache.flink:flink-annotations</exclude>
<exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
<exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
<exclude>org.apache.flink:flink-core</exclude>
<exclude>org.apache.flink:flink-java</exclude>
<exclude>org.apache.flink:flink-scala_2.10</exclude>
<exclude>org.apache.flink:flink-runtime_2.10</exclude>
<exclude>org.apache.flink:flink-optimizer_2.10</exclude>
<exclude>org.apache.flink:flink-clients_2.10</exclude>
<exclude>org.apache.flink:flink-avro_2.10</exclude>
<exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
<exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
<exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
<exclude>org.apache.flink:flink-streaming-scala_2.10</exclude>
<exclude>org.apache.flink:flink-scala-shell_2.10</exclude>
<exclude>org.apache.flink:flink-python</exclude>
<exclude>org.apache.flink:flink-metrics-core</exclude>
<exclude>org.apache.flink:flink-metrics-jmx</exclude>
<exclude>org.apache.flink:flink-statebackend-rocksdb_2.10</exclude>
<!-- Also exclude very big transitive dependencies of Flink
WARNING: You have to remove these excludes if your code relies on other
versions of these dependencies.
-->
<exclude>log4j:log4j</exclude>
<exclude>org.scala-lang:scala-library</exclude>
<exclude>org.scala-lang:scala-compiler</exclude>
<exclude>org.scala-lang:scala-reflect</exclude>
<exclude>com.data-artisans:flakka-actor_*</exclude>
<exclude>com.data-artisans:flakka-remote_*</exclude>
<exclude>com.data-artisans:flakka-slf4j_*</exclude>
<exclude>io.netty:netty-all</exclude>
<exclude>io.netty:netty</exclude>
<exclude>commons-fileupload:commons-fileupload</exclude>
<exclude>org.apache.avro:avro</exclude>
<exclude>commons-collections:commons-collections</exclude>
<exclude>org.codehaus.jackson:jackson-core-asl</exclude>
<exclude>org.codehaus.jackson:jackson-mapper-asl</exclude>
<exclude>com.thoughtworks.paranamer:paranamer</exclude>
<exclude>org.xerial.snappy:snappy-java</exclude>
<exclude>org.apache.commons:commons-compress</exclude>
<exclude>org.tukaani:xz</exclude>
<exclude>com.esotericsoftware.kryo:kryo</exclude>
<exclude>com.esotericsoftware.minlog:minlog</exclude>
<exclude>org.objenesis:objenesis</exclude>
<exclude>com.twitter:chill_*</exclude>
<exclude>com.twitter:chill-java</exclude>
<exclude>commons-lang:commons-lang</exclude>
<exclude>junit:junit</exclude>
<exclude>org.apache.commons:commons-lang3</exclude>
<exclude>org.slf4j:slf4j-api</exclude>
<exclude>org.slf4j:slf4j-log4j12</exclude>
<exclude>log4j:log4j</exclude>
<exclude>org.apache.commons:commons-math</exclude>
<exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
<exclude>commons-logging:commons-logging</exclude>
<exclude>commons-codec:commons-codec</exclude>
<exclude>com.fasterxml.jackson.core:jackson-core</exclude>
<exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
<exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
<exclude>stax:stax-api</exclude>
<exclude>com.typesafe:config</exclude>
<exclude>org.uncommons.maths:uncommons-maths</exclude>
<exclude>com.github.scopt:scopt_*</exclude>
<exclude>commons-io:commons-io</exclude>
<exclude>commons-cli:commons-cli</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<artifact>org.apache.flink:*</artifact>
<excludes>
<!-- exclude shaded google but include shaded curator -->
<exclude>org/apache/flink/shaded/com/**</exclude>
<exclude>web-docs/**</exclude>
</excludes>
</filter>
<filter>
<!-- Do not copy the signatures in the META-INF folder.
Otherwise, this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<!-- If you want to use ./bin/flink run <quickstart jar> uncomment the following lines.
This will add a Main-Class entry to the manifest file -->
<!--
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>org.myorg.quickstart.StreamingJob</mainClass>
</transformer>
</transformers>
-->
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source> <!-- If you want to use Java 8, change this to "1.8" -->
<target>1.7</target> <!-- If you want to use Java 8, change this to "1.8" -->
</configuration>
</plugin>
</plugins>
<!-- If you want to use Java 8 Lambda Expressions uncomment the following lines -->
<!--
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
<compilerId>jdt</compilerId>
</configuration>
<dependencies>
<dependency>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-compiler-jdt</artifactId>
<version>0.21.0</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<versionRange>[2.4,)</versionRange>
<goals>
<goal>single</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<versionRange>[3.1,)</versionRange>
<goals>
<goal>testCompile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
-->
</build>
</project>
I think it's because you are trying to use Flink 1.3.0, according to your pom.xml:
<flink.version>1.3.0</flink.version>
In 1.3.0, DeserializationSchema lives in org.apache.flink.streaming.util.serialization, not where the compiler is trying to look. You should be able to fix this by changing the version in your pom.xml:
<flink.version>1.4.1</flink.version>
You need to add this Maven dependency:
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.4.0</version>
</dependency>

Lazerycode JMeter Maven plugin Java extension downloads excluded dependencies

I am trying to use Java samplers in my tests.
I have a separate Maven project where I create my extensions. After building that project I get a .jar library, which I include in my Maven plugin like this:
<dependencies>
<dependency>
<groupId>com.lazerycode.jmeter</groupId>
<artifactId>jmeter-maven-plugin</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>com.qiagen</groupId>
<artifactId>qa_toolkit</artifactId>
<version>RELEASE</version>
</dependency>
<dependency>
<groupId>com.qiagen</groupId>
<artifactId>JMeterExtensions</artifactId>
<version>jmeter3.2.3</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.lazerycode.jmeter</groupId>
<artifactId>jmeter-maven-plugin</artifactId>
<version>2.2.0</version>
<executions>
<execution>
<id>jmeter-tests</id>
<phase>verify</phase>
<goals>
<goal>jmeter</goal>
</goals>
</execution>
</executions>
<configuration>
<testFilesDirectory>${basedir}/src/test/jmeter/</testFilesDirectory>
<testFilesIncluded>
<jMeterTestFile>${jmxTest}</jMeterTestFile>
</testFilesIncluded>
<jmeterDirectory>${jmeter.home}</jmeterDirectory>
<jmeterExtensions>
<artifact>com.qiagen:JMeterExtensions:jmeter3.2.3</artifact>
</jmeterExtensions>
<propertiesUser>
<csvData>${basedir}/src/test/jmeter/${csvData}</csvData>
<threads>${threads}</threads>
<rampTime>${rampTime}</rampTime>
<loopCount>${loopCount}</loopCount>
<options>${options}</options>
<server>${server}</server>
<port>${port}</port>
<sleep>${sleep}</sleep>
<inputXmlFileDir>${inputXmlFileDir}</inputXmlFileDir>
<templatesCsv>${templatesCsv}</templatesCsv>
<xmlInputsCsv>${xmlInputsCsv}</xmlInputsCsv>
<reportScenariosCsv>${reportScenariosCsv}</reportScenariosCsv>
</propertiesUser>
<jMeterProcessJVMSettings>
<xms>2048</xms>
<xmx>2048</xmx>
<arguments>
<argument>-Xprof</argument>
<argument>-Xfuture</argument>
</arguments>
</jMeterProcessJVMSettings>
</configuration>
</plugin>
</plugins>
</build>
In my extensions project I have some invalid transitive dependencies, which I excluded in the extension's pom.xml; I don't see them in the dependency tree.
When I run the tests with the downloadExtensionDependencies flag set to true, the plugin appears to download all dependencies (including the excluded ones), and the test then fails because of the invalid dependency:
Failed to collect dependencies at org.springframework:spring-webmvc:jar:3.1.1.RELEASE -> jasperreports:jasperreports:jar:2.0.5 -> commons-collections:commons-collections:jar:3.2.1.redhat-7: Failed to read artifact descriptor for commons-collections:commons-collections:jar:3.2.1.redhat-7: Could not transfer artifact org.apache.commons:commons-parent:pom:22-redhat-2 from/to jaspersoft (http://www.jasperforge.org/maven2): www.jasperforge.org: Unknown host www.jasperforge.org -> [Help 1]
Do you have any idea why the plugin is trying to download the excluded dependencies as well?
Use version 2.6.0 of the plugin, which now has better default values, like not downloading optional dependencies.
And use this to exclude broken or unwanted dependencies:
<excludedArtifacts>
<exclusion>commons-pool2:commons-pool2</exclusion>
<exclusion>commons-math3:commons-math3</exclusion>
<exclusion>com.sun.jdmk:jmxtools</exclusion>
<exclusion>com.sun.jmx:jmxri</exclusion>
</excludedArtifacts>
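For clarity, the excludedArtifacts element goes inside the plugin's <configuration>; here is a sketch using the artifact that fails in the question:

<plugin>
<groupId>com.lazerycode.jmeter</groupId>
<artifactId>jmeter-maven-plugin</artifactId>
<version>2.6.0</version>
<configuration>
<excludedArtifacts>
<!-- groupId:artifactId of the broken transitive dependency -->
<exclusion>commons-collections:commons-collections</exclusion>
</excludedArtifacts>
</configuration>
</plugin>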

Maven won't exclude during copy-dependencies

I have a project that uses Netty 4.0.29, and another dependency pulls in Netty 3.9.0. I put in an exclusion, but 3.9.0 is still roped in when I run copy-dependencies.
<dependency>
<groupId>com.ning</groupId>
<artifactId>async-http-client</artifactId>
<version>1.9.31</version>
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
</exclusion>
</exclusions>
</dependency>
If I run mvn dependency:tree with this exclusion in place, I see that it is indeed excluded:
[INFO] +- com.ning:async-http-client:jar:1.9.31:compile
But when I run mvn clean dependency:copy-dependencies, I see the 3.9.0 jar being copied along with 4.0.29. According to the documentation and Google, it should not be copied when there is an exclusion.
[INFO] Copying netty-3.9.0.Final.jar to /Users/udonom1/wk/141/coursecopy-api/target/dependency/netty-3.9.0.Final.jar
[INFO] Copying netty-all-4.0.29.Final.jar to /Users/udonom1/wk/141/coursecopy-api/target/dependency/netty-all-4.0.29.Final.jar
I tried excluding as suggested by the first answer below and that did not work.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>process-sources</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
<overWriteReleases>false</overWriteReleases>
<overWriteSnapshots>false</overWriteSnapshots>
<overWriteIfNewer>true</overWriteIfNewer>
<excludeArtifactIds>io.netty:netty:3.9.0.Final</excludeArtifactIds>
</configuration>
</execution>
</executions>
</plugin>
I also added a dependency as further suggested:
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.0.29.Final</version>
</dependency>
What am I doing wrong?
For those who are having the same issue: I used mvn -X and discovered that dependency:tree was omitting two other jars that reference netty. I added exclusions for those and I'm good to go. I spent a whole day on this.
If you are not writing a library, you have a simple way to control the version of any dependency in your project: a dependencyManagement block in the root POM file. Example:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
<version>4.0.29.Final</version>
</dependency>
</dependencies>
</dependencyManagement>
An additional bonus of this block: you can omit the version and scope in concrete dependency declarations (those with the same group ID, artifact ID, and packaging).
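For example, a concrete dependency declaration can then drop the version, which is picked up from dependencyManagement:

<dependency>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
<!-- version 4.0.29.Final is inherited from dependencyManagement -->
</dependency>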
PS: another look at your dependencies makes me ask: are you sure this dependency has a single Maven artifact ID? netty-all-4.0.29.Final.jar suggests that the artifact ID is netty-all. If the two have different artifact IDs, my recipe won't help; in that case you should define a build configuration for the maven-dependency-plugin, for example:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.10</version>
<configuration>
<excludeArtifactIds>netty</excludeArtifactIds> <!-- artifact IDs only, not groupId:artifactId:version coordinates -->
</configuration>
</plugin>
</plugins>
</build>
or just pass the -DexcludeArtifactIds parameter on your Maven call.
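For example (excludeArtifactIds matches plain artifact IDs, not full coordinates):

mvn clean dependency:copy-dependencies -DexcludeArtifactIds=netty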
