I am currently writing a Maven plugin that creates some files in a subdirectory of target. I want to zip these files and deploy them as a side artifact.
More precisely:
Say I have created some files in target/someplugin. Then I want to zip these files and attach them to the build, so that the zip is installed/deployed with classifier someplugin.
How can I achieve this?
I did the following (may not be perfect):
First, I injected the MavenProjectHelper:
@Component
private MavenProjectHelper projectHelper;
Then I added the zt-zip library
<dependency>
<groupId>org.zeroturnaround</groupId>
<artifactId>zt-zip</artifactId>
<version>1.13</version>
<type>jar</type>
</dependency>
Now I can do something like
ZipUtil.pack(someplugindir, somepluginzip);
projectHelper.attachArtifact(project, "zip", "someplugin", somepluginzip);
and everything looks fine.
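For completeness, here is a minimal sketch of how these pieces can fit together inside a Mojo (the goal name, field names, and paths are illustrative, not taken from the actual plugin):
import java.io.File;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.MavenProjectHelper;
import org.zeroturnaround.zip.ZipUtil;

@Mojo(name = "attach-zip")
public class AttachZipMojo extends AbstractMojo {

    // the current project, injected by Maven
    @Parameter(defaultValue = "${project}", readonly = true)
    private MavenProject project;

    // helper used to attach additional artifacts to the build
    @Component
    private MavenProjectHelper projectHelper;

    @Override
    public void execute() throws MojoExecutionException {
        File someplugindir = new File(project.getBuild().getDirectory(), "someplugin");
        File somepluginzip = new File(project.getBuild().getDirectory(), "someplugin.zip");
        // zip target/someplugin into target/someplugin.zip
        ZipUtil.pack(someplugindir, somepluginzip);
        // the zip is now installed/deployed as <artifactId>-<version>-someplugin.zip
        projectHelper.attachArtifact(project, "zip", "someplugin", somepluginzip);
    }
}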
I want to use AWS S3 buckets with my Java 17 / Maven 3.8.5 app, but when I add this to the pom.xml file
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.12.177</version>
</dependency>
I get a bunch of errors like these:
Failed to read artifact descriptor for com.amazonaws:aws-java-sdk-iotevents:jar:1.12.177
although when I add -core like this:
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-core</artifactId>
<version>1.12.177</version>
</dependency>
the errors in my pom.xml file go away, but I don't think this is what I want.
What I want is to be able to import com.amazonaws.services.s3.AmazonS3, like this:
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.springframework.context.annotation.Configuration;
@Configuration
public class AmazonConfig {
    public AmazonS3 s3() {
        AWSCredentials awsCredentials = new BasicAWSCredentials("", "");
        // original snippet stopped here; a typical completion would be:
        return AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
                .build();
    }
}
but I keep getting this error
The import com.amazonaws.services.s3 cannot be resolved
I'm really new to Java, so I'm not sure if the problem is in my Maven configuration or something else.
Thanks.
You may need to run mvn install or mvn package; that way Maven will download the dependency.
First of all, check whether the dependencies have been pulled into the external dependencies section of the IDE. If they are not there, check whether Maven is configured properly.
What IDE are you using?
If it's Eclipse, force a Maven update on the project. If it's IntelliJ, invalidate caches from the File menu. Close the IDE. Open the root directory of your project in a file explorer, delete any .preferences or .settings or other unnecessary files or folders created by the IDE in the source folder, and after clearing all that, open the IDE again and see if that resolves the issue.
Also, for S3 you'll need this dependency (use the same version as your other AWS artifacts, e.g. 1.12.177):
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
<version>1.12.177</version>
</dependency>
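If you want the individual SDK modules to resolve consistently, another option (a sketch, reusing the version from the question) is to import the AWS SDK BOM in dependencyManagement and then omit the version on each module:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bom</artifactId>
<version>1.12.177</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
</dependency>
</dependencies>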
Really not sure what was wrong, but starting a new project worked.
I'm hitting this error when running my packaged jar:
java.lang.NoClassDefFoundError: com/google/common/base/Joiner
I simply call: java -jar xxx.jar
I already added the dependency in my pom.xml:
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>18.0</version>
<scope>compile</scope>
</dependency>
I'm using the IntelliJ editor. I have a unit test for the function which uses the Joiner class, and it runs successfully from within IntelliJ.
I put my cursor on Joiner and use "command + B" to jump to the declaration of the Joiner class. It opens the decompiled source code page, and the heading shows the path as: guava-18.0.jar/com/google/common/base/Joiner
So everything looks correct.
Can anyone help me figure out why I hit this error?
When you run java -jar, only the jar itself (plus whatever its manifest Class-Path lists) is on the classpath, so compile-scope dependencies like Guava are not available at runtime unless you bundle them. You could shade Guava into your jar. It results in a bigger jar file, but that way the dependency is guaranteed to be included in your program. Read more about shading Maven dependencies into your jar file in the official Maven documentation: http://maven.apache.org/plugins/maven-shade-plugin/
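For reference, a minimal shade configuration might look like the following (the plugin version and main class are assumptions; adjust them to your project). The ManifestResourceTransformer sets Main-Class so that java -jar still works on the shaded jar:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<!-- hypothetical main class; replace with your own -->
<mainClass>com.example.Main</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>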
I want to upload a file to HDFS. I compiled my code using the following jars as dependencies:
hadoop-auth-2.6.1.jar,
hadoop-common-2.6.1.jar and
hadoop-hdfs-2.6.1.jar.
My code:
I compiled it with Ant, but it gave me this error: No FileSystem for scheme:hdfs.
Then I changed the code and compiled again:
But now I got another error: Class org.apache.hdfs.DistributedFileSystem not found.
What's wrong? And what should I do?
DistributedFileSystem is part of hadoop-core (the Hadoop 1.x artifact; in Hadoop 2.x the class lives in hadoop-hdfs).
To fix this problem, you also need to include hadoop-core-1.2.1.jar (note: I am using Maven for building):
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
Overall, I am using the following Maven dependencies:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
While getting a Hadoop FileSystem object like below:
FileSystem fs = FileSystem.get(hdfsUrl, configuration);
if you get the following error:
"No FileSystem for scheme:hdfs"
you can resolve it by setting the following two properties on the configuration:
configuration.set("fs.hdfs.impl","org.apache.hadoop.hdfs.DistributedFileSystem");
configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
Now, you may get a new error like the one below:
java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
hadoop-common.jar uses Thread.currentThread().getContextClassLoader() and configuration.getClassLoader() to load classes.
So, if you set your class loader by using
Thread.currentThread().setContextClassLoader(yourClassLoader);
configuration.setClassLoader(yourClassLoader);
you will be able to load the required classes from the other Hadoop jars (e.g. hadoop-hdfs).
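Putting the two configuration properties together with the FileSystem call, here is a minimal sketch (the namenode URI and file paths are placeholders):
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUploadSketch {
    public static void main(String[] args) throws Exception {
        Configuration configuration = new Configuration();
        // map the hdfs:// and file:// schemes to their implementations explicitly
        configuration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
        // placeholder namenode URI; adjust host and port for your cluster
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), configuration);
        // upload a local file to HDFS
        fs.copyFromLocalFile(new Path("/tmp/local.txt"), new Path("/user/demo/local.txt"));
        fs.close();
    }
}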
Let me know if you need more help. And don't forget to upvote if you find this bit useful.
I had the same problem when I compiled my Java code into an executable jar and ran it: there was always some "not found" error (e.g. in your case, no FileSystem...), which means some Hadoop jar was not included in the compilation.
The solution is to add the correct dependencies in Maven/Gradle, or to add (all) the jars.
In my case, the hdfs scheme is handled by the class org.apache.hadoop.hdfs.DistributedFileSystem, which comes from the jar hadoop-hdfs-client-3.2.1.jar.
The jars that were actually used can be found in the log file, if you successfully run the program and have one.
You can simply add all the jars from the installed Hadoop folder. They are in the common/hdfs/... subfolders under hadoop 3.2.1/share/hadoop. There may be other jars which are used but not shown in the log; to be safe, just include all of them. You can run hdfs classpath in the terminal to find the location of all the jars.
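For example, assuming a jar named your-app.jar and a main class com.example.Main (both placeholders), you could run with the Hadoop jars on the classpath like this:
java -cp "your-app.jar:$(hdfs classpath)" com.example.Main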
Once all the jars have been added, you may also need to set the Hadoop configuration in your Java code:
Configuration hadoopConfiguration = new Configuration();
hadoopConfiguration.addResource(new Path(CoreSiteXMLStr));
hadoopConfiguration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
I have uploaded some of the jars to my local Artifactory...
For example:
Repository Browser
+-- org
    |- jboos
    |- json
       |- json-java
          |- maven-metadata.xml
The Dependency Declaration that Artifactory shows for this maven-metadata.xml file is as below:
<dependency>
<groupId>org</groupId>
<artifactId>json</artifactId>
<version>json-java</version>
<type>xml</type>
</dependency>
I need to change this XML file for a reason. Is there any way to do this from the Artifactory UI itself (without logging in to the local Artifactory server)?
First, do not upload maven-metadata files. They are generated by Artifactory for you.
Second, since Artifactory usually manages uneditable binary files, there is no edit functionality in Artifactory.
I am with JFrog, the company behind Bintray and Artifactory; see my profile for details and links.
I am trying to compile my JasperReports template using an Ant script and Java. I am getting this error:
jasper java.lang.NoClassDefFoundError:
org/codehaus/groovy/control/CompilationFailedException
There is nothing complex in the template, but I still can't compile.
You will have to set the language value in your template to Java. There are two ways you can do this:
If you are using iReport, select the root object in your Report Inspector (the one with the same name as your report). Then in the Properties window, select Java from the Languages drop-down.
If you are editing the raw mark-up in the JRXML file, remove language="groovy" from the file altogether.
Then try to recompile - you should be sorted. :)
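For reference, the attribute in question sits on the root jasperReport element; a trimmed JRXML illustration (other attributes omitted):
<!-- before: report expressions are compiled as Groovy, which needs the Groovy jar -->
<jasperReport xmlns="http://jasperreports.sourceforge.net/jasperreports" name="report" language="groovy">
<!-- after: plain Java expressions, no Groovy jar required -->
<jasperReport xmlns="http://jasperreports.sourceforge.net/jasperreports" name="report">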
If you are using Maven, you must add the Groovy dependency to your pom.xml:
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-all</artifactId>
<version>2.4.10</version>
</dependency>
Otherwise, you must add the Groovy library to your lib folder (WEB-INF/lib).
Another solution is to copy groovy-all-{version}.jar from the Groovy binary distribution into the application's lib directory.
If you are using TIBCOJaspersoftStudio:
Download latest groovy 2.4.* jar from https://groovy.apache.org/download.html
Unpack and get this file ./groovy-2.4.10/embeddable/groovy-all-2.4.10.jar
Put the jar in ./TIBCOJaspersoftStudio-6.3.1.final/plugins
Delete the old jar: ./TIBCOJaspersoftStudio-6.3.1.final/plugins/groovy-all_2.4.5.jar
Change the language to java in the JRXML (e.g. language="java") or add groovy*.jar to your project's classpath.
You are missing an important library, Groovy, on your classpath.
Case 1: if you are using Maven, add this dependency, with a compatible version, to your pom.xml:
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-all</artifactId>
<version>3.0.4</version>
</dependency>
Case 2: add a compatible version of the Groovy jar to the classpath.
URL to download the Groovy jar: http://groovy-lang.org/download.html