Add jar file to local repository (Linux server) - Java

I created a "lib" folder in my project:
~\test2000\lib\ls-client.jar
Then I added the jar file to this folder. Now I want to reference it in the project, which I did with the following dependency. It works correctly on my system:
<dependency>
    <groupId>com.lightstreamer</groupId>
    <artifactId>ls-client</artifactId>
    <version>2.5.2</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/lib/ls-client.jar</systemPath>
</dependency>
But when I transfer the program to the Linux server, I get an error saying that it cannot read the jar file. I suspect the groupId and artifactId are not set up correctly. Can anyone help me?

Step 1:
Download "ls-client", extract it, and copy ls-client.jar somewhere convenient, for example the C drive. Then issue the following command:
mvn install:install-file -Dfile=c:\test2000\lib\ls-client.jar -DgroupId=com.lightstreamer -DartifactId=ls-client -Dversion=2.5.2 -Dpackaging=jar
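On the Linux server the same install works with a Unix-style path (the path below is only an example; adjust it to wherever the jar actually lives):
mvn install:install-file -Dfile=/home/user/test2000/lib/ls-client.jar -DgroupId=com.lightstreamer -DartifactId=ls-client -Dversion=2.5.2 -Dpackaging=jar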
Step 2:
Reference it as a regular dependency, without the system scope or systemPath:
<dependency>
    <groupId>com.lightstreamer</groupId>
    <artifactId>ls-client</artifactId>
    <version>2.5.2</version>
</dependency>

Related

Not able to create PDF using iText 7.1.9 in Java

I'm trying to create a PDF file using the iText 7.1.9 jar in Java, but I'm getting the exception below. I have followed all the necessary steps to compile the jars and set the class path and build path. Please help me resolve this.
java.lang.NoClassDefFoundError: com/itextpdf/kernel/pdf/PdfDocument
at the line pdfdoc = new PdfDocument(new PdfReader(file));. My pom.xml for Vaadin 8/Maven is below. In the properties tag:
<itext.version>RELEASE</itext.version>
And in dependencies I have added below entry:
<!-- https://mvnrepository.com/artifact/com.itextpdf/itext7-core -->
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>itext7-core</artifactId>
    <version>7.1.9</version>
    <type>pom</type>
</dependency>
I have added all the jars from the command line like this:
mvn deploy:deploy-file -Dfile=/barcodes-7.1.9.jar -DgroupId=com.roufid.tutorials -DartifactId=example-app -Dversion=1.0 -Dpackaging=jar -Durl=file:./maven-repository/ -DrepositoryId=maven-repository -DupdateReleaseInfo=true
This happens when a class your code depends on is present at compile time but not found at runtime. Look for differences between your build-time and runtime classpaths.
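One thing worth noting here: the deploy-file command above installs barcodes-7.1.9.jar under the coordinates com.roufid.tutorials:example-app:1.0, which nothing in the POM references. If you install iText jars by hand, the coordinates must match the dependency you declare. A sketch for the barcodes jar (assuming a plain local install is what you want):
mvn install:install-file -Dfile=barcodes-7.1.9.jar -DgroupId=com.itextpdf -DartifactId=barcodes -Dversion=7.1.9 -Dpackaging=jar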

Maven cannot import "pinyin4j.jar"

I want to translate Chinese to pinyin, so I want to use the pinyin4j library (pinyin4j.jar). But the Maven dependency tree does not contain "pinyin4j".
I have tried the commands mvn clean, mvn compile, and mvn install, and re-importing the project in the IDE.
I have tried deleting the dependency and adding it again.
import net.sourceforge.pinyin4j.PinyinHelper;
This does not work.
Try the following steps; it works fine on my system:
Go to the location of your project, for example: D:\Project\workspace\com.test.pinyin4j
Run mvn clean install.
Update the project and it should work fine; the import should then resolve.
Note: don't forget to add the pinyin4j dependency to the POM file.
<dependencies>
    <!-- https://mvnrepository.com/artifact/com.belerweb/pinyin4j -->
    <dependency>
        <groupId>com.belerweb</groupId>
        <artifactId>pinyin4j</artifactId>
        <version>2.5.0</version>
    </dependency>
</dependencies>
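Once the dependency resolves, a quick sanity check like the sketch below should compile and run (the character and its readings are just an example):

import net.sourceforge.pinyin4j.PinyinHelper;

public class PinyinCheck {
    public static void main(String[] args) {
        // Prints every pinyin reading of the character, e.g. zhong1, zhong4
        String[] readings = PinyinHelper.toHanyuPinyinStringArray('中');
        for (String reading : readings) {
            System.out.println(reading);
        }
    }
}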

Create side artifact from directory

I am currently writing a Maven plugin that creates some files in a subdirectory of target. I would like to zip these files and deploy them as a side artifact.
More precisely:
Say I have created some files in target/someplugin. Then I want to zip these files and attach them to the build, so that the zip is installed/deployed with classifier someplugin.
How can I achieve this?
I did the following (it may not be perfect):
First I injected the MavenProjectHelper:
@Component
private MavenProjectHelper projectHelper;
Then I added the zt-zip library:
<dependency>
    <groupId>org.zeroturnaround</groupId>
    <artifactId>zt-zip</artifactId>
    <version>1.13</version>
    <type>jar</type>
</dependency>
Now I can do something like
ZipUtil.pack(someplugindir, somepluginzip);
projectHelper.attachArtifact(project, "zip", "someplugin", somepluginzip);
and everything looks fine.
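For context, here is a minimal sketch of how these pieces fit together inside a Mojo. The goal name and directory layout are hypothetical; only the ZipUtil and MavenProjectHelper calls are the point:

import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.MavenProjectHelper;
import org.zeroturnaround.zip.ZipUtil;

// Hypothetical goal name for illustration.
@Mojo(name = "attach-zip")
public class AttachZipMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true)
    private MavenProject project;

    @Component
    private MavenProjectHelper projectHelper;

    @Override
    public void execute() {
        // Files previously generated under target/someplugin.
        File dir = new File(project.getBuild().getDirectory(), "someplugin");
        File zip = new File(project.getBuild().getDirectory(), "someplugin.zip");

        // Zip the directory contents with zt-zip.
        ZipUtil.pack(dir, zip);

        // Attach the zip so install/deploy publishes it with classifier "someplugin".
        projectHelper.attachArtifact(project, "zip", "someplugin", zip);
    }
}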

ClassNotFoundException in com.amazonaws.demos.polly.PollyDemo

I'm new to Maven and the AWS SDK, so I installed both, updated my Java SDK, and double-checked all required path and classpath settings.
The AWS Polly manual (page 119 in the PDF) presents a demo code example to test Polly.
Being new to this, I tried the example (pom.xml and PollyDemo.java). Calling Maven as written in the manual, I get a ClassNotFoundException for PollyDemo (the classpath to the com.amazonaws.demos.polly package has been set).
With over 10 years of Java experience, I feel like a newbie. Please help.
You need to add the aws-java-sdk-polly dependency to your pom.xml file and update the project. You can find the dependencies below:
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-polly</artifactId>
    <version>1.11.77</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.googlecode.soundlibs/jlayer -->
<dependency>
    <groupId>com.googlecode.soundlibs</groupId>
    <artifactId>jlayer</artifactId>
    <version>1.0.1-1</version>
</dependency>
For more, you can refer to these links:
http://docs.aws.amazon.com/de_de/polly/latest/dg/examples-java.html
http://docs.aws.amazon.com/polly/latest/dg/examples-java.html
The example can be run if:
1. AWS credentials are created and set.
2. A new project is started by creating an empty directory (e.g. 'my-app'), opening a shell in 'my-app', and running the command:
mvn archetype:generate -DgroupId=com.amazonaws.demos.polly -DartifactId=polly-java-demo -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
3. Both the existing 'pom.xml' and the hello-world Java file are replaced with the ones from the example.
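After that, one way to run the demo from the project directory is with the exec-maven-plugin (an assumption on my part; the main class name follows the package named in the question):
mvn compile exec:java -Dexec.mainClass=com.amazonaws.demos.polly.PollyDemo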

No FileSystem for scheme: hdfs and Class org.apache.hadoop.hdfs.DistributedFileSystem not found

I want to upload a file to HDFS. I compiled my code using the following jars as dependencies:
hadoop-auth-2.6.1.jar,
hadoop-common-2.6.1.jar, and
hadoop-hdfs-2.6.1.jar.
My code:
I compiled it with Ant, but it gave me this error: No FileSystem for scheme: hdfs.
Then I changed the code and compiled again:
But then I got another error: Class org.apache.hadoop.hdfs.DistributedFileSystem not found.
What's wrong? And what should I do?
DistributedFileSystem is part of hadoop-core.
To fix this problem, you need to include hadoop-core-1.2.1.jar as well (note: I am using Maven for the build):
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
Overall, I am using the following Maven dependencies:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
When getting a Hadoop FileSystem object like below:
FileSystem fs = FileSystem.get(hdfsUrl, configuration);
if you get the following error:
"No FileSystem for scheme: hdfs"
you can resolve it by setting the following two properties on the configuration:
configuration.set("fs.hdfs.impl","org.apache.hadoop.hdfs.DistributedFileSystem");
configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
Now you may get a new error like the one below:
java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
hadoop-common.jar uses Thread.currentThread().getContextClassLoader() and configuration.getClassLoader() to load classes. So if you set your class loader using:
Thread.currentThread().setContextClassLoader(yourClassLoader);
configuration.setClassLoader(yourClassLoader);
you will be able to load the required class from the other Hadoop jars (e.g. hadoop-hdfs).
Let me know if you need more help. And don't forget to upvote if you find this bit useful.
I had the same problem when I compiled my Java code into an executable jar and ran it: there was always some "not found" error (in your case, No FileSystem for scheme: hdfs), which means some Hadoop jar was not included in the compilation.
The solution is to add the correct dependencies in Maven/Gradle, or to add (all) the jars.
In my case, hdfs comes from the class org.apache.hadoop.hdfs.DistributedFileSystem, in the jar hadoop-hdfs-client-3.2.1.jar.
The jars actually used can be found in the log file, if you successfully ran the program and have one.
Alternatively, you can simply add all the jars from the installed Hadoop folder. They are in the common/hdfs/... directories under hadoop-3.2.1/share/hadoop. There may be other jars that are used but not shown in the log, so to be safe, just include them all. You can run hdfs classpath in the terminal to find the location of all the jars.
Once all the jars have been added, you may also need to set the Hadoop configuration in your Java code:
Configuration hadoopConfiguration = new Configuration();
hadoopConfiguration.addResource(new Path(CoreSiteXMLStr)); // CoreSiteXMLStr: path to your core-site.xml
hadoopConfiguration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
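Putting the above together, here is a minimal upload sketch; the namenode URI and both paths are placeholders you would replace with your own:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Explicit bindings, in case a merged jar lost the META-INF service entries.
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");

        // Placeholder namenode URI and file paths.
        FileSystem fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf);
        fs.copyFromLocalFile(new Path("/tmp/local-file.txt"), new Path("/user/me/file.txt"));
        fs.close();
    }
}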
