I have a RHEL 7.3 system on which I have built Tesseract, and I am able to execute my jar there. Now I have another system with the same RHEL configuration and I want to execute the same jar, but I don't want to build Tesseract on it. So I copied /usr/lib64/libtesseract.so and /usr/lib64/liblept.so from the system where Tesseract was built and put these .so files into /usr/lib64/ on the fresh RHEL 7.3 system.
This time the jar did not execute successfully; it fails with 'java.lang.UnsatisfiedLinkError: Unable to load library 'tesseract': Native library (linux-x86-64/libtesseract.so) not found in resource path'. What did I miss (dependencies)?
I have
- RHEL 7.3 (64 bit)
- JRE 1.8.0_51 (64 bit)
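For reference, that "not found in resource path" message comes from JNA, so the jar is presumably loading Tesseract through a Tess4J-style binding. A minimal sketch of such a call, assuming Tess4J (the tessdata path and image name are placeholders), with jna.library.path pointed at the directory holding the copied .so files:

import java.io.File;
import net.sourceforge.tess4j.Tesseract;

public class OcrSmokeTest {
    public static void main(String[] args) throws Exception {
        // Tell JNA where the copied libtesseract.so and liblept.so live
        System.setProperty("jna.library.path", "/usr/lib64");

        Tesseract ocr = new Tesseract();
        ocr.setDatapath("/usr/share/tesseract/tessdata"); // assumed tessdata location
        System.out.println(ocr.doOCR(new File("sample.png")));
    }
}

Note that even once the loader finds libtesseract.so, the copied libraries still need their own shared-library dependencies (libpng, libjpeg, libtiff, and so on) to be present on the new system.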
Please try:
yum install tesseract-ocr
Related
I made an SWT application using WindowBuilder in Eclipse. I exported the project as a runnable Jar File on Windows 10, then transferred the file to my macOS machine, and I received this error:
$ java -jar Downloads/SWTApplication.jar
Exception in thread "main" java.lang.UnsatisfiedLinkError: Could not load SWT library. Reasons:
no swt-win32-4926r21 in java.library.path
no swt-win32 in java.library.path
Can't load library: /Users/myhomefolder/.swt/lib/macosx/x86_64/libswt-win32-4926r21.jnilib
Can't load library: /Users/myhomefolder/.swt/lib/macosx/x86_64/libswt-win32.jnilib
at org.eclipse.swt.internal.Library.loadLibrary(Library.java:344)
at org.eclipse.swt.internal.Library.loadLibrary(Library.java:256)
at org.eclipse.swt.internal.C.<clinit>(C.java:19)
at org.eclipse.swt.widgets.Display.<clinit>(Display.java:143)
at MainWindow.open(MainWindow.java:58)
at MainWindow.main(MainWindow.java:47)
I tried adding both the Windows and Mac versions of the swt.jar file to the Java Build Path in Eclipse, and it did not work. Is there a way to get it to work just by adding the SWT jars to the Java Build Path? Please tell me if there are more steps to this process.
I expected the program to run on my Mac machine, just like when I double click the Jar File on my Windows Machine, but it returned the error above.
Please bear with me since I'm relatively new to Java and SWT.
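The swt-win32 names in the error show that the exported jar bundles the Windows SWT fragment, so on the Mac the JVM looks for a Windows native library. A quick way to see which platform-specific SWT jar a given machine needs is to print the JVM's platform properties; this is only a small sketch and assumes nothing beyond the standard library:

public class PlatformCheck {
    public static void main(String[] args) {
        // SWT ships one jar per platform; these values tell you which one to bundle
        System.out.println("os.name = " + System.getProperty("os.name"));
        System.out.println("os.arch = " + System.getProperty("os.arch"));
    }
}

On the Mac this would report something like "Mac OS X" and "x86_64", which corresponds to the org.eclipse.swt.cocoa.macosx.x86_64 jar rather than the win32 one.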
I cannot solve this exception; I've read the Hadoop documentation and all related Stack Overflow questions that I could find.
My fileSystem.mkdirs(***) throws:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:465)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:518)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:496)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:316)
...
I am including the following dependencies in my app (via maven pom.xml), all in version 2.6.0-cdh5.13.0: hadoop-common, hadoop-hdfs, hadoop-client, hadoop-minicluster
My filesystem variable is a valid (hadoop-common) FileSystem (org.apache.hadoop.fs.FileSystem).
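For context, the failing call is just the standard FileSystem API; a minimal sketch of the kind of code that triggers it (the target directory is a placeholder, since the real path is elided above, and hadoop.home.dir mirrors the HADOOP_HOME setup described below):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MkdirsRepro {
    public static void main(String[] args) throws Exception {
        // Equivalent to passing -Dhadoop.home.dir on the command line
        System.setProperty("hadoop.home.dir", "C:/Temp/hadoop");

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf); // local file system with the default fs.defaultFS
        fs.mkdirs(new Path("C:/Temp/hadoop-test/out")); // placeholder directory
    }
}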
I downloaded the hadoop files from https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin.
I stored winutils.exe and all the other files from version 2.6.0 on my local file system under C:\Temp\hadoop\bin.
I set the environment variable HADOOP_HOME to C:\Temp\hadoop (yes, not the path to the bin directory).
The fallback ("using builtin-java classes") is not used; instead I am getting:
145 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
147 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
(See https://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html)
I understand that this exception can be caused by a Hadoop version mismatch, but I checked that the imported Hadoop matches the Hadoop I stored locally, version-wise.
I am working on a Windows 10 x64 system, in IntelliJ.
Does anybody have an idea what I could check, or even what I am doing wrong?
UPDATE:
I run my main with the following VM options
-Dhadoop.home.dir=C:/Temp/hadoop
-Djava.library.path=C:/Temp/hadoop/bin
Without specifying the lib path, I get:
org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
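A quick way to check from code whether the native library was actually picked up is Hadoop's own loader flag; a one-line sketch (NativeCodeLoader is the same class that prints the DEBUG lines above):

import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // true only if hadoop.dll (or libhadoop.so) was loaded successfully
        System.out.println("native loaded: " + NativeCodeLoader.isNativeCodeLoaded());
    }
}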
For me, setting the VM argument -Djava.library.path=C:\devTools\winutils-master\hadoop-3.0.0 resolved the issue.
The reason for this exception was:
I am importing 2.6.0-cdh5.13.0 via my maven pom, but I downloaded the pre-built files in version 2.6.0. Those are missing the changes made in the cdh5.13.0 variant (CDH is Cloudera’s platform that includes the Hadoop ecosystem). Hence, the versions are indeed in conflict.
If I import hadoop-common, hadoop-hdfs, and hadoop-client as version 2.6.0 instead of 2.6.0-cdh5.13.0, the exception disappears (and I don't even need to set the VM options).
See http://archive-primary.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.13.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
Check your Java version. If Java is the 32-bit version, you need to uninstall it and re-install the 64-bit version for Hadoop.
Check commands:
java -d32 -version (no error if it is a 32-bit JVM)
java -d64 -version (no error if it is a 64-bit JVM)
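The same check can also be done from inside a running JVM; a small sketch (sun.arch.data.model is a HotSpot-specific property, so treat it as a hint rather than a guarantee):

public class JvmBitness {
    public static void main(String[] args) {
        // "64" on a 64-bit HotSpot JVM, "32" on a 32-bit one
        System.out.println("data model: " + System.getProperty("sun.arch.data.model"));
        System.out.println("os.arch:    " + System.getProperty("os.arch"));
    }
}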
Download the hadoop.dll and winutils.exe files from hadoop-3.0.0 and the problem gets resolved:
https://github.com/steveloughran/winutils/tree/master/hadoop-3.0.0/bin
In my case, the issue was that I had hadoop.dll in System32.
If you run Hadoop on Windows and it finds hadoop.dll in System32 or in your HADOOP_HOME\bin, it assumes it is running on a cluster; the problem is that clusters are not compatible with Windows, so it fails.
Solution: delete hadoop.dll from System32 and HADOOP_HOME\bin.
See: Full answer
I had this issue and it turned out HADOOP_HOME was set to a 2.6.4 version folder. Updating it to a 3.0.0 folder made it work. It seems that, in general, you have to point to the 3.0.0 folder either on the calling end with -Djava.library.path or in your environment settings.
For the environment variable, HADOOP_HOME is a system environment variable you can access via the command
rundll32.exe sysdm.cpl, EditEnvironmentVariables
This can be entered into ⊞ + R (Windows key + R) followed by Ctrl + Shift + Enter, or by opening PowerShell or cmd as admin.
From here I set the system environment variable HADOOP_HOME to the 3.0.0 folder and updated the system PATH to point to HADOOP_HOME/bin. Be sure there are no conflicts in your user variables, for example in PATH.
After that, any terminal or program calling Spark should be restarted and checked to make sure it loaded the new environment variables.
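One way to verify that a restarted program really sees the new values is to print them at startup; a minimal sketch using only the standard library (the variable names match the ones discussed above):

public class EnvCheck {
    public static void main(String[] args) {
        // Should show the 3.0.0 folder and a PATH entry pointing at its bin directory
        System.out.println("HADOOP_HOME = " + System.getenv("HADOOP_HOME"));
        System.out.println("PATH = " + System.getenv("PATH"));
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}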
I was having the same issue when writing Parquet files in Spark. Downloading the hadoop.dll and winutils.exe files from hadoop-3.0.0, moving hadoop.dll to the C:\Windows\System32 folder, and moving winutils.exe to the C:\hadoop\bin folder solved my problem.
Thanks Shailendra Singh for sharing the above link.
Running a .jar file in the command line works fine, but I am unable to execute .jar files on my PC by double-clicking them. Interestingly, the same .jar files execute successfully on my laptop with a double-click.
I included %JAVA_HOME%\bin and %JDK_HOME%\bin in the PATH variable.
I have JAVA_HOME set to C:\Program Files\Java\jre1.8.0_121.
I have JDK_HOME set to C:\Program Files\Java\jdk1.8.0_121.
For some period of time, the Java(TM) Platform SE binary disappeared from the Open with list, and I could not add it until I changed the HKEY_CLASSES_ROOT\jar_auto_file\shell\open\command key in regedit.exe to the appropriate version:
"C:\Program Files\Java\jre1.8.0_121\bin\javaw.exe" "%1"
Here is a snapshot of Windows CMD with java -version entered on my PC:
I tried the following methods, with no success:
- reinstalling both JRE and JDK,
- associating .jar files with C:\Program Files\Java\jre1.8.0_121\bin\javaw.exe,
- using assoc .jar=jarfile and ftype jarfile="C:\Program Files\Java\jre1.8.0_121\bin\javaw.exe" -jar "%1" %* in command line,
- creating a new system Environment Variable OPENDS_JAVA_ARGS and setting it to -jar.
Additional information:
I am using Windows 7 Professional 64-bit (Service Pack 1) with the latest updates installed on both my PC and laptop.
Running .jar files by double-clicking them used to work on my PC, but suddenly stopped working some time ago. I did not modify anything related to Java.
I have the newest version of both JRE and JDK (Java 8 version 121).
Of the .jar files I am trying to run, some are GUI Swing applications and some open the system console (if not already open) using Runtime.getRuntime().exec(String[]).
I have already tried fixing this problem with solutions on the following links (without success):
How to run .jar file by double click on Windows 7 (64)
How to open/run .jar file (double-click not working)?
I can run .jar files through cmd, but I cannot double click them
Can't Run JAR Files
Notes:
I do not want to open .jar files with WinRAR.
I do not want to open .jar files by running java -jar jarfile.jar in cmd.exe.
I am not looking for a 3rd party application to run .jar files.
For now, I am using C:\Windows\System32\cmd.exe /k "java -jar C:\path\to\jarfile\Program.jar" as a shortcut location to run jar files, but I am not satisfied with this solution as this opens the console to run even Swing applications.
I managed to get Java(TM) Platform SE binary (javaw.exe) in the Open with list by matching paths in HKEY_CURRENT_USER and HKEY_LOCAL_MACHINE registry entries:
HKEY_CURRENT_USER\Software\Classes\jar_auto_file\shell\open\command
HKEY_LOCAL_MACHINE\SOFTWARE\Classes\jarfile\shell\open\command
Seems like one of the entries pointed to an older version of Java. Now .jar files are finally showing icons, but still do not work on double-click.
UPDATE: SOLUTION
Delete the .jar and jar_auto_file entries from HKEY_CURRENT_USER\Software\Classes and it should work. .jar files should open on double-click again.
Make sure to backup these entries first by right-clicking the entry and choosing Export.
Additional information can be found on superuser.
I am trying to use SikuliIntegrator in a C# WinForms project on my Windows 64-bit laptop and it won't run because:
Additional information: Exception in thread "main" java.lang.UnsatisfiedLinkError: C:\Users\simon\AppData\Local\Temp\tmplib\VisionProxy.dll: Can't find dependent libraries
I've looked online as much as possible for the last 2 hours and I have added these variables to the System Environment Variables:
SIKULI_HOME = C:\SikuliX
JAVA_HOME= C:\Program Files (x86)\Java\jre7
PATH= %Path%;%JAVA_HOME%\bin;%SIKULI_HOME%\libs;
Still won't work. What am I missing?
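Since VisionProxy.dll is extracted to a temp folder and then fails on its dependent libraries, it helps to know exactly which Java runtime and library path end up being used. A small diagnostic sketch (plain Java, run with the same JRE that SikuliIntegrator launches; note that a 32-bit JRE such as the one under C:\Program Files (x86) cannot load 64-bit DLLs):

public class SikuliEnvCheck {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("os.arch = " + System.getProperty("os.arch")); // x86 vs amd64
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}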
I found the solution. I was using Java 7, which is not supported, so I did the following:
Installation Steps:
- Download and install Sikuli using the self-extracting installer Sikuli-X-1.0rc3 (r905)-win32.exe. After installation is completed, a folder named Sikuli X should be created on your system. Do not start using Sikuli X yet, because it has some bugs.
- Download the following zip file: Sikuli X r930. This contains important bug fixes. Open the downloaded zip file and locate the folder called SIKULI-IDE. Copy the content of SIKULI-IDE into Sikuli X. The purpose of this step is to replace the files associated with r905 (the buggy version) with the files associated with r930, which has the most recent bug fixes.
I'm trying to install the MS SQL JDBC driver on Ubuntu to be used with Sqoop for Hadoop. I'm totally new to Java and Linux, so I'm not sure where to extract everything to.
Just put it in the runtime classpath or add its path to the runtime classpath.
How to do it depends on how you're executing the program. If you're using java command in command console to execute a .class file, then use the -cp argument to specify the paths to classes and/or JAR files which are to be taken in the classpath. The classpath is basically a collection of absolute/relative disk file system paths where Java has to look for JAR files and classes.
Assuming that you've downloaded a .zip, you need to extract it and then look for a .jar file (usually in a /lib folder). For starters, it's easiest to put the .jar in the current working directory and then execute your program (with the Class.forName("com.mysql.jdbc.Driver"); line) as follows:
java -cp .:mysql.jar com.example.YourClass
The . signifies the current path and the : is the separator (which I believe is correct for Ubuntu; on Windows it's ;).
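For the SQL Server driver specifically, an equivalent smoke test could look roughly like this (a sketch only: the host, database, and credentials are placeholders, and it assumes the sqljdbc jar is on the classpath as described in the steps below):

import java.sql.Connection;
import java.sql.DriverManager;

public class SqlServerSmokeTest {
    public static void main(String[] args) throws Exception {
        // Explicit loading is optional with JDBC 4.x drivers, but harmless
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");

        String url = "jdbc:sqlserver://localhost:1433;databaseName=master"; // placeholders
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("connected: " + !con.isClosed());
        }
    }
}

Run it with something like java -cp .:sqljdbc42.jar SqlServerSmokeTest (the same -cp pattern as above).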
To install the driver, you can:
- Download the driver from Microsoft: https://www.microsoft.com/en-us/download/details.aspx?id=11774
- Unzip and untar it (gzip -d sqljdbc_6.0.7507.100_enu.tar.gz and tar -xf sqljdbc_6.0.7507.100_enu.tar).
- Install it by copying the correct version into /usr/share/java (it will need to be world readable): sudo cp sqljdbc42.jar /usr/share/java/
- In the Tomcat lib directory (/usr/share/tomcat8/lib, but it could be tomcat7 if you are running a different version), run sudo ln -s ../../java/sqljdbc42.jar sqljdbc42.jar (with the correct version names from below).
If you are using Maven, see Setting up maven dependency for SQL Server
The correct version is as follows (under System Requirements):
Sqljdbc.jar requires a JRE of 5 and supports the JDBC 3.0 API
Sqljdbc4.jar requires a JRE of 6 and supports the JDBC 4.0 API
Sqljdbc41.jar requires a JRE of 7 and supports the JDBC 4.1 API
Sqljdbc42.jar requires a JRE of 8 and supports the JDBC 4.2 API
Just put your jdbc jar file into /usr/lib/jvm/java-8-oracle/jre/lib/ext by using this command:
sudo cp ojdbc6.jar /usr/lib/jvm/java-8-oracle/jre/lib/ext