Error while trying to read a file - java

In my Java/Maven application, I'm trying to read some files that are in the resources folder. I can only read these files when I execute the application in Eclipse; when I run the application with the command below, I get a NullPointerException because the file is not found.
java -Dserver.port=$PORT -Dspring.profiles.active=dev -jar my-war.war
I'm reading the file with this command:
String string = new String(
        Files.readAllBytes(
                Paths.get(ClassLoader.getSystemResource("fileFolder/file.html").toURI())));

The problem was solved by changing the way I read the resource:
IOUtils.toString(new ClassPathResource("fileFolder/file.html").getInputStream());
Resources packaged inside a .war are not treated as regular files on disk, so reading them as files throws an exception.
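If you would rather avoid the Spring and Commons IO dependencies, a plain-JDK sketch of the same idea looks roughly like this (assuming Java 9+ for InputStream.readAllBytes, a UTF-8 encoded file, and MyClass standing in for any class of your application):
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Stream the resource from the classpath instead of resolving it to a file path,
// so it also works when packaged inside a .war or .jar.
try (InputStream in = MyClass.class.getResourceAsStream("/fileFolder/file.html")) {
    if (in == null) {
        throw new IOException("fileFolder/file.html not found on the classpath");
    }
    String string = new String(in.readAllBytes(), StandardCharsets.UTF_8);
}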

Related

Java JAR file runs on local machine but missing file on others

The JAR file contains the ffmpeg.exe file and runs normally on my machine without any problems. However, if I try to run it on another computer, the stack trace tells me: java.io.IOException: Cannot run program "ffmpeg.exe": CreateProcess error=2, The system cannot find the file specified. The way I imported it was:
FFMpeg ffmpeg = new FFMpeg("ffmpeg.exe"); // in res folder
...
// FFMpeg class
public FFMpeg(String ffmpegEXE) {
    this.ffmpegEXE = ffmpegEXE;
}
The quick fix is to put ffmpeg.exe in the same folder as your .jar file.
If you want to read the file from the resources folder, you have to change the code to:
URL resource = Test.class.getResource("ffmpeg.exe");
String filepath = Paths.get(resource.toURI()).toFile().getAbsolutePath();
FFMpeg ffmpeg = new FFMpeg(filepath);
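Note that the snippet above only works while the resource is an actual file on disk; once ffmpeg.exe is packed inside the JAR it cannot be executed in place. A minimal sketch of the usual workaround (my own addition, not part of the original answer) is to copy the executable from the classpath to a temporary file and pass that path along:
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

// Copy ffmpeg.exe from the classpath (inside the JAR) to a temporary file so it can be executed.
File ffmpegFile = File.createTempFile("ffmpeg", ".exe");
ffmpegFile.deleteOnExit();
try (InputStream in = Test.class.getResourceAsStream("/ffmpeg.exe")) {
    Files.copy(in, ffmpegFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
}
FFMpeg ffmpeg = new FFMpeg(ffmpegFile.getAbsolutePath());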

Getting error while loading properties file within the executable jar when running on linux server

I am loading a properties file from within the jar. When I run the jar on a Windows machine it works fine, but when I run the same executable jar on a Linux server it fails with the error below:
"Unable to execute command due to error: null/AutoTest/auto_env.properties (No such file or directory)"
Below is the code written to load the properties file:
InputStream in = Thread.currentThread().getContextClassLoader()
        .getResourceAsStream("/auto_env.properties");
Properties props = new Properties();
try {
    if (in != null) {
        props.load(in);
    }
} catch (IOException e) {
    e.printStackTrace();
}
I tried the fixes below to solve the issue, but they did not work:
Getting the properties file path with getClass().getProtectionDomain().getCodeSource().getLocation().toURI().getPath() and passing it to the InputStream.
Using ClassLoader.class instead of Thread.currentThread().getContextClassLoader(); that didn't work either.
Please help me resolve this.
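One thing worth double-checking (a guess, since the failing code is not fully shown): ClassLoader.getResourceAsStream expects a path relative to the classpath root with no leading slash, whereas the Class variant is the one that understands a leading "/". A minimal sketch of both spellings, assuming auto_env.properties sits at the root of the jar:
// With a ClassLoader the path is relative to the classpath root - no leading slash.
InputStream in = Thread.currentThread().getContextClassLoader()
        .getResourceAsStream("auto_env.properties");

// With a Class the leading slash marks an absolute classpath path.
// MyClass is a placeholder for any class inside the jar.
InputStream in2 = MyClass.class.getResourceAsStream("/auto_env.properties");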

How to specify a Java file path for both Mac and Ubuntu

I'm using IntelliJ and Spring and Java to locally develop an app on a Mac, and then deploy to a tomcat server on AWS, using Ubuntu 16.04.3 LTS (GNU/Linux 4.4.0-1048-aws x86_64).
I'm having trouble specifying the file path so that it works in both environments.
My code is
InputStream fileStream = new FileInputStream("src/main/resources/static/web/data/ReportDates.json");
JsonReader reader = Json.createReader(fileStream);
JsonObject reportDates = reader.readObject();
reader.close();
When I run locally, the file is read in correctly. It is located in:
src/main/resources/static/web/data/ReportDates.json
But when I deploy, that code results in the error message:
java.io.FileNotFoundException: src/main/resources/static/web/data/ReportDates.json (No such file or directory)
The actual location of the file on that machine turns out to be:
/opt/tomcat/webapps/automentor/WEB-INF/classes/static/web/data/ReportDates.json
How can I specify the file path so that it works correctly in both environments?
I have given up on using a single path. #Nicholas Pesa got me thinking -- since I use IDEA, I don't have a fixed WEB-INF folder, so it's easier for me to change the path that should be used than to move the file to a fixed location.
My code now uses:
String filepath = new File("src/main/resources/static/web/data/ReportDates.json").exists()
        ? "src/main/resources/static/web/data/ReportDates.json"
        : "/opt/tomcat/webapps/automentor/WEB-INF/classes/static/web/data/ReportDates.json";
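An alternative worth considering (a sketch rather than the poster's actual solution): since the file ends up under WEB-INF/classes on the server, it is on the classpath in both environments, so loading it as a classpath resource avoids hard-coding either location. ReportLoader below is a placeholder for any class in the application:
import java.io.InputStream;
import javax.json.Json;
import javax.json.JsonObject;
import javax.json.JsonReader;

// Resolve ReportDates.json via the classpath; the same code works in the IDE and in the deployed war.
try (InputStream fileStream = ReportLoader.class.getResourceAsStream("/static/web/data/ReportDates.json");
     JsonReader reader = Json.createReader(fileStream)) {
    JsonObject reportDates = reader.readObject();
}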

Start a java application from Hadoop YARN

I'm trying to run a Java application from a YARN application (specifically, from the ApplicationMaster in the YARN app). All the examples I have found deal with bash scripts that are run.
My problem seems to be that I distribute the JAR file to the nodes in my cluster incorrectly. I specify the JAR as a local resource in the YARN client.
Path jarPath2 = new Path("/hdfs/yarn1/08_PrimeCalculator.jar");
jarPath2 = fs.makeQualified(jarPath2);

FileStatus jarStat2 = null;
try {
    jarStat2 = fs.getFileStatus(jarPath2);
    log.log(Level.INFO, "JAR path in HDFS is " + jarStat2.getPath());
} catch (IOException e) {
    e.printStackTrace();
}

LocalResource packageResource = Records.newRecord(LocalResource.class);
packageResource.setResource(ConverterUtils.getYarnUrlFromPath(jarPath2));
packageResource.setSize(jarStat2.getLen());
packageResource.setTimestamp(jarStat2.getModificationTime());
packageResource.setType(LocalResourceType.ARCHIVE);
packageResource.setVisibility(LocalResourceVisibility.PUBLIC);

Map<String, LocalResource> res = new HashMap<String, LocalResource>();
res.put("package", packageResource);
So my JAR is supposed to be distributed to the ApplicationMaster and unpacked, since I specify the ResourceType as ARCHIVE. On the AM I try to call a class from the JAR like this:
String command = "java -cp './package/*' de.jofre.prime.PrimeCalculator";
When running the application, the Hadoop logs tell me: "Could not find or load main class de.jofre.prime.PrimeCalculator". The class exists at exactly the path shown in the error message.
Any ideas what I am doing wrong here?
I found out how to start a Java process from an ApplicationMaster. In fact, my problem was caused by the command used to start the process, even though it is the officially documented way provided by the Apache Hadoop project.
What I did instead was to specify the packageResource to be a file, not an archive:
packageResource.setType(LocalResourceType.FILE);
Now the node manager does not extract the resource but leaves it as a file, in my case the JAR.
To start the process I call:
java -jar primecalculator.jar
To start a JAR without specifying the main class on the command line, you have to specify the main class in the MANIFEST file (manually, or let Maven do it for you).
To sum it up: I did NOT add the resource as an archive but as a file, and I did not use -cp to add the symlink folder that Hadoop creates for the extracted archive. I simply started the JAR via the -jar parameter, and that's it.
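For reference, a minimal sketch of the file-based setup described above (it reuses fs, jarPath2 and jarStat2 from the question; the resource key and the launch-context wiring are my own illustration, imports omitted as in the snippets above):
// Register the JAR as a plain FILE so the NodeManager localizes it unmodified.
LocalResource packageResource = Records.newRecord(LocalResource.class);
packageResource.setResource(ConverterUtils.getYarnUrlFromPath(jarPath2));
packageResource.setSize(jarStat2.getLen());
packageResource.setTimestamp(jarStat2.getModificationTime());
packageResource.setType(LocalResourceType.FILE);      // FILE instead of ARCHIVE
packageResource.setVisibility(LocalResourceVisibility.PUBLIC);

Map<String, LocalResource> res = new HashMap<String, LocalResource>();
res.put("primecalculator.jar", packageResource);      // localized under this name

// Launch the JAR directly; the main class comes from its MANIFEST.
ContainerLaunchContext ctx = Records.newRecord(ContainerLaunchContext.class);
ctx.setLocalResources(res);
ctx.setCommands(Collections.singletonList("java -jar primecalculator.jar"));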
Hope it helps you guys!

Deployment in tomcat

I am having a problem. I have deployed a war file; when I run it locally through Tomcat it works fine, but when I run it on another system by giving my system's IP and then the project folder, e.g.
http://192.168.0.145/DllTest
it loads the applet, but when I click a button to load the functionality it throws an exception:
Exception in thread "AWT-EventQueue-3" java.lang.UnsatisfiedLinkError: Expecting an absolute path of the library: http://192.168.0.145:8080/DllTest/lib/jinvoke.dll
It works fine locally but not on another system. Please tell me what the problem is.
Is it a rights issue or something else?
You cannot load a DLL from an external host. It has to be an absolute disk file system path, as the exception message already hints. Your best bet is to download it manually into a temp file and load that instead.
File dllFile = File.createTempFile("jinvoke", ".dll");
try (InputStream input = new URL(getCodeBase(), "lib/jinvoke.dll").openStream();
     OutputStream output = new FileOutputStream(dllFile)) {
    // Write input to output the usual Java IO way.
    byte[] buffer = new byte[8192];
    int length;
    while ((length = input.read(buffer)) != -1) {
        output.write(buffer, 0, length);
    }
}
// Then load it using the absolute disk file system path.
System.load(dllFile.getAbsolutePath());
dllFile.deleteOnExit();
