java.io.FileNotFoundException when using play dist - java

The code below works fine when I start the project with play start:
object LogFile {
  implicit val formats = DefaultFormats
  private var fileInput = new FileInputStream("./conf/log4j.properties")
  private val properties = new Properties
  properties.load(fileInput)

  def test(head: String, data: String) {
    System.setProperty("my.log", "scala.txt")
    PropertyConfigurator.configure(properties)
    val log = Logger.getLogger(head)
    log.error(data)
  }
}
but when I build with sudo /home/ubuntu/play/play dist and run the result, I get:
[error] play - Cannot invoke the action, eventually got an error:
java.io.FileNotFoundException: ./conf/log4j.properties (No such file or directory)
What am I doing wrong?
I am using Scala 2.10 with Play Framework 2.2.

You're missing the Log4j properties file at ./conf/log4j.properties — most likely the file:
/home/ubuntu/project/conf/log4j.properties
(note: project is the application name.)
The sudo command changes the user you are executing as, so the new user may have different environment variables. Also, you're using a relative path (./conf/log4j.properties), which is resolved at runtime against the directory you are executing in.
Possible solutions:
1) Don't use a relative path; use an absolute path instead.
2) Change the home directory in the profile of the user you are executing your application as (the root user?).
3) Copy the missing file to the directory where your application is looking for it.
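A further option, assuming the dist's classpath includes conf/ (Play's generated start script typically adds it): load the file as a classpath resource instead of from the filesystem, so the lookup no longer depends on the directory the application is started from. A minimal sketch (the helper class name is mine):

```java
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ClasspathProps {
    // Load a .properties file from the classpath rather than a relative
    // filesystem path, so the result is the same under play start and
    // under the unpacked dist.
    public static Properties load(String resourceName) throws IOException {
        Properties props = new Properties();
        try (InputStream in = ClasspathProps.class.getClassLoader()
                .getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new FileNotFoundException(resourceName + " not found on classpath");
            }
            props.load(in);
        }
        return props;
    }
}
```

With this, PropertyConfigurator.configure(ClasspathProps.load("log4j.properties")) no longer cares about the working directory.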

Related

Flink can't access files in JAR

I tried to run a JAR in a Flink cluster but I get this FileNotFoundException:
Caused by: java.io.FileNotFoundException: File file:/tmp/flink-web-88bf3f41-94fc-40bd-a865-bb0e6d5ac95c/flink-web-upload/82227475-523d-4607-8ab2-09bae8602248-tutorial-1.0-jar-with-dependencies.jar!/ldbc_sample/edges.csv does not exist or the user running Flink ('userA') has insufficient permissions to access it.
at org.apache.flink.core.fs.local.LocalFileSystem.getFileStatus(LocalFileSystem.java:106)
The csv files are located in a folder in the resources directory of the project.
I access the file path by:
URL resource = Helper.class.getClassLoader().getResource("ldbc_sample");
return resource.getPath();
I opened the JAR and made sure that the files definitely exist, and I also ran it locally, and it worked.
What do I have to do to make sure that Flink can access my CSV?
Maybe you want to pass your .csv as an argument to your program?
Something like:
def main(args: Array[String]): Unit = {
  val ldbcSample = ParameterTool.fromArgs(args).getRequired("ldbc_sample")
  ...
}
or you can put the different arguments in a .properties file:
ldbc_sample: /ldbc_sample/edges.csv
topic_source: TOPIC_NAME
val jobParams = ParameterTool.fromArgs(args)
val jobArgs = ParameterTool.fromPropertiesFile(jobParams.getRequired("properties_file_path"))
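Whichever way the path is supplied, there is a second pitfall in the question itself: a file packed inside a JAR cannot be opened through resource.getPath(), because java.io.File only understands real filesystem paths. Reading the resource as a stream works both from the IDE and from inside the JAR. A sketch (class name and error handling are mine; "ldbc_sample/edges.csv" is the path from the question):

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class ResourceReader {
    // Read a classpath resource line by line; works whether the resource
    // sits in a plain directory or inside the running JAR.
    public static List<String> readLines(String resource) throws Exception {
        List<String> lines = new ArrayList<>();
        try (InputStream in = ResourceReader.class.getClassLoader()
                .getResourceAsStream(resource)) {
            if (in == null) {
                throw new IllegalArgumentException("Resource not found: " + resource);
            }
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(in, StandardCharsets.UTF_8))) {
                String line;
                while ((line = r.readLine()) != null) lines.add(line);
            }
        }
        return lines;
    }
}
```

For example, readLines("ldbc_sample/edges.csv") would return the CSV rows without ever touching a file path.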

How to specify a Java file path for both Mac and Ubuntu

I'm using IntelliJ and Spring and Java to locally develop an app on a Mac, and then deploy to a tomcat server on AWS, using Ubuntu 16.04.3 LTS (GNU/Linux 4.4.0-1048-aws x86_64).
I'm having trouble specifying the file path so that it works in both environments.
My code is
InputStream fileStream = new FileInputStream("src/main/resources/static/web/data/ReportDates.json");
JsonReader reader = Json.createReader(fileStream);
JsonObject reportDates = reader.readObject();
reader.close();
When I run locally, the file is read in correctly. It is located in:
src/main/resources/static/web/data/ReportDates.json
But when I deploy, that code results in the error message:
java.io.FileNotFoundException: src/main/resources/static/web/data/ReportDates.json (No such file or directory)
The actual location of the file on that machine turns out to be:
/opt/tomcat/webapps/automentor/WEB-INF/classes/static/web/data/ReportDates.json
How can I specify the file path so that it works correctly in both environments?
I have given up on using a single path. @Nicholas Pesa got me thinking: since I use IDEA, I don't have a fixed WEB-INF folder, so it's easier for me to change the path that should be used than to move the file to a fixed location.
My code now uses:
String filepath = new File("src/main/resources/static/web/data/ReportDates.json").exists()
        ? "src/main/resources/static/web/data/ReportDates.json"
        : "/opt/tomcat/webapps/automentor/WEB-INF/classes/static/web/data/ReportDates.json";
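The same idea generalizes to any number of candidate locations: probe each path in order and take the first that exists. A small sketch of that helper (the class name is mine):

```java
import java.io.File;

public class PathResolver {
    // Return the first candidate path that exists on this machine;
    // fall back to the last candidate if none do, so the caller still
    // gets a deterministic path to report in its error message.
    public static String resolve(String... candidates) {
        for (String c : candidates) {
            if (new File(c).exists()) return c;
        }
        return candidates[candidates.length - 1];
    }
}
```

Calling resolve(devPath, tomcatPath) then picks the development path locally and the WEB-INF path on the server.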

How to set the working directory of a process in Java

I am trying to set the working directory of a process that will be running an exe.
Here is what I have so far:
public Process launchClient() throws IOException {
    File pathToExecutable = new File(currentAccount.getAbsoluteFile() + "/Release", "TuringBot.exe");
    System.out.println(pathToExecutable);
    ProcessBuilder builder = new ProcessBuilder(pathToExecutable.getAbsolutePath());
    builder.directory(new File(currentAccount, "Release")); // this is where you set the root folder for the executable to run with
    System.out.println("Working Directory = " + System.getProperty("user.dir"));
    return builder.start();
}
but every time I launch it, the exe attempts to run but fails because the working directory is wrong and it can't properly access the required files.
EDIT:
Here's a little more about what's going on...
I have x client folders, each containing an executable file that needs to be started and relies on the contents of its local folder to execute properly. However, when running this code to launch each executable, the working path for each one is not being applied; instead it stays at the default directory (the JAR's location).
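One thing worth checking: builder.directory(...) does set the child's working directory, but printing System.getProperty("user.dir") only shows the JVM's own directory, so it cannot confirm anything about the child. A stripped-down sketch that can be verified with a shell command (names are illustrative):

```java
import java.io.File;
import java.io.IOException;

public class ClientLauncher {
    // Minimal sketch: start a command with its working directory set to
    // the given folder. ProcessBuilder.directory() controls the child's
    // cwd; the JVM's own user.dir is unaffected and unrelated.
    public static Process launchIn(File workingDir, String... command) throws IOException {
        ProcessBuilder builder = new ProcessBuilder(command);
        builder.directory(workingDir); // the child process starts in this folder
        return builder.start();
    }
}
```

For the layout in the question this would be launchIn(new File(currentAccount, "Release"), pathToExecutable.getAbsolutePath()). If the exe still can't find its files, it may be resolving them relative to its own install location rather than the cwd.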

Start a java application from Hadoop YARN

I'm trying to run a Java application from a YARN application (specifically: from the ApplicationMaster in the YARN app). All examples I found deal with bash scripts that are run.
My problem seems to be that I distribute the JAR file wrongly to the nodes in my cluster. I specify the JAR as a local resource in the YARN client.
Path jarPath2 = new Path("/hdfs/yarn1/08_PrimeCalculator.jar");
jarPath2 = fs.makeQualified(jarPath2);
FileStatus jarStat2 = null;
try {
    jarStat2 = fs.getFileStatus(jarPath2);
    log.log(Level.INFO, "JAR path in HDFS is " + jarStat2.getPath());
} catch (IOException e) {
    e.printStackTrace();
}

LocalResource packageResource = Records.newRecord(LocalResource.class);
packageResource.setResource(ConverterUtils.getYarnUrlFromPath(jarPath2));
packageResource.setSize(jarStat2.getLen());
packageResource.setTimestamp(jarStat2.getModificationTime());
packageResource.setType(LocalResourceType.ARCHIVE);
packageResource.setVisibility(LocalResourceVisibility.PUBLIC);

Map<String, LocalResource> res = new HashMap<String, LocalResource>();
res.put("package", packageResource);
So my JAR is supposed to be distributed to the ApplicationMaster and unpacked, since I specify the ResourceType to be ARCHIVE. On the AM I try to call a class from the JAR like this:
String command = "java -cp './package/*' de.jofre.prime.PrimeCalculator";
The Hadoop logs tell me when running the application: "Could not find or load main class de.jofre.prime.PrimeCalculator". The class exists at exactly the path that is shown in the error message.
Any ideas what I am doing wrong here?
I found out how to start a Java process from an ApplicationMaster. In fact, my problem came from the command used to start the process, even though it is the officially documented way provided by the Apache Hadoop project.
What I did instead was to specify the packageResource to be a file, not an archive:
packageResource.setType(LocalResourceType.FILE);
Now the node manager does not extract the resource but leaves it as a file; in my case, the JAR.
To start the process I call:
java -jar primecalculator.jar
To start a JAR without specifying a main class on the command line, you have to specify the main class in the MANIFEST file (manually, or let Maven do it for you).
To sum it up: I did NOT add the resource as an archive but as a file, and I did not use -cp to add the symlink folder that Hadoop creates for an extracted archive. I simply started the JAR via the -jar parameter, and that's it.
Hope it helps you guys!

Servlet - Addressing properties files

I have a simple Servlet that needs to pass some properties files to another class.
Properties prop = new Properties();
prop.load(new FileInputStream("/home/user/config.properties"));
Above works fine.
But I can't address the right absolute path in below:
String protocol = prop.getProperty("protocol", "/home/user/protocol.properties");
String routes = prop.getProperty("routes", "/home/user/routes.properties");
MyClass message = new MyClass(protocol, routes, 0);
At the end I receive the following in the Tomcat log:
INFO: Server startup in 3656 ms
java.io.FileNotFoundException: routes.properties (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:97)
at com.cc.verticals.Messenger.<init>(Messenger.java:134)
at com.foo.MyClass.<init>(MyClass.java:42)
at com.verticals.cc.util.VerticalUtil.setup(VerticalUtil.java:59)
at com.verticals.cc.util.VerticalUtil.main(VerticalUtil.java:259)
at com.verticals.cc.dao.VerticalDao.<init>(VerticalDao.java:24)
at com.verticals.cc.controller.VerticalController.<init>(VerticalController.java:33)
Line 42 is pointing to the constructor where routes.properties file goes in.
Messenger line 134 points to:
prop.load(new FileInputStream(routesFilename));
Any idea how to address the properties files and pass them in as String parameters? Thanks.
By the looks of it (I'd prefer if you posted the contents of the properties files), there is a property within config.properties such that routes = routes.properties. When you call new FileInputStream(routes) you get the FileNotFoundException because you are trying to open routes.properties in the current working directory where Java was launched, where it doesn't exist.
As a side note, you are using one property file to reference another, which is fine but a bit unconventional. Further, you should stick these files in a resources folder to remove absolute paths and gain portability.
Notice that the prop.getProperty method cannot throw FileNotFoundException, so that exception must have been thrown by a prop.load() call instead.
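The key detail here is that getProperty's second argument is a default used only when the key is absent; it never replaces a bad value that is present. A tiny demonstration (the wrapper class name is mine):

```java
import java.util.Properties;

public class PropsDemo {
    // getProperty(key, fallback) returns the fallback ONLY when the key
    // is missing. If config.properties maps routes -> "routes.properties",
    // that bare relative name is what gets returned and later opened,
    // regardless of the absolute-path fallback.
    public static String resolve(Properties props, String key, String fallback) {
        return props.getProperty(key, fallback);
    }
}
```

So with routes defined in config.properties, the call in the question hands Messenger the relative name "routes.properties", which then fails to open.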
Please make sure that you have opened the permissions on the files. Open a terminal and issue the following commands:
$ chmod 777 /home/user/routes.properties
$ chmod 777 /home/user/protocol.properties
