As part of a jar run through Hadoop, I want to implement a simple function that (a) creates a file if it doesn't exist, and (b) appends the bytes of a string passed in, on a new line, to this file.
I wrote the following:
public class FSFacade {
    private static FileContext fc = FileContext.getFileContext();

    public static void appendRawText(Path p, String data) throws IOException {
        InputStream is
            = new ByteArrayInputStream(data.getBytes(StandardCharsets.UTF_8));
        FsPermission permissions
            = new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL);
        OutputStream os
            = fc.create(p,
                        EnumSet.of(CREATE, APPEND),
                        CreateOpts.perms(permissions),
                        CreateOpts.createParents());
        IOUtils.copyBytes(is, os, new Configuration());
    }
}
This code works fine in Eclipse, but when I try to run it against HDFS via hadoop jar it raises one of the following exceptions:
java.io.FileNotFoundException: /out (Permission denied)
java.io.FileNotFoundException: /results/out (no such file or directory)
I assume the first one is raised because my process doesn't have permissions to write to the root of the HDFS. The second one probably means my code somehow doesn't create the file if it doesn't exist yet.
How can I make sure, programmatically, that my process
(a) has all the appropriate permissions to write into the Path passed in? (I presume this means execute permissions on all folders in the path and write permission on the last one?)
(b) indeed creates the file if it doesn't exist yet, as I expected EnumSet.of(CREATE, APPEND) to do?
You can use the following command to give permission to write into HDFS:
> hdfs dfs -chmod -R 777 /*
The * means the permissions are applied to every folder under the root.
777 enables all permissions (read, write and execute).
Hope it helps!
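If you'd rather handle this from Java instead of the shell, here is a minimal sketch (the class and method names are mine, not from the question) that makes sure the parent directory exists with rwx permissions before the append, using the same FileContext API:

public class EnsureWritable {
    // Imports assumed: java.io.IOException, org.apache.hadoop.fs.FileContext,
    // org.apache.hadoop.fs.Path, org.apache.hadoop.fs.permission.FsAction,
    // org.apache.hadoop.fs.permission.FsPermission.
    //
    // Note: setPermission only succeeds if the process owns the directory (or runs
    // as the HDFS superuser), so pre-creating a writable directory is usually more
    // realistic than chmod-ing "/".
    public static void ensureParentWritable(FileContext fc, Path file) throws IOException {
        Path parent = file.getParent();
        FsPermission rwx = new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL);
        if (!fc.util().exists(parent)) {
            fc.mkdir(parent, rwx, true);   // also creates missing ancestors
        } else {
            fc.setPermission(parent, rwx); // fails unless this user owns the directory
        }
    }
}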
Related
This code throws FileNotFoundException.
Edit: As requested I have included the full StackTrace.
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStream;

public class ReadFile {
    public static void main(String[] args) throws FileNotFoundException {
        InputStream inputstream = new FileInputStream("C:\\file.txt");
    }
}
The file "file.txt" is at that location though. I would like to post a screenshot of this as requested, but I can't because I need at least 10 reputation points.
If you are 100% certain that the file exists and you're still getting a FileNotFoundException, then most likely your user (or the user running Java) has no permission to access this file. Since I am using German Windows, the dialog is in German, but as you can see, "Benutzer" (which is Users) is denied the right to read and execute the file a.txt.
This, however, results in a FileNotFoundException with a localized error message:
Exception in thread "main" java.io.FileNotFoundException: C:\a.txt (Zugriff verweigert)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:131)
at java.io.FileInputStream.<init>(FileInputStream.java:87)
at Threadstuff.main(Threadstuff.java:50)
Zugriff verweigert means "access denied". If that isn't the problem either, I guess you should post your full stack trace.
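A quick way to narrow this down from code is to check existence and readability separately before opening the stream; a minimal sketch (the path is the one from the question):

import java.io.File;

public class CheckAccess {
    public static void main(String[] args) {
        File f = new File("C:\\file.txt");
        // exists() == true with canRead() == false points to a permission problem,
        // not a missing file.
        System.out.println("exists:   " + f.exists());
        System.out.println("readable: " + f.canRead());
    }
}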
The other option I mentioned in my comment is an Explorer option ("View" -> "Options") under Folder and Search options -> View:
(roughly translates to "Hide extensions for known file types")
If this is enabled, file names in Explorer are shown without their extensions, i.e. as "file" instead of "file.txt", which sometimes leads to the mistake of creating a "file.txt.txt" when renaming a file. It is/was also often used to trick users into thinking they were opening a different kind of file (.pdf.exe), mostly by bad guys.
Is this really the full file path? Better check that one.
Also, I'd recommend putting files that your program needs to read (e.g. text files, images and such) on the classpath of your project, so that when you pack and export it, the file paths don't break just because the program runs on somebody else's PC where the file does not exist at that absolute path.
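For example, a minimal sketch of reading the file as a classpath resource instead of an absolute Windows path (it assumes file.txt has been placed on the classpath, e.g. next to the compiled classes or under src/main/resources):

import java.io.IOException;
import java.io.InputStream;

public class ReadResource {
    public static void main(String[] args) throws IOException {
        // Resolved relative to the classpath, so it still works after packaging the jar.
        try (InputStream in = ReadResource.class.getResourceAsStream("/file.txt")) {
            if (in == null) {
                System.out.println("file.txt is not on the classpath");
                return;
            }
            int total = 0;
            byte[] buf = new byte[4096];
            for (int n; (n = in.read(buf)) != -1; ) {
                total += n;
            }
            System.out.println("Read " + total + " bytes");
        }
    }
}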
This answer suggests you transform the path of the file into a Java-conforming URL path.
Try the one below.
InputStream inputstream = new FileInputStream("C:"+File.separator+"file.txt");
A better approach would be
File file = new File("C:" + File.separator + "file.txt");
if (file.exists()) {
    // Read the file
} else {
    System.out.println("File does not exist");
}
To check whether the file exists on Windows, press the Windows key + R, paste the file path you mentioned, and press Enter. If the file is at that location, Notepad will open with its contents.
I have code that extracts a specific large (about 15k entries) binary serialized archive file to a folder on disk.
public void extractExact(Path absolutePath, DoubleConsumer progressConsumer) throws IOException
{
    ...
    // Extract to file channel
    try (final FileOutputStream fos = new FileOutputStream(absolutePath.toFile()))
    {
        PakExtractor.Extract(pakFile, Entry, fos.getChannel(), progressConsumer);
    }
}
The extractExact function is called for every entry in the archive.
After this, if I try to call Files.delete(<archive_file_path>), I get an exception:
java.nio.file.FileSystemException: The process cannot access the file because it is being used by another process.
I searched for my archive file in Process Explorer and it shows ~15k open handles to it held by my java.exe (as many as there are files in the archive).
This happens only on Windows (jdk1.8.0_162). On Linux I don't have any problems with "zombie" open files.
Finally, we found the solution. Many thanks to @Netherwire. The FileChannel class has a map method that does some implicit copying of file descriptors, so be careful when using it. Here is more information.
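To illustrate the pitfall (this is a sketch with made-up file names, not the original PakExtractor code): FileChannel.map() returns a MappedByteBuffer that keeps the file mapped until the buffer is garbage collected, even after the channel is closed, which on Windows blocks Files.delete(). Copying through an ordinary buffer avoids the mapping entirely:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedDeleteDemo {
    public static void main(String[] args) throws IOException {
        Path archive = Paths.get("archive.pak"); // hypothetical path

        try (FileChannel ch = FileChannel.open(archive, StandardOpenOption.READ)) {
            // Instead of ch.map(MapMode.READ_ONLY, 0, ch.size()), read through a
            // heap buffer so nothing outlives the try-with-resources block.
            ByteBuffer buf = ByteBuffer.allocate(8192);
            while (ch.read(buf) != -1) {
                buf.clear(); // process the chunk here, then reuse the buffer
            }
        }

        Files.delete(archive); // succeeds: no mapping is holding the file open
    }
}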
So after 36 hours of experimenting with this and that, I have finally managed to get a cluster up and running, but now I am confused about how I can write files to it using Java. A tutorial said this program should be used, but I don't understand it at all and it doesn't work either.
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileWriteToHDFS {
    public static void main(String[] args) throws Exception {
        // Source file in the local file system
        String localSrc = args[0];
        // Destination file in HDFS
        String dst = args[1];

        // Input stream for the local file to be written to HDFS
        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));

        // Get the Hadoop configuration (reads core-site.xml etc. from the classpath)
        Configuration conf = new Configuration();
        System.out.println("Connecting to -- " + conf.get("fs.defaultFS"));

        // Destination file system, resolved from the destination URI
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        OutputStream out = fs.create(new Path(dst));

        // Copy from local to HDFS; the 'true' flag closes both streams when done
        IOUtils.copyBytes(in, out, 4096, true);
        System.out.println(dst + " copied to HDFS");
    }
}
My confusion is: how does this piece of code identify the specifics of my cluster? How will it know where the master node is and where the slave nodes are?
Furthermore, when I run this code and provide some local file as the source and leave the destination blank (or provide only a file name), the program writes the file back to my local storage and not to the location I defined as storage space for my namenodes and datanodes. Should I be providing this path manually? How does this work? Please suggest some blog that can help me understand it better, or a smallest working example.
First off, you'll need to add some Hadoop libraries to your classpath. Without those, no, that code won't work.
How will it know where the masternode is and where the slavenodes are?
From the new Configuration(); and subsequent conf.get("fs.defaultFS").
It reads core-site.xml from the location given by the HADOOP_CONF_DIR environment variable and returns the address of the namenode. The client only needs to talk to the namenode to receive the locations of the datanodes, to which the file blocks are written.
the program writes the file back to my local storage
It's not clear where you've configured the filesystem, but the default is file://, your local disk. You change this in core-site.xml. If you follow the Hadoop documentation, the pseudo-distributed cluster setup mentions this.
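For illustration, a minimal sketch (the hostname and port are hypothetical) of pointing the client at the namenode explicitly instead of relying on core-site.xml being on the classpath:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsClient {
    public static FileSystem connect() throws IOException {
        Configuration conf = new Configuration();
        // Without core-site.xml on the classpath, fs.defaultFS stays at file://
        // and "HDFS" writes silently land on the local disk.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        return FileSystem.get(conf);
    }
}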
It's also not very clear why you need your own Java code when simply running hdfs dfs -put would do the same thing.
I am getting used to Java 7 and the new Files class.
I am writing a small application which, at some point, must replace the contents of a file.
I used a temporary file to avoid erasing the target file if something goes wrong. However, I'm always getting an AccessDeniedException when performing the actual copy.
Here is my code:
// Temporary file generation.
Path target = getCurrentConfigFile(); // Returns a path, works ok.
Path tempFile = Files.createTempFile("tempfile", null);
Files.write(tempFile, conf.getBytes(Charset.defaultCharset()), StandardOpenOption.WRITE);
// Actual copy.
Files.copy(tempFile, target, StandardCopyOption.REPLACE_EXISTING);
// Cleanup.
Files.delete(tempFile);
getCurrentConfigFile() handles the target file Path creation:
(... generates various strings from configuration parameters)
return FileSystems.getDefault().getPath(all, these, various, strings);
When I execute the code, it's through a .bat script, and I get the error both from a standard Command Prompt and with elevation.
The target file is in C:\temp\tests, a directory I created with the same Windows user.
It seems the problem lies in reading from the temporary file, as writing directly to the target works.
Where should I look next?
Not an answer, but too long for a comment. I ran the code below (from the command line on Windows 7) and it works as expected:
public static void main(String[] args) throws IOException {
    Path target = Paths.get("C:/temp/test.txt");
    Path tempFile = Files.createTempFile("tempfile", null);
    Files.write(tempFile, "abc".getBytes(UTF_8), StandardOpenOption.WRITE);
    // Actual copy.
    Files.copy(tempFile, target, StandardCopyOption.REPLACE_EXISTING);
    // Cleanup.
    Files.delete(tempFile);
}
so your problem is not with that code. It may be somewhere else in your code or due to the permissions on the files/folder you are using.
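One way to check is to probe both sides of the copy before running it; a small diagnostic sketch (the target name is hypothetical, the folder is the one from the question):

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CopyDiagnostics {
    public static void main(String[] args) {
        Path target = Paths.get("C:/temp/tests/config.txt");            // hypothetical target
        Path tempDir = Paths.get(System.getProperty("java.io.tmpdir")); // where createTempFile writes

        System.out.println("temp dir writable:   " + Files.isWritable(tempDir));
        System.out.println("target exists:       " + Files.exists(target));
        System.out.println("target writable:     " + Files.isWritable(target));
        System.out.println("target dir writable: " + Files.isWritable(target.getParent()));
    }
}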
I need to create a temporary directory, but I'm always getting access denied when I try to create a file inside it.
java.io.FileNotFoundException: C:\tmpDir7504230706415790917 (Access Denied)
here's my code:
public static File createTempDir() throws IOException {
    File temp = File.createTempFile("tmpDir", "", new File("C:/"));
    temp.delete();
    temp.mkdir();
    return temp;
}

public File createFile(InputStream inputStream, File tmpDir) {
    File file = null;
    if (tmpDir.isDirectory()) {
        try {
            file = new File(tmpDir.getAbsolutePath());
            // write the inputStream to a FileOutputStream
            OutputStream out = new FileOutputStream(file);
            int read = 0;
            byte[] bytes = new byte[1024];
            while ((read = inputStream.read(bytes)) != -1) {
                out.write(bytes, 0, read);
            }
            inputStream.close();
            out.flush();
            out.close();
            System.out.println("New file created!");
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
    return file;
}
I'm working on a web application and I'm using Tomcat. Is there a way to create a temporary file in Tomcat server memory? I know that's bizarre, but I don't know... maybe it's possible.
You could use Tomcat's temp folder.
If you use
<%=System.getProperty("java.io.tmpdir")%>
in a JSP, you can get the path to it.
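The same lookup works from plain Java without a JSP; a minimal sketch (class and file names are mine):

import java.io.File;
import java.io.IOException;

public class TomcatTempDemo {
    public static File createScratchFile() throws IOException {
        // Tomcat normally sets java.io.tmpdir to CATALINA_BASE/temp, so temp files
        // created this way end up in Tomcat's own temp folder.
        File tmpDir = new File(System.getProperty("java.io.tmpdir"));
        return File.createTempFile("upload", ".tmp", tmpDir);
    }
}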
This line in your code says to create a file whose name starts with the text "tmpDir" in the directory C:\. That is not what you want.
File temp = File.createTempFile("tmpDir","",new File("C:/"));
The operating system is properly disallowing that because C:\ is a protected directory. Use the following instead:
File temp = File.createTempFile("tmp",null);
This will let Java determine the appropriate temporary directory. Your file will have the simple prefix "tmp" followed by some random text. You can change "tmp" to anything meaningful for your app, in case you need to manually clean out these temp files and you want to be able to quickly identify them.
You usually cannot write to C:\ directly due to the default permission settings; I sometimes run into permission issues doing so. However, you can write your temporary file in your user folder. Usually this is C:\Documents and Settings\UserName\ on XP or C:\Users\UserName\ on Vista and Windows 7. A tool called SystemUtils from Apache Commons Lang can be very useful if you want to get the home directory regardless of OS platform.
For example:
SystemUtils.getUserDir();
SystemUtils.getUserHome();
Update
Also, you create a temp file object but then call mkdir() to turn it into a directory and try to write your file to that directory object. You can only write a file into a directory, not onto the directory itself. To solve this problem, either don't call temp.mkdir(); or change file = new File(tmpDir.getAbsolutePath()); to file = new File(tmpDir, "sometempname"); as in the sketch below.
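A minimal sketch of the corrected flow (the file names are illustrative), using the Java 7 NIO API to create the directory and write the stream into a file inside it:

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class TempDirExample {
    public static Path writeToTempDir(InputStream inputStream) throws IOException {
        Path tmpDir = Files.createTempDirectory("tmpDir"); // created under java.io.tmpdir
        Path target = tmpDir.resolve("upload.bin");        // a file inside the directory
        Files.copy(inputStream, target, StandardCopyOption.REPLACE_EXISTING);
        return target;
    }
}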
On Linux with a Tomcat 7 installation:
If you are running a web application, this is the temp directory Tomcat uses for creating temporary files:
TOMCAT_HOME/temp
In my case TOMCAT_HOME => /usr/share/tomcat7
If you are running a Java program without Tomcat, it uses the /tmp directory by default.
Not sure if it matters, but I ran this command too:
chmod 777 TOMCAT_HOME/temp