How to check if a filesystem supports links and symlinks in Java

The Files class introduced in Java 7 has methods for handling links and symlinks but only as optional operations.
Is there any way of determining at runtime whether a file system supports these operations before actually invoking the respective methods, or do I need to call them and then catch the exception?
Classes like FileSystem or FileStore do not seem to contain anything in that regard (or I have overlooked it).

I don't see any general approach that will work without relying on an UnsupportedOperationException or some other exception.
You could use a heuristic that assumes that only subclasses of BasicFileAttributeView support symbolic linking.
Note: The approach below will not work because FileAttributeViews and file attributes are not the same concept:
I did not get isSymbolicLink as one of the supported attributes with the following code on OS X 10.8.4:
package com.mlbam.internal;

import java.nio.file.Files;
import java.nio.file.FileSystems;
import java.nio.file.Paths;

public class MainClass {
    public static void main(String[] args) {
        try {
            System.out.println("FileStore.supportsFileAttributeView('isSymbolicLink'): "
                    + Files.getFileStore(Paths.get("/")).supportsFileAttributeView("isSymbolicLink"));
            // Got: FileStore.supportsFileAttributeView('isSymbolicLink'): false
            System.out.println(FileSystems.getDefault().supportedFileAttributeViews());
            // Got: [basic, owner, unix, posix]
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Original Answer:
If you have an instance of FileStore, you can use FileStore.supportsFileAttributeView("isSymbolicLink").
Or, if you have an instance of FileSystem, you can check whether the resulting Set<String> from FileSystem.supportedFileAttributeViews() contains the String "isSymbolicLink".
You can get the FileStore associated with a Path using Files.getFileStore(Path).
One way of getting the FileSystem is via FileSystems.getDefault().
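Since neither answer above yields a clean capability query, one pragmatic fallback is to simply attempt the operation and treat the exception as the answer. The following is a sketch of my own rather than part of the answers above; the temp-directory probe location and the exact set of exceptions handled are assumptions:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SymlinkProbe {

    /**
     * Tries to create a symlink in a throwaway temp directory and reports whether it worked.
     * UnsupportedOperationException signals that the file system (or JVM) does not implement
     * symbolic links; IOException or SecurityException (e.g. missing privileges on Windows)
     * is treated here as "not usable" as well.
     */
    static boolean symlinksUsable() {
        try {
            Path dir = Files.createTempDirectory("symlink-probe-");
            Path target = Files.createFile(dir.resolve("target"));
            Path link = dir.resolve("link");
            try {
                Files.createSymbolicLink(link, target);
                return true;
            } finally {
                Files.deleteIfExists(link);
                Files.deleteIfExists(target);
                Files.deleteIfExists(dir);
            }
        } catch (UnsupportedOperationException | IOException | SecurityException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("Symbolic links usable here: " + symlinksUsable());
    }
}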

Related

How to configure root for temp dirs in Java

We run code which does the standard for creating a temp directory:
import java.nio.file.Files;
And then:
tmp = Files.createTempDirectory("ourprefix-");
This, effectively, creates the directories under /tmp/ so that we get things like /tmp/ourprefix-1234 or similar.
Unfortunately, this base directory /tmp/ seems to be fixed. Since lots of things on our build server put their temp files there, and the partition /tmp/ is on is rather small, this is a problem.
Is there a way to configure this facility from the outside (i.e. without changing the code)? I would have guessed that /tmp/ is a default and can be overridden by setting a special environment variable or (more Javaish) by passing a special property to the JVM (e.g. -Djava.tmp.root=/path/to/my/larger/partition/tmp).
I tried using java.io.tmpdir, but setting it did not have any effect; it seemed to only act as the default when nothing is given to createTempDirectory(), and in our case the code passes a prefix.
Any idea how to achieve what I want without changing the source code?
EDIT
After some investigation I found that this works just fine:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TestTempDir {
    public static void main(String[] args) throws IOException {
        System.out.println(System.getProperty("java.io.tmpdir"));
        Path path = Files.createTempDirectory("myprefix-");
        System.out.println(path.toFile().getAbsolutePath());
    }
}
Compile with javac TestTempDir.java, prepare with mkdir tmp, and run with java -Djava.io.tmpdir=$(pwd)/tmp TestTempDir; this works as expected:
/my/work/path/tmp
/my/work/path/tmp/myprefix-1525078348397347983
My issue rather seems to be one with Jenkins and its Maven plugin which does not pass the set properties along to the test cases :-/
If you pass java.io.tmpdir as a custom JVM property when you run the JVM, it should work.
Something like this:
java -Djava.io.tmpdir=myPath myClass
I tested it and it works:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TestTempDir {
    public static void main(String[] args) throws IOException {
        System.out.println(System.getProperty("java.io.tmpdir"));
        Path dir = Files.createTempDirectory("helloDir");
        System.out.println(dir.toString());
    }
}
$ java -Djava.io.tmpdir=D:\temp TestTempDir
D:\temp
D:\temp\helloDir5660384505531934395
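As for the Jenkins/Maven issue mentioned in the edit above: the Surefire plugin runs tests in a forked JVM, so the property has to be handed to that fork explicitly. The following pom.xml fragment is offered as an assumption about the build setup, not a verified fix for that particular Jenkins job; the choice of target directory is illustrative:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- forwarded to the forked test JVM; the directory must already exist -->
    <argLine>-Djava.io.tmpdir=${project.build.directory}</argLine>
  </configuration>
</plugin>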

Unit testing with different behaviour on existence of a file

There are cases where the software should behave differently according to some environmental condition, for example whether a file exists at a certain place or not.
In my case, I'm developing a library that is configured according to a configuration file on the classpath (and falls back to default behaviour if the config file does not exist).
How shall I unit test this class?
I need to write tests for evaluating the class in the following cases:
the file does not exist on the classpath
a file with content A exists on the classpath
a file with content B exists on the classpath
But I don't know how to set up the environment to cover all of these cases and execute the tests one after another.
By the way I'm using Java, and I have both JUnit and TestNG on the test classpath.
Edit:
One of the problems is that the config file resides on the classpath, so once the normal ClassLoader finds and loads it, it returns the same content as long as the same class loader is used.
And I believe using a custom ClassLoader for testing is so complicated that it needs tests to validate the tests!
You can use a temporary file created by your test to mock out the path in your class.
ConfigurationTest.java:
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.junit.Assume.assumeThat;

import java.nio.file.Files;

import org.junit.Test;

public class ConfigurationTest {

    private Configuration config = new Configuration();

    @Test
    public void testWithConfigFile() throws Exception {
        config.configFile = Files.createTempFile("config_", ".ini");
        config.configFile.toFile().deleteOnExit();
        assertFalse(config.isInDefaultMode());
    }

    @Test
    public void testWithoutConfigFile() throws Exception {
        assumeThat(Files.exists(config.configFile), is(false));
        assertTrue(config.isInDefaultMode());
    }
}
Configuration.java:
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class Configuration {

    Path configFile = Paths.get("config.ini");

    public boolean isInDefaultMode() {
        return !Files.exists(configFile);
    }
}
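If the file really must be discovered as a classpath resource (rather than through an injectable Path as above), one way to cover the "content A" and "content B" cases is to point a throwaway URLClassLoader at a temporary directory per test. This is only a sketch, under the assumption that the library looks the resource up through a ClassLoader you can hand it; the resource name config.ini and the content string are illustrative:

import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClasspathResourceSketch {
    public static void main(String[] args) throws Exception {
        // Simulate "a file with content A exists on the classpath".
        Path dir = Files.createTempDirectory("cfg-");
        Files.write(dir.resolve("config.ini"), "mode=A".getBytes(StandardCharsets.UTF_8));

        // A fresh loader per test avoids the "same class loader, same content" caching problem.
        try (URLClassLoader loader =
                new URLClassLoader(new URL[] { dir.toUri().toURL() }, null)) {
            try (InputStream in = loader.getResourceAsStream("config.ini")) {
                System.out.println("found config: " + (in != null)); // prints: found config: true
            }
        }
    }
}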

Java SimpleFileVisitor Issue when encountering permission errors

I am trying to search my hard drive for mp4 files and copy them into a specific folder. The problem is that I don't have permission to access folders like: "C:\Documents and Settings", so my program stops when it encounters those rather than continuing on.
I tried to create a blacklist, but it didn't work at all.
package S;

import java.io.File;
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

public class C {

    public static void main(String[] args) throws IOException {
        Path dir = Paths.get("C:/");
        Files.walkFileTree(dir, new FindJavaVisitor());
    }

    private static class FindJavaVisitor extends SimpleFileVisitor<Path> {
        @Override
        public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
            if (file.toString().contains(".mp4")) {
                file.toFile().renameTo(new File("C:/MP4/" + file.toFile().getName()));
            }
            return FileVisitResult.CONTINUE;
        }
    }
}
You have to override two methods.
The first is the visitFileFailed() method.
As the documentation states:
Unless overridden, this method re-throws the I/O exception that prevented the file from being visited.
You also have to override the postVisitDirectory() method; it has two arguments, the second being an IOException. If there is an error, the second argument will not be null and in this case, from the documentation again:
Unless overridden, this method returns CONTINUE if the directory iteration completes without an I/O exception; otherwise this method re-throws the I/O exception that caused the iteration of the directory to terminate prematurely.
Given your error, the second one is the one you want to override.
However, I see in your code that you do file.toFile().renameTo().
Do not use this. Use Files.move() instead.
Finally, you also move files while you are iterating... this is not a very good idea. Recall that, unlike with the old API, the contents of a directory are read lazily as the walk progresses!
You should collect the renames in a Map<Path, Path> and perform them after the walk has finished. At least that is how I would proceed in this case.
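Putting that advice together, here is a minimal sketch. The target folder C:/MP4 is taken from the question; skipping on failure and deferring the moves are the points being illustrated, not a drop-in replacement for the original program:

import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.StandardCopyOption;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.HashMap;
import java.util.Map;

public class Mp4Collector {

    public static void main(String[] args) throws IOException {
        final Map<Path, Path> moves = new HashMap<>();
        final Path target = Paths.get("C:/MP4");

        Files.walkFileTree(Paths.get("C:/"), new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                if (file.toString().endsWith(".mp4")) {
                    moves.put(file, target.resolve(file.getFileName()));
                }
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult visitFileFailed(Path file, IOException exc) {
                // Skip files and directories we are not allowed to read instead of aborting.
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult postVisitDirectory(Path dir, IOException exc) {
                // exc != null means iterating this directory failed part-way; keep going anyway.
                return FileVisitResult.CONTINUE;
            }
        });

        // Perform the moves only after the walk is complete.
        for (Map.Entry<Path, Path> e : moves.entrySet()) {
            Files.move(e.getKey(), e.getValue(), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}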

How can you set the search path for JavaToRascal from Java

We tried to import a test Rascal module and a module from the standard library using JavaToRascal.
The test module is stored in C:\Users\Klemens\workspace\RascalInterop\src\MyTest.rsc and contains:
module MyTest
The Java code containing the JavaToRascal invocation is as follows:
import java.io.PrintWriter;
import java.net.URISyntaxException;

import org.rascalmpl.interpreter.JavaToRascal;
import org.rascalmpl.interpreter.load.IRascalSearchPathContributor;
import org.rascalmpl.interpreter.load.StandardLibraryContributor;
import org.rascalmpl.interpreter.load.URIContributor;
import org.rascalmpl.uri.URIUtil;

public class RascalInterop {

    public static void main(String[] args) throws URISyntaxException {
        JavaToRascal j2r = new JavaToRascal(new PrintWriter(System.out), new PrintWriter(System.err));
        IRascalSearchPathContributor modulePath =
                new URIContributor(URIUtil.createFileLocation("C:\\Users\\Klemens\\workspace\\RascalInterop\\src\\MyTest.rsc"));
        j2r.getEvaluator().addRascalSearchPathContributor(modulePath);

        try {
            j2r.eval("import MyTest;").toString(); // Could not import module MyTest: can not find in search path
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }

        try {
            j2r.getEvaluator().addRascalSearchPathContributor(StandardLibraryContributor.getInstance());
            j2r.eval("import IO;").toString(); // null pointer exception
        } catch (Exception e) {
            System.out.println(e.getClass());
        }
    }
}
The print in the first try block that tries to import our MyTest.rsc module results in:
Could not import module MyTest: can not find in search path
Advice: http://tutor.rascal-mpl.org/Errors/Static/ModuleImport/ModuleImport.html
The second import attempting to import the IO module from the standard library results in:
class java.lang.NullPointerException
Any ideas how to properly set the search path from a Java program?
We tried to use j2r.getEvaluator().addRascalSearchPathContributor in various ways but did not succeed in loading a MyTest.rsc module from the given directory.
Even though this API will change in the near future (due to the compilation process and related changes), here's an answer. Two answers actually: one for the Rascal files and one for the Java code they need.
For Rascal:
j2r.getEvaluator().addRascalSearchPathContributor
What you used is the correct way of doing things, so if it did not work, please provide more code so we can diagnose what goes wrong. Where is your module? Is it in a jar file or a binary folder? If it's in a jar, you need some additional wiring which I'm glad to explain.
The Rascal search path is distinct from the classpath for the Java classes used by Rascal, so there is a different API for that. We use class loaders to find Java classes (so that it also works in situations like OSGi bundles in Eclipse):
Evaluator x = ctx.getEvaluator();
x.addClassLoader(getClass().getClassLoader());
This will make sure that the class loader used to load the current class is also used to load the classes mentioned in the Rascal file. Of course you can also provide other class loaders. Note that if the libraries you depend on are loaded via OSGi, make sure you get a class loader from a class in a bundle that has access to those classes. The simple case is when everything is in the same jar file; then any class loader will do.
I think you should change the path to refer to the src directory instead of the source file:
new URIContributor(URIUtil.createFileLocation("C:\\Users\\Klemens\\workspace\\RascalInterop\\src"));
Also: probably you should use forward slashes without C:\, so /Users/.../src
AFAIK the NullPointerException is expected: evaluating an import returns null, and you then call toString() on it.
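Putting the suggestions together (point the URIContributor at the src directory, register the StandardLibraryContributor before importing from the standard library, and avoid calling toString() on the null result of an import), a sketch of how the setup might look. It only rearranges the calls already shown above and has not been verified against a specific Rascal version:

import java.io.PrintWriter;
import java.net.URISyntaxException;

import org.rascalmpl.interpreter.JavaToRascal;
import org.rascalmpl.interpreter.load.StandardLibraryContributor;
import org.rascalmpl.interpreter.load.URIContributor;
import org.rascalmpl.uri.URIUtil;

public class RascalInteropFixed {
    public static void main(String[] args) throws URISyntaxException {
        JavaToRascal j2r = new JavaToRascal(new PrintWriter(System.out), new PrintWriter(System.err));

        // Point at the source *directory* (forward slashes, per the advice above), not at MyTest.rsc itself.
        j2r.getEvaluator().addRascalSearchPathContributor(
                new URIContributor(URIUtil.createFileLocation("/Users/Klemens/workspace/RascalInterop/src")));

        // Register the standard library before importing from it.
        j2r.getEvaluator().addRascalSearchPathContributor(StandardLibraryContributor.getInstance());

        // Make the class loader of this class available to Rascal as well.
        j2r.getEvaluator().addClassLoader(RascalInteropFixed.class.getClassLoader());

        j2r.eval("import MyTest;");
        j2r.eval("import IO;");
    }
}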

Package-private scope in Scala visible from Java

I just found out about a pretty weird behaviour of Scala scoping when bytecode generated from Scala code is used from Java code. Consider the following snippet using Spark (Spark 1.4, Hadoop 2.6):
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

public class Test {

    public static void main(String[] args) {
        JavaSparkContext sc =
                new JavaSparkContext(new SparkConf()
                        .setMaster("local[*]")
                        .setAppName("test"));

        Broadcast<List<Integer>> broadcast = sc.broadcast(Arrays.asList(1, 2, 3));
        broadcast.destroy(true);

        // fails with java.io.IOException: org.apache.spark.SparkException:
        // Attempted to use Broadcast(0) after it was destroyed
        sc.parallelize(Arrays.asList("task1", "task2"), 2)
          .foreach(x -> System.out.println(broadcast.getValue()));
    }
}
This code fails, which is expected since I deliberately destroy a Broadcast before using it, but the thing is that in my mental model it should not even compile, let alone run fine.
Indeed, Broadcast.destroy(Boolean) is declared as private[spark], so it should not be visible from my code. I'll try looking at the bytecode of Broadcast, but it's not my specialty, which is why I prefer posting this question. Also, sorry I was too lazy to create an example that does not depend on Spark, but at least you get the idea. Note that I can use various package-private methods of Spark; it's not just about Broadcast.
Any idea what's going on?
If we reconstruct this issue with a simpler example:
package yuvie

class X {
  private[yuvie] def destory(d: Boolean) = true
}
And inspect the compiled class with javap:
[yuvali#localhost yuvie]$ javap -p X.class
Compiled from "X.scala"
public class yuvie.X {
public boolean destory(boolean);
public yuvie.X();
}
We see that private[package] in Scala becomes public in the bytecode. Why? This comes from the fact that Java package-private isn't equivalent to Scala's package-scoped private. There is a nice explanation in this post:
The important distinction is that 'private [mypackage]' in Scala is not Java package-private, however much it looks like it. Scala packages are truly hierarchical, and 'private [mypackage]' grants access to classes and objects up to "mypackage" (including all the hierarchical packages that may be between). (I don't have the Scala spec reference for this and my understanding here may be hazy; I'm using [4] as a reference.) Java's packages are not hierarchical, and package-private grants access only to classes in that package, as well as subclasses of the original class, something that Scala's 'private [mypackage]' does not allow.
So, 'private [mypackage]' is both more and less restrictive than Java package-private. For both reasons, JVM package-private can't be used to implement it, and the only option that allows the uses that Scala exposes in the compiler is 'public'.
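Since the method ends up public at the bytecode level, plain Java code compiles and runs against it without complaint. A tiny illustrative caller for the yuvie.X example above, assuming the compiled class is on the classpath:

// Calls the method that Scala declared as private[yuvie];
// javac accepts it because the JVM-level modifier is public.
public class CallDestory {
    public static void main(String[] args) {
        yuvie.X x = new yuvie.X();
        System.out.println(x.destory(true)); // prints: true
    }
}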
