Java 7 introduces a great API for writing custom file systems. Consider a use case where I don't want to implement a new file system; I just want to tweak the behavior of the existing one. For example, flip every bit that is written to or read from it.
It seems to me that the current JDK just does not have the appropriate facilities for this. AbstractFileSystemProvider, the provider that WindowsFileSystemProvider extends, is package-private, so I can't reuse it. I didn't even find the concrete implementation for Linux.
Problem #1: There is no useful abstraction of the current file system for extension.
Let's assume I extend only for Windows. WindowsFileSystemProvider is public, so I could easily override newByteChannel and be done with it. But alas!
Problem #2: WindowsFileSystem is not public, so I actually have to code an entirely new FileSystem just to introduce a new Provider.
Am I missing something or is this feature completely raw and not ready to be used by application writers?
After contacting core-libs-dev in openjdk, I got the following answer:
The service provider interface allows you to replace the default provider or interpose on it (see the FileSystems.getDefault docs for the details on how this is configured). When you interpose on the default provider then you have the opportunity to do your customization although it can be tricky to ensure that you get all the delegation right. As a starting point then look at the PassThroughFileSystem in jdk/test tree, this is a provider used by some of the tests and may be what you are looking for.
The PassThroughFileSystem is a nice reference implementation that demonstrates how one can implement a custom provider with proper delegation to the default one. That being said, in my opinion the problem is still there but at least we have a better starting point.
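For the bit-flipping use case above, most of the interesting work in a PassThroughFileSystem-style provider is wrapping the channels it hands out from newByteChannel. Here is a minimal sketch of such a wrapper (the class name is mine, not from the JDK):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SeekableByteChannel;

// Sketch: a channel decorator that flips every bit on read and write.
// A delegating provider's newByteChannel(...) would wrap the channel
// obtained from the default provider in one of these.
public class BitFlippingChannel implements SeekableByteChannel {

    private final SeekableByteChannel delegate;

    public BitFlippingChannel(SeekableByteChannel delegate) {
        this.delegate = delegate;
    }

    @Override
    public int read(ByteBuffer dst) throws IOException {
        int start = dst.position();
        int n = delegate.read(dst);
        for (int i = start; i < dst.position(); i++) {
            dst.put(i, (byte) ~dst.get(i)); // flip the bits just read
        }
        return n;
    }

    @Override
    public int write(ByteBuffer src) throws IOException {
        // Flip into a scratch buffer so the caller's buffer stays untouched.
        ByteBuffer flipped = ByteBuffer.allocate(src.remaining());
        for (int i = src.position(); i < src.limit(); i++) {
            flipped.put((byte) ~src.get(i));
        }
        flipped.flip();
        int written = delegate.write(flipped);
        src.position(src.position() + written); // consume only what was written
        return written;
    }

    @Override public long position() throws IOException { return delegate.position(); }
    @Override public SeekableByteChannel position(long newPosition) throws IOException {
        delegate.position(newPosition);
        return this;
    }
    @Override public long size() throws IOException { return delegate.size(); }
    @Override public SeekableByteChannel truncate(long size) throws IOException {
        delegate.truncate(size);
        return this;
    }
    @Override public boolean isOpen() { return delegate.isOpen(); }
    @Override public void close() throws IOException { delegate.close(); }
}

The provider itself still has to delegate every other FileSystemProvider method to the default provider, which is exactly the boilerplate the PassThroughFileSystem test demonstrates.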
Related
We are migrating a system written in C to Java and must retain existing processes (no debate). We currently "embed" compile-time information into the C application using the C preprocessor, for example:
cc -o xxx.o -DCOMP_ARG='"compile time arg"' xxx.c
The xxx.c file can then use "COMP_ARG" and its value will be embedded in the code and we have little worry about it being changed inadvertently.
We realize Java likes to use properties files; however, our requirements are such that some information **must** be embedded in the code, so properties files are not an option: these values cannot be specified at runtime. To illustrate the point, such data could be a date-stamp of when the file was compiled, but the exact data is irrelevant to the question.
We are looking for a way to specify at compile time various values that are available to the Java code. We are quite aware that Java does not have a pre-processor as does C, so the mechanism would be different.
Our current solution uses a code generation step (Maven), which does work; however, Eclipse wreaks havoc trying to deal with the generated source files, to the point that we had to turn off "Build Automatically". We really want to find a more robust solution.
We appreciate any help, thanks.
The xxx.c file can then use "COMP_ARG" and its value will be embedded in the code and we have little worry about it being changed inadvertently.

...our requirements are such that some information must be embedded in the code....

We are looking for a way to specify at compile time various values that are available to the Java code. We are quite aware that Java does not have a pre-processor as does C, so the mechanism would be different.
It seems that the best way to solve this problem would be to make use of annotations in your code.
In Java, annotations are a kind of interface declaration, but they do not enforce a behavioral contract with an implementing class. Rather, they are meant to define a contract with some external framework, preprocessor, or with the compiler itself. Annotations are used extensively in Java EE 5.0 (and later) to specify configuration and behavior to the framework within which the developer's code runs. Annotations are also used extensively by the JavaDoc documentation processor. Here, the annotations in the doc comments allow you to specify and format the information which you intend to appear in the documentation when the JavaDoc processor runs.
Annotations can be defined to be accessible at runtime. In that case, the primary mechanism for accessing them is the Java Reflection facility. For example, annotations with a retention policy of RUNTIME that are defined on a class can be accessed through that class's corresponding Class object:
Class<MyClass> myCls = MyClass.class; // the "class literal" for MyClass
Annotation[] annotations = myCls.getDeclaredAnnotations();
Annotations can include arguments for parameters to allow for more flexibility in configuration. The use of annotations is most convenient when the code itself can be so annotated.
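For example, a build could stamp values into an annotation like the following. This is only a sketch; the annotation name and its elements are hypothetical, invented to mirror the -DCOMP_ARG example from the question:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation for embedding build-time information in a class.
@Retention(RetentionPolicy.RUNTIME) // retained in the class file and at runtime
@Target(ElementType.TYPE)
public @interface CompileInfo {
    String builtOn();            // e.g. a date-stamp written by the build
    String compArg() default ""; // the moral equivalent of -DCOMP_ARG
}

A class annotated with @CompileInfo(builtOn = "2013-01-15") could then read the value back via MyClass.class.getAnnotation(CompileInfo.class).builtOn(). Note that something still has to write the literal into the source, so this does not by itself remove the code generation step.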
A quick tutorial on how annotations are defined and used in Java is available here: https://docs.oracle.com/javase/tutorial/java/annotations/
I'm going to post my own answer, which seems to be "can't be done". What can't be done, apparently, is to provide Java, at compile time, with a set of parameters that are then available to the program at execution time. The solution appears to be to continue with what I am doing, which is to update a Java source file with the compile-time data, and to figure out how to coax Eclipse to stop overwriting the files.
Thanks to everyone who commented.
I would like to mark usage of certain methods provided by the JRE as deprecated. How do I do this?
You can't. Only code within your control can have the @Deprecated annotation added. Any attempt to reverse engineer the bytecode will result in a non-portable JRE. This is contrary to Java's write once, run anywhere methodology.
You can't deprecate JRE methods, but you can add warnings or even compile errors to your build system, e.g. using AspectJ, or you can forbid the use of given methods in the IDE.
For example, in Eclipse:
Go to Project properties --> Java Compiler --> Errors/Warnings, enable project-specific settings, expand the "Deprecated and restricted API" category, and configure "Forbidden reference (access rule)".
Obviously you could instrument or override the class, adding the @Deprecated annotation, but it's not a clean solution.
Add such restrictions to your coding guidelines, and enforce as part of your code review process.
You can do it if, and only if, you are building your own JRE! In that case just add @Deprecated above the corresponding code block. But if you are using Oracle's JRE, there is no way to do so!
In what context? Do you mean you want to be able to easily configure your IDE to inhibit use of certain API? Or are you trying to dictate to the world what APIs you prohibit? Or are you trying to do something at runtime?
If the first case, Eclipse, and I assume other IDEs, allow you to mark any API as forbidden, discouraged, or accessible at the package or class level.
If you mean the second, you can't, of course. That would be silly.
If you are trying to prohibit certain methods from being called at runtime, you can configure a security policy to prevent code loaded from specified locations from being able to call specific methods that check with the SecurityManager, if one is installed.
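For instance, a custom SecurityManager can veto one specific security-checked operation. A minimal sketch (the class name is mine), blocking System.exit, which consults checkExit:

// Sketch: a SecurityManager that vetoes a single checked operation.
// This only works for methods that actually consult the SecurityManager.
public class NoExitSecurityManager extends SecurityManager {
    @Override
    public void checkExit(int status) {
        throw new SecurityException("System.exit() is not allowed here");
    }

    @Override
    public void checkPermission(java.security.Permission perm) {
        // Permit everything else in this sketch; a real policy would be stricter.
    }
}

Installed via System.setSecurityManager(new NoExitSecurityManager()), this turns any System.exit() call into a SecurityException.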
You can compile your own version of the class and add it to the boot class path or lib/ext directory. http://docs.oracle.com/javase/tutorial/ext/basics/install.html This will change the JDK and the JRE.
In fact, you can remove the method from the copy you compile against, and your program won't compile if it is used.
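With Java 8 and earlier, prepending the patched classes could look like this (the jar name is hypothetical):

java -Xbootclasspath/p:patched-classes.jar com.example.Main

The -Xbootclasspath/p: flag puts your jar ahead of rt.jar on the bootstrap class path, so your recompiled class shadows the JDK's copy.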
Snihalani: Just so that I get this straight ...
You want to 'deprecate methods in the JRE' in order to 'Making sure people don't use java's implementation and use my implementation from now on.' ?
First of all: you can't change anything in the JRE, nor are you allowed to; it's the property of Oracle. You might be able to change something locally if you want to go through the trouble, but that'll just be in your local JRE, not in the ones that can be downloaded from the Oracle webpage.
Besides that, nobody has your implementation, so how would we be able to use it anyway? The implementations provided by Oracle do exactly what they should do, and when a flaw/bug/... is found, it'll be corrected or replaced by a new method (at which point the original method becomes deprecated).
But what mostly worries me is that you would go and replace implementations with something you came up with. That reminds me quite a lot of phishing and similar techniques: having us run your code without knowing what it does, without even knowing we are running your code. After all, if you had access to the original code and could "build" the JRE, what's to stop you from altering the code in the original method?
Deprecated is a way for the author to say:
"Yup ... I did this in the past, but it seems that there are problems with the method.
just in order not to change the behaviour of existing applications using this method, I will not change this method, rather mark it as deprecated, and add a method that solves this problem".
You are not the author, so it isn't up to you to decide whether or not the methods work the way they should anyway.
In software development we all use libraries from software providers. Consider a class A that has functions x, y and z. I just want my development team to avoid using function x. So instead of merely telling them not to use it, I had an idea: inherit the class, override all the functions, throw an UnsupportedOperationException from x, and call the super methods for the rest. But there is still a problem: developers can use the base class A directly. How do I prevent class A from being used directly? I found similar functionality in OSGi, where library bundles can be brought in and then not exported, and so on. Is there any way to achieve this in Java? To illustrate, a sketch of what I have in mind (ClassA and x stand in for the real library class and method):

// Sketch: expose the library class only through a restricted subclass.
public class RestrictedA extends ClassA {
    @Override
    public void x() {
        throw new UnsupportedOperationException(
                "x() is banned by our coding guidelines");
    }
    // y() and z() are inherited unchanged and still delegate to ClassA.
}
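(The exact signatures of x, y and z are of course placeholders; the point is only that x fails fast while the rest of the class keeps working.)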
I suppose code reviews exist for these reasons. Consider a situation where you cannot edit the source of a third party; what would you do? Like Siddharth says, subclass it, throw a meaningful exception, and document it with clear reasons. If someone is using the base class even after that, it is mostly not out of ignorance but out of curiosity. That kind of thing can be appreciated personally and for learning, but for the project's sake the developer has to follow the guidelines.
I think simply telling your developers what to do is preferred over a complex software solution. Sometimes the simple thing is better.
But, if you insist on going down this path, you can enforce your architecture standards using aspects if you're a Spring user. Weave the offending methods with an aspect that throws an exception if they're called.
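For example, with AspectJ weaving enabled, an aspect along these lines fails fast at runtime. The class and method in the pointcut are placeholders for whatever you want to ban:

import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

// Sketch: reject every call site of the forbidden method.
@Aspect
public class ForbiddenMethodAspect {
    @Before("call(* com.thirdparty.ClassA.x(..))")
    public void vetoX() {
        throw new UnsupportedOperationException(
                "ClassA.x() is forbidden; use the approved wrapper instead");
    }
}

Note that call() pointcuts require AspectJ compile-time or load-time weaving; Spring's proxy-based AOP only intercepts execution() on Spring-managed beans.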
You can edit the library class file in a hex editor and change its access modifier from public to package-private. You can also rename it and then use inheritance to wrap the class. Here you can find the class file specification. I once used this technique to substitute a JDBC driver class with a wrapper class that provided some additional logging and other useful tricks.
There is a variety of tools that check source code for adherence to certain rules, such as formatting, dead code, naming conventions for variables etc. Popular ones for Java include the Maven Enforcer plugin, checkstyle and PMD.
These might allow you to write a rule that forbids certain method calls, which you could then check automatically at compile time. As far as I can tell, unfortunately none of the tools above supports "illegal method calls" out of the box; however, at least for PMD, writing new checks is fairly simple.
What are the best practices for using ServiceLoader in an Environment with multiple ClassLoaders? The documentation recommends to create and save a single service instance at initialization:
private static ServiceLoader<CodecSet> codecSetLoader = ServiceLoader.load(CodecSet.class);
This would initialize the ServiceLoader using the current context classloader. Now suppose this snippet is contained in a class loaded via a shared classloader in a web container, and multiple web applications want to define their own service implementations. These would not get picked up by the code above; it might even happen that the loader is initialized using the first webapp's context classloader and then provides the wrong implementation to other users.
Always creating a new ServiceLoader seems wasteful performance-wise, since it has to enumerate and parse the service files each time. Edit: this can even be a big performance problem, as shown in this answer regarding Java's XPath implementation.
How do other libraries handle this? Do they cache the implementations per classloader, do they reparse their configuration every time, or do they simply ignore this problem and only work for one classloader?
I personally do not like the ServiceLoader under any circumstances. It's slow and needlessly wasteful and there is little you can do to optimize it.
I also find it a bit limited -- you really have to go out of your way if you want to do more than search by type alone.
xbean-finder's ResourceFinder
ResourceFinder is a self-contained Java file capable of replacing ServiceLoader usage. Copy/paste reuse is no problem: it's one Java file, ASL 2.0 licensed, and available from Apache.
Before our attention spans get too short, here's how it can replace a ServiceLoader
ResourceFinder finder = new ResourceFinder("META-INF/services/");
List<Class<? extends Plugin>> impls = finder.findAllImplementations(Plugin.class);
This will find all of the META-INF/services/org.acme.Plugin implementations in your classpath.
Note that it does not actually instantiate the implementations. Pick the one(s) you want, and you're one newInstance() call away from having an instance.
Why is this nice?
How hard is it to call newInstance() with proper exception handling? Not hard.
Having the freedom to instantiate only the ones you want is nice.
Now you can support constructor args!
Narrowing search scope
If you want to just check specific URLs you can do so easily:
URL url = new File("some.jar").toURI().toURL();
ResourceFinder finder = new ResourceFinder("META-INF/services/", url);
Here, only 'some.jar' will be searched by any usage of this ResourceFinder instance.
There's also a convenience class called UrlSet which can make selecting URLs from the classpath very easy.
ClassLoader webAppClassLoader = Thread.currentThread().getContextClassLoader();
UrlSet urlSet = new UrlSet(webAppClassLoader);
urlSet = urlSet.exclude(webAppClassLoader.getParent());
urlSet = urlSet.matching(".*acme-.*.jar");
List<URL> urls = urlSet.getUrls();
Alternate "service" styles
Say you wanted to apply the ServiceLoader type concept to redesign URL handling and find/load the java.net.URLStreamHandler for a specific protocol.
Here's how you might layout the services in your classpath:
META-INF/java.net.URLStreamHandler/foo
META-INF/java.net.URLStreamHandler/bar
META-INF/java.net.URLStreamHandler/baz
Here foo is a plain text file that contains the name of the service implementation, just as before. Now say someone creates a foo://... URL; we can find the implementation for it quickly via:
ResourceFinder finder = new ResourceFinder("META-INF/");
Map<String, Class<? extends URLStreamHandler>> handlers = finder.mapAllImplementations(URLStreamHandler.class);
Class<? extends URLStreamHandler> fooHandler = handlers.get("foo");
Alternate "service" styles 2
Say you wanted to put some configuration information in your service file, so that it contains more than just a class name. Here's an alternate style that resolves services to properties files: by convention, one key would be the class name and the other keys would be injectable properties.
So here red is a properties file
META-INF/org.acme.Plugin/red
META-INF/org.acme.Plugin/blue
META-INF/org.acme.Plugin/green
You can look things up similarly as before.
ResourceFinder finder = new ResourceFinder("META-INF/");
Map<String,Properties> plugins = finder.mapAllProperties(Plugin.class.getName());
Properties redDefinition = plugins.get("red");
Here's how you could use those properties with xbean-reflect, another little library that gives you framework-free IoC: you just give it the class name and some name/value pairs, and it will construct and inject the object.
ObjectRecipe recipe = new ObjectRecipe(redDefinition.remove("className").toString());
recipe.setAllProperties(redDefinition);
Plugin red = (Plugin) recipe.create();
red.start();
Here's how that might look spelled out in long form:
ObjectRecipe recipe = new ObjectRecipe("com.example.plugins.RedPlugin");
recipe.setProperty("myDateField","2011-08-29");
recipe.setProperty("myIntField","100");
recipe.setProperty("myBooleanField","true");
recipe.setProperty("myUrlField","http://www.stackoverflow.com");
Plugin red = (Plugin) recipe.create();
red.start();
The xbean-reflect library goes a step beyond the built-in JavaBeans API without requiring you to go all the way to a full-on IoC framework like Guice or Spring. It supports factory methods, constructor args, and setter/field injection.
Why is the ServiceLoader so limited?
Deprecated code in the JVM damages the Java language itself. Many things are trimmed to the bone before being added to the JVM, because you cannot trim them afterwards. The ServiceLoader is a prime example of that: the API is limited, and the OpenJDK implementation is somewhere around 500 lines including javadoc.
There's nothing fancy there and replacing it is easy. If it doesn't work for you, don't use it.
Classpath scope
APIs aside, in pure practicality, narrowing the scope of the URLs searched is the true solution to this problem. App servers have quite a lot of URLs all by themselves, not counting the jars in your application. Tomcat 7 on OS X, for example, has about 40 URLs in the StandardClassLoader alone (this is the parent of all webapp classloaders).
The bigger your app server the longer even a simple search will take.
Caching doesn't help if you intend to search for more than one entry, and it can add some bad leaks; it can be a real lose-lose scenario.
Narrow the URLs down to the 5 or 12 that you really care about and you can do all sorts of service loading and never notice the hit.
Have you tried using the two-argument version so that you can specify which classloader to use? That is, java.util.ServiceLoader.load(Class, ClassLoader).
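Applied to the CodecSet example from the question, a per-access lookup would be:

ServiceLoader<CodecSet> codecSetLoader =
        ServiceLoader.load(CodecSet.class, Thread.currentThread().getContextClassLoader());

This is essentially what the JAXP-style context-classloader lookup amounts to.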
Mu.
In a 1x WebContainer <-> Nx WebApplication system, a ServiceLoader instantiated in the web container will not pick up any classes defined in the web applications, just those in the container. A ServiceLoader instantiated in a web application will detect classes defined in the application in addition to those defined in the container.
Keep in mind that web applications need to be kept separate; they are designed that way, and things will break if you try to circumvent that. They are also not the mechanism for extending the container: if your library is a simple jar, just drop it into the appropriate extension folder of the container.
I really like Neil's answer in the link I added in my comment, since I had the same experience in a recent project.
"Another thing to bear in mind with ServiceLoader is to try to abstract the lookup mechanism. The publish mechanism is quite nice and clean and declarative. But the lookup (via java.util.ServiceLoader) is as ugly as hell, implemented as a classpath scanner that breaks horribly if you put the code into any environment (such as OSGi or Java EE) that does not have global visibility. If your code gets tangled up with that then you'll have a hard time running it on OSGi later. Better to write an abstraction that you can replace when the time comes."
I actually ran into this problem in an OSGi environment (in our project it's just Eclipse), but I luckily fixed it in a timely fashion. My workaround is to take one class from the plugin I want to load and get the classloader from it. That is a valid fix. I didn't use the standard ServiceLoader, but my process is quite similar, using a properties file to define the plugin classes I need to load. I know there is another way to find each plugin's classloader, but at least I don't need to use it.
Honestly, I don't like the generics used in ServiceLoader, because they limit each ServiceLoader to handling classes for a single interface. Is that really useful? My implementation doesn't force this limitation on you: I just use one loader implementation to load all the plugin classes. I don't see a reason to use two or more, since the consumer can learn the relationships between interfaces and implementations from the config files.
This question seems to be more complicated than I first anticipated. As I see it, there are three possible strategies for dealing with ServiceLoaders.

1. Use a static ServiceLoader instance and only support loading classes from the same classloader as the one holding the ServiceLoader reference. This works when either

a. the service configuration and implementation are in a shared classloader and all child classloaders use the same implementation (the example in the documentation is geared towards this use case), or

b. the configuration and implementation are put into each child classloader and deployed along with each webapp in WEB-INF/lib.

In neither scenario is it possible to deploy the service in a shared classloader and let each webapp choose its own service implementation.

2. Initialize the ServiceLoader on each access, passing the context classloader of the current thread as the second parameter. This approach is taken by the JAXP and JAXB APIs, although they use their own FactoryFinder implementation instead of ServiceLoader. It makes it possible to bundle an XML parser with a webapp and have it picked up automatically, for example by DocumentBuilderFactory#newInstance. The lookup has a performance impact, but in the case of XML parsing the time to look up the implementation is small compared to the time needed to actually parse an XML document. In the library I'm envisioning, the factory itself is pretty simple, so the lookup time would dominate the performance.

3. Somehow cache the implementation class with the context classloader as the key. I'm not entirely sure this is possible in all the above cases without causing memory leaks (a sketch of this approach follows below).

In conclusion, I will probably ignore this problem and require that the library be deployed inside each webapp, i.e. option 1b above.
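For completeness, here is a minimal sketch of strategy 3 under those caveats, keyed weakly on the context classloader. The class names are mine; CodecSet is the service type from the question:

import java.util.Collections;
import java.util.Map;
import java.util.ServiceLoader;
import java.util.WeakHashMap;

// Sketch: cache one resolved CodecSet per context classloader. The keys are
// weak, but note the leak risk: if the cached value strongly references its
// own classloader, the key can never be collected.
public final class CodecSetCache {

    private static final Map<ClassLoader, CodecSet> CACHE =
            Collections.synchronizedMap(new WeakHashMap<ClassLoader, CodecSet>());

    private CodecSetCache() {}

    public static CodecSet get() {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        CodecSet impl = CACHE.get(cl);
        if (impl == null) {
            // First access for this classloader: enumerate providers once.
            for (CodecSet candidate : ServiceLoader.load(CodecSet.class, cl)) {
                impl = candidate; // naive policy: first provider wins
                break;
            }
            if (impl != null) {
                CACHE.put(cl, impl);
            }
        }
        return impl;
    }
}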
I'm developing a system that allows developers to upload custom Groovy scripts and FreeMarker templates.
I can provide a certain level of security at a very high level with the default Java security infrastructure, i.e. prevent code from accessing the filesystem or network; however, I need to restrict access to specific methods.

My plan was to modify the Groovy and FreeMarker runtimes to read annotations that would either whitelist or blacklist certain methods, but this would force me to maintain a forked version of their code, which is not desirable.
All I essentially need is to prevent the execution of specific methods when they are called from Groovy or FreeMarker. I've considered a hack that would look at the call stack, but this would be a massive speed hit (and it's quite messy).
Does anyone have any other ideas for implementing this?
You can do it by subclassing GroovyClassLoader and enforcing your constraints within an AST visitor. This post explains how to do it: http://hamletdarcy.blogspot.com/2009/01/groovy-compile-time-meta-magic.html
Also, the code referenced there is in the samples folder of Groovy 1.6 installer.
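For what it's worth, Groovy 1.8 later shipped a ready-made variant of this compile-time approach, SecureASTCustomizer. A minimal sketch (the factory class name is mine, and the blacklisted receiver is just an example):

import java.util.Arrays;

import org.codehaus.groovy.control.CompilerConfiguration;
import org.codehaus.groovy.control.customizers.SecureASTCustomizer;

import groovy.lang.GroovyShell;

// Sketch: reject blacklisted constructs at compile time so that forbidden
// calls never execute.
public class SandboxedShellFactory {
    public static GroovyShell newSandboxedShell() {
        SecureASTCustomizer secure = new SecureASTCustomizer();
        // Scripts may not use java.lang.System as a method receiver.
        secure.setReceiversBlackList(Arrays.asList(System.class.getName()));

        CompilerConfiguration config = new CompilerConfiguration();
        config.addCompilationCustomizers(secure);
        return new GroovyShell(config);
    }
}

Since this is a static check on the AST, sufficiently dynamic code can sidestep it; the groovy-sandbox project mentioned in the next answer takes the complementary runtime approach.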
You should have a look at the groovy-sandbox project from kohsuke. Also have a look at his blog post on this topic and what his solution addresses: sandboxing, but with a performance drawback.
OSGi is great for this. You can partition your code into bundles and set exactly what each bundle exposes, and to what other bundles. Would that work for you?
You might also consider the java-sandbox (http://blog.datenwerke.net/p/the-java-sandbox.html), a recently developed library that allows you to securely execute untrusted code from within Java.
Also see: http://blog.datenwerke.net/2013/06/sandboxing-groovy-with-java-sandbox.html