Existing APIs/libraries to compile/run Java code?

I'm trying to add a java scripting option to my web application with the following requirements:
1. The user writes custom Java code.
2. The user can compile/run this code.
3. The user can upload JARs and use them in their code (without using a custom class loader/reflection).
By overriding the Java class loader, I managed to achieve the first two requirements.
However, for the 3rd requirement, I am unsure how to proceed.
I was wondering whether there is an existing API that handles this, or any offline Java code compiler that can execute this custom code:
input: Java code + dependent JARs -> output: code execution

There is no way to dynamically load additional JARs from the user without creating a new ClassLoader instance. But it doesn't necessarily need to be a custom subclass of ClassLoader.
The general idea is that you create a new instance of (say) URLClassLoader, providing the URLs for (say) a local copy of the user's JARs. This new classloader would typically have your application's classloader as its parent. You would then use the new classloader to load the user's classes, i.e. the ones you want to use. You would need reflection to create instances of those classes, but after casting them to an interface in your app's code base you can invoke methods on them as you would on a normal class.
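For illustration, a minimal sketch of that approach might look like the following (the UserScript interface, class names, and JAR handling are assumptions for the example, not part of the question):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Sketch: load a user-supplied JAR with a URLClassLoader parented on the
// application's classloader, instantiate a user class reflectively, and use
// it through an interface defined in the application's own code base.
public class UserCodeRunner {

    // Hypothetical interface that user classes are expected to implement.
    public interface UserScript {
        void run();
    }

    public static void runUserScript(File userJar, String className) throws Exception {
        URL[] urls = { userJar.toURI().toURL() };
        try (URLClassLoader loader = new URLClassLoader(urls, UserCodeRunner.class.getClassLoader())) {
            Class<?> clazz = loader.loadClass(className);
            // Reflection is only needed for instantiation; after the cast the
            // object is used like any other implementation of the interface.
            UserScript script = (UserScript) clazz.getDeclaredConstructor().newInstance();
            script.run();
        }
    }
}

In a long-running webapp you would typically keep the loader (and the instances it created) alive for as long as the user's JAR is in use, rather than closing it after a single call.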
Offline compilation won't help. The problem you need to solve is getting your application to >>load<< the user's code.
Warning: what you are proposing to do sounds rather dangerous. You have no real control over the code that the user puts into the Java source or JAR. It could potentially do all sorts of malicious things: reading/writing your server's files, exfiltrating secrets or confidential data, mining bitcoin, making your server participate in a DDoS. Or it could just do something annoying and disruptive.
That potentially malicious code is going to run within your webapp, on your server.

Related

Is there any way that users of my modular library are able to access classes which have not been exported?

I'm trying to get familiar with the module system introduced in Java 9, and I would like to know the best way to leverage it.
For a library I'm writing, I would like to do the following (ignore the naming of the packages):
Expose only interfaces, simple POJO classes, and factory classes via com.myproject.api. Everything in this package can be used by the users.
Put the implementation of interfaces in com.myproject.core. Users should not be able to access anything in here.
My reasoning is that users should not be confused or overwhelmed by the implementation logic. Instead, they can just look at (hopefully) clean and well documented interfaces.
However, due to the way Java packages work, it can be difficult to restrict use of certain classes without making them all package private. But I don't like putting all the classes in one package, and would rather organize them into various packages.
After reading about the module system, I believe I can do the following to achieve what I want. This is the module-info.java file:
module com.myproject {
    exports com.myproject.api;
}
From my understanding, the users of my library will be able to use everything defined in the com.myproject.api package (by adding requires com.myproject to their own module-info file).
But is there any way that users will be able to access anything in the com.myproject.core package? I have no problem with them looking at the code (via IDE or the source code itself), but I just don't want to end up supporting classes/methods/logic which I didn't want to expose.
I'm concerned that users who don't have a modularized application or users who put my library JAR on the classpath will somehow find a way to get access to the supposed restricted package.
Please let me know if you need any other information.
A pre-JDK 9 user of your library cannot exist, since you're going to use the Java Platform Module System, which is post-JDK 8, and thus you're going to compile to a class file version greater than 52.
That said, your users will be able to look at the source code (if you ship it), and they will obviously be able to extract your .class files.
By definition, "a type in a module is not accessible to other modules unless it's a public type and you export its package."
The only way for others to gain reflective access to your classes would be if you willingly opened them with an
opens your.package
directive. So basically, you're also covered on the reflection side. Note that opens grants access only at run time (including deep reflection); it does not make the package usable at compile time.
If you want to control reflective access to your classes in a non-modular/pre-JDK 9 environment, a SecurityManager might be what you're looking for. However, this requires access to the JVM configuration.
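To make this concrete, a module descriptor along the lines of the question could look like this (module and package names follow the question; the opens line in the comment is only an example of what you would add if a framework needed reflective access):

// module-info.java for the library
module com.myproject {
    // Users can compile against and use the public types in the API package.
    exports com.myproject.api;

    // com.myproject.core is neither exported nor opened, so modular consumers
    // can neither compile against it nor reflect into it.
    // Only if some framework needed reflective access would you add e.g.:
    // opens com.myproject.core to some.framework.module;
}

A modular consumer would then declare requires com.myproject in its own module-info.java.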

Using class loader to enable shared code between Java and Android

I am trying to build an application that runs under Java SE and Android. Most of the code is the same for both, but there are some platform-specific functions that need to be separated. I use Eclipse, so I decided to put the shared code in a separate project and then build one project for Android and one for Java, each referencing the shared project. All Java- and Android-specific functions live in one class per platform project: UtilsJ (for Java) and UtilsA (for Android). The code in the shared project uses a factory to determine at runtime which version it needs to pick, and then calls the class loader to load the right class. Essentially: if the java.vm.name property equals Dalvik, load UtilsA, otherwise load UtilsJ (and of course cast to the Utils interface before returning).
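In code, the factory described above looks roughly like this (the package names and the no-arg constructors are assumptions for illustration; Utils is the shared interface mentioned above):

// Shared project: picks the platform-specific Utils implementation at runtime.
public class UtilsFactory {

    public static Utils getUtils() throws Exception {
        String vmName = System.getProperty("java.vm.name", "");
        // Hypothetical fully-qualified names; the classes live in the
        // Android-specific and Java-specific projects respectively.
        String className = "Dalvik".equals(vmName)
                ? "com.example.android.UtilsA"
                : "com.example.desktop.UtilsJ";
        // Loading by name keeps the shared project free of compile-time
        // dependencies on either platform project.
        Class<?> clazz = Class.forName(className);
        return (Utils) clazz.getDeclaredConstructor().newInstance();
    }
}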
My question is simply if this is a good idea or is something going to eventually break? I've never used class loader before. Any other suggestions how to implement this sharing would also be appreciated.
Generating an interface implementation dynamically is certainly a valid technique. For instance, having a data access interface that has multiple implementations; one each for flat files, MySQL and WebDAV. The program can pick an implementation at run time based on system/platform properties.
But this feels different. If I saw that I had a Java app and an Android app that had a lot of common code, my goal would be to create an Eclipse project that generates a jar file that I could just drop into the libraries of both projects. In that case, the jar file wouldn't contain any code that was incompatible with one platform or the other. So there wouldn't be any reason to have a platform-specific implementation.
Let's take your example: some code reading an initialization file. If it's common code, you have an input parameter which is a file. On Android, maybe it's "/data/data/com.whatever.blahblahblah", and on Java you're getting the "user.dir" system property for the top-level directory. But one way or another it's a File, and you hand it to your common setup method. That's okay. But if your initialization-file-reading code needs, say, a Context to get a Resource to read the file on Android, then it's not common code, and it doesn't belong in a library JAR for a JVM-hosted app.
So I think that in your case the platform-specific implementation classes are overkill. If it's common code, it's the same code — period.
Let's talk about another example in your comment. If you are using desktop Java, then you are probably using Swing or AWT, so you still have the same issue of running some network task off the UI thread, notifying when it completes, maybe even updating some progress indicator UI while it's processing. Same function, same operation, but the code is so different that I can't see how having it in the same library next to an AsyncTask version could be of any benefit.
And testing might get tricky. Obviously JUnit will work for everything, but some tests would need to run on a device or emulator.
I stated that it was a valid technique, and of course you may have other compelling reasons to choose the multi-platform option. You asked the question; is anything going to break? My answer is: Probably not, but why risk dealing with some heartburn down the road? Speaking for myself, I wouldn't do it. If I had to support multiple MVC apps, my common library would have nothing but M.

Do I need to extend ClassLoader to redirect web application resource loading?

My question is two-fold. First, I'll explain the problem, and second, assuming the solution is to implement a class loader, how to go about doing that on a web application.
My problem is this: our company is using a framework made by another company. It uses XML files to generate web pages, and these XML files are located within other libraries (JAR files). It wasn't meant to be dynamic, because these libraries are regenerated often (weekly?), but the XML files determine how many fields there are, what type of information each collects (datetime, combo box, etc.), and so on.
Now the question has been raised by my company whether it would be possible to move these fields around dynamically (by dynamic, I mean ideally you could refresh the page and see the effect of changes made to the layout). I did a few preliminary tests and discovered that modifying the XML does give the desired effect on the web page. However, since these XML files are located inside the JARs, I see two possibilities:
Create a tool which modifies the JAR outside the scope of my web application, though this obviously means it cannot be dynamic. I'd also have to build an interface separate from the web application to manage the tool. On top of that, I can't shake the impression that this is an incredibly hacky approach and that I should probably avoid it at all costs.
Implement a class loader (specifically, override getResourceAsStream), and when I see a call to load one of these XML files, rather than performing the default behavior, generate the XML based on the original, modifying information as required, then return the resource to the caller (which in this case would be the third-party framework).
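A rough sketch of what option #2 could look like, assuming the framework resolves its XML through the classloader (the rewrite step and the .xml filter are placeholders):

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

// Sketch: a delegating classloader that intercepts getResourceAsStream for the
// framework's XML page definitions and hands back a modified copy.
public class XmlRewritingClassLoader extends ClassLoader {

    public XmlRewritingClassLoader(ClassLoader parent) {
        super(parent);
    }

    @Override
    public InputStream getResourceAsStream(String name) {
        InputStream original = super.getResourceAsStream(name);
        if (original == null || !name.endsWith(".xml")) {
            return original;
        }
        // Placeholder transformation: read the original definition, apply the
        // layout changes, and return the rewritten document to the framework.
        String originalXml = new Scanner(original, "UTF-8").useDelimiter("\\A").next();
        String rewritten = rewriteXml(originalXml);
        return new ByteArrayInputStream(rewritten.getBytes(StandardCharsets.UTF_8));
    }

    private String rewriteXml(String originalXml) {
        // Hypothetical: real code would parse the XML and move fields around.
        return originalXml;
    }
}

The harder part in a web container is getting the framework to actually use such a loader; wiring it in as (or above) the webapp classloader is container-specific.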
My first question is: is #2 my best option, or are there other options (or should I stick with #1)?
My second question is: assuming I should implement my own class loader, how best can I do this in a web application? I'm using Tomcat 7, but if possible I would like the solution to be independent of the web container.
Any help would be greatly appreciated!
You could probably simply explode the JAR to a directory that is on the classpath and update the XML files in place and on the fly. This won't account for any internal caching within the application (if any; that's a different problem), but it's straightforward to implement and doesn't put you in the shenanigan-filled ClassLoader business.
I'm not sure I understand your question, but I guess you could try using the XStream API from ThoughtWorks. It can generate XML on the fly for you, given a Java object, and from that point on you can treat the XML the way you do now to generate your web pages.
I know this answer is rather trivialising, but if it leads you to an API that helps you generate XML with minimum fuss, then I guess it will have served your purpose.

Does anyone know how a white-list class access approach similar to Google App Engine can be implemented?

I am writing a container framework that can dynamically deploy a JAR file containing user-developed classes into the container, and then, via a web interface, execute certain classes from the JAR file.
Everything else is in place, including the validations. However, one requirement is to only allow access to certain JDK and other library classes from the user-developed classes. This is because the container needs assurance that no one (intentionally or otherwise) ends up running a piece of Java code that results in "bad" behavior.
Generally, I find stuff on Google on almost all topics. In this case, I just could not :(
See the question "Can I deny access to a JVM class by configuring the java.policy file?" below; it suggests using a custom classloader.
Note that a custom classloader is not enough against a malicious person: he/she can access a parent classloader and load the restricted classes through it. In addition to a custom classloader, you should set a security manager and revoke the getClassLoader permission (and perhaps some other permissions too).
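For what it's worth, a minimal sketch of the whitelist idea might look like this (the allowed prefixes and package names are purely illustrative assumptions):

import java.net.URL;
import java.net.URLClassLoader;
import java.util.Arrays;
import java.util.List;

// Sketch: the user's JAR goes on this loader's URL list, and any class whose
// name does not start with an approved prefix is rejected instead of being
// delegated to the parent.
public class WhitelistClassLoader extends URLClassLoader {

    private static final List<String> ALLOWED_PREFIXES = Arrays.asList(
            "java.lang.",            // basic JDK types
            "java.util.",            // collections etc.
            "com.mycontainer.api.",  // the container's public API
            "com.example.user.");    // the user's own packages

    public WhitelistClassLoader(URL[] userJarUrls, ClassLoader parent) {
        super(userJarUrls, parent);
    }

    @Override
    public Class<?> loadClass(String name) throws ClassNotFoundException {
        if (ALLOWED_PREFIXES.stream().noneMatch(name::startsWith)) {
            throw new ClassNotFoundException("Class is not whitelisted: " + name);
        }
        return super.loadClass(name);
    }
}

As the answer above notes, this alone is not a sandbox; a security manager with the getClassLoader permission revoked is still needed to stop code from reaching another loader.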

Can I deny access to a JVM class by configuring the java.policy file?

I wanted to add to my jdk6\jre\lib\security\java.policy file a restriction that forbids creating instances of classes that are blacklisted by App Engine. For example, I want my local JVM to throw an exception when the application tries to instantiate javax.naming.NamingException.
Is it possible?
I will try to explain my specific problem here. Google offers a service (GAE, Google App Engine) that has limitations on which classes can be used. For example, you cannot instantiate the JNDI classes in the javax.naming package. Google also offers a testing server that can be used to test the application on my machine, but that server allows such classes and executes the code without complaint. You only find out that you used a blacklisted class after you upload your application to Google. I was wondering whether such blacklist enforcement could be done on the development JVM. Then again, if it were that easy, they would probably already provide such a policy file.
You could write a small loader application that creates a new, custom classloader. Your application classes could then be loaded using this classloader.
In the custom classloader, you can then throw ClassNotFoundException when your application tries to access a class that you want to blacklist.
You will need to override the loadClass() method. This method will be responsible for throwing the exception for your blacklisted classes, or for delegating to the parent ClassLoader if the class is allowed. A sample implementation:
public Class<?> loadClass(String name) throws ClassNotFoundException {
    if (name.equals("javax.lang.ClassIDontLike")) {
        throw new ClassNotFoundException("I'm sorry, Dave. I'm afraid I can't do that.");
    }
    return super.loadClass(name, false);
}
(Of course, a real implementation can be way more sophisticated than this)
Because the classes of your application are loaded through this classloader, and you only delegate the loadClass() invocations to the parent classloader when you want to, you can blacklist any classes you need.
I am pretty sure that this is the method that Google uses to blacklist classes in their server. They load every app in a specific Classloader. This is also similar to the way that Tomcat isolates the different Web Applications.
Wouldn't you rather get compilation errors than runtime errors while testing your program? You could configure your IDE or compiler to warn you when an undesired class is instantiated. I know AspectJ has some nice features for this: You can define compilation warnings/errors on join points and get feedback in e.g. Eclipse. To use this in Eclipse, you simply install the AspectJ plugin and write a suitable aspect. To get the errors while compiling from a command line or script, you would actually have to use the AspectJ compiler, but I doubt that you would need that.
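For example, with AspectJ's annotation style such a compile-time check might be declared roughly like this (a sketch; the pointcut and message are assumptions, and it needs to be compiled or woven with ajc to take effect):

import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.DeclareError;

// When the project is built with the AspectJ compiler (or the Eclipse AspectJ
// plugin), any construction of the blacklisted class becomes a compile error.
@Aspect
public class AppEngineBlacklist {

    @DeclareError("call(javax.naming.NamingException.new(..))")
    public static final String NAMING_BANNED =
            "javax.naming.NamingException is blacklisted on App Engine";
}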
The Java documentation lists all possible policy permissions here:
http://java.sun.com/javase/6/docs/technotes/guides/security/permissions.html
Class creation / loading is not mentioned, so I believe you cannot enforce this using a policy.
At any rate, why do you want to throw an exception when an exception class is loaded? Maybe you could explain your problem, then someone might be able to propose a solution.
Edit:
One way to prevent loading of certain classes would be to remove them from the JRE installation. Most system classes are contained in rt.jar in your JDK/JRE installation. You should be able to modify it with any ZIP tool.
Just create a special installation of your JRE, and modify its rt.jar. That is an ugly hack, but should be OK for testing purposes...
