I'm working on a Java library that I would like to be able to use across a couple of different Java compiler versions. Some annotations (specifically @SafeVarargs) only exist on some of these compiler versions and generate errors on others.
Especially for something like @SafeVarargs, which serves mostly as a marker to suppress warnings rather than actually changing the output of the compiler, I would like to be able to use these annotations and simply provide a dummy implementation if an earlier compiler is missing them.
How would I go about doing this?
I guess you could just create surrogate implementations of those annotations and put them in a Jar that is added to the classpath, making sure that the system/compiler-provided ones take priority when resolved by the corresponding class loader.
For example, you can just copy the code of SafeVarargs from here.
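For reference, such a surrogate would essentially mirror the JDK 7 declaration of SafeVarargs, roughly like the sketch below. Bear in mind that class loaders normally refuse to define java.* classes from the classpath, so this only helps the compiler; on an older JRE the annotation class is typically never loaded at all.

package java.lang; // caveat: java.* classes cannot be defined from the classpath at runtime

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Surrogate for the JDK 7 @SafeVarargs marker annotation, intended for older compilers only.
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.CONSTRUCTOR, ElementType.METHOD})
public @interface SafeVarargs {
}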
We are migrating a system written in C to Java and must retain existing processes (no debate). We currently "embed" compile-time information into the C application using the C preprocessor, for example:
cc -o xxx.o -DCOMP_ARG='"compile time arg"' xxx.c
The xxx.c file can then use "COMP_ARG" and its value will be embedded in the code and we have little worry about it being changed inadvertently.
We realize Java likes to use properties files; however, our requirements are such that some information must be embedded in the code, so properties files are not an option - these certain values cannot be specified at runtime. To illustrate the point, such data could be a date-stamp of when the file was compiled, but the exact data is irrelevant to the question.
We are looking for a way to specify at compile time various values that are available to the Java code. We are quite aware that Java does not have a pre-processor as does C, so the mechanism would be different.
Our current solution is using a code generation step (Maven), which does work; however, Eclipse wreaks havoc trying to deal with the generated source files, so that we had to turn off "Build Automatically". We really want to find a more robust solution.
We appreciate any help, thanks.
The xxx.c file can then use "COMP_ARG" and its value will be embedded
in the code and we have little worry about it being changed
inadvertently.
...our requirements are such that some information must be embedded in the
code....
We are looking for a way to specify at compile time various values
that are available to the Java code. We are quite aware that Java does
not have a pre-processor as does C, so the mechanism would be
different.
It seems that the best way to solve this problem would be to make use of annotations in your code.
In Java, annotations are a kind of interface declaration, but they do not enforce a behavioral contract with an implementing class. Rather, they are meant to define a contract with some external framework, preprocessor, or with the compiler itself. Annotations are used extensively in Java EE 5.0 (and later) to specify configuration and behavior to the framework within which the developer's code runs. A similar idea appears in the doc-comment tags processed by the JavaDoc tool, which let you specify and format the information you intend to appear in the generated documentation.
Annotations can be defined to be accessible at runtime. In such a case, the primary mechanism for accessing them is the Java Reflection facility. For example, annotations with a retention policy of RUNTIME that are defined on a class can be accessed through that class's corresponding Class object:
Class<?> myCls = MyClass.class; // the "class literal" for MyClass
Annotation[] annotations = myCls.getDeclaredAnnotations();
Annotations can declare elements (parameters) whose values allow for more flexibility in configuration, as in the sketch below. The use of annotations is most convenient when the code itself can be so annotated.
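As an illustrative sketch (the annotation name and its elements are made up for this example, not part of any library), a runtime-visible annotation carrying parameter values and read back via reflection could look like this:

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Hypothetical annotation with two elements (parameters); RUNTIME retention makes it
// visible to reflection at execution time. Element values must be compile-time constants.
@Retention(RetentionPolicy.RUNTIME)
@interface BuildStamp {
    String builtBy();
    String buildDate();
}

@BuildStamp(builtBy = "build-server", buildDate = "2013-11-20")
class StampedService {
    public static void main(String[] args) {
        BuildStamp stamp = StampedService.class.getAnnotation(BuildStamp.class);
        System.out.println("Built by " + stamp.builtBy() + " on " + stamp.buildDate());
    }
}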
A quick tutorial on how annotations are defined and used in Java is available here: https://docs.oracle.com/javase/tutorial/java/annotations/
I'm going to post my own answer, which seems to be "can't be done" - what can't be done, apparently, is to hand Java a set of parameters at compile time that is then available to the program at execution time. The solution appears to be to continue with what I am doing, which is to update a Java source file with the compile-time data and figure out how to coax Eclipse to stop overwriting the files.
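For reference, the generated-source approach looks roughly like the sketch below; the class name, constant names, and ${...} placeholders are illustrative assumptions about how the build step fills in values before compilation.

// BuildInfo.java - written/filtered by the build (e.g. a Maven generate-sources step) before
// javac runs. The ${...} tokens are placeholders the build replaces; after substitution the
// values are compile-time constants baked into the bytecode.
public final class BuildInfo {
    public static final String COMP_ARG   = "${comp.arg}";
    public static final String BUILD_DATE = "${build.timestamp}";

    private BuildInfo() { }
}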
Thanks to everyone who commented.
Platform: Windows 7, MinGW, MSYS, Java 1.5
I have the Thrift 0.9.1 compiler (prebuilt for Windows) and its source. I use Ant to build the Java library.
I create a Thrift IDL file and compile it with the compiler. There is no problem generating the code files.
I add these files to my project, and then add slf4j (downloaded from their site) and libthrift.
Most of the errors that I had previously (imports etc.) are gone, except for errors related to overriding methods.
So basically it complains like:
The method clear() of type Server must override a superclass method
and similarly for compareTo, write, read, etc. In short, it complains about all methods that are overridden. This is all Thrift-compiler-generated code and I haven't changed anything.
Is there any incompatibility? I cannot really find any mention of that. I have tried removing and then adding the libraries, I have also tried cleaning, refreshing, validating the project but the errors are still there.
I have also tried to compile the Thrift code itself, but MinGW is a huge headache. It cannot find configure even though I have installed it. If I run the MSYS console, it is able to configure but cannot make, complaining that inttypes.h is not present (it is not in the MSYS include directory but is present in the MinGW include directory).
Any suggestion would be appreciated.
Are you using Java 5? With Java 5, @Override doesn't search for methods on interfaces, only on superclasses.
If you are using a Java 5 compiler, try using a more recent javac (preferably 7 or 8) and see if that works.
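A minimal illustration of the Java 5 behaviour (the names just mirror the error message above):

interface Resettable {
    void clear();
}

public class Server implements Resettable {
    @Override // accepted by javac 6+; rejected at Java 5 compliance with
              // "The method clear() of type Server must override a superclass method"
    public void clear() {
    }
}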
EDIT:
Not sure if this is in your version of Thrift, but in mine it looks like there is a flag called java5 that you can specify when generating code to indicate that you want the generated code to be Java 5 compliant (see the example invocation after the option list below):
java (Java):
beans: Members will be private, and setter methods will return void.
private-members: Members will be private, but setter methods will return 'this' like usual.
nocamel: Do not use CamelCase field accessors with beans.
fullcamel: Convert underscored_accessor_or_service_names to camelCase.
android: Generated structures are Parcelable.
android_legacy: Do not use java.io.IOException(throwable) (available for Android 2.3 and above).
java5: Generate Java 1.5 compliant code (includes android_legacy flag).
reuse-objects: Data objects will not be allocated, but existing instances will be used (read and write).
sorted_containers: Use TreeSet/TreeMap instead of HashSet/HashMap as an implementation of set/map.
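Assuming your Thrift version has that flag, the invocation would look something like this (the IDL file name is just an example):

thrift --gen java:java5 service.thrift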
I would like to mark usage of certain methods provided by the JRE as deprecated. How do I do this?
You can't. Only code within your control can have the @Deprecated annotation added. Any attempt to reverse engineer the bytecode will result in a non-portable JRE. This is contrary to Java's write once, run anywhere methodology.
You can't deprecate JRE methods, but you can add warnings or even compile errors to your build system, e.g. using AspectJ, or forbid the use of given methods in the IDE.
For example in Eclipse:
Go to Project properties --> Java Compiler --> Errors/Warnings, then enable project-specific settings and expand the "Deprecated and restricted APIs" category:
"Forbidden reference (access rule)"
Obviously you could instrument or override the class, adding the @Deprecated annotation, but it's not a clean solution.
Add such restrictions to your coding guidelines, and enforce as part of your code review process.
You can only do it if, and only if, you are building your own JRE! In that case, just add @Deprecated above the corresponding code block. But if you are using Oracle's JRE, there is no way to do so.
In what context? Do you mean you want to be able to easily configure your IDE to inhibit use of certain API? Or are you trying to dictate to the world what APIs you prohibit? Or are you trying to do something at runtime?
If the first case, Eclipse, and I assume other IDEs, allow you to mark any API as forbidden, discouraged, or accessible at the package or class level.
If you mean the second, you can't, of course. That would be silly.
If you are trying to prohibit certain methods from being called at runtime, you can configure a security policy to prevent code loaded from specified locations from being able to call specific methods that check with the SecurityManager, if one is installed.
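For the runtime case, the mechanism is the standard policy file; a minimal sketch (the paths and the permission are illustrative assumptions) might be:

// example.policy: only code from the trusted directory is granted the permission below;
// code loaded from anywhere else fails the SecurityManager check with a SecurityException.
grant codeBase "file:/opt/app/trusted/-" {
    permission java.lang.RuntimePermission "exitVM";
};

Run with the security manager enabled, e.g. java -Djava.security.manager -Djava.security.policy=example.policy com.example.Main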
You can compile your own version of the class and add it to the boot class path or lib/ext directory. http://docs.oracle.com/javase/tutorial/ext/basics/install.html This will change the JDK and the JRE.
In fact, you can remove it when compiling, and your program won't compile if it is used.
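As a rough sketch (Java 8 and earlier; the jar name is an assumption), prepending your patched classes to the boot class path looks like:

javac -Xbootclasspath/p:patched-classes.jar MyApp.java
java -Xbootclasspath/p:patched-classes.jar MyApp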
Snihalani: Just so that I get this straight ...
You want to 'deprecate methods in the JRE' in order to 'Making sure people don't use java's implementation and use my implementation from now on.' ?
First of all: you can't change anything in the JRE, nor are you allowed to; it's the property of Oracle. You might be able to change something locally if you want to go through the trouble, but that'll just be in your local JRE, not in the ones that can be downloaded from the Oracle webpage.
Besides that, nobody has your implementation, so how would we be able to use it anyway? The implementations provided by Oracle do exactly what they should do, and when a flaw/bug/... is found, it'll be corrected or replaced by a new method (at which point the original method becomes deprecated).
But what mostly worries me is that you would go and replace implementations with something you came up with. It reminds me quite a lot of phishing and similar techniques: having us run your code, without knowing what it does, without even knowing we are running your code. After all, if you had access to the original code and could "build" the JRE, what would stop you from altering the code in the original method?
Deprecated is a way for the author to say:
"Yup ... I did this in the past, but it seems that there are problems with the method.
Just in order not to change the behaviour of existing applications using this method, I will not change it; rather, I will mark it as deprecated and add a method that solves the problem."
You are not the author, so it isn't up to you to decide whether or not the methods work the way they should anyway.
I'm trying to identify places where annotation names are the same or similar, to compile a list of these things to make sure our team knows where possible points of confusion can be found. For example, Guice's @Provides and RESTEasy's @Provider are similar enough in spelling but different enough in semantics as to confuse people, so I'd like to call that out explicitly and explain the differences.
What I'm looking for is a tool or even a website that enumerates the annotations associated with packages. This might be a pipe dream, but before I manually start going through and collecting these things I thought I'd check.
I was considering writing one based on Javadoc that only pulled in the annotations, but I don't have access to the Java source files in many cases.
Any thoughts or suggestions?
In Eclipse you can use the standard "Search for references" function (context menu of a used annotation: References -> Project) and you get a list of where the annotation is used within your project.
I suggest scanning for annotations yourself and generating a list from that.
You can do that by writing your own implementation of an annotation processor, i.e. extend AbstractProcessor. Within this processor you can write a text file containing all annotations, as in the sketch below. You can add this processor to your build procedure, and it will then run whenever you build the project.
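A minimal sketch of such a processor (the output file name and location are assumptions) might look like this:

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import java.util.TreeSet;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import javax.tools.StandardLocation;

// Records the fully qualified name of every annotation seen during compilation
// and writes the list to a text file next to the compiled classes.
@SupportedAnnotationTypes("*")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class AnnotationListingProcessor extends AbstractProcessor {

    private final Set<String> seen = new TreeSet<String>();

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            seen.add(annotation.getQualifiedName().toString());
        }
        if (roundEnv.processingOver()) {
            try {
                Writer w = processingEnv.getFiler()
                        .createResource(StandardLocation.CLASS_OUTPUT, "", "annotations.txt")
                        .openWriter();
                for (String name : seen) {
                    w.write(name + "\n");
                }
                w.close();
            } catch (IOException e) {
                processingEnv.getMessager().printMessage(
                        Diagnostic.Kind.WARNING, "Could not write annotation list: " + e);
            }
        }
        return false; // do not claim the annotations; other processors can still see them
    }
}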
Another way to do this is using the Google Reflections library. This might be a bit more work, since you would need to write a small program to fetch the annotations and write the file.
I wrote such a tool: https://github.com/MoserMichael/ls-annotations
It decompiles the bytecode and lists only declarations (classes, functions, variables) that have annotations. You can also use it to find all classes/interfaces derived from a given class/interface.
The tool uses the asm library to scan class files and to extract annotations. It can detect annotations with retention policy CLASS and RUNTIME. It can't detect annotations with retention policy SOURCE that are not put into the bytecode; for example, @Override is one of these.
Why not scan your classpath and export all used annotations? Then just use some simple parsing/text comparison to find the elements with almost the same name.
I have a scenario where I have code written against version 1 of a library but I want to ship version 2 of the library instead. The code has shipped and is therefore not changeable. I'm concerned that it might try to access classes or members of the library that existed in v1 but have been removed in v2.
I figured it would be possible to write a tool to do a simple check to see if the code will link against the newer version of the library. I appreciate that the code may still be very broken even if the code links. I am thinking about this from the other side - if the code won't link then I can be sure there is a problem.
As far as I can see, I need to run through the bytecode checking for references, method calls and field accesses to library classes then use reflection to check whether the class/member exists.
I have three-fold question:
(1) Does such a tool exist already?
(2) I have a niggling feeling it is much more complicated than I imagine and that I have missed something major - is that the case?
(3) Do you know of a handy library that would allow me to inspect the bytecode such that I can find the method calls, references etc.?
Thanks!
I think that Clirr - a binary compatibility checker - can help here:
Clirr is a tool that checks Java libraries for binary and source compatibility with older releases. Basically you give it two sets of jar files and Clirr dumps out a list of changes in the public api. The Clirr Ant task can be configured to break the build if it detects incompatible api changes. In a continuous integration process Clirr can automatically prevent accidental introduction of binary or source compatibility problems.
Changing the library in your IDE will result in all possible compile-time errors.
You don't need anything else, unless your code uses another library, which in turn uses the updated library.
Be especially wary of Spring configuration files. Class names are configured as text and don't show up as missing until runtime.
If you have access to the source code, you could just compile the source against the new library. If it doesn't compile, you definitely have a problem. If it compiles, you may still have a problem if the program uses reflection, some kind of IoC stuff like Spring, etc.
If you have unit tests, then you have a better chance of catching any linking errors.
If you only have a .class file of the program, then I don't know of any tools that would help, besides decompiling the class file to source and compiling that source again against the new library, but that doesn't sound too healthy.
The checks you mentioned are done by the JVM/Java class loader, see e.g. Linking of Classes and Interfaces.
So "attempting to link" can be simply achieved by trying to run the application. Of course you could hoist the checks to run them yourself on your collection of .class/.jar files. I guess a bunch of 3rd party byte code manipulators like BCEL will also do similar checks for you.
I notice that you mention reflection in the tags. If you load classes/invoke methods through reflection, there's no way to analyse this in general.
Good luck!