Is Java inlining still an issue for most JVMs? - java

In a (now) relatively old book, "Java Puzzlers", the authors talk about the inlining problems that can occur with static final fields (Puzzle 93: Class Warfare, discussed here).
Essentially, Java has had a problem where, because of how constants are compiled and classes are loaded, if a library class (class A) is recompiled with a changed static final field, a class (class B) that uses that field might not behave correctly. This can happen because the value of the class A field may have been inlined into class B at compile time, so that when the library is updated, class B does not pick up the change to the class A field.
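To make the mechanism concrete, here is a minimal sketch of the problem described in the book (class names invented for illustration); because the field is a compile-time constant, javac copies its value into the client's bytecode:

// A.java -- the library class, compiled separately.
public class A {
    public static final int VERSION = 1;   // compile-time constant: final, primitive, constant initializer
}

// B.java -- the client class.
public class B {
    public static void main(String[] args) {
        // javac bakes the literal 1 into B.class here, so recompiling A
        // with VERSION = 2 without recompiling B still prints 1.
        System.out.println(A.VERSION);
    }
}

One common workaround from that era was to make the initializer a non-constant expression (for example Integer.parseInt("1")), which forces clients to read the field at run time instead of inlining it.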
My simple question is... Is this still a problem? Do the newer versions of Java redefine the class loading procedure so that we do not have to worry about such issues?
I can only find relatively old posts touching on this issue (pre-2014), which makes me think that this issue has somehow been addressed, but I can find no definitive source.
Also, if it makes any difference, I am particularly interested if this will be a problem in Android.

Related

java (library) cannot seem to find groovy classes

I'm in a complex situation, and I'm going to try to be brief, but I'd happily provide any additional information.
I have inherited responsibility for a massive, ancient (in Internet years), and poorly documented code base, mostly in Groovy. Mercifully, the vast majority of this large system of apps, services, and plugins will be end-of-lifed in a few months, but there's one piece that needs to live on. I've been trying to extract just that piece so it can stand on its own.
Overall things have gone pretty well. There are some unit tests, so I've been able to use compilation and the unit tests to figure out which class files and third-party libraries I needed to bring over.
However, I've run into an issue (both in the unit tests and when I try to actually "run" the application).
The Error Message
The error message looks something like this.
Exception in thread "main" org.codehaus.groovy.runtime.typehandling.GroovyCastException:
Cannot cast object '[
...
about 20 different classes, as far as I can tell these are ORM entities
...
]' with class 'com.google.common.collect.RegularImmutableList' to class 'java.lang.Class' due to:
java.lang.ClassNotFoundException: [
...
the exact same list of classes, only with '/' separators (the JVM-internal form) instead of '.'
... ]
To make things more complicated, even in my much smaller code base there are still a lot of dependencies (all very old).
I'm running Groovy Version: 2.2.2 JVM: 1.7.0_121 Vendor: Oracle Corporation OS: Linux
I had been running Gradle 2.2.1, but after I ran into this problem I did some research and learned that I should be able to update Gradle, so I did; I'm now at 4.7 (that did not fix the problem).
More Information Than You Require
I'm afraid I don't know Groovy, Java, or Gradle that well, and I understand the ecosystem even less.
In the medium term I'm planning on bringing the dependencies and environment up to versions that are currently supported, but I think it would be best to get this working the old way first.
The original project that I cribbed all this code from still builds. The code I brought over was spread across many different projects that together formed a very complicated dependency tree. I figured it would be best to bring them into one project, since the "new" code base is going to be much smaller.
I've spent a fair amount of time looking at the code where this error is happening (and its stack trace); it appears as though there's a lot going on:
protected HibernateBundle<HohumDatabaseConfiguration> initializeHibernateBundle(List<Class<?>> serviceEntities) {
    HibernateBundle bundle = new HibernateBundle<HohumDatabaseConfiguration>(
            ImmutableList.copyOf(serviceEntities),
            new HohumSessionFactoryFactory()) {
        @Override
        DatabaseConfiguration getDatabaseConfiguration(HohumDatabaseConfiguration configuration) {
            return configuration.database
        }
    }
    return bundle
}
The problematic line is ImmutableList.copyOf(serviceEntities).
Ideas
All of these entities were in a "plugin" (I'm still not sure what a "plugin" is exactly in this context). From reading the documentation I do know this was so the code could be shared with some other applications (not in this repo). Maybe in the "old" repo Java has no issues finding the classes because they're all wrapped up in a nice jar by the time this casting needs to happen.
This is a bug in Groovy or one of the third-party libraries I'm using, and I should try some new versions of things and it will just start working.
(I'm not really excited about either of my ideas)
Question
If you know how to solve this problem that would be wonderful. If not, how should one proceed?
If any important information is missing I'd be happy to supply it. Thank you so much for reading all this.
Full Stack Trace
Exception in thread "main" org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object '[class us.rupe.domain.AnswerEntity, class us.rupe.domain.AnswerTagEntity, class us.rupe.domain.BundleEntity, class us.rupe.domain.BundleProductEntity, class us.rupe.domain.IdealProductEntity, class us.rupe.domain.ProductEntity, class us.rupe.domain.QuestionEntity, class us.rupe.domain.TagEntity, class us.rupe.domain.SubjectEntity, class us.rupe.domain.SubjectDependentEntity, class us.rupe.domain.SurveyEntity, class us.rupe.domain.SurveyInstanceEntity, class us.rupe.domain.SurveyInstanceAnswerEntity, class us.rupe.domain.SurveyQuestionEntity, class us.rupe.domain.PurchaseEntity, class us.rupe.domain.QuestionGuardEntity, class us.rupe.domain.RecommenderVersionEntity, class us.rupe.domain.AdjusterConfigurationEntity, class us.rupe.domain.CurrentAdjusterConfigurationEntity]' with class 'com.google.common.collect.RegularImmutableList' to class 'java.lang.Class' due to: java.lang.ClassNotFoundException: [class us/rupe/domain/AnswerEntity, class us/rupe/domain/AnswerTagEntity, class us/rupe/domain/BundleEntity, class us/rupe/domain/BundleProductEntity, class us/rupe/domain/IdealProductEntity, class us/rupe/domain/ProductEntity, class us/rupe/domain/QuestionEntity, class us/rupe/domain/TagEntity, class us/rupe/domain/SubjectEntity, class us/rupe/domain/SubjectDependentEntity, class us/rupe/domain/SurveyEntity, class us/rupe/domain/SurveyInstanceEntity, class us/rupe/domain/SurveyInstanceAnswerEntity, class us/rupe/domain/SurveyQuestionEntity, class us/rupe/domain/PurchaseEntity, class us/rupe/domain/QuestionGuardEntity, class us/rupe/domain/RecommenderVersionEntity, class us/rupe/domain/AdjusterConfigurationEntity, class us/rupe/domain/CurrentAdjusterConfigurationEntity]
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.castToClass(DefaultTypeTransformation.java:380)
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.castToType(DefaultTypeTransformation.java:249)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.castToType(ScriptBytecodeAdapter.java:599)
at us.rupe.service.HohumDatabaseService$1.<init>(HohumDatabaseService.groovy)
at us.rupe.service.HohumDatabaseService.initializeHibernateBundle(HohumDatabaseService.groovy:57)
at us.rupe.service.HohumDatabaseService.<init>(BloomDatabaseService.groovy:25)
at us.rupe.RecommendationService.<init>(RecommendationService.groovy)
at us.rupe.RecommendationService.main(RecommendationService.groovy:58)

Using reflection to modify the structure of an object

From wikipedia:
reflection is the ability of a computer program to examine and modify the structure and behavior (specifically the values, meta-data, properties and functions) of an object at runtime.
Can anyone give me a concrete example of modifying the structure of an object? I'm aware of the following example.
Object foo = Class.forName("complete.classpath.and.Foo").newInstance();
Method m = foo.getClass().getDeclaredMethod("hello", new Class<?>[0]);
m.invoke(foo);
There are other ways to get the class and examine its structure. But the question is: how is the modifying done?
Just an additional hint, since the previous answers and comments already address the reflection part of the question.
To really change the structure of a class, and therefore its behaviour at runtime, look at bytecode instrumentation, in this case the javassist and asm libraries. In any case, this is not a trivial task.
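As a rough, hedged sketch of that instrumentation approach (reusing the Foo class from the question's snippet, and assuming it has not been loaded yet), Javassist lets you rewrite a method body before the class is loaded:

import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;
import java.lang.reflect.Method;

public class ModifyWithJavassist {
    public static void main(String[] args) throws Exception {
        ClassPool pool = ClassPool.getDefault();
        CtClass cc = pool.get("complete.classpath.and.Foo");

        // Structural change: prepend code to an existing method.
        CtMethod hello = cc.getDeclaredMethod("hello");
        hello.insertBefore("System.out.println(\"intercepted by javassist\");");

        // Load the modified class and call it reflectively, as in the question.
        // (toClass() fails if Foo was already loaded by this class loader.)
        Class<?> modified = cc.toClass();
        Object foo = modified.getDeclaredConstructor().newInstance();
        Method m = modified.getDeclaredMethod("hello");
        m.invoke(foo);
    }
}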
Additionally, you might have a look at aspect-oriented programming techniques, which enable you to enhance methods with extra functionality. This is often used to introduce logging without your class needing a dependency on the logging classes, and without logging calls scattered through the problem-related code.
In English reflection means "mirror image".
So I'd disagree with the Wikipedia definition. For me, reflection is about runtime inspection of code, not manipulation.
In Java, you can modify the bytecode at runtime using bytecode manipulation. One well-known and widely used library is CGLIB.
In Java, reflection is not fully supported as defined by Wikipedia.
Only Field.setAccessible(true) or Method.setAccessible(true) really modifies a class, and even then it only changes access checking, not behaviour.
Frameworks such as Hibernate use this to add behaviour to a class, e.g. by generating a subclass in bytecode that accesses private fields of the parent class.
Java is still a statically typed language, unlike JavaScript, where you can change any behaviour at runtime.
The only thing reflection (java.lang.reflect) lets you modify about an object's class is the accessibility flag of Constructor, Method and Field via setAccessible, whatever the wiki says. There are, however, libraries like http://ru.wikipedia.org/wiki/Byte_Code_Engineering_Library for decomposing, modifying, and recomposing binary Java classes.
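For completeness, this is about the full extent of what java.lang.reflect itself lets you "modify" (a hypothetical Foo with a private field): setAccessible only relaxes the access check, and the write changes the object's state, not the class's structure.

import java.lang.reflect.Field;

class Foo {
    private String greeting = "hello";
}

public class ReflectiveWrite {
    public static void main(String[] args) throws Exception {
        Foo foo = new Foo();
        Field f = Foo.class.getDeclaredField("greeting");
        f.setAccessible(true);              // bypasses the access check only
        f.set(foo, "changed at runtime");   // mutates state, not structure
        System.out.println(f.get(foo));     // prints "changed at runtime"
    }
}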

How unsafe is the use of sun.misc.Unsafe actually?

I am wondering how unsafe the use of sun.misc.Unsafe actually is. I want to create a proxy of an object where I intercept every method call (except the one to Object.finalize, for performance considerations). For this purpose, I googled a little and came up with the following code snippet:
class MyClass {
    private final String value;

    MyClass() {
        this.value = "called";
    }

    public void print() {
        System.out.println(value);
    }
}
@org.junit.Test
public void testConstructorTrespassing() throws Exception {
    @SuppressWarnings("unchecked")
    Constructor<MyClass> constructor = ReflectionFactory.getReflectionFactory()
            .newConstructorForSerialization(MyClass.class, Object.class.getConstructor());
    constructor.setAccessible(true);
    // The constructor body never runs, so 'value' stays null and this prints "null".
    constructor.newInstance().print();
}
My considerations are:
Even though Java is advertised as "write once, run everywhere", my reality as a developer looks more like "write once, run in one controllable customer run-time environment".
sun.misc.Unsafe is being considered for inclusion in the public API in Java 9.
Many non-Oracle VMs also offer sun.misc.Unsafe since, I guess, quite a few libraries already use it. This also makes the class unlikely to disappear.
I am never going to run the application on Android, so this does not matter for me.
How many people are actually using non-Oracle VMs anyways?
I am still wondering: are there other reasons I did not think of why I should not use sun.misc.Unsafe? If you google this question, people tend to answer with an unspecified "because it's not safe", but I do not really see that, apart from the (very unlikely) possibility that the method will one day disappear from the Oracle VM.
I actually need to create an object without calling a constructor to overcome Java's type system. I am not considering sun.misc.Unsafe for performance reasons.
Additional information: I am using ReflectionFactory in the example for convenience; it delegates to Unsafe eventually. I know about libraries like Objenesis, but looking at their code I found that they basically do something similar, with fallback checks for Java versions that would not work for me anyway, so I guess writing four lines is worth saving a dependency.
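For comparison, here is a sketch of the same constructor-skipping allocation done directly with sun.misc.Unsafe instead of ReflectionFactory; it relies on the non-standard theUnsafe field that Oracle/OpenJDK-style VMs happen to expose, so treat it as an assumption rather than a guarantee:

import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class AllocateWithoutConstructor {
    public static void main(String[] args) throws Exception {
        // Unsafe.getUnsafe() rejects application code, so the usual trick
        // is to read the private static theUnsafe field reflectively.
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        Unsafe unsafe = (Unsafe) f.get(null);

        // Allocates the object without running any constructor,
        // so the final 'value' field of MyClass (from the snippet above) stays null.
        MyClass instance = (MyClass) unsafe.allocateInstance(MyClass.class);
        instance.print();   // prints "null"
    }
}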
There are three significant (IMO) issues:
The methods in the Unsafe class have the ability to violate runtime type safety, and do other things that can lead to your JVM "hard crashing".
Virtually anything that you do using Unsafe could in theory be dependent on internal details of the JVM; i.e. details of how the JVM does things and represents things. These may be platform dependent, and may change from one version of Java to the next.
The methods you are using ... or even the class name itself ... may not be the same across different releases, platforms and vendors.
IMO, these amount to strong reasons not to do it ... but that is a matter of opinion.
Now if Unsafe becomes standardised / part of the standard Java API (e.g. in Java 9), then some of the above issues would be moot. But I think the risk of hard crashes if you make a mistake will always remain.
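To illustrate the "hard crashing" point above with a (deliberately broken) sketch: Unsafe performs no bounds or type checks, so handing it a bogus raw address typically kills the whole JVM with a native memory error instead of throwing an exception.

import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class HardCrashDemo {
    public static void main(String[] args) throws Exception {
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        Unsafe unsafe = (Unsafe) f.get(null);

        // Reads 8 bytes from raw address 0, which is normally unmapped;
        // expect a SIGSEGV / access violation and an hs_err_pid dump,
        // not a catchable Java exception.
        unsafe.getLong(0L);
    }
}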
During one JavaOne 2013 session, Mark Reinhold (the JDK architect) got the question: "How safe is it to use the Unsafe class?" He replied with a somewhat surprising answer: "I believe it should become a stable API. Of course properly guarded with security checks, etc..."
So it looks like there may be something like java.util.Unsafe for JDK 9. Meanwhile, using the existing class is relatively safe (as safe as doing something unsafe can be).

Why wasn't java.io.Serializable deprecated in Java 5?

Pre Java 5, there were no annotations; as a result, you could not add metadata to a class.
To mark a class as serializable, you had to implement the Serializable interface (which is just that, a marker) and additionally use the transient keyword to mark a field as non-serializable if needed, something like:
public class MyClass implements Serializable {
    ...
    private transient Bla field;
    ...
}
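A quick round trip shows what the marker interface plus the keyword actually do at run time (a made-up Session class stands in for MyClass, since Bla is not defined here); the transient field simply is not written, so it comes back as null:

import java.io.*;

class Session implements Serializable {
    String user = "alice";                    // serialized
    transient String cachedToken = "secret";  // skipped by serialization
}

public class TransientRoundTrip {
    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Session());
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Session copy = (Session) in.readObject();
            System.out.println(copy.user);         // alice
            System.out.println(copy.cachedToken);  // null
        }
    }
}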
Now you could theoretically make use of annotations (this is a perfect use for them) and have:
@Serializable
public class MyClass {
    ...
    @Transient
    private Bla field;
    ...
}
But the interface and the keyword were not deprecated and no annotation was added in Java 5 to replace them.
What are the considerations for this decision to keep the interface and the keyword?
Of course there is the issue of compatibility with pre-Java 5 code, but at some point that will end (e.g. regarding the new generics features, the JLS specifies that "it is possible that future versions of the Java programming language will disallow the use of raw types"). So why not prepare the way for serialization annotations as well?
Any thoughts? (although I would prefer concrete references :D which I was unable to find)
The interface is there so methods can be defined to accept objects of type Serializable:
public void registerObject(Serializable obj);
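That only works because Serializable is a real type the compiler can check; a marker annotation would give you no such compile-time guarantee. A small sketch (the Registry class is invented for illustration):

import java.io.Serializable;
import java.util.ArrayList;

public class Registry {
    public void registerObject(Serializable obj) {
        // store it, write it somewhere, etc.
    }

    public static void main(String[] args) {
        Registry r = new Registry();
        r.registerObject("a string");          // fine: String implements Serializable
        r.registerObject(new ArrayList<>());   // fine: ArrayList implements Serializable
        // r.registerObject(new Object());     // rejected at compile time
    }
}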
Of course there is the issue of compatibility with pre-Java 5 code ...
Yes. This is the root of the problem.
If they had deprecated the Serializable interface in (say) Java 5, then lots of old pre-Java 5 code would give warnings (or errors, depending on compiler switches).
There's nothing actually wrong with using Serializable and "forcing" people to replace it is annoying. (People have better things to do ...)
When people have "fixed" this in their code, it will no longer compile using a pre-Java 5 compiler, or run on a pre-Java 5 JVM.
It is a bad idea to do things that make the compiler systematically "cry wolf".
... but at some point that will end.
In reality, the chance of this actually happening is (IMO) small. As far as I'm aware, Sun / Oracle have never removed a deprecated feature. Not even dangerous ones like Thread.stop() and friends.
As a footnote, the Go developers are taking a different approach to this problem. When they want to change a language or library feature, they just do it. They provide a converter that will automatically rewrite your code as required.
Serializable and transient are indeed two things that could have been replaced by annotations.
They probably haven't been deprecated because a lot of programs use Serializable, and it would have been annoying for millions of developers if the compiler suddenly started spewing out thousands of deprecation warnings.
There are lots of other things in the standard Java API that could have been deprecated long ago - for example, the legacy collection classes Vector and Hashtable (and I'm sure you can easily find many more). And there are other things that could have been implemented using annotations (for example the keywords volatile, strictfp and synchronized).
Deprecation is for things that are actively harmful. What you're suggesting is forcing authors of eight years of existing Java code (at the time) to rewrite it, for no advantage, just to shut up the deprecation compiler warnings, to get back to the fully correct code they had with Java 1.4. That would not be useful.

Java super-tuning, a few questions

Before I ask my question can I please ask not to get a lecture about optimising for no reason.
Consider the following questions purely academic.
I've been thinking about the efficiency of accesses between root (i.e. often used and often accessing each other) classes in Java, but this applies to most OO languages/compilers. The fastest way (I'm guessing) that you could access something in Java would be a static final reference. Theoretically, since that reference is available during loading, a good JIT compiler would remove the need to do any reference lookup to access the variable and point any accesses to that variable straight to a constant address. Perhaps for security reasons it doesn't work that way anyway, but bear with me...
Say I've decided that there are some order of operations problems or some arguments to pass at startup that means I can't have a static final reference, even if I were to go to the trouble of having each class construct the other as is recommended to get Java classes to have static final references to each other. Another reason I might not want to do this would be... oh, say, just for example, that I was providing platform specific implementations of some of these classes. ;-)
Now I'm left with two obvious choices. I can have my classes know about each other with a static reference (on some system hub class), which is set after constructing all classes (during which I mandate that they cannot access each other yet, thus doing away with order of operations problems at least during construction). On the other hand, the classes could have instance final references to each other, were I now to decide that sorting out the order of operations was important or could be made the responsibility of the person passing the args - or more to the point, providing platform specific implementations of these classes we want to have referencing each other.
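For concreteness, a rough sketch of the two designs being weighed (all class names invented):

// Option 1: a hub holding static references, wired up after all the objects
// have been constructed; nothing may touch the hub during construction.
class SystemHub {
    static Renderer renderer;
    static Audio audio;
}

// Option 2: final instance references injected through constructors; the
// construction order (and the choice of platform-specific implementation)
// becomes the responsibility of whoever builds the object graph.
class Renderer {
    private final Audio audio;
    Renderer(Audio audio) { this.audio = audio; }
}

class Audio { /* platform-specific subclasses could go here */ }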
A static variable means you don't have to look up the location of the variable with respect to the class it belongs to, saving you one operation. A final variable means you don't have to look up the value at all, but it does have to belong to your class, so you save 'one operation'. OK, I know I'm really handwaving now!
Then something else occurred to me: I could have static final stub classes, kind of like a wacky interface where each call was relegated to an 'impl' which can just extend the stub. The performance hit then would be the double function call required to run the functions and possibly I guess you can't declare your methods final anymore. I hypothesised that perhaps those could be inlined if they were appropriately declared, then gave up as I realised I would then have to think about whether or not the references to the 'impl's could be made static, or final, or...
So which of the three would turn out fastest? :-)
Any other thoughts on lowering frequent-access overheads or even other ways of hinting performance to the JIT compiler?
UPDATE: After running several hours of tests of various things and reading http://www.ibm.com/developerworks/java/library/j-jtp02225.html I've found that most things you would normally look at when tuning e.g. C++ go out the window completely with the JIT compiler. I've seen it run 30 seconds of calculations once, twice, and on the third (and subsequent) runs decide "Hey, you aren't reading the result of that calculation, so I'm not running it!".
FWIW you can test data structures, and using a microbenchmark I was able to develop an ArrayList-like implementation that performed better for my needs. The access patterns must have been random enough to keep the compiler guessing, and my simpler, more tuned code still came out ahead of the generic growing array.
As far as the test here was concerned, I simply could not get a benchmark result! My simple test of calling a function and reading a variable from a final vs non-final object reference revealed more about the JIT than the JVM's access patterns. Unbelievably, calling the same function on the same object at different places in the method changes the time taken by a factor of FOUR!
As the guy in the IBM article says, the only way to test an optimisation is in-situ.
Thanks to everyone who pointed me along the way.
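As an aside, the dead-code elimination described in the update is exactly what benchmark harnesses exist to defeat; a minimal JMH-style sketch (JMH is not part of the original discussion, so this is illustrative only) keeps each result "used" so the JIT cannot simply delete the work:

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.infra.Blackhole;

@State(Scope.Thread)
public class FieldAccessBenchmark {
    static final int STATIC_FINAL = 42;
    int plainField = 42;

    @Benchmark
    public void staticFinalAccess(Blackhole bh) {
        // A compile-time constant: the JIT sees a literal here, so this mostly
        // measures harness overhead rather than a field load.
        bh.consume(STATIC_FINAL);
    }

    @Benchmark
    public void instanceFieldAccess(Blackhole bh) {
        // A real field read; Blackhole.consume keeps the value "used"
        // so the JIT cannot discard it as dead code.
        bh.consume(plainField);
    }
}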
It's worth noting that static fields are stored in a special per-class object which contains the static fields for that class. Using static fields instead of object fields is unlikely to be any faster.
See the update: I answered my own question by doing some benchmarking, and found that there are far greater gains in unexpected areas, and that performance for simple operations like referencing members is comparable on most modern systems, where performance is limited more by memory bandwidth than by CPU cycles.
Assuming you found a way to reliably profile your application, keep in mind that it will all go out the window should you switch to another JDK implementation (IBM to Sun to OpenJDK, etc.), or even upgrade the version of your existing JVM.
The reason you are having trouble, and would likely get different results with different JVM implementations, lies in the Java spec: it explicitly states that it does not define optimizations and leaves it to each implementation to optimize (or not) in any way, so long as execution behavior is unchanged by the optimization.
