Say I have: class A depends on class B depends on class C depends on class A.
It seems impossible to compile. I have read this post about disabling compile-time dependency checking, but all my classes are on the classpath, well-defined, etc. The only problem is that they mutually depend on each other.
Is there a way to write such an application that would compile without a hitch?
We do have several such dependency cycles in our legacy codebase and they do compile without a hitch.
This is not to say it is good to have cyclic dependencies - on the contrary. I intend to get rid of them eventually to clean up our architecture. Nevertheless, in the meantime, the code still compiles and works.
The important thing here is that the compiler must be able to compile all the classes at the same time. If this is the case, there should be no problem. Of course, you should take care of the usual directory layout problems.
If the packages can't be compiled together, it gets more complicated - you might have to create dummy implementations first (which don't depend on the other classes) and then, once you have the right classes, substitute them. But I can't really imagine a reason for not being able to compile them together.
You can have this circular dependency because Java knows which source files to read to find the code for a given name, i.e. it compiles them all at once. You will only have a problem if you try to compile one class at a time.
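For illustration, a minimal sketch of such a cycle (class names are arbitrary); it compiles fine as long as all the sources reach javac in one invocation:

    // A.java
    class A { B b; }

    // B.java
    class B { C c; }

    // C.java
    class C { A a; }

    // Compiles without a hitch when all three are given to javac together:
    //   javac A.java B.java C.java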
Related
This question is related to, but not a duplicate of, this question.
My issue is slightly different; I have a "utility module", shared between the client and server code, and it contains no GWT-specific code.
I understand that normally, all the sources are pulled into one specific project, where everything is compiled together. But there is one issue with that: I only get to know if my utility project is "GWT compatible", when I compile the main project. This is way too late; I haven't even got around to start on the main project, but I want to know before I make a "commit" to my SCM, that my utility project is "GWT compatible".
In other words, I want to validate the utility project for GWT compatibility, independently of its use in a separate project (module).
There's a large part of the JRE that is not covered by GWT, and it is particularly likely in a utility module that non-GWT-compatible classes or methods are used. That is what I want to validate against.
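For example (a hypothetical utility class, not from the actual project), code like the following is perfectly valid Java yet fails GWT validation, because java.io.File is not part of GWT's JRE emulation:

    import java.io.File;

    public class PathUtil {
        // Valid on a normal JVM, but java.io.File is not emulated by GWT,
        // so this class is not "GWT compatible".
        public static boolean exists(String path) {
            return new File(path).exists();
        }
    }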
EDIT: I could add a "dummy entry point", I suppose, but that makes the project depend on GWT, which I don't want, since it is "general" code, also to be used by people who don't use GWT. If it matters, I use Maven as my build system.
EDIT2: No matter what I do, I will only get real compilation/validation with an entry point (does NOT need to reference any of the classes). Neither <force>true</force>, nor <failOnError>true</failOnError> will do. Is there a way I can define that entry point, for the shared project, such that only gwt-maven-plugin sees it, but not javac (so as not to add an unneeded dependency in the Java code)?
The compiler actually always visits all code on the source path (note: not quite the same as the classpath), by starting at the requested module with any <source> tags, and then checking each <inherits> along the way. If it finds something that isn't compatible or isn't compilable, it will mark it as broken, and move on - as long as nothing actually depends on it (i.e. an EntryPoint, or something that an EntryPoint depends on) you'll just see this message:
Validating newly compiled units
Ignored 1 unit with compilation errors in first pass.
Compile with -strict or with -logLevel set to TRACE or DEBUG to see all errors.
If you include that -strict flag, the compile will actually fail when it hits something that can't be included correctly.
This work is done in the very early stages of the compile, while constructing the TypeOracle, which is used for Generators, long before any JS is built. That type oracle is passed to generators, which need to be able to ask questions like 'what interfaces on the sourcepath have a JSO implementation' and 'what are all possible subclasses of List'. Generators can do a huge number of things, including emit even more types which then need to be parsed, compiled, and the process continues until a full JProgram is created of all possible types, based on the current set of modules.
That JProgram then gets compiled down based on what can be reached from the roots - the entrypoint, as well as a few other details such as how to emulate Java details like casts, arrays, longs, exceptions, etc.
If -strict was not specified, and the compiler ends up needing to reach something which is unavailable due to earlier compilation problems, that is the time you find out. Using -strict to stop earlier will help ensure that you catch those issues sooner.
One more fun fact: By default, with com.google.gwt.user.User in your module (or any other <inherits> that depends on it), you already have an entrypoint, or several! These do some quick checking that your page is working correctly, such as using a strict doctype, or the browser actually matching the expected user.agent setting. This means that it is usually possible to compile a module even without an entrypoint (except with gwt-maven-plugin:compile, which will not consider a module for compilation just by those built-in ones).
EDIT: Okay, even one more: From http://www.gwtproject.org/doc/latest/DevGuideCompilingAndDebugging.html, combined with -strict, it looks like you can force the validation to run without actually compiling to JS:
-validateOnly Validate all source code, but do not compile
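So, as a sketch (the module name and jar paths are placeholders), validating the utility module on its own could look something like:

    java -cp src:gwt-user.jar:gwt-dev.jar \
        com.google.gwt.dev.Compiler -validateOnly -strict com.example.util.UtilModule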
I don't think it's possible because the GWT compiler does not compile any unused code.
This means that your shared utility "module" may have code in it that is not compatible with GWT, but it will not cause any problems as long as GWT code never calls such incompatible classes or methods. Without an entry point, the GWT compiler won't know which code is used and which is not - it will assume that all of it is unused.
We have a huge application in our company that was bought with its source code (almost all of the source code, actually), and we have been adapting it to our needs, applying fixes, etc. Someone decided a long time ago that the way to compile the project would be to keep the precompiled jars from the original company and only edit or add the classes we need in an Override folder, extracted from some zip files containing the original source code. So when we compile, we only compile our edited classes, use the original jar on the classpath, and then join our new or edited classes with the original jar to create a new jar.
The issue seems to be that people are changing method contracts, editing interfaces, and adding abstract methods to abstract classes. When we compile, we don't recompile the complete source code, only ours, so the compiler only checks the contract consistency of our overridden source code.
I have started to notice that things seem to be breaking: old classes trying to call a method that doesn't exist anymore, or classes calling methods of classes that really implement only a partial version of an interface.
I cannot simply join the original source code with our overridden source code, because some classes and Java files are autogenerated at build time, so I could accidentally "fix" something that really isn't broken. (I tried; that's how I know.)
Is there a way to validate contract consistency between classes of various compiled jars automatically? After fixing those inconsistencies, I will be able to merge both source directories and avoid these issues in the future.
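To make the failure mode concrete, here is a minimal sketch (all class and method names are made up) of the kind of breakage described above:

    // Version the old jar was compiled against:
    class ReportService {
        void export() { /* ... */ }
    }

    class Caller {
        void run(ReportService service) {
            service.export(); // resolved against the old contract
        }
    }

    // Later, the Override folder replaces ReportService with a version
    // where export() is gone:
    //
    //     class ReportService {
    //         void exportTo(String path) { /* ... */ }
    //     }
    //
    // Caller is never recompiled, so javac never complains; at runtime
    // the JVM throws java.lang.NoSuchMethodError when run() is called.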
Why not compile everything? (you do have the source code) That way you will at least get compile errors.
I assume you don't have any tests to verify that things still work. Maybe you should start writing those too. A good place to start is to test the new code that has broken before; that way you focus on testing the parts that are changing.
And what are the pros and cons of using either?
I actually saw it in NetBeans under Project Properties > Libraries for Java applications. There are two tabs, one for compile-time libraries and one for run-time libraries, and it looks like we can add a library to either, independently of each other.
There is no such thing as compile-time libraries vs. runtime libraries.
Perhaps you're mixing some concepts.
In Java the libraries to be used are statically validated at compile time and also validated at runtime.
For instance, suppose you want to use IterableMap from the Apache Collections library. The compiler validates, at compile time, that you are invoking a method that exists in that class.
But the compiler doesn't link or do much of anything else with that library; you still need it at runtime. So when your code executes, the Java runtime searches for that class again and invokes the method the compiler verified existed.
And that's all there is to it.
The UI and terminology of the Libraries properties dialog is pretty confusing.
The Help button on that dialog will give you a fair bit of info.
The Compile-time library list may be a subset of the Run-time library list.
Consider this situation...
You have source code that imports classes from a library 'widgets.jar'. The class files in widgets.jar reference symbols from the jar file 'xml.jar'. If your source code does not import classes from xml.jar, you could define the Compile-time libraries list to contain just widgets.jar.
When you try to run your project, you will probably need to include xml.jar in the Run-time libraries list to prevent ClassNotFoundExceptions.
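As a sketch on the command line (MyApp is a placeholder class that uses widgets.jar):

    # Compile time: only widgets.jar is needed, because the source
    # imports classes from widgets.jar alone.
    javac -classpath widgets.jar MyApp.java

    # Run time: widgets.jar's class files reference xml.jar, so both
    # must be present to avoid a ClassNotFoundException.
    java -classpath .:widgets.jar:xml.jar MyApp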
Perhaps this comes into play when you want to load a library dynamically, or check for the existence of a library and only then execute the code.
During compilation, the compiler needs to know the signatures of the methods, classes, etc. to check whether your code is correct. Hence you add the compile-time library.
During runtime, the JVM still needs the library to run that specific code. But you can add logic to avoid that code by checking whether the library exists, for example with Class.forName(). Some libraries might already exist on the system (qt.jar, for example) or might not, and you can check and execute your code accordingly.
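A minimal sketch of such a check, using the IterableMap class mentioned earlier (the exact package depends on the Apache Collections version; collections4 shown here):

    try {
        // Probe for the optional class without creating a compile-time use
        Class.forName("org.apache.commons.collections4.IterableMap");
        // Present: safe to enter the code path that uses the library.
    } catch (ClassNotFoundException e) {
        // Absent: fall back to a code path that doesn't need it.
    }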
Please correct me if I am wrong.
As others have stated, you are confusing concepts. I think what you are really trying to understand is what Maven refers to as dependency scope. Sometimes you only need a dependency at compile time because you expect it to be provided at runtime, and sometimes you need it at runtime but not compile time.
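For instance, a sketch of a Maven dependency that is needed at compile time but expected at runtime from the environment (the servlet API is a typical example):

    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
        <version>3.1.0</version>
        <!-- On the compile-time classpath, but not packaged: the
             servlet container supplies it at runtime. -->
        <scope>provided</scope>
    </dependency>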
I'm confused with the role -classpath option plays in both compiling and running a java program. Please help me understand.
Because they are two separate operations and not necessarily the same paths.
The runtime dependencies are often more extensive than the compile time dependencies. For example, many programs will code to interfaces, which limits the compile time dependencies to those interfaces. At runtime, the VM must be able to resolve the implementations of those interfaces, which are not required until they are loaded at runtime.
It simply, in both cases, tells javac and java where to find the dependencies required for your program to compile and run.
The reason it is done twice is that the environment you compile the code in may not be the same environment you run the code in.
Java loads classes at runtime. For example, you could write a method that forces loading of class X, compile it, write class X, compile it, and then run them together. In addition, you typically refer to classes by a fully qualified name, but could run the same program with different versions of that class (e.g., a different version of a library). Thus, you need to tell Java where it could potentially find the classes that it needs to load.
As for compilation, to ensure type safety, you have to provide the Java compiler at least with the interfaces or base classes that you are referring to and making calls on, so that the compiler can at least ensure that the call would be legal. For that reason, you have to tell it where to find the jars containing them.
Here is an example. Let's say you want to use JMS (a messaging framework) in a core Java program. At compile time, you need to at least tell javac where to find the JMS interfaces. At runtime, you need to provide these interfaces, but you also need to provide the JAR with the actual implementation (e.g., ActiveMQ).
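A sketch of that split (the class name is made up): the code below compiles against the javax.jms interfaces alone, but running it requires an implementation jar on the classpath as well:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;

    public class Sender {
        // Compiles with only the JMS API jar on the classpath:
        // ConnectionFactory and Connection are just interfaces.
        public void connect(ConnectionFactory factory) throws Exception {
            // At runtime, 'factory' is some vendor class (e.g. ActiveMQ's),
            // so the implementation jar must be on the runtime classpath.
            Connection connection = factory.createConnection();
            connection.close();
        }
    }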
In C++, I believe linking happens at around compile time to create an executable (I am not a C++ programmer, so I'm not sure about that).
In Java, the linking step happens at runtime (see the JVM spec, "Loading, Linking and Initializing"). From your question it sounds like you understand why the classpath needs to be specified at compile time (because you might reference classes from third-party JARs in your code), so I will just explain that when your program is run, those classes are not loaded into the JVM until they are referenced. At this point, the JVM needs to know where to find their representation.
The compiler has to know where to look to satisfy compile-time dependencies.
The VM has to know where to look to satisfy runtime dependencies.
At compile time, you need to tell javac where to find third-party and user-defined classes. At runtime, you also need to tell java where to find third-party and user-defined classes. In both cases, one way to change the class path is to use the JDK Tools' -classpath option. Checkout the Setting the Class Path technical note for more details.
This is for an Android application but I'm broadening the question to Java as I don't know how this is usually implemented.
Assume you have a project that targets a specific SDK version. A new release of the SDK is backward incompatible and requires changing three lines in one class.
How is this managed in Java without duplicating any code (or by duplicating as little as possible)?
I don't want to create two projects for only 3 lines that are different.
What I'm trying to achieve in the end is a single executable that'll work for both versions. In C/C++, you'd have a #define based on the version. How do I achieve the same thing in Java?
Edit: after reading the comments about #define, I realized there were two issues I was merging into one.
The first issue is: how do I avoid duplicating code? What construct is the equivalent of a #define in C?
The second one is: is it possible to bundle everything in the same executable? (This is less of a concern than the first one.)
It depends heavily on the incompatibility. If it is simply behavior, you can check the java.version system property and branch the code accordingly (for three lines, something as simple as an if statement).
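As a sketch (the version string and both branches are placeholders):

    String version = System.getProperty("java.version");
    if (version.startsWith("1.5")) {
        // the three lines as they were for the old SDK
    } else {
        // the three lines as the new SDK requires them
    }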
If, however, it is a lack of a class or something similar that will throw an error when the class is loaded or when the code gets closer to execution (not necessarily something you can reasonably avoid by checking before calling), then the solution gets a lot harder. The notion of having a separate version is the cleanest from a code point of view, but it does mean you have to distribute two versions.
Another solution is reflection. Don't reference the class directly, call it via reflection (test for the methods or classes to determine what environment you are currently running in and execute the methods). This is probably the "official" approach in that reflection exists to deal with classes that you don't have or don't know you will have at compile time. It is just being applied to libraries within the JDK. It gets very ugly very fast, however. For three lines of code, it's ok, but doing anything extensive is going to get bad.
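A rough sketch of the reflective approach (class and method names are invented for illustration):

    try {
        // Probe for the newer class without referencing it directly
        Class<?> cls = Class.forName("com.example.sdk.NewWidget");
        Object widget = cls.getDeclaredConstructor().newInstance();
        cls.getMethod("render").invoke(widget);
    } catch (ClassNotFoundException e) {
        // Old environment: the class is absent, use the legacy path
    } catch (ReflectiveOperationException e) {
        // Present but not usable as expected; surface the failure
        throw new IllegalStateException(e);
    }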
The last thing I can think of is to write common-denominator code - that is, code that gets the job done in both versions, finding another way to do it that doesn't trigger the problematic class or method.
I would isolate the code that needs to be different in a separate class (or multiple classes if necessary), and include / exclude them when building the project for the different versions.
So I would have src/java/org/myproj/Foo.java, which is the common stuff, and then oldversion/java/org/myproj/Bar.java and newversion/java/org/myproj/Bar.java, which are the different implementations of the class that uses the changed API.
Then I either compile "src/java and oldversion/java" or "src/java and newversion/java".
Possibly a similar situation: I had a method which wasn't available in the previous version of the JDK, but I wanted to call it if it was there; I didn't want to force people to use the more recent version, though. So I used reflection to look for the method: if it was there I called it, and if it wasn't I didn't.
Pretty hacky but might give you what you want.
Addressing Java in general, I see two primary approaches.
1). Refactor the specific code into its own library and have different versions of that library. Effectively your app is creating an abstraction over the different SDKs. Heavyweight for 3 lines of code, but perhaps quite reasonable for larger-scale problems.
2). Injection using annotations. Write your own annotation processor to manage the appropriate injection. More work, but maybe more fun.
Separate the changing code into different classes with the same interface. Place the classes in the same jar. Use the factory design pattern to instantiate one class or the other depending on the SDK version.
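A compact sketch of that arrangement (names and the version threshold are placeholders):

    interface Api {
        void doWork();
    }

    class OldApi implements Api {
        public void doWork() { /* old-SDK code path */ }
    }

    class NewApi implements Api {
        public void doWork() { /* new-SDK code path */ }
    }

    class ApiFactory {
        // Decide once, based on the detected SDK version, which
        // implementation to hand out.
        static Api create(int sdkVersion) {
            return sdkVersion >= 14 ? new NewApi() : new OldApi();
        }
    }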