We have a huge application in our company that was bought along with (almost all of) its source code, and we have been adapting it to our needs, applying fixes, etc. Long ago, someone decided that the way to build the project would be to keep the original company's precompiled jars and only edit or add the classes we need in an Override folder, populated from zip files containing the original source code. So when we compile, we compile only our edited classes with the original jar on the classpath, then merge our new or edited classes into the original jar to create a new jar.
The issue is that people are changing method contracts, editing interfaces, and adding abstract methods to abstract classes. Since we never recompile the complete source code, only ours, the compiler only checks contract consistency for our overridden source.
I have started to notice that things seem to be breaking: old classes try to call methods that no longer exist, or call methods on classes that implement only a partial, outdated version of an interface.
I cannot simply merge the original source code with our overridden source, because some classes and Java files are autogenerated at build time, so I could accidentally "fix" something that isn't actually broken. (I tried; that's how I know.)
Is there a way to automatically validate contract consistency between the classes of several compiled jars? Once those inconsistencies are fixed, I will be able to merge both source directories and avoid these issues in the future.
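For the partial-interface case, I imagine something like the following reflective scan over the rebuilt jar (a minimal sketch; it assumes the merged jar and its dependencies are on the classpath, and it will not catch stale calls to removed methods, since the JVM resolves those lazily):

    import java.lang.reflect.Method;
    import java.lang.reflect.Modifier;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    public class ContractCheck {
        public static void main(String[] args) throws Exception {
            try (JarFile jar = new JarFile(args[0])) { // args[0]: jar to scan
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    String entry = entries.nextElement().getName();
                    if (!entry.endsWith(".class")) continue;
                    String cn = entry.substring(0, entry.length() - ".class".length())
                                     .replace('/', '.');
                    try {
                        Class<?> cls = Class.forName(cn, false,
                                ContractCheck.class.getClassLoader());
                        if (cls.isInterface() || Modifier.isAbstract(cls.getModifiers()))
                            continue;
                        // an abstract method surviving on a concrete class means an
                        // interface or superclass contract is only partially implemented
                        for (Method m : cls.getMethods())
                            if (Modifier.isAbstract(m.getModifiers()))
                                System.out.println(cn + " is missing " + m);
                    } catch (Throwable t) { // e.g. NoClassDefFoundError for vanished types
                        System.out.println(cn + " failed to load: " + t);
                    }
                }
            }
        }
    }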
Why not compile everything? (You do have the source code.) That way you will at least get compile errors.
I assume you don't have any tests to verify that things still work. Maybe you should start writing those too. A good place to start is the new code in the places that have broken before; that way you focus on testing the parts that are changing.
The question is whether the functionality I describe below already exists, or whether I need to make an attempt at creating it myself. I am aware that I am probably looking at a lot of work if it does not exist yet, and I am also sure that others have already tried. I am nevertheless grateful for comments such as "project A tried this, but..." or "dude D already failed because...". If somebody has an overall more elegant solution, that would of course be welcome as well.
I want to change the way I develop (private) Java code by introducing a multiplexing layer. By that I mean that I want to be able to create library-like parameterizable AST-snippets, which I want to insert into my code via some sort of placeholders (such as annotations). I am aware of https://projectlombok.org/ and have found that, while it is useful for small applications, it does not generally suit my requirements, as it does not seem possible to insert one's own snippets without forking the entire project and making major modifications. Also, lombok only ever modifies a single file at a time, while I am looking for a solution that will need to 'know' multiple files at a time.
I imagine a structure like this:
Source S: (Parameterizable) AST-snippets that can be included via some sort of reference in Source A.
Source A: Regular Java code, in which I can reference snippets from Source S. This code will not be compiled directly, as it lacks the referenced snippets and would thus produce a lot of compile-time errors.
Source T: Target Source, which is an AST-equivalent copy of Source A, except that all references of AST-Snippets have been replaced by their respective Snippet from Source S. It needs to be mappable to the original Source A as well as the resolved snippets from Source S, where applicable, as most development will happen there.
I see several challenges with this concept, not the least of which are debuggability, source-mapping and compatibility with other frameworks/APIs. Also, it seems a challenge to work around the one-file-at-a-time limitation, memory wise.
The advantage over lombok would be flexibility: lombok only provides a fixed set of snippets for specific purposes, whereas this would enable devs to write their own snippets, or to modify getters, setters, etc. Also, lombok hooks into the compilation step and does not output the 'fused' source, afaik.
I want to target at least javac and Eclipse's ecj compiler.
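For comparison, the one mechanism both javac and ecj already support for compile-time code generation is a JSR 269 annotation processor, though it can only create new files, never rewrite the annotated ones. A minimal sketch (snippets.Include is a hypothetical marker annotation):

    import java.io.IOException;
    import java.io.Writer;
    import java.util.Set;
    import javax.annotation.processing.AbstractProcessor;
    import javax.annotation.processing.RoundEnvironment;
    import javax.annotation.processing.SupportedAnnotationTypes;
    import javax.annotation.processing.SupportedSourceVersion;
    import javax.lang.model.SourceVersion;
    import javax.lang.model.element.Element;
    import javax.lang.model.element.TypeElement;
    import javax.tools.Diagnostic;
    import javax.tools.JavaFileObject;

    @SupportedAnnotationTypes("snippets.Include")
    @SupportedSourceVersion(SourceVersion.RELEASE_8)
    public class SnippetProcessor extends AbstractProcessor {
        @Override
        public boolean process(Set<? extends TypeElement> annotations,
                               RoundEnvironment roundEnv) {
            for (TypeElement annotation : annotations) {
                for (Element e : roundEnv.getElementsAnnotatedWith(annotation)) {
                    String generated = e.getSimpleName() + "Snippets";
                    try {
                        // a processor may only create *new* sources; it cannot
                        // rewrite the annotated class the way lombok does
                        JavaFileObject file = processingEnv.getFiler()
                                .createSourceFile("snippets.generated." + generated);
                        try (Writer w = file.openWriter()) {
                            w.write("package snippets.generated;\n"
                                  + "public class " + generated
                                  + " { /* expanded snippet would go here */ }\n");
                        }
                    } catch (IOException ex) {
                        processingEnv.getMessager().printMessage(
                                Diagnostic.Kind.ERROR, ex.toString(), e);
                    }
                }
            }
            return true;
        }
    }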
I have a fairly new, small project with a new team, and we have tried a few different approaches and gone through a few significant design iterations.
We have some classes lying around that should not be used, and a few such messes as ClassStuff and ClassStuffImproved. We have SVN, but I don't think nuking all the junk and making people dig through the history manually is productive. Some things may need to be re-implemented properly, and the previous poor implementation would provide a reference. I do, however, want to break anything that depends on junk. I also want the project to build even if these junk files are broken.
Is there a folder convention? Should I put all nuked content in a text file so at least it's easily searchable when someone wonders where a class went?
What is the typical convention here?
I've kept SVN history in place by using TortoiseSVN's Repository Browser. If you have a recent SVN server, operations like "Move to", "Copy to" and "Rename" will keep the history linked to the original. Be sure to update all checkouts after these kinds of changes (otherwise local changes will have a hard time merging with the new server reality).
Files you no longer use can be moved to a "museum" or "attic" branch (and try to keep the original directory structure in there).
Apache projects have been changing package names to indicate new versions that more or less break with previous versions (e.g. compare commons-lang 2.6 with 3.2). In SVN you can do this by simply renaming a directory in the trunk. I find it a good convention: all code that depends on the old version breaks, but is easily updated by adding one character to the import statements (and of course you verify that the new version works as expected by reading the updated API docs).
Re-implementing junk is, in my experience, harder than it looks at first glance. For classes that need to be replaced, I start with the @Deprecated annotation at class level. Your IDE will clearly show where deprecated code is used, and the compiler will show warnings. Create the re-implementation in a 'version 2' package. Then you can update like you would update from commons-lang version 2 to commons-lang version 3. Once this is all done, delete the deprecated classes (or package).
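A sketch of the hand-off (v2.StuffService is a hypothetical replacement):

    /**
     * @deprecated Use {@link v2.StuffService} instead; will be removed once
     *             all callers have migrated.
     */
    @Deprecated
    public class ClassStuff {
        // old implementation kept in place as a reference during the rewrite
    }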
Be careful when deleting deprecated classes though: the last time I tried, I found a dependency on the deprecated classes in one little, old, but crucial and heavily tested program, and had to keep the deprecated classes in place to maintain backwards compatibility.
I'm not sure there is a convention; in my projects I just copy it into a new version with a name like:
MyClass2.java
This way I won't get into trouble when importing an old version of a source file. I also comment out the whole contents of MyClass.java with /* ... */. After some reasonable amount of time I drop the old MyClass version and leave MyClass2 in the project. I leave it named MyClass2 so that it signals that this file has history and is a more advanced version of my class, which brings a little fun into the process.
I've never seen anyone doing this, but the practice seems quite intuitive.
I need to manually add a method call to a class file without decompiling and recompiling the code, because it depends on thousands of other classes and I don't want to do more than is necessary. I know Java, but not how class files are structured.
Any help is appreciated.
EDIT:
I am not the owner of the source, and I need this to work on any computer, which means I cannot redistribute the sources and have them compiled on the fly while my patcher is working.
You have the source code, and you have all the other classes compiled. So you can recompile just that source file, passing the compiled classes to the Java compiler with the -classpath option.
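Something along these lines (paths hypothetical):

    javac -classpath original.jar -d out src/com/example/Target.java

Then overwrite the old Target.class inside the jar with the freshly compiled one.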
You should use ASM or Javassist to manipulate the bytecode. ASM is a little more complex and requires you to understand more about the JVM, but it's faster. Javassist doesn't require you to know much about the JVM's internals.
However, I don't see why you can't just recompile that single source file. If you only need to add this method once, it's very inefficient to learn ASM or Javassist.
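If you do go the bytecode route anyway, a minimal Javassist sketch looks like this (the class and method names are hypothetical):

    import javassist.ClassPool;
    import javassist.CtClass;
    import javassist.CtMethod;
    import javassist.CtNewMethod;

    public class AddMethod {
        public static void main(String[] args) throws Exception {
            // the original jar must be reachable, e.g. via the classpath
            ClassPool pool = ClassPool.getDefault();
            CtClass cc = pool.get("com.example.Target");
            CtMethod added = CtNewMethod.make(
                    "public void patched() { System.out.println(\"patched\"); }", cc);
            cc.addMethod(added);
            cc.writeFile("patched-classes"); // writes com/example/Target.class
        }
    }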
How about subclassing? Then you don't need to touch the sources.
So if you have the source code and want to add some methods to only one class, then you don't have to worry about the other classes, even if they depend on the class you are modifying. Recompiling one file doesn't affect the other classes, since linking happens at run time.
If your class is not declared final and the method you are interested in is not final, you can extend the class and override just that method.
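A sketch with hypothetical names (Target is the original class, compute() the method to change):

    public class PatchedTarget extends Target {
        @Override
        public int compute(int x) {
            // replacement behaviour; can still delegate to the original
            return super.compute(x) + 1;
        }
    }

The catch, as noted below, is that existing code must be pointed at the subclass.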
Just change the source code and recompile; everything will work fine. Subclassing won't work, because already-existing classes won't know about the new subclass until you change their code to use the new subclass instead of the old superclass.
For manual editing of classfiles, I'd recommend Krakatau. (Disclosure, I wrote it). It lets you disassemble a classfile, edit it, and reassemble. There are other assemblers out there, but AFAIK, Krakatau is the only one that supports all the weird edge cases in the classfile format.
The main caveat is that Krakatau by default does not preserve certain optional debugging attributes (specifically LineNumberTable, LocalVariableTable, and LocalVariableTypeTable), since there is no simple way to represent them in a human editable format, and failing to edit them when the bytecode changes will result in a verification error. Most likely you don't actually need this though so it shouldn't matter.
The other caveat of course is that you have to understand bytecode. But if you don't, you won't be able to manually edit classfiles anyway.
I got it now! I created fake source files with the same names/methods but nothing in them except the bare class and method declarations. That way I only needed to stub out the classes that are directly referenced by my class file. Compiling now takes a few milliseconds, whereas it used to take around 124s, lol. Works great!
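In other words, a compile-time stub like this stands in for each class the modified file references directly; only the names and signatures matter to the compiler, and the real class from the original jar is still used at run time (names hypothetical):

    package com.example;

    // Stub of the real com.example.Helper: same name and method signatures,
    // empty bodies. Placed only on the compile-time classpath.
    public class Helper {
        public String lookup(String key) { return null; }
    }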
Say I have: class A depends on class B depends on class C depends on class A.
It seems impossible to compile. I have read this post about disabling compile-time dependency checking, but all my classes are within the classpath, well-defined, etc. The only problem is that they mutually depend on each other.
Is there a way to write such an application that would compile without a hitch?
We do have several such dependency cycles in our legacy codebase and they do compile without a hitch.
This is not to say it is good to have cyclic dependencies - on the contrary. I intend to get rid of them eventually to clean up our architecture. Nevertheless, in the meantime, the code still compiles and works.
The important thing here is that the compiler must be able to compile all the classes at the same time. If this is the case, there should be no problem. Of course, you should take care of the usual directory layout problems.
If the packages can't be compiled together, it gets more complicated - you might have to create dummy implementations first (which don't depend on the other classes) and then substitute the real classes once you have them. But I can't really imagine a reason for not being able to compile them together.
You can have a circular dependency like this because Java knows which files to read to find the code for a given name; i.e., it compiles them all at once. You will only have a problem if you try to compile the classes one at a time.
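For example, these three mutually dependent classes compile without a hitch as long as they are handed to javac together:

    // A.java
    class A { B b; }

    // B.java
    class B { C c; }

    // C.java
    class C { A a; }

    // works:  javac A.java B.java C.java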
And what are the pros and cons of using either?
I actually saw it in NetBeans under Project Properties > Libraries for Java applications. There are two tabs, one for compile-time libraries and one for run-time libraries, and it looks like we can add a library to either one independently of the other.
There is no such thing as compile-time libraries vs. runtime libraries.
Perhaps you're mixing some concepts.
In Java, the libraries you use are statically validated at compile time and validated again at runtime.
For instance, if you want to use IterableMap from the Apache Commons Collections library, the compiler validates at compile time that you are invoking a method that exists in that type.
But the compiler doesn't link or do much of anything else with that library; you still need it at runtime. So when your code executes, the Java runtime searches for that class again and invokes the method the compiler verified existed.
And that's all there is to it.
The UI and terminology of the Libraries properties dialog is pretty confusing.
The Help button on that dialog will give you a fair bit of info.
The Compile-time library list may be a subset of the Run-time library list.
Consider this situation...
You have source code that imports classes from a library widgets.jar. The class files in widgets.jar reference symbols from the jar file xml.jar. If your source code does not import classes from xml.jar, you could define the compile-time libraries list to contain just widgets.jar.
When you try to run your project, you will probably need to include xml.jar in the run-time libraries list to prevent ClassNotFoundExceptions.
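On the command line the same split looks like this (paths hypothetical; use ; instead of : as the path separator on Windows):

    javac -cp widgets.jar MyApp.java          # xml.jar not needed to compile
    java  -cp .:widgets.jar:xml.jar MyApp     # but required at run time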
Perhaps this comes into play when you want to load a library dynamically, or check for the existence of the library and then execute the code.
During compilation, the compiler needs to know the signatures of the methods, classes, etc. to check that your code is correct. Hence you add the compile-time library.
During runtime, the JVM still needs the library to run that specific code. But you can add logic that avoids that code by checking whether the library exists, for example with Class.forName(). Some libraries might already exist on the system (qt.jar, for example) and some might not, and you can check and execute your code accordingly.
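A sketch of such a guard (the library class name is hypothetical):

    public class OptionalFeature {
        static boolean libraryPresent() {
            try {
                // only checks that the class can be found; does not initialize it
                Class.forName("com.example.optional.FancyWidget", false,
                        OptionalFeature.class.getClassLoader());
                return true;
            } catch (ClassNotFoundException e) {
                return false;
            }
        }

        public static void main(String[] args) {
            System.out.println(libraryPresent()
                    ? "optional code path enabled"
                    : "library missing, skipping feature");
        }
    }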
Please correct me if I am wrong.
As others have stated, you are confusing concepts. I think what you are really trying to understand is what Maven refers to as dependency scope. Sometimes you only need a dependency at compile time because you expect it to be provided at runtime, and sometimes you need it at runtime but not at compile time.
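In Maven terms the two cases look like this (a sketch; the coordinates are just common examples):

    <!-- provided: needed to compile, supplied by the container at run time -->
    <dependency>
      <groupId>javax.servlet</groupId>
      <artifactId>javax.servlet-api</artifactId>
      <version>3.1.0</version>
      <scope>provided</scope>
    </dependency>

    <!-- runtime: never referenced in source, but needed when the program runs,
         e.g. a JDBC driver loaded reflectively -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>8.0.33</version>
      <scope>runtime</scope>
    </dependency>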