Is a CompilationParticipant bundle usable in a headless PDE build? - java

I've written a plugin that uses the org.eclipse.jdt.core.compilationParticipant extension to gather some compile information to be used elsewhere. I've tested it in multiple versions of the Eclipse IDE and it works like a charm. My ultimate goal is to use it in a headless production PDE build. I've added some logging to the bundle so I know when it starts up, when it shuts down, and when source compilation occurs. The problem is that these events are never caught by the participant in my headless build. The headless PDE build is kicked off by starting the Equinox launcher from an Ant script that runs the AntRunner, which executes the PDE build script. There are so many scopes of execution involved that I'm unsure where to start looking. My first question is: is what I'm trying to do even possible? It didn't seem like CompilationParticipant would only work in the UI, but I want to make sure before I go down the road of debugging this. Has anyone ever done this?
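For context, the participant itself is roughly this kind of class (a stripped-down sketch; the class name and logging are illustrative), registered against the org.eclipse.jdt.core.compilationParticipant extension point:

import org.eclipse.jdt.core.IJavaProject;
import org.eclipse.jdt.core.compiler.BuildContext;
import org.eclipse.jdt.core.compiler.CompilationParticipant;

public class LoggingCompilationParticipant extends CompilationParticipant {

    @Override
    public boolean isActive(IJavaProject project) {
        // Opt in for every Java project so the builder will call us back.
        return true;
    }

    @Override
    public void buildStarting(BuildContext[] files, boolean isBatch) {
        // Called by the JavaBuilder before source files are compiled.
        System.out.println("Compiling " + files.length + " file(s), batch=" + isBatch);
    }
}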
I tried to add a comment, but I'm too wordy, so I will try to clarify here a bit. Unfortunately I can't do much to change the build system except to apply hooks like I am attempting. I did spend some time running through the Ant scripts that PDE generates and saw that it is calling the JDT compiler adapter, which made me curious whether the JDT compiler adapter could reference the compilation participant, since it is running Ant from the plugin and should have access to the framework. It also seemed to be the intent of the participant API to allow hooking the JDT compiler to do things like the APT processor implementation and other DSL implementations. That was my read on the intent of participants, and I assumed they would be available in a headless build since the APT processor works. But since I can't find a really good tutorial I'm putting things together piecemeal, and I'm guessing I'm missing something, or at least I hope so.
It is true that PDE is generating Ant scripts and calling the javac task, but it is also setting the build.compiler property to use the JDT compiler, which therefore, I would assume, has access to the OSGi framework. Here is a snippet from one of the generated build files to show what I am talking about:
<compilerarg line="-log '${temp.folder}/pde.example3.jar.bin${logExtension}'" compiler="org.eclipse.jdt.core.JDTCompilerAdapter"/>
Debugging org.eclipse.jdt.internal.core.JavaModelManager reveals that the JDT compiler is in fact being used, but getRegisteredParticipants is not being called for some reason; startup() is, however, being called. So the question is why it does not try to register participants.
After spending hours in the debugger attached to the various VMs that spawn during my build process, I was able to determine the flow through a PDE build. I don't believe that CompilationParticipants come into play; in fact, I don't even think the JavaBuilder is called. It looks like the execution path is something like the following:
Ant spawns my VM, which starts the Equinox launcher, which starts up the OSGi framework and instantiates the AntRunner application. This in turn starts Ant from the Eclipse Ant plugin, which runs the build.xml file from the PDE plugin. That build.xml generates all the Ant scripting used to build the Eclipse plugins, which includes setting build.compiler to the JDTCompilerAdapter that wraps the Eclipse Java Compiler (originally based on Visual Age for Java). The JDTCompilerAdapter does some setup and instantiates the org.eclipse.jdt.internal.compiler.batch.Main class, which does the real compilation, and also instantiates the org.eclipse.jdt.internal.compiler.apt.dispatch.BatchAnnotationProcessorManager class to handle annotation processing. Nowhere in this path of execution are participants notified, and the JDTCompilerAdapter seems to be specifically designed to be usable outside the OSGi environment in Ant. So it looks like CompilationParticipants will not give me what I need in a headless PDE build using the AntRunner.
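For what it's worth, that batch compiler can indeed be driven completely outside OSGi through its public facade, org.eclipse.jdt.core.compiler.batch.BatchCompiler, which underlines why no framework and no participants are involved. A rough sketch (the options and paths are made up):

import java.io.PrintWriter;

import org.eclipse.jdt.core.compiler.batch.BatchCompiler;

public class StandaloneEcj {
    public static void main(String[] args) {
        // Runs the Eclipse batch compiler with javac-style arguments,
        // with no OSGi framework and hence no CompilationParticipants.
        boolean ok = BatchCompiler.compile(
                "-source 1.8 -target 1.8 -d bin src/Example.java",
                new PrintWriter(System.out),
                new PrintWriter(System.err),
                null /* no progress callback */);
        System.out.println(ok ? "compiled" : "failed");
    }
}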

AFAIK PDE build is "just" a fancy way of generating a lot of Ant scripts, and I believe it just uses the javac task to compile files. You can check that after the PDE build has run by going into your source folders, finding the generated Ant script, and looking at it.
If what you do is important for the build, I would recommend that you check out Buckminster. It is a build tool designed for OSGi applications. It is special in the sense that it actually builds in an Eclipse workspace, so it uses the same builders and facilities such as CompilationParticipants that you use during development, assuming you have installed the plugins in the headless build application.

Well, after tons of debugging, reading docs, and stepping through the PDE sources, it seems this can NOT be done. In a headless build, the JDTCompilerAdapter is designed to work outside of OSGi and does not have access to the framework; it is simply called from the javac task and does NOT involve the JavaBuilder, and therefore does not call any participants.

Related

How Does Gradle Pass Its Classpath When It Compiles Java? [duplicate]

I'm trying to get a better picture of what happens behind the scenes in Android Studio when building an Android application. I've been reading up on Gradle, but one thing I cannot figure out is how to see the respective CLI commands and arguments that are being invoked by Gradle. They seem to be abstracted away and not logged to the Gradle Console or Event Log.
The closest I've gotten to seeing what's going on inside Gradle is the AOSP code.
2.2.2 Source:
https://android.googlesource.com/platform/tools/base/+/gradle_2.2.2/build-system/gradle-core/src/main/java/com/android/build/gradle/tasks
Goals
I want to be able to see the respective CLI command that is generated by the Gradle tasks inside Android Studio.
Use Case Example
I want to view the Legacy Android Build Process in depth. This includes going through the following:
Source Code / Library Code -> javac -> Java bytecode (.class) -> proguard -> minimized bytecode (.class) -> dex -> DEX bytecode (.dex)
For example I would want to see the respective javac command invoked by AndroidJavaCompile. https://android.googlesource.com/platform/tools/base/+/gradle_2.2.2/build-system/gradle-core/src/main/java/com/android/build/gradle/tasks/factory/AndroidJavaCompile.java
I fear that the only way to do this is to look directly through source code or even build directly from source.
Due Diligence
I've done quite a bit of searching on Google, Android blogs, Google I/O talks, Android books, and much more. I haven't been able to find a straightforward answer.
That's not possible, simply because most of the Gradle tasks do not invoke CLI commands.
Every Gradle build file is a piece of Groovy code that gets executed in a JVM along with the Gradle API (written in Java). Therefore, you can implement any task or configuration functionality directly in any JVM language, which is what most plugins do instead of executing command-line tools. Nevertheless, executing external commands is possible by using or extending the Exec task.
The compilation step is handled by an AndroidJavaCompile task, which extends the common JavaCompile Gradle task with some version checks and the Instant Run feature. However, that still doesn't tell you how Gradle actually compiles the .java files. In the internal source files for the JavaCompile task of the Gradle API, there seem to be various implementations (DaemonJavaCompiler, JdkJavaCompiler and even CommandLineJavaCompiler). Since you can specify CompilerOptions with your task, Gradle seems to choose the real compiler based on these options. Please note that even though a CommandLineJavaCompiler exists, it is also possible (and highly likely) that Gradle prefers to use the javax.tools package and its JavaCompiler implementation to compile the source files instead of invoking a command-line tool.
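To make the javax.tools point concrete, here is a minimal sketch of compiling in-process with that API (the output directory and source path are made up, and this is not necessarily how Gradle wires it up internally):

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class InProcessCompile {
    public static void main(String[] args) {
        // The JDK's in-process compiler: no external javac process is spawned.
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        // run() takes in/out/err streams followed by javac-style arguments.
        int result = compiler.run(null, null, null, "-d", "build/classes", "src/Hello.java");
        System.out.println(result == 0 ? "compiled" : "failed with code " + result);
    }
}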
I also took a look at the ProGuard step in your example build process: ProGuard can be used as a command-line tool, where you can specify arguments to define how it'll work. But ProGuard also provides a Gradle task (ProGuardTask) that executes without invoking ProGuard from the command line; the ProGuard Java code is executed in the Gradle JVM.
As you can see, even if each Gradle task could be replaced by one (or multiple) CLI command(s), Gradle does not execute these commands. Instead, the functionality is called directly in the Gradle JVM. If you want to get better insight, you can increase the Gradle log level (for example by running with --info or --debug). Good implementations of Gradle tasks should provide all necessary information in the logs.

What's the fastest way to get all build errors in a Java project?

Currently I'm looking at integrating some build processes into my source control (Git hooks specifically). I'm trying to write a pre-commit hook that checks for build errors in my Java project (a medium-large test development project) and refuses to allow commits that introduce build errors. This is turning out to be rather challenging.
The approach here uses a command-line Eclipse tool to build and output warnings and errors. This does technically work, but it's slow and may cause problems with the Eclipse IDE (I've already had heap allocation errors). I've also looked at solutions using Ant, but these approaches don't seem to offer a simple one-line solution and may still be slow.
My main question: what's the fastest (run-time compilation speed) way to build and validate a Java project, by command line? I'd like a solution that returns 0 with no errors and something else if errors are present, but I'm willing to look at other things.
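For reference, the sort of check I have in mind looks roughly like this (a sketch using the JDK's javax.tools API; a real version would also need the project classpath passed via the options argument):

import java.util.Arrays;
import javax.tools.Diagnostic;
import javax.tools.DiagnosticCollector;
import javax.tools.JavaCompiler;
import javax.tools.JavaFileObject;
import javax.tools.StandardJavaFileManager;
import javax.tools.ToolProvider;

public class BuildCheck {
    public static void main(String[] args) throws Exception {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        DiagnosticCollector<JavaFileObject> diagnostics = new DiagnosticCollector<>();
        try (StandardJavaFileManager fileManager =
                compiler.getStandardFileManager(diagnostics, null, null)) {
            Iterable<? extends JavaFileObject> units =
                    fileManager.getJavaFileObjectsFromStrings(Arrays.asList(args));
            // Compile the files passed on the command line and collect every diagnostic.
            boolean ok = compiler.getTask(null, fileManager, diagnostics, null, null, units).call();
            for (Diagnostic<? extends JavaFileObject> d : diagnostics.getDiagnostics()) {
                System.err.println(d);
            }
            System.exit(ok ? 0 : 1); // non-zero exit makes the hook reject the commit
        }
    }
}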
Let's start with some basics:
If you enforce this on the server, note that server-side hooks (pre-receive/update), unlike the client-side pre-commit hook, have no working directory by default. Either way, you have to make sure that javac is available and is the correct version.
Your pre-commit hook will freeze up the user's terminal until completion.
Now, how long will it take to check out a fresh copy of your Java project, run Ant, wait for it to compile, and then process the output of the compile? A minute or two? 20 seconds? 10 seconds? Even 10 seconds will feel like forever as you wait for the Git push to complete. And if other users want to commit code, they also have to wait.
A better and easier approach is to use a continuous build server like Jenkins. Jenkins is easy to set up (it comes with its own application server built in) and has hundreds of plugins that you can use to help report the health of your project. If a compile fails, Jenkins will email the culprit and whomever else you specify.
We have our Jenkins set up to do Ant builds and Maven builds, and to use either Git or Subversion as the repository (depending upon the project). Jenkins builds the project, keeps the console log, and will fail the build if build.xml fails. At our place, this means I start pestering the developer to fix the problem or to undo their changes. At my last workplace, developers were given 10 minutes to fix the build, or I would undo their changes.
Not only can Jenkins let you know when a build fails, but it has plugins that can report on Java compiler warnings and Javadoc warnings, run FindBugs and PMD, find duplicate lines of code (via CPD, which comes with PMD), and then report everything in a series of graphs. You can also mark builds as unstable (the build completes, but is problematic) or simply fail the build based upon the number of issues found by these tools.
Jenkins can also run unit tests, again graph the results, and then run coverage analysis with JaCoCo, Cobertura, or Emma.
So, take a look at Jenkins. It's easy to set up and will do exactly what you want and more.
Ant. There isn't going to be a "one-line solution". Write an Ant script that compiles the code and fails if there are any errors. It's not easy, but it's the best option.
Out of the choices you mention, Ant is the best. But let's face it, writing XML sucks. My guess is that any build tool will fail and return an error code when compilation fails. My favorite is sbt, but there's a bit of a learning curve if you aren't into Scala (and even those who are into Scala like to complain about sbt). Another great option IMO is Gradle, where you write your build scripts in Groovy, a dynamically typed language that is close to a superset of Java.
Jenkins may be something you could look at.

Register Bundles in the PluginRegistry?

I'm trying to load OSGi bundles from an arbitrary folder at runtime in order to use them in my Eclipse RCP application. I have done the following steps so far to achieve that objective:
Create a new Plugin
Acquire the BundleContext from the Plugin Activator
Install a Bundle via the install() method of the BundleContext
Start the installed Bundle via its start() method
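In code, steps 2 to 4 look roughly like this (a minimal sketch; the location string is a placeholder):

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;

public class BundleLoader {

    // The BundleContext is obtained from the plugin's Activator (step 2).
    public Bundle loadAndStart(BundleContext context, String location) throws BundleException {
        // Install the bundle from an arbitrary folder (step 3),
        // e.g. "file:/some/folder/com.example.mybundle_1.0.0.jar".
        Bundle bundle = context.installBundle(location);
        // Start it (step 4); afterwards its state should be Bundle.ACTIVE.
        bundle.start();
        return bundle;
    }
}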
After these steps the bundle is in state ACTIVE and can be retrieved via any BundleContext. My problem is that the bundle cannot be retrieved via PluginRegistry.getAllModels(). Apparently the PluginRegistry is not listening to changes in the BundleContext. I need to find a way to register my bundle in the PluginRegistry. This is important because the PluginRegistry is used by already existing software parts, e.g. the Manifest Editor.
The PluginRegistry has no method to register Bundles. Is there a way to add them to the registry?
The PluginRegistry class is a development-time class supporting the PDE. Does this mean that your RCP application includes the PDE and is used (in part) for plugin development? If so, you will need to work out how PDE works (which is something I don't know much about). I recommend having a look at the classes in the org.eclipse.pde.runtime plugin (not the internal classes). Using the debugger and reading the code, you should be able to work out how to add a plugin to the PDE runtime. If you have further questions about that, use the PDE newsgroup at the Eclipse site.
If your RCP application does not include plugin development, then there is no need to work with the PluginRegistry at runtime, so I'm confused by your question. Perhaps you could elaborate more?
During the development of Acceleo, we stumbled upon this problem too, as we needed to let the user deploy Eclipse plugins located in their workspace into the running Eclipse instance (we also needed to uninstall those plugins after that). Since Acceleo is open source, you can have a look at our source code on GitHub.
I won't detail everything here, but you should find what you are looking for around line 880 and also around line 752. The file linked is our utility class for manipulating Eclipse plugins in the workspace and Eclipse bundles in the running instance, so you can find there pretty much anything needed to handle your problem.
A small warning: when we deploy an Eclipse plugin from the workspace on the fly, we deactivate its plugin.xml. Most Eclipse tools just look at the plugins contributing to their extension point at a given moment and do not listen dynamically to the installation / uninstallation of Eclipse plugins contributing to their extension point (which can be done like this), so they may keep references to contributions from plugins that we will uninstall later, which can create problems. This behavior is explained in detail at line 775 in the first linked file.
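For illustration (my own sketch, not necessarily what the linked code does), listening dynamically for contributions to an extension point can look like this; the extension point id is hypothetical:

import org.eclipse.core.runtime.IExtension;
import org.eclipse.core.runtime.IExtensionPoint;
import org.eclipse.core.runtime.IRegistryEventListener;
import org.eclipse.core.runtime.Platform;

public class ContributionTracker implements IRegistryEventListener {

    public void install() {
        // Only get notified about contributions to one extension point.
        Platform.getExtensionRegistry().addListener(this, "com.example.myExtensionPoint");
    }

    @Override
    public void added(IExtension[] extensions) {
        // A plugin contributing to the extension point was installed.
    }

    @Override
    public void removed(IExtension[] extensions) {
        // A contributing plugin was uninstalled: drop cached references here.
    }

    @Override
    public void added(IExtensionPoint[] extensionPoints) {
        // Not needed for this scenario.
    }

    @Override
    public void removed(IExtensionPoint[] extensionPoints) {
        // Not needed for this scenario.
    }
}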
Regards,
Stephane Begaudeau

Android with AspectJ and building it with Ant on Eclipse

I'm new to Android and wanted to use AspectJ with it.
I searched a couple of articles online and followed the instructions to get it working:
http://blog.punegtug.org/2010/11/adding-aspect-to-android.html
But I wanted to know whether it's possible to separate the aspects from the Android project. In the tutorial linked above, both the Android app and the aspects live inside the same project, but in many cases we want to leave the Android project untouched in its own isolated space.
Let's say I have AndroidProject in my Eclipse workspace; I would like to create a separate project for my aspects called something like "AndroidAspectProject" which only contains the aspects.
I'm not sure whether this would work, because it seems we need to let the AspectJ compiler weave the pointcuts and advice into the .class files before creating the .dex files. In that sense, I may not be able to do it in a separate project.
Has anyone tried this?
Another related question would be:
Is it possible to have Ant build the AndroidProject with AND without aspects on it? Can this be done outside of Eclipse?
I'm looking for a way to build different flavours, as I only want to inject pointcuts into AndroidProject for dev/debug builds, but leave it untouched for release builds.
Whether or not the compile-time aspects are applied is a matter of whether or not you run the AspectJ Ant tasks. Have separate targets or properties for the AOP and non-AOP builds, and either build one based on a target name or property, or build them both and give them different artifact names.
IIRC Eclipse allows you to specify an Ant target to run on a build.
Inside of Eclipse, this is simple. Just add AndroidAspectProject to the aspect path of AndroidProject.
Inside of Ant, there are several ways of doing this, but the simplest is to define two targets: one that uses iajc and one that uses javac to compile your sources. You then need a little Ant magic to switch between targets depending on whether you are compiling for dev or for production.

Keep correctness of Eclipse workspace with continuous integration

IDE misconfiguration is a big source of inefficient time use in our team.
I wanted to know if other teams have tried to check the health of the Eclipse workspace with continuous integration.
Eclipse is open source and extensible, and most (all?) of its configuration files are XML. So it should not be difficult to add a step to continuous integration that checks the health of the workspace, such as no missing jar files, no errors, etc.
What we have is a separate Ant script to do the real builds that go to QA and to the customers. This Ant script is run by continuous integration, and we have put in place a few simple checks that catch most big showstoppers.
The workspace configuration is a different story, and we sometimes detect problems when it's too late (the dev has already left for home).
EDIT: Note that we share our Eclipse config files.
There is some information on building with Eclipse from the command line here.
(Should be a comment, but I can't).
I don't see why you want to do that. Eclipse complains loudly if anything is broken, so leave it to the developer.
What you should consider instead, in my opinion, is writing tests that check that everything is as you expect in the build process for the builds produced from the source code the developers have checked into the source repository.
If a build breaks because a jar is missing, add a check. If a build breaks because it depends on a certain feature of the JVM, add a check.
Only ship builds outside of the development team that pass all tests. Builds that fail should be fixed by the developer who introduced the change that broke the build.
Since you are using Ant, you can create a custom task that verifies the following files against pre-defined reference copies and reports a problem if they don't match (a rough sketch of such a task follows after the list):
workspace/.metadata/*.* (whichever configurations you think are important)
workspace/project/.classpath
workspace/project/.project
workspace/project/.settings/*.* (whichever configurations you think are important)
Of course, these files include some hard-coded paths, so you can use regular expressions in the pre-defined files.
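A rough sketch of such a custom task (the class name, attribute names, and the plain string comparison are all illustrative; a real version might apply the regular-expression matching mentioned above):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.Task;

public class VerifyConfigTask extends Task {

    private String actual;   // e.g. workspace/project/.classpath
    private String expected; // the pre-defined reference file

    public void setActual(String actual) { this.actual = actual; }
    public void setExpected(String expected) { this.expected = expected; }

    @Override
    public void execute() throws BuildException {
        try {
            String actualText = new String(Files.readAllBytes(Paths.get(actual)), "UTF-8");
            String expectedText = new String(Files.readAllBytes(Paths.get(expected)), "UTF-8");
            if (!actualText.equals(expectedText)) {
                // Failing the task fails the build, so the drift gets reported.
                throw new BuildException("Configuration drift detected in " + actual);
            }
        } catch (IOException e) {
            throw new BuildException("Could not compare " + actual + " with " + expected, e);
        }
    }
}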
If you want to check only simple things like "the project doesn't compile", then just compile the project in the ant script (using the javac task) and see if there are errors.
Another thing: continuous integration should preferably be IDE-agnostic. I.e. you should have an IDE-less environment (a CI engine) that compiles the project. Imagine the following:
Three developers; one of them accidentally removed a jar from his Eclipse setup, but the project in the repository still compiles. No need to report problems in that case.
One of the developers adds a new jar and commits. The others have not yet updated. No problems are reported in their workspaces, although after they update, they might get the problem.
All that said, I think you'd better look at Hudson, which is a continuous integration engine. That way you won't be dependent on IDE settings for your builds.
