Set custom compiler in eclipse (omp4j) - java

So I am trying to use omp4j with the Eclipse IDE. The problem is that omp4j needs to replace the javac command to work (see http://www.omp4j.org/download), and I don't know how to accomplish that in Eclipse other than renaming omp4j.jar to javac.jar and replacing my JDK's javac.jar, which seems like the wrong solution.

omp4j is a preprocessor. If omp4j is called without --no-compile, the preprocessed Java source code is automatically compiled via javac, so omp4j can be used as a replacement for javac.
Eclipse has its own incremental Java compiler, which cannot be replaced. This means that in Eclipse, omp4j has to be used with the --no-compile argument, as a preprocessor only. The preprocessor can be executed
in an Ant, Maven, Gradle, etc. build script, or
via an Ant build script registered as a project builder that runs on save.
To have full Java tooling support for the sources before preprocessing, the OMP4J_THREAD_NUM and OMP4J_NUM_THREADS constants can be faked via a static import statement with the * wildcard and a JAR that exists in two different versions: one with these fake constants for the sources you edit, and one with other constants for the generated sources, which are not intended to be edited (see the sketch below).
It will probably be best to start with a separate Java project for each stage, one before and one after preprocessing.
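For illustration, the JAR with the fake constants could contain something like the following; the package and class names here are made up, only the constant names matter.

// Hypothetical class shipped in the "fake constants" JAR so that the
// un-preprocessed sources compile in Eclipse; package and class name are arbitrary.
package omp4j.fake;

public final class Omp4jConstants {
    public static final int OMP4J_NUM_THREADS = 1; // placeholder value for editing only
    public static final int OMP4J_THREAD_NUM = 0;  // placeholder value for editing only
    private Omp4jConstants() {}
}

The sources you edit would then contain import static omp4j.fake.Omp4jConstants.*; while the second version of the JAR sits on the build path of the generated-sources project.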

Related

How to rename packages and classes programmatically?

I have a code base scattered across tens of repositories.
I want to standardize the names of packages and classes, but it's too tedious to do it by hand in an IDE, since I need dictionary-based renaming across repositories.
Is there a way to programmatically rename classes and packages across many repositories?
A similar thing for a different language: https://metacpan.org/pod/App::EditorTools
Eclipse, and just about every other major IDE, can do this rather trivially. Load the project into the IDE (most can read the project if it is built by Maven or Gradle, just by saying you want to 'import an existing Maven Java project' or some such, possibly after installing a Maven and/or Gradle plugin; if it's not a project built by such tools, then just import an existing Java project and tell Eclipse where the source files live).
Then right-click the package, pick Refactor/Rename, and rename it. Eclipse (or IntelliJ, or any other major Java IDE) will rename the directory, update the package statement in every source file inside it, update all imports and any other references, and even search for strings that contain that exact name in case you're doing weird reflective shenanigans, telling you that those probably also need to be updated.
It's not quite programmatic, but this sounds like it will be much easier and faster than actually using e.g. ecj, or writing an Eclipse app that runs without a user interface to apply these refactoring scripts.

How to configure Javac plugin in IntelliJ IDEA

I run a custom compiler plugin with the -Xplugin:MyPlugin switch, which injects some extra methods into my classes. I set the additional command line parameters under
Settings -> Build, Execution, Deployment -> Compiler -> Java Compiler
Everything builds just fine, but in the IDEA editor every call to the generated methods is highlighted in red, and autocomplete does not work either.
What else can I configure to make it recognize the generated methods?
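For reference, a minimal javac plugin of this kind looks roughly like the sketch below; the class name MyPlugin comes from the -Xplugin switch above, and the listener body is just a placeholder for whatever actually injects the methods.

import com.sun.source.util.JavacTask;
import com.sun.source.util.Plugin;
import com.sun.source.util.TaskEvent;
import com.sun.source.util.TaskListener;

public class MyPlugin implements Plugin {
    @Override
    public String getName() {
        return "MyPlugin"; // matched against -Xplugin:MyPlugin
    }

    @Override
    public void init(JavacTask task, String... args) {
        task.addTaskListener(new TaskListener() {
            @Override public void started(TaskEvent e) { }
            @Override public void finished(TaskEvent e) {
                // placeholder: the real plugin rewrites the AST here to add the extra methods
            }
        });
    }
}

The plugin is registered through a META-INF/services/com.sun.source.util.Plugin file containing the fully qualified class name.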
You need to write a plugin for IntelliJ IDEA to make it aware of the methods you generate. On-the-fly code analysis in IntelliJ IDEA uses its own parser and reference resolution implementation; it does not use javac, and cannot be extended by writing javac plugins.
The main entry point for such a plugin is the PsiAugmentProvider class.
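A rough outline of such a provider is sketched below; the exact getAugments signature and the plugin.xml registration differ between IDEA versions, so treat this as a sketch rather than the definitive API.

import com.intellij.psi.PsiElement;
import com.intellij.psi.augment.PsiAugmentProvider;
import java.util.Collections;
import java.util.List;

// Registered in plugin.xml via the lang.psiAugmentProvider extension point
// (verify the exact extension point name for your IDEA version).
public class MyPluginAugmentProvider extends PsiAugmentProvider {
    @Override
    public <Psi extends PsiElement> List<Psi> getAugments(PsiElement element, Class<Psi> type) {
        // Return light PSI elements describing the methods the javac plugin
        // generates, so the editor can resolve and autocomplete them.
        return Collections.emptyList(); // placeholder
    }
}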

Frege can't find classes in referenced project or external jar

I think I'm making a simple mistake here, but I can't get Frege to find any classes outside of the local Eclipse project.
I have a working, non-trivial Java project (not mine) that I do not want to modify. I want a new, clean Frege-enabled project that makes use of classes from the original project.
I tried marking the original project as a dependency of my Frege project, and I tried packaging the original project into a JAR, and listing the JAR as an external dependency of the Frege project. In both cases, a Java file in the Frege project can access the classes, but the Frege compiler says "class org.foo.bar.Class is not a known Java class". This seems like a bug, but I am not confident that I have not missed a simple configuration step.
I have not tried setting arguments in the project configuration as I wouldn't know what to set.
I did quickly discover that I can make a new Java file in the Frege project with a blank subclass of whatever class I need and use that in a Frege file. I have successfully compiled and run a simple program like this. The program just makes a new object, gets a field, and prints the correct value, so I believe my Frege is installed and working properly.
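For illustration, that workaround amounts to a tiny Java bridge class along these lines; org.foo.bar.Class stands in for the real class, and a matching constructor may be needed if the superclass has no no-arg constructor.

// Empty Java subclass inside the Frege project, used only so that the
// Frege compiler sees a type belonging to its own project.
public class BarBridge extends org.foo.bar.Class {
}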
More info:
Eclipse Luna 4.4.0
Java 7
No Maven
Official eclipse-frege plugin installed through Eclipse
It should be enough to have your library listed in the build path and under "Referenced Libraries". Your recent comment indicates that the compiler does indeed find the class in question.
However, when you have an open editor tab, it will not take notice of changed dependencies. Also, especially in recent Eclipse versions, I have observed that resolved error markers are sometimes not cleaned up correctly.
Please close the editor tab that has the false errors shown, and reopen it.

Is a CompilationParticipant bundle usable in a headless PDE build?

I've written a plugin that uses the org.eclipse.jdt.core.compilationParticipant extension to gather some compile information to be used elsewhere. I've tested it in multiple versions of the Eclipse IDE and it works like a charm. My ultimate goal is to be able to use it in a headless production PDE build. I've added some logging to the bundle, so I am aware when it starts up, when it shuts down, and when source compilation occurs. The problem is that these events never get caught by the participant in my headless build. The headless PDE build is kicked off by starting the Equinox launcher from an Ant script that runs the AntRunner, which executes the PDE build script. There are so many scopes of execution involved that I'm unsure where to start looking. My first question is: is what I'm trying to do even possible? It didn't seem like the CompilationParticipant would only work in the UI, but I want to make sure before I go down the road of debugging this. Has anyone ever done this?
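For context, the participant class behind such an extension looks roughly like the sketch below; the class name and logging are placeholders.

import org.eclipse.jdt.core.IJavaProject;
import org.eclipse.jdt.core.compiler.BuildContext;
import org.eclipse.jdt.core.compiler.CompilationParticipant;

public class LoggingParticipant extends CompilationParticipant {
    @Override
    public boolean isActive(IJavaProject project) {
        return true; // participate in every Java project
    }

    @Override
    public void buildStarting(BuildContext[] files, boolean isBatch) {
        // gather the compile information here; this callback is driven by the
        // JavaBuilder, which, as discussed below, is not part of a headless PDE build
        System.out.println("Compiling " + files.length + " file(s)");
    }
}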
I tried to add a comment, but I'm too wordy, so I will try to clarify here a bit. Unfortunately I can't do much to change the build system except to apply hooks like I am attempting. I did spend some time running through the Ant scripts that PDE generates and saw that it is calling the JDT compiler adapter, which made me curious whether the JDT compiler adapter could reference the compilation participant, since it is running Ant from the plugin and should have access to the framework. It also seemed to be the intent of the participant API to allow hooking into the JDT compiler to do things like the APT processor implementation and other DSL implementations. That was my read on the intent of the participants, and I assumed they would be available in a headless build since the APT processor works, but since I can't find a really good tutorial I'm kind of putting things together piecemeal, and I'm guessing I'm missing something, or at least I hope so.
It is true that PDE is generating Ant scripts and calling the javac task, but it is also setting the build.compiler property to use the JDT compiler, and therefore, I would assume, has access to the OSGi framework. Here is a snippet from one of the generated build files to show what I am talking about:
<compilerarg line="-log '${temp.folder}/pde.example3.jar.bin${logExtension}'" compiler="org.eclipse.jdt.core.JDTCompilerAdapter"/>
Debugging org.eclipse.jdt.internal.core.JavaModelManager reveals that the JDT compiler is in fact being used, but getRegisteredParticipants is not being called for some reason. startup() is, however, being called, so the question is why it does not try to register participants.
After spending hours in the debugger attached to the various VMs that spawn during my build process, I was able to determine the flow through a PDE build. I don't believe that CompilationParticipants come into play; in fact, I don't even think the JavaBuilder is called. The execution path looks something like the following:
Ant spawns my VM, which starts the Equinox launcher, which starts the OSGi framework and instantiates the AntRunner application. This in turn starts Ant from the Eclipse Ant plugin, which runs the build.xml file from the PDE plugin. That build.xml generates all the Ant scripting used to build the Eclipse plugins, which includes setting build.compiler to the JDTCompilerAdapter, a wrapper around the Eclipse Java compiler (originally based on Visual Age for Java). The JDTCompilerAdapter does some setup and instantiates the org.eclipse.jdt.internal.compiler.batch.Main class, which does the real compilation, and also instantiates the org.eclipse.jdt.internal.compiler.apt.dispatch.BatchAnnotationProcessorManager class to handle annotation processing. Nowhere in this path of execution are participants notified, and the JDTCompilerAdapter seems to be specifically designed to be usable outside the OSGi environment in Ant. So it looks like CompilationParticipants will not give me what I need in a headless PDE build using the AntRunner.
AFAIK PDE build is "just" a fancy way of generating a lot of Ant scripts, and I believe it just uses the javac task to compile files. You can check that after the PDE build has run by going into your source folders, finding the generated Ant script, and checking.
If what you do is important for the build, I would recommend that you check out Buckminster. It is a build tool designed for OSGi applications. It is special in the sense that it actually builds in an Eclipse workspace, so it uses the same builders and machinery, such as CompilationParticipants, that you use during development, assuming you have installed the plugins in the headless build application.
Well, after tons of debugging, reading docs, and stepping through the PDE sources, it seems like this can NOT be done. In a headless build, the JDTCompilerAdapter is designed to work outside of OSGi and does not have access to the framework; it is simply called from the javac task and does NOT involve the JavaBuilder, and therefore does not call any participants.

Parallelism in Java 8

I tried to use the new parallel features of JDK 8, but unfortunately I couldn't get them to work.
NetBeans 7.1 says that the method "parallel" does not exist.
Does this method require special import?
Does anyone have sample code demonstrating Java 8 parallelism?
I have been playing with JDK8 Lambda Developer Preview for a few weeks now. The following is what I do to simplify compilation and testing of my code:
Configure JEdit to Compile JDK 8 Code
The following guide describes how to configure Apache Ant and JEdit to easily compile source code with JDK 8 Lambda Expressions and the new API features in the JDK 8 Lambda Developer Preview.
This is what I do, as of today, basically because no IDE supports these JDK 8 features yet.
Download the following:
JDK 8
JEdit
Apache Ant
Then create the following directory structure:
Sandbox
|-----jdk8
|-----ant
|-----projects
Place the uncompressed JDK build in the jdk8 directory.
Place the uncompressed Apache Ant in the ant directory.
The projects directory will be for JEdit projects.
Then install the following JEdit Plugins:
Ant Farm
Java Fold
Project Builder
Project Viewer
Project Wizard
SVN Plugin (I use this to synchronize my projects with my repo, you may not need it, though)
Now, configure your Apache Ant:
Create a file in your home folder named antrc_pre.bat (i.e. %USERPROFILE%\antrc_pre.bat).
(Note: if you are using Linux you can configure this in ~/.ant/ant.conf).
This file will be run by Apache Ant before running any tasks, therefore, this is a place to configure or override which JDK you want to use by defining the JAVA_HOME variable.
At the top of this file, define the JAVA_HOME variable and make it point to the directory where JDK 8 is installed, somewhat like this: SET JAVA_HOME=C:\Sandbox\jdk8
Make sure to comment it out once you're done with your JDK 8 session so that Ant continues to use the default configuration.
Time to configure JEdit Ant Plugin
In JEdit go to Plugins -> Plugin Options -> Ant Farm -> Build Options
In the dialog select the option: "Run Ant targets using an external script/build file"
Choose the ant.bat script (i.e. C:\Sandbox\ant\bin\ant.bat).
Note: If you are using Ant 1.8.x, you will probably need to add the property build.compiler=javac1.7 in the properties section of the plugin; otherwise you will get an error when compiling with JDK 8. I did not have this problem with Ant 1.7, though.
Then create a new Java Project:
In JEdit go to Plugins -> Project Builder -> Create New Project
Choose Java Application and click Next
Choose your projects directory as the place to locate files (i.e. C:\Sandbox\projects).
Voila! At this point, JEdit will present four buttons in the toolbar: Build Application, Compile, Clean and Run Application. These are based on the build.xml file and are executed via the corresponding Ant tasks. You're good to go; you can start writing lambda expressions and using the new APIs :-)
Parallelism Example
In the last developer preview (b50), there is little parallelism implemented yet. I can see they are doing more work in a separate branch (which you can look at if you download and build the OpenJDK 8 sources).
You can, however, use the method Arrays.parallel, which creates a ParallelIterable wrapper over an array. You can use this to test some of the parallelism features.
I wrote an example that finds primes in a large array. I could verify that all four of my cores are used when I run it in parallel.
// Uses the ParallelIterable API from the JDK 8 lambda developer preview (b50);
// isPrime is a primality predicate defined elsewhere (see the note below).
Integer[] source = new Integer[30000000];
for (int i = 0; i < source.length; i++)
    source[i] = i;
ParallelIterable<Integer> allIntegers = Arrays.parallel(source).filter(isPrime);
Iterable<Integer> primes = allIntegers.into(new LinkedList<Integer>());
This compiles and runs fine in my JEdit project with Apache Ant 1.8.x and JDK8 b50.
I hope this helps.
PS:
I did not define the predicate isPrime in the code above so as not to obscure the simplicity of the example. I am pretty sure everyone can easily define a primality predicate to try this code.
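For completeness, such a predicate could look like the sketch below; it is written against the final java.util.function.Predicate, whereas the b50 preview used its own, similarly shaped functional interface, so the type may need adjusting.

import java.util.function.Predicate;

// Simple trial-division primality test
Predicate<Integer> isPrime = n -> {
    if (n < 2) return false;
    for (int i = 2; i * i <= n; i++) {
        if (n % i == 0) return false;
    }
    return true;
};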
My suggestion would be to put NetBeans to one side, use a plain text editor to edit your Java code, and compile and run it from the command prompt using the Java 8 toolchain. That way you can be sure that your problems are not due to a NetBeans issue.
Check whether your NetBeans is using JDK 8 (I doubt it). If it does not, make it point to your local copy of JDK 8 instead of the built-in JDK.
Hope this helps.
You can use the nightly version of NetBeans, which now has some experimental support for JDK 8 features. I've given this a try and it seems to work well with lambdas (at least you don't get red squiggles under them; auto-formatting and suggested corrections don't seem to work properly yet, but those are minor niggles). You'll need to make sure you add the lambda-enabled JDK 8 as a Java Platform, and then set the source level to Java 8 for the project you want to experiment with.
You can grab the latest Lambda enabled build of the JDK here.
At the time of writing, there are three parallel methods on the Arrays class that can be experimented with: parallelStream(), parallelPrefix(), and parallelSort(). Note, however, that this will likely change before the final release; the API is very much in flux at present.
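For instance, parallelSort and parallelPrefix can be tried roughly like this; a minimal sketch against the final Java 8 API, which may differ from the preview builds, and the array contents are arbitrary.

import java.util.Arrays;

int[] values = {5, 3, 8, 1, 9, 2};

// sorts chunks of the array in parallel and merges the results
Arrays.parallelSort(values);

// replaces each element with the cumulative sum of the elements up to it
Arrays.parallelPrefix(values, (a, b) -> a + b);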
The example below finds all directories in my Documents directory:
import java.io.File;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
// List the directory, then filter for subdirectories in parallel
List<File> directories = Arrays.asList(new File("/Users/sid/Documents")
        .listFiles())
    .parallelStream()
    .filter(File::isDirectory)
    .collect(Collectors.toList());
Java 8 provides stream support, where a collection is transformed into a stream of objects. If the collection is small, .stream() is fine. But if you have a big collection and want to exploit parallelism, you can use the .parallelStream() method.
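A minimal sketch of the difference (the list contents here are arbitrary):

import java.util.Arrays;
import java.util.List;

List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8);

// sequential: elements are processed on the calling thread
int sum = numbers.stream().mapToInt(Integer::intValue).sum();

// parallel: the work is split across the common fork/join pool
int parallelSum = numbers.parallelStream().mapToInt(Integer::intValue).sum();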
