Why does Groovy compiler apparently produce 1.5 version of Java? - java

After some problems with differences between JSE versions, I'm trying to log the Java compiler version used to compile (it's Groovy 2.1.9, Grails 2.3.8, Java 1.7.0_60 in fact).
After some rummaging around, I've constructed this piece of code to read the leading bytes of the class - see http://en.wikipedia.org/wiki/Java_class_file#General_layout
(change the path to the class to match the package name):
class CompilerVersionSupport {
    public static String getVersion() {
        String classAsPath = 'com/my/organisation/CompilerVersionSupport.class'
        InputStream stream = (new CompilerVersionSupport()).getClass().getClassLoader().getResourceAsStream(classAsPath)
        DataInputStream ins = new DataInputStream(stream)
        assert ins.readUnsignedShort() == 0xcafe
        assert ins.readUnsignedShort() == 0xbabe
        int minor = ins.readUnsignedShort()
        int major = ins.readUnsignedShort()
        ins.close()
        int javaVersion = major - 44
        return "1.$javaVersion"
    }
}
Trouble is, it returns 1.5.
What could be going on?
Charles

By default, Groovy does not compile code to the same bytecode version as the JDK in use: 1.5 is the default, presumably for compatibility reasons. If you want the compiler to emit newer bytecode, you need to set that explicitly.
For example if you're using Maven to compile the code, you can use the GMavenPlus plugin. See the description of the targetBytecode parameter.
If you're not using Maven you can use -Dgroovy.target.bytecode=1.7, or research the possibilities for your particular build tool.

If you're using Maven as the build tool, then chances are that it's using the gmavenplus-plugin to compile Groovy. To find out the target Java version of the bytecode generated I poked into the pom of the gmavenplus-plugin that my application uses: ~/.m2/repository/org/codehaus/gmavenplus/gmavenplus-plugin/1.5/gmavenplus-plugin-1.5.pom.
Inside that file I saw the following; notice the <javaVersion> property:
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <mavenVersion>2.2.1</mavenVersion>
    <coberturaPluginVersion>2.7</coberturaPluginVersion>
    <javadocPluginVersion>2.10.1</javadocPluginVersion>
    <!-- these are properties so integration tests can use them -->
    <javaVersion>1.5</javaVersion>
    <dependencyPluginVersion>2.10</dependencyPluginVersion>
    <compilerPluginVersion>3.2</compilerPluginVersion>
    <junitVersion>4.12</junitVersion>
    <surefirePluginVersion>2.18.1</surefirePluginVersion>
    <pluginPluginVersion>3.4</pluginPluginVersion>
    <!-- this is a property so that site generation can use it -->
    <sourcePluginVersion>2.4</sourcePluginVersion>
    <!-- this is a property so that site generation and integration tests can use it -->
    <groovyVersion>2.4.1</groovyVersion>
</properties>
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>${compilerPluginVersion}</version>
    <configuration>
        <source>${javaVersion}</source>
        <target>${javaVersion}</target>
    </configuration>
</plugin>
I use IntelliJ as my IDE. IntelliJ automatically sets the language level to Java 1.5, and even if I change it, re-importing the project resets it back to Java 1.5.

I think the issue is with the program you are using to find the class version. If assertions are not enabled, the two assert statements never execute, so the stream doesn't consume the first two unsigned shorts; the subsequent minor and major reads then return 0xcafe and 0xbabe instead of the real version fields. Try enabling assertions, or use an if check:
public static String getVersion() throws Exception {
    String classAsPath = "com/my/organisation/CompilerVersionSupport.class";
    InputStream stream = (new CompilerVersionSupport()).getClass().getClassLoader().getResourceAsStream(classAsPath);
    DataInputStream ins = new DataInputStream(stream);
    if (ins.readUnsignedShort() != 0xcafe) throw new AssertionError("Invalid Class");
    if (ins.readUnsignedShort() != 0xbabe) throw new AssertionError("Invalid Class");
    int minor = ins.readUnsignedShort();
    int major = ins.readUnsignedShort();
    ins.close();
    int javaVersion = major - 44;
    return "1." + javaVersion;
}
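For experimenting with the header-parsing logic outside a build, here is a self-contained sketch of the same idea. The class name ClassVersion, the method versionOf, and the fabricated byte array are illustrative additions, not part of the original code:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersion {
    // Parse the class-file header: magic (0xCAFEBABE), minor_version, major_version.
    static String versionOf(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        if (din.readInt() != 0xCAFEBABE) {
            throw new IOException("Not a class file");
        }
        int minor = din.readUnsignedShort(); // minor comes before major in the format
        int major = din.readUnsignedShort();
        return "1." + (major - 44);          // 49 -> 1.5, 51 -> 1.7, 52 -> 1.8
    }

    public static void main(String[] args) throws IOException {
        // Fabricated header: magic, minor 0, major 51 (Java 7)
        byte[] header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE, 0, 0, 0, 51};
        System.out.println(versionOf(new ByteArrayInputStream(header))); // prints 1.7
    }
}
```

Feeding it a real .class resource stream works the same way; only the first eight bytes are read.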

Related

Standalone Nashorn with Java 11 throws java.lang.StackOverflowError upon eval

I came across an issue with Nashorn: when evaluating a large expression, it works fine in Java 8 but throws a java.lang.StackOverflowError in Java 11.
Exception in thread "main" java.lang.StackOverflowError
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.enterBinaryNode(LocalVariableTypesCalculator.java:444)
at org.openjdk.nashorn.internal.ir.BinaryNode.accept(BinaryNode.java:329)
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.enterJoinPredecessorExpression(LocalVariableTypesCalculator.java:777)
at org.openjdk.nashorn.internal.ir.JoinPredecessorExpression.accept(JoinPredecessorExpression.java:114)
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.visitExpression(LocalVariableTypesCalculator.java:603)
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.enterBinaryNode(LocalVariableTypesCalculator.java:447)
at org.openjdk.nashorn.internal.ir.BinaryNode.accept(BinaryNode.java:329)
I came across this question, and in an attempt to fix this issue, as suggested in this comment, I'm trying to use the Standalone Nashorn with Java 11, by using org.openjdk.nashorn.api.scripting.NashornScriptEngineFactory in my code (See this page for info on "why do we need to do that").
I have a simple Maven project, where I have added the following to my pom. For simplicity, I've included only the important parts:
Nashorn dependency
Maven Compiler Plugin
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>maven-project</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <dependencies>
        <dependency>
            <groupId>org.openjdk.nashorn</groupId>
            <artifactId>nashorn-core</artifactId>
            <version>15.3</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            ...
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <release>11</release>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
(Apart from these, I'm using maven-dependency-plugin and maven-jar-plugin to bundle this into an executable Jar).
The following is my Java code:
Main.java
package main;

import org.openjdk.nashorn.api.scripting.NashornScriptEngine;
import org.openjdk.nashorn.api.scripting.NashornScriptEngineFactory;

import javax.script.ScriptEngine;

public class Main {
    public static void main(String[] args) {
        String expression = "(\"BUY\" == \"BUY\") && (\"BUY\" == \"BUY\") && (1.00 >= 1000.000)"; // It's a long expression which contains conditions like this.
        try {
            NashornScriptEngineFactory factory = new NashornScriptEngineFactory();
            System.out.println("================================================================================");
            System.out.println("factory.engineName: " + factory.getEngineName());
            System.out.println("factory.engineVersion: " + factory.getEngineVersion());
            System.out.println("factory.languageName: " + factory.getLanguageName());
            System.out.println("factory.languageVersion: " + factory.getLanguageVersion());
            System.out.println("================================================================================");
            ScriptEngine engine = factory.getScriptEngine();
            Object results = engine.eval(expression);
            System.out.println("Successful");
            System.out.println(results);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
With java version "11.0.9" 2020-10-20 LTS, I'm building a jar called maven-project-1.0-SNAPSHOT.jar by running mvn clean install.
When running the main class of the Jar as follows:
java -jar target/maven-project-1.0-SNAPSHOT.jar
Based on this information, the standalone version of Nashorn will be loaded with Java 11 since I have used org.openjdk.nashorn.api.scripting.NashornScriptEngineFactory, and therefore, I'm not expecting a java.lang.StackOverflowError with Java 11.
But my output produces a StackOverflowError:
================================================================================
factory.engineName: OpenJDK Nashorn
factory.engineVersion: 15.3
factory.languageName: ECMAScript
factory.languageVersion: ECMA - 262 Edition 5.1
================================================================================
Exception in thread "main" java.lang.StackOverflowError
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.enterJoinPredecessorExpression(LocalVariableTypesCalculator.java:775)
at org.openjdk.nashorn.internal.ir.JoinPredecessorExpression.accept(JoinPredecessorExpression.java:114)
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.visitExpression(LocalVariableTypesCalculator.java:603)
at org.openjdk.nashorn.internal.codegen.LocalVariableTypesCalculator.enterBinaryNode(LocalVariableTypesCalculator.java:447)
at org.openjdk.nashorn.internal.ir.BinaryNode.accept(BinaryNode.java:329)
Update: When compiling this with Java 17 and testing, I did not get a StackOverflowError. As pointed out by @Thorbjørn Ravn Andersen in the comments, the reason to check this was to rule out the possibility of the built-in Nashorn being picked up in Java 11 instead of the standalone one. (Since Java 17 doesn't have the built-in Nashorn, it would pick only the standalone one.)
Based on this, I can also think of an addition to this question:
how can we make sure that the standalone Nashorn is picked up in Java 11?
I tried executing the jar as follows (referred to this answer of this related question), but the StackOverflowError is still being thrown. (The /path/to/my/maven-project/target/libs/ directory contains jars such as nashorn-core-15.3.jar, asm-7.3.1.jar, asm-commons-7.3.1.jar ...)
java --module-path "/path/to/my/maven-project/target/libs/" --add-modules org.openjdk.nashorn -jar target/maven-project-1.0-SNAPSHOT.jar
(Instead of passing arguments/flags like this, a preferable way would be to add these as configuration in the pom and handle it more neatly.)
FYI, The following code works fine with Java 8, but throws the same StackOverflowError in Java 11, which is why I had to use org.openjdk.nashorn.api.scripting.NashornScriptEngineFactory instead of javax.script.ScriptEngineManager.
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class Main {
    public static final String ENGINE_NAME = "nashorn";
    public ScriptEngine engine = new ScriptEngineManager().getEngineByName(ENGINE_NAME);

    public static void main(String[] args) {
        Main main = new Main();
        String exp = "(1640 >= 10) && (1640 <= 100) && (1640 >= 123)";
        try {
            Object result = main.engine.eval(exp);
            System.out.println(result);
        } catch (ScriptException e) {
            e.printStackTrace();
        }
    }
}
Am I missing anything here? Thanks in advance for any solutions.
How long is the actual source expression in the JavaScript code that causes this?
You are not getting the exception in Java 8 because Nashorn in Java 8 didn't have some of the more advanced static type inference capabilities which allow it to generate more optimal code when it can prove some variables will always be ints or doubles and never objects.
This static type inference was added in a later Nashorn version, one released with Java 9. The stack overflow happens in its code as it needs to recursively walk the expression tree. If you have an extremely long expression, this might happen. I'd suggest you break very long expressions into smaller ones, either isolating them into local functions or assigning them to local variables. Local functions are better as then you can keep short-circuited evaluation, e.g. if you had
function f(a, b, c, d, e, f, g, h) {
return a == b && c == d && e == f && g == h;
}
you can instead do
function f(a, b, c, d, e, f, g, h) {
function p1() {
return a == b && c == d;
}
function p2() {
return e == f && g == h;
}
return p1() && p2();
}
This example just illustrates the technique; Nashorn should be able to evaluate fairly long expressions before running into a stack overflow.
Alternatively, run your JVM with explicit -Xss setting to give threads more stack memory.
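To make the splitting advice concrete, here is a hedged Java sketch; the regroup helper is hypothetical (not from Nashorn or the answer above). It regroups a flat list of &&-joined clauses into smaller parenthesised sub-expressions, which keeps the parse tree shallower while the groups themselves stay joined by &&; the resulting chunks could then be wrapped in local functions as the answer suggests:

```java
import java.util.ArrayList;
import java.util.List;

public class ExpressionSplitter {
    // Hypothetical helper: regroup a flat list of clauses into parenthesised
    // sub-expressions of at most groupSize clauses each, joined by "&&".
    static String regroup(List<String> clauses, int groupSize) {
        List<String> groups = new ArrayList<>();
        for (int i = 0; i < clauses.size(); i += groupSize) {
            List<String> slice = clauses.subList(i, Math.min(i + groupSize, clauses.size()));
            groups.add("(" + String.join(" && ", slice) + ")");
        }
        return String.join(" && ", groups);
    }

    public static void main(String[] args) {
        List<String> clauses = List.of("a == b", "c == d", "e == f", "g == h");
        System.out.println(regroup(clauses, 2)); // (a == b && c == d) && (e == f && g == h)
    }
}
```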

Commons-lang StringUtils isNotBlank method still raises NPE

Does Sonar support commons-lang StringUtils ?
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.7</version>
</dependency>
Env:
INFO: SonarQube Scanner 3.2.0.1227
INFO: Java 1.8.0_121 Oracle Corporation (64-bit)
INFO: Linux 4.19.2-1.el7.elrepo.x86_64 amd64
Community EditionVersion 7.6 (build 21501)
code to reproduce this issue:
public class DetectorImport {
    public String check1(Nonentity nonentity) {
        String s;
        if (nonentity == null) {
            s = null;
        } else {
            s = nonentity.getName();
        }
        if (StringUtils.isNotBlank(s)) {
            s = s.replaceAll("（", "(");
        }
        return s;
    }
}
In this Sonar Google Groups thread, it is mentioned that the commons-lang methods are supported by Sonar:
https://groups.google.com/forum/#!topic/sonarqube/aluTP63hfyA
Maybe another approach could be for you to use other utility classes, commonly used across Java projects. We currently support methods from commons-lang StringUtils (v2, and v3), guava preconditions, and java 8 methods from java.util.Objects (nonNull, isNull, requireNonNull). As we know how these methods behave, we are able to correctly handle such call and discard similar FPs. Of course, I don't want to force you using such libraries to make the analyzer happy. :)
Changing the above code to the following indeed solves the issue:
public class DetectorImport {
    public String check1(Nonentity nonentity) {
        String s;
        if (nonentity == null) {
            s = null;
        } else {
            s = nonentity.getName();
        }
        if (s != null) {
            s = s.replaceAll("（", "(");
        }
        return s;
    }
}
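If you would rather not depend on Sonar recognising StringUtils at all, the quote above notes that the analyzer also understands the java.util.Objects methods (nonNull, isNull, requireNonNull). A minimal sketch of an equivalent null-safe check; normalise is a hypothetical name, not from the original code:

```java
import java.util.Objects;

public class BlankCheck {
    // Null-safe "not blank" check using only java.util.Objects and String,
    // both of which SonarQube's analyzer is documented to understand.
    static String normalise(String s) {
        if (Objects.nonNull(s) && !s.trim().isEmpty()) {
            return s.trim();
        }
        return s; // null or blank: returned unchanged
    }

    public static void main(String[] args) {
        System.out.println(normalise("  hello  ")); // hello
        System.out.println(normalise(null));        // null
    }
}
```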
The above question is copied from
https://community.sonarsource.com/t/commons-lang-stringutils-isnotblank-method-still-raise-npe/21517
I am not the OP on the Sonar forum, but I provided my solution there. I had the exact same question, so I am copying the solution over for others who end up here.
We are using the SonarScanner and not the maven sonar plugin for our scans. And for us the issue was that the “sonar.java.libraries” variable was not properly set. I added the target “dependency:copy-dependencies” as part of the maven execution. This copied all the dependencies to the right location, then I set the property “-Dsonar.java.libraries=target/dependency” and everything started working as it's supposed to.

Java OpenCV from Maven

Is there any way to get OpenCV from a repository? Which artifact should I add to pom.xml? Every tutorial I've found is from '14, and it seems like something has changed - they say it isn't in the official Maven repository yet, but I've found this entry:
<!-- https://mvnrepository.com/artifact/nu.pattern/opencv -->
<dependency>
<groupId>nu.pattern</groupId>
<artifactId>opencv</artifactId>
<version>2.4.9-7</version>
</dependency>
Sadly, I get the error
Caused by: java.lang.UnsatisfiedLinkError: no opencv_java249 in java.library.path
when using System.loadLibrary(Core.NATIVE_LIBRARY_NAME). Can I add this library in a way that would make my project include it and 'forget' about manually adding it to the classpath?
Add the following dependency in your POM file:
<dependency>
<groupId>org.openpnp</groupId>
<artifactId>opencv</artifactId>
<version>3.2.0-0</version>
</dependency>
and replace the following line:
System.loadLibrary(Core.NATIVE_LIBRARY_NAME)
with
nu.pattern.OpenCV.loadShared();
This should solve the problem on Windows as well. Happy coding.
This worked for me.
nu.pattern.OpenCV.loadLibrary();
I'm using the following Maven dependency:
<dependency>
<groupId>nu.pattern</groupId>
<artifactId>opencv</artifactId>
<version>2.4.9-4</version>
</dependency>
Try this and see if it works:
nu.pattern.OpenCV.loadShared();
System.loadLibrary(org.opencv.core.Core.NATIVE_LIBRARY_NAME);
More info here in API section: https://github.com/patternconsulting/opencv
I also have the 2.4.9-7 opencv dependency.
There is currently no official way to use the official Java bindings for OpenCV as a Maven dependency (as already mentioned in the comments, the Maven artifact was already requested in #4588, but is still unattended). Nevertheless, there are 3 possible approaches to your problem:
java.lang.UnsatisfiedLinkError was thrown because the binding's binaries (the "opencv_java" native library) need to be installed separately. Most likely, that unofficial artifact does not include them (or not ones compatible with your system). To build the bindings:
git clone the OpenCV repository.
git checkout the intended version (it appears that you are using version 2.4.9, although more recent versions are available)
Follow the instructions here to build OpenCV and its Java bindings, thus yielding a dynamically linked library ("opencv_java249.dll", "libopencv_java249.so", or something else depending on your OS).
Copy the shared library file to your java.library.path (again, this variable is system-dependent, but can be defined when running your application). At this point you should be ready to use that artifact.
An alternative is to use other bindings: the JavaCPP presets for OpenCV seem to work just as nicely as the official ones, and these are registered in maven (binaries for various platforms included!). Just remember that the API may not be exactly the same.
This solution may sound too far out, but it has legitimately worked for me in the past. Basically, you can avoid using the bindings: implement your solution in C++, then either link it with the JVM via JNI or make it a separate application, used by the main application via other mechanisms of your system (process spawning, I/O channels, you name it). For instance, I have once made a service component for feature extraction that other programs would connect to via ZeroMQ sockets.
Just use:
nu.pattern.OpenCV.loadShared();
Write a class with this static void method:
class Test {
    public static void loadOpenCVNativeLibrary() {
        nu.pattern.OpenCV.loadShared();
    }
}
and then call it from your application class (the one with the static main), e.g. in a web application (Spring Boot, for example), like this:
static {
    Test.loadOpenCVNativeLibrary();
}

...

public static void main(String[] args) throws UnknownHostException {
}
All you need:
Install the jar in your local Maven repository with:
mvn install:install-file -Dfile=C:\opencv411\build\java\opencv-411.jar -DgroupId=org -DartifactId=opencv -Dversion=4.1.1 -Dpackaging=jar
create a dependency in pom.xml:
<dependency>
<groupId>org</groupId>
<artifactId>opencv</artifactId>
<version>4.1.1</version>
</dependency>
Now that the jar is available, we still need to add the OpenCV native libraries somehow. I did this by adding the lib folder to java.library.path in the maven-surefire plugin configuration:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
    <configuration>
        <argLine>-Djava.library.path=${project.build.outputDirectory}/lib</argLine>
    </configuration>
</plugin>
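To double-check that the argLine actually reached the forked JVM, a trivial check (nothing beyond the standard library) is to print the property from a test or main method:

```java
public class LibPathCheck {
    public static void main(String[] args) {
        // The JVM searches these directories for native libraries
        // passed to System.loadLibrary.
        String path = System.getProperty("java.library.path");
        System.out.println("java.library.path = " + path);
    }
}
```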
public static void main(String[] args) throws Exception {
    loadLibraries();
    // create and print on screen a 3x3 identity matrix
    System.out.println("Create a 3x3 identity matrix...");
    Mat mat = Mat.eye(3, 3, CvType.CV_8UC1);
    System.out.println("mat = " + mat.dump());
    // prepare to convert a RGB image in gray scale
    String location = "resources/Poli.jpg";
    System.out.print("Convert the image at " + location + " in gray scale... ");
    // get the jpeg image from the internal resource folder
    Mat image = Imgcodecs.imread(location);
    // convert the image in gray scale
    Imgproc.cvtColor(image, image, Imgproc.COLOR_RGB2GRAY);
    // write the new image on disk
    Imgcodecs.imwrite("resources/Poli-gray.jpg", image);
    System.out.println("Done!");
}

private static void loadLibraries() {
    try {
        String osName = System.getProperty("os.name");
        // String opencvpath = System.getProperty("user.dir");
        String opencvpath = "C:\\opencv411\\build\\java\\";
        if (osName.startsWith("Windows")) {
            int bitness = Integer.parseInt(System.getProperty("sun.arch.data.model"));
            if (bitness == 64) {
                opencvpath = opencvpath + "\\x64\\";
            } else {
                opencvpath = opencvpath + "\\x86\\";
            }
        } else if (osName.equals("Mac OS X")) {
            opencvpath = opencvpath + "Your path to .dylib";
        }
        System.out.println(opencvpath);
        System.out.println("Core.NATIVE_LIBRARY_NAME = " + "opencv_java411.dll");
        System.load(opencvpath + "opencv_java411.dll");
    } catch (Exception e) {
        throw new RuntimeException("Failed to load opencv native library", e);
    }
}
For those who want to use OpenCV 3.2 in a macOS environment, you can use the following repository definition:
<repositories>
    <repository>
        <id>kodfarki</id>
        <url>https://raw.githubusercontent.com/kodfarki/repository/master/</url>
    </repository>
</repositories>
There is also an example project in https://github.com/kodfarki/opencv-example.
To use this example project, you still need to install the OpenCV binaries:
brew tap homebrew/science
brew install opencv3 --with-java --with-contrib
For Windows there was a problem with @Sachin Aryal's answer. The answer by @Anirban Chakraborty is a very good hint, but there were still issues at runtime, as described in this thread.
Finally, replacing OpenCV.loadShared() with OpenCV.loadLocally() worked for me.
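That loadShared-then-loadLocally strategy can be expressed as a small try/fall-back wrapper without depending on the OpenCV jar here. loadWithFallback is a hypothetical helper of my own; in real code the two Runnables would be nu.pattern.OpenCV::loadShared and nu.pattern.OpenCV::loadLocally:

```java
public class NativeLoader {
    // Hypothetical wrapper: try the preferred loading strategy first and
    // fall back to an alternative if it throws anything (including Errors
    // such as UnsatisfiedLinkError). Returns which strategy succeeded.
    static String loadWithFallback(Runnable primary, Runnable fallback) {
        try {
            primary.run();
            return "primary";
        } catch (Throwable t) {
            fallback.run();
            return "fallback";
        }
    }

    public static void main(String[] args) {
        // Simulated: the primary loader fails, so the fallback is used.
        String used = loadWithFallback(
                () -> { throw new UnsatisfiedLinkError("simulated"); },
                () -> { /* e.g. nu.pattern.OpenCV.loadLocally() */ });
        System.out.println(used); // fallback
    }
}
```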

Netbeans and Maven - project compiles but cannot find libraries at runtime

I am trying to use an open source tool built on Batik and I am running into trouble with one of the dependencies when I try to build it. Pretty sure this is something to do with classpaths and library locations, but I can't figure out what is happening.
So the project I am working with (SVG2EMF) is using the FreeHep EMF Driver, which in turn uses the FreeHep GraphicsIO project. Because these three have not been playing nicely on my system (Ubuntu 14.04), I've downloaded the source for all three to try and step through the problem.
Everything builds correctly and I can step through the code successfully, but the unit tests on SVG2EMF fail at the point where the EMF Driver makes a call to something from GraphicsIO. The relevant parts of the code in question are here:
import org.freehep.graphicsio.ImageGraphics2D;
import org.freehep.graphicsio.ImageConstants;
// ...snip...
public class AlphaBlend extends EMFTag implements EMFConstants
{
    // ...snip...
    public void write(int tagID, EMFOutputStream emf) throws IOException
    {
        emf.writeRECTL(bounds);
        emf.writeLONG(x);
        emf.writeLONG(y);
        emf.writeLONG(width);
        emf.writeLONG(height);
        dwROP.write(emf);
        emf.writeLONG(xSrc);
        emf.writeLONG(ySrc);
        emf.writeXFORM(transform);
        emf.writeCOLORREF(bkg);
        emf.writeDWORD(usage);
        emf.writeDWORD(size); // bmi follows this record immediately
        emf.writeDWORD(BitmapInfoHeader.size);
        emf.writeDWORD(size + BitmapInfoHeader.size); // bitmap follows bmi
        emf.pushBuffer();
        int encode;
        // plain
        encode = BI_RGB;
        ImageGraphics2D.writeImage(
            (RenderedImage) image,
            ImageConstants.RAW.toLowerCase(),
            ImageGraphics2D.getRAWProperties(bkg, "*BGRA"),
            new NoCloseOutputStream(emf));
        // emf.writeImage(image, bkg, "*BGRA", 1);
        // png
        // encode = BI_PNG;
        // ImageGraphics2D.writeImage(image, "png", new Properties(), new
        // NoCloseOutputStream(emf));
        // jpg
        // encode = BI_JPEG;
        // ImageGraphics2D.writeImage(image, "jpg", new Properties(), new
        // NoCloseOutputStream(emf));
        int length = emf.popBuffer();
        emf.writeDWORD(length);
        emf.writeLONG(image.getWidth());
        emf.writeLONG(image.getHeight());
        BitmapInfoHeader header = new BitmapInfoHeader(image.getWidth(),
            image.getHeight(), 32, encode, length, 0, 0, 0, 0);
        bmi = new BitmapInfo(header);
        bmi.write(emf);
        emf.append();
    }
This throws a NoClassDefFoundError specifically relating to org.freehep.graphicsio.ImageGraphics2D on that writeImage call. When I step through in the debugger, a watch on ImageConstants.RAW has the value of Unknown type "org.freehep.graphicsio.ImageConstants" even though the application built quite happily with those references. Any references to ImageGraphics2D behave in exactly the same way.
The dependency in the SVG2EMF pom.xml looks like this:
<dependencies>
    <!-- some other dependencies -->
    <dependency>
        <groupId>org.freehep</groupId>
        <artifactId>freehep-graphicsio-emf</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
Dependency from the FreeHEP EMF Driver looks like this:
<dependencies>
    <!-- necessary because transitive deps seem to go above inherited deps -->
    <dependency>
        <groupId>org.freehep</groupId>
        <artifactId>freehep-util</artifactId>
        <version>2.0.2</version>
    </dependency>
    <dependency>
        <groupId>org.freehep</groupId>
        <artifactId>freehep-graphicsio</artifactId>
        <version>2.1.1</version>
    </dependency>
    <!-- Other dependencies -->
</dependencies>
Can anybody shed any light on what is actually going on here or what I need to be doing in order to enable this to work?
EDIT: I think I have found where the problem is coming from: way down the stack trace I see a "Caused by: ExceptionInInitializerError", which appears to mark the class as inaccessible from then on. So the dependency does exist, but an exception thrown by the initializer causes the JRE to mark the class as unusable.
Further edit: to solve these problems it can be useful (although it is not mentioned anywhere on the freehep.org website) to know that the project is now hosted on GitHub, so you can find newer versions there. In my case, going straight to the latest version solved the problem.
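The "marked as unusable" behaviour described in the edit is standard JVM class-initialization semantics, and it can be reproduced with nothing but the JDK: the first failed access to a class throws ExceptionInInitializerError, and every later access to the same class throws NoClassDefFoundError. A minimal demonstration (class and member names are illustrative):

```java
public class InitFailureDemo {
    static class Bad {
        // Initialized in the static initializer, so the first access
        // to VALUE triggers class initialization, which fails.
        static final int VALUE = compute();
        static int compute() { throw new RuntimeException("simulated init failure"); }
    }

    public static void main(String[] args) {
        for (int i = 0; i < 2; i++) {
            try {
                System.out.println(Bad.VALUE);
            } catch (Throwable t) {
                System.out.println(t.getClass().getSimpleName());
            }
        }
        // First attempt:  ExceptionInInitializerError
        // Second attempt: NoClassDefFoundError (the class is now unusable)
    }
}
```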

Gradle strange behavior while extending sourceSets with Map variable

We are developing a Java project that is able to instrument (change) class files at build time. We defined a Gradle task that invokes a java based Ant task which takes an inputDir (e.g. build/classes) and an outputDir (e.g. build/classes-instrumented) and possible other parameters. The task gets invoked separately for main and test class files after compilation. Since the "normal" java sourceSet is not a good fit, our first thought was to implement our own sourceSet but couldn't find an easy way. A reasonable alternative, similar to ANTLR etc, seemed to be extra variables. Since I needed several, I went for a Map.
sourceSets.all { ext.instrumentation = [:] }
sourceSets.all {
    instrumentation.inputDir = null
    instrumentation.outputDir = null
    instrumentation.classPath = null
}
def postfix = '-instrumented'
def postfix = '-instrumented'
Below you see how we initialize the variables.
sourceSets {
    main {
        instrumentation.inputDir = sourceSets.main.output.classesDir
        instrumentation.outputDir = instrumentation.inputDir + postfix
        instrumentation.classPath = sourceSets.main.output + configurations.compile
    }
    test {
        instrumentation.inputDir = sourceSets.test.output.classesDir
        instrumentation.outputDir = instrumentation.inputDir + postfix
    }
}
However it fails with "Could not find method main() for arguments [build_f2cvmoa3v4hnjefifhpuk6ira$_run_closure5_closure23@12a14b74] on root project 'Continuations'."
We are using Gradle 2.1
I have the following questions:
Any idea why this fails?
Is the extra variable a reasonable way to approach the problem?
Thanks a lot for your help.
Solution: install the latest version.
I had the same problem: I was reading the Gradle 3 documentation, but Gradle 2.7 was installed.
Checked the Gradle version: 2.7.
Read the Gradle 2.7 docs (https://docs.gradle.org/2.7/userguide/tutorial_java_projects.html#N103CD), but found no info about sourceSets in the Java plugin for that version.
Installed Gradle 3 -> problem solved.
