How come Eclipse compiles this Java code but not Ant?

This compiles fine using Eclipse:
abstract class CollectionView implements Collection<Object> {
    ...
    public Object[] toArray(Object[] o) {
        if (fast) {
            return get(map).toArray(o);
        } else {
            synchronized (map) {
                return get(map).toArray(o);
            }
        }
    }
    ...
}

class KeySet extends CollectionView implements Set<Object> {
    protected Collection<Object> get(Map<Object, Object> map) {
        return map.keySet();
    }
    protected Object iteratorNext(Map.Entry entry) {
        return entry.getKey();
    }
}
but it fails to compile when using Ant:
error: KeySet is not abstract and does not override
abstract method toArray(T[]) in Set
I can see why the code would compile using Eclipse: KeySet already inherits the implementation of the toArray(T[]) method from CollectionView.
But why does it fail when I compile using Ant?
<javac srcdir="src" destdir="bin" debug="on">
    <compilerarg value="-Xlint:unchecked"/>
    <compilerarg value="-Xlint:deprecation"/>
</javac>

First we should note the exact signature of the method expected to be implemented is:
<T> T[] toArray(T[] a);
Both javac and Eclipse warn you about this 'type safety' issue, and if you change the signature to the expected one, javac is happy.
If you put an @Override on the toArray method, even with the signature that uses the raw Object type, both Eclipse and javac correctly see it as an override of the method declared by Collection. So the issue is not there.
The inconsistency, and I think the bug in javac, is that in any subclass implementation javac does not recognize the inherited Object[] toArray(Object[] o) as implementing <T> T[] toArray(T[] a). If it does so for the abstract class, it should also do so for every subclass.
It is not the first time javac has had a bug in this area; see this thread for instance. I have searched the Oracle bug database and found nothing reported about what you have found.
There are two workarounds: in the abstract class, use the expected signature; or do the override 'manually' in the subclass:
public Object[] toArray(Object[] o) {
    return super.toArray(o);
}
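For the first workaround, a minimal sketch of the expected generic signature in the abstract class (assuming the same fast and map members as in the question's excerpt):
public <T> T[] toArray(T[] a) {
    if (fast) {
        return get(map).toArray(a);
    } else {
        synchronized (map) {
            return get(map).toArray(a);
        }
    }
}
With this signature, KeySet inherits a method that both compilers accept as the implementation of Collection's toArray(T[]).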

There are several cases where Eclipse compiles fine and javac doesn't. If you don't mind building with the Eclipse compiler, there are three ways that I know of:
Package the Eclipse pre-compiled classes (hacky, NOT recommended).
Use the Eclipse compiler adapter with Ant. When you set the property build.compiler, all javac tasks from that point on in your Ant build are affected. Set it to "org.eclipse.jdt.core.JDTCompilerAdapter". Note that you will have to include this class (and the classes it depends on) in your Ant build classpath; the most straightforward way is to add the necessary jars to the lib folder of your Ant installation. A sketch is shown after the Maven configuration below.
When building with Maven, configure this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.1</version>
    <configuration>
        <compilerId>eclipse</compilerId>
        <compilerVersion>1.6</compilerVersion>
        <source>1.6</source>
        <target>1.6</target>
        <optimize>true</optimize>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.codehaus.plexus</groupId>
            <artifactId>plexus-compiler-eclipse</artifactId>
            <version>2.2</version>
        </dependency>
    </dependencies>
</plugin>
in the plugins section of the build section of your pom.xml
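For the Ant approach above, a minimal sketch of the property setting (assuming the JDT compiler jars are already on Ant's classpath; the srcdir/destdir values are placeholders):
<property name="build.compiler" value="org.eclipse.jdt.core.JDTCompilerAdapter"/>
<javac srcdir="src" destdir="bin" debug="on"/>
Every javac task after the property definition is then compiled by the Eclipse compiler instead of javac.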

Related

JSON ObjectMapper with javac "-parameters" behaves when run via maven, not via IntelliJ IDEA

As you can probably gather from the title, this is a somewhat complicated issue.
First of all, my goal:
I am trying to achieve conversion of my java classes to and from JSON without having to add any json-specific annotations to them.
My java classes include immutables, which must initialize their members from parameters passed to the constructor, so I have to have multi-parameter constructors that work without @JsonCreator and without @JsonParameter.
I am using the jackson ObjectMapper. If there is another ObjectMapper that I can use that works without the problem described herein, I'd be happy to use it, but it would have to be equally reputable as the jackson ObjectMapper. (So, I am not willing to download Jim's ObjectMapper from his GitHub.)
My understanding as to how this can actually be achieved, in case I am wrong somewhere:
Java used to make method (and constructor) parameter types discoverable via reflection, but not parameter names. That's why the #JsonCreator and #JsonParameter annotations used to be necessary: to tell the json ObjectMapper which constructor parameter corresponds to which property. With Java 8, the compiler will emit method (and constructor) parameter names into the bytecode if you supply the new -parameters argument, and will make them available via reflection, and recent versions of the jackson ObjectMapper support this, so it should now be possible to have json object mapping without any json-specific annotations.
I have this pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>test</groupId>
    <artifactId>test.json</artifactId>
    <version>1.0-SNAPSHOT</version>
    <name>Json Test</name>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <build>
        <sourceDirectory>main</sourceDirectory>
        <testSourceDirectory>test</testSourceDirectory>
        <plugins>
            <plugin>
                <!--<groupId>org.apache.maven.plugins</groupId>-->
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <!--<compilerArgument>-parameters</compilerArgument>-->
                    <!--<fork>true</fork>-->
                    <compilerArgs>
                        <arg>-parameters</arg>
                    </compilerArgs>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <dependency>
            <groupId>com.fasterxml.jackson.jaxrs</groupId>
            <artifactId>jackson-jaxrs-json-provider</artifactId>
            <version>2.7.2</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-parameter-names</artifactId>
            <version>2.7.2</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>
And I use it to compile and run the following little self-contained program:
package jsontest;

import com.fasterxml.jackson.annotation.*;
import com.fasterxml.jackson.databind.*;
import com.fasterxml.jackson.module.paramnames.ParameterNamesModule;

import java.io.IOException;
import java.lang.reflect.*;

public final class MyMain
{
    public static void main( String[] args ) throws IOException, NoSuchMethodException
    {
        Method m = MyMain.class.getMethod("main", String[].class);
        Parameter mp = m.getParameters()[0];
        if( !mp.isNamePresent() || !mp.getName().equals("args") )
            throw new RuntimeException();
        Constructor<MyMain> c = MyMain.class.getConstructor(String.class,String.class);
        Parameter m2p0 = c.getParameters()[0];
        if( !m2p0.isNamePresent() || !m2p0.getName().equals("s1") )
            throw new RuntimeException();
        Parameter m2p1 = c.getParameters()[1];
        if( !m2p1.isNamePresent() || !m2p1.getName().equals("s2") )
            throw new RuntimeException();
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule( new ParameterNamesModule() ); // "-parameters" option must be passed to the java compiler for this to work.
        mapper.configure( DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true );
        mapper.configure( SerializationFeature.ORDER_MAP_ENTRIES_BY_KEYS, true );
        mapper.setSerializationInclusion( JsonInclude.Include.ALWAYS );
        mapper.setVisibility( PropertyAccessor.ALL, JsonAutoDetect.Visibility.PUBLIC_ONLY );
        mapper.enableDefaultTyping( ObjectMapper.DefaultTyping.NON_FINAL, JsonTypeInfo.As.PROPERTY );
        MyMain t = new MyMain( "1", "2" );
        String json = mapper.writeValueAsString( t );
        /*
         * Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: No suitable constructor found for type [simple type, class saganaki.Test]: can not
         * instantiate from JSON object (missing default constructor or creator, or perhaps need to add/enable type information?)
         */
        t = mapper.readValue( json, MyMain.class );
        if( !t.s1.equals( "1" ) || !t.s2.equals( "2" ) )
            throw new RuntimeException();
        System.out.println( "Success!" );
    }

    public final String s1;
    public final String s2;

    public MyMain( String s1, String s2 )
    {
        this.s1 = s1;
        this.s2 = s2;
    }
}
Here is what happens:
If I compile the program using mvn clean compile and then run or debug it from within Idea, it works fine and it displays "Success!".
If I do "Rebuild Project" from within Intellij Idea and then I run/debug, it fails with a JsonMappingException that "No suitable constructor found for simple type jsontest.MyMain".
The strange thing (to me) is, the code before the instantiation of the ObjectMapper checks to make sure that constructor parameter names are present and valid, effectively ensuring that the "-parameters" argument has been successfully passed to the compiler, and these checks always pass!
If I edit my "debug configuration" in Idea and in the "Before launch" section I remove "Make" and I replace it with "Run maven goal" compile then I can successfully run my program from within Idea, but I do not want to do have to do this. (Also, it does not even work very well, I guess I must be doing something wrong: quite often I run and it fails with the same exception as above, and the next time I run it succeeds.)
So, here are my questions:
Why does my program behave differently when compiled by maven than when compiled with Idea?
More specifically: what is ObjectMapper's problem given that my assertions prove that the "-parameters" argument was passed to the compiler, and arguments do have names?
What can I do to make Idea compile my program the same way as maven (at least with respect to the problem at hand) without replacing Idea's "Make"?
Why does it not work consistently when I replace the default "Make" with "Run maven goal" compile in Idea's debug configuration? (What am I doing wrong?)
EDIT
My apologies, the assertions were not necessarily proving anything, since they were not necessarily enabled with -enableassertions. I replaced them with if() throw RuntimeException() to avoid confusion.
As far as I can see in the IntelliJ Community edition sources, IntelliJ is not doing anything with the compilerArgs you're specifying.
In MavenProject.java, there are two places where the compilerArgs are being read:
Element compilerArguments = compilerConfiguration.getChild("compilerArgs");
if (compilerArguments != null) {
    for (Element element : compilerArguments.getChildren()) {
        String arg = element.getValue();
        if ("-proc:none".equals(arg)) {
            return ProcMode.NONE;
        }
        if ("-proc:only".equals(arg)) {
            return ProcMode.ONLY;
        }
    }
}
and
Element compilerArgs = compilerConfig.getChild("compilerArgs");
if (compilerArgs != null) {
    for (Element e : compilerArgs.getChildren()) {
        if (!StringUtil.equals(e.getName(), "arg")) continue;
        String arg = e.getTextTrim();
        addAnnotationProcessorOption(arg, res);
    }
}
The first code block is only looking at the -proc: argument, so this block can be ignored. The second one is passing the values of the arg element (which you are specifying) to the addAnnotationProcessorOption method.
private static void addAnnotationProcessorOption(String compilerArg, Map<String, String> optionsMap) {
    if (compilerArg == null || compilerArg.trim().isEmpty()) return;
    if (compilerArg.startsWith("-A")) {
        int idx = compilerArg.indexOf('=', 3);
        if (idx >= 0) {
            optionsMap.put(compilerArg.substring(2, idx), compilerArg.substring(idx + 1));
        } else {
            optionsMap.put(compilerArg.substring(2), "");
        }
    }
}
This method is only processing arguments which start with -A, which are used to pass options to the annotation processors. Other arguments are ignored.
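In other words, only entries like the hypothetical -A option below would survive the import; -parameters is silently dropped (illustrative sketch, the option name is made up):
<compilerArgs>
    <arg>-Amy.option=value</arg> <!-- picked up as an annotation processor option -->
    <arg>-parameters</arg>       <!-- ignored by IntelliJ's importer -->
</compilerArgs>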
Currently, the only ways to get your sources to run from within IntelliJ are to enable the flag yourself in the "Additional command line parameters" field of the compiler settings (which isn't portable), or to compile with Maven as a before-launch step in your run configuration. You will probably have to file an issue with JetBrains if you want this to work in IntelliJ automatically.

No Warning with Raw Type

I defined a parameterized interface:
import com.google.common.base.Optional;
public interface AbstractResource<S extends Parent> {
    Optional<S> getOption();
    Optional<S> getAbsent();
    Optional<S> getNull();
}
Then, I implemented it as a raw type. Observe that I'm breaking the interface by returning the Optional types Child, Object and Integer for the respective methods.
public class FooResource implements AbstractResource { // Did not add type parameter
    @Override
    public Optional<Child> getOption() {
        Child child = new Child("John");
        return Optional.of(child);
    }

    @Override
    public Optional<Object> getAbsent() {
        return Optional.absent();
    }

    @Override
    public Optional<Integer> getNull() {
        return null;
    }
}
When compiling with the -Xlint:unchecked option, why doesn't the compiler show a warning that FooResource fails to add type parameters? In fact, it compiles successfully.
-Xlint:unchecked is for unchecked conversions (i.e. unchecked casts). It has nothing to do with using raw types. Compile your classes with the option -Xlint instead. You will then get the expected output:
FooResource.java:3: warning: [rawtypes] found raw type: AbstractResource
AbstractResource
^
missing type arguments for generic class AbstractResource<S>
where S is a type-variable:
S extends Parent declared in interface AbstractResource
1 warning
I would suggest using an IDE. Eclipse - for example - shows such warnings out of the box.
I just found out that Java 7 supports more Xlint options than Java 6 does, so the option -Xlint:rawtypes will indeed help here.
If you want to see the warning, you should be using
-Xlint:rawtypes
instead of
-Xlint:unchecked
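To illustrate the difference between the two lints, a minimal sketch (class name made up): the first assignment only trips -Xlint:rawtypes, while the second is an unchecked conversion and trips -Xlint:unchecked.
import java.util.ArrayList;
import java.util.List;

public class LintDemo {
    public static void main(String[] args) {
        List raw = new ArrayList();     // -Xlint:rawtypes warns here (raw List / ArrayList)
        List<String> strings = raw;     // -Xlint:unchecked warns here (unchecked conversion)
        strings.add("ok");
        System.out.println(strings);
    }
}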
For Maven builds, refer to this:
http://maven.apache.org/plugins/maven-compiler-plugin/examples/pass-compiler-arguments.html
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.1</version>
    <configuration>
        <compilerArgument>-Xlint:rawtypes</compilerArgument>
    </configuration>
</plugin>

Using JUnit RunListener in IntelliJ IDEA

I'm working on a project where I need to perform some action before running each JUnit test. This problem was solved using a RunListener added to the JUnit core. The project is built with Maven, so I have these lines in my pom file:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.12</version>
    <configuration>
        <properties>
            <property>
                <name>listener</name>
                <value>cc.redberry.core.GlobalRunListener</value>
            </property>
        </properties>
    </configuration>
</plugin>
So, everything works using:
mvn clean test
But when tests are started using IntelliJ (using its internal test runner) the actions coded in our RunListener are not executed, so it is impossible to perform testing using IntelliJ infrastructure.
As far as I can see, IntelliJ does not parse this configuration from the pom file, so is there a way to explicitly tell IntelliJ to add the RunListener to the JUnit core? Maybe using some VM options in the run configuration?
It is much more convenient to use beautiful IntelliJ testing environment instead of reading maven output.
P.S. The action I need to perform is basically a reset of static environment (some static fields in my classes).
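For context, the listener in question is roughly of the following shape (a hedged sketch; the actual cc.redberry.core.GlobalRunListener source is not shown in the question):
import org.junit.runner.Description;
import org.junit.runner.notification.RunListener;

public class GlobalRunListener extends RunListener {
    @Override
    public void testStarted(Description description) throws Exception {
        // reset the static environment (static fields) before each test runs
    }
}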
I didn't see a way to specify a RunListener in IntelliJ, but another solution would be to write your own custom Runner and annotate your tests with @RunWith().
public class MyRunner extends BlockJUnit4ClassRunner {
    public MyRunner(Class<?> klass) throws InitializationError {
        super(klass);
    }

    @Override
    protected void runChild(final FrameworkMethod method, RunNotifier notifier) {
        // run your code here. example:
        Runner.value = true;
        super.runChild(method, notifier);
    }
}
Sample static variable:
public class Runner {
    public static boolean value = false;
}
Then run your tests like this:
@RunWith(MyRunner.class)
public class MyRunnerTest {
    @Test
    public void testRunChild() {
        Assert.assertTrue(Runner.value);
    }
}
This will allow you to do your static initialization without a RunListener.

Java generics compile in Eclipse, but not in javac

I was surprised to discover that I have Java code in my project which compiles and runs fine in Eclipse, but throws a compilation error in javac.
A self-contained snippet:
import java.util.HashSet;
import java.util.Set;

public class Main {
    public static void main(String[] args) {
        Set<Integer> setOfInts = new HashSet<Integer>();
        Set<Object> setOfObjects = covariantSet(setOfInts);
    }

    public static <S, T extends S> Set<S> covariantSet(Set<T> set) {
        return new HashSet<S>(set);
    }
}
Compilation in javac returns:
Main.java:10: incompatible types
found : java.util.Set<java.lang.Integer>
required: java.util.Set<java.lang.Object>
Set<Object> setOfObjects = covariantSet(setOfInts);
^
This error now prevents the project from building in Maven. As the Eclipse compiler is built to be more tolerant, do I now have to assume that the definition and usage of a static method like the one above is not valid Java?
It seems that Sun's 1.6 JDK can't infer the correct type. The following seems to work on my machine:
Set<Object> setOfObjects = Main.<Object, Integer>covariantSet(setOfInts);
Note that you must invoke the static method prefixed with the class name
You are right. This problem indeed exists. Eclipse does not use javac. It uses its own compiler.
Actually javac is "right". Generics are implemented by erasure: type S is not included in your byte code, so the JVM does not have enough information about the return type at runtime. To solve the problem, change the method prototype as follows:
public static <S, T extends S> Set<S> covariantSet(Set<T> set, Class<S> returnType)
Now the return type is passed to the method at runtime and compiler should not complain.
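A sketch of that change together with the matching call site (names carried over from the question):
public static <S, T extends S> Set<S> covariantSet(Set<T> set, Class<S> returnType) {
    return new HashSet<S>(set);
}

// The Class<S> argument pins S to Object explicitly:
Set<Object> setOfObjects = covariantSet(setOfInts, Object.class);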
In your Maven build script you have set the compiler version.
In Ant it looks like this:
<property name="source.version" value="1.5" />
Search for 1.3 or 1.4, or for "compile", to find that value in the Maven scripts.
With the value 1.5 the compiler will accept the generics (see your error messages).
I know it's an old question, but I want to mention that the function could be written as:
import java.util.HashSet;
import java.util.Set;

public class Main {
    public static void main(String[] args) {
        Set<Integer> setOfInts = new HashSet<Integer>();
        Set<Object> setOfObjects = covariantSet(setOfInts);
    }

    public static <S> Set<S> covariantSet(Set<? extends S> set) {
        return new HashSet<S>(set);
    }
}
It's a little bit cleaner, and you can use the function exactly how you intended to (with implicit generic typing).
Add the following plugin to your pom.xml:
<plugin>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>2.3.2</version>
    <configuration>
        <source>1.6</source>
        <target>1.6</target>
    </configuration>
</plugin>

Maven Codehaus findbugs plugin "onlyAnalyze" option not working as expected

Update for the impatient: it's simple, use package.- for sub-package scanning instead of package.*, as per martoe's answer below!
I cannot seem to get onlyAnalyze working for my multi-module project: regardless of what package (or pattern) I set, maven-findbugs-plugin doesn't evaluate sub-packages as I'd expect from passing it packagename.*.
To prove either myself or the plugin at fault (though I always assume it's the former!), I setup a small Maven project with the following structure:
pom.xml
src/
main/java/acme/App.java
main/java/acme/moo/App.java
main/java/no_detect/App.java
which is very simple!
The POM has the following findbugs configuration:
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>findbugs-maven-plugin</artifactId>
            <version>2.4.0</version>
            <executions>
                <execution>
                    <phase>verify</phase>
                    <goals><goal>findbugs</goal><goal>check</goal></goals>
                </execution>
            </executions>
            <configuration>
                <debug>true</debug>
                <effort>Max</effort>
                <threshold>Low</threshold>
                <onlyAnalyze>acme.*</onlyAnalyze>
            </configuration>
        </plugin>
    </plugins>
</build>
and every App.java has the following code with two obvious violations:
package acme;

import java.io.Serializable;

public class App implements Serializable
{
    private static final class NotSer {
        private String meh = "meh";
    }

    private static final NotSer ns = new NotSer(); // Violation: not serializable field

    public static void main( String[] args )
    {
        ns.meh = "hehehe"; // Violation: unused
        System.out.println( "Hello World!" );
    }
}
Note that no_detect.App has the same content as above, but my expectation is that it wouldn't be evaluated by findbugs because I have the "onlyAnalyze" option set to acme.* which I assume would evaluate acme.App and acme.moo.App and nothing else.
I now execute mvn clean install to clean, build, test, run findbugs, package, and install, which produces the following findbugs report (snipped for brevity) and results in a build failure. That is expected, because acme.App and acme.moo.App contain violations:
<BugInstance category='BAD_PRACTICE' type='SE_NO_SERIALVERSIONID' instanceOccurrenceMax='0'>
<ShortMessage>Class is Serializable, but doesn't define serialVersionUID</ShortMessage>
<LongMessage>acme.App is Serializable; consider declaring a serialVersionUID</LongMessage>
<Details>
<p> This field is never read.&nbsp; Consider removing it from the class.</p>
</Details>
<BugPattern category='BAD_PRACTICE' abbrev='SnVI' type='SE_NO_SERIALVERSIONID'><ShortDescription>Class is Serializable, but doesn't define serialVersionUID</ShortDescription><Details>
<BugCode abbrev='UrF'><Description>Unread field</Description></BugCode><BugCode abbrev='SnVI'><Description>Serializable class with no Version ID</Description></BugCode>
To summarise: only acme.App is analysed, acme.moo.App isn't (bad) and neither is no_detect.App (good).
I tried with two wildcards in the onlyAnalyze option but that produces a successful build but with a findbugs error (Dangling meta character '*' etc).
I tried with onlyAnalyze set to acme.*,acme.moo.* which analyzes all the expected classes (acme.App and acme.moo.App) which means it "works" but not as I expect; i.e. I have to explicitly declare all parent-packages for the classes I want to analyze: that could get large and difficult to maintain on a multi-module project!
Do I have to define every package I want analyzed, or can I declare a wildcard/regex pattern that will do what I want?
I'd rather not use the inclusion/exclusion XML because that requires far more setup and reasoning that I don't currently have time for...
To cite the Findbugs manual: "Replace .* with .- to also analyze all subpackages"
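Applied to the configuration above, that would be (sketch):
<onlyAnalyze>acme.-</onlyAnalyze>
This analyzes acme.App, acme.moo.App, and any deeper acme sub-packages, while still skipping no_detect.App.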
