I'm using the Groovy Spreadsheet Builder within one of my Grails projects to export some data as an Excel file.
Everything works fine until I create a runnable jar (using gradle assemble) and run the application from that jar.
I'm using the builder within a service like this:
class ExcelService {
...
void export(OutputStream outputStream) {
...
PoiSpreadsheetBuilder.create(outputStream).build {
apply ExcelStylesheet
...
}
}
...
}
When I try to export my data from the app started from the generated jar, I get the following MissingMethodException:
groovy.lang.MissingMethodException: No signature of method: my.package.ExcelService.apply() is applicable for argument types: (java.lang.Class)
The (Java) interface of SpreadsheetBuilder looks like this:
public interface SpreadsheetBuilder {
    void build(
        @DelegatesTo(strategy = Closure.DELEGATE_FIRST, value = WorkbookDefinition.class)
        @ClosureParams(value = FromString.class, options = "builders.dsl.spreadsheet.builder.api.WorkbookDefinition")
        Configurer<WorkbookDefinition> workbookDefinition);
}
While debugging both the normal execution and the jar, I found the difference while stepping through invokeMethod() of ClosureMetaClass.
When closure.getResolveStrategy() is called in the working version, Closure.DELEGATE_FIRST is returned. When debugging the jar, the result is 0, so the MissingMethodException is thrown later due to the wrong resolve strategy.
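For reference, 0 is Closure.OWNER_FIRST, the default strategy: an unknown method such as apply() is then looked up on the closure's owner (here the ExcelService) before the delegate, which matches the exception above. The difference can be illustrated in plain Java against the groovy.lang.Closure API; this is a sketch of the dispatch behavior, not the builder's actual code:
import groovy.lang.Closure;

class ResolveStrategyDemo {

    // Force the dispatch that @DelegatesTo is supposed to convey:
    // look up unknown methods on the delegate first, then on the owner.
    static void invokeWithDelegateFirst(Closure<?> body, Object delegate) {
        body.setDelegate(delegate);                      // e.g. the WorkbookDefinition
        body.setResolveStrategy(Closure.DELEGATE_FIRST); // default is OWNER_FIRST (0)
        body.call();
    }
}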
For now I have no idea how to solve this problem.
What is/could be the reason for this behavior?
What can I do to solve this issue?
I'm using Grails 3.3.8 with Java OpenJDK 1.8.0_192.
If you don't need to support JDK 7, you could upgrade to Groovy Spreadsheet Builder 2.0.0.RC1, which requires JDK 8 but appears to solve the problem.
@ClosureParams and @DelegatesTo are applicable to parameters of type groovy.lang.Closure. In this case, they have been applied to a parameter of type Configurer<WorkbookDefinition>.
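For illustration, here is roughly what the signature would look like with the annotations on a Closure-typed parameter, which is the parameter type they are specified for. This is a sketch only, not the library's actual API:
import builders.dsl.spreadsheet.builder.api.WorkbookDefinition;

import groovy.lang.Closure;
import groovy.lang.DelegatesTo;
import groovy.transform.stc.ClosureParams;
import groovy.transform.stc.FromString;

public interface SpreadsheetBuilder {

    // Same contract as above, but expressed on a Closure parameter,
    // the type @DelegatesTo and @ClosureParams are designed for.
    void build(
        @DelegatesTo(strategy = Closure.DELEGATE_FIRST, value = WorkbookDefinition.class)
        @ClosureParams(value = FromString.class,
                       options = "builders.dsl.spreadsheet.builder.api.WorkbookDefinition")
        Closure<?> workbookDefinition);
}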
Related
I'm trying to solve a problem with my Java project, and one of the possible solutions is to change jdk.io.File.enableADS to TRUE in the system properties. But I don't know how to change it.
I'm working on a project that uses JHipster and Undertow. My project builds with no errors, generating the connection link, but when I try to open the page it doesn't load and the application shows the error:
java.lang.NoClassDefFoundError: Could not initialize class org.xnio.conduits.Conduit
I've looked at the code and found the line that throws the error, and I saw many blog posts telling people to change the config mentioned above.
I'm using JDK 11.0.15.
This is the code that throws the error:
try {
if (osName.contains("windows")) {
return new FileOutputStream("NUL:").getChannel();
} else {
return new FileOutputStream("/dev/null").getChannel();
}
} catch (FileNotFoundException e) {
throw new IOError(e);
}
It seems you're running into this problem: Java: FileOutputStream("NUL:") not working after Java upgrade
The code you're referencing that is causing the problem is from https://code.yawk.at/org.jboss.xnio/xnio-api/3.8.4.Final/org/xnio/conduits/Conduits.java
static {
NULL_FILE_CHANNEL = AccessController.doPrivileged(new PrivilegedAction<FileChannel>() {
public FileChannel run() {
final String osName = System.getProperty("os.name", "unknown").toLowerCase(Locale.US);
try {
if (osName.contains("windows")) {
return new FileOutputStream("NUL:").getChannel();
} else {
return new FileOutputStream("/dev/null").getChannel();
}
} catch (FileNotFoundException e) {
throw new IOError(e);
}
}
});
}
I've looked at the sources for this release (3.8.4) and the most recent release on Maven Central (3.8.7): https://mvnrepository.com/artifact/org.jboss.xnio/xnio-api/3.8.7.Final
And there is good news: it has been fixed in the latest release of xnio-api. The corresponding code in version 3.8.7 is now as follows ("NUL:" was replaced by "NUL"):
if (osName.contains("windows")) {
return new FileOutputStream("NUL").getChannel();
} else {
return new FileOutputStream("/dev/null").getChannel();
}
So if it's possible, I would suggest you try to upgrade your dependency so that xnio-api-3.8.7.Final.jar is used.
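If you want to verify the behavior on your own JDK first, here is a small standalone check. It is my own sketch, not part of xnio, and it is Windows-only, since NUL is a Windows device name:
import java.io.FileOutputStream;
import java.io.IOException;

public class NulCheck {
    public static void main(String[] args) {
        // On an affected JDK (such as the 11.0.15 from the question),
        // "NUL:" fails with a FileNotFoundException while "NUL" works.
        for (String name : new String[] {"NUL:", "NUL"}) {
            try (FileOutputStream out = new FileOutputStream(name)) {
                System.out.println(name + " -> opened OK");
            } catch (IOException e) {
                System.out.println(name + " -> " + e);
            }
        }
    }
}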
Update 09-2022
Thanks to the comment of @NicolasRiousset, who found the following issue logged on JHipster's GitHub, I have traced this problematic dependency further.
It starts with the optional dependency on spring-boot-starter-undertow, which in turn depends on undertow-core, which depends on xnio-api from JBoss.
The earliest version that includes the fix (xnio-api-3.8.7) can be found in undertow-core 2.2.18.Final. The earliest version that uses this is spring-boot-starter-undertow 2.7.1.
And JHipster starts including spring-boot version >= 2.7.1 from jhipster-framework 7.9.0.
So upgrading to JHipster >= 7.9.0 should fix this problematic xnio-api dependency.
In case you don't strictly need Undertow as your embedded web server, you can also switch (back) to spring-boot-starter-tomcat, since Tomcat doesn't use the xnio-api.
Otherwise the mentioned system property should indeed also be a valid workaround for now. Since the code runs in a static initializer, I think you'll have to pass the property as a VM argument instead of calling System.setProperty from your program.
So add -Djdk.io.File.enableADS=true to your program's startup command line.
See for reference the following question that was mentioned in the comments: How to set system property?
It also seems to be fixed in newer versions of the Java runtime, so upgrading your JDK/JRE is an option too. According to the question linked at the top it's fixed in Java 8u333, and on my test system with Java 17.0.2 both spellings of NUL also work.
I ran into this same issue. I closed and reopened IntelliJ. Now it works. Worth trying before getting too deep down the rabbit hole.
I am unable to use the Swing library with my scala-sdk-2.12.4.
I am using Java 9.
When I try to run the program:
package rs.ac.bg.etf.zd173013m.gui
import swing._
object HelloWorld extends SimpleSwingApplication {
def top = new MainFrame {
title = "First Swing App"
contents = new Button {
text = "Click me"
}
}
}
I get the following error:
Exception in thread "main" java.lang.IncompatibleClassChangeError: Method scala.swing.Reactor.$init$()V must be InterfaceMethodref constant
at scala.swing.SwingApplication.<init>(SwingApplication.scala:4)
at scala.swing.SimpleSwingApplication.<init>(SimpleSwingApplication.scala:13)
at rs.ac.bg.etf.zd173013m.gui.HelloWorld$.<init>(Application.scala:5)
at rs.ac.bg.etf.zd173013m.gui.HelloWorld$.<clinit>(Application.scala)
at rs.ac.bg.etf.zd173013m.gui.HelloWorld.main(Application.scala)
You have incompatible JAR versions on your classpath. The code in the JAR containing SwingApplication was compiled against a different version of Reactor than the one on your classpath. (This particular error is typical of mixing a library built for one Scala major version, e.g. 2.11, with the 2.12 standard library, because 2.12 changed how traits are compiled.)
What are you using to manage your dependencies? I guess that you are downloading them manually.
Switch to a dependency management system like Gradle and this problem should go away, as it will ensure that all your dependencies are consistent.
I'm using Krextor to convert XML to RDF. It runs fine from the command line.
I'm trying to run it from Java (Eclipse) using this code:
private static void XMLToRDF() throws KrextorException, ValidityException, ParsingException, IOException, XSLException{
Element root = new Element("person");
Attribute friend = new Attribute("friends", "http://van-houten.name/milhouse");
root.addAttribute(friend);
Element name = new Element("name");
name.appendChild("Bart Simpson");
root.appendChild(name);
nu.xom.Document inputDocument = new nu.xom.Document(root);
System.out.println(inputDocument.toXML());
Element root1 = inputDocument.getRootElement();
System.out.println(root1);
Krextor k = new Krextor();
nu.xom.Document outputDocument = k.extract("socialnetwork","turtle",inputDocument);
System.out.println(outputDocument.toString());
}
I have the following problem:
Exception in thread "main" java.lang.NoClassDefFoundError: net/sf/saxon/CollectionURIResolver
Caused by: java.lang.ClassNotFoundException: net.sf.saxon.CollectionURIResolver
I have included Saxon9he on the classpath, and I have also added it manually as a library in the project, but the error is the same.
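One way to narrow this down is to check at runtime whether the class named in the stack trace is visible at all, and if so, which jar it is loaded from. A quick diagnostic sketch (my own, not Krextor code):
// Is the class from the NoClassDefFoundError on the classpath, and where
// does it come from? Newer Saxon releases moved or removed
// net.sf.saxon.CollectionURIResolver, which would explain the error.
try {
    Class<?> c = Class.forName("net.sf.saxon.CollectionURIResolver");
    java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
    System.out.println("Loaded from: " + (src != null ? src.getLocation() : "unknown"));
} catch (ClassNotFoundException e) {
    System.out.println("Not on the classpath: " + e.getMessage());
}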
I am the main developer of Krextor and, @Michael Kay, actually a colleague of Grangel, so I will resolve the concrete problem with him locally.
Indeed, the last Saxon version with which I did serious testing was 9.1; after that I haven't used Krextor integrated into Java but have mainly used it from the command line.
@Grangel, could you please file an issue for Krextor, and then we can work on fixing it together?
Indeed, @Michael Kay, for a while I had been including more recent Saxon versions with Krextor and updated the command line wrapper to use them (such as by adding different JARs to the classpath), but I have not necessarily updated the Java wrapper code.
I'm trying to write a UDF for Hadoop Hive that parses user agents. The following code works fine on my local machine, but on Hadoop I'm getting:
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String MyUDF.evaluate(java.lang.String) throws org.apache.hadoop.hive.ql.metadata.HiveException on object MyUDF@64ca8bfb of class MyUDF with arguments {All Occupations:java.lang.String} of size 1
Code:
import java.io.IOException;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.*;
import com.decibel.uasparser.OnlineUpdater;
import com.decibel.uasparser.UASparser;
import com.decibel.uasparser.UserAgentInfo;
public class MyUDF extends UDF {
public String evaluate(String i) {
UASparser parser = null;
parser = new UASparser();
String key = "";
OnlineUpdater update = new OnlineUpdater(parser, key);
UserAgentInfo info = null;
info = parser.parse(i);
return info.getDeviceType();
}
}
Some facts that I should mention:
I'm compiling with Eclipse using "Export runnable JAR file" with the "Extract required libraries into generated JAR" option.
I'm uploading this "fat jar" file with Hue
A minimal working example I managed to run:
public String evaluate(String i) {
return "hello" + i.toString()";
}
I guess the problem lies somewhere around that library (downloaded from https://udger.com) I'm using, but I have no idea where.
Any suggestions?
Thanks, Michal
It could be a few things. The best thing is to check the logs, but here's a list of a few quick things you can check in a minute.
The jar does not contain all dependencies. I am not sure how Eclipse builds a runnable jar, but it may not include all dependencies. You can run
jar tf your-udf-jar.jar
to see what was included. You should see stuff from com.decibel.uasparser. If not, you have to build the jar with the appropriate dependencies (usually you do that using Maven).
Different version of the JVM. If you compile with JDK 8 and the cluster runs JDK 7, it would also fail.
Hive version. Sometimes the Hive APIs change slightly, enough to be incompatible. Probably not the case here, but make sure to compile the UDF against the same versions of Hadoop and Hive that you have in the cluster.
You should always check whether info is null after the call to parse().
It looks like the library uses a key, meaning that it actually gets its data from an online service (udger.com), so it may not work without a valid key. Even more importantly, the library updates itself online, contacting the online service for each record. Looking at the code, this means it will create one update thread per record. You should change the code to do that only once, when the UDF is constructed.
Here's how to change it:
public class MyUDF extends UDF {
UASparser parser = new UASparser();
public MyUDF() {
super();
String key = "PUT YOUR KEY HERE";
// update only once, when the UDF is instantiated
OnlineUpdater update = new OnlineUpdater(parser, key);
}
public String evaluate(String i) {
UserAgentInfo info = parser.parse(i);
if(info!=null) return info.getDeviceType();
// you want it to return null if it's unparseable
// otherwise one bad record will stop your processing
// with an exception
else return null;
}
}
But to know for sure, you have to look at the logs: the YARN logs (e.g. via yarn logs -applicationId <your application id>), but also the Hive logs on the machine you're submitting the job from (probably in /var/log/hive, but it depends on your installation).
Such a problem can probably be solved with the following steps (a code sketch for step 1 follows after the list):
1. Override the method UDF.getRequiredJars() so that it returns a list of HDFS file paths whose values are determined by where you put the xxx_lib folder from step 2 into HDFS. Note that the list must contain each jar's full HDFS path string, such as hdfs://yourcluster/some_path/xxx_lib/some.jar.
2. Export your UDF code using the "Runnable JAR file" export wizard (choose "Copy required libraries into a sub-folder next to the generated JAR"). This step results in an xxx.jar and a lib folder xxx_lib next to it.
3. Put xxx.jar and the xxx_lib folder into HDFS according to your code from step 1.
4. Create the UDF using: add jar ${the-xxx.jar-hdfs-path}; create function your-function as ${qualified name of udf class};
Try it. I tested this and it works.
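Here is a sketch of what step 1 could look like. The HDFS paths and jar names are placeholders, and getRequiredJars() is the hook Hive uses to ship extra jars to the task nodes:
import org.apache.hadoop.hive.ql.exec.UDF;

public class MyUDF extends UDF {

    // Step 1: return the full HDFS path of every jar in xxx_lib so Hive
    // distributes them to the task nodes (these paths are placeholders).
    @Override
    public String[] getRequiredJars() {
        return new String[] {
            "hdfs://yourcluster/some_path/xxx_lib/uasparser.jar",
            "hdfs://yourcluster/some_path/xxx_lib/another-dependency.jar"
        };
    }

    public String evaluate(String userAgent) {
        // ... same parsing logic as in the answer above ...
        return userAgent;
    }
}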
We are developing a Java project that is able to instrument (change) class files at build time. We defined a Gradle task that invokes a Java-based Ant task which takes an inputDir (e.g. build/classes) and an outputDir (e.g. build/classes-instrumented) and possibly other parameters. The task gets invoked separately for main and test class files after compilation. Since the "normal" java sourceSet is not a good fit, our first thought was to implement our own sourceSet, but we couldn't find an easy way to do that. A reasonable alternative, similar to ANTLR etc., seemed to be extra variables. Since I needed several, I went for a Map.
sourceSets.all { ext.instrumentation = [:] }
sourceSets.all {
instrumentation.inputDir = null
instrumentation.outputDir = null
instrumentation.classPath = null
}
def postfix = '-instrumented'
Below you see how we initialize the variables.
sourceSets {
main {
instrumentation.inputDir = sourceSets.main.output.classesDir
instrumentation.outputDir = instrumentation.inputDir + postfix
instrumentation.classPath = sourceSets.main.output + configurations.compile
}
test {
instrumentation.inputDir = sourceSets.test.output.classesDir
instrumentation.outputDir = instrumentation.inputDir + postfix
}
}
However, it fails with "Could not find method main() for arguments [build_f2cvmoa3v4hnjefifhpuk6ira$_run_closure5_closure23@12a14b74] on root project 'Continuations'."
We are using Gradle 2.1
I have the following questions:
Any idea why the first one fails?
Are extra variables a reasonable way to approach the problem?
Thanks a lot for your help
Solution: install the latest version.
I had the same problem: I was reading the documentation for Gradle 3, but Gradle 2.7 was installed.
I checked the Gradle version: 2.7.
Then I read the Gradle 2.7 docs https://docs.gradle.org/2.7/userguide/tutorial_java_projects.html#N103CD , but found no info about sourceSets in the java plugin for that version.
I installed Gradle 3 --> problem solved.