Java NoClassDefFoundError amf - java

I'm trying to compile a Java snippet into a jar file. I ran into a classic Java runtime exception, but I'm unable to solve the problem.
This is the code, borrowed from Markus Wulftange:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Arrays;
import flex.messaging.io.SerializationContext;
import flex.messaging.io.amf.ActionContext;
import flex.messaging.io.amf.ActionMessage;
import flex.messaging.io.amf.AmfMessageDeserializer;
import flex.messaging.io.amf.AmfMessageSerializer;
import flex.messaging.io.amf.MessageBody;

public class Amf3ExternalizableUnicastRef {

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        if (args.length < 2 || (args.length == 3 && !args[0].equals("-d"))) {
            System.err.println("usage: java -jar " + Amf3ExternalizableUnicastRef.class.getSimpleName() + ".jar [-d] <host> <port>");
            return;
        }
        boolean doDeserialize = false;
        if (args.length == 3) {
            doDeserialize = true;
            args = Arrays.copyOfRange(args, 1, args.length);
        }
        // generate the UnicastRef object
        Object unicastRef = generateUnicastRef(args[0], Integer.parseInt(args[1]));
        // serialize object to AMF message
        byte[] amf = serialize(unicastRef);
        // deserialize AMF message
        if (doDeserialize) {
            deserialize(amf);
        } else {
            System.out.write(amf);
        }
    }

    public static Object generateUnicastRef(String host, int port) {
        java.rmi.server.ObjID objId = new java.rmi.server.ObjID();
        sun.rmi.transport.tcp.TCPEndpoint endpoint = new sun.rmi.transport.tcp.TCPEndpoint(host, port);
        sun.rmi.transport.LiveRef liveRef = new sun.rmi.transport.LiveRef(objId, endpoint, false);
        return new sun.rmi.server.UnicastRef(liveRef);
    }

    public static byte[] serialize(Object data) throws IOException {
        MessageBody body = new MessageBody();
        body.setData(data);
        ActionMessage message = new ActionMessage();
        message.addBody(body);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        AmfMessageSerializer serializer = new AmfMessageSerializer();
        serializer.initialize(SerializationContext.getSerializationContext(), out, null);
        serializer.writeMessage(message);
        return out.toByteArray();
    }

    public static void deserialize(byte[] amf) throws ClassNotFoundException, IOException {
        ByteArrayInputStream in = new ByteArrayInputStream(amf);
        AmfMessageDeserializer deserializer = new AmfMessageDeserializer();
        deserializer.initialize(SerializationContext.getSerializationContext(), in, null);
        deserializer.readMessage(new ActionMessage(), new ActionContext());
    }
}
Using the flex-messaging-core jar, located in the same directory, I compiled the source with
javac -cp flex...jar sourcefile.java.
Then I packaged it into a jar with
jar -cfm myjar.jar MANIFEST.ML myclass.class.
But then, when running it from the shell with the proper arguments,
java -jar myjar.jar -d 127.0.0.1 8000
it throws Exception in thread "main" java.lang.NoClassDefFoundError: flex/messaging/io/amf/MessageBody.
I have googled and tried every solution I could find for two days, but I really can't solve the problem by myself. Can I kindly ask for a little help?

Shouldn't it be MANIFEST.MF, not MANIFEST.ML?
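One thing to keep in mind beyond the manifest name: java -jar takes its classpath only from the jar's manifest and ignores -cp, so the flex classes also have to be reachable at runtime. A minimal manifest sketch that covers both (assuming the dependency jar is actually named flex-messaging-core.jar and sits next to myjar.jar):

Manifest-Version: 1.0
Main-Class: Amf3ExternalizableUnicastRef
Class-Path: flex-messaging-core.jar

Alternatively, skip -jar and put both jars on the classpath explicitly, e.g. java -cp "myjar.jar:flex-messaging-core.jar" Amf3ExternalizableUnicastRef -d 127.0.0.1 8000 (use ; instead of : on Windows).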

Related

Dependencies for Protoc Java plugin?

I'm trying to make a working example from How to write a custom Protobuf CodeGenerator in Java. When I try to compile a file with
import com.google.protobuf.compiler.PluginProtos;
import java.io.IOException;

public class MyPlugin {
    public static void main(String[] args) throws IOException {
        CodeGenerator gen = new CodeGenerator();
        PluginProtos.CodeGeneratorRequest codeGeneratorRequest = PluginProtos.CodeGeneratorRequest.parseFrom(System.in);
        codeGeneratorRequest.getProtoFileList().forEach(gen::handleFile);
        // get the response and do something with it
        //PluginProtos.CodeGeneratorResponse response = PluginProtos.CodeGeneratorResponse.newBuilder().build();
        //response.writeTo(System.out);
    }
}
I get a compile error because CodeGenerator is unknown. In my pom.xml Maven file I have the following inside the "dependencies" tag:
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>3.6.1</version>
</dependency>
What else do I need to add to the dependencies to make my plugin work? I plan to use the Java API from https://developers.google.com/protocol-buffers/docs/reference/java/ .
Yeah, that was a bit of a silly mistake - CodeGenerator is a custom class that we need to write ourselves; the name is confusing because it implies it comes from the Google library.
So after writing it, the plugin could look like this - a rough analogue of the Python code at https://www.expobrain.net/2015/09/13/create-a-plugin-for-google-protocol-buffer/ , but without packaging into JSON and cleaning up subfields:
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.compiler.PluginProtos;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ProtocPlugin {

    private static List _traverse(String strPackage, List items) {
        List<List> res = new ArrayList<>();
        for (Object item : items) {
            res.add(Arrays.asList(item, strPackage));
            if (item instanceof DescriptorProtos.DescriptorProto) {
                DescriptorProtos.DescriptorProto dp = (DescriptorProtos.DescriptorProto) item;
                for (DescriptorProtos.EnumDescriptorProto e : dp.getEnumTypeList()) {
                    res.add(Arrays.asList(e, strPackage));
                }
                for (DescriptorProtos.DescriptorProto nested : dp.getNestedTypeList()) {
                    String strNestedPackage = strPackage + nested.getName();
                    for (Object nestedItem : _traverse(strNestedPackage, nested.getNestedTypeList())) {
                        res.add(Arrays.asList(((List) nestedItem).get(0), strNestedPackage));
                    }
                }
            }
        }
        return res;
    }

    public static void main(String[] args) throws IOException {
        StringBuilder data = new StringBuilder();
        PluginProtos.CodeGeneratorRequest codeGeneratorRequest = PluginProtos.CodeGeneratorRequest.parseFrom(System.in);
        codeGeneratorRequest.getProtoFileList().forEach((DescriptorProtos.FileDescriptorProto fileDescriptorProto) -> {
            String strPackage = fileDescriptorProto.getPackage();
            if (strPackage == null || strPackage.isEmpty()) {
                strPackage = "<root>";
            }
            data.append("package: ").append(strPackage).append("\n");
            data.append("filename: ").append(fileDescriptorProto.getName()).append("\n");
            List<DescriptorProtos.EnumDescriptorProto> enums = fileDescriptorProto.getEnumTypeList();
            for (Object pair : _traverse(strPackage, enums)) {
                data.append("type: enum").append("\n");
                data.append(((List) pair).get(0)).append(((List) pair).get(1)).append(" ");
            }
            List<DescriptorProtos.DescriptorProto> messageTypes = fileDescriptorProto.getMessageTypeList();
            for (Object pair : _traverse(strPackage, messageTypes)) {
                data.append("type: message").append("\n");
                data.append(((List) pair).get(0)).append(((List) pair).get(1)).append(" ");
            }
        });
        PluginProtos.CodeGeneratorResponse.Builder builder = PluginProtos.CodeGeneratorResponse.newBuilder();
        builder.addFileBuilder().setContent(data.toString()).setName("mytest.txt");
        PluginProtos.CodeGeneratorResponse response = builder.build();
        response.writeTo(System.out);
    }
}
protoc could then be launched with
protoc --plugin=protoc-gen-custom=my-plugin.bat --custom_out=. hello.proto
where my-plugin.bat contains something like
@echo off
java -cp target/classes;c:/users/bover/.m2/repository/com/google/protobuf/protobuf-java/3.6.1/protobuf-java-3.6.1.jar ProtocPlugin
Here we assume that our Java plugin has been compiled to ProtocPlugin.class in the target/classes directory. The output will be written to the mytest.txt file. hello.proto is a simple proto file from the Python example above.

Pcap4j Library: Exception in thread "main" java.io.IOException: No NIF to capture

When I capture packets, I get the following error:
Exception in thread "main" java.io.IOException: No NIF to capture.
at org.pcap4j.util.NifSelector.selectNetworkInterface(NifSelector.java:44)
at io.bigdatalabs.pcaptest.App.main(App.java:22)
The code is:
package io.bigdatalabs.pcaptest;

import java.io.IOException;
import org.pcap4j.core.BpfProgram.BpfCompileMode;
import org.pcap4j.core.NotOpenException;
import org.pcap4j.core.PacketListener;
import org.pcap4j.core.PcapHandle;
import org.pcap4j.core.PcapNativeException;
import org.pcap4j.core.PcapNetworkInterface;
import org.pcap4j.core.PcapNetworkInterface.PromiscuousMode;
import org.pcap4j.packet.Packet;
import org.pcap4j.util.NifSelector;

public class App {

    public static void main(String[] args) throws PcapNativeException, IOException, NotOpenException, InterruptedException {
        String filter = null;
        if (args.length != 0) {
            filter = args[0];
        }

        PcapNetworkInterface nif = new NifSelector().selectNetworkInterface();
        if (nif == null) {
            System.exit(1);
        }

        final PcapHandle handle = nif.openLive(65536, PromiscuousMode.PROMISCUOUS, 10);
        if (filter != null && filter.length() != 0) {
            handle.setFilter(filter, BpfCompileMode.OPTIMIZE);
        }

        PacketListener listener = new PacketListener() {
            @Override
            public void gotPacket(Packet packet) {
                printPacket(packet, handle);
            }
        };
        handle.loop(5, listener);
    }

    private static void printPacket(Packet packet, PcapHandle ph) {
        StringBuilder sb = new StringBuilder();
        sb.append("A packet captured at ")
          .append(ph.getTimestamp())
          .append(":");
        System.out.println(sb);
        System.out.println(packet);
    }
}
But when I run this code with the sudo command, it works. Why can't I run this code without sudo? I need to run it without sudo. How can I do that?
In order to list and monitor your network interfaces, you need superuser privileges. That's why you can't list any interfaces and get an exception when you run this code without sudo.
Even the famous Wireshark can't list your interfaces unless you run it with sudo, so I don't think there is any other way for you to run your application with normal privileges.
Good luck.

Colons in Apache Spark application path

I'm submitting an Apache Spark application to YARN programmatically:
package application.RestApplication;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;

public class App {
    public static void main(String[] args1) {
        String[] args = new String[] {
                "--class", "org.apache.spark.examples.JavaWordCount",
                "--jar", "/opt/spark/examples/jars/spark-examples_2.11-2.0.0.jar",
                "--arg", "hdfs://hadoop-master:9000/input/file.txt"
        };
        Configuration config = new Configuration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        ClientArguments cArgs = new ClientArguments(args);
        Client client = new Client(cArgs, config, sparkConf);
        client.run();
    }
}
I have a problem with the line: "--arg", "hdfs://hadoop-master:9000/input/file.txt" - more specifically with the colons:
16/08/29 09:54:16 ERROR yarn.ApplicationMaster: Uncaught exception:
java.lang.NumberFormatException: For input string: "9000/input/plik2.txt"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.parseInt(Integer.java:615)
at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272)
at scala.collection.immutable.StringOps.toInt(StringOps.scala:29)
at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:935)
at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:547)
at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:405)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:247)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:749)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:71)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:70)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:70)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:747)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:774)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
How can I pass a file path containing colons as an argument? I have tried various combinations with slashes, backslashes, %3a, etc...
According to Utils#parseHostPort, which gets invoked during that call, Spark seems to treat everything after the last : as the port:
def parseHostPort(hostPort: String): (String, Int) = {
  // Check cache first.
  val cached = hostPortParseResults.get(hostPort)
  if (cached != null) {
    return cached
  }

  val indx: Int = hostPort.lastIndexOf(':')
  // This is potentially broken - when dealing with ipv6 addresses for example, sigh ...
  // but then hadoop does not support ipv6 right now.
  // For now, we assume that if port exists, then it is valid - not check if it is an int > 0
  if (-1 == indx) {
    val retval = (hostPort, 0)
    hostPortParseResults.put(hostPort, retval)
    return retval
  }

  val retval = (hostPort.substring(0, indx).trim(), hostPort.substring(indx + 1).trim().toInt)
  hostPortParseResults.putIfAbsent(hostPort, retval)
  hostPortParseResults.get(hostPort)
}
As a consequence, the whole string 9000/input/file.txt is taken to be a single port number, which suggests you are not supposed to refer to your input file through the HDFS URL here. I guess someone more skilled in Apache Spark could give you better advice.
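For illustration, a minimal standalone Java snippet that mimics that lastIndexOf-based split on the value from the question (the splitting logic mirrors the Scala code above):

public class ParseHostPortDemo {
    public static void main(String[] args) {
        // the value passed via --arg in the question
        String hostPort = "hdfs://hadoop-master:9000/input/file.txt";
        int idx = hostPort.lastIndexOf(':');
        System.out.println(hostPort.substring(0, idx).trim());   // prints: hdfs://hadoop-master
        System.out.println(hostPort.substring(idx + 1).trim());  // prints: 9000/input/file.txt
        // this is the step that blows up inside Spark:
        Integer.parseInt(hostPort.substring(idx + 1).trim());    // throws NumberFormatException
    }
}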
I changed the program to follow this example: https://github.com/mahmoudparsian/data-algorithms-book/blob/master/src/main/java/org/dataalgorithms/chapB13/client/SubmitSparkPiToYARNFromJavaCode.java
import org.apache.spark.SparkConf;
import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;
import org.apache.hadoop.conf.Configuration;
import org.apache.log4j.Logger;

public class SubmitSparkAppToYARNFromJavaCode {

    public static void main(String[] args) throws Exception {
        run();
    }

    static void run() throws Exception {
        String sparkExamplesJar = "/opt/spark/examples/jars/spark-examples_2.11-2.0.0.jar";
        final String[] args = new String[]{
                "--jar",
                sparkExamplesJar,
                "--class",
                "org.apache.spark.examples.JavaWordCount",
                "--arg",
                "hdfs://hadoop-master:9000/input/file.txt"
        };
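        // note: ConfigurationManager and SPARK_HOME below are helpers/constants from the linked example project, not shown in this snippet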
        Configuration config = ConfigurationManager.createConfiguration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        sparkConf.setSparkHome(SPARK_HOME);
        sparkConf.setMaster("yarn");
        sparkConf.setAppName("spark-yarn");
        sparkConf.set("master", "yarn");
        sparkConf.set("spark.submit.deployMode", "cluster");
        ClientArguments clientArguments = new ClientArguments(args);
        Client client = new Client(clientArguments, config, sparkConf);
        client.run();
    }
}
and now it works! The hdfs:// argument itself is unchanged; what differs from the first attempt are the extra SparkConf settings (yarn master, cluster deploy mode, explicit Spark home).

Spring Resource Inside JAR/WAR

I created a really simple project to test reading a directory or file using getClass().getResource('...').getPath() from STS, Tomcat, and running the JAR/WAR file from the terminal with the embedded Tomcat.
Like I said, the project is simple, here's the code:
package org.example

import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.CommandLineRunner
import org.springframework.boot.SpringApplication
import org.springframework.boot.autoconfigure.SpringBootApplication

@SpringBootApplication
class ResourceDemoApplication implements CommandLineRunner {

    static void main(String[] args) {
        SpringApplication.run ResourceDemoApplication, args
    }

    @Override
    void run(String... arg0) throws Exception {
        retrieveDirectory()
    }

    void retrieveDirectory() {
        /*new File(getClass().getResource('/private/folders').getPath()).eachDirRecurse() { dir ->
            dir.eachFileMatch(~/.*.txt/) { file ->
                println(file.getPath())
            }
        }*/
        println new File(getClass().getResource('/private/folders/').getPath()).isDirectory()
    }
}
When this code runs in STS, or if I drop it into a running Tomcat instance, it prints true. When I run it as java -jar..., it prints false in the terminal. I have looked at countless examples and I still don't understand how to get this to work properly. I know that reading files from inside the JAR is different from having access to the file system, but I'm not sure how to make this work regardless of how it's deployed.
Thank you in advance for the help!
After quite a bit of research and digging into the code, I ended up with this solution:
package org.example

import org.springframework.boot.CommandLineRunner
import org.springframework.boot.SpringApplication
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.core.io.FileSystemResource
import org.springframework.core.io.support.PathMatchingResourcePatternResolver

@SpringBootApplication
class ResourceDemoApplication implements CommandLineRunner {

    PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver()

    static void main(String[] args) {
        SpringApplication.run ResourceDemoApplication, args
    }

    @Override
    void run(String... arg0) throws Exception {
        retrieveDirectory()
    }

    void retrieveDirectory() {
        List<FileSystemResource> files = resolver.findPathMatchingResources('private/folders/**/example.txt')
        files.each { file ->
            println file.getInputStream().text
        }
    }
}
With Groovy you don't need to declare types, etc.; I am doing it here for the sake of documentation, to show what's happening in the code. If you do this in Java, you will need something like the following to replace println file.getInputStream().text:
InputStream is
BufferedReader br
String fileContents

files.each { file ->
    is = file.getInputStream()
    br = new BufferedReader(new InputStreamReader(is))
    String line
    fileContents = ""
    while ((line = br.readLine()) != null) {
        fileContents += line
    }
    println fileContents
    println "************************"
    br.close()
}
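For completeness, here is a rough all-Java equivalent of retrieveDirectory() using the resolver's public getResources method; the classpath pattern and file layout are assumptions carried over from the Groovy code above, not something verified against this exact project:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

public class ResourceReader {
    public static void main(String[] args) throws IOException {
        PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        // classpath*: also searches inside jars on the classpath, which matters for java -jar
        Resource[] files = resolver.getResources("classpath*:private/folders/**/example.txt");
        for (Resource file : files) {
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(file.getInputStream(), StandardCharsets.UTF_8))) {
                br.lines().forEach(System.out::println);
            }
        }
    }
}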

ant java task : redirecting output with spawn=true

Greetings,
a rather clear question here.
I have to launch a Java job with <java>; it is to run in parallel with ant, and it's okay if the job outlives the ant process, hence spawn="true".
I need the job output to go to a designated file. This is perfectly achievable via output="job.out" with spawn="false", but I am out of luck with spawn="true".
So, is there any modestly dirty hack, or do I really have to wrap the java call with an exec like the one below?
CMD /C my-java-command-and-hardcoded-classpath-goes-here > job.out
Thanks,
Anton
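One workaround along those lines is a small wrapper main class like the one below: it takes the wrapped main class and the destination file as its first two arguments, redirects System.out and System.err to that file, and then invokes the wrapped main reflectively.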
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;
import java.lang.reflect.InvocationTargetException;
import java.util.Arrays;

public class StreamRedirector {
    public static void main(String[] args) throws FileNotFoundException, ClassNotFoundException,
            NoSuchMethodException, InvocationTargetException, IllegalAccessException {
        System.out.println(Arrays.toString(args));
        // parse the arguments: we need at least the wrapped main class and the output file
        if (args.length < 2) {
            throw new IllegalArgumentException(
                    "Usage:" +
                    "\targ0 = wrapped main FQN;\n" +
                    "\targ1 = dest output file name;\n" +
                    "\tother args are passed to wrapped main;"
            );
        }
        String mainClass = args[0];
        String destinationFile = args[1];
        // redirect the streams
        PrintStream outErr = new PrintStream(new FileOutputStream(destinationFile));
        System.setErr(outErr);
        System.setOut(outErr);
        // delegate to the other main
        String[] wrappedArgs = new String[args.length - 2];
        System.arraycopy(args, 2, wrappedArgs, 0, wrappedArgs.length);
        Class.forName(mainClass).getMethod("main", String[].class).invoke(null, (Object) wrappedArgs);
    }
}
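For reference, the wrapper is launched with the wrapped main class and the destination file as the first two arguments; the class and classpath names below are placeholders:

java -cp build/classes:my-app.jar StreamRedirector com.example.RealMain job.out realArg1 realArg2

The same arguments can be passed from the ant <java> task with spawn="true", since the output redirection now happens inside the spawned JVM instead of relying on the task's output attribute.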
