I have a java program that I'm required to compile into a Linux native program using gcj-4.3. This program requires serial port access. The javax.comm api provides serial port access but I'm not sure how to get my compiled java program to use it.
The target box has Java installed, but of course my compiled program isn't running in the JRE...so I'm not exactly sure how I can link in the comm.jar file or how that file can find the .properties file it requires.
I wonder if I can just compile comm.jar along with my .jar file and link the two object files together. Can my code then reference the classes in comm.jar?
Thanks in advance for your help!
I'm not a GCJ expert, but I have some suggestions (I'm not providing the exact syntax; this will require some exploration that I didn't perform):
first, I think that you'll have to compile comm.jar into a (shared) library,
then, you'll have to link your code against the library,
finally, use the GCJ_PROPERTIES environment variable to pass properties to the program at invocation time (a rough sketch of all three steps follows).
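As a rough, untested sketch of what those three steps could look like with gcj-4.3 (the paths, the main class name, and the property name are assumptions you will need to adapt):

    # 1. Compile comm.jar into a shared library.
    gcj-4.3 -shared -fPIC -o libcomm.so comm.jar
    # 2. Compile your jar and link it against the library (com.example.Main is a placeholder).
    gcj-4.3 --main=com.example.Main --classpath=comm.jar -o myprog myprog.jar -L. -lcomm
    # 3. Run it, supplying system properties as space-separated name=value pairs;
    #    make sure libcomm.so is on the runtime library path, and check which property
    #    your comm implementation actually reads to locate its .properties file.
    GCJ_PROPERTIES="name=value" LD_LIBRARY_PATH=. ./myprog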
The following pointers might be helpful to implement this:
GCJ---The GNU Compiler for Java (great resources IMO, covers all the steps mentioned above)
GCJ – Getting Started (more an intro but still nice)
Compile ActiveMQ with GCJ (more use cases but I don't think they apply here)
And of course, the official documentation :)
I've been working on trying to get COBOL and Java to interact with each other on the mainframe, and have run into trouble specifically with the cob2 compiler, which is the equivalent compiler for Unix on the mainframe.
I haven't seen many user experiences with this compiler online, so I was wondering if I asked a more direct question, people would reveal their insight.
IBM has several examples of Java calling COBOL DLLs, either directly or indirectly, but they ultimately boil down to: compile the COBOL as a DLL, use System.load, compile the Java, and run. These examples haven't worked for me, for the following reasons.
When using cob2 with the -c option, it is purported to generate a .o object file. This has not happened for me, although it did generate an empty .lst file. I was able to get around this by simply skipping the -c step and compiling and linking using this series of commands:
sh ${COB2HOME}/bin/cob2 -o ${DIR}/c2jcr.o -qdll,thread,case=mixed ${DIR}/c2jcr.cbl;
${COB2HOME}/bin/cob2 -o ${DIR}/libc2jcr.so -bdll,case=mixed ${DIR}/c2jcr.o ${JAVAHOME}/bin/j9vm/libjvm.x ${COB2HOME}/lib/igzcjava.x
This appears to produce the .so library that is required to link with the Java program, but on inspection of the load module, and at run time, the system reports that the LE CSECT CEESTART is not there.
Am I missing something in my cob2 library that has these LE modules, or somewhere in my scripting? I tried pulling in loads from the mainframe compiled with the LE modules intact and ENTRY CEESTART explicitly stated in the link step, but could not get any further than "UnsatisfiedLinkError" with "Internal Error".
Any wisdom is greatly appreciated, especially if you've gone down a completely different route to call COBOL from Java. Thank you very much.
After conferring with IBM, it turns out I had a couple things missing.
You must have a STEPLIB environment variable set to the location of your COBOL compiler on the mainframe, so it can find your IGYCRCTL module.
Second, as with other COBOL V5+ compiles, you must allocate a gargantuan amount of storage in order to compile; 2 GB is not enough. Since I don't have permission to reallocate this in Unix, I ran a BPXBATCH job with REGION=0M.
After those two changes, -c compiles came out normally. The "workaround" I provided in the question is completely incorrect. You must use:
sh ${COB2HOME}/bin/cob2 -c -qdll,thread,case=mixed ${DIR}/${COBPROG}.cbl
as your compile step, and the rest is just linkage.
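For reference, the setup amounts to something like this in the shell that BPXBATCH runs (the compiler data set name below is an installation-specific placeholder; ask your system programmer for the real one):

    # Let cob2 find the IGYCRCTL compiler module.
    export STEPLIB=IGY.V5R2.SIGYCOMP
    # The -c compile itself, run under BPXBATCH with REGION=0M so it gets enough storage.
    ${COB2HOME}/bin/cob2 -c -qdll,thread,case=mixed ${DIR}/${COBPROG}.cbl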
I'm struggling trying to get remote actors setup in Scala. I'm running Scala 2.10.2 and Akka 2.2.1.
I compile using [I've shortened the paths in the classpath arg for clarity's sake]:
$ scalac -classpath "akka-2.2.1/lib:akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka:akka-2.2.1/lib/akka/scala-reflect-2.10.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-kernel_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:." [file.scala]
I've continuously added new libraries trying to debug this - I'm pretty sure all I really need to include is akka-remote, but the others shouldn't hurt.
No issues compiling.
I attempt to run like this:
$ scala -classpath "[same as above]" [application]
And I receive a NoSuchMethodException:
java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getDeclaredConstructor(Class.java:2053)
...
Looking into the source code, it appears that Akka 2.2.X's flavor of this constructor takes 4 arguments (the Scheduler is removed). But in Akka < 2.2.X, the constructor takes 5 args.
Thus, I'm thinking my classpath isn't set up quite right. At compile time, Scala must be finding the < 2.2.X flavor. I don't even know where it would be finding it, since I only have Akka 2.2.1 installed.
Any suggestions!? Thanks! (Please don't say to use SBT).
The problem here is that the Scala distribution contains akka-actor 2.1.0 and helpfully puts that in the boot class path for you. We very strongly recommend using a dependency manager like sbt or maven when building anything which goes beyond the most trivial projects.
As noted in another answer, the problem is that scala puts a different version of Akka into the bootclasspath.
To more directly answer your question (as you said you don't want to use sbt): you can execute your program with java instead of scala. You just have to put the appropriate Scala jars into the classpath.
Here is a spark-dev message about the problem. The important part is: "the workaround is to use java to launch the application instead of scala. All you need to do is to include the right Scala jars (scala-library and scala-compiler) in the classpath."
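A sketch of such a launch, reusing the jar locations from the question (MyApp stands in for your main class; add whatever other jars under akka-2.2.1/lib/akka your application needs in the same way):

    java -cp "akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka/scala-reflect-2.10.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:." MyApp

Because plain java has no Scala-specific boot class path, only the Akka 2.2.1 jars you list are visible, so the akka-actor 2.1.0 bundled with the Scala distribution can no longer shadow them.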
I have a text (.txt) file that contains Java code! I want to create a method that includes this Java code and then call that method from the program.
Can anybody suggest a way to do this?
Consider this example: what it actually does is load the source code, then compile and execute the Java code in a simple program using the JavaCompiler API.
Use the JavaCompiler. It can compile code from a String, so I'm sure it could handle code from a text file.
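A minimal sketch of that approach, assuming the .txt file contains a complete public class named Snippet with a static test() method (all of the names here are made up for illustration) and that you run it on a JDK so the system compiler is available:

    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    import java.io.File;
    import java.lang.reflect.Method;
    import java.net.URL;
    import java.net.URLClassLoader;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class RunFromTxt {
        public static void main(String[] args) throws Exception {
            // Copy the .txt source to a .java file named after the class it declares.
            byte[] source = Files.readAllBytes(Paths.get("Snippet.txt"));
            Files.write(Paths.get("Snippet.java"), source);

            // Compile it with the system compiler (null on a plain JRE, hence the check).
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
            if (compiler == null) {
                throw new IllegalStateException("No system compiler - run this on a JDK");
            }
            int status = compiler.run(null, null, null, "Snippet.java");
            if (status != 0) {
                throw new IllegalStateException("Compilation failed");
            }

            // Load the freshly compiled class from the current directory and call test().
            URLClassLoader loader = new URLClassLoader(new URL[] { new File(".").toURI().toURL() });
            Class<?> snippet = Class.forName("Snippet", true, loader);
            Method test = snippet.getMethod("test");
            test.invoke(null);
        }
    }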
Do you think that, instead of putting it in the main method, I could put it in, for example, a test method and call that method like this?
Put it wherever you like. E.g., see the STBC and especially its source code. It provides a GUI and can compile the code in the text area on a button click.
This program needs tools.jar, but JRE 7 doesn't have it!
Did you try reading the documentation that is provided for the STBC? Notably:
System Requirements
STBC will run on any computer with a version 1.6+ Java Plug-In* JDK (AKA SDK).
(*) The API that STBC uses is merely a public interface to the compiler in the tools.jar that is distributed only with JDKs (though the 'public JRE' of the JDK also seems to acquire a tools.jar). This leads to some unusual requirements in running either the native jar, or the web start app.
Or, put more briefly: no JRE will have a JavaCompiler; only JDKs have one.
Change the .txt file to a .java file,
add it to your Java project,
compile the code, and
execute the methods (see the commands sketched below).
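If you go that route, the last two steps are just the usual commands (MyCode is a placeholder for whatever class the file declares):

    javac MyCode.java
    java MyCode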
Load the file in through standard Java IO and then have Groovy evaluate it for you:
http://groovy.codehaus.org/Embedding+Groovy
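A small sketch of that idea in plain Java, assuming the Groovy jar is on the classpath and "code.txt" is a placeholder file name (most straightforward Java code is also valid Groovy):

    import groovy.lang.GroovyShell;

    import java.io.File;

    public class EvalWithGroovy {
        public static void main(String[] args) throws Exception {
            // GroovyShell compiles and runs the file's contents at runtime.
            GroovyShell shell = new GroovyShell();
            Object result = shell.evaluate(new File("code.txt"));
            System.out.println("Script returned: " + result);
        }
    }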
It's something like a quine:
http://www.nyx.org/%7Egthompso/quine.htm
I'm new to Java development, and I just want to use javac for my build system. I'm using Java to add a feature to a program someone else wrote, specifically involving GeoTIFF images.
I found a class online that I would like to use; however, I'm having trouble building it. No matter what I do, I get this message:
javac GeoTiffIIOMetadataAdapter.java
GeoTiffIIOMetadataAdapter.java:11: package com.sun.media.imageio.plugins.tiff does not exist
import com.sun.media.imageio.plugins.tiff.GeoTIFFTagSet;
I'm on RHEL5, so I installed the package I thought I needed, jai-imageio-core.x86_64. But the problem persists. I think that I'm not setting some variable correctly (like -sourcepath or something). I would appreciate any help.
You need to include the jar with -cp or -classpath.
So your compile would look like javac -cp "<location of jai_imageio-1.1.jar>" <your Java source file>.
I think you need this jar file.
You can read more about javac here.
Find out where the package installed the jar file with the class you want to import, and add it to the javac commandline in the -classpath. (You then also need to include it in the classpath when your plugin runs; how to do that may depend on the program it plugs into).
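For instance, if the RPM put the jar under /usr/share/java (the exact file name is a guess; rpm -ql jai-imageio-core will show the real path), the compile and run steps would look roughly like:

    javac -classpath /usr/share/java/jai_imageio.jar GeoTiffIIOMetadataAdapter.java
    java -classpath /usr/share/java/jai_imageio.jar:. YourMainClass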
I think that I'm not setting some variable correctly (like -sourcepath or something)
This tutorial briefly introduces the usage of environment variables in Java: PATH and CLASSPATH
This one seems to be the most popular answer to various classpath related questions I've seen at online forums: Setting the class path.
To avoid "blind recommendation" I quickly skimmed through it before adding to this answer and, well... it really covers most of what one needs to know to deal with classpath. Pretty good; the reason why I didn't look into it before is that there always has been some guru nearby who explained stuff to me.
I mean the Intel assembly for the processor?
If you want to see the native code generated (at runtime) by the JIT compiler, then there are a series of JVM flags that will print the assembly code as it is generated.
They are included in this listing - search for "PrintAssembly".
Note that these options need to be prefixed with "-XX:" in the java command line. Refer to the java manual page for details.
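As a concrete example (MyClass is a placeholder; PrintAssembly is a diagnostic option, so it has to be unlocked first, and it also requires the hsdis disassembler library to be installed for your platform):

    java -XX:+UnlockDiagnosticVMOptions -XX:+PrintAssembly MyClass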
You won't get anything like that. Java is compiled to bytecode.
Java code is translated to bytecode. Then the JVM takes the bytecode and executes it. So I think you are out of luck.
If you want the bytecode, see javap. It ships with the JDK and disassembles a class file. As @arjan noted, Eclipse shows such information when you double-click a class.
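For example, to dump the bytecode of a compiled class from the command line (MyClass is a placeholder):

    javap -c MyClass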
http://java.decompiler.free.fr/
but I guess that's easy enough to find oneself.
If with "assembly translation" you mean the byte code (the output from Javac, JDT, etc) then the answer is really simple: find a .class file and double click on it.
This will show you the byte code in human readable mnemonics.
Enter the output folder - usually "bin" - in the Navigation perspective. There you can see all your class files.