How can I set a custom SSLSocketFactory on a Heroku app? - java

I'd like to control which public IP addresses my app can connect to, so that I can blacklist a small set of IPs for outgoing connections for the entire app.
Deploying a Tomcat Java app to Heroku, I've specified a custom Java security configuration by overriding "java.security.properties" in my Procfile:
web: java $JAVA_OPTS -Djava.security.properties=java.security -jar target/dependency/webapp-runner.jar --port $PORT target/*.war
In that config, I've specified a custom SSLSocketFactory class:
ssl.SocketFactory.provider=security.MyCustomSocketFactory
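For context, MyCustomSocketFactory is roughly the following shape (a simplified sketch, not the exact code; isBlacklisted() is a placeholder for the real lookup). It builds its own SSLContext instead of calling SSLSocketFactory.getDefault(), which would recurse back into the provider lookup:
package security;

import java.io.IOException;
import java.net.InetAddress;
import java.net.Socket;
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLSocketFactory;

public class MyCustomSocketFactory extends SSLSocketFactory {
    private final SSLSocketFactory delegate;

    public MyCustomSocketFactory() {
        try {
            // Build our own context so we don't call SSLSocketFactory.getDefault(),
            // which would try to load this provider class again.
            SSLContext context = SSLContext.getInstance("TLS");
            context.init(null, null, null);
            delegate = context.getSocketFactory();
        } catch (Exception e) {
            throw new IllegalStateException("Unable to create delegate factory", e);
        }
    }

    // Placeholder: decide whether an outgoing host/IP is allowed.
    private boolean isBlacklisted(String hostOrIp) {
        return false;
    }

    private void check(String hostOrIp) throws IOException {
        if (isBlacklisted(hostOrIp)) {
            throw new IOException("Outgoing connection to " + hostOrIp + " is blocked");
        }
    }

    @Override
    public Socket createSocket(String host, int port) throws IOException {
        check(host);
        return delegate.createSocket(host, port);
    }

    @Override
    public Socket createSocket(String host, int port, InetAddress localHost, int localPort) throws IOException {
        check(host);
        return delegate.createSocket(host, port, localHost, localPort);
    }

    @Override
    public Socket createSocket(InetAddress host, int port) throws IOException {
        check(host.getHostAddress());
        return delegate.createSocket(host, port);
    }

    @Override
    public Socket createSocket(InetAddress address, int port, InetAddress localAddress, int localPort) throws IOException {
        check(address.getHostAddress());
        return delegate.createSocket(address, port, localAddress, localPort);
    }

    @Override
    public Socket createSocket(Socket s, String host, int port, boolean autoClose) throws IOException {
        check(host);
        return delegate.createSocket(s, host, port, autoClose);
    }

    @Override
    public String[] getDefaultCipherSuites() {
        return delegate.getDefaultCipherSuites();
    }

    @Override
    public String[] getSupportedCipherSuites() {
        return delegate.getSupportedCipherSuites();
    }
}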
In a small sample app, this setup lets MyCustomSocketFactory examine every IP address and host for outgoing connections. However, it's not working for my full application after I deploy to Heroku. The class isn't found, even though it is packaged into the .war file:
Caused by: java.lang.ClassNotFoundException: security.MyCustomSocketFactory
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:185)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:496)
at java.base/javax.net.ssl.SSLSocketFactory.getDefault(SSLSocketFactory.java:105)
at java.base/javax.net.ssl.HttpsURLConnection.getDefaultSSLSocketFactory(HttpsURLConnection.java:335)
at java.base/javax.net.ssl.HttpsURLConnection.<init>(HttpsURLConnection.java:292)
I suspect my single class needs to be loaded by a different classloader, because my application is initialized by webapp-runner.jar. Is there a different approach I should be taking?
I know my class is available to some classloader, because I can call Class.forName() from my own code without getting an exception. It just can't be loaded from SSLSocketFactory.getDefault().
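Roughly, the behaviour looks like this (a simplified sketch, run from application code inside the webapp; the stack trace above shows the provider being resolved by the JDK's AppClassLoader rather than the webapp classloader):
public class ClassLoaderCheck {
    // Called from application code running inside the webapp.
    public static void check() throws Exception {
        // Succeeds: the webapp classloader can see classes packaged inside the war.
        Class<?> ok = Class.forName("security.MyCustomSocketFactory");
        System.out.println("Loaded by " + ok.getClassLoader());

        // Throws the ClassNotFoundException above when the class only lives inside
        // the war: getDefault() resolves the provider via the system classloader.
        javax.net.ssl.SSLSocketFactory.getDefault();
    }
}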

As codefinger suggested in a comment, I needed to include MyCustomSocketFactory on the classpath outside of the war file.
I moved MyCustomSocketFactory to a separate Maven project, and built it as a separate jar.
Then, I added a build step to my main project to copy the JAR into the same directory as webapp-runner.jar:
<plugin>
  <artifactId>maven-resources-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <id>copy-socketblocker</id>
      <phase>package</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <outputDirectory>${basedir}/target/dependency</outputDirectory>
        <resources>
          <resource>
            <directory>jars</directory>
            <includes>
              <include>socketblocker-1.0.jar</include>
            </includes>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
Finally, I modified my Procfile to use wildcard matching, adding both webapp-runner.jar and my custom JAR as classpath entries:
web: java $JAVA_OPTS -Djava.security.properties=java.security -cp "target/dependency/*" webapp.runner.launch.Main --port $PORT target/*.war
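As a quick sanity check after deploying, any code that goes through HttpsURLConnection should now pick up the custom factory, since constructing the connection calls SSLSocketFactory.getDefault() (see the stack trace in the question). A hypothetical test class, with a placeholder URL:
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class BlockCheck {
    public static void main(String[] args) throws Exception {
        // Replace with a blacklisted host; allowed hosts should connect normally.
        URL url = new URL("https://example.com/");
        // openConnection() constructs an HttpsURLConnection, which already calls
        // SSLSocketFactory.getDefault(), i.e. MyCustomSocketFactory.
        HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
        System.out.println("HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}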

Related

Creating executable JAR in IntelliJ (Java 18, JavaFX 18 Maven project), "WARNING: Unsupported JavaFX configuration..."

I have a Java 18, JavaFX 18 Maven project which has a lot of libraries, besides the JavaFX libraries, that need to be included in the artifact. I want to create an artifact, a jar, which contains all dependencies. I started following this video to create the jar: https://www.youtube.com/watch?v=UKd6zpUnAE4
Summarizing my steps, and referring to the steps in the video:
In IntelliJ, in Project Structure/Project Settings/Libraries, I removed all Maven-added libraries and added C:\Program Files\Java\javafx-sdk-18.0.2\lib
After that, in Run/Edit Configurations... I added VM options, and in that window I added
--module-path "C:\Program Files\Java\javafx-sdk-18.0.2\lib"
--add-modules javafx.controls,javafx.fxml
Next, in the video, "Ken", the host, creates a class with a main() method that launches the application's original main class. I did not need this step, because I already had a class that does the same (a sketch of such a launcher appears after these steps).
Then, in File/Project Structure/Project Settings/Artifacts, I added a JAR/From modules with dependencies, chose the class I had recently created, and shortened the path down to the source folder (src).
Following this step, I clicked add (+) and added the content of "...javafx-sdk-18.0.2/bin": all the DLLs and everything (all files).
At this point, departing from the video, I also created a folder named "jars" and put all the Maven dependency jars in that folder.
According to the video, after these steps the jar should run on double click without a problem.
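For reference, the launcher class mentioned above is roughly this shape (a sketch only; the names controller.Start and view.GUI come from the shade configuration and the stack trace further down, and view.GUI is assumed to extend javafx.application.Application):
package controller;

import javafx.application.Application;

// Plain (non-Application) entry point, so the fat JAR can start even though
// JavaFX ends up on the classpath rather than the module path.
public class Start {
    public static void main(String[] args) {
        Application.launch(view.GUI.class, args);
    }
}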
However, I needed one more step. My dependency jars are signed jars, so I needed to open the artifact with WinRAR and remove the *.SF, *.DSA and *.RSA files. This had caused me problems earlier, so I followed the ideas here: Invalid signature file digest for Manifest main attributes exception while trying to run jar file, and here: "Invalid signature file" when attempting to run a .jar
After this, everything should have been fine, but it isn't :( The jar doesn't run on double click. When I run it from the command line, I receive the following error:
$ java -jar jHasher.jar
jan. 15, 2023 3:19:07 DU. com.sun.javafx.application.PlatformImpl startup
WARNING: Unsupported JavaFX configuration: classes were loaded from 'unnamed module #3a178016'
javafx.fxml.LoadException:
unknown path:53
at javafx.fxml.FXMLLoader.constructLoadException(FXMLLoader.java:2707)
at javafx.fxml.FXMLLoader.loadImpl(FXMLLoader.java:2685)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2532)
at view.GUI.start(GUI.java:29)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$9(LauncherImpl.java:847)
at com.sun.javafx.application.PlatformImpl.lambda$runAndWait$12(PlatformImpl.java:484)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$10(PlatformImpl.java:457)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$11(PlatformImpl.java:456)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:96)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.lambda$runLoop$3(WinApplication.java:184)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.sun.javafx.fxml.BeanAdapter.put(BeanAdapter.java:263)
at com.sun.javafx.fxml.BeanAdapter.put(BeanAdapter.java:54)
at javafx.fxml.FXMLLoader$Element.applyProperty(FXMLLoader.java:523)
at javafx.fxml.FXMLLoader$Element.processValue(FXMLLoader.java:373)
at javafx.fxml.FXMLLoader$Element.processPropertyAttribute(FXMLLoader.java:335)
at javafx.fxml.FXMLLoader$Element.processInstancePropertyAttributes(FXMLLoader.java:245)
at javafx.fxml.FXMLLoader$ValueElement.processEndElement(FXMLLoader.java:778)
at javafx.fxml.FXMLLoader.processEndElement(FXMLLoader.java:2924)
at javafx.fxml.FXMLLoader.loadImpl(FXMLLoader.java:2639)
... 11 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:119)
at java.base/java.lang.reflect.Method.invoke(Method.java:577)
at com.sun.javafx.fxml.ModuleHelper.invoke(ModuleHelper.java:102)
at com.sun.javafx.fxml.BeanAdapter.put(BeanAdapter.java:259)
... 19 more
Caused by: java.lang.UnsupportedOperationException: Cannot resolve 'win10-document'
at org.kordamp.ikonli.AbstractIkonResolver.resolve(AbstractIkonResolver.java:61)
at org.kordamp.ikonli.javafx.IkonResolver.resolve(IkonResolver.java:73)
at org.kordamp.ikonli.javafx.FontIcon.setIconLiteral(FontIcon.java:251)
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
... 22 more
I have searched for this error message and found some posts on Stack Overflow, but they are not clear to me, and I was not able to fix the issue. Please guide me on how to proceed. All suggestions are highly appreciated.
After several hard days, I was able to create the executable jar. I'd like to share the know-how with you.
After the 5th step, I skipped the WinRAR removal of the *.SF, *.DSA and *.RSA files and added the maven-shade-plugin to my pom.xml instead. The shade plugin can remove these unwanted files automatically, but unfortunately on its own it cannot create a runnable JAR, because it again throws exceptions and doesn't run on double click (JavaFX 18 Maven IntelliJ: Graphics Device initialization failed for: d3d, sw Error initializing QuantumRenderer: no suitable pipeline found).
To avoid this exception and include the missing JavaFX files, we have to repack the already packed JAR. To do that, I used the spring-boot-maven-plugin. After setting up the plugins (code below), you have to run them with Maven in the correct order! My Maven command was the following: mvn clean package spring-boot:repackage
That's it; finally the created JAR (a JAR of the JAR) runs on double click.
My pom.xml's corresponding parts:
Shade plugin setting:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>controller.Start</mainClass>
          </transformer>
        </transformers>
        <minimizeJar>true</minimizeJar>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
The Spring-boot-maven-plugin setting (this should be placed outside the plugins section, at the very end of the pom.xml):
<pluginManagement>
  <plugins>
    <plugin>
      <!-- mvn clean package spring-boot:repackage -->
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <executions>
        <execution>
          <goals>
            <goal>repackage</goal>
          </goals>
          <configuration>
            <classifier>spring-boot</classifier>
            <mainClass>controller.Start</mainClass>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</pluginManagement>
Make sure to run the plugins in the correct order, as mentioned above! I found this resource very useful: https://www.baeldung.com/spring-boot-repackage-vs-mvn-package

Maven Filtering parameters in file

I have been looking over the maven-war-plugin and how to configure it. Here is my situation: I have a web application that is distributed to several production facilities. Two files in this web app are customized for each facility: /js/config.js and /META-INF/context.xml.
I have my project in a typical maven structure:
/src
|--/main
   |--webapp
      |--/js
      |  |--config.js
      |  |--properties
      |     |--plant.properties
      |--/META-INF
         |--context.xml
I've left out non-essential directories for brevity.
The config.js has been altered to contain a "parameter" I want substituted:
var Config = {
    ...
    system_title: '${plant_name} - Audit System',
    ...
}
The relevant portion of my pom is:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>3.2.0</version>
  <configuration>
    <filters>
      src/main/webapp/js/properties/fayetteville.properties
    </filters>
    <failOnMissingWebXml>false</failOnMissingWebXml>
    <webResources>
      <resource>
        <directory>src/main/webapp/js</directory>
        <filtering>true</filtering>
        <exclude>**/properties</exclude>
      </resource>
    </webResources>
  </configuration>
</plugin>
When I run "mvn clean package", I would expect to see ${plant_name} replaced with what is in my properties file. In this case, my properties file contains a single key-value pair:
plant_name=Austin
But I am not seeing the substitution. The resulting config.js in the target folder still contains ${plant_name} as does the config.js in the resulting war file.
I really don't want to use profiles if possible. Eventually, I want the build process to use a list of properties files to do this for all plants.
From my research, including a number of SO questions and answers, I feel I have things configured correctly.
What might I be doing wrong?

Eclipse Jar-in-Jar fails to find a joda.time class

Eclipse has 3 runnable JAR export methods. One of them does not work in my case. I want to stop using the export method that makes a library sub-folder and switch to a single JAR.
In all cases my invocation is in a script, with a few script variables such as $MEMORYOPTIONS
java $MEMORYOPTIONS -enableassertions -classpath VARIOUS-SHOWN-BELOW topLevelDomain.domain.packageName.className $1 $2 $3
Firstly...
I have success with the following export method and the class path as shown.
export > runnable jar > extract required libraries
-classpath /home/user/workspace/project/project1.jar
I have a reason for not wanting to use this single JAR. (It is because the unpacked third-party packages expose files with duplicate names, so I get annoying warnings. For example, a file called License.txt is in several packages.)
Secondly...
As already mentioned I also have success with the following "library sub-folder" export method and class path as shown.
export > runnable jar > copy required libraries into a sub-folder
-classpath /home/user/workspace/project/project1.jar:/home/user/workspace/project/project1_lib/*
(Edit: As it turns out, the JAR has a manifest that points to the project1_lib subfolder, so the class path can be simplified to omit that part. Just delete everything after the colon (:) separator in the class path.)
Thirdly...
I interpret "package required libraries" to mean a JAR-in-JAR export. Invoked with the class path shown, this export results in a failure to find the class.
export > runnable jar > package required libraries
-classpath /home/user/workspace/project/project1.jar
The error is:
Exception in thread "main" java.lang.NoClassDefFoundError: org/joda/time/ReadablePartial
How do I get this particular type of Eclipse export to work? I have already uninstalled Eclipse (Mars) and reinstalled. I have also removed the org.joda.time package and added it back. The problem persists.
Did you consider creating an uber jar?
With Maven you just need to add the following plugin definition and run the command mvn package:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.2</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>joda-time:joda-time</artifact>
            <includes>
              <include>**</include>
            </includes>
          </filter>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
Hope this helps.
You need to call the bundled JarRsrcLoader with your class as an argument:
java $MEMORYOPTIONS -enableassertions \
    -classpath /home/user/workspace/project/project1.jar \
    org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader \
    topLevelDomain.domain.packageName.className
However, this doesn't allow you to pass arguments ($1 $2 $3) to the called class.
You should launch your app using java's -jar option. Something like this:
java $MEMORYOPTIONS -enableassertions -jar /path/to/project1.jar $1 $2 $3

Cannot load UIMA PEAR package through the package installer GUI

I have a problem related to installing a UIMA PEAR package containing an Annotator component. I am using the PearPackagingMavenPlugin for the job with the following setup:
<plugin>
  <groupId>org.apache.uima</groupId>
  <artifactId>PearPackagingMavenPlugin</artifactId>
  <version>2.6.0</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <phase>package</phase>
      <configuration>
        <!-- PEAR file component classpath settings -->
        <classpath>$main_root/bin</classpath>
        <!-- PEAR file main component descriptor -->
        <mainComponentDesc>desc/S4DocumentUimaAnnotator.xml</mainComponentDesc>
        <!-- PEAR file component ID -->
        <componentId>S4DocumentAnnotator</componentId>
        <!-- PEAR file UIMA datapath settings -->
        <datapath>$main_root/resources</datapath>
      </configuration>
      <goals>
        <goal>package</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I have constructed a special Maven profile that builds the project into a bin directory instead of target, so all my compiled classes are there; that is why I have pointed the plugin's classpath setting at $main_root/bin.
Finally, when I load the built PEAR package, I get the following error:
Verification of S4DocumentAnnotator failed =>
org.apache.uima.resource.ResourceInitializationException: The class com.ontotext.s4.api.components.uima.S4DocumentUimaAnnotator is not a valid Analysis Component. You must specify an Annotator, CAS Consumer, Collection Reader, or CAS Multiplier. If you are calling ResourceManager.setExtensionClassPath, this error can also be caused if you have put UIMA framework jar files on the extension classpath, which is not allowed. (Descriptor: file:/home/ceco/s4_stuff/my_pear/S4DocumentAnnotator/desc/S4DocumentUimaAnnotator.xml)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:228)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initialize(PrimitiveAnalysisEngine_impl.java:170)
at org.apache.uima.impl.AnalysisEngineFactory_impl.produceResource(AnalysisEngineFactory_impl.java:94)
at org.apache.uima.impl.CompositeResourceFactory_impl.produceResource(CompositeResourceFactory_impl.java:62)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:279)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:331)
at org.apache.uima.UIMAFramework.produceAnalysisEngine(UIMAFramework.java:448)
at org.apache.uima.pear.tools.InstallationTester.testAnalysisEngine(InstallationTester.java:218)
at org.apache.uima.pear.tools.InstallationTester.doTest(InstallationTester.java:113)
at org.apache.uima.pear.tools.InstallationController.verifyComponentInstallation(InstallationController.java:1110)
at org.apache.uima.pear.tools.InstallationController.verifyComponent(InstallationController.java:1993)
at org.apache.uima.tools.pear.install.InstallPear.installPear(InstallPear.java:389)
at org.apache.uima.tools.pear.install.InstallPear.access$000(InstallPear.java:80)
at org.apache.uima.tools.pear.install.InstallPear$RunInstallation.run(InstallPear.java:109)
at java.lang.Thread.run(Thread.java:744)
I do not understand why the UIMA jars are not supposed to be packaged, when the whole idea of a PEAR package is to be self-contained and not depend on the system it is run on.
This is what I would try:
The class com.ontotext.s4.api.components.uima.S4DocumentUimaAnnotator is not a valid Analysis Component
Check that S4DocumentUimaAnnotator is valid. Unzip the PEAR and check the xml.
If you are calling ResourceManager.setExtensionClassPath, this error can also be caused if you have put UIMA framework jar files on the extension classpath, which is not allowed.
Did you try to print the extension classpath?
Otherwise, you could try to use a plain Java version of the PEAR, meaning: manually unzip it and create a normal Java project from it.
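For comparison, a class that the PEAR installation tester accepts as an Analysis Component is essentially just an annotator subclass. A minimal sketch (the real S4DocumentUimaAnnotator is not shown in the question, so this is only the expected shape; the names here are hypothetical):
package com.example.uima;

import org.apache.uima.analysis_component.JCasAnnotator_ImplBase;
import org.apache.uima.analysis_engine.AnalysisEngineProcessException;
import org.apache.uima.jcas.JCas;

// Extending JCasAnnotator_ImplBase is enough for the framework to instantiate
// the class as an AnalysisComponent during PEAR verification.
public class ExampleAnnotator extends JCasAnnotator_ImplBase {
    @Override
    public void process(JCas jcas) throws AnalysisEngineProcessException {
        // annotation logic goes here
    }
}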
One common issue I have run into several times, and which produces exactly this error message, is that my PEAR contained uimaj-common.jar in the lib folder. Have you checked this?
I know this one is old, but what I think you should do is set the scope of the UIMA dependencies to provided. The PEAR only needs its own dependencies to run; the environment it will be used in should already have the UIMA dependencies, so that all the UIMA features and PEARs are available.
Provided dependencies won't be copied to the lib folder of the PEAR.
There are a couple of things that could be going on, but going by your error message, the first thing I would determine is whether or not the uimaj-core jar file is being excluded from the build of the PEAR file. It should be. (Take a look at the error message above.) I just ran into this problem myself, and I got around it by adding this to my POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <!-- Copy the dependencies to the lib folder for the PEAR to copy. -->
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <stripVersion>true</stripVersion>
        <outputDirectory>${basedir}/lib</outputDirectory>
        <overWriteReleases>false</overWriteReleases>
        <overWriteSnapshots>true</overWriteSnapshots>
        <includeScope>runtime</includeScope>
        <!-- An exception happens when using a PEAR if the archive includes this jar. -->
        <excludeArtifactIds>uimaj-core</excludeArtifactIds>
      </configuration>
    </execution>
  </executions>
</plugin>
I recently posted a parent POM file and a project-specific POM file to my Gist. You're welcome to take a look at what I'm doing. (One of the things I'm doing is copying over my desc directory, which contains my AE's descriptor XML file, to the root of my project directory, so that the maven-pear plugin can copy it into the PEAR archive properly.)
Parent POM:
https://gist.github.com/software-mariodiana/d46e10fca53dc6e6c0f16e20563476b8
Project-specific POM:
https://gist.github.com/software-mariodiana/e9a0f0f03a49d33dcc32655170fd4841
Good luck!

How to port an Eclipse Java project to another PC and compile it from Shell?

I have created a Java project in Eclipse and successfully executed it directly from Eclipse on my Windows PC. Now I have to run the same Java program on a Linux server.
I tried copying the .class files from my PC to the server and running them, but it didn't work. After that I copied the whole project and ran javac MyProject.java from the shell, and it returned the following errors:
RecordImportBatch.java:2: error: package org.apache.commons.io does not exist
import org.apache.commons.io.FileUtils;
...
RecordImportBatch.java:3: error: package org.neo4j.graphdb does not exist
import org.neo4j.graphdb.RelationshipType;
which I guess are caused by not including the jar files in the compile command.
There are many jar files included in this project, and as a Java newbie I haven't yet found a way to compile from the shell the project that works in Eclipse.
Does anyone know if there is a way to get the appropriate compile command directly from Eclipse and just paste it into the shell, or do I have to include all the jars 'manually'? If the latter, does anyone know how to include all the jars placed in a lib directory located in the same folder as MyProject.java?
Thank you!
If you are just learning Java, this suggestion may be a bit of a challenge, but it would be good for you to use Maven to build your project, which requires reorganizing your source files and directories. Then use the assembly plugin to create a zip that includes all dependencies. To run your program, you just do something like:
unzip myapp.zip
cd myapp
java -cp "lib/*" com.blah.MyApp
(you might need to adjust the syntax of the /* part, using single quotes, or removing quotes depending on your shell)
Here is a snippet for the assembly plugin (general purpose... nothing hardcoded other than version, and the path which follows conventions). This goes in pom.xml:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.2</version>
  <configuration>
    <descriptors>
      <descriptor>src/main/assembly/distribution.xml</descriptor>
    </descriptors>
    <appendAssemblyId>false</appendAssemblyId>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <!-- this is used for inheritance merges -->
      <phase>package</phase>
      <!-- append to the packaging phase. -->
      <goals>
        <goal>single</goal>
        <!-- goals == mojos -->
      </goals>
    </execution>
  </executions>
</plugin>
And here is an example assembly file (this goes in src/main/assembly/distribution.xml relative to pom.xml):
<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
  <id>${artifact.version}</id>
  <formats>
    <format>zip</format>
  </formats>
  <files>
    <file>
      <!-- an example script instead of using "java -cp ..." each time -->
      <source>${project.basedir}/src/main/bin/run.sh</source>
      <outputDirectory>.</outputDirectory>
      <destName>run.sh</destName>
      <fileMode>0754</fileMode>
    </file>
  </files>
  <fileSets>
    <fileSet>
      <directory>${project.basedir}/src/main/resources/</directory>
      <outputDirectory>/res/</outputDirectory>
      <includes>
        <!-- just examples... -->
        <include>*.sql</include>
        <include>*.properties</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>config/</directory>
      <outputDirectory>/config/</outputDirectory>
    </fileSet>
  </fileSets>
  <dependencySets>
    <dependencySet>
      <outputDirectory>/lib</outputDirectory>
      <excludes>
        <!-- add redundant/useless files here -->
      </excludes>
    </dependencySet>
  </dependencySets>
</assembly>
Also, Eclipse has a "jar packager" utility in the GUI, but I found it to be not very good when I used it a few years ago. I don't think it handles dependencies, so you would need to take my "-cp" argument above and add all the jars, or put them in your lib directory yourself.
There is also http://fjep.sourceforge.net/, but I have never used it; I just found it now while quickly looking up the Eclipse jar packager. In its tutorial, the last line (showing how to run it) looks like:
> java -jar demorun_fat.jar
Hello
If what you need to do is to compile and run your program in Eclipse on your PC, and then transfer the compiled result to the Linux machine, then use File -> Export -> Java -> Runnable JAR file and choose the packaging most suitable for you.
The technologically simplest option is "Copy required libraries into a sub-folder next to the jar", but then you need to distribute by zipping the files together and unzipping them on the Linux box.
I would strongly recommend using some kind of build tool; the de facto standards are Ant and Maven, but you can find several alternatives. Both are quite trivial to set up for a smaller project, and using them is also a piece of cake (note that Eclipse can generate a basic Ant build.xml file for you).
For instance, running your whole project can be a single command:
> ant run
Buildfile: build.xml
clean:
compile:
[mkdir] Created dir: ...
[javac] Compiling N source file to ...
run:
[java] Running application...
main:
BUILD SUCCESSFUL
