I have a CMakeLists.txt that builds a Java project with Maven into a .war file when running make, but when I run make install, it rebuilds the project again before copying the .war to the web application's installation folder.
How can I build the Java project only once with make, and not again with make install? Here is the CMakeLists.txt:
add_custom_target(JavaProject ALL
    COMMAND ${MAVEN_EXECUTABLE} package
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    VERBATIM)
install(FILES "${JAVA_PROJECT_TARGET_DIR}/java_project.war"
    DESTINATION ${WAR_DIR})
As the documentation of add_custom_target() says, custom targets are always considered out of date, which means they will re-build with each invocation of make which includes them.
What you want instead is a custom command to produce the .war file:
add_custom_command(
    OUTPUT "${JAVA_PROJECT_TARGET_DIR}/java_project.war"
    COMMAND ${MAVEN_EXECUTABLE} package
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    VERBATIM
)
This tells CMake how the file named "${JAVA_PROJECT_TARGET_DIR}/java_project.war" is produced when someone requests it. For files, CMake can generate dependency checks just fine, so it will not rebuild needlessly. Note that you will probably also want to include some DEPENDS in that add_custom_command(), otherwise it will never rebuild once built.(1)
Then, you need one more thing: a driver for the custom command. That is something that will depend on the command's OUTPUT and actually cause it to be built. So you'll add a custom target:
add_custom_target(
    JavaProject ALL
    DEPENDS "${JAVA_PROJECT_TARGET_DIR}/java_project.war"
)
Then, the sequence will be as follows:
During a make, JavaProject will be considered out of date (since it's a custom target) and will be built. This means its dependencies will be checked for up-to-datedness, and re-built if they're not up to date. That's what the custom command is for. After that, the custom target itself would run its COMMAND, but it doesn't have any, so nothing else happens.
On a subsequent make invocation, JavaProject will again be considered out of date and will thus be built. Its dependencies are checked again, but this time, they're up to date (since the .war already exists). It's therefore not built again. The custom target still has no COMMAND, so nothing further happens.
This "custom target as driver for custom commands" approach is very a idiomatic piece of CMake code, and you will see it in many projects which produce additional files which do not participate in further build steps (such as documentation).
(1) If the list of dependencies is very large, you may want to move it to a separate file and include that. Something like this:
In CMakeLists.txt:
include(files.cmake)
add_custom_command(
    OUTPUT "${JAVA_PROJECT_TARGET_DIR}/java_project.war"
    COMMAND ${MAVEN_EXECUTABLE} package
    DEPENDS ${MyFiles}
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    VERBATIM
)
In files.cmake:
set(MyFiles
    a/file1.java
    a/file2.java
    a/b/file1.java
    a/c/file1.java
    # ... list all files as necessary
)
This keeps the CMakeLists.txt itself readable, while allowing you to explicitly depend on everything you need.
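If maintaining files.cmake by hand becomes tedious, one possible alternative (my own suggestion, not part of the answer above; CONFIGURE_DEPENDS requires CMake 3.12+, globbing has well-known caveats, and the src/main/java layout is just an assumption based on Maven's defaults) is to collect the sources with a glob:
# Hypothetical alternative: glob for the Java sources instead of listing them.
# CONFIGURE_DEPENDS asks the generated build system to re-run the glob
# on each build so that newly added files are noticed.
file(GLOB_RECURSE MyFiles CONFIGURE_DEPENDS
    "${CMAKE_CURRENT_SOURCE_DIR}/src/main/java/*.java")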
Although Angew's answer is excellent, it unfortunately does not work as I expected (i.e., when I update the source folder and run make, it does not build the war again).
Here is the way to achieve what I wanted:
set(CMAKE_SKIP_INSTALL_ALL_DEPENDENCY TRUE)
Then when I run make, it builds, and make install just copies the result to the installation folder.
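For reference, a minimal sketch of the complete arrangement as I understand it (reusing the target and install rule from the question). With the flag set, make still runs Maven every time (Maven itself decides what is up to date), while make install only executes the install rules:
# 'make install' will no longer build the ALL target first
set(CMAKE_SKIP_INSTALL_ALL_DEPENDENCY TRUE)

add_custom_target(JavaProject ALL
    COMMAND ${MAVEN_EXECUTABLE} package
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    VERBATIM)

install(FILES "${JAVA_PROJECT_TARGET_DIR}/java_project.war"
    DESTINATION ${WAR_DIR})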
I know there are a lot of questions that seem similar. I have also spent a few hours getting to grips with Gradle multiprojects. But I still don't understand what the best course of action is here. Incidentally I am using Groovy as my coding language, but explanations referencing Java would be just as good.
I have developed an Eclipse Gradle project, "ProjectA", which in particular has a class, IndexManager, which is responsible for creating and opening and querying Lucene indices.
Now I am developing a new Eclipse Gradle project, "ProjectB", which would like to use the IndexManager class from ProjectA.
That doesn't mean I would like both projects to be part of a multi-project build. I don't want to compile the latest version of ProjectA each time I compile ProjectB - instead I would like ProjectB to depend on a specific version of ProjectA's IndexManager, with the option of upgrading to a new version at some future point. I.e. much like the sorts of dependencies you get from Maven or JCenter...
Both projects have the application plugin, so ProjectA produces an executable .jar file whose name incorporates the version. But currently this contains only the .class files, the resource files, and a file called MANIFEST.MF containing the line "Manifest-Version: 1.0". Obviously it doesn't contain any of the dependencies (e.g. Lucene jar files) needed by the .class files.
The application plugin also lets you produce a runnable distribution: this consists of an executable file (2 in fact, one for *nix/Cygwin, one for Windows), but also all the .jar dependencies needed to run it.
Could someone explain how I might accomplish the task of packaging up this class, IndexManager (or alternatively all the classes in ProjectA possibly), and then including it in my dependencies clause of ProjectB's build.gradle... and then using it in a given file (Groovy or Java) of ProjectB?
Or point to some tutorial about the best course of action?
One possible answer to this which I seem to have found, but find a bit unsatisfactory, appears to be to take the class which is to be used by multiple projects, here IndexManager, and put it in a Gradle project which is specifically designed to be a Groovy library. To this end, you can kick it off by creating the project directory and then:
$ gradle init --type groovy-library
... possible to do from the Cygwin prompt, but not from within Eclipse as far as I know. So you then have to import it into Eclipse. build.gradle in this library project then has to include the dependencies needed by IndexManager, in this case:
compile 'org.apache.lucene:lucene-analyzers-common:6.+'
compile 'org.apache.lucene:lucene-queryparser:6.+'
compile 'org.apache.lucene:lucene-highlighter:6.+'
compile 'commons-io:commons-io:2.6'
compile 'org.apache.poi:poi-ooxml:4.0.0'
compile 'ch.qos.logback:logback-classic:1.2.1'
After this, I ran gradle jar to create the .jar which contains this IndexManager class, initially without any fancy stuff in the manifest (e.g. name, version). And I put this .jar file in a dedicated local directory.
Then I created another Gradle project to use this .jar file, the critical dependency here being
compile files('D:/My Documents/software projects/misc/localJars/XGradleLibExp.jar' )
The file to use this class looks like this:
package core
import XGradleLibExp.IndexManager
class Test {
    public static void main( args ) {
        println "hello xxx"
        Printer printer = new Printer()
        IndexManager im = new IndexManager( printer )
        def result = im.makeIndexFromDbaseTable()
        println "call result $result"
    }
}
class Printer {
    def outPS = new PrintStream(System.out, true, 'UTF-8' )
}
... I had designed IndexManager to use an auxiliary class, which had a property outPS. Groovy duck-typing means you just have to supply anything with such a property and hopefully things work.
The above arrangement didn't run: although you can run build and installDist without errors, the attempt to execute the distributed executable fails, because the above six compile dependency lines are not present in the build.gradle of the "consumer" project. Once you put them in the "consumer" project's build.gradle, it works.
No doubt you can add the version to the generated .jar file, and thus keep older versions for use with "consumer" projects. What I don't understand is how you might harness the mechanism which makes the downloading and use of the dependencies needed by the .jar as automatic as we are used to for things obtained from "real repositories".
PS in the course of my struggles today I seem to have found that Gradle's "maven-publish" plugin is not compatible with Gradle 5.+ (which I'm using). This may or may not be relevant: some people have talked of using a "local Maven repository". I have no idea whether this is the answer to my problem... Await input from an über-Gradle-geek... :)
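(For what it's worth: as far as I know the core maven-publish plugin does work with recent Gradle versions, and publishing to the local Maven repository is the standard way to get the automatic dependency handling asked about above. A minimal sketch for the library project's build.gradle; the group and version coordinates are made up, and the artifactId is assumed to default to the project name, XGradleLibExp:)
apply plugin: 'groovy'
apply plugin: 'maven-publish'

group = 'com.example'   // made-up coordinates
version = '1.0.0'

publishing {
    publications {
        mavenJava(MavenPublication) {
            // publishes the jar together with a POM listing its compile dependencies
            from components.java
        }
    }
}
After running gradle publishToMavenLocal, the consumer project should be able to declare:
repositories {
    mavenLocal()
}
dependencies {
    compile 'com.example:XGradleLibExp:1.0.0'
}
and the transitive dependencies (the Lucene jars etc.) would then be resolved automatically, much as with a "real repository".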
You should be able to update the Eclipse model to reflect this project-to-project dependency. It looks something like this (in ProjectB's build.gradle):
apply plugin: 'eclipse'
eclipse {
    classpath.file.whenMerged {
        entries << new org.gradle.plugins.ide.eclipse.model.ProjectDependency('/ProjectA')
    }
    project.file.whenMerged {
        // add a project reference, which should show up in /ProjectB/.project's <projects> element
    }
}
These changes may apply only to the in-memory model, so they may not actually alter the .classpath and .project files. More info can be found here: https://docs.gradle.org/current/dsl/org.gradle.plugins.ide.eclipse.model.EclipseModel.html
This issue is discussed here: http://gradle.1045684.n5.nabble.com/Gradle-s-Eclipse-DSL-and-resolving-dependencies-to-workspace-projects-td4856525.html and a bug was opened but never resolved here: https://issues.gradle.org/browse/GRADLE-1014
I am unable to compile tests with JUnit. When I attempt to do so, I get this error:
package org.junit.jupiter.api does not exist
I get this error compiling the tests even if I put the .jar in the same directory and compile as follows:
javac -cp junit4-4.12.jar Tests.java
The contents of Tests.java are:
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;
public class Tests {
... several tests ...
It's not clear to me what the issue is, and as far as I can tell, it should work with the .jar -- it's the one from /usr/share/java, where it was installed when I installed junit.
As @DwB has already mentioned, you have the wrong JUnit version.
Here is what "Jupiter" means in JUnit: http://junit.org/junit5/docs/current/user-guide/#overview-what-is-junit-5
In simple words, the JUnit Jupiter API is a set of new classes that were written and introduced in JUnit 5 only, and you are trying to use version 4.
I also want to clarify some points.
even if I put the .jar in the same directory and compile as follows
It does not actually matter whether your file is in the same directory or not; it's all about its path. If you specify the jar only by its file name (as you did), the path is resolved relative to the current directory from which you execute the javac command. You could instead use an absolute path and run the command from any directory you want.
https://docs.oracle.com/javase/8/docs/technotes/tools/windows/classpath.html (this one is for Windows, but for other OSes there are only minor differences in how paths are written)
If you get errors like "package does not exist", ClassNotFoundException or anything similar, such errors almost always mean something is wrong with your classpath or dependencies. In your case, you simply had the wrong version.
Now, about finding the necessary dependencies. In the Java world, one of the main places for dependencies is Maven Central. Almost every open-source library can be found there, and Maven by default uses this repository to find and download dependencies (in your case, jars). You can also use its UI to fetch the necessary jars manually (https://mvnrepository.com/artifact/org.junit.jupiter/junit-jupiter-api/5.0.0); there is a "download jar" button.
If you know the package or class but do not know which dependency (jar, for simplicity) it is located in, you can use http://grepcode.com or other resources that let you search the available source code within different repositories. In most cases this works (with Jupiter I did not manage to find anything there, but in other cases it may help), or, simplest of all, just google the package name, which will usually point you to an entry point.
Now, about solving your issue: it seems you will need the implementation as well as the API. You will definitely need https://mvnrepository.com/artifact/org.junit.jupiter/junit-jupiter-api/5.0.0, but it seems you will need junit-jupiter-engine too. First try adding only the API, then keep adding the necessary libraries according to the errors. You can add multiple jars to the classpath (read the classpath guide from Oracle linked above).
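For example, compiling against the Jupiter API alone should work, since both org.junit.jupiter.api.Test and Assertions live in that jar (the exact jar file name depends on the version you downloaded):
javac -cp junit-jupiter-api-5.0.0.jar Tests.java
Running the tests additionally requires an engine and a launcher; the junit-platform-console-standalone jar is a convenient way to get both in a single jar.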
Is IntelliJ compiling all the time, given that it flags errors with red squiggly lines (in addition to the autocomplete features)? Or is it doing some sort of pseudo-compiling?
If it is doing legit compiling, where does it put these compiled classes? I'd like to point my JRebel to that directory instead of the individual module target folders.
Meo is right, from what I learned when I developed plugins for custom languages, IntelliJ does not compile anything until you explicitly make your project. While you are typing, its lexer/parser detects any invalid token or code construct. In the meantime, it maintains an index of every class and method in your project and its dependencies, along with their signature, etc.
After you stop typing, you'll see a little colored eye in the top part of the right gutter. It indicates that the IDE is running "annotators" and "code inspections". They are able to tell whether or not classes, methods and variable are valid based on the current index and the current state of your file (imports, declarations, etc.). The same goes for unused variables, invalid parameters in method calls, etc.
Pros:
annotators work directly on what they call a PSI tree, which is basically an enhanced AST representing your current file
it may be faster than compiling every time (it uses an index and does not need to recompile every dependent class)
annotators can detect things javac doesn't care about, such as potential bugs (e.g. using = instead of == in a while condition)
Cons:
that's a loooot of work, basically they need to rewrite the logic to find every error that javac can produce (which is why you can find many issues on their bugtracker labelled "good code is red" or "bad code is green", meaning there is a difference between what they detect and what the compiler would output)
TL;DR: it does not produce any .class until you make your project, everything is done "by hand"
For every module, the compiler output path can be found from Paths tab in Module Settings.
JRebel plugin generates rebel.xml automatically and derives the directory path from Module Settings, so you do not need to point to the locations manually - just generate rebel.xml using the IDE plugin: right click on module in the project view -> JRebel -> generate rebel.xml
Just to add, after compilation, the classes are stored in the target directory if it's a Maven project - otherwise, the directory is specified in IntelliJ's Project Structure, in "Project compiler output":
IntelliJ understands the code, it does not need to compile the code to know what is wrong.
I found my .class files by going to the out/production/main folders from the home directory of the project.
I'm trying to create a JNI jar with CMake. For that the following has to be done in the appropriate order:
1. compile .class files
2. generate .h headers
3. build native library
4. jar everything
where
1. is done with add_jar() (I preferred that over a custom command)
2. is done with add_custom_command(TARGET ...)
3. is done with add_library()
4. is done with add_custom_command(TARGET ...) (because the -C option is not supported by add_jar)
How can I ensure that the proper order is followed? I get errors sometimes on the first run.
add_custom_command has PRE_BUILD/POST_BUILD options, but add_jar and add_library do not. The form of add_custom_command that does not take a TARGET argument has a DEPENDS option; should I use that?
Is there a way of telling add_library to wait until the custom command from step 2 has run?
I guess the error is that you're calling add_library with source files which don't yet exist during the first run of CMake?
If so, you can set the GENERATED property on those source files using the set_source_files_properties command. This lets CMake know that it's OK for those files to not exist at configure-time (when CMake runs), but that they will exist at build-time.
To ensure that the add_jar command executes before add_library, create a dependency on the add_jar target using add_dependencies. To ensure that the add_custom_command command executes before add_library, have the custom command use the TARGET ... PRE_BUILD options.
For example, if your list of sources for the lib is held in a variable called ${Srcs}, you can do:
# Allow 'Srcs' to not exist at configure-time
set_source_files_properties(${Srcs} PROPERTIES GENERATED TRUE)
add_library(MyLib ${Srcs})
# compile .class files
add_jar(MyJarTarget ...)
# generate .h headers
add_custom_command(TARGET MyLib PRE_BUILD COMMAND ...)
# Force 'add_jar' to be built before 'MyLib'
add_dependencies(MyLib MyJarTarget)
I'm trying to run a particular JUnit test by hand on a Windows XP command line, which has an unusually high number of elements in the class path. I've tried several variations, such as:
set CLASS_PATH=C:\path\a\b\c;C:\path\e\f\g;....
set CLASS_PATH=%CLASS_PATH%;C:\path2\a\b\c;C:\path2\e\f\g;....
...
C:\apps\jdk1.6.0_07\bin\java.exe -client oracle.jdevimpl.junit.runner.TestRunner com.myco.myClass.MyTest testMethod
(Other variations are setting the classpath all on one line, or passing the classpath via -classpath as an argument to java.) It always comes down to the console throwing up its hands with this error:
The input line is too long.
The syntax of the command is incorrect.
This is a JUnit test testing a rather large existing legacy project, so no suggestions about rearranging my directory structure to something more reasonable, those types of solutions are out for now. I was just trying to gen up a quick test against this project and run it on the command line, and the console is stonewalling me. Help!
The Windows command line is very limiting in this regard. A workaround is to create a "pathing jar". This is a jar containing only a Manifest.mf file, whose Class-Path specifies the disk paths of your long list of jars, etc. Now just add this pathing jar to your command line classpath. This is usually more convenient than packaging the actual resources together.
As I recall, the disk paths can be relative to the pathing jar itself. So the Manifest.mf might look something like this:
Class-Path: this.jar that.jar ../lib/other.jar
If your pathing jar contains mainly foundational resources, then it won't change too frequently, but you will probably still want to generate it somewhere in your build. For example:
<jar destfile="pathing.jar">
<manifest>
<attribute name="Class-Path" value="this.jar that.jar ../lib/other.jar"/>
</manifest>
</jar>
Since Java 6 you can use classpath wildcards.
Example: foo/* refers to all .jar files in the directory foo.
This will not match class files (only jar files). To match both, use foo;foo/* or foo/*;foo. The order determines what is loaded first.
The search is NOT recursive.
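Applied to the command from the question, this might look like the following (assuming, hypothetically, that the jars sit under C:\path\lib and the classes under C:\path\classes):
C:\apps\jdk1.6.0_07\bin\java.exe -client -cp "C:\path\classes;C:\path\lib\*" oracle.jdevimpl.junit.runner.TestRunner com.myco.myClass.MyTest testMethod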
Use An "Argument File" on Java 9+
In Java 9+, the java executable supports providing arguments via a file. See
https://docs.oracle.com/javase/9/tools/java.htm#JSWOR-GUID-4856361B-8BFD-4964-AE84-121F5F6CF111.
This mechanism is explicitly intended to solve the problem of OS limitations on command lengths:
You can shorten or simplify the java command by using @argument files to specify a text file that contains arguments, such as options and class names, passed to the java command. This lets you create java commands of any length on any operating system.
In the command line, use the at sign (@) prefix to identify an argument file that contains java options and class names. When the java command encounters a file beginning with the at sign (@), it expands the contents of that file into an argument list just as they would be specified on the command line.
This is the "right" solution, if you are running version 9 or above. This mechanism simply modifies how the argument is provided to the JVM, and is therefore 100% compatible with any framework or application, regardless of how they do classloading i.e. it is completely equivalent to simply providing the argument on the command line as usual. This is not true for manifest-based workarounds to this OS limitation.
An example of this is:
Original command:
java -cp c:\foo\bar.jar;c:\foo\baz.jar
can be rewritten as:
java @c:\path\to\cparg
where c:\path\to\cparg is a file which contains:
-cp c:\foo\bar.jar;c:\foo\baz.jar
This "argument file" also supports line continuation characters and quoting for properly handling spaces in paths e.g.
-cp "\
c:\foo\bar.jar;\
c:\foo\baz.jar"
Gradle
If you are encountering this issue in Gradle, see this plugin, which converts your classpath automatically into an "argument file" and provides that to the JVM when doing exec or test tasks on Windows. On Linux or other operating systems it does nothing by default, though an optional configuration value can be used to apply the transformation regardless of OS.
https://github.com/redocksoft/classpath-to-file-gradle-plugin
(disclaimer: I am the author)
See also this related Gradle issue -- hopefully this capability will eventually be integrated into Gradle core: https://github.com/gradle/gradle/issues/1989.
(I suppose you do not really mean DOS, but refer to cmd.exe.)
I think it is less a CLASSPATH limitation than an environment size/environment variable size limit. On XP, individual environment variables can be 8k in size, the entire environment is limited to 64k. I can't see you would hit that limit.
There is a limit on windows that restricts the length of a command line, on WindowsNT+ it is 8k for cmd.exe. A set command is subject to that restriction. Can it be you have more than 8k worth of directories in your set command? You may be out of luck, then - even if you split them up like Nick Berardi suggested.
Thanks to Raman for introducing a new solution to the pathing problem for Java 9+. I made a hack to the bootRun task that allows using everything already evaluated by Gradle to run java with argument files. Not very elegant, but working.
// Fix long path problem on Windows by utilizing java Command-Line Argument Files
// https://docs.oracle.com/javase/9/tools/java.htm#JSWOR-GUID-4856361B-8BFD-4964-AE84-121F5F6CF111
// The task creates the command-line argument file with classpath
// Then we specify the args parameter with path to command-line argument file and main class
// Then we clear classpath and main parameters
// As args are applied after the classpath and main class, the last step
// is done to cheat the Gradle plugin: we skip classpath and main and
// apply them manually through args
// Hopefully at some point gradle will do this automatically
// https://github.com/gradle/gradle/issues/1989
import org.apache.tools.ant.taskdefs.condition.Os // Ant's Os helper, bundled with Gradle

if (Os.isFamily(Os.FAMILY_WINDOWS)) {
    bootRun {
        doFirst {
            def argumentFilePath = "build/javaArguments.txt"
            def argumentFile = project.file(argumentFilePath)
            def writer = argumentFile.newPrintWriter()
            writer.print('-cp ')
            writer.println(classpath.join(';'))
            writer.close()
            args = ["@${argumentFile.absolutePath}", main]
            classpath = project.files()
            main = ''
        }
    }
}
If I were in your shoes, I would download the junction utility from MS: http://technet.microsoft.com/en-us/sysinternals/bb896768.aspx and then map your "C:\path" to, say, "z:\" and "c:\path2" to, say, "y:\". This way, you will be reducing 4 characters per item in your classpath.
set CLASS_PATH=C:\path\a\b\c;C:\path\e\f\g;
set CLASS_PATH=%CLASS_PATH%;C:\path2\a\b\c;C:\path2\e\f\g;
Now, your classpath will be:
set CLASS_PATH=z:\a\b\c;z:\e\f\g;
set CLASS_PATH=%CLASS_PATH%;y:\a\b\c;y:\e\f\g;
It might do more depending on your actual classpath.
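Alternatively, the built-in subst command gives a similar drive-letter mapping without downloading anything; a sketch (the drive letters are arbitrary):
rem map the long prefixes to virtual drives
subst z: C:\path
subst y: C:\path2
rem ... run the tests with the shortened classpath ...
rem remove the mappings afterwards
subst z: /d
subst y: /d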
I think you are up the creek without a paddle here. The command line has a limit on the arguments used to call a program. I have two suggestions you could try.
First, prior to running the JUnit tests, you could let a script/Ant task create JARs of the various classes on the classpath. Then you can put the JARs on the classpath, which should be shorter.
Another way you could try is to create an Ant script to run JUnit; in Ant there should not be such a limit on classpath entries (a sketch follows below).
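A rough sketch of such an Ant target (the target name and paths are made up, and this takes the answer's premise about Ant at face value):
<target name="run-test">
    <junit printsummary="yes" fork="true">
        <classpath>
            <!-- the path is assembled from filesets instead of one huge literal -->
            <fileset dir="C:\path" includes="**/*.jar"/>
            <fileset dir="C:\path2" includes="**/*.jar"/>
            <pathelement location="build\classes"/>
        </classpath>
        <test name="com.myco.myClass.MyTest"/>
    </junit>
</target>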
As HuibertGill mentions, I would wrap this in an Ant build script just so that you don't have to manage all of this yourself.
You could try this:
@echo off
set A=D:\jdk1.6.0_23\bin
set B=C:\Documents and Settings\674205\Desktop\JavaProj
set PATH="%PATH%;%A%;"
set CLASSPATH="%CLASSPATH%;%B%;"
Go to a command prompt and run it twice (no idea why; I have to do so on a Windows XP machine).
Also, the paths are set only for the current command prompt session.
There was no solution to the issue other than somehow making the classpath shorter by moving the jar files into a folder like "C:\jars".
Fix for the Windows Gradle long classpath issue. Fixes JavaExec tasks that error out with the message "CreateProcess error=206, The filename or extension is too long".
Using the plugins DSL:
plugins {
    id "com.github.ManifestClasspath" version "0.1.0-RELEASE"
}
Using legacy plugin application:
buildscript {
    repositories {
        maven {
            url "https://plugins.gradle.org/m2/"
        }
    }
    dependencies {
        classpath "gradle.plugin.com.github.viswaramamoorthy:gradle-util-plugins:0.1.0-RELEASE"
    }
}
apply plugin: "com.github.ManifestClasspath"
I had a similar issue here with a giant classpath definition inside a .bat file.
The problem was that this classpath also included the execution path as part of the giant path, which is OK and makes sense.
In this context, the software was not able to run, and the message "The input line is too long" appeared every time.
Solution: I just moved all the files to a shorter location.
For instance, I was trying to execute the software in a directory tree like:
c:\softwares\testing\testing_solution\one
and I moved the whole structure to a point like this:
c:\test
The software worked very well.
It is not the best option, I know, but it might help someone who is looking for a fast solution.
Thanks
Have you tried stacking them?
set CLASS_PATH=c:\path
set ALT_A=%CLASS_PATH%\a\b\c;
set ALT_B=%CLASS_PATH%\e\f\g;
...
set ALL_PATHS=%CLASS_PATH%;%ALT_A%;%ALT_B%