As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 9 years ago.
I am looking for something like a compiler directive in C or C++, or something like it, for Java. Why, you ask?
I have a body of Java code that was originally for one project. Then a second project's code was based on the first: a copy of the first with many changes, but the same structure as far as package names, contents, etc. The class internals, however, were different.
Now there is a third project, something like the first and the second mixed together.
How do I structure the code to have one code base/git repo that contains all three projects' code? And how do I determine which code path is to be run at runtime: project 1, project 2, or project 3? Compiler directives are used for this in C/C++. What about Java?
There are no compiler directives (such as #ifdef) in Java.
Abstract what is common and create a single library (or multiple libraries if needed) that provides the base functionality that is being reused. Then refactor all the projects to use (i.e. extend or compose) classes from the library. This is what OO programming is all about.
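A sketch of what that refactoring might look like (the class names here are invented for illustration, not from the question): the shared pipeline lives in one library class, and each project extends it, overriding only what differs.

```java
// Hypothetical library class holding behaviour common to all three
// projects; each project extends it and overrides only its own parts.
abstract class ReportGenerator {
    // Common pipeline shared by every project.
    public final String generate(String data) {
        return header() + data.toUpperCase() + footer();
    }
    protected String header() { return "[REPORT] "; } // shared default
    protected abstract String footer();               // project-specific

}

// Project 1 keeps the default header.
class ProjectOneReport extends ReportGenerator {
    @Override protected String footer() { return " (v1)"; }
}

// Project 2 customises both hooks.
class ProjectTwoReport extends ReportGenerator {
    @Override protected String header() { return "[P2] "; }
    @Override protected String footer() { return " (v2)"; }
}
```

Which subclass runs then becomes an ordinary runtime decision (a factory, a property, dependency injection) rather than a compile-time switch.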
I am using an Ant build script to build my Java project. I have a build.properties that contains the various values a variable can have. My build.xml implements specific build targets like debug, release, and test. Using different Ant targets (ant debug, ant release), I can produce different versions of the jar from the same code base. This works nicely for build-time variables based on Ant targets.
build.properties
# Build Parameters
# Turn debugging on or off
config.debug_true=true
config.debug_false=false
Config.java
Define this Java class somewhere in your project. I put it in the /config folder under the project directory.
package com.stackoverflow.ajitk;
public class Config {
// Build for DEBUG or RELEASE
public final static boolean DEBUG = #CONFIG.DEBUG#;
}
build.xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="ABC" default="release">
<property file="build.properties" />
...
<macrodef name="copy-release-parameters">
<sequential>
<property name="config-target-path" value="gen/com/stackoverflow/ajitk"/>
<!-- Copy the configuration file, replacing tokens in the file. -->
<copy file="config/Config.java" todir="${config-target-path}"
overwrite="true" encoding="utf-8">
<filterset begintoken="#" endtoken="#">
<filter token="CONFIG.DEBUG" value="${config.debug_false}"/>
<!-- ... add more variables as needed by your project ... -->
</filterset>
</copy>
</sequential>
</macrodef>
...
<!-- the above macro should be used in your release/debug target -->
...
</project>
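For completeness, a sketch of how the generated flag is typically consumed (DEBUG is inlined here so the snippet compiles on its own; in the project above it would read com.stackoverflow.ajitk.Config.DEBUG). Because the field is a compile-time constant, javac removes the guarded block entirely in a release build, which is the closest Java gets to C's #ifdef.

```java
// Illustrative consumer of the generated flag; DebugLog is a made-up name.
public class DebugLog {
    public static final boolean DEBUG = false; // stands in for Config.DEBUG

    public static String log(String message) {
        if (DEBUG) {
            return "DEBUG: " + message; // stripped by javac when DEBUG is false
        }
        return message;
    }
}
```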
Related
Admittedly, this doesn't sound like a best practice altogether, but let me explain. During the build, we need to paste the build number and the system version into a class whose sole purpose is to contain these values and make them accessible.
Our first idea was to use system properties, but due to the volatility of the deployment environment (another way of saying "the sysadmins are doing weird unholy creepy things") we would like to have them hard-coded.
Essentially I see four possibilities to achieve this in Ant:
use <replace> on a token in the class
The problem with this approach is that the file is changed, so you have to replace the token back after compilation with a <replaceregexp>... so ugly; I don't want to touch source code with regexes. Plus temporal dependencies.
copy the file, make replace on the copy, compile copy, delete copy
One has to mind the sequence: the original class has to be compiled first in order to be overwritten by the copy. Temporal dependencies are ugly too.
copy the file, replace the token on the original, compile, replace the stained original with the copy
Same temporal dependency issue unless embedded in the compile target. Which is ugly too, because all our build files use the same imported compile target.
create the file from scratch in the build script / store the file outside the source path
Is an improvement over the first three as there are no temporal dependencies, but the compiler/IDE is very unhappy as it is oblivious of the class. The red markers are disturbingly ugly.
What are your thoughts on the alternatives?
Are there any best practices for this?
I sure hope I have missed a perfectly sane approach.
Thank you
EDIT
We ended up using the manifest to store the build number and system version in the Implementation-Version attribute, reading it with MyClass.class.getPackage().getImplementationVersion(). I found this solution in one of the answers to the thread posted in the comment by andersoj.
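A minimal sketch of the manifest approach described above (the class name VersionInfo and the "development" fallback are illustrative): getImplementationVersion() returns the Implementation-Version manifest attribute only when the class is loaded from a jar that declares it, and null otherwise, so a fallback is sensible.

```java
// Reads the version from the jar manifest; names are illustrative.
public class VersionInfo {
    public static String version() {
        Package pkg = VersionInfo.class.getPackage();
        // getPackage() can be null (default package); the attribute can
        // be null too (not running from a jar with a manifest entry).
        String v = (pkg != null) ? pkg.getImplementationVersion() : null;
        return (v != null) ? v : "development";
    }
}
```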
I think a simpler approach would be to have your Version.java class read from a simple .properties file included in the JAR, and just generate this .properties file at build-time in the Ant build. For example just generate:
build.number = 142
build.timestamp = 5/12/2011 12:31
The built-in <buildnumber> task in Ant does half of this already (see the second example).
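The suggested approach can be sketched as follows. To keep the example self-contained, the generated file's content is fed in through a Reader; in a real build the .properties file would be packaged into the JAR and loaded with Version.class.getResourceAsStream("/build.properties"). Class and key names are assumptions for illustration.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.UncheckedIOException;
import java.util.Properties;

// Illustrative Version class reading the build-time generated properties.
public class Version {
    private final Properties props = new Properties();

    // In a real build: new Version(new InputStreamReader(
    //     Version.class.getResourceAsStream("/build.properties")))
    public Version(Reader generatedProperties) {
        try {
            props.load(generatedProperties);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public String buildNumber()    { return props.getProperty("build.number"); }
    public String buildTimestamp() { return props.getProperty("build.timestamp"); }
}
```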
#2 is generally the way I've seen it done, except that your not-ready-to-compile sources should be in a separate place from your ready-to-compile sources. This avoids the temporal issues you mention, as each file should only be compiled once.
This is a common pattern that shows up all the time in software build processes.
The pattern being:
Generate source from some resource and then compile it.
This applies to many things from filtering sources before compilation to generating interface stubs for RMI, CORBA, Web Services, etc...
Copy the source to a designated 'generated sources' location and do the token replacement on the copies files to generate sources, then compile the generated sources to your compiled classes destination.
The order of compilation will depend on whether or not your other sources depend on the generated sources.
My solution would be to:
use <replace> on a token in the class:
<replace dir="${source.dir}" includes="**/BuildInfo.*" summary="yes">
<replacefilter token="{{BUILD}}" value="${build}" />
<replacefilter token="{{BUILDDATE}}" value="${builddate}" />
</replace>
This replacement should only take place in the build steps performed by your build system, never within a compile/debug session inside an IDE.
The build system setup should not submit changed source code back to the source repository anyway, so the problem of changed code does not exist with this approach.
In my experience it does not help when you place the build information in a property file, as administrators tend to keep property files while upgrading - replacing the property file that came out of the install. (Build information in a property file is informational to us. It gives an opportunity to check during startup if the property file is in synch with the code version.)
I remember we used the fourth approach in a slightly different way. You can pass the release number to the Ant script while creating a release. The Ant script should include it in the release (a config/properties file), and your class should read it from there, perhaps using a properties file or config file.
I always recommend to create some sort of directory and put all built code there. Don't touch the directories you checked out. I usually create a target directory and place all files modified and built there.
If there aren't too many *.java files (or *.cpp files), copy them to target/source and compile there. You can use the <copy> task with a <filterset> to modify the one file with the build number as you copy it.
<javac srcdir="${target.dir}/source"
destdir="${target.dir}/classes"
[yadda, yadda, yadda]
</javac>
This way, you're making no modification in the checked out source directory, so no one will accidentally check in the changes. Plus, you can do a clean by simply deleting the target directory.
If there are thousands, if not millions, of *.java files, then you can copy the templates to target/source and then compile the source in both ${basedir}/source and target/source. That way, you're still not mucking up the checked-out code and leaving a chance that someone will accidentally check in a modified version. And you can still do a clean by simply removing target.
I was looking for a solution to the same problem. Reading this link: http://ant.apache.org/manual/Tasks/propertyfile.html I was able to find the solution.
I work with netbeans, so I just need to add this piece of code to my build.xml
<target name="-post-init">
<property name="header" value="##Generated file - do not modify!"/>
<propertyfile file="${src.dir}/version.prop" comment="${header}">
<entry key="product.build.major" type="int" value="1" />
<entry key="product.build.minor" type="int" default="0" operation="+" />
<entry key="product.build.date" type="date" value="now" />
</propertyfile>
</target>
This will increment the minor version each time you compile the project with clean and build, so you are safe to run the project any time; the minor version will stay still.
And I just need to read the file at runtime. I hope this helps.
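Reading the generated version.prop at runtime could look like the following sketch. The file content is inlined here for illustration; a real application would load it from the classpath, e.g. with getClass().getResourceAsStream("/version.prop"). The class name is made up.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.UncheckedIOException;
import java.util.Properties;

// Turns the generated version.prop entries into a "major.minor" string.
public class ProductVersion {
    public static String describe(Reader versionProp) {
        Properties p = new Properties();
        try {
            p.load(versionProp);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return p.getProperty("product.build.major", "0") + "."
             + p.getProperty("product.build.minor", "0");
    }
}
```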
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 6 years ago.
I'm looking for a guide on how to create new maven archetypes that involve using parameters to create directories and file names where the parameters are used as prefixes in the file names and part of the package structure/directories that are created by the archetype.
All I can find is very simple instructions on how to make very simple projects.
why not use the maven website itself?
http://maven.apache.org/guides/mini/guide-creating-archetypes.html
I think I found a site that helps me a little more, once I figured out that Velocity templates are used as part of the archetype process.
HowToCreateMavenArchetypeFromProject
I think we met the same problem; I finally resolved it with enlightenment from this article.
Assuming you want to generate the package com.company.base.dal.dao: we call com.company.base the root package and dal.dao the core package. Then just put your template Java file under src/main/java/dal/dao, and tweak packaged="true" in META-INF/maven/archetype-metadata.xml as below:
<fileSet filtered="true" packaged="true" encoding="GBK"><!--packaged="true" tells maven to copy the core package in to root package while creating a project.-->
<directory>src/main/java</directory>
<includes>
<include>**/*.java</include>
</includes>
</fileSet>
Say, if you set -Dpackage=com.alibaba.china when executing mvn archetype:generate, it will create the Java source package as com/alibaba/china/dal/dao/sample.java.
As for how to give the prefix package information in sample.java, just use ${package} as below:
package ${package}.dal.dao;
The archetype plugin will replace ${package} with the -Dpackage value, treating the file as a Velocity template.
I am looking for the same thing.
I too find the Maven home page very basic; it does not fulfill our requirement.
I even tried HowToCreateMavenArchetypeFromProject
But I still have one question: how do you invoke a specific file of the archetype when a project of that archetype type is generated? To be specific, if I have a Java class App.java, how do I invoke it while creating the project?
What happens with create-archetype-from-project is that it creates a project with the same files and directory structure, and we need to run it again to get that in action.
Also, ${groupId} and such only work in Maven files. What if I have a property file and want to use the artifactId the user has entered?
Do let me know if you come across any good stuff. Meanwhile, I will also let you know when I find any solution.
Thanks
From the Maven documentation: http://maven.apache.org/archetype/archetype-models/archetype-descriptor/archetype-descriptor.html
A fileset defines the way the project's files located in the jar file are used by the Archetype Plugin to generate a project. If a file or directory name contains a property pattern, it is replaced with the corresponding property value.
If you name the file as
__artifactId___local.properties
... then when you generate the project with artifactId=foo, it will contain this file:
foo_local.properties
Here is a short but very helpful description:
http://javajeedevelopment.blogspot.fr/2012/05/how-to-create-maven-archetype.html
Recently I started using Eclipse's Java compiler, because it is significantly faster than the standard javac. I was told it's faster because it performs incremental compilation. But I'm still a bit unsure about this, since I can't find any authoritative documentation about either compiler's (Eclipse's or Sun's) "incremental" feature. Is it true that Sun's compiler always compiles every source file, while Eclipse's compiler compiles only changed files and those affected by such a change?
Edit: I'm not using Eclipse autobuild feature but instead I'm setting
-Dbuild.compiler=org.eclipse.jdt.core.JDTCompilerAdapter
for my ant builds.
Is it true that Sun's compiler always compiles every source file and Eclipse's compiler compile only changed files and those that are affected by such a change?
I believe that you are correct on both counts.
You can of course force Eclipse to recompile everything.
But the other part of the equation is that Java build tools like Ant and Maven are capable of only compiling classes that have changed, and their tree of dependent classes.
EDIT
In Ant, incremental compilation can be done in two ways:
By default the <javac> task compares the timestamps of .java and corresponding .class files, and only tells the Java compiler to recompile source (.java) files that are newer than their corresponding target (.class) files, or that don't have a target file at all.
The <depend> task also takes into account dependencies between classes, which it determines by reading and analysing the dependency information embedded in the .class files. Having determined which .class files are out of date, the <depend> task deletes them so a following <javac> task will recompile them. However, this is not entirely fool-proof. For example, extensive changes to the source code can lead to the <depend> task analysing stale dependencies. Also, certain kinds of dependency (e.g. on static constants) are not apparent in the .class file format.
To understand why Ant <depend> is not fool-proof, read the "Limitations" section of the documentation.
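The static-constant limitation is easy to see in a small example (class names invented for illustration): javac inlines compile-time constants into the bytecode of every class that uses them, so the using class's .class file carries no reference for <depend> to find.

```java
// Limits.MAX is a compile-time constant, so javac folds the whole
// expression below into the literal 20 inside Client.class. Client's
// bytecode records no dependency on Limits at all, which is why Ant's
// <depend> task cannot know Client needs recompiling when MAX changes.
class Limits {
    static final int MAX = 10;
}

class Client {
    static int capacity() {
        return Limits.MAX * 2; // compiled as "return 20"
    }
}
```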
Javac only compiles source files that are either named on the command line or are dependencies and are out of date. Eclipse may have a finer-grained way of deciding what that means.
Eclipse certainly does this. It also does it at save time if you have that option turned on (and it is by default). It looks like Sun's compiler doesn't do this. (It is very easy to test: just make a small project where A is the main class that uses class B, but B doesn't use class A. Then change A, compile the project again, and see if the timestamp for B.class has changed.)
This is the way many compilers work (gcc, for instance). You can use tools like Ant and Make to compile only the parts of the project that have changed. Also note that these tools aren't perfect; sometimes Eclipse just loses track of the changes and you'll need to do a full rebuild.
Restating what I've heard here and phrasing it for lazy folks like me:
You can achieve incremental builds with the javac task in Ant, but you should use the depend task to clear out the .class files for the .java files you modified, AND you must not leave the includes statement unspecified in the javac task. (Specifying just the src path in the javac task and leaving includes unspecified causes javac to recompile all the sources it finds.)
Here are my depend and javac tasks. With the standard Oracle Java compiler, only the .java files I modify are compiled. Hope this helps!
<depend srcdir="JavaSource" destdir="${target.classes}" cache="${dependencies.dir}" closure="yes">
<classpath refid="compiler.classpath" />
<include name="**/*.java"/>
</depend>
<javac destdir="${target.classes}" debug="true" debuglevel="${debug.features}" optimize="${optimize.flag}" fork="yes" deprecation="no" source="1.6" target="1.6" encoding="UTF-8" includeantruntime="no">
<classpath refid="compiler.classpath"/>
<src path="JavaSource"/>
<include name="**/*.java" /> <!-- This enables the incremental build -->
</javac>
Closed. This question is opinion-based. It is not currently accepting answers.
Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.
Closed 4 years ago.
When building a suite of unit tests for Java code, is there a convention for where to place test code in relation to source code?
For example, if I have a directory /java that contains a bunch of .java source files, is it better to put the test cases in /java itself or to use something like /java/test?
If the latter is preferred, how do you test the internals of the code when the private/protected members of a class aren't available outside the package?
I recommend following the Apache Software Foundation's standard directory structure, which yields this:
module/
src/
main/
java/
test/
java/
This keeps tests separate from source, but at the same level in the directory structure. If you read through how Apache defines their structure, you'll see it helps partition other concerns out as well, including resources, config files, other languages, etc.
This structure also allows unit tests to test package and protected level methods of the units under test, assuming you place your test cases in the same package as what they test. Regarding testing private methods - I would not bother. Something else, either public, package, or protected calls them and you should be able to get full test coverage testing those things.
By the way, the link above is to Maven, Apache's standard build tool. Every Java project they have conforms to this standard, as well as every project I have encountered that is built with Maven.
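As a small illustrative sketch (plain assertions instead of JUnit, to keep it self-contained; the class names are made up): because src/main/java/com/foo and src/test/java/com/foo compile into the same package, the test can call package-private members directly.

```java
// In a Maven layout these two classes would live in
// src/main/java/com/foo/Calculator.java and
// src/test/java/com/foo/CalculatorTest.java: same package,
// different source roots.
class Calculator {
    int add(int a, int b) { // package-private: invisible outside the package
        return a + b;
    }
}

class CalculatorTest {
    static boolean addWorks() {
        // Legal only because the test shares the production class's package.
        return new Calculator().add(2, 3) == 5;
    }
}
```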
You can put the tests in the same package as the original classes, even if the source code is under its own directory root:
PROJECT_ROOT
+--- src/
+----test/
You can declare a class com.foo.MyClass under src and its test com.foo.MyClassTest under test.
As for access to private members, you can use reflection to invoke the methods (altering their accessibility via Class.getDeclaredMethod.setAccessible), or you could use something like testng/junit5 to put some annotation-driven tests on the source code itself (I personally think this is a bad idea).
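An illustrative sketch of the reflection route (names are made up): getDeclaredMethod finds the private method, setAccessible(true) suspends the access check, and invoke calls it. Note that setAccessible can fail under a security manager or across module boundaries on modern JDKs.

```java
import java.lang.reflect.Method;

// A class with a private method we want to exercise from a test.
class Secretive {
    private String whisper(String s) {
        return "(" + s + ")";
    }
}

class ReflectiveAccess {
    static String callWhisper(String input) {
        try {
            Method m = Secretive.class.getDeclaredMethod("whisper", String.class);
            m.setAccessible(true); // bypass the private access check
            return (String) m.invoke(new Secretive(), input);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(e);
        }
    }
}
```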
Why not check out some projects on java.net to see how they've organized things, for example swinglabs (the SVN repository is pretty slow I'm afraid)?
Most of the time it is done like this:
<SOME_DIR>/project/src/com/foo/Application.java
<SOME_DIR>/project/test/com/foo/ApplicationTest.java
So, you keep them separated and you can still test the package/protected functionality because the test is in the same package.
You can't test private stuff unless it is declared inside the class itself.
Upon delivery, you just package the .class files generated from src, not the tests.
Actually, it makes a lot of sense to separate your production and test projects into two separate entities but keep the same package structure in both projects.
So if I have a project 'my-project' I also create 'my-project-test', so I have the following directory structure:
my-project
+--- src/com/foo
my-project-test
+---test/com/foo
This approach ensures that test code dependencies do not pollute production code.
In my personal opinion, package private and protected methods should be tested as well as public methods. Hence I want my test classes in the same package as production classes.
This is how we have it set up and we like it.
build/
src/
test/build/
test/src/
All testing code compiles into its own build directory. This is because we don't want production to contain test classes by mistake.
When creating a Java library module in Android Studio it creates a default class under:
[module]
+ src/main/java/[com/foo/bar]
If you look into the [module].iml file, you will find that path as well as the path for tests, which you can utilize. Below is a summary:
<module>
<component>
<content>
<sourceFolder url="file://$MODULE_DIR$/src/main/java" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/resources" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/test/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/test/resources" type="java-test-resource" />
</content>
</component>
</module>
What you can do in particular is to create a directory for tests to have the following structure:
[module]
+ src/main/java/[com/foo/bar]
+ src/test/java/[com/foo/bar]
The above structure will be recognized by Android Studio and your files underneath will be included into the module.
I assume that structure is the recommended layout for code and tests.
Closed 11 years ago.
Every time I create a new project, I copy the last project's Ant file to the new one and make the appropriate changes (trying at the same time to make it more flexible for the next project). But since I didn't really think about it at the beginning, the file has started to look really ugly.
Do you have an Ant template that can be easily ported in a new project? Any tips/sites for making one?
Thank you.
An alternative to making a template is to evolve one by gradually generalising your current project's Ant script so that there are fewer changes to make the next time you copy it for use on a new project. There are several things you can do.
Use ${ant.project.name} in file names, so you only have to mention your application name in the project element. For example, if you generate myapp.jar:
<project name="myapp">
...
<target name="jar">
...
<jar jarfile="${ant.project.name}.jar" ...
Structure your source directory structure so that you can package your build by copying whole directories, rather than naming individual files. For example, if you are copying JAR files to a web application archive, do something like:
<copy todir="${war}/WEB-INF/lib" flatten="true">
<fileset dir="lib" includes="**/*.jar"/>
</copy>
Use properties files for machine-specific and project-specific build file properties.
<!-- Machine-specific property over-rides -->
<property file="/etc/ant/build.properties" />
<!-- Project-specific property over-rides -->
<property file="build.properties" />
<!-- Default property values, used if not specified in properties files -->
<property name="jboss.home" value="/usr/share/jboss" />
...
Note that Ant properties cannot be changed once set, so you override a value by defining a new value before the default value.
You can give http://import-ant.sourceforge.net/ a try.
It is a set of build file snippets that can be used to create simple custom build files.
I had the same problem, generalized my templates, and grew them into their own project: Antiplate. Maybe it's also useful for you.
If you are working on several projects with similar directory structures and want to stick with Ant instead of going to Maven use the Import task. It allows you to have the project build files just import the template and define any variables (classpath, dependencies, ...) and have all the real build script off in the imported template. It even allows overriding of the tasks in the template which allows you to put in project specific pre or post target hooks.
I used to do exactly the same thing.... then I switched to maven. Maven relies on a simple xml file to configure your build and a simple repository to manage your build's dependencies (rather than checking these dependencies into your source control system with your code).
One feature I really like is how easy it is to version your jars - easily keeping previous versions available for legacy users of your library. This also works to your benefit when you want to upgrade a library you use - like junit. These dependencies are stored as separate files (with their version info) in your maven repository so old versions of your code always have their specific dependencies available.
It's a better Ant.
I used to do exactly the same thing.... then I switched to maven.
Oh, it's Maven 2. I was afraid that someone was still seriously using Maven nowadays. Leaving the jokes aside: if you decide to switch to Maven 2, you have to take care while looking for information, because Maven 2 is a complete reimplementation of Maven, with some fundamental design decisions changed. Unfortunately, they didn't change the name, which has been a great source of confusion in the past (and still sometimes is, given the "memory" nature of the web).
Another thing you can do if you want to stay in the Ant spirit, is to use Ivy to manage your dependencies.
One thing to look at -- if you're using Eclipse, check out the ant4eclipse tasks. I use a single build script that asks for the details set up in eclipse (source dirs, build path including dependency projects, build order, etc).
This allows you to manage dependencies in one place (Eclipse) and still be able to use a command-line build for automation.