Building a JavaFX project from a directory on a networked drive is surprisingly slow. I believe most of the delay is caused by the build script deleting and re-creating the entire /lib directory. This includes over 20 MB of jar files that remain unchanged for my project.
How do I modify the Ant build task so that this lib folder does not get re-created every time I build the project? What else can be done to reduce the build time?
Another reason for the slow build time over my network is that the project always gets run from the /dist folder. A standard Java SE project can run from the build directory, removing the need to create a new jar file in /dist every time the project runs.
Is there a way to run the project from the .class files in the build directory instead of needing to run from /dist?
Here are the NetBeans-generated build files:
jfx-impl.xml
build-impl
It would be helpful to see more of your build file; it is possible that the target dependency graph of whichever target you're running could be changed. Put another way, make a target that doesn't delete lib/.
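As a rough illustration of that idea (the target and property names below are assumptions, not taken from the actual NetBeans-generated files), such a target could refresh the jars without deleting the folder first; Ant's copy task only copies files that are newer than their destination, so unchanged jars are left alone:

<target name="-copy-libs-if-changed">
    <!-- create dist/lib if missing, then copy only jars that changed -->
    <mkdir dir="${dist.dir}/lib"/>
    <copy todir="${dist.dir}/lib" preservelastmodified="true">
        <fileset dir="lib" includes="**/*.jar"/>
    </copy>
</target>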
A much larger and cooler solution would be to use Ivy to download these libs once to ~/.ivy, where they'd be cached and wouldn't need to be fetched every time you check out. This would also allow you to shed those binaries from source control.
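As a sketch of the Ivy approach (the module name and the dependency shown are placeholders), the jars would be declared once in an ivy.xml and resolved into Ivy's local cache instead of being kept under source control:

<ivy-module version="2.0">
    <info organisation="com.example" module="myfxapp"/>
    <dependencies>
        <!-- one entry per third-party jar currently checked into /lib -->
        <dependency org="org.apache.commons" name="commons-lang3" rev="3.3.2"/>
    </dependencies>
</ivy-module>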
What reasons prevent you from building locally?
The suggestions below are intended only for your development work, based on your description of slow build speeds in your environment. In general, if NetBeans JavaFX project development builds are already quick enough, these settings should not be used. For packaging production applications you will want to use different settings.
How to get a quick build for JavaFX under NetBeans
Invest in a solid state drive.
Follow thekbb's suggestion of having the library files local to your machine.
Use NetBeans 7.4 + Java 8 and create a standard Java project rather than a JavaFX project.
Under Project Properties | Libraries uncheck:
Build projects on Classpath
Under Project Properties | Build | Packaging uncheck:
Compress JAR file
Build JAR after Compiling
Copy Dependent Libraries
Under Project Properties | Build | Deployment uncheck:
Enable Native Packaging Options in Project Menu
Keep JavaFX RT Artifacts on Compile Classpath if not present by default.
Under Project Properties | Application | Web Start uncheck:
Enable Web Start
If you end up being unable to solve your build performance issues using NetBeans, you might want to try IntelliJ IDEA (I have found it quite efficient at building JavaFX projects).
My Build Experiences with NetBeans
A standard NetBeans JavaFX project builds and runs very quickly for me (no more than a second or two), even without applying most of the build speed suggestions above. Projects that build in seconds reference over 75 libraries totaling more than 55 MB of data. However, that build timing is with local libraries, not libraries stored on a network, and on a MacBook Air (which has an SSD).
If a project is signed, the signing process takes a few seconds per library jar.
Related
When trying to build a NetBeans project created with a previous version, I get the following confirmation dialog:
Build Project
The project ... uses build.properties from another NetBeans installation.
Build Anyway
Use this installation
Update
What would these options do?
No matter which options I choose, I do not notice any difference in the build process.
Using NetBeans Development with projects created on NetBeans 8.1.
It depends on what type of dependencies you are using in your project. To be on the safe side, I'd prefer clicking Update, which will update the current project's build properties with the external ones.
Also, here's what wiki.netbeans.org says about the build.properties file:
"If you edit build.xml by hand, you can of course arrange to build other projects (or any Ant scripts) as part of your build, using the or tasks. Note that a build-impl.xml, when building a foreign project, calls its build.xml (rather than skipping to its build-impl.xml), so you can freely mix a hand-customized project with IDE-customized projects."
thanks
I have a problem with the new gdx-setup.jar. When you generate a new project, after choosing everything, gdx generates the whole project for you; it will even configure the Build Path - and that is where my problems start.
So gdx creates a path to its libraries, something like 'C:/users/user/28193721946210/.gradle/ ...' and so on.
Now I want to work on this project on another machine - I share the project between computers with Bitbucket. I tried to unpin these dependencies from the build path and just import my own libraries instead, but I can't do much here. Only Gradle has any power here, but in fact there isn't any such path in build.gradle or anywhere else - I can't make it use my own dependencies instead of the generated ones.
it will even configure Build Path ... creating a path to its libraries something like 'C:/users/user/28193721946210/.gradle/
This is correct, as Eclipse or any other IDE needs to know where the files it needs are located on the actual, local machine.
However, these kinds of platform-specific settings are not static. They also should not be checked into any shared version control system.
If you are using git, gdx-setup will conveniently generate a handy .gitignore file that puts all Eclipse-specific config files on ignore, for example the .project and .classpath files.
LibGDX projects these days are built via Gradle. That's why your IDE needs a Gradle plugin to be able to read Gradle configuration files, most importantly build.gradle, which defines the dependencies for your projects.
When importing those projects into your IDE, the IDE will parse the config files, work out which dependencies it needs, download them to your local machine and then set up a local build path to those dependencies. However, this IDE configuration can change when you trigger an update. In Eclipse you do this, for example, via a right-click on the project --> Gradle --> Refresh Dependencies.
Furthermore, it will setup the correct local paths on any machine you want to develop your game on.
I have a Java web project that uses the Maven standard directory layout: Java files go into java (actually /src/main/java), resources into resources, web content into webapp.
Then we wanted to improve our web layer by adding Bower, Sass, gulp etc. Our gulp build compiles SCSS, minifies JavaScript, optimizes images etc., everything you would expect. But this introduced 1) another build tool, gulp, and 2) files generated by gulp.
The question is how to organize such a project. One way could be:
(A) gulp builds into webapp folder
In this solution, all JavaScript, image and SCSS files are stored in /src/main/assets and get built into /src/main/webapp. Both the sources and the gulp-generated files get committed to git. The Gradle build is independent from gulp, which is fine for users who do not have gulp installed - like those who only need to work on the backend. Also, CI servers do not depend on the gulp stuff.
(B) use gulp from gradle during build
In this solution, gulp is called from Gradle, so Gradle builds everything. But you must use Gradle every time you want to try something. Also, every developer needs to have gulp installed, which may be a problem for developers using Windows (as I've been told). And the CI server has to know how to run gulp.
My team is torn between these two options. Does anyone have any working experience with either of these solutions?
I'm currently using Java + Grunt + Maven. I've found there are two ways of packaging your frontend with your backend and the same will certainly apply to Gulp.
In the end it's up to what's best for your project/team. From my experience I usually use option B when working with others since the decoupling is easily worth the other issues. When I'm doing my own side projects I always go for option A because it's just easier to launch one webserver and to run a local environment closer to what DEV/PROD is like.
A) Put your frontend into the webapp folder (ex. https://github.com/kdubb1337/maven-grunt-webapp)
Benefits - You can launch your backend and do development all in one place, and working with Spring Security is a snap, even without OAuth. Fewer issues working with two webservers on your local environment when they would normally be bundled into one port on other environments.
B) Keep your frontend in a different folder, or even in a different repo that you clone into the root folder of your backend repo. (ex. https://github.com/kdubb1337/maven-grunt) see the 'yo' folder
Benefits - Fantastic decoupling so front end developers can live in joy without having to even install java locally or worry about recompiling your backend. Works great if you want Travis (or your favourite CI app) to do unit tests on the backend and the frontend.
EDIT I've found this awesome plugin you can use with maven/gradle to build the frontend https://github.com/eirslett/frontend-maven-plugin. Seems like the way to go, will be refactoring my starter projects with this guy for grunt and gulp
The current best practice here is to treat your frontend build as a separate project and put it in its own Maven or Gradle module. Have your Java build system invoke the JavaScript tooling (e.g., with the exec-maven-plugin) and save the output into the appropriate directory in target or build. Bundle up the results in a jar and serve off the classpath.
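As a rough sketch of that wiring (the plugin version, working directory and gulp task name are assumptions for illustration), the exec-maven-plugin can run the JavaScript build as part of the Maven lifecycle:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <id>gulp-build</id>
            <phase>generate-resources</phase>
            <goals>
                <goal>exec</goal>
            </goals>
            <configuration>
                <!-- assumes gulp is on the PATH and the frontend sources live here -->
                <executable>gulp</executable>
                <workingDirectory>${project.basedir}/src/main/frontend</workingDirectory>
                <arguments>
                    <argument>build</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>

The gulp output would then be copied into target/classes (for example under META-INF/resources) so it ends up in the jar and can be served from the classpath.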
If you're using Bower, you only need the base Node install on your CI server, and your Java build can invoke the necessary build process from there, fetching JS packages as needed. Don't forget to (1) use --save and (2) exclude the JS modules directory from source control.
In addition, I suggest looking at RaveJS, which manages your JavaScript build and keeps you from having to configure watchers and such during development.
My recommended best practice is using the com.github.eirslett:frontend-maven-plugin (or the Maven grunt plugin) to call the grunt/gulp build from mvn (bound to the process-resources phase). When CI builds, it's all integrated; even npm etc. can be installed into mvn's target directory, so you don't have to configure your CI server for npm.
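A minimal sketch of that plugin configuration (the version numbers, working directory and gulp task name are placeholders to adapt to your project):

<plugin>
    <groupId>com.github.eirslett</groupId>
    <artifactId>frontend-maven-plugin</artifactId>
    <version>1.6</version>
    <configuration>
        <!-- keep node/npm inside target so the CI server needs no global install -->
        <installDirectory>target</installDirectory>
        <workingDirectory>src/main/frontend</workingDirectory>
    </configuration>
    <executions>
        <execution>
            <id>install-node-and-npm</id>
            <goals><goal>install-node-and-npm</goal></goals>
            <configuration>
                <nodeVersion>v8.11.3</nodeVersion>
            </configuration>
        </execution>
        <execution>
            <id>npm-install</id>
            <goals><goal>npm</goal></goals>
            <configuration>
                <arguments>install</arguments>
            </configuration>
        </execution>
        <execution>
            <id>gulp-build</id>
            <goals><goal>gulp</goal></goals>
            <configuration>
                <arguments>build</arguments>
            </configuration>
        </execution>
    </executions>
</plugin>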
When developers build, they mostly still just use one Maven command. For JS/CSS/HTML developers, after mvn clean install, they can run grunt/gulp "watch" in the background to get their JS changes reflected immediately in the browser without incurring any Maven overhead (just the wicked fast gulp/grunt tasks).
Deploy the UI components to the default Tomcat webapps directory.
Deploy the class files to the "wtpwebapps" directory (the default directory for deploying a WAR through Eclipse).
Eclipse Setup
Go to the Servers tab and open the properties for Tomcat.
(screenshot omitted; source: scotch.io)
Make sure the location is [workspace metadata].
Then double-click on Tomcat to open the Tomcat overview.
(screenshot omitted; source: scotch.io)
Set server location to "Use tomcat location".
Grunt/Gulp Setup
Use the copy task to copy the built UI files to <tomcat installation directory>/webapps/<contextRoot>/.
https://pub.scotch.io/#ankur841/eclipse-tomcat-deployment-with-gruntgulp-build
Development Environment
I am working on a Maven Java client/server project that relies on Protocol Buffers (protobuf) for sending RPCs between the clients and server. I use Eclipse for Java EE as my primary IDE. Since I use Maven in my project, I am using the m2eclipse plugin for Eclipse. I configure my project in Eclipse to use the "Maven Nature".
The Problem
Basically, with the workspace setup described above, I am running into INFINITE BUILD LOOPS if Eclipse is configured to Build Automatically (which is the default: Project Menu --> Build Automatically). Whenever Eclipse kicks off a build, the build enters an infinite loop, often resulting in all my computer's CPU resources being consumed by Eclipse, and eventually an error popup appears in the IDE due to a memory overflow. What is happening is that all the generated Java code from the .proto files is continuously being rebuilt by Maven via the Eclipse build. Once the proto files are generated and compiled into a directory (in my case, target/generated-sources), the build of the proto files is immediately repeated. Even if I click the stop button, the build spins off again. The only way I can really stop the infinite build loop is to disable Build Automatically.
From looking through links on the web (see this SOF post, also listed below), one workaround was to disable the Maven Project Builder on my Eclipse project. To do so, I would have to open up the Eclipse project settings --> Builders --> deselect Maven Project Builder. Now, the infinite build loop will not happen, seemingly because it was m2eclipse's builder that was the culprit. However, I now lose a lot of useful functionality from this builder. Namely, I am not able to take advantage of automatic resource processing through m2eclipse, such as resource filtering. Note that projects using the Maven Nature have resource directories (src/main/resources and src/test/resources) excluded on the Java Build Path because of the expectation that the Maven Project Builder adds them to the classpath. So, one issue I run into with the Maven Project Builder disabled is that I cannot read resource files from the classpath in my tests. I would have to first run a manual maven build to get access to the resources (but once I refresh the project, I won't be able to find these classpath resources anymore). Or, I could change my project's Java Build Path, but that goes against the Maven Nature's defaults that work for me in all Eclipse Java projects besides ones relying on protobuf.
So, that all being said...
Does anyone have any idea how I can work around this problem? The Eclipse platform seems too mature to have this problem continue to sit around. I could always file an Eclipse bug that will collect dust, but perhaps it is not an Eclipse bug but rather an incorrect configuration on my side. Thank you so much in advance for the help.
Related Links
SOF post about this issue: Eclipse loops endlessly: Invoking 'Maven Project Builder'
Open Github issue for the maven-protobuf-plugin: https://github.com/igor-petruk/protobuf-maven-plugin/issues/16
After reading the attached GitHub Issue more carefully, it appeared that setting the cleanOutputFolder configuration in the protobuf-maven-plugin did the trick. Here is an example XML of using the plugin (version is irrelevant):
<plugin>
    <groupId>com.github.igor-petruk.protobuf</groupId>
    <artifactId>protobuf-maven-plugin</artifactId>
    <version>0.6.3</version>
    <configuration>
        <!-- keep previously generated sources so the Eclipse/m2e build does not loop -->
        <cleanOutputFolder>false</cleanOutputFolder>
    </configuration>
</plugin>
This means that Eclipse will not run into an infinite build loop, because the Maven Project Builder will not keep recompiling the same protobuf generated-sources folder under /target/generated-sources. Meanwhile, turning off cleanOutputFolder does not stop the project from picking up .proto files and generating new sources from them; as long as a Maven build is run with the clean goal (such as mvn clean install), the generated-sources directory will still be regenerated, since the target directory has already been deleted.
This can also be caused by having different Java versions configured for the Java sources and the generated code.
E.g. for code generation using Xtend, I updated the JDK from 6 to 8, but some of the files in .settings and the preferences were still referring to the old Java 6 compatibility level. This caused looping compilation in Eclipse.
I have been working solo on a project for some time, and now, new developers are likely to join the project for various reasons.
I of course use version control software, but I am afraid importing my project into Eclipse and making it run might prove a little difficult for newcomers, and I want to make it as clean as possible.
When I first took over the project, it took me almost two days to get the project built and running. I documented every step and fixed the most obvious errors, but not all of them, and I want the project to run as-is when imported.
The project is a mix of Java projects for the backend, a J2EE project for the server and a Flex project for the client.
The IDE is going to be Eclipse
The version control software is Perforce
Here are some of the specific problems I have right now - should I fix them, and how?
1. Eclipse environment variables are used for libs; all the libs are in a folder in the J2EE project but are used by all the Java projects (the variables have to be set in each IDE the project is imported into).
2. The runtime JRE is specified in the .classpath of each project, so each project's properties must be edited when trying to build the project in another environment.
3. The Apache server is specified in the J2EE project's properties.
4. To avoid exporting the jars of all the Java projects into the J2EE project each time I modify the code, there are linked folders in the J2EE project, linked to each Java project's bin folder.
For (4) I will probably have to use Maven, but is it possible to fix problems (1), (2) and (3) without using Maven?
The alternative is to have a one-page setup instruction document.
Also, do you have any other general or specific advice on how to organize this whole mess?
Thank you
Dependency management is a must - use Maven. If you can't use Maven because you are already using Ant, go with Ivy.
Make the project buildable with one click - be it an Ant "build all" target or mvn package. Maven provides integration with the IDE (via the plugin).
Don't rely on IDE metadata like .project and .classpath. You can still commit them to make life easier for Eclipse users, but don't restrict developers to one IDE.
Provide build-on-save, either using Eclipse WTP or using the FileSync plugin (it sounds like a hack, but is pretty cool).
Use build profiles (Maven provides them out of the box) to create different builds for different environments - see the sketch after this list.
It's not always possible to configure everything in your Maven (or Ant/Ivy) scripts. For any additional actions, like installing an app server, document them in a single file in the root of your project, describing step by step what should be installed, with what config options, etc. That way developers have only one place to look at and follow. The document can (and preferably should) be plain .txt.
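For the profiles point above, a minimal sketch of what that could look like in a pom.xml (the profile ids and the property are made up for illustration):

<profiles>
    <profile>
        <id>dev</id>
        <activation>
            <!-- used when no profile is selected explicitly -->
            <activeByDefault>true</activeByDefault>
        </activation>
        <properties>
            <app.server.url>http://localhost:8080</app.server.url>
        </properties>
    </profile>
    <profile>
        <id>ci</id>
        <properties>
            <app.server.url>http://build.example.com</app.server.url>
        </properties>
    </profile>
</profiles>

A developer then builds with plain mvn package, while the CI server runs mvn package -Pci.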
A side note: use Continuous Integration - download Hudson or TeamCity and configure it to build the project.
From my very recent experience - we had a project we've been working on for 6 months. A colleague of mine had to re-import the project on a new machine. It took him 20 minutes. The project is configured with Maven.