Spring builder is slow in Eclipse - can I speed it up?

When building our Java applications in Eclipse, the Spring builder is very slow and gives no status updates.
Specifically, I start building a project, and Eclipse's Progress pane displays
Invoking 'Spring Project Builder' on 'project name'...
for multiple minutes at a time, with no additional details.
I've already turned off the Spring AOP Reference Model Builder, and I just recently disabled the Spring project builder completely out of desperation.
I'm just building and using these projects, not developing them, so in theory they should compile fine - but this is our development branch, so I'd still like to keep the Spring builder enabled in case there's a nasty reflection error somewhere.
So, in order to keep using them, is there anything I can do to:
Speed up the Spring portion of the build?
Display more detailed output during the Spring project building process?
Edit 2010-02-15 21:39 GMT:
I'm specifically referring to the Spring IDE plugin in Eclipse.

I'm assuming you're referring either to the Spring IDE plugin for Eclipse, or the SpringSource ToolSuite bundle.
The big performance killer that I've nailed down is the processing of <import resource="..."/> entries in the beans files. The plugin has an option for enabling the processing of these, and if it's turned on, it absolutely hammers performance - it searches the entire classpath (including libraries) for each imported resource, every time something changes. I reported this as a bug, and thankfully it's been fixed, but the fix hasn't been released yet.
The <import> support is just a nice-to-have, though, since you can manually add the imported files directly. Turning it off makes the whole experience much more pleasant.

Try checking your validators. I remember having issues at one point because I had a bunch of plugins installed that added a number of validators to my project, and the build process took forever, mostly because of checking all the XML.

Related

Efficient OSGi development workflow

I work on a product composed of many bundles running as features on top of karaf. Typically our developers work on one bundle at a time. Our normal development goes something like: code, compile, copy bundle to deploy folder, test. We've also found that hotdeploy just refuses to override certain bundles that are installed as features without a server restart or a feature uninstall/reinstall, so sometimes the cycle is longer.
My question is: does anyone in the community have a better way? The way we do things works, but I feel like it's pretty slow and inefficient and I'm betting someone has come up with something better!
EDIT: I realize that I was pretty unclear in my question... We are using Equinox underneath Karaf. We also use Eclipse and Maven, although I don't know that using Maven is relevant.
Sounds like you want the dev:watch command. From the documentation:
The watch command can be used to help at development time. It allows you to configure a set of URLs that will be monitored. All bundles whose location matches one of the given URLs will be automatically updated. This avoids the need to manually update the bundles, or even to copy the bundle to the system folder if needed. Note that only maven-based URLs and maven snapshots will actually be updated automatically, so if you run
dev:watch *
it will actually monitor all bundles that have a location matching mvn:* and '-SNAPSHOT' in their URL.
Doing "dev:watch --help" from the Karaf shell will list its available flags and args.
Something similar is available with the PAX plugin.
Either of these will work quite nicely if you've got the m2eclipse Maven plugin in Eclipse.
UPDATED: In my company we strive to be as TDD as possible, so a lot of development is done without explicitly starting Karaf. In the normal mix of unit tests we also use Pax Exam, which is largely fantastic even when run from within Eclipse =)
This helps ensure we're not too tied to any Karaf specifics, as it runs with Equinox/Felix/Concierge (so I mock out the various Karaf specifics we depend on, like JAAS authentication). Along with many other cool tools and features, it's capable of provisioning Karaf features, and using TinyBundles you can even create bundles on the fly (again useful for mocking/stubbing).
Pax Exam hooks into JUnit by providing a JUnit runner; the latest version (2) is much faster and has a DSL-based API, so the tests are quite concise and readable.
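For illustration, here's a minimal sketch of what such a test looks like, assuming Pax Exam 2.x with its JUnit4TestRunner (the package locations moved between Pax Exam releases, and the bundle coordinates below are made up):

```java
import static org.junit.Assert.assertNotNull;
import static org.ops4j.pax.exam.CoreOptions.junitBundles;
import static org.ops4j.pax.exam.CoreOptions.mavenBundle;
import static org.ops4j.pax.exam.CoreOptions.options;

import javax.inject.Inject;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.ops4j.pax.exam.Option;
import org.ops4j.pax.exam.junit.Configuration;
import org.ops4j.pax.exam.junit.JUnit4TestRunner;
import org.osgi.framework.BundleContext;

@RunWith(JUnit4TestRunner.class)
public class GreeterIntegrationTest {

    @Inject
    private BundleContext context;

    // Provision the container with the bundle under test plus JUnit itself.
    @Configuration
    public Option[] config() {
        return options(
                mavenBundle("com.example", "greeter-bundle", "1.0.0-SNAPSHOT"), // hypothetical coordinates
                junitBundles());
    }

    @Test
    public void frameworkIsUpAndContextInjected() {
        assertNotNull("BundleContext should be injected by Pax Exam", context);
    }
}
```

The @Configuration method provisions the OSGi container, and Pax Exam injects the BundleContext into the test before each method runs.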
Using Pax Exam gives us good test coverage and short development times. Where tests are less practical or we're hunting bugs that don't surface in tests, the dev:watch command is invaluable.
In summary: IMO you should definitely drive your development with tests (Pax Exam will slot into your existing build nicely, and once you get used to it you'll find development quicker). You can start using the dev:watch command immediately; it will certainly speed up your current situation.
UPDATE 2: In answer to another question I've added a Maven example of Pax Exam testing a ComponentFactory. Test Driven Development is arguably the most efficient workflow available to developers today. Link to the question: osgi: Using ServiceFactories? and link to the source code: http://dl.dropbox.com/u/2465717/net.earcam.example.servicecomponent_2011-08-16_15-52.tgz
I've had excellent results using Equinox in Eclipse - even hot code replace works properly. Granted, the target platform is small and we only have around 50 bundles of our own, but the workflow goes like this:
First, we have a target platform that contains all third-party and Eclipse bundles, Eclipse takes care of downloading & managing them. Then, the workspace has all the bundles of the project, grouped in 3-4 working sets. Compilation happens as usual on save, sometimes GWT needs to be recompiled, but even then the changes get picked up immediately because no deployment needs to happen - the running Equinox system uses the unpacked project folders as bundles. Running this from within Eclipse gives us hot code replace, on-the-fly changing template files, only MANIFEST.MF/plugin.xml changes need to refresh the bundle - and even then it's usually faster to just restart the framework than to type in the console.
If you use Eclipse, Eclipse Libra may be useful for you. Libra can start Felix, Equinox and Knopflerfish inside Eclipse like any other server, via WST. There are some YouTube videos showing how to use it.
I also wrote some tools that can help:
An OSGi bundle that picks up OSGi services matching the filter (osgitest=junit4). With that you do not write JUnit classes; instead you can provide pre-configured objects (e.g. with OSGi Blueprint), and JUnit then runs based on the annotations provided in the interface your service implements (a small registration sketch follows this list).
A Maven plugin with the following useful goals:
Start an OSGi container and deploy the Maven project's bundle with all of its dependencies (which are OSGi bundles, of course). Starting the OSGi container is done with the help of Pax Exam, but the JUnit tests are run by the OSGi bundle I wrote (which picks up the OSGi services you provide).
Create a folder that contains shortcuts to all dependencies of the project (located in the Maven repository or in the target directory of the project).
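As a rough sketch of the first tool's idea - publishing a pre-configured test object as an OSGi service carrying the osgitest=junit4 property - here is what the registration side could look like. Only BundleContext.registerService is standard API; the test class and the exact contract of the runner bundle are assumptions:

```java
import java.util.Dictionary;
import java.util.Hashtable;

import org.junit.Assert;
import org.junit.Test;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// Registers a pre-configured test object as an OSGi service so that a
// test-runner bundle can discover it via the filter (osgitest=junit4).
public class TestRegistrationActivator implements BundleActivator {

    // Hypothetical test class; in the setup described above, the runner bundle
    // drives JUnit from the annotations found on the registered service.
    public static class GreeterTest {
        @Test
        public void greets() {
            Assert.assertEquals("hello", "hel" + "lo");
        }
    }

    private ServiceRegistration registration;

    @Override
    public void start(BundleContext context) {
        Dictionary<String, Object> props = new Hashtable<String, Object>();
        props.put("osgitest", "junit4"); // the property the runner bundle filters on

        registration = context.registerService(GreeterTest.class.getName(), new GreeterTest(), props);
    }

    @Override
    public void stop(BundleContext context) {
        registration.unregister();
    }
}
```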
If the projects are deployed onto the server (Eclipse Libra), I only have to type update X, where X is the id of the bundle, and everything is refreshed rapidly. You do not have to re-compile the projects that are published to the server if you run Equinox in Libra, as it points to the target classes folder, which is refreshed as soon as you save your class or pom.xml.
If you do not publish your project onto the server but add it as a bundle in the container pointing to the shortcut folder, you can also run the update command on the OSGi console after running mvn install (without restarting the server).
A step-by-step guide is available at http://cookbook.everit.org/
With the method above it is possible to write tests TDD-style and run them as part of the Maven build on the CI server.
I hope you will find these tools as useful as I do!
It depends on the platform under Karaf: Felix or Equinox.
Equinox
Eclipse has excellent (or almost excellent) support for launching Equinox with bundles of your choice. The two things you need to prepare are:
Bundles, being developed, available in the workspace as Plug-in projects
Target platform, containing the remaining bundles of the application
Such a setup will allow you to easily make changes to your bundles, even at runtime, and to easily restart the runtime when that is required. I see Karaf as more suitable when you are developing on a remote system, where the bundles are deployed via SSH or FTP, or when you are using external build tools like Maven, which can automatically copy the bundle into the runtime after it is built.
If you are using Equinox, this gives you some extra edge, as the runtime will execute the code directly from the workspace.
Felix
Felix doesn't seem to have such support for launching from Eclipse (although there is work toward this, tracked in this Jira issue). You can also launch it as a normal Java application, but this is far from convenient. In this case, using Maven will be a much better alternative. You can still set up Eclipse to take full advantage of the other PDE features; only the launching will be done externally.
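To make the "normal Java application" route concrete, here is a minimal launcher sketch using the standard OSGi FrameworkFactory API; it is framework-agnostic, so it starts Felix or Equinox depending on which implementation is on the classpath (the bundle path is a placeholder):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class Launcher {

    public static void main(String[] args) throws Exception {
        // Pick up whichever framework implementation (Felix, Equinox, ...) is on the classpath.
        FrameworkFactory factory = ServiceLoader.load(FrameworkFactory.class).iterator().next();

        Map<String, String> config = new HashMap<String, String>();
        config.put("org.osgi.framework.storage.clean", "onFirstInit"); // wipe the bundle cache on start

        Framework framework = factory.newFramework(config);
        framework.start();

        // Install and start the bundle under development (path is a placeholder).
        BundleContext context = framework.getBundleContext();
        Bundle bundle = context.installBundle("file:target/my-bundle-1.0.0-SNAPSHOT.jar");
        bundle.start();

        framework.waitForStop(0); // block until the framework shuts down
    }
}
```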
Summary
In summary, you can always automate everything through Maven, and Karaf will greatly help you in this regard. Eclipse will give you a little extra edge if you are using Equinox. You should be able to have hot code replace regardless of the method you are using, because hot code replace doesn't consider OSGi at all (except in the one case where you reload your bundle and a fresh class loader is created).

STS Spring with Roo and Maven - too slow?

I have been using STS and Java for a while to develop a web application. The project is configured to use Maven, Roo and MySQL as the database. I often find that I waste a lot of time in the following cases:
No Roo response - when I modify a domain object Java class, even if the Roo shell for the project is open and loaded, it does not respond. My current hack for this issue is to generate a new finder for the class using Roo (which somehow wakes Roo up). Is there any better hack?
Long compile times - I have enabled the "Build automatically" project option. Perhaps I should not use it, but the bottom line is that sometimes I need to make a small change to a domain object and then run the project, hence I need to build and test the project again (even for small changes). Are there any properties I can configure to re-compile only some parts of the project?
Maven and POM - I often run into problems with Maven: whenever I modify a dependency in the pom.xml file and save it while forgetting that I am on the internal company network, the .m2 repository is not updated correctly. In those cases I should have switched to a non-internal network before saving the pom.xml. The annoying bit is that once I change the network preferences and run the "update all Maven dependencies" command, it does not work properly. The reason is that the pom files for the new library are generated, but with an error message as their content. The problem seems to be that Maven generates these files but then is not clever enough to recognize them and fetch the libraries from the web repositories again. To fix this I need to delete the contents of the folder and run the command again. Has anyone found a better solution?
Is Roo really worth it? And if so, when?
I found that Roo is good to get started and to create the initial database and project configuration (e.g. security). But then? So far I have mostly been using Roo to generate finders, but I often find that it takes Roo about 10-15 minutes to update when a new finder is added to a class (our project is quite big). I am considering keeping the Roo shell closed most of the time and manually adding the finders to the .aj files and annotating the .java files myself. Will this conflict with Roo afterwards (e.g. once I open the Roo shell for some other reason)?
Java and STS
Why does the project take so long to compile? I imagine STS checks every single class for dependencies on the recently modified files and, if there is one, updates the code and regenerates a jar file. Is this correct? What if I were using PHP or Python - would the same happen, or would it be faster? I know there are several discussions on this, but since I am listing the issues I am having, I thought I would mention it as well. It's the eternal question of which language to use and for what. In our case we have a relational database, a huge amount of data, security constraints, and a need for precise computations (strict data typing required).
1. Ensure Roo is pointing to the correct directory: Preferences > Spring > Roo Support.
2. Compile times are often compounded by validation: Preferences > Validation - check "Suspend all validators"; Preferences > Spring - uncheck unwanted validation rules; Preferences > General > Build - uncheck "Build automatically", then build manually when needed.
3. Update dependencies by right-clicking on the project: Maven > Update Dependencies and Update Project Configuration.
Some tips are in the DZone RefCard I wrote. http://refcardz.dzone.com/refcardz/eclipse-tools-spring
Regards, Gordon Dickens
twitter.com/gdickens
linkedin.com/in/gordondickens
Blog: technophile.gordondickens.com
1. No ROO response
I used to have regular trouble with the Roo console in STS in earlier versions of Roo. In my experience, the Roo shell launched via the command line was more responsive. I guess things have improved in newer versions of STS and Roo. Which versions are you using?
2. Long times to compile
Although compile time was good enough for me, the usual culprits were validating builders. Several validating builders that validate XML, JSP, Spring configuration etc. took a lot of time to complete. I ended up disabling several validators to bring the build time within a reasonable limit.
3. Maven and POM
I experienced similar issues too. This isn't exactly a Roo issue, but as Roo doesn't offer an alternative to Maven, it can be a serious problem for Roo projects. I think there might be an option somewhere in m2eclipse that lets you selectively force an update of a dependency.

Best way to manage multi-module projects?

I have a medium-size project split into 3 modules: Core, plugins (in short, an interpretation layer), and implementation. There are a few global dependencies and some module-specific dependencies. There is a custom Ant target for generating Javadoc that excludes the implementation (for obvious reasons). This is stored in a public online SVN repository and therefore needs to be independent of any particular machine, apart from the JRE.
Right now I'm using the built-in NetBeans project management, and it sucks, probably mainly due to the fact that the project management system was not designed for modules. Lack of a global library set (you can import a library specific to your NetBeans installation, but then it doesn't get updated), lack of automatic resolution of library dependencies (a dependency on a project should mean the project and its dependencies), lack of an independent multi-project formatting style (it's either tied to profile-specific "Global options" or individually set up and synced module-specific options), and other things make managing my project a pain.
When I was experimenting with IDEA, one of the things I loved was its project management. It was close to what I wanted, but like most things in IDEA it could have been simpler. However, the IDE itself was bad (not up for debate), so I switched back to NetBeans. And Maven looks bad, both because of having to traverse its file structure manually and because of general opinion.
Are there better options out there that can be stored in a standard SVN repository with limited tools required, are pretty easy to use for 1-3 developers, and handle 2-5 modules? It must be able to handle Java and (in a perfect world) integrate with NetBeans.
Honestly, Maven is your best bet. I wouldn't knock it if you haven't actually tried it yet. It tends to be a very divisive technology, but those who love it love it for a damn good reason. If you are someone who prefers to keep your hands off the build scripts/files after you initially set them up (and it looks like you are, given you were using NetBeans' built-in projects, which generate an Ant build.xml behind the scenes), then you should just try Maven and see what happens.
I'm not sure why you think you need to "traverse the directory structure" with Maven if you are in NetBeans. See this screenshot for an example of what it looks like. You don't ever see src/main/java or target/ or anything on the file system (unless you need to).
[screenshot of a Maven project in NetBeans - source: netbeans.org]
If you use a Maven multi-module project, you'll get the modularity you are looking for within NetBeans as well. If you want a sample, go check out an open-source project that has tons of modules, load it in NetBeans, and play around with it: http://camel.apache.org/source.html

Best ways to manage generated artifacts for web service/xml bindings in a java webapp/client?

I'm working on a couple of web services that use JAXB bindings for the messages (in JAX-WS or Spring-WS). When using these bindings, there's always some code that is automatically generated from the WSDL to bind the message objects. I'm struggling to figure out the best way to make this work so that it's easy to work with, hard to break, and integrates nicely with IDEs (we mostly use Eclipse).
I think there are a couple of ways to go about this. The three main options I see right now are:
Generate code, keep the source artifacts and check them into the repository. Pros: integrates easily with IDEs (source highlighting etc), works within the build system. Cons: generated code changes each time you regenerate it, possibly creating noisy commits. It's also redundant since the WSDL file is already checked in, usually.
Generate code as part of the build process. Don't keep source artifacts or only keep them in output directories. Pros: fixes all the cons from the previous one. Cons: harder to integrate with IDE, though maybe this build step can be run automatically? I currently use this on one of my projects but the first time I checkout the project it appears broken, which is a minor nuisance.
Keep the generated bindings in separate libraries (jars), included via Maven or as manually updated jars, depending on your build process. I got the idea from a thread on java.net. This seems more stable and uses explicit versioning, but it feels a bit heavyweight.
Which one of these options would you implement and how? We're currently using maven and eclipse, so any ideas in that regard would be great. I think this problem generalises to most other build systems and IDE combinations though, even other languages perhaps.
I went for option 3. If you already host your own repository (and optionally CI), it's not that heavyweight. All it takes is a simple POM. It's even possible to include some utility/wrapper/builder classes (that often make life easier with generated classes) and use them in several projects.
I'd go for option 2 and generate code into the "standard" ${project.build.directory}/generated-sources/<toolname> location as part of the build process. Using generated sources is well supported by m2eclipse (use Maven > Update Project Configuration once the sources have been generated) and, if I remember correctly, by the maven-eclipse-plugin as well (i.e. the folder will be added to the Java Build Path). Actually, I think NetBeans also handles this fine. Not sure about IDEA.
For the generation itself, you may need the maven-jaxb2-plugin, if I understood correctly.
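Whichever option you pick, consuming the generated classes looks the same from application code. A minimal sketch below; the PurchaseOrder type stands in for a class xjc would generate and is defined inline only to keep the example self-contained:

```java
import java.io.StringReader;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlRootElement;

public class BindingDemo {

    // Stand-in for a class that xjc would normally generate from the WSDL/XSD.
    @XmlRootElement(name = "purchaseOrder")
    public static class PurchaseOrder {
        public String customerId;
        public int quantity;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<purchaseOrder><customerId>C42</customerId><quantity>3</quantity></purchaseOrder>";

        // Unmarshal the XML message into the (normally generated) binding class.
        JAXBContext context = JAXBContext.newInstance(PurchaseOrder.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();
        PurchaseOrder order = (PurchaseOrder) unmarshaller.unmarshal(new StringReader(xml));

        System.out.println(order.customerId + " ordered " + order.quantity);
    }
}
```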

How to modularize a (large) Java App?

I have a rather large (several MLOC) application at hand that I'd like to split up into more maintainable separate parts. Currently the product consists of about 40 Eclipse projects, many of them with inter-dependencies. This alone makes a continuous build system unfeasible, because it would have to rebuild a great deal with each check-in.
Is there a "best practice" way of how to
identify parts that can immediately be separated
document inter-dependencies visually
untangle the existing code
handle "patches" we need to apply to libraries (currently handled by putting them in the classpath before the actual library)
If there are (free/open) tools to support this, I'd appreciate pointers.
Even though I do not have any experience with Maven, it seems to enforce a very modular design. I wonder whether this is something that can be retrofitted iteratively, or whether a project that wants to use it has to be laid out with modularity in mind right from the start.
Edit 2009-07-10
We are in the process of splitting out some core modules using Apache Ant/Ivy. A really helpful and well-designed tool that doesn't impose as much on you as Maven does.
I wrote down some more general details and personal opinion about why we are doing that on my blog - too long to post here and maybe not interesting to everyone, so follow at your own discretion: www.danielschneller.com
Using OSGi could be a good fit for you. It would allow you to create modules out of the application. You can also organize dependencies in a better way. If you define the interfaces between the different modules correctly, you can use continuous integration, as you only have to rebuild the module that a check-in affected.
The mechanisms provided by OSGi will help you untangle the existing code. Because of the way class loading works, it also makes handling the patches easier.
Some OSGi concepts that seem to be a good match for you, as summarized on Wikipedia (a small service example follows the list):
The framework is conceptually divided into the following areas:
Bundles - Bundles are normal jar components with extra manifest headers.
Services - The services layer connects bundles in a dynamic way by offering a publish-find-bind model for plain old Java objects (POJO).
Services Registry - The API for management services (ServiceRegistration, ServiceTracker and ServiceReference).
Life-Cycle - The API for life cycle management (install, start, stop, update, and uninstall bundles).
Modules - The layer that defines encapsulation and declaration of dependencies (how a bundle can import and export code).
Security - The layer that handles the security aspects by limiting bundle functionality to pre-defined capabilities.
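To make the Services layer concrete, here is a small publish-find-bind sketch using only the core OSGi API. The GreetingService interface is made up for the example; in a real application the publisher and the consumer would live in different bundles:

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;

public class GreetingActivator implements BundleActivator {

    // A plain interface shared between modules; made up for this example.
    public interface GreetingService {
        String greet(String name);
    }

    @Override
    public void start(BundleContext context) {
        // Publish: one bundle registers a POJO under the interface name.
        GreetingService impl = new GreetingService() {
            @Override
            public String greet(String name) {
                return "Hello, " + name;
            }
        };
        context.registerService(GreetingService.class.getName(), impl, null);

        // Find and bind: another bundle (shown inline here) looks the service up.
        ServiceReference ref = context.getServiceReference(GreetingService.class.getName());
        GreetingService service = (GreetingService) context.getService(ref);
        System.out.println(service.greet("modular world"));
        context.ungetService(ref);
    }

    @Override
    public void stop(BundleContext context) {
        // Registrations are cleaned up automatically when the bundle stops.
    }
}
```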
First: good luck & good coffee. You'll need both.
I once had a similar problem: legacy code with awful circular dependencies, even between classes from different packages, like org.example.pkg1.A depending on org.example.pkg2.B and vice versa.
I started with Maven 2 and fresh Eclipse projects. First I tried to identify the most common functionality (logging layer, common interfaces, common services) and created Maven projects for it. Each time I was happy with a part, I deployed the library to the central Nexus repository so that it was almost immediately available for the other projects.
So I slowly worked my way up through the layers. Maven 2 handled the dependencies and the m2eclipse plugin provided a helpful dependency view. By the way, it's usually not too difficult to convert an Eclipse project into a Maven project: m2eclipse can do it for you, and you just have to create a few new folders (like src/main/java) and adjust the build path for source folders. It takes just a minute or two. But expect more difficulties if your project is an Eclipse plugin or RCP application and you want Maven not only to manage artifacts but also to build and deploy the application.
In my opinion, Eclipse, Maven and Nexus (or any other Maven repository manager) are a good basis to start from. You're lucky if you have good documentation of the system architecture and this architecture is actually implemented ;)
I had a similar experience in a small code base (40 kLOC). There are no "rules":
I compiled with and without a "module" in order to see its usage
I started from "leaf modules", modules without other dependencies
I handled cyclic dependencies (this is a very error-prone task)
with Maven there is a great deal of documentation (reports) that can be deployed in your CI process
with Maven you can always see what uses what, both in the site and in NetBeans (with a very nice directed graph)
with Maven you can import library code into your codebase, apply source patches and compile it with your products (sometimes this is very easy, sometimes it is very difficult)
Check also Dependency Analyzer: [screenshot - source: javalobby.org]
NetBeans: [screenshot - source: zimmer428.net]
Maven is painful to migrate to for an existing system. However it can cope with 100+ module projects without much difficulty.
The first thing you need to decide is what infrastructure you will move to. Should it be a set of independently maintained modules (which translates to individual Eclipse projects), or will you consider it a single chunk of code which is versioned and deployed as a whole? The first is well suited to migrating to a Maven-like build environment, the latter to having all the source code checked out at once.
In any case you WILL need a continuous integration system running. Your first task is to make the code base build automatically, so you can let your CI system watch over your source repository and rebuild when you change things. I decided on a non-Maven approach here, and since we focus on having an easy Eclipse environment, I created a build environment using ant4eclipse and Team ProjectSet files (which we use anyway).
The next step would be getting rid of the circular dependencies - this will make your build simpler, get rid of Eclipse warnings, and eventually allow you to get to the "checkout, compile once, run" stage. This might take a while :-( When you migrate methods and classes, do not MOVE them, but extract or delegate them, leave their old name lying around, and mark them deprecated. This will separate your untangling from your refactoring, and allow code "outside" your project to still work with the code inside your project.
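A tiny sketch of that extract-and-delegate step (all names are illustrative):

```java
// New home of the logic after untangling (illustrative names).
class OrderCalculator {
    static double totalPrice(double unitPrice, int quantity) {
        return unitPrice * quantity;
    }
}

// Old location: kept only as a deprecated delegate so code "outside" keeps compiling.
public class LegacyOrderUtils {

    /** @deprecated moved to OrderCalculator; this delegate stays until all callers are migrated. */
    @Deprecated
    public static double totalPrice(double unitPrice, int quantity) {
        return OrderCalculator.totalPrice(unitPrice, quantity);
    }
}
```

Callers outside the project keep compiling against LegacyOrderUtils while new code moves to OrderCalculator; once nothing references the deprecated method any more, it can simply be deleted.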
You WILL benefit from a source repository which allows for moving files, and keeping history. CVS is very weak in this regard.
I wouldn't recommend Maven for a legacy source code base. It could give you many headaches just trying to adapt everything to work with it.
I suppose what you need is an architectural layout of your project. A tool might help, but the most important part is to organize a logical view of the modules.
It's not free, but Structure101 will give you about as good as you will get in terms of tool support for hitting all your bullet points. But for the record, I'm biased, so you might want to check out SonarJ and Lattix too. ;-)
