I need to configure different @Alternatives, @Decorators and @Interceptors for different runtime environments (think testing, staging and production servers).
Right now I use Maven to create three WARs, and the only difference between them is the beans.xml file. Is there a better way to do this? I do have @Alternative @Stereotypes for the different environments, but even then I need to alter beans.xml, and they don't work for @Decorators (or do they?)
Is it somehow possible to instruct CDI to ignore the values in beans.xml and use a custom configuration source? Because then I could for example read a system property or other environment variable.
The application exclusively runs in containers that use Weld, so a Weld-specific solution would be OK.
I already tried to Google this but can't seem to find good search terms, and I asked on the Weld user forums, but to no avail. Someone over there suggested writing my own custom extension, but I can't find any API to actually change the container configuration at runtime.
I think it would be possible to have some sort of @ApplicationScoped configuration bean and inject it into all @Decorators, which could then decide for themselves whether they should be active or not; and, to configure @Alternatives, to write @Produces methods for every interface with multiple implementations and inject the config bean there too.
But this seems to me like a lot of unnecessary work to essentially duplicate functionality already present in CDI?
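To sketch what I mean (all names here are made up, and the app.environment system property is just an example of a custom configuration source):

    import javax.enterprise.context.ApplicationScoped;
    import javax.enterprise.inject.Produces;
    import javax.inject.Inject;

    // Hypothetical service interface with two implementations.
    interface PaymentService { void pay(); }
    class RealPaymentService implements PaymentService { public void pay() { /* ... */ } }
    class MockPaymentService implements PaymentService { public void pay() { /* no-op for tests */ } }

    // Config bean: reads the environment once from a system property.
    @ApplicationScoped
    class EnvironmentConfig {
        private final String env = System.getProperty("app.environment", "production");
        boolean isTesting() { return "testing".equals(env); }
    }

    // The producer decides which implementation to use, so beans.xml no
    // longer needs an <alternatives> entry. Note that instances created
    // with "new" do not get their own injection points processed by CDI.
    class PaymentServiceProducer {
        @Inject EnvironmentConfig config;

        @Produces @ApplicationScoped
        PaymentService paymentService() {
            return config.isTesting() ? new MockPaymentService() : new RealPaymentService();
        }
    }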
edit
Ok, I realized I'm a bit stupid... of course it is possible to add stereotypes and interceptors at runtime using the CDI extension API:
    void beforeBeanDiscovery(@Observes BeforeBeanDiscovery bbd) {
        bbd.addInterceptorBinding(...)
        bbd.addStereotype(...)
    }
But what I didn't find was an API to add a decorator. The only thing I found was to activate all @Decorators in beans.xml, then observe

    public <T> void processAnnotated(@Observes ProcessAnnotatedType<T> event)

and call

    event.veto()

if I don't want a @Decorator to be active.
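Sketched out, that veto approach looks roughly like this; the @TestOnly marker annotation and the app.environment property are placeholders of my own:

    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import javax.decorator.Decorator;
    import javax.enterprise.event.Observes;
    import javax.enterprise.inject.spi.Extension;
    import javax.enterprise.inject.spi.ProcessAnnotatedType;

    // Hypothetical marker for decorators that must not run in production.
    @Retention(RetentionPolicy.RUNTIME)
    @interface TestOnly {}

    // Registered via META-INF/services/javax.enterprise.inject.spi.Extension.
    public class EnvironmentDecoratorExtension implements Extension {

        private final boolean production =
                "production".equals(System.getProperty("app.environment"));

        public <T> void processAnnotatedType(@Observes ProcessAnnotatedType<T> event) {
            // All decorators stay activated in beans.xml; veto the ones
            // that should not be active in this environment.
            if (production
                    && event.getAnnotatedType().isAnnotationPresent(Decorator.class)
                    && event.getAnnotatedType().isAnnotationPresent(TestOnly.class)) {
                event.veto();
            }
        }
    }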
You might want to take a look at JBoss Seam, specifically the Solder sub-project.
It allows dependency-driven CDI resolution, so that certain beans are only available if other beans or resources are available (class A if "dataSource" is available, class B if "entityManager" is available).
Since it's open source, you can also take a look at how they wired that together, and use that knowledge as a basis for writing your own extension if needed.
If you're using JSF, I would highly recommend using SEAM-JSF as well, as it gets rid of the clunkiness of having two injection frameworks (JSF DI/CDI) and allows CDI beans in JSF scopes.
I'm working with very large JSF/Facelets applications which use Spring for DI/bean management.
My applications have modular structure and I'm currently looking for approaches to standardize the modularization.
My goal is to compose a web application from a number of modules (possibly depending on each other). Each module may contain the following:
Classes;
Static resources (images, CSS, scripts);
Facelet templates;
Managed beans - Spring application contexts, with request, session and application-scoped beans (alternative is JSF managed beans);
Servlet API stuff - servlets, filters, listeners (this is optional).
What I'd like to avoid (almost at all costs) is the need to copy or extract module resources (like Facelets templates) into the WAR, or to extend web.xml for a module's servlets, filters, etc. It must be enough to add the module (JAR, bundle, artifact, ...) to the web application (WEB-INF/lib, bundles, plugins, ...) to extend the web application with this module.
Currently I solve this task with a custom modularization solution which is heavily based on using classpath resources:
A special resources servlet serves static resources from classpath resources (JARs); a sketch of such a servlet follows this list.
Special Facelets resource resolver allows loading Facelet templates from classpath resources.
Spring loads application contexts via the pattern classpath*:com/acme/foo/module/applicationContext.xml - this loads application contexts defined in module JARs.
Finally, a pair of delegating servlets and filters delegate request processing to the servlets and filters configured in Spring application contexts from modules.
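To illustrate the first item, the resources servlet boils down to roughly the following; the URL mapping, path layout and error handling are simplified assumptions:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Mapped to e.g. /module-resources/* in web.xml; serves files such as
    // /module-resources/com/acme/foo/style.css straight from module JARs.
    public class ClasspathResourceServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String path = req.getPathInfo();            // e.g. /com/acme/foo/style.css
            if (path == null || path.contains("..")) {  // crude traversal guard
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            InputStream in = Thread.currentThread().getContextClassLoader()
                    .getResourceAsStream(path.substring(1));
            if (in == null) {
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            resp.setContentType(getServletContext().getMimeType(path));
            OutputStream out = resp.getOutputStream();
            byte[] buffer = new byte[8192];
            for (int n; (n = in.read(buffer)) != -1; ) {
                out.write(buffer, 0, n);
            }
            in.close();
        }
    }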
Over the last few days I have read a lot about OSGi, and I have been considering how (and whether) I could use it as a standardized modularization approach. I was thinking about how individual tasks could be solved with OSGi:
Static resources - OSGi bundles which want to export static resources register ResourceLoader instances with the bundle context. A central ResourceServlet uses these resource loaders to load resources from bundles.
Facelet templates - similar to above, a central ResourceResolver uses services registered by bundles.
Managed beans - I have no idea how to use an expression like #{myBean.property} if myBean is defined in one of the bundles.
Servlet API stuff - use something like WebExtender/Pax Web to register servlets, filters and so on.
My questions are:
Am I inventing a bicycle here? Are there standard solutions for that? I've found a mention of Spring Slices but could not find much documentation about it.
Do you think OSGi is the right technology for the described task?
Is my sketch of an OSGi application more or less correct?
How should managed beans (especially request/session scope) be handled?
I'd be generally grateful for your comments.
What you're aiming to do sounds doable, with a few caveats:
The View Layer: First, your view layer sounds a little overstuffed. There are other ways to modularize JSF components by using custom components that will avoid the headaches involved with trying to create something as dramatic as late-binding managed beans.
The Modules Themselves: Second, your modules don't seem particularly modular. Your first bullet-list makes it sound as if you're trying to create interoperable web apps, rather than modules per se. My idea of a module is that each component has a well-defined, and more or less discrete, purpose. Like how ex underlies vi. If you're going down the OSGi route, then we should define modular like this: Modular, for the sake of this discussion, means that components are hot-swappable -- that is, they can be added and removed without breaking the app.
Dependencies: I'm a little concerned by your description of the modules as "possibly depending on each other." You probably (I hope) already know this, but your dependencies ought to form a directed acyclic graph. Once you introduce a circular dependency, you're asking for a world of hurt in terms of the app's eventual maintainability. One of the biggest weaknesses of OSGi is that it doesn't prevent circular dependencies, so it's up to you to enforce this. Otherwise your dependencies will grow like kudzu and gradually choke the rest of your system's ecosystem.
Servlets: Fuhgeddaboudit. You can't late-bind servlets into a web app, not until the Servlet 3.0 spec is in production (as Pascal pointed out). To launch a separate utility servlet, you'll need to put it into its own app.
OK, so much for the caveats. Let's think about how this might work:
You've defined your own JSF module to do... what, exactly? Let's give it a defined, fairly trivial purpose: a login screen. So you create your login screen, late-bind it using OSGi into your app and... then what? How does the app know the login functionality is there, if you haven't defined it in your .jspx page? How does the app know to navigate to something it can't know is there?
There are ways to get around this using conditional includes and the like (e.g., <c:if test="#{loginBean.notEmpty}">), but, like you said, things get a little hairy when your managed loginBean exists in another module that may not even have been introduced to the app yet. In fact, you'll get a servlet exception unless that loginBean exists. So what do you do?
You define an API in one of your modules. All the managed beans that you intend to share between modules must be specified as interfaces in this API layer. And all your modules must have default implementations of any of these interfaces they intend to use. And this API must be shared among all interoperable modules. Then you can use OSGi and Spring to wire together the specified beans with their implementations.
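As a rough sketch of that idea using the plain OSGi service registry (LoginService and both implementations are hypothetical):

    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceReference;

    // Hypothetical shared API, packaged in a bundle every module imports.
    interface LoginService {
        boolean login(String user, String password);
    }

    // The providing module registers its implementation as an OSGi service...
    class LoginModuleActivator implements BundleActivator {
        public void start(BundleContext context) {
            context.registerService(LoginService.class.getName(),
                    new LoginService() {
                        public boolean login(String user, String password) {
                            return true; // stand-in implementation
                        }
                    }, null);
        }
        public void stop(BundleContext context) { }
    }

    // ...and a consuming module looks it up through the registry, falling
    // back to a default so the app still works when the module is absent.
    class LoginServiceLocator {
        static LoginService find(BundleContext context) {
            ServiceReference ref = context.getServiceReference(LoginService.class.getName());
            return ref != null ? (LoginService) context.getService(ref)
                               : new LoginService() {
                                     public boolean login(String u, String p) { return false; }
                                 };
        }
    }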
I need to take a moment to point out that this is not how I would approach this problem. Not at all. Given something as simple as a login page, or even as complicated as a stock chart, I'd personally prefer to create a custom JSF component. But if the requirement is "I want my managed beans to be modular (i.e., hot-swappable, etc)," this is the only way I know to make it work. And I'm not even entirely sure it will work. This email exchange suggests that it's a problem that JSF developers have only just started to work on.
I normally consider managed beans to be part of the view layer, and as such I use them only for view logic, and delegate everything else to the service layer. Making managed beans late-binding is, to my mind, promoting them out of the view layer and into the business logic. There's a reason why all those tutorials are so focused on services: because most of the time you want to consider what it would take for your app to run "headless," and how easy it would be to "skin" your view if, for instance, you wanted it to run, with all its functionality, on an Android phone.
But it sounds like a lot of what you're working with is itself view logic -- for instance, the need to swap in a different view template. OSGi/Spring should be able to help, but you'll need something in your app to choose between available implementations: pretty much what OSGi's Service Registry was built to do.
That leaves static resources. You can modularize these, but remember, you'll need to define an interface to retrieve these resources, and you'll need to provide a default implementation so your app doesn't choke if they're absent. If i18n is a consideration, this could be a good way to go. If you wanted to be really adventurous, then you could push your static resources into JNDI. That would make them completely hot-swappable, and save you the pain of trying to resolve which implementation to use programmatically, but there are some downsides: any failed lookup will cause your app to throw a NamingException. And it's overkill. JNDI is normally used in web apps for app configuration.
As for your remaining questions:
Am I inventing a bicycle here? Are there standard solutions for that?
You are, a little. I've seen apps that do this kind of thing, but you seem to have stumbled into a fairly unique set of requirements.
Do you think OSGi is the right technology for the described task?
If you need the modules to be hot-swappable, then your choices are OSGi and the lighter-weight ServiceLocator interface.
Is my sketch of an OSGi application more or less correct?
I can't really tell without knowing more about where your component boundaries are. At the moment, it sounds like you may be pushing OSGi to do more than it is capable of doing.
But don't take my word for it. I found other reading material in these places.
And since you ask about Spring Slices, this should be enough to get you started. You'll need a Git client, and it looks like you'll be training yourself on the app by looking through the source code. And it's very early prototype code.
I am facing the same problems in my current project. In my opinion, OSGi is the best and cleanest solution in terms of standards and future support, but currently you may hit some problems if you try using it in a web application:
there is no well-integrated solution between a web container and the OSGi platform yet.
OSGi may be too much for a custom build web application that is just searching for a simple modularized architecture. I would consider OSGi if my project needs to support third party extensions that are not 100% under our control, if the project needs hot redeployments, strict access rules between plugins, etc.
A custom solution based on class loaders and resource filters seems very appropriate to me.
As an example, you can study the Hudson source code or the Java Plug-in Framework (JPF) project (http://jpf.sourceforge.net/).
As for extending web.xml, we may be in luck with the Servlet 3.0 specification (http://today.java.net/pub/a/today/2008/10/14/introduction-to-servlet-3.html#pluggability-and-extensibility).
The "web module deployment descriptor fragment" (aka web-fragment.xml) introduced by the Servlet 3.0 specification would be nice here. The specification defines it as:
"A web fragment is a logical partitioning of the web app in such a way that the frameworks being used within the web app can define all the artifacts without asking developers to edit or add information in the web.xml."
Java EE 6 may not be an option for you right now, though. Still, it would be the standardized solution.
Enterprise OSGi is a fairly new domain, so don't expect a solution that directly satisfies your need. That said, one of the things I found missing from Equinox (the OSGi engine behind Eclipse, and hence the one with the largest user base) is a consistent configuration/DI service. In a recent project we had similar needs and ended up building a simple configuration OSGi service.
One of the problems inherent to modular applications is DI, as module visibility can prevent class access in some cases. We got around this using a registered buddy policy, which is not ideal but works.
Other than configuration, you can take a look at the recently released Equinox book for guidance on using OSGi as a base for creating modular applications. The examples may be specific to Equinox, but the principles should work with any OSGi framework. Link - http://equinoxosgi.org/
You should look into Spring DM Server (it's being transitioned to Eclipse Virgo, but that hasn't been released yet). There are a lot of good things in the recent OSGi enterprise spec, which has also just been released.
Some of the Spring DM tutorials will help, I'd imagine. But yes, it's possible to have both resources and classes loaded from outside the web bundle using standard modularity. In that respect, it's a good fit.
As for the session context - it gets handled as you would expect in a session. However, you might run into problems sharing that session between web bundles, to the extent that I'm not sure it's even possible.
You could also look at having a single web bundle and then using, e.g., the Eclipse extension registry to extend the capabilities of your web app.
I've recently started learning the Spring Framework, and I'm a bit unclear on how the ApplicationContext is supposed to be used - in both standalone and web applications. I understand that the ApplicationContext, once instantiated with the spring configuration xml, is the "spring container" and is a singleton.
But:
In the starting point - the main method - of an app, do I use ApplicationContext.getBean("className") and then rely on DI for all other registered beans, or is there a way to use only DI?
Is there any other place besides the main method where I could/should use ApplicationContext.getBean("className")?
If and when should ApplicationContext.getBean("className") be used in a web app?
If, in your opinion, there is information I MUST know regarding DI in web applications, even though I may not have specifically asked about it, please share.
You need at least one call into the context from outside it; there's no avoiding that. With webapps, that part is hidden from you, and it feels like everything is using DI, even though Spring's servlet glue code is doing some unpleasantness behind the scenes.
Could, yes; should, no. There are very few good reasons for calling getBean yourself.
The most obvious scenario is when you have a servlet filter that needs access to the context. Filters aren't managed by Spring, and so cannot have anything wired into them by Spring.
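For illustration, such a filter might look like the following sketch; AuditService and the bean name are invented, but looking up the root context via WebApplicationContextUtils is the standard route:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import org.springframework.web.context.WebApplicationContext;
    import org.springframework.web.context.support.WebApplicationContextUtils;

    // Hypothetical Spring-managed bean the filter wants to use.
    interface AuditService { void record(String source); }

    // The container instantiates filters itself, so the filter has to
    // reach into the root web application context by hand.
    public class AuditFilter implements Filter {

        private AuditService auditService;

        public void init(FilterConfig config) {
            WebApplicationContext ctx = WebApplicationContextUtils
                    .getRequiredWebApplicationContext(config.getServletContext());
            auditService = (AuditService) ctx.getBean("auditService"); // name assumed
        }

        public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                throws IOException, ServletException {
            auditService.record(req.getRemoteAddr());
            chain.doFilter(req, resp);
        }

        public void destroy() { }
    }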
That's a bit too vague. Read the reference docs :)
I generally recommend only one usage of ApplicationContext.getBean() per application, and rely on the Spring configs to do the rest.
The exception applies in unit tests, where I want to load up a particular subset of beans (so I would explicitly load a bean that normally would be loaded from the top of my bean hierarchy).
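As an illustration of that single lookup in a standalone app, a minimal bootstrap might look like this; Application and the bean name "application" are placeholders for whatever root bean your context defines:

    import org.springframework.context.support.ClassPathXmlApplicationContext;

    // Hypothetical root bean of the object graph; everything behind it
    // is wired by Spring.
    interface Application { void run(); }

    public class Main {
        public static void main(String[] args) {
            // applicationContext.xml is assumed to define an "application" bean.
            ClassPathXmlApplicationContext ctx =
                    new ClassPathXmlApplicationContext("applicationContext.xml");
            Application app = (Application) ctx.getBean("application"); // the one lookup
            app.run();
            ctx.close();
        }
    }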
I've been using Spring for some time, but I always wondered how it works; more specifically, how does it load and weave beans/classes marked only with an interface or @annotation?
For the XML declarations, it's easy to see how Spring preprocesses my beans (they are declared in the XML context that Spring reads), but for classes marked only with annotations I can't see how that works, since I don't pass any agent to the JVM or the like.
I believe there is some Java/JVM hook that allows you to preprocess classes by some sort of criteria, but I wasn't able to find anything in the docs.
Can someone point me to some docs? Is this related to the java.lang.instrument.ClassFileTransformer API?
Actually, by default Spring does not do any bytecode postprocessing, neither for XML- nor annotation-configured beans. Instead, relevant beans are wrapped in dynamic proxies (see e.g. java.lang.reflect.Proxy in the Java SDK). Dynamic proxies wrap the actual objects you use and intercept method calls, allowing AOP advice to be applied. The difference is that proxies are essentially new artificial classes created by the framework, whereas weaving/bytecode postprocessing changes existing ones. The latter is impossible without using the Instrumentation API you mentioned.
As for the annotations, the implementation of the <context:component-scan> tag scans the classpath for all classes with Spring annotations and creates Spring metadata placeholders for them. After that they are treated as if they had been configured via XML (or, to be more precise, both are treated the same).
Although Spring doesn't do bytecode postprocessing itself, you can configure the AspectJ weaving agent, which should work just fine with Spring, if proxies do not satisfy you.
In a few large projects I have been working on lately, it seems increasingly important to choose one or the other (XML or annotations). As projects grow, consistency is very important for maintainability.
My questions are: what are the advantages of XML-based configuration over Annotation-based configuration and what are the advantages of Annotation-based configuration over XML-based configuration?
Annotations have their use, but they are not the one silver bullet to kill XML configuration. I recommend mixing the two!
For instance, if you're using Spring, it is entirely intuitive to use XML for the dependency-injection portion of your application. This keeps the code's dependencies away from the code that will be using them; by contrast, using some sort of annotation in the code that needs the dependencies makes the code aware of this automatic configuration.
However, instead of using XML for transaction management, marking a method as transactional with an annotation makes perfect sense, since this is information a programmer would probably wish to know. But the fact that an interface is going to be injected as a SubtypeY instead of a SubtypeX should not be included in the class, because if you now wish to inject SubtypeX, you have to change your code; with XML, since you had an interface contract anyway, you would just need to change the XML mappings, which is fairly quick and painless to do.
I haven't used JPA annotations, so I don't know how good they are, but I would argue that leaving the mapping of beans to the database in XML is also good, as the object shouldn't care where its information came from; it should just care what it can do with its information. But if you like JPA (I don't have any experience with it), by all means, go for it.
In general:
If an annotation provides functionality and acts as a comment in and of itself, and doesn't tie the code down to some specific process in order to function normally without this annotation, then go for annotations. For example, a transactional method marked as being transactional does not kill its operating logic, and serves as a good code-level comment as well. Otherwise, this information is probably best expressed as XML, because although it will eventually affect how the code operates, it won't change the main functionality of the code, and hence doesn't belong in the source files.
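As a concrete example of that rule of thumb, Spring's @Transactional reads as both configuration and a code-level comment; the service and repository below are invented for illustration:

    import javax.annotation.Resource;
    import org.springframework.stereotype.Service;
    import org.springframework.transaction.annotation.Transactional;

    // Hypothetical repository collaborator.
    interface AccountRepository {
        void withdraw(long accountId, long amount);
        void deposit(long accountId, long amount);
    }

    @Service
    public class TransferService {

        @Resource(name = "accountRepository")
        private AccountRepository accounts;

        // The annotation documents, right at the method, that it runs
        // inside a transaction.
        @Transactional // rolled back automatically on a RuntimeException
        public void transfer(long fromId, long toId, long amount) {
            accounts.withdraw(fromId, amount);
            accounts.deposit(toId, amount);
        }
    }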
There is a wider issue here, that of externalised vs inlined meta-data. If your object model is only ever going to be persisted in one way, then inlined meta-data (i.e. annotations) is more compact and readable.
If, however, your object model was reused in different applications in such a way that each application wanted to persist the model in different ways, then externalising the meta-data (i.e. XML descriptors) becomes more appropriate.
Neither one is better, and so both are supported, although annotations are more fashionable. As a result, new hair-on-fire frameworks like JPA tend to put more emphasis on them. More mature APIs like native Hibernate offer both, because it's known that neither one is enough.
I always think about annotations as some kind of indicator of what a class is capable of, or how it interacts with others.
Spring XML configuration, on the other hand, is to me just that: configuration.
For instance, information about the IP and port of a proxy definitely belongs in an XML file; it is runtime configuration.
Using @Autowired or @Element to indicate to the framework what to do with the class is a good use of annotations.
Putting the URL into the @WebService annotation is bad style.
But this is just my opinion.
The line between interaction and configuration is not always clear.
I've been using Spring for a few years now, and the amount of XML that was required was definitely getting tedious. Between the new XML schemas and annotation support in Spring 2.5, I usually do these things:
Using "component-scan" to autoload classes which use @Repository, @Service or @Component. I usually give every bean a name and then wire them together using @Resource. I find that this plumbing doesn't change very often so annotations make sense.
Using the "aop" namespace for all AOP. This really works great. I still use it for transactions too because putting @Transactional all over the place is kind of a drag. You can create named pointcuts for methods on any service or repository and very quickly apply the advice.
I use LocalContainerEntityManagerFactoryBean along with HibernateJpaVendorAdapter to configure Hibernate. This lets Hibernate easily auto-discover @Entity classes on the classpath. Then I create a named SessionFactory bean using "factory-bean" and "factory-method" referring to the LCEMFB.
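A small sketch of that wiring style, with illustrative bean names and types:

    import javax.annotation.Resource;
    import org.springframework.stereotype.Repository;
    import org.springframework.stereotype.Service;

    // Both classes are picked up by <context:component-scan/>.
    @Repository("orderRepository")
    class OrderRepository {
        // ... data-access methods
    }

    @Service("orderService")
    class OrderService {

        // Wired by bean name rather than by type, as described above.
        @Resource(name = "orderRepository")
        private OrderRepository orderRepository;
    }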
An important part in using an annotation-only approach is that the concept of a "bean name" more or less goes away (becomes insignificant).
The "bean names" in Spring form an additional level of abstraction over the implementing classes. With XML, beans are defined and referenced relative to their bean names. With annotations, they are referenced by their class/interface. (Although the bean name exists, you do not need to know it.)
I strongly believe that getting rid of superfluous abstractions simplifies systems and improves productivity. For large projects I think the gains by getting rid of XML can be substantial.
It depends on what you want to configure, because there are some options that cannot be configured with annotations. Seen from the annotations side:
plus: annotations are less talky
minus: annotations are less visible
It's up to you what is more important...
In general I would recommend choosing one way and using it throughout some closed part of the product...
(with some exceptions: e.g. if you choose XML-based configuration, it's OK to use the @Autowired annotation. It's mixing, but this one helps both readability and maintainability)
I think that visibility is a big win with an XML-based approach. I find that the XML isn't really that bad, given the various tools out there for navigating XML documents (e.g. Visual Studio + ReSharper's File Structure window).
You can certainly take a mixed approach, but that seems dangerous to me if only because, potentially, it would make it difficult for new developers on a project to figure out where different objects are configured or mapped.
I don't know; in the end XML Hell doesn't seem all that bad to me.
There are other aspects to compare, like refactoring and other code changes. With XML, refactoring takes serious effort because you have to take care of all the XML content; it is easy with annotations.
My preferred way is Java-based configuration with no (or minimal) annotations: http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/beans.html#beans-java
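A minimal sketch of what that looks like (the bean types are placeholders); such a class can be bootstrapped with new AnnotationConfigApplicationContext(AppConfig.class):

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    // Placeholder bean types.
    class OrderRepository { }
    class OrderService {
        private final OrderRepository repository;
        OrderService(OrderRepository repository) { this.repository = repository; }
    }

    // Plain-Java equivalent of an XML bean-definition file.
    @Configuration
    public class AppConfig {

        @Bean
        public OrderRepository orderRepository() {
            return new OrderRepository();
        }

        @Bean
        public OrderService orderService() {
            // Wiring is an ordinary method call, so the compiler checks it.
            return new OrderService(orderRepository());
        }
    }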
I might be wrong, but I thought annotations (as in Java's @Tag and C#'s [Attribute]) were a compile-time option, while XML is a run-time option. That, to me, says they are not equivalent and have different pros and cons.
I also think a mix is the best thing, but it also depends on the type of configuration parameters.
I'm working on a Seam project that also uses Spring, and I usually deploy it to different development and test servers. So I have split:
Server specific configuration (Like absolute paths to resources on server): Spring XML file
Injecting beans as members of other beans (or reusing a Spring XML defined value in many beans): Annotations
The key difference is that you don't have to recompile the code for every change of server-specific configuration; you just edit the XML file.
There's also the advantage that some configuration changes can be done by team members who don't understand all the code involved.
Within the scope of a DI container, I consider annotation-based DI an abuse of Java annotations. That said, I don't recommend using it widely in your project. If your project really does need the power of a DI container, I would recommend Spring IoC with the XML-based configuration option.
If it is just for the sake of unit testing, developers should apply the dependency injection pattern in their code and take advantage of mocking tools such as EasyMock or JMock to circumvent dependencies.
You should try to avoid using a DI container in the wrong context.
Configuration information that is always going to be linked to a specific Java component (class, method, or field) is a good candidate to be represented by annotations. Annotations work especially well in this case when the configuration is core to the purpose of the code. Because of the limitations on annotations, it's also best when each component can only ever have one configuration. If you need to deal with multiple configurations, especially ones that are conditional on anything outside the Java class containing an annotation, annotations may create more problems than they solve. Finally, annotations cannot be modified without recompiling the Java source code, so anything that needs to be reconfigurable at run time can't use annotations.
Please refer following links. They might be useful too.
Annotations vs XML, advantages and disadvantages
http://www.ibm.com/developerworks/library/j-cwt08025/
This is the classic 'configuration versus convention' question. Personal taste dictates the answer in most cases. However, personally I prefer configuration (i.e. XML-based) over convention. IMO, IDEs are sufficiently robust to overcome some of the XML hell people often associate with building and maintaining an XML-based approach. In the end, I find the benefits of configuration (such as building utilities to build, maintain and deploy the XML config file) outweigh convention in the long run.
I use both. Mostly XML, but when I have a bunch of beans that inherit from a common class and have common properties, I use annotations for those in the superclass, so I don't have to set the same properties for each bean. Because I'm a bit of a control freak, I use @Resource(name = "referredBean") instead of just autowiring stuff (and save myself a lot of trouble if I ever need another bean of the same class as the original referredBean).
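Roughly what that superclass pattern looks like (all names invented):

    import javax.annotation.Resource;

    // Hypothetical dependency shared by a family of beans.
    class ReferredBean { }

    // The common property is annotated once in the superclass...
    abstract class AbstractReportBean {
        @Resource(name = "referredBean")
        protected ReferredBean referredBean;
    }

    // ...and every subclass inherits the wiring without repeating it.
    class SalesReportBean extends AbstractReportBean { }
    class InventoryReportBean extends AbstractReportBean { }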
There are some pros and cons of annotation configuration from my experience:
When it comes to JPA configuration, since it is done once and usually not changed very often, I prefer to stick with annotation configuration. There may be a concern about the possibility of seeing the bigger picture of the configuration; in this case I use MySQL Workbench diagrams.
XML configuration is very good for getting the bigger picture of an application, but it may be cumbersome to find some errors before runtime. In this case Spring's @Configuration annotation sounds like a better choice, since it lets you see the bigger picture as well and also allows the configuration to be validated at compile time.
As for Spring configuration, I prefer to combine both approaches: use the @Configuration annotation with services and query interfaces, and XML configuration for the dataSource and Spring infrastructure such as context:component-scan base-package="...".
But XML configuration beats Java annotations when it comes to flow configuration (Spring Web Flow or Lexaden Web Flow), since it is extremely important to see the bigger picture of the whole business process, and it would be cumbersome to implement that with the annotations approach.
I prefer combining both approaches: Java annotations and an essential XML minimum that minimizes configuration hell.
For the Spring Framework I like the idea of being able to use the @Component annotation and setting the "component-scan" option so that Spring can find my Java beans, so that I do not have to define all of my beans in XML, nor in JavaConfig. For example, for stateless singleton Java beans that simply need to be wired up to other classes (ideally via an interface), this approach works very well. In general, for Spring beans I have for the most part moved away from the Spring XML DSL for defining beans, and now favor the use of JavaConfig and Spring annotations, because you get some compile-time checking of your configuration and some refactoring support that you don't get with Spring XML configuration. I do mix the two in certain rare cases where I've found that JavaConfig/annotations can't do what is available using XML configuration.
For Hibernate ORM (haven't used JPA yet) I still prefer the XML mapping files because annotations in domain model classes to some degree violates The Clean Architecture which is a layering architectural style I have adopted over the past few years. The violation occurs because it requires the Core Layer to depend on persistence related things such as Hibernate or JPA libraries and it makes the domain model POJOs a bit less persistence ignorant. In fact the Core Layer is not supposed to depend on any other infrastructure at all.
However, if The Clean Architecture is not your "cup of tea" then I can see there are definitely advantages (such as convenience and maintainability) of using Hibernate/JPA annotations in domain model classes over separate XML mapping files.