How does ServiceLocator find @Service and @Contract automatically in HK2?

According to the HK2 @Service javadoc:
"Annotation placed on classes that are to be automatically added to an hk2 ServiceLocator."
I don't know how to make ServiceLocator find annotated classes automatically.
TestService
@Contract
public interface TestService {
}
TestServiceImpl
@Service
public class TestServiceImpl implements TestService {
}
Main
public static void main(String[] args) {
    ServiceLocator locator = ServiceLocatorUtilities.createAndPopulateServiceLocator();
    TestService service = locator.getService(TestServiceImpl.class);
    System.out.println(service); // null
}
The result is always null. I have to add a Descriptor so the ServiceLocator can find it:
public static void main(String[] args) {
    ServiceLocator locator = ServiceLocatorUtilities.createAndPopulateServiceLocator();
    DynamicConfigurationService dcs = locator.getService(DynamicConfigurationService.class);
    DynamicConfiguration config = dcs.createDynamicConfiguration();
    config.bind(BuilderHelper.link(TestServiceImpl.class).to(TestService.class).in(Singleton.class).build());
    config.commit();
    TestService service = locator.getService(TestServiceImpl.class);
    System.out.println(service); // TestServiceImpl instance
}
How do I let ServiceLocator find the annotated classes automatically? Did I misunderstand something?

You need to run the hk2-inhabitant-generator over your built classes in order to get automatic detection of services. There is more information in the inhabitant-generator documentation (https://javaee.github.io/hk2/inhabitant-generator.html) as well.
What that step does in the build process is to create a file named META-INF/hk2-locator/default with information about services. The createAndPopulateServiceLocator call then reads those files and automatically adds those service descriptors into the returned ServiceLocator.
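For illustration, an entry in that file follows the [service-class-name]S / contract={contract-class-name} layout shown further down; for the classes in this question it would look roughly like this (the com.example package is just a placeholder):
[com.example.TestServiceImpl]S
contract={com.example.TestService}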

FYI, I was so frustrated with the reliance on the inhabitant files, rather than having the capability for runtime scanning of annotated classes, that I wrote this project:
https://github.com/VA-CTT/HK2Utilities
Since Eclipse / Maven / inhabitant runtime generators wouldn't play nice, it was nearly impossible to debug code that made use of HK2 in Eclipse without runtime scanning.
The HK2Utilities package is available in central:
<dependency>
<groupId>gov.va.oia</groupId>
<artifactId>HK2Utilities</artifactId>
<version>1.4.1</version>
</dependency>
To use it, you just call:
ServiceLocator locator = HK2RuntimeInitializer.init("myName", false, new String[]{"my.package.one", "my.package.two"});
This will scan the runtime classpath for classes in the packages listed, and automatically populate the service locator with them.
You don't ever have to generate inhabitant files with this model - and in practice, I found it to be faster than the inhabitant-processing code as well (not that the performance matters much for this one-time operation).
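For completeness, a minimal sketch of using it with the classes from the question (the scanned package name is an assumption; point it at wherever your @Service classes actually live):
public static void main(String[] args) throws Exception {
    // Scan the listed packages at runtime; no META-INF/hk2-locator files are needed.
    ServiceLocator locator = HK2RuntimeInitializer.init(
            "myLocator", false, new String[] { "com.example" });

    TestService service = locator.getService(TestService.class);
    System.out.println(service); // expected: a TestServiceImpl instance
}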
---edit---
I still maintain this code - the current release is:
<dependency>
<groupId>net.sagebits</groupId>
<artifactId>HK2Utilities</artifactId>
<version>1.5.2</version>
</dependency>
And the project location is now:
https://github.com/darmbrust/HK2Utilities

Well, now (2.6.1) all you need to do is add the dependencies: javax.inject, hk2-utils, hk2-api and hk2-metadata-generator.
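A rough pom.xml sketch of those dependencies (2.6.1 is the version mentioned above; the exact coordinates, in particular javax.inject:javax.inject:1, are my best guess and worth double-checking):
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
</dependency>
<dependency>
<groupId>org.glassfish.hk2</groupId>
<artifactId>hk2-utils</artifactId>
<version>2.6.1</version>
</dependency>
<dependency>
<groupId>org.glassfish.hk2</groupId>
<artifactId>hk2-api</artifactId>
<version>2.6.1</version>
</dependency>
<dependency>
<groupId>org.glassfish.hk2</groupId>
<artifactId>hk2-metadata-generator</artifactId>
<version>2.6.1</version>
</dependency>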
When you build the project, the javac annotation processing step will generate a 'default' file under META-INF/hk2-locator containing the wiring, as follows:
[service-class-name]S
contract={contract-class-name}
This will be registered by the ServiceLocator at run time.
This should be sufficient. However, if that does not work, there are other options.
Maven plugin:
<plugin>
    <groupId>org.glassfish.hk2</groupId>
    <artifactId>hk2-inhabitant-generator</artifactId>
    <version>2.5.0-b36</version>
    <executions>
        <execution>
            <goals>
                <goal>generate-inhabitants</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Command-line tool:
java org.jvnet.hk2.generator.HabitatGenerator
    [--file jarFileOrDirectory]
    [--outjar jarFile]
    [--locator locatorName]
    [--verbose]
More on this at https://javaee.github.io/hk2/inhabitant-generator.html

Related

Java EE CDI - obtaining a new instance of a class each time a method is called

I am looking to do some refactoring of a Java EE application, but I am not clear on how to have CDI provide the needed dependencies.
The current setup is quite simple/easy to understand:
@ApplicationScoped
public class MyApplication {

    @Inject
    @Named("Default")
    private Dependency dependency;

    public void doStuff() {
        dependency.process();
    }
}
I now need a new instance of Dependency each time I call doStuff().
I am unclear on how to use CDI to create this for me. My Dependency has its own dependencies that I would like CDI to create for me.
I expect there is a layer of indirection I need to add.
Additional context:
This class is part of a process that polls for work to be done, and is hosted in Wildfly.
We are not using Spring in the project.
Since what you desire is a new instance of Dependency each time the method is called, I think what you need is a Provider (javax.inject.Provider<T>) injected into your class/bean.
Inject the provider into your current class:
@Inject @Named("Default") Provider<Dependency> provider;
Then, in your method doStuff(), obtain the new instance:
Dependency dependency = provider.get();
This should get you going.
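Putting the pieces together with the question's Dependency type, a minimal sketch (this assumes Dependency is @Dependent-scoped, the CDI default, so every get() returns a fresh instance; the @Named("Default") qualifier is carried over from the question):
@ApplicationScoped
public class MyApplication {

    // The container injects a Provider; each get() asks CDI for a new Dependency.
    @Inject
    @Named("Default")
    private Provider<Dependency> dependencyProvider;

    public void doStuff() {
        Dependency dependency = dependencyProvider.get(); // fresh instance per call
        dependency.process();
    }
}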
Is there a specific reason you need to use CDI besides the dependency injection?
If not, I'd suggest making doStuff() take a Dependency object as a parameter:
public void doStuff(Dependency dependency) {
dependency.process();
}
Then, when you call the method, you provide it with a new instance of Dependency:
myApplication.doStuff(new Dependency());
That way, you are still keeping your class less coupled than declaring a new instance in the constructor or field.

Autowiring component not referenced directly

I have a spring-boot project with 3 maven modules in it: producer, consumer and api.
Api contains an interface.
Consumer depends on api and contains the main method and the @SpringBootApplication class. That class is also in a package that prefixes all the other classes in the other two jars, so component scanning can find everything.
Producer depends on api and contains an implementation of the api interface, annotated with @Service.
In Consumer, I'm trying to get the producer injected via the constructor, but without referencing the concrete implementation, just the interface. The consumer Maven module doesn't even depend on the producer module. This is similar to the way you create applications in OSGi, where concrete provider implementations are supposed to be hidden from their consumers.
My problem is that the producer is not being injected. It is not being instantiated or even its class loaded since nobody is referencing it. How can I accomplish this in spring (boot) while keeping the strong encapsulation requirement of consumers not being aware of concrete producers?
When I run the app I get an UnsatisfiedDependencyException, since there is no producer instantiated to be injected.
This is a simplified representation of my code
package com.foo.api;

public interface Doer {
}

===== different jar =====

package com.foo;

@SpringBootApplication
public class Consumer {

    public static void main(String[] args) {
        SpringApplication.run(Consumer.class, args);
    }

    @Autowired
    public Consumer(Doer someDoer) {
    }
}

===== different jar =====

package com.foo.services;

@Service
public class Producer implements Doer {
}
You need to add the Producer module as a dependency of the Consumer module, otherwise Maven is not packaging it with the Spring Boot application - hence it is not available at runtime (the jar is missing).
In order to keep the good separation you defined, make sure to set the dependency's scope to runtime. This means it is not required at compile time, but Maven knows it needs to be present at runtime, so it packages the dependency with the application.
For example, add the following to the consumer's pom.xml (after setting the correct values...):
<dependency>
<groupId>com.foo.services</groupId>
<artifactId>producer</artifactId>
<version>1.2.3</version>
<scope>runtime</scope>
</dependency>

Can I use @Profile tags on components without creating an @Configuration class in Spring Boot or a web.xml

I've managed to avoid developing any XML files so far for my Spring Boot Maven project (apart from the POM), with them all being generated on compile, and I was hoping to keep it that way by defining my profiles within the run commands as specified here.
By simply using @ComponentScan on my main class to enable the scanning of components and tagging my DAO as @Repository, I have successfully managed to autowire my UserDAOmySQLImpl class (which implements UserDAO).
@Autowired
private UserDAO userDAO;
Looking forward to adding a second DAO for live, where we use a db8 implementation, I need the application to figure out which UserDAO implementation should be used. Profiles seemed to be the answer.
After some reading, it seemed that I mostly need to add some sort of configuration classes and external XML to manage these profiles and components, though this seems messy and, I am hoping, unnecessary.
I have tagged my two implementations like so:
@Repository
@Profile("dev")
public class UserDAOmySQLImpl implements UserDAO {...
-
@Repository
@Profile("dev")
public class UserDAOdb8Impl implements UserDAO {...
I then set the active profile through the Maven goal as specified here, in the hope that this would be a nice clean solution, letting me specify the active profile within the goal for our dev and live build scripts respectively.
My current goal for testing is:
spring-boot:run -Drun.profiles=dev
However, when building I receive an error that both beans are being picked up.
No qualifying bean of type [com.***.studyplanner.integration.UserDAO] is defined: expected single matching bean but found 2: userDAOdb8Impl,userDAOmySQLImpl
Questions
Is the profile set in the Maven goal the one being checked against when using the @Profile tag?
Why is Spring picking up both implementations, when if the profile isn't being set properly surely neither implementation should be selected?
Any suggestions on a nice clean way to achieve what I'm looking for?
Bonus
I would really like to set the profile within the app, as it would be easy to simply check whether an environment file exists to decide which profile to use, though I'm struggling to replicate this (found within Spring Profiles Examples):
public static void main(String[] args) {
    AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
    // Enable a "live" profile
    context.getEnvironment().setActiveProfiles("live");
    context.register(AppConfig.class);
    context.refresh();
    ((ConfigurableApplicationContext) context).close();
}
into the unusual main class in the application I am working on, which I am hesitant to play around with:
public class StudyPlannerApplication {
    ...
    public static void main(String[] args) {
        SpringApplication.run(StudyPlannerApplication.class, args);
    }
    ...
}
Any help would much appreciated.
Cheers,
Scott.
Silly me - proofreading the question, I noticed that a bad copy & paste job meant that both DAO implementations had @Profile set to "dev". After changing the db8 DAO to @Profile("live"), the above works fine.
So choosing your repository based on profile is actually quite easy when using Maven and Spring Boot.
1) Ensure your main class has the @ComponentScan annotation:
@ComponentScan
public class StudyPlannerApplication {
2) Tag your components with @Profile according to which profile you would like them selected for:
@Repository
@Profile("dev")
public class UserDAOmySQLImpl implements UserDAO {...
-
@Repository
@Profile("live")
public class UserDAOdb8Impl implements UserDAO {...
3) Send your profile in with your Maven goal:
spring-boot:run -Drun.profiles=dev
Hopefully this will help others, as I haven't seen a full example of using profiles with autowiring like this elsewhere on the web.
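For the bonus part, a sketch of one possible way to pick the profile from inside the main class with Spring Boot (this is not from the original answer; the environment-file check and its path are purely illustrative):
public static void main(String[] args) {
    // Decide the profile inside the app, e.g. based on whether some marker file exists.
    String profile = new java.io.File("/etc/studyplanner/live.marker").exists() ? "live" : "dev";

    new SpringApplicationBuilder(StudyPlannerApplication.class)
            .profiles(profile)
            .run(args);
}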

reflections library inside Wildfly 8.0.2

I am trying to scan classes with the Reflections library. If I add the dynamic web project to another project (a plain Java one), I get the classes I want, but if the code runs inside a @Startup bean, the result is empty.
Here is the code:
Reflections reflections = new Reflections(
        new ConfigurationBuilder().filterInputsBy(
                new FilterBuilder.Include(
                        FilterBuilder.prefix("my.package")
                )
        ).setUrls(
                ClasspathHelper.forJavaClassPath()
        ).setScanners(
                new SubTypesScanner(false)
        )
);
Set<Class<? extends Object>> testClasses = reflections.getSubTypesOf(Object.class);
The my.package prefix should be changed to whatever package prefix is actually used.
The testClasses Set is empty.
If the same code is running in a different project referencing this one, no other change, then the Set is populated with all classes inside the package.
The Maven dependency for Reflections is:
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.9-RC1</version>
</dependency>
Wildfly 8.2.0.
For now, I can save the file extracted in the external project and use the load function, but this will not be dynamic as it should be.
I struggled with this one for quite a while. It looks like, due to the way it works, unless you load the full Java class path (which makes it very heavy), you should run the reflection a little bit later. This is mostly due to dynamic class creation during the EJB initialization phase, @Startup beans included.
To load the full class path (from a startup bean)
urls.addAll(ClasspathHelper.forJavaClassPath());
To make it cross-JEE friendly (e.g. WildFly), you need to reflect from a ServletContextListener. For me the right place was in the constructor, but a static field may work.
public class GuiceContext implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent servletContextEvent) {
        // Collect these URLs and pass them to Reflections (see the sketch below).
        ClasspathHelper.forWebInfLib(servletContextEvent.getServletContext());
        ClasspathHelper.forWebInfClasses(servletContextEvent.getServletContext());
    }

    @Override
    public void contextDestroyed(ServletContextEvent servletContextEvent) {
    }
}
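A sketch of what the listener's contextInitialized could look like once those URLs are actually fed into Reflections (the my.package prefix is the question's placeholder, and the ConfigurationBuilder calls mirror the question's snippet):
@Override
public void contextInitialized(ServletContextEvent servletContextEvent) {
    // Use WEB-INF/lib and WEB-INF/classes instead of the whole java class path.
    Collection<URL> urls = new ArrayList<>(
            ClasspathHelper.forWebInfLib(servletContextEvent.getServletContext()));
    urls.add(ClasspathHelper.forWebInfClasses(servletContextEvent.getServletContext()));

    Reflections reflections = new Reflections(new ConfigurationBuilder()
            .filterInputsBy(new FilterBuilder.Include(FilterBuilder.prefix("my.package")))
            .setUrls(urls)
            .setScanners(new SubTypesScanner(false)));

    Set<Class<? extends Object>> testClasses = reflections.getSubTypesOf(Object.class);
}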
Try and get your beans to initialize stand-alone without dependencies. You can also use a custom injector like Guice to push your beans. In this case you would use the GuiceServletContextListener class.
With new SubTypesScanner(false) you are disabling the default exclusion of Object.class (so direct subtypes of Object are scanned); the startup bean in this instance may not be loading because of that.
new SubTypesScanner(false);
A complete library with org.reflections and guice directly implemented can be found at https://github.com/GedMarc/GuiceInjection

How to mock services with Arquillian?

Is it possible to use some kind of mocking framework with Arquillian, or more precisely, how do I mock injected EJBs? I know that, when using CDI (Contexts and Dependency Injection), it is possible to inject alternatives in tests. But without CDI as the injection mechanism, when I'm only using EJB injection, how is this possible?
Recently I have tested my EJBs with a service-interface mock implementation as follows:
// Service interface
public interface Audit {
    void audit(String info);
}

// Mock implementation
@Stateless
public class MockAuditBean implements Audit {

    public static String lastInfo = null;

    @Override
    public void audit(String info) {
        lastInfo = info;
    }
}

// assert in test
assertTrue(MockAuditBean.lastInfo.contains("dummy"));
This approach is possible but requires a lot of custom mock implementations. What is worse, the injected mock instances are proxies that use the service interface. They cannot be cast to the mock implementation class to compare results; only static members and methods of the mock implementation can be used.
I have also tested another possibility: setting the related EJBs manually. This approach has several drawbacks. It requires that the target EJB of the test has non-private members, or setters for them. When the target EJB relies on the @PostConstruct lifecycle annotation, you have to call that method yourself after your manual "injection".
The advantage of this solution is the ability to use mocking frameworks, like Mockito or jMock.
Does someone have experience to share on how to set up such an integration test, or even use mocking frameworks in it?
IMO, EJBs were not designed with testing in mind. Your alternative sounds like a good enough compromise and I'd go for it. Using Mockito is a major plus and I use it even when working with CDI.
I'd use the "default" (package-private) member scope and Javadoc telling other developers to access those members for testing purposes only.
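A tiny sketch of that idea, reusing the question's Audit contract (the bean and method names here are made up):
@Stateless
public class OrderBean {

    /** Package-private on purpose: the container injects it, and a test in the same package can overwrite it with a mock. */
    @EJB
    Audit audit;

    public void placeOrder(String info) {
        audit.audit(info);
    }
}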
This article from Oracle shows an approach to "injecting" an EJB for testing using JUnit and Mockito:
http://www.oracle.com/technetwork/articles/java/unittesting-455385.html
Edit:
Basically the inclusion of Mockito allows for mocking objects like EntityManager etc.:
import static org.mockito.Mockito.*;
...
em = mock(EntityManager.class);
They show the approach for EJBs as well, using Mockito. Given an EJB:
@Stateless
public class MyResource {

    @Inject
    Instance<Consultant> company;

    @Inject
    Event<Result> eventListener;
The test can "inject" those objects:
public class MyResourceTest {

    private MyResource myr;

    @Before
    public void initializeDependencies() {
        this.myr = new MyResource();
        this.myr.company = mock(Instance.class);
        this.myr.eventListener = mock(Event.class);
    }
Note that MyResource and MyResourceTest are in the same package but in different source folders, so your tests have access to the package-private fields company and eventListener.
Edit:
Note: you can use FacesMockitoRunner from JBoss (https://community.jboss.org/thread/170800) to get this done for the common JSF components, and use annotations for the others (Java EE 6 with CDI enabled is a prerequisite for this, but it does not require a JBoss server):
Include the JUnit, Mockito, and jsf-mockito dependencies in Maven:
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>1.9.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.test-jsf</groupId>
<artifactId>jsf-mockito</artifactId>
<version>1.1.7-SNAPSHOT</version>
<scope>test</scope>
</dependency>
Add the @RunWith annotation to your test:
@RunWith(FacesMockitoRunner.class)
public class MyTest {
Inject common Faces objects using annotations:
@Inject
FacesContext facesContext;

@Inject
ExternalContext ext;

@Inject
HttpServletRequest request;
Mock any other objects using the @org.mockito.Mock annotation (it appears FacesMockitoRunner calls this behind the scenes, so it may not be necessary here):
@Mock MyUserService userService;
@Mock MyFacesBroker broker;
@Mock MyUser user;
Initialize the injected mocks in a @Before method:
@Before
public void initMocks() {
    // Init the mocks from above
    MockitoAnnotations.initMocks(this);
}
Set up your test as usual:
assertSame(FacesContext.getCurrentInstance(), facesContext);
when(ext.getSessionMap()).thenReturn(session);
assertSame(FacesContext.getCurrentInstance().getExternalContext(), ext);
assertSame(FacesContext.getCurrentInstance().getExternalContext().getSessionMap(), ext.getSessionMap());
etc.
You may want to take a look at testfun-JEE, which allows you to unit-test (not integration-test) your EJBs outside of a container.
testfun-JEE takes care of injecting EJBs as well as an EntityManager and some standard resources directly into your test class - references within these EJBs to other EJBs are resolved automatically.
And the coolest thing is that you can mock any dependency by simply adding a member variable to your test annotated with @Mock - testfun-JEE will inject this mock wherever needed.
See examples in https://github.com/michaelyaakoby/testfun.
BTW, while this framework was published very recently (like today...), it has been widely used for over a year in my company.
Work with a framework like Mockito.
Unfortunately, Arquillian does not automatically include the necessary dependencies.
You can add them in your @Deployment method:
@Deployment
public static WebArchive deploy() {
    return ShrinkWrap.create(WebArchive.class)
            .addAsLibraries( // add Maven-resolved artifacts to the deployment
                    DependencyResolvers.use(MavenDependencyResolver.class)
                            .artifact("org.mockito:mockito-all:1.8.3")
                            .resolveAs(GenericArchive.class));
}
Then in your @Test method you could use:
mock(MockedService.class).methodName()
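For example (MockedService and methodName are the placeholder names from above; the String return type is an assumption):
@Test
public void usesMockedService() {
    MockedService service = mock(MockedService.class);
    when(service.methodName()).thenReturn("stubbed");

    assertEquals("stubbed", service.methodName());
}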
This GitHub showcase shows a way to allow auto discovery, which seems to require some setup.
If you really want to interact with mocks in your integration tests (for instance, because you don't have a full-blown implementation yet, or you have a facade to external systems which you don't have control over), there is quite an easy way to integrate Mockito with your Arquillian tests; have a look at this example from the showcase. It's actually an extension on its own, but not released as one.
