Say I have the following EJB (using ejb3):
@Stateless(name="Queries")
@Remote(Queries.class)
@Local(Queries.class)
public final class QueriesEJB implements Queries {
...
}
The class is available through both a local and a remote interface.
How can I inject the local interface for this EJB in another part of the app?
Specifically, I'm not sure how to write an @EJB annotation that selects the local interface. For example, is the following sufficient?
@EJB(name="Queries") private Queries queries;
In particular, I want to avoid creating separate local and remote interfaces simply for the purpose of distinguishing them via @EJB's 'beanInterface' property.
According to the spec you cannot have an interface that is both Remote and Local at the same time. However, you can create a super-interface, put all the methods there, and then create two sub-interfaces. Having done that, simply use @EJB. This way you only need to maintain one interface.
EDIT: See section 3.2 in "EJB3 spec simplified" at http://jcp.org/aboutJava/communityprocess/final/jsr220/index.html
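A minimal sketch of that layout, using the names from the question (the bean and the injection point are illustrative only):

    import javax.ejb.Local;
    import javax.ejb.Remote;
    import javax.ejb.Stateless;

    // All business methods live in the shared super-interface.
    public interface Queries { /* business methods */ }

    @Local
    public interface QueriesLocal extends Queries { }

    @Remote
    public interface QueriesRemote extends Queries { }

    // The bean implements both views; no duplicated method declarations.
    @Stateless(name = "Queries")
    public class QueriesEJB implements QueriesLocal, QueriesRemote { /* ... */ }

    // Injecting the local view is then unambiguous:
    // @EJB private QueriesLocal queries;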
When an EJB is deployed, the container looks at the interfaces and identifies the local and remote ones. I would say that the EJB container already uses the local interface in your example. It simply does not make sense to use a remote interface in this case, because the container has the choice to use the local one.
If you want to be sure, try using the JNDI name of the local interface as the parameter of the @EJB annotation.
@EJB(name="java:comp/env/ejb/EntitySupplierLocal")
In the example above I appended Local to the interface name. In your case you have to take a look at the JNDI context to get the right name, or you may already know it ;).
Generally I recommend using a base interface that defines the business methods and extending it with a local and a remote interface. That way you do not have to duplicate methods, and you can extend the functionality for local and remote access separately.
public interface Queries { ... }

@Local
public interface QueriesLocal extends Queries { ... }

@Remote
public interface QueriesRemote extends Queries { ... }
Solutions from the previous comments are not fully compatible with EJB 3.0 on JBoss.
You can easily get this error:
org.jboss.ejb3.common.resolvers.spi.NonDeterministicInterfaceException:
beanInterface specified, Queries, is not unique within EJB QueriesEJB
Create only this:
public interface Queries { ... } // Local by default

@Remote
public interface QueriesRemote extends Queries { ... }
This works.
Related
I am working on a program that supports 3 different platforms. These platforms have identical logic, but the program has to work with a different database for each one.
I have three different Database.java files, one for each platform. For example:
com.myproject.dao.bmw.Database.java
com.myproject.dao.ford.Database.java
com.myproject.dao.chevy.Database.java
The Database classes all have the same method signatures. But their database connection or queries may be different.
I set the platform name, which in this case is the car make, using a config.properties file. Throughout the program I call the methods inside the Database class many times, depending on which platform is set in config.properties.
I want to get the Database object based on what is set in the config.properties file when the program starts, while keeping the same object name for the database. That way, each time I call a method of the Database class I would not have to use if statements or switches.
What is the best way to achieve my goal?
This sounds like a job for the Factory pattern.
Create an interface CarDB (or ICarDb if you prefer that naming convention, so you know it is an interface) that contains all the common methods.
Create 3 classes that implement CarDB: Ford, Bmw and Chevy.
Create a CarDbFactory that has a method like CarDB getDb(Params params) that, given your parameters, returns a CarDB; the actual one (Ford, Bmw, ...) depends on the parameters. A rough sketch follows below.
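Here is a rough sketch of what that could look like, assuming the platform key is read straight from config.properties instead of being passed in as a Params object; the property name "platform" and the method shapes are assumptions, not from the question:

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    // Common interface: every platform database exposes the same operations.
    interface CarDB {
        void connect();
        // ... the rest of the shared method signatures
    }

    class Bmw implements CarDB {
        public void connect() { /* BMW-specific connection/queries */ }
    }

    class Ford implements CarDB {
        public void connect() { /* Ford-specific connection/queries */ }
    }

    class Chevy implements CarDB {
        public void connect() { /* Chevy-specific connection/queries */ }
    }

    class CarDbFactory {
        // Picks the implementation once, based on config.properties on the classpath.
        static CarDB getDb() throws IOException {
            Properties props = new Properties();
            try (InputStream in = CarDbFactory.class.getResourceAsStream("/config.properties")) {
                props.load(in);
            }
            switch (props.getProperty("platform", "")) {
                case "bmw":   return new Bmw();
                case "ford":  return new Ford();
                case "chevy": return new Chevy();
                default: throw new IllegalStateException("Unknown platform in config.properties");
            }
        }
    }

The rest of the program then works against the single CarDB reference obtained at startup, so no further if statements or switches are needed.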
First of all, you did not mention any reason why you are not considering one of the existing ORM frameworks like Hibernate, which are meant specifically for this job. In a nutshell, an ORM allows you to switch between different databases easily. But if you have a strong reason not to use an ORM framework, then you can consider the approach below.
Basically, you need to define and use a DataBaseConfigFactory and set the appropriate DBConfiguration during the startup of your application, as shown below:
DataBaseConfigFactory interface:
public interface DataBaseConfigFactory {
    Connection getConnection();
    void executeQuery();
}
MyProjectDataBaseConfigFactory class:
public class MyProjectDataBaseConfigFactory implements DataBaseConfigFactory {

    private static final DBConfiguration dbConfiguration;

    static {
        // Get the active db name from the props file
        // Set dbConfiguration to BmwDBConfiguration, FordDBConfiguration, etc.
    }

    public Connection getConnection() {
        return dbConfiguration.getConnection();
    }

    public void executeQuery() {
        dbConfiguration.executeQuery();
    }
}
Now define a DBConfiguration interface and all the specific implementations for the operations that your bmw, ford, etc. databases support.
DBConfiguration interface:
public interface DBConfiguration {
    // Add all methods that can be supported by a DBConfiguration
}

public class BmwDBConfiguration implements DBConfiguration {
    // BMW-specific implementations for DBConfiguration
}

public class FordDBConfiguration implements DBConfiguration {
    // Ford-specific implementations for DBConfiguration
}
In short, you will be using the DataBaseConfigFactory interface throughout your application to connect to databases, and if a new database is added you only need to set the DBConfiguration appropriately.
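For completeness, a rough sketch of how the static block above might pick the configuration at startup; the property key "active.db" and the classpath location of config.properties are assumptions:

    import java.io.IOException;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.util.Properties;

    public class MyProjectDataBaseConfigFactory implements DataBaseConfigFactory {

        private static final DBConfiguration dbConfiguration;

        static {
            Properties props = new Properties();
            try (InputStream in = MyProjectDataBaseConfigFactory.class
                    .getResourceAsStream("/config.properties")) {
                props.load(in);
            } catch (IOException e) {
                throw new ExceptionInInitializerError(e);
            }
            // "active.db" is a hypothetical property key holding bmw, ford, etc.
            String active = props.getProperty("active.db", "");
            if ("bmw".equals(active)) {
                dbConfiguration = new BmwDBConfiguration();
            } else if ("ford".equals(active)) {
                dbConfiguration = new FordDBConfiguration();
            } else {
                throw new IllegalStateException("Unknown database platform: " + active);
            }
        }

        // Assumes DBConfiguration declares getConnection() and executeQuery()
        public Connection getConnection() {
            return dbConfiguration.getConnection();
        }

        public void executeQuery() {
            dbConfiguration.executeQuery();
        }
    }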
I have the following code:
public interface DummyInterface {
}
and
@Singleton
@Creatable
public class DummyInterfaceImpl1 implements DummyInterface {
}
And when I want to, I can simply inject this and it works just fine (see below):
@Inject
DummyInterfaceImpl1
However, I can't do:
@Inject
DummyInterface
Because I get an error:
Unable to process "ClassWhereIInject.dummyInterface": no actual value was found for the argument "DummyInterface".
So, I am trying to understand: if I use the combination of @Creatable and @Singleton, without adding the instance that I want to inject to the IEclipseContext, can I only inject implementation classes and not interfaces?
I can see how this can get problematic, especially when one has multiple implementation classes for the same interface and the dependency injection framework doesn't know which one to inject... that is, if you don't use the @Named annotation to specify one...
The injection system only looks for something with the name you specify. It does not try to find a class that happens to implement that interface. So no, you can't use an @Creatable class with a different name than the interface.
An alternative is to use a ContextFunction. This is a function which is called when the injection system is looking for a name. The context function can create an instance of something suitable and put it in the context for the injector. Full details on context functions are here
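For illustration, a rough sketch of such a context function, assuming a recent e4 version of the ContextFunction API; the class name is made up. Such a function is typically registered as an OSGi service whose service.context.key property is the fully qualified name of DummyInterface:

    import org.eclipse.e4.core.contexts.ContextFunction;
    import org.eclipse.e4.core.contexts.ContextInjectionFactory;
    import org.eclipse.e4.core.contexts.IEclipseContext;

    // Hypothetical context function that supplies a DummyInterface on demand.
    public class DummyInterfaceFunction extends ContextFunction {

        @Override
        public Object compute(IEclipseContext context, String contextKey) {
            // Create the implementation with injection support and publish it
            // under the interface name so later lookups reuse the same instance.
            DummyInterface impl = ContextInjectionFactory.make(DummyInterfaceImpl1.class, context);
            context.set(DummyInterface.class, impl);
            return impl;
        }
    }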
I have quite a few JpaRepository-extending repository interfaces due to the design of the database.
In order to construct a simple object, i.e. Person, I have to make method calls to about 4-5 repositories just because the data is spread like that throughout the database. Something like this (pardon the pseudocode):
@Service
public class PersonConstructService {

    private final Repository repository;
    private final RepositoryTwo repositoryTwo;
    private final RepositoryThree repositoryThree;

    public PersonConstructService(Repository repository,
                                  RepositoryTwo repositoryTwo,
                                  RepositoryThree repositoryThree) {
        this.repository = repository;
        this.repositoryTwo = repositoryTwo;
        this.repositoryThree = repositoryThree;
    }

    public Person constructPerson() {
        person
            .add(getDataFromRepositoryOne())
            .add(getDataFromRepositoryTwo())
            .add(getDataFromRepositoryThree());
        return person;
    }

    private SomeDataTypeReturnedOne getDataFromRepositoryOne() {
        return repository.doSomething();
    }

    private SomeDataTypeReturnedTwo getDataFromRepositoryTwo() {
        return repositoryTwo.doSomething();
    }

    private SomeDataTypeReturnedThree getDataFromRepositoryThree() {
        return repositoryThree.doSomething();
    }
}
The PersonConstructService class uses all these interfaces just to construct a simple Person object. I call these repositories from different methods inside the PersonConstructService class. I have thought about splitting this class into multiple classes, but I do not think that is correct.
Instead I would like to use a repositoryService which would include all the repositories necessary for the creation of a Person object. Is that a good approach? Is it possible in Spring?
The reason I am asking is that sometimes the count of injected services in a class reaches 7-8, which is definitely not good.
I do not think you can / should create a meta-repository-like abstraction. Repositories have a well-defined meaning: conceptually, they are CRUD services (and a bit more sometimes :-)) for your Hibernate/JPA/Datastore entities. And I guess this is enough for them. Anything more is confusing.
Now what I would propose is a "smart" way of building your Person objects that is automa(g)tically aware of any new services that contribute to the meaning of the Person object.
The crux of it would be that:
you could have your repositories implement a given interface, say PersonDataProvider, which would have a method, say public PersonPart contributeDataToPersonBuilder(PersonBuilder).
You would make your @Service implement Spring's BeanFactoryPostProcessor interface, allowing you to inspect the container for all such PersonDataProvider instances and inject them into your service (see the accepted answer at How to collect and inject all beans of a given type in Spring XML configuration).
Your @Service implementation would then ask each of the PersonDataProviders in turn to contribute its data.
I could expand a bit, but this seems to me like the way to go.
One could argue that this is not clean (it makes your repositories aware of "something" that happens at the service layer, and they should not have to be), and one could work around that, but it's simpler to expose the gist of the solution this way.
EDIT: since this post was first written, I became aware that Spring can auto-detect and inject all beans of a certain type, without the need for post-processors. See the accepted answer here: Autowire reference beans into list by type
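For illustration, a minimal sketch of that inject-all-beans-by-type idea, using the hypothetical PersonDataProvider, PersonBuilder and Person types described above (builder.build() is assumed):

    import java.util.List;
    import org.springframework.stereotype.Service;

    @Service
    public class PersonConstructService {

        private final List<PersonDataProvider> providers;

        // Spring injects every bean that implements PersonDataProvider.
        public PersonConstructService(List<PersonDataProvider> providers) {
            this.providers = providers;
        }

        public Person constructPerson() {
            PersonBuilder builder = new PersonBuilder();
            // Each provider mutates the builder; its return value is ignored here.
            providers.forEach(provider -> provider.contributeDataToPersonBuilder(builder));
            return builder.build();
        }
    }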
I see it as quite a reasonable and practical data aggregation at the service layer.
It's perfectly achievable in Spring. If you have access to the repositories' code, you can name them all, like:
#Repository("repoOne")
public class RepositoryOne {
#Repository("repoTwo")
public class RepositoryTwo {
And inject them into the aggregation service as necessary:
@Service
public class MultipleRepoService {

    @Autowired
    @Qualifier("repoOne")
    private RepositoryOne repositoryOne;

    @Autowired
    @Qualifier("repoTwo")
    private RepositoryTwo repositoryTwo;

    public void doMultipleBusiness() {
        repositoryOne.one();
        repositoryTwo.two();
    }
}
In fact, you don't even need to name and qualify them if they are different classes, but you do if they are in a hierarchy or share the same interface.
Also, you can inject them directly into the constructing method if field autowiring is not an option:
public void construct(@Qualifier("repoOne") RepositoryOne repoOne,
                      @Qualifier("repoTwo") RepositoryTwo repoTwo) {
    repoOne.one();
    repoTwo.two();
}
From Effective Java (Item 1: Consider static factory methods instead of constructors):
The class of the object returned by a static factory method need not even exist at the time the class containing the method is written. Such flexible static factory methods form the basis of service provider frameworks, such as the Java Database Connectivity API (JDBC). A service provider framework is a system in which multiple service providers implement a service, and the system makes the implementations available to its clients, decoupling them from the implementations.
I specifically do not understand why the book says that "The class of the object returned by a static factory method need not even exist at the time the class containing the method is written". Can someone explain, using JDBC as the example?
Consider something like the following:
public interface MyService {
void doSomething();
}
public class MyServiceFactory {
    public static MyService getService() {
        try {
            return (MyService) Class.forName(System.getProperty("MyServiceImplementation")).newInstance();
        } catch (Throwable t) {
            throw new Error(t);
        }
    }
}
With this code, your library doesn't need to know about the implementations of the service. Users of your library would have to set a system property containing the name of the implementation they want to use.
This is what is meant by the sentence you don't understand: the factory method will return an instance of some class (whose name is stored in the system property "MyServiceImplementation"), but it has absolutely no idea which class it is. All it knows is that it implements MyService and that it must have a public no-arg constructor (otherwise, the factory above will throw an Error).
the system makes the implementations available to its clients, decoupling them from the implementations
To put it more simply: you don't add any dependencies on these JDBC vendors at compile time. Clients can add their own at runtime.
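To make the JDBC case concrete: DriverManager.getConnection is exactly such a static factory. The Connection it returns is implemented by whatever driver happens to be on the classpath at runtime, a class that java.sql knew nothing about when it was written. The URL and driver below are only an example:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class JdbcFactoryExample {
        public static void main(String[] args) throws SQLException {
            // java.sql ships no database implementations; the registered driver
            // (PostgreSQL here, purely as an example) supplies the concrete class.
            Connection connection = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");

            // Prints the driver's implementation class, e.g. org.postgresql.jdbc.PgConnection
            System.out.println(connection.getClass().getName());
            connection.close();
        }
    }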
I would like to create a Spring Data JPA repository with custom behavior, and implement that custom behavior using Specifications. I have gone through the Spring Data JPA documentation on implementing custom behavior in a single repository to set this up, but there is no example of using a Spring Data Specification from within a custom repository. How would one do this, if it is even possible?
I do not see a way to inject something into the custom implementation that takes a Specification. I thought I would be tricky and inject the CRUD repository portion of the repository into the custom portion, but that results in a circular instantiation dependency.
I am not using QueryDSL. Thanks.
I guess the primary source of inspiration could be how SimpleJpaRepository handles specifications. The key spots to have a look at are:
SimpleJpaRepository.getQuery(…): it basically creates a CriteriaQuery and bootstraps a select using a JPA Root. Whether the latter applies to your use case is up to you; I think the former definitely will.
SimpleJpaRepository.applySpecificationToCriteria(…): it basically takes the artifacts produced in getQuery(…) (i.e. the Root and the CriteriaQuery) and applies the given Specification to exactly these artifacts.
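Not an official recipe, but a rough sketch of a custom repository fragment that mirrors those two spots; the PersonRepositoryCustom/PersonRepositoryImpl names and the Person entity are assumptions for illustration:

    import java.util.List;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;
    import javax.persistence.criteria.CriteriaBuilder;
    import javax.persistence.criteria.CriteriaQuery;
    import javax.persistence.criteria.Predicate;
    import javax.persistence.criteria.Root;
    import org.springframework.data.jpa.domain.Specification;

    interface PersonRepositoryCustom {
        List<Person> findBySpecification(Specification<Person> spec);
    }

    class PersonRepositoryImpl implements PersonRepositoryCustom {

        @PersistenceContext
        private EntityManager entityManager;

        @Override
        public List<Person> findBySpecification(Specification<Person> spec) {
            // Build the CriteriaQuery and Root, as SimpleJpaRepository.getQuery(...) does
            CriteriaBuilder builder = entityManager.getCriteriaBuilder();
            CriteriaQuery<Person> query = builder.createQuery(Person.class);
            Root<Person> root = query.from(Person.class);
            query.select(root);

            // Apply the Specification to those artifacts, mirroring
            // SimpleJpaRepository.applySpecificationToCriteria(...)
            Predicate predicate = spec.toPredicate(root, query, builder);
            if (predicate != null) {
                query.where(predicate);
            }
            return entityManager.createQuery(query).getResultList();
        }
    }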
This is not using Specification, so I'm not sure if it's relevant to you, but one way I was able to inject custom behavior is as follows.
Basic structure:
i. Create a generic interface for the set of entity classes which are modeled after a generic parent entity. Note, this is optional; in my case I had a need for this hierarchy, but it's not necessary.
public interface GenericRepository<T> {
    // add any common methods for your entity hierarchy objects,
    // so that you don't have to repeat them in each of the child entities,
    // since you will be extending from this interface
}
ii. Extend a specific repository from the generic one (step i) and from JpaRepository:
public interface MySpecificEntityRepository
        extends GenericRepository<MySpecificEntity>, JpaRepository<MySpecificEntity, Long> {
    // add all methods based on column names, entity graphs or JPQL that you would like
    // to have here, in addition to what's offered by JpaRepository
}
iii. Use the above repository in your service implementation class.
Now, the generic service interface may look like this:
public interface GenericService<T extends GenericEntity, ID extends Serializable> {
    // add the specific methods you want to expose to users
}
The generic implementation class can be as follows:
public abstract class GenericServiceImpl<T extends GenericEntity, J extends JpaRepository<T, Long> & GenericRepository<T>>
        implements GenericService<T, Long> {

    // constructor takes in the specific repository
    public GenericServiceImpl(J genericRepository) {
        // save this to a local field
    }

    // the specific methods are implemented using the above repository
}
A specific implementation class can be:
public class MySpecificEntityServiceImpl
        extends GenericServiceImpl<MySpecificEntity, MySpecificEntityRepository>
        implements MySpecificEntityService {

    // the specific repository is autowired
    @Autowired
    public MySpecificEntityServiceImpl(MySpecificEntityRepository genericRepository) {
        super(genericRepository);
        this.genericRepository = genericRepository;
    }
}