We've written an in-house service discovery (SD) client based on the spring-cloud-commons SPI, meaning it provides implementations of the ServiceRegistry and DiscoveryClient interfaces and some other Spring-provided abstractions.
Apps that use our library simply add it to their pom file, and it autowires the DiscoveryClient with its own implementation, InHouseDiscoveryClient:
<dependency>
    <groupId>blah.blah</groupId>
    <artifactId>inhouse-service-discovery-client</artifactId>
</dependency>
However, rather than referring to InHouseDiscoveryClient in code, it is best practice to use the DiscoveryClient interface, as shown below:
// Good
@Autowired
DiscoveryClient client;

// Bad (binds the app to a particular SD implementation)
@Autowired
InHouseDiscoveryClient client;
As such, we are required to add spring-cloud-commons to the project.
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-commons</artifactId>
</dependency>
This is where the issue starts. The commons library actually auto-configures two additional implementations of DiscoveryClient: SimpleDiscoveryClient and CompositeDiscoveryClient.
This makes for an odd user experience for our clients. Instead of simply having InHouseDiscoveryClient, users find themselves with these additional beans as well.
Is it possible to prevent spring-cloud-commons's DiscoveryClient implementations from autowiring? And if so, can this be done in our library rather than in the end-user's applications?
I ended up implementing AutoConfigurationImportFilter in my library to remove the auto-configured beans from cloud-commons. I also removed its health indicator, but we had a very particular reason to do so; most users would probably rather keep it.
package my.package;

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import org.springframework.boot.autoconfigure.AutoConfigurationImportFilter;
import org.springframework.boot.autoconfigure.AutoConfigurationMetadata;

public class MyExclusionFilter implements AutoConfigurationImportFilter {

    private static final Set<String> SHOULD_SKIP = new HashSet<>(
        Arrays.asList(
            // DiscoveryClient beans
            "org.springframework.cloud.client.discovery.composite.CompositeDiscoveryClientAutoConfiguration",
            "org.springframework.cloud.client.discovery.simple.SimpleDiscoveryClientAutoConfiguration",
            // Health indicators
            "org.springframework.cloud.client.CommonsClientAutoConfiguration"));

    /**
     * For each auto-configuration class name, return a boolean indicating whether to include it.
     */
    @Override
    public boolean[] match(String[] classNames, AutoConfigurationMetadata metadata) {
        boolean[] matches = new boolean[classNames.length];
        for (int i = 0; i < classNames.length; i++) {
            matches[i] = !SHOULD_SKIP.contains(classNames[i]);
        }
        return matches;
    }
}
I then add a reference to this filter in my library's spring.factories file:
org.springframework.boot.autoconfigure.AutoConfigurationImportFilter=my.package.MyExclusionFilter
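As an aside, if library-level filtering ever proves too invasive, a consuming application can get the same effect itself with Spring Boot's standard spring.autoconfigure.exclude property; a minimal sketch (class names taken from the filter above):

```properties
# application.properties of the end-user app (alternative to the library-level filter)
spring.autoconfigure.exclude=\
  org.springframework.cloud.client.discovery.simple.SimpleDiscoveryClientAutoConfiguration,\
  org.springframework.cloud.client.discovery.composite.CompositeDiscoveryClientAutoConfiguration
```

The trade-off is that every consuming app has to carry this configuration, whereas the filter ships once with the library.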
We are documenting our JAX-RS API with Swagger. Using the swagger-jaxrs2 package, we can create and build our API documentation very well.
The only thing we want to change is the default "openapi" URL.
By registering Swagger's OpenApiResource class, our application always produces the default "[host]/openapi" endpoint.
We are able to create our own endpoint which serves the OpenAPI spec, but we cannot disable this default endpoint.
Every hint is welcome! Thank you in advance.
We solved it with a workaround: modifying javax.ws.rs.core.Application to load just the endpoints we provide ourselves, ignoring any other third-party endpoint like the swagger-jaxrs2 openapi or openapi.{type:json|yaml}.
@ApplicationPath("")
public class OurApplication extends javax.ws.rs.core.Application {

    @Override
    public Set<Class<?>> getClasses() {
        // Detect only classes in your own package; provided third-party packages
        // (like io.swagger.v3.jaxrs2.integration.resources) won't be registered
        Reflections ourClasses = new Reflections("our.package.naming");
        // Scan your classes for the @javax.ws.rs.Path annotation; we just need
        // to collect API endpoints
        Set<Class<?>> ourEndpoints = ourClasses.getTypesAnnotatedWith(Path.class);
        // FYI - log the registered classes / endpoints
        System.out.println("Providing " + ourEndpoints);
        // return the endpoints to register them in your application
        return ourEndpoints;
    }
}
HINT: Due to the @ApplicationPath annotation there is no need to modify web.xml.
The Reflections library we used is available from Maven:
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.12</version>
</dependency>
See https://github.com/ronmamo/reflections for more information about the Reflections package.
We didn't find a better solution, but this is what worked for us. Enjoy!
I'm working on a Spring Boot REST API, and I ended up using the new keyword here and there.
I'm wondering: did I do something wrong when I used the new keyword in my program? And is it absolutely forbidden to use the new keyword in a real project?
If the answer is yes, should I annotate each class I wrote with the @Component annotation so I can instantiate objects using @Autowired?
If the answer is no, when can we break that rule?
You can create objects using the new keyword in a Spring application.
But these objects are outside the scope of the Spring application context and hence are not Spring-managed.
Since they are not Spring-managed, any nested dependencies (such as your service class holding a reference to your repository class) will not be resolved.
So if you try to invoke a method on such a service object, you might end up getting a NullPointerException from the unresolved repository.
@Service
public class GreetingService {

    @Autowired
    private GreetingRepository greetingRepository;

    public String greet(String userid) {
        return greetingRepository.greet(userid);
    }
}

@RestController
public class GreetingController {

    @Autowired
    private GreetingService greetingService;

    @RequestMapping("/greeting")
    public String greeting(@RequestParam(value = "name", defaultValue = "World") String name) {
        return String.format("Hello %s", greetingService.greet(name));
    }

    @RequestMapping("/greeting2")
    public String greeting2(@RequestParam(value = "name", defaultValue = "World") String name) {
        GreetingService newGreetingService = new GreetingService();
        return String.format("Hello %s", newGreetingService.greet(name));
    }
}
In the above example, /greeting will work but /greeting2 will fail, because the nested dependencies of the manually created service are not resolved.
So if you want your objects to be Spring-managed, you have to autowire them.
Generally speaking, you will use the new keyword for view-layer POJOs and custom bean configurations.
There is no rule for using or not using new.
It's up to you if you want Spring to manage your objects or want to take care of them on your own.
Spring eases object creation, dependency management, and autowiring; however, you can instantiate objects with new if you don't want that.
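To make that concrete, here is a tiny self-contained sketch (all names invented for illustration): a view-layer POJO carries no Spring annotations, so plain new is the natural way to create it, with no container involved.

```java
// Hypothetical view-layer POJO: not a Spring bean, so `new` is the natural choice.
public class GreetingView {

    private final String message;

    public GreetingView(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }

    public static void main(String[] args) {
        // No container involved; we construct and use the object directly.
        GreetingView view = new GreetingView("Hello World");
        System.out.println(view.getMessage());
    }
}
```

Nothing here needs dependency resolution, which is exactly why Spring management adds no value for such objects.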
I think it's fine to use the new keyword, but you should learn the difference between the different stereotypes (Controller, Service, Repository).
You can follow this question to get some clarity:
What's the difference between @Component, @Repository & @Service annotations in Spring?
Using the appropriate annotation will allow you to correctly use DI (dependency injection), which helps in writing sliced tests for your Spring Boot application. Also, Service, Controller and Repository components are created as singletons, so there is less GC overhead. Moreover, components that you create using the new keyword are not managed by Spring, and by default Spring will never inject dependencies into an object created using new.
Spring official documentation:
https://docs.spring.io/spring-boot/docs/current/reference/html/using-boot-spring-beans-and-dependency-injection.html
You will need new in Spring mock tests, where you have to create the object under test as a service and inject a mock object as its DAO.
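A minimal sketch of that test situation (all names here are hypothetical, and the stub is hand-rolled rather than a Mockito mock): the service under test is created with new, and a fake DAO is injected through the constructor.

```java
// Hypothetical DAO contract, for illustration only.
interface GreetingDao {
    String findGreeting(String userid);
}

// Hypothetical service that depends on the DAO via constructor injection.
class GreetingService {

    private final GreetingDao dao;

    GreetingService(GreetingDao dao) {
        this.dao = dao;
    }

    String greet(String userid) {
        return dao.findGreeting(userid);
    }
}

public class GreetingServiceTest {

    public static void main(String[] args) {
        // Hand-rolled stub standing in for the real DAO.
        GreetingDao stubDao = new GreetingDao() {
            @Override
            public String findGreeting(String userid) {
                return "Hello " + userid + " (stubbed)";
            }
        };
        // `new` is exactly what we need here: no Spring context in the test.
        GreetingService service = new GreetingService(stubDao);
        System.out.println(service.greet("World"));
    }
}
```

Constructor injection is what makes this work: the same class is wired by Spring in production and by plain new in tests.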
Look at the following code: based on a condition, it's necessary to dynamically load advertisements on demand. Here you cannot @Autowire this group of items, because all the information is loaded from the DB or an external system, so you just need to fill your model accordingly.
if (customer.getType() == CustomerType.INTERNET) {
    List<Advertisement> adList = new ArrayList<Advertisement>();
    for (Product product : internetProductList) {
        Advertisement advertisement = new Advertisement();
        advertisement.setProduct(product);
        adList.add(advertisement);
    }
}
Note that it's appropriate to use Spring for managing external dependencies, like plugging a JDBC connection into a DAO, or for configuration, like specifying which database type to use.
I'm trying to figure out the options that I have for the architecture of my API project.
I would like to create an API using JAX-RS version 1.0. This API consumes Remote EJBs (EJB 3.0) from a bigger, old and complex application. I'm using Java 6.
So far, I can do this and works. But I'm not satisfied with the solution. See my packages disposition. My concerns are described after the code:
/api/
/com.organization.api.v1.rs -> REST services with the JAX-RS annotations
/com.organization.api.v1.services -> Service classes used by the REST services. Basically, they only contain the logic to transform the DTO objects from the remote EJBs into JSON. This is separated by API version, because the JSON can differ in each version.
/com.organization.api.v1.vo -> View objects returned by the REST services. They are transformed into JSON using Gson.
/com.organization.api.services -> Service classes used by the versioned services. Here we have the lookup of the remote EJBs and some API logic, like validations. These services can be used by any version of each service.
Example of the com.organization.api.v1.rs.UserV1RS:
@Path("/v1/user/")
public class UserV1RS {

    @GET
    public UserV1VO getUsername() {
        UserV1VO userVO = ServiceLocator.get(UserV1Service.class).getUsername();
        return userVO;
    }
}
Example of the com.organization.api.v1.services.UserV1Service:
public class UserV1Service extends UserService {

    public UserV1VO getUsername() {
        UserDTO userDTO = getUserName(); // method from UserService
        return new UserV1VO(userDTO.getName());
    }
}
Example of the com.organization.api.services.UserService:
public class UserService {

    public UserDTO getUserName() {
        UserDTO userDTO = RemoteEJBLocator.lookup(UserRemote.JNDI_REMOTE_NAME).getUser();
        return userDTO;
    }
}
Some requirements of my project:
The API has versions: v1, v2, etc.
Different API versions of the same versioned service can share code: UserV1Service and UserV2Service both using UserService.
Different API versions of different versioned services can share code: UserV1Service and OrderV2Service both using AnotherService.
Each version has its own view object (UserV1VO, not UserVO).
What botters me about the code above:
This ServiceLocator class is not a good approach for me. It uses legacy code from an old library, and I have a lot of questions about how it works. The way the ServiceLocator is used is very strange to me too, and this strategy makes it hard to mock the services in my unit tests. I would like to create a new ServiceLocator or use some dependency injection strategy (or another, better approach).
The UserService class is not intended to be used by another "external" service, like OrderService. It's only for the UserVxServices. But in the future, maybe OrderService would like to use some code from UserService...
Even if I ignore the last problem, using the ServiceLocator I will need to do a lot of lookups throughout my code. The chance of creating a cyclic dependency (serviceOne looks up serviceTwo, which looks up serviceThree, which looks up serviceOne) is very high.
In this approach, the VOs, like UserV1VO, could be used in my unversioned services (com.organization.api.services), but this must not happen; a good architecture doesn't permit what isn't intended. I have the idea of creating a new project, like api-services, and putting com.organization.api.services there to avoid this. Is this a good solution?
So... ideas?
A couple of things that I see:
The UserService should ideally be based on an interface. The implementations seem to have a similar contract, but the only difference is their source (remote EJB, local service locator). These should be returning DTOs.
UserV1Service extends UserService should not use inheritance but should instead favour composition. Think about what you'd need to do for v2 of the same service: based on your example, you'd get UserV2Service extends UserService. This is not ideal, especially if you end up with abstract methods in your base class that are specific to one version; then all of a sudden the other versioned services need to cater for this.
For the ServiceLocator:
You're better off using a dependency injection framework like Spring, or perhaps CDI in your case. This would only apply to your own code if your project is new.
For the parts that are hard to unit test, you'd wrap the RemoteEJB calls in their own interface, which makes them easier to mock out. The tests for the remote EJBs would then become integration tests for this project.
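A sketch of that wrapping idea, using hypothetical names (the gateway interface and the stub below are invented for illustration; the real implementation would perform the RemoteEJBLocator lookup):

```java
// Hypothetical gateway interface hiding the remote EJB lookup.
interface UserGateway {
    String fetchUserName();
}

// A production implementation would wrap the remote call, roughly:
//   return RemoteEJBLocator.lookup(UserRemote.JNDI_REMOTE_NAME).getUser().getName();

// The service now depends only on the interface, not on the EJB machinery.
class UserService {

    private final UserGateway gateway;

    UserService(UserGateway gateway) {
        this.gateway = gateway;
    }

    String getUserName() {
        return gateway.fetchUserName();
    }
}

public class UserServiceTest {

    public static void main(String[] args) {
        // Unit test: a stub (Java 6-friendly anonymous class) replaces the remote EJB.
        UserService service = new UserService(new UserGateway() {
            public String fetchUserName() {
                return "alice";
            }
        });
        System.out.println(service.getUserName());
    }
}
```

The live JNDI path is then exercised separately by integration tests, while unit tests never touch the remote system.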
The UserService class is not intended to be used by another "external" service, like OrderService. It's only for the UserVxService. But in the future, maybe OrderService would like to use some code from UserService
There is nothing wrong with Services on the same layer to talk to each other.
In this approach, the VOs, like UserV1VO, could be used in my
unversioned services (com.organization.api.services), but this cannot
happen. A good architecture don't allow something that is not allowed.
I have the idea to create a new project, like api-services and put the
com.organization.api.services there to avoid this. Is this a good
solution?
Just because you "could" do something doesn't mean that you should. While it might seem like a good idea to separate the layer into its own project, in reality nothing stops a developer from either recreating the same class in that project or including the jar on the classpath and using the same class. I'm not saying that splitting it is wrong, just that it should be split for the right reasons instead of "what if" scenarios.
I ended up with this solution (thanks @Shiraaz.M):
I removed all extends from my services and deleted the dangerous ServiceLocator class. Inheritance without a good purpose and service locators are both bad ideas. I tried to use Guice to inject the dependencies into my REST resources, but that's not so easy to do in my JAX-RS version. My services are very simple and easy to create, though, so my solution was simple:
@Path("/v1/user/")
public class UserV1RS {

    private UserV1Service userV1Service;

    public UserV1RS() {
        this.userV1Service = new UserV1Service();
    }

    // only for tests!
    public UserV1RS(UserV1Service userV1Service) {
        this.userV1Service = userV1Service;
    }

    @GET
    public UserV1VO getUsername() {
        UserV1VO userVO = this.userV1Service.getUsername();
        return userVO;
    }
}
And my UserV1Service:
public class UserV1Service {

    private UserService userService;

    public UserV1Service() {
        this.userService = new UserService();
    }

    // for tests
    public UserV1Service(UserService userService) {
        this.userService = userService;
    }

    public UserV1VO getUsername() {
        UserDTO userDTO = userService.getUserName();
        return new UserV1VO(userDTO.getName());
    }
}
With this strategy, it's easy to use other services via composition.
If necessary, in the future I will introduce Guice to inject the dependencies into the REST resources and services (at least into the services), remove the default constructors from the services that have dependencies, and use the same constructor in both tests and production.
About item 4, I talked with the team and explained the organization. The team understands it well, and no one breaks this architecture.
According to the HK2 @Service javadoc:
Annotation placed on classes that are to be automatically added to an
hk2 ServiceLocator.
I don't know how to make ServiceLocator find annotated classes automatically.
TestService
@Contract
public interface TestService {
}
TestServiceImpl
@Service
public class TestServiceImpl implements TestService {
}
Main
public static void main(String[] args) {
    ServiceLocator locator = ServiceLocatorUtilities.createAndPopulateServiceLocator();
    TestService service = locator.getService(TestServiceImpl.class);
    System.out.println(service); // null
}
The result is always null. I have to add a Descriptor manually so the ServiceLocator can find it:
public static void main(String[] args) {
    ServiceLocator locator = ServiceLocatorUtilities.createAndPopulateServiceLocator();
    DynamicConfigurationService dcs = locator.getService(DynamicConfigurationService.class);
    DynamicConfiguration config = dcs.createDynamicConfiguration();
    config.bind(BuilderHelper.link(TestServiceImpl.class).to(TestService.class).in(Singleton.class).build());
    config.commit();
    TestService service = locator.getService(TestServiceImpl.class);
    System.out.println(service); // TestServiceImpl instance
}
How do I let the ServiceLocator find the annotated classes automatically? Did I misunderstand something?
You need to run the hk2-inhabitant-generator over your built classes in order to get automatic detection of services. There is more information here as well.
What that step does in the build process is to create a file named META-INF/hk2-locator/default with information about services. The createAndPopulateServiceLocator call then reads those files and automatically adds those service descriptors into the returned ServiceLocator.
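For illustration only (the exact layout may differ by HK2 version, and real projects use fully qualified class names), a generated META-INF/hk2-locator/default descriptor for the TestServiceImpl above would look roughly like:

```
class=TestServiceImpl
contract={TestService}
```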
FYI, I was so frustrated with the reliance on the inhabitant files, rather than having the capability for runtime scanning of annotated classes, that I wrote this project:
https://github.com/VA-CTT/HK2Utilities
Since Eclipse / Maven / the inhabitant runtime generators wouldn't play nicely together, it was nearly impossible to debug code that made use of HK2 in Eclipse without runtime scanning.
The HK2Utilities package is available in central:
<dependency>
<groupId>gov.va.oia</groupId>
<artifactId>HK2Utilities</artifactId>
<version>1.4.1</version>
</dependency>
To use it, you just call:
ServiceLocator locator = HK2RuntimeInitializer.init("myName", false, new String[]{"my.package.one", "my.package.two"});
This will scan the runtime classpath for classes in the packages listed, and automatically populate the service locator with them.
You never have to generate inhabitant files with this model, and in practice I found it to perform faster than the inhabitant-processing code as well (not that the performance matters much for this one-time operation).
---edit---
I still maintain this code - the current release is:
<dependency>
<groupId>net.sagebits</groupId>
<artifactId>HK2Utilities</artifactId>
<version>1.5.2</version>
</dependency>
And the project location is now:
https://github.com/darmbrust/HK2Utilities
As of HK2 2.6.1, all you need to do is add the dependencies javax.inject, hk2-utils, hk2-api and hk2-metadata-generator.
When you build the project, the javac compiler (via the annotation processor) will generate a 'default' file in META-INF containing the wiring, as follows:
class=[service-class-name]
contract={contract-class-name}
This will be read and registered by the ServiceLocator at runtime.
This should be sufficient. However, if that does not work, there are other options:
Maven plugin:
<plugin>
    <groupId>org.glassfish.hk2</groupId>
    <artifactId>hk2-inhabitant-generator</artifactId>
    <version>2.5.0-b36</version>
    <executions>
        <execution>
            <goals>
                <goal>generate-inhabitants</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Command-line tool:
java org.jvnet.hk2.generator.HabitatGenerator
[--file jarFileOrDirectory]
[--outjar jarFile]
[--locator locatorName]
[--verbose]
More on this https://javaee.github.io/hk2/inhabitant-generator.html
How does one inject a property value from a property placeholder into a CDI bean?
In Spring one writes:
@org.springframework.beans.factory.annotation.Value("${webservice.user}")
private String webserviceUser;
which sets the webserviceUser field to the property webservice.user from a property file / property placeholder.
How do I do that with CDI? I've tried to find an answer, but I couldn't find any equivalent. However, people write that you can use CDI as a Spring substitute on application servers, and this use case is very basic, so surely there must be an easy way; unfortunately I've failed to find it.
CDI is a specification about dependency injection and contexts, so it doesn't have such configuration features out of the box. But it provides a very powerful extension mechanism that allows third-party projects to add new portable features (i.e. features that work with every CDI implementation and are not tied to a particular server).
The most important project providing CDI extensions is Apache DeltaSpike and, good news, it provides what you need.
So you need to add deltaspike-core to your project. If you use Maven, add these dependencies to your pom.xml:
<dependency>
<groupId>org.apache.deltaspike.core</groupId>
<artifactId>deltaspike-core-api</artifactId>
<version>0.4</version>
</dependency>
<dependency>
<groupId>org.apache.deltaspike.core</groupId>
<artifactId>deltaspike-core-impl</artifactId>
<version>0.4</version>
</dependency>
After that, if you don't care about your properties file name, just add META-INF/apache-deltaspike.properties to your project and put your properties in it. If you need more than one file or want to choose the name, you'll have to implement the PropertyFileConfig interface for each file, like this:
public class MyCustomPropertyFileConfig implements PropertyFileConfig
{
    @Override
    public String getPropertyFileName()
    {
        return "myconfig.properties";
    }
}
After that you'll be able to inject values like this:
@ApplicationScoped
public class SomeRandomService
{
    @Inject
    @ConfigProperty(name = "endpoint.poll.interval")
    private Integer pollInterval;

    @Inject
    @ConfigProperty(name = "endpoint.poll.servername")
    private String pollUrl;

    ...
}
As you can see in this example taken from the DeltaSpike documentation, you can inject your value not only into String fields but also into Integer, Long, Float and Boolean fields. You can provide your own type if you need something more specific.
The DeltaSpike config documentation can be found here.
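For completeness, a matching META-INF/apache-deltaspike.properties for the snippet above might contain (values invented for illustration):

```properties
endpoint.poll.interval=5000
endpoint.poll.servername=http://example.org/service
```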