I am developing a web application using the Spring Framework and Google App Engine. I am wondering whether there is a design pattern or framework that would let me develop the features of my application as pluggable modules. For example, I have identified 4 features of the application:
OAuth Login
User Profile Management
User Group creation
User File management
Now what I need is to develop all these features as independent modules, so that I can detach any of them dynamically and keep them as loosely coupled as possible. Each module could have its own database implementation, its own set of technologies, etc. Is there a design principle for implementing modules in such a way?
You can take a look at MQ systems (such as RabbitMQ or ActiveMQ).
The MQ system works as an intermediate layer that gives you loose coupling: communication between modules is implemented by posting messages to a queue and listening for the messages other modules post.
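As a rough illustration, one module could publish an event through RabbitMQ's Java client along these lines (a sketch only; the queue name "user.created" and the payload are made up for the example):

import java.nio.charset.StandardCharsets;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class UserCreatedPublisher {

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");

        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {
            // declare a durable queue; the name "user.created" is just an example
            channel.queueDeclare("user.created", true, false, false, null);
            // the consuming module listens on the same queue with basicConsume
            // and never references this module's classes directly
            channel.basicPublish("", "user.created", null,
                    "{\"userId\": 42}".getBytes(StandardCharsets.UTF_8));
        }
    }
}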
Also, OSGi may help you. It gives you the possibility to build your application as a set of pluggable modules that can be loaded dynamically.
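For instance, a feature bundle could expose itself through a BundleActivator roughly like this (a minimal sketch; the activator name is hypothetical, and a real module would register its own service interface rather than Runnable):

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

public class ProfileModuleActivator implements BundleActivator {

    private ServiceRegistration<Runnable> registration;

    @Override
    public void start(BundleContext context) {
        // publish the module's entry point in the OSGi service registry so
        // other bundles can discover it without compile-time coupling
        registration = context.registerService(Runnable.class,
                () -> System.out.println("profile module started"), null);
    }

    @Override
    public void stop(BundleContext context) {
        // called when the bundle is detached at runtime
        registration.unregister();
    }
}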
As per my experience, I suggest using the MVC pattern. Use servlet filters for:
1. OAuth Login
Create services/POJOs that implement the features and inject them into one another according to your requirements for:
2. User Profile Management
3. User Group creation
4. User File management
If you know Spring AOP, use it, so that you can achieve a more dynamic integration between the implementations of points 2, 3, and 4.
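For example, a cross-cutting concern could be woven around the feature services with an aspect along these lines (a sketch; the package com.example.features and the aspect name are placeholders for your own):

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class FeatureIntegrationAspect {

    // matches every public method of the (hypothetical) feature service classes
    @Around("execution(public * com.example.features..*Service.*(..))")
    public Object around(ProceedingJoinPoint joinPoint) throws Throwable {
        // cross-cutting behavior before the feature call (auditing, security, ...)
        System.out.println("Calling " + joinPoint.getSignature());
        try {
            return joinPoint.proceed();
        } finally {
            // cross-cutting behavior after the feature call
            System.out.println("Finished " + joinPoint.getSignature());
        }
    }
}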
You should split each feature into two components: API and implementation. The first contains the interfaces, the second their implementations. You pass the interface to the web-app controller and inject the implementation via Spring or any other dependency injection framework. For example:
web-app: a UserController that handles requests from clients and delegates to your components
@RestController
public class UserController {

    private final FileManager fileManager;

    @Autowired
    public UserController(FileManager fileManager) {
        this.fileManager = fileManager;
    }

    @GetMapping("/user/{userId}/file/{fileId}")
    public File getUserFile(@PathVariable long userId, @PathVariable long fileId) {
        return fileManager.getUserFile(userId, fileId);
    }
}
file-mgt-api: where you define the interfaces that decouple the web-app from the implementation
public interface FileManager {
    File getUserFile(long userId, long fileId);
}
file-mgt-impl: where all the details of how to get the requested file live
@Component
public class FileManagerImpl implements FileManager {

    @Override
    public File getUserFile(long userId, long fileId) {
        // get the file by id from the DB
        // verify that the provided user is the file owner
        // do other useful stuff
        // return the file, or throw an exception if something is wrong
    }
}
Do the same for group and profile management and the other features. After that you can easily replace an implementation by replacing a single jar file. Your web-app is completely decoupled and doesn't know anything about implementation details; it only depends on interfaces.
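To make the swap concrete, here is a minimal sketch of how an alternative implementation could be selected with Spring profiles (the S3FileManager name and the object-store detail are hypothetical):

import java.io.File;
import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;

// alternative implementation shipped in its own jar; activating the
// "s3" Spring profile swaps it in without touching the web-app module
@Component
@Profile("s3")
public class S3FileManager implements FileManager {

    @Override
    public File getUserFile(long userId, long fileId) {
        // placeholder for a real object-store lookup
        return new File("/tmp/" + userId + "-" + fileId);
    }
}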
I've recently been working with microservices, developed as Spring Boot applications (v2.2), and in my company we're using Keycloak as the authorization server.
We chose it because we need complex policies, roles and groups, and we also need User-Managed Access (UMA) to share resources between users.
We configured Keycloak with a single realm and many clients (one client per microservice).
Now, I understand that I need to explicitly define Resources within Keycloak and this is fine, but the question is: do I really need to duplicate all of them in my microservice's property file?
All the documentation, examples and tutorials end up with the same thing, which is something like:
keycloak.policy-enforcer-config.enforcement-mode=PERMISSIVE
keycloak.policy-enforcer-config.paths[0].name=Car Resource
keycloak.policy-enforcer-config.paths[0].path=/cars/create
keycloak.policy-enforcer-config.paths[0].scopes[0]=car:create
keycloak.policy-enforcer-config.paths[1].path=/cars/{id}
keycloak.policy-enforcer-config.paths[1].methods[0].method=GET
keycloak.policy-enforcer-config.paths[1].methods[0].scopes[0]=car:view-detail
keycloak.policy-enforcer-config.paths[1].methods[1].method=DELETE
keycloak.policy-enforcer-config.paths[1].methods[1].scopes[0]=car:delete
(This second example fits our case better because it also uses different authorization scopes per HTTP method.)
In real life, each microservice we're developing has dozens of endpoints, and defining them one by one seems to me a waste of time and a source of fragility: if we change an endpoint, we need to reconfigure it in both Keycloak and the application properties.
Is there a way to use some kind of annotation at Controller level? Something like the following pseudo-code:
@RestController
@RequestMapping("/foo")
public class MyController {

    @GetMapping
    @KeycloakPolicy(scope = "foo:view")
    public ResponseEntity<String> foo() {
        ...
    }

    @PostMapping
    @KeycloakPolicy(scope = "bar:create")
    public ResponseEntity<String> bar() {
        ...
    }
}
In the end, I developed my own project that provides auto-configuration capabilities to a Spring Boot project that needs to work as a resource server.
The project is released under the MIT license and it's available on my GitHub:
keycloak-resource-autoconf
I'm still quite new to microservices and have a few basic architectural questions that I can't get solved right now.
I'm using the Quarkus framework with standard extensions like quarkus-resteasy and quarkus-rest-client for the implementation.
The scenario:
I have a "Persistence" service, in a dedicated Maven project, that I want to populate with data from the outside via a REST call.
@Path("/api/persistence")
@Produces(MediaType.APPLICATION_JSON)
public class Persistence {

    @Inject
    EntityManager entityManager;

    @POST
    @Transactional
    public Response create(PostDto postDto) {
        Post post = toPostMapper.toResource(postDto);
        entityManager.persist(post);
        return Response.ok(postDto).status(201).build();
    }
}
At the same time I would like to have a microservice DataGenerator which generates the corresponding data and passes it to the Persistence Service.
My problem: API sharing
Both services were created as Maven projects.
According to the tutorials I found, the correct way would be to declare an interface (here called PersistenceApi) in the DataGenerator project like this:
@Path("/api/persistence")
@Produces(MediaType.APPLICATION_JSON)
@RegisterRestClient
public interface PersistenceApi {

    @POST
    @Transactional
    Response create(PostDto post);
}
This interface is then injected into the DataGenerator service via @Inject, which leads to the following exemplary service.
@RequestScoped
@Path("/api/datagenerator")
@Produces("application/json")
@Consumes("application/json")
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;

    @POST
    public void getPostExamplePostToPersistence() {
        PostDto post = new PostDto();
        post.setTitle("Find me in db in persistence-service");
        persistenceApi.create(post);
    }
}
I have the PersistenceService running locally on port 8181 and have added the following entries to the application.properties of the DataGenerator project so that the service can be found.
furnace.collection.item.service.PersistenceApi/mp-rest/url=http://localhost:8181
furnace.collection.item.service.PersistenceApi/mp-rest/scope=javax.inject.Singleton
I find it "wrong" to declare the interface in my DataGenerator project, because this way I won't notice when the API provided by the Persistence service changes. Accordingly, one could come up with the idea of placing the interface in the Persistence service, where it is implemented by the concrete Persistence implementation, which leads to the following code.
@Path("/api/persistence")
@Produces(MediaType.APPLICATION_JSON)
@RegisterRestClient
public class PersistenceApiImpl implements PersistenceApi {

    @Inject
    EntityManager entityManager;

    @Override
    @POST
    @Transactional
    public Response create(PostDto postDto) {
        Post post = toPostMapper.toResource(postDto);
        entityManager.persist(post);
        return Response.ok(postDto).status(201).build();
    }
}
In order to use this interface in my DataGenerator project, I would have to include the Persistence project as a dependency of the DataGenerator project, which sounds like a "monolith with extra steps" to me and therefore feels wrong in terms of separation of concerns.
I have tried the following approach:
I created another Maven project called PersistenceApi which only contains the corresponding PersistenceApi interface. This PersistenceApi project was then included as a dependency in both the "Persistence" and "DataGenerator" projects. In the "Persistence" project I implement the service from the example above, and in the "DataGenerator" project I try to address the corresponding interface via @Inject.
Unfortunately this does not work. When I build the service, I get an UnsatisfiedResolutionException saying that the PersistenceApi dependency I want to inject into the DataGenerator service cannot be resolved.
Now my questions:
I don't see what I'm missing here. Could you help me?
Is this kind of API-sharing with dedicated Api projects a viable way or is the "monolith with extra steps" approach really the way to go?
Thank you in advance.
That's a common problem with microservices. As in the book "Microservices: Grundlagen flexibler Softwarearchitekturen" by Eberhard Wolff (I saw that you are German too), I follow the idea that microservices should have the same coupling as the teams developing them and as the organization you're developing them for (have a look at Conway's law). Therefore, services of mostly independent teams should be developed independently, and the API changes of one service should not affect another at the time of the update.
If you develop both services in your own team, then I think you can couple them the way you are doing it, because you don't have to work together with other teams and there will be no huge overhead. Note that you will be forced to release both services together. If that is always OK for you, then save your time and do it your way; if not, have a look at API versioning:
I use API versioning so the old API is still reachable under "v1/" and the new one under "v2/". This way the team behind the other microservice has enough time to update their service.
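A minimal sketch of that layout in JAX-RS, reusing the PostDto from the question (the resource class names and PostDtoV2 are hypothetical):

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

// old contract, kept reachable until all consumers have migrated
@Path("/v1/api/persistence")
public class PersistenceResourceV1 {

    @POST
    public Response create(PostDto post) {
        // handle the old payload shape
        return Response.status(201).build();
    }
}

// the new, breaking contract lives under its own prefix
@Path("/v2/api/persistence")
public class PersistenceResourceV2 {

    @POST
    public Response create(PostDtoV2 post) {
        // handle the new payload shape
        return Response.status(201).build();
    }
}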
Have a look at Domain-Driven Design for different ways of integrating bounded contexts (= services) and their coupling consequences. Without API versioning you are forced into a Partnership and you need to release together. Maybe you prefer Customer-Supplier or even Conformist.
To test compatibility between both services, have a look at consumer-driven contracts and Pact. You can also generate OpenAPI files and track their changes, but that will only help you notify people about changes.
I'm designing software with different components, that need to communicate with each other through a REST interface.
I've seen and implemented solutions where the different components of the software (deployed on the same cluster) communicate by declaring and injecting EJB beans. This way, it's really easy to standardize the communication by defining common interfaces and DTOs in a separate .jar dependency.
This level of comfort and standardization is what I'd like to achieve with RESTful services, between Java-based components of my software.
Imagine something like this:
Let's say, I have a REST Client (C) and a REST Server (S).
I'd like to be able to communicate between them, via a common interface, which is implemented in S and called by C. This common interface is in an API component (I).
It would mean that I would have an interface:
@RestController
@RequestMapping("/rest/user")
public interface UserController {

    @GetMapping("list")
    ResponseEntity<List<UsersModel>> getUserList(OAuth2LoginAuthenticationToken token);
}
In C it could be used like:
public class Sg {

    private final UserController userController;
    ...

    public void method(OAuth2LoginAuthenticationToken token) {
        ...
        userController.getUserList(token);
        ...
    }
}
Lastly, in S:
public class UserControllerImpl implements UserController {

    @Override
    public ResponseEntity<List<UsersModel>> getUserList(OAuth2LoginAuthenticationToken token) {
        ...
    }
}
The only configuration needed is to tell the client the context root (and host address) of the server, everything else is present in the common interface in the form of annotations.
Since not all components are necessarily Java-based, it is important for the REST resource to be callable in a typical REST-like way, so those Java remote service calling mechanics are out of consideration.
I was looking into JAX-RS, which seems promising but is missing a couple of features that would be nice. For example, there isn't a common interface telling the client at which endpoint on the server the REST resource can be found, nor the method names, etc. AFAIK, on the client you can only call the method representing the HTTP method of the request, which is a bummer.
Am I out of my mind with this spec? I'm not really experienced with REST services yet, so I don't know whether I'm asking for something that is out of scope for REST services. Is there an existing solution to the problem I face?
After more thorough research, I found that RESTEasy already has a solution for this.
You need to use the ProxyBuilder to create a proxy of your interface, and that's it.
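For example, a client could look roughly like this (a sketch assuming the shared interface carries JAX-RS annotations and RESTEasy is the JAX-RS client implementation on the classpath; the UserApi interface and the URL are made up):

import java.util.List;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import org.jboss.resteasy.client.jaxrs.ProxyBuilder;
import org.jboss.resteasy.client.jaxrs.ResteasyWebTarget;

public class UserClient {

    // shared interface from the API jar; the server implements it,
    // the client proxies it
    @Path("/rest/user")
    public interface UserApi {
        @GET
        @Path("/list")
        List<String> getUserList();
    }

    public static void main(String[] args) {
        Client client = ClientBuilder.newClient();
        // the cast works when RESTEasy provides the JAX-RS client
        ResteasyWebTarget target = (ResteasyWebTarget) client.target("http://localhost:8080");

        // RESTEasy generates an implementation of the interface; calling its
        // methods issues the corresponding HTTP requests
        UserApi users = ProxyBuilder.builder(UserApi.class, target).build();
        System.out.println(users.getUserList());

        client.close();
    }
}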
I am writing a Spring REST application, and the problem is that I am not sure when I should use a repository directly and when a service interface together with an implementation of it. Let's say I have a repository with a method findById. I created a service interface that has the same method, declared as Object findById(Long id);, and I wonder if I should create an implementation that looks like this:
public Object findById(Long id) {
    return repository.findById(id).orElseThrow(() -> new RuntimeException("message"));
}
But I could also do the same without this service class, as the repository also returns an Optional, so it could be done in the controller as well:
repository.findById(id).orElseThrow(() -> new RuntimeException("message"));
But it's hard to test repositories; it's better to create an implementation of the service and then test the service. Anyway, what's your opinion about it? Which one is better for you, and why?
I think it's all about your project architecture. One of the classic, simplest, and most popular architectures is the N-layer architecture, which is normally implemented with three main layers: Controllers, Services, and Repositories.
Controllers are responsible for getting requests from clients, updating the model (usually by calling Services), and returning a response to the clients.
Services are where your business logic is implemented, and where you should usually handle transaction management, security checks, and the like.
And finally, Repositories are where you interact with underlying systems, like the file system or a database, to save the state of your application.
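Put together, the three layers applied to the question could look roughly like this (a sketch; the Post entity and the exception choice are illustrative):

import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// hypothetical entity, just to make the sketch compile
@Entity
class Post {
    @Id
    Long id;
    String title;
}

// repository: persistence only, no business rules
interface PostRepository extends JpaRepository<Post, Long> {}

// service: business logic and the "not found" policy live here,
// so it can be unit-tested with a mocked repository
@Service
class PostService {
    private final PostRepository repository;

    PostService(PostRepository repository) {
        this.repository = repository;
    }

    Post findById(Long id) {
        return repository.findById(id)
                .orElseThrow(() -> new RuntimeException("message"));
    }
}

// controller: HTTP concerns only, delegates to the service
@RestController
class PostController {
    private final PostService service;

    PostController(PostService service) {
        this.service = service;
    }

    @GetMapping("/posts/{id}")
    Post get(@PathVariable Long id) {
        return service.findById(id);
    }
}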
I've been using Spring annotations such as @RestController and @RequestMapping to generate simple services in a Spring Boot web application.
So I have this trivial example working correctly:
@RestController
public class HelloController {

    @RequestMapping("/")
    public String sayIt() {
        return "Hello!";
    }
}
Now, I would like to separate out an API library (jar) with only the REST interface and the DTOs. One or more separate libraries would provide the actual implementations of this interface. I can then use the (lightweight) API library on the client-side to generate REST client proxies to talk to any of the implementations.
So... are there any annotations or configuration to mark REST interfaces vs. implementations separately? If not, what is the Spring-y way to achieve this instead of using JAX-RS annotations?
@Something1
public interface HelloServiceApi {

    @RequestMapping("/")
    String sayIt();
}

@Something2
public class HelloServiceImpl implements HelloServiceApi {

    public String sayIt() {
        return "Hello!";
    }
}
I would advise having a jar that contains the DTO objects only, without any logic. It can then be used by both the REST server and the client to transfer objects.
The client should not be dependent on the REST war/jar or the logic.
Furthermore, I would try to make sure my controller doesn't hold any logic aside from, maybe, transforming the DTOs into domain model objects, which are then passed to the business logic layer.
In my opinion, the REST layer should only be responsible for the external API, argument handling, delegating to the layer below (the service layer), and preparing a response.
That being said, you should have your different implementations at the service layer. This keeps the API/REST layer untouched and constant.
The service layer (which may come in different implementations) should respect a common interface that is then injected into the REST layer above it.
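As a rough sketch of that wiring, reusing the question's example (the GreetingService name is made up):

import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// common interface the REST layer depends on
interface GreetingService {
    String sayIt();
}

// one of possibly several implementations, packaged in its own jar
@Service
class DefaultGreetingService implements GreetingService {
    @Override
    public String sayIt() {
        return "Hello!";
    }
}

// the REST layer stays constant; only the injected implementation varies
@RestController
class HelloController {
    private final GreetingService greetingService;

    HelloController(GreetingService greetingService) {
        this.greetingService = greetingService;
    }

    @GetMapping("/")
    public String sayIt() {
        return greetingService.sayIt();
    }
}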
Have I responded to your question?