What practices should be used when writing a JPA repository?

I am writing a Spring REST application, and the problem is that I am not sure when I should use a repository directly and when I should use a service interface together with an implementation of it. Let's say I have a repository with a findById method. I created a service interface that declares the same method, Object findById(Long id); and I wonder whether I should create an implementation of it that looks like this:
public Object findById(Long id) {
    return repository.findById(id).orElseThrow(() -> new RuntimeException("message"));
}
but I could also do the same without this service class, since the repository already returns an Optional, so it could also be done in the controller:
repository.findById(id).orElseThrow(() -> new RuntimeException("message"));
But repositories are hard to test; it is better to create an implementation of the service and then test the service. Anyway, what's your opinion about it: which one is better for you, and why?

I think it all depends on your project architecture. One of the classic, simplest and most popular architectures is the N-layer architecture, which is normally implemented with three main layers: controllers, services and repositories.
Controllers are responsible for accepting requests from clients, updating the model (usually by calling services) and returning a response to clients.
Services are where your business logic is implemented, and usually where you handle transaction management, security checks and so on.
Finally, repositories are where you interact with underlying systems like the file system and the database to save the state of your application.
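To make this concrete for the findById example from the question, a minimal sketch of the three layers could look like this (the entity, the names and the plain RuntimeException are illustrative, following the question):

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Repository layer: Spring Data JPA generates the implementation.
public interface UserRepository extends JpaRepository<User, Long> {
}

// Service layer: business logic and error handling live here.
@Service
public class UserService {

    private final UserRepository repository;

    public UserService(UserRepository repository) {
        this.repository = repository;
    }

    public User findById(Long id) {
        // Translate the empty Optional into a domain-level error once,
        // instead of repeating it in every controller.
        return repository.findById(id)
                .orElseThrow(() -> new RuntimeException("User not found: " + id));
    }
}

// Controller layer: request handling only, delegating to the service.
@RestController
public class UserController {

    private final UserService userService;

    public UserController(UserService userService) {
        this.userService = userService;
    }

    @GetMapping("/users/{id}")
    public User getUser(@PathVariable Long id) {
        return userService.findById(id);
    }
}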

Related

Accessing the same MongoRepository collection from two different Spring Boot Applications

I am working on two different Spring Boot services that will need to access a common MongoRepository collection of Users.
For the sake of simplicity, we have SpringBootApp1:
User.java
@Document(collection = "user")
public class User {

    @Id
    private String id;
    ....
}
Then I get the repository as:
UserRepository.java
public interface UserRepository extends MongoRepository<User, String> {
}
Now I need another application, SpringBootApp2, that reads the Users. I was planning to do the same in the second service, but then I would have two versions of the User object, one defined in each service, both reading from the same MongoRepository collection. If the User class is modified in one service, the other will not know, and they will drift out of sync, on top of having repetition of code in both services.
What would be the best approach in this case?
Why do you need SpringBootApp2 to read the User collection again? You can expose an API in SpringBootApp1 that fetches the desired data when called from App2. You can keep all your database-related operations in one microservice or one app; if you need data from the database, just expose methods in App1 and call them from App2.
UPDATE:
As you said, App1 does READ-WRITE work on the collection and grants access only to authenticated users, while App2 does simple READ-ONLY work. In that case you can use a Spring Data repository interface in App2 with only the read-from-collection method declared. You don't need any setter methods on the entity, and no save/update methods on your interface. This design is harmless. You are just making calls to the same database from two different services; no harm in that. In any case, App2 always makes a fresh query against the collection.
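A minimal sketch of such a read-only repository in App2, assuming Spring Data (the interface name is illustrative):

import java.util.List;
import java.util.Optional;
import org.springframework.data.repository.Repository;

// Extending the Repository marker interface instead of MongoRepository
// exposes only the methods declared here, so no save/delete is available.
public interface ReadOnlyUserRepository extends Repository<User, String> {
    Optional<User> findById(String id);
    List<User> findAll();
}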
Otherwise, if you have concerns with this approach as well, and you don't want to keep multiple DB read logics in different services, then (I do not know what other functionality your App1 offers besides DB operations) I would suggest creating an App3 whose sole purpose is DB operations.
In that case:
App1 will do security-related operations + other services
Design App3 so that it does the DB manipulation work, without any direct user credentials or security
App1 will perform authentication and, on success, call App3 for any DB READ-WRITE work; App3 won't need any security
App2 will call App3 for any DB READ operations; App3 won't need any security

Looking for the best way to share an interface between microservices with Quarkus

I'm still quite new to microservices and have a few basic architectural questions that I can't get solved right now.
I'm using the Quarkus framework with the standard extensions like quarkus-resteasy and quarkus-rest-client.
The scenario:
I have an example of a "Persistence" service, in a dedicated Maven project, that I want to populate with data from outside via a REST call.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
public class Persistence{
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto postDto) {
Post post = toPostMapper.toResource(postDto);
entityManager.persist(post);
return Response.ok(postDto).status(201).build();
}
}
At the same time I would like to have a microservice DataGenerator which generates the corresponding data and passes it to the Persistence Service.
My problem: API sharing
Both services were created as Maven projects.
According to the tutorials I found, the correct way would be to declare an interface (here called PersistenceApi) in the DataGenerator project, like this:
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public interface PersistenceApi {
#POST
#Transactional
public Response create(PostDto post) ;
}
This interface is then injected into the DataGenerator service via @Inject, which leads to the following exemplary service:
@RequestScoped
@Path("/api/datagenerator")
@Produces("application/json")
@Consumes("application/json")
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;

    @POST
    public void getPostExamplePostToPersistence() {
        PostDto post = new PostDto();
        post.setTitle("Find me in db in persistence-service");
        persistenceApi.create(post);
    }
}
I have the Persistence service running locally on port 8181 and have added the following entries in the application.properties of the DataGenerator project so that the service can be found:
furnace.collection.item.service.PersistenceApi/mp-rest/url=http://localhost:8181
furnace.collection.item.service.PersistenceApi/mp-rest/scope=javax.inject.Singleton
I find it "wrong" to declare the interface in my DataGenerator, because at this point I don't notice when the api provided by the Persistence service changes. Accordingly one could come up with the idea to position the interface in the Persistence service, which is then implemented by my concrete Persistence implementation and leads to the following code.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public class PersistenceApiImpl implements PersistenceApi {
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto fruit) {
Post post = toPostMapper.toResource(fruit);
entityManager.persist(post);
return Response.ok(fruit).status(201).build();
}
}
In order to use the interface in my DataGenerator project, I would have to include the Persistence project as a dependency in the DataGenerator project, which sounds like a "monolith with extra steps" to me and therefore feels wrong in terms of separation of concerns.
I have tried the following approach:
I created another Maven project called PersistenceApi which contains only the PersistenceApi interface. This project was then included as a dependency in both the Persistence and the DataGenerator projects. In the Persistence project I implement the interface as in the example above, and in the DataGenerator project I try to inject the interface via @Inject.
Unfortunately this does not work: when building the service, I get an UnsatisfiedResolutionException telling me that the PersistenceApi dependency, which I want to inject via @Inject into the DataGenerator service, cannot be injected.
Now my questions:
I don't see what I'm missing here. Could you help me?
Is this kind of API sharing with dedicated API projects a viable way, or is the "monolith with extra steps" approach really the way to go?
Thank you in advance.
That's a common problem with microservices. Like the book "Microservices: Grundlagen flexibler Softwarearchitekturen" ("Microservices: Foundations of Flexible Software Architectures") by Eberhard Wolff (I saw that you are German too), I follow the idea that microservices should have the same coupling as the teams developing them and as the organization you are developing them for (have a look at Conway's law). Therefore services of mostly independent teams should be developed independently, and an API change in one service should not affect another at the time of the update.
If you develop both services within your own team, then I think you can couple them the way you are doing it, because you don't have to coordinate with other teams and there will be no huge overhead. Note that you will be forced to release both services together. If that is always OK for you, then save your time and do it your way; if not, have a look at API versioning:
I use API versioning, so the old API is still reachable under "v1/" and the new one under "v2/". This way the team behind the other microservice has enough time to update their service.
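A minimal sketch of what that can look like with JAX-RS paths (the interface and DTO names here are hypothetical):

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

// The old contract stays reachable under v1/ while breaking changes land under v2/.
@Path("/api/v1/persistence")
public interface PersistenceApiV1 {
    @POST
    Response create(PostDto post);
}

@Path("/api/v2/persistence")
public interface PersistenceApiV2 {
    @POST
    Response create(PostDtoV2 post); // assumes a new, incompatible DTO
}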
Have a look at Domain-Driven Design for the different ways of integrating bounded contexts (= services) and their coupling consequences. Without API versioning you are forced into a partnership, and you need to release together. Maybe you prefer Customer-Supplier, or even Conformist.
To test compatibility between both services, have a look at consumer-driven contracts and Pact. You can also generate OpenAPI files and track their changes, but that will only help with notifying people about changes.

Separating out REST API and implementation for Spring web services

I've been using Spring annotations such as @RestController and @RequestMapping to create simple services in a Spring Boot web application.
So I have this trivial example working correctly:
@RestController
public class HelloController {

    @RequestMapping("/")
    public String sayIt() {
        return "Hello!";
    }
}
Now, I would like to separate out an API library (jar) with only the REST interface and the DTOs. One or more separate libraries would provide the actual implementations of this interface. I can then use the (lightweight) API library on the client-side to generate REST client proxies to talk to any of the implementations.
So... are there any annotations or configuration to mark REST interfaces and implementations separately? If not, what is the Spring-y way to achieve this instead of using JAX-RS annotations?
@Something1
public interface HelloServiceApi {
    @RequestMapping("/")
    String sayIt();
}

@Something2
public class HelloServiceImpl implements HelloServiceApi {
    public String sayIt() {
        return "Hello!";
    }
}
I would advise having a jar that contains only the DTO objects, without any logic. It can then be used by both the REST server and the client to transfer objects.
The client should not depend on the REST war/jar or on the logic.
Furthermore, I would try to make sure my controller doesn't hold any logic aside from, maybe, mapping the DTOs into domain model objects, which are then passed to the business logic layer.
In my opinion, the REST layer should only be responsible for the external API: argument handling, delegating to the layer below (the service layer) and preparing a response.
That being said, you should put your different implementations in the service layer. This keeps the API/REST layer untouched and constant.
The service layer (which provides the different implementations) should respect a common interface that is injected into the REST layer above, as sketched below.
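A minimal sketch of that arrangement, reusing the HelloController example from the question (HelloService here is a hypothetical interface that would live in the service layer's API module):

import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class HelloController {

    // The controller holds no logic; the concrete HelloService
    // implementation is chosen by dependency injection and can be
    // swapped without touching the REST layer.
    private final HelloService helloService;

    public HelloController(HelloService helloService) {
        this.helloService = helloService;
    }

    @RequestMapping("/")
    public String sayIt() {
        return helloService.sayIt();
    }
}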
Have I responded to your question?

How to develop a web application with a bunch of feature modules

I am developing a web application using the Spring framework and Google App Engine. I am wondering if there is a design pattern or framework that would help me develop the features of my application as pluggable modules. For example, I have identified four features of the application:
OAuth Login
User Profile Management
User Group creation
User File management
Now what I need is to develop all these features as independent modules, so that I can detach any of them dynamically and they are as loosely coupled as possible. They can have their own database implementations, their own sets of technologies, etc. Is there a design principle for implementing modules in such a way?
You can take a look at MQ systems (such as RabbitMQ or ActiveMQ).
An MQ system works as an intermediate layer which gives you loose coupling.
Communication between modules is implemented by posting messages to a queue and listening for them, as in the sketch below.
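A minimal sketch of the posting side with the plain RabbitMQ Java client (the queue name and payload are illustrative):

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class ProfileEventPublisher {

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {
            // The profile module publishes an event; the group or file module
            // consumes it later, with no compile-time dependency between them.
            channel.queueDeclare("profile-events", true, false, false, null);
            channel.basicPublish("", "profile-events", null,
                    "profile-updated:42".getBytes("UTF-8"));
        }
    }
}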
Also, OSGi may help you. It gives you the possibility to build your application as a set of pluggable modules which can be loaded dynamically.
As per my experience, I suggest using the MVC pattern. Use servlet filters for:
1. OAuth login
Create services/POJOs that implement the features and inject them into each other according to your requirements for:
2. User profile management
3. User group creation
4. User file management
If you know Spring AOP, use it, so that you can achieve a more dynamic integration between the implementations of points 2, 3 and 4; a sketch of that follows below.
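A minimal sketch of that AOP idea, assuming Spring AOP with AspectJ annotations (the package names and the aspect itself are hypothetical):

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class ModuleAuditAspect {

    // Intercepts public methods of the profile, group and file modules
    // without those modules knowing about each other.
    @Around("execution(* com.example.profile..*(..)) || "
          + "execution(* com.example.group..*(..)) || "
          + "execution(* com.example.files..*(..))")
    public Object audit(ProceedingJoinPoint joinPoint) throws Throwable {
        System.out.println("Calling " + joinPoint.getSignature());
        return joinPoint.proceed();
    }
}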
You should split each feature into two components: API and implementation. The first contains interfaces, the second their implementations. You pass the interface to the web-app controller and inject the implementation via Spring or any other dependency injection framework. For example:
web-app, a UserController which handles requests from clients and delegates to your components:
@RestController
public class UserController {

    private final FileManager fileManager;

    @Autowired
    public UserController(FileManager fileManager) {
        this.fileManager = fileManager;
    }

    @GetMapping("/user/{userId}/file/{fileId}")
    public File getUserFile(@PathVariable long userId, @PathVariable long fileId) {
        return fileManager.getUserFile(userId, fileId);
    }
}
file-mgt-api, where you define interfaces to decouple the web-app from the implementation:
public interface FileManager {
    File getUserFile(long userId, long fileId);
}
file-mgt-impl, where all the details of how to get the requested file live:
@Component
public class FileManagerImpl implements FileManager {

    @Override
    public File getUserFile(long userId, long fileId) {
        // get file by id from DB
        // verify that provided user is the file owner
        // do other useful stuff
        // return the file or throw exception if something wrong
    }
}
Do the same for group, profile management and the other features. After that you can easily swap an implementation by replacing a single jar file. Your web-app is completely decoupled and doesn't know anything about implementation details; it only depends on interfaces.

Spring: How to export object methods via a non-HTTP protocol

I have a collection of stateless Scala/Java APIs, which might look like:
class UserService {
  def get(id: UserIDType): User = { ... }
  def update(user: User): User = { ... }
  ...
}
where User has a set of inspectable bean properties. I'd like to make these same APIs available not only over HTTP (which I know how to do), but also over other, more performant non-HTTP protocols (ideally also running in the same process as the HTTP server). More importantly, I want to automate as much as possible, including the generation of client APIs that match the original Java API and the dispatching of method calls following network requests.
I've found Spring's guide on remoting.
However, this looks limited to HTTP (text, not binary). What I'd really love is a library or other method that:
lets me scan for registered/annotated services and describe methods and parameters as data
lets me easily dispatch method calls (so that I'm in control of the sockets, communication protocols and whatnot, and can choose a more performant one than HTTP),
i.e. the kinds of things that Spring's DispatcherServlet does internally, but without the HTTP limitations; a reflection sketch of the scanning part follows below.
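For illustration, the first point (describing methods and parameters as data) can be sketched with plain reflection over Spring's bean registry; the class below is illustrative, not an existing library:

import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.Map;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Service;

public class ServiceScanner {

    // Lists every @Service bean and its methods as plain data,
    // which a custom (non-HTTP) dispatcher could then use.
    public static void describeServices(ApplicationContext context) {
        Map<String, Object> services = context.getBeansWithAnnotation(Service.class);
        for (Map.Entry<String, Object> entry : services.entrySet()) {
            for (Method method : entry.getValue().getClass().getDeclaredMethods()) {
                System.out.println(entry.getKey() + "." + method.getName()
                        + Arrays.toString(method.getParameterTypes()));
            }
        }
    }
}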
Ideas?
Background
I'd like to make a set of stateless service API calls available as network services with the following goals:
Some of the APIs should be available as REST calls from web pages/mobile clients
A superset of these APIs should be available internally within our company (e.g. from C++ libraries, Python libraries) in a way that is as high-performance (read: low-latency) as possible, particularly when the service and client are:
co-located in the same process, or
co-located on the same machine.
Automate wrapper code for the client and server. If I add a method to a service API, or an attribute to a class, no one should have to write additional code in the client or the server (e.g. I hit a button and a new client API that matches the original Java/Scala service is auto-generated).
(1) is easily achievable with Spring MVC. Without duplicating classes, I can simply mark up the service (i.e. the service is also a controller):
@Controller
@Service
@RequestMapping(...)
class UserService {
  @RequestMapping(...)
  def get(@PathVariable("id") id: UserIDType): User = { ... }
  @RequestMapping(...)
  def update(@RequestBody user: User): User = { ... }
  ...
}
With tools like Swagger, (3) is easily achievable: I can trivially read the Swagger-generated JSON spec and auto-generate C++ or Python code that mimics the original service class, its methods, parameter names and the POJO parameter types.
However, HTTP is not particularly performant, so I'd like to use a different (binary) protocol here. Trivially, I can use this same spec to generate a Thrift .idl, or even directly generate client code that talks Thrift/Protocol Buffers (or any other binary protocol) while also providing the client with an API that looks just like the original Java/Scala one.
Really, the only tricky part is getting a description of the service methods and then actually dispatching method calls. So: are there libraries to help me do that?
Notes:
I'm aware that mixing service and controller annotations disgusts some Spring MVC purists. However, my goal really is to have a logical API and to automate all of the service embodiments of it (i.e. the controller part is redundant boilerplate that should be auto-generated). I'd REALLY rather not spend my life writing boilerplate code.
