Could you briefly explain what the Repository class and the Service class are responsible for in Spring Boot? As I understand it, the repository does all the database operations and interacts with the DB, but in some videos I heard that the Service class talks to the DB, so I got confused and can't find a clear definition for either.
Thank you very much for your time.
@Service marks beans that handle the business logic.
@Repository marks beans that take care of communicating with the DB.
The Service class is where you implement your business logic (which you do not want exposed to the caller), and the Repository class is where you perform database operations on an entity.
There is one more class, the Controller, which handles incoming web requests; it forwards them to service methods, and if data from the database is needed, the service delegates to the repository.
Hope this explains it. This layering is a common design pattern for building production-level applications.
Here is a short example
@Controller // Controller: handles incoming web requests
public class RequestController {

    @Autowired
    private ServiceClass service;

    @RequestMapping("")
    public String index(@RequestParam("name") String name) {
        return service.getString(name);
    }
}

@Service // Service: business logic lives here
public class ServiceClass {

    @Autowired
    private StuRepository repo;

    public String getString(String name) {
        if (name.equals("Rahul"))
            return repo.findName();
        else
            throw new IllegalArgumentException("business logic performed here");
    }
}

@Repository // Repository: database access for the entity
public interface StuRepository extends JpaRepository<Model, Integer> {
    String findName();
}
I am writing an Azure Function in Java. To keep the design modular, I created a service layer consisting of interfaces and an impl layer that implements them.
However, I am not using the Spring framework, so I can't use @Autowired to inject a singleton instance of the service layer into the runner class. How can I use my service layer functions in my runner class (or elsewhere in my project)?
Service Layer
public interface TimeTriggeredService {
    String getLogs(String token, String url, final ExecutionContext context);
}
Impl Layer
public class TimeTriggeredServiceImpl implements TimeTriggeredService {
    public String getLogs(String token, String url, final ExecutionContext context) {
        // Some logic
    }
}
Runner Class
public class TimeTriggeredFunction {
    @FunctionName("TimeTriggeredFunction")
    public void run(@TimerTrigger(name = "timerInfo", schedule = "0 */1 * * * *") String timerInfo,
                    final ExecutionContext context) {
        String timeAuditLogs = TimeTriggeredService.getLogs(token, URL, context); // unsure what should replace this line or what should be done before this
    }
}
Note: this is NOT a Spring project.
Dependency injection for Java Azure Functions is not supported yet. Check out the GitHub issue tracking this: #324.
For now, you can use the Spring framework with Azure Functions for HTTP requests only (not the other bindings).
Here is a sample of how to use it.
If you do not want to use the Spring framework, you will have to create a global instance in the class and use it.
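For example, a minimal sketch of that approach, reusing the classes from the question (token and url are placeholders here, just as in the original post):

public class TimeTriggeredFunction {

    // One shared instance for the whole function class, created manually
    // instead of being injected by a framework.
    private static final TimeTriggeredService TIME_TRIGGERED_SERVICE = new TimeTriggeredServiceImpl();

    @FunctionName("TimeTriggeredFunction")
    public void run(@TimerTrigger(name = "timerInfo", schedule = "0 */1 * * * *") String timerInfo,
                    final ExecutionContext context) {
        // token and url are placeholders, as in the question
        String timeAuditLogs = TIME_TRIGGERED_SERVICE.getLogs(token, url, context);
    }
}

If you later need to swap implementations (for tests, say), you can pass the instance in through a constructor or setter instead of hard-coding the new TimeTriggeredServiceImpl() call.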
I have been using dependency injection with @Autowired in Spring Boot. All the articles I have read about dependency injection mention that it is very useful when (or if) we decide to change the implementing class in the future.
For example, take a Car class and a Wheel interface. The Car class requires an implementation of the Wheel interface in order to work, so we use dependency injection in this scenario:
// Wheel interface
public interface Wheel {
    public int wheelCount();
    public void wheelName();
    ...
}

// Wheel interface implementation
public class MRF implements Wheel {
    @Override
    public int wheelCount() {
        ...
    }
    ...
}

// Car class
public class Car {
    @Autowired
    Wheel wheel;
}
Now in the above scenario, the ApplicationContext will figure out that there is an implementation of the Wheel interface and bind it to the Car class. If in the future we change the implementation to, say, an XYZWheel class and remove the MRF implementation, the same wiring should still work.
However, if we decide to keep both implementations of the Wheel interface in our application, then we need to specify which dependency we are interested in when autowiring it. The changes would be as follows:
// Wheel interface
public interface Wheel {
    public int wheelCount();
    public void wheelName();
    ...
}

// Wheel interface implementation
@Qualifier("MRF")
public class MRF implements Wheel {
    @Override
    public int wheelCount() {
        ...
    }
    ...
}

// Wheel interface implementation
@Qualifier("XYZWheel")
public class XYZWheel implements Wheel {
    @Override
    public int wheelCount() {
        ...
    }
    ...
}

// Car class
public class Car {
    @Autowired
    @Qualifier("XYZWheel")
    Wheel wheel;
}
So now I have to manually specify the implementation I want to autowire. How does dependency injection help here? I could just as well use the new operator to instantiate the implementing class I need instead of relying on Spring to autowire it for me.
So my question is: what is the benefit of autowiring/dependency injection when I have multiple implementing classes and therefore have to manually specify the type I am interested in?
You don't necessarily have to hard-wire an implementation if you selectively use qualifiers together with @Primary and @Conditional when setting up your beans.
A real-world example for this applies to implementation of authentication. For our application, we have a real auth service that integrates to another system, and a mocked one for when we want to do local testing without depending on that system.
This is the base user details service for auth. We do not specify any qualifiers for it, even though there are potentially two @Service targets for it, Mock and Real.
@Autowired
BaseUserDetailsService userDetailsService;
This base service is abstract and has all the implementations of methods that are shared between mock and real auth, and two methods related specifically to mock that throw exceptions by default, so our Real auth service can't accidentally be used to mock.
public abstract class BaseUserDetailsService implements UserDetailsService {

    public void mockUser(AuthorizedUserPrincipal authorizedUserPrincipal) {
        throw new AuthException("Default service cannot mock users!");
    }

    public UserDetails getMockedUser() {
        throw new AuthException("Default service cannot fetch mock users!");
    }

    //... other methods related to user details
}
From there, we have the real auth service extending this base class and marked as @Primary.
@Service
@Primary
@ConditionalOnProperty(
        value = "app.mockAuthenticationEnabled",
        havingValue = "false",
        matchIfMissing = true)
public class RealUserDetailsService extends BaseUserDetailsService {
}
This class may seem sparse, because it is. The base service it extends was originally our only authentication service; we later extended it to support mock auth and made this subclass the "real" auth. Real auth is the primary auth and is always enabled unless mock auth is enabled.
We also have the mocked auth service, which has a few overrides to actually mock, and a warning:
@Slf4j
@Service
@ConditionalOnProperty(value = "app.mockAuthenticationEnabled")
public class MockUserDetailsService extends BaseUserDetailsService {

    private User mockedUser;

    @PostConstruct
    public void sendMessage() {
        log.warn("!!! Mock user authentication is enabled !!!");
    }

    @Override
    public void mockUser(AuthorizedUserPrincipal authorizedUserPrincipal) {
        log.warn("Mocked user is being created: " + authorizedUserPrincipal.toString());
        mockedUser = authorizedUserPrincipal;
    }

    @Override
    public UserDetails getMockedUser() {
        log.warn("Mocked user is being fetched from the system!");
        return mockedUser;
    }
}
We use these methods in an endpoint dedicated to mocking, which is also conditional:
@RestController
@RequestMapping("/api/mockUser")
@ConditionalOnProperty(value = "app.mockAuthenticationEnabled")
public class MockAuthController {
    //...
}
In our application settings, we can toggle mock auth with a simple property.
app:
  mockAuthenticationEnabled: true
With the conditional properties, we should never have more than one auth service ready, but even if we do, we don't have any conflicts.
Something went horribly wrong: no Real, no Mock - Application fails to start, no bean.
mockAuthEnabled = true: no Real, Mock - Application uses Mock.
mockAuthEnabled = false: Real, no Mock - Application uses Real.
Something went horribly wrong: Real AND Mock both - Application uses Real bean.
The best way (I think) to understand dependency injection (DI) is like this:
DI is a mechanism that allows your @Autowired interface to be replaced by an implementation dynamically at run time. It is the role of your DI framework (Spring, Guice, etc.) to perform this.
In your Car example, you declare the dependency as the Wheel interface, but during execution Spring creates an instance of an implementation such as MRF or XYZWheel.
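For example, here is a minimal sketch (the WheelConfig class and bean method names are illustrative, not from the question) of keeping Car coupled only to the Wheel interface while the choice of implementation lives in configuration:

// Illustrative configuration: the choice of Wheel lives here, not in Car.
@Configuration
public class WheelConfig {

    @Bean
    @Primary            // change which bean is primary to swap wheels; Car is untouched
    public Wheel mrfWheel() {
        return new MRF();
    }

    @Bean
    public Wheel xyzWheel() {
        return new XYZWheel();
    }
}

@Component
public class Car {

    private final Wheel wheel;

    // Constructor injection: Spring supplies whichever Wheel is primary.
    public Car(Wheel wheel) {
        this.wheel = wheel;
    }
}

Swapping from MRF to XYZWheel then becomes a change in this configuration (or a profile/property) rather than in every class that uses a Wheel.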
To answer your question: I think it depends on the logic you want to implement. It is not the role of your DI framework to choose which kind of Wheel you want for your Car; somehow you will have to define which implementation you want injected as a dependency.
Any other answers would be useful too, because DI is sometimes a source of confusion. Thanks in advance.
The Spring integration with Feign supports using Spring MVC annotations for mapping a Feign interface:
@FeignClient("multiplier")
public interface MultiplierApi {

    @GetMapping("/multiply")
    public Long multiply(@RequestParam("one") long one, @RequestParam("two") long two);
}
I could place the MultiplierApi interface into an API package, and use it with @EnableFeignClients in client programs and as an implemented interface for my controller:
@RestController
public class MultiplierController implements MultiplierApi {

    @Override
    public Long multiply(long one, long two) {
        return one * two;
    }
}
This seems to allow me to remove duplication that might otherwise occur between the controller and the client interface, reducing the likelihood that the mappings will get out of sync. Is there any disadvantage to sharing the API definition in this way?
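For reference, here is roughly how I would consume the shared interface from a client program (just a sketch; ClientApplication and CalculatorService are illustrative names):

@SpringBootApplication
@EnableFeignClients(clients = MultiplierApi.class)
public class ClientApplication {
}

@Service
public class CalculatorService {

    @Autowired
    private MultiplierApi multiplierApi;   // Feign-generated HTTP client for the "multiplier" service

    public long square(long x) {
        return multiplierApi.multiply(x, x);
    }
}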
We are using Apache Felix annotations to handle all the OSGi concerns in our application. I have a provider class that talks to a server and a consumer class that does things with data from the server. What I want is to create another provider instance (a new class implementing the interface), for debug purposes only, that returns canned responses to the consumer's requests. Ideally the consumer would be unaware of this handoff; its provider service reference would simply be replaced.
The use case: when a developer is running on a machine without access to the actual server, he presses a button in our running app to switch from the real provider instance to the debug provider instance.
What is the recommended way to accomplish this?
Example code:
public interface IProvider {
    public String getDataFromServer();
}

@Component
@Service(value = IProvider.class)
public class RealProvider implements IProvider {
    @Override
    public String getDataFromServer() {
        ...
    }
}

@Component
@Service(value = IProvider.class)
public class DebugProvider implements IProvider {
    @Override
    public String getDataFromServer() {
        return "Hello World";
    }
}

@Component
public class Consumer {
    private @Reference IProvider provider;

    public void doSomething() {
        provider.getDataFromServer();
    }
}
If the two providers are in separate bundles, you can stop Bundle A and start Bundle B to switch between implementations of the service.
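A rough sketch of that approach using the plain OSGi framework API (the bundle symbolic names below are placeholders):

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;

public class ProviderSwitcher {

    // Switch to the debug provider by stopping the real provider's bundle
    // and starting the debug provider's bundle.
    public void useDebugProvider(BundleContext ctx) throws BundleException {
        for (Bundle bundle : ctx.getBundles()) {
            if ("com.example.provider.real".equals(bundle.getSymbolicName())) {
                bundle.stop();   // RealProvider's service is unregistered
            } else if ("com.example.provider.debug".equals(bundle.getSymbolicName())) {
                bundle.start();  // DebugProvider's service is registered
            }
        }
    }
}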
If the two providers are in the same bundle, you'd need to either drop down to the OSGi API and register/unregister the services manually, or create a proxy version of IProvider that has a debugMode flag and delegates to the specific implementation.
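A sketch of the proxy approach (ProxyProvider and setDebugMode are made-up names; it assumes RealProvider and DebugProvider become plain classes here, no longer registered as services themselves):

@Component
@Service(value = IProvider.class)
public class ProxyProvider implements IProvider {

    // In this sketch the concrete providers are created directly rather
    // than being registered as OSGi services.
    private final IProvider real = new RealProvider();
    private final IProvider debug = new DebugProvider();

    private volatile boolean debugMode = false;

    // Called from the "switch to debug" button in the running app.
    public void setDebugMode(boolean debugMode) {
        this.debugMode = debugMode;
    }

    @Override
    public String getDataFromServer() {
        return (debugMode ? debug : real).getDataFromServer();
    }
}

The Consumer keeps its single @Reference to IProvider and never notices the switch.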