How to disable the default exposure of Spring Data REST repositories?

I have a project that uses spring-data-rest and has a dependency project that only uses Spring Data. Both projects have Spring Data repositories and use @EnableJpaRepositories to implement their repository interfaces, but I only want to export the repositories in the parent project.
Here's my question: is there some way to configure Spring Data REST to only expose REST endpoints for resources in the parent project, without having to explicitly annotate every repository in the dependency project with @RepositoryRestResource(exported = false)?
If @RepositoryRestResource is the only way to disable it, then, worse yet, no other project with a different use case will be able to enable REST endpoints for those repositories, and my dependency project will have to include Spring Data REST solely for the…

Looping back here as I was looking for this specific setting. It looks like this is now implemented. In this case, you would want to set spring.data.rest.detection-strategy=annotated to avoid default exposure.
All application.properties options:
# Exposes all public repository interfaces but considers @(Repository)RestResource's exported flag.
spring.data.rest.detection-strategy=default
# Exposes all repositories independently of type visibility and annotations.
spring.data.rest.detection-strategy=all
# Only repositories annotated with @(Repository)RestResource are exposed, unless their exported flag is set to false.
spring.data.rest.detection-strategy=annotated
# Exposes only public repositories, regardless of annotations.
spring.data.rest.detection-strategy=visibility
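If you prefer Java configuration over the property, the same strategy can be set programmatically. A minimal sketch, assuming a Spring Data REST version whose RepositoryRestConfigurer callback takes a CorsRegistry (as in the answer further down); the class name is made up:
import org.springframework.context.annotation.Configuration;
import org.springframework.data.rest.core.config.RepositoryRestConfiguration;
import org.springframework.data.rest.core.mapping.RepositoryDetectionStrategy.RepositoryDetectionStrategies;
import org.springframework.data.rest.webmvc.config.RepositoryRestConfigurer;
import org.springframework.web.servlet.config.annotation.CorsRegistry;

@Configuration
public class RestDetectionStrategyConfig implements RepositoryRestConfigurer {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config, CorsRegistry cors) {
        // Equivalent of spring.data.rest.detection-strategy=annotated: only repositories
        // annotated with @(Repository)RestResource are exported.
        config.setRepositoryDetectionStrategy(RepositoryDetectionStrategies.ANNOTATED);
    }
}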
References
3.5.1. Setting the Repository Detection Strategy
Common Application Properties

Currently there's no global switch for what you're looking for. I've filed this ticket for you for inclusion in the next major release.
Not sure if it is an option for you, but package-private repository interfaces are not currently exposed unless explicitly annotated. If you can make all those library repositories package-private, that might be preferable to annotating each one explicitly.
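For illustration, a minimal sketch of such a library repository (Widget and the interface name are hypothetical):
package com.example.library;

import javax.persistence.Entity;
import javax.persistence.Id;

import org.springframework.data.repository.CrudRepository;

// Hypothetical library entity.
@Entity
class Widget {

    @Id
    private Long id;
}

// No "public" modifier: the interface is package-private, so Spring Data REST's default
// detection strategy does not export it, while the repository bean is still created
// and usable inside the library.
interface WidgetRepository extends CrudRepository<Widget, Long> {
}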

As of version 3.4 use:
import org.springframework.context.annotation.Configuration;
import org.springframework.data.rest.core.config.RepositoryRestConfiguration;
import org.springframework.data.rest.webmvc.config.RepositoryRestConfigurer;
import org.springframework.web.servlet.config.annotation.CorsRegistry;
@Configuration
public class SpringRestConfiguration implements RepositoryRestConfigurer {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config, CorsRegistry cors) {
        config.disableDefaultExposure();
    }
}

Related

Convert library to autoconfiguration/starter while in use

Our project has a library that defines several beans.
Historically, it has been a plain artifact with no auto-configuration, and all other applications use it together with @ComponentScan.
Now I realize that it should rather be an auto-configuration dependency, with @ConditionalOnMissingBean etc., to provide more flexibility for the applications.
According to the docs, auto-configurations should not be a candidate for component scanning. As far as I understand, this is only critical in situations where your configuration is @ConditionalOn something, so that it does not get scanned twice.
What would be the correct way to perform the transition from library + @ComponentScan to auto-configuration, assuming all the consumers of the library cannot be updated instantly?
Will it cause any major issues if the auto-configuration gets scanned? Or can that be limited to cases that use @ConditionalOn etc.?
I know it should be possible to create another Maven artifact, make it dependent on the first one and define the auto-configuration there, but I would like to avoid creating another one for now.
Thanks, Pavlo
So I have this solution in mind:
1. Start simple: in each starter library define one @Configuration class with @ComponentScan(basePackages = "com.acme.libn"); when done, release all the libraries on the Nexus (see the sketch after these steps). Note that this is bad practice, starters should declare their beans in @Configuration classes, but for this step it is used as a workaround.
2. Go to the consumers, remove @ComponentScan, update the library and deploy. This is the fastest way to stop scanning for beans on the main com.acme package.
3. Now you are free to work on each library as you prefer. For example: refactor the packages, declare the beans in the @Configuration classes, use @ConditionalOn…, remove @ComponentScan, use more @Configuration classes (you have to link them in the spring.factories file), start using a prefix for properties defined in the consumers…; you can work on this task in parallel with your team, and release each library when it's ready if you don't want to release all of them together.
4. Go back to the consumers and refactor if needed, update the libraries that are ready and deploy. Repeat from step 3 until the libraries are finished.
Otherwise you have to refactor the com.acme package in each library and in each consumer, keeping the @ComponentScan in the consumers. When done with all the libraries you can then remove it. You don't have to do it all at once; you can do it one library and one consumer at a time.
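A rough sketch of the step-1 workaround (the configuration class name is made up; com.acme.libn is the package from the step above):
package com.acme.libn;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

// Temporary bridge: the starter still relies on scanning its own package instead of
// declaring beans explicitly; it gets replaced while working through step 3.
// Registered for auto-configuration in META-INF/spring.factories with a line like:
// org.springframework.boot.autoconfigure.EnableAutoConfiguration=com.acme.libn.LibNAutoConfiguration
@Configuration
@ComponentScan(basePackages = "com.acme.libn")
public class LibNAutoConfiguration {
}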
PS: if you have entities and you are using Spring Data in your libraries, let me know and I will update the answer, because there is extra work to do.
I've read the documentation you have linked and I don't know whether it's outdated. I'm using the spring.factories file, as for example stated here. I'm using Spring Boot 2.7.0, so check the correct configuration for the version of Spring Boot you are using.
EDIT
I've read the documentation more carefully and dug into the Spring Boot code. META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.import can be used to tell Spring where the @Configuration/@AutoConfiguration classes are. You can also use META-INF/spring.factories to declare the @Configuration/@AutoConfiguration classes. In the source code they are using the first option. Honestly I've not understood the difference.
I think I've also discovered how to expose a @RestController or any other component in a starter without using @ComponentScan: as they say in the doc, you must use @Import. Example:
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.stereotype.Service;

@Configuration
@Import({RestController.class, SomeService.class})
class ConfigurationClass {
}

// Fully qualified because this class deliberately reuses the annotation's name.
@org.springframework.web.bind.annotation.RestController
class RestController {
}

@Service
class SomeService {
}
This will make SomeService available for injection, and I think it will also automatically expose the endpoints defined in RestController, like spring-boot-starter-actuator does.
For the spring-data-jpa repositories you have to use this in the configuration class, for example:
import javax.persistence.Entity;

import org.springframework.boot.autoconfigure.AutoConfigureAfter;
import org.springframework.boot.autoconfigure.data.jpa.JpaRepositoriesAutoConfiguration;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@Configuration
@AutoConfigureAfter(JpaRepositoriesAutoConfiguration.class)
@EnableJpaRepositories(basePackages = "com.example")
@Import({Entity1.class, Entity2.class})
class ConfigurationClass {
}

@Entity
class Entity1 {
}

@Entity
class Entity2 {
}
In this way the repository interfaces present in com.example will be ready for injection in the configuration class or outside the jar. In my project I was able to load entities only by using @EntityScan in the consumer of the starter; I still have to try @Import.
Everything I've written in the EDIT section still has to be tested; the only exception is the configuration of the JPA repositories with @EntityScan on the consumer. I've had no time to test right now.
So, in the end, you can remove the @ComponentScan from the first step and use @Import, but you have to list all the classes annotated with @Component, @Service and so on. In this way you don't have to declare and construct all the beans of step 3 in the configuration classes.
EDIT 2: I've made some tests; you will find the code here.
What I discovered:
Adding the configuration classes in META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.import doesn't work; you have to use META-INF/spring.factories. If you check the code, you'll see I've commented out the configuration class in that file.
@Import with entities doesn't work; unfortunately you have to put @EntityScan in the consumer, declaring all the packages where the entities are (see the sketch below).
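A sketch of what that looks like on the consumer side (the consumer package and application class are made up; com.example is the starter's entity package from the example above):
package com.acme.consumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;

// The consumer lists every package containing entities, its own and the starter's,
// because @Import on the entity classes did not register them.
@SpringBootApplication
@EntityScan(basePackages = {"com.acme.consumer", "com.example"})
public class ConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }
}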
For the rest everything is fine. You can play with it:
POST localhost:8080/someEntities
request body
{
    "name" : "entity1"
}
response
{
    "id": 1,
    "name": "entity1"
}
GET localhost:8080/someEntities/1
{
    "id": 1,
    "name": "entity1"
}
EDIT 3: since Spring Boot v2.7 the classes annotated with @AutoConfiguration are registered in a new file, META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports (note the final s, so imports); see this question for more info. The configuration classes declared in spring.factories are still honored.
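For example, with a hypothetical auto-configuration class com.acme.libn.LibNAutoConfiguration, the new-style registration file META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports contains just fully qualified class names, one per line:
com.acme.libn.LibNAutoConfiguration
while the older META-INF/spring.factories equivalent is:
org.springframework.boot.autoconfigure.EnableAutoConfiguration=com.acme.libn.LibNAutoConfiguration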

Disable DataSourceAutoConfiguration Spring boot

I have a library that makes use of spring-jdbc, the library contains common utility methods that need to be standardized across multiple projects.
When the library is used in another Spring Boot application, the project fails with a "no bean of type DataSourceConfiguration" exception.
I have read tips about excluding DataSourceAutoConfiguration on @SpringBootApplication, but that would mean making a change in every application that uses the library, regardless of whether the application needs a DataSource or not.
@Configuration
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class MyConfiguration {
}
The other option is to exclude DataSourceAutoConfiguration in the library's own spring.factories, but then that would disable the auto-configuration for every application using the library, and each would have to define a DataSource manually.
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
Is there a way to make this work both for the library and for any other project that wants to use the library without defining a DataSource, while still functioning like a normal Spring Boot application?
The below is from the Spring documentation:
import org.springframework.boot.autoconfigure.*;
import org.springframework.boot.autoconfigure.jdbc.*;
import org.springframework.context.annotation.*;
@Configuration
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class MyConfiguration {
}
https://docs.spring.io/spring-boot/docs/1.3.8.RELEASE/reference/html/using-boot-auto-configuration.html

Spring can't find CrudRepository beans when using ComponentScan

I'm trying to create a project structure that would allow me to add/remove modules by simply having them on the classpath. With the @ComponentScan("com.companyname") annotation in my Spring application it detects and creates the annotated components from the modules. But I get errors when trying to autowire my CrudRepository anywhere:
Field repo in com.companyname.somemodule.services.SomeService required a bean of type 'com.companyname.somemodule.repos.SomeRepo' that could not be found.
So I thought that maybe it somehow can't create repositories if they are defined in one of the modules, so I went ahead and added a test repository to my base SpringApplication, and to my surprise I got:
Field repo in com.companyname.modularapp.TestService required a bean of type 'com.companyname.modularapp.TestRepo' that could not be found.
Then I just removed my @ComponentScan annotation and suddenly TestRepo worked as I intended; I was able to persist and read Test entities normally. So apparently @ComponentScan somehow screws up either the creation of the CrudRepository beans or their later detection.
I define my repos like this:
@Entity
public class Test {

    @Id
    private long id;
}

public interface TestRepo extends CrudRepository<Test, Long> {}
I'm trying out Spring Boot 2.0.0.M7 with this project but I doubt that's the cause.
Did I miss something?
You can also define the package for repository scanning with:
#EnableJpaRepositories("com.companyname")
or in XML config
<jpa:repositories base-package="com.companyname"/>
If you are using Spring Boot you might as well drop the @ComponentScan annotation, as there is one already defined in the @SpringBootApplication annotation. Maybe there's a conflict of some sort between them; it's hard to tell without looking at the code.
If you are customizing package scans in your project, then you probably need to manually configure the beans that require a scan path, e.g. for JPA you can create your own LocalContainerEntityManagerFactoryBean bean (you can find an auto-configuration example in org.springframework.boot.autoconfigure.orm.jpa.JpaBaseConfiguration#entityManagerFactory, a class reference from spring-boot 1.5.*).
But if you do not require package scan customization, just put the class annotated with @SpringBootApplication in the project root package and pass it to the SpringApplication.run method.
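A minimal sketch of that layout, reusing the base package from the question (the class name is made up):
package com.companyname.modularapp;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Placed in the root package, so the implicit component, repository and entity scanning
// of @SpringBootApplication covers this package and its sub-packages.
@SpringBootApplication
public class ModularApp {

    public static void main(String[] args) {
        SpringApplication.run(ModularApp.class, args);
    }
}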

How to configure neo4j and cassandra repositories in same spring boot application

I have configured neo4j and cassandra repositories separately with Spring Boot using Spring Data. However, when I try to use the two sets of repositories in the same project it doesn't work as expected.
This is my folder structure.
-----org.test.project
-----controller
BarController
FooController
-----models
-----dao
-----cassandra
BarDAO
FooDAO
-----neo4j
BarDAO
FooDAO
-----repositories
-----cassandra
BarRepository
FooRepository
-----neo
BarRepository
FooRepository
-----services
CassandraService (Has cassandra repositories @Autowired)
NeoService (Has neo repositories @Autowired)
TestApp.java
Note that all the repositories extend the respective Spring Data repository interface with the respective DAO.
When I run with this configuration it gives the following error.
Field airportRepository in org.test.project.TestApp required a bean of type 'org.test.project.repositories.cassandra.BarRepository' that could not be found.
I tried changing the repository names, and then it started to work.
My first question is: can't we have the same names when they are in different packages, and have it still work?
Though it started to work, this time it gave an error about the authentication header.
org.neo4j.ogm.drivers.http.request.HttpRequestException: http://localhost:7474/db/data/transaction/commit: No authentication header supplied.
I had already added ogm.properties, the same way I did when I was using only neo4j repositories, but it seems it no longer gets applied. So I added the following to application.properties:
spring.data.neo4j.password=neo4j
spring.data.neo4j.username=neo4j
My second question is: how can I configure neo4j the same way I did when using only neo4j? I have defined the following in ogm.properties. How can I apply this to the neo4j configuration?
#Driver, required
driver=org.neo4j.ogm.drivers.bolt.driver.BoltDriver
#URI of the Neo4j database, required. If no port is specified, the
#default port 7687 is used. Otherwise, a port can be specified with
#bolt://neo4j:password@localhost:1234
URI=bolt://neo4j:neo4j@localhost
#Connection pool size (the maximum number of sessions per URL),
#optional, defaults to 50
connection.pool.size=150
#Encryption level (TLS), optional, defaults to REQUIRED. Valid
#values are NONE,REQUIRED
encryption.level=NONE
With the above changes, it now gives the following error.
org.neo4j.ogm.exception.MappingException: No identity field found for class: org.rozzie.processor.models.dao.cassandra.FlightDAO
Note that a Neo4j OGM exception is thrown for the Cassandra models. What is happening under the hood? How can I configure these two databases with Spring Boot in one project as above?
This looks like the Spring Boot autoconfiguration is not able to handle multiple Spring Data projects at the same time.
Please refer to the documentation for Spring Data Neo4j and Spring Data Cassandra.
In particular, you should point the SDN module to the neo4j repositories only:
@EnableNeo4jRepositories(basePackages = "org.test.project.repositories.neo")
and similarly for Cassandra.
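For example, a sketch reusing the package names and TestApp class from the question:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.cassandra.repository.config.EnableCassandraRepositories;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;

// Each Spring Data module is pointed at its own repository package, so the Cassandra
// repositories are no longer picked up by Neo4j (and vice versa).
@SpringBootApplication
@EnableNeo4jRepositories(basePackages = "org.test.project.repositories.neo")
@EnableCassandraRepositories(basePackages = "org.test.project.repositories.cassandra")
public class TestApp {

    public static void main(String[] args) {
        SpringApplication.run(TestApp.class, args);
    }
}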
I have used neo4j with Mongo and I don't see any issues; I presume it should be the same with Cassandra. This is all the configuration I have:
@SpringBootApplication
@EnableConfigurationProperties
@EnableNeo4jRepositories("com.in.neo4j.repository.neo")
@EnableMongoRepositories("com.in.neo4j.repository.mongo")
public class Neo4JApplication {

    public static void main(String[] args) {
        SpringApplication.run(Neo4JApplication.class, args);
    }
}
And in my properties file I have:
spring.data.neo4j.username=neo4j
spring.data.neo4j.password=admin
spring.data.mongodb.database=blah
spring.data.mongodb.host=blahblah
spring.data.mongodb.port=27017

How to Configure Dependency Injection in a Library Project?

How to Configure Dependency Injection in a Library Project?
Let me illustrate this question with the following example.
Maven Library Project
ReservationAPI
com.example.reservation-api
This project contains a convenience class called ReservationApiClient which uses a RestTemplate (from the Spring Framework) for making HTTP calls.
Is it possible to make the RestTemplate field @Autowired in this library project instead of instantiating it myself?
Maven Executable Project
org.company.application
This project is a Spring Boot application and uses the above ReservationAPI as a dependency. This app will create a @Bean for the convenience class ReservationApiClient contained in that library and will then execute its public methods, which in turn make HTTP requests.
What is a good strategy and/or best practices for the scenario described above?
You can do this if you include autowiring in your library project, although that means the class would always need to be used within a Spring application context unless you also provide getter/setter methods. However, I don't think autowiring the RestTemplate makes much sense, since there is nothing specific about a RestTemplate and, unless you name the beans, there is only one bean definition per class; all of the RestTemplate methods take the URI anyhow. So in this case I would just define a bean for your ReservationApiClient in your application.
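A minimal sketch of that approach in the consuming application (the configuration class name is made up, and a ReservationApiClient constructor taking a RestTemplate is assumed):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestTemplate;

@Configuration
public class ReservationClientConfig {

    // The application owns the RestTemplate and hands it to the library class,
    // so the library itself needs no Spring wiring at all.
    @Bean
    public ReservationApiClient reservationApiClient() {
        return new ReservationApiClient(new RestTemplate());
    }
}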
Another way, if you want to include Spring dependencies in your library (which I guess you already do by using RestTemplate), is to declare your ReservationApiClient as a @Service or @Component and then use the @ComponentScan annotation in your main Spring Boot project to search that library for components to include in the bean registry.
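A sketch of that second approach on the library side (the base package and constructor are assumptions; the class name is from the question):
package com.example.reservation;

import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

// Declared as a component so a consumer's @ComponentScan("com.example.reservation")
// can register it; the consuming application supplies the RestTemplate bean.
@Component
public class ReservationApiClient {

    private final RestTemplate restTemplate;

    public ReservationApiClient(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }
}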
Another option is to use a feature like Spring Boot's auto-configuration to create factories that use third-party libraries and configure them from properties in your application settings. The auto-configuration documentation would be a good place to start. You can look at the starter projects they have on GitHub and the associated auto-configure classes.
Let me know if any of this does not make sense.
