Multi-tenancy with a separate database per customer, using Spring Data ArangoDB

So far, the only way I know to set the name of the database to use with Spring Data ArangoDB is by hardcoding it in a database() method while extending AbstractArangoConfiguration, like so:
@Configuration
@EnableArangoRepositories(basePackages = { "com.company.mypackage" })
public class MyConfiguration extends AbstractArangoConfiguration {

    @Override
    public ArangoDB.Builder arango() {
        return new ArangoDB.Builder();
    }

    @Override
    public String database() {
        // Name of the database to be used
        return "example-database";
    }
}
What if I'd like to implement multi-tenancy, where each tenant has data in a separate database and use e.g. a subdomain to determine which database name should be used?
Can the database used by Spring Data ArangoDB be determined at runtime, dynamically?
This question is related to the discussion here: Manage multi-tenancy ArangoDB connection - but is Spring Data ArangoDB specific.

Turns out this is delightfully simple: just change the ArangoConfiguration database() method @Override to return a Spring Expression Language (SpEL) expression:
@Override
public String database() {
    return "#{tenantProvider.getDatabaseName()}";
}
which in this example references a TenantProvider @Component that can be implemented like so:
@Component
public class TenantProvider {

    private final ThreadLocal<String> databaseName;

    public TenantProvider() {
        super();
        databaseName = new ThreadLocal<>();
    }

    public String getDatabaseName() {
        return databaseName.get();
    }

    public void setDatabaseName(final String databaseName) {
        this.databaseName.set(databaseName);
    }
}
This component can then be @Autowired anywhere in your code to set the database name, such as in a servlet filter (sketched below), or in my case in an Apache Camel route Processor and in database service methods.
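For example, here is a minimal sketch of such a servlet filter, assuming the tenant is simply the subdomain of the request host; the class name, package choices, and subdomain-to-database mapping are illustrative, not from the original code:

import java.io.IOException;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

@Component
public class TenantFilter extends OncePerRequestFilter {

    @Autowired
    private TenantProvider tenantProvider;

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
                                    FilterChain filterChain) throws ServletException, IOException {
        // e.g. "acme.example.com" -> tenant "acme"; how you map subdomain to database name is up to you
        String host = request.getServerName();
        String subdomain = host.split("\\.")[0];
        tenantProvider.setDatabaseName(subdomain);
        filterChain.doFilter(request, response);
    }
}

Because TenantProvider stores the name in a ThreadLocal, you may also want to clear it when the request completes so pooled threads don't carry a stale tenant into the next request.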
P.S. I became aware of this possibility by reading the ArangoTemplate code, a Spring Expression support section of the documentation, and one merged pull request.

Spring use different property files depending on request params

Background:
I am working on a java Spring REST microservice that needs to work with multiple identical back-end systems and multiple identical databases depending on the request parameters.
Basically I have 3 "brands". For each brand there is a set of downstream services and a database. I have no control over those.
My spring service will receive brand as a part of request and will need to call the right downstream services and use the correct database.
Previously I would deal with this by having a separate instance of the Spring service for each of the brands. There would be a single property file per brand, and Spring would use it to wire up the beans. I would have separate URLs for each brand and there was no problem.
Some of my beans need to know about the "brand" during creation, as they are wrappers around connections to downstream services. I.e. once the bean is created, there won't be a way to switch it to a "different brand".
Problem:
I would like to change this so that a single instance of my service can handle requests for any brand.
Requirements:
I was thinking about the following solution:
Have a general property file for non-branded stuff. Spring would wire any non-branded beans and keep them as singleton beans.
Have a property file with brand-specific URLs etc. for each of the brands
Spring would create a set of singleton beans for each of the brands using the appropriate property file.
Next, when a request comes in, Spring would read the request params and use the bean specific to that brand.
Performance is important to me so I would like to reuse the beans as much as possible.
I would like to make this thing as transparent as possible so that people creating new beans don't have to worry about doing anything outside standard configuration/context class.
Does anyone know what would be the best solution to achieve this?
I think you can solve the problem by injecting the service into every request with the right set of configurations and beans, possibly already existing in your ApplicationContext.
Given:
$ curl http://localhost:8080/greetings/rodo && echo
Hi from brand1, rodo
$ curl -H "x-brand-name: brand1" http://localhost:8080/greetings/rodo
Hi from brand1, rodo
$ curl -H "x-brand-name: brand2" http://localhost:8080/greetings/rodo && echo
Hi from brand2, rodo
The following code would work:
-- application.yml --
brand1:
  greetingPrefix: Hi from brand1,
brand2:
  greetingPrefix: Hi from brand2,
-- DemoApplication.java --
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Configuration
    static class ServiceConfig {

        @Bean
        public GreetingService greetingServiceBrand1(Brand1Config config) {
            return new GreetingService(config);
        }

        @Bean
        public GreetingService greetingServiceBrand2(Brand2Config config) {
            return new GreetingService(config);
        }
    }

    @Configuration
    static class WebConfig implements WebMvcConfigurer {

        @Autowired
        private ApplicationContext applicationContext;

        @Override
        public void addArgumentResolvers(List<HandlerMethodArgumentResolver> resolvers) {
            resolvers.add(greetingServiceResolver());
        }

        private GreetingServiceResolver greetingServiceResolver() {
            GreetingService greetingServiceBrand1 = applicationContext.getBean("greetingServiceBrand1", GreetingService.class);
            GreetingService greetingServiceBrand2 = applicationContext.getBean("greetingServiceBrand2", GreetingService.class);
            return new GreetingServiceResolver(greetingServiceBrand1, greetingServiceBrand2);
        }
    }
}
@RestController
@RequestMapping("/greetings")
class GreetingController {

    @GetMapping("/{name}")
    public String get(GreetingService greetingService, @PathVariable String name) {
        return greetingService.sayHi(name);
    }
}
class GreetingServiceResolver implements HandlerMethodArgumentResolver {

    private final GreetingService greetingServiceBrand1;
    private final GreetingService greetingServiceBrand2;

    public GreetingServiceResolver(GreetingService greetingServiceBrand1, GreetingService greetingServiceBrand2) {
        this.greetingServiceBrand1 = greetingServiceBrand1;
        this.greetingServiceBrand2 = greetingServiceBrand2;
    }

    @Override
    public boolean supportsParameter(MethodParameter parameter) {
        return parameter.getParameterType().equals(GreetingService.class);
    }

    @Override
    public Object resolveArgument(
            MethodParameter methodParameter,
            ModelAndViewContainer modelAndViewContainer,
            NativeWebRequest nativeWebRequest,
            WebDataBinderFactory webDataBinderFactory
    ) throws Exception {
        String brand = nativeWebRequest.getHeader("x-brand-name");
        return resolveGreetingService(brand);
    }

    private GreetingService resolveGreetingService(String brand) {
        if ("brand2".equals(brand)) {
            return greetingServiceBrand2;
        }
        return greetingServiceBrand1; // default
    }
}
class GreetingService {

    private BaseConfig config;

    public GreetingService(BaseConfig config) {
        this.config = config;
    }

    public String sayHi(String name) {
        return config.getGreetingPrefix() + " " + name;
    }
}
abstract class BaseConfig {

    private String greetingPrefix;

    public String getGreetingPrefix() {
        return greetingPrefix;
    }

    public void setGreetingPrefix(String greetingPrefix) {
        this.greetingPrefix = greetingPrefix;
    }
}
@Configuration
@ConfigurationProperties("brand1")
class Brand1Config extends BaseConfig {
}

@Configuration
@ConfigurationProperties("brand2")
class Brand2Config extends BaseConfig {
}
As you can see, the key is to pass the service to each controller method, write a resolver, and inject the right set of dependencies depending on a parameter passed with the request, in this case via a header.
Since your property files need to be declared statically anyway, you can just write all your brand-specific settings in the same property file, in a key-value format that Spring can pick up as a list of configurations; a binding sketch follows the example below.
brandConfigs:
  - brand: foo
    property: foos
  - brand: bar
    property: bars
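A rough sketch of binding such a list with @ConfigurationProperties (class and property names here are illustrative, not part of the original answer):

import java.util.ArrayList;
import java.util.List;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;

@Configuration
@ConfigurationProperties // no prefix: binds brandConfigs from the root of the property file
public class BrandProperties {

    private List<BrandConfig> brandConfigs = new ArrayList<>();

    public List<BrandConfig> getBrandConfigs() { return brandConfigs; }
    public void setBrandConfigs(List<BrandConfig> brandConfigs) { this.brandConfigs = brandConfigs; }

    public static class BrandConfig {
        private String brand;
        private String property;

        public String getBrand() { return brand; }
        public void setBrand(String brand) { this.brand = brand; }
        public String getProperty() { return property; }
        public void setProperty(String property) { this.property = property; }
    }
}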
Load all your connection beans to your downstream services on startup and just route to them according to your request param. IMO this seems to be the most straightforward and performant way. If some of these downstreams are used very rarely you can lazy-load the beans on demand, but that probably wouldn't make sense unless you have thousands of different downstream routes.

How to cache data during application startup in Spring boot application

I have a Spring Boot application connecting to a SQL Server database. I need some help in using caching in my application. I have a CodeCategory table which holds a list of codes for many categories. This table is loaded every month and the data changes only once a month.
I want to cache this entire table when the application starts. Any subsequent calls to the table should get values from this cache instead of calling the database.
For Example,
List<CodeCategory> findAll();
I want to cache the above DB query result during application startup. A DB call like List<CodeCategory> findByCodeValue(String code) should then fetch the result from the already cached data instead of calling the database.
Please let me know how this can be achieved using spring boot and ehcache.
As pointed out, it takes some time for Ehcache to set up, and it does not work completely with @PostConstruct. In that case, make use of ApplicationStartedEvent to load the cache.
GitHub Repo: spring-ehcache-demo
@Service
class CodeCategoryService {

    @Autowired
    private CodeCategoryRepository repo;

    @EventListener(classes = ApplicationStartedEvent.class)
    public void listenToStart(ApplicationStartedEvent event) {
        this.repo.findByCodeValue("100");
    }
}

interface CodeCategoryRepository extends JpaRepository<CodeCategory, Long> {

    @Cacheable(value = "codeValues")
    List<CodeCategory> findByCodeValue(String code);
}
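For the @Cacheable annotation above to take effect, caching also has to be enabled. A minimal sketch, assuming the Ehcache cache named "codeValues" is declared in an ehcache.xml that Spring Boot is pointed at (for example via spring.cache.jcache.config=classpath:ehcache.xml):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;

@SpringBootApplication
@EnableCaching // without this, the @Cacheable annotation on the repository is ignored
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}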
Note: there are multiple ways, as pointed out by others. You can choose as per your needs.
My way is to define a generic cache handler
@FunctionalInterface
public interface GenericCacheHandler {
    List<CodeCategory> getCodes();
}
And its implementation as below
@Component
@EnableScheduling // Important
public class GenericCacheHandlerImpl implements GenericCacheHandler {

    @Autowired
    private CodeRepository codeRepo;

    private List<CodeCategory> codes = new ArrayList<>();

    @PostConstruct
    private void initializeBudgetState() {
        List<CodeCategory> codeList = codeRepo.findAll();
        // Any customization goes here
        codes = codeList;
    }

    @Override
    public List<CodeCategory> getCodes() {
        return codes;
    }
}
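Since the underlying table only changes about once a month, the @EnableScheduling above presumably pairs with a periodic refresh. A method along these lines could be added to GenericCacheHandlerImpl (the cron expression is an assumption, not part of the original answer):

@Scheduled(cron = "0 0 2 1 * *") // hypothetical schedule: 02:00 on the 1st day of every month
public void refreshCodes() {
    // re-read the table and swap in the fresh list
    codes = codeRepo.findAll();
}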
Call it in the service layer as below:
@Service
public class CodeServiceImpl implements CodeService {

    @Autowired
    private GenericCacheHandler genericCacheHandler;

    @Override
    public List<CodeCategory> anyMethod() {
        return genericCacheHandler.getCodes();
    }
}
Use second-level Hibernate caching to cache all the required DB queries.
For caching at application start-up, we can use @PostConstruct in any of the service classes.
The syntax will be:
@Service
public class AnyService {

    @PostConstruct
    public void init() {
        // call any method to warm the cache
    }
}
Use the CommandLineRunner interface.
Basically, you can create a Spring @Component and implement the CommandLineRunner interface. You will have to override its run method, which will be called at the start of the app.
@Component
public class DatabaseLoader implements CommandLineRunner {

    @Override
    public void run(String... args) throws Exception {
        // Any code here gets called at the start of the app.
    }
}
This approach is mostly used to bootstrap the application with some initial data.

How to configure spring-data-mongodb to use 2 different mongo instances sharing the same document model

I work for a company that has multiple brands, therefore we have a couple of MongoDB instances on different hosts holding the same document model for our Customer on each of these brands (same structure, not the same data).
For the sake of simplicity, let's say we have an Orange brand with a database instance serving on port 27017, and a Banana brand with a database instance serving on port 27018.
Currently I'm developing a fraud detection service which is required to connect to all databases and analyze all the customers' behavior together regardless of the brand.
So my "model" has a shared entity for Customer, annotated with #Document (org.springframework.data.mongodb.core.mapping.Document)
Next thing I have is two MongoRepositories such as:
public interface BananaRepository extends MongoRepository<Customer, String> {
    List<Customer> findAllByEmail(String email);
}

public interface OrangeRepository extends MongoRepository<Customer, String> {
    List<Customer> findAllByEmail(String email);
}

with some stub methods for finding customers by id, email, and so on. Spring is responsible for generating the implementation classes for such interfaces (pretty standard Spring stuff).
In order to point each of these repositories at the right MongoDB instance, I need two Mongo configs, such as:
@Configuration
@EnableMongoRepositories(basePackageClasses = {Customer.class})
public class BananaConfig extends AbstractMongoConfiguration {

    @Value("${database.mongodb.banana.username:}")
    private String username;

    @Value("${database.mongodb.banana.database}")
    private String database;

    @Value("${database.mongodb.banana.password:}")
    private String password;

    @Value("${database.mongodb.banana.uri}")
    private String mongoUri;

    @Override
    protected Collection<String> getMappingBasePackages() {
        return Collections.singletonList("com.acme.model");
    }

    @Override
    protected String getDatabaseName() {
        return this.database;
    }

    @Override
    @Bean(name = "bananaClient")
    public MongoClient mongoClient() {
        final String authString;
        // todo: Use MongoCredential
        // todo: Use ServerAddress
        // (See https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#repositories) 10.3.4
        if (valueIsPresent(username) || valueIsPresent(password)) {
            authString = String.format("%s:%s@", username, password);
        } else {
            authString = "";
        }
        String connectionString = "mongodb://" + authString + mongoUri + "/" + database;
        System.out.println("Going to connect to: " + connectionString);
        return new MongoClient(new MongoClientURI(connectionString, builder()
                .connectTimeout(5000)
                .socketTimeout(8000)
                .readPreference(ReadPreference.secondaryPreferred())
                .writeConcern(ACKNOWLEDGED)));
    }

    @Bean(name = "bananaTemplate")
    public MongoTemplate mongoTemplate(@Qualifier("bananaFactory") MongoDbFactory mongoFactory) {
        return new MongoTemplate(mongoFactory);
    }

    @Bean(name = "bananaFactory")
    public MongoDbFactory mongoFactory() {
        return new SimpleMongoDbFactory(mongoClient(), getDatabaseName());
    }

    private static int sizeOfValue(String value) {
        if (value == null) return 0;
        return value.length();
    }

    private static boolean valueIsMissing(String value) {
        return sizeOfValue(value) == 0;
    }

    private static boolean valueIsPresent(String value) {
        return !valueIsMissing(value);
    }
}
I also have similar config for Orange which points to the proper mongo instance.
Then I have my service like this:
public List<? extends Customer> findAllByEmail(String email) {
    return Stream.concat(
            bananaRepository.findAllByEmail(email).stream(),
            orangeRepository.findAllByEmail(email).stream())
        .collect(Collectors.toList());
}
Notice that I'm calling both repositories and then collecting the results back into one single list.
What I would expect to happen is that each repository would connect to its corresponding mongo instance and query for the customer by its email.
But this doesn't happen. The query is always executed against the same Mongo instance.
In the database log I can see both connections being made by Spring, but it only uses one of them to run the queries for both repositories.
This is not surprising, as both Mongo configs point to the same model package here. Right. But I also tried other approaches, such as creating a BananaCustomer extends Customer in its own model.banana package and an OrangeCustomer extends Customer in a model.orange package, along with specifying the proper basePackageClasses in each config. That didn't work either; I still ended up with both queries running against the same database.
:(
After scavenging the official spring-data-mongodb documentation for hours and looking through thousands of lines of code here and there, I've run out of options: it seems like nobody has done what I'm trying to accomplish before.
Except for this guy here, who had to do the same thing but using JPA instead of MongoDB: Link to article
Well, while it's still Spring Data, it's not for MongoDB.
So here is my question:
How can I explicitly tell each repository to use a specific Mongo config?
Magical autowiring rules, except when it doesn't work and nobody understands the magic.
Thanks in advance.
Well, I had a very detailed answer, but Stack Overflow complained that it looked like spam and didn't allow me to post it.
The full answer is still available as a Gist file here.
The bottom line is that both MongoRepository (interface) and the model object must be placed in the same package.
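In other words, a layout roughly like the following, with one package per brand, should let each config see only its own repository and model (package, class, and bean names here are illustrative, not from the Gist):

// com.acme.fraud.banana contains BananaCustomer (the @Document) and BananaRepository
// com.acme.fraud.orange contains OrangeCustomer and OrangeRepository

@Configuration
@EnableMongoRepositories(basePackages = "com.acme.fraud.banana", mongoTemplateRef = "bananaTemplate")
public class BananaConfig extends AbstractMongoConfiguration {

    @Override
    public MongoClient mongoClient() {
        return new MongoClient("localhost", 27018); // the Banana instance
    }

    @Override
    protected String getDatabaseName() {
        return "banana";
    }

    @Bean(name = "bananaTemplate")
    public MongoTemplate bananaTemplate() {
        return new MongoTemplate(mongoClient(), getDatabaseName());
    }
}

// OrangeConfig mirrors this, with basePackages pointing at com.acme.fraud.orange,
// mongoTemplateRef at an "orangeTemplate" bean, and the client at port 27017.

The point is that each @EnableMongoRepositories then scans only its own repository/model pair instead of a shared package, so the repositories no longer collapse onto a single connection.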

How to set tableName dynamically using environment variable in spring boot?

I am using AWS ECS to host my application and DynamoDB for all database operations. So I'll have the same database with different table names for different environments, such as "dev_users" (for the dev env), "test_users" (for the test env), etc. (This is how our company uses the same Dynamo account for different environments.)
So I would like to change the "tableName" of the model class using the environment variable passed through "AWS ECS task definition" environment parameters.
For Example.
My Model Class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables, but I'm not sure how to use variables outside the class. Is there any way that I can use the Docker env variable to select my table prefix?
My intent is to use the environment variable directly in the table name. I know this is not possible as such, but is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before writing up the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class package name as the "basePackage" path. (Please check the image.)
For 1) I have replaced the "table name over-rider bean injection" to avoid the error.
For 2) I printed the name that is passed to this method, but it is null. So I am checking all the possible ways to pass the value here.
Check the image for the error:
I haven't changed anything in my user model class, as the beans should replace the name of the DynamoDBTable when they are executed. But the table name overriding is not happening; data is pulled only from the table name given at the model class level.
What am I missing here?
The table names can be altered via a customized DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add the bean as shown below. Here the prefix can be the environment name.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ... // Use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
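Tying this to the DOCKER_ENV variable from the question, the prefix could for instance be injected like this (a sketch; only the property name comes from the question):

@Value("${DOCKER_ENV:dev}")
private String dockerEnv;

@Bean
public TableNameOverride tableNameOverrider() {
    // e.g. DOCKER_ENV=test turns a bare table name like "users" into "test_users";
    // note that with a prefix, the annotation should then declare the bare name,
    // e.g. @DynamoDBTable(tableName = "users") instead of "dev_users"
    return TableNameOverride.withTableNamePrefix(dockerEnv + "_");
}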
For more details, check out the full write-up here:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
I was able to get table names prefixed with the active profile name.
First, I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {

    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder().withTableNameResolver(new TableNameResolver(envProfile)).build());
    return mapper;
}
The envProfile variable holds the active profile value, accessed from the application.properties file:
#Value("${spring.profiles.active}")
private String envProfile;
We had the same issue with regard to changing table names at runtime. We are using spring-data-dynamodb 5.0.2, and the following configuration seems to provide the solution we need.
First, I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which is wired by Spring via SpEL:
#Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method using the DynamoDBMapperConfig builder. But this also does the job.
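That alternative would look roughly like this (a sketch of the variant described above, not the configuration we actually ran):

@Bean
public DynamoDBTypeConverterFactory dynamoDBTypeConverterFactory() {
    return DynamoDBTypeConverterFactory.standard();
}

@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride,
        DynamoDBTypeConverterFactory typeConverterFactory) {
    return new DynamoDBMapperConfig.Builder()
            .withTableNameOverride(tableNameOverride)
            .withTypeConverterFactory(typeConverterFactory)
            .build();
}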
I upvoted the other answer, but here is an idea:
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
#Profile(value= {"dev","default"})
#Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}
#Profile(value= {"prod"})
#Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {

    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}

Dynamic applicationpath

A new application of ours uses multi-tenancy with multiple databases. By providing a tenant id in the URL, we can select the right datasource.
But by using that kind of method, the namespace of the URL becomes dynamic (e.g. instead of /api the URL changes to /{id}/api). So is it possible to use a dynamic @ApplicationPath?
Just as it is possible to use a variable in the @Path annotation, could I write something like @ApplicationPath("/tenants/{id}/api")?
It seems @ApplicationPath does not support dynamic segments. In the end we fixed it by using sub-resources:
Config
#ApplicationPath("tenants")
public class TenantConfig extends ResourceConfig {
public TenantConfig(ObjectMapper mapper) {
//set provider + add mapper
register(TenantsController.class);
}
}
TenantsController
#Path("/{id}/api")
public class TenantsController {
//register all your controllers including path here
#Path("/somethings")
public Class<SomethingController> something() {
return SomethingController.class;
}
}
SomethingController
@Component
// Don't use @Path, as the path info is already defined in the TenantsController
public class SomethingController {

    // do your stuff here
    @GET
    @Path("/{id}") // The path for this example would be /tenants/{id}/api/somethings/{id}
    public JsonApiResult get(@PathParam("id") int id) {
        // retrieve one something
    }
}
