I have a requirement that each client's data must be stored in a separate database.
I would like to achieve the following structure:
A global microservice handles authentication and also provides information about the database in which each client's data is stored.
The other microservices, when requested, query the auth service for the client's database information, and only then is the entity manager produced.
I am struggling to properly manage the state of the EntityManagerFactory instances.
I tried storing them in a WeakHashMap, but odd bugs started to appear, such as a simple findById throwing exceptions.
I am using Java EE with DeltaSpike Data on a Payara server.
Has anyone done this with a similar stack?
If you are using bean-managed transactions, CDI makes it straightforward to manage this kind of entity manager factory resource.
First, create a datasource qualifier annotation.
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({TYPE, PARAMETER, FIELD, METHOD})
public @interface Datasource {

    /**
     * This may be the database url or whatever.
     */
    @Nonbinding
    String value() default "";
}
@SuppressWarnings("AnnotationAsSuperInterface")
public class DatasourceLiteral extends AnnotationLiteral<Datasource> implements Datasource {

    private static final long serialVersionUID = 7485753390480718735L;

    private final String dbName;

    public DatasourceLiteral(final String dbName) {
        this.dbName = dbName;
    }

    @Override
    public String value() {
        return dbName;
    }
}
@ApplicationScoped
public class EntityManagerFactoryProvider {

    @Produces
    @Datasource
    @ApplicationScoped
    public EntityManagerFactory entityManagerFactory(final InjectionPoint ip) {
        final Annotated annotated = ip.getAnnotated();
        final Datasource datasource = annotated.getAnnotation(Datasource.class);

        // Add any other relevant JPA properties.
        final Map<String, String> jpaProperties = new HashMap<>();

        // The main point is here: the JDBC URL comes from the qualifier value.
        jpaProperties.put("javax.persistence.jdbc.url", datasource.value());
        return Persistence.createEntityManagerFactory("persistence-unit-jpa", jpaProperties);
    }

    public void dispose(@Disposes @Datasource final EntityManagerFactory emf) {
        emf.close();
    }
}
@ApplicationScoped
public class ExampleUserDatasource {

    @Any
    @Inject
    private Instance<EntityManagerFactory> entityManagerFactories;

    @Inject
    private AuthenticationService authenticationService;

    public void doSomething(final String user) {
        final UserInfo userInfo = authenticationService.getUser(user);
        final Datasource datasource = new DatasourceLiteral(userInfo.getDatasource());
        final EntityManagerFactory entityManagerFactory = entityManagerFactories.select(datasource).get();

        // You could also inject this directly. Use it inside a transaction,
        // and close it when you are done.
        final EntityManager entityManager = entityManagerFactory.createEntityManager();
    }
}
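As an aside on the original WeakHashMap attempt: a WeakHashMap entry can be cleared whenever its key is no longer strongly referenced, so a cached factory may silently vanish between calls, which fits the erratic findById failures. If you prefer managing the factories yourself rather than via CDI scoping, a ConcurrentHashMap keyed by database URL is a safer sketch. The Function below is a stand-in for Persistence.createEntityManagerFactory, not your actual code:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Caches one factory per client database URL. Unlike WeakHashMap,
// entries are only removed explicitly, so a factory cannot disappear
// between a lookup and its use.
final class FactoryCache<F> {

    private final Map<String, F> cache = new ConcurrentHashMap<>();
    private final Function<String, F> factoryCreator;

    FactoryCache(Function<String, F> factoryCreator) {
        this.factoryCreator = factoryCreator;
    }

    // Atomically creates the factory on first access for a given URL.
    F forDatabase(String jdbcUrl) {
        return cache.computeIfAbsent(jdbcUrl, factoryCreator);
    }
}
```

Remember to close and evict the factories explicitly on shutdown; the map will not do that for you.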
I am working on an application with a WebSocket endpoint and want to save each client's id and session in a manager, but I am having difficulty understanding how to do this correctly when I also want to reach the manager from another class via autowiring.
public class Client {

    private String id;
    private Session session;
    private MessageHandler handler;

    Client(String id, Session session, MessageHandler handler) {
        this.id = id;
        this.session = session;
        this.handler = handler;
    }
}
public class ClientsManager {

    private Set<Client> clientSet = new CopyOnWriteArraySet<>();

    public Set<Client> getClients() {
        return this.clientSet;
    }

    public void addClient(Client client) {
        this.clientSet.add(client);
    }

    public void removeClient(Client client) {
        clientSet.remove(client);
    }
}
public class WebsocketServerEndpoint {

    public static final ClientsManager manageClients = new ClientsManager();

    private Client client;

    @OnOpen
    public void onOpen(Session session, @PathParam("connectId") String connectId) throws IOException, EncodeException {
        MessageHandler messageHandler = new MessageHandler();
        Client client = new Client(connectId, session, messageHandler);
        this.client = client;
        manageClients.addClient(client);
    }

    ....
}
From another class:
public class DoSomething {

    @Autowired
    WebsocketServerEndpoint serverEndpoint;

    public int doSomething() {
        int numberOfClients = serverEndpoint.manageClients.getClients().size();
        return numberOfClients;
    }
}
As I understand it, this is not correct: you should not autowire static fields.
When I debug, I can see that serverEndpoint is null in my DoSomething class, yet I get 1 connected client when one is connected, and so on.
When I do it like this, I get the right number of clients in the DoSomething class.
Have I just misunderstood this and it works as I have done it, or how should I do it instead?
Is there a better way to write my Client and ClientsManager classes?
From what I have read, if I want to autowire anyway, there are two possible ways:
Using constructor @Autowired for a static field
Using @PostConstruct to set the value of a static field
But how does this work when I want to instantiate "public static final ClientsManager manageClients = new ClientsManager();"?
Sorry for the basic question, but I feel I do not fully understand this.
If you would like to understand more about this topic, search for Spring dependency injection; here is a short summary.
To be able to @Autowire a class, you have to register it as a @Bean or annotate it with @Service or @Component.
To create beans, first create a @Configuration class with one or more @Bean methods inside:
@Configuration
public class Configuration {

    @Value("${configuration.property.name}")
    String username;

    @Bean
    public WebsocketServerEndpoint websocketServerEndpoint() {
        return new WebsocketServerEndpoint();
    }
}
@Value is not necessary here; it is just worth mentioning that with this annotation you can read a property from Spring's application.properties file.
At this point you have created a @Bean instance of your class, and it is registered as a singleton. You can get this single instance from anywhere in your application; you just have to autowire it.
Alternatively, use constructor-based dependency injection (field @Autowired is not preferred).
You don't have to create @Bean methods; just add the @Component annotation to the class you want to autowire. Below is constructor injection:
@Component
public class WebsocketServerEndpoint {

    public String test() {
        return "test";
    }
}
@RestController
public class DoSomething {

    private final WebsocketServerEndpoint websocketHandler;

    public DoSomething(WebsocketServerEndpoint websocketHandler) {
        this.websocketHandler = websocketHandler;
    }

    @GetMapping(value = "/test")
    public String test() {
        return websocketHandler.test();
    }
}
You can even test this endpoint with a curl GET request: curl http://localhost:8080/test
I would like to benefit from @ConfigurationProperties' fantastic facilities without exposing the bean in my context. It is not a problem of @Primary beans and the like; I simply cannot expose another DataSource in the context. How can I achieve the following?
@ConfigurationProperties("com.non.exposed.datasource.hikari")
public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build();
    }
    return this.nonExposedDatasource;
}
Thanks to the answer by @LppEdd, the final working solution is:
@Autowired
private Environment environment;

public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = bindHikariProperties(this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build());
    }
    return this.nonExposedDatasource;
}

// This does exactly the same as @ConfigurationProperties("com.non.exposed.datasource.hikari"),
// but without requiring the DataSource to be exposed in the context as a @Bean.
private <T extends DataSource> T bindHikariProperties(final T instance) {
    return Binder.get(this.environment).bind("com.non.exposed.datasource.hikari", Bindable.ofInstance(instance)).get();
}
Then you can call your bean internally with this.privateHikariDatasource() to be used by your other beans.
Great thanks to #LppEdd!
Given that this DataSource is private to a class, and that the containing class can be/is inside the Spring context, you can have a @ConfigurationProperties class
@ConfigurationProperties("com.foo.bar.datasource.hikari")
public class HikariConfiguration { ... }
Which, by being registered via @EnableConfigurationProperties, is available for autowiring
@EnableConfigurationProperties(HikariConfiguration.class)
@SpringBootApplication
public class Application { ... }
And thus can be autowired in the containing class
@Component
class MyClass {

    private final HikariConfiguration hikariConfiguration;

    private DataSource springDatasource;

    MyClass(final HikariConfiguration hikariConfiguration) {
        this.hikariConfiguration = hikariConfiguration;
    }

    ...

    private DataSource privateSingletonDataSource() {
        if (Objects.isNull(this.springDatasource)) {
            this.springDatasource = buildDataSource(this.hikariConfiguration);
        }
        return this.springDatasource;
    }
}
buildDataSource will manually construct the DataSource instance.
Remember that you need to take care of synchronization when building the DataSource.
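Since buildDataSource is only named, not shown, here is one hedged way to make that lazy initialization thread-safe: double-checked locking over a volatile field, with a plain Supplier standing in for the hypothetical buildDataSource call:

```java
import java.util.function.Supplier;

// Thread-safe lazy initialization via double-checked locking.
// The Supplier stands in for the hypothetical buildDataSource call.
final class LazySingleton<T> {

    private final Supplier<T> factory;
    private volatile T instance;

    LazySingleton(Supplier<T> factory) {
        this.factory = factory;
    }

    T get() {
        T result = instance;          // single volatile read on the fast path
        if (result == null) {
            synchronized (this) {
                result = instance;    // re-check under the lock
                if (result == null) {
                    result = factory.get();
                    instance = result;
                }
            }
        }
        return result;
    }
}
```

The volatile field is what makes the pattern correct on the JVM; without it, another thread could observe a partially constructed instance.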
The final answer is that you cannot reuse DataSourceProperties. You can't even extend it to change the properties' prefix; only a single instance of it can exist inside the context.
The best you can do is mimic what Spring does.
Having
com.non.exposed.datasource.hikari.url=testUrl
com.non.exposed.datasource.hikari.username=testUsername
com.non.exposed.datasource.hikari.password=testPassword
...
You can define a new @ConfigurationProperties class
@ConfigurationProperties("com.non.exposed.datasource")
public class NonExposedProperties {

    private final Map<String, String> hikari = new HashMap<>(8);

    public Map<String, String> getHikari() {
        return hikari;
    }
}
Then, autowire this properties class in your @Configuration/@Component class. Follow the in-code comments.
@Configuration
public class CustomConfiguration {

    private final NonExposedProperties nonExposedProperties;

    private DataSource dataSource;

    CustomConfiguration(final NonExposedProperties nonExposedProperties) {
        this.nonExposedProperties = nonExposedProperties;
    }

    public DataSource dataSource() {
        if (Objects.isNull(dataSource)) {
            // Create a standalone instance of DataSourceProperties.
            final DataSourceProperties dataSourceProperties = new DataSourceProperties();

            // Use the NonExposedProperties "hikari" Map as the properties' source. It will be
            // {
            //    url -> testUrl
            //    username -> testUsername
            //    password -> testPassword
            //    ... other properties
            // }
            final ConfigurationPropertySource source = new MapConfigurationPropertySource(nonExposedProperties.getHikari());

            // Bind those properties to the DataSourceProperties instance.
            final BindResult<DataSourceProperties> bound =
                    new Binder(source).bind(
                            ConfigurationPropertyName.EMPTY,
                            Bindable.ofInstance(dataSourceProperties)
                    );

            // Retrieve the bound instance (it's not a new one, it's the same as before).
            dataSource = bound.get().initializeDataSourceBuilder().build();
        }

        // Return the constructed HikariDataSource.
        return dataSource;
    }
}
Context
The server runs on Spring Boot and utilizes Spring Data. The database being used is PostgreSQL.
Problem
Some of the components read from information_schema, pg_user, pg_policies, and pg_catalog. These components' @PostConstruct methods currently run before JPA schema creation does. This means the information the components try to fetch hasn't been created by JPA yet, so the components crash.
Prior Research
No errors are thrown by Hibernate itself. Running the server twice makes the problematic components run correctly. This implies that these components are running before JPA.
My properties file includes spring.jpa.hibernate.ddl-auto = update. I tried to find the code behind spring.jpa.hibernate.ddl-auto to see how I could make the components require it by way of @DependsOn, but I have yet to find anything.
I can't simply wait for ApplicationReadyEvent with an event listener, as that would break the dependencies between these components.
Code
These are my Hikari data sources:
@RequiredArgsConstructor
@Configuration
@EnableConfigurationProperties
public class DatabaseConfiguration {

    @Bean(name = "server")
    @ConfigurationProperties(prefix = "server.datasource")
    public HikariDataSource server() {
        return (HikariDataSource) DataSourceBuilder.create().build();
    }

    @Bean(name = "client")
    @ConfigurationProperties(prefix = "client.datasource")
    public HikariDataSource client() {
        return (HikariDataSource) DataSourceBuilder.create().build();
    }
}
I have a custom DataSource component.
@Component
public class DatabaseRouterBean {

    private final AwsCognitoConfiguration cognitoConfiguration;
    private final DatabaseService databaseService;
    private final HikariDataSource server;
    private final HikariDataSource client;
    private final ModelSourceInformation modelSourceInformation;

    public DatabaseRouterBean(
        @Qualifier("server") final HikariDataSource server,
        @Qualifier("client") final HikariDataSource client,
        final AwsCognitoConfiguration cognitoConfiguration,
        final DatabaseService databaseService,
        final ModelSourceInformation modelSourceInformation
    ) {
        this.server = server;
        this.client = client;
        this.cognitoConfiguration = cognitoConfiguration;
        this.databaseService = databaseService;
        this.modelSourceInformation = modelSourceInformation;
    }

    @Bean
    @Primary
    public DatabaseRouter dataSource() {
        return new DatabaseRouter(cognitoConfiguration, databaseService, server, client, modelSourceInformation);
    }
}
The following is the implementation of the data source.
// could have a better name
@RequiredArgsConstructor
@Log4j2
public class DatabaseRouter implements DataSource {

    private final AwsCognitoConfiguration config;
    private final DatabaseService databaseService;
    private final HikariDataSource superuser;
    private final HikariDataSource user;
    private final ModelSourceInformation modelSourceInformation;
The custom data source component is used to create connections for entity managers using one of two accounts on the database for the purpose of multi-tenancy. One account is superuser while the other is a limited user account. Multi-tenancy is achieved through the use of policies. The custom data source runs SET_CONFIG on the connection.
DatabaseService is a very low level service class that supports reading from information_schema, pg_user, pg_policies, and pg_catalog.
@Service
@Log4j
public class DatabaseServiceImpl implements DatabaseService {

    private final HikariDataSource server;
    private final HikariDataSource client;
ModelSourceInformation has no dependencies. It is used to convert a class type into a configuration variable name and vice versa. It is used by the custom data source to populate SET_CONFIG based on the type of user. It supports defining configuration variables and tying them to models by way of annotations.
AwsCognitoConfiguration is simply a Configuration class that reads the cognito settings from the properties file.
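The routing described above might look roughly like the following sketch. This is not the poster's actual code: the UserContext type, isSuperuser(), and the "app.current_tenant" setting name are all assumptions.

```java
// Hypothetical getConnection for the router: pick a pool by role,
// then scope the connection to the tenant so PostgreSQL policies can
// filter rows via current_setting('app.current_tenant').
@Override
public Connection getConnection() throws SQLException {
    final UserContext ctx = databaseService.currentUserContext();

    // Route to the superuser pool or the limited-user pool.
    final HikariDataSource pool = ctx.isSuperuser() ? superuser : user;
    final Connection connection = pool.getConnection();

    try (PreparedStatement ps = connection.prepareStatement(
            "SELECT set_config('app.current_tenant', ?, false)")) {
        ps.setString(1, ctx.tenantId());
        ps.execute();
    }
    return connection;
}
```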
Defined Execution Order By Dependency
1. DatabaseConfiguration, ModelSourceInformation, AwsCognitoConfiguration
2. DatabaseService
3. DatabaseRouter
4. JPA
5. Rest of beans
The following components are initialized before JPA, but they need to be initialized after JPA. There are dependencies between them.
ModelDynamismInformation
ModelEntityInformation
ModelInformation
ModelPrimaryKeyInformation
ModelSchemaInformation
ModelSecurityInformation
PolicyInitializer
You can use @DependsOn to control the order in which beans are initialized. A bean depending on the EntityManagerFactory should be initialized after Hibernate has done its schema creation.
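A hedged sketch of what that could look like: "entityManagerFactory" is the conventional bean name of Spring Boot's auto-configured LocalContainerEntityManagerFactoryBean, but verify the actual name in your own context, and the init body is illustrative only.

```java
// Forces this component to initialize after the EntityManagerFactory bean,
// i.e. after ddl-auto=update has created the schema.
@Component
@DependsOn("entityManagerFactory")
public class PolicyInitializer {

    private final DatabaseService databaseService;

    public PolicyInitializer(DatabaseService databaseService) {
        this.databaseService = databaseService;
    }

    @PostConstruct
    public void init() {
        // By now pg_policies and information_schema reflect the
        // JPA-created schema, so reads here are safe.
    }
}
```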
My project has a dependency on another one and imports beans from it (using @ImportResource("foo.xml")).
foo.xml defines two datasources (datasource1 and datasource2); I would like to make datasource1 primary (so that all Spring Boot auto-configurations will work).
Is it possible? I found that there is a DefaultListableBeanFactory with a determinePrimaryCandidate method.
So the idea is to create my own ListableBeanFactory extending DefaultListableBeanFactory, but how do I force Spring Boot to use my implementation?
Or maybe there is another, easier way to mark a given bean as primary (without changing the configuration where it is defined)?
You can create a configuration in your project which builds a new data source annotated as a @Primary bean. This new data source will be datasource1, which Spring will inject into the new data source's factory method. Here is a working example.
The config:
@SpringBootApplication
public class BeanSpringExampleApplication {

    @Bean(name = "dataSource1")
    public FakeDataSource dataSource1() {
        return new FakeDataSource("dataSource1");
    }

    @Bean(name = "dataSource2")
    public FakeDataSource dataSource2() {
        return new FakeDataSource("dataSource2");
    }

    @Bean
    @Primary
    public FakeDataSource primaryDataSource(
            @Qualifier("dataSource1") FakeDataSource dataSource1) {
        return dataSource1;
    }
}
Here you see three beans (using the FakeDataSource class), which simulate your situation. The primaryDataSource bean factory method simply returns dataSource1 (it's a mere data source selector).
The FakeDataSource is just a placeholder to make the example runnable:
public class FakeDataSource {

    private final String fakeProperty;

    public FakeDataSource(String id) {
        fakeProperty = id;
    }

    /**
     * @return the fakeProperty
     */
    public String getFakeProperty() {
        return fakeProperty;
    }
}
Finally, a test which proves everything is working:
@RunWith(SpringRunner.class)
@SpringBootTest
public class BeanSpringExampleApplicationTests {

    @Autowired
    private FakeDataSource fakeDataSource;

    @Test
    public void should_AutowirePrimaryDataSource() throws Exception {
        assertEquals("dataSource1", fakeDataSource.getFakeProperty());
    }
}
I'm writing a client/server app and configuring it with Spring.
My client interface handles marshalling requests to the server and handling the responses.
At the moment, I have a factory that looks something like:
public class ClientFactory {

    private ApplicationContext ctx;

    public ClientFactory() {
        ctx = new AnnotationConfigApplicationContext(MyConfig.class);
    }

    public MyClient createClient(String host, int port) {
        MyClient client = ...
        // create a connection to the server
        return client;
    }
}
Now, MyClient has a bunch of dependencies that I would like to inject, so I would like to create the MyClient instance using Spring and use #Inject annotations to inject the dependencies.
How do I pass the host/port as configuration metadata into the Spring configuration? If I can't, what is the recommended alternative? I could do all the wiring myself, but that is what Spring is for.
Jeff
You should check the configuration part of the Spring reference documentation. For example, you can create beans like this with Spring 3.x:
@Configuration
// spring config that loads the properties file
@ImportResource("classpath:/properties-config.xml")
public class AppConfig {

    /**
     * Using property 'EL' syntax to load values from the
     * jetProperties value
     */
    private @Value("#{jetProperties['jetBean.name']}") String name;
    private @Value("#{jetProperties['jetBean.price']}") Long price;
    private @Value("#{jetProperties['jetBean.url']}") URL url;

    /**
     * Create a jetBean within the Spring Application Context
     * @return a bean
     */
    public @Bean(name = "jetBean")
    JetBean jetBean() {
        JetBean bean = new JetBeanImpl();
        bean.setName(name);
        bean.setPrice(price);
        bean.setUrl(url);
        return bean;
    }
}
I solved this using a static configuration class.
public class ClientFactory {

    private ApplicationContext ctx;

    public ClientFactory() {
        ctx = new AnnotationConfigApplicationContext(MyConfig.class, ServerConfig.class);
    }

    public MyClient createClient(String host, int port) {
        MyClient client = ...
        // create a connection to the server
        return client;
    }

    @Data
    @AllArgsConstructor
    public static class ServerDetails {
        private int port;
        private String host;
    }

    @Configuration
    public static class ServerConfig {

        static String host;
        static int port;

        @Bean
        public ServerDetails serverDetails() {
            return new ServerDetails(port, host);
        }
    }
}
It feels very clunky, though. Is there a better way?