Mongo not opening connections in Spring 4 Java API

I have an API we wrote using Spring 4 with a Mongo database. When the application loads into my local WAS, I can see the app go out and connect to the database. However, when I execute a method that should run a query, I get a socket closed error.
My Configuration:
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
    logger.info("loading MongoDbFactory bean");
    String PROCESS_ID_MONGO_KEY = "PROCESS_ID_MONGO";
    Credentials credentials = credentialsManager().getCredentialsFor(PROCESS_ID_MONGO_KEY);
    MongoClient mongoClient = new MongoClient(
            Arrays.asList(new ServerAddress(PropertiesManagerUtility.getKeyValue(CollectionType.CREDENTIAL, "mongo.url"), 27017)),
            Arrays.asList(MongoCredential.createPlainCredential(credentials.getUserid(), "$external", credentials.getPassword().toCharArray())),
            MongoClientOptions.builder()
                    .sslEnabled(true)
                    .connectTimeout(30)
                    .writeConcern(WriteConcern.MAJORITY)
                    .socketKeepAlive(true)
                    .build());
    return new SimpleMongoDbFactory(mongoClient, PropertiesManagerUtility.getKeyValue(CollectionType.CREDENTIAL, "mongo.db"));
}
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    logger.info("loading MongoTemplate bean");
    return new MongoTemplate(mongoDbFactory());
}
My DAO:
@Component("achResponseDMDao")
public class AchResponseDMDaoImpl implements IBasicDao<AchResponseDM> {

    @Autowired
    MongoTemplate mongoTemplate;

    public AchResponseDMDaoImpl(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public AchResponseDM findByResponseCode(String responseCode) {
        Query query = new Query(Criteria.where("responseCode").is(responseCode));
        return mongoTemplate.findOne(query, AchResponseDM.class);
    }
    ...
}
My question: I thought Spring would give me a new connection through the MongoDbFactory, but it appears the original connection gets closed and no more are created. What do I need to do? Thanks in advance.

Injecting the factory instead of the MongoTemplate instance created new connections as needed. The corresponding DaoImpl autowires a MongoDbFactory, and each method builds its template with new MongoTemplate(mongoDbFactory) before calling find(...) or whatever operation it needs.
The resulting DAO looks like:
@Component("achResponseDMDao")
public class AchResponseDMDaoImpl implements IBasicDao<AchResponseDM> {

    @Autowired
    MongoDbFactory mongoDbFactory;

    @Override
    public AchResponseDM findByResponseCode(String responseCode) {
        Query query = new Query(Criteria.where("responseCode").is(responseCode));
        List<AchResponseDM> listOfResponses = new MongoTemplate(mongoDbFactory).find(query, AchResponseDM.class);
        return (listOfResponses != null && !listOfResponses.isEmpty()) ? listOfResponses.get(0) : defaultNonNullResponse();
    }
    ...
}
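As a side note (not part of the original answer): the MongoClient built in the configuration already maintains an internal connection pool, so pool sizing and timeouts are usually tuned on MongoClientOptions rather than by creating a new template per call. A minimal sketch assuming the 3.x Java driver API; the numeric values are illustrative, not taken from the question:

MongoClientOptions options = MongoClientOptions.builder()
        .sslEnabled(true)
        .connectTimeout(30_000)         // milliseconds, not seconds
        .socketTimeout(60_000)          // fail reads/writes that hang indefinitely
        .connectionsPerHost(50)         // size of the internal connection pool
        .maxConnectionIdleTime(60_000)  // recycle idle sockets before the server or a firewall drops them
        .socketKeepAlive(true)
        .build();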

Related

How to use AbstractDataSource to switch Schema after User Log-in

My problem: I'm stuck implementing a change of schema after user login, following a Stack Overflow answer.
Description: I'm using the class below, but I have no idea how to use it. I've been reading every tutorial I can find and I'm still stuck. The results I'm expecting are:
1. Spring initializes with the default URL so the user can log in.
2. After a successful login, it switches to the schema based on the UserDetails class.
I'm following the Stack Overflow solution at: Change database schema during runtime based on logged in user
The Spring version I'm using is Spring Boot v2.3.3.RELEASE.
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import java.sql.Connection;
import java.sql.ConnectionBuilder;
import java.sql.SQLException;
import java.util.concurrent.TimeUnit;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.core.env.Environment;
import org.springframework.jdbc.datasource.AbstractDataSource;

public class UserSchemaAwareRoutingDataSource extends AbstractDataSource {

    @Autowired
    UsuarioProvider customUserDetails;

    @Autowired
    Environment env;

    private LoadingCache<String, DataSource> dataSources = createCache();

    public UserSchemaAwareRoutingDataSource() {
    }

    public UserSchemaAwareRoutingDataSource(UsuarioProvider customUserDetails, Environment env) {
        this.customUserDetails = customUserDetails;
        this.env = env;
    }

    private LoadingCache<String, DataSource> createCache() {
        return CacheBuilder.newBuilder()
                .maximumSize(100)
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .build(
                        new CacheLoader<String, DataSource>() {
                            public DataSource load(String key) throws Exception {
                                return buildDataSourceForSchema(key);
                            }
                        });
    }

    private DataSource buildDataSourceForSchema(String schema) {
        System.out.println("schema:" + schema);
        String url = "jdbc:mysql://REDACTED.com/" + schema;
        String username = env.getRequiredProperty("spring.datasource.username");
        String password = env.getRequiredProperty("spring.datasource.password");
        System.out.println("Flag A");
        DataSource build = (DataSource) DataSourceBuilder.create()
                .driverClassName(env.getRequiredProperty("spring.datasource.driverClassName"))
                .username(username)
                .password(password)
                .url(url)
                .build();
        System.out.println("Flag B");
        return build;
    }

    @Override
    public Connection getConnection() throws SQLException {
        return determineTargetDataSource().getConnection();
    }

    @Override
    public Connection getConnection(String username, String password) throws SQLException {
        return determineTargetDataSource().getConnection(username, password);
    }

    private DataSource determineTargetDataSource() {
        try {
            Usuario usuario = customUserDetails.customUserDetails();
            String db_schema = usuario.getTunnel().getDb_schema();
            return dataSources.get(db_schema);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        return null;
    }

    @Override
    public ConnectionBuilder createConnectionBuilder() throws SQLException {
        return super.createConnectionBuilder();
    }
}
References:
https://spring.io/blog/2007/01/23/dynamic-datasource-routing/
How to create Dynamic connections (datasource) in spring using JDBC
Spring Boot Configure and Use Two DataSources
Edit (additional information requested in the comments):
I have one database.
This database has n schemas. Each schema belongs to one company, and each user belongs to one company. The login logic is as follows:
- The user enters a username and password.
- On success, the UserDetails contains the name of the user's schema, i.e. which company/schema this user belongs to.
- After that, the application should connect directly to that schema so the user can work with the data of their own company.
I hope this clarifies things as much as possible.
Edit 2:
@Component
public class UsuarioProvider {

    @Bean
    @Scope(value = WebApplicationContext.SCOPE_REQUEST, proxyMode = ScopedProxyMode.TARGET_CLASS) // or just @RequestScope
    public Usuario customUserDetails() {
        return (Usuario) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
    }
}
public class UserSchemaAwareRoutingDataSource extends AbstractDataSource {

    @Autowired
    private UsuarioProvider usuarioProvider;

    @Autowired // this references the primary data source, because no qualifier is given
    private DataSource companyDependentDataSource;

    @Autowired
    @Qualifier(value = "loginDataSource")
    private DataSource loginDataSource;

    @Autowired
    Environment env;

    private LoadingCache<String, DataSource> dataSources = createCache();

    public UserSchemaAwareRoutingDataSource() {
    }

    private LoadingCache<String, DataSource> createCache() {
        return CacheBuilder.newBuilder()
                .maximumSize(100)
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .build(
                        new CacheLoader<String, DataSource>() {
                            public DataSource load(String key) throws Exception {
                                return buildDataSourceForSchema(key);
                            }
                        });
    }

    private DataSource buildDataSourceForSchema(String schema) {
        System.out.println("schema:" + schema);
        String url = "jdbc:mysql://REDACTED.com/" + schema;
        String username = env.getRequiredProperty("spring.datasource.username");
        String password = env.getRequiredProperty("spring.datasource.password");
        System.out.println("Flag A");
        DataSource build = (DataSource) DataSourceBuilder.create()
                .driverClassName(env.getRequiredProperty("spring.datasource.driverClassName"))
                .username(username)
                .password(password)
                .url(url)
                .build();
        System.out.println("Flag B");
        return build;
    }

    @Override
    public Connection getConnection() throws SQLException {
        return determineTargetDataSource().getConnection();
    }

    @Override
    public Connection getConnection(String username, String password) throws SQLException {
        return determineTargetDataSource().getConnection(username, password);
    }

    private DataSource determineTargetDataSource() {
        try {
            System.out.println("Flag G");
            Usuario usuario = usuarioProvider.customUserDetails(); // request-scoped answer!
            String db_schema = usuario.getTunnel().getDb_schema();
            return dataSources.get(db_schema);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        return null;
    }

    @Override
    public ConnectionBuilder createConnectionBuilder() throws SQLException {
        return super.createConnectionBuilder();
    }
}
Do I need to put @Configuration on top of this class? I can't make Spring Boot aware of these settings, and I'm confused about how to tell Spring Boot what the loginDataSource URL is. So far I have been using the application.properties default values to log in.
Your setup looks like the classic situation for two different DataSources.
Here is a Baeldung blog post on how to configure Spring Data JPA for this.
First thing to notice: they use @Primary. This helps and gets in your way at the same time. You can only have ONE primary bean of a certain type. This causes trouble for some people when they try to "override" a Spring bean by making their test beans primary, which results in two primary beans of the same type. So be careful when setting up your tests.
But it also simplifies things if you mostly refer to one DataSource and only occasionally to the other. That seems to be your case, so let's adopt it.
Your DataSource configuration could look like this:
@Configuration
public class DataSourceConfiguration {

    @Bean(name = "loginDataSource")
    public DataSource loginDataSource(Environment env) {
        String url = env.getRequiredProperty("spring.logindatasource.url");
        return DataSourceBuilder.create()
                .driverClassName(env.getRequiredProperty("spring.logindatasource.driverClassName"))
                [...]
                .url(url)
                .build();
    }

    @Bean(name = "companyDependentDataSource")
    @Primary // use with caution; I'd recommend name-based autowiring, see @Qualifier
    public DataSource companyDependentDataSource(Environment env) {
        return new UserSchemaAwareRoutingDataSource(); // autowiring is done afterwards by Spring
    }
}
These two DataSources can now be used in your repositories/DAOs, or however you structure your program:
@Autowired // references the primary data source, because no qualifier is given; UserSchemaAwareRoutingDataSource is its implementation
// @Qualifier("companyDependentDataSource") if @Primary is omitted
private DataSource companyDependentDataSource;

@Autowired
@Qualifier("loginDataSource") // reference by bean name
private DataSource loginDataSource;
Here is an example of how to configure Spring Data JPA with a DataSource referenced by name:
@Configuration
@EnableJpaRepositories(
        basePackages = "<your entity package>",
        entityManagerFactoryRef = "companyEntityManagerFactory",
        transactionManagerRef = "companyTransactionManager"
)
public class CompanyPersistenceConfiguration {

    @Autowired
    @Qualifier("companyDependentDataSource")
    private DataSource companyDependentDataSource;

    @Bean(name = "companyEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean companyEntityManagerFactory() {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(companyDependentDataSource);
        // ... see the Baeldung blog post
        return emf;
    }

    @Bean(name = "companyTransactionManager")
    public PlatformTransactionManager companyTransactionManager() {
        JpaTransactionManager tm = new JpaTransactionManager();
        tm.setEntityManagerFactory(companyEntityManagerFactory().getObject());
        return tm;
    }
}
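For completeness, the login side can be wired the same way against the loginDataSource. A minimal sketch, not from the original answer; the class name, bean names and entity package are illustrative assumptions:

@Configuration
@EnableJpaRepositories(
        basePackages = "<your login entity package>",   // assumption: login repositories live in their own package
        entityManagerFactoryRef = "loginEntityManagerFactory",
        transactionManagerRef = "loginTransactionManager"
)
public class LoginPersistenceConfiguration {

    @Autowired
    @Qualifier("loginDataSource")
    private DataSource loginDataSource;

    @Bean(name = "loginEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean loginEntityManagerFactory() {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(loginDataSource);
        // ... JPA vendor adapter / packages to scan, as in the Baeldung post
        return emf;
    }

    @Bean(name = "loginTransactionManager")
    public PlatformTransactionManager loginTransactionManager() {
        JpaTransactionManager tm = new JpaTransactionManager();
        tm.setEntityManagerFactory(loginEntityManagerFactory().getObject());
        return tm;
    }
}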
As described in the SO answer you referred to, there is an important assumption:
The current schema name to be used for the current user is accessible through a Spring JSR-330 Provider, like private javax.inject.Provider<User> user; String schema = user.get().getSchema();. This is ideally a ThreadLocal-based proxy.
This is the trick that makes the UserSchemaAwareRoutingDataSource implementation possible. Spring beans are mostly singletons and therefore stateless. This also applies to the normal usage of DataSources: they are treated as stateless singletons and references to them are passed around the whole program. So we need a way to provide a single instance of the companyDependentDataSource that nevertheless behaves differently per user. To get that behavior I suggest using a request-scoped bean.
In a web application, you can use @Scope(WebApplicationContext.SCOPE_REQUEST) to create such objects. There is also a Baeldung post on that topic. As usual, @Bean-annotated methods reside in @Configuration-annotated classes.
@Configuration
public class UsuarioConfiguration {

    @Bean
    @Scope(value = WebApplicationContext.SCOPE_REQUEST,
           proxyMode = ScopedProxyMode.TARGET_CLASS) // or just @RequestScope
    public Usuario usuario() {
        // based on your edit 2
        return (Usuario) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
    }
}
Now you can use this request-scoped object inside your singleton DataSource to behave differently depending on the logged-in user:
@Autowired
private Usuario usuario; // this is now a request-scoped proxy which resolves to the bean created in UsuarioConfiguration.usuario()

private DataSource determineTargetDataSource() {
    try {
        String db_schema = this.usuario.getTunnel().getDb_schema();
        return dataSources.get(db_schema);
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return null;
}
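If you prefer the JSR-330 style mentioned in the assumption above, the same lookup can be written with a Provider instead of the scoped proxy. A minimal sketch, assuming javax.inject is on the classpath and the request-scoped Usuario bean from UsuarioConfiguration is registered:

@Autowired
private javax.inject.Provider<Usuario> usuarioProvider; // resolved lazily on each call

private DataSource determineTargetDataSource() {
    try {
        // get() returns the Usuario bound to the current request
        String dbSchema = usuarioProvider.get().getTunnel().getDb_schema();
        return dataSources.get(dbSchema);
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return null;
}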
I hope this helps you understand the request scope concept of Spring.
So your login process would look something like this:
- The user enters a username and password.
- A normal Spring bean, referencing the loginDataSource by name, checks the login and puts the user information into the session/security context/cookie/...
- On success, during the next request the companyDependentDataSource is able to retrieve a properly set up Usuario object.
- You can now use this data source to do user-specific work.
To verify your DataSource is working properly, you could create a small Spring MVC endpoint:
@RestController
public class DataSourceVerificationController {

    @Autowired
    private Usuario usuario;

    @Autowired
    @Qualifier("companyDependentDataSource") // omit this annotation if you use @Primary
    private DataSource companyDependentDataSource;

    @GetMapping("/test")
    public String test() throws Exception {
        String schema = usuario.getTunnel().getDb_schema();
        Connection con = companyDependentDataSource.getConnection();
        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("select name from Employee"); // just a random guess
        rs.next();
        String name = rs.getString("name");
        rs.close();
        stmt.close();
        con.close();
        return "name = '" + name + "', schema = '" + schema + "'";
    }
}
Open your favorite browser, go to your login page, perform a valid login, and then call http://localhost:8080/test.

How to make sure SFTP session always close at the end of the spring-batch

My application is based on Spring Boot 2.1.6 with Spring Batch (chunk approach) and Spring Integration to handle SFTP.
The high-level functionality is to fetch data from the DB, generate a text file, and send it over SFTP; this task runs every 30 minutes.
This application has been running in production for some time, but the logs show errors about SSH_MSG_DISCONNECT: 11 (idle connection), and it stays like that until I restart the app.
Below is my application code:
SftpConfig.java
@Configuration
public class SftpConfig {

    @Autowired
    ApplicationProperties applicationProperties;

    @Bean
    public SessionFactory<LsEntry> sftpSessionFactory() {
        final DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost(applicationProperties.getSftp().getHost());
        factory.setUser(applicationProperties.getSftp().getUser());
        factory.setPassword(applicationProperties.getSftp().getPass());
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(factory);
    }

    @Bean
    @ServiceActivator(inputChannel = "toSftpChannel", adviceChain = "retryAdvice")
    public MessageHandler handler() {
        final SftpMessageHandler handler = new SftpMessageHandler(this.sftpSessionFactory());
        handler.setRemoteDirectoryExpression(new LiteralExpression(applicationProperties.getSftp().getPath()));
        handler.setFileNameGenerator((final Message<?> message) -> {
            if (message.getPayload() instanceof File) {
                return ((File) message.getPayload()).getName();
            } else {
                throw new IllegalArgumentException("File expected as payload.");
            }
        });
        return handler;
    }

    @Bean
    public RequestHandlerRetryAdvice retryAdvice() {
        final RequestHandlerRetryAdvice advice = new RequestHandlerRetryAdvice();
        final RetryTemplate retryTemplate = new RetryTemplate();
        final SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy();
        retryPolicy.setMaxAttempts(NumberConstants.FIVE);
        retryTemplate.setRetryPolicy(retryPolicy);
        advice.setRetryTemplate(retryTemplate);
        return advice;
    }

    @MessagingGateway
    public interface UploadGateway {

        @Gateway(requestChannel = "toSftpChannel")
        void upload(File file);
    }
}
Step for sending the file to SFTP:
@Autowired
UploadGateway uploadGateway;

private boolean uploadToSharedFolderSuccess(final PaymentStatus paymentStatus, final String strLocalTmpPath) {
    try {
        final File fileLocalTmpFullPath = new File(strLocalTmpPath);
        uploadGateway.upload(fileLocalTmpFullPath);
    } catch (final Exception e) {
        paymentStatus.setStatus(ProcessStatus.ERROR.toString());
        paymentStatus.setRemark(StringUtil.appendIfNotEmpty(paymentStatus.getRemark(),
                "Error during upload to shared folder - " + e.getMessage()));
    }
    return !StringUtils.equalsIgnoreCase(ProcessStatus.ERROR.toString(), paymentStatus.getStatus());
}
From the error, it seems I'm opening too many connections or leaving them idle, but I'm not sure how to make sure the connections are closed at the end of every Spring Batch run.
If you don't wrap the session factory in a CachingSessionFactory, the session will be closed after each use.
@Bean
public DefaultSftpSessionFactory sftpSessionFactory() {
    final DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(applicationProperties.getSftp().getHost());
    factory.setUser(applicationProperties.getSftp().getUser());
    factory.setPassword(applicationProperties.getSftp().getPass());
    factory.setAllowUnknownKeys(true);
    return factory;
}
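If you would rather keep the caching for performance, an alternative (a sketch, not part of the original answer; it assumes the Spring Integration version shipped with Boot 2.1.x, where CachingSessionFactory supports session testing) is to keep the CachingSessionFactory but have it validate each cached session before reuse, so sessions dropped by the server for being idle are discarded and recreated:

@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
    final DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(applicationProperties.getSftp().getHost());
    factory.setUser(applicationProperties.getSftp().getUser());
    factory.setPassword(applicationProperties.getSftp().getPass());
    factory.setAllowUnknownKeys(true);

    final CachingSessionFactory<LsEntry> cachingFactory = new CachingSessionFactory<>(factory);
    cachingFactory.setPoolSize(5);                 // cap the number of cached sessions
    cachingFactory.setSessionWaitTimeout(10_000);  // fail fast if the pool is exhausted
    cachingFactory.setTestSession(true);           // validate a session before reuse; stale ones are replaced
    return cachingFactory;
}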

Spring data mongoDB lazy configuration of database connection

I have a multi-tenant MongoDB application; let's assume that the right connection to the right database is chosen based on a tenant name taken from an HTTP request header (I use a previously prepared properties file keyed by tenant name).
When the application starts, MongoDB is configured before I have any tenant information, because no request has been sent yet, so I don't know which database to connect to. Is it possible to configure the MongoDB connection dynamically, when I try to get data from a Mongo repository (at which point I have the tenant name from the HTTP request)?
MongoDbConfiguration:
@Configuration
public class MongoDbConfiguration {

    private final MongoConnector mongoConnector;

    @Autowired
    public MongoDbConfiguration(MongoConnector mongoConnector) {
        this.mongoConnector = mongoConnector;
    }

    @Bean
    public MongoDbFactory mongoDbFactory() {
        return new MultiTenantSingleMongoDbFactory(mongoConnector, new MongoExceptionTranslator());
    }

    @Bean
    public MongoTemplate mongoTemplate() {
        return new MongoTemplate(mongoDbFactory());
    }
}
@Component
@Slf4j
public class MultiTenantMongoDbFactory extends SimpleMongoDbFactory {

    private static final Logger logger = LoggerFactory.getLogger(MultiTenantMongoDbFactory.class);

    private Map<String, DbConfig> tenantToDbConfig;
    private Map<String, MongoDatabase> tenantToMongoDatabase;

    @Autowired
    public MultiTenantMongoDbFactory(
            final @Qualifier("sibTenantContexts") Map<String, DbConfig> dbConfigs,
            final SibEnvironment env) {
        super(new MongoClientURI(env.getDefaultDatabase()));
        this.tenantToDbConfig = dbConfigs;
        // Initialize the tenantToMongoDatabase map.
        buildTenantDbs();
    }

    @Override
    public MongoDatabase getDb() {
        String tenantId = (!StringUtils.isEmpty(TenantContext.getId()) ? TenantContext.getId()
                : SibConstant.DEFAULT_TENANT);
        return this.tenantToMongoDatabase.get(tenantId);
    }

    /**
     * Create the tenantToMongoDatabase map.
     */
    @SuppressWarnings("resource")
    private void buildTenantDbs() {
        log.debug("Building tenantDB configuration.");
        this.tenantToMongoDatabase = new HashMap<>();
        /*
         * For each tenant, fetch its DbConfig, initialize a MongoClient and put the
         * resulting database into tenantToMongoDatabase.
         */
        for (Entry<String, DbConfig> idToDbconfig : this.tenantToDbConfig.entrySet()) {
            try {
                this.tenantToMongoDatabase.put(idToDbconfig.getKey(),
                        new MongoClient(new MongoClientURI(idToDbconfig.getValue().getUri()))
                                .getDatabase(idToDbconfig.getValue().getDatabase()));
            } catch (MongoException e) {
                log.error(e.getMessage(), e.getCause());
            }
        }
    }
}
Here, tenantToDbConfig is a bean which I create at application boot, storing the DB configuration (URL/database name) for every tenant. There is one default database which is required at boot time, and for every request I expect a tenantId in the request header.
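No answer is reproduced for this question here, but one way to make the lookup lazy, as the question asks, is to drop buildTenantDbs() from the constructor and create each MongoDatabase on first access. A sketch of just the relevant part of the factory, reusing the TenantContext, DbConfig and SibConstant types from the question and java.util.concurrent.ConcurrentHashMap:

private final Map<String, MongoDatabase> tenantToMongoDatabase = new ConcurrentHashMap<>();

@Override
public MongoDatabase getDb() {
    String tenantId = !StringUtils.isEmpty(TenantContext.getId()) ? TenantContext.getId()
            : SibConstant.DEFAULT_TENANT;
    // Build the connection for this tenant only when it is first requested.
    return tenantToMongoDatabase.computeIfAbsent(tenantId, id -> {
        DbConfig config = tenantToDbConfig.get(id);
        return new MongoClient(new MongoClientURI(config.getUri()))
                .getDatabase(config.getDatabase());
    });
}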

Configurable Mongo database url and name in Spring Boot Application

I am trying to configure MongoDB properties through application-{environment-name}.properties for connecting to MongoDB.
Here's my code for making the connection to Mongo:
@Configuration
@EnableAutoConfiguration
public class SpringMongoConfig {

    @Value("${db.connectionURL}")
    private String databaseURL;

    @Value("${db.name}")
    private String databaseName;

    @Bean
    public MongoDbFactory mongoDbFactory() throws Exception {
        System.out.println("database url: " + databaseURL + " db name: " + databaseName);
        return new SimpleMongoDbFactory(new MongoClient(databaseURL), databaseName);
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory());
        return mongoTemplate;
    }
}
Here's my application-test.properties file:
db.connectionURL=localhost
db.name=rahab
I'm getting null values for databaseURL and databaseName. My guess is that the values are not yet available during bean creation, but I don't know how to fix this.
You can use @PropertySource on the configuration class to specify the location of the properties file you created. Example code is given below:
@Configuration
@EnableAutoConfiguration
@PropertySource("application-test.properties")
public class SpringMongoConfig {

    @Value("${db.connectionURL}")
    private String databaseURL;

    @Value("${db.name}")
    private String databaseName;

    @Bean
    public MongoDbFactory mongoDbFactory() throws Exception {
        System.out.println("database url: " + databaseURL + " db name: " + databaseName);
        return new SimpleMongoDbFactory(new MongoClient(databaseURL), databaseName);
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory());
        return mongoTemplate;
    }
}
Hope this helps solve your issue.
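A side note, assuming the test profile is what should select application-test.properties (the question doesn't show how the profile is activated): Spring Boot loads profile-specific files named application-{profile}.properties automatically once that profile is active, so @PropertySource may not be needed if the profile is switched on, for example:
# in application.properties, or via -Dspring.profiles.active=test / SPRING_PROFILES_ACTIVE=test
spring.profiles.active=test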

Best way of creating Cassandra cluster connection using Spring data

I am using the code below to connect to Cassandra using Spring Data, but it's painful to create the connection every time.
try {
    cluster = Cluster.builder().addContactPoint(host).build();
    session = cluster.connect("digitalfootprint");
    CassandraOperations cassandraOps = new CassandraTemplate(session);
    Select usersQuery = QueryBuilder.select(userColumns).from("Users");
    usersQuery.where(QueryBuilder.eq("username", username));
    List<Users> userResult = cassandraOps.select(usersQuery, Users.class);
    userList = userResult;
} catch (Exception e) {
    e.printStackTrace();
} finally {
    cluster.close();
}
Is there any way to have a common, shared connection or some utility for this? I am using this in a web application with lots of CRUD operations, so it would be painful to repeat this code everywhere.
Just instantiate the appropriate beans at startup of your Spring web application. An example would be:
@Configuration
public class CassandraConfig {

    @Bean
    public CassandraClusterFactoryBean cluster() throws UnknownHostException {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(InetAddress.getLocalHost().getHostName());
        cluster.setPort(9042);
        return cluster;
    }

    @Bean
    public CassandraMappingContext mappingContext() {
        return new BasicCassandraMappingContext();
    }

    @Bean
    public CassandraConverter converter() {
        return new MappingCassandraConverter(mappingContext());
    }

    @Bean
    public CassandraSessionFactoryBean session() throws Exception {
        CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
        session.setCluster(cluster().getObject());
        session.setKeyspaceName("mykeyspace");
        session.setConverter(converter());
        session.setSchemaAction(SchemaAction.NONE);
        return session;
    }

    @Bean
    public CassandraOperations cassandraTemplate() throws Exception {
        return new CassandraTemplate(session().getObject());
    }
}
Now inject or autowire the CassandraOperations bean any time you want.
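For example, the query from the question could move into a DAO that simply autowires the template. A sketch that assumes the same Users entity and username column from the question:

@Repository
public class UserDao {

    @Autowired
    private CassandraOperations cassandraOps;

    public List<Users> findByUsername(String username) {
        Select usersQuery = QueryBuilder.select().from("Users");
        usersQuery.where(QueryBuilder.eq("username", username));
        // The cluster/session lifecycle is managed by the CassandraConfig beans,
        // so no manual connect/close is needed here.
        return cassandraOps.select(usersQuery, Users.class);
    }
}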
