I have a Java application with around 3000 reusable threads that run continuously and process items from a queue. I use MongoDB for data storage, and every time I run the application it works perfectly for about 40 minutes; after that the MongoDB object starts throwing NullPointerExceptions for queries. At first I suspected that connections were being lost, but as you can see in the Google monitoring graph the connections are still open, yet there is a significant decrease in the number of Mongo queries. Is there anything I'm missing here?
My MongoDB class is like this:
public class MongoDB {

    private static MongoClient mongoClient;

    private static MongoClient initMongoClient() {
        ServerAddress server = new ServerAddress("X.X.X.X");
        MongoClientOptions.Builder builder = new MongoClientOptions.Builder();
        builder.threadsAllowedToBlockForConnectionMultiplier(50000);
        builder.socketKeepAlive(true);
        builder.connectionsPerHost(10000);
        builder.minConnectionsPerHost(2500);
        MongoClientOptions options = builder.build();
        MongoClient mc = new MongoClient(server, options);
        mongoClient = mc;
        return mc;
    }

    public static MongoClient getMongoClient() {
        if (mongoClient == null) {
            mongoClient = initMongoClient();
        }
        return mongoClient;
    }

    public static DB getDb() {
        DB db;
        MongoClient mc;
        try {
            mc = getMongoClient();
            mc.getDatabaseNames();
        } catch (MongoException e) {
            mc = initMongoClient();
        }
        db = mc.getDB("tina");
        return db;
    }
}
You should switch to Morphia!
https://github.com/mongodb/morphia
Use the factory pattern to produce a single Mongo instance and tie it to a Morphia Datastore object. You can then use the Datastore object to interface with your MongoDB.
public class DatastoreFactory {

    private static Datastore ds;

    public static Datastore getDatastore() {
        // Lazy load the datastore
        if (ds == null) {
            try {
                Morphia morphia = new Morphia();
                ds = morphia.createDatastore(
                        new MongoClient("server", port), "database");
                // ... Other datastore options
            } catch (Exception e) {
                // Handle it
            }
        }
        return ds;
    }
}
Then wherever you need your MongoDB instance you simply use the Datastore object and get it from the factory:
Datastore ds = DatastoreFactory.getDatastore();
You could also use CDI to inject the datastore if that's what you're into
@Singleton
public class DatastoreFactory {

    private Datastore ds;

    @Produces
    public Datastore getDatastore() {
        // Lazy load the datastore
        if (ds == null) {
            try {
                Morphia morphia = new Morphia();
                ds = morphia.createDatastore(
                        new MongoClient("server", port), "database");
                // ... Other datastore options
            } catch (Exception e) {
                // Handle it
            }
        }
        return ds;
    }
}
Then inject it like so
@Inject
Datastore ds;
BONUS
To further decouple your code from MongoDB, it would be proper design to create Data Access Objects (DAO) to access your database which will contain the Morphia Datastore object. Your DAO will have the methods you wish to use on the database (get, create, save, delete). This way if you decide to move away from MongoDB you will only have to change the DAO object and not all of your code!
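For example, here is a rough sketch of such a DAO on top of the Datastore from the factory above (the User entity and the exact method set are made up for illustration, not from your code):

import org.bson.types.ObjectId;
import org.mongodb.morphia.Datastore;
import org.mongodb.morphia.Key;

public class UserDao {

    private final Datastore ds = DatastoreFactory.getDatastore();

    public User get(ObjectId id) {
        return ds.get(User.class, id);   // fetch by _id
    }

    public Key<User> save(User user) {
        return ds.save(user);            // insert or update
    }

    public void delete(User user) {
        ds.delete(user);
    }
}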
How can we write a Mockito test for the code below? It is written in plain JDBC. I need to mock all of this code, which has a main method driving the logic of updating the data.
I really need help mocking it so the test avoids inserting actual data. Could someone please guide me?
public class PaytPaytmBilling {

    private static Category logger = Category.getInstance(PaytPaytmBilling.class);
    private static InputStream inputS = XY.class.getResourceAsStream("/paytm.properties");
    private static final String INSERT_QUERY = "INSERT STATEMENT";

    private static void insertPaytPaytmBilling(ArrayList allPaytPaytmBill) throws Exception {
        conn = getConnection(userId, passwd, prop.getProperty("databaseURL"));
        String childSql = buildInsertPaytPaytmBillSql();
        PreparedStatement pStatement = conn.prepareStatement(childSql);
        for (int i = 0; i < allPaytPaytmBill.size(); i++) {
            PaytPaytmBill PaytmBill = (PaytPaytmBill) allPaytPaytmBill.get(i);
            pStatement.setString(1, PaytmBill.getXX());
            pStatement.setString(2, PaytmBill.getYY());
            pStatement.setString(3, PaytmBill.getAA());
            pStatement.setLong(4, PaytmBill.getBB());
            pStatement.setLong(5, PaytmBill.getCC());
            pStatement.setString(6, PaytmBill.getDD());
            pStatement.setInt(7, PaytmBill.getEE());
            pStatement.setInt(8, PaytmBill.getFF());
            pStatement.setString(9, "");
            pStatement.setString(10, "");
            pStatement.execute();
        }
        pStatement.close();
        conn.close();
    }

    private static void getDbConn() throws Exception {
        // Here get DB connection
    }

    public static void main(String[] args) throws Exception {
        ArrayList allPaytPaytmBill = new ArrayList();
        XY.init();
        getDbConn();
        // This query reads data from other tables and creates the data..
        String qmrString = qmr.buildQmrSql();
        allPaytPaytmBill = qmr.getAllMemberData(qmrString);
        insertPaytPaytmBilling(allPaytPaytmBill);
    }
}
Mockito Test class:
@RunWith(MockitoJUnitRunner.class)
public class PaytmBillingTest {

    private static Category logger = Category.getInstance(PaytmBillingTest.class);

    @Mock
    private DataSource ds;
    @Mock
    private Connection c;
    @Mock
    private PreparedStatement stmt;
    @Mock
    private ResultSet rs;

    private ArrayList<PaytmBill> allPaytmBill;

    @Before
    public void before() {
        allPaytmBill = new ArrayList<>();
        PaytmBill PaytmBill = new PaytmBill();
        PaytmBill.setAA("1182");
        PaytmBill.setBB("5122");
        PaytmBill.setCC("201807");
        PaytmBill.setDD(0L);
        PaytmBill.setEE(100);
        PaytmBill.setFF(0);
        PaytmBill.setGG(0);
        PaytmBill.setHH("A");
        PaytmBill.setII(null);
        PaytmBill.setJJ(null);
        allPaytmBill.add(PaytmBill);
    }

    @Test
    public void testPaytmBilling() {
        PaytmBilling PaytmBilling = new PaytmBilling();
    }
}
First of all, it looks like you are not showing us the real code. For example, you added private static void getDbConn(), but the code calls conn = getConnection(...), the variable conn is not declared anywhere, and so on. This makes it harder to really help with your issue.
Looking at your unit test, you want to mock instances of certain classes used by PaytPaytmBilling, like DataSource, Connection and PreparedStatement. These are called 'dependencies'.
In order to do that, you need to change PaytPaytmBilling so that these dependencies are 'injected' (see Dependency Injection). This means they are provided to PaytPaytmBilling via the constructor or a setter (or with some frameworks just by adding an annotation on the field).
In the current code, the dependencies are obtained by PaytPaytmBilling itself (e.g. by calling a static method, or creating a new instance) and they cannot be mocked (except via some black magic mocking frameworks which I don't advise you to get into right now).
To write good unit tests, you need to write (or refactor) the code to be testable, which means dependencies are injected, not obtained internally in the class. Also avoid static methods and data (constants are ok), they don't play nice with dependency injection and testable code.
So for example the DataSource could be injected via the constructor like this:
public class PaytPaytmBilling {

    private static final String CHILD_SQL = "SELECT bladiebla...";

    private DataSource dataSource;

    public PaytPaytmBilling(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public void insertPaytPaytmBilling(List<PaytmBill> allPaytPaytmBill) throws SQLException {
        // keeping the example simple here:
        // don't use String literals for the credentials below but read
        // them from Properties (which you can mock for the unit test)
        Connection conn = dataSource.getConnection("userId", "passwd");
        PreparedStatement pStatement = conn.prepareStatement(CHILD_SQL);
        for (int i = 0; i < allPaytPaytmBill.size(); i++) {
            PaytmBill paytmBill = allPaytPaytmBill.get(i);
            pStatement.setString(1, paytmBill.getXX());
            pStatement.setString(2, paytmBill.getYY());
            pStatement.setString(3, paytmBill.getAA());
            // ...
            pStatement.execute();
        }
        pStatement.close();
        conn.close();
    }
}
If you re-write the code like above, you could test it like this:
@RunWith(MockitoJUnitRunner.class)
public class PaytmBillingTest {

    // this will cause Mockito to automatically create an instance
    // and inject any mocks needed
    @InjectMocks
    private PaytPaytmBilling instanceUnderTest;

    @Mock
    private DataSource dataSource;

    // connection is not directly injected. It is obtained by calling
    // the injected dataSource
    @Mock
    private Connection connection;

    // preparedStatement is not directly injected. It is obtained by
    // calling the connection, which was obtained by calling the
    // injected dataSource
    @Mock
    private PreparedStatement preparedStatement;

    private List<PaytmBill> allPaytmBill;
    private PaytmBill paytmBill;

    @Before
    public void before() {
        allPaytmBill = new ArrayList<>();
        paytmBill = new PaytmBill();
        paytmBill.setAA("1182");
        paytmBill.setBB("5122");
        paytmBill.setCC("201807");
        paytmBill.setDD(0L);
        paytmBill.setEE(100);
        paytmBill.setFF(0);
        paytmBill.setGG(0);
        paytmBill.setHH("A");
        paytmBill.setII(null);
        paytmBill.setJJ(null);
        allPaytmBill.add(paytmBill);
    }

    @Test
    public void testPaytmBilling() throws Exception {
        // given
        when(dataSource.getConnection(anyString(), anyString())).thenReturn(connection);
        when(connection.prepareStatement(anyString())).thenReturn(preparedStatement);

        // when
        instanceUnderTest.insertPaytPaytmBilling(allPaytmBill);

        // then
        verify(preparedStatement).setString(1, paytmBill.getXX());
        verify(preparedStatement).setString(2, paytmBill.getYY());
        verify(preparedStatement).setString(3, paytmBill.getAA());
        // ...
        verify(preparedStatement).execute();
        verify(preparedStatement).close();
        verify(connection).close();
    }
}
Unrelated suggestion regarding your code: it's better to close resources in a finally block, or to use try-with-resources. In your current code, resources will not be closed if an exception occurs while processing them:
Connection conn = dataSource.getConnection("userId", "passwd");
PreparedStatement pStatement = conn.prepareStatement(childSql);
try {
    // processing steps
}
finally {
    pStatement.close();
    conn.close();
}
Or try-with-resources:
try (Connection conn = dataSource.getConnection("userId", "passwd");
     PreparedStatement pStatement = conn.prepareStatement(childSql)) {
    // processing steps
}
Since Connection and PreparedStatement implement the AutoCloseable interface they will be closed automatically when the try block ends. This is possible since Java 7.
I have a simple class named Signal. The class looks as follows:
public class Signal {

    private String id;
    private Date timestamp;

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public Date getTimestamp() {
        return timestamp;
    }

    public void setTimestamp(Date timestamp) {
        this.timestamp = timestamp;
    }
}
I am trying to insert a Signal into MongoDB (v3.4). I am using the following method to insert it:
public boolean xyz(Signal signal) {
    try {
        DatabaseConnection databaseConnection = DatabaseConnection.getInstance();
        MongoClient mongoClient = databaseConnection.getMongoClient();
        MongoDatabase db = mongoClient.getDatabase("myDb");
        MongoCollection<Signal> collection = db.getCollection("myCollection", Signal.class);
        collection.insertOne(signal);
        return true;
    } catch (Exception e) {
        logger.error("Error", e);
        return false;
    }
}
I am getting the following exception:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class in.co.mysite.webapi.models.Signal.
I checked a similar question here, but the insertion code is different. I took the hint from the answer and modified my method, but it doesn't look clean. The modified method is as follows:
public boolean xyz(Signal signal) {
    try {
        DatabaseConnection databaseConnection = DatabaseConnection.getInstance();
        MongoClient mongoClient = databaseConnection.getMongoClient();
        MongoDatabase db = mongoClient.getDatabase("myDb");
        MongoCollection<Document> collection = db.getCollection("myCollection");
        Document doc = new Document();
        doc.put("id", signal.getId());
        doc.put("timestamp", signal.getTimestamp());
        doc.put("_id", new ObjectId().toString());
        collection.insertOne(doc);
        return true;
    } catch (Exception e) {
        logger.error("Error", e);
        return false;
    }
}
You need to configure a CodecRegistry, which manages the translation from BSON to your POJOs:
MongoClientURI connectionString = new MongoClientURI("mongodb://localhost:27017");
MongoClient mongoClient = new MongoClient(connectionString);

CodecRegistry pojoCodecRegistry = org.bson.codecs.configuration.CodecRegistries.fromRegistries(
        MongoClientSettings.getDefaultCodecRegistry(),
        org.bson.codecs.configuration.CodecRegistries.fromProviders(
                PojoCodecProvider.builder().automatic(true).build()));

MongoDatabase database = mongoClient.getDatabase("testdb").withCodecRegistry(pojoCodecRegistry);
PS: You could statically import org.bson.codecs.configuration.CodecRegistries.fromRegistries and org.bson.codecs.configuration.CodecRegistries.fromProviders.
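With those static imports in place, the same registry setup from above reads more compactly (this is just a rewrite of the snippet, nothing new beyond the imports):

import static org.bson.codecs.configuration.CodecRegistries.fromProviders;
import static org.bson.codecs.configuration.CodecRegistries.fromRegistries;

CodecRegistry pojoCodecRegistry = fromRegistries(
        MongoClientSettings.getDefaultCodecRegistry(),
        fromProviders(PojoCodecProvider.builder().automatic(true).build()));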
A full example can be found on GitHub.
The MongoDB Java driver documentation also contains an article about managing POJOs (the link is for the 3.8.0 driver version).
Follow the quick start guide for POJOs. You need to register the codec that handles the translation of your POJOs (Plain Old Java Objects) to/from BSON:
http://mongodb.github.io/mongo-java-driver/3.7/driver/getting-started/quick-start-pojo/
Documentation:
MongoDB Driver Quick Start - POJOs
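In short, the quick start registers the POJO codec on the client, so every database and collection obtained from it can map your Signal class. A rough sketch (host and names are placeholders):

import static org.bson.codecs.configuration.CodecRegistries.fromProviders;
import static org.bson.codecs.configuration.CodecRegistries.fromRegistries;

import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import org.bson.codecs.configuration.CodecRegistry;
import org.bson.codecs.pojo.PojoCodecProvider;

CodecRegistry pojoCodecRegistry = fromRegistries(
        MongoClient.getDefaultCodecRegistry(),
        fromProviders(PojoCodecProvider.builder().automatic(true).build()));

MongoClientOptions options = MongoClientOptions.builder()
        .codecRegistry(pojoCodecRegistry)
        .build();
MongoClient mongoClient = new MongoClient("localhost", options);  // host is a placeholder

// db.getCollection("myCollection", Signal.class) from the question now finds the codec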
After following the above document, if you are still getting an error, you could be using a generic document inside your collection, like:
class DocStore {
    String docId;
    String docType;
    Object document;              // this will cause the BSON cast to throw a codec error
    Map<String, Object> document; // this won't
}
And you would still want to convert your document from POJO to Map; mkyong comes to the rescue.
As for the fetch, it works as expected, but you might want to convert from Map back to your POJO as a post-processing step; you can find some good answers here.
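One common option for that conversion is Jackson's ObjectMapper (this is an assumption on my side; any object mapper will do). A minimal sketch:

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;

ObjectMapper mapper = new ObjectMapper();

// POJO -> Map, e.g. before putting it into the generic document field
Map<String, Object> asMap = mapper.convertValue(signal, Map.class);

// Map -> POJO as a post-processing step after fetching
Signal back = mapper.convertValue(asMap, Signal.class);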
Hope it helps! 🙂️
Have you annotated your Java class? It looks like you need an @Entity annotation above your class and an @Id above your ID field.
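Assuming these are Morphia's annotations (the plain driver's POJO codec would instead use @BsonId from org.bson.codecs.pojo.annotations), the annotated class would look roughly like this:

import java.util.Date;
import org.mongodb.morphia.annotations.Entity;
import org.mongodb.morphia.annotations.Id;

@Entity("myCollection")   // collection name is an assumption
public class Signal {

    @Id
    private String id;

    private Date timestamp;

    // getters and setters as in the question
}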
I want to know the best practices for initializing a jOOQ-generated DAO. Currently I am using the following approach; in this case StudentDao is generated by jOOQ.
public class ExtendedStudentDAO extends StudentDao {

    public ExtendedStudentDAO() {
        super();
    }

    public ExtendedStudentDAO(Connection connection) {
        Configuration configuration = DSL.using(connection,
                JDBCUtils.dialect(connection)).configuration();
        this.setConfiguration(configuration);
    }

    // adding extra methods to DAO using DSL
    public String getStudentName(Long ID) throws SQLException {
        try (Connection connection = ServiceConnectionManager.getConnection()) {
            DSLContext dslContext = ServiceConnectionManager
                    .getDSLContext(connection);
            Record1<String> record = dslContext
                    .select(Student.Name)
                    .from(Student.Student)
                    .where(Student.ID.equal(ID))
                    .fetchOne();
            if (record != null) {
                return record.getValue(Student.Name);
            }
            return null;
        }
    }
}
I have a doubt about using the above DAO; my example code is below:
try (Connection connection = ServiceConnectionManager.getConnection()) {
    ExtendedStudentDAO extendedStudentDAO = new ExtendedStudentDAO(connection);

    Student stud = new Student();
    // ...

    // insert method is from the generated DAO
    extendedStudentDAO.insert(stud);

    // this method is added in the extended class
    extendedStudentDAO.getStudentName(12L);
}
There are two ways to look at this kind of initialisation:
Create DAOs every time you need them
Your approach is correct, but might be considered a bit heavy. You're creating a new DAO every time you need it.
As of jOOQ 3.7, a DAO is a pretty lightweight object. The same is true for the Configuration that wraps your Connection.
Once your project evolves (or in future jOOQ versions), that might no longer be true, as your Configuration initialisation (or jOOQ's DAO initialisation) might become heavier.
But this is a small risk, and it would be easy to fix:
Use dependency injection to manage DAO or Configuration references
Most people will set up only a single jOOQ Configuration for their application, and also only a single DAO instance (per DAO type), somewhere in a service. In this case, your Configuration must not share the Connection reference, but provide a Connection to jOOQ via the ConnectionProvider SPI. In your case, that seems trivial enough:
class MyConnectionProvider implements ConnectionProvider {

    @Override
    public Connection acquire() {
        return ServiceConnectionManager.getConnection();
    }

    @Override
    public void release(Connection connection) {
        try {
            connection.close();
        }
        catch (SQLException e) {
            throw new DataAccessException("Error while closing", e);
        }
    }
}
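To sketch how that provider could back a single shared Configuration and DAO instance (the dialect, the holder class, and the Configuration-based constructor are assumptions, not from your code):

import org.jooq.Configuration;
import org.jooq.SQLDialect;
import org.jooq.impl.DefaultConfiguration;

public class DaoHolder {

    // One Configuration for the whole application, backed by the ConnectionProvider above.
    private static final Configuration CONFIGURATION = new DefaultConfiguration()
            .set(new MyConnectionProvider())
            .set(SQLDialect.MYSQL);   // assumed dialect, adjust to your database

    // Assumes ExtendedStudentDAO also gets a Configuration constructor that simply
    // delegates to the generated StudentDao(Configuration) constructor.
    private static final ExtendedStudentDAO STUDENT_DAO = new ExtendedStudentDAO(CONFIGURATION);

    public static ExtendedStudentDAO studentDao() {
        return STUDENT_DAO;
    }
}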
I'm using the Dropwizard framework with JDBI and an in-memory H2 database for my tests. I've also written my DAOs, and now I want to cover them with unit tests. I came across DBUnit, which seems to fit my requirements.
But how to integrate it with JDBI and fill it with test data?
I implemented it like this:
I created a base DAO test class that sets up my DW environment and builds a DBI instance for me. It looks like this:
@BeforeClass
public static void setup() {
    env = new Environment("test-env", Jackson.newObjectMapper(), null, new MetricRegistry(), null);
    dbi = new DBIFactory().build(env, getDataSourceFactory(), "test");
    dbi.registerArgumentFactory(new JodaDateTimeArgumentFactory());
    dbi.registerMapper(new JodaDateTimeMapper(Optional.absent()));
}

static DataSourceFactory getDataSourceFactory() {
    DataSourceFactory dataSourceFactory = new DataSourceFactory();
    dataSourceFactory.setDriverClass("org.h2.Driver");
    dataSourceFactory.setUrl("jdbc:h2:mem:testDb");
    dataSourceFactory.setUser("sa");
    dataSourceFactory.setPassword("");
    return dataSourceFactory;
}

public static DBI getDbi() {
    return dbi;
}

public static Environment getEnvironment() {
    return env;
}
Note that this will create a DataSource for you pointing to your in-memory database.
Now, in the actual test, you can use the DBI instance to create your DAOs before each test:
DaoA dao;
DaoB otherDao;

@Before
public void setupTests() throws IOException {
    super.setupTests();
    dao = dbi.onDemand(DaoA.class);
    otherDao = dbi.onDemand(DaoB.class);
}
With this you're good to go and you can start testing. Hope that helps.
Artur
Edit for init:
My tests initialise themselves as well. For that I use dbi directly to execute SQL scripts. For example, a test is associated with a test1.sql script that is a test classpath resource; all I need to do is read that script and run it before the test, like this:
StringWriter writer = new StringWriter();
InputStream resourceStream = this.getClass().getResourceAsStream("/sql/schema.sql");
if (resourceStream == null) {
    throw new FileNotFoundException("schema not found");
}
IOUtils.copy(resourceStream, writer);

Handle handle = null;
try {
    handle = dbi.open();
    handle.execute(writer.toString());
    handle.commit();
} finally {
    if (handle != null) {
        handle.close();
    }
    if (resourceStream != null) {
        resourceStream.close();
    }
    writer.close();
}
In the below example, does JdbcTemplate create two connections or one?
public class MyDao {

    private JdbcTemplate jdbcTemplate;

    public List<Data1> getData1() {
        return jdbcTemplate.query(mySql, myParams, myCallback);
    }

    public List<Data2> getData2() {
        return jdbcTemplate.query(mySql2, myParams2, myCallback2);
    }
}

public class Main {

    public static void main(String[] args) {
        MyDao dao = new MyDao();
        List<Data1> d1 = dao.getData1();
        List<Data2> d2 = dao.getData2();
        doStuff(d1, d2);
    }
}
That is to say, does it reuse the connection from the first query? We are assuming that it was constructed with a basic data source (not a pooled data source).
It depends on the JdbcTemplate's DataSource. If you provided a connection pool, like Apache commons-dbcp, then DBCP will do its best to reuse connections. If you used Spring JDBC's DriverManagerDataSource, a new Connection will be created and closed on each JdbcTemplate.query call.
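To make that concrete, here is a rough sketch of both setups (driver class, URL and pool size are placeholders, and the pooled variant assumes commons-dbcp2 on the classpath):

import javax.sql.DataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

// Pooled: connections are borrowed from and returned to the pool, so
// consecutive query() calls can end up reusing the same physical connection.
BasicDataSource pooled = new BasicDataSource();
pooled.setDriverClassName("org.h2.Driver");   // placeholder driver and URL
pooled.setUrl("jdbc:h2:mem:testDb");
pooled.setMaxTotal(10);
JdbcTemplate pooledTemplate = new JdbcTemplate(pooled);

// Not pooled: DriverManagerDataSource opens a brand new connection for every
// getConnection() call, i.e. one connection per query() invocation.
DataSource plain = new DriverManagerDataSource("jdbc:h2:mem:testDb", "sa", "");
JdbcTemplate plainTemplate = new JdbcTemplate(plain);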