For a small new project I decided to give JDBI a try (normally I work with Hibernate/JPA).
I like the lightweight, annotation-based DAO creation using @SqlUpdate/@SqlQuery.
But there are situations where I can't know in advance whether I need to create an entity or update an existing one.
Normally I would run a select statement and, depending on its result, use either the insert or the update statement.
Question: is this somehow supported by the "interface-only" DAO in JDBI? Or do I have to write a "createOrUpdate" method myself (making the auto-generated DAO more or less obsolete)?
Thanks for any hints.
Thanks to @zloster I have now built a solution based on an abstract class instead of an interface. It works as required:
#SqlUpdate("insert ...")
public abstract void insert(...);
#SqlUpdate("update...")
public abstract void update();
public X createOrUpdate(final X x) {
if (!exists(x)) {
insert(x);
} else {
update(x);
}
return find(...);
}
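For readers on current JDBI versions: JDBI 3's SQL Object API is interface-based, so the same "check existence, then insert or update" logic can live in a default method. Below is a minimal sketch under that assumption; the Item bean, the item table with id and name columns, and the count-based existence check are hypothetical illustrations, not part of the original solution.

import org.jdbi.v3.sqlobject.config.RegisterBeanMapper;
import org.jdbi.v3.sqlobject.customizer.BindBean;
import org.jdbi.v3.sqlobject.statement.SqlQuery;
import org.jdbi.v3.sqlobject.statement.SqlUpdate;

// Hypothetical Item bean mapped to a table "item" with id and name columns.
@RegisterBeanMapper(Item.class)
public interface ItemDao {

    @SqlUpdate("insert into item (id, name) values (:id, :name)")
    void insert(@BindBean Item item);

    @SqlUpdate("update item set name = :name where id = :id")
    void update(@BindBean Item item);

    @SqlQuery("select count(*) from item where id = :id")
    int countById(@BindBean Item item);

    @SqlQuery("select id, name from item where id = :id")
    Item find(@BindBean Item item);

    // Plain default method mixed into the SQL object: decide insert vs. update,
    // then reload the row.
    default Item createOrUpdate(Item item) {
        if (countById(item) == 0) {
            insert(item);
        } else {
            update(item);
        }
        return find(item);
    }
}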
I have a requirement to execute query methods against a given Mongo database and retrieve data. But for all of those queries I also have to check that the field deleted=false (a soft-delete flag).
This can be achieved with the following kind of query,
e.g. Optional<User> findByIdAndDeletedIsFalse(String id);
But as you can see, we have to append DeletedIsFalse to every query.
I tried the answer provided in How to add default criteria to all the queries by default in mongo spring boot, but found out that it only works when running queries directly through the MongoTemplate.
After some debugging I found out that even though query methods are executed through the MongoTemplate, they use package-private classes and methods to do it, so they cannot be overridden by a subclass.
I cannot find an entry point where they are executed and where default criteria could be injected for the query methods.
E.g. if we check the implementation of MongoTemplate, in the end the execution happens through the method
<S, T> List<T> doFind(String collectionName, Document query, Document fields, Class<S> sourceClass, Class<T> targetClass, CursorPreparer preparer)
and that method is invoked from an internal class called ExecutableFindOperationSupport. All those classes are package-private.
Is there any reason to make them package-private instead of giving subclasses the chance to override them?
Also, is there any other way of running query methods with default criteria, without appending the criteria to every query?
The main problem with extending MongoTemplate, as I suggested in the previous question, is the need to override a lot of methods, because MongoTemplate uses all of them internally. That approach is neither flexible nor robust.
Let me suggest another solution: implement an aspect that executes some code before the MongoTemplate methods are invoked, and adds the additional criterion to every Query the MongoTemplate receives.
Add spring-boot-starter-aop to your dependencies and enable AOP in a configuration class:
@Configuration
@EnableAspectJAutoProxy
public class AspectConfiguration {
}
And implement a small aspect that will do all the work:
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class RepositoryAspect {

    // Runs before every MongoTemplate method and amends any Query argument it finds.
    @Before("execution(* org.springframework.data.mongodb.core.MongoTemplate.*(..))")
    public void before(JoinPoint joinPoint) {
        Object[] args = joinPoint.getArgs();
        for (Object arg : args) {
            if (arg instanceof Query) {
                Query query = (Query) arg;
                // add your criteria to the query, e.g. the soft-delete flag:
                query.addCriteria(Criteria.where("deleted").is(false));
                return;
            }
        }
    }
}
But keep in mind that this approach may hurt query performance. If you are building a high-load system, adding the criterion explicitly to each query is the better way; that is the price of fast execution.
What would be the simplest integration component arrangement for my use case?
Receive messages from multiple sources and in multiple formats (all messages are JSON-serialized objects).
Store messages in a buffer for up to 10 seconds (aggregate).
Group messages by a different property getter per class (e.g. class1.someId(), class2.otherId(), ...).
Release all grouped messages and transform them into a new aggregated message.
So far (points 1 and 2) I'm using an aggregator, but I don't know if there is an out-of-the-box solution for point 3, or whether I will have to cast each Message payload and check its type: if it is class1, correlate on someId; if class2, on otherId.
For point 4 I could code something manually, but a Transformer seems like a good component to use; I just don't know if there is something like an aggregating transformer where I can specify mapping rules for each input type.
UPDATE
Something like this:
class One {
    public String getA() { return "1"; }
}

class Two {
    public Integer getB() { return 1; }
}

class ReduceTo {
    public void setId(Integer id) {}
    public void setOne(One one) {}
    public void setTwo(Two two) {}
}

public class ReducingAggregator {

    @CorrelationStrategyMethod
    public String strategy(One one) {
        return one.getA();
    }

    @CorrelationStrategyMethod
    public String strategy(Two two) {
        return two.getB().toString();
    }

    @AggregatorMethod
    public void reduce(ReduceTo out, One in) {
        out.setId(Integer.valueOf(in.getA()));
        out.setOne(in);
    }

    @AggregatorMethod
    public void reduce(ReduceTo out, Two in) {
        out.setId(in.getB());
        out.setTwo(in);
    }
}
The annotations would have, I suppose, a different use case than the current Spring ones. ReduceTo could be any object, including a collection. In the configuration we could specify what is passed in on the first call: an empty list or something else (like the identity value of reduce in Java streams).
I'm not sure what you would like to see as an out-of-the-box solution. Those are your classes and your methods; how could the Framework make any decisions about them?
Well, yes, you need to implement a CorrelationStrategy. Or you can consider using an ExpressionEvaluatingCorrelationStrategy and not write the Java code at all :-).
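For illustration, a hand-written CorrelationStrategy for the two payload types from the question could look roughly like this sketch (One and Two are the classes from the update above; the instanceof dispatch and the exception for unknown types are my assumptions):

import org.springframework.integration.aggregator.CorrelationStrategy;
import org.springframework.messaging.Message;

public class TypeAwareCorrelationStrategy implements CorrelationStrategy {

    @Override
    public Object getCorrelationKey(Message<?> message) {
        Object payload = message.getPayload();
        if (payload instanceof One) {
            return ((One) payload).getA();
        }
        if (payload instanceof Two) {
            return ((Two) payload).getB().toString();
        }
        throw new IllegalArgumentException("Unsupported payload type: " + payload.getClass());
    }
}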
Please elaborate on what you would like to see as an out-of-the-box feature.
The aggregating transformer is encapsulated exactly in the MessageGroupProcessor function of the aggregator. By default it is the DefaultAggregatingMessageGroupProcessor. Yes, you can code your own, or again use an ExpressionEvaluatingMessageGroupProcessor and avoid writing Java code :-)
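A custom MessageGroupProcessor playing the role of the "aggregating transformer" might look roughly like the sketch below (ReduceTo, One, and Two are the classes from the update; the folding rules are assumptions that simply mirror the reduce(...) methods there):

import org.springframework.integration.aggregator.MessageGroupProcessor;
import org.springframework.integration.store.MessageGroup;
import org.springframework.messaging.Message;

public class ReducingGroupProcessor implements MessageGroupProcessor {

    @Override
    public Object processMessageGroup(MessageGroup group) {
        ReduceTo result = new ReduceTo();
        for (Message<?> message : group.getMessages()) {
            Object payload = message.getPayload();
            if (payload instanceof One) {
                One one = (One) payload;
                result.setId(Integer.valueOf(one.getA()));
                result.setOne(one);
            } else if (payload instanceof Two) {
                Two two = (Two) payload;
                result.setId(two.getB());
                result.setTwo(two);
            }
        }
        // The returned object becomes the payload of the aggregated output message.
        return result;
    }
}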
For an exam I'm writing an application meant to manage a database using both JDBC and JPA. I would like the user to select the API to use once, at the beginning, so that the whole application uses the selected API (whether JPA or JDBC).
For the moment I decided to use this approach:
I created an interface for each DAO class (e.g. interface UserDAO) with all needed method declarations.
I created two classes for each DAO distinguished by the API used (e.g UserDAOImplJDBC and UserDAOImplJPA). Both of them implement the interface (in our case, UserDAO).
I created a third class (e.g. UserDAOImpl) that extends the JDBC implementation class, and I have always used this class throughout my code. When I want to switch to JPA, I just have to change extends ***ImplDAOJDBC to extends ***ImplDAOJPA in every DAO class.
Now that I'm starting to have many DAO classes, it's becoming complicated to modify the code each time.
Is there a way to change all extends faster?
I was considering adding an option on the first screen (for example a radioGroup) to select JDBC or JPA, but I still have no idea how to make it work without restructuring all the code. Any ideas?
Use a factory to get the appropriate DAO, every time you need one:
public class UserDaoFactory {
    public UserDao create() {
        if (SomeSharedSingleton.getInstance().getPersistenceOption() == JDBC) {
            return new UserDAOImplJDBC();
        } else {
            return new UserDAOImplJPA();
        }
    }
}
That's a classic OO pattern.
That said, I hope you realize that what you're doing there should really never be done in a real application:
there's no reason to do the exact same thing in two different ways
the persistence model of JPA and JDBC is extremely different: JPA entities are managed by the JPA engine, so every change to a JPA entity is transparently made persistent. That's not the case with JDBC, where the data you get from the database is detached. So the way you implement business logic is very different between JPA and JDBC: with JPA you typically never need to save any change explicitly (see the sketch below).
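To illustrate that difference, here is a small hedged sketch. User is assumed to be a JPA entity with a name property backed by a users table; the method names and column names are mine, not from this thread.

import javax.persistence.EntityManager;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class PersistenceContrast {

    // JPA: the entity returned by find() is managed; the change is flushed
    // automatically when the surrounding transaction commits.
    public void renameWithJpa(EntityManager em, long userId) {
        User user = em.find(User.class, userId);
        user.setName("new name"); // no explicit save needed
    }

    // JDBC: the row data is detached; the change must be written back explicitly.
    public void renameWithJdbc(Connection connection, long userId) throws SQLException {
        try (PreparedStatement ps =
                 connection.prepareStatement("update users set name = ? where id = ?")) {
            ps.setString(1, "new name");
            ps.setLong(2, userId);
            ps.executeUpdate();
        }
    }
}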
You got 1 and 2 right, but 3 completely wrong.
Instead of having the Impl class extend one of the other implementations, choose which implementation to instantiate using, for example, a utility method. That's assuming you don't use a dependency injection framework such as Spring.
UserDAO dao = DBUtils.getUserDAO();
public class DBUtils {

    public static boolean shouldUseJdbc() {
        // Decide based on some configuration which API should be used;
        // a constant is returned here only to keep the example compilable.
        return true;
    }

    public static UserDAO getUserDAO() {
        if (shouldUseJdbc()) {
            return new UserDAOImplJDBC();
        } else {
            return new UserDAOImplJPA();
        }
    }
}
This is still just an example, as your DAOs don't need to be instantiated each time; they actually should be singletons.
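One possible way to follow that last piece of advice, reusing the shouldUseJdbc() check from the example above (the holder class name is mine, purely illustrative):

public class UserDaoHolder {

    // Eagerly created, shared instances: one per implementation.
    private static final UserDAO JDBC_INSTANCE = new UserDAOImplJDBC();
    private static final UserDAO JPA_INSTANCE = new UserDAOImplJPA();

    public static UserDAO getUserDAO() {
        return DBUtils.shouldUseJdbc() ? JDBC_INSTANCE : JPA_INSTANCE;
    }
}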
I implemented a class, Database Manager, that manages operations on two database engines. The class has a private field databaseEngine which is set before the class methods are used (drop database, create database, run script, compare, disconnect, etc.), and based on this field the class decides how to behave.
However, and I know it's wrong, Database Manager's methods are full of switch statements like this one:
public void CreateNewDatabase(String databaseName) {
    switch (databaseEngine) {
        case "mysql":
            // Executes a prepared statement for creating the MySQL database (databaseName)
            break;
        case "postgres":
            // Executes a prepared statement for creating the Postgres database (databaseName)
            break;
        ...
    }
}
I need some good advice about this. I want to load everything from configuration and resources folders, I mean the prepared statements for creating and dropping, etc. If a new database engine needs to be supported, it won't be a headache, as it would just require saving the SQL scripts in a resources file and any other data in a configuration file. Please suggest any design pattern useful for this case.
Whenever you need to invoke different operations based on a switch statement, think about using an abstract class which defines the operation interface and implementation classes which implement the operation.
In your case databaseEngine is a String which names a database. Instead create an abstract class DatabaseEngine and define operations like createDatabase:
public abstract class DatabaseEngine {
    public abstract void createDatabase(String databaseName);
    public abstract void dropDatabase(String databaseName);
}
and add implementations:
public class PostgresEngine extends DatabaseEngine {
    public void createDatabase(String databaseName) {
        ... // do it the postgres way
    }
}
and then use it in your manager class
public void createNewDatabase(String databaseName) {
    engine_.createDatabase(databaseName);
}
First thing: switching on strings is old school; if at all, you would want to use a true enum for that. But of course that isn't really the point, and switching over enums is just as bad as switching over strings (regarding the thing that you have in mind) from an "OO design" point of view.
The solution by wero is definitely the "correct choice" from an OO perspective. You see, good OO design starts with SOLID; and SOLID starts with SRP.
In this case I would point out the "there is only one reason to change" aspect of SRP. The thing is: if you push all database handling for 2, 3, n different databases into one class, you have to change that one class whenever any of your databases requires a change. Besides the obvious: providing access to ONE database is more than enough of a "single responsibility" for a single class.
Another point of view: this is about balancing. Either you are interested in a good, well-structured, "really OO type of" design; then you have to bite the bullet and define an interface or abstract base class, which is then implemented/extended differently for each concrete database.
Or you prefer stuffing everything into one class; then just keep what you have, because it really doesn't matter whether you use door handles made of gold or steel for a house that was built on a bad foundation anyway.
Meaning: your switch statements are just the result of a less-than-optimal design. Now decide if you want to cure the symptom or the root cause of the problem.
I implemented a class Database Manager that manages operations on two database engines.
What if you had three or four or five different databases/storages, for example Oracle, MongoDB, Redis, etc.? Would you still put the implementations for all of them into Database Manager?
Database Manager's methods are full of switch cases...
As expected, because you put everything into one class.
Please, suggest me any design pattern useful for this case.
The most straightforward way to simplify your solution would be to separate the MySQL and Postgres implementations from each other. You would need to use the Factory and Strategy design patterns. If one sees a switch, one should consider using them, but don't be obsessed with patterns. They are NOT your goal, i.e. don't put them everywhere in your code just because you can.
So you should start by defining your abstractions. Create an interface, or an abstract class if there is functionality common to all database subclasses.
// I'm not sure what methods you need, so I just added the methods you mentioned.
public interface MyDatabase {
    void drop();
    void create();
    void runScript();
    void compare();
    void disconnect();
}
Then you need to implement your databases, which are in fact strategies.
public final class MySqlDatabase implements MyDatabase {
    @Override
    public void drop() {}
    ...
}

public final class PostgreDatabase implements MyDatabase {
    @Override
    public void drop() {}
    ...
}
Finally you need to create a factory. You can make it static or implement an interface if you like.
public class MyDatabaseFactory {
    public MyDatabase create(String type) {
        switch (type) {
            case "mysql":
                return new MySqlDatabase();
            case "postgres":
                return new PostgreDatabase();
            default:
                throw new IllegalArgumentException();
        }
    }
}
You don't necessarily have to pass a string. It could be an options/settings class, but those have a tendency to grow, which may lead to bloated classes. But don't worry too much about that; it's not your biggest problem at the moment.
Last but not least: if you don't mind, revise your naming conventions. Please don't name your classes managers or helpers.
You could create an abstract base class for your DatabaseEngines like this:
public abstract class DatabaseEngine {
    public abstract void createDatabase(final String databaseName);
    public abstract void dropDatabase(final String databaseName);
}
And then create concrete implementations for each DatabaseEngine you are supporting:
public final class MySQLEngine extends DatabaseEngine {

    @Override
    public void createDatabase(final String databaseName) {
    }

    @Override
    public void dropDatabase(final String databaseName) {
    }
}
Then when you want to make a call to create/drop it will look more like this:
databaseEngine.createDatabase("whatever");
This is an opinion-based question, but from my point of view you can use:
The Factory design pattern. This will take care of any other database being added or changed in the future.
Example:
public interface IDataBaseEngine {
    ...
}

public class OracleDBConnection implements IDataBaseEngine {
    ...
}

public class MySQLDBConnection implements IDataBaseEngine {
    ...
}

public class DatabaseEngineFactory {
    public IDataBaseEngine getDatabaseConnection() {
        ...
    }
}
Second, create files, let's say XML files, which will contain your SQL; according to your DB (which can be configured), these files will be converted to the corresponding SQL.
Example:
SQL file: customer.table
<TABLE>
    <SELECT>
        <FROM>customer</FROM>
        <WHERE>customer_id = ?</WHERE>
        <ORDER_BY>customer_id</ORDER_BY>
    </SELECT>
</TABLE>
Now, if your configuration file says your database is Oracle, then compiling the above SQL file will produce the following SQL:
SELECT * FROM customer
WHERE customer_id = ?
ORDER BY customer_id
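As a rough illustration of that "compile the XML file into SQL" step, a sketch using plain JAXP DOM parsing might look like this. The class name, method names, and the dialect parameter are hypothetical; the element names follow the customer.table example above.

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class SqlFileCompiler {

    public String compile(File xmlFile, String dialect) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(xmlFile);

        String from = text(doc, "FROM");
        String where = text(doc, "WHERE");
        String orderBy = text(doc, "ORDER_BY");

        // A real implementation would branch on the dialect here,
        // e.g. paging or quoting syntax differs between Oracle and MySQL.
        return "SELECT * FROM " + from
                + " WHERE " + where
                + " ORDER BY " + orderBy;
    }

    private String text(Document doc, String tag) {
        return doc.getElementsByTagName(tag).item(0).getTextContent().trim();
    }
}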
I created a DAL a few weeks ago which connects to a Mongo database.
When I want to query the database with a certain class, I need to know which collection it belongs to.
I thought about creating an annotation that I'll put on each class, containing the name of the related collection; when I need to query the database I'll read the annotation value via reflection.
My question is: how can I declare that the class that is passed to me has the annotation?
Pretty much like:
public List<T> query(Class<T extends Interface>)
only:
public List<T> query(Class<T has Annotation>)
Thanks.
You should rather use an interface or an enumeration to do this. It is much simpler and more explicit.
But if you want to experiment, that is fine. The following should work:
public <T> List<T> query(Class<T> klass) {
    for (Annotation a : klass.getAnnotations()) {
        // Iterate over the annotations and do stuff
    }
    // do other stuff
}
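To connect this back to the question: Java generics cannot express "Class<T has Annotation>", so the presence of the annotation has to be checked at runtime. A small sketch of how that could look, with a hypothetical @MongoCollection annotation holding the collection name:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Collections;
import java.util.List;

// Must be retained at runtime, otherwise reflection cannot see it.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface MongoCollection {
    String value(); // name of the Mongo collection
}

class MongoDal {

    public <T> List<T> query(Class<T> klass) {
        MongoCollection annotation = klass.getAnnotation(MongoCollection.class);
        if (annotation == null) {
            // There is no compile-time way to require the annotation,
            // so fail fast at runtime instead.
            throw new IllegalArgumentException(
                    klass.getName() + " is not annotated with @MongoCollection");
        }
        String collectionName = annotation.value();
        // ... query the collection named collectionName and map documents to T
        return Collections.emptyList();
    }
}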