JOOQ: Querying similar tables - java

I have a situation where I have two tables that are almost identical. One table is used for viewing and editing the data, and when the data is published it goes into another table. Essentially WIDGET and PUBLISHED_WIDGET.
I have implemented all sorts of custom searching, sorting, filtering and paging queries for one table, and now I have to implement them for the other. I'm trying to find a way to abstract this out and use TableLike<COMMON_WIDGET>.
Example:
create table widget (
    id int not null auto_increment,
    name varchar(64) not null,
    lang varchar(2),
    updated_by varchar(64),
    updated_on timestamp
    -- ...
);

create table published_widget (
    id int not null auto_increment,
    name varchar(64) not null,
    lang varchar(2),
    updated_by varchar(64),
    updated_on timestamp
    -- ...
);
I want to be able to do something like this:
public class WidgetDao {
    private final TableLike<CommonWidget> table;

    public Widget find(String rsql) {
        dslContext.selectFrom(table)
                  .where(table.ID.eq("...").and(table.NAME.eq("...")))
        // ...
    }
}
Is this possible?

Table mapping
You can use the runtime table mapping feature for this. Choose one of your tables as your "base table" (e.g. WIDGET), and then use a derived Configuration with the following Settings:
Settings settings = new Settings()
    .withRenderMapping(new RenderMapping()
        .withSchemata(
            new MappedSchema().withInput("MY_SCHEMA")
                .withTables(
                    new MappedTable().withInput("WIDGET")
                        .withOutput("PUBLISHED_WIDGET"))));
And then:
public Widget find(String rsql) {
    // Alternatively, store this derived Configuration in your DAO for caching purposes
    dslContext.configuration()
              .derive(settings)
              .dsl()
              .selectFrom(WIDGET)
              .where(WIDGET.ID.eq("...").and(WIDGET.NAME.eq("...")))
              .fetch();
    // ...
}
Such Settings will rename (not alias) the table globally, for a Configuration.
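If you take this route, one way to avoid re-deriving the Configuration on every call (as the comment in the snippet above suggests) is to derive it once and keep the resulting DSLContext in the DAO. A minimal sketch, assuming an injected base DSLContext and generated WIDGET/WidgetRecord classes (the generated package below is just an example):

import org.jooq.DSLContext;
import org.jooq.Result;
import org.jooq.conf.Settings;
// Adjust these imports to your own code generator output
import static com.example.generated.Tables.WIDGET;
import com.example.generated.tables.records.WidgetRecord;

public class PublishedWidgetDao {

    // Derived once with the Settings shown above, then reused for every query
    private final DSLContext publishedCtx;

    public PublishedWidgetDao(DSLContext dslContext, Settings settings) {
        this.publishedCtx = dslContext.configuration()
                                      .derive(settings)
                                      .dsl();
    }

    public Result<WidgetRecord> findAll() {
        // Rendered against PUBLISHED_WIDGET because of the render mapping
        return publishedCtx.selectFrom(WIDGET).fetch();
    }
}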
Table.rename()
Generated tables have a rename() operation on them that allows you to do exactly what you want on an ad-hoc basis, not globally. Depending on your use-case, that might be more suitable. Again, this is not the same thing as aliasing (which affects generated SQL).
And again, you'll pick one of your similar/identical tables as your base table, and rename that for your purposes:
public Widget find(String rsql) {
Widget table = WIDGET.rename(PUBLISHED_WIDGET.getQualifiedName());
dslContext.selectFrom(table)
.where(table.ID.eq("...").and(table.NAME.eq("..."))
.fetch();
// ...
}
This method currently (jOOQ 3.14) only exists on generated tables, not on org.jooq.Table, see: https://github.com/jOOQ/jOOQ/issues/5242

Related

Model relationship "User has different Roles per Organization"

I'm trying to achieve the following with JPA: I have Users, Organizations and Roles. A User can have multiple Roles in a given Organization. He can also belong to multiple Organizations, and of course have different Roles per Organization.
Currently I would think that a schema for this should look like the following (but I'm also open to alternative approaches):
CREATE TABLE user
(
    id INT NOT NULL AUTO_INCREMENT,
    PRIMARY KEY (id)
);
CREATE TABLE role
(
    id INT NOT NULL AUTO_INCREMENT,
    PRIMARY KEY (id)
);
CREATE TABLE organization
(
    id INT NOT NULL AUTO_INCREMENT,
    PRIMARY KEY (id)
);
CREATE TABLE `user_and_organization_to_role`
(
    `id` INT NOT NULL AUTO_INCREMENT,
    `fk_user` INT NOT NULL REFERENCES user (id),
    `fk_organization` INT NOT NULL REFERENCES organization (id),
    `fk_role` INT NOT NULL REFERENCES role (id),
    PRIMARY KEY (`id`),
    UNIQUE KEY (`fk_user`, `fk_organization`, `fk_role`)
);
I wouldn't have problems checking roles with native SQL Queries, but I would like to model this in JPA to use the Hibernate Metamodel and Criteria API to implement permission checks.
I thought that something like this would be achievable, even though I'm not 100% sure if I'll reach my goal with Criteria API then:
@Entity
public class Organization {
}

@Entity
public class Role {
}

@Entity
public class User {
    private Map<Organization, List<Role>> organizationToRoles;
}
Unfortunately I didn't manage to find the right annotations so that organizationToRoles is mapped correctly. And even though I would think this is a common problem, I didn't find a tutorial that explains how to do it.
Could somebody tell me if such a map is doable with JPA at all, and maybe give an example?
Or, if it is not possible to directly have a Map<Organization, List<Role>> organizationToRoles in User, how could such a mapping be achieved, e.g. with an intermediate Entity that forms the relation between User, Organization and Roles?
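One common way to realise that intermediate-Entity idea is a dedicated join entity with three @ManyToOne associations. The following is only a rough, illustrative sketch (the entity name is hypothetical and it has not been verified against the Criteria API requirements):

import javax.persistence.*;

@Entity
@Table(name = "user_and_organization_to_role",
       uniqueConstraints = @UniqueConstraint(columnNames = {"fk_user", "fk_organization", "fk_role"}))
public class UserOrganizationRole {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @ManyToOne(optional = false)
    @JoinColumn(name = "fk_user")
    private User user;

    @ManyToOne(optional = false)
    @JoinColumn(name = "fk_organization")
    private Organization organization;

    @ManyToOne(optional = false)
    @JoinColumn(name = "fk_role")
    private Role role;

    // getters/setters omitted
}

Permission checks would then become queries against this entity (filtering on user, organization and role) instead of navigating a Map on User.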

Android Room: Create table based on external input

I am developing a Java-based Android app that uses Room. The app is connected to a server from which it downloads project-specific configurations. One of these configurations is the setup of a table: I have a table whose number and types of columns differ for each project. I need a local copy of this table on the phone to store data in case no internet connection is available. The configuration of the table contains the name of the table and the column composition, like:
[{
"name":"column1",
"datatype":"VARCHAR(20)"
},
{
"name":"column2",
"datatype":"INT(5)"
},
{
"name":"column3",
"datatype":"DOUBLE"
}]
How can I generate such a table with Room? Generating the create query is not a problem, but where should I execute it? Additionally, how can I insert, update and query data from the table? Is it possible to generate such SQL queries and execute them? Is there something like a row mapper which can be used to read the queried data from the table?
If this is not possible, any idea how I can solve it otherwise?
Thank you for your support.
You won't be able to do this with Room and use Room's object mapping, because Room builds tables according to the mapping of objects.
That is, a table is defined according to a class annotated with @Entity and registered as an entity of the database.
Room undertakes much at compile time, such as verifying queries and building the SQL that creates its components. At run time, as part of opening the database, it checks for the expected components and, if it finds a difference, it will fail/crash.
I did at one time have a project ongoing that built entities/classes from an existing database, which could then be copied into a project, but Room kept changing and introducing features (e.g. DEFAULT constraints, which were first ignored and later introduced).
You can have components, including tables, that are not controlled/known by Room, and they will not violate the run-time schema checking, but then you would have to use a SupportSQLiteDatabase, so you might as well use the native SQLite API.
How can I generate such a table with Room?
In the case of the example you would need a class annotated with @Entity. However, one rule that Room imposes is that there must be a PRIMARY KEY, so an option could be to introduce one; let's say a column called id.
Another rule that Room enforces is that column types can only be INTEGER, TEXT, REAL or BLOB. The type is determined from the variable type:
VARCHAR(20) would map to a String,
INT(5) would likely map to an int or a long; long would cover all cases,
DOUBLE would map to REAL, so a double.
So the class(entity) could be :-
@Entity
class TableX {
    @PrimaryKey
    Long id = null;
    String column1;
    long column2;
    double column3;
}
the table name would be TableX
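If the table name has to come from the downloaded configuration rather than from the class name, Room does let you override it via the annotation, but the value must still be fixed at compile time; for example (the name below is just an illustration):

@Entity(tableName = "project_table") // example name; cannot be decided at run time
class TableX {
    @PrimaryKey
    Long id = null;
    String column1;
    long column2;
    double column3;
}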
How can I insert, update and query data from the table?
You use an interface or abstract class annotated with @Dao, so for the above you could, for example, have:-
@Dao
abstract class TableXDao {

    @Insert
    abstract long insert(TableX tableX);
    @Insert
    abstract long[] insert(TableX... tableX);
    @Query("INSERT INTO TableX (column1,column2,column3) VALUES(:column1,:column2,:column3)")
    abstract long insert(String column1, long column2, double column3);

    @Update
    abstract int update(TableX tableX);
    @Update
    abstract int update(TableX... tableX);
    @Query("UPDATE tablex SET column1=:newColumn1, column2=:newColumn2, column3=:newColumn3 WHERE id=:id")
    abstract int update(long id, String newColumn1, long newColumn2, double newColumn3);

    @Query("SELECT * FROM tablex")
    abstract List<TableX> getAllTableXRows();
}
Note the 3 forms of insert/update: @Insert/@Update use the convenience methods (based upon passing the object or objects), while @Query uses a more free-format/adaptable approach.
Although not asked for: Room also needs to know about the database itself, so another class, annotated with @Database, is required. This annotation defines the entities that form the database, the version number (and other options). The class should extend RoomDatabase and must either be abstract or implement the abstract method createOpenHelper (typically the former). So :-
@Database(entities = {TableX.class}, version = 1)
abstract class TheDatabase extends RoomDatabase {

    abstract TableXDao getTableXDao();

    /* Often :- */
    private static volatile TheDatabase instance = null;

    public static TheDatabase getInstance(Context context) {
        if (instance == null) {
            instance = Room.databaseBuilder(
                    context, TheDatabase.class, "thedatabase.db"
            )
            .build();
        }
        return instance;
    }
}
When the above is compiled, much is undertaken; the build will include a warning:-
E:\AndroidStudioApps\SO70351715JavaSQLite\app\src\main\java\a\a\so70351715javasqlite\TheDatabase.java:10: warning: Schema export directory is not provided to the annotation processor so we cannot export the schema. You can either provide `room.schemaLocation` annotation processor argument OR set exportSchema to false.
abstract class TheDatabase extends RoomDatabase {
^
1 warning
This was purposely allowed to happen to demonstrate the extensive compile-time checking.
Additionally, it will generate quite a bit of Java code :-
the TableXDao_Impl class, being the code that is invoked when the Daos are used
the TheDatabase_Impl class, being the code for accessing the database, including the creation of the tables in the createAllTables method:-
@Override
public void createAllTables(SupportSQLiteDatabase _db) {
    _db.execSQL("CREATE TABLE IF NOT EXISTS `TableX` (`id` INTEGER, `column1` TEXT, `column2` INTEGER NOT NULL, `column3` REAL NOT NULL, PRIMARY KEY(`id`))");
    _db.execSQL("CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)");
    _db.execSQL("INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '5f1c580621c8b86aef3b3cccc44d8d76')");
}
As you can see, the room_master_table is created and populated with a row that stores a hash. This is part of the verification: if the hash changes, Room knows that the schema (i.e. the source code) has changed.
Is there something like a row mapper which can be used to read the queried data from the table?
As can be seen, it's all done with compiled code via the annotations, so there is no row mapper; the expectation is that everything is known/defined at compile time.
If this is not possible, any idea how I can solve it otherwise?
Use native SQLite, or manage the server database and the Room database as a whole.
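As a rough sketch of the native SQLite route (class, database and method names here are made up for illustration), a plain SQLiteOpenHelper can execute the CREATE TABLE statement generated from the downloaded configuration, with ContentValues and Cursor taking the place of Room's object mapping:

import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class DynamicTableHelper extends SQLiteOpenHelper {

    private final String createSql; // e.g. built from the downloaded JSON column list

    public DynamicTableHelper(Context context, String createSql) {
        super(context, "project.db", null, 1);
        this.createSql = createSql;
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL(createSql);
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // project specific: drop and recreate, or migrate the dynamic table
    }

    public long insertRow(String table, ContentValues values) {
        return getWritableDatabase().insert(table, null, values);
    }

    public Cursor queryAll(String table) {
        // the caller maps the Cursor rows itself (the "row mapper" part)
        return getReadableDatabase().rawQuery("SELECT * FROM " + table, null);
    }
}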

History, Diff and reverts of persisted objects

In a Spring MVC / Spring Data project I need to implement a mechanism to track history, present differences and revert the changes to an entity object.
Let's say I have an entity with relationships with others like this:
@Entity
public class ModelA {

    @OneToOne(cascade = CascadeType.ALL)
    private ModelB modelB;

    @OneToOne(cascade = CascadeType.ALL)
    private ModelC modelC;
}
I want to have the list of changes, the ability to compare them and to revert them. I know that in Ruby there are libs that provide this kind of functionality, but I'm not aware of whether such a thing exists in Java.
Spring has a historiography API and Hibernate Envers has been incorporated into the core functionality, although I still can't find a simple example or some guidance on how to implement it.
If it's relevant, the databases used are PostgreSQL and Oracle 11g, but I want to keep the solution database independent.
Use Envers and auditing instead, please.
One very interesting approach is given by Christian Bauer (Hibernate committer and author of Hibernate in Action and Java Persistence with Hibernate) in this post.
You create a HISTORY table:
create table ITEM (
    ITEM_ID NUMBER(19) NOT NULL,
    DESC VARCHAR(255) NOT NULL,
    PRICE NUMBER(19,2) NOT NULL,
    PRIMARY KEY(ITEM_ID)
);

create table ITEM_HISTORY (
    ITEM_ID NUMBER(19) NOT NULL,
    DESC VARCHAR(255) NOT NULL,
    PRICE NUMBER(19,2) NOT NULL,
    VERSION NUMBER(10) NOT NULL,
    PRIMARY KEY(ITEM_ID, VERSION)
);
Then you map entities to a view instead:
create or replace view ITEM_VERSIONED (ITEM_ID, VERSION, DESC, PRICE) as
select I.ITEM_ID as ITEM_ID,
       (select max(IH.VERSION)
        from ITEM_HISTORY IH
        where IH.ITEM_ID = I.ITEM_ID) as VERSION,
       I.DESC as DESC,
       I.PRICE as PRICE
from ITEM I
and the DML statements are resolved by INSTEAD OF TRIGGERS which are supported by PostgreSQL and Oracle:
create or replace trigger ITEM_INSERT
instead of insert on ITEM_VERSIONED begin
    insert into ITEM(ITEM_ID, DESC, PRICE)
    values (:n.ITEM_ID, :n.DESC, :n.PRICE);
    insert into ITEM_HISTORY(ITEM_ID, DESC, PRICE, VERSION)
    values (:n.ITEM_ID, :n.DESC, :n.PRICE, :n.VERSION);
end;

create or replace trigger ITEM_UPDATE
instead of update on ITEM_VERSIONED begin
    update ITEM set
        DESC = :n.DESC,
        PRICE = :n.PRICE
    where
        ITEM_ID = :n.ITEM_ID;
    insert into ITEM_HISTORY(ITEM_ID, DESC, PRICE, VERSION)
    values (:n.ITEM_ID, :n.DESC, :n.PRICE, :n.VERSION);
end;
This will work even for other applications that may not use Hibernate, yet they operate on the same DB.
If I understand correctly, what you are asking for is some sort of Memento pattern to manage the entities subject to history tracking.
In this case, my suggestion is to configure Spring Data to support a second database (i.e. a tracking database), where you will insert the history of the entities you are interested in.
Then you may create a new annotation (possibly using AspectJ) and apply it to your DAOs (e.g. to your repositories, if you are using them). This way, every time you make a CRUD operation on a tracked class (or, more precisely, on a dao/repository that manages a class you want to track), you make an "insert" in the tracking database storing the change that just occurred.
I can give you this reference, which does not exactly match your need but may support you in finding the solution that solves your issue.
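As an illustration of that idea (the names, the pointcut and the TrackingService are hypothetical, and this assumes Spring AOP on top of Spring Data repositories), an aspect could intercept save calls and write a row to the tracking database:

import org.aspectj.lang.annotation.AfterReturning;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class ChangeTrackingAspect {

    // Hypothetical service that writes snapshots/diffs into the tracking database
    private final TrackingService trackingService;

    public ChangeTrackingAspect(TrackingService trackingService) {
        this.trackingService = trackingService;
    }

    // Intercepts save(..) on repositories in an (illustrative) package
    @AfterReturning(
        pointcut = "execution(* com.example.repository..*.save(..))",
        returning = "saved")
    public void recordChange(Object saved) {
        trackingService.storeSnapshot(saved);
    }
}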

JOOQ pojos with one-to-many and many-to-many relations

I am struggling to understand how to handle pojos with one-to-many and many-to-many relationships with JOOQ.
I store locations that are created by players (a one-to-many relation). A location can hold multiple additional players who may visit it (many-to-many). The database layout comes down to the following:
CREATE TABLE IF NOT EXISTS `Player` (
  `player-id` INT UNSIGNED NOT NULL AUTO_INCREMENT,
  `player` BINARY(16) NOT NULL,
  PRIMARY KEY (`player-id`),
  UNIQUE INDEX `U_player` (`player` ASC))
ENGINE = InnoDB;

CREATE TABLE IF NOT EXISTS `Location` (
  `location-id` INT UNSIGNED NOT NULL AUTO_INCREMENT,
  `name` VARCHAR(32) CHARACTER SET 'utf8' COLLATE 'utf8_bin' NOT NULL,
  `player-id` INT UNSIGNED NOT NULL,
  UNIQUE INDEX `U_name` (`name` ASC),
  PRIMARY KEY (`location-id`),
  INDEX `Location_Player_fk` (`player-id` ASC),
  CONSTRAINT `fk_location_players1`
    FOREIGN KEY (`player-id`)
    REFERENCES `Player` (`player-id`)
    ON DELETE NO ACTION
    ON UPDATE NO ACTION)
ENGINE = InnoDB;

CREATE TABLE IF NOT EXISTS `location2player` (
  `location-id` INT UNSIGNED NOT NULL,
  `player-id` INT UNSIGNED NOT NULL,
  INDEX `fk_location2player_Location1_idx` (`location-id` ASC),
  INDEX `fk_location2player_Player1_idx` (`player-id` ASC),
  CONSTRAINT `fk_location2player_Location1`
    FOREIGN KEY (`location-id`)
    REFERENCES `Location` (`location-id`)
    ON DELETE NO ACTION
    ON UPDATE NO ACTION,
  CONSTRAINT `fk_location2player_Player1`
    FOREIGN KEY (`player-id`)
    REFERENCES `Player` (`player-id`)
    ON DELETE NO ACTION
    ON UPDATE NO ACTION)
ENGINE = InnoDB;
Within my Java application, all of this information is stored in one POJO. Note that the player and the list of invited players can be updated from within the application and need to be updated in the database as well:
public class Location {

    private final String name;
    private UUID player;
    private List<UUID> invitedPlayers;

    public void setPlayer(UUID player) {
        this.player = player;
    }

    public void invitePlayer(UUID player) {
        invitedPlayers.add(player);
    }

    public void uninvitePlayer(UUID player) {
        invitedPlayers.remove(player);
    }

    // additional methods…
}
Can I use jOOQ's POJO mapping to map these three records into the single POJO? Can I use jOOQ's CRUD features from this POJO to update the one-to-many and many-to-many relations? If the POJO mapping cannot be used, can I take advantage of jOOQ in any way other than using it to write my SQL statements?
Using MULTISET for nested collections with jOOQ 3.15
Starting from jOOQ 3.15, you can use the standard SQL MULTISET operator to nest collections and to abstract over the SQL/XML or SQL/JSON serialisation formats described further down. Your query would look like this:
List<Location> locations =
ctx.select(
      LOCATION.NAME,
      LOCATION.PLAYER,
      multiset(
        select(LOCATION2PLAYER.PLAYER_ID)
        .from(LOCATION2PLAYER)
        .where(LOCATION2PLAYER.LOCATION_ID.eq(LOCATION.LOCATION_ID))
      ).as("invitedPlayers")
   )
   .from(LOCATION)
   .fetchInto(Location.class);
If your DTOs are immutable (e.g. Java 16 records), you can even avoid using reflection for mapping and map in a type safe way into your DTO constructors, using constructor references and the new jOOQ 3.15 ad-hoc conversion feature.
List<Location> locations =
ctx.select(
      LOCATION.NAME,
      LOCATION.PLAYER,
      multiset(
        select(LOCATION2PLAYER.PLAYER_ID)
        .from(LOCATION2PLAYER)
        .where(LOCATION2PLAYER.LOCATION_ID.eq(LOCATION.LOCATION_ID))
      ).as("invitedPlayers").convertFrom(r -> r.map(Record1::value1))
   )
   .from(LOCATION)
   .fetch(Records.mapping(Location::new));
See also this blog post for more details about MULTISET
Using SQL/XML or SQL/JSON for nested collections with jOOQ 3.14
Starting from jOOQ 3.14, it's possible to nest collections using SQL/XML or SQL/JSON, if your RDBMS supports that. You can then use Jackson, Gson, or JAXB to map from the text format back to your Java classes. For example:
List<Location> locations =
ctx.select(
      LOCATION.NAME,
      LOCATION.PLAYER,
      field(
        select(jsonArrayAgg(LOCATION2PLAYER.PLAYER_ID))
        .from(LOCATION2PLAYER)
        .where(LOCATION2PLAYER.LOCATION_ID.eq(LOCATION.LOCATION_ID))
      ).as("invitedPlayers")
   )
   .from(LOCATION)
   .fetchInto(Location.class);
In some database products, like PostgreSQL, you could even use SQL array types using ARRAY_AGG() and skip using the intermediate XML or JSON format.
Note that JSON_ARRAYAGG() aggregates empty sets into NULL, not into an empty []. If that's a problem, use COALESCE()
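For example, the aggregation above could be wrapped like this (a sketch using jOOQ's coalesce() and jsonArray() from the DSL class):

// Falls back to an empty JSON array when a location has no invited players
coalesce(jsonArrayAgg(LOCATION2PLAYER.PLAYER_ID), jsonArray())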
Historic answer (pre jOOQ 3.14)
jOOQ doesn't do this kind of POJO mapping out of the box yet, but you can leverage something like ModelMapper, which features a dedicated jOOQ integration that works for these scenarios to a certain extent.
Essentially, ModelMapper hooks into jOOQ's RecordMapper API. More details here:
http://www.jooq.org/doc/latest/manual/sql-execution/fetching/recordmapper/
http://www.jooq.org/doc/latest/manual/sql-execution/fetching/pojos-with-recordmapper-provider/
You can use SimpleFlatMapper on the ResultSet of the query. First, create a mapper with player as the key:
JdbcMapper<Location> jdbcMapper =
JdbcMapperFactory.addKeys("player").newMapper(Location.class);
Then use fetchResultSet to get the ResultSet and pass it to the mapper.
Note that it is important to orderBy(LOCATION.PLAYER_ID) otherwise you might end up with split Locations.
try (ResultSet rs =
         dsl.select(
                LOCATION.NAME.as("name"),
                LOCATION.PLAYER_ID.as("player"),
                LOCATION2PLAYER.PLAYER_ID.as("invited_players_player"))
            .from(LOCATION)
            .leftOuterJoin(LOCATION2PLAYER)
                .on(LOCATION2PLAYER.LOCATION_ID.eq(LOCATION.LOCATION_ID))
            .orderBy(LOCATION.PLAYER_ID)
            .fetchResultSet()) {
    Stream<Location> stream = jdbcMapper.stream(rs);
}
Then do what you need to do on the stream; you can also get an iterator.
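For instance, still inside the try block, the stream can simply be collected into a list (plain java.util.stream usage, nothing SimpleFlatMapper specific):

List<Location> locations = jdbcMapper.stream(rs)
        .collect(java.util.stream.Collectors.toList());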

MYSQL stored procedure accessing java object stored as a BLOB

I am storing a Java object as a byte array in a BLOB column of a table. The Java object is a custom object. How can I construct the Java object and use it in the stored procedure?
Let the class implement java.io.Serializable so that you can get an InputStream of it which you can store in the DB using CallableStatement#setBinaryStream().
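A rough sketch of that serialization step (the procedure name and parameter index are placeholders, not taken from your schema): serialize the object to a byte array and hand an InputStream to the statement:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.sql.CallableStatement;
import java.sql.Connection;

public class BlobStorageExample {

    public static void storeObject(Connection connection, Serializable object) throws Exception {
        // Serialize the object into a byte array
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(object);
        }
        byte[] bytes = bos.toByteArray();

        // "store_object" is a placeholder procedure with a single BLOB parameter
        try (CallableStatement cs = connection.prepareCall("{call store_object(?)}")) {
            cs.setBinaryStream(1, new ByteArrayInputStream(bytes), bytes.length);
            cs.execute();
        }
    }
}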
That said, this is usually considered bad design. If the class is actually a JavaBean class, you'd better create a table with columns that represent the JavaBean properties. E.g. a public class User { private Long id; private String name; private Integer age; } should be mapped to a table like CREATE TABLE user ( id BIGINT AUTO_INCREMENT, name VARCHAR, age INTEGER ).
Edit as a reply to your comment: you thus basically want to store an array as a binary object. This is a very bad idea. This way you cannot search for the array's data in the database, and the database would also not be portable anymore. Just create a new table which represents each of the array items. Add an extra column to it which represents the ID of the parent object (actually, it should be the PK of the table to which the parent object containing the array has been mapped).
Example:
public class Parent {
private Long id;
private String someData;
private List<Child> children;
// Add/generate public getters/setters.
}
public class Child {
private Long id;
private String someData;
// Add/generate public getters/setters.
}
should be mapped to
CREATE TABLE parent (
id BIGINT NOT NULL AUTO_INCREMENT,
someData VARCHAR,
PRIMARY KEY (id)
);
CREATE TABLE child (
id BIGINT NOT NULL AUTO_INCREMENT,
parent_id BIGINT NOT NULL,
someData VARCHAR,
PRIMARY KEY (id),
FOREIGN KEY (parent_id) REFERENCES parent(id)
);
This way you can just select everything with the help of a JOIN clause. Check the SQL tutorial at w3schools.com and the vendor-specific SQL documentation for examples.
How can I construct the java object and use it in the stored procedure?
This is not possible, at least not with MySQL. Unlike Oracle, which supports Java stored procedures, MySQL's stored procedure syntax is based on the plain ANSI SQL standard. So I don't see how you could construct a Java object from the stream stored in the BLOB. What you can do is access the BLOB, but this won't help you much IMHO.
Actually, I think you are totally on the wrong path here; using a BLOB is not the right way to go (at least not here). If you need to persist objects that have a 1:n relation between them, you need to model your database accordingly.
If your Record class has a one to many relation with the User class, which is my understanding, then you have something like this on the Java side:
public class Record {
    private Long id;
    private User[] users;
    //...
}
Then you need to create two tables at the database level, one for the records and another for the user(s), and model the relation between them using a foreign key (so you can "attach" a user to a record):
CREATE TABLE record
(
    record_id INT NOT NULL,
    ...,
    PRIMARY KEY (record_id)
) ENGINE = INNODB;

CREATE TABLE user
(
    user_id INT NOT NULL,
    record_id INT,
    ...
    PRIMARY KEY (user_id),
    INDEX (record_id),
    FOREIGN KEY (record_id) REFERENCES record (record_id)
) ENGINE = INNODB;
Finally, when persisting a Record instance from Java, you'll need to write state to both tables.
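A sketch of what that could look like with plain JDBC, in a single transaction (only the key columns from the DDL above are shown; the remaining columns would be added in the same way):

import java.sql.Connection;
import java.sql.PreparedStatement;

public class RecordPersister {

    // Persists one record row and its associated user rows atomically
    public static void persist(Connection connection, long recordId, long[] userIds) throws Exception {
        boolean oldAutoCommit = connection.getAutoCommit();
        connection.setAutoCommit(false);
        try {
            try (PreparedStatement insertRecord =
                     connection.prepareStatement("INSERT INTO record (record_id) VALUES (?)")) {
                insertRecord.setLong(1, recordId);
                insertRecord.executeUpdate();
            }
            try (PreparedStatement insertUser =
                     connection.prepareStatement("INSERT INTO user (user_id, record_id) VALUES (?, ?)")) {
                for (long userId : userIds) {
                    insertUser.setLong(1, userId);
                    insertUser.setLong(2, recordId);
                    insertUser.addBatch();
                }
                insertUser.executeBatch();
            }
            connection.commit();
        } catch (Exception e) {
            connection.rollback();
            throw e;
        } finally {
            connection.setAutoCommit(oldAutoCommit);
        }
    }
}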
