Can Spring Data R2DBC generate a schema?

I am creating a quick project using R2DBC and H2 to familiarize myself with this new reactive stuff. I made a repository that extends ReactiveCrudRepository, and all is well with the world, as long as I use the DatabaseClient to issue a CREATE TABLE statement that matches my entity first...
I understand Spring Data R2DBC is not as fully featured as Spring Data JPA (yet?), but is there currently a way to generate the schema from the entity classes?
Thanks

No, there is currently no way to generate a schema from entities with Spring Data R2DBC.
I'm using it in a project with a Postgres DB, and while it's complicated to manage database migrations, I managed to wire in Flyway with the synchronous PostgreSQL JDBC driver at startup to handle schema migrations (Flyway doesn't work with reactive drivers yet).
You still have to write your own CREATE TABLE statements, which shouldn't be that hard. You could even copy your entities into a simple JPA project, let Hibernate generate the schema, and then paste the result into a migration file in your R2DBC project.
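For reference, a minimal sketch of that wiring, assuming flyway-core and the blocking JDBC driver are on the classpath (the spring.flyway.* property names and the bean wiring below are illustrative, not taken from the project above):

import org.flywaydb.core.Flyway;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywayConfig {

    // Flyway talks plain JDBC, so it gets the synchronous driver's URL and
    // credentials even though the repositories connect via R2DBC.
    @Bean(initMethod = "migrate")
    public Flyway flyway(@Value("${spring.flyway.url}") String url,
                         @Value("${spring.flyway.user}") String user,
                         @Value("${spring.flyway.password}") String password) {
        return Flyway.configure()
                .dataSource(url, user, password)
                .load();
    }
}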

It is possible, both for tests and for production.
In production, make sure your user has no permission to change the schema, otherwise you may drop tables by mistake, or use a migration tool like Flyway.
You need to put your schema.sql in the main resources and add the relevant property:
spring.r2dbc.initialization-mode=always
I use H2 for tests and Postgres for production.
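For example, a minimal schema.sql under src/main/resources might look like this (the table and columns are just an illustration; use whatever matches your entity):

CREATE TABLE IF NOT EXISTS customer (
    id   BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    name VARCHAR(255) NOT NULL
);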
I use Gradle, and the driver dependencies are:
implementation 'org.springframework.boot.experimental:spring-boot-actuator-autoconfigure-r2dbc'
runtimeOnly 'com.h2database:h2'
runtimeOnly 'io.r2dbc:r2dbc-h2'
runtimeOnly 'io.r2dbc:r2dbc-postgresql'
runtimeOnly 'org.postgresql:postgresql'
testImplementation 'org.springframework.boot.experimental:spring-boot-test-autoconfigure-r2dbc'
The BOM version is:
dependencyManagement {
    imports {
        mavenBom 'org.springframework.boot.experimental:spring-boot-bom-r2dbc:0.1.0.M3'
    }
}
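The connection itself is configured with the spring.r2dbc.* properties, for example (URLs and credentials are placeholders, and property names may differ slightly between the experimental starter and later Spring Boot releases):

# test profile: in-memory H2
spring.r2dbc.url=r2dbc:h2:mem:///testdb
# prod profile: Postgres
spring.r2dbc.url=r2dbc:postgresql://localhost:5432/mydb
spring.r2dbc.username=myuser
spring.r2dbc.password=mypassword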

This is how I solved the problem:
Controller:
@PostMapping(MAP + PATH_DDL_PROC_DB) // PATH_DDL_PROC_DB = "/database/{db}/{schema}/{table}"
public Flux<Object> createDbByDb(
        @PathVariable("db") String db,
        @PathVariable("schema") String schema,
        @PathVariable("table") String table) {
    return ddlProcService.createDbByDb(db, schema, table);
}
Service:
public Flux<Object> createDbByDb(String db, String schema, String table) {
    return ddl.createDbByDb(db, schema, table);
}
Repository:
@Autowired
PostgresqlConnectionConfiguration.Builder connConfig;

public Flux<Object> createDbByDb(String db, String schema, String table) {
    return createDb(db).thenMany(
            Mono.from(connFactory(connConfig.database(db)).create())
                .flatMapMany(connection ->
                        Flux.from(connection
                                .createBatch()
                                .add(sqlCreateSchema(db))
                                .add(sqlCreateTable(db, table))
                                .add(sqlPopulateTable(db, table))
                                .execute())));
}

private Mono<Void> createDb(String db) {
    PostgresqlConnectionFactory connectionFactory = connFactory(connConfig);
    DatabaseClient ddl = DatabaseClient.create(connectionFactory);
    return ddl
            .execute(sqlCreateDb(db))
            .then();
}
Connection Class:
@Slf4j
@Configuration
@EnableR2dbcRepositories
public class Connection extends AbstractR2dbcConfiguration {
    /*
     **********************************************
     * Spring Data JDBC:
     *   DDL: does not support JPA.
     *
     * R2DBC
     *   DDL:
     *    - does not support JPA
     *    - to achieve DDL, use R2DBC's DatabaseClient
     *
     *   DML:
     *    - uses R2dbcRepositories
     *    - R2dbcRepositories is different from
     *      R2DBC's DatabaseClient
     **********************************************
     */
    @Bean
    public PostgresqlConnectionConfiguration.Builder connectionConfig() {
        return PostgresqlConnectionConfiguration
                .builder()
                .host("db-r2dbc")
                .port(5432)
                .username("root")
                .password("root");
    }

    @Bean
    public PostgresqlConnectionFactory connectionFactory() {
        return new PostgresqlConnectionFactory(connectionConfig().build());
    }
}
DDL Scripts:
@Getter
@NoArgsConstructor(access = AccessLevel.PRIVATE)
public final class DDLScripts {

    public static final String SQL_GET_TASK = "select * from tasks";

    public static String sqlCreateDb(String db) {
        String sql = "create database %1$s;";
        String[] sql1OrderedParams = quotify(new String[]{db});
        return format(sql, (Object[]) sql1OrderedParams);
    }

    public static String sqlCreateSchema(String schema) {
        String sql = "create schema if not exists %1$s;";
        String[] sql1OrderedParams = quotify(new String[]{schema});
        return format(sql, (Object[]) sql1OrderedParams);
    }

    public static String sqlCreateTable(String schema, String table) {
        String sql1 = "create table %1$s.%2$s " +
                "(id serial not null constraint tasks_pk primary key, " +
                "lastname varchar not null); ";
        String[] sql1OrderedParams = quotify(new String[]{schema, table});
        String sql1Final = format(sql1, (Object[]) sql1OrderedParams);

        String sql2 = "alter table %1$s.%2$s owner to root; ";
        String[] sql2OrderedParams = quotify(new String[]{schema, table});
        String sql2Final = format(sql2, (Object[]) sql2OrderedParams);

        return sql1Final + sql2Final;
    }

    public static String sqlPopulateTable(String schema, String table) {
        String sql = "insert into %1$s.%2$s values (1, 'schema-table-%3$s');";
        String[] sql1OrderedParams = quotify(new String[]{schema, table, schema});
        return format(sql, (Object[]) sql1OrderedParams);
    }

    private static String[] quotify(String[] stringArray) {
        String[] returnArray = new String[stringArray.length];
        for (int i = 0; i < stringArray.length; i++) {
            returnArray[i] = "\"" + stringArray[i] + "\"";
        }
        return returnArray;
    }
}

It is actually possible to load a schema by defining a specific class in this way:
import io.r2dbc.spi.ConnectionFactory
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.core.io.ClassPathResource
import org.springframework.data.r2dbc.repository.config.EnableR2dbcRepositories
import org.springframework.r2dbc.connection.init.ConnectionFactoryInitializer
import org.springframework.r2dbc.connection.init.ResourceDatabasePopulator
@Configuration
@EnableR2dbcRepositories
class DbConfig {

    @Bean
    fun initializer(connectionFactory: ConnectionFactory): ConnectionFactoryInitializer {
        val initializer = ConnectionFactoryInitializer()
        initializer.setConnectionFactory(connectionFactory)
        initializer.setDatabasePopulator(
            ResourceDatabasePopulator(
                ClassPathResource("schema.sql")
            )
        )
        return initializer
    }
}
Note that IntelliJ reports the error "Could not autowire. No beans of 'ConnectionFactory' type found", but it is actually a false positive, so ignore it and rebuild your project.
The schema.sql file has to be put in the resources folder.

Related

org.h2.jdbc.JdbcSQLException: Table not found

I'm getting this exception:
org.h2.jdbc.JdbcSQLException:
Table "CUSTOMERS" not found; SQL statement:
SELECT * FROM CUSTOMERS
This is the H2 Console. I have created a table there:
This is my application.yml file. I have also tried adding DB_CLOSE_DELAY=-1 and DATABASE_TO_UPPER=false:
spring:
  database:
    url: jdbc:h2:mem:testdb
  h2:
    console.enabled: true
Also, I have a configuration class, where I have created the H2 Embedded Database:
@Bean
public DataSource dataSource() {
    return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
}
Finally, the query. The table is named CUSTOMERS:
public List<Customer> getAll() {
    return jdbcTemplate.query("SELECT * FROM CUSTOMERS", (resultSet, rowNum) -> {
        Customer customer = new Customer();
        customer.setId(resultSet.getLong("id"));
        customer.setName(resultSet.getString("name"));
        customer.setAge(resultSet.getInt("age"));
        return customer;
    });
}
What should I do?
I had the same problem for a few days.
I solved it by adding this to the JDBC URL:
;TRACE_LEVEL_FILE=3;TRACE_LEVEL_SYSTEM_OUT=3
i.e.: jdbc:h2:mem:testdb;TRACE_LEVEL_FILE=3;TRACE_LEVEL_SYSTEM_OUT=3
It helps you see why H2 has a problem.
Usually it is a keyword problem.
You can work around it by using NON_KEYWORDS: https://www.h2database.com/html/commands.html#set_non_keywords
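For example, if a column name collides with a reserved word such as VALUE, the setting can be appended to the JDBC URL directly (the column name here is only an illustration):
jdbc:h2:mem:testdb;NON_KEYWORDS=VALUE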

How to get values for spring.boot.admin.client.username/password from a database?

I have a Spring Boot Admin project, and currently I hardcode the username and password in the application.properties file like this:
spring.boot.admin.client.username=user
spring.boot.admin.client.password=pass
spring.boot.admin.client.instance.metadata.user.name=user
spring.boot.admin.client.instance.metadata.user.password=pass
But I want to get those values from a database instead of hardcoding them like this. I want these configs to connect and self-register the admin server as a client. I am a beginner with Spring Boot. How can I do it? Thanks.
Every configuration in an application.properties file can also be set via Java code. First you have to create a DataSource for your project: add the spring-data-jpa dependency to your project and configure the datasource.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
You can find more here: A Guide to JPA with Spring.
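A minimal datasource configuration in application.properties might look like this (URL and credentials are placeholders):
spring.datasource.url=jdbc:mysql://localhost:3306/admindb
spring.datasource.username=dbuser
spring.datasource.password=dbpass
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver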
To configure, for example, the two properties spring.boot.admin.client.username=user and spring.boot.admin.client.password=pass, you need to create a @Configuration class which creates a ClientProperties bean.
@Configuration
public class AdminClientConfig {

    private final JdbcTemplate jdbcTemplate;
    private final Environment environment;

    public AdminClientConfig(JdbcTemplate jdbcTemplate, Environment environment) {
        super();
        this.jdbcTemplate = jdbcTemplate;
        this.environment = environment;
    }

    @Bean
    public ClientProperties clientProperties() {
        ClientProperties cp = new ClientProperties(environment);
        cp.setUsername(getUsername());
        cp.setPassword(getPassword());
        return cp;
    }

    private String getUsername() {
        String username = jdbcTemplate.queryForObject(
                "select username from AnyTable where id = ?",
                new Object[] { "123" }, String.class);
        return username;
    }

    private String getPassword() {
        String password = jdbcTemplate.queryForObject(
                "select password from AnyTable where id = ?",
                new Object[] { "123" }, String.class);
        return password;
    }
}
The JdbcTemplate already has a database connection and runs the queries that fetch the username and password from the database. The ClientProperties bean can then be populated with them.
P.S.: This code is not tested, but it gives you some hints to get the job done.

Validate schema programmatically using hibernate

In most projects, the way to run your Java app with schema validation is with this configuration (when using Spring):
spring.jpa.hibernate.ddl-auto=validate
I ran into a problem where I need to validate my schema at specific times while the application is running. Is there any way to implement that?
I saw that Hibernate manages it with the AbstractSchemaValidator.
I'm using Spring with Hibernate, and I didn't find any information on how to deal with it. The only thing I found is How to validate database schema programmatically in hibernate with annotations?, but it was removed in newer versions of spring-boot.
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-jpa</artifactId>
    <version>2.0.4.RELEASE</version>
</dependency>
Any ideas?
This is a solution if your use case requires:
- granular and explicit control over which parts of the schema should be validated
- validating multiple schemas
- validating a schema that is not used by the service on which the scheduled validator is running
- the db connections used by the application not being influenced by validation in any way (meaning you don't want to borrow connections from the main connection pool)
If the above applies to your needs, then this is an example of how to do scheduled schema validation:
Sources
@SpringBootApplication
@EnableScheduling
@EnableConfigurationProperties(SchemaValidatorProperties.class)
public class SchemaValidatorApplication {
    public static void main(String[] args) {
        SpringApplication.run(SchemaValidatorApplication.class, args);
    }
}

@ConfigurationProperties("schema-validator")
class SchemaValidatorProperties {

    public Map<String, String> settings = new HashMap<>();

    public SchemaValidatorProperties() {
    }

    public Map<String, String> getSettings() {
        return this.settings;
    }

    public void setSettings(Map<String, String> settings) {
        this.settings = settings;
    }
}
@Component
class ScheduledSchemaValidator {

    private SchemaValidatorProperties props;

    public ScheduledSchemaValidator(SchemaValidatorProperties props) {
        this.props = props;
    }

    @Scheduled(cron = "0 0/1 * * * ?")
    public void validateSchema() {
        StandardServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder()
                .applySettings(props.getSettings())
                .build();
        Metadata metadata = new MetadataSources(serviceRegistry)
                .addAnnotatedClass(Entity1.class)
                .addAnnotatedClass(Entity2.class)
                .buildMetadata();
        try {
            new SchemaValidator().validate(metadata, serviceRegistry);
        } catch (Exception e) {
            System.out.println("Validation failed: " + e.getMessage());
        } finally {
            StandardServiceRegistryBuilder.destroy(serviceRegistry);
        }
    }
}
@Entity
@Table(name = "table1")
class Entity1 {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    Entity1() {}

    public Long getId() {
        return id;
    }
}

@Entity
@Table(name = "table2")
class Entity2 {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    Entity2() {}

    public Long getId() {
        return id;
    }
}
schema.sql
CREATE DATABASE IF NOT EXISTS testdb;
CREATE TABLE IF NOT EXISTS `table1` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`id`)
);
CREATE TABLE IF NOT EXISTS `table2` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`id`)
);
application.yml
spring:
  cache:
    type: none
  datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    url: jdbc:mysql://localhost:3309/testdb?useSSL=false&nullNamePatternMatchesAll=true&serverTimezone=UTC&allowPublicKeyRetrieval=true
    username: test_user
    password: test_password
    testWhileIdle: true
    validationQuery: SELECT 1
  jpa:
    show-sql: false
    database-platform: org.hibernate.dialect.MySQL8Dialect
    hibernate:
      ddl-auto: none
      naming:
        physical-strategy: org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy
        implicit-strategy: org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy
    properties:
      hibernate.dialect: org.hibernate.dialect.MySQL8Dialect
      hibernate.cache.use_second_level_cache: false
      hibernate.cache.use_query_cache: false
      hibernate.generate_statistics: false
      hibernate.hbm2ddl.auto: validate
schema-validator:
  settings:
    connection.driver_class: com.mysql.cj.jdbc.Driver
    hibernate.dialect: org.hibernate.dialect.MySQL8Dialect
    hibernate.connection.url: jdbc:mysql://localhost:3309/testdb?autoReconnect=true&useSSL=false&allowPublicKeyRetrieval=true
    hibernate.connection.username: test_user
    hibernate.connection.password: test_password
    hibernate.default_schema: testdb
docker-compose.yml
version: '3.0'
services:
  db:
    image: mysql:8.0.14
    restart: always
    ports:
      - 3309:3306
    environment:
      MYSQL_ROOT_PASSWORD: test_password
      MYSQL_DATABASE: testdb
      MYSQL_USER: test_user
      MYSQL_PASSWORD: test_password
If you want to let the SchemaValidator reuse the connection configuration and the mapping information that are already configured in the project, rather than defining them again just for schema validation, you should consider my solution, so that you stay DRY and don't need to maintain these configurations in two separate places.
Actually, what SchemaValidator requires is the Metadata instance, which is only available while Hibernate is bootstrapping. But we can use the Hibernate Integrator API (as described here) to capture it so that we can validate the schema later.
(1) Create a SchemaValidateService which implements the Hibernate Integrator API to capture the Metadata. Also set up a @Scheduled method to validate the schema at the desired time.
@Component
public class SchemaValidateService implements Integrator {

    private Metadata metadata;

    @Override
    public void integrate(Metadata metadata, SessionFactoryImplementor sessionFactory,
            SessionFactoryServiceRegistry serviceRegistry) {
        this.metadata = metadata;
    }

    @Override
    public void disintegrate(SessionFactoryImplementor sessionFactory, SessionFactoryServiceRegistry serviceRegistry) {
    }

    // Adjust the scheduled time here
    @Scheduled(cron = "0 0/1 * * * ?")
    public void validate() {
        try {
            System.out.println("Start validating schema");
            new SchemaValidator().validate(metadata);
        } catch (Exception e) {
            // log the validation error here.
        }
        System.out.println("Finish validating schema....");
    }
}
(2) Register the SchemaValidateService with Hibernate:
@SpringBootApplication
@EnableScheduling
public class App {

    @Bean
    public HibernatePropertiesCustomizer hibernatePropertiesCustomizer(SchemaValidateService schemaValidateService) {
        return (prop -> {
            List<Integrator> integrators = new ArrayList<>();
            integrators.add(schemaValidateService);
            prop.put("hibernate.integrator_provider", (IntegratorProvider) () -> integrators);
        });
    }
}
Also, this solution should have better performance, as it does not need to create a new database connection for each validation run; it can just grab a connection from the existing connection pool.

Column querying from connection working for HSQL but not for PostgreSQL in Java

I am encountering weird behavior where my integration tests for my JPA configuration are failing for PostgreSQL but passing for HSQL. There are no code changes to the test or the assertion method.
I have verified that the table and columns are being added appropriately to the database. Verification of table existence passes OK, but the column check unexpectedly fails.
What is the root cause of this? Is there a workaround or fix for this issue?
Test:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = PersistenceConfig.class)
@Transactional
@TransactionConfiguration(defaultRollback = true)
public class UserMappingIntegrationTest {

    @Autowired
    EntityManager manager;

    @Test
    public void thatUserMappingWorks() {
        assertTableExists(manager, "USER_TABLE");
        assertTableHasColumn(manager, "USER_TABLE", "NAME");
    }
}
Assertion method:
public static void assertTableHasColumn(EntityManager manager,
        final String tableName, final String columnName) {
    SessionImpl session = (SessionImpl) manager.unwrap(Session.class);
    final ResultCollector rc = new ResultCollector();
    session.doWork(connection -> {
        ResultSet columns = connection.getMetaData().getColumns(null, null,
                tableName.toUpperCase(), null);
        while (columns.next()) {
            if (columns.getString(4).toUpperCase()
                    .equals(columnName.toUpperCase())) {
                rc.found = true;
            }
        }
    });
    if (!rc.found) {
        fail("Column [" + columnName + "] not found on table : " + tableName);
    }
}
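One thing worth checking is identifier case folding: HSQLDB stores unquoted identifiers in upper case, while PostgreSQL folds them to lower case, so DatabaseMetaData.getColumns can come back empty on Postgres when it is only given the upper-cased table name. A sketch of the same lookup trying both spellings (reusing the tableName, columnName and rc variables from the method above):

session.doWork(connection -> {
    java.sql.DatabaseMetaData meta = connection.getMetaData();
    // Try the stored spelling of both databases: upper case (HSQLDB) and lower case (PostgreSQL).
    for (String candidate : new String[] { tableName.toUpperCase(), tableName.toLowerCase() }) {
        try (ResultSet columns = meta.getColumns(null, null, candidate, null)) {
            while (columns.next()) {
                // COLUMN_NAME is the 4th column of the metadata result set
                if (columns.getString("COLUMN_NAME").equalsIgnoreCase(columnName)) {
                    rc.found = true;
                }
            }
        }
    }
});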

Use custom creds while using Dropwizard JDBI

I'm developing a web service using the Dropwizard JDBI framework.
Now, instead of having the db configuration in the YAML file, I want to use 'user specified params'; what I mean is, the db config will be provided through the endpoint URL.
Is having custom creds possible with Dropwizard JDBI?
If yes, what changes should I be thinking of making in the code while referring to this? ->
http://dropwizard.readthedocs.org/en/latest/manual/jdbi.html
I understand that, in the normal flow, the service gets the config details in the run method:
-- Config Class
public class ExampleConfiguration extends Configuration {
    @Valid
    @NotNull
    @JsonProperty
    private DatabaseConfiguration database = new DatabaseConfiguration();

    public DatabaseConfiguration getDatabaseConfiguration() {
        return database;
    }
}
-- Service Class
@Override
public void run(ExampleConfiguration config,
        Environment environment) throws ClassNotFoundException {
    final DBIFactory factory = new DBIFactory();
    final DBI jdbi = factory.build(environment, config.getDatabaseConfiguration(), "postgresql");
    final UserDAO dao = jdbi.onDemand(UserDAO.class);
    environment.addResource(new UserResource(dao));
}
-- and yaml
database:
  # the name of your JDBC driver
  driverClass: org.postgresql.Driver
  # the username
  user: pg-user
  # the password
  password: iAMs00perSecrEET
  # the JDBC URL
  url: jdbc:postgresql://db.example.com/db-prod
But in this case, I would get the config details at the Resource level...
Something like:
@GET
@Path(value = "/getProduct/{Id}/{dbUrl}/{dbUname}/{dbPass}")
@Produces(MediaType.APPLICATION_JSON)
public Product getProductById(@PathParam(value = "Id") int Id,
        @PathParam(value = "dbUrl") String dbUrl,
        @PathParam(value = "dbUname") String dbUname,
        @PathParam(value = "dbPass") String dbPass) {
    // I have to connect to the DB here using the params I have.
    return new Product(); // should return the Product
}
I'd appreciate it if someone could point me in the right direction.
Why not just use JDBI directly?
@GET
@Path(value = "/getProduct/{Id}/{dbUrl}/{dbUname}/{dbPass}")
@Produces(MediaType.APPLICATION_JSON)
public Product getProductById(@PathParam(value = "Id") int id,
        @PathParam(value = "dbUrl") String dbUrl,
        @PathParam(value = "dbUname") String dbUname,
        @PathParam(value = "dbPass") String dbPass) {
    DataSource ds = JdbcConnectionPool.create(dbUrl, dbUname, dbPass);
    DBI dbi = new DBI(ds);
    ProductDao dao = dbi.open(ProductDao.class);
    Product product = dao.findById(id);
    dao.close();
    ds.dispose();
    return product;
}

@RegisterMapper(ProductMapper.class)
static interface ProductDao {
    @SqlQuery("select id from product_table where id = :id") // Whatever SQL query you need to produce the product
    Product findById(@Bind("id") int id);

    @SqlQuery("select * from product_table")
    Iterator<Product> findAllProducts();
}

static class ProductMapper implements ResultSetMapper<Product> {
    public Product map(int index, ResultSet r, StatementContext ctx) throws SQLException {
        return new Product(r.getInt("id")); // Whatever product constructor you need
    }
}
There's a notion in the Spring world of using a database router (reference: https://spring.io/blog/2007/01/23/dynamic-datasource-routing/).
You could likely set up a proxy for the database connection factory passed to DBI. That proxy would then get the credentials from a thread local (perhaps) and return the real connection, giving you what you're after while still letting you use the run-type proxies; a rough sketch of this idea follows.
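A rough sketch of that idea (all class and field names here are illustrative, not Dropwizard or JDBI API; a request filter or the resource itself would set the credentials before using the DAO and clear them afterwards):

import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.logging.Logger;
import javax.sql.DataSource;

// Holds the per-request credentials on the current thread: { url, user, password }.
class DbCredentialsContext {
    static final ThreadLocal<String[]> CURRENT = new ThreadLocal<>();
    static void set(String url, String user, String password) {
        CURRENT.set(new String[] { url, user, password });
    }
    static void clear() {
        CURRENT.remove();
    }
}

// A DataSource proxy: DBI is built once against this, and every getConnection()
// call opens a connection with whatever credentials the current thread carries.
class ThreadLocalRoutingDataSource implements DataSource {
    @Override
    public Connection getConnection() throws SQLException {
        String[] creds = DbCredentialsContext.CURRENT.get();
        if (creds == null) {
            throw new SQLException("No database credentials bound to this request");
        }
        return DriverManager.getConnection(creds[0], creds[1], creds[2]);
    }
    @Override
    public Connection getConnection(String username, String password) throws SQLException {
        return getConnection(); // credentials come from the thread local in this sketch
    }
    // The remaining DataSource methods are stubs for the purposes of this sketch.
    @Override public PrintWriter getLogWriter() { return null; }
    @Override public void setLogWriter(PrintWriter out) { }
    @Override public void setLoginTimeout(int seconds) { }
    @Override public int getLoginTimeout() { return 0; }
    @Override public Logger getParentLogger() throws SQLFeatureNotSupportedException {
        throw new SQLFeatureNotSupportedException();
    }
    @Override public <T> T unwrap(Class<T> iface) throws SQLException {
        throw new SQLException("Not a wrapper");
    }
    @Override public boolean isWrapperFor(Class<?> iface) { return false; }
}

In the resource you would then call DbCredentialsContext.set(dbUrl, dbUname, dbPass) before invoking the DAO and DbCredentialsContext.clear() in a finally block.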
