Spring JDBC: How to create the tables?

I am using Spring JdbcTemplate with the DAO pattern to access a database. Instead of creating the database tables manually, I am looking for a way to generate the tables in the DAO layer.
I understand that I can use the JdbcTemplate to execute statements; I am only looking for the right place to do it.
Is there a best practice for that?

You can use the execute(String) method:
public void execute(String sql) throws DataAccessException
Issue a single SQL execute, typically a DDL statement.
Specified by: execute in interface JdbcOperations
Parameters: sql - static SQL to execute
Throws: DataAccessException - if there is any problem
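For instance, a minimal sketch (the DAO, table and column names here are invented for illustration; note that IF NOT EXISTS is not supported by every database, e.g. Oracle, so adjust for your dialect):

import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class SchemaDao {

    private final JdbcTemplate jdbcTemplate;

    public SchemaDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // Call once at startup, e.g. from an init method
    public void createTables() {
        jdbcTemplate.execute(
            "CREATE TABLE IF NOT EXISTS person ("
            + "id BIGINT PRIMARY KEY, "
            + "username VARCHAR(64) NOT NULL)");
    }
}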
However, as beny23 mentions, I would be suspicious of an actual need to do this programmatically in a live application.

Slightly off-topic:
Is it absolutely necessary that you execute the DDL commands from within your code? In fact, I think it is a good idea to have separation between DB administration and DB usage. Our Oracle database security setup here is arranged so that the tables are created by a different database user (DB_OWNER) than the one that runs the SELECTs, INSERTs and DELETEs (DB_USER).
This prevents accidentally deleting tables or modifying the schema, and it also allows DB_USER to be set up so that only the privileges that are absolutely necessary are granted, which adds a layer of security.
I suppose it depends on the nature of your service/application, but think about the benefit of creating the tables inside the code (and whether a possible bug in the DDL code could accidentally destroy production data).

Use the .update() methods available on (Simple)JdbcOperations; the number they return is the number of affected rows. They are meant specifically for INSERT, UPDATE and DELETE statements.
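For example (table and column names invented for illustration):

// returns 1 if the row was inserted
int inserted = jdbcTemplate.update(
    "INSERT INTO person (id, username) VALUES (?, ?)", 1L, "alice");
// returns the number of rows the WHERE clause matched
int updated = jdbcTemplate.update(
    "UPDATE person SET username = ? WHERE id = ?", "bob", 1L);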

Related

Statement.executeQuery() and SQL injection

We have an internal web-based tool that allows arbitrary SQL queries against the database. Access to the tool is limited. I am more worried about mistakes or accidents than about someone intentionally tampering with data or attacks.
The queries are ultimately executed by Statement.executeQuery and the results are returned. I tried a few test runs, and it seems that executeQuery, as the documentation suggests, fails on any call other than a SELECT.
Are there any other SQL statements / combinations that can trick the executeQuery call into causing changes in the database (INSERT/UPDATE/DELETE/DROP etc.)? I tried a few SQL injection examples available on the web, and it failed in every case.
SQL injection attacks are possible when the query arguments are concatenated into the query template, allowing a rogue attacker to inject malicious code.
If your Statement queries don't take any parameters, the client has no way to inject a malicious SQL routine. Whenever you have parameterized queries, you should use PreparedStatement instead.
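A minimal sketch of the difference (the users table and userInput variable are made up for illustration):

// Vulnerable: user input is concatenated straight into the SQL text
Statement st = connection.createStatement();
ResultSet bad = st.executeQuery(
    "SELECT * FROM users WHERE name = '" + userInput + "'");

// Safe: the driver binds the value, so it can never change the statement
PreparedStatement ps = connection.prepareStatement(
    "SELECT * FROM users WHERE name = ?");
ps.setString(1, userInput);
ResultSet good = ps.executeQuery();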
As for statement restriction, you should have the DBA provide you a database user account that can only execute SELECT and DML statements on the application schema only. DROP and TRUNCATE privileges shouldn't be allowed to the application user account.
If you use dynamic schema upgrades (e.g. Flyway), you can use a separate database account and a separate DataSource for that specific case.
This way, you will also protect yourself against data corruption caused by application developer mistakes.

How to drop and create schema multiple times?

I am trying to drop and create tables multiple times using JPA/EclipseLink (2.5.1) in a JUnit test by calling createEntityManager on an EntityManagerFactory instantiated multiple times. However, the schema is dropped/created only once. How can I make EclipseLink drop/create the schema every time?
My ultimate goal is to have the db tables in a known state (i.e. empty) for each test. Is there a better way to do that?
With EclipseLink, I have used
properties.put("eclipselink.ddl-generation", "drop-and-create-tables");
properties.put("eclipselink.ddl-generation.output-mode", "database");
properties.put("eclipselink.deploy-on-startup", "true");
Then
JpaHelper.getEntityManagerFactory(em).refreshMetadata(properties);
This drops and recreates tables for all entities in the persistence unit. As mentioned elsewhere, calling this multiple times is expensive - you are better off clearing data before/after a test using the JPA bulk delete queries:
"DELETE from EntityName entity"
Recreating the schema on every test run will slow down your tests considerably. You simply need to ensure your data is in a 'known state', in your case all tables empty. So:
Run your tests in a transaction marked as 'rollback only' so the database returns to a known state after each execution (see the sketch after this list).
and/or
have a look at either DBUnit which you can use to put your database in a known state before each test run.
http://dbunit.sourceforge.net/index.html
or
http://dbsetup.ninja-squad.com/
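For the rollback-only option, a minimal sketch using Spring's test support (the context file, DAO and entity are placeholders); Spring rolls the transaction back after each test method by default:

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@Transactional
public class PersonDaoTest {

    @Autowired
    private PersonDao personDao;   // hypothetical DAO under test

    @Test
    public void savedRowIsVisibleInsideTheTest() {
        personDao.save(new Person("alice"));
        // assert against the database here; Spring rolls the
        // transaction back automatically when the test method ends
    }
}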
It might be overkill for your application, but I've used Liquibase to handle versioning of DB schema and table contents. It should be straightforward to tie in pre/post hooks for your JUnit code.

insert hibernate domain objects to different database?

I want to copy all data of a specific table from database1 to database2. In my system I have access via Hibernate to the domain objects from database1; I don't have to transform the data structure. I only have a native JDBC connection to database2.
What's the best solution to make this Groovy script generic enough to support all kinds of domain objects I have, so that the script only gets my domain object and the connection string to the database and inserts all the data?
I faced a similar issue where I needed the ability to export every Hibernate entity to an SQL script; in other words, if you had a Person object with two properties (username, password), you should be able to generate the SQL insert statement for that object.
Person.username = x
Person.password = y
then the process would extract from that object the equivalent SQL insert and create something like:
insert into person (username, password) values ('x', 'y');
However, my solution was based on the fact that the mappings are done using Hibernate annotations rather than XML configuration. If this is your case, you could achieve the same within one or two working days: just read the annotations. Note that you will have to do an extra step, which is executing the resulting SQL inserts on the other DB.
FYI: this toSQL() method was added in a superclass (AbstractHibernateEntity) that every Hibernate entity extended, so calling it was the easiest thing to do.
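A stripped-down sketch of that idea, assuming the entities carry JPA @Table/@Column annotations on their fields (real code would need proper escaping, type handling and association support):

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.Column;
import javax.persistence.Table;

public abstract class AbstractHibernateEntity {

    public String toSQL() throws IllegalAccessException {
        String table = getClass().getAnnotation(Table.class).name();
        List<String> cols = new ArrayList<String>();
        List<String> vals = new ArrayList<String>();
        for (Field field : getClass().getDeclaredFields()) {
            Column column = field.getAnnotation(Column.class);
            if (column == null) continue;          // skip unmapped fields
            field.setAccessible(true);
            cols.add(column.name());
            vals.add("'" + field.get(this) + "'"); // naive quoting, illustration only
        }
        return "insert into " + table
            + " (" + String.join(", ", cols) + ")"
            + " values (" + String.join(", ", vals) + ");";
    }
}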
This was the complicated and most general solution; however, if you only need to copy one table from one DB to another, I would suggest simply going with a plain JDBC call and avoiding complicating your life ;-)
Regards.
Maybe the easiest approach would be to stick to the technology level that is common to both databases.
If they exist, you could use database-specific commands; that would be really fast.
If not, you could use plain JDBC on both. You could do that in a generic way :-)

Unit testing DDL statements that need to be in a transaction

I am working on an application that uses Oracle's built-in authentication mechanisms to manage user accounts and passwords. The application also uses row-level security. Basically every user that registers through the application gets an Oracle username and password instead of the typical entry in a "USERS" table. The users also receive labels on certain tables. This type of functionality requires that the execution of DML and DDL statements be combined in many instances, but this poses a problem because the DDL statements perform implicit commits. If an error occurs after a DDL statement has executed, the transaction management will not roll everything back. For example, when a new user registers with the system the following might take place:
Start transaction.
Insert person details into a table (i.e. first name, last name, etc.) - DML.
Create an Oracle account (create user testuser identified by password;) - DDL, implicit commit. Transaction ends.
New transaction begins.
Perform more DML statements (inserts, updates, etc.).
Error occurs; the transaction only rolls back to step 4.
I understand that the above logic is working as designed, but I'm finding it difficult to unit test this type of functionality and manage it in data access layer. I have had the database go down or errors occur during the unit tests that caused the test schema to be contaminated with test data that should have been rolled back. It's easy enough to wipe the test schema when this happens, but I'm worried about database failures in a production environment. I'm looking for strategies to manage this.
This is a Java/Spring application. Spring is providing the transaction management.
First off I have to say: bad idea doing it this way. For two reasons:
Connections are based on user. That means you largely lose the benefits of connection pooling. It also doesn't scale terribly well. If you have 10,000 users on at once, you're going to be continually opening and closing hard connections (rather than soft connection pools); and
As you've discovered, creating and removing users is DDL not DML and thus you lose "transactionality".
Not sure why you've chosen to do it this way, but I would strongly recommend you implement users at the application layer and not the database layer.
As for how to solve your problem, basically you can't. Same as if you were creating a table or an index in the middle of your sequence.
You should use Oracle proxy authentication in combination with row level security.
Read this: http://www.oracle.com/technology/pub/articles/dikmans-toplink-security.html
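The core of that approach, roughly sketched with the Oracle JDBC driver (user names are illustrative): the pool authenticates as one application account, and a lightweight proxy session is opened per end user so row-level security sees the real user:

import java.util.Properties;
import oracle.jdbc.OracleConnection;

OracleConnection conn = dataSource.getConnection().unwrap(OracleConnection.class);
Properties props = new Properties();
props.put(OracleConnection.PROXY_USER_NAME, "jeff"); // the end user
conn.openProxySession(OracleConnection.PROXYTYPE_USER_NAME, props);
// ... execute statements as "jeff" ...
conn.close(OracleConnection.PROXY_SESSION); // ends the proxy session, keeps the physical connection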
I'll disagree with some of the previous comments and say that there are a lot of advantages to using the built-in Oracle account security. If you have to augment this with some sort of shadow table of users with additional information, how about wrapping the Oracle account creation in a separate package that is declared PRAGMA AUTONOMOUS_TRANSACTION and returns a success/failure status to the package that is doing the insert into the shadow table? I believe this would isolate the Oracle account creation from the transaction.
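Roughly, the calling side could look like this (the procedure name and parameters are invented; the PL/SQL itself carries the pragma):

import java.util.Map;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

// Assumed PL/SQL wrapper (sketch):
//   CREATE OR REPLACE PROCEDURE create_db_account(
//       p_user IN VARCHAR2, p_pass IN VARCHAR2, p_status OUT NUMBER) IS
//     PRAGMA AUTONOMOUS_TRANSACTION;  -- DDL commit stays out of the caller's transaction
//   BEGIN
//     EXECUTE IMMEDIATE 'CREATE USER ' || p_user || ' IDENTIFIED BY ' || p_pass;
//     p_status := 1;
//   EXCEPTION WHEN OTHERS THEN
//     p_status := 0;
//   END;

SimpleJdbcCall createAccount = new SimpleJdbcCall(jdbcTemplate)
        .withProcedureName("create_db_account");
Map<String, Object> out = createAccount.execute(
        new MapSqlParameterSource()
                .addValue("p_user", username)
                .addValue("p_pass", password));
boolean created = ((Number) out.get("p_status")).intValue() == 1;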

JUnit for database code

I've been trying to implement unit testing and currently have some code that does the following:
query an external database, loading into a feed table
query a view, which is a delta of my feed and data tables, updating the data table to match the feed table
My unit testing strategy is this:
I have a testing database that I am free to manipulate.
in setUp(), load some data into my testing db
run my code, using my testing db as the source
inspect the data table, checking for counts and the existence/non existence of certain records
clear testing db, loading in a different set of data
run code again
inspect data table again
Obviously I have the data sets that I load into the source db set up such that I know certain records should be added, deleted, updated, etc.
It seems like this is a bit cumbersome and there should be an easier way? Any suggestions?
Is it your intent to test the view which generates the deltas, or to test that your code correctly adds, deletes and updates in response to the view?
If you want to test the view, you could use a tool like DBUnit to populate your feed and data tables with various data whose delta you've manually calculated. Then, for each test you would verify that the view returns a matching set.
If you want to test how your code responds to diffs detected by the view, I would try to abstract away the database access. I imagine a Java method to which you can pass a result set (or a list of POJOs/DTOs) and which returns a list of parameter Object arrays (again, or POJOs) to be added. Other methods would parse the diff list for items to be removed and updated. You could then create a mock result set or POJOs, pass them to your code, and verify the correct parameters are returned. All without touching a database.
I think the key is to break your process into parts and test each of those as independently as possible.
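A rough sketch of that separation (the row type and column names are invented for illustration):

import java.util.ArrayList;
import java.util.List;

public class DeltaProcessor {

    // Minimal stand-in for one row reported by the delta view
    public interface FeedRow {
        Long getId();
        String getName();
    }

    // Pure logic: turn rows the view flagged as new into insert parameters.
    // No JDBC involved, so this can be unit tested with plain lists.
    public List<Object[]> toInsertParams(List<FeedRow> newRows) {
        List<Object[]> params = new ArrayList<Object[]>();
        for (FeedRow row : newRows) {
            params.add(new Object[] { row.getId(), row.getName() });
        }
        return params;
    }
}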
DbUnit will meet your needs. One thing to watch out for is that they have switched to using SLF4J as their logging facade instead of JCL. You can configure SLF4J to forward the logging to JCL, but be warned: if you are using Maven, DbUnit pulls in its NOP log provider by default, so you will have to use an exclusion. I blogged about this conflict recently.
I use DbUnit, but I also work very hard not to have to test against the DB.
Tests that go against the database should only exist for the purpose of testing the database interface.
So I have Mock Db Connections that I can set the data for use in all the rest of my tests.
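One way to fake the data, assuming Mockito (the answer doesn't name a mocking library, so this is an assumption):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
import java.sql.ResultSet;

ResultSet rs = mock(ResultSet.class);
when(rs.next()).thenReturn(true, false);            // exactly one row
when(rs.getString("username")).thenReturn("alice"); // that row's data
// hand rs (or a mocked Connection that produces it) to the code under test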
Apart from the already suggested DBUnit, you may want to look into Unitils. It uses DBUnit, but provides more than that (quoting from the site):
Automatic maintenance of databases, with support for incremental, repeatable and post-processing scripts
Automatically disable constraints and set sequences to a minimum value
Support for Oracle, Hsqldb, MySql, DB2, Postgresql, MsSql and Derby
Simplify test database connection setup
Simple insertion of test data with DBUnit
Run tests in a transaction
JPA entity manager creation and injection for Hibernate, TopLink and OpenJPA
Hibernate SessionFactory creation and session management
Automatically test the mapping of JPA entities / Hibernate-mapped objects with the database
If you are using Maven, one option is to use the sql-maven-plugin. It allows you to run database initialization/population scripts during the Maven build cycle.