I am using an Apache Derby database and basing my database interactions on an EntityManager, and I don't want to fall back to the plain JDBC classes to build a query that changes my tables' names (I just need to add a prefix for each new user of the application, while keeping the same table structure), such as:
//em stands for EntityManager object
Query tableNamesQuery= em.createNamedQuery("RENAME TABLE SCHEMA.EMP_ACT TO EMPLOYEE_ACT");
em.executeUpdate();
// ... rest of the function's work
// The command works from the database command prompt, but I don't know how to use it in a program
// Or, as far as I know, you can't change system table data, but here's the code anyway
Query tableNamesQuery= em.createNamedQuery("UPDATE SYS.SYSTABLES SET TABLENAME='NEW_TABLE_NAME' WHERE TABLETYPE='T'");
em.executeUpdate();
// ... rest of the function's work
My questions are :
Is this syntax correct?
Will it work?
Is there any other alternative?
Should I just use SYS.SYSTABLES, find all the tables that have 'T' as their table type, and alter their names there? Will that change the name used to access them?
I think you're looking for the RENAME TABLE statement: http://db.apache.org/derby/docs/10.10/ref/rrefsqljrenametablestatement.html
Don't just issue UPDATE statements against the system catalogs; you will corrupt your database.
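If you want to run that RENAME TABLE statement from JPA code rather than from the ij prompt, a native query is the usual route; createNamedQuery() only looks up queries that were previously registered under a name, while raw SQL/DDL goes through createNativeQuery(). A minimal sketch, assuming an open EntityManager, an active transaction, and a hypothetical userPrefix variable that your application has already validated:
import javax.persistence.EntityManager;
import javax.persistence.Query;

// Sketch only: whether DDL can be issued through a native query is
// provider-dependent, so verify it against your JPA provider and Derby version.
// userPrefix must not contain untrusted input, since it is concatenated into DDL.
Query rename = em.createNativeQuery(
        "RENAME TABLE SCHEMA.EMP_ACT TO " + userPrefix + "_EMP_ACT");
rename.executeUpdate();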
I am performing a call to a function which is part of a DB package. This package is deployed in two locations. One local and another remote (across the Atlantic).
I am retrieving the data via the Spring JDBC template.
There is one function which returns approximately 1000 rows (not all that much) and this is taking about 1.5 seconds when getting the data locally but it's taking in the region of 12 seconds when getting the data remotely.
In all sample code, names have been changed and code has been simplified a little.
Please see an example of the current Java code:
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(getDataSource())
.withSchemaName(MY_SCHEMA_NAME)
.withCatalogName("REFCURSOR_PKG")
.withFunctionName("GET_DATA")
.returningResultSet("RESULT_SET", new DataEntryMapper());
SqlParameterSource params = new MapSqlParameterSource()
.addValue("the_name", name)
.addValue("the_rev", rev);
Map<String, Object> resultSet = simpleJdbcCall.execute(params);
List<DataEntry> list = (List<DataEntry>) resultSet.get("RESULT_SET");
The RowMapper class looks something like this:
class DataEntryMapper implements RowMapper<DataEntry> {
    public DataEntry mapRow(ResultSet resultSet, int rowNum) throws SQLException {
        return new DataEntry(resultSet.getString("name"),
                Integer.parseInt(resultSet.getString("rev")));
    }
}
SQL package spec snippet:
TYPE REF_CURSOR IS REF CURSOR;
SQL function:
FUNCTION GET_ROUTE_DATA(the_name VARCHAR2, the_rev VARCHAR2) RETURN REF_CURSOR AS
RESULT_SET REF_CURSOR;
BEGIN
OPEN RESULT_SET FOR
select *
from table_name tn
where tn.name = the_name
and tn.rev = the_rev;
RETURN RESULT_SET;
CLOSE RESULT_SET;
EXCEPTION WHEN OTHERS THEN
RAISE;
END GET_ROUTE_DATA;
I have also tried regular boilerplate JDBC (create connection, prepare statement, execute statement, retrieve data from the RESULT_SET, etc.) and I found that the vast majority of the time was spent looping over the RESULT_SET and extracting the data out of it into some POJOs. In the case of the Spring code above, most of the time was spent during the execute() method, but this is probably because it creates the objects using the RowMapper at that time.
So the common area between them is performing calls such as:
rs.getString("name")
and I'm guessing that this is where the problem lies but I could be wrong.
As I said, locally the delay is fine but remotely it's taking way too long. Is this because it's going to the DB on every rs.get... ? Is there a better way to do this?
Thanks in advance.
rs.getString("name")
ResultSet.get*(String columnName) can be replaced with ResultSet.get*(int columnIndex), which is slightly faster, but I doubt that's the main problem here.
Is this because it's going to the DB on every rs.get... ?
While it really depends on the driver, I suspect it won't. For a cached result set it might go to the server as you scroll through the cursor, but it would still fetch a batch of rows on every round trip.
Two more suggestions I have are:
Use a network sniffing utility to see the data being transferred
Check your driver for any options to prefetch rows, set the fetch size, and the like.
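On the fetch-size point: with Spring you can often set it once on the underlying JdbcTemplate instead of on each statement, so a remote call does not pay a network round trip for every handful of rows. A minimal sketch reusing the names from the question; the value 500 is only an illustration, and whether a ref cursor result set honours the parent statement's fetch size is driver-specific (Oracle also has a defaultRowPrefetch connection property), so measure it:
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

// Sketch: raise the number of rows fetched per round trip.
JdbcTemplate template = new JdbcTemplate(getDataSource());
template.setFetchSize(500); // tune against your row size and network latency

SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(template)
        .withSchemaName(MY_SCHEMA_NAME)
        .withCatalogName("REFCURSOR_PKG")
        .withFunctionName("GET_DATA")
        .returningResultSet("RESULT_SET", new DataEntryMapper());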
Add this call:
.withoutProcedureColumnMetaDataAccess()
to the existing configuration, i.e.:
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(getDataSource())
.withSchemaName(MY_SCHEMA_NAME)
.withCatalogName("REFCURSOR_PKG")
.withFunctionName("GET_DATA")
.withoutProcedureColumnMetaDataAccess() // to avoid fetching procedure metadata from the database
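Note that skipping the metadata lookup usually means declaring the parameters yourself. A hedged sketch of what that could look like for this ref-cursor function, assuming the Oracle driver is on the classpath for OracleTypes.CURSOR:
import java.sql.Types;
import oracle.jdbc.OracleTypes;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

// Sketch only: parameter names must match what the PL/SQL function expects.
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(getDataSource())
        .withSchemaName(MY_SCHEMA_NAME)
        .withCatalogName("REFCURSOR_PKG")
        .withFunctionName("GET_DATA")
        .withoutProcedureColumnMetaDataAccess()
        .declareParameters(
                new SqlOutParameter("RESULT_SET", OracleTypes.CURSOR, new DataEntryMapper()),
                new SqlParameter("the_name", Types.VARCHAR),
                new SqlParameter("the_rev", Types.VARCHAR));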
I want to insert exchange.body into a database table for one of the conditions in my route.
Is there any example/tutorial of camel-jdbc component to insert message body?
Can I import the SQL statement itself and pass exchange.body to it?
I looked at http://camel.apache.org/jdbc.html example, but could not understand it.
The Spring example there is confusing to me. I didn't get why it sets the body to an SQL query and then also imports some query from the classpath. (There is no insert query example mentioned there.)
If you want to insert using the same statement (changing the parameters only) - use SQL component.
If you want to insert using arbitrary SQL statement into the component - use JDBC component.
SQL component usage:
from("direct:start").to("sql:insert into table foo (c1, c1) values ('#','#')");
import com.google.common.collect.Lists;
producerTemplate.sendBody("direct:start", Lists.newArrayList("value1","value2"));
JDBC component usage:
from("direct:start").to("jdbc:dataSource");
producerTemplate.sendBody("direct:start", "insert into table foo (c1, c1) values ('value1','value2')");
You probably need to restructure your payload before inserting it anyway, so it should be no problem to add a transformation, using whatever method in Camel you prefer, that sets the body to the appropriate INSERT statement.
The important thing is what kind of payload structure your incoming message has. In the basic case, where it's a string, it should be fairly simple:
// In a Java bean/processor before the JDBC endpoint.
// Update: make sure to sanitize the payload from SQL injections if it contains user inputs or external data not generated by trusted sources.
exchange.getIn().setBody("INSERT INTO MYTABLE VALUES('" + exchange.getIn().getBody(String.class) + "', 'fixedValue', 1.0, 42)");
If your message contains complex data structures, this code will of course be more involved, but it's pretty much the same way a regular application would generate SQL queries.
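For instance, if the body is a Map of column values, a small Processor placed before the jdbc endpoint could build the statement. A hedged sketch (the table and column names are made up, and as noted above the values must come from trusted or sanitized sources):
import java.util.Map;
import org.apache.camel.Exchange;
import org.apache.camel.Processor;

// Sketch: turn a Map payload into an INSERT statement for the jdbc: endpoint.
public class InsertStatementBuilder implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
        @SuppressWarnings("unchecked")
        Map<String, Object> row = exchange.getIn().getBody(Map.class);
        // Illustrative table/columns only; values must come from trusted sources.
        String sql = String.format("INSERT INTO MYTABLE (NAME, REV) VALUES ('%s', %s)",
                row.get("name"), row.get("rev"));
        exchange.getIn().setBody(sql);
    }
}
It would then be wired in before the endpoint, e.g. from("direct:start").process(new InsertStatementBuilder()).to("jdbc:dataSource").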
The classpath example you are referring to
<jdbc:embedded-database id="testdb" type="DERBY">
<jdbc:script location="classpath:sql/init.sql"/>
</jdbc:embedded-database>
simply shows how to test the JDBC component by starting an embedded database server (Apache Derby) and populating it with some initial data (the sql/init.sql file). This part is not really part of the core jdbc component; it is just in the documentation so you can get a sample up and running without needing to configure a DB server and set up the JDBC connection properties.
That said, you might want to use the SQL component for more complex scenarios.
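For example, newer versions of the Camel SQL component also support named parameters, which avoids building the statement string by hand. A hedged sketch (the exact syntax may vary by Camel version; the parameters are resolved from a Map body or from headers):
import java.util.HashMap;
import java.util.Map;

// Sketch: :#name and :#value are looked up in the Map payload.
from("direct:start")
    .to("sql:insert into foo (c1, c2) values (:#name, :#value)");

Map<String, Object> row = new HashMap<>();
row.put("name", "value1");
row.put("value", "value2");
producerTemplate.sendBody("direct:start", row);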
I've started putting together a Java web app for online user interaction and decided to use a MySQL DB for data storage. I have already created the tables with the proper/expected data types. My question: I always thought the next step would be to create stored procedures like Search/Add/Delete/etc. that the user could invoke from the page, so in my Java code I could just call the procedure, e.g.:
try {
    // Call a procedure with a single INOUT parameter
    CallableStatement cs = connection.prepareCall("{call SearchIt(?)}");
    cs.registerOutParameter(1, Types.VARCHAR);
    cs.setString(1, "a string");
    cs.execute();
    String outParam = cs.getString(1); // OUT parameter
} catch (SQLException e) {
    e.printStackTrace();
}
But what if my application doesn't need stored procedures, because the user actions are simple enough to be handled by plain queries? How could I set up my Java and SQL code to handle that? Could I just have the SELECT or UPDATE statements in my code to manipulate the data in my MySQL DB? If so, what would that syntax look like?
This URL has documentation on using prepared statements, which are what you want to use to avoid security flaws (SQL injection and such):
http://download.oracle.com/javase/tutorial/jdbc/basics/prepared.html
Here's an example from that page:
PreparedStatement updateSales = connection.prepareStatement(
"UPDATE COFFEES SET SALES = ? WHERE COF_NAME LIKE ? ");
updateSales.setInt(1, 75);
updateSales.setString(2, "Colombian");
updateSales.executeUpdate();
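Since the question also mentions searching, here is a hedged sketch of the same pattern for a SELECT, including reading the ResultSet (the table and column names are made up):
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch: a parameterized search query instead of a SearchIt stored procedure.
String sql = "SELECT id, name FROM users WHERE name LIKE ?";
try (PreparedStatement search = connection.prepareStatement(sql)) {
    search.setString(1, "a string%");
    try (ResultSet rs = search.executeQuery()) {
        while (rs.next()) {
            System.out.println(rs.getInt("id") + " " + rs.getString("name"));
        }
    }
} catch (SQLException e) {
    e.printStackTrace();
}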
Just use Statement, or PreparedStatement.
http://download.oracle.com/javase/1.4.2/docs/api/java/sql/Statement.html
In a similar way to what you did, just call:
Statement stm = connection.createStatement();
then execute your SQL:
stm.execute("SELECT * FROM MYTABLE");
grab the resultset and check out the results.
Beware though - this is bad, bad as far as security goes - as others have mentioned, PreparedStatements are a bit more secure, but still not 100%.
To be honest, although basic JDBC is pretty simple, I really hate all the SQL strings littered around your code. If you want something a bit more elegant, have a quick look at Hibernate - it hides all the hackiness from you, and is also pretty easy to set up.
I want to build an SQL string to do database manipulation (updates, deletes, inserts, selects, that sort of thing) - instead of the awful string-concatenation method with millions of "+"s and quotes, which is unreadable at best - there must be a better way.
I did think of using MessageFormat - but it's supposed to be used for user messages, although I think it would do a reasonable job - but I guess there should be something more aligned to SQL-type operations in the Java SQL libraries.
Would Groovy be any good?
First of all consider using query parameters in prepared statements:
PreparedStatement stm = c.prepareStatement("UPDATE user_table SET name=? WHERE id=?");
stm.setString(1, "the name");
stm.setInt(2, 345);
stm.executeUpdate();
The other thing that can be done is to keep all queries in a properties file. For example, in a queries.properties file you can place the above query:
update_query=UPDATE user_table SET name=? WHERE id=?
Then with the help of a simple utility class:
import java.io.IOException;
import java.io.InputStream;
import java.sql.SQLException;
import java.util.Properties;

public class Queries {

    private static final String propFileName = "queries.properties";
    private static Properties props;

    public static Properties getQueries() throws SQLException {
        // Lazily load the file once (simple singleton).
        if (props == null) {
            InputStream is = Queries.class.getResourceAsStream("/" + propFileName);
            if (is == null) {
                throw new SQLException("Unable to load property file: " + propFileName);
            }
            props = new Properties();
            try {
                props.load(is);
            } catch (IOException e) {
                throw new SQLException("Unable to load property file: " + propFileName
                        + "\n" + e.getMessage());
            }
        }
        return props;
    }

    public static String getQuery(String query) throws SQLException {
        return getQueries().getProperty(query);
    }
}
You can then use your queries as follows:
PreparedStatement stm = c.prepareStatement(Queries.getQuery("update_query"));
This is a rather simple solution, but works well.
For arbitrary SQL, use jOOQ. jOOQ currently supports SELECT, INSERT, UPDATE, DELETE, TRUNCATE, and MERGE. You can create SQL like this:
String sql1 = DSL.using(SQLDialect.MYSQL)
.select(A, B, C)
.from(MY_TABLE)
.where(A.equal(5))
.and(B.greaterThan(8))
.getSQL();
String sql2 = DSL.using(SQLDialect.MYSQL)
.insertInto(MY_TABLE)
.values(A, 1)
.values(B, 2)
.getSQL();
String sql3 = DSL.using(SQLDialect.MYSQL)
.update(MY_TABLE)
.set(A, 1)
.set(B, 2)
.where(C.greaterThan(5))
.getSQL();
Instead of obtaining the SQL string, you could also just execute it, using jOOQ. See
http://www.jooq.org
(Disclaimer: I work for the company behind jOOQ)
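A hedged sketch of executing directly instead of only generating the string, assuming a plain JDBC Connection named connection and the same A, B, C and MY_TABLE definitions as above:
import java.sql.Connection;
import org.jooq.Result;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;

// Sketch: run the query and fetch the records in one go.
Result<?> result = DSL.using(connection, SQLDialect.MYSQL)
        .select(A, B, C)
        .from(MY_TABLE)
        .where(A.equal(5))
        .and(B.greaterThan(8))
        .fetch();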
One technology you should consider is SQLJ - a way to embed SQL statements directly in Java. As a simple example, you might have the following in a file called TestQueries.sqlj:
public class TestQueries
{
public String getUsername(int id)
{
String username;
#sql
{
select username into :username
from users
where pkey = :id
};
return username;
}
}
There is an additional precompile step which takes your .sqlj files and translates them into pure Java - in short, it looks for the special blocks delimited with
#sql
{
...
}
and turns them into JDBC calls. There are several key benefits to using SQLJ:
completely abstracts away the JDBC layer - programmers only need to think about Java and SQL
the translator can be made to check your queries for syntax etc. against the database at compile time
ability to directly bind Java variables in queries using the ":" prefix
There are implementations of the translator around for most of the major database vendors, so you should be able to find everything you need easily.
I am wondering if you are after something like Squiggle (GitHub). Also something very useful is jDBI. It won't help you with the queries though.
I would have a look at Spring JDBC. I use it whenever I need to execute SQL programmatically. Example:
int countOfActorsNamedJoe
= jdbcTemplate.queryForInt("select count(0) from t_actors where first_name = ?", new Object[]{"Joe"});
It's really great for any kind of SQL execution, especially querying; it will help you map result sets to objects without adding the complexity of a complete ORM.
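For instance, a hedged sketch of mapping rows to objects with a RowMapper lambda (the Actor class and the column names are assumptions, not part of the original example):
import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;

// Sketch: map each row of the result set to a small value object.
List<Actor> actors = jdbcTemplate.query(
        "select first_name, last_name from t_actors where first_name = ?",
        (rs, rowNum) -> new Actor(rs.getString("first_name"), rs.getString("last_name")),
        "Joe");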
I tend to use Spring's named JDBC parameters so I can write a standard string like "select * from blah where colX = :someValue"; I think that's pretty readable.
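A hedged sketch of that approach with NamedParameterJdbcTemplate (the table and column names are made up):
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;

// Sketch: named parameters keep the SQL readable and still bind safely.
NamedParameterJdbcTemplate template = new NamedParameterJdbcTemplate(dataSource);
Integer count = template.queryForObject(
        "select count(*) from blah where colX = :someValue",
        new MapSqlParameterSource("someValue", "expectedValue"),
        Integer.class);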
An alternative would be to supply the string in a separate .sql file and read the contents in using a utility method.
Oh, also worth having a look at Squill: https://squill.dev.java.net/docs/tutorial.html
I second the recommendations for using an ORM like Hibernate. However, there are certainly situations where that doesn't work, so I'll take this opportunity to tout some stuff that I've helped to write: SqlBuilder is a Java library for dynamically building SQL statements using the "builder" style. It's fairly powerful and fairly flexible.
I have been working on a Java servlet application that needs to construct very dynamic SQL statements for ad hoc reporting purposes. The basic function of the app is to feed a bunch of named HTTP request parameters into a pre-coded query and generate a nicely formatted table of output. I used Spring MVC and its dependency injection framework to store all of my SQL queries in XML files and load them into the reporting application, along with the table formatting information. Eventually, the reporting requirements became more complicated than the capabilities of the existing parameter mapping frameworks and I had to write my own. It was an interesting exercise in development and produced a framework for parameter mapping much more robust than anything else I could find.
The new parameter mappings looked as such:
select app.name as "App",
${optional(" app.owner as "Owner", "):showOwner}
sv.name as "Server", sum(act.trans_ct) as "Trans"
from activity_records act, servers sv, applications app
where act.server_id = sv.id
and act.app_id = app.id
and sv.id = ${integer(0,50):serverId}
and app.id in ${integerList(50):appId}
group by app.name, ${optional(" app.owner, "):showOwner} sv.name
order by app.name, sv.name
The beauty of the resulting framework was that it could process HTTP request parameters directly into the query with proper type checking and limit checking. No extra mappings required for input validation. In the example query above, the parameter named serverId
would be checked to make sure it could be cast to an integer and was in the range 0-50. The parameter appId would be processed as an array of integers, with a length limit of 50. If the field showOwner is present and set to "true", the bits of SQL in the quotes will be added to the generated query for the optional field mappings. Several more parameter type mappings are available, including optional segments of SQL with further parameter mappings. It allows for as complex a query mapping as the developer can come up with. It even has controls in the report configuration to determine whether a given query will have the final mappings applied via a PreparedStatement or simply run as a pre-built query.
For the sample HTTP request values:
showOwner: true
serverId: 20
appId: 1,2,3,5,7,11,13
It would produce the following SQL:
select app.name as "App",
app.owner as "Owner",
sv.name as "Server", sum(act.trans_ct) as "Trans"
from activity_records act, servers sv, applications app
where act.server_id = sv.id
and act.app_id = app.id
and sv.id = 20
and app.id in (1,2,3,5,7,11,13)
group by app.name, app.owner, sv.name
order by app.name, sv.name
I really think that Spring or Hibernate or one of those frameworks should offer a more robust mapping mechanism that verifies types, allows for complex data types like arrays, and offers other such features. I wrote my engine for my own purposes only; it isn't quite ready for general release. It only works with Oracle queries at the moment, and all of the code belongs to a big corporation. Someday I may take my ideas and build a new open source framework, but I'm hoping one of the existing big players will take up the challenge.
Why do you want to generate all the SQL by hand? Have you looked at an ORM like Hibernate? Depending on your project, it will probably do at least 95% of what you need, do it in a cleaner way than raw SQL, and if you need to get the last bit of performance you can hand-tune the SQL queries that need it.
You can also have a look at MyBatis (www.mybatis.org). It helps you write SQL statements outside your Java code and maps the SQL results into your Java objects, among other things.
Google provides a library called the Room Persistence Library, which provides a very clean way of writing SQL for Android apps, basically an abstraction layer over the underlying SQLite database. Below is a short code snippet from the official website:
@Dao
public interface UserDao {
    @Query("SELECT * FROM user")
    List<User> getAll();

    @Query("SELECT * FROM user WHERE uid IN (:userIds)")
    List<User> loadAllByIds(int[] userIds);

    @Query("SELECT * FROM user WHERE first_name LIKE :first AND "
            + "last_name LIKE :last LIMIT 1")
    User findByName(String first, String last);

    @Insert
    void insertAll(User... users);

    @Delete
    void delete(User user);
}
There are more examples and better documentation in the official docs for the library.
There is also one called MentaBean, which is a Java ORM. It has nice features and seems to be a pretty simple way of writing SQL.
Read an XML file.
You can read the SQL from an XML file. It's easy to maintain and work with.
There are standard StAX, DOM, and SAX parsers available to make it a few lines of code in Java.
Do more with attributes
You can have some semantic information with attributes on the tag to help do more with the SQL. This can be the method name or query type or anything that helps you code less.
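A hedged sketch of what that might look like, assuming a made-up queries.xml on the classpath with <query name="..." type="...">SQL text</query> entries, parsed with the standard DOM API:
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch: load the query elements into a name-to-SQL map.
public class XmlQueryLoader {
    public static Map<String, String> load(String resource) throws Exception {
        Map<String, String> queries = new HashMap<>();
        try (InputStream is = XmlQueryLoader.class.getResourceAsStream(resource)) {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(is);
            NodeList nodes = doc.getElementsByTagName("query");
            for (int i = 0; i < nodes.getLength(); i++) {
                Element q = (Element) nodes.item(i);
                // The "name" attribute is the lookup key; "type" could drive extra handling.
                queries.put(q.getAttribute("name"), q.getTextContent().trim());
            }
        }
        return queries;
    }
}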
Maintenance
You can put the XML outside the JAR and easily maintain it. Same benefits as a properties file.
Conversion
XML is extensible and easily convertible to other formats.
Use Case
Metamug uses XML to configure REST resource files with SQL.
If you put the SQL strings in a properties file and then read them in, you can keep the SQL in a plain text file.
That doesn't solve the SQL type issues, but at least it makes copying and pasting from TOAD or sqlplus much easier.
Where does the string concatenation come from, aside from long SQL strings in PreparedStatements (which you could easily keep in a text file and load as a resource anyway) that you break over several lines?
You aren't creating SQL strings by concatenating data directly, are you? That's one of the biggest no-nos in programming. Please use PreparedStatements and supply the data as parameters. It vastly reduces the chance of SQL injection.