The DB tables that I am using change very often: new columns can be added, which affects my SQL statements.
The solution I was thinking of is to first read the metadata into a map and then use it to retrieve the values, something like this.
Reading the metadata:
public class Dynamic {
    public static final Map<Integer, String> metadata = initMetaData();
    private final Map<String, String> data = new HashMap<String, String>();

    private static Map<Integer, String> initMetaData() {
        Map<Integer, String> tmpMap = new HashMap<Integer, String>();
        try {
            Connection connection = DBConnection.getConnection();
            try {
                Statement stmt = connection.createStatement();
                ResultSet result = stmt.executeQuery("SELECT * FROM TMP WHERE ROWNUM = 1");
                ResultSetMetaData rsmd = result.getMetaData();
                for (int i = 1; i <= rsmd.getColumnCount(); i++) {
                    tmpMap.put(i, rsmd.getColumnName(i));
                }
            } finally {
                connection.close();
            }
        } catch (SQLException ex) {
            // ...
        }
        return Collections.unmodifiableMap(tmpMap);
    }

    public static String getColumnName(Integer index) {
        return metadata.get(index);
    }

    public void setData(String column, String value) {
        data.put(column, value);
    }
}
And when running the SQL:

public static void test() {
    try {
        Connection connection = DBConnection.getConnection();
        try {
            Statement stmt = connection.createStatement();
            ResultSet result = stmt.executeQuery("SELECT * FROM TMP WHERE idx = 'R5'");
            while (result.next()) {
                Dynamic d = new Dynamic();
                for (int i = 1; i <= Dynamic.metadata.size(); i++) {
                    d.setData(Dynamic.getColumnName(i), result.getString(Dynamic.getColumnName(i)));
                }
            }
        } finally {
            connection.close();
        }
    } catch (SQLException ex) {
        // ...
    }
}
With this approach I have two problems (that I have noticed):
1) I need to execute two loops.
2) I don't know which getter to call on the ResultSet, since the column types can also change.
How can I overcome these problems?
I would also appreciate any other suggestions; maybe there is a simpler way.
Thanks
1) What is your alternative to the inner loop? How else would you get the field values? Do you really think there is much overhead in looping over a small number of integers?
2) You can get the field's data type from the same metadata that gives you the field name, and map the getter methods accordingly (see the sketch after this list).
3) You could also build a single map covering multiple tables - table/fieldSeq/fieldType/methodName and maybe a few extra details - so you don't have to fetch it all dynamically every time.
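For point 2, a minimal sketch of what this could look like (the DynamicTyped name, the LinkedHashMap, and the particular type-to-getter mapping are illustrative choices, not anything from the question); the column type from the metadata decides which ResultSet getter is used, with getObject as the fallback:

import java.sql.*;
import java.util.*;

public class DynamicTyped {
    // Reads the current row into a name -> value map, choosing the getter from the column type.
    public static Map<String, Object> readRow(ResultSet rs) throws SQLException {
        ResultSetMetaData md = rs.getMetaData();
        Map<String, Object> row = new LinkedHashMap<String, Object>();
        for (int i = 1; i <= md.getColumnCount(); i++) {
            String name = md.getColumnName(i);
            switch (md.getColumnType(i)) {
                case Types.INTEGER:
                    row.put(name, rs.getInt(i));
                    break;
                case Types.BIGINT:
                    row.put(name, rs.getLong(i));
                    break;
                case Types.VARCHAR:
                case Types.CHAR:
                    row.put(name, rs.getString(i));
                    break;
                default:
                    // getObject lets the JDBC driver pick a suitable Java type
                    row.put(name, rs.getObject(i));
            }
        }
        return row;
    }
}

In practice getObject alone is often enough, so the switch is only worth the trouble if you need specific Java types.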
I have a problem with loading objects from a SQLite database.
First of all, this is my table definition:
CREATE TABLE MyTable (
rowid INTEGER PRIMARY KEY,
data BLOB
);
This is the simple class which I want to store and reload:
public class MyHashMap extends HashMap<String, Integer> {
private static final long serialVersionUID = 0L;
}
Then I fill the map with some data and store it in the database with an SQL INSERT statement. Everything works fine, and if I execute a SELECT (with the sqlite3 command-line client) I see the correct information.
Now I'm using the java.sql package to load the object:
String sql = "SELECT data FROM MyTable WHERE rowid = 1";
MyHashMap map = null;
try {
try (Statement stmt = db.createStatement()) {
try (ResultSet rs = stmt.executeQuery(sql)) {
if (rs.next()) {
map = rs.getObject("data", MyHashMap.class);
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
There's no exception thrown but my map variable is null. I debugged the program and I can say that the getObject method is called as expected.
First, your definition of MyHashMap is incorrect:
public class MyHashMap extends HashMap<Integer, String> {
private static final long serialVersionUID = 0L;
}
The main issue, though, is that SQL doesn't store Java objects; it merely stores rows of records, which consist of fields. You need to read these records one by one, and store them in your map. Roughly as follows:
MyHashMap map = new MyHashMap();
final String sql = "SELECT rowid, data FROM MyTable";
try (final Statement stmt = connection.createStatement();
final ResultSet rs = stmt.executeQuery(sql)) {
while (rs.next()) {
map.put(rs.getInt(1), rs.getString(2));
}
}
Please note that there's a good chance that reading a Blob into a String will fail. Usually, JDBC drivers are clever enough to convert data types, but if you have raw binary data in your blob, you cannot read it into a string. You would need the getBlob method instead, and deal with the resulting object. But I can't tell from your code what you'll be storing in that blob.
Ok, I found a solution with the following method:
private Object getObjectFromBlob(ResultSet rs, String columnName)
throws ClassNotFoundException, IOException, SQLException {
InputStream binaryInput = rs.getBinaryStream(columnName);
if (binaryInput == null || binaryInput.available() == 0) {
return null;
}
ObjectInputStream in = new ObjectInputStream(binaryInput);
try {
return in.readObject();
} finally {
in.close();
}
}
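For completeness, here is a minimal sketch of the matching write side, assuming the map is stored as a standard serialized Java object in the data BLOB (table and column names as in the question):

private void storeMap(Connection db, int rowid, MyHashMap map)
        throws IOException, SQLException {
    // Serialize the map into a byte array...
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
        out.writeObject(map); // MyHashMap is Serializable via HashMap
    }
    // ...and store those bytes in the BLOB column.
    String sql = "INSERT INTO MyTable (rowid, data) VALUES (?, ?)";
    try (PreparedStatement ps = db.prepareStatement(sql)) {
        ps.setInt(1, rowid);
        ps.setBytes(2, bytes.toByteArray());
        ps.executeUpdate();
    }
}

The object read back by getObjectFromBlob can then simply be cast: MyHashMap map = (MyHashMap) getObjectFromBlob(rs, "data");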
I am trying to create a league table from key-value pairs in a Derby database. The rows in the database have only two columns: TeamName and Goals.
I need to get these into my GUI class so I can show the keys and values as JLabels in the league table, ordered top-down by total goals descending.
From what I have read, LinkedHashMap and TreeSet should both be able to assist me.
Code I have so far:
public TreeMap<String, Integer> viewTeams() {
    TreeMap<String, Integer> teamData = new TreeMap<String, Integer>();
    String viewTeams = "SELECT * FROM HUI.TEAM";
    connectToDatabase(dbName);
    try {
        stmt = dbConnection.createStatement();
        rs = stmt.executeQuery(viewTeams);
    } catch (SQLException error) {
        System.err.println("Error querying database for teams: " + error.toString());
    }
    try {
        while (rs.next()) {
            teamData.put(rs.getString("TEAMNAME"), rs.getInt("GOALSSCORED"));
        }
    } catch (SQLException error) {
        System.err.println("Error adding teams to HashMap: " + error.toString());
    }
    return teamData;
}
That viewTeams method is in the TeamDB class.
public void updateLeagueTable(){
TeamDB tdb = new TeamDB("FootManDatabase");
TreeMap teamData = tdb.viewTeams(); // Do I need this new TreeMap?
// How do I iterate through the pairs in descending order?
}
That updateLeagueTable method is in the GUI class.
I would just let the SQL database do the sorting, as that's one thing it does well.
Also, you can't use a TreeMap<String, Integer>, as that would iterate by team name instead of goals. And you can't use a TreeMap<Integer, String> either, because it doesn't allow duplicate keys, so you wouldn't be able to have two teams with the same number of goals scored. Instead, I would use a list. Something like the following (but remember to close your resources properly, which I've left out for clarity):
class TeamScore {
    private final String name;
    private final int numGoals;

    public TeamScore(String name, int numGoals) {
        this.name = name;
        this.numGoals = numGoals;
    }
    // getters...
}

public List<TeamScore> viewTeams() throws SQLException {
    List<TeamScore> teamData = new ArrayList<>();
    String viewTeams = "SELECT * FROM HUI.TEAM ORDER BY GOALSSCORED DESC";
    connectToDatabase(dbName);
    stmt = dbConnection.createStatement();
    rs = stmt.executeQuery(viewTeams);
    while (rs.next()) {
        teamData.add(new TeamScore(rs.getString("TEAMNAME"), rs.getInt("GOALSSCORED")));
    }
    return teamData;
}
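A minimal sketch of how updateLeagueTable could then consume that list in the GUI class; the leaguePanel field, its layout, and the TeamScore getter names are assumptions on my part:

public void updateLeagueTable() throws SQLException {
    TeamDB tdb = new TeamDB("FootManDatabase");
    List<TeamScore> teamData = tdb.viewTeams(); // already sorted by the ORDER BY clause
    leaguePanel.removeAll(); // assumed JPanel with a two-column GridLayout
    for (TeamScore score : teamData) {
        leaguePanel.add(new JLabel(score.getName()));
        leaguePanel.add(new JLabel(String.valueOf(score.getNumGoals())));
    }
    leaguePanel.revalidate();
    leaguePanel.repaint();
}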
I am populating two tables, which have a 1-many relationship.
So I insert a line in outer, get the (autoincrement primary key) id for that line, and then insert 100 lines into inner (all with a foreign key pointing to outer.id).
Then I repeat this 50 times: for every entry in outer I have to insert it, read its id, and then insert into inner.
This is slow. Most of the time is spent in loading the 100 lines into inner. I suspect it would be much faster if I could insert all 50*100 lines into inner in one batch operation. But I cannot see how to do that - how can I make the foreign keys work?
How do other people make this efficient?
I am using Java / Spring. The 100 lines are inserted with a JdbcTemplate.batchUpdate().
public final void insert(final JdbcTemplate db,
final Iterable<DataBlock> data) {
String insertSql = getInsertSql();
String idQuery = getIdQuery();
ItemRowMapper.IdRowMapper mapper = new ItemRowMapper.IdRowMapper();
for (DataBlock block: data) {
Object[] outer = block.getOuter();
LOG.trace("Loading outer");
db.update(insertSql, outer);
LOG.trace("Getting index");
// currently retrieve index based on natural key, but could use last index
int id = db.query(idQuery, mapper, uniqueData(outer)).get(0);
LOG.trace("Getting inner");
List<Object[]> inner = block.getInner(id);
// most time spent here
LOG.trace(format("Loading inner (%d)", inner.size()));
innerTable.insert(db, inner);
}
}
And pseudo-SQL:
create table outer (
integer id primary key autoincrement,
...
);
create table inner (
integer outer references outer(id),
...
);
Update - The following appears to work with Spring 3.1.1 and Postgres 9.2-1003.jdbc4.
/**
 * An alternative implementation that should be faster, since it inserts
 * in just two batches (one for inner and one for outer).
 *
 * @param db A connection to the database.
 * @param data The data to insert.
 */
public final void insertBatchier(final JdbcTemplate db,
final AllDataBlocks data) {
final List<Object[]> outers = data.getOuter();
List<Integer> ids = db.execute(
new PreparedStatementCreator() {
@Override
public PreparedStatement createPreparedStatement(
final Connection con) throws SQLException {
return con.prepareStatement(getInsertSql(),
Statement.RETURN_GENERATED_KEYS);
}
},
new PreparedStatementCallback<List<Integer>>() {
@Override
public List<Integer> doInPreparedStatement(final PreparedStatement ps)
throws SQLException {
for (Object[] outer: outers) {
for (int i = 0; i < outer.length; ++i) {
setParameterValue(ps, i + 1,
SqlTypeValue.TYPE_UNKNOWN, outer[i]);
}
ps.addBatch();
}
ps.executeBatch();
RowMapperResultSetExtractor<Integer> ids =
new RowMapperResultSetExtractor<Integer>(
new ItemRowMapper.IdRowMapper());
try (ResultSet keys = ps.getGeneratedKeys()) {
return ids.extractData(keys);
}
}
});
innerTable.insert(db, data.getInner(ids));
}
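For reference, innerTable.insert can itself be a single batch; a minimal sketch of what it might look like with JdbcTemplate.batchUpdate (the SQL and the "value" column name are placeholders, since the real columns are elided above):

// Sketch: insert all inner rows in one JDBC batch.
// Each Object[] is expected to carry the generated outer id as its first element.
public void insert(final JdbcTemplate db, final List<Object[]> rows) {
    db.batchUpdate("INSERT INTO inner (outer, value) VALUES (?, ?)", rows);
}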
I'm not as familiar with JdbcTemplate, but assuming it behaves like plain JDBC, I would do it with something like the following code (which I would probably break into multiple methods):
private static final int BATCH_SIZE = 50;

public void addBatch(Connection connection, List<Outer> outers) throws SQLException {
    PreparedStatement outerInsertStatement =
            connection.prepareStatement("...", Statement.RETURN_GENERATED_KEYS);
    PreparedStatement innerInsertStatement =
            connection.prepareStatement("...");
    List<Integer> outerIds = new ArrayList<Integer>();

    for (Outer outer : outers) {
        outerInsertStatement.setObject(...);
        ...
        outerInsertStatement.setObject(...);
        outerInsertStatement.addBatch();
    }
    outerInsertStatement.executeBatch();

    // Note: retrieving keys for a whole batch requires JDBC3 and driver support
    ResultSet primaryKeys = outerInsertStatement.getGeneratedKeys();
    while (primaryKeys.next()) {
        outerIds.add(primaryKeys.getInt(1)); // JDBC columns are 1-based
    }

    for (int i = 0; i < outers.size(); i++) {
        Outer outer = outers.get(i);
        Integer outerId = outerIds.get(i);
        for (Inner inner : outer.getInners()) {
            // One of these setObject calls would use outerId as the foreign key
            innerInsertStatement.setObject(...);
            ...
            innerInsertStatement.setObject(...);
            innerInsertStatement.addBatch();
        }
        // flush periodically so the batch doesn't grow without bound
        if ((i + 1) % BATCH_SIZE == 0) {
            innerInsertStatement.executeBatch();
        }
    }
    innerInsertStatement.executeBatch();
}
I retrieve the data from the database and loop through an array to display the number of likes.
public void SetUpLikeAmount() {
int likes = 0;
ArrayList <Integer> likeArray = new ArrayList <Integer>();
for (int count = 0; count < likeArray.size();count++){
// Set Up Database Source
db.setUp("IT Innovation Project");
String sql = "Select likeDislike_likes from forumLikeDislike WHERE topic_id = "
+ topicId + "";
ResultSet resultSet = null;
// Call readRequest to get the result
resultSet = db.readRequest(sql);
try {
while (resultSet.next()) {
likeArray.add(Integer.parseInt(resultSet.getString("likeDislike_likes")));
likes += likeArray.get(count);
}
resultSet.close();
} catch (Exception e) {
System.out.println(e);
}
}
jLabel_like.setText(Integer.toString(likes));
}
However, it keeps returning 0. Thanks in advance.
(As an aside, it never returns anything - you've posted a void method.)
Look at this code:
ArrayList <Integer> likeArray = new ArrayList <Integer>();
for (int count = 0; count < likeArray.size();count++){
...
}
You've just created a new ArrayList<Integer>, which will therefore have a size of 0. Therefore, the loop always completes immediately, without ever executing the body.
If you're trying to get input from a list created elsewhere, you should probably pass that into your method. (You should also use a PreparedStatement with a parameter instead of including the value directly in your SQL.)
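A minimal sketch of that last suggestion, assuming you can get at the underlying java.sql.Connection (the db helper in the question hides it) and that topic_id is a numeric column:

String sql = "SELECT likeDislike_likes FROM forumLikeDislike WHERE topic_id = ?";
int likes = 0;
try (PreparedStatement ps = connection.prepareStatement(sql)) {
    ps.setInt(1, topicId); // use setString instead if topic_id is a text column
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            likes += rs.getInt("likeDislike_likes");
        }
    }
}
jLabel_like.setText(Integer.toString(likes));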
You're iterating over the list likeArray, which is empty, so the loop body is never entered.
Here is revised code you could refer to:
public void SetUpLikeAmount() {
int likes = 0;
// Set Up Database Source
db.setUp("IT Innovation Project");
String sql = "Select likeDislike_likes from forumLikeDislike WHERE topic_id = "
+ topicId + "";
ResultSet resultSet = null;
// Call readRequest to get the result
resultSet = db.readRequest(sql);
try {
while (resultSet.next()) {
likes += Integer.parseInt(resultSet.getString("likeDislike_likes"));
}
resultSet.close();
} catch (Exception e) {
System.out.println(e);
}
jLabel_like.setText(Integer.toString(likes));
}
You probably don't need the ArrayList at all, since you are getting each value and summing it while iterating over the result set anyway.
I have to make a 'query' method for my class which accesses MySQL through JDBC.
The input parameter to the method is a full SQL command (with values included), so I don't know the names of the columns to fetch.
Some of the columns are strings, some others are integers, etc.
The method needs to return a value of type ArrayList<HashMap<String,Object>>,
where each HashMap is one row, and the ArrayList contains all rows of the result.
I'm thinking of using ResultSet.getMetaData().getColumnCount() to get the number of columns and then fetching cell by cell out of the current row, but is this the only solution? Are there any better ones?
I have the example code here, just in case anybody needs it. ('con' in the code is the standard JDBC connection.)
//query a full sql command
public static ArrayList<HashMap<String, Object>> rawQuery(String fullCommand) {
    try {
        //create statement
        Statement stm = con.createStatement();
        //query
        ResultSet result = null;
        boolean returningRows = stm.execute(fullCommand);
        if (returningRows)
            result = stm.getResultSet();
        else
            return new ArrayList<HashMap<String, Object>>();
        //get metadata
        ResultSetMetaData meta = result.getMetaData();
        //get column names
        int colCount = meta.getColumnCount();
        ArrayList<String> cols = new ArrayList<String>();
        for (int index = 1; index <= colCount; index++)
            cols.add(meta.getColumnName(index));
        //fetch out rows
        ArrayList<HashMap<String, Object>> rows =
                new ArrayList<HashMap<String, Object>>();
        while (result.next()) {
            HashMap<String, Object> row = new HashMap<String, Object>();
            for (String colName : cols) {
                Object val = result.getObject(colName);
                row.put(colName, val);
            }
            rows.add(row);
        }
        //close statement
        stm.close();
        //pass back rows
        return rows;
    } catch (Exception ex) {
        System.out.print(ex.getMessage());
        return new ArrayList<HashMap<String, Object>>();
    }
} //rawQuery
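A quick usage sketch; the table and column names here are made up purely for illustration:

ArrayList<HashMap<String, Object>> rows = rawQuery("SELECT * FROM person");
for (HashMap<String, Object> row : rows) {
    // column names are discovered from the metadata at runtime
    System.out.println(row.get("name"));
}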