Liquibase loading data from csv - java

I want to load an entire column of my PostgreSQL table with data from a CSV file, but when I do that I get an exception saying that the primary key of my table should not be null. It looks like Liquibase is creating new rows to insert the data. Is there a way to load the data into existing rows?
// Build an empty change log, a Liquibase instance around it, and one change set
DatabaseChangeLog dbChangeLog = new DatabaseChangeLog();
Liquibase liquibase = new Liquibase(dbChangeLog, new FileSystemResourceAccessor(), database);
ChangeSet loadChangeSet = new ChangeSet(id + "", "nasri", false, false, "", "", "", liquibase.getDatabaseChangeLog());

// loadData change: reads the CSV and generates INSERT statements
LoadDataChange loadDataChange = new LoadDataChange();
loadDataChange.setTableName(key);
loadDataChange.setChangeSet(loadChangeSet);
loadDataChange.setResourceAccessor(new FileSystemResourceAccessor());
String path = context.getBundle().getVersion() + "." + key + "." + columnKey + "." + targetFieldKey + ".csv";
loadDataChange.setFile(path);
loadDataChange.setSchemaName("public");

// Single column mapping from the CSV to the target field
LoadDataColumnConfig columnConfig = new LoadDataColumnConfig();
columnConfig.setName(targetFieldKey);
columnConfig.setType("String");
loadDataChange.addColumn(columnConfig);

loadChangeSet.addChange(loadDataChange);
liquibase.getDatabaseChangeLog().addChangeSet(loadChangeSet);
liquibase.update("");

There is a class called LoadUpdateDataChange.
The description says:
Loads or updates data from a CSV file into an existing table. Differs from loadData by issuing a SQL batch that checks for the existence of a record. If found, the record is UPDATEd, else the record is INSERTed. Also, generates DELETE statements for a rollback.
Looks like it should do what you are looking for (I have not used this myself though).
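If you go the LoadUpdateDataChange route from the Java API, a minimal sketch might look like the following. This is untested and reuses the variables from the snippet in the question; the primary-key column name ("id") is an assumption, and the CSV will generally need to contain that column so Liquibase can tell which row to update:
// Sketch: same setup as above, but with LoadUpdateDataChange so existing rows
// are UPDATEd instead of INSERTed when the primary key already exists.
LoadUpdateDataChange loadUpdateDataChange = new LoadUpdateDataChange();
loadUpdateDataChange.setTableName(key);
loadUpdateDataChange.setSchemaName("public");
loadUpdateDataChange.setFile(path);
loadUpdateDataChange.setResourceAccessor(new FileSystemResourceAccessor());
loadUpdateDataChange.setPrimaryKey("id"); // assumed PK column of the target table

LoadDataColumnConfig columnConfig = new LoadDataColumnConfig();
columnConfig.setName(targetFieldKey);
columnConfig.setType("String");
loadUpdateDataChange.addColumn(columnConfig);

loadChangeSet.addChange(loadUpdateDataChange);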

LoadUpdateDataChange is not working for me for updating records: it always tries to insert new records and never updates existing ones. If you want to update existing records, you can use an update change like the one below.
<update tableName="someTable">
    <column name="update_column_name" value="updated value" />
    <where>primaryKey = condition</where>
</update>

Related

Java / MySQL and JSON_SET array not working

I am using MySQL / JSON to store user data in the database. I have one userid column and a column named data. The data column is of the JSON type, and its default value is "{}".
When I update the user in the application and save it to the database using JSON_SET('column', 'name', 'value'), everything works fine. Examples:
UPDATE `users` SET `data` = JSON_SET(`data`, '$.lastname', 'lastname') WHERE `userid` = 1;
UPDATE `users` SET `data` = JSON_SET(`data`, '$.money', 120) WHERE `userid` = 1;
But when I try to set, for example, a list or hashmap, things start to get tricky. I use json-simple and Gson to parse all objects to JSON before saving them to the database, but when I do this with a list or hashmap I can't set it in the database (unless I turn the JSON into a string wrapped in single quotes).
But when I do add the quotes, deserialization doesn't see it as a list.
Hope you guys can help me out. Thanks in advance.
Here is some code:
String query = "UPDATE `users` SET `data` = JSON_SET(`data`, '$." + fieldName + "', " + this.gson.toJson(value) + ") WHERE `userid` = '" + userId + "';";
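No answer is quoted here, but one common approach (my assumption, not from the thread) is to bind the serialized JSON as a parameter and let MySQL cast it, so lists and maps are stored as real JSON arrays/objects rather than quoted strings. CAST(? AS JSON) requires MySQL 5.7+; the class and method names below are illustrative:
import com.google.gson.Gson;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class UserDataUpdater {
    private final Gson gson = new Gson();

    // Stores value (e.g. a List or Map) under $.fieldName as a real JSON value.
    // fieldName is concatenated into the path string, so it must not come from user input.
    public void setJsonField(Connection con, long userId, String fieldName, Object value)
            throws SQLException {
        String sql = "UPDATE `users` SET `data` = JSON_SET(`data`, ?, CAST(? AS JSON)) "
                   + "WHERE `userid` = ?";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, "$." + fieldName);   // JSON path, e.g. $.items
            ps.setString(2, gson.toJson(value)); // e.g. ["a","b"] for a List
            ps.setLong(3, userId);
            ps.executeUpdate();
        }
    }
}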

How to get auto generated primary key after inserting or updating record using jdbc template batch update?

I am using the Spring JDBC template for data insertion into Oracle. I have a requirement to bulk insert using the JDBC template's batch update, and I need the auto-generated primary key so I can pass it to another method, but I am not able to get that generated key when using batch update.
Can you please provide a solution?
Assuming you have an auto-generated PK in Oracle, see this sample code:
final String insertNewFieldSql = Config.getSqlProperty("insert_new_field_record");
GeneratedKeyHolder holder = new GeneratedKeyHolder();
MapSqlParameterSource parameters = null;
for (ParsedData field : fields) {
    parameters = new MapSqlParameterSource();
    parameters.addValue("FIELD_1", parsedEmail.getDbRecordId())
              .addValue("FIELD_2", field.getName());
    // The key column name tells the driver which generated column to return
    namedParameterJdbcTemplate.update(insertNewFieldSql, parameters, holder, new String[] {"PK_FIELD_ID"});
    Long newFieldKey = holder.getKey().longValue();
    logger.log(Level.FINEST, "row was added: " + newFieldKey);
}
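If the whole set of keys has to be handed to another method, one option is simply to collect them as the loop runs. This is a sketch reusing the names from the snippet above (processNewFieldKeys is a hypothetical downstream method); note it still issues one statement per row rather than a true JDBC batch, since returning generated keys from batchUpdate depends on driver and Spring version support:
// Sketch: collect every generated key so the whole list can be passed on.
// insertNewFieldSql, fields, parsedEmail and namedParameterJdbcTemplate are
// assumed to exist exactly as in the snippet above.
List<Long> generatedKeys = new ArrayList<>();
for (ParsedData field : fields) {
    GeneratedKeyHolder holder = new GeneratedKeyHolder();
    MapSqlParameterSource parameters = new MapSqlParameterSource()
            .addValue("FIELD_1", parsedEmail.getDbRecordId())
            .addValue("FIELD_2", field.getName());
    namedParameterJdbcTemplate.update(insertNewFieldSql, parameters, holder,
            new String[] {"PK_FIELD_ID"});
    generatedKeys.add(holder.getKey().longValue());
}
processNewFieldKeys(generatedKeys); // hypothetical downstream method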

Codename One SQL database storing wrong values

I am used to developing desktop applications with Java, and now I am trying Codename One to develop my first mobile app.
While trying to replicate my experience with SQL databases I am running into very odd storage behavior which I cannot explain: the database is created, but when I change the table input value, the new value is ignored and the old value is added instead. To save the new value, I have to delete the database.
I like the interface, and any kind of help would be appreciated.
Database db = Display.getInstance().openOrCreate("MyDB.db");
db.execute("CREATE TABLE IF NOT EXISTS Persons (Date NOT NULL,Event NOT NULL)");
String sql = "INSERT INTO Persons (Date, Event) " + "VALUES ('John', '10000.00');";
db.execute(sql);
// adds "John" to the database every time I click the button
// then I change the value from "John" to "James"
// (I am not adding the lines twice, just changing the input)
sql = "INSERT INTO Persons (Date, Event) " + "VALUES ('James', '10000.00');";
db.execute(sql);
// keeps adding "John" to the database, even though the value has been changed to "James"
Cursor cur = db.executeQuery("select * from Persons;");
Row currentRow = cur.getRow();
String dataText = currentRow.getString(0);
while (cur.next()) {
    System.out.println(dataText);
}
You're not fetching the next row into dataText in your while() loop, so you're just repeatedly printing out the text from the first row.
It should be:
Cursor cur = db.executeQuery("select * from Persons;");
while (cur.next()) {
    Row currentRow = cur.getRow();
    // Codename One's Row is accessed by column index; column 0 is Date
    String dataText = currentRow.getString(0);
    System.out.println(dataText);
}
If you examine the table with a separate query tool (a SQLite browser, for instance), you should see that it contains both rows.
I hope I got the syntax right. I'm not a Java programmer and I got it from a tutorial.
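As a side note, a hedged way to avoid editing the SQL string for every new value is to bind the values as parameters; this sketch assumes the execute(String, String[]) overload of com.codename1.db.Database and the same table as above:
// Sketch: insert whatever the user typed, without rebuilding the SQL string.
Database db = Display.getInstance().openOrCreate("MyDB.db");
String name = "James";    // e.g. taken from a TextField on the form
String amount = "10000.00";
db.execute("INSERT INTO Persons (Date, Event) VALUES (?, ?)",
        new String[] { name, amount });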

Creating an SQLite database and trying to add records - throwing errors

I have a webapp that I'm trying to set up with an SQLite database. At this point I have kept it very simple to build a foundation: there are only two tables, and one of them uses a foreign key constraint pointing to the other. The problem is that when I try to insert data, I always receive the error Error processing SQL: could not execute statement due to a constraint failure (19 constraint failed) -- Code: 6. Code 6, apparently, means the table is locked. How can it be locked if I can successfully insert values into it? Confused...
My code...
I set up the tables with this:
// Create a system table, if it doesn't exist
db.transaction(function(tx){
tx.executeSql('CREATE TABLE IF NOT EXISTS system(systemID TEXT PRIMARY KEY, numZones INT NULL, numHeads INT NULL)', [],nullHandler,errorHandler);
},errorHandler, successCallBack);
// Create a work order table, if it doesn't exist
db.transaction(function(tx){
tx.executeSql('CREATE TABLE IF NOT EXISTS wo(woID_id TEXT PRIMARY KEY, woType TEXT NOT NULL, systemID_fk TEXT NOT NULL, FOREIGN KEY (systemID_fk) REFERENCES system(systemID))', [],nullHandler,errorHandler);
},errorHandler, successCallBack);
Presumably I now have two tables, one having a field that points to the other table. I am pulling in a JSON feed, parsing it, and trying to put it into these two tables. Here's the code for that parsing:
function GetSystems(){
    // First we see if there are credentials stored. If not, we don't try to retrieve the work orders.
    db.transaction(function(transaction){
        transaction.executeSql('SELECT * FROM Creds;', [], function(transaction, result) {
            // If the user hasn't entered their creds yet, we create a new record, otherwise update the current one.
            if(result.rows.length != 0){
                var row;
                db.transaction(function(transaction){
                    transaction.executeSql('SELECT * FROM Creds where id=1;', [], function(transaction, result) {
                        $.getJSON(baseURL + "get-wos/?callback=?", { username:result.rows.item(0).username, password:result.rows.item(0).password }, function(data) {
                            $.each(data, function(i, obj) {
                                db.transaction(function(transaction){
                                    transaction.executeSql('INSERT INTO system(systemID, numZones, numHeads) VALUES (?, null, null)', [obj.systemID], nullHandler, errorHandler);
                                    transaction.executeSql('INSERT INTO wo (woID, woType, systemID_fk) ' +
                                        'VALUES ((SELECT systemID FROM system WHERE systemID = ' + obj.systemID + '), ?, ?)',
                                        [obj.woID, obj.woType], nullHandler, errorHandler);
                                });
                            });
                        });
                    });
                });
            }
        });
    });
}
When I run the above code, the systems are loaded properly but the wos are not. My research into this issue suggests a few possible causes. One suggestion was that there may already be data in the table; I ruled that out by adding a drop-tables function to clear out the database entirely (I use the Chrome dev tools to inspect the db).
So really, I'm not sure what I'm doing wrong. Is my syntax incorrect for inserting a row with a foreign key constraint?
Solved
I stumbled upon this thread, where #havexz mentioned that the variable in the insertion didn't have quotes around it. Mine had the same problem. Here's my edited insert to add a record with a foreign key. Notice the systemID='" instead of the original, which was simply systemID=". I was missing the single quotes around my variable.
db.transaction(function(transaction){
transaction.executeSql("INSERT INTO wo (woID, woType, systemID_fk) " +
"VALUES (?, ?, (SELECT systemID FROM system WHERE systemID='" + obj.systemID + "'))", [obj.woID, obj.woType], nullHandler, errorHandler);
});
Is the order of the parameters correct for the INSERT into wo? It looks like you're putting the systemID into the woID field rather than into systemID_fk, which has the constraint. Should it be:
transaction.executeSql('INSERT INTO wo (woID, woType, systemID_fk) ' +
'VALUES (?, ?, (SELECT systemID FROM system WHERE systemID = ' + obj.systemID + '))',
[obj.woID, obj.woType], nullHandler, errorHandler);

HBase doesn't store all records

I have 1.2M records in my MongoDB database, and I want to store all of this data in HBase programmatically. Basically I put each retrieved record into HBase in a loop. After the operation finishes, I have only 39912 records in HBase.
Here's what I've tried:
Configuration config = HBaseConfiguration.create();
String tableName = "storedtweet";
String familyName = "msg";
String qualifierName = "msg";
HTable table = new HTable(config, tableName);
// using Spring Data MongoDB to interact with MongoDB
List<StoredTweet> storedTweetList = mongoDAO.getMongoTemplate().findAll(StoredTweet.class);
for (StoredTweet storedTweet : storedTweetList) {
    Put p = new Put(Bytes.toBytes(storedTweet.getTweetId()));
    p.add(Bytes.toBytes(familyName), Bytes.toBytes(qualifierName), Bytes.toBytes(storedTweet.getMsg()));
    table.put(p);
    table.flushCommits();
}
If a row key already exists and you put it again, the HBase Put will overwrite the earlier row. I think there are some records with the same tweet id (which you use as the row key) in your data; that's why some records disappear.
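A hedged way to confirm this with the same (older) HTable API is checkAndPut, which only writes when the target cell is still empty, so colliding tweet ids can be counted instead of silently overwritten. This sketch reuses the variables from the question's snippet:
// Sketch: count how many puts collide with an already-written row key.
// Assumes the same table/family/qualifier variables as in the question.
long duplicates = 0;
for (StoredTweet storedTweet : storedTweetList) {
    byte[] rowKey = Bytes.toBytes(storedTweet.getTweetId());
    Put p = new Put(rowKey);
    p.add(Bytes.toBytes(familyName), Bytes.toBytes(qualifierName),
          Bytes.toBytes(storedTweet.getMsg()));
    // checkAndPut with a null expected value only writes if the cell is absent.
    boolean written = table.checkAndPut(rowKey, Bytes.toBytes(familyName),
            Bytes.toBytes(qualifierName), null, p);
    if (!written) {
        duplicates++; // an earlier record already used this tweet id as row key
    }
}
System.out.println("Rows skipped because the row key already existed: " + duplicates);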
