I'm quite new to Android and SQL. I'm making an app that reads data from the accelerometer and stores it in an SQLite database. Afterwards I intend to export the database so I can plot the data. I have two questions:
First, how can I save the data as a usable file on the SD card? I've seen some topics on this, but I couldn't get any of it to work. I think I need some examples/tutorials.
Secondly, after collecting data for a while, the app starts to lag. I guess it is the storing method, which looks like this in the DB class:
public long createEntry(float x, float y, float z, float t) {
    ContentValues cv = new ContentValues();
    cv.put(X_DATA, x);
    cv.put(Y_DATA, y);
    cv.put(Z_DATA, z);
    cv.put(TIME_DATA, t);
    return ourDatabase.insert(DATABASE_TABLE, null, cv);
}
I hope you'll help me.
For the second part of your question (the lag), performing all the inserts in a single transaction should make it a lot faster. To do this, before you start inserting, call:
ourDatabase.beginTransaction();
Then once you've finished inserting data, call:
ourDatabase.setTransactionSuccessful();
ourDatabase.endTransaction();
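Putting it together, a minimal sketch of the usual pattern (reusing the ourDatabase field and createEntry() method from your DB class; bufferedSamples is a hypothetical batch of readings) wraps the inserts in try/finally so the transaction always ends:

ourDatabase.beginTransaction();
try {
    // insert the whole batch you buffered from the sensor
    for (float[] sample : bufferedSamples) {   // each entry holds {x, y, z, t}
        createEntry(sample[0], sample[1], sample[2], sample[3]);
    }
    ourDatabase.setTransactionSuccessful();    // mark success before ending
} finally {
    ourDatabase.endTransaction();              // commits if marked successful, rolls back otherwise
}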
To answer the first part of your question (getting the data out), you need to retrieve the data from the database and create a CSV file from it:
StringBuilder csv = new StringBuilder();
Cursor cursor = this.db.query("Data", new String[] { "x", "y", "z", "t" }, null, null, null, null, null);
while (cursor.moveToNext()) {
    csv.append(cursor.getFloat(0))
       .append(",")
       .append(cursor.getFloat(1))
       .append(",")
       .append(cursor.getFloat(2))
       .append(",")
       .append(cursor.getFloat(3))
       .append("\n");
}
cursor.close();
File outputFile = new File("/sdcard/mycsv.csv");
try {
    FileWriter writer = new FileWriter(outputFile);
    writer.write(csv.toString());
    writer.close();
} catch (IOException e) {
    e.printStackTrace();
}
If you're collecting a lot of rows (tens of thousands, perhaps), you might need to stream the CSV to disk as you read the rows, rather than building the whole string in memory first.
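A rough sketch of that streaming variant, assuming the same Data table and column order as above:

File outputFile = new File(Environment.getExternalStorageDirectory(), "mycsv.csv");
Cursor cursor = this.db.query("Data", new String[] { "x", "y", "z", "t" }, null, null, null, null, null);
try {
    BufferedWriter writer = new BufferedWriter(new FileWriter(outputFile));
    try {
        while (cursor.moveToNext()) {
            // write each row straight to disk instead of appending to a StringBuilder
            writer.write(cursor.getFloat(0) + "," + cursor.getFloat(1) + ","
                    + cursor.getFloat(2) + "," + cursor.getFloat(3));
            writer.newLine();
        }
    } finally {
        writer.close();   // flushes the buffered rows to the file
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    cursor.close();
}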
I have a database saved in my app's assets folder and I copy the database using the code below when the app first opens.
inputStream = mContext.getAssets().open(Utils.getDatabaseName());

if (inputStream != null) {
    int mFileLength = inputStream.available();
    String filePath = mContext.getDatabasePath(Utils.getDatabaseName()).getAbsolutePath();

    // Save the downloaded file
    output = new FileOutputStream(filePath);

    byte[] data = new byte[1024];
    long total = 0;
    int count;

    while ((count = inputStream.read(data)) != -1) {
        total += count;
        if (mFileLength != -1) {
            // Publish the progress
            publishProgress((int) (total * 100 / mFileLength));
        }
        output.write(data, 0, count);
    }
    return true;
}
The above code runs without problems, but when you try to query the database you get an SQLiteException: no such table.
This issue only occurs in Android P, all earlier versions of Android work correctly.
Is this a known issue with Android P or has something changed?
I was having a similar issue and solved it by adding this to my SQLiteOpenHelper:
@Override
public void onOpen(SQLiteDatabase db) {
    super.onOpen(db);
    db.disableWriteAheadLogging();
}
Apparently Android P sets the journal mode PRAGMA differently by default. I still have no idea whether this has side effects, but it seems to be working!
My issues with Android P got solved by adding this.close() after this.getReadableDatabase() in the createDataBase() method, as below:
private void createDataBase() throws IOException {
    this.getReadableDatabase();
    this.close();
    try {
        copyDataBase();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
This issue seems to lead to a crash much more often on Android P than on previous versions, but it's not a bug on Android P itself.
The problem is that your line where you assign the value to your String filePath opens a connection to the database that remains open when you copy the file from assets.
To fix the problem, replace the line
String filePath = mContext.getDatabasePath(Utils.getDatabaseName()).getAbsolutePath();
with code to get the file path value and then close the database:
MySQLiteOpenHelper helper = new MySQLiteOpenHelper(context, Utils.getDatabaseName());
SQLiteDatabase database = helper.getReadableDatabase();
String filePath = database.getPath();
database.close();
And also add an inner helper class:
class MySQLiteOpenHelper extends SQLiteOpenHelper {

    MySQLiteOpenHelper(Context context, String databaseName) {
        super(context, databaseName, null, 2);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
    }
}
I ran into a similar issue. I was copying a database, but not from an asset. What I found is that the problem had nothing to do with my database file copying code at all, nor with files left open, not closed, flushing or syncing. My code typically overwrites an existing, unopened database.

What appears to be new with Android Pie, and different from previous releases, is that when Android Pie creates a SQLite database, it sets journal_mode to WAL (write-ahead logging) by default. I've never used WAL mode, and the SQLite docs say that journal_mode should be DELETE by default.

The problem is that if I overwrite an existing database file, let's call it my.db, the write-ahead log, my.db-wal, still exists and effectively "overrides" what's in the newly copied my.db file. When I opened my database, the sqlite_master table typically contained only a row for android_metadata; all the tables I was expecting were missing.

My solution is to simply set journal_mode back to DELETE after opening the database, especially when creating a new database with Android Pie.
PRAGMA journal_mode=DELETE;
Perhaps WAL is better and there's probably some way to close the database so that the write-ahead log doesn't get in the way but I don't really need WAL and haven't needed it for all previous versions of Android.
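Sketching one place to run that pragma, assuming you open the database through a SQLiteOpenHelper subclass: the onOpen() callback. rawQuery() is used because the pragma returns a result row, so execSQL() is not suitable for it:

@Override
public void onOpen(SQLiteDatabase db) {
    super.onOpen(db);
    // PRAGMA journal_mode returns the mode now in effect, so run it as a query and close the cursor
    Cursor c = db.rawQuery("PRAGMA journal_mode=DELETE", null);
    c.moveToFirst();
    c.close();
}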
Unfortunately, the accepted answer just "happens to work" in very specific cases; it doesn't give consistently working advice on how to avoid such an error in Android 9.
Here it is:
Have a single instance of your SQLiteOpenHelper class in your application to access your database.
If you need to rewrite / copy the database, close the database (and all connections to it) using the SQLiteOpenHelper.close() method of this instance AND don't use this SQLiteOpenHelper instance anymore.
After calling close(), not only are all connections to the database closed, but the additional database log files are flushed to the main .sqlite file and deleted. So you have one database.sqlite file only, ready to be rewritten or copied.
After copying / rewriting etc., create a new singleton of the SQLiteOpenHelper, whose getWritableDatabase() method will return a new instance of the SQLite database. And use it until the next time you need your database to be copied / rewritten...
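A rough sketch of that lifecycle, using a hypothetical DbHolder wrapper around the MySQLiteOpenHelper shown earlier (the names here are illustrative, not from the answer):

public final class DbHolder {
    private static MySQLiteOpenHelper helper;

    public static synchronized SQLiteDatabase get(Context context) {
        if (helper == null) {
            helper = new MySQLiteOpenHelper(context.getApplicationContext(), Utils.getDatabaseName());
        }
        return helper.getWritableDatabase();
    }

    // Call before overwriting/copying the database file; copy the file, then just call get() again.
    public static synchronized void closeForCopy() {
        if (helper != null) {
            helper.close();   // closes connections and flushes/removes the -wal and -shm files
            helper = null;    // forces a fresh helper (and connection) after the copy
        }
    }
}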
This answer helped me to figure that out: https://stackoverflow.com/a/35648781/297710
I had this problem in Android 9 in my AndStatus application https://github.com/andstatus/andstatus, which has quite a large suite of automated tests that consistently reproduced "SQLiteException: no such table" on an Android 9 emulator before this commit:
https://github.com/andstatus/andstatus/commit/1e3ca0eee8c9fbb8f6326b72dc4c393143a70538 So if you're really curious, you can run all the tests before and after this commit to see the difference.
Solution without disabling the WAL
Android 9 introduces a special mode of SQLiteDatabase called Compatibility WAL (write-ahead logging) that allows a database to use journal_mode=WAL while preserving the behavior of keeping at most one connection per database.
In Detail here:
https://source.android.com/devices/tech/perf/compatibility-wal
The SQLite WAL mode is explained in detail here:
https://www.sqlite.org/wal.html
According to the official docs, WAL mode adds a second file whose name is the database file name with "-wal" appended, in the same directory. So if your database is named "data.db", the log file is "data.db-wal".
The solution is then to save and restore BOTH files (data.db and data.db-wal) on Android 9.
Afterwards it is working as in earlier versions.
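A minimal sketch of backing up both files together with plain java.io (dbFile is assumed to be the File returned by context.getDatabasePath(), and backupDir any writable directory):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

void backupWithWal(File dbFile, File backupDir) throws IOException {
    copy(dbFile, new File(backupDir, dbFile.getName()));
    File walFile = new File(dbFile.getPath() + "-wal");   // e.g. data.db-wal
    if (walFile.exists()) {
        copy(walFile, new File(backupDir, walFile.getName()));
    }
}

private void copy(File src, File dst) throws IOException {
    FileInputStream in = new FileInputStream(src);
    FileOutputStream out = new FileOutputStream(dst);
    byte[] buffer = new byte[8192];
    int count;
    while ((count = in.read(buffer)) != -1) {
        out.write(buffer, 0, count);
    }
    out.close();
    in.close();
}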
I had the same thing: I had an application on Android 4, and after updating to a phone running Android 9, I spent 2 days trying to find the error. Thanks to the comments here; in my case I just had to add this.close():
private void createDataBase() throws IOException {
    this.getReadableDatabase();
    this.close();
    try {
        copyDataBase();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
Now it's running on all versions!
First, thank you for posting this question. I had the same thing happen. All was working well, but then when testing against Android P Preview I was getting crashes. Here's the bug that I found for this code:
private void copyDatabase(File dbFile, String db_name) throws IOException {
    InputStream is = null;
    OutputStream os = null;
    SQLiteDatabase db = context.openOrCreateDatabase(db_name, Context.MODE_PRIVATE, null);
    db.close();
    try {
        is = context.getAssets().open(db_name);
        os = new FileOutputStream(dbFile);
        byte[] buffer = new byte[1024];
        int count;
        while ((count = is.read(buffer)) > 0) {
            os.write(buffer, 0, count);
        }
    } catch (IOException e) {
        e.printStackTrace();
        throw (e);
    } finally {
        try {
            if (os != null) os.close();
            if (is != null) is.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The issue I ran into was that this code works just fine, BUT in SDK 28+ openOrCreateDatabase no longer automatically creates the android_metadata table for you. So if you do a query of "select * from TABLE" it will not find that TABLE, because the query starts to look after the "first" table, which should be the metadata table. I fixed this by manually adding the android_metadata table and all was well. Hope someone else finds this useful; it took forever to figure out because specific queries still worked fine.
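For reference, a sketch of what manually adding that table could look like; the locale column with an 'en_US' value is the usual content of android_metadata, though the answer doesn't show the exact statement it used:

SQLiteDatabase db = SQLiteDatabase.openDatabase(dbFile.getPath(), null, SQLiteDatabase.OPEN_READWRITE);
db.execSQL("CREATE TABLE IF NOT EXISTS android_metadata (locale TEXT)");
db.execSQL("INSERT INTO android_metadata (locale) VALUES ('en_US')");
db.close();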
Similar issue; only the Android P device was affected. All previous versions had no problems.
We turned off auto backup/restore on the Android 9 device. We did this to troubleshoot and would not recommend it for production.
Auto restore was placing a copy of the database file in the data directory before the copy-database function was called in the database helper, so a file.exists() check returned true.
The database that had been backed up from the development device was missing the table, so "no such table" was in fact correct.
Here's the perfect solution for this problem:
Just override this method in your SQLiteOpenHelper class:
@Override
public void onOpen(SQLiteDatabase db) {
    super.onOpen(db);
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN) {
        db.disableWriteAheadLogging();
    }
}
It seems that you don't close the output stream. While that probably does not explain why the db is not really created (unless Android P added a multi-MB buffer), it is good practice to use a try-with-resources, something like:
// guarantees that the data is flushed and the resources freed
try (FileOutputStream output = new FileOutputStream(filePath)) {
    byte[] data = new byte[1024];
    long total = 0;
    int count;

    while ((count = inputStream.read(data)) != -1) {
        total += count;
        if (mFileLength != -1) {
            // Publish the progress
            publishProgress((int) (total * 100 / mFileLength));
        }
        output.write(data, 0, count);
    }

    // maybe a bit overkill
    output.getFD().sync();
}
In Android P, the major change is WAL (write-ahead logging). The following two steps are required.
Disable it with the following setting in config.xml in the values folder under resources:
false
Make the following change in the createDataBase() method of the DBAdapter class; otherwise phones with earlier Android versions crash:
private void createDataBase() throws IOException {
    if (android.os.Build.VERSION.SDK_INT < android.os.Build.VERSION_CODES.P) {
        this.getWritableDatabase();
        try {
            copyDataBase();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
For this issue occurring on Android Pie, the solution is:
SQLiteDatabase db = this.getReadableDatabase();
if (db != null && db.isOpen())
    db.close();
copyDataBase();
Simplest answer: use the following lines for the database file path on Android Pie and above:
DB_NAME="xyz.db";
DB_Path = "/data/data/" + BuildConfig.APPLICATION_ID + "/databases/"+DB_NAME;
I've already tried exporting my database tables to CSV using CSVWriter.
But my tables contain BLOB data. How can I include it in my export?
Then later on, I'm going to import that exported CSV using CSVReader. Can anyone share some concepts?
This is a part of my code for export
ResultSet res = st.executeQuery("select * from " + db + "." + obTableNames[23]);
int columnCount = getColumnCount(res);
try {
    File filename = new File(dir, "" + obTableNames[23] + ".csv");
    fw = new FileWriter(filename);
    CSVWriter writer = new CSVWriter(fw);
    writer.writeAll(res, false);
    int colType = res.getMetaData().getColumnType(columnCount);
    dispInt(colType);
    fw.flush();
    fw.close();
} catch (IOException e) {
    e.printStackTrace();
}
Did you take a look at the encodeBase64String(byte[] data) method from the Base64 class provided by Apache Commons Codec?
Encodes binary data using the base64 algorithm but does not chunk the output.
This should allow you to turn your Binary Large Objects into encoded strings and incorporate them in your CSV.
People on the other side can then use decodeBase64(String data) to get the BLOBs back again.
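A minimal sketch of that round trip with Commons Codec; the "photo" column name is just an illustration, not from your schema:

import org.apache.commons.codec.binary.Base64;

// While exporting: turn the BLOB column into a CSV-safe string
byte[] blob = res.getBytes("photo");                 // hypothetical BLOB column
String encoded = Base64.encodeBase64String(blob);    // safe to write as a CSV field

// While importing: restore the original bytes from the CSV field
byte[] restored = Base64.decodeBase64(encoded);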
I save data to a RecordStore. While the application is running it works fine, but when I restart the application the data in the record store is lost.
Here is my load command:
try {
    int i = 1;
    display.setCurrent(list2);
    RecordStore RS = RecordStore.openRecordStore("recordStore", true);
    RecordEnumeration re = RS.enumerateRecords(null, null, true);
    adresaURL ad = new adresaURL();
    System.out.println("nacteno");
    while (re.hasNextElement()) {
        byte br[] = RS.getRecord(i);
        ad.setPopis(new String(br));
        br = RS.getRecord(i + 1);
        ad.setUrl(new String(br));
        System.out.println(ad.getPopis());
        System.out.println(ad.getUrl());
        i += 2;
        adresy.addElement(ad);
        list2.append(ad.getPopis(), null);
        System.out.println("nacteno2");
    }
    recordStore.closeRecordStore();
} catch (Exception e) {
}
Yeah that won't work.
If you use a RecordEnumeration to iterate through your RMS (as you are), you must use RecordEnumeration.nextRecord() to retrieve the record data. You are using RecordStore.getRecord().
RecordEnumeration.nextRecord() advances your RecordEnumeration by one. As you never call it, your loop:
while (re.hasNextElement()) {
...
}
will never end!
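A sketch of the corrected loop using the enumeration itself, keeping your adresaURL / setPopis / setUrl names and assuming, as your original code does, that each address was stored as two consecutive records (note that a plain enumerateRecords() without a comparator does not guarantee that ordering):

try {
    RecordStore rs = RecordStore.openRecordStore("recordStore", true);
    RecordEnumeration re = rs.enumerateRecords(null, null, true);
    while (re.hasNextElement()) {
        adresaURL ad = new adresaURL();
        ad.setPopis(new String(re.nextRecord()));    // first record of the pair
        if (re.hasNextElement()) {
            ad.setUrl(new String(re.nextRecord()));  // second record of the pair
        }
        adresy.addElement(ad);
        list2.append(ad.getPopis(), null);
    }
    re.destroy();
    rs.closeRecordStore();
} catch (Exception e) {
    e.printStackTrace();
}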
I'm reading 2 CSV files: store_inventory and new_acquisitions.
I want to be able to compare the store_inventory CSV file with new_acquisitions.
1) If the item names match, just update the quantity in store_inventory.
2) If new_acquisitions has a new item that does not exist in store_inventory, then add it to store_inventory.
Here is what I have done so far, but it's not very good. I added comments where I need to add tasks 1 & 2.
Any advice or code to do the above tasks would be great! Thanks.
File new_acq = new File("/src/test/new_acquisitions.csv");
Scanner acq_scan = null;
try {
    acq_scan = new Scanner(new_acq);
} catch (FileNotFoundException ex) {
    Logger.getLogger(mainpage.class.getName()).log(Level.SEVERE, null, ex);
}

String itemName;
int quantity;
Double cost;
Double price;

File store_inv = new File("/src/test/store_inventory.csv");
Scanner invscan = null;
try {
    invscan = new Scanner(store_inv);
} catch (FileNotFoundException ex) {
    Logger.getLogger(mainpage.class.getName()).log(Level.SEVERE, null, ex);
}

String itemNameInv;
int quantityInv;
Double costInv;
Double priceInv;

while (acq_scan.hasNext()) {
    String line = acq_scan.nextLine();
    if (line.charAt(0) == '#') {
        continue;
    }
    String[] split = line.split(",");
    itemName = split[0];
    quantity = Integer.parseInt(split[1]);
    cost = Double.parseDouble(split[2]);
    price = Double.parseDouble(split[3]);

    while (invscan.hasNext()) {
        String line2 = invscan.nextLine();
        if (line2.charAt(0) == '#') {
            continue;
        }
        String[] split2 = line2.split(",");
        itemNameInv = split2[0];
        quantityInv = Integer.parseInt(split2[1]);
        costInv = Double.parseDouble(split2[2]);
        priceInv = Double.parseDouble(split2[3]);

        if (itemName == itemNameInv) {
            //update quantity
        }
    }
    //add new entry into csv file
}
Thanks again for any help. =]
I suggest you use one of the existing CSV parsers such as Commons CSV or Super CSV instead of reinventing the wheel. It should make your life a lot easier.
Your implementation makes the common mistake of breaking the line on commas by using line.split(","). This does not work because the values themselves might contain commas. If that happens, the value must be quoted, and you need to ignore commas within the quotes. The split method cannot do this; I see this mistake a lot.
Here is the source of an implementation that does it correctly:
http://agiletribe.purplehillsbooks.com/2012/11/23/the-only-class-you-need-for-csv-files/
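A tiny illustration of the failure mode (the quoted item name here is made up):

String line = "\"Widget, large\",5,2.50,4.99";  // one quoted field that contains a comma
String[] parts = line.split(",");
// parts is ["\"Widget", " large\"", "5", "2.50", "4.99"] -- 5 fields instead of 4,
// and the item name is cut in half; a real CSV parser handles the quoting for you.
System.out.println(parts.length);  // prints 5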
With the help of the open source library uniVocity-parsers, you could develop pretty clean code, as follows:
private void processInventory() throws IOException {
    /**
     * ---------------------------------------------
     * Read CSV rows into lists of the bean you defined
     * ---------------------------------------------
     */
    // 1st, configure the CSV reader with a row processor attached to the bean definition
    CsvParserSettings settings = new CsvParserSettings();
    settings.getFormat().setLineSeparator("\n");
    BeanListProcessor<Inventory> rowProcessor = new BeanListProcessor<Inventory>(Inventory.class);
    settings.setRowProcessor(rowProcessor);
    settings.setHeaderExtractionEnabled(true);

    // 2nd, parse all rows from the CSV files into the lists of beans
    CsvParser parser = new CsvParser(settings);
    parser.parse(new FileReader("/src/test/store_inventory.csv"));
    List<Inventory> storeInvList = rowProcessor.getBeans();

    parser.parse(new FileReader("/src/test/new_acquisitions.csv"));
    List<Inventory> newAcqList = rowProcessor.getBeans();
    Iterator<Inventory> newAcqIterator = newAcqList.iterator();

    // 3rd, process the beans with the business logic
    while (newAcqIterator.hasNext()) {
        Inventory newAcq = newAcqIterator.next();
        boolean isItemIncluded = false;
        // a fresh iterator per acquisition, so the whole inventory is scanned each time
        Iterator<Inventory> storeInvIterator = storeInvList.iterator();
        while (storeInvIterator.hasNext()) {
            Inventory storeInv = storeInvIterator.next();
            // 1) If the item names match, just update the quantity in store_inventory
            if (storeInv.getItemName().equalsIgnoreCase(newAcq.getItemName())) {
                storeInv.setQuantity(newAcq.getQuantity());
                isItemIncluded = true;
            }
        }
        // 2) If new_acquisitions has a new item that does not exist in store_inventory,
        // then add it to store_inventory.
        if (!isItemIncluded) {
            storeInvList.add(newAcq);
        }
    }
}
Just follow this code sample I worked out according to your requirements. Note that the library provides a simplified API and significant performance for parsing CSV files.
The operation you are performing will require that, for each item in your new acquisitions, you search each item in inventory for a match. This is not only inefficient, but the scanner that you have set up for your inventory file would need to be reset after each item.
I would suggest that you add your new acquisitions and your inventory to collections, then iterate over your new acquisitions and look up each new item in your inventory collection, as sketched below. If the item exists, update it; if it doesn't, add it to the inventory collection. For this, it might be good to write a simple class to contain an inventory item; it could be used for both the new acquisitions and the inventory. For fast lookup, I would suggest that you use a HashSet or HashMap for your inventory collection.
At the end of the process, don't forget to persist the changes to your inventory file.
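A rough sketch of that approach, with a hypothetical Item bean standing in for the "simple class to contain an inventory item" (all names here are illustrative):

import java.util.HashMap;
import java.util.List;
import java.util.Map;

class Item {
    String name;
    int quantity;
    double cost;
    double price;
}

static Map<String, Item> merge(List<Item> inventoryItems, List<Item> newAcquisitions) {
    // index the existing inventory by item name for O(1) lookups
    Map<String, Item> inventory = new HashMap<String, Item>();
    for (Item item : inventoryItems) {
        inventory.put(item.name, item);
    }
    for (Item acq : newAcquisitions) {
        Item existing = inventory.get(acq.name);
        if (existing != null) {
            existing.quantity += acq.quantity;   // or overwrite, depending on what "update" should mean
        } else {
            inventory.put(acq.name, acq);        // brand new item goes straight into the inventory
        }
    }
    return inventory;   // write inventory.values() back to store_inventory.csv afterwards
}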
As Java doesn't support parsing of CSV files natively, we have to rely on a third-party library. Opencsv is one of the best libraries available for this purpose. It's open source and shipped with the Apache 2.0 licence, which makes it possible to use commercially.
This should help you and others in similar situations!
For writing to CSV
public void writeCSV() {
    // Delimiter used in the CSV file
    final String NEW_LINE_SEPARATOR = "\n";

    // CSV file header
    final Object[] FILE_HEADER = { "Employee Name", "Employee Code", "In Time", "Out Time", "Duration", "Is Working Day" };

    String fileName = "fileName.csv";
    List<Object> objects = new ArrayList<Object>();
    FileWriter fileWriter = null;
    CSVPrinter csvFilePrinter = null;

    // Create the CSVFormat object with "\n" as a record delimiter
    CSVFormat csvFileFormat = CSVFormat.DEFAULT.withRecordSeparator(NEW_LINE_SEPARATOR);

    try {
        fileWriter = new FileWriter(fileName);
        csvFilePrinter = new CSVPrinter(fileWriter, csvFileFormat);
        csvFilePrinter.printRecord(FILE_HEADER);

        // Write each object in the list as one CSV record
        // (replace Object with your own bean type and getValue1()/getValue2()/... with its getters)
        for (Object object : objects) {
            List<String> record = new ArrayList<String>();
            record.add(object.getValue1().toString());
            record.add(object.getValue2().toString());
            record.add(object.getValue3().toString());
            csvFilePrinter.printRecord(record);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            fileWriter.flush();
            fileWriter.close();
            csvFilePrinter.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
You can use the Apache Commons CSV API.
FYI, this answer: https://stackoverflow.com/a/42198895/6549532
Read / Write Example
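Since the linked example isn't reproduced here, a minimal Commons CSV read/write sketch; the file names and the "name"/"quantity" columns are just placeholders:

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVPrinter;
import org.apache.commons.csv.CSVRecord;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;

static void csvReadWriteExample() throws IOException {
    // Read: iterate the records of an existing file, addressing columns by header name
    try (Reader in = new FileReader("store_inventory.csv")) {
        for (CSVRecord record : CSVFormat.DEFAULT.withFirstRecordAsHeader().parse(in)) {
            String name = record.get("name");
            int quantity = Integer.parseInt(record.get("quantity"));
        }
    }

    // Write: the header row is printed automatically, then one record per printRecord() call
    try (Writer out = new FileWriter("out.csv");
         CSVPrinter printer = new CSVPrinter(out, CSVFormat.DEFAULT.withHeader("name", "quantity"))) {
        printer.printRecord("Widget", 5);
    }
}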
The database of my application needs to be filled with a lot of data, so during onCreate() it's not only some CREATE TABLE SQL instructions, there are also a lot of INSERTs. The solution I chose is to store all these instructions in a SQL file located in res/raw, which is loaded with Resources.openRawResource(id).
It works well, but I am facing an encoding issue: I have some accented characters in the SQL file which appear garbled in my application. This is my code to do this:
public String getFileContent(Resources resources, int rawId) throws IOException {
    InputStream is = resources.openRawResource(rawId);
    int size = is.available();
    // Read the entire asset into a local byte buffer.
    byte[] buffer = new byte[size];
    is.read(buffer);
    is.close();
    // Convert the buffer into a string.
    return new String(buffer);
}
public void onCreate(SQLiteDatabase db) {
    try {
        // get file content
        String sqlCode = getFileContent(mCtx.getResources(), R.raw.db_create);
        // execute code
        for (String sqlStatements : sqlCode.split(";")) {
            db.execSQL(sqlStatements);
        }
        Log.v("Creating database done.");
    } catch (IOException e) {
        // Should never happen!
        Log.e("Error reading sql file " + e.getMessage(), e);
        throw new RuntimeException(e);
    } catch (SQLException e) {
        Log.e("Error executing sql code " + e.getMessage(), e);
        throw new RuntimeException(e);
    }
}
The solution I found to avoid this is to load the SQL instructions from a huge static final String instead of a file, and then all accented characters appear correctly.
But isn't there a more elegant way to load SQL instructions than a big static final String attribute containing all the SQL?
I think your problem is in this line:
return new String(buffer);
You're converting the array of bytes in to a java.lang.String but you're not telling Java/Android the encoding to use. So the bytes for your accented characters aren't being converted correctly as the wrong encoding is being used.
If you use the String(byte[],<encoding>) constructor you can specify the encoding your file has and your characters will be converted correctly.
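For example, assuming the res/raw file is saved as UTF-8, the last line of getFileContent() above would become:

// Convert the buffer into a string, telling Java which encoding the raw file uses.
return new String(buffer, "UTF-8");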
The SQL file solution seems perfect; you just need to make sure that the file is saved in UTF-8 encoding, otherwise all the accented characters will be lost. If you don't want to change the file's encoding, you need to pass an extra argument to new String(bytes, charset) defining the file's actual encoding.
Do prefer file resources to a static final String, to avoid having all those unnecessary bytes loaded into memory. On mobile phones you want to save all the memory you can!
I am using a different approach:
Instead of executing loads of SQL statements (which take a long time to complete), I build my SQLite database on the desktop, put it in the assets folder, create an empty SQLite db in Android and copy the db from the assets folder over it. This is a huge increase in speed. Note that you need to create an empty database first in Android, and then you can copy and overwrite it; otherwise Android will not allow you to write a db into the databases folder. There are several examples on the internet.
BTW, it seems this approach works best if the db has no file extension.
It looks like you are passing all your sql statements in one string. That's a problem because execSQL expects "a single statement that is not a query" (see documentation [here][1]). Following is a somewhat-ugly-but-working solution.
I have all my sql statements in a file like this:
INSERT INTO table1 VALUES (1, 2, 3);
INSERT INTO table1 VALUES (4, 5, 6);
INSERT INTO table1 VALUES (7, 8, 9);
Notice the new lines in between text(semicolon followed by 2 new lines)
Then, I do this:
String text = new String(buffer, "UTF-8");
for (String command : text.split(";\n\n")) {
    try {
        command = command.trim();
        //Log.d(TAG, "command: " + command);
        if (command.length() > 0) {
            db.execSQL(command);
        }
    } catch (Exception e) {
        // do whatever you need here
    }
}
HTH
Gerardo
[1]: http://developer.android.com/reference/android/database/sqlite/SQLiteDatabase.html#execSQL(java.lang.String, java.lang.Object[])