So I have a huge JSONObject that I need to write to a file. Right now my code works perfectly on 90% of devices; the problem is that on low-memory devices such as the Amazon Fire TV, the app crashes with "java.lang.OutOfMemoryError".
Is there a more memory-friendly way to write that JSON object to a file?
This is my code:
try {
    Writer output = null;
    if (jsonFile.isDirectory()) {
        jsonFile.delete();
    }
    if (!jsonFile.exists()) {
        jsonFile.createNewFile();
    }
    output = new BufferedWriter(new FileWriter(jsonFile));
    output.write(mainObject.toString());
    output.close();
} catch (Exception e) {
    e.printStackTrace();
}
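One more memory-friendly direction (a sketch only, and an assumption, since no accepted approach is shown here) is to stream the JSON straight to the file with android.util.JsonWriter instead of materializing mainObject.toString() in memory. The "items" field, the Item type, and its fields below are hypothetical placeholders for whatever data currently goes into mainObject:
// Sketch: stream the JSON instead of building the whole string in memory.
// "items", Item, item.id and item.title are made-up placeholders.
JsonWriter writer = new JsonWriter(new BufferedWriter(new FileWriter(jsonFile)));
writer.beginObject();
writer.name("items");
writer.beginArray();
for (Item item : items) {
    writer.beginObject();
    writer.name("id").value(item.id);
    writer.name("title").value(item.title);
    writer.endObject();
}
writer.endArray();
writer.endObject();
writer.close();
This only helps if the data can be written out field by field rather than being accumulated into a single JSONObject first.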
I am working on an application that reads large amounts of data from a file. Basically, I have a huge file (around 1.5 - 2 GB) containing different objects (~5 to 10 million of them per file). I need to read all of them and put them into different maps in the app. The problem is that the app runs out of memory while reading the objects at some point. Only when I set it to use -Xmx4096m can it handle the file, and if the file gets any larger, it won't be able to handle it at all.
Here's the code snippet:
String sampleFileName = "sample.file";
FileInputStream fileInputStream = null;
ObjectInputStream objectInputStream = null;
try {
    fileInputStream = new FileInputStream(new File(sampleFileName));
    int bufferSize = 16 * 1024;
    objectInputStream = new ObjectInputStream(new BufferedInputStream(fileInputStream, bufferSize));
    while (true) {
        try {
            Object objectToRead = objectInputStream.readUnshared();
            if (objectToRead == null) {
                break;
            }
            // doing something with the object
        } catch (EOFException eofe) {
            eofe.printStackTrace();
            break;
        } catch (Exception e) {
            e.printStackTrace();
            continue;
        }
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    if (objectInputStream != null) {
        try {
            objectInputStream.close();
        } catch (Exception e2) {
            e2.printStackTrace();
        }
    }
    if (fileInputStream != null) {
        try {
            fileInputStream.close();
        } catch (Exception e2) {
            e2.printStackTrace();
        }
    }
}
At first I was using objectInputStream.readObject() instead of objectInputStream.readUnshared(), and switching partially solved the issue. When I increased the memory from 2048 to 4096, it started parsing the file. BufferedInputStream is already in use. On the web I've only found examples of how to read lines or bytes, but nothing about reading objects efficiently.
How can I read the file without increasing the JVM memory and without hitting the OutOfMemoryError? Is there any way to read objects from the file without keeping everything else in memory?
When reading big files, parsing objects, and keeping them in memory, there are several approaches with different tradeoffs:
You can fit all parsed objects into memory for the app deployed on one server. This either requires storing all objects very compactly, for example using a single byte or int to hold two numbers, or using bit-shifting in other data structures; in other words, fitting all objects into the minimum possible space. Alternatively, increase the memory on that server (scale vertically). A rough sketch of the packing idea follows.
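For illustration only (my own sketch, not from the original answer; PackUtil is a made-up name), packing two 16-bit values into a single int with shifting looks like this:
final class PackUtil {
    // Pack two values that each fit in 16 bits into one int, so a pair of
    // numbers costs 4 bytes instead of two separate boxed objects.
    static int pack(int high, int low) {
        return (high << 16) | (low & 0xFFFF);
    }

    static int unpackHigh(int packed) {
        return packed >>> 16;    // upper 16 bits
    }

    static int unpackLow(int packed) {
        return packed & 0xFFFF;  // lower 16 bits
    }
}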
a) However, reading the files can take too much memory, so you have to read them in chunks. For example, this is what I was doing with JSON files:
JsonReader reader = new JsonReader(new InputStreamReader(in, "UTF-8"));
if (reader.hasNext()) {
    reader.beginObject();
    String name = reader.nextName();
    if ("content".equals(name)) {
        reader.beginArray();
        parseContentJsonArray(reader, name2ContentMap);
        reader.endArray();
    }
    name = reader.nextName();
    if ("ad".equals(name)) {
        reader.beginArray();
        parsePrerollJsonArray(reader, prerollMap);
        reader.endArray();
    }
}
The idea is to have a way to identify where a certain object starts and ends, and to read only that part.
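The parse methods themselves are not shown above; as a rough sketch (assuming a hypothetical Content value type and a "name" field), parseContentJsonArray could look something like this, consuming one array element at a time so only the current object is held in memory:
private void parseContentJsonArray(JsonReader reader,
        Map<String, Content> name2ContentMap) throws IOException {
    while (reader.hasNext()) {           // one array element per iteration
        reader.beginObject();
        String name = null;
        Content content = new Content(); // placeholder type
        while (reader.hasNext()) {
            String field = reader.nextName();
            if ("name".equals(field)) {
                name = reader.nextString();
            } else {
                reader.skipValue();      // skip fields we don't need
            }
        }
        reader.endObject();
        name2ContentMap.put(name, content);
    }
}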
b) You can also split the files into smaller ones at the source, if you can; then they will be easier to read.
If you can't fit all parsed objects for the app on one server, you have to shard based on some object property, for example splitting the data across multiple servers by US state.
Hopefully this helps with your solution.
EDIT: it seems to be a problem related to Android Pie (API 28). It seems to work on previous versions (tested on 27, 26, and 25).
I've been working on this Android code for a very long time, and I noticed that lately, when I save data to disk, I receive this error:
java.lang.NullPointerException: Attempt to invoke virtual method 'int java.math.RoundingMode.ordinal()' on a null object reference
This is how I write data to disk
private void SaveDataToDisk() {
    try {
        FileOutputStream fos = this.weakActivity.get().openFileOutput(this.FILENAME, Context.MODE_PRIVATE);
        if (fos != null) {
            ObjectOutputStream os = new ObjectOutputStream(fos);
            os.writeObject(this.datastore);
            os.close();
            fos.close();
        }
    } catch (Exception ex) {
        ErrorManager.TrapThrow(ex, this.weakActivity.get());
    }
}
this.datastore is a complex object composed of many other objects (a very large number).
This is how I read data back when needed
private void LoadDataFromDisk() {
    try {
        if (this.weakActivity.get().getFileStreamPath(this.FILENAME).exists()) {
            FileInputStream fis = this.weakActivity.get().openFileInput(this.FILENAME);
            BufferedInputStream bis = new BufferedInputStream(fis);
            ObjectInputStream is = new ObjectInputStream(bis);
            try {
                this.datastore = (DataStore) is.readObject();
            } catch (Exception ex) {
                this.datastore = new DataStore();
            }
            is.close();
            fis.close();
        }
    } catch (Exception ex) {
        ErrorManager.TrapThrow(ex, this.weakActivity.get());
    }
}
Imagine a fresh install of the app. The first time, LoadDataFromDisk does nothing. Later on, the app writes something to disk.
When the app calls LoadDataFromDisk again, it reads correctly. Then, for example, the app is relaunched: when LoadDataFromDisk is reached, and specifically
this.datastore = (DataStore) is.readObject();
I receive the error above, and the code falls back to a new DataStore object in order to keep the app working.
Why does this not happen every time? The data seems to be corrupted after it has been read.
I can reproduce this on an AVD and on my phone.
Any help is appreciated.
We found that this was caused by a custom DataFormatter object (within a containing, serialized object) that relies on a java.text.DecimalFormat. When we went to deserialize the object (sometimes but not always, apparently depending on how many times the containing object had been passed among Activity extras), the entire Extras collection containing the serialized container was invalid, and the app crashed with the RoundingMode.ordinal() NPE.
Apparently the DecimalFormat class no longer plays well with serialization in Android 9/Pie; in our case the fix was as simple as marking the containing Serializable object's DataFormatter field as transient, and the problem vanished.
Sorry not to have a better dissection of the issue ready, but this solved our problem with this error.
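As a minimal sketch of that workaround (class and field names here are illustrative, not the actual code from this answer), the DecimalFormat-backed field is excluded from serialization and rebuilt after deserialization:
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.Serializable;
import java.text.DecimalFormat;

class Container implements Serializable {
    private static final long serialVersionUID = 1L;

    // transient: the DecimalFormat never goes through ObjectOutputStream
    private transient DecimalFormat formatter = new DecimalFormat("#,##0.00");

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        formatter = new DecimalFormat("#,##0.00"); // recreate after deserialization
    }
}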
In the app I am working on right now, part of the functionality is to write data saved on the device to a flash drive connected via a USB-OTG adapter. Specifically, the device is a rooted Motorola Xoom running 4.2.2. I can successfully write files to the drive and read them on my computer. That part works fine. However, when I try to replace existing files with new information, the resulting files come out empty. I even delete the existing files before writing new data. What's weird is that after copying the contents of my internal file to the flash drive, I log the length of the resulting file. It always matches the input file and is always a non-0 number, yet the file still shows up as blank on my computer. Can anyone help with this problem? Relevant code from the AsyncTask that I have doing this work is below.
@Override
protected Void doInBackground(Void... params) {
    File[] files = context.getFilesDir().listFiles();
    for (File file : files) {
        if (file.isFile()) {
            List<String> nameSegments = Arrays.asList(file.getName().split("_"));
            Log.d("source file", "size: " + file.length());
            String destinationPath = "/storage/usbdisk0/"
                    + nameSegments.get(0) + "/" + nameSegments.get(1) + "/";
            File destinationPathFile = new File(destinationPath);
            if (!destinationPathFile.mkdirs()) {
                destinationPathFile.mkdirs();
            }
            File destinationFile = new File(destinationPathFile, nameSegments.get(2));
            FileReader fr = null;
            FileWriter fw = null;
            try {
                fr = new FileReader(file);
                fw = new FileWriter(destinationFile, false);
                int c = fr.read();
                while (c != -1) {
                    fw.write(c);
                    c = fr.read();
                }
                fw.flush();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                try {
                    fr.close();
                    fw.close();
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
            Log.d("destination file", "size: " + new File(destinationFile.getPath()).length());
        }
    }
    return null;
}
EDIT:
Per #Simon's suggestion, I added output.flush() to my code. This does not change the result.
EDIT #2:
I did some further testing with this and found something interesting. If I go to Settings->Storage->Unmount USB Storage after writing to the flash drive but before removing it from the OTG adapter, everything works perfectly. However, failing to eject the drive after writing results in the data not being written. What's strange is that the folder structure and file itself are created on the drive, but the file is always empty. One more thing: if I go to a file manager application and open up the file prior to removing the drive, the files all exist as they should. However, even removing the device, plugging it straight back into the tablet and opening any of the files results in the file looking empty. I can't make heads or tails of this, and this is incredibly frustrating. Can anyone help with this?
EDIT #3:
I also changed to using FileReaders and FileWriters just to see what would happen. I don't care about efficiency at this point; I simply want file writing to work reliably. This change did not affect the issue. Updated code is posted above.
Try using the FileReader.ready() method before your FileReader.read() call, to check whether your FileReader really has some data to read.
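A small fragment illustrating that suggestion (my own sketch, reusing the fr and fw streams from the question's copy loop):
if (fr.ready()) {                    // the reader reports data available
    int c;
    while ((c = fr.read()) != -1) {  // copy character by character until EOF
        fw.write(c);
    }
    fw.flush();
}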
Try this; it uses a BufferedWriter for writing:
try {
    fw = new FileWriter(destinationFile);
    BufferedWriter writer = new BufferedWriter(fw);
    writer.append(yourText); // append can be changed to write if you want to overwrite
    writer.close();
} catch (Exception e) {
    throw new RuntimeException(e);
} finally {
    if (fw != null) {
        try {
            fw.flush();
            fw.close();
        } catch (IOException e) {
            // ignore errors while closing
        }
    }
}
I found the solution to my problem. It appears that the Android system buffers some files off of the SD card/flash drive, and then writes them to the flash drive upon eject. The following code after my file operations synchronizes the buffer with the filesystem and allows the flash drive to be immediately removed from the device without data loss. It's worth noting that this DOES require root access; it will not work on a non-rooted device.
try {
    Process p = Runtime.getRuntime().exec("su");
    DataOutputStream os = new DataOutputStream(p.getOutputStream());
    os.writeBytes("sync; sync\n");
    os.writeBytes("exit\n");
    os.flush();
} catch (Exception e) {
    e.printStackTrace();
}
Source of my solution: Android 2.1 programatically unmount SDCard
It sounds like the filesystem is caching your changes, but not actually writing them to the flash drive until you eject it. I don't think there's a way to flush the filesystem cache, so the best solution seems to be just to unmount and then remount the flash drive.
I'm currently building an application where the user will generate data over time and, should he/she have an internet connection, transmit it to the web. However, if there is no web access, I need to store this data on the phone until the user recovers access, at which point I'll need to recover the data to be transmitted. However, I'm facing lots of trouble doing this, as described below.
Note: before anything else, I'm using a local Java-created file because I know of no other way to save/restore this data on the device. If you happen to know any other way to store/access this data from within the device, please feel free to comment here.
Just for reference:
phantoms is an ArrayList containing objects with the data I need to store,
Arquivador is the class that I'm using to make my data persistent and to recover it,
Funcionario is the class with the data generated by the program (just a few strings and numbers).
I am able to write a file to the file system through the code below, on my Activity:
try {
    arq = new Arquivador();
    arq.addFirstObjectInFile(
            openFileOutput("dados.jlog", MODE_WORLD_WRITEABLE),
            phantoms.get(0));
    phantoms.remove(phantoms.get(0));
    for (Funcionario func : phantoms) {
        arq.addObjectInFile(openFileOutput("dados.jlog", MODE_APPEND), func);
    }
} catch (FileNotFoundException e) {
    // TODO Auto-generated catch block
}
Here is the code inside Arquivador that adds the data to a file:
public void addObjectInFile(FileOutputStream arquivo,
        Object objetoAAdicionar) {
    try {
        ObjectOutputStream aoos = new ObjectOutputStream(arquivo);
        aoos.writeObject(objetoAAdicionar);
        aoos.close();
    } catch (IOException ioe) {
        Log.d(TAG_NAME, "Erro no Appendable OOS.");
    }
}

public void addFirstObjectInFile(FileOutputStream arquivo,
        Object objetoAAdicionar) {
    try {
        AppendableObjectOutputStream aoos = new AppendableObjectOutputStream(
                arquivo);
        aoos.writeObject(objetoAAdicionar);
        aoos.close();
    } catch (IOException ioe) {
        Log.d(TAG_NAME, "Erro no Appendable OOS.");
    }
}
You will notice that I'm persisting the data in two steps: the first Object, and then the rest of them. This was an idea I saw in a post here on Stack Overflow to allow appending data to a Java-generated file. I have no problem with this code; it works perfectly.
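The AppendableObjectOutputStream class is not shown in the question; the usual version of that Stack Overflow trick looks roughly like this (a sketch, not the poster's actual code). Normally a plain ObjectOutputStream handles the very first write, so the stream header is written once, and this subclass handles every append so a second header is never written:
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;

class AppendableObjectOutputStream extends ObjectOutputStream {
    AppendableObjectOutputStream(OutputStream out) throws IOException {
        super(out);
    }

    @Override
    protected void writeStreamHeader() throws IOException {
        // Do not write another stream header; the file already has one.
        // reset() keeps the stream state consistent for the reader.
        reset();
    }
}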
Later on, back in my Activity, the internet connection is detected and I try to recover the file saved on disk:
phantoms = new ArrayList<Funcionario>();
Object obj = arq.readObjectFromFile(openFileInput("dados.jlog"));
Funcionario func = null;
if (obj instanceof Funcionario) {
    func = (Funcionario) obj;
}
while (func != null) {
    phantoms.add(func);
    arq.removeObjectFromFile(openFileInput("dados.jlog"), func,
            getApplicationContext());
    func = (Funcionario) arq.readObjectFromFile(openFileInput("dados.jlog"));
}
The original idea was to read one object at a time, then attempt to transmit it and, if successful, erase that object from the file (so it wouldn't get retransmitted). However, I was getting too many error messages with this. Instead, I decided to load all the objects at once, one by one, to see my problem more clearly.
Back to the Arquivador class:
public Object readObjectFromFile(FileInputStream arquivo) {
    Object retorno = null;
    if (arquivo.equals(null)) {
        Log.e(TAG_NAME, "FIS is null!");
    }
    ObjectInputStream ois = null;
    try {
        ois = new ObjectInputStream(arquivo);
        retorno = ois.readObject();
    } catch (IOException ioex) {
    } catch (ClassNotFoundException e) {
    } finally {
        try {
            if (ois != null) ois.close();
        } catch (IOException e) {
        }
    }
    return retorno;
}
public void removeObjectFromFile(FileInputStream arqPrincipal,
        Object objetoARemover, Context contexto) {
    try {
        // Construct the new file that will later be renamed to the original
        // filename.
        ObjectOutputStream oos = new ObjectOutputStream(
                contexto.openFileOutput("dados.jlog.temp",
                        contexto.MODE_APPEND));
        ObjectInputStream ois = new ObjectInputStream(arqPrincipal);
        Object obj = null;
        // Read from the original file and write to the new
        // unless content matches data to be removed.
        try {
            while ((obj = ois.readObject()) != null) {
                if (!(objetoARemover.equals(obj))) {
                    oos.writeObject(obj);
                    oos.flush();
                }
            }
        } catch (EOFException eof) {
        } finally {
            oos.close();
            ois.close();
            // Delete the original file
            File aDeletar = contexto.getFileStreamPath("dados.jlog");
            File aRenomear = contexto.getFileStreamPath("dados.jlog.tmp");
            if (!aDeletar.delete()) {
                return;
            } else {
                // Rename the new file to the filename the original file had.
                if (!aRenomear.renameTo(aDeletar)) Log.d(TAG_NAME,
                        "Error renaming file");
                else Log.d(TAG_NAME, "Renaming successful");
            }
        }
    } catch (FileNotFoundException ex) {
        ex.printStackTrace();
        Log.d(TAG_NAME, "Arquivo não encontrado");
    } catch (IOException ex) {
        ex.printStackTrace();
        Log.d(TAG_NAME, "Erro de entrada/saída");
    } catch (ClassNotFoundException e) {
        Log.d(TAG_NAME, "Classe Não Encontrada.");
    }
}
The method readObjectFromFile() seems to work just fine. I can even cast the Object that was read to the Funcionario class and read its data.
My problems appear when I use removeObjectFromFile(). The idea is to create a temporary file to store the objects from the "dados.jlog" file other than the one that has already been loaded in the main program; once this temp file is created, the "dados.jlog" file should be deleted and the temporary file renamed to replace it.
The first thing I found strange here is that ois.readObject() keeps throwing an EOFException. While this makes sense, the tutorial I read on the internet doesn't mention this error. In fact, their code indicates that when the readObject() method reaches EOF it should return a null reference, but instead this class throws an EOFException. I handled this exception in the code, though I'm not sure this is the right way to do it.
Another thing I find strange is that this code fails to recognize the object that it should NOT copy. When I compare the object read from the file to the one received as an argument, no matter what I try (==, equals(), etc.), they appear to be different objects. The Funcionario class is Serializable and has a serialVersionUID, so the object read from the file should be identical to the one I stored. Worse than that, the two Objects being compared are read from the same file. They should be identical, right?
After creating the temporary file, I try to delete the original file and rename the temporary one. Though this seems to work, once removeObjectFromFile() finishes the first time, the program is unable to read the data from the "dados.jlog" file again. I can't read the remaining data from the file, and the program enters an endless loop, since the first object is never removed from the list in the file.
Please enlighten me on this matter.
Personally I'd use an SQLite database. Store each object in a row in the database. Once you've successfully transmitted an object, you can remove its row from the database.
You can even reuse most of your code that you've already done. The easiest way to get there from where you are is to use a separate file for each object and store only the filename of the object in the database. You can then iterate over the rows in the database. Each time you transmit an object to your server simply delete that row from the database (and remove the file from the filesystem!). No rows in the database means no objects remain to be transmitted.
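A rough sketch of that idea (the table name, column names, and helper class are made up for illustration; this is not from the original answer): one row per pending object, holding the filename, removed once that object has been transmitted.
import android.content.ContentValues;
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class PendingQueueHelper extends SQLiteOpenHelper {
    private static final String DB_NAME = "pending.db";
    private static final int DB_VERSION = 1;

    public PendingQueueHelper(Context context) {
        super(context, DB_NAME, null, DB_VERSION);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // One row per object that still needs to be transmitted.
        db.execSQL("CREATE TABLE pending ("
                + "_id INTEGER PRIMARY KEY AUTOINCREMENT, "
                + "filename TEXT NOT NULL)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS pending");
        onCreate(db);
    }

    // Remember a file that holds one serialized object awaiting transmission.
    public long enqueue(String filename) {
        ContentValues values = new ContentValues();
        values.put("filename", filename);
        return getWritableDatabase().insert("pending", null, values);
    }

    // Forget the row (and let the caller delete the file) once transmitted.
    public void markTransmitted(long rowId) {
        getWritableDatabase().delete("pending", "_id = ?",
                new String[] { String.valueOf(rowId) });
    }
}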