Appending to ObjectOutputStream (writing multiple objects w/o closing stream) - java

Disclaimer: My question is different from the two following links:
Question 1
Question 2
public class AppendableObjectOutputStream extends ObjectOutputStream {
    public AppendableObjectOutputStream(OutputStream out) throws IOException {
        super(out);
    }

    @Override
    protected void writeStreamHeader() throws IOException {}
}
The problem with the above solutions is that they do not support writing multiple objects to an appendable stream without closing the stream.
If I open an appendable stream and write multiple objects, then at read time I can only read the first object properly; trying to read the second object throws an EOFException.
If I instead write one object to the appendable stream and close it, then reopen the stream, write another object, close it, and so on, I am able to read all the objects properly.
fileOutputStream = new FileOutputStream("abc.dat",true);
outputBuffer = new BufferedOutputStream(fileOutputStream);
objectStream = new AppendableObjectOutputStream(outputBuffer);
BucketUpdate b1 = new BucketUpdate("getAllProducts1",null,"1",null);
BucketUpdate b2 = new BucketUpdate("getAllProducts2",null,"2",null);
BucketUpdate b3 = new BucketUpdate("getAllProducts3",null,"3",null);
objectStream.writeObject(b1);
objectStream.writeObject(b2);
objectStream.writeObject(b3);
objectStream.close();

Calling ObjectOutputStream.reset() after writing each object will fix this.
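For example, a minimal sketch of that fix applied to the snippet above (same b1/b2/b3 objects; reset() clears the stream's handle table so each object is written out in full rather than as a back-reference):

objectStream.writeObject(b1);
objectStream.reset();
objectStream.writeObject(b2);
objectStream.reset();
objectStream.writeObject(b3);
objectStream.reset();
objectStream.close();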

If you check the question you mentioned, you will see that AppendableObjectOutputStream must be used only to append objects to a file that already contains some objects. For an empty file you have to use an ordinary ObjectOutputStream, because in that case the header must be written at the beginning.
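One way to implement that choice, as a sketch (assuming an empty or missing abc.dat means the header still needs to be written; not from the original answer):

File file = new File("abc.dat");
boolean append = file.exists() && file.length() > 0;
FileOutputStream fos = new FileOutputStream(file, append);
ObjectOutputStream oos = append
        ? new AppendableObjectOutputStream(new BufferedOutputStream(fos))
        : new ObjectOutputStream(new BufferedOutputStream(fos));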

Related

How do I handle storage in a Java console app that cannot use DB?

I am given an assignment where we are not allowed to use a DB or external libraries, only a text file for data storage.
But it has rather complex requirements, e.g. many validations, and because of that we need to "access the db" (i.e. read the text file) many times.
My question is: should I create a class like this:
class SomeRepository {
    static ArrayList<Users> users = new ArrayList<>();

    public SomeRepository() {
        // Instantiate this class on program load.
        // In the constructor, we read the text file and store everything inside the ArrayList.
    }

    // public Users getOneUser() { ... } // for get methods, we don't read from the text file at all
    // public void save() { ... }       // text-file saving code over here
}
Is this a good approach to solve the above problem? Currently, what we are doing is reading and writing to the text file every time we want to retrieve some data or write something new.
Wouldn't keeping everything in an ArrayList be too expensive in terms of heap memory? Or should I just read/write the text file in every method?
public class IOManager {
    public static void writeObjToTxtFile(String fileName, Object object) {
        File file = new File(fileName + ".txt"); // file is created in the directory where the program runs
        try (FileOutputStream fos = new FileOutputStream(file);
             ObjectOutputStream oos = new ObjectOutputStream(fos)) {
            oos.writeObject(object);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static Object readObjFromTxtFile(String fileName) {
        Object obj = null;
        File file = new File(fileName + ".txt");
        try (FileInputStream fis = new FileInputStream(file);
             ObjectInputStream ois = new ObjectInputStream(fis)) {
            obj = ois.readObject();
        } catch (ClassNotFoundException | IOException e) {
            e.printStackTrace();
        }
        return obj;
    }
}
Add this class to your project. Since it's general for all Objects, you can pass and receive objects such as ArrayList<Users> as well. Play around and tinker with it to fit whatever your specific purpose is. Hint: you can write other custom methods that call these methods, e.g.:
public static void writeUsersToFile(ArrayList<Users> usersArrayList) {
    writeObjToTxtFile("users", usersArrayList);
}
P.S. Make sure your objects implement Serializable, e.g.:
public class Users implements Serializable {
}
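On the read side you then need an unchecked cast back to the stored type, e.g. with a hypothetical readUsersFromFile counterpart:

@SuppressWarnings("unchecked")
public static ArrayList<Users> readUsersFromFile() {
    return (ArrayList<Users>) readObjFromTxtFile("users"); // unchecked cast; safe only if "users" really holds an ArrayList<Users>
}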
I would suggest reading the contents of your file to a dynamic list such as an arraylist at the start of your program. Make the required queries/changes to your arraylist and then write that arraylist to your file when the program is set to close. This will save significant time over repeated file reads/writes.
This isn't without its drawbacks, though. You don't want to hog memory in the case of very large files, but considering this is an assignment, that may not be an issue. Additionally, should your program terminate before the write at the end, all changes made to your database during the current execution will be lost.
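One way to soften that last risk, sketched with a JVM shutdown hook (repository.save() is a hypothetical method that writes the in-memory list back to the file; note a hook does not run if the JVM is killed forcibly):

Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    repository.save(); // flush the in-memory state on normal JVM exit
}));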

Writing Strings to a binary file java

I have a list of objects that has some simple String properties. I want to be able to save those strings to binary so that when you open the file outside the program, you only see 1's and 0's.
I have managed to save the strings using FileOutputStream; however, I can't manage to get it to write binary. The file reads as clean, readable text. I have tried getBytes().
What would be the best approach for this? Keep in mind that I want to be able to read the file later and construct back the objects. Would it be better to use Serializable and save a list of objects?
Here is my FileWriter:
NB: The toString() is custom and returns a String with linebreaks for every property.
public class FileWriter {
    public void write(String fileName, Savable objectToSave) throws IOException {
        File fileToSave = new File(fileName);
        String stringToSave = objectToSave.toString();
        byte[] bytesToSave = stringToSave.getBytes(StandardCharsets.UTF_8);
        try (OutputStream outputStream = new FileOutputStream(fileToSave)) {
            outputStream.write(bytesToSave);
        } catch (IOException e) {
            throw new IOException("error", e); // keep the original exception as the cause
        }
    }
}
If your goal is simply serializing, implementing Serializable and writing them would work, but your string is still going to be readable. You can encrypt the stream, but anyone decompiling your code can still devise a way to read the values.
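A minimal sketch of that serialization route (objectsToSave is a hypothetical List of objects whose classes implement Serializable):

try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream("objects.bin"))) {
    oos.writeObject(new ArrayList<>(objectsToSave)); // ArrayList itself is Serializable
}

Reading it back is a single readObject() call plus an unchecked cast, but again: the string contents remain plainly visible among the file's bytes.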

Which is the better approach for using wrapper-class read()/write() methods with android.content.res.Resources.openRawResource()?

// Reading an image file from the res/drawable folder and writing it to a file on the external SD card.
// The code below works, no doubt, but I want to improve it:
OutputStream os = new FileOutputStream(file); // File file.........
InputStream is = getResources().openRawResource(R.drawable.an_image);
byte[] b = new byte[is.available()];
is.read(b);
os.write(b);
is.close();
os.close();
In the above code I am using basic IO classes to read and write. My question is: what can I do in order to be able to use wrapper classes like, say, DataInputStream/BufferedReader or PrintStream/BufferedWriter/PrintWriter?
As openRawResource(int id) returns an InputStream,
to read a file from res I either need to typecast like this:
DataInputStream is = (DataInputStream) getResources().openRawResource(R.drawable.an_image);
or I can link the stream directly like this:
DataInputStream is = new DataInputStream(getResources().openRawResource(R.drawable.greenball));
and then I may do this to write it to a file on sd card:
PrintStream ps =new PrintStream (new FileOutputStream(file));
String s;
while ((s = is.readLine()) != null) {
    ps.print(s);
}
So is that the correct approach? Which one is better? Is there a better way, a better practice, or a convention?
Thanks!
If openRawResource() is documented to return an InputStream then you cannot rely on that result to be any more specific kind of InputStream, and in particular, you cannot rely on it to be a DataInputStream. Casting does not change that; it just gives you the chance to experience interesting and exciting exceptions. If you want a DataInputStream wrapping the result of openRawResource() then you must obtain it via the DataInputStream constructor. Similarly for any other wrapper stream.
HOWEVER, do note that DataInputStream likely is not the class you want. It is appropriate for reading back data that were originally written via a DataOutputStream, but it is inappropriate (or at least offers no advantages over any other InputStream) for reading general data.
Furthermore, your use of InputStream.available() is incorrect. That method returns the number of bytes that can currently be read from the stream without blocking, which has only a weak relationship with the total number of bytes that could be read from the stream before it is exhausted (if indeed it ever is).
Moreover, your code is also on shaky ground where it assumes that InputStream.read(byte[]) will read enough bytes to fill the array. It probably will, since that many bytes were reported available, but that's not guaranteed. To copy from one stream to another, you should instead use code along these lines:
private final static int BUFFER_SIZE = 2048;

void copyStream(InputStream in, OutputStream out) throws IOException {
    byte[] buffer = new byte[BUFFER_SIZE];
    int nread;
    while ((nread = in.read(buffer)) != -1) { // read() returns -1 at end of stream
        out.write(buffer, 0, nread);
    }
}
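Applied to the original task, usage could look like this sketch (try-with-resources closes both streams even on failure; R.drawable.an_image and file are from the question's snippet):

try (InputStream in = getResources().openRawResource(R.drawable.an_image);
     OutputStream out = new FileOutputStream(file)) {
    copyStream(in, out);
}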

Creating a List of BufferedReaders

I would like to have a method that would return a list of BufferedReader objects (for example for all files in a directory):
private List<BufferedReader> getInputReaders(List<String> filenames) throws IOException {
    List<BufferedReader> result = new ArrayList<BufferedReader>();
    for (String filename : filenames) {
        result.add(new BufferedReader(new InputStreamReader(new FileInputStream(filename), "UTF-8")));
    }
    return result;
}
Will this be a major waste of resources?
Will all those streams be opened at the moment of creation and remain so therefore holding system resources?
If yes, can I create those readers in "passive" mode without actually opening streams, or is there any other workaround (so I can build a List with thousands of readers safely)?
Yes: the FileInputStream constructor invokes open(), which is a native method that will most likely reserve a file descriptor for the file.
Instead of immediately returning a list of BufferedReaders, why not return a list of something that will open the underlying stream as needed? You can create a class that holds onto a filename and simply open the resource when called.
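A minimal sketch of such a holder, assuming UTF-8 files as in the question (the class name is illustrative):

class LazyReaderSource {
    private final String filename;

    LazyReaderSource(String filename) {
        this.filename = filename;
    }

    // No file descriptor is consumed until open() is actually called.
    BufferedReader open() throws IOException {
        return new BufferedReader(new InputStreamReader(new FileInputStream(filename), "UTF-8"));
    }
}

The caller then keeps a List<LazyReaderSource> and opens each reader only when it is about to consume that file.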
I'm pretty sure it's a bad idea. You risk consuming all the available file descriptors, and there is no point in opening a reader to a file if you don't want to read from it.
If you want to read from the file, then open a reader, read from the file, and close the reader. Then, do the same for the next file to read from.
If you want a unique abstraction to read from various sources (URLs, files, etc.), then create your own Source interface, and multiple implementations which would wrap the resource to read from (URLSource, FileSource, etc.). Only open the actual reader on the wrapped resource when reading from your Source instance.
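A rough sketch of that abstraction (names are illustrative; imports from java.io and java.net assumed):

interface Source {
    Reader openReader() throws IOException;
}

class FileSource implements Source {
    private final File file;
    FileSource(File file) { this.file = file; }
    public Reader openReader() throws IOException {
        return new InputStreamReader(new FileInputStream(file), "UTF-8");
    }
}

class URLSource implements Source {
    private final URL url;
    URLSource(URL url) { this.url = url; }
    public Reader openReader() throws IOException {
        return new InputStreamReader(url.openStream(), "UTF-8");
    }
}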
Yes, those streams will be opened as soon as they are created.
A good way to avoid this is to create a LazyReader class that only initializes the underlying Reader on the first read:
public class LazyReader extends Reader {
    private final String fileName;
    private Reader reader = null;

    public LazyReader(String fileName) {
        super();
        this.fileName = fileName;
    }

    private void init() throws IOException {
        if (reader == null)
            reader = new BufferedReader(new InputStreamReader(new FileInputStream(fileName), "UTF-8"));
    }

    @Override
    public int read(char[] cbuf, int off, int len) throws IOException {
        init();
        return reader.read(cbuf, off, len);
    }

    @Override
    public void close() throws IOException {
        if (reader != null) // nothing to close if the file was never opened
            reader.close();
    }

    // If you want marking, you should also implement mark(int), reset() and markSupported().
}

Wrapping a PipedInputStream with a BufferedInputStream

I have an OutputStream that I needed to read from, and so I used the following (Groovy) code to get an InputStream reference to the data:
PipedInputStream inputStream = new PipedInputStream()
PipedOutputStream outputStream = new PipedOutputStream(inputStream)
new Thread(
new Runnable() {
public void run() {
// Some API method
putDataInOutputStream(outputStream)
outputStream.close()
}
}
).start()
handler.process(inputStream)
In this case, handler is some class that implements an interface which has this method:
public void process(InputStream stream);
The problem that came up in our new requirements was that there was some pre-processing on the stream, and therefore I need to read the stream at least twice in the handler.process() method. Here's some example code from one implementation:
public void process(InputStream stream) {
def bufferedStream = new BufferedInputStream(stream, 30 * 1048576) // 30 MB
bufferedStream.mark(Integer.MAX_VALUE)
parseMetadata(bufferedStream)
bufferedStream.reset()
doTheThingYouDo(bufferedStream)
}
I know that for some input I am not hitting the 30 MB limit or the Integer.MAX_VALUE buffer size. However, I'm always getting the following exception:
java.io.IOException: Stream closed
Is this even possible? I think the problem is the thread closing on the PipedOutputStream, but I don't know how to prevent that or if perhaps I'm creating more problems by being a novice at Java Stream IO.
My best guess is that your parseMetadata somehow closed the stream. I've tried your scenario, and it works fine for me. In general, closing the OutputStream before your handler is done reading is not the problem; that's exactly what the piped streams are for.
Besides, given your situation, I would leave out the piping and the additional thread. If you don't mind having your entire stream in memory, you can do something like:
ByteArrayOutputStream out = new ByteArrayOutputStream();
fillTheOutput(out);
ByteArrayInputStream in = new ByteArrayInputStream(out.toByteArray());
pass1(in);
in.reset();
pass2(in);
If you do mind having everything in memory, you're in trouble anyway, since your BufferedInputStream does roughly the same thing.
Edit: note that you can easily build a new ByteArrayInputStream based on the byte array, which is something you cannot do with regular streams.
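In other words, instead of reset() bookkeeping, each pass can get its own stream over the same bytes:

byte[] data = out.toByteArray();
pass1(new ByteArrayInputStream(data));
pass2(new ByteArrayInputStream(data)); // a fresh stream, so no mark/reset limits apply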
