How to read different Objects using ObjectInputStream in Java

Note: this is not a duplicate, because here we want to write not only objects but also a whole file, and then read it back.
I have created a single file containing 3 objects using ObjectOutputStream:
String
String
File (between 1 and 1.5 GB in size)
Below is the code I used to write the file:
byte[] BUFFER = new byte[1024 * 32];
FileInputStream fis = new FileInputStream("ThisIsTheFile.xyz");
FileOutputStream fos = new FileOutputStream("abcd.dat", true);
ObjectOutputStream oos = new ObjectOutputStream(fos);
String fileId = "BSN-1516-5287B-65893", fTitle = "Emberson Booklet";
for (int i = 0; i < 3; i++) {
    if (i == 0) {
        oos.write(fileId.getBytes(), 0, fileId.length());
    } else if (i == 1) {
        oos.write(fTitle.getBytes(), 0, fTitle.length());
    } else {
        InputStream is = new BufferedInputStream(fis);
        int bytesRead = -1;
        while ((bytesRead = is.read(BUFFER)) != -1) {
            oos.write(BUFFER, 0, bytesRead);
        }
        is.close();
    }
}
fileId = fTitle = null;
oos.flush();
oos.close();
fos.flush();
fos.close();
fis.close();
Now my problem is:
I don't want to exhaust the Java heap space, so I read and write the large file stream simultaneously using a 32 KB byte buffer.
Am I writing these three objects into a single file separately and correctly?
Last but not least, how should I retrieve all 3 of these objects from the "abcd.dat" file through ObjectInputStream?
Please help.

I have created a file with 3 single objects
No. You have created a file with no objects and a whole lot of bytes, and no way of telling where one byte sequence stops and another starts. Use writeObject(), and read them with readObject(). The way you have it, there's no point in using ObjectOutputStream at all.
Note: appending to this file won't work. See here for why.
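For illustration, here is a minimal sketch of that approach using the names from the question (the output file "restored.xyz" is a placeholder). The two strings go through writeObject(), and because a 1-1.5 GB file should not be loaded into memory as a single object, the file body is streamed as length-prefixed raw bytes instead; note that readObject() also throws ClassNotFoundException, which the surrounding method has to catch or declare.
try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream("abcd.dat"));
     InputStream in = new BufferedInputStream(new FileInputStream("ThisIsTheFile.xyz"))) {
    oos.writeObject(fileId);                                  // object #1
    oos.writeObject(fTitle);                                  // object #2
    oos.writeLong(new File("ThisIsTheFile.xyz").length());    // length of the raw data that follows
    byte[] buffer = new byte[32 * 1024];
    for (int len; (len = in.read(buffer)) != -1; ) {
        oos.write(buffer, 0, len);                            // stream the big file in 32 KB chunks
    }
}

// Reading it back (typically in a separate method or program):
try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream("abcd.dat"));
     OutputStream out = new BufferedOutputStream(new FileOutputStream("restored.xyz"))) {
    String fileId = (String) ois.readObject();
    String fTitle = (String) ois.readObject();
    long remaining = ois.readLong();
    byte[] buffer = new byte[32 * 1024];
    while (remaining > 0) {
        int len = ois.read(buffer, 0, (int) Math.min(buffer.length, remaining));
        if (len == -1) break;
        out.write(buffer, 0, len);
        remaining -= len;
    }
}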

Related

Copy content from one file to multiple files using Java

FileInputStream Fread = new FileInputStream("somefilename");
FileOutputStream Fwrite = null;
for (int i = 1; i <= 5; i++)
{
    String fileName = "file" + i + ".txt";
    Fwrite = new FileOutputStream(fileName);
    int c;
    while ((c = Fread.read()) != -1)
    {
        Fwrite.write((char) c);
    }
    Fwrite.close();
}
Fread.close();
The above code writes only to one file. How can I make it write the content of one file to multiple files?
FYI: Note that the read() method you used returns an int holding a byte value (or -1 at end of stream), not a char, so calling write((char) c) should have been just write(c).
To write to multiple files in parallel when copying a file, you create an array of output streams for the destination files, then iterate over the array to write the data to all of them.
For better performance, you should always do this using a buffer. Writing one byte at a time will not perform well.
public static void copyToMultipleFiles(String inFile, String... outFiles) throws IOException {
    OutputStream[] outStreams = new OutputStream[outFiles.length];
    try {
        for (int i = 0; i < outFiles.length; i++)
            outStreams[i] = new FileOutputStream(outFiles[i]);
        try (InputStream inStream = new FileInputStream(inFile)) {
            byte[] buf = new byte[16384];
            for (int len; (len = inStream.read(buf)) > 0; )
                for (OutputStream outStream : outStreams)
                    outStream.write(buf, 0, len);
        }
    } finally {
        for (OutputStream outStream : outStreams)
            if (outStream != null)
                outStream.close();
    }
}
You will have to create multiple FileOutputStream instances (fwrite1, fwrite2, fwrite3), one for each file you want to write to; then, as you read, you simply write to all of them, as sketched below.
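A minimal sketch of that approach with three explicit streams (file names are placeholders):
try (InputStream fread = new FileInputStream("somefilename");
     OutputStream fwrite1 = new FileOutputStream("file1.txt");
     OutputStream fwrite2 = new FileOutputStream("file2.txt");
     OutputStream fwrite3 = new FileOutputStream("file3.txt")) {
    byte[] buffer = new byte[8192];
    for (int len; (len = fread.read(buffer)) != -1; ) {
        fwrite1.write(buffer, 0, len);
        fwrite2.write(buffer, 0, len);
        fwrite3.write(buffer, 0, len);
    }
}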
Add this line:
Fread.reset();
after Fwrite.close();
And change the first line of code to this:
InputStream Fread = new BufferedInputStream(new FileInputStream("somefilename"));
Fread.mark(Integer.MAX_VALUE); // the read limit passed to mark() must be at least the file size, or reset() will fail once the buffer is exceeded
The Fread stream reaches the end once, and then there is nothing to make it start again from the beginning.
To solve this you can:
call Fread.reset() after writing each file
cache Fread's content somewhere and write to Fwrite from that source
create an array / collection of FileOutputStream and write each byte to all of them during the iteration
The recommended solution is of course the first one (see the sketch after this list).
There are also some other problems in your code:
You are highly encouraged to use try-with-resources for streams, so that they are closed safely
You are not following the Java naming conventions, which say to name variables in lowerCamelCase
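For reference, here is a minimal sketch of the first option combined with those two points, assuming the file names from the question; note that the whole file ends up in the BufferedInputStream's mark buffer, which is fine for a small text file:
try (InputStream source = new BufferedInputStream(new FileInputStream("somefilename"))) {
    source.mark(Integer.MAX_VALUE);                  // read limit must cover the whole file
    for (int i = 1; i <= 5; i++) {
        try (OutputStream target = new FileOutputStream("file" + i + ".txt")) {
            byte[] buffer = new byte[8192];
            for (int len; (len = source.read(buffer)) != -1; ) {
                target.write(buffer, 0, len);
            }
        }
        source.reset();                              // rewind to the mark for the next copy
    }
}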

From FileInputStream to BufferedInputStream conversion

We were given a few exercises in lab, and one of them is to convert the file-transfer method from FileInputStream to BufferedInputStream. It's a client sending a GET request to a web server, which sends back the requested file.
I came up with a simple solution, and I just wanted to check if it's correct.
Original code:
try {
    FileInputStream fis = new FileInputStream(req);
    // req, String containing file name
    byte[] data = new byte[fis.available()];
    fis.read(data);
    out.write(data); // OutputStream out = socket.getOutputStream();
} catch (FileNotFoundException e) {
    new PrintStream(out).println("404 Not Found");
}
My try:
try {
    BufferedInputStream bis = new BufferedInputStream(new FileInputStream(req));
    byte[] data = new byte[4];
    while (bis.read(data) > -1) {
        out.write(data);
        data = new byte[4];
    }
} catch (FileNotFoundException e) {
    new PrintStream(out).println("404 Not Found");
}
The file is a web page named index.html, which contains a simple html page.
I have to reallocate the array every time because, on the last iteration of the while loop, if the file size isn't a multiple of 4, the data array would still contain bytes from the previous iteration, which then show up in the browser.
I chose 4 as data size for debugging purposes.
Output is correct.
Is this a good solution or can I do better?
There's no need to re-create the byte array each time - just overwrite it. More importantly though, you have a conceptual mistake inside your loop. Each iteration just writes the array to the stream assuming it's all valid. If you examine BufferedInputStream#read's documentation you'll see it may not read enough data to fill the entire array, and will return the number of bytes it actually read. You should use this number to limit the amount of bytes you're writing:
int len;
while ((len = bis.read(data)) > -1) {
    out.write(data, 0, len);
}
I suggest you close the file once you are done. The BufferedInputStream uses an 8 KB buffer by default, yet you are copying through a much smaller 4-byte array. A simpler solution is to copy 8 KB at a time and skip the added buffer entirely:
try (InputStream in = new FileInputStream(req)) {
    byte[] data = new byte[8 << 10];
    for (int len; (len = in.read(data)) > -1; )
        out.write(data, 0, len);
} catch (IOException e) {
    out.write("404 Not Found\n".getBytes());
}

Convert InputStreamReader to InputStream

How can I convert an InputStreamReader back to an InputStream? I have an InputStream which contains some string and byte data, and I want to parse it. So I wrap my InputStream in a BufferedReader and read 3 lines from it. After that I want to get the rest of the data (bytes) as is, but if I try to read it, nothing happens.
Code snippet:
BufferedReader br = new BufferedReader(new InputStreamReader(is, "UTF-8"));
String endOfData = br.readLine();
String contentDisposition = br.readLine();
String contentType = br.readLine();
file = new File(filename);
if (file.exists()) file.delete();
file.createNewFile();
FileOutputStream fos = new FileOutputStream(file);
byte[] data = new byte[8192];
int len = 0;
while (-1 != (len = is.read(data)))
{
    fos.write(data, 0, len);
    Log.e("len", len + "");
}
fos.flush();
fos.close();
is.close();
The file is empty. If I don't wrap the InputStream it works fine, but I need to read those 3 lines first and discard them.
Thanks.
If you want to mix text and byte data together, you should use DataOutputStream.writeUTF to write out those 3 lines; this way a single stream will be able to retrieve all the data that you need.
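This requires changing the producer side as well. A rough sketch of both sides, using placeholder stream and file names ("upload.dat", bodyStream):
// Writing: three length-prefixed strings, then the raw bytes.
try (DataOutputStream out = new DataOutputStream(new FileOutputStream("upload.dat"))) {
    out.writeUTF(endOfData);
    out.writeUTF(contentDisposition);
    out.writeUTF(contentType);
    byte[] buf = new byte[8192];
    for (int len; (len = bodyStream.read(buf)) != -1; ) {   // bodyStream: source of the byte data
        out.write(buf, 0, len);
    }
}

// Reading: the same three strings back; the rest of the stream is the raw data.
try (DataInputStream in = new DataInputStream(new FileInputStream("upload.dat"))) {
    String endOfData = in.readUTF();
    String contentDisposition = in.readUTF();
    String contentType = in.readUTF();
    // copy the remaining bytes wherever they need to go
}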
Take a look at commons-io's ReaderInputStream: it is a little heavy handed, but you can wrap the BufferedReader with that and read it as an input stream again.
It's pretty hard to mix byte and character input correctly, especially once you start throwing buffered readers / streams into the mix. I'd suggest that you either pick one and stick with it (converting your bytes to strings as necessary; care with the encoding!) or wrap the entire thing in a ZipOutputStream so you can have multiple logical "files" with different contents.
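If you go the ZipOutputStream route, a minimal sketch looks like this (entry names and contents are placeholders; the classes come from java.util.zip):
try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream("bundle.zip"))) {
    zos.putNextEntry(new ZipEntry("headers.txt"));                               // the text part
    zos.write(headerText.getBytes(java.nio.charset.StandardCharsets.UTF_8));
    zos.closeEntry();

    zos.putNextEntry(new ZipEntry("body.bin"));                                  // the byte part
    zos.write(bodyBytes);
    zos.closeEntry();
}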

Java InputStream reading problem

I have a Java class where I'm reading data in via an InputStream:
byte[] b = null;
try {
    b = new byte[in.available()];
    in.read(b);
} catch (IOException e) {
    e.printStackTrace();
}
It works perfectly when I run my app from the IDE (Eclipse).
But when I export my project and it's packed in a JAR, the read command doesn't read all the data. How could I fix it?
The problem mostly occurs when the InputStream is reading a file (~10 KB).
Thanks!
Usually I prefer using a fixed-size buffer when reading from an input stream. As evilone pointed out, using available() as the buffer size might not be a good idea because, say, if you are reading a remote resource, you might not know the available bytes in advance. You can read the javadoc of InputStream to get more insight.
Here is the code snippet I usually use for reading input stream:
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead = 0;
while ((bytesRead = in.read(buffer)) >= 0) {
    for (int i = 0; i < bytesRead; i++) {
        // Do whatever you need with the bytes here
    }
}
The version of read() I'm using here reads up to buffer.length bytes and returns the number of bytes actually read. This means your buffer may contain trailing garbage data from a previous read, so it is very important to use only the bytes up to bytesRead.
Note the condition (bytesRead = in.read(buffer)) >= 0: there is nothing in the InputStream spec saying that read() cannot read 0 bytes. You may need to handle the case where read() reads 0 bytes as a special case, depending on your situation. For local files I never experienced such a case; however, when reading remote resources, I have actually seen read() return 0 bytes constantly, turning the above code into an infinite loop. I solved the infinite-loop problem by counting the number of times I read 0 bytes; when the counter exceeded a threshold I threw an exception. You may not encounter this problem, but just keep it in mind :)
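Roughly, that guard might look like this (the threshold value is arbitrary):
int zeroReads = 0;
int bytesRead;
while ((bytesRead = in.read(buffer)) >= 0) {
    if (bytesRead == 0) {
        if (++zeroReads > 1000) {                       // give up after too many empty reads
            throw new IOException("too many zero-length reads");
        }
        continue;
    }
    zeroReads = 0;
    // process buffer[0 .. bytesRead - 1]
}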
I would also stay away from creating a new byte array for each read, for performance reasons.
read() will return -1 when the InputStream is depleted. There is also a version of read which takes an array, this allows you to do chunked reads. It returns the number of bytes actually read or -1 when at the end of the InputStream. Combine this with a dynamic buffer such as ByteArrayOutputStream to get the following:
InputStream in = ...
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int read;
byte[] input = new byte[4096];
while (-1 != (read = in.read(input))) {
    buffer.write(input, 0, read);
}
input = buffer.toByteArray();
This cuts down a lot on the number of methods you have to invoke and allows the ByteArrayOutputStream to grow its internal buffer faster.
File file = new File("/path/to/file");
try {
    InputStream is = new FileInputStream(file);
    byte[] bytes = IOUtils.toByteArray(is);
    System.out.println("Byte array size: " + bytes.length);
} catch (IOException e) {
    e.printStackTrace();
}
Below is a snippet of code that downloads a file (*.png, *.jpeg, *.gif, ...) and writes it to a BufferedOutputStream wrapping the HttpServletResponse output.
BufferedInputStream inputStream = bo.getBufferedInputStream(imageFile);
try {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    int bytesRead = 0;
    byte[] input = new byte[DefaultBufferSizeIndicator.getDefaultBufferSize()];
    while (-1 != (bytesRead = inputStream.read(input))) {
        buffer.write(input, 0, bytesRead);
    }
    input = buffer.toByteArray();
    response.reset();
    response.setBufferSize(DefaultBufferSizeIndicator.getDefaultBufferSize());
    response.setContentType(mimeType);
    // Here's the secret. Content-Length should equal the number of bytes read.
    response.setHeader("Content-Length", String.valueOf(buffer.size()));
    response.setHeader("Content-Disposition", "inline; filename=\"" + imageFile.getName() + "\"");
    BufferedOutputStream outputStream = new BufferedOutputStream(response.getOutputStream(), DefaultBufferSizeIndicator.getDefaultBufferSize());
    try {
        outputStream.write(input, 0, buffer.size());
    } finally {
        ImageBO.close(outputStream);
    }
} finally {
    ImageBO.close(inputStream);
}
Hope this helps.

read a file byte by byte then perform some operation every n bytes

I would like to know how I can read a file byte by byte and then perform some operation every n bytes.
For example:
Say I have a file of size = 50 bytes. I want to divide it into blocks of n bytes each. Then each block is sent to a function for some operations to be done on those bytes. The blocks are to be created during the read process and sent to the function as soon as a block reaches n bytes, so that I don't use much memory for storing all the blocks.
I want the output of the function to be written/appended to a new file.
This is what I have so far for reading, but I don't know if it is right:
fc = new JFileChooser();
File f = fc.getSelectedFile();
FileInputStream in = new FileInputStream(f);
byte[] b = new byte[16];
in.read(b);
I haven't done anything yet for the write process.
You're on the right lines. Consider wrapping your FileInputStream in a BufferedInputStream, which improves I/O efficiency by reading the file in larger chunks.
The next step is to check the number of bytes read (returned by your call to read()) and to hand the array off to the processing function. You'll need to pass the number of bytes read to that method too, in case the array was only partially populated.
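A minimal sketch of that loop, where process() stands in for your block-handling function and f is the selected file:
try (InputStream in = new BufferedInputStream(new FileInputStream(f))) {
    byte[] block = new byte[16];
    for (int n; (n = in.read(block)) != -1; ) {
        process(block, n);   // only the first n bytes of the block are valid
    }
}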
So far your code looks OK. For reading binary files (as opposed to text files) you should indeed use FileInputStream (for reading text files, use a Reader, such as FileReader).
Note that you should check the return value of in.read(b);, because it might read fewer than 16 bytes if there are fewer than 16 bytes left at the end of the file.
Of course you should add a loop to the program that keeps reading blocks of bytes until you reach the end of the file.
To write data to a binary file, use FileOutputStream. That class has a constructor that you can pass a flag to indicate that you want to append to an existing file:
FileOutputStream out = new FileOutputStream("output.bin", true);
Also, don't forget to call close() on the FileInputStream and FileOutputStream when you are done.
See the Java API documentation, especially the classes in the java.io package.
I believe that this will work:
final int blockSize = // some calculation
byte[] block = new byte[blockSize];
InputStream is = new FileInputStream(f);
try {
int ret = -1;
do {
int bytesRead = 0;
while (bytesRead < blockSize) {
ret = is.read(block, bytesRead, blockSize - bytesRead);
if (ret < 0)
break; // no more data
bytesRead += ret;
}
myFunction(block, bytesRead);
} while (0 <= ret);
}
finally {
is.close();
}
This code will call myFunction with blockSize bytes for all but possibly the last invocation.
It's a start.
You should check what read() returns: it can read fewer bytes than the size of the array, and it also indicates when the end of the file has been reached.
Obviously, you need to call read() in a loop...
It might be a good idea to reuse the array, but that requires that the code processing the array copies what it needs, rather than just keeping a reference to the array.
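For instance, something along these lines, where queue is just a placeholder for wherever the copy ends up:
// Hand the processing code its own copy so the read buffer can be reused safely.
byte[] chunk = java.util.Arrays.copyOf(block, bytesRead);
queue.add(chunk);   // 'queue' is a placeholder destination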
I think this is what you might need:
void readFile(String path, int n) {
    try {
        File f = new File(path);
        FileInputStream fis = new FileInputStream(f);
        byte[] array = new byte[n];
        int ret;
        while ((ret = fis.read(array)) > -1) {
            doSomething(array, ret);   // only the first ret bytes are valid
        }
        fis.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
