FileInputStream Fread = new FileInputStream("somefilename");
FileOutputStream Fwrite = null;
for (int i = 1; i <= 5; i++)
{
    String fileName = "file" + i + ".txt";
    Fwrite = new FileOutputStream(fileName);
    int c;
    while ((c = Fread.read()) != -1)
    {
        Fwrite.write((char) c);
    }
    Fwrite.close();
}
Fread.close();
The above code writes the content to only one file. How can I make it write the content of one file to multiple files?
FYI: Note that the read() method you used returns the next byte as an int (or -1 at end of file), not a char, so calling write((char) c) should have been just write(c).
To write to multiple files in parallel when copying a file, create an array of output streams for the destination files, then iterate over the array to write the data to all of them.
For better performance, you should always do this using a buffer. Writing one byte at a time will not perform well.
public static void copyToMultipleFiles(String inFile, String... outFiles) throws IOException {
    OutputStream[] outStreams = new OutputStream[outFiles.length];
    try {
        for (int i = 0; i < outFiles.length; i++)
            outStreams[i] = new FileOutputStream(outFiles[i]);
        try (InputStream inStream = new FileInputStream(inFile)) {
            byte[] buf = new byte[16384];
            for (int len; (len = inStream.read(buf)) > 0; )
                for (OutputStream outStream : outStreams)
                    outStream.write(buf, 0, len);
        }
    } finally {
        for (OutputStream outStream : outStreams)
            if (outStream != null)
                outStream.close();
    }
}
You will have to create multiple instances of FileOutputStream (fwrite1, fwrite2, fwrite3, one for each file you want to write to); then, as you read, you simply write each byte to all of them.
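A minimal sketch of that idea, reusing the style of the code in the question (the output file names are illustrative):
FileInputStream fread = new FileInputStream("somefilename");
FileOutputStream fwrite1 = new FileOutputStream("file1.txt");
FileOutputStream fwrite2 = new FileOutputStream("file2.txt");
FileOutputStream fwrite3 = new FileOutputStream("file3.txt");
int c;
while ((c = fread.read()) != -1) {
    fwrite1.write(c);   // each byte goes to every output stream
    fwrite2.write(c);
    fwrite3.write(c);
}
fwrite1.close();
fwrite2.close();
fwrite3.close();
fread.close();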
Add this line:
Fread.reset();
after Fwrite.close();
And change the first line of code to this:
InputStream Fread = new BufferedInputStream(new FileInputStream("somefilename"));
Fread.mark(Integer.MAX_VALUE); // the readlimit must be at least the file size, or reset() will fail
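Putting those pieces together, the whole loop might look like this (a sketch; note that marking with a large readlimit makes the BufferedInputStream keep the whole file buffered in memory):
InputStream Fread = new BufferedInputStream(new FileInputStream("somefilename"));
Fread.mark(Integer.MAX_VALUE);
for (int i = 1; i <= 5; i++)
{
    FileOutputStream Fwrite = new FileOutputStream("file" + i + ".txt");
    int c;
    while ((c = Fread.read()) != -1)
    {
        Fwrite.write(c);
    }
    Fwrite.close();
    Fread.reset(); // rewind to the mark before writing the next copy
}
Fread.close();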
The FRead stream reaches the end of the file once, and after that there is nothing to make it start from the beginning again.
To solve this you can:
call FRead.reset() after writing each file
cache FRead's content somewhere and write to FWrite from that source
create an array / collection of FileOutputStream and write each byte to all of them during iteration
The recommended solution is of course the first one.
Also, there are some problems in your code:
You are highly encouraged to use try-with-resources for streams, as they should be safely closed
You do not seem to follow the naming conventions, which say to name variables in lowerCamelCase
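For example, the copy loop could be rewritten like this (a sketch combining the two points above with the reset() approach; the readlimit caveat from earlier still applies):
try (InputStream fread = new BufferedInputStream(new FileInputStream("somefilename"))) {
    fread.mark(Integer.MAX_VALUE);
    for (int i = 1; i <= 5; i++) {
        // each output stream is closed automatically, even on exceptions
        try (OutputStream fwrite = new FileOutputStream("file" + i + ".txt")) {
            int c;
            while ((c = fread.read()) != -1) {
                fwrite.write(c);
            }
        }
        fread.reset();
    }
}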
I am trying to create a GitHub webhook. It sends a payload every time I publish a new package to one of my repositories. My issue is that I cannot seem to be able to read in the whole body. It gets cut off at the same number of bytes each time. However, I can see the whole body if I read it using HttpServletRequest#getReader(). Is there something I am doing wrong when trying to read the input stream?
Here is the code for reading the body:
byte[] bodyBytes = new byte[request.getContentLength()];
System.out.println(request.getContentLength());
request.getInputStream().read(bodyBytes);
//System.out.println(request.getReader().readLine()); //works correctly
try (FileWriter fw = new FileWriter(new File("./payload.txt"))) {
    for (byte i : bodyBytes)
        fw.write("0x" + String.format("%02x ", i) + " ");
    fw.write("\n\n\n");
    fw.write(new String(bodyBytes));
}
As per the Javadocs, InputStream.read(byte[]) will read at least one byte, when available, and at most as many as the size of the byte array argument. It may read fewer for any reason, in which case you have to call it repeatedly to get the entire content. Simplest case: write to a ByteArrayOutputStream:
byte[] buf = new byte[1024];
int r;
InputStream is = request.getInputStream();
try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
    while ((r = is.read(buf)) >= 0) {
        baos.write(buf, 0, r);
    }
    // the bytes are now accessible:
    byte[] entireContent = baos.toByteArray();
}
This is the principle; it has the disadvantage that it stores the entire content in memory. You may want to process each "batch" of the input and write it to the file instead of keeping it in memory, e.g. as:
byte[] buf = new byte[1024];
int r, i;
InputStream is = request.getInputStream();
try (FileWriter fw = new FileWriter(new File("./payload.txt"))) {
    while ((r = is.read(buf)) >= 0) {
        for (i = 0; i < r; i++) {
            fw.write("0x" + String.format("%02x ", buf[i]) + " ");
        }
        // *************** NOTE ****************************
        // Apparently you need the entire content as well, so
        // this kind of streaming does not apply in this case.
        // You have to store the entire content in memory.
        // Keeping the code here as an example/reference.
    }
}
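Since the entire content is needed anyway, the two snippets above can be combined: buffer everything first, then do the hex dump from the in-memory array (a sketch):
byte[] buf = new byte[1024];
int r;
InputStream is = request.getInputStream();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
while ((r = is.read(buf)) >= 0) {
    baos.write(buf, 0, r);
}
byte[] entireContent = baos.toByteArray();   // the whole body, however many reads it took
try (FileWriter fw = new FileWriter(new File("./payload.txt"))) {
    for (byte b : entireContent) {
        fw.write("0x" + String.format("%02x ", b) + " ");
    }
    fw.write("\n\n\n");
    fw.write(new String(entireContent));
}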
Note: it's not a duplicate, because here we want to write not only Objects, but also a whole file, then read it back.
I have created a single file with 3 objects using ObjectOutputStream:
String
String
File (size is between 1 and 1.5 GB)
Below is the code I have used to write the file:
byte[] BUFFER = new byte[1024 * 32];
FileInputStream fis = new FileInputStream("ThisIsTheFile.xyz");
FileOutputStream fos = new FileOutputStream("abcd.dat", true);
ObjectOutputStream oos = new ObjectOutputStream(fos);
String fileId = "BSN-1516-5287B-65893", fTitle = "Emberson Booklet";
for (int i = 0; i < 3; i++) {
    if (i == 0) {
        oos.write(fileId.getBytes(), 0, fileId.length());
    } else if (i == 1) {
        oos.write(fTitle.getBytes(), 0, fTitle.length());
    } else {
        InputStream is = new BufferedInputStream(fis);
        int bytesRead = -1;
        while ((bytesRead = is.read(BUFFER)) != -1) {
            oos.write(BUFFER, 0, bytesRead);
        }
        is.close();
    }
}
fileId = fTitle = null;
oos.flush();
oos.close();
fos.flush();
fos.close();
fis.close();
Now my problem is:
I don't want to blow the Java heap space, so I read & write the large file streams simultaneously using a 32KB buffer.
Am I writing these three objects into a single file separately & correctly?
Last but not least, how should I retrieve all 3 of the above objects from the "abcd.dat" file through ObjectInputStream?
Please help.
I have created a file with 3 single objects
No. You have created a file with no objects and a whole lot of bytes, and no way of telling where one byte sequence stops and another starts. Use writeObject(), and read them with readObject(). The way you have it, there's no point in using ObjectOutputStream at all.
Note: appending to this file won't work. See here for why.
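A sketch of what that could look like: the two strings go through writeObject()/readObject(), while the large file's content is streamed as raw bytes preceded by its length, since serializing 1.5 GB as a single object would hold it all in memory (the "restored.xyz" name is illustrative):
// Writing: two String objects, then the file length, then raw bytes.
try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream("abcd.dat"));
     InputStream is = new BufferedInputStream(new FileInputStream("ThisIsTheFile.xyz"))) {
    oos.writeObject("BSN-1516-5287B-65893"); // fileId
    oos.writeObject("Emberson Booklet");     // fTitle
    oos.writeLong(new File("ThisIsTheFile.xyz").length());
    byte[] buffer = new byte[1024 * 32];
    int bytesRead;
    while ((bytesRead = is.read(buffer)) != -1) {
        oos.write(buffer, 0, bytesRead);
    }
}
// Reading: readObject() for the strings, then copy exactly 'length' bytes.
// (readObject() also declares ClassNotFoundException.)
try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream("abcd.dat"));
     OutputStream os = new BufferedOutputStream(new FileOutputStream("restored.xyz"))) {
    String fileId = (String) ois.readObject();
    String fTitle = (String) ois.readObject();
    long remaining = ois.readLong();
    byte[] buffer = new byte[1024 * 32];
    while (remaining > 0) {
        int bytesRead = ois.read(buffer, 0, (int) Math.min(buffer.length, remaining));
        if (bytesRead == -1)
            break;
        os.write(buffer, 0, bytesRead);
        remaining -= bytesRead;
    }
}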
I am getting an OutOfMemory exception. Why? I am using this code for logging. Is this approach correct?
Exceptions and closing of streams are handled in parent methods.
private static void writeToFile(File file, FileWriter out, String message) throws IOException {
    if (file.exists() && file.isFile()) {
        if ((file.length() + message.getBytes().length) <= FILE_MAX_SIZE_B) {
            out.write(message);
        } else {
            int cutLenght = (int) (file.length() + message.getBytes().length - FILE_MAX_SIZE_B);
            FileInputStream fileInputStream = new FileInputStream(file);
            BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(fileInputStream));
            char[] buf = new char[1024];
            int numRead = 0;
            StringBuffer text = new StringBuffer(1000);
            while ((numRead = bufferedReader.read(buf)) != -1) {
                text.append(buf, 0, numRead);
            }
            String result = new String(text).substring(cutLenght);
            result += message;
            FileWriter fileWriter = new FileWriter(file, appendToFile);
            writeToFile(file, fileWriter, result);
            bufferedReader.close();
        }
    }
}
EDIT:
I am using this method for writing my logs to a file. So, for example, in one second I can make 10 log calls. I am getting the error on these lines:
while ((numRead = bufferedReader.read(buf)) != -1) {
    text.append(buf, 0, numRead);
}
My guess is that you are getting the OutOfMemoryError because you are reading the entire contents of the log file back into memory once it has gotten too close to its maximum size.
You could instead read and write it in smaller chunks, but that could be tricky since you have to avoid overwriting something you haven't already read.
Overall, this technique seems like a very inefficient method of maintaining the log data. Some alternative approaches off the top of my head:
(1) maintain a set of n log files, each with maximum size FILE_MAX_SIZE_B/n. When the first log fills up, open the next one for writing, and so on; when the last one fills up, go back to the first one. In this way you are discarding some of the oldest log data each time you switch files, but not all of it, while still maintaining your overall size limit. (A minimal sketch of this follows after this list.)
(2) rotate the data within a single file. After each write, add a marker that indicates this is the end of the log stream. When the file has reached its maximum size, just start again at the beginning, overwriting the data that is there. The marker will tell you where the latest message is.
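A minimal sketch of approach (1), with illustrative names and constants:
class RotatingLogger {
    private static final int FILE_COUNT = 4;
    private static final long FILE_MAX_SIZE_B = 4 * 1024 * 1024;
    private int current = 0;

    synchronized void log(String message) throws IOException {
        File f = new File("app.log." + current);
        if (f.length() + message.getBytes().length > FILE_MAX_SIZE_B / FILE_COUNT) {
            current = (current + 1) % FILE_COUNT; // switch to the next slot...
            f = new File("app.log." + current);
            f.delete();                           // ...discarding its old content
        }
        try (FileWriter out = new FileWriter(f, true)) {
            out.write(message);
        }
    }
}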
Try something like this:
void appendToFile(File f, CharSequence message, Charset cs, long maximumSize) throws IOException {
    long available = maximumSize - f.length();
    if (available > 0) {
        FileOutputStream fos = new FileOutputStream(f, true);
        try {
            CharBuffer chars = CharBuffer.wrap(message);
            ByteBuffer bytes = ByteBuffer.allocate(8 * 1024); // Re-used when encoding the string
            CharsetEncoder enc = cs.newEncoder();
            CoderResult res;
            do {
                res = enc.encode(chars, bytes, true);
                bytes.flip();
                long len = Math.min(available, bytes.remaining());
                available -= len;
                fos.write(bytes.array(), bytes.position(), (int) len);
                bytes.clear();
            } while (res == CoderResult.OVERFLOW && available > 0);
        } finally {
            fos.close();
        }
    }
}
Testable with this:
File f = new File(getCacheDir(), "tmp.txt");
f.delete();
// Or whatever charset you want.
Charset cs = Charset.forName("UTF-8");
int maxlen = 2 * 1024; // For this test, 2 KB
try {
    for (int i = 0; i < maxlen / 20; i++) {
        // Write 30 characters maxlen/20 times == guaranteed overflow
        appendToFile(f, "123456789012345678901234567890", cs, maxlen);
        System.out.println("Length=" + f.length());
    }
} catch (Throwable t) {
    t.printStackTrace();
}
f.delete();
Well, you're getting OOM because you're trying to load a huge file into memory.
Did you try opening it with the append option instead?
You get an OOME because you load the whole file and then take only part of the resulting string. Instead, do a skip() on your input stream and read only what you are going to keep.
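For example, a sketch of that idea reusing the names from the question (note that Reader.skip() counts chars, which only matches bytes for single-byte encodings, and may skip fewer characters than requested):
long cutLength = file.length() + message.getBytes().length - FILE_MAX_SIZE_B;
StringBuilder text = new StringBuilder();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(new FileInputStream(file)))) {
    reader.skip(cutLength); // don't buffer what would be cut off anyway
    char[] buf = new char[1024];
    int numRead;
    while ((numRead = reader.read(buf)) != -1) {
        text.append(buf, 0, numRead);
    }
}
text.append(message);
try (FileWriter out = new FileWriter(file, false)) { // overwrite, not append
    out.write(text.toString());
}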
I would like to know how can I read a file byte by byte then perform some operation every n bytes.
for example:
Say I have a file of size = 50 bytes. I want to divide it into blocks, each of n bytes. Then each block is sent to a function for some operations to be done on those bytes. The blocks are to be created during the read process and sent to the function when a block reaches n bytes, so that I don't use much memory for storing all blocks.
I want the output of the function to be written/appended on a new file.
This is what I've got so far for reading, though I don't know if it is right:
JFileChooser fc = new JFileChooser();
File f = fc.getSelectedFile();
FileInputStream in = new FileInputStream(f);
byte[] b = new byte[16];
in.read(b);
I haven't done anything yet for the write process.
You're on the right lines. Consider wrapping your FileInputStream with a BufferedInputStream, which improves I/O efficiency by reading the file in chunks.
The next step is to check the number of bytes read (returned by your call to read) and hand the array off to the processing function. Obviously you'll need to pass the number of bytes read to this method too, in case the array was only partially populated.
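A minimal sketch of that loop ('process' stands in for your block-handling function):
try (InputStream in = new BufferedInputStream(new FileInputStream(f))) {
    byte[] block = new byte[16];
    int bytesRead;
    while ((bytesRead = in.read(block)) != -1) {
        process(block, bytesRead); // bytesRead may be < 16 for the last block
    }
}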
So far your code looks OK. For reading binary files (as opposed to text files) you should indeed use FileInputStream (for reading text files, you should use a Reader, such as FileReader).
Note that you should check the return value from in.read(b);, because it might read fewer than 16 bytes if there are fewer than 16 bytes left at the end of the file.
Of course, you should add a loop to the program that keeps reading blocks of bytes until you reach the end of the file.
To write data to a binary file, use FileOutputStream. That class has a constructor that you can pass a flag to indicate that you want to append to an existing file:
FileOutputStream out = new FileOutputStream("output.bin", true);
Also, don't forget to call close() on the FileInputStream and FileOutputStream when you are done.
See the Java API documentation, especially the classes in the java.io package.
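Putting those steps together (a sketch; 'transform' is a placeholder for whatever operation you perform on each block, and try-with-resources takes care of the close() calls mentioned above):
try (FileInputStream in = new FileInputStream(f);
     FileOutputStream out = new FileOutputStream("output.bin", true)) {
    byte[] b = new byte[16];
    int n;
    while ((n = in.read(b)) != -1) { // n may be less than 16 at the end
        byte[] result = transform(b, n);
        out.write(result);
    }
}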
I believe that this will work:
final int blockSize = // some calculation
byte[] block = new byte[blockSize];
InputStream is = new FileInputStream(f);
try {
    int ret = -1;
    do {
        int bytesRead = 0;
        while (bytesRead < blockSize) {
            ret = is.read(block, bytesRead, blockSize - bytesRead);
            if (ret < 0)
                break; // no more data
            bytesRead += ret;
        }
        myFunction(block, bytesRead);
    } while (0 <= ret);
}
finally {
    is.close();
}
This code will call myFunction with blockSize bytes for all but possibly the last invocation.
It's a start.
You should check what read() returns. It can read fewer bytes than the size of the array, and also indicate that the end of the file is reached.
Obviously, you need to read() in a loop...
It might be a good idea to reuse the array, but that requires that the part that reads the array copies what it needs, rather than just keeping a reference to the array.
I think this is what you might need:
void readFile(String path, int n) {
    try {
        File f = new File(path);
        FileInputStream fis = new FileInputStream(f);
        byte[] array = new byte[n];
        int ret;
        // read before the EOF check, so doSomething() is never called with -1;
        // ret is the number of valid bytes in the array
        while ((ret = fis.read(array)) > -1) {
            doSomething(array, ret);
        }
        fis.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Is there a way to prepend a line to the File in Java, without creating a temporary file, and writing the needed content to it?
No, there is no way to do that SAFELY in Java. (Or AFAIK, any other programming language.)
No filesystem implementation in any mainstream operating system supports this kind of thing, and you won't find this feature supported in any mainstream programming languages.
Real world file systems are implemented on devices that store data as fixed sized "blocks". It is not possible to implement a file system model where you can insert bytes into the middle of a file without significantly slowing down file I/O, wasting disk space or both.
The solutions that involve an in-place rewrite of the file are inherently unsafe. If your application is killed or the power dies in the middle of the prepend / rewrite process, you are likely to lose data. I would NOT recommend using that approach in practice.
Use a temporary file and rename it. It is safer.
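A sketch of that approach using java.nio.file (Java 7+); "data.txt" and the prepended line are illustrative:
Path target = Paths.get("data.txt").toAbsolutePath();
Path tmp = Files.createTempFile(target.getParent(), "prepend", ".tmp");
try (BufferedWriter w = Files.newBufferedWriter(tmp, StandardCharsets.UTF_8)) {
    w.write("the line to prepend");
    w.newLine();
    // copy the original content after the new first line
    try (BufferedReader r = Files.newBufferedReader(target, StandardCharsets.UTF_8)) {
        String line;
        while ((line = r.readLine()) != null) {
            w.write(line);
            w.newLine();
        }
    }
}
// replace the original in a single rename step
Files.move(tmp, target, StandardCopyOption.REPLACE_EXISTING);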
There is a way, though it involves rewriting the whole file (but no temporary file). As others mentioned, no file system supports prepending content to a file. Here is some sample code that uses a RandomAccessFile to write and read content while keeping some content buffered in memory:
public static void main(final String args[]) throws Exception {
    File f = File.createTempFile(Main.class.getName(), "tmp");
    f.deleteOnExit();
    System.out.println(f.getPath());
    // put some dummy content into our file
    BufferedWriter w = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(f)));
    for (int i = 0; i < 1000; i++) {
        w.write(UUID.randomUUID().toString());
        w.write('\n');
    }
    w.flush();
    w.close();
    // prepend "some uuids" to our file
    int bufLength = 4096;
    byte[] appendBuf = "some uuids\n".getBytes();
    byte[] writeBuf = appendBuf;
    byte[] readBuf = new byte[bufLength];
    int writeBytes = writeBuf.length;
    RandomAccessFile rw = new RandomAccessFile(f, "rw");
    int read = 0;
    int write = 0;
    while (true) {
        // seek to read position and read content into read buffer
        rw.seek(read);
        int bytesRead = rw.read(readBuf, 0, readBuf.length);
        // seek to write position and write content from write buffer
        rw.seek(write);
        rw.write(writeBuf, 0, writeBytes);
        // no bytes read - end of file reached
        if (bytesRead < 0) {
            break;
        }
        // update seek positions for write and read
        read += bytesRead;
        write += writeBytes;
        writeBytes = bytesRead;
        // reuse buffer, create new one to replace (short) append buf
        byte[] nextWrite = writeBuf == appendBuf ? new byte[bufLength] : writeBuf;
        writeBuf = readBuf;
        readBuf = nextWrite;
    }
    rw.close();
    // now show the content of our file
    BufferedReader reader = new BufferedReader(new InputStreamReader(new FileInputStream(f)));
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
}
You could store the file content in a String and prepend the desired line using a StringBuilder object: put the desired line first, then append the file-content String.
No extra temporary file needed.
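For example (a sketch; fine for small files, since the whole content is held in memory, and "data.txt" is an illustrative name):
String content = new String(Files.readAllBytes(Paths.get("data.txt"))); // Java 7+
StringBuilder sb = new StringBuilder();
sb.append("the line to prepend").append(System.lineSeparator()).append(content);
Files.write(Paths.get("data.txt"), sb.toString().getBytes());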
No. There are no "intra-file shift" operations, only read and write of discrete sizes.
It would be possible to do so by reading a chunk of the file of equal length to what you want to prepend, writing the new content in place of it, reading the later chunk and replacing it with what you read before, and so on, rippling down to the end of the file.
However, don't do that, because if anything stops (out-of-memory, power outage, rogue thread calling System.exit) in the middle of that process, data will be lost. Use the temporary file instead.
private static void prependText(File fileName) {
    FileOutputStream fileOutputStream = null;
    BufferedReader br = null;
    FileReader fr = null;
    String newFileName = fileName.getAbsolutePath() + "#";
    try {
        fileOutputStream = new FileOutputStream(newFileName);
        // write the prepended text first...
        fileOutputStream.write("preappendTextDataHere".getBytes());
        fr = new FileReader(fileName);
        br = new BufferedReader(fr);
        String sCurrentLine;
        // ...then copy the original file line by line
        while ((sCurrentLine = br.readLine()) != null) {
            fileOutputStream.write(("\n" + sCurrentLine).getBytes());
        }
        fileOutputStream.flush();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (fileOutputStream != null)
                fileOutputStream.close();
            if (br != null)
                br.close();
            if (fr != null)
                fr.close();
            // rename the "#" file over the original
            new File(newFileName).renameTo(new File(newFileName.replace("#", "")));
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}