I am reading bytes from a socket and then writing them to a JSON file using JsonGenerator. The problem is that JsonGenerator overwrites the file every time a stream is received from the socket. How do I make it append the subsequent streams instead of overwriting?
JsonFactory factory = new JsonFactory();
JsonGenerator generator = factory.createGenerator(
        new File("transactions.json"), JsonEncoding.UTF8);
try {
    while ((bytesRead = in.read(bytes)) != -1) { // -1 indicates EOF
        output = new String(bytes, "UTF-8");
        String length = output.substring(0, 4).trim();
        String mti = output.substring(4, 8).trim();
        String resp = "000";
        String newoutput = "";
        String bitmap = output.substring(8, 24);
        String stan = output.substring(24, 30);
        String date = output.substring(30, 44);
        String function_code = output.substring(44, 47);
        mti = "1814";
        // output to file
        generator.writeStartObject();
        generator.writeStringField("MTI", mti);
        generator.writeStringField("length", length);
        generator.writeStringField("stan", stan);
        generator.writeStringField("date", date);
        generator.writeStringField("Function Code", function_code);
        generator.writeEndObject();
    }
} catch (Exception e) {
    System.out.println("Exceptions " + e);
} finally {
    generator.close();
}
Also, when I declare the generator outside the while loop and close it outside the loop, for some reason the data is not written to the file, so I am assuming the generator buffers the output and only writes it to the file when you close it.
I might be missing something in your question, but the overwriting reason that jumps out to me is that you aren't specifying that the file should be appended to. Most Java APIs (including Jackson) default to overwriting instead of appending. The simple solution to this is just to use:
// the second parameter specifies whether the file should be appended
try (OutputStream fos = new FileOutputStream(new File("transactions.json"), true)) {
    // pass the FileOutputStream to the generator instead
    JsonGenerator generator = factory.createGenerator(fos, JsonEncoding.UTF8);
}
I would leave it at that for my answer, but I would be remiss if I didn't point out that if you are reading from multiple sockets concurrently, then you are probably going to end up with JSON data written interleaved.
I suggest wrapping the method in a synchronized block of some sort to prevent this and make it thread-safe.
Below I have an example of how I would re-write this functionality.
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

/**
 * A thread-safe class that will append JSON transaction data to a file.
 */
public class TransactionWriter {

    private static final JsonFactory jsonFactory = new JsonFactory();

    /**
     * Helper method that will read a number of UTF8 characters from an input stream and return them in a string.
     *
     * @param inputStream stream to read from
     * @param charsToRead number of characters to read
     * @return string of read characters
     * @throws IOException when unable to read enough characters from the stream
     */
    private static String readUtf8Chars(InputStream inputStream, int charsToRead) throws IOException {
        // since we know this is UTF8 up front, we can assume one byte per char
        byte[] buffer = new byte[charsToRead];
        // fill the buffer
        int readBytes = inputStream.read(buffer);
        // check that the buffer was actually filled
        if (readBytes < charsToRead)
            throw new IOException("fewer bytes available to read than expected: " + readBytes + " instead of " + charsToRead);
        // create a string from the buffer
        return new String(buffer, StandardCharsets.UTF_8);
    }

    private final File file;
    private final Object writeLock = new Object();

    /**
     * Constructs a new instance for an output file.
     *
     * @param file file to append to
     */
    public TransactionWriter(File file) {
        this.file = file;
    }

    /**
     * Reads a transaction from the input stream and appends a JSON representation to this instance's output file.
     *
     * @param inputStream stream to read from; will be closed after this method returns
     * @throws IOException when reading or writing failed
     */
    public void write(InputStream inputStream) throws IOException {
        // since we have multiple threads appending to the same file, synchronize to prevent concurrency issues
        synchronized (writeLock) {
            // open the output stream to append to the file
            try (FileOutputStream outputStream = new FileOutputStream(file, true)) {
                // create the generator for the output stream
                JsonGenerator generator = jsonFactory.createGenerator(outputStream, JsonEncoding.UTF8);
                // write the data to the generator
                generator.writeStartObject();
                generator.writeStringField("length", readUtf8Chars(inputStream, 4).trim());
                generator.writeStringField("MTI", readUtf8Chars(inputStream, 4).trim());
                String bitmap = readUtf8Chars(inputStream, 16); // read past the bitmap field
                generator.writeStringField("stan", readUtf8Chars(inputStream, 8));
                generator.writeStringField("date", readUtf8Chars(inputStream, 14));
                generator.writeStringField("Function Code", readUtf8Chars(inputStream, 3));
                generator.writeEndObject();
                // close the generator so buffered output is flushed to the file
                generator.close();
            } finally {
                // output stream is closed in try-with-resources, but also close the input stream
                inputStream.close();
            }
        }
    }
}
To be clear, I have not tested this code at all. I simply know that it compiles on Java 7 language level.
Declare the generator outside the loop (before the loop), and wrap the loop in a try-catch statement. There are two options:
You should close the generator (after the loop) in a finally block, or use it in a try-with-resources way if you use Java 1.7.
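A minimal sketch of that structure, assuming Java 7 try-with-resources; factory, in, and the byte buffer are the question's variables, and the FileOutputStream is opened in append mode as the other answer suggests:
try (JsonGenerator generator = factory.createGenerator(
        new FileOutputStream(new File("transactions.json"), true), JsonEncoding.UTF8)) {
    int bytesRead;
    byte[] bytes = new byte[1024];
    while ((bytesRead = in.read(bytes)) != -1) {
        // ... parse the fields and write one JSON object per message ...
    }
} // close() runs here automatically and flushes any buffered output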
Additionally, do you know that you should ensure that you read the whole message? The code you present could read half a message and try to process it, which will probably result in an exception. You should have some protocol that reads the messages off the socket's input stream and processes only whole messages, not half-loaded chunks. – Krzysztof Cichocki
I have a method which accepts an InputStream (of binary data) and serializes it to XML. In order to do so, it wraps the stream with a base64 encoder and a Reader to convert it to character data. However, since the InputStream is passed in as a parameter, I would consider it a harmful side effect to close the stream, and the contract for Reader.close() says it would do just that. If I don't close the reader, the compiler warns me that I have a
Resource leak: reader is never closed
So, I can add a @SuppressWarnings("resource") to the reader declaration, but is that the right thing to do? Am I missing something?
Here is the actual code:
/**
 * Writes base64 encoded text read from the binary stream.
 *
 * @param binaryStream
 *            The binary stream to write from
 * @return <code>this</code> XmlWriter (for chaining)
 * @throws IOException
 */
public XmlWriter binary( InputStream binaryStream ) throws IOException {
    Reader reader = new InputStreamReader(
            new Base64InputStream( binaryStream, true, base64LineLength, base64LineSeparator.getBytes( charset ) ) );
    int bufferSize = 2048;
    int charsRead;
    char[] buffer = new char[bufferSize];
    while ( (charsRead = reader.read( buffer, 0, bufferSize )) >= 0 ) {
        writer.write( buffer, 0, charsRead );
    }
    return this;
}
If you are a happy Java 7 user, try this:
try (InputStream binaryStream = /* ... */) {
    xmlWriter.binary(binaryStream);
}
and stream is closed for you. If you can't use Java 7, I agree that it's not the responsibility of binary() method to close() the stream. Just ignore the warning and don't let tools drive your design. It's fine.
As a last resort you can write a lightweight Reader wrapper that ignores close(), but I don't advise it, as it makes following the program flow harder.
Also let Apache Commons IO help you with IOUtils.copy():
public XmlWriter binary( InputStream binaryStream ) throws IOException {
    Reader reader = new InputStreamReader(
            new Base64InputStream( binaryStream, true, base64LineLength, base64LineSeparator.getBytes( charset ) ) );
    IOUtils.copy(reader, writer);
    return this;
}
This is perhaps a "feature" in the way Base64InputStream works: even though you specify the length to read, closing it will close the underlying stream, even when you clearly intended not to read the whole stream.
You could wrap the binaryStream in an InputStream that ignores close(), or you could suppress the warning as you have.
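For completeness, a minimal sketch of such a close-ignoring wrapper (commons-io ships a ready-made equivalent, CloseShieldInputStream):
import java.io.FilterInputStream;
import java.io.InputStream;

class NonClosingInputStream extends FilterInputStream {
    NonClosingInputStream(InputStream in) {
        super(in);
    }

    @Override
    public void close() {
        // deliberately do nothing: the caller owns the underlying stream
    }
}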
I have a Java EE application where I use a servlet to print a log file created with log4j. When reading log files you are usually looking for the last log lines, so the servlet would be much more useful if it printed the log file in reverse order. My actual code is:
response.setContentType("text/plain");
PrintWriter out = response.getWriter();
try {
    FileReader logReader = new FileReader("logfile.log");
    try {
        BufferedReader buffer = new BufferedReader(logReader);
        for (String line = buffer.readLine(); line != null; line = buffer.readLine()) {
            out.println(line);
        }
    } finally {
        logReader.close();
    }
} finally {
    out.close();
}
The implementations I've found on the internet involve using a StringBuffer and loading the whole file before printing. Isn't there a lightweight way of seeking to the end of the file and reading the content back to the start of the file?
[EDIT]
By request, I am prepending this answer with the sentiment of a later comment: If you need this behavior frequently, a "more appropriate" solution is probably to move your logs from text files to database tables with DBAppender (part of log4j 2). Then you could simply query for latest entries.
[/EDIT]
I would probably approach this slightly differently than the answers listed.
(1) Create a subclass of Writer that writes the encoded bytes of each character in reverse order:
public class ReverseOutputStreamWriter extends Writer {

    private OutputStream out;
    private Charset encoding;

    public ReverseOutputStreamWriter(OutputStream out, Charset encoding) {
        this.out = out;
        this.encoding = encoding;
    }

    public void write(int ch) throws IOException {
        byte[] buffer = this.encoding.encode(String.valueOf((char) ch)).array();
        // write the bytes in reverse order to this.out
    }

    // other overloaded methods
}
(2) Create a subclass of log4j WriterAppender whose createWriter method would be overridden to create an instance of ReverseOutputStreamWriter.
(3) Create a subclass of log4j Layout whose format method returns the log string in reverse character order:
public class ReversePatternLayout extends PatternLayout {

    // constructors

    public String format(LoggingEvent event) {
        return new StringBuilder(super.format(event)).reverse().toString();
    }
}
(4) Modify my logging configuration file to send log messages to both the "normal" log file and a "reverse" log file. The "reverse" log file would contain the same log messages as the "normal" log file, but each message would be written backwards. (Note that the encoding of the "reverse" log file would not necessarily conform to UTF-8, or even any character encoding.)
(5) Create a subclass of InputStream that wraps an instance of RandomAccessFile in order to read the bytes of a file in reverse order:
public class ReverseFileInputStream extends InputStream {

    private RandomAccessFile in;
    private byte[] buffer;
    // The index of the next byte to read.
    private int bufferIndex;

    public ReverseFileInputStream(File file) throws IOException {
        this.in = new RandomAccessFile(file, "r");
        this.buffer = new byte[4096];
        this.bufferIndex = this.buffer.length;
        this.in.seek(file.length());
    }

    public void populateBuffer() throws IOException {
        // record the old position
        // seek to a new, previous position
        // read from the new position to the old position into the buffer
        // reverse the buffer
    }

    public int read() throws IOException {
        if (this.bufferIndex == this.buffer.length) {
            populateBuffer();
            if (this.bufferIndex == this.buffer.length) {
                return -1;
            }
        }
        return this.buffer[this.bufferIndex++] & 0xFF; // mask into the 0-255 range required by InputStream
    }

    // other overridden methods
}
Now if I want to read the entries of the "normal" log file in reverse order, I just need to create an instance of ReverseFileInputStream, giving it the "reverse" log file.
This is an old question. I also wanted to do the same thing, and after some searching found there is a class in Apache commons-io to achieve this:
org.apache.commons.io.input.ReversedLinesFileReader
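A minimal usage sketch, assuming a commons-io version that has the (File, Charset) constructor, and using the servlet's PrintWriter out from the question:
try (ReversedLinesFileReader reader = new ReversedLinesFileReader(
        new File("logfile.log"), StandardCharsets.UTF_8)) {
    String line;
    while ((line = reader.readLine()) != null) {
        out.println(line); // newest lines first
    }
}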
I think a good choice for this would be using the RandomAccessFile class. There is some sample code for back-reading using this class on this page. Reading bytes this way is easy, however reading strings might be a bit more challenging.
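A minimal sketch of the byte-wise idea, with the string-decoding part left out:
try (RandomAccessFile raf = new RandomAccessFile("logfile.log", "r")) {
    for (long pos = raf.length() - 1; pos >= 0; pos--) {
        raf.seek(pos);
        int b = raf.read(); // one byte per seek: simple, but slow without extra buffering
        // accumulate bytes until a '\n' marks a line boundary, then decode them in reverse
    }
}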
If you are in a hurry and want the simplest solution without worrying too much about performance, I would give a try to using an external process to do the dirty job (given that you are running your app on a Un*x server, as any decent person would do XD):
new BufferedReader(new InputStreamReader(
        new ProcessBuilder("/bin/sh", "-c", "tail -n 50 yourlogfile.txt | tac")
                .start().getInputStream()))
A simpler alternative, because you say that you're creating a servlet to do this, is to use a LinkedList to hold the last N lines (where N might be a servlet parameter). When the list size exceeds N, you call removeFirst().
From a user experience perspective, this is probably the best solution. As you note, the most recent lines are the most important. Not being overwhelmed with information is also very important.
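A minimal sketch of that approach; N is hard-coded here for brevity (in the servlet it would come from a parameter), and out is the servlet's writer from the question:
int n = 50;
LinkedList<String> lastLines = new LinkedList<String>();
try (BufferedReader reader = new BufferedReader(new FileReader("logfile.log"))) {
    String line;
    while ((line = reader.readLine()) != null) {
        lastLines.add(line);
        if (lastLines.size() > n) {
            lastLines.removeFirst(); // keep only the newest N lines
        }
    }
}
// print newest first
for (Iterator<String> it = lastLines.descendingIterator(); it.hasNext(); ) {
    out.println(it.next());
}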
Good question. I'm not aware of any common implementations of this. It's not trivial to do properly either, so be careful what you choose. It should deal with character set encoding and detection of different line break methods. Here's the implementation I have so far that works with ASCII and UTF-8 encoded files, including a test case for UTF-8. It does not work with UTF-16LE or UTF-16BE encoded files.
import java.io.BufferedReader;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.RandomAccessFile;
import java.io.Reader;
import java.io.UnsupportedEncodingException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import junit.framework.TestCase;

public class ReverseLineReader {

    private static final int BUFFER_SIZE = 8192;

    private final FileChannel channel;
    private final String encoding;
    private long filePos;
    private ByteBuffer buf;
    private int bufPos;
    private byte lastLineBreak = '\n';
    private ByteArrayOutputStream baos = new ByteArrayOutputStream();

    public ReverseLineReader(File file, String encoding) throws IOException {
        RandomAccessFile raf = new RandomAccessFile(file, "r");
        channel = raf.getChannel();
        filePos = raf.length();
        this.encoding = encoding;
    }

    public String readLine() throws IOException {
        while (true) {
            if (bufPos < 0) {
                if (filePos == 0) {
                    if (baos == null) {
                        return null;
                    }
                    String line = bufToString();
                    baos = null;
                    return line;
                }

                long start = Math.max(filePos - BUFFER_SIZE, 0);
                long end = filePos;
                long len = end - start;

                buf = channel.map(FileChannel.MapMode.READ_ONLY, start, len);
                bufPos = (int) len;
                filePos = start;
            }

            while (bufPos-- > 0) {
                byte c = buf.get(bufPos);
                if (c == '\r' || c == '\n') {
                    if (c != lastLineBreak) {
                        lastLineBreak = c;
                        continue;
                    }
                    lastLineBreak = c;
                    return bufToString();
                }
                baos.write(c);
            }
        }
    }

    private String bufToString() throws UnsupportedEncodingException {
        if (baos.size() == 0) {
            return "";
        }

        byte[] bytes = baos.toByteArray();
        for (int i = 0; i < bytes.length / 2; i++) {
            byte t = bytes[i];
            bytes[i] = bytes[bytes.length - i - 1];
            bytes[bytes.length - i - 1] = t;
        }

        baos.reset();

        return new String(bytes, encoding);
    }

    public static void main(String[] args) throws IOException {
        File file = new File("my.log");
        ReverseLineReader reader = new ReverseLineReader(file, "UTF-8");
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    }

    public static class ReverseLineReaderTest extends TestCase {
        public void test() throws IOException {
            File file = new File("utf8test.log");
            String encoding = "UTF-8";

            FileInputStream fileIn = new FileInputStream(file);
            Reader fileReader = new InputStreamReader(fileIn, encoding);
            BufferedReader bufReader = new BufferedReader(fileReader);
            List<String> lines = new ArrayList<String>();
            String line;
            while ((line = bufReader.readLine()) != null) {
                lines.add(line);
            }
            Collections.reverse(lines);

            ReverseLineReader reader = new ReverseLineReader(file, encoding);
            int pos = 0;
            while ((line = reader.readLine()) != null) {
                assertEquals(lines.get(pos++), line);
            }
            assertEquals(lines.size(), pos);
        }
    }
}
You can use RandomAccessFile to implement this, for example:
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

import com.google.common.io.LineProcessor;

public class FileUtils {

    /**
     * Reads a text file (UTF-8) backwards; lines are separated by \r\n.
     *
     * @param <T>
     * @param file
     * @param step step size used when seeking backwards
     * @param lineprocessor
     * @throws IOException
     */
    public static <T> T backWardsRead(File file, int step,
            LineProcessor<T> lineprocessor) throws IOException {
        RandomAccessFile rf = new RandomAccessFile(file, "r");
        long fileLen = rf.length();
        long pos = fileLen - step;
        // scan backwards for the start of each line, marked by \r
        while (true) {
            if (pos < 0) {
                // process the first line
                rf.seek(0);
                lineprocessor.processLine(rf.readLine());
                return lineprocessor.getResult();
            }
            rf.seek(pos);
            char c = (char) rf.readByte();
            while (c != '\r') {
                c = (char) rf.readByte();
            }
            rf.readByte(); // read '\n'
            pos = rf.getFilePointer();
            if (!lineprocessor.processLine(rf.readLine())) {
                return lineprocessor.getResult();
            }
            pos -= step;
        }
    }
}

Usage:

FileUtils.backWardsRead(new File("H:/usersfavs.csv"), 40,
        new LineProcessor<Void>() {
            // TODO: implement the LineProcessor methods
            .......
        });
The simplest solution is to read through the file in forward order, using an ArrayList<Long> to hold the byte offset of each log record. You'll need to use something like Jakarta Commons CountingInputStream to retrieve the position of each record, and will need to carefully organize your buffers to ensure that it returns the proper values:
FileInputStream fis = // .. logfile
BufferedInputStream bis = new BufferedInputStream(fis);
CountingInputStream cis = new CountingInputStream(bis);
InputStreamReader isr = new InputStreamReader(cis, "UTF-8");
And you probably won't be able to use a BufferedReader, because it will attempt to read-ahead and throw off the count (but reading a character at a time won't be a performance problem, because you're buffering lower in the stack).
To write the file, you iterate the list backwards and use a RandomAccessFile. There is a bit of a trick: to properly decode the bytes (assuming a multi-byte encoding), you will need to read the bytes corresponding to an entry, and then apply a decoding to it. The list, however, will give you the start and end position of the bytes.
One big benefit to this approach, versus simply printing the lines in reverse order, is that you won't damage multi-line log messages (such as exceptions).
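A minimal sketch of the write-out pass described above, assuming offsets holds the byte offset of each record's start (recorded during the forward pass) and out is the servlet's writer:
try (RandomAccessFile raf = new RandomAccessFile("logfile.log", "r")) {
    long end = raf.length();
    for (int i = offsets.size() - 1; i >= 0; i--) {
        long start = offsets.get(i);
        byte[] record = new byte[(int) (end - start)];
        raf.seek(start);
        raf.readFully(record);
        out.print(new String(record, StandardCharsets.UTF_8)); // decode the record's bytes as a whole
        end = start;
    }
}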
import java.io.File;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

/**
 * Inside of C:\\temp\\vaquar.txt we have the following content:
 * vaquar khan is working into Citi He is good good programmer programmer trust me
 *
 * @author vaquar.khan@gmail.com
 */
public class ReadFileAndDisplayResultsinReverse {

    public static void main(String[] args) {
        try {
            // read data from file
            Object[] wordList = ReadFile();
            System.out.println("File data=" + wordList);
            //
            Set<String> uniquWordList = null;
            for (Object text : wordList) {
                System.out.println((String) text);
                List<String> tokens = Arrays.asList(text.toString().split("\\s+"));
                System.out.println("tokens" + tokens);
                uniquWordList = new HashSet<String>(tokens);
                // If there are multiple lines, handle them inside the same loop
            }
            System.out.println("uniquWordList" + uniquWordList);
            Comparator<String> wordComp = new Comparator<String>() {
                @Override
                public int compare(String o1, String o2) {
                    if (o1 == null && o2 == null) return 0;
                    if (o1 == null) return o2.length() - 0;
                    if (o2 == null) return o1.length() - 0;
                    //
                    return o2.length() - o1.length();
                }
            };
            List<String> fs = new ArrayList<String>(uniquWordList);
            Collections.sort(fs, wordComp);
            System.out.println("uniquWordList" + fs);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }

    static Object[] ReadFile() throws IOException {
        List<String> list = Files.readAllLines(new File("C:\\temp\\vaquar.txt").toPath(), Charset.defaultCharset());
        return list.toArray();
    }
}
Output:
[Vaquar khan is working into Citi He is good good programmer programmer trust me
tokens[vaquar, khan, is, working, into, Citi, He, is, good, good, programmer, programmer, trust, me]
uniquWordList[trust, vaquar, programmer, is, good, into, khan, me, working, Citi, He]
uniquWordList[programmer, working, vaquar, trust, good, into, khan, Citi, is, me, He]
If you want to sort A to Z, then write one more comparator.
Concise solution using Java 7 AutoCloseable and Java 8 streams:
try (Stream<String> logStream = Files.lines(Paths.get("C:\\logfile.log"))) {
    logStream
            .sorted(Comparator.reverseOrder())
            .limit(10) // last 10 lines
            .forEach(System.out::println);
}
Big drawback: this only works when the lines are strictly in natural sort order, e.g. log files whose lines start with timestamps and which contain no multi-line entries such as exception stack traces.
To get the content of a txt file I usually use a scanner and iterate over each line to get the content:
Scanner sc = new Scanner(new File("file.txt"));
while (sc.hasNextLine()) {
    String str = sc.nextLine();
}
Does the Java API provide a way to get the content with one line of code, like:
String content = FileUtils.readFileToString(new File("file.txt"))
Not the built-in API - but Guava does, amongst its other treasures. (It's a fabulous library.)
String content = Files.toString(new File("file.txt"), Charsets.UTF_8);
There are similar methods for reading any Readable, or loading the entire contents of a binary file as a byte array, or reading a file into a list of strings, etc.
Note that this method is now deprecated. The new equivalent is:
String content = Files.asCharSource(new File("file.txt"), Charsets.UTF_8).read();
With Java 7 there is an API along those lines.
Files.readAllLines(Path path, Charset cs)
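A minimal usage sketch; on Java 7 the whole-file-as-string variant goes through readAllBytes:
List<String> lines = Files.readAllLines(Paths.get("file.txt"), StandardCharsets.UTF_8);
// or, for the whole file as one string:
String content = new String(Files.readAllBytes(Paths.get("file.txt")), StandardCharsets.UTF_8);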
commons-io has (note that the encoding parameter goes with an InputStream, not a Reader):
IOUtils.toString(new FileInputStream("file.txt"), "utf-8");
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadFileToString { // hypothetical wrapper class so the snippet compiles
    public static void main(String[] args) throws IOException {
        String content = Files.readString(Paths.get("foo"));
    }
}
From https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/nio/file/Files.html#readString(java.nio.file.Path)
You could use the FileReader class together with the BufferedReader to read the text file.
File fileToRead = new File("file.txt");

try ( FileReader fileStream = new FileReader( fileToRead );
        BufferedReader bufferedReader = new BufferedReader( fileStream ) ) {
    String line = null;

    while ( (line = bufferedReader.readLine()) != null ) {
        //do something with line
    }
} catch ( FileNotFoundException ex ) {
    //exception Handling
} catch ( IOException ex ) {
    //exception Handling
}
After a bit of testing, I find BufferedReader and Scanner both problematic under various circumstances (the former often fails to detect new lines and the latter often strips spaces, for instance, from a JSON string exported by the org.json library). There are other methods available, but the problem is that they are only supported after certain Java versions (which is bad for an Android developer, for example), and you might not want to use Guava or the Apache Commons library just for a single purpose like this. Hence, my solution is to read the whole file as bytes and convert that to a string. The code below is taken from one of my hobby projects:
/**
 * Get byte array from an InputStream most efficiently.
 * Taken from sun.misc.IOUtils
 * @param is InputStream
 * @param length Length of the buffer, -1 to read the whole stream
 * @param readAll Whether to read the whole stream
 * @return Desired byte array
 * @throws IOException If maximum capacity exceeded.
 */
public static byte[] readFully(InputStream is, int length, boolean readAll)
        throws IOException {
    byte[] output = {};
    if (length == -1) length = Integer.MAX_VALUE;
    int pos = 0;
    while (pos < length) {
        int bytesToRead;
        if (pos >= output.length) {
            bytesToRead = Math.min(length - pos, output.length + 1024);
            if (output.length < pos + bytesToRead) {
                output = Arrays.copyOf(output, pos + bytesToRead);
            }
        } else {
            bytesToRead = output.length - pos;
        }
        int cc = is.read(output, pos, bytesToRead);
        if (cc < 0) {
            if (readAll && length != Integer.MAX_VALUE) {
                throw new EOFException("Detect premature EOF");
            } else {
                if (output.length != pos) {
                    output = Arrays.copyOf(output, pos);
                }
                break;
            }
        }
        pos += cc;
    }
    return output;
}

/**
 * Read the full content of a file.
 * @param file The file to be read
 * @param emptyValue Value returned if no content is found
 * @return File content as string
 */
@NonNull
public static String getFileContent(@NonNull File file, @NonNull String emptyValue) {
    if (file.isDirectory()) return emptyValue;
    try {
        return new String(readFully(new FileInputStream(file), -1, true), Charset.defaultCharset());
    } catch (IOException e) {
        e.printStackTrace();
        return emptyValue;
    }
}
You can simply use getFileContent(file, "") to read the content of a file.
My current situation is: I have to read a file and put the contents into an InputStream. Afterwards I need to place the contents of the InputStream into a byte array, which requires (as far as I know) the size of the InputStream. Any ideas?
As requested, I will show the input stream that I am creating from an uploaded file
InputStream uploadedStream = null;
FileItemFactory factory = new DiskFileItemFactory();
ServletFileUpload upload = new ServletFileUpload(factory);
java.util.List items = upload.parseRequest(request);
java.util.Iterator iter = items.iterator();

while (iter.hasNext()) {
    FileItem item = (FileItem) iter.next();
    if (!item.isFormField()) {
        uploadedStream = item.getInputStream();
        //CHANGE uploadedStreambyte = item.get()
    }
}
The request is an HttpServletRequest object; the FileItemFactory and ServletFileUpload are from the Apache Commons FileUpload package.
This is a REALLY old thread, but it was still the first thing to pop up when I googled the issue. So I just wanted to add this:
InputStream inputStream = conn.getInputStream();
int length = inputStream.available();
Worked for me. And MUCH simpler than the other answers here.
Warning: this solution does not provide reliable results regarding the total size of a stream. Excerpt from the JavaDoc:
Note that while some implementations of {@code InputStream} will return
the total number of bytes in the stream, many will not.
I would read into a ByteArrayOutputStream and then call toByteArray() to get the resultant byte array. You don't need to define the size in advance (although it's possibly an optimisation if you know it; in many cases you won't).
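A minimal sketch of that approach, assuming inputStream is the stream from the question:
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buffer = new byte[8192];
int n;
while ((n = inputStream.read(buffer)) != -1) {
    out.write(buffer, 0, n);
}
byte[] bytes = out.toByteArray(); // the size is simply bytes.length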
You can't determine the amount of data in a stream without reading it; you can, however, ask for the size of a file:
http://java.sun.com/javase/6/docs/api/java/io/File.html#length()
If that isn't possible, you can write the bytes you read from the input stream to a ByteArrayOutputStream which will grow as required.
I just wanted to add, Apache Commons IO has stream support utilities to perform the copy. (Btw, what do you mean by placing the file into an inputstream? Can you show us your code?)
Edit:
Okay, what do you want to do with the contents of the item?
There is an item.get() which returns the entire thing in a byte array.
Edit2
item.getSize() will return the uploaded file size.
For an InputStream:
org.apache.commons.io.IOUtils.toByteArray(inputStream).length
For an Optional<MultipartFile>:
Stream.of(multipartFile.get()).mapToLong(file -> file.getSize()).findFirst().getAsLong()
You can get the size of an InputStream using getBytes(inputStream) from Utils.java; check the following link:
Get Bytes from InputStream
The function below should work with any InputStream. As other answers have hinted, you can't reliably find the length of an InputStream without reading through it, but unlike other answers, you should not attempt to hold the entire stream in memory by reading into a ByteArrayOutputStream, nor is there any reason to. Instead of reading the stream, you should ideally rely on other API for stream sizes, for example getting the size of a file using the File API.
public static int length(InputStream inputStream, int chunkSize) throws IOException {
    byte[] buffer = new byte[chunkSize];
    int chunkBytesRead = 0;
    int length = 0;
    while ((chunkBytesRead = inputStream.read(buffer)) != -1) {
        length += chunkBytesRead;
    }
    return length;
}
Choose a reasonable value for chunkSize appropriate to the kind of InputStream. E.g. reading from disk it would not be efficient to have too small a value for chunkSize.
When explicitly dealing with a ByteArrayInputStream, then, contrary to some of the comments on this page, you can use the .available() function to get the size. You just have to do it before you start reading from it.
From the JavaDocs:
Returns the number of remaining bytes that can be read (or skipped
over) from this input stream. The value returned is count - pos, which
is the number of bytes remaining to be read from the input buffer.
https://docs.oracle.com/javase/7/docs/api/java/io/ByteArrayInputStream.html#available()
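A minimal illustration:
byte[] data = "hello".getBytes(StandardCharsets.UTF_8);
ByteArrayInputStream bais = new ByteArrayInputStream(data);
int size = bais.available(); // 5: equals data.length before any read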
If you need to stream the data to another object that doesn't allow you to directly determine the size (e.g. javax.imageio.ImageIO), then you can wrap your InputStream within a CountingInputStream (Apache Commons IO), and then read the size:
CountingInputStream countingInputStream = new CountingInputStream(inputStream);
// ... process the whole stream ...
int size = countingInputStream.getCount();
If you know that your InputStream is a FileInputStream or a ByteArrayInputStream, you can use a little reflection to get at the stream size without reading the entire contents. Here's an example method:
static long getInputLength(InputStream inputStream) {
    try {
        if (inputStream instanceof FilterInputStream) {
            FilterInputStream filtered = (FilterInputStream) inputStream;
            Field field = FilterInputStream.class.getDeclaredField("in");
            field.setAccessible(true);
            InputStream internal = (InputStream) field.get(filtered);
            return getInputLength(internal);
        } else if (inputStream instanceof ByteArrayInputStream) {
            ByteArrayInputStream wrapper = (ByteArrayInputStream) inputStream;
            Field field = ByteArrayInputStream.class.getDeclaredField("buf");
            field.setAccessible(true);
            byte[] buffer = (byte[]) field.get(wrapper);
            return buffer.length;
        } else if (inputStream instanceof FileInputStream) {
            FileInputStream fileStream = (FileInputStream) inputStream;
            return fileStream.getChannel().size();
        }
    } catch (NoSuchFieldException | IllegalAccessException | IOException exception) {
        // Ignore all errors and just return -1.
    }
    return -1;
}
This could be extended to support additional input streams, I am sure.
Add to your pom.xml:
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.5</version>
</dependency>
Use this to get the content length of an InputStream file:
IOUtils.toByteArray(file).length
Use this method; you just have to pass the InputStream.
public String readIt(InputStream is) throws IOException {
    if (is != null) {
        BufferedReader reader = new BufferedReader(new InputStreamReader(is, "utf-8"), 8);
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            sb.append(line).append("\n");
        }
        is.close();
        return sb.toString();
    }
    return "error: ";
}
InputStream connInputStream = null;
try {
    connInputStream = connection.getInputStream();
} catch (IOException e) {
    e.printStackTrace();
}
int size = connInputStream.available();
int available()
Returns an estimate of the number of bytes that can be read (or skipped over) from this input stream without blocking by the next invocation of a method for this input stream. The next invocation might be the same thread or another thread. A single read or skip of this many bytes will not block, but may read or skip fewer bytes.
InputStream - Android SDK | Android Developers
I have a log file which gets updated every second. I need to read the log file periodically, and once I do a read, I need to store the file pointer position at the end of the last line I read, so that in the next periodic read I can start from that point.
Currently, I am using a random access file in Java and using the getFilePointer() method to get the offset value and the seek() method to go to the offset position.
However, I have read in most articles, and even in the Javadoc recommendations, that BufferedReader should be used for efficient reading of a file. How can I achieve this (getting the file pointer and moving to the last line) using a BufferedReader, or is there any other efficient way to achieve this task?
A couple of ways that should work:
open the file using a FileInputStream, skip() the relevant number of bytes, then wrap the BufferedReader around the stream (via an InputStreamReader);
open the file (with either FileInputStream or RandomAccessFile), call getChannel() on the stream/RandomAccessFile to get an underlying FileChannel, call position() on the channel, then call Channels.newInputStream() to get an input stream from the channel, which you can pass to InputStreamReader -> BufferedReader.
I haven't honestly profiled these to see which is better performance-wise, but you should see which works better in your situation.
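A minimal sketch of the first option, assuming offset is the byte position saved from the previous read:
FileInputStream fis = new FileInputStream("logfile.log");
long toSkip = offset;
while (toSkip > 0) {
    long skipped = fis.skip(toSkip); // skip() may skip fewer bytes than requested
    if (skipped <= 0) break;
    toSkip -= skipped;
}
BufferedReader reader = new BufferedReader(new InputStreamReader(fis, "UTF-8"));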
The problem with RandomAccessFile is essentially that its readLine() method is very inefficient. If it's convenient for you to read from the RAF and do your own buffering to split the lines, then there's nothing wrong with RAF per se-- just that its readLine() is poorly implemented
Neil Coffey's solution is good if you are reading fixed-length files. However, for files that have variable length (data keeps coming in) there are some problems with using a BufferedReader directly on a FileInputStream or FileChannel input stream via an InputStreamReader. For example, consider these cases:
1)
You want to read data from some offset to the current file length. So you use a BufferedReader on the FileInputStream/FileChannel (via an InputStreamReader) and use its readLine method. But while you are busy reading the data, say some data gets added, which causes the BufferedReader's readLine to read more data than you expected (up to the previous file length).
2)
You finish the readLine work, but when you try to read the current file length/channel position, some data has suddenly been added, which causes the current file length/channel position to increase, yet you have already read less data than this.
In both of the above cases it is difficult to know the actual data you have read (you cannot just use the length of the data read with readLine, because it skips some chars like the carriage return).
So it is better to read the data in buffered bytes and use a BufferedReader wrapper around this. I wrote some methods like this:
/** Read data from offset to length bytes in RandomAccessFile using BufferedReader
 * @param offset
 * @param length
 * @param accessFile
 * @throws IOException
 */
public static void readBufferedLines(long offset, long length, RandomAccessFile accessFile) throws IOException {
    if (accessFile == null) return;
    int bufferSize = BYTE_BUFFER_SIZE; // constant, say 4096
    if (offset < length && offset >= 0) {
        int index = 1;
        long curPosition = offset;
        /*
         * iterate (length-from)/BYTE_BUFFER_SIZE times to read into buffer no matter where new line occurs
         */
        while ((curPosition + (index * BYTE_BUFFER_SIZE)) < length) {
            accessFile.seek(offset); // seek to last parsed data rather than last data read in to buffer
            byte[] buf = new byte[bufferSize];
            int read = accessFile.read(buf, 0, bufferSize);
            index++; // Increment whether or not read successful
            if (read > 0) {
                int lastnewLine = getLastLine(read, buf);
                if (lastnewLine <= 0) { // no new line found in the buffer, reset buffer size and continue
                    bufferSize = bufferSize + read;
                    continue;
                } else {
                    bufferSize = BYTE_BUFFER_SIZE;
                }
                readLine(buf, 0, lastnewLine); // read the lines from buffer and parse the line
                offset = offset + lastnewLine; // update the last data read
            }
        }

        // Read last chunk. The last chunk size in worst case is the total file when no newline occurs
        if (offset < length) {
            accessFile.seek(offset);
            byte[] buf = new byte[(int) (length - offset)];
            int read = accessFile.read(buf, 0, buf.length);
            if (read > 0) {
                readLine(buf, 0, read);
                offset = offset + read; // update the last data read
            }
        }
    }
}

private static void readLine(byte[] buf, int from, int lastnewLine) throws IOException {
    String readLine = "";
    BufferedReader reader = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(buf, from, lastnewLine)));
    while ((readLine = reader.readLine()) != null) {
        // do something with readLine
        System.out.println(readLine);
    }
    reader.close();
}

private static int getLastLine(int read, byte[] buf) {
    if (buf == null) return -1;
    if (read > buf.length) read = buf.length;
    while (read > 0 && !(buf[read - 1] == '\n' || buf[read - 1] == '\r')) read--;
    return read;
}

public static void main(String[] args) throws IOException {
    RandomAccessFile accessFile = new RandomAccessFile("C:/sri/test.log", "r");
    readBufferedLines(0, accessFile.length(), accessFile);
    accessFile.close();
}
I had a similar problem, and I created this class to read lines from a BufferedReader and count how many bytes have been read so far using getBytes(). We assume the line separator is a single byte by default, and we re-instantiate the BufferedReader for seek() to work.
public class FileCounterIterator {

    public Long position() {
        return _position;
    }

    public Long fileSize() {
        return _fileSize;
    }

    public FileCounterIterator newlineLength(Long newNewlineLength) {
        this._newlineLength = newNewlineLength;
        return this;
    }

    private Long _fileSize = 0L;
    private Long _position = 0L;
    private Long _newlineLength = 1L;
    private RandomAccessFile fp;
    private BufferedReader itr;

    public FileCounterIterator(String filename) throws IOException {
        fp = new RandomAccessFile(filename, "r");
        _fileSize = fp.length();
        this.seek(0L);
    }

    public FileCounterIterator seek(Long newPosition) throws IOException {
        this.fp.seek(newPosition);
        this._position = newPosition;
        itr = new BufferedReader(new InputStreamReader(new FileInputStream(fp.getFD())));
        return this;
    }

    public Boolean hasNext() throws IOException {
        return this._position < this._fileSize;
    }

    public String readLine() throws IOException {
        String nextLine = itr.readLine();
        this._position += nextLine.getBytes().length + _newlineLength;
        return nextLine;
    }
}