Stream closed error when getting charset - java

I'm having issues with the following code:
try (
    InputStream is = new FileInputStream(file);
    BufferedReader br = new BufferedReader(
        new InputStreamReader(is,
            Charset.forName(SidFileUtils.charsetDetection(is))
        )
    );
) {
    br.readLine();
    br.readLine();
    for (String line = br.readLine(); line != null; line = br.readLine()) {
        lines.add(line);
    }
} catch (ExceptionTechnique | IOException e) {
    LOG.error("Erreur lors de la lecture du fichier " + file.getName(), e);
}
This part of the code, Charset.forName(...), gives me a Stream Closed error. I think it's because I'm using the InputStream twice and it has already been consumed, but I'm not sure.
Can you help me understand what is wrong with this code, please?
Thanks a lot in advance !

Yes: charsetDetection consumes the stream, so there is nothing left for the reader afterwards. Some streams can mark and reset the read position, when the specific InputStream supports it:
if (in.markSupported()) {
    final int maxBytesNeededForDetection = 8192;
    in.mark(maxBytesNeededForDetection);
    // ... do the detection
    in.reset();
} else {
    throw new IllegalStateException("mark/reset not supported");
}
BufferedInputStream does support mark/reset, but only up to its buffer size; beyond that an IOException("Resetting to invalid mark") is raised.
One should therefore specify a sufficiently large buffer size in its constructor.
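For example, a sketch of that variant, assuming SidFileUtils.charsetDetection neither closes the stream nor reads more than the chosen buffer size (file, lines, ExceptionTechnique and LOG as in the question):
// Sketch: mark before the detection, reset afterwards, keep reading the same stream.
int maxBytesNeededForDetection = 8192;
try (BufferedInputStream is = new BufferedInputStream(new FileInputStream(file),
        maxBytesNeededForDetection)) {
    is.mark(maxBytesNeededForDetection);
    String charsetName = SidFileUtils.charsetDetection(is); // assumed to read only a few KB
    is.reset(); // rewind to the start of the file
    BufferedReader br = new BufferedReader(
            new InputStreamReader(is, Charset.forName(charsetName)));
    br.readLine(); // skip the two header lines as in the original code
    br.readLine();
    for (String line = br.readLine(); line != null; line = br.readLine()) {
        lines.add(line);
    }
} catch (ExceptionTechnique | IOException e) {
    LOG.error("Erreur lors de la lecture du fichier " + file.getName(), e);
}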
In this case, though, the detection itself apparently does not use mark/reset, which is understandable given that only some streams support the technique. An alternative that always works is to detect the charset in a first pass and then reopen the file:
Charset charset = null;
try (InputStream is = new FileInputStream(file)) {
    charset = Charset.forName(SidFileUtils.charsetDetection(is));
}
if (charset != null) {
    ...
}
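Put together, a two-pass version of the original code could look like this (just a sketch; SidFileUtils, ExceptionTechnique, lines and LOG as in the question):
Charset charset = null;
try (InputStream is = new FileInputStream(file)) {
    // first pass: only detect the charset
    charset = Charset.forName(SidFileUtils.charsetDetection(is));
} catch (ExceptionTechnique | IOException e) {
    LOG.error("Error detecting the charset of " + file.getName(), e);
}
if (charset != null) {
    // second pass: reopen the file with a fresh stream and the detected charset
    try (BufferedReader br = new BufferedReader(
            new InputStreamReader(new FileInputStream(file), charset))) {
        br.readLine(); // skip the two header lines as in the original code
        br.readLine();
        for (String line = br.readLine(); line != null; line = br.readLine()) {
            lines.add(line);
        }
    } catch (IOException e) {
        LOG.error("Error reading file " + file.getName(), e);
    }
}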

Related

Inline input stream processing in Java

I need some help with the problem below. I am working on a project where I need to deal with files.
I get an InputStream handle from the user, and before writing the data to disk I need to perform certain steps:
calculate the file digest
check that only 1 zip file is present, and unzip the data if it is zipped
dos2unix conversion
record length validation
encrypt and save the file to disk
I also need to break the flow if there is any exception in the process.
I tried to use a piped output and input stream, but the constraint is that Java recommends running them in 2 separate threads, and once I have read from the input stream I am not able to reuse it in the other processing steps. The files can be very big, so I cannot cache all the data in a buffer.
Please provide your suggestions, or tell me if there is any third-party lib I can use for this.
The biggest issue is that you'll need to peek ahead in the provided InputStream to decide if you received a zipfile or not.
private boolean isZipped(InputStream is) throws IOException {
    try {
        return new ZipInputStream(is).getNextEntry() != null;
    } catch (final ZipException ze) {
        return false;
    }
}
After this you need to reset the InputStream to the initial position before setting up a DigestInputStream.
Then read from a ZipInputStream, or from the DigestInputStream directly.
After you've done your processing, read the DigestInputStream to the end so you can obtain the digest.
The code below has been validated through a wrapping "CountingInputStream" that keeps track of the total number of bytes read from the provided FileInputStream.
final FileInputStream fis = new FileInputStream(filename);
final CountingInputStream countIs = new CountingInputStream(fis);
final boolean isZipped = isZipped(countIs);
// make sure we reset the inputstream before calculating the digest
fis.getChannel().position(0);
final DigestInputStream dis = new DigestInputStream(countIs, MessageDigest.getInstance("SHA-256"));

// decide which inputStream to use
InputStream is = null;
ZipInputStream zis = null;
if (isZipped) {
    zis = new ZipInputStream(dis);
    zis.getNextEntry();
    is = zis;
} else {
    is = dis;
}

final File tmpFile = File.createTempFile("Encrypted_", ".tmp");
final OutputStream os = new CipherOutputStream(new FileOutputStream(tmpFile), obtainCipher());
try {
    readValidateAndWriteRecords(is, os);
    failIf2ndZipEntryExists(zis);
} catch (final Exception e) {
    os.close();
    tmpFile.delete();
    throw e;
}

System.out.println("Digest: " + obtainDigest(dis));
dis.close();

System.out.println("\nValidating bytes read and calculated digest");
final DigestInputStream dis2 = new DigestInputStream(new CountingInputStream(new FileInputStream(filename)), MessageDigest.getInstance("SHA-256"));
System.out.println("Digest: " + obtainDigest(dis2));
dis2.close();
Not really relevant, but these are the helper methods:
private String obtainDigest(DigestInputStream dis) throws IOException {
    final byte[] buff = new byte[1024];
    // drain the remaining bytes so the digest covers the whole stream
    while (dis.read(buff) != -1) {
        // reading is all that is needed; it updates the digest
    }
    return DatatypeConverter.printBase64Binary(dis.getMessageDigest().digest());
}
private void readValidateAndWriteRecords(InputStream is, final OutputStream os) throws IOException {
    final BufferedReader br = new BufferedReader(new InputStreamReader(is));
    // dos2unix is done automatically by readLine
    for (String line = br.readLine(); line != null; line = br.readLine()) {
        // record length validation
        if (line.length() < 1) {
            throw new RuntimeException("RecordLengthValidationFailed");
        }
        os.write((line + "\n").getBytes());
    }
}
private void failIf2ndZipEntryExists(ZipInputStream zis) throws IOException {
    if (zis != null && zis.getNextEntry() != null) {
        throw new RuntimeException("Zip File contains multiple entries");
    }
}
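The CountingInputStream itself is not shown above; a minimal sketch of such a wrapper (my assumption of how it could look, not the original class) is:
class CountingInputStream extends FilterInputStream {
    private long count;

    CountingInputStream(InputStream in) {
        super(in);
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) count++;
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) count += n;
        return n;
    }

    @Override
    public void close() throws IOException {
        super.close();
        System.out.println("CountingInputStream closed. Total number of bytes read: " + count);
    }
}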
==> output:
Digest: jIisvDleAttKiPkyU/hDvbzzottAMn6n7inh4RKxPOc=
CountingInputStream closed. Total number of bytes read: 1100
Validating bytes read and calculated digest
Digest: jIisvDleAttKiPkyU/hDvbzzottAMn6n7inh4RKxPOc=
CountingInputStream closed. Total number of bytes read: 1072
Fun question, I may have gone overboard with my answer :)

Removing column names while reading content through Inputstream in Java

I have been trying to remove the column names that come along when I read the content of the response returned by an HTTP GET.
Initially I used HTTP GET to fetch the content, read it through an InputStream, and then wrote it to the local disk as a CSV file using a FileOutputStream:
InputStream read_content = result.getEntity().getContent();
FileOutputStream writ = new FileOutputStream(new File(path));
byte[] buff = new byte[4096];
int length;
while ((length = read_content.read(buff)) > 0) {
    writ.write(buff, 0, length);
}
Here result is the response I get from the HTTP GET. This works fine, but the problem is that the response also contains the column names, which I want to remove.
After some modification I am now using this code, but the output is not coming out right:
InputStream read_content = result.getEntity().getContent();
BufferedReader reader =
        new BufferedReader(new InputStreamReader(read_content));
FileWriter fstream = new FileWriter(path);
BufferedWriter out = new BufferedWriter(fstream);
reader.readLine();
while (reader.readLine() != null) {
    out.write(reader.read());
}
When I execute this modified code I get a garbage result. What am I doing wrong here, and how can I remove the table column names?
Your code should be something like this:
BufferedReader br = null;
BufferedWriter out = null;
try {
    InputStream is = new FileInputStream(new File("C:/Space/ConnTest/Test/input.txt"));
    br = new BufferedReader(new InputStreamReader(is));
    out = new BufferedWriter(new FileWriter(new File("C:/Space/ConnTest/Test/output.txt")));
    System.out.println("This is first line ---" + br.readLine());
    String str = "";
    while ((str = br.readLine()) != null) {
        out.write(str);
    }
    System.out.println("Success");
} catch (Exception e) {
    e.printStackTrace();
} finally {
    if (br != null) {
        br.close();
    }
    if (out != null) {
        out.close();
    }
}
Don't be confused by the whole code; I just replaced your out.write(reader.read()); with
while ((str = br.readLine()) != null) {
    out.write(str);
}
and I call br.readLine() once inside the System.out.println, so the header line gets skipped. After that I write the remaining lines with br.readLine().
If it's a line, and so is the rest of the content, use BufferedReader.readLine(), and skip the first line.
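For example, a sketch based on the variables from the question (result and path as above):
// Sketch: skip the column-name line, then copy the rest line by line.
try (BufferedReader reader = new BufferedReader(
             new InputStreamReader(result.getEntity().getContent()));
     BufferedWriter out = new BufferedWriter(new FileWriter(path))) {
    reader.readLine(); // discard the header line
    for (String line = reader.readLine(); line != null; line = reader.readLine()) {
        out.write(line);
        out.newLine();
    }
}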

read output of txt file has white spaces

I got the list of applications from a cmd command using /output:D:\list.txt product get name,version. However, when I try to read the list in Java, the output has white spaces after each letter.
SAMPLE:
from text file
links
images
lists
when read in java
l i n k s
i m a g e s
l i s t s
Is there a way to fix this problem?
I just used this code:
public void myreader() throws IOException {
    Path path = Paths.get("D:\\list.txt");
    Charset charset = Charset.forName("ISO-8859-1");
    try (BufferedReader reader = Files.newBufferedReader(path, charset)) {
        String line = null;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    }
}
This can be due to an encoding problem. Try using the UTF-16 character set:
BufferedReader reader = new BufferedReader(new InputStreamReader(new FileInputStream(path), "UTF-16"));
Have you tried the FileReader?
FileReader fileReader;
try {
    fileReader = new FileReader( "D:\\list.txt" );
    BufferedReader bufferedReader = new BufferedReader( fileReader );
    String line;
    while( ( line = bufferedReader.readLine() ) != null )
    {
        System.out.println( line );
    }
    fileReader.close();
} catch ( IOException except ) {
    System.err.println( except.getStackTrace()[0] );
}
I'm not sure where your problem is coming from, but you may use a FileReader for this kind of reading.
Looks like you are reading a UTF-16 encoded file.
Give a hint to your Reader: pass "UTF-16" instead of "ISO-8859-1".
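Applied to the myreader() method from the question, that would be (sketch):
public void myreader() throws IOException {
    Path path = Paths.get("D:\\list.txt");
    Charset charset = StandardCharsets.UTF_16; // the UTF-16 decoder honours the byte-order mark
    try (BufferedReader reader = Files.newBufferedReader(path, charset)) {
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    }
}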

java - BufferedReader readLine() gives null as end of file but no way to use that null

Is there a way to check whether a file was correctly written, I mean whether there is an EOF at the end?
I'm asking because I have a program that takes some files, merges them into one very big file, and then uses it to compute statistics.
The point is that the second part never ends, because it doesn't recognize the end of the file.
The relevant parts of the code are the following
(please do not ask for the whole code, as I cannot post it for important reasons):
FileWriter file = null;
PrintWriter pw = null;
String pathToRead = null;
InputStreamReader isr = null;
BufferedReader br = null;
FileInputStream fis = null;
TestJFileChooser d = new TestJFileChooser();
int c = 1;
String line = null;
....
// here I select the files
selectedFile = new File(pathToRead);

// here I get one buffered reader for each file got with listFiles()
for (File file_sel : app) {
    if (file_sel.getName().startsWith("gtou")) {
        System.out.println(file_sel.getName());
        fis = null;
        try {
            fis = new FileInputStream(file_sel);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        isr = new InputStreamReader(fis);
        br = new BufferedReader(isr);
        map.put(i, br);
        num_file++;
        i++;
    }
}

// then I select the output file and open a print writer for it
fileToWrite = new File(pathToRead);
try {
    file = new FileWriter(fileToWrite);
    pw = new PrintWriter(file);
} catch (IOException e1) {
    e1.printStackTrace();
}

// merging part
....
line = br.readLine();
while (line != null) {
    System.out.println("line is:" + line);
    ....
    line = br.readLine();
}
// end of merging ....

pw.flush();
pw.close();
try {
    if (file != null) file.close();
    fis.close();
    isr.close();
    br.close();
    for (int fi = 0; fi < num_file; fi++) {
        br2 = map.get(fi);
        br2.close();
    }
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
so.kill();
Runtime r = Runtime.getRuntime();
r.gc();

// this is a popup that comes out
GlitchSquad gli = new GlitchSquad("Completed");
The problem is that as output I get:
line is: null ;
line is: null ;
line is: null ;
etc
and never get to the "Completed" popup =(
I cannot understand what exactly that null is, because the check line != null doesn't work.
I also tried to use that null as a string, but nothing.
I thought it was a problem with how I close the streams, but the code now seems correct to me, and still there is no way to stop it.
Suggestion?
Thanks in advance!
p.s. it is a summarized version in order to focus on the streams; variables are correctly declared, and the same goes for imports etc.
edit: code updated
EOF is EOF. There is no more data. Unless you have an expected EOF mark within the file, or a self-describing protocol that tells you where the EOF mark should be, there is no way to determine whether the file was completely written.
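If you control the file format, one possible workaround (an assumption on my side, not something your code does) is to write an explicit end marker as the last line and check for it when reading:
// Sketch: an explicit end-of-data marker as a "self-describing" EOF check.
static final String END_MARKER = "##END##";

static void writeWithMarker(File out, List<String> lines) throws IOException {
    try (PrintWriter pw = new PrintWriter(new FileWriter(out))) {
        for (String line : lines) {
            pw.println(line);
        }
        pw.println(END_MARKER); // written last, so its presence means the file is complete
    }
}

static boolean wasCompletelyWritten(File in) throws IOException {
    String last = null;
    try (BufferedReader br = new BufferedReader(new FileReader(in))) {
        for (String line = br.readLine(); line != null; line = br.readLine()) {
            last = line;
        }
    }
    return END_MARKER.equals(last);
}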
I don't know if it will solve your problem, but I'd be using this code instead:
try {
    fis = new FileInputStream(file_sel);
    isr = new InputStreamReader(fis);
    br = new BufferedReader(isr);
    map.put(num_file++, br);
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
Otherwise there may be uncaught NullPointerExceptions or broken BufferedReaders in your map. (I don't know offhand how new InputStreamReader(null) behaves.)
It looks like i and num_file always have equal values, so just drop i. Or use a LinkedList and drop both, as sketched below.
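For example (a sketch reusing app and the file-name filter from your code):
// Sketch: collect the readers in a list instead of a map keyed by a counter.
List<BufferedReader> readers = new LinkedList<>();
for (File file_sel : app) {
    if (file_sel.getName().startsWith("gtou")) {
        try {
            readers.add(new BufferedReader(
                    new InputStreamReader(new FileInputStream(file_sel))));
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }
}
// later: for (BufferedReader reader : readers) { ... ; reader.close(); }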
If there's no special merging that you have to do, I'd just do it like this:
OutputStream os;
try {
    os = new FileOutputStream(outfile);
} catch (FileNotFoundException e) {
    os = null;
    e.printStackTrace();
}
if (os != null) {
    for (File file_sel : app) {
        if (file_sel.getName().startsWith("gtou")) {
            System.out.println(file_sel.getName());
            InputStream is = null;
            try {
                is = new FileInputStream(file_sel);
                byte[] buffer = new byte[1024];
                int readBytes = 0;
                while ((readBytes = is.read(buffer)) > 0) {
                    os.write(buffer, 0, readBytes);
                }
                os.flush();
                is.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
        }
    }
}
If you read files with different encodings, you will of course have to modify at least the reading.
If it doesn't work, I'd suggest you build a "summarized" and runnable sample program.
The core of your question is this code:
BufferedReader br = ...
String line = br.readLine();
while (line != null) {
    System.out.println("line is:" + line);
    ...
    line = br.readLine();
}
You say that this repeatedly outputs this:
line is: null ;
line is: null ;
(Notice the " ;" on the end!!!)
The only way that can happen is if the file you are reading contains at least one line that looks like this:
null ;
Indeed, unless the "..." code includes a continue statement, there must be lots of those lines in the input file.
Is there a way to check whether a file was correctly written?
Yea. Look at it using a text editor and/or check its file size.
I mean if there is an EOF at the end?
In modern file systems, EOF is a position not a marker. Specifically it is the position after the last byte of the file. So it is logically impossible for a file to not have an EOF. (You'd have to have a file that is infinite in length for there to be no EOF.)

Prepend lines to file in Java

Is there a way to prepend a line to the File in Java, without creating a temporary file, and writing the needed content to it?
No, there is no way to do that SAFELY in Java. (Or AFAIK, any other programming language.)
No filesystem implementation in any mainstream operating system supports this kind of thing, and you won't find this feature supported in any mainstream programming languages.
Real world file systems are implemented on devices that store data as fixed sized "blocks". It is not possible to implement a file system model where you can insert bytes into the middle of a file without significantly slowing down file I/O, wasting disk space or both.
The solutions that involve an in-place rewrite of the file are inherently unsafe. If your application is killed or the power dies in the middle of the prepend / rewrite process, you are likely to lose data. I would NOT recommend using that approach in practice.
Use a temporary file and renaming. It is safer.
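A sketch of the temporary-file approach with java.nio.file (the file name is a placeholder):
// Sketch: prepend a line by writing a temp file and renaming it over the original.
Path target = Paths.get("data.txt"); // placeholder name
Path tmp = Files.createTempFile(target.toAbsolutePath().getParent(), "prepend", ".tmp");
try (BufferedWriter w = Files.newBufferedWriter(tmp, StandardCharsets.UTF_8)) {
    w.write("the new first line");
    w.newLine();
    try (BufferedReader r = Files.newBufferedReader(target, StandardCharsets.UTF_8)) {
        for (String line = r.readLine(); line != null; line = r.readLine()) {
            w.write(line);
            w.newLine();
        }
    }
}
// the original file is only touched by the final rename
Files.move(tmp, target, StandardCopyOption.REPLACE_EXISTING);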
There is a way, though it involves rewriting the whole file (but no temporary file). As others mentioned, no file system supports prepending content to a file. Here is some sample code that uses a RandomAccessFile to write and read content while keeping some content buffered in memory:
public static void main(final String args[]) throws Exception {
    File f = File.createTempFile(Main.class.getName(), "tmp");
    f.deleteOnExit();
    System.out.println(f.getPath());

    // put some dummy content into our file
    BufferedWriter w = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(f)));
    for (int i = 0; i < 1000; i++) {
        w.write(UUID.randomUUID().toString());
        w.write('\n');
    }
    w.flush();
    w.close();

    // prepend "some uuids" to our file
    int bufLength = 4096;
    byte[] appendBuf = "some uuids\n".getBytes();
    byte[] writeBuf = appendBuf;
    byte[] readBuf = new byte[bufLength];

    int writeBytes = writeBuf.length;
    RandomAccessFile rw = new RandomAccessFile(f, "rw");
    int read = 0;
    int write = 0;
    while (true) {
        // seek to read position and read content into read buffer
        rw.seek(read);
        int bytesRead = rw.read(readBuf, 0, readBuf.length);
        // seek to write position and write content from write buffer
        rw.seek(write);
        rw.write(writeBuf, 0, writeBytes);
        // no bytes read - end of file reached
        if (bytesRead < 0) {
            break;
        }
        // update seek positions for write and read
        read += bytesRead;
        write += writeBytes;
        writeBytes = bytesRead;
        // reuse buffer, create new one to replace (short) append buf
        byte[] nextWrite = writeBuf == appendBuf ? new byte[bufLength] : writeBuf;
        writeBuf = readBuf;
        readBuf = nextWrite;
    }
    rw.close();

    // now show the content of our file
    BufferedReader reader = new BufferedReader(new InputStreamReader(new FileInputStream(f)));
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
}
You could store the file content in a String and prepend the desired line by using a StringBuilder: put the desired line first, then append the file-content String, and write the result back.
No extra temporary file needed.
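A sketch of that approach (only reasonable for files that fit in memory, and not crash-safe since the file is rewritten in place; the file name is a placeholder):
Path path = Paths.get("data.txt"); // placeholder name
String content = new String(Files.readAllBytes(path), StandardCharsets.UTF_8);
String result = new StringBuilder("the new first line\n").append(content).toString();
Files.write(path, result.getBytes(StandardCharsets.UTF_8));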
No. There are no "intra-file shift" operations, only read and write of discrete sizes.
It would be possible to do so by reading a chunk of the file of equal length to what you want to prepend, writing the new content in place of it, reading the later chunk and replacing it with what you read before, and so on, rippling down to the end of the file.
However, don't do that, because if anything stops (out-of-memory, power outage, rogue thread calling System.exit) in the middle of that process, data will be lost. Use the temporary file instead.
private static void addPreAppnedText(File fileName) {
    FileOutputStream fileOutputStream = null;
    BufferedReader br = null;
    FileReader fr = null;
    String newFileName = fileName.getAbsolutePath() + "#";
    try {
        fileOutputStream = new FileOutputStream(newFileName);
        fileOutputStream.write("preappendTextDataHere".getBytes());
        fr = new FileReader(fileName);
        br = new BufferedReader(fr);
        String sCurrentLine;
        while ((sCurrentLine = br.readLine()) != null) {
            fileOutputStream.write(("\n" + sCurrentLine).getBytes());
        }
        fileOutputStream.flush();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (fileOutputStream != null)
                fileOutputStream.close();
            if (br != null)
                br.close();
            if (fr != null)
                fr.close();
            new File(newFileName).renameTo(new File(newFileName.replace("#", "")));
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}
