I'm trying to get a copy of the content of a stream without consuming it, so that I can use the original stream later in the code. Below is the sample code I used to check this. My intention is to keep the original InputStream usable after taking a copy.
import java.io.BufferedReader;
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;

public class StreamTest {
    public static void main(String[] args) {
        try {
            InputStream inputstream = new FileInputStream("resource.txt"); // content of the file is 'test'
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            org.apache.commons.io.IOUtils.copy(inputstream, baos);
            byte[] bytes = baos.toByteArray();
            System.out.println("copied stream : " + new String(bytes));

            StringBuilder sb = new StringBuilder();
            BufferedReader br = new BufferedReader(new InputStreamReader(inputstream));
            String line;
            while ((line = br.readLine()) != null) {
                sb.append(line + System.lineSeparator());
            }
            System.out.println("original stream : " + sb.toString());
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
However, when I access the original stream after copying it, I see that it has already been consumed. See the output below:
copied stream : test
original stream :
Can someone point out my mistake? Thanks.
Sadly this is not possible by the nature of streams.
In order to copy the data from the stream, you need to first extract it. By extracting it, you consume the stream.
But do not worry, there should be a solution for your specific use case. If you know the source of the stream yields the same data every time (as a file does), you can pass around a Supplier<InputStream> instead of the InputStream itself, so that a new stream is created whenever one is needed. Otherwise, check out this post for creating multiple streams over the same data: https://stackoverflow.com/a/5924132/3102234
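For instance, if the data is small enough to fit in memory, one common approach is to buffer the bytes once and then hand out a fresh ByteArrayInputStream to every consumer. A minimal sketch of that idea (class and file names are illustrative):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReusableStreamSketch {

    // Read the source once into memory (assumes the content fits in memory).
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            baos.write(buf, 0, n);
        }
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data;
        try (InputStream in = new FileInputStream("resource.txt")) {
            data = readAll(in);
        }
        // Every consumer gets its own fresh stream over the same bytes.
        try (InputStream copy1 = new ByteArrayInputStream(data);
             InputStream copy2 = new ByteArrayInputStream(data)) {
            System.out.println("copy 1 : " + new String(readAll(copy1)));
            System.out.println("copy 2 : " + new String(readAll(copy2)));
        }
    }
}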
I need help with the code below. I need to review it and fix the security issues within it. The issue I see is that the BufferedReader should read in chunks, which would help prevent a DoS attack. The way the code is written now, it will read lines of unbounded length. I'm not sure of the best way to limit the BufferedReader. Any help would be appreciated.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class example {

    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) {
        // Read the filename from the command line argument
        String filename = args[0];
        BufferedReader inputStream = null;
        String fileLine;

        try {
            inputStream = new BufferedReader(new FileReader(filename));
            System.out.println("Email Addresses:");

            // Read one line at a time using BufferedReader
            while ((fileLine = inputStream.readLine()) != null) {
                System.out.println(fileLine);
            }
        } catch (IOException io) {
            System.out.println("File IO exception" + io.getMessage());
        } finally {
            // Need another catch for closing the streams
            try {
                if (inputStream != null) {
                    inputStream.close();
                }
            } catch (IOException io) {
                System.out.println("Issue closing the Files" + io.getMessage());
            }
        }
    }
}
The requirement behind the warning about BufferedReader.readLine is to impose a reasonable bound on the maximum amount of memory an adversary can cause to be allocated at a time. In this case the important usage is the characters of the String, plus roughly the same amount again in the buffer used to create it. If the adversary can trigger this multiple times at once, that will also need to be limited. Note that if the resource can be stalled but not closed (for instance, over a network file system), the buffer can be kept in memory indefinitely.
The easy, general solution is to implement an InputStream that limits the total number of bytes that can be read through it. The same could also be implemented at the Reader level, limiting the number of characters. The dirty workaround is to bypass BufferedReader and do the reading of char arrays and the combining into a StringBuilder yourself.
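A minimal sketch of that idea (the class name and exception message are illustrative, and skip()/mark() are deliberately not handled):

import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical helper: caps the total number of bytes that can be read
// through the wrapped stream, failing instead of allowing unbounded input.
public class LimitedInputStream extends FilterInputStream {
    private long remaining;

    public LimitedInputStream(InputStream in, long maxBytes) {
        super(in);
        this.remaining = maxBytes;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1 && --remaining < 0) {
            throw new IOException("Input exceeds configured limit");
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) {
            remaining -= n;
            if (remaining < 0) {
                throw new IOException("Input exceeds configured limit");
            }
        }
        return n;
    }
}

It could then replace the plain FileReader, e.g. new BufferedReader(new InputStreamReader(new LimitedInputStream(new FileInputStream(filename), 1_000_000), StandardCharsets.UTF_8)), which also lets you pick an explicit charset.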
Presumably various third-party libraries include code that covers those approaches.
(Also: do use try-with-resources. FileReader picks up whatever character encoding has been left as the default, which is probably wrong. Adding throws IOException to main makes the code simpler.)
I am trying to write a program that reads and writes an unbuffered stream, and also reads and writes a buffered stream. Following the example in the Java docs, I've got this for my unbuffered stream, which works fine.
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class CopyCharacters {
    public static void main(String[] args) throws IOException {
        FileReader inputStream = null;
        FileWriter outputStream = null;

        try {
            inputStream = new FileReader("unbufferedread.txt");
            outputStream = new FileWriter("unbufferedwrite.txt");

            int c;
            while ((c = inputStream.read()) != -1) {
                outputStream.write(c);
            }
            // Add finally block in case of errors.
            // Display error if input file is not found.
        } finally {
            if (inputStream != null) {
                inputStream.close();
            }
            if (outputStream != null) {
                outputStream.close();
            }
        }
    }
}
However, the Java docs say: "Here's how you might modify the constructor invocations in the CopyCharacters example to use buffered I/O:"
inputStream = new BufferedReader(new FileReader("bufferedread.txt"));
outputStream = new BufferedWriter(new FileWriter("bufferedwrite.txt"));
My question is how to implement it. Is it possible to add it all to one class? When I try to add it, I get an error saying:
"Cannot find symbol - class BufferedReader"
Any help would be great. Thanks.
You have to import the java.io.BufferedReader and java.io.BufferedWriter classes. Based on the code you posted, you aren't doing that. So just add the two lines:
import java.io.BufferedReader;
import java.io.BufferedWriter;
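For completeness, a rough sketch of how the whole class might look once the buffered wrappers are in place (this version uses try-with-resources instead of the explicit finally block, purely for brevity; file names follow the docs snippet):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class CopyCharactersBuffered {
    public static void main(String[] args) throws IOException {
        // Wrapping the unbuffered reader/writer adds buffering without changing the copy loop.
        try (BufferedReader inputStream = new BufferedReader(new FileReader("bufferedread.txt"));
             BufferedWriter outputStream = new BufferedWriter(new FileWriter("bufferedwrite.txt"))) {
            int c;
            while ((c = inputStream.read()) != -1) {
                outputStream.write(c);
            }
        }
    }
}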
I am new to Java network programming. I was googling for the code of a TCP client in Java and came across the following example.
import java.lang.*;
import java.io.*;
import java.net.*;

class Client {
    public static void main(String args[]) {
        try {
            Socket skt = new Socket("localhost", 1234);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(skt.getInputStream()));

            System.out.print("Received string: '");
            while (!in.ready()) {}
            System.out.println(in.readLine()); // Read one line and output it
            System.out.print("'\n");

            in.close();
        } catch (Exception e) {
            System.out.print("Whoops! It didn't work!\n");
        }
    }
}
The client seems to read the data one "line" at a time. I am connecting to a server that is streaming OpenFlow packets. A Wireshark screenshot of OpenFlow packets is given below.
[http://www.openflow.org/downloads/screenshot-openflow-dissector-2008-07-15-2103.jpg][1]
Once I receive the complete packets, I want to dump them to a file and later read it with Wireshark, for example. In the code above they are using the BufferedReader class to read the data in "lines", at least that is how I understand it. Is there some way in which I can get full packets and then write them to a file?
Readers are for working with text data. If you are working with binary data (it's not entirely clear from that screenshot), you should be working with some type of Stream (either InputStream or possibly DataInputStream). Don't just look for random examples online; try to find ones that actually apply to what you are interested in doing.
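For example, a rough sketch of dumping the raw bytes from the socket to a file (host, port, and file name are placeholders); note that Wireshark expects a capture format such as pcap, so a raw byte dump may need extra framing before it can be opened there:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

public class RawSocketDump {
    public static void main(String[] args) throws IOException {
        // Placeholder host, port, and output file name.
        try (Socket skt = new Socket("localhost", 1234);
             InputStream in = skt.getInputStream();
             OutputStream out = new FileOutputStream("capture.bin")) {
            byte[] buf = new byte[4096];
            int n;
            // Copy raw bytes until the server closes the connection.
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }
}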
Also, don't ever use InputStream.available; it's pretty much useless, as is any example code using it.
Also, a simple Google search for "OpenFlow java" had some interesting hits. Are you sure you need to write something from scratch?
No, but there are libraries that provide such functions. See for example Guava:
http://docs.guava-libraries.googlecode.com/git/javadoc/com/google/common/io/ByteStreams.html
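For instance, something along these lines reads an entire stream into a byte array with Guava's ByteStreams (a rough sketch; the file name is illustrative):

import com.google.common.io.ByteStreams;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class GuavaReadAll {
    public static void main(String[] args) throws IOException {
        try (InputStream in = new FileInputStream("sample.txt")) {
            // Reads the whole stream into memory in one call.
            byte[] data = ByteStreams.toByteArray(in);
            System.out.println(new String(data));
        }
    }
}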
If you don't want to (or can't) use libraries, you should consume the stream like this:
List<String> lst = new ArrayList<String>();
String line;
while ((line = in.readLine()) != null) {
    lst.add(line);
}
or
String str = "";
String line;
while ((line = in.readLine()) != null) {
    str += line + "\n";
}
Note that BufferedReader.readLine() returns a new line each time it encounters a line break ('\n'), with the terminator stripped. If the InputStream is binary, you should work with bytes instead.
To execute a process/command in bash from Java, one can use the following class:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.Arrays;

public class CmdExecutor {

    public CmdExecutor() {
    }

    public void exe(String[] args) throws IOException {
        if (args.length <= 0) {
            System.out.println("empty command");
            return;
        }
        Process process = new ProcessBuilder(args).start();
        InputStream is = process.getInputStream();
        InputStreamReader isr = new InputStreamReader(is);
        BufferedReader br = new BufferedReader(isr);
        String line;
        System.out.printf("Output of running %s is:",
                Arrays.toString(args));
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    }
}
However, what if the process doesn't terminate (yet)? It keeps calculating and writing its output to the shell. I want to be able to run the process in another thread and pick up changes in the output as they happen, like frame-by-frame strings. How can this be achieved?
Check out
http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/LinkedBlockingQueue.html
You may want to have one thread (the producer) reading the process output and putting each line into a LinkedBlockingQueue (queue.put), and another thread (the consumer) taking elements from the queue (queue.poll) and processing them.
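A rough sketch of that setup, with the command taken from the program arguments and the consumer simply printing each line (details such as the poll timeout are illustrative):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class StreamingProcessOutput {
    public static void main(String[] args) throws IOException, InterruptedException {
        LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>();
        Process process = new ProcessBuilder(args).start();

        // Producer thread: reads process output as it is produced and queues each line.
        Thread producer = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(process.getInputStream()))) {
                String line;
                while ((line = br.readLine()) != null) {
                    queue.put(line);
                }
            } catch (IOException | InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        // Consumer (here the main thread): handles lines as they arrive.
        while (producer.isAlive() || !queue.isEmpty()) {
            String line = queue.poll(100, TimeUnit.MILLISECONDS);
            if (line != null) {
                System.out.println("got: " + line);
            }
        }
    }
}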
I am trying to read a text file which contains about 1000 very long lines. The entire file is about 1.4 MB.
I am using BufferedReader's readLine method to read the file. It takes 8-10 seconds to print the output to the console. I tried the same using PHP's fgets and it prints all the same lines in the blink of an eye! How is that possible?
Below is the code I am using
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ClickLogDataImporter {

    public static void main(String[] args) {
        try {
            new ClickLogDataImporter().getFileData();
        } catch (Exception ex) {
            Logger.getLogger(ClickLogDataImporter.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public void getFileData() throws FileNotFoundException, IOException {
        String path = "/home/shantanu/Documents";
        BufferedReader br = new BufferedReader(new InputStreamReader(
                new FileInputStream(path + "/sample.txt")));
        String line = "";
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    }
}
PHP code
<?php
$fileName = "/home/shantanu/Documents/sample.txt";
$file = fopen($fileName, 'r');
while (($line = fgets($file)) !== false) {
    echo $line."\n";
}
?>
Please enlighten me about this issue
I'm not sure, but I think that with the method you used, PHP just prints the file, while Java reads the file and extracts every line from it, which means checking every character for a line break. The two processes are not the same at all.
(See PHP's file_get_contents, which returns the whole file as a string.)
If you try to print the file line by line with PHP as well, it should be slower.
8 seconds for that code sounds much too long to me. I suspect something else is going on, to be honest. Are you sure it's not console output which is taking a long time?
I suggest you time it (e.g. with System.nanoTime), writing out the total time at the end, but run it with the console minimized. I suspect you'll find it's fast enough then.
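For example, a minimal timing wrapper around the ClickLogDataImporter class from the question might look like this (sketch only):

public class TimingSketch {
    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        new ClickLogDataImporter().getFileData();   // the reading code from the question
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        // Print the timing to stderr so it is not mixed into the file output.
        System.err.println("Reading took " + elapsedMillis + " ms");
    }
}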
Isn't that just the console output that is slow? Now that you know that your file is read correctly, try commenting out the line System.out.println(line);.
file_get_contents loads the whole file content into a string, whereas with your Java code you are reading and printing line by line.
If you are testing inside an IDE like Eclipse, the console output can be quite slow.
If you want the exact behavior of file_get_contents, you can use this dirty code:
File f = new File(path, "sample.txt");
ByteArrayOutputStream bos = new ByteArrayOutputStream(new Long(Math.min(Integer.MAX_VALUE, f.length())).intValue());
FileInputStream fis = new FileInputStream(f);
byte[] buf = new byte[1024 * 8];
int size;
while ((size = fis.read(buf)) > 0) {
    bos.write(buf, 0, size);
}
fis.close();
bos.close();
System.out.println(new String(bos.toByteArray()));
Well, if you are using readLine it will go to the file roughly 1000 times, once per line. Try using the read function with a very big buffer, say over 28000 or so. It will then read the file in around 60 calls for 1.4 MB, which is much less than 1000. If you use a small buffer of 1000, it is going to take around 1300 reads, which is even slower than the 1000 of readLine. Also, while printing, use print instead of println, since what you read is not exactly a line but an array of characters.
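A rough sketch of that approach, reading with a large char buffer and printing once at the end (buffer size and path are illustrative):

import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;

public class BigBufferRead {
    public static void main(String[] args) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (Reader reader = new FileReader("/home/shantanu/Documents/sample.txt")) {
            char[] buf = new char[32 * 1024];   // large buffer; size is illustrative
            int n;
            while ((n = reader.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
        }
        // One print call instead of one per line.
        System.out.print(sb);
    }
}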
Readers are usually slow; you could try stream-based reading, which is faster. Also make sure that opening the file is not what is taking the time. If you open the file and create the stream objects first, and only then start measuring, you can figure out whether the time goes into opening the file or reading it. Make sure the system I/O load is not high at the time of this operation, otherwise your measurement will be skewed.
BufferedInputStream reader = new BufferedInputStream(new FileInputStream("/home/shantanu/Documents/sample.txt"));
byte[] buffer = new byte[1024];
int count;
// Only convert the bytes actually read on this iteration, and avoid adding extra newlines.
while ((count = reader.read(buffer)) > 0) {
    System.out.print(new String(buffer, 0, count));
}
reader.close();