This question already has answers here:
Java multiple file transfer over socket
(3 answers)
Closed 8 years ago.
I'm trying to create a simple client where I first communicate to a server:
A filename
The sequence of chunks which compose the file
So for the first one I thought to use a BufferedWriter: this choice was made since I can't use an InputStreamReader on the server, given that the readLine() method is deprecated. For the second one, I used an OutputStreamWriter, since it seems to be the better (only?) choice for writing a byte array to a socket.
So, this is the first version of my client code:
public class Client
{
    private static final int PART_SIZE = 1000000; // 1MB

    public static void main(String[] args) throws IOException
    {
        final Path file = Paths.get(args[0]);
        final String filenameBase = file.getFileName().toString();
        final byte[] buf = new byte[PART_SIZE];
        Socket socket = new Socket(InetAddress.getLocalHost(), 8080);
        System.out.println("Socket created");
        int partNumber = 0;
        Path part;
        int bytesRead;
        byte[] toWrite;
        try (
            final InputStream in = Files.newInputStream(file);
            final BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(socket.getOutputStream()));
            final DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
        ) {
            System.out.println("closed=" + socket.isClosed());
            bw.write(filenameBase, 0, filenameBase.length());
            // other stuff for chunk creation and sending
        }
    }
}
However, if I run this code, this exception occurs:
Exception in thread "main" java.net.SocketException: Socket closed
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:121)
at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:221)
at sun.nio.cs.StreamEncoder.implClose(StreamEncoder.java:316)
at sun.nio.cs.StreamEncoder.close(StreamEncoder.java:149)
at java.io.OutputStreamWriter.close(OutputStreamWriter.java:233)
at java.io.BufferedWriter.close(BufferedWriter.java:266)
at PAD.Charlie.Client.App.main(App.java:50)
The strange thing is that if I swap the order of the BufferedWriter and the DataOutputStream inside the try, everything works fine!
Actually, the idea came to me because I remembered something about this from a Java course, but I can't really recall the details! Can you help me clear up this doubt? Thanks a lot! :)
First of all, what you are doing is borderline crazy. You appear to be intending to write both text and binary data to the same stream:
It is going to be difficult to control the interleaving of the two kinds of data since you are using a buffered writer at that point in the stack.
Even if you get the interleaving right, the "other end" has the problem of unpicking it to separate the text and binary.
You attempt to justify your decision to use two stream stacks on the output stream as follows:
So for the first one I thought to use a BufferedWriter: this choice was made since I can't use an InputStreamReader on the server, given that the readLine() method is deprecated. For the second one, I used an OutputStreamWriter, since it seems to be the better (only?) choice for writing a byte array to a socket.
I don't follow your logic. But the fact that one approach doesn't work does not necessarily mean that (any) other one will.
If you want a solution that will work, then I can think of a few. The simplest is to use DataOutputStream only on the client side, and use writeUTF to write the file name and writeInt + write to write the chunks. Indicate the end of file by sending a chunk size of zero.
(You could also send the file as one big chunk if you know beforehand how many bytes you will be sending.)
The server-side code should mirror the client-side in its calls on a DataInputStream.
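A rough sketch of what I mean (names like file, PART_SIZE and socket come from your code; clientSocket on the server side is just a placeholder):

// Client: send the file name, then length-prefixed chunks, then a zero to mark end of file.
try (InputStream in = Files.newInputStream(file);
     DataOutputStream dos = new DataOutputStream(socket.getOutputStream())) {
    dos.writeUTF(file.getFileName().toString());   // file name
    byte[] buf = new byte[PART_SIZE];
    int n;
    while ((n = in.read(buf)) > 0) {
        dos.writeInt(n);                           // chunk length
        dos.write(buf, 0, n);                      // chunk bytes
    }
    dos.writeInt(0);                               // zero-length chunk = end of file
}

// Server: mirror the calls on a DataInputStream.
try (DataInputStream dis = new DataInputStream(clientSocket.getInputStream());
     OutputStream out = Files.newOutputStream(Paths.get(dis.readUTF()))) {
    int n;
    while ((n = dis.readInt()) > 0) {
        byte[] chunk = new byte[n];
        dis.readFully(chunk);                      // read exactly n bytes
        out.write(chunk);
    }
}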
But the reason for the difference in behaviour that you are seeing is that the order of the declarations in the try-with-resources initialization determines the order in which the streams are closed at the end of the try block.
If the writer is closed first then:
BufferedWriter.close()
-> BufferedWriter.flush() -> OutputStreamWriter.write()
-> OutputStreamWriter.close() -> SocketOutputStream.close()
DataOutputStream.close() -> SocketOutputStream.close()
This is OK because the second set of closes does not need to write any data.
If the writer is closed second then:
DataOutputStream.close() -> SocketOutputStream.close()
BufferedWriter.close()
-> BufferedWriter.flush() -> OutputStreamWriter.write() // FAIL
The failure happens because the flush cannot write data to the socket because you have already (implicitly) closed it.
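To make that concrete, this is a sketch of the ordering that works (try-with-resources closes its resources in reverse declaration order):

try (
    final DataOutputStream dos = new DataOutputStream(socket.getOutputStream());                       // closed second
    final BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(socket.getOutputStream()))     // closed first
) {
    // ... writes ...
}   // bw.close() flushes while the socket is still open; dos.close() then closes the socket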
Closing the BufferedWriter flushes it, and because you create the writer first, it is closed last, after the DataOutputStream; and closing either of them closes the socket, as the stack trace shows. DataOutputStream isn't buffered, so flushing it does nothing.
NB:
... since I can't use an InputStreamReader on the server, given that the readLine() method is deprecated. For the second one, I used an OutputStreamWriter, since it seems to be the better (only?) choice for writing a byte array to a socket.
None of this makes sense. InputStreamReader doesn't have a readLine() method, let alone one that is deprecated; and OutputStreamWriter writes chars, not bytes.
Related
Consider the following code snippet. The getInputStreamForRead() method creates and returns a new input stream for reading.
InputStream is = getInputStreamForRead(); //This method creates and returns an input stream for file read
is = getDecompressedStream(is);
Since the original file content is compressed when stored, it has to be decompressed while reading. Hence the getDecompressedStream() method below provides the option to decompress the stream content:
public InputStream getDecompressedStream(InputStream is) throws Exception {
    return new GZIPInputStream(is);
}
I have the following doubts. Which one is correct for the above snippet:
is = getDecompressedStream(is)
or
InputStream newStream = getDecompressedStream(is)
Will reusing the InputStream variable again cause any trouble?
I'm completely new to streams. Kindly help me understand this.
As long as:
you're not manipulating the original InputStream between the original assignment and the new invocation
you're always closing your streams in a finally statement
... you should be fine re-assigning to the original variable - it's just a new value assigned to an existing variable.
In fact, that may be the recommended way, since you then only have to close one Closeable yourself, because GZIPInputStream#close...
Closes this input stream and releases any system resources associated with the stream.
(see here - I read this as, "closes the underlying stream").
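For example, a minimal sketch of that pattern, using the question's getInputStreamForRead() and getDecompressedStream() methods:

InputStream is = getInputStreamForRead();   // original (compressed) stream
try {
    is = getDecompressedStream(is);         // re-assign: wraps the original stream
    // ... read from is ...
} finally {
    is.close();   // closing the GZIPInputStream also closes the stream it wraps
}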
Since you want to close the input stream correctly, the best way is to create the input stream using chaining, and using a try-with-resources to handle the close for you.
try (InputStream is = getDecompressedStream(getInputStreamForRead())) {
// code using stream here
}
In my API (Spring Boot) I have an endpoint where users can upload multiple files at once. The endpoint takes as input a list of MultipartFile.
I don't wish to pass these MultipartFile objects to the service directly, so I loop through each MultipartFile and build a simple map that stores the filename and its InputStream.
Like this:
Map<String, InputStream> filesMap = new HashMap<>();
for (MultipartFile multipartFile : files) {
    try (InputStream is = multipartFile.getInputStream()) {
        filesMap.put(multipartFile.getOriginalFilename(), is);
    }
}
service.uploadFiles(filesMap);
My understanding of Java streams and stream closing is quite limited.
I thought that try-with-resources automatically closes the InputStream once the code reaches the end of the try block.
In the above code, when exactly does multipartFile.getInputStream() get closed?
Will the fact that I'm storing the stream in a map cause a memory leak?
The stream closes right after execution reaches the closing bracket of the try block.
It is okay to store an InputStream anywhere after you have closed it.
But be aware that you can't read anything from the stream after it is closed.
Thanks to the comments:
Also, be aware that some streams have special behavior on close(); it always depends on the stream implementation.
For example:
If you try to read from a closed FileInputStream you will get
java.io.IOException: Stream Closed
If you try to read from a closed ByteArrayInputStream it will be okay, because of its special close() implementation: public void close() throws IOException {}
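A quick sketch of that difference (the file name is just an example):

FileInputStream fis = new FileInputStream("data.txt");   // assumes this file exists
fis.close();
fis.read();        // throws java.io.IOException: Stream Closed

ByteArrayInputStream bais = new ByteArrayInputStream(new byte[] {1, 2, 3});
bais.close();
bais.read();       // fine, returns 1: ByteArrayInputStream.close() is a no-op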
When exactly does multipartFile.getInputStream() get closed?
try (InputStream is = multipartFile.getInputStream()) {
filesMap.put(file.getOriginalFilename(), is);
} // <-- here
The try-with-resources statement ensures that each resource is closed at the end of the statement.
Will the fact that I'm storing the stream in a map cause a memory leak?
No, your collection just keeps closed InputStreams, and you won't be able to read from them (you will get an IOException instead).
I'm having trouble transitioning to Java from C/C++ for my "Telnet" interface to some modules we work with here. I want to be able to establish a connection with a card that, after starting its command-line interface, waits for a connection and serves up a prompt ("OK>") to clients. This works fine for both the C and C# clients I've written, but Java has given me some issues. I've attached some code that I grabbed from some examples online, but so far, all I can ascertain for sure is that the socket is being created.
Code:
private boolean CreateTelnetSession()
{
    try
    {
        _socket = new Socket();
        _socket.connect(new InetSocketAddress(_ipAddr, _ipPort));
        _socket.setSoTimeout(10000);
        _socket.setKeepAlive(true);
        _out = new PrintWriter(_socket.getOutputStream(), true);
        _in = new BufferedReader(new InputStreamReader(_socket.getInputStream()));
        _out.println("\r\n");
        System.out.println(_in.readLine());
        return true;
    }
    catch(Exception e)
    {
        System.out.println("Exception!");
    }
    return false;
}
The socket SEEMS to be created correctly, and when the program shuts down, I can see the session close on the card(s) I'm trying to talk to, but I don't see the carriage return/line feed echoed on the card as I would expect, or a prompt returned via the InputStream. Is it possible that it's a character encoding issue? Am I doing something incorrectly with the streams (crossing them!?!)? Any insight at all? When I get over this initial learning curve, I would like to acknowledge how easy Java makes these socket reads and writes, but until then...
I read this post:
java simple telnet client using sockets
It seems similar to what I'm running up against, but it's not the same. I'm willing to take the rep hit if someone has seen something on here that resolves my issue, so feel free to let me know, bluntly, what I missed.
Edit:
private boolean CreateTelnetSession()
{
    try
    {
        _socket = new Socket();
        _socket.connect(new InetSocketAddress(_ipAddr, _ipPort));
        _socket.setSoTimeout(10000);
        _socket.setKeepAlive(true);
        _out = new DataOutputStream(_socket.getOutputStream());
        _in = new DataInputStream(_socket.getInputStream());
        _outBuffer = ByteBuffer.allocate(2048);
        _outBuffer.order(ByteOrder.LITTLE_ENDIAN);
        _inBuffer = ByteBuffer.allocate(2048);
        _inBuffer.order(ByteOrder.LITTLE_ENDIAN);
        System.out.println("Connection Response: " + _in.read(_inBuffer.array()));
        System.out.println("Response: " + WriteCommand("DRS\r\n"));
        return true;
    }
    catch(Exception e)
    {
        System.out.println("Exception!");
    }
    return false;
}

private String WriteCommand(String command)
{
    try
    {
        _outBuffer = encoder.encode(CharBuffer.wrap(command));
        _out.write(_outBuffer.array());
        _out.flush();
        _in.read(_inBuffer.array());
        String retString = decoder.decode(_inBuffer).toString();
        return retString.substring(0, retString.indexOf('>') + 1);
    }
    catch(Exception e)
    {
        System.out.println("Exception!");
    }
    return "E1>";
}
There are many things to clean up and I'm going to experiment with whether I need to do it in quite this way, but this is the gist of the "solution". The big killer was the endian-ness. It should be mentioned, once again, that this is ugly and non-production code, but any other input would still be appreciated.
I have a couple of things you can try. You are using a PrintWriter for your output; this is a fairly high-level Writer (i.e. it encapsulates a lot of things from you). My concern is that the println() method in PrintWriter automatically adds line-terminating character(s) at the end (as appropriate for your OS). So what you are really sending is "\r\n" plus the line terminator, so on a Unix box you would be sending "\r\n\n".
I would recommend switching to a DataOutputStream which gives you much more control over the raw bytes that are sent: http://docs.oracle.com/javase/6/docs/api/java/io/DataOutputStream.html
Remember if you switch to DataOutputStream you need to call flush on the output stream.
My other thought is that it might be an endianness problem. Java is strictly big-endian (network byte order). Is it possible your "card" is reading things in little-endian? If you need to write over the network in little-endian (if so, your card is a bad netizen!) you will need to use a ByteBuffer, set its order to little-endian, write your bytes to it, then write the bytes from your ByteBuffer to the DataOutputStream.
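A rough sketch of that (dataOut and someValue are placeholders, not from your code):

ByteBuffer buf = ByteBuffer.allocate(4);
buf.order(ByteOrder.LITTLE_ENDIAN);
buf.putInt(someValue);             // stored in little-endian order in the buffer
dataOut.write(buf.array());        // dataOut: a DataOutputStream wrapping the socket's output stream
dataOut.flush();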
I would probably switch to a DataInputStream for your input stream too. readLine() will only return once a newline character is seen. Is your card returning a newline in its response?
My last thought is that your println methods might have an error and you don't know it because PrintWriter doesn't throw exceptions. The PrintWriter JavaDocs says:
"Methods in this class never throw I/O exceptions, although some of its constructors may. The client may inquire as to whether any errors have occurred by invoking checkError()."
Hopefully something in my long rambling response will help you.
I have to edit the contents of a file and write the edited content to another file. Here is the code I am using.
import java.io.*;
import java.util.*;
public class TestRef {

    ArrayList<String> lines = new ArrayList<String>();
    String line = null;

    public void printThis() {
        try {
            FileReader fr = new FileReader("C:\\Users\\questions.txt");
            BufferedReader br = new BufferedReader(fr);
            FileWriter fw = new FileWriter("C:\\Users\\questions_out.txt");
            BufferedWriter out = new BufferedWriter(fw);
            while ((line = br.readLine()) != null) {
                if (line.contains("Javascript"))
                    line.replace("Javascript", " JAVA");
                lines.add(line);
                out.write(line);
            }
        }
        catch (Exception e) {}
    }

    public static void main(String[] args) {
        TestRef tr = new TestRef();
        tr.printThis();
    }
}
So this is like reading one line at a time and printing it back to the file. But when I execute this code, the output file is blank. Can you please provide me with sample code showing how to read from a file, change the content, and write the whole file to a new file?
Well, a few problems:
You're never closing either your input or your output. Closing will also flush - it's possible that something's just not being flushed. You should close stream-based resources in a finally block, so that they end up being closed even in the face of an exception. (Given that you should be closing, I wouldn't bother explicitly flushing as well. Just make sure you close the top-level abstraction - i.e. out (and br).)
You're catching Exception and then swallowing it. It could well be that an exception is being thrown, but you're not able to tell because you've swallowed it. You should at least be logging it, and probably stopping at that point. (I'd also suggest catching IOException instead of Exception.)
You're using FileWriter and FileReader which doesn't allow you to specify the input/output encoding - not the issue here, but personally I like to take more control over the encodings I use. I'd suggest using FileInputStream and FileOutputStream wrapped in InputStreamReader and OutputStreamWriter.
You're calling String.replace() and ignoring the result. Strings are immutable - calling replace won't change the existing string. You want:
line = line.replace("Javascript"," JAVA");
You're never using your lines variable, and your line variable would be better as a local variable. It's only relevant within the method itself, so only declare it in the method.
Your code would be easier to follow if it were more appropriately indented. If you're using an IDE, it should be able to do this for you - it makes a huge difference in readability.
The first one is the most likely cause of your current problem, but the rest should help when you're past that. (The point about "replace" will probably be your next issue...)
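Putting those points together, a corrected version might look something like this (just a sketch, keeping your file paths; a finally block works as well as the try-with-resources shown here):

public void printThis() throws IOException {
    try (BufferedReader br = new BufferedReader(new InputStreamReader(
                 new FileInputStream("C:\\Users\\questions.txt"), "UTF-8"));
         BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                 new FileOutputStream("C:\\Users\\questions_out.txt"), "UTF-8"))) {
        String line;
        while ((line = br.readLine()) != null) {
            line = line.replace("Javascript", " JAVA");   // replace() returns a new String
            out.write(line);
            out.newLine();                                 // readLine() strips line terminators
        }
    }   // both streams are closed (and flushed) here, even if an exception is thrown
}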
You are missing out.flush().
BufferedWriters don't write anything until either you flush them, or their buffer fills up.
Close the writer too, outside the loop:
out.flush();
out.close();
Moreover, readLine() does not include the line terminator, so if you want the new file to keep the same line breaks as the old one, you should also write a newline character ('\n') after each line.
I have Java code that reads through an input file using a BufferedReader until the readLine() method returns null. I need to use the contents of the file again an indefinite number of times. How can I read the file from the beginning again?
You can close and reopen it again. Another option: if it is not too large, put its content into, say, a List.
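For instance, a sketch of the second option ("input.txt" is just an example path):

// Read the file once into memory, then iterate over the lines as many times as needed.
List<String> lines = new ArrayList<String>();
try (BufferedReader br = new BufferedReader(new FileReader("input.txt"))) {
    String line;
    while ((line = br.readLine()) != null) {
        lines.add(line);
    }
}
for (String line : lines) {
    // first pass
}
for (String line : lines) {
    // second pass, without re-reading the file
}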
BufferedReader supports reset() only to a position within the buffered data; it can't go back to the beginning of the file if the file is larger than the buffer.
Solutions:
1. Reopen the file
2. Use RandomAccessFile (see the sketch below)
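A rough sketch of option 2 (the file name is only an example):

RandomAccessFile raf = new RandomAccessFile("input.txt", "r");
String line;
while ((line = raf.readLine()) != null) { /* first pass */ }
raf.seek(0);                               // jump back to the beginning of the file
while ((line = raf.readLine()) != null) { /* second pass */ }
raf.close();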
A single Reader should be used once to read the file. If you want to read the file again, create a new Reader based on it.
Using Guava's IO utilities, you can create a nice abstraction that lets you read the file as many times as you want using Files.newReaderSupplier(File, Charset). This gives you an InputSupplier<InputStreamReader> that you can retrieve a new Reader from by calling getInput() at any time.
Even better, Guava has many utility methods that make use of InputSuppliers directly... this saves you from having to worry about closing the supplied Reader yourself. The CharStreams class contains most of the text-related IO utilities. A simple example:
public void doSomeStuff(InputSupplier<? extends Reader> readerSupplier) throws IOException {
boolean needToDoMoreStuff = true;
while (needToDoMoreStuff) {
// this handles creating, reading, and closing the Reader!
List<String> lines = CharStreams.readLines(readerSupplier);
// do some stuff with the lines you read
}
}
Given a File, you could call this method like:
File file = ...;
doSomeStuff(Files.newReaderSupplier(file, Charsets.UTF_8)); // or whatever charset
If you want to do some processing for each line without reading every line into memory first, you could alternatively use the readLines overload that takes a LineProcessor.
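For example, a sketch of that (assuming the readLines(InputSupplier, LineProcessor) overload that older Guava versions provide), counting lines that contain some text:

int count = CharStreams.readLines(readerSupplier, new LineProcessor<Integer>() {
    private int matches = 0;

    @Override
    public boolean processLine(String line) {
        if (line.contains("needle")) {   // "needle" is a made-up search term
            matches++;
        }
        return true;   // keep reading
    }

    @Override
    public Integer getResult() {
        return matches;
    }
});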
You can do this by calling the run() method recursively, after checking whether any more lines can be read - here's a sample:
// Reload the file when you reach the end (i.e. when you can't read anymore strings)
if ((sCurrentLine = br.readLine()) == null) {
run();
}
If you want to do this, you may want to consider a random access file. With that you can explicitly set the position back to the beginning and start reading again from there.
I would suggest using the Commons IO library:
http://commons.apache.org/io/api-release/org/apache/commons/io/FileUtils.html
I think there is a call to just read the file into a byte array, which might be an alternate approach.
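For example (a sketch; both calls are from Commons IO's FileUtils, and the file name is just a placeholder):

byte[] contents = FileUtils.readFileToByteArray(new File("input.txt"));    // whole file as bytes
List<String> lines = FileUtils.readLines(new File("input.txt"), "UTF-8");  // or as lines
// you can now iterate over 'contents' or 'lines' as many times as you like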
Not sure if you have considered the mark() and reset() methods on the BufferedReader.
That can be an option if your files are only a few MB in size: set the mark at the beginning of the file and keep calling reset() once you hit the end of the file. It also appears that subsequent reads on the same file will be served entirely from the buffer without having to go to disk.
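A sketch of that approach (the readAheadLimit passed to mark() has to be at least as large as the number of characters read before reset() is called):

int limit = 4 * 1024 * 1024;   // must cover everything read before reset()
BufferedReader br = new BufferedReader(new FileReader("input.txt"), limit);
br.mark(limit);                // mark the beginning of the file
String line;
while ((line = br.readLine()) != null) { /* first pass */ }
br.reset();                    // back to the mark, i.e. the start of the file
while ((line = br.readLine()) != null) { /* second pass */ }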
I faced the same issue and came across this question.
1. Using mark() and reset() methods:
A BufferedReader can be created using a FileReader and also a FileInputStream. FileReader doesn't support the mark and reset methods - I got an exception when I tried it. Even when I tried with a FileInputStream I wasn't able to do it, because my file was large (yours is too, I guess). If the file length is larger than the buffer, then the mark and reset methods won't work with either FileReader or FileInputStream. More on this in this answer by @jtahlborn.
2. Closing and reopening the file
When I closed and reopened the file and created a new BufferedReader, it worked well.
The ideal way, I guess, is to reopen the file and construct a new BufferedReader, since a FileReader or FileInputStream should be used only once to read the file.
try {
    BufferedReader br = new BufferedReader(new FileReader(input));
    while ((line = br.readLine()) != null)
    {
        // do something
    }
    br.close();
}
catch (IOException e)
{
    System.err.println("Error: " + e.getMessage());
}