streaming binary files through a communication port - java

I am pushing binary files through a COM port to a module connected to my computer.
Right now I push the files one by one, once a minute, and it seems I should instead use a stream that pushes files continuously.
Here are my methods for pushing the files:
public void push2rec(File F, boolean chk) {
    try {
        byte[] read = BinRead(F.getAbsolutePath());
        SP.writeBytes(read);
        if (chk) { F.delete(); }
    } catch (SerialPortException ex) {
        msgArea.append(ex.toString() + "\n");
    }
}

public static byte[] BinRead(String name) {
    File file = new File(name);
    byte[] bytes = new byte[(int) file.length()];
    try {
        FileInputStream inputStream = new FileInputStream(file);
        inputStream.read(bytes);
        inputStream.close();
    } catch (FileNotFoundException ex) {
        System.out.println(ex);
    } catch (IOException ex) {
        System.out.println(ex);
    }
    return bytes;
}
SP is a serial port instance.
My question is: what would be the best way to do this? Also, would it be possible to feed the same file over and over again through a stream until the next file is due to be pushed?
A given file must be pushed every minute; this is very important. That means I cannot push many different files within the same minute; it should be a stream of the same file.
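One way to keep that one-minute cadence, sketched here on the assumption that SP is a jSSC SerialPort (the SerialPortException suggests jSSC) and that each file fits in memory, is to schedule the push on a single-threaded executor instead of opening streams by hand every minute:

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import jssc.SerialPort;
import jssc.SerialPortException;

public class MinutePusher {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    /** Push the same file to the port once per minute until stop() is called. */
    public void start(SerialPort port, File file, boolean deleteAfterPush) {
        scheduler.scheduleAtFixedRate(() -> {
            try {
                byte[] data = Files.readAllBytes(file.toPath()); // read the whole file
                port.writeBytes(data);                           // push it to the module
                if (deleteAfterPush) {
                    Files.deleteIfExists(file.toPath());
                }
            } catch (IOException | SerialPortException ex) {
                System.err.println("push failed: " + ex);
            }
        }, 0, 1, TimeUnit.MINUTES);
    }

    public void stop() {
        scheduler.shutdown();
    }
}

Whether the same file is re-sent or replaced by a new one between ticks, the schedule stays at exactly one push per minute.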

Related

How do I prevent a file from corrupting when I transfer it over a local network using sockets?

I am working on a school project where I want to build a personal storage server. At the moment I am trying to transfer a file from the client machine to the server. However, when testing this with an image, the file only partially arrives before it is corrupted.
Please bear in mind that I am a reasonably new programmer and my technical knowledge is somewhat limited.
I am using a byte array sent through a DataOutputStream to transfer the file. I want to use this method because it should work for any file type. I've tried setting the buffer size to the exact size of the file and larger, but neither worked.
Server:
public void run() {
    try {
        System.out.println("ip: " + clientSocket.getInetAddress().getHostAddress());
        out = new DataOutputStream(clientSocket.getOutputStream());
        in = new DataInputStream(clientSocket.getInputStream());
        in.read(buffer, 0, buffer.length);
        fileOut = new FileOutputStream("X:\\My Documents\\My Pictures\\gradient.jpg");
        fileOut.write(buffer, 0, buffer.length);
        in.close();
        out.close();
        clientSocket.close();
    } catch (IOException ex) {
        System.out.println(ex.getMessage());
    }
}
Client:
public void startConnection(String ip, int port) {
    try {
        clientSocket = new Socket(ip, port);
        out = new DataOutputStream(clientSocket.getOutputStream());
        in = new DataInputStream(clientSocket.getInputStream());
        x = false;
        Path filePath = Paths.get("C:\\Users\\georg\\Documents\\gradient.jpg");
        buffer = Files.readAllBytes(filePath);
        Thread.sleep(3000);
        //Files.write(filePath, buffer);
        //out.write(buffer,0,buffer.length);
        x = true;
        sendMessage(buffer);
    } catch (IOException ex) {
        System.out.println(ex.getMessage());
    } catch (InterruptedException ex) {
        Logger.getLogger(PCS_Client.class.getName()).log(Level.SEVERE, null, ex);
    }
}

public byte[] sendMessage(byte[] buffer) {
    if (x == true) {
        try {
            out.write(buffer, 0, buffer.length);
        } catch (IOException ex) {
            System.out.println(ex.getMessage());
        }
    }
    return null;
}
Here is a comparison of the files I've tried to send vs the files I receive:
https://imgur.com/gallery/T7nUUJT
Curiously, sending a single-colour image produces a single-colour image on the server. I believe the issue may be in the timing of code execution, but I am not sure and do not know how to go about fixing it.
The issue is in your server code, at this line:
in.read(buffer, 0, buffer.length);
You expect to read all the data at once, but if you read the doc you will find this:
public final int read(byte[] b, int off, int len) throws IOException
Reads up to len bytes of data from the contained input stream into an array of bytes. An attempt is made to read as many as len bytes, but a smaller number may be read, possibly zero. The number of bytes actually read is returned as an integer.
The important part is Reads up to len bytes of data.
You must use the return value of read and call read repeatedly until there is nothing more to read.
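A minimal sketch of such a loop (the method name, the 8 KB buffer and the use of try-with-resources are illustrative choices, not taken from the question):

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

static void copyToFile(InputStream in, String targetPath) throws IOException {
    try (FileOutputStream fileOut = new FileOutputStream(targetPath)) {
        byte[] buffer = new byte[8192];          // fixed-size chunk, not the file size
        int n;
        while ((n = in.read(buffer)) != -1) {    // read() reports how many bytes actually arrived
            fileOut.write(buffer, 0, n);         // write only what was read this time
        }
    }
}

On the server you would call it with clientSocket.getInputStream() and the target path; the loop ends when the client closes its side of the socket.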

Reading a resource file from within compiled jar, return as file

I've read Reading a resource file from within jar, but I couldn't figure out how to get a File instead of an InputStream, which is what I need. This is the code:
private void duplicateDocument() {
    FileOutputStream fos = null;
    File file;
    try {
        try {
            doc = new File(getClass().getResource("1.docx").toURI());
            //doc = new File(getClass().getResourceAsStream("1.docx"));
        } catch (URISyntaxException ex) {
            Logger.getLogger(ForensicExpertWitnessReportConfigPanel.class.getName()).log(Level.SEVERE, "Failed ...", ex);
        }
        file = new File("C:\\Users\\student\\Documents\\myfile.docx");
        fos = new FileOutputStream(file);
        /* This logic will check whether the file
         * exists or not. If the file is not found
         * at the specified location it would create
         * a new file
         */
        if (!file.exists()) {
            file.createNewFile();
        }
        /* String content cannot be directly written into
         * a file. It needs to be converted into bytes
         */
        byte[] bytesArray = FileUtils.readFileToByteArray(doc);
        fos.write(bytesArray);
        fos.flush();
        System.out.println("File Written Successfully");
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            if (fos != null) {
                fos.close();
            }
        } catch (IOException ioe) {
            System.out.println("Error in closing the Stream");
        }
    }
}
FileUtils.readFileToByteArray is the only thing I've been able to get working so far, which is why I need the value as a File rather than an InputStream.
Currently the code above throws a java.lang.IllegalArgumentException, which is why I saw a suggestion online to use getResourceAsStream() instead; however, I haven't been able to return it as a File.
My next option is to try Reading a resource file from within jar - buffered reader instead.
Can someone help?
I recommend Files with its many useful functions:
Path out = Paths.get("C:\\Users\\student\\Documents\\myfile.docx");
InputStream in = getClass().getResourceAsStream("1.docx");
Files.copy(in, out, StandardCopyOption.REPLACE_EXISTING);
A resource in principle is a read-only file, possibly zipped in a jar.
Hence one cannot write back to it; it can only serve as a template for a real file, as is done here.
I got it working, using this:
InputStream in = getClass().getResourceAsStream("1.docx");
byte[] bytesArray = IOUtils.toByteArray(in);
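For completeness, a sketch of writing those bytes back out to the destination path from the question (the try-with-resources wrapping is an addition, not the original code):

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.io.IOUtils;

private void duplicateDocument() throws IOException {
    try (InputStream in = getClass().getResourceAsStream("1.docx");
         FileOutputStream fos = new FileOutputStream("C:\\Users\\student\\Documents\\myfile.docx")) {
        byte[] bytesArray = IOUtils.toByteArray(in); // read the whole resource from the jar
        fos.write(bytesArray);                       // write it to the real file on disk
    }
}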

How to add string to a file without closing the output stream objects every time

I will receive a large amount of data (say 1000 records per second, each at least 15 bytes). My earlier approach was to create a new output stream object every time, specify the path, and append the values to the file, all in a separate thread. However, I am still facing a performance hit. I thought that instead of writing the data to a file as
File dir = new File(android.os.Environment.getExternalStorageDirectory().getAbsolutePath() + DEBUG_FILE_PATH);
boolean b = dir.mkdirs();
try {
    fileOutputStream = new FileOutputStream(new File(dir, FILE_NAME), true);
    outputStreamWriter = new OutputStreamWriter(fileOutputStream);
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
outputStreamWriter.append("some data").close();
I want to keep the outputStreamWriter and the other objects open, use them to append the data to the writer's buffer, and only at the end of the application (when I close the app, perhaps in the activity's onDestroy() method) write the data to the file and close all the open streams.
This approach works for me, but the buffer size of OutputStreamWriter is only 8 KB, which is small compared to the amount of data I am receiving.
How can I solve this?
The vast majority of your performance hit is most probably in opening the file every single time you want to write a few bytes to it.
So, if you just eliminate the opening and closing of the file all the time, you should be fine.
Just open the file once, keep writing data to it as the data arrives, and then close the file when your application closes.
This way, using a buffered OutputStreamWriter will give you a performance benefit without having to worry about the size of its buffer: when its buffer is full, it will flush itself, transparently to you. You don't need to know anything about how it works and how large (or small) its buffer is.
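As a sketch of that idea (the class and method names here are illustrative, not the asker's code):

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;

class LogFileWriter {
    private final BufferedWriter writer;

    // Open the file once, in append mode, and keep the writer for the app's lifetime.
    LogFileWriter(File dir, String fileName) throws IOException {
        writer = new BufferedWriter(
                new OutputStreamWriter(new FileOutputStream(new File(dir, fileName), true)));
    }

    // Called for every chunk that arrives; the BufferedWriter flushes itself whenever its buffer fills.
    void append(String data) throws IOException {
        writer.write(data);
    }

    // Called once when the app shuts down; close() flushes anything still buffered.
    void close() throws IOException {
        writer.close();
    }
}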
This solved my problem.
With this approach I open the file once when the app starts, and I append the values I receive from the service to the file. I flush the buffer after appending, which writes the data to the file, so no more than 8 KB (the buffer's maximum size) is ever held in memory even when a lot of data comes in, and it all goes into the file, which is already open. Finally, I close the streams when the app is closed.
//Util class
public static File dir;
public static FileOutputStream fileOutputStream;
public static OutputStreamWriter outputStreamWriter;

//Different class; you can initialize it in your Application class or home activity
private void initializeFileWriteObjects() {
    dir = new File(android.os.Environment.getExternalStorageDirectory().getAbsolutePath() + DEBUG_FILE_PATH);
    boolean b = dir.mkdirs();
    try {
        fileOutputStream = new FileOutputStream(new File(dir, FILE_NAME), true);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    outputStreamWriter = new OutputStreamWriter(fileOutputStream);
}

//Util class
private static boolean writeToFile(final byte[] stream) {
    //Convert the bytes to a hex string, as I want the file contents in hex
    final String stringData = byteArrayToHexString(stream);
    try {
        outputStreamWriter.append(stringData);
        outputStreamWriter.flush();
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
    return true;
}

//When the app is closed.
@Override
protected void onDestroy() {
    super.onDestroy();
    closeFileStream();
}

//This method is in the same Util class, but called from onDestroy()
public static void closeFileStream() {
    try {
        outputStreamWriter.close();
        fileOutputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

CipherInputStream is Empty when trying to decrypt file

I'm trying to write a few helper methods to take care of reading and writing an encrypted file. I have two methods which successfully fulfill this and return an InputStream or an OutputStream (which are really the Cipher version) that I can use to read or write to the file. I have confirmed that these methods work peachy keen when wrapped with an Object stream and used to read and write an encrypted Object to file.
However, the problem arises when I try to read from an encrypted text file. I can verify that the String I feed it is being encrypted and written to the correct file, but when I try to read back from this file, the BufferedReader reports an EOF (null). The InputStream.available() method returns 0. I can confirm that the file is there, that it is being found, and that the InputStream itself is not null. Can anybody tell me what might cause this?
Reading/writing an encrypted Object works beautifully (the StreamCorruptedException is expected here):
private static void testWriteObject() {
    String path = "derp.derp";
    Derp start = new Derp("Asymmetril: " + message, 12543, 21.4, false);
    FilesEnDe.writeEncryptedObject(key, "derp.derp", start);
    echo("original");
    echo(">" + start);
    Object o;
    try {
        ObjectInputStream ois = new ObjectInputStream(ResourceManager.getResourceStatic(path));
        o = ois.readObject();
        echo("encrypted");
        echo(">" + o);
        ois.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    o = FilesEnDe.readEncryptedObject(key, path);
    echo("decrypted");
    echo(">" + o);
}
Output:
original
>Asymmetril: WE CAME, WE SAW, WE CONQUERED.; 12543; 21.4; false
[RM] > Trying to load resource: derp.derp
java.io.StreamCorruptedException
[RM] > Trying to load resource: derp.derp
decrypted
>Asymmetril: WE CAME, WE SAW, WE CONQUERED.; 12543; 21.4; false
Trying to decrypt the text file does not work (note that the encrypted text can be read back from the file):
private static void testWriteFile() {
    String path = "EncryptedOut.txt";
    BufferedReader bis1, bis2;
    try {
        BufferedOutputStream os = new BufferedOutputStream(FilesEnDe.getEncryptedOutputStream(key, path));
        os.write(message.getBytes());
        os.flush();
        os.close();
    } catch (IOException e1) {
        e1.printStackTrace();
    }
    echo("original");
    echo(">" + message);
    try {
        bis1 = new BufferedReader(new InputStreamReader(ResourceManager.getResourceStatic(path)));
        echo("encrypted");
        echo(">" + bis1.readLine());
        bis1.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    try {
        InputStream is = FilesEnDe.getEncryptedInputStream(key, path);
        InputStreamReader isr = new InputStreamReader(is);
        bis2 = new BufferedReader(isr);
        echo("bits in stream? " + is.available());
        echo("decrypted");
        echo(">" + bis2.readLine());
        bis2.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Output:
original
>WE CAME, WE SAW, WE CONQUERED.
encrypted
>¤ƒ]£¬Vß4E?´?ùûe
[RM] > Trying to load resource: EncryptedOut.txt
bytes in stream? 0
decrypted
>null
The code used to create the CipherInputStream:
public static InputStream getEncryptedInputStream(String key, String path) {
try {
InputStream is = ResourceManager.getResourceStatic(path);
SecretKeySpec keyspec = new SecretKeySpec(getHash(key),"AES");
Cipher c = Cipher.getInstance("AES");
c.init(Cipher.DECRYPT_MODE, keyspec);
return new CipherInputStream(is,c);
} catch (NoSuchAlgorithmException e) {
e.printStackTrace();
} catch (NoSuchPaddingException e) {
e.printStackTrace();
} catch (InvalidKeyException e) {
e.printStackTrace();
}
return null;
}
The problem occurs when I try to use a CipherInputStream to decrypt the file and retrieve the original String.
However, the problem arises when I try to read from an encrypted text file.
There is no such thing as an 'encrypted text file'. The result of encryption is binary, not text.
I can verify that the String I feed it is being encrypted and written to the correct file, but when I try to read back from this file, the BufferedReader reports an EOF (null).
You shouldn't be using a BufferedReader. It isn't text, it is binary. Use a BufferedInputStream.
It didn't matter whether I wrote via a PrintWriter or a BufferedOutputStream, or whether I read with a Reader or not. It turns out I had forgotten to close the OutputStream that created the file. As soon as I added that one little line, everything started working. Thank you to Antoniossss for suggesting I redo the broken part of my method. I wonder why Eclipse didn't warn about a resource leak there...
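For reference, a sketch of how the writing side can make that close automatic with try-with-resources (FilesEnDe, key and message are the asker's own helper and fields, assumed to be in scope):

// try-with-resources closes the BufferedOutputStream and the underlying
// cipher stream even if write() throws, which is what finalises the
// last encrypted block on disk.
try (BufferedOutputStream os =
         new BufferedOutputStream(FilesEnDe.getEncryptedOutputStream(key, "EncryptedOut.txt"))) {
    os.write(message.getBytes());
} catch (IOException e) {
    e.printStackTrace();
}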

how to flush the ZipOutputStream periodically in java

I am trying to archive a list of files in zip format and then download it for the user on the fly.
I am running into an out-of-memory issue when downloading a zip of 1 GB.
Please help me resolve this without increasing the JVM heap size. I would like to flush the stream periodically.
I am trying to flush periodically, but this is not working for me.
Please find my code below:
try {
    ServletOutputStream out = response.getOutputStream();
    ZipOutputStream zip = new ZipOutputStream(out);
    response.setContentType("application/octet-stream");
    response.addHeader("Content-Disposition", "attachment; filename=\"ResultFiles.zip\"");
    //adding multiple files to zip
    ZipUtility.addFileToZip("c:\\a", "print1.txt", zip);
    ZipUtility.addFileToZip("c:\\a", "print2.txt", zip);
    ZipUtility.addFileToZip("c:\\a", "print3.txt", zip);
    ZipUtility.addFileToZip("c:\\a", "print4.txt", zip);
    zip.flush();
    zip.close();
    out.close();
} catch (ZipException ex) {
    System.out.println("zip exception");
} catch (Exception ex) {
    System.out.println("exception");
    ex.printStackTrace();
}
public class ZipUtility {
    static public void addFileToZip(String path, String srcFile, ZipOutputStream zip) throws Exception {
        File file = new File(path + "\\" + srcFile);
        boolean exists = file.exists();
        if (exists) {
            long fileSize = file.length();
            int buffersize = (int) fileSize;
            byte[] buf = new byte[buffersize];
            int len;
            FileInputStream fin = new FileInputStream(path + "\\" + srcFile);
            zip.putNextEntry(new ZipEntry(srcFile));
            int bytesread = 0, bytesBuffered = 0;
            while ((bytesread = fin.read(buf)) > -1) {
                zip.write(buf, 0, bytesread);
                bytesBuffered += bytesread;
                if (bytesBuffered > 1024 * 1024) { //flush after 1mb
                    bytesBuffered = 0;
                    zip.flush();
                }
            }
            zip.closeEntry();
            zip.flush();
            fin.close();
        }
    }
}
You want to use chunked encoding to send a file that large; otherwise the servlet container will try to figure out the size of the data before sending it, so that it can set the Content-Length header. Since you are compressing files, you don't know the size of the data you're sending. Chunked encoding lets you send the response in smaller pieces. Don't set the content length of the stream. You might try using curl or something similar to inspect the HTTP headers in the response you're getting from the server; if it isn't chunked, you'll want to fix that. Research how to force your servlet container to send chunked encoding. You might have to add this response header to make the container send it chunked:
response.setHeader("Transfer-Encoding", "chunked");
The other option would be to compress the files into a temporary file created with File.createTempFile(), and then send the contents of that. If you compress to a temp file first, you know how big the file is and can set the content length for the servlet.
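A rough sketch of that temp-file variant (the method name and cleanup are illustrative; the zip would be built into the temp file beforehand with the same ZipUtility code):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;

void sendZip(HttpServletResponse response, File preparedZip) throws IOException {
    // preparedZip was created earlier with File.createTempFile("result", ".zip")
    // and filled with the zipped entries.
    response.setContentType("application/octet-stream");
    response.addHeader("Content-Disposition", "attachment; filename=\"ResultFiles.zip\"");
    response.setContentLength((int) preparedZip.length()); // size is known once the zip is finished
    try (ServletOutputStream out = response.getOutputStream()) {
        Files.copy(preparedZip.toPath(), out);              // stream the temp file to the client
    } finally {
        preparedZip.delete();                               // clean up the temporary file
    }
}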
I guess you are digging in the wrong direction. Try replacing the servlet output stream with a file stream and see whether the issue persists. I suspect your web container tries to collect the whole servlet output in order to calculate Content-Length before sending the HTTP headers.
Another thing: you are performing your close inside your try/catch block. This leaves the chance for the streams to stay open on your files if an exception is thrown, and does not give the stream a chance to flush to disk.
Always make sure your close is in a finally block (at least until you can use Java 7 with its try-with-resources block).
//build the byte buffer for transferring the data from the file
//to the zip.
final int BUFFER = 2048;
byte[] data = new byte[BUFFER];
File zipFile = new File("C:\\myZip.zip");
BufferedInputStream in = null;
ZipOutputStream zipOut = null;
try {
    //create the out stream to send the file to and zip it.
    //we want it buffered as that is more efficient.
    FileOutputStream destination = new FileOutputStream(zipFile);
    zipOut = new ZipOutputStream(new BufferedOutputStream(destination));
    zipOut.setMethod(ZipOutputStream.DEFLATED);
    //create the input stream (buffered) to read in the file so we
    //can write it to the zip.
    in = new BufferedInputStream(new FileInputStream(fileToZip), BUFFER);
    //now "add" the file to the zip (in object speak only).
    ZipEntry zipEntry = new ZipEntry(fileName);
    zipOut.putNextEntry(zipEntry);
    //now actually read from the file and write the file to the zip.
    int count;
    while ((count = in.read(data, 0, BUFFER)) != -1) {
        zipOut.write(data, 0, count);
    }
} catch (FileNotFoundException e) {
    throw e;
} catch (IOException e) {
    throw e;
} finally {
    //whether we succeed or not, close the streams.
    if (in != null) {
        try {
            in.close();
        } catch (IOException e) {
            //note and do nothing.
            e.printStackTrace();
        }
    }
    if (zipOut != null) {
        try {
            zipOut.close();
        } catch (IOException e) {
            //note and do nothing.
            e.printStackTrace();
        }
    }
}
Now, if you need to loop, you can just loop around the part that adds files to the zip; perhaps pass in an array of files and iterate over it. This code worked for me when zipping up a file.
Don't size your buf based on the file size, use a fixed size buffer.
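A sketch of the question's addFileToZip reworked along those lines (the 8 KB size is an arbitrary choice; the try-with-resources wrapping is an addition):

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public static void addFileToZip(String path, String srcFile, ZipOutputStream zip) throws IOException {
    File file = new File(path, srcFile);
    if (!file.exists()) {
        return;
    }
    byte[] buf = new byte[8192];                 // fixed-size buffer, independent of the file size
    zip.putNextEntry(new ZipEntry(srcFile));
    try (BufferedInputStream in = new BufferedInputStream(new FileInputStream(file))) {
        int n;
        while ((n = in.read(buf)) != -1) {
            zip.write(buf, 0, n);                // copy in small chunks instead of one huge array
        }
    }
    zip.closeEntry();                            // no manual flushing needed; the container drains the stream
}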
