This question already has answers here:
Convert InputStream to byte array in Java
(34 answers)
Closed 4 years ago.
I am trying to convert an InputStream into a byte array so I can write it to a file and generate a PDF.
I have a File pointing to an existing PDF, and from it I get an InputStream:
File fichero_pdf = new File("C:/Users/agp2/Desktop/PDF_TRIAXE.pdf");
InputStream stream4 = new FileInputStream(fichero_pdf);
Up to here everything works; the problem appears when I try to transform this InputStream into a byte[] and write it to a new File.
I have these two methods.
To convert the stream to a byte[]:
private static byte[] getArrayFromInputStream(InputStream is) {
    BufferedReader br = null;
    StringBuilder sb = new StringBuilder();
    String line;
    try {
        br = new BufferedReader(new InputStreamReader(is));
        while ((line = br.readLine()) != null) {
            sb.append(line + "\n");
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (br != null) {
            try {
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return sb.toString().getBytes();
}
To write the byte[] to the new file:
...
File file=new File(dto.getTitulo());
InputStream stream=dto.getContenido();
byte[] array = getArrayFromInputStream(stream);
OutputStream salida=new FileOutputStream(file);
salida.write(array);
salida.close();
stream.close();
helper.addAttachment(file.getName(), file);
}
mailSender.send(message);
...
The email is sent fine, but I can't open the attached .pdf.
I also compared the contents of the new PDF with the original, and they are slightly different.
I need to create a valid PDF file from an InputStream.
You have two problems:
You are reading the bytes as text, which you should not do here. In your case you should use byte streams (FileInputStream, BufferedInputStream), not character streams (InputStreamReader, BufferedReader).
You lose data when you convert the String back to bytes here:
return sb.toString().getBytes();
I would also suggest using the Java 7 try-with-resources statement instead of try-catch-finally.
You can read the whole file to a byte array using ByteArrayOutputStream.
The sample code does the following:
getArrayFromInputStream() - reads all of the file's bytes into a byte array
writeContent() - writes the content to a new file, in my example pdf_sample2.pdf
Example:
public class ReadAllBytes {

    // as an example - write to the resources folder
    private static String DIR = "src\\main\\resources\\";

    public static void main(String[] args) throws IOException {
        try {
            byte[] fileAsBytes = getArrayFromInputStream(new FileInputStream(new File(DIR + "pdf-sample.pdf")));
            writeContent(fileAsBytes, DIR + "pdf_sample2.pdf");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private static byte[] getArrayFromInputStream(InputStream inputStream) throws IOException {
        byte[] bytes;
        byte[] buffer = new byte[1024];
        try (BufferedInputStream is = new BufferedInputStream(inputStream)) {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            int length;
            while ((length = is.read(buffer)) > -1) {
                bos.write(buffer, 0, length);
            }
            bos.flush();
            bytes = bos.toByteArray();
        }
        return bytes;
    }

    private static void writeContent(byte[] content, String fileToWriteTo) throws IOException {
        File file = new File(fileToWriteTo);
        try (BufferedOutputStream salida = new BufferedOutputStream(new FileOutputStream(file))) {
            salida.write(content);
            salida.flush();
        }
    }
}
Related
With Java:
I have a byte[] that represents a file.
How do I write this to a file (i.e. C:\myfile.pdf)?
I know it's done with InputStream, but I can't seem to work it out.
Use Apache Commons IO
FileUtils.writeByteArrayToFile(new File("pathname"), myByteArray)
Or, if you insist on making work for yourself...
try (FileOutputStream fos = new FileOutputStream("pathname")) {
fos.write(myByteArray);
// fos.close(); is not needed: since "fos" is declared in the try-with-resources header, the OutputStream is closed automatically
}
Without any libraries:
try (FileOutputStream stream = new FileOutputStream(path)) {
stream.write(bytes);
}
With Google Guava:
Files.write(bytes, new File(path));
With Apache Commons:
FileUtils.writeByteArrayToFile(new File(path), bytes);
All of these strategies require that you catch an IOException at some point too.
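For example, a minimal sketch of handling that exception around the plain FileOutputStream variant (the path and byte values here are placeholders):
import java.io.FileOutputStream;
import java.io.IOException;

public class WriteBytesDemo {
    public static void main(String[] args) {
        byte[] bytes = {37, 80, 68, 70};            // placeholder data ("%PDF")
        String path = "output.bin";                 // placeholder path
        try (FileOutputStream stream = new FileOutputStream(path)) {
            stream.write(bytes);                    // may throw IOException
        } catch (IOException e) {
            e.printStackTrace();                    // handle or log as appropriate
        }
    }
}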
Another solution using java.nio.file:
byte[] bytes = ...;
Path path = Paths.get("C:\\myfile.pdf");
Files.write(path, bytes);
Also since Java 7, one line with java.nio.file.Files:
Files.write(new File(filePath).toPath(), data);
Where data is your byte[] and filePath is a String. You can also pass multiple file open options from the StandardOpenOption enum. Add throws or surround with try/catch.
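For instance, appending to a file might look like this (a sketch; filePath and data are placeholder values):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class AppendBytesDemo {
    public static void main(String[] args) throws IOException {
        String filePath = "out.log";              // placeholder path
        byte[] data = "hello".getBytes();         // placeholder data
        // CREATE makes the file if it is missing; APPEND writes to the end instead of truncating
        Files.write(Paths.get(filePath), data,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}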
From Java 7 onward you can use the try-with-resources statement to avoid leaking resources and make your code easier to read. More on that here.
To write your byteArray to a file you would do:
try (FileOutputStream fos = new FileOutputStream("fullPathToFile")) {
fos.write(byteArray);
} catch (IOException ioe) {
ioe.printStackTrace();
}
Try an OutputStream or more specifically FileOutputStream
Basic example:
String fileName = "file.test";
BufferedOutputStream bs = null;
try {
    FileOutputStream fs = new FileOutputStream(new File(fileName));
    bs = new BufferedOutputStream(fs);
    bs.write(byte_array);
    bs.close();
    bs = null;
} catch (Exception e) {
    e.printStackTrace();
}
if (bs != null) try { bs.close(); } catch (Exception e) {}
File f = new File(fileName);
byte[] fileContent = msg.getByteSequenceContent();
Path path = Paths.get(f.getAbsolutePath());
try {
Files.write(path, fileContent);
} catch (IOException ex) {
Logger.getLogger(Agent2.class.getName()).log(Level.SEVERE, null, ex);
}
////////////////////////// 1] File to Byte [] ///////////////////
Path path = Paths.get(p);
byte[] data = null;
try {
data = Files.readAllBytes(path);
} catch (IOException ex) {
Logger.getLogger(Agent1.class.getName()).log(Level.SEVERE, null, ex);
}
/////////////////////// 2] Byte [] to File ///////////////////////////
File f = new File(fileName);
byte[] fileContent = msg.getByteSequenceContent();
Path path = Paths.get(f.getAbsolutePath());
try {
Files.write(path, fileContent);
} catch (IOException ex) {
Logger.getLogger(Agent2.class.getName()).log(Level.SEVERE, null, ex);
}
I know it's done with InputStream
Actually, you'd be writing to a file output...
This is a program that reads a range (offset and length) of a byte array, prints it using a StringBuilder, and writes another range of the byte array to a new file.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
public class ReadandWriteAByte {

    public void readandWriteBytesToFile() {
        File file = new File("count.char");      // source file (contents: abcdefghijk)
        File bfile = new File("bytefile.txt");   // new file
        byte[] b;
        FileInputStream fis = null;
        FileOutputStream fos = null;
        try {
            fis = new FileInputStream(file);
            fos = new FileOutputStream(bfile);
            b = new byte[1024];
            int i;
            StringBuilder sb = new StringBuilder();
            while ((i = fis.read(b)) != -1) {
                sb.append(new String(b, 5, 5));  // print the bytes at offset 5, length 5
                fos.write(b, 2, 5);              // write the bytes at offset 2, length 5
            }
            System.out.println(sb.toString());
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (fis != null) {
                    fis.close();                 // close the input stream
                }
                if (fos != null) {
                    fos.close();                 // close the output stream
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    public static void main(String args[]) {
        ReadandWriteAByte rb = new ReadandWriteAByte();
        rb.readandWriteBytesToFile();
    }
}
Output in console: fghij
Output in new file: cdefg
You can try Cactoos:
new LengthOf(new TeeInput(array, new File("a.txt"))).value();
More details: http://www.yegor256.com/2017/06/22/object-oriented-input-output-in-cactoos.html
I want to show the percentage progress while copying a file using binary streams, but I don't know how to do it.
Below is my code.
public static void binaryStream() throws IOException {
try {
FileInputStream inputStream = new FileInputStream(new File("Untitled.png"));
FileOutputStream outputStream = new FileOutputStream(new File("Untitled-copied.png"));
int data;
while ((data = inputStream.read()) >= 0) {
outputStream.write(data);
}
outputStream.write(data);
inputStream.close();
outputStream.close();
} catch (FileNotFoundException e) {
System.out.println("Error");
} catch (IOException e) {
System.out.println("Error");
}
}
Here is an example of how to do it, as other people mentioned in the comments.
import java.io.*;
public class BinaryStream {
public static void binaryStream(String file1, String file2) throws Exception
{
File sourceFile = new File(file1);
try(
FileInputStream inputStream = new FileInputStream(sourceFile);
FileOutputStream outputStream = new FileOutputStream(new File(file2))
) {
long lenOfFile = sourceFile.length();
long currentBytesWritten = 0;
int data;
while ((data = inputStream.read()) != -1) {
outputStream.write(data);
currentBytesWritten += 1;
System.out.printf("%2.2f%%%n",
100*((double)currentBytesWritten)/((double)lenOfFile));
}
}
}
public static void main(String args[]) throws Exception {
binaryStream("Untitled.png", "Untitled-copied.png");
}
}
Note that I've made some changes:
Removed the extra outputStream.write() call you had that was writing extra content incorrectly
Using try-with-resources idiom to close the streams you open even on exceptions
Throw the exceptions instead of catching, as you shouldn't catch them if you can't handle them
Compare to -1, as that is the documented value for end of file (end of stream)
Output is like this on my computer:
0,06%
// removed data
99,89%
99,94%
100,00%
Note also that this code will print something after each byte written, so it is highly inefficient. You might want to do that less often. On that note, you're reading and writing one byte at a time, which is also very inefficient - you might want to use read(byte[]) instead, reading in chunks. Example of that, using 256 byte array:
import java.io.*;
public class BinaryStream {
public static void binaryStream(String file1, String file2) throws Exception {
File sourceFile = new File(file1);
try(
FileInputStream inputStream = new FileInputStream(sourceFile);
FileOutputStream outputStream = new FileOutputStream(new File(file2))
) {
long lenOfFile = sourceFile.length();
long bytesWritten = 0;
int amountOfBytesRead;
byte[] bytes = new byte[256];
while ((amountOfBytesRead = inputStream.read(bytes)) != -1) {
outputStream.write(bytes, 0, amountOfBytesRead);
bytesWritten += amountOfBytesRead;
System.out.printf("%2.2f%%%n",
100*((double)bytesWritten)/((double)lenOfFile));
}
}
}
public static void main(String args[]) throws Exception {
binaryStream("Untitled.png", "Untitled-copied.png");
}
}
Output on my computer:
14,69%
29,37%
44,06%
58,75%
73,44%
88,12%
100,00%
Note that in the first example, return value of .read() is actually the byte that was read, whereas in the second example, return value of .read() is the amount of bytes read and the actual bytes go into the byte array.
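Put schematically (a fragment only, with a hypothetical InputStream named in):
int b = in.read();             // returns the next byte (0-255), or -1 at end of stream
byte[] buf = new byte[256];
int n = in.read(buf);          // returns how many bytes were stored in buf, or -1 at end of stream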
I'm trying to write compressed data to a file and then read the data back in and decompress it using GZIP. I've tried changing all of the formatting to StandardCharsets.UTF_8 and to ISO-8859-1, and neither has fixed the GZIP format error. I'm wondering if it could possibly have to do with the file I'm reading in? Here's the compression function:
public static byte[] compress(String originalFile, String compressFile) throws IOException {
// read in data from text file
// The name of the file to open.
String fileName = originalFile;
// This will reference one line at a time
String line = null;
String original = "";
try {
// FileReader reads text files in the default encoding.
FileReader fileReader =
new FileReader(fileName);
// Always wrap FileReader in BufferedReader.
BufferedReader bufferedReader =
new BufferedReader(fileReader);
while((line = bufferedReader.readLine()) != null) {
original.concat(line);
}
// Always close files.
bufferedReader.close();
}
catch(FileNotFoundException ex) {
System.out.println(
"Unable to open file '" +
fileName + "'");
}
catch(IOException ex) {
System.out.println(
"Error reading file '"
+ fileName + "'");
// Or we could just do this:
// ex.printStackTrace();
}
// create a new output stream for original string
try (ByteArrayOutputStream out = new ByteArrayOutputStream())
{
try (GZIPOutputStream gzip = new GZIPOutputStream(out))
{
gzip.write(original.getBytes(StandardCharsets.UTF_8));
}
byte[] compressed = out.toByteArray();
out.close();
String compressedFileName = compressFile;
try {
// Assume default encoding.
FileWriter fileWriter =
new FileWriter(compressedFileName);
// Always wrap FileWriter in BufferedWriter.
BufferedWriter bufferedWriter =
new BufferedWriter(fileWriter);
// Note that write() does not automatically
// append a newline character.
String compressedStr = compressed.toString();
bufferedWriter.write(compressedStr);
// Always close files.
bufferedWriter.close();
}
catch(IOException ex) {
System.out.println(
"Error writing to file '"
+ fileName + "'");
// Or we could just do this:
// ex.printStackTrace();
}
return compressed;
}
}
(I'm receiving the error on this line of the decompression function below:)
GZIPInputStream compressedByteArrayStream = new GZIPInputStream(new ByteArrayInputStream(s.getBytes(StandardCharsets.UTF_8)));
Decompression Function:
public static String decompress(String file) throws IOException {
byte[] compressed = {};
String s = "";
File fileName = new File(file);
FileInputStream fin = null;
try {
// create FileInputStream object
fin = new FileInputStream(fileName);
// Reads up to certain bytes of data from this input stream into an array of bytes.
fin.read(compressed);
//create string from byte array
s = new String(compressed);
System.out.println("File content: " + s);
}
catch (FileNotFoundException e) {
System.out.println("File not found" + e);
}
catch (IOException ioe) {
System.out.println("Exception while reading file " + ioe);
}
finally {
// close the streams using close method
try {
if (fin != null) {
fin.close();
}
}
catch (IOException ioe) {
System.out.println("Error while closing stream: " + ioe);
}
}
// create a new input string for compressed byte array
GZIPInputStream compressedByteArrayStream = new GZIPInputStream(new ByteArrayInputStream(s.getBytes(StandardCharsets.UTF_8)));
ByteArrayOutputStream byteOutput = new ByteArrayOutputStream();
byte[] buffer = new byte[8192];
// create a string builder and byte reader for the compressed byte array
BufferedReader decompressionBr = new BufferedReader(new InputStreamReader(compressedByteArrayStream, StandardCharsets.UTF_8));
StringBuilder decompressionSb = new StringBuilder();
// write data to decompressed string
String line1;
while((line1 = decompressionBr.readLine()) != null) {
decompressionSb.append(line1);
}
decompressionBr.close();
int len;
String uncompressedStr = "";
while((len = compressedByteArrayStream.read(buffer)) > 0) {
uncompressedStr = byteOutput.toString();
}
compressedByteArrayStream.close();
return uncompressedStr;
}
Here's the error message that I am receiving:
[B@7852e922
File content:
java.io.EOFException
at java.util.zip.GZIPInputStream.readUByte(GZIPInputStream.java:268)
at java.util.zip.GZIPInputStream.readUShort(GZIPInputStream.java:258)
at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:164)
at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:79)
at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:91)
at org.kingswoodoxford.Compression.decompress(Compression.java:136)
at org.kingswoodoxford.Compression.main(Compression.java:183)
Any suggestions as to how I might be able to fix this?
When you read the file you discard the newline at the end of each line.
A more efficient option, which also avoids that problem, is to copy a block (a char[]) at a time. You can also convert the text as you go rather than building up a String or a byte[].
By the way, original.concat(line); returns the concatenated string, which you are discarding.
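That is, with that approach the result would have to be reassigned, e.g. (a sketch reusing your variable names):
original = original.concat(line);
// or, more efficiently, append to a StringBuilder and call toString() once at the end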
The real problem is that you write to one stream and close a different one. This means that if any buffered data is left at the end of the file (which is highly likely), the end of the file will be truncated, and when you read it back it will complain that your file is incomplete, e.g. with an EOFException.
Here is a shorter example
public static void compress(String originalFile, String compressFile) throws IOException {
char[] buffer = new char[8192];
try (
FileReader reader = new FileReader(originalFile);
Writer writer = new OutputStreamWriter(
new GZIPOutputStream(new FileOutputStream(compressFile)));
) {
for (int len; (len = reader.read(buffer)) > 0; )
writer.write(buffer, 0, len);
}
}
In decompress, don't encode binary data as text and attempt to get the same data back; it will almost certainly be corrupted. Use a buffer and a loop as I did for compress; it shouldn't be any more complicated.
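For illustration, a decompress in the same style could look roughly like this (a sketch only, assuming the file was written by the compress method above and that the platform default charset is acceptable, as in compress):
public static String decompress(String compressedFile) throws IOException {
    StringBuilder sb = new StringBuilder();
    char[] buffer = new char[8192];
    try (Reader reader = new InputStreamReader(
            new GZIPInputStream(new FileInputStream(compressedFile)))) {
        for (int len; (len = reader.read(buffer)) > 0; )
            sb.append(buffer, 0, len);   // append the decoded characters read so far
    }
    return sb.toString();
}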
I have a file with some hex values, as well as a program that reads those values into a byte[], converts some of them, and then writes the result back out to a file.
The problem is that when I convert the byte array back to a file, some hex values are modified, and I can't find the cause.
If you see any possible mistake, please point it out.
I have a test.sav file, and the file produced by the program differs from it, even though they should be identical because no change has been made.
Here is the code:
public class Test {
public static File file;
public static String hex;
public static byte[] mext;
public static byte[] bytearray;
public static void main(String[] args) throws IOException {
file = new File("C:\\Users\\Roman\\Desktop\\test.sav");
StringBuilder sb = new StringBuilder();
FileInputStream fin = null;
try {
fin = new FileInputStream(file);
bytearray = new byte[(int)file.length()];
fin.read(bytearray);
for(byte bytev : bytearray){
sb.append(String.format("%02X", bytev));
}
System.out.println(sb);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {}
//replaceMax(); <-- I deduced that conversion is not the problem
save(); // THIS IS THE IMPORTANT PART
}
public static void save() throws IOException{
PrintWriter pw = new PrintWriter("C:\\Users\\Roman\\Desktop\\test2.sav");
pw.write("");
pw.close();
FileWriter fw = new FileWriter(new File("C:\\Users\\Roman\\Desktop\\test2.sav"));
BufferedWriter out = new BufferedWriter(fw);
out.write(new String(bytearray, "ASCII"));
out.close();
}
}
You are reading data from a binary file and then trying to write it out as a character stream. Furthermore, you're forcing it to use ASCII (a 7-bit character set) as the character encoding.
Try altering the save method to use:
FileOutputStream output = new FileOutputStream("C:\\Users\\Roman\\Desktop\\test2.sav");
try {
output.write(bytearray);
} finally {
output.close();
}
This will avoid character (re)encoding issues.
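On Java 7 and later, the same fix can also be written with try-with-resources so the stream is closed automatically (a sketch using the same path as above):
try (FileOutputStream output = new FileOutputStream("C:\\Users\\Roman\\Desktop\\test2.sav")) {
    output.write(bytearray);
}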
How can I access an Android resource using RandomAccessFile in Java?
Here is how I would like this to work (but it doesn't):
String fileIn = resources.getResourceName(resourceID);
Log.e("fileIn", fileIn);
//BufferedReader buffer = new BufferedReader(new InputStreamReader(fileIn));
RandomAccessFile buffer = null;
try {
buffer = new RandomAccessFile(fileIn, "r");
} catch (FileNotFoundException e) {
Log.e("err", ""+e);
}
Log output:
fileIn(6062): ls3d.gold.paper:raw/wwe_obj
The following exception appears in my console:
11-26 15:06:35.027: ERROR/err(6062): java.io.FileNotFoundException: /ls3d.gold.paper:raw/wwe_obj (No such file or directory)
Like you, I found that my situation is much easier if I can use an instance of RandomAccessFile. The solution I finally arrived at is to simply copy the resource into a file in the cache directory, then open that file with RandomAccessFile:
/**
 * Copies a raw resource to a cache file.
 * @return File reference to the cache file.
 * @throws IOException
 */
private File createCacheFile(Context context, int resourceId, String filename)
throws IOException {
File cacheFile = new File(context.getCacheDir(), filename);
if (cacheFile.createNewFile() == false) {
cacheFile.delete();
cacheFile.createNewFile();
}
// from: InputStream to: FileOutputStream.
InputStream inputStream = context.getResources().openRawResource(resourceId);
FileOutputStream fileOutputStream = new FileOutputStream(cacheFile);
int count;
byte[] buffer = new byte[1024 * 512];
while ((count = inputStream.read(buffer)) != -1) {
fileOutputStream.write(buffer, 0, count);
}
fileOutputStream.close();
inputStream.close();
return cacheFile;
}
You would use this method thusly:
File cacheFile = createCacheFile(context, resourceId, "delete-me-please");
RandomAccessFile randomAccessFile = new RandomAccessFile(cacheFile, "r");
// Insert useful things that people want.
randomAccessFile.close();
cacheFile.delete();
It's a FileNotFoundException. That means the file you specify at String fileIn = resources.getResourceName(resourceID); is not something that can actually be opened as a file.
The problem is that Android can only give you the InputStream of the raw resource, or a FileDescriptor, and neither is enough for the RandomAccessFile constructor.
There is an open source library called Unified I/O that you can use to achieve what you want, but I think it will just make your project 'heavier'. Perhaps you should think about whether you can avoid RandomAccessFile somehow.
I'm using this code:
public static String readContentFromResourceFile(Context context, int resourceId)
throws IOException {
StringBuffer sb = new StringBuffer();
final String NEW_LINE = System.getProperty("line.separator");
InputStream is = context.getResources().openRawResource(resourceId);
BufferedReader br = new BufferedReader(new InputStreamReader(is));
String readLine = null;
try {
while ((readLine = br.readLine()) != null) {
sb.append(readLine);
sb.append(NEW_LINE);
}
} catch (IOException e) {
throw e;
} finally {
br.close();
is.close();
}
return sb.toString();
}