I am trying to join two audio files stored in Amazon S3 using Java. Any ideas how to go about it? Should I use S3ObjectInputStream and then AudioInputStream? I tried that, but I get an unsupported file format error even though it is a valid wav file. Please advise.
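One common cause of that error, worth checking: AudioSystem.getAudioInputStream() probes the stream using mark/reset, which the raw S3 object stream generally does not support, so wrapping it in a BufferedInputStream often helps. Below is a minimal sketch of joining two same-format WAVs that way (the InputStream arguments stand in for the S3 object streams; the class and method names are illustrative, not an established API):

import java.io.BufferedInputStream;
import java.io.File;
import java.io.InputStream;
import java.io.SequenceInputStream;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class WavJoiner {
    // in1/in2 would be the S3 object content streams; dest is the joined output file
    static void joinWavs(InputStream in1, InputStream in2, File dest) throws Exception {
        // AudioSystem needs mark/reset to detect the format, hence the buffering
        AudioInputStream a = AudioSystem.getAudioInputStream(new BufferedInputStream(in1));
        AudioInputStream b = AudioSystem.getAudioInputStream(new BufferedInputStream(in2));
        // this only works when both clips share the same AudioFormat
        AudioInputStream joined = new AudioInputStream(
                new SequenceInputStream(a, b),
                a.getFormat(),
                a.getFrameLength() + b.getFrameLength());
        AudioSystem.write(joined, AudioFileFormat.Type.WAVE, dest);
    }
}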
I too am trying to do this. So far this is what I have. We use the InputStream to write to a file because, as far as I know, there is no good way to take the stream and play it directly from MediaPlayer.
Once we have the file we give the path to MediaPlayer and let it play. However, my issue is that the audio file does not download completely; I think this has to do with the buffer size. Also there is no pause or play, so you will still need a GUI for that.
Don't forget to use this in a different thread than the main one.
String bucket = getContext().getString(R.string.bucketName);
String key = "somekey";
InputStream s3ObjectInputStream = s3.getObject(bucket,key).getObjectContent();
byte[] buffer = new byte[1024];
try {
File targetFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/temp.mp4");
OutputStream outStream = new FileOutputStream(targetFile);
int bytesRead;
while ((bytesRead = s3ObjectInputStream.read(buffer)) != -1)
{
outStream.write(buffer, 0, bytesRead); // write only the bytes actually read, not the whole buffer
}
outStream.close(); // close before handing the file to MediaPlayer, or the tail may be missing
MediaPlayer mp = new MediaPlayer();
try {
mp.setDataSource(Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "temp.mp4");
mp.prepare();
mp.start();
} catch (Exception e) {
e.printStackTrace();
}
} catch (IOException e) {
e.printStackTrace();
}
I've read Reading a resource file from within jar, however I couldn't figure out how to get a File instead of an InputStream, which is what I need. This is the code:
private void duplicateDocument() {
FileOutputStream fos = null;
File file;
try {
try {
doc = new File(getClass().getResource("1.docx").toURI());
//doc = new File(getClass().getResourceAsStream("1.docx"));
} catch (URISyntaxException ex) {
Logger.getLogger(ForensicExpertWitnessReportConfigPanel.class.getName()).log(Level.SEVERE, "Failed ...", ex);
}
file = new File("C:\\Users\\student\\Documents\\myfile.docx");
fos = new FileOutputStream(file);
/* This logic will check whether the file
* exists or not. If the file is not found
* at the specified location it would create
* a new file
*/
if (!file.exists()) {
file.createNewFile();
}
/*String content cannot be directly written into
* a file. It needs to be converted into bytes
*/
byte[] bytesArray = FileUtils.readFileToByteArray(doc);
fos.write(bytesArray);
fos.flush();
System.out.println("File Written Successfully");
}
catch (IOException ioe) {
ioe.printStackTrace();
}
finally {
try {
if (fos != null)
{
fos.close();
}
}
catch (IOException ioe) {
System.out.println("Error in closing the Stream");
}
}
}
FileUtils.readFileToByteArray is the only thing I've been able to get working so far, which is why I need the value as a File rather than an InputStream.
Currently, the code above throws a java.lang.IllegalArgumentException, which is why I saw a suggestion online to use getResourceAsStream() instead; however, I haven't been able to turn that into a File.
My next option is to try Reading a resource file from within jar - buffered reader instead.
Can someone help?
I recommend java.nio.file.Files, which has many useful functions:
Path out = Paths.get("C:\\Users\\student\\Documents\\myfile.docx");
InputStream in = getClass().getResourceAsStream("1.docx");
Files.copy(in, out, StandardCopyOption.REPLACE_EXISTING);
A resource in principle is a read-only file, possibly zipped in a jar.
Hence one cannot write back to it, and it can only serve as a template for a real file, as is done here.
I got it working, using this:
InputStream in = getClass().getResourceAsStream("1.docx");
byte[] bytesArray = IOUtils.toByteArray(in);
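(IOUtils here is presumably Apache Commons IO's org.apache.commons.io.IOUtils; toByteArray reads the resource stream fully into memory, after which the array can be written out with the existing fos.write(bytesArray).)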
I am pushing binary files through a COM port to a module connected to my computer.
Right now I am pushing the files one by one, every minute, and it seems I should instead use a stream that pushes files constantly.
Here are my methods for pushing the files:
public void push2rec (File F, boolean chk){
try {
byte[] read = BinRead(F.getAbsolutePath());
SP.writeBytes(read);
if (chk) {F.delete();}
}
catch (SerialPortException ex) {msgArea.append(ex.toString() + "\n");}
}
public static byte[] BinRead(String name){
File file = new File(name);
byte[] bytes = new byte[(int) file.length()];
try {
FileInputStream inputStream = new FileInputStream(file);
// a single read() may return fewer bytes than requested, so loop until the buffer is full
int offset = 0;
while (offset < bytes.length) {
int n = inputStream.read(bytes, offset, bytes.length - offset);
if (n == -1) break;
offset += n;
}
inputStream.close();
}
catch (FileNotFoundException ex) {System.out.println(ex);}
catch (IOException ex) {System.out.println(ex);}
return bytes;
}
SP is a serial port instance.
My question is: what would be the best way to do this? Also, would it be possible to feed the same file over and over again using a stream until the next file should be pushed?
A certain file should be pushed every minute; this is very important. That means I cannot push many different files in the same minute; it should be a stream of the same file.
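One way to meet the once-a-minute requirement is a ScheduledExecutorService that keeps re-pushing whichever file is current. A minimal sketch, assuming the serial-port write from the question; FixedRatePusher, currentFile and pushToPort are hypothetical names:

import java.io.File;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

public class FixedRatePusher {
    // whichever file should currently be pushed; swapped out when a new file arrives
    private final AtomicReference<File> currentFile = new AtomicReference<>();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    // pushes the current file once a minute, repeating the same file
    // until setCurrentFile() swaps in the next one
    public void start() {
        scheduler.scheduleAtFixedRate(() -> {
            File f = currentFile.get();
            if (f != null) {
                pushToPort(f);
            }
        }, 0, 1, TimeUnit.MINUTES);
    }

    public void setCurrentFile(File f) {
        currentFile.set(f);
    }

    private void pushToPort(File f) {
        // placeholder for the question's push2rec / SP.writeBytes call
    }
}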
I am developing an Android app that requires downloading a zip file (around 1.5 MB max) with a small number of logos (png files of 20-30 KB average size) from a webserver.
I have encapsulated the process of downloading and unzipping the files into Android internal storage in an AsyncTask's doInBackground() method.
The issue I have is that the unZipIntoInternalStorage() method I have developed (pasted below) sometimes runs forever. Usually it takes around 900 ms to unzip and save the logos into internal storage, but for some unknown reason about 1 in 4 executions blocks during the loop (and stays there "forever", taking more than 2 or 3 minutes to decompress all the png files):
while ((count = zipInputStream.read(buffer)) != -1) {
outputStream.write(buffer, 0, count);
}
Edited: after some logging and debugging I found that the line slowing down the execution is zipInputStream.read(buffer) inside the while condition. Any ideas why it sometimes runs extremely fast and other times extremely slow?
Here is my complete method to unzip the downloaded files and save them into Android internal storage. I also add the method where the zipInputStream is initialized from the downloaded zip file (both methods are executed inside doInBackground()):
private void unZipIntoInternalStorage(ZipInputStream zipInputStream) {
long start = System.currentTimeMillis();
Log.i(LOG_TAG, "Unzipping started ");
try {
File iconsDir = context.getDir("icons", Context.MODE_PRIVATE);
ZipEntry zipEntry;
byte[] buffer = new byte[1024];
int count;
FileOutputStream outputStream;
while ((zipEntry = zipInputStream.getNextEntry()) != null) {
File icon = new File(iconsDir, zipEntry.getName());
outputStream = new FileOutputStream(icon);
while ((count = zipInputStream.read(buffer)) != -1) {
outputStream.write(buffer, 0, count);
}
zipInputStream.closeEntry();
outputStream.close();
}
zipInputStream.close();
} catch (Exception e) {
Log.e(LOG_TAG + " Decompress", "unzip error ", e);
e.printStackTrace();
}
Log.i(LOG_TAG, "Unzipping completed time required: " + (System.currentTimeMillis() - start) + " ms");
}
private ZipInputStream httpDownloadIconsZip(String zipUrl) {
URLConnection urlConnection;
try {
URL finalUrl = new URL(zipUrl);
urlConnection = finalUrl.openConnection();
return new ZipInputStream(urlConnection.getInputStream());
} catch (IOException e) {
Log.e(LOG_TAG, Log.getStackTraceString(e));
return null;
}
}
To clarify: after testing this method several times and debugging, the indefinite blocking always happens in the nested while loop described above, but I can't find the reason (see the edited clarification).
I have also tried this method using a BufferedOutputStream, with the same results: the nested while loop sometimes runs forever and other times unzips successfully in less than a second.
I hope I have been as clear as possible; I have spent long hours looking for possible causes of the issue in several posts about unzipping files and Java I/O methods, with no success.
Any help appreciated. Thanks
I would suspect the InputStream rather than the output to be the issue.
Try:
return new ZipInputStream(new BufferedInputStream(urlConnection.getInputStream()));
You can add an argument for setting buffer size, but default settings should be fine for your use case.
The problem is typically caused by small packet size, leading to one read forcing several IO operations.
Ideally, you also want to use a BufferedOutputStream, since a read can return much less than 1 kB while you still pay a full I/O operation for each write.
As a general rule, remember that I/O is roughly 100 times slower than anything else you could do, and it often leads to the scheduler putting your task in a wait state. So use buffered streams anywhere the stream is not already in memory (i.e. essentially always, except for in-memory streams such as the ByteArray- or String-based ones).
In your case, due to the zip format, a single read can trigger any number of smaller reads on the actual network socket, as the zip code parses and interprets headers and the contents of the compressed file.
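A minimal sketch with both ends buffered (the method shape and buffer sizes are illustrative, not the question's exact code):

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

class BufferedUnzip {
    // copies every entry of the zip stream into destDir, buffering both ends
    static void unzip(InputStream rawIn, File destDir) throws IOException {
        try (ZipInputStream zin = new ZipInputStream(new BufferedInputStream(rawIn, 8192))) {
            byte[] buffer = new byte[8192];
            ZipEntry entry;
            while ((entry = zin.getNextEntry()) != null) {
                File out = new File(destDir, entry.getName());
                if (entry.isDirectory()) {
                    out.mkdirs();
                    continue;
                }
                try (BufferedOutputStream bos =
                             new BufferedOutputStream(new FileOutputStream(out), 8192)) {
                    int count;
                    while ((count = zin.read(buffer)) != -1) {
                        bos.write(buffer, 0, count);
                    }
                } // close() flushes the remaining buffered bytes
                zin.closeEntry();
            }
        }
    }
}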
I know this error. If your zip file is damaged and you try to unzip it with ZipInputStream, you can end up in an infinite loop, because the stream has no EOF.
But if you unzip it with ZipFile, you can catch that exception!
// BUFFER (e.g. 4096) and TAG are assumed to be class constants
public static boolean unZipByFilePath(String fileName, String unZipDir) {
long startUnZipTime = System.currentTimeMillis();
try {
File f = new File(unZipDir);
if (!f.exists()) {
f.mkdirs();
}
BufferedOutputStream dest = null;
BufferedInputStream is = null;
ZipEntry entry;
ZipFile zipfile = new ZipFile(fileName);
Enumeration<? extends ZipEntry> e = zipfile.entries();
while (e.hasMoreElements()) {
entry = e.nextElement();
is = new BufferedInputStream(zipfile.getInputStream(entry));
int count = 0;
byte[] data = new byte[BUFFER];
String destFilePath = unZipDir + "/" + entry.getName();
File desFile = new File(destFilePath);
if (entry.isDirectory()) {
desFile.mkdirs();
} else if (!desFile.exists()) {
desFile.getParentFile().mkdirs();
desFile.createNewFile();
}
FileOutputStream fos = new FileOutputStream(destFilePath);
dest = new BufferedOutputStream(fos, BUFFER);
while ((count = is.read(data, 0, BUFFER)) != -1) {
dest.write(data, 0, count);
}
dest.flush();
dest.close();
is.close();
}
zipfile.close();
} catch (Exception e) {
Log.e(TAG, "unZipByFilePath failed : " + e.getMessage());
return false;
}
return true;
}
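A damaged archive then typically surfaces as a ZipException thrown by the ZipFile constructor, which the catch block above reports by returning false.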
In the app I am working on right now, part of the functionality is to write data saved on the device to a flash drive connected via a USB-OTG adapter. Specifically, the device is a rooted Motorola Xoom running 4.2.2. I can successfully write files to the drive and read them on my computer. That part works fine. However, when I try to replace existing files with new information, the resulting files come out empty. I even delete the existing files before writing new data. What's weird is that after copying the contents of my internal file to the flash drive, I log the length of the resulting file. It always matches the input file and is always a non-0 number, yet the file still shows up as blank on my computer. Can anyone help with this problem? Relevant code from the AsyncTask that I have doing this work is below.
@Override
protected Void doInBackground(Void... params) {
File[] files = context.getFilesDir().listFiles();
for (File file : files) {
if (file.isFile()) {
List<String> nameSegments = Arrays.asList(file.getName().split(
"_"));
Log.d("source file", "size: " + file.length());
String destinationPath = "/storage/usbdisk0/"
+ nameSegments.get(0) + "/" + nameSegments.get(1) + "/";
File destinationPathFile = new File(destinationPath);
if (!destinationPathFile.mkdirs()) {
destinationPathFile.mkdirs();
}
File destinationFile = new File(destinationPathFile,
nameSegments.get(2));
FileReader fr = null;
FileWriter fw = null;
try {
fr = new FileReader(file);
fw = new FileWriter(destinationFile, false);
int c = fr.read();
while (c != -1) {
fw.write(c);
c = fr.read();
}
fw.flush();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
fr.close();
fw.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
Log.d("destination file", "size: " + new File(destinationFile.getPath()).length());
}
}
return null;
}
EDIT:
Per @Simon's suggestion, I added output.flush() to my code. This does not change the result.
EDIT #2:
I did some further testing with this and found something interesting. If I go to Settings->Storage->Unmount USB Storage after writing to the flash drive but before removing it from the OTG adapter, everything works perfectly. However, failing to eject the drive after writing results in the data not being written. What's strange is that the folder structure and the file itself are created on the drive, but the file is always empty. One more thing: if I go to a file manager application and open the file prior to removing the drive, the files all exist as they should. However, even removing the device and plugging it straight back into the tablet, opening any of the files shows them as empty. I can't make heads or tails of this, and it is incredibly frustrating. Can anyone help with this?
EDIT #3:
I also changed to using FileReaders and FileWriters just to see what would happen. I don't care about efficiency at this point; I simply want file writing to work reliably. This change did not affect the issue. Updated code is posted above.
Try using the FileReader.ready() method before your FileReader.read() call, and ensure that your FileReader really has some bytes in it.
Try this; it uses a BufferedWriter for writing:
try
{
fw = new FileWriter(destinationFile);
BufferedWriter writer=new BufferedWriter(fw);
writer.append(yourText); // Append can be changed to write or something if you want to overwrite
writer.close(); // also flushes and closes the underlying FileWriter
}
catch (Exception e) {
throw new RuntimeException(e);
}
finally {
if (fw != null) {
try {
fw.flush();
fw.close();
}
catch (IOException e) {
}
}
}
I found the solution to my problem. It appears that the Android system buffers some files off of the SD card/flash drive, and then writes them to the flash drive upon eject. The following code after my file operations synchronizes the buffer with the filesystem and allows the flash drive to be immediately removed from the device without data loss. It's worth noting that this DOES require root access; it will not work on a non-rooted device.
try {
Process p = Runtime.getRuntime().exec("su");
DataOutputStream os = new DataOutputStream(p.getOutputStream());
os.writeBytes("sync; sync\n");
os.writeBytes("exit\n");
os.flush();
os.close();
p.waitFor(); // give sync a chance to complete before the drive is removed
} catch (Exception e) {
e.printStackTrace();
}
Source of my solution: Android 2.1 programatically unmount SDCard
It sounds like the filesystem is caching your changes, but not actually writing them to the flash drive until you eject it. I don't think there's a way to flush the filesystem cache, so the best solution seems to be just to unmount and then remount the flash drive.
I am trying to archive a list of files in zip format and then download it for the user on the fly...
I am facing an out-of-memory issue when downloading a zip of 1 GB size.
Please help me resolve this without increasing the JVM heap size. I would like to flush the stream periodically.
I am trying to flush periodically, but it is not working for me.
Please find my code attached below:
try{
ServletOutputStream out = response.getOutputStream();
ZipOutputStream zip = new ZipOutputStream(out);
response.setContentType("application/octet-stream");
response.addHeader("Content-Disposition",
"attachment; filename=\"ResultFiles.zip\"");
//adding multiple files to zip
ZipUtility.addFileToZip("c:\\a", "print1.txt", zip);
ZipUtility.addFileToZip("c:\\a", "print2.txt", zip);
ZipUtility.addFileToZip("c:\\a", "print3.txt", zip);
ZipUtility.addFileToZip("c:\\a", "print4.txt", zip);
zip.flush();
zip.close();
out.close();
} catch (ZipException ex) {
System.out.println("zip exception");
} catch (Exception ex) {
System.out.println("exception");
ex.printStackTrace();
}
public class ZipUtility {
static public void addFileToZip(String path, String srcFile,
ZipOutputStream zip) throws Exception {
File file = new File(path + "\\" + srcFile);
boolean exists = file.exists();
if (exists) {
long fileSize = file.length();
int buffersize = (int) fileSize;
byte[] buf = new byte[buffersize];
int len;
FileInputStream fin = new FileInputStream(path + "\\" + srcFile);
zip.putNextEntry(new ZipEntry(srcFile));
int bytesread = 0, bytesBuffered = 0;
while ((bytesread = fin.read(buf)) > -1) {
zip.write(buf, 0, bytesread);
bytesBuffered += bytesread;
if (bytesBuffered > 1024 * 1024) { //flush after 1mb
bytesBuffered = 0;
zip.flush();
}
}
zip.closeEntry();
zip.flush();
fin.close();
}
}
}
You want to use chunked transfer encoding to send a file that large. Otherwise the servlet container will try to figure out the size of the data before sending it, so it can set the Content-Length header, and since you are compressing files you don't know the size of the data you're sending. Chunked encoding allows you to send the response in smaller pieces. Don't set the content length of the stream. You might try using curl or something to see the HTTP headers in the response you're getting from the server; if it isn't chunked, you'll want to figure that out. You'll want to research how to force the servlet container to send chunked encoding; you might have to add this response header to make it do so:
response.setHeader("Transfer-Encoding", "chunked");
The other option would be to compress the file into a temporary file with File.createTempFile(), and then send the contents of that. If you compress to a temp file first, you know how big the file is and can set the content length for the servlet.
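A minimal sketch of that temp-file variant (it reuses the question's ZipUtility helper; the servlet plumbing and method name are assumed):

import java.io.*;
import java.util.zip.ZipOutputStream;
import javax.servlet.http.HttpServletResponse;

void sendZip(HttpServletResponse response) throws IOException {
    // compress into a temp file first so the total size is known up front
    File tmp = File.createTempFile("results", ".zip");
    try (ZipOutputStream zip = new ZipOutputStream(
            new BufferedOutputStream(new FileOutputStream(tmp)))) {
        ZipUtility.addFileToZip("c:\\a", "print1.txt", zip); // the question's helper
    } catch (Exception e) {
        throw new IOException(e);
    }
    response.setContentType("application/octet-stream");
    response.addHeader("Content-Disposition", "attachment; filename=\"ResultFiles.zip\"");
    response.setContentLength((int) tmp.length()); // now the size is known
    try (InputStream in = new BufferedInputStream(new FileInputStream(tmp));
         OutputStream out = response.getOutputStream()) {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
    }
    tmp.delete();
}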
I guess you are digging in the wrong direction. Try replacing the servlet output stream with a file stream and see if the issue is still there. I suspect your web container tries to collect the whole servlet output to calculate Content-Length before sending the HTTP headers.
Another thing: you are performing your close inside your try/catch block. This leaves the chance for the streams to stay open on your files if you have an exception, as well as NOT giving the stream the chance to flush to disk.
Always make sure your close is in a finally block (at least until you can get Java 7 with its try-with-resources block).
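With Java 7, a try-with-resources version of the servlet part would look roughly like this (a sketch; it reuses the question's ZipUtility helper and response object):

// out and zip are closed automatically, in reverse order, even if an exception is thrown
try (ServletOutputStream out = response.getOutputStream();
     ZipOutputStream zip = new ZipOutputStream(out)) {
    ZipUtility.addFileToZip("c:\\a", "print1.txt", zip);
} catch (Exception e) {
    throw new IOException(e);
}

Until then, the finally-based pattern looks like this: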
//build the byte buffer for transferring the data from the file
//to the zip.
final int BUFFER = 2048;
byte [] data = new byte[BUFFER];
File zipFile = new File("C:\\myZip.zip"); // fileToZip and fileName below are assumed to be provided (e.g. method parameters)
BufferedInputStream in = null;
ZipOutputStream zipOut = null;
try {
//create the out stream to send the file to and zip it.
//we want it buffered as that is more efficient.
FileOutputStream destination = new FileOutputStream(zipFile);
zipOut = new ZipOutputStream(new BufferedOutputStream(destination));
zipOut.setMethod(ZipOutputStream.DEFLATED);
//create the input stream (buffered) to read in the file so we
//can write it to the zip.
in = new BufferedInputStream(new FileInputStream(fileToZip), BUFFER);
//now "add" the file to the zip (in object speak only).
ZipEntry zipEntry = new ZipEntry(fileName);
zipOut.putNextEntry(zipEntry);
//now actually read from the file and write the file to the zip.
int count;
while((count = in.read(data, 0, BUFFER)) != -1) {
zipOut.write(data, 0, count);
}
}
catch (FileNotFoundException e) {
throw e;
}
catch (IOException e) {
throw e;
}
finally {
//whether we succeed or not, close the streams.
if(in != null) {
try {
in.close();
}
catch (IOException e) {
//note and do nothing.
e.printStackTrace();
}
}
if(zipOut != null) {
try {
zipOut.close();
}
catch (IOException e) {
//note and do nothing.
e.printStackTrace();
}
}
}
Now if you need to loop, you can just loop around the part that adds files to the zip; perhaps pass in an array of files and iterate over it. This code worked for me for zipping up a file.
Don't size your buffer based on the file size; use a fixed-size buffer.