What I'm trying to do is this: when a user hits a link with the correct parameters, the system retrieves the file from the MS SQL Server 2005 database and outputs it to the user. Specifically, I stored an audio file in a varbinary column, and I now have the ID to retrieve the audio file, but I don't know which Java calls to use to output it to the user.
My code is written in Java, and I tried to search for a similar topic on here but had no luck. Any help is appreciated.
Thanks
-Bao
There is an example of file downloading here that may be helpful:
http://www.daniweb.com/software-development/java/threads/154128
Maybe you can try to combine that with your link handler.
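For reference, here is a minimal sketch of what that combination could look like: a request handler that reads the varbinary column over JDBC and copies the bytes straight to the servlet response. The table name AudioFiles, the columns ID and AudioData, and the DataSource are made-up names for illustration, not taken from the question, and the content type depends on what was actually stored.
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

public void streamAudio(DataSource ds, int audioFileId, HttpServletResponse response) throws Exception {
    try (Connection con = ds.getConnection();
         PreparedStatement ps = con.prepareStatement("SELECT AudioData FROM AudioFiles WHERE ID = ?")) {
        ps.setInt(1, audioFileId);
        try (ResultSet rs = ps.executeQuery()) {
            if (rs.next()) {
                response.setContentType("audio/wav"); // adjust to the stored format
                try (InputStream in = rs.getBinaryStream("AudioData");
                     OutputStream out = response.getOutputStream()) {
                    byte[] buffer = new byte[8192];
                    int len;
                    // Copy the varbinary data unmodified to the response
                    while ((len = in.read(buffer)) != -1) {
                        out.write(buffer, 0, len);
                    }
                }
            }
        }
    }
}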
I figured it out. This is the solution that did the job for me. Basically I had to use the javax.sound.sampled.* API. This is what I did below:
InputParameters parameters = parts.getParameters();
int audioFileID = parameters.getIntParameter("audiofileID");

// Retrieve the audio file record by its ID
AudioFile audioFile = CallManager.getAudioFile(audioFileID);

// Wrap the raw bytes in an AudioInputStream
InputStream is = new ByteArrayInputStream(audioFile.getAudio());
AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(is);
AudioFormat format = audioInputStream.getFormat();
audioInputStream = AudioSystem.getAudioInputStream(format, audioInputStream);

// Encode the stream as a WAVE file into an in-memory buffer
ByteArrayOutputStream byteOutputStream = new ByteArrayOutputStream();
AudioSystem.write(audioInputStream, javax.sound.sampled.AudioFileFormat.Type.WAVE, byteOutputStream);

// Send the buffered WAVE bytes back to the user
ServletOutputStream out = response.getOutputStream();
out.write(byteOutputStream.toByteArray());
I have an InputStream which I would like to convert to a PDF, and save that PDF in a directory. Currently, my code is able to convert the InputStream to a PDF and the PDF does show up in the correct directory. However, when I try to open it, the file is damaged.
Here is the current code:
InputStream pAdESStream = signingServiceConnector.getDirectClient().getPAdES(this.statusReader.getStatusResponse().getpAdESUrl());
byte[] buffer = new byte[pAdESStream.available()];
pAdESStream.read(buffer);
File targetFile = new File(System.getProperty("user.dir") + "targetFile2.pdf");
OutputStream outStream = new FileOutputStream(targetFile);
outStream.write(buffer);
Originally, the InputStream was a pAdES-file (https://en.wikipedia.org/wiki/PAdES). However, it should be able to be read as just a regular PDF.
Does anyone know how to convert the InputStream to a PDF, without getting a damaged PDF as a result?
It might be a bit late, but you can use the PDFBox API (or iText).
https://www.tutorialkart.com/pdfbox/create-write-text-pdf-file-using-pdfbox/
The tutorial above covers the process. Good luck.
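As a rough sketch of that idea, assuming PDFBox 2.x: the stream from the question can be loaded into a PDDocument and saved to disk, which also avoids relying on InputStream.available(). This is only an illustration, not tested against the poster's signing service.
import java.io.File;
import java.io.InputStream;
import org.apache.pdfbox.pdmodel.PDDocument;

// Load the PDF from the InputStream and write it to disk.
// PDDocument.load() consumes the whole stream, so no manual buffering is needed.
try (InputStream pAdESStream = signingServiceConnector.getDirectClient()
        .getPAdES(this.statusReader.getStatusResponse().getpAdESUrl());
     PDDocument document = PDDocument.load(pAdESStream)) {
    document.save(new File(System.getProperty("user.dir"), "targetFile2.pdf"));
}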
I'm trying to download a file using the Android DownloadManager, access the file, and write it to a new location (in this example I'm downloading a database which is compiled server-side and needs to be written to the /database/ directory).
I've been reading up and managed to download the file, and activate the BroadcastReceiver, but at this point I get stuck.
I've obtained the ParcelFileDescriptor, but I'm having trouble converting it to a stream. I can't decide whether ParcelFileDescriptor.AutoCloseInputStream is a red herring or not; I'm fairly sure the ParcelFileDescriptor relates to a stream somehow, but I'm really struggling to work it out. Any help would be much appreciated.
Assuming you've already started the download and set up the BroadcastReceiver, the following code will do the job:
// Open the completed download and the target database file
ParcelFileDescriptor file = dMgr.openDownloadedFile(downloadId);
File dbFile = getDatabasePath(Roads.DATABASE_NAME);
InputStream fileStream = new FileInputStream(file.getFileDescriptor());
OutputStream newDatabase = new FileOutputStream(dbFile);

// Copy the downloaded file into the databases directory
byte[] buffer = new byte[1024];
int length;
while ((length = fileStream.read(buffer)) > 0)
{
    newDatabase.write(buffer, 0, length);
}

newDatabase.flush();
fileStream.close();
newDatabase.close();
If you're looking for more information on overwriting a database with your own, check this link (it's also where most of the above info comes from):
http://www.reigndesign.com/blog/using-your-own-sqlite-database-in-android-applications/
Our current project requires us to send an audio file to the server and then use the audio file for further computation.
Using the Java Sound API, I was able to capture the recording and save it as a WAV file on my system. Then, in order to pass the WAV file to the server, I am using Apache Commons HttpClient to post a request to the server (I am using the InputStreamEntity provided by Apache and sending the data chunked).
The problem appears when I try to recreate/retrieve the WAV file on the server. I understand that I would have to use the AudioSystem.write API to create the WAV file (exactly as was done on my system). However, what I observe is that although the file gets created, it does not play (I am using VLC media player to test it, FYI). I have searched Google for sample code and tried to implement it, but I am unable to play the file once it is created.
The sample code snippets below indicate the approaches I have tried:
//******************************************************************
Approach 1
try {
    InputStream is = request.getInputStream();
    FileOutputStream fs = new FileOutputStream("output123.wav");
    byte[] tempbuffer = new byte[4096];
    int bytesRead;
    while ((bytesRead = is.read(tempbuffer)) != -1)
    {
        fs.write(tempbuffer, 0, bytesRead);
    }
    is.close();
    fs.close();
    AudioInputStream inputStream = AudioSystem.getAudioInputStream(new File("output123.wav"));
    int numofbytes = inputStream.available();
    byte[] buffer = new byte[numofbytes];
    inputStream.read(buffer);
    int bytesWritten = AudioSystem.write(inputStream, AudioFileFormat.Type.WAVE, new File("outputtest.wav"));
    System.out.println("written" + bytesWritten);
Approach 2
InputStream is = request.getInputStream();
System.out.println("inputStream obtained : " + is.toString());
ByteArrayInputStream bais = null;
byte[] audioBuffer = IOUtils.toByteArray(is);
System.out.println(" is audioBuffer empty? : length = ? " + audioBuffer.length);
try {
    AudioFileFormat ai = AudioSystem.getAudioFileFormat(is);
    System.out.println("ai bytelength ? " + ai.getByteLength());
    System.out.println("ai frame length = " + ai.getFrameLength());
    Set<Map.Entry<String, Object>> audioProperties = ai.getFormat().properties().entrySet();
    System.out.println("entry set is empty ? " + audioProperties.isEmpty());
    for (Map.Entry me : audioProperties) {
        System.out.println("key = " + me.getKey());
        System.out.println("value =" + me.getValue());
    }
    bais = new ByteArrayInputStream(audioBuffer);
    AudioInputStream ais = new AudioInputStream(bais, new AudioFormat(8000, 8, 2, true, true), 2);
    AudioSystem.write(ais, AudioFileFormat.Type.WAVE, new File("testtest.wav"));
//*************************************************************************************
The AudioFormat properties all turned out to be null. Are these null values causing the problem? While creating the WAV file on the server, I tried to set the properties manually once again, but even then the WAV file would not play.
I have also tried quite a few approaches already mentioned on this site, but somehow they aren't working. I am sure I am missing something, but I am unable to pinpoint the exact problem.
It would be really helpful if you could point out how to go from the ServletInputStream to a playable WAV file.
P.S. (1) I know the code is shabby, because I have been in trial-and-error mode for quite some time now, but I will give more details on the approaches if needed.
(2) Apologies for the clumsiness; this happens to be my first post.
This (from Approach 1) is not how you copy a stream; you have the correct code to copy a stream just above it:
int numofbytes = inputStream.available();
byte[] buffer = new byte[numofbytes];
inputStream.read(buffer);
If all your server wants to do is get the data and write it to a file, then you do not need to use any of the audio API: simply treat the data as a stream of bytes.
So the part of approach 1 that is before any mention of AudioInputStream should be sufficient.
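To make that concrete, here is a minimal sketch of the server side under that advice, reusing the file name from Approach 1 (the method name is just an example): the request body is copied to disk byte for byte and no audio API is involved.
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.http.HttpServletRequest;

// Copy the uploaded request body straight to a file; the received bytes already
// form a complete WAV file, so no audio-specific handling is required.
void saveUpload(HttpServletRequest request) throws Exception {
    try (InputStream in = request.getInputStream();
         OutputStream out = new FileOutputStream("output123.wav")) {
        byte[] tempbuffer = new byte[4096];
        int bytesRead;
        while ((bytesRead = in.read(tempbuffer)) != -1) {
            out.write(tempbuffer, 0, bytesRead);
        }
    }
}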
Although the approach chosen might not be the perfect solution, due to time constraints I adopted a simpler approach. Using java.util.zip, I simply zipped the file up, sent it over to the server, and then wrote a layer where the file gets unzipped; afterwards I delete the zip files. It seems like an immature solution (because the original challenge was to send the audio file), and I now incur the overhead of zipping the files, but the file transfer happens relatively faster. Thanks for your help, guys.
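For what it's worth, a rough sketch of that zip step with java.util.zip (the file names are placeholders, not the project's actual names):
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Zip the recorded WAV file before posting it to the server
try (FileInputStream in = new FileInputStream("recording.wav");
     ZipOutputStream zipOut = new ZipOutputStream(new FileOutputStream("recording.zip"))) {
    zipOut.putNextEntry(new ZipEntry("recording.wav"));
    byte[] buffer = new byte[4096];
    int len;
    while ((len = in.read(buffer)) != -1) {
        zipOut.write(buffer, 0, len);
    }
    zipOut.closeEntry();
}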
To stream an audio file I have implemented the following code, but I am getting this exception:
javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input file
at javax.sound.sampled.AudioSystem.getAudioInputStream(AudioSystem.java:1170)
Can anyone help me, please?
try {
    // From file
    System.out.println("hhhhhhhhhhhhhhhh");
    AudioInputStream stream = AudioSystem.getAudioInputStream(new File("C:\\track1.mp3"));
    System.out.println("stream created");
    AudioFormat format = stream.getFormat();
    if (format.getEncoding() != AudioFormat.Encoding.PCM_SIGNED) {
        System.out.println("in if");
        format = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                format.getSampleRate(),
                format.getSampleSizeInBits() * 2,
                format.getChannels(),
                format.getFrameSize() * 2,
                format.getFrameRate(),
                true); // big endian
        stream = AudioSystem.getAudioInputStream(format, stream);
    }

    // Create line
    SourceDataLine.Info info = new DataLine.Info(
            SourceDataLine.class, stream.getFormat(),
            ((int) stream.getFrameLength() * format.getFrameSize()));
    SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
    line.open(stream.getFormat());
    line.start();

    // Continuously read and play chunks of audio
    int numRead = 0;
    byte[] buf = new byte[line.getBufferSize()];
    while ((numRead = stream.read(buf, 0, buf.length)) >= 0) {
        int offset = 0;
        while (offset < numRead) {
            offset += line.write(buf, offset, numRead - offset);
        }
    }
    line.drain();
    line.stop();
}
That you're doing this job in a servlet class gives me the impression that your intent is to play the mp3 file whenever someone visits your website and that the visitor should hear this mp3 file.
If true, I'm sorry to say, but you're approaching this entirely wrong. Java servlet code runs on the webserver machine, not on the webbrowser machine. This way, whenever someone visits your website, the mp3 file would only be played on the webserver machine. That is usually a physically completely different machine which runs at the other side of the network connection, and the visitor ain't ever going to hear the music.
You want to send the mp3 file raw (unmodified, byte by byte) from the webserver to the webbrowser without massaging it with some Java Audio API, and instruct the webbrowser to play this file. The easiest way is to just drop the mp3 file in public webcontent (where your HTML/JSP files also are) and use the HTML <embed> tag to embed it in your HTML/JSP file. The below example assumes the MP3 file to be in the same folder as the HTML/JSP file:
<embed src="file.mp3" autostart="true"></embed>
That's all and this is supported in practically every browser and it will show a player as well.
If the MP3 file is, by business requirement, stored outside public webcontent, then you may indeed need a servlet for this, but the servlet should do absolutely nothing more than get an InputStream of it in some way and write it unmodified to the OutputStream of the HttpServletResponse, the usual Java IO way. You only need to set the HTTP Content-Type header to audio/mpeg beforehand and, if possible, also the HTTP Content-Length header. Then point the src to the servlet's URL instead.
<embed src="mp3servlet" autostart="true"></embed>
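A minimal sketch of such a servlet, assuming Servlet 3.0+ and an MP3 stored at a hardcoded path on the server; the path, class name, and URL mapping are illustrative only:
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/mp3servlet")
public class Mp3Servlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        File mp3 = new File("/path/to/file.mp3"); // stored outside public webcontent
        response.setContentType("audio/mpeg");
        response.setContentLength((int) mp3.length());

        // Copy the file to the response unmodified, the usual Java IO way
        try (InputStream in = new FileInputStream(mp3);
             OutputStream out = response.getOutputStream()) {
            byte[] buffer = new byte[8192];
            int length;
            while ((length = in.read(buffer)) != -1) {
                out.write(buffer, 0, length);
            }
        }
    }
}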
The default Java AudioInputStream does not support MP3 files. You have to plug in MP3SPI to let it decode MP3.
Also, what do you mean by streaming? This code will play the audio file, not stream it as in Internet radio streaming.
I need to be able to play ALAW files in a Java (desktop) application.
I've tried to follow the example at:
How to play audio in Java Application
I've created a File object from the ALAW file (which exists, according to a check) and sent that File to a method where the first thing that happens is this:
AudioInputStream ais = AudioSystem.getAudioInputStream(file);
But this is where the execution stops, since I get this exception:
javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input file
I see that there is a way to convert ALAW files if the check (ais.getFormat().getEncoding() == AudioFormat.Encoding.ALAW) is true, but how can I get there if it's not even possible to create the AudioInputStream?
Anyone who has worked with ALAW files and has an idea of what I should do?
Is there a way to convert the ALAW files programmatically before calling AudioSystem.getAudioInputStream(file)?
I really need to make this work!
Get the existing file format from your AudioInputStream:
filepath is a String with the path to your file, which you can obtain, for example, like this:
String filename="x.y";
File file = new File(filename);
String filepath=file.getCanonicalPath();
Then the main conversion is done by:
AudioInputStream inputStream = AudioSystem.getAudioInputStream(new File(filepath));
AudioFormat format = inputStream.getFormat();
AudioInputStream convertedInputStream;
After that, add a condition that checks whether your file's encoding is A-law or u-law and converts it to PCM, which can be played by the sound card:
if ((format.getEncoding() == AudioFormat.Encoding.ULAW)
        || (format.getEncoding() == AudioFormat.Encoding.ALAW)) {
    AudioFormat tmp = new AudioFormat(
            AudioFormat.Encoding.PCM_SIGNED,
            format.getSampleRate(),
            format.getSampleSizeInBits() * 2,
            format.getChannels(),
            format.getFrameSize() * 2,
            format.getFrameRate(), true);
    convertedInputStream = AudioSystem.getAudioInputStream(tmp, inputStream);
    format = tmp;
}
This code will convert the ALAW/ULAW format of your AudioInputStream to PCM_SIGNED.
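As a follow-up, here is a minimal sketch of actually playing the converted stream through a SourceDataLine. The playback part is not in the answer above, just the standard javax.sound.sampled approach, and it assumes the A-law/u-law branch ran so that convertedInputStream and format hold the PCM data.
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;

// Open a line matching the converted PCM format and feed it the stream
DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
line.open(format);
line.start();

byte[] buffer = new byte[4096];
int read;
while ((read = convertedInputStream.read(buffer)) != -1) {
    line.write(buffer, 0, read);
}
line.drain();
line.close();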
JMF (the Java Media Framework) will help in this case:
http://www2.sys-con.com/itsg/virtualcd/java/archives/0503/decarmo/index.html