How to play a WAV file in Java 1.4

As per the title: how can I play a sound file repeatedly in Java 1.4?

If you just want to play the WAV file then 'org.life.java''s answer is correct. For other format types you can use JMF ( http://www.oracle.com/technetwork/java/javase/tech/index-jsp-140239.html ).
Note: JMF is obsolete now, but it will still work with JDK 1.4.
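For reference, a minimal JMF playback sketch could look like this (assuming the JMF jars are on the classpath; the file path is only an example):

import java.io.File;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;

public class JmfPlay {
    public static void main(String[] args) throws Exception {
        // createRealizedPlayer blocks until the player is ready to start
        Player player = Manager.createRealizedPlayer(
                new MediaLocator(new File("C:/sound1.wav").toURI().toURL()));
        player.start();
        // keep the JVM alive long enough to hear something
        Thread.sleep(5000);
        player.close();
    }
}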

import java.net.URL;
import javax.sound.sampled.*;

public class LoopSound {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://pscode.org/media/leftright.wav");
        Clip clip = AudioSystem.getClip();
        AudioInputStream ais = AudioSystem.getAudioInputStream(url);
        clip.open(ais);
        // loop(0) would play the clip only once; loop continuously for repeated playback
        clip.loop(Clip.LOOP_CONTINUOUSLY);
        javax.swing.JOptionPane.showMessageDialog(null, "Close to exit!");
    }
}

This will work in JDK 1.4 (tested on Windows XP with JDK 1.4.2_06).
The other answer fails because, as correctly stated in the comments, AudioSystem.getClip() does not exist in JDK 1.4. Below is a complete source (in the form of a main method, but adaptable to anything else) that uses a DataLine and plays in a separate Thread so the caller is not blocked:
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;

public class AudioTest {
    public static void main(String[] args) throws Exception {
        AudioInputStream ais = AudioSystem.getAudioInputStream(new File("C:/sound1.wav"));
        AudioFormat format = ais.getFormat();
        DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);

        class PlayThread extends Thread {
            private AudioInputStream ais;
            private AudioFormat format;
            private SourceDataLine sourceDataLine;
            byte[] tempBuffer = new byte[10000];

            public PlayThread(AudioInputStream ais, SourceDataLine sourceDataLine, AudioFormat format) {
                this.ais = ais;
                this.sourceDataLine = sourceDataLine;
                this.format = format;
            }

            public void run() {
                try {
                    sourceDataLine.open(this.format);
                    sourceDataLine.start();
                    int cnt;
                    while ((cnt = this.ais.read(tempBuffer, 0, tempBuffer.length)) != -1) {
                        if (cnt > 0) {
                            sourceDataLine.write(tempBuffer, 0, cnt);
                        }
                    }
                    sourceDataLine.drain();
                    sourceDataLine.close();
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }

        new PlayThread(ais, sourceDataLine, format).start();
    }
}
Both the question and the answers are really old, but I just had to make this work on a fanless mini PC that only runs Windows XP, so... ¯\_(ツ)_/¯


AudioPlayer is internal proprietary API and may be removed in a future release

import javax.swing.*;
import sun.audio.*;
import java.awt.event.*;
import java.io.*;

public class Sound {
    public static void main(String[] args) {
        JFrame frame = new JFrame();
        frame.setSize(200, 200);
        JButton button = new JButton("Click me");
        frame.add(button);
        button.addActionListener(new AL());
        frame.show(true);
    }

    public static class AL implements ActionListener {
        public final void actionPerformed(ActionEvent e) {
            music();
        }
    }

    public static void music() {
        AudioPlayer MGP = AudioPlayer.player;
        AudioStream BGM;
        AudioData MD;
        ContinuousAudioDataStream loop = null;
        try {
            BGM = new AudioStream(new FileInputStream("backgroundmusic.wav"));
            MD = BGM.getData();
            loop = new ContinuousAudioDataStream(MD);
        } catch (IOException error) {
            error.printStackTrace();
        }
        MGP.start(loop);
    }
}
When I run this code, I see the following warnings. Why, and how do I avoid them?
Sound.java:22: warning: AudioPlayer is internal proprietary API and may be removed in a future release
Sound.java:23: warning: AudioStream is internal proprietary API and may be removed in a future release
Sound.java:24: warning: AudioData is internal proprietary API and may be removed in a future release
Sound.java:25: warning: ContinuousAudioDataStream is internal proprietary API and may be removed in a future release
Sound.java:27: warning: AudioStream is internal proprietary API and may be removed in a future release
Sound.java:29: warning: ContinuousAudioDataStream is internal proprietary API and may be removed in a future release
Note: Sound.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
7 warnings
First, you should be using the classes in the javax.sound.sampled package to avoid the compiler warnings. Second, when using these classes, you'll also need to drive them from a thread in the background.
Here's one I wrote a while ago. There are better ways to do it now than to sleep in a loop, but it works for quick and easy WAV files, and you can adapt the code if you need to. A clever implementation might even be able to drive several audio files from the same thread.
Plays an audio file:
import java.io.InputStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;

public class SoundThread extends Thread {
    private final String resource;

    public static void play(String resource) {
        Thread t = new SoundThread(resource);
        t.setDaemon(true);
        t.start();
    }

    public SoundThread(String resource) {
        this.resource = resource;
    }

    @Override
    public void run() {
        Clip clip = null;
        try {
            InputStream in = SoundThread.class.getClassLoader().getResourceAsStream(resource);
            if (in != null) {
                AudioInputStream stream = AudioSystem.getAudioInputStream(in);
                AudioFormat format = stream.getFormat();
                DataLine.Info info = new DataLine.Info(Clip.class, format);
                clip = (Clip) AudioSystem.getLine(info);
                clip.open(stream);
                clip.loop(0);
                do {
                    try {
                        Thread.sleep(100);
                    } catch (InterruptedException iex) {
                        // bad form on my part here, should do something
                    }
                } while (clip.isRunning());
            }
        } catch (Exception e) {
            e.printStackTrace(System.out);
        } finally {
            try {
                if (clip != null) {
                    clip.close();
                }
            } catch (Exception x) {
                x.printStackTrace(System.out);
            }
        }
    }
}
Example of how to call it:
SoundThread.play("resources/cashregister6.wav");
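For reference, one of the "better ways" mentioned above could be to block on a LineListener instead of sleeping in a loop; here is a sketch (it uses java.util.concurrent, so Java 5+, and AudioSystem.getClip() rather than the JDK 1.4-compatible DataLine.Info route):

import java.util.concurrent.CountDownLatch;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.LineEvent;
import javax.sound.sampled.LineListener;

public class ClipPlayer {
    // Plays the clip and blocks until playback finishes, without polling.
    public static void playAndWait(AudioInputStream stream) throws Exception {
        final CountDownLatch done = new CountDownLatch(1);
        Clip clip = AudioSystem.getClip();
        clip.addLineListener(new LineListener() {
            public void update(LineEvent event) {
                if (event.getType() == LineEvent.Type.STOP) {
                    done.countDown();   // released when the clip stops
                }
            }
        });
        clip.open(stream);
        clip.start();
        done.await();   // wait for the STOP event instead of sleeping in a loop
        clip.close();
    }
}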

Checking the level of audio playback in a Mixer's Line?

I'm trying to figure out if sound of any kind is playing in Windows (by any application). If something is making a noise somewhere, I want to know about it!
After following the docs, I've found how to get a list of mixers on the machine, as well as lines for those mixers -- which, if I understand correctly, are what is used for input/output of the mixer.
However, the problem I'm having is that I don't know how to get the data I need from the line.
The only interface I see that has a notion of volume level is DataLine. The problem with that is that I can't figure out what returns an object that implements the DataLine interface.
Enumerating all of the mixers and lines:
public static void printMixers() {
    Mixer.Info[] mixers = AudioSystem.getMixerInfo();
    for (Mixer.Info mixerInfo : mixers) {
        Mixer mixer = AudioSystem.getMixer(mixerInfo);
        try {
            mixer.open();
            Line.Info[] lines = mixer.getSourceLineInfo();
            for (Line.Info linfo : lines) {
                System.out.println(linfo);
            }
        } catch (LineUnavailableException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
That code enumerates and displays all of the audio devices on my machine. From that, shouldn't one of those Lines contain some kind of playback level data?
Oh, you wish to find the volume? Well, not all hardware supports it, but here is how you get the DataLine.
public static SourceDataLine getSourceDataLine(Line.Info lineInfo) {
    try {
        return (SourceDataLine) AudioSystem.getLine(lineInfo);
    } catch (Exception ex) {
        ex.printStackTrace();
        return null;
    }
}
Then just call SourceDataLine.getLevel() to get the volume. I hope this helps.
NB: If the sound is originating from outside the JVM or not via the JavaSound API, this method will not detect the sound as the JVM does not have access to the OS equivalent of the SourceDataLine.
UPDATE: Upon further research, getLevel() is not implemented on most systems, so I have manually implemented the method based on this forum discussion: https://community.oracle.com/message/5391003
Here are the classes:
public class Main {
    public static void main(String[] args) {
        MicrophoneAnalyzer mic = new MicrophoneAnalyzer(FLACFileWriter.FLAC);
        System.out.println("HELLO");
        mic.open();
        while (true) {
            byte[] buffer = new byte[mic.getTargetDataLine().getFormat().getFrameSize()];
            mic.getTargetDataLine().read(buffer, 0, buffer.length);
            try {
                System.out.println(getLevel(mic.getAudioFormat(), buffer));
            } catch (Exception e) {
                System.out.println("ERROR");
                e.printStackTrace();
            }
        }
    }

    public static double getLevel(AudioFormat af, byte[] chunk) throws IOException {
        PCMSigned8Bit converter = new PCMSigned8Bit(af);
        if (chunk.length != converter.getRequiredChunkByteSize())
            return -1;
        AudioInputStream ais = converter.convert(chunk);
        ais.read(chunk, 0, chunk.length);
        long lSum = 0;
        for (int i = 0; i < chunk.length; i++)
            lSum = lSum + chunk[i];
        // use floating-point division so the average is not truncated
        double dAvg = (double) lSum / chunk.length;
        double sumMeanSquare = 0d;
        for (int j = 0; j < chunk.length; j++)
            sumMeanSquare = sumMeanSquare + Math.pow(chunk[j] - dAvg, 2d);
        double averageMeanSquare = sumMeanSquare / chunk.length;
        return Math.pow(averageMeanSquare, 0.5d);
    }
}
The method I used only works on 8-bit PCM, so we have to convert the encoding to that using these two classes. Here is the general abstract converter class:
import java.io.ByteArrayInputStream;
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

abstract class AbstractSignedLevelConverter {
    private AudioFormat srcf;

    public AbstractSignedLevelConverter(AudioFormat sourceFormat) {
        srcf = sourceFormat;
    }

    protected AudioInputStream convert(byte[] chunk) {
        AudioInputStream ais = null;
        if (AudioSystem.isConversionSupported(AudioFormat.Encoding.PCM_SIGNED, srcf)) {
            if (srcf.getEncoding() != AudioFormat.Encoding.PCM_SIGNED)
                ais = AudioSystem.getAudioInputStream(
                        AudioFormat.Encoding.PCM_SIGNED,
                        new AudioInputStream(new ByteArrayInputStream(chunk),
                                srcf, chunk.length * srcf.getFrameSize()));
            else
                ais = new AudioInputStream(new ByteArrayInputStream(chunk),
                        srcf, chunk.length * srcf.getFrameSize());
        }
        return ais;
    }

    abstract public double convertToLevel(byte[] chunk) throws IOException;

    public int getRequiredChunkByteSize() {
        return srcf.getFrameSize();
    }
}
And here is the one for 8-bit PCM:
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;

public class PCMSigned8Bit extends AbstractSignedLevelConverter {
    PCMSigned8Bit(AudioFormat sourceFormat) {
        super(sourceFormat);
    }

    public double convertToLevel(byte[] chunk) throws IOException {
        if (chunk.length != getRequiredChunkByteSize())
            return -1;
        AudioInputStream ais = convert(chunk);
        ais.read(chunk, 0, chunk.length);
        return (double) chunk[0];
    }
}
This is for TargetDataLine, which may not work in your use case, but you could build a wrapper around SourceDataLine and use this to properly implement these methods. Hope this helps.
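For example, here is a rough sketch of the kind of wrapper hinted at above (a hypothetical class for illustration, assuming 16-bit signed little-endian PCM) that measures an RMS level from every buffer on its way to the real SourceDataLine:

import javax.sound.sampled.SourceDataLine;

// Hypothetical wrapper: assumes 16-bit signed little-endian PCM frames.
public class LevelMeteringLine {
    private final SourceDataLine delegate;
    private volatile double lastRms;   // most recent level, roughly 0.0 .. 32768.0

    public LevelMeteringLine(SourceDataLine delegate) {
        this.delegate = delegate;
    }

    // Write audio to the underlying line, measuring the level on the way through.
    public int write(byte[] b, int off, int len) {
        long sumSquares = 0;
        int samples = 0;
        for (int i = off; i + 1 < off + len; i += 2) {
            // assemble a 16-bit little-endian sample from two bytes
            int sample = (b[i + 1] << 8) | (b[i] & 0xFF);
            sumSquares += (long) sample * sample;
            samples++;
        }
        if (samples > 0) {
            lastRms = Math.sqrt((double) sumSquares / samples);
        }
        return delegate.write(b, off, len);
    }

    public double getLevel() {
        return lastRms;
    }
}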

How to read an audio file? Which method should I use?

I have a panel with two buttons. When I click on button 1, I'd simply like to play an audio file (a .WAV in this case). Then, when I click on button 2, I'd like to stop the music.
I did some research, but I'm a little confused by the different methods.
Which one is best in my case? Can someone please explain the difference between AudioClip, Java Sound and the Java Media Framework?
I've also tried an example, but it contains errors.
Here is my Main.class :
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class Main {
    public static void main(String[] args) {
        SoundPlayer player = new SoundPlayer("C:/Documents and Settings/All Users/Documents/Ma musique/Échantillons de musique/Symphonie n° 9 de Beethoven (scherzo).wma");
        InputStream stream = new ByteArrayInputStream(player.getSamples());
        player.play(stream);
    }
}
Here is my SoundPlayer.class :
import java.io.DataInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import javax.sound.sampled.*;

public class SoundPlayer {
    private AudioFormat format;
    private byte[] samples;

    /**
     * @param filename the path to the sound file (URL or absolute path)
     */
    public SoundPlayer(String filename) {
        try {
            AudioInputStream stream = AudioSystem.getAudioInputStream(new File(filename));
            format = stream.getFormat();
            samples = getSamples(stream);
        } catch (UnsupportedAudioFileException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public byte[] getSamples() {
        return samples;
    }

    public byte[] getSamples(AudioInputStream stream) {
        int length = (int) (stream.getFrameLength() * format.getFrameSize());
        byte[] samples = new byte[length];
        DataInputStream in = new DataInputStream(stream);
        try {
            in.readFully(samples);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return samples;
    }

    public void play(InputStream source) {
        int bufferSize = format.getFrameSize() * Math.round(format.getSampleRate() / 10);
        byte[] buffer = new byte[bufferSize];
        SourceDataLine line;
        try {
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
            line = (SourceDataLine) AudioSystem.getLine(info);
            line.open(format, bufferSize);
        } catch (LineUnavailableException e) {
            e.printStackTrace();
            return;
        }
        line.start();
        try {
            int numBytesRead = 0;
            while (numBytesRead != -1) {
                numBytesRead = source.read(buffer, 0, buffer.length);
                if (numBytesRead != -1)
                    line.write(buffer, 0, numBytesRead);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        line.drain();
        line.close();
    }
}
Console output:
javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input file
at javax.sound.sampled.AudioSystem.getAudioInputStream(Unknown Source)
at SoundPlayer.<init>(SoundPlayer.java:19)
at Main.main(Main.java:8)
Exception in thread "main" java.lang.NullPointerException
at java.io.ByteArrayInputStream.<init>(Unknown Source)
at Main.main(Main.java:9)
Thanks a lot in advance!
That exception will remain: *.wma files are not supported by Java Sound out of the box.
The simplest solution would be to use *.wav files or other supported formats.
You can get more info on:
https://stackoverflow.com/tags/javasound/info
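If you are unsure whether a given file can be read at all, a small sketch like this asks AudioSystem for the file format and catches UnsupportedAudioFileException:

import java.io.File;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.UnsupportedAudioFileException;

public class FormatCheck {
    public static void main(String[] args) throws Exception {
        File file = new File(args[0]);
        try {
            // Throws UnsupportedAudioFileException for formats such as WMA
            System.out.println(AudioSystem.getAudioFileFormat(file));
        } catch (UnsupportedAudioFileException e) {
            System.out.println("No installed reader supports: " + file);
        }
    }
}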
SoundPlayer player = new SoundPlayer("C:/Documents and Settings/All Users/" +
"Documents/Ma musique/Échantillons de musique/" +
"Symphonie n° 9 de Beethoven (scherzo).wma")
Ah, WMA. Great format, but Java (Standard Edition) does not provide a Service Provider Interface that supports it.
You will either need to supply an SPI to allow Java Sound to support it, or use a different API. I don't know of any APIs that provide support for WMA. Can you encode it in a different format?
See the Java Sound info. page for a way to support MP3, but it requires the MP3 SPI from JMF.
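With such an SPI on the classpath, the usual pattern is to let AudioSystem decode the compressed stream to PCM before opening a line; a sketch, assuming the MP3 SPI jars are actually installed:

import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class Mp3Decode {
    // Returns a PCM stream that a Clip or SourceDataLine can play directly.
    public static AudioInputStream openAsPcm(File mp3) throws Exception {
        AudioInputStream encoded = AudioSystem.getAudioInputStream(mp3);
        AudioFormat src = encoded.getFormat();
        // Ask the installed SPI to decode the MP3 frames to signed 16-bit PCM
        AudioFormat pcm = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                src.getSampleRate(), 16, src.getChannels(),
                src.getChannels() * 2, src.getSampleRate(), false);
        return AudioSystem.getAudioInputStream(pcm, encoded);
    }
}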
Write down the full path of your music file and it will work.
I've found a solution to my problem.
In my case, using the JavaZoom library works well.
Here is a sample which simply plays an audio file on launch (no graphical part):
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import javazoom.jl.player.advanced.AdvancedPlayer;
import javazoom.jl.player.advanced.PlaybackListener;

public class Sound {
    private boolean isPlaying = false;
    private AdvancedPlayer player = null;

    public Sound(String path) throws Exception {
        InputStream in = new BufferedInputStream(new FileInputStream(new File(path)));
        player = new AdvancedPlayer(in);
    }

    public Sound(String path, PlaybackListener listener) throws Exception {
        InputStream in = new BufferedInputStream(new FileInputStream(new File(path)));
        player = new AdvancedPlayer(in);
        player.setPlayBackListener(listener);
    }

    public void play() throws Exception {
        if (player != null) {
            isPlaying = true;
            player.play();
        }
    }

    public void play(int begin, int end) throws Exception {
        if (player != null) {
            isPlaying = true;
            player.play(begin, end);
        }
    }

    public void stop() throws Exception {
        if (player != null) {
            player.stop();
            isPlaying = false;
        }
    }

    public boolean isPlaying() {
        return isPlaying;
    }

    public static void main(String[] args) {
        System.out.println("playing sound");
        try {
            Sound sound = new Sound("C:/Documents and Settings/cngo/Bureau/Stage-Save/TCPIP_AndroidJava/TCPIP_V6_Sound/OpeningSuite.mp3");
            System.out.println("playing : " + sound.isPlaying());
            sound.play();
            System.out.println("playing : " + sound.isPlaying());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Thanks to @murtaza.webdev for his answers!

Convert a class file to an applet

I have a Java class which runs perfectly, but now I need to run it as a web application, so I need to convert the class to an applet. How can I do that?
I know a little bit about applets, like the life cycle
init()
start()
paint()
stop()
destroy()
and that an applet is run with
applet code = "LifeTest.class"
So can anyone help me convert this class to an applet, and if that is not possible, suggest a substitute?
import java.io.ByteArrayInputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.SourceDataLine;

class Server {
    AudioInputStream audioInputStream;
    static AudioInputStream ais;
    static AudioFormat format;
    static boolean status = true;
    static int port = 50005;
    static int sampleRate = 8000;

    public static void main(String args[]) throws Exception {
        DatagramSocket serverSocket = new DatagramSocket(50005);
        /**
         * Formula for lag = (byte_size/sample_rate)*2
         * Byte size 9728 will produce ~ 0.45 seconds of lag. Voice slightly broken.
         * Byte size 1400 will produce ~ 0.06 seconds of lag. Voice extremely broken.
         * Byte size 4000 will produce ~ 0.18 seconds of lag. Voice slightly more broken than 9728.
         */
        byte[] receiveData = new byte[5000];
        format = new AudioFormat(sampleRate, 16, 1, true, false);
        while (status == true) {
            DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
            serverSocket.receive(receivePacket);
            ByteArrayInputStream baiss = new ByteArrayInputStream(receivePacket.getData());
            ais = new AudioInputStream(baiss, format, receivePacket.getLength());
            toSpeaker(receivePacket.getData());
        }
    }

    public static void toSpeaker(byte soundbytes[]) {
        try {
            DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
            SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
            sourceDataLine.open(format);
            FloatControl volumeControl = (FloatControl) sourceDataLine.getControl(FloatControl.Type.MASTER_GAIN);
            volumeControl.setValue(6.0206f);
            // the original code called open() and start() a second time here; once is enough
            sourceDataLine.start();
            System.out.println("format? :" + sourceDataLine.getFormat());
            sourceDataLine.write(soundbytes, 0, soundbytes.length);
            System.out.println(soundbytes.toString());
            sourceDataLine.drain();
            sourceDataLine.close();
        } catch (Exception e) {
            System.out.println("Not working in speakers...");
            e.printStackTrace();
        }
    }
}
You need to extend Applet. Let me give you sample code for it.
import java.applet.Applet;
import java.awt.Graphics;

public class HelloWorld extends Applet {
    public void paint(Graphics g) {
        g.drawString("Hello world!", 50, 25);
    }
}
Create a MANIFEST.MF file using some text editor and place it in the same directory as your .java file. Its content should look like this:
Manifest-Version: 1.0
Permissions: all-permissions
Application-Name: Name of your application
Now you need to compile your code and attach the MANIFEST.MF file to the jar:
javac HelloWorld.java
jar cvfm HelloWorld.jar MANIFEST.MF *.class
Now create an .html file and place an <applet> tag in it:
<applet name="HelloWorld" code="HelloWorld.class"
archive="HelloWorld.jar" width="100" height="100">
</applet>
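To adapt the Server class from the question to that skeleton, one possible approach (a sketch only, not tested, and subject to the applet sandbox: the jar must be signed as described above for socket and audio permissions) is to move the receive loop into a background thread driven by the applet life-cycle methods:

import java.applet.Applet;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class ServerApplet extends Applet implements Runnable {
    private volatile boolean running;
    private Thread worker;

    public void start() {
        running = true;
        worker = new Thread(this);
        worker.start();            // don't block the browser's UI thread
    }

    public void stop() {
        running = false;           // lets run() fall out of its loop
    }

    public void run() {
        try {
            DatagramSocket socket = new DatagramSocket(50005);
            byte[] receiveData = new byte[5000];
            while (running) {
                DatagramPacket packet = new DatagramPacket(receiveData, receiveData.length);
                socket.receive(packet);
                // hand the received bytes to the existing toSpeaker(...) logic here
            }
            socket.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}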

Audio recorder problem in java

I have a problem while recording audio. I created a servlet and modified the Java Sound API demo code to some extent, and I can now record audio. The problem is that when I play the audio back, the total time shows as 645.45 or something like that, even though I recorded for only a couple of minutes. One more problem: the audio is getting saved in the Eclipse directory instead of the project directory.
This is the servlet code.
package com;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.TargetDataLine;

public class SoundRecorder extends HttpServlet {
    private static final long serialVersionUID = 1L;
    static protected boolean running;
    static ByteArrayOutputStream out;
    double fileName = Math.random();
    //strFilename = nowLong.toString();

    public SoundRecorder() {
        System.out.println("Filename will be..." + fileName + ".wav");
    }

    public void init() {
    }

    public void destroy() {
    }

    public void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        System.out.println("call received..");
        String method = request.getParameter("method");
        System.out.println(method);
        if ("record".equalsIgnoreCase(method)) {
            captureAudio(true);
        } else if ("stop".equalsIgnoreCase(method)) {
            captureAudio(false);
        } else if ("play".equalsIgnoreCase(method)) {
            System.out.println("yet to write");
            playAudio();
        }
    }

    public void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        System.out.println("call received..");
        String method = request.getParameter("method");
        System.out.println(method);
        doGet(request, response);
    }

    private void captureAudio(boolean capturing) {
        File outputFile = new File(fileName + ".wav");
        AudioFormat audioFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED, 44100.0F, 16, 2, 4, 44100.0F, false);
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, audioFormat);
        TargetDataLine targetDataLine = null;
        try {
            targetDataLine = (TargetDataLine) AudioSystem.getLine(info);
            targetDataLine.open(audioFormat);
        } catch (LineUnavailableException e) {
            System.out.println("unable to get a recording line");
            e.printStackTrace();
            System.exit(1);
        }
        AudioFileFormat.Type targetType = AudioFileFormat.Type.WAVE;
        final Recorder recorder = new Recorder(targetDataLine, targetType, outputFile);
        System.out.println("Recording...");
        if (capturing) {
            recorder.start();
        } else {
            recorder.stopRecording();
        }
    }

    private void playAudio() {
        try {
            File file = new File(fileName + ".wav");
            AudioInputStream stream = AudioSystem.getAudioInputStream(file);
            AudioFormat format = stream.getFormat();
            DataLine.Info info = new DataLine.Info(Clip.class, stream.getFormat());
            Clip clip = (Clip) AudioSystem.getLine(info);
            clip.open(stream);
            clip.start();
        } catch (Exception e) {
            System.err.println("Line unavailable: " + e);
            System.exit(-4);
        }
    }
}
And this is the recorder class
import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.TargetDataLine;

public class Recorder extends Thread {
    private TargetDataLine m_line;
    private AudioFileFormat.Type m_targetType;
    private AudioInputStream m_audioInputStream;
    private File m_outputFile;

    public Recorder(TargetDataLine line, AudioFileFormat.Type targetType, File file) {
        m_line = line;
        m_audioInputStream = new AudioInputStream(line);
        m_targetType = targetType;
        m_outputFile = file;
    }

    /** Starts the recording.
        To accomplish this, (i) the line is started and (ii) the
        thread is started.
    */
    public void start() {
        m_line.start();
        super.start();
    }

    /** Stops the recording. */
    public void stopRecording() {
        m_line.stop();
        m_line.close();
    }

    /** Main working method. */
    public void run() {
        try {
            AudioSystem.write(m_audioInputStream, m_targetType, m_outputFile);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private static void closeProgram() {
        System.out.println("Program closing.....");
        System.exit(1);
    }

    private static void out(String strMessage) {
        System.out.println(strMessage);
    }
}
When developing with servlets, you need to realize that there's only one servlet instance throughout the whole webapp's lifetime, from startup until shutdown. So, the HTTP requests from all visitors, all sessions, all browser windows/tabs, etc will all share the same servlet instance. Also, when you make a variable static, it will be shared among all instances of the same class (which is not really relevant here since there's only one servlet instance anyway).
In other words, those variables which you've declared in the servlet are not threadsafe:
static protected boolean running;
static ByteArrayOutputStream out;
double fileName = Math.random();
There's only one of each of them and they are used by all visitors simultaneously. For the first two variables, which are continuously modified, this will lead to major threadsafety problems, and for the third variable it means that all visitors record to the very same file. You need to declare them inside the doGet() block. You'd like to store the recording in the session with a unique, request-based token as key and then pass that key along in subsequent requests.
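A rough sketch of that idea (the names here, such as recordingsDir, are hypothetical, and java.util.UUID assumes Java 5+):

protected void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    javax.servlet.http.HttpSession session = request.getSession();
    String token = request.getParameter("token");
    if (token == null) {
        // first request of a new recording: mint a token and a per-visitor file
        token = java.util.UUID.randomUUID().toString();
        session.setAttribute("file." + token, new File(recordingsDir, token + ".wav"));
    }
    File outputFile = (File) session.getAttribute("file." + token);
    // ... start or stop the Recorder for outputFile, then send the token back to the client
}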
As to the problem of the file being saved at an unexpected location: when you use relative paths with java.io.File in a servlet, they will be relative to the directory from where the webserver was started. If you start it from inside Eclipse, the file is saved in the Eclipse directory. You'd like to use an absolute path in java.io.File instead. If your intent is to save it in public web content (where your JSPs and the /WEB-INF folder are located), then you need ServletContext#getRealPath() to convert a web path to an absolute disk path.
String relativeWebPath = "filename.ext";
String absoluteDiskPath = getServletContext().getRealPath(relativeWebPath);
File file = new File(absoluteDiskPath);
There's however another problem with this: all files will get erased whenever you redeploy the webapp. If you want a bit more permanent storage, then you'd like to store it outside the web project. E.g. C:/path/to/recordings.
File file = new File("C:/path/to/recordings/filename.ext");
