Xuggler Video To Audio conversion - java

I am converting video to audio using the Xuggler library in Java. The program runs with no errors or exceptions, but the audio file it generates is 0 KB. Could someone point out the problem?
Environment: Eclipse Helios, OS: Windows 7
External JAR libraries added to project:
(1) slf4j-api-1.7.7.jar
(2) slf4j-simple-1.7.7.jar
(3) xuggle-xuggler-5.4.jar
Code snippet for video to audio conversion.
import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;
public class VideoToAudio {

    public void convertVideoToAudio() {
        IMediaReader reader = ToolFactory.makeReader("D://vid.mp4");
        IMediaWriter writer = ToolFactory.makeWriter("D://a.mp3", reader);
        int sampleRate = 44100;
        int channels = 1;
        writer.addAudioStream(0, 0, ICodec.ID.CODEC_ID_MP3, channels, sampleRate);
        while (reader.readPacket() == null);
    }

    public static void main(String[] args) {
        VideoToAudio vta = new VideoToAudio();
        try {
            vta.convertVideoToAudio();
        } catch (Exception e) {
            System.out.println("Could not open video file");
        }
    }
}

Your program looks fine, you just forgot a line :) The mediatool pipeline is listener-based: readPacket() decodes packets and dispatches them to the reader's listeners, so until you register the writer as a listener nothing is ever written to the output file.
public void convertVideoToAudio() {
    IMediaReader reader = ToolFactory.makeReader("D://vid.mp4");
    IMediaWriter writer = ToolFactory.makeWriter("D://a.mp3", reader);
    int sampleRate = 44100;
    int channels = 1;
    writer.addAudioStream(1, 0, ICodec.ID.CODEC_ID_MP3, channels, sampleRate);
    reader.addListener(writer); // <-- the missing line
    while (reader.readPacket() == null);
}

Related

Audio - Streaming Audio from Java is Choppy

My main objective is to create a live stream of encrypted voice chat from the mic.
The encrypted audio is then transmitted over the network from one client to another.
The problem is that the audio always stutters and gets choppy while the program is running (streaming).
I have tried:
- different types of hardware (PC, laptop, Raspberry Pi);
- different OSes;
- sending only un-encrypted audio, to eliminate any issue caused by the encryption algorithm;
- changing the audio sample rate.
Unfortunately, everything failed.
To keep it simple, I have only included the code needed to transmit the audio over the network, without the encryption.
MAIN CLASS - both sender and receiver
package com.emaraic.securevoice;
import com.emaraic.securevoice.utils.AES;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import javax.sound.sampled.*;
public class SecureVoice {

    public static void main(String[] args) {
        Receiver rx = new Receiver();
        rx.start();
        Transmitter tx = new Transmitter();
        tx.start();
    }

    public static AudioFormat getAudioFormat() { // you may change these parameters to fit your mic
        float sampleRate = 8000.0f;  // 8000, 11025, 16000, 22050, 44100
        int sampleSizeInBits = 16;   // 8, 16
        int channels = 1;            // 1, 2
        boolean signed = true;       // true, false
        boolean bigEndian = false;   // true, false
        return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }

    public static final String ANSI_BOLD = "\033[0;1m"; // not working in NetBeans
    public static final String ANSI_RESET = "\033[0m";
    public static final String ANSI_BLACK = "\033[30m";
    public static final String ANSI_RED = "\033[31m";
    public static final String ANSI_GREEN = "\033[32;4m";
    public static final String ANSI_YELLOW = "\033[33m";
    public static final String ANSI_BLUE = "\033[34m";
    public static final String ANSI_PURPLE = "\033[35m";
    public static final String ANSI_CYAN = "\033[36m";
    public static final String ANSI_WHITE = "\033[37m";
}
SENDER
package com.emaraic.securevoice;
import com.emaraic.securevoice.utils.AES;
import java.io.*;
import java.io.File;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.text.SimpleDateFormat;
import java.util.Date;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.Port;
import javax.sound.sampled.TargetDataLine;
public class Transmitter extends Thread {

    // these parameters must be copied and used in the Receiver class of the other client
    private static final String TX_IP = "10.101.114.179"; // ip to send to
    private static final int TX_PORT = 1034;

    @Override
    public void run() {
        SecureVoice color = new SecureVoice();
        Mixer.Info minfo[] = AudioSystem.getMixerInfo();
        System.out.println(color.ANSI_BLUE + "Detecting sound card drivers...");
        for (Mixer.Info minfo1 : minfo) {
            System.out.println(" " + minfo1);
        }
        if (AudioSystem.isLineSupported(Port.Info.MICROPHONE)) {
            try {
                DataLine.Info dataLineInfo = new DataLine.Info(TargetDataLine.class, SecureVoice.getAudioFormat());
                final TargetDataLine line = (TargetDataLine) AudioSystem.getLine(dataLineInfo); // recording from mic
                line.open(SecureVoice.getAudioFormat());
                line.start(); // start recording
                System.out.println(color.ANSI_GREEN + "Recording...");
                byte tempBuffer[] = new byte[line.getBufferSize()];
                System.out.println(color.ANSI_BLUE + "Buffer size = " + tempBuffer.length + " bytes");
                //AudioCapture audio = new AudioCapture(line); //capture the audio into .wav file
                //audio.start();
                while (true) { // AES encryption
                    int read = line.read(tempBuffer, 0, tempBuffer.length);
                    byte[] encrypt = AES.encrypt(tempBuffer, 0, read);
                    // sendToUDP(encrypt);
                    sendToUDP(tempBuffer);
                }
            } catch (Exception e) {
                System.out.println(e.getMessage());
                System.exit(0);
            }
        }
    }

    public static void sendToUDP(byte soundpacket[]) {
        try {
            // EncryptedAudio encrypt = new EncryptedAudio(soundpacket);
            // encrypt.start();
            DatagramSocket sock = new DatagramSocket();
            sock.send(new DatagramPacket(soundpacket, soundpacket.length, InetAddress.getByName(TX_IP), TX_PORT));
            sock.close();
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
RECEIVER
package com.emaraic.securevoice;
import com.emaraic.securevoice.utils.AES;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;
public class Receiver extends Thread {

    // these parameters must be used in the Transmitter class of the other client
    private static final String RX_IP = "localhost";
    private static final int RX_PORT = 1034;

    @Override
    public void run() {
        byte b[] = null;
        while (true) {
            b = rxFromUDP();
            speak(b);
        }
    }

    public static byte[] rxFromUDP() {
        try {
            DatagramSocket sock = new DatagramSocket(RX_PORT);
            byte soundpacket[] = new byte[8192];
            DatagramPacket datagram = new DatagramPacket(soundpacket, soundpacket.length, InetAddress.getByName(RX_IP), RX_PORT);
            sock.receive(datagram);
            sock.close();
            // return AES.decrypt(datagram.getData(), 0, soundpacket.length);
            return soundpacket; // if you want to hear the encrypted form
        } catch (Exception e) {
            System.out.println(e.getMessage());
            return null;
        }
    }

    public static void speak(byte soundbytes[]) {
        try {
            DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, SecureVoice.getAudioFormat());
            try (SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo)) {
                sourceDataLine.open(SecureVoice.getAudioFormat());
                sourceDataLine.start();
                sourceDataLine.write(soundbytes, 0, soundbytes.length);
                sourceDataLine.drain();
            }
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
EXTRA LINK
http://emaraic.com/blog/secure-voice-chat
IDE Used
- Netbeans 11.1
Java JDK version
- Java 13 (Windows)
- OpenJDK11 (Linux)
There are two main problems. Network-streamed data arrives with jitter, and starting and stopping audio playback causes delay gaps and jitter due to OS and hardware driver overhead. There is also the smaller problem of sample-rate clock synchronization between the recording and playback systems. All of these can disrupt a continuous stream of audio samples at a fixed rate.
To avoid the audio start-up latency problem, don't stop your audio play or record system between network packets; always have audio data ready to play continuously at the current sample rate. To help cover network jitter, buffer some amount of audio data before starting playback, so there is always some audio ready to play even if the next network packet is slightly delayed.
You may have to gather some statistics on the audio startup latency, the network latency, and their variation to determine a suitable amount to buffer. The alternative is an audio dropout concealment algorithm, which is far more complicated to implement.
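To make the buffering idea concrete, here is a minimal sketch (my own, not from the original post) of a receiver that keeps a single DatagramSocket and a single SourceDataLine open for the whole session and pre-buffers a few packets before starting playback. The port, packet size and 8 kHz format mirror the question; the pre-buffer depth of 4 packets is an arbitrary assumption you would tune against your measured jitter.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;

public class BufferedReceiver {

    private static final int RX_PORT = 1034;        // assumption: same port as the question
    private static final int PACKET_SIZE = 8192;    // assumption: same packet size as the question
    private static final int PREBUFFER_PACKETS = 4; // arbitrary pre-buffer depth before playback starts

    public static void main(String[] args) throws Exception {
        AudioFormat format = new AudioFormat(8000.0f, 16, 1, true, false);
        BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(64);

        // Network thread: one socket for the whole session, not one per packet.
        Thread net = new Thread(() -> {
            try (DatagramSocket sock = new DatagramSocket(RX_PORT)) {
                while (true) {
                    byte[] buf = new byte[PACKET_SIZE];
                    DatagramPacket dp = new DatagramPacket(buf, buf.length);
                    sock.receive(dp);
                    byte[] payload = new byte[dp.getLength()];
                    System.arraycopy(dp.getData(), 0, payload, 0, dp.getLength());
                    queue.put(payload);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        net.start();

        // Audio thread: open the line once and keep it running for the whole session.
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        try (SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info)) {
            line.open(format);

            // Pre-buffer a few packets so playback does not starve on normal jitter.
            while (queue.size() < PREBUFFER_PACKETS) {
                Thread.sleep(10);
            }
            line.start();

            while (true) {
                byte[] chunk = queue.take();        // blocks until data is available
                line.write(chunk, 0, chunk.length); // blocks when the line's buffer is full
            }
        }
    }
}
Because line.write() blocks when the line's internal buffer is full, the network thread and the playback thread stay loosely paced against each other without any explicit timing code.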

Checking the level of audio playback in a Mixer's Line?

I'm trying to figure out if sound of any kind is playing in Windows (by any application). If something is making a noise somewhere, I want to know about it!
After following the docs, I've found how to get a list of mixers on the machine, as well as lines for those mixers -- which, if I understand correctly, are what is used for input/output of the mixer.
However, the problem I'm having is that I don't know how to get the data I need from the line.
The only interface I see with a notion of volume level is DataLine. The problem is that I can't figure out what returns an object that implements the DataLine interface.
Enumerating all of the mixers and lines:
public static void printMixers() {
    Mixer.Info[] mixers = AudioSystem.getMixerInfo();
    for (Mixer.Info mixerInfo : mixers) {
        Mixer mixer = AudioSystem.getMixer(mixerInfo);
        try {
            mixer.open();
            Line.Info[] lines = mixer.getSourceLineInfo();
            for (Line.Info linfo : lines) {
                System.out.println(linfo);
            }
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        }
    }
}
That code enumerates and displays all of the audio devices on my machine. From that, shouldn't one of those Lines contain some kind of playback level data?
Oh, you wish to find the volume? Well, not all hardware supports it, but here is how you get the DataLine.
public static SourceDataLine getSourceDataLine(Line.Info lineInfo) {
    try {
        return (SourceDataLine) AudioSystem.getLine(lineInfo);
    } catch (Exception ex) {
        ex.printStackTrace();
        return null;
    }
}
Then just call SourceDataLine.getLevel() to get the volume. I hope this helps.
NB: If the sound is originating from outside the JVM or not via the JavaSound API, this method will not detect the sound as the JVM does not have access to the OS equivalent of the SourceDataLine.
UPDATE: Upon further research, getLevel() is not implemented on most systems, so I have implemented the method manually, based on this forum discussion: https://community.oracle.com/message/5391003
Here are the classes:
// Standard-library imports only; MicrophoneAnalyzer and FLACFileWriter come from the
// external library used in the original post, and their imports are omitted here as in the original.
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;

public class Main {

    public static void main(String[] args) {
        MicrophoneAnalyzer mic = new MicrophoneAnalyzer(FLACFileWriter.FLAC);
        System.out.println("HELLO");
        mic.open();
        while (true) {
            byte[] buffer = new byte[mic.getTargetDataLine().getFormat().getFrameSize()];
            mic.getTargetDataLine().read(buffer, 0, buffer.length);
            try {
                System.out.println(getLevel(mic.getAudioFormat(), buffer));
            } catch (Exception e) {
                System.out.println("ERROR");
                e.printStackTrace();
            }
        }
    }

    public static double getLevel(AudioFormat af, byte[] chunk) throws IOException {
        PCMSigned8Bit converter = new PCMSigned8Bit(af);
        if (chunk.length != converter.getRequiredChunkByteSize())
            return -1;
        AudioInputStream ais = converter.convert(chunk);
        ais.read(chunk, 0, chunk.length);
        long lSum = 0;
        for (int i = 0; i < chunk.length; i++)
            lSum = lSum + chunk[i];
        double dAvg = (double) lSum / chunk.length; // average as a double to avoid integer division
        double sumMeanSquare = 0d;
        for (int j = 0; j < chunk.length; j++)
            sumMeanSquare = sumMeanSquare + Math.pow(chunk[j] - dAvg, 2d);
        double averageMeanSquare = sumMeanSquare / chunk.length;
        return Math.pow(averageMeanSquare, 0.5d);
    }
}
The method I used only works on 8-bit PCM, so we have to convert the encoding to that using these two classes. Here is the general abstract converter class:
import java.io.ByteArrayInputStream;
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
abstract class AbstractSignedLevelConverter {

    private AudioFormat srcf;

    public AbstractSignedLevelConverter(AudioFormat sourceFormat) {
        srcf = sourceFormat;
    }

    protected AudioInputStream convert(byte[] chunk) {
        AudioInputStream ais = null;
        if (AudioSystem.isConversionSupported(AudioFormat.Encoding.PCM_SIGNED, srcf)) {
            if (srcf.getEncoding() != AudioFormat.Encoding.PCM_SIGNED)
                ais = AudioSystem.getAudioInputStream(
                        AudioFormat.Encoding.PCM_SIGNED,
                        new AudioInputStream(new ByteArrayInputStream(chunk),
                                srcf,
                                chunk.length * srcf.getFrameSize()));
            else
                ais = new AudioInputStream(new ByteArrayInputStream(chunk),
                        srcf,
                        chunk.length * srcf.getFrameSize());
        }
        return ais;
    }

    abstract public double convertToLevel(byte[] chunk) throws IOException;

    public int getRequiredChunkByteSize() {
        return srcf.getFrameSize();
    }
}
And here is the one for 8-bit PCM:
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
public class PCMSigned8Bit extends AbstractSignedLevelConverter {

    PCMSigned8Bit(AudioFormat sourceFormat) {
        super(sourceFormat);
    }

    public double convertToLevel(byte[] chunk) throws IOException {
        if (chunk.length != getRequiredChunkByteSize())
            return -1;
        AudioInputStream ais = convert(chunk);
        ais.read(chunk, 0, chunk.length);
        return (double) chunk[0];
    }
}
This is for TargetDataLine, which may not match your use case, but you could build a wrapper around SourceDataLine and use this to properly implement these methods. Hope this helps.
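Since the capture format in a setup like this is more often 16-bit PCM, here is a small additional sketch (mine, not part of the answer above) that computes an RMS level directly from 16-bit signed little-endian samples, which avoids the 8-bit conversion step entirely; the class name is illustrative.
// A minimal sketch (not from the original answer): RMS level straight from
// 16-bit signed little-endian PCM bytes, as produced by the formats in these questions.
public final class Pcm16Level {

    public static double rms(byte[] chunk, int validBytes) {
        long sumSquares = 0;
        int samples = validBytes / 2;
        for (int i = 0; i < samples; i++) {
            // Assemble a little-endian 16-bit sample from two bytes.
            int lo = chunk[2 * i] & 0xFF;
            int hi = chunk[2 * i + 1]; // keep the sign bit
            int sample = (hi << 8) | lo;
            sumSquares += (long) sample * sample;
        }
        return samples == 0 ? 0.0 : Math.sqrt((double) sumSquares / samples);
    }
}
For a full-scale 16-bit signal the result ranges from 0 up to about 32767, so divide by 32768.0 if you prefer a 0.0 to 1.0 level.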

How to read an audio file? Which method should I use?

I have a panel with two buttons. When I click button 1, I'd simply like to play an audio file (a .wav in this case). Then, when I click button 2, I'd like to stop the music.
I did some research, but I'm a little confused about the different methods.
Which one is best in my case? Can someone explain the difference between AudioClip, Java Sound and the Java Media Framework, please?
I've also tried an example, but it produces errors.
Here is my Main.class:
import java.io.ByteArrayInputStream;
import java.io.InputStream;
public class Main {

    public static void main(String[] args) {
        SoundPlayer player = new SoundPlayer("C:/Documents and Settings/All Users/Documents/Ma musique/Échantillons de musique/Symphonie n° 9 de Beethoven (scherzo).wma");
        InputStream stream = new ByteArrayInputStream(player.getSamples());
        player.play(stream);
    }
}
Here is my SoundPlayer.class:
import java.io.DataInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import javax.sound.sampled.*;
public class SoundPlayer {

    private AudioFormat format;
    private byte[] samples;

    /**
     * @param filename the path to the sound file (URL or absolute path)
     */
    public SoundPlayer(String filename) {
        try {
            AudioInputStream stream = AudioSystem.getAudioInputStream(new File(filename));
            format = stream.getFormat();
            samples = getSamples(stream);
        } catch (UnsupportedAudioFileException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public byte[] getSamples() {
        return samples;
    }

    public byte[] getSamples(AudioInputStream stream) {
        int length = (int) (stream.getFrameLength() * format.getFrameSize());
        byte[] samples = new byte[length];
        DataInputStream in = new DataInputStream(stream);
        try {
            in.readFully(samples);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return samples;
    }

    public void play(InputStream source) {
        int bufferSize = format.getFrameSize() * Math.round(format.getSampleRate() / 10);
        byte[] buffer = new byte[bufferSize];
        SourceDataLine line;
        try {
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
            line = (SourceDataLine) AudioSystem.getLine(info);
            line.open(format, bufferSize);
        } catch (LineUnavailableException e) {
            e.printStackTrace();
            return;
        }
        line.start();
        try {
            int numBytesRead = 0;
            while (numBytesRead != -1) {
                numBytesRead = source.read(buffer, 0, buffer.length);
                if (numBytesRead != -1)
                    line.write(buffer, 0, numBytesRead);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        line.drain();
        line.close();
    }
}
Stack trace:
javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input file
at javax.sound.sampled.AudioSystem.getAudioInputStream(Unknown Source)
at SoundPlayer.<init>(SoundPlayer.java:19)
at Main.main(Main.java:8)
Exception in thread "main" java.lang.NullPointerException
at java.io.ByteArrayInputStream.<init>(Unknown Source)
at Main.main(Main.java:9)
Thanks a lot in advance!
That exception will stay: *.wma files are not supported out of the box.
The simplest solution would be to use *.wav files or another supported format.
You can get more info at:
https://stackoverflow.com/tags/javasound/info
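For example, playing a supported .wav with the Clip API only takes a few lines. A minimal sketch, assuming test.wav is a PCM-encoded file in the working directory:
import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;

public class WavPlayback {
    public static void main(String[] args) throws Exception {
        // Assumption: test.wav is a PCM .wav in a format the default mixer supports.
        AudioInputStream ais = AudioSystem.getAudioInputStream(new File("test.wav"));
        Clip clip = AudioSystem.getClip();
        clip.open(ais);
        clip.start();
        // Clip plays on a background thread; block until it has played through once.
        Thread.sleep(clip.getMicrosecondLength() / 1000);
        clip.close();
    }
}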
SoundPlayer player = new SoundPlayer("C:/Documents and Settings/All Users/" +
"Documents/Ma musique/Échantillons de musique/" +
"Symphonie n° 9 de Beethoven (scherzo).wma")
Ah, WMA. Great format, but Java (Standard Edition) does not provide a Service Provider Interface that supports it.
You will either need to supply an SPI to allow Java Sound to support it, or use a different API. I don't know of any APIs that provide support for WMA. Can you encode it in a different format?
See the Java Sound info. page for a way to support MP3, but it requires the MP3 SPI from JMF.
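For reference, once an MP3 service provider is on the classpath (for example the JLayer/MP3SPI jars), the usual pattern is to open the MP3 stream, convert it to PCM, and feed it to a SourceDataLine. This is a sketch under that assumption, with an arbitrary file name and buffer size; it is not something plain Java SE can do on its own:
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;

public class Mp3ViaSpi {
    public static void main(String[] args) throws Exception {
        // With an MP3 service provider installed, getAudioInputStream can open the file...
        AudioInputStream mp3 = AudioSystem.getAudioInputStream(new File("music.mp3"));
        AudioFormat base = mp3.getFormat();
        // ...but the stream is still MP3-encoded, so convert it to PCM before playing.
        AudioFormat pcm = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                base.getSampleRate(), 16, base.getChannels(),
                base.getChannels() * 2, base.getSampleRate(), false);
        AudioInputStream decoded = AudioSystem.getAudioInputStream(pcm, mp3);

        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(
                new DataLine.Info(SourceDataLine.class, pcm));
        line.open(pcm);
        line.start();
        byte[] buf = new byte[4096];
        int n;
        while ((n = decoded.read(buf, 0, buf.length)) != -1) {
            line.write(buf, 0, n);
        }
        line.drain();
        line.close();
    }
}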
Write down the full path of your music file and it will work.
I've found a solution to my problem.
In my case, the JavaZoom (JLayer) library does the job.
Here is a sample that simply plays an audio file on launch (no graphical part):
// Imports were missing from the original snippet; the javazoom.* package names assume
// the standard JLayer distribution.
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import javazoom.jl.player.advanced.AdvancedPlayer;
import javazoom.jl.player.advanced.PlaybackListener;

public class Sound {

    private boolean isPlaying = false;
    private AdvancedPlayer player = null;

    public Sound(String path) throws Exception {
        InputStream in = new BufferedInputStream(new FileInputStream(new File(path)));
        player = new AdvancedPlayer(in);
    }

    public Sound(String path, PlaybackListener listener) throws Exception {
        InputStream in = new BufferedInputStream(new FileInputStream(new File(path)));
        player = new AdvancedPlayer(in);
        player.setPlayBackListener(listener);
    }

    public void play() throws Exception {
        if (player != null) {
            isPlaying = true;
            player.play();
        }
    }

    public void play(int begin, int end) throws Exception {
        if (player != null) {
            isPlaying = true;
            player.play(begin, end);
        }
    }

    public void stop() throws Exception {
        if (player != null) {
            player.stop();
            isPlaying = false;
        }
    }

    public boolean isPlaying() {
        return isPlaying;
    }

    public static void main(String[] args) {
        System.out.println("playing sound");
        try {
            Sound sound = new Sound("C:/Documents and Settings/cngo/Bureau/Stage-Save/TCPIP_AndroidJava/TCPIP_V6_Sound/OpeningSuite.mp3");
            System.out.println("playing : " + sound.isPlaying());
            sound.play();
            System.out.println("playing : " + sound.isPlaying());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Thanks to @murtaza.webdev for his answers!

convert a class file to an applet

I have a Java class that runs perfectly, but now I need to run it as a web application, so I need to convert this class to an applet. How can I do that?
I know a little bit about applets, like the life cycle:
init()
start()
paint()
stop()
destroy()
and how to run an applet with
applet code = "LifeTest.class"
So can anyone help me convert this class to an applet? If that is not possible, any suggestion for a substitute is welcome.
import java.io.ByteArrayInputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.SourceDataLine;
class Server {

    AudioInputStream audioInputStream;
    static AudioInputStream ais;
    static AudioFormat format;
    static boolean status = true;
    static int port = 50005;
    static int sampleRate = 8000;

    public static void main(String args[]) throws Exception {
        DatagramSocket serverSocket = new DatagramSocket(50005);
        /**
         * Formula for lag = (byte_size/sample_rate)*2
         * Byte size 9728 will produce ~ 0.45 seconds of lag. Voice slightly broken.
         * Byte size 1400 will produce ~ 0.06 seconds of lag. Voice extremely broken.
         * Byte size 4000 will produce ~ 0.18 seconds of lag. Voice slightly more broken than 9728.
         */
        byte[] receiveData = new byte[5000];
        format = new AudioFormat(sampleRate, 16, 1, true, false);
        while (status == true) {
            DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
            serverSocket.receive(receivePacket);
            ByteArrayInputStream baiss = new ByteArrayInputStream(receivePacket.getData());
            ais = new AudioInputStream(baiss, format, receivePacket.getLength());
            toSpeaker(receivePacket.getData());
        }
    }

    public static void toSpeaker(byte soundbytes[]) {
        try {
            DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
            SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
            sourceDataLine.open(format);
            FloatControl volumeControl = (FloatControl) sourceDataLine.getControl(FloatControl.Type.MASTER_GAIN);
            volumeControl.setValue(6.0206f);
            sourceDataLine.start(); // (the original snippet opened and started the line twice; once is enough)
            System.out.println("format? :" + sourceDataLine.getFormat());
            sourceDataLine.write(soundbytes, 0, soundbytes.length);
            System.out.println(soundbytes.toString());
            sourceDataLine.drain();
            sourceDataLine.close();
        } catch (Exception e) {
            System.out.println("Not working in speakers...");
            e.printStackTrace();
        }
    }
}
You need to extend Applet. Let me give you sample code for it.
import java.applet.Applet;
import java.awt.Graphics;
public class HelloWorld extends Applet {
    public void paint(Graphics g) {
        g.drawString("Hello world!", 50, 25);
    }
}
Create a MANIFEST.MF file using a text editor and place it in the same directory as your .java file. Its content should look like this:
Manifest-Version: 1.0
Permissions: all-permissions
Application-Name: Name of your application
Now compile your code and package it into a jar with the MANIFEST.MF attached (note that with the cvfm flags the jar name comes before the manifest name):
javac HelloWorld.java
jar cvfm HelloWorld.jar MANIFEST.MF *.class
Now create an .html file and place an <applet> tag in it:
<applet name="HelloWorld" code="HelloWorld.class"
archive="HelloWorld.jar" width="100" height="100">
</applet>
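If the goal is just to get the question's receive loop running inside that applet skeleton, one rough sketch (my own; the class name and wiring are illustrative) is to start the loop on a background thread from the applet's start() method and stop it from stop():
// A rough sketch (not from the answer above): moving the question's UDP receive loop into
// the applet lifecycle. Names are illustrative; the audio handling stays in toSpeaker(...).
import java.applet.Applet;
import java.awt.Graphics;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class AudioApplet extends Applet {

    private volatile boolean running;

    @Override
    public void start() {
        running = true;
        new Thread(() -> {
            try (DatagramSocket socket = new DatagramSocket(50005)) { // same port as the question
                byte[] buf = new byte[5000];
                while (running) {
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    socket.receive(packet); // blocks until a packet arrives
                    // hand packet.getData() / packet.getLength() to the existing toSpeaker(...) logic
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }).start();
    }

    @Override
    public void stop() {
        running = false; // receive() blocks, so the loop only notices this after the next packet
    }

    @Override
    public void paint(Graphics g) {
        g.drawString("Receiving audio...", 20, 20);
    }
}
Keep in mind that a sandboxed applet is not normally allowed to open a listening DatagramSocket, which is why the manifest above requests all-permissions; the jar then has to be signed for the browser plugin to accept it.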

how to play wav file in java 1.4

As the title says: how can I play a sound file repeatedly in Java 1.4?
If you just want to play a wav file, then 'org.life.java''s answer is correct. For other formats you can use JMF (http://www.oracle.com/technetwork/java/javase/tech/index-jsp-140239.html).
Note: JMF is obsolete now, but it will still work with JDK 1.4.
import java.net.URL;
import javax.sound.sampled.*;
public class LoopSound {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://pscode.org/media/leftright.wav");
        Clip clip = AudioSystem.getClip();
        AudioInputStream ais = AudioSystem.getAudioInputStream(url);
        clip.open(ais);
        // loop continuously, since the question asks for repeated playback
        clip.loop(Clip.LOOP_CONTINUOUSLY);
        javax.swing.JOptionPane.showMessageDialog(null, "Close to exit!");
    }
}
This will work in JDK 1.4 (tested in Windows XP and JDK 1.4.2_06).
The other answer fails because, as correctly stated in the comments, AudioSystem.getClip() does not exist in JDK 1.4. Below is a complete source (in the form of a main method, but adaptable to anything else) that uses a SourceDataLine and plays in a separate Thread for better overall performance:
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;
public class AudioTest {

    public static void main(String[] args) throws Exception {
        AudioInputStream ais = AudioSystem.getAudioInputStream(new File("C:/sound1.wav"));
        AudioFormat format = ais.getFormat();
        DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);

        class PlayThread extends Thread {
            private AudioInputStream ais;
            private AudioFormat format;
            private SourceDataLine sourceDataLine;
            byte tempBuffer[] = new byte[10000];

            public PlayThread(AudioInputStream ais, SourceDataLine sourceDataLine, AudioFormat format) {
                this.ais = ais;
                this.sourceDataLine = sourceDataLine;
                this.format = format;
            }

            public void run() {
                try {
                    sourceDataLine.open(this.format);
                    sourceDataLine.start();
                    int cnt;
                    while ((cnt = this.ais.read(tempBuffer, 0, tempBuffer.length)) != -1) {
                        if (cnt > 0) {
                            sourceDataLine.write(tempBuffer, 0, cnt);
                        }
                    }
                    sourceDataLine.drain();
                    sourceDataLine.close();
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }

        new PlayThread(ais, sourceDataLine, format).start();
    }
}
Both the question and the answers are really old, but I just had to make this work on a fanless mini PC that only runs Windows XP, so... ¯\_(ツ)_/¯
