I am making a Java voice chat program, and the server-side voice class is throwing this error:
javax.sound.sampled.LineUnavailableException: line with format PCM_SIGNED 8000.0 Hz, 8 bit, mono, 1 bytes/frame, not supported.
at com.sun.media.sound.DirectAudioDevice$DirectDL.implOpen(DirectAudioDevice.java:513)
at com.sun.media.sound.AbstractDataLine.open(AbstractDataLine.java:121)
at com.sun.media.sound.AbstractDataLine.open(AbstractDataLine.java:153)
at client.VoiceUser.run(VoiceUser.java:34)
The error is thrown on this line of code:
microphone.open(audioformat);
Code:
package client;

import java.io.IOException;
import java.io.ObjectOutputStream;
import java.net.Socket;
import java.util.ArrayList;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.TargetDataLine;

public class VoiceUser extends Thread {
    private ObjectOutputStream clientOutput;
    private TargetDataLine microphone;
    private ArrayList<ObjectOutputStream> vOutputArray = new ArrayList<ObjectOutputStream>();
    private AudioFormat audioformat;

    public VoiceUser(Socket sv, ArrayList<ObjectOutputStream> outputArray) throws LineUnavailableException {
        try {
            clientOutput = new ObjectOutputStream(sv.getOutputStream());
            vOutputArray.equals(clientOutput);
            vOutputArray.add(clientOutput);
        } catch (IOException e) {
            System.out.println("Can't create stable connection between server and client");
        }
    }

    public void run() {
        try {
            DataLine.Info info = new DataLine.Info(TargetDataLine.class, audioformat);
            microphone = (TargetDataLine) AudioSystem.getLine(info);
            audioformat = new AudioFormat(8000.0f, 8, 1, true, false);
            microphone.open(audioformat);
            microphone.start();
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        }
        int bytesRead = 0;
        byte[] soundData = new byte[3072];
        int offset = 0;
        while (bytesRead != -1) {
            bytesRead = microphone.read(soundData, 0, soundData.length);
            if (bytesRead >= 0) {
                offset += bytesRead;
                if (offset == soundData.length) {
                    send(soundData, bytesRead);
                    offset = 0;
                }
            }
        }
    }

    public void send(byte[] soundData, int bytesRead) {
        for (ObjectOutputStream o : vOutputArray) {
            try {
                o.write(soundData, 0, bytesRead);
                o.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
It tells you that the line is not available. There are a couple of possible reasons. One is that your sound card does not support recording in the specific format you specified; you can try different formats, for example 22050 Hz might be supported.
The other is that the sound card is already busy recording. On Windows you cannot record from two microphones simultaneously, so you would need to redesign your server to allow just a single recording session.
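To see which capture formats your machine actually supports before opening the line, you can probe with AudioSystem.isLineSupported. A minimal sketch (the candidate rates and bit depths below are just examples, not an exhaustive list):
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.TargetDataLine;

public class FormatProbe {
    public static void main(String[] args) {
        // Candidate formats to probe, from least to most demanding.
        float[] rates = {8000f, 11025f, 16000f, 22050f, 44100f};
        int[] bits = {8, 16};
        for (float rate : rates) {
            for (int bit : bits) {
                AudioFormat fmt = new AudioFormat(rate, bit, 1, true, false);
                DataLine.Info info = new DataLine.Info(TargetDataLine.class, fmt);
                // isLineSupported reports whether any installed mixer can open this format.
                System.out.println(fmt + " -> " + AudioSystem.isLineSupported(info));
            }
        }
    }
}
Whichever format prints true can then be passed to microphone.open.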
Related
I am working on a small instant messenger project that implements a custom encryption algorithm I've developed. However, networking isn't my strong area.
Basically, I'm trying to provide a synchronized audio output stream using a one-to-many architecture.
So far, I've managed to pipe out the audio over an HTTP connection in base64-encoded format, but here I am stuck:
I have no idea how to play back the audio in real time without reading the same audio data twice (overlap).
audio server
Here's my server-side code. Please be kind if I have mucked up the whole thing, but I think I've got this part working correctly.
/*
 * Decompiled with CFR 0.139.
 */
package SIM.net.client.networking;

import DARTIS.crypt;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.TargetDataLine;
import com.sun.org.apache.xml.internal.security.utils.Base64;

public class audioServer {
    public static void start(String[] key) {
        AudioFormat format = new AudioFormat(8000.0f, 16, 1, true, true);
        TargetDataLine microphone = null;
        try {
            microphone = AudioSystem.getTargetDataLine(format);
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        }
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
        try {
            microphone = (TargetDataLine) AudioSystem.getLine(info);
        } catch (LineUnavailableException e1) {
            e1.printStackTrace();
        }
        try {
            microphone.open(format);
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int CHUNK_SIZE = 1024;
        byte[] data = new byte[microphone.getBufferSize() / 5];
        microphone.start();
        int bytesRead = 0;
        do {
            if (bytesRead >= 4096) {
                byte[] audioData = out.toByteArray();
                String base64img = Base64.encode(audioData);
                String audioclip;
                if (key.length > 9999) {
                    audioclip = crypt.inject(base64img, key);
                } else {
                    audioclip = base64img;
                }
                audioHandler.setdata(audioclip);
                bytesRead = 0;
                out.reset();
            } else {
                int numBytesRead = microphone.read(data, 0, CHUNK_SIZE);
                System.out.println(bytesRead += numBytesRead);
                out.write(data, 0, numBytesRead);
            }
        } while (true);
    }
}
audio handler
package SIM.net.client.networking;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.util.HashMap;
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;
import com.sun.net.httpserver.HttpServer;

public class audioHandler implements HttpHandler {
    public static String audiodata;

    public static void setdata(String imgdta) {
        audiodata = imgdta;
    }

    public void handle(HttpExchange he) throws IOException {
        HashMap parameters = new HashMap();
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(9991), 0);
        server.createContext("/audio", new MyHandler());
        server.setExecutor(null); // creates a default executor
        server.start();
        audioServer.start(new String[3]);
    }

    static class MyHandler implements HttpHandler {
        @Override
        public void handle(HttpExchange he) throws IOException {
            URI requestedUri = he.getRequestURI();
            String query = requestedUri.getRawQuery();
            he.sendResponseHeaders(200, audiodata.replace("\n", "").replace("\r", "").length());
            OutputStream os = he.getResponseBody();
            os.write(audiodata.toString().replace("\n", "").replace("\r", "").getBytes());
            os.close();
        }
    }
}
Please understand, this code was originally written to stream webcam snapshots over HTTP in real time, one frame at a time. If this design isn't suitable for audio streaming, please point me in the right direction. I usually learn best from running examples, editing them, and observing the changes in the output, so any sample/example code would help greatly. (I'm not asking you to solve it for me 100%, just some pointers in the right direction and example code.)
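One possible direction for the overlap problem (a sketch, not a full solution): tag every captured chunk with a monotonically increasing sequence number, and let the client ask only for chunks newer than the last one it played, so no audio is read twice. This assumes the same com.sun.net.httpserver setup as above; ChunkHandler, audioChunks, and the seq query parameter are hypothetical names, not part of the original code:
import java.io.IOException;
import java.io.OutputStream;
import java.util.Map;
import java.util.concurrent.ConcurrentSkipListMap;
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;

class ChunkHandler implements HttpHandler {
    // Sequence number -> base64 chunk; the capture loop appends entries here.
    static final ConcurrentSkipListMap<Long, String> audioChunks = new ConcurrentSkipListMap<Long, String>();

    public void handle(HttpExchange he) throws IOException {
        String query = he.getRequestURI().getRawQuery(); // e.g. "seq=41"
        long last = (query != null && query.startsWith("seq="))
                ? Long.parseLong(query.substring(4)) : -1L;
        StringBuilder sb = new StringBuilder();
        // tailMap(last, false) yields only the entries the client has not seen yet.
        for (Map.Entry<Long, String> e : audioChunks.tailMap(last, false).entrySet()) {
            sb.append(e.getKey()).append(':').append(e.getValue()).append('\n');
        }
        byte[] body = sb.toString().getBytes();
        he.sendResponseHeaders(200, body.length);
        OutputStream os = he.getResponseBody();
        os.write(body);
        os.close();
    }
}
The client remembers the highest sequence number it has received and sends it back on the next request, so each chunk is played exactly once.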
I got a bunch of audio files from a client which at some point in the program need to be played. Only a few of these .wav files play when I call them; most don't and give me an IllegalArgumentException. I tried many different code examples of how to play .wav files in Java, but none worked for me and I don't know why. Currently I'm using this, which came from another Stack Overflow question/answer:
import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

/**
 * Handles playing, stopping, and looping of sounds for the game.
 * @author Tyler Thomas
 */
public class Sound {
    private final int BUFFER_SIZE = 128000;
    private AudioInputStream audioStream;
    private SourceDataLine sourceLine;

    /**
     * @param filename the name of the file that is going to be played
     */
    public void playSound(String filename) {
        try {
            audioStream = AudioSystem.getAudioInputStream(new File(filename));
        } catch (Exception e) {
            e.printStackTrace();
        }
        try {
            sourceLine = (SourceDataLine) AudioSystem.getLine(new DataLine.Info(SourceDataLine.class, audioStream.getFormat()));
            sourceLine.open(audioStream.getFormat());
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
        sourceLine.start();
        int nBytesRead = 0;
        byte[] abData = new byte[BUFFER_SIZE];
        while (nBytesRead != -1) {
            try {
                nBytesRead = audioStream.read(abData, 0, abData.length);
            } catch (IOException e) {
                e.printStackTrace();
            }
            if (nBytesRead >= 0) {
                @SuppressWarnings("unused")
                int nBytesWritten = sourceLine.write(abData, 0, nBytesRead);
            }
        }
        sourceLine.drain();
        sourceLine.close();
    }
}
If I use this to play my file, I get the following error:
java.lang.IllegalArgumentException: No line matching interface SourceDataLine supporting format PCM_FLOAT 44100.0 Hz, 32 bit, stereo, 8 bytes/frame, is supported.
at javax.sound.sampled.AudioSystem.getLine(Unknown Source)
at Sound.playSound(Sound.java:29)
What's more, when I open the file with a regular media player it plays without problems, so the files are not corrupted or anything.
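The stack trace says these files are 32-bit floating-point WAVs (PCM_FLOAT), which Java Sound usually cannot feed to a line directly. A common workaround is to convert the stream to 16-bit signed PCM before opening the line. A minimal sketch (whether the PCM_FLOAT-to-PCM_SIGNED converter is available depends on your JDK version, so treat this as an approach to try, not a guarantee):
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class FloatWavPlayer {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File(args[0]));
        AudioFormat src = in.getFormat();
        // Target: 16-bit signed PCM at the same sample rate and channel count.
        AudioFormat target = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                src.getSampleRate(), 16, src.getChannels(),
                src.getChannels() * 2, src.getSampleRate(), false);
        // Let Java Sound convert PCM_FLOAT -> PCM_SIGNED on the fly.
        AudioInputStream pcm = AudioSystem.getAudioInputStream(target, in);
        SourceDataLine line = AudioSystem.getSourceDataLine(target);
        line.open(target);
        line.start();
        byte[] buf = new byte[4096];
        int n;
        while ((n = pcm.read(buf, 0, buf.length)) != -1) {
            line.write(buf, 0, n);
        }
        line.drain();
        line.close();
    }
}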
I want to record the sound from my programs. I use Ubuntu 14.04 and PulseAudio.
At the moment I am only recording from my microphone.
How can I record the sound from PulseAudio instead of my microphone?
public static void captureAudio() {
    try {
        final AudioFormat format = getFormat();
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
        final TargetDataLine line = (TargetDataLine) AudioSystem.getLine(info);
        line.open(format);
        line.start();
        Runnable runnable = new Runnable() {
            int bufferSize = (int) format.getSampleRate() * format.getFrameSize();
            byte buffer[] = new byte[bufferSize];

            public void run() {
                out = new ByteArrayOutputStream();
                running = true;
                try {
                    while (running) {
                        int count = line.read(buffer, 0, buffer.length);
                        if (count > 0) {
                            out.write(buffer, 0, count);
                        }
                    }
                    out.close();
                } catch (IOException ex) {
                    ex.printStackTrace();
                }
            }
        };
        Thread captureThread = new Thread(runnable);
        captureThread.start();
    } catch (LineUnavailableException ex) {
        ex.printStackTrace();
    }
}
I tried some things to change this in my code:
Mixer mixer = AudioSystem.getMixer(null);
And then:
final TargetDataLine line = (TargetDataLine) mixer.getLine(info);
I hope someone has a solution.
Greetings,
Daniel
This problem cannot be solved from within Java alone. Java sees only the devices that are already there, as the following program demonstrates:
import javax.sound.sampled.*;

public class ListDevices {
    public static void main(final String... args) throws Exception {
        for (final Mixer.Info info : AudioSystem.getMixerInfo())
            System.out.format("%s: %s %s %s %s%n", info, info.getName(), info.getVendor(), info.getVersion(), info.getDescription());
    }
}
What you need to do is create a loopback device for your audio system. The following post shows how to do that: https://askubuntu.com/questions/257992/how-can-i-use-pulseaudio-virtual-audio-streams-to-play-music-over-skype The purpose there was different, but it should be adaptable to your situation, which seems simpler to me than the one described in that post.
It should be possible to run those pactl commands from Java using Process.
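For example, a minimal sketch of driving pactl from Java with ProcessBuilder. The sink name virtual_out is just an example; the module arguments follow the linked post and may need adjusting for your setup:
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class PactlLoopback {
    public static void main(String[] args) throws Exception {
        // Create a null sink that programs can play into; its .monitor
        // source can then be recorded like a normal input device.
        run("pactl", "load-module", "module-null-sink", "sink_name=virtual_out");
        run("pactl", "list", "short", "sources"); // verify virtual_out.monitor exists
    }

    private static void run(String... cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println(line); // echo pactl's output for inspection
        }
        p.waitFor();
    }
}
Once the monitor source exists, pointing Java Sound (or the default recording device) at virtual_out.monitor records the program output instead of the microphone.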
I want to run a radio station from a hilltop with the studio in the valley, using a radio Ethernet link with a 1.1 Mb/s data rate. Below is example code that I found.
But I want the code to:
Load a text file containing the IPv4 address to receive sound on
Read a true or false value from the file saying whether the Android device should retransmit for another receiver, to save data on such a slow connection (see the sketch after this list)
Can someone help please?
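For the configuration-file part, here is a minimal sketch. The two-line file layout (address on the first line, retransmit flag on the second) is an assumption of mine, not something from the original post:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class StationConfig {
    final String receiverIp;   // IPv4 address to receive sound on
    final boolean retransmit;  // whether this node retransmits for another receiver

    StationConfig(String path) throws IOException {
        BufferedReader r = new BufferedReader(new FileReader(path));
        try {
            receiverIp = r.readLine().trim();                       // e.g. "192.168.1.111"
            retransmit = Boolean.parseBoolean(r.readLine().trim()); // "true" or "false"
        } finally {
            r.close();
        }
    }
}
The address read this way could then replace the hard-coded "192.168.1.111" in the receiver code below.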
import java.io.IOException;
import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.DataSink;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.NoPlayerException;
import javax.media.NoProcessorException;
import javax.media.NotRealizedError;
import javax.media.Player;
import javax.media.Processor;
import javax.media.control.FormatControl;
import javax.media.control.TrackControl;
import javax.media.format.AudioFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;

public class SimpleVoiceTransmiter {
    /**
     * @param args
     */
    public static void main(String[] args) {
        // First find a capture device that will capture linear audio
        // data at 8 bit, 8 kHz
        AudioFormat format = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
        Vector devices = CaptureDeviceManager.getDeviceList(format);
        CaptureDeviceInfo di = null;
        if (devices.size() > 0) {
            di = (CaptureDeviceInfo) devices.elementAt(0);
        } else {
            // exit if we could not find the relevant capture device.
            System.exit(-1);
        }
        // Create a processor for this capture device & exit if we
        // cannot create it
        Processor processor = null;
        try {
            processor = Manager.createProcessor(di.getLocator());
        } catch (IOException e) {
            System.exit(-1);
        } catch (NoProcessorException e) {
            System.exit(-1);
        }
        // configure the processor
        processor.configure();
        while (processor.getState() != Processor.Configured) {
            try {
                Thread.sleep(100);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW));
        TrackControl track[] = processor.getTrackControls();
        boolean encodingOk = false;
        // Go through the tracks and try to program one of them to
        // output gsm data.
        for (int i = 0; i < track.length; i++) {
            if (!encodingOk && track[i] instanceof FormatControl) {
                if (((FormatControl) track[i]).setFormat(new AudioFormat(AudioFormat.GSM_RTP, 8000, 8, 1)) == null) {
                    track[i].setEnabled(false);
                } else {
                    encodingOk = true;
                }
            } else {
                // we could not set this track to gsm, so disable it
                track[i].setEnabled(false);
            }
        }
        // At this point, we have determined whether we can send out
        // gsm data or not.
        // realize the processor
        if (encodingOk) {
            processor.realize();
            while (processor.getState() != Processor.Realized) {
                try {
                    Thread.sleep(100);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            // get the output datasource of the processor and exit
            // if we fail
            DataSource ds = null;
            try {
                ds = processor.getDataOutput();
            } catch (NotRealizedError e) {
                System.exit(-1);
            }
            // hand this datasource to manager for creating an RTP
            // datasink; our RTP datasink will multicast the audio
            try {
                String url = "rtp://224.0.0.1:22224/audio/16";
                MediaLocator m = new MediaLocator(url);
                DataSink d = Manager.createDataSink(ds, m);
                d.open();
                d.start();
                processor.start();
            } catch (Exception e) {
                System.exit(-1);
            }
        }
    }
}
Receiver:
import java.io.IOException;
import java.net.MalformedURLException;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.NoPlayerException;
import javax.media.Player;

public class SimpleVoiceReciver {
    /**
     * @param args
     */
    public static void main(String[] args) {
        String url = "rtp://192.168.1.111:22224/audio/16";
        MediaLocator mrl = new MediaLocator(url);
        if (mrl == null) {
            System.err.println("Can't build MRL for RTP");
            System.exit(-1);
        }
        // Create a player for this rtp session
        Player player = null;
        try {
            player = Manager.createPlayer(mrl);
        } catch (NoPlayerException e) {
            System.err.println("Error:" + e);
            System.exit(-1);
        } catch (MalformedURLException e) {
            System.err.println("Error:" + e);
            System.exit(-1);
        } catch (IOException e) {
            System.err.println("Error:" + e);
            System.exit(-1);
        }
        if (player != null) {
            System.out.println("Player created.");
            player.realize();
            // wait for realizing
            while (player.getState() != Player.Realized) {
                try {
                    Thread.sleep(10);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            player.start();
        } else {
            System.err.println("Player could not be created.");
            System.exit(-1);
        }
    }
}
It sounds perfectly possible to do this multicasting over a local network; AFAIK it will not work across the internet. Also be aware that device support for multicasting is apparently very patchy, so do your research and make sure the Android devices you work with actually support it at both the software and hardware level; many of them do not. Caveat emptor.
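If multicast turns out not to work on your devices, one alternative sketch (not from the original answer) is to unicast the same stream to each receiver with JMF's RTPManager. The class name, receiver addresses, and port here are placeholders:
import java.net.InetAddress;
import javax.media.protocol.DataSource;
import javax.media.rtp.RTPManager;
import javax.media.rtp.SendStream;
import javax.media.rtp.SessionAddress;

public class UnicastSender {
    // Stream the processor's output to several unicast receivers instead of
    // one multicast group.
    public static void send(DataSource ds, String[] receivers) throws Exception {
        RTPManager mgr = RTPManager.newInstance();
        mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), 22224));
        for (String host : receivers) { // e.g. {"192.168.1.111", "192.168.1.112"}
            mgr.addTarget(new SessionAddress(InetAddress.getByName(host), 22224));
        }
        SendStream stream = mgr.createSendStream(ds, 0); // track 0: the GSM audio
        stream.start();
    }
}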
We are trying to integrate sound into one of our projects. My team members don't get this error, but I get it on two different machines.
Stack trace:
Exception in thread "SoundPlayer" java.lang.IllegalArgumentException: No line matching interface Clip supporting format PCM_SIGNED 16000.0 Hz, 16 bit, stereo, 4 bytes/frame, little-endian, and buffers of 11129272 to 11129272 bytes is supported.
at javax.sound.sampled.AudioSystem.getLine(Unknown Source)
at sound.Music.run(Music.java:86)
at java.lang.Thread.run(Unknown Source)
Code:
package sound;

import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.LineEvent;
import javax.sound.sampled.LineListener;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.UnsupportedAudioFileException;

public class Music implements LineListener, Runnable {
    private File soundFile;
    private Thread thread;
    private static Music player;
    private Music audio;
    private Clip clip;

    public Music() {
    }

    public void playSiren(String musicFileName) {
        Music p = getPlayer();
        p.playSirenFile(musicFileName);
    }

    private void playSirenFile(String musicFileName) {
        this.soundFile = new File("Music/" + musicFileName + ".wav");
        thread = new Thread(this);
        thread.setName("SoundPlayer");
        thread.start();
    }

    public void run() {
        try {
            AudioInputStream stream = AudioSystem.getAudioInputStream(this.soundFile);
            AudioFormat format = stream.getFormat();
            /**
             * we can't yet open the device for ALAW/ULAW playback, convert
             * ALAW/ULAW to PCM
             */
            if ((format.getEncoding() == AudioFormat.Encoding.ULAW) || (format.getEncoding() == AudioFormat.Encoding.ALAW)) {
                AudioFormat tmp = new AudioFormat(
                        AudioFormat.Encoding.PCM_SIGNED,
                        format.getSampleRate(),
                        format.getSampleSizeInBits() * 2, format.getChannels(),
                        format.getFrameSize() * 2, format.getFrameRate(), true);
                stream = AudioSystem.getAudioInputStream(tmp, stream);
                format = tmp;
            }
            DataLine.Info info = new DataLine.Info(Clip.class, stream.getFormat(),
                    ((int) stream.getFrameLength() * format.getFrameSize()));
            clip = (Clip) AudioSystem.getLine(info);
            clip.addLineListener(this);
            clip.open(stream);
            clip.start();
            try {
                Thread.sleep(99);
            } catch (Exception e) {
            }
            while (clip.isActive() && thread != null) {
                try {
                    Thread.sleep(99);
                } catch (Exception e) {
                    break;
                }
            }
            clip.loop(99999999);
        } catch (UnsupportedAudioFileException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        }
    }

    private static Music getPlayer() {
        if (player == null) {
            player = new Music();
        }
        return player;
    }

    public void update(LineEvent event) {
    }

    public void stopClip() {
        clip.stop();
    }

    public void closeClip() {
        clip.close();
    }

    public void startClip() {
        clip.start();
    }

    public void volume(float volume) {
        /*
        FloatControl gainControl = (FloatControl) clip.getControl(FloatControl.Type.MASTER_GAIN);
        gainControl.setValue(-50.0f); // Reduce volume IN DECIBELS
        clip.start();
        */
    }
}
We call this class from domainController with
audio = new Music();
audio.playSiren("stillAliveDecent");
Does anyone have an idea how this exception can be resolved?
I tried reinstalling my editor (Eclipse), but to no avail.
Thanks a lot in advance.
Edit
We just tried switching the sound file. We ran it with a much smaller file, and it now works, but once we switch back to the larger .wav file (10+ MB) I get the exception again.
Using only smaller files is not really an option, as we would like to use some self-made songs which are quite long.
Edit 2
I'm quite sure it isn't a corrupted wav. We recreated it, and even used another wave of similar length and size, and I'm still the only one getting this error.
Some extra requested info:
OS: Windows 7 64bit Ultimate
JDK: 1.6.0_22
Edit 3
After some wave creating and playing, we have come to the conclusion that for some reason I can't play waves larger than 2 MB.
Still, why aren't my teammates affected by this?
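One note that may explain the size limit (an observation of mine, not from the answers below): a Clip loads the whole file into memory and requests a line buffer exactly the size of the decoded data (the 11129272 bytes in the stack trace above), which some mixers refuse. Streaming through a SourceDataLine, as in the Sound class from the earlier question, avoids the large buffer entirely. A minimal sketch:
import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class StreamPlayer {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File(args[0]));
        SourceDataLine line = AudioSystem.getSourceDataLine(in.getFormat());
        line.open(in.getFormat()); // default buffer, only a fraction of a second
        line.start();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf, 0, buf.length)) != -1) {
            line.write(buf, 0, n); // blocks, so memory use stays small
        }
        line.drain();
        line.close();
    }
}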
I was experiencing this same problem on a Raspberry Pi. It would play the first five files just fine, then I'd get the error. It turned out that I was not closing the clip when I needed to.
Clip clip = AudioSystem.getClip();
clip.addLineListener(event -> {
    if (LineEvent.Type.STOP.equals(event.getType())) {
        clip.close();
    }
});
ByteArrayInputStream audioBytes = new ByteArrayInputStream(SOUNDS.get(file));
AudioInputStream inputStream = AudioSystem.getAudioInputStream(audioBytes);
clip.open(inputStream);
clip.start();
After adding the line listener and closing the clip when it stopped, the errors went away.
You can actually play sounds above 40 MB if needed; that's how far I went :p. The problem is mostly Eclipse, and to be more exact the .metadata folder in your workspace; I think it's like a small plugin that only gets loaded properly half of the time. So the problem lies with your editor and not the code; the code above works perfectly, since I could play songs without any trouble. Make sure your paths are correct, and try to get a correct version of the .metadata folder, and you should be fine. A friend of mine had the same problem, and I gave him my copy of the workspace and .metadata and it worked perfectly.