Play back audio streamed over HTTP - Java

I am working on a small instant messenger project that implements a custom encryption algorithm I've developed. However, networking isn't my strong area.
Basically, I'm trying to provide a synchronized audio output stream using a one-to-many architecture.
So far, I've managed to pipe the audio out over an HTTP connection in Base64-encoded form, but this is where I'm stuck:
I have no idea how to play the audio back in real time without reading the same audio data twice (overlap).
Audio server
Here's my server-side code. Please be kind if I've mucked up the whole thing, but I think I've got this part working correctly.
/*
* Decompiled with CFR 0.139.
*/
package SIM.net.client.networking;
import DARTIS.crypt;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.TargetDataLine;
import com.sun.org.apache.xml.internal.security.utils.Base64;
public class audioServer {
    public static void start(String[] key) {
        AudioFormat format = new AudioFormat(8000.0f, 16, 1, true, true);
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
        TargetDataLine microphone = null;
        try {
            microphone = (TargetDataLine) AudioSystem.getLine(info);
            microphone.open(format);
        } catch (LineUnavailableException e) {
            e.printStackTrace();
            return;
        }
        final int CHUNK_SIZE = 1024;
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] data = new byte[microphone.getBufferSize() / 5];
        microphone.start();
        int bytesRead = 0;
        while (true) {
            if (bytesRead >= 4096) {
                // roughly a quarter second of 8 kHz 16-bit mono audio per chunk
                byte[] audioData = out.toByteArray();
                String base64Audio = Base64.encode(audioData);
                String audioclip;
                if (key.length > 9999) {
                    audioclip = crypt.inject(base64Audio, key);
                } else {
                    audioclip = base64Audio;
                }
                audioHandler.setdata(audioclip);
                bytesRead = 0;
                out.reset();
            } else {
                int numBytesRead = microphone.read(data, 0, CHUNK_SIZE);
                System.out.println(bytesRead += numBytesRead);
                out.write(data, 0, numBytesRead);
            }
        }
    }
}
Audio handler
package SIM.net.client.networking;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;
import com.sun.net.httpserver.HttpServer;
public class audioHandler implements HttpHandler {
    public static String audiodata;

    public static void setdata(String imgdta) {
        audiodata = imgdta;
    }

    public void handle(HttpExchange he) throws IOException {
        // unused; requests are served by MyHandler below
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(9991), 0);
        server.createContext("/audio", new MyHandler());
        server.setExecutor(null); // creates a default executor
        server.start();
        audioServer.start(new String[3]);
    }

    static class MyHandler implements HttpHandler {
        @Override
        public void handle(HttpExchange he) throws IOException {
            String payload = audiodata.replace("\n", "").replace("\r", "");
            he.sendResponseHeaders(200, payload.length());
            OutputStream os = he.getResponseBody();
            os.write(payload.getBytes());
            os.close();
        }
    }
}
Please understand, this code was originally written to stream webcam snapshots over HTTP in real time, one frame at a time. If this design isn't suitable for audio streaming, please point me in the right direction. I usually learn best from running examples, editing them, and observing the changes in their output, so any sample/example code would help greatly. (I'm not asking you to solve it for me 100%, just some pointers in the right direction and example code.)
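One direction that fits the code above: have the client poll the /audio context, skip chunks it has already played, decode the Base64, and feed the raw PCM to a SourceDataLine. Below is a minimal sketch, assuming the server above is reachable at http://localhost:9991/audio, that no encryption is applied (the key.length > 9999 branch is not taken), and that the playback format matches the server's capture format; these are assumptions, not part of the original code.

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.URL;
import java.util.Base64;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class AudioClient {
    public static void main(String[] args) throws Exception {
        // Must match the server's capture format: 8 kHz, 16-bit, mono, signed, big-endian.
        AudioFormat format = new AudioFormat(8000.0f, 16, 1, true, true);
        SourceDataLine speaker = AudioSystem.getSourceDataLine(format);
        speaker.open(format);
        speaker.start();
        String lastChunk = "";
        while (true) {
            String chunk = fetch("http://localhost:9991/audio"); // assumed server address
            // Crude de-duplication: only play a chunk we have not seen yet.
            if (!chunk.isEmpty() && !chunk.equals(lastChunk)) {
                byte[] pcm = Base64.getDecoder().decode(chunk);
                speaker.write(pcm, 0, pcm.length); // blocks until the data is buffered
                lastChunk = chunk;
            }
            Thread.sleep(50); // poll faster than the server produces chunks
        }
    }

    private static String fetch(String url) throws Exception {
        try (InputStream in = new URL(url).openStream()) {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] b = new byte[4096];
            int n;
            while ((n = in.read(b)) != -1) {
                buf.write(b, 0, n);
            }
            return buf.toString("US-ASCII");
        }
    }
}

Comparing chunk contents is a crude de-duplication key (two identical consecutive chunks would be dropped); a sequence number sent alongside each chunk would be more reliable. Longer term, a design that suits audio better is to keep a single HTTP response open and write chunks continuously (chunked transfer encoding), or to skip HTTP and stream the raw PCM over a plain socket, so the client never re-reads the same data at all.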

Related

Reading Outbound messages from HTTP Streaming Connection in JAVA

I am implementing an integration client which will connect to an API gateway and send requests to it. In short, I am the consumer of the API.
The architecture the other party has built requires that I open an HTTP streaming connection and read the outbound messages sent by the system.
An HTTP streaming connection is closed automatically if it stays idle for a while, so we have to send continuous heartbeat requests to the system to keep the stream connection alive.
Now I have the following queries:
Do I have to use URLConnection.openConnection to open the streaming channel?
How can I keep the stream open by sending requests? Do I have to open a separate thread to achieve this?
What are outbound messages? Does the term refer to the responses I receive from the API?
How can I test this before getting the actual testing setup? Is there any dummy endpoint available on the internet where I can open a stream and read data? If yes, please suggest a link.
I have to read JSON data.
I have been provided with the following example code for HTTP streaming:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.lang.invoke.MethodHandles;
import java.net.URL;
import java.net.URLConnection;
import java.net.URLDecoder;
import javax.ws.rs.core.HttpHeaders;
import org.apache.commons.lang.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.gson.Gson;
/** Sample of stream processing: open the stream and start a StreamReader to read it in a separate thread. */
public class StreamReader implements Runnable {
    private static final Logger log = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());
    private final BufferedReader in;
    private final boolean toConsole;
    private final String serviceName;
    private final Crypto crypto;

    StreamReader(BufferedReader in, boolean toConsole, String serviceName, Crypto crypto) {
        this.in = in;
        this.toConsole = toConsole;
        this.serviceName = serviceName;
        this.crypto = crypto;
    }

    @Override
    public void run() {
        String inputLine;
        try {
            stop:
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    while (!StringUtils.isEmpty(inputLine = in.readLine())) {
                        if (inputLine.startsWith(serviceName + ":")) {
                            log.info("Stream message (outgoing from IPS):");
                            if (toConsole)
                                System.out.println("Stream message (outgoing from IPS):");
                            String[] s = inputLine.split(":");
                            if (s.length >= 2) {
                                log.debug("Encoded:");
                                log.debug(inputLine);
                                log.trace("Decoded:");
                                String decoded = URLDecoder.decode(s[1], "UTF-8");
                                log.trace(decoded);
                                NReply n = new Gson().fromJson(decoded, NReply.class);
                                log.info("Readable:");
                                log.info(n.o.toString());
                                if (toConsole) {
                                    System.out.println("Readable:");
                                    System.out.println(n.o.toString());
                                }
                                if (s.length > 2) {
                                    int lastSemicolon = inputLine.lastIndexOf(':');
                                    log.debug("Signed data:");
                                    final String signedData = inputLine.substring(0, lastSemicolon);
                                    log.debug(signedData);
                                    log.debug("base64 encoded signature:");
                                    log.debug(s[2]);
                                    try {
                                        //crypto.checkSignSep(signedData, s[2]);
                                    } catch (Exception e) {
                                        log.error(e.getMessage(), e);
                                    }
                                }
                            } else {
                                log.error("Invalid stream message:");
                                log.error(inputLine);
                                if (toConsole) {
                                    System.out.println("Invalid stream message:");
                                    System.out.println(inputLine);
                                }
                            }
                        } else {
                            log.debug("View's info:");
                            log.debug(inputLine);
                        }
                        if ("CLOSED".equals(inputLine)) {
                            Thread.currentThread().interrupt();
                            break stop;
                        }
                    }
                    Thread.sleep(1000L);
                } catch (InterruptedException e) {
                    log.error(e.getMessage(), e);
                    Thread.currentThread().interrupt();
                    break;
                } catch (Exception e) {
                    log.error(e.getMessage(), e);
                    break;
                }
            }
            log.info("polling has been stopped");
        } finally {
            try {
                in.close();
            } catch (IOException e) {
                log.error(e.getMessage(), e);
            }
        }
    }

    public void openStream(boolean toConsole) throws IOException {
        //final URL stream = new URL(hostName + "/stream");
        final URL stream = new URL("url");
        final URLConnection uc = stream.openConnection();
        //uc.setRequestProperty(HttpHeaders.COOKIE, "JSESSIONID=" + heartBeat.getSessionId());
        uc.setRequestProperty(HttpHeaders.COOKIE, "JSESSIONID=" + "SessionIDFROMSERVICE");
        final BufferedReader in = new BufferedReader(new InputStreamReader(uc.getInputStream()));
        StreamReader reader = new StreamReader(in, toConsole, serviceName, crypto);
        Thread thread = new Thread(reader);
        thread.setDaemon(true);
        thread.start();
    }
}
Any sort of help is appreciated. :)
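On the first two queries: URLConnection.openConnection() does work for the read side, exactly as openStream() above uses it, and the heartbeat is normally sent from its own thread so the blocking readLine() loop is left undisturbed. Here is a minimal sketch of such a heartbeat thread; the endpoint path, the cookie-based session, and the 30-second interval are assumptions to check against the gateway's documentation.

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class HeartBeat implements Runnable {
    private final String heartbeatUrl; // hypothetical endpoint; use whatever the gateway documents
    private final String sessionId;

    public HeartBeat(String heartbeatUrl, String sessionId) {
        this.heartbeatUrl = heartbeatUrl;
        this.sessionId = sessionId;
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                // A plain GET is enough to reset the server's idle timer.
                HttpURLConnection uc = (HttpURLConnection) new URL(heartbeatUrl).openConnection();
                uc.setRequestProperty("Cookie", "JSESSIONID=" + sessionId);
                try (InputStream in = uc.getInputStream()) {
                    while (in.read() != -1) {
                        // drain and discard the response body
                    }
                }
                Thread.sleep(30000L); // interval is an assumption; check the API docs
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}

It would be started alongside the reader, for example new Thread(new HeartBeat(hostName + "/heartbeat", sessionId)).start(), with hostName and sessionId supplied by the gateway.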

Is it possible to get word timings from text to speech in Watson Java API?

My teacher gave this Java example of how to generate speech from text and save it to a WAV file. He asked us to modify it to save word timings to disk. I don't see any option to do this in SynthesizeOptions (http://watson-developer-cloud.github.io/java-sdk/docs/java-sdk-7.2.0/com/ibm/watson/text_to_speech/v1/model/SynthesizeOptions.Builder.html) even though the API docs say it is possible: https://cloud.ibm.com/docs/services/text-to-speech?topic=text-to-speech-timing#timingRequest
Authenticator authenticator = new IamAuthenticator("api_key");
TextToSpeech textToSpeech = new TextToSpeech(authenticator);
try {
    SynthesizeOptions synthesizeOptions = new SynthesizeOptions.Builder()
            .text(text)
            .accept("audio/wav")
            .voice("pt-BR_IsabelaV3Voice")
            .timings(words)
            .build();
    // a callback is defined to handle certain events, like an audio transmission or a timing marker
    // in this case, we'll build up a byte array of all the received bytes to build the resulting file
    final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    textToSpeech.synthesizeUsingWebSocket(synthesizeOptions, new BaseSynthesizeCallback() {
        @Override
        public void onAudioStream(byte[] bytes) {
            // append to our byte array
            try {
                byteArrayOutputStream.write(bytes);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });
    // quick way to wait for synthesis to complete, since synthesizeUsingWebSocket() runs asynchronously
    try {
        Thread.sleep(5000);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    // create file with audio data
    String filename = id + ".wav";
    OutputStream fileOutputStream = new FileOutputStream(filename);
    byteArrayOutputStream.writeTo(fileOutputStream);
    // clean up
    byteArrayOutputStream.close();
    fileOutputStream.close();
} catch (IOException e) {
    e.printStackTrace();
}
You need to think outside the box. Word timings are normally a feature of speech-to-text services, not text-to-speech ones; here, though, the Text to Speech WebSocket interface does report timing marks through the onTimings() callback, as the code below shows.
package com.watsontest;

import java.io.*;
import java.util.ArrayList;
import com.google.gson.Gson;
import com.ibm.cloud.sdk.core.http.HttpMediaType;
import com.ibm.cloud.sdk.core.security.Authenticator;
import com.ibm.cloud.sdk.core.security.IamAuthenticator;
import com.ibm.watson.speech_to_text.v1.SpeechToText;
import com.ibm.watson.speech_to_text.v1.model.RecognizeOptions;
import com.ibm.watson.speech_to_text.v1.model.SpeechRecognitionResults;
import com.ibm.watson.text_to_speech.v1.TextToSpeech;
import com.ibm.watson.text_to_speech.v1.model.SynthesizeOptions;
import com.ibm.watson.text_to_speech.v1.model.Timings;
import com.ibm.watson.text_to_speech.v1.websocket.BaseSynthesizeCallback;

public class Main {
    public void geraVoz(String id, String text, ArrayList<String> words) {
        Authenticator authenticator = new IamAuthenticator("API_KEY_HERE");
        TextToSpeech textToSpeech = new TextToSpeech(authenticator);
        ArrayList<String> arrayList = new ArrayList<String>();
        arrayList.add("words"); // "words" requests a timing mark for every word
        final ArrayList<Timings> timingsArrayList = new ArrayList<Timings>();
        try {
            SynthesizeOptions synthesizeOptions = new SynthesizeOptions.Builder()
                    .text(text)
                    .accept("audio/wav")
                    .voice("pt-BR_IsabelaV3Voice")
                    .timings(arrayList)
                    .build();
            // a callback is defined to handle certain events, like an audio transmission or a timing marker
            // in this case, we'll build up a byte array of all the received bytes to build the resulting file
            final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
            textToSpeech.synthesizeUsingWebSocket(synthesizeOptions, new BaseSynthesizeCallback() {
                @Override
                public void onAudioStream(byte[] bytes) {
                    // append to our byte array
                    try {
                        byteArrayOutputStream.write(bytes);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onTimings(Timings timings) {
                    timingsArrayList.add(timings);
                }

                @Override
                public void onDisconnected() {
                    System.out.println("disconnected!");
                    String json = new Gson().toJson(timingsArrayList);
                    try {
                        PrintWriter out = new PrintWriter("timings.json");
                        out.println(json);
                        out.close();
                    } catch (Exception e) {
                        System.out.println(e);
                    }
                }
            });
            // quick way to wait for synthesis to complete, since synthesizeUsingWebSocket() runs asynchronously
            try {
                Thread.sleep(5000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            // create file with audio data
            String filename = id + ".wav";
            OutputStream fileOutputStream = new FileOutputStream(filename);
            byteArrayOutputStream.writeTo(fileOutputStream);
            System.out.println(synthesizeOptions.timings()); // the timing types that were requested
            // clean up
            byteArrayOutputStream.close();
            fileOutputStream.close();
            System.out.println("recorded file");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        new Main().geraVoz("id1", "testando transcrição de voz. Olá isso é um teste", null);
    }
}

javax.sound.sampled.LineUnavailableException on attempt to record sound

I am making a Java voice chat program, and the server-side voice class is throwing this error:
javax.sound.sampled.LineUnavailableException: line with format PCM_SIGNED 8000.0 Hz, 8 bit, mono, 1 bytes/frame, not supported.
at com.sun.media.sound.DirectAudioDevice$DirectDL.implOpen(DirectAudioDevice.java:513)
at com.sun.media.sound.AbstractDataLine.open(AbstractDataLine.java:121)
at com.sun.media.sound.AbstractDataLine.open(AbstractDataLine.java:153)
at client.VoiceUser.run(VoiceUser.java:34)
The error is thrown on this line of code:
microphone.open(audioformat);
Code:
package client;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.net.Socket;
import java.util.ArrayList;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.TargetDataLine;
public class VoiceUser extends Thread {
    private ObjectOutputStream clientOutput;
    private TargetDataLine microphone;
    private ArrayList<ObjectOutputStream> vOutputArray = new ArrayList<ObjectOutputStream>();
    private AudioFormat audioformat;

    public VoiceUser(Socket sv, ArrayList<ObjectOutputStream> outputArray) throws LineUnavailableException {
        try {
            clientOutput = new ObjectOutputStream(sv.getOutputStream());
            vOutputArray.add(clientOutput);
        } catch (IOException e) {
            System.out.println("Can't create stable connection between server and client");
        }
    }

    public void run() {
        try {
            // build the format first, then look up the line with it
            audioformat = new AudioFormat(8000.0f, 8, 1, true, false);
            DataLine.Info info = new DataLine.Info(TargetDataLine.class, audioformat);
            microphone = (TargetDataLine) AudioSystem.getLine(info);
            microphone.open(audioformat);
            microphone.start();
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        }
        int bytesRead = 0;
        byte[] soundData = new byte[3072];
        int offset = 0;
        while (bytesRead != -1) {
            bytesRead = microphone.read(soundData, 0, soundData.length);
            if (bytesRead >= 0) {
                offset += bytesRead;
                if (offset == soundData.length) {
                    send(soundData, bytesRead);
                    offset = 0;
                }
            }
        }
    }

    public void send(byte[] soundData, int bytesRead) {
        for (ObjectOutputStream o : vOutputArray) {
            try {
                o.write(soundData, 0, bytesRead);
                o.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
It tells you that the line is not available. There are a couple of possible reasons. One: your sound card does not support recording at the specific format you specified. You can try different formats; for example, 22050 Hz might be supported.
Another: the sound card is already busy recording. On Windows you cannot record with two microphones simultaneously, so you may need to redesign your server to allow just a single recording session.
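To find out which capture formats the default mixer accepts before opening a line, you can probe with AudioSystem.isLineSupported(). A minimal sketch; the set of sample rates and sizes tried here is arbitrary:

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.TargetDataLine;

public class FormatProbe {
    public static void main(String[] args) {
        float[] rates = {8000f, 11025f, 16000f, 22050f, 44100f};
        int[] sampleSizes = {8, 16};
        for (float rate : rates) {
            for (int bits : sampleSizes) {
                // signed PCM, mono, little-endian, like the format in the question
                AudioFormat fmt = new AudioFormat(rate, bits, 1, true, false);
                DataLine.Info info = new DataLine.Info(TargetDataLine.class, fmt);
                System.out.println(fmt + " -> supported: " + AudioSystem.isLineSupported(info));
            }
        }
    }
}

Opening the first supported format (16-bit formats are far more widely supported than 8-bit) usually makes this exception go away.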

IllegalArgumentException with opening .wav files in Java

I got a bunch of audio files from a client, which at some point in the program need to play. Only a few of these .wav files play when I call them; most don't and give me an IllegalArgumentException. I tried many different code examples of how to play .wav files in Java, but none worked for me and I don't know why. Currently I'm using this, which came from another Stack Overflow question/answer:
import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
/**
 * Handles playing, stopping, and looping of sounds for the game.
 * @author Tyler Thomas
 *
 */
public class Sound {
    private final int BUFFER_SIZE = 128000;
    private AudioInputStream audioStream;
    private SourceDataLine sourceLine;

    /**
     * @param filename the name of the file that is going to be played
     */
    public void playSound(String filename) {
        try {
            audioStream = AudioSystem.getAudioInputStream(new File(filename));
        } catch (Exception e) {
            e.printStackTrace();
        }
        try {
            sourceLine = (SourceDataLine) AudioSystem.getLine(new DataLine.Info(SourceDataLine.class, audioStream.getFormat()));
            sourceLine.open(audioStream.getFormat());
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
        sourceLine.start();
        int nBytesRead = 0;
        byte[] abData = new byte[BUFFER_SIZE];
        while (nBytesRead != -1) {
            try {
                nBytesRead = audioStream.read(abData, 0, abData.length);
            } catch (IOException e) {
                e.printStackTrace();
            }
            if (nBytesRead >= 0) {
                @SuppressWarnings("unused")
                int nBytesWritten = sourceLine.write(abData, 0, nBytesRead);
            }
        }
        sourceLine.drain();
        sourceLine.close();
    }
}
If I use this to play my file, I get the following error:
java.lang.IllegalArgumentException: No line matching interface SourceDataLine supporting format PCM_FLOAT 44100.0 Hz, 32 bit, stereo, 8 bytes/frame, is supported.
at javax.sound.sampled.AudioSystem.getLine(Unknown Source)
at Sound.playSound(Sound.java:29)
Also, when I open the file with a regular media player it plays without problems, so the files are not corrupted or anything.
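The error message shows why: the failing files are 32-bit float WAVs (PCM_FLOAT), and sound hardware typically only exposes signed-integer PCM lines, while desktop players convert internally. One option, assuming the Java runtime ships a PCM_FLOAT converter (recent JDKs do; if not, the conversion call below throws the same IllegalArgumentException): ask Java Sound for a signed 16-bit view of the stream and open the line with that format instead. A minimal sketch; the file name is illustrative.

import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class ConvertAndPlay {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("clip.wav"));
        AudioFormat src = in.getFormat();
        // Target: signed 16-bit PCM with the same rate and channel count.
        AudioFormat target = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                src.getSampleRate(), 16, src.getChannels(),
                src.getChannels() * 2, src.getSampleRate(), false);
        // Throws IllegalArgumentException if no converter is installed.
        AudioInputStream pcm = AudioSystem.getAudioInputStream(target, in);
        SourceDataLine line = AudioSystem.getSourceDataLine(target);
        line.open(target);
        line.start();
        byte[] buf = new byte[4096];
        int n;
        while ((n = pcm.read(buf, 0, buf.length)) != -1) {
            line.write(buf, 0, n);
        }
        line.drain();
        line.close();
    }
}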

Recording keyboard, mouse activity in linux

I need some way to detect mouse/keyboard activity on Linux, record it, and send the record to my Android tablet over a TCP socket. I'm running this program in a terminal and it shows the error Exception in thread "main" java.lang.UnsupportedClassVersionError: Mouse : Unsupported major.minor version 51.0 (the class was compiled for Java 7 but is being run on an older JRE). Any help?
import java.awt.MouseInfo;
import java.awt.Point;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketException;
import java.net.UnknownHostException;

public class Mouse {
    public static void main(String[] args) throws InterruptedException {
        Point p, prev_p;
        p = MouseInfo.getPointerInfo().getLocation();
        DatagramSocket socket = null;
        try {
            socket = new DatagramSocket(8988);
        } catch (SocketException e) {
            e.printStackTrace();
        }
        InetAddress addr = null;
        try {
            addr = InetAddress.getByName("107.108.203.204");
        } catch (UnknownHostException e) {
            e.printStackTrace();
        }
        File file = new File("/sys/kernel/debug/usb/usbmon/6u");
        BufferedReader br = null;
        try {
            br = new BufferedReader(new FileReader(file));
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
            System.err.println("To fix the error, run as root or change ownership of the file to the user who runs this program");
        }
        String line, s = null;
        try {
            while ((line = br.readLine()) != null) {
                prev_p = p;
                p = MouseInfo.getPointerInfo().getLocation();
                String[] arr = line.split(" ");
                if (arr.length == 8)
                    s = arr[7];
                System.out.println(s + " " + Integer.parseInt(s.substring(2, 4), 16));
                byte[] buffer = s.getBytes();
                DatagramPacket pak = new DatagramPacket(buffer, buffer.length, addr, 8988);
                try {
                    socket.send(pak);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I shall not go into the basics of how to use a TCP socket; that is simple enough.
The core of your question is that you will need to open and constantly read the /dev/input/by-id/yourmouseorkeyboardnamehere file. Reading this file will cause your program to block until there is keyboard/mouse input (depending on whether you read the keyboard or the mouse file), at which point you can read data representing what came from the device.
From there it should be fairly easy to send this data over a TCP socket to your tablet; you can learn how from any sockets tutorial on the internet. A minimal sketch of that loop follows.
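This is only a sketch under assumptions: the device path below is a placeholder (pick a real entry from /dev/input/by-id/), the tablet address and port are invented, and reading the device node usually requires root. Each kernel input event is a fixed-size record (24 bytes on 64-bit Linux), which is simply forwarded as-is.

import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;

public class InputForwarder {
    public static void main(String[] args) throws IOException {
        // Hypothetical device path: substitute your own from /dev/input/by-id/
        String device = "/dev/input/by-id/usb-Example_Mouse-event-mouse";
        try (FileInputStream in = new FileInputStream(device);
             Socket socket = new Socket("192.168.0.10", 8988); // tablet address: an assumption
             OutputStream out = socket.getOutputStream()) {
            byte[] event = new byte[24]; // struct input_event is 24 bytes on 64-bit Linux
            int n;
            // read() blocks until the kernel delivers an input event
            while ((n = in.read(event)) != -1) {
                out.write(event, 0, n);
                out.flush();
            }
        }
    }
}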
If you have any questions or need more detail, please comment below.
