Audio recording and playback with libgdx - java

I was wondering if anyone has had any luck with audio recording and playback using libgdx. I'm currently using 0.9.8 and just messing around trying to make a simple voice chat app.
I create my audio device using
int samples = 44100;
int seconds = 5;
boolean isMono = true;
short[] data = new short[samples*seconds];
AudioRecorder recorder = Gdx.audio.newAudioRecorder(samples, isMono);
AudioDevice player = Gdx.audio.newAudioDevice(samples, isMono);
On a button press I kick off a new thread to record the data. (I know this is a messy way to kick off a new thread, but I'm only playing with audio recording.)
new Thread(new Runnable() {
    @Override
    public void run() {
        System.out.println("Record: Start");
        recorder.read(data, 0, data.length);
        System.out.println("Record: End");
    }
}).start();
After the recording I play back the recorded data using:
new Thread(new Runnable() {
    @Override
    public void run() {
        System.out.println("Play : Start");
        player.writeSamples(data, samples, data.length);
        System.out.println("Play : End");
    }
}).start();
On my laptop the recording and playback seem to work fine: I can record the data and play it back without a problem.
The problem happens on Android. I've tried it on three devices (a Samsung Galaxy S3, a Samsung Galaxy Mini, and a Nexus 10). In all cases the recording works perfectly; the issue occurs when I attempt the playback. It just locks up in player.writeSamples and nothing is played. I've left it for 10 minutes and it never prints "Play : End".
Has anyone ever got audio playback working? Am I missing something?

After a bit of playing around, I found the issue.
Your line, here:
player.writeSamples(data, samples, data.length);
should instead be:
player.writeSamples(data, 0, data.length);
The second argument is the offset, not the sample rate.
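For reference, the libgdx AudioDevice method is declared as follows (as of the 0.9.x API), with the offset before the sample count:
// com.badlogic.gdx.audio.AudioDevice
public void writeSamples(short[] samples, int offset, int numSamples);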
So, to sum up, the following works on Android devices:
final int samples = 44100;
boolean isMono = true;
final short[] data = new short[samples * 5];
final AudioRecorder recorder = Gdx.audio.newAudioRecorder(samples, isMono);
final AudioDevice player = Gdx.audio.newAudioDevice(samples, isMono);
new Thread(new Runnable() {
    @Override
    public void run() {
        System.out.println("Record: Start");
        recorder.read(data, 0, data.length);
        recorder.dispose();
        System.out.println("Record: End");

        System.out.println("Play : Start");
        player.writeSamples(data, 0, data.length);
        System.out.println("Play : End");
        player.dispose();
    }
}).start();


Android if statement in runnable not always firing

I am trying to write a simple little thing that stops an audio file from being played when the user "interrupts" it by talking to the phone. I'm using the SoundMeter class that this person wrote to get the max amplitude from the microphone. I've set a threshold of 800 and want to test the microphone amplitude against it constantly (every 50 ms).
For some reason, it's hit or miss when I get the "interrupt" to show up and when the audio stops. I'd like for the interrupt to be showing the entire time the amplitude is above 800.
handler = new Handler();
final Runnable r = new Runnable() {
    @Override
    public void run() {
        mic_amplitude.setText("" + mSoundMeter.getAmplitude());
        if (mSoundMeter.getAmplitude() > threshold) {
            Log.d("INTERRUPT", "INTERRUPT");
            interrupt.setText("Interrupt");
            mBackgroundAudio.pause(mp);
            audioButton.setText("Resume Audio");
        } else {
            interrupt.setText("");
        }
        handler.postDelayed(this, 50);
    }
};
handler.postDelayed(r, 50);
While viewing this, I'm able to see the amplitude at a steady 40-50 while there is no background noise, with peaks of 1200-1500 when talking quietly. I'd like the interrupt to show any time the amplitude goes above 800, but it's currently only showing up intermittently.
Ok, I figured it out. I tested what my amplitude was by logging it along with the interrupt, and I saw I was getting 0. I realized I had been testing against a fresh amplitude reading, different from the one I was displaying (if SoundMeter wraps MediaRecorder.getMaxAmplitude(), each call returns the maximum since the previous call, so a second call in the same pass returns roughly 0). So I assigned the amplitude to a variable and used that variable everywhere else.
Here's my outcome:
handler = new Handler();
final Runnable r = new Runnable() {
    @Override
    public void run() {
        mAmplitude = mSoundMeter.getAmplitude();
        mic_amplitude.setText("" + mAmplitude);
        if (mAmplitude > threshold) {
            Log.d("INTERRUPT", "INTERRUPT " + mAmplitude);
            interrupt.setText("Interrupt");
            mBackgroundAudio.pause(mp);
            audioButton.setText("Resume Audio");
        } else {
            interrupt.setText("");
        }
        handler.postDelayed(this, 100);
    }
};
handler.postDelayed(r, 100);

Webcam application won't close after exiting - OpenCV 2.4.8, Java

I just finished an app that has a JPanel showing images from the webcam; I used OpenCV 2.4.8 and VideoCapture. When I close the frame in NetBeans, the program keeps running, so I have to stop it with NetBeans' stop button. When I run the app's *.jar in Windows, it works fine the first time, but when I close it and open it again there is a problem with the camera, as if it were already in use. Any ideas?
Here is a piece of the code:
public static void main(String[] args) {
    Permisos objPer = new Permisos();
    objPer.setLocationRelativeTo(null);
    objPer.setBounds(0, 0, 468, 328);
    objPer.setVisible(true);
    objPer.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    webFrame panel = new webFrame();
    pan1.add(panel);
    Mat webcam = new Mat();
    video = new VideoCapture(0);
    if (video.isOpened()) {
        while (true) {
            video.read(webcam);
            panel.MatToBufferedImage(webcam);
            panel.setSize(webcam.width() + 40, webcam.height() + 60);
            panel.repaint();
        }
    } else {
        System.out.println("no video open");
    }
}
I can get the camera to work again when I unplug it and plug it back into the PC.
Got it working! The VideoCapture class has a method named release(); I just needed to call it when the app exits. Thanks for the help, Mayur!
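For reference, a minimal sketch of one way to wire that up, reusing the objPer frame and video capture from the question (names are the question's own, and this assumes both are reachable where the listener is registered):
objPer.addWindowListener(new java.awt.event.WindowAdapter() {
    @Override
    public void windowClosing(java.awt.event.WindowEvent e) {
        // Free the device before EXIT_ON_CLOSE shuts the JVM down,
        // so the next launch can open the camera again.
        if (video != null && video.isOpened()) {
            video.release();
        }
    }
});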

Very slow frame rate when calling HoughCircles() method from onCameraFrame, OpenCV, Android/Java

I get extremely slow frame rates when using this OpenCV Java method in Android for detecting circular shaped objects in images:
Imgproc.HoughCircles(mGray, circles, Imgproc.CV_HOUGH_GRADIENT, 1, 50);
When I remove this method it runs fast, but after adding this method inside of this callback:
public Mat onCameraFrame(final CvCameraViewFrame inputFrame) {
the frame rate slows to 1 to 2 frames per second. I don't understand why it gets so slow. I tried putting this method in a separate thread and it did not help; the only thing that worked is to use a counter and an if statement to run the method every 10 frames.
In the OpenCV examples there is a sample project called face detection, and it has both native C++ and Java camera versions, and both are very fast. How is it possible that when I use similar code I get this slow, constipated action from OpenCV?
Is there something I am doing wrong here? In the face detection project from the OpenCV examples they take every frame and they don't launch a separate thread. How do I fix this problem and make my code run fast like the sample projects in OpenCV?
I am also having the same slow frame rate problem in a different practice project where I am not using OpenCV, just the Android Camera class: I take the image from the onPreviewFrame(byte[] data, Camera camera) method, do some light processing like converting the YUV format from the byte array into a bitmap, and put that into another view on the same screen as the camera view, and the result is a very slow frame rate.
EDIT: In some additional experimentation I added the Imgproc.HoughCircles() method to the OpenCV face detection sample project, putting this method inside the onCameraFrame method of the Java detector.
The result is the same as in my project: it became very slow. So the HoughCircles method probably takes more processing power than the face detection method CascadeClassifier.detectMultiScale(). However, that does not explain why other circle detection projects I have watched on YouTube show no slowdown in their videos. That is why I think there is something wrong with what I am doing.
Here is a sample of the code I am using:
public class CircleActivity extends Activity implements CvCameraViewListener2 {

    Mat mRgba;
    Mat mGray;
    File mCascadeFile;
    CascadeClassifier mJavaDetector;
    CameraBridgeViewBase mOpenCvCameraView;
    LinearLayout linearLayoutOne;
    ImageView imageViewOne;
    int counter = 0;

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS: {
                    Log.i("OPENCV", "OpenCV loaded successfully");
                    mOpenCvCameraView.enableView();
                } break;
                default: {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        if (!OpenCVLoader.initDebug()) {
            // Handle initialization error
        }
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_coffee);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.fd_activity_surface_view);
        mOpenCvCameraView.setCvCameraViewListener(this);
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public void onResume() {
        super.onResume();
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);
    }

    public void onDestroy() {
        super.onDestroy();
        mOpenCvCameraView.disableView();
    }

    public void onCameraViewStarted(int width, int height) {
        mGray = new Mat();
        mRgba = new Mat();
    }

    public void onCameraViewStopped() {
        mGray.release();
        mRgba.release();
    }

    public Mat onCameraFrame(final CvCameraViewFrame inputFrame) {
        mRgba = inputFrame.rgba();
        mGray = inputFrame.gray();
        if (counter == 9) {
            MatOfRect circles = new MatOfRect();
            Imgproc.HoughCircles(mGray, circles, Imgproc.CV_HOUGH_GRADIENT, 1, 50);
            // returns number of circular objects found
            Log.e("circle check", "circles.cols() " + circles.cols());
        }
        counterAdder();
        return mRgba;
    } // end onCameraFrame

    public void counterAdder() {
        if (counter > 10) {
            counter = 0;
        }
        counter++;
    }
}
Reducing the resolution of the camera frames might help:
mOpenCvCameraView.setMaxFrameSize(640, 480);
From my brief experience, the running time for HoughCircles greatly depends on the image. A textured image with a lot of potential circles takes much longer than an image with a uniform background. Hope this helps.
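A related knob worth trying (my own suggestion, not from the answers above): the longer HoughCircles overload lets you raise the accumulator threshold and bound the radius range, which shrinks the search space considerably on busy images. The values below are illustrative only:
Mat circles = new Mat();
Imgproc.HoughCircles(mGray, circles, Imgproc.CV_HOUGH_GRADIENT,
        1,       // dp: inverse ratio of accumulator resolution
        50,      // minDist: minimum distance between detected centers
        100,     // param1: upper Canny edge threshold
        60,      // param2: accumulator threshold; higher = fewer candidates
        20, 100  // minRadius / maxRadius in pixels
);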
I've faced this problem too.
I've tried to decrease the camera resolution with mOpenCvCameraView.setMaxFrameSize(1280, 720);
However, it is still slow. I've been trying to work in parallel with threads, but it is still 3.5 FPS.
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    //System.gc();
    carrierMat = inputFrame.gray();
    Thread thread = new Thread(new MultThread(carrierMat, this));
    thread.start();
    try {
        // join() blocks the camera callback until the worker finishes,
        // so this is effectively still synchronous processing.
        thread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    return carrierMat;
}
My MultThread class is just like this:
public class MultThread implements Runnable {
    private Mat source;
    private Context context;

    public MultThread(Mat source, Context context) {
        this.source = source;
        this.context = context;
    }

    @Override
    public void run() {
        //output = General.Threshold(source);
        int x = General.MSERP(source);
        Log.i("MtMTxtDtc:Main", "x: " + x);
        if (x > 10) {
            ((Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE)).vibrate(500);
        }
    }
}
You have to perform the Hough circle transform in the background, not in the main activity!
Otherwise your app's response will be too slow, and it may be killed by the operating system with an Application Not Responding (ANR) error.
You need to add this class to your main activity and you are good to go.
private class HoughCircleTransformTask
        extends AsyncTask<Mat, Void, Integer> {

    @Override
    protected Integer doInBackground(Mat... mats) {
        Mat mGray = mats[0];
        MatOfRect circles = new MatOfRect();
        Imgproc.HoughCircles(mGray, circles, Imgproc.CV_HOUGH_GRADIENT, 1, 50);
        // return the number of circular objects found
        // so it can be displayed from onPostExecute()
        return circles.cols();
    }

    @Override
    protected void onPostExecute(Integer circlesCols) {
        // This is only logging;
        // you can display it in a TextView in the main activity as well.
        Log.e("circle check", "circles.cols() " + circlesCols);
    }
}
And just call it from onCameraFrame with one line of code only:
public Mat onCameraFrame(final CvCameraViewFrame inputFrame) {
    mRgba = inputFrame.rgba();
    mGray = inputFrame.gray();
    if (counter == 9) {
        // call AsyncTask
        new HoughCircleTransformTask().execute(mGray);
    }
    counterAdder();
    return mRgba;
} // end onCameraFrame

How can I close the camera with java and OpenCV?

I'm new to the world of Stack Overflow and to OpenCV programming.
I've opened my camera with some Java code, and it worked because the camera light came on, but when I tried to close the camera, I failed.
Code:
public class camera {
    public static void main(String[] args) {
        System.loadLibrary("opencv_java244");
        VideoCapture camera = new VideoCapture(0);
        if (camera.isOpened())
            System.out.println("Camera is ready!");
        else {
            System.out.println("Camera Error!");
            return;
        }
        Mat newMat = new Mat();
        try {
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            //e.printStackTrace();
        }
        camera.read(newMat);
        Highgui.imwrite("testfile.jpg", newMat);
        camera.release();
        if (camera.isOpened()) {
            System.out.println("Camera is running!");
        } else {
            System.out.println("Camera closed!");
        }
    }
}
result:
Camera is ready!
Camera closed!
I really got the picture, but the light was still on!
P.S. Every time I try to open my camera, my computer opens a program named YouCam, and I must close it manually to release the camera.
Try capture.retrieve() instead of capture.read(). Here is a snippet that works for me without even using Thread.sleep():
VideoCapture capture = new VideoCapture(0);
if (!capture.isOpened()) {
    imagePanel.add(new JLabel("Oops! Your camera is not working!"));
    return;
}
Mat frame = new Mat();
capture.retrieve(frame);
frame = FaceDetector.detect(frame);
BufferedImage image = GestureUtil.matToBufferedImage(frame);
imagePanel.setImage(image);
imagePanel.repaint();
String window_name = "Capture - Face detection.jpg";
Highgui.imwrite(window_name, frame);
capture.release();
I am using this along with Swing; however, you can ignore the Swing code. Hope this helps.
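One caveat worth adding (my note, not the answerer's): in OpenCV's API, read() is grab() plus retrieve(), so calling retrieve() on its own can hand back an empty frame on some drivers. A defensive sketch:
// read() == grab() + retrieve(); pairing them explicitly avoids
// an empty Mat on drivers that need the grab step first.
Mat frame = new Mat();
if (capture.grab()) {
    capture.retrieve(frame);
}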

How to decode H.264 video frame in Java environment

Does anyone know how to decode an H.264 video frame in a Java environment?
My network camera products support RTP/RTSP streaming.
My network camera serves standard RTP/RTSP, and it also supports "RTP/RTSP over HTTP".
RTSP : TCP 554
RTP Start Port: UDP 5000
Or use Xuggler. It works with RTP, RTMP, HTTP and other protocols, can decode and encode H.264 and most other codecs, and is actively maintained, free, and open-source (LGPL).
I found a very simple and straightforward solution based on JavaCV's FFmpegFrameGrabber class. This library allows you to play streaming media by wrapping ffmpeg in Java.
How to use it?
First, you may download and install the library, using Maven or Gradle.
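For example, with Gradle the dependency is typically the bytedeco platform artifact (coordinates below are the usual ones; the version is illustrative, so check for the current release):
// build.gradle
dependencies {
    implementation 'org.bytedeco:javacv-platform:1.5.9'
}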
Here you have a StreamingClient class that calls a SimplePlayer class, which uses a Thread to play the video.
public class StreamingClient extends Application implements GrabberListener {

    public static void main(String[] args) {
        launch(args);
    }

    private Stage primaryStage;
    private ImageView imageView;
    private SimplePlayer simplePlayer;

    @Override
    public void start(Stage stage) throws Exception {
        String source = "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov"; // the video is weird for 1 minute then becomes stable
        primaryStage = stage;
        imageView = new ImageView();

        StackPane root = new StackPane();
        root.getChildren().add(imageView);

        imageView.fitWidthProperty().bind(primaryStage.widthProperty());
        imageView.fitHeightProperty().bind(primaryStage.heightProperty());

        Scene scene = new Scene(root, 640, 480);
        primaryStage.setTitle("Streaming Player");
        primaryStage.setScene(scene);
        primaryStage.show();

        simplePlayer = new SimplePlayer(source, this);
    }

    @Override
    public void onMediaGrabbed(int width, int height) {
        primaryStage.setWidth(width);
        primaryStage.setHeight(height);
    }

    @Override
    public void onImageProcessed(Image image) {
        LogHelper.e(TAG, "image: " + image);
        Platform.runLater(() -> {
            imageView.setImage(image);
        });
    }

    @Override
    public void onPlaying() {}

    @Override
    public void onGainControl(FloatControl gainControl) {}

    @Override
    public void stop() throws Exception {
        simplePlayer.stop();
    }
}
The SimplePlayer class uses FFmpegFrameGrabber to decode frames; each frame is converted into an image and displayed in your Stage:
public class SimplePlayer {

    private static volatile Thread playThread;
    private AnimationTimer timer;
    private SourceDataLine soundLine;
    private int counter;

    public SimplePlayer(String source, GrabberListener grabberListener) {
        if (grabberListener == null) return;
        if (source.isEmpty()) return;

        counter = 0;

        playThread = new Thread(() -> {
            try {
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(source);
                grabber.start();
                grabberListener.onMediaGrabbed(grabber.getImageWidth(), grabber.getImageHeight());
                if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {
                    AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);
                    DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
                    soundLine = (SourceDataLine) AudioSystem.getLine(info);
                    soundLine.open(audioFormat);
                    soundLine.start();
                }
                Java2DFrameConverter converter = new Java2DFrameConverter();
                while (!Thread.interrupted()) {
                    Frame frame = grabber.grab();
                    if (frame == null) {
                        break;
                    }
                    if (frame.image != null) {
                        Image image = SwingFXUtils.toFXImage(converter.convert(frame), null);
                        Platform.runLater(() -> {
                            grabberListener.onImageProcessed(image);
                        });
                    } else if (frame.samples != null) {
                        ShortBuffer channelSamplesFloatBuffer = (ShortBuffer) frame.samples[0];
                        channelSamplesFloatBuffer.rewind();
                        ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);
                        for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
                            short val = channelSamplesFloatBuffer.get(i);
                            outBuffer.putShort(val);
                        }
                    }
                }
                grabber.stop();
                grabber.release();
                Platform.exit();
            } catch (Exception exception) {
                System.exit(1);
            }
        });
        playThread.start();
    }

    public void stop() {
        playThread.interrupt();
    }
}
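Note that, as posted, the converted outBuffer is never handed to the sound line, so the audio stays silent. Presumably a write belongs at the end of the samples branch, something like:
// after filling outBuffer inside the frame.samples != null branch
soundLine.write(outBuffer.array(), 0, outBuffer.capacity());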
You can use a pure Java library called JCodec (http://jcodec.org).
Decoding one H.264 frame is as easy as:
ByteBuffer bb = ... // Your frame data is stored in this buffer
H264Decoder decoder = new H264Decoder();
Picture out = Picture.create(1920, 1088, ColorSpace.YUV_420); // Allocate output frame of max size
Picture real = decoder.decodeFrame(bb, out.getData());
BufferedImage bi = JCodecUtil.toBufferedImage(real); // if you prefer an AWT image
If you want to read a frame from a container (like MP4) you can use the handy helper class FrameGrab:
int frameNumber = 150;
BufferedImage frame = FrameGrab.getFrame(new File("filename.mp4"), frameNumber);
ImageIO.write(frame, "png", new File("frame_150.png"));
Finally, here's a full, sophisticated sample:
private static void avc2png(String in, String out) throws IOException {
    SeekableByteChannel sink = null;
    SeekableByteChannel source = null;
    try {
        source = readableFileChannel(in);
        sink = writableFileChannel(out);

        MP4Demuxer demux = new MP4Demuxer(source);
        H264Decoder decoder = new H264Decoder();
        Transform transform = new Yuv420pToRgb(0, 0);

        MP4DemuxerTrack inTrack = demux.getVideoTrack();
        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];

        Picture target1 = Picture.create((ine.getWidth() + 15) & ~0xf, (ine.getHeight() + 15) & ~0xf,
                ColorSpace.YUV420);
        Picture rgb = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.RGB);
        ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);
        BufferedImage bi = new BufferedImage(ine.getWidth(), ine.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
        AvcCBox avcC = Box.as(AvcCBox.class, Box.findFirst(ine, LeafBox.class, "avcC"));

        decoder.addSps(avcC.getSpsList());
        decoder.addPps(avcC.getPpsList());

        Packet inFrame;
        int totalFrames = (int) inTrack.getFrameCount();
        for (int i = 0; (inFrame = inTrack.getFrames(1)) != null; i++) {
            ByteBuffer data = inFrame.getData();
            Picture dec = decoder.decodeFrame(splitMOVPacket(data, avcC), target1.getData());
            transform.transform(dec, rgb);
            _out.clear();
            AWTUtil.toBufferedImage(rgb, bi);
            ImageIO.write(bi, "png", new File(format(out, i)));
            if (i % 100 == 0)
                System.out.println((i * 100 / totalFrames) + "%");
        }
    } finally {
        if (sink != null)
            sink.close();
        if (source != null)
            source.close();
    }
}
I think the best solution is using "JNI + ffmpeg". In my current project, I need to play several full-screen videos at the same time in a Java OpenGL game based on libgdx. I tried almost all the free libs, but none of them had acceptable performance. So finally I decided to write my own JNI C code to work with ffmpeg. Here is the final performance on my laptop:
Environment: CPU: Core i7 Q740 @ 1.73 GHz; Video: nVidia GeForce GT 435M; OS: Windows 7 64-bit; Java: Java 7u60 64-bit
Video: h264rgb / h264 encoded, no sound, resolution: 1366 x 768
Solution: Decode: JNI + ffmpeg v2.2.2; Upload to GPU: update OpenGL texture using lwjgl
Performance: Decoding speed: 700-800 FPS; Texture uploading: about 1 ms per frame
I spent only several days to complete the first version, whose decoding speed was only about 120 FPS with an upload time of about 5 ms per frame. After several months of optimization, I got this final performance and some additional features. Now I can play several HD videos at the same time without any slowness.
Most videos in my game have a transparent background. This kind of transparent video is an mp4 file with 2 video streams: one stream stores h264rgb-encoded RGB data, and the other stores h264-encoded alpha data. So to play an alpha video, I need to decode 2 video streams, merge them together, and then upload the result to the GPU. As a result, I can play several transparent HD videos above an opaque HD video at the same time in my game.
Take a look at the Java Media Framework (JMF) - http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/formats.html
I used it a while back and it was a bit immature, but they may have beefed it up since then.
