Increase rate at which samples are taken from AudioRecord - java

Hey, so I am trying to record data from the audio device and graph it. It is recording from a piezo that outputs in the 16 kHz to 19 kHz range. The relevant code is below. My current issue is that the data isn't being calculated or read at a fast enough rate. I am no sound engineer, and a lot of the FFT work was pulled from multiple resources.
Here is a picture of what I am talking about. As you can see, I am only getting a couple of data points from the first R wave in the QRS complex. At first I was getting one; then I multiplied my min buffer size by 500 and that seemed to give me two data points. I would like to have several in that really short window of time to get an accurate reading.
TL;DR: what do I change so that I can get more frequency readings per second?
https://dl2.pushbulletusercontent.com/mugoZmNLGtbCta4Si5Pu4RUdJOgMqILK/Screenshot_20160705-094146.png
and my code...
The buffer is being added to a list that has already been declared. The FFT method I am using takes a short[].
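The declarations themselves aren't shown; a minimal sketch of what they might look like, assuming java.util imports (List, ArrayList, Collections). The synchronized wrappers are an assumption on my part, since three threads share these lists:

// Hypothetical shared declarations; names match the code below.
public volatile boolean isRecording = true;
public final List<short[]> shortList =
        Collections.synchronizedList(new ArrayList<short[]>());
public final List<Double> frequencyList =
        Collections.synchronizedList(new ArrayList<Double>());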
public int audioSource = MediaRecorder.AudioSource.MIC;
public int channelConfig = AudioFormat.CHANNEL_IN_MONO;
public int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
public AudioRecord audioRecord = null;
public int blockSize = 256; // deal with this many samples at a time
public int sampleRate = 44100; // Sample rate in Hz
public void audioRecordLoop() throws Exception {
new Thread(new Runnable() {
@Override
public void run() {
Log.e(TAG, "start audioRecordLoop");
int bufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioEncoding);
audioRecord = new AudioRecord(audioSource, sampleRate, channelConfig, audioEncoding, bufferSize * 500);
if (audioRecord.getState() != AudioRecord.STATE_INITIALIZED) {
Log.e(TAG, "AudioRecord init failed");
return;
}
final short[] buffer = new short[blockSize];
audioRecord.startRecording();
while (isRecording) {
int len = audioRecord.read(buffer, 0, blockSize);
if (len < 0) {
Log.e(TAG, "read error " + len);
break;
}
// Copy the samples out: read() reuses 'buffer', so adding the same
// array every iteration would leave the list full of references to
// one constantly-overwritten block. (Arrays is java.util.Arrays.)
shortList.add(Arrays.copyOf(buffer, len));
}
audioRecord.stop();
audioRecord.release();
}
}).start();
}
To calculate the frequency:
public void calcFrequency() {
new Thread(new Runnable() {
@Override
public void run() {
FrequencyScanner frequencyScanner = new FrequencyScanner();
while (isRecording) {
if (shortList != null && shortList.size() > 0) {
// remove(0) rather than get(0): otherwise the same first block is
// re-analyzed forever and the list never drains
short[] shorts = shortList.remove(0);
final double frequency = frequencyScanner.extractFrequency(shorts, sampleRate);
frequencyList.add(frequency);
}
}
}
}).start();
}
Then, to graph:
public void graphFrequency() {
new Thread(new Runnable() {
@Override
public void run() {
while (isRecording) {
try {
if (frequencyList != null && frequencyList.size() > 0) {
final double frequency = frequencyList.get(0);
if (frequency > 13000) {
Log.d(TAG, "run: " + frequency);
runOnUiThread(new Runnable() {
@Override
public void run() {
series.appendData(new DataPoint(series.getHighestValueX() + 1, frequency), true, 3000);
}
});
}
frequencyList.remove(0);
}
} catch (Exception e) {
e.printStackTrace();
}
try {
Thread.sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
}).start();
}
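For scale: blockSize = 256 at 44100 Hz yields at most 44100 / 256 ≈ 172 FFT blocks per second, and the Thread.sleep(10) in the graphing loop caps plotting at about 100 points per second. A sketch of one way to raise the estimate rate — sliding the FFT window by a small hop instead of a full block — assuming only that FrequencyScanner.extractFrequency(short[], int) accepts any window of samples (the hop size and chunking are my assumptions, not from the code above):

// Sketch: analyze overlapping windows inside each recorded chunk.
// A hop of 64 samples gives 44100 / 64 ≈ 689 estimates per second
// of audio, versus ~172 with disjoint 256-sample blocks. This works
// best if each recorded chunk is larger than one window.
int windowSize = 256;
int hop = 64; // assumption: tune for rate vs. CPU
short[] chunk = shortList.remove(0);
short[] window = new short[windowSize];
for (int start = 0; start + windowSize <= chunk.length; start += hop) {
    System.arraycopy(chunk, start, window, 0, windowSize);
    frequencyList.add(frequencyScanner.extractFrequency(window, sampleRate));
}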

Related

How to keep track of audio playback position?

I created a thread to play an mp3 file in Java by converting it to an array of bytes.
I'm wondering if I can keep track of the current play position as the mp3 is being played.
First, I set up my music stream like so:
try {
AudioInputStream in = AudioSystem.getAudioInputStream(file);
musicInputStream = AudioSystem.getAudioInputStream(MUSIC_FORMAT, in);
DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, MUSIC_FORMAT);
musicDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
musicDataLine.open(MUSIC_FORMAT);
musicDataLine.start();
startMusicThread();
} catch(Exception e) {
e.printStackTrace();
}
Next, my music thread looks like this:
private class MusicThread extends Thread {
byte musicBuffer[] = new byte[BUFFER_SIZE];
public void run() {
try {
int musicCount = 0;
while(writeOutput){
if(writeMusic && (musicCount = musicInputStream.read(musicBuffer, 0, musicBuffer.length)) > 0){
musicDataLine.write(musicBuffer, 0, musicCount);
}
}
} catch (Exception e) {
System.out.println("AudioStream Exception - Music Thread"+e);
e.printStackTrace();
}
}
}
I thought of one possibility, to create another thread with a timer that slowly ticks down, second by second, to show the remaining amount of time for the mp3 song. But that doesn't seem like a good solution at all.
Your int musicCount (the return value from AudioInputStream.read(...)) tells you the number of bytes read, so with that you can always do a small computation to figure out your place in the stream. (DataLine has some methods to do some of the math for you, but they can't always be used...see below.)
int musicCount;
int totalBytes = 0;
while (writeOutput && (musicCount = musicInputStream.read(musicBuffer, 0, musicBuffer.length)) > 0) {
// accumulate the running byte count...
totalBytes += musicCount;
// ...and do whatever you need with it
musicDataLine.write(musicBuffer, 0, musicCount);
}
To get the number of seconds elapsed, you can do the following things:
AudioFormat fmt = musicInputStream.getFormat();
long framesRead = totalBytes / fmt.getFrameSize();
long totalFrames = musicInputStream.getFrameLength();
double totalSeconds = (double) totalFrames / fmt.getSampleRate();
double elapsedSeconds =
((double) framesRead / (double) totalFrames) * totalSeconds;
So you'd just get the elapsed time each loop and put it wherever you need it to go. Note that the accuracy of this kind of depends on the size of your buffer. The smaller the buffer, the more accurate.
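For example, with 16-bit stereo at 44.1 kHz (4-byte frames), an 8192-byte buffer holds 2048 frames, about 46 ms of audio, so the position computed this way can lag the audible output by up to that much.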
Also, Clip has some methods to query this for you (but you'd probably have to change what you're doing a lot).
These methods (get(Long)FramePosition/getMicrosecondPosition) are inherited from DataLine, so you can also call them on the SourceDataLine as well if you don't want to do the math yourself. However, you basically need to make a new line for every file you play, so it depends on how you're using the line. (Personally I'd rather just do the division myself since asking the line is kind of opaque.)
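For instance, a rough sketch of asking the line directly (reusing the musicDataLine from the question):

long framesElapsed = musicDataLine.getLongFramePosition();
long usElapsed = musicDataLine.getMicrosecondPosition();
// Both report the position since the line was opened, which is why a
// fresh line per file is needed for these numbers to match the file.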
BTW:
musicDataLine.open(MUSIC_FORMAT);
You should open the line with your own buffer size specified, using the (AudioFormat, int) overload. SourceDataLine.write(...) only blocks when its internal buffer is full, so if it's a different size from your byte array, sometimes your loop is blocking, other times it's just spinning.
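A minimal sketch of that overload, using the question's own constants:

// Match the line's internal buffer to the byte array you write with,
// so each write blocks predictably instead of sometimes spinning:
musicDataLine.open(MUSIC_FORMAT, BUFFER_SIZE);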
MCVE for good measure:
import javax.swing.*;
import java.awt.*;
import java.awt.event.*;
import java.io.*;
import java.util.*;
import javax.sound.sampled.*;
public class SimplePlaybackProgress
extends WindowAdapter implements Runnable, ActionListener {
class AudioPlayer extends Thread {
volatile boolean shouldPlay = true;
final int bufferSize;
final AudioFormat fmt;
final AudioInputStream audioIn;
final SourceDataLine audioOut;
final long frameSize;
final long totalFrames;
final double sampleRate;
AudioPlayer(File file)
throws UnsupportedAudioFileException,
IOException,
LineUnavailableException {
audioIn = AudioSystem.getAudioInputStream(file);
fmt = audioIn.getFormat();
bufferSize = fmt.getFrameSize() * 8192;
frameSize = fmt.getFrameSize();
totalFrames = audioIn.getFrameLength();
sampleRate = fmt.getSampleRate();
try {
audioOut = AudioSystem.getSourceDataLine(audioIn.getFormat());
audioOut.open(fmt, bufferSize);
} catch (LineUnavailableException x) {
try {
audioIn.close();
} catch(IOException suppressed) {
// Java 7+
// x.addSuppressed(suppressed);
}
throw x;
}
}
@Override
public void run() {
final byte[] buffer = new byte[bufferSize];
long framePosition = 0;
try {
audioOut.start();
while (shouldPlay) {
int bytesRead = audioIn.read(buffer);
if (bytesRead < 0) {
break;
}
int bytesWritten = audioOut.write(buffer, 0, bytesRead);
if (bytesWritten != bytesRead) {
// shouldn't happen
throw new RuntimeException(String.format(
"read: %d, wrote: %d", bytesWritten, bytesRead));
}
framePosition += bytesRead / frameSize;
// or
// framePosition = audioOut.getLongFramePosition();
updateProgressBar(framePosition);
}
audioOut.drain();
audioOut.stop();
} catch (Throwable x) {
showErrorMessage(x);
} finally {
updateProgressBar(0);
try {
audioIn.close();
} catch (IOException x) {
showErrorMessage(x);
}
audioOut.close();
}
}
void updateProgressBar(
final long framePosition) {
SwingUtilities.invokeLater(new Runnable() {
@Override
public void run() {
double fractionalProgress =
(double) framePosition / (double) totalFrames;
int progressValue = (int) Math.round(
fractionalProgress * theProgressBar.getMaximum());
theProgressBar.setValue(progressValue);
int secondsElapsed = (int) Math.round(
(double) framePosition / sampleRate);
int minutes = secondsElapsed / 60;
int seconds = secondsElapsed % 60;
theProgressBar.setString(String.format(
"%d:%02d", minutes, seconds));
}
});
}
void stopPlaybackAndDrain() throws InterruptedException {
shouldPlay = false;
this.join();
}
}
/* * */
public static void main(String[] args) {
SwingUtilities.invokeLater(new SimplePlaybackProgress());
}
JFrame theFrame;
JButton theButton;
JProgressBar theProgressBar;
// this should only ever have 1 thing in it...
// multithreaded code with poor behavior just bugs me,
// even for improbable cases, so the queue makes it more robust
final Queue<AudioPlayer> thePlayerQueue = new ArrayDeque<AudioPlayer>();
@Override
public void run() {
theFrame = new JFrame("Playback Progress");
theFrame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
theButton = new JButton("Open");
theProgressBar = new JProgressBar(
SwingConstants.HORIZONTAL, 0, 1000);
theProgressBar.setStringPainted(true);
theProgressBar.setString("0:00");
Container contentPane = theFrame.getContentPane();
((JPanel) contentPane).setBorder(
BorderFactory.createEmptyBorder(8, 8, 8, 8));
contentPane.add(theButton, BorderLayout.WEST);
contentPane.add(theProgressBar, BorderLayout.CENTER);
theFrame.pack();
theFrame.setResizable(false);
theFrame.setLocationRelativeTo(null);
theFrame.setVisible(true);
theButton.addActionListener(this);
theFrame.addWindowListener(this);
}
@Override
public void actionPerformed(ActionEvent ae) {
JFileChooser dialog = new JFileChooser();
int option = dialog.showOpenDialog(theFrame);
if (option == JFileChooser.APPROVE_OPTION) {
File file = dialog.getSelectedFile();
try {
enqueueNewPlayer(new AudioPlayer(file));
} catch (UnsupportedAudioFileException x) { // ew, Java 6
showErrorMessage(x); //
} catch (IOException x) { //
showErrorMessage(x); //
} catch (LineUnavailableException x) { //
showErrorMessage(x); //
} //
}
}
@Override
public void windowClosing(WindowEvent we) {
stopEverything();
}
void enqueueNewPlayer(final AudioPlayer newPlayer) {
// stopPlaybackAndDrain calls join
// so we want to do it off the EDT
new Thread() {
@Override
public void run() {
synchronized (thePlayerQueue) {
stopEverything();
newPlayer.start();
thePlayerQueue.add(newPlayer);
}
}
}.start();
}
void stopEverything() {
synchronized (thePlayerQueue) {
while (!thePlayerQueue.isEmpty()) {
try {
thePlayerQueue.remove().stopPlaybackAndDrain();
} catch (InterruptedException x) {
// shouldn't happen
showErrorMessage(x);
}
}
}
}
void showErrorMessage(Throwable x) {
x.printStackTrace(System.out);
String errorMsg = String.format(
"%s:%n\"%s\"", x.getClass().getSimpleName(), x.getMessage());
JOptionPane.showMessageDialog(theFrame, errorMsg);
}
}
For Clip, you'd just have something like a Swing timer (or other side-thread) and query it however often:
new javax.swing.Timer(100, new ActionListener() {
@Override
public void actionPerformed(ActionEvent ae) {
long usPosition = theClip.getMicrosecondPosition();
// put it somewhere
}
}).start();
Related:
How to calculate the level/amplitude/db of audio signal in java?
How to make waveform rendering more interesting?

Changing Volume with JLayer

I have a very specific problem with JLayer in my music player project. I want to include something to adjust the volume, but it seems that isn't easy to implement.
This is because JLayer doesn't support it out of the box. I pulled the Player class out of the library into my project and changed some methods, and it works fine for playing mp3s. For adjusting the volume, I added this method to the Player class:
public boolean setGain(float newGain) {
if (audio instanceof JavaSoundAudioDevice) {
System.out.println("InstanceOf");
JavaSoundAudioDevice jsAudio = (JavaSoundAudioDevice) audio;
try {
jsAudio.write(null, 0, 0);
} catch (JavaLayerException ex) {
ex.printStackTrace();
}
return jsAudio.setLineGain(newGain);
}
return false;
}
I then extracted the JavaSoundAudioDevice, decompiled it and changed it:
public boolean setLineGain(float gain)
{
System.out.println("Vor Source");
if (source != null)
{
System.out.println("Nach Source");
FloatControl volControl = (FloatControl) source.getControl(FloatControl.Type.MASTER_GAIN);
float newGain = Math.min(Math.max(gain, volControl.getMinimum()), volControl.getMaximum());
volControl.setValue(newGain);
return true;
}
return false;
}
And:
public void createSource() throws JavaLayerException {
Throwable t = null;
try {
Line line = AudioSystem.getLine(getSourceLineInfo());
if (line instanceof SourceDataLine) {
source = (SourceDataLine) line;
//source.open(fmt, millisecondsToBytes(fmt, 2000));
source.open(fmt);
/*
if (source.isControlSupported(FloatControl.Type.MASTER_GAIN))
{
FloatControl c = (FloatControl)source.getControl(FloatControl.Type.MASTER_GAIN);
c.setValue(c.getMaximum());
}*/
source.start();
}
} catch (RuntimeException ex) {
t = ex;
} catch (LinkageError ex) {
t = ex;
} catch (LineUnavailableException ex) {
t = ex;
}
if (source == null) {
throw new JavaLayerException("cannot obtain source audio line", t);
}
}
But the createSource method, which was already implemented, doesn't work. At the line
Line line = AudioSystem.getLine(getSourceLineInfo());
I always get an IllegalArgumentException:
java.lang.IllegalArgumentException: No line matching interface SourceDataLine supporting format PCM_SIGNED 0.0 Hz, 16 bit, 0 channels, 0 bytes/frame, little-endian is supported.
at javax.sound.sampled.AudioSystem.getLine(AudioSystem.java:479)
at javazoom.jl.player.JavaSoundAudioDevice.createSource(JavaSoundAudioDevice.java:80)
at javazoom.jl.player.JavaSoundAudioDevice.writeImpl(JavaSoundAudioDevice.java:119)
at javazoom.jl.player.AudioDeviceBase.write(Unknown Source)
at MusikPlayer.Erweiterungen.Players.MyPlayer.setGain(MyPlayer.java:192)
at MusikPlayer.PlayerTest.main(PlayerTest.java:21)
Exception in thread "main" java.lang.IllegalArgumentException: No line matching interface SourceDataLine supporting format PCM_SIGNED 0.0 Hz, 16 bit, 0 channels, 0 bytes/frame, little-endian is supported.
at javax.sound.sampled.AudioSystem.getLine(AudioSystem.java:479)
at javazoom.jl.player.JavaSoundAudioDevice.createSource(JavaSoundAudioDevice.java:80)
at javazoom.jl.player.JavaSoundAudioDevice.writeImpl(JavaSoundAudioDevice.java:119)
at javazoom.jl.player.AudioDeviceBase.write(Unknown Source)
at MusikPlayer.Erweiterungen.Players.MyPlayer.decodeFrame(MyPlayer.java:161)
at MusikPlayer.Erweiterungen.Players.MyPlayer.play(MyPlayer.java:87)
at MusikPlayer.Erweiterungen.Players.MyPlayer.play(MyPlayer.java:66)
at MusikPlayer.PlayerTest.main(PlayerTest.java:22)
Does anyone know why this is happening? Does anyone know how to solve this issue?
Well, I solved it. (The exception apparently occurred because setGain triggers a write before any frame has been decoded, so the device's AudioFormat was still unset, which is why the error reports 0.0 Hz and 0 channels; the changed getAudioFormat below always returns a fixed 44.1 kHz stereo format instead.)
This Test class is used:
public class PlayerTest {
public static void main(String[] args) {
try {
File f = new File("D:\\Musik\\Musik-Oberordner\\Favoriten\\06-ich_und_ich_-_so_soll_es_bleiben.mp3");
MyPlayer player = new MyPlayer(new FileInputStream(f));
player.setGain(-30f);
player.play();
} catch (JavaLayerException | FileNotFoundException ex) {
ex.printStackTrace();
}
}
}
The gain you set will adjust the volume, from -80.0f to 6.0f.
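If the UI exposes a linear volume slider, a common mapping (a hypothetical helper of mine, not part of JLayer) converts a 0.0–1.0 value to decibels before calling setGain:

// Hypothetical helper: map a linear slider value (0.0–1.0) to dB gain.
// 20·log10(x) is the usual amplitude-to-decibel conversion; clamp to
// the -80.0f to 6.0f range mentioned above.
static float linearToGainDb(double linear) {
    if (linear <= 0.0) return -80f;
    return (float) Math.max(-80.0, Math.min(6.0, 20.0 * Math.log10(linear)));
}
// e.g. player.setGain(linearToGainDb(0.5)); // ≈ -6 dB, half amplitude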
The changed JavaSoundAudioDevice:
package javazoom.jl.player;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.Line;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import javazoom.jl.decoder.Decoder;
import javazoom.jl.decoder.JavaLayerException;
/**
* The <code>JavaSoundAudioDevice</code> implements an audio device by using the
* JavaSound API.
*
* @since 0.0.8
* @author Mat McGowan
*/
public class JavaSoundAudioDevice extends AudioDeviceBase {
private SourceDataLine source = null;
private AudioFormat fmt = null;
private byte[] byteBuf = new byte[4096];
protected void setAudioFormat(AudioFormat fmt0) {
fmt = fmt0;
}
protected AudioFormat getAudioFormat() {
// if (fmt == null) {
fmt = new AudioFormat(44100,
16,
2,
true,
false);
return fmt;
}
protected DataLine.Info getSourceLineInfo() {
AudioFormat fmt = getAudioFormat();
//DataLine.Info info = new DataLine.Info(SourceDataLine.class, fmt, 4000);
DataLine.Info info = new DataLine.Info(SourceDataLine.class, fmt);
return info;
}
public void open(AudioFormat fmt) throws JavaLayerException {
if (!isOpen()) {
setAudioFormat(fmt);
openImpl();
setOpen(true);
}
}
public boolean setLineGain(float gain) {
System.out.println("Vor Source");
if (source != null) {
System.out.println("Nach Source");
FloatControl volControl = (FloatControl) source.getControl(FloatControl.Type.MASTER_GAIN);
float newGain = Math.min(Math.max(gain, volControl.getMinimum()), volControl.getMaximum());
volControl.setValue(newGain);
return true;
}
return false;
}
public void openImpl()
throws JavaLayerException {
}
// createSource fix.
public void createSource() throws JavaLayerException {
Throwable t = null;
try {
Line line = AudioSystem.getLine(getSourceLineInfo());
if (line instanceof SourceDataLine) {
source = (SourceDataLine) line;
//source.open(fmt, millisecondsToBytes(fmt, 2000));
source.open(fmt);
// if (source.isControlSupported(FloatControl.Type.MASTER_GAIN))
// {
// System.out.println("Control");
// FloatControl c = (FloatControl)source.getControl(FloatControl.Type.MASTER_GAIN);
// c.setValue(c.getMinimum());
// }
source.start();
}
} catch (RuntimeException ex) {
t = ex;
} catch (LinkageError ex) {
t = ex;
} catch (LineUnavailableException ex) {
t = ex;
}
if (source == null) {
throw new JavaLayerException("cannot obtain source audio line", t);
}
}
public int millisecondsToBytes(AudioFormat fmt, int time) {
return (int) (time * (fmt.getSampleRate() * fmt.getChannels() * fmt.getSampleSizeInBits()) / 8000.0);
}
protected void closeImpl() {
if (source != null) {
source.close();
}
}
protected void writeImpl(short[] samples, int offs, int len)
throws JavaLayerException {
if (source == null) {
createSource();
}
byte[] b = toByteArray(samples, offs, len);
source.write(b, 0, len * 2);
}
protected byte[] getByteArray(int length) {
if (byteBuf.length < length) {
byteBuf = new byte[length + 1024];
}
return byteBuf;
}
protected byte[] toByteArray(short[] samples, int offs, int len) {
byte[] b = getByteArray(len * 2);
int idx = 0;
short s;
while (len-- > 0) {
s = samples[offs++];
b[idx++] = (byte) s;
b[idx++] = (byte) (s >>> 8);
}
return b;
}
protected void flushImpl() {
if (source != null) {
source.drain();
}
}
public int getPosition() {
int pos = 0;
if (source != null) {
pos = (int) (source.getMicrosecondPosition() / 1000);
}
return pos;
}
/**
* Runs a short test by playing a short silent sound.
*/
public void test()
throws JavaLayerException {
// try {
open(new AudioFormat(22000, 16, 1, true, false));
short[] data = new short[22000 / 10];
write(data, 0, data.length);
flush();
close();
// } catch (RuntimeException ex) {
// throw new JavaLayerException("Device test failed: " + ex);
// }
}
}
Now you just have to integrate it into your project, overwrite the old JavaSoundAudioDevice, and enjoy volume adjusting!

Need get correct AVI Player class to play all AVI video files inside android application

I am trying to find a correct AVI video player class to play AVI files, since my current AVI Player class doesn't seem to work very well anymore.
Some AVI files play correctly, but others do not.
If you know of an AVI video player class that plays all AVI files correctly,
please help me.
Thank you.
P.S.:
I don't want to hand off to a 3rd-party application via an intent to play the AVI file.
Below is the code I currently use to play AVI files:
AVI Player.java
package runnable;
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.ArrayList;
import iterface.IVideoSink;
public class VideoPlayer implements Runnable {
private final int FPS = 24;
/**
* String section
*/
private boolean IS_ALIVE = true;
private long LAST_FRAME_TIME;
/**
* Data section
*/
private ArrayList<IVideoSink> mAlVideoSinks;
/**
* Others section
*/
private BufferedInputStream mBufferedInputStream;
public VideoPlayer(String filename) {
mAlVideoSinks = new ArrayList<IVideoSink>();
try {
mBufferedInputStream = new BufferedInputStream(new FileInputStream(filename));
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void addVideoSink(IVideoSink videoSink) {
synchronized (mAlVideoSinks) {
mAlVideoSinks.add(videoSink);
}
}
public void removeVideoSink(IVideoSink videoSink) {
synchronized (mAlVideoSinks) {
if (mAlVideoSinks.contains(videoSink))
mAlVideoSinks.remove(videoSink);
}
}
@Override
public void run() {
int count = 0;
while (IS_ALIVE) {
if (LAST_FRAME_TIME == 0) {
LAST_FRAME_TIME = System.currentTimeMillis();
}
try {
long currentTime = System.currentTimeMillis();
if (currentTime - LAST_FRAME_TIME < 1000 / FPS) {
Thread.sleep(1000 / FPS - (currentTime - LAST_FRAME_TIME));
}
LAST_FRAME_TIME = System.currentTimeMillis();
int b0 = mBufferedInputStream.read();
if (b0 == -1) break;
int b1 = mBufferedInputStream.read();
int b2 = mBufferedInputStream.read();
int b3 = mBufferedInputStream.read();
count = b0 + (b1 << 8) + (b2 << 16) + (b3 << 24);
byte[] buffer = new byte[count];
int readCount = mBufferedInputStream.read(buffer, 0, count);
for (IVideoSink videoSink : mAlVideoSinks) {
videoSink.onFrame(buffer, null);
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
try {
mBufferedInputStream.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
for (IVideoSink videoSink : mAlVideoSinks) {
videoSink.onVideoEnd();
}
}
public void stop() {
IS_ALIVE = false;
}
}
PCM Player.java
package runnable;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import java.util.ArrayList;
public class PCMPlayer implements Runnable {
/**
* String section
*/
private boolean IS_ALIVE = true;
/**
* Data section
*/
private ArrayList<byte[]> mAlBuffers = new ArrayList<byte[]>();
/**
* Other section
*/
private AudioTrack mAudioTrack;
public PCMPlayer() {
}
@Override
public void run() {
int bufSize = AudioTrack.getMinBufferSize(8000,
AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT);
mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
8000,
AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT,
bufSize,
AudioTrack.MODE_STREAM);
mAudioTrack.play();
while (IS_ALIVE) {
byte[] buffer = null;
boolean dataFlag = true;
while (dataFlag) {
synchronized (mAlBuffers) {
if (mAlBuffers.size() > 0) {
buffer = mAlBuffers.remove(0);
} else {
dataFlag = false;
break;
}
}
mAudioTrack.write(buffer, 0, buffer.length);
}
try {
Thread.sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
mAudioTrack.stop();
mAudioTrack.release();
}
public void writePCM(byte[] pcm) {
synchronized (mAlBuffers) {
byte[] buffer = new byte[pcm.length];
System.arraycopy(pcm, 0, buffer, 0, buffer.length);
mAlBuffers.add(buffer);
}
}
public void stop() {
IS_ALIVE = false;
}
}
I don't know your intentions, but in your place I would use a library for that, for example jVLC.
The problem is that AVI is just a container for sound and video data, which can be encoded with any codecs. The sound can be MP3, Ogg, or whatever, and the video can be MPEG, DivX, or Xvid of several versions.
You cannot count on all AVIs containing the same sound/video formats. That is why some of your AVIs cannot be played by your current program.
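One way to see which codecs a given AVI actually contains is to probe its tracks with Android's MediaExtractor (a sketch; the path is an example, the imports are android.media and android.util, and whether a particular AVI is even parseable this way depends on the device's container support):

// Hypothetical probe: list the track MIME types of a media file.
MediaExtractor extractor = new MediaExtractor();
try {
    extractor.setDataSource("/sdcard/some.avi"); // example path
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        Log.d("AviProbe", "track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
    }
} catch (IOException e) {
    Log.d("AviProbe", "container not supported on this device", e);
} finally {
    extractor.release();
}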

Huge delay when socketing in java

(UPDATED CODE)
I'm trying to make clients communicate with a server (I've made simple client-server apps before, like a chatroom). The communication is established, but there is a huge delay (I send coordinates from the client to the server): it's over 10 seconds (sometimes even more). What could be the problem?
The client:
public class GameComponent extends Canvas implements Runnable {
private static final long serialVersionUID = 1L;
private static final int WIDTH = 320;
private static final int HEIGHT = 240;
private static final int SCALE = 2;
private boolean running;
private JFrame frame;
Thread thread;
public static final int GRID_W = 16;
public static final int GRID_H = 16;
private Socket socket;
private DataInputStream reader;
private DataOutputStream writer;
private HashMap<Integer, OtherPlayer> oPlayers;
private ArrayList<OtherPlayer> opList;
private int maxID = 1;
private int ID;
Player player;
public GameComponent() {
//GUI code..
oPlayers = new HashMap<Integer, OtherPlayer>(); //Hash map to be able to get players by their ID's
opList = new ArrayList<OtherPlayer>(); //And an array list for easier drawing
setUpNetworking();
start();
}
public void start() {
if (running)
return;
running = true;
thread = new Thread(this);
player = new Player(GRID_W * 2, GRID_H * 2);
thread.start();
}
public void stop() {
if (!running)
return;
running = false;
}
public void run() { //The main loop, ticks 60 times every second
long lastTime = System.nanoTime();
double nsPerTick = 1000000000D / 60D;
int frames = 0;
int ticks = 0;
long lastTimer = System.currentTimeMillis();
double delta = 0;
while (running) {
long now = System.nanoTime();
delta += (now - lastTime) / nsPerTick;
lastTime = now;
boolean shouldRender = true;
while (delta >= 1) {
ticks++;
tick(delta);
delta -= 1;
shouldRender = true;
}
try {
Thread.sleep(2);
} catch (InterruptedException e) {
e.printStackTrace();
}
if (shouldRender) {
frames++;
render();
}
if (System.currentTimeMillis() - lastTimer >= 1000) {
lastTimer += 1000;
frames = 0;
ticks = 0;
}
}
}
private void tick(double delta) { //main logic
player.move();
try {
writer.writeInt(ID); //I send the player data here (id, x, y)
writer.writeInt(player.getX());
writer.writeInt(player.getY());
writer.flush();
} catch (IOException e) {
e.printStackTrace();
}
}
private void render(Graphics2D g2d) {
//rendering the stuff
for (OtherPlayer i : opList) { //drawing a black rectangle for every other player
g2d.fillRect(i.getX(), i.getY(), GRID_W, GRID_H);
}
}
private void render() {
//more rendering...
}
public static void main(String[] args) {
new GameComponent();
}
class TKeyListener implements KeyListener {
//movement methods...
}
private void setUpNetworking() { //This is where I make my message reader and data IO
try {
socket = new Socket("127.0.0.1", 5099);
reader = new DataInputStream(socket.getInputStream());
writer = new DataOutputStream(socket.getOutputStream());
Thread rT = new Thread(new msgReader());
rT.start();
} catch (Exception e) {
e.printStackTrace();
}
}
class msgReader implements Runnable { //where I read messages
public void run() {
try {
ID = reader.readInt(); //when I connect, I get an id from the server
while(true) { //my main loop
int oid = reader.readInt(); //get the read data id
int ox, oy;
ox = reader.readInt(); //get the read player's x and y
oy = reader.readInt();
if (oid != ID){ //If not reading myself
if (oPlayers.containsKey(oid)) { //If a player with this id exists
OtherPlayer op = (OtherPlayer) oPlayers.get(oid);
op.setX(ox); //set it's x, y
op.setY(oy);
} else { //if it doesn't exist, create him
OtherPlayer op = new OtherPlayer(ox, oy);
opList.add(op);
oPlayers.put(oid, op);
}
}
maxID = reader.readInt(); // Always read the highest current ID from the server
}
} catch(Exception ex) {
ex.printStackTrace();
}
}
}
}
And the server:
public class ServerBase {
ServerSocket serverSocket;
ArrayList<DataOutputStream> clients;
private int id = 1;
SyncSend ss = new SyncSend();
class ClientHandler implements Runnable {
private Socket soc;
private DataInputStream reader;
private int x;
private int y;
private int id;
private boolean run = true;
public ClientHandler(Socket s) {
soc = s;
try {
reader = new DataInputStream(soc.getInputStream());
} catch (IOException e) {
e.printStackTrace();
}
}
public void run() {
try {
while (run) {
id = reader.readInt();
x = reader.readInt();
y = reader.readInt();
if (id == 2)
System.out.println("x: " + x + " y: " + y);
int[] tmb = {id, x, y};
ss.sendEveryone(tmb);
}
} catch (Exception e) {
run = false;
clients.remove(this);
}
}
}
class SyncSend {
public synchronized void sendEveryone(int[] a) throws SocketException {
ArrayList<DataOutputStream> cl = (ArrayList<DataOutputStream>) clients.clone();
Iterator<DataOutputStream> it = cl.iterator();
while(it.hasNext()){
try {
DataOutputStream writer = (DataOutputStream) it.next();
writer.writeInt(a[0]);
writer.writeInt(a[1]);
writer.writeInt(a[2]);
writer.writeInt(id-1);
writer.flush();
} catch (Exception ex) {
throw new SocketException();
}
}
}
}
public void init() {
clients = new ArrayList<DataOutputStream>();
try {
serverSocket = new ServerSocket(5099);
while(true) {
Socket clientSocket = serverSocket.accept();
DataOutputStream clientWriter = new DataOutputStream(clientSocket.getOutputStream());
clients.add(clientWriter);
clientWriter.writeInt(id);
id++;
Thread t = new Thread(new ClientHandler(clientSocket));
t.start();
}
} catch (Exception e) {
e.printStackTrace();
}
}
public static void main(String[] args) {
new ServerBase().init();
}
}
What causes the delay? I've been searching for the reason for hours now, but with no success.
You most likely need to call flush() on the client-side. Even if this is not your current problem, it is probably a good idea.
Streams may buffer their content, meaning they may not send the data to its destination (whether that be a disk or over the wire to a server) the instant you call write (or writeInt in this case). Instead, they may wait until they get a sufficient amount of data to make the transfer "worth it". If they did not behave in this way, they would end up making lots of inefficient, smaller transfers. The downside to all of this is that you may need to call flush to tell the stream that you are done sending data for a while and that the stream should go ahead and initiate the transfer.
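A sketch of the idea (the BufferedOutputStream wrapper is my addition; the question's client already flushes, but buffering plus one flush per tick keeps each update in a single small transfer):

// Wrap the socket stream so the three ints go out as one small packet:
DataOutputStream writer = new DataOutputStream(
        new BufferedOutputStream(socket.getOutputStream()));
writer.writeInt(ID);
writer.writeInt(player.getX());
writer.writeInt(player.getY());
writer.flush(); // hand the complete 12-byte update to the socket at once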
Try to put your code into several threads wherever you can and then run them; I mean, you don't need to wait on every socket one by one, you can simply service all of them at the same time... or something like that :)
For example, in port scanners you should use many threads to speed up the search...
Be aware that your call to ss.sendEveryone(tmb) is synchronized on the ss object. I am assuming this is a static variable somewhere that holds a reference to all of the clients. This means that if there are several clients sending data at the same time, a lot of calls to sendEveryone will happen all at once and they will all line up in a queue waiting for the others to finish, before those threads can go back and read more data from the client again.
As a diagnostic exercise, you may want to remove this call and see if you still have your problem.
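If the lock does turn out to be the bottleneck, one common fix (a sketch, not from the question; the outbox name is mine, while writer and clients refer to the server code above) is to give each client its own outgoing queue and writer thread, so readers only enqueue and never block on another client's socket:

// Hypothetical per-client outbox; reader threads only enqueue:
final BlockingQueue<int[]> outbox = new LinkedBlockingQueue<int[]>();
// e.g. from ClientHandler.run(): outbox.offer(new int[]{id, x, y, maxId});

new Thread(new Runnable() {
    public void run() {
        try {
            while (true) {
                int[] msg = outbox.take(); // blocks until a message is queued
                for (int v : msg) writer.writeInt(v);
                writer.flush();
            }
        } catch (Exception e) {
            clients.remove(writer); // client gone; stop broadcasting to it
        }
    }
}).start();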

Unable to do low-level decoding of video on Android 4.2 without using media extractor

I wanted to decode video frames without using an extractor. So I tried a small sample where I still create a MediaExtractor, but I don't call extractor.readSampleData() to copy the bitstream into the input buffer; instead I use an FFmpeg parser inside JNI, where I memcpy each video frame into the input byte buffer and then queue the input buffer.
But when I call decoder.dequeueOutputBuffer(info, 10000), it returns MediaCodec.INFO_TRY_AGAIN_LATER,
while it works fine if I use extractor.readSampleData().
Java Side:
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import android.app.Activity;
import android.media.MediaCodec;
import android.media.MediaCodec.BufferInfo;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
public class VideoBrowser extends Activity implements SurfaceHolder.Callback {
private static final String SAMPLE = Environment.getExternalStorageDirectory() + "/obama.mp4";
private PlayerThread mPlayer = null;
private static native int AVinitializecntxt(String strl, jintArray arr);
private native int AVREADVIDEO(byte[] array);
public int FLAG = 0;
public int jk = 0;
File f1;
FileOutputStream f;
static {
Log.i("ABCD", "BEFORE");
System.loadLibrary("ffmpeg");
System.loadLibrary("ffmpeg-test-jni");
Log.i("ABCD", "Success");
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
SurfaceView sv = new SurfaceView(this);
sv.getHolder().addCallback(this);
setContentView(sv);
int val;
int[] array = new int[6];
int END_OF_FILE = 0;
int aud_stream = 0;
int vid_stream = 0;
String urlString = "/mnt/sdcard/obama.mp4";
f1 = new File("/mnt/sdcard/t.h264");
try {
f = new FileOutputStream(f1);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
// This is where I call the function to initialize the ffmpeg inside JNI
val = AVinitializecntxt(urlString, array);
FLAG = val;
Log.i("ABCD", "FLAG : " + FLAG + val);
}
protected void onDestroy() {
super.onDestroy();
}
@Override
public void surfaceCreated(SurfaceHolder holder) {}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
if (mPlayer == null) {
mPlayer = new PlayerThread(holder.getSurface());
mPlayer.start();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (mPlayer != null) {
mPlayer.interrupt();
}
}
private class PlayerThread extends Thread {
private MediaExtractor extractor;
private MediaCodec decoder;
private Surface surface;
// private VideoPlayer VideoPlayerAPIInterfaceClass = new VideoPlayer();
public PlayerThread(Surface surface) {
this.surface = surface;
}
@Override
public void run() {
if (FLAG == 1) {
extractor = new MediaExtractor();
extractor.setDataSource(SAMPLE);
for (int i = 0; i < extractor.getTrackCount(); i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
extractor.selectTrack(i);
decoder = MediaCodec.createDecoderByType("video/avc");
// Log.i("ABCD", "MIME : " + mime);
decoder.configure(format, surface, null, 0);
break;
}
}
if (decoder == null) {
Log.e("DecodeActivity", "Can't find video info!");
return;
}
decoder.start();
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
boolean isEOS = false;
long startMs = System.currentTimeMillis();
int outIndex1 = -1;
while (outIndex1 < 0) {
outIndex1 = decoder.dequeueOutputBuffer(info, 10000);
Log.i("ABCD", "etgeuieoy");
}
while (!Thread.interrupted()) {
if (!isEOS) {
int inIndex = decoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
ByteBuffer buffer = inputBuffers[inIndex];
// int sampleSize = extractor.readSampleData(buffer, 0);
byte[] bytes = new byte[buffer.capacity()];
// This is where we call JNI function to memcopy the encoded bitstream into the input buffer
int sampleSize = AVREADVIDEO(bytes);
if (sampleSize >= 0) {
// only copy into the codec buffer when a frame was actually read
buffer.clear();
buffer.put(bytes, 0, sampleSize);
}
if (sampleSize < 0) {
// We shouldn't stop the playback at this point, just pass the EOS
// flag to decoder, we will get it again from the
// dequeueOutputBuffer
// Log.d("DecodeActivity", "InputBuffer BUFFER_FLAG_END_OF_STREAM");
decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
isEOS = true;
} else {
decoder.queueInputBuffer(inIndex, 0, sampleSize, 0, 0);
extractor.advance();
}
}
}
int outIndex = decoder.dequeueOutputBuffer(info, 10000);
switch (outIndex) {
case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
outputBuffers = decoder.getOutputBuffers();
break;
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
break;
case MediaCodec.INFO_TRY_AGAIN_LATER:
Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
break;
default:
ByteBuffer buffer = outputBuffers[outIndex];
Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);
// We use a very simple clock to keep the video FPS, or the video
// playback will be too fast
while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
try {
sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
break;
}
}
// Log.i("ABCD", "RELEASING OUTPUT BUFFER");
decoder.releaseOutputBuffer(outIndex, true);
//decoder.releaseOutputBuffer(outIndex, false);
break;
}
// All decoded frames have been rendered, we can stop playing now
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
break;
}
}
decoder.stop();
decoder.release();
extractor.release();
}
}
}
}
}
JNI Side:
JNIEXPORT jint JNICALL
Java_com_alldigital_videoplayer_VideoBrowser_AVREADVIDEO(JNIEnv *pEnv,
jobject pObj, jbyteArray array) {
AV_ctxt *avctxt = &aud_vid_ctxt;
jbyte *buf = (*pEnv)->GetByteArrayElements(pEnv, array, NULL);
if (buf == NULL) {
LOGERR(10, "AVVIDEOREAD", "Bytes null");
}
AVPacket *packet = av_malloc(sizeof(AVPacket));
av_init_packet(packet);
int avread_res = av_read_frame(avctxt->gFormatCtx, packet);
int size = packet->size;
if (avread_res >= 0) {
if (packet->stream_index == avctxt->gVideoStreamIndex) {
if (NULL == memcpy(buf, (char *) packet->data, packet->size))
LOGERR(10, "AV_AUDIO_DECODE", "memcpy for audio buffer failed");
}
}
(*pEnv)->ReleaseByteArrayElements(pEnv, array, buf, 0);
av_free_packet(packet);
packet = NULL;
return size;
}
Even though I am copying the encoded data of each frame through FFmpeg without calling the extractor, I am getting this output-buffer timeout issue. Why?
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
FileInputStream fis = new FileInputStream(new File(
"ur file path"));
byte[] buf = new byte[1024];
int n;
while (-1 != (n = fis.read(buf))) {
baos.write(buf, 0, n);
}
byte[] videoBytes = baos.toByteArray();
// use this videoBytes, which is the raw byte content of the original video file
} catch (Exception e) {
e.printStackTrace();
}
