MIDI input on Raspberry Pi - Java

I have implemented a simple program that takes input from a MIDI keyboard and then outputs the corresponding sound using the javax.sound.midi synthesizer interface.
This works really well on my PC running Windows. However, I want to run it on a Raspberry Pi running Raspbian. It does actually work there too, but as soon as I play more/faster notes, the sound starts to jitter and crackle really badly, and I have to stop playing notes for about two seconds for the jittering to die down.
I am already using an external USB sound adapter, which did not really help a lot.
Here is the class that handles the MIDI input:
public class MidiHandler {
public MidiHandler() {
MidiDevice device;
MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
for (int i = 0; i < infos.length; i++) {
try {
device = MidiSystem.getMidiDevice(infos[i]);
// does the device have any transmitters?
// if it does, add it to the device list
System.out.println(infos[i]);
// get all transmitters
List<Transmitter> transmitters = device.getTransmitters();
// and for each transmitter
for (int j = 0; j < transmitters.size(); j++) {
// create a new receiver
transmitters.get(j).setReceiver(
// using my own MidiInputReceiver
new MidiInputReceiver(device.getDeviceInfo()
.toString()));
}
Transmitter trans = device.getTransmitter();
trans.setReceiver(new MidiInputReceiver(device.getDeviceInfo()
.toString()));
// open each device
device.open();
// if code gets this far without throwing an exception
// print a success message
} catch (MidiUnavailableException e) {
}
}
}
// I tried to write my own class. I thought the send method handles any
// MidiEvent sent to it
public class MidiInputReceiver implements Receiver {
Synthesizer synth;
MidiChannel[] mc;
Instrument[] instr;
int instrument;
int channel;
public MidiInputReceiver(String name) {
try
{
patcher p = new patcher();
this.instrument = p.getInstrument();
this.channel = p.getChannel();
this.synth = MidiSystem.getSynthesizer();
this.synth.open();
this.mc = synth.getChannels();
instr = synth.getDefaultSoundbank().getInstruments();
this.synth.loadInstrument(instr[1]);
mc[this.channel].programChange(0, this.instrument);
System.out.println(this.channel + ", " + this.instrument);
}
catch (MidiUnavailableException e)
{
e.printStackTrace();
System.exit(1);
}
}
public void send(MidiMessage msg, long timeStamp) {
/*
* Use to display midi message
*
for(int i = 0; i < msg.getMessage().length; i++) {
System.out.print("[" + msg.getMessage()[i] + "] ");
}
System.out.println();
*/
if (msg.getMessage()[0] == -112) {
mc[this.channel].noteOn(msg.getMessage()[1], msg.getMessage()[2]+1000);
}
if (msg.getMessage()[0] == -128) {
mc[this.channel].noteOff(msg.getMessage()[1], msg.getMessage()[2]+1000);
}
}
public void close() {
}
}
}
Is this due to hardware limitations of the Pi, or can I do anything about it?

If you have any loops updating the MIDI retrieval/sound output, maybe try thread-sleeping in them to give the OS time to do other work. Other than that, I'm not sure. The USB controller on the original Raspberry Pi wasn't very good (lots of bugs, slow performance), though this got fixed somewhat in newer Linux/firmware releases. You may also need to modify the sample rate to match the ideal setting for the current sound output, if it's accessible (a sample-rate mismatch means more conversion). Java may try to use the ideal rate as the default, but it may be misreported by the OS.
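To follow up on the sample-rate suggestion: you can ask Java Sound which formats each mixer actually accepts and pick a rate the hardware supports natively. This is a rough sketch, not part of the original program; the class and method names are mine:

```java
import java.util.ArrayList;
import java.util.List;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.SourceDataLine;

public class FormatProbe {
    /** Returns the names of mixers that accept a SourceDataLine at the given sample rate. */
    static List<String> mixersSupporting(float sampleRate) {
        List<String> names = new ArrayList<>();
        // 16-bit, stereo, signed, little-endian PCM at the candidate rate.
        AudioFormat fmt = new AudioFormat(sampleRate, 16, 2, true, false);
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, fmt);
        for (Mixer.Info mi : AudioSystem.getMixerInfo()) {
            if (AudioSystem.getMixer(mi).isLineSupported(info)) {
                names.add(mi.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) {
        // 44100 and 48000 Hz are the usual hardware-native rates.
        for (float rate : new float[] {22050f, 44100f, 48000f}) {
            System.out.println(rate + " Hz -> " + mixersSupporting(rate));
        }
    }
}
```

If the USB adapter's mixer only reports, say, 48000 Hz, opening the synthesizer's output at that rate would avoid on-the-fly resampling on the Pi.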

Related

Sending data from arduino to java

Hello, I'm working on a personal project. I have several RFID cards. I want to send an RFID code (card ID) to my Arduino and then scan a random RFID card. If the sent RFID code and the scanned one match, I want to send a string/number, or just a single byte, back to my Java program.
So far, sending to my Arduino and checking the RFID card work fine; however, I'm stuck at sending something back to my Java program.
I was thinking about using a serial connection, but to be honest it's very difficult to understand.
Can someone help me out?
Thanks in advance.
My Java code:
package test;
import java.io.IOException;
import com.fazecast.jSerialComm.SerialPort;
public class Startup {
public static void main(String[] args) throws IOException, InterruptedException {
SerialPort sp = SerialPort.getCommPort("COM4"); // device name TODO: must be changed
sp.setComPortParameters(115200, 8, 33, 34); // default Arduino connection settings would be 9600, 8, 1, 0
sp.setComPortTimeouts(SerialPort.TIMEOUT_WRITE_BLOCKING, 0, 0); // block until bytes can be written
if (sp.openPort()) {
System.out.println("Port is open :)");
} else {
System.out.println("Failed to open port :(");
return;
}
/*for (Integer i = 0; i < 5; ++i) {
sp.getOutputStream().write(i.byteValue());
sp.getOutputStream().flush();
System.out.println("Sent number: " + i);
Thread.sleep(4000); //default 1000
}
*/
//rfid tag 0x82,130
//rfid card 118
Integer i = 0;//0x82,130
sp.getOutputStream().write(i.byteValue());
sp.getOutputStream().flush();
System.out.println("Sent number: " + i);
Thread.sleep(8000);
if (sp.closePort()) {
System.out.println("Port is closed :)");
} else {
System.out.println("Failed to close port :(");
return;
}
//from arduino
}
}
arduino code:
#include <SPI.h>
#include <MFRC522.h>
#define RST_PIN 26
#define SS_PIN 5
#define MISO_PIN 19
#define MOSI_PIN 23
#define SCK_PIN 18
byte incomingByte = Serial.read();
MFRC522 mfrc522(SS_PIN, RST_PIN);
MFRC522::MIFARE_Key key;
void setup() {
Serial.begin(115200); // Initialize serial communications with the PC
pinMode(16,OUTPUT);
pinMode(21,OUTPUT);
while (!Serial); // Do nothing if no serial port is opened (added for Arduinos based on ATMEGA32U4)
SPI.begin(SCK_PIN, MISO_PIN, MOSI_PIN);
mfrc522.PCD_Init(); // Init MFRC522
delay(40); // Optional delay. Some boards need more time after init to be ready; see the readme
mfrc522.PCD_DumpVersionToSerial(); // Show details of PCD - MFRC522 Card Reader details
//dump_byte_array(key.keyByte, MFRC522::MF_KEY_SIZE);
}
void loop() {
if ( ! mfrc522.PICC_IsNewCardPresent()) {
return;
}
if ( ! mfrc522.PICC_ReadCardSerial()) {
return;
}
if(Serial.available() > 0){
byte incomingByte = Serial.read();
if (mfrc522.uid.uidByte[0] == incomingByte) // compare the value read from the card with the hex value we want to match
{
digitalWrite(16,HIGH);
delay(5000);
digitalWrite(16,LOW);
delay(50);
}
else{
digitalWrite(21,HIGH);
delay(2000);
digitalWrite(21,LOW);
}
}
}
As I understand it so far, on the Arduino I just have to add this to send data:
Serial.write(aNumber);
What is giving me a headache is how to receive it in Java.
I already know that I need the RXTX or jSerialComm library for my Java code.
I also already tried some examples to receive a number, but they didn't work at all.
Does someone maybe have a very easy example of receiving a number in Java?
To receive data in Java from the Arduino, you need a SerialPortDataListener to listen for data on the serial port. This can be done by creating a class which implements the SerialPortDataListener interface. This example just prints to the console whatever data is read from the serial port, but you can make it do whatever you'd like.
import com.fazecast.jSerialComm.*;
public class ComPortListener implements SerialPortDataListener {
private static String bufferReadToString = ""; // empty, but not null
private static int cutoffASCII = 10; // ASCII code of the character used for cut-off between received messages
@Override
public int getListeningEvents() {
return SerialPort.LISTENING_EVENT_DATA_AVAILABLE; // listen for data-available events on the serial port
}
@Override
public void serialEvent(SerialPortEvent event) {
byte[] buffer = new byte[event.getSerialPort().bytesAvailable()];
event.getSerialPort().readBytes(buffer, buffer.length);
String s = new String(buffer);
bufferReadToString = bufferReadToString.concat(s); //converts the bytes read from the Serial port to a string
if ((bufferReadToString.indexOf(cutoffASCII) + 1) > 0) {
System.out.println(bufferReadToString); //prints out the received data
}
}
}
To use this, add the following code to your Startup class
if (sp.openPort()) {
System.out.println("Port is open :)");
ComPortListener listenerObject = new ComPortListener(); //creates new listener object
sp.addDataListener(listenerObject);
}
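One Java-side pitfall worth noting, since the comments above mention card IDs like 0x82 (130): Java's byte is signed, so a value above 127 read from the serial port comes back negative unless you mask it. A small illustrative helper (the class and method names are mine, not from the original code):

```java
public class ByteHelper {
    // Java bytes are signed (-128..127); card IDs such as 130 (0x82) are
    // unsigned on the Arduino side. Masking with 0xFF recovers the value.
    static int toUnsigned(byte b) {
        return b & 0xFF;
    }

    public static void main(String[] args) {
        byte received = (byte) 130;               // what write(i.byteValue()) puts on the wire
        System.out.println(received);             // -126 (signed view)
        System.out.println(toUnsigned(received)); // 130
    }
}
```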

Java TargetDataLine not picking up any audio?

I'm writing a function to capture an audio clip for ~ 7.5 seconds using a TargetDataLine. The code executes and renders an 'input.wav' file, but when I play it there is no sound.
My approach, as shown in the code at the bottom of this post, is to do the following things:
Create an AudioFormat and get the Info for a Target Data Line.
Create the Target Data Line by getting the line from AudioSystem.
Open and Start the TargetDataLine, which allocates system resources for recording.
Create an auxiliary Thread that will record audio by writing to a file.
Start the auxiliary Thread, pause the main Thread in the meantime, and then close out the Target Data Line in order to stop recording.
What I have tried so far:
Changing the AudioFormat. Initially, I was using the other AudioFormat constructor which takes the file type as well (where the first argument is AudioFormat.Encoding.PCM_SIGNED etc). I had a sample rate of 44100, 16 bits, 2 channels and small-Endian settings on the other format, which yielded the same result.
Changing the order of commands on my auxiliary and main Thread (i.e. performing TLine.open() or start() in alternate locations).
Checking that my auxiliary thread does actually start.
For reference, I am using IntelliJ on macOS Big Sur.
public static void captureAudio() {
try {
AudioFormat f = new AudioFormat(22050, 8, 1, false, false);
DataLine.Info secure = new DataLine.Info(TargetDataLine.class, f);
if (!AudioSystem.isLineSupported(secure)) {
System.err.println("Unsupported Line");
}
TargetDataLine tLine = (TargetDataLine)AudioSystem.getLine(secure);
System.out.println("Starting recording...");
tLine.open(f);
tLine.start();
File writeTo = new File("input.wav");
Thread t = new Thread(){
public void run() {
try {
AudioInputStream is = new AudioInputStream(tLine);
AudioSystem.write(is, AudioFileFormat.Type.WAVE, writeTo);
} catch(IOException e) {
System.err.println("Encountered system I/O error in recording:");
e.printStackTrace();
}
}
};
t.start();
Thread.sleep(7500);
tLine.stop();
tLine.close();
System.out.println("Recording has ended.");
} catch(Exception e) {
e.printStackTrace();
}
}
Update 1: Some new testing and results
My microphone and speakers are both working with other applications - recorded working audio with QuickTimePlayer.
I did a lot of testing around what my TargetDataLines are and what the deal is with them. I ran the following code:
public static void main(String[] args) {
AudioFormat f = new AudioFormat(48000, 16, 2, true, false);
//DataLine.Info inf = new DataLine.Info(SourceDataLine.class, f);
try {
TargetDataLine line = AudioSystem.getTargetDataLine(f);
DataLine.Info test = new DataLine.Info(TargetDataLine.class, f);
TargetDataLine other = (TargetDataLine)AudioSystem.getLine(test);
String output = line.equals(other) ? "Yes" : "No";
if (output.equals("No")) {
System.out.println(other.toString());
}
System.out.println(line.toString());
System.out.println("_______________________________");
for (Mixer.Info i : AudioSystem.getMixerInfo()) {
Line.Info[] tli = AudioSystem.getMixer(i).getTargetLineInfo();
if (tli.length != 0) {
Line comp = AudioSystem.getLine(tli[0]);
System.out.println(comp.toString() + ":" +i.getName());
if (comp.equals(line) || comp.equals(other)) {
System.out.println("The TargetDataLine is from " + i.getName());
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
Long story short, the TargetDataLine I receive from doing
TargetDataLine line = AudioSystem.getTargetDataLine(f); and
TargetDataLine other = (TargetDataLine)AudioSystem.getLine(new DataLine.Info(TargetDataLine.class, f));
are different, and furthermore, don't match any of the TargetDataLines that are associated with my system's mixers.
The output of the above code was this (where the first two lines are other and line respectively):
com.sun.media.sound.DirectAudioDevice$DirectTDL#cc34f4d
com.sun.media.sound.DirectAudioDevice$DirectTDL#17a7cec2
_______________________________
com.sun.media.sound.PortMixer$PortMixerPort#79fc0f2f:Port MacBook Pro Speakers
com.sun.media.sound.PortMixer$PortMixerPort#4d405ef7:Port ZoomAudioDevice
com.sun.media.sound.DirectAudioDevice$DirectTDL#3f91beef:Default Audio Device
com.sun.media.sound.DirectAudioDevice$DirectTDL#1a6c5a9e:MacBook Pro Microphone
com.sun.media.sound.DirectAudioDevice$DirectTDL#37bba400:ZoomAudioDevice
Upon this realization I manually loaded up all the TargetDataLines from my mixers and tried recording audio with each of them to see if I got any sound.
I used the following method to collect all the TargetDataLines:
public static ArrayList<Line.Info> allTDL() {
ArrayList<Line.Info> all = new ArrayList<>();
for (Mixer.Info i : AudioSystem.getMixerInfo()) {
Line.Info[] tli = AudioSystem.getMixer(i).getTargetLineInfo();
if (tli.length != 0) {
for (int f = 0; f < tli.length; f += 1) {
all.add(tli[f]);
}
}
}
return all;
}
My capture/record audio method remained the same, except for switching the format to AudioFormat f = new AudioFormat(48000, 16, 2, true, false);, changing the recording time to 5000 milliseconds, and writing the method header as public static void recordAudio(Line.Info inf) so I could load each TargetDataLine individually with it's info.
I then executed the following code to rotate TargetDataLines:
public static void main(String[] args) {
for (Line.Info inf : allTDL()) {
recordAudio(inf);
try {
Thread.sleep(5000);
} catch(Exception e) {
e.printStackTrace();
}
if (!soundless(loadAsBytes("input.wav"))) {
System.out.println("The recording with " + inf.toString() + " has sound!");
}
System.out.println("The last recording with " + inf.toString() + " was soundless.");
}
}
The output was as such:
Recording...
Was unable to cast com.sun.media.sound.PortMixer$PortMixerPort#506e1b77 to a TargetDataLine.
End recording.
The last recording with SPEAKER target port was soundless.
Recording...
Was unable to cast com.sun.media.sound.PortMixer$PortMixerPort#5e9f23b4 to a TargetDataLine.
End recording.
The last recording with ZoomAudioDevice target port was soundless.
Recording...
End recording.
The last recording with interface TargetDataLine supporting 8 audio formats, and buffers of at least 32 bytes was soundless.
Recording...
End recording.
The last recording with interface TargetDataLine supporting 8 audio formats, and buffers of at least 32 bytes was soundless.
Recording...
End recording.
The last recording with interface TargetDataLine supporting 14 audio formats, and buffers of at least 32 bytes was soundless.
TL;DR the audio came out soundless for every TargetDataLine.
For completeness, here are the soundless and loadAsBytes functions:
public static byte[] loadAsBytes(String name) {
assert name.contains(".wav");
ByteArrayOutputStream out = new ByteArrayOutputStream();
File retrieve = new File("src/"+ name);
try {
InputStream input = AudioSystem.getAudioInputStream(retrieve);
int read;
byte[] b = new byte[1024];
while ((read = input.read(b)) > 0) {
out.write(b, 0, read);
}
out.flush();
byte[] full = out.toByteArray();
return full;
} catch(UnsupportedAudioFileException e) {
System.err.println("The File " + name + " is unsupported on this system.");
e.printStackTrace();
} catch (IOException e) {
System.err.println("Input-Output Exception on retrieval of file " + name);
e.printStackTrace();
}
return null;
}
static boolean soundless(byte[] s) {
if (s == null) {
return true;
}
for (int i = 0; i < s.length; i += 1) {
if (s[i] != 0) {
return false;
}
}
return true;
}
I'm not really sure what the issue could be at this point, save for an operating-system quirk that doesn't allow Java to access audio lines, but I do not know how to fix that; looking at System Preferences, there isn't any obvious way to grant access. I think it might have to be done with terminal commands, but I'm also not sure precisely which commands I'd have to execute.
I'm not seeing anything wrong in the code you are showing. I haven't tried testing it on my system though. (Linux, Eclipse)
It seems to me your code closely matches this tutorial. The author Nam Ha Minh is exceptionally conscientious about answering questions. You might try his exact code example and consult with him if his version also fails for you.
But first, what is the size of the resulting .wav file? Does the file size match the amount of data expected for the duration you are recording? If so, are you sure you have data incoming from your microphone? Nam has another code example where recorded sound is progressively read and placed into memory. Basically, instead of using the AudioInputStream as a parameter to the AudioSystem.write method, you execute multiple read method calls on the AudioInputStream and inspect the incoming data directly. That might be helpful for trouble-shooting whether the problem is occurring on the incoming vs outgoing part of the process.
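A sketch of that troubleshooting idea: read from the TargetDataLine directly and check whether anything non-zero arrives. This assumes the default line; the class and helper names are mine:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.TargetDataLine;

public class LevelCheck {
    /** True if any of the first n bytes in the buffer is non-zero. */
    static boolean anyNonZero(byte[] buf, int n) {
        for (int i = 0; i < n; i++) {
            if (buf[i] != 0) return true;
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        AudioFormat fmt = new AudioFormat(44100, 16, 2, true, false);
        TargetDataLine line = AudioSystem.getTargetDataLine(fmt);
        line.open(fmt);
        line.start();
        byte[] buf = new byte[4096];
        boolean sawData = false;
        // Read a handful of 4 KB chunks; with a silent line they stay all zeros.
        for (int i = 0; i < 20 && !sawData; i++) {
            int n = line.read(buf, 0, buf.length);
            sawData = anyNonZero(buf, n);
        }
        line.stop();
        line.close();
        System.out.println(sawData ? "Line is delivering audio data" : "Only zero-valued (silent) data");
    }
}
```

If this prints the silent branch while you speak into the microphone, the problem is on the incoming side, not in the WAV-writing part.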
I'm not knowledgeable enough about formats to know if the Mac does things differently. I'm surprised you are setting the format to unsigned. For my limited purposes, I stick with "CD quality stereo" and signed PCM at all junctures.
EDIT: based on feedback, it seems that the problem is that the incoming line is not returning data. From looking at other, similar tutorials, it seems that several people have had the same problem on their Mac systems.
First thing to verify: does your microphone work with other applications?
As far as next steps, I would try verifying the chosen line. The lines that are exposed to java can be enumerated/inspected. The tutorial Accessing Audio System Resources has some basic information on how to do this. It looks like AudioSystem.getMixerInfo() will return a list of available mixers that can be inspected. Maybe AudioSystem.getTargetLineInfo() would be more to the point.
I suppose it is possible that the default Line or Port being used when you obtain a TargetDataLine isn't the one that is running the microphone. If a particular line or port turns out to be the one you need, then it can be specified explicitly via an overridden getTargetDataLine method.
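As a sketch of picking a specific mixer rather than the default (matching by the name your enumeration printed; the helper name is mine):

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.TargetDataLine;

public class MixerPick {
    /** Returns a TargetDataLine from the first mixer whose name contains the given fragment, or null. */
    static TargetDataLine lineFromMixer(String nameFragment, AudioFormat fmt)
            throws LineUnavailableException {
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, fmt);
        for (Mixer.Info mi : AudioSystem.getMixerInfo()) {
            if (mi.getName().contains(nameFragment)) {
                Mixer mixer = AudioSystem.getMixer(mi);
                if (mixer.isLineSupported(info)) {
                    return (TargetDataLine) mixer.getLine(info);
                }
            }
        }
        return null; // no matching mixer exposes a TargetDataLine for this format
    }

    public static void main(String[] args) throws Exception {
        AudioFormat fmt = new AudioFormat(44100, 16, 2, true, false);
        TargetDataLine line = lineFromMixer("Microphone", fmt);
        System.out.println(line == null ? "No matching mixer" : "Got line: " + line.getLineInfo());
    }
}
```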
I'm reading that there might be a security policy that needs to be handled. I don't fully understand the code, but if that were the issue, an Exception presumably would have been thrown. Perhaps there are new security measures coming from the MacOs, to prevent an external program from opening a mic line surreptitiously?
If you do get this solved, be sure and post the answer and mark it solved. This seems to be a live question for many people.

MediaCodec AV Sync when decoding

All of the questions regarding syncing audio and video when decoding with MediaCodec suggest that we should use an "AV Sync" mechanism to sync the video and audio using their timestamps.
Here is what I do to achieve this:
I have 2 threads, one for decoding video and one for audio. To sync the video and audio I'm using Extractor.getSampleTime() to determine if I should release the audio or video buffers, please see below:
//This is called after configuring MediaCodec(both audio and video)
private void startPlaybackThreads(){
//Audio playback thread
mAudioWorkerThread = new Thread("AudioThread") {
@Override
public void run() {
if (!Thread.interrupted()) {
try {
//Check info below
if (shouldPushAudio()) {
workLoopAudio();
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
};
mAudioWorkerThread.start();
//Video playback thread
mVideoWorkerThread = new Thread("VideoThread") {
@Override
public void run() {
if (!Thread.interrupted()) {
try {
//Check info below
if (shouldPushVideo()) {
workLoopVideo();
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
};
mVideoWorkerThread.start();
}
//Check if more buffers should be sent to the audio decoder
private boolean shouldPushAudio(){
int audioTime =(int) mAudioExtractor.getSampleTime();
int videoTime = (int) mExtractor.getSampleTime();
return audioTime <= videoTime;
}
//Check if more buffers should be sent to the video decoder
private boolean shouldPushVideo(){
int audioTime =(int) mAudioExtractor.getSampleTime();
int videoTime = (int) mExtractor.getSampleTime();
return audioTime > videoTime;
}
Inside workLoopAudio() and workLoopVideo() is all my MediaCodec logic (I decided not to post it because it's not relevant).
So what I do is get the sample times of the video and audio tracks, then check which one is bigger (further ahead). If the video is ahead, I pass more buffers to my audio decoder, and vice versa.
This seems to be working fine - The video and audio are playing in sync.
My question:
I would like to know if my approach is correct (is this how we should be doing it, or is there another/better way)? I could not find any working examples of this (written in Java/Kotlin), hence the question.
EDIT 1:
I've found that the audio trails behind the video (very slightly) when I decode/play a video that was encoded using FFmpeg. If I use a video that was not encoded using FFmpeg then the video and audio syncs perfectly.
The FFmpeg command is nothing out of the ordinary:
-i inputPath -crf 18 -c:v libx264 -preset ultrafast OutputPath
I will be providing additional information below:
I initialize/create AudioTrack like this:
//Audio
mAudioExtractor = new MediaExtractor();
mAudioExtractor.setDataSource(mSource);
int audioTrackIndex = selectAudioTrack(mAudioExtractor);
if (audioTrackIndex < 0){
throw new IOException("Can't find Audio info!");
}
mAudioExtractor.selectTrack(audioTrackIndex);
mAudioFormat = mAudioExtractor.getTrackFormat(audioTrackIndex);
mAudioMime = mAudioFormat.getString(MediaFormat.KEY_MIME);
mAudioChannels = mAudioFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
mAudioSampleRate = mAudioFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);
final int min_buf_size = AudioTrack.getMinBufferSize(mAudioSampleRate, (mAudioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO), AudioFormat.ENCODING_PCM_16BIT);
final int max_input_size = mAudioFormat.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);
mAudioInputBufSize = min_buf_size > 0 ? min_buf_size * 4 : max_input_size;
if (mAudioInputBufSize > max_input_size) mAudioInputBufSize = max_input_size;
final int frameSizeInBytes = mAudioChannels * 2;
mAudioInputBufSize = (mAudioInputBufSize / frameSizeInBytes) * frameSizeInBytes;
mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
mAudioSampleRate,
(mAudioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO),
AudioFormat.ENCODING_PCM_16BIT,
AudioTrack.getMinBufferSize(mAudioSampleRate, mAudioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT),
AudioTrack.MODE_STREAM);
try {
mAudioTrack.play();
} catch (final Exception e) {
Log.e(TAG, "failed to start audio track playing", e);
mAudioTrack.release();
mAudioTrack = null;
}
And I write to the AudioTrack like this:
//Called from within workLoopAudio, when releasing audio buffers
if (bufferAudioIndex >= 0) {
if (mAudioBufferInfo.size > 0) {
internalWriteAudio(mAudioOutputBuffers[bufferAudioIndex], mAudioBufferInfo.size);
}
mAudioDecoder.releaseOutputBuffer(bufferAudioIndex, false);
}
private boolean internalWriteAudio(final ByteBuffer buffer, final int size) {
if (mAudioOutTempBuf.length < size) {
mAudioOutTempBuf = new byte[size];
}
buffer.position(0);
buffer.get(mAudioOutTempBuf, 0, size);
buffer.clear();
if (mAudioTrack != null)
mAudioTrack.write(mAudioOutTempBuf, 0, size);
return true;
}
"NEW" Question:
The audio trails about 200 ms behind the video if I use a video that was encoded using FFmpeg. Is there a reason why this could be happening?
It seems like it is working now. I use the same logic as above, but now I keep a reference of the presentationTimeUs returned from MediaCodec.BufferInfo() before calling dequeueOutputBuffer to check if I should continue my video or audio work loop:
// Check if audio work loop should continue
private boolean shouldPushAudio(){
long videoTime = mExtractor.getSampleTime();
return tempAudioPresentationTimeUs <= videoTime;
}
// Check if video work loop should continue
private boolean shouldPushVideo(){
long videoTime = mExtractor.getSampleTime();
return tempAudioPresentationTimeUs >= videoTime;
}
// tempAudioPresentationTimeUs is set right before I call dequeueOutputBuffer
// As shown here:
tempAudioPresentationTimeUs = mAudioBufferInfo.presentationTimeUs;
int outIndex = mAudioDecoder.dequeueOutputBuffer(mAudioBufferInfo, timeout);
By doing this, my video and audio are synced perfectly, even with files that were encoded with FFmpeg (as mentioned in my edit above).
I ran into an issue where my video work loop didn't complete, this was caused by the audio reaching EOS before the video and then returning -1. So I changed my original mVideoWorkerThread to the following:
mVideoWorkerThread = new Thread("VideoThread") {
@Override
public void run() {
if (!Thread.interrupted()) {
try {
if (shouldPushVideo() || audioReachedEOS()) {
workLoopVideo();
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
};
mVideoWorkerThread.start();
private boolean audioReachedEOS() {
return mAudioExtractor.getSampleTime() == -1;
}
So I use audioReachedEOS() to check if my audio MediaExtractor returns -1. If it does, it means that my audio is done but my video is not, so I continue my video work loop until it is done.
This seems to be working as expected (when I only play/pause the video without seeking). I had another issue with seeking, but I will not elaborate on it.
I will release my application as is and update this answer when I run into problems.

Three-dimensional array arrives in wrong order at serial port

For a university assignment I'm writing a Java application that runs the game logic for an interactive LED table. The table itself is controlled by either 2 Arduino Duemilanoves or 1 Arduino Mega 2560.
To tell the Arduino(s) which LEDs should be lit in which color, I send the data over the serial port from a Raspberry Pi 3B+ to the Arduinos. As the table consists of 14 LED strips with 14 LEDs per strip, and each LED has 3 color values (RGB), I store the data about the table in an int[14][14][3] array.
Before sending the array to the Arduino I create a JSON object from it (using the Jackson library) and then send the array as a String using jSerialComm. Depending on which Arduino setup I use, I either transfer the whole array to JSON or split it into two int[7][14][3] arrays before creating the JSON object.
As the data arrived in the wrong order at the serial port when I used 2 Arduinos and jSerialComm, I got a new Arduino Mega 2560 (other SO questions suggested the wrong data order might be due to an outdated PL2303 module) and tried again, with the same result. After some further research I tried using jSSC instead of jSerialComm, but the same result still shows up.
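For reference, Jackson serializes an int[7][14][3] simply as nested JSON arrays, so the shape of the payload is easy to reproduce and diff against what the Arduino echoes back. A dependency-free sketch (class and method names are mine) that produces the same nesting:

```java
public class MatrixJson {
    /** Serialize a 3-D int array as nested JSON arrays, matching Jackson's compact output shape. */
    static String toJson(int[][][] m) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < m.length; i++) {
            if (i > 0) sb.append(',');
            sb.append('[');
            for (int j = 0; j < m[i].length; j++) {
                if (j > 0) sb.append(',');
                sb.append('[');
                for (int k = 0; k < m[i][j].length; k++) {
                    if (k > 0) sb.append(',');
                    sb.append(m[i][j][k]);
                }
                sb.append(']');
            }
            sb.append(']');
        }
        return sb.append(']').toString();
    }

    public static void main(String[] args) {
        int[][][] m = new int[2][2][3];
        m[1][0][2] = 255;
        System.out.println(toJson(m));
    }
}
```

Comparing this string byte-for-byte with what arrives on the Arduino can help narrow down whether the reordering happens in serialization or in transport.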
The Java class that I use to send the data to the Arduino looks like this (the commented-out code being the code from when I used jSerialComm / 2 Arduinos):
package de.pimatrix.backend;
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.net.ServerSocket;
import java.net.Socket;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fazecast.jSerialComm.SerialPort;
import jssc.SerialPortException;
public class SerialThread implements Runnable {
public static SerialPort arduino1, arduino2;
private int[][][] matrix = new int[14][14][3];
private int[][][] matrixLeft = new int[7][14][3];
private int[][][] matrixRight = new int[7][14][3];
private Socket localHost;
private Matrix matrixData;
private ObjectInputStream in;
@Override
public void run() {
SerialJSONWriter writer = new SerialJSONWriter();
ServerSocket ss = null;
localHost = null;
matrixData = new Matrix(matrix);
try {
ss = new ServerSocket(62000); // create a local socket on port 62000 to receive the
// matrix to be transferred from the ClientThread
} catch (IOException e) {
}
while (true) {
try {
localHost = ss.accept();
} catch (Exception e) {
e.printStackTrace();
}
initializeInputStream();
waitForMatrix();
splitMatrix();
try {
writer.tryWrite(matrixRight, matrixLeft);
} catch (Exception e) {
e.printStackTrace();
}
}
}
private void splitMatrix() {
for (int i = 0; i < 14; i++) {
for (int j = 0; j < 14; j++) {
if (i <= 6) {
matrixRight[i][j][0] = matrix[i][j][0];
matrixRight[i][j][1] = matrix[i][j][1];
matrixRight[i][j][2] = matrix[i][j][2];
} else {
matrixLeft[i - 7][j][0] = matrix[i][j][0];
matrixLeft[i - 7][j][1] = matrix[i][j][1];
matrixLeft[i - 7][j][2] = matrix[i][j][2];
}
}
}
}
private void initializeInputStream() {
try {
InputStream input = localHost.getInputStream();
in = new ObjectInputStream(input);
} catch (IOException e) {
e.printStackTrace();
}
}
public void waitForMatrix() {
System.out.println("Waiting for Matrix");
try {
matrixData = (Matrix) in.readObject();
} catch (IOException e) {
e.printStackTrace();
} catch (ClassNotFoundException e) {
e.printStackTrace();
}
this.matrix = matrixData.matrix;
}
class SerialJSONWriter implements AutoCloseable {
// Assign the serial ports
// private final SerialPort /*arduino1, arduino2,*/ arduinoMega;
private jssc.SerialPort arduinoMega;
public SerialJSONWriter() {
// arduino1 = SerialPort.getCommPort("COM5");
// arduino2 = SerialPort.getCommPort("COM6");
// arduinoMega = SerialPort.getCommPort("COM7");
arduinoMega = new jssc.SerialPort("COM7");
try {
arduinoMega.openPort();
arduinoMega.setParams(115200, 8, 1, jssc.SerialPort.PARITY_EVEN);
} catch (SerialPortException e) {
e.printStackTrace();
}
// arduinoMega.setBaudRate(115200);
// arduinoMega.setNumDataBits(8);
// arduinoMega.setNumStopBits(1);
// arduinoMega.setParity(0);
// set the timeouts for communication with the Arduinos
// arduino1.setComPortTimeouts(SerialPort.TIMEOUT_SCANNER, 0, 0);
// arduino2.setComPortTimeouts(SerialPort.TIMEOUT_SCANNER, 0, 0);
// arduinoMega.setComPortTimeouts(SerialPort.TIMEOUT_SCANNER, 0, 0);
// arduino1.setBaudRate(115200);
// arduino2.setBaudRate(115200);
// arduinoMega.setBaudRate(115200);
// arduino1.openPort();
// arduino2.openPort();
// arduinoMega.openPort();
// arduino1.setComPortTimeouts(SerialPort.TIMEOUT_READ_SEMI_BLOCKING | SerialPort.TIMEOUT_WRITE_BLOCKING, 0,
// 0);
// arduino2.setComPortTimeouts(SerialPort.TIMEOUT_READ_SEMI_BLOCKING | SerialPort.TIMEOUT_WRITE_BLOCKING, 0,
// 0);
// arduinoMega.setComPortTimeouts(SerialPort.TIMEOUT_READ_SEMI_BLOCKING | SerialPort.TIMEOUT_WRITE_BLOCKING, 0,
// 0);
}
public void write() {
}
private void tryWrite(Object dataRight, Object dataLeft) throws IOException {
String dataAsJSONRight = new ObjectMapper().writeValueAsString(dataRight) + "\n";
String dataAsJSONLeft = new ObjectMapper().writeValueAsString(dataLeft) + "\n";
try {
arduinoMega.writeString(dataAsJSONRight);
} catch (SerialPortException e) {
e.printStackTrace();
}
// for (int i = 0; i < dataAsJSONRight.length(); i++) {
//// arduino1.getOutputStream().write(dataAsJSONRight.getBytes()[i]);
// System.out.println(dataAsJSONRight);
// arduinoMega.getOutputStream().write(dataAsJSONRight.getBytes()[i]);
// }
// for (int i = 0; i < dataAsJSONLeft.length(); i++) {
//// arduino2.getOutputStream().write(dataAsJSONLeft.getBytes()[i]);
// arduinoMega.getOutputStream().write(dataAsJSONLeft.getBytes()[i]);
// }
}
@Override
public void close() throws Exception {
// arduino1.closePort();
// arduino2.closePort();
arduinoMega.closePort();
}
}
}
On the Arduino(s) the processing looks like this:
#include <ArduinoJson.h>
#include <Adafruit_NeoPixel.h>
#define PINROW0 2
#define PINROW1 3
#define PINROW2 4
#define PINROW3 5
#define PINROW4 6
#define PINROW5 7
#define PINROW6 8
#define NUMPIXELS 14 //Amount of pixels per row
Adafruit_NeoPixel row[] = { // Initialize the array that contains the addressable LED strips in the Adafruit format
Adafruit_NeoPixel(NUMPIXELS, PINROW0, NEO_GRB + NEO_KHZ800),
Adafruit_NeoPixel(NUMPIXELS, PINROW1, NEO_GRB + NEO_KHZ800),
Adafruit_NeoPixel(NUMPIXELS, PINROW2, NEO_GRB + NEO_KHZ800),
Adafruit_NeoPixel(NUMPIXELS, PINROW3, NEO_GRB + NEO_KHZ800),
Adafruit_NeoPixel(NUMPIXELS, PINROW4, NEO_GRB + NEO_KHZ800),
Adafruit_NeoPixel(NUMPIXELS, PINROW5, NEO_GRB + NEO_KHZ800),
Adafruit_NeoPixel(NUMPIXELS, PINROW6, NEO_GRB + NEO_KHZ800)
};
#define DELAY 1000 // refresh cycle in milliseconds
#define NUMSTRIPS 7 /*(sizeof(row)/sizeof(row[0]))*/ // number of connected LED strips
int values[7][14][3];
int c = 0;
String matrixAsString = "";
void setup() {
    /*Setup serial port on which the Pi connects to the Arduino*/
    Serial.begin(115200); // set baud rate to 115200 bits per second
    Serial.setTimeout(1000);
    Serial.println(100);
    /*initialize NeoPixel library*/
    for (int i = 0; i < NUMSTRIPS; i++) {
        row[i].begin();
        row[i].show();
    }
}
void process(String matrixAsString) {
    StaticJsonDocument<4372> doc;
    Serial.println(matrixAsString);
    deserializeJson(doc, matrixAsString);
    for (int i = 0; i < 7; i++) {
        for (int j = 0; i < 14; j++) {
            values[i][j][0] = values[i][j][1] = values[i][j][2] = (int) (doc[i][j][0]);
        }
    }
}
//infinite loop refreshing the matrix
void loop() {
    while (Serial.available()) {
        char c = Serial.read();
        Serial.println(matrixAsString);
        matrixAsString += c;
        if (c == '\n') {
            process(matrixAsString);
            matrixAsString = "";
        }
    }
}
When sending the data for a half matrix (so an int[7][14][3]):
[[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]],[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]],[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[255,0,0],[0,0,0],[0,0,0],[0,0,0]],[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]],[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]],[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]],[[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]]]
through the serial monitor in the Arduino IDE, I get the following output from the Arduino (via the Serial.println() in void loop):
As one can see, the first RGB values are transmitted correctly, but after even less than one complete LED strip the data arrives in the wrong order and (as the end of the picture shows) at some point completely stops showing up, which probably indicates no data is being read any more.
I've tried all kinds of things, such as swapping the Arduino in case the PL2303 is outdated or defective, and trying different libraries for serial communication, but I cannot figure out what I'm doing wrong. I've spent over 30 hours trying different approaches to no avail, so things are becoming really frustrating.
UPDATE
As suggested by B.Letz, I got the setup of the data, stop and parity bits right (now 8 data bits, 1 stop bit and no parity). Reading the Arduino's feedback I still got the same results, but after some tweaking I realized the problem was probably that my Serial.print caused a massive lag on the Arduino, so it couldn't handle all the data correctly and in time. After removing the first Serial.print call before the processing, I now see the first transmitted matrix printed correctly by the Arduino. However, for some reason the Arduino prints null for all further transmitted data. I'll try extending the timeouts in case the null occurs due to a timeout on the Arduino side.
UPDATE 2
Contrary to my assumption, reconfiguring the timeouts didn't solve the problem. I also figured out that after the first JSON object is sent, the Arduino prints null to the console and only sends me the first JSON object after receiving the second one; this is the only feedback I get from the Arduino other than null. I also noticed that when I send the JSON string over the serial monitor, the Arduino instantly prints the correct string, BUT it also prints an empty new line and then no longer responds to any new data.
The first step towards a working solution was removing the unnecessary Serial.println() call in loop() that ran every time a new char was read. After removing this line I could confirm that the data arrived properly.
The shifted feedback as mentioned in my second update to the post:
I also figured out that after the first JSON Object is sent the Arduino prints null to the console and only sends me the first JSON object after receiving the second JSON object. However this is the only time I get any feedback from the Arduino except null
occurred because I didn't wait long enough on the Java application side for data to arrive before calling the read() function. After fixing this I always received the correct string.
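The waiting logic on the Java side can be sketched roughly like this. This is a generic illustration against a plain java.io.InputStream rather than the serial library's actual API; the readFrame helper and the timeout value are made up for the example:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FrameReader {
    /**
     * Blocks until at least one byte is available (or the timeout elapses),
     * then reads a single newline-terminated frame from the stream.
     */
    static String readFrame(InputStream in, long timeoutMs) throws IOException, InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (in.available() == 0) {   // wait for data instead of reading immediately
            if (System.currentTimeMillis() > deadline) {
                throw new IOException("timed out waiting for serial data");
            }
            Thread.sleep(10);
        }
        StringBuilder sb = new StringBuilder();
        int b;
        while ((b = in.read()) != -1) {
            if (b == '\n') break;       // frames are newline-terminated, like the JSON matrix
            sb.append((char) b);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // A fake stream stands in for the serial port here.
        InputStream fake = new ByteArrayInputStream("[[[0,0,0]]]\n".getBytes());
        System.out.println(readFrame(fake, 1000)); // prints the frame without the newline
    }
}
```

The key point is polling available() before read(), so a read is only attempted once the Arduino's reply has actually arrived.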
Trying out different configurations with DynamicJsonDocument and StaticJsonDocument, I ended up using DynamicJsonDocument, although a StaticJsonDocument might have worked here as well.
A rather unpleasant issue was that in the inner for-loop in void process I accidentally used the outer loop's counter in the condition (for (int j = 0; i < 14; j++) instead of j < 14), even though I was able to retrieve the correct data at that point outside of the for-loop.
The problem asked in this question is thereby solved. However, an even bigger problem has now appeared: I'm not able to retrieve any data from the received JSON object as soon as I implement the code for controlling the LEDs and call row[i].setPixelColor(j, row[i].Color(values[i][j][0], values[i][j][1], values[i][j][2])); at any point in my code. So to sum things up, this specific call is the actual reason the code doesn't work properly.
I'll open a new question for that problem, as it doesn't belong to this question thematically, but I will add a reference to it here once it's written.
UPDATE
The new question regarding addressing more than 7 LED strips using Adafruit's NeoPixel library can be found here.

ChannelGroup not working as intended in Netty 4.1.6

I have a ChannelGroup containing 4 clients that logged in one by one and were added to the group as they logged in, through Netty's handlerAdded method.
static ChannelGroup channels = new DefaultChannelGroup(GlobalEventExecutor.INSTANCE);
@Override
public void handlerAdded(ChannelHandlerContext ctx) throws Exception {
    channels.add(ctx.channel());
    assignGroup(ctx.channel());
}
I later save my ChannelGroup in an object named GameMaster and initialize a game loop that gives each client a chance to play its turn:
static ChannelGroup channels; // contains the actual channels, saved in constructor

public void bettingPhase(boolean isOk) {
    int i = 0;
    int y = 0;
    for (Channel chan : channels) { // Should start on the first client that logged in
        Server.MyMessage.Builder message = Server.MyMessage.newBuilder();
        message.setKeyword("304");
        message.setValue("Please input a contract:");
        chan.writeAndFlush(message); // PROBLEM HERE
        while (isOk == false) { // Loop to check the playing client's input
            Server.MyMessage.Builder message2 = Server.MyMessage.newBuilder();
            try {
                Thread.sleep(1000);
                if (playerAnswer.getKeyword().equals("CONTRACT")) {
                    System.out.println("Player has inputed contract.");
                    i++;
                    message2.setKeyword("310");
                    chan.writeAndFlush(message); // Tells the current client his turn is over
                    System.out.println("End of turn for player.");
                    for (Channel channel : channels) { // Loop to tell the next client it's his turn
                        if (y == i) {
                            message2.setKeyword("309");
                            channel.writeAndFlush(message);
                            System.out.println("Start of turn for player.");
                        }
                        y++;
                    }
                    isOk = true;
                    playerAnswer.clearKeyword();
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        isOk = false;
    }
}
Please excuse this big chunk of code.
But for some reason my message "304 Please input a contract:", sent via chan.writeAndFlush(message);, goes to another client, and not to the first one as it should through the loop!
Am I missing something?
What do you mean by "first one"? ChannelGroup is a Set and it makes no guarantee to maintain insertion order when being iterated.
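If turn order matters, one common workaround (a suggestion following from the comment above, not something prescribed by Netty) is to keep an ordered collection alongside the group. A minimal sketch with plain JDK types, where String stands in for Channel and a HashSet stands in for the ChannelGroup:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class TurnOrder {
    // Membership set, standing in for the ChannelGroup (which is also a Set
    // and guarantees no particular iteration order).
    static Set<String> group = new HashSet<>();
    // Parallel list that records login order, for deterministic turn-taking.
    static List<String> loginOrder = new ArrayList<>();

    // Analogous to handlerAdded: register the client in both collections.
    static void onLogin(String client) {
        group.add(client);
        loginOrder.add(client);
    }

    public static void main(String[] args) {
        for (String c : new String[] {"alice", "bob", "carol", "dave"}) {
            onLogin(c);
        }
        // Iterate loginOrder, not the set, when turn order matters:
        System.out.println(loginOrder.get(0)); // prints "alice", the first client that logged in
    }
}
```

In the Netty version you would still use the ChannelGroup for broadcasts and cleanup, but drive bettingPhase's for-loop from the ordered list (removing channels from it in handlerRemoved).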
