In Python, it's possible to read an audio file as an array of samples and get its sample rate with:
array_of_samples, samplerate = librosa.load('filename.mp3')
or, using the soundfile library:
array_of_samples, samplerate = sf.read('existing_file.wav')
Is there any similar way to do it in Dart code?
Related
I'm thinking about coding a Java applet that would take in the top 100 or so songs, find their samples (music that appears within the songs) on WhoSampled.com, and then play those samples from YouTube.
My problem is the playing part; let's say I have the URL. What's the best way to deal with that in Java? Do you think ripping the audio and playing it from there would be best, or should I try to control an embedded YouTube player?
I'm leaning towards extracting the audio, and this thread mentions a way to extract it; however, the code:
wget http://www.youtube.com/get_video.php?video_id=...
ffmpeg -i - audio.mp3
is not written in Java. How do I, if possible, convert this to run in a Java program? Or does anyone know a good way to do it in Java?
Thank you for your suggestions.
You can use an FFmpeg Java wrapper like this one: https://github.com/bramp/ffmpeg-cli-wrapper/
An example can be found in the README. Converting MP4 to MP3 should look like this:
FFmpeg ffmpeg = new FFmpeg("/path/to/ffmpeg");
FFprobe ffprobe = new FFprobe("/path/to/ffprobe");
FFmpegBuilder builder = new FFmpegBuilder()
    .setInput("input.mp4")           // Filename, or an FFmpegProbeResult
    .overrideOutputFiles(true)       // Overwrite the output if it exists
    .addOutput("output.mp3")         // Filename for the destination
    .setFormat("mp3")                // Format is inferred from the filename, or can be set
    .setAudioCodec("libmp3lame")     // an MP3 output needs an MP3 encoder, not "aac"
    .setAudioSampleRate(48_000)      // at 48 kHz
    .setAudioBitRate(32_768)         // at 32 kbit/s
    .done();
FFmpegExecutor executor = new FFmpegExecutor(ffmpeg, ffprobe);
// Run a one-pass encode
executor.createJob(builder).run();
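If you would rather not add a wrapper library, you can also launch the ffmpeg binary directly with ProcessBuilder. The following is only a minimal sketch, assuming ffmpeg is installed and on the PATH and that the video has already been downloaded (the file names are placeholders, and the download step itself is not shown):
import java.io.IOException;

public class FfmpegInvoker {

    // Runs: ffmpeg -i <inputFile> <outputFile>
    public static void extractAudio(String inputFile, String outputFile)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder("ffmpeg", "-i", inputFile, outputFile);
        pb.inheritIO();                    // show ffmpeg's console output
        Process process = pb.start();
        int exitCode = process.waitFor();  // block until ffmpeg finishes
        if (exitCode != 0) {
            throw new IOException("ffmpeg exited with code " + exitCode);
        }
    }

    public static void main(String[] args) throws Exception {
        extractAudio("downloaded_video.mp4", "audio.mp3");
    }
}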
Basically, I built an Android app that records my message and saves it in .m4a or .3gpp format.
When I play the recordings in my app it works fine, but when I try to play them on my website it doesn't work...
Android (Java)
recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(OUTPUT_FILE);
recorder.prepare();
recorder.start();
Website (HTML)
<audio controls="controls" preload="none">
<source src="my_record.m4a" type="audio/mp4"/>
</audio>
P.S.: When I tried to play some other m4a audio files (files that I found online), I succeeded.
The audio tag is quite sensitive about this. Anything above 128 kbps it will not play. A lot of encoders automatically choose the highest-quality bit rate (usually around 320 kbps) and the audio tag won't play them. The sample rate should be 44,100 Hz.
The sampling rates supported by the AAC audio coding standard range from 8 to 96 kHz, while AMR-NB supports only 8 kHz and AMR-WB only 16 kHz.
Hence, change the audio encoder in your code from
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
to
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
and then give the output file an .m4a extension (AAC in an MPEG-4 container), which matches the audio/mp4 type used on your website; see the sketch below.
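Putting the pieces together, a rough sketch based on the code in the question (OUTPUT_FILE is the question's own constant; the sample rate and bit rate values are illustrative assumptions, not requirements):
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);   // MP4/M4A container
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);      // AAC instead of AMR_NB
recorder.setAudioSamplingRate(44100);                          // 44.1 kHz, widely supported
recorder.setAudioEncodingBitRate(128000);                      // 128 kbps, safe for the <audio> tag
recorder.setOutputFile(OUTPUT_FILE);                           // e.g. a path ending in .m4a
recorder.prepare();
recorder.start();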
Hope this works for you. :)
How can I record from the microphone to a FLAC file?
I have tried this:
import javaFlacEncoder.FLAC_FileEncoder;
FLAC_FileEncoder flacEncoder = new FLAC_FileEncoder();
File outputFile = new File(dir + "/flac1.flac");
flacEncoder.encode(file, outputFile);
Error:
E/AndroidRuntime(5891): java.lang.VerifyError: javaFlacEncoder/FLAC_FileEncoder
Is it possible to record sound in WAV format using Java 1.6 and Android 4.0.3?
From what earlier comments here were alluding to:
From this link, Android appears to only support the AMR, PCM and GSM codecs for recording. If you need .flac (often the favorite of Google's services), I suggest recording lossless PCM (which .wav supports) and then using FFmpeg, SoX or some other audio converter to turn the files into .flac.
(All of these codecs are implemented as of Android 3.1 (API level 12), so this should apply to your version.)
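To illustrate the PCM route, here is a rough sketch that records raw PCM with AudioRecord and wraps it in a 44-byte WAV header afterwards. It is an assumption-heavy example, not production code: it records mono 16-bit audio at 44.1 kHz for a fixed number of seconds, skips permission and state checks, and the helper names (recordWav, writeWavHeader) are made up for this sketch.
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class WavRecorder {

    private static final int SAMPLE_RATE = 44100;

    // Records 'seconds' of mono 16-bit PCM from the mic and writes it as a .wav file.
    public static void recordWav(String path, int seconds) throws IOException {
        int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);

        ByteArrayOutputStream pcm = new ByteArrayOutputStream();
        byte[] buffer = new byte[minBuf];
        long bytesWanted = (long) SAMPLE_RATE * 2 * seconds;  // 2 bytes per mono sample

        recorder.startRecording();
        while (pcm.size() < bytesWanted) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                pcm.write(buffer, 0, read);
            }
        }
        recorder.stop();
        recorder.release();

        byte[] audio = pcm.toByteArray();
        DataOutputStream out = new DataOutputStream(new FileOutputStream(path));
        try {
            writeWavHeader(out, audio.length, SAMPLE_RATE, 1, 16);
            out.write(audio);
        } finally {
            out.close();
        }
    }

    // Standard 44-byte RIFF/WAVE header for uncompressed PCM data.
    private static void writeWavHeader(DataOutputStream out, int dataLen,
            int sampleRate, int channels, int bitsPerSample) throws IOException {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        out.writeBytes("RIFF");
        out.write(intLE(36 + dataLen));
        out.writeBytes("WAVE");
        out.writeBytes("fmt ");
        out.write(intLE(16));                                        // fmt chunk size
        out.write(shortLE((short) 1));                               // audio format 1 = PCM
        out.write(shortLE((short) channels));
        out.write(intLE(sampleRate));
        out.write(intLE(byteRate));
        out.write(shortLE((short) (channels * bitsPerSample / 8)));  // block align
        out.write(shortLE((short) bitsPerSample));
        out.writeBytes("data");
        out.write(intLE(dataLen));
    }

    private static byte[] intLE(int v) {
        return new byte[] {(byte) v, (byte) (v >> 8), (byte) (v >> 16), (byte) (v >> 24)};
    }

    private static byte[] shortLE(short v) {
        return new byte[] {(byte) v, (byte) (v >> 8)};
    }
}
The resulting .wav can then be handed to FFmpeg or SoX for the final conversion, e.g. ffmpeg -i input.wav output.flac.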
To stream an audio file I have implemented the following code, but I am getting an exception:
javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input file
at javax.sound.sampled.AudioSystem.getAudioInputStream(AudioSystem.java:1170)
Can anyone help me, please?
try {
    // Read the audio file from disk
    AudioInputStream stream = AudioSystem.getAudioInputStream(new File("C:\\track1.mp3"));
    System.out.println("stream created");

    AudioFormat format = stream.getFormat();
    if (format.getEncoding() != AudioFormat.Encoding.PCM_SIGNED) {
        // Convert to signed PCM so a SourceDataLine can play it
        format = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                format.getSampleRate(),
                format.getSampleSizeInBits() * 2,
                format.getChannels(),
                format.getFrameSize() * 2,
                format.getFrameRate(),
                true);                       // big endian
        stream = AudioSystem.getAudioInputStream(format, stream);
    }

    // Create a line matching the stream's format
    DataLine.Info info = new DataLine.Info(
            SourceDataLine.class, stream.getFormat(),
            (int) stream.getFrameLength() * format.getFrameSize());
    SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
    line.open(stream.getFormat());
    line.start();

    // Continuously read and play chunks of audio
    int numRead = 0;
    byte[] buf = new byte[line.getBufferSize()];
    while ((numRead = stream.read(buf, 0, buf.length)) >= 0) {
        int offset = 0;
        while (offset < numRead) {
            offset += line.write(buf, offset, numRead - offset);
        }
    }
    line.drain();
    line.stop();
} catch (Exception e) {
    e.printStackTrace();
}
That you're doing this job in a servlet class gives me the impression that your intent is to play the mp3 file whenever someone visits your website and that the visitor should hear this mp3 file.
If true, I'm sorry to say, but you're approaching this entirely wrong. Java servlet code runs on the webserver machine, not on the webbrowser machine. This way, whenever someone visits your website, the mp3 file would only be played on the webserver machine. That is usually a physically completely different machine, running at the other side of the network connection, and the visitor is never going to hear the music.
You want to send the mp3 file raw (unmodified, byte by byte) from the webserver to the webbrowser without massaging it through some Java Audio API, and instruct the webbrowser to play this file. The easiest way is to just drop the mp3 file in public webcontent (where your HTML/JSP files also are) and use the HTML <embed> tag to embed it in your HTML/JSP file. The example below assumes the MP3 file to be in the same folder as the HTML/JSP file:
<embed src="file.mp3" autostart="true"></embed>
That's all and this is supported in practically every browser and it will show a player as well.
If the MP3 file is, by business requirement, stored outside public webcontent, then you may indeed need a servlet for this, but the servlet should do absolutely nothing more than get an InputStream of it in some way and write it unmodified to the OutputStream of the HttpServletResponse, the usual Java IO way. You only need to set the HTTP Content-Type header to audio/mpeg beforehand and, if possible, also the HTTP Content-Length header. Then point the src to the servlet's URL instead.
<embed src="mp3servlet" autostart="true"></embed>
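For completeness, a rough sketch of such a streaming servlet (the MP3Servlet class name and the /path/to/file.mp3 location are placeholders, and the mapping to the mp3servlet URL pattern in web.xml or via @WebServlet is not shown):
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class MP3Servlet extends HttpServlet {
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        File mp3 = new File("/path/to/file.mp3");   // stored outside public webcontent

        // Tell the browser what it is getting and how big it is.
        response.setContentType("audio/mpeg");
        response.setContentLength((int) mp3.length());

        // Copy the file to the response unmodified, byte for byte.
        InputStream in = new FileInputStream(mp3);
        OutputStream out = response.getOutputStream();
        try {
            byte[] buffer = new byte[8192];
            int length;
            while ((length = in.read(buffer)) > 0) {
                out.write(buffer, 0, length);
            }
        } finally {
            in.close();
        }
    }
}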
Default java AudioInputStream does not support mp3 files. You have to plug in MP3SPI to let it decode mp3.
Also, what do you mean by streaming? This code will play the audio file, not stream it as in internet radio streaming.
I'm working on an application that has to process audio files. When using mp3 files I'm not sure how to handle the data (the data I'm interested in are the audio bytes, the ones that represent what we hear).
If I'm using a wav file I know I have a 44-byte header and then the data. When it comes to an mp3, I've read that it is composed of frames, each frame containing a header and audio data. Is it possible to get all the audio data from an mp3 file?
I'm using Java (I've added MP3SPI, JLayer, and Tritonus) and I'm able to get the bytes from the file, but I'm not sure what these bytes represent or how to handle them.
From the documentation for MP3SPI:
File file = new File(filename);
AudioInputStream in= AudioSystem.getAudioInputStream(file);
AudioInputStream din = null;
AudioFormat baseFormat = in.getFormat();
AudioFormat decodedFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
baseFormat.getSampleRate(),
16,
baseFormat.getChannels(),
baseFormat.getChannels() * 2,
baseFormat.getSampleRate(),
false);
din = AudioSystem.getAudioInputStream(decodedFormat, in);
You then just read data from din - it will be the "raw" data as per decodedFormat. (See the docs for AudioFormat for more information.)
(Note that this sample code doesn't close the stream or anything like that - use appropriate try/finally blocks as normal.)
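To make the byte handling concrete, here is a short sketch of reading from din and turning the decoded bytes into 16-bit samples. It assumes the decodedFormat requested above (16-bit signed PCM, little-endian):
// Read decoded PCM from din and convert byte pairs into 16-bit samples.
byte[] buffer = new byte[4096];
int bytesRead;
while ((bytesRead = din.read(buffer, 0, buffer.length)) != -1) {
    // With 16-bit little-endian PCM, every two bytes form one sample;
    // for stereo files the samples alternate between the two channels.
    for (int i = 0; i + 1 < bytesRead; i += 2) {
        short sample = (short) ((buffer[i] & 0xFF) | (buffer[i + 1] << 8));
        // ... do something with 'sample' ...
    }
}
din.close();
in.close();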
The data that you want are the actual samples, while MP3 represents the data differently. So, as everyone else has said, you need a library to decode the MP3 data into actual samples for your purpose.
As mentioned in the other answers, you need a decoder to decode MP3 into regular audio samples.
One popular option would be JavaLayer (LGPL).
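If you go the JavaLayer route, here is a hedged sketch of frame-by-frame decoding with its low-level API (the class names below are from JLayer 1.0.1; double-check them against the version you actually use):
import java.io.BufferedInputStream;
import java.io.FileInputStream;

import javazoom.jl.decoder.Bitstream;
import javazoom.jl.decoder.Decoder;
import javazoom.jl.decoder.Header;
import javazoom.jl.decoder.SampleBuffer;

public class Mp3ToSamples {
    public static void main(String[] args) throws Exception {
        Bitstream bitstream = new Bitstream(
                new BufferedInputStream(new FileInputStream("track.mp3")));
        Decoder decoder = new Decoder();

        Header frameHeader;
        while ((frameHeader = bitstream.readFrame()) != null) {
            // Each MP3 frame decodes to a buffer of 16-bit PCM samples.
            SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
            short[] samples = output.getBuffer();
            int count = output.getBufferLength();
            // ... process samples[0 .. count - 1] here ...
            bitstream.closeFrame();
        }
        bitstream.close();
    }
}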