My app encodes a PCM file to an m4a file using MediaMuxer, MediaFormat, and MediaCodec.
I read some code that sets things up like this:
MediaFormat outputFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", SampleRate, 1);
outputFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
outputFormat.setInteger(MediaFormat.KEY_BIT_RATE, 96000);
outputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 16384);
I searched for MediaFormat.KEY_MAX_INPUT_SIZE, but it is not clear to me why it needs to be set. I've read that some Samsung devices crash without it, but I don't know whether that's true.
Is it necessary/good/advisable to set this key? If so, to which value(s)?
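I can't verify the Samsung claim, but one way I could check whether the key matters on a given device is to configure the encoder without it and inspect the input buffer capacity the codec picks by default. A minimal sketch, assuming API 21+ (probeDefaultInputSize is a name I made up; outputFormat is the format built above, without KEY_MAX_INPUT_SIZE):
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;

// Hypothetical helper: configure the encoder WITHOUT KEY_MAX_INPUT_SIZE and
// report the input buffer capacity the codec chooses on this device.
static int probeDefaultInputSize(MediaFormat outputFormat) throws IOException {
    MediaCodec codec = MediaCodec.createEncoderByType("audio/mp4a-latm");
    codec.configure(outputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    codec.start();
    int index = codec.dequeueInputBuffer(10_000); // wait up to 10 ms for a buffer
    int capacity = (index >= 0) ? codec.getInputBuffer(index).capacity() : -1;
    codec.stop();
    codec.release();
    return capacity;
}
If the PCM chunks queued per input buffer could exceed that capacity, setting KEY_MAX_INPUT_SIZE to the largest chunk size would presumably be the point of the key.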
I'm thinking about coding a Java applet that would take the top 100 or so songs, find the samples they use (music that appears within the songs) via WhoSampled.com, and then play those samples from YouTube.
My problem is the playing part. Let's say I have the URL; what's the best way to deal with that in Java? Do you think ripping the audio and playing it from there would be best, or should I try to control a YouTube player programmatically?
I'm leaning towards extracting the audio, and this thread mentions a way to extract it. However, the code:
wget http://www.youtube.com/get_video.php?video_id=...
ffmpeg -i - audio.mp3
is not written in Java. How do I, if possible, convert this to run in a Java program? Or does anyone know a good way to do it in Java?
Thank you for your suggestions.
You can use an FFmpeg Java wrapper like this one: https://github.com/bramp/ffmpeg-cli-wrapper/
An example can be found in the README. Converting MP4 to MP3 should look like this:
import net.bramp.ffmpeg.FFmpeg;
import net.bramp.ffmpeg.FFmpegExecutor;
import net.bramp.ffmpeg.FFprobe;
import net.bramp.ffmpeg.builder.FFmpegBuilder;

FFmpeg ffmpeg = new FFmpeg("/path/to/ffmpeg");
FFprobe ffprobe = new FFprobe("/path/to/ffprobe");
FFmpegBuilder builder = new FFmpegBuilder()
    .setInput("input.mp4")          // Filename, or an FFmpegProbeResult
    .overrideOutputFiles(true)      // Override the output if it exists
    .addOutput("output.mp3")        // Filename for the destination
    .setFormat("mp3")               // Format is inferred from the filename, or can be set
    .setAudioCodec("libmp3lame")    // MP3 output needs an MP3 encoder ("aac" would not fit an .mp3 file)
    .setAudioSampleRate(48_000)     // at 48 kHz
    .setAudioBitRate(32_768)        // at 32 kbit/s
    .done();
FFmpegExecutor executor = new FFmpegExecutor(ffmpeg, ffprobe);
// Run a one-pass encode
executor.createJob(builder).run();
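Alternatively, if you'd rather run the exact command-line steps from the question inside a Java program, here is a minimal sketch with ProcessBuilder (this assumes wget and ffmpeg are installed and on the PATH; the "..." video-id placeholder is kept from the question):
import java.io.IOException;

public class DownloadAndConvert {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Download the video to a local file (fill in a real video id for "...")
        new ProcessBuilder("wget", "-O", "video.flv", "http://www.youtube.com/get_video.php?video_id=...")
                .inheritIO().start().waitFor();
        // Extract/convert the audio track to MP3
        new ProcessBuilder("ffmpeg", "-i", "video.flv", "audio.mp3")
                .inheritIO().start().waitFor();
    }
}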
As the title points out, I'm having trouble writing files to external storage. My debug device is a Nexus 5. The thing is, I'm able to read files from the device perfectly (I've been trying with the ones in the Download folder) but cannot write to it. I'm aware that I supposedly must do this while the device isn't connected to the computer, but that doesn't work either.
In fact, I've tried reading the state of the SD card prior to writing to it (which didn't help, of course). The state showed as "mounted" whether the device was connected to my PC or not. I also compared the state to Environment.MEDIA_MOUNTED_READ_ONLY and Environment.MEDIA_MOUNTED without any success; my device is in neither of these states.
One thing you should know is that my phone doesn't have an external SD card; its "external" storage is internal. As a result, the device exposes a "/storage/emulated/0/..." directory for external storage.
I must also point out that I have implemented the following tags in my Android Manifest:
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="ANDROID.PERMISSION.WRITE_EXTERNAL_STORAGE"/>
I don't have any clue as to what might be happening. Another thing which might help: I've tried managing files with WinRAR (for Android) and I've been able to remove files both with the device connected to my PC and without it connected. So I don't know what to do.
The code I'm using to write a file is the following. Bear in mind that it reads an image file (which works), converts it into a string, converts it back into an image, and then should save it to the Downloads folder:
File file = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS).getAbsolutePath() + "/base_image.jpg");

// Read the image file from the file system
FileInputStream imageInFile = new FileInputStream(file);
byte[] imageData = new byte[(int) file.length()];
imageInFile.read(imageData);

// Convert the image byte array into a Base64 string
String imageDataString = encodeImage(imageData);

// Convert the Base64 string back into an image byte array
byte[] imageByteArray = decodeImage(imageDataString);

// Write the image byte array back to the file system
File newFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS), "converted_image.jpg");
FileOutputStream imageOutFile = new FileOutputStream(newFile);
imageOutFile.write(imageByteArray);

imageInFile.close();
imageOutFile.close();
What should I do?
Just fix ANDROID.PERMISSION.WRITE_EXTERNAL_STORAGE to android.permission.WRITE_EXTERNAL_STORAGE in your uses-permission.
I've encountered this problem myself; permission names are case-sensitive, so the uppercase version simply isn't recognized.
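For reference, the corrected line:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>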
In my case, the output file was not being created automatically, so check for your file and create it if it doesn't exist:
if(!newFile.exists()) {
newFile.createNewFile();
}
Hope this helps!
My goal
I want to do the following in Java: record some sound produced by Java itself, and then save it as a wav file.
I've searched for a lot of these kinds of programs online and found some good ones, but I keep running into the same problem (the current version can be found here). The problem is always something like this:
In the public class, the start function begins with
AudioFormat format = getAudioFormat();
DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
if (!AudioSystem.isLineSupported(info)) {
System.out.println("Line not supported");
System.exit(0);
}
where getAudioFormat() returns new AudioFormat(16000, 8, 2, true, true). If I run the program, it prints Line not supported, but why?
What I've tried:
I've searched for loads of different formats online, and tried each of them.
I've taken an existing wav file that Java can play, read its format, and used that format for the file I'm trying to create.
Some more information:
System.out.println(AudioSystem.getTargetLineInfo(info));
System.out.println(AudioSystem.getAudioFileTypes());
System.out.println(AudioSystem.getMixerInfo());
System.out.println(AudioSystem.getSourceDataLine(format));
where format is the format of the existing wav file, and info is defined as in the first code snippet, prints:
[Ljavax.sound.sampled.Line$Info;@4eec7777
[Ljavax.sound.sampled.AudioFileFormat$Type;@41629346
[Ljavax.sound.sampled.Mixer$Info;@6d311334
com.sun.media.sound.DirectAudioDevice$DirectSDL@448139f0
I use Windows 7, 64-bit.
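I realize the first three lines above are just the arrays' default toString() output. To actually inspect what's available, something like this (using java.util.Arrays.toString and querying each mixer) should print readable entries:
import java.util.Arrays;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Mixer;

public class ListAudioInfo {
    public static void main(String[] args) {
        // List every mixer and the capture (target) lines it supports
        for (Mixer.Info mixerInfo : AudioSystem.getMixerInfo()) {
            Mixer mixer = AudioSystem.getMixer(mixerInfo);
            System.out.println(mixerInfo);
            System.out.println("  target lines: " + Arrays.toString(mixer.getTargetLineInfo()));
        }
    }
}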
I'd like to add that if there's a better, entirely different way to achieve my goal, that's fine too.
Our current project requires us to send an audio file to the server and then use the audio file for further computation.
Using the Java Sound API, I was able to capture the recording and save it as a wav file on my system. Then, in order to send the audio wav to the server, I post a request with Apache Commons HttpClient (using the InputStreamEntity provided by Apache and sending the data chunked).
The problem appears when I try to recreate/retrieve the wav file on the server. I understand that I would have to use the AudioSystem.write API to create the wav file (exactly as was done on my system). However, what I observe is that although the file gets created, it does not play (I am using VLC media player to test it, FYI). I have searched Google for sample code and tried to implement it, but I am unable to play the file once it is created.
The following code snippets show the approaches I have tried:
//******************************************************************
try {
InputStream is = request.getInputStream();
FileOutputStream fs = new FileOutputStream("output123.wav");
byte[] tempbuffer = new byte[4096];
int bytesRead;
while((bytesRead=is.read(tempbuffer))!=-1)
{
fs.write(tempbuffer, 0,bytesRead);
}
is.close();
fs.close();
AudioInputStream inputStream = AudioSystem.getAudioInputStream(new File("output123.wav"));
int numofbytes = inputStream.available();
byte[] buffer = new byte[numofbytes];
inputStream.read(buffer);
int bytesWritten = AudioSystem.write(inputStream, AudioFileFormat.Type.WAVE,new File("outputtest.wav"));
System.out.println("written"+bytesWritten);
Approach 2
InputStream is = request.getInputStream();
System.out.println("inputStream obtained : "+is.toString());
ByteArrayInputStream bais = null;
byte[] audioBuffer = IOUtils.toByteArray(is);
System.out.println(" is audioBuffer empty? : length = ? "+audioBuffer.length);
try {
AudioFileFormat ai = AudioSystem.getAudioFileFormat(is);
System.out.println("ai bytelength ? "+ai.getByteLength());
System.out.println("ai frame length = "+ai.getFrameLength());
Set<Map.Entry<String,Object>> audioProperties = ai.getFormat().properties().entrySet();
System.out.println("entry set is empty ? "+audioProperties.isEmpty());
for(Map.Entry me : audioProperties){
System.out.println("key = "+me.getKey());
System.out.println("value ="+me.getValue());}
bais = new ByteArrayInputStream(audioBuffer);
AudioInputStream ais = new AudioInputStream(bais, new AudioFormat(8000,8,2,true,true), 2);
AudioSystem.write(ais, AudioFileFormat.Type.WAVE,new File("testtest.wav"));
//*************************************************************************************
The AudioFormat properties all turned out to be null. Are these null values causing the problem? While creating the wav file on the server, I tried to set the properties manually once again, but even then the wav file would not play.
I have also tried quite a few approaches already mentioned on this site, but somehow they aren't working. I am sure I am missing something, but I am unable to pinpoint the exact problem.
It would be really helpful if you could point out how to go from the ServletInputStream to a playable wav file.
P.S. (1) I know the code is shabby; I have been in trial-and-error mode for quite some time now, but I will give more details on the approaches if needed. (2) Apologies for the clumsiness; this happens to be my first post.
This is not how you copy a stream (from Approach 1); you have the correct code to copy a stream just above this:
int numofbytes = inputStream.available();
byte[] buffer = new byte[numofbytes];
inputStream.read(buffer);
(available() only returns an estimate of how many bytes can be read without blocking, and a single read() call is not guaranteed to fill the buffer.) If all your server wants to do is get the data and write it to a file, then you do not need to use any of the audio API: simply treat the data as a stream of bytes.
So the part of Approach 1 before any mention of AudioInputStream should be sufficient.
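In other words, the byte-copy loop alone is enough. A minimal sketch (just the first half of Approach 1, rewritten with try-with-resources):
try (InputStream is = request.getInputStream();
     FileOutputStream fs = new FileOutputStream("output123.wav")) {
    byte[] tempbuffer = new byte[4096];
    int bytesRead;
    while ((bytesRead = is.read(tempbuffer)) != -1) {
        fs.write(tempbuffer, 0, bytesRead);
    }
}
// The file now contains exactly the bytes the client sent; if the client
// posted a valid wav file (header included), it is already playable.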
Although the approach chosen might not be the perfect solution, due to time constraints I adopted a simpler one. Using java.util.zip, I simply zipped the file up, sent it over to the server, wrote a layer where the file gets unzipped, and then deleted the zip files. It seems like an immature solution (because the original challenge was to send the audio file); I now incur the overhead of zipping the files, but the file transfer happens relatively faster. Thanks for your help, guys.
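For anyone curious, the zip step is just java.util.zip boilerplate, something like this (file names here are made up):
import java.io.*;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipAudio {
    // Wrap a single file (e.g. the recorded wav) in a zip archive
    static void zip(File src, File dest) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(dest));
             FileInputStream in = new FileInputStream(src)) {
            zos.putNextEntry(new ZipEntry(src.getName()));
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                zos.write(buf, 0, n);
            }
            zos.closeEntry();
        }
    }
}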
I'm working on an application that has to process audio files. When using mp3 files, I'm not sure how to handle the data (the data I'm interested in is the audio bytes, the ones that represent what we hear).
If I'm using a wav file, I know I have a 44-byte header and then the data. When it comes to an mp3, I've read that it is composed of frames, each frame containing a header and audio data. Is it possible to get all the audio data from an mp3 file?
I'm using Java (I've added MP3SPI, JLayer, and Tritonus) and I'm able to get the bytes from the file, but I'm not sure what these bytes represent or how to handle them.
From the documentation for MP3SPI:
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

File file = new File(filename);
AudioInputStream in = AudioSystem.getAudioInputStream(file);
AudioInputStream din = null;
AudioFormat baseFormat = in.getFormat();
AudioFormat decodedFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
        baseFormat.getSampleRate(),
        16,
        baseFormat.getChannels(),
        baseFormat.getChannels() * 2,
        baseFormat.getSampleRate(),
        false);
din = AudioSystem.getAudioInputStream(decodedFormat, in);
You then just read data from din - it will be the "raw" data as per decodedFormat. (See the docs for AudioFormat for more information.)
(Note that this sample code doesn't close the stream or anything like that - use appropriate try/finally blocks as normal.)
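For example, a minimal read loop over the decoded stream might look like this (what you do with the buffer is up to you):
byte[] buffer = new byte[4096];
int bytesRead;
while ((bytesRead = din.read(buffer, 0, buffer.length)) != -1) {
    // buffer[0 .. bytesRead) now holds raw PCM: 16-bit signed, little-endian,
    // interleaved channels, as described by decodedFormat
}
din.close();
in.close();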
The data you want is the actual samples, while MP3 represents the data differently. So, as everyone else has said, you need a library to decode the MP3 data into actual samples for your purpose.
As mentioned in the other answers, you need a decoder to decode MP3 into regular audio samples.
One popular option would be JavaLayer (LGPL).