I posted this question over at anddev.org and thought this was a good place to post it too, because this site has a larger user base.
I will make it shorter here: how can I play .xm or .mod files on the Android platform? Is there a good Android-compatible library for playback? I can't seem to find one. I'm not looking for a player like Extended Multimedia Player, just a library.
I know I could convert all the modules to MP3, but I don't like that option: it would blow up the app's file size.
Regards.
AndEngine is probably a good way to go.
Or if you just want to add mod/xm playing to your own code base, check out my page:
building libmodplug for android
There are a few caveats (it uses an older Android NDK, builds version 0.8.6 of libmodplug, and the thread handling could be better), but it basically works. There's an example APK with source.
I also recently added a new app that uses libmodplug to play the soundtracks from two of my games - 44 songs in 1.2 MB :)
PGsoundtracks
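For reference, the Java side of such a setup is mostly just a streaming AudioTrack fed with PCM that the native libmodplug code decodes. Below is a minimal sketch, with the JNI method names (loadModule, readPcm) and the library name "modplug-jni" being placeholders for whatever your own NDK wrapper around ModPlug_Load()/ModPlug_Read() exposes:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class ModPlayer {
    // Hypothetical JNI bridge around libmodplug (ModPlug_Load / ModPlug_Read).
    private static native boolean loadModule(byte[] moduleData);
    private static native int readPcm(byte[] buffer, int size);

    static {
        System.loadLibrary("modplug-jni"); // placeholder for your own NDK library name
    }

    public void play(byte[] moduleData) {
        if (!loadModule(moduleData)) return;

        int sampleRate = 44100;
        int bufferSize = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                bufferSize, AudioTrack.MODE_STREAM);
        track.play();

        byte[] pcm = new byte[bufferSize];
        int read;
        // Stream decoded 16-bit stereo PCM from the native decoder into the AudioTrack.
        while ((read = readPcm(pcm, pcm.length)) > 0) {
            track.write(pcm, 0, read);
        }
        track.stop();
        track.release();
    }
}
```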
The 2D game engine AndEngine has an extension that can play .mod files, it might suit your needs:
http://code.google.com/p/andenginemodplayerextension/
Download the xmp player from the Android Market. That is the module player the AndEngine extension uses; you don't need AndEngine or its extension just for playback.
Related
I am currently working on a project. I have been learning Java for about 4 years and am completely new to Android Studio.
In this project, a camera with artificial intelligence should recognize people within a certain radius.
I want to build an Android app in Java using Android Studio that accesses the camera and queries that information.
My question is: which tool or library can I use to build such an app, and what else do I need?
I tried OpenCV and also ExoPlayer and watched several different tutorials, but nothing really worked out for me.
I thought using histograms might be the right approach. I looked around and found this interesting project: https://isl.cs.technion.ac.il/wp-content/uploads/2020/11/Face_Recognition_Project.pdf
Introduction (quoted from the linked report):
Our goal was to create a fully operational mobile application which could detect, recognize and track human faces. In order to do that, we decided to use the Android[3] platform combined with the OpenCV library[4][5]. The development of the application was done on a Qualcomm MSM8960[6] mobile device running Android OS 4.0.3.
In addition to the application we built, we also researched how well LDA[1] and PCA[2] can be used to recognize faces, and how LDA can be used for basic pose estimation.
This is also interesting: https://towardsdatascience.com/face-recognition-how-lbph-works-90ec258c3d6b
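If you only need the detection step (before any LDA/PCA/LBPH recognition), OpenCV's Java bindings already include a Haar-cascade face detector. Here is a minimal sketch, assuming the OpenCV Android SDK has been initialized and that you have copied haarcascade_frontalface_default.xml to a readable path on the device; the class and variable names are just illustrative:

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

public class FaceDetector {
    private final CascadeClassifier cascade;

    public FaceDetector(String cascadePath) {
        // cascadePath points at haarcascade_frontalface_default.xml on disk
        cascade = new CascadeClassifier(cascadePath);
    }

    /** Returns bounding boxes of detected faces in a camera frame (RGBA Mat). */
    public Rect[] detect(Mat frame) {
        Mat gray = new Mat();
        Imgproc.cvtColor(frame, gray, Imgproc.COLOR_RGBA2GRAY);
        Imgproc.equalizeHist(gray, gray);

        MatOfRect faces = new MatOfRect();
        cascade.detectMultiScale(gray, faces);
        return faces.toArray();
    }
}
```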
I am creating a sound recording app, and I need to create an audio visualizer. I looked at this similar question, but it only listed outdated libraries, and I looked at several other questions but could not find an answer.
Do you know of any open-source GitHub projects that are not outdated?
I want to make a sound visualization that looks something like the image below:
You can use this library to visualize your audio while recording:
https://github.com/SimpleMobileTools/Simple-Voice-Recorder
I have used this in my application, and it's working really well.
It's not outdated, and it is up to date enough to be used in new projects.
Thanks
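If you'd rather roll your own instead of pulling in a library, a bar-style waveform like the one in the image can be driven from MediaRecorder.getMaxAmplitude() while recording. Below is a minimal sketch that only collects amplitude samples; it assumes you already hold a prepared and started MediaRecorder, and the class name and polling interval are just illustrative (the custom View that draws the bars is omitted):

```java
import android.media.MediaRecorder;
import android.os.Handler;
import android.os.Looper;
import java.util.ArrayList;
import java.util.List;

public class AmplitudePoller {
    private final MediaRecorder recorder;     // already prepared and started elsewhere
    private final List<Integer> amplitudes = new ArrayList<>();
    private final Handler handler = new Handler(Looper.getMainLooper());
    private boolean running;

    public AmplitudePoller(MediaRecorder recorder) {
        this.recorder = recorder;
    }

    public void start() {
        running = true;
        handler.post(poll);
    }

    public void stop() {
        running = false;
    }

    private final Runnable poll = new Runnable() {
        @Override public void run() {
            if (!running) return;
            // getMaxAmplitude() returns the peak amplitude since the last call (0..32767).
            int amplitude = recorder.getMaxAmplitude();
            amplitudes.add(amplitude);
            // TODO: invalidate a custom View that draws one bar per sample.
            handler.postDelayed(this, 50); // roughly 20 updates per second
        }
    };
}
```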
I want to make an Android app by integrating OpenCV with Android Studio. I have a set of 2D hardcopy card images that I want to save as templates within the app. Then, when I point my camera at any of the cards, the app should search the directory containing the templates, look for a match, and provide feedback if a match is found. If anyone can guide me on how to achieve this, it would be highly appreciated.
Also, if not OpenCV, then which SDK or tool should be preferred?
The question is a general one, so the answer will be general as well, and will make assumptions about what you'd like to accomplish with your application.
Android Studio with OpenCV is probably a reasonable stack to use.
Presuming the library has more than a trivial number of images, you'll probably want to extract matching information for each image in your library in an offline process (on your development machine). For instance, you would configure your development machine with a SQLite database driver and OpenCV, and write a program that extracts feature points and saves them to a SQLite database. That file can then be bundled in the Android APK's assets, ready on the application's first use. The application would take frames from the camera and compare them with each item in the database, looking for matches.
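As one concrete way to do the "extract matching information" step described above, here is a minimal sketch using ORB descriptors and a brute-force Hamming matcher from OpenCV's Java bindings. The class name, the distance threshold, and the omission of the SQLite serialization are all illustrative choices; the descriptor Mat returned by describe() is what you would persist offline:

```java
import org.opencv.core.DMatch;
import org.opencv.core.Mat;
import org.opencv.core.MatOfDMatch;
import org.opencv.core.MatOfKeyPoint;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.ORB;

public class TemplateMatcher {
    private final ORB orb = ORB.create();
    private final DescriptorMatcher matcher =
            DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);

    /** Offline step: compute ORB descriptors for a template image (grayscale Mat). */
    public Mat describe(Mat image) {
        MatOfKeyPoint keypoints = new MatOfKeyPoint();
        Mat descriptors = new Mat();
        orb.detectAndCompute(image, new Mat(), keypoints, descriptors);
        return descriptors;
    }

    /** Online step: count "good" matches between a camera frame and a stored template. */
    public int countGoodMatches(Mat frameDescriptors, Mat templateDescriptors) {
        MatOfDMatch matches = new MatOfDMatch();
        matcher.match(frameDescriptors, templateDescriptors, matches);
        int good = 0;
        for (DMatch m : matches.toArray()) {
            if (m.distance < 50) good++;   // ad-hoc threshold; tune against your card images
        }
        return good;
    }
}
```

The card whose stored descriptors produce the highest good-match count (above some minimum) would be reported as the detected template.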
I am prototyping a fairly simple camera app to test out using MediaRecorder to create a custom camera activity, with one snag: I want to set the aspect ratio of recorded videos to 1:1 (square). Through much research I have found that this is only possible by using a library like FFMPEG to crop each frame of the video to the size I desire.
I have read many tutorials and articles on different ways to build FFMPEG for Android, but most of them are either outdated and use older versions of both the Android NDK and FFMPEG, or are more recent but simply do not work when followed. I tried following the popular http://www.roman10.net/how-to-build-ffmpeg-for-android/ and a few other similar guides, all of which lead to an error about a missing pkg-config file, apparently because FFMPEG is generally meant to be built on Linux or another OS. I found some information about building FFMPEG for Android using a make-standalone-toolchain.sh file here http://software.intel.com/en-us/android/blogs/2013/12/06/building-ffmpeg-for-android-on-x86 but can't make heads or tails of how to go about using this method.
This leads to my question: what is the best, proven way to build and use FFMPEG within Android applications today? If the standalone toolchain method is the way to go, is there any material better than the article listed that is easier to follow? I would even be open to a reliable template application with the FFMPEG libraries ready to go (if such a thing exists), although I would much rather learn how to build this for Android myself for future use.
Thank you in advance for any advice or suggestions on this issue.
I have successfully built the ffmpeg libraries using
https://code.google.com/p/dolphin-player/
You have to be on Ubuntu to build that.
This is the guide I liked the most: http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/
If you need more options, you can take a look at these, which are equally good:
https://github.com/guardianproject/android-ffmpeg
https://github.com/halfninja/android-ffmpeg-x264
https://vec.io/posts/how-to-build-ffmpeg-with-android-ndk
EDIT: I updated the first link with a more recent article (it uses NDK r9).
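Whichever build route you pick, once you have an ffmpeg command-line binary bundled with the app, the 1:1 crop itself is just ffmpeg's crop filter. Below is a minimal sketch of invoking it from Java; it assumes you have unpacked an executable ffmpeg binary to the app's files directory, and the class and parameter names are purely illustrative:

```java
import java.io.File;

public class SquareCropper {
    /** Crops a recorded video to a centered square using a bundled ffmpeg binary. */
    public static void cropToSquare(File ffmpegBinary, File input, File output)
            throws Exception {
        // crop with w == h == min(iw,ih) gives a centered 1:1 crop of the shorter side.
        String filter = "crop='min(iw,ih)':'min(iw,ih)'";
        ProcessBuilder pb = new ProcessBuilder(
                ffmpegBinary.getAbsolutePath(),
                "-i", input.getAbsolutePath(),
                "-vf", filter,
                "-c:a", "copy",          // keep the audio track untouched
                output.getAbsolutePath());
        pb.redirectErrorStream(true);
        Process p = pb.start();
        int exit = p.waitFor();
        if (exit != 0) {
            throw new RuntimeException("ffmpeg exited with code " + exit);
        }
    }
}
```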
I'm trying to implement a small MP3 player on Android. So far so good, but I can't implement the following feature:
When playback of the file starts, check the file, get the artwork and display it.
(the artwork is embedded)
I've seen several libraries (jaudiotagger, jid3) which claim to be capable of doing this, but I did not manage to get it working.
Did somebody ever implement this and can show me some code?
Thanks, Nico
Android doesn't support the standard ImageIO libraries, which is why this part of jaudiotagger doesn't work on Android. There is an outstanding issue on jaudiotagger to resolve this for Android.
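As a workaround that sidesteps ImageIO entirely, the platform's own MediaMetadataRetriever can pull embedded artwork from an MP3 (available since API level 10). This is an alternative to jaudiotagger/jid3 rather than a fix for them; here is a minimal sketch, with the class name being illustrative:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.MediaMetadataRetriever;

public class ArtworkLoader {
    /** Returns the embedded cover art of an audio file, or null if there is none. */
    public static Bitmap loadArtwork(String filePath) {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        try {
            retriever.setDataSource(filePath);
            byte[] art = retriever.getEmbeddedPicture();
            if (art == null) return null;
            return BitmapFactory.decodeByteArray(art, 0, art.length);
        } finally {
            try {
                retriever.release();
            } catch (Exception ignored) {
                // release() may throw on some API levels; safe to ignore here
            }
        }
    }
}
```

You would call loadArtwork() with the path of the file you start playing and hand the resulting Bitmap to an ImageView.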