I am using the audio/video manipulation library mentioned below.
MobileFFmpeg --- implementation 'com.arthenica:mobile-ffmpeg-min:4.4'
After adding the dependency and applying ProGuard rules, the app download size is 26.8 MB.
How can I reduce the app size?
When you use this kind of library, you don't have much control over app size. One thing you can do is release an AAB (Android App Bundle) instead of an APK on Google Play.
For example, if you analyze the APK, you will see that it contains a separate .so file for each supported architecture.
With an AAB, Google Play generates an APK containing only the .so files for each device's architecture.
I would not worry too much about size for this kind of app, because it needs those .so files to work properly.
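If you have to ship plain APKs somewhere (for example, a store that doesn't support bundles), ABI splits in Gradle give a similar per-architecture result. A minimal sketch of the relevant build.gradle fragment, assuming the standard Android Gradle plugin; the ABI list is just an example:

    android {
        splits {
            abi {
                enable true            // build one APK per ABI
                reset()                // start from an empty ABI list
                include 'armeabi-v7a', 'arm64-v8a', 'x86', 'x86_64'
                universalApk false     // skip the fat APK with every .so
            }
        }
    }

Each resulting APK then carries only one architecture's .so files instead of all of them.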
A bolder approach you can take is to build your own lite version of FFmpeg by removing anything you don't need.
But that requires a deep understanding of the ffmpeg library.
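To give an idea of what that involves: FFmpeg's configure script can disable everything and then re-enable only the components you actually use. A rough sketch, assuming (purely as an example) you only need to decode AAC audio out of MP4 files; the exact decoders, demuxers, and protocols to enable depend on your app:

    # Turn everything off, then whitelist only what the app needs.
    ./configure \
        --disable-everything \
        --disable-doc \
        --disable-programs \
        --enable-small \
        --enable-protocol=file \
        --enable-demuxer=mov \
        --enable-decoder=aac
    make

The mobile-ffmpeg project you are depending on builds FFmpeg from source with its own scripts, so those are a reasonable place to apply such flags.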
I've got an app on Google Play, as well as on the Amazon Appstore, Huawei Marketplace, Samsung Apps, and one more store in China.
It's a puzzle game. I regularly implement new types of puzzles, and each time I have to make them available to users by creating a new version of the app and publishing it to all five stores.
The implementation of a single puzzle consists of two Java classes, a binary file, and a few PNG icons.
Would it be possible to write my app so that it instead downloads such 'puzzle modules' from my server? The APK would then need to download executable Java code from the Internet and somehow attach it to itself as a module. Is that possible?
It is possible, but it is also restricted in some stores, certainly on Google Play. (It is technically possible on iOS too, and likewise restricted in the App Store.)
Letting developers add executable code without store review and without informing users is simply unsafe, so store policies forbid it.
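For completeness, the loading mechanism itself does exist in the platform: DexClassLoader can load compiled classes from a file you downloaded. A minimal sketch, where the file name, package name, and puzzle class are all hypothetical placeholders; the point above stands, since shipping this on Google Play is exactly what the policy forbids:

    import android.content.Context;
    import dalvik.system.DexClassLoader;
    import java.io.File;

    // Sketch only: "puzzles.jar" and com.example.puzzles.SlidingPuzzle
    // are hypothetical names, not part of any real module.
    public final class PuzzleLoader {
        public static Object loadPuzzle(Context context) throws Exception {
            // Code previously downloaded from your server into app-private storage.
            File dexFile = new File(context.getFilesDir(), "puzzles.jar");
            // Directory where the runtime may cache optimized dex output.
            File optDir = context.getDir("dex_opt", Context.MODE_PRIVATE);
            DexClassLoader loader = new DexClassLoader(
                    dexFile.getAbsolutePath(),  // path to the downloaded classes
                    optDir.getAbsolutePath(),   // optimized-dex cache directory
                    null,                       // no extra native library path
                    context.getClassLoader());  // parent class loader
            Class<?> cls = loader.loadClass("com.example.puzzles.SlidingPuzzle");
            return cls.getDeclaredConstructor().newInstance();
        }
    }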
I am developing an Android app that processes images. I tried the Android NDK before so I could write native code to speed up performance, but I stopped using it because I don't fully understand the C language.
So I continued developing my app in Java. I searched for an easier way to speed up my app's performance without using native code, and it worked.
After some time, I realized that the NDK hadn't been removed from my project. I removed it and recompiled my code so I could delete the cpp folder too. But when I tested my app, it became very slow. I linked the NDK again and now the image processing is fast again.
My question is: how did that happen?
For image processing, C/C++ code usually produces results faster than Java on the desktop.
On Android too, I have seen C/C++ Android NDK code process images faster than Android Java code.
This may be because C/C++ compiles directly to native machine code, whereas Java compiles to bytecode, which the runtime must then interpret or compile down to native code; JNI is the bridge through which Java calls into the native side.
And with the NDK, a .so file is built for each Application Binary Interface (ABI), so the application's machine code interacts with the system directly at run time.
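To make the bridge concrete, this is roughly what the Java side of the JNI boundary looks like; a minimal sketch in which the library name "imageproc" and the brighten() method are illustrative, not a real library:

    package com.example.imaging;

    // Sketch only: "imageproc" and brighten() are made-up names.
    public final class NativeImageOps {
        static {
            // Loads libimageproc.so for the device's ABI from the APK.
            // If the .so is missing (e.g. the NDK part was removed),
            // this throws UnsatisfiedLinkError at class-load time.
            System.loadLibrary("imageproc");
        }

        // Declared in Java, implemented in C/C++ as
        // Java_com_example_imaging_NativeImageOps_brighten(JNIEnv*, jclass, jintArray, jint).
        public static native void brighten(int[] pixels, int amount);
    }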
A DVR I bought includes an SDK that provides the programming interface for the DVR.
The example I found in the SDK was made for Windows only; is there a way I can implement these methods on Android using the NDK?
The SDK consists of .cpp files and some .h files. It also links to an external .dll file whose source I cannot read, so I have no way to port it to Android.
EDIT
I'm trying to implement these methods for a live-view application. I managed to get the stream over RTSP with low latency from a different source, but I need the playback function that is available in the SDK.
EDIT 2
What I mean by "playback" is playing the recorded videos saved on the DVR's hard disk.
Some of the people I asked confused "playback" with "livestream", so this might also confuse some readers of this question.
I want to build an Android app by integrating OpenCV with Android Studio. I have a set of 2D hard-copy card images that I want to store as templates within the app. Then, when I point the camera at any of the cards, the app should search the directory containing the templates, look for a match, and provide feedback if a match is found. If anyone can guide me on how to achieve this, it will be highly appreciated.
Also, if not OpenCV, which SDK or tool should be preferred?
The question is a general one, so the answer will be general as well, and will make assumptions about what you'd like to accomplish with your application.
Android Studio with OpenCV is probably a reasonable stack to use.
Presuming the library has more than a trivial number of images, you'll probably want to extract matching information for each image in your library in an offline process (on your development machine). For instance, you would configure your development machine with a SQLite database driver and OpenCV, and construct a program that extracts feature points from each image and saves them to a SQLite database. That file can then be loaded into the Android APK's assets, so it is ready upon the application's first use. The application would take frames from the camera and compare them with each item in the database, looking for matches.
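As a concrete starting point, the extraction-and-matching step with OpenCV's Java bindings might look like the following. This is a sketch, assuming the OpenCV 3.x/4.x Java API with ORB picked as one reasonable detector, and it presumes the OpenCV native library has already been loaded:

    import org.opencv.core.*;
    import org.opencv.features2d.BFMatcher;
    import org.opencv.features2d.ORB;

    public final class CardMatcher {
        // Crude similarity score between a stored template and a camera frame:
        // the number of mutually-best ORB descriptor matches.
        public static int countMatches(Mat template, Mat frame) {
            ORB orb = ORB.create();
            MatOfKeyPoint kp1 = new MatOfKeyPoint(), kp2 = new MatOfKeyPoint();
            Mat desc1 = new Mat(), desc2 = new Mat();
            orb.detectAndCompute(template, new Mat(), kp1, desc1);
            orb.detectAndCompute(frame, new Mat(), kp2, desc2);

            // Hamming distance suits ORB's binary descriptors; cross-check
            // keeps only matches that are best in both directions.
            BFMatcher matcher = BFMatcher.create(Core.NORM_HAMMING, true);
            MatOfDMatch matches = new MatOfDMatch();
            matcher.match(desc1, desc2, matches);
            return matches.toArray().length;
        }
    }

In the offline pass you would store each library image's descriptors (desc1 here) in the SQLite database, so the phone never recomputes them.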
I am prototyping a fairly simple camera app to test out using MediaRecorder to create a custom camera activity, with one snag: I want to set the aspect ratio of recorded videos to 1:1. Through much research I have found that this is only possible by using a library like FFmpeg to crop each frame of the video to the size I desire.
I have read many tutorials and articles on different ways to build FFmpeg for Android, but most are either outdated, using older versions of both the Android NDK and FFmpeg, or more recent ones that simply do not work when followed. I tried following the popular http://www.roman10.net/how-to-build-ffmpeg-for-android/ and a few similar guides, all of which led to an error about a missing pkg-config file, apparently because FFmpeg is generally meant to be built on Linux or another Unix-like OS. I found some information about building FFmpeg for Android using a make-standalone-toolchain.sh script here: http://software.intel.com/en-us/android/blogs/2013/12/06/building-ffmpeg-for-android-on-x86, but I can't make heads or tails of how to use that method.
This leads to my question: what is the best, proven way to build and use FFmpeg within Android applications today? If the standalone-toolchain method is the way to go, is there any material that is easier to follow than the guide listed above? I would even be open to a reliable template application with the FFmpeg libraries ready to go (if such a thing exists), although I would much rather learn how to build this for Android myself for future use.
Thank you in advance for any advice or suggestions on this issue.
I have successfully built the FFmpeg libraries using
https://code.google.com/p/dolphin-player/
You have to be on Ubuntu to build that.
This is the guide I liked the most: http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/
If you need more options, you can take a look at these, which are equally good:
https://github.com/guardianproject/android-ffmpeg
https://github.com/halfninja/android-ffmpeg-x264
https://vec.io/posts/how-to-build-ffmpeg-with-android-ndk
EDIT: I updated the first link with a more recent article (it uses NDK r9).
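For reference, the heart of most of these guides is a configure invocation that points FFmpeg at the NDK's cross-compiler. A rough sketch of the common shape; the NDK path, platform level, and toolchain version below are assumptions that change between NDK releases:

    # Paths and versions are illustrative; adjust to your NDK release.
    NDK=$HOME/android-ndk-r9
    SYSROOT=$NDK/platforms/android-9/arch-arm
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64

    ./configure \
        --prefix=$(pwd)/android/armeabi-v7a \
        --enable-cross-compile \
        --target-os=linux \
        --arch=arm \
        --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
        --sysroot=$SYSROOT \
        --enable-shared \
        --disable-doc \
        --disable-programs
    make && make install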