Reproducing and resolving Android java.lang.UnsatisfiedLinkError locally - java

Together with a friend I have created an Android app to organize school grades. The app works fine on my device and on most user devices; however, there is a crash rate of over 3 percent, mostly caused by java.lang.UnsatisfiedLinkError and occurring on Android versions 7.0, 8.1 and 9.
I've tested the app on my phone and on several emulators, covering all the architectures. I upload the app to the Play Store as an Android App Bundle and suspect that this could be the source of the problem.
I am a bit lost here, because I've already tried several things, but so far I have not been able to either reduce the number of occurrences or reproduce the error on any of my devices. Any help will be highly appreciated.
I have found this resource, which points out that Android sometimes fails to unpack native libraries. For that reason its authors created the ReLinker library, which tries to extract the libraries from the compressed app itself.
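For reference, wiring ReLinker in is essentially a one-line replacement for the usual System.loadLibrary call; a minimal sketch (the library and class names here are placeholders, not our real ones):

import android.content.Context;
import com.getkeepsafe.relinker.ReLinker;

public final class NativeLoader {
    private NativeLoader() {}

    // Instead of System.loadLibrary("pluspoints-core"), let ReLinker extract and
    // load the .so itself; it falls back to pulling the library straight out of
    // the APK if the installer's extraction was incomplete.
    public static void load(Context context) {
        ReLinker.loadLibrary(context, "pluspoints-core");
    }
}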
Unfortunately, this did not reduce the number of crashes due to java.lang.UnsatisfiedLinkError. I continued my online research and found this article, which suggests that the problem lies in the 64-bit libraries. So I removed the 64-bit libraries (the app still runs on all devices, because 64-bit architectures can also execute 32-bit libraries). However, the error still occurs at the same frequency as before.
Through the Google Play Console I got the following crash report:
java.lang.UnsatisfiedLinkError:
at ch.fidelisfactory.pluspoints.Core.Wrapper.callCoreEndpointJNI (Wrapper.java)
at ch.fidelisfactory.pluspoints.Core.Wrapper.a (Wrapper.java:9)
at ch.fidelisfactory.pluspoints.Model.Exam.a (Exam.java:46)
at ch.fidelisfactory.pluspoints.SubjectActivity.i (SubjectActivity.java:9)
at ch.fidelisfactory.pluspoints.SubjectActivity.onCreate (SubjectActivity.java:213)
at android.app.Activity.performCreate (Activity.java:7136)
at android.app.Activity.performCreate (Activity.java:7127)
at android.app.Instrumentation.callActivityOnCreate (Instrumentation.java:1272)
at android.app.ActivityThread.performLaunchActivity (ActivityThread.java:2908)
at android.app.ActivityThread.handleLaunchActivity (ActivityThread.java:3063)
at android.app.servertransaction.LaunchActivityItem.execute (LaunchActivityItem.java:78)
at android.app.servertransaction.TransactionExecutor.executeCallbacks (TransactionExecutor.java:108)
at android.app.servertransaction.TransactionExecutor.execute (TransactionExecutor.java:68)
at android.app.ActivityThread$H.handleMessage (ActivityThread.java:1823)
at android.os.Handler.dispatchMessage (Handler.java:107)
at android.os.Looper.loop (Looper.java:198)
at android.app.ActivityThread.main (ActivityThread.java:6729)
at java.lang.reflect.Method.invoke (Method.java)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run (RuntimeInit.java:493)
at com.android.internal.os.ZygoteInit.main (ZygoteInit.java:876)
Wrapper.java is the class which calls our native library. The line it points to, however, just reads as follows:
import java.util.HashMap;
ch.fidelisfactory.pluspoints.Core.Wrapper.callCoreEndpointJNI is the entry point to our native C++ library.
In the native C++ library we use some external libraries (curl, jsoncpp, plog logging, sqlite and tinyxml2).
Edit 4th June 2019
As requested, here is the code of Wrapper.java:
package ch.fidelisfactory.pluspoints.Core;
import android.content.Context;
import org.json.JSONException;
import org.json.JSONObject;
import java.io.Serializable;
import java.util.HashMap;
import ch.fidelisfactory.pluspoints.Logging.Log;
/***
* Wrapper around the cpp pluspoints core
*/
public class Wrapper {
/**
* An AsyncCallback can be given to the executeEndpointAsync method.
* The callback method will be called with the returned json from the core.
*/
public interface AsyncCallback {
void callback(JSONObject object);
}
public static boolean setup(Context context) {
String path = context.getFilesDir().getPath();
return setupWithFolderAndLogfile(path,
path + "/output.log");
}
private static boolean setupWithFolderAndLogfile(String folderPath, String logfilePath) {
HashMap<String, Serializable> data = new HashMap<>();
data.put("folder", folderPath);
data.put("logfile", logfilePath);
JSONObject res = executeEndpoint("/initialization", data);
return !isErrorResponse(res);
}
public static JSONObject executeEndpoint(String path, HashMap<String, Serializable> data) {
JSONObject jsonData = new JSONObject(data);
String res = callCoreEndpointJNI(path, jsonData.toString());
JSONObject ret;
try {
ret = new JSONObject(res);
} catch (JSONException e) {
Log.e("Error while converting core return statement to json.");
Log.e(e.getMessage());
Log.e(e.toString());
ret = new JSONObject();
try {
ret.put("error", e.toString());
} catch (JSONException e2) {
Log.e("Error while putting the error into the return json.");
Log.e(e2.getMessage());
Log.e(e2.toString());
}
}
return ret;
}
public static void executeEndpointAsync(String path, HashMap<String, Serializable> data, AsyncCallback callback) {
// Create and start the task.
AsyncCoreTask task = new AsyncCoreTask();
task.setCallback(callback);
task.setPath(path);
task.setData(data);
task.execute();
}
public static boolean isErrorResponse(JSONObject data) {
return data.has("error");
}
public static boolean isSuccess(JSONObject data) {
String res;
try {
res = data.getString("status");
} catch (JSONException e) {
Log.w(String.format("JsonData is no status message: %s", data.toString()));
res = "no";
}
return res.equals("success");
}
public static Error errorFromResponse(JSONObject data) {
String errorDescr;
if (isErrorResponse(data)) {
try {
errorDescr = data.getString("error");
} catch (JSONException e) {
errorDescr = e.getMessage();
errorDescr = "There was an error while getting the error message: " + errorDescr;
}
} else {
errorDescr = "Data contains no error message.";
}
return new Error(errorDescr);
}
private static native String callCoreEndpointJNI(String jPath, String jData);
/**
* Log a message to the core
* @param level The level of the message. A number from 0 (DEBUG) to 5 (FATAL)
* @param message The message to log
*/
public static native void log(int level, String message);
}
Additionally, here is the C++ definition of the entry point that then calls our core library:
#include <jni.h>
#include <string>
#include "pluspoints.h"
extern "C"
JNIEXPORT jstring JNICALL
Java_ch_fidelisfactory_pluspoints_Core_Wrapper_callCoreEndpointJNI(
JNIEnv* env,
jobject /* this */,
jstring jPath,
jstring jData) {
const jsize pathLen = env->GetStringUTFLength(jPath);
const char* pathChars = env->GetStringUTFChars(jPath, (jboolean *)0);
const jsize dataLen = env->GetStringUTFLength(jData);
const char* dataChars = env->GetStringUTFChars(jData, (jboolean *)0);
std::string path(pathChars, (unsigned long) pathLen);
std::string data(dataChars, (unsigned long) dataLen);
std::string result = pluspoints_execute(path.c_str(), data.c_str());
env->ReleaseStringUTFChars(jPath, pathChars);
env->ReleaseStringUTFChars(jData, dataChars);
return env->NewStringUTF(result.c_str());
}
extern "C"
JNIEXPORT void JNICALL Java_ch_fidelisfactory_pluspoints_Core_Wrapper_log(
JNIEnv* env,
jobject,
jint level,
jstring message) {
const jsize messageLen = env->GetStringUTFLength(message);
const char *messageChars = env->GetStringUTFChars(message, (jboolean *)0);
std::string cppMessage(messageChars, (unsigned long) messageLen);
pluspoints_log((PlusPointsLogLevel)level, cppMessage);
}
Here is the pluspoints.h file:
/**
* Copyright 2017 FidelisFactory
*/
#ifndef PLUSPOINTSCORE_PLUSPOINTS_H
#define PLUSPOINTSCORE_PLUSPOINTS_H
#include <string>
/**
* Send a request to the Pluspoints core.
* @param path The endpoint you wish to call.
* @param request The request.
* @return The return value from the executed endpoint.
*/
std::string pluspoints_execute(std::string path, std::string request);
/**
* The different log levels at which can be logged.
*/
typedef enum {
LEVEL_VERBOSE = 0,
LEVEL_DEBUG = 1,
LEVEL_INFO = 2,
LEVEL_WARNING = 3,
LEVEL_ERROR = 4,
LEVEL_FATAL = 5
} PlusPointsLogLevel;
/**
* Log a message with the info level to the core.
*
* The message will be written in the log file in the core.
* @note The core needs to be initialized before this method can be used.
* @param level The level at which to log the message.
* @param logMessage The log message
*/
void pluspoints_log(PlusPointsLogLevel level, std::string logMessage);
#endif //PLUSPOINTSCORE_PLUSPOINTS_H

UnsatisfiedLinkError happens when your code tries to call something that doesn't exist for some reason: post about it
Here is one potential reason for multidex apps:
Nowadays almost every Android app uses Multidex to be able to include more stuff in it. When building the DEX files, the build tools try to work out which classes are required at startup and put them into the main DEX. However, they can miss something, especially where JNI is involved.
You can try manually marking the Wrapper class as required to be in the main DEX: docs. This may help bring its dependent native library along as well, in case you have a multidex app.

Your two native methods are declared static in Java, but in C++ the corresponding functions declare their second parameter as jobject.
Changing that type to jclass should help solve your problem.

Looking at the call stack you reported in the exception:
at ch.fidelisfactory.pluspoints.Core.Wrapper.callCoreEndpointJNI (Wrapper.java)
at ch.fidelisfactory.pluspoints.Core.Wrapper.a (Wrapper.java:9)
at ch.fidelisfactory.pluspoints.Model.Exam.a (Exam.java:46)
at ch.fidelisfactory.pluspoints.SubjectActivity.i (SubjectActivity.java:9)
at ch.fidelisfactory.pluspoints.SubjectActivity.onCreate (SubjectActivity.java:213)
That looks obfuscated (ProGuarded), doesn't it? After all, the trace should involve executeEndpoint(String, HashMap<String, Serializable>) according to your pasted code.
It could be that the lookup of the native method is failing because the strings no longer match. It's just a suggestion - I don't see why it would fail on only 3% of phones - but I have come across this problem before.
First off, test after disabling all obfuscation.
If it is related to ProGuard, then you will want to add rules to the project. See this link for suggestions: In proguard, how to preserve a set of classes' method names?
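If you would rather not maintain hand-written rules, an alternative worth considering (my suggestion, not something from the linked question) is to annotate the JNI surface with androidx.annotation.Keep, which the default Android Gradle plugin ProGuard files honour:

import androidx.annotation.Keep;

// The Java_ch_fidelisfactory_pluspoints_Core_Wrapper_* symbols in the .so can
// only be resolved if the class and method names survive shrinking unchanged.
@Keep
public class Wrapper {
    // ... rest of the class as posted in the question ...

    @Keep
    private static native String callCoreEndpointJNI(String jPath, String jData);

    @Keep
    public static native void log(int level, String message);
}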
Another thing - a quick check that can be useful to prevent unsightly crashes: at start-up, verify whether the package name and method that later cause the UnsatisfiedLinkError can be resolved.
//this is the potentially obfuscated native method you're trying to test
String myMethod = "<to fill in>";
boolean result = true;
try{
//set actual classname as required
String packageName = MyClass.class.getPackage().getName();
Log.i( TAG, "Checking package: name is " + packageName );
if( !packageName.contains( myMethod ) ){
Log.w( TAG, "Cannot resolve expected name" );
result = false;
}
}catch( Exception e ){
Log.e( TAG, "Error fetching package name " );
e.printStackTrace();
result = false;
}
If you get a negative result, warn the user of a problem, and fail gracefully.
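To make the "fail gracefully" part concrete, here is a sketch of what that could look like around the first native call in this particular app (where exactly you catch, and what you show the user, is of course up to you; it uses android.util.Log and Toast):

// e.g. called from SubjectActivity.onCreate() before anything touches the core
private boolean initNativeCoreOrWarn() {
    try {
        return Wrapper.setup(getApplicationContext());
    } catch (UnsatisfiedLinkError e) {
        // The native library could not be resolved on this installation.
        Log.e(TAG, "Native core unavailable", e);
        Toast.makeText(this, "Please reinstall the app from the Play Store.",
                Toast.LENGTH_LONG).show();
        finish();
        return false;
    }
}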

If the 3% of users who crash are on devices with 64-bit processors, then you should see this post on Medium.

That this has to do with ProGuard is unlikely, and the code provided is largely irrelevant; the build.gradle and the directory structure would be the only things one would need to see. Given that Android 7, 8 and 9 are mentioned, this is most likely ARM64 related. The question also features the fairly inaccurate assumption that ARM64 devices can run 32-bit ARM native libraries: that is only the case when the 32-bit libraries are dropped into the armeabi directory; it will complain with an UnsatisfiedLinkError when the armeabi-v7a directory is used. And it is not even necessary when you are able to build for ARM64 and drop ARM64 native libraries into the arm64-v8a directory.
And if this should be app-bundle related (I've just noticed the tag), it appears likely that the native libraries for ARM64 have been packaged into the wrong bundle split, or that the ARM64 platform is not being delivered with those libraries at all. I would suggest not re-linking much, but closely inspecting a) what has actually been packaged and b) what is being delivered to ARM64 devices. Which CPUs the failing models have might also be interesting, just to see whether there is any pattern.
Getting your hands on one of these problematic models, either as hardware or as a cloud-based emulator (preferably one that runs on real hardware), might be the easiest way to at least reproduce the problem while testing. Look up the models and then go to eBay and search for "2nd hand" or "refurbished"... Your tests might have failed to reproduce the problem because you are not installing the bundle from the Play Store.

To reproduce this locally, you can sideload an APK onto your phone [an x86 APK onto an ARM device, or vice versa, or any other cross-architecture combination].
Users often use tools like ShareIt to transfer apps between phones.
When they do, the architectures of the two phones may differ.
This is the major cause of these strange UnsatisfiedLinkError exceptions.
There is a way you can mitigate this, though.
Play has an API to verify whether an installation happened through the Play Store.
This way you can restrict installs through other channels and hence reduce UnsatisfiedLinkError exceptions.
https://developer.android.com/guide/app-bundle/sideload-check
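For completeness, a lightweight variant of such a check (this is just the plain installer-package lookup, not the full Play Core sideload check the linked page describes) could look like:

import android.content.Context;

public final class InstallSourceCheck {
    private InstallSourceCheck() {}

    // Returns true if the app appears to have been installed by the Play Store.
    // getInstallerPackageName() is deprecated in API 30 (use getInstallSourceInfo()
    // there) and may return null for sideloaded installs.
    @SuppressWarnings("deprecation")
    public static boolean installedFromPlayStore(Context context) {
        String installer = context.getPackageManager()
                .getInstallerPackageName(context.getPackageName());
        return "com.android.vending".equals(installer);
    }
}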

This issue might be related to https://issuetracker.google.com/issues/127691101
It happens on some LG devices, or on old Samsung devices, where the user has moved the app to the SD card.
One way of fixing the issue is to use the ReLinker library to load your native libraries instead of directly calling the System.load method. It worked for my app's use case.
https://github.com/KeepSafe/ReLinker
Another way is to block moving the app to the SD card.
You can also set android.bundle.enableUncompressedNativeLibs=false in your gradle.properties file, but this will increase both the download size on the Play Store and the on-disk size of the app.

Related

Why Unity3d failed importing jar plugin?

I have a small Unity3d project into which I want to integrate a JAR. My (simplified) Java class in an Android Studio library project looks like the code below:
package com.playsqreen.library.api;
... imports ...
public class PlayscreenAPI {
private static PlayscreenAPI _api;
private Activity activity;
private PlayscreenAPI(Activity activity) {
this.activity = activity;
}
// static method to create singleton
public static PlayscreenAPI build(Activity activity, String key) {
if (_api == null) {
// post processing something
// before returning instance of this class
_api = new PlayscreenAPI(activity);
}
return _api;
}
public String doEchoThis(String msg) {
return "ECHO: " + msg;
}
}
So from Android Studio, I generate my JAR and dump it into ../MyProject/Assets/Plugins/Android and from Unity IDE I can see something like below:
Then I create a C# script like below to load my java class:
void Start () {
_builder = new StringBuilder();
try
{
_builder.Append(">>> Step 1\n");
AndroidJavaClass activityClass = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
if (activityClass != null)
{
_builder.Append(">>> Step 2\n");
AndroidJavaObject activity = activityClass.GetStatic<AndroidJavaObject>("currentActivity");
if (activity != null)
{
_builder.Append(">>> Step 3\n");
AndroidJavaClass apiClass = new AndroidJavaClass("com.playsqreen.library.api.PlayscreenAPI");
if (apiClass != null)
{
_builder.Append(">>> Step 4\n");
object[] args = { activity, secretKey };
api = apiClass.CallStatic<AndroidJavaObject>("build", args);
}
}
}
}
catch (System.Exception e)
{
_builder.Append(e.StackTrace.ToString());
}
}
I print my StringBuilder in a Text UI object in order to capture at which Step my code breaks, and it turns out it breaks after Step 3. My Text UI object prints:
java.lang.ClassNotFoundException:com.playsqreen.library.api.PlayscreenAPI and etc...
A thread I found here suggests using a Java Decompiler (from this site) to check whether the Java class is really included in the JAR. So I did, and the Java Decompiler shows that my class does exist (see below).
So I am really stuck now. How can I load my Java class from Unity? Please help.
Thanks.
After following lysergic-acid's advice below, I included the rest of the JARs required by my custom JAR (see the pic below), and everything is fine :)
You did not post the full error message you are getting at runtime. Also, you did not mention which of your debug prints get printed, so I'll try to come up with a few different issues that you can check. Hopefully one of these can assist in fixing the issue:
JAR name contains '.' (period character): I'm not sure how Unity interprets this (I've never used such a naming convention myself). Select the .JAR file in Unity and make sure that Unity marks it up as an Android plugin (it should have "Android" selected in the plugin importer). I would also rename it to a name without any periods, just to be on the safe side.
Wrong invocation of the Java method: in your example, the static method build in Java receives a single argument (Activity), but when you're calling it from Unity, you're passing an array of 2 arguments.
Missing dependencies: when your native Java code relies on other classes (e.g. from other libraries) and you do not include those libraries with your .JAR file, your class will not be loaded at runtime and will fail with cryptic errors such as NoClassDefFoundError. Make sure to include all dependencies as well.
Shameless promotion: I offer services to fix Android-related issues in Unity. Check it out here, in case you're interested :)
Install newest JDK version and try again.

Unit test Java class that loads native library

I'm running unit tests in Android Studio. I have a Java class that loads a native library with the following code
static
{
System.loadLibrary("mylibrary");
}
But when I test this class inside my src/test directory I get
java.lang.UnsatisfiedLinkError: no mylibrary in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
How can I make it find the path of the native .so libraries, which are located at src/main/libs, in order to unit test without errors?
Note: inside the src/main/libs directory I have 3 more subdirectories: armeabi, mips and x86. Each one contains the proper .so file. I'm using the non-experimental version for building the NDK libs.
I don't want to use other 3rd-party testing libraries, as all my other "pure" Java classes can be unit tested fine. But if that's not possible, then I'm open to alternatives.
Here is my test code, which throws the error:
@Test
public void testNativeClass() throws Exception
{
MyNativeJavaClass test = new MyNativeJavaClass("lalalal");
List<String> results = test.getResultsFromNativeMethodAndPutThemInArrayList();
assertEquals("There should be only three result", 3, results.size());
}
The only solution I found that works without hacks is to use JUnit through instrumentation testing (the androidTest directory).
My class can now be tested fine, but only with the help of an Android device or emulator.
If the library is required for your test, use an AndroidTest (under src/androidTest/...) rather than a junit test. This will allow you to load and use the native library like you do elsewhere in your code.
If the library is not required for your test, simply wrap the system load in a try/catch. This will allow the JNI class to still work in junit tests (under src/test/...) and it is a safe workaround, given that it is unlikely to mask the error (something else will certainly fail, if the native lib is actually needed). From there, you can use something like mockito to stub out any method calls that still hit the JNI library.
For example in Kotlin:
companion object {
init {
try {
System.loadLibrary("mylibrary")
} catch (e: UnsatisfiedLinkError) {
// log the error or track it in analytics
}
}
}
I am not sure whether this solves your problem, but so far nobody has mentioned the strategy pattern for dealing with classes that preload a library during their creation.
Let's see an example:
We want to implement a Fibonacci solver class. Assuming that we have provided the implementation in native code and managed to generate the native library, we can implement the following:
public interface Fibonacci {
long calculate(int steps);
}
Firstly, we provide our native implementation:
public final class FibonacciNative implements Fibonacci {
static {
System.loadLibrary("myfibonacci");
}
public native long calculate(int steps);
}
Secondly, we provide Java implementation for Fibonacci solver:
public final class FibonacciJava implements Fibonacci {
@Override
public long calculate(int steps) {
if(steps > 1) {
return calculate(steps-2) + calculate(steps-1);
}
return steps;
}
}
Thirdly, we wrap the solvers with parental class choosing its own implementation during its instantiation:
public class FibonacciSolver implements Fibonacci {
private static final Fibonacci STRATEGY;
static {
Fibonacci implementation;
try {
implementation = new FibonacciNative();
} catch(Throwable e) {
implementation = new FibonacciJava();
}
STRATEGY = implementation;
}
@Override
public long calculate(int steps) {
return STRATEGY.calculate(steps);
}
}
Thus, the strategy sidesteps the problem of finding the path to the library. It does not, however, help if the native library really must be exercised during the test, nor if the native library is a third-party library.
Basically, this gets around the native-library load problem by substituting Java code for the native code.
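As a usage sketch, a plain JUnit 4 test against the wrapper then simply exercises whichever implementation the static initializer picked; on the host JVM that will be the Java fallback:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class FibonacciSolverTest {
    @Test
    public void calculatesTenthFibonacciNumber() {
        // On the local JVM the native load fails, so FibonacciJava is used transparently.
        Fibonacci solver = new FibonacciSolver();
        assertEquals(55L, solver.calculate(10));
    }
}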
Hope this helps somehow:)
There is a way to configure the library path of the Gradle-run VM for local unit tests, and I'm going to describe it below, but spoiler: in my experience, @ThanosFisherman is right: local unit tests for stuff that uses the Android NDK seem to be a fool's errand right now.
So, for anyone else looking for a way to load shared (i.e. .so) libraries into unit tests with gradle, here's the somewhat lengthy abstract:
The goal is to set the shared library lookup path for the JVM running the unit tests.
Although many people suggest putting the lib path into java.library.path, I found that it doesn't work, at least not on my Linux machine (also, same results in this CodeRanch thread).
What does work, though, is setting the LD_LIBRARY_PATH OS environment variable (or PATH, which is the closest equivalent on Windows).
Using Gradle:
// module-level build.gradle
apply plugin: 'com.android.library' // or application
android {
...
testOptions {
unitTests {
all {
// This is where we have access to the properties of gradle's Test class,
// look it up if you want to customize more test parameters
// next we take our cmake output dir for whatever architecture
// you can also put some 3rd party libs here, or override
// the implicitly linked stuff (libc, libm and others)
def libpath = '' + projectDir + '/build/intermediates/cmake/debug/obj/x86_64/' +
':/home/developer/my-project/some-sdk/lib'
environment 'LD_LIBRARY_PATH', libpath
}
}
}
}
With that, you can run, e.g. ./gradlew :mymodule:testDebugUnitTest and the native libs will be looked for in the paths that you specified.
Using Android Studio JUnit plugin
For the Android Studio's JUnit plugin, you can specify the VM options and the environment variables in the test configuration's settings, so just run a JUnit test (right-clicking on a test method or whatever) and then edit the Run Configuration:
Although it sounds like "mission accomplished", I found that using libc.so, libm.so and others from my OS's /usr/lib gives me version errors (probably because my own library is compiled by cmake with the Android NDK toolchain against its own platform libs). And using the platform libs from the NDK packages brought down the JVM with a SIGSEGV (due to incompatibility of the NDK platform libs with the host OS environment).
Update As @AlexCohn incisively pointed out in the comments, one has to build against the host environment libs for this to work; even though your machine most likely is x86_64, the x86_64 binaries built against the NDK environment will not do.
There may be something I overlooked, obviously, and I'll appreciate any feedback, but for now I'm dropping the whole idea in favor of instrumented tests.
Just make sure, the directory containing the library is contained in the java.library.path system property.
From the test you could set it before you load the library:
System.setProperty("java.library.path", "... path to the library .../libs/x86");
You can specify the path hard coded, but this will make the project less portable to other environments. So I suggest you build it up programmatically.
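For example, a sketch of building it up programmatically (with the caveat other answers here point out: some JVMs read java.library.path only once at startup, so verify that this actually takes effect in your setup):

// In a @BeforeClass hook, before the first loadLibrary() call.
String libDir = new java.io.File("src/main/libs/x86").getAbsolutePath();
System.setProperty("java.library.path",
        libDir + java.io.File.pathSeparator + System.getProperty("java.library.path"));
System.loadLibrary("mylibrary");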
The .so files are to be placed under
src/main/jniLibs
Not under src/main/libs
(Tested with Android Studio 1.2.2)
For reference check the page - http://ph0b.com/android-studio-gradle-and-ndk-integration/, though some portions might be outdated.
This is very, very tricky. Setting java.library.path does not work, but trying to understand someone else’s Mac OSX approach I eventually found a working solution.
Legal release: all code examples directly copied into this post are available under CC0, but it would be appreciated to credit my employer ⮡ tarent, the LLCTO project at Deutsche Telekom, and the author mirabilos.
CAVEATS first:
with this, you’re testing a version of the native code compiled against your system libraries (usually glibc on GNU/Linux; on BSD, Mac OSX and Windows it’s even trickier), so some instrumented tests should be added anyway; use the unit tests only for faster testing of things that actually can be tested on the host OS
I’ve only tested this with a GNU/Linux host (and am, in fact, excluding these native tests on all other host OSes, see below)
it should work under unixoid OSes with GNU/BSD-style shared libraries as-is
with small adaptations from the “someone else’s” article linked above, it might well work on Mac OSX
Windows… no, just no. Use WSL, which is basically Linux anyway and makes things much easier, and so much closer to Android which is also basically Linux just not GNU
IDE integration needs manual steps at each developer’s machine (but these are easily documented, see (much) below)
Prerequisites
You’ll need to make sure that all build dependencies of your native code are also installed on the host system. This includes cmake (because we sadly cannot reuse the NDK cmake) and a host C compiler. Note that these introduce further differences in the build: you’re testing something that has been built with the host C compiler (often GCC, not clang as on Android) against the host C library and other host libraries. Do consider this when writing your tests. I had to move one of the tests to instrumented because it was impossible to test under glibc.
For filesystem layout, we assume the following:
~/MYPRJ/build.gradle is the top-level build file (generated by IntelliJ / Android Studio)
~/MYPRJ/app/build.gradle is where the Android code in question is built (generated by IntelliJ / Android Studio)
~/MYPRJ/app/src/main/native/CMakeLists.txt is where the native code is situated
This means build.gradle (for the app) has something like this already, by the point where you begin wondering about whether your project can be unittested:
externalNativeBuild {
cmake {
path "src/main/native/CMakeLists.txt"
return void // WTF‽
}
}
Make sure your code builds on the host
Doing this ought to be easy at first glance:
$ rm -rf /tmp/build
$ mkdir /tmp/build
$ cd /tmp/build
$ cmake ~/MYPRJ/app/src/main/native/
$ make
(Make sure you give cmake the path to the directory the main CMakeLists.txt file is in, but not to that file itself!)
This will fail for everything nōntrivial, of course. Most people would use Android logging. (It will also fail because it cannot find <jni.h>, and because GNU libc requires an extra _GNU_SOURCE definition to access some prototypes, etc…)
So I wrote a header to include instead of <android/log.h> which abstracts the logging away…
#ifndef MYPRJ_ALOG_H
#define MYPRJ_ALOG_H
#ifndef MYPRJ_ALOG_TAG
#define MYPRJ_ALOG_TAG "MYPRJ-JNI"
#endif
#if defined(MYPRJ_ALOG_TYPE) && (MYPRJ_ALOG_TYPE == 1)
#include <android/log.h>
#define ecnlog_err(msg, ...) __android_log_print(ANDROID_LOG_ERROR, \
MYPRJ_ALOG_TAG, msg, ##__VA_ARGS__)
#define ecnlog_warn(msg, ...) __android_log_print(ANDROID_LOG_WARN, \
MYPRJ_ALOG_TAG, msg, ##__VA_ARGS__)
#define ecnlog_info(msg, ...) __android_log_print(ANDROID_LOG_INFO, \
MYPRJ_ALOG_TAG, msg, ##__VA_ARGS__)
#elif defined(MYPRJ_ALOG_TYPE) && (MYPRJ_ALOG_TYPE == 2)
#include <stdio.h>
#define ecnlog_err(msg, ...) fprintf(stderr, \
"E: [" MYPRJ_ALOG_TAG "] " msg "\n", ##__VA_ARGS__)
#define ecnlog_warn(msg, ...) fprintf(stderr, \
"W: [" MYPRJ_ALOG_TAG "] " msg "\n", ##__VA_ARGS__)
#define ecnlog_info(msg, ...) fprintf(stderr, \
"I: [" MYPRJ_ALOG_TAG "] " msg "\n", ##__VA_ARGS__)
#else
# error What logging system to use?
#endif
#endif
… and updated my CMakeLists.txt to indicate whether building for NDK (must be default) or native:
cmake_minimum_required(VERSION 3.10)
project(myprj-native)
option(UNDER_NDK "Build under the Android NDK" ON)
add_compile_options(-fvisibility=hidden)
add_compile_options(-Wall -Wextra -Wformat)
add_library(myprj-native SHARED
alog.h
myprj-jni.c
)
if (UNDER_NDK)
add_definitions(-DMYPRJ_ALOG_TYPE=1)
find_library(log-lib log)
target_link_libraries(myprj-native ${log-lib})
else (UNDER_NDK)
add_definitions(-DMYPRJ_ALOG_TYPE=2)
include(FindJNI)
include_directories(${JNI_INCLUDE_DIRS})
add_definitions(-D_GNU_SOURCE)
endif (UNDER_NDK)
Note this also already includes the fix for <jni.h> (FindJNI) and the extra definitions.
Now let’s try to build it again:
$ rm -rf /tmp/build
$ mkdir /tmp/build
$ cd /tmp/build
$ cmake -DUNDER_NDK=OFF ~/MYPRJ/app/src/main/native/
$ make
In my case, this was sufficient. If you’re still not there, fix this first before proceeding. If you cannot fix this, give up on buildhost-local unit tests for your JNI code and move the respective tests to instrumented.
Let Gradle build the host-native code
Add the following to the app build.gradle:
def dirForNativeNoNDK = project.layout.buildDirectory.get().dir("native-noNDK")
def srcForNativeNoNDK = project.layout.projectDirectory.dir("src/main/native").asFile
task createNativeNoNDK() {
def dstdir = dirForNativeNoNDK.asFile
if (!dstdir.exists()) dstdir.mkdirs()
}
task buildCMakeNativeNoNDK(type: Exec) {
dependsOn createNativeNoNDK
workingDir dirForNativeNoNDK
commandLine "/usr/bin/env", "cmake", "-DUNDER_NDK=OFF", srcForNativeNoNDK.absolutePath
}
task buildGMakeNativeNoNDK(type: Exec) {
dependsOn buildCMakeNativeNoNDK
workingDir dirForNativeNoNDK
commandLine "/usr/bin/env", "make"
}
project.afterEvaluate {
if (org.gradle.internal.os.OperatingSystem.current().isLinux()) {
testDebugUnitTest {
dependsOn buildGMakeNativeNoNDK
systemProperty "java.library.path", dirForNativeNoNDK.asFile.absolutePath + ":" + System.getProperty("java.library.path")
}
testReleaseUnitTest {
dependsOn buildGMakeNativeNoNDK
systemProperty "java.library.path", dirForNativeNoNDK.asFile.absolutePath + ":" + System.getProperty("java.library.path")
}
}
}
This defines a few new tasks to compile the buildhost-native version of the shared library, and hooks this up if the host OS is “Linux”. (This syntax will also work for other unixoid OSes — BSD, Mac OSX — but not for Windows. But we can probably test this under Linux only anyway. WSL counts as Linux.) It also sets up the JVM library path so that ../gradlew test will let the JVM pick up the library from its path.
Loose ends
There’s a few loose ends you might have noticed here:
In the last paragraph of the previous section, I mentioned that ../gradlew test will pick up the library. Testing from the IDE will not work yet; this involves manual setup.
I mentioned that the relevant unit tests must be skipped if the buildhost OS is not “Linux”; we have yet to do that. Unfortunately, JUnit 4 lacks such facilities, but switching the unit tests to JUnit 5 “Jupiter” will allow us to do that. (We’re not switching the instrumented tests, though; that’d be more invasive.)
You’ll probably not yet have noticed, but the logging output from the native code will not show up thanks to Gradle’s default settings which we’ll need to change.
So, let’s do that. First, edit your app build.gradle file again. There will be a dependencies { block. We’ll need to fill that with suitable dependencies for both JUnit flavours (JUnit 5 for the unit tests, JUnit 4 for the instrumented tests):
dependencies {
testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
//noinspection GradleDependency
androidTestImplementation 'com.android.support.test:runner:1.0.1'
//noinspection GradleDependency
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.1'
//noinspection GradleDependency
androidTestImplementation 'junit:junit:4.12'
}
You’ll also have a line apply plugin: 'com.android.application' (or perhaps apply plugin: 'com.android.library') at the top. Directly below that line, insert this one:
apply plugin: 'de.mannodermaus.android-junit5'
Also, make sure that, under android { defaultConfig { the testInstrumentationRunner is still "android.support.test.runner.AndroidJUnitRunner" (the default as generated by IntelliJ / Android Studio).
Next, edit the top-level ~/MYPRJ/build.gradle file. You’ll already have a buildscript { dependencies { and will have to add a line to that section to make the JUnit5 plugin available in the first place:
//noinspection GradleDependency
classpath 'de.mannodermaus.gradle.plugins:android-junit5:1.5.2.0'
Then, add a new section under allprojects {:
tasks.withType(Test) {
testLogging {
outputs.upToDateWhen { false }
showStandardStreams = true
exceptionFormat = 'full'
}
systemProperty 'java.util.logging.config.file', file('src/test/resources/logging.properties').getAbsolutePath()
}
This ensures that…
tests are never skipped because Gradle thinks them up-to-date
logging output and exceptions are shown in full
if you have a ~/MYPRJ/app/src/test/resources/logging.properties it will set up java.util.logging with this (recommended)
Now see to your test, something like ~/MYPRJ/app/src/test/java/org/example/packagename/JNITest.java. First, you should add a “test” that can always run (I use one that merely tests whether my JNI class can be loaded), and ensure it displays some information first:
// or Lombok @Log
private static final java.util.logging.Logger LOGGER = java.util.logging.Logger.getLogger(JNITest.class.getName());
@Test
public void testClassBoots() {
LOGGER.info("running on " + System.getProperty("os.name"));
if (!LINUX.isCurrentOs()) {
LOGGER.warning("skipping JNI tests");
}
// for copy/paste into IntelliJ run options
LOGGER.info("VM options: -Djava.library.path=" +
System.getProperty("java.library.path"));
LOGGER.info("testing Java™ part of JNI class…");
[…]
}
Then, annotate the actual JNI tests that need to be skipped on other OSes:
@Test
@EnabledOnOs(LINUX)
public void testJNIBoots() {
LOGGER.info("testing JNI part of JNI class…");
final long tid;
try {
tid = JNI.n_gettid();
} catch (Throwable t) {
LOGGER.log(Level.SEVERE, "it failed", t);
Assertions.fail("JNI does not work");
return;
}
LOGGER.info("it also works: " + tid);
assertNotEquals(0, tid, "but is 0");
}
For comparison, instrumented tests (unittests that run on the Android device or emulator) — e.g. ~/MYPRJ/app/src/androidTest/java/org/example/packagename/JNIInstrumentedTest.java — look like this:
@RunWith(AndroidJUnit4.class)
public class JNIInstrumentedTest {
@Test
public void testJNIBoots() {
Log.i("ECN-Bits-JNITest", "testing JNI part of JNI class…");
final long tid;
try {
tid = JNI.n_gettid();
} catch (Throwable t) {
Log.e("ECN-Bits-JNITest", "it failed", t);
fail("JNI does not work");
return;
}
Log.i("ECN-Bits-JNITest", "it also works: " + tid);
assertNotEquals("but is 0", 0, tid);
}
}
See Testable.java if you need an assertThrows for instrumented tests (JUnit 5 already comes with one), by the way. (Note that this does not fall under the CC0 grant above but comes under a permissive licence.)
Now, you can run both kinds of tests, unit tests and (if an Android emulator is started or a device connected) instrumented tests:
../gradlew test connectedAndroidTest
Do so. Note the output of the VM options: logger call from the buildhost-native unit tests; in fact, copy it to the clipboard. You’ll now need it to set up testing in the IDE.
In the Project view (left-side tree), right-click either on your JNITest class or the entire src/test/java/ directory. Click on Run 'JNITest' (or Run 'Tests in 'java''), it will fail with an UnsatisfiedLinkError as in the original post.
Now click on the arrow in the test drop-down below the menu bar, then select Save JNITest configuration, then do it again and select Edit configurations… and select your configuration. Append the entire pasted thing to VM options: so the field will now look like -ea -Djava.library.path=/home/USER/MYPRJ/app/build/native-noNDK:/usr/java/packages/lib:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib (of course, the actual value will differ) and click OK. Re-run the test, and it will succeed.
Unfortunately, you’ll have to do this for every native test class once and for the entire directory, so all possible ways of invocation will be covered. You’ll also have to do this manually, by clicking around, for every IDE instance, and these values depend on the path the code was checked out into. I’ve not found a way to automate these (if you know of one, do tell).
Exception backtraces
If you’re throwing custom exceptions from your code, you’ll most likely wish to include file/lineno/function information. Use a constructor like MyprjNativeException(final String file, final int line, final String func, final String msg, … /* custom data */, final Throwable cause) and, after calling super(msg, cause) (possibly with a changed message), do this:
StackTraceElement[] currentStack = getStackTrace();
StackTraceElement[] newStack = new StackTraceElement[currentStack.length + 1];
System.arraycopy(currentStack, 0, newStack, 1, currentStack.length);
newStack[0] = new StackTraceElement("<native>", func, file, line);
setStackTrace(newStack);
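Put together, such a constructor might look like the sketch below (custom data omitted; getStackTrace() is already populated at this point because the Throwable constructor fills in the trace):

public class MyprjNativeException extends Exception {
    public MyprjNativeException(final String file, final int line,
            final String func, final String msg, final Throwable cause) {
        super(msg, cause);
        // Prepend a synthetic frame pointing at the native source location.
        StackTraceElement[] currentStack = getStackTrace();
        StackTraceElement[] newStack = new StackTraceElement[currentStack.length + 1];
        System.arraycopy(currentStack, 0, newStack, 1, currentStack.length);
        newStack[0] = new StackTraceElement("<native>", func, file, line);
        setStackTrace(newStack);
    }
}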
Then, to throw an exception like this from native code:
#define throw(env,...) vthrow(__FILE__, __func__, env, __LINE__, __VA_ARGS__)
static void vthrow(const char *loc_file, const char *loc_func, JNIEnv *env,
int loc_line, /* custom args */ const char *msg, ...);
Use as follows:
if (func() != expected)
throw(env, /* custom args */ "foo");
Implementation (assuming you cache class and constructor method references) looks as follows (adjust for custom args):
static void vthrow(const char *loc_file, const char *loc_func, JNIEnv *env,
int loc_line, const char *fmt, ...)
{
jthrowable e;
va_list ap;
jstring jfile = NULL;
jint jline = loc_line;
jstring jfunc = NULL;
jstring jmsg = NULL;
jthrowable cause = NULL;
const char *msg;
char *msgbuf;
if ((*env)->PushLocalFrame(env, /* adjust for amount you need */ 5)) {
cause = (*env)->ExceptionOccurred(env);
(*env)->ExceptionClear(env);
(*env)->Throw(env, (*env)->NewObject(env, classreference, constructorreference,
jfile, jline, jfunc, jmsg, /* custom */…, cause));
return;
}
if ((cause = (*env)->ExceptionOccurred(env))) {
/* will be treated as cause */
(*env)->ExceptionClear(env);
}
va_start(ap, fmt);
if (vasprintf(&msgbuf, fmt, ap) == -1) {
msgbuf = NULL;
msg = fmt;
} else
msg = msgbuf;
va_end(ap);
jmsg = (*env)->NewStringUTF(env, msg);
free(msgbuf);
if (!jmsg)
goto onStringError;
if (!(jfunc = (*env)->NewStringUTF(env, loc_func)))
goto onStringError;
/* allocate NewStringUTF for any custom things you need */
/* exactly like the one for loc_func above */
/* increase PushLocalFrame argument for each */
jfile = (*env)->NewStringUTF(env, loc_file);
if (!jfile) {
onStringError:
(*env)->ExceptionClear(env);
}
e = (*env)->PopLocalFrame(env, (*env)->NewObject(env, classreference, constructorreference,
jfile, jline, jfunc, jmsg, /* custom */…, cause));
if (e)
(*env)->Throw(env, e);
}
Now using __FILE__ will put the full absolute path into the messages and backtraces. This is not very nice. There’s a compiler option to fix that, but NDK r21’s clang is much too old, so we need a workaround.
CMakeLists.txt:
if (NOT TOPLEV)
message(FATAL_ERROR "setting the top-level directory is mandatory")
endif (NOT TOPLEV)
[…]
if (UNDER_NDK)
[…]
execute_process(COMMAND ${CMAKE_CXX_COMPILER} --version OUTPUT_VARIABLE cxx_version_full)
string(REGEX REPLACE "^Android [^\n]* clang version ([0-9]+)\\.[0-9].*$" "\\1" cxx_version_major ${cxx_version_full})
if (${cxx_version_major} VERSION_GREATER_EQUAL 10)
add_definitions("-ffile-prefix-map=${TOPLEV}=«MyPrj»")
else (${cxx_version_major} VERSION_GREATER_EQUAL 10)
add_definitions(-DOLD_CLANG_SRCDIR_HACK="${TOPLEV}/")
endif (${cxx_version_major} VERSION_GREATER_EQUAL 10)
else (UNDER_NDK)
[…]
add_definitions("-ffile-prefix-map=${TOPLEV}=«MyPrj»")
endif (UNDER_NDK)
app build.gradle:
(straight after the apply plugin lines)
def dirToplev = project.layout.projectDirectory.asFile.absolutePath
(inside android { defaultConfig { add a new block)
externalNativeBuild {
cmake {
//noinspection GroovyAssignabilityCheck because Gradle and the IDE have different world views…
arguments "-DTOPLEV=" + dirToplev
}
return void // WTF‽
}
(later, where you call cmake)
commandLine "/usr/bin/env", "cmake", "-DTOPLEV=" + dirToplev, "-DUNDER_NDK=OFF", srcForNativeNoNDK.absolutePath
Then, replace the line jfile = (*env)->NewStringUTF(env, loc_file); with the following snippet:
#ifdef OLD_CLANG_SRCDIR_HACK
if (!strncmp(loc_file, OLD_CLANG_SRCDIR_HACK, sizeof(OLD_CLANG_SRCDIR_HACK) - 1) &&
asprintf(&msgbuf, "«MyPrj»/%s", loc_file + sizeof(OLD_CLANG_SRCDIR_HACK) - 1) != -1) {
msg = msgbuf;
} else {
msg = loc_file;
msgbuf = NULL;
}
#else
#define msg loc_file
#endif
jfile = (*env)->NewStringUTF(env, msg);
#ifdef OLD_CLANG_SRCDIR_HACK
free(msgbuf);
#else
#undef msg
#endif
Tying it all together
This all is implemented in the ECN-Bits project. I’m posting a permalink because it’s currently on a nōn-default branch but expected to be merged (once the actual functionality is no longer WIP), so be sure to check master at some point in time as well (although this permalink is probably a better example as it has the testing down and there’s not as much “actual” code to get in the way). Note that these links do not fall under the CC0 grant from above; the files all have a permissive licence though (the files which don’t have it explicit (gradle/cmake files) have the same as the unittest class permalink), but enough of it was reposted in this article so that should not be a problem for you; these only serve to show an actually-compiling-and-testing example.
In this project, it’s not in app/ but as a separate library module.
top-level build.gradle
library build.gradle
instrumented tests
unittest class
unittest logging configuration
native CMakeLists.txt
native alog.h
native code including caching the references
Java™ code to which the JNI native code attaches including the Exception class
Try running your test code with the java -XshowSettings:properties option and make sure that the directory containing your native libraries matches one of the java.library.path entries shown in the output.
Just to clarify: the System.loadLibrary() call was failing because the JUnit unit test uses the host/system environment, which was Windows in my case. Hence the loadLibrary() call was searching for the .so files in the standard shared-library folders. But this isn't what I wanted to happen. Instead, I wanted the libxxx.so files to be loaded from the .aar file (which contains the Android resources, JARs and JNI libs).
This can only happen in two ways:
Baking the libxxx.so into the APK manually: we ourselves place the libxxx.so files into the jniLibs dir (which the system searches when loading a shared lib) under src/main of the APK sources by copying the required files.
Adding the .aar as an external dependency to the module-level Gradle build script (implementation/androidTestImplementation); by doing this we make the .aar available for link editing during the APK build process.
But in both cases the app runs in the Android environment/VM, and hence the System.loadLibrary() call resolves to the correct libxxx.so, which is part of the APK. Hence NO ISSUES.
However, in the case of unit tests, which do not require instrumentation (i.e. an Android device) and run on the JVM of the host system where the tests are executed (e.g. Windows/Linux/Mac), the call to System.loadLibrary() only searches the host system's standard library paths for shared libs and does not refer to the Android environment at all. Hence the ISSUES.
Fixes:
Unpack the libxxx.so into some temp dir and add this dir to the system's library search path (e.g. java.library.path, the %PATH% variable on Windows, etc.), or load it directly by absolute path; a sketch of this follows below. Then run the unit tests that don't need the Android environment but do exercise native code via JNI. This should work!
(Efficient method) Simply move these kinds of unit tests to androidTest (i.e. instrumentation tests), so that the loading and packaging described above stay intact and System.loadLibrary() can successfully find the libxxx.so when running inside the instrument (Android device/OS). This way you also ensure that the appropriate lib type (x86, x86-64, armeabi-v7a, arm64-v8a (AArch64)) is used on the target device and that the tests run on the specific target devices.
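A sketch of the first fix (all names and paths below are placeholders): extract the host-ABI build of the library from the test resources into a temporary file and load it by absolute path, which sidesteps java.library.path entirely:

import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public final class TestNativeLoader {
    private TestNativeLoader() {}

    // e.g. loadFromResources("/native/host/libxxx.so") from a @BeforeClass hook
    public static void loadFromResources(String resourcePath) throws Exception {
        try (InputStream in = TestNativeLoader.class.getResourceAsStream(resourcePath)) {
            if (in == null) {
                throw new IllegalStateException("Native lib not on test classpath: " + resourcePath);
            }
            File tmp = File.createTempFile("libxxx", ".so");
            tmp.deleteOnExit();
            Files.copy(in, tmp.toPath(), StandardCopyOption.REPLACE_EXISTING);
            System.load(tmp.getAbsolutePath()); // absolute path, unlike loadLibrary()
        }
    }
}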

No implementation found for native

I have a test project that works well. I tried to use the native library from it (and the same methods) in my other project, and I get this error:
W/dalvikvm(22240): No implementation found for native Lcom/example/myapp/Serial;.open:(Ljava/lang/String;IIZ)Ljava/io/FileDescriptor;
this is the JNI code in Serial class:
// JNI
private native static FileDescriptor open(String path, int baudrate,
int flags, boolean flowCon);
public native void close();
static {
System.loadLibrary("test_lib");
}
and I have "libtest_lib.so" in three folders in my libs folder : armeabi, armeabi-v7a, x86.
I debugged the app and it seems the 'System.loadLibrary' is ineed called before the call of native method 'open'. So I can't see other options for the error..
You probably copied and pasted from another project. Redefine the prototype in your C file as Java_PackageName_JavaParentClass_functionName(JNIEnv* env, jobject thiz) so that it matches the package and class name of the new project.

Porting a CPP application to Android

I need to port a C++ project to Android but I got stuck because I am not sure about the following things:
Do I need to use some kind of Java wrapper for my C++ project at all, i.e. is it necessary to use the Android SDK to integrate my application with Android? If there is another way, which one would that be?
I have seen some people claiming they have been able to manipulate their CMake file and a custom android-cmake toolchain to build their ".so" or eventually an ".apk" from their project. Would it be possible, without a Java wrapper, to adapt the CMake files of the C++ project to build it? (source: Build Android NDK project with Cmake)
From my experience, I would go with using the Android Java entry point, otherwise you will most likely bump into problems (you can make fully native Android apps, but I strongly advise against it).
One of the reasons is that you will want SDK function calls from inside your C++, and using reflection on the Java environment from C++ isn't trivial.
The steps would be the following :
Create a very simple C layer that will serve as a bridge to your C++ code. Here are some sample C functions that I've commonly used:
OnApplicationStart
OnApplicationPaused
OnApplicationResumed
OnApplicationUpdate
OnTouchReceived
Export these functions and load them in your Java code (lookup JNI on how to do this)
Handle all Android-specific actions in Java, and all application specific actions in cpp
So the answer to your 1st question is that a java wrapper isn't mandatory, but it's HIGHLY recommended.
To answer your 2nd question :
Yes, you can use CMake, Ant, Cygwin, and a lot of command-line tools that will end up creating your APK (it's up to you, whatever you feel comfortable with). You will still use the Android compiler since you are targeting ARM.
The difference is that you need to change in your manifest.xml the entry point of the application from a normal activity to a native activity ( http://developer.android.com/reference/android/app/NativeActivity.html ) .
As you can see, the difference isn't the build system, it's the way you define your entry point.
As friendly advice, try using the minimal Java wrapper approach. You might get to better results sooner, and it won't take you more than a week of research on the web to learn how to link Java code to C++.
EDIT :
As of demand, I will shortly explain how I would approach the process of porting a CPP application to Android :
Rethink your application to work as a shared library, using a C entry point:
a. Create a simple C application that will load yourCPPApp.dll (or .so if you are on Linux)
b. In your .dll, create the minimum necessary extern "C" functions to be exported in order for you to pass the necessary information to your .dll
For simplicity, we'll assume we have 3 methods :
void StartApplication();
bool OnApplicationUpdate();
void OnScreenTouched(int x, int y);
c. Implement the simple C project that will make the calls to these methods externally (so the .exe will call the methods from the .dll!)
Sample code :
#include "myCPPapp.h"
int main(int arg, char** argv)
{
StartApplication();
srand(time(NULL));
while (OnApplicationUpdate())
{
// we assume we have a 480x640 resolution
OnScreenTouched(rand()%480,rand()%640);
}
return 0;
}
Now that we have things working in full native with a .exe and a .dll , time to make it work with a .apk and a .so
a. Rename the exposed methods from myCppApp into java compatible prototypes
extern "C" {
JNIEXPORT void JNICALL Java_com_sample_nativebridge_OnApplicationStart(JNIEnv *env, jobject thiz);
JNIEXPORT jboolean JNICALL Java_com_sample_nativebridge_OnApplicationUpdate(JNIEnv *env, jobject thiz);
JNIEXPORT void JNICALL Java_com_sample_nativebridge_OnScreenTouched(JNIEnv *env, jobject thiz, jint x, jint y);
}
b. Create the Java class nativebridge (case sensitive) in the package com.sample (you need to respect these names in order for the native linkage to be correct)
class nativebridge {
public static native void OnApplicationStart();
public static native boolean OnApplicationUpdate();
public static native void OnScreenTouched(int x, int y);
}
c. add the load library statement in the native bridge in order to have your library loaded at runtime
class nativebridge {
....
static {
System.loadLibrary("myCppApp");
// notice that the lib prefix and .so suffix aren't part of the name
}
}
d. Compile your C++ code with the armeabi toolchain and create libmyCppApp.so (the ndk-build system, or whatever you'd like... I would go with the NDK, it's quite easy: just go into the folder with Android.mk and call $ANDROID_NDK_BUILD_PATH/build).
At this point you will need to create a .mk file that will compile your myCppApp code, but this is out of scope; you will need to do this research on your own (it's quite trivial once you get the hang of it).
e. Use the native methods from the bridge inside your Java app wherever you see fit.
A very good tip would be to go through a hello world sample of ndk :
http://www.ntu.edu.sg/home/ehchua/programming/android/android_ndk.html
Enjoy.

Java: CaptureDeviceManager#getDeviceList() is empty?

I am trying to print out all of the capture devices that are supported using the #getDeviceList() method in the CaptureDeviceManager class and the returned Vector has a size of 0.
Why is that? I have a webcam that works - so there should be at least one. I am running Mac OS X Lion - using JMF 2.1.1e.
Thanks!
CaptureDeviceManager.getDeviceList(Format format) does not detect devices. Instead it reads from the JMF registry which is the jmf.properties file. It searches for the jmf.properties file in the classpath.
If your JMF install has succeeded, then the classpath would have been configured to include all the relevant JMF jars and directories. The JMF install comes with a jmf.properties file included in the 'lib' folder under the JMF installation directory. This means the jmf.properties would be located by JMStudio and you would usually see the JMStudio application executing correctly. (If your JMF install is under 'C:\Program Files', then run as administrator to get around UAC)
When you create your own application to detect the devices, the problem you described above might occur. I have seen a few questions related to the same problem. This is because your application's classpath might be different and might not include the environment classpath. Check out your IDE's properties here. The problem is that CaptureDeviceManager cannot find the jmf.properties file because it is not there.
As you have found out correctly, you can copy the jmf.properties file from the JMF installation folder. It would contain the correct device list since JMF detects it during the install (Check it out just to make sure anyway).
If you want to do device detection yourself, then create an empty jmf.properties file and put it somewhere in your classpath (it might throw a java.io.EOFException initially during execution, but that's properly handled by the JMF classes). Then use the following code for detecting webcams...
import javax.media.*;
import java.util.*;
public class VFWDeviceDetection { // wrapper class name is arbitrary
public static void main(String[] args) {
VFWAuto vfwObj = new VFWAuto();
Vector devices = CaptureDeviceManager.getDeviceList(null);
Enumeration deviceEnum = devices.elements();
System.out.println("Device count : " + devices.size());
while (deviceEnum.hasMoreElements()) {
CaptureDeviceInfo cdi = (CaptureDeviceInfo) deviceEnum.nextElement();
System.out.println("Device : " + cdi.getName());
}
}
}
The code for the VFWAuto class is given below. This is part of the JMStudio source code. You can get a good idea on how the devices are detected and recorded in the registry. Put both classes in the same package when you test. Disregard the main method in the VFWAuto class.
import com.sun.media.protocol.vfw.VFWCapture;
import java.util.*;
import javax.media.*;
public class VFWAuto {
public VFWAuto() {
Vector devices = (Vector) CaptureDeviceManager.getDeviceList(null).clone();
Enumeration deviceEnum = devices.elements();
while (deviceEnum.hasMoreElements()) {
CaptureDeviceInfo cdi = (CaptureDeviceInfo) deviceEnum.nextElement();
String name = cdi.getName();
if (name.startsWith("vfw:"))
CaptureDeviceManager.removeDevice(cdi);
}
int nDevices = 0;
for (int i = 0; i < 10; i++) {
String name = VFWCapture.capGetDriverDescriptionName(i);
if (name != null && name.length() > 1) {
System.err.println("Found device " + name);
System.err.println("Querying device. Please wait...");
com.sun.media.protocol.vfw.VFWSourceStream.autoDetect(i);
nDevices++;
}
}
}
public static void main(String [] args) {
VFWAuto a = new VFWAuto();
System.exit(0);
}
}
Assuming you are on a Windows platform and you have a working web-cam, then this code should detect the device and populate the jmf.properties file. On the next run you can also comment out the VFWAuto section and it's object references and you can see that CaptureDeviceManager reads from the jmf.properties file.
The VFWAuto class is part of jmf.jar. You can also see the DirectSoundAuto and JavaSoundAuto classes for detecting audio devices in the JMStudio sample source code. Try it out the same way as you did for VFWAuto.
My configuration was Windows 7 64 bit + JMF 2.1.1e windows performance pack + a web-cam.
Before calling CaptureDeviceManager.getDeviceList(), the available devices must first be loaded into memory.
You can do this manually by running JMFRegistry after installing JMF,
or do it programmatically with the help of the extension library FMJ (Free Media in Java). Here is the code:
import java.lang.reflect.Field;
import java.util.Vector;
import javax.media.*;
import javax.media.format.RGBFormat;
import net.sf.fmj.media.cdp.GlobalCaptureDevicePlugger;
public class FMJSandbox {
static {
System.setProperty("java.library.path", "D:/fmj-sf/native/win32-x86/");
try {
final Field sysPathsField = ClassLoader.class.getDeclaredField("sys_paths");
sysPathsField.setAccessible(true);
sysPathsField.set(null, null);
} catch (Exception e) {
e.printStackTrace();
}
}
public static void main(String args[]) {
GlobalCaptureDevicePlugger.addCaptureDevices();
Vector deviceInfo = CaptureDeviceManager.getDeviceList(new RGBFormat());
System.out.println(deviceInfo.size());
for (Object obj : deviceInfo ) {
System.out.println(obj);
}
}
}
Here is the output:
USB2.0 Camera : civil:\\?\usb#vid_5986&pid_02d3&mi_00#7&584a19f&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global
RGB, -1-bit, Masks=-1:-1:-1, PixelStride=-1, LineStride=-1
