Unit testing creating an SQLite database using Spock and Robospock - java
spock-core:0.7-groovy-2.0
robospock:0.5.0
Android Studio 0.8.2
Fedora release 20 (Heisenbug)
This is the complete solution. It now compiles and runs the unit test successfully, and the directory structure is the same as in the previous edit. Please feel free to comment on anything that doesn't look right.
Edit Solution =====
build.gradle:
apply plugin: 'java'
apply plugin: 'groovy'

repositories {
    mavenCentral()
    maven {
        // Location of Android SDK for compiling, otherwise you get this error:
        /* Could not find com.android.support:support-v4:19.0.1.
           Required by:
           :testSQLite:unspecified > org.robospock:robospock:0.5.0 > org.robolectric:robolectric:2.3 */
        url "/home/steve/local/android-studio/sdk/extras/android/m2repository/"
    }
}

dependencies {
    // just compile so we can use the sqlite API
    compile 'com.google.android:android:4.1.1.4', {
        // Do not bring in dependencies
        transitive = false
    }
    testCompile 'org.codehaus.groovy:groovy:2.3.+'
    testCompile 'org.spockframework:spock-core:0.7-groovy-2.0'
    testCompile 'org.robospock:robospock:0.5.0'
    testCompile 'org.robospock:robospock-plugin:0.4.0'
}
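A small portability tweak (my own suggestion, not part of the original setup): rather than hard-coding the SDK path, the repository URL can be resolved from the ANDROID_HOME environment variable so the build also works on other machines. A minimal sketch, assuming ANDROID_HOME points at the Android SDK:

repositories {
    maven {
        // falls back to the hard-coded path if ANDROID_HOME is not set
        def sdkDir = System.getenv('ANDROID_HOME') ?: '/home/steve/local/android-studio/sdk'
        url "${sdkDir}/extras/android/m2repository/"
    }
}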
SnapzClientTest.groovy:
package com.example.DataAccess

import com.example.DataAccess.SnapzAndroidDB
import org.robolectric.Robolectric
import pl.polidea.robospock.RoboSpecification

class SnapClientTest extends RoboSpecification {

    /* Create Sqlite database for Android */
    def 'Create a sqlite database for Android'() {
        setup:
        def androidDB = new SnapzAndroidDB(Robolectric.application)

        expect:
        androidDB != null
    }
}
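A possible follow-up check (my own sketch, not part of the question): an extra feature method added to the spec above that asserts onCreate actually created the TABLE_CONFIG table, by querying sqlite_master through the helper. This assumes Robolectric's SQLite shadow handles getWritableDatabase() and rawQuery(), which it did for simple cases in the 2.x releases:

def 'onCreate creates the config table'() {
    setup:
    def androidDB = new SnapzAndroidDB(Robolectric.application)
    def db = androidDB.getWritableDatabase()   // first use triggers onCreate()

    when:
    def cursor = db.rawQuery(
            "select name from sqlite_master where type = 'table' and name = 'TABLE_CONFIG'", null)

    then:
    cursor.moveToFirst()   // a row comes back, so the table exists

    cleanup:
    cursor?.close()
    db?.close()
}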
SnapzAndroidDB.java, no change from the 5th August Edit
Edit 5 August ================
Basically, I am trying to create a JAR file that will be used in an Android app that will have the functionality of SQLite, so I can use this JAR file for many apps.
I have started from scratch and created a smaller application that is easier to debug. This is the directory structure, and there are only three files:
testSQLite/build.gradle
testSQLite/src/main/java/com/example/sqltest/SnapzAndroidDB.java
testSQLite/src/test/groovy/SnapzClientTest.groovy
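These are the default locations the java and groovy Gradle plugins pick up, so no extra source-set configuration is needed for this layout. Purely for illustration (the directory name below is hypothetical, and nothing in this project requires it), if the spec lived somewhere non-standard you would point the test source set at it with srcDirs:

sourceSets {
    test {
        groovy {
            // hypothetical alternative location; the default is src/test/groovy
            srcDirs = ['src/unitTest/groovy']
        }
    }
}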
build.gradle
apply plugin: 'java'
apply plugin: 'groovy'

repositories {
    mavenCentral()
    maven {
        // Location of Android SDK for compiling, otherwise you get this error:
        /* Could not find com.android.support:support-v4:19.0.1.
           Required by:
           :testSQLite:unspecified > org.robospock:robospock:0.5.0 > org.robolectric:robolectric:2.3 */
        url "/home/steve/local/android-studio/sdk/extras/android/m2repository/"
    }
}

dependencies {
    // Just compile so we can use the sqlite API
    compile 'com.google.android:android:4.1.1.4', {
        // Do not bring in dependencies
        transitive = false
    }
    testCompile 'org.codehaus.groovy:groovy:2.3.+'
    testCompile 'org.spockframework:spock-core:0.7-groovy-2.0'
    testCompile 'org.robospock:robospock:0.5.0'
    testCompile 'org.robospock:robospock-plugin:0.4.0'
}
SnapzAndroidDB.java
package com.example.DataAccess;

import java.util.logging.ConsoleHandler;
import java.util.logging.SimpleFormatter;
import java.util.logging.Handler;
import java.util.logging.Logger;
import java.util.logging.Level;

import android.content.Context;
import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.database.sqlite.SQLiteException;
import android.database.Cursor;

public class SnapzAndroidDB extends SQLiteOpenHelper {

    /**
     * Logger for displaying log messages
     */
    private static final Logger log = Logger.getLogger("SnapzAndroidDB");

    private SQLiteDatabase mDb;

    public SnapzAndroidDB(Context context) {
        super(context, "DB_NAME", null, 1);

        /* Create logger */
        ConsoleHandler consoleHandler = new ConsoleHandler();
        log.addHandler(consoleHandler);
        log.setLevel(Level.FINE);
        consoleHandler.setFormatter(new SimpleFormatter());
        consoleHandler.setLevel(Level.ALL);

        log.log(Level.INFO, "SnapzAndroidDB()");
    }

    /* Called only once, the first time the database is created */
    @Override
    public void onCreate(SQLiteDatabase mDb) {
        log.log(Level.INFO, "onCreate(SQLiteDatabase db)");

        String createConfig = String.format("create table %s (%s int primary key, %s text, %s text)",
                "TABLE_CONFIG",
                "ID",
                "NAME",
                "VALUE");

        log.log(Level.INFO, "onCreate with SQL: " + createConfig);

        mDb.execSQL(createConfig);
    }

    @Override
    public void onUpgrade(SQLiteDatabase mDb, int oldVersion, int newVersion) {
        log.log(Level.INFO, "onUpgrade()");
        /* Only if there are schema changes to the database */
    }
}
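The ContentValues and Cursor imports suggest that insert and query helpers will be added later. Purely as an illustration (my own sketch; these are not methods of the class above, and it assumes the RoboSpecification from the solution plus imports for android.content.ContentValues and android.database.Cursor), a feature method that writes a row into TABLE_CONFIG and reads it back through the SQLiteDatabase handle could look like this:

def 'a config row can be written and read back'() {
    setup:
    def androidDB = new SnapzAndroidDB(Robolectric.application)
    def db = androidDB.getWritableDatabase()
    def values = new ContentValues()
    values.put('ID', 1)
    values.put('NAME', 'endpoint')
    values.put('VALUE', 'https://example.org/api')

    when:
    db.insert('TABLE_CONFIG', null, values)
    Cursor c = db.query('TABLE_CONFIG', ['NAME', 'VALUE'] as String[],
            'ID = ?', ['1'] as String[], null, null, null)

    then:
    c.moveToFirst()
    c.getString(0) == 'endpoint'

    cleanup:
    c?.close()
    db?.close()
}

Whether this passes depends on how completely Robolectric 2.3 shadows the SQLite API, so treat it as a starting point rather than a guaranteed green test.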
SnapzClientTest.groovy
package com.example.DataAccess

import com.example.DataAccess.SnapzAndroidDB
import spock.lang.Specification
import org.robolectric.Robolectric

class SnapClientTest extends Specification {

    /* Create SQLite database for Android */
    def 'Create an SQLite database for Android'() {
        setup:
        def androidDB = new SnapzAndroidDB(Robolectric.application)

        expect:
        androidDB != null
    }
}
The error I am still getting is the following:
com.example.DataAccess.SnapClientTest > Create an SQLite database for Android FAILED
java.lang.RuntimeException: Stub!
at android.database.sqlite.SQLiteOpenHelper.<init>(SQLiteOpenHelper.java:4)
at com.example.DataAccess.SnapzAndroidDB.<init>(SnapzAndroidDB.java:26)
at com.example.DataAccess.SnapClientTest.Create a sqlite database for Android(SnapzClientTest.groovy:15)
Edit 4 August ===================
This is my updated test specification; it uses Robolectric to generate a context that can be passed to the SQLiteOpenHelper(...) constructor:
import org.robolectric.Robolectric

def 'Create an SQLite database for Android'() {
    setup:
    def androidDB = new SnapzAndroidDB(Robolectric.application)

    expect:
    androidDB != null
}
What I am actually testing is a class that extends SQLiteOpenHelper. My SnapzAndroidDB(...) constructor calls the SQLiteOpenHelper constructor; as you can see, the context is the first parameter passed in from the test spec:
public class SnapzAndroidDB extends SQLiteOpenHelper {

    public SnapzAndroidDB(Context context) {
        super(context, SnapzContract.DB_NAME, null, SnapzContract.DB_VERSION);
    }
    .
    .
}
When I run my test I get this error:
com.sunsystem.HttpSnapClient.SnapClientTest > Create an SQLite database for Android FAILED
java.lang.RuntimeException: Stub!
at android.database.sqlite.SQLiteOpenHelper.<init>(SQLiteOpenHelper.java:4)
at com.sunsystem.DataAccess.SnapzAndroidDB.<init>(SnapzAndroidDB.java:33)
at com.sunsystem.HttpSnapClient.SnapClientTest.Create a sqlite database for Android(SnapClientTest.groovy:168)
END EDIT =======================
Edit ====
When I try to use getBaseContext() I get the following error:
com.sunsystem.HttpSnapClient.SnapClientTest > Create an SQLite database for Android FAILED
groovy.lang.MissingMethodException: No signature of method: com.sunsystem.HttpSnapClient.SnapClientTest.getBaseContext() is applicable for argument types: () values: []
at com.sunsystem.HttpSnapClient.SnapClientTest.Create a sqlite database for Android(SnapClientTest.groovy:159)
My Spock specification method is this:
def 'Create an SQLite database for Android'() {
    setup:
    def androidDB = new SnapzAndroidDB(getBaseContext())

    expect:
    androidDB != null
}
Here are the dependencies:
dependencies {
    compile "com.googlecode.json-simple:json-simple:1.1.1", {
        // Exclude junit as we don't want it included in our JAR file; it would pull in hamcrest and other dependencies as well
        exclude group: 'junit', module: 'junit'
    }

    // Just compile so we can use the SQLite API. This won't be included in the JAR
    compile 'com.google.android:android:4.1.1.4', {
        // Do not bring in dependencies
        transitive = false
    }

    // Compile for unit testing only
    testCompile "org.codehaus.groovy:groovy:2.3.4"
    testCompile "org.spockframework:spock-core:0.7-groovy-2.0"
    testCompile 'org.robospock:robospock:0.5.0'
    testCompile 'com.google.android:android-test:4.1.1.4'
    testCompile 'com.android.tools.build:gradle:0.12.2'
    testCompile 'org.robospock:robospock-plugin:0.4.0'
}
====
I am doing Spock unit testing for my library written in Java that will be used in my Android application.
The Java JAR file will be deployed to Android applications for doing the database work; it's this JAR file I am testing.
I have written a Spock specification for testing the creation of an SQLite database.
In my Java JAR file I have a class that creates the SQLite database, and I want to test that in my Spock unit test.
However, the problem is that the SQLiteOpenHelper constructor needs to be called with a Context, and I am trying to mock that context using android.test.mock.MockContext in my Spock unit test.
public class SnapzAndroidDB extends SQLiteOpenHelper implements SnapzDAO {

    public SnapzAndroidDB(Context context) {
        super(context, SnapzContract.DB_NAME, null, SnapzContract.DB_VERSION);
    }

    /* Called only once, the first time the database is created */
    @Override
    public void onCreate(SQLiteDatabase db) {
        String sqlCreate = String.format("create table %s (%s int primary key, %s text, %s text, %s text)",
                SnapzContract.TABLE,
                SnapzContract.GetConfigColumn.ID,
                SnapzContract.GetConfigColumn.NAME,
                SnapzContract.GetConfigColumn.VALUE,
                SnapzContract.GetConfigColumn.CFG_TYPE);

        db.execSQL(sqlCreate);
    }
    .
    .
}
Now in my unit testing spec I have this in my SnapClientTest.groovy:
import android.test.mock.MockContext

def 'Create an SQLite database for Android'() {
    setup:
    def context = new MockContext()
    def androidDB = new SnapzAndroidDB(context.getApplicationContext())

    expect:
    androidDB != null
}
From this you can see that I am mocking the context and sending that as a parameter to the constructor of my class that will call the SQLiteOpenHelper constructor.
The error I get when I run my unit test is this:
com.HttpSnapClient.SnapClientTest > Create an SQLite database for Android FAILED
11:05:27.062 [DEBUG] [TestEventLogger] java.lang.RuntimeException: Stub!
11:05:27.063 [DEBUG] [TestEventLogger] at android.content.Context.<init>(Context.java:4)
11:05:27.063 [DEBUG] [TestEventLogger] at android.test.mock.MockContext.<init>(MockContext.java:5)
11:05:27.063 [DEBUG] [TestEventLogger] at com.sunsystem.HttpSnapClient.SnapClientTest.Create a sqlite database for Android(SnapClientTest.groovy:155)
11:05:27.065 [QUIET] [system.out] 11:05:27.064 [DEBUG] [org.gradle.process.internal.child.ActionExecutionWorker] Stopping client connection.
Being new to Spock I am not sure if this is possible or not, as I am just testing my JAR file.
Spock is one of the most widely used test frameworks in the Groovy and Java ecosystem. It lets you write BDD-style tests in a very intuitive language and eases common tasks such as mocking and extension; what makes it stand out from the crowd is its highly expressive specification language. Thanks to its JUnit runner, Spock is compatible with most IDEs, build tools, and continuous integration servers. Working with Spock basically comes down to following a recipe-like set of steps that let you implement both unit tests and web integration tests effectively.
Your current error message reads:
Create a sqlite database for Android FAILED
Try these steps and see how it goes:
Adding calls to getWritableDatabase and getReadableDatabase to your code should help:
jokesHelper dbHelper = new jokesHelper(getBaseContext());
SQLiteDatabase db = dbHelper.getWritableDatabase();
That way Android will be able to manage and cache the connection.
If you still get an error from getBaseContext, try uninstalling the testing plugin and recreating the STS resources (.classpath and .project) using integrate-with --eclipse; then it should work.
If you are having any problem with getSpecificationContext, it means that some detail was left out and you need to double-check your specifications.
If you are not using Eclipse, you can still build your Java JAR file with Spock from the command line using the usual Java development tools (for instance Sun's JDK) or drive them from Emacs. To run only a single test, such as SampleTest, invoke the test task from the command line with the Java system property:
gradle -Dtest.single=Sample test
Or alternatively
gradle -Dtest.single=SoapTest clean test
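Applied to the spec in this question (the class is SnapClientTest), that would be, for example:
gradle -Dtest.single=SnapClientTest clean test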
Also check the permissions on the directory in use. And in case you haven't done so yet, remember to include the dependencies:
dependencies {
    classpath 'com.android.tools.build:gradle:0.8.+'
    classpath 'org.robospock:robospock-plugin:0.4.0'
}
Also tell Gradle which test directory you are using (e.g. via srcDirs), and keep in mind that it is important to import the right resources class. In addition, add the following to build.gradle inside the defaultConfig block:
testPackageName "com.yourpackage.test"
testInstrumentationRunner "android.test.InstrumentationTestRunner"
testFunctionalTest true
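For context, here is a sketch of where those settings sit in an Android module's build.gradle under the Gradle Android plugin of that era (the plugin line and package name below are placeholders, not taken from the question):

apply plugin: 'android'

android {
    defaultConfig {
        testPackageName "com.yourpackage.test"
        testInstrumentationRunner "android.test.InstrumentationTestRunner"
        testFunctionalTest true
    }
}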
Spock and Robospock are innovative tools that provide helpful resources for developing unit tests. Alternatively, you could also use tools such as the TCL tests. The TCL tests are the oldest set of tests for SQLite and are a very thorough approach. In fact, SQLite started life as a Tcl extension; much of the testing and development tooling for SQLite is written in Tcl, and in addition to the native C API, the Tcl extension is the only API supported by the core SQLite team.
To enable the Tcl bindings, download the TEA (Tcl Extension Architecture) distribution of the SQLite source from the SQLite website. This version of the code is essentially the amalgamation distribution with the Tcl bindings appended to the end. It builds into a Tcl extension that can then be imported into any Tcl environment.
Very specific steps have to be followed, and attention to every detail is essential; it can make the difference between your tests running successfully or not.
The instrumentation framework is the foundation of the testing framework.
Instrumentation controls the application under test and permits the injection of mock components required by the application to run. For example, you can create mock Contexts before the application starts and let the application use them.
All interaction of the application with the surrounding environment can be controlled using this approach. You can also isolate your application in a restricted environment in order to predict the results, forcing the values returned by some methods or mocking persistent and unchanged data for ContentProviders, databases, or even filesystem content. To do that, you also need to declare in the test project's manifest that you are running instrumentation tests:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.aatg.sample.test"
    android:versionCode="1" android:versionName="1.0">
    <application android:icon="@drawable/icon"
        android:label="@string/app_name">
        <uses-library android:name="android.test.runner" />
    </application>
    <uses-sdk android:minSdkVersion="7" />
    <instrumentation
        android:targetPackage="com.example.aatg.sample"
        android:name="android.test.InstrumentationTestRunner"
        android:label="Sample Tests" />
    <uses-permission android:name="android.permission.INJECT_EVENTS" />
</manifest>
In case you run JNI to manipulate your database with native code, there are two ways to load an extension with SQLite. One is through a C API call, and one is through an SQL function that calls down into the same code as the C API function. In both cases, you provide a filename and, optionally, the name of the entry point function:
int sqlite3_load_extension( sqlite3 *db, const char *ext_name,
                            const char *entry_point, char **error )
The other way to load a loadable extension is with the built-in SQL function:
load_extension( 'ext_name' )
load_extension( 'ext_name', 'entry_point' )
This function is similar to the C sqlite3_load_extension() call, with one major limitation: because it is an SQL function, there will, by definition, be an SQL statement executing when the extension is loaded. That means any extension loaded with the load_extension() SQL function will be completely unable to redefine or delete a custom function, including the specialized set of like() functions. Loading extensions with the appropriate syntax works similarly from Java, as you would expect.
Debug directives are only used for testing and development purposes, as they add significant overhead and make everything run noticeably slower, much like adding throws Exception everywhere. Since you are running unit tests you need to set them accordingly, and also take care that your database doesn't get corrupted. Getting your debugging settings well tuned will help your tests run smoothly.
In addition to all the other build directives, SQLite has a fair number of SQLITE_OMIT_* compile-time directives. These are designed to remove core features from the build in an effort to make the core database library as small and compact as possible. To use most of these omit directives, you need to build SQLite from the development sources found in the source-control tree; most omit directives won't work correctly when applied to a source distribution or to the pre-built amalgamation. Also be aware that these compile-time directives are not officially supported, in the sense that they are not part of the official testing chain. For any given version of SQLite there may be both compile problems and runtime issues if arbitrary sets of omit flags are enabled.
Of course, you don't need to be a samurai to run unit tests for SQLite on Android, although it might help.
The issue you are facing is getting a correct Context for creating the DB.
Your first attempt with getBaseContext() didn't work because there is no such method on SnapClientTest: "No signature of method".
In your second attempt you are creating an instance of MockContext - this is a stub implementation that you can't use directly.
http://developer.android.com/reference/android/test/mock/MockContext.html
"A mock Context class. All methods are non-functional and throw UnsupportedOperationException."
Try:
def androidDB = new SnapzAndroidDB(Robolectric.application)
According to http://robospock.org/, Robolectric.application should give you a working context.
Update
I just noticed you are not extending RoboSpecification but Specification:
import pl.polidea.robospock.RoboSpecification
and
class SnapClientTest extends RoboSpecification
Related
How to use C++ or Java Console Application to operate my Cloud Firestore project?
Unit test Java class that loads native library
Hadoop Hive UDF with external library
Query exist-db from Java
Different source files in different build types in gradle builds?