The TensorFlow Android demo provides a decent base for building an Android app that uses a TensorFlow graph, but I've been getting stuck on how to repurpose it for an app that does not do image classification. As it is, it loads the Inception graph from a .pb file and uses it to run inference (and the code assumes as much), but what I'd like to do is load my own graph from a .pb file and implement my own handling of the graph's input and output.
The graph in question is from Assignment 6 of Udacity's deep learning course, an RNN that uses LSTMs to generate text. (I've already frozen it into a .pb file.) However, the Android demo's code is based on the assumption that they're dealing with an image classifier. So far I've figured out that I'll need to change the values of the parameters passed into tensorflow.initializeTensorflow (called in TensorFlowImageListener), but several of the parameters represent properties of image inputs (e.g. IMAGE_SIZE), which the graph I'm looking to load in doesn't have. Does this mean I'll have to change the native code? More generally, how can I approach this entire issue?
Look at TensorFlow Serving for a generic way to load and serve TensorFlow models.
Good news: it recently became a lot easier to embed a pre-trained TensorFlow model in your Android app. Check out my blog posts here:
https://medium.com/@daj/using-a-pre-trained-tensorflow-model-on-android-e747831a3d6 (part 1)
https://medium.com/@daj/using-a-pre-trained-tensorflow-model-on-android-part-2-153ebdd4c465 (part 2)
My blog post goes into a lot more detail, but in summary, all you need to do is:
Include the compile org.tensorflow:tensorflow-android:+ dependency in your build.gradle.
Use the Java TensorFlowInferenceInterface class to interface with your model (no need to modify any of the native code).
The TensorFlow Android demo app has been updated to use this new approach. See TensorFlowImageClassifier.recognizeImage for where it uses the TensorFlowInferenceInterface.
You'll still need to specify some configuration, like the names of the input and output nodes in the graph and the size of the input, but you should be able to figure that information out using TensorBoard or by inspecting the training script.
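For context, here is a minimal sketch of what that looks like for a non-image graph such as the text-generating RNN from the question, once the org.tensorflow:tensorflow-android dependency is in build.gradle. The model path, node names, and sizes below are placeholders, not values from the demo; read the real ones from your frozen graph.

import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class TextRnnClassifier {
    // Placeholders: use the actual node names and sizes from your frozen graph
    // (inspect it with TensorBoard or the training script).
    private static final String MODEL_FILE = "file:///android_asset/my_graph.pb";
    private static final String INPUT_NODE = "input";
    private static final String OUTPUT_NODE = "output";
    private static final int INPUT_SIZE = 27;   // e.g. one-hot vocabulary size
    private static final int OUTPUT_SIZE = 27;

    private final TensorFlowInferenceInterface inference;

    public TextRnnClassifier(AssetManager assets) {
        // Loads the frozen graph from the APK's assets.
        inference = new TensorFlowInferenceInterface(assets, MODEL_FILE);
    }

    public float[] predict(float[] input) {
        float[] output = new float[OUTPUT_SIZE];
        inference.feed(INPUT_NODE, input, 1, INPUT_SIZE); // shape [1, INPUT_SIZE]
        inference.run(new String[] {OUTPUT_NODE});
        inference.fetch(OUTPUT_NODE, output);
        return output;
    }
}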
Related
Does anyone know how to use a language model in TensorFlow Lite? I have a generated language model with an LSTM structure in TensorFlow. I have already converted it to .tflite for use on Android. Now my question is: how can I use it? My intention is to use the model to predict the next word in a sentence. The original model works perfectly in Python; now I need it to work and make predictions in Java or Kotlin.
You can achieve this in two ways:
Add the .tflite model to the assets folder, declare it in the Gradle build file, and then load that model to do what you need in Java/Kotlin, as shown here.
Use Firebase, which provides custom machine learning model deployment options: you can add the model to Firebase and use it as a remote service (the link to the docs).
I'd recommend the first method if your model size is small and you want to run your app offline.
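For the first option, here is a rough Java sketch of loading the .tflite file from assets and running inference with the TensorFlow Lite Interpreter (add the org.tensorflow:tensorflow-lite dependency in build.gradle). The file name, input encoding, and vocabulary size are placeholders, and an LSTM language model may also require you to feed state tensors, depending on how it was converted.

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import java.io.FileInputStream;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import org.tensorflow.lite.Interpreter;

public class NextWordPredictor {
    // Placeholders: match the file name, sequence shape, and vocabulary
    // size to your converted model.
    private static final String MODEL_FILE = "language_model.tflite";
    private static final int VOCAB_SIZE = 10000;

    private final Interpreter interpreter;

    public NextWordPredictor(Context context) throws Exception {
        interpreter = new Interpreter(loadModel(context, MODEL_FILE));
    }

    // Memory-maps the model out of the APK assets (remember to mark .tflite
    // files as noCompress in build.gradle so they remain mappable).
    private static MappedByteBuffer loadModel(Context context, String name) throws Exception {
        AssetFileDescriptor fd = context.getAssets().openFd(name);
        try (FileInputStream in = new FileInputStream(fd.getFileDescriptor())) {
            FileChannel channel = in.getChannel();
            return channel.map(FileChannel.MapMode.READ_ONLY,
                    fd.getStartOffset(), fd.getDeclaredLength());
        }
    }

    // Feeds a batch of one token-id sequence and returns scores over the vocabulary.
    public float[] predict(int[] tokenIds) {
        int[][] input = new int[][] { tokenIds };
        float[][] output = new float[1][VOCAB_SIZE];
        interpreter.run(input, output);
        return output[0];
    }
}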
I have already asked the question as a GitHub issue, but was redirected here. I have seen the example for importing a model created and trained in Python into Java code and using it for predictions. However, I had some problems understanding what is actually going on, especially in this block and the GraphBuilder class declaration between lines 156-207. Could someone please give some explanation for them?
Moreover, I know the Java API is still under construction. However, I would be interested in seeing some more sophisticated examples, if possible, including:
importing model into Java and then performing training on the model
implementing, training, evaluating, saving, loading a model from scratch in Java with Tensorflow
Does anyone have such an example, and willing to share it?
Thank you for any help!
Cheers,
Peter
The code block you pointed to generates a TensorFlow graph to "normalize" an image so that it can be fed into the other TensorFlow graph (Inception). It achieves the equivalent of something like this in Python:
image = tf.cast(tf.image.decode_jpeg(input, channels=3), tf.float32)
batch = tf.expand_dims(image, 0)
resized = tf.image.resize_bilinear(batch, [input_height, input_width])
normalized = tf.divide(tf.subtract(resized, [input_mean]), [input_std])
Many of the Python functions for executing TensorFlow operations (like tf.cast, tf.image.decode_jpeg etc.) are generated from the TensorFlow op definitions. However, such generated functions do not exist yet in the Java API so the operations have to be constructed from lower level primitives, which is what the GraphBuilder class is doing.
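As a self-contained illustration of what "constructing operations from lower-level primitives" means, here is a small sketch against the TensorFlow 1.x Java API. The op types ("Const", "Mul") and values are just examples rather than the demo's code, but the pattern of assembling ops by name via opBuilder is the same thing GraphBuilder does.

import org.tensorflow.DataType;
import org.tensorflow.Graph;
import org.tensorflow.Output;
import org.tensorflow.Session;
import org.tensorflow.Tensor;

public class GraphBuilderSketch {
    public static void main(String[] args) {
        try (Graph g = new Graph()) {
            // Two scalar constants stand in for the image/mean/std tensors in the demo.
            Output<Float> a = constant(g, "a", 2.0f);
            Output<Float> b = constant(g, "b", 10.0f);
            // Equivalent to tf.multiply(a, b) in Python: build the "Mul" op by name.
            Output<Float> product = g.opBuilder("Mul", "product")
                    .addInput(a)
                    .addInput(b)
                    .build()
                    .output(0);

            try (Session s = new Session(g);
                 Tensor<?> result = s.runner().fetch(product).run().get(0)) {
                System.out.println(result.floatValue()); // prints 20.0
            }
        }
    }

    // Builds a scalar "Const" op; the tensor value is copied into the graph at build time.
    private static Output<Float> constant(Graph g, String name, float value) {
        try (Tensor<Float> t = Tensor.create(value, Float.class)) {
            return g.opBuilder("Const", name)
                    .setAttr("dtype", DataType.FLOAT)
                    .setAttr("value", t)
                    .build()
                    .<Float>output(0);
        }
    }
}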
Hope that helps.
Your other questions seem too broad, so not sure how to answer them here.
Hi, I created a sample application with the TensorFlow Java API about a month ago. I used a YOLOv2 model in the example. You can access it here: https://github.com/szaza/tensorflow-example-java.
I've also created a client-server architecture with Spring and Gradle, please see more details here: https://github.com/szaza/tensorflow-java-examples-spring.
I deployed it to my Google Cloud, so a live demo is also available here: http://35.229.93.105:8080/
More info about the project can be found here: https://sites.google.com/view/tensorflow-example-java-api
I know we can use the save function and load the model in a Spark application, but that only works inside a Spark application (Java, Scala, Python). We can also use PMML to export the model to other types of applications. Is there any way to use a Spark-trained model in a plain Java application, for analysis?
I am one of the creators of MLeap. Check us out, it is meant for exactly your use case. If there is a transformer you need that is not currently supported, get in touch with me and we will get it in there.
Our serialization format is solely JSON/Protobuf right now, so it is very portable and supports large models like RandomForest. You can serialize your model to a zip file then load it up wherever.
Take a look at our demo to get a use case:
https://github.com/TrueCar/mleap-demo
Currently no, your options are to use PMML for those models that support it, or write your own framework for using models outside of Spark.
There is movement towards enabling this (see this issue). You could also check out MLeap.
I have a PMML file of a trained Artificial Neural Network (ANN). I would like to create a Java method which simply takes in the inputs and returns the targeted value.
This seems pretty easy, but I do not know how realize it.
The PMML Version = 3.0
Update: 24.05.2013
I tried to use the jpmml Java API.
This is how I did it:
(1) Downloaded three .jar files via the Maven Central Repository (link):
pmml-manager-1.0.2.jar
pmml-model-1.0.2.jar
pmml-evaluator-1.0.2.jar
(2) Used Eclipse's "Configure Build Path" to add those three external .jars
(3) Imported my PMML file named "text.xml" (an artificial neural network (ANN), PMML version="3.0")
(4) Tried to run an example "TreeModelTraversalExample.java" provided by the jpmml-project
Obviously it did not work, for several reasons:
the mentioned example is not for ANNs. How do I rewrite it?
my PMML file is in XML format. Is that the right format?
I do not know how to handle or add Java APIs. Should I even add them via "Configure Build Path" in Eclipse?
Obvious fact #2, I have no clue what I do :-)
Thanks again and kindest regards.
Stefan
JPMML should be able to handle PMML 3.X and newer versions of NeuralNetwork models without problem. Moreover, it should be able to handle all the normalization and denormalization transformations that may accompany such models.
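If you just want a Java method that takes the inputs and returns the target value, the evaluation step looks roughly like this with a recent JPMML-Evaluator release. The class names differ from the old 1.0.x jars in the question, so treat this as a sketch rather than code that drops into that setup.

import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

import org.dmg.pmml.FieldName;
import org.jpmml.evaluator.Evaluator;
import org.jpmml.evaluator.EvaluatorUtil;
import org.jpmml.evaluator.FieldValue;
import org.jpmml.evaluator.InputField;
import org.jpmml.evaluator.LoadingModelEvaluatorBuilder;

public class AnnScorer {
    // Loads a PMML file (e.g. new File("text.xml")) and scores one record of raw inputs.
    public static Map<String, ?> score(File pmmlFile, Map<String, ?> rawInputs) throws Exception {
        Evaluator evaluator = new LoadingModelEvaluatorBuilder()
                .load(pmmlFile)
                .build();
        evaluator.verify();

        // Convert the raw input values into the types the model expects.
        Map<FieldName, FieldValue> arguments = new LinkedHashMap<>();
        for (InputField inputField : evaluator.getInputFields()) {
            FieldName name = inputField.getName();
            arguments.put(name, inputField.prepare(rawInputs.get(name.getValue())));
        }

        // Evaluate and strip the JPMML wrapper objects from the results.
        Map<FieldName, ?> results = evaluator.evaluate(arguments);
        return EvaluatorUtil.decodeAll(results);
    }
}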
I could use a clarification as to why you are interested in converting PMML models to Java code in the first place. This complicates the whole matter a lot and doesn't add any value. The JPMML library itself is rather compact and has minimal external dependencies (at the moment of writing this, it only depends on commons-math). There shouldn't be much difference performance-wise. You can reasonably expect to obtain up to 10'000 scorings/sec on a modern desktop computer.
The JPMML codebase has recently moved to GitHub: http://github.com/jpmml/jpmml
Fellow coders in Turn Inc. have forked this codebase and are implementing PMML-to-Java translation (see top-level module "pmml-translation") for selected model types: https://github.com/turn/jpmml
At the moment I recommend you to check out the Openscoring project (uses JPMML internally): http://www.openscoring.org
Then, you could try the following (a rough client sketch follows the list):
Deploy your XML file using the HTTP PUT method.
Get your model summary information using the HTTP GET method. If the request succeeds (as opposed to failing with an HTTP status 500 error code) then your model is well supported.
Execute the model either in single prediction mode or batch prediction mode using the HTTP POST method. Try sending larger batches to see if it meets your performance requirements.
Undeploy the model using the HTTP DELETE method.
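As a rough illustration of those four calls from Java, here is a sketch using java.net.http.HttpClient. The base URL, endpoint layout, and argument field names are assumptions based on a later Openscoring release, so check the project's docs for the exact paths and payloads.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class OpenscoringClient {
    // Assumed base URL and model id; adjust to your deployment.
    private static final String MODEL_URL = "http://localhost:8080/openscoring/model/ann";

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // (1) Deploy the PMML file with HTTP PUT.
        HttpRequest deploy = HttpRequest.newBuilder(URI.create(MODEL_URL))
                .header("Content-Type", "text/xml")
                .PUT(HttpRequest.BodyPublishers.ofFile(Path.of("text.xml")))
                .build();
        System.out.println(client.send(deploy, HttpResponse.BodyHandlers.ofString()).body());

        // (2) Fetch the model summary with HTTP GET.
        HttpRequest summary = HttpRequest.newBuilder(URI.create(MODEL_URL)).GET().build();
        System.out.println(client.send(summary, HttpResponse.BodyHandlers.ofString()).body());

        // (3) Score one record with HTTP POST (field names here are placeholders).
        String record = "{\"id\":\"record-1\",\"arguments\":{\"x1\":1.0,\"x2\":2.0}}";
        HttpRequest evaluate = HttpRequest.newBuilder(URI.create(MODEL_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(record))
                .build();
        System.out.println(client.send(evaluate, HttpResponse.BodyHandlers.ofString()).body());

        // (4) Undeploy the model with HTTP DELETE.
        HttpRequest undeploy = HttpRequest.newBuilder(URI.create(MODEL_URL)).DELETE().build();
        client.send(undeploy, HttpResponse.BodyHandlers.ofString());
    }
}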
You can always try contacting project owners for more insight. I'm sure they are nice people.
Another approach would be to use the Cascading API. There's a library called "Pattern" for Cascading, which translates PMML models into Cascading apps in Java. https://github.com/Cascading/pattern
Generally those are for Hadoop jobs; however, if you use the "local mode" flow planner in Cascading, it can be built as a JAR file to include with some other Java app.
There is work in progress for ANN models. Check on the developer email list: https://groups.google.com/forum/?fromgroups#!forum/pattern-user
I think this might do what you need. It is an open source library that claims to be able to read and evaluate PMML neural networks. I have not tried it.
https://code.google.com/p/jpmml/
Just a general question, really.
Let's say I am making a game and have made a character model in Blender. How would I use this model in Java?
Would I import it somehow?
Thanks.
Generally, when making models in Blender, you export the model in a format that allows you to later import it into the game engine of your choice; which format to use depends on the engine's requirements.
The export-import cycle is often referred to as the "Asset Pipeline", and you generally want to keep it as simple and automated as possible since it is something you or your artists will perform on a regular basis.
So if we look at a few specific graphics engines and platforms:
OGRE3D (or Ogre4J) supports its own plain-text formats (.scene, .mesh.xml, .material.xml) for loading scenes, models and materials. It also has support for armature animations among other things, and there is some support for loading .blend files directly. See their documentation for Blender.
jMonkeyEngine has support for loading both OGRE3D .scene files and .blend files directly. It also has its own binary .j3o format, which these can be converted into when you want to package the game. For specific examples, see their tutorials.
There are multiple formats you can take into consideration when deciding how you want to use your model. When it is imported however, the game engine of choice represents it in an internal structure which usually allows you to be decoupled from the exact format of choice.
Which format to use is not, and should not be, set in stone, since requirements might change; if the pipeline is done properly, switching should not have a considerable effect on the project. This is also something you should take into consideration if you are writing your own engine.
There are input/output scripts available for Blender that will help you.
Blend2Java, for example, is a set of Python scripts for use with Blender that will export to Java XML, which can be decoded with the standard java.beans.XMLDecoder class.
There's a good overview of how to do this at http://blend2java.sourceforge.net/blend2java-howto.html
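Reading the exported XML back in is plain standard-library Java. Here is a minimal sketch; the file name is a placeholder, and the shape of the decoded object graph depends entirely on what the export scripts produce.

import java.beans.XMLDecoder;
import java.io.BufferedInputStream;
import java.io.FileInputStream;

public class Blend2JavaLoader {
    public static void main(String[] args) throws Exception {
        // "exported_model.xml" stands in for whatever file the Blend2Java scripts write.
        try (XMLDecoder decoder = new XMLDecoder(
                new BufferedInputStream(new FileInputStream("exported_model.xml")))) {
            Object model = decoder.readObject();
            System.out.println("Loaded: " + model.getClass());
        }
    }
}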
Here's a better idea: use an existing Java 3D library (I highly recommend dzzd) and load your model using the library's built-in functions. Then, instead of just working with the data, you can actually display it. From Blender, it's a simple matter of exporting as 3DS.
Yet another solution: Java .Blend provides you with a type-safe Java API to all data in a Blender file. It even supports creating new Blender files from within Java ;)