I have this code that works in Python:
import numpy
from keras.models import load_model

X = numpy.loadtxt("compiledFeatures.csv", delimiter=",")
model = load_model("kerasnaive.h5")
predictions = model.predict(X)
print(predictions)
and I am trying to write code with the same functionality in Java. I have written the code below, but it does not work. Does anyone know what I am doing wrong, or is there a simpler way to do it?
The code goes into the catch block, and while debugging it seems that all the information read from the model file is null.
path = String.format("%s\\kerasnaive.h5", System.getProperty("user.dir"),
        pAgents[i]);
try {
    network = KerasModelImport.importKerasModelAndWeights(path, false);
}
catch (Exception e) {
    System.out.println("cannot build keras layers");
}
INDArray input = Nd4j.create(1);
input.add(featuresInput); // an INDArray that I got in the method
INDArray output = network[i].outputSingle(input);
It seems that the model is not built (the network is still null).
The Python code loads the model and works; in Java I get the error: "Could not determine number of outputs for layer: no output_dim or nb_filter field found. For more information, see http://deeplearning4j.org/model-import-keras."
although the same file is used in both cases.
Thanks,
Ori
You are currently importing the trained Keras model using importKerasModelAndWeights. I'm not sure how you trained your model, but in Keras there are two types of models: the Sequential model, and the Model class that uses the functional API. You can read more here.
If you used the Sequential model when you created the network, you need to use the importKerasSequentialModel function. Keras Sequential models.
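If that is the case, a minimal sketch of the Sequential import could look like the following (the file name follows the question; the three-element input row is just a placeholder for however many features your model actually expects):

import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class KerasSequentialImportExample {
    public static void main(String[] args) throws Exception {
        String path = String.format("%s\\kerasnaive.h5", System.getProperty("user.dir"));

        // Import a Keras Sequential model (architecture + weights) as a MultiLayerNetwork.
        MultiLayerNetwork network =
                KerasModelImport.importKerasSequentialModelAndWeights(path, false);

        // Placeholder input: a single row with three feature values.
        INDArray input = Nd4j.create(new double[][] {{0.1, 0.2, 0.3}});
        INDArray output = network.output(input);
        System.out.println(output);
    }
}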
Related
In Python I've trained a TensorFlow LinearClassifier and saved it like:
model = tf.contrib.learn.LinearClassifier(feature_columns=columns)
model.fit(input_fn=train_input_fn, steps=100)
model.export_savedmodel(export_dir, parsing_serving_input_fn)
By using the TensorFlow Java API I am able to load this model in Java using:
model = SavedModelBundle.load(export_dir, "serve");
It seems I should be able to run the graph using something like
model.session().runner().feed(???, ???).fetch(???, ???).run()
but what variable names/data should I feed to/fetch from the graph to provide it features and to fetch the probabilities of the classes? The Java documentation is lacking this information as far as I can see.
The names of the nodes to feed would depend on what parsing_serving_input_fn does, in particular they should be the names of the Tensor objects that are returned by parsing_serving_input_fn. The names of the nodes to fetch would depend on what you're predicting (arguments to model.predict() if using your model from Python).
That said, the TensorFlow saved model format does include the "signature" of the model (i.e., the names of all Tensors that can be fed or fetched) as metadata that can provide hints.
From Python you can load the saved model and list out its signature using something like:
with tf.Session() as sess:
    md = tf.saved_model.loader.load(sess, ['serve'], export_dir)
    sig = md.signature_def[tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    print(sig)
Which will print something like:
inputs {
  key: "inputs"
  value {
    name: "input_example_tensor:0"
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: -1
      }
    }
  }
}
outputs {
  key: "scores"
  value {
    name: "linear/binary_logistic_head/predictions/probabilities:0"
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: -1
      }
      dim {
        size: 2
      }
    }
  }
}
method_name: "tensorflow/serving/classify"
Suggesting that what you want to do in Java is:
Tensor t = /* Tensor object to be fed */;
model.session().runner()
    .feed("input_example_tensor", t)
    .fetch("linear/binary_logistic_head/predictions/probabilities")
    .run();
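For completeness, here is a rough end-to-end sketch of that call. It assumes the TensorFlow 1.x Java API plus the org.tensorflow:proto classes for building the serialized tf.Example; the feature name "age" and the single-example batch are made up for illustration, so substitute the feature columns your estimator was actually trained with:

import java.util.List;

import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;
import org.tensorflow.Tensors;
import org.tensorflow.example.Example;
import org.tensorflow.example.Feature;
import org.tensorflow.example.Features;
import org.tensorflow.example.FloatList;

public class LinearClassifierClient {
    public static void main(String[] args) {
        try (SavedModelBundle model = SavedModelBundle.load(args[0], "serve")) {
            // Build one serialized tf.Example; "age" is a hypothetical feature name.
            byte[] example = Example.newBuilder()
                    .setFeatures(Features.newBuilder()
                            .putFeature("age", Feature.newBuilder()
                                    .setFloatList(FloatList.newBuilder().addValue(42.0f))
                                    .build()))
                    .build()
                    .toByteArray();

            // The model expects a 1-D DT_STRING tensor of serialized Examples.
            try (Tensor<String> input = Tensors.create(new byte[][] {example})) {
                List<Tensor<?>> outputs = model.session().runner()
                        .feed("input_example_tensor", input)
                        .fetch("linear/binary_logistic_head/predictions/probabilities")
                        .run();

                // Shape [1, 2] for this binary classifier, matching the signature above.
                float[][] probabilities = new float[1][2];
                outputs.get(0).copyTo(probabilities);
                System.out.println(probabilities[0][0] + " " + probabilities[0][1]);
            }
        }
    }
}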
You can also extract this information purely within Java if your program includes the generated Java code for TensorFlow protocol buffers (packaged in the org.tensorflow:proto artifact) using something like this:
// Same as tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY
// in Python. Perhaps this should be an exported constant in TensorFlow's Java API.
final String DEFAULT_SERVING_SIGNATURE_DEF_KEY = "serving_default";
final SignatureDef sig =
MetaGraphDef.parseFrom(model.metaGraphDef())
.getSignatureDefOrThrow(DEFAULT_SERVING_SIGNATURE_DEF_KEY);
You will have to add:
import org.tensorflow.framework.MetaGraphDef;
import org.tensorflow.framework.SignatureDef;
Since the Java API and the saved-model-format are somewhat new, there is much room for improvement in the documentation.
Hope that helps.
I've created a model based on the 'wide and deep' example (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/learn/wide_n_deep_tutorial.py).
I've exported the model as follows:
m = build_estimator(model_dir)
m.fit(input_fn=lambda: input_fn(df_train, True), steps=FLAGS.train_steps)
results = m.evaluate(input_fn=lambda: input_fn(df_test, True), steps=1)
print('Model statistics:')
for key in sorted(results):
    print("%s: %s" % (key, results[key]))
print('Done training!!!')

# Export model
export_path = sys.argv[-1]
print('Exporting trained model to %s' % export_path)
m.export(
    export_path,
    input_fn=serving_input_fn,
    use_deprecated_input_fn=False,
    input_feature_key=INPUT_FEATURE_KEY)
My question is, how do I create a client to make predictions from this exported model? Also, have I exported the model correctly?
Ultimately I need to be able do this in Java too. I suspect I can do this by creating Java classes from proto files using gRPC.
Documentation is very sketchy, hence why I am asking on here.
Many thanks!
I wrote a simple tutorial Exporting and Serving a TensorFlow Wide & Deep Model.
TL;DR
To export an estimator there are four steps:
Define features for export as a list of all features used during estimator initialization.
Create a feature config using create_feature_spec_for_parsing.
Build a serving_input_fn suitable for use in serving using input_fn_utils.build_parsing_serving_input_fn.
Export the model using export_savedmodel().
To run a client script properly you need to do the following steps:
Create and place your script somewhere in the /serving/ folder, e.g. /serving/tensorflow_serving/example/
Create or modify corresponding BUILD file by adding a py_binary.
Build and run a model server, e.g. tensorflow_model_server.
Create, build and run a client that sends a tf.Example to our tensorflow_model_server for the inference.
For more details look at the tutorial itself.
Just spent a solid week figuring this out. First off, m.export is going to be deprecated in a couple of weeks, so instead of that block, use: m.export_savedmodel(export_path, input_fn=serving_input_fn).
Which means you then have to define serving_input_fn(), which of course has a different signature than the input_fn() defined in the wide and deep tutorial. Namely, moving forward, I guess it's recommended that input_fn()-type things return an InputFnOps object, defined here.
Here's how I figured out how to make that work:
from tensorflow.contrib.learn.python.learn.utils import input_fn_utils
from tensorflow.python.ops import array_ops
from tensorflow.python.framework import dtypes
def serving_input_fn():
    features, labels = input_fn()
    features["examples"] = tf.placeholder(tf.string)

    serialized_tf_example = array_ops.placeholder(dtype=dtypes.string,
                                                  shape=[None],
                                                  name='input_example_tensor')
    inputs = {'examples': serialized_tf_example}
    labels = None  # these are not known in serving!
    return input_fn_utils.InputFnOps(features, labels, inputs)
This is probably not 100% idiomatic, but I'm pretty sure it works. For now.
I have to extract the geometry of an IFC file in Java. My problem is that I don't know how to do it.
I tried to use openifctools, but the documentation is really bad. For now I have the IFC file loaded, but I cannot get the geometry out of the model.
Does anyone have experience with IFC model loading?
Thanks in advance.
EDIT: This is what I've done so far
try {
    IfcModel ifcModel = new IfcModel();
    ifcModel.readStepFile(new File("my-project.ifc"));
    Collection<IfcClass> ifcObjects = ifcModel.getIfcObjects();
    System.out.println(ifcObjects.iterator().next());
} catch (Exception e) {
    e.printStackTrace();
}
This correctly loads the IFC file, but I don't know what to do with this information.
I also tried to use IfcOpenShell, but the provided jar didn't work either. At the moment I am trying to build IfcOpenShell myself.
I'm kind of desperate, because everything is very poorly documented and I really need to load and parse the IFC geometry.
Depending on what you want to do with the geometry, how deep you want to delve into the IFC standard, and what performance you need for your solution, you have two different options:
Extract the implicit geometry on your own
Use an external geometry engine
If you go for the first option, you'd have to study the IFC schema intensively. You would only be interested in IFCProducts, because only those can have geometry. Using OpenIfcTools you could do something like:
Collection<IfcProduct> products = model.getCollection(IfcProduct.class);
for (IfcProduct product : products) {
    List<IfcRepresentation> representations = product.getRepresentation().getRepresentations();
    assert !representations.isEmpty();
    assert representations.get(0) instanceof IfcShapeRepresentation;
    Collection<IfcRepresentationItem> repr = representations.get(0).getItems();
    assert !repr.isEmpty();
    IfcRepresentationItem representationItem = repr.iterator().next();
    assert representationItem instanceof IfcFacetedBrep;
    for (IfcFace face : ((IfcFacetedBrep) representationItem).getOuter().getCfsFaces()) {
        for (IfcFaceBound faceBound : face.getBounds()) {
            IfcLoop loop = faceBound.getBound();
            assert loop instanceof IfcPolyLoop;
            for (IfcCartesianPoint point : ((IfcPolyLoop) loop).getPolygon()) {
                point.getCoordinates();
            }
        }
    }
}
However, there are a lot of different GeometryRepresentations, which you'd have to cover, probably doing triangulation and stuff on your own. I've shown one special case and made a lot of assertions. And you'd have to fiddle with coordinate transformations, because these may be nested recursively.
If you go for the second option, the geometry engines I know of are all written in C/C++ (IfcOpenShell, RDF IfcEngine), so you'd have to cope with native library integration. The jar package provided with IfcOpenShell is intended to be used as a Bimserver plugin, so you can't use it without the respective dependencies. However, you can grab the native binaries from this package. In order to use the engine you can draw some inspiration from the Bimserver plugin source. The key native methods you're going to use are
boolean setIfcData(byte[] ifc) to parse the ifc data
IfcGeomObject getGeometry() to access the extracted geometry successively.
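To illustrate the call pattern only, here is a hedged sketch. The IfcEngine and IfcGeomObject declarations below are placeholders standing in for whatever classes the Bimserver IfcOpenShell plugin actually exposes, and the assumption that getGeometry() returns null once all products have been processed is mine, not something documented:

import java.nio.file.Files;
import java.nio.file.Paths;

public class IfcGeometryExtraction {

    // Stand-in for the plugin class that exposes the two native methods quoted above.
    public interface IfcEngine {
        boolean setIfcData(byte[] ifc);
        IfcGeomObject getGeometry();
    }

    // Stand-in declaration; the real IfcGeomObject type also comes from the plugin.
    public interface IfcGeomObject { }

    // Feed the raw IFC file to the engine, then pull geometry objects one by one.
    public static void extract(IfcEngine engine, String ifcFile) throws Exception {
        byte[] ifcData = Files.readAllBytes(Paths.get(ifcFile));
        if (!engine.setIfcData(ifcData)) {
            throw new IllegalStateException("IfcOpenShell could not parse " + ifcFile);
        }
        IfcGeomObject geometry;
        while ((geometry = engine.getGeometry()) != null) {
            // Work with the triangulated geometry (vertices, indices, materials) here.
        }
    }
}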
I have a requirement to produce graphs of matrices and display these graphs on a JSP. The project has been developed in Java and so far all my operations relating to matrices are being performed using the MatLabControl API
http://code.google.com/p/matlabcontrol/ .
I wanted to return the matrices produced by MATLAB (especially eigenvalue matrices and wavelets). MATLAB provides a function "im2java" that converts a graph image from its MATLAB representation to a java.awt.Image. The code I used to get the image data via MatlabControl is as follows:
public Image produceEigenValueGraph(final double[][] matrix) {
    final double[][] maxEigenValueMatrix = extractOutMaxEigenValues(matrix);
    Image matlabPlotImage = null;
    try {
        MatlabNumericArray matLabEigenValueMatrix =
                new MatlabNumericArray(maxEigenValueMatrix, null);
        matLabTypeConverter.setNumericArray("eigen", matLabEigenValueMatrix);
        matLabProxy.setVariable("amountOfTime", matrix.length - 1);
        matLabProxy.eval("time");
        matLabProxy.eval("plot(time, eigen)");
        matLabProxy.eval("frame=getframe");
        final Object[] returnedMatlabArguements =
                matLabProxy.returningEval("im2java(frame.cdata)", 1);
        matlabPlotImage = (Image) returnedMatlabArguements[0];
    } catch (MatlabInvocationException mie) {
        mie.printStackTrace();
    }
    return matlabPlotImage;
}
The code returns a nested exception:
Caused by: java.io.WriteAbortedException: writing aborted;
java.io.NotSerializableException: sun.awt.image.ToolkitImage
Which basically puts an end to any hope of the above code working, unless I am incorrect in my use.
N.B. The code does produce a correct graph; it just fails to return it as a java.awt.Image.
My questions are:
-Is the above code the correct/only way to return images to a java program from Matlab?
-If it is what would be the best alternatives to using Matlab, Java API or otherwise?
Is this the line that causes the exception?
matlabPlotImage = (Image)returnedMatlabArguements[0];
In answer to your question
"-Is the above code the correct/only way to return images to a java program from Matlab?"
You can call Java classes from Matlab, so you could also use the Java in a Matlab file and call that to replace
final Object [] returnedMatlabArguements = matLabProxy.returningEval("im2java(frame.cdata)", 1);
matlabPlotImage = (Image)returnedMatlabArguements[0];
The error is being thrown because Image is not serializable. An option would be to save it as a file in some image format (jpg, png, tiff) using either Matlab or Java and return a File instead of an Image.
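For example, a minimal sketch of the Matlab-side variant, reusing the matLabProxy from your method (the file name handling is only illustrative; saveas writes the current figure to disk, so only a path, not an Image, has to cross the process boundary):

import java.io.File;

import matlabcontrol.MatlabInvocationException;
import matlabcontrol.MatlabProxy;

public class EigenGraphExporter {

    private final MatlabProxy matLabProxy;

    public EigenGraphExporter(MatlabProxy matLabProxy) {
        this.matLabProxy = matLabProxy;
    }

    // Asks MATLAB to write the current figure as a PNG and returns the file,
    // avoiding the non-serializable java.awt.Image round trip.
    public File exportCurrentPlot(String fileName) {
        File output = new File(System.getProperty("user.dir"), fileName);
        try {
            matLabProxy.eval(String.format("saveas(gcf, '%s')", output.getAbsolutePath()));
        } catch (MatlabInvocationException mie) {
            mie.printStackTrace();
        }
        return output;
    }
}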
"-If it is what would be the best alternatives to using Matlab, Java API or otherwise?"
MathWorks provide a Java API (JAMA) to perform a number of linear algebra calculations that you could use.
http://math.nist.gov/javanumerics/jama/#Package
Alternatively, the Apache Commons Math project provides a wide range of linear algebra functions as well as other tools. http://commons.apache.org/math/userguide/linear.html
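As a small illustration of the linear algebra side (this assumes the current commons-math3 artifact; the older 2.x API linked above uses slightly different package names), an eigen-decomposition looks like:

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.EigenDecomposition;
import org.apache.commons.math3.linear.RealMatrix;

public class EigenExample {
    public static void main(String[] args) {
        // Small symmetric example matrix; substitute your own double[][] data.
        RealMatrix matrix = new Array2DRowRealMatrix(new double[][] {
                {2.0, 1.0},
                {1.0, 2.0}
        });

        EigenDecomposition decomposition = new EigenDecomposition(matrix);
        for (double eigenvalue : decomposition.getRealEigenvalues()) {
            System.out.println(eigenvalue);
        }
    }
}

This only replaces the matrix math; for the plot itself you would still pair it with one of the Java graphing libraries below.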
I would check other posts for suggestions on graphing in Java:
constructing graphs in Java
Java Graphing Libraries for Web Applicattions?
Has anyone ever persisted a training set for CI-Bayes? I have sample code from this site: http://www.theserverside.com/news/thread.tss?thread_id=49773
here is the code:
FisherClassifier fc=new FisherClassifierImpl();
fc.train("The quick brown fox jumps over the lazy dog's tail","good");
fc.train("Make money fast!", "bad");
String classification=fc.getClassification("money", "unknown"); // should be "bad"
so I need to be able to store the training set in a local file.
Has anyone ever done this before?
To persist a Java object in a local file, the object's class must first implement the Serializable interface.
import java.io.Serializable;
public class MyClass implements Serializable {...
Then, the class from which you would like to persist this training set should include a method like:
public void persistTrainingSet(FisherClassifier fc) {
    String outputFile = <path/to/output/file>;
    try {
        FileOutputStream fos = new FileOutputStream(outputFile);
        ObjectOutputStream oos = new ObjectOutputStream(fos);
        oos.writeObject(fc);
        oos.close();
    }
    catch (IOException e) {
        //handle exception
    }
    finally {
        //do any cleaning up
    }
}
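A matching load method (a sketch mirroring the one above; it assumes the file was written by persistTrainingSet and that the concrete FisherClassifierImpl really is Serializable) would be:

import java.io.FileInputStream;
import java.io.IOException;
import java.io.ObjectInputStream;

public FisherClassifier loadTrainingSet(String inputFile) {
    FisherClassifier fc = null;
    try {
        FileInputStream fis = new FileInputStream(inputFile);
        ObjectInputStream ois = new ObjectInputStream(fis);
        fc = (FisherClassifier) ois.readObject();
        ois.close();
    }
    catch (IOException | ClassNotFoundException e) {
        //handle exception
    }
    return fc;
}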
I have. After doing a couple of projects with CI-Bayes, I would recommend you look elsewhere (of course this was a long time ago). It is a very bad idea to use an inference engine that needs to be trained before each use, and if you really consider the issue of state management, it's complicated (e.g. do you want to just store the training data? or perhaps the trained distributions? chains?).
CI-Bayes is also kind of a convoluted codebase. It was modeled off some Python code that appeared in a book about intelligence. The Java version is not very well designed. It also does not use TDD and does not really have any JavaDoc to speak of.
That said, you can get a simple classifier going pretty quickly. The longer term goal is the one you asked about though.