I have trained a TensorFlow model in Python and would like to use it in Java code. Training the model is done via something like this code:
def input_fn():
    features = {'a': tf.constant([[1], [2]]),
                'b': tf.constant([[3], [4]])}
    labels = tf.constant([0, 1])
    return features, labels
feature_a = tf.contrib.layers.sparse_column_with_integerized_feature("a", bucket_size=10)
feature_b = tf.contrib.layers.sparse_column_with_integerized_feature("b", bucket_size=10)
feature_columns = [feature_a, feature_b]
model = tf.contrib.learn.LinearClassifier(feature_columns=feature_columns)
model.fit(input_fn=input_fn, steps=10)
Now I want to save this model to use it in Java. It seems that export_savedmodel is the new/preferred way of saving, so I tried:
feature_spec = tf.contrib.layers.create_feature_spec_for_parsing(feature_columns)
serving_input_fn = input_fn_utils.build_parsing_serving_input_fn(feature_spec)
model.export_savedmodel('export', serving_input_fn, as_text=True)
This results in a saved model, which can be loaded from Java with
model = SavedModelBundle.load(dir, "serve");
model.session().runner()
     .feed("input_example_tensor", input)
     .fetch("linear/binary_logistic_head/predictions/probabilities")
     .run();
There is now a problem though: the input_example_tensor should be a Tensor containing Strings/byte[]s, but this is not supported in Java yet (see: Tensor.java#88 "throw new UnsupportedOperationException"). As far as I understand it, the reason that it wants a String is that build_parsing_serving_input_fn wants to parse serialized Example protocol buffers.
Maybe a different serving_input_fn would do better. input_fn_utils.build_default_serving_input_fn looks promising, but I didn't get that to work.
If I call it like:
features_dict = {'a':feature_a, 'b':feature_b}
serving_input_fn = input_fn_utils.build_default_serving_input_fn(features_dict)
I get "AttributeError: '_SparseColumnIntegerized' object has no attribute 'get_shape'"
If I call it like:
features = {'a': tf.constant([[1],[2]]),
'b': tf.constant([[3],[4]]) }
serving_input_fn = input_fn_utils.build_default_serving_input_fn(features)
I get "ValueError: 'Const:0' is not a valid scope name".
What is the proper way to use input_fn_utils.build_default_serving_input_fn? I can't find any example that uses it.
Proto3 supports the oneof feature, where you can have a message with many fields and where at most one field will be set at the same time.
Since only one field will be set at a time, it would seem reasonable to have duplicate field names in the proto schema. The problem is that the proto generator sees this as a redefinition.
I'd like to do this because, in my situation, it makes JSON serialization with JsonFormat simple.
For example, I may like to have
message MyResponse {
  int32 a = 1;
  string b = 2;
  oneof Properties {
    PropertiesType1 properties = 3;
    PropertiesType2 properties = 4;
    PropertiesType3 properties = 5;
    PropertiesType4 properties = 6;
  }
}
Is there a way around this, or will I have to make the effort of redefining the proto? A possible workaround may be, for example, to use map<string, Properties> properties = 9;
Ignore the JSON for now; in most languages/frameworks, you are going to access those properties by their name, whether that is getting the value or checking which one is set. If the names conflict, you can't do that.
Also: oneof allows the same type to be used for multiple members of the discriminated union, in which case what you want to do gets even more confusing.
Finally, going back to JSON: the parser sees "properties": - what does it expect next? And once it has parsed the value, what field is considered "set" in the discriminated union?
So no, for many reasons: this isn't allowed.
I have solved a similar use case for JSON serialization in this way.
message MyResponse {
  int32 a = 1;
  string b = 2;
  oneof Properties {
    PropertiesType1 properties1 = 3 [json_name = "properties"];
    PropertiesType2 properties2 = 4 [json_name = "properties"];
    PropertiesType3 properties3 = 5 [json_name = "properties"];
    PropertiesType4 properties4 = 6 [json_name = "properties"];
  }
}
This works if you use the protoc compilers, but it won't work with advanced tools like buf lint/build. Hope this helps.
But as Marc Gravell said, this is not the recommended way.
I have an old question that has stayed in my mind for a long time. When I was writing code in Spring, there was a lot of dirty and useless code for DTOs and domain objects. At the language level, I am hopeless about Java and see some light in Kotlin. Here is my question:
Style 1: It is common for us to write the following code (Java, C++, C#, ...)
// annot: AdminPresentation
val override = FieldMetadataOverride()
override.broadleafEnumeration = annot.broadleafEnumeration
override.hideEnumerationIfEmpty = annot.hideEnumerationIfEmpty
override.fieldComponentRenderer = annot.fieldComponentRenderer
Style 2: The previous code can be simplified by using T.apply() in Kotlin
override.apply {
    broadleafEnumeration = annot.broadleafEnumeration
    hideEnumerationIfEmpty = annot.hideEnumerationIfEmpty
    fieldComponentRenderer = annot.fieldComponentRenderer
}
Style 3: Can such code be simplified even further, to something like this?
override.copySameNamePropertiesFrom(annot) { // provide property list here
    broadleafEnumeration
    hideEnumerationIfEmpty
    fieldComponentRenderer
}
First Priority Requirements
Provide the property name list only one time
The property names are provided as normal code, so that we get IDE auto-completion.
Second Priority Requirements
It's preferable to avoid run-time cost for Style 3. (For example, reflection may be a possible implementation, but it does introduce cost.)
It's preferable to generate code like Style 1/Style 2 directly.
Don't care
The final syntax of Style 3.
I am a novice in the Kotlin language. Is it possible to use Kotlin to define something like 'Style 3'?
It should be pretty simple to write a five-line helper to do this, which even supports copying every matching property or just a selection of properties.
Although it's probably not useful if you're writing Kotlin code and heavily utilising data classes and val (immutable properties). Check it out:
import kotlin.reflect.KMutableProperty
import kotlin.reflect.KProperty
import kotlin.reflect.full.isSupertypeOf
import kotlin.reflect.full.memberProperties

fun <T : Any, R : Any> T.copyPropsFrom(fromObject: R, vararg props: KProperty<*>) {
    // only consider mutable properties
    val mutableProps = this::class.memberProperties.filterIsInstance<KMutableProperty<*>>()
    // if a source list is provided use that, otherwise use all available properties
    val sourceProps = if (props.isEmpty()) fromObject::class.memberProperties else props.toList()
    // copy all matching properties
    mutableProps.forEach { targetProp ->
        sourceProps.find {
            // make sure properties have the same name and compatible types
            it.name == targetProp.name && targetProp.returnType.isSupertypeOf(it.returnType)
        }?.let { matchingProp ->
            targetProp.setter.call(this, matchingProp.getter.call(fromObject))
        }
    }
}
This approach uses reflection, but it uses Kotlin reflection, which is very lightweight. I haven't timed anything, but it should run at almost the same speed as copying properties by hand.
Now given 2 classes:
data class DataOne(val propA: String, val propB: String)
data class DataTwo(var propA: String = "", var propB: String = "")
You can do the following:
var data2 = DataTwo()
var data1 = DataOne("a", "b")
println("Before")
println(data1)
println(data2)
// this copies all matching properties
data2.copyPropsFrom(data1)
println("After")
println(data1)
println(data2)
data2 = DataTwo()
data1 = DataOne("a", "b")
println("Before")
println(data1)
println(data2)
// this copies only matching properties from the provided list
// with complete refactoring and completion support
data2.copyPropsFrom(data1, DataOne::propA)
println("After")
println(data1)
println(data2)
Output will be:
Before
DataOne(propA=a, propB=b)
DataTwo(propA=, propB=)
After
DataOne(propA=a, propB=b)
DataTwo(propA=a, propB=b)
Before
DataOne(propA=a, propB=b)
DataTwo(propA=, propB=)
After
DataOne(propA=a, propB=b)
DataTwo(propA=a, propB=)
My use case:
I am trying to serve models trained in Python within our existing JVM service using libtensorflow_jni.
Now I am able to load the model using SavedModelBundle.load(). But I find it hard to feed the request into the model, as my user request is not simply a scalar or a matrix, but a map of features, like:
{'gender': 1, 'age': 20, 'country': 100, other features ...}
Searching through the TensorFlow tutorials, I see that Example protocol buffers may fit here, as an Example basically holds a list of features. But I am not sure how to convert it into a Java Tensor object.
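For reference, building and serializing such an Example on the Java side might look roughly like this (a sketch using the org.tensorflow.example proto classes; the feature names simply mirror the map above):
import org.tensorflow.example.Example;
import org.tensorflow.example.Feature;
import org.tensorflow.example.Features;
import org.tensorflow.example.Int64List;

Feature gender = Feature.newBuilder()
    .setInt64List(Int64List.newBuilder().addValue(1))
    .build();
Feature age = Feature.newBuilder()
    .setInt64List(Int64List.newBuilder().addValue(20))
    .build();
Example example = Example.newBuilder()
    .setFeatures(Features.newBuilder()
        .putFeature("gender", gender)
        .putFeature("age", age))
    .build();
byte[] serialized = example.toByteArray();  // this is what gets fed as a string tensor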
If I create a Tensor directly from the serialized Example object, the TensorFlow runtime seems unhappy with the data type. For example, if I do the following,
Tensor inputTensor = Tensor.create(example.toByteArray());
s.runner().feed(inputTensorName, inputTensor).fetch(outputTensorName).run().get(0);
I will get an IllegalArgumentException:
java.lang.IllegalArgumentException: Expected serialized to be a vector, got shape: []
Could you shed some light on how I can move forward from here, in case you happen to know or have the same use case?
Thanks!
Looking at your error message, it appears that the problem is that your model is expecting a vector of string tensors (most likely corresponding to a batch of serialized Example protocol buffer messages, probably from tf.parse_example) but you're feeding it a scalar string tensor.
Unfortunately, until issue #8531 is resolved, the Java API doesn't have a way to create a Tensor of strings except for scalars. Once that issue is resolved, things will be easier.
In the meantime, you could work around this by constructing a TensorFlow "model" that converts your scalar string into a vector of size 1 :). That could be done with something like this:
// A TensorFlow "model" that reshapes a string scalar into a vector.
// Should be much prettier once https://github.com/tensorflow/tensorflow/issues/7149
// is resolved.
// Uses org.tensorflow.{DataType, Graph, Output, Session, Tensor}.
private static class Reshaper implements AutoCloseable {
  Reshaper() {
    this.graph = new Graph();
    this.session = new Session(graph);
    this.in =
        this.graph.opBuilder("Placeholder", "in")
            .setAttr("dtype", DataType.STRING)
            .build()
            .output(0);
    try (Tensor shape = Tensor.create(new int[] {1})) {
      Output vectorShape =
          this.graph.opBuilder("Const", "vector_shape")
              .setAttr("dtype", shape.dataType())
              .setAttr("value", shape)
              .build()
              .output(0);
      this.out =
          this.graph.opBuilder("Reshape", "out")
              .addInput(in)
              .addInput(vectorShape)
              .build()
              .output(0);
    }
  }

  @Override
  public void close() {
    this.session.close();
    this.graph.close();
  }

  public Tensor vector(Tensor input) {
    return this.session.runner().feed(this.in, input).fetch(this.out).run().get(0);
  }

  private final Graph graph;
  private final Session session;
  private final Output in;
  private final Output out;
}
With the above, you can convert your example proto tensor to a vector and feed it into the model you're interested in with something like this:
Tensor inputTensor = null;
try (Tensor scalar = Tensor.create(example.toByteArray())) {
  inputTensor = reshaper.vector(scalar);
}
s.runner().feed(inputTensorName, inputTensor).fetch(outputTensorName).run().get(0);
For full details, see this example on GitHub.
Hope that helps!
$postfields["pricing[1][annually]"] = "50.00";
$postfields["pricing[1][monthly]"] = "50.00";
$postfields["pricing[2][monthly]"] = "8.00";
$postfields["pricing[2][annually]"] = "80.00";
I want something similar to the above variable in java. I am not talking about creating a class with required variables.
I have used List<Map<String,String>> pricing = new ArrayList<Map<String,String>>();
but that doesn't seem to work with the WHMCS API.
I debugged and came across this value on the back-end
"pricing" -> "[{monthly=5.00, annually=50.00}]"
That is how it is done in the API:
http://docs.whmcs.com/API:Add_Product
Do we have anything similar in Java that can cater to this issue?
I am integrating a billing solution with WHMCS using their api.
You can use a simple 2D double array. You'd create some int constants, e.g. annually = 0, monthly = 1:
double[][] pricing = new double[][]{{50.00, 50.00}};
and so on
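A minimal sketch of that idea (the constant names and the outer index, here a product/billing-cycle index, are assumptions for illustration):
// index 0 = annually, index 1 = monthly
final int ANNUALLY = 0;
final int MONTHLY = 1;

// one row per product id from the PHP example
double[][] pricing = new double[][] {
    {50.00, 50.00},  // product 1: annually, monthly
    {80.00, 8.00}    // product 2: annually, monthly
};

double product1Monthly = pricing[0][MONTHLY];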
I am prototyping an interface to our application to allow other people to use Python; our application is written in Java. I would like to pass some of our data from the Java app to the Python code, but I am unsure how to pass an object to Python. I have done a simple Java-to-Python function call with simple parameters using Jython and found it very useful for what I am trying to do. Given the class below, how can I use it in Python/Jython as an input to a function/class?
public class TestObject
{
    private double[] values;
    private int length;
    private int anotherVariable;
    // getters, setters
}
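For context, with Jython the Java object can be handed to a Python function directly from the Java side, roughly like this (a sketch; the script name and the Python function process() are assumptions):
import org.python.core.Py;
import org.python.core.PyObject;
import org.python.util.PythonInterpreter;

PythonInterpreter interpreter = new PythonInterpreter();
interpreter.execfile("process_data.py");           // script defines: def process(obj): ...
PyObject process = interpreter.get("process");     // look up the Python function
TestObject data = new TestObject();
PyObject result = process.__call__(Py.java2py(data));  // pass the Java object in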
One solution: you could use a message system, queue, or broker of some sort to serialize/deserialize or pass messages between Python and Java, then create workers/producers/consumers to put work on the queues to be processed in Python or Java.
Also consider checking out for inspiration: https://www.py4j.org/
Py4J is used heavily by/for PySpark and Hadoop-type stuff.
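As a rough illustration of the Py4J route (a sketch; the entry-point class and getter name are assumptions), the Java side exposes an object through a gateway and Python connects to it with JavaGateway().entry_point:
import py4j.GatewayServer;

public class AppEntryPoint {
    private final TestObject data = new TestObject();

    public TestObject getData() {
        return data;   // Python receives a proxy to this object via the gateway
    }

    public static void main(String[] args) {
        GatewayServer server = new GatewayServer(new AppEntryPoint());
        server.start();
    }
}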
To answer your question more directly, here is an example using json-simple:
import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.StringWriter;
import java.io.Writer;

import org.json.simple.JSONArray;
import org.json.simple.JSONObject;

public class TestObject
{
    private double[] values;
    private int length;
    private int anotherVariable;
    private boolean someBool;
    private String someString;

    // getters, setters

    public String toJSON() throws IOException {
        JSONObject obj = new JSONObject();
        JSONArray vals = new JSONArray();
        for (double v : this.values) {
            vals.add(v);
        }
        obj.put("values", vals);
        obj.put("length", this.length);
        obj.put("bool_val", this.someBool);
        obj.put("string_key", this.someString);
        StringWriter out = new StringWriter();
        obj.writeJSONString(out);
        return out.toString();
    }

    public void writeObject() throws IOException {
        try (Writer writer = new BufferedWriter(
                new OutputStreamWriter(new FileOutputStream("anObject.json"), "utf-8"))) {
            writer.write(this.toJSON());
        }
    }

    public void setObject() {
        values = new double[] { 100.134 };
        length = 12;
        anotherVariable = 15;
        someString = "spam";
    }
}
And in python:
import json

class DoStuffWithObject(object):
    def __init__(self, obj):
        self.obj = obj
        self.changeObj()
        self.writeObj()

    def changeObj(self):
        self.obj['values'] = 100.134
        self.obj['length'] = 12
        self.obj['anotherVariable'] = 15
        self.obj['someString'] = "spam"

    def writeObj(self):
        ''' write back to file '''
        with open('anObject.json', 'w') as f:
            json.dump(self.obj, f)

    def someOtherMethod(self, s):
        ''' do something else '''
        print('hello {}'.format(s))

with open('anObject.json', 'r') as f:
    obj = json.loads(f.read())

# print out obj['values'], obj['someBool'], ...
for key in obj:
    print(key, obj[key])

aThing = DoStuffWithObject(obj)
aThing.someOtherMethod('there')
And then in Java, read back the object. There are existing solutions implementing this idea (JSON-RPC, XML-RPC, and variants). Depending on your needs, you may also want to consider using something like MongoDB ( http://docs.mongodb.org/ecosystem/drivers/java/ ), the benefit being that Mongo speaks JSON.
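For the read-back step on the Java side, a minimal sketch using json-simple's parser (the file name and keys match the ones written above):
import java.io.FileReader;

import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;

// inside a method that handles IOException and ParseException
JSONParser parser = new JSONParser();
JSONObject obj = (JSONObject) parser.parse(new FileReader("anObject.json"));
long length = (Long) obj.get("length");            // json-simple returns integers as Long
String someString = (String) obj.get("string_key");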
See:
https://spring.io/guides/gs/messaging-reactor/
http://spring.io/guides/gs/messaging-rabbitmq/
http://spring.io/guides/gs/scheduling-tasks/
Celery-like Java projects:
Jedis
RabbitMQ
ZeroMQ
A more comprehensive list of queues:
http://queues.io/
Resources referenced:
http://www.oracle.com/technetwork/articles/java/json-1973242.html
How do I create a file and write to it in Java?
https://code.google.com/p/json-simple/wiki/EncodingExamples
I agree with the other answer. I think the bottom line is that "Python and Java are separate interpreter environments." You therefore shouldn't expect to transfer "an object" from one to the other, and you shouldn't expect to "call methods." But it is reasonable to pass data from one to the other by serializing and de-serializing it through some intermediate data format (e.g. JSON), as you would do with any other program.
In some environments, such as Microsoft Windows, it's possible that a technology like OLE (dot-Net) might be usable to allow the environments to be linked together "actively," where the various systems implement and provide OLE objects. But I don't have any personal experience with whether, or how, this might be done.
Therefore, the safest thing to do is to treat them as "records" and to use serialization techniques on both sides. (Or, if you got very adventurous, run (say) Java in a child thread.) An "adventurous" design could get out of hand very quickly, with little return on investment.
You need to make the Python file into an exe using py2exe; refer to this link: https://www.youtube.com/watch?v=kyoGfnLm4LA. Then use the program from Java and pass arguments:
Please refer to this link; it has the details:
Calling fortran90 exe program from java is not executing
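Calling such an executable from Java might look roughly like this (a sketch; the exe name and arguments are placeholders, and the checked exceptions would be handled in a real method):
import java.io.BufferedReader;
import java.io.InputStreamReader;

ProcessBuilder pb = new ProcessBuilder("my_script.exe", "arg1", "arg2");
pb.redirectErrorStream(true);          // merge stderr into stdout
Process process = pb.start();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);      // output printed by the Python exe
    }
}
int exitCode = process.waitFor();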