I have built a Python model based on RandomForestClassifier and exported it as a PMML file. Now I need to use this PMML in Java to classify data into 2 categories, but this is new to me and I don't know how to handle the Java part.
Google: pmml java
Second Link is https://github.com/jpmml/jpmml-evaluator
There you have a library with examples. So try it and come back if you have problems.
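To give you a head start, here is a minimal sketch of what evaluation looks like with JPMML-Evaluator (based on the 1.5.x API; the file path and the placeholder feature value are assumptions you would replace with your own):
import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

import org.dmg.pmml.FieldName;
import org.jpmml.evaluator.Evaluator;
import org.jpmml.evaluator.EvaluatorUtil;
import org.jpmml.evaluator.FieldValue;
import org.jpmml.evaluator.InputField;
import org.jpmml.evaluator.LoadingModelEvaluatorBuilder;

public class PmmlDemo {

    public static void main(String[] args) throws Exception {
        // Load and sanity-check the model exported from your Python script
        Evaluator evaluator = new LoadingModelEvaluatorBuilder()
                .load(new File("random_forest.pmml")) // illustrative path
                .build();
        evaluator.verify();

        // Prepare one record; the keys must match the PMML input fields
        Map<FieldName, FieldValue> arguments = new LinkedHashMap<>();
        for (InputField inputField : evaluator.getInputFields()) {
            Object rawValue = 1.0; // plug in your real feature value here
            arguments.put(inputField.getName(), inputField.prepare(rawValue));
        }

        // The decoded result map contains the target field with one of your two classes
        Map<FieldName, ?> results = evaluator.evaluate(arguments);
        System.out.println(EvaluatorUtil.decodeAll(results));
    }
}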
Another choice is PMML4S, which is implemented in Scala but can be used through either its Scala or Java API. It's very easy to use, for example:
import org.pmml4s.model.Model;
Model model = Model.fromFile("/the/pmml/file/path");
Object result = model.predict(data);
The data can be a Map, an Array, a JSON string, or a Series, and the result has the same type as the input. For details on how to use PMML4S in Java, see this example: https://github.com/autodeployai/pmml4s/blob/master/src/test/java/org/pmml4s/model/JModelTest.java
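For instance, a map-based call from Java might look like this (a sketch; the field names are illustrative, use your model's actual input fields):
import java.util.HashMap;
import java.util.Map;

import org.pmml4s.model.Model;

Model model = Model.fromFile("/the/pmml/file/path");

Map<String, Object> data = new HashMap<>();
data.put("sepal_length", 5.1); // hypothetical feature names
data.put("sepal_width", 3.5);

// For a classifier, the result holds the predicted class
// (and, depending on the model, class probabilities)
Object result = model.predict(data);
System.out.println(result);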
There is a demo based on the high-performance, lightweight framework Vert.x:
vertx-pmml
Just configure the router via JSON like this:
"route": {
"/predict/iris": {
"access": "public",
"method": "POST",
"actor": "evaluter.predict",
"ext": {
"name": "IRIS_SVC",
"pmml": ".\\config\\model_svc.pmml",
"version": "1.0.0"
}
}
...
Then prepare your feature handler (not necessary if you feed a JSON map that matches your PMML input fields):
// src/main/java/com/hirisklab/evaluate/evaluator/actor/EvaluateImpl.java
private Future<Map<String, Object>> Featurelize(JsonObject data) {
    Promise<Map<String, Object>> promise = Promise.promise();
    try {
        Map<String, Object> feature = data.getMap();
        // TODO: featurelize with pmml input fields...
        promise.complete(feature);
    } catch (Exception e) {
        promise.fail(e);
    }
    return promise.future();
}
The action assigned in the router (the actor field) will be handled here:
// src/main/java/com/hirisklab/evaluate/evaluator/actor/EvaluateImpl.java
public void predict(JsonObject data, Handler<AsyncResult<EvaluateResponse<JsonObject>>> handler) {
    Promise<EvaluateResponse<JsonObject>> promise = Promise.promise();
    try {
        JsonObject profile = Optional.ofNullable(data.getJsonObject("_EXT"))
                .orElseThrow(() -> EvaluateException.InvalidConfigException);
        EvaluaterFactory.getEvaluater(profile).onSuccess(evaluater -> {
            Featurelize(data.getJsonObject("data")).onSuccess(feature -> {
                evaluater.predict(feature).onSuccess(result -> {
                    promise.complete(new EvaluateResponse<JsonObject>(new JsonObject().put("raw", result)));
                }).onFailure(f -> promise.fail(EvaluateException.FailedToPredict));
            }).onFailure(f -> promise.fail(EvaluateException.FailedToFeaturelize));
        }).onFailure(f -> promise.fail(EvaluateException.FailedToLoadEvaluator));
    } catch (Exception e) {
        e.printStackTrace();
        promise.fail(e);
    }
    handler.handle(promise.future());
}
Here is some simple code that might help point you in the right direction:
// This will load the PMML file
Evaluator evaluator = new LoadingModelEvaluatorBuilder()
        .load(new File("path\\file.pmml"))
        .build();

// The internal self-check
evaluator.verify();
System.out.println("PMML loaded");

// This will create the `actual` pipeline from the PMML and load it in Java
TransformerBuilder pmmlTransformerBuilder = new TransformerBuilder(evaluator)
        .withTargetCols()
        .withOutputCols()
        .exploded(true);
System.out.println("Building...");
Transformer pmmlTransformer = pmmlTransformerBuilder.build();

// Now we load the CSV file and convert it into a DataFrame so that
// we can run it through the PMML model to get predictions
Dataset<Row> df = sparkSession.read()
        .option("header", true)
        .option("inferSchema", true)
        .csv("path\\file.csv");
df.printSchema();

// This will predict the new data points using the pipeline
Dataset<Row> result = pmmlTransformer.transform(df);
NOTE: Make sure you are loading the correct CSV file there. The column headers must not be changed, otherwise it will show an error.
Here is the link to JPMML-Evaluator (the same library Florian suggested) if you want to explore further.
Here I have a method where fetchReport is an external call to a vendor API. I want to copy that data to Azure Blob Storage, but not if there was an error. If there was an error, then I want to return the CustomResponse with the error details. writeToBlob() also returns a CustomResponse. I want to be able to preserve the error message from the external API to give to the consumer.
Is there any way I can use some conditional logic like:
if response contains "Failed" -> then return response with error details
else -> write to blob
public Flux<CustomResponse> getAndSaveReport(Mono<JsonNode> fetchReport, String reportFilePrefix) {
    Mono<JsonNode> reportMono = fetchReport
            .doOnSuccess(result -> {
                log.info(Logger.EVENT_UNSPECIFIED, "Successfully retrieved report");
            })
            .switchIfEmpty(Mono.just(objectMapper.convertValue(new CustomResponse("No content"), JsonNode.class)))
            .onErrorResume(BusinessException.class, err -> {
                log.error(Logger.EVENT_FAILURE, "Failed to retrieve report");
                CustomResponse apiResponse = new CustomResponse();
                apiResponse.setStatus("Failed");
                apiResponse.setMessage("Error message: " + err.getMessage());
                apiResponse.setType(reportFilePrefix);
                JsonNode errJson = objectMapper.convertValue(apiResponse, JsonNode.class);
                return Mono.just(errJson);
            });
    return writeToBlob(reportMono.flux(), reportFilePrefix).flux();
}
Any help would be appreciated!
Not sure what fetchReport returns, but the code could be simplified by applying flatMap. Also, it's not clear why you are using flux() everywhere when only one signal is passed - you can use Mono instead.
public Mono<CustomResponse> getAndSaveReport(Mono<JsonNode> fetchReport, String reportFilePrefix) {
    return fetchReport
            .flatMap(result -> {
                if (result.path("status").asText().contains("Failed")) {
                    // error handling, e.g. build the CustomResponse from the JSON
                    return Mono.just(errorResponse);
                } else {
                    // assuming writeToBlob can accept the report directly
                    return writeToBlob(result, reportFilePrefix);
                }
            });
}
I'm trying to build a Spring Boot app that allows me to insert a JSON object from Postman and save it to an existing JSON file that already holds other data. I'm new to Jackson, so perhaps I missed something?
This is how my JSON file looks:
[
  {
    "Name": "After Dark",
    "Author": "Haruki Murakami"
  },
  {
    "Name": "It",
    "Author": "Stephen King"
  }
]
This is what I have tried:
@PostMapping("/insertBook")
public void insertBook(@RequestBody Book book) throws Exception {
    File booksJsonFile = Paths.get(this.getClass().getResource("/books.json").toURI()).toFile();
    objectMapper.writeValue(booksJsonFile, book);
}
It writes to an empty file, but it doesn't append to the existing JSON file.
I have also tried this:
@PostMapping("/insertBook")
public void insertBook(@RequestBody Book book) throws URISyntaxException {
    try {
        File file = Paths.get(this.getClass().getResource("/books.json").toURI()).toFile();
        FileWriter fileWriter = new FileWriter(file, true);
        SequenceWriter seqWriter = objectMapper.writer().writeValuesAsArray(fileWriter);
        seqWriter.write(book);
        seqWriter.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
From Postman I'm sending a single Book as a JSON object in the request body.
Do I need to use something else to achieve the result that I want?
I will be thankful for your help.
I have tried to reproduce your problem according to your code and came to the following conclusions:
You cannot modify a file under resources directly. Here is an explanation why.
I managed to append new JSON to the file (using your approach but saving the file locally), but it's probably not what you expect (the JSON structure is corrupted):
[
  {
    "Name": "After Dark",
    "Author": "Haruki Murakami"
  },
  {
    "Name": "It",
    "Author": "Stephen King"
  }
][{"Name":"new name","Author":"new author"}]
I am afraid it is not possible to update the current JSON structure directly in the file.
I managed to solve your problem using the org.json library. The disadvantage of my solution is the necessity of rewriting the entire file each time. In addition, I used the synchronized keyword in order to avoid simultaneous file modification.
public synchronized void updateJsonFile(Book book) throws IOException {
    ObjectMapper objectMapper = new ObjectMapper();
    Path path = Paths.get("./books.json");
    final String currentJsonArrayAsString = Files.readString(path);
    try (FileWriter fileWriter = new FileWriter(path.toFile(), false)) {
        JSONObject jsonObject = new JSONObject(objectMapper.writeValueAsString(book));
        JSONArray jsonArray = new JSONArray(currentJsonArrayAsString);
        jsonArray.put(jsonObject);
        fileWriter.write(jsonArray.toString());
    }
}
And now books.json has the following content:
[
  {
    "Author": "Haruki Murakami",
    "Name": "After Dark"
  },
  {
    "Author": "Stephen King",
    "Name": "It"
  },
  {
    "Author": "new author",
    "Name": "new name"
  }
]
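If you would rather not pull in org.json, the same read-modify-write cycle can be done with Jackson alone. A minimal sketch, assuming the same writable ./books.json path and the Book class from the question:
import java.io.File;
import java.io.IOException;
import java.util.List;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public synchronized void updateJsonFileWithJackson(Book book) throws IOException {
    ObjectMapper objectMapper = new ObjectMapper();
    File file = new File("./books.json");

    // Read the existing array into a list, append the new book,
    // then rewrite the whole file (same trade-off as above)
    List<Book> books = objectMapper.readValue(file, new TypeReference<List<Book>>() {});
    books.add(book);
    objectMapper.writerWithDefaultPrettyPrinter().writeValue(file, books);
}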
I have the following JSON format
{
  "file": {
    "version": "v1.4",
    "release": "1.1"
  },
  "status": "ON",
  "document": {
    "status": "NOT_FOUND",
    "release": "undefined"
  }
}
and I would like to know how I can express this format in my PactDslJsonBody, something like:
DslPart result = new PactDslJsonBody()
        .stringType("file.version", "v1.4")
        .stringType("file.release", "1.1")
        .stringType("status", "ON")
        .stringType("document.status", "NOT_FOUND")
        .stringType("document.release", "undefined")
        .asBody();
Or is it possible to add a Java POJO? I have the class ApplicationResponse:
public class ApplicationResponse {
    private File file;
    private String status;
    private Document document;
    //...
}
Something like this?
DslPart result = new PactDslJsonBody()
        .object(ApplicationResponse)
        .asBody();
What could be the best approach? Could you please add an example?
We attempted to do what you are trying to do, using reflection to stub out our POJOs. However, our classes carry many Lombok annotations and we couldn't get default values out of builder-annotated fields, so we gave up trying to use it. But a dev with more time could achieve this, no doubt.
I am now actively creating Pacts for our projects and use both LambdaDsl and PactDslJsonBody to build the interaction.
DslPart actualPactDsl = LambdaDsl.newJsonBody((body) -> {
    body
            .stringType("status", "ON")
            .object("document", (doc) -> {
                doc.stringType("status", "NOT_FOUND");
                doc.stringType("release", "undefined");
            })
            .object("file", (file) -> {
                file.stringType("version", "v1.4");
                file.stringType("release", "1.1");
            });
}).build();
or
String pactDslJson = new PactDslJsonBody()
        .stringType("status", "ON")
        .object("document")
            .stringType("status", "NOT_FOUND")
            .stringType("release", "undefined")
        .closeObject()
        .object("file")
            .stringType("version", "v1.4")
            .stringType("release", "1.1")
        .closeObject()
        .getBody().toString();
Both of these examples will produce the JSON string from your example.
The examples that are part of Pact-JVM are really helpful for getting your head around the different types of tests you can create.
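For context, a DslPart like the ones above is typically attached to a consumer test interaction roughly like this (a sketch; the provider name, path, and exact annotations depend on your Pact-JVM version and setup):
@Pact(consumer = "my-consumer", provider = "my-provider")
public RequestResponsePact createPact(PactDslWithProvider builder) {
    return builder
            .uponReceiving("a request for the application response")
            .path("/application") // illustrative path
            .method("GET")
            .willRespondWith()
            .status(200)
            .body(actualPactDsl) // the DslPart built above
            .toPact();
}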
I am struggling to import data into MongoDB from a JSON file.
I can do it on the command line by using the mongoimport command.
I have explored and tried a lot, but I am not able to import from a JSON file using Java.
sample.json
{ "test_id" : 1245362, "name" : "ganesh", "age" : "28", "Job" :
{"company name" : "company1", "designation" : "SSE" }
}
{ "test_id" : 254152, "name" : "Alex", "age" : "26", "Job" :
{"company name" : "company2", "designation" : "ML" }
}
Thanks for your time.
~Ganesh~
Suppose you can read the JSON documents one at a time. For example, you read the first JSON text
{ "test_id" : 1245362, "name" : "ganesh", "age" : "28", "Job" :
{"company name" : "company1", "designation" : "SSE" }
}
and assign it to a variable (String json1). The next step is to parse it:
DBObject dbo = (DBObject) com.mongodb.util.JSON.parse(json1);
put all the DBObjects into a list,
List<DBObject> list = new ArrayList<>();
list.add(dbo);
then save them into the database:
new MongoClient().getDB("test").getCollection("collection").insert(list);
EDIT:
In the newest MongoDB version you have to use Documents instead of DBObject, and the methods for adding objects look different now. Here's an updated example:
Imports are:
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;
The code would look like this (referring to the text above the EDIT):
Document doc = Document.parse(json1);
new MongoClient().getDatabase("db").getCollection("collection").insertOne(doc);
You can also do it the way with the list, but then you need:
new MongoClient().getDatabase("db").getCollection("collection").insertMany(list);
But I think there is a problem with this solution. When you type:
db.collection.find()
in the mongo shell to get all objects in the collection, the result looks like the following:
{ "_id" : ObjectId("56a0d2ddbc7c512984be5d97"),
"test_id" : 1245362, "name" : "ganesh", "age" : "28", "Job" :
{ "company name" : "company1", "designation" : "SSE"
}
}
which is not exactly the same as before (MongoDB has added its own _id field).
Had a similar "problem" myself and ended up using Jackson with POJO data binding, and Morphia.
While this sounds a bit like cracking a nut with a sledgehammer, it is actually very easy to use, robust, quite performant, and easy to maintain code-wise.
Small caveat: You need to map your test_id field to MongoDB's _id if you want to reuse it.
Step 1: Create an annotated bean
You need to hint to Jackson how to map the data from the JSON file to a POJO. I shortened the class a bit for the sake of readability:
@JsonRootName(value = "person")
@Entity
public class Person {

    @JsonProperty(value = "test_id")
    @Id
    Integer id;

    String name;

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
As for the embedded document Job, please have a look at the POJO data binding examples linked.
Step 2: Map the POJO and create a datastore
Somewhere during your application initialization, you need to map the annotated POJO. Since you already should have a MongoClient, I am going to reuse that ;)
Morphia morphia = new Morphia();
morphia.map(Person.class);
/* You can reuse this datastore */
Datastore datastore = morphia.createDatastore(mongoClient, "myDatabase");
/*
* Jackson's ObjectMapper, which is reusable, too,
* does all the magic.
*/
ObjectMapper mapper = new ObjectMapper();
Step 3: Do the actual importing
Now importing a given JSON file becomes as easy as
public Boolean importJson(Datastore ds, ObjectMapper mapper, String filename) {
    try {
        JsonParser parser = new JsonFactory().createParser(new FileReader(filename));
        Iterator<Person> it = mapper.readValues(parser, Person.class);
        while (it.hasNext()) {
            ds.save(it.next());
        }
        return Boolean.TRUE;
    } catch (JsonParseException e) {
        /* JSON was invalid, deal with it here */
    } catch (JsonMappingException e) {
        /* Jackson was not able to map
         * the JSON values to the bean properties,
         * possibly because of
         * insufficient mapping information.
         */
    } catch (IOException e) {
        /* Most likely the file was not readable.
         * Should rather be thrown, but was
         * caught for the sake of showing what can happen.
         */
    }
    return Boolean.FALSE;
}
With a bit of refactoring, this can be converted into a generic importer for Jackson-annotated beans, as sketched below.
Obviously, I left out some special cases, but that would be out of the scope of this answer.
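As a sketch of that refactoring (same imports as above, with the target class passed in as a parameter and the error handling collapsed, since both Jackson exceptions extend IOException):
public <T> Boolean importJson(Datastore ds, ObjectMapper mapper, String filename, Class<T> type) {
    try {
        JsonParser parser = new JsonFactory().createParser(new FileReader(filename));
        // MappingIterator works for any mapped bean type
        Iterator<T> it = mapper.readValues(parser, type);
        while (it.hasNext()) {
            ds.save(it.next());
        }
        return Boolean.TRUE;
    } catch (IOException e) {
        /* Invalid JSON, mapping problems and unreadable files end up here */
        return Boolean.FALSE;
    }
}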
With the 3.2 driver, if you have a Mongo collection and a collection of JSON documents, e.g.:
MongoCollection<Document> collection = ...
List<String> jsons = ...
You can insert individually:
jsons.stream().map(Document::parse).forEach(collection::insertOne);
or bulk:
collection.insertMany(
jsons.stream().map(Document::parse).collect(Collectors.toList())
);
I just faced this issue today and solved it in a different way, since none of the answers here satisfied me, so enjoy my extra contribution. Performance is sufficient to export 30k documents and import them into my Spring Boot app for integration test cases (it takes a few seconds).
First, the way you export your data in the first place matters.
I wanted a file where each line contains one document that I can parse in my Java app.
mongo db --eval 'db.data.find({}).limit(30000).forEach(function(f){print(tojson(f, "", true))})' --quiet > dataset.json
Then I get the file from my resources folder, parse it, extract the lines, and process them with MongoTemplate. A buffer could be used.
@Autowired
private MongoTemplate mongoTemplate;

public void createDataSet() {
    mongoTemplate.dropCollection("data");
    try {
        InputStream inputStream = Thread.currentThread().getContextClassLoader().getResourceAsStream(DATASET_JSON);
        List<Document> documents = new ArrayList<>();
        String line;
        InputStreamReader isr = new InputStreamReader(inputStream, Charset.forName("UTF-8"));
        BufferedReader br = new BufferedReader(isr);
        while ((line = br.readLine()) != null) {
            documents.add(Document.parse(line));
        }
        mongoTemplate.insert(documents, "data");
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
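Another variant: if your input is one big JSON array string, you can split it with the net.sf.json library and hand each element to the driver as a Document: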
List<Document> jsonList = new ArrayList<Document>();
net.sf.json.JSONArray array = net.sf.json.JSONArray.fromObject(json);
for (Object object : array) {
    net.sf.json.JSONObject jsonStr = (net.sf.json.JSONObject) JSONSerializer.toJSON(object);
    Document jsnObject = Document.parse(jsonStr.toString());
    jsonList.add(jsnObject);
}
collection.insertMany(jsonList);
Runtime r = Runtime.getRuntime();
Process p = null;
// dir is the path to where your mongoimport is.
File dir = new File("C:/Program Files/MongoDB/Server/3.2/bin");
// This line opens a shell in the given dir; the import command is exactly
// the same as running mongoimport at the command prompt.
p = r.exec("c:/windows/system32/cmd.exe /c mongoimport --db mydb --collection student --type csv --file student.csv --headerline", null, dir);
public static void importCSV(String path) {
    try {
        List<Document> list = new ArrayList<>();
        MongoDatabase db = DbConnection.getDbConnection();
        db.createCollection("newCollection");
        MongoCollection<Document> collection = db.getCollection("newCollection");
        BufferedReader reader = new BufferedReader(new FileReader(path));
        String line;
        while ((line = reader.readLine()) != null) {
            String[] item = line.split(","); // the csv file is comma separated
            String id = item[0]; // get each value in the csv row
            String first_name = item[1];
            String last_name = item[2];
            String address = item[3];
            String gender = item[4];
            String dob = item[5];
            Document document = new Document(); // create a document
            document.put("id", id); // and put the data into it
            document.put("first_name", first_name);
            document.put("last_name", last_name);
            document.put("address", address);
            document.put("gender", gender);
            document.put("dob", dob);
            list.add(document);
        }
        reader.close();
        collection.insertMany(list);
    } catch (Exception e) {
        System.out.println(e);
    }
}
I need to call this service in Java -
https://api.semantics3.com/test/v1/products?q={"cat_id": "13658", "brand": "Toshiba", "model": "Satellite"}
I've managed to do this in Python as follows -
class Semantics:
    def __init__(self):
        self.service_address = 'https://api.semantics3.com/test/v1/products?'
        self.api_key = 'SEM3158A71D4AB3A3715C2230B96943F46D0'

    def query(self, params):
        query = 'q=' + params.__dict__.__str__().replace("'", '"')
        query = urlencode(urlparse.parse_qs(query), True)
        req = Request(url=self.service_address + query)
        req.add_header('api_key', self.api_key)
        return urlopen(req).read()


class QueryParams:
    def __init__(self, cat_id, model, brand):
        self.cat_id = cat_id
        self.model = model
        self.brand = brand


qp = QueryParams('13658', 'Satellite', "Toshiba")
print Semantics().query(qp)
I have tried writing an equivalent Java program using the Spring REST API and Apache HttpClient, to no avail. I can't find a way to put a dictionary (i.e. a Map) into the query string.
public static void main(String[] args) {
    String uri = "https://api.semantics3.com/test/v1/products?";
    HttpClient hc = new HttpClient();
    GetMethod method = new GetMethod(uri);
    method.getParams().setParameter("q", "Toshiba"); // How do I insert a Map here?
    method.getParams().setParameter(HttpMethodParams.RETRY_HANDLER,
            new DefaultHttpMethodRetryHandler(3, false));
    method.setRequestHeader("api_key", "SEM2158A71D4AB3A3715C2435B96943F46D0");
    try {
        int statusCode = hc.executeMethod(method);
        System.out.println(statusCode);
        byte[] responseBody = method.getResponseBody();
        System.out.println(new String(responseBody));
    } catch (HttpException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        method.releaseConnection();
    }
}
At the lowest level I could produce the query string via concatenation and then URL-encode it. But is there a more elegant way to do it?
I think you can use an external JAR like Gson to convert the Map into JSON:
Map<String, String> map = new HashMap<String,String>();
map.put("cat_id", "12345");
..
Gson gson = new Gson();
method.getParams().setParameter("q", gson.toJson(map));
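One caveat with the snippet above (based on the Commons HttpClient 3.x API from the question): getParams().setParameter(...) sets client configuration parameters, not the URL query string, so the JSON should go into the query string instead, for example:
String json = gson.toJson(map);
// setQueryString URL-encodes the name/value pairs for you
method.setQueryString(new NameValuePair[] { new NameValuePair("q", json) });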
Have a look at Google's HTTP Client.
As you can see from the examples, it uses objects to build the request URL and deserialise the response body. The docs also show you how to deserialise JSON specifically, and you can choose your JSON library of choice.
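A rough sketch of the same request with that library (class names from google-http-client; treat the details as assumptions to verify against the docs):
import com.google.api.client.http.GenericUrl;
import com.google.api.client.http.HttpRequest;
import com.google.api.client.http.HttpRequestFactory;
import com.google.api.client.http.javanet.NetHttpTransport;

HttpRequestFactory requestFactory = new NetHttpTransport().createRequestFactory();

// Unknown keys on a GenericUrl become query parameters and are URL-encoded
GenericUrl url = new GenericUrl("https://api.semantics3.com/test/v1/products");
url.put("q", "{\"cat_id\": \"13658\", \"brand\": \"Toshiba\", \"model\": \"Satellite\"}");

HttpRequest request = requestFactory.buildGetRequest(url);
request.getHeaders().set("api_key", "YOUR_API_KEY"); // placeholder key
System.out.println(request.execute().parseAsString());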