I want to build an app with a simple GUI that includes a "Start Point" and an "End Point". The purpose of the app is to print to the screen how many meters/kilometers long the route that Google Maps chooses is, without presenting the map itself. How can I do that? Everything I have found involves map graphics.
You can use the Distance Matrix API:
The Distance Matrix API is a service that provides travel distance and
time for a matrix of origins and destinations. The API returns
information based on the recommended route between start and end
points, as calculated by the Google Maps API, and consists of rows
containing duration and distance values for each pair.
In that case you need HttpURLConnection (or something similar) to send a URL request like:
https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=Washington,DC&destinations=New+York+City,NY&key=YOUR_API_KEY
and parse the JSON (or XML) response, which looks like this (a minimal fetch-and-parse sketch follows the sample response below):
{
  "destination_addresses" : [ "New York, NY, USA" ],
  "origin_addresses" : [ "Washington, DC, USA" ],
  "rows" : [
    {
      "elements" : [
        {
          "distance" : {
            "text" : "225 mi",
            "value" : 361715
          },
          "duration" : {
            "text" : "3 hours 49 mins",
            "value" : 13725
          },
          "status" : "OK"
        }
      ]
    }
  ],
  "status" : "OK"
}
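For example, a minimal plain-Java sketch (my own, not from the original answer) that sends the request with HttpURLConnection, parses the response with org.json, and prints the distance of the first origin/destination pair; YOUR_API_KEY and the addresses are placeholders, and on Android this must run off the main thread:

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;
import org.json.JSONObject;

public class DistanceClient {
    public static void main(String[] args) throws IOException {
        String url = "https://maps.googleapis.com/maps/api/distancematrix/json"
                + "?units=metric&origins=Washington,DC&destinations=New+York+City,NY"
                + "&key=YOUR_API_KEY";
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (Scanner in = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
            String body = in.useDelimiter("\\A").next();                 // read the whole response
            JSONObject element = new JSONObject(body)
                    .getJSONArray("rows").getJSONObject(0)
                    .getJSONArray("elements").getJSONObject(0);
            long meters = element.getJSONObject("distance").getLong("value");   // "value" is in meters
            String text = element.getJSONObject("distance").getString("text");  // human-readable form
            System.out.println(meters + " m (" + text + ")");
        } finally {
            conn.disconnect();
        }
    }
}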
NB! Don't forget to get an API key.
Also take a look at the official documentation and other tutorials and examples.
Related
I have a CSV structure like this,
and I also have a JSON response:
[
  {
    "ID" : "1",
    "Name" : "abc",
    "Mobile" : "123456"
  },
  {
    "ID" : "2",
    "Name" : "cde",
    "Mobile" : "123345"
  }
]
I need the output like this
If your intention is to convert the JSON directly, then the Baeldung solution you were given is good.
Otherwise, the way I see it, and based on the info you're giving, you will need a representation of that JSON in a Java object, which will either represent some kind of request coming from somewhere or data you're getting from your database, in order for it to be written to a CSV.
Check out these; they might be useful:
https://www.codejava.net/frameworks/spring-boot/csv-export-example
https://zetcode.com/springboot/csv/
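If the Baeldung route (Jackson plus jackson-dataformat-csv) is what you want, a minimal sketch along those lines might look like this; input.json, output.csv and building the schema from the first element's fields are my assumptions, not code from the links above:

import java.io.File;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

public class JsonToCsv {
    public static void main(String[] args) throws Exception {
        // read the JSON array shown in the question
        JsonNode tree = new ObjectMapper().readTree(new File("input.json"));

        // build the CSV columns (ID, Name, Mobile) from the first element's field names
        CsvSchema.Builder schema = CsvSchema.builder().setUseHeader(true);
        tree.get(0).fieldNames().forEachRemaining(schema::addColumn);

        // write one CSV row per JSON object
        new CsvMapper().writerFor(JsonNode.class)
                .with(schema.build())
                .writeValue(new File("output.csv"), tree);
    }
}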
I'd like to be able to query a JSON object and modify it in a way that is structure-agnostic and doesn't involve marshalling to POJOs. Here's an example to illustrate.
{
  "user" : [ {
    "username" : "foo",
    "user-id" : "1234",
    "name" : {
      "first-name" : "John",
      "last-name" : "Smith"
    },
    "ssn" : "123-45-6789"
  }, {
    "username" : "bar",
    "user-id" : "1235",
    "name" : {
      "first-name" : "Jane",
      "last-name" : "Doe"
    },
    "ssn" : "098-76-5432"
  } ]
}
What I want to do is this:
Get the ssn nodes, retrieve their values, encrypt them, put the encrypted values back in the JSON, and return the whole thing. I've found two ways to do this so far:
Marshalling to a User POJO
Writing a method that iterates over the "user"s directly and modifies the ssn value.
The problem with either approach is that I have to generate "user" specific code. This is fine when I have only one data format, but I'm going to have dozens. I want to work more agnostically. Ideally I can do this with a line like this:
List<JsonNode> ssnNodes = jsonObj.match("$.user[*].ssn");
and then just iterate over the list -- just as I can with XML using XPath. This way I can maintain a list of json-paths to query with and that is as much as I need to know about the data.
I hope someone can tell me there is a way to do this in Java, but I haven't found one so far. Thanks in advance!
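A minimal sketch of the structure-agnostic query-and-modify described above, using the Jayway JsonPath library (my suggestion; encrypt() is a hypothetical placeholder):

import com.jayway.jsonpath.DocumentContext;
import com.jayway.jsonpath.JsonPath;
import java.util.List;

public class SsnMasker {
    public static String encryptSsns(String json) {
        DocumentContext ctx = JsonPath.parse(json);

        // query with a json-path, XPath-style
        List<String> ssns = ctx.read("$.user[*].ssn");
        System.out.println("found " + ssns.size() + " ssn values");

        // rewrite every match in place, no POJO binding needed
        ctx.map("$.user[*].ssn", (value, config) -> encrypt((String) value));
        return ctx.jsonString();
    }

    private static String encrypt(String plain) {
        return "***" + plain.substring(plain.length() - 4);  // placeholder "encryption"
    }
}

The json-paths themselves can then live in configuration, one list per data format, just as described above.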
I have to write over a hundred integrations between many systems. This integration layer must be able to convert codes. Each system uses codes to represent business types like insurance_type, customer_type, etc. Each of them has a set of valid values. These values are not the same from system to system, and may even vary over time.
I started looking for data domain mapping libraries in Java and didn't find anything suitable. I considered CloverETL, Pentaho ETL, or GETL, but they are all way too complex for my need, or not maintained.
The goal is to keep the conversion rules out of the code so they can evolve over time without the need to deploy a new executable.
I'm looking for a tool or library that would allow me to represent a mapping similar to this:
{
  "domains" : [
    {
      "name" : "type police host",
      "values" : [
        {
          "code" : "0001",
          "description" : "Habitation",
          "start_date" : "2019-06-30",
          "end_date" : ""
        },
        {
          "code" : "0002",
          "description" : "Automobile",
          "start_date" : "2019-06-30",
          "end_date" : ""
        }
      ]
    },
    {
      "name" : "type police web",
      "values" : [
        {
          "code" : "Habitation",
          "description" : "Habitation",
          "start_date" : "2019-06-30",
          "end_date" : ""
        }
      ]
    }
  ],
  "conversions" : [
    {
      "from" : "type police host",
      "to" : "type police web",
      "rules" : [
        {
          "from" : [ "0001" ],
          "to" : "Habitation",
          "start_date" : "2019-06-30",
          "end_date" : ""
        },
        {
          "from" : [ "0003", "0004" ],
          "to" : "Deux roues",
          "start_date" : "2019-06-30",
          "end_date" : ""
        }
      ]
    }
  ]
}
From the configuration file above, I would like to be able to do things like convertsAsOf("2019-07-10", "type police host", "type police web", "0001") and have it return "Habitation". Any suggestions for a library that would do this?
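Absent a dedicated library, here is a minimal hand-rolled sketch of convertsAsOf on top of Jackson and the configuration file above; the class name and the null return for unmapped codes are my assumptions:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.time.LocalDate;

public class CodeConverter {
    private final JsonNode root;

    public CodeConverter(File mappingFile) throws Exception {
        this.root = new ObjectMapper().readTree(mappingFile);   // load the JSON mapping once
    }

    public String convertsAsOf(String asOf, String from, String to, String code) {
        LocalDate date = LocalDate.parse(asOf);
        for (JsonNode conversion : root.path("conversions")) {
            if (!from.equals(conversion.path("from").asText())
                    || !to.equals(conversion.path("to").asText())) {
                continue;                                        // wrong domain pair
            }
            for (JsonNode rule : conversion.path("rules")) {
                boolean codeMatches = false;
                for (JsonNode c : rule.path("from")) {
                    if (code.equals(c.asText())) {
                        codeMatches = true;
                    }
                }
                // a rule applies only inside its validity window; empty end_date means open-ended
                LocalDate start = LocalDate.parse(rule.path("start_date").asText());
                String end = rule.path("end_date").asText();
                boolean active = !date.isBefore(start)
                        && (end.isEmpty() || !date.isAfter(LocalDate.parse(end)));
                if (codeMatches && active) {
                    return rule.path("to").asText();
                }
            }
        }
        return null;  // no applicable rule as of that date
    }
}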
I have a document schema such as
{
  "_id" : 18,
  "name" : "Verdell Sowinski",
  "scores" : [
    {
      "type" : "exam",
      "score" : 62.12870233109035
    },
    {
      "type" : "quiz",
      "score" : 84.74586220889356
    },
    {
      "type" : "homework",
      "score" : 81.58947824932574
    },
    {
      "type" : "homework",
      "score" : 69.09840625499065
    }
  ]
}
I have a solution using pull that copes with removing a single element at a time, but I want a general solution that would cope with an irregular schema where there could be anywhere from one to many elements in the array, and I would like to remove all elements based on a condition.
I'm using MongoDB driver 3.2.2 and saw pullByFilter, which sounded good:
Creates an update that removes from an array all elements that match the given filter.
I tried this
Bson filter = and(eq("type", "homework"), lt("score", highest));   // static imports from com.mongodb.client.model.Filters
Bson u = Updates.pullByFilter(filter);
UpdateResult ur = collection.updateOne(studentDoc, u);             // studentDoc is the filter matching the student document
Unsurprisingly, this did not have any effect, since I wasn't specifying the scores array.
I get an error
The positional operator did not find the match needed from the query. Unexpanded update: scores.$.type
when I change the filter to be
Bson filter = and(eq("scores.$.type", "homework"), lt("scores.$.score", highest));
Is there a one-step solution to this problem?
There seems to be very little info I can find on this particular method. This question may relate to How to Update Multiple Array Elements in mongodb.
After some more "thinking" (and a little trial and error), I found the correct Filters method to wrap my basic filter. I think I was focusing on array operators too much.
I'll not post it here in case of flaming.
Clue: think "matches..." (as in regex pattern matching) when dealing with Filters helper methods ;)
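Since the author doesn't show the exact Filters wrapper, here is only a hedged sketch of one way to express the one-step update, keying the element-level condition under the scores field inside the $pull document (not necessarily the combination the clue points at); highest and collection come from the question's code:

import org.bson.Document;
import org.bson.conversions.Bson;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.Updates;
import com.mongodb.client.result.UpdateResult;

// renders { $pull: { scores: { type: "homework", score: { $lt: highest } } } },
// which removes every matching element of scores in one update
Bson pull = Updates.pullByFilter(new Document("scores",
        new Document("type", "homework").append("score", new Document("$lt", highest))));
UpdateResult ur = collection.updateOne(Filters.eq("_id", 18), pull);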
I'm trying Spark with Java and MongoDB and I want to aggregate some Documents into a single one based on timestamps. For example, I want to aggregate X documents into a single one:
{
  "_id" : ObjectId("598c32f455f0353f9e69ebf1"),
  "_class" : "...",
  "timestamp" : ISODate("2017-08-10T10:17:00.000Z"),
  "value" : 10.1
}
...
{
  "_id" : ObjectId("598c32f455f0353f9e69ebz2"),
  "_class" : "...",
  "timestamp" : ISODate("2017-08-10T10:18:00.000Z"),
  "value" : 2.1
}
Let's say I have 60 documents like this, their timestamps are in a window of 1 minute (from 10:17:00 to 10:18:00), and I want to obtain one document:
{
  "_id" : ObjectId("598c32f455f0353f9e69e231"),
  "_class" : "...",
  "start_timestamp" : ISODate("2017-08-10T10:17:00.000Z"),
  "end_timestamp" : ISODate("2017-08-10T10:18:00.000Z"),
  "average_value" : **average value of those documents**
}
Is it possible to perform this kind of transformation? Can I retrieve one window of 1 minute of data at a time?
An approach that takes all the documents and compares their timestamps looks slow and inefficient.
Thanks in advance.
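If the window bounds are known and the averaging can happen on the database side rather than in Spark (an assumption on my part), a minimal sketch with the MongoDB Java driver's aggregation helpers, collapsing one 1-minute window into a single result document:

import java.time.Instant;
import java.util.Arrays;
import java.util.Date;
import org.bson.Document;
import com.mongodb.client.model.Accumulators;
import com.mongodb.client.model.Aggregates;
import com.mongodb.client.model.Filters;

// collection is the MongoCollection<Document> holding the measurements
Date start = Date.from(Instant.parse("2017-08-10T10:17:00Z"));
Date end   = Date.from(Instant.parse("2017-08-10T10:18:00Z"));

Document result = collection.aggregate(Arrays.asList(
        // keep only the documents inside the window
        Aggregates.match(Filters.and(
                Filters.gte("timestamp", start),
                Filters.lt("timestamp", end))),
        // collapse them into one document with min/max timestamp and the average value
        Aggregates.group(null,
                Accumulators.min("start_timestamp", "$timestamp"),
                Accumulators.max("end_timestamp", "$timestamp"),
                Accumulators.avg("average_value", "$value"))))
        .first();

For many consecutive windows, the same idea generalizes by grouping on a timestamp truncated to the minute instead of matching a single window, so the database returns one averaged document per minute.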