I am working on a project which requires weather information, so I used the OpenWeatherMap API. My program works and I can read the values from "main" and "wind", but I also need to get the description from the "weather" set. The problem is that the weather set is a list in the JSON file and I am unable to cast it to a map. The example JSON file which I am trying to parse is http://api.openweathermap.org/data/2.5/weather?q=London
JSON:
{
  "coord": {
    "lon": -0.13,
    "lat": 51.51
  },
  "weather": [
    {
      "id": 803,
      "main": "Clouds",
      "description": "broken clouds",
      "icon": "04n"
    }
  ],
  "base": "stations",
  "main": {
    "temp": 43.56,
    "pressure": 1004,
    "humidity": 87,
    "temp_min": 41,
    "temp_max": 46.4
  },
  "visibility": 10000,
  "wind": {
    "speed": 11.41,
    "deg": 80
  },
  "rain": {},
  "clouds": {
    "all": 75
  },
  "dt": 1573350303,
  "sys": {
    "type": 1,
    "id": 1414,
    "country": "GB",
    "sunrise": 1573369754,
    "sunset": 1573402780
  },
  "timezone": 0,
  "id": 2643743,
  "name": "London",
  "cod": 200
}
Looking at the file, we notice that there is a [] bracket around the weather set, which is what is creating problems in my project. I tried to look up how to cast a list to a map and played around with my code, but it didn't help. The commented-out code in the file shows things I have tried while trying to make it work.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.lang.reflect.Type;
import java.net.URL;
import java.net.URLConnection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
public class PlantWateringApp {

    // Parse a JSON object string into a Map with Gson.
    public static Map<String, Object> jsonToMap(String str) {
        Map<String, Object> map = new Gson().fromJson(str, new TypeToken<HashMap<String, Object>>() {
        }.getType());
        return map;
    }

    public static void main(String[] args) {
        String LOCATION = "delhi,india";
        String result = "{\"coord\":{\"lon\":77.22,\"lat\":28.65},\"weather\":[{\"id\":711,\"main\":\"Smoke\",\"description\":\"smoke\",\"icon\":\"50d\"}],\"base\":\"stations\",\"main\":{\"temp\":72.32,\"pressure\":1015,\"humidity\":59,\"temp_min\":64.4,\"temp_max\":77},\"visibility\":1000,\"wind\":{\"speed\":3.36,\"deg\":270},\"clouds\":{\"all\":0},\"dt\":1573351180,\"sys\":{\"type\":1,\"id\":9165,\"country\":\"IN\",\"sunrise\":1573348168,\"sunset\":1573387234},\"timezone\":19800,\"id\":1273294,\"name\":\"Delhi\",\"cod\":200}";
        System.out.println(result);

        Map<String, Object> respMap = jsonToMap(result);
        Map<String, Object> mainMap = jsonToMap(respMap.get("main").toString());
        Map<String, Object> windMap = jsonToMap(respMap.get("wind").toString());

        // Things I have tried for the "weather" list:
        // Type listType = new TypeToken<List<Map<String, String>>>() {}.getType();
        // List<Map<String, String>> weatherMap = new Gson().fromJson(respMap.get("description").toString(), listType);
        // Map<String, Object> name = (Map<String, Object>) respMap.get("description");
        // Map<String, Object> weatherMap = jsonToMap(respMap.get("description").toString());

        System.out.println("Location: " + LOCATION);
        System.out.println("Current Temperature: " + mainMap.get("temp"));
        System.out.println("Current Humidity: " + mainMap.get("humidity"));
        System.out.println("Max: " + mainMap.get("temp_max"));
        System.out.println("Min: " + mainMap.get("temp_min"));
        System.out.println("Wind Speed: " + windMap.get("speed"));
        System.out.println("Wind Angle: " + windMap.get("deg"));
    }
}
I tried to do it the same way as I did for main and wind: Map weatherMap = jsonToMap(respMap.get("weather").toString());. But I got this error:
java.lang.IllegalStateException: Expected BEGIN_ARRAY but was BEGIN_OBJECT at line 1 column 3 path $[0]
So I tried not converting the JSON to a Map but casting it directly, like Map weatherMap = (Map) respMap.get("weather");, but I got:
java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.util.Map
Then I tried to read it as a list of maps using
List<Map<String, String>> weatherMap = new Gson().fromJson(respMap.get("weather").toString(), listType);
But this says:
String cannot be converted to int
I am really confused about what to do in this situation. I cannot figure out how to deal with the [] in the JSON file.
This data is provided as a List, and you are trying to convert it into a Map; that is not right. You need to get it (weather) as a list of maps and then treat each element as a Map. Here is an example of how to get it as a Map:
// ...
// other code
// ...
Map<String, Object> respMap = jsonToMap(result);

// no need to convert from string to map again and again
Map<String, Object> mainMap = (Map<String, Object>) respMap.get("main");
Map<String, Object> windMap = (Map<String, Object>) respMap.get("wind");

// first get "weather" as a list
List<Map<String, Object>> weather = (List<Map<String, Object>>) respMap.get("weather");
// ...
System.out.println("Wind Speed: " + windMap.get("speed"));
System.out.println("Wind Angle: " + windMap.get("deg"));

// weather as a list
System.out.println("Weather: " + weather);

// assuming weather contains at least 1 element
Map<String, Object> weatherMap = weather.get(0);
System.out.println("Weather as map: " + weatherMap);
First, cast it to a list:
List<Map<String, Object>> weather = (List<Map<String, Object>>) respMap.get("weather");
Then treat each element as a Map:
// assuming weather contains at least 1 element
Map<String, Object> weatherMap = weather.get(0);
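From there, the description the question asks for is just another lookup on that map; a minimal usage sketch, with the key name taken from the JSON in the question:
// "description" is a key of the first element of the "weather" array
System.out.println("Description: " + weatherMap.get("description"));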
Hope this helps.
Make life simple and use real types.
import java.util.List;
import com.google.gson.Gson;
public class PlantWateringApp {

    // Only the parts of the response we care about need to be modelled; Gson ignores the rest.
    static class Weather_2_5 {
        List<Weather> weather;
    }

    static class Weather {
        Integer id;
        String main;
        String description;
        String icon;
    }

    public static void main(String[] args) {
        String result = "{\"coord\":{\"lon\":77.22,\"lat\":28.65},\"weather\":[{\"id\":711,\"main\":\"Smoke\",\"description\":\"smoke\",\"icon\":\"50d\"}],\"base\":\"stations\",\"main\":{\"temp\":72.32,\"pressure\":1015,\"humidity\":59,\"temp_min\":64.4,\"temp_max\":77},\"visibility\":1000,\"wind\":{\"speed\":3.36,\"deg\":270},\"clouds\":{\"all\":0},\"dt\":1573351180,\"sys\":{\"type\":1,\"id\":9165,\"country\":\"IN\",\"sunrise\":1573348168,\"sunset\":1573387234},\"timezone\":19800,\"id\":1273294,\"name\":\"Delhi\",\"cod\":200}";
        // System.out.println(result);
        Gson gson = new Gson();
        Weather_2_5 obj = gson.fromJson(result, Weather_2_5.class);
        for (int idx = 0; idx < obj.weather.size(); idx++) {
            System.out.println(obj.weather.get(idx).description);
        }
    }
}
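If you also want typed access to the temperature and wind values the question prints, the same pattern extends naturally. Below is a minimal sketch, with field names simply mirroring the JSON keys from the response above:
// Sketch: extend the typed model with the "main" and "wind" objects.
static class Weather_2_5 {
    List<Weather> weather;
    Main main;
    Wind wind;
}

static class Main {
    Double temp;
    Integer pressure;
    Integer humidity;
    Double temp_min;
    Double temp_max;
}

static class Wind {
    Double speed;
    Double deg;
}

// Usage, assuming the same "result" JSON string as above:
// Weather_2_5 obj = new Gson().fromJson(result, Weather_2_5.class);
// System.out.println("Current Temperature: " + obj.main.temp);
// System.out.println("Wind Speed: " + obj.wind.speed);
// System.out.println("Description: " + obj.weather.get(0).description);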
Instead of writing it all from the beginning you could use:
https://github.com/xSAVIKx/openweathermap-java-api
I think this example is closest to what you are trying to do:
https://github.com/xSAVIKx/openweathermap-java-api/blob/development/api-examples/src/main/java/org/openweathermap/api/example/CurrentWeatherOneLocationExample.java
I'm creating a Spring application on the backend, and my main goal is to manage properties (add/update/delete) in a *.properties file. I want to convert this file to JSON and then manipulate it from a UI application.
Is there any way to convert a structure like this:
a.x=1
a.y=2
b.z=3
To JSON like this:
{
  "a": {
    "x": 1,
    "y": 2
  },
  "b": {
    "z": 3
  }
}
I found a solution using the GSON library, but it creates a flat structure for me, not a hierarchical one. The code I used:
Properties props = new Properties();
try (FileInputStream in = new FileInputStream(classPathResource.getFile())) {
    props.load(in);
}
String json = new GsonBuilder().enableComplexMapKeySerialization().create().toJson(props);
Is there someone here who has faced the same problem and maybe found a working approach for this? Maybe the GSON library can do that?
This solution does involve a fair amount of work, but you will get what you want to achieve with the code below. Basically, the idea is to split each key on the dot and create a nested JSONObject whenever the same first key segment is found.
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.Iterator;
import java.util.Map.Entry;
import java.util.Properties;

import org.json.JSONObject;

public class SOTest {

    public static void main(String[] args) throws IOException {
        JSONObject jsonObject = new JSONObject();
        FileReader fileReader = new FileReader(new File("C:\\Usrc\\main\\java\\Sample.properties"));
        Properties properties = new Properties();
        properties.load(fileReader);

        Iterator<Entry<Object, Object>> iterator = properties.entrySet().iterator();
        while (iterator.hasNext()) {
            Entry<Object, Object> entry = iterator.next();
            String key = (String) entry.getKey();
            String[] parts = key.split("\\.");
            // If an object already exists for the first key segment, add to it,
            // otherwise create a new nested object.
            JSONObject opt = jsonObject.optJSONObject(parts[0]);
            if (opt != null) {
                opt.put(parts[1], entry.getValue());
            } else {
                JSONObject object = new JSONObject();
                object.put(parts[1], entry.getValue());
                jsonObject.put(parts[0], object);
            }
        }
        System.out.println(jsonObject.toString());
    }
}
Output
{"a":{"x":"1","y":"3"},"b":{"z":"10"}}
I am trying to retrieve map content via a WFS GeoServer connection in Java with GeoTools 18.4, but I am getting the following error: Content type is required for org.geotools.data.ows.Response.
The idea is that I want to map features (position and heart rate of a running person) of a WFS layer with the Java Processing library.
I would be very grateful if someone could help me with this error.
Here is the code:
import java.io.IOException;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

import org.geotools.data.DataStore;
import org.geotools.data.wfs.WFSDataStoreFactory;

public class Heartrate2 {

    public static void main(String[] args) throws IOException {
        Heartrate2 me = new Heartrate2();
        DataStore ds = me.dataStoreWFS();
        for (String n : ds.getTypeNames()) {
            System.out.println(n);
        }
    }

    public DataStore dataStoreWFS() {
        DataStore dataStore = null;
        try {
            Map<String, Serializable> connectionParameters = new HashMap<>();
            String getCapabilities = "http://webgis.regione.sardegna.it/geoserver/ows?service=WFS&request=GetCapabilities";
            String variableCapabilities = "WFSDataStoreFactory:GET_CAPABILITIES_URL";
            connectionParameters.put(variableCapabilities, getCapabilities);
            dataStore = (new WFSDataStoreFactory()).createDataStore(connectionParameters);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return dataStore;
    }
}
I am working with Elasticsearch and came across a plugin called ReadOnlyRest for authentication purposes. To set up this plugin, we need to add users to the Elasticsearch YML file.
So I searched for how to add "key: value" pair data to the YML file using Java, and came across SnakeYAML.
I am able to write the user data from Java.
Java code:
package com.test.elasticsearch;

import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.log4j.Logger;
import org.yaml.snakeyaml.DumperOptions;
import org.yaml.snakeyaml.Yaml;

public class YAMLSample {

    protected static Logger logger = Logger.getLogger(YAMLSample.class);

    final String fileName = "/home/Installation/elasticsearch-2.3.1/config/elasticsearch.yml";

    public void writeToYML() throws IOException {
        logger.debug("Write to YML");

        Map<String, Object> data = new HashMap<String, Object>();
        data.put("name", "user5");
        data.put("type", "allow");
        data.put("auth_key", "user5:user5");
        data.put("kibana_access", "ro");
        data.put("indices", new String[] { ".kibana*", "abc", "def" });

        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
        list.add(data);

        DumperOptions options = new DumperOptions();
        options.setIndent(5);

        Yaml yaml = new Yaml(options);
        FileWriter writer = new FileWriter(fileName, true);
        yaml.dump(list, writer);
        logger.debug("DONE!");
    }

    public static void main(String[] args) throws IOException {
        // new YAMLSample().readYML();
        new YAMLSample().writeToYML();
    }
}
Output from the above code is:
- name: user5
indices: [.kibana*, abc, def]
kibana_access: ro
type: allow
auth_key: user5:user5
But the expected output is:
    - name: user5
      indices: [.kibana*, abc, def]
      kibana_access: ro
      type: allow
      auth_key: user5:user5
the "Hyphen-minus" should have just one space and before the "Hyphen-minus" there should be 4 spaces.
I mean to say i am expecting this to appear as Array of Users. else than "Hyphen-minus" then few spaces.
Please assist me with finding out the solution.
I've modified your code and got the expected result. Below is what the code looks like:
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.yaml.snakeyaml.DumperOptions;
import org.yaml.snakeyaml.Yaml;

public class YAMLSample {

    final String fileName = "/tmp/rest.yml";

    public void writeToYML() throws IOException {
        log("Write to YML");

        Map<String, Object> user = new HashMap<>();
        user.put("name", "user5");
        user.put("type", "allow");
        user.put("auth_key", "user5:user5");
        user.put("kibana_access", "ro");
        user.put("indices", new String[] { ".kibana*", "abc", "def" });

        Map<String, Object> user2 = new HashMap<>();
        user2.put("name", "user2");
        user2.put("type", "allow");
        user2.put("auth_key", "user2:user2");
        user2.put("kibana_access", "ro");
        user2.put("indices", new String[] { ".kibana*", "abc", "def" });

        List<Map<String, Object>> list = new ArrayList<>();
        list.add(user);
        list.add(user2);

        // Wrap the user list under a top-level key so it is emitted as a nested sequence.
        Map<String, List<Map<String, Object>>> config = new HashMap<>();
        config.put("access_control_rules", list);

        DumperOptions options = new DumperOptions();
        options.setIndent(6);
        options.setIndicatorIndent(4);
        options.setDefaultFlowStyle(DumperOptions.FlowStyle.AUTO);

        Yaml yaml = new Yaml(options);
        FileWriter writer = new FileWriter(fileName, true);
        yaml.dump(config, writer);
        log("DONE!");
    }

    public static void main(String[] args) throws IOException {
        new YAMLSample().writeToYML();
    }

    public void log(String str) {
        System.out.println(str);
    }
}
Basically, I added these two options to your Dumper:
options.setIndicatorIndent(4);
options.setDefaultFlowStyle(DumperOptions.FlowStyle.AUTO);
and updated options.setIndent(5) to options.setIndent(6).
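With those options the dump should come out roughly like this (an approximation based on the indicator indent of 4 and indent of 6 set above; the key order may differ because a HashMap is used):
access_control_rules:
    - name: user5
      indices: [.kibana*, abc, def]
      kibana_access: ro
      type: allow
      auth_key: user5:user5
    - name: user2
      indices: [.kibana*, abc, def]
      kibana_access: ro
      type: allow
      auth_key: user2:user2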
I am working on a Spark-based Kafka consumer that reads data in Avro format.
Following is the try/catch code that reads and processes the input.
import java.util.*;
import java.io.*;
import com.twitter.bijection.Injection;
import com.twitter.bijection.avro.GenericAvroCodecs;
import kafka.serializer.StringDecoder;
import kafka.serializer.DefaultDecoder;
import scala.Tuple2;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import kafka.producer.KeyedMessage;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.api.java.function.*;
import org.apache.spark.streaming.api.java.*;
import org.apache.spark.streaming.kafka.KafkaUtils;
import org.apache.spark.streaming.Durations;

public class myKafkaConsumer {

    private String[] topics;
    private SparkConf sparkConf;
    private JavaStreamingContext jssc;

    public static final String USER_SCHEMA = "{"
            + "\"type\":\"record\","
            + "\"name\":\"myrecord\","
            + "\"fields\":["
            + " { \"name\":\"str1\", \"type\":\"string\" },"
            + " { \"name\":\"int1\", \"type\":\"int\" }"
            + "]}";

    /**
     * Main function, entry point to the program.
     * @param args takes the user-ids as the parameters, which will be treated as topics in our case.
     */
    public static void main(String[] args) {
        if (args.length < 1) {
            System.err.println("Usage : myKafkaConsumer <topics/user-id>");
            System.exit(1);
        }
        myKafkaConsumer bKC = new myKafkaConsumer(args);
        bKC.run();
    }

    /**
     * Constructor
     */
    private myKafkaConsumer(String[] topics) {
        this.topics = topics;
        sparkConf = new SparkConf();
        sparkConf = sparkConf.setAppName("JavaDirectKafkaFilterMessages");
        jssc = new JavaStreamingContext(sparkConf, Durations.seconds(2));
    }

    /**
     * run function, runs the entire program.
     * @return void
     */
    private void run() {
        HashSet<String> topicSet = new HashSet<String>();
        for (String topic : topics) {
            topicSet.add(topic);
            System.out.println(topic);
        }

        HashMap<String, String> kafkaParams = new HashMap<String, String>();
        kafkaParams.put("metadata.broker.list", "128.208.244.3:9092");
        kafkaParams.put("auto.offset.reset", "smallest");

        try {
            JavaPairInputDStream<String, byte[]> messages = KafkaUtils.createDirectStream(
                    jssc,
                    String.class,
                    byte[].class,
                    StringDecoder.class,
                    DefaultDecoder.class,
                    kafkaParams,
                    topicSet
            );

            JavaDStream<String> avroRows = messages.map(new Function<Tuple2<String, byte[]>, String>() {
                public String call(Tuple2<String, byte[]> tuple2) {
                    return testFunction(tuple2._2().toString());
                }
            });

            avroRows.print();
            jssc.start();
            jssc.awaitTermination();
        } catch (Exception E) {
            System.out.println(E.toString());
            E.printStackTrace();
        }
    }

    private static String testFunction(String str) {
        System.out.println("Input String : " + str);
        return "Success";
    }
}
The code compiles correctly; however, when I try to run it on a Spark cluster I get a Task not serializable error. I tried removing the function and simply printing some text, but the error persists.
P.S. I have checked by printing the messages and found that they are read correctly.
The print statement collects the records of your RDDs to the driver in order to print them on the screen. Such a task triggers serialization/deserialization of your data.
In order for your code to work, the records in the avroRows DStream must be of a serializable type.
For example, it should work if you replace the avroRows definition by this :
JavaDStream<String> avroRows = messages.map(new Function<Tuple2<String, byte[]>, String>(){
public String call(Tuple2<String, byte[]> tuple2){
return tuple2._2().toString();
}
});
I just added a toString to your records because the String type is serializable (of course, it is not necessarily what you need, it is just an example).
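If you then want the actual Avro fields rather than the byte array's toString(), one option is to decode the payload inside the map function using the Bijection classes the question already imports. The following is only a sketch, assuming the USER_SCHEMA defined in the question:
// Sketch: decode the Avro bytes into a GenericRecord and build a plain (serializable) String from it.
// The schema and injection are created inside call() so that nothing non-serializable is captured.
JavaDStream<String> avroRows = messages.map(new Function<Tuple2<String, byte[]>, String>() {
    public String call(Tuple2<String, byte[]> tuple2) {
        Schema schema = new Schema.Parser().parse(USER_SCHEMA);
        Injection<GenericRecord, byte[]> recordInjection = GenericAvroCodecs.toBinary(schema);
        GenericRecord record = recordInjection.invert(tuple2._2()).get();
        // "str1" and "int1" are the field names from USER_SCHEMA.
        return record.get("str1") + "," + record.get("int1");
    }
});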