Java: Writing multiple HashMaps to a file

I am trying to write multiple (12, specifically) HashMap dictionaries to a local file and then retrieve them. So far I have managed to do one, but when I try to do more than one I cannot make it work, so any help is appreciated. Here is my code so far:
private HashMap<String, List<String>> loadDict() {
    int month = Functions.getMonth();
    //load back in
    FileInputStream fis;
    try {
        fis = new FileInputStream(statsFile);
        ObjectInputStream ois = new ObjectInputStream(fis);
        for (int i = 0; i < 13; i++) {
            //itemsDict = (HashMap) ois.readObject();
            Object whatisThis = (Object) ois.readObject();
            dictionaries.add(whatisThis);
        }
        ois.close();
        fis.close();
    } catch (FileNotFoundException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    } catch (IOException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    } catch (ClassNotFoundException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    }
    itemsDict = (HashMap) dictionaries.get(month);
    System.out.println(itemsDict.get("cake"));
    return itemsDict;
}
private void setupDictionaries() {
    HashMap<String, List<String>> dictionary = new HashMap<String, List<String>>();
    FileOutputStream fos;
    try {
        fos = new FileOutputStream(statsFile);
        ObjectOutputStream oos = new ObjectOutputStream(fos);
        for (int i = 0; i < 13; i++) {
            oos.writeObject(dictionary);
        }
        oos.close();
        fos.close();
    } catch (FileNotFoundException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
private void storeThis(String product, String price, String quantity, String date, List<List<String>> myContent) {
    //after set, replace dictionary in dictionaries array
    dictionaries.set(Functions.getMonth(), itemsDict);
    //save the dictionary to the overall statistics file
    FileOutputStream fos;
    try {
        fos = new FileOutputStream(statsFile);
        ObjectOutputStream oos = new ObjectOutputStream(fos);
        for (int i = 0; i < 13; i++) {
            oos.writeObject(dictionaries.get(i));
        }
        //oos.writeObject(itemsDict);
        oos.close();
        fos.close();
    } catch (FileNotFoundException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
A bit of clarification: setupDictionaries() is only called on the first run (to set up the file); otherwise loadDict() is called at runtime to load all the dictionaries into an ArrayList. From the ArrayList, the correct object (HashMap) should be chosen and then cast to a HashMap and assigned to itemsDict.
storeThis() is called when a button is pressed; I have cut the code down to only the relevant bits.
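For reference, a minimal sketch of that flow with plain Java serialization, writing and reading the whole list in a single call (statsFile, the month index, and the method names here are assumptions based on the code above, not code from the question):
private void saveDictionaries(ArrayList<HashMap<String, List<String>>> dicts) throws IOException {
    // one writeObject call covers all 13 slots
    try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(statsFile))) {
        oos.writeObject(dicts);
    }
}

@SuppressWarnings("unchecked")
private HashMap<String, List<String>> loadMonthDict(int month) throws IOException, ClassNotFoundException {
    // read the whole list back once, then pick the requested month
    try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(statsFile))) {
        ArrayList<HashMap<String, List<String>>> dicts =
                (ArrayList<HashMap<String, List<String>>>) ois.readObject();
        return dicts.get(month);
    }
}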
So I am trying to implement the JSON approach you have suggested; so far I've got:
private void setupDictionaries() {
    ObjectMapper mapper = new ObjectMapper();
    ArrayNode arrayNode = mapper.createArrayNode();
    JsonNode rootNode = mapper.createObjectNode();
    ArrayList<String> myThing = new ArrayList<String>();
    myThing.add("hi");
    myThing.add(".");
    itemsDict.put("cake", myThing);
    JsonNode childNode1 = mapper.valueToTree(itemsDict);
    ((ObjectNode) rootNode).set("Jan", childNode1);
    JsonNode childNode2 = mapper.createObjectNode();
    ((ObjectNode) rootNode).set("obj2", childNode2);
    JsonNode childNode3 = mapper.createObjectNode();
    ((ObjectNode) rootNode).set("obj3", childNode3);
    String jsonString;
    try {
        jsonString = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(rootNode);
        System.out.println(jsonString);
        ObjectWriter writer = mapper.writer(new DefaultPrettyPrinter());
        writer.writeValue(new File(statsFile), jsonString);
    } catch (JsonProcessingException e2) {
        // TODO Auto-generated catch block
        e2.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Question: how would I be able to load this back (only everything under "Jan", for example, into a HashMap)?
private HashMap<String, List<String>> loadDict() {
    ObjectMapper mapper = new ObjectMapper();
    try {
        HashMap<String, ArrayList<String>> map = mapper.readValue(new File(statsFile),
                new TypeReference<HashMap<String, ArrayList<String>>>() {});
        System.out.println(map.get("Jan"));
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
With this code I should be able to load it; however, I get this exception (because I have multiple HashMaps within the JSON):
JsonMappingException: Can not construct instance of java.util.HashMap: no String-argument constructor/factory method to deserialize from String value
(I don't know how to put exceptions here)
My JSON:
"{\r\n \"Jan\" : {\r\n \"cake\" : [ \"hi\", \".\" ]\r\n },\r\n \"obj2\" : { },\r\n \"obj3\" : { }\r\n}"
So how would I be able to load only a specific month into a HashMap?
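One detail worth noting: the saved JSON above is itself wrapped in quotes with escaped \r\n, which suggests the pretty-printed String was serialised a second time. A small sketch of writing the tree directly instead, reusing the mapper and rootNode from the code above:
// write the JSON tree itself rather than the already-serialised String
mapper.writerWithDefaultPrettyPrinter().writeValue(new File(statsFile), rootNode);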

I would definitely use the JSON format; since it is plain text, it gives you the freedom to edit the file with an editor.
I suggest using the Jackson library.
You just have to create an ObjectMapper and use it to serialise and deserialise the JSON. Reading the documentation, I see you can also read and write JSON files.
ObjectMapper objectMapper = new ObjectMapper();
For example, this line converts a JSON String into a Map:
Map<String, Object> map = objectMapper.readValue(json, new TypeReference<Map<String, Object>>() {});
And you can convert a map into JSON even more easily:
objectMapper.writeValueAsString(map)
The rest of your problem remains reading and writing the files.
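As a sketch of loading just one month back into a HashMap, assuming the file contains the JSON object itself (i.e. rootNode was written with writer.writeValue(new File(statsFile), rootNode) rather than the pretty-printed String) and using the same TypeReference as in the question's loadDict():
private HashMap<String, List<String>> loadMonth(String month) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    JsonNode root = mapper.readTree(new File(statsFile));   // whole file as a tree
    JsonNode monthNode = root.get(month);                   // e.g. "Jan"
    return mapper.convertValue(monthNode,
            new TypeReference<HashMap<String, List<String>>>() {});
}
Then loadMonth("Jan").get("cake") would return the list saved under "cake".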

When you've got multiple independently growing lists (or the equivalent) and are trying to store them within a single file, the question is how you want to handle overlap.
What also matters is how frequently you write versus how frequently the data is read. If it's mostly reads and long pauses are fine, then go for the JSON format (every time an edit is made you have to rewrite the whole JSON, and reads will have to wait until the operation is complete).
However, if reads and writes are roughly equal in measure, then I think you'll need to consider splitting the data into sequential sections, similar to what a database might do.
===============================================================================
For example :
meta-data + 1*1024 bytes of 01st map
meta-data + 1*1024 bytes of 02nd map
meta-data + 1*1024 bytes of 03rd map
..
meta-data + 1*1024 bytes of 12th map
meta-data + 2*1024 bytes of 01st map
meta-data + 2*1024 bytes of 02nd map
..
meta-data + 2*1024 bytes of 12th map
meta-data + 3*1024 bytes of 01st map
...
and so on..
The meta-data will tell you whether to continue to the next section for any given map's data.
===============================================================================
You would also have to consider things like whether you are using a hard disk (sequential access) or an SSD (random access) and then decide which approach you want to go with.
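To make the sectioning idea concrete, here is a rough sketch of reading one map's data back from such a file; the 5-byte header (a continuation flag plus a payload length), the 1024-byte payload slots, and the class and method names are all hypothetical, not taken from the question:
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;

public class SectionedStatsFile {
    static final int PAYLOAD = 1024;          // fixed payload slot per section
    static final int HEADER = 5;              // 1-byte continuation flag + 4-byte used length
    static final int SECTION = HEADER + PAYLOAD;
    static final int MAPS = 12;               // sections for the 12 maps are interleaved in order

    // Collects the payload bytes of map mapIndex (0-11) by hopping across its sections.
    static byte[] readMap(RandomAccessFile raf, int mapIndex) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long offset = (long) mapIndex * SECTION;      // first section of this map
        boolean more = true;
        while (more && offset + HEADER <= raf.length()) {
            raf.seek(offset);
            more = raf.readBoolean();                 // meta-data: does another section follow?
            int used = raf.readInt();                 // how many payload bytes are valid
            byte[] buf = new byte[used];
            raf.readFully(buf);
            out.write(buf);
            offset += (long) MAPS * SECTION;          // jump to this map's next section
        }
        return out.toByteArray();                     // e.g. the serialised bytes of one month's map
    }
}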

Related

Reading the first record from an array of JSON records in a JSON file in Java

How do I get the first record from a JSON file that contains an array of JSON records?
Sample File:
[
{"l7ProtocolID":"dhcp","packets_out":1,"bytes_out":400,"start_time":1454281199898,"flow_sample":0,"duration":102,"path":["base","ip","udp","dhcp"],"bytes_in":1200,"l4":[{"client":"68","server":"67","level":0}],"l2":[{"client":"52:54:00:50:04:B2","server":"FF:FF:FF:FF:FF:FF","level":0}],"l3":[{"client":"::ffff:0.0.0.0","server":"::ffff:255.255.255.255","level":0}],"flow_id":"81454281200000731489","applicationID":"dhcp","packets_in":1}
{"l7ProtocolID":"dhcp","packets_out":1,"bytes_out":400,"start_time":1454281199898,"flow_sample":0,"duration":102,"path":["base","ip","udp","dhcp"],"bytes_in":1200,"l4":[{"client":"68","server":"67","level":0}],"l2":[{"client":"52:54:00:50:04:B2","server":"FF:FF:FF:FF:FF:FF","level":0}],"l3":[{"client":"::ffff:0.0.0.0","server":"::ffff:255.255.255.255","level":0}],"flow_id":"81454281200000731489","applicationID":"dhcp","packets_in":1}
Record n.....
]
And similarly there might be 1000+ records in the file.
I want to fetch the first record out of this file.
The code below doesn't load the whole file into memory as a String, although the whole array will still end up in memory. For example, Gson loads about 10 KB of file bytes into a buffer at a time, parses each row, and adds it to the array; but all 1000 objects will be on the heap in the array.
Partial streaming (suitable for most cases)
public static void readDom() {
    BufferedReader reader = null;
    try {
        reader = new BufferedReader(new FileReader(file));
        Gson gson = new GsonBuilder().create();
        Person[] people = gson.fromJson(reader, Person[].class);
        System.out.println("Object mode: " + people[0]);
    } catch (FileNotFoundException ex) {
        ...
    } finally {
        ...
    }
}
The code above is more efficient than the code below:
One-shot read (only for small files)
String fileContents = FileUtils.readAsString(file);
Person[] persons = gson.fromJson(fileContents, Person[].class);
The first approach could be okay for up to 5k-10k rows at a time, but beyond 10k even the first approach may not be great.
The third option is the best for large data: iterate, read one row at a time, and stop whenever you want.
True streaming
public static void readStream() {
    try {
        JsonReader reader = new JsonReader(new InputStreamReader(stream, "UTF-8"));
        Gson gson = new GsonBuilder().create();
        // Read file in stream mode
        reader.beginArray();
        while (reader.hasNext()) {
            // Read data into object model
            Person person = gson.fromJson(reader, Person.class);
            if (person.getId() == 0) {
                System.out.println("Stream mode: " + person);
            }
            break;
        }
        reader.close();
    } catch (UnsupportedEncodingException ex) {
        ...
    } catch (IOException ex) {
        ...
    }
}
Source: Reading JSON as Stream using GSON
Dealing with JSON parsing without matching POJO structures
If you don't want to take the trouble of creating a matching POJO object graph structure, you could just instruct GSON to treat each row as a HashMap.
Type type = new TypeToken<Map<String, Object>>(){}.getType();
Map<String, Object> thisRow = gson.fromJson(reader, type);
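A compact sketch combining the streaming JsonReader with that TypeToken, so the first row comes back as a Map without needing a Person class ("records.json" is a placeholder path and "flow_id" is a key taken from the sample file above):
public static Map<String, Object> readFirstRow() throws IOException {
    Gson gson = new Gson();
    Type rowType = new TypeToken<Map<String, Object>>() {}.getType();
    try (JsonReader reader = new JsonReader(new FileReader("records.json"))) {
        reader.beginArray();                                  // step into the top-level array
        Map<String, Object> firstRow = gson.fromJson(reader, rowType);
        return firstRow;                                      // e.g. firstRow.get("flow_id")
    }
}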
I got the solution using the org.json.simple library:
public String ReadJsonFromFile(String fileName) {
    JSONParser parser = new JSONParser();
    String firstRecord = null;
    try {
        JSONArray jsonArray = (JSONArray) parser.parse(new FileReader(fileName));
        JSONObject jsonObject = (JSONObject) jsonArray.get(0);
        firstRecord = jsonObject.toString();
    } catch (FileNotFoundException e) {
        LOG.info("JSON -> Can't read from file: File Not Found");
        e.printStackTrace();
    } catch (IOException e) {
        LOG.info("JSON -> Can't read File : IO Exception");
        e.printStackTrace();
    } catch (ParseException e) {
        LOG.info("JSON -> Can't Parse JSON in File");
        e.printStackTrace();
    }
    return firstRecord;
}

Writing JSON object to file writes unwanted character

I'm trying to write a JSON object, which consists of a JSONArray, to a file. Before writing this object to a TXT file I check each structure by printing it to the console. For example I get this output:
{"splittedHostEnteredString":["cześć.","w"],"foreignLanguage":"cześć","historyToWord":"cześć. w","mainLanguage":"jak","country":0}
In the list I have, for example, 10 objects like the one above, and each of them prints correctly. The problem is when I try to write the ArrayList which holds all the objects as strings. The file is written, but afterwards there are additional "\" characters throughout the structure. I convert each object to a string using the GSON library. Below is the output after saving the file:
{"names":["{\"splittedHostEnteredString\":[\"cześć.\",\"każdym\"],\"foreignLanguage\":\"jak\",\"historyToWord\":\"cześć. w każdym razie\",\"mainLanguage\":\"z\",\"country\":0}","{\"splittedHostEnteredString\":[\"cześć.\",\"w\"],\"foreignLanguage\":\"cześć\",\"historyToWord\":\"cześć. w\",\"mainLanguage\":\"jak\",\"country\":0}"]}
This is my code:
public void saveListOfExistingObjectsInApplicationToOneFileJson()
{
    String filename = "configuration.txt";
    File myFile = new File(Environment.getExternalStorageDirectory(), filename);
    try {
        myFile.createNewFile();
        JSONObject obj = new JSONObject();
        try {
            JSONArray array = new JSONArray();
            for (int i = 0; i < listOfSavedFiles.size(); i++)
            {
                array.put(listOfSavedFiles.get(i));
            }
            obj.put("names", array);
            System.out.println(obj.toString());
            String tmpStringToWrite = obj.toString();
            FileOutputStream stream = new FileOutputStream(myFile);
            try {
                stream.write(tmpStringToWrite.getBytes());
            } finally {
                stream.close();
            }
        } catch (JSONException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        }
    } catch (IOException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    }
}
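The backslashes appear because each element of listOfSavedFiles is already a JSON string (produced by Gson), and JSONArray.put(String) escapes it again as a plain string value. One way around it, sketched under that assumption, is to parse each string back into a JSONObject before adding it (this belongs inside the existing try/catch for JSONException):
JSONArray array = new JSONArray();
for (String savedJson : listOfSavedFiles) {
    // embed as a nested object rather than an escaped string value
    array.put(new JSONObject(savedJson));
}
obj.put("names", array);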

How do I print the next element in a linked list to a CSV file?

I'm making an address book and my program is supposed to save each element in a list to a CSV file. I've gotten everything to work aside from the fact that it will only save one line to the file.
public static void save() {
    PrintWriter writer = null;
    try {
        writer = new PrintWriter("C:\\Users\\Remixt\\workspace\\2\\AddressBook.csv", "UTF-8");
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
        System.exit(0);
    } catch (UnsupportedEncodingException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
        System.exit(0);
    }
    writer.println(AddressBook.get(getListSize()-1)+"\n");
    writer.close(); //saves file
}
Edit: It will only save the last element to the file. It only shows one thing in the file no matter how many times I add something else to the list.
The problem is here:
writer.println(AddressBook.get(getListSize()-1)+"\n");
You only ever write the last element of AddressBook to the CSV file; use a for loop instead.
The following is a sample:
for (int i = 0; i < AddressBook.size(); i++) {
    writer.println(AddressBook.get(i)+"\n");
}
Lastly, you should write the file in append mode:
filename=new FileWriter("printWriter.txt",true);
writer=new java.io.PrintWriter(filename);
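Putting the two suggestions together, a compact sketch of the save method, assuming AddressBook is a list of printable entries (the path is a placeholder, and try-with-resources takes care of closing the writer):
public static void save() {
    // second FileWriter argument = true opens the file in append mode
    try (PrintWriter writer = new PrintWriter(new FileWriter("AddressBook.csv", true))) {
        for (int i = 0; i < AddressBook.size(); i++) {
            writer.println(AddressBook.get(i));
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}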

To sort the contents of a file using a Comparator in Java

I have added the contents of a file to an ArrayList, and now I need to sort those contents and then put them back into the file. How can I do it?
package training;
import java.io.*;
import java.util.Collections;
/* method to create a file, store the contents of the list and read them back */
public class FileCreation implements Serializable {
    public void filesPatient(Person p) throws ClassNotFoundException {
        String Filename = "PatientDetails.txt";
        try {
            ObjectOutputStream os = new ObjectOutputStream(
                    new FileOutputStream(Filename));
            os.writeObject(p);
            os.close();
        } catch (Exception e) {
        }
        System.out.println("Done wRiting");
        try {
            ObjectInputStream is = new ObjectInputStream(new FileInputStream(
                    Filename));
            Person l = (Person) is.readObject();
            System.out.println("*****************Patient Details*************");
            System.out.println("Name : " + l.getName() + "\nID: " + l.getId()
                    + "\nGender: " + l.getGender());
            System.out.println();
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Please help me find an appropriate method to do it.
You could use the Collections.sort method, if you want to sort them by the default String-value sort:
Collections.sort( list );
Otherwise you can write your own comparator to do your own specific sorting logic. Please check out this post for more examples / explanation:
How to sort a Collection<T>?
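As a small illustration, a custom comparator might look like this (Person and its getName() getter are just assumed here; substitute whichever field you actually want to sort by):
Collections.sort(list, new Comparator<Person>() {
    @Override
    public int compare(Person a, Person b) {
        return a.getName().compareTo(b.getName());
    }
});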
EDIT
I believe you are looking for something like this. Note that in order to sort an ArrayList you will have to load the entire list into memory, sort it, and then write the whole list to the file, so make sure your list is not too long and will fit in memory.
ArrayList<Person> arrList = new ArrayList<Person>(2);
while (more Person data exists) { // Replace this with however you are loading your data
    p = new Person();
    p.getdata();
    arrList.add(p);
}
Collections.sort(arrList); // now your list is sorted
for (Person p : arrList) {
    fc.filesPatient(p); // add all your patients to the file, from your list which is now sorted
}
ArrayList<Person> arrlist = new ArrayList<Person>(2);
p = new Person();
p.getdata();
arrlist.add(p);
fc.filesPatient(p);
p = new Person();
p.getdata();
arrlist.add(p);
fc.filesPatient(p);

Java: Create a KML File and insert elements in existing file

I'm developing an app that reads the GPS EXIF information of photos and writes the tags (lat/lon, ...) to a KML or CSV file.
Creating the files if they don't exist, especially the CSV, is not the problem, but in this case I want to add a new KML placemark to an existing KML file.
So far I have created a method that checks if the file already exists; if not (the if branch), it creates a new one,
and if the file exists, it should add the information (the else branch).
public void createKMLFile() {
    String kmlstart = "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
            "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n";
    String kmlelement = "\t<Placemark>\n" +
            "\t<name>Simple placemark</name>\n" +
            "\t<description>"+name+"</description>\n" +
            "\t<Point>\n" +
            "\t\t<coordinates>"+latlon[1]+","+latlon[0]+","+z+ "</coordinates>\n" +
            "\t</Point>\n" +
            "\t</Placemark>\n";
    String kmlend = "</kml>";
    ArrayList<String> content = new ArrayList<String>();
    //content.add(0,kmlstart);
    //content.add(1,kmlelement);
    //content.add(2,kmlend);
    String kmltest;
    // For inserting a substring (another Placemark)
    //String test = "</kml>";
    //int index = kml.lastIndexOf(test);
    File test = new File(datapath+"/"+name+".kml");
    Writer fwriter;
    if (test.exists() == false) {
        try {
            content.add(0,kmlstart);
            content.add(1,kmlelement);
            content.add(2,kmlend);
            kmltest = content.get(0) + content.get(1) + content.get(2);
            fwriter = new FileWriter(datapath+"/"+name+".kml");
            fwriter.write(kmltest);
            //fwriter.append("HalloHallo", index, kml.length());
            fwriter.flush();
            fwriter.close();
        } catch (IOException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        }
    }
    else {
        kmltest = content.get(0) + content.get(1) + content.get(2);
        StringTokenizer tokenize = new StringTokenizer(kmltest, ">");
        ArrayList<String> append = new ArrayList<String>();
        while (tokenize.hasMoreTokens()) {
            append.add(tokenize.nextToken());
            append.add(1, kmlelement);
            String rewrite = append.toString();
            try {
                fwriter = new FileWriter(datapath+"/"+name+".kml");
                fwriter.write(rewrite);
                fwriter.flush();
                fwriter.close();
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }
}
I don't get any logs in LogCat, but the app stops working when I try to update the existing file... any suggestions?
Thanks in advance.
EDIT: OK, I see that content.add(0, kml...) has to be outside the try block... but that's not the main problem, it seems.
Modifying XML files is best accomplished using a library of some sort. I maintain the XML-manipulation library called JDOM, which is designed to make this sort of manipulation as easy as it can be. Other options are the DOM library (which is already built into the Java runtime, making it much easier to integrate into your program) and SAX (which, in this case, I would not recommend, even though it may be faster). Other external libraries (like JDOM) exist which would also help, like XOM, dom4j, etc. This stackoverflow answer seems relevant: Best XML parser for Java
In JDOM, your code would look something like:
Document doc = null;
Namespace kmlns = Namespace.getNamespace("http://www.opengis.net/kml/2.2");
Element position = new Element("Position", kmlns);
position.addContent(new Element("name", kmlns).setText(positionName));
position.addContent(new Element("desc", kmlns).setText(description));
position.addContent(..... all the XML content needed for the Position ....);
// create the XML Document in memory if the file does not exist
// otherwise read the file from the disk
if (!test.exists()) {
    doc = new Document(new Element("kml", kmlns));
} else {
    SAXBuilder sb = new SAXBuilder();
    doc = sb.build(test);
}
Element root = doc.getRootElement();
// modify the XML as you need
// add Position Element
root.addContent(position);
try {
    fwriter = new FileWriter(datapath+"/"+name+".kml");
    XMLOutputter xout = new XMLOutputter(Format.getPrettyFormat());
    xout.output(doc, fwriter);
    fwriter.flush();
    fwriter.close();
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
EDIT: you ask what's wrong with your actual code.... There are a few things that are contributing to your problems, but you don't show an actual error, or other indication of how the program 'stops working'.
There are bugs in your code which should throw serious exceptions: kmltest = content.get(0) + content.get(1) + content.get(2); should throw an IndexOutOfBoundsException because the content ArrayList is empty (the lines adding values to the ArrayList are commented out....) - but let's assume that they are not....
You never read the file you are changing, so how can you be changing it?
The StringTokenizer delimiter is ">", which is never a good way to parse XML.
You loop through the StringTokenizer on every '>' delimiter, but you never add the token back into the output (i.e. your output is missing a lot of '>' characters).
You add the kmlelement Position content in the place of every '>' character in the document, not just the one that is important.
The FileWriter logic should be outside the loop.... you do not want to modify the file for every token you process.
It's working now, thanks for your input rolfl!
In my program I have implemented the method with the JDOM library, which is much more comfortable; anyhow, here is the working code of my first try if someone is interested.
The output is not in a pretty format, but the KML file works.
public void createKMLFile() {
    String kmlstart = "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
            "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n";
    String kmlelement = "\t<Placemark>\n" +
            "\t<name>Simple placemark</name>\n" +
            "\t<description>"+name+"</description>\n" +
            "\t<Point>\n" +
            "\t\t<coordinates>"+latlon[1]+","+latlon[0]+","+z+ "</coordinates>\n" +
            "\t</Point>\n" +
            "\t</Placemark>\n";
    String kmlend = "</kml>";
    ArrayList<String> content = new ArrayList<String>();
    content.add(0,kmlstart);
    content.add(1,kmlelement);
    content.add(2,kmlend);
    String kmltest = content.get(0) + content.get(1) + content.get(2);
    File testexists = new File(datapath+"/"+name+".kml");
    Writer fwriter;
    if (!testexists.exists()) {
        try {
            fwriter = new FileWriter(datapath+"/"+name+".kml");
            fwriter.write(kmltest);
            fwriter.flush();
            fwriter.close();
        } catch (IOException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        }
    }
    else {
        // loop variable
        String filecontent = "";
        ArrayList<String> newoutput = new ArrayList<String>();
        try {
            BufferedReader in = new BufferedReader(new FileReader(testexists));
            while ((filecontent = in.readLine()) != null)
                newoutput.add(filecontent);
        } catch (FileNotFoundException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        } catch (IOException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        }
        newoutput.add(2,kmlelement);
        String rewrite = "";
        for (String s : newoutput) {
            rewrite += s;
        }
        try {
            fwriter = new FileWriter(datapath+"/"+name+".kml");
            fwriter.write(rewrite);
            fwriter.flush();
            fwriter.close();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
