Reading a text file where the first line has keys and the second line onward has values - Java

Text file format:
FirstName,lastname,role,startdate,emptype
sreedhar,reddy,Admin,20-2-2018,contract
shekar,kumar,Admin,20-2-2018,contract
RAJ,roy,Admin,20-2-2018,contract
somu,reddy,Admin,20-2-2018,contract
sumanth,reddy,Admin,20-2-2018,contract
Question:
How do I read the text file and put it into a Map (key, value)?
The first line has the keys for the map (e.g. FirstName, lastname, etc.).
The second line onwards has the values (e.g. sreedhar, reddy, etc.).
Expected map output: {FirstName: sreedhar, lastname: reddy, role: Admin, startdate: 20-2-2018, emptype: contract}
Can anyone provide Java code that reads the text file and puts it into a map as key-value pairs?

You'll need to specify a different key for the Map as it requires a unique one each time:
A map cannot contain duplicate keys; each key can map to at most one
value.
So you're more than likely going to need a Map of Maps here:
Read in the file:
File file = new File("\\\\share\\path\\to\\file\\text.txt");
Add to scanner:
Scanner input = new Scanner(file);
Read the first line as your "header":
String[] headerArray = input.nextLine().split(",");
Create a Map of Maps:
Map<String, Map<String, String>> myMap = new HashMap<>();
Loop through the rest of the text file, adding to a Map, then adding that Map to the main Map, along with a key (I've used User0, User1...):
int pos = 0;
String user = "User";
while (input.hasNextLine()) {
    Map<String, String> map = new HashMap<>();
    int loop = 0;
    String[] temp = input.nextLine().split(",");
    for (String temp1 : temp) {
        map.put(headerArray[loop], temp1);
        loop++;
    }
    myMap.put(user + " " + pos, map);
    pos++;
}
Once you break it down into steps, it makes life easier.
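Putting those steps together, a minimal runnable sketch could look like this (the file name text.txt and the "User n" keys are just placeholders from the steps above):
import java.io.File;
import java.io.FileNotFoundException;
import java.util.HashMap;
import java.util.Map;
import java.util.Scanner;

public class FileToMapOfMaps {
    public static void main(String[] args) throws FileNotFoundException {
        // adjust the path to wherever your text file lives
        File file = new File("text.txt");
        Scanner input = new Scanner(file);

        // first line holds the column names
        String[] headerArray = input.nextLine().split(",");

        // one inner map per data row, keyed by a generated "User n" key
        Map<String, Map<String, String>> myMap = new HashMap<>();
        int pos = 0;
        while (input.hasNextLine()) {
            String[] values = input.nextLine().split(",");
            Map<String, String> row = new HashMap<>();
            for (int i = 0; i < values.length; i++) {
                row.put(headerArray[i], values[i]);
            }
            myMap.put("User " + pos, row);
            pos++;
        }
        input.close();
        System.out.println(myMap);
    }
}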

You can do something like this -
BufferedReader br = new BufferedReader(new FileReader("file.txt"));
String headerLine = br.readLine();
List<String> headerList = Arrays.asList(headerLine.split(","));
List<List<String>> valueListList = new ArrayList<List<String>>();
String valueLine;
while ((valueLine = br.readLine()) != null) {
    List<String> valueList = Arrays.asList(valueLine.split(","));
    valueListList.add(valueList);
}
br.close();
// build one list of column values per header entry
Map<String, List<String>> map = new HashMap<String, List<String>>();
for (int i = 0; i < headerList.size(); i++) {
    List<String> tempList = new ArrayList<String>();
    for (int j = 0; j < valueListList.size(); j++) {
        tempList.add(valueListList.get(j).get(i));
    }
    map.put(headerList.get(i), tempList);
}
System.out.println(map);
Output:
{emptype=[contract, contract, contract, contract, contract],
startdate=[20-2-2018, 20-2-2018, 20-2-2018, 20-2-2018, 20-2-2018],
role=[Admin, Admin, Admin, Admin, Admin],
lastname=[reddy, kumar, roy, reddy, reddy],
FirstName=[sreedhar, shekar, RAJ, somu, sumanth]}
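If you instead want one map per data row, shaped like the expected output in the question, you can reuse headerList and valueListList from the code above; a small sketch, using a LinkedHashMap so the header order is preserved:
// builds one map per data row instead of one list per column
List<Map<String, String>> rows = new ArrayList<>();
for (List<String> values : valueListList) {
    Map<String, String> row = new LinkedHashMap<>(); // keeps the header order
    for (int i = 0; i < headerList.size(); i++) {
        row.put(headerList.get(i), values.get(i));
    }
    rows.add(row);
}
// first row: {FirstName=sreedhar, lastname=reddy, role=Admin, startdate=20-2-2018, emptype=contract}
System.out.println(rows.get(0));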

Related

How to create a TreeMap with <Integer, List<Integer>> from a two-column txt file

I am reading a txt file having two columns where each element is an integer. I'd like to use the elements in the first column as a key, and the elements in the second column as my value.
I am sharing just a small portion of my data set.
0 1
0 2
0 3
0 4
1 2
1 3
1 0
Scanner scanner = new Scanner(new FileReader(DATADIR+"data.txt"));
TreeMap<Integer, List<Integer>> myMap = new TreeMap<Integer, List<Integer>>();
while (scanner.hasNextLine()) {
String[] line= scanner.nextLine().split("\t");
}
Now, what I need is a structure where, when I look up 0, I get <1, 2, 3, 4>.
You should check whether the key already exists in the map and add accordingly.
Sample code:
final Scanner scanner = new Scanner(new FileReader(DATADIR + "data.txt"));
final TreeMap<Integer, List<Integer>> myMap = new TreeMap<Integer, List<Integer>>();
while (scanner.hasNextLine()) {
    final String[] line = scanner.nextLine().split("\t");
    final Integer key = Integer.parseInt(line[0]);
    final Integer value = Integer.parseInt(line[1]);
    if (myMap.containsKey(key)) {
        myMap.get(key).add(value);
    } else {
        final List<Integer> valueList = new LinkedList<>();
        valueList.add(value);
        myMap.put(key, valueList);
    }
}
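With the sample rows above, printing the map afterwards should show every value collected under its key:
System.out.println(myMap); // {0=[1, 2, 3, 4], 1=[2, 3, 0]}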
You can try this:
TreeMap<Integer, List<Integer>> myMap = new TreeMap<Integer, List<Integer>>();
while (scanner.hasNextLine()) {
    String[] line = scanner.nextLine().split("\\s+");
    myMap.computeIfAbsent(Integer.valueOf(line[0]), k -> new ArrayList<>()).add(Integer.valueOf(line[1]));
}
System.out.println(myMap);
I wouldn't use a Scanner at all for reading a file. Instead, use a modern way of streaming the file content and then handle each line as desired.
In my environment, splitting on "\t" alone simply did not work, which is why I instead split each line on an arbitrary amount of whitespace between the desired values.
See the following minimal example:
public static void main(String[] args) {
    Path filePath = Paths.get(DATADIR).resolve(Paths.get("data.txt"));
    // define the map
    Map<Integer, List<Integer>> map = new TreeMap<>();
    try {
        // stream all the lines read from the file
        Files.lines(filePath).forEach(line -> {
            // split each line by an arbitrary amount of whitespaces
            String[] columnValues = line.split("\\s+");
            // parse the values to int
            int key = Integer.parseInt(columnValues[0]);
            int value = Integer.parseInt(columnValues[1]);
            // and put them into the map,
            // either as new key-value pair or as new value to an existing key
            map.computeIfAbsent(key, k -> new ArrayList<>()).add(value);
        });
    } catch (IOException e) {
        e.printStackTrace();
    }
    // print the map content
    map.forEach((key, value) -> System.out.println(key + " : " + value));
}
You will have to use the following imports along with the ones you have:
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
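One optional refinement: Files.lines returns a Stream backed by an open file handle, so wrapping it in try-with-resources (plus import java.util.stream.Stream) guarantees the file is closed when the streaming is done. A minimal variant of the loop above:
try (Stream<String> lines = Files.lines(filePath)) {
    lines.forEach(line -> {
        String[] columnValues = line.split("\\s+");
        // same parsing and map insertion as above
        map.computeIfAbsent(Integer.parseInt(columnValues[0]), k -> new ArrayList<>())
           .add(Integer.parseInt(columnValues[1]));
    });
} catch (IOException e) {
    e.printStackTrace();
}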

Merge two array lists into a TreeMap in Java

I want to combine these two text files
Driver details text file:
AB11; Angela
AB22; Beatrice
Journeys text file:
AB22,Edinburgh ,6
AB11,Thunderdome,1
AB11,Station,5
And I want my output to be only the names and where the person has been. It should look like this:
Angela
Thunderdome
Station
Beatrice
Edinburgh
Here is my code. I'm not sure what I'm doing wrong, but I'm not getting the right output.
ArrayList<String> names = new ArrayList<String>();
TreeSet<String> destinations = new TreeSet<String>();
public TaxiReader() {
BufferedReader brName = null;
BufferedReader brDest = null;
try {
// Have the buffered readers start to read the text files
brName = new BufferedReader(new FileReader("taxi_details.txt"));
brDest = new BufferedReader(new FileReader("2017_journeys.txt"));
String line = brName.readLine();
String lines = brDest.readLine();
while (line != null && lines != null ){
// The input lines are split on the basis of certain characters that the text files use to split up the fields within them
String name [] = line.split(";");
String destination [] = lines.split(",");
// Add names and destinations to the different arraylists
String x = new String(name[1]);
//names.add(x);
String y = new String (destination[1]);
destinations.add(y);
// add arraylists to treemap
TreeMap <String, TreeSet<String>> taxiDetails = new TreeMap <String, TreeSet<String>> ();
taxiDetails.put(x, destinations);
System.out.println(taxiDetails);
// Reads the next line of the text files
line = brName.readLine();
lines = brDest.readLine();
}
// Catch blocks exist here to catch every potential error
} catch (FileNotFoundException ex) {
ex.printStackTrace();
} catch (IOException ex) {
ex.printStackTrace();
// Finally block exists to close the files and handle any potential exceptions that can happen as a result
} finally {
try {
if (brName != null)
brName.close();
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
public static void main (String [] args){
TaxiReader reader = new TaxiReader();
}
You are reading the two files in parallel; I don't think that's going to work too well. Try reading one file at a time.
Also you might want to rethink your data structures.
The first file relates a key "AB11" to a value "Angela". A map is better than an arraylist:
Map<String, String> names = new HashMap<String, String>();
String key = line.split(";")[0].trim();   // "AB11" (the driver file uses ";" as separator)
String value = line.split(";")[1].trim(); // "Angela"
names.put(key, value);
names.get("AB11"); // "Angela"
Similarly, the second file relates a key "AB11" to multiple values "Thunderdome", "Station". You could also use a map for this:
Map<String, List<String>> destinations = new HashMap<String, List<String>>();
String key = line.split(",")[0];   // "AB11"
String value = line.split(",")[1]; // "Station"
if (destinations.get(key) == null) {
    List<String> values = new LinkedList<String>();
    values.add(value);
    destinations.put(key, values);
} else {
    // we already have a destination value stored for this key
    // add the new destination to the list
    List<String> values = destinations.get(key);
    values.add(value);
}
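On Java 8+ that null check can be collapsed into a single computeIfAbsent call (the same idea as in the TreeMap answer further up):
destinations.computeIfAbsent(key, k -> new LinkedList<>()).add(value);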
To get the output you want:
// for each entry in the names map
for(Map.Entry<String, String> entry : names.entrySet()) {
String key = entry.getKey();
String name = entry.getValue();
// print the name
System.out.println(name);
// use the key to retrieve the list of destinations for this name
List<String> values = destinations.get(key);
for(String destination : values) {
// print each destination with a small indentation
System.out.println(" " + destination);
}
}
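Put together, one possible sketch of the whole reader, reading the files one at a time as suggested (file names taken from the question; TreeMaps keep the driver ids, and therefore the output, in a stable order):
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class TaxiReader {
    public static void main(String[] args) throws IOException {
        // driver id -> driver name, e.g. "AB11" -> "Angela"
        Map<String, String> names = new TreeMap<>();
        try (BufferedReader brName = new BufferedReader(new FileReader("taxi_details.txt"))) {
            String line;
            while ((line = brName.readLine()) != null) {
                String[] parts = line.split(";");
                names.put(parts[0].trim(), parts[1].trim());
            }
        }

        // driver id -> all destinations for that driver
        Map<String, List<String>> destinations = new TreeMap<>();
        try (BufferedReader brDest = new BufferedReader(new FileReader("2017_journeys.txt"))) {
            String line;
            while ((line = brDest.readLine()) != null) {
                String[] parts = line.split(",");
                destinations.computeIfAbsent(parts[0].trim(), k -> new ArrayList<>())
                            .add(parts[1].trim());
            }
        }

        // print each name followed by that driver's destinations
        for (Map.Entry<String, String> entry : names.entrySet()) {
            System.out.println(entry.getValue());
            for (String destination : destinations.getOrDefault(entry.getKey(), Collections.emptyList())) {
                System.out.println("  " + destination);
            }
        }
    }
}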

How to compare and edit two csv files in java depending on one column?

public class CompareCSV {
public static void main(String args[]) throws FileNotFoundException, IOException {
String path = "C:\\csv\\";
String file1 = "file1.csv";
String file2 = "file2.csv";
String file3 = "file3.csv";
ArrayList<String> al1 = new ArrayList<String>();
ArrayList<String> al2 = new ArrayList<String>();
BufferedReader CSVFile1 = new BufferedReader(new FileReader("/C:/Users/bida0916/Desktop/macro.csv"));
String dataRow1 = CSVFile1.readLine();
while (dataRow1 != null) {
String[] dataArray1 = dataRow1.split(",");
for (String item1 : dataArray1) {
al1.add(item1);
}
dataRow1 = CSVFile1.readLine();
}
CSVFile1.close();
BufferedReader CSVFile2 = new BufferedReader(new FileReader("C:/Users/bida0916/Desktop/Deprecated.csv"));
String dataRow2 = CSVFile2.readLine();
while (dataRow2 != null) {
String[] dataArray2 = dataRow2.split(",");
for (String item2 : dataArray2) {
al2.add(item2);
}
dataRow2 = CSVFile2.readLine();
}
CSVFile2.close();
for (String bs : al2) {
al1.remove(bs);
}
int size = al1.size();
System.out.println(size);
try {
FileWriter writer = new FileWriter("C:/Users/bida0916/Desktop/NewMacro.csv");
while (size != 0) {
size--;
writer.append("" + al1.get(size));
writer.append('\n');
}
writer.flush();
writer.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
I want to compare two CSV files in Java and remove from one file every row that appears in the other, comparing on the first column of both files. Currently I am getting a CSV file with only one column, with all the details jumbled up.
You are adding all values of all columns to a single list, that's why you get the mess in your output:
ArrayList<String> al1=new ArrayList<String>();
//...
String[] dataArray1 = dataRow1.split(",");
for (String item1:dataArray1)
{
al1.add(item1);
}
Add the complete string array from your file to your list, then you can access your data in a structured way:
List<String[]> al1 = new ArrayList<>();
//...
String[] dataArray1 = dataRow1.split(",");
al1.add(dataArray1);
But for removal of rows I'd recommend using Maps for faster access, where the key is the element on which you decide which row to delete and the value is the full row from your csv file:
Map<String, String> al1 = new HashMap<>(); // or LinkedHashMap if row order is relevant
//...
String[] dataArray1 = dataRow1.split(",");
al1.put(dataArray1[0], dataRow1);
But be aware that if two rows in a file contain the same value in the first column, only one will be preserved. If that can happen, you might need to adapt the solution to store the data in a Map<String, Set<String>> or Map<String, List<String>>.
At this point I'd also recommend extracting the file reading into a separate method, which you can reuse to read both of your input files and reduce duplicate code:
Map<String, String> al1 = readInputCsvFile(file1);
Map<String, String> al2 = readInputCsvFile(file2);
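A possible shape for that extracted method (readInputCsvFile is just the name used above; assumes the usual java.io and java.util imports and minimal error handling):
private static Map<String, String> readInputCsvFile(String fileName) throws IOException {
    // LinkedHashMap keeps the rows in file order, as mentioned above
    Map<String, String> rowsByFirstColumn = new LinkedHashMap<>();
    try (BufferedReader reader = new BufferedReader(new FileReader(fileName))) {
        String dataRow;
        while ((dataRow = reader.readLine()) != null) {
            String[] dataArray = dataRow.split(",");
            // key = first column, value = the full row as read from the file
            rowsByFirstColumn.put(dataArray[0], dataRow);
        }
    }
    return rowsByFirstColumn;
}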
For the deletion of the lines which shall be removed, iterate over the key set of one of the maps and remove the entry from the other:
for (String key : al2.keySet()) {
al1.remove(key);
}
And for writing your output file, just write the row read from the original file as stored in the 'value' of your map.
for (String dataRow : al1.values()) {
writer.append(dataRow);
writer.append('\n');
}
EDIT
If you need to perform operations based on other data columns you should rather store the 'split-array' in the map instead of the full-line string read from the file. Then you have all data columns separately available:
Map<String, String[]> al2 = new HashMap<>();
//...
String[] dataArray2 = dataRow2.split(",");
al2.put(dataArray2[0], dataArray2);
You might then, e.g. add a condition for deleting:
for (Entry<String, String[]> entry : al2.entrySet()) {
String[] data = entry.getValue();
if ("delete".equals(data[17])) {
al1.remove(entry.getKey());
}
}
For writing your output file you have to rebuild the CSV format.
I'd recommend using Apache commons-lang StringUtils for that task:
for (String[] data : al1.values()) {
writer.append(StringUtils.join(data, ","));
writer.append('\n');
}
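If you'd rather not add a dependency, String.join from the JDK (Java 8+) does the same thing here:
for (String[] data : al1.values()) {
    writer.append(String.join(",", data));
    writer.append('\n');
}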

Java file reading and storing it in a HashMap

I was trying to read a .txt file and wanted to store it in a HashMap(String, List). But when I tried to store the values, they were overwritten with the last value.
String filePath = "D:/liwc_new.txt";
HashMap<String,List<String>> map = new HashMap<String,List<String>>();
String line;
BufferedReader reader = new BufferedReader(new FileReader(filePath));
String key = null;
List<String> value = new ArrayList<String>();
//putting words in key and cat strings in value of map
int count = 0;
String cats[]= null;
value.clear();
while ((line = reader.readLine()) != null)
{
String[] parts = line.split(":", 2);
value.clear();
count++;
key = parts[0].trim();
cats=parts[1].split(", ");
for(int i=0;i<cats.length;i++) {
cats[i]=cats[i].trim();
cats[i]=cats[i].replace("[", "");
cats[i]=cats[i].replace("]", "");
value.add(cats[i]);
}
map.put(key, value);
//map.put(key, value);
}
The line List<String> value = new ArrayList<String>(); should be moved to the first line of your while loop and both calls to clear removed.
The reason they are getting overwritten is that you only ever allocate one list and use it as the value for every key in the map. So every key points to the same list, and the contents of that list are cleared and rebuilt every time a new line is read. So every value will have the contents of whatever was added after the last clear.
On the other hand, if you create a new list with each iteration, each category will have its own list, which is what you want.
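In code, the corrected loop would look roughly like this (same parsing as in the question, only the list allocation moves inside the loop):
while ((line = reader.readLine()) != null) {
    List<String> value = new ArrayList<String>(); // a fresh list for every key
    String[] parts = line.split(":", 2);
    String key = parts[0].trim();
    for (String cat : parts[1].split(", ")) {
        value.add(cat.trim().replace("[", "").replace("]", ""));
    }
    map.put(key, value);
}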
Delete the second occurrence of
value.clear();

How to convert a CSV to a HashMap if there are multiple values for a key? (without using a CSV reader)

Here is a link that covers reading data from a CSV into a HashMap:
Convert CSV values to a HashMap key value pairs in JAVA
However, I am trying to read a CSV file in which there are multiple values for a given key.
Eg:
Key - Value
Fruit - Apple
Fruit - Strawberry
Fruit - Grapefruit
Vegetable - Potatoe
Vegetable - Celery
where Fruit and Vegetable are the keys.
I am using an ArrayList<> to store the values.
The code I am writing is able to store the keys, but stores only the last corresponding value.
So, when I print the hashmap, what I get is: Fruit - [Grapefruit], Vegetable - [Celery]
How can I iterate through the loop and store all the values?
Following is the code which I have written:
public class CsvValueReader {
public static void main(String[] args) throws IOException {
Map<String, ArrayList<String>> mp=null;
try {
String csvFile = "test.csv";
//create BufferedReader to read csv file
BufferedReader br = new BufferedReader(new FileReader(csvFile));
String line = "";
StringTokenizer st = null;
mp= new HashMap<String, ArrayList<String>>();
int lineNumber = 0;
int tokenNumber = 0;
//read comma separated file line by line
while ((line = br.readLine()) != null) {
lineNumber++;
//use comma as token separator
st = new StringTokenizer(line, ",");
while (st.hasMoreTokens()) {
tokenNumber++;
String token_lhs=st.nextToken();
String token_rhs= st.nextToken();
ArrayList<String> arrVal = new ArrayList<String>();
arrVal.add(token_rhs);
mp.put(token_lhs,arrVal);
}
}
System.out.println("Final Hashmap is : "+mp);
} catch (Exception e) {
System.err.println("CSV file cannot be read : " + e);
}
}
}
Currently, you're putting a new ArrayList in your map for each value you find. This replaces the old list you had for that particular key. Instead, you should use the existing array list (if it is already there), and add your value to it.
You should therefore replace this:
ArrayList<String> arrVal = new ArrayList<String>();
arrVal.add(token_rhs);
mp.put(token_lhs,arrVal);
By this:
ArrayList<String> arrVal = mp.get(token_lhs);
if (arrVal == null) {
arrVal = new ArrayList<String>();
mp.put(token_lhs,arrVal);
}
arrVal.add(token_rhs);
You have:
while readLine
    while splitLine
        new ArrayList() and list.add()
        map.put(key, arrayList)
So every time map.put() is executed, a new ArrayList is put into the map, and the value of the existing key is overwritten with the new ArrayList. You first need to get the ArrayList from the map for the given key and append the value to it; if the key doesn't exist, create a new ArrayList.
If you want to save yourself that work, you could consider using a multimap API, e.g. Guava's ArrayListMultimap.
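For illustration, assuming Guava is on the classpath (a third-party dependency), the inner loop could then shrink to roughly this:
// uses com.google.common.collect.Multimap and ArrayListMultimap from Guava
Multimap<String, String> mp = ArrayListMultimap.create();
while (st.hasMoreTokens()) {
    String token_lhs = st.nextToken();
    String token_rhs = st.nextToken();
    mp.put(token_lhs, token_rhs); // appends to the key's collection instead of overwriting
}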
This is because you create a new arrVal list every time.
You should try this code
ArrayList<String> arrVal = mp.get(token_lhs);
if(arrVal == null) {
arrVal = new ArrayList<String>();
mp.put(token_lhs, arrVal);
}
arrVal.add(token_rhs);
It seems that you always initialize a new ArrayList inside your while (st.hasMoreTokens()) loop, so you will only keep the last ArrayList used (containing only the last token of the CSV line).
