I am writing my data to a CSV file in Java. Below is my code:
public static void writeAccountToFile(List<Map<String, Object>> list, String filePath) {
    System.out.println("Write data to csv file start");
    try (Writer writer = new OutputStreamWriter(new FileOutputStream(filePath), "UTF-8")) {
        CsvSchema schema = null;
        CsvSchema.Builder schemaBuilder = CsvSchema.builder();
        if (list != null && !list.isEmpty()) {
            for (String col : list.get(0).keySet()) {
                schemaBuilder.addColumn(col);
            }
            schema = schemaBuilder.build().withLineSeparator("\r").withHeader();
        }
        CsvMapper mapper = new CsvMapper();
        mapper.writer(schema).writeValues(writer).writeAll(list);
        writer.flush();
    } catch (IOException e) {
        e.printStackTrace();
    }
    System.out.println("Write data to csv file end");
}
When I check my result file, the last two lines have no quotes around the accountName values test3 and test4:
accountId,accountName,address
1111,"test1111111",england
2222,"test222222222",tokyo
3333,test3,italy
4444,test4,indo
Here is my input list:
[{accountId=1111, accountName=test1111111, address=england}, {accountId=2222,
accountName=test222222222, address=tokyo}, {accountId=3333, accountName=test3,
address=italy}, {accountId=4444, accountName=test4,
address=indo}]
Here is my code to read the CSV file and assign it to a list:
public static List<Map<String, Object>> loadFileAccount(String filePath) throws Exception {
    List<Map<String, Object>> list = new ArrayList<>();
    removeBom(Paths.get(filePath));
    System.out.println("Load account data from csv start");
    try (Reader reader = new InputStreamReader(new FileInputStream(filePath), "UTF-8")) {
        Iterator<Map<String, Object>> iterator = new CsvMapper()
                .readerFor(Map.class)
                .with(CsvSchema.emptySchema().withHeader())
                .readValues(reader);
        while (iterator.hasNext()) {
            list.add(iterator.next());
        }
    }
    System.out.println("Load account data from csv end");
    return list;
}
What is the error in my code?
You are right that the length matters: when the String is long, Jackson adds the quotes without checking the content.
To keep the output consistent, you can specify whether you want quotes or not using CsvGenerator.Feature.ALWAYS_QUOTE_STRINGS or CsvGenerator.Feature.STRICT_CHECK_FOR_QUOTING.
This will always add double quotes:
CsvMapper mapper = new CsvMapper();
mapper.configure(CsvGenerator.Feature.ALWAYS_QUOTE_STRINGS, true);
mapper.writer(schema).writeValues(writer).writeAll(list);
The other one, STRICT_CHECK_FOR_QUOTING, makes Jackson inspect each value and add quotes only when the value actually needs them.
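For intuition, the quote-only-when-needed behaviour boils down to an RFC 4180-style check on the value's content. Here is a minimal, stdlib-only sketch of that decision rule (an illustration only, not Jackson's actual implementation):

```java
public class QuoteCheck {
    // RFC 4180-style rule: a value needs quoting only if it contains the
    // separator, a double quote, or a line break.
    static String quoteIfNeeded(String value) {
        boolean needsQuoting = value.contains(",") || value.contains("\"")
                || value.contains("\n") || value.contains("\r");
        if (!needsQuoting) {
            return value;
        }
        // Escape embedded quotes by doubling them, then wrap the whole value
        return "\"" + value.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        System.out.println(quoteIfNeeded("test3"));      // test3
        System.out.println(quoteIfNeeded("a,b"));        // "a,b"
        System.out.println(quoteIfNeeded("say \"hi\"")); // "say ""hi"""
    }
}
```

With a rule like this, plain values such as test3 stay unquoted, which is why mixing it with the length-based heuristic produces the inconsistent output shown above.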
Here are my dependencies, which are not third party:
<dependency>
    <groupId>javax.json</groupId>
    <artifactId>javax.json-api</artifactId>
    <version>1.1.4</version>
</dependency>
<dependency>
    <groupId>org.glassfish</groupId>
    <artifactId>javax.json</artifactId>
    <version>1.1.4</version>
</dependency>
JSON array to test with:
[{"name":"aondx","value":10,"date":"1999-01-09T14:30:53Z"}]
I am able to parse the JSON and write it into a new CSV file, but the issue is that the data is not in the right format in my CSV file.
public static void writeFilteredJsonToNewFile(JsonArray jsonArray) {
    try {
        for (Object object : jsonArray) {
            JsonObject obj = (JsonObject) object;
            StringWriter writer = new StringWriter();
            JsonWriter jsonWriter = Json.createWriter(writer);
            jsonWriter.writeObject(obj);
            writer.close();
            FileWriter fileWriter = new FileWriter(diretory + name + ".csv");
            BufferedWriter bufferedWriter = new BufferedWriter(fileWriter);
            bufferedWriter.write(writer.toString());
            bufferedWriter.close();
            System.out.println("Created new File!");
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Result
Expected result (table format) in the CSV file:
name,value,date
aondx,40,1999-01-09T14:30:53Z
If my understanding is correct, you want to convert the JSON into a CSV file, but while writing you are getting a key and value in each row, instead of the keys in the first row and the values in the subsequent rows.
I think if you follow the steps below you will be able to achieve this:
1) First convert your JSON into a List<Map<String, Object>> using:
ObjectMapper mapper = new ObjectMapper();
List<Map<String, Object>> datas = mapper.readValue(jsonArray.toString(),
        new TypeReference<List<Map<String, Object>>>() {});
2) Find the keys of the JSON, which will be used as your columns (assuming all entries have the same keys), and write them as the first row of the CSV file:
Set<String> columns = datas.get(0).keySet();
3) Iterate over the datas list and write the values row by row.
As you mentioned in the comment that you don't want to use an external library, instead of ObjectMapper you can use your JsonArray directly. Rather than converting the JSON array to a List<Map<String, String>> to get the keys, you can do the same thing with JsonObject, since it implements Map. Note that JsonArray.get(0) returns a JsonValue, so use getJsonObject(0) to read it as a JsonObject:
Set<String> columns = jsonArray.getJsonObject(0).keySet();
This code should go before the loop. Inside the loop you are iterating over the jsonArray; read the values there.
Set<String> columns = jsonArray.getJsonObject(0).keySet();
for (Object object : jsonArray) {
    JsonObject obj = (JsonObject) object;
    List<String> dataRows = new ArrayList<>();
    for (String column : columns) {
        // obj.get(column) returns a JsonValue, which cannot be cast to
        // String; toString() gives its JSON representation instead
        String value = obj.get(column).toString();
        dataRows.add(value);
    }
    .......
    // write the values to the file
    .......
}
If you are interested in the full solution, you can try this:
public static void writeFilteredJsonToNewFile(JsonArray jsonArray) {
    List<List<String>> allDatas = new ArrayList<>();
    Set<String> columns = jsonArray.getJsonObject(0).keySet();
    allDatas.add(new ArrayList<>(columns));
    for (Object object : jsonArray) {
        JsonObject obj = (JsonObject) object;
        List<String> dataRows = new ArrayList<>();
        for (String column : columns) {
            dataRows.add(obj.get(column).toString());
        }
        allDatas.add(dataRows);
    }
    writeToCsvFile(allDatas, ",", "test-file.csv");
}
public static void writeToCsvFile(List<List<String>> thingsToWrite, String separator, String fileName) {
    try (FileWriter writer = new FileWriter(fileName)) {
        for (List<String> strings : thingsToWrite) {
            for (int i = 0; i < strings.size(); i++) {
                writer.append(strings.get(i));
                if (i < strings.size() - 1) {
                    writer.append(separator);
                }
            }
            writer.append(System.lineSeparator());
        }
        writer.flush();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I'm using this method to write data into a CSV file, but the problem is that it writes the data again below the old data. How can I prevent this? I tried FileWriter writer = new FileWriter(answerFile, false); but then only the last array is written to the CSV file.
I have this code:
public static void writeCsv(List<String> myList) throws IOException {
    FileWriter writer = new FileWriter(answerFile, true);
    CSVPrinter csvPrinter = new CSVPrinter(writer, CSVFormat.DEFAULT);
    List<String[]> myListSplitted = myList.stream().map(row -> row.split(",")).collect(Collectors.toList());
    csvPrinter.printRecords(myListSplitted);
    csvPrinter.flush();
    csvPrinter.close();
}
This is the method from which I'm calling it:
public static void appendAnswers() throws IOException {
    try (BufferedReader br = new BufferedReader(new FileReader(questionFile))) {
        String csvRow;
        int counter = 0;
        String[] csvArr;
        String data;
        br.readLine();
        List<String> myList = new ArrayList<>();
        while ((csvRow = br.readLine()) != null) {
            csvArr = csvRow.split(",");
            csvArr = Arrays.copyOf(csvArr, csvArr.length + 1);
            csvArr[csvArr.length - 1] = answers.get(counter);
            data = Arrays.toString(csvArr).replace("[", "").replace("]", "").trim();
            counter++;
            myList = new ArrayList<String>(Arrays.asList(data.split("\n")));
        }
        writeCsv(myList);
    }
}
From the manual page:
Prints values in a CSV format.
Values can be appended to the output by calling the print(Object) method.
Basically, CSVPrinter emulates a printer: a printer appends new lines to whatever has already been printed.
If you want to overwrite, use the FileWriter more directly, in overwrite mode, rather than a printer emulator.
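To make the difference concrete, here is a small stdlib-only sketch of the two FileWriter modes (the file name append-demo.csv is made up for the demo). Opening with append = false truncates the file; opening with append = true, as writeCsv does on every call, keeps adding below the old data:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class AppendDemo {
    public static void main(String[] args) throws IOException {
        Path out = Path.of("append-demo.csv");
        // append = false truncates the file, so this call starts fresh
        try (FileWriter writer = new FileWriter(out.toFile(), false)) {
            writer.write("id,name\n");
            writer.write("1,alpha\n");
        }
        // append = true adds below whatever is already there
        try (FileWriter writer = new FileWriter(out.toFile(), true)) {
            writer.write("2,beta\n");
        }
        System.out.println(Files.readString(out));
    }
}
```

So the clean fix for the question above is to collect all rows first and write them in a single overwrite-mode pass, rather than appending batch after batch.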
I am writing data to a JSON file using a for loop. My question is: will all the data be written to the file, or will a new .json file be created each time?
List<String> list = new ArrayList<String>();
list.add("abc");
list.add("def");
list.add("xyz");
for (String name : list) {
    JSONObject obj = new JSONObject();
    obj.put("Name:", name);
    try (FileWriter file = new FileWriter("C:\\Users\\elements.json")) {
        file.write(obj.toJSONString());
        file.flush();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Please update your code like this (note that backslashes in the path must be escaped, and the writer should be closed):
...
try (FileWriter file = new FileWriter("C:\\Users\\elements.json")) {
    for (String name : list) {
        JSONObject obj = new JSONObject();
        obj.put("Name:", name);
        file.write(obj.toJSONString());
    }
    file.flush();
}
...
Otherwise, accumulate all the JSON into one String variable and write that variable to the file in a single call.
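That alternative can be sketched with the standard library alone. The JSON strings are hand-built here only to keep the example dependency-free; in the original code, obj.toJSONString() fills that role:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.util.List;

public class SingleWriteDemo {
    // Build the whole payload first, then write the file exactly once.
    public static String buildJson(List<String> names) {
        StringBuilder sb = new StringBuilder();
        for (String name : names) {
            sb.append("{\"Name\":\"").append(name).append("\"}");
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        String json = buildJson(List.of("abc", "def", "xyz"));
        // One write call, so earlier objects are never overwritten
        try (FileWriter file = new FileWriter("elements.json")) {
            file.write(json);
        }
    }
}
```

Note that, like the loop in the question, concatenated objects are not one valid JSON document; wrapping them in an array (as the Jackson example below does for the list) avoids that.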
Instead, you can use Jackson's ObjectMapper to map the entire list object.
List<String> list = new ArrayList<>();
list.add("abc");
list.add("def");
list.add("xyz");
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.writeValue(new File("elements.json"), list);
Hey, I've got a chunk of code here trying to read a .csv file one line at a time:
rows = new WarehouseItem[];
public void readCSV(String filename) {
    FileInputStream fileStrm = null;
    InputStreamReader rdr;
    BufferedReader bufRdr;
    int lineNum;
    String line;
    try {
        fileStrm = new FileInputStream(filename);
        rdr = new InputStreamReader(fileStrm);
        bufRdr = new BufferedReader(rdr);
        numRows = 0;
        line = bufRdr.readLine();
        while (line != null) {
            rows[numRows] = line;
            numRows++;
            line = bufRdr.readLine();
        }
        fileStrm.close();
    } catch (IOException e) {
        if (fileStrm != null) {
            try {
                fileStrm.close();
            } catch (IOException ex2) {}
        }
        System.out.println("Error in file processing: " + e.getMessage());
    }
}
On the rows[numRows] = line line, I'm trying to store the line into an array of objects (I have premade an object which contains an array of Strings and the number of columns).
I'm not entirely sure how to store the single line I'm reading into my object.
Any help would be really appreciated :)
Your life would be an awful lot easier if you used a CSV library to do this. With Jackson it's really simple to read CSV into an array of objects.
For example:
CsvMapper mapper = new CsvMapper();
mapper.enable(CsvParser.Feature.WRAP_AS_ARRAY);
File csvFile = new File("input.csv"); // or from String, URL etc.
MappingIterator<Object[]> it = mapper.readerFor(Object[].class).readValues(csvFile);
See here for more info on parsing CSV in java: http://demeranville.com/how-not-to-parse-csv-using-java/
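If you would rather stay with plain Java for now, a List sidesteps the fixed-size-array problem: collect each line as a String[] of columns, then wrap the rows into your WarehouseItem type afterwards (a sketch under that assumption; note the caveat in the comment):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class SimpleCsvReader {
    // Reads each line, splits it on commas, and collects the rows in a
    // growable list instead of a fixed-size array. Note: a plain split()
    // does not handle quoted fields containing commas, which is exactly
    // why a CSV library is the safer choice for real data.
    public static List<String[]> readRows(Path file) throws IOException {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                rows.add(line.split(","));
            }
        }
        return rows;
    }
}
```

Each String[] carries its own length, which covers the "number of columns" field your premade object keeps track of.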
I'm working on a CSV parser. I want to read the headers and the rest of the CSV file separately.
Here is my code to read the CSV. The current code reads everything in the file, but I need to read the headers separately.
Please help me with this.
public class csv {
    private void csvRead(File file) {
        try (BufferedReader br = new BufferedReader(new FileReader(file));
             BufferedWriter writer = new BufferedWriter(new FileWriter(new File("csv.txt")))) {
            String strLine;
            StringTokenizer st;
            int tokenNumber = 0;
            while ((strLine = br.readLine()) != null) {
                st = new StringTokenizer(strLine, ",");
                while (st.hasMoreTokens()) {
                    tokenNumber++;
                    writer.write(tokenNumber + " " + st.nextToken());
                    writer.newLine();
                }
                tokenNumber = 0;
                writer.flush();
            }
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
We have a withHeader() method available in CSVFormat. If you use this option then you will be able to read the file using the headers.
CSVFormat format = CSVFormat.newFormat(',').withHeader();
Map<String, Integer> headerMap = dataCSVParser.getHeaderMap();
will give you all the headers.
public class CSVFileReaderEx {
    public static void main(String[] args) {
        readFile();
    }

    public static void readFile() {
        List<Map<String, String>> csvInputList = new CopyOnWriteArrayList<>();
        List<Map<String, Integer>> headerList = new CopyOnWriteArrayList<>();
        String fileName = "C:/test.csv";
        CSVFormat format = CSVFormat.newFormat(',').withHeader();
        try (BufferedReader inputReader = new BufferedReader(new FileReader(new File(fileName)));
             CSVParser dataCSVParser = new CSVParser(inputReader, format)) {
            List<CSVRecord> csvRecords = dataCSVParser.getRecords();
            Map<String, Integer> headerMap = dataCSVParser.getHeaderMap();
            headerList.add(headerMap);
            headerList.forEach(System.out::println);
            for (CSVRecord record : csvRecords) {
                Map<String, String> inputMap = new LinkedHashMap<>();
                for (Map.Entry<String, Integer> header : headerMap.entrySet()) {
                    inputMap.put(header.getKey(), record.get(header.getValue()));
                }
                if (!inputMap.isEmpty()) {
                    csvInputList.add(inputMap);
                }
            }
            csvInputList.forEach(System.out::println);
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
Please consider the use of Commons CSV. This library is written according to RFC 4180, Common Format and MIME Type for Comma-Separated Values (CSV) Files, and is able to read lines such as:
"aa,a","b""bb","ccc"
Its use is quite simple: there are just three classes. A small sample according to the documentation:
Parsing of a csv-string having tabs as separators, '"' as an optional
value encapsulator, and comments starting with '#':
CSVFormat format = new CSVFormat('\t', '"', '#');
Reader in = new StringReader("a\tb\nc\td");
String[][] records = new CSVParser(in, format).getRecords();
Additionally, you get these parsers already available as constants:
DEFAULT - Standard comma separated format as defined by RFC 4180.
EXCEL - Excel file format (using a comma as the value delimiter).
MYSQL - Default MySQL format used by the SELECT INTO OUTFILE and LOAD DATA INFILE operations.
TDF - Tabulation delimited format.
Have you considered OpenCSV?
Previous question here...
CSV API for Java
Looks like you can split out the header quite easily...
String fileName = "data.csv";
CSVReader reader = new CSVReader(new FileReader(fileName ));
// if the first line is the header
String[] header = reader.readNext();
// iterate over reader.readNext until it returns null
String[] line = reader.readNext();
Your code here, being
while ((strLine = br.readLine()) != null)
{
    // reads everything in your csv
}
will read all of your CSV content.
For example, the following fetches your header:
Reader in = ...;
CSVFormat.EXCEL.withHeader("Col1", "Col2", "Col3").parse(in);
As suggested, life could be easier using the predefined CSVFormat constants from the Apache Commons CSV library. See the user guide here: https://commons.apache.org/proper/commons-csv/user-guide.html
Cheers.