I am trying to get data out of a CSV file. However, when I try to read the data so I can use the individual values inside it, it prints extra stuff like:
x����sr��java.util.ArrayListx����a���I��sizexp������w������t��17 mei 2017t��Home - Gastt��4 - 1t��(4 - 0)t��
With this code:
FileInputStream in = openFileInput("savetest13.dat");
BufferedReader reader = new BufferedReader(new InputStreamReader(in));
List<String[]> resultList = new ArrayList<>();
String csvLine;
while ((csvLine = reader.readLine()) != null) {
    String[] row = csvLine.split(",");
    out.println("while gepakt");
    out.println(row);
    date = row[0];
    out.println("date: " + date);
    resultList.add(row);
    txtTest.setText(date);
}
But whenever I read the file back like this to check what data it contains, I get the exact same data as I put in, yet I can't manage to split that data into separate values:
FileInputStream in = openFileInput("savetest13.dat");
ObjectInputStream ois = new ObjectInputStream(in);
List stuff = (List) ois.readObject();
txtTest.setText(String.valueOf(stuff));
[17 mei 2017, Home - Guest, 2 - 1, (2 - 0), ]
I am trying to get them separated into date, names, score1 and score2.
Which of the two approaches would be better to use, and how can I get the correct output, which I am failing to obtain?
You are not writing CSV to your output file; instead you are using standard Java serialization, ObjectOutputStream#writeObject(...), to create that file. Try using a CSV library to write/read data in CSV format (see here), and before that, start here to learn about CSV, because
[17 mei 2017, Home - Guest, 2 - 1, (2 - 0), ]
is not CSV, but only the output of toString() of the list you are using.
Here is an easy way to write the CSV file formatted correctly. This is a simple example which does not take into account the need to escape any commas found in your data. You can open the file created in Excel, Google Sheets, OpenOffice, etc. and see that it is formatted correctly.
final String COMMA = ",";
final String NEW_LINE = System.getProperty("line.separator");
ArrayList<String> myRows = new ArrayList<String>();
// add comma delimited rows to ArrayList
myRows.add("date" + COMMA +
        "names" + COMMA +
        "score1" + COMMA +
        "score2"); // etc.
// optional - insert field names into the first row
myRows.add(0, "[Date]" + COMMA +
        "[Names]" + COMMA +
        "[Score1]" + COMMA +
        "[Score2]");
// get a writer
final String fileName = "myFileName.csv";
final String dirPath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOCUMENTS).getAbsolutePath();
File myDirectory = new File(dirPath);
FileWriter writer = new FileWriter(new File(myDirectory, fileName));
// write the rows to the file
for (String myRow : myRows) {
    writer.append(myRow + NEW_LINE);
}
writer.close();
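To read it back and split each line into its fields, a minimal sketch (assuming, as above, that none of the values contain commas):
BufferedReader reader = new BufferedReader(new FileReader(new File(myDirectory, fileName)));
List<String[]> resultList = new ArrayList<>();
String csvLine;
while ((csvLine = reader.readLine()) != null) {
    resultList.add(csvLine.split(COMMA)); // [date, names, score1, score2]
}
reader.close();
// resultList.get(0) is the optional field-name row; resultList.get(1)[0] is the first data row's date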
Related
I have 2 CSV files with columns 'car', 'bike', 'tractor', etc.
The code below prints out data from the CSV, which works fine; however, csv 1 prints out in a different order to csv 2, so I want to arrange the columns consistently.
From this code, how can I organise the data to print out with the columns in the order I want, first, second, etc.?
BufferedReader r = new BufferedReader(new InputStreamReader(str));
Stream lines = r.lines().skip(1);
lines.forEachOrdered(
    line -> {
        line = ((String) line).replace("\"", "");
        ret.add((String) line);
    });
The columns print out like this:
csv 1
Car, Bike, Tractor, Plane, Train
csv 2
Bike, Plane, Tractor, Train, Car,
but I want to manipulate the code so the two csv files print out in the same order, like:
Bike, Plane ,Tractor, Train, Car
I can't use the likes of col[1], col[3], as the two files are in a different order, so I would need to call them by column name in the csv file, e.g. col["Truck"] etc.
Or is there another way, like creating a new list from the csv 1 output and rearranging it?
I haven't used BufferedReader much, so I'm not sure if this is a silly question with a simple solution.
A BufferedReader reads lines, and does not care for the content of those lines. So this code will simply save lines into ret as it is reading them:
List<String> ret = new ArrayList<>();
try (BufferedReader r = new BufferedReader(new InputStreamReader(str))) {
    r.lines().skip(1).forEachOrdered(l -> ret.add(l.replace("\"", "")));
}
// now ret contains one string per CSV line, excluding the 1st
(This is somewhat better than your code in that it is guaranteed to close the reader correctly, and does not require any casts to string).
If your CSV lines do not contain any , characters that are not separators, you can modify the above code to split lines into columns; which you can then reorder:
List<String[]> ret = new ArrayList<>(); // list of string arrays
try (BufferedReader r = new BufferedReader(new InputStreamReader(str))) {
r.lines().skip(1).forEachOrdered(l ->
        ret.add(l.replace("\"", "").split(","))); // splits by ','
}
// now ret contains a String[] per CSV line, skipping the 1st;
// with ret.get(0)[1] being the 2nd column of the 1st non-skipped line
// this will output all lines, reversing the order of columns 1 and 2:
for (String[] line : ret) {
    System.out.print(line[1] + ", " + line[0]);
    for (int i = 2; i < line.length; i++) System.out.print(", " + line[i]);
    System.out.println();
}
If your CSV lines can contain ,s that are not delimiters, you will need to learn how to correctly parse (=read) CSVs, and that requires significantly more than a BufferedReader. I would recommend using an external library to handle this correctly (for there are many types of CSVs in the wild). In particular, using Apache Commons CSV, things are relatively straightforward:
try (Reader in = new FileReader("path/to/file.csv")) {
    Iterable<CSVRecord> records = CSVFormat.RFC4180.parse(in);
    for (CSVRecord record : records) {
        String columnOne = record.get(0);
        String columnTwo = record.get(1);
    }
}
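Since you would rather address columns by name than by index, Commons CSV can also read the header row for you. A rough sketch (assuming the first line of each file holds the column names, and using the column names from your example):
try (Reader in = new FileReader("path/to/file.csv")) {
    Iterable<CSVRecord> records = CSVFormat.RFC4180.withFirstRecordAsHeader().parse(in);
    for (CSVRecord record : records) {
        // columns are looked up by header name, so the physical order in each file no longer matters
        System.out.println(record.get("Bike") + ", " + record.get("Plane") + ", "
                + record.get("Tractor") + ", " + record.get("Train") + ", " + record.get("Car"));
    }
}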
I am retrieving values using a regular expression in JMeter and writing those values into a CSV file. But one of my variables returns its values as (value1,value2); how can I write those 2 values as one value in the CSV file? Below is my code:
String statusvar = vars.get("guid");
String guidstat = vars.get("guidn");
String custstat = vars.get("custType");
String fpath = vars.get("write_file_path");
String newStatus;
FileWriter fstream = new FileWriter(fpath+"new_record.csv", false);
BufferedWriter out = new BufferedWriter(fstream);
out.write(statusvar+","+guidstat+","+custstat);
out.newLine();
out.flush();
out.close();
Write your values within quotes and it should be OK. If a value contains quotes, then you'd need to escape them. Just replace each " by "", so value"a,valueB is written as "value""a,valueB"
If this becomes too tricky then I suggest getting a CSV parsing/writing library to do the job for you such as univocity-parsers - I'm the author of this one by the way.
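For example, a minimal sketch of quoting one field before writing the line (assuming statusvar is the value that may contain a comma):
// wrap the field in quotes and double any quotes it already contains (RFC 4180 style)
String quotedStatus = "\"" + statusvar.replace("\"", "\"\"") + "\"";
out.write(quotedStatus + "," + guidstat + "," + custstat);
out.newLine();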
List<String> list = null;
ICsvListReader listReader = null;
listReader = new CsvListReader(new FileReader(new File(folderPath + "/" + fileName)), CsvPreference.TAB_PREFERENCE);
while ((list = listReader.read()) != null) {
    System.out.println(list.toString());
}
I am using org.supercsv.io.ICsvListReader, but the code breaks if a ' or " is present in the TSV input file. Instead of six fields I am getting the 6th field as null. Please tell me if any changes are required, or whether there is some other library for reading large TSV files.
Data:
data "city' b state country sdsadsd details value
Solved it using univocity-parsers:
TsvParserSettings settings = new TsvParserSettings();
settings.getFormat().setLineSeparator("\n");
TsvParser parser = new TsvParser(settings);
// getReader(...) is a helper from the linked examples that opens a java.io.Reader for the given resource
parser.beginParsing(getReader("/examples/example.tsv"));
String[] row;
while ((row = parser.parseNext()) != null) {
    System.out.println(Arrays.toString(row));
}
You can refer to this link:
https://github.com/uniVocity/univocity-parsers/blob/master/src/test/java/com/univocity/parsers/examples/TsvParserExamples.java
I have CSV files (Excel) which have data in them, and I need to parse the data using Java.
The data in those files isn't separated using commas; the CSV files have a number of columns and a number of rows (each cell has data) where all the data is written.
I need to go through all the files until I reach the EOF (end of file) of each one and parse the data.
The files also contain empty rows, so an empty row is not a criterion to stop parsing; I think only EOF will indicate that I've reached the end of the specific file.
Many thanks.
You can use opencsv to parse the Excel CSV. I've used this myself; all you need to do is split on the ';'. Empty cells will be parsed as well.
You can find info here : http://opencsv.sourceforge.net/
And to parse the Excel CSV you can do:
CSVReader reader = new CSVReader(new FileReader("yourfile.csv"), ';');
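Using that reader, a rough sketch of looping until the end of the file (readNext() returns null only at EOF, and an empty line comes back as a mostly empty row rather than ending the loop):
String[] row;
while ((row = reader.readNext()) != null) {
    // row holds the cells of one line, including empty cells
    System.out.println(Arrays.toString(row));
}
reader.close();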
Aside from other suggestions, I would offer Jackson CSV module. Jackson has very powerful data-binding functionality, and CSV module allows reading/writing as CSV as an alternative to JSON (or XML, YAML, and other supported formats). So you can also do conversions between other data formats, in addition to powerful CSV-to/from-POJO binding.
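A minimal sketch with jackson-dataformat-csv (assuming the first row of the file is a header and ';' is the separator; adjust the schema to your data):
CsvMapper mapper = new CsvMapper();
CsvSchema schema = CsvSchema.emptySchema().withHeader().withColumnSeparator(';');
MappingIterator<Map<String, String>> it = mapper
        .readerFor(Map.class)
        .with(schema)
        .readValues(new File("yourfile.csv"));
while (it.hasNext()) {
    Map<String, String> row = it.next(); // column name -> cell value
    // process row
}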
First, create stream objects to read the CSV file:
FileInputStream fis = new FileInputStream("FileName.CSV");
BufferedInputStream bis = new BufferedInputStream(fis);
InputStreamReader isr = new InputStreamReader(bis);
Read the input stream and store the file contents in a String object.
Then use a StringTokenizer with ',' (comma) as the delimiter and you will get the tokens.
Manipulate the tokens to get the values.
String str = "This is String , split by StringTokenizer, created by mkyong";
StringTokenizer st = new StringTokenizer(str);
System.out.println("---- Split by space ------");
while (st.hasMoreElements()) {
    System.out.println(st.nextElement());
}
System.out.println("---- Split by comma ',' ------");
StringTokenizer st2 = new StringTokenizer(str, ",");
while (st2.hasMoreElements()) {
    System.out.println(st2.nextElement());
}
Thanks,
Pavan
Suppose you have CSV file content in the form of a string:
String fileContent;
Generally, the CSV file content is parsed into a List<List<String>>.
final List<String> rows = new ArrayList<String>(Lists.newArrayList(fileContent.split("[\\r\\n]+")));
This splits the file content into a list of rows.
Then use the CSVParser of OpenCSV to parse each comma-separated line into a List<String>:
final CSVParser parser = new CSVParser();
final List<List<String>> csvDetails = new ArrayList<List<String>>();
rows.forEach(t -> {
    try {
        csvDetails.add(Lists.newArrayList(parser.parseLine(t)));
    } catch (Exception e) {
        throw new RuntimeException("Exception occurred while parsing the data", e);
    }
});
Currently I am creating a Java app and no database is required, which is why I am using a text file instead.
The structure of the file is like this:
unique6id username identitynumber point
unique6id username identitynumber point
May I know how I could read and find the matching unique6id, and then update the corresponding row's point value?
Sorry for the lack of information; here is the part I typed:
public class Cust {
    String name;
    long idenid, uniqueid;
    int pts;

    Cust() {}

    Cust(String n, long ide, long uni, int pt) {
        name = n;
        idenid = ide;
        uniqueid = uni;
        pts = pt;
    }
}
FileWriter fstream = new FileWriter("Data.txt", true);
BufferedWriter fbw = new BufferedWriter(fstream);
Cust newCust = new Cust();
newCust.name = memUNTF.getText();
newCust.idenid = Long.parseLong(memICTF.getText());
newCust.uniqueid = Long.parseLong(memIDTF.getText());
newCust.pts = points;
fbw.write(newCust.name + " " + newCust.idenid + " " + newCust.uniqueid + " " + newCust.pts);
fbw.newLine();
fbw.close();
This is how I write in the data.
The result inside Data.txt is then:
spencerlim 900419129876 448505 0
Eugene 900419081234 586026 0
When the user types in 586026, it should grab Eugene's row,
bind it into a Cust,
and update the pts (0 in this case; try to update it to another number, e.g. 30).
Thanks for any reply =D
Reading is pretty easy, but updating a text file in place (i.e. without rewriting the whole file) is very awkward.
So, you have two options:
Read the whole file, make your changes, and then write the whole file to disk, overwriting the old version; this is quite easy (see the sketch after this list), and will be fast enough for small files, but is not a good idea for very large files.
Use a format that is not a simple text file. A database would be one option (and bear in mind that there is one, Derby, built into the JDK); there are other ways of keeping simple key-value stores on disk (like a HashMap, but in a file), but there's nothing built into the JDK.
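A minimal sketch of the first option (searchedId and newPoints are placeholder variables, and the unique id is assumed to be the first space-separated column, as in your description):
List<String> lines = new ArrayList<>();
BufferedReader reader = new BufferedReader(new FileReader("Data.txt"));
String line;
while ((line = reader.readLine()) != null) {
    String[] parts = line.split(" ");
    if (parts.length == 4 && parts[0].equals(searchedId)) {
        parts[3] = String.valueOf(newPoints); // update the point column
        line = String.join(" ", parts);
    }
    lines.add(line);
}
reader.close();
BufferedWriter writer = new BufferedWriter(new FileWriter("Data.txt", false)); // false = overwrite
for (String l : lines) {
    writer.write(l);
    writer.newLine();
}
writer.close();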
You can use OpenCSV with custom separators.
Here's a sample method that updates the info for a specified user:
public static void updateUserInfo(
        String userId,   // user id
        String[] values  // new values
) throws IOException {
    String fileName = "yourfile.txt.csv";
    CSVReader reader = new CSVReader(new FileReader(fileName), ' ');
    List<String[]> lines = reader.readAll();
    reader.close();
    Iterator<String[]> iterator = lines.iterator();
    while (iterator.hasNext()) {
        String[] items = iterator.next();
        if (items[0].equals(userId)) {
            for (int i = 0; i < values.length; i++) {
                String value = values[i];
                if (value != null) {
                    // for every array value that's not null,
                    // update the corresponding field
                    items[i + 1] = value;
                }
            }
            break;
        }
    }
    CSVWriter writer = new CSVWriter(new FileWriter(fileName), ' ');
    writer.writeAll(lines);
    writer.close();
}
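For example (hypothetical values, and assuming the unique id sits in the first column as the method above expects), updating only the point column to 30 could look like:
// null entries leave the corresponding columns untouched;
// the third entry maps to items[3], i.e. the point column
updateUserInfo("448505", new String[]{null, null, "30"});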
Use InputStream(s) and Reader(s) to read the file.
Here is a code snippet that shows how to read a file:
BufferedReader reader = new BufferedReader(new InputStreamReader(new FileInputStream("c:/myfile.txt")));
String line = null;
while ((line = reader.readLine()) != null) {
// do something with the line.
}
Use OutputStream(s) and Writer(s) to write to the file. Although you can use random access files, i.e. write to a specific place in the file, I do not recommend doing this. A much easier and more robust way is to create a new file every time you have to write something. I know that this is probably not the most efficient way, but you do not want to use a DB for some reason... If you have to save and update partial information relatively often and perform searches in the file, I'd recommend you use a DB after all. There are very lightweight implementations, including pure Java ones (e.g. H2: http://www.h2database.com/html/main.html).
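And a matching write sketch that overwrites the whole file with updated content (updatedLines is a placeholder for whatever list of lines you built up in memory):
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(new FileOutputStream("c:/myfile.txt")));
for (String l : updatedLines) {
    writer.write(l);
    writer.newLine();
}
writer.close();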