Hi, I'm working on an Android app and this is my problem.
I have a text file that is maybe 100 lines long (it differs from phone to phone), but let's say a section of it looks like this:
line 1 = 34
line 2 = 94
line 3 = 65
line 4 = 82
line 5 = 29
etc
Each line is equal to some number, but that number will differ from phone to phone, since my application will be changing it and it may already be different before my app is installed. So here's my problem: I want to search the text file for, say, "line 3 = ", then delete that entire line and replace it with "line 3 = some number".
My main goal is to change the number at the end of line 3 while keeping the text of the line exactly the same; I only want to edit the number, but the problem is that the number will always be different.
How can I go about doing this? Thanks for any help.
You can't "insert" or "remove" characters in the middle of a file. I.e., you can't replace 123 with 1234 or 12 in the middle of a file.
So either you "pad" each number so they all have equal width, i.e., you represent 43 as for instance 000043, or you'll probably have to regenerate the whole file.
To regenerate the whole file, I suggest you read the original file line by line, process the lines as appropriate, and write them out to a new, temporary file along the way. Then, when you're through, you replace the old file with the new one.
To process each line, I suggest something like this:
// Uses java.util.regex.Pattern and java.util.regex.Matcher
String line = "line 3 = 65";
Pattern p = Pattern.compile("line (\\d+) = (\\d+)");
Matcher m = p.matcher(line);
if (m.matches()) {
    int key = Integer.parseInt(m.group(1));
    int val = Integer.parseInt(m.group(2));
    // Update the value if the relevant key has been found.
    if (key == 3) {
        val = 123456;
    }
    line = String.format("line %d = %d", key, val);
}
// write out line to file...
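To round that out, here is a minimal sketch of the surrounding rewrite loop described above: read the original file line by line, fix up the matching line, write everything to a temporary file, and swap the files at the end. The file paths, target key, and new value are placeholders, not anything taken from the question.
import java.io.*;
import java.util.regex.*;

public class RewriteValue {
    public static void main(String[] args) throws IOException {
        File original = new File("/path/to/file");     // placeholder path
        File temp = new File("/path/to/file.tmp");     // placeholder path
        Pattern p = Pattern.compile("line (\\d+) = (\\d+)");

        try (BufferedReader in = new BufferedReader(new FileReader(original));
             PrintWriter out = new PrintWriter(new FileWriter(temp))) {
            String line;
            while ((line = in.readLine()) != null) {
                Matcher m = p.matcher(line);
                if (m.matches() && Integer.parseInt(m.group(1)) == 3) {
                    // Keep the key, replace only the number at the end.
                    line = String.format("line %d = %d", 3, 123456);
                }
                out.println(line);
            }
        }
        // Replace the old file with the new one.
        if (!original.delete() || !temp.renameTo(original)) {
            throw new IOException("Could not replace " + original);
        }
    }
}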
Thanks, guys, for the replies, but what I ended up doing was using the sed command in a shell script with the wildcard pattern .* to replace the line, and then running the script from Java, which went a little like this:
Script
busybox sed -i 's/line 3 = .*/line 3 = 70/g' /path/to/file
Java
Command
execCommand("/path/to/script");
Method
// Uses java.io.DataOutputStream and java.io.IOException
public Boolean execCommand(String command)
{
    try {
        Runtime rt = Runtime.getRuntime();
        Process process = rt.exec("su");
        DataOutputStream os = new DataOutputStream(process.getOutputStream());
        os.writeBytes(command + "\n");
        os.flush();
        os.writeBytes("exit\n");
        os.flush();
        process.waitFor();
        os.close();
    } catch (IOException e) {
        return false;
    } catch (InterruptedException e) {
        return false;
    }
    return true;
}
The simplest solution is to read the whole file into memory, replace the line you want to change, and then write it back to the file.
For example:
String input = "line 1 = 34\nline 2 = 94\nline 3 = 65\nline 4 = 82\nline 5 = 29\n";
String out = input.replaceAll("line 3 = (\\d+)", "line 3 = some number");
...outputs:
line 1 = 34
line 2 = 94
line 3 = some number
line 4 = 82
line 5 = 29
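As a rough sketch of the full read/replace/write-back round trip (the path is a placeholder, and this assumes the file comfortably fits in memory):
import java.io.*;

public class ReplaceLine {
    public static void main(String[] args) throws IOException {
        File file = new File("/path/to/file");   // placeholder path

        // Read the whole file into one String.
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }

        // Replace only the number after "line 3 = ".
        String out = sb.toString().replaceAll("line 3 = (\\d+)", "line 3 = 70");

        // Write the result back over the original file.
        try (PrintWriter writer = new PrintWriter(new FileWriter(file))) {
            writer.print(out);
        }
    }
}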
A couple of thoughts. An easier way to do this (if possible) would be to store the lines in a collection (like an ArrayList) and do all of your manipulation within that collection, as sketched below.
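For instance, a minimal sketch of that collection-based approach (the path and the replacement value are just illustrative):
import java.io.*;
import java.util.*;

public class CollectionEdit {
    public static void main(String[] args) throws IOException {
        File file = new File("/path/to/file");   // placeholder path

        // Load every line into an ArrayList.
        List<String> lines = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = in.readLine()) != null) {
                lines.add(line);
            }
        }

        // Manipulate the collection: rewrite whichever line starts with "line 3 = ".
        for (int i = 0; i < lines.size(); i++) {
            if (lines.get(i).startsWith("line 3 = ")) {
                lines.set(i, "line 3 = 70");
            }
        }

        // Write the collection back out.
        try (PrintWriter out = new PrintWriter(new FileWriter(file))) {
            for (String line : lines) {
                out.println(line);
            }
        }
    }
}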
Another solution can be found here. If you need to replace the contents within a text file, you could call a method periodically to do this:
// Uses java.io.BufferedReader, java.io.FileReader, java.io.PrintWriter and java.io.File
try {
    BufferedReader in = new BufferedReader(new FileReader("in.txt"));
    PrintWriter out = new PrintWriter(new File("out.txt"));
    String line;     // a line in the file
    String[] params; // holds the line label and value
    while ((line = in.readLine()) != null) {
        params = line.split("=");                          // split the line
        // trim() is needed because the split leaves spaces around the "="
        if (params[0].trim().equalsIgnoreCase("line 3")
                && Integer.parseInt(params[1].trim()) == 65) { // find the line we want to replace
            out.println(params[0].trim() + " = " + "3");   // output the new line
        } else {
            out.println(line);                             // if it's not the line, just output it as-is
        }
    }
    in.close();
    out.flush();
    out.close();
} catch (Exception e) {
    e.printStackTrace();
}
I am currently building an application that extracts values from a text file inside the project. I somehow managed to extract data from specific lines, but I don't seem to get the right one.
Here is the code:
private String getInputsFromATextFile(int item) throws FileNotFoundException {
    InputStream is = this.getResources().openRawResource(R.drawable.input);
    StringBuilder builder = new StringBuilder();
    int lineNo = 0;
    try {
        BufferedReader reader = new BufferedReader(new InputStreamReader(is));
        String line;
        while ((line = reader.readLine()) != null) {
            lineNo++;
            if (lineNo == item) {
                builder.append(reader.readLine());
            }
        }
        reader.close();
    }
    catch (IOException e) {
        e.printStackTrace();
    }
    return builder.toString();
}
And here are the text file contents:
20.45
21.65
1
225
4102
401
3
3
6
1
196.41
64.11
7
3
5
2
144.01
3
452.33
12
701.33
33
78.12
12
123.90
4
25.00
10
6.51
30.98
2.50
Spiderman
100.00
90
150.00
100
10
34
12
James
1267
Joshue
401
Christelle
3050
Ryan
888
Hanna
5
13
24
9
5
3
50
Suppose we pass a certain line number as the parameter. This method returns the data on the line right after the one whose number was passed. I could perhaps adjust for the fact that it always returns line (lineNo + 1), but if I pass '0' (zero) as the parameter it returns null instead. Why is that so? I must be missing something really important.
That's because you're reading the line again in the statement builder.append(reader.readLine()).
Notice that you've already read it in the while loop.
So, the correct statement would be:
builder.append(line);
Don't read it again when appending it. Use:
builder.append(line);
Also, if you want it to be 0-indexed, you should increment lineNo after comparing it:
if (lineNo == item) {
    builder.append(line);
}
lineNo++;
If you do it before the comparison, it will never be 0 and hence returns null.
Is that optimal?
If it is just a one-time retrieval, yes.
If you use it again and again, no: every time you want data from a line, you have to traverse the whole file up to that line.
One way around that is to store the lines in an ArrayList:
InputStream is = this.getResources().openRawResource(R.drawable.input);
ArrayList<String> list = new ArrayList<>();
try {
    BufferedReader reader = new BufferedReader(new InputStreamReader(is));
    String line;
    while ((line = reader.readLine()) != null) {
        list.add(line);
    }
    reader.close();
}
catch (IOException e) {
    e.printStackTrace();
}
getLineFromFile(list, 10);
Store all the strings in the ArrayList once, then retrieve them as needed:
public String getLineFromFile(ArrayList<String> list, int lineNo)
{
    return list.get(lineNo - 1);
}
I have a text file that I first want to print the last 6 lines of, and then to detect when a new line has been added so that it will keep updating the screen with recent activity. The idea is that I'm trying to display six recent transactions made in my program.
The problem I am currently encountering is that it keeps printing the first (not last) six lines in the text file, when I want it to be the other way around.
Here is my sample code:
BufferedReader in = new BufferedReader(new FileReader("transaction-list.txt"));
System.out.println();
System.out.println("SIX MOST RECENT TRANSACTIONS:");
System.out.println();
String line;
for (int i=0; i<6;i++){
line=in.readLine();
System.out.println(line);
}
in.close();
}catch (IOException e){
e.printStackTrace();
}
break;
You have to save the lines into a String array, and after reading the whole file just print the array. Just remember where to start reading the saved array:
BufferedReader in = new BufferedReader(new FileReader("transaction-list.txt"));
System.out.println();
System.out.println("SIX MOST RECENT TRANSACTIONS:");
System.out.println();
String[] last6 = new String[6];
int count = 0;
while (in.ready()) {
    // The modulo keeps only the six most recent lines in the array.
    last6[count++ % 6] = in.readLine();
}
for (int i = 0; i < 6; i++) {
    System.out.println(last6[(i + count) % 6]);
}
in.close();
Your current logic only reads the first 6 lines and prints them. You can read all the lines into a list and remove the ones you don't need; check the following post:
How to read last 5 lines of a .txt file into java
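Along those lines, a minimal sketch of the list-based version (the file name is reused from the question; everything else is illustrative):
import java.io.*;
import java.util.*;

public class LastSixLines {
    public static void main(String[] args) throws IOException {
        // Read every line into a list.
        List<String> lines = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(new FileReader("transaction-list.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                lines.add(line);
            }
        }
        // Keep and print only the last six lines.
        int from = Math.max(0, lines.size() - 6);
        for (String line : lines.subList(from, lines.size())) {
            System.out.println(line);
        }
    }
}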
While there are 4 other answers, I don't think any address both your points: (1) to print the last 6 lines and (2) then keep monitoring the file and printing new lines.
I also think you should keep it simple to better convey your code's intent and remove bug risk:
just use a BufferedReader rather than RandomAccessFile - this is what BufferedReader is for
instead of using an array just use a FIFO Queue like ArrayDeque<String> - this is a perfect use case for it and the "ringbuffer" implementation is fully encapsulated inside ArrayDeque
A barebones implementation which does all this would be something like:
public static void MonitorFile(String filePath)
throws FileNotFoundException, IOException, InterruptedException
{
// Used for demo only: count lines after init to exit function after n new lines
int newLineCount = 0;
// constants
final int INITIAL_LINE_LIMIT = 6;
final int POLLING_INTERVAL = 1000;
// file readers
FileReader file = new FileReader(filePath);
BufferedReader fr = new BufferedReader(file);
// read-and-monitor loop
boolean initialising = true;
Queue<String> lineBuffer = new ArrayDeque<String>(INITIAL_LINE_LIMIT);
int lineCount = 0;
while (true) {
String line= fr.readLine();
if (line != null)
{
if (initialising) { // buffer
lineBuffer.add(line);
if (++lineCount > INITIAL_LINE_LIMIT) lineBuffer.remove();
}
else { // print
System.out.printf("%d %s%n", ++lineCount, line);
newLineCount++;
}
}
else
{
// No more lines, so dump buffer and/or start monitoring
if (initialising)
{
initialising = false;
// reset the line numbers for printing
lineCount = Math.max(0, lineCount - INITIAL_LINE_LIMIT);
// print out the buffered lines
while((line = lineBuffer.poll()) != null)
System.out.printf("%d %s%n", ++lineCount, line);
System.out.println("finished pre-loading file: now monitoring changes");
}
// Wait and try and read again.
if (newLineCount > 2) break; // demo only: terminate after 2 new lines
else Thread.sleep(POLLING_INTERVAL);
}
}
}
Points to consider:
For what it's worth, I would pass the BufferedReader in as a parameter so this becomes more generalised,
This needs some kind of cancellation so it doesn't monitor forever.
Rather than polling and sleeping your thread you could also use file change monitoring, but that code would be more complex than is suitable for this answer.
The above code gives the following output
2 test line b
3 test line c
4 test line d
5 test line e
6 test line f
7 test line g
finished pre-loading file: now monitoring changes
8 test line h
9 test line i
10 test line j
11 test line k
I have this program that reads a text file. I need to get some data out of it.
The text files look like this:
No. Ret.Time Peak Name Height Area Rel.Area Amount Type
min µS µS*min % mG/L
1 2.98 Fluoride 0.161 0.028 0.72 15.370 BMB
2 3.77 Chloride 28.678 3.784 99.28 2348.830 BMB
Total: 28.839 3.812 100.00 2364.201
I need to start reading from line #29 and from there get the Peak Name and the Amount of each element, like Fluoride, Chloride and so on. The example only shows those two elements, but other text files will have more. I know I will need some sort of loop to iterate through those lines, starting on line #29, which is where the "1" row starts, then the "2" row on line #30, and so on.
I have tried to make this work, but I think I am missing something and I'm not sure what. Here is my code:
int lines = 0;
BufferedReader br = new BufferedReader(new FileReader(selectFile.getSelectedFile()));
Scanner sc = new Scanner(new FileReader(selectFile.getSelectedFile()));
String word = null;
while((word =br.readLine()) != null){
lines++;
/*if(lines == 29)
System.out.println(word);*/
if ((lines == 29) && sc.hasNext())
count++;
String value = sc.next();
if (count == 2)
System.out.println(value + ",");
}
Here's some code for you:
int linesToSkip = 28;
try (BufferedReader br = new BufferedReader(new FileReader(file))) {
String line;
while ( (line = br.readLine()) != null) {
if (linesToSkip-- > 0) {
continue;
}
String[] values = line.split(" +");
int index = 0;
for (String value : values) {
System.out.println("values[" + index + "] = " + value);
index++;
}
}
}
Note that I've surrounded it in a try(expr) {} block to ensure that the reader is closed at the end, otherwise you'll consume resources and possibly lock the file from other processes.
I've also renamed the variable you called word as line to make it clearer what it contains (i.e. a string representing a line in the file).
The line.split(" +") uses a regular expression to split a String into its constituent values. In this case your values have spaces between, so we're using " +" which means 'one or more spaces'. I've just looped through the values and printed them out; obviously, you will need to do whatever it is you need to do with them.
I replaced the line count with a linesToSkip variable that decrements. It's less code and explains better what you're trying to achieve. However, if you need the line number for some reason then use that instead, as follows:
if (++lineCount <= 28) {
continue;
}
If I'm reading it correctly, you are mixing two different readers (the BufferedReader and the Scanner), so you are not going to get the right results switching from one to the other (one is not pointing to the same position as the other). You already have the line in word and you can parse it; there is no need to use the Scanner. Just skip until line 29 (lines >= 29) and then parse the values you want, line by line.
You are reading the file twice... try something like this
int lines = 0;
BufferedReader br = new BufferedReader(new FileReader(selectFile.getSelectedFile()));
String line = null;
while ((line = br.readLine()) != null) {
if (++lines < 29)
continue; //this ignores the line
for(String word : line.split("separator here")) {
// this will iterate over every word on that line
// I think you can take it from here
System.out.println(word);
}
}
I have a csv file that currently has 20 lines of data.
The data contains employee info and is in the following format:
first name, last name, Employee ID
So one line would look like this: Emma, Nolan, 2
I know how to write to the file in Java and have all 20 lines print to the console, but what I'm not sure how to do is get Java to print one specific line to the console.
I also want to take the employee ID number from the last entry and have Java add 1 to it when I add new employees. I'm thinking this needs to be done with a counter, I'm just not sure how.
You can do something like this:
BufferedReader reader = new BufferedReader(new FileReader(<<your file>>));
List<String> lines = new ArrayList<>();
String line = null;
while ((line = reader.readLine()) != null) {
    lines.add(line);
}
reader.close();
System.out.println(lines.get(0));
With BufferedReader you are able to read lines directly. This example reads the file line by line and stores the lines in an array list. You can access the lines after that by using lines.get(lineNumber).
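For the second part of the question (adding 1 to the last employee ID), a rough sketch building on that same lines list; it assumes the ID is the third comma-separated field, as in the sample line:
// Take the last line, e.g. "Emma, Nolan, 2", and pull out its ID field.
String lastLine = lines.get(lines.size() - 1);
String[] fields = lastLine.split(",");
int lastId = Integer.parseInt(fields[2].trim()); // assumes "first name, last name, ID"
int nextId = lastId + 1;                         // ID to use for the next new employee
System.out.println("Next employee ID: " + nextId);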
You can read text from a file one line at a time and then do whatever you want to with that line, print it, compare it, etc...
// Construct a BufferedReader object from the input file
BufferedReader r = new BufferedReader(new FileReader("employeeData.txt"));
int i = 1;
try {
// "Prime" the while loop
String line = r.readLine();
while (line != null) {
// Print a single line of input file to console
System.out.print("Line "+i+": "+line);
// Prepare for next loop iteration
line = r.readLine();
i++;
}
} finally {
// Free up file descriptor resources
r.close();
}
// Remember the next available employee number in a one-up scheme
int nextEmployeeId = i;
BufferedReader reader = new BufferedReader(new FileReader("yourfile.csv"));
String line = "";
while ((line = reader.readLine()) != null) {
    String[] employee = line.trim().split(",");
    // e.g. if you want to check whether it contains some name:
    // index 0 is first name, index 1 is last name, index 2 is ID
}
Alternatively, if you want more control when reading CSV files, you can look at CsvBeanReader, which will give you more access to the file's contents.
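If you do go the Super CSV route, the usual pattern looks roughly like the sketch below. The Employee bean and its field names are made up for illustration, and the exact API should be double-checked against the library's documentation:
import java.io.FileReader;
import org.supercsv.io.CsvBeanReader;
import org.supercsv.io.ICsvBeanReader;
import org.supercsv.prefs.CsvPreference;

public class ReadWithSuperCsv {

    // Hypothetical bean: Super CSV needs a public no-arg constructor
    // and setters matching the name mapping used below.
    public static class Employee {
        private String firstName;
        private String lastName;
        private String id;
        public void setFirstName(String firstName) { this.firstName = firstName; }
        public void setLastName(String lastName)   { this.lastName = lastName; }
        public void setId(String id)               { this.id = id; }
        @Override public String toString()         { return firstName + " " + lastName + " (" + id + ")"; }
    }

    public static void main(String[] args) throws Exception {
        // The sample file has no header row, so the columns are mapped by hand.
        // Note: values read this way keep any spaces after the commas, so trim if needed.
        String[] nameMapping = {"firstName", "lastName", "id"};
        ICsvBeanReader beanReader = new CsvBeanReader(
                new FileReader("yourfile.csv"), CsvPreference.STANDARD_PREFERENCE);
        Employee employee;
        while ((employee = beanReader.read(Employee.class, nameMapping)) != null) {
            System.out.println(employee);
        }
        beanReader.close();
    }
}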
Here is an algorithm which I use for reading csv files. The most effective way is to read all the data in the csv file into a 2D array first; it makes it a lot more flexible to manipulate the data.
That way you can pick which line of the file to print to the console by indexing into the array, e.g. System.out.println(employee_Data[1][y]); for record 1, where y is the index variable for the fields. You would of course need a for loop to print every field of that line.
By the way, if you want to use the employee data in a larger program, which may for example store the data in a database or write it to another file, I'd recommend encapsulating this entire code block in a function named Read_CSV_File(), which returns a 2D String array.
My Code
// Uses java.io.BufferedReader/FileReader/IOException and java.util.List/ArrayList
// The return type of this function is a 2D String array.
// The CSVFile_path can be for example "employeeData.csv".
public static String[][] Read_CSV_File(String CSVFile_path) {
    // Records are collected in a list first, because the number of lines
    // is not known until the whole file has been read.
    List<String[]> records = new ArrayList<>();
    int noofFields = 0;
    try {
        String line;
        BufferedReader in = new BufferedReader(new FileReader(CSVFile_path));
        // The program keeps looping until readLine() returns null, i.e. the end of the file.
        while ((line = in.readLine()) != null) {
            String[] current_Record = line.split(",");
            if (records.isEmpty()) {
                // Counts the number of fields in the csv file (taken from the first record).
                noofFields = current_Record.length;
            }
            records.add(current_Record);
            // Echo each record as it is read.
            for (String str : current_Record) {
                System.out.print(", " + str);
            }
            System.out.println();
        }
        // This frees up the BufferedReader file descriptor resources
        in.close();
    /* If an error occurs, it is caught by the catch statement and an error message
     * is generated and displayed to the user.
     */
    } catch (IOException ioException) {
        System.out.println("Exception: " + ioException);
    }
    // Copy the records into the 2D array that is returned.
    String[][] employee_Data = records.toArray(new String[0][]);
    // This prints to console the specific record of your choice
    System.out.println("Employee 1:");
    for (int y = 0; y < noofFields; y++) {
        // Prints out all fields of record 1
        System.out.print(employee_Data[1][y] + ", ");
    }
    System.out.println();
    return employee_Data;
}
For reading a large file:
log.debug("****************Start Reading CSV File*******");
copyFile(inputCSVFile);
StringBuilder stringBuilder = new StringBuilder();
String line= "";
BufferedReader brOldFile = null;
try {
String inputfile = inputCSVFile;
log.info("inputfile:" + inputfile);
brOldFile = new BufferedReader(new FileReader(inputfile));
while ((line = brOldFile.readLine()) != null) {
//line = replaceSpecialChar(line);
/*do your stuff here*/
stringBuilder.append(line);
stringBuilder.append("\n");
}
log.debug("****************End reading CSV File**************");
} catch (Exception e) {
log.error(" exception in readStaffInfoCSVFile ", e);
}finally {
if(null != brOldFile) {
try {
brOldFile.close();
} catch (IOException e) {
}
}
}
return stringBuilder.toString();
Ok, say I have a text file called "people.txt", and it contains the following information:
1 adam 20 M
2 betty 49 F
3 charles 9 M
4 david 22 M
5 ethan 41 M
6 faith 23 F
7 greg 22 M
8 heidi 63 F
Basically, the first number is the ID of the person, then come the person's name, age and gender. Say I want to replace line 2, or the person with ID number 2, with different values. Now, I know I can't use RandomAccessFile for this because the names are not always the same number of bytes, and neither are the ages. While searching random Java forums, I found that StringBuilder or StringBuffer should suffice for my needs, but I'm not sure how to implement either. Can they be used to write directly to the text file? I want this to work directly from user input.
I just created an example for you:
public static void main(String args[]) {
try {
// Open the file that we want to read and update
FileInputStream fstream = new FileInputStream("d:/new6.txt");
BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
String strLine;
StringBuilder fileContent = new StringBuilder();
//Read File Line By Line
while ((strLine = br.readLine()) != null) {
// Print the content on the console
System.out.println(strLine);
String tokens[] = strLine.split(" ");
if (tokens.length > 0) {
// Here tokens[0] will have value of ID
if (tokens[0].equals("2")) {
tokens[1] = "betty-updated";
tokens[2] = "499";
String newLine = tokens[0] + " " + tokens[1] + " " + tokens[2] + " " + tokens[3];
fileContent.append(newLine);
fileContent.append("\n");
} else {
// update content as it is
fileContent.append(strLine);
fileContent.append("\n");
}
}
}
// Now fileContent will have updated content , which you can override into file
FileWriter fstreamWrite = new FileWriter("d:/new6.txt");
BufferedWriter out = new BufferedWriter(fstreamWrite);
out.write(fileContent.toString());
out.close();
// Close the input reader
br.close();
} catch (Exception e) {//Catch exception if any
System.err.println("Error: " + e.getMessage());
}
}
One solution could be to read the file in, line by line, manipulate the lines you need (performing some parsing/tokenization to get the ID/name/etc.), then write all the lines back to the file (overwriting its current content). This solution depends on the size of the file you are working with: too large a file will consume a lot of memory, since you are holding all of its contents in memory at once.
Another approach (to cut down on memory requirements) is to process the file line by line, but instead of holding all lines in memory, you write the current line to a temporary file after processing it, then move the temporary file to the location of the input file (overwriting that file).
The classes FileReader and FileWriter should help you with reading/writing to the file. You might want to wrap them in a BufferedReader/BufferedWriter to improve performance.
Also, don't forget to close the reader (and the writer) when you are done reading (writing) the file, so subsequent accesses to the file are not blocked by the file still being open.