I have two methods. One reads from a file, the other writes to it. If you look at them, they differ only in a local variable:
public void write() {
    try {
        BufferedWriter out = new BufferedWriter(new FileWriter(file, true));
    } catch (/* catch clauses identical in both methods */)
}
public void read() {
    try {
        BufferedReader in = new BufferedReader(new FileReader(file));
    } catch (/* catch clauses identical in both methods */)
}
I want to extract a single method from both and, depending on what the incoming object is, either read the file or write to it. Something like this:
public fileIO(??? io) {
try{
//read or write
} catch//put the same code here
}
Is it possible to combine Writer and Reader under the same method?
Extract the common parts into a method:
void handle(...) {
// handle exception
}
public void read(...) {
try {
...
} catch (...) {
handle(...); // use defined method
}
}
public void write(...) {
try {
...
} catch (...) {
handle(...); // and here
}
}
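For illustration, a rough sketch of what the extracted handler could look like with the read/write pair from the question (the handle method, its logging, and the file field are just placeholders):

private void handle(IOException e) {
    // shared exception handling for both methods (logging, rethrowing, ...)
    System.err.println("I/O error: " + e.getMessage());
}

public void write(String content) {
    try (BufferedWriter out = new BufferedWriter(new FileWriter(file, true))) {
        out.write(content);
    } catch (IOException e) {
        handle(e); // same handling as in read()
    }
}

public void read() {
    try (BufferedReader in = new BufferedReader(new FileReader(file))) {
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
    } catch (IOException e) {
        handle(e); // same handling as in write()
    }
}

If you really want a single fileIO(...) method, you would have to pass the read-or-write action in (for example as a lambda), but extracting the shared catch code is usually the simpler refactoring.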
I've got kind of a weird situation. I have these methods:
public void generateRecords(Request request) {
String pathToFile = request.getPathFile();
String recordOne = generateRecordOne(request);
String recordTwo = generateRecordTwo(request);
fileService.writeToFile(pathToFile, recordOne);
fileService.writeToFile(pathToFile, recordTwo);
}
public void writeToFile(String path, String content) {
try {
FileWriter writer = new FileWriter(path, true);
writer.append(content);
writer.close();
} catch (IOException e) {
e.printStackTrace();
}
}
generateRecords() is executed from a REST endpoint. I am getting something like this:
id:1:record1
id:2:record1
id:1:record2
id:2:record2
While I would like to get something like this:
id:1:record1
id:1:record2
id:2:record1
id:2:record2
It only happens sometimes, but it still corrupts my file. How can I avoid this?
Try using synchronized on the writeToFile method.
Also, consider using the try-with-resources statement: in the code you have right now, an exception in the writer would leave the FileWriter unclosed.
public synchronized void writeToFile(String path, String content) {
try (FileWriter writer = new FileWriter(path, true)) {
writer.append(content);
} catch (IOException e) {
e.printStackTrace();
}
}
I have a Java program littered with values I want to log to a txt file. I'm new to the language and finding it not so straightforward.
I created a Logger class:
public static void loggerMain(String content) {
try {
PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter("debug.txt", true)));
out.println(content);
out.close();
} catch (IOException e) {
//exception handling left as an exercise for the reader
}
}
I then call the method in another class:
Logger.loggerMain("testing");
It logs the String, but if I then run the script again, it appends the same String on a new line. I don't want the same println to be appended each time the script is called; I want to overwrite the file. How would I go about this?
If I change the FileWriter argument to false, the file will only contain the latest call to the method, e.g.:
Logger.loggerMain("testing1");
Logger.loggerMain("testing2");
Only Logger.loggerMain("testing2"); will be logged. I know why: it's because I'm creating a new file each time I call the method, but I really don't know the solution to this!
If I understood you correctly, you want to clear the log each time the program is executed. You can do this with the following addition to the Logger class:
class Logger {
private static boolean FIRST_CALL = true;
public static void loggerMain(String content) {
try {
PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter("debug.txt", !FIRST_CALL)));
if(FIRST_CALL){
FIRST_CALL = false;
}
out.println(content);
out.close();
} catch (IOException e) {
//exception handling left as an exercise for the reader
}
}
}
With the variable FIRST_CALL we track whether the logger is being called for the first time in the current run. If it is, we overwrite the file by passing false (!FIRST_CALL) to the FileWriter.
Just a reiteration of the other answer:
class Logger {
private static boolean FIRST_CALL = true;
public static void loggerMain(String content) {
try (
PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter("debug.txt", !FIRST_CALL)))) {
FIRST_CALL = false;
out.println(content);
} catch (IOException e) {
//exception handling left as an exercise for the reader
}
}
}
try-with-resources spares you the explicit close() call and properly closes the resource whether the block completes normally or with an exception.
This one is subjective: since the code touches FIRST_CALL anyway, I find it simpler to set it unconditionally rather than behind the extra check.
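As a quick usage illustration with either variant above: within a single run, the file is truncated only on the first call and appended to afterwards.

Logger.loggerMain("testing1"); // first call of this run: debug.txt is overwritten
Logger.loggerMain("testing2"); // later calls in the same run append
// debug.txt now contains:
// testing1
// testing2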
I have source files in Cp1250 encoding. All of those files are in the dirName directory or its subdirectories. I would like to merge them into one UTF-8 file by concatenating their contents. Unfortunately, I get an empty line at the beginning of the result file.
public static void processDir(String dirName, String resultFileName) {
try {
File resultFile = new File(resultFileName);
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(resultFile), "utf-8"));
Files.walk(Paths.get(dirName)).filter(Files::isRegularFile).forEach((path) -> {
try {
Files.readAllLines(path, Charset.forName("Windows-1250")).stream().forEach((line) -> {
try {
bw.newLine();
bw.write(line);
} catch (Exception e) {
e.printStackTrace();
}
});
} catch (Exception e) {
e.printStackTrace();
}
});
bw.close();
} catch (Exception e) {
e.printStackTrace();
}
}
The reason is that I don't know how to detect the first file in my stream.
I came up with an extremely crude solution which does not rely on the stream itself (it checks the length of the result file instead), so it is unsatisfactory:
public static void processDir(String dirName, String resultFileName) {
try {
File resultFile = new File(resultFileName);
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(resultFile), "utf-8"));
Files.walk(Paths.get(dirName)).filter(Files::isRegularFile).forEach((path) -> {
try {
Files.readAllLines(path, Charset.forName("Windows-1250")).stream().forEach((line) -> {
try {
if(resultFile.length() != 0){
bw.newLine();
}
bw.write(line);
if(resultFile.length() == 0){
bw.flush();
}
} catch (Exception e) {
e.printStackTrace();
}
});
} catch (Exception e) {
e.printStackTrace();
}
});
bw.close();
} catch (Exception e) {
e.printStackTrace();
}
}
I could also use a static boolean flag, but that feels just as ugly.
You can use flatMap to create a stream of all lines of all files, then use flatMap again to interleave it with line separators, and then use skip(1) to drop the leading separator, like this:
public static void processDir(String dirName, String resultFileName) {
try(BufferedWriter bw = Files.newBufferedWriter(Paths.get(resultFileName))) {
Files.walk(Paths.get(dirName)).filter(Files::isRegularFile)
.flatMap(path -> {
try {
return Files.lines(path, Charset.forName("Windows-1250"));
} catch (IOException e) {
throw new UncheckedIOException(e);
}
})
.flatMap(line -> Stream.of(System.lineSeparator(), line))
.skip(1)
.forEach(line -> {
try {
bw.write(line);
} catch (IOException e) {
throw new UncheckedIOException(e);
}
});
} catch (IOException e) {
throw new UncheckedIOException(e);
}
}
In general, the flatMap + skip combination helps solve many similar problems.
Also note the Files.newBufferedWriter method, which is a simpler way to create a BufferedWriter. And don't forget about try-with-resources.
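As a small aside (not part of the original answer), the same flatMap + skip idiom works for any "join with a separator" problem, for example joining strings:

import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JoinDemo {
    public static void main(String[] args) {
        String joined = Stream.of("a", "b", "c")
                .flatMap(s -> Stream.of(", ", s)) // a separator before every element
                .skip(1)                          // drop the leading separator
                .collect(Collectors.joining());
        System.out.println(joined); // prints: a, b, c
    }
}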
Rethink your strategy. If you want to join files and neither remove nor convert line terminators, there is no reason to process lines at all. It seems the only reason you are writing line-processing code is that you want to bring lambda expressions and streams into the solution, and the only possibility offered by the current API is to process streams of lines. But obviously, they are not the right tool for this job:
public static void processDir(String dirName, String resultFileName) throws IOException {
Charset cp1250 = Charset.forName("Windows-1250");
CharBuffer buffer=CharBuffer.allocate(8192);
// CREATE and TRUNCATE_EXISTING are static imports from java.nio.file.StandardOpenOption
try (BufferedWriter bw =
        Files.newBufferedWriter(Paths.get(resultFileName), CREATE, TRUNCATE_EXISTING)) {
Files.walkFileTree(Paths.get(dirName), new SimpleFileVisitor<Path>() {
@Override public FileVisitResult visitFile(
Path path, BasicFileAttributes attrs) throws IOException {
try(BufferedReader r=Files.newBufferedReader(path, cp1250)) {
while(r.read(buffer)>0) {
bw.write(buffer.array(), buffer.arrayOffset(), buffer.position());
buffer.clear();
}
}
return FileVisitResult.CONTINUE;
}
});
}
}
Note how this solution solves the problems of your first attempt. You don't have to deal with line terminators here; this code doesn't even waste resources trying to find them in the input. All it does is perform the charset conversion on chunks of input data and write them to the target. The performance difference can be significant.
Further, the code isn't cluttered with catching exceptions that you can't handle. If an IOException occurs at any point of the operation, all pending resources are properly closed and the exception is relayed to the caller.
Granted, it just uses a good old anonymous inner class instead of a lambda expression, but that doesn't reduce the readability compared to your attempt. If it still really bothers you that there is no lambda expression involved, you may check this question & answer for a way to bring them in again.
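For completeness, a sketch of the same chunk-wise copy driven by Files.walk and a lambda (wrapping the checked exception in UncheckedIOException, as in the earlier answer) could look roughly like this:

public static void processDir(String dirName, String resultFileName) throws IOException {
    Charset cp1250 = Charset.forName("Windows-1250");
    CharBuffer buffer = CharBuffer.allocate(8192);
    try (BufferedWriter bw = Files.newBufferedWriter(Paths.get(resultFileName))) {
        Files.walk(Paths.get(dirName)).filter(Files::isRegularFile).forEach(path -> {
            try (BufferedReader r = Files.newBufferedReader(path, cp1250)) {
                while (r.read(buffer) > 0) {
                    bw.write(buffer.array(), buffer.arrayOffset(), buffer.position());
                    buffer.clear();
                }
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
    }
}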
I need to list all subfolders in a directory and write them to a text file. But with my code, only the last subfolder is written to the file. Please help, I am a beginner in Java.
public class Main {
// private Object bufferedWriter;
/**
* Prints some data to a file using a BufferedWriter
*/
public void writeToFile(String filename) {
try
{
BufferedWriter bufferedWriter = null;
bufferedWriter = new BufferedWriter(new FileWriter(filename));
int i=1;
File f=new File("D:/Moviezzz");
File[] fi=f.listFiles();
for(File fil:fi)
{
if(fil.isHidden())
{
System.out.print("");
}
else if(fil.isDirectory()||fil.isFile())
{
int s=i++;
String files = fil.getName();
//Start writing to the output stream
bufferedWriter.write(s+" "+fil);
bufferedWriter.newLine();
// bufferedWriter.write(s+" "+files);
}
}
//Construct the BufferedWriter object
} catch (FileNotFoundException ex) {
ex.printStackTrace();
}catch (IOException ex) {
ex.printStackTrace();}
}
public static void main(String[] args) {
new Main().writeToFile("d://my.txt");
}
}
Until you call the flush() method of the BufferedWriter class, it will not necessarily write your data to the file.
It is not necessary to call flush() on every iteration of the loop; you can call it once after the end of the loop.
The point of calling yourObj.flush() is to empty the buffer: after the call, the buffered data is released from memory and written to your file.
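For example, just to show where that single call would go (fi and bufferedWriter are the variables from your code):

for (File fil : fi) {
    // ... write each entry ...
}
bufferedWriter.flush(); // flush once, after the loop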
Close the BufferedWriter after the loop:
for (File fil : fi)
{
    ...
}
bufferedWriter.close();
Also, I suggest these changes in your code to make it more readable and efficient:
BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter(filename));
...
if(!fil.isHidden() && (fil.isDirectory() || fil.isFile()))
{
...
}
You can create the BufferedWriter directly. Then, you are getting the file name but not doing anything with it, so just remove that call. And last, you don't have to put System.out.print(""); in an if branch to check whether the file is hidden. You can use an empty statement, no code at all, or the ! operator to invert the condition:
if(fil.isHidden())
{
; // Do nothing
}
else
{
// Do something
}
if(fil.isHidden()); // Do nothing
else
{
// Do something
}
if(!fil.isHidden())
{
// Do something
}
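Putting those suggestions together, one possible version of the method (using try-with-resources, as recommended elsewhere in this thread, so the writer is flushed and closed automatically) might look like this:

public void writeToFile(String filename) {
    try (BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter(filename))) {
        int i = 1;
        File[] files = new File("D:/Moviezzz").listFiles();
        if (files == null) {
            return; // directory does not exist or cannot be read
        }
        for (File fil : files) {
            if (!fil.isHidden() && (fil.isDirectory() || fil.isFile())) {
                bufferedWriter.write(i++ + " " + fil);
                bufferedWriter.newLine();
            }
        }
    } catch (IOException ex) { // also covers FileNotFoundException
        ex.printStackTrace();
    }
}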
public void tokenize(){
// attempt creating a reader for the input
reader = this.newReader();
while((line = reader.readLine())!=null){
tokenizer = new StringTokenizer(line);
while(tokenizer.hasMoreTokens()){
toke = (tokenizer.nextToken().trim());
this.tokenType(toke);
//System.out.println(this.tokenType(toke));
}
}
}
private BufferedReader newReader(){
try {//attempt to read the file
reader = new BufferedReader(new FileReader("Input.txt"));
}
catch(FileNotFoundException e){
System.out.println("File not found");
}
catch(IOException e){
System.out.println("I/O Exception");
}
return reader;
}
I thought I had handled it within newReader(), but it appears to be unreachable. Eclipse recommends adding a throws clause, but I don't understand what that does, or whether it even solves the problem.
Appreciate the help!
If you don't know how to handle an IOException in this method, then it means that it's not the responsibility of the method to handle it, and it should thus be thrown by the method.
The reader should be closed in this method, though, since this method opens it:
public void tokenize() throws IOException {
BufferedReader reader = null;
try {
// attempt creating a reader for the input
reader = this.newReader();
...
}
finally {
if (reader != null) {
try {
reader.close();
}
catch (IOException e) {
// nothing to do anymore: ignoring
}
}
}
}
Also, note that unless your class is itself a kind of Reader wrapping another reader, and thus has a close method, the reader shouldn't be an instance field. It should be a local variable as shown in my example.
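On Java 7 and later, the same cleanup can be written more compactly with try-with-resources. A sketch, assuming tokenType(String) exists as in your class and letting the IOException propagate to the caller:

public void tokenize() throws IOException {
    // the reader is closed automatically, whether the loop finishes normally or an exception is thrown
    try (BufferedReader reader = new BufferedReader(new FileReader("Input.txt"))) {
        String line;
        while ((line = reader.readLine()) != null) {
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                this.tokenType(tokenizer.nextToken().trim());
            }
        }
    }
}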