I want to merge multiple .eml files into one. Using a simple merge technique I am able to merge them, but when I open the result I can only find one email inside it. Can anyone help me with this? I am not sure what is wrong.
Below is the code snippet:
public static void mergeFiles(List<InputStream> source, File mergedFile) throws FileNotFoundException, IOException {
    FileWriter fstream = null;
    BufferedWriter out = null;
    try {
        fstream = new FileWriter(mergedFile, true);
        out = new BufferedWriter(fstream);
    } catch (IOException e1) {
        e1.printStackTrace();
    }

    // for (InputStream f : source) {
    Iterator itr = source.iterator();
    while (itr.hasNext()) {
        // System.out.println("merging: " + f.getName());
        InputStream fis;
        fis = (InputStream) itr.next();
        BufferedReader in = new BufferedReader(new InputStreamReader(fis));
        String aLine;
        while ((aLine = in.readLine()) != null) {
            out.write(aLine);
            out.newLine();
        }
        in.close();
    }
    out.close();
}
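For what it's worth, a single .eml file is expected to hold exactly one MIME message, so most mail clients will show a concatenated file as one email (the first message's headers, with the rest dumped into its body). One common container for several messages in a single file is the mbox format, where each message is preceded by a separator line starting with "From ". A rough sketch under that assumption (the separator format here is simplified; a mail library such as JavaMail would handle this more robustly):
// Rough sketch: concatenate .eml sources into one mbox-style file (assumed format).
public static void mergeToMbox(List<InputStream> source, File mboxFile) throws IOException {
    try (BufferedWriter out = new BufferedWriter(new FileWriter(mboxFile, true))) {
        for (InputStream fis : source) {
            // Separator line marking the start of a new message in mbox files
            out.write("From - " + new java.util.Date());
            out.newLine();
            try (BufferedReader in = new BufferedReader(new InputStreamReader(fis))) {
                String aLine;
                while ((aLine = in.readLine()) != null) {
                    out.write(aLine);
                    out.newLine();
                }
            }
            out.newLine(); // blank line between messages
        }
    }
}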
Related
I have a simulation program that needs to write certain results to a CSV file very frequently during execution. I have found that something about the PrintWriter dramatically slows down my program as the output file grows (close to 1 million rows). I suspect it is rewriting the entire file from the beginning each time, whereas I only need to append a single line at the bottom on each call. Below is the code related to the writing functions.
One of the writing functions:
public void printHubSummary(Hub hub, String filePath) {
    try {
        StringBuilder sb = new StringBuilder();
        String h = hub.getHub_code();
        String date = Integer.toString(hub.getGs().getDate());
        String time = hub.getGs().getHHMMFromMinute(hub.getGs().getClock());
        String wgt = Double.toString(hub.getIb_wgt());
        sb.append(h + "," + date + "," + time + "," + wgt);
        // System.out.println("truck print line: " + sb);
        FileWriter.writeFile(sb.toString(), filePath);
    } catch (Exception e) {
        System.out.println("Something wrong when outputting truck summary file!");
        e.printStackTrace();
    }
}
The file writer code (this should be where the problem is):
public static boolean writeFile(String newStr, String filename) throws IOException {
    boolean flag = false;
    String filein = newStr + "\r\n";
    String temp = "";

    FileInputStream fis = null;
    InputStreamReader isr = null;
    BufferedReader br = null;
    FileOutputStream fos = null;
    PrintWriter pw = null;
    try {
        File file = new File(filename);
        fis = new FileInputStream(file);
        isr = new InputStreamReader(fis);
        br = new BufferedReader(isr);
        StringBuffer buf = new StringBuffer();

        for (int j = 1; (temp = br.readLine()) != null; j++) {
            buf = buf.append(temp);
            buf = buf.append(System.getProperty("line.separator"));
        }
        if (buf.length() > 0 && buf.charAt(0) == '\uFEFF') {
            buf.deleteCharAt(0);
        }
        buf.append(filein);

        fos = new FileOutputStream(file);
        byte[] unicode = {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF};
        fos.write(unicode);
        pw = new PrintWriter(fos);
        pw.write(buf.toString().toCharArray());
        pw.flush();
        flag = true;
    } catch (IOException e1) {
        throw e1;
    } finally {
        if (pw != null) {
            pw.close();
        }
        if (fos != null) {
            fos.close();
        }
        if (br != null) {
            br.close();
        }
        if (isr != null) {
            isr.close();
        }
        if (fis != null) {
            fis.close();
        }
    }
    return flag;
}
An update after modifying the code: I have disabled the operations that repeatedly read and rewrote the entire file. That appears to solve the problem, but after writing for some time it slows down as well. Is this the best arrangement for writing a very large file? What other modifications can be made to make it even more efficient?
public static boolean writeFile1(String newStr, String filename) throws IOException {
    boolean flag = false;
    String filein = newStr + "\r\n";
    String temp = "";

    FileInputStream fis = null;
    InputStreamReader isr = null;
    BufferedReader br = null;
    FileOutputStream fos = null;
    PrintWriter pw = null;
    try {
        File file = new File(filename);
        fis = new FileInputStream(file);
        isr = new InputStreamReader(fis);
        br = new BufferedReader(isr);
        StringBuffer buf = new StringBuffer();

        // for (int j = 1; (temp = br.readLine()) != null; j++) {
        //     buf = buf.append(temp);
        //     buf = buf.append(System.getProperty("line.separator"));
        // }
        // if (buf.length() > 0 && buf.charAt(0) == '\uFEFF') {
        //     buf.deleteCharAt(0);
        // }
        buf.append(filein);

        fos = new FileOutputStream(file, true);
        byte[] unicode = {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF};
        fos.write(unicode);
        pw = new PrintWriter(fos);
        pw.write(buf.toString().toCharArray());
        pw.flush();
        flag = true;
    } catch (IOException e1) {
        throw e1;
    } finally {
        if (pw != null) {
            pw.close();
        }
        if (fos != null) {
            fos.close();
        }
        if (br != null) {
            br.close();
        }
        if (isr != null) {
            isr.close();
        }
        if (fis != null) {
            fis.close();
        }
    }
    return flag;
}
Provide a second argument to the FileOutputStream constructor to specify whether or not to use append mode, which will add to the end of the file rather than overwriting it.
fos = new FileOutputStream(file, true);
Alternatively, you could create a single static PrintWriter in append mode, which will probably be faster as it reduces garbage collection.
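A rough sketch of that idea, reusing the accessors from the question (the lazy initialisation and the choice of when to flush or close are assumptions, not tested against the original program):
// Opened once in append mode and reused for every call, so old rows are kept
// and the file is never re-read or rewritten.
private static PrintWriter summaryWriter;

private static PrintWriter getSummaryWriter(String filePath) throws IOException {
    if (summaryWriter == null) {
        summaryWriter = new PrintWriter(new BufferedWriter(new FileWriter(filePath, true)));
    }
    return summaryWriter;
}

public void printHubSummary(Hub hub, String filePath) throws IOException {
    PrintWriter out = getSummaryWriter(filePath);
    out.println(hub.getHub_code() + "," + hub.getGs().getDate() + ","
            + hub.getGs().getHHMMFromMinute(hub.getGs().getClock()) + "," + hub.getIb_wgt());
    // call out.flush() if the file must stay current during the run;
    // close the writer once, at the end of the simulation.
}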
Use Files / Path / Java NIO.2, which is richer; the code below needs at least Java 7.
Path path = Paths.get(filename);
try (BufferedWriter bw = Files.newBufferedWriter(
        path, StandardCharsets.UTF_8, StandardOpenOption.APPEND, StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
    bw.append(filein);
    bw.newLine();
}
Your cue here is the StandardOpenOption.
You will probably have to add a little code beforehand to write the Unicode BOM (and adjust the StandardCharsets.UTF_8 charset if needed):
if (Files.notExists(path)) {
    Files.write(path, new byte[] {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF});
}
Also, try not to use StringBuffer for a local variable; use StringBuilder instead: you don't need synchronisation most of the time.
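Putting those pieces together, a rough sketch of an append-only replacement for writeFile (the name appendLine is made up here, and the BOM handling is carried over from the original as an assumption):
public static void appendLine(String newStr, String filename) throws IOException {
    Path path = Paths.get(filename);
    // Write the UTF-8 BOM only once, when the file is first created
    if (Files.notExists(path)) {
        Files.write(path, new byte[] {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF});
    }
    try (BufferedWriter bw = Files.newBufferedWriter(path, StandardCharsets.UTF_8,
            StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
        bw.write(newStr);
        bw.newLine();
    }
}
This still opens and closes the file on every call; for very frequent writes, keeping one buffered writer open for the whole run, as suggested above, should be faster.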
The code is like this:
public class TxtToCsvConverter {
    private static final String DATA_FILE_NAME = "/Desktop/ml-1m/movies.dat";

    public static void main(String[] args) {
        StringBuilder builder = new StringBuilder();
        try {
            // Open the file
            FileInputStream fileInputStream = new FileInputStream(DATA_FILE_NAME);
            // Create a new csv file to store your data
            PrintWriter writer = new PrintWriter(new File("result.csv"));
            DataInputStream dataInputStream = new DataInputStream(fileInputStream);
            BufferedReader reader = new BufferedReader(new InputStreamReader(dataInputStream));
            String strLine;
            // Read file line by line
            while ((strLine = reader.readLine()) != null) {
                String[] stringArray = strLine.split("::");
                for (int i = 0; i < stringArray.length; i++) {
                    builder.append(stringArray[i]);
                    builder.append(",");
                }
                builder.append("\n");
            }
            // Close the input stream
            dataInputStream.close();
            // Write builder into file
            writer.write(builder.toString());
            // Save the file
            writer.close();
        } catch (Exception e) {
            // Catch exception if any
            System.err.println("Error: " + e.getMessage());
        }
    }
}
And it comes up with an error that says
Could not find or load main class TxtToCsvConverter.data_preprocess
I couldn't find the reason. Please help.
I want to calculate some column data and write it to a CSV file as a column. Then, after calculating another column of data, I want to append it to the same file, but as a new column.
Here is what I did:
try {
    FileWriter writer = new FileWriter(OUT_FILE_PATH, true);
    for (int i = 0; i < data.size(); i++) {
        writer.append(String.valueOf(data.get(i)));
        writer.append(",");
        writer.append("\n");
    }
    writer.flush();
    writer.close();
} catch (Exception e) {}
Result: it appends the new column below the first column, so I end up with a single long column.
Thanks.
Something like this perhaps:
public void appendCol(String fileName, ???ArrayList??? data) { // assuming data is of type ArrayList here; you need to be more explicit when posting code
    String lineSep = System.getProperty("line.separator");
    String output = "";
    try {
        BufferedReader br = new BufferedReader(new FileReader(fileName));
        String line = null;
        int i = 0;
        while ((line = br.readLine()) != null) {
            // readLine() strips the line separator, so replacing lineSep in the line
            // would never match; append the new value and re-add the separator instead
            output += line + "," + String.valueOf(data.get(i)) + lineSep;
            i++;
        }
        br.close();

        FileWriter fw = new FileWriter(fileName, false); // false to replace file contents; your code has true to append to file contents
        fw.write(output);
        fw.flush();
        fw.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
You will have to read your file (line by line) and then insert the new column into every line. Here's a solution using BufferedReader and BufferedWriter:
public void addColumn(String path, String fileName) throws IOException {
    BufferedReader br = null;
    BufferedWriter bw = null;
    final String lineSep = System.getProperty("line.separator");
    try {
        File file = new File(path, fileName);
        File file2 = new File(path, fileName + ".1"); // so the
        // names don't conflict, or just use different folders
        br = new BufferedReader(new InputStreamReader(new FileInputStream(file)));
        bw = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(file2)));
        String line = null;
        int i = 0;
        for (line = br.readLine(); line != null; line = br.readLine(), i++) {
            String addedColumn = String.valueOf(data.get(i));
            bw.write(line + addedColumn + lineSep);
        }
    } catch (Exception e) {
        System.out.println(e);
    } finally {
        if (br != null)
            br.close();
        if (bw != null)
            bw.close();
    }
}
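Once the copy has been written you would typically swap it in for the original; a one-line sketch using java.nio.file (Java 7+), assuming you want to replace the old file in place:
// Replace the original file with the rewritten copy
Files.move(file2.toPath(), file.toPath(), StandardCopyOption.REPLACE_EXISTING);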
I used Apache Commons CSV to resolve this issue. No existing answer worked perfectly for me, but after a lot of effort this did.
Writer writer = Files.newBufferedWriter(Paths.get("output.csv"));
CSVPrinter csvPrinter = new CSVPrinter(writer, CSVFormat.DEFAULT
        // add whichever columns you want in withHeader
        .withHeader("createdTs", "destroyedTs", "channelName", "uid", "suid", "did", "joinTs", "leaveTs", "platform", "location", "consumption"));

// actual columns in your passed CSV
String[] HEADERS = {"createdTs", "destroyedTs", "channelName", "uid", "suid", "did", "joinTs", "leaveTs", "platform", "location"};
Reader in = new FileReader(yourCsvFile);
Iterable<CSVRecord> records = CSVFormat.DEFAULT
        .withHeader(HEADERS)
        .withFirstRecordAsHeader()
        .parse(in);

for (CSVRecord row : records) {
    String tempValue = String.valueOf(Long.parseLong(row.get("leaveTs")) - Long.parseLong(row.get("joinTs")));
    csvPrinter.printRecord(row.get("createdTs"), row.get("destroyedTs"), row.get("channelName"), row.get("uid"),
            row.get("suid"), row.get("did"), row.get("joinTs"), row.get("leaveTs"),
            row.get("platform"), row.get("location"), tempValue);
}
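One small addition to this snippet: CSVPrinter buffers its output, so flush and close it (and the reader) once the loop is done, or wrap both in a try-with-resources block:
csvPrinter.flush();
csvPrinter.close();
in.close();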
Hope this will help you.
{
    // CREATE CSV FILE
    StringBuffer csvReport = new StringBuffer();
    csvReport.append("header1,Header2,Header3\n");
    csvReport.append(value1 + "," + value2 + "," + value3 + "\n");
    generateCSVFile(filepath, fileName, csvReport); // Call the implemented method
}
public void generateCSVFile(String filepath, String fileName, StringBuffer result) {
    try {
        FileOutputStream fop = new FileOutputStream(filepath);
        // get the content in bytes
        byte[] contentInBytes = result.toString().getBytes();
        fop.write(contentInBytes);
        fop.flush();
        // wb.write(fileOut);
        if (fop != null)
            fop.close();
    } catch (Exception ex) {
        ex.printStackTrace();
    }
}
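Note that new FileOutputStream(filepath) as written truncates the file on every call; if the intention is to keep adding rows to the same CSV, the append flag is a small tweak (not tested against the code above):
// true = append to the existing file instead of overwriting it
FileOutputStream fop = new FileOutputStream(filepath, true);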
I have a web application which allows uploading binary files. I have to parse them and save the content 1:1 into a String and then into the database.
When I use uuencode on a Unix machine to encode the binary file, it works. Is there a way to do this automatically in Java?
if (isMultipart) {
    // Create a new file upload handler
    ServletFileUpload upload = new ServletFileUpload();
    // Parse the request
    FileItemIterator iter = upload.getItemIterator(request);
    while (iter.hasNext()) {
        FileItemStream item = iter.next();
        String name = item.getFieldName();
        InputStream stream = item.openStream();
        if (!item.isFormField()) {
            BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
            String line;
            licenseString = "";
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
                // Generate License File
                licenseString += line + "\n";
            }
        }
    }
    session.setAttribute("licenseFile", licenseString);
    System.out.println("adding licensestring to session. ");
}
It works of course for all non-binary files uploaded. How can I extend it to support binary files?
// save to file
// =======================================
InputStream is = new BufferedInputStream(item.openStream());
BufferedOutputStream output = null;
try {
    output = new BufferedOutputStream(new FileOutputStream("temp.txt", false));
    int data = -1;
    while ((data = is.read()) != -1) {
        output.write(data);
    }
} finally {
    is.close();
    output.close();
}

// read content of file
// =======================================
System.out.println("content of file:");
try {
    FileInputStream fstream = new FileInputStream("temp.txt");
    DataInputStream in = new DataInputStream(fstream);
    BufferedReader br = new BufferedReader(new InputStreamReader(in));
    String line;
    licenseString = "";
    String strLine;
    while ((strLine = br.readLine()) != null) {
        System.out.println(javax.xml.bind.DatatypeConverter.printBase64Binary(strLine.getBytes()));
        licenseString += javax.xml.bind.DatatypeConverter.printBase64Binary(strLine.getBytes()) + "\n";
    }
} catch (Exception e) {
    System.err.println("Error: " + e.getMessage());
}
You could use the commons-fileupload library (see: org.apache.commons.fileupload.disk.DiskFileItem is not created properly?)
The documentation is here: http://commons.apache.org/fileupload/using.html
Your case is pretty well explained on the official website.
A better way would be to write the upload to a temporary file and then handle it from there:
if (!item.isFormField()) {
    // FileItemStream exposes openStream(); read it into a temporary file
    InputStream input = new BufferedInputStream(item.openStream());
    BufferedOutputStream output = null;
    try {
        output = new BufferedOutputStream(new FileOutputStream(your_temp_file, false));
        int data = -1;
        while ((data = input.read()) != -1) {
            output.write(data);
        }
    } finally {
        input.close();
        output.close();
    }
}
Now you have a temporary file with the same content as the uploaded file, and you can do your 'other' calculations from there (one option is sketched below).
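From that temporary file you can then build a text-safe String for the database in one step, instead of Base64-encoding line by line; a sketch assuming Java 8 and that your_temp_file is a java.io.File small enough to fit in memory:
// Encode the raw upload as a single Base64 string (binary-safe, unlike readLine())
byte[] raw = java.nio.file.Files.readAllBytes(your_temp_file.toPath());
String licenseString = java.util.Base64.getEncoder().encodeToString(raw);
session.setAttribute("licenseFile", licenseString);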
I need to read a text file line by line using Java. I use the available() method of FileInputStream to check for and loop over the file. But while reading, the loop terminates after the line before the last one. That is, if the file has 10 lines, the loop reads only the first 9 lines.
Snippet used:
while (fis.available() > 0)
{
    char c = (char) fis.read();
    .....
    .....
}
You should not use available(). It gives no guarantees whatsoever. From the API docs of available():
Returns an estimate of the number of bytes that can be read (or skipped over) from this input stream without blocking by the next invocation of a method for this input stream.
You would probably want to use something like
try {
    BufferedReader in = new BufferedReader(new FileReader("infilename"));
    String str;
    while ((str = in.readLine()) != null)
        process(str);
    in.close();
} catch (IOException e) {
}
(taken from http://www.exampledepot.com/egs/java.io/ReadLinesFromFile.html)
How about using Scanner? I think using Scanner is easier
private static void readFile(String fileName) {
    try {
        File file = new File(fileName);
        Scanner scanner = new Scanner(file);
        while (scanner.hasNextLine()) {
            System.out.println(scanner.nextLine());
        }
        scanner.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
}
Read more about Java IO here
If you want to read line-by-line, use a BufferedReader. It has a readLine() method which returns the line as a String, or null if the end of the file has been reached. So you can do something like:
BufferedReader reader = new BufferedReader(new InputStreamReader(fis));
String line;
while ((line = reader.readLine()) != null) {
    // Do something with line
}
(Note that this code doesn't handle exceptions or close the stream; a try-with-resources version is sketched below.)
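On Java 7 or later, the same loop with try-with-resources takes care of closing for you (a sketch, reusing the fis stream from the question):
try (BufferedReader reader = new BufferedReader(new InputStreamReader(fis))) {
    String line;
    while ((line = reader.readLine()) != null) {
        // Do something with line
    }
} // reader and the underlying stream are closed automatically, even on exceptions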
String file = "/path/to/your/file.txt";

try {
    BufferedReader br = new BufferedReader(new InputStreamReader(new FileInputStream(file)));
    String line;

    // Uncomment the line below if you want to skip the first line (e.g. if headers)
    // line = br.readLine();

    while ((line = br.readLine()) != null) {
        // do something with line
    }
    br.close();
} catch (IOException e) {
    System.out.println("ERROR: unable to read file " + file);
    e.printStackTrace();
}
You can try FileUtils from org.apache.commons.io (try downloading the jar from here),
and you can use the following method (note that it takes a File, not a String):
FileUtils.readFileToString(new File("yourFileName"));
Hope it helps you.
The reason your code skipped the last line was because you put fis.available() > 0 instead of fis.available() >= 0
In Java 8 you could easily turn your text file into a List of Strings with streams by using Files.lines and collect:
private List<String> loadFile() {
    URI uri = null;
    try {
        uri = ClassLoader.getSystemResource("example.txt").toURI();
    } catch (URISyntaxException e) {
        LOGGER.error("Failed to load file.", e);
    }
    List<String> list = null;
    try (Stream<String> lines = Files.lines(Paths.get(uri))) {
        list = lines.collect(Collectors.toList());
    } catch (IOException e) {
        LOGGER.error("Failed to load file.", e);
    }
    return list;
}
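If the file is very large and each line only needs to be processed once, you can also stream the lines without collecting them into a List (same Files.lines API; process(...) is a placeholder for your own handling):
try (Stream<String> lines = Files.lines(Paths.get("example.txt"))) {
    lines.forEach(line -> process(line));
} catch (IOException e) {
    e.printStackTrace();
}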
// The way that I read integer numbers from a file is...
import java.util.*;
import java.io.*;

public class Practice {
    public static void main(String[] args) throws IOException {
        Scanner input = new Scanner(new File("cards.txt"));
        int times = input.nextInt();
        for (int i = 0; i < times; i++) {
            int numbersFromFile = input.nextInt();
            System.out.println(numbersFromFile);
        }
    }
}
Try this; just a little search on Google turns it up:
import java.io.*;

class FileRead {
    public static void main(String args[]) {
        try {
            // Open the file that is the first
            // command line parameter
            FileInputStream fstream = new FileInputStream("textfile.txt");
            // Get the object of DataInputStream
            DataInputStream in = new DataInputStream(fstream);
            BufferedReader br = new BufferedReader(new InputStreamReader(in));
            String strLine;
            // Read File Line By Line
            while ((strLine = br.readLine()) != null) {
                // Print the content on the console
                System.out.println(strLine);
            }
            // Close the input stream
            in.close();
        } catch (Exception e) { // Catch exception if any
            System.err.println("Error: " + e.getMessage());
        }
    }
}
Try using java.io.BufferedReader like this.
java.io.BufferedReader br = new java.io.BufferedReader(
        new java.io.InputStreamReader(new java.io.FileInputStream(fileName)));
String line = null;
while ((line = br.readLine()) != null) {
    // Process the line
}
br.close();
Yes, buffering should be used for better performance.
Use a BufferedReader or a byte[] to store your temp data.
Thanks.
Use Scanner; it should work:
Scanner scanner = new Scanner(file);
while (scanner.hasNextLine()) {
    System.out.println(scanner.nextLine());
}
scanner.close();
public class ReadFileUsingFileInputStream {
    /**
     * @param args
     */
    static int ch;

    public static void main(String[] args) {
        File file = new File("C://text.txt");
        StringBuffer stringBuffer = new StringBuffer("");
        try {
            FileInputStream fileInputStream = new FileInputStream(file);
            try {
                while ((ch = fileInputStream.read()) != -1) {
                    stringBuffer.append((char) ch);
                }
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        System.out.println("File contents :");
        System.out.println(stringBuffer);
    }
}
public class FilesStrings {
    public static void main(String[] args) throws FileNotFoundException, IOException {
        FileInputStream fis = new FileInputStream("input.txt");
        InputStreamReader input = new InputStreamReader(fis);
        BufferedReader br = new BufferedReader(input);
        String data;
        String result = new String();
        while ((data = br.readLine()) != null) {
            result = result.concat(data + " ");
        }
        System.out.println(result);
    }
}

File file = new File("Path");
FileReader reader = new FileReader(file);
int ch;
while ((ch = reader.read()) != -1) {
    System.out.print((char) ch);
}

This worked for me
Simple code for reading a file in Java:

import java.io.*;

class ReadData {
    public static void main(String args[]) throws IOException {
        FileReader fr = new FileReader(new File("<put your file path here>"));
        int n;
        // read() returns -1 at end of file
        while ((n = fr.read()) > -1) {
            char ch = (char) n;
            System.out.print(ch);
        }
        fr.close();
    }
}