I have 3 text files. Numbers.txt has int and double values like 1 1.5 2... I want to put the int values into Int.txt and the double values into Double.txt.
How can I do that?
I tried .hasNextDouble() and .hasNextInt().
public static void main(String[] args) {
File f = new File("a.rtf");
File fWrite = new File("aWrite");
try {
FileReader fr = new FileReader(f);
FileWriter fw = new FileWriter(fWrite);
double c = fr.read();
while(c != -1){
char k = (char)c;
c.hasNextDouble(); // does not compile: hasNextDouble() belongs to Scanner, not to a double
System.out.print(k + " ");
fw.write((int) c);
c = fr.read();
}
fr.close();
fw.close();
} catch(Exception e) {
e.printStackTrace();
}
}
I am new to Java. What else can I try?
You can try this:
Writer wr = new FileWriter("aWrite.txt");
wr.write(String.valueOf(1) + " "); // add a separator so the values do not run together
wr.write(String.valueOf(1.5) + " ");
wr.write(String.valueOf(2) + " ");
wr.flush();
wr.close();
Or this approach, which is generally better for larger amounts of data:
File file = new File("aWrite.txt");
BufferedWriter out = new BufferedWriter(new FileWriter(file));
out.write("Write the string to text file");
out.newLine();
out.close(); // also flushes the buffer
I found another way. I hope it helps...
import java.io.*;
import java.util.*;
public class Test {
public static void main(String[] args)throws IOException {
double[] doubleNumbers = new double[6];
int[] integerNumbers = new int[6];
int intCount = 0;
int doubleCount = 0;
File numbers = new File("numbers.txt");
FileReader fr = new FileReader(numbers);
LineNumberReader lnreader= new LineNumberReader(fr);
String line = "";
while ((line = lnreader.readLine()) != null) {
var _temp = line.split(" ");
for(int i = 0;i<_temp.length;i++) {
if(_temp[i].indexOf(".") > 0) {
doubleNumbers[doubleCount] = Double.parseDouble(_temp[i]);
++doubleCount;
}else {
integerNumbers[intCount] = Integer.parseInt(_temp[i]);
++intCount;
}
}
}
fr.close();
File doubleFile = new File("double.txt");
FileWriter fw = new FileWriter(doubleFile);
for(int i = 0;i<doubleCount;i++) {
fw.write(doubleNumbers[i]+" ");
if((i +1) % 3 == 0)
fw.write("\n");
}
fw.flush();
fw.close();
File integerFile = new File("integer.txt");
fw = new FileWriter(integerFile); // reuse the existing fw variable; re-declaring it would not compile
for(int i = 0;i<intCount;i++) {
fw.write(integerNumbers[i]+" ");
if((i +1) % 3 == 0)
fw.write("\n");
}
fw.flush();
fw.close();
System.out.print("Done");
}
}
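Since the question already mentions hasNextInt() and hasNextDouble(), here is a minimal sketch of that Scanner-based approach as well (file names follow the question; this is an illustrative sketch, not tested code):
import java.io.*;
import java.util.Scanner;
public class SplitNumbers {
    public static void main(String[] args) throws IOException {
        try (Scanner in = new Scanner(new File("Numbers.txt"));
             PrintWriter ints = new PrintWriter(new FileWriter("Int.txt"));
             PrintWriter doubles = new PrintWriter(new FileWriter("Double.txt"))) {
            while (in.hasNext()) {
                if (in.hasNextInt()) {
                    ints.println(in.nextInt());       // whole numbers go to Int.txt
                } else if (in.hasNextDouble()) {
                    doubles.println(in.nextDouble()); // decimal numbers go to Double.txt
                } else {
                    in.next(); // skip any token that is not a number
                }
            }
        }
    }
}
Note that hasNextDouble() follows the default locale's number format, so values written with a dot as the decimal separator may require in.useLocale(Locale.US).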
I created a class (LerEscreverArquivo) that reads the contents of a text file, processes the data it reads, prints the result of that processing on the screen, and writes the information to a text file.
Reading the data and printing it on the screen work. The problem happens when writing the values to the text file.
The program writes only the last record to the file; it ignores the previous records. The code and the images of the problem follow.
public class LerEscreverArquivo {
private static final String NomeArquivoEntrada = "E:\\DesafioProgramacao\\matriculasSemDV.txt";
private static final String NomeArquivoSaida = "E:\\DesafioProgramacao\\matriculasComDV.txt";
public static void main(String[] args) {
FileReader fr = null;
BufferedReader br = null;
BufferedWriter bw = null;
FileWriter fw = null;
try {
//input file
fr = new FileReader(NomeArquivoEntrada);
br = new BufferedReader(fr);
String sCurrentLine;
System.out.println("Início do arquivo.");
while ((sCurrentLine = br.readLine()) != null) {
if (!sCurrentLine.isEmpty()) {
int Total = 0;
int contador = 5;
int resto;
for (int i = 0; i < sCurrentLine.length(); i++) {
int j = Character.digit(sCurrentLine.charAt(i), 10);
Total = Total + (j * contador);
contador = contador - 1;
}
resto = Total / 16;
String decimal = Integer.toHexString(resto);
String DigitoCod=sCurrentLine + "-" + decimal;
//output file
fw = new FileWriter(NomeArquivoSaida);
bw = new BufferedWriter(fw);
bw.write(DigitoCod);
System.out.println(DigitoCod);
}
}
System.out.println("Fim do arquivo.");
} catch (IOException eReader) {
eReader.printStackTrace();
} finally {
try {
if (br != null) {
br.close();
}
if (fr != null) {
fr.close();
}
if(bw != null) {
bw.close();
}
if(fw != null) {
fw.close();
}
} catch (IOException exeReader) {
exeReader.printStackTrace();
}
}
}
}
You are initializing the FileWriter in each iteration of the loop
fw = new FileWriter(NomeArquivoSaida);
bw = new BufferedWriter(fw);
bw.write(DigitoCod);
So your file is essentially recreated on every iteration, discarding the previous contents.
Try moving the following two lines above the loop and your problem should be solved.
fw = new FileWriter(NomeArquivoSaida);
bw = new BufferedWriter(fw);
EDIT
Working code is following:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
public class LerEscreverArquivo{
private static final String NomeArquivoEntrada = "matriculasSemDV.txt";
private static final String NomeArquivoSaida = "matriculasComDV.txt";
public static void main(String[] args) {
FileReader fr = null;
BufferedReader br = null;
BufferedWriter bw = null;
FileWriter fw = null;
try {
// input file
fr = new FileReader(NomeArquivoEntrada);
br = new BufferedReader(fr);
fw = new FileWriter(NomeArquivoSaida);
bw = new BufferedWriter(fw);
String sCurrentLine = "";
System.out.println("Início do arquivo.");
while ((sCurrentLine = br.readLine()) != null) {
if (!sCurrentLine.isEmpty()) {
int Total = 0;
int contador = 5;
int resto;
for (int i = 0; i < sCurrentLine.length(); i++) {
int j = Character.digit(sCurrentLine.charAt(i), 10);
Total = Total + (j * contador);
contador = contador - 1;
}
resto = Total / 16;
String decimal = Integer.toHexString(resto);
String DigitoCod = sCurrentLine + "-" + decimal;
// output file
bw.write(DigitoCod);
bw.newLine(); // one record per line, matching the console output
System.out.println(DigitoCod);
}
}
System.out.println("Fim do arquivo.");
} catch (IOException eReader) {
eReader.printStackTrace();
} finally {
try {
if (br != null) {
br.close();
}
if (fr != null) {
fr.close();
}
if (bw != null) {
bw.close();
}
if (fw != null) {
fw.close();
}
} catch (IOException exeReader) {
exeReader.printStackTrace();
}
}
}
}
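As a side note (not part of the original fix), on Java 7+ the whole finally block can be replaced with try-with-resources, which closes the reader and writer automatically even when an exception is thrown. A minimal sketch of the same structure:
try (BufferedReader br = new BufferedReader(new FileReader(NomeArquivoEntrada));
     BufferedWriter bw = new BufferedWriter(new FileWriter(NomeArquivoSaida))) {
    String sCurrentLine;
    while ((sCurrentLine = br.readLine()) != null) {
        // ... compute DigitoCod from sCurrentLine exactly as above ...
        // bw.write(DigitoCod);
        // bw.newLine();
    }
} catch (IOException e) {
    e.printStackTrace();
}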
I'm moving my data from MongoDB to Neo4j.
So I exported my MongoDB documents to .csv. As you can read here, I have a problem with the uniform array.
So I wrote a Java program to fix this problem.
Here is the .csv exported from MongoDB (note the difference in the uniform array):
_id,official_name,common_name,country,started_by.day,started_by.month,started_by.year,championship,stadium.name,stadium.capacity,palmares.first_prize,palmares.second_prize,palmares.third_prize,palmares.fourth_prize,average_age,squad_value,foreigners,uniform
0,yaDIXxLAOV,WWYWLqPcYM,QsVwiNmeGl,7,9,1479,oYKGgstIMv,qskcxizCkd,8560,10,25,9,29,16,58,6,"[""first_colour"",""second_colour"",""third_colour""]"
Here is how it must look in order to import it into Neo4j:
_id,official_name,common_name,country,started_by.day,started_by.month,started_by.year,championship,stadium.name,stadium.capacity,palmares.first_prize,palmares.second_prize,palmares.third_prize,palmares.fourth_prize,average_age,squad_value,foreigners,uniform.0,uniform.1,uniform.2
0,yaDIXxLAOV,WWYWLqPcYM,QsVwiNmeGl,7,9,1479,oYKGgstIMv,qskcxizCkd,8560,10,25,9,29,16,58,6,first_colour,second_colour,third_colour
My code works, but I have to convert 500k lines of the .csv file and the program is far too slow (it's still running after 20 minutes :/):
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;
public class ConvertireCSV {
public static void main(String[] args) throws IOException {
FileReader f;
f=new FileReader("output.csv");
BufferedReader b;
b=new BufferedReader(f);
String firstLine= b.readLine();
int uniform = firstLine.indexOf("uniform");
firstLine=firstLine.substring(0, uniform);
firstLine = firstLine + "uniform.0,uniform.1,uniform.2\n";
String line="";
String csv="";
while(true) {
line=b.readLine();
if(line==null)
break;
int u = line.indexOf("\"[");
line=line.substring(0, u);
line=line + "first_colour,second_colour,third_colour \n";
csv=csv+line;
}
File file = new File("outputForNeo4j.csv");
if(file.createNewFile()) {
PrintWriter pw = new PrintWriter(file);
pw.println(firstLine + csv);
System.out.println("New file \"outputForNeo4j.csv\" created.");
pw.flush();
pw.close();
}
}
}
How can I make it faster?
Okay, some basic ways to improve your code:
Make sure your variables have the minimal scope required. If you don't need line outside your loop, don't declare it outside your loop.
Concatenation of plain strings is slow in general. Use a StringBuilder to speed things up there.
Why are you buffering the string anyway? Seems like a waste of memory. Just open the output stream to your target file and write the lines to the new file as you process them.
Examples:
I don't think you need an example for the first point.
For the second things could look like this:
...
StringBuilder csv = new StringBuilder();
while(true) {
...
csv.append(line);
}
...
if(file.createNewFile()) {
...
pw.println(firstLine + csv.toString());
...
}
For the third point the rewriting would be a little more extensive:
public static void main(String[] args) throws IOException {
FileReader f;
f=new FileReader("output.csv");
BufferedReader b;
b=new BufferedReader(f);
String firstLine= b.readLine();
int uniform = firstLine.indexOf("uniform");
firstLine=firstLine.substring(0, uniform);
firstLine = firstLine + "uniform.0,uniform.1,uniform.2\n";
File file = new File("outputForNeo4j.csv");
if(!file.createNewFile()) {
// all work would be for nothing! Bailing out.
return;
}
PrintWriter pw = new PrintWriter(file);
pw.print(firstLine);
while(true) {
String line=b.readLine();
if(line==null)
break;
int u = line.indexOf("\"[");
line=line.substring(0, u);
line=line + "first_colour,second_colour,third_colour \n";
pw.print(line);
}
System.out.println("New file \"outputForNeo4j.csv\" created.");
pw.flush();
pw.close();
b.close();
}
csv=csv+line;
String concatenation is an expensive operation. I would suggest using a BufferedWriter.
Something like this:
FileReader f;
f=new FileReader("output.csv");
BufferedReader b;
BufferedWriter out = null; // initialize so the compiler does not complain about an unassigned variable
b=new BufferedReader(f);
try{
out = new BufferedWriter(new FileWriter("outputForNeo4j.csv"));
} catch(Exception e){
//cannot create file
}
System.out.println("New file \"outputForNeo4j.csv\" created.");
String firstLine= b.readLine();
int uniform = firstLine.indexOf("uniform");
firstLine=firstLine.substring(0, uniform);
firstLine = firstLine + "uniform.0,uniform.1,uniform.2\n";
String line="";
String csv="";
out.write(firstLine);
while(true) {
line=b.readLine();
if(line==null)
break;
int u = line.indexOf("\"[");
line=line.substring(0, u);
line=line + "first_colour,second_colour,third_colour \n";
out.write(line);
}
out.flush();
out.close();
b.close();
}
Results (benchmarking four ways of reading the same large file; the benchmark code is below):
test0 : Runs: 241 iterations, average millis = 246
test1 : Runs: 249 iterations, average millis = 118
test2 : Runs: 269 iterations, average millis = 5
test3 : Runs: 241 iterations, average millis = 2
import java.io.*;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.util.Random;
public class Tester {
private static final String filePath = "c:\\bigFile.txt";
//private static final String filePath = "c:\\bigfileNewLine.txt";
private static final int numOfMethods = 4;
private static final int numOfIter = 1000;
public Tester() throws NoSuchMethodException {
System.out.println("Tester.Tester");
int[] milisArr = new int [numOfMethods];
int[] actualRun = new int [numOfMethods];
Random rnd = new Random(System.currentTimeMillis());
Long startMs = 0l, endMs = 0l;
Method[] method = new Method[numOfMethods];
for (int i = 0; i < numOfMethods; i++)
method[i] = this.getClass().getMethod("test" + i);
int testCount = 0;
while (testCount++ < numOfIter) {
int testMethod = rnd.nextInt(numOfMethods);
Method m = method[testMethod];
try {
System.gc();
startMs = System.currentTimeMillis();
String retval = (String) m.invoke(null);
endMs = System.currentTimeMillis();
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
}
milisArr[testMethod] += (endMs - startMs);
actualRun[testMethod]++;
System.out.println("Test name: " + m.getName() + " testCount=" + testCount + " Of " + numOfIter + " iteration, Total time :" + (endMs - startMs) / 1000.0 + " seconds");
}
System.out.println("Test Summery :");
for (int i = 0; i < numOfMethods; i++)
System.out.println("test" + i + " : Runs: " + actualRun[i] + " iterations ,avarage milis = " + milisArr[i]/numOfIter);
}
public static String test0() throws IOException {
InputStream file = getInputStream();
StringBuffer textBuffer = new StringBuffer();
int c;
while ((c = file.read()) != -1)
textBuffer.append((char) c);
file.close();
return textBuffer.toString();
}
public static String test1() throws IOException {
Reader reader = new FileReader(new File(filePath));
BufferedReader br = new BufferedReader(reader);
String line = br.readLine();
String result = line;
while (line != null) {
line = br.readLine();
if (line == null) {
} else {
result = result + "\n" + line;
}
}
br.close();
reader.close();
return result;
}
public static String test2() throws IOException {
byte[] buf = new byte[1024];
int l;
InputStream is = getInputStream();
StringBuffer tmpBuf = new StringBuffer();
while ((l = is.read(buf)) != -1) {
tmpBuf.append(new String(buf, 0, l));
}
is.close();
return tmpBuf.toString();
}
public static String test3() throws IOException {
File source = new File(filePath);
final DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(source)));
final byte[] buffer = new byte[(int) source.length()];
dis.readFully(buffer);
dis.close();
return new String(buffer, "UTF-8");
}
private static InputStream getInputStream() {
try {
return new FileInputStream(filePath);
} catch (FileNotFoundException e) {
e.printStackTrace();
return null;
}
}
public static void main(String[] args) {
try {
new Tester();
} catch (NoSuchMethodException e) {
e.printStackTrace();
}
}
}
Hi, I want to write an array to a file. I can do this, except that I want to write the first 50 objects of the array on the first line, then move to the next line and write another 50 numbers, then the next line, and so on.
At the moment I have this:
if (pause.save) {
Formatter f;
FileWriter fw;
PrintWriter pw = null;
File f2 = new File("docs/save.txt");
FileReader r;
BufferedReader r1;
try{
//f = new Formatter("docs/save.txt");
f2.createNewFile();
fw = new FileWriter(f2);
pw = new PrintWriter(fw);
pw.println(mapwidth);
pw.println(mapheight);
for(int i = 0; i < overview.s.toArray().length; i ++){
pw.println(overview.s.get(i).getID());
}
}catch(IOException e){
e.printStackTrace();
System.out.println("ERROR!!!");
}finally{
pw.close();
}
}
I have been trying to do this for ages but can't figure it out.
You can do this in the following way:
for(int i = 0; i < overview.s.toArray().length; i++){
pw.print(overview.s.get(i).getID() + " "); // separate the values with a space
if((i + 1) % 50 == 0)
pw.println(); // start a new line after every 50 values
}
Just do:
if (pause.save) {
Formatter f;
FileWriter fw;
PrintWriter pw = null;
File f2 = new File("docs/save.txt");
FileReader r;
BufferedReader r1;
try{
//f = new Formatter("docs/save.txt");
f2.createNewFile();
fw = new FileWriter(f2);
pw = new PrintWriter(fw);
pw.println(mapwidth);
pw.println(mapheight);
for(int i = 0; i < overview.s.toArray().length; i ++){
pw.print(overview.s.get(i).getID() + " "); // space-separate the values so they stay readable
if ((i + 1) % 50 == 0) {
pw.println();
}
}
}catch(IOException e){
e.printStackTrace();
System.out.println("ERROR!!!");
}finally{
pw.close();
}
}
I have a file which I would like to read in Java and split this file into n (user input) output files. Here is how I read the file:
int n = 4;
BufferedReader br = new BufferedReader(new FileReader("file.csv"));
try {
String line = br.readLine();
while (line != null) {
line = br.readLine();
}
} finally {
br.close();
}
How do I split the file - file.csv into n files?
Note - Since the number of entries in the file are of the order of 100k, I can't store the file content into an array and then split it and save into multiple files.
Since one file can be very large, each split file could be large as well.
Example:
Source File Size: 5 GB
Num Splits: 5
Destination File Size: 1 GB each (5 files)
There is no way to read such a large split chunk in one go, even if we had that much memory. Instead, for each split we read a fixed-size byte array, which is feasible in terms of both performance and memory.
NumSplits: 10, MaxReadBytes: 8 KB
public static void main(String[] args) throws Exception
{
RandomAccessFile raf = new RandomAccessFile("test.csv", "r");
long numSplits = 10; //from user input, extract it from args
long sourceSize = raf.length();
long bytesPerSplit = sourceSize/numSplits ;
long remainingBytes = sourceSize % numSplits;
int maxReadBufferSize = 8 * 1024; //8KB
for(int destIx=1; destIx <= numSplits; destIx++) {
BufferedOutputStream bw = new BufferedOutputStream(new FileOutputStream("split."+destIx));
if(bytesPerSplit > maxReadBufferSize) {
long numReads = bytesPerSplit/maxReadBufferSize;
long numRemainingRead = bytesPerSplit % maxReadBufferSize;
for(int i=0; i<numReads; i++) {
readWrite(raf, bw, maxReadBufferSize);
}
if(numRemainingRead > 0) {
readWrite(raf, bw, numRemainingRead);
}
}else {
readWrite(raf, bw, bytesPerSplit);
}
bw.close();
}
if(remainingBytes > 0) {
BufferedOutputStream bw = new BufferedOutputStream(new FileOutputStream("split."+(numSplits+1)));
readWrite(raf, bw, remainingBytes);
bw.close();
}
raf.close();
}
static void readWrite(RandomAccessFile raf, BufferedOutputStream bw, long numBytes) throws IOException {
byte[] buf = new byte[(int) numBytes];
int val = raf.read(buf);
if(val != -1) {
bw.write(buf, 0, val); // write only the bytes actually read
}
}
import java.io.*;
import java.util.Scanner;
public class split {
public static void main(String args[])
{
try{
// Reading file and getting no. of files to be generated
String inputfile = "C:/test.txt"; // Source File Name.
double nol = 2000.0; // No. of lines to be split and saved in each output file.
File file = new File(inputfile);
Scanner scanner = new Scanner(file);
int count = 0;
while (scanner.hasNextLine())
{
scanner.nextLine();
count++;
}
System.out.println("Lines in the file: " + count); // Displays no. of lines in the input file.
double temp = (count/nol);
int temp1=(int)temp;
int nof=0;
if(temp1==temp)
{
nof=temp1;
}
else
{
nof=temp1+1;
}
System.out.println("No. of files to be generated :"+nof); // Displays no. of files to be generated.
//---------------------------------------------------------------------------------------------------------
// Actual splitting of file into smaller files
FileInputStream fstream = new FileInputStream(inputfile);
DataInputStream in = new DataInputStream(fstream);
BufferedReader br = new BufferedReader(new InputStreamReader(in));
String strLine;
for (int j=1;j<=nof;j++)
{
FileWriter fstream1 = new FileWriter("C:/New Folder/File"+j+".txt"); // Destination File Location
BufferedWriter out = new BufferedWriter(fstream1);
for (int i=1;i<=nol;i++)
{
strLine = br.readLine();
if (strLine!= null)
{
out.write(strLine);
if(i!=nol)
{
out.newLine();
}
}
}
out.close();
}
in.close();
}catch (Exception e)
{
System.err.println("Error: " + e.getMessage());
}
}
}
Though it's an old question, for reference I am listing the code I used to split large files to any size; it works with any Java version above 1.4.
Sample split and join blocks are shown below:
public void join(String FilePath) {
long leninfile = 0, leng = 0;
int count = 1, data = 0;
try {
File filename = new File(FilePath);
//RandomAccessFile outfile = new RandomAccessFile(filename,"rw");
OutputStream outfile = new BufferedOutputStream(new FileOutputStream(filename));
while (true) {
filename = new File(FilePath + count + ".sp");
if (filename.exists()) {
//RandomAccessFile infile = new RandomAccessFile(filename,"r");
InputStream infile = new BufferedInputStream(new FileInputStream(filename));
data = infile.read();
while (data != -1) {
outfile.write(data);
data = infile.read();
}
leng++;
infile.close();
count++;
} else {
break;
}
}
outfile.close();
} catch (Exception e) {
e.printStackTrace();
}
}
public void split(String FilePath, long splitlen) {
long leninfile = 0, leng = 0;
int count = 1, data;
try {
File filename = new File(FilePath);
//RandomAccessFile infile = new RandomAccessFile(filename, "r");
InputStream infile = new BufferedInputStream(new FileInputStream(filename));
data = infile.read();
while (data != -1) {
filename = new File(FilePath + count + ".sp");
//RandomAccessFile outfile = new RandomAccessFile(filename, "rw");
OutputStream outfile = new BufferedOutputStream(new FileOutputStream(filename));
while (data != -1 && leng < splitlen) {
outfile.write(data);
leng++;
data = infile.read();
}
leninfile += leng;
leng = 0;
outfile.close();
count++;
}
infile.close(); // close the source file once all chunks have been written
} catch (Exception e) {
e.printStackTrace();
}
}
The complete Java code is available via the File Split in Java Program link.
A clean solution that is easy to edit.
Note that this solution loads the entire file into memory.
All lines of the file are read into List<String> rowsOfFile;
edit maxSizeFile to choose the maximum size of a single split file.
public void splitFile(File fileToSplit) throws IOException {
long maxSizeFile = 10000000; // 10 MB
StringBuilder buffer = new StringBuilder((int) maxSizeFile);
int sizeOfRows = 0;
int recurrence = 0;
String fileName;
List<String> rowsOfFile;
rowsOfFile = Files.readAllLines(fileToSplit.toPath(), Charset.defaultCharset());
for (String row : rowsOfFile) {
buffer.append(row).append(System.lineSeparator()); // keep line breaks in the split output
sizeOfRows += row.getBytes(StandardCharsets.UTF_8).length;
if (sizeOfRows >= maxSizeFile) {
fileName = generateFileName(recurrence);
File newFile = new File(fileName);
try (PrintWriter writer = new PrintWriter(newFile)) {
writer.println(buffer.toString());
}
recurrence++;
sizeOfRows = 0;
buffer = new StringBuilder();
}
}
// last rows
if (sizeOfRows > 0) {
fileName = generateFileName(recurrence);
File newFile = new File(fileName);
try (PrintWriter writer = new PrintWriter(newFile)) {
writer.println(buffer.toString());
}
}
Files.delete(fileToSplit.toPath());
}
Method to generate the file name:
public String generateFileName(int numFile) {
String extension = ".txt";
return "myFile" + numFile + extension;
}
Keep a counter for the number of entries. Let's say one entry per line.
step 1: create a new subfile and set counter = 0
step 2: increment the counter as you read each entry from the source file into a buffer
step 3: when the counter reaches the limit on the number of entries you want in each subfile, flush the buffer contents to the subfile and close the subfile
step 4: jump to step 1 as long as there is data left to read in the source file (a minimal sketch of these steps follows below)
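A minimal sketch of those steps, assuming one entry per line (entriesPerFile and the file names are placeholders, not values from the question):
import java.io.*;
public class SplitByEntries {
    public static void main(String[] args) throws IOException {
        int entriesPerFile = 100000; // placeholder limit, one entry per line
        int counter = 0;
        int part = 1;
        try (BufferedReader reader = new BufferedReader(new FileReader("source.txt"))) {
            BufferedWriter writer = new BufferedWriter(new FileWriter("part" + part + ".txt"));
            String line;
            while ((line = reader.readLine()) != null) {
                writer.write(line);
                writer.newLine();
                if (++counter == entriesPerFile) { // limit reached: finish this subfile
                    writer.close();                // flushes and closes the current subfile
                    counter = 0;
                    part++;
                    writer = new BufferedWriter(new FileWriter("part" + part + ".txt"));
                }
            }
            writer.close(); // close the last (possibly partially filled) subfile
        }
    }
}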
There's no need to loop through the file twice. You could estimate the size of each chunk as the source file size divided by the number of chunks needed, then simply stop filling a chunk with data once its size exceeds that estimate (see the sketch below).
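A sketch of that idea, staying line-oriented in a single pass (the method name, "source" parameter and ".part" naming are placeholders): estimate the chunk size as fileSize / n and start a new output file once the current one exceeds the estimate.
static void splitByEstimatedSize(File source, int n) throws IOException {
    long bytesPerChunk = source.length() / n; // estimated target size of each chunk
    int part = 1;
    long written = 0;
    try (BufferedReader reader = new BufferedReader(new FileReader(source))) {
        BufferedWriter writer = new BufferedWriter(new FileWriter(source.getName() + ".part" + part));
        String line;
        while ((line = reader.readLine()) != null) {
            writer.write(line);
            writer.newLine();
            written += line.length() + 1; // rough byte count; exact only for single-byte encodings
            if (written >= bytesPerChunk && part < n) { // keep the last chunk open for the remainder
                writer.close();
                part++;
                written = 0;
                writer = new BufferedWriter(new FileWriter(source.getName() + ".part" + part));
            }
        }
        writer.close();
    }
}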
Here is one that worked for me; I used it to split a 10 GB file. It also lets you add a header and a footer, which is very useful when splitting document-based formats such as XML and JSON, because you need to add the document wrapper in the new split files.
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
public class FileSpliter
{
public static void main(String[] args) throws IOException
{
splitTextFiles("D:\\xref.csx", 750000, "", "", null);
}
public static void splitTextFiles(String fileName, int maxRows, String header, String footer, String targetDir) throws IOException
{
File bigFile = new File(fileName);
int i = 1;
String ext = fileName.substring(fileName.lastIndexOf("."));
String fileNoExt = bigFile.getName().replace(ext, "");
File newDir = null;
if(targetDir != null)
{
newDir = new File(targetDir);
}
else
{
newDir = new File(bigFile.getParent() + "\\" + fileNoExt + "_split");
}
newDir.mkdirs();
try (BufferedReader reader = Files.newBufferedReader(Paths.get(fileName)))
{
String line = null;
int lineNum = 1;
Path splitFile = Paths.get(newDir.getPath() + "\\" + fileNoExt + "_" + String.format("%02d", i) + ext);
BufferedWriter writer = Files.newBufferedWriter(splitFile, StandardOpenOption.CREATE);
while ((line = reader.readLine()) != null)
{
if(lineNum == 1)
{
System.out.print("new file created '" + splitFile.toString());
if(header != null && header.length() > 0)
{
writer.append(header);
writer.newLine();
}
}
writer.append(line);
if (lineNum >= maxRows)
{
if(footer != null && footer.length() > 0)
{
writer.newLine();
writer.append(footer);
}
writer.close();
System.out.println(", " + lineNum + " lines written to file");
lineNum = 1;
i++;
splitFile = Paths.get(newDir.getPath() + "\\" + fileNoExt + "_" + String.format("%02d", i) + ext);
writer = Files.newBufferedWriter(splitFile, StandardOpenOption.CREATE);
}
else
{
writer.newLine();
lineNum++;
}
}
if(lineNum <= maxRows) // early exit
{
if(footer != null && footer.length() > 0)
{
writer.newLine();
lineNum++;
writer.append(footer);
}
}
writer.close();
System.out.println(", " + lineNum + " lines written to file");
}
System.out.println("file '" + bigFile.getName() + "' split into " + i + " files");
}
}
The code below splits a big file into smaller files with fewer lines each.
long linesWritten = 0;
int count = 1;
try {
File inputFile = new File(inputFilePath);
InputStream inputFileStream = new BufferedInputStream(new FileInputStream(inputFile));
BufferedReader reader = new BufferedReader(new InputStreamReader(inputFileStream));
String line = reader.readLine();
String fileName = inputFile.getName();
String outfileName = outputFolderPath + "\\" + fileName;
while (line != null) {
File outFile = new File(outfileName + "_" + count + ".split");
Writer writer = new OutputStreamWriter(new FileOutputStream(outFile));
while (line != null && linesWritten < linesPerSplit) {
writer.write(line);
writer.write(System.lineSeparator()); // keep one input line per output line
line = reader.readLine();
linesWritten++;
}
writer.close();
linesWritten = 0; // reset for the next file
count++; // next file count
}
reader.close();
} catch (Exception e) {
e.printStackTrace();
}
Split a file into multiple chunks (an in-memory operation); here I'm splitting any file into chunks of 500 KB (500000 bytes):
public static List<ByteArrayOutputStream> splitFile(File f) {
List<ByteArrayOutputStream> datalist = new ArrayList<>();
try {
int sizeOfFiles = 500000;
byte[] buffer = new byte[sizeOfFiles];
try (FileInputStream fis = new FileInputStream(f); BufferedInputStream bis = new BufferedInputStream(fis)) {
int bytesAmount = 0;
while ((bytesAmount = bis.read(buffer)) > 0) {
try (OutputStream out = new ByteArrayOutputStream()) {
out.write(buffer, 0, bytesAmount);
out.flush();
datalist.add((ByteArrayOutputStream) out);
}
}
}
} catch (Exception e) {
//get the error
}
return datalist;
}
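For illustration, the in-memory chunks returned by splitFile could then be persisted like this (the part-file naming is an assumption, not part of the answer; run inside a method that declares throws IOException):
List<ByteArrayOutputStream> chunks = splitFile(new File("big.bin"));
int i = 0;
for (ByteArrayOutputStream chunk : chunks) {
    try (FileOutputStream fos = new FileOutputStream("part_" + i++ + ".bin")) {
        chunk.writeTo(fos); // dump one 500 KB chunk to its own file
    }
}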
I am a bit late to answer, but here's how I did it:
Approach:
First I determine how many bytes each of the individual files should contain, then I split the large file by bytes. Only one file chunk's worth of data is loaded into memory at a time.
Example: if a 5 GB file is split into 10 files, then only 500 MB worth of bytes is loaded into memory at a time, held in the buffer variable in the splitBySize method below.
Code Explanation:
The method splitFile first gets the number of bytes each of the individual file chunks should contain by calling the getSizeInBytes method, then it calls the splitBySize method, which splits the large file by size (i.e., maxChunkSize represents the number of bytes each file chunk will contain).
public static List<File> splitFile(File largeFile, int noOfFiles) throws IOException {
return splitBySize(largeFile, getSizeInBytes(largeFile.length(), noOfFiles));
}
public static List<File> splitBySize(File largeFile, int maxChunkSize) throws IOException {
List<File> list = new ArrayList<>();
int numberOfFiles = 0;
try (InputStream in = Files.newInputStream(largeFile.toPath())) {
final byte[] buffer = new byte[maxChunkSize];
int dataRead = in.read(buffer);
while (dataRead > -1) {
list.add(stageLocally(buffer, dataRead));
numberOfFiles++;
dataRead = in.read(buffer);
}
}
System.out.println("Number of files generated: " + numberOfFiles);
return list;
}
private static int getSizeInBytes(long totalBytes, int numberOfFiles) {
if (totalBytes % numberOfFiles != 0) {
totalBytes = ((totalBytes / numberOfFiles) + 1)*numberOfFiles;
}
long x = totalBytes / numberOfFiles;
if (x > Integer.MAX_VALUE){
throw new NumberFormatException("Byte chunk too large");
}
return (int) x;
}
Full Code:
public class StackOverflow {
private static final String INPUT_FILE_PATH = "/Users/malkesingh/Downloads/5MB.zip";
private static final String TEMP_DIRECTORY = "/Users/malkesingh/temp";
public static void main(String[] args) throws IOException {
File input = new File(INPUT_FILE_PATH);
File outPut = fileJoin2(splitFile(input, 5));
try (InputStream in = Files.newInputStream(input.toPath()); InputStream out = Files.newInputStream(outPut.toPath())) {
System.out.println(IOUtils.contentEquals(in, out));
}
}
public static List<File> splitFile(File largeFile, int noOfFiles) throws IOException {
return splitBySize(largeFile, getSizeInBytes(largeFile.length(), noOfFiles));
}
public static List<File> splitBySize(File largeFile, int maxChunkSize) throws IOException {
List<File> list = new ArrayList<>();
int numberOfFiles = 0;
try (InputStream in = Files.newInputStream(largeFile.toPath())) {
final byte[] buffer = new byte[maxChunkSize];
int dataRead = in.read(buffer);
while (dataRead > -1) {
list.add(stageLocally(buffer, dataRead));
numberOfFiles++;
dataRead = in.read(buffer);
}
}
System.out.println("Number of files generated: " + numberOfFiles);
return list;
}
private static int getSizeInBytes(long totalBytes, int numberOfFiles) {
if (totalBytes % numberOfFiles != 0) {
totalBytes = ((totalBytes / numberOfFiles) + 1)*numberOfFiles;
}
long x = totalBytes / numberOfFiles;
if (x > Integer.MAX_VALUE){
throw new NumberFormatException("Byte chunk too large");
}
return (int) x;
}
private static File stageLocally(byte[] buffer, int length) throws IOException {
File outPutFile = File.createTempFile("temp-", "split", new File(TEMP_DIRECTORY));
try(FileOutputStream fos = new FileOutputStream(outPutFile)) {
fos.write(buffer, 0, length);
}
return outPutFile;
}
public static File fileJoin2(List<File> list) throws IOException {
File outPutFile = File.createTempFile("temp-", "unsplit", new File(TEMP_DIRECTORY));
FileOutputStream fos = new FileOutputStream(outPutFile);
for (File file : list) {
Files.copy(file.toPath(), fos);
}
fos.close();
return outPutFile;
}
}
import java.util.*;
import java.io.*;
public class task13 {
public static void main(String[] args)throws IOException{
Scanner s =new Scanner(System.in);
System.out.print("Enter path:");
String a=s.next();
File f=new File(a+".txt");
Scanner st=new Scanner(f);
System.out.println(f.canRead()+"\n"+f.canWrite());
long l=f.length();
System.out.println("Length is:"+l);
System.out.print("Enter no.of partitions:");
int p=s.nextInt();
long x=l/p;
st.useDelimiter("\\Z");
String t=st.next();
int j=0;
System.out.println("Each File Length is:"+x);
for(int i=1;i<=p;i++){
File ft=new File(a+"-"+i+".txt");
ft.createNewFile();
int g=(j*(int)x);
int h=(j+1)*(int)x;
if(g<=l&&h<=l){
FileWriter fw=new FileWriter(a+"-"+i+".txt");
String v=t.substring(g,h);
fw.write(v);
j++;
fw.close();
}
}
}
}
I am trying to add almost 2 lakh (200,000) lines to a particular txt file (actually a conf file) with a Java program. But it takes almost 112 minutes when the number is only 189,000!
I wrote the following code for that:
import java.io.*;
public class Fileshandling_example {
static long s1;
static long e1;
static long e2;
static Fileshandling_example fhe= new Fileshandling_example();
public static void main(String args[]) {
try {
s1 = System.nanoTime();
File file1 = new File("\example\mandar.txt");
LineNumberReader lnr1 = new LineNumberReader(new FileReader(file1));
BufferedReader br1 = new BufferedReader(new FileReader(file1));
lnr1.skip(Long.MAX_VALUE);
int a = 1;
StringBuffer sb1 = new StringBuffer("[stations]");
String sCurrentline1 = br1.readLine();
while ((sCurrentline1 = br1.readLine()) != null) {
a++;
if (sCurrentline1.contentEquals(sb1) == true) {
int count = a;
int arraycount = 100000;
for(int i =0; i< (arraycount+1); i++){
if(0 == (i%10000)){
e2 = System.nanoTime();
System.out.println("Time = "+(e2-s1));
}
String abc ="extern => 00"+(1000 + (arraycount-i))+",1,Wait(0.05)";
fhe.insertintoExtensions(file1, (count+1),abc);
}
}
}
} catch (IOException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
e1 = System.nanoTime();
System.out.println("Time = "+(e1-s1));
}
public void insertintoExtensions(File inFile1, int lineno, String s1)throws Exception {
File outFile1 = new File("\example\111.tmp");
FileInputStream fis = new FileInputStream(inFile1);
BufferedReader in = new BufferedReader(new InputStreamReader(fis));
FileOutputStream fos = new FileOutputStream(outFile1);
PrintWriter out = new PrintWriter(fos);
String thisLine = "";
int i =1;
while ((thisLine = in.readLine()) != null) {
if(i == lineno) out.println(s1);
out.println(thisLine);
i++;
}
out.flush();
out.close();
in.close();
inFile1.delete();
outFile1.renameTo(inFile1);
}
}
Can anyone help me find where I went wrong?
I asked a similar question on CodeRanch, but I get clues very fast here, so I am asking here as well.
Sorry for that (cross-forum asking).
Thanks.
You loop 100,000 times for every '[stations]' found in "\example\mandar.txt":
if (sCurrentline1.contentEquals(sb1) == true) {
int count = a;
int arraycount = 100000;
for(int i =0; i< (arraycount+1); i++){
and call fhe.insertintoExtensions, which loops over "\example\mandar.txt" again, copying either the content of each line or the content of the s1 parameter, until the target line number is reached:
while ((thisLine = in.readLine()) != null) {
if(i == lineno) out.println(s1);
out.println(thisLine);
i++;
}
Try to improve your code and use a BufferedWriter instead of a PrintWriter. A sketch of the idea is below.
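A minimal sketch of that idea (file names follow the question, with backslashes doubled so the paths compile; the ordering of the generated lines may need adjusting to match the original insertintoExtensions behaviour): read the source once, and when the [stations] line is found, write all generated extensions in that same pass, instead of rewriting the whole file for every inserted line.
try (BufferedReader in = new BufferedReader(new FileReader("\\example\\mandar.txt"));
     BufferedWriter out = new BufferedWriter(new FileWriter("\\example\\mandar.tmp"))) {
    String line;
    while ((line = in.readLine()) != null) {
        out.write(line);
        out.newLine();
        if (line.contentEquals("[stations]")) {
            // write every generated extension here, in a single pass over the file
            for (int i = 0; i <= 100000; i++) {
                out.write("extern => 00" + (1000 + i) + ",1,Wait(0.05)");
                out.newLine();
            }
        }
    }
}
// then delete mandar.txt and rename mandar.tmp to it, as insertintoExtensions already does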