I am trying to copy files from one Windows server (server1) to another Windows server (server2) and I am not sure where to put the try/catch block. I want to inform the user, either through a popup or a message in a text area, whenever server1 or server2 shuts down while the copy is in progress. Here is my SwingWorker code. Thanks in advance.
class CopyTask extends SwingWorker<Void, Integer>
{
private File source;
private File target;
private long totalBytes = 0;
private long copiedBytes = 0;
public CopyTask(File src, File dest)
{
this.source = src;
this.target = dest;
progressAll.setValue(0);
progressCurrent.setValue(0);
}
@Override
public Void doInBackground() throws Exception
{
ta.append("Retrieving info ... ");
retrieveTotalBytes(source);
ta.append("Done!\n");
copyFiles(source, target);
return null;
}
@Override
public void process(List<Integer> chunks)
{
for(int i : chunks)
{
progressCurrent.setValue(i);
}
}
@Override
public void done()
{
setProgress(100);
}
private void retrieveTotalBytes(File sourceFile)
{
File[] files = sourceFile.listFiles();
for(File file : files)
{
if(file.isDirectory()) retrieveTotalBytes(file);
else totalBytes += file.length();
}
}
private void copyFiles(File sourceFile, File targetFile) throws IOException
{
if(sourceFile.isDirectory())
{
if(!targetFile.exists()) targetFile.mkdirs();
String[] filePaths = sourceFile.list();
for(String filePath : filePaths)
{
File srcFile = new File(sourceFile, filePath);
File destFile = new File(targetFile, filePath);
copyFiles(srcFile, destFile);
}
}
else
{
ta.append("Copying " + sourceFile.getAbsolutePath() + " to " + targetFile.getAbsolutePath() ); //appends to textarea
bis = new BufferedInputStream(new FileInputStream(sourceFile));
bos = new BufferedOutputStream(new FileOutputStream(targetFile));
long fileBytes = sourceFile.length();
long soFar = 0;
int theByte;
while((theByte = bis.read()) != -1)
{
bos.write(theByte);
setProgress((int) (copiedBytes++ * 100 / totalBytes));
publish((int) (soFar++ * 100 / fileBytes));
}
bis.close();
bos.close();
publish(100);
}
}
Where is the line on which the exception can occur? That is the first thing I locate for any exception.
Generally, if your modules are small, you can wrap the try around all the real code in the module and catch the exceptions at the end, especially if the exception is fatal. Then you can log the exception and return an error message/status to the user.
However, the strategy is different if the exception is not fatal. In this case you'll have to handle it right where the connection exception is thrown so you can seamlessly resume when the connection returns. Of course, this is a little more work.
EDIT - you probably want bis.close() and bos.close() inside a finally block to ensure they get closed. It may be pedantic but it seems prudent.
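As a rough sketch of both points (assuming the ta text area from the question, plus java.util.concurrent.ExecutionException and javax.swing.JOptionPane imports): leave doInBackground() as it is so copyFiles() can throw, keep the stream handling in try/finally as noted in the EDIT, and surface the failure to the user from done(), which runs on the EDT, by calling get():
@Override
public void done()
{
    setProgress(100);
    try
    {
        get(); // rethrows anything doInBackground() threw, wrapped in ExecutionException
    }
    catch(InterruptedException e)
    {
        Thread.currentThread().interrupt();
    }
    catch(ExecutionException e)
    {
        // e.g. an IOException because one of the servers disappeared mid-copy
        ta.append("Copy failed: " + e.getCause() + "\n");
        JOptionPane.showMessageDialog(null,
                "Copying was interrupted: " + e.getCause(),
                "Copy error", JOptionPane.ERROR_MESSAGE);
    }
}
For the non-fatal case described above you would instead catch the IOException around the read/write loop itself, so the copy can wait and resume when the connection comes back.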
I am using Zip4J to extract a zip file and that part works. However, I want to use the progress monitor provided by Zip4J and have not been able to get it working.
The documentation only says that run-in-thread mode should be set to true. I did that, and my console just hangs on the command line. Is there a working example of extractAll() with the progress monitor?
public String unzipFile(String sourceFilePath, String extractionPath) {
String extractionDirectory = "";
FileHeader fileHeader = null;
if (FileUtility.isPathExist(sourceFilePath) && FileUtility.isPathExist(extractionPath)) {
try {
ZipFile zipFile = new ZipFile(sourceFilePath);
LOG.info("File Extraction started");
List<FileHeader> fileHeaderList = zipFile.getFileHeaders();
if (fileHeaderList.size() > 0)
fileHeader = (FileHeader) fileHeaderList.get(0);
if (fileHeader != null)
extractionDirectory = splitFileName(fileHeader.getFileName());
long totalPercentage = 235;
long startTime = System.currentTimeMillis();
zipFile.extractAll(extractionPath);
LOG.info("File Extraction completed.");
System.out.println();
} catch (ZipException e) {
LOG.error("Extraction Exception ->\n" + e.getMessage());
}
} else {
LOG.error("Either source path or extraction path is not exist.");
}
return extractionDirectory;
}
Don't know, it works fine, provided you add enough files that there is actually some progress to see. I added some really fat ones for the purpose.
@Test
public void testExtractAllDeflateAndNoEncryptionExtractsSuccessfully() throws IOException {
ZipFile zipFile = new ZipFile(generatedZipFile);
List<File> toAdd = Arrays.asList(
getTestFileFromResources("sample_text1.txt"),
getTestFileFromResources("sample_text_large.txt"),
getTestFileFromResources("OrccTutorial.pdf"),
getTestFileFromResources("introduction-to-automata-theory.pdf"),
getTestFileFromResources("thomas.pdf")
);
zipFile.addFiles(toAdd);
zipFile.setRunInThread(true);
zipFile.extractAll(outputFolder.getPath());
ProgressMonitor mon = zipFile.getProgressMonitor();
while (mon.getState() == BUSY) {
System.out.println(zipFile.getProgressMonitor().getPercentDone());
try {
Thread.sleep(10);
} catch (InterruptedException e) {
throw new RuntimeException(e);
}
}
ZipFileVerifier.verifyFolderContentsSameAsSourceFiles(outputFolder);
verifyNumberOfFilesInOutputFolder(outputFolder, 5);
}
testAddFilesWithProgressMonitor.java in the project's test cases shows how to use ProgressMonitor.
I'm working on a file copying application which copies files from a client machine to a network folder (UNC path). The client and the network folder are connected over a 10 Gbps link. The traditional stream/buffer approach could only use up to 250 Mbps, which is why I started using NIO methods. Both Files.copy() and transferFrom() could use up to 6 Gbps of bandwidth, which is sufficient for now. But the problem is that neither method reports progress, and I need to display the file copying progress in my application.
Then I found the ReadableByteChannel interface to track the upload progress. But after implementing it, the upload speed dropped to 100 Mbps. Not sure if I implemented it correctly.
OS-level copying (Ctrl+C and Ctrl+V) achieves 6 Gbps bandwidth utilization. How can I achieve the same with a Java method while monitoring progress?
public class AppTest {
/**
* @param args the command line arguments
*/
public static void main(String[] args) {
File source = new File(args[0]);
File dest = new File(args[1] + File.separator + source.getName());
long startTime = System.currentTimeMillis();
try {
if (args[2].equalsIgnoreCase("s")) {
copyUsingStream(source, dest, args.length > 3 ? Integer.parseInt(args[3]) : 32 * 1024);
} else if (args[2].equalsIgnoreCase("fp")) {
copyUsingFileChannelWithProgress(source, dest);
} else if (args[2].equalsIgnoreCase("f")){
copyUsingFileChannels(source, dest);
} else if (args[2].equalsIgnoreCase("j")) {
copyUsingFilescopy(source, dest);
} else {
System.out.println("Unknown copy option.");
}
} catch (Exception e) {
e.printStackTrace();
}
System.out.println("Completed in " + (System.currentTimeMillis() - startTime));
}
private static void copyUsingStream(File source, File dest, int buf_size) throws IOException {
System.out.println("Copying using feeder code...");
System.out.println("Buffer Size : " + buf_size);
FileInputStream sourceFileIS = new FileInputStream(source);
FileOutputStream srvrFileOutStrm = new FileOutputStream(dest);
byte[] buf = new byte[buf_size];
int dataReadLen;
while ((dataReadLen = sourceFileIS.read(buf)) > 0) {
srvrFileOutStrm.write(buf, 0, dataReadLen);
}
srvrFileOutStrm.close();
sourceFileIS.close();
}
private static void copyUsingFileChannels(File source, File dest)
throws IOException {
System.out.println("Copying using filechannel...");
FileChannel inputChannel = null;
FileChannel outputChannel = null;
try {
inputChannel = new FileInputStream(source).getChannel();
outputChannel = new FileOutputStream(dest).getChannel();
outputChannel.transferFrom(inputChannel, 0, inputChannel.size());
} finally {
inputChannel.close();
outputChannel.close();
}
}
private static void copyUsingFilescopy(File source, File dest) throws IOException{
Files.copy(source.toPath(), dest.toPath());
}
interface ProgressCallBack {
public void callback(CallbackByteChannel rbc, double progress);
}
static class CallbackByteChannel implements ReadableByteChannel {
ProgressCallBack delegate;
long size;
ReadableByteChannel rbc;
long sizeRead;
CallbackByteChannel(ReadableByteChannel rbc, long sizeRead, long expectedSize, ProgressCallBack delegate) {
this.delegate = delegate;
this.sizeRead = sizeRead;
this.size = expectedSize;
this.rbc = rbc;
}
@Override
public void close() throws IOException {
rbc.close();
}
public long getReadSoFar() {
return sizeRead;
}
@Override
public boolean isOpen() {
return rbc.isOpen();
}
@Override
public int read(ByteBuffer bb) throws IOException {
int n;
double progress;
if ((n = rbc.read(bb)) > 0) {
sizeRead += n;
progress = size > 0 ? (double) sizeRead / (double) size * 100.0 : -1.0;
delegate.callback(this, progress);
}
return n;
}
}
private static void copyUsingFileChannelWithProgress(File sourceFile, File destFile) throws IOException {
ProgressCallBack progressCallBack = new ProgressCallBack() {
@Override
public void callback(CallbackByteChannel rbc, double progress) {
// publish((int)progress);
}
};
FileOutputStream fos = null;
FileChannel sourceChannel = null;
sourceChannel = new FileInputStream(sourceFile).getChannel();
ReadableByteChannel rbc = new CallbackByteChannel(sourceChannel, 0, sourceFile.length(), progressCallBack);
fos = new FileOutputStream(destFile);
fos.getChannel().transferFrom(rbc, 0, sourceFile.length());
if (sourceChannel.isOpen()) {
sourceChannel.close();
}
fos.close();
}
}
Use transferFrom() in a loop with a large chunk size that is still smaller than the file size. You will have to trade off some speed for progress indication here; you will probably want to make the chunks at least 1 MB to retain speed.
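A rough sketch of that approach, assuming a plain file-to-file copy (the ProgressListener interface below is a hypothetical callback of your own, not part of NIO; wire it to publish() or whatever your UI uses):
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;

public class ChunkedTransferCopy {

    // Hypothetical callback for progress updates.
    public interface ProgressListener {
        void progress(long copiedBytes, long totalBytes);
    }

    // Copies with transferFrom() in large chunks so progress can be reported
    // between chunks instead of on every byte read.
    public static void copy(File source, File dest, long chunkSize, ProgressListener listener) throws IOException {
        FileInputStream fis = new FileInputStream(source);
        FileOutputStream fos = new FileOutputStream(dest);
        try {
            FileChannel in = fis.getChannel();
            FileChannel out = fos.getChannel();
            long size = in.size();
            long position = 0;
            while (position < size) {
                long transferred = out.transferFrom(in, position, Math.min(chunkSize, size - position));
                if (transferred <= 0) {
                    break; // source exhausted or nothing could be transferred
                }
                position += transferred;
                listener.progress(position, size);
            }
        } finally {
            fis.close();
            fos.close();
        }
    }

    public static void main(String[] args) throws IOException {
        copy(new File(args[0]), new File(args[1]), 16L * 1024 * 1024, new ProgressListener() {
            @Override
            public void progress(long copiedBytes, long totalBytes) {
                System.out.printf("%.1f%%%n", copiedBytes * 100.0 / totalBytes);
            }
        });
    }
}
Each transferFrom() call is still one large kernel-side transfer, so with chunks in the tens of megabytes you keep most of the throughput while still getting periodic progress callbacks.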
I have created a game in Android. I have written a class for input/output with the preferred install location set to external. I want to ask some basic questions. First of all, the file I use is a .txt (I know it's not the best way to save data, but I use it for testing). The strange part is that when the game is over it should automatically save the user's high scores, but it does not, so when I close the app and restart it the high scores have disappeared. I would also like to know what the preferred file type for saving settings/high scores/coins etc. (hopefully secured) is. Lastly, I debug the game using a Nexus 5, which does not have external storage (it should be stored locally though). This is my code, thanks in advance :).
public class AndroidFileIO implements FileIO {
Context context;
AssetManager assets;
String externalStoragePath;
public AndroidFileIO(Context context) {
this.context = context;
this.assets = context.getAssets();
this.externalStoragePath = Environment.getExternalStorageDirectory()
.getAbsolutePath() + File.separator;
}
public InputStream readAsset(String fileName) throws IOException {
return assets.open(fileName);
}
public InputStream readFile(String fileName) throws IOException {
return new FileInputStream(externalStoragePath + fileName);
}
public OutputStream writeFile(String fileName) throws IOException {
return new FileOutputStream(externalStoragePath + fileName);
}
public SharedPreferences getPreferences() {
return PreferenceManager.getDefaultSharedPreferences(context);
}
}
My game class has this method:
public FileIO getFileIO() {
return fileIO;
}
This is the way I load the file:
Settings.load(game.getFileIO());
And finally, my save/load methods of the Settings class:
public static void load(FileIO files) {
BufferedReader in = null;
try {
in = new BufferedReader(new InputStreamReader(
files.readFile("mrnom.txt")));
soundEnabled = Boolean.parseBoolean(in.readLine());
for (int i = 0; i < 5; i++) {
highscores[i] = Integer.parseInt(in.readLine());
}
} catch (IOException e) {
// :( It's ok we have defaults
} catch (NumberFormatException e) {
// :/ It's ok, defaults save our day
} finally {
try {
if (in != null)
in.close();
} catch (IOException e) {
}
}
}
public static void save(FileIO files) {
BufferedWriter out = null;
try {
out = new BufferedWriter(new OutputStreamWriter(
files.writeFile("mrnom.txt")));
out.write(Boolean.toString(soundEnabled));
for (int i = 0; i < 5; i++) {
out.write(Integer.toString(highscores[i]));
}
} catch (IOException e) {
} finally {
try {
if (out != null)
out.close();
} catch (IOException e) {
}
}
}
Here is where save is called:
private void updateGameOver(List<TouchEvent> touchEvents) {
int len = touchEvents.size();
for(int i = 0; i < len; i++) {
TouchEvent event = touchEvents.get(i);
if(event.type == TouchEvent.TOUCH_UP) {
if(event.x >= 128 && event.x <= 192 &&
event.y >= 200 && event.y <= 264) {
if(Settings.soundEnabled)
Assets.click.play(1);
//debug begin
FileIO fileIO = game.getFileIO();
Settings.save(fileIO);
//debug end
game.setScreen(new MainMenuScreen(game));
return;
}
}
}
}
Your issue is in the save method when you write the strings to the out reference. You are not saving a value per line, but are later reading a value per line in your load method. With the current code you save the following in your mrnom.txt file: true10203040 instead of true\n10\n20\n30\n40.
To fix this, one way is to change:
out.write(Boolean.toString(soundEnabled));
to
out.write(Boolean.toString(soundEnabled) + "\n");
AND
out.write(Integer.toString(highscores[i]));
to
out.write(Integer.toString(highscores[i]) + "\n");
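As a small alternative sketch of the same fix, you can let BufferedWriter insert the separators with newLine() (BufferedReader.readLine() accepts either form, so load() works unchanged):
out.write(Boolean.toString(soundEnabled));
out.newLine();
for (int i = 0; i < 5; i++) {
    out.write(Integer.toString(highscores[i]));
    out.newLine();
}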
I have a huge (>5 GB) CSV file in the format:
username,transaction
I want as output a separate CSV file for each user, containing all of that user's transactions in the same format. I have a few ideas in mind, but I want to hear other ideas for an effective (fast and memory-efficient) implementation.
Here is what I have done so far. The first test reads/processes/writes in a single thread; the second test uses many threads. Performance is not that good, so I think I'm doing something wrong. Please correct me.
public class BatchFileReader {
private ICsvBeanReader beanReader;
private double total;
private String[] header;
private CellProcessor[] processors;
private DataTransformer<HashMap<String, List<LoginDto>>> processor;
private boolean hasMoreRecords = true;
public BatchFileReader(String file, DataTransformer<HashMap<String, List<LoginDto>>> processor) {
try {
this.processor = processor;
this.beanReader = new CsvBeanReader(new FileReader(file), CsvPreference.STANDARD_PREFERENCE);
header = CSVUtils.getHeader(beanReader.getHeader(true));
processors = CSVUtils.getProcessors();
} catch (IOException e) {
e.printStackTrace();
}
}
public void read() {
try {
readFile();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (beanReader != null) {
try {
beanReader.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
private void readFile() throws IOException {
while (hasMoreRecords) {
long start = System.currentTimeMillis();
HashMap<String, List<LoginDto>> usersBatch = readBatch();
long end = System.currentTimeMillis();
System.out.println("Reading batch for " + ((end - start) / 1000f) + " seconds.");
total +=((end - start)/ 1000f);
if (processor != null && !usersBatch.isEmpty()) {
processor.transform(usersBatch);
}
}
System.out.println("total = " + total);
}
private HashMap<String, List<LoginDto>> readBatch() throws IOException {
HashMap<String, List<LoginDto>> users = new HashMap<String, List<LoginDto>>();
int readLoginCount = 0;
while (readLoginCount < CONFIG.READ_BATCH_SIZE) {
LoginDto login = beanReader.read(LoginDto.class, header, processors);
if (login != null) {
if (!users.containsKey(login.getUsername())) {
List<LoginDto> logins = new LinkedList<LoginDto>();
users.put(login.getUsername(), logins);
}
users.get(login.getUsername()).add(login);
readLoginCount++;
} else {
hasMoreRecords = false;
break;
}
}
return users;
}
}
public class BatchFileWriter<T> {
private final String file;
private final List<T> processedData;
public BatchFileWriter(final String file, List<T> processedData) {
this.file = file;
this.processedData = processedData;
}
public void write() {
try {
writeFile(file, processedData);
} catch (IOException e) {
e.printStackTrace();
} finally {
}
}
private void writeFile(final String file, final List<T> processedData) throws IOException {
System.out.println("START WRITE " + " " + file);
FileWriter writer = new FileWriter(file, true);
long start = System.currentTimeMillis();
for (T record : processedData) {
writer.write(record.toString());
writer.write("\n");
}
writer.flush();
writer.close();
long end = System.currentTimeMillis();
System.out.println("Writing in file " + file + " complete for " + ((end - start) / 1000f) + " seconds.");
}
}
public class LoginsTest {
private static final ExecutorService executor = Executors.newSingleThreadExecutor();
private static final ExecutorService procExec = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() + 1);
@Test
public void testSingleThreadCSVtoCSVSplit() throws InterruptedException, ExecutionException {
long start = System.currentTimeMillis();
DataTransformer<HashMap<String, List<LoginDto>>> simpleSplitProcessor = new DataTransformer<HashMap<String, List<LoginDto>>>() {
@Override
public void transform(HashMap<String, List<LoginDto>> data) {
for (String field : data.keySet()) {
new BatchFileWriter<LoginDto>(field + ".csv", data.get(field)).write();
}
}
};
BatchFileReader reader = new BatchFileReader("loadData.csv", simpleSplitProcessor);
reader.read();
long end = System.currentTimeMillis();
System.out.println("TOTAL " + ((end - start)/ 1000f) + " seconds.");
}
@Test
public void testMultiThreadCSVtoCSVSplit() throws InterruptedException, ExecutionException {
long start = System.currentTimeMillis();
System.out.println(start);
final DataTransformer<HashMap<String, List<LoginDto>>> simpleSplitProcessor = new DataTransformer<HashMap<String, List<LoginDto>>>() {
@Override
public void transform(HashMap<String, List<LoginDto>> data) {
System.out.println("transform");
processAsync(data);
}
};
final CountDownLatch readLatch = new CountDownLatch(1);
executor.execute(new Runnable() {
@Override
public void run() {
BatchFileReader reader = new BatchFileReader("loadData.csv", simpleSplitProcessor);
reader.read();
System.out.println("read latch count down");
readLatch.countDown();
}});
System.out.println("read latch before await");
readLatch.await();
System.out.println("read latch after await");
procExec.shutdown();
executor.shutdown();
long end = System.currentTimeMillis();
System.out.println("TOTAL " + ((end - start)/ 1000f) + " seconds.");
}
private void processAsync(final HashMap<String, List<LoginDto>> data) {
procExec.execute(new Runnable() {
@Override
public void run() {
for (String field : data.keySet()) {
writeASync(field, data.get(field));
}
}
});
}
private void writeASync(final String field, final List<LoginDto> data) {
procExec.execute(new Runnable() {
@Override
public void run() {
new BatchFileWriter<LoginDto>(field + ".csv", data).write();
}
});
}
}
Would it not be better to use unix commands to sort and then split the original file?
Something like: cat txn.csv | sort > txn-sorted.csv
From there, get a listing of the unique usernames via grep, and then grep the sorted file for each username.
If you know Camel already, I'd write a simple Camel route to:
Read line from file
Parse the line
Write to the correct output file
It's a very simple route, but if you want it to be as fast as possible it is then trivially easy to make it multithreaded.
For example, your route would look something like:
from("file:/myfile.csv")
.beanRef("lineParser")
.to("seda:internal-queue");
from("seda:internal-queue")
.concurrentConsumers(5)
.to("fileWriter");
If you don't know Camel, then it's probably not worth learning it just for this one task. However, you are probably going to need to make it multithreaded to get the maximum performance. You'll have to experiment with where best to put the threading, as it will depend on which parts of the operation are slowest.
The multithreading will use up more memory so you'll need to balance memory efficiency against performance.
I would open/append a new output file for each user. If you wanted to minimize memory usage and incur more I/O overhead, you could do something like the following, though you'd probably want to use a real CSV parser like Super CSV (http://supercsv.sourceforge.net/index.html):
Scanner s = new Scanner(new File("/my/dir/users-and-transactions.txt"));
while (s.hasNextLine()) {
String line = s.nextLine();
String[] tokens = line.split(",");
String user = tokens[0];
String transaction = tokens[1];
PrintStream out = new PrintStream(new FileOutputStream("/my/dir/" + user, true));
out.println(transaction);
out.close();
}
s.close();
If you've got a reasonable amount of memory, you could create a Map of user name to OutputStream. Each time you see a user string, you could get the existing OutputStream for that user name or create a new one if none exists.
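A rough sketch of that idea (the file names and paths here are placeholders, and error handling is minimal): keep a map from user name to an open writer so each output file is opened only once, and close them all at the end.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Scanner;

public class SplitByUser {
    public static void main(String[] args) throws IOException {
        Map<String, BufferedWriter> writers = new HashMap<String, BufferedWriter>();
        Scanner s = new Scanner(new File("/my/dir/users-and-transactions.txt"));
        try {
            while (s.hasNextLine()) {
                String[] tokens = s.nextLine().split(",", 2);
                if (tokens.length < 2) continue; // skip malformed lines
                String user = tokens[0];
                BufferedWriter out = writers.get(user);
                if (out == null) { // first time we see this user: open their file
                    out = new BufferedWriter(new FileWriter("/my/dir/" + user + ".csv", true));
                    writers.put(user, out);
                }
                out.write(user + "," + tokens[1]);
                out.newLine();
            }
        } finally {
            s.close();
            for (BufferedWriter out : writers.values()) {
                out.close();
            }
        }
    }
}
The trade-off is one open file handle per distinct user, so with a very large number of users you can hit the OS file-descriptor limit; a bounded (LRU) cache of writers is a common compromise.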
I am trying to copy folders and files, which is working fine, but I need help on how to filter out a single folder and copy the rest of the folders. For example, I have directories like carsfolder and truckfolder (C:\vehicle\carsfolder and C:\vehicle\truckfolder). When I use the code below, it copies both carsfolder and truckfolder, but I want to copy only carsfolder. How can I do that? Your help is highly appreciated. (Using Swing and Java 1.6)
class CopyTask extends SwingWorker<Void, Integer>
{
private File source;
private File target;
private long totalBytes = 0;
private long copiedBytes = 0;
public CopyTask(File src, File dest)
{
this.source = src;
this.target = dest;
progressAll.setValue(0);
}
@Override
public Void doInBackground() throws Exception
{
ta.append("Retrieving info ... "); //append to TextArea
retrieveTotalBytes(source);
ta.append("Done!\n");
copyFiles(source, target);
return null;
}
@Override
public void process(List<Integer> chunks)
{
for(int i : chunks)
{
}
}
@Override
public void done()
{
setProgress(100);
}
private void retrieveTotalBytes(File sourceFile)
{
try
{
File[] files = sourceFile.listFiles();
for(File file : files)
{
if(file.isDirectory()) retrieveTotalBytes(file);
else totalBytes += file.length();
}
}
catch(Exception ee)
{
}
}
private void copyFiles(File sourceFile, File targetFile) throws IOException
{
if(sourceFile.isDirectory())
{
try{
if(!targetFile.exists()) targetFile.mkdirs();
String[] filePaths = sourceFile.list();
for(String filePath : filePaths)
{
File srcFile = new File(sourceFile, filePath);
File destFile = new File(targetFile, filePath);
copyFiles(srcFile, destFile);
}
}
catch(Exception ie)
{
}
}
else
{
try
{
ta.append("Copying " + sourceFile.getAbsolutePath() + " to " + targetFile.getAbsolutePath() );
bis = new BufferedInputStream(new FileInputStream(sourceFile));
bos = new BufferedOutputStream(new FileOutputStream(targetFile));
long fileBytes = sourceFile.length();
long soFar = 0;
int theByte;
while((theByte = bis.read()) != -1)
{
bos.write(theByte);
setProgress((int) (copiedBytes++ * 100 / totalBytes));
publish((int) (soFar++ * 100 / fileBytes));
}
bis.close();
bos.close();
publish(100);
ta.append(" Done!\n");
}
catch(Exception excep)
{
setProgress(0);
bos.flush();
bis.close();
bos.close();
}
finally{
try {
bos.flush();
}
catch (Exception e) {
}
try {
bis.close();
}
catch (Exception e) {
}
try {
bos.close();
}
catch (Exception e) {
}
}
}
}
}
Maybe you can introduce a regex or list of regexes that specify which files and dirs to exclude?
For example, to exclude truckfolder, use an "exclusion" regex like "C:\\vehicle\\truckfolder.*".
Then, in your code, before you copy anything, check to make sure the absolute path of the sourcefile doesn't match the exclusion regex(s).
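A minimal sketch of that check inside copyFiles() — the exclusionPattern field is an assumption, compiled once from whatever exclusion regex you configure (java.util.regex.Pattern import assumed; note the backslashes are doubled once more in the Java string so the regex engine sees literal ones):
private final Pattern exclusionPattern =
        Pattern.compile("C:\\\\vehicle\\\\truckfolder.*", Pattern.CASE_INSENSITIVE);

private void copyFiles(File sourceFile, File targetFile) throws IOException
{
    // Skip anything whose absolute path matches the exclusion regex.
    if(exclusionPattern.matcher(sourceFile.getAbsolutePath()).matches())
    {
        return;
    }
    // ... existing directory/file handling from the question ...
}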