Currently I'm trying to write to a txt file with Java's Files class. The task should run every 5 seconds and is scheduled by a ScheduledExecutorService. For some time it does its job properly, but after a random amount of time my program exits without any warning or error. I have tried to reproduce this, but it seems to occur at random. I've also tried using a PrintWriter, which eventually led to the same behaviour.
Thanks in advance!
Here is my code:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
public class FileWritingClass
{
public static void main(String[] args) throws IOException
{
ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
scheduler.scheduleAtFixedRate(
new Runnable() {
long i;
String str = "";
Path path = Paths.get("TestFile.txt");
@Override
public void run() {
str = LocalTime.now().toString() + ": still alive after " + i + " times.";
i++;
try
{
System.out.println(str);
Files.write(path, (str + System.lineSeparator()).getBytes(), StandardOpenOption.APPEND);
} catch (IOException e) {
e.printStackTrace();
}
}
},
500 /* Startdelay */,
5000 /* period */,
TimeUnit.MILLISECONDS );
}
}
The executor is silently swallowing any exception other than IOException. Catch Throwable instead of just IOException.
When the task throws an exception that your catch block doesn't handle, scheduleAtFixedRate suppresses all subsequent executions. The VM is still alive, but the scheduled task has stopped running.
Here's some code to try (note it also needs an import of java.util.Random):
try
{
System.out.println(str);
Files.write(path, (str + System.lineSeparator()).getBytes(), StandardOpenOption.APPEND);
if(new Random().nextBoolean()) {
System.out.println("Here we go.");
throw new RuntimeException("Uncaught");
}
} catch (IOException e) {
e.printStackTrace();
} catch(Throwable t) {
t.printStackTrace();
System.out.println("The show must go on...");
}
Note how I give a 50/50 chance to throw an uncaught exception after the write (just for quick failure).
If you remove the second catch, it will just stop printing.
With the catch, the output becomes:
12:46:00.780250700: still alive after 0 times.
12:46:05.706355500: still alive after 1 times.
Here we go.
java.lang.RuntimeException: Uncaught
	at KeepaTokenRecorder$FileWritingClass$1.run(KeepaTokenRecorder.java:89)
	at [...]
The show must go on...
12:46:15.705899200: still alive after 3 times.
The code below works fine: it connects to a given server (host, port) and gets the connection status.
What it does:
PollService implements the Callable interface, connects to a server (host, port) and returns the status.
Since this should happen periodically, it iterates over the HashMap entries in an infinite while(true) loop.
The problem: on the server side I see it takes 2 or 3 seconds to reach the thread, while with a Runnable and a periodic implementation it connects within 1 second. Iterating over the HashMap in an infinite loop looks like a slow approach.
However, I cannot use a Runnable, as it doesn't return the connection status, which I need to use later.
Below is the ServiceMonitor class (client) which connects to the server.
package org.example;
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.stream.Collectors;
public class ServicesMonitor {
private ExecutorService scheduledExecutorService = null;
private static Logger logger = Logger.getLogger(ServicesMonitor.class.getName());
private final Map<ServiceType, List<ClientMonitorService>> clientMonitorServicesMap = new HashMap<>();
public void registerInterest(ClientMonitorService clientMonitorService) {
clientMonitorServicesMap.computeIfAbsent(clientMonitorService.getServiceToMonitor().getServiceType(), v -> new ArrayList<>()).add(clientMonitorService);
}
public Map<ServiceType, List<ClientMonitorService>> getClineMonitorService() {
return clientMonitorServicesMap;
}
public void poll(){
//Observable.interval(1, TimeUnit.SECONDS).st
}
public void pollServices() {
scheduledExecutorService = Executors.newFixedThreadPool(clientMonitorServicesMap.size());
try {
while (true) {
clientMonitorServicesMap.forEach((k, v) -> {
Future<Boolean> val = scheduledExecutorService.submit(new PollService(k));
try {
boolean result = val.get();
System.out.println("service " + k.getHost() + ":" + k.getPort() + "status is " + result);
if (result) {
List<ClientMonitorService> list = v.stream().filter(a -> LocalDateTime.now().getSecond() % a.getServiceToMonitor().getFreqSec() == 0)
.collect(Collectors.toList());
list.stream().forEach(a -> System.out.println(a.getClientId()));
}
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
}
});
}
} catch (Exception e) {
logger.log(Level.SEVERE, e.getMessage());
} finally {
scheduledExecutorService.shutdown();
}
}
}
How can I improve the performance of this code and reduce the time it takes to connect to the server?
How can I improve this code?
After using get(1, TimeUnit.SECONDS); I started to see an improvement on the server side as well (reaching the threads in less than 1 second), since we are no longer waiting more than 1 second on the client side.
while (true) {
clientMonitorServicesMap.forEach((k, v) -> {
Future<Boolean> val = scheduledExecutorService.submit(new PollService(k));
try {
boolean result = val.get(1, TimeUnit.SECONDS);
System.out.println("service " + k.getHost() + ":" + k.getPort() + "status is " + result);
if (result) {
List<ClientMonitorService> list = v.stream()
//.filter(a -> LocalDateTime.now().getSecond() % a.getServiceToMonitor().getFreqSec() == 0)
.collect(Collectors.toList());
list.stream().forEach(a -> System.out.println(a.getClientId()));
}
} catch (InterruptedException e) {
logger.log(Level.WARNING,"Interrupted -> " + k.getHost()+":"+k.getPort());
} catch (ExecutionException e) {
logger.log(Level.INFO,"ExecutionException exception -> "+ k.getHost()+":"+k.getPort());
} catch (TimeoutException e) {
logger.log(Level.INFO,"TimeoutException exception -> "+ k.getHost()+":"+k.getPort());
}
});
}
I have this code to find out how to get the status code from a URL:
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
/**
* @author Crunchify.com
*
*/
class j {
public static void main(String args[]) throws Exception {
String[] hostList = { "http://example.com", "http://example2.com","http://example3.com" };
for (int i = 0; i < hostList.length; i++) {
String url = hostList[i];
String status = getStatus(url);
System.out.println(url + "\t\tStatus:" + status);
}
}
public static String getStatus(String url) throws IOException {
String result = "";
try {
URL siteURL = new URL(url);
HttpURLConnection connection = (HttpURLConnection) siteURL
.openConnection();
connection.setRequestMethod("HEAD");
connection.connect();
int code = connection.getResponseCode();
result = Integer.toString(code);
} catch (Exception e) {
result = "->Red<-";
}
return result;
}
}
I have checked it for small inputs and it works fine, but I have millions of domains that I need to scan, stored in a file.
I want to know how I can give that file as input to this code.
I also want the code to run in multiple threads, say with a thread count of more than 20000, so that my output is produced faster.
How can I write the output to another file?
Kindly help me. If possible, I would also like to know the most bandwidth-friendly way to do the same job. I want to make the code faster in any case. How can I do these things with the code I have?
Java Version:
java version "1.8.0_121"
Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
This does what you want:
Input list file (c://lines.txt)
http://www.adam-bien.com/
http://stackoverflow.com/
http://www.dfgdfgdfgdfgdfgertwsgdfhdfhsru.de
http://www.google.de
The Thread:
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.Callable;
public class StatusThread implements Callable<String> {
String url;
public StatusThread(String url) {
this.url = url;
}
@Override
public String call() throws Exception {
String result = "";
try {
URL siteURL = new URL(url);
HttpURLConnection connection = (HttpURLConnection) siteURL.openConnection();
connection.setRequestMethod("HEAD");
connection.connect();
int code = connection.getResponseCode();
result = Integer.toString(code);
} catch (Exception e) {
result = "->Red<-";
}
return url + "|" + result;
}
}
And the main program:
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.stream.Stream;
public class CallableExample {
public static void main(String[] args) throws IOException {
// Number of threads
int numberOfThreads = 10;
// Input file
String sourceFileName = "c://lines.txt"; // Replace by your own
String targetFileName = "c://output.txt"; // Replace by your own
// Read input file into List
ArrayList<String> urls = new ArrayList<>();
try (Stream<String> stream = Files.lines(Paths.get(sourceFileName ))) {
stream.forEach((string) -> {
urls.add(string);
});
} catch (IOException e) {
e.printStackTrace();
}
// Create thread pool
ThreadPoolExecutor executor = (ThreadPoolExecutor) Executors.newFixedThreadPool(numberOfThreads);
List<Future<String>> resultList = new ArrayList<>();
// Launch threads
for(String url : urls) {
StatusThread statusGetter = new StatusThread(url);
Future<String> result = executor.submit(statusGetter);
resultList.add(result);
}
// Use results
FileWriter writer;
writer = new FileWriter(targetFileName);
for (Future<String> future : resultList) {
try {
String oneResult = future.get().split("\\|")[0] + " -> " + future.get().split("\\|")[1];
// Print the results to the console
System.out.println(oneResult);
// Write the result to a file
writer.write(oneResult + System.lineSeparator());
} catch (InterruptedException | ExecutionException e) {
e.printStackTrace();
}
}
writer.close();
// Shut down the executor service
executor.shutdown();
}
}
Don't forget to:
Create your input file and point to it (c://lines.txt)
Change the number of threads to get the best result
You will have issues sharing a file across threads. It is much better to read the file and then hand each record to a thread for processing.
Creating a thread is not trivial resource-wise, so a thread pool would be useful, as it lets threads be reused.
Do you want all threads to write to a single file?
I would do that using a list shared between the worker threads and the writer; others may have a better idea. A rough sketch of that idea follows.
How to do all this depends on the Java version.
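As a rough sketch of that shared-collection idea (the class name, output file name and record count are made up for illustration, and a BlockingQueue is used instead of a plain list to avoid explicit locking), the workers push their results onto the queue while a single writer thread drains it into the output file:
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
public class SharedQueueWriter {
    private static final String POISON_PILL = "__EOF__";
    public static void main(String[] args) throws Exception {
        BlockingQueue<String> results = new LinkedBlockingQueue<>();
        ExecutorService pool = Executors.newFixedThreadPool(10);
        // Single writer thread: drains the shared queue into one output file.
        Thread writer = new Thread(() -> {
            try (PrintWriter out = new PrintWriter(new FileWriter("results.txt"))) {
                while (true) {
                    String line = results.take();   // blocks until a result is available
                    if (POISON_PILL.equals(line)) {
                        break;                      // all workers are done
                    }
                    out.println(line);
                }
            } catch (IOException | InterruptedException e) {
                e.printStackTrace();
            }
        });
        writer.start();
        // Worker tasks only touch the queue, never the file directly.
        for (int i = 0; i < 100; i++) {
            final int record = i;
            pool.submit(() -> results.add("processed record " + record));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        results.add(POISON_PILL); // tell the writer that no more results are coming
        writer.join();
    }
}
The queue lets the writer block cheaply while waiting for new results, and no worker ever needs to synchronize on the file itself.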
You can use an ExecutorService and set the number of threads to use.
The ExecutorService instance will handle the thread management for you.
You just need to provide it the tasks to execute and invoke the execution of all the tasks.
When all the tasks have been performed you can get the results.
In the call() method of the Callable implementation we return a String with a separator to indicate the URL and the response code of the request.
For example: http://example3.com||301, http://example.com||200, etc.
I have not written the code to read a file and to store the results of the tasks in another file. You should not have great difficulty implementing it.
Here is the main class :
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
public class Main {
public static void main(String[] args) throws InterruptedException {
String[] hostList = { "http://example.com", "http://example2.com", "http://example3.com" };
int nbThreadToUse = Runtime.getRuntime().availableProcessors() - 1;
ExecutorService executorService = Executors.newFixedThreadPool(nbThreadToUse);
Set<Callable<String>> callables = new HashSet<Callable<String>>();
for (String host : hostList) {
callables.add(new UrlCall(host));
}
List<Future<String>> futures = executorService.invokeAll(callables);
for (Future<String> future : futures) {
try {
String result = future.get();
String[] keyValueToken = result.split("\\|\\|");
String url = keyValueToken[0];
String response = keyValueToken[1];
System.out.println("url=" + url + ", response=" + response);
} catch (ExecutionException e) {
e.printStackTrace();
}
}
executorService.shutdown();
}
}
Here is UrlCall, the Callable implementation that performs the call to the URL.
UrlCall takes the URL to test in its constructor.
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.Callable;
public class UrlCall implements Callable<String> {
private String url;
public UrlCall(String url) {
this.url = url;
}
@Override
public String call() throws Exception {
return getStatus(url);
}
private String getStatus(String url) throws IOException {
try {
URL siteURL = new URL(url);
HttpURLConnection connection = (HttpURLConnection) siteURL.openConnection();
connection.setRequestMethod("HEAD");
connection.connect();
int code = connection.getResponseCode();
return url + "||" + code;
} catch (Exception e) {
//FIXME to log of course
return url + "||exception";
}
}
}
I agree with the thread pool approach presented here.
Multi-threading exploits the time other threads spend waiting (in this case, I guess, for the remote site's response). It does not multiply processing power, so about 10 threads seems reasonable (more depending on the hardware).
An important point that seems to have been neglected in the answers I read is that the OP talks about millions of domains. I would therefore discourage loading the whole file into memory in a list that is iterated over afterwards. I would rather merge everything into a single loop (over the file being read) instead of three (read, ping, write):
stream.forEach((url) -> {
StatusThread statusGetter = new StatusThread(url, outputWriter);
Future<String> result = executor.submit(statusGetter);
});
Here outputWriter would be a type with a synchronized method that writes into an output stream; a possible sketch follows.
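A minimal sketch, assuming a hypothetical OutputWriter class (the name and method are mine, not from the original suggestion); each StatusThread would receive the shared instance in its constructor and call writeLine(...) at the end of call() instead of returning the result through a Future:
import java.io.IOException;
import java.io.Writer;
// Hypothetical helper: one instance shared by all StatusThread tasks.
public class OutputWriter {
    private final Writer out;
    public OutputWriter(Writer out) {
        this.out = out;
    }
    // synchronized so results from different threads cannot interleave
    public synchronized void writeLine(String line) {
        try {
            out.write(line + System.lineSeparator());
            out.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}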
I am not talking about threading or anything that makes this more complicated.
Most server programs I have seen look like this, or use while(true){...} (same concept).
import java.net.ServerSocket;
import java.net.Socket;
import java.util.Scanner;
import java.io.DataOutputStream;
import java.io.IOException;
public class TCPServer {
ServerSocket welcomeSocket;
public TCPServer(int port) throws IOException {
welcomeSocket = new ServerSocket(port);
}
public void go() throws IOException {
// This is not a valid way to wait for a socket connection, You should
// not have a forever loop or while(true)
for (;;) {
Socket connectionSocket = welcomeSocket.accept();
Scanner clientIn = new Scanner(connectionSocket.getInputStream());
DataOutputStream clientOut = new DataOutputStream(connectionSocket.getOutputStream());
String clientLine = clientIn.nextLine();
String modLine = clientLine.toUpperCase();
clientOut.writeBytes(modLine + "\n");
}
}
public static void main(String[] args){
try {
TCPServer server = new TCPServer(6789);
server.go();
}
catch(IOException ioe) {
ioe.printStackTrace();
}
}
}
It is not looping permanently: your code blocks on welcomeSocket.accept() until someone connects; only then are the next lines executed, after which it waits for a new connection on welcomeSocket.accept() again. In other words, it loops as many times as it needs (once per connection).
If you want to allow only one client to connect, remove the for (;;) statement, but then you will have to restart your server every time.
A while(!finished) loop might be a better solution than the "empty" for loop: when the exit event occurs, you just set finished to true, as sketched below.
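A minimal sketch of that variant, assuming a finished flag toggled from elsewhere in the program (the class name and stop() method are hypothetical, not part of the original code):
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.Scanner;
public class StoppableTCPServer {
    private final ServerSocket welcomeSocket;
    // volatile so a change made by the stopping thread is visible to the accept loop
    private volatile boolean finished = false;
    public StoppableTCPServer(int port) throws IOException {
        welcomeSocket = new ServerSocket(port);
    }
    public void stop() {
        finished = true; // checked before the next accept()
    }
    public void go() throws IOException {
        while (!finished) {
            Socket connectionSocket = welcomeSocket.accept();
            Scanner clientIn = new Scanner(connectionSocket.getInputStream());
            DataOutputStream clientOut = new DataOutputStream(connectionSocket.getOutputStream());
            String modLine = clientIn.nextLine().toUpperCase();
            clientOut.writeBytes(modLine + "\n");
            connectionSocket.close();
        }
    }
}
Note that accept() still blocks, so the flag is only re-checked after the next client connects; closing the ServerSocket from the stopping thread makes accept() throw and ends the loop immediately.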
You can run a scheduler with any number of threads in the pool and prevent the main thread from terminating. Here I've used an input stream to block, but you can do it in different ways. You can use multiple threads to accept connections and customize the frequency of the scheduler.
import java.io.IOException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
import org.junit.Test;
public class MyTest {
ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(4);
@Test
public void hello() {
final Runnable helloServer = new Runnable() {
public void run() {
// handle soket connection here
System.out.println("hadling connection");
}
};
final ScheduledFuture<?> helloHandle = scheduler.scheduleAtFixedRate(helloServer, 1000, 1000, TimeUnit.MILLISECONDS);
try {
System.in.read();
} catch (IOException e) {
e.printStackTrace();
}
}
}
In my previous questions about HtmlUnit
Skip particular Javascript execution in HTML unit
and
Fetch Page source using HtmlUnit : URL got stuck
I had mentioned that a URL is getting stuck. I also found out that it gets stuck because one of the methods (parse) in the HtmlUnit library never returns.
I did further work on this and wrote code to get out of the method if it takes more than a specified number of timeout seconds to complete.
import java.io.IOException;
import java.net.MalformedURLException;
import java.util.Date;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import com.gargoylesoftware.htmlunit.BrowserVersion;
import com.gargoylesoftware.htmlunit.FailingHttpStatusCodeException;
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;
public class HandleHtmlUnitTimeout {
public static void main(String[] args) throws FailingHttpStatusCodeException, MalformedURLException, IOException, InterruptedException, TimeoutException
{
Date start = new Date();
String url = "http://ericaweiner.com/collections/";
doWorkWithTimeout(url, 60);
}
public static void doWorkWithTimeout(final String url, long timeoutSecs) throws InterruptedException, TimeoutException {
//maintains a thread for executing the doWork method
ExecutorService executor = Executors.newFixedThreadPool(1);
//logger.info("Starting method with "+timeoutSecs+" seconds as timeout");
//set the executor thread working
final Future<?> future = executor.submit(new Runnable() {
public void run()
{
try
{
getPageSource(url);
}
catch (Exception e)
{
throw new RuntimeException(e);
}
}
});
//check the outcome of the executor thread and limit the time allowed for it to complete
try {
future.get(timeoutSecs, TimeUnit.SECONDS);
} catch (Exception e) {
//ExecutionException: deliverer threw exception
//TimeoutException: didn't complete within downloadTimeoutSecs
//InterruptedException: the executor thread was interrupted
//interrupts the worker thread if necessary
future.cancel(true);
//logger.warn("encountered problem while doing some work", e);
throw new TimeoutException();
}finally{
executor.shutdownNow();
}
}
public static void getPageSource(String productPageUrl)
{
try {
if(productPageUrl == null)
{
productPageUrl = "http://ericaweiner.com/collections/";
}
WebClient wb = new WebClient(BrowserVersion.FIREFOX_3_6);
wb.getOptions().setTimeout(120000);
wb.getOptions().setJavaScriptEnabled(true);
wb.getOptions().setThrowExceptionOnScriptError(true);
wb.getOptions().setThrowExceptionOnFailingStatusCode(false);
HtmlPage page = wb.getPage(productPageUrl);
wb.waitForBackgroundJavaScript(4000);
wb.closeAllWindows();
}
catch (FailingHttpStatusCodeException e)
{
e.printStackTrace();
}
catch (MalformedURLException e)
{
e.printStackTrace();
}
catch (IOException e)
{
e.printStackTrace();
}
}
}
This code does come out of the doWorkWithTimeout(url, 60); method, but the program does not terminate.
I also tried calling a similar implementation with the following code:
import java.util.Date;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import org.apache.log4j.Logger;
public class HandleScraperTimeOut {
private static Logger logger = Logger.getLogger(HandleScraperTimeOut .class);
public void doWork() throws InterruptedException {
logger.info(new Date()+ "Starting worker method ");
Thread.sleep(20000);
logger.info(new Date()+ "Ending worker method ");
//perform some long running task here...
}
public void doWorkWithTimeout(int timeoutSecs) {
//maintains a thread for executing the doWork method
ExecutorService executor = Executors.newFixedThreadPool(1);
logger.info("Starting method with "+timeoutSecs+" seconds as timeout");
//set the executor thread working
final Future<?> future = executor.submit(new Runnable() {
public void run()
{
try
{
doWork();
}
catch (Exception e)
{
throw new RuntimeException(e);
}
}
});
//check the outcome of the executor thread and limit the time allowed for it to complete
try {
future.get(timeoutSecs, TimeUnit.SECONDS);
} catch (Exception e) {
//ExecutionException: deliverer threw exception
//TimeoutException: didn't complete within downloadTimeoutSecs
//InterruptedException: the executor thread was interrupted
//interrupts the worker thread if necessary
future.cancel(true);
logger.warn("encountered problem while doing some work", e);
}
executor.shutdown();
}
public static void main(String a[])
{
HandleScraperTimeOut hcto = new HandleScraperTimeOut ();
hcto.doWorkWithTimeout(30);
}
}
If anybody can have a look and tell me what the issue is, it would be really helpful.
For more details about the issue, you can look at Skip particular Javascript execution in HTML unit
and
Fetch Page source using HtmlUnit : URL got stuck
Update 1
The strange thing is: future.cancel(true); returns TRUE in both cases.
What I expected was:
With HtmlUnit it should return FALSE, since the process is still hanging.
With a normal Thread.sleep(); it should return TRUE, since the process was cancelled successfully.
Update 2
It only hangs with the http://ericaweiner.com/collections/ URL. If I give any other URL, e.g. http://www.google.com or http://www.yahoo.com, it does not hang. In those cases it throws an InterruptedException and comes out of the process.
It seems that the http://ericaweiner.com/collections/ page source has certain elements which are causing problems.
Future.cancel(boolean) returns:
false if the task could not be cancelled, typically because it has already completed normally
true otherwise
"Cancelled" means the task did not finish before cancel was called, the cancelled flag was set to true and, if requested, the thread was interrupted.
Interrupting the thread means it called Thread.interrupt and nothing more; Future.cancel(boolean) does not check whether the thread actually stopped.
So it is correct that cancel returns true in both of your cases.
Interrupting a thread means it should stop as soon as possible, but this is not enforced. You can try to make it stop or fail by closing a resource it needs. I usually do that with a thread that is reading (waiting for incoming data) from a socket: I close the socket so it stops waiting, as in the sketch below.
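A minimal sketch of that trick (the host, port and class name are made up for illustration; nothing here comes from the question's code):
import java.io.IOException;
import java.net.Socket;
public class SocketReaderDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; replace with a real host/port to try it out.
        final Socket socket = new Socket("localhost", 12345);
        Thread reader = new Thread(() -> {
            try {
                int b;
                // read() blocks until data arrives or the socket is closed
                while ((b = socket.getInputStream().read()) != -1) {
                    System.out.print((char) b);
                }
            } catch (IOException e) {
                // Closing the socket from another thread lands here,
                // which effectively "stops" the blocked reader.
                System.out.println("Reader stopped: " + e.getMessage());
            }
        });
        reader.start();
        Thread.sleep(5000);
        // Interrupting the reader would not unblock read(); closing the socket does.
        socket.close();
        reader.join();
    }
}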
The code below demonstrates the problem: the Java 7 interfaces for watching file changes report the opening of the file, but don't report the actual change of content.
Is there a way to detect a content change? My program needs to read new content as soon as it is available.
The only length reported by the event listener is 0 (right after the file is opened).
Compilable source to reproduce the problem:
import java.io.Writer;
import java.nio.charset.Charset;
import java.nio.file.ClosedWatchServiceException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;
import java.nio.file.attribute.FileAttribute;
class NioModifiedProblem {
public static void println(String str) {
System.out.println(str);
}
public static void printFileInfo(Path path) {
try {
println(String.format("File %s, size %d, modified %s", path, Files.size(path), Files.getLastModifiedTime(path)));
} catch (Throwable e) {
e.printStackTrace();
}
}
public static void main(String[] args) {
String data = "Some not too long string goes here. Goes. Goes.";
try {
final Path path = Files.createTempFile("nioProblem", ".tmp", new FileAttribute[0]);
path.toFile().deleteOnExit();
println("Created");
printFileInfo(path);
Thread thread = new Thread() {
public void run() {
try {
final Path parent = path.getParent();
final WatchService service = parent.getFileSystem().newWatchService();
WatchKey key = parent.register(service, StandardWatchEventKinds.ENTRY_MODIFY);
try {
while (true) {
for (WatchEvent<?> event : service.take().pollEvents()){
Path modifiedPath = parent.resolve((Path)event.context());
println("Path "+modifiedPath+" modified EVENT."); // This is printed only once, on file opening.
printFileInfo(modifiedPath);
}
}
} catch (ClosedWatchServiceException e) {
println("Service closed");
}
} catch (Throwable e) {
e.printStackTrace();
} finally {
println("Watcher thread exiting");
}
}
};
thread.setDaemon(true);
thread.start();
Thread.sleep(1000);
Writer fw = Files.newBufferedWriter(path, Charset.forName("UTF-8"));
println("Opened");
printFileInfo(path);
Thread.sleep(1000);
fw.write(data);
println("Written");
printFileInfo(path);
fw.close();
println("Closed");
printFileInfo(path);
Thread.sleep(1000);
println("Sleeped");
printFileInfo(path);
return;
} catch (Throwable e) {
e.printStackTrace();
}
}
}
Output on Java(TM) SE Runtime Environment (build 1.7.0-b147):
Created
File C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp, size 0, modified 2011-09-14T16:20:06.782Z
Opened
Path C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp modified EVENT.
File C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp, size 0, modified 2011-09-14T16:20:07.807Z
File C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp, size 0, modified 2011-09-14T16:20:07.807Z
Written
File C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp, size 0, modified 2011-09-14T16:20:07.807Z
Closed
File C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp, size 47, modified 2011-09-14T16:20:08.81Z
Sleeped
File C:\Users\b\AppData\Local\Temp\nioProblem190636654560972941.tmp, size 47, modified 2011-09-14T16:20:08.81Z
The problem with the output: the event is raised only once, when the file is opened, and is never raised on actual writes or when the file is closed.
The WatchKey should be reset after pollEvents() to be able to accept further events.
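Applied to the watcher thread above, that means keeping the key returned by take() and calling reset() on it after draining its events (a minimal sketch of the changed loop, reusing the names from the question's code):
// Inside the watcher thread's while (true) loop:
while (true) {
    WatchKey polledKey = service.take();          // blocks until events are available
    for (WatchEvent<?> event : polledKey.pollEvents()) {
        Path modifiedPath = parent.resolve((Path) event.context());
        println("Path " + modifiedPath + " modified EVENT.");
        printFileInfo(modifiedPath);
    }
    // Without this the key stays in the "signalled" state and
    // no further ENTRY_MODIFY events are delivered for this directory.
    if (!polledKey.reset()) {
        break; // directory no longer accessible; stop watching
    }
}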