We have a legacy system with an admin module that allows users to upload JAR files. After the upload, the JAR file is validated and, if it does not comply with internal rules, it is deleted.
The problem is that Windows throws an exception saying that the file "is already being used by another process" when I call Files.delete(tmpJar). I'm not able to identify why the file is open; it seems to me that I have closed everything.
First, we are using PrimeFaces (4.0) to upload the file. PrimeFaces relies on commons-fileupload (1.3.1). It calls the following method:
public void handleFileUpload(FileUploadEvent event) {
    Path tmpJar = null;
    try {
        tmpJar = Files.createFile(Paths.get(event.getFile().getFileName()));
        Files.write(tmpJar, event.getFile().getContents());
    } catch (IOException e) {
        LOGGER.error(e.getMessage(), e);
    }
    if (tmpJar != null) {
        try {
            this.validateJar(tmpJar.toString());
            Files.delete(tmpJar);
        } catch (IOException e) {
            LOGGER.error(e.getMessage(), e);
        }
    }
}
Before NIO's Files.write, I was using "standard" Java IO classes. The problem isn't related to the code above, because if I comment out the call to validateJar, Files.delete(tmpJar) executes without problems and the file is removed. So the problem is related to the code below, but I can't find where...
Job is an internal class, basically a simple POJO. "jobAnnotation" is a custom annotation to identify Jobs. I have shortened the code, but the essential parts are preserved.
private List<Job> validateJar(final String jarPath) throws IOException {
    List<Job> jobs = new ArrayList<Job>();
    try (JarFile jarFile = new JarFile(jarPath)) {
        URL[] jars = { new URL("file:" + jarPath) };
        ClassLoader jobClassLoader = URLClassLoader.newInstance(jars, this.getClass().getClassLoader());
        Enumeration<JarEntry> jarEntries = jarFile.entries();
        while (jarEntries.hasMoreElements()) {
            JarEntry jarEntry = jarEntries.nextElement();
            String className = jarEntry.getName();
            Class<?> classToLoad;
            try {
                classToLoad = Class.forName(className, true, jobClassLoader);
            } catch (Exception e1) {
                LOGGER.error(e1.getMessage(), e1);
                continue;
            }
            if (classToLoad.isAnnotationPresent(jobAnnotation)) {
                String vlr = null;
                try {
                    Class<?> jobClass = (Class<?>) Class.forName(classToLoad.getCanonicalName(), true, jobClassLoader);
                    Annotation annotation = jobClass.getAnnotation(jobAnnotation);
                    Method method = annotation.getClass().getMethod("getValue");
                    vlr = ((String) method.invoke(annotation, new Object[0]));
                } catch (Exception e1) {
                    LOGGER.error(e1.getMessage(), e1);
                }
                Job job = new Job();
                job.setEnabled(true);
                job.setJarfile(jarPath);
                job.setClassName(classToLoad.getName());
                Parameter parameter = new Parameter();
                parameter.setRequired(true);
                parameter.setName("name");
                parameter.setValue(vlr);
                job.addParameter(parameter);
                jobs.add(job);
            }
        }
    } catch (IOException e) {
        throw e;
    }
    return jobs;
}
Before using try-with-resources, I was using a regular try-catch-finally to close the JarFile; that's the only thing that has an explicit close method. It's probably the class loading that is holding the file open, but I don't know how to close it.
I did some searching, and I found that I can't unload classes (Unloading classes in java?).
So, the problem is: how do I release it? Or how can I remove the file?
BTW, I'm using Java 1.7.0_71, JBoss 7.1.1, Windows 7 (64-bit).
The URLClassLoader class already has a close() method (since Java 7). The close() method will close any JAR files that were opened by the URLClassLoader, which should prevent the "file already in use" exception.
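For example, a minimal sketch of validateJar that manages the loader with try-with-resources (assuming Java 7+, where URLClassLoader implements Closeable; the scanning logic stays as in the question):

private List<Job> validateJar(final String jarPath) throws IOException {
    List<Job> jobs = new ArrayList<Job>();
    URL[] jars = { new File(jarPath).toURI().toURL() };
    // Both the JarFile and the URLClassLoader keep the jar open on Windows;
    // try-with-resources closes them before the caller deletes the file.
    try (JarFile jarFile = new JarFile(jarPath);
         URLClassLoader jobClassLoader = URLClassLoader.newInstance(jars, this.getClass().getClassLoader())) {
        // ... same entry scanning and annotation handling as above ...
    }
    return jobs;
}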
File is already being used by another process. says that it might not be your fault at all; another application may be holding the file open. You can check this question to find the process that has your file open.
Some virus scanners take a long time to check JARs; try disabling the virus scanner. Other candidates are the Windows indexer process or explorer.exe itself. If you can't find any reason for the file lock, try adding a delay between the validation and the deletion. You may need a loop with multiple tries, as sketched below.
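If you go the retry route, a simple sketch (the attempt count and delay are arbitrary) could look like this:

// Retry the delete a few times in case an external process (virus scanner,
// indexer, explorer.exe) still holds the file for a short while.
boolean deleted = false;
for (int attempt = 0; attempt < 5 && !deleted; attempt++) {
    try {
        deleted = Files.deleteIfExists(tmpJar);
    } catch (IOException e) {
        try {
            Thread.sleep(500); // wait half a second before trying again
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
            break;
        }
    }
}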
Related
I've come across many posts about these two topics: auto-updating and URLClassLoaders. I'll start with the auto-updating goal. I found this post here that talks about a two-jar system: one jar that launches the main app jar. From Stephen C:
The launcher could be a Java application that creates a classloader for the new JAR, loads an entrypoint class and calls some method on it. If you do it this way, you have to watch for classloader storage leaks, but that's not difficult. (You just need to make sure that no objects with classes loaded from the JAR are reachable after you relaunch.)
This is the approach I'm taking, but I'm open to other ideas if they prove easier and/or more reliable. The Coordinator has posted some pretty cool launcher code, and I plan on incorporating some of this reload-type code into my launcher, but first I need to get it to work.
My issue is that my main app jar has many other dependencies, and I cannot get some of those classes to load despite the fact that all the jars have been added to the URL array. This brings up the second topic: URLClassLoader.
Side note for future readers: when passing a directory URL to the URLClassLoader, a helpful note that would have saved me an (embarrassingly large) amount of time is that the contents of the directory must be .class files! I was originally pointing it at my dependent-jar directory; no good.
Context for the code below: my launcher jar resides in the same directory as my app jar, which is why I'm using user.dir. I will probably change this, but for now the code works and gets far enough into my app's code to request a connection to a SQLite database before failing.
Launcher:
public class Launcher {

    public static void main(String[] args) {
        try {
            String userdir = System.getProperty("user.dir");
            File parentDir = new File(userdir);
            ArrayList<URL> urls = getJarURLs(parentDir);
            URL[] jarURLs = new URL[urls.size()];
            int index = 0;
            for (URL u : urls) {
                System.out.println(u.toString());
                jarURLs[index] = u;
                index++;
            }
            URLClassLoader urlCL = new URLClassLoader(jarURLs);
            Class<?> c = urlCL.loadClass("main.AppStart");
            Object[] args2 = new Object[] { new String[] {} };
            c.getMethod("main", String[].class).invoke(null, args2);
            urlCL.close();
        } catch (Exception e1) {
            e1.printStackTrace();
        }
    }

    public static ArrayList<URL> getJarURLs(File parentDir) throws MalformedURLException {
        ArrayList<URL> list = new ArrayList<>();
        for (File f : parentDir.listFiles()) {
            if (f.isDirectory()) {
                list.addAll(getJarURLs(f));
            } else {
                String name = f.getName();
                if (name.endsWith(".jar")) {
                    list.add(f.toURI().toURL());
                }
            }
        }
        return list;
    }
}
Here's an example of the URL output added to the array:
file:/C:/my/path/to/dependent/jars/sqlite-jdbc-3.32.3.2.jar
file:/C:/my/path/to/main/app.jar
file: ... [10 more]
The URLClassLoader seems to work well enough to load my main method in app.jar. The main method executes some startup-type stuff before attempting to load a login screen. When the request is made to get the user info database, my message screen loads and displays (<- this is important for later) the stacktrace containing:
java.sql.SQLException: No suitable driver found for jdbc:sqlite:C:\...\users.db
I understand that this is because that jar is not on the classpath, but it's loaded via the class loader, so why can't it find the classes from the jar? From this post, JamesB suggested adding Class.forName("org.sqlite.JDBC"); before the connection request. I rebuilt the app jar with this line of code and it worked!
The weird thing that happened next is that my message screen class can no longer be found, even though earlier it loaded and displayed correctly. The message screen is a class inside my main app.jar, not in a dependent jar, which is why I'm baffled. Am I going to have to add Class.forName before every instance of any of my classes? That seems rude..
So what could I be doing wrong with the class loader? Why does it load some classes and not others, despite the fact that all the jars have been added to the URL array?
Some other relevant info: my app works perfectly as intended when launched from the Windows command line with the classpath specified: java -cp "main-app.jar;my/dependent/jar/directory/*" main.AppStart. It's only when I try launching the app via this classloader that I have these issues.
By the way, is this java command universal? Will it work on all operating systems with Java installed? If so, could I not just scrap this launcher and use a ProcessBuilder to execute the above command? Bonus points for someone who can tell me how to execute the command from a JRE packaged with my app, as that's what I plan on doing so the user does not have to download Java.
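Something like the following is what I have in mind for the ProcessBuilder route (the jre/bin/java location is just a guess at where a bundled runtime would live, and File.pathSeparator should take care of the ";" versus ":" difference between operating systems):

// Relaunch the app in a separate JVM instead of loading it through a classloader.
// "jre/bin/java" is a hypothetical path to a runtime shipped next to the launcher.
String exe = System.getProperty("os.name").toLowerCase().contains("win") ? "java.exe" : "java";
String javaCmd = new File("jre/bin", exe).getAbsolutePath();
ProcessBuilder pb = new ProcessBuilder(
        javaCmd,
        "-cp", "main-app.jar" + File.pathSeparator + "my/dependent/jar/directory/*",
        "main.AppStart");
pb.inheritIO();   // forward the child's stdout/stderr to this console
pb.start();       // throws IOException; handle or declare as needed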
EDIT
I figured out the answer to one of the questions below. It turns out I didn't need to do any of the code below. My main method loads a login screen, but after it's loaded, control returns to the AppLauncher code, thus closing the URLClassLoader! Of course, at that point any requested class will not be found, as the loader has been closed! What an oof! Hopefully I will save someone a headache in the future...
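In other words, the loader has to stay open for as long as classes from those jars might still be requested. One way to do that (a sketch, not the only possible fix) is to close it only when the JVM shuts down:

URLClassLoader urlCL = new URLClassLoader(jarURLs);
// Do NOT close the loader here: the login screen and everything else loaded
// from app.jar keeps running after main() returns and will still request classes.
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    try {
        urlCL.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}));
Class<?> c = urlCL.loadClass("main.AppStart");
c.getMethod("main", String[].class).invoke(null, (Object) new String[] {});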
Original
Well, after more time, effort, research, and effective use of Eclipse's debugging tool, I was able to figure out what I needed to do to resolve my issues.
So the first issue was that my JDBC driver was never registered when passing the jars to the URLClassLoader. This is the part I sort of don't understand, so advice would be welcome, but there is a static block in the JDBC class that registers the driver so it can be used by DriverManager; see the code below. Loading the class is what executes that static block, hence why calling Class.forName works.
static {
    try {
        DriverManager.registerDriver(new JDBC());
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
What I don't understand is how class loading works when jars are specified via the classpath. The URLClassLoader doesn't load any of those classes until they are requested, and I never directly work with the JDBC class, hence the "no suitable driver" exception. But are all the classes specified via the classpath loaded initially? It seems that way for the static blocks to execute.
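One way I could presumably have triggered that static block without rebuilding the app jar is to load the driver class through the same loader from the launcher itself; a sketch (the class name org.sqlite.JDBC is specific to sqlite-jdbc):

// Force the driver's static initializer to run so it registers itself with
// DriverManager; throws ClassNotFoundException if the jar isn't on the loader's URLs.
Class.forName("org.sqlite.JDBC", true, urlCL);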
Anyhow, to resolve my other issue with some of my app's classes not being found, I had to implement my own classloader. I understand what I did and how it works, but I still don't understand why I had to do it. All of my jars were loaded into the original URLClassLoader, so if I could find them and the files within, why couldn't it?
Basically, I had to override the findClass and findResource methods to return jarEntry information that I had to store. I hope this code helps someone!
public class SBURLClassLoader extends URLClassLoader {

    private HashMap<String, Storage> map;

    public SBURLClassLoader(URL[] urls) {
        super(urls);
        map = new HashMap<>();
        try {
            storeClasses(urls);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }

    private void storeClasses(URL[] urls) throws ClassNotFoundException {
        for (URL u : urls) {
            try {
                JarFile jarFile = new JarFile(new File(u.getFile()));
                Enumeration<JarEntry> e = jarFile.entries();
                while (e.hasMoreElements()) {
                    JarEntry jar = e.nextElement();
                    String entryName = jar.getName();
                    if (jar.isDirectory()) continue;
                    if (!entryName.endsWith(".class")) {
                        // still need to store these non-class files as resources;
                        // let the code continue and store the entry unaltered
                    } else {
                        entryName = entryName.replace(".class", "");
                        entryName = entryName.replace("/", ".");
                    }
                    map.put(entryName, new Storage(jarFile, jar));
                    System.out.println(entryName);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        Class<?> c = null;
        try {
            c = super.findClass(name);
        } catch (ClassNotFoundException e) {
            Storage s = map.get(name);
            try {
                InputStream in = s.jf.getInputStream(s.je);
                // read the whole entry and define the class from those bytes
                byte[] bytes = in.readAllBytes();
                c = defineClass(name, bytes, 0, bytes.length);
                resolveClass(c);
            } catch (IOException e1) {
                e1.printStackTrace();
            }
            if (c == null) throw e;
        }
        return c;
    }
    @Override
    public URL findResource(String name) {
        URL url = super.findResource(name);
        if (url == null) {
            Storage s = map.get(name);
            if (s != null) {
                try {
                    url = new URL("jar:" + s.base.toString() + "!/" + name);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
        return url;
    }
    private class Storage {

        public JarFile jf;
        public JarEntry je;
        public URL base;

        public Storage(JarFile jf, JarEntry je) {
            this.jf = jf;
            this.je = je;
            try {
                base = Path.of(jf.getName()).toUri().toURL();
            } catch (MalformedURLException e) {
                e.printStackTrace();
            }
        }
    }
}
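Usage is the same as before, just swapping in the new loader class (a minimal sketch):

// Drop-in replacement for the plain URLClassLoader in the launcher.
SBURLClassLoader loader = new SBURLClassLoader(jarURLs);
Class<?> c = loader.loadClass("main.AppStart");
c.getMethod("main", String[].class).invoke(null, (Object) new String[] {});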
using java 8, tomcat 8
Hi, I am loading a file using Properties, but I have a check before loading that returns the same Properties object if it has already been loaded (not null). That is the normal case, but I want to know if there is any way that, when a change occurs in the target file, some trigger is called and all the Properties objects are refreshed. Here is my code.
public static String loadConnectionFile(String keyname) {
    String message = "";
    getMessageFromConnectionFile();
    if (propertiesForConnection.containsKey(keyname))
        message = propertiesForConnection.getProperty(keyname);
    return message;
}

public static synchronized void getMessageFromConnectionFile() {
    if (propertiesForConnection == null) {
        FileInputStream fileInput = null;
        try {
            File file = new File(Constants.GET_CONNECTION_FILE_PATH);
            fileInput = new FileInputStream(file);
            Reader reader = new InputStreamReader(fileInput, "UTF-8");
            propertiesForConnection = new Properties();
            propertiesForConnection.load(reader);
        } catch (Exception e) {
            Utilities.printErrorLog(Utilities.convertStackTraceToString(e), logger);
        } finally {
            try {
                fileInput.close();
            } catch (Exception e) {
                Utilities.printErrorLog(Utilities.convertStackTraceToString(e), logger);
            }
        }
    }
}
The loadConnectionFile method executes first and calls getMessageFromConnectionFile, which has the null check implemented. If we remove that check, it will definitely load the updated file every time, but it will slow down performance. I want an alternate way.
I hope I explained my question.
Thanks in advance.
Java has a file watcher service (the WatchService API). You can "listen" for changes in files and directories, so you can listen for changes to your properties file, or to the directory in which your properties file is located. The Java Tutorials on Oracle's OTN web site have a section on the watch service.
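A minimal sketch against your code (java.nio.file.WatchService; run it on a background thread, and note that resetting propertiesForConnection to null is just one way to force a reload):

public static void watchConnectionFile() throws IOException, InterruptedException {
    Path configFile = Paths.get(Constants.GET_CONNECTION_FILE_PATH);
    Path dir = configFile.getParent();
    WatchService watcher = FileSystems.getDefault().newWatchService();
    dir.register(watcher, StandardWatchEventKinds.ENTRY_MODIFY);
    while (true) {
        WatchKey key = watcher.take();               // blocks until something changes in dir
        for (WatchEvent<?> event : key.pollEvents()) {
            Path changed = (Path) event.context();   // name of the changed file, relative to dir
            if (changed.equals(configFile.getFileName())) {
                propertiesForConnection = null;      // drop the cached object...
                getMessageFromConnectionFile();      // ...and reload it from disk
            }
        }
        if (!key.reset()) {
            break;                                   // directory is no longer accessible
        }
    }
}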
Good Luck,
Avi.
I'm trying to copy a resource in my project onto another location on the disk. So far, I have this code:
if (!file.exists()) {
    try {
        file.createNewFile();
        Files.copy(new InputSupplier<InputStream>() {
            public InputStream getInput() throws IOException {
                return Main.class.getResourceAsStream("/" + name);
            }
        }, file);
    } catch (IOException e) {
        file = null;
        return null;
    }
}
And it works fine, but the InputSupplier class is deprecated, so I was wondering if there was a better way to do what I'm trying to do.
See the documentation for the Guava InputSupplier class:
For InputSupplier<? extends InputStream>, use ByteSource instead. For InputSupplier<? extends Reader>, use CharSource. Implementations of InputSupplier that don't fall into one of those categories do not benefit from any of the methods in common.io and should use a different interface. This interface is scheduled for removal in December 2015.
So in your case, you're looking for ByteSource:
Resources.asByteSource(url).copyTo(Files.asByteSink(file));
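Put together with your resource lookup, a short sketch (Guava's Files is fully qualified here to avoid a clash with java.nio.file.Files):

// Copy a classpath resource to a file on disk with Guava's ByteSource/ByteSink.
URL url = Resources.getResource(Main.class, "/" + name);
Resources.asByteSource(url).copyTo(com.google.common.io.Files.asByteSink(file)); // throws IOException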
See this section of the Guava Wiki for more information.
If you're looking for a pure Java (no external libraries) version, you can do the following:
try (InputStream is = this.getClass().getResourceAsStream("/" + name)) {
    // Note: with Class.getResourceAsStream a leading "/" means "from the classpath root";
    // with ClassLoader.getResourceAsStream the name must not start with "/".
    Files.copy(is, Paths.get("C:\\some\\file.txt"));
} catch (IOException e) {
    // An error occurred copying the resource
}
Note that this is only valid for Java 7 and higher.
I am having a problem writing to a .xml file inside of my jar. When I use the following code inside of my Netbeans IDE, no error occurs and it writes to the file just fine.
public void saveSettings() {
    Properties prop = new Properties();
    FileOutputStream out;
    try {
        File file = new File(Duct.class.getResource("/Settings.xml").toURI());
        out = new FileOutputStream(file);
        prop.setProperty("LAST_FILE", getLastFile());
        try {
            prop.storeToXML(out, null);
        } catch (Exception e) {
            JOptionPane.showMessageDialog(null, e.toString());
        }
        try {
            out.close();
        } catch (Exception e) {
            JOptionPane.showMessageDialog(null, e.toString());
        }
    } catch (Exception e) {
        JOptionPane.showMessageDialog(null, e.toString());
    }
}
However, when I execute the jar I get an error saying:
IllegalArgumentException: URI is not hierarchical
Does anyone have an idea why it works when I run it in NetBeans but not when I execute the jar? And does anyone have a solution to the problem?
The default class loader expects the classpath to be static (so it can cache heavily), so this approach will not work.
You can put Settings.xml in the file system if you can get a suitable location to put it. This is most likely vendor and platform specific, but can be done.
Add the location of the Settings.xml to the classpath.
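For example, a common pattern (the location under user.home is an assumption, and exception handling is omitted for brevity) is to keep a read-only template inside the jar, copy it out on first run, and then read and write only the external copy:

// Keep settings outside the jar so they can be written to.
File settingsFile = new File(System.getProperty("user.home"), ".myapp/Settings.xml");
if (!settingsFile.exists()) {
    settingsFile.getParentFile().mkdirs();
    // Seed it from the template packaged in the jar.
    try (InputStream in = Duct.class.getResourceAsStream("/Settings.xml")) {
        java.nio.file.Files.copy(in, settingsFile.toPath());
    }
}
Properties prop = new Properties();
prop.setProperty("LAST_FILE", getLastFile());
try (FileOutputStream out = new FileOutputStream(settingsFile)) {
    prop.storeToXML(out, null);
}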
I was also struggling with this exception, but I finally found a solution.
When you use .toURI() it returns something like
D:/folderName/folderName/Settings.xml
and hence you get the exception "URI is not hierarchical".
To avoid this, call getPath() on the returned URI, which returns something like
/D:/folderName/folderName/Settings.xml
which is now hierarchical.
In your case, the 5th line of your code should be:
File file = new File(Duct.class.getResource("/Settings.xml").toURI().getPath());
I'm trying to configure the Java Logging API's FileHandler to log my server to a file within a folder in my home directory, but I don't want to have to create those directories on every machine it runs on.
For example in the logging.properties file I specify:
java.util.logging.FileHandler
java.util.logging.FileHandler.pattern=%h/app-logs/MyApplication/MyApplication_%u-%g.log
This would allow me to collect logs in my home directory (%h) for MyApplication and would rotate them (using the %u, and %g variables).
Log4j supports this when I specify in my log4j.properties:
log4j.appender.rolling.File=${user.home}/app-logs/MyApplication-log4j/MyApplication.log
It looks like there is a bug against the Logging FileHandler:
Bug 6244047: impossible to specify driectorys to logging FileHandler unless they exist
It sounds like they don't plan on fixing it or exposing any properties to work around the issue (beyond having your application parse the logging.properties or hard code the path needed):
It looks like the java.util.logging.FileHandler does not expect that the specified directory may not exist. Normally, it has to check this condition anyway. Also, it has to check the directory writing permissions as well. Another question is what to do if one of these checks does not pass.
One possibility is to create the missing directories in the path if the user has proper permissions. Another is to throw an IOException with a clear message what is wrong. The latter approach looks more consistent.
It seems that log4j version 1.2.15 does this. Here is the snippet of code that does it:
public synchronized void setFile(String fileName, boolean append, boolean bufferedIO, int bufferSize)
        throws IOException {
    LogLog.debug("setFile called: " + fileName + ", " + append);

    // It does not make sense to have immediate flush and bufferedIO.
    if (bufferedIO) {
        setImmediateFlush(false);
    }

    reset();
    FileOutputStream ostream = null;
    try {
        //
        // attempt to create file
        //
        ostream = new FileOutputStream(fileName, append);
    } catch (FileNotFoundException ex) {
        //
        // if parent directory does not exist then
        // attempt to create it and try to create file
        // see bug 9150
        //
        String parentName = new File(fileName).getParent();
        if (parentName != null) {
            File parentDir = new File(parentName);
            if (!parentDir.exists() && parentDir.mkdirs()) {
                ostream = new FileOutputStream(fileName, append);
            } else {
                throw ex;
            }
        } else {
            throw ex;
        }
    }
    Writer fw = createWriter(ostream);
    if (bufferedIO) {
        fw = new BufferedWriter(fw, bufferSize);
    }
    this.setQWForFiles(fw);
    this.fileName = fileName;
    this.fileAppend = append;
    this.bufferedIO = bufferedIO;
    this.bufferSize = bufferSize;
    writeHeader();
    LogLog.debug("setFile ended");
}
This piece of code is from FileAppender; RollingFileAppender extends FileAppender.
Here it does not check whether we have permission to create the parent folders, but if the parent folders do not exist it will try to create them.
EDITED
If you want some additional functionality, you can always extend RollingFileAppender and override the setFile() method.
You can write something like this.
package org.log;

import java.io.IOException;

import org.apache.log4j.RollingFileAppender;

public class MyRollingFileAppender extends RollingFileAppender {

    @Override
    public synchronized void setFile(String fileName, boolean append,
            boolean bufferedIO, int bufferSize) throws IOException {
        // Your logic goes here
        super.setFile(fileName, append, bufferedIO, bufferSize);
    }
}
Then in your configuration
log4j.appender.fileAppender=org.log.MyRollingFileAppender
This works perfectly for me.
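For example, the overridden method could create any missing parent directories up front before delegating to the superclass (a sketch of what "Your logic goes here" might be):

@Override
public synchronized void setFile(String fileName, boolean append,
        boolean bufferedIO, int bufferSize) throws IOException {
    // Create the parent directories first so the FileOutputStream opened by
    // the superclass cannot fail with FileNotFoundException.
    File parentDir = new File(fileName).getParentFile();
    if (parentDir != null && !parentDir.exists()) {
        parentDir.mkdirs();
    }
    super.setFile(fileName, append, bufferedIO, bufferSize);
}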
To work around the limitations of the Java Logging framework and the unresolved bug (Bug 6244047: impossible to specify driectorys to logging FileHandler unless they exist), I've come up with two approaches, although only the first approach will actually work. Both require your app's static void main() method to initialize the logging system.
e.g.
public static void main(String[] args) {
    initLogging();
    ...
}
The first approach hard-codes the log directories you expect to exist and creates them if they don't exist.
private static void initLogging() {
    try {
        // Create the logging.properties-specified directory for logging in the home directory
        // TODO: If they ever fix this bug (http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6244047)
        // in the Java Logging API we wouldn't need this hack
        File homeLoggingDir = new File(System.getProperty("user.home") + "/webwars-logs/weblings-gameplatform/");
        if (!homeLoggingDir.exists()) {
            homeLoggingDir.mkdirs();
            logger.info("Creating missing logging directory: " + homeLoggingDir);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    try {
        logger.info("[GamePlatform] : Starting...");
    } catch (Exception exc) {
        exc.printStackTrace();
    }
}
The second approach could catch the IOException and create the directories listed in the exception. The problem with this approach is that the logging framework has already failed to create the FileHandler, so catching and resolving the error still leaves the logging system in a bad state.
As a possible solution, I think there are two approaches (look at some of the previous answers): I can extend a Java Logging Handler class and write my own custom handler, or I can copy the log4j functionality and adapt it to the Java Logging framework.
Here's an example of copying the basic FileHandler and creating a CustomFileHandler; see pastebin for the full class.
The key is the openFiles() method, where it tries to create a FileOutputStream, checking for and creating the parent directory if it doesn't exist (I also had to copy package-protected LogManager methods; why did they even make those package-protected anyway?):
// Private method to open the set of output files, based on the
// configured instance variables.
private void openFiles() throws IOException {
    LogManager manager = LogManager.getLogManager();
    ...
    // Create a lock file. This grants us exclusive access
    // to our set of output files, as long as we are alive.
    int unique = -1;
    for (;;) {
        unique++;
        if (unique > MAX_LOCKS) {
            throw new IOException("Couldn't get lock for " + pattern);
        }
        // Generate a lock file name from the "unique" int.
        lockFileName = generate(pattern, 0, unique).toString() + ".lck";
        // Now try to lock that filename.
        // Because some systems (e.g. Solaris) can only do file locks
        // between processes (and not within a process), we first check
        // if we ourself already have the file locked.
        synchronized (locks) {
            if (locks.get(lockFileName) != null) {
                // We already own this lock, for a different FileHandler
                // object. Try again.
                continue;
            }
            FileChannel fc;
            try {
                File lockFile = new File(lockFileName);
                if (lockFile.getParent() != null) {
                    File lockParentDir = new File(lockFile.getParent());
                    // create the log dir if it does not exist
                    if (!lockParentDir.exists()) {
                        lockParentDir.mkdirs();
                    }
                }
                lockStream = new FileOutputStream(lockFileName);
                fc = lockStream.getChannel();
            } catch (IOException ix) {
                // We got an IOException while trying to open the file.
                // Try the next file.
                continue;
            }
            try {
                FileLock fl = fc.tryLock();
                if (fl == null) {
                    // We failed to get the lock. Try next file.
                    continue;
                }
                // We got the lock OK.
            } catch (IOException ix) {
                // We got an IOException while trying to get the lock.
                // This normally indicates that locking is not supported
                // on the target directory. We have to proceed without
                // getting a lock. Drop through.
            }
            // We got the lock. Remember it.
            locks.put(lockFileName, lockFileName);
            break;
        }
    }
    ...
}
I generally try to avoid static code, but to work around this limitation, here is the approach that worked on my project just now.
I subclassed java.util.logging.FileHandler and implemented all the constructors with their super calls. I put a static block of code in the class that creates the folders for my app in the user.home folder if they don't exist.
In my logging properties file, I replaced java.util.logging.FileHandler with my new class.
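A minimal sketch of that subclass (the package name is made up, the directory matches the pattern from the question, and only two of the constructors are shown):

package org.myapp.logging;

import java.io.File;
import java.io.IOException;
import java.util.logging.FileHandler;

public class DirCreatingFileHandler extends FileHandler {

    // Runs when the LogManager loads this class, i.e. before any constructor
    // tries to open the log file, so the target directory exists in time.
    static {
        new File(System.getProperty("user.home"), "app-logs/MyApplication").mkdirs();
    }

    public DirCreatingFileHandler() throws IOException, SecurityException {
        super();
    }

    public DirCreatingFileHandler(String pattern, int limit, int count, boolean append)
            throws IOException, SecurityException {
        super(pattern, limit, count, append);
    }
}

In logging.properties the handler and its pattern are then configured under the subclass name, along the lines of:

handlers=org.myapp.logging.DirCreatingFileHandler
org.myapp.logging.DirCreatingFileHandler.pattern=%h/app-logs/MyApplication/MyApplication_%u-%g.log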