I'm looking for code, or a library, which I can use to copy files between Windows servers using Java. Picture a primary server that runs a background thread, so that whenever a transaction is completed, it backs up the database files to the backup server. (This is required protocol, so no need to discuss the pros/cons of this action). In other words, transaction completes, Java code gets executed which copies one directory to the back-up server.
The way the Windows machines are set up, the primary server has the back-up server's C: drive mapped as its own Z: drive. Both machines run Windows 2003 or 2008 Server, with Java 1.6.
Found the correct answer on another forum and by messing around a little with the settings. The problem with copying files from one machine to another in Windows (using either a .bat file or straight-up Java code) is user permissions. On the primary server, you MUST set the Tomcat process to run as the administrator, using that administrator's username and password (right-click the Tomcat service, select the "Log On" tab, and enter the administrator's username/password). The default account Tomcat runs under (a local user) doesn't have sufficient permissions to copy files between networked drives on Windows. Once I set that correctly, both the .bat file solution I had tried before this post and the straight-Java solution suggested here worked just fine.
Hope that helps someone else, and thanks for the suggestions.
Obtain the files with File#listFiles() on a directory of one disk, iterate over them, create a new File on the other disk, then read from a FileInputStream on the source file and write to a FileOutputStream on the destination file.
In a nutshell:
for (File file : new File("C:/path/to/files").listFiles()) {
    if (!file.isDirectory()) {
        File destination = new File("Z:/path/to/files", file.getName());
        // Do the usual read InputStream, write OutputStream job here.
    }
}
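For completeness, here is a minimal sketch of what that read/write step could look like, written against Java 1.6 (so no try-with-resources); the class and method names are just illustrative:

import java.io.*;

public class FileCopier {

    // Copies a single file using plain streams (Java 1.6 compatible).
    public static void copy(File source, File destination) throws IOException {
        InputStream in = new FileInputStream(source);
        try {
            OutputStream out = new FileOutputStream(destination);
            try {
                byte[] buffer = new byte[8192];
                int length;
                while ((length = in.read(buffer)) > 0) {
                    out.write(buffer, 0, length);
                }
            } finally {
                out.close();
            }
        } finally {
            in.close();
        }
    }
}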
See also:
Java IO tutorial
By the way, if you were using Java 7, you could use java.nio.file.Files#copy() for this (per file):
Files.copy(Paths.get("C:/path/to/files/file.ext"), Paths.get("Z:/path/to/files/file.ext"));
I would really recommend using Apache Commons IO for this.
The FileUtils class provides methods to iterate over directories and copy files from one directory to another. You don't have to worry about reading and writing files yourself because it's all done by the library for you.
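For example, copying the whole directory becomes a one-liner. A sketch assuming Commons IO is on the classpath (the paths are the same placeholders as above):

import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;

public class BackupCopy {
    public static void main(String[] args) throws IOException {
        // Recursively copies the source directory onto the mapped backup drive.
        FileUtils.copyDirectory(new File("C:/path/to/files"),
                                new File("Z:/path/to/files"));
    }
}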
I have a resource file (mydata.txt) in the resources folder (set up as another source folder) of my application. This mydata.txt will eventually be packaged at the root of a jar file (.war) to be deployed to some application server (Tomcat, Jetty, WildFly).
The mydata.txt file holds some crucial data needed by the application, and the application must append to this file.
To read a file from the jar I can use getClass().getResourceAsStream("/mydata.txt"), which gives me the file as an InputStream. But there is no way to get this file as an OutputStream and write to it.
All solutions using getClass().getResource() (which returns a URL) are discouraged; getResourceAsStream is always recommended, but it only allows reading, not writing/updating/appending the file.
The getClass().getProtectionDomain().getCodeSource() approach to gaining (write) access to the file is also discouraged.
I could create a file in a temporary directory on the Tomcat server (System.getProperty("java.io.tmpdir")) and write to it, but that makes no sense, because the file's contents are too crucial to the application to live in a tmp dir; besides, I need to append to the existing file, not create a new one and write to it.
Also, I am not sure that writing to any other directory (other than tmp) of the application server is a good idea (please correct me if I am wrong here).
So I come to the conclusion that it is not recommended to save any data to a file in an enterprise application, and I should always use a database instead?
In short: yes. Besides everything you mentioned (which is all correct), the biggest problems are:
concurrent access
transaction handling
both of which a database handles perfectly, and which with a file-based approach are just a pain in the ****.
In addition, an application server in particular provides you with configured connections (and pools) to data sources of any kind, which is really handy in a production environment.
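As a rough sketch of what the database approach looks like in practice, assuming a container-managed data source registered under the JNDI name java:comp/env/jdbc/MyDS and a hypothetical app_data table:

import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class AppDataStore {

    // Appends a record through the container-managed connection pool
    // instead of appending to a file inside the archive.
    public void append(String entry) throws Exception {
        DataSource ds = (DataSource) new InitialContext()
                .lookup("java:comp/env/jdbc/MyDS"); // JNDI name is an assumption
        Connection con = ds.getConnection();
        try {
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO app_data (entry) VALUES (?)"); // hypothetical table
            ps.setString(1, entry);
            ps.executeUpdate();
            ps.close();
        } finally {
            con.close(); // returns the connection to the pool
        }
    }
}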
I am writing a Java agent which does some attachment manipulations, and I am looking for a 'clean' place to do them - i.e. one that won't cause too much hassle with admins setting special permissions. Is there a best practice for the location of the temp directory? In LotusScript I would use
Environ("Temp")
which would give me the temp directory of the local machine.
There is also the possibility of using the data directory, but that makes me uneasy...
var d = session.getEnvironmentString("directory",true)
Any tips/best recommendations?
The general rule is: if you need a temp directory, request it from the system.
Example:
System.getProperty("java.io.tmpdir")
Using the Data folder is probably going to upset the admin.
If you want to create diagnostic logs which you may use later then I recommend writing to:
<DOMINO DATA FOLDER>\IBM_TECHNICAL_SUPPORT
This way the admin has one set place to find logs.
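A sketch of how an agent could build that path, reusing the getEnvironmentString call mentioned in the question (the log file name is a placeholder):

import java.io.File;
import java.io.FileWriter;
import lotus.domino.Session;

public class AgentLog {

    // Appends a diagnostic line to a log file under IBM_TECHNICAL_SUPPORT.
    public static void log(Session session, String message) throws Exception {
        // "Directory" in notes.ini points at the Domino data folder.
        String dataDir = session.getEnvironmentString("Directory", true);
        File logFile = new File(new File(dataDir, "IBM_TECHNICAL_SUPPORT"),
                "myagent.log"); // placeholder file name
        FileWriter out = new FileWriter(logFile, true); // true = append
        try {
            out.write(message + System.getProperty("line.separator"));
        } finally {
            out.close();
        }
    }
}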
An example of the temp-file approach could be:
File temp = File.createTempFile("temp-file-name", ".tmp");
temp.deleteOnExit(); // This will delete the file when the JVM shuts down.
That file would be saved in
C:\Users\*User*\AppData\Local\Temp\temp-file-name623426.tmp
I have a team of users with read-only access to a shared network drive. Sometimes these users need to deploy their project resources to the drive. I am trying to come up with a secure build process for them to use. Currently I am using a batch file that they can execute from their local system, which does the following...
1. The user starts the batch file.
2. The batch file calls a Java program (the credentials are 'hidden' and 'encrypted' within the Java program).
3. The Java program handles the decryption and then calls a final batch file that actually runs the NET USE command to map the drive with admin credentials.
4. The final batch file maps the drive, copies the required resources onto the shared drive, and then re-maps the drive with the original user credentials (read only).
My major problem is that users have direct access to the batch files involved in this process, and they could simply remove the @ECHO OFF command from the final batch file to display all the credentials in the cmd output window.
I'm not sure if there's a better solution for this sort of thing. Any ideas will be greatly appreciated!
Also, all machines are running Windows 7 and the share is a Windows network drive.
The best solution would be to copy the resources directly in the Java program using the jCIFS library.
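A rough sketch of that first option with jCIFS (the server, share, paths, and credentials below are placeholders; in the real build the credentials would come out of the encrypted store rather than being hard-coded):

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import jcifs.smb.NtlmPasswordAuthentication;
import jcifs.smb.SmbFile;
import jcifs.smb.SmbFileOutputStream;

public class SmbDeploy {
    public static void main(String[] args) throws Exception {
        // Placeholder credentials; load these from the encrypted store instead.
        NtlmPasswordAuthentication auth =
                new NtlmPasswordAuthentication("DOMAIN", "adminUser", "secret");
        SmbFile target = new SmbFile("smb://server/share/resources/app.jar", auth);
        InputStream in = new FileInputStream("build/app.jar");
        OutputStream out = new SmbFileOutputStream(target);
        byte[] buffer = new byte[8192];
        int length;
        while ((length = in.read(buffer)) > 0) {
            out.write(buffer, 0, length);
        }
        out.close();
        in.close();
    }
}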
A second option would be to map the drive from within the Java program. There's more information in this SO question: How can I mount a windows drive in Java?
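And a sketch of the second option, shelling out to NET USE from Java (the drive letter, UNC path, and credentials are placeholders):

import java.io.IOException;

public class DriveMapper {

    // Maps the share with the given credentials via NET USE;
    // run "net use Z: /delete" afterwards to unmap.
    public static void map(String drive, String unc, String user, String password)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "net", "use", drive, unc, password, "/user:" + user);
        pb.redirectErrorStream(true); // merge stderr into stdout
        Process p = pb.start();
        // Drain the output so the process cannot block on a full pipe buffer.
        while (p.getInputStream().read() != -1) { /* discard */ }
        if (p.waitFor() != 0) {
            throw new IOException("net use failed for " + drive);
        }
    }
}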
There are some .bat to .exe compilers out there. Not sure how well they will work for your particular batch file, but they are probably worth a look. You can search for them; here are a couple:
Advanced BAT to EXE Compiler
Quick Batch File Compiler
Batch File Compiler PE
What is the best way for a program that retrieves files from an FTP server to check whether the file to be downloaded is part of an ongoing transfer (someone is still uploading the file when we decide to download it)? Do FTP client APIs handle this (e.g. Apache Commons FTPClient)?
I think it's not really possible. A couple of years ago I had a similar problem and ended up with two options (unfortunately it was C#, not Java).
You can check whether the file is still growing (which implies a small delay, because you need to check the size twice), or, if you're on Windows (I don't know how Linux behaves here), you can try to access the file, in which case you should get an exception saying the file is in use by another process.
Just two possibilities and a starting point for thinking about your problem. Maybe someone else will come up with a really good solution, but for now this might be a little workaround for you.
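A minimal sketch of the "check twice" idea, here using Apache Commons Net's FTPClient (the delay is arbitrary; a constant size suggests, but does not prove, that the upload has finished):

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class FtpStableCheck {

    // Returns true if the remote file's size stayed constant across the delay.
    public static boolean looksComplete(FTPClient ftp, String path, long delayMs)
            throws Exception {
        long first = sizeOf(ftp, path);
        Thread.sleep(delayMs);
        return first >= 0 && first == sizeOf(ftp, path);
    }

    private static long sizeOf(FTPClient ftp, String path) throws Exception {
        FTPFile[] files = ftp.listFiles(path);
        return (files != null && files.length == 1) ? files[0].getSize() : -1;
    }
}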
FTP was not designed to tell you whether a file is in use; the most the FTP daemon can do is deny the transfer, and that is configurable in some servers. There may be a server that renames files temporarily, or offers a script to do so, but you'd have to find one.
I don't know if it is sufficient for you, but if you only need a "dumb" check, I would try System.getSecurityManager().checkDelete(). A file can be deleted only if no streams are open on it.
I need some ideas on how I can best solve this problem.
I have a JBoss Seam application running on JBoss 4.3.3
What a small portion of this application does is generate an HTML and a PDF document based on an OpenOffice template.
I put the generated files inside /tmp/ on the filesystem.
I have tried System.getProperty("tmp.dir") and some other options, and they always return $JBOSS_HOME/bin.
I would like to choose the path $JBOSS_HOME/$DEPLOY/myEAR.ear/myWAR.war/WhateverLocationHere/
However, I don't know how I can programmatically choose a path without giving an absolute path or setting $JBOSS_HOME and $DEPLOY.
Anybody know how I can do this?
The second question:
I want to easily preview these generated files, either through JavaScript or whatever else is easiest. However, JavaScript cannot access the filesystem on the server, so I cannot open the file through JavaScript.
Any easy solutions out there?
Not sure how you are generating your PDFs, but if possible, skip the disk IO altogether: stash the PDF content in a byte[] and flush it out to the user from a servlet that sets the MIME type to application/pdf* and responds to a URL specified by a link in your client (or set dynamically into a <div> by JavaScript). You're probably taking the memory hit anyway, and in addition to skipping the IO, you don't have to worry about deleting the tmp files when you're done with the preview.
* I think this is right. Need to look it up.
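A minimal sketch of such a servlet (PdfGenerator is a hypothetical stand-in for however you produce the bytes):

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class PdfPreviewServlet extends HttpServlet {

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // PdfGenerator is hypothetical; substitute your own generation code.
        byte[] pdf = PdfGenerator.generate(req.getParameter("id"));
        resp.setContentType("application/pdf");
        resp.setContentLength(pdf.length);
        // "inline" asks the browser to display rather than download.
        resp.setHeader("Content-Disposition", "inline; filename=\"preview.pdf\"");
        resp.getOutputStream().write(pdf);
    }
}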
Not sure I have a complete grasp of what you are trying to achieve, but I'll give it a try anyway:
My assumption is that your final goal is to make some files (PDF, HTML) available to end users via a web application.
In that case, why not have Apache serve those files to the end users, so that your JBoss application only needs to know the path of a directory that is mapped to an Apache virtual host?
So basically, create a file and save it as /var/www/html/myappfiles/tempfile.pdf (the folder your application knows), and then provide http://mydomain.com/myappfiles (an Apache virtual host) to your users. The rest will be done by the web server.
You will have to set an environment variable or system property to let your application know where your folder resides (/var/www/html/myappfiles/ in this example).
Hopefully I was not way off :)
I agree with Peter (yo Pete!). Put the directory outside of your WAR and set up an environment variable pointing to it. Have a read of this post by Jacob Orshalick about how to configure environment variables in Seam:
As for previewing PDFs, have a look at how Google Docs handles them - it displays them as images. To do this in Java, check out the Sun PDF Renderer.
I'm not sure if this works in JBoss, given that you want a path inside a WAR archive, but you could try using ServletContext.getRealPath(String).
However, I personally would not want generated files to be inside my deployed application; instead I would configure an external data directory somewhere like $JBOSS_HOME/server/default/data/myapp
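A sketch of resolving such a directory at runtime; jboss.server.data.dir is a system property JBoss AS sets at startup, and the subfolder name is a placeholder:

import java.io.File;

public class OutputDir {

    // Resolves (and creates if needed) a per-application directory
    // under JBoss's server data dir, e.g. $JBOSS_HOME/server/default/data/myapp.
    public static File resolve() {
        File dir = new File(System.getProperty("jboss.server.data.dir"), "myapp");
        dir.mkdirs();
        return dir;
    }
}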
First, most platforms use java.io.tmpdir to set a temporary directory. Some servlet containers redefine this property to be something underneath their tree. Why do you care where the file gets written?
Second, I agree with Nicholas: After generating the PDF on the server side, you can generate a URL that, when clicked, sends the file to the browser. If you use MIME type application/pdf, the browser should do the right thing with it.