How to upload file outside web server? - java

I want to upload files to a location outside the web server, such as the D: drive, from my servlet, but I'm not able to upload them.
What do I have to do to enable this in Tomcat 6.0?

This ought to just work. All you basically need to do is obtain the uploaded file in the form of an InputStream from the request body. You normally use Apache Commons FileUpload for this. Then you can write it to any OutputStream you like the usual Java IO way, such as a FileOutputStream.
Assuming that you're actually using Apache Commons FileUpload which requires Apache Commons IO as a dependency, here's a basic example:
String filename = FilenameUtils.getName(fileItem.getName()); // Important!
File destination = new File("D:/path/to/files", filename);
InputStream input = null;
OutputStream output = null;
try {
input = fileItem.getInputStream();
output = new FileOutputStream(destination);
IOUtils.copy(input, output);
} finally {
IOUtils.closeQuietly(output);
IOUtils.closeQuietly(input);
}
Alternatively, you can also just use FileUpload's convenient FileItem#write() method:
String filename = FilenameUtils.getName(fileItem.getName()); // Important!
File destination = new File("D:/path/to/files", filename);
fileItem.write(destination);
For more examples, hints and tricks, check the FileUpload User Guide and FAQ.


Use Apache Commons VFS RAM file to avoid using file system with API requiring a file

There is a highly upvoted comment on this post:
how to create new java.io.File in memory?
where Sorin Postelnicu mentions using an Apache Commons VFS RAM file as a way to have an in memory file to pass to an API that requires a java.io.File (I am paraphrasing... I hope I haven't missed the point).
Based on reading related posts I have come up with this sample code:
@Test
public void working () throws IOException {
DefaultFileSystemManager manager = new DefaultFileSystemManager();
manager.addProvider("ram", new RamFileProvider());
manager.init();
final String rootPath = "ram://virtual";
manager.createVirtualFileSystem(rootPath);
String hello = "Hello, World!";
FileObject testFile = manager.resolveFile(rootPath + "/test.txt");
testFile.createFile();
OutputStream os = testFile.getContent().getOutputStream();
os.write(hello.getBytes());
os.close(); // flush and release the content stream before closing the file
testFile.close();
manager.close();
}
So, I think that I have an in memory file called ram://virtual/test.txt with contents "Hello, World!"
My question is: how could I use this file with an API that requires a java.io.File?
Java's File API always works with the native file system, so there is no way of converting VFS's FileObject to a File without having the file present on the native file system.
But there is a way if your API can also work with an InputStream. Most libraries have overloaded methods that take InputStreams. In that case, the following should work:
InputStream is = testFile.getContent().getInputStream();
SampleApi api = new SampleApi(is);
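If the API strictly requires a java.io.File, the usual fallback is to spill the stream to a temporary file first. Here is a minimal sketch with plain java.io/java.nio (the FileObject's input stream would take the place of the ByteArrayInputStream; class and method names here are illustrative, not part of VFS):

```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class StreamToTempFile {
    // Copies any InputStream to a temp file so it can be handed to a File-only API.
    public static File toTempFile(InputStream in) throws IOException {
        Path tmp = Files.createTempFile("vfs-spill", ".tmp");
        tmp.toFile().deleteOnExit(); // best-effort cleanup
        Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
        return tmp.toFile();
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("Hello, World!".getBytes("UTF-8"));
        File f = toTempFile(in);
        System.out.println(Files.readAllLines(f.toPath()).get(0)); // prints Hello, World!
    }
}
```

This obviously gives up the "in memory only" property, but it is sometimes the only option when the downstream API cannot be changed.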

Upload to S3 using Gzip in Java

I'm new to Java and I'm trying to upload a large file (~10GB) to Amazon S3. Could anyone please help me with how to use a GZIP output stream for this?
I've been through some documentation but got confused about byte streams and GZIP streams. Must they be used together? Can anyone help me with this piece of code?
Thanks in advance.
Have a look at this:
Is it possible to gzip and upload this string to Amazon S3 without ever being written to disk?
ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
GZIPOutputStream gzipOut = new GZIPOutputStream(byteOut);
// write your stuff to gzipOut, then close it so the gzip trailer is flushed
byte[] bytes = byteOut.toByteArray();
// write the bytes to the Amazon stream
Since it's a large file you might want to have a look at multipart upload.
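The snippet above is only a fragment; a complete in-memory round trip with the JDK's own java.util.zip classes (note the spelling GZIPOutputStream) looks like this:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    // Gzip-compresses a byte array fully in memory.
    public static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
        try (GZIPOutputStream gzipOut = new GZIPOutputStream(byteOut)) {
            gzipOut.write(data);
        } // close() writes the gzip trailer; do this before reading byteOut
        return byteOut.toByteArray();
    }

    // Decompresses, to verify the round trip.
    public static byte[] gunzip(byte[] data) throws IOException {
        try (GZIPInputStream gzipIn = new GZIPInputStream(new ByteArrayInputStream(data))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = gzipIn.read(buf)) != -1) out.write(buf, 0, n);
            return out.toByteArray();
        }
    }
}
```

Bear in mind that buffering the whole payload in memory like this is only viable for small objects; for a ~10GB file you would stream to a temporary file (as in the next answer) or use S3 multipart upload instead.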
This question could have been more specific, and there are several ways to achieve this. One approach might look like the one below.
The example depends on the commons-io and commons-compress libraries, and uses classes from the java.nio.file package.
public static void compressAndUpload(AmazonS3 s3, InputStream in)
throws IOException
{
// Create temp file
Path tmpPath = Files.createTempFile("prefix", "suffix");
// Create and write to gzip compressor stream
OutputStream out = Files.newOutputStream(tmpPath);
GzipCompressorOutputStream gzOut = new GzipCompressorOutputStream(out);
IOUtils.copy(in, gzOut);
gzOut.close(); // must close so the gzip trailer is written before reading the file back
// Read content from temp file
InputStream fileIn = Files.newInputStream(tmpPath);
long size = Files.size(tmpPath);
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentType("application/x-gzip");
metadata.setContentLength(size);
// Upload file to S3
s3.putObject(new PutObjectRequest("bucket", "key", fileIn, metadata));
}
Buffering, error handling and closing of streams are omitted for brevity.

How to write encoded text to a file using Java?

How can I write encoded text to a file using Java/JSP with FileWriter?
FileWriter testfilewriter = new FileWriter(testfile, true);
testfilewriter.write(testtext);
testtext: the text to be encoded
testfile: a String (the target file name)
What I'm trying to do is encode the text with Base64 and store it in the file. What is the best way to do so?
Since your data is not plain text, you can't use FileWriter. What you need is FileOutputStream.
Encode your text as Base64:
byte[] encodedText = Base64.encodeBase64( testtext.getBytes("UTF-8") );
and write to file:
try (OutputStream stream = new FileOutputStream(testfile)) {
stream.write(encodedText);
}
or if you don't want to lose existing data, write in append mode by setting append boolean to true:
try (OutputStream stream = new FileOutputStream(testfile, true)) {
stream.write(encodedText);
}
You can do the encoding yourself and then write to the file as suggested by @Alper. Or, if you want a stream that encodes/decodes on the fly while writing to and reading from the file, the Apache Commons Codec library comes in handy; see Base64OutputStream and Base64InputStream.
Interestingly, Java 8 has a similar API, Base64.Encoder. Check out the wrap method.
Hope this helps.
The approach to follow depends on the algorithm you are using; writing the encoded file is the same as writing any other file in Java.
IMHO, if you are trying to do this from a JSP, go with servlets instead, as JSPs are not meant for the business layer; servlets are.
I'm not going to give the code, as it is pretty easy if you try it. I'll share the best way to do it as pseudo code. Here are the steps to write your encoded text.
Open input file in read mode & output file in append mode.
If the input file isn't huge (it can fit in memory), read the whole file at once; otherwise read line by line.
Encode the text retrieved from the file using a Base64 encoder.
Write to the output file in append mode.
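The steps above can be sketched in plain Java 8 using java.util.Base64 (a line-by-line variant; the class name is mine, for illustration only):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Base64;

public class EncodeFile {
    // Reads the input file line by line, Base64-encodes each line,
    // and appends the encoded lines to the output file.
    public static void encodeLines(Path in, Path out) throws IOException {
        Base64.Encoder encoder = Base64.getEncoder();
        for (String line : Files.readAllLines(in, StandardCharsets.UTF_8)) {
            String encoded = encoder.encodeToString(line.getBytes(StandardCharsets.UTF_8));
            Files.write(out, (encoded + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }
}
```

Encoding per line (rather than the whole file at once) keeps memory use bounded, at the cost of the output not being one single Base64 blob.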
You can't use a FileWriter directly for this task.
You asked how you can do it, but you didn't give any information about which JDK and library you use, so here are a few solutions with the standard tools.
If you're using Java 8:
String testFile = "";
try (Writer writer = new OutputStreamWriter(
Base64.getEncoder().wrap(
java.nio.file.Files.newOutputStream(
Paths.get(testFile),
StandardOpenOption.APPEND)),
StandardCharsets.UTF_8)
) {
writer.write("text to be encoded in Base64");
}
If you're using Java 7 with Guava:
String testFile = "";
CharSink sink = BaseEncoding.base64()
.encodingSink(
com.google.common.io.Files.asCharSink(
new File(testFile),
StandardCharsets.UTF_8,
FileWriteMode.APPEND))
.asCharSink(StandardCharsets.UTF_8);
try (Writer writer = sink.openStream()) {
writer.write("text to be encoded in Base64");
}
If you're using Java 6 with Guava:
String testFile = "";
CharSink sink = BaseEncoding.base64()
.encodingSink(
com.google.common.io.Files.asCharSink(
new File(testFile),
Charsets.UTF_8,
FileWriteMode.APPEND))
.asCharSink(Charsets.UTF_8);
Closer closer = Closer.create();
try {
Writer writer = closer.register(sink.openStream());
writer.write("text to be encoded in Base64");
} catch (Throwable e) { // must catch Throwable
throw closer.rethrow(e);
} finally {
closer.close();
}
I don't have much knowledge about other libraries so I won't pretend I do and add another helper.

Tika could not delete temporary files

In our application we are processing files using Apache Tika. But there are some files (e.g. *.mov, *.mp4) which Tika cannot process and leaves the corresponding *.tmp file in the user's Temp folder. After some research I found that it is a known bug: https://issues.apache.org/jira/browse/TIKA-1040?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
In the last comment a user talks about a workaround but it does not work for me:
final Tika tika = new Tika();
final TikaInputStream fileStream = TikaInputStream.get(/*some InputStream*/);
try {
final String extractedString = tika.parseToString(fileStream);
//do something with the string
} finally {
CloseUtils.close(fileStream);
}
Using the code above still leaves temp files in the Temp folder. What could be a solution to this?
The get() method should be called with a File object instead of an InputStream:
final File file = new File("c:/some_file.mov");
final TikaInputStream fileStream = TikaInputStream.get(file);
Tika still cannot process it, but it actually manages to delete the corresponding tmp file.
Another workaround is disabling the org.apache.tika.parser.mp4.MP4Parser. Two solutions are here:
with configuration
with code

Is there a nice, safe, quick way to write an InputStream to a File in Scala?

Specifically, I'm saving a file upload to local file in a Lift web app.
With Java 7 or later you can use Files from the new File I/O:
Files.copy(from, to)
where from can be a Path or an InputStream and to a Path or an OutputStream. This way, you can even use it to conveniently extract resources from applications packed in a jar.
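A minimal sketch of that call in plain Java, usable from Scala unchanged (the wrapper class is mine, for illustration):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyStreamToFile {
    // Copies an InputStream to the given path, replacing any existing file.
    // Returns the number of bytes copied.
    public static long save(InputStream in, Path target) throws IOException {
        return Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    }
}
```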
If it's a text file, and you want to limit yourself to Scala and Java, then using scala.io.Source to do the reading is probably the fastest--it's not built in, but easy to write:
def inputToFile(is: java.io.InputStream, f: java.io.File) {
val in = scala.io.Source.fromInputStream(is)
val out = new java.io.PrintWriter(f)
try { in.getLines().foreach(out.println(_)) }
finally { out.close }
}
But if you need other libraries anyway, you can make your life even easier by using them (as Michel illustrates).
(P.S.--in Scala 2.7, getLines should not have a () after it.)
(P.P.S.--in old versions of Scala, getLines did not remove the newline, so you need to print instead of println.)
I don't know about any Scala specific API, but since Scala is fully compatible to Java you can use any other library like Apache Commons IO and Apache Commons FileUpload.
Here is some example code (untested):
//using Commons IO:
val is = ... //input stream you want to write to a file
val os = new FileOutputStream("out.txt")
org.apache.commons.io.IOUtils.copy(is, os)
os.close()
//using Commons FileUpload
import javax.servlet.http.HttpServletRequest
import org.apache.commons.fileupload.{FileItemFactory, FileItem}
import org.apache.commons.fileupload.disk.DiskFileItemFactory
import org.apache.commons.fileupload.servlet.ServletFileUpload
import scala.collection.JavaConversions._ // needed to iterate the java.util.List with for(...)
val request: HttpServletRequest = ... //your HTTP request
val factory: FileItemFactory = new DiskFileItemFactory()
val upload = new ServletFileUpload(factory)
val items = upload.parseRequest(request).asInstanceOf[java.util.List[FileItem]]
for (item <- items) item.write(new File(item.getName))
The inputToFile method given above doesn't work well with binary files such as .pdf files; it throws a runtime exception while attempting to decode the file into a string. What worked for me was this:
def inputStreamToFile(inputStream: java.io.InputStream, file: java.io.File) = {
val fos = new java.io.FileOutputStream(file)
fos.write(
Stream.continually(inputStream.read).takeWhile(-1 !=).map(_.toByte).toArray // buffers the whole stream in memory
)
fos.close()
}
