How can I override the removeEldestEntry method to save the eldest entry to a file? Also, how can I limit the size of that file the way I limited the size of the LinkedHashMap? Here is the code:
import java.util.*;

public class level1 {
    private static final int max_cache = 50;
    private Map cache = new LinkedHashMap(max_cache, .75F, true) {
        protected boolean removeEldestEntry(Map.Entry eldest) {
            return size() > max_cache;
        }
    };

    public level1() {
        for (int i = 1; i < 52; i++) {
            String string = String.valueOf(i);
            cache.put(string, string);
            System.out.println("\rCache size = " + cache.size() +
                    "\tRecent value = " + i + " \tLast value = " +
                    cache.get(string) + "\tValues in cache=" +
                    cache.values());
        }
    }
}
I tried to use FileOutputStream:
private Map cache = new LinkedHashMap(max_cache, .75F, true) {
    protected boolean removeEldestEntry(Map.Entry eldest) throws IOException {
        boolean removed = super.removeEldestEntry(eldest);
        if (removed) {
            FileOutputStream fos = new FileOutputStream("t.tmp");
            ObjectOutputStream oos = new ObjectOutputStream(fos);
            oos.writeObject(eldest.getValue());
            oos.close();
        }
        return removed;
    }
};
But I got an error:
Error(15,27): removeEldestEntry(java.util.Map.Entry) in cannot override removeEldestEntry(java.util.Map.Entry) in java.util.LinkedHashMap; overridden method does not throw java.io.IOException
Without the throws IOException clause, the compiler asks me to handle IOException and FileNotFoundException.
Maybe another way exists? Please show me example code; I am new to Java and just trying to understand the basic principles of two-level caching. Thanks.
You first need to make sure your method properly overrides the parent. You can make some small changes to the signature, such as throwing a more specific checked exception that is a subclass of a checked exception declared in the parent. In this case the parent declares no checked exceptions, so you cannot refine that further and may not throw any checked exceptions at all. You will have to handle the IOException locally. There are several ways you can do that: convert it to a RuntimeException of some kind, log it, or both.
If you are concerned about the file size, you probably do not want to keep just the last removed entry but many of them - so you should open the file for append.
You need to return true from the method to actually remove the eldest and you need to decide if the element should be removed.
When working with files you should use try/finally to ensure that you close the resource even if there is an exception. This can get a little ugly - sometimes it's nice to have a utility method to do the close so you don't need the extra try/catch.
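For instance, a small helper along these lines (a sketch; the class and method names are arbitrary) keeps the finally block tidy:

import java.io.Closeable;
import java.io.IOException;

public final class IoUtil {
    // Close a resource, swallowing the secondary exception;
    // the primary exception from the try block (if any) matters more.
    public static void closeQuietly(Closeable c) {
        if (c == null) {
            return;
        }
        try {
            c.close();
        } catch (IOException e) {
            // log the failure here if desired
        }
    }
}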
Generally you should also use some buffering for file I/O, which greatly improves performance; in this case, wrap the file stream in a java.io.BufferedOutputStream and provide that to the ObjectOutputStream.
Here is something that may do what you want:
private static final int MAX_ENTRIES_ALLOWED = 100;
private static final long MAX_FILE_SIZE = 1L * 1024 * 1024; // 1 MB

protected boolean removeEldestEntry(Map.Entry eldest) {
    if (size() <= MAX_ENTRIES_ALLOWED) {
        return false;
    }
    File objFile = new File("t.tmp");
    if (objFile.length() > MAX_FILE_SIZE) {
        // Do something here to manage the file size, such as renaming the file.
        // You won't be able to easily remove an object from the file without a more
        // advanced file structure, since you are writing arbitrarily sized serialized
        // objects. You would need to do some kind of tagging of each entry or include
        // a record length before each one. Then you would have to scan and rebuild
        // a new file. You cannot easily just delete bytes earlier in the file without
        // even more advanced structures (like having an index, fixed-size records and
        // free space lists, or even a database).
    }
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(objFile, true); // Open for append
        ObjectOutputStream oos = new ObjectOutputStream(new BufferedOutputStream(fos));
        oos.writeObject(eldest.getValue());
        oos.close(); // Close the object stream to flush any remaining buffered data.
        return true;
    } catch (IOException e) {
        // Log the error here or...
        throw new RuntimeException(e.getMessage(), e); // ...convert to a RuntimeException
    } finally {
        if (fos != null) {
            try {
                fos.close();
            } catch (IOException e2) {
                // Log the failure - no need to throw, though
            }
        }
    }
}
You can't add checked exceptions to the signature when overriding a method, so you need to handle the exception inside the overridden method instead of throwing it.
This contains a good explanation on how to use try and catch: http://download.oracle.com/javase/tutorial/essential/exceptions/try.html
How can we write a byte array to a file (and read it back from that file) in Java?
Yes, we all know there are already lots of questions like that, but they get very messy and subjective due to the fact that there are so many ways to accomplish this task.
So let's reduce the scope of the question:
Domain:
Android / Java
What we want:
Fast (as possible)
Bug-free (in a rigidly meticulous way)
What we are not doing:
Third-party libraries
Any libraries that require Android API later than 23 (Marshmallow)
(So, that rules out Apache Commons, Google Guava, Java.nio, and leaves us with good ol' Java.io)
What we need:
Byte array is always exactly the same (content and size) after going through the write-then-read process
Write method only requires two arguments: File file, and byte[] data
Read method returns a byte[] and only requires one argument: File file
In my particular case, these methods are private (not a library) and are NOT responsible for the following, (but if you want to create a more universal solution that applies to a wider audience, go for it):
Thread-safety (file will not be accessed by more than one process at once)
File being null
File pointing to non-existent location
Lack of permissions at the file location
Byte array being too large
Byte array being null
Dealing with any "index," "length," or "append" arguments/capabilities
So... we're sort of in search of the definitive bullet-proof code that people in the future can assume is safe to use because your answer has lots of up-votes and there are no comments that say, "That might crash if..."
This is what I have so far:
Write Bytes To File:
private void writeBytesToFile(final File file, final byte[] data) {
    try {
        FileOutputStream fos = new FileOutputStream(file);
        fos.write(data);
        fos.close();
    } catch (Exception e) {
        Log.i("XXX", "BUG: " + e);
    }
}
Read Bytes From File:
private byte[] readBytesFromFile(final File file) {
    RandomAccessFile raf;
    byte[] bytesToReturn = new byte[(int) file.length()];
    try {
        raf = new RandomAccessFile(file, "r");
        raf.readFully(bytesToReturn);
    } catch (Exception e) {
        Log.i("XXX", "BUG: " + e);
    }
    return bytesToReturn;
}
From what I've read, the possible Exceptions are:
FileNotFoundException : Am I correct that this should not happen as long as the file path being supplied was derived using Android's own internal tools and/or if the app was tested properly?
IOException : I don't really know what could cause this... but I'm assuming that there's no way around it if it does.
So with that in mind... can these methods be improved or replaced, and if so, with what?
It looks like these are going to be core utility/library methods which must run on Android API 23 or later.
Concerning library methods, I find it best to make no assumptions on how applications will use these methods. In some cases the applications may want to receive checked IOExceptions (because data from a file must exist for the application to work), in other cases the applications may not even care if data is not available (because data from a file is only cache that is also available from a primary source).
When it comes to I/O operations, there is never a guarantee that operations will succeed (e.g. user dropping phone in the toilet). The library should reflect that and give the application a choice on how to handle errors.
To optimize I/O performance, always assume the "happy path" and catch errors to figure out what went wrong. This is counterintuitive to normal programming but essential in dealing with storage I/O. For example, just checking if a file exists before reading from it can make your application twice as slow - all these kinds of I/O actions add up fast and slow your application down. Just assume the file exists and, if you get an error, only then check whether the file exists.
So given those ideas, the main functions could look like:
public static void writeFile(File f, byte[] data) throws FileNotFoundException, IOException {
    try (FileOutputStream out = new FileOutputStream(f)) {
        out.write(data);
    }
}

public static int readFile(File f, byte[] data) throws FileNotFoundException, IOException {
    try (FileInputStream in = new FileInputStream(f)) {
        return in.read(data);
    }
}
Notes about the implementation:
The methods can also throw runtime exceptions like NullPointerException - these methods are never going to be "bug free".
I do not think buffering is needed or wanted in the methods above, since only one native call is done.
The application now also has the option to read only the beginning of a file.
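For example, a caller that only cares about the first few bytes of a file could do something like this (a sketch; the 16-byte header size is arbitrary):

// Read only the beginning of a file, e.g. to sniff a header.
public static byte[] readHeader(File f) throws IOException {
    byte[] header = new byte[16];
    int read = readFile(f, header); // readFile(File, byte[]) as defined above
    return java.util.Arrays.copyOf(header, Math.max(read, 0));
}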
To make it easier for an application to read a file, an additional method can be added. But note that it is up to the library to detect any errors and report them to the application since the application itself can no longer detect those errors.
public static byte[] readFile(File f) throws FileNotFoundException, IOException {
    int fsize = verifyFileSize(f);
    byte[] data = new byte[fsize];
    int read = readFile(f, data);
    verifyAllDataRead(f, data, read);
    return data;
}

private static int verifyFileSize(File f) throws IOException {
    long fsize = f.length();
    if (fsize > Integer.MAX_VALUE) {
        throw new IOException("File size (" + fsize + " bytes) for " + f.getName() + " too large.");
    }
    return (int) fsize;
}

public static void verifyAllDataRead(File f, byte[] data, int read) throws IOException {
    if (read != data.length) {
        throw new IOException("Expected to read " + data.length
                + " bytes from file " + f.getName() + " but got only " + read + " bytes from file.");
    }
}
This implementation adds another hidden point of failure: an OutOfMemoryError at the point where the new data array is created.
To accommodate applications further, additional methods can be added to help with different scenarios. For example, let's say the application really does not want to deal with checked exceptions:
public static void writeFileData(File f, byte[] data) {
    try {
        writeFile(f, data);
    } catch (Exception e) {
        fileExceptionToRuntime(e);
    }
}

public static byte[] readFileData(File f) {
    try {
        return readFile(f);
    } catch (Exception e) {
        fileExceptionToRuntime(e);
    }
    return null;
}

public static int readFileData(File f, byte[] data) {
    try {
        return readFile(f, data);
    } catch (Exception e) {
        fileExceptionToRuntime(e);
    }
    return -1;
}

private static void fileExceptionToRuntime(Exception e) {
    if (e instanceof RuntimeException) { // e.g. NullPointerException
        throw (RuntimeException) e;
    }
    RuntimeException re = new RuntimeException(e.toString());
    re.setStackTrace(e.getStackTrace());
    throw re;
}
The method fileExceptionToRuntime is a minimal implementation, but it shows the idea here.
The library could also help an application to troubleshoot when an error does occur. For example, a method canReadFile(File f) could check if a file exists and is readable and is not too large. The application could call such a function after a file-read fails and check for common reasons why a file cannot be read. The same can be done for writing to a file.
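Such a helper might look like this (a sketch; the checks and messages are only illustrative):

// Returns null if no obvious problem is found, otherwise a human-readable reason.
public static String canReadFile(File f) {
    if (f == null) return "file is null";
    if (!f.exists()) return "file does not exist";
    if (!f.isFile()) return "not a regular file";
    if (!f.canRead()) return "no read permission";
    if (f.length() > Integer.MAX_VALUE) return "file too large for a byte array";
    return null;
}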
Although you can't use third party libraries, you can still read their code and learn from their experience. In Google Guava for example, you usually read a file into bytes like this:
FileInputStream reader = new FileInputStream("test.txt");
byte[] result = ByteStreams.toByteArray(reader);
The core implementation of this is toByteArrayInternal. Before calling this, you should check:
A not null file is passed (NullPointerException)
The file exists (FileNotFoundException)
After that, it is reduced to handling an InputStream, and this is where IOExceptions come from. When reading streams, a lot of things outside the control of your application can go wrong (bad sectors and other hardware issues, malfunctioning drivers, OS access rights) and manifest themselves as an IOException.
I am copying here the implementation:
private static final int BUFFER_SIZE = 8192;

/** Max array length on JVM. */
private static final int MAX_ARRAY_LEN = Integer.MAX_VALUE - 8;

private static byte[] toByteArrayInternal(InputStream in, Queue<byte[]> bufs, int totalLen)
        throws IOException {
    // Starting with an 8k buffer, double the size of each successive buffer. Buffers are retained
    // in a deque so that there's no copying between buffers while reading and so all of the bytes
    // in each new allocated buffer are available for reading from the stream.
    for (int bufSize = BUFFER_SIZE;
            totalLen < MAX_ARRAY_LEN;
            bufSize = IntMath.saturatedMultiply(bufSize, 2)) {
        byte[] buf = new byte[Math.min(bufSize, MAX_ARRAY_LEN - totalLen)];
        bufs.add(buf);
        int off = 0;
        while (off < buf.length) {
            // always OK to fill buf; its size plus the rest of bufs is never more than MAX_ARRAY_LEN
            int r = in.read(buf, off, buf.length - off);
            if (r == -1) {
                return combineBuffers(bufs, totalLen);
            }
            off += r;
            totalLen += r;
        }
    }
    // read MAX_ARRAY_LEN bytes without seeing end of stream
    if (in.read() == -1) {
        // oh, there's the end of the stream
        return combineBuffers(bufs, MAX_ARRAY_LEN);
    } else {
        throw new OutOfMemoryError("input is too large to fit in a byte array");
    }
}
As you can see, most of the logic has to do with reading the file in chunks. This is to handle situations where you don't know the size of the InputStream before you start reading. In your case you only need to read files, and you should be able to know the length beforehand, so this complexity can be avoided.
The other check is for OutOfMemoryError. On standard Java the per-array limit is rarely a concern, but on Android the available heap is much smaller. You should check, before trying to read the file, that there is enough memory available.
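A rough pre-check could look like this (a sketch; the Runtime figures are approximations and the headroom factor is arbitrary):

// Rough guard before allocating a byte array for the whole file.
public static void checkEnoughMemory(File f) throws IOException {
    Runtime rt = Runtime.getRuntime();
    long used = rt.totalMemory() - rt.freeMemory();
    long available = rt.maxMemory() - used;
    if (f.length() > available / 2) { // keep headroom for the rest of the app
        throw new IOException("Not enough free heap to read " + f.length() + " bytes");
    }
}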
I am given an assignment where we are not allowed to use a DB or libraries but only textfile for data storage.
But it has rather complex requirements, e.g. many validations; because of those, we need to "access the db" (i.e. read the text file) many times.
My question is: should I create a class like this:
class SomeRepository {
    static ArrayList<Users> users = new ArrayList<>();

    public SomeRepository() {
        // Instantiate this class on program load.
        // In the constructor, we read the text file, instantiate and store everything inside the ArrayList.
    }

    // public getOneUser() { // for get methods, we don't read from the text file at all }

    // public save() { // text file saving code over here }
}
Is this a good approach to solve the above problem? Currently, what we are doing is reading and writing to the text file every time we want to retrieve some data or write something new.
Wouldn't this be too expensive in terms of heap memory? Or should I just read/write the text file in every method?
public class IOManager {

    public static void writeObjToTxtFile(String fileName, Object object) {
        File file = new File(fileName + ".txt"); // File will be created in the root directory where the program runs.
        try (FileOutputStream fos = new FileOutputStream(file);
             ObjectOutputStream oos = new ObjectOutputStream(fos)) {
            oos.writeObject(object);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static Object readObjFromTxtFile(String fileName) {
        Object obj = null;
        File file = new File(fileName + ".txt");
        try (FileInputStream fis = new FileInputStream(file);
             ObjectInputStream ois = new ObjectInputStream(fis)) {
            obj = ois.readObject();
        } catch (ClassNotFoundException | IOException e) {
            e.printStackTrace();
        }
        return obj;
    }
}
Add this class to your project. Since it is general for all Objects, you can pass and receive objects such as ArrayList<Users> as well. Play around and tinker with it to fit whatever your specific purpose is. Hint: you can write other custom methods that call these methods, e.g.:
public static void writeUsersToFile(ArrayList<Users> usersArrayList) {
    writeObjToTxtFile("users", usersArrayList);
}
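A matching read helper could look like this (a sketch; the unchecked cast is unavoidable with this Object-based API):

@SuppressWarnings("unchecked")
public static ArrayList<Users> readUsersFromFile() {
    return (ArrayList<Users>) readObjFromTxtFile("users");
}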
P.S. Make sure your objects implement Serializable, e.g.:
public class Users implements Serializable {
}
I would suggest reading the contents of your file into a dynamic list such as an ArrayList at the start of your program. Make the required queries/changes to your ArrayList and then write that ArrayList to your file when the program is about to close. This will save significant time over repeated file reads/writes.
This isn't without its drawbacks, though. You don't want to hog memory in the case of very large files - but since this is an assignment, that may not be an issue. Additionally, should your program terminate prior to the write at the end, all changes made to your database during the current execution will be lost.
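One partial mitigation, sketched here against the hypothetical SomeRepository from the question, is a JVM shutdown hook; note it will not run on a hard kill or a crash:

public static void main(String[] args) {
    SomeRepository repository = new SomeRepository(); // reads the text file once
    // Flush the in-memory data back to the text file on normal JVM exit.
    Runtime.getRuntime().addShutdownHook(new Thread(repository::save));
    // ... the rest of the program works against the in-memory list ...
}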
EDIT (to avoid confusion): null has been written into the files "abc" and "efg".
After running the following code, the contents of file "abc" (which were initially null) change, and I get an EOFException on every subsequent execution:
ObjIStream = new ObjectInputStream(new FileInputStream("abc"));
M[][] objs = (M[][]) ObjIStream.readObject();
FS.objs = objs;
ObjIStream.close();
Here, FS.objs is a static member of class FS, of type M[][].
On the other hand, this one has no effect on the file and I don't get any Exceptions after any number of executions:
ObjIStream = new ObjectInputStream(new FileInputStream("abc"));
M[][] objs = (M[][]) ObjIStream.readObject();
ObjIStream.close();
EDIT: I just found the cause of the trouble, which exists in class FS in this form:
static {
    try {
        ObjOStream = new ObjectOutputStream(new FileOutputStream("abc"));
        ObjOStream.close();
        ObjOStream = new java.io.ObjectOutputStream(new java.io.FileOutputStream("efg"));
        ObjOStream.close();
    }
    catch (IOException ex) { }
}
How is that causing the trouble, anyway?
The problem is new FileOutputStream("abc") itself, which means new FileOutputStream("abc", false). It wipes out all the data in the file because you are not going to append anything. It calls FileOutputStream.open(String name, boolean append), which is a private native function that erases everything in the file when in overwrite mode.
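A minimal illustration of the difference (assuming the file "abc" already has content):

import java.io.FileOutputStream;
import java.io.IOException;

public class AppendDemo {
    public static void main(String[] args) throws IOException {
        new FileOutputStream("abc").close();        // truncates the file immediately on open
        new FileOutputStream("abc", false).close(); // identical: explicit overwrite mode
        new FileOutputStream("abc", true).close();  // append mode: existing bytes survive
    }
}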
I'm doing an animation in Processing. I'm using random points and I need to execute the code twice for stereo vision.
I have lots of random variables in my code, so I should either save them somewhere for the second run or regenerate the SAME sequence of "random" numbers every time I run the program (as said here: http://www.coderanch.com/t/372076/java/java/save-random-numbers).
Is this approach possible? How? If I save the numbers in a txt file and then read it, will my program run slower? What's the best way to do this?
Thanks.
If you just need to be able to generate the same sequence for a limited time, seeding the random number generator with the same value is most likely the easiest and fastest way to go. Just make sure that any parallel threads always request their pseudo-random numbers in the same order, or you'll be in trouble.
Note, though, that as far as I know nothing guarantees the same sequence if you update your Java VM or even apply a patch, so if you want long-term storage for your sequence, or want to be able to use it outside of your Java program, you need to save it to a file.
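A minimal sketch of the seeding approach, using java.util.Random with an arbitrary seed value:

import java.util.Random;

public class SeedDemo {
    public static void main(String[] args) {
        Random first = new Random(12345L);  // same seed...
        Random second = new Random(12345L); // ...same sequence
        for (int i = 0; i < 3; i++) {
            System.out.println(first.nextDouble() == second.nextDouble()); // prints true
        }
    }
}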
Here is a sample of the file-based approach:
public static void writeRandomDoublesToFile(String filePath, int numbersCount) throws IOException
{
    // try-with-resources ensures the streams are flushed and closed
    try (FileOutputStream fos = new FileOutputStream(new File(filePath));
         BufferedOutputStream bos = new BufferedOutputStream(fos);
         DataOutputStream dos = new DataOutputStream(bos)) {
        dos.writeInt(numbersCount);
        for (int i = 0; i < numbersCount; i++) dos.writeDouble(Math.random());
    }
}

public static double[] readRandomDoublesFromFile(String filePath) throws IOException
{
    try (FileInputStream fis = new FileInputStream(new File(filePath));
         BufferedInputStream bis = new BufferedInputStream(fis);
         DataInputStream dis = new DataInputStream(bis)) {
        int numbersCount = dis.readInt();
        double[] result = new double[numbersCount];
        for (int i = 0; i < numbersCount; i++) result[i] = dis.readDouble();
        return result;
    }
}
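Hypothetical usage, assuming a main method in the same class as the two methods above (the file name is arbitrary):

public static void main(String[] args) throws IOException {
    String path = "random-doubles.bin";
    writeRandomDoublesToFile(path, 1000);
    double[] values = readRandomDoublesFromFile(path);
    System.out.println("Restored " + values.length + " doubles");
}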
Well, there's a couple of ways that you can approach this problem. One of them would be to save the random variables as input into a file and pass that file name as a parameter to your program.
And you could do that in one of two ways, the first of which would be to use the args[] parameter:
import java.io.*;
import java.util.*;

public class bla {
    public static void main(String[] args) throws FileNotFoundException {
        // You'd need to put some verification code here to make
        // sure that input was actually sent to the program.
        Scanner in = new Scanner(new File(args[0]));
        while (in.hasNextLine()) {
            System.out.println(in.nextLine());
        }
    }
}
Another way would be to use Scanner and read from console input. It's all the same code as above, but instead of Scanner in = new Scanner(new File(args[0])); and the verification code above it, you'd substitute Scanner in = new Scanner(System.in); - but that's just to load the file.
The process of generating those points could be done in the following manner:
import java.util.*;
import java.io.*;

public class generator {
    public static void main(String[] args) throws IOException {
        // You'd get some user input (or not) here
        // that would ask for the file to save to,
        // and that can be done by either using the
        // Scanner class like the input example above,
        // or by using args, but in this case we'll
        // just say:
        String fileName = "somefile.txt";
        FileWriter fstream = new FileWriter(fileName);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write("Stuff");
        out.close();
    }
}
Both of those solutions are simple ways to read and write to and from a file in Java. However, if you deploy either of those solutions, you're still left with some kind of parsing of the data.
If it were me, I'd go for object serialization, and store a binary copy of the data structure I've already generated to disk rather than having to parse and reparse that information in an inefficient way. (Using text files, usually, takes up more disk space.)
And here's how you would do that (here I'm going to reuse code that has already been written and comment on it along the way). Source
You declare some wrapper class that holds data (you don't always have to do this, by the way.)
public class Employee implements java.io.Serializable
{
    public String name;
    public String address;
    public transient int SSN;
    public int number;

    public void mailCheck()
    {
        System.out.println("Mailing a check to " + name
                + " " + address);
    }
}
And then, to serialize:
import java.io.*;

public class SerializeDemo
{
    public static void main(String[] args)
    {
        Employee e = new Employee();
        e.name = "Reyan Ali";
        e.address = "Phokka Kuan, Ambehta Peer";
        e.SSN = 11122333;
        e.number = 101;
        try
        {
            FileOutputStream fileOut = new FileOutputStream("employee.ser");
            ObjectOutputStream out = new ObjectOutputStream(fileOut);
            out.writeObject(e);
            out.close();
            fileOut.close();
        } catch (IOException i)
        {
            i.printStackTrace();
        }
    }
}
And then, to deserialize:
import java.io.*;

public class DeserializeDemo
{
    public static void main(String[] args)
    {
        Employee e = null;
        try
        {
            FileInputStream fileIn = new FileInputStream("employee.ser");
            ObjectInputStream in = new ObjectInputStream(fileIn);
            e = (Employee) in.readObject();
            in.close();
            fileIn.close();
        } catch (IOException i)
        {
            i.printStackTrace();
            return;
        } catch (ClassNotFoundException c)
        {
            System.out.println("Employee class not found");
            c.printStackTrace();
            return;
        }
        System.out.println("Deserialized Employee...");
        System.out.println("Name: " + e.name);
        System.out.println("Address: " + e.address);
        System.out.println("SSN: " + e.SSN);
        System.out.println("Number: " + e.number);
    }
}
Another alternative solution to your problem, that does not involve storing data, is to create a lazy generator for whatever function that provides you your random values, and provide the same seed each and every time. That way, you don't have to store any data at all.
However, that is still probably slower than serializing the object to disk and loading it back up again. (Of course, that's a really subjective statement, and I'm not going to enumerate the cases where it isn't true.) The advantage of doing it this way is that it doesn't require any storage at all.
Another way, which you may not have thought of, is to create a wrapper around your generator function that memoizes the output - meaning that data that has been generated before will be retrieved from memory and will not have to be generated again for the same inputs. You can see some resources on that here: Memoization source
The idea behind memoizing your function calls is that you save time without persisting to disk. This is ideal if the same values are generated over and over and over again. Of course, for a set of random points, this isn't going to work very well if every point is unique, but keep that in the back of your mind.
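For illustration, a minimal memoizing wrapper might look like this (a sketch; the names are not from the linked source):

import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class Memoizer<K, V> {
    private final Map<K, V> cache = new HashMap<>();
    private final Function<K, V> generator;

    public Memoizer(Function<K, V> generator) {
        this.generator = generator;
    }

    public V get(K key) {
        // Compute the value once per distinct key, then serve it from memory.
        return cache.computeIfAbsent(key, generator);
    }
}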
The really interesting part comes when considering the ways that all the previous strategies I've described in this post can be combined together.
It'd be interesting to set up a Memoizer class, as described on the second page of 2, and then implement java.io.Serializable in that class. After that, you can add save(String fileName) and load(String fileName) methods to the memoizer class that make serialization and deserialization easier, so you can persist the cache used to memoize the function. Very useful.
Anyway, enough is enough. In short, just use the same seed value, and generate the same point pairs on the fly.
Note: Please do not judge this question. To those who think that I am doing this to "cheat"; you are mistaken, as I am no longer in school anyway. In addition, if I was, myself actually trying to cheat, I would simply use services that have already been created for this, instead of recreating the program. I took on this project because I thought it might be fun, nothing else. Before you down-vote, please consider the value of the question it's self, and not the speculative uses of it, as the purpose of SO is not to judge, but simply give the public information.
I am developing a program in Java that is supposed to intentionally corrupt a file (specifically a .doc, .txt, or .pdf, but others would be good as well).
I initially tried this:
public void corruptFile(String pathInName, String pathOutName) {
    curroptMethod method = new curroptMethod();
    ArrayList<Integer> corruptHash = corrupt(getBytes(pathInName));
    writeBytes(corruptHash, pathOutName);
    new MimetypesFileTypeMap().getContentType(new File(pathInName));
    // "/home/ephraim/Desktop/testfile"
}
public ArrayList<Integer> getBytes(String filePath) {
    ArrayList<Integer> fileBytes = new ArrayList<Integer>();
    try {
        FileInputStream myInputStream = new FileInputStream(new File(filePath));
        do {
            int currentByte = myInputStream.read();
            if (currentByte == -1) {
                System.out.println("broke loop");
                break;
            }
            fileBytes.add(currentByte);
        } while (true);
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    System.out.println(fileBytes);
    return fileBytes;
}
public void writeBytes(ArrayList<Integer> hash, String pathName) {
    try {
        OutputStream myOutputStream = new FileOutputStream(new File(pathName));
        for (int currentHash : hash) {
            myOutputStream.write(currentHash);
        }
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    //System.out.println(hash);
}
public ArrayList<Integer> corrupt(ArrayList<Integer> hash) {
    ArrayList<Integer> corruptHash = new ArrayList<Integer>();
    ArrayList<Integer> keywordCodeArray = new ArrayList<Integer>();
    Integer keywordIndex = 0;
    String keyword = "corruptthisfile";
    for (int i = 0; i < keyword.length(); i++) {
        keywordCodeArray.add(keyword.codePointAt(i));
    }
    for (Integer currentByte : hash) {
        //Integer currentByteProduct = (keywordCodeArray.get(keywordIndex) + currentByte) / 2;
        Integer currentByteProduct = currentByte - keywordCodeArray.get(keywordIndex);
        if (currentByteProduct < 0) currentByteProduct += 255;
        corruptHash.add(currentByteProduct);
        if (keywordIndex == (keyword.length() - 1)) {
            keywordIndex = 0;
        } else keywordIndex++;
    }
    //System.out.println(corruptHash);
    return corruptHash;
}
but the problem is that the file can still be opened. When you open it, all of the words are changed (they may not make any sense, and they may not even be letters), but it can still be opened.
so here is my actual question:
Is there a way to make a file so corrupt that the computer doesn't know how to open it at all (ie. when you open it, the computer will say something along the lines of "this file is not recognized, and cannot be opened")?
I think you want to look into RandomAccessFile. Also, it is almost always the case that a program recognizes its files by their very start, so open the file and scramble the first 5 bytes.
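A minimal sketch of that idea (the class name is arbitrary and error handling is kept minimal):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.Random;

public class HeaderScrambler {
    public static void scrambleHeader(String path) throws IOException {
        byte[] junk = new byte[5];
        new Random().nextBytes(junk);
        try (RandomAccessFile raf = new RandomAccessFile(path, "rw")) {
            raf.seek(0);      // position at the magic bytes / signature
            raf.write(junk);  // overwrite them with random garbage
        }
    }
}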
The only way to fully corrupt an arbitrary file is to replace all of its contents with random garbage. Even then, there is a vanishingly small probability that the random garbage will actually turn out to be something meaningful.
Depending on the file type, it may be possible to recover from limited - or even from not so limited - corruption. E.g.:
Streaming media codecs are designed with network packet loss taken into account. Limited corruption may show up as picture artifacts, or even as a few lost frames, but the content is usually still viewable.
Block-based compression algorithms, such as bzip2, allow undamaged blocks to be recovered.
File-based compression systems such as rar and zip may be able to recover those files whose compressed data has not been damaged, regardless of damage to the rest of the archive.
Human-readable text, such as text files and source code files, is still viewable in a text editor even if parts of it are corrupt - not to mention that its size does not change. Unless you corrupted the whole thing, any casual reader would be able to tell whether an assignment was done and whether the retransmitted file was the same as the one that got corrupted.
Apart from the ethical issue, have you considered that this would be a one-time thing only? Data corruption does happen, but it's not that frequent and it's never that convenient...
If you are that desperate for more time, you would be better off breaking your leg and getting yourself admitted to a hospital.
There are better ways:
Your professor accepts Word documents. Infect it with a macro virus before sending.
"Forget" to attach the file to the email.
Forge the send date on your email. If your prof is the kind that accepts Word docs, this may work.