I have to work with a large number of compiled Java classes which didn't explicitly specify a serialVersionUID. Because their UIDs were arbitrarily generated by the compiler, many of the classes which need to be serialized and deserialized end up causing exceptions, even though the actual class definitions match up. (This is all expected behavior, of course.)
It is impractical for me to go back and fix all of this 3rd-party code.
Therefore, my question is: Is there any way to make the Java runtime ignore differences in serialVersionUIDs, and only fail to deserialize when there are actual differences in structure?
If you have access to the code base, you could use the SerialVer task for Ant to insert and modify the serialVersionUID in the source code of a serializable class, and fix the problem once and for all.
If you can't, or if this is not an option (e.g. if you have already serialized some objects that you need to deserialize), one solution would be to extend ObjectInputStream. Augment its behavior to compare the serialVersionUID of the stream descriptor with the serialVersionUID of the class in the local JVM that this descriptor represents and to use the local class descriptor in case of mismatch. Then, just use this custom class for the deserialization. Something like this (credits to this message):
import java.io.IOException;
import java.io.InputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class DecompressibleInputStream extends ObjectInputStream {

    private static final Logger logger = LoggerFactory.getLogger(DecompressibleInputStream.class);

    public DecompressibleInputStream(InputStream in) throws IOException {
        super(in);
    }

    @Override
    protected ObjectStreamClass readClassDescriptor() throws IOException, ClassNotFoundException {
        ObjectStreamClass resultClassDescriptor = super.readClassDescriptor(); // initially the stream's descriptor
        Class<?> localClass; // the class in the local JVM that this descriptor represents
        try {
            localClass = Class.forName(resultClassDescriptor.getName());
        } catch (ClassNotFoundException e) {
            logger.error("No local class for " + resultClassDescriptor.getName(), e);
            return resultClassDescriptor;
        }
        ObjectStreamClass localClassDescriptor = ObjectStreamClass.lookup(localClass);
        if (localClassDescriptor != null) { // only if the class implements Serializable
            final long localSUID = localClassDescriptor.getSerialVersionUID();
            final long streamSUID = resultClassDescriptor.getSerialVersionUID();
            if (streamSUID != localSUID) { // check for serialVersionUID mismatch
                final StringBuffer s = new StringBuffer("Overriding serialized class version mismatch: ");
                s.append("local serialVersionUID = ").append(localSUID);
                s.append(" stream serialVersionUID = ").append(streamSUID);
                Exception e = new InvalidClassException(s.toString());
                logger.error("Potentially Fatal Deserialization Operation.", e);
                resultClassDescriptor = localClassDescriptor; // use the local class descriptor for deserialization
            }
        }
        return resultClassDescriptor;
    }
}
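For instance, deserialization would then go through the custom stream instead of a plain ObjectInputStream; a minimal usage sketch, where MyClass and the file name are placeholders:
try (DecompressibleInputStream ois = new DecompressibleInputStream(new FileInputStream("objects.ser"))) {
    MyClass obj = (MyClass) ois.readObject(); // tolerates a serialVersionUID mismatch
    // work with obj as usual
} catch (IOException | ClassNotFoundException e) {
    e.printStackTrace();
}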
Use CGLIB to insert the serialVersionUID fields into the binary classes?
How impractical is this to fix? If you have the source and can rebuild, can you not just run a script over the entire codebase to insert a
private static final long serialVersionUID = 1L;
everywhere?
The serialization errors at runtime tell you explicitly what the ID is expected to be. Just change your classes to declare these values as the ID and everything will be OK. This does involve making changes, but I don't believe that can be avoided.
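For example, if the exception reports the expected stream serialVersionUID, declare it in the class; the value below is only a placeholder, so copy the one from your own error message:
private static final long serialVersionUID = 123456789L; // use the UID reported in the InvalidClassException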
You could possibly use AspectJ to 'introduce' the field into each serializable class as it is loaded. I would first introduce a marker interface into each class in the package, and then introduce the field into every class marked with that interface, using a hash of the class as the serialVersionUID:
public aspect SerializationIntroducerAspect {

    // introduce marker into each class in the org.simple package
    declare parents: (org.simple.*) implements SerialIdIntroduced;

    public interface SerialIdIntroduced {}

    // add the field to each class marked with the interface above.
    private long SerialIdIntroduced.serialVersionUID = createIdFromHash();

    private long SerialIdIntroduced.createIdFromHash() {
        if (serialVersionUID == 0) {
            serialVersionUID = getClass().hashCode();
        }
        return serialVersionUID;
    }
}
You will need to add the AspectJ load-time weaving agent to the VM so it can weave the advice into your existing 3rd-party classes. It's funny, though: once you get around to setting AspectJ up, it's remarkable how many uses you will find for it.
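As a rough sketch of that setup (paths and names here are illustrative, not taken from the question), the agent is passed on the command line and the aspect is registered in a META-INF/aop.xml on the classpath:
java -javaagent:/path/to/aspectjweaver.jar -cp myapp.jar com.example.Main
<!-- META-INF/aop.xml -->
<aspectj>
    <aspects>
        <aspect name="SerializationIntroducerAspect"/>
    </aspects>
    <weaver>
        <include within="org.simple..*"/>
    </weaver>
</aspectj>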
HTH
ste
I'm trying to make a custom annotation that checks to see if a certain method is called in a method's body annotated with it. Something like:
@TypeQualifierDefault(ElementType.METHOD)
@Retention(RetentionPolicy.SOURCE)
@interface MutatingMethod {
}

interface Mutable {
    void preMutate();
    void postMutate();
    // other methods
}
And then within a certain Mutable class we would have:
class Structure<T> implements Mutable {

    @MutatingMethod
    void add(T data) {
        preMutate();
        // actual mutation code
        postMutate();
    }
}
I want to be able to get warnings of some sort if the body of a method like add that is annotated with @MutatingMethod does not include calls to preMutate and postMutate. Can an ElementVisitor (javax.lang.model.element.ElementVisitor) be used to traverse the (possibly obfuscated) statements and method calls in the body of a method? If so what would that look like? If not what else can I use?
Just to clarify, I know this is impossible (or more difficult) to accomplish at runtime via bytecode decompilation, so this annotation is meant to only work during compilation via reflection (java.lang.reflect.* and javax.lang.model.*) and is not retained in class files.
You are free to modify the code however you want to get it to work, for example by introducing a new annotation called @MutableType with which Structure and any other Mutable types must be annotated for this to work.
A cherry on top would be to assert that preMutate is called before postMutate and not after.
It shouldn't matter but I'm using Gradle and the IntelliJ IDEA IDE.
Any help is greatly appreciated; material on this is strangely scarce and/or inadequate on the web. I have been using publicly available sources to learn about this!
There are two relevant modules:
java.compiler, which contains the API for annotation processors and the simple abstraction you have already discovered.
Its ElementVisitor abstraction does not support digging into the method's code.
jdk.compiler, which contains an extended API that was originally not considered part of the standard API and hence was not included in the official API documentation prior to the introduction of the module system.
This API allows analyzing the syntax tree of the currently compiled source code.
When your starting point is an annotation processor, you should have a ProcessingEnvironment which was given to your init method. Then, you can invoke Trees.instance(ProcessingEnvironment) to get a helper object which has the method getTree(Element) you can use to get the syntax tree element. Then, you can traverse the syntax tree from there.
Most of these classes documented in the JDK 17 API already exist in earlier versions (you might notice the “since 1.6”), even when they are not present in the older documentation. But prior to JDK 9 you have to include the lib/tools.jar of the particular JDK in your classpath when compiling the annotation processor.
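For example, on a pre-9 JDK the compilation might look roughly like this (the tools.jar path is illustrative and depends on your installation):
javac -classpath .:$JAVA_HOME/lib/tools.jar anno/proc/example/MyProcessor.java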
module-info.java (when writing a modular annotation processor):
import javax.annotation.processing.Processor;

module anno.proc.example {
    requires jdk.compiler;
    provides Processor with anno.proc.example.MyProcessor;
}
package anno.proc.example;
import java.util.*;
import javax.annotation.processing.*;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import com.sun.source.tree.*;
import com.sun.source.tree.Tree.Kind;
import com.sun.source.util.Trees;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
@SupportedSourceVersion(SourceVersion.RELEASE_17) // adapt when using older version
@SupportedAnnotationTypes(MyProcessor.ANNOTATION_NAME)
public class MyProcessor extends AbstractProcessor {
static final String ANNOTATION_NAME = "my.example.MutatingMethod";
static final String REQUIRED_FIRST = "preMutate", REQUIRED_LAST = "postMutate";
// the inherited method does already store the processingEnv
// public void init(ProcessingEnvironment processingEnv) {
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
Optional<? extends TypeElement> o = annotations.stream()
.filter(e -> ANNOTATION_NAME.contentEquals(e.getQualifiedName())).findAny();
if(!o.isPresent()) return false;
TypeElement myAnnotation = o.get();
roundEnv.getElementsAnnotatedWith(myAnnotation).forEach(this::check);
return true;
}
private void check(Element e) {
Trees trees = Trees.instance(processingEnv);
Tree tree = trees.getTree(e);
if(tree.getKind() != Kind.METHOD) { // should not happen as compiler handles @Target
processingEnv.getMessager()
.printMessage(Diagnostic.Kind.ERROR, ANNOTATION_NAME + " only allowed at methods", e);
return;
}
MethodTree m = (MethodTree) tree;
List<? extends StatementTree> statements = m.getBody().getStatements();
if(statements.isEmpty() || !isRequiredFirst(statements.get(0))) {
processingEnv.getMessager()
.printMessage(Diagnostic.Kind.MANDATORY_WARNING,
"Mutating method does not start with " + REQUIRED_FIRST + "();", e);
}
// open challenges:
// - accept a return statement after postMutate();
// - allow a try { body } finally { postMutate(); }
if(statements.isEmpty() || !isRequiredLast(statements.get(statements.size() - 1))) {
processingEnv.getMessager()
.printMessage(Diagnostic.Kind.MANDATORY_WARNING,
"Mutating method does not end with " + REQUIRED_LAST + "();", e);
}
}
private boolean isRequiredFirst(StatementTree st) {
return invokes(st, REQUIRED_FIRST);
}
private boolean isRequiredLast(StatementTree st) {
return invokes(st, REQUIRED_LAST);
}
// check whether tree is an invocation of a no-arg method of the given name
private boolean invokes(Tree tree, String method) {
if(tree.getKind() != Kind.EXPRESSION_STATEMENT) return false;
tree = ((ExpressionStatementTree)tree).getExpression();
if(tree.getKind() != Kind.METHOD_INVOCATION) return false;
MethodInvocationTree i = (MethodInvocationTree)tree;
if(!i.getArguments().isEmpty()) return false; // not a no-arg method
ExpressionTree ms = i.getMethodSelect();
// TODO add support for explicit this.method()
return ms.getKind() == Kind.IDENTIFIER
&& method.contentEquals(((IdentifierTree)ms).getName());
}
}
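If you are not writing a modular processor, you can register it the classic way instead: put a file named META-INF/services/javax.annotation.processing.Processor on the processor's classpath containing the single line
anno.proc.example.MyProcessor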
Annotation processors only process declarations, not bytecode. They cannot be used to make assertions about the content of a method.
If it is important to you that these methods are always called, you may want to enforce this with a proxy rather than having boilerplate code in each method. For instance, you could use a bytecode engineering library such as Javassist to add the calls at runtime:
var f = new ProxyFactory();
f.setSuperclass(Foo.class);
f.setFilter(m -> m.isAnnotationPresent(MutatingMethod.class));
Class<?> c = f.createClass();
Foo foo = (Foo) c.newInstance();
((Proxy) foo).setHandler((self, m, proceed, args) -> {
    ((Mutable) self).preMutate();
    Object result = proceed.invoke(self, args);
    ((Mutable) self).postMutate();
    return result;
});
foo.setName("Peter"); // automatically calls preMutate() and postMutate()
(code untested, since I don't have an IDE at hand)
Then, the methods are always called in the proper order, as long as you control the creation of the objects in question (which you can sort of enforce by making the super class abstract).
How does one go about finding all subclasses of a given class (or all implementors of a given interface) in Java?
As of now, I have a method to do this, but I find it quite inefficient (to say the least).
The method is:
Get a list of all class names that exist on the class path
Load each class and test to see if it is a subclass or implementor of the desired class or interface
In Eclipse, there is a nice feature called the Type Hierarchy that manages to show this quite efficiently.
How does one go about and do it programmatically?
Scanning for classes is not easy with pure Java.
The Spring Framework offers a class called ClassPathScanningCandidateComponentProvider that can do what you need. The following example would find all subclasses of MyClass in the package org.example.package:
ClassPathScanningCandidateComponentProvider provider =
        new ClassPathScanningCandidateComponentProvider(false);
provider.addIncludeFilter(new AssignableTypeFilter(MyClass.class));

// scan in org.example.package
Set<BeanDefinition> components = provider.findCandidateComponents("org/example/package");
for (BeanDefinition component : components) {
    Class<?> cls = Class.forName(component.getBeanClassName());
    // use class cls found
}
This method has the additional benefit of using a bytecode analyzer to find the candidates which means it will not load all classes it scans.
There is no way to do it other than what you described. Think about it - how can anyone know what classes extend ClassX without scanning each class on the classpath?
Eclipse can only tell you about the super- and subclasses in what seems to be an "efficient" amount of time because it already has all of the type data loaded at the point where you press the "Display in Type Hierarchy" button (since it is constantly compiling your classes, knows about everything on the classpath, etc).
This is not possible to do using only the built-in Java Reflection API.
A project exists that does the necessary scanning and indexing of your classpath so you can access this information...
Reflections
A Java runtime metadata analysis, in the spirit of Scannotations
Reflections scans your classpath, indexes the metadata, allows you to query it on runtime and may save and collect that information for many modules within your project.
Using Reflections you can query your metadata for:
get all subtypes of some type
get all types annotated with some annotation
get all types annotated with some annotation, including annotation parameters matching
get all methods annotated with some annotation
(disclaimer: I have not used it, but the project's description seems to be an exact fit for your needs.)
Try ClassGraph. (Disclaimer, I am the author). ClassGraph supports scanning for subclasses of a given class, either at runtime or at build time, but also much more. ClassGraph can build an abstract representation of the entire class graph (all classes, annotations, methods, method parameters, and fields) in memory, for all classes on the classpath, or for classes in selected packages, and you can query this class graph however you want. ClassGraph supports more classpath specification mechanisms and classloaders than any other scanner, and also works seamlessly with the new JPMS module system, so if you base your code on ClassGraph, your code will be maximally portable. See the API here.
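A minimal sketch of such a query with ClassGraph might look like the following; the package and class names are placeholders, and method names may differ slightly between ClassGraph versions:
import io.github.classgraph.ClassGraph;
import io.github.classgraph.ClassInfo;
import io.github.classgraph.ScanResult;

public class FindSubclasses {
    public static void main(String[] args) {
        try (ScanResult scanResult = new ClassGraph()
                .enableClassInfo()               // record class hierarchy information
                .acceptPackages("com.example")   // limit the scan to this package tree
                .scan()) {
            for (ClassInfo subclass : scanResult.getSubclasses("com.example.MyBaseClass")) {
                System.out.println(subclass.getName());
            }
            // for interfaces, use scanResult.getClassesImplementing("com.example.MyInterface")
        }
    }
}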
Don't forget that the generated Javadoc for a class will include a list of known subclasses (and for interfaces, known implementing classes).
I know I'm a few years late to this party, but I came across this question trying to solve the same problem. You can use Eclipse's internal searching programmatically, if you're writing an Eclipse plugin (and can thus take advantage of its caching, etc.), to find classes which implement an interface. Here's my (very rough) first cut:
protected void listImplementingClasses( String iface ) throws CoreException
{
final IJavaProject project = <get your project here>;
try
{
final IType ifaceType = project.findType( iface );
final SearchPattern ifacePattern = SearchPattern.createPattern( ifaceType, IJavaSearchConstants.IMPLEMENTORS );
final IJavaSearchScope scope = SearchEngine.createWorkspaceScope();
final SearchEngine searchEngine = new SearchEngine();
final LinkedList<SearchMatch> results = new LinkedList<SearchMatch>();
searchEngine.search( ifacePattern,
new SearchParticipant[]{ SearchEngine.getDefaultSearchParticipant() }, scope, new SearchRequestor() {
@Override
public void acceptSearchMatch( SearchMatch match ) throws CoreException
{
results.add( match );
}
}, new IProgressMonitor() {
@Override
public void beginTask( String name, int totalWork )
{
}
@Override
public void done()
{
System.out.println( results );
}
@Override
public void internalWorked( double work )
{
}
@Override
public boolean isCanceled()
{
return false;
}
@Override
public void setCanceled( boolean value )
{
}
@Override
public void setTaskName( String name )
{
}
@Override
public void subTask( String name )
{
}
@Override
public void worked( int work )
{
}
});
} catch( JavaModelException e )
{
e.printStackTrace();
}
}
The first problem I see so far is that I'm only catching classes which directly implement the interface, not all their subclasses - but a little recursion never hurt anyone.
I did this several years ago. The most reliable way to do this (i.e. with official Java APIs and no external dependencies) is to write a custom doclet to produce a list that can be read at runtime.
You can run it from the command line like this:
javadoc -d build -doclet com.example.ObjectListDoclet -sourcepath java/src -subpackages com.example
or run it from ant like this:
<javadoc sourcepath="${src}" packagenames="*" >
<doclet name="com.example.ObjectListDoclet" path="${build}"/>
</javadoc>
Here's the basic code:
public final class ObjectListDoclet {
    public static final String TOP_CLASS_NAME = "com.example.MyClass";

    /** Doclet entry point. */
    public static boolean start(RootDoc root) throws Exception {
        try {
            ClassDoc topClassDoc = root.classNamed(TOP_CLASS_NAME);
            for (ClassDoc classDoc : root.classes()) {
                if (classDoc.subclassOf(topClassDoc)) {
                    System.out.println(classDoc);
                }
            }
            return true;
        } catch (Exception ex) {
            ex.printStackTrace();
            return false;
        }
    }
}
For simplicity, I've removed command line argument parsing and I'm writing to System.out rather than a file.
Keeping in mind the limitations mentioned in the other answers, you can also use openpojo's PojoClassFactory (available on Maven) in the following manner:
for(PojoClass pojoClass : PojoClassFactory.enumerateClassesByExtendingType(packageRoot, Superclass.class, null)) {
System.out.println(pojoClass.getClazz());
}
Where packageRoot is the root String of the packages you wish to search in (e.g. "com.mycompany" or even just "com"), and Superclass is your supertype (this works on interfaces as well).
Depending on your particular requirements, in some cases Java's service loader mechanism might achieve what you're after.
In short, it allows developers to explicitly declare that a class subclasses some other class (or implements some interface) by listing it in a file in the JAR/WAR file's META-INF/services directory. It can then be discovered using the java.util.ServiceLoader class which, when given a Class object, will generate instances of all the declared subclasses of that class (or, if the Class represents an interface, all the classes implementing that interface).
The main advantage of this approach is that there is no need to manually scan the entire classpath for subclasses - all the discovery logic is contained within the ServiceLoader class, and it only loads the classes explicitly declared in the META-INF/services directory (not every class on the classpath).
There are, however, some disadvantages:
It won't find all subclasses, only those that are explicitly declared. As such, if you need to truly find all subclasses, this approach may be insufficient.
It requires the developer to explicitly declare the class under the META-INF/services directory. This is an additional burden on the developer, and can be error-prone.
The ServiceLoader.iterator() generates subclass instances, not their Class objects. This causes two issues:
You don't get any say on how the subclasses are constructed - the no-arg constructor is used to create the instances.
As such, the subclasses must have a default constructor, or must explicitly declare a no-arg constructor.
Apparently Java 9 will be addressing some of these shortcomings (in particular, the ones regarding instantiation of subclasses).
An Example
Suppose you're interested in finding classes that implement an interface com.example.Example:
package com.example;

public interface Example {
    public String getStr();
}
The class com.example.ExampleImpl implements that interface:
package com.example;

public class ExampleImpl implements Example {
    public String getStr() {
        return "ExampleImpl's string.";
    }
}
You would declare that the class ExampleImpl is an implementation of Example by creating a file META-INF/services/com.example.Example containing the text com.example.ExampleImpl.
Then, you could obtain an instance of each implementation of Example (including an instance of ExampleImpl) as follows:
ServiceLoader<Example> loader = ServiceLoader.load(Example.class);
for (Example example : loader) {
    System.out.println(example.getStr());
}
// Prints "ExampleImpl's string.", plus whatever is returned
// by other declared implementations of com.example.Example.
It should be noted as well that this will of course only find all those subclasses that exist on your current classpath. Presumably this is OK for what you are currently looking at, and chances are you did consider this, but if you have at any point released a non-final class into the wild (for varying levels of "wild") then it is entirely feasible that someone else has written their own subclass that you will not know about.
Thus if you happened to be wanting to see all subclasses because you want to make a change and are going to see how it affects subclasses' behaviour - then bear in mind the subclasses that you can't see. Ideally all of your non-private methods, and the class itself should be well-documented; make changes according to this documentation without changing the semantics of methods/non-private fields and your changes should be backwards-compatible, for any subclass that followed your definition of the superclass at least.
The reason you see a difference between your implementation and Eclipse is that you scan each time, while Eclipse (and other tools) scan only once (usually during project load) and create an index. The next time you ask for the data, it doesn't scan again but looks at the index.
I'm using a reflection lib, which scans your classpath for all subclasses: https://github.com/ronmamo/reflections
This is how it would be done:
Reflections reflections = new Reflections("my.project");
Set<Class<? extends SomeType>> subTypes = reflections.getSubTypesOf(SomeType.class);
You can use the org.reflections library and create an object of the Reflections class. Using this object, you can get a list of all subclasses of a given class.
https://www.javadoc.io/doc/org.reflections/reflections/0.9.10/org/reflections/Reflections.html
Reflections reflections = new Reflections("my.project.prefix");
System.out.println(reflections.getSubTypesOf(A.class));
Add them to a static map inside the parent class's constructor (keyed by this.getClass().getName()), or create a default constructor that does it, but the map will only get filled at runtime as instances are created. If lazy initialization is an option, you can try this approach; a rough sketch follows below.
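A rough sketch of that self-registration idea (class names are illustrative); note that only subclasses that have actually been instantiated will show up in the registry:
import java.util.Collections;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public abstract class Parent {

    private static final Map<String, Class<?>> SUBCLASSES = new ConcurrentHashMap<>();

    protected Parent() {
        // every constructed instance registers its concrete runtime class
        SUBCLASSES.put(this.getClass().getName(), this.getClass());
    }

    public static Map<String, Class<?>> knownSubclasses() {
        return Collections.unmodifiableMap(SUBCLASSES);
    }
}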
I just wrote a simple demo using org.reflections.Reflections to get the subclasses of an abstract class:
https://github.com/xmeng1/ReflectionsDemo
I needed to do this as a test case, to see if new classes had been added to the code. This is what I did
private static final File rootFolder =
        new File(SuperClass.class.getProtectionDomain().getCodeSource().getLocation().getPath());
private static ArrayList<String> files = new ArrayList<String>();

static {
    // collect all class file names once, before the tests run
    listFilesForFolder(rootFolder);
}

@Test(timeout = 1000)
public void testNumberOfSubclasses() {
    ArrayList<String> listSubclasses = new ArrayList<>(files);
    listSubclasses.removeIf(s -> !s.contains("Superclass.class"));
    for (String subclass : listSubclasses) {
        System.out.println(subclass);
    }
    assertTrue("You did not create a new subclass!", listSubclasses.size() > 1);
}

public static void listFilesForFolder(final File folder) {
    for (final File fileEntry : folder.listFiles()) {
        if (fileEntry.isDirectory()) {
            listFilesForFolder(fileEntry);
        } else {
            files.add(fileEntry.getName());
        }
    }
}
If you intend to load all subclasses of a given class which are in the same package, you can do it like this:
public static List<Class> loadAllSubClasses(Class pClazz) throws IOException, ClassNotFoundException {
    ClassLoader classLoader = pClazz.getClassLoader();
    assert classLoader != null;

    String packageName = pClazz.getPackage().getName();
    String dirPath = packageName.replace(".", "/");
    Enumeration<URL> srcList = classLoader.getResources(dirPath);

    List<Class> subClassList = new ArrayList<>();
    while (srcList.hasMoreElements()) {
        File dirFile = new File(srcList.nextElement().getFile());
        File[] files = dirFile.listFiles();
        if (files != null) {
            for (File file : files) {
                String subClassName = packageName + '.' + file.getName().substring(0, file.getName().length() - 6);
                if (!subClassName.equals(pClazz.getName())) {
                    subClassList.add(Class.forName(subClassName));
                }
            }
        }
    }
    return subClassList;
}
Find all classes in the classpath:
public static List<String> getClasses() {
URLClassLoader urlClassLoader = (URLClassLoader) Thread.currentThread().getContextClassLoader();
List<String> classes = new ArrayList<>();
for (URL url : urlClassLoader.getURLs()) {
try {
if (url.toURI().getScheme().equals("file")) {
File file = new File(url.toURI());
if (file.exists()) {
try {
if (file.isDirectory()) {
for (File listFile : FileUtils.listFiles(file, new String[]{"class"}, true)) {
String classFile = listFile.getAbsolutePath().replace(file.getAbsolutePath(), "").replace(".class", "");
if (classFile.startsWith(File.separator)) {
classFile = classFile.substring(1);
}
classes.add(classFile.replace(File.separator, "."));
}
} else {
JarFile jarFile = new JarFile(file);
if (url.getFile().endsWith(".jar")) {
Enumeration<JarEntry> entries = jarFile.entries();
while (entries.hasMoreElements()) {
JarEntry jarEntry = entries.nextElement();
if (jarEntry.getName().endsWith(".class")) {
classes.add(jarEntry.getName().replace(".class", "").replace("/", "."));
}
}
}
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
} catch (URISyntaxException e) {
e.printStackTrace();
}
}
return classes;
}
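You would still have to filter that list down to actual subclasses yourself, roughly like this (Base stands for whichever supertype you are interested in; loading every candidate class can be slow and may trigger static initializers):
List<Class<?>> subclasses = new ArrayList<>();
for (String name : getClasses()) {
    try {
        Class<?> cls = Class.forName(name, false, Thread.currentThread().getContextClassLoader());
        if (Base.class.isAssignableFrom(cls) && !cls.equals(Base.class)) {
            subclasses.add(cls);
        }
    } catch (ClassNotFoundException | LinkageError e) {
        // skip classes that cannot be loaded
    }
}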
Java's ServiceLoader will also give you all declared implementing classes for an interface.
UPDATE MAY 20: I should have mentioned that the objects in question do have serialVersionUID set (same value in both old & new), but deserialization fails before readObject() is called, with the following exception:
Exception in thread "main" java.io.InvalidClassException: TestData; incompatible types for field number
I'm also now including an example down below.
I'm working with a large application that sends serialized objects (implements Serializable, not Externalizable) from client to server. Unfortunately, we now have a situation where an existing field has changed type altogether, which breaks the serialization.
The plan is to upgrade the server side first. From there, I have control over the new versions of the serialized objects, as well as the ObjectInputStream to read the (at first old) objects coming from the clients.
I had at first thought of implementing readObject() in the new version; however, after trying it out, I discovered that validation fails (due to incompatible types) prior to the method ever being called.
If I subclass ObjectInputStream, can I accomplish what I want?
Even better, are there any 3rd-party libraries that do any sort of serialization 'magic'? It would be really interesting if there were any tools/libraries that could convert a serialized object stream into something like an array of HashMaps...without needing to load the objects themselves. I'm not sure if it is possible to do that (convert a serialized object to a HashMap without loading the object definition itself), but if it is possible, I could imagine a tool that could convert a serialized object (or stream of objects) to a new version, using, say, a set of Properties for conversion hints/rules, etc...
Thanks for any suggestions.
Update May 20 - example source below - The field 'number' in TestData changes from an 'int' in the old version to a 'Long' in the new version. Note readObject() in new version of TestData is not called, because an exception is thrown before it ever gets there:
Exception in thread "main" java.io.InvalidClassException: TestData; incompatible types for field number
Below is the source. Save it to a folder, then create sub-folders 'old' and 'new'. Put the 'old' version of the TestData class in the "old" folder, and the new version in "new". Put the "WriteIt" and "ReadIt" classes in the main (parent of old/new) folder. 'cd' to the 'old' folder, and compile with: javac -g -classpath . TestData.java
...then do the same thing in the 'new' folder. 'cd' back to the parent folder, and compile WriteIt/ReadIt:
javac -g -classpath .;old WriteIt.java
javac -g -classpath .;new ReadIt.java
Then run with:
java -classpath .;old WriteIt //Serialize old version of TestData to TestData_old.ser
java -classpath .;new ReadIt //Attempt to read back the old object using reference to new version
[OLD version]TestData.java
import java.io.*;
public class TestData
implements java.io.Serializable
{
private static final long serialVersionUID = 2L;
public int number;
}
[NEW version] TestData.java
import java.io.*;
public class TestData
implements java.io.Serializable
{
private static final long serialVersionUID = 2L;
public Long number; //Changed from int to Long
private void readObject(final ObjectInputStream ois)
throws IOException, ClassNotFoundException
{
System.out.println("[TestData NEW] readObject() called...");
/* This is where I would, in theory, figure out how to handle
* both the old and new type. But a serialization exception is
* thrown before this method is ever called.
*/
}
}
WriteIt.java - Serialize old version of TestData to 'TestData_old.ser'
import java.io.*;
public class WriteIt
{
public static void main(String args[])
throws Exception
{
TestData d = new TestData();
d.number = 2013;
ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream("TestData_old.ser"));
oos.writeObject(d);
oos.close();
System.out.println("Done!");
}
}
ReadIt.java - Attempt to de-serialized old object into new version of TestData. readObject() in new version is not called, due to exception beforehand.
import java.io.*;
public class ReadIt
{
public static void main(String args[])
throws Exception
{
ObjectInputStream ois = new ObjectInputStream(new FileInputStream("TestData_old.ser"));
TestData d = (TestData)ois.readObject();
ois.close();
System.out.println("Number = " + d.number);
}
}
Your immediate problem seems to be that you have not fixed the serialVersionUID values for your classes. If you don't do that, the object serialization and deserialization code generates the UIDs based on the representational structure of the types being sent and received. If you change the type at one end and not the other, the UIDs don't match. If you fix the UIDs, then the reader code will get past the UID check, and your readObject method will have a chance to "work around" the differences in serialized data.
Apart from that, my suggestion would be:
Try to change all of the clients and servers at the same time so that you don't need to use readObject / writeObject hacks
If that's not practical, try to move away from using Java serialized objects in anything where there is a possibility of version mismatches as your system evolves. Use XML, JSON, or something else that is less sensitive to "protocol" or "schema" changes.
If that's not practical, version your client/server protocol.
I doubt that you will get any traction by subclassing the ObjectStream classes. (Take a look at the code ... in the OpenJDK codebase.)
You can continue to use serialization even when changes to classes will make the old serialized data incompatible with the new class, if you implement Externalizable and write an extra field indicating the version before writing the class data. This allows readExternal to handle old versions in the way that you specify. I know of no way to automatically do this, but using this manual method may work for you.
Here are the modified classes that will compile and will run without throwing an exception.
old/TestData.java
import java.io.*;
public class TestData
implements Externalizable
{
private static final long serialVersionUID = 1L;
private static final long version = 1L;
public int number;
public void readExternal(ObjectInput in)
throws IOException, ClassNotFoundException {
long version = (long) in.readLong();
if (version == 1) {
number = in.readInt();
}
}
public void writeExternal(ObjectOutput out)
throws IOException {
out.writeLong(version); // always write current version as first field
out.writeInt(number);
}
}
new/TestData.java
import java.io.*;
public class TestData
implements Externalizable
{
private static final long serialVersionUID = 1L;
private static final long version = 2L; // Changed
public Long number; //Changed from int to Long
public void readExternal(ObjectInput in)
throws IOException, ClassNotFoundException {
long version = (long) in.readLong();
if (version == 1) {
number = new Long(in.readInt());
}
if (version == 2) {
number = (Long) in.readObject();
}
}
public void writeExternal(ObjectOutput out)
throws IOException {
out.writeLong(version); // always write current version as first field
out.writeObject(number);
}
}
These classes can be run with the following statements
$ javac -g -classpath .:old ReadIt.java
$ javac -g -classpath .:old WriteIt.java
$ java -classpath .:old WriteIt
Done!
$ java -classpath .:old ReadIt
Number = 2013
$ javac -g -classpath .:new ReadIt.java
$ java -classpath .:new ReadIt
Number = 2013
Changing a class to implement Externalizable instead of Serializable will result in the following exception.
Exception in thread "main" java.io.InvalidClassException: TestData; Serializable incompatible with Externalizable
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:634)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1623)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at ReadIt.main(ReadIt.java:10)
In your case, you'd have to change the old version to implement Externalizable first, restart the server and clients at the same time, then you could upgrade the server before the clients with incompatible changes.
after trying it out, I discovered that validation fails (due to incompatible types) prior to the method ever being called
That's because you haven't defined a serialVersionUID. That's easy to fix. Just run the serialver tool on the old version of the classes, then paste the output into the new source code. You may then be able to write your readObject() method in a compatible way.
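For example, something along these lines (the exact output format may vary between JDK versions):
serialver -classpath old TestData
TestData:    private static final long serialVersionUID = 2L;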
In addition, you could look into writing readResolve() and writeReplace() methods, or defining serialPersistentFields in a compatible way, if that's possible. See the Object Serialization Specification.
See also the valuable suggestions by #StephenC.
Post your edit, I suggest you change the name of the variable along with its type. That counts as a deletion plus an insertion, which is serialization-compatible, and will work fine as long as you don't want the old int number read into the new Long number (why Long instead of long?). If you do need that, I suggest leaving int number there and adding a new Long/long field with a different name, and modifying readObject() to set the new field from 'number' if it isn't already set by defaultReadObject().
If you change the type of a field, you should change its name. Otherwise you will get the error you see. If you want to set the new field name from the old data (or compute it some other way during object read), you can do something like this:
public class TestData
implements java.io.Serializable
{
private static final long serialVersionUID = 2L;
public Long numberL; //Changed from "int number;"
private void readObject(final ObjectInputStream ois)
throws IOException, ClassNotFoundException
{
System.out.println("[TestData NEW] readObject() called...");
ois.defaultReadObject();
if (numberL == null) {
// deal with old structure
GetField fields = ois.readFields();
numberL = fields.get("number", 0L); // autoboxes old value
}
}
}
Do not change the value of serialVersionUID and the above should be able to read both new and old versions of TestData.
Does someone maybe have sample code showing how to replace a Java List (LinkedList or ArrayList) with something similar in Berkeley DB? My problem is that I have to replace Lists to scale beyond main-memory limits. Some simple sample code would be really nice.
I've now used a simple TupleBinding for Integers (keys) and a SerialBinding for the Diff-class (data values).
Now I'm receiving the Error:
14:03:29.287 [pool-5-thread-1] ERROR o.t.g.view.model.TraverseCompareTree - org.treetank.diff.Diff; local class incompatible: stream classdesc serialVersionUID = 8484615870884317488, local class serialVersionUID = -8805161170968505227
java.io.InvalidClassException: org.treetank.diff.Diff; local class incompatible: stream classdesc serialVersionUID = 8484615870884317488, local class serialVersionUID = -8805161170968505227
The listener and TransactionRunner classes which I'm using are:
/** {@inheritDoc} */
@Override
public void diffListener(final EDiff paramDiff, final IStructuralItem paramNewNode,
final IStructuralItem paramOldNode, final DiffDepth paramDepth) {
try {
mRunner.run(new PopulateDatabase(mDiffDatabase, mKey++, new Diff(paramDiff, paramNewNode.getNodeKey(), paramOldNode.getNodeKey(), paramDepth)));
} catch (final Exception e) {
LOGWRAPPER.error(e.getMessage(), e);
}
}
private static class PopulateDatabase implements TransactionWorker {
private StoredMap<Integer, Diff> mMap;
private int mKey;
private Diff mValue;
public PopulateDatabase(final DiffDatabase paramDatabase, final int paramKey, final Diff paramValue) {
Objects.requireNonNull(paramDatabase);
Objects.requireNonNull(paramValue);
mMap = paramDatabase.getMap();
mKey = paramKey;
mValue = paramValue;
}
@Override
public void doWork() throws DatabaseException {
mMap.put(mKey, mValue);
}
}
I don't know why it doesn't work :-/
Edit: Sorry, I just had to delete the generated environment/database and create a new one.
I'm afraid it won't be that simple. A first step you might want to take is to refactor your code to move all accesses to the list into a separate class (call it a DAO, if you like). Then it will be a lot easier to move to a database instead of the list.
Berkeley DB is severe overkill for this type of task. It's a fair beast to configure and set up, plus I believe the license is now commercial. You'll be much better off using a disk-backed list or map. As an example of the latter, take a look at Kyoto Cabinet. It's extremely fast, implements the standard Java Collections interfaces and is as easy to use as a List or Map. See my other answer for example code.
I have an array that I have created from a database ResultSet. I am trying to Serialize it so that I can send it over a socket stream. At the moment I am getting an error telling me that the array is not Serializable. The code I have is down below, the first part is the class to create an object for the array:
class ProteinData
{
private int ProteinKey;
public ProteinData(Integer ProteinKey)
{
this.ProteinKey = ProteinKey;
}
public Integer getProteinKey() {
return this.ProteinKey;
}
public void setProteinKey(Integer ProteinKey) {
this.ProteinKey = ProteinKey;
}
}
The code to populate the array:
public List<ProteinData> readJavaObject(String query, Connection con) throws Exception
{
PreparedStatement stmt = con.prepareStatement(query);
query_results = stmt.executeQuery();
while (query_results.next())
{
ProteinData pro = new ProteinData();
pro.setProteinKey(query_results.getInt("ProteinKey"));
tableData.add(pro);
}
query_results.close();
stmt.close();
return tableData;
}
And the code to call this is:
List dataList = (List) this.readJavaObject(query, con);
ObjectOutputStream output_stream = new ObjectOutputStream(socket.getOutputStream());
output_stream.writeObject(dataList);
And the code recieving this is:
List dataList = (List) input_stream.readObject();
Can someone help me serialize this array? All I can find in forums is simple arrays (e.g. int[]).
I tried adding Serializable to the class and the UID number, but got a java.lang.ClassNotFoundException: socketserver.ProteinData error message. Does anyone know why?
Thanks for any help.
Basically, the classes you want to serialize need to implement Serializable. And if you want to avoid the warning related to the serial version, you should also have a long serialVersionUID for each one, which is a code used to distinguish your specific version of the class. Read a tutorial like this one to get additional info; serialization is not so hard to handle.
However, remember that serialization is faulty when used between two different versions of the JVM (and it has some flaws in general).
Just a side note: the interface Serializable doesn't actually give any required feature to the class itself (it's not a typical interface, just a marker) and it is used just to distinguish between classes that are supposed to be sent over streams and all the others. Of course, if a class is Serializable, all the components it uses (instance variables) must be serializable too to be able to send the whole object.
Change your class declaration to:
class ProteinData implements Serializable {
...
}
I would have thought as a minimum that you would need
class ProteinData implements Serializable
and a
private static final long serialVersionUID = 1234556L;
(Eclipse will generate the magic number for you).
in the class.
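Putting the two suggestions together, a minimal sketch of the class might look like this (the UID value is arbitrary; a no-arg constructor is also added, since the code that populates the list calls new ProteinData()):
import java.io.Serializable;

class ProteinData implements Serializable {

    private static final long serialVersionUID = 1234556L;

    private int ProteinKey;

    public ProteinData() {
    }

    public ProteinData(Integer ProteinKey) {
        this.ProteinKey = ProteinKey;
    }

    public Integer getProteinKey() {
        return this.ProteinKey;
    }

    public void setProteinKey(Integer ProteinKey) {
        this.ProteinKey = ProteinKey;
    }
}
Remember that the receiving side needs the same ProteinData class (same package and a matching serialVersionUID) on its classpath for readObject() to succeed.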