I am running an embedded Hazelcast deployment and storing a ConcurrentMap<String, MyType>, where the value type in the map is my custom class.
MyType
public class MyType implements Serializable {
private MyTag tag;
...
}
One of its fields is an interface MyTag.
MyTag
public interface MyTag<T> {
}
And I have a class containing several enum implementations of MyTag interface:
MyTags
public class MyTags {
public static enum Integers implements MyTag<Integer> {
INT_TAG1,
INT_TAG2,
...
}
public static enum Strings implements MyTag<String> {
STRING_TAG1,
...
}
...
}
After moving the MyTags class to a different package and redeploying one of my services (with the moved class now in its new package), an exception is thrown upon attempting a get on the map:
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: old.package.MyTags$Strings
How could I protect myself from this situation when deploying on a production environment?
Java's Serializable depends on the class remaining exactly the same: the serialized form records the fully qualified class name, so a moved class is no longer the same class. Select one of the other options that Hazelcast offers for serializing objects; see Comparing Serialization Options for more guidance on the different options.
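For example, here is a minimal sketch of one such option, DataSerializable, that persists the tag as a stable string identifier instead of relying on the class name. It assumes a Hazelcast 3.x-style API and hypothetical id()/MyTags.fromId(...) helpers that are not part of the original code:
import java.io.IOException;
import com.hazelcast.nio.ObjectDataInput;
import com.hazelcast.nio.ObjectDataOutput;
import com.hazelcast.nio.serialization.DataSerializable;

public class MyType implements DataSerializable {
    private MyTag<?> tag;

    @Override
    public void writeData(ObjectDataOutput out) throws IOException {
        out.writeUTF(tag.id()); // persist a stable identifier, never the enum's class name
    }

    @Override
    public void readData(ObjectDataInput in) throws IOException {
        tag = MyTags.fromId(in.readUTF()); // resolve by identifier, immune to package moves
    }
}
Because the wire format now contains only the identifier you chose, moving MyTags to another package no longer breaks deserialization.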
Related
I'm currently working at a company that has a diverse set of modules. In that company, if you want to expose module internals, you provide them via a Java interface that hides the actual implementing type from the requesting module. Now I want one provider to be able to supply data for multiple modules that expose different fields or methods of the actual internal data.
Therefore I have an internal object holding the data, an interface for each module that needs access to some (but not necessarily all) fields, and an external object that implements all those interfaces and holds an instance of the internal object to delegate the method calls:
public class InternalObject {
public int getA() { return 0; }
public int getB() { return 0; }
}
public interface ModuleXObject {
int getA();
}
public interface ModuleYObject {
int getA();
int getB();
}
public class ExternalObject implements ModuleXObject, ModuleYObject {
private InternalObject _internal;
public int getA() { return _internal.getA(); }
public int getB() { return _internal.getB(); }
}
Now that is all fine and dandy, but if I want to provide, let's say, repository methods for finding a list of said objects typed for the correct module, I run into problems. I would wish for something like the following:
public interface ModuleXObjectRepository {
List<ModuleXObject> loadAllObjects();
}
public interface ModuleYObjectRepository {
List<ModuleYObject> loadAllObjects();
}
public class ExternalObjectRepository implements ModuleXObjectRepository, ModuleYObjectRepository {
public List<ExternalObject> loadAllObjects() {
// ...
}
}
This doesn't compile; the error says the return type is incompatible.
So my question is whether it is possible to achieve something like that, and if so, how?
I should note that I tried some different approaches which I want to include for completeness and to portray their downsides (in my eyes).
Approach 1:
public interface ModuleXObjectRepository {
List<? extends ModuleXObject> loadAllObjects();
}
public interface ModuleYObjectRepository {
List<? extends ModuleYObject> loadAllObjects();
}
public class ExternalObjectRepository implements ModuleXObjectRepository, ModuleYObjectRepository {
public List<ExternalObject> loadAllObjects() {
// ...
}
}
This approach is quite close to the solution I would prefer, but results in code like this:
List<? extends ModuleXObject> objects = repository.loadAllObjects();
This requires the user to include the "? extends" in each List declaration tied to an invocation of loadAllObjects().
Approach 2:
public interface ModuleXObjectRepository {
List<ModuleXObject> loadAllObjects();
}
public interface ModuleYObjectRepository {
List<ModuleYObject> loadAllObjects();
}
public class ExternalObjectRepository implements ModuleXObjectRepository, ModuleYObjectRepository {
public List loadAllObjects() {
// ...
}
}
This approach just omits the type parameter in the ExternalObjectRepository, falling back to a raw type, which in my opinion gives up too much type safety. Also, I haven't tested whether this actually works.
Just to recap: is there any way to define the loadAllObjects method so that users get lists typed with the objects for their respective module, without
requiring "? extends" in the user's code
degrading type safety in the repository implementation
using class/interface level generics
The challenge with allowing it to be typed as List<ModuleXObject> is that other code may hold it as a List<ExternalObject>.
All ExternalObject instances are ModuleXObject instances but the inverse is not true.
Consider the following additional class:
public class MonkeyWrench implements ModuleXObject{
//STUFF
}
MonkeyWrench instances are NOT ExternalObject instances, but if one could cast a List<ExternalObject> to a List<ModuleXObject>, one could add MonkeyWrench instances to that collection, risking runtime ClassCastExceptions and ruining type safety.
Other code could very easily have:
for(ExternalObject externalObject:externalObjectRepository.loadAllObjects())
If one of those instances were a MonkeyWrench, a ClassCastException would be thrown at runtime, which is exactly what generics are meant to prevent.
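To make the danger concrete, here is a short sketch of exactly what the compiler is preventing (the double cast is the only way to force it through):
List<ExternalObject> externals = new ArrayList<>();
// List<ModuleXObject> xs = externals;               // does not compile: incompatible types
@SuppressWarnings("unchecked")
List<ModuleXObject> xs = (List<ModuleXObject>) (List<?>) externals; // unchecked, defeats the compiler
xs.add(new MonkeyWrench());                          // legal through xs...
ExternalObject eo = externals.get(0);                // ...ClassCastException here at runtime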
The implication of ? extends ModuleXObject is that you can read any element of the collection as a ModuleXObject, but you can't add anything to it, because other code may rely on constraints on the collection that are not obvious/available at compile time.
I'd suggest using ? extends ModuleXObject in your case, as its semantics align with what you want, namely pulling out ModuleXObject instances, e.g.:
ModuleXObjectRepository repo = //get repo however
for(ModuleXObject obj : repo.loadAllObjects()){
//do stuff with obj
}
I have a class as follows in a .jar file (library file):
class A{
//someimplementation
}
I would like to make it implement the Serializable interface, as follows:
class A implements Serializable {
// the same implementation as in class A
}
I do not want to decompile the jar file, change the class signature, and then archive it again after compilation.
Is there any way like writing hooks to achieve this? Kindly provide any pointers/suggestions.
My ultimate aim is to achieve implementing Serializable interface without modifying the jar file.
You can probably achieve this using the Serialization Proxy pattern (Effective Java, 2nd edition, Item 78).
A few links about the Pattern :
http://jtechies.blogspot.com/2012/07/item-78-consider-serialization-proxies.html
http://java.dzone.com/articles/serialization-proxy-pattern
Follow up: instance control in Java without enum
Make a new class that extends A and is Serializable. To avoid serialization errors (A itself isn't serializable), you need a SerializationProxy that creates a new instance via a constructor or factory method, instead of the normal Java serialization mechanism of setting fields directly outside of any constructor.
public class MySerializableA extends A implements Serializable{
private final Foo foo;
private final Bar bar;
...
private Object writeReplace() {
return new SerializationProxy(this);
}
//this forces us to use the SerializationProxy
private void readObject(ObjectInputStream stream) throws InvalidObjectException {
throw new InvalidObjectException("Use Serialization Proxy instead.");
}
//this private inner class is what actually does our Serialization
private static class SerializationProxy implements Serializable {
private final Foo foo;
private final Bar bar;
...
public SerializationProxy(MySerializableA myA) {
this.foo = myA.getFoo();
this.bar = myA.getBar();
...//etc
}
private Object readResolve() {
return new MySerializableA(foo, bar,...);
}
}
}
The only downside is that when you want to serialize an A, you will have to wrap it in a MySerializableA. But when deserializing, the cast to A will work fine.
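A usage sketch under those assumptions (foo and bar stand for whatever state A exposes; the file name is arbitrary):
try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("a.bin"))) {
    out.writeObject(new MySerializableA(foo, bar)); // wrap the A before writing
}
try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("a.bin"))) {
    A restored = (A) in.readObject(); // readResolve produced a MySerializableA, which is-an A
}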
I have multiple services (in Spring MVC) that are children of a global service, so I need to know the best practice (or your opinions) about adding these more specific methods, given this example:
//Domain classes
public class MyParentObject{}
public class MyObj extends MyParentObject{}
//Services
public class MyParentObjectServiceImpl implements MyParentObjectService{
@Override
public MyParentObject findObjectByProp(String prop, String objectType){
//myCode (not abstract class)
}
}
public class MyObjServiceImpl extends MyParentObjectServiceImpl implements MyObjectService{
private String myObjType = "MyObj";
@Override
public MyObj findMyObjByProp(String prop){
return (MyObj) super.findObjectByProp(prop, this.myObjType);
}
}
And in this approach, I use calls like this:
MyObj foo = myObjService.findMyObjByProp(prop);
So I need to know whether this approach is "better" or more appropriate than calling the parent method directly with the second parameter, e.g.:
MyObj foo = (MyObj)myParentObjectService.findObjectByProp(prop, "MyObj");
...and avoiding the creation of the second, more specific methods. It is important to know that the child services will be created anyway, because we have a lot of code that is specific to the domain objects.
I have the idea that the first approach is better because it is more readable, but I need to support that decision with documents, blog posts, or opinions so I can discuss the design with my colleagues.
This looks like a tagged class hierarchy. It's difficult to comment on the value of this design in general without knowing the details. However, a slightly different approach that I would recommend is to generify your base class to gain a little bit of type safety.
In particular:
public /* abstract */ class MyParentObjectServiceImpl<T extends MyParentObject>
implements MyParentObjectService{
MyParentObjectServiceImpl(Class<T> type) { this.type = type; }
private final Class<T> type; // subclasses provide this
@Override
public T findObjectByProp(String prop){
//you can use type for object specific stuff
}
}
public class MyObjServiceImpl extends MyParentObjectServiceImpl<MyObj>
// You might not need this interface anymore
// if the only method defined is findMyObjByProp
/* implements MyObjectService */ {
MyObjServiceImpl() {
super(MyObj.class);
}
@Override
public /* final */ MyObj findMyObjByProp(String prop) {
return super.findObjectByProp(prop); // T is MyObj here, so no cast and no tag argument are needed
}
}
}
You definitely gain type safety (casting will appear only in the base class, if at all), you get rid of the "tags" (the strings that identify the different objects), and you possibly reduce the number of classes/interfaces required to implement the whole hierarchy. I have successfully used this approach several times. Note that it works best if the base class is abstract. Food for thought.
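A call-site sketch of what the generified hierarchy buys you (using the names defined above):
MyObjServiceImpl service = new MyObjServiceImpl();
MyObj found = service.findMyObjByProp("someProp"); // typed result, no cast at the call site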
I have a Mongo collection that may contain three types of entities, which I map to Java types:
Node
LeafType1
LeafType2
The collection is meant to store a tree-like structure, using DBRefs to child nodes in the parent entry.
I didn't find any information about this in the Spring reference documentation, so I'm asking here: is there a way to use the repository mechanism with a collection that may contain different types of objects?
Declaring several repositories for different types in one collection does not seem like a good idea, because I always struggle with situations where the queried object is not of the expected type; and creating one repository for an abstract class that all possible types inherit doesn't seem to work.
To illustrate what I mean:
/**
* This seems not safe
*/
public interface NodeRepository extends MongoRepository<Node, String> { }
public interface LeafType1Repository extends MongoRepository<LeafType1, String> { }
public interface LeafType2Repository extends MongoRepository<LeafType2, String> { }
/**
* This doesn't work at all
*/
public interface MyCollectionRepository extends MongoRepository<AbstractMyCollectionNode, String> { }
If Node/LeafType1/LeafType2 are subclasses of AbstractMyCollectionNode, then things will be easy. Just declare the repository as you wrote:
public interface MyCollectionRepository extends MongoRepository<AbstractMyCollectionNode, String> { }
We have done this in a project, and it works well. Spring Data adds a property named '_class' to the documents in the MongoDB collection so that it can figure out which class to instantiate.
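For instance, a stored document might look like this (all field values are hypothetical):
{
  "_id": "...",
  "digest": "...",
  "_class": "com.example.WebClipDocument"
}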
Documents stored in one collection may have some similarity; maybe you can extract a common base class for them.
Here are some code copied from one of our projects:
Entity:
public abstract class Document {
private String id;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
....
public class WebClipDocument extends Document {
private String digest;
...
Repository:
public interface DocumentDao extends MongoRepository<Document, String>{
...
And if the documents in your MongoDB collection do not have the "_class" property, you can use a Converter:
When storing and querying your objects, it is convenient to have a MongoConverter instance handle the mapping of all Java types to DBObjects. However, sometimes you may want the MongoConverter to do most of the work but allow you to selectively handle the conversion for a particular type, or to optimize performance.
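A minimal read-converter sketch, assuming the legacy DBObject API, a hypothetical "kind" discriminator field, and String-id constructors on the entity classes (none of which appear in the original post):
import com.mongodb.DBObject;
import org.springframework.core.convert.converter.Converter;

public class NodeReadConverter implements Converter<DBObject, Node> {
    @Override
    public Node convert(DBObject source) {
        String kind = (String) source.get("kind"); // hypothetical discriminator field
        String id = (String) source.get("_id");
        if ("leafType1".equals(kind)) {
            return new LeafType1(id);
        }
        if ("leafType2".equals(kind)) {
            return new LeafType2(id);
        }
        return new Node(id);
    }
}
Such a converter would then be registered through Spring Data's CustomConversions so it is used instead of the default mapping.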
Spring Data uses the repository declarations as entry points when looking for entity classes (it does not scan packages for entities directly).
So all you need to do is declare an "unused" repository interface for your subclasses, just like the ones you proposed as "unsafe" in your question:
public interface NodeRepository extends MongoRepository<Node, String> {
// all of your repo methods go here
Node findById(String id);
Node findFirst100ByNodeType(String nodeType);
... etc.
}
public interface LeafType1Repository extends MongoRepository<LeafType1, String> {
// leave empty
}
public interface LeafType2Repository extends MongoRepository<LeafType2, String> {
// leave empty
}
You do not have to use the additional LeafTypeX repositories; you can stick with the NodeRepository for storing and looking up objects of type LeafType1 and LeafType2. But the declaration of the other two repositories is needed so that LeafType1 and LeafType2 are found as entities when the initial scan takes place.
PS: This all assumes, of course, that you have @Document(collection = "nodes") annotations on your LeafType1 and LeafType2 classes.
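For completeness, a sketch of those annotations (the example fields are hypothetical, and each class would live in its own file):
import org.springframework.data.mongodb.core.mapping.Document;

@Document(collection = "nodes")
public class LeafType1 extends Node {
    private String leafValue; // hypothetical example field
}

@Document(collection = "nodes")
public class LeafType2 extends Node {
    private int weight; // hypothetical example field
}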
I am stuck at a point where I am not sure how to move ahead. I have a class with the following code:
public class MyClass{
private Class<? extends ValidationProvider> providerClass;
// getter and setter
}
The provider class is passed at runtime and there can be different implementations of it; all it needs to do is satisfy the above contract, ? extends ValidationProvider.
In my implementation I have provided a property through which the user can pass the provider class name, and based on that I need to move ahead. To do so, I need to get an instance of the provider class using the built-in dependency injection mechanism, like:
public class MyClass{
private Class<? extends ValidationProvider> providerClass;
// getter and setter
public void setProvider(){
providerClass=container.getInstance("type","name");
}
}
The signature of container.getInstance("type", "name") is:
<T> T getInstance(Class<T> type, String name);
I am not sure how to pass the type to the container to get an instance of the provider class (since the provider class has the signature <? extends ValidationProvider>).
For a simple use case I can do:
container.getInstance(String.class,"my constant");
but I am not sure how to do it for my use case.
Any help in this regard will really be appreciated.
Let's take a step back... Your requirement is that MyClass should be configurable with different implementations of ValidationProvider, which I am assuming from your syntax is a superclass. I don't think you need generics in MyClass to achieve this:
public class MyClass{
private ValidationProvider provider;
public void setProvider(ValidationProvider _provider){
this.provider = _provider;
}
}
Then if (for example) the configuration is achieved by having the fully-qualified name of the exact provider implementation in a config file, you can set your instance of MyClass as follows:
...
String validatorClassName;
...
// Assign validatorClassName a value from a config file
...
// Class.forName returns Class<?>; asSubclass narrows it to Class<? extends ValidationProvider>
ValidationProvider v = container.getInstance(Class.forName(validatorClassName).asSubclass(ValidationProvider.class), "blah");
myClass.setProvider(v);