I need some help with generics and reflection in Java.
What I am trying to achieve is a factory class that, given a class MyClass, returns an instance of another class, MyClassStore, that implements the interface Store<MyClass>.
How can I make sure that the right type of store is returned, using reflection?
See sample code:
public class Application {
public static void main(String[] args) {
Store<MyClass> store = Factory.getFactory(MyClass.class);
MyClass myClass = store.get();
System.out.println(myClass.getId());
}
}
public class Factory {
public static <T> Store<T> getFactory(Class<T> type) {
Store<T> store = null;
// TODO: How to implement this
return store;
}
}
public class MyClassStore implements Store<MyClass>{
public MyClass get() {
return new MyClass("1000");
}
}
public interface Store<T> {
public T get();
}
public class MyClass {
private String id;
public MyClass(String id) {
this.id = id;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
}
I have tried different approaches but can't get it to work the way I want. This is the implementation I have tried, but is there a better way?
public static <T> Store<T> getFactory(Class<T> type) {
Store<T> store = null;
if (type == MyClass.class) {
store = (Store<T>) new MyClassStore();
}
return store;
}
Just taking the problem as stated, the best solution I can come up with is to have a static final Map<Class<?>, Class<?>> in the Factory that maps (in this example) from MyClass.class to MyClassStore.class, use it to look up which class to instantiate, and call newInstance() on the Class retrieved from the map.
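A minimal sketch of that lookup, filling in the question's Factory (the exception handling and the eager registration block are my own choices; each registered store needs an accessible no-arg constructor, which MyClassStore has):

import java.util.HashMap;
import java.util.Map;

public class Factory {
    // Maps a model class to the Store implementation that serves it.
    private static final Map<Class<?>, Class<? extends Store<?>>> STORES = new HashMap<>();

    static {
        STORES.put(MyClass.class, MyClassStore.class);
    }

    @SuppressWarnings("unchecked")
    public static <T> Store<T> getFactory(Class<T> type) {
        Class<? extends Store<?>> storeClass = STORES.get(type);
        if (storeClass == null) {
            throw new IllegalArgumentException("No store registered for " + type);
        }
        try {
            // Modern equivalent of calling newInstance() directly on the Class.
            return (Store<T>) storeClass.getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Could not instantiate " + storeClass, e);
        }
    }
}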
Taking a bit more liberty with your design, you could replace MyClassStore with a more general generic class that takes a Class<T> parameter, stores it locally, and implements get() with a call to newInstance() on the stored Class. getFactory() could then simply create such a generic store in every case and pass the Class object on.
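A sketch of that more general store; note it assumes T has an accessible no-arg constructor, which the MyClass in the question does not (it only has MyClass(String)), so it would need one added or a different creation strategy:

public class ReflectiveStore<T> implements Store<T> {
    private final Class<T> type;

    public ReflectiveStore(Class<T> type) {
        this.type = type;
    }

    public T get() {
        try {
            // Assumes an accessible no-arg constructor on T.
            return type.getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Could not create " + type, e);
        }
    }
}

getFactory(type) can then just return new ReflectiveStore<>(type) for every type.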
However, this type of code strikes me as unlikely to be a good solution to whatever your real problem is. Why does MyClassStore need to create new MyClass objects? Why does the internal logic of any implementation of Store need to care about what type T actually is? That latter question in particular is usually a strong indicator of poor design in my opinion.
I have two classes that extend an abstract class in a very similar manner, and I'd like to abstract out the common parts, since I will likely have to use this pattern again. They each return a ThingLink containing data linking them to a Parent object. They also return a Widget that varies with the class, but only in its name. Here is the pattern:
public abstract class SomeClass extends TopClass {
protected abstract Widget createWidget();
public void someMethod() { /* Does something */ }
}
public class ThingA extends SomeClass {
private static final String INFO_TYPE = "int";
public ThingLink newLink(Parent master, int info) {
ThingLink link = new ThingLink(master, ThingA.class);
link.addData(INFO_TYPE, info);
return link;
}
public Widget createWidget() {
// Stuff to get someData
return ThingAWidget.createMe(someData);
}
}
public class ThingB extends SomeClass {
private static final String INFO_TYPE = "String";
public ThingLink newLink(Parent master, String info) {
ThingLink link = new ThingLink(master, ThingB.class);
link.addData(INFO_TYPE, info);
return link;
}
public Widget createWidget() {
// Stuff to get someData
return ThingBWidget.createMe(someData);
}
}
I have no access to TopClass, the ThingLink class, or the Widget class. I was trying to abstract out the common parts using generics, but I can't seem to figure out if that will provide a complete solution. My big problem is figuring out how to get the pieces that are self-referential. I would like a class something like:
public abstract class Thing<T> extends SomeClass {
private String infoType;
public void setInfoType(String type) { infoType = type; }
public ThingLink newLink(Parent master, T info) {
ThingLink link = new ThingLink(master, ???????????);
link.addData(infoType, info);
return link;
}
public Widget createWidget() {
// Stuff to get someData
return ??????????????.createMe(someData);
}
}
Keep in mind that I am fairly new to Java, and self-taught, but I am trying very hard to make some bigger leaps and really understand how to write good code. I appreciate your help.
I would like to use the builder pattern in some upcoming work, which involves several classes in a hierarchy. The base class will have at least 9 fields to start, and the various sub-classes may add between 2 and 4 more fields each. This would get out of hand very quickly, and the builder pattern is appealing to me for this exact reason. I got some initial exposure to the builder pattern in books and articles; they were helpful, but had nothing on how to extend the pattern. I tried to implement this by myself, but I ran into trouble with the constructors of the sub-classes, because I didn't understand how to pass the data collected in the builder to the superclass. I looked on SO for some answers, and here's what I found.
This one is from SO 24243240 where an example of how to extend an abstract class with an abstract builder is given. It is also based on this blog post.
public abstract class AbstractA {
protected String s;
protected int i;
protected AbstractA() {
}
protected abstract static class ABuilder<T extends AbstractA, B extends ABuilder<T,B>> {
protected T object;
protected B thisObject;
protected abstract T getObject(); //Each concrete implementing subclass overrides this so that T becomes an object of the concrete subclass
protected abstract B thisObject(); //Each concrete implementing subclass builder overrides this for the same reason, but for B for the builder
protected ABuilder() {
object = getObject();
thisObject = thisObject();
}
public B withS(String s) {
object.s = s;
return thisObject;
}
public B withI(int i) {
object.i = i;
return thisObject;
}
public T build() {
return object;
}
}
}
public final class ConcreteA extends AbstractA {
private String foo;
protected ConcreteA() {
}
public static final class Builder extends AbstractA.ABuilder<ConcreteA,Builder> {
@Override protected ConcreteA getObject() {
return new ConcreteA();
}
@Override protected Builder thisObject() {
return this;
}
public Builder() {
}
public Builder withFoo(String foo) {
object.foo = foo;
return this;
}
}
}
And then in client code, it would look like...
ConcreteA baz = new ConcreteA.Builder().withFoo("foo").withS("bar").withI(0).build();
I like this example because it allows you to easily extend these classes, but it also seems to me that this defeats the purpose of using the builder pattern, because the methods withS(String s) and withI(int i) act a lot like setter methods. Also, this approach leaves the fields of the base class and the builder class as protected rather than private.
Here's one from SO 17164375
public class NutritionFacts {
private final int calories;
public static class Builder<T extends Builder> {
private int calories = 0;
public Builder() {}
public T calories(int val) {
calories = val;
return (T) this;
}
public NutritionFacts build() { return new NutritionFacts(this); }
}
protected NutritionFacts(Builder builder) {
calories = builder.calories;
}
}
public class GMOFacts extends NutritionFacts {
private final boolean hasGMO;
public static class Builder extends NutritionFacts.Builder<Builder> {
private boolean hasGMO = false;
public Builder() {}
public Builder GMO(boolean val) {
hasGMO = val;
return this;
}
public GMOFacts build() { return new GMOFacts(this); }
}
protected GMOFacts(Builder builder) {
super(builder);
hasGMO = builder.hasGMO;
}
}
I like that this one seemingly adheres more closely to the builder pattern described by Josh Bloch, and it also allows you to simply pass the builder into the constructor of the class you want to instantiate. This would be a nice way to do some validation inside the builder before instantiating the object in the call to build(). At the same time, though, this example extends the builder pattern with concrete classes, and when you do that you invite all the nastiness that comes with extending concrete classes (e.g. inconsistent interfaces, inherited methods that can corrupt the state of your object, etc.).
So my question is: is there a way to implement an abstract class with an abstract builder that also allows you to pass a reference to the builder into the constructor of the base class? Something like:
public abstract class BaseClass {
// various fields go here
...
public abstract static class Builder<T extends BaseClass, B extends Builder<T, B>> {
// add chaining methods here
...
public T build() {
if (isValid()) return new T(this);
else throw new IllegalArgumentException("Invalid data passed to builder.");
}
}
public BaseClass(Builder builder) {
// set fields of baseclass here
}
}
I realize that you can't instantiate an object the way that I've shown here, but is there some other way to do this? Is this perhaps where a factory would come in? Maybe I just have the wrong assumptions about the builder pattern in general. :) If that's the case, is there a better direction to take?
Your first example is not bad, but I don't think it is what you are looking for.
I am still a little unsure of exactly what you want, but since your examples do not work for you, I thought I'd give you one or two of my own. :)
class ParentBuilder{
public ConcreteParent build(){
ConcreteParent parent = new ConcreteParent();
parent.setFirst(1);
parent.setSecond(2);
parent.setThird(3);
return parent;
}
}
class ChildBuilder{
public ConcreteChild build(ParentBuilder parentBuilder){
ConcreteParent parent = parentBuilder.build();
ConcreteChild child = new ConcreteChild();
child.setFirst(parent.getFirst());
child.setSecond(parent.getSecond());
child.setThird(parent.getThird());
child.setFourth(4); //Child specific value
child.setFifth(5); //Child specific value
return child;
}
}
Any new type would have its own builder, taking in its parent's builder.
As you can see this is similar to:
public NutritionFacts build() { return new NutritionFacts(this); }
}
protected NutritionFacts(Builder builder) {
calories = builder.calories;
}
In your example.
This, however, quickly gets out of hand as well as the number of variables and subclasses grows.
An alternative would be to use dynamic properties; have a look at this: http://martinfowler.com/apsupp/properties.pdf
Martin Fowler's article lays out all the pros and cons.
Anyways, here's my second example:
import java.util.HashMap;

public class Demo {
public static void main(String[] args) {
ConcreteBuilder builder = new ConcreteBuilder();
Concrete concrete = builder.with("fourth", "valueOfFourth").build();
for(String value : concrete.getAttributes().values())
System.out.println(value);
}
}
class ConcreteBuilder{
private Concrete concrete;
public ConcreteBuilder(){
concrete = new Concrete();
}
public ConcreteBuilder with(String key, String value){
concrete.getAttributes().put(key, value);
return this;
}
public Concrete build(){
return concrete;
}
}
class Concrete{
private HashMap<String, String> attributes;
public Concrete(){
attributes = new HashMap<>();
// Default values; the builder adds or overrides entries on top of these.
attributes.put("first", "valueOfFirst");
attributes.put("second", "valueOfSecond");
attributes.put("third", "valueOfThird");
}
public HashMap<String, String> getAttributes(){
return attributes;
}
}
The magic here is that you (might) no longer need all these subclasses.
If these subclasses' behavior does not change, but only their variables, you should be fine using a system like this.
I strongly advise that you read Martin Fowler's article on the subject, though; there are good and bad places to use this, but I think this is a good one.
I hope this brings you closer to an answer, good luck. :)
I have an ObjectFactory and a specialised implementation of that factory. I can't change the interface, whose factory method takes no arguments.
In one of the implementations I have to read a file and load some data. To pass the filename I can use system properties, because all I need to share is a string.
But in the other implementation I must start not from a file but from an in-memory structure. How can I pass the object (or rather the object reference) to the factory? Are there other approaches? Serializing the object to a file and reading it back is not an option, because the I/O footprint is exactly what I want to avoid.
Thanks
OK, more information:
This is the interface and the abstract factory I have to implement:
public abstract interface A
{
public abstract Set<Foo> getFoo();
public abstract Set<Bar> getBar();
}
//this is otherpackage.AFactory
public abstract class AFactory
{
public static AFactory newInstance()
{
// ... returns a newly built instance of the factory
}
public abstract A newA();
}
This is my implementation with my problem:
public class AFactory extends otherpackage.AFactory
{
@Override
public A newA()
{
return new AA();
}
}
public class AA implements A
{
protected AA()
{
// this.objectReferenceIWantToSaveHere = <retrieved from the shared memory zone>;
// ... use the object
}
}
Now I'd like to do something like this:
B b = ...; // something I built before
// save b in a shared memory zone or something like that
otherpackage.AFactory f = mypackage.AFactory.newInstance();
A a = f.newA();
And inside the f.newA() call I'd like to access the b object.
Can't you simply use a constructor?
interface ObjectFactory { Object create(); }
class SpecialFactory implements ObjectFactory {
private final Object data;
public SpecialFactory(Object data) { this.data = data; }
@Override public Object create() { return somethingThatUsesData; }
}
As assylias proposes, you can pass the reference to the constructor. Or, if you know where to find the reference, you could just ask for it before you use it, e.g. data = dataBank.giveMeTheData()
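That lookup approach might look something like this (DataBank is a made-up holder, not an existing class):

final class DataBank {
    private static Object data;

    static void setData(Object d) { data = d; }
    static Object getData() { return data; }
}

class LookupFactory implements ObjectFactory {
    @Override
    public Object create() {
        // Ask the well-known holder for the reference instead of receiving it in a constructor.
        Object data = DataBank.getData();
        return data; // or build and return something that uses it
    }
}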
I agree it would help to get some more context around what you are doing... but could you use a shared static class, where your calling code places the info into the static class and your interface implementation references that same static class to obtain the object and/or instructions?
So here's a client class. It has the entry point and wants to pass an object to the interface implementer, but it can't pass it directly, so it sets the object it wants to pass via the MyStaticHelper.SetSharedObject method.
public class Client {
/**
* @param args
*/
public static void main(String[] args) {
String mySharedObject = "Couldbeanyobject, not just string";
// Set your shared object in static class
MyStaticHelper.SetSharedObject(mySharedObject);
InterfaceImplementer myInterfaceImplementer = new InterfaceImplementer();
myInterfaceImplementer.RunMyMethod();
}
}
Here is the code for the static helper...
public class MyStaticHelper {
private static Object _instructionsObject;
public static void SetSharedObject(Object anObject)
{
_instructionsObject = anObject;
}
public static Object GetSharedObject()
{
return _instructionsObject;
}
}
}
And finally, the class that you call, which uses the static helper to get the same object.
public class InterfaceImplementer {
// no objects
public void RunMyMethod()
{
System.out.println(MyStaticHelper.GetSharedObject());
}
}
Again, this works in a very simple scenario and wouldn't stand up if more than one implementer needed to be called simultaneously, as this solution only allows one object to sit in the static helper class at a time.
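If several implementers did need their own objects at the same time, the same idea could be extended with a keyed helper, something along these lines:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class MyKeyedStaticHelper {
    private static final Map<String, Object> SHARED = new ConcurrentHashMap<>();

    public static void setSharedObject(String key, Object anObject) {
        SHARED.put(key, anObject);
    }

    public static Object getSharedObject(String key) {
        return SHARED.get(key);
    }
}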
In previous C++ code I've used friend classes when creating a factory that can output "read only" objects, which means that as the objects are consumed throughout the code there is no risk of them being inadvertently changed or corrupted.
Is there a similar way to implement this in Java, or am I being overly defensive?
Make use of the final keyword. It can mark a class or method as non-extendable/non-overridable, and it can mark fields and variables as non-reassignable.
Hide the default constructor (declaring any parameterised constructor removes the implicit no-arg one) and force parameterised constructors that initialise all the necessary final fields.
Your only problem is that the factory is kind of redundant. Since all fields of the object are final, you will have to use all factory methods at object build-time.
Example:
public final class DataObject
{
private final String name;
private final String payload;
// No no-arg constructor: the final fields must be initialised here,
// and declaring this constructor also removes the implicit default one.
public DataObject(final String name, final String payload)
{
this.name = name;
this.payload = payload;
}
}
// Using the factory
DataObject fromFactory = new Factory().setName("Name").setPayload("Payload").build();
// As opposed to
DataObject dao = new DataObject("Name", "Payload");
// ==> Factory becomes redundant, only adding extra code
Solution without final:
I'm afraid you will have to forget about the immutability mechanism of C++. The factory pattern is never a bad choice if you have huge data objects (i.e. with a lot of setters), but you can't really avoid mutability of the constructed object. What you could do is make the data object an inner class of the factory and make the setters private. That way, ONLY the factory can access the setters. This would be the best approach for you (i.e. it simulates immutability).
Example:
public class Factory
{
private String name;
private String payload;
public Factory setName(final String name)
{
this.name = name;
return this;
}
public Factory setPayload(final String payload)
{
this.payload = payload;
return this;
}
public DataObject build()
{
DataObject newObj = new DataObject();
newObj.setName( this.name );
newObj.setPayload( this.payload );
return newObj;
}
public class DataObject
{
// fields and setters, ALL PRIVATE
}
}
You can either put the object class and factory in the same package, and make the mutable methods package-scoped (this is the default visibility in Java, simply don't declare the methods to be public, private or protected), or make the class truly immutable and do all the work in the constructor. If you find that there are too many arguments in the constructor and it is difficult to understand, consider the Builder Pattern.
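A minimal sketch of the package-scoped variant (the package, class, and field names here are made up; both classes are separate files in the same package):

// com.example.store.ReadOnlyData
public class ReadOnlyData {
    private String name;

    public String getName() { return name; }

    // Package-private: invisible outside the package, so consumers cannot mutate the object.
    void setName(String name) { this.name = name; }
}

// com.example.store.ReadOnlyDataFactory
public class ReadOnlyDataFactory {
    public ReadOnlyData create(String name) {
        ReadOnlyData data = new ReadOnlyData();
        data.setName(name); // allowed here, same package
        return data;
    }
}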
There is no direct equal to friend classes in Java. However have a look at http://docs.oracle.com/javase/tutorial/java/javaOO/accesscontrol.html.
If your object implements an interface and the factory returns the interface type rather than the concrete type (which is better anyway), then you can use java.lang.reflect.Proxy to create a dynamic proxy at runtime that intercepts all method calls to the target object. In the following example the FooFactory class creates a Foo instance every time its createFoo method is called, but instead of returning that instance directly it returns a dynamic proxy that implements the same interface as Foo and intercepts and delegates all method calls to the Foo instance. This mechanism can be helpful for controlling access to a class when you don't have its source code.
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class FooFactory {
public static IF createFoo() {
//Create Foo instance
Foo target = new Foo(); // Implements interface IF
//Create a dynamic proxy that intercepts method calls to the Foo instance
IF fooProxy = (IF) Proxy.newProxyInstance(IF.class.getClassLoader(),
new Class[] { IF.class }, new IFInvocationHandler(target));
return fooProxy;
}
}
class IFInvocationHandler implements InvocationHandler {
private Foo foo;
IFInvocationHandler(Foo foo) {
this.foo = foo;
}
@Override
public Object invoke(Object proxy, Method method, Object[] args)
throws Throwable {
if (method.getName().equals("setMethod")) {
// Block call
throw new IllegalAccessException();
} else {
// Allow the call and delegate it to the real Foo instance
return method.invoke(foo, args);
}
}
}
class Foo implements IF {
public void setMethod() {
} // method that is not allowed to be called
public void getMethod() {
}
}
interface IF {
void setMethod(); // method that is not allowed to be called
void getMethod(); // method that is allowed to be called
}
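A caller only ever sees the proxy; blocked calls surface at runtime, for example:

public class ProxyDemo {
    public static void main(String[] args) {
        IF foo = FooFactory.createFoo();
        foo.getMethod(); // allowed, delegated to the real Foo instance
        foo.setMethod(); // throws UndeclaredThrowableException wrapping the handler's IllegalAccessException
    }
}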
The closest thing to a C++ friend class in Java is package-private access.
SomeObject.java:
package somewhere.someobjandfriends;
public class SomeObject {
Object aField; // field and constructor
SomeObject() {} // are package-only access
public void aMethod() {
System.out.println(this);
}
}
SomeObjFactory.java:
package somewhere.someobjandfriends;
public class SomeObjFactory {
public SomeObject newHelloWorld() {
return new SomeObject() {
{
aField = "hello world!";
}
@Override
public String toString() {
return aField.toString();
}
};
}
}
Code anywhere outside the package can see SomeObject and aMethod, but it can only create new instances through the factory.
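For example, a client in another package (the client package and class names here are made up) can only go through the factory:

package somewhere.client;

import somewhere.someobjandfriends.SomeObjFactory;
import somewhere.someobjandfriends.SomeObject;

public class Client {
    public static void main(String[] args) {
        // new SomeObject() would not compile here: its constructor is package-private.
        SomeObject obj = new SomeObjFactory().newHelloWorld();
        obj.aMethod(); // prints "hello world!"
    }
}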
I am trying to use polymorphism to enable different processing of an object based on its class, as follows:
public class GeneralStuff {
private int ID;
}
public class IntStuff extends GeneralStuff {
private int value;
public void setValue(int v)
{
value = v;
}
public int getValue()
{
return value;
}
}
public class DoubleStuff extends GeneralStuff {
private double value;
public void setValue(double v)
{
value = v;
}
public double getValue()
{
return value;
}
}
public class ProcessStuff {
public String process(GeneralStuff gS)
{
return doProcess(gS);
}
private String doProcess(IntStuff i)
{
return String.format("%d", i.getValue());
}
private String doProcess(DoubleStuff d)
{
return String.format("%f", d.getValue());
}
}
public class Test {
public static void main(String[] args)
{
IntStuff iS = new IntStuff();
DoubleStuff dS = new DoubleStuff();
ProcessStuff pS = new ProcessStuff();
iS.setValue(5);
dS.setValue(23.2);
System.out.println(pS.process(iS));
System.out.println(pS.process(dS));
}
}
This, however, doesn't work, because calling doProcess(gS) expects a method with a signature doProcess(GeneralStuff gS).
I know I could just have two exposed polymorphic process methods in the ProcessStuff class, but the actual situation won't allow it because I'm working within the constraints of an existing library mechanism; this is just a contrived example for testing.
I could, of course, define process(GeneralStuff gS) as
public String process(GeneralStuff gS)
{
if (gS instanceof IntStuff)
{
return doProcess((IntStuff) gS);
}
else if (gS instanceof DoubleStuff)
{
return doProcess((DoubleStuff) gS);
}
return "";
}
which works, but it seems that I shouldn't have to do that (plus, the Programming Police would skewer me for using instanceof in this way).
Is there a way that I can enforce the polymorphic calls in a better way?
Thanks in advance for any help.
The type of dynamic dispatch you are looking for is not possible in Java without using reflection. Java does its linking at compile time based on the declared type (so even though a method is overloaded, the actual method invoked is based on the declared type of the variable not the runtime type).
So you are left with either using instanceof as you propose, using reflection, or putting the process methods in the objects themselves (which is the "oop" way to do it, but is often not suitable or advisable).
One potential alternative is to create a map of processing objects by class, eg:
Map<Class<? extends GeneralStuff>,Processor> processors;
public String process(GeneralStuff stuff)
{
Processor processor = processors.get(stuff.getClass());
if (processor != null)
{
return processor.process(stuff);
}
return ""; // or throw, if an unregistered subtype is a programming error
}
public interface Processor
{
public String process(GeneralStuff stuff);
}
public class IntegerProcessor implements Processor
{
public String process(GeneralStuff stuff)
{
return String.format("%i",((IntegerStuff) stuff).getValue());
}
}
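For completeness, the map could be populated once, up front; DoubleProcessor here stands for the analogous (not shown) implementation for DoubleStuff:

Map<Class<? extends GeneralStuff>, Processor> processors = new HashMap<>();
processors.put(IntStuff.class, new IntegerProcessor());
processors.put(DoubleStuff.class, new DoubleProcessor());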
However, for your specific example, String.format takes objects as its parameters, so you could avoid this whole issue by having getValue and getFormatString methods in GeneralStuff which are overridden in the specific subclasses.
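A rough sketch of that simpler route (the method names are illustrative, and DoubleStuff would override the same two methods with "%f"):

public abstract class GeneralStuff {
    private int ID;

    public abstract Object getValue();
    public abstract String getFormatString();
}

public class IntStuff extends GeneralStuff {
    private int value;

    public void setValue(int v) { value = v; }

    @Override
    public Object getValue() { return value; } // autoboxed to Integer

    @Override
    public String getFormatString() { return "%d"; }
}

public class ProcessStuff {
    public String process(GeneralStuff gS) {
        // No instanceof and no overloads: each subclass supplies its own format and value.
        return String.format(gS.getFormatString(), gS.getValue());
    }
}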
You are actually on the right track; you would indeed need to use reflection in this case. What you are looking for is a sort of double dispatch, because you want the dispatch to be done on the dynamic type of the stuff parameter.
This type of switching on the dynamic type is not as rare as you think. See for example this javaworld tip, which builds on the visitor pattern.
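For reference, the visitor-style double dispatch that the article describes could be applied to the question's classes roughly like this, assuming an accept method can be added to the Stuff hierarchy:

public interface StuffVisitor {
    String visit(IntStuff i);
    String visit(DoubleStuff d);
}

public abstract class GeneralStuff {
    private int ID;

    // Each subclass calls back with its own static type, so the right overload is picked.
    public abstract String accept(StuffVisitor visitor);
}

public class IntStuff extends GeneralStuff {
    private int value;

    public void setValue(int v) { value = v; }
    public int getValue() { return value; }

    @Override
    public String accept(StuffVisitor visitor) { return visitor.visit(this); }
}

// DoubleStuff overrides accept(...) in exactly the same way.

public class ProcessStuff implements StuffVisitor {
    public String process(GeneralStuff gS) { return gS.accept(this); }

    @Override
    public String visit(IntStuff i) { return String.format("%d", i.getValue()); }

    @Override
    public String visit(DoubleStuff d) { return String.format("%f", d.getValue()); }
}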
The compiler complains for good reason. There is no guarantee that your GeneralStuff object is an IntStuff or a DoubleStuff. It can be a plain GeneralStuff or any other extension of GeneralStuff, which is a case you also did not cover in your process method with the instanceof (unless returning the empty String was the desired behavior).
Is it not possible to move that process method into the GeneralStuff class and override it in the extensions?
Another possible solution is to have a sort of composite ProcessStuff class into which you plug IntStuffProcess, DoubleStuffProcess, ... instances. Each of those instances will still have the instanceof check to decide whether it can handle the GeneralStuff object passed to it, but this is at least more scalable/maintainable than one big instanceof construct.
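Sketched out (the class names follow the wording above; the plumbing details are my own), that composite could look like this:

import java.util.ArrayList;
import java.util.List;

public interface StuffProcess {
    boolean canHandle(GeneralStuff gS);
    String process(GeneralStuff gS);
}

public class IntStuffProcess implements StuffProcess {
    @Override
    public boolean canHandle(GeneralStuff gS) { return gS instanceof IntStuff; }

    @Override
    public String process(GeneralStuff gS) { return String.format("%d", ((IntStuff) gS).getValue()); }
}

// DoubleStuffProcess is analogous, checking for DoubleStuff and formatting with "%f".

public class ProcessStuff {
    private final List<StuffProcess> processes = new ArrayList<>();

    public void plug(StuffProcess p) { processes.add(p); }

    public String process(GeneralStuff gS) {
        for (StuffProcess p : processes) {
            if (p.canHandle(gS)) {
                return p.process(gS);
            }
        }
        return ""; // no registered handler for this subtype
    }
}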
Perhaps it's better to have overloaded process methods in ProcessStuff:
public class ProcessStuff {
public String process(IntStuff i) {
return String.format("%d", i.getValue());
}
public String process(DoubleStuff d) {
return String.format("%f", d.getValue());
}
}
Define GeneralStuff as an abstract class with an abstract process method that is filled in by the inheriting classes. This way you avoid all the problems with instanceof checks and such. Or you can do what is suggested by βнɛƨн Ǥʋяʋиɢ, but then you would still have to define an overload for each specific class, whereas with mine you just call process directly.
So my suggestion would be:
public abstract class GeneralStuff {
private int ID;
public abstract String process();
}
public class IntStuff extends GeneralStuff {
private int value;
public void setValue(int v)
{
value = v;
}
public int getValue()
{
return value;
}
@Override
public String process(){
return String.format("%d", getValue());
}
}
public class DoubleStuff extends GeneralStuff {
private double value;
public void setValue(double v)
{
value = v;
}
public double getValue()
{
return value;
}
@Override
public String process(){
return String.format("%f", getValue());
}
}
public class Test {
public static void main(String[] args)
{
IntStuff iS = new IntStuff();
DoubleStuff dS = new DoubleStuff();
iS.setValue(5);
dS.setValue(23.2);
System.out.println(iS.process());
System.out.println(dS.process());
}
}