I would like to know whether there is a more efficient way of storing values (like fields) for an instance of an interface (when you cannot control how it is implemented) than a static HashMap in another non-visible class.
Example:
public interface myInterface {
    public default Object getMyVariable() {
        return Storage.data.get(this);
    }
}

final class Storage {
    static HashMap<myInterface, Object> data = new HashMap<myInterface, Object>();
}
First of all, this is bad practice: your abstraction knows about its implementations. The point of an interface is to introduce abstraction and get rid of rigid design.
You can define the interface like this:
public interface MyInterface {
    default Object getMyVariable() {
        return getDefaultObject();
    }

    Object getDefaultObject();
}
As you can see, I added a required method, getDefaultObject(), that all implementations have to implement. However, this will not work if you already have implementation classes and have no control over them.
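If you really must keep the external-storage approach from the question (because the implementing classes cannot be changed), one thing worth noting is that a static HashMap holds strong references to every instance and prevents them from ever being garbage collected. Below is a minimal sketch of a variant using a synchronized WeakHashMap instead; the interface follows the question's (renamed MyInterface), and the setter is an illustrative assumption, not part of the original code:

import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;

public interface MyInterface {
    default Object getMyVariable() {
        return Storage.DATA.get(this);
    }

    default void setMyVariable(Object value) { // illustrative setter, not in the question
        Storage.DATA.put(this, value);
    }
}

final class Storage {
    // Weak keys: an entry disappears once its instance is no longer referenced elsewhere,
    // so the map does not stop instances from being garbage collected.
    static final Map<MyInterface, Object> DATA =
            Collections.synchronizedMap(new WeakHashMap<>());

    private Storage() {
    }
}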
I'm currently working at a company that has a diverse set of modules. In that company, if you want to expose module internals, you provide them via a Java interface that hides the actual implementing type and gives the requesting module an interface to work with. Now I want one provider to be able to provide data for multiple modules that expose different fields or methods of the actual internal data.
Therefore I have an internal object which holds some data, and I have an interface for each module that needs access to some, but not necessarily all, of its fields. Finally, I have an external object that implements all those interfaces and holds an instance of the internal object to delegate the method calls to:
public class InternalObject {
    public int getA() { return 0; }
    public int getB() { return 0; }
}

public interface ModuleXObject {
    int getA();
}

public interface ModuleYObject {
    int getA();
    int getB();
}

public class ExternalObject implements ModuleXObject, ModuleYObject {
    private InternalObject _internal;

    public int getA() { return _internal.getA(); }
    public int getB() { return _internal.getB(); }
}
Now that is all fine and dandy, but if I want to provide, let's say, repository methods for finding a list of said objects, typed for the correct module, I run into problems with how to achieve that. I would wish for something like the following:
public interface ModuleXObjectRepository {
    List<ModuleXObject> loadAllObjects();
}

public interface ModuleYObjectRepository {
    List<ModuleYObject> loadAllObjects();
}

public class ExternalObjectRepository implements ModuleXObjectRepository, ModuleYObjectRepository {
    public List<ExternalObject> loadAllObjects() {
        // ...
    }
}
This doesn't compile; the error says the return type is incompatible.
So my question is whether it is possible to achieve something like this, and if so, how?
I should note that I tried some different approaches which I want to include for completeness and to portray their downsides (in my eyes).
Approach 1:
public interface ModuleXObjectRepository {
    List<? extends ModuleXObject> loadAllObjects();
}

public interface ModuleYObjectRepository {
    List<? extends ModuleYObject> loadAllObjects();
}

public class ExternalObjectRepository implements ModuleXObjectRepository, ModuleYObjectRepository {
    public List<ExternalObject> loadAllObjects() {
        // ...
    }
}
This approach is quite close to the solution I would prefer, but results in code like this:
List<? extends ModuleXObject> objects = repository.loadAllObjects();
It therefore requires the user to include "? extends" in every List declaration that receives the result of loadAllObjects().
Approach 2:
public interface ModuleXObjectRepository {
    List<ModuleXObject> loadAllObjects();
}

public interface ModuleYObjectRepository {
    List<ModuleYObject> loadAllObjects();
}

public class ExternalObjectRepository implements ModuleXObjectRepository, ModuleYObjectRepository {
    public List loadAllObjects() {
        // ...
    }
}
This approach just omits the generic type in the ExternalObjectRepository and therefore reduces type safety too much, in my opinion. Also, I haven't tested whether this actually works.
Just to recap: is there any possible way to define the loadAllObjects method so that users get lists typed with the objects for their respective module, without
requiring "? extends" in the user's code
degrading type safety in the repository implementation
using class/interface level generics
The challenge with allowing it to be typed as List<ModuleXObject> is that other code may hold it as a List<ExternalObject>.
All ExternalObject instances are ModuleXObject instances but the inverse is not true.
Consider the following additional class:
public class MonkeyWrench implements ModuleXObject {
    // STUFF
}
MonkeyWrench instances are NOT ExternalObject instances, but if one could cast a List<ExternalObject> to a List<ModuleXObject>, one could add MonkeyWrench instances to that collection. This creates a risk of runtime ClassCastExceptions and ruins type safety.
Other code could very easily have:
for (ExternalObject externalObject : externalObjectRepository.loadAllObjects())
If one of those instances is a MonkeyWrench instance, you get a runtime ClassCastException, which is exactly what generics are meant to prevent.
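To make the hazard concrete, here is a minimal sketch (the class name CastHazardDemo is made up; ExternalObject, ModuleXObject and MonkeyWrench are the types from the discussion above, and the unchecked double cast is exactly the loophole the compiler refuses to open implicitly):

import java.util.ArrayList;
import java.util.List;

public class CastHazardDemo {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        List<ExternalObject> internalList = new ArrayList<>();

        // Pretend the forbidden cast were allowed (it only compiles as an unchecked double cast).
        List<ModuleXObject> exposed = (List<ModuleXObject>) (List<?>) internalList;

        exposed.add(new MonkeyWrench());            // perfectly legal for a List<ModuleXObject>
        ExternalObject first = internalList.get(0); // ClassCastException at runtime
    }
}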
The implication of ? extends ModuleXObject is that you can read any object from the collection as a ModuleXObject but you can't add anything to the collection as other code may have additional constraints on the collection that are not obvious/available at compile time.
I'd suggest in your case to use ? extends ModuleXObject as its semantics seem to align with what you want, namely pulling out ModuleXObject instances, e.g.
ModuleXObjectRepository repo = // get repo however
for (ModuleXObject obj : repo.loadAllObjects()) {
    // do stuff with obj
}
I have a class that contains a collection. I want to be able to specify what implementation is used at runtime. What is the best object oriented way to accomplish this?
public class Klazz {
    private class Data {
        ...
    }

    private Collection<Data> collection;

    public Klazz(?) {
    }
}
How can I make it so that the constructor specifies what type of collection is used to implement Klazz?
In Java, the List, Set and Queue interfaces extend java.util.Collection. Hence any object of any implementation of List, Set or Queue can be referenced through a Collection reference.
For a detailed explanation please refer to this very good guide at geeksforgeeks
tl;dr you can do something like:
public class Klazz {
    private class Data {
        ...
    }

    private Collection<Data> collection;

    public Klazz(Collection<Data> collection) {
        this.collection = collection;
    }

    public static void main(String[] args) {
        Klazz k1 = new Klazz(new ArrayList<>());
        Klazz k2 = new Klazz(new HashSet<>());
    }
}
The classic way to allow a Java client to choose an implementation class is to use a Factory Object supplied by the client to instantiate the object.
Something like this:
public interface CollectionFactory<T> {
    Collection<T> create();
}

public class Klazz {
    private Collection<Data> collection;

    public Klazz(CollectionFactory<Data> factory) {
        collection = factory.create();
    }
}
(I haven't checked the above with the compiler, so there could be typos, etc)
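For completeness, here is a hedged sketch of what the call site could look like with the generic CollectionFactory<T> form of the interface above, written as a main method that could live in Klazz itself, like the one in the earlier answer; method references and lambdas both satisfy the single create() method:

public static void main(String[] args) {
    // The caller decides at runtime which Collection implementation backs Klazz.
    Klazz listBacked = new Klazz(ArrayList::new);            // backed by an ArrayList
    Klazz setBacked = new Klazz(HashSet::new);               // backed by a HashSet
    Klazz sizedList = new Klazz(() -> new ArrayList<>(64));  // explicit lambda also works
}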
Passing in a collection object is another solution. There is potentially an abstraction leakage, but the same is true with a factory object if the factory is "tricky".
It is also possible to pass a Class object and use reflection to instantiate the collection.
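A hedged sketch of that reflective variant, written as an extra constructor on Klazz (an assumption, not part of the original answer; it needs an unchecked cast and relies on the chosen class having a public no-arg constructor):

@SuppressWarnings("unchecked")
public Klazz(Class<? extends Collection> collectionClass) throws ReflectiveOperationException {
    // Instantiates e.g. ArrayList.class or HashSet.class via its no-arg constructor.
    this.collection = (Collection<Data>) collectionClass.getDeclaredConstructor().newInstance();
}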
@Tim Biegeleisen has a point though. The Collection API does not constrain the properties of the collection a great deal, and use-cases that work equally well for collections that are lists, sets or "bags" are ... unusual.
Essentially what I'm trying to do is create a generic method that can take many different kinds of enums. I'm looking for a way to do it the way I'm going to describe, or any other way a person might think of.
I've got a base class, and many other classes extend off that. In each of those classes, I want to have an enum called Includes like this:
public enum Includes {
    VENDOR("Vendor"),
    OFFERS_CODES("OffersCodes"),
    REMAINING_REDEMPTIONS("RemainingRedemptions");

    private String urlParam;

    Includes(String urlParam) {
        this.urlParam = urlParam;
    }

    public String getUrlParam() {
        return urlParam;
    }
}
I've got a method that takes in a generic class that extends from BaseClass, and I want to be able to also pass any of the includes on that class to the method, and be able to access the methods on the enum, like this:
ApiHelper.Response<Offer> offer = apiHelper.post(new Offer(), Offer.Includes.VENDOR);

public <T extends BaseClass> Response<T> post(T inputObject, Includes... includes) {
    ArrayList<String> urlParams = new ArrayList<String>();
    for (Includes include : includes) {
        urlParams.add(include.getUrlParam());
    }
    return null;
}
Is there a way to be able to pass in all the different kinds of enums, or is there a better way to do this?
---EDIT---
I've added an interface to my enum, but how can I generify my method? I've got this:
public <T extends BaseClass> Response<T> post(Offer inputObject, BaseClass.Includes includes) {
    for (Enum include : includes) {
        if (include instanceof Offer.Includes) {
            ((Offer.Includes) include).getUrlParam();
        }
    }
    return null;
}
But I get an error on apiHelper.post(new Offer(), Offer.Includes.VENDOR); saying the second param must be BaseClass.Includes.
Enums can implement interfaces, so you can create an interface with these methods that you'd like to be able to call:
interface SomeBaseClass {
    String getUrlParam();
    void setUrlParam(String urlParam);
}
and then your enum can implement this interface:
public enum Includes implements SomeBaseClass {
    VENDOR("Vendor"),
    OFFERS_CODES("OffersCodes"),
    REMAINING_REDEMPTIONS("RemainingRedemptions");

    private String urlParam;

    Includes(String urlParam) {
        this.urlParam = urlParam;
    }

    @Override
    public String getUrlParam() {
        return urlParam;
    }

    @Override
    public void setUrlParam(String urlParam) {
        this.urlParam = urlParam;
    }
}
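To address the follow-up about generifying the method: once every Includes enum implements SomeBaseClass, the method can accept the interface as a vararg instead of any one concrete enum type. A hedged sketch, reusing the names from the question (ApiHelper.Response, BaseClass and Offer are assumed to exist as in the original code):

public <T extends BaseClass> Response<T> post(T inputObject, SomeBaseClass... includes) {
    ArrayList<String> urlParams = new ArrayList<String>();
    // Any enum that implements SomeBaseClass can be passed here, e.g. Offer.Includes.VENDOR.
    for (SomeBaseClass include : includes) {
        urlParams.add(include.getUrlParam());
    }
    return null; // build the request from urlParams here, as in the original method
}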
If you want to get really fancy, it's possible to restrict subtypes of the interface to enums, but the generic type declaration will be pretty ugly (thus hard to understand and maintain) and probably won't provide any "real" benefits.
Unrelated note regarding this design: it's a pretty strong code smell that the enum instances are mutable. Reconsider why you need that setUrlParam() method in the first place.
I have multiple services (in Spring MVC) that are children of a global service, so I need to know about the best practice (or your opinions) regarding multiple methods, using this example:
// Domain classes
public class MyParentObject {}

public class MyObj extends MyParentObject {}

// Services
public class MyParentObjectServiceImpl implements MyParentObjectService {
    @Override
    public MyParentObject findObjectByProp(String prop, String objectType) {
        // myCode (not abstract class)
    }
}

public class MyObjServiceImpl extends MyParentObjectServiceImpl implements MyObjectService {
    private String myObjType = "MyObj";

    @Override
    public MyObj findMyObjByProp(String prop) {
        return (MyObj) super.findObjectByProp(prop, this.myObjType);
    }
}
And in this approach, I use calls like this:
MyObj foo = myObjService.findMyObjByProp(prop);
So I need to know whether this approach is "better" or more appropriate than calling the parent method directly with the second parameter, e.g.:
MyObj foo = (MyObj) myParentObjectService.findObjectByProp(prop, "MyObj");
..and avoiding the creation of the second, more specific methods. It is important to know that the child services will be created anyway, because we have a lot of code that is specific to the domain objects.
I have the idea that the first approach is better because it is more readable, but I need to support that decision with some documents, blogs, or opinions in order to discuss this design with my colleagues.
This looks like a tagged class hierarchy. It's difficult to comment on the value of this design in general without knowing the details. However, a slightly different approach that I would recommend is to generify your base class to gain a little bit of type safety.
In particular:
public /* abstract */ class MyParentObjectServiceImpl<T extends MyParentObject>
        implements MyParentObjectService {

    MyParentObjectServiceImpl(Class<T> type) { this.type = type; }

    private final Class<T> type; // subclasses provide this

    @Override
    public T findObjectByProp(String prop) {
        // you can use type for object-specific stuff
    }
}

public class MyObjServiceImpl extends MyParentObjectServiceImpl<MyObj>
        // You might not need this interface anymore
        // if the only method defined is findMyObjByProp
        /* implements MyObjectService */ {

    MyObjServiceImpl() {
        super(MyObj.class);
    }

    @Override
    public /* final */ MyObj findMyObjByProp(String prop) {
        return super.findObjectByProp(prop); // the cast and the type tag are no longer needed
    }
}
You definitely gain in type safety (casting will only appear in the base class), you get rid of the "tags" (the strings that identify the different objects) and possibly reduce the number of classes/interfaces required to implement the whole hierarchy. I successfully used this approach several times. Note that this works best if the base class is abstract. Food for thought.
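As an illustration of the "you can use type for object-specific stuff" comment in the sketch above, the Class<T> token allows a checked cast confined to the base class. A hedged fragment; lookupRaw is purely hypothetical and stands in for whatever untyped lookup the real implementation performs:

@Override
public T findObjectByProp(String prop) {
    // lookupRaw is a hypothetical untyped lookup (e.g. a DAO or registry call).
    Object raw = lookupRaw(prop, type.getSimpleName());
    // Class.cast performs a checked cast, keeping all casting in the base class.
    return type.cast(raw);
}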
In the Java code I'm working with, we have an interface to define our Data Access Objects (DAO). Most of the methods take a parameter of a Data Transfer Object (DTO). The problem occurs when an implementation of the DAO needs to refer to a specific type of DTO. The method then needs to do a (to me, completely unnecessary) cast of the DTO to SpecificDTO. Not only that, but the compiler can't enforce any type checking for specific implementations of the DAO, which should only take their specific types of DTOs as parameters.
My question is: how do I fix this in the smallest possible manner?
You could use generics:
DAO<SpecificDTO> dao = new SpecificDAO();
dao.save(new SpecificDTO());
etc.
Your DAO class would look like:
interface DAO<T extends DTO> {
    void save(T dto);
}

class SpecificDAO implements DAO<SpecificDTO> {
    public void save(SpecificDTO dto) {
        // implementation.
    }
    // etc.
}
SpecificDTO would extend or implement DTO.
Refactoring to generics is no small amount of pain (even though it's most likely worth it).
This will be especially horrendous if code uses your DTO interface like so:
DTO user = userDAO.getById(45);
((UserDTO) user).setEmail(newEmail);
userDAO.update(user);
I've seen this done (in much more subtle ways).
You could do this:
public class DeprecatedDAO implements DAO {
    public void save(DTO dto) {
        logger.warn("Use type-specific calls from now on", new Exception());
    }
}

public class UserDAO extends DeprecatedDAO {
    @Deprecated
    public void save(DTO dto) {
        super.save(dto);
        save((UserDTO) dto);
    }

    public void save(UserDTO dto) {
        // do whatever you do to save the object
    }
}
This is not a great solution, but might be easier to implement; your legacy code should still work, but it will produce warnings and stack traces to help you hunt them down, and you have a type-safe implementation as well.
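To illustrate how the two overloads get picked (a sketch; UserDTO is assumed to implement DTO, as in the snippets above): overload resolution happens at compile time based on the static type of the argument, so legacy code holding a plain DTO reference still hits the deprecated method, while migrated code passing a UserDTO reference goes straight to the type-safe one.

UserDAO userDAO = new UserDAO();

DTO legacy = new UserDTO();
userDAO.save(legacy);    // static type is DTO: deprecated overload, logs the warning

UserDTO migrated = new UserDTO();
userDAO.save(migrated);  // static type is UserDTO: type-safe overload, no warning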