I am searching for a design pattern/way to exchange a (persistence) layer of my application dynamically (preferably even at runtime).
Why?
I'd like to be able to decide whether to save certain data to XML or a database on a per-instance basis. So I may decide that one project uses XML as a backend and another uses a database. I want to be flexible here and able to easily add another "driver", e.g. for JSON.
Now assume the following setup:
We have a controller and we want to manage some data. We can choose between a SQL and XML implementation.
One possible (working) solution:
BasicController.scala
val myPersistenceLayer: PersistenceLayer = SQLPersistenceLayer
val apples: Seq[Apple] = myPersistenceLayer.getApples()
trait PersistenceLayer
{
def getApples(): Seq[Apple]
def getBananas(): Seq[Banana]
}
object SQLPersistenceLayer extends PersistenceLayer
{
override def getApples(): Seq[Apple] = {...}
override def getBananas(): Seq[Banana] = {...}
}
This is a rather nasty solution as one would have to add methods for each new Model (think fruit! ;)) not only in the trait, but also in every implementation. I like my single responsibility so I'd rather delegate that to the models instead, like:
trait PersistenceLayer
{
def getAll(model: Model): Seq[Model] = { model.getAll() }
}
trait Model
{
def getAll(): Seq[Model]
}
package "SQL"
class Apple extends Model
{
def getAll(): Seq[Apple] = { // do some SQL magic here }
}
package "XML"
class Apple extends Model
{
def getAll(): Seq[Apple] = { // do some XML magic here instead }
}
Now the big problem here is, even if I implement a concrete PersistenceLayer, like so:
object SQLPersistenceLayer extends PersistenceLayer {}
how could I tell the application to use the model of the right package?
If I use the SQLPersistenceLayer like this:
val apples = myPersistenceLayer.getAll(Apple)
I would need to import the right "Apple" class, which defeats the whole purpose because then I could just remove all other classes, import the right one and just use a generic "getAll()" method on it.
So again I would need to change the implementation in multiple places, which is what I want to avoid.
I thought about something like passing a string with the package name, like
val packageName = "sql", and then importing from the right package in the controller, but this is not really feasible, not easy to accomplish, and a rather nasty hack for something I'm obviously missing.
To make a long story short: I want to be able to switch the package used for my persistence needs dynamically. In some dynamically typed languages I could come up with a solution, but not in Scala or any other statically typed language, so I guess I'm missing a certain design pattern here.
**Edit**
A thought occurred (ya, sometimes it happens ;)) and now I'm wondering whether something like this could lead to what I want:
package tld.app.persistence

import scala.collection.mutable
import scala.concurrent.Future
trait PersistenceLayer
{
protected val models: mutable.HashMap[String, Model] = new mutable.HashMap[String, Model]
def registerModel(key: String, model: Model): Unit =
{
models.remove(key)
models.put(key, model)
}
def get(model: String): Seq[Future[Model]] =
{
val m: Model = models.getOrElse(model, throw new Exception("No such model found!"))
m.get
}
}
trait Model
{
def get(): Seq[Future[Model]]
}
package tld.app.persistence.sql
object SQLPersistenceLayer extends PersistenceLayer
class Person extends Model
{
def get(): Seq[Future[Model]] =
{
// ... query the database
}
}
package tld.app.persistence.xml
object XMLPersistenceLayer extends PersistenceLayer
class Person extends Model
{
def get(): Seq[Future[Model]] =
{
// ... read in from the appropriate xml-file
}
}
object Settings
{
var persistenceLayer: PersistenceLayer = SQLPersistenceLayer // Default is SQLPersistenceLayer
}
Somewhere in the application:
Settings.persistenceLayer.get("person")
// Then a user-interaction happens
Settings.persistenceLayer = XMLPersistenceLayer
Settings.persistenceLayer.get("person")
The persistenceLayer normally stays the same, but the user can decide to change it. I'll have a deeper look at it as soon as I can find the time. But maybe somebody immediately spots a problem with that approach.
DI allows you to wire an implementation at compile time. There are many ways to do DI in Scala (Cake Pattern, Reader Monad, DI frameworks, etc).
If you want to wire the dependency on application startup, then regular dependency injection mechanisms will work: you just create an instance of the desired dependency (SQL, XML) based on some condition and pass it to the code that needs it.
If you want to keep switching between dependencies during your application's execution, i.e. sometimes you save to SQL and other times to XML, then you can use something similar to Lift Injector, see also my answer here - option 2.
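For illustration, here is a minimal sketch of the "wire it on startup" case, reusing the PersistenceLayer trait and the two layer objects from the question's edit; the `persistence` system property and the `ProjectController` name are only illustrative assumptions:

```scala
// Pick the implementation once, based on configuration, and inject it
// where it is needed (plain constructor injection, no framework required).
def buildLayer(kind: String): PersistenceLayer = kind match {
  case "xml" => XMLPersistenceLayer
  case _     => SQLPersistenceLayer
}

class ProjectController(layer: PersistenceLayer) {
  def persons() = layer.get("person")
}

// On startup:
val controller = new ProjectController(buildLayer(sys.props.getOrElse("persistence", "sql")))
```

Switching at runtime then amounts to swapping which object you pass in, which is essentially what the Settings holder in the question's edit already does.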
You can use runtime reflection to accomplish this. You specify and create the class/object at runtime, pass it to the persistence layer, and then just call the generic getAll method.
For details of the reflection library, see http://docs.scala-lang.org/overviews/reflection/overview.html
It would be better to make a companion object Apple which has the getAll method implemented differently for each persistence layer.
Then access the Apple objects with reflection by using the full package name:
val apple: sql.Apple = ??? // obtained via the reflection library
val apple: xml.Apple = ??? // obtained via the reflection library
val apples = myPersistenceLayer.get(apple)
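As a hedged illustration of that idea with the Scala 2 runtime-reflection API (this needs the scala-reflect module on the classpath; the fully qualified name below is the one from the question's edit and is only an example):

```scala
import scala.reflect.runtime.{universe => ru}

// Resolve a singleton object by its fully qualified name at runtime.
def loadLayer(fqcn: String): PersistenceLayer = {
  val mirror = ru.runtimeMirror(getClass.getClassLoader)
  val module = mirror.staticModule(fqcn)
  mirror.reflectModule(module).instance.asInstanceOf[PersistenceLayer]
}

// Switching packages is now just a matter of passing a different string:
val layer = loadLayer("tld.app.persistence.sql.SQLPersistenceLayer")
```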
I think you can achieve module-based inclusion with implicits + TypeTags, with something along these lines:
import scala.reflect.runtime.universe.TypeTag

object SqlPersistence {
  implicit def getAll[T: TypeTag]: () => Seq[T] = () => ??? /* type-based sql implementation */
}
object JsonPersistence {
  implicit def getAll[T: TypeTag]: () => Seq[T] = () => ??? /* type-based json implementation */
}
object PersistenceLayer {
  def getAll[T](implicit getter: () => Seq[T]): Seq[T] = getter()
}
// somewhere else ...
import SqlPersistence._
PersistenceLayer.getAll[Apple]
The advantage is that you can decide on your persistence layer on the spot by bringing in the corresponding import. The major downside is the same: you need to decide on your persistence layer with every call and make sure that it is what you think it is. Also, from my personal experience, the compiler is less helpful with tricky implicit corner cases, so there is the potential to spend more time debugging.
If you set your persistence layer once for an app, then DI would do fine, e.g. the cake pattern. But then again, you either need to have a method per class or resort to reflection. Without reflection, it may look like this:
trait PersistenceLayer {
  def getApples(): Seq[Apple]
}
trait SqlPersistenceLayer extends PersistenceLayer {
  override def getApples(): Seq[Apple] = ??? // sql to get apples
}
trait Controller {
  this: PersistenceLayer =>
  def doMyAppleStuff = getApples()
}
// somewhere in the main ...
val controller = new Controller with SqlPersistenceLayer {}
controller.doMyAppleStuff
Something similar is the strategy pattern, if that helps.
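For reference, a minimal sketch of the strategy pattern applied to this problem, reusing the question's Apple model; all names here are illustrative:

```scala
// Strategy: the controller holds a swappable "how do I load apples" strategy.
trait AppleLoadingStrategy {
  def loadApples(): Seq[Apple]
}

object SqlAppleLoading extends AppleLoadingStrategy {
  def loadApples(): Seq[Apple] = ??? // SQL query would go here
}

object XmlAppleLoading extends AppleLoadingStrategy {
  def loadApples(): Seq[Apple] = ??? // XML parsing would go here
}

class AppleController(var strategy: AppleLoadingStrategy) {
  def apples(): Seq[Apple] = strategy.loadApples()
}

// Switching at runtime is just reassigning the strategy:
val controller = new AppleController(SqlAppleLoading)
controller.strategy = XmlAppleLoading
```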
I think the repository pattern is your solution.
EDIT:
ok. thanks for "-1" thats ok because i did not explained my idea behind...
my example is only one of many others. so i hope that this is usefull for someone out there
i will try to explain my idea about using the repository and factory pattern.
for this i made a github repository with the example code: https://github.com/StefanHeimberg/stackoverflow-32319416
my setup ist nearly the same as in your question. but the difference is the following:
i did not use scala. but the concept would be the same...
my settings contains only a "flag" for the repository factory.
the "model" objects are persistence ignorance. that means the do not know how the are persisted. this is the concern of the repositories
i made dependency injection by hand cause this should be sufficient for the example
i have no "Controller" but i have "Application Services"...
the decition about the implementation used is made inside the factory on each call to the create() method.
the domain layer does not know anything about the used infrastructure implementation. the application layer is orchestrating the domain service and the infrastructure services (in my example only the repositories)
if you have any DI Container then the factory could by a Producer or soething else... depends on DI Container
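Since the question itself is in Scala, here is a rough Scala sketch of that factory idea. The names mirror the ones in the integration test below, but the bodies are placeholders; this is not the code from the linked repository:

```scala
// Illustrative sketch: the factory consults the settings on every create()
// call and returns the matching repository.
case class Apple(id: Long, message: String)

sealed trait RepositoryType
case object Jdbc extends RepositoryType
case object Json extends RepositoryType

class Settings { var repositoryType: RepositoryType = Jdbc }

trait AppleRepository { def findAppleById(id: Long): Apple }

class JdbcAppleRepository extends AppleRepository {
  def findAppleById(id: Long): Apple = ??? // JDBC access would go here
}
class JsonAppleRepository extends AppleRepository {
  def findAppleById(id: Long): Apple = ??? // JSON file access would go here
}

class AppleRepositoryFactory(jdbc: JdbcAppleRepository, json: JsonAppleRepository) {
  def create(settings: Settings): AppleRepository = settings.repositoryType match {
    case Jdbc => jdbc
    case Json => json
  }
}

// The application service asks the factory for a repository on each call,
// so flipping the flag in Settings switches the backend at runtime.
class AppleService(settings: Settings, factory: AppleRepositoryFactory) {
  def findAppleById(id: Long): Apple = factory.create(settings).findAppleById(id)
}
```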
The package structure can be seen in the linked repository.
I have also made a simple integration test:
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class AppleServiceIT {

    private Settings settings;
    private AppleService appleService;

    @Before
    public void injectDependencies() {
        settings = new Settings();
        final JdbcAppleRepository jdbcAppleRepository = new JdbcAppleRepository();
        final JsonAppleRepository jsonAppleRepository = new JsonAppleRepository();
        final AppleRepositoryFactory appleRepositoryFactory = new AppleRepositoryFactory(jdbcAppleRepository, jsonAppleRepository);
        appleService = new AppleService(settings, appleRepositoryFactory);
    }

    @Test
    public void test_findAppleById() {
        // test with jdbc
        settings.setRepositoryType(RepositoryTypeEnum.JDBC);
        assertEquals("JDBC-135", appleService.findAppleById(135L).getMessage());

        // test with json
        settings.setRepositoryType(RepositoryTypeEnum.JSON);
        assertEquals("JSON-243", appleService.findAppleById(243L).getMessage());
    }

    @Test
    public void test_getApples() {
        // test with jdbc
        settings.setRepositoryType(RepositoryTypeEnum.JDBC);
        assertEquals(2, appleService.getApples().size());

        // test with json
        settings.setRepositoryType(RepositoryTypeEnum.JSON);
        assertEquals(3, appleService.getApples().size());
    }
}
I read "Clean Code" book ((c) Robert C. Martin) and try to use SRP(single responsibility principle). And I have some questions about it. I have some service in my application, and I do not know how can I refactor it so it matched the right approach. For example, I have service:
public interface SendRequestToThirdPartySystemService {
void sendRequest();
}
What does it do if you look at the class name? It sends a request to the third-party system. But I have this implementation:
@Slf4j
@Service
public class SendRequestToThirdPartySystemServiceImpl implements SendRequestToThirdPartySystemService {
@Value("${topic.name}")
private String topicName;
private final EventBus eventBus;
private final ThirdPartyClient thirdPartyClient;
private final CryptoService cryptoService;
private final Marshaller marshaller;
public SendRequestToThirdPartySystemServiceImpl(EventBus eventBus, ThirdPartyClient thirdPartyClient, CryptoService cryptoService, Marshaller marshaller) {
this.eventBus = eventBus;
this.thirdPartyClient = thirdPartyClient;
this.cryptoService = cryptoService;
this.marshaller = marshaller;
}
@Override
public void sendRequest() {
try {
ThirdPartyRequest thirdPartyRequest = createThirdPartyRequest();
Signature signature = signRequest(thirdPartyRequest);
thirdPartyRequest.setSignature(signature);
ThirdPartyResponse response = thirdPartyClient.getResponse(thirdPartyRequest);
byte[] serialize = SerializationUtils.serialize(response);
eventBus.sendToQueue(topicName, serialize);
} catch (Exception e) {
log.error("Send request was filed with exception: {}", e.getMessage());
}
}
private ThirdPartyRequest createThirdPartyRequest() {
...
return thirdPartyRequest;
}
private Signature signRequest(ThirdPartyRequest thirdPartyRequest) {
byte[] elementForSignBytes = marshaller.marshal(thirdPartyRequest);
Element element = cryptoService.signElement(elementForSignBytes);
Signature signature = new Signature(element);
return signature;
}
}
What does it actually do? It creates a request -> signs the request -> sends the request -> sends the response to a queue.
This service injects 4 other services: eventBus, thirdPartyClient, cryptoService and marshaller, and the sendRequest method calls each of them.
If I want to create a unit test for this service, I need to mock 4 services. I think that's too much.
Can somebody indicate how can this service be changed?
Change the class name and leave as is?
Split into several classes?
Something else?
The SRP is a tricky one.
Let's ask two questions:
What is a responsibility?
What are the different types of responsibilities?
One important thing about responsibilities is that they have a scope, you can define them at different levels of granularity, and they are hierarchical in nature.
Everything in your application can have a responsibility.
Let's start with Modules. Each module has responsibilities and can adhere to the SRP.
Then this Module can be made of Layers. Each Layer has a responsibility and can adhere to the SRP.
Each Layer is made of different Objects, Functions etc. Each Object and/or Function has responsibilities and can adhere to the SRP.
Each Object has Methods. Each Method can adhere to the SRP. Objects can contain other objects and so on.
Each Function or Method in an Object is made of statements and can be broken down to more Functions/Methods. Each statement can have responsibilities too.
Let's give an example. Let's say we have a Billing module. If this module is implemented in a single huge class, does this module adhere to the SRP?
From the point of view of the system, the module does indeed adhere to the SRP. The fact that it's a mess doesn't affect this fact.
From the point of view of the module, the class that represents this module doesn't adhere to the SRP as it will do a lot of other things, like communicate with DB, send Emails, do business logic etc.
Let's take a look at the different types of responsibilities.
When something should be done
How it should be done
Let's take an example.
public class UserService_v1 {
public void SomeOperation(Guid userID) {
var user = GetUserByID(userID);
// do something with the user
}
public User GetUserByID(Guid userID) {
var query = "SELECT * FROM USERS WHERE ID = {userID}";
var dbResult = db.ExecuteQuery(query);
return CreateUserFromDBResult(dbResult);
}
public User CreateUserFromDBResult(DbResult result) {
// parse and return User
}
}
public class UserService_v2 {
public void SomeOperation(Guid userID) {
var user = UserRepository.getByID(userID);
// do something with the user
}
}
Let's take a look at these two implementations.
UserService_v1 and UserService_v2 do exactly the same thing but different ways. From the point of view of the System, these services adhere to the SRP as they contain operations related to Users.
Now let's take a look at what they actually do to complete their work.
UserService_v1 does these things:
Builds a SQL query string.
Calls the db to execute the query
Takes the specific DbResult and creates a User from it.
Does the operation on the User
UserService_v2 does these things:
Requests the User by ID from the repository
Does the operation on the User
UserService_v1 contains:
How specific query is build
How the specific DbResult is mapped to a User
When this query needs to be called (at the beginning of the operation in this case)
UserService_v2 contains:
When a User should be retrieved from the DB
UserRepository contains:
How specific query is build
How the specific DbResult is mapped to a User
What we do here is to move the responsibility of How from the Service to the Repository. This way each class has one reason to change. If how changes, we change the Repository. If when changes, we change the Service.
This way we create objects that collaborate with each other to do specific work by dividing responsibilities. The tricky part is: which responsibilities do we divide?
If we have a UserService and OrderService we don't divide when and how here. We divide what so we can have one service per Entity in our system.
It's natural for these services to need other objects to do their work. We can of course add all of the responsibilities of what, when and how to a single object, but that just makes it messy, unreadable and hard to change.
In this regard the SRP helps us achieve cleaner code by having more, smaller parts that collaborate with and use each other.
Let's take a look at your specific case.
If you can move the responsibility for how the ThirdPartyRequest is created and signed into the ThirdPartyClient, your SendRequestToThirdPartySystemService will only tell when this request should be sent. This will remove Marshaller and CryptoService as dependencies from your SendRequestToThirdPartySystemService.
You also have SerializationUtils, which you could probably rename to Serializer to capture the intent better; "Utils" is a name we tend to stick on objects we don't know how to name, and such classes often contain a lot of logic (and probably multiple responsibilities).
This will reduce the number of dependencies and your tests will have less things to mock.
Here's a version of the sendRequest method with fewer responsibilities.
@Override
public void sendRequest() {
    try {
        // params are not clear, as you don't show them in your code
        ThirdPartyResponse response = thirdPartyClient.sendRequest(param1, param2);
        byte[] serializedMessage = SerializationUtils.serialize(response);
        eventBus.sendToQueue(topicName, serializedMessage);
    } catch (Exception e) {
        log.error("Send request failed with exception: {}", e.getMessage());
    }
}
From your code I'm not sure if you can also move the responsibility for serialization and deserialization to the EventBus, but if you can, it will remove serialization from your service as well. This will make the EventBus responsible for how it serializes and stores the things inside it, making it more cohesive. Other objects that collaborate with it will just tell it to send an object to the queue, not caring how the object gets there.
I am not sure yet if I'm on the wrong track or not. There is an example on the Micronaut Getting Started page for a V8 Engine and injecting a Vehicle.
Defining Beans (often used Engine interface example)
With that example in mind. What is the most straightforward way to implement "Model A" with Micronaut using Java? If there's no direct approach, what is the closest hands-off method with Micronaut?
My simple vision of vanilla field injection building on such an example is as follows (using Java); I'm labelling it "Model A"...
Model A
import io.micronaut.context.*;
import javax.inject.Inject;

public class MyApp {                          // (A)

    @Inject
    Vehicle vehicle;

    public void runApp(String... args) {
        System.out.println(vehicle.start());
    }

    public static void main(String... args) {
        // whatever set-up and steps need     // (B)
        // to happen for auto-inject / auto-wiring.
        MyApp body = new MyApp();
        body.runApp(args);                    // (C)
    }
}
Here the annotation processor provides an instance of the @Singleton Vehicle, or creates a new instance in the case of non-singletons.
Either way, the result of the process would be that I don't need to write code to instantiate the dependency or find a factory to do so explicitly.
The example itself goes on to demonstrate the method I'll label "Model B" (using Groovy)...
Model B
import io.micronaut.context.*
...
Vehicle vehicle = BeanContext.run().getBean(Vehicle)
println( vehicle.start() )
Which in fact is MORE typing than just writing:
Vehicle vehicle = new Vehicle();
// OR
Vehicle vehicle = Vehicle.getInstance();
With some libraries you need to initialise the scopes or context, I see that. The question boils down to: what must I do to inject Vehicle as shown in my code?
I made a @Singleton and tried to @Inject the field. The reference is null. I then made a @Provider and set a breakpoint. That isn't called.
can I do "Model A"?
If yes, what needs to happen?
I've scanned lots of examples doing great things. I'd love to get into those fancy things too. Right now I'm in the basement looking for a way up to the ground floor. Many thanks for your guidance.
I'm currently working on a project that involves creating an abstraction layer. The goal of the project is to support multiple implementations of server software in the event that I might need to switch over to it. The list of features to be abstracted is rather long, so I'm going to want to look into a rather painless way to do it.
Other applications will be able to interact with my project and make calls that will eventually boil down to being passed to the server I'm using.
Herein lies the problem. I haven't much experience in this area and I'm really not sure how to make this not become a sandwich of death. Here's a chain of roughly what it's supposed to look like (and what I'm trying to accomplish).
/*
Software that is dependent on mine
|
Public API layer (called by other software)
|
Abstraction between API and my own internal code (this is the issue)
|
Internal code (this gets replaced per-implementation, as in, each implementation needs its own layer of this, so it's a different package of entirely different classes for each implementation)
|
The software I'm actually using to write this (which is called by the internal code)
*/
The abstraction layer (the one in the very middle, obviously) is what I'm struggling to put together.
Now, I'm only stuck on one silly aspect. How can I possibly make the abstraction layer something that isn't a series of
public void someMethod() {
    if (Implementation.getCurrentImplementation() == Implementation.TYPE1) {
        // whatever we need to do for this specific implementation
    } else {
        throw new NotImplementedException();
    }
}
(forgive the pseudo-code; also, imagine the same situation but for a switch/case since that's probably better than a chain of if's for each method) for each and every method in each and every abstraction-level class.
This seems very elementary, but I can't come up with a logical solution to address it. If I haven't explained my point clearly, please tell me what I need to elaborate on. Maybe I'm thinking about this whole thing wrong?
Why not use inversion of control?
You have your set of abstractions, you create several implementations, and then you configure your public API to use one of the implementations.
Your API is protected by the set of interfaces that the implementations inherit. You can add new implementations later without modifying the API code, and you can switch even at runtime.
I don't know anymore if inversion of control IS dependency injection, or if DI is a form of IoC, but... the point is that you remove the responsibility of dependency management from your component.
Here, you are going to have:
API layer (interface that the client uses)
implementations (as many as you like)
wrapper (that does the IoC by bringing in the impl)
API layer:
// my-api.jar
public interface MyAPI {
String doSomething();
}
public interface MyAPIFactory {
MyAPI getImplementationOfMyAPI();
}
implementations:
// red-my-api.jar
public class RedMyAPI implements MyAPI {
public String doSomething() {
return "red";
}
}
// green-my-api.jar
public class GreenMyAPI implements MyAPI {
public String doSomething() {
return "green";
}
}
// black-my-api.jar
public class BlackMyAPI implements MyAPI {
public String doSomething() {
return "black";
}
}
Some wrapper provides a way to configure the right implementation. Here, you can hide your switch/case in the factory, or load the impl from a config.
// wrapper-my-api.jar
public class NotFunnyMyAPIFactory implements MyAPIFactory {
private Config config;
public MyAPI getImplementationOfMyAPI() {
if (config.implType == GREEN) {
return new GreenMyAPI();
} else if (config.implType == BLACK) {
return new BlackMyAPI();
} else if (config.implType == RED) {
return new RedMyAPI();
} else {
throw new IllegalStateException("Unknown implementation type");
}
}
}
public class ReflectionMyAPIFactory implements MyAPIFactory {
    private Properties prop;
    public MyAPI getImplementationOfMyAPI() {
        try {
            return (MyAPI) Class.forName(prop.getProperty("myApi.implementation.className")).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(e);
        }
    }
}
// other possible strategies
The factory allows you to use several strategies to load the class. Depending on the solution, you only have to add a new dependency and change a configuration (and reload the app... or not) to change the implementation.
You might want to test the performance as well.
If you use Spring, you can use only the interface in your code and inject the right implementation from a configuration class (Spring is a DI container). But there is no need to use Spring; you can do that in the main entry point directly (you inject from as near to your entry point as possible).
The my-api.jar does not have dependencies (or maybe some towards the internal layers).
All the implementation jars depend on my-api.jar and on your internal code.
The wrapper jar depends on my-api.jar and on some of the impl jars.
So the client loads the jar he wants, uses the factory he wants or a configuration that injects the impl, and uses your code. It also depends on how you expose your API.
I am building a piece of software that sends and receives messages in particular binary definitions and with a particular version. As such, I have classes that look like this, which vary mostly only in the package name (the version, in this case):
For version 1.5:
com.mydomain.clothesmessage.v0105.fielddefinitions.Field100
com.mydomain.clothesmessage.v0105.fielddefinitions.Field200
com.mydomain.clothesmessage.v0105.messagedefinitions.Pants
com.mydomain.clothesmessage.v0105.messagedefinitions.Socks
and for version 2.7:
com.mydomain.clothesmessage.v0207.fielddefinitions.Field100
com.mydomain.clothesmessage.v0207.fielddefinitions.Field200
com.mydomain.clothesmessage.v0207.messagedefinitions.Pants
com.mydomain.clothesmessage.v0207.messagedefinitions.Socks
The class that manages the transmission and reception of these messages uses all versions, depending on where the message comes from, etc.
My problem is that defining an instance of the class requires I use the entire package path, because otherwise it's ambiguous. Even if there exists a situation where I use only one version in a given file, a casual reader of the code won't be able to see what version is being used. Pants pants = new Pants() is ambiguous until you look at the imported package.
My ideal usage of this would be something like this:
V0207.Pants pantsMessage = new V0207.Pants();
That makes it very clear what version is being used. I could make this happen by creating the Pants message classes as inner classes of the V0207 class, but then the V0207 class becomes gigantic (there could be a hundred messages, each with 100 fields, for every given version). Is there possibly a way to #include an inner class, so they can be stored in separate files? This would be ideal.
I suppose I can emulate this with a wrapper class, that does something (silly?) like this, where there exists an instance of the Pants class in the V0207 object:
Object pantsMessage = V0207.pants.getClass().newInstance();
((com.mydomain.clothesmessage.v0207.messagedefinitions.Pants)pantsMessage).getZipperType();
But I dislike that. It looks contrived and requires try/catch and casting when in use. Terrible.
I could also use a factory. That would be a bit nicer, but requires a parent class (or interface) and would require casting when used, since each message has unique methods.
Message pantsMessage = V0207Factory.newMessage(V0207.PantsMessage);
((com.mydomain.clothesmessage.v0207.messagedefinitions.Pants)pantsMessage).getZipperType();
or
Message sockMessage = V0207Factory.newSock();
((com.mydomain.clothesmessage.v0207.messagedefinitions.Socks)sockMessage).getSmellLevel();
What are your thoughts? I'm using JDK 1.7, but 1.8 might be usable.
Consider using the factory design pattern with interfaces. The version of Java that you use does not make a difference (though support for Java 7 goes away in the spring, April if I remember correctly).
Define an interface for each class containing the method signatures that will be implemented by all the versions of the class.
Update your class definitions to include the appropriate interface definition.
Create a class factory for each needed class, passing it the information needed to create the appropriate version of the class. This class factory should return the interface type for the created class.
Here is an example:
TestPants
public class TestPants {
    IPants pants = PantsFactory.PantsFactory(207);
    Message zipperType = pants.getZipperType();
    Message color = pants.getColor();
}
IPants
public interface IPants {
Message getZipperType();
Message getColor();
}
Pants
public class Pants implements IPants {
// Class fields and Object fields
@Override
public Message getColor () {
return null;
}
@Override
public Message getZipperType () {
return null;
}
// implement any common methods among all versions
}
PantsV0105
public class PantsV0105 extends Pants {
// add changes for this version
}
PantsV0207
public class PantsV0207 extends Pants {
// add changes for this version
}
PantsFactory
public class PantsFactory {
    public static IPants PantsFactory(int version) {
        switch (version) {
            case 105: return new PantsV0105();
            case 207: return new PantsV0207();
            default:  return null;
        }
    }
}
I initially solved this by using inner static classes in one gigantic "version" class. Thus, the use looked like this:
V0207.Pants pantsMessage = new V0207.Pants();
But the version class ('V0207') grew too quickly, especially as other developers on the team demanded a more "Java" way of setting the fields (which required a lot of getters and setters).
Thus, the final solution is to put the messages inside their own v0207.messages package name, and prepend each message with the version:
V0207_Pants pantsMessage = new V0207_Pants();
It's not as nice as using a C++ namespace, but it works. The version is clear to the reader, and the object can contain a lot of code without any files becoming too large.
Is JavaScript-like prototyping achievable in any way, even using reflection? Can I wrap my object inside another one just to extend its functionality with one or two more methods, without wiring all of its original non-private methods to the wrapper class, or is extends all I get?
If you are looking for extension methods, you could try Xtend. Xtend is a language that compiles to Java code and eliminates boilerplate code.
The following text is stolen from the Xtend Docs for extensions:
By adding the extension keyword to a field, a local variable or a parameter declaration, its instance methods become extension methods.
Imagine you want to have some layer-specific functionality on a class Person. Let's say you are in a servlet-like class and want to persist a Person using some persistence mechanism. Let's assume Person implements a common interface Entity. You could have the following interface:
interface EntityPersistence {
public save(Entity e);
public update(Entity e);
public delete(Entity e);
}
And if you have obtained an instance of that type (through a factory or dependency injection or whatever) like this:
class MyServlet {
extension EntityPersistence ep = Factory.get(typeof(EntityPersistence))
...
}
You are able to save, update and delete any entity like this:
val Person person = ...
person.save // calls ep.save(person)
person.name = 'Horst'
person.update // calls ep.update(person)
person.delete // calls ep.delete(person)
I don't think you can do this in Java. You can, though, in Groovy, using metaclasses:
String.metaClass.world = {
return delegate + " world!"
}
println "Hello".world()