How does Intershop object wiring work - java

Could someone explain what is the capi package used for? Is it for dependency injection?
Also, what would be the proper way to connect two BOs (e.g. BasketBO and BucketBO) to get access to their methods and create some new data?
Is it through the pipeline or through some new common object?

The capi (cartridge API) package is where you put the interfaces/classes of your cartridge's public API. You might have noticed there is almost always an internal package too; this is where the implementation of the public API goes. So the interface SomeObjectMgr will be in the com.example.capi package and the implementation class SomeObjectMgrImpl in the com.example.internal package. You can consider the capi package to be stable, while the internal packages can change drastically from version to version.
As for your second question, BOs are grouped together in an aggregate if they belong together, but an aggregate can reference other aggregates, so you are not limited in the relations you can build. For example, the BasketBO can access its BucketBO objects using its access methods. You could write an extension with business logic that manipulates the two BOs and returns whatever data you need. However, keep in mind that transaction control is at the pipeline/pipelet level, so take that into account when designing your methods if you need to roll back a transaction.
As for dependency injection, Intershop uses Google's Guice framework. You can find more information on how to use it here.
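For illustration, here is a minimal Guice sketch of how a capi interface could be bound to its internal implementation (the names are the ones from the example above; the exact way Intershop registers such modules may differ from this plain-Guice form):

import com.google.inject.AbstractModule;

public class SomeObjectModule extends AbstractModule {
    @Override
    protected void configure() {
        // Callers depend only on the stable capi interface; the binding points
        // at the internal implementation, which may change between versions.
        bind(com.example.capi.SomeObjectMgr.class)
            .to(com.example.internal.SomeObjectMgrImpl.class);
    }
}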

Related

Should I inject every and any class or specific classes?

I just started getting into DI (dependency injection) frameworks and I was wondering: should I inject every and any class, or only specific classes? How would I know which classes I should inject and which I shouldn't?
A lot of the tutorials (Dagger 2, Hilt, etc.) that I've found don't seem to talk about this, or don't explain it very well.
I don't have any code examples because I'm trying things out as I read along with tutorials. Just trying to get a feel for things.
Explain it as if I were a 4-year-old :)
My answer is no, you do not have to use dependency injection every time you need to create an object.
This is the intent of dependency injection according to Wikipedia:
The intent behind dependency injection is to achieve separation of concerns of construction and use of objects. This can increase readability and code reuse.
So you should use dependency injection where it would improve your code reuse and readability.
This article mentions some cases where you should use dependency injection to create objects:
You need to inject configuration data into one or more components.
You need to inject the same dependency into multiple components.
You need to inject different implementations of the same dependency.
You need to inject the same implementation in different configurations.
You need some of the services provided by the container.
Imagine that you have username and password fields in an app that you use to capture a user's details. You do not have to use dependency injection to create a User object from those details; you can just create it directly, since it's not something that would be reused across your application. There are other object creation patterns you should look into, like the factory pattern, abstract factory pattern, builder pattern, etc.
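As a rough sketch of that distinction (User, UserRepository, and RegistrationController are made-up names), the value object is created with plain new while the reusable, stateless collaborator is injected:

import javax.inject.Inject;

public class RegistrationController {

    private final UserRepository userRepository; // shared and reusable: inject it

    @Inject
    public RegistrationController(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    public void register(String username, String password) {
        User user = new User(username, password); // simple data holder: just use new
        userRepository.save(user);
    }
}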
There is a certain set of classes which I always inject without ever asking myself the question: the application context, shared preferences, DAOs, Retrofit, the Json mapper. As you can see, these are usually related to third-party libraries or the Android framework.
Then there are classes which depend on those: the repository is a common example, but there are many more. For example you could have a preference manager which depends on shared preferences and context. Or you could have a class JsonExporter used to export user data which depends on DAOs and the Json mapper.
In turn, there are other classes which depend on those new classes: a view model/presenter in MVVM/MVP architecture could depend on both the repository and JsonExporter for example. At this point you have two choices:
Instantiate the class yourself. But that means you need to have access to all of the class's dependencies, which are only available through dependency injection...
...so you might as well inject it. The cost of injection at this point is often very low: just add @Inject to the constructor. Only the base dependencies have to be provided in a module.
From a certain number of injected base classes, the decision to inject those more nested in the dependency graph really comes automatically.
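As a small sketch of how cheap that usually is (JsonExporter and the Json mapper are the names used above; UserDao is made up), annotating the constructor is often all it takes for Dagger to know how to build the class:

import javax.inject.Inject;

public class JsonExporter {

    private final UserDao userDao;       // already provided by a module elsewhere
    private final JsonMapper jsonMapper; // likewise

    @Inject // Dagger can now construct JsonExporter with no extra module code
    public JsonExporter(UserDao userDao, JsonMapper jsonMapper) {
        this.userDao = userDao;
        this.jsonMapper = jsonMapper;
    }
}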
There are also cases where you'll have a class that depends on nothing. Perhaps you decided to extract code from a big class into another one. Or perhaps the class just has no dependencies. I probably wouldn't inject such a class, there's no point.
Singletons are often handy to inject because the dependency injection framework (e.g. Dagger) can make sure there's always only one instance automatically.
One reason why we do dependency injection is so that classes depend on abstractions, not concretions (aka inversion of control). It's useful to have your class depend on a Repository interface, because you can decide to provide it with the implementation you want, for example RealRepository in the app and MockRepository in tests. Another example is flavors or build variants: an injected FlavorBehavior interface could have different implementations in two different flavors. It's not the class's responsibility to decide which one to use.
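A minimal sketch of that idea (the names mirror the Repository example above; nothing here is a specific framework API): the class only knows the interface, and the app, tests, or flavors decide which implementation to supply.

import java.util.List;

interface Repository {
    List<String> loadItems();
}

class ItemsViewModel {

    private final Repository repository;

    // RealRepository in the app, MockRepository (or a Mockito mock) in tests
    ItemsViewModel(Repository repository) {
        this.repository = repository;
    }

    int itemCount() {
        return repository.loadItems().size();
    }
}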
Note that this is not a definitive answer, and I'm not an expert on the subject. It's an opinion-based subject too.

How to create layers in an Android library without exposing internal classes to consumer of the library

I'm creating an Android library and wanted to organize it with layers, something like this:
PublicClassExposedToLibraryConsumer.java
logic.PublicFooLogicInterface1.java
logic.PackagePrivateFooLogicClass1.java
logic.PublicFooLogicInterface2.java
logic.PackagePrivateFooLogicClass2.java
domain.PublicFooDomainInterface1.java
domain.PackagePrivateFooDomainClass1.java
domain.PublicFooDomainInterface2.java
domain.PackagePrivateFooDomainClass2.java
repository.PublicFooRepoInterface1.java
repository.PackagePrivateFooRepoClass1.java
repository.PublicFooRepoInterface2.java
repository.PackagePrivateFooRepoClass2.java
1) I want a number of layers and I want to limit interaction between those layers by using interfaces.
2) I want to only expose PublicClassExposedToLibraryConsumer.java to the consumer of the library. They should not be able to access the other classes and interfaces.
Is this possible? From what I've read, in order to make something accessible to the consumer of the library it needs to be public, and to hide something from the consumer of the library it needs to be non-public. By my reading, this means that you can't separate layers without exposing something, and you can't hide internal classes without being forced into a completely flat architecture. I find this very hard to believe; I have to be missing something.
You can try annotations that apply a specific scope to the files you want to restrict from the end user of your library. The best way to do this on Android is the @RestrictTo support library annotation at class level.
Note: fields and methods of a particular class can be scoped with access modifiers like private, protected, or package-private. (Just ignore this if you already know that.)
@RestrictTo: Denotes that the annotated element should only be accessed from within a specific scope (as defined by RestrictTo.Scope).
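A rough usage sketch, assuming the androidx.annotation (or the older support library) artifact is on the classpath and lint is enabled; the class name is made up:

import androidx.annotation.RestrictTo;

// The class stays public so other packages inside the library can use it,
// but lint flags any use from code outside this library group.
@RestrictTo(RestrictTo.Scope.LIBRARY_GROUP)
public class InternalFooLogic implements PublicFooLogicInterface1 {
    // internal implementation details
}

Keep in mind that @RestrictTo is enforced by lint rather than by the compiler or the runtime, so it is a strong hint to consumers rather than a hard boundary.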

Domain Driven Design - testability and the "new" keyword

I have been trying to follow a domain driven design approach in my new project. I have always generally used Spring for dependency injection, which nicely separates my application code from the construction code, however, with DDD I always seem to have one domain object wanting to create another domain object, both of which have state and behaviour.
For example, given a media file, we want to encode it to a different format - the media asset calls on a transcode service and receives a callback:
class MediaAsset implements TranscodingResultListener {

    private NetworkLocation permanentStorage;
    private Transcoder transcoder;

    public void transcodeTo(Format format) {
        transcoder.transcode(this, format);
    }

    public void onSuccessfulTranscode(TranscodeResult result) {
        Rendition rendition = new Rendition(this, result.getPath(), result.getFormat());
        rendition.moveTo(permanentStorage);
    }
}
Which throws two problems:
If the rendition needs some dependencies (like the MediaAsset requires a "Transcoder") and I want to use something like Spring to inject them, then I have to use AOP in order for my program to run, which I don't like.
If I want a unit test for MediaAsset that tests that a new format is moved to temporary storage, then how do I do that? I cannot mock the rendition class to verify that it had its method called... the real Rendition class will be created.
Having a factory to create this class is something that I've considered, but it is a lot of code overhead just to contain the "new" keyword which causes the problems.
Is there an approach here that I am missing, or am I just doing it all wrong?
I think that the injection of a RenditionFactory is the right approach in this case. I know it requires extra work, but you also remove an SRP violation from your class. It is often tempting to construct objects inside business logic, but my experience is that injecting the object or an object factory pays off 99 times out of 100, especially if the object in question is complex and/or interacts with system resources.
I assume your approach for unit testing is to test the MediaAsset in isolation. Doing this, I think a factory is the common solution.
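A rough sketch of what that could look like (RenditionFactory is a hypothetical interface, not an existing API): MediaAsset asks the injected factory for the Rendition instead of calling new, so a unit test can hand in a factory that returns a mock and verify that moveTo(...) is called.

interface RenditionFactory {
    Rendition create(MediaAsset asset, TranscodeResult result);
}

class MediaAsset implements TranscodingResultListener {

    private final NetworkLocation permanentStorage;
    private final Transcoder transcoder;
    private final RenditionFactory renditionFactory; // injected alongside the transcoder

    MediaAsset(NetworkLocation permanentStorage, Transcoder transcoder,
               RenditionFactory renditionFactory) {
        this.permanentStorage = permanentStorage;
        this.transcoder = transcoder;
        this.renditionFactory = renditionFactory;
    }

    public void onSuccessfulTranscode(TranscodeResult result) {
        Rendition rendition = renditionFactory.create(this, result);
        rendition.moveTo(permanentStorage);
    }

    // transcodeTo(...) stays as in the question
}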
Another approach is to test the whole system (or almost the whole system). Let your test access the outer interface[1] (user interface, web service interface, etc) and create test doubles for all external systems that the system accesses (database, file system, external services, etc). Then let the test inject these external dependencies.
Doing this, you can let the tests be all about behaviour. The tests become decoupled from implementation details. For instance, you can use dependency injection for Rendition, or not: the tests don't care. Also, you might discover that MediaAsset and Rendition are not the correct concepts[2], and you might need to split MediaAsset in two and merge half of it with Rendition. Again, you can do it without worrying about the tests.
(Disclaimer: Testing on the outer level does not always work. Sometimes you need to test common concepts, which requires you to write micro tests. And then you might run into this problem again.)
[1] The best level might actually be a "domain interface", a level below the user interface where you can use the domain language instead of strings and integers, and where you can talk domain actions instead of button clicks and focus events.
[2] Perhaps this is actually your problem: Are MediaAsset and Rendition the correct concepts? If you ask your domain expert, does he know what these are? If not, are you really doing DDD?

Dependency Injection in every aspect of a spring app?

I am taking a look at Spring as a web framework, but I need a bit of help getting my head around DI.
The concept of objects getting constructed in the container at runtime is very new to me.
I am just wondering how this plays out in a big application: would I have some modules doing work that are more tightly coupled, or should every object be initialised at runtime?
It all seems a little intensive to me. Say, for example, I have a CSV data-mining application that extracts the data per row - each row's data is encapsulated in one of my own CSVRow objects for processing or whatever. These objects are instantiated whenever an Excel file is uploaded to the server; I don't know how many I will need to create.
I seem to be getting a bit lost, any clarity, an overview or some guidance would be much appreciated.
Thanks in advance!
I'll try to put it simply:
use dependency injection for stateless classes that have logic (business logic, persistence logic, front-end logic)
use new for value objects
Broadly speaking, an application is made up of a collection of classes that implement the business logic.
Normally, each object is responsible for obtaining references to the objects it needs (and for those objects' dependencies).
I think it is obvious that this leads to:
1) tightly coupled classes
2) code that is hard to test, since each object instantiates the specific classes it depends on, and if something needs to change, the code must be modified.
So with Dependency Injection, the objects do not instantiate their dependencies themselves; an "external component" provides the dependencies at object creation time, i.e. it injects the dependencies into the objects.
So in your example, the idea is that you could have, say, a CsvRow object instantiated by Spring (along with all its dependencies) and get an instance whenever it is needed. It would also be possible to switch to, say, a CsvRow2 object (another implementation) by just changing your configuration.
You don't need to use DI for your CSV row abstraction. Once you get the file, when you start parsing it, your code can create the CSVRow things as it goes. You don't need to wire them up.
You certainly could if you wanted to. You would grab your applicationContext and just get the beans by name. You would want to do this if the CsvRow had dependencies that you wanted Spring to manage for you.
I think of Spring as a way to create "singletons". When I want to guarantee there's only one instance of a class in the application, use Spring to create it. But, instead of being a traditional singleton with a static INSTANCE field or similar, it's a POJO with whatever constructors / setters you need. Spring creates the instance at runtime for you and makes sure that creation only happens once.
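A small sketch of that split, with made-up names (CsvImportService, CsvRow): the stateless service is a Spring-managed singleton bean, while the rows are plain value objects created with new during parsing.

import java.util.ArrayList;
import java.util.List;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AppConfig {

    @Bean // Spring creates and caches exactly one CsvImportService for the application
    public CsvImportService csvImportService() {
        // Switching to another implementation only requires changing this one method.
        return new CsvImportService();
    }
}

class CsvImportService {

    public List<CsvRow> parse(List<String> lines) {
        List<CsvRow> rows = new ArrayList<>();
        for (String line : lines) {
            rows.add(new CsvRow(line.split(","))); // value objects: no container involved
        }
        return rows;
    }
}

class CsvRow {
    private final String[] fields;

    CsvRow(String[] fields) {
        this.fields = fields;
    }
}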

Where to put business logic in Eclipse RCP program

I'm writing a small application in RCP to wrap around the business logic in another (non-RCP) simulation library. I can access and use the library fine from any of my plugins, but I don't know where I should put the instance of the Simulation library so that, say, one of the command handlers can make calls to it.
From reading the docs it sounds like I should be storing 'global' information like this in the workbench - but I still don't really understand how to do that.
Help?
First, the business layer (BL) can and should reside in its own plugin. That will provide decent decoupling between the layers.
Second, you should carefully decide what the interface should be and which classes are exposed. Ideally, you should mostly expose interfaces and data objects.
Finally, decide how the "handshake" works, e.g. how to obtain the initial interface to the BL. Since it is a plugin, it could have an Activator which loads it. You could add a method to the Activator which returns the BL interface.
If you are looking for something more decoupled, you could create an extension point or deploy the BL as an OSGi service, but that's a bit of overkill for your needs.
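For illustration, a rough sketch of the Activator hand-off (ISimulationService and SimulationServiceImpl stand in for whatever interface and implementation your simulation library exposes):

import org.eclipse.core.runtime.Plugin;
import org.osgi.framework.BundleContext;

public class SimulationActivator extends Plugin {

    private static SimulationActivator instance;
    private ISimulationService simulationService;

    @Override
    public void start(BundleContext context) throws Exception {
        super.start(context);
        instance = this;
        simulationService = new SimulationServiceImpl(); // created once when the plug-in starts
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        instance = null;
        super.stop(context);
    }

    public static SimulationActivator getDefault() {
        return instance;
    }

    // Command handlers in other plug-ins can call
    // SimulationActivator.getDefault().getSimulationService()
    public ISimulationService getSimulationService() {
        return simulationService;
    }
}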
If I understand you correctly, I see two ways:
Store the instance in the model plug-in itself, using SimulationFactory.getInstance(String myAppId). The passed String is a constant in your app that is always used when obtaining the reference.
Define a new class, e.g. GlobalAccess, in your app that is initialized with an instance of your model and has some getters (whether you use a single instance again or only provide public static methods is a matter of taste).
The second way is similar to some classes in Eclipse like Platform or PlatformUI, where you can obtain initial references and navigate through the workbench.
Edit: I just found a tutorial that might help you:
Passing Data between Plug-ins
