I have a pretty simple interface which manages the update of business proposals; specifically, during a nightly batch process each record is submitted here (but it might be used in other scenarios as well).
This interface is used inside an EJB 2.0 Bean, which fetches records and "cycles" through them.
Beware: names are translated from Italian to English, so pardon possible errors. I have also simplified some concepts.
public interface ProposalUpdateService {
void updateProposal(final ProposalTypeOne proposal);
void updateProposal(final ProposalTypeTwo proposal);
}
The implementation of this interface has quite a lot of dependencies:
public class ProposalUpdateDefaultService implements ProposalUpdateService {
private final ComplexService complexService;
private final OtherComplexService otherComplexService;
private final ProposalStep<Proposal> stepOne;
private final ProposalStep<Proposal> stepTwo;
private final ProposalStep<ProposalTypeTwo> stepThree;
private final ProposalStep<Proposal> stepFour;
public ProposalUpdateDefaultService(
final ComplexService complexService,
final OtherComplexService otherComplexService,
final YetAnotherComplexService yetAnotherComplexService,
final SimpleService simpleService,
final OtherSimpleService otherSimpleService,
final YetAnotherSimpleService yetAnotherSimpleService,
final Converter<ProposalTypeOne, ComplexServiceType> converterProposalTypeOne,
final Converter<ProposalTypeTwo, OtherComplexServiceType> converterProposalTypeTwo) {
this.complexService = complexService;
this.otherComplexService = otherComplexService;
stepOne = new StepOne(yetAnotherComplexService);
stepTwo =
new StepTwo(
complexService,
otherComplexService,
yetAnotherComplexService,
converterProposalTypeOne,
converterProposalTypeTwo);
stepThree =
new StepThree(
simpleService,
otherSimpleService,
yetAnotherSimpleService);
stepFour = new StepFour();
}
...
As you can see, this class encapsulates the update of a Proposal object, and this process is split into four phases, each representing a single concept (such as "should this proposal be expired?" or "should I advance its state?"). Those four phases may be arranged differently for different types of Proposal.
Here is the highly simplified implementation of those two updateProposal methods:
@Override
public void updateProposal(final ProposalTypeOne proposal) {
stepOne.process(proposal);
stepTwo.process(proposal);
if (...) {
stepFour.process(proposal);
}
}
@Override
public void updateProposal(final ProposalTypeTwo proposal) {
stepOne.process(proposal);
stepTwo.process(proposal);
stepThree.process(proposal);
stepFour.process(proposal);
}
The two private fields
private final ComplexService complexService;
private final OtherComplexService otherComplexService;
are used by private helper methods.
As you can see, this class just organizes and delegates work; however, it depends on too many other classes. The same could be said for certain ProposalStep(s).
The *Service(s) are used inside each step to retrieve details from the database, to update dependent entries, etc.
Would you accept this number of dependencies?
How would you refactor to simplify?
I've read about the Facade Service concept as a way to reduce dependencies, and how I should group clusters of dependencies together, but here I don't really understand what to do.
I may group the Converter(s) together with the Service(s) which use them (see the sketch below), but there would still be too many dependencies anyway.
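For example, something along these lines is what I mean by grouping. This is just a sketch: the gateway name and its update/convert methods are invented, not part of the real code.
public class ProposalTypeOneGateway {
    private final ComplexService complexService;
    private final Converter<ProposalTypeOne, ComplexServiceType> converter;
    public ProposalTypeOneGateway(
            final ComplexService complexService,
            final Converter<ProposalTypeOne, ComplexServiceType> converter) {
        this.complexService = complexService;
        this.converter = converter;
    }
    // Single intention-revealing operation used by the steps;
    // "update" and "convert" are placeholder method names.
    public void update(final ProposalTypeOne proposal) {
        complexService.update(converter.convert(proposal));
    }
}
Each step that needs ComplexService would then receive only this gateway.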
Let me know if other details are needed.
The issue I can see is that ProposalUpdateDefaultService is doing too many things and knows too much. It accepts a lot of services, creates the steps and executes them; instead it should only accept a single parameter object and run the update without knowing what the steps are.
First I would try to reduce the parameters of the ProposalUpdateDefaultService constructor by creating a separate class which contains the services and converters.
public class ServicesAndConverters {
    ComplexService complexService;
    OtherComplexService otherComplexService;
    //...
}
In that way the code can be much cleaner
public class ProposalUpdateDefaultService implements ProposalUpdateService {
    ServicesAndConverters servicesAndConverters;
    public ProposalUpdateDefaultService(final ServicesAndConverters servicesAndConverters) {
        this.servicesAndConverters = servicesAndConverters; // maybe group them in two different classes??
    }
}
Now the second issue I can see is creating the steps in ProposalUpdateDefaultService itself. This should be the responsibility of a different class. Something like below:
public class ProposalUpdateDefaultService implements ProposalUpdateService {
    ServicesAndConverters servicesAndConverters;
    StepCreator stepCreator = new StepCreator();
    public ProposalUpdateDefaultService(final ServicesAndConverters servicesAndConverters) {
        this.servicesAndConverters = servicesAndConverters;
        stepCreator.createSteps(this.servicesAndConverters);
    }
}
And the StepCreator class should look like this
public class StepCreator {
    private ProposalStep<Proposal> stepOne;
    private ProposalStep<Proposal> stepTwo;
    private ProposalStep<ProposalTypeTwo> stepThree;
    private ProposalStep<Proposal> stepFour;
    public void createSteps(ServicesAndConverters s) {
        // build the four steps here from the services and converters
    }
}
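For the getter-style calls used below to compile, StepCreator would also need to expose the steps; a minimal sketch of the accessors, inside StepCreator (it assumes createSteps has populated the four fields):
// accessors assumed by the usage below
public ProposalStep<Proposal> getStepOne() {
    return stepOne;
}
public ProposalStep<Proposal> getStepTwo() {
    return stepTwo;
}
public ProposalStep<ProposalTypeTwo> getStepThree() {
    return stepThree;
}
public ProposalStep<Proposal> getStepFour() {
    return stepFour;
}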
Now ProposalUpdateDefaultService can execute the steps without knowing what the steps are or which services they need:
@Override
public void updateProposal(final ProposalTypeOne proposal) {
stepCreator.getStepOne().process(proposal);
stepCreator.getStepTwo().process(proposal);
if (...) {
stepCreator.getStepFour().process(proposal);
}
}
The solution that I found most convenient is simply removing the ProposalUpdateService abstraction and letting the EJB Bean manage the various steps.
This abstraction layer was unnecessary for now, and each step is still usable individually. Both ProposalUpdateService methods became private methods in the EJB Bean.
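Roughly, the bean now contains something like this (a sketch, not the real bean; the step fields are built from the same services the old constructor received, and the condition in the if is the same one as before):
private void updateProposal(final ProposalTypeOne proposal) {
    stepOne.process(proposal);
    stepTwo.process(proposal);
    if (...) { // same condition as in the original implementation
        stepFour.process(proposal);
    }
}
private void updateProposal(final ProposalTypeTwo proposal) {
    stepOne.process(proposal);
    stepTwo.process(proposal);
    stepThree.process(proposal);
    stepFour.process(proposal);
}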
TL;DR: Does my DailyRecordDataManager class have a code smell? Is it a 'God Class', and how can I improve the structure?
Hi,
I'm working on my first project with Spring. It's going to fetch covid-19 data from the Madrid (where I live) government website, organise it by locality, and serve it up through an API.
Here is a sample of the JSON data I'm consuming.
{
"codigo_geometria": "079603",
"municipio_distrito": "Madrid-Retiro",
"tasa_incidencia_acumulada_ultimos_14dias": 23.4668991007149,
"tasa_incidencia_acumulada_total": 1417.23308497532,
"casos_confirmados_totales": 1691,
"casos_confirmados_ultimos_14dias": 28,
"fecha_informe": "2020/07/01 09:00:00"
}
Each JSON object is a record of cases and the infection rate on a specific date and for a specific municipal district.
After fetching the data the program: parses it, filters it, trims/rounds some properties, maps it by locality, uses it to create an object for each locality (DistrictData), and writes the locality DistrictData objects to a MongoDB instance.
At the moment I have split each of these steps in the process into separate classes, as per the single responsibility principle, as can be seen in the linked screenshot:
[screenshot of IntelliJ package structure]
My problem is I don't know how to link these multiple classes together.
At the moment I have a Manager class which smells a bit like a God Class to me:
@Service
public class DailyRecordDataManager implements DataManager {
private final Logger logger = LoggerFactory.getLogger(DailyRecordDataManager.class);
private final DailyRecordDataCollector<String> dataCollector;
private final DataVerifier<String> dataVerifier;
private final JsonParser<DailyRecord> dataParser;
private final DataFilter<List<DailyRecord>> dataFilter;
private final DataTrimmer<List<DailyRecord>> dataTrimmer;
private final DataSorter<List<DailyRecord>> dataSorter;
private final DataMapper<List<DailyRecord>> dataMapper;
private final DataTransformer dataTransformer;
private final DistrictDataService districtDataService;
public DailyRecordDataManager(DailyRecordDataCollector<String> collector,
DataVerifier<String> verifier,
JsonParser<DailyRecord> parser,
DataFilter<List<DailyRecord>> dataFilter,
DataTrimmer<List<DailyRecord>> dataTrimmer,
DataSorter<List<DailyRecord>> dataSorter,
DataMapper<List<DailyRecord>> dataMapper,
DataTransformer dataConverter,
DistrictDataService districtDataService) {
this.dataCollector = collector;
this.dataVerifier = verifier;
this.dataParser = parser;
this.dataFilter = dataFilter;
this.dataTrimmer = dataTrimmer;
this.dataSorter = dataSorter;
this.dataMapper = dataMapper;
this.dataTransformer = dataConverter;
this.districtDataService = districtDataService;
}
@Override
public boolean newData() {
String data = dataCollector.collectData();
if (!dataVerifier.verifyData(data)) {
logger.debug("Data is not new.");
return false;
}
List<DailyRecord> parsedData = dataParser.parse(data);
if (parsedData.size() == 0) {
return false;
}
List<DailyRecord> filteredData = dataFilter.filter(parsedData);
List<DailyRecord> trimmedData = dataTrimmer.trim(filteredData);
List<DailyRecord> sortedData = dataSorter.sort(trimmedData);
Map<String, List<DailyRecord>> mappedData = dataMapper.map(sortedData);
List<DistrictData> convertedData = dataTransformer.transform(mappedData);
districtDataService.save(convertedData);
return true;
}
}
I also thought about linking all of the involved classes together in a chain of Injected Dependencies -> so each class has the next class in the process as a dependency and, provided nothing goes wrong with the data, calls that next class in the chain when it's time.
I do, however, feel that there must be a design pattern that solves the problem I have!
Thanks!
For anyone who finds this and wonders what I ended up opting for: the Pipeline pattern.
It allowed me to easily organise all of the individual classes I was using into one clean workflow, and it made each stage of the process, as well as the pipeline class itself, very easy to test.
I highly recommend anyone interested in the pattern in Java to check out this article, which I used extensively.
In the project I am working on we have a bunch of commonly used helpers. Consider the following example:
public class ServiceHelper {
public HttpServletRequest getRequest() { ... }
public Model getModel() { ... }
public UserCache getUserCache() { ... }
public ComponentContainer getComponentContainer() { ... }
}
Imagine this helper is being used across the whole application by every web service we have. Then, in order to test these services I need to mock it. Each time. But what if I create a factory of some kind instead, something like:
public class ServiceHelperMockStore {
public static ServiceHelper create() {
return init();
}
public static ServiceHelper create(final Model model) {
final ServiceHelper helper = init();
when(helper.getModel()).thenReturn(model);
return helper;
}
private static ServiceHelper init() {
final ServiceHelper helper = mock(ServiceHelper.class);
final HttpServletRequest request = mock(HttpServletRequest.class);
final Model model = mock(Model.class);
final UserCache userCache = mock(UserCache.class);
final ComponentContainer container = mock(ComponentContainer.class);
final BusinessRules businessRules = mock(BusinessRules.class);
final ModelTransformer modelTransformer = mock(ModelTransformer.class);
when(helper.getRequest()).thenReturn(request);
when(helper.getModel()).thenReturn(model);
when(helper.getUserCache()).thenReturn(userCache);
when(helper.getComponentContainer()).thenReturn(container);
when(container.getComponent(BusinessRules.class)).thenReturn(businessRules);
when(container.getComponent(ModelTransformer.class)).thenReturn(modelTransformer);
return helper;
}
}
This factory nicely fits my purposes, and oftentimes I can completely avoid using mock and when in the actual test suites. Instead, I can do the following:
@RunWith(MockitoJUnitRunner.Silent.class)
public class ModelServiceTest {
private final Model model = new Model();
private final ServiceHelper serviceHelper = ServiceHelperMockStore.create(model);
private final BusinessRules businessRules = serviceHelper.getComponentContainer().getComponent(BusinessRules.class);
private final ModelType modelType1 = new ModelType();
private final ModelType modelType2 = new ModelType();
private final ModelService modelService = new ModelService(serviceHelper);
@Before
public void setUp() {
modelType1.setItemId("item1");
modelType2.setItemId("item2");
model.setTypes(modelType1, modelType2);
when(businessRules.get("type")).thenReturn(modelType1);
}
...tests...
}
So instead of creating a lot of mocks in the ModelServiceTest, I can just access the predefined ones, like:
BusinessRules businessRules = serviceHelper.getComponentContainer().getComponent(BusinessRules.class);
and this even reflects my helper's API. I can also provide my own mock or stub by passing parameters to my factory method, or use some different approach.
The only problem I have is the UnnecessaryStubbingException thrown by Mockito, as normally I don't use all of the stubbings I've created in each test file. So I have to use the MockitoJUnitRunner.Silent runner to silence the error, and according to the Mockito API docs that is not recommended.
So I am seeking advice on what kind of approach should be chosen in this case. Am I doing it right, or is there some other way? Or, maybe, using this kind of factory is bad style in relation to unit tests, as it hides some initialization and makes what is happening less evident, so I should just plainly copy the setup code between test suites?
The fact that you need this identical complex mock configuration in different places shows that your code violates the Law of Demeter (don't talk to strangers).
A unit should only get dependencies it actually interacts with (other than merely getting another dependency from them).
So instead of creating a lot of mocks in the ModelServiceTest, I can just access the predefined ones,
Your unit tests are not only a verification of correct behavior but also minimal examples of how to use the CUT (code under test).
The configuration of the CUT's dependencies is an essential part of that example and should be easily visible to the reader of the tests.
I'd strongly discourage "factories for mocks", especially if they are moved to other classes (in the test folder).
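To make that concrete, here is a sketch of the direction this points in (the constructor shape is assumed, not taken from your code): let ModelService declare the collaborators it really uses instead of the ServiceHelper that merely carries them.
// Sketch: the service states its real dependencies; no ServiceHelper needed.
public class ModelService {
    private final Model model;
    private final BusinessRules businessRules;
    public ModelService(final Model model, final BusinessRules businessRules) {
        this.model = model;
        this.businessRules = businessRules;
    }
    // ... methods use model and businessRules directly ...
}
The test then builds a plain Model, mocks only BusinessRules, and the mock factory (and the silent runner) become unnecessary.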
Our application is getting complex. It mainly has 3 flows and has to process data based on one of the 3 types. Many of these functionalities overlap each other.
So currently the code is full of if-else statements; it is all messed up and not organised. How do I make a pattern so that the 3 flows are clearly separated from each other while still making use of the power of re-usability?
Please provide some thoughts. This is an MVC application, where we need to produce and consume web services using JAXB technology.
Maybe you can view the application as a single input object on which different strategies need to be applied based on a runtime value.
You did not specify what your if-else statements are doing. Say they are filtering depending on some value.
If I understand your question correctly, you want to look at the Factory pattern.
This is a clean approach, easy to maintain, and it produces readable code. Adding or removing a Filter is also easy: just add or remove the class and its entry in the FilterFactory map.
Create an interface, Filter:
public interface Filter {
    void filter();
}
Create a factory which returns the correct Filter according to your value. Instead of your if-else you can now just use the following:
Filter filter = FilterFactory.getFilter(value);
filter.filter();
One common way to write FilterFactory is using a HashMap inside it.
public class FilterFactory {
    static HashMap<Integer, Filter> filterMap;
    static {
        filterMap = new HashMap<>();
        filterMap.put(0, new Filter0());
        ...
    }
    // this function will change depending on your needs
    public static Filter getFilter(int value) {
        return filterMap.get(value);
    }
}
Create your three (in your case) filters like this (with meaningful names, though):
public class Filter0 implements Filter {
public void filter(){
//do something
}
}
NOTE: As you want to reuse some methods, create a FilterUtility class and make all your filters extend it so that you can use those shared functions without rewriting them.
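A sketch of what that could look like (the isValid helper is just a placeholder for whatever logic you actually need to share):
// Shared helpers for all filters; method names here are placeholders.
public abstract class FilterUtility {
    protected boolean isValid(int value) {
        // common validation reused by every filter
        return value >= 0;
    }
}
// Filter0 would then become something like:
public class Filter0 extends FilterUtility implements Filter {
    public void filter() {
        if (isValid(0)) {
            // do something specific to this filter
        }
    }
}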
Your question is very broad and almost impossible to answer without some description or overview of the structure of your application. However, I've been in a similar situation and this is the approach I took:
Replace conditions with Polymorphism where possible
it has mainly 3 flow and have to process based on this one of the 3
type. Many of these functionalities overlap each other.
You say your project has 3 main flows and that much of the code overlaps each other. This sounds to me like a strategy pattern:
You declare an interface that defines the tasks performed by a Flow.
public interface Flow{
public Data getData();
public Error validateData();
public void saveData();
public Error gotoNextStep();
}
You create an abstract class that provides implementation that is common to all 3 flows. (methods in this abstract class don't have to be final, but you definitely want to consider it carefully.)
public abstract class AbstractFlow implements Flow {
    private final FlowManager flowManager;
    public AbstractFlow(FlowManager fm) {
        flowManager = fm;
    }
    public final void saveData() {
        Data data = getData();
        saveDataAsXml(data);
    }
    public final Error gotoNextStep() {
        Error error = validateData();
        if (error != null) {
            return error;
        }
        saveData();
        flowManager.gotoNextStep();
        return null;
    }
}
Finally, you create 3 concrete classes that extend from the abstract class and define concrete implementation for the given flow.
public class BankDetailsFlow extends AbstractFlow {
    public BankDetailsFlow(FlowManager fm) {
        super(fm);
    }
    public BankDetailsData getData() {
        BankDetailsData data = new BankDetailsData();
        data.setSwiftCode(/*get swift code somehow*/);
        return data;
    }
    public Error validateData() {
        BankDetailsData data = getData();
        return validate(data);
    }
    public void onFormSubmitted() {
        Error error = gotoNextStep();
        if (error != null) {
            handleError(error);
        }
    }
}
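Choosing the concrete flow at runtime then replaces the scattered if-else blocks; for example (illustrative only, how you pick the concrete class will depend on your runtime value):
// The caller works against the Flow interface; the concrete class is chosen once.
Flow flow = new BankDetailsFlow(flowManager);
Error error = flow.gotoNextStep();
if (error != null) {
    // handle the error in one place
}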
Let's take an example: suppose you have a model, say "Data" (which has some attributes, getters/setters, and optional methods). In the context of a mobile application, in particular an Android application, there can be two modes: off-line or on-line. If the device is connected to the network, data is sent to the server; otherwise it is stored in the device's local database.
In a procedural way someone could define two models, OnlineData and OfflineData, and write code like this (the code is not exact, it's just pseudo-code):
if(Connection.isConnected()){
OnlineData ond=new OnlineData();
ond.save();//save is called which stores data on server using HTTP.
}
else{
OfflineData ofd = new OfflineData();
ofd.save();//save is called which stores data in local database
}
A good approach to implement this is using OOP principles:
Program to an interface, not an implementation.
Let's see how to do this.
I am just writing code snippets that represent what I mean more effectively. The snippets are as follows:
public interface Model {
long save();//save method
//other methods .....
}
public class OnlineData implements Model {
//attributes
public long save(){
//on-line implementation of save method for Data model
}
//implementation of other methods.
}
public class OfflineData implements Model {
//attributes
public long save(){
//off-line implementation of save method for Data model
}
//implementation of other methods.
}
public class ObjectFactory{
public static Model getDataObject(){
if(Connection.isConnected())
return new OnlineData();
else
return new OfflineData();
}
}
and here is the code that your client class should use:
public class ClientClass{
public void someMethod(){
Model model=ObjectFactory.getDataObject();
model.save();// here polymorphism plays role...
}
}
This also follows the:
Single Responsibility Principle [SRP]
because on-line and off-line are two different responsibilities, which would otherwise be squeezed into a single save() using an if-else statement.
After a long time, I found that open-source rule engine frameworks like "Drools" are a great alternative that fits my requirement.
I am trying to make a class a thread-safe singleton, but somehow I am not able to understand how to make a thread-safe singleton class which can accept parameters.
Below is the class (taken from this GitHub link) which I am currently using to make a connection to Zookeeper:
public class LeaderLatchExample {
private CuratorFramework client;
private String latchPath;
private String id;
private LeaderLatch leaderLatch;
public LeaderLatchExample(String connString, String latchPath, String id) {
client = CuratorFrameworkFactory.newClient(connString, new ExponentialBackoffRetry(1000, Integer.MAX_VALUE));
this.id = id;
this.latchPath = latchPath;
}
public void start() throws Exception {
client.start();
client.getZookeeperClient().blockUntilConnectedOrTimedOut();
leaderLatch = new LeaderLatch(client, latchPath, id);
leaderLatch.start();
}
public boolean isLeader() {
return leaderLatch.hasLeadership();
}
public Participant currentLeader() throws Exception {
return leaderLatch.getLeader();
}
public void close() throws IOException {
leaderLatch.close();
client.close();
}
public CuratorFramework getClient() {
return client;
}
public String getLatchPath() {
return latchPath;
}
public String getId() {
return id;
}
public LeaderLatch getLeaderLatch() {
return leaderLatch;
}
}
And this is the way I am calling the above class -
public static void main(String[] args) throws Exception {
String latchPath = "/latch";
String connStr = "10.12.136.235:2181";
LeaderLatchExample node1 = new LeaderLatchExample(connStr, latchPath, "node-1"); // this I will be doing only one time at just the initialization time
node1.start();
System.out.println("now node-1 think the leader is " + node1.currentLeader());
}
Now what I need is: if I call the two methods below from any class in my program, I should be able to get an instance of this class. So I am thinking of making the above class a thread-safe singleton so that I can access these two methods across my whole Java program.
isLeader()
getClient()
How do I make the above class a thread-safe singleton and then make use of isLeader() and getClient() across all my classes to see who is the leader and to get the client instance?
I need to do this only at initialization time, and once it is done, I should be able to use isLeader() and getClient() across all my classes. Is this possible to do?
// this line I will be doing only one time at just the initialization time
LeaderLatchExample node1 = new LeaderLatchExample(connStr, latchPath, "node-1");
node1.start();
This is more of a Java question than Zookeeper stuff.
A singleton which requires a parameter is a bit of a contradiction in terms. After all, you'd need to supply the parameter value on every call, and then consider what would happen if the value was different to an earlier one.
I would encourage you to avoid using the singleton pattern at all here. Instead, make your class a perfectly normal one - but use dependency injection to provide a reference to a single configured instance to all your classes that need it.
That way:
The singleton nature isn't enforced; it's just a natural consequence of you only needing one reference. If later on you needed two references (e.g. for different Zookeeper instances for some reason), you could just configure the dependency injection differently.
The lack of global state generally makes things much easier to test. One test might use one configuration; another test might use a different one. No singleton, no problem. Just pass the relevant reference into the constructor of the class under test.
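As a sketch of what that wiring could look like without any framework (every name apart from LeaderLatchExample is illustrative), build the single instance once at startup and pass it to whatever needs it:
// Composition root (e.g. your main class): one instance, created and started once.
public final class Application {
    public static void main(String[] args) throws Exception {
        LeaderLatchExample latch =
                new LeaderLatchExample("10.12.136.235:2181", "/latch", "node-1");
        latch.start();
        // Every class that cares about leadership receives the same instance.
        SomeWorker worker = new SomeWorker(latch);
        worker.doWork();
    }
}
// Illustrative consumer, in its own file: it asks for what it needs via its constructor.
public class SomeWorker {
    private final LeaderLatchExample leaderLatch;
    public SomeWorker(LeaderLatchExample leaderLatch) {
        this.leaderLatch = leaderLatch;
    }
    public void doWork() {
        if (leaderLatch.isLeader()) {
            // only the current leader does the work
        }
    }
}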
The book Growing Object Oriented Software gives several examples in jMock where the state is made explicit without exposing it through an API. I really like this idea. Is there a way to do this in Mockito?
Here's one example from the book
public class SniperLauncherTest {
private final States auctionState = context.states("auction state")
.startsAs("not joined");
@Test public void addsNewSniperToCollectorAndThenJoinsAuction() {
final String itemId = "item 123";
context.checking(new Expectations() {{
allowing(auctionHouse).auctionFor(itemId); will(returnValue(auction));
oneOf(sniperCollector).addSniper(with(sniperForItem(itemId)));
when(auctionState.is("not joined"));
oneOf(auction).addAuctionEventListener(with(sniperForItem(itemId)));
when(auctionState.is("not joined"));
one(auction).join(); then(auctionState.is("joined"));
}});
launcher.joinAuction(itemId);
}
}
I used a spy for the self same exercise:
http://docs.mockito.googlecode.com/hg/latest/org/mockito/Mockito.html#13
I changed my SniperListener mock into a spy thus:
private final SniperListener sniperListenerSpy = spy(new SniperListenerStub());
private final AuctionSniper sniper = new AuctionSniper(auction, sniperListenerSpy);
And also created a stubbed implementation of SniperListener:
private class SniperListenerStub implements SniperListener {
@Override
public void sniperLost() {
}
@Override
public void sniperBidding() {
sniperState = SniperState.bidding;
}
@Override
public void sniperWinning() {
}
}
The book uses JMock's "States", but I used a nested enum instead:
private SniperState sniperState = SniperState.idle;
private enum SniperState {
idle, winning, bidding
}
You then have to use regular JUnit asserts to test for the state:
@Test
public void reportsLostIfAuctionClosesWhenBidding() {
sniper.currentPrice(123, 45, PriceSource.FromOtherBidder);
sniper.auctionClosed();
verify(sniperListenerSpy, atLeastOnce()).sniperLost();
assertEquals(SniperState.bidding, sniperState);
}
Not that I'm aware of. I've used Mockito a fair amount and there's nothing in the docs similar to what I read on the JMock site about states. If I have it correctly, they basically limit the time at which an expectation can occur to the duration of a specific state of another object. It's an interesting idea, but I'm struggling to see the applications for it.
In Mockito you can execute code using stubbing with callbacks to do the same job. In the callback method you can execute further validations of the state. Alternatively you can employ a custom argument matcher, as matchers are also executed at the time of the call.
Both of these give you access to the code at execution time, which is the time at which you want to check the state.
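For example, with a callback via doAnswer (a sketch against the sniper example above, assuming Java 8 and a recent Mockito; sniperState and the SniperState enum are the ones from the spy-based answer, and sniperListenerMock is an ordinary Mockito mock):
// requires: import static org.mockito.Mockito.doAnswer;
// Record the state transition at the moment the mock is called.
doAnswer(invocation -> {
    sniperState = SniperState.bidding;
    return null;
}).when(sniperListenerMock).sniperBidding();
You can then assert on sniperState after exercising the sniper, just as in the spy-based approach.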