aggregate not found in the event store - java

I am trying to add data using the CQRS framework Axon, but while hitting the API (used to add an order) I am getting the error below:
Command 'com.cqrs.order.commands.CreateOrderCommand' resulted in org.axonframework.modelling.command.AggregateNotFoundException(The aggregate was not found in the event store)
But I already have an Aggregate in my code (OrderAggregate.java).
The Full code can be found at - https://github.com/iftekharkhan09/OrderManagementSystem
API to add Order - http://localhost:8080/confirmOrder
Request Body:
{
    "studentName": "Sunny Khan"
}
Can anyone please tell me where I am going wrong?
Any help is appreciated!

For other readers, let me share the Aggregate you've created in your repository:
@Aggregate
public class OrderAggregate {

    public OrderAggregate(OrderRepositoryData orderRepositoryData) {
        this.orderRepositoryData = orderRepositoryData;
    }

    @AggregateIdentifier
    private Integer orderId;
    private OrderRepositoryData orderRepositoryData;

    @CommandHandler
    public void handle(CreateOrderCommand command) {
        apply(new OrderCreatedEvent(command.getOrderId()));
    }

    @EventSourcingHandler
    public void on(OrderCreatedEvent event) {
        this.orderId = event.getOrderId();
        Order order = new Order("Order New");
        orderRepositoryData.save(order);
    }

    protected OrderAggregate() {
        // Required by Axon to build a default Aggregate prior to Event Sourcing
    }
}
There are several things you can remove entirely from this Aggregate, which are:
The OrderRepositoryData
The OrderAggregate constructor which sets the OrderRepositoryData
The manually saving of an Order in the #EventSourcingHandler annotated function
What you're doing here is mixing the Command Model's concern of making decisions with creating a queryable Order for the Query Model. It would be better to remove this logic entirely from an Aggregate (the Command Model in your example) and move this to an Event Handling Component.
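For illustration, a minimal sketch of such an Event Handling Component could look roughly like this (the class name OrderProjection is made up; Order and OrderRepositoryData are the types from your repository):
    @Component
    public class OrderProjection {

        private final OrderRepositoryData orderRepositoryData;

        public OrderProjection(OrderRepositoryData orderRepositoryData) {
            this.orderRepositoryData = orderRepositoryData;
        }

        @EventHandler
        public void on(OrderCreatedEvent event) {
            // build and store the query-side Order here instead of inside the aggregate
            orderRepositoryData.save(new Order("Order New"));
        }
    }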
This is however not the culprit for the AggregateNotFoundException you're receiving.
What you've missed is to make the CreateOrderCommand command handler a constructor.
The CreateOrderCommand will create an Order, as its name already suggests.
Hence, it should be handled by a constructor rather than a regular method.
So, instead of this:
@CommandHandler
public void handle(CreateOrderCommand command) {
    apply(new OrderCreatedEvent(command.getOrderId()));
}
You should be doing this:
@CommandHandler
public OrderAggregate(CreateOrderCommand command) {
    apply(new OrderCreatedEvent(command.getOrderId()));
}
Hope this helps you out, @Sunny!

aggregate not found in the event store
The main reason for this exception is that the aggregate has to be created first before Axon can load it from the event store, so the creation command must be handled by a constructor:
@CommandHandler
public OrderAggregate(CreateOrderCommand command) {
    apply(new OrderCreatedEvent(command.getOrderId()));
}
Also, this way your
    private OrderRepositoryData orderRepositoryData;
won't be initialized, so autowire the orderRepositoryData as well:
@Autowired
private OrderRepositoryData orderRepositoryData;
For the successive events you should use the same orderId, otherwise it will also throw:
handleThrowable(java.lang.Throwable,org.springframework.web.context.request.WebRequest)
org.axonframework.modelling.command.AggregateNotFoundException: The aggregate was not found in the event store
at org.axonframework.eventsourcing.EventSourcingRepository.doLoadWithLock(EventSourcingRepository.java:122)
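As a sketch, a follow-up command targeting the existing aggregate would carry the same orderId; the UpdateOrderCommand and OrderUpdatedEvent names below are made up, only @TargetAggregateIdentifier and @CommandHandler are Axon's:
    public class UpdateOrderCommand {

        @TargetAggregateIdentifier
        private final Integer orderId;

        public UpdateOrderCommand(Integer orderId) {
            this.orderId = orderId;
        }

        public Integer getOrderId() {
            return orderId;
        }
    }
and, inside the aggregate, an instance-level handler so Axon first loads the aggregate by orderId from the event store:
    @CommandHandler
    public void handle(UpdateOrderCommand command) {
        apply(new OrderUpdatedEvent(command.getOrderId()));
    }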

Related

Spring batch: reader gave one item, processor has to extract many from it [duplicate]

I'm writing a Spring Batch job and in one of my steps I have the following code for the processor:
@Component
public class SubscriberProcessor implements ItemProcessor<NewsletterSubscriber, Account>, InitializingBean {

    @Autowired
    private AccountService service;

    @Override
    public Account process(NewsletterSubscriber item) throws Exception {
        if (!Strings.isNullOrEmpty(item.getId())) {
            return service.getAccount(item.getId());
        }
        // search with email address
        List<Account> accounts = service.findByEmail(item.getEmail());
        checkState(accounts.size() <= 1, "Found more than one account with email %s", item.getEmail());
        return accounts.isEmpty() ? null : accounts.get(0);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        Assert.notNull(service, "account service must be set");
    }
}
The above code works but I've found out that there are some edge cases where having more than one Account per NewsletterSubscriber is allowed. So I need to remove the state check and to pass more than one Account to the item writer.
One solution I found is to change both the ItemProcessor and the ItemWriter to deal with the List<Account> type instead of Account, but this has two drawbacks:
Code and tests are uglier and harder to write and maintain because of the nested lists in the writer.
More importantly, more than one Account object may be written in the same transaction, because a list given to the writer may contain multiple accounts, and I'd like to avoid this.
Is there any way, maybe using a listener, or replacing some internal component used by Spring Batch, to avoid lists in the processor?
Update
I've opened an issue on the Spring JIRA for this problem.
I'm looking into the isComplete and getAdjustedOutputs methods in FaultTolerantChunkProcessor, which are marked as extension points in SimpleChunkProcessor, to see if I can use them in some way to achieve my goal.
Any hint is welcome.
The ItemProcessor takes one thing in and returns a list:
public class MyItemProcessor implements ItemProcessor<SingleThing, List<ExtractedThingFromSingleThing>> {
    @Override
    public List<ExtractedThingFromSingleThing> process(SingleThing thing) throws Exception {
        // parse the single input and convert it into a list of extracted items
        List<ExtractedThingFromSingleThing> extracted = new ArrayList<>();
        // ... populate the list from 'thing'
        return extracted;
    }
}
Wrap the downstream writer to iron things out. This way stuff downstream from this writer doesn't have to work with lists.
@StepScope
public class ItemListWriter<T> implements ItemWriter<List<T>> {

    private ItemWriter<T> wrapped;

    public ItemListWriter(ItemWriter<T> wrapped) {
        this.wrapped = wrapped;
    }

    @Override
    public void write(List<? extends List<T>> items) throws Exception {
        for (List<T> subList : items) {
            wrapped.write(subList);
        }
    }
}
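For completeness, the step wiring might then look roughly like this; a sketch assuming Java-config Spring Batch, a chunk size of 10, and placeholder reader/writer beans for the types used above:
    @Bean
    public Step extractStep(StepBuilderFactory stepBuilderFactory,
                            ItemReader<SingleThing> reader,
                            ItemWriter<ExtractedThingFromSingleThing> delegateWriter) {
        return stepBuilderFactory.get("extractStep")
                .<SingleThing, List<ExtractedThingFromSingleThing>>chunk(10)
                .reader(reader)
                .processor(new MyItemProcessor())
                // the wrapped writer unpacks each list, so the delegate still sees single items
                .writer(new ItemListWriter<>(delegateWriter))
                .build();
    }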
There isn't a way to return more than one item per call to an ItemProcessor in Spring Batch without getting pretty far into the weeds. If you really want to know where the relationship between an ItemProcessor and an ItemWriter exists (not recommended), take a look at the implementations of the ChunkProcessor interface. While the simple case (SimpleChunkProcessor) isn't that bad, if you use any of the fault tolerant logic (skip/retry via FaultTolerantChunkProcessor), it gets very unwieldy very quickly.
A much simpler option would be to move this logic to an ItemReader that does this enrichment before returning the item. Wrap whatever ItemReader you're using in a custom ItemReader implementation that does the service lookup before returning the item. In this case, instead of returning a NewsletterSubscriber from the reader, you'd be returning an Account based on the previous information.
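A rough sketch of such a wrapping reader, reusing the lookup from the question's processor (the class name EnrichingAccountReader is made up, and deciding what to do with unmatched subscribers is up to you):
    public class EnrichingAccountReader implements ItemReader<Account> {

        private final ItemReader<NewsletterSubscriber> delegate;
        private final AccountService service;

        public EnrichingAccountReader(ItemReader<NewsletterSubscriber> delegate, AccountService service) {
            this.delegate = delegate;
            this.service = service;
        }

        @Override
        public Account read() throws Exception {
            NewsletterSubscriber item;
            while ((item = delegate.read()) != null) {
                if (!Strings.isNullOrEmpty(item.getId())) {
                    return service.getAccount(item.getId());
                }
                List<Account> accounts = service.findByEmail(item.getEmail());
                if (!accounts.isEmpty()) {
                    return accounts.get(0);
                }
                // no matching account: skip this subscriber and read the next one
            }
            return null; // delegate exhausted, end of input
        }
    }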
Instead of returning an Account, you create and return an AccountWrapper or a Collection. The writer obviously must take this into account :)
You can make a transformer to transform your POJO (the POJO object from the file) into your Entity by writing the following code:
public class Initializer {
    public static LGInfo initializeEntity() throws Exception {
        Constructor<LGInfo> constr1 = LGInfo.class.getConstructor();
        LGInfo info = constr1.newInstance();
        return info;
    }
}
And in your ItemProcessor:
public class LgItemProcessor implements ItemProcessor<LgBulkLine, LGInfo> {
    private static final Log log = LogFactory.getLog(LgItemProcessor.class);

    @Override
    public LGInfo process(LgBulkLine item) throws Exception {
        log.info(item);
        return Initializer.initializeEntity();
    }
}

How can I test an Aggregate that relies on time?

I am trying to model a time keeping application.
Ordinarily when I have a class that depends on time, I can provide an overloaded constructor or method to be able to inject a Clock into that method or class and be able to test its behavior.
If I have a command that needs to pass the current time into an event, how can this work in the aggregate of an Axon-based application?
@Aggregate
@Slf4j
public class TimeCard {

    @AggregateIdentifier
    private String employeeName;
    private Instant clockInTime;

    public TimeCard() {
        // Axon requires an empty constructor on the aggregate
    }

    @CommandHandler
    public TimeCard(ClockInCommand cmd) {
        AggregateLifecycle.apply(new ClockInEvent(cmd.getEmployeeName(), Instant.now()));
    }

    @EventSourcingHandler
    public void on(ClockInEvent event) {
        this.employeeName = event.getEmployeeName();
        this.clockInTime = event.getClockInTime();
    }
}
It seemed that the test fixture handled that cleanly for me by providing methods to provide the time. Here is my test method:
@Test
void testClockInCommand() {
    testFixture.givenNoPriorActivity()
            .andGivenCurrentTime(clock.instant())
            .when(new ClockInCommand("GoldFlsh"))
            .expectEvents(new ClockInEvent("GoldFlsh", testFixture.currentTime()));
}
But my event did end up being different by a fraction of a second.
Expected <2020-02-02T13:47:20.684344700Z> but got <2020-02-02T13:47:20.954347700Z>
What's the best way to handle this? Should commands only take in time from upstream? Or can I inject a clock somehow for testing?
When relying on time in aggregates (and other Axon types) you can use GenericEventMessage.clock, which defaults to the system UTC clock in most runtime configurations.
The test fixture will override this with a fixed time during tests. Update the use of Instant.now() to use this clock:
@CommandHandler
public TimeCard(ClockInCommand cmd) {
    AggregateLifecycle.apply(new ClockInEvent(cmd.getEmployeeName(), GenericEventMessage.clock.instant()));
}

Hibernate Envers: onPostInsert fired when object has not been saved yet

In my Java app I want to listen to the events fired when an object is persisted, i.e. added to the DB. Then I use the id of this persisted object to perform some queries. This is the code:
@Component
public class DataCreationListener extends EnversPostInsertEventListenerImpl {

    private static final long serialVersionUID = 1L;

    @Autowired
    DataService DataService;

    public DataCreationListener() {
        super(null);
    }

    @Override
    public void onPostInsert(PostInsertEvent event) {
        if (event.getEntity() instanceof DataDAO) {
            // Here I use the info from event.getEntity() to perform some queries.
            Data Data = DataService.fromDao((DataDAO) event.getEntity());
            // other stuff
        }
    }
}
But since the object for which the event has been fired has not been saved yet, the queries raise exceptions. If the listener is named onPostInsert, I expect it to fire when an entry has already been made in the DB, but that is not the case. Upon debugging I see that the object only gets saved once control flows out of the listener.
Is there any way I can get the desired functionality here, i.e. get the listener to fire when the object is saved, or do it somehow myself in a reliable and safe way? Can someone point me in the right direction? Thanks!

HashMaps vs Reactive Programming

I am starting to embrace reactive programming a bit more, and I'm trying to apply it to my typical business problems. One pattern I often design with is database-driven classes. I have some defined unit class like ActionProfile whose instances are managed by an ActionProfileManager, which creates the instances off a database table and stores them in a Map<Integer,ActionProfile> where Integer is the actionProfileId key. The ActionProfileManager may clear and re-import the data periodically, and notify all dependencies to re-pull from its map.
public final class ActionProfileManager {

    private volatile ImmutableMap<Integer, ActionProfile> actionProfiles;

    private ActionProfileManager() {
        this.actionProfiles = importFromDb();
    }

    public void refresh() {
        this.actionProfiles = importFromDb();
        notifyEventBus();
    }

    // called by clients on their construction or when notifyEventBus is called
    public ActionProfile forKey(int actionProfileId) {
        return actionProfiles.get(actionProfileId);
    }

    private ImmutableMap<Integer, ActionProfile> importFromDb() {
        return ImmutableMap.of(); // import data here
    }

    private void notifyEventBus() {
        // notify event through EventBus here
    }
}
However, if I want this to be more reactive creating the map would kind of break the monad. One approach I could do is make the Map itself an Observable, and return a monad that looks up a specific key for the client. However the intermediate imperative operations may not be ideal, especially if I start using the rxjava-jdbc down the road. But the hashmap may help lookup performance significantly in intensive cases.
public final class ActionProfileManager {

    private final BehaviorSubject<ImmutableMap<Integer, ActionProfile>> actionProfiles;

    private ActionProfileManager() {
        this.actionProfiles = BehaviorSubject.create(importFromDb());
    }

    public void refresh() {
        actionProfiles.onNext(importFromDb());
    }

    public Observable<ActionProfile> forKey(int actionProfileId) {
        return actionProfiles.map(m -> m.get(actionProfileId));
    }

    private ImmutableMap<Integer, ActionProfile> importFromDb() {
        return ImmutableMap.of(); // import data here
    }
}
Therefore, the most reactive approach to me seems to be just pushing everything from the database on each refresh through an Observable<ActionProfile> and filtering for the last matching ID for the client.
public final class ActionProfileManager {

    private final ReplaySubject<ActionProfile> actionProfiles;

    private ActionProfileManager() {
        this.actionProfiles = ReplaySubject.create();
        importFromDb();
    }

    public void refresh() {
        importFromDb();
    }

    public Observable<ActionProfile> forKey(int actionProfileId) {
        return actionProfiles.filter(m -> m.getActionProfileID() == actionProfileId).last();
    }

    private void importFromDb() {
        // call onNext() on actionProfiles and pass each new ActionProfile coming from the database
    }
}
Is this the optimal approach? What about old data causing memory leaks and not being GC'd? Is it more practical to maintain the map and make it observable?
What is the most optimal reactive approach above to data driven classes? Or is there a better way I have not discovered?
Using BehaviorSubject is the right thing to do here if you don't care about earlier values.
Note that most posts discouraging Subjects were written in the early days of Rx.NET and are mostly quoted over and over again without much thought. I attribute this to the possibility that such authors didn't really understand how Subjects work, or ran into some problems with them and just declared they shouldn't be used.
I think Subjects are a great way to multicast events (coming from a single thread usually) where you control or you are the source of the events and the event dispatching is somewhat 'global' (such as listening to mouse move events).
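As a small usage sketch against the BehaviorSubject-based variant above (RxJava 1 types, and the println is just a placeholder), a subscriber gets the current ActionProfile immediately and a fresh one after every refresh():
    static Subscription watchProfile(ActionProfileManager manager, int actionProfileId) {
        return manager.forKey(actionProfileId)
                .distinctUntilChanged() // only re-emit when the re-imported profile actually changed
                .subscribe(profile -> System.out.println("profile updated: " + profile));
    }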

jsprit VRP related jobs hard constraint

Is it possible to force two or more shipments into the same route with a hard constraint?
If not, do you know other Java libraries that can handle this kind of restriction?
Thank you!
The easiest way to make sure that shipments end up in the same route is to tag these shipments with a skill:
    shipmentBuilder.addRequiredSkill("tag")
but then you need to tag a particular vehicle as well:
    vehicleBuilder.addSkill("tag")
And make sure you make the algorithm consider skills/these tags (see https://github.com/jsprit/jsprit/blob/master/WHATS_NEW.md - you need to use 1.3.2-SNAPSHOT).
If you do not want to assign a particular vehicle with a tag, you need to implement a core.problem.constraint.HardRouteStateLevelConstraint which is basically this method
public boolean fulfilled(JobInsertionContext insertionContext)
Make sure that insertionContext.getJob() [which is the job to be inserted] can be inserted into insertionContext.getRoute(). At this point you need to know two things:
the associated shipments of insertionContext.getJob(), i.e. shipments that need to be in same route as insertionContext.getJob()
whether one of these associated jobs has already been assigned to a route and if so whether this route is the same as insertionContext.getRoute()
For the latter information you need to define a state that provides you with the job-route assignment. I would define a problem state and its corresponding updater like this:
static class UpdateJobRouteAssignment implements StateUpdater, JobInsertedListener, InsertionStartsListener {

    StateManager stateManager;

    UpdateJobRouteAssignment(StateManager stateManager) {
        this.stateManager = stateManager;
    }

    @Override
    public void informJobInserted(Job job2insert, VehicleRoute inRoute, double additionalCosts, double additionalTime) {
        stateManager.putProblemState(stateManager.createStateId(job2insert.getId()), VehicleRoute.class, inRoute);
    }

    @Override
    public void informInsertionStarts(Collection<VehicleRoute> vehicleRoutes, Collection<Job> unassignedJobs) {
        for (VehicleRoute r : vehicleRoutes) {
            for (Job j : r.getTourActivities().getJobs()) {
                informJobInserted(j, r, 0., 0.);
            }
        }
    }
}
Add your state updater and your constraint to your State/ConstraintManager and you are done.
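To round this off, here is a minimal sketch of the constraint itself. It assumes a hypothetical associatedJobIds lookup (job id to the ids of the shipments that must share its route) and that StateManager exposes getProblemState as the read counterpart of the putProblemState call used above:
    static class SameRouteConstraint implements HardRouteStateLevelConstraint {

        private final StateManager stateManager;
        private final Map<String, Collection<String>> associatedJobIds; // hypothetical lookup of related shipment ids

        SameRouteConstraint(StateManager stateManager, Map<String, Collection<String>> associatedJobIds) {
            this.stateManager = stateManager;
            this.associatedJobIds = associatedJobIds;
        }

        @Override
        public boolean fulfilled(JobInsertionContext insertionContext) {
            Collection<String> related = associatedJobIds.get(insertionContext.getJob().getId());
            if (related == null) return true;
            for (String relatedId : related) {
                VehicleRoute assignedRoute =
                        stateManager.getProblemState(stateManager.createStateId(relatedId), VehicleRoute.class);
                // if a related shipment is already assigned, the job may only go into that same route
                if (assignedRoute != null && assignedRoute != insertionContext.getRoute()) {
                    return false;
                }
            }
            return true;
        }
    }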
