How to connect OSGi bundles (send an entity through bundles)? - java

I have a REST service, which contains three classes in one module (bundle):
User.java -> the entity
UserService.java -> the REST service
UserValidator.java -> a special validator for the entity. The service sends the entity to this validator and gets a validation result (true or false):
User.java
@XmlRootElement(name = "User")
public class User {
    private long id;
    private String name;
    private String surname;
    private String patronymic;
    /* Getters and Setters */
}
UserService.java
public class UserServiceImpl implements UserService {
    private UserDAO userDbDao = new UserDatabaseDAO();

    @POST
    @Path("/users/")
    public Response addUser(User user) {
        UserValidator userValidator = new UserValidator(user);
        if (userValidator.isValid()) {
            User newUser = userDbDao.createUser(user);
            return Response.ok().type("application/xml").entity(newUser).build();
        } else {
            return Response.status(Response.Status.BAD_REQUEST).entity(userValidator.getErrorMessage()).build();
        }
    }
}
UserValidator.java
public class UserValidator {
    private static final int MAX_SIZE_NAME = 50;
    private static final int MIN_SIZE_NAME = 2;

    private User user;
    private BadUserResponse badUserResponse = new BadUserResponse();

    public UserValidator(User user) {
        this.user = user;
    }

    private boolean isNameValid(String name) {
        if (name == null) {
            badUserResponse.setNsp("Null in fields name/surname/patronymic");
            return false;
        }
        String tempName = name.trim();
        if (tempName.length() < MIN_SIZE_NAME || tempName.length() > MAX_SIZE_NAME) {
            badUserResponse.setNsp(String.format("Fields name/surname/patronymic too long or too short (Allowed length from %d to %d)", MIN_SIZE_NAME, MAX_SIZE_NAME));
            return false;
        }
        for (int i = 0; i < tempName.length(); i++) {
            if (!Character.isLetter(tempName.charAt(i))) {
                badUserResponse.setNsp("Fields name/surname/patronymic contains wrong symbols (Only letters allowed)");
                return false;
            }
        }
        return true;
    }

    public boolean isValid() {
        // non-short-circuit & so every field gets validated
        return (isNameValid(user.getName()) &
                isNameValid(user.getSurname()) &
                isNameValid(user.getPatronymic()));
    }

    public BadUserResponse getErrorMessage() {
        return badUserResponse;
    }
}
BadUserResponse.java
#XmlRootElement(name="baduserresponce")
public class BadUserResponse {
private String nsp;
public String getNsp() {
return nsp;
}
public void setNsp(String nsp) {
this.nsp = nsp;
}
}
But now I need to split this into separate bundles, because, as you can see, the classes use each other's functionality. For example, UserService.java simply calls UserValidator userValidator = new UserValidator(user);
I need to connect these bundles somehow (OSGi Service, ActiveMQ).
In my opinion it works something like this:
The UserService bundle gets the User entity from the REST method.
It puts all of the User fields (name, surname, patronymic) into an ActiveMQ queue (because the UserValidator bundle doesn't know what the User entity is).
The UserValidator bundle gets the User's fields from the queue and validates them.
The UserValidator bundle puts the validation result (true/false) into the queue.
The UserService bundle gets the validation result from the queue and sends the User to the DAO.
But this is just a concept. Am I wrong?
What's the best way to pass an entity through bundles, and how should I do this?

Your current way of simply instantiating the UserValidator via new is technically fine even if the classes live in different bundles. If your validator is only needed in this one place and is simple, I would even leave it in the same bundle.
The other options can make sense to decouple your bundles. Using messaging allows you to avoid synchronous calls. It can also be used to send the data to a remote machine. JMS messaging is quite heavyweight, though: you need a broker and you depend on the API. In your case you also directly need the result of the validation, so you would be simulating a sync call with JMS. I would rather avoid this.
Using an OSGi service allows you to decouple from the implementation of the service. In this case it makes sense to create an interface for UserValidator. I would also put this interface into a separate bundle. You then need to register the service in the bundle that implements the validator and bind the service in the bundle that uses the validator. OSGi services are very lightweight and synchronous by default, so I think they fit your problem well.
For registering and binding services, do not use the OSGi API directly. Instead use Declarative Services with annotations; they take away most of the complexity of dealing with OSGi services.
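A minimal sketch of how that could look, assuming the validator is reshaped behind an interface that takes the User as a method argument (the interface, the implementation name, and the wiring shown are illustrative, not from the question):

// api bundle: holds only the contract (and the shared User entity)
public interface UserValidator {
    boolean isValid(User user);
}

// validator bundle: Declarative Services registers the implementation
import org.osgi.service.component.annotations.Component;

@Component(service = UserValidator.class)
public class UserValidatorImpl implements UserValidator {
    @Override
    public boolean isValid(User user) {
        // same name/surname/patronymic checks as in the question
        return user.getName() != null; // placeholder for the real checks
    }
}

// service bundle: DS binds whatever implementation is registered
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component(service = UserService.class)
public class UserServiceImpl implements UserService {
    @Reference
    private UserValidator userValidator; // injected by Declarative Services

    public Response addUser(User user) {
        if (!userValidator.isValid(user)) {
            return Response.status(Response.Status.BAD_REQUEST).build();
        }
        // persist and return as before
        return Response.ok().build();
    }
}

The field injection via @Reference keeps the consuming bundle free of any dependency on the implementation bundle; only the api bundle is shared.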
By the way, I am not sure how you do REST. I suggest having a look at the Aries JAX-RS Whiteboard.
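With the whiteboard, the existing resource class can be published by registering it as an OSGi service; a rough sketch, assuming the OSGi R7 JAX-RS whiteboard property-type annotations (this setup is not from the question):

import org.osgi.service.component.annotations.Component;
import org.osgi.service.jaxrs.whiteboard.propertytypes.JaxrsResource;

// The whiteboard picks up this service and exposes its @POST/@Path methods over HTTP.
@Component(service = UserServiceImpl.class)
@JaxrsResource
public class UserServiceImpl implements UserService {
    // addUser(...) as before
}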

Related

How should I go about extending com.microsoft.azure.spring.autoconfigure.aad.UserPrincipal?

I'm using MSAL on the front end (PKCE) and azure-active-directory-spring-boot-starter on the server to provide an entity which represents the logged-in user and their claims.
I've built a class which wraps Microsoft's UserPrincipal to provide easy access to well-known claims:
import com.microsoft.azure.spring.autoconfigure.aad.UserPrincipal;

public class MyCustomUser {
    private UserPrincipal userPrincipal;

    public MyCustomUser(UserPrincipal userPrincipal) {
        this.userPrincipal = userPrincipal;
    }

    public String getEmployeeId() {
        return String.valueOf(this.userPrincipal.getClaim("emplid"));
    }
}
I make this available via a helper class:
@Component
public class MyAppSecurityContext {
    public MyCustomUser getUser() {
        UserPrincipal principal = (UserPrincipal) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
        return new MyCustomUser(principal);
    }
}
and then use it in my service layer:
@Service
public class AnnouncementServiceImpl implements AnnouncementService {
    @Autowired
    private MyAppSecurityContext securityContext;

    @Override
    public List<Announcement> saveAnnouncement(Announcement announcement) {
        MyCustomUser currentUser = this.securityContext.getUser();
        String empid = currentUser.getEmployeeId();
        return this.announcementRepository.saveAnnouncement(empid, announcement);
    }
}
This works, but it feels wrong. I'd prefer to have MyCustomUser extend UserPrincipal and have getPrincipal() return my custom type (without effectively re-implementing my own UserPrincipal) instead of providing a facade in front of a member object. The problem is that UserPrincipal's constructor expects JWT concerns, which suggests that this isn't the correct approach. Is there another, more appropriate way to model the user for a Spring Security project which relies on client-side claims only?
@Josh.
In azure-active-directory-spring-boot-starter, UserPrincipal is used in AADAuthenticationFilter and AADAppRoleStatelessAuthenticationFilter. Both of these filters are now deprecated.
Could you please use the latest version of azure-spring-boot-starter-active-directory? That is 3.7.0, and it works with spring-boot 2.5.0. Since you use UserPrincipal, your application is a resource server.
Here are the docs: https://github.com/Azure/azure-sdk-for-java/tree/azure-spring-boot-starter-active-directory_3.7.0/sdk/spring/azure-spring-boot-starter-active-directory#accessing-a-resource-server
We have some samples.
Your current application is similar to:
https://github.com/Azure-Samples/azure-spring-boot-samples/tree/azure-spring-boot_3.6/aad/azure-spring-boot-sample-active-directory-resource-server-by-filter-stateless
https://github.com/Azure-Samples/azure-spring-boot-samples/tree/azure-spring-boot_3.6/aad/azure-spring-boot-sample-active-directory-resource-server-by-filter
But these two usages are deprecated, so I suggest you learn the new way: https://github.com/Azure-Samples/azure-spring-boot-samples/tree/azure-spring-boot_3.6/aad/azure-spring-boot-sample-active-directory-resource-server
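As a hedged illustration of the new way (not taken from the linked samples): with the standard Spring Security resource-server support the authenticated principal is a Jwt, so a well-known claim such as the question's emplid can be read directly in a controller:

import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.security.oauth2.jwt.Jwt;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MeController {
    @GetMapping("/me/employee-id")
    public String employeeId(@AuthenticationPrincipal Jwt jwt) {
        // "emplid" is the claim name used in the question
        return jwt.getClaimAsString("emplid");
    }
}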

How to use Mongo Auditing and a UUID as id with Spring Boot 2.2.x?

I would like to have documents stored with a UUID id and createdAt / updatedAt fields. My solution was working with Spring Boot 2.1.x. After I upgraded from Spring Boot 2.1.11.RELEASE to 2.2.0.RELEASE, my test for MongoAuditing failed with createdAt = null. What do I need to do to get the createdAt field filled again?
This is not just a test problem. I ran the application and it shows the same behaviour as my test: all auditing fields stay null.
I have a configuration to enable MongoAuditing and UUID generation:
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {
    @Bean
    public GenerateUUIDListener generateUUIDListener() {
        return new GenerateUUIDListener();
    }
}
The listener hooks into onBeforeConvert - I guess that's where the trouble starts.
public class GenerateUUIDListener extends AbstractMongoEventListener<IdentifiableEntity> {
    @Override
    public void onBeforeConvert(BeforeConvertEvent<IdentifiableEntity> event) {
        IdentifiableEntity entity = event.getSource();
        if (entity.isNew()) {
            entity.setId(UUID.randomUUID());
        }
    }
}
The document itself (I dropped the getters and setters):
@Document
public class MyDocument extends InsertableEntity {
    private String name;
}

public abstract class InsertableEntity extends IdentifiableEntity {
    @CreatedDate
    @JsonIgnore
    private Instant createdAt;
}

public abstract class IdentifiableEntity implements Persistable<UUID> {
    @Id
    private UUID id;

    @JsonIgnore
    public boolean isNew() {
        return getId() == null;
    }
}
A complete minimal example (including a test) can be found here: https://github.com/mab/auditable
With 2.1.11.RELEASE the test succeeds; with 2.2.0.RELEASE it fails.
For me the best solution was to switch from event-based UUID generation to a callback-based one. By implementing Ordered we can set the new callback to be executed after the AuditingEntityCallback.
public class IdEntityCallback implements BeforeConvertCallback<IdentifiableEntity>, Ordered {
    @Override
    public IdentifiableEntity onBeforeConvert(IdentifiableEntity entity, String collection) {
        if (entity.isNew()) {
            entity.setId(UUID.randomUUID());
        }
        return entity;
    }

    @Override
    public int getOrder() {
        return 101; // AuditingEntityCallback uses order 100, so this runs after it
    }
}
I registered the callback with the MongoConfiguration. For a more general solution you might want to take a look at how the AuditingEntityCallback is registered by the MongoAuditingBeanDefinitionParser.
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {
    @Bean
    public IdEntityCallback registerCallback() {
        return new IdEntityCallback();
    }
}
MongoTemplate works in the following way in doInsert():
this.maybeEmitEvent - emits an event (onBeforeConvert, onBeforeSave and such) so any AbstractMappingEventListener can catch it and act on it, as you did with GenerateUUIDListener
this.maybeCallBeforeConvert - calls the before-convert callbacks, like the mongo auditing callback
as you can see in the source code of MongoTemplate.class (lines 831-832):
protected <T> T doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
    BeforeConvertEvent<T> event = new BeforeConvertEvent(objectToSave, collectionName);
    T toConvert = ((BeforeConvertEvent) this.maybeEmitEvent(event)).getSource(); // emit event
    toConvert = this.maybeCallBeforeConvert(toConvert, collectionName); // call before-convert handlers
    ...
}
MongoAuditing sets createdAt only on new entities, by checking entity.isNew() == true.
Because your code (the UUID listener) already set the id, the entity is not considered new, so createdAt is not populated.
You can do the following (ordered from best to worst):
forget about the UUID and use String for your id; let Mongo itself create and manage its entities' ids (this is how MongoTemplate actually works, lines 811-812)
keep the UUID at the code level and convert from/to String when inserting into and retrieving from the db
create a custom repository like in this post
stay with 2.1.11.RELEASE
set updatedAt in GenerateUUIDListener as well as the id (rename it NewEntityListener or something), i.e. basically implement the auditing yourself
implement new isNew() logic that doesn't depend only on the entity id (see the sketch after these notes)
In version 2.1.11.RELEASE the order of the two methods was flipped (MongoTemplate.class lines 804-805), so your code worked fine.
As a more abstract point: the nature of an event is send-and-forget (async compatible), so it's very bad practice to mutate the object itself; there is NO guarantee for the order of computation, if any.
This is why the auditing is built on callbacks and not events, and that's why Pivotal doesn't (need to) keep the order stable between versions.
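A rough sketch of that last option, reusing the class names from the question; the transient flag and the markPersisted() hook are assumptions, and you would flip the flag after a save or load (for example from an after-save callback):

import java.util.UUID;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Transient;
import org.springframework.data.domain.Persistable;

public abstract class IdentifiableEntity implements Persistable<UUID> {
    @Id
    private UUID id;

    @Transient // persistence state lives only in memory, not in MongoDB
    private boolean persisted = false;

    @Override
    public boolean isNew() {
        return !persisted; // no longer depends on the id being null
    }

    public void markPersisted() { // call after save/load
        this.persisted = true;
    }

    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }
}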

Creating Spring @Repository and @Controller for every item I'm working with (from database)

While working on a project that involves requesting multiple data types from a database, I came to the following question:
Let's say I have 2 Java classes that correspond to database entities:
Routes
public class Route {
    public Route(int n, int region, Date fdate, boolean changed, int points, int length) {
        super();
        this.n = n;
        this.region = region;
        this.fdate = fdate;
        this.changed = changed;
        this.points = points;
        this.length = length;
    }
}
Carrier
public class Carrier {
    public Carrier(...) {
        this.id = src.getId();
        this.name = src.getName();
        this.instId = src.getInstId();
        this.depotId = src.getDepotId();
    }
}
If so, what's the correct approach to creating DAO interfaces and classes? I'm doing it like this:
@Repository
public class CarrierDaoImpl implements CarrierDao {
    @Autowired
    DataSource dataSource;

    public List<Carrier> getAllOrgs() { ... }
}

@Repository
public class RoutesDaoImpl implements RoutesDao {
    @Autowired
    DataSource dataSource;

    public ArrayList<AtmRouteItem> getRoutes(AtmRouteFilter filter) { ... }
}
I'm creating a @Repository DAO for every Java class / DB entity, and then 2 separate controllers for requests about carriers and routes. Like this:
@RestController
@RequestMapping(path = "/routes")
public class RoutesController {
    @Autowired
    RoutesDao routesDao;

    @GetMapping(value = {"/getRoutes/", "/getRoutes"})
    public ArrayList<Route> getRoutes() { ... }
}
And the same for the Carriers controller. Is this correct, and if not, what's the correct approach?
Sorry for styling issues, this is my first question on Stack Overflow :)
I would suggest creating services marked with the @Service annotation (i.e. a CarrierService interface and a CarrierServiceImpl implementation), then injecting them into controllers. Use repositories within services, because some database operations will require transactions, and the better place for managing transactions is the service layer. Services can also do a more specialized job that requires access to multiple repositories, so you can inject several of them. And don't forget to mark your services with the @Transactional annotation.
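A minimal sketch of that layering, assuming the DAO names from the question (CarrierService / CarrierServiceImpl are illustrative names):

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class CarrierServiceImpl implements CarrierService {

    private final CarrierDao carrierDao;

    @Autowired
    public CarrierServiceImpl(CarrierDao carrierDao) {
        this.carrierDao = carrierDao;
    }

    @Transactional(readOnly = true)
    public List<Carrier> getAllCarriers() {
        return carrierDao.getAllOrgs();
    }
}

The controller then injects CarrierService instead of the DAO.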
It's correct to have a DAO for each entity.
When working with JPA repositories you have no choice but to provide the entity. For instance:
public interface FooRepository extends JpaRepository<Foo, Long> {}
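Spring Data then generates the implementation at runtime, so the interface alone is enough to inject and use; a small hedged illustration (FooService is a made-up consumer):

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class FooService {
    @Autowired
    private FooRepository fooRepository;

    public List<Foo> findAllFoos() {
        return fooRepository.findAll(); // findAll() is provided by JpaRepository
    }
}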
The same goes for the REST controllers: you bring functionality together per object, as you do.
You can improve your mapping to be more RESTful. To retrieve all routes, don't specify a path:
@GetMapping
public ArrayList<RouteResource> getRoutes() { ... }
(I haven't used @GetMapping yet, but it should work like that.)
And if you want a specific route:
@GetMapping("/{id}")
public RouteResource getRoute(@PathVariable("id") long id) { ... }
You should return resources instead of entities to the client.
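For example, a minimal resource (DTO) class; the fields chosen and the getters on Route are assumptions for illustration:

// Exposes only what the client needs, decoupling the API from the entity.
public class RouteResource {
    private final int n;
    private final int length;

    public RouteResource(Route route) {
        this.n = route.getN();           // assumes getters on Route
        this.length = route.getLength();
    }

    public int getN() { return n; }
    public int getLength() { return length; }
}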

Circular Dependency due to usage of HATEOAS in REST

I'm designing my REST application architecture using Domain Driven Design and the Adapter pattern (there are interfaces, and many implementations in the aggregate root). It's all fine as long as I don't add HATEOAS to the puzzle. In HATEOAS, my value objects (at the bottom of the dependency hierarchy) need to depend on resources (in the top layer). This messes up everything. I'm fairly new to HATEOAS, so maybe I'm missing something. I'm planning to use Dropwizard and Jersey Declarative Linking.
Here is a diagram of my architecture:
A little clarification - the "Return and attributes types" between interfaces and value objects should actually be "Return and argument types". It means that all the interfaces' methods take objects from the Value objects module as arguments and return those objects to the caller.
I can add a piece of code that shows what's in which module:
REST - JAX-RS Resources
@Component
@Path("/groups")
@Produces(MediaType.APPLICATION_JSON)
public class GroupsResource {
    @Autowired
    ProcessEngine processEngine; // interface with a driver implementation under it

    @GET
    @Timed
    public List<UserGroup> getUserGroups(@Auth BpmUser user) {
        return processEngine.getUserGroups(user.id);
    }
}
Interface ProcessEngine
public interface ProcessEngine {
    void init();
    List<UserGroup> getUserGroups(String username);
}
Implementation in drivers module
public class ActivitiProcessEngine implements ProcessEngine {
    private org.activiti.engine.ProcessEngine processEngine;
    private DataSource dataSource;
    private String databaseType;

    public ActivitiProcessEngine(String databaseType, DataSource dataSource) {
        this.databaseType = databaseType;
        this.dataSource = dataSource;
    }

    @Override
    public void init() {
        if (processEngine != null)
            throw new ProcessEngineAlreadyInitializedException();
        try {
            processEngine = createProcessEngineConfiguration().buildProcessEngine();
            ProcessEngines.registerProcessEngine(processEngine);
        } catch (SQLException e) {
            throw new ProcessEngineDatabaseException(e);
        }
    }

    @Override
    public List<UserGroup> getUserGroups(String username) {
        return processEngine
                .getIdentityService()
                .createGroupQuery()
                .groupMember(username)
                .list()
                .stream()
                .map(Group::getId)
                .map(UserGroup::new)
                .collect(Collectors.toList());
    }
    ...
}
Value object
public class UserGroup {
    @JsonProperty
    public String name;

    // I want to be able to add links to other resources here

    public UserGroup(String name) {
        this.name = name;
    }
}
A domain object should never know anything about controllers or any other application logic, so do the linking in the controller (resource) layer instead of in the domain objects. That will solve your dependency problem.
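A rough sketch of what that can look like with plain JAX-RS, keeping UserGroup free of REST dependencies; UserGroupRepresentation is a hypothetical wrapper built in the resource layer:

import javax.ws.rs.core.UriInfo;
import com.fasterxml.jackson.annotation.JsonProperty;

public class UserGroupRepresentation {
    @JsonProperty
    public String name;
    @JsonProperty
    public String self; // link back to this resource

    public UserGroupRepresentation(UserGroup group, UriInfo uriInfo) {
        this.name = group.name;
        this.self = uriInfo.getBaseUriBuilder()
                .path(GroupsResource.class) // resolves the @Path("/groups") of the resource
                .path(group.name)
                .build()
                .toString();
    }
}

The resource method obtains the UriInfo via @Context and maps each UserGroup to a representation before returning the list.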

Where should business logic (and what is that?) really live and how to do that with Spring?

I was just reading this article:
http://www.tutorialized.com/view/tutorial/Spring-MVC-Application-Architecture/11986
which I find great. It explains the layered architecture nicely, and I was glad that the architecture I'm working with is pretty much what he describes.
But there's one thing that I don't seem to get:
First: what exactly is business logic, and what is it not? In the article he says (and he's not the only one) that business logic should go in the domain model. So an Account class should have an activate() method that knows how to activate an account. In my understanding this would probably involve some persistence work. But the domain model should not have a dependency on DAOs; only the service layer should know about DAOs.
So, is business logic just what a domain entity can do with itself? Like: the activate() method would set the active property to true, plus set the dateActivated property to new Date(), and then it's the service's task to first call account.activate() and second dao.saveAccount(account)? And whatever needs external dependencies goes into a service? That's mostly what I did until now:
public class AccountServiceImpl implements AccountService {
    private AccountDAO dao;
    private MailSender mailSender;

    public void activateAccount(Account account) {
        account.setActive(true);
        account.setDateActivated(new Date());
        dao.saveAccount(account);
        sendActivationEmail(account);
    }

    private void sendActivationEmail(Account account) {
        ...
    }
}
This is in contrast to what he says, I think, no?
What I also don't get is the example of how to have Spring wire domain objects like Account, which would be needed should Account send its e-mail on its own.
Given this code:
import org.springframework.mail.MailSender;
import org.springframework.mail.SimpleMailMessage;

public class Account {
    private String email;
    private MailSender mailSender;
    private boolean active = false;

    public String getEmail() {
        return email;
    }

    public void setEmail(String email) {
        this.email = email;
    }

    public void setMailSender(MailSender mailSender) {
        this.mailSender = mailSender;
    }

    public void activate() {
        if (active) {
            throw new IllegalStateException("Already active");
        }
        active = true;
        sendActivationEmail();
    }

    private void sendActivationEmail() {
        SimpleMailMessage msg = new SimpleMailMessage();
        msg.setTo(email);
        msg.setSubject("Congrats!");
        msg.setText("You're the best.");
        mailSender.send(msg);
    }
}
If I use Hibernate, I could use the DependencyInjectionInterceptorFactoryBean in order to wire mailSender. If I use JDBC instead, would I really write the following cumbersome code? And also whenever I create a new instance of Account in an MVC controller, for, let's say, populating it into a model?
BeanFactory beanFactory = new XmlBeanFactory(new ClassPathResource("chapter3.xml"));
Account account = new Account();
account.setEmail("email@example.com");
((AutowireCapableBeanFactory) beanFactory).applyBeanPropertyValues(account, "accountPrototype");
account.activate();
This is not reliable and very cumbersome, no? I'd have to ask myself where the object had been created whenever I see an instance of Account. Plus, if I went with this approach: I don't have a single appContext.xml I could pass, but several, one for persistence and one for the service config. How would I do that? Plus, wouldn't that create a completely new context every time such an instance is created, or am I missing something?
Is there no better solution to that?
Any help is greatly appreciated.
I think the send-activation-email action is not part of the business layer here; your domain logic is the account activation action, and that piece of logic should live in the domain object named Account (the activate() method). The send-activation-email action is part of the infrastructure or application layers.
The service is the object that handles the account activation request and connects the business layer with the others. The service takes the given account, activates it, and performs the send-activation-email action via a MailSenderService or something like this.
Short sample:
public class AccountServiceImpl implements AccountService {
    private AccountDAO dao;
    private MailSenderService mailSender;

    public void activateAccount(AccountID accountID) {
        Account account = dao.findAccount(accountID);
        ....
        account.activate();
        dao.updateAccount(account);
        ....
        mailSender.sendActivationEmail(account);
    }
}
The next step that I can suggest is a complete separation of the business layer and the infrastructure layer. This can be achieved by introducing business events. The service no longer has to perform the action of sending an email; it publishes an event notifying the other layers about the account activation.
In Spring we have two tools for working with events: ApplicationEventPublisher and ApplicationListener.
A short example - the event class and a service that publishes it:
public class AccountActivationEvent extends ApplicationEvent {
    private Account account;

    AccountActivationEvent(Account account) {
        super(account); // ApplicationEvent requires the event source
        this.account = account;
    }

    public Account getActivatedAccount() {
        return account;
    }
}
public class AccountServiceImpl implements AccountService, ApplicationEventPublisherAware {
    private AccountDAO dao;
    private ApplicationEventPublisher epublisher;

    public void setApplicationEventPublisher(ApplicationEventPublisher epublisher) {
        this.epublisher = epublisher;
    }

    public void activateAccount(AccountID accountID) {
        Account account = dao.findAccount(accountID);
        ....
        account.activate();
        dao.updateAccount(account);
        ....
        epublisher.publishEvent(new AccountActivationEvent(account));
    }
}
And the domain event listener, in the infrastructure layer:
public class SendAccountActivationEmailEventListener
        implements ApplicationListener<AccountActivationEvent> {
    private MailSenderService mailSender;
    ....

    public final void onApplicationEvent(final AccountActivationEvent event) {
        Account account = event.getActivatedAccount();
        // ... build the mail for this account ...
        mailSender.sendActivationEmail(account);
    }
}
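For the listener to actually receive events it must be registered as a bean in the application context, either via an XML bean definition or, assuming component scanning is enabled, simply by annotating it:

@Component
public class SendAccountActivationEmailEventListener
        implements ApplicationListener<AccountActivationEvent> {
    // as above
}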
Now you can add other activation types, logging, and other infrastructure support without changing or polluting your domain (business) layer.
You can learn more about Spring events in the documentation.
