Is it possible to create one configuration for some classes with Orika mapper? - java

I'm trying to configure the Orika mapper. I have 5 entities and 5 DTOs. My configuration works, but I can't find any information on how to configure the mapper for several classes. So, is it possible to create one configuration for several classes, or do I have to create a configuration for every pair of classes?
@Configuration
public class MapperConfig implements OrikaMapperFactoryConfigurer {

    @Bean
    DatatypeFactory datatypeFactory() throws DatatypeConfigurationException {
        return DatatypeFactory.newInstance();
    }

    @Bean
    DefaultMapperFactory.MapperFactoryBuilder<?, ?> orikaMapperFactoryBuilder() {
        DefaultMapperFactory.Builder orikaMapperFactoryBuilder = new DefaultMapperFactory.Builder();
        return orikaMapperFactoryBuilder;
    }

    @Bean
    public MapperFactory orikaMapperFactory(DefaultMapperFactory.MapperFactoryBuilder<?, ?> orikaMapperFactoryBuilder) {
        MapperFactory orikaMapperFactory = orikaMapperFactoryBuilder.build();
        this.configure(orikaMapperFactory);
        return orikaMapperFactory;
    }

    public void configure(MapperFactory orikaMapperFactory) {
        orikaMapperFactory.classMap(Author.class, AuthorDto.class)
                .byDefault()
                .register();
    }

    @Bean
    public MapperFacade orikaMapperFacade(MapperFactory orikaMapperFactory) {
        MapperFacade orikaMapperFacade = orikaMapperFactory.getMapperFacade();
        return orikaMapperFacade;
    }
}

Yes you can — register a mapping for each additional pair of classes in the same configure method.
Example:
public void configure(MapperFactory orikaMapperFactory) {
    orikaMapperFactory.classMap(Author.class, AuthorDto.class)
            .byDefault()
            .register();
    orikaMapperFactory.classMap(A.class, B.class)
            .byDefault()
            .register();
}
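If there are many pairs and all of them can be mapped field-by-field, one option is to keep the pairs in a small table and register them in a loop inside the same configure method. This is a minimal sketch; the Book/BookDto pair is hypothetical and only illustrates a second entry.
public void configure(MapperFactory orikaMapperFactory) {
    // Each row is one entity/DTO pair mapped with the default field-name matching.
    Class<?>[][] pairs = {
            {Author.class, AuthorDto.class},
            {Book.class, BookDto.class}   // hypothetical second pair
    };
    for (Class<?>[] pair : pairs) {
        orikaMapperFactory.classMap(pair[0], pair[1])
                .byDefault()
                .register();
    }
}
Any pair that needs custom field mappings still has to be registered individually with its own classMap call.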

Related

How to create N number of KafkaTemplates dynamically at run time - spring boot

I have a Spring Boot application that needs to connect to N Kafka clusters. Based on some condition, the application needs to switch to the right KafkaTemplate and send a message.
I have seen solutions that create separate KafkaTemplate beans, but in my use case the number of clusters changes at deployment time.
ex:
@Bean(name = "cluster1")
public KafkaTemplate<String, String> kafkaTemplatesample1() {
    return new KafkaTemplate<>(devProducerFactory1());
}

@Bean(name = "cluster2")
public KafkaTemplate<String, String> kafkaTemplatesample2() {
    return new KafkaTemplate<>(devProducerFactory2());
}
Is there any other solution for this? If you can share sample code, it would be much appreciated.
Let's assume that each cluster can be described with the following attributes:
@Getter
@Setter
public class KafkaCluster {
    private String beanName;
    private List<String> bootstrapServers;
}
For example, two clusters are defined in the application.properties:
kafka.clusters[0].bean-name=cluster1
kafka.clusters[0].bootstrap-servers=CLUSTER_1_URL
kafka.clusters[1].bean-name=cluster2
kafka.clusters[1].bootstrap-servers=CLUSTER_2_URL
Those properties are needed before the beans are instantiated, in order to register the KafkaTemplate bean definitions, which makes @ConfigurationProperties unsuitable for this case. Instead, the Binder API is used to bind them programmatically.
The KafkaTemplate bean definitions can be registered in an implementation of the BeanDefinitionRegistryPostProcessor interface.
public class KafkaTemplateDefinitionRegistrar implements BeanDefinitionRegistryPostProcessor {

    private final List<KafkaCluster> clusters;

    public KafkaTemplateDefinitionRegistrar(Environment environment) {
        clusters = Binder.get(environment)
                .bind("kafka.clusters", Bindable.listOf(KafkaCluster.class))
                .orElseThrow(IllegalStateException::new);
    }

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        clusters.forEach(cluster -> {
            GenericBeanDefinition beanDefinition = new GenericBeanDefinition();
            beanDefinition.setBeanClass(KafkaTemplate.class);
            beanDefinition.setInstanceSupplier(() -> kafkaTemplate(cluster));
            registry.registerBeanDefinition(cluster.getBeanName(), beanDefinition);
        });
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
    }

    public ProducerFactory<String, String> producerFactory(KafkaCluster kafkaCluster) {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaCluster.getBootstrapServers());
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    public KafkaTemplate<String, String> kafkaTemplate(KafkaCluster kafkaCluster) {
        return new KafkaTemplate<>(producerFactory(kafkaCluster));
    }
}
Configuration class for the KafkaTemplateDefinitionRegistrar bean:
@Configuration
public class KafkaTemplateDefinitionRegistrarConfiguration {

    @Bean
    public static KafkaTemplateDefinitionRegistrar beanDefinitionRegistrar(Environment environment) {
        return new KafkaTemplateDefinitionRegistrar(environment);
    }
}
Additionally, exclude KafkaAutoConfiguration in the main class to prevent creating the default KafkaTemplate bean. This is probably not the best way because all the other KafkaAutoConfiguration beans are not created in that case.
@SpringBootApplication(exclude = {KafkaAutoConfiguration.class})
Finally, below is a simple test that proves the existence of two KafkaTemplate beans.
@SpringBootTest
class SpringBootApplicationTest {

    @Autowired
    List<KafkaTemplate<String, String>> kafkaTemplates;

    @Test
    void kafkaTemplatesSizeTest() {
        Assertions.assertEquals(2, kafkaTemplates.size());
    }
}
For reference: Create N number of beans with BeanDefinitionRegistryPostProcessor, Spring Boot Dynamic Bean Creation From Properties File
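To pick a cluster at runtime (the original requirement of switching templates based on a condition), one option is to look the template up by its bean name. This is a hedged sketch; the ClusterRouter class and its send method are illustrative and not part of the answer above.
@Service
public class ClusterRouter {

    private final ApplicationContext context;

    public ClusterRouter(ApplicationContext context) {
        this.context = context;
    }

    @SuppressWarnings("unchecked")
    public void send(String clusterBeanName, String topic, String payload) {
        // Looks up the dynamically registered template by its bean name, e.g. "cluster1".
        KafkaTemplate<String, String> template =
                (KafkaTemplate<String, String>) context.getBean(clusterBeanName, KafkaTemplate.class);
        template.send(topic, payload);
    }
}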

How to use #ConfigurationProperties annotation in Spring Boot

I am trying to use the @ConfigurationProperties annotation to load a map from the application.yml file into the places in code where I need it.
Here is the code I have so far:
application.yml
repository:
  type:
    big: big
    small: small
    medium: medium
  to:
    database:
      something: "STRING"
configuration.java
@Configuration
@EnableConfigurationProperties
public class SnowflakeRepositoryConfig {

    @Bean
    @ConfigurationProperties(prefix = "repository.type")
    public DatabaseTypeMapping databaseTypeMapping() {
        return new DatabaseTypeMapping();
    }

    public static class DatabaseTypeMapping {
        public Map<Type, Type> typeMappingMigration;

        public void setMapping(Map<Type, Type> typeMappingMigration) {
            this.typeMappingMigration = typeMappingMigration;
        }
    }

    @Bean
    @ConfigurationProperties(prefix = "repository.to.database")
    public BrandToDatabaseProperties brandToDatabaseProperties() {
        return new BrandToDatabaseProperties();
    }

    public static class BrandToDatabaseProperties {
        public Map<Brand, String> mapping;

        public void setMapping(Map<Brand, String> mapping) {
            this.mapping = mapping;
        }
    }
}
And in the config class I apply it to the serviceImpl class like this:
@Bean
public UserDataService getUserData(BrandToDatabaseProperties brandToDatabaseProperties, DatabaseTypeMapping databaseTypeMapping) {
    return new UserDataServiceImpl(brandToDatabaseProperties.mapping, databaseTypeMapping.typeMappingMigration);
}
In serviceImpl.java class, I include it like this:
public class UserDataServiceImpl implements UserDataService {

    private final Map<Type, Type> typeMappingMigration;
    private final Map<Brand, String> brandToDatabaseMapping;

    public UserDataServiceImpl(Map<Brand, String> brandToDatabaseMapping, Map<Type, Type> typeMappingMigration) {
        this.brandToDatabaseMapping = Collections.unmodifiableMap(brandToDatabaseMapping);
        this.typeMappingMigration = Collections.unmodifiableMap(typeMappingMigration);
    }
}
When I try to start my application, I am getting the following error:
Failed to instantiate [service.UserDataService]: Factory method 'getUserData' threw exception; nested exception is java.lang.NullPointerException
What am I missing here?
You don't need to declare a bean of type DatabaseTypeMapping. Move the @ConfigurationProperties annotation to the class and let component scanning pick it up. Alternatively, you can list the class in the @EnableConfigurationProperties annotation.
I can't be sure, but I think @ConfigurationProperties isn't supposed to be declared on a method; it doesn't make sense logically.
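For illustration, a minimal sketch of the class-level approach described above. It assumes the YAML is nested under a mapping key so that the property path lines up with the setMapping setter (the mismatch between the original keys and that setter is one likely reason the injected map ends up null).
// Assumed YAML shape (illustrative):
// repository:
//   type:
//     mapping:
//       big: big
//       small: small
//       medium: medium

@Component
@ConfigurationProperties(prefix = "repository.type")
public class DatabaseTypeMapping {

    private Map<Type, Type> mapping;

    public Map<Type, Type> getMapping() {
        return mapping;
    }

    public void setMapping(Map<Type, Type> mapping) {
        this.mapping = mapping;
    }
}
With the class registered as a component (or listed in @EnableConfigurationProperties), it can be injected directly into the getUserData bean method instead of being created there.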

Use Jackson Objectmapper configured by Spring boot in Hibernate

I want to configure Hibernate to use the Jackson ObjectMapper created by Spring to map between JSON and entities. In the project I'm working on I have already configured jOOQ to use Spring's ObjectMapper, but I'm having trouble configuring Hibernate to use it. The end goal is that both jOOQ and Hibernate use the same ObjectMapper.
I checked this article by Vlad. Unfortunately, none of the tips given in the article work for the project I'm working on.
Here's an example configuration I tried:
@Configuration
public class HibernateConfiguration implements HibernatePropertiesCustomizer {

    // Autowire the ObjectMapper created by Spring
    @Autowired
    ObjectMapper objectMapper;

    @Override
    public void customize(Map<String, Object> hibernateProperties) {
        ObjectMapperSupplier objectMapperSupplier = () -> objectMapper;
        // The config below doesn't work, since hibernate-types creates its own mapper
        hibernateProperties.put("hibernate.types.jackson.object.mapper", objectMapperSupplier);
    }
}
I also tried the same approach by adding the ObjectMapper to hibernate-types.properties.
# Used by Hibernate, but it cannot get a reference to the Spring-managed ObjectMapper since this class is instantiated outside of Spring's context.
hibernate.types.jackson.object.mapper=path.to.ObjectMapperSupplier
Another approach I tried fails with a NullPointerException in the JsonTypeDescriptor class when converting from JSON to an entity.
@Configuration
public class HibernateConfiguration implements HibernatePropertiesCustomizer {

    @Autowired
    ObjectMapper objectMapper;

    @Override
    public void customize(Map<String, Object> hibernateProperties) {
        // The underlying implementation needs a JavaType or propertyClass, otherwise
        // we get a NullPointerException when converting from JSON.
        var jsonBinaryType = new JsonBinaryType(objectMapper);
        hibernateProperties.put("hibernate.type_contributors", (TypeContributorList) () ->
                Collections.singletonList((typeContributions, serviceRegistry) ->
                        typeContributions.contributeType(jsonBinaryType)));
    }
}
Below is the type declaration for the entity superclass.
// This makes hibernate-types create its own mapper.
@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)
@MappedSuperclass
public abstract class Entity {
}
So, are there any possible solutions for hooking the Spring-managed ObjectMapper up to Hibernate?
I finally figured this out, but it is kind of a creative solution...
TL;DR: I have a bean that stores the Spring-configured ObjectMapper in a static field. A BeanFactoryPostProcessor ensures that this bean is initialized before Hibernate (hibernate-types) tries to load / get the ObjectMapper.
hibernate.properties
hibernate.types.jackson.object.mapper=com.github.lion7.example.HibernateObjectMapperSupplier
HibernateObjectMapperSupplier.kt
package com.github.lion7.example

import com.fasterxml.jackson.databind.ObjectMapper
import com.vladmihalcea.hibernate.type.util.ObjectMapperSupplier
import org.springframework.beans.factory.config.BeanFactoryPostProcessor
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory
import org.springframework.stereotype.Component

class HibernateObjectMapperSupplier : ObjectMapperSupplier {
    override fun get(): ObjectMapper =
        ObjectMapperHolder.objectMapper
}

@Component
class ObjectMapperHolder(objectMapper: ObjectMapper) {
    companion object {
        lateinit var objectMapper: ObjectMapper
    }

    init {
        Companion.objectMapper = objectMapper
    }
}

@Component
class ObjectMapperDependencyFixer : BeanFactoryPostProcessor {
    override fun postProcessBeanFactory(beanFactory: ConfigurableListableBeanFactory) {
        val beanDefinition = beanFactory.getBeanDefinition("entityManagerFactory")
        val oldDependsOn = beanDefinition.dependsOn ?: emptyArray()
        val newDependsOn = oldDependsOn + "objectMapperHolder"
        beanDefinition.setDependsOn(*newDependsOn)
    }
}
Same code as gist: https://gist.github.com/lion7/c8006b69a309e38183deb69124b888b5
A Java implementation.
@Component
public class HibernateObjectMapper implements Supplier<ObjectMapper> {

    private static ObjectMapper objectMapper;

    @Autowired
    public void setObjectMapper(ObjectMapper objectMapper) {
        HibernateObjectMapper.objectMapper = objectMapper;
    }

    @Override
    public ObjectMapper get() {
        return objectMapper;
    }
}
If you define your own JPA beans, simply add @DependsOn("hibernateObjectMapper") to their config (a sketch of that option follows the code below). Otherwise you need a BeanFactoryPostProcessor to add the dependency to the auto-configured bean:
@Component
class HibernateBeanDependencyProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory factory) {
        BeanDefinition beanDefinition = factory.getBeanDefinition("entityManagerFactory");
        String[] dependsOn = beanDefinition.getDependsOn();
        dependsOn = dependsOn == null ? new String[]{} : dependsOn;
        String[] newDependsOn = new String[dependsOn.length + 1];
        System.arraycopy(dependsOn, 0, newDependsOn, 1, dependsOn.length);
        newDependsOn[0] = "hibernateObjectMapper";
        beanDefinition.setDependsOn(newDependsOn);
    }
}
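A hedged sketch of the first option, declaring your own EntityManagerFactory bean and marking the dependency explicitly; the configuration class name and the entity package are illustrative, not part of the answer above.
@Configuration
public class JpaConfig {

    @Bean
    @DependsOn("hibernateObjectMapper")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            DataSource dataSource, EntityManagerFactoryBuilder builder) {
        // The explicit dependsOn forces the static ObjectMapper holder to be
        // initialized before Hibernate bootstraps.
        return builder
                .dataSource(dataSource)
                .packages("com.example.domain") // hypothetical entity package
                .build();
    }
}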
As for the property: the hibernate.types.* settings don't work when set programmatically. The library reads them directly from the hibernate.properties, hibernate-types.properties, and application.properties files.
I think I've found a way to do it programmatically (without the magic of fixing the dependency graph).
HibernateConfiguration.kt
@Configuration(proxyBeanMethods = false)
class HibernateConfiguration {

    @Bean
    fun hibernatePropertiesCustomizer(
        objectMapper: ObjectMapper // Thanks to this parameter, Spring can build the correct dependency graph
    ): HibernatePropertiesCustomizer =
        HibernatePropertiesCustomizer { hibernateProperties ->
            HibernateObjectMapperSupplier.objectMapper = objectMapper
            hibernateProperties["hibernate.types.jackson.object.mapper"] = HibernateObjectMapperSupplier::class.qualifiedName
        }
}
HibernateObjectMapperSupplier.kt
class HibernateObjectMapperSupplier : Supplier<ObjectMapper> {
    override fun get(): ObjectMapper {
        return objectMapper
    }

    companion object {
        lateinit var objectMapper: ObjectMapper
    }
}
Another option is to set the supplier class as a system property:
System.getProperties().put(
        Configuration.PropertyKey.JACKSON_OBJECT_MAPPER.getKey(),
        MyObjectMapperSupplier.class.getName()
);

Spring Java Configuration - how do create a map of enums to beans-references

With Java-based configuration, I am trying to convert a map of enums to bean references into pure Java config (it currently works in XML), but I can't seem to find anything in the documentation.
Currently, my XML looks like this:
<util:map id="colourHandlers" key-type="com.example.ColourEnum"
          value-type="com.example.ColourHandler">
    <entry key="white" value-ref="whiteColourHandler"/>
    <entry key="blue" value-ref="blueColourHandler"/>
    <entry key="red" value-ref="redColourHandler"/>
</util:map>
I'm sure it is easy, but again, I can't find anything on how to represent this in pure Java (so that I don't have any XML configuration files).
Note: the ColourHandler beans are created using the @Component annotation, e.g.
@Component
public class RedColourHandler implements ColourHandler {
    .....
}
and the map of colour handlers is referenced like so:
@Resource(name = "colourHandlers")
private Map<ColourHandlerEnum, ColourHandler> colourHandlers;
Thanks,
Ian.
You probably want something like this:
@Configuration
public class MyConfiguration {

    @Bean
    public Map<ColourEnum, ColourHandler> colourHandlers() {
        Map<ColourEnum, ColourHandler> map = new EnumMap<>(ColourEnum.class);
        map.put(WHITE, whiteHandler());
        // etc.
        return map;
    }

    @Bean
    public ColourHandler whiteHandler() {
        return new WhiteHandler();
    }
}
If you need to keep your handlers as @Components, then you can autowire them into the configuration class:
@Configuration
public class MyConfiguration {

    @Autowired
    private WhiteColourHandler whiteColourHandler;

    @Bean
    public Map<ColourEnum, ColourHandler> colourHandlers() {
        Map<ColourEnum, ColourHandler> map = new EnumMap<>(ColourEnum.class);
        map.put(WHITE, whiteColourHandler);
        return map;
    }
}
Since you already have a unique class/@Component for each ColourHandler, I would just let Spring figure out what to use (no need for @Autowired field injection or any additional creation methods):
@Configuration
public class MyConfiguration {

    @Bean
    public Map<ColourEnum, ColourHandler> colourHandlers(
            WhiteColourHandler whiteHandler,
            BlueColourHandler blueHandler,
            RedColourHandler redHandler) {
        Map<ColourEnum, ColourHandler> map = new EnumMap<>(ColourEnum.class);
        map.put(WHITE, whiteHandler);
        map.put(BLUE, blueHandler);
        map.put(RED, redHandler);
        return map;
    }
}
Similar to the accepted answer except that, instead of autowiring components, you can declare the beans in the configuration class as usual and pass them as arguments to the Map bean method:
@Configuration
public class MyConfiguration {

    @Bean
    public Map<ColourEnum, ColourHandler> colourHandlers(ColourHandler whiteHandler) {
        Map<ColourEnum, ColourHandler> map = new EnumMap<>(ColourEnum.class);
        map.put(WHITE, whiteHandler);
        return map;
    }

    @Bean
    public ColourHandler whiteHandler() {
        return new WhiteHandler();
    }
}
Also note that injecting the map with @Resource doesn't need the annotation's name attribute if the field name follows the same naming convention as the bean definition.
i.e. this would work without the name attribute:
@Resource
private Map<ColourHandlerEnum, ColourHandler> colourHandlers;
but this would require it:
@Resource(name = "colourHandlers")
private Map<ColourHandlerEnum, ColourHandler> handlers;
This is actually pretty simple but you need to know how:
@Autowired
private ColourHandler whiteColourHandler;
...

public Map<ColourEnum, ColourHandler> getColourHandlers() {
    Map<ColourEnum, ColourHandler> map = ...;
    map.put(ColourEnum.white, whiteColourHandler);
    ...
    return map;
}
The trick is that you can inject beans into a config.
You can have the ColourHandler interface itself declare its type; this way you will not have to keep changing your config class when you add new types:
public interface ColourHandler {
    ColourEnum getType();
}

@Component
public class RedColourHandler implements ColourHandler {

    @Override
    public ColourEnum getType() {
        return ColourEnum.RED;
    }
}
And in your config class you get all ColourHandlers and map them to their type.
@Configuration
public class MyConfiguration {

    @Bean
    public Map<ColourEnum, ColourHandler> colourHandlers(List<ColourHandler> colourHandlers) {
        return colourHandlers.stream().collect(toMap(ColourHandler::getType, x -> x));
    }
}
Note that you will get an IllegalStateException if you have more than one handler per colour, which I guess is the expected behaviour; you can catch it and throw your own exception.
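For illustration, a hedged variant of the collector that reports the duplicate colour explicitly instead of the generic duplicate-key message; the exception wording is illustrative.
@Bean
public Map<ColourEnum, ColourHandler> colourHandlers(List<ColourHandler> colourHandlers) {
    return colourHandlers.stream().collect(toMap(
            ColourHandler::getType,
            x -> x,
            (first, second) -> {
                // The merge function only runs when two handlers share the same ColourEnum.
                throw new IllegalStateException("Duplicate handlers for colour " + first.getType());
            },
            () -> new EnumMap<>(ColourEnum.class)));
}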

Can I use already bound instances in Guice's Module.configure()?

I'd like to bind a MethodInterceptor in my module's configure() method, like this:
public class DataModule implements Module {

    @Override
    public void configure(Binder binder) {
        MethodInterceptor transactionInterceptor = ...;
        binder.bindInterceptor(Matchers.any(), Matchers.annotatedWith(Transactional.class), null);
    }

    @Provides
    public DataSource dataSource() {
        JdbcDataSource dataSource = new JdbcDataSource();
        dataSource.setURL("jdbc:h2:test");
        return dataSource;
    }

    @Provides
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Provides
    public TransactionInterceptor transactionInterceptor(PlatformTransactionManager transactionManager) {
        return new TransactionInterceptor(transactionManager, new AnnotationTransactionAttributeSource());
    }
}
Is there a way to get the transactionInterceptor with the help of Guice, or do I need to create all objects required for my interceptor manually?
This is covered in the Guice FAQ. From that document:
In order to inject dependencies in an AOP MethodInterceptor, use requestInjection() alongside the standard bindInterceptor() call.
public class NotOnWeekendsModule extends AbstractModule {
    protected void configure() {
        MethodInterceptor interceptor = new WeekendBlocker();
        requestInjection(interceptor);
        bindInterceptor(any(), annotatedWith(NotOnWeekends.class), interceptor);
    }
}
Another option is to use Binder.getProvider and pass the dependency in the constructor of the interceptor.
public class NotOnWeekendsModule extends AbstractModule {
    protected void configure() {
        bindInterceptor(any(),
                annotatedWith(NotOnWeekends.class),
                new WeekendBlocker(getProvider(Calendar.class)));
    }
}
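Applied to the original DataModule, a minimal sketch of the first option might look like this. The delegating TransactionalInterceptor class is hypothetical, added only to give Guice something to inject into; it assumes Spring's TransactionInterceptor implements the same AOP Alliance MethodInterceptor interface that Guice uses.
public class DataModule implements Module {

    @Override
    public void configure(Binder binder) {
        // Hypothetical delegating interceptor; Guice fills in its @Inject field
        // once the injector is created, thanks to requestInjection().
        TransactionalInterceptor interceptor = new TransactionalInterceptor();
        binder.requestInjection(interceptor);
        binder.bindInterceptor(Matchers.any(), Matchers.annotatedWith(Transactional.class), interceptor);
    }

    // ... the @Provides methods from the question stay unchanged ...

    static class TransactionalInterceptor implements MethodInterceptor {

        @Inject
        private TransactionInterceptor delegate; // provided by the @Provides method

        @Override
        public Object invoke(MethodInvocation invocation) throws Throwable {
            return delegate.invoke(invocation);
        }
    }
}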
Take a look at how Guice Persist was written. Specifically, the JpaPersistService and its module.
