DefaultKafkaProducerFactory with multiple JsonSerializer mappings - java

I'm going through the Spring documentation and found that we can have multiple type mappings for a single producer factory (spring-docs):
senderProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
senderProps.put(JsonSerializer.TYPE_MAPPINGS, "foo:com.myfoo.Foo, bar:com.mybar.bar");
But it is unclear to me how to create a ProducerFactory like the one below:
@Bean
public ProducerFactory<Foo, Bar> kafkaProducerFactory(KafkaProperties properties,
        JsonSerializer customSerializer) {
    return new DefaultKafkaProducerFactory<>(properties.buildProducerProperties(),
            customSerializer, customSerializer);
}
To my knowledge, Foo must be the key and Bar must be the value, right? And what is this customSerializer? I'm looking for a clear example with more detail.
My question: I want a single ProducerFactory and KafkaTemplate that can produce messages of multiple types (for example Foo, Bar, Car) to Kafka. Is that possible?

No; this
senderProps.put(JsonSerializer.TYPE_MAPPINGS, "foo:com.myfoo.Foo, bar:com.mybar.bar");
is only used when the serializer/deserializer is defined through properties alone.
When using the DefaultKafkaConsumerFactory and DefaultKafkaProducerFactory constructors that take fully built serializer/deserializer objects directly, you must configure the serializer/deserializer yourself:
DefaultJackson2JavaTypeMapper typeMapper = new DefaultJackson2JavaTypeMapper();
typeMapper.setIdClassMapping(myTypeMappingsMap);
JsonDeserializer<Object> deserializer = new JsonDeserializer<>();
deserializer.setTypeMapper(typeMapper);
(and similarly for the serializer).
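For illustration, here is a minimal sketch of the producer side, assuming the Foo/Bar classes and packages from the question's mapping string (the bean wiring is an assumption, not the documented example; Spring Boot's KafkaProperties is used only because the question's snippet uses it):

@Bean
public ProducerFactory<String, Object> kafkaProducerFactory(KafkaProperties properties) {
    // Map logical type ids to classes; Foo/Bar are the question's example types
    Map<String, Class<?>> mappings = new HashMap<>();
    mappings.put("foo", com.myfoo.Foo.class);
    mappings.put("bar", com.mybar.Bar.class);

    DefaultJackson2JavaTypeMapper typeMapper = new DefaultJackson2JavaTypeMapper();
    typeMapper.setIdClassMapping(mappings);

    // The value serializer writes the mapped type id into the record headers
    JsonSerializer<Object> valueSerializer = new JsonSerializer<>();
    valueSerializer.setTypeMapper(typeMapper);

    return new DefaultKafkaProducerFactory<>(properties.buildProducerProperties(),
            new StringSerializer(), valueSerializer);
}

Because the value type is Object, a single KafkaTemplate<String, Object> built from this factory can send Foo, Bar (or Car) instances, which addresses the last part of the question.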

Related

Java Jackson Always Serialize One Type As Another

Is there a way to tell Jackson to always serialize one type as another? In my case I would like to always serialize Long to String. Right now, whenever there is an object with a Long property, we have to annotate it with @JsonSerialize(using=ToStringSerializer.class). This is tedious and easy to forget.
I would like to be able to configure the Jackson object mapper to always convert Long to String in the spring boot bean creation.
IMHO, there are multiple options.
I. A com.fasterxml.jackson.databind.ser.std.StdSerializer implementation that can be set on your ObjectMapper in the Spring context:
@Bean
public Jackson2ObjectMapperBuilder objectMapperBuilder() {
Jackson2ObjectMapperBuilder builder = new Jackson2ObjectMapperBuilder();
....
builder.serializerByType(<type>, <your custom serializer>);
return builder;
}
As for the custom serializer, you can extend the above-mentioned class StdSerializer.
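For illustration, a Long-to-String serializer might look like this (the class name is made up for the example):

public class LongToStringSerializer extends StdSerializer<Long> {

    public LongToStringSerializer() {
        super(Long.class);
    }

    @Override
    public void serialize(Long value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        // Write the numeric value as a JSON string instead of a number
        gen.writeString(value.toString());
    }
}

Register it with builder.serializerByType(Long.class, new LongToStringSerializer()); since Jackson already ships ToStringSerializer, builder.serializerByType(Long.class, ToStringSerializer.instance) would work just as well, as the question itself hints.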
II. Use the Spring Boot property:
spring.jackson.generator.write-numbers-as-strings=true
Note
Be aware that Feature.WRITE_NUMBERS_AS_STRINGS has been deprecated since Jackson 2.10.
I hope it helps.
This can be done using the following generator feature:
jsonGenerator.configure(Feature.WRITE_NUMBERS_AS_STRINGS, true);
http://fasterxml.github.io/jackson-core/javadoc/2.10/com/fasterxml/jackson/core/json/JsonWriteFeature.html#WRITE_NUMBERS_AS_STRINGS
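For Jackson 2.10+, the non-deprecated counterpart is JsonWriteFeature.WRITE_NUMBERS_AS_STRINGS; a minimal sketch, assuming you build the mapper yourself:

// All numbers (not just Long) will be written as JSON strings
ObjectMapper mapper = JsonMapper.builder()
        .enable(JsonWriteFeature.WRITE_NUMBERS_AS_STRINGS)
        .build();

Note that, like the deprecated generator feature, this applies to all numbers, not only Long values.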

Spring-Kafka @KafkaHandlers not consuming respective Java objects

I know this has been asked multiple times; however, I can't find the solution yet. I am able to consume a specific Java object with my @KafkaListener (at class level) and it works perfectly fine. However, when I try to consume multiple different JSON objects from the same topic (I use @KafkaListener at class level and @KafkaHandler at method level, each @KafkaHandler method expecting a different object), it always produces a LinkedHashMap. I could parse this, get the data, and use a factory pattern to create different instances based on a JSON field, but I do not want to do that when Spring can auto-detect the specific @KafkaHandler for routing the message.
How do I consume different JSON objects from a single topic with a single @KafkaListener?
I am using the configs below:
public ConsumerFactory<String, Object> abcConsumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "group-1");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    config.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
    return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(), new JsonDeserializer<>(Object.class, false));
}
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Object> abcListenerContainerFactory() {
ConcurrentKafkaListenerContainerFactory<String, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(abcConsumerFactory());
return factory;
}
If I use the actual class (Foo or Bar) instead of Object in the above configs, it works fine for that particular object. However, when I try to generalize, it does not go to the specific @KafkaHandler and instead goes to a @KafkaHandler with payload type LinkedHashMap (I was trying to check whether it gets delivered to a @KafkaHandler at all).
Class:
@KafkaListener(topics = {"abc_gh"}, containerFactory = "abcListenerContainerFactory")
@Service
public class MyListener {

    @KafkaHandler
    public void consumeMessage(@Payload Foo f) {
        // only comes here when I use Foo in my configs instead of Object
    }

    @KafkaHandler
    public void consumeMessage22(@Payload Bar b) {
        // only comes here when I use Bar in my configs instead of Object
    }

    @KafkaHandler
    public void consumeMessage77(@Payload LinkedHashMap bc) {
        // comes here when I use Object in the configs, even if I expect a Foo or a Bar object
    }
}
One thing I want to share: the producer is not using Spring-Kafka.
I don't know what I am missing; I have tried a lot of things but no luck.
As the documentation says:
When using #KafkaHandler methods, the payload must have already been converted to the domain object (so the match can be performed). Use a custom deserializer, the JsonDeserializer, or the JsonMessageConverter with its TypePrecedence set to TYPE_ID. See Serialization, Deserialization, and Message Conversion for more information.
In order to route to the proper method, the ConsumerRecord must already have the correct type in it.
config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
You are not providing the deserializer with any information about what object to create.
When Spring is the producer, it adds type information in record headers which can be used by the deserializer to construct the right type.
If there is no type information, you'll get a Map.
The producer has to set the type headers for this to work.
When the @KafkaListener is at the method level, we can determine the type to create from the method parameters. At the class level, we have a catch-22: we can't choose a method unless the record has already been converted.
Your producer doesn't have to know the actual type, but it at least has to provide a header that can be used to look up the type we need to convert to.
See Mapping Types.
Alternatively, the producer's JSON serializer must be configured to add type information into the JSON itself.
Another option is a custom deserializer that "peeks" at the JSON to determine what class to instantiate.
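For example, if the non-Spring producer can add the standard __TypeId__ header containing a short token, the consumer side can map those tokens to classes purely through properties; a sketch along the lines of the Mapping Types docs (package names are placeholders):

@Bean
public ConsumerFactory<String, Object> abcConsumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "group-1");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    config.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
    // Map the token in the producer's __TypeId__ header to the local classes
    config.put(JsonDeserializer.TYPE_MAPPINGS, "foo:com.example.Foo, bar:com.example.Bar");
    // Let the deserializer be built from these properties so the mappings are applied
    return new DefaultKafkaConsumerFactory<>(config);
}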

Allowed packages in custom header of Kafka-Message

In spring-kafka, how do I add classes from a package to be trusted as a custom header field?
The message is being sent like this:
@Autowired
private KafkaTemplate kafkaTemplate;
Message<BouquetMQDTO> m = MessageBuilder
.withPayload(payload)
.setHeader(KafkaHeaders.TOPIC, "topic")
.setHeader("EVENT_TYPE", MessageType.UPSERT)
.build();
kafkaTemplate.send(m);
The receiving end looks like this:
@Component
@KafkaListener(topics = "topic")
public class KafkaController {

    @KafkaHandler
    public void listen(
            @Payload Object objectDTO,
            @Header(value = "EVENT_TYPE") MessageType messageType
    ) {
        System.out.println(messageType);
    }
}
The exception I keep getting is this:
Caused by: org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [org.springframework.kafka.support.DefaultKafkaHeaderMapper$NonTrustedHeaderType] to type [@org.springframework.messaging.handler.annotation.Header my.package.MessageType]
MessageType is an enum, and I can get it working by sending the String representation and using valueOf() on the receiving side, but this solution does not quite feel right. There are also loads of tutorials that use something from java.util, which is trusted by default.
I found that you should be able to declare a bean to allow the enum to be deserialized:
@Bean
public KafkaHeaderMapper defaultKafkaHeaderMapper() {
DefaultKafkaHeaderMapper mapper = new DefaultKafkaHeaderMapper();
mapper.addTrustedPackages("my.package");
return mapper;
}
Sadly, this doesn't work. The exception remains the same. I assume I have to declare some more beans and use the KafkaHeaderMapper bean in them, but I can't seem to find out which ones those are.
I also already have a ConsumerFactory bean where I allow packages to be used as payloads, but allowing the enum's package there doesn't do anything either.
props.put(JsonDeserializer.TRUSTED_PACKAGES, "my.other.package,my.package");
return new DefaultKafkaConsumerFactory<>(props);
JsonDeserializer.TRUSTED_PACKAGES is not related to headers at all.
It deals only with the key or value of the ConsumerRecord. The header mapping happens in a different place.
Not sure if you use Spring Boot, but there is a MessagingMessageListenerAdapter which comes with a default MessagingMessageConverter and, therefore, a default DefaultKafkaHeaderMapper. To customize your own HeaderMapper, you need to create that MessagingMessageConverter, give it a reference to your HeaderMapper, and inject that converter into an AbstractKafkaListenerContainerFactory bean.
If you use Spring Boot, it is enough to declare that MessagingMessageConverter bean and it will be auto-configured into the AbstractKafkaListenerContainerFactory created by the framework.
This way you can get access to your trusted packages. However, I think it is not going to work yet, because an enum is not so JSON-friendly by default: https://www.baeldung.com/jackson-serialize-enums
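Something along these lines could work as a sketch (bean names are illustrative):

@Bean
public MessagingMessageConverter messagingMessageConverter() {
    MessagingMessageConverter converter = new MessagingMessageConverter();
    DefaultKafkaHeaderMapper headerMapper = new DefaultKafkaHeaderMapper();
    // Trust the package containing the MessageType enum used in the custom header
    headerMapper.addTrustedPackages("my.package");
    converter.setHeaderMapper(headerMapper);
    return converter;
}

Without Boot, inject this converter into your listener container factory, e.g. via factory.setMessageConverter(messagingMessageConverter()).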

how to register kryo serializer instances in storm?

I'm desperately trying to configure serializer instances to use in my Storm topology.
The storm documentation states, there are 2 ways to register serializers :
1. The name of a class to register. In this case, Storm will use Kryo’s FieldsSerializer to serialize the class. This may or may not be optimal for the class – see the Kryo docs for more details.
2. A map from the name of a class to register to an implementation of com.esotericsoftware.kryo.Serializer.
I want to use option 2:
Map<String, Object> serializerConfig = new HashMap<String, Object>();
serializerConfig.put(Record.class.getName(), new AvroSerializer(params));
conf.put(Config.TOPOLOGY_KRYO_REGISTER, serializerConfig);
Unfortunately, this results in
Exception in thread "main" java.lang.IllegalArgumentException: Storm conf is not valid. Must be json-serializable
on topology submission.
Does anyone know how to do this (register serializer instances) ?
Thank you very much
There is a method in the backtype.storm.Config class to register your own class derived from Serializer.
For your example, put this in the main method that creates and submits the topology:
// Read in Storm Configuration
Config conf = new Config();
conf.registerSerialization(Record.class, AvroSerializer.class);
As Steven Magana-Zook said above, you want to register the class in the config as he has done. This apparently doesn't let you pass in parameters, but if you look at SerializationFactory.java in Storm's source, you can see that it resolves various possible constructors of your serializer class, including several that take the Storm Config. You can stash your parameters in there.
So it's not exactly what you were hoping for, but you should be able to reach the same end.
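For reference, a bare-bones shape of such a serializer (the Avro-specific conversion is left as hypothetical helper methods, since it depends on your schema handling; Record and AvroSerializer are the names from the question):

public class AvroSerializer extends Serializer<Record> {

    @Override
    public void write(Kryo kryo, Output output, Record record) {
        byte[] bytes = toAvroBytes(record);      // hypothetical helper: Record -> byte[]
        output.writeInt(bytes.length, true);
        output.writeBytes(bytes);
    }

    @Override
    public Record read(Kryo kryo, Input input, Class<Record> type) {
        byte[] bytes = input.readBytes(input.readInt(true));
        return fromAvroBytes(bytes);             // hypothetical helper: byte[] -> Record
    }
}

It is then registered with conf.registerSerialization(Record.class, AvroSerializer.class); as in the accepted answer.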

What's the common way to deal with Jackson serialization

Currently I have a project that makes use of Spring-Hibernate and also Jackson to deal with JSON. The first time I tried to use Jackson, I always got LazyInitializationException and sometimes an infinite loop for multiple entities that reference each other. Then I found @JsonIgnore and @JsonIdentityInfo.
Now the problem is that sometimes I need to ignore properties, but sometimes I need those same properties to be serialized. Is there a way to sometimes ignore several fields and sometimes serialize them at runtime?
I found "Serialization and Deserialization with Jackson: how to programmatically ignore fields?"
But if I always have to use the mix-in annotation, it would be cumbersome for an object with dozens of properties to retrieve. E.g. on page1 I need propertyA, propertyB, and propertyC; on page2 I need propertyA and propertyC; on page3 I only need propertyB. In those cases alone I would have to create one class for each page, resulting in 3 classes.
So in that case is there a way to define something like:
objectA.ignoreAllExcept('propertyA');
String[] properties = {'propertyA', 'propertyC'};
objectB.ignoreAllExcept(properties); // Retrieve propertyA and propertyC
objectC.ignore(properties);
What you might be looking for is a Module. The documentation says that Modules are
Simple interface for extensions that can be registered with ObjectMappers to provide a well-defined set of extensions to default functionality.
Following is an example of how you might use them to accomplish what you want. Note, there are other ways this can be achieved; this is just one of them.
A simple DTO that can be used for specifying the properties to filter:
public class PropertyFilter {
public Class<?> classToFilter;
public Set<String> propertiesToIgnore = Collections.emptySet();
public PropertyFilter(Class<?> classToFilter, Set<String> propertiesToIgnore) {
this.classToFilter = classToFilter;
this.propertiesToIgnore = propertiesToIgnore;
}
}
A custom module that filters out properties based on some attribute that you store in the current request.
public class MyModule extends Module {
@Override
public String getModuleName() {
return "Test Module";
}
@Override
public void setupModule(SetupContext context) {
context.addBeanSerializerModifier(new MySerializerModifier());
}
@Override
public Version version() {
// Modify if you need to.
return Version.unknownVersion();
}
public static class MySerializerModifier extends BeanSerializerModifier {
public BeanSerializerBuilder updateBuilder(SerializationConfig config,
BeanDescription beanDesc,
BeanSerializerBuilder builder) {
List<PropertyFilter> filters = (List<PropertyFilter>) RequestContextHolder.getRequestAttributes().getAttribute("filters", RequestAttributes.SCOPE_REQUEST);
PropertyFilter filter = getPropertyFilterForClass(filters, beanDesc.getBeanClass());
if(filter == null) {
return builder;
}
List<BeanPropertyWriter> propsToWrite = new ArrayList<BeanPropertyWriter>();
for(BeanPropertyWriter writer : builder.getProperties()) {
if(!filter.propertiesToIgnore.contains(writer.getName())) {
propsToWrite.add(writer);
}
}
builder.setProperties(propsToWrite);
return builder;
}
private PropertyFilter getPropertyFilterForClass(List<PropertyFilter> filters, Class<?> classToCheck) {
for(PropertyFilter f : filters) {
if(f.classToFilter.equals(classToCheck)) {
return f;
}
}
return null;
}
}
}
Note: There is a changeProperties method in the BeanSerializerModifier class that is more appropriate for changing the property list (according to the documentation), so you can move the code from updateBuilder to changeProperties with appropriate changes.
Now, you need to register this custom module with your ObjectMapper. You can get the Jackson HTTP message converter from your application context, and get its object mapper. I am assuming you already know how to do that as you have been dealing with the lazy-initialization issue as well.
// Figure out a way to get the ObjectMapper.
MappingJackson2HttpMessageConverter converter = ... // get the jackson-mapper;
converter.getObjectMapper().registerModule(new MyModule());
And you are done. When you want to customize the serialization for a particular type of object, create a PropertyFilter for that, put it in a List and make it available as an attribute in the current request. This is just a simple example. You might need to tweak it a bit to suit your needs.
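For instance (entity and property names are illustrative), somewhere in the code handling the current request:

// Ignore propertyB of MyEntity for this request only
List<PropertyFilter> filters = Collections.singletonList(
        new PropertyFilter(MyEntity.class, new HashSet<>(Arrays.asList("propertyB"))));
RequestContextHolder.getRequestAttributes()
        .setAttribute("filters", filters, RequestAttributes.SCOPE_REQUEST);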
In your question, you seem to be looking for a way to specify the properties-to-filter-out on the serialized objects themselves. That, in my opinion, should be avoided, as the list of properties to filter out doesn't belong in your entities. However, if you do want to do that, create an interface that provides setters and getters for the list of properties; suppose the name of the interface is CustomSerialized. Then you can modify the MyModule class to look for instances of this CustomSerialized interface and filter out the properties accordingly.
Note: You might need to adjust/tweak a few things based on the versions of the libraries you are using.
I think there is a more flexible way to do it. You can configure Jackson in such a way that it will silently ignore lazy-loaded properties instead of stopping the serialization process, so you can reuse the same class: just load all the necessary properties/relations and pass the object to Jackson. You can do this by declaring your own ObjectMapper and turning off the SerializationFeature.FAIL_ON_EMPTY_BEANS feature. Hope it helps.
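A minimal sketch of that idea, assuming you expose your own ObjectMapper bean:

@Bean
public ObjectMapper objectMapper() {
    ObjectMapper mapper = new ObjectMapper();
    // Don't fail when a bean (e.g. an uninitialized Hibernate proxy) has no serializable properties
    mapper.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
    return mapper;
}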
You can filter out properties without modifying classes by creating a static interface for a mix-in annotation. Next, annotate that interface with the @JsonFilter annotation. Create a SimpleBeanPropertyFilter and a SimpleFilterProvider. Then create an ObjectWriter with your filter provider by invoking objectMapper.writer(filterProvider).
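A short sketch of that approach (entity, filter, and mix-in names are made up):

// Mix-in that attaches a named filter to the entity without touching its source
@JsonFilter("pageFilter")
interface PageFilterMixIn {}

ObjectMapper mapper = new ObjectMapper();
mapper.addMixIn(MyEntity.class, PageFilterMixIn.class);

// Keep only the properties needed for this particular view
FilterProvider filters = new SimpleFilterProvider()
        .addFilter("pageFilter", SimpleBeanPropertyFilter.filterOutAllExcept("propertyA", "propertyC"));

String json = mapper.writer(filters).writeValueAsString(entity);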
