Kafka consumer can't receive serialized Object? - java

So I want to implement a simple application which sends a notification from a Kafka producer to a Kafka consumer. So far I have successfully sent a String message from the producer to the consumer. But when I try to send a notification object, the Kafka consumer doesn't receive any objects. This is the code I have used.
public class Notification implements Serializable{
private String name;
private String message;
private long currentTimeStamp;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public long getCurrentTimeStamp() {
return currentTimeStamp;
}
public void setCurrentTimeStamp(long currentTimeStamp) {
this.currentTimeStamp = currentTimeStamp;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
Notification that = (Notification) o;
if (currentTimeStamp != that.currentTimeStamp) return false;
if (message != null ? !message.equals(that.message) : that.message != null) return false;
if (name != null ? !name.equals(that.name) : that.name != null) return false;
return true;
}
@Override
public int hashCode() {
int result = name != null ? name.hashCode() : 0;
result = 31 * result + (message != null ? message.hashCode() : 0);
result = 31 * result + (int) (currentTimeStamp ^ (currentTimeStamp >>> 32));
return result;
}
@Override
public String toString() {
return "Notification{" +
"name='" + name + '\'' +
", message='" + message + '\'' +
", currentTimeStamp=" + currentTimeStamp +
'}';
}
}
And this is the producer:
public class KafkaProducer {
static String topic = "kafka-tutorial";
public static void main(String[] args) {
System.out.println("Start Kafka producer");
Properties properties = new Properties();
properties.put("metadata.broker.list", "localhost:9092");
properties.put("serializer.class", "dev.innova.kafka.tutorial.producer.CustomSerializer");
ProducerConfig producerConfig = new ProducerConfig(properties);
kafka.javaapi.producer.Producer<String, Notification> producer = new kafka.javaapi.producer.Producer<String, Notification>(producerConfig);
KeyedMessage<String, Notification> message = new KeyedMessage<String, Notification>(topic, createNotification());
System.out.println("send Message to broker");
producer.send(message);
producer.close();
}
private static Notification createNotification(){
Notification notification = new Notification();
notification.setMessage("Sample Message");
notification.setName("Sajith");
notification.setCurrentTimeStamp(System.currentTimeMillis());
return notification;
}
}
And this is the consumer:
public class KafkaConcumer extends Thread {
final static String clientId = "SimpleConsumerDemoClient";
final static String TOPIC = "kafka-tutorial";
ConsumerConnector consumerConnector;
public KafkaConcumer() {
Properties properties = new Properties();
properties.put("zookeeper.connect","localhost:2181");
properties.put("group.id","test-group");
properties.put("serializer.class", "dev.innova.kafka.tutorial.producer.CustomSerializer");
properties.put("zookeeper.session.timeout.ms", "400");
properties.put("zookeeper.sync.time.ms", "200");
properties.put("auto.commit.interval.ms", "1000");
ConsumerConfig consumerConfig = new ConsumerConfig(properties);
consumerConnector = Consumer.createJavaConsumerConnector(consumerConfig);
}
@Override
public void run() {
Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
topicCountMap.put(TOPIC, new Integer(1));
Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumerConnector.createMessageStreams(topicCountMap);
KafkaStream<byte[], byte[]> stream = consumerMap.get(TOPIC).get(0);
ConsumerIterator<byte[], byte[]> it = stream.iterator();
System.out.println("It :" + it.size());
while(it.hasNext()){
System.out.println(new String(it.next().message()));
}
}
private static void printMessages(ByteBufferMessageSet messageSet) throws UnsupportedEncodingException {
for(MessageAndOffset messageAndOffset: messageSet) {
ByteBuffer payload = messageAndOffset.message().payload();
byte[] bytes = new byte[payload.limit()];
payload.get(bytes);
System.out.println(new String(bytes, "UTF-8"));
}
}
}
And finally I have used a custom serializer to serialize and deserialize the object.
public class CustomSerializer implements Encoder<Notification>, Decoder<Notification> {
public CustomSerializer(VerifiableProperties verifiableProperties) {
/* This constructor must be present for successful compile. */
}
@Override
public byte[] toBytes(Notification o) {
return new byte[0];
}
@Override
public Notification fromBytes(byte[] bytes) {
return null;
}
}
Can someone tell me what the issue is? Is this the right way?

You have two problems.
First, your CustomSerializer doesn't have any logic. It returns an empty byte array for every object it serializes and returns null whenever it's asked to deserialize an object. You need to put code there that actually serializes and deserializes your objects.
Second, if you plan to use the JVM's native serialization and deserialization logic, you'll need to add a serialVersionUID to the beans that will be transported. Something like this:
private static final long serialVersionUID = 123L;
You can use any value you like. When an object is deserialized by the JVM, the serialVersionUID in the serialized data is compared to the value specified in the loaded class definition. If the two are different, the JVM assumes that even though you have a class definition loaded, you don't have the correct version of it, and deserialization will fail. If you don't specify a value for serialVersionUID in your class definition, the JVM will make one up for you, and two different JVMs (the one running the producer and the one running the consumer) will almost certainly make up different values.
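For example, placed inside the bean it would look something like this (a minimal sketch; only the serialVersionUID field is new compared to the Notification class shown above):
public class Notification implements Serializable {
    // Any fixed value works, as long as the producer and the consumer load the same class definition.
    private static final long serialVersionUID = 123L;
    private String name;
    private String message;
    private long currentTimeStamp;
    // getters, setters, equals, hashCode and toString as shown above
}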
EDIT
You'd need to make your serializer look something like this if you want to leverage the default Java serialization:
public class CustomSerializer implements Encoder<Notification>, Decoder<Notification> {
public CustomSerializer(VerifiableProperties verifiableProperties) {
/* This constructor must be present for successful compile. */
}
@Override
public byte[] toBytes(Notification o) {
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(baos);
oos.writeObject(o);
oos.close();
byte[] b = baos.toByteArray();
return b;
} catch (IOException e) {
return new byte[0];
}
}
@Override
public Notification fromBytes(byte[] bytes) {
try {
return (Notification) new ObjectInputStream(new ByteArrayInputStream(bytes)).readObject();
} catch (Exception e) {
return null;
}
}
}
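On the consumer side the raw bytes then have to go through the same decoder instead of being printed as a String. A minimal sketch of the consumer loop under that assumption (reusing the iterator it from the consumer above; the null VerifiableProperties is only for illustration):
CustomSerializer decoder = new CustomSerializer(null);
while (it.hasNext()) {
    // message() returns the raw value bytes; turn them back into a Notification
    Notification notification = decoder.fromBytes(it.next().message());
    System.out.println(notification);
}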

Create a custom serializer and deserializer. Kafka needs a way to serialize and deserialize the payload, so we have to provide implementations of both.
You need to add a library to get the ObjectMapper class:
FasterXML Jackson – 2.8.6
Example - serializer
public class PayloadSerializer implements org.apache.kafka.common.serialization.Serializer {
@Override
public byte[] serialize(String arg0, Object arg1) {
byte[] retVal = null;
ObjectMapper objectMapper = new ObjectMapper();
TestModel model =(TestModel) arg1;
try {
retVal = objectMapper.writeValueAsString(model).getBytes();
} catch (Exception e) {
e.printStackTrace();
}
return retVal;
}
@Override
public void close() {
}
@Override
public void configure(Map map, boolean bln) {
}
}
Deserializer
public class PayloadDeserializer implements Deserializer {
@Override
public void close() {
}
@Override
public TestModel deserialize(String arg0, byte[] arg1) {
ObjectMapper mapper = new ObjectMapper();
TestModel testModel = null;
try {
testModel = mapper.readValue(arg1, TestModel.class);
} catch (Exception e) {
e.printStackTrace();
}
return testModel;
}
@Override
public void configure(Map map, boolean bln) {
}
}
Finally, we have to pass the deserializer class to the consumer:
ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG - PayloadDeserializer.class
or
deserializer.class - classpath.PayloadDeserializer
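For example, with the plain Kafka clients the two classes could be wired in like this (a sketch; the broker address and group id are placeholders):
// Producer side: Jackson-based serializer for values
Properties producerProps = new Properties();
producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, PayloadSerializer.class.getName());

// Consumer side: the matching Jackson-based deserializer
Properties consumerProps = new Properties();
consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "test-group");
consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, PayloadDeserializer.class.getName());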

I strongly suggest you convert your object to an Avro record before sending it.
It is not that difficult, and it is the idiomatic Kafka way of transmitting objects.
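For example, with the Confluent Avro serializer and a schema registry (assumed here to run at localhost:8081), the producer configuration would look roughly like this sketch:
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
// KafkaAvroSerializer serializes the record with its Avro schema and registers the schema
props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
props.put("schema.registry.url", "http://localhost:8081");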

Related

Unable to create Kafka Redis Sink with Single Message Transformations

I am trying to create a Kafka Redis sink that deletes a particular key in Redis. One way is to create a record or message in Kafka with a specific key and the value set to null. But as per the use case, generating the keys is not possible. As a workaround, I wrote a single message transformer that takes the message from Kafka, sets a particular key, and sets the value to null.
Here are my Kafka Connect configurations:
{
"connector.class": "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
"transforms.invalidaterediskeys.type": "com.github.cjmatta.kafka.connect.smt.InvalidateRedisKeys",
"redis.database": "0",
"redis.client.mode": "Standalone",
"topics": "test_redis_deletion2",
"tasks.max": "1",
"redis.hosts": "REDIS-HOST",
"key.converter": "org.apache.kafka.connect.storage.StringConverter",
"transforms": "invalidaterediskeys"
}
Here is the code for the transformation:
public class InvalidateRedisKeys<R extends ConnectRecord<R>> implements Transformation<R> {
private static final Logger LOG = LoggerFactory.getLogger(InvalidateRedisKeys.class);
private static final ObjectMapper mapper = new ObjectMapper()
.configure(DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES, false);
@Override
public ConfigDef config() {
return new ConfigDef();
}
@Override
public void configure(Map<String, ?> settings) {
}
@Override
public void close() {
}
@Override
public R apply(R r) {
try {
return r.newRecord(
r.topic(),
r.kafkaPartition(),
Schema.STRING_SCHEMA,
getKey(r.value()),
null,
null,
r.timestamp()
);
} catch (IOException e) {
LOG.error("a.jsonhandling.{}", e.getMessage());
return null;
} catch (Exception e) {
LOG.error("a.exception.{}", e.getMessage());
return null;
}
}
private String getKey(Object value) throws IOException {
A a = mapper.readValue(value.toString(), A.class);
long userId = a.getUser_id();
int roundId = a.getRound_id();
return KeyGeneratorUtil.getKey(userId, roundId);
}
}
where A is
public class A {
private long user_id;
private int round_id;
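// getters and setters omitted; getKey() above relies on getUser_id() and getRound_id()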
}
And KeyGeneratorUtil contains a static function that generates the relevant string and returns the result.
I took help from
https://github.com/cjmatta/kafka-connect-insert-uuid
https://github.com/apache/kafka/tree/trunk/connect/transforms/src/main/java/org/apache/kafka/connect/transforms
When I try to initialize Kafka Connect, it says the configuration is invalid. Is there something that I am missing?

How can I specify the return type for asyncRabbitTemplate.convertSendAndReceiveAsType() at run time?

I've been struggling with the following code and am not sure how to deserialize the response or even pass the correct type at run time.
The code is:
@Override
public <T, R> R sendAsync(T payload, String routingKey, String exchangeName) {
ListenableFuture<R> listenableFuture =
asyncRabbitTemplate.convertSendAndReceiveAsType(
exchangeName,
routingKey,
payload,
new ParameterizedTypeReference<>() {
}
);
try {
return listenableFuture.get();
} catch (InterruptedException | ExecutionException e) {
LOGGER.error(" [x] Cannot get response.", e);
return null;
}
}
Let us say that I am just calling the method like the following:
SaveImageResponse response = backendClient.sendAsync( new SaveImageRequest(createQRRequest.getOwner(), qr), RabbitConstants.CREATE_QR_IMAGE_KEY, RabbitConstants.CDN_EXCHANGE);
while the POJO is the following:
public class SaveImageResponse {
private String id;
private String message;
public SaveImageResponse() {
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
@Override
public String toString() {
return "SaveImageResponse{" +
"id='" + id + '\'' +
", message='" + message + '\'' +
'}';
}
}
The current code throws the following error:
Caused by: java.lang.ClassCastException: class java.util.LinkedHashMap cannot be cast to class dev.yafatek.qr.api.responses.SaveImageResponse (java.util.LinkedHashMap is in module java.base of loader 'bootstrap'; dev.yafatek.qr.api.responses.SaveImageResponse is in unnamed module of loader 'app')
Thanks in advance.
SOLUTION:
So I ended up using the following:
@Override
public <T, R> R sendAsync(T payload, String routingKey, String exchangeName, Class<R> clazz) {
ListenableFuture<R> listenableFuture =
asyncRabbitTemplate.convertSendAndReceiveAsType(
exchangeName,
routingKey,
payload,
new ParameterizedTypeReference<>() {
}
);
try {
return objectMapper.convertValue(listenableFuture.get(), clazz);
} catch (InterruptedException | ExecutionException e) {
LOGGER.error(" [x] Cannot get response.", e);
return null;
}
}
by using the ObjectMapper and passing the actual type when calling the method with a
Class<POJO> clazz
parameter. To use the above code:
WebsiteInfoResponse websiteInfoResponse = backendClient.sendAsync(new GetWebsiteInfoReq(createBusinessDetailsRequest.getWebsiteUrlId()), RabbitConstants.GET_WEBSITE_INFO_KEY, RabbitConstants.QR_EXCHANGE, WebsiteInfoResponse.class);
You can't.
The whole reason for ParameterizedTypeReference<Foo> is to tell the converter you want a Foo; this has to be resolved at compile time for the method; you can't call sendAsync() to receive different types.
Providing no generic type means it will convert to Object (usually a map).
Even new ParameterizedTypeReference<R>() { } won't work because R is not resolved at compile time for the generic type (of the sendAsync() method).
You have to do the conversion yourself.
@SpringBootApplication
public class So69299112Application {
public static void main(String[] args) {
SpringApplication.run(So69299112Application.class, args);
}
@Bean
MessageConverter converter() {
return new Jackson2JsonMessageConverter();
}
ObjectMapper mapper = new ObjectMapper();
@Bean
AsyncRabbitTemplate template(RabbitTemplate template) {
template.setMessageConverter(new SimpleMessageConverter());
return new AsyncRabbitTemplate(template);
}
@Bean
ApplicationRunner runner(Service service) {
return args -> {
byte[] response = service.sendAsync("bar", "foo", "");
Foo foo = this.mapper.readerFor(Foo.class).readValue(response);
System.out.println(foo);
};
}
@RabbitListener(queues = "foo")
public Foo listen(String in) {
return new Foo(in);
}
public static class Foo {
String foo;
public Foo() {
}
public Foo(String foo) {
this.foo = foo;
}
public String getFoo() {
return this.foo;
}
public void setFoo(String foo) {
this.foo = foo;
}
@Override
public String toString() {
return "Foo [foo=" + this.foo + "]";
}
}
}
@Component
class Service {
private static final Logger LOGGER = LoggerFactory.getLogger(Service.class);
AsyncRabbitTemplate asyncRabbitTemplate;
public Service(AsyncRabbitTemplate asyncRabbitTemplate) {
this.asyncRabbitTemplate = asyncRabbitTemplate;
}
public byte[] sendAsync(Object payload, String routingKey, String exchangeName) {
ListenableFuture<byte[]> listenableFuture = asyncRabbitTemplate.convertSendAndReceive(
exchangeName,
routingKey,
payload);
try {
return listenableFuture.get();
}
catch (InterruptedException | ExecutionException e) {
LOGGER.error(" [x] Cannot get response.", e);
return null;
}
}
}

How can I store a POJO in Redis with Micronaut framework?

I am new to Micronaut. I am trying to port a project to Micronaut (v1.1.1) and I have found a problem with Redis.
I am just trying to save a simple POJO in Redis, but when I try to "save" it the following error is raised:
io.lettuce.core.RedisException: io.netty.handler.codec.EncoderException: Cannot encode command. Please close the connection as the connection state may be out of sync.
The code is very simple (HERE you can find a complete test):
class DummyTest {
@Test
public void testIssue() throws Exception {
final Date now = Date.from(Instant.now());
CatalogContent expectedContentOne = CatalogContent.builder()
.contentId(1)
.status(ContentStatus.AVAILABLE)
.title("uno")
.streamId(1)
.available(now)
.tags(Set.of("tag1", "tag2"))
.build();
repository.save(expectedContentOne);
}
}
/.../
class CatalogContentRepository {
private StatefulRedisConnection<String, CatalogContent> connection;
public CatalogContentRepository(StatefulRedisConnection<String, CatalogContent> connection) {
this.connection = connection;
}
public void save(CatalogContent content) {
RedisCommands<String, CatalogContent> redisApi = connection.sync();
redisApi.set(String.valueOf(content.getContentId()),content); //Error here!
}
}
Any ideas will be welcome.
Thanks in advance.
For the record, I will answer my own question:
Right now (2019-05-14) Micronaut only generates StatefulRedisConnection<String, String> with a hardcoded UTF-8 String codec.
To change this you have to replace the DefaultRedisClientFactory and define a method returning the StatefulRedisConnection you need,
with your preferred codec.
In my case:
@Requires(beans = DefaultRedisConfiguration.class)
@Singleton
@Factory
@Replaces(factory = DefaultRedisClientFactory.class)
public class RedisClientFactory extends AbstractRedisClientFactory {
@Bean(preDestroy = "shutdown")
@Singleton
@Primary
@Override
public RedisClient redisClient(#Primary AbstractRedisConfiguration config) {
return super.redisClient(config);
}
@Bean(preDestroy = "close")
@Singleton
@Primary
public StatefulRedisConnection<String, Object> myRedisConnection(#Primary RedisClient redisClient) {
return redisClient.connect(new SerializedObjectCodec());
}
@Bean(preDestroy = "close")
@Singleton
@Primary
@Override
public StatefulRedisConnection<String, String> redisConnection(#Primary RedisClient redisClient) {
throw new RuntimeException("The String connection is not used here; inject the Object connection instead");
}
@Override
@Bean(preDestroy = "close")
@Singleton
public StatefulRedisPubSubConnection<String, String> redisPubSubConnection(#Primary RedisClient redisClient) {
return super.redisPubSubConnection(redisClient);
}
}
The codec has been taken from the Redis Lettuce wiki:
public class SerializedObjectCodec implements RedisCodec<String, Object> {
private Charset charset = Charset.forName("UTF-8");
@Override
public String decodeKey(ByteBuffer bytes) {
return charset.decode(bytes).toString();
}
@Override
public Object decodeValue(ByteBuffer bytes) {
try {
byte[] array = new byte[bytes.remaining()];
bytes.get(array);
ObjectInputStream is = new ObjectInputStream(new ByteArrayInputStream(array));
return is.readObject();
} catch (Exception e) {
return null;
}
}
@Override
public ByteBuffer encodeKey(String key) {
return charset.encode(key);
}
@Override
public ByteBuffer encodeValue(Object value) {
try {
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
ObjectOutputStream os = new ObjectOutputStream(bytes);
os.writeObject(value);
return ByteBuffer.wrap(bytes.toByteArray());
} catch (IOException e) {
return null;
}
}
}
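With that factory in place, the repository can inject the StatefulRedisConnection<String, Object> and store the POJO directly; a minimal sketch under that assumption (CatalogContent has to implement Serializable for the SerializedObjectCodec to work):
class CatalogContentRepository {
    private final StatefulRedisConnection<String, Object> connection;

    public CatalogContentRepository(StatefulRedisConnection<String, Object> connection) {
        this.connection = connection;
    }

    public void save(CatalogContent content) {
        // The codec turns the value into a Java-serialized byte stream before sending it to Redis
        RedisCommands<String, Object> redisApi = connection.sync();
        redisApi.set(String.valueOf(content.getContentId()), content);
    }

    public CatalogContent find(long contentId) {
        return (CatalogContent) connection.sync().get(String.valueOf(contentId));
    }
}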

Issue with ArrayList Serde in Kafka Streams API

Based on my previous question, I am still trying to figure out what the issue with my code is.
I've got the most basic topic possible: keys and values are of type Long, and this is my producer code:
public class DemoProducer {
public static void main(String... args) {
Producer<Long, Long> producer = new KafkaProducer<>(createProperties());
LongStream.range(1, 100)
.forEach(
i ->
LongStream.range(100, 115)
.forEach(j -> producer.send(new ProducerRecord<>("test", i, j))));
producer.close();
}
private static final Properties createProperties() {
final Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
props.put(ProducerConfig.ACKS_CONFIG, "all");
props.put(ProducerConfig.RETRIES_CONFIG, 0);
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, LongSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, LongSerializer.class.getName());
return props;
}
}
I'd like to group records by key and put the values into an ArrayList using the Kafka Streams API.
This is my Streams app that's supposed to do the transformation and write the result to a new topic, test-aggregated:
public class DemoStreams {
public static void main(String... args) {
final Serde<Long> longSerde = Serdes.Long();
KStreamBuilder builder = new KStreamBuilder();
builder
.stream(longSerde, longSerde, "test")
.groupByKey(longSerde, longSerde)
.aggregate(
ArrayList::new,
(subscriberId, reportId, queue) -> {
queue.add(reportId);
return queue;
},
new ArrayListSerde<>(longSerde))
.to(longSerde, new ArrayListSerde<>(longSerde), "test-aggregated");
final KafkaStreams streams = new KafkaStreams(builder, createProperties());
streams.cleanUp();
streams.start();
Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
}
private static Properties createProperties() {
final Properties properties = new Properties();
String longSerdes = Serdes.Long().getClass().getName();
properties.put(StreamsConfig.APPLICATION_ID_CONFIG, "aggregation-app");
properties.put(StreamsConfig.CLIENT_ID_CONFIG, "aggregation-app-client");
properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
properties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, longSerdes);
properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, ArrayListSerde.class);
properties.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
properties.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
return properties;
}
}
I have implemented my Serde as follows:
ArrayListSerde
public class ArrayListSerde<T> implements Serde<ArrayList<T>> {
private final Serde<ArrayList<T>> inner;
public ArrayListSerde(Serde<T> serde) {
inner =
Serdes.serdeFrom(
new ArrayListSerializer<>(serde.serializer()),
new ArrayListDeserializer<>(serde.deserializer()));
}
@Override
public Serializer<ArrayList<T>> serializer() {
return inner.serializer();
}
@Override
public Deserializer<ArrayList<T>> deserializer() {
return inner.deserializer();
}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
inner.serializer().configure(configs, isKey);
inner.deserializer().configure(configs, isKey);
}
@Override
public void close() {
inner.serializer().close();
inner.deserializer().close();
}
}
ArrayListSerializer
public class ArrayListSerializer<T> implements Serializer<ArrayList<T>> {
private Serializer<T> inner;
public ArrayListSerializer(Serializer<T> inner) {
this.inner = inner;
}
// Default constructor needed by Kafka
public ArrayListSerializer() {}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
// do nothing
}
@Override
public byte[] serialize(String topic, ArrayList<T> queue) {
final int size = queue.size();
final ByteArrayOutputStream baos = new ByteArrayOutputStream();
final DataOutputStream dos = new DataOutputStream(baos);
final Iterator<T> iterator = queue.iterator();
try {
dos.writeInt(size);
while (iterator.hasNext()) {
final byte[] bytes = inner.serialize(topic, iterator.next());
dos.writeInt(bytes.length);
dos.write(bytes);
}
} catch (IOException e) {
throw new RuntimeException("Unable to serialize ArrayList", e);
}
return baos.toByteArray();
}
@Override
public void close() {
inner.close();
}
}
ArrayListDeserializer
public class ArrayListDeserializer<T> implements Deserializer<ArrayList<T>> {
private final Deserializer<T> valueDeserializer;
public ArrayListDeserializer(final Deserializer<T> valueDeserializer) {
this.valueDeserializer = valueDeserializer;
}
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
// do nothing
}
@Override
public ArrayList<T> deserialize(String topic, byte[] bytes) {
if (bytes == null || bytes.length == 0) {
return null;
}
final ArrayList<T> arrayList = new ArrayList<>();
final DataInputStream dataInputStream = new DataInputStream(new ByteArrayInputStream(bytes));
try {
final int records = dataInputStream.readInt();
for (int i = 0; i < records; i++) {
final byte[] valueBytes = new byte[dataInputStream.readInt()];
dataInputStream.read(valueBytes);
arrayList.add(valueDeserializer.deserialize(topic, valueBytes));
}
} catch (IOException e) {
throw new RuntimeException("Unable to deserialize ArrayList", e);
}
return arrayList;
}
@Override
public void close() {
// do nothing
}
}
However, I end up getting this exception:
Exception in thread "permission-agg4-client-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: stream-thread [aggregation-app-client-StreamThread-1] Failed to rebalance.
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:543)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:490)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:480)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:457)
Caused by: org.apache.kafka.streams.errors.StreamsException: Failed to configure value serde class utils.ArrayListSerde
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde(StreamsConfig.java:770)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init>(AbstractProcessorContext.java:59)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init>(ProcessorContextImpl.java:40)
at org.apache.kafka.streams.processor.internals.StreamTask.<init>(StreamTask.java:138)
at org.apache.kafka.streams.processor.internals.StreamThread.createStreamTask(StreamThread.java:1078)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask(StreamThread.java:255)
at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks(StreamThread.java:245)
at org.apache.kafka.streams.processor.internals.StreamThread.addStreamTasks(StreamThread.java:1147)
at org.apache.kafka.streams.processor.internals.StreamThread.access$800(StreamThread.java:68)
at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned(StreamThread.java:184)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete(ConsumerCoordinator.java:265)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded(AbstractCoordinator.java:367)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup(AbstractCoordinator.java:316)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll(ConsumerCoordinator.java:297)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1078)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1043)
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:536)
... 3 more
Caused by: org.apache.kafka.common.KafkaException: Could not instantiate class utils.ArrayListSerde Does it have a public no-argument constructor?
at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:286)
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:246)
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde(StreamsConfig.java:764)
... 19 more
Caused by: java.lang.InstantiationException: utils.ArrayListSerde
at java.lang.Class.newInstance(Class.java:427)
at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:282)
... 21 more
Caused by: java.lang.NoSuchMethodException: utils.ArrayListSerde.<init>()
at java.lang.Class.getConstructor0(Class.java:3082)
at java.lang.Class.newInstance(Class.java:412)
... 22 more
I was trying to implement the Serde based on the PriorityQueue example found on Confluent's GitHub page: https://github.com/confluentinc/kafka-streams-examples/tree/3.3.0-post/src/main/java/io/confluent/examples/streams/utils
As the error indicates, all Serdes need to have a public no-argument constructor:
Caused by: org.apache.kafka.common.KafkaException: Could not instantiate class utils.ArrayListSerde Does it have a public no-argument constructor?
Your class ArrayListSerde only has this constructor:
public ArrayListSerde(Serde<T> serde) { ... }
Thus, you get this error.
Compare ArrayListSerializer, which does provide one:
// Default constructor needed by Kafka
public ArrayListSerializer() {}
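One way to satisfy that requirement is to give the Serde itself a public no-argument constructor that builds its inner serializer and deserializer; a sketch under the assumption that the list elements are Longs, as in the topology above:
public class ArrayListSerde<T> implements Serde<ArrayList<T>> {
    private final Serde<ArrayList<T>> inner;

    // No-argument constructor needed when the class is passed via DEFAULT_VALUE_SERDE_CLASS_CONFIG;
    // here it simply defaults the element serde to Long.
    @SuppressWarnings("unchecked")
    public ArrayListSerde() {
        this((Serde<T>) Serdes.Long());
    }

    public ArrayListSerde(Serde<T> serde) {
        inner = Serdes.serdeFrom(
            new ArrayListSerializer<>(serde.serializer()),
            new ArrayListDeserializer<>(serde.deserializer()));
    }

    // serializer(), deserializer(), configure() and close() stay as shown in the question
}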
Update:
A standard ListSerde implementation is work in progress and should be included in a future release, making a custom List Serde obsolete: https://issues.apache.org/jira/browse/KAFKA-8326
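If you are on a client version that already ships this (an assumption about your Kafka version; the JIRA above was implemented as Serdes.ListSerde), the custom Serde can be replaced by the built-in one:
// Built-in list serde: backing list type ArrayList, element serde Long
Serde<List<Long>> aggregatedSerde = Serdes.ListSerde(ArrayList.class, Serdes.Long());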

Do we need to explicitly release memory for objects transferred using netty?

I am using Netty for data transfer in a project. This part of the project is basically sending and receiving data packets. Each packet is an object that encapsulates some metadata and an actual file chunk. When I try to transfer large files, say 5GB, memory usage somehow goes up and is not freed after the file has been transferred. For a 5GB file, memory usage starts from ~1GB, goes up to 15GB, and stays there until I quit the program.
This is my packet object:
public class BlockPacket
{
private String requestId;
private String blockId;
private int packetSeqNo=0;
private byte[] data;
public BlockPacket(){
}
public String toString()
{
return "BlockPacket [requestId=" + requestId + ", blockId=" + blockId + ", packetSeqNo=" + packetSeqNo + "]";
}
public String getRequestId()
{
return requestId;
}
public void setRequestId(String requestId)
{
this.requestId = requestId;
}
public String getBlockId()
{
return blockId;
}
public void setBlockId(String blockId)
{
this.blockId = blockId;
}
public int getPacketSeqNo()
{
return packetSeqNo;
}
public void setPacketSeqNo(int no)
{
this.packetSeqNo = no;
}
public byte[] getData()
{
return data;
}
public void setData(byte[] data)
{
this.data = data;
}
}
Here is channelRead in the ServerHandler:
@Override
public void channelRead(ChannelHandlerContext ctx, Object obj)
{
if (obj instanceof BlockPacket)
{
BlockPacket packet = (BlockPacket) obj;
//Do something with packet
}else if (obj instanceof Block){
Block block = (Block) obj;
//Do something with block
}else{
//Do something else
}
}
Here are my custom encode and decode methods:
@Override
protected void encode(ChannelHandlerContext ctx, Object in, ByteBuf out) throws Exception {
ByteArrayOutputStream outStream = new ByteArrayOutputStream();
Output output = new Output(outStream);
kryo.writeClassAndObject(output, in);
output.flush();
byte[] outArray = outStream.toByteArray();
out.writeInt(outArray.length);
out.writeBytes(outArray);
}
@Override
protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) throws Exception {
if (in.readableBytes() < 4)
return;
in.markReaderIndex();
boolean read = false;
if(len == 0){
read = true;
len = in.readInt();
}
if ((in.readableBytes() < len && read) || (in.readableBytes() < len+4 && !read)) {
in.resetReaderIndex();
read = false;
return;
}
if (!read)
in.readInt();
byte[] buf = new byte[len];
in.readBytes(buf);
Input input = new Input(buf);
Object object = kryo.readClassAndObject(input);
out.add(object);
}
I went through Reference counted objects, but I could not solve my problem. The examples in that tutorial are generally for ByteBuf, which has a release() method. However, my packet object is a simple Java object. Any help is appreciated.
