I'm working on a Spring Boot project with a microservice architecture, and I use Kafka as an event bus to exchange data between some of the services. I also have JUnit tests: some cover parts of the application that don't need the bus, while others do need it and use an embedded Kafka broker.
The problem is that when I launch all my tests, they take a very long time and fail because each of them tries to connect to the embedded Kafka broker (connection not available), even though most of them don't need the Kafka bus to do their job.
Is it possible to disable the loading of the Kafka components for these tests and only enable it for the ones that require it?
This is how I usually write my JUnit test classes so that they don't connect to Kafka brokers for each test.
Mock the REST API, if your Kafka client (producer/consumer) is integrated with a REST API:
public class MyMockedRESTAPI {

    public MyMockedRESTAPI() {
    }

    // Return a canned response, or throw when the test wants to simulate a REST failure
    public APIResponseWrapper apiResponseWrapper(parameters..) throws RestClientException {
        if (throwException) {
            throw new RestClientException(....);
        }
        return new APIResponseWrapper();
    }
}
A factory class that generates an incoming Kafka event and the REST API request/response wrappers:
public class MockFactory {

    private static final Gson gson = new Gson();

    public static KAFKAEvent generateKAFKAEvent() {
        KAFKAEvent kafkaEvent = new KAFKAEvent();
        kafkaEvent.set...
        kafkaEvent.set...
        kafkaEvent.set...
        return kafkaEvent;
    }

    public static ResponseEntity<APIResponse> createAPIResponse() {
        APIResponse response = new APIResponse();
        return new ResponseEntity<>(response, HttpStatus.OK);
    }
}
A test runner class:
@RunWith(SpringJUnit4ClassRunner.class)
public class KAFKAJUnitTest {
    // your assertions are declared here
}
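To make this concrete, here is a minimal sketch of such a test using Mockito only (the handler and REST client names are hypothetical, not from the original project); because nothing Kafka-related is wired up, no embedded broker is started:
@RunWith(MockitoJUnitRunner.class)
public class KafkaEventHandlerTest {

    @Mock
    private MyRestClient restClient;            // hypothetical REST client the handler calls

    @InjectMocks
    private MyKafkaEventHandler eventHandler;   // hypothetical class that normally handles the Kafka event

    @Test
    public void handlesEventWithoutABroker() {
        KAFKAEvent event = MockFactory.generateKAFKAEvent();
        when(restClient.call(any())).thenReturn(MockFactory.createAPIResponse());

        eventHandler.handle(event);             // plain unit test: no Kafka connection involved

        verify(restClient).call(any());
    }
}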
You can also refer to: https://www.baeldung.com/spring-boot-kafka-testing
A good practice is to avoid sending messages to Kafka while testing code in your isolated microservice scope. But when you need to run an integration test (several microservices at the same time), you sometimes need to activate Kafka messaging.
So my approach is:
1 - Activate/deactivate loading of the Kafka configuration as required:
#ConditionalOnProperty(prefix = "my.kafka.consumer", value = "enabled", havingValue = "true", matchIfMissing = false)
#Configuration
public class KafkaConsumerConfiguration {
...
}
#ConditionalOnProperty(prefix = "my.kafka.producer", value = "enabled", havingValue = "true", matchIfMissing = false)
#Configuration
public class KafkaProducerConfiguration {
...
}
Then you will be able to activate/deactivate loading of the consumer and producer as needed.
Examples:
@SpringBootApplication
@Import(KafkaConsumerConfiguration.class)
public class MyMicroservice_1 {
    public static void main(String[] args) {
        SpringApplication.run(MyMicroservice_1.class, args);
    }
}
or
@SpringBootApplication
@Import(KafkaProducerConfiguration.class)
public class MyMicroservice_2 {
    public static void main(String[] args) {
        SpringApplication.run(MyMicroservice_2.class, args);
    }
}
or a microservice that needs both configurations:
@SpringBootApplication
@Import(value = { KafkaProducerConfiguration.class, KafkaConsumerConfiguration.class })
public class MyMicroservice_3 {
    public static void main(String[] args) {
        SpringApplication.run(MyMicroservice_3.class, args);
    }
}
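With the configuration classes guarded like this, a test that doesn't need the bus can simply leave both flags off (or set them to false explicitly) and neither Kafka configuration is loaded; a minimal sketch using the property names from the @ConditionalOnProperty annotations above:
@RunWith(SpringRunner.class)
@SpringBootTest(properties = {
        "my.kafka.consumer.enabled=false",   // KafkaConsumerConfiguration is not loaded
        "my.kafka.producer.enabled=false"    // KafkaProducerConfiguration is not loaded
})
public class BusinessLogicWithoutKafkaTest {

    @Test
    public void contextLoadsWithoutKafkaBeans() {
        // no Kafka consumer/producer beans exist, so nothing tries to reach a broker
    }
}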
2 - You also need to make sending messages depend on the current Spring profile. To do that you can override the send method of the KafkaTemplate object:
#ConditionalOnProperty(prefix = "my.kafka.producer", value = "enabled", havingValue = "true", matchIfMissing = false)
#Configuration
public class KafkaProducerConfiguration {
...
#Resource
Environment environment;
#Bean
public KafkaTemplate<String, String> kafkaTemplate() {
return new KafkaTemplate<>(producerFactory()) {
#Override
protected ListenableFuture<SendResult<String, String>> doSend(ProducerRecord<String, String> producerRecord) {
if (Arrays.asList(environment.getActiveProfiles()).contains("test")) {
return null;
}
return super.doSend(producerRecord);
}
};
}
#Bean
public ProducerFactory<String, String> producerFactory() {
Map<String, Object> props = new HashMap<>();
...
return new DefaultKafkaProducerFactory<>(props);
}
}
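On the test side, activating the test profile is then enough to turn the overridden doSend() into a no-op; a minimal sketch:
@RunWith(SpringRunner.class)
@SpringBootTest(properties = "my.kafka.producer.enabled=true")
@ActiveProfiles("test")   // makes doSend() above return without contacting a broker
public class ServiceWithSilencedProducerTest {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    public void sendIsSilentlyDropped() {
        kafkaTemplate.send("some-topic", "payload");   // dropped by the overridden doSend()
    }
}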
Related
I'm new to Kafka and want to persist data from Kafka topics to database tables (each topic flows to a specific table). I know Kafka Connect exists and could be used for this, but there are reasons why this approach is preferred.
Unfortunately, only one topic writes to the database. Kafka does not seem to run all process() calls concurrently: either MyFirstData or MySecondData is written to the database, but never both at the same time.
From what I've read, there is the option of overriding init() from the Kafka Streams Processor interface, which offers context.forward(); I'm not sure whether that would help or how to use it in my use case.
I use Spring Cloud Stream (but I got the same behaviour with Kafka DSL and Processor API implementations).
My code snippet:
Configuring the consumers:
@Configuration
@RequiredArgsConstructor
public class DatabaseProcessorConfiguration {

    private final MyFirstDao myFirstDao;
    private final MySecondDao mySecondDao;

    @Bean
    public Consumer<KStream<GenericData.Record, GenericData.Record>> myFirstDbProcessor() {
        return stream -> stream.process(() -> {
            return new MyFirstDbProcessor(myFirstDao);
        });
    }

    @Bean
    public Consumer<KStream<GenericRecord, GenericRecord>> mySecondDbProcessor() {
        return stream -> stream.process(() -> new MySecondDbProcessor(mySecondDao));
    }
}
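For reference, function beans like these are normally bound to their topics through binder properties; a hedged sketch of what that typically looks like with Spring Cloud Stream (the topic names are placeholders, not taken from the original project):
spring.cloud.function.definition=myFirstDbProcessor;mySecondDbProcessor
spring.cloud.stream.bindings.myFirstDbProcessor-in-0.destination=my-first-topic
spring.cloud.stream.bindings.mySecondDbProcessor-in-0.destination=my-second-topic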
MyFirstDbProcessor and MySecondDbProcessor are analogous to this:
@Slf4j
@RequiredArgsConstructor
public class MyFirstDbProcessor implements Processor<GenericData.Record, GenericData.Record, Void, Void> {

    private final MyFirstDao myFirstDao;

    @Override
    public void process(Record<GenericData.Record, GenericData.Record> record) {
        CdcRecordAdapter adapter = new CdcRecordAdapter(record.key(), record.value());
        MyFirstTopicKey myFirstTopicKey = adapter.getKeyAs(MyFirstTopicKey.class);
        MyFirstTopicValue myFirstTopicValue = adapter.getValueAs(MyFirstTopicValue.class);
        MyFirstData data = PersistenceMapper.map(myFirstTopicKey, myFirstTopicValue);
        switch (myFirstTopicValue.getCrudOperation()) {
            case UPDATE, INSERT -> myFirstDao.persist(data);
            case DELETE -> myFirstDao.delete(data);
            default -> System.err.println("unimplemented CDC operation streamed by kafka");
        }
    }
}
My DAO implementations: I tried implementing MyFirstRepository with both JpaRepository and ReactiveCrudRepository, but got the same behaviour. MySecondRepository is implemented analogously to MyFirstRepository.
@Component
@RequiredArgsConstructor
public class MyFirstDaoImpl implements MyFirstDao {

    private final MyFirstRepository myFirstRepository;

    @Override
    public MyFirstData persist(MyFirstData myFirstData) {
        Optional<MyFirstData> dataOptional = myFirstRepository.findById(myFirstData.getId());
        if (dataOptional.isPresent()) {
            var data = dataOptional.get();
            myFirstData.setCreatedDate(data.getCreatedDate());
        }
        return myFirstRepository.save(myFirstData);
    }

    @Override
    public void delete(MyFirstData myFirstData) {
        System.out.println("delete() from transaction detail dao called");
        myFirstRepository.delete(myFirstData);
    }
}
I'm using Avro to generate a Java class (Heartbeat), and I'm using the Spring Cloud Stream messaging Processor in order to push a message to Kafka using this Heartbeat class.
So here is my service:
@Service
public class HeartbeatServiceImpl implements HeartbeatService {

    private Processor processor;

    public HeartbeatServiceImpl(Processor processor) {
        this.processor = processor;
    }

    @Override
    public boolean sendHeartbeat(Heartbeat heartbeat) {
        Message<Heartbeat> message =
                MessageBuilder.withPayload(heartbeat).setHeader(KafkaHeaders.MESSAGE_KEY, "MY_KEY").build();
        return processor.output().send(message);
    }
}
I have this consumer in my test package:
@Component
public class HeartbeatKafkaConsumer {

    private CountDownLatch latch = new CountDownLatch(1);
    private String payload = null;

    @KafkaListener(topics = "HEARTBEAT")
    public void receive(ConsumerRecord<?, ?> consumerRecord) {
        this.payload = consumerRecord.toString();
        this.latch.countDown();
    }

    public CountDownLatch getLatch() {
        return latch;
    }

    public Object getPayload() {
        return payload;
    }
}
Now in my actual test class I have this:
public class HeartbeatServiceImplIntegrationTest {

    @Autowired
    private HeartbeatServiceImpl heartbeatService;

    @Autowired
    private HeartbeatKafkaConsumer heartbeatKafkaConsumer;

    @Test
    public void assertHeartbeatPushedToKafka() throws InterruptedException {
        Heartbeat heartbeat =
                Heartbeat.newBuilder().setID("my-Test ID").setINPUTSOURCE("my-Test IS")
                        .setMSGID("my-Test 123").setMSGTIME(12345L).setRECEIVEDTIME(12345L).build();
        boolean isMessageSent = heartbeatService.sendHeartbeat(heartbeat);
        assertThat(isMessageSent).isTrue();
        heartbeatKafkaConsumer.getLatch().await(10000, TimeUnit.MILLISECONDS);
        assertThat(heartbeatKafkaConsumer.getLatch().getCount()).isEqualTo(0L);
        assertThat(heartbeatKafkaConsumer.getPayload()).isEqualTo(heartbeat);
    }
}
I can see that the message does arrive on Kafka by running ksql, so the message is there as expected.
I also receive a message in my HeartbeatKafkaConsumer, but when I do the assertion I get this error:
Expecting:
<"ConsumerRecord(topic = HEARTBEAT, partition = 0, leaderEpoch = 0, offset = 4, CreateTime = 1622134829899, serialized key size = 12, serialized value size = 75, headers = RecordHeaders(headers = [], isReadOnly = false), key = [B#765aa560, value = [B#3582e1cd)">
to be equal to:
<{"ID": "my-Test ID", "TYPE": "null", "MSG_ID": "my-Test 123", "MSG_TIME": 12345, "RECEIVED_TIME": 12345, "INPUT_SOURCE": "my-Test IS", "SBK_FEED_PROVIDER_ID": "null", "SBK_FEED_PROVIDER_NAME": "null"}>
I've tried reading from my HeartbeatKafkaConsumer in many different ways, but I just couldn't parse the value back to a Heartbeat.
How do I consume from Kafka as a Heartbeat so I can test it against the message that was originally sent?
I'm not even able to retrieve it as a String.
And here is my application.properties configuration for Kafka:
spring.cloud.stream.default.producer.useNativeEncoding=true
spring.cloud.stream.default.consumer.useNativeEncoding=true
spring.cloud.stream.bindings.input.destination=HEARTBEAT
spring.cloud.stream.bindings.input.content-type=application/*+avro
spring.cloud.stream.bindings.output.destination=HEARTBEAT
spring.cloud.stream.bindings.output.content-type=application/*+avro
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.url=http://localhost:8081
spring.cloud.stream.kafka.binder.producer-properties.key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.cloud.stream.kafka.binder.producer-properties.value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.cloud.stream.kafka.binder.consumer-properties.schema.registry.url=http://localhost:8081
spring.cloud.stream.kafka.binder.consumer-properties.key.serializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.kafka.binder.consumer-properties.value.serializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.kafka.binder.consumer-properties.specific.avro.reader=true
spring.kafka.bootstrap-servers=127.0.0.1:9092
spring.kafka.consumer.group-id=myclient
spring.kafka.consumer.auto-offset-reset=earliest
Use consumerRecord.value() instead of toString().
It will be a byte[] which you can pass into an ObjectMapper to deserialize it as a Heartbeat.
Or, simply configure the consumer to use a JsonDeserializer and consumerRecord.value() will be the heartbeat. You will need to configure the deserializer to tell it which type to create.
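For that option, the target type is usually supplied through consumer properties; a hedged sketch using Spring Boot's spring.kafka properties (the fully qualified Heartbeat class name is a placeholder, and this assumes the consumed payload is JSON rather than Avro):
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.value.default.type=com.example.Heartbeat
spring.kafka.consumer.properties.spring.json.trusted.packages=com.example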
The third (and simplest) option is to add a JsonMessageConverter @Bean (Boot will wire it into the listener container) and change your method to:
@KafkaListener(topics = "HEARTBEAT")
public void receive(Heartbeat heartbeat) {
    ...
}
The framework tells the converter what type to create, from the method signature.
https://docs.spring.io/spring-kafka/docs/current/reference/html/#messaging-message-conversion
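The converter bean itself can be as small as this (a minimal sketch; Boot picks up any RecordMessageConverter bean for its auto-configured listener container factory):
// from org.springframework.kafka.support.converter
@Bean
public RecordMessageConverter jsonConverter() {
    return new JsonMessageConverter();
}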
I have a requirement to start a Kafka consumer automatically on application startup. But before doing this, I would like to know whether there are any other consumers already polling that topic.
I can see these details on the Kafka Confluent site, but I would like to get them through Java code.
I know there are some .sh scripts in the Kafka bin folder, but I need to do this in a production environment and through Java code.
You can use the AdminClient to describe the consumer groups:
@SpringBootApplication
public class So67668813Application {

    public static void main(String[] args) {
        SpringApplication.run(So67668813Application.class, args);
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so67668813").partitions(4).replicas(1).build();
    }

    @KafkaListener(id = "so67668813", topics = "so67668813")
    public void listen(String in) {
        System.out.println(in);
    }

    @Bean
    public ApplicationRunner runner(KafkaAdmin admin) {
        return args -> {
            Thread.sleep(2000);
            try (AdminClient client = AdminClient.create(admin.getConfigurationProperties())) {
                Map<String, ConsumerGroupDescription> groups =
                        client.describeConsumerGroups(Collections.singletonList("so67668813"))
                                .all()
                                .get(10, TimeUnit.SECONDS);
                System.out.println(groups);
            }
        };
    }
}
{so67668813=(groupId=so67668813, isSimpleConsumerGroup=false, members=(memberId=consumer-so67668813-1-620a9dd7-c995-461d-bb7d-c141457a5799, groupInstanceId=null, clientId=consumer-so67668813-1, host=/127.0.0.1, assignment=(topicPartitions=so67668813-3,so67668813-1,so67668813-2,so67668813-0)), partitionAssignor=range, state=Stable, coordinator=localhost:9092 (id: 0 rack: null), authorizedOperations=null)}
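To answer the original question programmatically you can then inspect the returned descriptions, for example by checking whether any member of the group is currently assigned partitions of the topic; a rough sketch building on the groups map above:
boolean alreadyPolled = groups.values().stream()
        .flatMap(description -> description.members().stream())            // active members of each group
        .flatMap(member -> member.assignment().topicPartitions().stream()) // partitions they are assigned
        .anyMatch(tp -> tp.topic().equals("so67668813"));
System.out.println("Topic already has active consumers: " + alreadyPolled);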
I was working with the Karate framework to test my REST service and it works great. However, I have a service that consumes a message from a Kafka topic, persists it to Mongo, and finally notifies Kafka.
I made a Java producer in my Karate project; it is called from JS so it can be used by a feature.
Then I have a consumer to check the message.
Feature:
* def kafkaProducer = read('../js/KafkaProducer.js')
JS:
function(kafkaConfiguration){
    var Producer = Java.type('x.y.core.producer.Producer');
    var producer = new Producer(kafkaConfiguration);
    return producer;
}
Java:
public class Producer {

    private static final Logger LOGGER = LoggerFactory.getLogger(Producer.class);
    private static final String KEY = "C636E8E238FD7AF97E2E500F8C6F0F4C";

    private KafkaConfiguration kafkaConfiguration;
    private ObjectMapper mapper;
    private AESEncrypter aesEncrypter;

    public Producer(KafkaConfiguration kafkaConfiguration) {
        kafkaConfiguration.getProperties().put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        kafkaConfiguration.getProperties().put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
        this.kafkaConfiguration = kafkaConfiguration;
        this.mapper = new ObjectMapper();
        this.aesEncrypter = new AESEncrypter(KEY);
    }

    public String produceMessage(String payload) {
        // Just notify kafka with payload and return id of payload
    }
The other class:
public class KafkaConfiguration {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaConfiguration.class);

    private Properties properties;

    public KafkaConfiguration(String host) {
        try {
            properties = new Properties();
            properties.put(BOOTSTRAP_SERVERS_CONFIG, host);
            properties.put(ConsumerConfig.GROUP_ID_CONFIG, "karate-integration-test");
            properties.put(ConsumerConfig.CLIENT_ID_CONFIG, "offset123");
            properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
            properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        } catch (Exception e) {
            LOGGER.error("Fail creating the consumer...", e);
            throw e;
        }
    }

    public Properties getProperties() {
        return properties;
    }

    public void setProperties(Properties properties) {
        this.properties = properties;
    }
}
I would like to use the producer code with an annotation, the way Cucumber does, e.g.:
@Then("^Notify kafka with payload (-?\\d+)$")
public void validateResult(String payload) throws Throwable {
    new Producer(kafkaConfiguration).produceMessage(payload);
}
and in the feature use
Then Notify kafka with payload "{example:value}"
I want to do that because I want to reuse this code in a base project so it can be included in other projects.
If annotations don't work, maybe you can suggest another way to do it.
The answer is simple, use normal Java / Maven concepts. Move the common Java code to the "main" packages (src/main/java). Now all you need to do is build a JAR and add it as a dependency to any Karate project.
The last piece of the puzzle is this: use the classpath: prefix to refer to any features or JS files in the JAR. Karate will be able to pick them up.
EDIT: Sorry, Karate does not support Cucumber or step definitions. It has a much simpler approach. Please read this for details: https://github.com/intuit/karate/issues/398
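For example, once the shared helper is packaged in a dependency JAR, a feature in the consuming project can load it like this (a small sketch; the path is illustrative and kafkaConfiguration is defined the same way as in the original feature):
* def kafkaProducer = read('classpath:js/KafkaProducer.js')
* def producer = kafkaProducer(kafkaConfiguration)
* def id = producer.produceMessage('{"example":"value"}')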
We have an application using Spring Boot and its JMS facility. At runtime, we have different producers that jump online and tell our application the name of the topic or queue to listen to. Right now, we have:
@JmsListener(destination = "helloworld.q")
public void receive(String message) {
    LOGGER.info("received message='{}'", message);
}
which works when we send a message to the helloworld.q topic. The problem is, we won't know what the name of the topic will be until runtime, and JmsListener seems to want a constant expression.
Message producers will hook into our ActiveMQ instance and broadcast a message telling us we need to start listening to their topic, such as "Wasabi", "WhitePaper", "SatelliteMajor", "BigBoosters", etc. There is no way to know at runtime which topics we'll need to start listening to.
I've read the Spring documentation that explains how to listen to topics/queues at runtime (sort of):
@Configuration
@EnableJms
public class ReceiverConfig implements JmsListenerConfigurer {

    @Override
    public void configureJmsListeners(JmsListenerEndpointRegistrar registrar) {
        SimpleJmsListenerEndpoint endpoint = new SimpleJmsListenerEndpoint();
        endpoint.setId("myJmsEndpoint");
        endpoint.setDestination("anotherQueue");
        endpoint.setMessageListener(message -> {
            // processing
        });
        registrar.registerEndpoint(endpoint);
    }

    // other methods...
}
I've put that into our receiver config as a test, and it does get called when we send a message. The problem is that Spring calls all of this automatically, and we don't know where or how to give this method the name of the topic/queue the endpoint needs to listen to. Also, the message listener never seems to get called, but that's a separate problem; I'm sure we can solve it once we can at least pass in the custom topic or queue to listen to.
We're using Spring 2.x.
You can use a property placeholder for the destination name
@SpringBootApplication
public class So56226984Application {

    public static void main(String[] args) {
        SpringApplication.run(So56226984Application.class, args);
    }

    @JmsListener(destination = "${foo.bar}")
    public void listen(String in) {
        System.out.println(in);
    }

    @Bean
    public ApplicationRunner runner(JmsTemplate template) {
        return args -> template.convertAndSend("baz", "qux");
    }
}
Then set the property, e.g. in application.yml for a Spring Boot app, or as a command-line property when launching the JVM:
-Dfoo.bar=baz
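The equivalent entry in application.yml would simply be:
foo:
  bar: baz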
EDIT
You can make the listener bean a prototype and adjust an environment property.
@SpringBootApplication
public class So56226984Application {

    public static void main(String[] args) {
        SpringApplication.run(So56226984Application.class, args).close();
    }

    @Bean
    public ApplicationRunner runner(JmsTemplate template, JmsListenerEndpointRegistry registry,
            ConfigurableApplicationContext context) {
        return args -> {
            Scanner scanner = new Scanner(System.in);
            String queue = scanner.nextLine();
            Properties props = new Properties();
            context.getEnvironment().getPropertySources().addLast(new PropertiesPropertySource("queues", props));
            while (!"quit".equals(queue)) {
                System.out.println("Adding " + queue);
                props.put("queue.name", queue);
                context.getBean("listener", Listener.class);
                template.convertAndSend(queue, "qux sent to " + queue);
                System.out.println("There are now " + registry.getListenerContainers().size() + " containers");
                queue = scanner.nextLine();
            }
            scanner.close();
        };
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public Listener listener() {
        return new Listener();
    }

    public static class Listener {

        @JmsListener(destination = "${queue.name}")
        public void listen(String in) {
            System.out.println(in);
        }
    }
}