Having an issue testing Spring Cloud Stream with a Kafka binding - Java

I have the following binding:
public interface KafkaBinding {
    String DATA_OUT = "dataout";

    @Output(DATA_OUT)
    MessageChannel dataOut();
}
Here is the Kafka utility:
@EnableBinding(KafkaBinding.class)
public class KafkaStreamUtil {

    private final MessageChannel out;

    public KafkaStreamUtil(KafkaBinding binding) {
        this.out = binding.dataOut();
    }

    public void SendToKafkaTopic(List<Data> dataList) {
        dataList
            .stream()
            .forEach(this::sendMessage);
    }

    private void sendMessage(Data data) {
        Message<Data> message = MessageBuilder
            .withPayload(data)
            .setHeader(KafkaHeaders.MESSAGE_KEY, data.getRequestId().getBytes())
            .build();
        try {
            this.out.send(message);
        } catch (Exception e) {
            log.error(e.getMessage(), e);
        }
    }
}
My test class:
@RunWith(SpringRunner.class)
@ActiveProfiles("test")
@SpringBootTest(classes = {KafkaStreamUtil.class, KafkaBinding.class})
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_EACH_TEST_METHOD)
public class KafkaStreamUtilTest {

    private static final String TEST_TOPIC1 = "data";
    private static final String GROUP_NAME = "embeddedKafkaApplication";

    @ClassRule
    public static EmbeddedKafkaRule kafkaRule =
            new EmbeddedKafkaRule(1, true, TEST_TOPIC1);

    public static EmbeddedKafkaBroker embeddedKafka = kafkaRule.getEmbeddedKafka();

    @Autowired
    private KafkaStreamUtil kUtil;

    @Autowired
    private KafkaBinding binding;

    @BeforeClass
    public static void setupProperties() {
        System.setProperty("spring.cloud.stream.kafka.binder.brokers", embeddedKafka.getBrokersAsString());
        System.setProperty("spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms", "1000");
        System.setProperty("spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde", "org.apache.kafka.common.serialization.Serdes$StringSerde");
        System.setProperty("spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde", "org.apache.kafka.common.serialization.Serdes$StringSerde");
        System.setProperty("spring.cloud.stream.bindings.dataout.destination", "data");
        System.setProperty("spring.cloud.stream.bindings.dataout.producer.header-mode", "raw");
        System.setProperty("spring.autoconfigure.exclude", "org.springframework.cloud.stream.test.binder.TestSupportBinderAutoConfiguration");
        System.setProperty("spring.kafka.consumer.value-deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        System.setProperty("spring.cloud.stream.bindings.input.consumer.headerMode", "raw");
        System.setProperty("spring.cloud.stream.bindings.input.group", "embeddedKafkaApplication");
        System.setProperty("spring.kafka.consumer.group-id", "EmbeddedKafkaIntTest");
    }

    @Before
    public void setUp() throws Exception {
        kUtil = new KafkaStreamUtil(binding);
    }

    @Test
    public void sendToKafkaTopic() {
        kUtil.SendToKafkaTopic(dataList);
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps(GROUP_NAME, "false", embeddedKafka);
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put("key.deserializer", StringDeserializer.class);
        consumerProps.put("value.deserializer", StringDeserializer.class);
        DefaultKafkaConsumerFactory<String, String> cf = new DefaultKafkaConsumerFactory<>(consumerProps);
        Consumer<String, String> consumer = cf.createConsumer();
        consumer.subscribe(Collections.singleton(TEST_TOPIC1));
        ConsumerRecords<String, String> records = consumer.poll(10_000);
        consumer.commitSync();
        consumer.close();
        Assert.assertNotNull(records);
    }
}
I am unable to get the binding into the Kafka util in the test; the binding is always null. Please let me know what I am missing. I am able to test with @SpringBootTest, but it loads all beans; I just want to load the components required for this test.
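A hedged sketch of one way to keep the context small while still bootstrapping the binder infrastructure (the nested TestConfig class here is an assumption, not something from the original code):

// Sketch only: a minimal test configuration so the Spring Cloud Stream binder beans exist;
// only the binding and the utility under test are pulled in explicitly.
@RunWith(SpringRunner.class)
@SpringBootTest(classes = KafkaStreamUtilTest.TestConfig.class)
public class KafkaStreamUtilTest {

    @EnableAutoConfiguration
    @Import(KafkaStreamUtil.class)   // KafkaStreamUtil carries @EnableBinding(KafkaBinding.class)
    static class TestConfig {
    }

    @Autowired
    private KafkaStreamUtil kUtil;

    @Autowired
    private KafkaBinding binding;

    // ... rest of the test as above ...
}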

Related

Trigger one Kafka consumer by using values of another consumer in Spring Kafka

I have one scheduler which produces one event. My consumer consumes this event. The payload of this event is JSON with the fields below:
private String topic;
private String partition;
private String filterKey;
private long CustId;
Now I need to trigger one more consumer, which will take all this information from the response of the first consumer.
@KafkaListener(topics = "<**topic-name-from-first-consumer-response**>", groupId = "group", containerFactory = "kafkaListenerFactory")
public void consumeJson(List<User> data, Acknowledgment acknowledgment,
        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) List<Integer> partitions,
        @Header(KafkaHeaders.OFFSET) List<Long> offsets) {
    // consumer code goes here...
}
I need to create some dynamic variable which I can pass in place of the topic name.
Similarly, I am using filtering in the configuration file, and I need to pass the key dynamically in the configuration:
factory.setRecordFilterStrategy(new RecordFilterStrategy<String, Object>() {
    @Override
    public boolean filter(ConsumerRecord<String, Object> consumerRecord) {
        if (consumerRecord.key().equals("**Key will go here**")) {
            return false;
        }
        else {
            return true;
        }
    }
});
How can we dynamically inject these values from the response of the first consumer and trigger the second consumer? Both consumers are in the same application.
You cannot do that with an annotated listener; the configuration is only used during initialization. You would need to create a listener container yourself (using the ConcurrentKafkaListenerContainerFactory) to create a listener dynamically.
EDIT
Here's an example.
@SpringBootApplication
public class So69134055Application {

    public static void main(String[] args) {
        SpringApplication.run(So69134055Application.class, args);
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so69134055").partitions(1).replicas(1).build();
    }
}
@Component
class Listener {

    private static final Logger log = LoggerFactory.getLogger(Listener.class);

    private static final Method otherListen;

    static {
        try {
            otherListen = Listener.class.getDeclaredMethod("otherListen", List.class);
        }
        catch (NoSuchMethodException | SecurityException ex) {
            throw new IllegalStateException(ex);
        }
    }

    private final ConcurrentKafkaListenerContainerFactory<String, String> factory;

    private final MessageHandlerMethodFactory methodFactory;

    private final KafkaAdmin admin;

    private final KafkaTemplate<String, String> template;

    public Listener(ConcurrentKafkaListenerContainerFactory<String, String> factory, KafkaAdmin admin,
            KafkaTemplate<String, String> template, KafkaListenerAnnotationBeanPostProcessor<?, ?> bpp) {

        this.factory = factory;
        this.admin = admin;
        this.template = template;
        this.methodFactory = bpp.getMessageHandlerMethodFactory();
    }

    @KafkaListener(id = "so69134055", topics = "so69134055")
    public void listen(String topicName) {
        try (AdminClient client = AdminClient.create(this.admin.getConfigurationProperties())) {
            NewTopic topic = TopicBuilder.name(topicName).build();
            client.createTopics(List.of(topic)).all().get(10, TimeUnit.SECONDS);
        }
        catch (Exception e) {
            log.error("Failed to create topic", e);
        }
        ConcurrentMessageListenerContainer<String, String> container =
                this.factory.createContainer(new TopicPartitionOffset(topicName, 0));
        BatchMessagingMessageListenerAdapter<String, String> adapter =
                new BatchMessagingMessageListenerAdapter<>(this, otherListen);
        adapter.setHandlerMethod(new HandlerAdapter(
                this.methodFactory.createInvocableHandlerMethod(this, otherListen)));
        FilteringBatchMessageListenerAdapter<String, String> filtered =
                new FilteringBatchMessageListenerAdapter<>(adapter, record -> !record.key().equals("foo"));
        container.getContainerProperties().setMessageListener(filtered);
        container.getContainerProperties().setGroupId("group.for." + topicName);
        container.setBeanName(topicName + ".container");
        container.start();
        IntStream.range(0, 10).forEach(i -> this.template.send(topicName, 0, i % 2 == 0 ? "foo" : "bar", "test" + i));
    }

    void otherListen(List<String> others) {
        log.info("Others: {}", others);
    }
}
spring.kafka.consumer.auto-offset-reset=earliest
Output - showing that the filter was applied to the records with bar in the key.
Others: [test0, test2, test4, test6, test8]

Integration testing in multi module Maven project with Spring

I have a multi-module Maven project with several modules (parent, service, updater1, updater2).
The @SpringBootApplication is in the 'service' module, and the others don't have artifacts.
'updater1' is a module which has a Kafka listener and an HTTP client; when it receives a Kafka event, it sends a request to an external API. I want to create integration tests in this module with Testcontainers, so I've created the containers and a Kafka producer that uses a KafkaTemplate to send events to my consumer.
My problem is that the Kafka producer is autowired as null, so the tests throw a NullPointerException. I think it is a Spring configuration problem, but I can't find it. Can you help me? Thanks!
This is my test class:
@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = {KafkaConfiguration.class, CacheConfiguration.class, ClientConfiguration.class})
public class InvoicingTest {

    @ClassRule
    public static final Containers containers = Containers.Builder.aContainer()
            .withKafka()
            .withServer()
            .build();

    private final MockHttpClient mockHttpClient =
            new MockHttpClient(containers.getHost(SERVER),
                    containers.getPort(SERVER));

    @Autowired
    private KafkaEventProducer kafkaEventProducer;

    @BeforeEach
    @Transactional
    void setUp() {
        mockHttpClient.reset();
    }

    @Test
    public void createElementSuccesfullResponse() throws ExecutionException, InterruptedException, TimeoutException {
        mockHttpClient.whenPost("/v1/endpoint")
                .respond(HttpStatusCode.OK_200);
        kafkaEventProducer.produce("src/test/resources/event/invoiceCreated.json");
        mockHttpClient.verify();
    }
}
And this is the event producer:
@Component
public class KafkaEventProducer {

    private final KafkaTemplate<String, String> kafkaTemplate;
    private final String topic;

    @Autowired
    KafkaEventProducer(KafkaTemplate<String, String> kafkaTemplate,
            @Value("${kafka.topic.invoicing.name}") String topic) {
        this.kafkaTemplate = kafkaTemplate;
        this.topic = topic;
    }

    public void produce(String event) {
        kafkaTemplate.send(topic, event);
    }
}
You haven't detailed how KafkaEventProducer is implemented (is it a @Component?), and your test class is annotated with neither @SpringBootTest nor a runner via @RunWith.
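As a rough sketch (assuming the module has, or is given, a Spring Boot configuration class to boot from; the class name here is made up), starting a Boot context in the test lets the producer be autowired:

// Sketch only: InvoicingTestApplication is a hypothetical @SpringBootApplication
// (or @SpringBootConfiguration) class available on the updater1 test classpath.
@ExtendWith(SpringExtension.class)
@SpringBootTest(classes = InvoicingTestApplication.class)
public class InvoicingTest {

    @Autowired
    private KafkaEventProducer kafkaEventProducer; // should no longer be null once the context starts

    // ... containers, mock HTTP client and test methods as before ...
}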
Check out this sample, using the Apache Kafka KafkaProducer:
import org.apache.kafka.clients.producer.KafkaProducer;
public void sendRecord(String topic, String event) {
try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(producerProps(bootstrapServers, false))) {
send(producer, topic, event);
}
}
where
public void send(KafkaProducer<String, byte[]> producer, String topic, String event) {
try {
ProducerRecord<String, byte[]> record = new ProducerRecord<>(topic, event.getBytes());
producer.send(record).get();
} catch (InterruptedException | ExecutionException e) {
fail("Not expected exception: " + e.getMessage());
}
}
protected Properties producerProps(String bootstrapServer, boolean transactional) {
Properties producerProperties = new Properties();
producerProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
producerProperties.put(KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
producerProperties.put(VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
if (transactional) {
producerProperties.put(TRANSACTIONAL_ID_CONFIG, "my-transactional-id");
}
return producerProperties;
}
and bootstrapServers is taken from the Kafka container:
KafkaContainer kafka = new KafkaContainer();
kafka.start();
bootstrapServers = kafka.getBootstrapServers();

Exception in combination of @DirtiesContext and FlywayConfig bean in integration test

I have a problem when adding a new test, and I think the problem is related to @DirtiesContext. I tried removing and adding it, but nothing works in combination. Test 1 uses the application context as well.
The following two tests run together with no issue.
Test 1
@ActiveProfiles({"aws", "local"})
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class UnauthorizedControllerTest {

    private static final Logger LOGGER = LoggerFactory.getLogger(UnauthorizedControllerTest.class);

    @Autowired
    private TestRestTemplate testRestTemplate;

    @LocalServerPort
    private int port;

    @Autowired
    private ApplicationContext context;

    private Map<Class<?>, List<String>> excludedMethodsPerController;

    @Before
    public void setUp() {
        excludedMethodsPerController = excludedMethodsPerController();
    }

    @Test
    public void contextStarts() {
        assertNotNull(context);
    }

    @Test
    public void controllerCall_WithoutAuthorization_ReturnsUnauthorized() {
        Map<String, Object> controllerBeans = context.getBeansWithAnnotation(Controller.class);
        for (Object controllerInstance : controllerBeans.values()) {
            LOGGER.info("Checking controller {}", controllerInstance);
            checkController(controllerInstance);
        }
    }

    public void checkController(Object controllerInstance) {
        // Use AopUtils for the case that spring wraps the controller in a proxy
        Class<?> controllerClass = AopProxyUtils.ultimateTargetClass(controllerInstance);
        Method[] allMethods = controllerClass.getDeclaredMethods();
        for (Method method : allMethods) {
            LOGGER.info("Checking method: {}", method.getName());
            if (!isCallable(controllerClass, method)) {
                continue;
            }
            String urlPrefix = urlPrefix(controllerClass);
            Mapping mapping = Mapping.of(method.getAnnotations());
            for (String url : mapping.urls) {
                for (RequestMethod requestMethod : mapping.requestMethods) {
                    ResponseEntity<String> exchange = exchange(urlPrefix + url, requestMethod);
                    String message = String.format("Failing %s.%s", controllerClass.getName(), method.getName());
                    assertEquals(message, HttpStatus.UNAUTHORIZED, exchange.getStatusCode());
                }
            }
        }
    }

    private ResponseEntity<String> exchange(String apiEndpoint, RequestMethod requestMethod) {
        return testRestTemplate.exchange(url(replacePathVariables(apiEndpoint)), HttpMethod.resolve(requestMethod.name()), null, String.class);
    }

    private String urlPrefix(Class<?> aClass) {
        if (!aClass.isAnnotationPresent(RequestMapping.class)) {
            return "";
        }
        RequestMapping annotation = aClass.getAnnotation(RequestMapping.class);
        return annotation.value()[0];
    }

    private String url(String url) {
        return "http://localhost:" + port + url;
    }

    private boolean isCallable(Class<?> controller, Method method) {
        return Modifier.isPublic(method.getModifiers())
                && !isExcluded(controller, method)
                && !isExternal(controller);
    }

    private boolean isExcluded(Class<?> controller, Method method) {
        List<String> excludedMethodsPerController = this.excludedMethodsPerController.getOrDefault(controller, new ArrayList<>());
        return excludedMethodsPerController.contains(method.getName());
    }

    private boolean isExternal(Class<?> controller) {
        return controller.getName().startsWith("org.spring");
    }

    private String replacePathVariables(String url) {
        return url.replaceAll("\\{[^\\/]+}", "someValue");
    }

    /**
     * There must be a really good reason to exclude the method from being checked.
     *
     * @return The list of urls that must not be checked by the security
     */
    private static Map<Class<?>, List<String>> excludedMethodsPerController() {
        Map<Class<?>, List<String>> methodPerController = new HashMap<>();
        methodPerController.put(AuthenticationController.class, Collections.singletonList("generateAuthorizationToken"));
        methodPerController.put(SystemUserLoginController.class, Arrays.asList("systemUserLogin", "handleException"));
        methodPerController.put(ValidationController.class, Collections.singletonList("isValid"));
        return methodPerController;
    }
}
Test 2
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles({"aws", "local"})
public class RoleAdminControllerAuditTest {

    private static final String DOMAIN_NAME = "testDomain";
    private static final String APP_NAME_1 = "testApp_1";
    private static final String APP_NAME_2 = "testApp_2";
    private static final String ROLE_NAME = "testRole";
    private static final String USER_NAME = "testUser";

    @Autowired
    AuditRepository auditRepository;

    @Autowired
    RoleAdminController roleAdminController;

    @MockBean
    RoleAdminService roleAdminService;

    @MockBean
    RoleAdminInfoBuilder infoBuilder;

    @MockBean
    AppInfoBuilder appInfoBuilder;

    @MockBean
    BoundaryValueService boundaryValueService;

    @MockBean
    RoleService roleService;

    @MockBean
    private SecurityService securityService;

    private static final String URS_USER = "loggedInUser";
    private static final String BOUNDARY_VALUE_KEY = "11";
    private static final String BOUNDARY_VALUE_NAME = "Schulenberg";

    private String auditEventDate = LocalDate.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd"));

    @BeforeClass
    public static void setupTestEnv() {
        // https://github.com/localstack/localstack/issues/592
        System.setProperty("com.amazonaws.sdk.disableCbor", "true");
    }

    @Before
    public void setUp() throws Exception {
        auditRepository.clean();
        when(securityService.getLoggedInUser()).thenReturn(new TestHelper.FakeUser(URS_USER));
        //when(roleService.addRoleToApp(any(), any(), eq(ROLE_NAME))).thenReturn(TestHelper.initRole(ROLE_NAME));
        when(boundaryValueService.findBoundaryValueById(eq(123L))).thenReturn(initBoundaryValue(BOUNDARY_VALUE_KEY, BOUNDARY_VALUE_NAME));
        when(boundaryValueService.findBoundaryValueById(eq(666L))).thenReturn(initBoundaryValue(BOUNDARY_VALUE_KEY, BOUNDARY_VALUE_NAME));
    }

    @Test
    public void addUserAsRoleAdminLogged() throws UserIsAlreadyRoleAdminException, RoleNotFoundException, BoundaryValueNotFoundException {
        User user = initUser(USER_NAME);
        List<RoleAdminInfo> roleAdminInfos = getRoleAdminInfos();
        roleAdminController.addUserAsRoleAdmin(user, roleAdminInfos);
        List<String> result = auditRepository.readAll();
        assertEquals("some data", result.toString());
    }

    @Test
    public void removeUserAsRoleAdminLogged() throws RoleNotFoundException, BoundaryValueNotFoundException {
        User user = initUser(USER_NAME);
        Long roleId = Long.valueOf(444);
        Role role = initRole("test-role");
        role.setApp(initApp("test-app"));
        role.setDomain(initDomain("test-domain"));
        when(roleService.getRoleByIdOrThrow(roleId)).thenReturn(role);
        roleAdminController.removeUserAsRoleAdmin(user, roleId, Long.valueOf(666));
        List<String> result = auditRepository.readAll();
        assertEquals("some data", result.toString());
    }

    @Test
    public void removeRoleAdminPermission() throws RoleNotFoundException, BoundaryValueNotFoundException {
        User user = initUser(USER_NAME);
        List<RoleAdminInfo> roleAdminInfos = getRoleAdminInfos();
        roleAdminController.removeRoleAdminPermission(user, roleAdminInfos);
        List<String> result = auditRepository.readAll();
        assertEquals(1, result.size());
        assertEquals("some data", result.toString());
    }

    private List<RoleAdminInfo> getRoleAdminInfos() {
        RoleAdminInfo info1 = initRoleAdminInfo(DOMAIN_NAME, ROLE_NAME, APP_NAME_1);
        RoleAdminInfo info2 = initRoleAdminInfo(DOMAIN_NAME, ROLE_NAME, APP_NAME_2);
        info1.setBoundaryValueId(123L);
        info1.setBoundaryValueKey(BOUNDARY_VALUE_KEY);
        info1.setBoundaryValueName(BOUNDARY_VALUE_NAME);
        return Arrays.asList(info1, info2);
    }
}
Test 3 (newly added one)
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = FlywayConfig.class)
@DirtiesContext(classMode = DirtiesContext.ClassMode.BEFORE_CLASS)
@ActiveProfiles({"aws", "local"})
public class BoundaryValueDeltaControllerTest {

    private static final String API_V1 = "/api/v1/";

    @Autowired
    TestRestTemplate testRestTemplate;

    @Autowired
    private DomainBuilder domainBuilder;

    @Autowired
    private AppBuilder appBuilder;

    @Autowired
    private UserBuilder userBuilder;

    @Autowired
    private DomainAdminBuilder domainAdminBuilder;

    @Autowired
    private BoundarySetBuilder boundarySetBuilder;

    @MockBean
    private LoginUserProvider loginUserProvider;

    @MockBean
    private LoginTokenService loginTokenService;

    @MockBean
    private BoundaryServiceAdapter serviceAdapter;

    @LocalServerPort
    private int port;

    LoginUserInfo loggedInUser;

    @Before
    public void setUp() {
        clear();
    }

    @After
    public void tearDown() {
        clear();
    }

    @Test
    public void updateBoundaryValuesFromApi() throws UrsBusinessException {
        Domain domain = domainBuilder.persist();
        App app = appBuilder.persist(domain);
        BoundarySet boundarySet = boundarySetBuilder.persist(domain);
        User user = userBuilder.persist(domain.getAuthor().getUsername());
        aLoggedInUser(domain.getAuthor().getUsername());
        domainAdminBuilder.persist(user, domain);
        mockReadInfoFromApiUsingApp();
        ResponseEntity<String> response = callUpdateBoundaryValuesFromApi(domain, boundarySet, app);
        assertEquals(HttpStatus.OK, response.getStatusCode());
        assertNotNull(response.getBody());
    }

    private void mockReadInfoFromApiUsingApp() throws UrsBusinessException {
        BoundaryValueInfo boundaryValueInfo = new BoundaryValueInfo();
        boundaryValueInfo.setBoundaryValueId(10L);
        boundaryValueInfo.setBoundaryValueKey("boundaryValueKey");
        boundaryValueInfo.setBoundaryValueName("boundaryValuename");
        when(serviceAdapter.readInfoFromApiUsingApp(any(), any(), any())).thenReturn(new BoundaryValueInfo[]{boundaryValueInfo});
    }

    private ResponseEntity<String> callUpdateBoundaryValuesFromApi(Domain domain, BoundarySet boundarySet, App app) {
        String url = url(API_V1 + "domains/" + domain.getName() + "/boundarysets/" + boundarySet.getBoundarySetName() + "/app/" + app.getName() + "/updatefromapi/");
        return testRestTemplate.exchange(url, HttpMethod.GET, null, String.class);
    }

    private String url(String url) {
        return "http://localhost:" + port + url;
    }

    private void aLoggedInUser(String username) {
        Claims claims = Jwts.claims();
        claims.put("username", username);
        loggedInUser = LoginUserInfo.parse(claims);
        when(loginUserProvider.getLoggedInUser()).thenReturn(loggedInUser);
        when(loginTokenService.parseToken(any())).thenReturn(loggedInUser);
    }

    private void clear() {
        appBuilder.deleteAll();
        boundarySetBuilder.deleteAll();
        domainAdminBuilder.deleteAll();
        domainBuilder.deleteAll();
        userBuilder.deleteAll();
    }
}
Flyway config
@TestConfiguration
public class FlywayConfig {

    @Bean
    public FlywayMigrationStrategy clean() {
        return flyway -> {
            flyway.clean();
            flyway.migrate();
        };
    }
}
And I am getting the exception below when running them all together.
java.lang.IllegalStateException: Failed to load ApplicationContext
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:125)
at org.springframework.test.context.support.DefaultTestContext.getApplicationContext(DefaultTestContext.java.....
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flywayInitializer' defined in class path resource [org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration$FlywayConfiguration.class]: Invocation of init method failed; nested exception is org.flywaydb.core.internal.exception.FlywaySqlException:
Unable to obtain connection from database: Too many connections
---------------------------------------------------------------
SQL State : 08004
Error Code : 1040
Message : Too many connections
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1762)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:593)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
... 49 more
Caused by: org.flywaydb.core.internal.exception.FlywaySqlException:
Unable to obtain connection from database: Too many connections
I have been struggling since yesterday; you might consider this a duplicate, but I have tried to add more details today. Please guide me here.
You must add the configuration for Flyway:
flyway.url=jdbc:postgresql://xxx.eu-west-2.rds.amazonaws.com:5432/xxx
flyway.user=postgres
flyway.password=xxx

Embedded Kafka starting with wrong number of partitions

I have started an instance of EmbeddedKafka in a JUnit test. I can read the records that I have pushed to my stream correctly in my application, but one thing I have noticed is that I only have one partition per topic. Can anyone explain why?
In my application I have the following:
List<PartitionInfo> partitionInfos = consumer.partitionsFor(topic);
This returns a list with one item. When running against local Kafka with 3 partitions, it returns a list with 3 items as expected.
And my test looks like:
@RunWith(SpringRunner.class)
@SpringBootTest
@EmbeddedKafka(partitions = 3)
@ActiveProfiles("inmemory")
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_EACH_TEST_METHOD)
@TestPropertySource(
        locations = "classpath:application-test.properties",
        properties = {"app.onlyMonitorIfDataUpdated=true"})
public class MonitorRestKafkaIntegrationTest {

    @Autowired
    private EmbeddedKafkaBroker embeddedKafkaBroker;

    @Value("${spring.embedded.kafka.brokers}")
    private String embeddedBrokers;

    @Autowired
    private WebApplicationContext wac;

    @Autowired
    private JsonUtility jsonUtility;

    private MockMvc mockMvc;

    @Before
    public void setup() {
        mockMvc = webAppContextSetup(wac).build();
        UserGroupInformation.setLoginUser(UserGroupInformation.createRemoteUser("dummyUser"));
    }

    private ResultActions interactiveMonitoringREST(String eggID, String monitoringParams) throws Exception {
        return mockMvc.perform(post(String.format("/eggs/%s/interactive", eggID)).contentType(MediaType.APPLICATION_JSON_VALUE).content(monitoringParams));
    }

    @Test
    @WithMockUser("super_user")
    public void testEmbeddedKafka() throws Exception {
        Producer<String, String> producer = getKafkaProducer();
        sendRecords(producer, 3);
        updateConn();
        interactiveMonitoringREST(EGG_KAFKA, monitoringParams)
                .andExpect(status().isOk())
                .andDo(print())
                .andExpect(jsonPath("$.taskResults[0].resultDetails.numberOfRecordsProcessed").value(3))
                .andExpect(jsonPath("$.taskResults[0].resultDetails.numberOfRecordsSkipped").value(0));
    }

    private void sendRecords(Producer<String, String> producer, int records) {
        for (int i = 0; i < records; i++) {
            String val = "{\"auto_age\":" + String.valueOf(i + 10) + "}";
            producer.send(new ProducerRecord<>(testTopic, String.valueOf(i), val));
        }
        producer.flush();
    }

    private Producer<String, String> getKafkaProducer() {
        Map<String, Object> prodConfigs = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
        return new DefaultKafkaProducerFactory<>(prodConfigs, new StringSerializer(), new StringSerializer()).createProducer();
    }

    private void updateConn() throws Exception {
        String conn = getConnectionREST(CONN_KAFKA).andReturn().getResponse().getContentAsString();
        ConnectionDetail connectionDetail = jsonUtility.fromJson(conn, ConnectionDetail.class);
        connectionDetail.getDetails().put(ConnectionDetailConstants.CONNECTION_SERVER, embeddedBrokers);
        String updatedConn = jsonUtility.toJson(connectionDetail);
        updateConnectionREST(CONN_KAFKA, updatedConn).andExpect(status().isOk());
    }
}
You need to tell the broker to pre-create the topics...
@SpringBootTest
@EmbeddedKafka(topics = "foo", partitions = 3)
class So57481979ApplicationTests {

    @Test
    void testPartitions(@Autowired KafkaAdmin admin) throws InterruptedException, ExecutionException {
        AdminClient client = AdminClient.create(admin.getConfig());
        Map<String, TopicDescription> map = client.describeTopics(Collections.singletonList("foo")).all().get();
        System.out.println(map.values().iterator().next().partitions().size());
    }
}
Or set the num.partitions broker property if you want the broker to auto-create the topics for you on first use.
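For example (a sketch; num.partitions is a standard broker property and brokerProperties is an attribute of @EmbeddedKafka):

// Sketch: let the embedded broker auto-create any topic with 3 partitions on first use.
@EmbeddedKafka(brokerProperties = "num.partitions=3")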
We should probably automatically do that, based on the partitions property.
I found that bootstrapServersProperty is important in @EmbeddedKafka; it is used to populate the named property in application-test.yml, which can then be used to create consumers/listener containers.
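A minimal sketch of that wiring (the property name spring.kafka.bootstrap-servers is just one common choice, not something mandated by the original answer):

// Sketch: expose the embedded broker's address under the property that
// Spring Boot's Kafka auto-configuration reads, so no brokers need to be hard-coded.
@SpringBootTest
@EmbeddedKafka(partitions = 3, topics = "foo",
        bootstrapServersProperty = "spring.kafka.bootstrap-servers")
class PartitionCountTest {
}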

Kafka consumer unit test with Avro Schema registry failing

I'm writing a consumer which listens to a Kafka topic and consumes a message whenever one is available. I've tested the logic/code by running Kafka locally, and it's working fine.
While writing the unit/component test cases, it fails with an Avro schema registry URL error. I've tried the different options available on the internet but could not find anything that works. I am not sure if my approach is even correct. Please help.
Listener Class
@KafkaListener(topics = "positionmgmt.v1", containerFactory = "genericKafkaListenerFactory")
public void receive(ConsumerRecord<String, GenericRecord> consumerRecord) {
    try {
        GenericRecord generic = consumerRecord.value();
        Object obj = generic.get("metadata");
        ObjectMapper mapper = new ObjectMapper();
        Header headerMetaData = mapper.readValue(obj.toString(), Header.class);
        System.out.println("Received payload : " + consumerRecord.value());
        //Call backend with details in GenericRecord
    } catch (Exception e) {
        System.out.println("Exception while reading message from Kafka " + e);
    }
}
Kafka config
@Bean
public ConcurrentKafkaListenerContainerFactory<String, GenericRecord> genericKafkaListenerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, GenericRecord> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(genericConsumerFactory());
    return factory;
}

public ConsumerFactory<String, GenericRecord> genericConsumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
    config.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
    return new DefaultKafkaConsumerFactory<>(config);
}
Avro Schema
{
"type":"record",
"name":"KafkaEvent",
"namespace":"com.ms.model.avro",
"fields":[
{
"name":"metadata",
"type":{
"name":"metadata",
"type":"record",
"fields":[
{
"name":"correlationid",
"type":"string",
"doc":"this is corrleation id for transaction"
},
{
"name":"subject",
"type":"string",
"doc":"this is subject for transaction"
},
{
"name":"version",
"type":"string",
"doc":"this is version for transaction"
}
]
}
},
{
"name":"name",
"type":"string"
},
{
"name":"dept",
"type":"string"
},
{
"name":"empnumber",
"type":"string"
}
]
}
This is the test code that I tried:
@ComponentTest
@RunWith(SpringRunner.class)
@EmbeddedKafka(partitions = 1, topics = { "positionmgmt.v1" })
@SpringBootTest(classes = {Application.class})
@DirtiesContext
public class ConsumeKafkaMessageTest {

    private static final String TEST_TOPIC = "positionmgmt.v1";

    @Autowired(required = true)
    EmbeddedKafkaBroker embeddedKafkaBroker;

    private Schema schema;

    private SchemaRegistryClient schemaRegistry;
    private KafkaAvroSerializer avroSerializer;
    private KafkaAvroDeserializer avroDeserializer;

    private MockSchemaRegistryClient mockSchemaRegistryClient = new MockSchemaRegistryClient();
    private String registryUrl = "unused";

    private String avroSchema = string representation of avro schema

    @BeforeEach
    public void setUp() throws Exception {
        Schema.Parser parser = new Schema.Parser();
        schema = parser.parse(avroSchema);
        mockSchemaRegistryClient.register("Vendors-value", schema);
    }

    @Test
    public void consumeKafkaMessage_receive_sucess() {
        Schema metadataSchema = schema.getField("metadata").schema();
        GenericRecord metadata = new GenericData.Record(metadataSchema);
        metadata.put("version", "1.0");
        metadata.put("correlationid", "correlationid");
        metadata.put("subject", "metadata");

        GenericRecord record = new GenericData.Record(schema);
        record.put("metadata", metadata);
        record.put("name", "ABC");
        record.put("dept", "XYZ");

        Consumer<String, GenericRecord> consumer = configureConsumer();
        Producer<String, GenericRecord> producer = configureProducer();

        ProducerRecord<String, GenericRecord> prodRecord = new ProducerRecord<String, GenericRecord>(TEST_TOPIC, record);
        producer.send(prodRecord);

        ConsumerRecord<String, GenericRecord> singleRecord = KafkaTestUtils.getSingleRecord(consumer, TEST_TOPIC);
        assertNotNull(singleRecord.value());

        consumer.close();
        producer.close();
    }

    private Consumer<String, GenericRecord> configureConsumer() {
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("groupid", "true", embeddedKafkaBroker);
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        Consumer<String, GenericRecord> consumer = new DefaultKafkaConsumerFactory<String, GenericRecord>(consumerProps).createConsumer();
        consumer.subscribe(Collections.singleton(TEST_TOPIC));
        return consumer;
    }

    private Producer<String, GenericRecord> configureProducer() {
        Map<String, Object> producerProps = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
        producerProps.put(KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG, mockSchemaRegistryClient);
        producerProps.put(KafkaAvroSerializerConfig.AUTO_REGISTER_SCHEMAS, "false");
        return new DefaultKafkaProducerFactory<String, GenericRecord>(producerProps).createProducer();
    }
}
Error
component.com.ms.listener.ConsumeKafkaMessageTest > consumeKafkaMessage_receive_sucess() FAILED
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:457)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:289)
at org.springframework.kafka.core.DefaultKafkaProducerFactory.createKafkaProducer(DefaultKafkaProducerFactory.java:318)
at org.springframework.kafka.core.DefaultKafkaProducerFactory.createProducer(DefaultKafkaProducerFactory.java:305)
at component.com.ms.listener.ConsumeKafkaMessageTest.configureProducer(ConsumeKafkaMessageTest.java:125)
at component.com.ms.listener.ConsumeKafkaMessageTest.consumeKafkaMessage_receive_sucess(ConsumeKafkaMessageTest.java:97)
Caused by:
io.confluent.common.config.ConfigException: Invalid value io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient#20751870 for configuration schema.registry.url: Expected a comma separated list.
at io.confluent.common.config.ConfigDef.parseType(ConfigDef.java:345)
at io.confluent.common.config.ConfigDef.parse(ConfigDef.java:249)
at io.confluent.common.config.AbstractConfig.<init>(AbstractConfig.java:78)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig.<init>(AbstractKafkaAvroSerDeConfig.java:105)
at io.confluent.kafka.serializers.KafkaAvroSerializerConfig.<init>(KafkaAvroSerializerConfig.java:32)
at io.confluent.kafka.serializers.KafkaAvroSerializer.configure(KafkaAvroSerializer.java:48)
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.configure(ExtendedSerializer.java:60)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:372)
... 5 more
I investigated it a bit, and I found out that the problem is in the CachedSchemaRegistryClient that is used by the KafkaAvroSerializer/Deserializer. It is used to fetch the schema definitions from the Confluent Schema Registry.
You already have your schema definition locally, so you don't need to go to the Schema Registry for it (at least in your tests).
I had a similar problem, and I solved it by creating a custom KafkaAvroSerializer/KafkaAvroDeserializer.
This is a sample of a KafkaAvroSerializer. It is rather simple. You just need to extend the provided KafkaAvroSerializer and tell it to use MockSchemaRegistryClient.
public class CustomKafkaAvroSerializer extends KafkaAvroSerializer {
    public CustomKafkaAvroSerializer() {
        super();
        super.schemaRegistry = new MockSchemaRegistryClient();
    }

    public CustomKafkaAvroSerializer(SchemaRegistryClient client) {
        super(new MockSchemaRegistryClient());
    }

    public CustomKafkaAvroSerializer(SchemaRegistryClient client, Map<String, ?> props) {
        super(new MockSchemaRegistryClient(), props);
    }
}
This is a sample of a KafkaAvroDeserializer. When the deserialize method is called, you need to tell it which schema to use.
public class CustomKafkaAvroDeserializer extends KafkaAvroDeserializer {
    @Override
    public Object deserialize(String topic, byte[] bytes) {
        this.schemaRegistry = getMockClient(KafkaEvent.SCHEMA$);
        return super.deserialize(topic, bytes);
    }

    private static SchemaRegistryClient getMockClient(final Schema schema$) {
        return new MockSchemaRegistryClient() {
            @Override
            public synchronized Schema getById(int id) {
                return schema$;
            }
        };
    }
}
The last step is to tell Spring to use the created serializer/deserializer:
spring.kafka.producer.properties.schema.registry.url= not-used
spring.kafka.producer.value-serializer = CustomKafkaAvroSerializer
spring.kafka.producer.key-serializer = org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.group-id = showcase-producer-id
spring.kafka.consumer.properties.schema.registry.url= not-used
spring.kafka.consumer.value-deserializer = CustomKafkaAvroDeserializer
spring.kafka.consumer.key-deserializer = org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.group-id = showcase-consumer-id
spring.kafka.auto.offset.reset = earliest
spring.kafka.producer.auto.register.schemas= true
spring.kafka.properties.specific.avro.reader= true
I wrote a short blog post about that:
https://medium.com/@igorvlahek1/no-need-for-schema-registry-in-your-spring-kafka-tests-a5b81468a0e1?source=friends_link&sk=e55f73b86504e9f577e259181c8d0e23
Link to the working sample project: https://github.com/ivlahek/kafka-avro-without-registry
The answer from @ivlahek works, but if you look at this example 3 years later you might want to make a slight modification to CustomKafkaAvroDeserializer:
private static SchemaRegistryClient getMockClient(final Schema schema) {
    return new MockSchemaRegistryClient() {
        @Override
        public ParsedSchema getSchemaBySubjectAndId(String subject, int id)
                throws IOException, RestClientException {
            return new AvroSchema(schema);
        }
    };
}
As the error says, you need to provide a string to the registry in the producer config, not an object.
Since you're using the Mock class, that string could be anything...
However, you'll need to construct the serializers given the registry instance
Serializer serializer = new KafkaAvroSerializer(mockSchemaRegistry);
// make config map with ("schema.registry.url", "unused")
serializer.configure(config, false);
Otherwise, it will try to create a non-mocked client
And put that into the properties
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, serializer);
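If the properties-map route is rejected by the client's config parser, a sketch of an alternative (assuming spring-kafka's DefaultKafkaProducerFactory, and reusing the mockSchemaRegistryClient and embeddedKafkaBroker fields from the question's test) is to hand the configured serializer instance straight to the factory:

// Sketch: pass the already-configured Avro serializer instance to the producer factory
// instead of putting it into the properties map.
KafkaAvroSerializer avroSerializer = new KafkaAvroSerializer(mockSchemaRegistryClient);
avroSerializer.configure(Collections.singletonMap("schema.registry.url", "unused"), false);

Map<String, Object> props = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
Producer<String, Object> producer =
        new DefaultKafkaProducerFactory<>(props, new StringSerializer(), avroSerializer).createProducer();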
If your @KafkaListener is in the test class, then you can read it with the StringDeserializer and convert it to the desired class manually:
@Autowired
private MyKafkaAvroDeserializer myKafkaAvroDeserializer;

@KafkaListener(topics = "test")
public void inputData(ConsumerRecord<?, ?> consumerRecord) {
    log.info("received record='{}' payload='{}'", consumerRecord.toString(), consumerRecord.value());
    GenericRecord genericRecord = (GenericRecord) myKafkaAvroDeserializer.deserialize("test", consumerRecord.value().toString().getBytes(StandardCharsets.UTF_8));
    Myclass myclass = (Myclass) SpecificData.get().deepCopy(Myclass.SCHEMA$, genericRecord);
}

@Component
public class MyKafkaAvroDeserializer extends KafkaAvroDeserializer {
    @Override
    public Object deserialize(String topic, byte[] bytes) {
        this.schemaRegistry = getMockClient(Myclass.SCHEMA$);
        return super.deserialize(topic, bytes);
    }

    private static SchemaRegistryClient getMockClient(final Schema schema$) {
        return new MockSchemaRegistryClient() {
            @Override
            public synchronized org.apache.avro.Schema getById(int id) {
                return schema$;
            }
        };
    }
}
Remember to add the schema registry and key/value deserializers in application.yml, although they won't be used:
consumer:
  key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
  value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
  properties:
    schema.registry.url: http://localhost:8080
