Micrometer Unit Test in Java

I have created a Micrometer class where counters are created and incremented. How can I write unit tests for the public method while avoiding registering or sending the events to the Micrometer backend?
public class MicroMeter {
    private static final MeterRegistry registry = Metrics.globalRegistry;

    private Counter createCounter(final String meterName, Map<String, String> mp) {
        List<Tag> tags = new ArrayList<>();
        for (Map.Entry<String, String> entry : mp.entrySet()) {
            tags.add(Tag.of(entry.getKey(), entry.getValue()));
        }
        return Counter
                .builder(meterName)
                .tags(tags)
                .register(registry);
    }

    private void incrementCounter(Counter counter) {
        counter.increment();
    }

    public static void createCounterAndIncrement(final String meterName, Map<String, String> mp) {
        MicroMeter microMeter = new MicroMeter();
        Counter counter = microMeter.createCounter(meterName, mp);
        microMeter.incrementCounter(counter);
    }
}

You can simply pass an in-memory meter registry into the class for unit testing; Micrometer ships with one (SimpleMeterRegistry).
Your code needs to be designed to accept the registry, not create it.
Because the whole purpose of Micrometer is to integrate with your chosen backend (like Graphite), there isn't much benefit to be had from unit testing alone. Apart from just creating the metrics, you need to check that they are linearized if your backend doesn't support tags, along with other things such as client-side histograms if those are enabled.
What I do myself, and recommend, is integration testing. Here are the general steps:
Create an in-memory meter registry. The registry should be a bean, not a static variable, so you can replace it for testing very easily.
Mock the sender for your backend (GraphiteSender, if I remember the name correctly) and use the mock to verify that metrics are being sent.
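As a minimal sketch of that refactoring (my wording, not the original class), the registry becomes a constructor argument, with Metrics.globalRegistry passed in production and a SimpleMeterRegistry in tests:

import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Tag;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MicroMeter {

    private final MeterRegistry registry;

    // Production: new MicroMeter(Metrics.globalRegistry)
    // Tests:      new MicroMeter(new SimpleMeterRegistry())
    public MicroMeter(MeterRegistry registry) {
        this.registry = registry;
    }

    public void createCounterAndIncrement(final String meterName, Map<String, String> mp) {
        List<Tag> tags = mp.entrySet().stream()
                .map(e -> Tag.of(e.getKey(), e.getValue()))
                .collect(Collectors.toList());
        Counter.builder(meterName)
                .tags(tags)
                .register(registry)
                .increment();
    }
}

A test can then construct the class with a fresh SimpleMeterRegistry and assert on the counter without ever touching the global registry.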

One way of writing a test for this scenario is to use SimpleMeterRegistry: add it to the globalRegistry, fetch the Counter, and then assert the expected behaviour.
Example snippet:
private MeterRegistry meterRegistry;

@BeforeEach
void setUp() {
    meterRegistry = new SimpleMeterRegistry();
    Metrics.globalRegistry.add(meterRegistry);
}

@AfterEach
void tearDown() {
    meterRegistry.clear();
    Metrics.globalRegistry.clear();
}

@Test
void testCreateCounterAndIncrement() {
    // When
    MicroMeter.createCounterAndIncrement("meterName", Map.of("key", "val"));

    // Then
    var counter = meterRegistry.find("meterName").counter();
    then(counter).isNotNull();
    then(counter.count()).isEqualTo(1);
    then(counter.getId().getTag("key")).isEqualTo("val");
}

Related

java factory class test mocking list of implementations

I have created a factory to provide an instance of IMyProcessor based on a boolean flag.
The code below populates the map with both of my implementations.
@Component
public class MyProcessorFactory {

    private static final Map<String, IMyProcessor> processorServiceCache = new HashMap<>();

    @Value("${processor.async:true}")
    private boolean isAsync;

    public MyProcessorFactory(final List<IMyProcessor> processors) {
        for (IMyProcessor service : processors) {
            processorServiceCache.put(service.getType(), service);
        }
    }

    public IMyProcessor getInstance() {
        return isAsync ? processorServiceCache.get("asynchronous")
                       : processorServiceCache.get("synchronous");
    }
}
I am now trying to write a unit test using JUnit 5, but I am struggling to set up the list of implementations.
I have tried the following:
@ExtendWith(MockitoExtension.class)
class ProcessorFactoryTest {

    @InjectMocks
    private MyProcessorFactory myProcessorFactory;

    @Test
    void testAsyncIsReturned() {
    }

    @Test
    void testSyncIsReturned() {
    }
}
I want to test that the correct implementation is returned based on whether the async flag is true or false.
It would be helpful to see how you write such test cases. I autowire the implementations of the interface into a list via constructor injection, then add them to a map using a string key.
Along with an answer, I am open to other ideas/refactorings that may make the testing easier.
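One way to approach this (a sketch, not taken from the thread): skip @InjectMocks and call the constructor directly, since it already accepts the list. @Value is not processed outside a Spring context, so the flag has to be set by hand, e.g. with Spring's ReflectionTestUtils:

import static org.junit.jupiter.api.Assertions.assertSame;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.List;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.test.util.ReflectionTestUtils;

@ExtendWith(MockitoExtension.class)
class ProcessorFactoryTest {

    @Test
    void asyncImplementationIsReturnedWhenFlagIsTrue() {
        // Two mocked implementations, keyed by the type they report
        IMyProcessor async = mock(IMyProcessor.class);
        when(async.getType()).thenReturn("asynchronous");
        IMyProcessor sync = mock(IMyProcessor.class);
        when(sync.getType()).thenReturn("synchronous");

        MyProcessorFactory factory = new MyProcessorFactory(List.of(async, sync));
        // The @Value default is not applied in a plain unit test
        ReflectionTestUtils.setField(factory, "isAsync", true);

        assertSame(async, factory.getInstance());
    }
}

Making processorServiceCache an instance field rather than static would also make this easier to test, since entries would no longer leak between tests.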

Is there a better way to implement multi-tenancy using Kafka?

I'm trying to implement a multi-tenant microservice using Spring Boot. I have already implemented the web layer and the persistence layer. On the web layer, I've implemented a filter which sets the tenant id in a prototype bean (using ThreadLocalTargetSource); on the persistence layer, I've used Hibernate's multi-tenancy configuration (schema per tenant). They work fine; data is persisted in the appropriate schema. Currently I am implementing the same behaviour on the messaging layer, using the spring-kafka library. So far it works the way I expected, but I'd like to know if there is a better way to do it.
Here is my code:
This is the class that manages a KafkaMessageListenerContainer:
@Component
public class MessagingListenerContainer {

    private final MessagingProperties messagingProperties;
    private KafkaMessageListenerContainer<String, String> container;

    @PostConstruct
    public void init() {
        ContainerProperties containerProps = new ContainerProperties(
                messagingProperties.getConsumer().getTopicsAsList());
        containerProps.setMessageListener(buildCustomMessageListener());
        container = createContainer(containerProps);
        container.start();
    }

    @Bean
    public MessageListener<String, String> buildCustomMessageListener() {
        return new CustomMessageListener();
    }

    private KafkaMessageListenerContainer<String, String> createContainer(
            ContainerProperties containerProps) {
        Map<String, Object> props = consumerProps();
        …
        return container;
    }

    private Map<String, Object> consumerProps() {
        Map<String, Object> props = new HashMap<>();
        …
        return props;
    }

    @PreDestroy
    public void finish() {
        container.stop();
    }
}
This is the CustomMessageListener:
@Slf4j
public class CustomMessageListener implements MessageListener<String, String> {

    @Autowired
    private TenantStore tenantStore; // Prototype bean

    @Autowired
    private List<ServiceListener> services;

    @Override
    public void onMessage(ConsumerRecord<String, String> record) {
        log.info("Tenant {} | Payload: {} | Record: {}", record.key(),
                record.value(), record.toString());
        tenantStore.setTenantId(record.key()); // Currently the tenant id is sent as the record key
        services.stream().forEach(sl -> sl.onMessage(record.value()));
    }
}
This is a test service which would use the message data and tenant:
@Slf4j
@Service
public class ConsumerService implements ServiceListener {

    private final MessagesRepository messages;
    private final TenantStore tenantStore;

    @Override
    public void onMessage(String message) {
        log.info("ConsumerService {}, tenant {}", message, tenantStore.getTenantId());
        messages.save(new Message(message));
    }
}
Thanks for your time!
Just to be clear (correct me if I'm wrong): you are using the same topic(s) for all your tenants, and you distinguish each tenant's messages by the message key, which in your case is the tenant id.
A slight improvement can be made by using message headers to store the tenant id instead of the key. That way you are not limited to partitioning messages by tenant.
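For illustration (not from the original answer), a sketch of carrying the tenant id in a header; the topic name and helper class are hypothetical:

import java.nio.charset.StandardCharsets;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.springframework.kafka.core.KafkaTemplate;

public class TenantAwareSender {

    private final KafkaTemplate<String, String> template;

    public TenantAwareSender(KafkaTemplate<String, String> template) {
        this.template = template;
    }

    // The tenant id travels in a header, leaving the record key free
    // for a real partitioning key.
    public void sendForTenant(String tenantId, String key, String payload) {
        ProducerRecord<String, String> record = new ProducerRecord<>("orders", key, payload);
        record.headers().add("tenant-id", tenantId.getBytes(StandardCharsets.UTF_8));
        template.send(record);
    }
}

// Consumer side, e.g. inside CustomMessageListener.onMessage(...):
// Header tenantHeader = record.headers().lastHeader("tenant-id");
// tenantStore.setTenantId(new String(tenantHeader.value(), StandardCharsets.UTF_8));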
Although the model you describe works, it has a major security issue: if someone gets access to your topic, you will be leaking the data of all your tenants.
A more secure approach is to use topic naming conventions and ACLs (access control lists). You can find a short explanation here. In a nutshell, you can include the name of the tenant in the topic's name, either as a suffix or a prefix,
e.g. orders_tenantA, orders_tenantB or tenantA_orders, tenantB_orders.
Then, using ACLs, you can restrict which applications can connect to those specific topics. This scenario is also helpful if one of your tenants needs to connect one of their applications directly to your Kafka cluster.

How does spring.kafka.consumer.auto-offset-reset work in spring-kafka

The KafkaProperties Javadoc:
/**
 * What to do when there is no initial offset in Kafka or if the current offset
 * does not exist any more on the server.
 */
private String autoOffsetReset;
I have a hello-world application which contains this application.properties:
spring.kafka.consumer.group-id=foo
spring.kafka.consumer.auto-offset-reset=latest
In this case the @KafkaListener method is invoked for all entries, but the expected result was that the @KafkaListener method would be invoked only for the latest 3 records I sent. I tried the other option:
spring.kafka.consumer.auto-offset-reset=earliest
But the behaviour is the same.
Can you explain this?
P.S.
code sample:
@SpringBootApplication
public class Application implements CommandLineRunner {

    public static Logger logger = LoggerFactory.getLogger(Application.class);

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args).close();
    }

    @Autowired
    private KafkaTemplate<String, String> template;

    private final CountDownLatch latch = new CountDownLatch(3);

    @Override
    public void run(String... args) throws Exception {
        this.template.send("spring_kafka_topic", "foo1");
        this.template.send("spring_kafka_topic", "foo2");
        this.template.send("spring_kafka_topic", "foo3");
        latch.await(60, TimeUnit.SECONDS);
        logger.info("All received");
    }

    @KafkaListener(topics = "spring_kafka_topic")
    public void listen(ConsumerRecord<?, ?> cr) throws Exception {
        logger.info(cr.toString());
        latch.countDown();
    }
}
Update:
The behaviour doesn't depend on
spring.kafka.consumer.auto-offset-reset
it depends only on
spring.kafka.consumer.enable-auto-commit:
if I set spring.kafka.consumer.enable-auto-commit=false, I see all records;
if I set spring.kafka.consumer.enable-auto-commit=true, I see only the 3 last records.
Please clarify the meaning of the spring.kafka.consumer.auto-offset-reset property.
The KafkaProperties in Spring Boot does this:
public Map<String, Object> buildProperties() {
    Map<String, Object> properties = new HashMap<String, Object>();
    if (this.autoCommitInterval != null) {
        properties.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG,
                this.autoCommitInterval);
    }
    if (this.autoOffsetReset != null) {
        properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG,
                this.autoOffsetReset);
    }
    …
}
This buildProperties() is used from buildConsumerProperties(), which is used, in turn, in the:
@Bean
@ConditionalOnMissingBean(ConsumerFactory.class)
public ConsumerFactory<?, ?> kafkaConsumerFactory() {
    return new DefaultKafkaConsumerFactory<Object, Object>(
            this.properties.buildConsumerProperties());
}
So, if you use your own ConsumerFactory bean definition, be sure to reuse those KafkaProperties: https://docs.spring.io/spring-boot/docs/1.5.7.RELEASE/reference/htmlsingle/#boot-features-kafka-extra-props
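For illustration, a minimal sketch of a custom ConsumerFactory bean that reuses those Boot properties (bean and parameter names are mine):

@Bean
public ConsumerFactory<Object, Object> kafkaConsumerFactory(KafkaProperties kafkaProperties) {
    // Start from what Boot built out of application.properties
    // (group-id, auto-offset-reset, enable-auto-commit, ...)
    Map<String, Object> props = kafkaProperties.buildConsumerProperties();
    // ...add or override custom entries here...
    return new DefaultKafkaConsumerFactory<>(props);
}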
UPDATE
OK, I see what's going on. Try adding this property:
spring.kafka.consumer.enable-auto-commit=false
This way we won't have async auto-commits based on some commit interval.
The logic in this application is based on exiting right after latch.await(60, TimeUnit.SECONDS): when we get the 3 expected records, we exit. This way the async auto-commit from the consumer might not have happened yet, so the next time you run the application the consumer polls data from the uncommitted offset.
When we turn off auto-commit, we have AckMode.BATCH, which is performed synchronously, and we are able to see the really latest records in the topic for this foo consumer group.
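As an aside (not part of the original answer), a sketch of how one might set that ack mode explicitly on a listener container factory; this assumes a recent spring-kafka where AckMode lives on ContainerProperties (older versions keep it on AbstractMessageListenerContainer):

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
        ConsumerFactory<String, String> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // Commit synchronously after each processed batch instead of relying
    // on the client's asynchronous auto-commit.
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.BATCH);
    return factory;
}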

Make a static variable injectable

I am building an HTTP API client that needs to call out to a specific endpoint, like so:
public class MyApiClient {
    private static final String ENDPOINT = "http://myapi....";
}
Here the endpoint won't change, so it's constant. However, I want to be able to override it for testing, so that I can test against a mock HTTP server, for example.
What's the best way to do this? Is it just to make it an instance variable and give it a starting value?
private String endpoint = "http://myapi....";

public void setEndpoint(String endpoint) {
    ...
}
Well, there are of course many solutions to this, and one way is to use a system property with a default value:
private static final String DEFAULT_ENDPOINT = "http://myapi....";
private static final String ENDPOINT =
        System.getProperty("my.endpoint", DEFAULT_ENDPOINT);
This way you get a configurable solution. If you need even more flexibility when initializing your static constants, you can also use a static initializer:
private static final String ENDPOINT;

static {
    // Do initialization here, but do not throw any exceptions (bad practice).
    // You can e.g. read from files etc...
    // Then assign your constant:
    ENDPOINT = …
}
System properties are passed on the command line as -D parameters, e.g.:
java -Dmy.endpoint=http://...
But in my opinion, an even better approach is to actually inject the value into the class that is using it:
public class ClassThatIsUsingTheConfig {

    private final String endpoint;

    public ClassThatIsUsingTheConfig(final String endpoint) {
        this.endpoint = endpoint;
    }

    public void someMethod() {
        // use endpoint
    }
}
And then make the selection of which endpoint to use in the caller class. From a test case, this will be very easy to mock.
public class MyTest {

    @Test
    public void testMethod() {
        ClassThatIsUsingTheConfig var = new ClassThatIsUsingTheConfig(TEST_ENDPOINT);
        var.someMethod();
    }
}

public class MyProdClass {

    public void prodMethod() {
        ClassThatIsUsingTheConfig var = new ClassThatIsUsingTheConfig(PROD_ENDPOINT);
        var.someMethod();
    }
}
You can read more about dependency injection here.
On a side note, if you are using some kind of framework for managing dependencies, such as Spring Framework or CDI, it is common to be able to inject properties and constants in various ways (e.g. based on which environment is currently running). For example, when using Spring Framework you can declare all your constants in a property file and inject a property using annotations:
@Autowired
public ClassThatIsUsingTheConfig(@Value("${my.endpoint}") final String endpoint) {
    this.endpoint = endpoint;
}
The property file for prod could be along the lines of:
my.endpoint=http://prodserver...
whereas the property file for test would look like this:
my.endpoint=http://testserver...
The approach of using a dependency injection engine allows for a very flexible way of handling external constants, paths, resources, etc., and simplifies your life when it comes to testing the code.

Sanitize a database during Arquillian testing

I am currently writing tests for some RESTful services I wrote. The services I am testing are written in Java and use MongoDB/Morphia. The tests call the services, some of which in turn write to a test collection. I need to clean up after the tests and delete the data I injected. What is the best way to go about this?
Here is an example of one of my simple services:
package org.haib.myerslab.services;

@Path("/database")
public class DatabaseService {

    @Inject
    private Datastore ds;

    @Path("/genre/")
    @POST
    @Produces("application/json")
    public GenreDTO postFromGenreDTO(@Context UriInfo uri, GenreDTO form) throws ParseException {
        Genre myNewGenre = DtoToDomainMapper.gerneFromGenreDTO(form);
        myNewGenre.setId(form.getId());
        ds.save(myNewGenre);
        return new GenreDTO(myNewGenre);
    }
}
And here is an example of my Arquillian test:
@RunWith(Arquillian.class)
public class GeneTest {

    private static String myId = "myGenreId";
    private static String myGenre = "myGenre";
    private static String myGenreInfo = "myGenreInfo";

    @Deployment
    public static WebArchive getDeployment() {
        return TestHelper.getDeployment();
    }

    @Test
    @RunAsClient
    @InSequence(1)
    public void canPostGenre(@ArquillianResource URL baseURL) throws Exception {
        GenreDTO newGenre = new GenreDTO();
        newGenre.setGenre(myGenre);
        newGenre.setGenreInfo(myGenreInfo);
        newGenre.setId(myId);

        String url = baseURL.toURI().resolve("/database/genre/").toString();
        JsonNode rootNode = TestHelper.postUrl(url, newGenre);

        assertEquals(myGenre, rootNode.get("genre").asText());
        assertEquals(myGenreInfo, rootNode.get("genreInfo").asText());
        assertEquals(myId, rootNode.get("id").asText());
    }
}
Where the getDeployment function looks like this:
public static WebArchive getDeployment() {
    File[] depend = Maven.resolver().loadPomFromFile("pom.xml")
            .importRuntimeDependencies().resolve().withTransitivity().asFile();
    WebArchive war = ShrinkWrap.create(WebArchive.class).addClass(TestHelper.class)
            .addClass(Genre.class).addClass(Application.class).addPackage("org/haib/myerslab")
            .addPackage("org/haib/myerslab/database").addPackage("org/haib/myerslab/genre")
            .addPackage("org/haib/myerslab/dto").addPackage("org/haib/myerslab/dto/genre")
            .addAsLibraries(depend).addAsWebInfResource("jboss-deployment-structure.xml")
            .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml").setWebXML("test-web.xml");
    return war;
}
So where I am lost is: what is the best way to inject the database in an @After method and clear out the Genre documents I posted, so that my next test doesn't see them?
How should I do this? Is there another way?
Take a look at nosql-unit. It provides annotations and rules that help you with seeding datasets, comparing expectations, and cleaning up MongoDB.
To get your MongoDB into a pristine state before executing a test, you can simply use the following annotation with the CLEAN_INSERT load strategy:
@UsingDataSet(locations = "my_data_set.json", loadStrategy = LoadStrategyEnum.CLEAN_INSERT)
public void canPostGenre() { ... }
If you need the behaviour around the integration-testing lifecycle with MongoDB to be more powerful, you can also roll your own based on the ideas of nosql-unit. Also make sure to check out JUnit Rules.
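If you do roll your own, here is a minimal sketch of a cleanup method, assuming the test runs in-container (so the Datastore is injectable, unlike in @RunAsClient mode) and classic Morphia 1.x delete-by-query:

@Inject
private Datastore ds;

@After
public void cleanUp() {
    // Delete every Genre document the test created
    ds.delete(ds.createQuery(Genre.class));
}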
