I'm working on reading data from Kafka and have encountered this code that returns the Kafka connection details, but I don't understand how the context is shared. Here is the class that sets up the Kafka connection:
import akka.actor.typed.ActorSystem;
import akka.kafka.ConsumerMessage;
import akka.kafka.ConsumerSettings;
import akka.kafka.Subscriptions;
import akka.kafka.javadsl.Consumer;
import akka.stream.javadsl.SourceWithContext;
import lombok.Builder;
import lombok.Getter;
import lombok.extern.slf4j.Slf4j;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.Deserializer;
import java.time.Duration;
@Getter
@Slf4j
public final class KafkaSource<K, V> {
private final SourceWithContext<ConsumerRecord<K, V>, ConsumerMessage.Committable, ?> commitableSource;
@Builder
private KafkaSource(final Deserializer<K> keyd, final Deserializer<V> valueDeserializer, final ActorSystem actorSystem) {
final String kafkaBootstrapServers = "localhost:9092";
final ConsumerSettings<K, V> kafkaConsumerSettings =
ConsumerSettings.create(actorSystem, keyd, valueDeserializer)
.withBootstrapServers(kafkaBootstrapServers)
.withGroupId("testGroup12")
.withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")
.withStopTimeout(Duration.ofSeconds(5));
final String topics = "request-topic";
this.commitableSource = Consumer.sourceWithOffsetContext(kafkaConsumerSettings,
Subscriptions.topics(topics)).mapContext(ctx -> ctx);
}
}
Here is the Akka Stream I've written to process the data from Kafka:
import akka.actor.typed.ActorSystem;
import akka.actor.typed.javadsl.Behaviors;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
public class ReadFromKafka {
private static ObjectMapper objectMapper = new ObjectMapper();
public static void main(String args[]) {
final ActorSystem actorSystem = ActorSystem.create(Behaviors.empty(), "as");
var ksource = KafkaSource.<String, String>builder()
.actorSystem(actorSystem)
.keyd(new StringDeserializer()).valueDeserializer(new StringDeserializer())
.build();
ksource.getCommitableSource()
.map(ConsumerRecord::value)
.map(x -> {
var mappedObject = objectMapper.readValue(x, RequestDto.class);
System.out.println("mappedObject is :" + mappedObject);
return mappedObject;
}
)
.log("error")
.asSource()
.map(pair -> pair.second().commitInternal())
.run(actorSystem);
}
}
The class being mapped, RequestDto:
import com.fasterxml.jackson.annotation.JsonFormat;
import lombok.*;
import lombok.extern.jackson.Jacksonized;
import java.util.Date;
@Jacksonized
@AllArgsConstructor
@Getter
@Builder
@ToString
public class RequestDto {
@JsonFormat(pattern = "yyyy-MM-dd HH:mm:sss")
private final Date datePurchased;
private String someOtherField;
}
Although ReadFromKafka works as expected, why is the ConsumerMessage.Committable from SourceWithContext<ConsumerRecord<K, V>, ConsumerMessage.Committable, ?> not dropped when executing:
.map(ConsumerRecord::value)
.map(x -> {
var mappedObject = objectMapper.readValue(x, RequestDto.class);
System.out.println("mappedObject is :" + mappedObject);
return mappedObject;
}
)
.asSource() then allows accessing the context in the second position of the Pair, so the offset can be committed using:
.map(pair -> pair.second().commitInternal())
I'm confused as to how this works; it appears something implicit is happening in the background that allows the context to be propagated throughout the stream?
A SourceWithContext<A, B, M> defines the stream operations it supports so that they operate only on the A part of the value.
So if f is a function taking an A and returning a C, .map(f) results in a SourceWithContext<C, B, M>.
Under the hood, it's a Source<Pair<A, B>, M>. map could be defined as something like (as always apologies for atrocious Java):
private Source<Pair<A, B>, M> underlying;

public <C> SourceWithContext<C, B, M> map(Function<A, C> f) {
    Source<Pair<C, B>, M> src =
        underlying.map(pair -> {
            A a = pair.first();
            C c = f.apply(a);
            // the context (pair.second()) is passed through untouched
            return Pair.create(c, pair.second());
        });
    return SourceWithContext.fromPairs(src);
}
Note that the f never gets to see the second part of the Pair. So long as every operation does the right thing with respect to the context, it just works.
There are operations where there's no unambiguous "right thing" to do. An example of this is an operation which could reorder elements.
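To make the Pair-under-the-hood behaviour concrete, here is a minimal, self-contained sketch (the class name and the dummy values are made up for illustration; this is not the Kafka setup from the question). map only ever sees the value, while the context rides along until asSource() exposes it again:

import akka.actor.typed.ActorSystem;
import akka.actor.typed.javadsl.Behaviors;
import akka.japi.Pair;
import akka.stream.javadsl.Source;
import akka.stream.javadsl.SourceWithContext;
import java.util.Arrays;

public class ContextPropagationDemo {
    public static void main(String[] args) {
        final ActorSystem<Void> system = ActorSystem.create(Behaviors.empty(), "demo");

        // Build a SourceWithContext from explicit (value, context) pairs.
        final SourceWithContext<Integer, String, ?> withContext =
                SourceWithContext.fromPairs(Source.from(Arrays.asList(
                        Pair.create(1, "offset-1"),
                        Pair.create(2, "offset-2"))));

        withContext
                .map(i -> i * 10)            // only sees the Integer; the String context is untouched
                .asSource()                  // back to Source<Pair<Integer, String>, ?>
                .map(pair -> pair.first() + " with context " + pair.second())
                .runForeach(System.out::println, system)
                .thenRun(system::terminate);
    }
}

This mirrors the question's stream: every SourceWithContext operator maps the value and silently re-attaches the context, which is why the Committable is still available in pair.second() after asSource().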
I am practising RESTful API endpoints using https://api.predic8.de/shop/docs
Here is my repo
I am getting an NPE when I try to use @InjectMocks during my TDD approach.
However, I can make my test pass when I make a direct call in setUp():
vendorService = new VendorServiceImpl(VendorMapper.INSTANCE, vendorRepository);
I wanted to extend my learning by trying to create an endpoint for getting all vendors.
I employ TDD along the way, but my test getAllVendors() fails with an NPE when I try to use @InjectMocks and passes when I substitute it with a direct call in the setUp() method.
I think the NPE is linked to the mapper class.
Here are the classes that I believe are relevant: VendorServiceTest, VendorServiceImpl, and VendorMapper.
I have commented out the direct call in setUp() as I want to get the test passing using @InjectMocks.
package guru.springfamework.services;
import guru.springfamework.api.v1.mapper.VendorMapper;
import guru.springfamework.api.v1.model.VendorDTO;
import guru.springfamework.domain.Vendor;
import guru.springfamework.repositories.VendorRepository;
import org.junit.Before;
import org.junit.Test;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.springframework.test.web.servlet.MockMvc;
import java.util.Arrays;
import java.util.List;
import static org.junit.Assert.*;
import static org.mockito.Mockito.when;
public class VendorServiceTest {
public static final String NAME = "Tasty";
public static final Long ID = 1L;
@Mock
VendorMapper vendorMapper;
@Mock
VendorRepository vendorRepository;
@InjectMocks
VendorServiceImpl vendorService;
//VendorService vendorService;
@Before
public void setUp() throws Exception {
MockitoAnnotations.initMocks(this);
//vendorService = new VendorServiceImpl(VendorMapper.INSTANCE, vendorRepository);
}
@Test
public void getAllVendors() {
//given
List<Vendor> vendors = Arrays.asList(new Vendor(), new Vendor(), new Vendor());
when(vendorRepository.findAll()).thenReturn(vendors);
//when
List<VendorDTO> vendorDTOList = vendorService.getAllVendors();
//then
assertEquals(3, vendorDTOList.size());
}
@Test
public void findByName() {
}
}
package guru.springfamework.services;
import guru.springfamework.api.v1.mapper.VendorMapper;
import guru.springfamework.api.v1.model.VendorDTO;
import guru.springfamework.repositories.VendorRepository;
import org.springframework.stereotype.Service;
import java.util.List;
import java.util.stream.Collectors;
@Service
public class VendorServiceImpl implements VendorService {
private final VendorMapper vendorMapper;
private final VendorRepository vendorRepository;
public VendorServiceImpl(VendorMapper vendorMapper, VendorRepository vendorRepository) {
this.vendorMapper = vendorMapper;
this.vendorRepository = vendorRepository;
}
@Override
public List<VendorDTO> getAllVendors() {
return vendorRepository
.findAll()
.stream()
.map(vendor -> {
VendorDTO vendorDTO = vendorMapper.vendorToVendorDTO(vendor);
vendorDTO.setVendorUrl("/api/v1/vendors/" + vendor.getId());
return vendorDTO;
})
.collect(Collectors.toList());
}
@Override
public VendorDTO findByName(String name) {
return vendorMapper.vendorToVendorDTO(vendorRepository.findByName(name));
}
@Override
public VendorDTO getVendorById(Long id) {
return vendorMapper.vendorToVendorDTO(vendorRepository.findById(id).orElseThrow(RuntimeException::new));
}
}
package guru.springfamework.api.v1.mapper;
import guru.springfamework.api.v1.model.VendorDTO;
import guru.springfamework.domain.Vendor;
import org.mapstruct.Mapper;
import org.mapstruct.factory.Mappers;
@Mapper
public interface VendorMapper {
VendorMapper INSTANCE = Mappers.getMapper(VendorMapper.class);
VendorDTO vendorToVendorDTO(Vendor vendor);
}
Does anyone know where and why I am going wrong?
The problem is that you created a mock object for the mapper, but you didn't say what should happen when the method vendorToVendorDTO is called.
Therefore, when that method is called in the next line of code:
VendorDTO vendorDTO = vendorMapper.vendorToVendorDTO(vendor);
It will return null, and then in this line of code:
vendorDTO.setVendorUrl("/api/v1/vendors/" + vendor.getId());
You will get a NullPointerException.
To make this work, change your getAllVendors() method as follows:
@Test
public void getAllVendors() {
//given
List<Vendor> vendors = Arrays.asList(new Vendor(), new Vendor(), new Vendor());
VendorDTO mockDto = mock(VendorDTO.class);
when(vendorRepository.findAll()).thenReturn(vendors);
when(vendorMapper.vendorToVendorDTO(any(Vendor.class))).thenReturn(mockDto);
//when
List<VendorDTO> vendorDTOList = vendorService.getAllVendors();
//then
assertEquals(3, vendorDTOList.size());
}
And the test should pass.
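Note that mock(...) and any(...) used above still need static imports in the test class (when is already imported there); they come from the standard Mockito classes:

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;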
Have you tried to put @RunWith(MockitoJUnitRunner.class) / @ExtendWith(MockitoExtension.class) over your test class?
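For reference, a rough sketch of that approach with the question's JUnit 4 setup (the runner then builds and injects the mocks, so the MockitoAnnotations.initMocks(this) call in setUp() is no longer needed):

import org.junit.runner.RunWith;
import org.mockito.junit.MockitoJUnitRunner;

@RunWith(MockitoJUnitRunner.class)
public class VendorServiceTest {
    // @Mock and @InjectMocks fields exactly as in the question;
    // the runner initializes them before each test.
}

On JUnit 5 the equivalent is @ExtendWith(MockitoExtension.class) from org.mockito.junit.jupiter.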
I'm stuck trying to get the second test of the code below to work.
Both have the same payload structure; the one that works uses Map<> only, the second one uses classes.
Why does the first test work but the second doesn't?
It seems like a bug with usingRecursiveFieldByFieldElementComparator.
package com.example.hi;
import com.example.hi.ExampleTest.Foo.Bar;
import lombok.AllArgsConstructor;
import lombok.Value;
import org.junit.jupiter.api.Test;
import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import static java.util.Collections.singletonList;
import static org.assertj.core.api.Assertions.assertThat;
public class ExampleTest {
Comparator<LocalDateTime> truncateSeconds = (a, exp) ->
a.isAfter(exp.truncatedTo(ChronoUnit.SECONDS)) ? 0 : 1;
@Test
void works() {
var a = Map.of("values", Map.of("one", singletonList(Map.of("date", LocalDateTime.now()))));
var b = Map.of("values", Map.of("one", singletonList(Map.of("date", LocalDateTime.now()))));
assertThat(singletonList(a))
.usingRecursiveFieldByFieldElementComparator()
.usingComparatorForElementFieldsWithType(truncateSeconds, LocalDateTime.class)
.containsExactly(b);
}
@Test
void works_not() {
var a = new Foo(Map.of("one", singletonList(new Bar(LocalDateTime.now()))));
var b = new Foo(Map.of("one", singletonList(new Bar(LocalDateTime.now()))));
assertThat(singletonList(a))
.usingRecursiveFieldByFieldElementComparator()
.usingComparatorForElementFieldsWithType(truncateSeconds, LocalDateTime.class)
.containsExactly(b);
}
@Value
@AllArgsConstructor
public static class Foo {
Map<String, List<Bar>> values;
@Value
@AllArgsConstructor
public static class Bar {
LocalDateTime date;
}
}
}
The issue has been fixed in AssertJ Core 3.17.0 and the following should work:
// static import RecursiveComparisonConfiguration.builder
RecursiveComparisonConfiguration configuration = builder().withIgnoreAllOverriddenEquals(true)
.withComparatorForType(truncateSeconds, LocalDateTime.class)
.build();
assertThat(list).usingRecursiveFieldByFieldElementComparator(configuration)
.containsExactly(b);
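Applied to the failing test from the question, that looks roughly like this (a sketch assuming AssertJ Core 3.17.0+; builder() is the statically imported RecursiveComparisonConfiguration.builder(), and the method name works_not_fixed is just for illustration):

import static org.assertj.core.api.Assertions.assertThat;
import static org.assertj.core.api.recursive.comparison.RecursiveComparisonConfiguration.builder;

import org.assertj.core.api.recursive.comparison.RecursiveComparisonConfiguration;

@Test
void works_not_fixed() {
    var a = new Foo(Map.of("one", singletonList(new Bar(LocalDateTime.now()))));
    var b = new Foo(Map.of("one", singletonList(new Bar(LocalDateTime.now()))));

    // ignore Lombok's generated equals() and compare LocalDateTime with the custom comparator
    RecursiveComparisonConfiguration configuration = builder()
            .withIgnoreAllOverriddenEquals(true)
            .withComparatorForType(truncateSeconds, LocalDateTime.class)
            .build();

    assertThat(singletonList(a))
            .usingRecursiveFieldByFieldElementComparator(configuration)
            .containsExactly(b);
}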
I am serializing a class that includes an unmodifiable list with default typing enabled. The problem is that the type that Jackson uses is
java.util.Collections$UnmodifiableRandomAccessList
which, for some reason, the deserializer does not know how to handle.
Is there a way to tell Jackson to set the type as
java.util.ArrayList
which the deserializer does know how to handle, instead? If possible, I'd like to do it using mixins.
Something like
public abstract class ObjectMixin {
@JsonCreator
public ObjectMixin(
        @JsonProperty("id") String id,
        @JsonProperty("list") @JsonSerialize(as = ArrayList.class) List<String> list
) {}
}
which, unfortunately, does not work.
I would like to start with the security risk warning that comes from the ObjectMapper documentation:
Notes on security: use "default typing" feature (see
enableDefaultTyping()) is a potential security risk, if used with
untrusted content (content generated by untrusted external parties).
If so, you may want to construct a custom TypeResolverBuilder
implementation to limit possible types to instantiate, (using
setDefaultTyping(com.fasterxml.jackson.databind.jsontype.TypeResolverBuilder<?>)).
Let's implement a custom resolver:
class CollectionsDefaultTypeResolverBuilder extends ObjectMapper.DefaultTypeResolverBuilder {
private final Map<String, String> notValid2ValidIds = new HashMap<>();
public CollectionsDefaultTypeResolverBuilder() {
super(ObjectMapper.DefaultTyping.OBJECT_AND_NON_CONCRETE);
this._idType = JsonTypeInfo.Id.CLASS;
this._includeAs = JsonTypeInfo.As.PROPERTY;
notValid2ValidIds.put("java.util.Collections$UnmodifiableRandomAccessList", ArrayList.class.getName());
// add more here...
}
@Override
protected TypeIdResolver idResolver(MapperConfig<?> config, JavaType baseType, Collection<NamedType> subtypes,
boolean forSer, boolean forDeser) {
return new ClassNameIdResolver(baseType, config.getTypeFactory()) {
@Override
protected String _idFrom(Object value, Class<?> cls, TypeFactory typeFactory) {
String id = notValid2ValidIds.get(cls.getName());
if (id != null) {
return id;
}
return super._idFrom(value, cls, typeFactory);
}
};
}
}
Now, we can use it as below:
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.cfg.MapperConfig;
import com.fasterxml.jackson.databind.jsontype.NamedType;
import com.fasterxml.jackson.databind.jsontype.TypeIdResolver;
import com.fasterxml.jackson.databind.jsontype.impl.ClassNameIdResolver;
import com.fasterxml.jackson.databind.type.TypeFactory;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class JsonApp {
public static void main(String[] args) throws Exception {
ObjectMapper mapper = new ObjectMapper();
mapper.enable(SerializationFeature.INDENT_OUTPUT);
mapper.setDefaultTyping(new CollectionsDefaultTypeResolverBuilder());
Root root = new Root();
root.setData(Collections.unmodifiableList(Arrays.asList("1", "b")));
String json = mapper.writeValueAsString(root);
System.out.println(json);
System.out.println(mapper.readValue(json, Root.class));
}
}
class Root {
private List<String> data;
public List<String> getData() {
return data;
}
public void setData(List<String> data) {
this.data = data;
}
@Override
public String toString() {
return "Root{" +
"data=" + data +
'}';
}
}
The above code prints:
{
"data" : [ "java.util.ArrayList", [ "1", "b" ] ]
}
Root{data=[1, b]}
You can even map it to the List interface:
notValid2ValidIds.put("java.util.Collections$UnmodifiableRandomAccessList", List.class.getName());
And the output would be:
{
"data" : [ "java.util.List", [ "1", "b" ] ]
}
Problem:
I am adding tests and refactoring existing Spring @RestControllers.
The URLs used to be hard-coded in the REST annotations.
@GetMapping("/api/v1/{taxonomy}/animals")
public List<String> getAnimals() {
...
}
Naming scheme:
I started moving them into constants so the tests will be easier to maintain and not break on trivial URL changes.
I came up with my own naming scheme for the URL constants (see the code at the end of the question for my complete scheme).
public static final String SERVICE_URL = "/api/v1/{taxonomy}/animals";
public static final String PATH_VAR_ANIMAL = "animal";
Are there any best practices for this? I was unable to find anything, and all the examples I found use hard-coded strings.
URL construction:
Also, a co-worker pointed out that string concatenation for URL construction is risky, e.g. somebody could forget a '/' at the start of one URL part.
@PostMapping(URL_PART_ANIMAL + URL_PART_FUR)
Is there a preferred way to build these kinds of URLs, as they are not complete but only the end of an actual URL?
Complete example:
package com.huelfe.animal.api.v1.rest;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import javax.validation.constraints.NotBlank;
import javax.validation.constraints.NotEmpty;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@RestController
@RequestMapping(AnimalController.SERVICE_URL)
public class AnimalController {
public static final String SERVICE_URL = "/api/v1/{taxonomy}/animals";
public static final String PATH_VAR_ANIMAL = "animal";
public static final String URL_PART_ANIMAL = "/{" + PATH_VAR_ANIMAL + "}";
public static final String URL_PART_RELATIVE = "/relative";
public static final String URL_PART_RENAME = "/rename";
public static final String URL_PART_FUR = "/fur";
public static final String PARAM_NAME = "name";
public static final String PARAM_FUR = "fur";
@GetMapping
public List<String> getAnimals() {
...
}
@GetMapping(value = URL_PART_RELATIVE)
public Map<String, Collection<String>> getAnimalsRelatives() {
...
}
@PostMapping
public Result doAdd(@RequestParam(value = PARAM_NAME) @NotBlank String newAnimal) {
...
}
@DeleteMapping(value = URL_PART_ANIMAL)
public Result doRemove(@PathVariable(PATH_VAR_ANIMAL) String animal) {
...
}
@PutMapping(URL_PART_ANIMAL + URL_PART_RENAME)
public Result doRename(
@PathVariable(PATH_VAR_ANIMAL) String animal,
@RequestParam(value = PARAM_NAME) @NotBlank String newName) {
...
}
@PostMapping(URL_PART_ANIMAL + URL_PART_FUR)
public Result addFurs(
@PathVariable(PATH_VAR_ANIMAL) String targetAnimal,
@RequestParam(value = PARAM_FUR) @NotEmpty List<@NotBlank String> furs) {
...
}
}
I can't cast the input object to a DTO because of the error below:
ExecutionStrategy.resolveField() - Exception while fetching data java.lang.ClassCastException: java.util.LinkedHashMap incompatible with com.fathome.graphql.OffersDto
OffersDto inputObject = environment.getArgument("offersInput");
Please let me know what's wrong in the code below. Thanks in advance.
package com.fathome.graphql;
import graphql.schema.*;
import static graphql.Scalars.*;
import static graphql.schema.GraphQLFieldDefinition.newFieldDefinition;
import static graphql.schema.GraphQLInputObjectField.newInputObjectField;
import static graphql.schema.GraphQLInputObjectType.newInputObject;
import static graphql.schema.GraphQLList.list;
import static graphql.schema.GraphQLObjectType.newObject;
public class ManualGraphQLQuerySchema {
public static GraphQLObjectType offersResponse = newObject()
.name("OffersResponse")
.field(newFieldDefinition()
.name("offerName")
.type(GraphQLString))
.field(newFieldDefinition()
.name("offerId")
.type(GraphQLString))
.build();
public static GraphQLInputObjectType offersRequestType = GraphQLInputObjectType.newInputObject()
.name("OffersDto")
.field(newInputObjectField()
.name("offerName")
.type(GraphQLString))
.field(newInputObjectField()
.name("offerId")
.type(GraphQLString))
.build();
public static GraphQLObjectType queryType = newObject()
.name("QueryType")
.field(newFieldDefinition()
.name("offers")
.type(offersResponse)
.argument(GraphQLArgument.newArgument()
.name("offersInput")
.type(offersRequestType))
.dataFetcher(new OffersFetcher()))
.build();
}
package com.fathome.graphql;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"offerName",
"offerId"
})
public class OffersDto {
#JsonProperty("offerName")
private String offerName;
#JsonProperty("offerName")
public String getOfferName() {
return offerName;
}
#JsonProperty("offerName")
public void setOfferName(String offerName) {
this.offerName = offerName;
}
#JsonProperty("offerId")
private String offerId;
#JsonProperty("offerId")
public String getOfferId() {
return offerId;
}
#JsonProperty("offerId")
public void setOfferId(String offerId) {
this.offerId = offerId;
}
}
package com.fathome.graphql;
import graphql.schema.DataFetcher;
import graphql.schema.DataFetchingEnvironment;
public class OffersFetcher implements DataFetcher<OffersDto> {
@Override
public OffersDto get(DataFetchingEnvironment environment) {
//Can't cast input object DTO this is error in below line
//ExecutionStrategy.resolveField() - Exception while fetching data
//java.lang.ClassCastException: java.util.LinkedHashMap incompatible with com.fathome.graphql.OffersDto
OffersDto inputObject = environment.getArgument("offersInput");
//calling service to get offerdetails using inputObject
//for testing not calling service just returning mock object.
OffersDto offersDto = new OffersDto();
offersDto.setOfferName("123");
offersDto.setOfferId("456");
return offersDto;
}
}
In the reference link below, code similar to mine works fine:
Episode episode = environment.getArgument("episode");
ReviewInput review = environment.getArgument("review");
http://graphql-java.readthedocs.io/en/latest/execution.html
The values you get from environment.getArgument(...) will either be scalar values (strings, numbers etc) or a Map in case of an object GraphQL input type (like in your case).
You then need to do the deserialization yourself. Since you're using Jackson, it would look like this:
private final ObjectMapper objectMapper;
...
Object rawInput = environment.getArgument("offersInput");
OffersDto inputObject = objectMapper.convertValue(rawInput, OffersDto.class);
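Put together with the question's fetcher, a minimal sketch could look like this (the ObjectMapper is created inline here for brevity; in a real application you would probably share a single configured instance):

package com.fathome.graphql;

import com.fasterxml.jackson.databind.ObjectMapper;
import graphql.schema.DataFetcher;
import graphql.schema.DataFetchingEnvironment;

public class OffersFetcher implements DataFetcher<OffersDto> {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public OffersDto get(DataFetchingEnvironment environment) {
        // graphql-java delivers input objects as Map<String, Object>, so convert instead of casting.
        Object rawInput = environment.getArgument("offersInput");
        OffersDto input = objectMapper.convertValue(rawInput, OffersDto.class);

        // For testing, echo the converted input back instead of calling a real service.
        OffersDto offersDto = new OffersDto();
        offersDto.setOfferName(input.getOfferName());
        offersDto.setOfferId(input.getOfferId());
        return offersDto;
    }
}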
Check out graphql-java-tools for a schema-first approach, or graphql-spqr for code-first; both make DataFetchers completely transparent, so no manual steps like the above are needed.