Schemas disappear from components when programmatically adding security scheme - java

I've recently converted from Springfox to Springdoc-openapi for generating my OpenAPI for my Spring Boot Rest API service.
Everything was working perfectly until I added a security scheme. Once I did that, my schemas no longer appear and an error appears on the Swagger UI page:
Could not resolve reference: Could not resolve pointer: /components/schemas/Ping does not exist in document
I am setting up my configuration programmatically, and have 2 groups.
I'm using Spring Boot v2.4.0 with springdoc-openapi-ui v1.5.1
Snippet of my pom.xml:
<dependency>
    <groupId>org.springdoc</groupId>
    <artifactId>springdoc-openapi-ui</artifactId>
    <version>1.5.1</version>
</dependency>
<dependency>
    <groupId>org.springdoc</groupId>
    <artifactId>springdoc-openapi-hateoas</artifactId>
    <version>1.5.1</version>
</dependency>
<dependency>
    <groupId>org.springdoc</groupId>
    <artifactId>springdoc-openapi-security</artifactId>
    <version>1.5.1</version>
</dependency>
Snippet from configuration:
@Bean
public GroupedOpenApi apiV1() {
    String[] paths = {"/v1/**"};
    String[] packagesToScan = {"com.test.controller"};
    return GroupedOpenApi.builder()
            .group("v1")
            .packagesToScan(packagesToScan)
            .pathsToMatch(paths)
            .addOpenApiCustomiser(buildV1OpenAPI())
            .build();
}

@Bean
public GroupedOpenApi apiV2() {
    String[] paths = {"/v2/**"};
    String[] packagesToScan = {"com.test.controller"};
    return GroupedOpenApi.builder()
            .group("v2")
            .packagesToScan(packagesToScan)
            .pathsToMatch(paths)
            .addOpenApiCustomiser(buildV2OpenAPI())
            .build();
}

public OpenApiCustomiser buildV1OpenAPI() {
    return openApi -> openApi.info(apiInfo().version("v1"));
}

public OpenApiCustomiser buildV2OpenAPI() {
    final String securitySchemeName = "Access Token";
    return openApi -> openApi.info(apiInfo().version("v2"))
            .addSecurityItem(new SecurityRequirement().addList(securitySchemeName))
            .components(new Components().addSecuritySchemes(securitySchemeName, new SecurityScheme()
                    .type(SecurityScheme.Type.APIKEY)
                    .in(SecurityScheme.In.HEADER)
                    .name(HttpHeaders.AUTHORIZATION)));
}

// Describe the apis
private Info apiInfo() {
    return new Info()
            .title("Title")
            .description("API Description");
}
For my v1 group, everything works fine. My Schemas appear on the Swagger UI page and I see them in the components section of the generated api-doc.
"components": {
"schemas": {
"ApplicationErrorResponse": {
...
}
},
"Ping": {
...
}
}
}
For my v2 group, the Schemas are not generated.
"components": {
"securitySchemes": {
"Access Token": {
"type": "apiKey",
"name": "Authorization",
"in": "header"
}
}
}
Any idea why my Schemas are not automatically scanned and added when adding the security scheme to the OpenAPI components programmatically? Am I missing something in my config?
Here's my request mapping in my controller.
@Operation(summary = "Verify API and backend connectivity",
        description = "Confirm connectivity to the backend, as well as verify the API service is running.")
@OkResponse
@GetMapping(value = API_VERSION_2 + "/ping", produces = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<Ping> getPingV2(HttpServletRequest request) {
    ...
}
And here's my @OkResponse annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD})
@Documented
@ApiResponse(responseCode = HTTP_200,
        description = HTTP_200_OK,
        headers = {
                @Header(name = CONTENT_VERSION_HEADER, description = CONTENT_VERSION_HEADER_DESCRIPTION, schema = @Schema(type = "string")),
                @Header(name = DEPRECATION_MESSAGE_HEADER, description = DEPRECATION_MESSAGE_HEADER_DESCRIPTION, schema = @Schema(type = "string")),
                @Header(name = DESCRIPTION_HEADER, description = DESCRIPTION_HEADER_DESCRIPTION, schema = @Schema(type = "string"))
        })
public @interface OkResponse {
}
My v1 mappings are defined similarly.

So it seems that when the OpenAPI components are created solely through an OpenApiCustomiser, the scanned components are ignored, or at least overwritten by the components specified in the customiser. (I could also have added all of my schemas programmatically, but that would have been very cumbersome to maintain.)
Changing my config to the following resolved my issue:
@Bean
public OpenAPI customOpenAPI() {
    final String securitySchemeName = "Access Token";
    return new OpenAPI()
            .addSecurityItem(new SecurityRequirement().addList(securitySchemeName))
            .components(new Components().addSecuritySchemes(securitySchemeName, new SecurityScheme()
                    .type(SecurityScheme.Type.APIKEY)
                    .in(SecurityScheme.In.HEADER)
                    .name(HttpHeaders.AUTHORIZATION)))
            .info(apiInfo());
}

@Bean
public GroupedOpenApi apiV1() {
    String[] paths = {"/v1/**"};
    String[] packagesToScan = {"com.test.controller"};
    return GroupedOpenApi.builder()
            .group("v1")
            .packagesToScan(packagesToScan)
            .pathsToMatch(paths)
            .addOpenApiCustomiser(buildV1OpenAPI())
            .build();
}

@Bean
public GroupedOpenApi apiV2() {
    String[] paths = {"/v2/**"};
    String[] packagesToScan = {"com.test.controller"};
    return GroupedOpenApi.builder()
            .group("v2")
            .packagesToScan(packagesToScan)
            .pathsToMatch(paths)
            .addOpenApiCustomiser(buildV2OpenAPI())
            .build();
}

public OpenApiCustomiser buildV1OpenAPI() {
    return openApi -> openApi.info(openApi.getInfo().version("v1"));
}

public OpenApiCustomiser buildV2OpenAPI() {
    return openApi -> openApi.info(openApi.getInfo().version("v2"));
}

// Describe the apis
private Info apiInfo() {
    return new Info()
            .title("Title")
            .description("API Description.");
}
While this does also add the Authorize button and security scheme to the v1 group, that can be ignored because those endpoints are not secured anyway (they belong to an internal API that should be going away soon).
This is probably the better solution anyway, since the Info is essentially identical between the groups.
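If the extra Authorize button on the v1 group ever does become a problem, one option (a sketch of my own, not verified against springdoc's customiser ordering) would be to strip the global security information back out in the v1 customiser:

import java.util.Collections;

public OpenApiCustomiser buildV1OpenAPI() {
    return openApi -> {
        openApi.info(openApi.getInfo().version("v1"));
        // Drop the globally registered security requirement and scheme from the v1 doc only.
        openApi.setSecurity(Collections.emptyList());
        if (openApi.getComponents() != null) {
            openApi.getComponents().setSecuritySchemes(null);
        }
    };
}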

Instead of creating a new Components object, you should just modify the existing one (note that OpenApiCustomiser's customise method returns void, so there is nothing to return from the lambda):
public OpenApiCustomiser buildV2OpenAPI() {
    final String securitySchemeName = "Access Token";
    return openApi -> {
        openApi.info(apiInfo().version("v2"))
                .addSecurityItem(new SecurityRequirement().addList(securitySchemeName));
        openApi.getComponents().addSecuritySchemes(securitySchemeName, new SecurityScheme()
                .type(SecurityScheme.Type.APIKEY)
                .in(SecurityScheme.In.HEADER)
                .name(HttpHeaders.AUTHORIZATION));
    };
}
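A small caveat worth adding (a defensive note of my own, not part of the original answer): if a group happens to have no scanned components at all, getComponents() may still be null at this point, so a null-guarded variant of the same idea would be:

public OpenApiCustomiser buildV2OpenAPI() {
    final String securitySchemeName = "Access Token";
    return openApi -> {
        openApi.info(apiInfo().version("v2"))
                .addSecurityItem(new SecurityRequirement().addList(securitySchemeName));
        if (openApi.getComponents() == null) {
            // Only create a fresh Components object when springdoc has not populated one.
            openApi.components(new Components());
        }
        openApi.getComponents().addSecuritySchemes(securitySchemeName, new SecurityScheme()
                .type(SecurityScheme.Type.APIKEY)
                .in(SecurityScheme.In.HEADER)
                .name(HttpHeaders.AUTHORIZATION));
    };
}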

Related

Is there any way to set "Retryable writes" to false in Spring Boot 2.2.1?

First time
I am trying to develop a controller that saves data in DocumentDB on AWS.
The first time it saves fine, but the second time, when I look up the record already saved in the database, change some data, and save again, I get this error:
Caused by: com.mongodb.MongoCommandException: Command failed with error 301: 'Retryable writes are not supported' on server aws:27017. The full response is {"ok": 0.0, "code": 301, "errmsg": "Retryable writes are not supported", "operationTime": {"$timestamp": {"t": 1641469879, "i": 1}}}
This my java code
@Service
public class SaveStateHandler extends Handler<SaveStateCommand> {

    @Autowired
    private MongoRepository repository;

    @Autowired
    private MongoTemplate mongoTemplate;

    @Override
    public String handle(Command command) {
        SaveStateCommand cmd = (SaveStateCommand) command;
        State state = buildState(cmd);
        repository.save(state);
        return state.getId();
    }

    private State buildState(SaveStateCommand cmd) {
        State state = State
                .builder()
                .activityId(cmd.getActivityId())
                .agent(cmd.getAgent())
                .stateId(cmd.getStateId())
                .data(cmd.getData())
                .dataAlteracao(LocalDateTime.now())
                .build();
        State stateFound = findState(cmd);
        if (stateFound != null) {
            state.setId(stateFound.getId());
        }
        return state;
    }

    private State findState(SaveStateCommand request) {
        Query query = new Query();
        selectField(query);
        where(request, query);
        return mongoTemplate.findOne(query, State.class);
    }

    private void selectField(Query query) {
        query.fields().include("id");
    }

    private void where(SaveStateCommand request, Query query) {
        query.addCriteria(new Criteria().andOperator(
                Criteria.where("activityId").is(request.getActivityId()),
                Criteria.where("agent").is(request.getAgent())));
    }
}
In AWS they suggest using retryWrites=false, but I don't know how to do it in Spring Boot.
I use Spring Boot 2.2.1.
I tried to do this:
@Bean
public MongoClientSettings mongoSettings() {
    return MongoClientSettings
            .builder()
            .retryWrites(Boolean.FALSE)
            .build();
}
But it did not work.
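For reference, the flag can also go straight into the connection string that Spring Boot reads, assuming the connection is configured through spring.data.mongodb.uri (user, password, host, and database below are placeholders):

spring.data.mongodb.uri=mongodb://<user>:<password>@<docdb-endpoint>:27017/<database>?retryWrites=false

With the plain property approach no custom MongoClientSettings bean is needed.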
=================================================================================
Second Time
I connected to AWS DocumentDB through an SSH tunnel.
I started my application with this database configuration:
@Configuration
@EnableConfigurationProperties({MongoProperties.class})
public class MongoAutoConfiguration {

    private final MongoClientFactory factory;
    private final MongoClientOptions options;
    private MongoClient mongo;

    public MongoAutoConfiguration(MongoProperties properties, ObjectProvider<MongoClientOptions> options, Environment environment) {
        this.options = options.getIfAvailable();
        if (StringUtils.isEmpty(properties.getUsername()) || StringUtils.isEmpty(properties.getPassword())) {
            properties.setUsername(null);
            properties.setPassword(null);
        }
        properties.setUri(createUri(properties));
        this.factory = new MongoClientFactory(properties, environment);
    }

    private String createUri(MongoProperties properties) {
        String uri = "mongodb://";
        if (StringUtils.hasText(properties.getUsername()) && !StringUtils.isEmpty(properties.getPassword())) {
            uri = uri + properties.getUsername() + ":" + new String(properties.getPassword()) + "@";
        }
        return uri + properties.getHost() + ":" + properties.getPort() + "/" + properties.getDatabase() + "?retryWrites=false";
    }

    @PreDestroy
    public void close() {
        if (this.mongo != null) {
            this.mongo.close();
        }
    }

    @Bean
    public MongoClient mongo() {
        this.mongo = this.factory.createMongoClient(this.options);
        return this.mongo;
    }
}
And locally it saves the data without error.
But if I deploy the updated API to AWS ECS and try to save, I get the same error.
=================================================================================
Dependencies
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-mongodb</artifactId>
    <version>2.2.1.RELEASE</version>
</dependency>
<dependency>
    <groupId>com.querydsl</groupId>
    <artifactId>querydsl-mongodb</artifactId>
    <version>4.1.4</version>
</dependency>
When you construct your connection string, you can include the parameter for disabling retryable writes by adding this to your connection URI:
?replicaSet=rs0&readPreference=primaryPreferred&retryWrites=false&maxIdleTimeMS=30000
Then use this when creating the database factory and mongo template (this example uses the reactive database factory, but the principle is the same for the SimpleMongoClientDatabaseFactory):
@Bean
fun reactiveMongoDatabaseFactory(
    @Value("\${spring.data.mongodb.uri}") uri: String,
    @Value("\${mongodb.database-name}") database: String
): ReactiveMongoDatabaseFactory {
    val parsedURI = URI(uri)
    return SimpleReactiveMongoDatabaseFactory(MongoClients.create(uri), database)
}
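For the blocking driver, a roughly equivalent Java sketch could look like the following (treat the class names as assumptions to adapt: in the spring-data-mongodb 2.2.x line the factory is SimpleMongoClientDbFactory, renamed to SimpleMongoClientDatabaseFactory in 3.x; the property names mirror the Kotlin example above):

@Bean
public MongoTemplate mongoTemplate(
        @Value("${spring.data.mongodb.uri}") String uri,
        @Value("${mongodb.database-name}") String database) {
    // The uri already carries ?retryWrites=false, so the client built from it
    // will not attempt retryable writes.
    com.mongodb.client.MongoClient client = com.mongodb.client.MongoClients.create(uri);
    return new MongoTemplate(new SimpleMongoClientDbFactory(client, database));
}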

swagger-core v3 ignores my configuration in javax.ws.rs.core.Application

Using swagger-core v3 2.1.6, these configurations produce the same openapi.json result.
No configuration:
@ApplicationPath("/rest")
public class AuthRestConfig extends Application {
    public AuthRestConfig() {
    }
}
Server configuration (this is the kind of config I need to use):
@ApplicationPath("/rest")
public class AuthRestConfig extends Application {

    public AuthRestConfig(@Context ServletConfig servletConfig) {
        super();
        OpenAPI oas = new OpenAPI();
        Info info = new Info()
                .title("Suma Automaton OpenAPI")
                .description("This is a sample server Petstore server. You can find out more about Swagger "
                        + "at [http://swagger.io](http://swagger.io) or on [irc.freenode.net, #swagger](http://swagger.io/irc/). For this sample, "
                        + "you can use the api key `special-key` to test the authorization filters.")
                .termsOfService("http://swagger.io/terms/")
                .contact(new Contact()
                        .email("baldodavi@gmail.com"));
        oas.info(info);
        String url = "/suma-automaton-ms";
        List<Server> servers = new ArrayList<>();
        Server server = new Server();
        server.setUrl(url);
        servers.add(server);
        oas.setServers(servers);
        SwaggerConfiguration oasConfig = new SwaggerConfiguration()
                .openAPI(oas)
                .prettyPrint(true)
                .resourcePackages(Stream.of("io.swagger.sample.resource").collect(Collectors.toSet()));
        try {
            new JaxrsOpenApiContextBuilder<>()
                    .servletConfig(servletConfig)
                    .application(this)
                    .openApiConfiguration(oasConfig)
                    .buildContext(true);
        } catch (OpenApiConfigurationException e) {
            throw new RuntimeException(e.getMessage(), e);
        }
    }
}
Any suggestion about this behavior? Is there something I misunderstood? Note that I get the same behaviour when using the standard configuration described in https://github.com/swagger-api/swagger-core/wiki/Swagger-2.X---Integration-and-configuration#jax-rs-application
Add this to your AuthRestConfig class:
import io.swagger.v3.jaxrs2.integration.resources.OpenApiResource;

@Override
public Set<Class<?>> getClasses() {
    Set<Class<?>> resources = new HashSet<>();
    resources.add(OpenApiResource.class);
    return resources;
}

Angular ng-swagger-gen: generate from many resources from one server

I am using Zuul with many swagger resources (I have different links to the specific api-docs), so my question is: how can I configure ng-swagger-gen to generate all classes from many resources?
This is my ng-swagger-gen config:
{
    "$schema": "./node_modules/ng-swagger-gen/ng-swagger-gen-schema.json",
    "swagger": "http://localhost:8080/v2/api-docs",
    "output": "src/api/",
    "apiModule": true
}
And this is my swagger config in the Zuul application:
@Primary
@Configuration
public class SwaggerConfig implements SwaggerResourcesProvider {

    @Override
    public List<SwaggerResource> get() {
        List<SwaggerResource> resources = new ArrayList<>();
        resources.add(swaggerResource("USER-SERVICE", "/api/user/v2/api-docs"));
        resources.add(swaggerResource("STORY-SERVICE", "/api/story/v2/api-docs"));
        return resources;
    }

    private SwaggerResource swaggerResource(String name, String location) {
        SwaggerResource swaggerResource = new SwaggerResource();
        swaggerResource.setName(name);
        swaggerResource.setLocation(location);
        swaggerResource.setSwaggerVersion("2.0");
        return swaggerResource;
    }
}

I can't upload files with graphql-java

I tried to upload a file with graphql-java. I looked at a solution to this topic: How to upload files with graphql-java?
I'm using graphql-java version 11.0, graphql-spring-boot-starter version 5.0.2, and graphql-java-kickstart version 7.5.0.
public class PartDeserializer extends JsonDeserializer<Part> {

    @Override
    public Part deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        return null;
    }

    @Bean
    public ObjectMapper objectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
        SimpleModule module = new SimpleModule();
        module.addDeserializer(Part.class, new PartDeserializer());
        objectMapper.registerModule(module);
        return objectMapper;
    }
}
@Configuration
public class GraphqlConfig {

    @Bean
    public GraphQLScalarType uploadScalarDefine() {
        return ApolloScalars.Upload;
    }
}
public Boolean testMultiFilesUpload(List<Part> parts, DataFetchingEnvironment env) {
    // get file parts from DataFetchingEnvironment; the parts parameter is not used
    List<Part> attachmentParts = env.getArgument("files");
    int i = 1;
    for (Part part : attachmentParts) {
        String uploadName = "copy" + i;
        try {
            part.write("your path:" + uploadName);
        } catch (IOException e) {
            e.printStackTrace();
        }
        i++;
    }
    return true;
}
scalar Upload
testMultiFilesUpload(files: [Upload!]!): Boolean
My query form-data in Postman looks like this:
operations
{ "query": "mutation($files: [Upload!]!) {testMultiFilesUpload(files:$files)}", "variables": {"files": [null,null] } }
map
{ "file0": ["variables.files.0"] , "file1":["variables.files.1"]}
file0
0.jpeg
file1
1.jpeg
This is the server response:
INFO 11663 --- [0.1-1100-exec-7] g.servlet.AbstractGraphQLHttpServlet : Bad POST multipart request: no part named "graphql" or "query"
What am I doing wrong?
You can try these dependencies:
<properties>
    <graphql-java.version>13.0</graphql-java.version>
    <graphql-java-kickstart-springboot.version>5.10.0</graphql-java-kickstart-springboot.version>
    <graphql-java-kickstart-tools.version>5.6.1</graphql-java-kickstart-tools.version>
    <graphql-java-kickstart-servlet.version>8.0.0</graphql-java-kickstart-servlet.version>
</properties>

<dependency>
    <groupId>com.graphql-java-kickstart</groupId>
    <artifactId>graphql-spring-boot-starter</artifactId>
    <version>5.9.0</version>
</dependency>
<dependency>
    <groupId>com.graphql-java-kickstart</groupId>
    <artifactId>graphql-java-tools</artifactId>
    <version>5.6.1</version>
</dependency>
<dependency>
    <groupId>com.graphql-java-kickstart</groupId>
    <artifactId>graphiql-spring-boot-starter</artifactId>
    <version>5.6.0</version>
</dependency>
But there is a problem with graphql file upload: we can't delete the temp file generated by graphql, because it is still in use and the file stream is never closed.
I suggest you use Apollo:
https://github.com/apollographql/apollo-android
It offers RxJava integration, Retrofit, subscriptions, and support for AutoValue. This will make your work easier, as there are no straightforward ways of building queries and parsing responses for GraphQL.

Replacing a Mocked Spring Boot Controller with the actual Controller

I'm new to Spring Boot and Testing.
tl;dr: How do I replace a @MockBean controller with the actual controller in a Spring Boot application, so that I can test that the controller is working instead of just testing that my objects are output correctly?
I'm writing a Gradle-managed API with these dependencies (from build.gradle):
// Spring Boot (2.0.5 Release)
compile('org.springframework.boot:spring-boot-starter-actuator')
compile('org.springframework.boot:spring-boot-starter-data-jpa')
compile('org.springframework.boot:spring-boot-starter-hateoas')
compile('org.springframework.boot:spring-boot-starter-web')
runtime('org.springframework.boot:spring-boot-devtools')
// Testing
testImplementation('org.junit.jupiter:junit-jupiter-api:5.3.1')
testRuntimeOnly('org.junit.jupiter:junit-jupiter-engine:5.3.1')
testCompile('org.springframework.boot:spring-boot-starter-test')
testCompile("org.assertj:assertj-core:3.11.1")
testCompile 'org.mockito:mockito-core:2.+'
I've got an API controller class with the following relevant code:
@Controller
public class ObjectivesApiController extends AbstractRestHelperFunctionality implements ObjectivesApi {
protected ObjectivesApiController(
UserRepository userRepository,
CompaniesRepository companiesRepository,
TeamsRepository teamsRepository,
ProjectsRepository projectsRepository,
OdiAssessmentRepository odiAssessmentRepository,
OdiCustomerRatingRepository odiCustomerRatingRepository,
OdiTechRatingRepository odiTechRatingRepository,
OdiValueRatingRepository odiValueRatingRepository,
ObjectivesRepository objectivesRepository,
KeyResultRepository keyResultRepository) {
super(
userRepository,
companiesRepository,
teamsRepository,
projectsRepository,
odiAssessmentRepository,
odiCustomerRatingRepository,
odiTechRatingRepository,
odiValueRatingRepository,
objectivesRepository,
keyResultRepository);
}
public ResponseEntity<KeyResult> createKeyResult(@ApiParam(value = "id", required = true) @PathVariable("id") Long id, @ApiParam(value = "keyResult", required = true) @Valid @RequestBody KeyResult keyResultDTO) {
KeyResult keyResult = KeyResultBuilder
.aKeyResult()
.withDescription(keyResultDTO.getDescription())
.withCompleted(keyResultDTO.getCompleted())
.build();
Objective parentObjective = objectivesRepository.findByObjectiveId(id);
parentObjective.addKeyResult(keyResult);
keyResultRepository.save(keyResult);
objectivesRepository.save(parentObjective);
return new ResponseEntity<KeyResult>(HttpStatus.CREATED);
}
public ResponseEntity<Objective> createObjective(@ApiParam(value = "objective", required = true) @Valid @RequestBody Objective objectiveDTO) {
Objective objective = ObjectiveBuilder
.anObjective()
.withDescription(objectiveDTO.getDescription())
.withCompleted(objectiveDTO.getCompleted())
.withKeyResults(objectiveDTO.getKeyResults())
.build();
objective.getKeyResults().forEach(keyResultRepository::save);
objectivesRepository.save(objective);
return new ResponseEntity<Objective>(HttpStatus.CREATED);
}
public ResponseEntity<Void> deleteAllLinkedKeyResults(@ApiParam(value = "id", required = true) @PathVariable("id") Long id) {
Objective subjectObjective = objectivesRepository.findByObjectiveId(id);
subjectObjective.getKeyResults().clear();
objectivesRepository.save(subjectObjective);
return new ResponseEntity<Void>(HttpStatus.NO_CONTENT);
}
public ResponseEntity<Void> deleteObjective(@ApiParam(value = "id", required = true) @PathVariable("id") Long id) {
objectivesRepository.delete(objectivesRepository.findByObjectiveId(id));
return new ResponseEntity<Void>(HttpStatus.NO_CONTENT);
}
public ResponseEntity<Void> deleteOneKeyResult(@ApiParam(value = "the id of the objective you want key results for", required = true) @PathVariable("objectiveId") Long objectiveId, @ApiParam(value = "the id of the key result", required = true) @PathVariable("keyResultId") Long keyResultId) {
Objective subjectObjective = objectivesRepository.findByObjectiveId(objectiveId);
KeyResult keyResult = keyResultRepository.findByKeyResultId(keyResultId);
subjectObjective.removeKeyResult(keyResult);
objectivesRepository.save(subjectObjective);
keyResultRepository.delete(keyResult);
return new ResponseEntity<Void>(HttpStatus.NO_CONTENT);
}
public ResponseEntity<List<Objective>> getAllObjectives() {
List<Objective> allObjectives = objectivesRepository.findAll();
return new ResponseEntity<List<Objective>>(allObjectives, HttpStatus.OK);
}
public ResponseEntity<List<KeyResult>> getKeyResultsForObjective(@ApiParam(value = "id", required = true) @PathVariable("id") Long id) {
Objective subjectObjective = objectivesRepository.findByObjectiveId(id);
List<KeyResult> allKeyResults = subjectObjective.getKeyResults();
return new ResponseEntity<List<KeyResult>>(allKeyResults, HttpStatus.OK);
}
public ResponseEntity<Objective> getObjective(@ApiParam(value = "id", required = true) @PathVariable("id") Long id) {
Objective subjectObjective = objectivesRepository.findByObjectiveId(id);
return new ResponseEntity<Objective>(subjectObjective, HttpStatus.OK);
}
public ResponseEntity<KeyResult> getKeyResultForObjective(@ApiParam(value = "the id of the objective you want key results for", required = true) @PathVariable("objectiveId") Long objectiveId, @ApiParam(value = "the id of the key result", required = true) @PathVariable("keyResultId") Long keyResultId) {
Objective subjectObjective = objectivesRepository.findByObjectiveId(objectiveId);
KeyResult subjecKeyResult = subjectObjective.getKeyResults().stream()
.filter(KeyResult -> keyResultId.equals(KeyResult.getKeyResultId()))
.findFirst()
.orElse(null);
return new ResponseEntity<KeyResult>(subjecKeyResult, HttpStatus.OK);
}
public ResponseEntity<Objective> updateObjective(@ApiParam(value = "id", required = true) @PathVariable("id") Long id, @ApiParam(value = "objective", required = true) @Valid @RequestBody Objective objectiveDTO) {
Objective existingObjective = objectivesRepository.findByObjectiveId(id);
Objective objective = ObjectiveBuilder
.anObjective()
.withObjectiveId(existingObjective.getObjectiveId())
.withDescription(objectiveDTO.getDescription())
.withCompleted(objectiveDTO.getCompleted())
.withKeyResults(objectiveDTO.getKeyResults())
.build();
objective.getKeyResults().forEach(keyResultRepository::save);
objectivesRepository.save(objective);
return new ResponseEntity<Objective>(HttpStatus.NO_CONTENT);
}
public ResponseEntity<KeyResult> updateKeyResult(@ApiParam(value = "the id of the objective you want key results for", required = true) @PathVariable("objectiveId") Long objectiveId, @ApiParam(value = "the id of the key result", required = true) @PathVariable("keyResultId") Long keyResultId, @ApiParam(value = "keyResult", required = true) @Valid @RequestBody KeyResult keyResultDTO) {
if (objectivesRepository.existsById(objectiveId) && keyResultRepository.existsById(keyResultId)) {
Objective subjectObjective = objectivesRepository.findByObjectiveId(objectiveId);
KeyResult subjecKeyResult = subjectObjective.getKeyResults().stream()
.filter(KeyResult -> keyResultId.equals(KeyResult.getKeyResultId()))
.findFirst()
.orElse(null);
KeyResult updatedKeyResult = KeyResultBuilder
.aKeyResult()
.withKeyResultId(subjecKeyResult.getKeyResultId())
.withDescription(keyResultDTO.getDescription())
.withCompleted(keyResultDTO.getCompleted())
.build();
keyResultRepository.save(updatedKeyResult);
Collections.replaceAll(subjectObjective.getKeyResults(), subjecKeyResult, updatedKeyResult);
objectivesRepository.save(subjectObjective);
}
return new ResponseEntity<KeyResult>(HttpStatus.NO_CONTENT);
}
}
For context on this class, all the AbstractRestHelper superclass does is create singletons of my repositories, which are then field-injected (unsure if this is the right term) into the controller. This pattern is repeated across all controllers, hence the clutter.
The API being implemented is a Swagger 2 API interface that keeps this controller free of annotations where possible.
The final piece is the test class. This is the core of my question.
@ExtendWith(SpringExtension.class)
@WebMvcTest(ObjectivesApiController.class)
class ObjectivesApiControllerTest {
@Autowired
private MockMvc mockMvc;
@MockBean
private ObjectivesApiController objectivesApiControllerMock;
@BeforeEach
void setUp() {
}
@AfterEach
void tearDown() {
}
@Test
void getAllObjectives() throws Exception {
// Create two objects to test with:
Objective testObjective1 = ObjectiveBuilder
.anObjective()
.withObjectiveId(1L)
.withDescription("Test Objective")
.withCompleted(false)
.build();
Objective testObjective2 = ObjectiveBuilder
.anObjective()
.withObjectiveId(2L)
.withDescription("Test Objective")
.withCompleted(true)
.build();
List<Objective> testList = new ArrayList<Objective>();
testList.add(testObjective1);
testList.add(testObjective2);
// Set expectations on what should be found:
when(objectivesApiControllerMock.getAllObjectives()).thenReturn(new ResponseEntity<List<Objective>>(testList, HttpStatus.OK));
// Carry out the mocked API call:
mockMvc.perform(get("/objectives"))
.andExpect(status().isOk())
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andExpect(jsonPath("$", hasSize(2)))
.andExpect(jsonPath("$[0].objectiveId", is(1)))
.andExpect(jsonPath("$[0].description", is("Test Objective")))
.andExpect(jsonPath("$[0].completed", is(false)))
.andExpect(jsonPath("$[1].objectiveId", is(2)))
.andExpect(jsonPath("$[1].description", is("Test Objective")))
.andExpect(jsonPath("$[1].completed", is(true)));
// Validate the response is what we expect:
verify(objectivesApiControllerMock, times(1)).getAllObjectives();
verifyNoMoreInteractions(objectivesApiControllerMock);
}
@Test
void getKeyResultsForObjective() throws Exception {
KeyResult testKeyResultWithParentObjective1 = KeyResultBuilder
.aKeyResult()
.withKeyResultId(1L)
.withCompleted(false)
.withDescription("My parent Key Result is 1")
.build();
KeyResult testKeyResultWithParentObjective2 = KeyResultBuilder
.aKeyResult()
.withKeyResultId(2L)
.withCompleted(true)
.withDescription("My parent Key Result is 1")
.build();
Objective testObjectiveWithKeyResults = ObjectiveBuilder
.anObjective()
.withObjectiveId(1L)
.withDescription("Test Objective")
.withKeyResults(new ArrayList<KeyResult>())
.withCompleted(false)
.build();
testObjectiveWithKeyResults.addKeyResult(testKeyResultWithParentObjective1);
testObjectiveWithKeyResults.addKeyResult(testKeyResultWithParentObjective2);
when(objectivesApiControllerMock.getKeyResultsForObjective(1L)).thenReturn(new ResponseEntity<List<KeyResult>>(testObjectiveWithKeyResults.getKeyResults(), HttpStatus.OK));
mockMvc.perform(get("/objectives/1/keyresult"))
.andExpect(status().isOk())
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andExpect(jsonPath("$", hasSize(2)))
.andExpect(jsonPath("$[0].keyResultId", is(1)))
.andExpect(jsonPath("$[0].description", is("My parent Key Result is 1")))
.andExpect(jsonPath("$[0].completed", is(false)))
.andExpect(jsonPath("$[1].keyResultId", is(2)))
.andExpect(jsonPath("$[1].description", is("My parent Key Result is 1")))
.andExpect(jsonPath("$[1].completed", is(true)));
}
}
My question is this:
Having mocked the objective controller using Mockito to validate that my objects are being formed properly, I now want to do the same thing but instead of mocking, I want to actually test the controller.
What do you think is the most naive way of getting this to work (I can refactor later)? The resources I've searched through either use different versions of JUnit or rely on Mockito rather than the actual controller.
Nothing fits quite right: since the controller is mocked, I'm not actually covering any code, and so the tests are worthless, right? The only thing I'm looking at is that the objects are formed properly, where I now need to check that the controller is functioning as it should, AND is returning well-formed objects.
Has anyone done anything similar? What approaches do you use to manage the testing of field-injected controllers?
Any advice on this would be hugely appreciated. I'd love to learn how people working on production grade applications are handling the testing of Spring Boot Apps with Controllers, Repos, etc.
Thanks so much!
You could use @SpyBean. That way you can either use it as it is or mock some calls. https://www.baeldung.com/mockito-spy
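A related sketch (an assumption of my own about the wiring, not part of the answer above; the controller, repository, and path names are taken from the question, and the static imports are the same ones the existing test already uses): keep the real ObjectivesApiController in the @WebMvcTest slice and mock only its repositories, so the controller code itself runs under MockMvc.

@ExtendWith(SpringExtension.class)
@WebMvcTest(ObjectivesApiController.class)
class ObjectivesApiControllerSliceTest {

    @Autowired
    private MockMvc mockMvc;

    // Mock the collaborators, not the controller under test.
    @MockBean private UserRepository userRepository;
    @MockBean private CompaniesRepository companiesRepository;
    @MockBean private TeamsRepository teamsRepository;
    @MockBean private ProjectsRepository projectsRepository;
    @MockBean private OdiAssessmentRepository odiAssessmentRepository;
    @MockBean private OdiCustomerRatingRepository odiCustomerRatingRepository;
    @MockBean private OdiTechRatingRepository odiTechRatingRepository;
    @MockBean private OdiValueRatingRepository odiValueRatingRepository;
    @MockBean private ObjectivesRepository objectivesRepository;
    @MockBean private KeyResultRepository keyResultRepository;

    @Test
    void getAllObjectivesGoesThroughTheRealController() throws Exception {
        // Stub the repository; the real controller wraps the result in a ResponseEntity.
        when(objectivesRepository.findAll()).thenReturn(Collections.emptyList());

        mockMvc.perform(get("/objectives"))
               .andExpect(status().isOk());

        verify(objectivesRepository).findAll();
    }
}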
