Springdoc random api-docs generation - java

I am looking to generate an API that accepts different content types.
The problem I am facing is that if I run my application several times, I get different output documentation.
@RestController
public class MyRestController {

    @Operation(summary = "GetMyData", operationId = "gettt",
            responses = @ApiResponse(responseCode = "204", content = @Content(mediaType = "application/vnd.something")))
    @GetMapping(produces = "application/vnd.something")
    public ResponseEntity<Void> getSomethingElse() {
        return noContent().build();
    }

    @GetMapping(produces = TEXT_PLAIN_VALUE)
    public String get() {
        return "some text";
    }

    @GetMapping(produces = HAL_JSON_VALUE)
    public EntityModel<JsonResponse> getHal() {
        return EntityModel.of(new JsonResponse(),
                linkTo(MyRestController.class).slash("somelink").withSelfRel()
        );
    }

    @GetMapping(produces = APPLICATION_JSON_VALUE)
    public JsonResponse getJson() {
        return new JsonResponse();
    }
}
It currently generates incorrect api-docs:
"operationId": "gettt_1_1_1",
"responses": {
"200": {
"content": {
"application/hal+json": {
"schema": {
"$ref": "#/components/schemas/EntityModelJsonResponse"
}
},
"application/json": {
"schema": {
"$ref": "#/components/schemas/JsonResponse"
}
},
"text/plain": {
"schema": {
"type": "string"
}
}
},
"description": "OK"
},
"204": {
"content": {
"application/hal+json": {
"schema": {
"$ref": "#/components/schemas/EntityModelJsonResponse"
}
},
"application/vnd.something": {},
"text/plain": {
"schema": {
"type": "string"
}
}
},
"description": "No Content"
}
},
If I restart my server without changing the code, the following response is generated:
"operationId": "gettt_1",
"responses": {
"200": {
"content": {
"application/hal+json": {
"schema": {
"$ref": "#/components/schemas/EntityModelJsonResponse"
}
},
"application/json": {
"schema": {
"$ref": "#/components/schemas/JsonResponse"
}
},
"text/plain": {
"schema": {
"type": "string"
}
}
},
"description": "OK"
},
"204": {
"content": {
"application/vnd.something": {}
},
"description": "No Content"
}
},
I would expect that restarting my server would always generate the same documentation.

Have you looked at the documentation?
https://springdoc.github.io/springdoc-openapi-demos/springdoc-properties.html#swagger-ui-properties
You can use the swagger-ui properties, without having to override the standard way of sorting (operationsSorter and tagsSorter).
For example:
springdoc.swagger-ui.operationsSorter=method
springdoc.swagger-ui.tagsSorter=alpha
If you want an order on the server side, you can use an OpenApiCustomiser to sort the elements.
Here is some sample code that you can customize using Comparators, depending on the sorting logic you want.
Example, for sorting schemas in alphabetical order:
@Bean
public OpenApiCustomiser sortSchemasAlphabetically() {
    return openApi -> {
        Map<String, Schema> schemas = openApi.getComponents().getSchemas();
        openApi.getComponents().setSchemas(new TreeMap<>(schemas));
    };
}
Example, for sorting tags in alphabetical order:
@Bean
public OpenApiCustomiser sortTagsAlphabetically() {
    return openApi -> openApi.setTags(openApi.getTags()
            .stream()
            .sorted(Comparator.comparing(tag -> StringUtils.stripAccents(tag.getName())))
            .collect(Collectors.toList()));
}
You have full control over the element order, and you can sort the elements depending on your use case. For instance, a similar customiser can reorder the paths, as sketched below.
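A minimal sketch following the same pattern as the examples above (the bean name is illustrative), sorting the paths alphabetically:
@Bean
public OpenApiCustomiser sortPathsAlphabetically() {
    return openApi -> {
        // Paths extends LinkedHashMap, so re-inserting the entries in sorted order fixes the output order
        Paths sortedPaths = new Paths();
        openApi.getPaths().entrySet().stream()
                .sorted(Map.Entry.comparingByKey())
                .forEach(e -> sortedPaths.addPathItem(e.getKey(), e.getValue()));
        openApi.setPaths(sortedPaths);
    };
}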

One other flag mentioned here is springdoc.writer-with-order-by-keys; a configuration sketch follows.
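Assuming it is a simple boolean property, enabling it in application.yml would look something like this:
springdoc:
  writer-with-order-by-keys: true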

Related

How to change collection format when using Swagger and Spring Fox?

I am using Spring Fox to generate the OpenAPI 3.0 document for Swagger UI.
With springfox-boot-starter:3.0.0
#GetMapping("/test")
public void test(TestParams testParams) {
}
public class TestParams {
List<String> list1;
}
When I get the JSON api-docs, I get this:
{
// ...
"parameters": [
{
"name": "list1",
"in": "query",
"required": false,
"style": "pipeDelimited", << How to modify this to simple?
"schema": {
"type": "array",
"items": {
"type": "string"
}
}
}
]
// ...
}
Now I need a comma separator for this query field, not a pipe separator.
See https://swagger.io/docs/specification/serialization/
So how do I change the style from 'pipeDelimited' to 'simple'?
Thanks for the help!

Problem with quotas from the Google Cloud Endpoints Framework for Java

I'm developing a simple API with GAE standard and Cloud Endpoints to test the available quota feature.
Here's the Java code using the Endpoints Framework annotations:
package com.example.skeleton;

import com.google.api.server.spi.config.*;

@Api(
        name = "simpleapi",
        version = "v1",
        limitDefinitions = {
                @ApiLimitMetric(
                        name = "read-requests",
                        displayName = "Read requests",
                        limit = 10
                )
        }
)
public class MyApi {

    @ApiMethod(
            name = "hellosecuredWithQuotas",
            path = "hellosecuredWithQuotas",
            httpMethod = ApiMethod.HttpMethod.GET,
            apiKeyRequired = AnnotationBoolean.TRUE,
            metricCosts = {
                    @ApiMetricCost(
                            name = "read-requests",
                            cost = 1
                    )
            }
    )
    public Message helloSecuredWithQuotas(@Named("name") String name) {
        return new Message("hello " + name);
    }
}
So given the @Api annotation, the quota is 10 requests per minute.
I deploy the app in GAE.
I generate the OpenAPI JSON file (see below for the generated content) and deploy it to Cloud Endpoints using the gcloud CLI.
Finally, I use the generated client to call the endpoint in a loop, which calls the endpoint more than 10 times per minute.
... but unfortunately I never receive the expected HTTP status code of 429 Too Many Requests.
public class App {
public static void main(String []args) throws IOException {
HttpTransport httpTransport = new NetHttpTransport();
JsonFactory jsonFactory = new JacksonFactory();
Simpleapi simpleapi = new Simpleapi.Builder(httpTransport, jsonFactory, null)
.setApplicationName("test")
.build();
for (int i = 0; i < 1000 ; i++) {
Message titi = simpleapi.hellosecuredWithQuotas("foobar" + System.currentTimeMillis()).setKey("my-api-key-here").execute();
System.out.println(titi.getMessage());
}
}
}
Here is the generated openapi.json file:
{
"swagger": "2.0",
"info": {
"version": "1.0.0",
"title": "my-project-name-here.appspot.com"
},
"host": "my-project-name-here.appspot.com",
"basePath": "/_ah/api",
"schemes": [
"https"
],
"consumes": [
"application/json"
],
"produces": [
"application/json"
],
"paths": {
"/simpleapi/v1/hellosecuredWithQuotas": {
"get": {
"operationId": "SimpleapiHelloSecuredWithQuotas",
"parameters": [
{
"name": "name",
"in": "query",
"required": true,
"type": "string"
}
],
"responses": {
"200": {
"description": "A successful response",
"schema": {
"$ref": "#/definitions/Message"
}
}
},
"security": [
{
"api_key": []
}
],
"x-google-quota": {
"metricCosts": {
"read-requests": 10
}
}
}
}
},
"securityDefinitions": {
"api_key": {
"type": "apiKey",
"name": "key",
"in": "query"
}
},
"definitions": {
"Message": {
"type": "object",
"properties": {
"message": {
"type": "string"
}
}
}
},
"x-google-management": {
"metrics": [
{
"name": "read-requests",
"valueType": "INT64",
"metricKind": "GAUGE"
}
],
"quota": {
"limits": [
{
"name": "read-requests",
"metric": "read-requests",
"values": {
"STANDARD": 10
},
"unit": "1/min/{project}",
"displayName": "Read requests"
}
]
}
}
}
I was able to get this to work by setting the limit value in the ApiLimitMetric configuration to 100 or greater. If you'd like a maximum of 10 requests per minute, set the limit to 100 and the cost value in the ApiMetricCost configuration on the method to 10. This gives you the same 1:10 ratio as in your original post.
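For illustration, a sketch of that adjusted configuration (the same annotations as in the question, with only the numbers changed):
@Api(
        name = "simpleapi",
        version = "v1",
        limitDefinitions = {
                @ApiLimitMetric(
                        name = "read-requests",
                        displayName = "Read requests",
                        limit = 100   // 100 quota units per minute
                )
        }
)
public class MyApi {

    @ApiMethod(
            name = "hellosecuredWithQuotas",
            path = "hellosecuredWithQuotas",
            httpMethod = ApiMethod.HttpMethod.GET,
            apiKeyRequired = AnnotationBoolean.TRUE,
            metricCosts = {
                    @ApiMetricCost(name = "read-requests", cost = 10)   // each call costs 10 units => 10 calls per minute
            }
    )
    public Message helloSecuredWithQuotas(@Named("name") String name) {
        return new Message("hello " + name);
    }
}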

Elasticsearch wildcard query not working case-insensitively (for lower case)

I am trying to fetch records from Elasticsearch using wildcard queries.
Please find the query below:
GET my_index12/_search
{
"query": {
"wildcard": {
"code.keyword": {
"value": "*ARG*"
}
}
}
}
It's working and giving the expected results for the above query, but it is not working for the lowercase value:
GET my_index12/_search
{
"query": {
"wildcard": {
"code.keyword": {
"value": "*Arg*"
}
}
}
}
Try the following.
Mapping:
PUT my_index12
{
"settings": {
"analysis": {
"analyzer": {
"custom_analyzer": {
"type": "custom",
"tokenizer": "whitespace",
"char_filter": [
"html_strip"
],
"filter": [
"lowercase",
"asciifolding"
]
}
}
}
},
"mappings": {
"doc": {
"properties": {
"code": {
"type": "text",
"analyzer": "custom_analyzer"
}
}
}
}
}
Then run a query_string query:
GET my_index12/_search
{
"query": {
"query_string": {
"default_field": "code",
"query": "AB\\-7000*"
}
}
}
It will also work for ab-7000*
Let me know if it works for you.
You have to normalize your keyword field:
ElasticSearch normalizer
Something like this (from the documentation):
PUT index
{
"settings": {
"analysis": {
"normalizer": {
"my_normalizer": {
"type": "custom",
"char_filter": [],
"filter": ["lowercase", "asciifolding"]
}
}
}
},
"mappings": {
"_doc": {
"properties": {
"foo": {
"type": "keyword",
"normalizer": "my_normalizer"
}
}
}
}
}
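Because the normalizer lowercases values at index time, a lowercase wildcard pattern will then match regardless of the original casing. A sketch against the field from the mapping above:
GET index/_search
{
  "query": {
    "wildcard": {
      "foo": {
        "value": "*arg*"
      }
    }
  }
}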
UPDATE
Some additional info:
Only parts of the analysis chain that operate at the character level are applied. So for instance, if the analyzer performs both lowercasing and stemming, only the lowercasing will be applied: it would be wrong to perform stemming on a word that is missing some of its letters.
By setting analyze_wildcard to true, queries that end with a * will be analyzed and a boolean query will be built out of the different tokens, by ensuring exact matches on the first N-1 tokens, and prefix match on the last token.
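For example, a sketch of enabling that on the query_string query from the first answer (analyze_wildcard is a standard query_string option):
GET my_index12/_search
{
  "query": {
    "query_string": {
      "default_field": "code",
      "query": "*Arg*",
      "analyze_wildcard": true
    }
  }
}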

Elasticsearch High Level REST Client - Java Map with typed (sub)fields - dates, numbers, etc.

(clarification copied from a comment)
I have a java.util.Map that has different key-value pairs; some of the values are dates, some are numbers, some are strings, and some are also java.util.Maps that can themselves contain all of the above types. I am able to put the Map into the index, and I see that the Elasticsearch mapping is created automatically with the correct field types. Now I want to retrieve that Map and see dates, numbers, strings and nested Maps instead of what I currently get - just strings and Maps.
Further story:
I'm putting a java.util.Map in Elasticsearch using the following code:
public void putMap(String key, Map<String, ?> value) {
try {
IndexRequest ir = Requests.indexRequest(_index)
.id(key)
.type("doc")
.source(value);
Factory.DB.index(ir); // the high level rest client here
} catch (IOException ex) {
throw new RuntimeException(ex);
}
}
I am not able to create mappings explicitly as per my task.
For one of my indices it has created the following mapping, which is quite fine:
{
"rex": {
"mappings": {
"doc": {
"properties": {
"variables": {
"properties": {
"customer": {
"properties": {
"accounts": {
"properties": {
"dateOpen": {
"type": "date"
},
"id": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
},
"dateOfBirth": {
"type": "date"
},
"firstName": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"id": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"lastName": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"middleName": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
}
}
}
}
}
}
Now I am retrieving my structure back with the following code:
public Map<String, ?> getMap(String key) {
try {
GetRequest gr = new GetRequest(_index, "doc", key);
try {
GetResponse response = Factory.DB.get(gr);
if (!response.isExists()) {
return null;
}
Map<String, ?> ret = response.getSourceAsMap();
return ret;
} catch (ElasticsearchStatusException ee) {
if (ee.status().getStatus() == RestStatus.NOT_FOUND.getStatus()) {
return null;
} else {
throw new RuntimeException(ee);
}
}
} catch (IOException ex) {
throw new RuntimeException(ex);
}
}
The dates are returned as strings like "1957-04-29T00:00:00.000Z"
There's no Java object to map this document to as I have only Maps of Maps/Lists/values.
How do I make the Java REST client respect the mapping that Elasticsearch created for the index? response.getFields() returns an empty map.
In case that is impossible (e.g. 'the source is JSON/strings by design'), I am ready to retrieve the mapping in the most convenient form possible and walk through the result myself. Code to retrieve the Elasticsearch mapping would be appreciated.
Big thank you!
If you still want to fetch the type mapping and do the conversion manually, the Indices API has a Get Mapping command you can invoke with the Java Low Level REST Client.
String getMapping(RestHighLevelClient client, String index, String type) throws IOException {
String endpoint = index + "/_mapping/" + type;
Response response = client.getLowLevelClient().performRequest("GET", endpoint);
return EntityUtils.toString(response.getEntity());
}
But really I would recommend instead using something like Jackson for data binding. Bind the Map you get from Elasticsearch to a Java object that models the document, and let Jackson handle the type conversion for you.
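A rough sketch of that approach, assuming hypothetical Customer and Account classes that mirror the mapping shown above (the names are illustrative only):
// Hypothetical POJOs modelling the document; field names must match the map keys
public class Account {
    public String id;
    public Instant dateOpen;
}

public class Customer {
    public String id;
    public String firstName;
    public String middleName;
    public String lastName;
    public Instant dateOfBirth;
    public List<Account> accounts;
}

// Bind the raw source map to the typed objects
ObjectMapper mapper = new ObjectMapper()
        .registerModule(new JavaTimeModule());   // converts ISO-8601 strings like "1957-04-29T00:00:00.000Z" into java.time types

Map<String, ?> source = getMap(key);             // the map returned by getSourceAsMap()
Map<String, ?> variables = (Map<String, ?>) source.get("variables");
Customer customer = mapper.convertValue(variables.get("customer"), Customer.class);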

Problems consuming json+hal _embedded resources with spring RestTemplate

I have a simple use case where I would like to consume a resource collection that is represented with json+hal.
I use the Spring RestTemplate and have configured it to use the Jackson2HalModule.
When I debug my code, I find that the response object does contain accurate metadata (e.g. number of pages and resources) and response headers, but there is no content or links. I have looked at many articles and guides on the internet over the last day, and based on my findings I feel that my custom RestTemplate should be working for my use case.
If anybody can shed any light on this I would be eternally grateful.
My code for my service is as follows:
@Service
public class EventServiceImpl extends BaseService implements EventService {

    private static final String knownEntity = "59d786d642572853721728f6";
    private static String SERVICE_URL = "http://EVENTS-SERVER";
    private static String EVENTS_PATH = "/events";

    @Autowired
    @LoadBalanced
    protected RestTemplate restTemplate;

    @Override
    public ResponseEntity<PagedResources<Event>> fetchEventsList() {
        // acceptable media type
        List<MediaType> acceptableMediaTypes = Arrays.asList(HAL_JSON);
        // header
        HttpHeaders headers = new HttpHeaders();
        headers.setAccept(acceptableMediaTypes);
        HttpEntity<String> entity = new HttpEntity<String>(null, headers);
        ResponseEntity<PagedResources<Event>> response = getRestTemplateWithHalMessageConverter()
                .exchange(SERVICE_URL + EVENTS_PATH, HttpMethod.GET, entity, new ParameterizedTypeReference<PagedResources<Event>>() {});
        return response;
    }

    public RestTemplate getRestTemplateWithHalMessageConverter() {
        List<HttpMessageConverter<?>> existingConverters = restTemplate.getMessageConverters();
        List<HttpMessageConverter<?>> newConverters = new ArrayList<>();
        newConverters.add(getHalMessageConverter());
        newConverters.addAll(existingConverters);
        restTemplate.setMessageConverters(newConverters);
        return restTemplate;
    }

    private HttpMessageConverter getHalMessageConverter() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.registerModule(new Jackson2HalModule());
        MappingJackson2HttpMessageConverter halConverter = new TypeConstrainedMappingJackson2HttpMessageConverter(ResourceSupport.class);
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        halConverter.setSupportedMediaTypes(Arrays.asList(HAL_JSON));
        halConverter.setObjectMapper(objectMapper);
        return halConverter;
    }
}
And my simple model is:
public class Event {
private String name;
private String location;
private int capacity;
public String getName() {
return name;
}
public String getLocation() {
return location;
}
public int getCapacity() {
return capacity;
}
}
For completeness, here is a sample of the shape of the hal+json I am attempting to consume:
{
"_embedded": {
"events": [
{
"name": null,
"location": null,
"capacity": 0,
"currentState": "CANCELLED",
"_links": {
"self": {
"href": "http://192.168.1.6:2221/events/59d786d642572853721728f6"
},
"event": {
"href": "http://192.168.1.6:2221/events/59d786d642572853721728f6"
},
"reinstate": {
"href": "http://192.168.1.6:2221/events/59d786d642572853721728f6/reinstate"
},
"reschedule": {
"href": "http://192.168.1.6:2221/events/59d786d642572853721728f6/reschedule"
}
}
},
{
"name": null,
"location": null,
"capacity": 0,
"currentState": "ADVERTISED",
"_links": {
"self": {
"href": "http://192.168.1.6:2221/events/59d7f14342572812ceca7fc6"
},
"event": {
"href": "http://192.168.1.6:2221/events/59d7f14342572812ceca7fc6"
},
"cancel": {
"href": "http://192.168.1.6:2221/events/59d7f14342572812ceca7fc6/cancel"
},
"reschedule": {
"href": "http://192.168.1.6:2221/events/59d7f14342572812ceca7fc6/reschedule"
}
}
},
{
"name": null,
"location": null,
"capacity": 0,
"currentState": "ADVERTISED",
"_links": {
"self": {
"href": "http://192.168.1.6:2221/events/59d7f14742572812ceca7fc7"
},
"event": {
"href": "http://192.168.1.6:2221/events/59d7f14742572812ceca7fc7"
},
"cancel": {
"href": "http://192.168.1.6:2221/events/59d7f14742572812ceca7fc7/cancel"
},
"reschedule": {
"href": "http://192.168.1.6:2221/events/59d7f14742572812ceca7fc7/reschedule"
}
}
},
{
"name": null,
"location": null,
"capacity": 0,
"currentState": "ADVERTISED",
"_links": {
"self": {
"href": "http://192.168.1.6:2221/events/59d7f14c42572812ceca7fc8"
},
"event": {
"href": "http://192.168.1.6:2221/events/59d7f14c42572812ceca7fc8"
},
"cancel": {
"href": "http://192.168.1.6:2221/events/59d7f14c42572812ceca7fc8/cancel"
},
"reschedule": {
"href": "http://192.168.1.6:2221/events/59d7f14c42572812ceca7fc8/reschedule"
}
}
}
]
},
"_links": {
"self": {
"href": "http://192.168.1.6:2221/events{?page,size,sort}",
"templated": true
},
"profile": {
"href": "http://192.168.1.6:2221/profile/events"
}
},
"page": {
"size": 20,
"totalElements": 4,
"totalPages": 1,
"number": 0
}
}
EDIT: I can consume an individual Event with no problems.
I had a similar problem, and I ended up using org.springframework.hateoas.Resources.
For my example below, this class is located in org.springframework.hateoas:spring-hateoas:jar:0.25.2.RELEASE,
which is pulled in by org.springframework.boot:spring-boot-starter-data-rest:jar:2.1.7.RELEASE, so there's a good chance it's already on your classpath, assuming you declare the spring-boot-starter-data-rest dependency.
Here's a simple example (using your info):
RestTemplate restTemplate = new RestTemplate();
Resources<Event> resourceEvents = restTemplate.getForObject("http://192.168.1.6:2221/events", Resources.class);
List<Event> events = new ArrayList<>(resourceEvents.getContent());
There are probably some gotchas that make this not so straightforward, but hopefully this provides a start to solving your problem.
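One such gotcha, as a hedged note: getForObject(..., Resources.class) loses the Event type parameter, so the embedded items may come back as maps rather than Event objects. A sketch that keeps the generic type, assuming the HAL-configured RestTemplate from the question:
ResponseEntity<Resources<Event>> response = getRestTemplateWithHalMessageConverter()
        .exchange("http://192.168.1.6:2221/events", HttpMethod.GET, null,
                new ParameterizedTypeReference<Resources<Event>>() {});
Collection<Event> events = response.getBody().getContent();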
