How to set method name prefix in swagger codegen? - java

(Newbie to Swagger)
In the swagger specification file, the operationId is the name of the operation, corresponding to the HTTP methods.
For example,
"/pet/findByStatus": {
"get": {
"tags": [
"pet"
],
"summary": "Finds Pets by status",
"description": "Multiple status values can be provided with comma separated strings",
"operationId": "findPetsByStatus",
As seen above, operationId = findPetsByStatus. Suppose I want to generate a prefix for all get operations in my java code, with prefix = 'get_'.
For example, I would expect the swagger codegen to produce all operations corresponding to HTTP GET methods with a prefix = 'get_'. Specifically, above, it might generate: get_findPetsByStatus.
Is there a way to tell swagger codegen to prefix methods?
Please note that I want to use swagger-codegen itself and not APIMatic-like alternatives.

Extend AbstractJavaCodegen (or one of its existing subclasses) and override the postProcessOperations method to prepend a prefix to each operation (the operationId property of the CodegenOperation class). See making-your-own-codegen-modules for instructions on building and running a custom codegen.
Pseudocode:
public class MyCodegen extends AbstractJavaCodegen { // or a subclass of it
    [...]
    @Override
    public Map<String, Object> postProcessOperations(Map<String, Object> objs) {
        super.postProcessOperations(objs);
        Map<String, Object> operations = (Map<String, Object>) objs.get("operations");
        if (operations != null) {
            List<CodegenOperation> ops = (List<CodegenOperation>) operations.get("operation");
            for (CodegenOperation operation : ops) {
                if (operation.httpMethod.equals("GET")) {
                    operation.operationId = "get_" + operation.operationId;
                }
                [...]
            }
        }
        return objs;
    }
}

Related

Parsing picocli-based CLI usage output into structured data

I have a set of picocli-based applications whose usage output I'd like to parse into structured data. I've written three different output parsers so far and I'm not happy with any of them (fragility, complexity, difficulty in extending, etc.). Any thoughts on how to cleanly parse this type of semi-structured output?
The usage output generally looks like this:
Usage: taker-mvo-2 [-hV] [-C=file] [-E=file] [-p=payoffs] [-s=millis] PENALTY
(ASSET SPREAD)...
Submits liquidity-taking orders based on mean-variance optimization of multiple
assets.
PENALTY risk penalty for payoff variance
(ASSET SPREAD)... Spread for creating market above fundamental value
for assets
-C, --credential=file credential file
-E, --endpoint=file marketplace endpoint file
-h, --help display this help message
-p, --payoffs=payoffs payoff states and probabilities (default: .fm/payoffs)
-s, --sleep=millis sleep milliseconds before acting (default: 2000)
-V, --version print product version and exit
I want to capture the program name and description, options, parameters, and parameter-groups along with their descriptions into an agent:
public class Agent {
private String name;
private String description = "";
private List<Option> options;
private List<Parameter> parameters;
private List<ParameterGroup> parameterGroups;
}
The program name is taker-mvo-2, and the (possibly multi-line) description comes after the (possibly multi-line) arguments list:
Submits liquidity-taking orders based on mean-variance optimization of multiple assets.
Options (in square brackets) should be parsed into:
public class Option {
private String shortName;
private String parameter;
private String longName;
private String description;
}
The parsed options' JSON is:
options: [ {
"shortName": "h",
"parameter": null,
"longName": "help",
"description": "display this help message"
}, {
"shortName": "V",
"parameter": null,
"longName": "version",
"description": "print product version and exit"
}, {
"shortName": "C",
"parameter": file,
"longName": "credential",
"description": "credential file"
}, {
"shortName": "E",
"parameter": file,
"longName": "endpoint",
"description": "marketplace endpoint file"
}, {
"shortName": "p",
"parameter": payoffs,
"longName": "payoffs",
"description": "payoff states and probabilities (default: ~/.fm/payoffs)"
}]
Similarly for the parameters which should be parsed into:
public class Parameter {
private String name;
private String description;
}
and parameter-groups which are surrounded by ( and )... should be parsed into:
public class ParameterGroup {
private List<String> parameters;
private String description;
}
The first hand-written parser I wrote walked the buffer, capturing the data as it progressed. It works pretty well, but it looks horrible, and it's horrible to extend. The second hand-written parser uses regexes while walking the buffer; it is better looking than the first but still ugly and difficult to extend. The third parser uses regexes throughout; probably the best looking of the bunch but still ugly and unmanageable.
I thought this text would be pretty simple to parse manually but now I'm wondering if ANTLR might be a better tool for this. Any thoughts or alternative ideas?
Model
It sounds like what you need is a model. An object model that describes the command, its options, option parameter types, option description, option names, and similar for positional parameters, argument groups, and potentially subcommands.
Then, once you have an object model of your application, it is relatively straightforward to render this as JSON or as some other format.
Picocli has an object model
You could build this yourself, but if you are using picocli anyway, why not leverage picocli's strengths and use picocli's built-in model?
CommandSpec
OptionSpec
PositionalParamSpec
ArgGroupSpec
and more...
Accessing picocli's object model
Commands can access their own model
Within a picocli-based application, a @Command-annotated class can access its own picocli object model by declaring a @Spec-annotated field. Picocli will inject the CommandSpec into that field.
For example:
@Command(name = "taker-mvo-2", mixinStandardHelpOptions = true, version = "taker-mvo-2 0.2")
class TakerMvo2 implements Runnable {
// ...
@Option(names = {"-C", "--credential"}, description = "credential file")
File file;
@Spec CommandSpec spec; // injected by picocli
public void run() {
for (OptionSpec option : spec.options()) {
System.out.printf("%s=%s%n", option.longestName(), option.getValue());
}
}
}
The picocli user manual has a more detailed example that uses the CommandSpec to loop over all options in a command to see if the option was defaulted or whether a value was specified on the command line.
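A rough sketch of that idea (this is not the manual's exact code; getParseResult() and hasMatchedOption(OptionSpec) are my assumptions about the API, so verify them against your picocli version):
public void run() {
    ParseResult parseResult = spec.commandLine().getParseResult();
    for (OptionSpec option : spec.options()) {
        String source = parseResult.hasMatchedOption(option) ? "command line" : "default";
        System.out.printf("%s=%s (from %s)%n", option.longestName(), option.getValue(), source);
    }
}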
Creating a model of any picocli command
An alternative way to access picocli's object model is to construct a CommandLine instance with the @Command-annotated class (or an object of that class). You can do this outside of your picocli application.
For example:
class Agent {
public static void main(String... args) {
CommandLine cmd = new CommandLine(new TakerMvo2());
CommandSpec spec = cmd.getCommandSpec();
// get subcommands
Map<String, CommandLine> subCmds = spec.subcommands();
// get options as a list
List<OptionSpec> options = spec.options();
// get argument groups
List<ArgGroupSpec> argGroups = spec.argGroups();
// ...
}
}
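To tie this back to the Agent and Option classes from the question, a minimal sketch of walking the CommandSpec into that model and rendering it as JSON with Jackson could look like the following. The picocli accessors used here (shortestName(), paramLabel(), usageMessage().description()) and the setters/getters on Agent and Option are assumptions to check against your code and your picocli version:
import java.util.ArrayList;
import java.util.List;
import com.fasterxml.jackson.databind.ObjectMapper;
import picocli.CommandLine;
import picocli.CommandLine.Model.CommandSpec;
import picocli.CommandLine.Model.OptionSpec;

class AgentBuilder {
    public static void main(String... args) throws Exception {
        CommandSpec spec = new CommandLine(new TakerMvo2()).getCommandSpec();

        Agent agent = new Agent();
        agent.setName(spec.name());                                      // "taker-mvo-2"
        agent.setDescription(String.join(" ", spec.usageMessage().description()));

        List<Option> options = new ArrayList<>();
        for (OptionSpec o : spec.options()) {
            Option opt = new Option();
            opt.setShortName(o.shortestName().replaceFirst("^-+", ""));  // "-C" -> "C"
            opt.setLongName(o.longestName().replaceFirst("^-+", ""));    // "--credential" -> "credential"
            opt.setParameter(o.paramLabel());                            // e.g. "file"
            opt.setDescription(String.join(" ", o.description()));
            options.add(opt);
        }
        agent.setOptions(options);

        // Positional parameters and argument groups can be mapped the same way
        // from spec.positionalParameters() and spec.argGroups().

        System.out.println(new ObjectMapper().writeValueAsString(agent)); // needs getters on Agent/Option
    }
}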

Refactor multiple cases from switch

TestDTO testDTO = new TestDTO();
for (Object attribute : row.getAttributes()) {
switch (attribute) {
case "CATEGORY":
testDTO.setCategory((String) attribute);
break;
case "DESCRIPTION":
testDTO.setDescription((String) attribute);
break;
case "NOTE":
testDTO.setNote((String) attribute);
break;
case "FEATURES":
testDTO.setFeatures((String) attribute);
break;
case "INDICATOR":
testDTO.setIndicator((String) attribute);
break;
case "LABEL":
testDTO.setLabel((String) attribute);
break;
case "TYPE":
testDTO.setType((String) attribute);
break;
default:
}
}
As you can see in the code above, we are using multiple cases to set the data. The code works fine.
Is there any way to reduce the number of cases needed to set this data?
The problem with the code above is maintainability: if we had 30 fields, we would need 30 cases.
Is there any other way to achieve the same result?
Without refactoring, you cannot do much to really help the situation. You will also need field-specific code for every field anyway - that much is unavoidable.
In the abstract, what you could do is implement a factory or strategy pattern and, for example, register a proper handler for every type of attribute - something like
Map<Object, BiConsumer<TestDTO, Object>> handlers; // then you can add, for example, handlers.put("TYPE", (d, a) -> d.setType((String) a))
and just iterate over the attributes:
row.getAttributes().forEach(a -> handlers.get(a).accept(testDTO, a)); // of course you need to handle situations like NPEs, missing keys, etc.
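A slightly fuller sketch of that handler-map idea (my own illustration, assuming the attribute value doubles as the lookup key, as in the original switch, and that the setters take Strings):
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiConsumer;

class AttributeHandlers {
    private static final Map<String, BiConsumer<TestDTO, String>> HANDLERS = new HashMap<>();
    static {
        HANDLERS.put("CATEGORY", TestDTO::setCategory);
        HANDLERS.put("DESCRIPTION", TestDTO::setDescription);
        HANDLERS.put("NOTE", TestDTO::setNote);
        // ... one entry per field, instead of one case per field
    }

    static void apply(TestDTO dto, Object attribute) {
        BiConsumer<TestDTO, String> handler = HANDLERS.get(attribute);
        if (handler != null) { // unknown attribute: skip it, like the default case
            handler.accept(dto, (String) attribute);
        }
    }
}
The loop then becomes row.getAttributes().forEach(a -> AttributeHandlers.apply(testDTO, a));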
For mapping objects you could also use an existing tool such as ObjectMapper or ModelMapper; it is quite possible these tools will resolve your issue out of the box.
The last and least (:)) solution is to use reflection: map the attribute to a field name, extract the setter, and so on. Don't do this :) - it is filthy, insecure, and hard to write and understand, and it will cause many issues you will regret. I only mention it because it is an option.
For a robust solution you can also build your association using enumerated types and method references, and conveniently encapsulate the map into a single type. Plus, it's pretty obvious how to add new fields:
enum DTOMap
{
CATEGORY(TestDTO::setCategory),
DESCRIPTION(TestDTO::setDescription);
private final BiConsumer<TestDTO, String> attributeConsumer;
private DTOMap(BiConsumer<TestDTO, String> attributeConsumer) {
this.attributeConsumer = attributeConsumer;
}
public static void execute(TestDTO testDTO, Object attribute) {
String attributeAsString = (String) attribute;
DTOMap.valueOf(attributeAsString.toUpperCase()).attributeConsumer.accept(testDTO, attributeAsString);
}
}
With this your switch statement can be reduced to a single line:
for (Object attribute : row.getAttributes()) {
DTOMap.execute(testDTO, attribute);
}
You can use reflection to refactor it like below:
TestDTO testDTO = new TestDTO();
for (Object attribute : row.getAttributes()) {
// getMethod and invoke throw checked exceptions (NoSuchMethodException, IllegalAccessException, InvocationTargetException); handle or declare them
Method method = testDTO.getClass().getMethod("set" + capitalize((String) attribute), String.class);
method.invoke(testDTO, (String) attribute);
}
The capitalize func:
public String capitalize(String string) {
return string.substring(0, 1).toUpperCase() + string.substring(1).toLowerCase();
}

How to deserialize an array with leading label in front without using wrapper object?

My web service (WS) receives an HTTP POST request, and JacksonJsonProvider deserializes the incoming JSON body. The DTO is simple:
public class SettingDTO {
private String key;
private String value;
...
}
The WS signature looks like this:
@POST
Response saveList(List<SettingDTO> list);
The WS expects an array as input. Example:
{
"settings": [
{
"key": "key1",
"value": "val1"
},
{
"key": "key2",
"value": "val2"
}
]
}
This results in an exception. Jackson does not know how to handle the leading "settings" label. If I try it without the label, just a plain array, it works well. But the requirement is set to use it the way it is.
One solution I know is to use a wrapper object, another DTO. I wonder if this could be solved without an extra wrapper? Maybe an annotation will do the job?
After deserialization, I want to end up with the populated List<Setting> settings ...
There are several ways to achieve this. If you don't want to use an extra wrapper class, one way is to read the inner JSON string first and then deserialize it to List<SettingDTO> as follows:
Code snippet
ObjectMapper mapper = new ObjectMapper();
JsonNode root = mapper.readTree(jsonStr);
String settingsStr = root.get("settings").toString();
List<SettingDTO> settings = mapper.readValue(settingsStr, new TypeReference<List<SettingDTO>>(){});
System.out.println(settings.toString());
Console output
[SettingDTO [key=key1, value=val1], SettingDTO [key=key2, value=val2]]
BTW, if you have tried to add @JsonRootName(value = "settings") to class SettingDTO, AFAIK, it doesn't work for a JSON array!
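If the goal is only to avoid a wrapper class in the resource signature, another option (my own sketch, not part of the answer above; it assumes a JAX-RS resource with Jackson on the classpath, and the @Path value is a placeholder) is to accept the raw body as a String and unwrap "settings" inside the method:
import java.io.IOException;
import java.util.List;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

@Path("/settings")
public class SettingsResource {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response saveList(String body) throws IOException {
        JsonNode root = MAPPER.readTree(body);
        List<SettingDTO> settings = MAPPER.convertValue(
                root.get("settings"), new TypeReference<List<SettingDTO>>() {});
        // ... process settings as before
        return Response.ok().build();
    }
}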

Find specific key-value pair in an array of object with rest assured

I have a json response something like this:
"someArray": [
{
"someProperty":"someValue",
// other properties that are not relevant for me
},
{
"someProperty":"someOtherValue",
// other properties that are not relevant for me
}
]
I want to check whether the someArray array has an object with a property named "someProperty" whose value is "someValue", but not fail the test if another object has the same property with a different value.
Is this possible? Until now I was using a static index because I had only one element in that array.
Here's a solution using JsonPath:
List<String> values = RestAssured.when().get("/api")
.then().extract().jsonPath()
.getList("someArray.someProperty");
Assert.assertTrue(values.contains("someValue"));
Will work for following response JSON:
{
"someArray":[
{
"someProperty":"someValue"
},
{
"someProperty":"someOtherValue"
}
]
}
Assuming you're using Java 8 or above, you should use Arrays.stream(someArray) and then use the filter method to select elements you desire.
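For illustration, a minimal sketch of that idea, reusing the jsonPath() extraction from the answer above and filtering with the Stream API (the /api endpoint is a placeholder):
import java.util.List;
import java.util.Map;
import io.restassured.RestAssured;
import org.junit.Assert;
import org.junit.Test;

public class SomeArrayStreamTest {
    @Test
    public void someArray_has_an_entry_with_someValue() {
        List<Map<String, Object>> someArray = RestAssured.when().get("/api")
                .then().extract().jsonPath().getList("someArray");

        boolean found = someArray.stream()
                .filter(o -> "someValue".equals(o.get("someProperty")))
                .findAny()
                .isPresent();
        Assert.assertTrue(found);
    }
}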
I haven't used REST-assured but based on their documentation, it looks like you should be able to use something like this below
@Test public void
lotto_resource_returns_200_with_expected_id_and_winners() {
when().
get("/lotto/{id}", 5).
then().
statusCode(200).
body("someArray", hasItems(hasEntry("someProperty", "someValue")));
}
This works if you add some kind of deserialization logic to convert each object to a map before using hasEntry.
Another solution is to use findAll
body("someArray.findAll {o -> o.someProperty == 'someValue'}.size()", greaterThan(0))
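For completeness, a self-contained version of the findAll approach might look like this (the endpoint, imports, and status-code check are additions for illustration):
import static io.restassured.RestAssured.when;
import static org.hamcrest.Matchers.greaterThan;
import org.junit.Test;

public class SomeArrayFindAllTest {
    @Test
    public void someArray_contains_an_object_with_someValue() {
        when()
            .get("/api") // placeholder endpoint
        .then()
            .statusCode(200)
            // GPath: keep only objects whose someProperty equals "someValue",
            // then assert that at least one such object exists
            .body("someArray.findAll { o -> o.someProperty == 'someValue' }.size()",
                  greaterThan(0));
    }
}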

Pass object inside existing model object, where the new object parameters are not defined and may change

Problem: I have a request body with a predefined POJO class, and inside this class I need to add another object as a parameter. At any given time this new object may have arbitrary properties/attributes/params. How can I achieve this?
{
"id": "{{id}}",
"enableTouchId": true,
"idleLogoutMinutes": 10,
"platformSpecificPreferences": {
"ios": {
"terms": "1234",
"privacy": "12345"
},
"web": {
"terms" : "abc"
},
"android": {
"newProperty" : "newValue"
}
}
}
The new object I am trying to add is platformSpecificPreferences, which, when hit via REST calls, might or might not have all the properties shown here. That is why I cannot use a predefined POJO class for platformSpecificPreferences and create an object of it.
Solution I tried:
I thought of using a JsonObject inside the request body, like this:
@JsonProperty("platformSpecificPreferences")
private JsonObject platformSpecificPreferences;
but the problem is that I am not able to hit the API; it doesn't accept this parameter and returns a 404.
Thanks in advance.
You can (and more or less must) use a predefined POJO for platformSpecificPreferences, but in the POJO you need to ignore values that are not given in the REST call.
You can do this with a Jackson annotation, @JsonIgnoreProperties(ignoreUnknown = true), placed above the class in the POJO.
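A minimal sketch of that suggestion (the property names are taken from the example payload; anything not declared on the class is ignored instead of causing a failure):
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;

@JsonIgnoreProperties(ignoreUnknown = true)
public class PlatformPreferences {
    private String terms;   // present for ios and web in the example
    private String privacy; // present only for ios in the example

    // getters and setters omitted for brevity
}
The platformSpecificPreferences field in the request DTO would then reference a type built from fields like private PlatformPreferences ios; rather than a JsonObject.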
