Hi, I am using the OpenAPI Generator Maven Plugin to generate some Java client code (using the Spring WebClient library). One of the endpoints of my spec returns binary content, like:
"schema": {
"type": "string",
"format": "binary"
}
The generated code uses java.io.File as the return type for that, like:
public Mono<ResponseEntity<File>> downloadWithHttpInfo(String filename) throws WebClientResponseException {
    ParameterizedTypeReference<File> localVarReturnType = new ParameterizedTypeReference<File>() {};
    return downloadRequestCreation(filename).toEntity(localVarReturnType);
}
When calling this generated method, the response code was 200 (i.e. OK from the server side), but I got the following error in my client code:
org.springframework.web.reactive.function.UnsupportedMediaTypeException:
Content type 'application/octet-stream' not supported for bodyType=java.io.File
This came from the toEntity() method, which is part of the Spring WebClient code instead of my code.
Is there a way to work around this? A: Instruct the OpenAPI Generator Maven Plugin not to use the java.io.File type but the Resource type instead? B: Somehow make WebClient able to decode application/octet-stream into java.io.File?
Found a solution: add the following options to the OpenAPI Generator Maven Plugin and then generate the code again, which replaces File with Resource:
<generatorName>java</generatorName>
<library>webclient</library>
<typeMappings>string+binary=Resource</typeMappings>
<importMappings>Resource=org.springframework.core.io.Resource</importMappings>
The above says: when the return type is string and the format is binary, map it to Resource, and import Resource as org.springframework.core.io.Resource. There you go.
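For reference, here is a minimal sketch of consuming the regenerated method and writing the body to disk. DefaultApi and the file names are placeholders, not something the generator necessarily produces for your spec:

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import org.springframework.core.io.Resource;
import org.springframework.http.ResponseEntity;

// Blocks for brevity; in a fully reactive flow you would keep composing the Mono instead.
void saveDownload(DefaultApi api) throws IOException {
    ResponseEntity<Resource> entity = api.downloadWithHttpInfo("report.pdf").block();
    Resource body = entity != null ? entity.getBody() : null;
    if (body != null) {
        try (InputStream in = body.getInputStream()) {
            Files.copy(in, Paths.get("report.pdf"), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}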
I had the exact same issue, but using Gradle instead of Maven.
Here is the syntax for doing the same in Gradle:
task generateClientSources(type: org.openapitools.generator.gradle.plugin.tasks.GenerateTask) {
    generatorName = 'java'
    // other configs ..
    configOptions = [
        // other configs ..
        library: 'webclient'
    ]
    typeMappings = [
        File: 'Resource'
    ]
    importMappings = [
        File: 'org.springframework.core.io.Resource'
    ]
}
Related
I am using the Atlassian library for schema validation of Swagger:
com.atlassian.oai.validator.restassured.OpenApiValidationFilter
private static final OpenApiValidationFilter SWAGGER_FILTER_ = new OpenApiValidationFilter(
        OpenApiInteractionValidator.createFor("swagger.yml")
                .withBasePathOverride("ApiBasePath")
                .build());
The above code works only when the swagger specs are available in src/main/api folder.
I am trying to read specs from src/main/swagger or src/main/api/swaggers folder.
" java.lang.RuntimeException: Could not find /api/swaggers/swagger.yml
on the classpath"
What am I missing here ?
You can use a real file system path instead:
OpenApiValidationFilter validationFilter = new OpenApiValidationFilter(
        OpenApiInteractionValidator.createFor(
                Paths.get(System.getProperty("user.dir"), "src", "main", "swagger", "swagger.yml").toString())
                .build());
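For context, the filter is then used in the usual Rest-Assured way; the endpoint path below is just a placeholder:

import static io.restassured.RestAssured.given;

// Every call made through the filter is validated against the loaded spec.
given()
        .filter(validationFilter)
.when()
        .get("/ApiBasePath/items")
.then()
        .statusCode(200);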
I'm trying to create API resources containing external file references in parameters and responses, and to get these references resolved using Swagger (support importing OpenAPI definitions with external references).
For that, I receive the YAML files as a file archive; there is a master main.yaml file from which the other files are referenced.
OpenAPIV3Parser openAPIV3Parser = new OpenAPIV3Parser();
ParseOptions options = new ParseOptions();
options.setResolve(true);
options.setFlatten(true);
OpenAPI openAPI = openAPIV3Parser.read(extractedLocation + "/main.yaml", null, options);
String openAPIContent = Yaml.mapper().writerWithDefaultPrettyPrinter().writeValueAsString(openAPI);
APIDefinitionValidationResponse apiDefinitionValidationResponse =
        OASParserUtil.validateAPIDefinition(openAPIContent, returnContent);
I tried this code snippet, but apiDefinitionValidationResponse reports an error whenever there is a $ref in the YAML file. If there is no $ref, apiDefinitionValidationResponse is a success and the API is created.
So I suspect the problem lies in the data passed to the OASParserUtil.validateAPIDefinition method (the validateAPIDefinition method itself has no issues; it has been validated and tested).
Could someone help me with this?
The generated YAML file has extensions: {} lines all over it.
Error messages in debug logs:
attribute info.license.extensions is unexpected
attribute info.extensions is unexpected
attribute components.schemas.ErrorListItem.extensions is unexpected
attribute components.schemas.MenuItem.extensions is unexpected
attribute components.schemas.Order.extensions is unexpected
What I can tell from the error message and your resulting YAML is that the transformation step adds some extensions: {} lines to the final YAML.
Having an extensions attribute at the places it complains about is not allowed by the OpenAPI specification.
It looks like your YAML serialization is too simple. Looking at the SerializerUtils from the openapi-generator, they have a bit more configuration.
The extra module takes care of serializing only the interesting parts of the OpenAPI object.
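As an illustration only (this is not the openapi-generator SerializerUtils, just a sketch of suppressing the empty containers that show up as extensions: {}), you could try configuring the mapper to skip null and empty values:

import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.swagger.v3.core.util.Yaml;
import io.swagger.v3.oas.models.OpenAPI;

public class OpenApiYamlDumper {
    // Copies swagger-core's preconfigured YAML mapper and drops null/empty properties,
    // so empty maps such as "extensions" are not written out.
    public static String toYaml(OpenAPI openAPI) throws Exception {
        ObjectMapper mapper = Yaml.mapper().copy();
        mapper.setSerializationInclusion(JsonInclude.Include.NON_EMPTY);
        return mapper.writerWithDefaultPrettyPrinter().writeValueAsString(openAPI);
    }
}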
I have a WSDL URL from which I have to create a template file that lists the parameters for a particular API, and then create a POJO file for that request. I tried using the soapui-api but was unable to do so because I could not fulfil the dependencies (I followed all the Stack Overflow help to resolve the jar issues, but it did not work):
Code:
WsdlProject project = new WsdlProject();
WsdlInterface[] wsdls = WsdlImporter.importWsdl(project, "http://XXXXX?wsdl");
WsdlInterface wsdl = wsdls[0];
for (com.eviware.soapui.model.iface.Operation operation : wsdl.getOperationList()) {
    WsdlOperation wsdlOperation = (WsdlOperation) operation;
    System.out.println("OP:" + wsdlOperation.getName());
    System.out.println("Request:");
    System.out.println(wsdlOperation.createRequest(true));
    System.out.println("Response:");
    System.out.println(wsdlOperation.createResponse(true));
}
In another approach, I tried to parse the WSDL URL using a parser and get the list of names of the possible requests. I was able to get the request list, but not the parameters required to create each request.
WSDLParser parser = new WSDLParser();
Definitions wsdl = parser.parse("http://XXXX?wsdl");
String str = wsdl.getLocalBindings().toString();
for (Message msg : wsdl.getMessages()) {
    for (Part part : msg.getParts()) {
        System.out.println(part.getElement());
    }
}
Please help me get the list of parameters from a WSDL URL with either approach.
Well, there are various standard approaches available for this. Try searching for the wsimport tool, which is one of the tools for doing this.
Here is a simple and good example:
WS_Import_tool
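For illustration, a typical invocation looks something like this (the target package is a placeholder; wsimport ships with JDK 8 but was removed from the JDK in Java 11, so newer JDKs need the JAX-WS tools added separately):
wsimport -keep -verbose -p com.example.generated http://XXXXX?wsdl
The -keep flag keeps the generated .java sources, and -p sets the package the request/response classes are generated into.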
One more way of doing this is:
Apache_CFX
If you want to generate them using Eclipse, that is also possible. Check it out:
How do you convert WSDLs to Java classes using Eclipse?
What errors are you facing with SoapUI? You can refer to this link for troubleshooting:
Generate_java_from_Wsdl
I have a message with a field of the "Any" well known type which can hold a serialized protobuf message of any type.
I want to convert this field to its json representation.
I know the field names are required, and typically you would need the generated classes loaded in the app for this to work, but I am looking for a way to do it with the descriptors.
First, I parse the descriptors:
FileInputStream descriptorFile = new FileInputStream("/descriptor");
DescriptorProtos.FileDescriptorSet fdp = DescriptorProtos.FileDescriptorSet.parseFrom(descriptorFile);
Then, I loop through the contained messages and find the correct one (using the "Any" type's URL, which contains the package and message name). I add this to a TypeRegistry, which is used to format the JSON.
JsonFormat.TypeRegistry.Builder typeRegistryBuilder = JsonFormat.TypeRegistry.newBuilder();
String messageNameFromUrl = member.getAny().getTypeUrl().split("/")[1];
for (DescriptorProtos.FileDescriptorProto file : fdp.getFileList()) {
    for (DescriptorProtos.DescriptorProto dp : file.getMessageTypeList()) {
        if (messageNameFromUrl.equals(String.format("%s.%s", file.getPackage(), dp.getName()))) {
            typeRegistryBuilder.add(dp.getDescriptorForType()); // Doesn't work.
            typeRegistryBuilder.add(MyConcreteGeneratedClass.getDescriptor()); // Works
            System.out.println(JsonFormat.printer()
                    .usingTypeRegistry(typeRegistryBuilder.build())
                    .preservingProtoFieldNames()
                    .print(member.getAny()));
            return;
        }
    }
}
The problem seems to be that parsing the descriptor file gives me access to DescriptorProtos.DescriptorProto objects, but I see no way to get the Descriptors.Descriptor object needed for the type registry. I can access the concrete class's descriptor with getDescriptor() and that works, but I am trying to format the JSON at runtime by reading a pre-generated descriptor file from outside the app, so I do not have that concrete class available to call getDescriptor() on.
What would be even better is if I could use the "Any" field's type URL to resolve the Type object and use that to generate the JSON, since it also appears to have the field numbers and names as required for this process.
Any help is appreciated, thanks!
If you convert a DescriptorProtos.FileDescriptorProto to Descriptors.FileDescriptor, the latter has a getMessageTypes() method that returns List<Descriptor>.
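A minimal Java sketch of that conversion, assuming the matched file has no imports (files with dependencies need those resolved first, as the Kotlin snippet below does); fileDescriptorProto stands for the DescriptorProtos.FileDescriptorProto matched in your loop above:

import com.google.protobuf.Descriptors;
import com.google.protobuf.util.JsonFormat;

// buildFrom() turns the proto form into a FileDescriptor whose message Descriptors
// can be registered directly (it throws DescriptorValidationException).
Descriptors.FileDescriptor fd = Descriptors.FileDescriptor.buildFrom(
        fileDescriptorProto, new Descriptors.FileDescriptor[0]);

JsonFormat.TypeRegistry registry = JsonFormat.TypeRegistry.newBuilder()
        .add(fd.getMessageTypes())
        .build();

// print() throws InvalidProtocolBufferException if the Any's type is not in the registry.
String json = JsonFormat.printer()
        .usingTypeRegistry(registry)
        .preservingProtoFieldNames()
        .print(member.getAny());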
The following is a snippet of Kotlin code taken from an open-source library I'm developing called okgrpc. It's a first-of-its-kind attempt to create a dynamic gRPC client/CLI in Java.
private fun DescriptorProtos.FileDescriptorProto.resolve(
    index: Map<String, DescriptorProtos.FileDescriptorProto>,
    cache: MutableMap<String, Descriptors.FileDescriptor>
): Descriptors.FileDescriptor {
    if (cache.containsKey(this.name)) return cache[this.name]!!
    return this.dependencyList
        .map { (index[it] ?: error("Unknown dependency: $it")).resolve(index, cache) }
        .let {
            val fd = Descriptors.FileDescriptor.buildFrom(this, *it.toTypedArray())
            cache[fd.name] = fd
            fd
        }
}
I'm using the Logstash Avro plugin.
My client is a Java application. I have a few schemas that use 'long' as a type, and each time I send them I see a wrong value after deserialization. I suppose there's some overflow in the Logstash Avro plugin.
Are there any workarounds for it? I don't want to send a string every time I have a big value...
Here are the code snippets for my case. I have a valid .avsc schema with a field like this:
{
  "name": "scoringId",
  "type": "long"
},
And then I have an Avro-generated DTO on the Java side, which I convert to a ByteArray.
My Kafka config is OK; it uses ByteArraySerializer:
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer::class.java)
In the Logstash conf I have this input:
input {
  kafka {
    bootstrap_servers => 'kafkaserver:9092'
    topics => ["bart.vector"]
    codec => avro { schema_uri => "C:\logstash-6.1.2\vectorInfoDWH.avsc" }
    client_id => "logstash-vector-tracking"
  }
}
It uses the Avro plugin. As a result I can access all of the fields and get correct values, except for longs (and timestamps, because they are translated as longs).
Any ideas?
The problem was with serialization/deserialization. Converting the message to Base64 on the Java side was the solution.
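A minimal sketch of that idea, assuming an Avro-generated class (ScoringRecord here is a placeholder for your generated DTO); the resulting Base64 bytes are what gets handed to the ByteArraySerializer:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Base64;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;

public class AvroBase64 {
    // Serializes the Avro DTO to its binary form and wraps it in Base64,
    // so the bytes reach Logstash untouched before the codec decodes them.
    public static byte[] toBase64AvroBytes(ScoringRecord record) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new SpecificDatumWriter<>(ScoringRecord.class).write(record, encoder);
        encoder.flush();
        return Base64.getEncoder().encode(out.toByteArray());
    }
}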