I'm refactoring a legacy C++ system to SOA using gSOAP. We have some performance issues (very big XMLs), so my lead asked me to take a look at Protocol Buffers. I did, and it looks very cool (we need C++ and Java support). However, Protocol Buffers are a solution just for serialization, and now I need to send the result to a Java front-end. What should I use, from the C++ and Java perspectives, to send that serialized data over HTTP (just on an internal network)?
PS. Another guy is trying to speed up our gSOAP solution; I'm interested in Protocol Buffers only.
You can certainly send a binary payload in an HTTP request or an HTTP response. Just write the bytes of the protocol buffer directly into the request/response, and make sure to set the content type to "application/octet-stream". The client and server should be able to take care of the rest easily. I don't think you need anything more special than that on either end.
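For illustration, a minimal Java sketch of the sending side (the endpoint URL and message type are placeholders, and the C++ side is symmetric with any HTTP client library):

import com.google.protobuf.Message;

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// POST a protobuf message as raw bytes; the endpoint is a placeholder
static InputStream postProto(String endpoint, Message msg) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
    conn.setRequestMethod("POST");
    conn.setDoOutput(true);
    conn.setRequestProperty("Content-Type", "application/octet-stream");
    try (OutputStream out = conn.getOutputStream()) {
        msg.writeTo(out); // protobuf wire format goes straight into the body
    }
    // the caller parses the reply with SomeResponse.parseFrom(...)
    return conn.getInputStream();
}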
ProtoBuf is a binary protocol. It doesn't mix well with SOAP. I suggest you either stick with gSOAP or convert to ProtoBuf entirely.
With ProtoBuf, you define your protocol in a special format like this:
message Product {
  required string id = 1;
  required string description = 2;
  required int32 quantity = 3;
  optional bool discontinued = 4;
}
The protoc tool can generate code in C++/Java/Python, so you can serialize a message on one end and deserialize it on the other.
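For example, on the Java side the generated Product class gives you a round trip like this (a sketch, assuming protoc has been run on the .proto above):

// build, serialize, and parse back a Product using the protoc-generated class
Product p = Product.newBuilder()
    .setId("SKU-1")
    .setDescription("example product")
    .setQuantity(5)
    .build();

byte[] wire = p.toByteArray();            // serialize on one end (C++ or Java)
Product parsed = Product.parseFrom(wire); // deserialize on the other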
As you can see, ProtoBuf is designed to serialize individual objects. It doesn't provide all the facilities SOAP does, like headers. To get around this issue, we use ProtoBuf inside ProtoBuf: we define an Envelope like this,
message Envelope {
  enum Type {
    SEARCH = 1;
    SEARCH_RESPONSE = 2;
    RETRIEVE = 3;
    RETRIEVE_RESPONSE = 4;
  }
  required Type type = 1;
  required bytes encodedMessage = 2;

  message Header {
    required string key = 1;
    required bytes value = 2;
  }
  repeated Header headers = 3;
}
The encodedMessage field holds another serialized ProtoBuf message. Everything that would go in a SOAP header now goes into headers.
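As a sketch, wrapping a hypothetical Search message (not defined above) might look like this on the Java side:

import com.google.protobuf.ByteString;

// wrap a serialized inner message in the Envelope
Search search = Search.newBuilder().setQuery("widgets").build(); // hypothetical message
Envelope envelope = Envelope.newBuilder()
    .setType(Envelope.Type.SEARCH)
    .setEncodedMessage(search.toByteString())
    .addHeaders(Envelope.Header.newBuilder()
        .setKey("trace-id")
        .setValue(ByteString.copyFromUtf8("abc-123")))
    .build();

// receiving side: dispatch on type, then decode the inner message
if (envelope.getType() == Envelope.Type.SEARCH) {
    Search inner = Search.parseFrom(envelope.getEncodedMessage());
}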
Google frontends prefer application/protobuf.
The ProtocolBufferModel of the Google API client uses application/x-protobuf.
You can serialize/de-serialize protobuf-encoded data to/from strings. Send the serialized string as the body of an HTTP POST to Java and de-serialize it there; that is one approach. Another way is to make use of the protobuf Service interface. Protobuf allows you to define a service interface in a .proto file, and the protocol buffer compiler will generate service interface code and stubs in your chosen language. You then only need to implement the protobuf::RpcChannel and protobuf::RpcController classes to get a complete RPC framework, and you could probably write an HTTP wrapper for these classes (a rough sketch follows the links below). See the following links for more information:
http://code.google.com/apis/protocolbuffers/docs/proto.html#services
http://code.google.com/apis/protocolbuffers/docs/reference/cpp-generated.html#service
http://code.google.com/apis/protocolbuffers/docs/reference/cpp/google.protobuf.service.html
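To illustrate, here is a minimal, non-authoritative sketch of what such an HTTP wrapper could look like on the Java side; the URL-per-method mapping and the error handling are my assumptions, not part of the protobuf library:

import com.google.protobuf.Descriptors.MethodDescriptor;
import com.google.protobuf.Message;
import com.google.protobuf.RpcCallback;
import com.google.protobuf.RpcChannel;
import com.google.protobuf.RpcController;

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

class HttpRpcChannel implements RpcChannel {
    private final String baseUrl;

    HttpRpcChannel(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    @Override
    public void callMethod(MethodDescriptor method, RpcController controller,
                           Message request, Message responsePrototype,
                           RpcCallback<Message> done) {
        try {
            // one URL per RPC method; this mapping is an assumption
            URL url = new URL(baseUrl + "/" + method.getFullName());
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            try (OutputStream out = conn.getOutputStream()) {
                request.writeTo(out);
            }
            // parse the response bytes into the expected message type
            Message response = responsePrototype.newBuilderForType()
                .mergeFrom(conn.getInputStream())
                .build();
            done.run(response);
        } catch (Exception e) {
            controller.setFailed(e.getMessage());
            done.run(null);
        }
    }
}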
To my knowledge, protocol buffer support is available in both C++ and Java, so you should be able to exchange protocol buffer serialized data between the two systems.
That said, it seems your real question is "How do I send stuff over HTTP between a C++ backend and a Java client?"
It sounds like you need to learn how to use gSOAP; read the docs.
Alternatively you could host a RESTful web server from your C++ app: Look at this: https://stackoverflow.com/questions/298113/how-can-i-implement-a-restful-webservice-using-c++
Next you would need to access the data hosted on your new C++ RESTful server: Look at this: Rest clients for Java?
I'm trying to utilise com.google.cloud.dialogflow.v2.WebhookResponse to interact with my Dialogflow agent, but I'm having trouble responding back to the agent during fulfillment.
The response created doesn't follow the required specification, i.e. the agent expects the JSON to be fulfillmentText: "something", but the builder builds it in the format fulfillment_text. There's not enough documentation on how to use the API client correctly.
Does anyone have experience doing this in Java/Kotlin?
val response = WebhookResponse
    .newBuilder()
    .setFulfillmentText("Hello")
    .build()

println(response)
println(Gson().toJson(response))
Output:
fulfillment_text: "Hello"
{"bitField0_":0,"fulfillmentText_":"Hello","fulfillmentMessages_":
[],"source_":"","outputContexts_":[],"memoizedIsInitialized":1,"unknownFields":{"fields":{}},"memoizedSize":-1,"memoizedHashCode":0}
I'm using 'com.google.cloud:google-cloud-dialogflow:0.75.1-alpha' from https://cloud.google.com/dialogflow-enterprise/docs/reference/libraries/java
The library you're using is primarily designed as a client library, letting you send text to Dialogflow and having it determine the Intent and parameters (and possibly a response) from that text.
It sounds like you're trying to use this on the other end, in a webhook that handles fulfillment. It just isn't designed for that. The class was automatically generated from the ProtoBuf definition, which does not serialize to JSON and isn't designed to represent things that way.
You will need to build the JSON for the response yourself.
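For example, a minimal sketch using Gson directly (the fulfillmentText field name follows the v2 webhook JSON format; how you return the body depends on your webhook framework):

import com.google.gson.JsonObject;

// build the webhook reply by hand instead of serializing the proto class
JsonObject response = new JsonObject();
response.addProperty("fulfillmentText", "Hello");
String body = response.toString(); // {"fulfillmentText":"Hello"}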
I've looked through the examples at https://doc.akka.io/docs/akka-http/current/introduction.html for Akka HTTP routing, and strangely, for something built on top of Akka Streams, none of the examples connect to a stream.
Can somebody show a simple example of creating a Java DSL flow (not Scala please), and then connecting a Route directly to that flow?
Or am I missing the point and it's not possible directly, but instead requires some CompletionStage code within the Route to wait for the result of glue code that calls a Flow?
Edit: to clarify, the flow can do something like append a string to a posted request body.
Using Akka Streams to complete a route is definitely possible. It involves either:
- a WebSocket route (see examples in the docs), or
- a chunked HTTP response, since you typically do not know the size of the response if it's fed from a stream. You can create a Chunked entity from an Akka Streams Source of ByteStrings.
- You can also use other response types if the response size is known in advance; see the docs for HttpEntity about their specifics.
Michał's answer contains good links, so please give them a read. Akka HTTP is streaming by default, always, with its data (e.g. the entities). So, for example, to do a streaming "echo" that at the same time adds a suffix, you could do something like this:
path("test", () ->
// extract the request entity, it contains the streamed entity as `getDataBytes`
extractRequestEntity(requestEntity -> {
// prepare what to add as suffix to the incoming entity stream:
Source<ByteString, NotUsed> suffixSource =
Source.single(ByteString.fromString("\n\nADDS THIS AFTER INCOMING ENTITY"))
// concat the suffix stream to the incoming entity stream
Source<ByteString, Object> replySource = requestEntity.getDataBytes()
.concat(suffixSource);
// prepare and return the entity:
HttpEntity.Chunked replyEntity = HttpEntities.create(ContentTypes.TEXT_PLAIN_UTF8, replySource);
return complete(StatusCodes.OK, replyEntity);
})
);
That said, there are numerous ways to make use of the streaming capabilities, including framed JSON streaming and more. You should also give the docs page about the implications of streaming a read.
Is there any way to add a custom Via header in JAIN-SIP? I want to add the oc parameters from RFC 7339.
From a blog post I got the following example, but I'm not sure it will work. The quote from the post:
This could be easily achieved by adding some code to the implementation of the javax.sip.message.Message.addHeader(Header header) function.
void addHeader(Header header) {
    if (!(header instanceof InternalHeaderObject)
            && header instanceof ExtensionHeader) {
        ExtensionHeader extensionHeader = (ExtensionHeader) header;
        header = headerFactory.createHeader(extensionHeader.getName(), extensionHeader.getValue());
    }
    ...
}
I will start by saying you can absolutely handle custom Via headers, in terms of SIP, as long as it is valid SIP. For this RFC you just need to use viaHeader.setParameter/getParameter, if I am not missing something; see the sketch below.
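For instance, a rough sketch (the parameter names come from RFC 7339; the host, port, and values are placeholders):

import javax.sip.SipFactory;
import javax.sip.header.HeaderFactory;
import javax.sip.header.ViaHeader;

ViaHeader makeOcVia() throws Exception {
    HeaderFactory headerFactory = SipFactory.getInstance().createHeaderFactory();
    // placeholder host/port/transport; the branch is filled in later, so pass null
    ViaHeader via = headerFactory.createViaHeader("10.0.0.1", 5060, "UDP", null);
    // overload-control parameters defined in RFC 7339
    via.setParameter("oc", "0");
    via.setParameter("oc-algo", "loss");
    via.setParameter("oc-validity", "0");
    via.setParameter("oc-seq", "1282321615.782");
    return via;
}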
The blog post talks about creating your own header classes, which is not relevant to your needs as far as I can imagine. Custom header classes are tricky and inefficient. For example, JAIN SIP will automatically construct its own ViaHeader instance for inbound messages when parsing them. Plugging in a custom header to override the default Via internally will break a lot of validation promises and cause overhead.
If you have a showstopper case for custom header classes, I will gladly listen though.
I'm in over my head.
At the broadest level, I'm trying to expose an OData interface to an existing pool of data exposed by a service written using Mule. When my Mule service is invoked, if I detect that the URL is in OData format, I want to delegate processing down to something written in Java and then feed the response from that component back to my caller.
I found the Olingo and OData4j libraries. My problem is that these start from building a Web service, but that's too far upstream for me. I already have a Web service. What I need to understand is which components I need to implement in order to pass the URL (which I have in hand) onward to an OData parser, which will in turn invoke a data provider.
I'm a bit lost with this technology. Can someone point me to a very basic tutorial that clearly delineates this? Or can they give me a couple of steps like: "You have to implement A, B & C and then pass your URL into C.foo()"?
I've tried the Getting Started docs for both libraries, but they both start with "first we'll implement a Web service" and don't clearly delineate (to me, at least) where that leaves off and pure OData sets in.
Thanks.
The following code will help you get started consuming data from a service exposed via OData (using Apache Olingo).
URL url = new URL(/* your URL */);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty(HttpHeaders.ACCEPT, HttpContentType.APPLICATION_XML);
conn.connect();

InputStream content = conn.getInputStream();
Edm edm = EntityProvider.readMetadata(content, false);
After this you can use the static methods of the EntityProvider class to carry out various operations like read, update, and write; a sketch follows.
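For instance, to read a single entry (a sketch; the entity-set name "Products" and the XML content type are assumptions):

import java.io.InputStream;
import java.util.Map;

import org.apache.olingo.odata2.api.edm.Edm;
import org.apache.olingo.odata2.api.edm.EdmEntitySet;
import org.apache.olingo.odata2.api.ep.EntityProvider;
import org.apache.olingo.odata2.api.ep.EntityProviderReadProperties;
import org.apache.olingo.odata2.api.ep.entry.ODataEntry;

// parse the body of e.g. GET .../Products('42') into a property map
static Map<String, Object> readProduct(Edm edm, InputStream content) throws Exception {
    EdmEntitySet entitySet = edm.getDefaultEntityContainer().getEntitySet("Products");
    ODataEntry entry = EntityProvider.readEntry("application/xml",
        entitySet, content, EntityProviderReadProperties.init().build());
    return entry.getProperties();
}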
If you are using OData4j, go with the following code:
ODataConsumer demo_consumer = ODataConsumers.create(/* your URL */);
Enumerable<EntitySetInfo> demo_entitySetList = demo_consumer.getEntitySets();
for (EntitySetInfo entitySet : demo_entitySetList) {
    System.out.println(entitySet.getHref());
}
This sounds very like how we read RSS or other data feeds.
Since you have a URL, it can be read by an HTTP connector or even a polling HTTP connector.
The data can be streamed using a Java input stream (the default behaviour) or converted to a string (object-to-string transformer).
A simple Java component using OData4j can then process your content. It sounds like two simple components on a Mule flow.
R
I am developing an application that uses a RESTful API. A Java client sending a request to a standalone server is throwing an Unsupported Media Type exception.
The client code is as follows:
StringBuilder xml = new StringBuilder();
xml.append("<?xml version=\"1.0\" encoding=\"${encoding}\"?>").append("\n");
xml.append("<root>").append("\n");
xml.append("<user>").append("\n");
xml.append("<username>"+username+"</username>");
xml.append("\n");
xml.append("<password>"+pass+"</password");
xml.append("\n");
xml.append("</user>");
xml.append("</root>");
Representation representation = new StringRepresentation(xml.toString());
new ClientResource("http://localhost:7777/Auth").post(representation);
The server code is as follows:
new Server(Protocol.HTTP, 7777, TestServer.class).start();

String username = (String) getRequest().getAttributes().get("username");
String password = (String) getRequest().getAttributes().get("password");
StringRepresentation representation = null;
You are not passing the Content-Type header; I strongly recommend using an API like Apache HttpClient to produce such requests (and maybe reading the contents from a file).
@Riccardo is correct: the Restlet Resource on the server is checking the Content-Type header of the client's request to make sure the entity you're POSTing has a type it can support. Here's a Restlet 1.1 example. You'll notice that the Resource is set up to expect XML:
// Declare the kind of representations supported by this resource.
getVariants().add(new Variant(MediaType.TEXT_XML));
So maybe your server side doesn't declare the representations it can handle, or it does, and Restlet's automatic media type negotiation is detecting that your request doesn't have Content-Type: text/xml (or application/xml) set.
So, as @Riccardo suggests, use Apache HttpClient and call HttpRequest.setHeader("Content-Type", "text/xml"), or use Restlet's client library API to do this (it adds another abstraction layer on top of an HTTP client connector like Apache HttpClient).
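For instance, with the Restlet client code from the question, the fix can be as small as giving the representation an explicit media type (a sketch assuming Restlet 2.x package names):

import org.restlet.data.MediaType;
import org.restlet.representation.StringRepresentation;
import org.restlet.resource.ClientResource;

// same POST as in the question, but with Content-Type: text/xml set
StringRepresentation representation =
    new StringRepresentation(xml.toString(), MediaType.TEXT_XML);
new ClientResource("http://localhost:7777/Auth").post(representation);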