I've looked through the examples at https://doc.akka.io/docs/akka-http/current/introduction.html for Akka HTTP routing and, strangely for something built on top of Akka Streams, none of the examples connect to a stream.
Can somebody show a simple example of creating a Java DSL flow (not Scala please), and then connecting a Route directly to that flow?
Or am I missing the point and it's not possible directly, but instead requires some CompletionStage code within the Route to wait for the result of glue code that runs a Flow?
Edit: to clarify the flow can do something like append a string to a posted request body.
Using Akka Streams to complete a route is definitely possible. It involves either:
a WebSocket route (see the examples in the docs), or
a chunked HTTP response, since you typically do not know the size of the response if it is fed from a stream. You can create a Chunked entity from an Akka Streams Source of ByteStrings (a minimal sketch follows this list).
You can also use other response types if the response size is known in advance; see the docs for HttpEntity about their specifics.
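For illustration, a minimal sketch of completing a route with a chunked entity created from a stream; the path name and the numbers source are made up, and the usual Akka HTTP Java DSL imports (akka.http.javadsl.server.Directives.*, akka.http.javadsl.model.*, akka.stream.javadsl.Source) are assumed:

Source<ByteString, NotUsed> numbers =
    Source.range(1, 5).map(i -> ByteString.fromString("number: " + i + "\n"));

// the response size is not known up front, so wrap the stream in a Chunked entity
HttpEntity.Chunked chunkedEntity = HttpEntities.create(ContentTypes.TEXT_PLAIN_UTF8, numbers);

Route streamed = path("numbers", () ->
    complete(HttpResponse.create().withEntity(chunkedEntity)));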
Michał's answer contains good links, so please give them a read. Akka HTTP is streaming by default and throughout -- e.g. the request and response entities. So, for example, to do a streaming "echo" which at the same time adds a suffix, you could do something like this:
path("test", () ->
// extract the request entity, it contains the streamed entity as `getDataBytes`
extractRequestEntity(requestEntity -> {
// prepare what to add as suffix to the incoming entity stream:
Source<ByteString, NotUsed> suffixSource =
Source.single(ByteString.fromString("\n\nADDS THIS AFTER INCOMING ENTITY"))
// concat the suffix stream to the incoming entity stream
Source<ByteString, Object> replySource = requestEntity.getDataBytes()
.concat(suffixSource);
// prepare and return the entity:
HttpEntity.Chunked replyEntity = HttpEntities.create(ContentTypes.TEXT_PLAIN_UTF8, replySource);
return complete(StatusCodes.OK, replyEntity);
})
);
That said, there are numerous ways to make use of the streaming capabilities, including framed JSON streaming and more. You should also give the docs page about the implications of streaming a read.
I'm trying to stream the data from an HTTP (GET) response to another HTTP (POST) request. With the old HttpURLConnection I would take the response's InputStream, read parts into a buffer, and write them to the request's OutputStream.
I've already managed to do the same with HttpClient in Java 11 by creating my own Publisher that is used in the POST to write the request body. The GET request has a BodyHandler with ofByteArrayConsumer that sends the chunks to the custom Publisher which itself then sends the chunks to the subscribing HTTP POST request.
But I think this is not the correct approach; it looks like the API already provides a way to do this directly, without implementing publishers and subscribers myself.
There is HttpResponse.BodyHandlers.ofPublisher(), which returns a Publisher<List<ByteBuffer>> that I can use for the HTTP GET request. Unfortunately, for my POST request there is HttpRequest.BodyPublishers.fromPublisher, which expects a Publisher<? extends ByteBuffer>, so it seems that fromPublisher only works for a publisher that emits plain ByteBuffers and not one that sends lists of ByteBuffers for parts of the data.
Am I missing something here that would let me connect the body publisher from one request to the other?
You're not missing anything. This is simply a use case that is not supported out of the box for now. Though the mapping from ByteBuffer to List<ByteBuffer> is trivial, the inverse mapping is less so. One easy (if not optimal) way to adapt one to the other is to collect all the buffers in each list into a single buffer, possibly combining HttpResponse.BodyHandlers.ofPublisher() with HttpResponse.BodyHandlers.buffering() if you want to control the number of bytes in each published List<ByteBuffer> that you receive from upstream.
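For illustration only, here is a rough sketch of such an adapter; the URLs are placeholders, and the merge-each-list-into-one-buffer strategy is just the simple approach described above (each upstream list maps to exactly one downstream buffer, so the subscription's demand accounting can be passed through unchanged):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.ByteBuffer;
import java.util.List;
import java.util.concurrent.Flow;

public class StreamGetToPost {

    // Adapt a Publisher<List<ByteBuffer>> (from BodyHandlers.ofPublisher()) to a
    // Publisher<ByteBuffer> (for BodyPublishers.fromPublisher()) by merging each list.
    static Flow.Publisher<ByteBuffer> flatten(Flow.Publisher<List<ByteBuffer>> upstream) {
        return downstream -> upstream.subscribe(new Flow.Subscriber<List<ByteBuffer>>() {
            @Override public void onSubscribe(Flow.Subscription s) { downstream.onSubscribe(s); }
            @Override public void onNext(List<ByteBuffer> buffers) {
                ByteBuffer merged =
                    ByteBuffer.allocate(buffers.stream().mapToInt(ByteBuffer::remaining).sum());
                buffers.forEach(merged::put);
                downstream.onNext(merged.flip()); // one merged buffer per upstream list
            }
            @Override public void onError(Throwable t) { downstream.onError(t); }
            @Override public void onComplete() { downstream.onComplete(); }
        });
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // GET the source body as a reactive publisher of buffer lists
        HttpResponse<Flow.Publisher<List<ByteBuffer>>> get = client.send(
                HttpRequest.newBuilder(URI.create("https://example.org/source")).GET().build(),
                HttpResponse.BodyHandlers.ofPublisher());

        // POST the adapted publisher as the request body of the second call
        HttpRequest post = HttpRequest.newBuilder(URI.create("https://example.org/sink"))
                .POST(HttpRequest.BodyPublishers.fromPublisher(flatten(get.body())))
                .build();
        System.out.println(client.send(post, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}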
I'd like to log the original 'raw' request body (e.g. JSON) while using Camel Rest endpoints. What's the proper way to do this?
My setup (RouteBuilder) looks like this:
restConfiguration().component("jetty")
    .host(this.host)
    .port(this.port)
    .contextPath(this.contextPath)
    .bindingMode(RestBindingMode.json);

rest("myService/").post()
    .produces("application/json; charset=UTF-8")
    .type(MyServiceRequest.class)
    .outType(MyServiceResponse.class)
    .to(SERVICE_CONTEXT_IN);
from(SERVICE_CONTEXT_IN).process(this.serviceProcessor);
My problem here is that mechanisms such as storing the request as an Exchange property kick in 'too late' for this approach: any processor in the route runs after the binding has already taken place and consumed the request. Also, the CamelHttpServletRequest's InputStream has already been read and contains no data.
The first place to use the log EIP is directly before the single processor:
from(SERVICE_CONTEXT_IN)
    .log(LoggingLevel.INFO, "Request: ${in.body}")
    .process(this.serviceProcessor);
but at that point the ${in.body} is already an instance of MyServiceRequest. The added log above simply yields Request: x.y.z.MyServiceRequest#12345678. What I'd like to log is the original JSON prior to being bound to a POJO.
There seems to be no built-in way of enabling logging of the 'raw' request in RestConfigurationDefinition nor RestDefinition.
I could get rid of the automatic JSON binding and manually read the HTTP Post request's InputStream, log and perform manual unmarshalling etc. in a dedicated processor but I would like to keep the built-in binding.
I agree there is no way to log the raw request (I assume you mean the payload going through the wire before any automatic binding) using Camel Rest endpoints.
But taking Roman Vottner's suggestion into account, you may change your restConfiguration() as follows:
restConfiguration().component("jetty")
    .host(this.host)
    .port(this.port)
    .componentProperty("handlers", "#yourLoggingHandler")
    .contextPath(this.contextPath)
    .bindingMode(RestBindingMode.json);
where #yourLoggingHandler needs to be registered in your registry and implement org.eclipse.jetty.server.Handler. Please take a look at the Jetty documentation on writing custom handlers: http://www.eclipse.org/jetty/documentation/current/jetty-handlers.html#writing-custom-handlers.
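For illustration, a rough sketch of what such a handler might look like (Jetty 9.x API assumed, class name made up). Note that reading the body's InputStream here would consume it before Camel's binding runs, so this sketch only logs the request line; logging the payload itself would additionally require wrapping the request's input stream:

public class PayloadLoggingHandler extends org.eclipse.jetty.server.handler.AbstractHandler {

    @Override
    public void handle(String target, org.eclipse.jetty.server.Request baseRequest,
                       javax.servlet.http.HttpServletRequest request,
                       javax.servlet.http.HttpServletResponse response) {
        // log the request line and leave the body untouched
        System.out.println(request.getMethod() + " " + request.getRequestURI());
        // do not mark the request as handled, so processing continues into the Camel endpoint
    }
}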
In the end I 'solved' this by not using the REST DSL binding and adding a highly sophisticated processor for logging the payload:
restConfiguration().component("jetty")
    .host(this.host)
    .port(this.port)
    .contextPath(this.contextPath);

rest("myService/").post()
    .produces("application/json; charset=UTF-8")
    .to(SERVICE_CONTEXT_IN);

from(SERVICE_CONTEXT_IN)
    .process(this.requestLogProcessor)
    .unmarshal().json(JsonLibrary.Jackson, MyServiceRequest.class)
    .process(this.serviceProcessor)
    .marshal().json(JsonLibrary.Jackson);
All the requestLogProcessor does is read the in body as an InputStream, get the String out of it, log it, and eventually pass it on.
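For reference, a minimal sketch of what such a processor could look like (the class name is made up and the real requestLogProcessor may differ):

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class RequestLogProcessor implements Processor {

    private static final Logger LOG = LoggerFactory.getLogger(RequestLogProcessor.class);

    @Override
    public void process(Exchange exchange) throws Exception {
        // convert the streamed body to a String so it can be logged ...
        String payload = exchange.getIn().getBody(String.class);
        LOG.info("Raw request: {}", payload);
        // ... and set it back so the subsequent unmarshal step still has data to read
        exchange.getIn().setBody(payload);
    }
}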
You can solve this by:
Turning RestBindingMode off on your specific route and logging the incoming request string as it is.
Converting the JSON string to your IN type object using ObjectMapper.
At the end of the route, converting the Java object back to JSON and putting it in the exchange out body, since we turned RestBindingMode off (see the sketch after the snippet below).
rest("myService/").post()
.bindingMode(RestBindingMode.off)
.to(SERVICE_CONTEXT_IN);
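A rough sketch of the rest of such a route under those assumptions (the lambdas and the Jackson ObjectMapper usage here are illustrative, not taken from the original answer):

from(SERVICE_CONTEXT_IN)
    .log(LoggingLevel.INFO, "Raw request: ${body}")
    .process(exchange -> {
        // bind manually, since RestBindingMode is off
        ObjectMapper mapper = new ObjectMapper();
        MyServiceRequest in = mapper.readValue(exchange.getIn().getBody(String.class), MyServiceRequest.class);
        exchange.getIn().setBody(in);
    })
    .process(this.serviceProcessor)
    .process(exchange -> {
        // convert the result object back to JSON for the response
        ObjectMapper mapper = new ObjectMapper();
        exchange.getIn().setBody(mapper.writeValueAsString(exchange.getIn().getBody(MyServiceResponse.class)));
    });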
In my case, streamCaching did the trick, because the stream was readable only once: I was able to log the body but could not forward it any more until I enabled caching. I hope this might be of help to someone.
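For example, stream caching can be enabled directly on the route (a sketch; it can also be enabled for the whole CamelContext):

from(SERVICE_CONTEXT_IN)
    .streamCaching() // cache the streamed request body so it can be read more than once
    .log(LoggingLevel.INFO, "Request: ${body}")
    .process(this.serviceProcessor);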
I'm in over my head.
At the broadest level, I'm trying to expose an OData interface to an existing pool of data exposed by a service written using Mule. When my Mule service is invoked, if I detect that the URL is in OData format, I want to delegate processing down to something written in Java and then feed the response from that component back to my caller.
I found the Olingo and OData4j libraries. My problem is that these start from building a web service. But that's too far upstream for me. I already have a web service. What I need to understand is which components I need to implement in order to pass the URL (which I have in hand) onward to an OData parser which will, in turn, invoke a data provider.
I'm a bit lost with this technology. Can someone point me to a very basic tutorial that clearly delineates this? Or can they give me a couple of steps like: "You have to implement A, B & C and then pass your URL into C.foo()"?
I've tried the Getting Started docs for both libraries, but they both start with "first we'll implement a Web service" and don't clearly delineate (to me, at least) where that leaves off and pure OData sets in.
Thanks.
The following code will help you get started with consuming data from a service exposed via OData, using Apache Olingo:
URL url = new URL(/*your url*/);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty(HttpHeaders.ACCEPT, HttpContentType.APPLICATION_XML);
conn.connect();

InputStream content = conn.getInputStream();
Edm edm = EntityProvider.readMetadata(content, false);
After this you can use the static methods of the EntityProvider class for carrying out various operations like read, update, and write.
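For example, reading an entity set could look roughly like this; the entity set name "Products", the serviceRoot variable, and the second GET against the feed URL are assumptions for illustration (Olingo V2 API):

// a second GET against the feed URL of an entity set, e.g. serviceRoot + "/Products"
HttpURLConnection feedConn = (HttpURLConnection) new URL(serviceRoot + "/Products").openConnection();
feedConn.setRequestMethod("GET");
feedConn.setRequestProperty(HttpHeaders.ACCEPT, HttpContentType.APPLICATION_XML);
InputStream feedContent = feedConn.getInputStream();

EdmEntitySet productsSet = edm.getDefaultEntityContainer().getEntitySet("Products");
ODataFeed feed = EntityProvider.readFeed(HttpContentType.APPLICATION_XML, productsSet,
        feedContent, EntityProviderReadProperties.init().build());
for (ODataEntry entry : feed.getEntries()) {
    System.out.println(entry.getProperties()); // property name -> value map for each entry
}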
If you are using odata4j, go with the following code:
ODataConsumer demo_consumer = ODataConsumers.create(/*your URL*/);
Enumerable<EntitySetInfo> demo_entitySetList = demo_consumer.getEntitySets();
for (EntitySetInfo entitySet : demo_entitySetList) {
    System.out.println(entitySet.getHref());
}
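From there you can read the entities themselves; a short sketch (the entity set name "Products" and the property name "Name" are made up):

for (OEntity product : demo_consumer.getEntities("Products").execute()) {
    // each OEntity exposes its properties by name
    System.out.println(product.getProperty("Name").getValue());
}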
This sounds very much like how we read RSS or other data feeds.
Since you have a URL, it can be read by an HTTP connector or even a polling HTTP connector.
The data can be streamed using a Java input stream (the default behaviour) or converted to a string (object to string).
A simple Java component using OData4j can then process your content. It sounds like two simple components on a Mule flow.
R
I need to write an HTTP client to communicate with the Floodlight OpenFlow controller via its REST API.
For testing I did it in Python, and it worked OK. But now I'm in a situation where it has to be done in Java, in which I'm admittedly still a beginner. One of my apps uses AsyncHttpClient to dispatch async GET requests, and it works just fine. Now, as a Floodlight REST client, it has to do POST and DELETE with a JSON-encoded body. My code for an async POST request works very much as expected.
But no luck with DELETE.
Somehow it doesn't write the JSON string into the request body.
The code is almost identical to the POST version. For debugging, I don't feed an AsyncCompletionHandler instance to the execute() method.
System.out.println(ofEntry.toJson()); // prints {"name": "xyz"} as expected

Future<Response> f = httpClient.prepareRequest(new RequestBuilder("DELETE")
        .setUrl("http://" + myControllerBaseUrl + urlPathFlowPostDelete)
        .setHeader("content-type", "application/json")
        .setBody(ofEntry.toJson())
        .build()).execute();

Response r = f.get();
System.out.println(r.getStatusCode());   // returns 200
System.out.println(r.getResponseBody()); // returns {"status" : "Error! No data posted."}
Just to make sure, I peeked into the packet dump with Wireshark and found out the server isn't lying :)
The author of the library has written an extensive amount of relevant, valuable information, but unfortunately I can't find example code specifically for building a DELETE request.
I'd very much appreciate any suggestions, pointers, and of course pinpoint solutions!
Not sure that replying to my own question is appropriate here, but I have just found a related thread at the floodlight-dev Google group.
Problem with Static Flow Pusher DELETE REST method
So this could be a problem with the Floodlight REST API, which requires a message body in a DELETE request to identify what is to be deleted, whereas AHC is simply compliant with RFC 2616.
I will follow the thread at Google group, and see how it will conclude among developers.
I need my Play! application to accept an HTTP POST from another server.
Is there some simple way to handle an external HTTP POST, get the data, and send a response?
Some easy HTTP request listener?
Thanks
You could say nearly ALL HTTP requests come from a remote source, so this is how Play and all HTTP-based containers work by default!
However, to offer some advice: as you are sharing data between servers, and not with a client browser, I would check out renderXml and renderJSON in your controllers to return data in a way that your server will expect (as it is unlikely to be expecting HTML content??).
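For example, a controller action returning JSON could look roughly like this (Play 1.x style; the action name and map contents are made up):

public static void status() {
    Map<String, Object> data = new HashMap<String, Object>();
    data.put("status", "ok");
    renderJSON(data); // serializes the map and sets the JSON content type
}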
I agree with Codemwnci - besides those tips, you can take a look at the 'routes' file and mark your method to accept only POST:
POST /edit controllerName.methodName
Thanks for the answers. Once I have the routes, it is really easy to write the controller:
public static void accept() {
    InputStream inputStream = request.body;
    ...
    String response = "cmd=asynch-no-trace";
    renderText(response);
}