Spring: How to export object methods via a non-HTTP protocol - java

I have a collection of stateless scala/Java APIs, which might look like:
class UserService {
  def get(id: UserIDType): User = { ... }
  def update(user: User): User = { ... }
  ...
}
where User has a set of inspectable bean properties. I'd like to make these same APIs available not only over HTTP (which I know how to do) but also over other, more performant non-HTTP protocols (ideally also running in the same process as the HTTP server). More importantly, I want to automate as much as possible, including the generation of client APIs that match the original Java API and the dispatching of method calls in response to network requests.
I've found Spring's guide on remoting.
However, this looks limited to HTTP (text, not binary). What I'd really love is a library/other method that:
lets me scan for registered/annotated services and describes methods and parameters as data
lets me easily dispatch method calls (so that I'm in control of the sockets, communication protocols and whatnot, and can choose a more performant one than HTTP) — see the rough sketch after this list.
i.e. the kinds of things that Spring's DispatcherServlet does internally, but without the HTTP limitations.
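To make this concrete, here is a rough sketch of the kind of thing I'm after (the @RemoteService marker annotation is hypothetical and the code is only a sketch; Spring's getBeansWithAnnotation plus plain reflection would cover discovery, description and dispatch):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.Map;
import org.springframework.context.ApplicationContext;

// Hypothetical marker for services I want exposed over my own (non-HTTP) transport.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface RemoteService {}

class ReflectiveDispatcher {
    private final Map<String, Object> services;

    ReflectiveDispatcher(ApplicationContext ctx) {
        // Discovery: all @RemoteService beans, keyed by bean name.
        this.services = ctx.getBeansWithAnnotation(RemoteService.class);
    }

    // Description: expose method names and parameter types as plain data.
    void describe() {
        services.forEach((name, bean) -> {
            for (Method m : bean.getClass().getMethods()) {
                System.out.println(name + "." + m.getName() + Arrays.toString(m.getParameterTypes()));
            }
        });
    }

    // Dispatch: called once my own transport has decoded (serviceName, methodName, args).
    Object dispatch(String serviceName, String methodName, Object... args) throws Exception {
        Object target = services.get(serviceName);
        for (Method m : target.getClass().getMethods()) {
            if (m.getName().equals(methodName) && m.getParameterCount() == args.length) {
                return m.invoke(target, args);
            }
        }
        throw new NoSuchMethodException(serviceName + "." + methodName);
    }
}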
Ideas?
Background
I'd like to make a set of stateless service API calls available as network services with the following goals:
Some of the APIs should be available as REST calls from web pages/mobile clients
A superset of these APIs should be available internally within our company (e.g. from C++ libraries, python libraries) in a way that is as high-performance (read low-latency) as possible, particularly when the service and client are:
co-located in the same process, or
co-located on the same machine.
Automate wrapper code for the client and server. If I add a method to a service API, or an attribute to a class, no one should have to write additional code on the client or the server (e.g. I hit a button and a new client API that matches the original Java/Scala service is auto-generated).
(1) is easily achievable with Spring MVC. Without duplicating classes I can simply mark up the service (i.e. the service is also a controller):
@Controller
@Service
@RequestMapping(...)
class UserService {
  @RequestMapping(...)
  def get(@PathVariable("id") id: UserIDType): User = { ... }
  @RequestMapping(...)
  def update(@RequestBody user: User): User = { ... }
  ...
}
With tools like Swagger, (3) is easily achievable (I can trivially read the Swagger-generated JSON spec and auto-generate C++ or Python code that mimics the original service class, its methods, parameter names and the POJO parameter types).
However, HTTP is not particularly performant, so I'd like to use a different (binary) protocol here. Trivially, I can use this same spec to generate a Thrift .idl, or even directly generate client code that talks Thrift/Protocol Buffers (or any other binary protocol) while also presenting the client with an API that looks just like the original Java/Scala one.
Really -- the only tricky part is getting a description of the service methods and then actually dispatching method calls. So -- are there libraries to help me do that?
Notes:
I'm aware that mixing service and controller annotations disgusts some Spring MVC purists. However, my goal really is to have a single logical API and to automate all of its service embodiments (i.e. the controller part is redundant boilerplate that should be auto-generated). I'd REALLY rather not spend my life writing boilerplate code.

Related

Initializing JestClient when application calls multiple Elasticsearch endpoints

My API currently calls one Elasticsearch endpoint using JestClient. I want to add some functionality that requires calling a second, different Elasticsearch endpoint. How is this possible, when you have to specify the endpoint upon initializing JestClient?
@Provides
@Singleton
public JestClient jestClient() {
    JestClientFactory factory = new JestClientFactory();
    factory.setHttpClientConfig(
        new HttpClientConfig.Builder("http://localhost:9200")
            .build());
    return factory.getObject();
}
My application design uses Singleton classes for these initializations so I'm not sure how to fix this aside from using a different Elasticsearch client for my second endpoint.
Or you can create two clients and, based on the use case of each calling class, inject both clients or only the specific one it needs. You just need to give each client a different name (like clientV2 for the ES 2.x version and clientV5 for the ES 5.x version) to make it work. This is easy as well, since you know your use case and which client versions each of your classes needs.
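For example (a rough Guice-style sketch; the provider names and the second URL are purely illustrative, and other DI frameworks have equivalent qualifiers):

public class ElasticsearchModule extends AbstractModule {

    @Provides @Singleton @Named("clientV2")
    JestClient jestClientV2() {
        JestClientFactory factory = new JestClientFactory();
        factory.setHttpClientConfig(new HttpClientConfig.Builder("http://localhost:9200").build());
        return factory.getObject();
    }

    @Provides @Singleton @Named("clientV5")
    JestClient jestClientV5() {
        JestClientFactory factory = new JestClientFactory();
        factory.setHttpClientConfig(new HttpClientConfig.Builder("http://localhost:9201").build());
        return factory.getObject();
    }
}

// In a consuming class, inject whichever client(s) that class actually needs:
@Inject @Named("clientV5") JestClient clientV5;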
On a side note, active development of Jest stopped long ago, and Elasticsearch now provides an official Java client known as the Java High Level REST Client. So IMHO you should switch to the JHLRC to get the maximum benefit and to make future migrations easier, which I think matches your use case.
You could create a factory and return the client depending on a parameter.
The parameter could be the URL itself or just an enum that represents the endpoint.
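A minimal sketch of that idea (the enum name and the second URL are made up; clients are created lazily and cached so each endpoint still gets a single client instance):

public enum EsEndpoint {
    PRIMARY("http://localhost:9200"),
    SECONDARY("http://localhost:9201");

    private final String url;
    EsEndpoint(String url) { this.url = url; }
    public String url() { return url; }
}

public class JestClientProvider {
    // One cached client per endpoint, created on first use.
    private final Map<EsEndpoint, JestClient> clients = new EnumMap<>(EsEndpoint.class);

    public synchronized JestClient forEndpoint(EsEndpoint endpoint) {
        return clients.computeIfAbsent(endpoint, e -> {
            JestClientFactory factory = new JestClientFactory();
            factory.setHttpClientConfig(new HttpClientConfig.Builder(e.url()).build());
            return factory.getObject();
        });
    }
}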

EJB-like communication of REST components

I'm designing software with different components, that need to communicate with each other through a REST interface.
I've seen and implemented solutions, where the different components of the software (deployed on the same cluster) would communicate by declaring and injecting EJB beans. This way, it's really easy to standardize the communication by defining common interfaces and DTOs in a separate .jar dependency.
This level of comfort and standardization is what I'd like to achieve with RESTful services, between Java-based components of my software.
Imagine something like this:
Let's say, I have a REST Client (C) and a REST Server (S).
I'd like to be able to communicate between them, via a common interface, which is implemented in S and called by C. This common interface is in an API component (I).
It would mean that I would have an interface:
@RestController
@RequestMapping("/rest/user")
public interface UserController {
    @GetMapping("list")
    ResponseEntity<List<UsersModel>> getUserList(OAuth2LoginAuthenticationToken token);
}
In C it could be used like:
public class Sg {
private final UserController userController;
...
public void method(OAuth2LoginAuthenticationToken token) {
...
userController.getUserList(token);
...
}
}
Lastly, in S:
public class UserControllerImpl implements UserController {
#Override
public ResponseEntity<List<UsersModel>> getUserList(OAuth2LoginAuthenticationToken token) {
...
}
}
The only configuration needed is to tell the client the context root (and host address) of the server, everything else is present in the common interface in the form of annotations.
Since not all components are necessarily Java-based, it is important for the REST resource to be callable in a typical REST-like way, so those Java remote service calling mechanics are out of consideration.
I was looking into JAX-RS, which seems promising but is missing a couple of features that would be nice. For example, there isn't a common interface telling the client at which endpoint on the server the REST resource can be found, nor what the method names are, etc. AFAIK, on the client you can only call the method representing the HTTP verb of the request, which is a bummer.
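For example, with the plain JAX-RS 2.0 client API the call site only speaks HTTP rather than the shared Java interface (a sketch; the base URL is made up):

Client client = ClientBuilder.newClient();
List<UsersModel> users = client
        .target("http://server:8080/app")             // host + context root of S
        .path("rest/user/list")
        .request(MediaType.APPLICATION_JSON)
        .get(new GenericType<List<UsersModel>>() {});  // you call get(), not getUserList()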
Am I out of my mind with this spec? I'm not really experienced with REST services yet, so I don't really know if I'm speaking of something that is out of the REST services scope. Is there an already existing solution to the problem I face?
After more thorough research, I found that RESTeasy already has a solution for this.
You need to use the ProxyBuilder to create a proxy of your interface and that's it.
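Roughly like this (a sketch; it assumes the shared interface carries JAX-RS annotations such as @Path/@GET rather than the Spring MVC ones shown above, and the base URL is illustrative):

ResteasyClient client = new ResteasyClientBuilder().build();
ResteasyWebTarget target = client.target("http://server:8080/app/rest");

// The proxy implements the shared interface, so C just calls its methods:
UserController userController = ProxyBuilder.builder(UserController.class, target).build();
ResponseEntity<List<UsersModel>> users = userController.getUserList(token);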

Expose Web Services to third party applications?

I've been asked in tech discussions how we would write an application for internal use in the firm and also expose it as an API to third-party clients.
I am assuming this is in the context of web services. I am thinking: won't the third party simply call the endpoint and consume the response?
Clearly, that answer is raw, and I am missing the point.
Is there a known approach, or any Frameworks to do this?
What are the considerations here? And how do we address them?
You would write and expose the RESTful services the same way for internal and external users; however, when you expose them to external clients you have to be careful about some of the following points:
Security - If your API is secured, how are you going to achieve this? You can leverage external identity providers such as Azure AD or Auth0 (https://auth0.com) to secure your APIs.
Rate limiting - Do you want to cap the number of calls from external users? e.g. a free tier might only allow 100 req/min (see the filter sketch below).
Sign-up process - For external users you need to define how they sign up for your services (and acquire a token) to access them.
Scalability - Your APIs should be scalable.
HATEOAS - This is a very important REST principle. If you follow this pattern, your external users can explore your API more easily just by following links (https://en.wikipedia.org/wiki/HATEOAS).
Open API - Your API should be very well documented, and OpenAPI (Swagger) is very much the standard now (https://swagger.io/specification/).
You can do all of these tasks yourself, or you can use an API manager to do it for you.
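To illustrate the rate-limiting point above, a very rough JAX-RS filter could look like this (the header name, the limit and the 429 response are all illustrative; a real implementation would reset counters per time window or, better, delegate this to an API manager):

@Provider
public class RateLimitFilter implements ContainerRequestFilter {

    private static final int LIMIT_PER_WINDOW = 100;
    private final ConcurrentHashMap<String, AtomicInteger> counts = new ConcurrentHashMap<>();

    @Override
    public void filter(ContainerRequestContext ctx) {
        // Hypothetical API-key header identifying the external caller.
        String apiKey = ctx.getHeaderString("X-Api-Key");
        AtomicInteger count = counts.computeIfAbsent(apiKey == null ? "anonymous" : apiKey,
                k -> new AtomicInteger());
        if (count.incrementAndGet() > LIMIT_PER_WINDOW) {
            ctx.abortWith(Response.status(429).entity("Rate limit exceeded").build());
        }
    }
}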
One concrete way to achieve this is with a REST API secured using Json Web Tokens (JWT). On each REST endpoint, you can specify the roles that are allowed to call that endpoint.
For your use case, you could achieve this with a "system" role for internal callers, and unauthorized (i.e. no role) for external callers.
A framework you can use to achieve this is MicroProfile JWT, which might look something like this:
#Path("/rank")
#ApplicationScoped
public class RankingService {
#GET
#Path("/{playerId}")
public long getRank(#PathParam("playerId") String id) {
// get the ranking info for a player
// anyone is allowed to do this
}
#POST
#RolesAllowed({ "system" })
#Path("/{playerId}")
public void recordGame(#PathParam("playerId") String id,
#QueryParam("place") int place,
#HeaderParam("Authorization") String token) {
// update player ranking information
// only internal users are allowed to update ranks!
}
}
Here is a link to a talk that I gave at a conference that walks through securing a REST endpoint using MicroProfile JWT.

Bypassing JAX-WS SOAP overhead with JAX-RS/Jersey?

The only web services I've ever integrated with or used have been RESTful. I'm now trying to integrate with a 3rd party SOAP service and am awe-struck at how seemingly convoluted SOAP appears to be.
With REST, I use a JAX-RS client called Jersey that makes hitting RESTful endpoints a piece a' cake. For instance, if a service is exposing a POST endpoint at http://api.example.com/fizz (say, for upserting Fizz objects), then in Jersey I might make a service client that looks like this (pseudo-code):
// Groovy pseudo-code
class Fizz {
int type
boolean derps
String uid
}
class FizzClient {
WebResource webResource = initAt("https://api.example.com")
upsertFizz(Fizz fizz) {
webResource.path("fizz").post(fizz)
}
}
But Java-based SOAP clients seem, at first blush, to be fairly more complicated. If I understand the setup correctly, the general process is this:
Obtain an XML document called a WSDL from the service provider; this appears to be a language-agnostic description of all the available endpoints
Run a JDK tool called wsimport on the WSDL which actually generates Java source code, which implements JAX-WS APIs and actually represents my SOAP client
Import those generated source files into my project and use them
First off, if anything I have said about this process is incorrect, please begin by correcting me! Assuming I'm more or less correct, what I don't understand is: why is this necessary if it's all an HTTP conversation? Why couldn't I achieve SOAP-based conversations with Jersey and bypass all this source-generation boilerplate?
For instance, say the same endpoint exists, but is governed by SOAP:
class FizzClient {
WebResource webResource = initAt("https://api.example.com")
FizzSerializer serializer // I take Fizz instances and turn them into XML
FizzDeserializer deserializer // I take XML and turn them into Fizz instances
upsertFizz(Fizz fizz) {
String xmlFizz = serializer.serialize(fizz)
webResource.path("fizz").post(xmlFizz)
}
}
If I understand SOAP correctly, it's just a way of utilizing HTTP verbs and request/response entities to send app-specific messages around; it's an HTTP "conversation". So why couldn't I hijack a REST framework like Jersey to HTTP POST those messages, and in doing so, bypass this SOAP overhead?
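In other words, something like this (a rough sketch against the Jersey 1.x WebResource API used above; the envelope, the SOAPAction value and the content type are illustrative and would really be dictated by the WSDL binding):

String soapEnvelope =
    "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
  + "  <soapenv:Body>" + serializer.serialize(fizz) + "</soapenv:Body>"
  + "</soapenv:Envelope>";

String responseXml = webResource.path("fizz")
    .type("text/xml; charset=utf-8")
    .header("SOAPAction", "urn:upsertFizz")   // binding-specific action
    .post(String.class, soapEnvelope);

Fizz result = deserializer.deserialize(responseXml);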
This is going to attract opinion-based answers, but first, you should understand that
JAX-RS is much younger than JAX-WS (JAX-WS had a final draft in 2006; JAX-RS came out in 2008-9).
The RESTful web services "standard" is, for many purposes, quite amorphous - many businesses prefer the comfort of a contract in the form of a WSDL.
Not to mention that JAX-WS, in concert with WS-I, provides many other standards that govern security, message reliability and other enterprise goodies (under the generic "WS-*" banner) that businesses care about. There's a hodge-podge of libraries attempting to bring that kind of uniformity to the JAX-RS platform, but for now, JAX-WS/WS-I is the industry standard.

GWT RequestFactory client scenarios

My understanding is that the GWT RequestFactory (RF) API is for building data-oriented services whereby a client-side entity can communicate directly with its server-side DAO.
My understanding is that when you fire an RF method from the client side, a RequestFactoryServlet living on the server is what first receives the request. This servlet acts like a DispatcherServlet and routes the request on to the correct service, which is tied to a single entity (model) in the data store.
I'm used to writing servlets that might pass the request on to some business logic (like an EJB), and then compute some response to send back. This might be a JSP view, some complicated JSON (Jackson) object, or anything else.
In all the RF examples, I see no sign of such servlets, and I'm wondering if they even exist in GWT-RF land. If the RequestFactoryServlet automagically routes requests to the correct DAO and method, and the DAO method's result is what is returned in the response, then I can see a scenario where GWT RF doesn't even utilize traditional servlets. (1) Is this the case?
Regardless, there are times in my GWT application where I want to hit a specific url, such as http://www.example.com?foo=bar. (2) Can I use RF for this, and if so, how?
I think if I could see two specific examples, side-by-side of GWT RF in action, I'd be able to connect all the dots:
Scenario #1 : I have a Person entity with methods like isHappy(), isSad(), etc. that would require interaction with a server-side DAO; and
Scenario #2 : I want to fire an HTTP request to http://www.example.com?foo=bar and manually inspect the HTTP response
If it's possible to accomplish both with the RF API, that would be my first preference. If the latter scenario can't be accomplished with RF, then please explain why and what is the GWT-preferred alternative. Thanks in advance!
1.- RequestFactory works not only with entities but also with services, so you can define any service on the server side with methods that you call from the client. Of course, when you use RF services they can only deal with certain types (primitives, boxed primitives, sets, lists and RF proxies):
@Service(value = RfService.class, locator = RfServiceLocator.class)
public interface TwService extends RequestContext {
    Request<String> parse(String value);
}

public class RfService {
    public String parse(String value) {
        return value.replace("a", "b");
    }
}
2.- RF is not designed to receive message payloads other than those the RF servlet produces, and the most you can do on the client side with RF is call services hosted on a different site (when you deploy your server and client sides on different hosts).
You can use other mechanisms in the GWT world to get data from other URLs; take a look at gwtquery's Ajax and data-binding support, or this article.
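For scenario #2 specifically, the plain GWT way to fire a request and inspect the response by hand is RequestBuilder (a minimal sketch using the URL from the question; note that calls to another origin are still subject to the browser's same-origin/CORS rules):

// Uses com.google.gwt.http.client.* and com.google.gwt.core.client.GWT
RequestBuilder builder = new RequestBuilder(RequestBuilder.GET,
        URL.encode("http://www.example.com?foo=bar"));
try {
    builder.sendRequest(null, new RequestCallback() {
        @Override
        public void onResponseReceived(Request request, Response response) {
            // Manually inspect status code, headers and body here.
            GWT.log("status=" + response.getStatusCode() + " body=" + response.getText());
        }

        @Override
        public void onError(Request request, Throwable exception) {
            GWT.log("request failed", exception);
        }
    });
} catch (RequestException e) {
    GWT.log("could not send request", e);
}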
