Navigating through HATEOAS with Spring client - java

We have an API which uses Spring JPA and provides access to some data in our database via REST. This API is exposed in a HATEOAS fashion (we are using the Spring implementation).
We are now considering whether to stick with this approach or code our own REST interface manually. I have read a lot of articles about HATEOAS, but I am not sure what the big advantage of using it is. Sure, I understand that I can navigate through the API using links, but I still have to know about the existence of the links at each level, right?
To illustrate my problem, let's say that I have the following structure:
server.com/
- /store
- /users/
server.com/users
- /managers/
- /other/
server.com/users/managers
- list of entities with ids
I want to consume this API and get all 'manager' entities (located under server.com/users/managers)
What is the correct way to do so when using Spring HATEOAS links?
Option one:
RequestEntity<Void> request = RequestEntity.get(URI.create("http://server.com/users/managers")).accept(HAL_JSON).build();
final Resources<Manager> managers = restTemplate.exchange(request, new ResourcesType<Manager>() {
}).getBody();
Option two:
// global endpoint
RequestEntity<Void> request = RequestEntity.get(URI.create("http://server.com")).accept(HAL_JSON).build();
final Resource<Object> rootLinks = restTemplate.exchange(request, new ResourceType<Object>() {
}).getBody();
Links links = new Links(rootLinks.getLinks());
final Link userLink = links.getLink("users").expand();
// users endpoint
request = RequestEntity.get(URI.create(userLink.getHref())).accept(HAL_JSON).build();
final Resource<Object> managerLinks = restTemplate.exchange(request, new ResourceType<Object>() {
}).getBody();
links = new Links(managerLinks.getLinks());
final Link managerLink = links.getLink("managers").expand();
// managers endpoint
request = RequestEntity.get(URI.create(managerLink.getHref())).accept(HAL_JSON).build();
final Resources<Manager> resourceAccounts = restTemplate.exchange(request, new ResourcesType<Manager>() {
}).getBody();
The first option seems straightforward, and I can get all entities with a single request. However, I fail to see how HATEOAS is beneficial if I just use this approach. The Spring documentation states that using hardcoded links is not recommended.
The second approach seems more in the HATEOAS spirit, but it takes three requests just to get to a resource whose location I already know. That doesn't seem right either.
I know it's probably a dumb question, but can somebody explain to me what the great idea behind HATEOAS is that I am clearly missing?

With HATEOAS, the server can guide a client through the links it provides. The contract between server and client is the link's relation type and the media type, not the URL. By including or omitting links in the same resource representation, the server can tell the client whether a resource is currently in a state where editing is allowed, whether the user is authorized for some operation on the resource, and so on. The server can also change URLs without breaking the contract.
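If you want to avoid hardcoding URLs without walking the links by hand, Spring HATEOAS also ships a Traverson client that follows link relations for you (it still issues the intermediate requests under the hood). A minimal sketch, assuming the "users" and "managers" rel names from the question and the same pre-1.0 Resource/Resources API used above:
// Classes come from org.springframework.hateoas (client.Traverson, MediaTypes, Resources)
// and org.springframework.hateoas.mvc.TypeReferences.ResourcesType
Traverson traverson = new Traverson(URI.create("http://server.com"), MediaTypes.HAL_JSON);
Resources<Manager> managers = traverson
        .follow("users", "managers")        // rel names assumed from the question
        .toObject(new ResourcesType<Manager>() {});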


Where is the Spring Actuator controller endpoint, and can I call it programmatically with a JVM call?

I want to find the actual Java class that serves the Spring Actuator endpoint (/actuator).
It's similar to this question in a way, but that person wanted to call it via a network HTTP call. Ideally, I can call it within the JVM to save on the cost of setting up an HTTP connection.
The reason for this is that we have two metrics frameworks in our system. We have a legacy metrics framework built on OpenCensus, and we migrated to Spring Actuator (Prometheus metrics based on Micrometer). I think the Spring one is better, but I didn't realize how much infrastructure my company had built around the old one. For example, we leverage internal libraries that use OpenCensus, and the infra team depends on OpenCensus-based metrics from our app. So the idea is to try to merge and report both sets of metrics.
I want to create my own metrics endpoint that pulls in data from Opencensus's endpoint and Actuator's endpoint. I could make an HTTP call to each, but I'd rather call them within the JVM to save on resources and reduce latency.
Or perhaps I'm thinking about it wrong. Should I simply be using MeterRegistry.forEachMeter() in my endpoint?
In any case, I thought if I found the Spring Actuator endpoint, I can see an example of how they're doing it and mimic the implementation even if I don't call it directly.
Bonus: I'll need to track down the OpenCensus handler that serves its endpoint too and will probably make another post for that, but if you know the answer to that as well, please share!
I figured it out and am posting this for anyone else interested.
The key finding: the MeterRegistry that is @Autowired is actually a PrometheusMeterRegistry if you enable the Prometheus metrics.
Once you cast it to a PrometheusMeterRegistry, you can call its .scrape() method to return the exact same metrics printout you would get when you hit the HTTP endpoint.
I also need to get the same info from OpenCensus and I found a way to do that too.
Here's the snippet of code for getting metrics from both frameworks:
// OpenCensus metrics, exported through the default Prometheus collector registry
Enumeration<MetricFamilySamples> openCensusSamples = CollectorRegistry.defaultRegistry.filteredMetricFamilySamples(ImmutableSet.of());
StringWriter writer = new StringWriter();
TextFormat.write004(writer, openCensusSamples);
String openCensusMetrics = writer.toString();

// Micrometer metrics: scrape() returns the same text as the /actuator/prometheus endpoint
PrometheusMeterRegistry registry = (PrometheusMeterRegistry) meterRegistry;
String micrometerMetrics = registry.scrape();

return openCensusMetrics.concat(micrometerMetrics);
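If you want to expose the combined output yourself rather than calling the two HTTP endpoints, here is a minimal sketch of wrapping that snippet in a controller (class name, request path and constructor wiring are illustrative, not from the original setup):
import java.io.IOException;
import java.io.StringWriter;
import java.util.Enumeration;

import com.google.common.collect.ImmutableSet;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.prometheus.PrometheusMeterRegistry;
import io.prometheus.client.Collector.MetricFamilySamples;
import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.exporter.common.TextFormat;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CombinedMetricsController {

    private final MeterRegistry meterRegistry;

    public CombinedMetricsController(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    // Path is illustrative; CONTENT_TYPE_004 is the Prometheus text exposition format
    @GetMapping(value = "/combined-metrics", produces = TextFormat.CONTENT_TYPE_004)
    public String combinedMetrics() throws IOException {
        // OpenCensus metrics from the default Prometheus collector registry
        Enumeration<MetricFamilySamples> openCensusSamples =
                CollectorRegistry.defaultRegistry.filteredMetricFamilySamples(ImmutableSet.of());
        StringWriter writer = new StringWriter();
        TextFormat.write004(writer, openCensusSamples);

        // Micrometer metrics via the PrometheusMeterRegistry scrape
        PrometheusMeterRegistry registry = (PrometheusMeterRegistry) meterRegistry;
        return writer.toString().concat(registry.scrape());
    }
}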
I found out another interesting way of doing this.
This relates to the other answer I gave, which has one issue: it contains duplicate results. When I looked into it, I realized that both OpenCensus and Micrometer were reporting the same metrics.
It turns out that the PrometheusScrapeEndpoint implementation uses the same CollectorRegistry that OpenCensus does, so both sets of metrics were being added to the same registry.
You just need to make sure to provide these beans:
@PostConstruct
public void openCensusStats() {
    PrometheusStatsCollector.createAndRegister();
}

@Bean
public CollectorRegistry collectorRegistry() {
    return CollectorRegistry.defaultRegistry;
}

Keycloak - how to handle multiple work contexts

I have an application where a single user can work in the contexts of multiple companies. We call such a connection (user<->company) a permit. Every one of these permits can have a different set of permissions/roles. We want the user to log in just once and then be able to switch permits within the application without entering a password again.
Until now we had only one application and kept this whole permission model in our own DB. Unfortunately, we now have to support a second application which should inherit those permits. I was wondering whether it is possible to move that model to Keycloak so we don't have to replicate it to every single DB and keep it in sync manually.
I have searched the Keycloak documentation regarding this topic but have found no information at all, which seems quite odd, because I don't think we are the first ones working with a multi-context application.
So now I'm asking: is it possible to configure our model in Keycloak, and if so, how? Alternatively, are there other options? I guess I could provide that model as a claim with a JSON structure, but that doesn't feel right to me. I was also thinking about a custom IdP which could provide such claims based on the DB, so there would be no spelling errors and less repetition, but I feel there should be a better way.
You could try to write your own Keycloak provider (SPI). There is a built-in mechanism that allows you to expose a REST endpoint on Keycloak: https://github.com/keycloak/keycloak/tree/master/examples/providers/domain-extension
That REST endpoint can be called in an authorized context only, for example by passing an access token (Authorization header with a Bearer value). At the provider level (through implementations of org.keycloak.services.resource.RealmResourceProviderFactory and org.keycloak.services.resource.RealmResourceProvider) you have access to the user's Keycloak session and the UserModel object, as in the following code:
AuthenticationManager.AuthResult authResult = new AppAuthManager().authenticateBearerToken(keycloakSession, keycloakSession.getContext().getRealm());
UserModel userModel = authResult.getUser();
The UserModel class has methods for getting and setting attributes, so information that indicates the current permit/company ID can be stored there. You can use the REST methods exposed on Keycloak to modify the model within the 'session' (represented by the access token).
The GitHub example also shows how to use another Keycloak provider (e.g. the built-in JPA provider) from your custom provider, so using that approach you could connect to the database that holds your permit/company information. Of course, the datasource representing your database should also be registered as a Keycloak datasource.
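As a rough sketch only (the class, path and attribute name are made up here, and the AppAuthManager call matches the older Keycloak API shown above), such a provider could read the bearer token and store the active permit as a user attribute:
import javax.ws.rs.NotAuthorizedException;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Response;

import org.keycloak.models.KeycloakSession;
import org.keycloak.models.UserModel;
import org.keycloak.services.managers.AppAuthManager;
import org.keycloak.services.managers.AuthenticationManager;
import org.keycloak.services.resource.RealmResourceProvider;

public class PermitResourceProvider implements RealmResourceProvider {

    private final KeycloakSession session;

    public PermitResourceProvider(KeycloakSession session) {
        this.session = session;
    }

    @Override
    public Object getResource() {
        return this; // JAX-RS resource exposed under /realms/{realm}/<provider-id>
    }

    @PUT
    @Path("active-permit/{permitId}")
    public Response setActivePermit(@PathParam("permitId") String permitId) {
        // Resolve the caller from the Bearer token, as in the snippet above
        AuthenticationManager.AuthResult authResult =
                new AppAuthManager().authenticateBearerToken(session, session.getContext().getRealm());
        if (authResult == null) {
            throw new NotAuthorizedException("Bearer token required");
        }
        UserModel user = authResult.getUser();
        // "activePermit" is an arbitrary attribute name used for illustration
        user.setSingleAttribute("activePermit", permitId);
        return Response.noContent().build();
    }

    @Override
    public void close() {
        // nothing to clean up
    }
}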

Add authentication to GAE Java endpoints

I am generating some endpoints and they work correctly; however, I would like to keep one session per client so I do not have to send the email and password with every request, but I am not sure how to do it.
This is an example of one of my endpoints:
@Api(name = "test")
public class MyApi {
    @ApiMethod(name = "printHi", httpMethod = "POST")
    public Message imprimirHola(Input input) {
        Message message = new Message();
        if (datosCorrectos(input.getMail(), input.getPassword()))
            message.setMessage("Hi");
        else
            message.setMessage("authentication failed");
        return message;
    }
}
After doing some research on the topic you are having issues with, I found the following information that may be useful for what you are trying to achieve.
Google Cloud Platform offers the Cloud Endpoints service, which has some available frameworks for user authentication. You can find detailed information and procedures about that in the documentation but, in short, you can use Firebase Auth, Auth0 or Google Accounts to authenticate a user against your endpoints (this link will help you decide which option suits you better).
However, in order to work with one of those options, your API will need to be managed by Cloud Endpoints, so you will have to follow this walkthrough to Add API Management to your API using OpenAPI.
Finally, here you have a working example of how to do that using Java.
I know it is a lot of information, but I think you will be able to solve the authentication issue with your Java API just by reading the more detailed information in this last documentation page and moving step by step through the "Getting Started" dropdown menu on the left of that page.
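As a hedged illustration of the Google Accounts option with the Cloud Endpoints Frameworks, the endpoint method can accept an injected User and reject unauthenticated calls; the client ID below is a placeholder, and the exact @Api attributes depend on the authentication option you choose from the documentation above:
import com.google.api.server.spi.auth.common.User;
import com.google.api.server.spi.config.Api;
import com.google.api.server.spi.config.ApiMethod;
import com.google.api.server.spi.response.UnauthorizedException;

@Api(name = "test",
     clientIds = {"YOUR_WEB_CLIENT_ID.apps.googleusercontent.com"}) // placeholder client ID
public class MyApi {

    // Input and Message are the same classes as in the question
    @ApiMethod(name = "printHi", httpMethod = "POST")
    public Message imprimirHola(Input input, User user) throws UnauthorizedException {
        // The framework injects the authenticated user; null means no valid credentials
        if (user == null) {
            throw new UnauthorizedException("Invalid credentials");
        }
        Message message = new Message();
        message.setMessage("Hi " + user.getEmail());
        return message;
    }
}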

pac4j saml generate sp metaData

I am comparing Spring SAML and pac4j SAML. Generally speaking, I think pac4j is easier to implement than Spring SAML, but there is one thing I cannot figure out:
See this config code:
@Configuration
public class Pac4jConfig {

    @Bean
    public Config config() {
        final SAML2ClientConfiguration cfg = new SAML2ClientConfiguration(
                "resource:samlKeystoreNgcsc.jks",
                "juniper",
                "juniper",
                "resource:metadata-okta.xml"
        );
        cfg.setMaximumAuthenticationLifetime(3600);
        cfg.setServiceProviderEntityId("http://localhost:8080/callback?client_name=SAML2Client");
        cfg.setServiceProviderMetadataPath("sp-metadata.xml");
        final SAML2Client saml2Client = new SAML2Client(cfg);
        final Clients clients = new Clients("http://localhost:8080/callback", saml2Client);
        final Config config = new Config(clients);
        //config.addAuthorizer("admin", new RequireAnyRoleAuthorizer("ROLE_ADMIN"));
        //config.addAuthorizer("custom", new CustomAuthorizer());
        return config;
    }
}
From this sample code, we already have the IdP metadata; that is fine, we just ask the IdP to provide the metadata and we can use it directly.
But where is sp-metadata.xml? We need to generate it and provide it to the IdP for integration purposes.
If I use Spring SAML, it provides a UI to generate this metadata; we just need to download it and send it over to the IdP. But for pac4j SAML, I do not see this utility at all. So can anyone tell me what the best way to generate the SP metadata would be?
Thanks
saml2Client.init() does all the work of generating the SP metadata; just make sure that you have sufficient permissions to create the file at the specified path.
// Point the configuration at a writable location; init() then writes the SP metadata file
saml2Client.getConfiguration().setServiceProviderMetadataResource(new FileSystemResource(new File("C:\\sp-metadata.xml").getAbsolutePath()));
saml2Client.init();
// The generated metadata can also be read back as a string
String spMetadata = saml2Client.getServiceProviderMetadataResolver().getMetadata();
I somehow managed to generate it by using this setting in the SecurityModule configuration. This might not be the best way; I am still figuring that out.
cfg.setServiceProviderMetadataPath(new File("yourPath", "fileName.xml").getAbsolutePath())
Note that the SP metadata is ONLY generated when a SAML request happens.
If you come across this issue when using pac4j and TestShib, make sure your Identity Provider metadata is up-to-date, i.e., update your local testshib-providers.xml with the one from the TestShib website.

Access GWT Form Values from Spring Controller

I have started learning GWT and I would like to know if it is possible to create a form in GWT for a process such as user registration and then handle the registration logic (extra validation, adding the data to the database, etc.) in a Spring controller. I have so far been unable to find resources on the web on how to do this, so I am guessing that what I am after might not be possible. Can this only be done using classes which extend RemoteServiceServlet, as shown in this video tutorial?
I have tried to access the data from my Spring controller using request.getParameter() calls. What I noticed was that when I used the POST method, I was unable to access the form parameters, but when I use GET, I can. This is some of the code I am using:
GWT:
HorizontalPanel hrzPnlname = new HorizontalPanel();
hrzPnlname.add(new Label("User Name: "));
final TextBox txtUserName = new TextBox();
txtUserName.setName("userName");
hrzPnlname.add(txtUserName);...
Spring:
@RequestMapping(method = RequestMethod.POST)
public ModelAndView helloWorldFromForm(HttpServletRequest request)
{
    Map<String, Object> model = new HashMap<String, Object>();
    System.out.println("------>\n\n\n\n\n\n" + request.getParameter("userName") + "\n\n\n\n\n<-------");...
I am using GWT 2.4. Any information on this would be highly appreciated.
Thanks!
You can share data between a GWT client and Spring controllers in several ways:
REST/JSON requests
See how:
Calling REST from GWT with a little bit of JQuery
See also the original GWT documentation (section "Communicating with the server/JSON").
In my opinion this is the best way to communicate with a server, because JSON is a pure, standard protocol and it can be consumed by third-party plugins (jQuery, for example) and other web services.
Async GWT RPC (remote procedure calls), provided by Google
I think using GWT RPC is not the best idea; see why:
4 More GWT Antipatterns
Submitting forms directly to the Spring controller as a POST/GET request (as you are trying to do).
See com.google.gwt.user.client.ui.FormPanel
I recommend using form submission only for file uploading, because the browser will only upload files via a form submit. For all other operations a form is not required in GWT; any request can be implemented with Ajax.
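For the form-submission route, here is a minimal sketch (the action URL and widget names are illustrative) of posting a GWT FormPanel to a Spring controller; note that the method and encoding are set explicitly, which is also relevant to the POST-parameter problem described in the question:
// Widgets come from com.google.gwt.user.client.ui; GWT and Window come from
// com.google.gwt.core.client and com.google.gwt.user.client respectively
final FormPanel form = new FormPanel();
form.setAction(GWT.getHostPageBaseURL() + "register"); // illustrative Spring controller mapping
form.setMethod(FormPanel.METHOD_POST);
form.setEncoding(FormPanel.ENCODING_URLENCODED);

VerticalPanel panel = new VerticalPanel();
form.setWidget(panel);

final TextBox txtUserName = new TextBox();
txtUserName.setName("userName"); // the widget name becomes the request parameter key
panel.add(txtUserName);

panel.add(new Button("Register", new ClickHandler() {
    @Override
    public void onClick(ClickEvent event) {
        form.submit();
    }
}));

form.addSubmitCompleteHandler(new FormPanel.SubmitCompleteHandler() {
    @Override
    public void onSubmitComplete(FormPanel.SubmitCompleteEvent event) {
        // event.getResults() contains whatever the controller wrote to the response
        Window.alert(event.getResults());
    }
});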
Here are links to some resources - http://technophiliac.wordpress.com/2008/08/24/giving-gwt-a-spring-in-its-step/ and Experiences with integrating spring 3 mvc with GWT? .
