I've recently been working with microservices, developed as Spring Boot applications (v2.2), and at my company we're using Keycloak as the authorization server.
We chose it because we need complex policies, roles and groups, and we also need the User Managed Authorization (UMA) to share resources between users.
We configured Keycloak with a single realm and many clients (one client per microservice).
Now, I understand that I need to explicitly define Resources within Keycloak and this is fine, but the question is: do I really need to duplicate all of them in my microservice's property file?
All the documentation, examples, and tutorials end up with the same thing, something like:
keycloak.policy-enforcer-config.enforcement-mode=PERMISSIVE
keycloak.policy-enforcer-config.paths[0].name=Car Resource
keycloak.policy-enforcer-config.paths[0].path=/cars/create
keycloak.policy-enforcer-config.paths[0].scopes[0]=car:create
keycloak.policy-enforcer-config.paths[1].path=/cars/{id}
keycloak.policy-enforcer-config.paths[1].methods[0].method=GET
keycloak.policy-enforcer-config.paths[1].methods[0].scopes[0]=car:view-detail
keycloak.policy-enforcer-config.paths[1].methods[1].method=DELETE
keycloak.policy-enforcer-config.paths[1].methods[1].scopes[0]=car:delete
(This second example fits our case better because it also uses different authorization scopes per HTTP method.)
In real life each microservice we're developing has dozens of endpoints, and defining them one by one seems to me a waste of time and a weakness in the code's robustness: if we change an endpoint, we need to reconfigure it in both Keycloak and the application properties.
Is there a way to use some kind of annotation at Controller level? Something like the following pseudo-code:
@RestController
@RequestMapping("/foo")
public class MyController {

    @GetMapping
    @KeycloakPolicy(scope = "foo:view")
    public ResponseEntity<String> foo() {
        ...
    }

    @PostMapping
    @KeycloakPolicy(scope = "bar:create")
    public ResponseEntity<String> bar() {
        ...
    }
}
In the end, I developed my own project that provides auto-configuration capabilities to a spring-boot project that needs to work as a resource server.
The project is released under the MIT license and it's available on my GitHub:
keycloak-resource-autoconf
I want to find the actual Java class that serves the Spring Actuator endpoint (/actuator).
It's similar to this question in a way, but that person wanted to call it via a network HTTP call. Ideally, I'd call it within the JVM to save the cost of setting up an HTTP connection.
The reason for this is that we have 2 metrics frameworks in our system. We have a legacy metrics framework built on OpenCensus, and we migrated to Spring Actuator (Prometheus metrics based on Micrometer). I think the Spring one is better, but I didn't realize how much my company had built infrastructure around the old one. For example, we leverage internal libraries that use OpenCensus, and the infra team depends on OpenCensus-based metrics from our app. So the idea is to try to merge and report both sets of metrics.
I want to create my own metrics endpoint that pulls in data from Opencensus's endpoint and Actuator's endpoint. I could make an HTTP call to each, but I'd rather call them within the JVM to save on resources and reduce latency.
Or perhaps I'm thinking about it wrong. Should I simply be using MeterRegistry.forEachMeter() in my endpoint?
In any case, I thought that if I found the Spring Actuator endpoint class, I could see an example of how they're doing it and mimic the implementation, even if I don't call it directly.
Bonus: I'll need to track down the OpenCensus handler that serves its endpoint too, and will probably make another post for that, but if you know the answer to that as well, please share!
I figured it out and posting this for anyone else interested.
The key finding: the MeterRegistry that is @Autowired is actually a PrometheusMeterRegistry if you enable the Prometheus metrics.
Once you cast it to a PrometheusMeterRegistry, you can call its .scrape() method to return the exact same metrics printout you would get when you hit the HTTP endpoint.
I also need to get the same info from OpenCensus and I found a way to do that too.
Here's the snippet of code for getting metrics from both frameworks:
// OpenCensus metrics: render the default CollectorRegistry in Prometheus text format
Enumeration<MetricFamilySamples> openCensusSamples = CollectorRegistry.defaultRegistry.filteredMetricFamilySamples(ImmutableSet.of());
StringWriter writer = new StringWriter();
TextFormat.write004(writer, openCensusSamples);
String openCensusMetrics = writer.toString();

// Micrometer metrics: scrape() returns the same text the actuator endpoint serves
PrometheusMeterRegistry registry = (PrometheusMeterRegistry) meterRegistry;
String micrometerMetrics = registry.scrape();

return openCensusMetrics.concat(micrometerMetrics);
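For context, here's roughly how that snippet can be wrapped in an endpoint (a sketch only; the controller name and path are my own inventions, and I use Collections.emptySet() in place of Guava's ImmutableSet.of() to keep it dependency-free — both act as the "include everything" filter):

import java.io.IOException;
import java.io.StringWriter;
import java.util.Collections;
import java.util.Enumeration;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.prometheus.PrometheusMeterRegistry;
import io.prometheus.client.Collector.MetricFamilySamples;
import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.exporter.common.TextFormat;

@RestController
public class MergedMetricsController {

    private final MeterRegistry meterRegistry;

    public MergedMetricsController(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    // Serves OpenCensus and Micrometer metrics concatenated in Prometheus text format.
    @GetMapping(value = "/merged-metrics", produces = TextFormat.CONTENT_TYPE_004)
    public String mergedMetrics() throws IOException {
        // OpenCensus side: drain the default CollectorRegistry.
        Enumeration<MetricFamilySamples> openCensusSamples =
                CollectorRegistry.defaultRegistry.filteredMetricFamilySamples(Collections.emptySet());
        StringWriter writer = new StringWriter();
        TextFormat.write004(writer, openCensusSamples);

        // Micrometer side: scrape() returns the same text the actuator endpoint serves.
        String micrometerMetrics = ((PrometheusMeterRegistry) meterRegistry).scrape();

        return writer.toString().concat(micrometerMetrics);
    }
}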
I found out another interesting way of doing this.
The other answer I gave works, but it has one issue: it contains duplicate results. When I looked into it, I realized that both OpenCensus and Micrometer were reporting the same metrics.
Turns out that the PrometheusScrapeEndpoint implementation uses the same CollectorRegistry that OpenCensus does, so both sets of metrics were being added to the same registry.
You just need to make sure to provide these beans:
@PostConstruct
public void openCensusStats() {
    PrometheusStatsCollector.createAndRegister();
}

@Bean
public CollectorRegistry collectorRegistry() {
    return CollectorRegistry.defaultRegistry;
}
I'm still quite new to microservices and have a few basic architectural questions that I can't solve on my own right now.
I'm using the Quarkus framework with standard extensions like quarkus-resteasy and quarkus-rest-client for the implementation.
The scenario:
I have an example of a "Persistence" service, in a dedicated Maven project, that I want to populate with data from the outside via a REST call.
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
public class Persistence{
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto postDto) {
Post post = toPostMapper.toResource(postDto);
entityManager.persist(post);
return Response.ok(postDto).status(201).build();
}
}
At the same time I would like to have a DataGenerator microservice which generates the corresponding data and passes it to the Persistence service.
My problem: API sharing
Both services were created as Maven projects.
According to the tutorials I found, the correct way would be to declare an interface (here called PersistenceApi) in the DataGenerator project, like this:
@Path("/api/persistence")
@Produces(MediaType.APPLICATION_JSON)
@RegisterRestClient
public interface PersistenceApi {

    @POST
    @Transactional
    Response create(PostDto post);
}
This interface is then injected into the DataGenerator service via @Inject, which leads to the following exemplary service.
@RequestScoped
@Path("/api/datagenerator")
@Produces("application/json")
@Consumes("application/json")
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;

    @POST
    public void getPostExamplePostToPersistence() {
        PostDto post = new PostDto();
        post.setTitle("Find me in db in persistence-service");
        persistenceApi.create(post);
    }
}
I have the Persistence service running locally on port 8181 and have added the following entries in the application.properties of the DataGenerator project so that the service can be found:
furnace.collection.item.service.PersistenceApi/mp-rest/url=http://localhost:8181
furnace.collection.item.service.PersistenceApi/mp-rest/scope=javax.inject.Singleton
I find it "wrong" to declare the interface in my DataGenerator project, because this way I won't notice when the API provided by the Persistence service changes. Accordingly, one could come up with the idea of placing the interface in the Persistence service instead, where it is implemented by my concrete Persistence implementation, leading to the following code:
#Path("/api/persistence")
#Products(MediaType.APPLICATION_JSON)
#RegisterRestClient
public class PersistenceApiImpl implements PersistenceApi {
#Inject
EntityManager entityManager;
#POST
#Transactional
public Response create(PostDto fruit) {
Post post = toPostMapper.toResource(fruit);
entityManager.persist(post);
return Response.ok(fruit).status(201).build();
}
}
In order to use it in my DataGenerator project, I would have to include the Persistence project as a dependency, which sounds like a "monolith with extra steps" to me and therefore feels wrong in terms of separation of concerns.
I have tried the following approach:
I created another Maven project called PersistenceApi which contains only the PersistenceApi interface. This PersistenceApi project was then included as a dependency in both the Persistence and DataGenerator projects. In the Persistence project I implement the interface from the example above, and in the DataGenerator project I try to address the corresponding interface via @Inject (see the layout sketch below).
Unfortunately this does not work: when I build the service, injection of the PersistenceApi interface into the DataGenerator service fails with an UnsatisfiedResolutionException.
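For clarity, the layout I tried looks roughly like this (the module names are mine):

parent/
  persistence-api/   -> only the PersistenceApi interface and the DTOs
  persistence/       -> implements PersistenceApi, depends on persistence-api
  datagenerator/     -> injects the @RegisterRestClient interface, depends on persistence-api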
Now my questions:
I don't see what I'm missing here. Could you help me?
Is this kind of API sharing with dedicated API projects a viable way, or is the "monolith with extra steps" approach really the way to go?
Thank you in advance.
That's a common problem with microservices. As in the book "Microservices: Grundlagen flexibler Softwarearchitekturen" by Eberhard Wolff (I saw that you are German too), I follow the idea that microservices should have the same coupling as the teams developing them and as the organization you're developing them for (have a look at Conway's law). Therefore, services of mostly independent teams should be developed independently, and the API changes of one service should not affect another at the time of the update.
If you develop both services in your own team, then I think you can couple them the way you are doing it, because you don't have to coordinate with other teams and there will be no huge overhead. Note that you will be forced to release both services together. If that is always OK for you, then save your time and do it your way; if not, have a look at API versioning:
I use API versioning so the old API is still reachable under "v1/" and the new one under "v2/". This way the team behind the other microservice has enough time to update their service.
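For illustration, with JAX-RS that could look roughly like this (the paths and interface names are made up; PostDto is the DTO from your example):

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

// Old contract: stays reachable while consumers migrate.
@Path("/v1/api/persistence")
public interface PersistenceApiV1 {
    @POST
    Response create(PostDto post);
}

// New contract: can evolve independently of v1 consumers.
@Path("/v2/api/persistence")
public interface PersistenceApiV2 {
    @POST
    Response create(PostDto post);
}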
Have a look at Domain-Driven Design for the different ways of integrating bounded contexts (= services) and their coupling consequences. Without API versioning you are forced into a partnership and you need to release together. Maybe you prefer Customer-Supplier or even Conformist.
To test compatibility between both services, have a look at consumer-driven contracts and Pact. You can also generate OpenAPI files and track their changes, but that will only help to notify people about changes.
I know that there are already similar questions, here and here, regarding this problem, but every solution proposed failed to help me. There is also mention of this library in most of those answers, but (with all due respect) I would like to avoid depending on an external library just to be able to test a simple controller.
So, I have a very simple API that is accessed using a bearer token generated by Keycloak, and I would like to test the controller. Something along these lines:
@Test
@DisplayName("Should be ok")
@WithMockUser
void whenCalled_shouldBeOk() throws Exception {
    SecurityContext context = SecurityContextHolder.getContext();
    Authentication authentication = context.getAuthentication();
    mockMvc.perform(
            post("/api/url/something")
                    .content("{}")
                    .contentType(APPLICATION_JSON)
                    .with(authentication(authentication))
    ).andExpect(status().isOk());
}
The problem is that I always get a NullPointerException thrown by the KeycloakDeploymentBuilder, because it's missing the adapter config. In our SecurityConfig we extend KeycloakWebSecurityConfigurerAdapter and do all the required configuration for the app to work, but I am failing to mock/bypass this process in the test. Normally I find my way around these authentication problems in tests (when Keycloak isn't used) with the @WithMockUser annotation, but not this time.
Isn't there a way to mock the adapter or the filter process in order to bypass this issue?
I have tried everything that was answered in the other questions (except the library), with no luck. If you have any clue that could help, or at least point me in the right direction (since this may be due to a lack of knowledge of Spring Security on my part), it would be very much appreciated.
2023 update
The deprecated Keycloak adapters for Spring (where KeycloakAuthenticationToken is defined) are not compatible with Spring Boot 3. Alternatives are in the accepted answer to "Use Keycloak Spring Adapter with Spring Boot 3".
Original answer
As I already wrote in my answer to the first question you linked, @WithMockUser populates the security context with a UsernamePasswordAuthenticationToken, while your code/conf probably expects a KeycloakAuthenticationToken.
If you read that same answer carefully, you'll find an alternative to using my lib for this: manually populate the security context with a KeycloakAuthenticationToken instance or mock in each test.
Minimal sample with Mockito I added to my repo:
@Test
public void test() {
    final var principal = mock(Principal.class);
    when(principal.getName()).thenReturn("user");

    final var account = mock(OidcKeycloakAccount.class);
    when(account.getRoles()).thenReturn(Set.of("offline_access", "uma_authorization"));
    when(account.getPrincipal()).thenReturn(principal);

    final var authentication = mock(KeycloakAuthenticationToken.class);
    when(authentication.getAccount()).thenReturn(account);

    // post(...).with(authentication(authentication)) limits you to
    // testing secured @Controller with MockMvc.
    // I prefer to set the security context directly instead:
    SecurityContextHolder.getContext().setAuthentication(authentication);

    // TODO: invoke MockMvc to test @Controller, or test any other type of @Component as usual
}
Maybe, after you measure how much this clutters your tests (there are very few claims set here), you'll reconsider using my lib (or copying from it, as it's open source).
With my annotation, the above sample becomes:
@Test
@WithMockKeycloakAuth
public void test() throws Exception {
    // TODO: invoke MockMvc to test @Controller, or test any other type of @Component as usual
}
Regarding Spring test config with Keycloak involved, you could dig a bit into the tests of the spring-addons-keycloak module. You'll find a complete app using KeycloakAuthenticationToken, with unit tests (and a working test conf).
Last (and maybe off-topic), you could read the repo's main README and consider using a more generic OIDC implementation than Keycloak's. I provide one, along with a test annotation, and wrote tutorials on how to extend it to your app's specific needs.
Here a solution is described for handling redirects to a custom URL based on a condition, via use of an AccessStrategy.
This, however, is part of the unauthorized-login flow and therefore results in a still not-logged-in user arriving at the URL we redirect to (via getUnauthorizedUrl).
If we want to redirect the user based on a condition, say via injecting an action into the webflow, how can we change the return URL to a custom one?
WebUtils.getService(requestContext) includes getters for the source/original URL, but there is no obvious way to set/manipulate that value through an action bean.
P.S. Currently using CAS version 5.3.x.
Responses for normal web applications from CAS are built using WebApplicationServiceResponseBuilder.
If you examine this block, you will find that the final response is built using the WebApplicationServiceResponseBuilder bean. It is only created conditionally, if an existing bean is not already found in the context by the same name. So to provide your own, you just need to register a bean with the same name in your own @Configuration class:
@Bean
public ResponseBuilder<WebApplicationService> webApplicationServiceResponseBuilder() {
    return new MyOwnWebApplicationServiceResponseBuilder(...);
}
...and then proceed to design your own MyOwnWebApplicationServiceResponseBuilder, perhaps by extending WebApplicationServiceResponseBuilder and overriding what you need, where necessary, to build the final redirect logic conditionally.
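A rough sketch of what that could look like (the build(...) signature below matches CAS 5.3 as far as I recall; verify it against your version, and note that the condition, the custom URL, and the helper method are placeholders of mine):

import java.util.Collections;

import org.apereo.cas.authentication.Authentication;
import org.apereo.cas.authentication.principal.DefaultResponse;
import org.apereo.cas.authentication.principal.Response;
import org.apereo.cas.authentication.principal.WebApplicationService;
import org.apereo.cas.authentication.principal.WebApplicationServiceResponseBuilder;
import org.apereo.cas.services.ServicesManager;

public class MyOwnWebApplicationServiceResponseBuilder extends WebApplicationServiceResponseBuilder {

    public MyOwnWebApplicationServiceResponseBuilder(final ServicesManager servicesManager) {
        super(servicesManager);
    }

    @Override
    public Response build(final WebApplicationService service, final String serviceTicketId,
                          final Authentication authentication) {
        // Placeholder condition: decide here whether the custom redirect applies.
        if (shouldUseCustomUrl(service, authentication)) {
            return DefaultResponse.getRedirectResponse("https://custom.example.org/landing",
                    Collections.singletonMap("ticket", serviceTicketId));
        }
        // Otherwise fall back to the default redirect/POST logic.
        return super.build(service, serviceTicketId, authentication);
    }

    private boolean shouldUseCustomUrl(final WebApplicationService service,
                                       final Authentication authentication) {
        return false; // your condition here
    }
}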
To learn about how @Configuration classes work in general, you can:
Review this post
or this post
or consult the documentation for Spring and/or Spring Boot.
I am developing a web application using the Spring framework and Google App Engine. I am wondering if there is a design pattern or framework with which I can develop the features of my application as pluggable modules. For example, I have identified 4 features of the application:
OAuth Login
User Profile Management
User Group creation
User File management
Now what I need is to develop all these features as independent modules, so that I can detach any of them dynamically and they are as loosely coupled as possible. They can have their own database implementations, their own sets of technologies, etc. Is there a design principle for implementing modules in such a way?
You can take a look at MQ systems (such as RabbitMQ or ActiveMQ).
An MQ system works as an intermediate layer that gives you loose coupling.
Communication between modules is implemented by posting messages to a queue and listening for them, as in the sketch below.
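A minimal sketch with Spring AMQP (the queue name and class names are made up):

import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Component;

// Publisher side: the profile module emits an event without knowing who consumes it.
@Component
class ProfileEventsPublisher {

    private final RabbitTemplate rabbitTemplate;

    ProfileEventsPublisher(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    void profileUpdated(String userId) {
        // "user-profile-events" is an illustrative queue name
        rabbitTemplate.convertAndSend("user-profile-events", userId);
    }
}

// Consumer side: the group module reacts to the message; neither module references the other.
@Component
class GroupModuleListener {

    @RabbitListener(queues = "user-profile-events")
    void onProfileUpdated(String userId) {
        // e.g. refresh group membership for this user
    }
}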
Also, OSGi may help you. It gives you the possibility to build your application as a set of pluggable modules which can be loaded dynamically.
As per my experience, I suggest using the MVC pattern. Use servlet filters for:
1. OAuth Login
Create services/POJOs that implement the features and inject them into each other according to your requirements for:
2. User Profile Management
3. User Group creation
4. User File management
If you know Spring AOP, use it so that you can achieve more dynamic integration between the implementations of points 2, 3, and 4; a sketch follows below.
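For example, a minimal Spring AOP aspect that wraps all module services (the pointcut package and class names are illustrative; adjust them to your layout):

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

// Logs entry and exit of every method on the module services,
// without the modules knowing anything about each other.
@Aspect
@Component
public class ModuleIntegrationAspect {

    @Around("execution(* com.example.modules..*Service.*(..))")
    public Object around(ProceedingJoinPoint joinPoint) throws Throwable {
        System.out.println("Entering " + joinPoint.getSignature());
        try {
            return joinPoint.proceed();
        } finally {
            System.out.println("Leaving " + joinPoint.getSignature());
        }
    }
}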
You should split each feature into two components: API and implementation. The first one contains interfaces, the second their implementations. You pass the interface to the web-app controller and inject the implementation via Spring or any other dependency injection framework. For example:
web-app, a UserController which handles requests from clients and delegates to your components:
@RestController
public class UserController {

    private final FileManager fileManager;

    @Autowired
    public UserController(FileManager fileManager) {
        this.fileManager = fileManager;
    }

    @GetMapping("/user/{userId}/file/{fileId}")
    public File getUserFile(@PathVariable long userId, @PathVariable long fileId) {
        return fileManager.getUserFile(userId, fileId);
    }
}
file-mgt-api, where you define interfaces to decouple the web-app from the implementation:
public interface FileManager {
File getUserFile(long userId, long fileId);
}
file-mgt-impl, with all the details of how to get the requested file:
@Component
public class FileManagerImpl implements FileManager {

    @Override
    public File getUserFile(long userId, long fileId) {
        // get file by id from DB
        // verify that the provided user is the file owner
        // do other useful stuff
        // return the file or throw an exception if something is wrong
    }
}
Do the same for group, profile management, and the other features. After that you can easily replace an implementation by replacing a single jar file. Your web-app is completely decoupled and doesn't know anything about implementation details; it only depends on interfaces.