I am trying to use Spring Cloud GCP. I want to know how to load GCP credentials programmatically rather than from a JSON file referenced by spring.cloud.gcp.credentials.location.
Does creating a bean of type com.google.auth.Credentials let Spring Boot auto-configure Spring Cloud GCP correctly for use within the application?
If not, what is the way to inject Credentials so that Spring Cloud GCP is configured correctly?
Not Credentials; rather, a bean of type CredentialsProvider will take precedence over any properties/auto-configuration.
It's a functional interface, so you can return a lambda:
@Bean
public CredentialsProvider credentialsProvider() {
    return () -> NoCredentialsProvider.create().getCredentials();
}
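If you need real credentials built in code (for example a service account key fetched from a secret store rather than referenced by a file path), here is a minimal sketch assuming a hypothetical classpath resource named my-sa-key.json; any InputStream carrying the key JSON would do:
import java.io.IOException;
import java.util.Collections;

import com.google.api.gax.core.CredentialsProvider;
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
public class GcpCredentialsConfig {

    // "my-sa-key.json" is a placeholder resource name; build the GoogleCredentials
    // from whatever stream or byte[] actually holds your service account key.
    @Bean
    public CredentialsProvider credentialsProvider() throws IOException {
        GoogleCredentials credentials = GoogleCredentials
                .fromStream(new ClassPathResource("my-sa-key.json").getInputStream())
                .createScoped(Collections.singletonList("https://www.googleapis.com/auth/cloud-platform"));
        return FixedCredentialsProvider.create(credentials);
    }
}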
I am working on a Spring Boot application that needs to access AWS resources. I know how to access AWS resources via IAM credentials and STS credentials; I am looking for an example or a way to consume the following temporary AWS credentials from a Spring Boot application:
AWS_SECRET_ACCESS_KEY
AWS_ACCESS_KEY
AWS_SESSION_TOKEN
AWS_SESSION_ID
Note: I have tried accessing via BasicSessionCredentials and BasicAWSCredentials, but had no luck; I get the error "Unable to execute HTTP request".
Any reference or example that sets all four properties using Java would help a lot, thanks!
You can use the default credentials provider chain (https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/auth/DefaultAWSCredentialsProviderChain.html), which looks for credentials in this order:
Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
Java System Properties - aws.accessKeyId and aws.secretKey
Web Identity Token credentials from the environment or container
Credential profiles file at the default location (~/.aws/credentials) shared by all AWS SDKs and the AWS CLI
Credentials delivered through the Amazon EC2 container service
Instance profile credentials delivered through the Amazon EC2 metadata service
@Bean
public AWSCredentialsProvider amazonAWSCredentialsProvider() {
    return DefaultAWSCredentialsProviderChain.getInstance();
}

@Bean
public AmazonS3 amazonS3() {
    return AmazonS3ClientBuilder
            .standard()
            .withCredentials(amazonAWSCredentialsProvider())
            .withRegion(Regions.US_WEST_2)
            .build();
}
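If the temporary credentials reach you through some channel other than the environment variables above, here is a minimal sketch of wiring them in explicitly via BasicSessionCredentials; the three values below are placeholders, and the access key, secret key and session token are all the SDK needs:
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicSessionCredentials;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TemporaryAwsCredentialsConfig {

    // Placeholder values; in practice read them from configuration or the STS response.
    @Bean
    public AWSCredentialsProvider sessionCredentialsProvider() {
        BasicSessionCredentials sessionCredentials = new BasicSessionCredentials(
                "<AWS_ACCESS_KEY>",
                "<AWS_SECRET_ACCESS_KEY>",
                "<AWS_SESSION_TOKEN>");
        return new AWSStaticCredentialsProvider(sessionCredentials);
    }
}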
I am trying to find a way to call an external API based on its YAML definition:
I have a (Spring Boot, external) API-1, generated with the OpenAPI plugin, which has its own YAML file.
I have a (Spring Boot) API-2, which calls API-1. But all I have is the YAML definition of API-1.
We know an external API can be called with:
@Bean
public WebClient remoteApiBean() {
    return WebClient.create("http://example.com/api");
}
And, for example, a service implementation:
@Autowired
public MyService(WebClient remoteApiBean) {
    this.remoteApiBean = remoteApiBean;
}
And requesting a resource:
this.remoteApiBean.get()
        .uri("/request")
        .retrieve()
        .bodyToMono(ModelDTO.class)
        .block(REQUEST_TIMEOUT);
The API and model info are stored inside the API-1 YAML definition. My question is whether it is possible to use the API-1 YAML definition as a resource for API-2, i.e. to generate the code above and also generate the DTOs. I think there MUST be a plugin that generates this, but I am unable to find it, because the search results are dominated by very similar issues about Spring Boot external YAML configuration.
I am using the Spring Boot Actuator module in my project, which exposes REST endpoint URLs to monitor and manage application usage in a production environment without any coding or configuration for them.
By default, only /health and /info endpoints are exposed.
I am customising the endpoints via the application.properties file as per my use case.
application.properties:
#To expose all endpoints
management.endpoints.web.exposure.include=*
#To expose only selected endpoints
management.endpoints.jmx.exposure.include=health,info,env,beans
I want to understand, where exactly does Spring Boot create actual endpoints for /health and /info and how does it expose them over HTTP?
Thanks @Puce and @MarkBramnik for helping me out with the reference docs & code repository.
I wanted to understand how the endpoints were working and how they were exposed over HTTP, so that I could create custom endpoints to leverage in my application.
One of the great features of Spring Framework is that it’s very easy to extend, and I was able to achieve the same.
To create a custom actuator endpoint, use the @Endpoint annotation on a class. Then use the @ReadOperation / @WriteOperation / @DeleteOperation annotations on its methods to expose them as actuator endpoint operations as needed.
Reference Doc: Implementing Custom Endpoints
Reference Example:
@Endpoint(id = "custom_endpoint")
@Component
public class MyCustomEndpoint {

    @ReadOperation
    public String greet() {
        return "Hello from custom endpoint";
    }
}
The endpoint id, i.e. custom_endpoint, needs to be added to the list of actuator endpoints to be exposed.
application.properties:
management.endpoints.web.exposure.include=health,info,custom_endpoint
After a restart, the endpoint works like a charm!
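For completeness, the same endpoint class could also expose a write operation; here is a sketch that extends the example above with a hypothetical updateGreeting operation (Actuator maps @WriteOperation to an HTTP POST and binds the method parameters from the JSON request body):
@Endpoint(id = "custom_endpoint")
@Component
public class MyCustomEndpoint {

    private String greeting = "Hello from custom endpoint";

    @ReadOperation
    public String greet() {
        return greeting;
    }

    // Hypothetical write operation; invoked via POST with a body like {"newGreeting": "Hi"}
    @WriteOperation
    public void updateGreeting(String newGreeting) {
        this.greeting = newGreeting;
    }
}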
I am using spring-cloud-stream-binder-kinesis, version: 2.0.2.RELEASE.
I was able to successfully use the binder and access it locally using the default ContextCredentialsAutoConfiguration mentioned in KinesisBinderConfiguration.
Now I know this set-up won't work for me, because:
The Kinesis data stream is in AWS account 1
The Service is running in AWS account 2
(I have already set up the assumed role so that account 2 can access streams in account 1 using that role.)
However, I am not sure how I can override the credentials in the binder to use STSAssumeRoleSessionCredentialsProvider.
Can someone help please?
The KinesisBinderConfiguration is fully based on the auto-configuration from Spring Cloud AWS, which provides a ContextCredentialsAutoConfiguration and exposes an AWSCredentialsProvider bean under the name credentialsProvider if one is not already present.
So you probably just need to declare your STSAssumeRoleSessionCredentialsProvider as a bean in your configuration class and give it that credentialsProvider bean name, for example:
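A minimal sketch, assuming the AWS SDK v1 STS module is on the classpath; the role ARN and session name below are placeholders:
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AssumeRoleCredentialsConfig {

    // Named "credentialsProvider" so it takes the place of the bean that
    // ContextCredentialsAutoConfiguration would otherwise register.
    @Bean("credentialsProvider")
    public AWSCredentialsProvider credentialsProvider() {
        return new STSAssumeRoleSessionCredentialsProvider.Builder(
                "arn:aws:iam::111111111111:role/stream-reader-role",  // placeholder role ARN in account 1
                "kinesis-binder-session")                             // placeholder session name
                .build();
    }
}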
I'm trying to use Spring Cloud to consume a generic REST service from a Cloud Foundry app.
This service is created using Spring Boot, as follows:
package com.something;
@RestController
public class DemoServiceController {

    @RequestMapping("/sayHi")
    public String sayHi() {
        return "Hello!";
    }
}
This works fine - I can access http://www.example.com/srv/demo/sayHi and get "Hello!" back.
Next, I created a user-provided service instance using the CF-CLI and bound it to my app. I can now see the bound service in VCAP_SERVICES.
cf cups my-demo-service -p '{"url":"http://www.example.com/srv/demo/"}'
cf bs my-demo-app my-demo-service
Next, as described here, I added this bean to my app's Spring config, with the connector-type set to my original controller (I have a reference to it as well).
<cloud:service id="myDemoService"
service-name="my-demo-service"
connector-type="com.something.DemoServiceController"
/>
Now when I auto-wire "myDemoService" into my app,
@Autowired
private DemoServiceController myDemoService;
I get an error:
No services of the specified type could be found.
I've made sure to include all required dependencies, including spring-cloud-spring-service-connector and spring-cloud-cloudfoundry-connector.
What's going wrong here? Am I giving the wrong bean parameters? Any help is much appreciated.
Spring Cloud Connectors won't know what to do with this service, as each supported service must be of a known type (MySQL, Postgres, Redis, MongoDB, RabbitMQ, etc). Setting the connector-type to your Controller class won't do what you want.
What you will need to do is create a custom Connectors extension. Here's an example project that does that: https://github.com/cf-platform-eng/spring-boot-cities. A rough sketch of the idea follows.
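This is only an illustrative sketch: the class names are made up, the package and method names are from memory of Spring Cloud Connectors 1.x (double-check them against the linked project), and a real extension also has to be registered in META-INF/services/org.springframework.cloud.cloudfoundry.CloudFoundryServiceInfoCreator and paired with a ServiceConnectorCreator so that cloud:service can build your client object.
import java.util.Map;

import org.springframework.cloud.cloudfoundry.CloudFoundryServiceInfoCreator;
import org.springframework.cloud.cloudfoundry.Tags;
import org.springframework.cloud.service.UriBasedServiceInfo;

// ServiceInfo that simply carries the URL of the user-provided service
class DemoServiceInfo extends UriBasedServiceInfo {
    public DemoServiceInfo(String id, String uri) {
        super(id, uri);
    }
}

// Recognises the bound service and turns its VCAP_SERVICES entry into a DemoServiceInfo
class DemoServiceInfoCreator extends CloudFoundryServiceInfoCreator<DemoServiceInfo> {

    public DemoServiceInfoCreator() {
        // Match services whose credentials contain an http(s) URL
        super(new Tags(), "http", "https");
    }

    @Override
    @SuppressWarnings("unchecked")
    public DemoServiceInfo createServiceInfo(Map<String, Object> serviceData) {
        String id = (String) serviceData.get("name");
        Map<String, Object> credentials = (Map<String, Object>) serviceData.get("credentials");
        return new DemoServiceInfo(id, (String) credentials.get("url"));
    }
}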