I'm working on a Lambda function that gets triggered from Kinesis streams.
I'm new to AWS, so I'm wondering whether there is a way to debug the Lambda function locally.
What I'm doing right now is that every time I want to test, I have to deploy to the integration environment, which is painful and time consuming.
The function is built using Java 8.
Any suggestions, please?
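One way to shorten that loop, as a sketch: call the handler straight from a JUnit test with a hand-built KinesisEvent, so nothing has to be deployed for day-to-day testing. The class name MyKinesisHandler is a placeholder for your own handler.

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

import com.amazonaws.services.lambda.runtime.events.KinesisEvent;
import org.junit.Test;

public class MyKinesisHandlerTest {

    @Test
    public void invokesHandlerWithHandBuiltEvent() {
        // Build a Kinesis record by hand instead of waiting for a real stream record.
        KinesisEvent.Record record = new KinesisEvent.Record();
        record.setPartitionKey("test-key");
        record.setData(ByteBuffer.wrap("{\"id\":1}".getBytes(StandardCharsets.UTF_8)));

        KinesisEvent.KinesisEventRecord eventRecord = new KinesisEvent.KinesisEventRecord();
        eventRecord.setKinesis(record);

        KinesisEvent event = new KinesisEvent();
        event.setRecords(Collections.singletonList(eventRecord));

        // Context is not used here; pass null or a mock if your handler needs it.
        new MyKinesisHandler().handleRequest(event, null);
    }
}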
I am able to run an AWS Lambda function using Postman. I want to use the event it receives to run a unit test against it, but I am not sure how I can generate the event.json file.
I tried logging the event that I receive in my handleRequest method to CloudWatch Logs, but it does not seem to be the right one.
You can find the Lambda event structures in the documentation. Here is the list for all the different event types: https://docs.aws.amazon.com/lambda/latest/dg/lambda-services.html.
In your case I would say the event you are looking for comes from API Gateway; you can find it here: https://docs.aws.amazon.com/lambda/latest/dg/services-apigateway.html#apigateway-example-event
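As a rough sketch of how to use it: copy the sample API Gateway event from that page into src/test/resources/event.json, deserialize it with Jackson, and hand it to the handler directly in a unit test. The handler class MyHandler is a placeholder, and the sketch assumes the handler takes an APIGatewayProxyRequestEvent.

import java.io.InputStream;

import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.Test;

public class MyHandlerTest {

    @Test
    public void handlesSampleApiGatewayEvent() throws Exception {
        // Load the sample event copied from the AWS documentation.
        InputStream json = getClass().getResourceAsStream("/event.json");

        // Ignore fields of the sample event that the POJO does not model.
        ObjectMapper mapper = new ObjectMapper()
                .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        APIGatewayProxyRequestEvent event = mapper.readValue(json, APIGatewayProxyRequestEvent.class);

        // Invoke the handler exactly as Lambda would, but inside the test JVM.
        new MyHandler().handleRequest(event, null);
    }
}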
I have an SQS queue with a DLQ configured for failures. I want to write a custom Lambda function in Java (Spring Boot) to get the messages from this DLQ, write them to a file, upload the file to an S3 bucket, and send the file as an alert to a specified webhook.
I'm new to Lambda and I hope this design can be implemented.
One requirement is that I want to execute the Lambda only once per day, say at 6:00 AM every day, and I want all the messages in the queue to be written to a file.
I'm trying to find examples of RequestHandler implementations where the messages in the queue are received and iterated over to be saved to the file one at a time.
I'm not sure how to configure the Lambda so that it runs only once per day instead of each time a message enters the DLQ.
Any documentation relating to these queries would be really helpful. Please critique my expected implementation and offer any better solutions.
You can have your Lambda code run on any schedule (once per day in your case) using a CloudWatch Events schedule.
To create the schedule, follow this link.
In your Lambda code, you can fetch the messages from the DLQ and process them iteratively.
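A rough sketch of such a handler, using the AWS SDK for Java v1: it drains the DLQ in batches, collects the bodies into one file, and uploads it to S3. The queue URL, bucket name, and class name are placeholders, and the webhook alert is left out.

import java.util.List;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.ScheduledEvent;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

public class DlqDrainHandler implements RequestHandler<ScheduledEvent, String> {

    private static final String DLQ_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-dlq"; // placeholder
    private static final String BUCKET = "my-alert-bucket"; // placeholder

    private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public String handleRequest(ScheduledEvent event, Context context) {
        StringBuilder file = new StringBuilder();
        List<Message> messages;
        do {
            // Long-poll the DLQ in batches of up to 10 messages until it is empty.
            messages = sqs.receiveMessage(new ReceiveMessageRequest(DLQ_URL)
                            .withMaxNumberOfMessages(10)
                            .withWaitTimeSeconds(5))
                    .getMessages();
            for (Message m : messages) {
                file.append(m.getBody()).append('\n');
                sqs.deleteMessage(DLQ_URL, m.getReceiptHandle());
            }
        } while (!messages.isEmpty());

        // Write everything collected into a single S3 object; the webhook alert would follow here.
        String key = "dlq-dump-" + event.getTime() + ".txt";
        s3.putObject(BUCKET, key, file.toString());
        return key;
    }
}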
You don't need to use the Spring framework in AWS Lambda; use plain Java.
Use Lambda with a CloudWatch cron expression and schedule a daily run, e.g. cron(0 6 * * ? *) for 6:00 AM UTC every day.
Write your own logic.
https://docs.aws.amazon.com/lambda/latest/dg/java-samples.html
https://www.freecodecamp.org/news/using-lambda-functions-as-cronjobs/
Is there a way to run Cron Jobs with Ktor? My end objective is to host a Cron Job written in Kotlin for the Coinverse app's backend service to populate data.
I'm currently hosting multiple Java .jar apps written in Kotlin on AppEngine. I'm looking to refactor these apps into Ktor apps on AppEngine with a Cron Job for scheduled tasks, as the .jar apps have more issues with dependencies.
I'm looking for Ktor's equivalent to Cloud Functions' built-in implementation for Cron Jobs with JavaScript.
functions.pubsub.schedule
Back-up option: If Ktor does not have this feature and I want to keep the code in Kotlin, Google has an alpha, Using Kotlin with Google Cloud Functions. It appears Kotlin + Cloud Functions' built-in implementation could be used with this approach.
Sergey Mashkov from the JetBrains team suggests in the kotlinlang Slack group launching a Kotlin coroutine on the Application scope with an infinite loop and delay.
Then, the Ktor app can be deployed to AppEngine.
import io.ktor.application.Application // io.ktor.server.application.Application in Ktor 2+
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

fun Application.main() {
    // Application is a CoroutineScope, so this coroutine lives as long as the app does.
    launch {
        while (true) {
            delay(600000) // wait 10 minutes between runs
            // Populate data here.
        }
    }
}
In my experience, this will not work; the app will stop after 20 minutes or so.
The only solution I've found is to use a regular cron.yaml alongside the Ktor app, and it works without complaint. The Ktor app exposes a GET endpoint, which is called on schedule by the cron configuration; a sketch follows below.
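For illustration, a minimal cron.yaml along those lines might look like the following; the /populate path is a placeholder for whatever GET route the Ktor app exposes, and App Engine calls it on the given schedule.

cron:
- description: "populate data"
  url: /populate
  schedule: every 10 minutes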
I need to invoke an AWS lambda function using an SQS trigger.
The Lambda function needs to include Spring Boot features, because it needs to transform the message from SQS into another input form and then call an external REST API using Spring Cloud Feign.
I'm very new to Spring and Lambda, so I'm not sure exactly how to do this.
I would like to understand whether my expected workflow is supported by Spring and AWS. If so, can anyone share some sample code showing how to achieve this?
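Independent of how Spring is wired in (the Spring Cloud Function AWS adapter is the usual route), the raw Lambda entry point for an SQS trigger looks roughly like the sketch below; the class name, the transformation, and the Feign/REST call are only placeholders.

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

public class SqsToRestHandler implements RequestHandler<SQSEvent, Void> {

    @Override
    public Void handleRequest(SQSEvent event, Context context) {
        // An SQS trigger can deliver a batch of messages in one invocation.
        for (SQSEvent.SQSMessage message : event.getRecords()) {
            // 1. Transform the raw SQS body into the shape the REST API expects.
            String transformed = transform(message.getBody());
            // 2. Call the external REST API; with Spring this would go through a Feign client.
            callExternalApi(transformed);
        }
        return null;
    }

    private String transform(String body) {
        return body; // placeholder for the real mapping logic
    }

    private void callExternalApi(String payload) {
        // placeholder for the Feign / HTTP call
    }
}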
Is there a way to build a proper integration test for a few AWS Lambdas interacting with each other via an AWS SNS topic?
I deployed two Lambdas written in Java.
The first one is subscribed to AWS SNS_topic#1. It filters and transforms the SNS message and pushes the modified data onto SNS_topic#2.
The second Lambda is subscribed to SNS_topic#2. It modifies the SNS message and makes an HTTP request to an external endpoint.
I need to build an end-to-end integration test to check the whole interaction.
Actually, there is a better way; you should check out
https://github.com/localstack/localstack
and https://github.com/localstack/localstack-java-utils
These two tools will help you test your serverless apps built on AWS.
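To give an idea of what such a test can look like, here is a rough JUnit sketch that points the plain AWS SDK at LocalStack's default edge port (4566) and publishes to the first topic; the topic ARN is a placeholder, the Lambdas and topics must already be deployed to LocalStack, and the final assertion depends on what the second Lambda does.

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;
import org.junit.Test;

public class SnsPipelineIntegrationTest {

    // Point the SDK at LocalStack; it accepts any dummy credentials.
    private final AmazonSNS sns = AmazonSNSClientBuilder.standard()
            .withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials("test", "test")))
            .withEndpointConfiguration(
                    new AwsClientBuilder.EndpointConfiguration("http://localhost:4566", "us-east-1"))
            .build();

    @Test
    public void messageFlowsThroughBothLambdas() {
        // Publish to SNS_topic#1 and let the deployed Lambdas react as they would in AWS.
        sns.publish("arn:aws:sns:us-east-1:000000000000:SNS_topic1", "{\"test\":true}");

        // Then assert on the observable outcome of the second Lambda,
        // e.g. a stubbed HTTP endpoint received the expected request.
    }
}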
You can use the AWS CLI if you want to test it locally.
The AWS CLI lets you manually trigger your Lambda via the aws lambda invoke command.
Please install the AWS CLI locally from this link.
After that, you can invoke the first Lambda from your CLI with the aws lambda invoke command. It comes with many options; for example, you can pass a payload (which would come from SNS in the actual scenario).
Executing AWS Lambda from the AWS CLI: command details and description
Example command:
aws lambda invoke --function-name your_function_name --invocation-type RequestResponse --payload file://requestFile.txt outfile.txt
(With AWS CLI v2 you may also need --cli-binary-format raw-in-base64-out when passing a JSON payload.)
I hope this helps.