Creating an AWS Lambda Function API to Read/Write to AWS S3
For more information on Lambda, visit aws.amazon.com
Lambda API Description
This instruction set will show how to set up an AWS Lambda function behind an API Gateway endpoint that takes a POST input and pushes that POST data to S3. There are a few moving pieces that we will walk through, primarily Lambda, API Gateway, and S3. The user will create a POST request that goes to API Gateway, which will pass the POST data to the Lambda function. The function will take the POST data, construct an S3 object, and push it to an S3 bucket. It will then send an OK response back to API Gateway, which will pass the response back to the requester. Buckle up boys and girls, this may be a bumpy ride!
Lambda S3 Role
The first thing that we will need to do in order to be able to read and write data to Amazon S3 is to create a role that will allow Lambda the right to read and write to S3. In order to do this, we must first go to the IAM console. From the top left side of the navigational menu bar, click on the Services menu, and then choose IAM by either navigating to the section of the listed services, or by typing the first few letters of the service name in the search box, and then choosing it from the filtered list.
1. Create New Role:
Once we have navigated to the IAM console, from the left side menu, choose Roles, and then click on the Create role button.
2. Create Lambda S3 Role:
In the Create role view, ensure that you have selected AWS service from the top level choice list. Next, under the list of services, choose Lambda, and then click on the Next: Permissions button.
Next, from the list of available policies, type S3 in the search box and choose the AmazonS3FullAccess policy. Once selected, search for and select the CloudWatchLogsFullAccess policy, then click the Next: Review button.
Least Privilege Permissions:
For the purpose of this lab we are allowing Lambda full S3 permissions by choosing the AmazonS3FullAccess policy. In a production environment you should ALWAYS choose a policy, or create a custom policy, that allows only the minimum privileges that the Lambda function requires. The Least Privilege model should always be followed when creating roles for service execution.
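For example, a least-privilege custom policy scoped to just the bucket used in this lab might look like the following ({{BUCKET_NAME}} is a placeholder for the bucket created later in this walkthrough):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::{{BUCKET_NAME}}/*"
        }
    ]
}
```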
Last, we simply need to type a name and description for our new role, and then click the Create role button.
3. Role Complete:
Our role is now complete, and will be available in the list of assignable roles in the Lambda console. So now let's move on and create our Lambda function.
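If you prefer to script this step, the same role can be created with boto3. The sketch below is illustrative: the role name and description are examples, and AWS credentials with IAM permissions are assumed.

```python
import json

# Trust policy that allows the Lambda service to assume the role.
LAMBDA_TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}


def create_lambda_s3_role(role_name="Lambda-S3Admin"):
    """Create the role and attach the two managed policies used in this lab."""
    import boto3  # imported here; requires configured AWS credentials when called

    iam = boto3.client("iam")
    role = iam.create_role(
        RoleName=role_name,
        AssumeRolePolicyDocument=json.dumps(LAMBDA_TRUST_POLICY),
        Description="Allows Lambda to read/write S3 and write CloudWatch Logs",
    )
    for arn in (
        "arn:aws:iam::aws:policy/AmazonS3FullAccess",
        "arn:aws:iam::aws:policy/CloudWatchLogsFullAccess",
    ):
        iam.attach_role_policy(RoleName=role_name, PolicyArn=arn)
    return role["Role"]["Arn"]
```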
Lambda S3 Bucket
1. Create a Bucket:
Because this particular Lambda function will write a POST request object to an S3 bucket, we need to create a bucket and apply the proper policy to the bucket to ensure that Lambda will be able to write to the bucket properly. In order to do this, from the top left side of the navigational menu bar, click on the Services menu, and then choose S3 by either navigating to the section of the listed services, or by typing the first few letters of the service name in the search box, and then choosing it from the filtered list.
Once in the S3 console, click the Create bucket button in order to create a new S3 bucket.
2. Bucket Configuration:
Next, the S3 Create bucket modal window will pop up, allowing us to set up and configure our S3 bucket. In the Name and region section, choose a bucket name, the region that the bucket will live in, and optionally copy any settings from existing buckets and then press the Next button.
Next, in the Set properties dialog, choose and enable options such as versioning, access logging, tags, encryption, etc., and click Next. For the purpose of this exercise, we will leave all options at their defaults and just proceed to the next section.
Next, in the Set permissions dialog, configure any permissions that the bucket should have. By default, the bucket will give the owner full permissions. Here we could add permissions for other users, other accounts, and change the public/private flag of the bucket. Again for the purpose of this exercise we will leave the default values, and click Next. We will configure our bucket permissions with a bucket policy in the next section.
Last, review the bucket configuration and click the Create bucket button.
Once complete, you should now be able to look at your bucket list and see your newly created bucket.
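If you are scripting the setup instead, the bucket can be created with boto3. This is a minimal sketch: the bucket name and region are examples, AWS credentials are assumed, and us-east-1 is special-cased because S3 rejects an explicit LocationConstraint for that region.

```python
def bucket_create_args(bucket_name, region):
    """Build the kwargs for s3.create_bucket.

    us-east-1 must NOT be sent as a LocationConstraint; every other
    region must be.
    """
    args = {"Bucket": bucket_name}
    if region != "us-east-1":
        args["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return args


def create_bucket(bucket_name, region="us-east-1"):
    import boto3  # imported here; requires configured AWS credentials when called

    s3 = boto3.client("s3", region_name=region)
    return s3.create_bucket(**bucket_create_args(bucket_name, region))
```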
3. Bucket Policy:
In order to give Lambda permission to write to our new S3 bucket, we will need to set up a bucket policy to control access to the bucket. To do this, from your S3 console, click on the bucket, and then from the bucket properties console, click on the Permissions tab, and then click Bucket Policy to access the policy editor view. Paste the following policy into the policy editor, and then click on Save.
Policy Substitutions:
Ensure that you substitute the {{ROLE_ARN}} below with the ARN of the role created in previous steps in the Principal section, as well as {{BUCKET_NAME}} in the Resource section with the actual bucket name that was also created in previous steps.
Policy Syntax:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "{{ROLE_ARN}}"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::{{BUCKET_NAME}}/*"
        }
    ]
}
```
Policy Example:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::012345678910:role/Lambda-S3Admin"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::aws-allow-lambda-write/*"
        }
    ]
}
```
S3 POLICY:
This bucket policy grants the Lambda-S3Admin role, which is assumed by the Lambda service, the ability to GetObject and PutObject in the specified S3 bucket (aws-allow-lambda-write). The policy does not grant access to any other principal; any other user or role would need its own IAM permissions to reach this bucket. Note also that a bucket policy only grants access, so because the role itself carries AmazonS3FullAccess, the role can still reach other buckets unless its own policy is scoped down.
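Equivalently, the policy can be rendered and applied with boto3. This is a sketch: build_bucket_policy and apply_bucket_policy are illustrative helper names, and credentials with s3:PutBucketPolicy permission are assumed.

```python
import json


def build_bucket_policy(role_arn, bucket_name):
    """Render the bucket policy above for a given role ARN and bucket name."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": role_arn},
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::{}/*".format(bucket_name),
        }],
    }


def apply_bucket_policy(role_arn, bucket_name):
    import boto3  # imported here; requires configured AWS credentials when called

    s3 = boto3.client("s3")
    s3.put_bucket_policy(
        Bucket=bucket_name,
        Policy=json.dumps(build_bucket_policy(role_arn, bucket_name)),
    )
```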
Lambda Function
Now that we have our bucket set up, it's time to set up our Lambda function.
1. Navigate to the Lambda Console:
From the top left side of the navigational menu bar, click on the Services menu, and then choose Lambda by either navigating to the section of the listed services, or by typing the first few letters of the service name in the search box, and then choosing it from the filtered list.
2. Getting Started:
In the Lambda console, if no Lambda functions have previously been configured, click the Create a function button to configure your first Lambda function.
3. Select Runtime and Blueprint:
When you create a new Lambda function, two options will be available. Author from scratch will allow you to create a new Lambda function from scratch, as titled, whereas the second option, Blueprints, will allow you to select a pre-existing template to start building off of, which will auto-populate bits of code to template out the function logic. Choose a runtime (Node, Python, etc.) from the runtime drop list, and then choose a pre-defined template, or choose Author from scratch to get started. For the purpose of this tutorial, we will pick the first option, Author from scratch. Fill in the required bits and press the Create function button. NOTE: Under the Role section, select Choose an existing role, and then select the role that we just created in the IAM section.
4. Configure Lambda Trigger:
Next, in the function console window, verify that the function has access to both Amazon S3 and Amazon CloudWatch Logs on the right side of the function designer. Then on the left side of the function designer, click on API Gateway to add API Gateway as a trigger for the Lambda function.
Next, scroll down to the Configure triggers section, and select Create a new API from the API field. Then fill in the API name and Deployment stage, and choose Open as the Security method. Last, click the Add button.
5. Lambda Code:
Next, in the designer section at the top, click on the Lambda function itself, and then scroll down to the code block, and paste the following code into the editor, replacing any pre-existing code in the function editor box.
Code Substitutions:
Ensure that you substitute the bucket = '{{BUCKET_NAME}}' syntax below with the actual bucket name that was created in previous steps.
```python
from __future__ import print_function

import json
import time

import boto3
from botocore.client import Config

print('Loading function')

s3 = boto3.client('s3', config=Config(signature_version='s3v4'))


def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))

    # The function will error out if body-json is not set; this gets set via API Gateway
    print(event['body-json'])

    # Create a new object from the POST body and push it to S3,
    # keyed by the current epoch timestamp
    bucket = '{{BUCKET_NAME}}'
    key = str(int(time.time())) + ".json"
    body = json.dumps(event['body-json'])

    try:
        s3.put_object(Bucket=bucket, Key=key, Body=body)
    except Exception as e:
        print(e)
        print('Error putting object {} into bucket {}. Make sure the bucket exists and is in the same region as this function.'.format(key, bucket))
        raise e

    # Send the API response
    return {
        'statusCode': '200',
        'body': event['body-json'],
        'headers': {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': '*'
        }
    }
```
6. Save the Function:
Now that we have our Lambda function details filled in, click the Save button in the top right corner, to save the function.
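Since the handler reads event['body-json'], you can sanity-check it straight from the Lambda console before moving on, by configuring a test event with the console's test-event feature shaped like the JSON we will eventually POST through API Gateway, for example:

```json
{
    "body-json": {
        "username": "testuser",
        "email": "me@mydomain.com",
        "phone": "(555) 555-5555"
    }
}
```

Running this test event should return a 200 response and drop a timestamped .json object into the bucket.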
API Gateway
Now that we have our Lambda function defined, we need to finalize the setup of the API gateway. In order to do this, first we must again, from the top left side of the navigational menu bar, click on the Services menu, and then choose API Gateway by either navigating to the section of the listed services, or by typing the first few letters of the service name in the search box, and then choosing it from the filtered list.
1. Select the API to Modify:
Once in the API Gateway console, you will see a list of all of the API Gateways that have been previously created. Locate the new API Gateway from the list and click on it to view the gateway properties.
2. Create Post Method:
Once inside the API Gateway, we need to define a POST method that will allow us to send a POST request with JSON data that will, via the Lambda function, be saved to our defined S3 bucket. To create the new method, click on the API name, and then from the Actions menu, choose Create Method. This will create a drop list under the API. From the drop list choose POST, and then click the check mark next to the new method in order to save the method.
3. Post Method Configuration:
Next, we need to set up our POST method to ensure that any POST requests that we send to the API Gateway will be sent to, and responses will come from, our Lambda function. In order to do so, ensure that the Integration type is set to Lambda Function, and leave the Use Lambda Proxy integration box unchecked (the function's code reads event['body-json'] directly, which works because non-proxy integrations pass the raw JSON body through as the Lambda event). Next choose the Lambda Region where you created the Lambda function, and then in the new input field, type the name of the Lambda function that we created in previous steps. Once complete, click on Save.
Once you have clicked Save, a pop-up window will appear confirming that you want to add permission to your Lambda function that will allow API Gateway to trigger it. Verify the information in the pop-up box, and then click on the OK button.
4. Post Method Flow:
Once the method has been constructed, you will be returned to the method view, and should now be able to see, in diagram format, the flow of your newly constructed method.
5. Enable CORS:
Next, we need to enable CORS, or Cross-Origin Resource Sharing. This will allow any domain to send a request to the API endpoint without the endpoint rejecting us because the request didn't originate from the same domain as the API Gateway endpoint. In order to do this, from the Resources API menu, and while still having our POST method selected in our API, again click on Actions, and then Enable CORS.
From within the Enable CORS console, leave the default values and click the Enable CORS and replace existing CORS headers button.
Next, a pop up dialog will be displayed asking if we are sure we want to replace the existing headers with the new CORS enabled headers. Click the Yes, replace existing values button.
Once the configuration has completed, CORS should be configured and enabled for our new API.
6. Deploy the API:
The last configuration step that we need to perform is to actually deploy the API. Any time changes are made to our API, we need to ensure that it gets redeployed. To deploy the API, from the Resources menu on the left, click on the API, then click on Actions, and from the drop menu choose Deploy API.
Once the deploy has been initiated, choose the stage where the API will be deployed, and then type a description or reason why the API was deployed or redeployed. Once complete, click the Deploy button.
7. Post Method URL:
Our API should be all set up and ready for testing. In order to test, we will need to get the URL for the API endpoint. To do so, click on Stages from the left side menu, then expand the stage that we want to hit, in this case Prod, and click on the POST method from the list of all methods. In the method view, you should now be able to copy the URL for the API endpoint.
Testing the API
The final step of this process is to ensure that we can now send a POST request to the URL, and to ensure that the data that we send in that post request gets written to our S3 bucket. In order to do this you can use whatever API testing tool that you are comfortable with. In this example, I will use Postman, my favorite API testing utility to test the API.
1. Create and Send the POST Request:
To test the API using Postman, or your testing tool of choice, configure the request as a POST request, using the URL that was gathered in the step above. In the request Body, click on raw to send a raw JSON structure that the Lambda function will gather and save as an S3 object. The raw JSON body structure should look like the following:
```json
{
    "body-json": {
        "username": "testuser",
        "email": "me@mydomain.com",
        "phone": "(555) 555-5555"
    }
}
```
Once the request has been configured, simply hit the Send button, and ensure that you receive a 200 response back.
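If you would rather test from code than from a GUI tool, the same request can be sent with Python's standard library. This is a sketch: the {{API_ID}} and {{REGION}} placeholders in the commented call must be replaced with the invoke URL copied in the previous section.

```python
import json
from urllib import request


def post_to_api(url, payload):
    """POST a JSON payload to the deployed endpoint; return (status, body)."""
    data = json.dumps(payload).encode("utf-8")
    req = request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status, resp.read().decode("utf-8")


payload = {
    "body-json": {
        "username": "testuser",
        "email": "me@mydomain.com",
        "phone": "(555) 555-5555",
    }
}

# Replace the placeholders with your real invoke URL before running:
# status, body = post_to_api(
#     "https://{{API_ID}}.execute-api.{{REGION}}.amazonaws.com/Prod", payload)
# A 200 status indicates the object was written to S3.
```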
2. Verify the Request Object in S3:
Last, let's check the S3 bucket to ensure that the JSON that we supplied in the POST request was received by our Lambda function and written to S3 properly. To do so, simply navigate back to S3, click the bucket that we set up for the Lambda function to write to, and hit the refresh button. Once complete, we should see our object in the S3 bucket.
Last, click on the object, click the Make public button, and then click the Link to download the object. Open the object in the editor of your choosing and verify that the data in the object is the same data that we sent using the post request.
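As an alternative to making the object public, you can pull it back down with boto3 and compare it to the request payload. This is a sketch: read_newest_object and key_timestamp are illustrative helper names, and credentials with read access to the bucket are assumed.

```python
import json


def key_timestamp(key):
    """The Lambda function names objects '<epoch-seconds>.json'; recover the epoch."""
    return int(key.rsplit(".", 1)[0])


def read_newest_object(bucket_name):
    """Fetch the most recently written object from the bucket as parsed JSON."""
    import boto3  # imported here; requires configured AWS credentials when called

    s3 = boto3.client("s3")
    keys = [o["Key"] for o in s3.list_objects_v2(Bucket=bucket_name).get("Contents", [])]
    newest = max(keys, key=key_timestamp)
    body = s3.get_object(Bucket=bucket_name, Key=newest)["Body"].read()
    return newest, json.loads(body)
```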
3. Congratulations!:
If everything went smoothly, then you have successfully configured your Lambda function behind an API Gateway, and made it accessible for public consumption!
Public API:
Ensure you understand that this process made your API public. Anyone with the proper POST URL can now send a POST request to this URL and save data to the configured S3 bucket. In order to lock down the API properly, you will want to configure the use of an API key, an AWS authorizer, or reconfigure the API to be privately available only within your VPC. All of these methods fall outside the scope of this article and will be addressed in future articles.