I have an S3 bucket that I want to attach to an application’s upload area, but I want to move uploaded files out of the application-accessible bucket after they’ve been uploaded. Eventually I want this to happen after a small delay, but initially I wanted to test out the concept itself.
Step 1: Have source and destination buckets in S3
Create buckets for source and destination. The ACLs on both of the buckets are the same (non-public) in my case.
Step 2: Create a Lambda Execution Role
- Go to IAM > Roles > Create Role
- Choose Lambda as a Use Case
- Next: Permissions
- Search for S3 and check AmazonS3FullAccess
- Search for “lambdabasic” and check AWSLambdaBasicExecutionRole (for CloudWatch logs)
- Click [Next: Tags] > [Next: Review] and give your role a name and verify that the S3 and Lambda policies are added:
- Click [Create Role]
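For reference, choosing Lambda as the use case attaches the standard Lambda trust policy to the role, which allows the Lambda service to assume it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```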
Step 3: Prep the Lambda Code
- Clone https://github.com/stringsn88keys/move_file_on_upload
- Be sure to have the correct Ruby version (`2.7.0` at the time of writing) installed
- Change into the cloned directory and run `bundle config set --local deployment 'true'` to vendor the gems for the AWS Lambda
- Run `zip ../move_file_on_upload.zip **` to package the zip
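Before zipping, it helps to know what the function will do. Here is a minimal sketch of the copy-then-delete handler, with the AWS SDK calls left as comments so it stays runnable locally (`records_to_move` and the commented SDK calls are my illustration of the approach, not the repo's exact code):

```ruby
require 'uri'

# Sketch of what a handler like this broadly does (the repo's actual code may
# differ): copy each uploaded object to TO_BUCKET under the same key, then
# delete the original. Key extraction is factored out so it can be exercised
# without AWS credentials.
def records_to_move(event)
  event['Records'].map do |record|
    {
      bucket: record['s3']['bucket']['name'],
      # object keys arrive URL-encoded in S3 event notifications
      key: URI.decode_www_form_component(record['s3']['object']['key'])
    }
  end
end

def handler(event:, context:)
  records_to_move(event).each do |rec|
    # With the AWS SDK this would be roughly:
    #   s3 = Aws::S3::Client.new
    #   s3.copy_object(bucket: ENV['TO_BUCKET'], key: rec[:key],
    #                  copy_source: "#{rec[:bucket]}/#{rec[:key]}")
    #   s3.delete_object(bucket: rec[:bucket], key: rec[:key])
    puts "moving #{rec[:key]} from #{rec[:bucket]} to #{ENV['TO_BUCKET']}"
  end
end
```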
Step 4: Create the Lambda
- Go to AWS Lambda in the AWS Console and click [Create Function]
- Name the function, set Ruby 2.7 as the runtime, and use the role you created
- [Create function]
Step 5: Add S3 Trigger
- Click [+ Add Trigger]
- Search and select S3
- Fill in your source bucket and select all object create events
- If you get a “configurations overlap” error, select your bucket in S3, click the Properties tab, and you’ll see an Event Notification that was orphaned by a previous configuration (be sure to delete the dependent Lambda as well if it exists):
Step 6: Upload your code
- Go back to the [Code] tab for your Lambda, open the [Upload from] dropdown, and select “.zip file”.
- Go to the [Configuration] section and add FROM_BUCKET and TO_BUCKET environment variables so your Lambda knows which buckets to move files between
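It's worth failing fast if those variables are missing so the misconfiguration shows up plainly in the logs. A small sketch (`bucket_config` is a hypothetical helper of mine, not code from the repo):

```ruby
# Hypothetical helper (not from the repo): read the bucket names the Lambda
# needs, raising loudly if either is missing so misconfiguration is obvious
# in CloudWatch logs.
def bucket_config(env = ENV)
  from = env['FROM_BUCKET']
  to   = env['TO_BUCKET']
  raise 'FROM_BUCKET not set' unless from
  raise 'TO_BUCKET not set' unless to
  { from: from, to: to }
end
```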
Step 7: Monitor and test
- You can test the Lambda execution via the Test dropdown
- S3 put is one of the available test templates
- Click “Create” after selecting S3 Put and you’ll be able to watch the event get executed.
- Go to CloudWatch -> Logs -> Log Groups and you should see a log group for your function (/aws/lambda/<function name>)
- If all else is successful, you should see “the specified key does not exist” in the logs: the S3 Put test event references an object that doesn’t actually exist in your bucket, so the copy fails only at the S3 call itself, which confirms the function ran and parsed the event
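One detail worth knowing when reading those test events: object keys in S3 notifications are URL-encoded, and the console's S3 Put template uses an encoded key along the lines of the one below (check the template in your own console for the exact value):

```ruby
require 'uri'

# Keys in S3 event notifications are URL-encoded; a "/" in the key appears
# as "%2F" in the event payload and must be decoded before use.
encoded_key = 'test%2Fkey'
decoded_key = URI.decode_www_form_component(encoded_key)
puts decoded_key  # prints "test/key"
```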
Step 8: Test it live
- Create a folder in your source bucket.
- Go into that folder
- Upload a file.
- The file should pretty quickly be copied to the destination bucket in the same folder structure and removed from its original location.
- The top level folder under the bucket should remain.
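A note on why the folder survives: a console-created “folder” is just a zero-byte object whose key ends in `/`, so a handler can leave the structure intact by skipping those placeholder keys. This filter is my assumption about the behavior, not necessarily the repo's exact logic:

```ruby
# Hypothetical filter (my assumption, not necessarily the repo's logic):
# console-created "folders" are zero-byte objects whose keys end in "/",
# so skipping them leaves the folder structure in the source bucket.
def keys_to_move(event_keys)
  event_keys.reject { |key| key.end_with?('/') }
end
```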