Serverless Architectures with Amazon DynamoDB and Amazon Kinesis Streams with AWS Lambda

Overview

In this tutorial, you will learn the basics of event-driven programming using Amazon DynamoDB, DynamoDB Streams, and AWS Lambda. You will walk through the process of building a real-world application using triggers that combine DynamoDB Streams and Lambda.


TOPICS COVERED

By the end of this tutorial, you will be able to:

  • Create an AWS Lambda function from a blueprint
  • Create an Amazon Kinesis Stream
  • Use Amazon CloudWatch to monitor Kinesis event data triggering your Lambda function
  • Create an Amazon DynamoDB table and insert items
  • Enable the Amazon DynamoDB Streams feature
  • Configure and troubleshoot Lambda functions

About the Technologies

AWS LAMBDA

AWS Lambda is a compute service that lets you run code without provisioning or managing servers, making web-scale computing easier for developers. You upload your code to AWS Lambda and the service runs it on your behalf using AWS infrastructure. AWS Lambda supports multiple languages, including Node.js, Java, and Python. After you upload your code and create a Lambda function, AWS Lambda takes care of provisioning and managing the servers used to run it.

In this tutorial, you will use AWS Lambda as an event-driven compute service, where AWS Lambda runs your code in response to new records arriving on an Amazon Kinesis stream and to changes to data in an Amazon DynamoDB table.

You can use AWS Lambda in two ways:

As an event-driven compute service where AWS Lambda runs your code in response to events, such as new stream records or table updates, as you will see in this lab.

As a compute service to run your code in response to HTTP requests using Amazon API Gateway or API calls.

Lambda passes on to you the financial benefits of Amazon’s scale. AWS Lambda executes your code only when needed and scales automatically, from a few requests per day to thousands per second. With these capabilities, you can use Lambda to easily build data processing triggers for AWS services like Amazon S3 and Amazon DynamoDB, process streaming data stored in Amazon Kinesis, or create your own back end that operates at AWS scale, performance, and security.

This lab guide explains basic concepts of AWS in a step by step fashion. However, it can only give a brief overview of Lambda concepts. For further information, see the official Amazon Web Services Documentation for Lambda at https://aws.amazon.com/documentation/lambda/. For pricing details, see https://aws.amazon.com/lambda/pricing/.

LAMBDA BLUEPRINTS

Blueprints are sample configurations of event sources and Lambda functions that do minimal processing for you. Most blueprints process events from specific event sources, such as Amazon S3 or DynamoDB. For example, if you select an s3-get-object blueprint, it provides sample code that processes an object-created event published by Amazon S3, which Lambda receives as a parameter.

When you create a new AWS Lambda function, you can use a blueprint that best aligns with your scenario. You can then customize the blueprint as needed. You do not have to use a blueprint (you can author a Lambda function and configure an event source separately).
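
To make the blueprint idea concrete, here is a short Python sketch in the spirit of the s3-get-object blueprint described above. It is illustrative only, not the exact code the console generates: it pulls the bucket and key out of the S3 object-created event that Lambda receives as a parameter and looks up the object's content type.

    import urllib.parse

    import boto3

    s3 = boto3.client('s3')

    def lambda_handler(event, context):
        # Lambda hands the S3 object-created event to the function as a parameter.
        record = event['Records'][0]
        bucket = record['s3']['bucket']['name']
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])

        # Look up the object and report its content type to the log.
        response = s3.get_object(Bucket=bucket, Key=key)
        print(f"CONTENT TYPE of {key}: {response['ContentType']}")
        return response['ContentType']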

AMAZON DYNAMODB

Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed database and supports both document and key-value data models. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad-tech, IoT, and many other applications. For further information, see the official Amazon Web Services Documentation for DynamoDB at https://aws.amazon.com/documentation/dynamodb/.
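
As a minimal illustration of the key-value model, the following Python (boto3) sketch writes and reads one item. It assumes a table shaped like the GameScoreRecords table you will create later in this lab; the attribute values are placeholders.

    import boto3

    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('GameScoreRecords')  # table created later in this lab

    # Write a single item keyed by the table's numeric primary key.
    table.put_item(Item={'RecordID': 1, 'Username': 'Jane Doe', 'Score': 100})

    # Read the same item back by its key.
    response = table.get_item(Key={'RecordID': 1})
    print(response.get('Item'))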

AMAZON KINESIS

Amazon Kinesis is a fully managed service for real-time processing of streaming data at massive scale. Amazon Kinesis can collect and process hundreds of terabytes of data per hour from hundreds of thousands of sources, allowing you to easily write applications that process information in real time from sources such as website clickstreams, marketing and financial information, manufacturing instrumentation, social media, operational logs, and metering data.

With Amazon Kinesis applications, you can easily send data to a variety of other services such as Amazon Simple Storage Service (Amazon S3), Amazon DynamoDB, AWS Lambda, or Amazon Redshift. In a few clicks and a couple of lines of code, you can start building applications which respond to changes in your data stream in seconds, at any scale, while only paying for the resources you use. For further information, see the official Amazon Web Services Documentation for Kinesis at http://aws.amazon.com/documentation/kinesis/.
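
To show what sending data to a stream looks like in code, here is a hedged boto3 sketch of a producer. The stream name matches the one created in Part 1 of this lab; the payload and partition key are arbitrary.

    import json

    import boto3

    kinesis = boto3.client('kinesis')

    # Put one record onto the stream; the partition key determines which shard receives it.
    kinesis.put_record(
        StreamName='Demo-Stream',
        Data=json.dumps({'message': 'hello from the producer'}).encode('utf-8'),
        PartitionKey='demo-partition-key',
    )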

Part 1: Event-Driven Programming with Amazon Kinesis and AWS Lambda

In the first part of this lab, you will learn event-driven programming with Kinesis and Lambda.

Task 1: Create an Amazon Kinesis Stream

In this task, you will create an Amazon Kinesis stream. 

  1. In the AWS Management Console, click Services then click Kinesis.
  2. Under the How it works section, click Create data stream then configure:
    1. Kinesis stream name: Demo-Stream
    2. Number of shards: 1 (Each shard supports a pre-defined capacity, as shown in the Total stream capacity section. This lab only requires one shard, but applications requiring more capacity can simply request more shards.)
  3. Click Create Kinesis stream
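
If you prefer to script this step, the following boto3 sketch is a rough equivalent of the console actions above. It assumes your AWS credentials and region are already configured.

    import boto3

    kinesis = boto3.client('kinesis')

    # Mirror the console settings: a single-shard stream named Demo-Stream.
    kinesis.create_stream(StreamName='Demo-Stream', ShardCount=1)

    # Wait until the stream is ACTIVE before attaching a Lambda trigger to it.
    kinesis.get_waiter('stream_exists').wait(StreamName='Demo-Stream')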



Task 2: Create a Lambda Function

In this task, you will define an AWS Lambda function that will be triggered by data coming into the stream.
  1. On the Services menu, click Lambda.
  2. Click Create function (You will start by selecting a Lambda blueprint. Blueprints are pre-built for you and can be customized to suit your specific needs)
  3. Select Use a blueprint, then:
    1. Click the Blueprints search box
    2. Search for: kinesis-process-record-python
    3. Select kinesis-process-record-python
    4. Click Configure
  4. In the Basic information section, configure:
    1. Function name: ProcessKinesisRecords
    2. Execution role: Use an existing role
    3. Existing role: lambda_basic_execution
  5. In the Kinesis trigger section, configure:
    1. Kinesis stream: Demo-Stream
    2. Check  Enable trigger (This will configure the Lambda function so that it is triggered whenever data comes into the Kinesis stream you created earlier.)
  6. Scroll down and examine the Lambda blueprint displayed in the Lambda function code section. It does the following:
    1. Loop through each of the records received
    2. Decode the data, which is encoded in Base 64
    3. Print the data to the debug log
  7. At the bottom of the screen, click Create function (This function will now trigger whenever data is sent to the stream.)
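
The blueprint you examined in step 6 boils down to only a few lines. The following Python sketch is a paraphrase of that logic, not the exact generated code: it loops through the batch of records, Base64-decodes each payload, and prints it to the debug log.

    import base64

    def lambda_handler(event, context):
        # A single invocation can carry a batch of Kinesis records.
        for record in event['Records']:
            # Kinesis record payloads arrive Base64-encoded.
            payload = base64.b64decode(record['kinesis']['data'])
            # Print to the debug log (visible later in CloudWatch Logs).
            print(f"Decoded payload: {payload}")
        return f"Successfully processed {len(event['Records'])} records."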


Task 3: Test your Function

In this task you will simulate data coming from a stream to trigger your Lambda function.
  1. Click Test (An event template for Kinesis will be automatically selected. The event contains a simulated message arriving via Kinesis.)
  2. For Event name, enter: stream
  3. Click Create
  4. Click Test (You should see the message: Execution result: succeeded)


  5. Expand  Details to view the output of the Lambda function. You will be shown information about the Lambda execution:
    1. Execution duration
    2. Resources consumed
    3. Maximum memory used
    4. Log output
  6. Click Test three more times, waiting a few seconds between each test. (This will generate test data for Amazon CloudWatch.)
  7. Click the Monitoring tab. (You will be presented with CloudWatch metrics for your Lambda function. Metrics should be available for Invocations and Duration. If the metrics do not appear, wait a minute and then click  refresh.)
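
The console's Kinesis test template is simply a JSON event shaped like a real batch of stream records. If you want to reproduce the test outside the console, the sketch below builds a minimal event of that shape and invokes the handler directly; the module name lambda_function and the payload are assumptions, and the real template contains additional fields.

    import base64

    # Hypothetical local copy of the blueprint code, saved as lambda_function.py.
    from lambda_function import lambda_handler

    # A minimal event shaped like the console's Kinesis test template.
    test_event = {
        'Records': [
            {
                'eventSource': 'aws:kinesis',
                'eventName': 'aws:kinesis:record',
                'kinesis': {
                    'partitionKey': 'partitionKey-01',
                    'data': base64.b64encode(b'Hello, this is a test.').decode('utf-8'),
                },
            }
        ]
    }

    print(lambda_handler(test_event, None))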

Part 2: Event-Driven Programming with Amazon DynamoDB and AWS Lambda

In the second part of this lab, you will learn about a different event-driven programming pattern, this time with DynamoDB and Lambda.

Task 4: Create Tables in DynamoDB

First, you will create a DynamoDB table that will contain scores for online games.
  1. On the Services menu, click DynamoDB.


  2. Click Create table and configure:
    1. Table name: GameScoreRecords
    2. Primary key: RecordID
    3. Primary key type: Number
    4. Click Create
  3. You will now create another table for linking scores to users. Click Create table and configure:
    1. Table name: GameScoresByUser
    2. Primary key: Username
    3. Primary key type: String
    4. Click Create
  4. You can now activate DynamoDB Streams on the first table. This will generate streaming data whenever there is any change to the table (insert, update, delete). Click the first table you created, GameScoreRecords.
  5. On the Overview tab, click Manage DynamoDB Stream then configure:
    1. View type: New image - the entire item, as it appears after it was modified
    2. Click Enable (Any record sent to this table will now send a message via DynamoDB Streams, which can trigger a Lambda function.)
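
For reference, the console steps in this task map onto two boto3 calls. The sketch below is an approximate scripted equivalent: on-demand (PAY_PER_REQUEST) billing is an assumption, and the stream is enabled at creation time rather than afterwards.

    import boto3

    dynamodb = boto3.client('dynamodb')

    # Table that receives raw score records (numeric primary key),
    # with a stream of NEW_IMAGE records enabled.
    dynamodb.create_table(
        TableName='GameScoreRecords',
        AttributeDefinitions=[{'AttributeName': 'RecordID', 'AttributeType': 'N'}],
        KeySchema=[{'AttributeName': 'RecordID', 'KeyType': 'HASH'}],
        BillingMode='PAY_PER_REQUEST',
        StreamSpecification={'StreamEnabled': True, 'StreamViewType': 'NEW_IMAGE'},
    )

    # Table that will hold aggregated scores per user (string primary key).
    dynamodb.create_table(
        TableName='GameScoresByUser',
        AttributeDefinitions=[{'AttributeName': 'Username', 'AttributeType': 'S'}],
        KeySchema=[{'AttributeName': 'Username', 'KeyType': 'HASH'}],
        BillingMode='PAY_PER_REQUEST',
    )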


Task 5: Create a Lambda Function

You will now create a Lambda function that will be triggered by updates to your DynamoDB table.
  1. On the Services menu, click Lambda.
  2. Click Create function
  3. You will be providing the code to run, so click on Author from scratch.
  4. Configure the following:
    1. Function name: AggregateScoresByUser
    2. Runtime: Node.js 12.x
    3. Expand Change default execution role
    4. Execution role: Use an existing role
    5. Existing role: lambda_basic_execution_dynamodb
  5. Click Create function


  6. Scroll down to the Function code section, then:
    1. Delete all of the code in the index.js editor
    2. Copy and paste this code into the index.js editor: CLICK HERE TO VIEW CODE
    3. Examine the code. It does the following (a language-neutral sketch of this logic appears after step 17 below):
      1. Loop through each incoming record
      2. Create (ADD) an item in the GameScoresByUser table with the incoming score
      3. Wait until all updates have been processed
  7. Click Deploy. (You will now configure the function to execute when a value is added to the DynamoDB table.)
  8. Scroll up to the Designer section.
  9. Click Add trigger.
  10. In the Trigger configuration section, configure the following:
    1. Select a trigger: click DynamoDB.
    2. DynamoDB table: GameScoreRecords
    3. Click Add (The function will now be triggered when a new game score is added to the DynamoDB table. You can now test the function with a record that simulates an update of the database.)
  11. Click Test
  12. For Event name, enter: score
  13. Delete the existing test event JSON (with key3, etc.).
  14. Copy and paste this record into the test event window: CLICK HERE TO VIEW CODE (Examine the test record. It simulates an incoming record arriving from the GameScoreRecords table via DynamoDB Streams.)
  15. At the bottom of the page, click Create
  16. Click Test (Your Lambda function will be invoked. The heading (towards the top of the page) should say Execution result: succeeded.)
  17. Expand  Details to view the output of the Lambda function. You should see: Successfully processed 1 records.
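
The code linked in step 6 is Node.js 12.x. As a language-neutral illustration of the same logic (loop through the incoming stream records, ADD each score to the matching user item in GameScoresByUser, and finish once every update has been applied), here is a rough Python/boto3 sketch. It is not the code you pasted into the editor.

    import boto3

    dynamodb = boto3.resource('dynamodb')
    scores_by_user = dynamodb.Table('GameScoresByUser')

    def lambda_handler(event, context):
        # Loop through each incoming DynamoDB Streams record.
        for record in event['Records']:
            image = record['dynamodb'].get('NewImage')
            if image is None:
                continue  # e.g. a delete event; nothing new to aggregate

            # Stream images use the DynamoDB attribute-value format ({'S': ...}, {'N': ...}).
            username = image['Username']['S']
            score = int(image['Score']['N'])

            # ADD creates the Score attribute if it is missing, otherwise accumulates into it.
            scores_by_user.update_item(
                Key={'Username': username},
                UpdateExpression='ADD Score :points',
                ExpressionAttributeValues={':points': score},
            )

        return f"Successfully processed {len(event['Records'])} records."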


  18. VERIFY IN DYNAMODB - You will now verify that the data was updated in DynamoDB.
    1. On the Services menu, click DynamoDB.
    2. In the left navigation pane, click Tables.
    3. Select GameScoresByUser.
    4. Click the Items tab. (This table was previously empty (you created it yourself), but you should now see an entry for Jane Doe.)


  19. TRIGGER THE UPDATE - You can perform more tests by inserting values in the GameScoreRecords table and confirming that the Lambda function updates the GameScoresByUser table.
    1. Select the  GameScoreRecords table.
    2. Click Create item then for RecordID, enter any number.
    3. Enter a user name:
      1. Click  > Append > String
      2. For FIELD, enter: Username
      3. For VALUE, enter a person’s name
    4. You will now add a score:
      1. Click > Append > Number
      2. For FIELD, enter: Score
      3. For VALUE, enter a random score
    5. Click Save - Your new item will be displayed. It should have also triggered the Lambda function, resulting in a new entry in the other table.


    6. Click the GameScoresByUser table. You should see that the new data you entered has been reflected in this table. Feel free to repeat the test by adding more items in the GameScoreRecords table.
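
As an alternative to the console item editor, you can insert test items with a short script and watch the aggregated table update. The sketch below is illustrative only; the record IDs and user names are arbitrary.

    import random

    import boto3

    records = boto3.resource('dynamodb').Table('GameScoreRecords')

    # Each put_item triggers the Lambda function via the table's stream.
    for record_id, username in enumerate(['Jane Doe', 'John Stiles', 'Mary Major'], start=101):
        records.put_item(Item={
            'RecordID': record_id,          # any unused number works
            'Username': username,
            'Score': random.randint(1, 100),
        })
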
Additional Resources

For more information about AWS Lambda, see https://aws.amazon.com/lambda/
