VOIDKAT

Using Serverless on AWS - Saving to S3 and DynamoDB - Part 2/2

October 21, 2019

Welcome to part 2 of Using Serverless on AWS with S3 and DynamoDB. In Part 1 we configured serverless, created our resources and managed to save a JSON file to our S3 bucket. In this part we will now save an image to S3 and write to our DynamoDB.

DynamoDB is a proprietary Amazon NoSQL database; as such, it integrates well with other AWS services. But being proprietary, it does entail some vendor lock-in.

Core concepts

Before we continue with the rest of the tutorial, I want to point out that adding Amazon APIs to a Serverless application is mostly a matter of reading the documentation and implementing it; the core steps are always the same:

  • Read the AWS docs on the API you are implementing
  • Add the correct IAM access rights to serverless.yaml
  • Add the code to your handler
  • Make sure the handler uses async/promises so it ‘waits’ for the operation to complete
  • Test with mocks and via console
  • Push and test using curl

Now that we've covered that, let's get to adding DynamoDB to our Serverless application.

Setup

Previously we only submitted an email and saved it to a JSON file. Now we will submit a name, an email and an image in base64 format. This will then be saved to a DynamoDB table.

We will check the email against our database. If we have no record of the email, we will accept the request, save the image to S3, and write the name, email and path to the S3 file to DynamoDB.

If the email has a record on our database, we will reject the request with the appropriate message.
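
For reference, a request body for this flow might look like the following. This is just an illustration: the name and email fields appear in the handler code later in this post, while the image field name and its exact encoding are assumptions.

  // Hypothetical request body sent to our endpoint (shape assumed from the handler code below)
  const exampleBody = {
    name: 'Jane Doe',
    email: 'jane@example.com',
    // base64-encoded PNG; some clients include a data-URL prefix such as "data:image/png;base64,"
    image: 'iVBORw0KGgoAAAANSUhEUgAA...'
  };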

Setting up DynamoDB on AWS Console

Now we need to create the database. Head to the AWS Console (console.aws.amazon.com), log in, and go to DynamoDB. The only key thing is to create a primary key; this is the value the table can be searched against, and it has to be unique. No other fields need to be defined, because DynamoDB is a document-based database, so additional fields and values can be added as needed.

Our data set is going to consist of:

  • Email
  • Name
  • Image

We will use email as the primary key since this will be unique and helps us eliminate re-submissions from the same email. So hit Create table, enter your table name and keep a note of it, as we will need to insert it into our YAML config file, set email as the primary key, and leave everything else as default.
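
If you would rather script this step than click through the console, an AWS SDK call along the lines of the sketch below should create an equivalent table. This is optional and not part of the tutorial's flow; the table name is a placeholder and the 5/5 provisioned throughput simply mirrors the console defaults.

  // Optional: create the DynamoDB table programmatically instead of via the console (a sketch)
  const AWS = require('aws-sdk');
  const ddb = new AWS.DynamoDB({ apiVersion: '2012-08-10', region: 'us-east-1' });

  ddb.createTable({
    TableName: '<your-table-name>',
    AttributeDefinitions: [{ AttributeName: 'email', AttributeType: 'S' }],
    KeySchema: [{ AttributeName: 'email', KeyType: 'HASH' }],
    // Mirrors the console defaults; adjust capacity or switch to on-demand billing as needed
    ProvisionedThroughput: { ReadCapacityUnits: 5, WriteCapacityUnits: 5 }
  }).promise()
    .then(data => console.log('Table created:', data.TableDescription.TableName))
    .catch(err => console.error(err));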

Adding DynamoDB permissions to serverless.yaml

Now we will add the necessary IAM permissions to the serverless YAML file. This entails referencing the database name and its allowed actions (read, write and query).

We will add the DynamoDB table name to the provider settings:

  environment:
    DYNAMODB_TABLE: '<your-table-name>'

Under iamRoleStatements, after your S3 IAM settings, add the following:

  - Effect: Allow
    Action:
      - dynamodb:DescribeTable
      - dynamodb:Query
      - dynamodb:Scan
      - dynamodb:GetItem
      - dynamodb:PutItem
      - dynamodb:UpdateItem
      - dynamodb:DeleteItem
    Resource: "arn:aws:dynamodb:${self:custom.region}:*:table/${self:custom.database}"

The self:custom portions reference my own custom variables:

custom:
  bucket: <your-s3-bucket>
  database: <your-table-name>
  region: us-east-1

This is all that is needed for the permissions to work.
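
Since the table name is also exposed to the function through the DYNAMODB_TABLE environment variable, the handler could read it via process.env instead of hardcoding it. The snippets later in this post hardcode the table name; this is just an optional alternative.

  // Optional: read the table name from the environment variable defined in serverless.yaml
  const TABLE_NAME = process.env.DYNAMODB_TABLE;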

Adding the DynamoDB client to the handler

Just like S3, we need to instantiate a DynamoDB client from the AWS SDK. DynamoDB has two API versions, so we will specify the latest. In your api.js handler add const ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'}); above our const S3 = new AWS.S3(); call.
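
Put together, the top of api.js might look something like the sketch below. The uuid require and the headersVariables CORS object are assumptions based on the snippets later in this post; adjust them to your own setup.

  // api.js handler setup (a sketch)
  const AWS = require('aws-sdk');
  // uuid is assumed to be installed; used for uuid.v1() further down
  const uuid = require('uuid');

  // DynamoDB low-level client, pinned to the latest API version
  const ddb = new AWS.DynamoDB({ apiVersion: '2012-08-10' });
  const S3 = new AWS.S3();

  // Headers returned with every response (assumed; referenced as headersVariables below)
  const headersVariables = {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Credentials': true
  };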

Saving to DynamoDB works the same way as saving to S3: we construct a payload and then call the save with an await. Again, the reason we use await is that an action, be it saving to S3 or writing to DynamoDB, takes time to complete.

Now, we will change the flow a bit: we will first check if the submitted email exists in the database and return an error code if it does.

  // check first if the email being sent has been used before
  var params = {
    TableName: "sls-lambda-users",
    Key: {
      email: {
        S: data.email
      }
    },
  };
  
  // check with DB 
  const checkIfExists = await ddb.getItem(params).promise();

Now let’s just quickly cover this call:

  • S: data.email specifies that the value is a String and that we are looking up the submitted email
  • If a record exists, the response of this call will contain an Item

  // Check if email exists
  if (checkIfExists.Item) {
    let payload = {
      error: `${data.email} already exists.`
    };
    // Send failure
    return {
      headers: headersVariables,
      statusCode: 401,
      body: JSON.stringify(payload)
    };
  }

If the email does exist, this sends a 401 error back. If there is no record, the incoming base64 image is first saved to the S3 bucket. This is the code for that action:

  // Payload key is the final name
  let s3payload = {
    Bucket: 'save-date-lambda',
    Key: data.name + '.png',
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/png'
  };

  // Try S3 save.
  const s3Response = await S3.upload(s3payload).promise();
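
Note that buf (the image buffer) and the date used further down for createdAt are defined elsewhere in my handler and not shown above. The snippet below is a minimal sketch of producing them, assuming the request body carries the image as a base64 string in data.image (possibly with a data-URL prefix):

  // Decode the submitted base64 image into a Buffer for the S3 upload
  // (assumes a data.image field; strips any data-URL prefix first)
  const base64Image = data.image.replace(/^data:image\/\w+;base64,/, '');
  const buf = Buffer.from(base64Image, 'base64');

  // Numeric timestamp used for the createdAt attribute in the DynamoDB item below
  const date = Date.now();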

Once the base64 image has been saved to S3, we can now write our entry to the database:

// Payload for the DynamoDB item
let dbPayload = {
  TableName: 'sls-lambda-users',
  Item: {
    userId: {
      S: uuid.v1()
    },
    infoId: {
      S: data.name
    },
    email: {
      S: data.email
    },
    path: {
      S: s3Response.Location
    },
    createdAt: {
      N: date.toString()
    }
  }
};

// Save the item to DynamoDB
const saveDb = await ddb.putItem(dbPayload).promise();

Let’s go over this code:

  • Items being saved to DynamoDB have to declare a data type for each attribute, hence the S (string) or N (number) in front of the values.
  • uuid.v1() generates a unique ID via the uuid Node package
  • path is the S3 location; my code saves the submitted base64 image to S3 and then records its URL in this DynamoDB entry

Finally, if the submission was a success:

  // Our response payload on successfully saving
  let payload = {
    path: dbPayload.Item.path.S,
    createdAt: dbPayload.Item.createdAt.N,
    userId: dbPayload.Item.userId.S
  };

  // Send Success
  return {
    headers: headersVariables,
    statusCode: 201,
    body: JSON.stringify(payload)
  };

The payload of a successful save sends back some information about the entry. Be advised that this is all best wrapped in a try/catch so that any error while saving to S3 or DynamoDB sends back an error code.
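
For reference, a minimal sketch of that wrapping; the 500 status code and error message are my own choices, not part of the original handler:

  // Wrap the S3 and DynamoDB calls so a failure returns an error response instead of crashing
  try {
    const s3Response = await S3.upload(s3payload).promise();
    const saveDb = await ddb.putItem(dbPayload).promise();
    // ...build and return the success response shown above
  } catch (err) {
    console.error(err);
    return {
      headers: headersVariables,
      statusCode: 500,
      body: JSON.stringify({ error: 'Could not save the submission.' })
    };
  }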

Conclusion

This concludes our tutorial on using Serverless on AWS to save to S3 and DynamoDB. I hope you learnt something! Be advised that this is tutorial code, not production-level code. Personally I don't think I will ever write Lambda functions without Serverless. Testing and debugging are just easier to do using Serverless!


Farhad Agzamov

Written by Farhad Agzamov, who lives and works in London building things. You can follow him on Twitter and check out his GitHub here.