
A simple serverless app with HTTP API Gateway, Lambda and S3


When designing the architecture for an application, the choice of data storage is a discussion that often comes up. There are plenty of options, each catering to particular use-cases, but one stands out from them all: S3.


AWS S3 is one of the most widely used storage services thanks to its ability to handle massive scale and virtually unlimited storage, all at a low cost. It's a simple key-value data store that lets you store pretty much anything, in any format, with or without a schema.


But to make any data stored on S3 useful, we also need the ability to retrieve, update, and delete that data. That's what we'll focus on in this article, by building an HTTP API endpoint for a simple TODO app using API Gateway, Lambda, and S3.


1. ARCHITECTURE

Let's start by defining an architecture that covers all the HTTP methods we're interested in:


GET - Fetches all TODO tasks

POST - Creates a new TODO task

PUT - Updates an existing TODO task

DELETE - Deletes a TODO task


AWS Architecture for TODO app with Lambda, API Gateway and S3



2. DATA SCHEMA

The data will be stored on S3 as a simple JSON file in the following format:


Schema:

{
  "<projectId>": {
    "task": "Title",
    "assignedTo": "Developer name",
    "priority": "High, Medium or Low",
    "dueDate": ""
  }
}

<projectId> is a unique key that identifies a task. In our code, we'll use the current timestamp in epoch format as the unique 'projectId'.
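As a quick illustration (the short keys like "1000" in the example below are kept short only for readability), generating the key is a one-liner in Node.js:

// projectId is simply the current time in epoch milliseconds
let projectId = new Date().getTime(); // e.g. 1617235200000 for 1 April 2021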


Example:

{
  "1000": {
    "task": "Build a simple app using serverless",
    "assignedTo": "John Doe",
    "priority": "High",
    "dueDate": "1 April 2021"
  },
  "1001": {
    "task": "Deploy the simple serverless app",
    "assignedTo": "John Doe",
    "priority": "Medium",
    "dueDate": "1 April 2021"
  }
}



3. CREATING THE LAMBDA

Lambda is a serverless, event-driven service that runs only when invoked. The beauty is that there is no infrastructure, hardware, or software to manage. You simply write a function, deploy it, and it's ready for use.


It's often cheaper than running on a hosted server such as EC2. That's not to say it can't get expensive, because it certainly can, depending on the use-case.


For our purposes, we're building a simple app, so we just need to define what each HTTP method does, package the code in a zip file, and deploy it to a Lambda function.


Code for each method using NodeJS

The code in the 'handler' for each of the HTTP methods processes the data coming in as the 'body' of the API call:
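Before diving into the individual methods, here's a minimal sketch of how a top-level handler could dispatch to them. The file names and the exact wiring are assumptions for illustration; the full handler is in the GitHub repo linked at the end.

// index.js - a minimal dispatching handler (file names and wiring are assumptions)
const { fetch } = require('./get');
const { create } = require('./post');
const { update } = require('./put');
const { destroy } = require('./delete');

exports.handler = async (event) => {
    // With an HTTP API (payload format 2.0), the method sits under requestContext.http
    const method = event.requestContext.http.method;

    // Pass the parsed JSON body through in the shape the method modules expect
    const request = { body: event.body ? JSON.parse(event.body) : {} };

    let response;
    switch (method) {
        case 'GET':    response = await fetch(request);   break;
        case 'POST':   response = await create(request);  break;
        case 'PUT':    response = await update(request);  break;
        case 'DELETE': response = await destroy(request); break;
        default:       response = { statusCode: 405, body: { message: 'Method not allowed' } };
    }

    // API Gateway expects the response body as a string
    return { ...response, body: JSON.stringify(response.body) };
};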


GET:

Fetch the list of all tasks.

module.exports = {

    fetch: async (event) => {

        return new Promise(async (resolve, reject) => {

            let response = {
                statusCode: 200,
                body: null
            };

            try {
                console.log('Request GET');
                let bucket = process.env.S3_BUCKET;
                let key = process.env.S3_KEY;
                console.log('Params GET', JSON.stringify({ bucket, key }));

                let tasks = await getObject({ bucket, key });
                console.log('Result GET', JSON.stringify(tasks));

                if (!tasks) tasks = JSONStructure();

                if (tasks) response['body'] = tasks;

                console.log('Response GET', JSON.stringify(response));

                resolve(response);
            } catch (e) {
                console.log('Error in GET method');
                reject(e);
            }
        });
    }
}
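The handlers rely on a few small S3 helpers (getObject, putObject and JSONStructure). Their exact implementation lives in the repo; a minimal sketch using the AWS SDK for JavaScript (v2), which is bundled with the Node.js Lambda runtime, could look like this:

// s3.js - sketch of the S3 helpers used by the handlers (implementation may differ in the repo)
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Returns the parsed JSON document, or null if the object doesn't exist yet
const getObject = async ({ bucket, key }) => {
    try {
        const data = await s3.getObject({ Bucket: bucket, Key: key }).promise();
        return JSON.parse(data.Body.toString('utf-8'));
    } catch (e) {
        if (e.code === 'NoSuchKey') return null;
        throw e;
    }
};

// Writes the tasks object back to S3 as JSON
const putObject = ({ bucket, key, body, contentType }) =>
    s3.putObject({
        Bucket: bucket,
        Key: key,
        Body: JSON.stringify(body),
        ContentType: contentType
    }).promise();

// Empty structure used when the JSON file doesn't exist yet
const JSONStructure = () => ({});

module.exports = { getObject, putObject, JSONStructure };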



POST:

For creating a new task with payload validation.

'projectId' is auto-generated using the epoch timestamp for uniqueness.


Payload format:

{
"task": "Build a simple app using serverless",
"assignedTo": "John Doe",
"priority": "High",
"dueDate": "1 April 2021"
}


module.exports = {

    create: (event) => {

        return new Promise(async (resolve, reject) => {

            let response = {
                statusCode: 200,
                body: null
            }

            try {
                console.log('Request POST');
                console.log('Event', JSON.stringify(event));
                let { body: request } = event;

                let [ valid, errors ] = postMethodValidator(request);
                console.log('Validation request', valid);

                if (valid) {
                    let bucket = process.env.S3_BUCKET;
                    let key = process.env.S3_KEY;
                    let tasks = await getObject({ bucket, key });
                    console.log('Tasks from S3', JSON.stringify(tasks));

                    let currentDate = new Date();
                    let projectId = currentDate.getTime();

                    if (!tasks) tasks = JSONStructure();

                    tasks[projectId] = request;

                    console.log('Tasks to S3', JSON.stringify(tasks));
                    await putObject({ bucket, key, body: tasks, contentType: 'application/json' });
                    response['body'] = tasks;
                } else {
                    ([ errors ] = errors);
                    response['statusCode'] = 400;
                    response['body'] = errors;
                }

                console.log('Response POST', JSON.stringify(response));
                resolve(response);
            } catch (e) {
                console.log('Error in method POST');
                reject(e);
            }
        });
    }
}
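The postMethodValidator referenced above (and the putMethodValidator and deleteMethodValidator used by the next two methods) returns a [valid, errors] pair. The real implementation is in the repo; a simple sketch that just checks the required fields from the payload format might look like this:

// Sketch of a payload validator returning [valid, errors] (assumed implementation)
const postMethodValidator = (request) => {
    const required = ['task', 'assignedTo', 'priority', 'dueDate'];
    const errors = required
        .filter((field) => !request || !request[field])
        .map((field) => ({ field, message: `'${field}' is required` }));
    return [errors.length === 0, errors];
};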



PUT:

For updating an existing task.

Payload format:

{
"projectId": "1000",
"task": "Deploy the simple serverless app",
"assignedTo": "John Doe",
"priority": "Medium",
"dueDate": "1 April 2021"
}

module.exports = {

    update: (event) => {
        return new Promise(async (resolve, reject) => {

            let response = {
                statusCode: 200,
                body: null
            }

            try {
                console.log('Request PUT');
                let { body: request } = event;

                let [ valid, errors ] = putMethodValidator(request);
                console.log('Validation request', valid);

                if (valid) {
                    let bucket = process.env.S3_BUCKET;
                    let key = process.env.S3_KEY;
                    let tasks = await getObject({ bucket, key });
                    console.log('Tasks from S3', JSON.stringify(tasks));

                    let projectId = request['projectId'];
                    delete request['projectId'];

                    if (!tasks) tasks = JSONStructure();

                    if (!tasks?.[projectId]) {
                        response['statusCode'] = 400;
                        response['body'] = { message: 'Task does not exist' };
                    } else {
                        tasks[projectId] = request;
                        console.log('Tasks to S3', JSON.stringify(tasks));
                        await putObject({ bucket, key, body: tasks, contentType: 'application/json' });
                        response['body'] = tasks;
                    }
                } else {
                    ([ errors ] = errors);
                    response['statusCode'] = 400;
                    response['body'] = errors;
                }

                console.log('Response PUT', JSON.stringify(response));
                resolve(response);
            } catch (e) {
                console.log('Error in method PUT');
                reject(e);
            }
        });
    }
}



DELETE:

For deleting a task. Payload of DELETE request:

{ "projectId": 1000 }


module.exports = {

    destroy: (event) => {
        return new Promise(async (resolve, reject) => {

            let response = {
                statusCode: 200,
                body: null
            }

            try {
                console.log('Request DELETE');
                let { body: request } = event;

                let [ valid, errors ] = deleteMethodValidator(request);
                console.log('Validation request', valid);

                if (valid) {
                    let bucket = process.env.S3_BUCKET;
                    let key = process.env.S3_KEY;
                    let tasks = await getObject({ bucket, key });
                    console.log('Tasks from S3', JSON.stringify(tasks));

                    if (!tasks) tasks = JSONStructure();

                    let projectId = request['projectId'];
                    delete request['projectId'];

                    if (!tasks?.[projectId]) {
                        response['statusCode'] = 400;
                        response['body'] = { message: 'Task does not exist' };
                    } else {
                        delete tasks[projectId];

                        console.log('Tasks to S3', JSON.stringify(tasks));
                        await putObject({ bucket, key, body: tasks, contentType: 'application/json' });
                        response['body'] = tasks;
                    }
                } else {
                    ([ errors ] = errors);
                    response['statusCode'] = 400;
                    response['body'] = errors;
                }

                console.log('Response DELETE', JSON.stringify(response));
                resolve(response);
            } catch (e) {
                console.log('Error in method DELETE');
                reject(e);
            }
        });
    }
}



Deploy the Lambda

Once the code has been packaged in a zip, use the AWS console to create a new Lambda function:

Create Lambda function


Update the newly created Lambda with the ZIP package:

Update Lambda function



Define environment variables for the bucket name and the key of the JSON file where the data is stored:

Define Environment variables for Lambda





Update the IAM role used with the Lambda to allow read/write access to S3:

Update IAM role for S3 read write permissions


Attach S3 policy to IAM role:

Add S3 permissions to IAM role
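The screenshots show attaching an S3 policy via the console. If you prefer a tighter, inline policy, something along these lines grants just the read/write access the Lambda needs (YOUR_BUCKET_NAME is a placeholder for the bucket set in the environment variables):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}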



4. LINK HTTP API GATEWAY TO LAMBDA

Now that the Lambda is ready, we'll link each of the HTTP methods in API Gateway to the Lambda:


Create API Gateway


Link API Gateway to Lambda



Configure routes for API Gateway to Lambda




5. TEST API USING POSTMAN

Postman is a great tool to test API endpoints. It allows you to craft HTTP headers and body for every API call. 


POST

Testing API Gateway, Lambda and S3 with Postman



GET

Testing API Gateway, Lambda and S3 with Postman


PUT

Testing API Gateway, Lambda and S3 with Postman


DELETE

Testing API Gateway, Lambda and S3 with Postman



And we're done

We've just seen how easy it is to create an API endpoint for a simple TODO app using AWS Lambda, API Gateway, and S3. There's more you can configure and add, such as securing the API endpoints with AWS Cognito so that only trusted clients are allowed. To further simplify this setup, you could also automate it with CloudFormation templates or CLI commands.


Since we're using S3 as a database here, with a single JSON file, you may be wondering what happens if multiple clients update the same JSON file simultaneously with different data. Which one would win? That's certainly a limitation of this approach, because S3 doesn't natively support concurrency controls the way ACID-compliant databases do. But there is a solution for this too, which we'll cover soon. Stay tuned.


PS: Full code can be found on GitHub
