Mixing VPC and Non-VPC Lambda Functions for Higher Performing Microservices

I came across a post in the Serverless forums that asked how to disable the VPC for a single function within a Serverless project. This got me thinking about how other people structure their serverless microservices, so I wanted to throw out some ideas. I often mix my Lambda functions between VPC and non-VPC depending on their use and data requirements. In this post, I’ll outline some ways you can structure your Lambda microservices to isolate services, make execution faster, and maybe even save you some money. ⚡️💰
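To give a taste of what that looks like, here’s a rough serverless.yml sketch (not from the post itself; the function names, subnet IDs, and security group IDs are placeholders). Only the function that actually needs to reach resources inside the VPC gets a vpc block, while everything else stays outside:

```yaml
service: my-service

provider:
  name: aws

functions:
  # talks to RDS/ElastiCache inside the VPC, so it gets VPC config
  getUser:
    handler: users.get
    vpc:
      securityGroupIds:
        - sg-00000000        # placeholder
      subnetIds:
        - subnet-00000000    # placeholder
        - subnet-11111111    # placeholder

  # only talks to DynamoDB and S3, so it stays outside the VPC
  # and skips the ENI attachment penalty on cold starts
  sendEmail:
    handler: email.send
```

Function-level vpc settings also override anything set at the provider level, which gives you a lot of flexibility when mixing the two.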

Continue Reading…

5 Reasons Why Your Serverless Application Might Be A Security Risk

There has been a lot of buzz lately about serverless security. People are certainly talking about it more and sharing great articles on the topic, but many serverless developers (especially new ones) are still making the same critical mistakes. Every time a serverless function is deployed, its unique security challenges need to be addressed. Every time. I’ve researched and written extensively about serverless security (see Securing Serverless: A Newbie’s Guide). I’ve read countless articles on the subject. And while there is no shortage of information available, let’s be honest: developers are busy building applications, not poring over hundreds of articles.

I know, it sounds boring, but I would encourage you to do your research on serverless security. Serverless applications are different from traditional, server-hosted applications. Much of the security responsibility falls on the developer, and not following best practices opens you (or your company) up to an attack. But I know you’re busy. I totally get it. So rather than forcing you to read a bunch of long articles 😴 or watch a plethora of videos 🙈, I’ve whittled it all down to the five biggest serverless security risks for you. Sure, there are a lot of other things to consider, but IMO, these are the most important ones. Nothing here hasn’t been said before. But if you do nothing more than follow these principles, your serverless applications will be much more secure. 🔒

Continue Reading…

Serverless Consumers with Lambda and SQS Triggers

On Wednesday, June 27, 2018, Amazon Web Services released SQS triggers for Lambda functions. Those of you who have been building serverless applications with AWS Lambda probably know how big of a deal this is. Until now, the AWS Simple Queue Service (SQS) was generally a pain to deal with for serverless applications. Communicating with SQS is simple and straightforward, but there was no way to automatically consume messages without implementing a series of hacks. In general, these hacks “worked” and were fairly manageable. However, as your services became more complex, dealing with concurrency and managing fan-out made your applications brittle and error-prone. SQS triggers solve all of these problems. 👊
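To show what that means in practice, here’s a minimal sketch of an SQS-triggered handler (this isn’t from the post; the JSON message body and the processing logic are assumptions). Lambda now polls the queue for you and hands your function a batch of records:

```javascript
// Lambda invokes this with a batch of SQS records (no polling code required)
exports.handler = async (event) => {
  for (const record of event.Records) {
    // assuming the message body is JSON; record.body is the raw SQS message
    const message = JSON.parse(record.body);
    console.log(`processing message ${record.messageId}`, message);
    // ... do something useful with the message ...
  }
  // return normally and Lambda deletes the batch from the queue;
  // throw an error and the whole batch becomes visible again after the
  // queue's visibility timeout expires
  return {};
};
```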

Update September 3, 2020: There are a number of important recommendations available in the AWS Developer Guide for using SQS as a Lambda Trigger: https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html.

The most important are:

“To allow your function time to process each batch of records, set the source queue’s visibility timeout to at least 6 times the timeout that you configure on your function. The extra time allows for Lambda to retry if your function execution is throttled while your function is processing a previous batch.”

“To give messages a better chance to be processed before sending them to the dead-letter queue, set the maxReceiveCount on the source queue’s redrive policy to at least 5.”

It is also imperative that you set a minimum concurrency of 5 on your processing Lambda function (in other words, don’t set its reserved concurrency any lower than 5) due to the initial scaling behavior of the SQS Poller. The serverless.yml sketch below pulls all three of these settings together.
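This is a rough sketch only (function and queue names, timeouts, and batch size are placeholders, not values from the post): a 30-second function timeout pairs with a visibility timeout of at least 180 seconds, the redrive policy gets a maxReceiveCount of 5, and the function keeps a reserved concurrency of at least 5.

```yaml
functions:
  worker:
    handler: worker.handler
    timeout: 30                # function timeout
    reservedConcurrency: 5     # don't go below 5
    events:
      - sqs:
          arn:
            Fn::GetAtt: [WorkerQueue, Arn]
          batchSize: 10

resources:
  Resources:
    WorkerQueue:
      Type: AWS::SQS::Queue
      Properties:
        VisibilityTimeout: 180   # at least 6x the function timeout
        RedrivePolicy:
          deadLetterTargetArn:
            Fn::GetAtt: [WorkerDLQ, Arn]
          maxReceiveCount: 5     # give messages at least 5 attempts
    WorkerDLQ:
      Type: AWS::SQS::Queue
```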

Update November 19, 2019: AWS announced support for SQS FIFO queues as a Lambda event source (announcement here). FIFO queues guarantee message order, which means only one Lambda function is invoked per MessageGroupId.

Update December 6, 2018: At some point over the last few months AWS fixed the issue with the concurrency limits and the redrive policy. See Additional experiments with concurrency and redrive policies below.

Audio Version (please note that this audio version is out of date given the new updates)

Continue Reading…

Event Injection: Protecting your Serverless Applications

Updated January 25, 2019: This post was updated based on feedback from the community.

The shared responsibility model extends much further with serverless offerings, with the cloud provider taking on even more of the security burden, but application security is still the developer’s responsibility. Many traditional web applications are front-ended with WAFs (web application firewalls), RASPs (runtime application self-protection), EPPs (endpoint protection platforms) and WSGs (web security gateways) that inspect incoming and outgoing traffic. These extra layers of protection can save developers from themselves when they make common programming mistakes that would otherwise leave their applications vulnerable. But if you’re invoking serverless functions from sources other than API Gateway, you no longer have the protection of a WAF.

Continue Reading…

10 Things You Need To Know When Building Serverless Applications

I am a HUGE fan of serverless architectures. This new type of compute not only opens up more possibilities for developers, but can also support highly scalable, complex applications for a fraction of the cost of provisioning virtual servers. My first question when planning a new application is always, “Can I build this with serverless?” Spoiler alert: the answer is almost always YES!

I’ve been building serverless applications since the release of AWS Lambda in 2015, so answering the question above is pretty easy for me. However, a lot of people I talk to who are new to serverless often have many questions (and misconceptions). I want you to be successful, so below I’ve created a list of 10 things you need to know when building a serverless application. These are things I wish I knew when I started, so hopefully they’ll help you get up to speed faster and start building some amazing applications.

Continue Reading…

How To: Optimize the Serverless Optimizer Plugin

I’m sure you’re already well aware of how awesome the ⚡ Serverless Framework is for managing and deploying your serverless applications. And you’re probably aware that there are several great plugins available that make Serverless even better. But did you know there’s a plugin to optimize your functions and reduce the size of your deployment packages? Or are you already using this plugin, but hate how long it takes to optimize locally run functions? In this post I’ll share some quick tips to help you optimize your Serverless Optimizer experience.

Continue Reading…

Solving the Cold Start Problem

Dear AWS Lambda Team,

I have a serious problem: I love AWS Lambda! In fact, I love it so much that I’ve pretty much gone all in on this whole #serverless thing. I use Lambda for almost everything now. I use it to build backend data processing pipelines, distribute long running tasks, and respond to API requests. Heck, I even built an Alexa app just for fun. I found myself building so many RESTful APIs using Lambda and API Gateway that I went ahead and created the open source Lambda API web framework to allow users to more efficiently route and respond to API Gateway requests.

Serverless technologies, like Lambda, have revolutionized how developers think about building applications. Abstracting away the underlying compute layer and replacing it with on-demand, near-infinitely scalable function containers is brilliant. As we would say out here in Boston, “you guys are wicked smaht.” But I think you missed something very important. In your efforts to conform to the “pay only for the compute time you consume” promise of serverless, you inadvertently handicapped the service. My biggest complaint, and the number one objection that I hear from most of the “serverless-is-not-ready-for-primetime” naysayers, is Cold Starts.

Continue Reading…

How To: Manage Serverless Environment Variables Per Stage

I often find myself creating four separate stages for each ⚡ Serverless Framework project I work on: dev, staging, prod, and local. Obviously the first three are meant to be deployed to the cloud, but the last one, local, is meant to run and test interactions with local resources. It’s also great to have an offline version (like when you’re on a plane ✈ or have terrible wifi somewhere). Plus, development is much faster because you’re not waiting for round trips to the server. 😉

A really great feature of Serverless is the ability to configure ENVIRONMENT variables in the serverless.yml file. This lets us store important global information like database names, service endpoints and more. We can even reference passwords securely using AWS Systems Manager Parameter Store and decrypt encrypted secrets on deployment, keeping them safe from developers and source repositories alike. 😬 Just reference the variable with ${ssm:/myapp/my-secure-value~true} in your configuration file.
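As a rough sketch of where this is headed (the stage names match the ones above, but the variable names and values are just examples, not taken from the post), you can key those environment variables off the stage you deploy to:

```yaml
custom:
  stage: ${opt:stage, self:provider.stage}
  dbName:
    local: myapp_local
    dev: myapp_dev
    staging: myapp_staging
    prod: myapp_prod

provider:
  name: aws
  environment:
    STAGE: ${self:custom.stage}
    DB_NAME: ${self:custom.dbName.${self:custom.stage}}
    # secure values can still come straight from SSM, as described above
    MY_SECRET: ${ssm:/myapp/my-secure-value~true}
```

Deploying (or running locally) with --stage dev, --stage prod, and so on then picks up the right set of values for that environment.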

Continue Reading…

How To: Manage RDS Connections from AWS Lambda Serverless Functions

Someone asked a great question on my How To: Reuse Database Connections in AWS Lambda post about how to end the unused connections left over by expired Lambda functions:

I’m playing around with AWS lambda and connections to an RDS database and am finding that for the containers that are not reused the connection remains. I found before that sometimes the connections would just die eventually. I was wondering, is there some way to manage and/or end the connections without needing to wait for them to end on their own? The main issue I’m worried about is that these unused connections would remain for an excessive amount of time and prevent new connections that will actually be used from being made due to the limit on the number of connections.

🧟‍♂️ Zombie RDS connections leftover on container expiration can become a problem when you start to reach a high number of concurrent Lambda executions. My guess is that this is why AWS is launching Aurora Serverless, to deal with relational databases at scale. At the time of this writing it is still in preview mode.

Update September 2, 2018: I wrote an NPM module that manages MySQL connections for you in serverless environments. Check it out here.

Update August 9, 2018: Aurora Serverless is now Generally Available!

Overall, I’ve found that Lambda is pretty good about closing database connections when the container expires, but even if it does it reliably, it still doesn’t solve the MAX CONNECTIONS problem. Here are several strategies that I’ve used to deal with this issue.
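For reference, here’s the basic connection-reuse pattern the question above refers to, sketched with the mysql2 client as an example (the client choice, environment variable names, and query are placeholders, not code from the post). The connection lives outside the handler, so warm containers reuse it instead of opening a new one on every invocation:

```javascript
const mysql = require('mysql2/promise');

let connection = null; // survives for the life of the container

exports.handler = async (event) => {
  // only open a new connection on a cold start (or if the old one was lost)
  if (!connection) {
    connection = await mysql.createConnection({
      host: process.env.DB_HOST,
      user: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      database: process.env.DB_NAME
    });
  }
  const [rows] = await connection.query('SELECT 1 AS ok');
  return rows[0];
};
```

Reuse alone doesn’t cap the total number of connections across containers, though, which is exactly the MAX CONNECTIONS problem these strategies are meant to address.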

Continue Reading…

Securing Serverless: A Newbie’s Guide

So you’ve decided to build a serverless application. That’s awesome! May I be the first to welcome you to the future. 🤖 I bet you’ve done a lot of research. You’ve probably even deployed a few test functions to AWS Lambda or Google Cloud Functions and you’re ready to actually build something useful. You probably still have a bunch of unanswered questions, and that’s cool. We can still build some really great applications even if we only know the basics. However, when we start working with new things we typically make a bunch of dumb mistakes. While some are relatively innocuous, security mistakes can cause some serious damage.

I’ve been working with serverless applications since AWS launched Lambda in early 2015. Over the last few years I’ve developed many serverless applications covering a wide range of use cases. The most important thing I’ve learned: SECURE YOUR FUNCTIONS! I can tell you from personal experience, getting burned by an attack is no bueno. I’d hate to see it happen to you. 😢

To make sure it doesn’t happen to you, I’ve put together a list of 🔒Serverless Security Best Practices. This is not a comprehensive list, but it covers the things you ABSOLUTELY must do. I also give you some more things to think about as you continue on your serverless journey. 🚀

Continue Reading…