Have a ton of traffic - This isn't a problem if you have a high-frequency API that's getting hit all day. If you have a smaller site or API, though, it can really ruin the user experience. It will also feel erratic to the user, since the cold start only hits the occasional request: one visitor waits on a slow cold boot, and then the site feels fast again right after.
Keep Hitting Your Own API to Keep It Alive - Yeah, this is a hack, but even the noble developers over at the Serverless Framework kind of recommend this.
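As a rough illustration of what that hack looks like in practice, here is a minimal sketch: a scheduled event pings the function every few minutes, and the handler returns early when it sees the ping. The `warmer` payload flag and the event shape are assumptions for this sketch, not something the Serverless Framework prescribes.

```typescript
// Hypothetical keep-warm sketch: a scheduled rule invokes the function every few
// minutes with a made-up { warmer: true } payload so the container stays hot.
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

export const handler = async (
  event: APIGatewayProxyEvent | { warmer?: boolean }
): Promise<APIGatewayProxyResult> => {
  // If this is just the scheduled warm-up ping, skip the real work entirely.
  if ("warmer" in event && event.warmer) {
    return { statusCode: 200, body: "warm" };
  }

  // ...normal request handling goes here...
  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};
```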
Use AWS Lambda and fiddle with Provisioned Concurrency - Realizing this is a big problem, Amazon is trying to make it less impactful. There is a setting in AWS Lambda to try to reduce the cold boot time, but it doesn't eliminate it entirely. Cloudflare Workers also have an approach to minimize cold boots, but it results in other compromises I won't go into here.
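For reference, turning on Provisioned Concurrency is roughly a one-call configuration. This sketch uses the AWS SDK v3 Lambda client; the function name, alias, and concurrency number are placeholders, not a recommendation.

```typescript
// Minimal sketch: keep a handful of execution environments pre-warmed for an alias.
// "my-api" and "live" are placeholder names; 5 is an arbitrary concurrency value.
import {
  LambdaClient,
  PutProvisionedConcurrencyConfigCommand,
} from "@aws-sdk/client-lambda";

const lambda = new LambdaClient({ region: "us-east-1" });

await lambda.send(
  new PutProvisionedConcurrencyConfigCommand({
    FunctionName: "my-api",
    Qualifier: "live", // provisioned concurrency applies to a version or alias
    ProvisionedConcurrentExecutions: 5,
  })
);
```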
Use a Database With High Connection Limits - You can just brute force this by running your own database server on a big machine, but now you have to manage a database server! You can also try DigitalOcean's Managed Postgres service, which has PgBouncer built in. That means you can get up to 5,000 connections per database on DigitalOcean.
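If you go the DigitalOcean route, the application-side change is small: you point your client at the connection-pool endpoint from the control panel instead of the database itself. Here is a minimal sketch with node-postgres; the `DATABASE_POOL_URL` variable is a placeholder for whatever pool URI your provider gives you.

```typescript
// Sketch: connect through the managed PgBouncer pool rather than directly to Postgres.
// DATABASE_POOL_URL is a placeholder for the "connection pool" URI from the control panel.
import { Pool } from "pg";

const pool = new Pool({
  connectionString: process.env.DATABASE_POOL_URL,
  ssl: { rejectUnauthorized: false }, // managed Postgres typically requires SSL
  max: 1, // keep each serverless container to a single connection
});

export async function getUser(id: number) {
  const { rows } = await pool.query("SELECT * FROM users WHERE id = $1", [id]);
  return rows[0];
}
```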
Use Amazon DynamoDB or FaunaDB - Neither of these has connection limit issues. Dynamo can be really hard to query, though, and FaunaDB is not as battle-tested as MongoDB.
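To give a feel for the DynamoDB side of that trade-off: there is no connection pool to exhaust because every call is an HTTPS request, but reads have to go through the table's key schema rather than free-form SQL. The table and attribute names below are made up for the sketch.

```typescript
// Sketch: DynamoDB reads are keyed lookups, not arbitrary WHERE clauses.
// "orders" and "userId" are hypothetical names for illustration only.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

export async function getOrdersForUser(userId: string) {
  const result = await ddb.send(
    new QueryCommand({
      TableName: "orders",
      KeyConditionExpression: "userId = :u", // must hit the partition key
      ExpressionAttributeValues: { ":u": userId },
    })
  );
  return result.Items ?? [];
}
```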
Use Amazon Aurora Serverless With a Data API - This is Amazon's clever way of allowing you to use Postgres and MySQL without a traditional database connection. Instead, you make HTTP API calls to your database. This isn't great for performance, since now you have the overhead of an SSL handshake to make a database call. This solution is also very, very new at the time of writing this article.
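For a sense of what "HTTP calls instead of a connection" looks like, here is a minimal sketch using the AWS SDK v3 RDS Data client. The cluster ARN, secret ARN, database name, and SQL are placeholders.

```typescript
// Sketch: query Aurora Serverless over the Data API — no persistent connection,
// just signed HTTPS calls. The ARNs below are placeholders.
import {
  RDSDataClient,
  ExecuteStatementCommand,
} from "@aws-sdk/client-rds-data";

const rds = new RDSDataClient({ region: "us-east-1" });

export async function listUsers() {
  const result = await rds.send(
    new ExecuteStatementCommand({
      resourceArn: "arn:aws:rds:us-east-1:123456789012:cluster:my-cluster", // placeholder
      secretArn: "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-db-creds", // placeholder
      database: "app",
      sql: "SELECT id, name FROM users",
    })
  );
  return result.records ?? [];
}
```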
Don't Work Around It - Just hope you never get big enough to have this problem. That's kind of ironic, though, since you probably want to get big enough to have this problem.