Problem
Reaching maximum connection limits - Serverless functions open a large number of connections to the Postgres database. An invoked Serverless function does not immediately end its connection when it returns, so even moderate traffic accumulates idle connections until the database hits its connection limit and its memory becomes overloaded.
Potential solutions
- Put a connection pooler in front of the database so that the connection limit is reached later, or not at all.
- Manually kill all connections from time to time to close the connections left open by old function runtimes.
- Write code that does this automatically every x minutes, or on demand.
- Configure the database to automatically close connections after x minutes of inactivity.
- Use a long-running server instead of Serverless functions, so a single connection pool can be reused.
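The last of the automatic options above can be done entirely at the database level. A minimal sketch of the relevant settings, assuming PostgreSQL 14 or later for `idle_session_timeout` (the timeout values here are illustrative, not tuned):

```
# postgresql.conf (or set via ALTER SYSTEM SET ...)

# Close sessions that have been idle for 5 minutes (PostgreSQL 14+).
idle_session_timeout = '5min'

# Abort transactions left open and idle for 1 minute (PostgreSQL 9.6+).
idle_in_transaction_session_timeout = '1min'
```

Note that on RDS these would be set through a parameter group rather than by editing the config file directly.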
Solution notes and problems
- RDS Proxy - This would have been an easy-to-set-up connection pooler in front of the database; however, it is not currently compatible with Prisma.
- pgbouncer - This would be the recommended connection pooling approach. It would require setting up an EC2 instance, running pgbouncer ourselves, and maintaining it. This will likely be required if any larger scale is reached.
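For reference, a minimal pgbouncer configuration sketch; the host, database name, and file paths are placeholders, and the pool sizes are illustrative rather than tuned. Transaction pooling is the mode that gives the biggest reduction in server connections, and Prisma supports it via the `pgbouncer=true` connection string parameter:

```
; pgbouncer.ini - sketch only, values are placeholders

[databases]
; Forward the "appdb" alias to the real Postgres instance.
appdb = host=db.example.internal port=5432 dbname=appdb

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
; Transaction pooling: a server connection is held only for the
; duration of a transaction, not for the life of the client.
pool_mode = transaction
; Many serverless clients can share a small server-side pool.
max_client_conn = 1000
default_pool_size = 20
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
```

The application would then connect to port 6432 instead of 5432, with `?pgbouncer=true` appended to the Prisma connection string.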
Solution approach
- Investigate current limitations - Run a database instance with 4 GB to 8 GB of memory, load test the site, and document how many concurrent users it can handle.
- Automatically close connections - There are no long-running database queries in the project, so putting a cap on connection idle time would help prevent the connection limit from being reached. This makes sense as a starting approach.
- Cron job to close connections - If necessary, look at closing inactive connections with a cron job that runs every few minutes.
- Add pgbouncer connection pooling - If the previous steps do not suffice to prevent connection limit issues, invest the time to set up pgbouncer properly.
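The "close connections on a schedule" steps above can be sketched in TypeScript. This is a sketch under assumptions: the function names are hypothetical, and the query execution is delegated to a callback so it can be run through any client (e.g. `pg`'s `client.query` or Prisma's `$queryRawUnsafe`) from a scheduled job. It relies on `pg_stat_activity` and `pg_terminate_backend`, which are standard Postgres, and requires a role with permission to terminate backends:

```typescript
// Build a query that terminates connections idle for longer than the
// given number of minutes, excluding the session running this query.
function buildTerminateIdleSql(idleMinutes: number): string {
  if (!Number.isInteger(idleMinutes) || idleMinutes <= 0) {
    throw new Error("idleMinutes must be a positive integer");
  }
  return (
    "SELECT pg_terminate_backend(pid) " +
    "FROM pg_stat_activity " +
    "WHERE state = 'idle' " +
    "AND pid <> pg_backend_pid() " +
    `AND state_change < now() - interval '${idleMinutes} minutes'`
  );
}

// Intended to be invoked every x minutes (cron, scheduled Lambda) or on
// demand. The caller supplies the executor, so this file has no driver
// dependency of its own.
async function closeIdleConnections(
  run: (sql: string) => Promise<unknown>, // e.g. (sql) => client.query(sql)
  idleMinutes = 5
): Promise<void> {
  await run(buildTerminateIdleSql(idleMinutes));
}
```

Separating query construction from execution keeps the scheduling code trivial and makes the termination criteria easy to adjust (for example, also filtering on `usename` to avoid killing admin sessions).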