Which combination of steps will meet these requirements?
(Choose three.)
A. Use Amazon RDS Proxy to create a proxy. Connect the proxy to the Aurora cluster reader endpoint. Set a maximum connections percentage on the proxy.
B. Implement database connection pooling inside the Lambda code. Set a maximum number of connections on the database connection pool.
C. Implement the database connection opening outside the Lambda event handler code.
D. Implement the database connection opening and closing inside the Lambda event handler code.
E. Connect to the proxy endpoint from the Lambda function.
F. Connect to the Aurora cluster endpoint from the Lambda function.
Explanations:
Amazon RDS Proxy provides connection management and pooling, which reduces the number of connections the database must hold open and improves performance. Setting a maximum connections percentage caps the share of the database's max_connections setting that the proxy may consume, keeping load manageable as the Lambda function scales.
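The proxy setup described above can be sketched with boto3. Everything below is a provisioning sketch: the proxy name, secret ARN, role ARN, subnet IDs, and cluster identifier are placeholders, and the function is defined but not invoked because it requires AWS credentials.

```python
def configure_rds_proxy():
    """Sketch: create an RDS Proxy in front of an Aurora cluster and cap its
    connection pool. All names and ARNs are placeholders; not called here
    because the calls require AWS credentials."""
    import boto3  # imported lazily so the module loads without boto3 installed

    rds = boto3.client("rds")

    # Create the proxy; it authenticates to the database via a Secrets
    # Manager secret rather than credentials baked into the Lambda code.
    rds.create_db_proxy(
        DBProxyName="app-proxy",
        EngineFamily="MYSQL",  # must match the Aurora engine family
        Auth=[{"AuthScheme": "SECRETS",
               "SecretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:db-creds"}],
        RoleArn="arn:aws:iam::123456789012:role/rds-proxy-role",
        VpcSubnetIds=["subnet-aaaa", "subnet-bbbb"],
        RequireTLS=True,
    )

    # Register the Aurora cluster as the proxy's target.
    rds.register_db_proxy_targets(
        DBProxyName="app-proxy",
        DBClusterIdentifiers=["my-aurora-cluster"],
    )

    # Cap the percentage of the database's max_connections the proxy may use,
    # so scaling Lambda invocations cannot exhaust the database.
    rds.modify_db_proxy_target_group(
        DBProxyName="app-proxy",
        TargetGroupName="default",
        ConnectionPoolConfig={"MaxConnectionsPercent": 90,
                              "MaxIdleConnectionsPercent": 50},
    )
```

The maximum connections percentage is applied on the proxy's target group, not on the proxy resource itself.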
While connection pooling is beneficial in general, a pool created inside the Lambda code is scoped to a single execution environment, and each execution environment processes only one invocation at a time. The pool therefore cannot share connections across concurrently running function instances; it only adds idle connections per instance and does not reduce the total connection load on the database.
Opening the database connection outside the Lambda event handler places it in the initialization code, which runs once per execution environment. The connection is then reused by every warm invocation that environment handles, avoiding the latency of establishing a new connection on each request.
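The reuse pattern can be demonstrated with a stub in place of a real driver call (such as pymysql.connect()), so the sketch runs without a database. A counter tracks how many connections are actually opened across simulated invocations.

```python
# Sketch of the "connect outside the handler" pattern. make_connection() is a
# stand-in for a real database driver call; the counter shows connection reuse.
CONNECTS = 0

def make_connection():
    """Stub for a real connect call; counts how many connections are opened."""
    global CONNECTS
    CONNECTS += 1
    return {"id": CONNECTS}

# Module scope runs once per execution environment (the cold start), so the
# connection is created a single time and then reused by warm invocations.
connection = make_connection()

def handler(event, context=None):
    # Reuse the module-level connection instead of opening a new one.
    return {"conn_id": connection["id"], "echo": event}

# Simulate three invocations landing on the same warm execution environment.
results = [handler({"n": i}) for i in range(3)]
```

All three simulated invocations report the same connection id, and the counter stays at one: only the cold start paid the connection cost.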
Opening and closing the database connection inside the Lambda event handler means a new connection is created for every invocation, leading to increased latency and resource consumption, particularly as the invocation rate increases.
Connecting to the RDS Proxy endpoint from the Lambda function allows for better connection management and pooling, which improves performance, especially under high invocation rates. The proxy can manage connections efficiently, thus reducing the load on the database.
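Pointing the function at the proxy is a configuration change: the proxy endpoint replaces the cluster endpoint in the connection settings. The sketch below assumes hypothetical environment variables (PROXY_ENDPOINT, DB_USER) and a hypothetical CA bundle path, and uses IAM authentication, which RDS Proxy supports; the connect function is defined but not called because it needs network access and AWS credentials.

```python
import os

# Hypothetical environment variables set on the Lambda function; the endpoint
# value shown is a placeholder for a real RDS Proxy endpoint.
DB_HOST = os.environ.get(
    "PROXY_ENDPOINT", "app-proxy.proxy-xxxx.us-east-1.rds.amazonaws.com")
DB_USER = os.environ.get("DB_USER", "app_user")

def connect_via_proxy():
    """Sketch: connect to the RDS Proxy endpoint using an IAM auth token
    instead of a long-lived password. Not invoked here."""
    import boto3
    import pymysql  # assumed to be packaged with the function

    # generate_db_auth_token signs a short-lived token locally from the
    # caller's AWS credentials; no database round trip is needed to mint it.
    token = boto3.client("rds").generate_db_auth_token(
        DBHostname=DB_HOST,
        Port=3306,
        DBUsername=DB_USER,
        Region=os.environ.get("AWS_REGION", "us-east-1"),
    )
    # TLS is required when using IAM auth; the CA bundle path is a placeholder.
    return pymysql.connect(host=DB_HOST, user=DB_USER, password=token,
                           port=3306, ssl={"ca": "/opt/rds-ca-bundle.pem"})
```

Combined with opening the connection outside the handler, each warm execution environment holds one connection to the proxy, and the proxy multiplexes those onto a bounded pool against the database.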
Connecting directly to the Aurora cluster endpoint from the Lambda function means each concurrent execution environment opens its own connection against the database. At high event rates this aggregate connection churn can exhaust the database's connection limit, increasing latency and reducing throughput.