Which solution will meet these requirements?
Use Lambda to retrieve all due payments. Publish the due payments to an Amazon S3 bucket. Configure the S3 bucket with an event notification to invoke another Lambda function to process the due payments.
Use Lambda to retrieve all due payments. Publish the due payments to an Amazon Simple Queue Service (Amazon SQS) queue. Configure another Lambda function to poll the SQS queue and to process the due payments.
Use Lambda to retrieve all due payments. Publish the due payments to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Configure another Lambda function to poll the FIFO queue and to process the due payments.
Use Lambda to retrieve all due payments. Store the due payments in an Amazon DynamoDB table. Configure streams on the DynamoDB table to invoke another Lambda function to process the due payments.
Explanations:
Publishing due payments to Amazon S3 and invoking a Lambda function via S3 event notifications risks duplicate processing: event notifications are delivered at least once, so the same object can trigger the function more than once, and unless the function is idempotent a payment may be charged twice. S3 also provides no ordering or deduplication guarantees, so it is not a good fit for this workload.
An Amazon SQS standard queue helps decouple retrieval from processing, but it offers only at-least-once delivery and best-effort ordering. It is designed for high throughput rather than exactly-once semantics, so a message can be delivered more than once and duplicate payments can still occur.
An Amazon SQS FIFO queue processes messages in the exact order they are sent and rejects duplicates through its deduplication feature: within a 5-minute deduplication window, messages with the same deduplication ID (supplied explicitly or derived via content-based deduplication) are delivered only once. This makes it the most suitable option for ensuring that payments are processed exactly once and without duplicates.
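The deduplication behavior above can be illustrated with a small local sketch. With content-based deduplication enabled, SQS derives the `MessageDeduplicationId` as a SHA-256 hash of the message body; the toy queue class below (the class itself is an illustration, not an AWS API) mimics how a second identical payment message inside the deduplication window is dropped.

```python
import hashlib

def dedup_id(body: str) -> str:
    # With content-based deduplication, SQS FIFO uses a SHA-256
    # hash of the message body as the MessageDeduplicationId.
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

class FifoQueueSim:
    """Toy model of a FIFO queue's deduplication window (not an AWS client)."""

    def __init__(self):
        self.seen = set()      # dedup IDs observed within the window
        self.messages = []     # messages accepted in send order

    def send(self, body: str) -> bool:
        key = dedup_id(body)
        if key in self.seen:
            return False       # duplicate within the window: silently dropped
        self.seen.add(key)
        self.messages.append(body)
        return True

q = FifoQueueSim()
q.send('{"payment_id": "p-1", "amount": 100}')
q.send('{"payment_id": "p-1", "amount": 100}')  # duplicate: not enqueued
```

In the real service, the same effect is achieved by sending with `boto3`'s `send_message` to a `.fifo` queue with a `MessageGroupId` (for ordering) and either content-based deduplication enabled on the queue or an explicit `MessageDeduplicationId`.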
Although DynamoDB Streams can trigger a Lambda function to process payments, this approach does not inherently deduplicate at the payment level. Each write to a payment item produces a stream record, so multiple updates to the same payment invoke the function multiple times, and Lambda retries on failure can redeliver records, leading to possible duplicate processing unless the function enforces idempotency itself.
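For the options that do not deduplicate for you (S3 notifications, standard SQS, DynamoDB Streams), the consumer itself must be idempotent. A minimal sketch of that pattern follows; the in-memory set is a stand-in for a durable idempotency store (in practice, e.g., a DynamoDB table written with a conditional put), and the function name and record shape are illustrative assumptions.

```python
# Stand-in for a durable idempotency store keyed by payment ID.
processed = set()

def handle_payment(record: dict) -> bool:
    """Process a payment at most once, even if the event is delivered again."""
    payment_id = record["payment_id"]
    if payment_id in processed:
        return False          # duplicate delivery: skip, already handled
    processed.add(payment_id)
    # ... charge the payment here ...
    return True

handle_payment({"payment_id": "p-1", "amount": 100})
handle_payment({"payment_id": "p-1", "amount": 100})  # retry: no-op
```

This makes duplicate deliveries harmless, but it adds state and failure modes that the FIFO queue option avoids, which is why the FIFO answer is preferred.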