Which solution will meet these requirements MOST cost-effectively?
A. Create a topic in AWS IoT Core to ingest the sensor data. Create an AWS Lambda function to enrich the data and to write the data to Amazon S3. Configure an AWS IoT rule action to invoke the Lambda function.
B. Use AWS IoT Core Basic Ingest to ingest the sensor data. Configure an AWS IoT rule action to write the data to Amazon Kinesis Data Firehose. Set the Kinesis Data Firehose buffering interval to 900 seconds. Use Kinesis Data Firehose to invoke an AWS Lambda function to enrich the data. Configure Kinesis Data Firehose to deliver the data to Amazon S3.
C. Create a topic in AWS IoT Core to ingest the sensor data. Configure an AWS IoT rule action to send the data to an Amazon Timestream table. Create an AWS Lambda function to read the data from Timestream. Configure the Lambda function to enrich the data and to write the data to Amazon S3.
D. Use AWS IoT Core Basic Ingest to ingest the sensor data. Configure an AWS IoT rule action to write the data to Amazon Kinesis Data Streams. Create a consumer AWS Lambda function to process the data from Kinesis Data Streams and to enrich the data. Call the S3 PutObject API operation from the Lambda function to write the data to Amazon S3.
Explanations:
Option A: While this option ingests the sensor data and enriches it with AWS Lambda, invoking the function for every published message leads to high costs under Lambda's per-invocation pricing, and each message also generates its own small S3 PUT. Without any buffering, managing the data flow and retries at scale becomes harder and may jeopardize the 30-minute delivery requirement.
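The per-message pattern in Option A could be sketched as the following Lambda handler. The bucket name, field names (`device_id`, `temperature_c`), and key scheme are hypothetical; the point is that every reading triggers one invocation and one S3 PUT:

```python
import json
from datetime import datetime, timezone

# import boto3  # uncomment when deploying to AWS
# s3 = boto3.client("s3")

BUCKET = "sensor-data-bucket"  # hypothetical bucket name

def enrich(message):
    """Add example enrichment fields to a single sensor reading."""
    enriched = dict(message)
    enriched["processed_at"] = datetime.now(timezone.utc).isoformat()
    enriched["temperature_f"] = message["temperature_c"] * 9 / 5 + 32
    return enriched

def handler(event, context):
    # The IoT rule invokes this function once per published message, so
    # every reading costs one Lambda invocation plus one S3 PUT request.
    record = enrich(event)
    key = f"raw/{record['device_id']}/{record['processed_at']}.json"
    # s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(record))
    return {"key": key}
```

At high message volume, those per-invocation and per-PUT charges are exactly what the buffered alternatives avoid.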
Option B (correct): This option leverages AWS IoT Core Basic Ingest and Kinesis Data Firehose, which is optimized for handling large volumes of data. Kinesis Data Firehose can buffer data for a specified interval (900 seconds) before invoking Lambda for enrichment, so records are processed in batches rather than one at a time and are delivered to S3 well within the required timeframe. This approach minimizes costs and meets the 30-minute requirement effectively.
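The enrichment step in Option B would be a Firehose data-transformation Lambda. Firehose delivers each buffered batch as an event whose records carry base64-encoded payloads, and the function must return each record with its `recordId`, a `result` status, and re-encoded `data`. A minimal sketch (the `enriched` field is a placeholder for real enrichment logic):

```python
import base64
import json

def handler(event, context):
    # Kinesis Data Firehose buffers records (here, up to 900 seconds)
    # and invokes this function once per batch; each record's payload
    # arrives base64-encoded.
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        payload["enriched"] = True  # placeholder enrichment
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",  # other statuses: "Dropped", "ProcessingFailed"
            "data": base64.b64encode(
                (json.dumps(payload) + "\n").encode()
            ).decode(),
        })
    return {"records": output}
```

Because one invocation handles an entire buffered batch, Lambda costs scale with batches rather than with individual sensor messages.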
Option C: Although this option uses AWS IoT Core to ingest data and AWS Lambda for enrichment, routing the data through Amazon Timestream introduces additional complexity, latency, and cost. Writing data to Timestream only to read it back for processing adds an unnecessary storage-and-query hop that may threaten the 30-minute requirement, and Timestream's storage and query charges make this far from the most cost-effective solution.
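To see why Option C adds a needless hop, consider the record shape the `timestream-write` `WriteRecords` API expects. The dimension and measure names below are hypothetical; the data would still need a second Lambda to query Timestream and re-serialize the results before landing in S3:

```python
import time

# import boto3  # uncomment when deploying
# ts_write = boto3.client("timestream-write")

def to_timestream_record(reading):
    """Map one sensor reading (hypothetical fields) to the
    WriteRecords record shape."""
    return {
        "Dimensions": [{"Name": "device_id", "Value": reading["device_id"]}],
        "MeasureName": "temperature_c",
        "MeasureValue": str(reading["temperature_c"]),
        "MeasureValueType": "DOUBLE",
        "Time": str(int(time.time() * 1000)),
        "TimeUnit": "MILLISECONDS",
    }

# ts_write.write_records(DatabaseName="sensors", TableName="readings",
#                        Records=[to_timestream_record(r) for r in batch])
# A separate Lambda must then query Timestream and write the results to
# S3 -- an extra read step (and extra cost) the other options avoid.
```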
Option D: This option uses Kinesis Data Streams for ingestion, which adds cost and complexity: provisioned streams are billed per shard-hour plus PUT payload units, so capacity is paid for even when idle. The consumer Lambda function must also read, enrich, and write each record via the S3 PutObject API, raising operational costs and introducing potential delays, so this option does not meet the cost-effectiveness goal as well as a buffered delivery stream.
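The consumer in Option D would receive Kinesis events whose payloads are base64-encoded under `Records[*].kinesis.data`. A minimal sketch (bucket name, field names, and key scheme are hypothetical):

```python
import base64
import json

# import boto3  # uncomment when deploying
# s3 = boto3.client("s3")

def decode_records(event):
    """Decode the base64 payloads a Kinesis-triggered Lambda receives."""
    return [
        json.loads(base64.b64decode(r["kinesis"]["data"]))
        for r in event["Records"]
    ]

def handler(event, context):
    keys = []
    for reading in decode_records(event):
        reading["enriched"] = True  # placeholder enrichment
        key = f"readings/{reading['device_id']}.json"  # hypothetical scheme
        # s3.put_object(Bucket="sensor-bucket", Key=key,
        #               Body=json.dumps(reading))
        keys.append(key)
    return keys
```

Unlike Firehose, this path gives you no managed buffering or delivery to S3: batching, retries, and PutObject calls are all the consumer's responsibility, on top of the per-shard charges.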