Which solution will meet these requirements?
Launch two Amazon EC2 instances to host the Kafka server in an active/standby configuration across two Availability Zones. Create a domain name in Amazon Route 53. Create a Route 53 failover policy. Route the sensors to send the data to the domain name.
Migrate the on-premises Kafka server to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Create a Network Load Balancer (NLB) that points to the Amazon MSK brokers. Enable NLB health checks. Route the sensors to send the data to the NLB.
Deploy AWS IoT Core, and connect it to an Amazon Kinesis Data Firehose delivery stream. Use an AWS Lambda function to handle data transformation. Route the sensors to send the data to AWS IoT Core.
Deploy AWS IoT Core, and launch an Amazon EC2 instance to host the Kafka server. Configure AWS IoT Core to send the data to the EC2 instance. Route the sensors to send the data to AWS IoT Core.
Explanations:
An active/standby configuration on EC2 instances does not provide the required scalability or high availability. It also does not eliminate single points of failure: only one instance serves traffic at a time, and the Kafka server itself is still self-managed.
While Amazon MSK provides a managed Kafka service, placing an NLB in front of MSK is unnecessary and awkward. Kafka clients discover and connect to individual brokers directly through the bootstrap broker addresses, and MSK already distributes brokers across multiple Availability Zones for fault tolerance, so the NLB adds complexity without improving availability.
AWS IoT Core provides a fully managed, highly available, and scalable entry point for data from IoT sensors. Its native integration with Kinesis Data Firehose handles delivery, and a Lambda function attached to the delivery stream performs the data transformation, making this the option that best meets the requirements.
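To make the transformation step concrete, a minimal sketch of a Firehose data-transformation Lambda follows the documented contract: Firehose invokes the function with base64-encoded records, and the function returns each record with its `recordId`, a `result` status, and the transformed, re-encoded data. The record fields shown here (`sensor_id`, `temp_c`) are hypothetical examples, not from the question.

```python
import base64
import json

def lambda_handler(event, context):
    """Firehose transformation Lambda: decode, reshape, re-encode each record."""
    output = []
    for record in event["records"]:
        # Firehose delivers record data base64-encoded.
        payload = json.loads(base64.b64decode(record["data"]))

        # Hypothetical transformation: rename a field and append a newline
        # so the destination receives newline-delimited JSON.
        transformed = {
            "sensor_id": payload.get("sensor_id"),
            "temperature_c": payload.get("temp_c"),
        }
        data = (json.dumps(transformed) + "\n").encode("utf-8")

        output.append({
            "recordId": record["recordId"],          # must echo the input recordId
            "result": "Ok",                          # or "Dropped" / "ProcessingFailed"
            "data": base64.b64encode(data).decode("utf-8"),
        })
    return {"records": output}
```

Firehose buffers the transformed records and delivers them to the configured destination (for example, Amazon S3) without any servers to manage.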
Combining AWS IoT Core with a single EC2 instance hosting Kafka does not provide full scalability or high availability. The EC2 instance is a single point of failure, so this option fails the high-availability requirement.
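For reference, routing sensor messages from AWS IoT Core to Firehose is configured with an IoT topic rule. A sketch of the rule payload is shown below; the topic filter, delivery stream name, and role ARN are placeholder values, not details from the question.

```json
{
  "sql": "SELECT * FROM 'sensors/+/telemetry'",
  "awsIotSqlVersion": "2016-03-23",
  "actions": [
    {
      "firehose": {
        "deliveryStreamName": "sensor-delivery-stream",
        "roleArn": "arn:aws:iam::123456789012:role/iot-firehose-role",
        "separator": "\n"
      }
    }
  ]
}
```

The `roleArn` must grant IoT Core permission to call `firehose:PutRecord` on the delivery stream.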