Which steps will allow the solutions architect to perform the migration within the specified timeline?
A. Install Oracle database software on an Amazon EC2 instance. Configure VPN connectivity between AWS and the company’s data center. Configure the Oracle database running on Amazon EC2 to join the Oracle Real Application Clusters (RAC). When the Oracle database on Amazon EC2 finishes synchronizing, create an AWS DMS ongoing replication task to migrate the data from the Oracle database on Amazon EC2 to Amazon Redshift. Verify the data migration is complete and perform the cutover to Amazon Redshift.
B. Create an AWS Snowball import job. Export a backup of the Oracle data warehouse. Copy the exported data to the Snowball device. Return the Snowball device to AWS. Create an Amazon RDS for Oracle database and restore the backup file to that RDS instance. Create an AWS DMS task to migrate the data from the RDS for Oracle database to Amazon Redshift. Copy daily incremental backups from Oracle in the data center to the RDS for Oracle database over the internet. Verify the data migration is complete and perform the cutover to Amazon Redshift.
C. Install Oracle database software on an Amazon EC2 instance. To minimize the migration time, configure VPN connectivity between AWS and the company’s data center by provisioning a 1 Gbps AWS Direct Connect connection. Configure the Oracle database running on Amazon EC2 to be a read replica of the data center Oracle database. Start the synchronization process between the company’s on-premises data center and the Oracle database on Amazon EC2. When the Oracle database on Amazon EC2 is synchronized with the on-premises database, create an AWS DMS ongoing replication task to migrate the data from the Oracle database read replica that is running on Amazon EC2 to Amazon Redshift. Verify the data migration is complete and perform the cutover to Amazon Redshift.
D. Create an AWS Snowball import job. Configure a server in the company’s data center with an extraction agent. Use AWS SCT to manage the extraction agent and convert the Oracle schema to an Amazon Redshift schema. Create a new project in AWS SCT using the registered data extraction agent. Create a local task and an AWS DMS task in AWS SCT with replication of ongoing changes. Copy data to the Snowball device and return the Snowball device to AWS. Allow AWS DMS to copy data from Amazon S3 to Amazon Redshift. Verify that the data migration is complete and perform the cutover to Amazon Redshift.
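For context on option D, the Snowball import job can be created through the AWS Snowball API before the extraction agents are registered. The following is a minimal boto3 sketch; the bucket ARN, address ID, and role ARN are placeholders, and the extraction agents, the SCT project, and the local/DMS tasks are configured in the AWS SCT application rather than through this API.

```python
import boto3

# Minimal sketch: create the Snowball import job that will carry the bulk
# extract of the data warehouse to AWS. All ARNs and IDs are placeholders.
snowball = boto3.client("snowball", region_name="us-east-1")

response = snowball.create_job(
    JobType="IMPORT",                    # data flows from on premises into S3
    SnowballType="EDGE",
    ShippingOption="SECOND_DAY",
    Description="Bulk load of the on-premises Oracle data warehouse",
    AddressId="ADID-0123456789ab",       # placeholder shipping address ID
    RoleARN="arn:aws:iam::123456789012:role/SnowballImportRole",  # placeholder
    Resources={
        "S3Resources": [
            {"BucketArn": "arn:aws:s3:::example-dw-staging"}      # placeholder staging bucket
        ]
    },
)
print(response["JobId"])
```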
Explanations:
A. Extending Oracle RAC to an EC2 instance introduces unnecessary complexity and cost: RAC depends on shared storage and a low-latency cluster interconnect, neither of which is practical across a VPN to AWS. The option also requires significant manual intervention to synchronize the EC2 database before AWS DMS can even begin, making it impractical for meeting the 30-day window.
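A rough calculation shows why synchronizing the full data set over the network is a non-starter. Assuming the VPN rides the company’s existing 50 Mbps internet connection and the link is fully available, moving 50 TB takes roughly three months:

```python
# Back-of-the-envelope transfer time for the initial synchronization,
# assuming the full 50 Mbps link is available around the clock and
# ignoring protocol overhead and load on the source database.
data_bits = 50 * 10**12 * 8      # 50 TB in bits
link_bps = 50 * 10**6            # 50 Mbps
days = data_bits / link_bps / 86400
print(f"~{days:.0f} days")       # ~93 days, far beyond the 30-day window
```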
B. This option chains together Snowball, an intermediate RDS for Oracle database, and DMS, which is inefficient and time consuming for 50 TB of data. Restoring the backup into RDS for Oracle creates a second full copy of the data that still has to be migrated to Redshift, the daily incremental backups would crawl over the 50 Mbps connection, and the Snowball device does nothing to help this kind of continuous replication.
C. Configuring the Oracle database on EC2 as a read replica of the on-premises database adds an unnecessary layer of complexity. Although a 1 Gbps AWS Direct Connect connection would speed up data transfer, a new Direct Connect connection typically takes weeks to provision, and building and synchronizing a read replica of a 50 TB on-premises database may not complete in the time available.
D. This option handles the migration efficiently: AWS SCT converts the Oracle schema to an Amazon Redshift schema, the registered data extraction agents copy the bulk data onto the Snowball device so the 50 Mbps link never has to carry the 50 TB, and AWS DMS loads the data from Amazon S3 into Amazon Redshift while replicating ongoing changes until cutover. This combination fits the 30-day window, minimizes the impact on business operations, and keeps costs low.
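As an illustration of the ongoing-replication piece in option D, the sketch below creates DMS endpoints and a CDC-only replication task with boto3. The endpoint addresses, credentials, and replication instance ARN are placeholders, and in the SCT-driven workflow the DMS task is normally registered from within AWS SCT rather than created by hand.

```python
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Placeholder source endpoint for the on-premises Oracle data warehouse.
oracle_source = dms.create_endpoint(
    EndpointIdentifier="oracle-dw-source",
    EndpointType="source",
    EngineName="oracle",
    ServerName="oracle.example.internal",
    Port=1521,
    DatabaseName="DWPROD",
    Username="dms_user",
    Password="example-password",
)

# Placeholder target endpoint for the Amazon Redshift cluster.
redshift_target = dms.create_endpoint(
    EndpointIdentifier="redshift-dw-target",
    EndpointType="target",
    EngineName="redshift",
    ServerName="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    Port=5439,
    DatabaseName="dw",
    Username="dms_user",
    Password="example-password",
)

# CDC-only task: replicate ongoing changes until the cutover to Redshift.
dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-redshift-cdc",
    SourceEndpointArn=oracle_source["Endpoint"]["EndpointArn"],
    TargetEndpointArn=redshift_target["Endpoint"]["EndpointArn"],
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:EXAMPLE",  # placeholder
    MigrationType="cdc",
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)
```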