
As data volumes grow rapidly, migrating data to the cloud has become a practical solution.

As time passes and new devices appear, data is generated through more diverse pipelines and in greater quantities, and the demand for storage keeps increasing. In the past, companies had to keep buying hardware to store this constantly generated data. With that approach, a company had to invest heavily up front to build its own infrastructure, and ongoing maintenance costs (labor, electricity, data center space, licensing, and so on) pushed the Total Cost of Ownership (TCO) very high. For newly established companies with unstable funding, these costs could be the straw that breaks the camel's back. In contrast, cloud services are rented rather than purchased: there is no large up-front investment, and you are charged according to your actual usage. More importantly, you can leverage the global reach, high availability, and scalability of a cloud platform to deploy and deliver services and content to the world.

AWS provides a variety of ways to migrate data to the cloud, and Nextlink Solution Architects have organized relevant information for you in this newsletter.

Internet / VPN

No matter where you are, you can migrate data to the cloud by connecting to your VPC over the Internet or a VPN. This is the easiest and most common way to put objects into Amazon S3. If you need to upload a large file, you can use multipart upload to speed up the transfer and upload objects of up to 5 TB. Inbound data transfer to AWS over the Internet or a VPN is free.
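As an illustration, a minimal sketch of a large-file upload with the AWS SDK for Python (boto3), which performs a multipart upload automatically above the configured threshold; the bucket and file names are placeholders.

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Placeholder bucket and file names -- replace with your own.
BUCKET = "my-migration-bucket"
LOCAL_FILE = "backup-archive.tar.gz"

s3 = boto3.client("s3")

# Files larger than multipart_threshold are split into parts and
# uploaded in parallel, which speeds up large transfers.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,  # start multipart at 100 MB
    multipart_chunksize=100 * 1024 * 1024,  # 100 MB per part
    max_concurrency=8,                      # parallel upload threads
)

s3.upload_file(LOCAL_FILE, BUCKET, LOCAL_FILE, Config=config)
```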


Amazon S3 Transfer Acceleration

If users inside your company need to transfer GB- or TB-scale data to the same Amazon S3 bucket across regions and continents, you can use Amazon S3 Transfer Acceleration. It improves the speed and security of long-distance transfers to and from an Amazon S3 bucket and can increase overall transfer performance by 50% to 400%. Transfer Acceleration routes objects to Amazon S3 over an optimized path through Amazon CloudFront's globally distributed edge locations.
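For illustration, a hedged boto3 sketch of enabling Transfer Acceleration on a bucket and then uploading through the accelerate endpoint; the bucket and file names are placeholders.

```python
import boto3
from botocore.config import Config

BUCKET = "my-migration-bucket"  # placeholder name

# Enable Transfer Acceleration on the bucket (a one-time setting).
boto3.client("s3").put_bucket_accelerate_configuration(
    Bucket=BUCKET,
    AccelerateConfiguration={"Status": "Enabled"},
)

# Route uploads through the accelerate endpoint
# (<bucket>.s3-accelerate.amazonaws.com), which uses CloudFront edge locations.
s3_accel = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
s3_accel.upload_file("large-dataset.zip", BUCKET, "large-dataset.zip")
```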

AWS Direct Connect

AWS Direct Connect is a dedicated network connection service between your on-premises site and AWS. Compared with transfers over the Internet, a dedicated connection can reduce cost, increase bandwidth, and provide a more consistent network experience. It integrates with existing AWS services such as Amazon S3, Amazon EC2, and Amazon VPC, and you can establish one or more private connections from your premises to Amazon VPC to maintain network isolation.
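The physical connection itself is provisioned with AWS (often through a partner), but once it is in place you can inspect it with the SDK. A small boto3 sketch, purely for illustration:

```python
import boto3

# Direct Connect links are provisioned physically; the SDK is used here
# only to inspect existing connections and virtual interfaces.
dx = boto3.client("directconnect")

for conn in dx.describe_connections()["connections"]:
    print(conn["connectionName"], conn["connectionState"], conn["bandwidth"])

for vif in dx.describe_virtual_interfaces()["virtualInterfaces"]:
    print(vif["virtualInterfaceName"], vif["virtualInterfaceState"])
```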


The Snow Family

If your company's data center needs to migrate PB- or even EB-scale data to the AWS platform, consider using the AWS Snow Family of services.

AWS Snowball: a PB-scale data transfer device with 256-bit encryption and a TPM module to ensure data security. When you create a job in the AWS Console, AWS ships the Snowball device to you. You transfer your data to the device through standard network interfaces and storage protocols, then send it back to the address shown on the device's E Ink shipping label, and AWS uploads the data to Amazon S3 (see the sketch after this list).
AWS Snowball Edge: a PB-scale data transfer device with built-in storage and compute. Snowball Edge supports specific Amazon EC2 instance types and AWS Lambda functions, so in addition to data migration and transfer it can be used for tasks such as image processing, IoT, and machine learning.
AWS Snowmobile: an EB-scale data transfer service that uses a hardened, 256-bit-encrypted data container hauled by a truck. After an assessment, AWS delivers the Snowmobile to your local data center and connects it through a removable network switch. Over this connection, you transfer the data center's source data onto the Snowmobile. After the transfer, the Snowmobile is driven back to AWS, where your data is uploaded to Amazon S3 or Amazon Glacier.
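As referenced above, a minimal sketch of creating a Snowball import job with boto3; the bucket ARN, role ARN, and address ID are placeholders for resources in your own account.

```python
import boto3

snowball = boto3.client("snowball")

# Placeholder ARNs and address ID -- replace with resources in your account.
response = snowball.create_job(
    JobType="IMPORT",
    Resources={
        "S3Resources": [
            {"BucketArn": "arn:aws:s3:::my-migration-bucket"}
        ]
    },
    Description="Data center migration to Amazon S3",
    AddressId="ADID1234ab12-1234-1234-1234-123456789012",
    RoleARN="arn:aws:iam::123456789012:role/SnowballImportRole",
    SnowballCapacityPreference="T80",
    ShippingOption="SECOND_DAY",
)

# The returned JobId is used to track the device through the AWS Console.
print(response["JobId"])
```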

Amazon Kinesis Data Firehose

Amazon Kinesis Data Firehose is an easy way to load streaming data into data stores and analytics tools. It lets you capture, transform (ETL), and deliver streaming data, and it integrates with Amazon S3, Amazon Redshift, Amazon Elasticsearch Service, and third-party applications.

For example, its integration with Splunk lets you get the latest information faster so you can analyze and respond to events.
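As an illustration, a minimal boto3 sketch of sending a record to a Firehose delivery stream; the stream name and event fields are placeholders, and the stream's destination (Amazon S3, Amazon Redshift, Amazon Elasticsearch Service, or Splunk) is configured when the stream is created.

```python
import json
import boto3

firehose = boto3.client("firehose")

# Placeholder delivery stream name and event payload.
STREAM = "my-clickstream-firehose"
event = {"user_id": 42, "action": "page_view", "ts": "2023-01-01T00:00:00Z"}

# Firehose buffers records and delivers them to the configured destination.
firehose.put_record(
    DeliveryStreamName=STREAM,
    Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
)
```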


After the data is migrated, what’s next?

After migrating your data to the AWS platform through the methods above, the next step is to store it. The AWS platform also provides a variety of storage options. Please follow the Nextlink newsletter; we will cover AWS storage and backup-related services in the future.