This topic describes the operations that you must perform before you migrate data.
Step 1: Estimate the amount of data to be migrated
Estimate the size and number of objects that you want to migrate. You can query the data size of a Google Cloud Storage bucket by using the gsutil tool or by checking the storage logs of the bucket. For more information, see Getting bucket information.
The information in the preceding link may be outdated due to changes in the original server and is for reference only.
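As a sketch of the estimate described above, the gsutil tool can report the total size and an approximate object count, assuming gsutil is installed and authenticated and that mybucket is a placeholder for your bucket name:

```shell
# Total size of all objects in the bucket (-s summarizes, -h prints a human-readable unit)
gsutil du -s -h gs://mybucket

# Approximate number of objects, including objects under all prefixes
gsutil ls -r gs://mybucket/** | wc -l
```

Storage logs are the more reliable source for very large buckets, because listing every object can be slow and is billed as Class A operations.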
Step 2: Restore data in the source bucket
Before you create a migration task to migrate data of the Archive storage class, you must manually restore the data. Take note of the following items when you restore the data:
Before you create a source data address and a migration task, make sure that data of the Archive storage class is restored.
Based on the amount of data that you want to migrate, specify a sufficient number of days for the restored data to remain in the restored state. This prevents the data from returning to the archived state before the migration is complete.
You may be charged for the restoration operation. The fee may be relatively high. For more information about the billing methods, contact the service provider that offers the source bucket.
Data Online Migration does not restore data in archived files at the source data address during data migration. Files that are not restored or that are still being restored cannot be migrated.
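For example, if the source bucket is in Google Cloud Storage, you can change the storage class of archived objects with gsutil before you start the migration. This is a hedged sketch only, assuming gsutil is installed and that mybucket and the object paths are placeholders:

```shell
# Rewrite a single Archive-class object to the STANDARD storage class
gsutil rewrite -s STANDARD gs://mybucket/path/to/object

# Rewrite every object under a prefix (-r), with parallel execution (-m)
gsutil -m rewrite -r -s STANDARD gs://mybucket/archived-prefix/
```

Rewriting objects to a warmer storage class may incur early-deletion and retrieval fees; check the pricing of the source service before you run these commands at scale.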
Step 3: Create a private key for the service account that is used for migration
Log on to the IAM & Admin console.
In the left-side navigation pane, click Service Accounts. In the top navigation bar, click the Select a project drop-down list, and in the dialog box that appears, select the project in which the bucket resides.
On the Service accounts page, click CREATE SERVICE ACCOUNT. In the Create service account wizard, create a service account and grant it read permissions on the bucket that stores the data to be migrated. In Step 3 of the wizard, click CREATE KEY and select JSON as the key type. Click CREATE to save the JSON file to your computer, click CLOSE, and then click DONE.
If you already have a service account, find the service account on the Service accounts page and click its name. On the page that appears, create a new key and select JSON as the key type.
The preceding procedure may be outdated due to changes in the original server and is for reference only.
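If you prefer the command line, the same service account and JSON private key can be created with the gcloud CLI. This is a minimal sketch, assuming gcloud and gsutil are installed and authenticated, and that my-project, migration-reader, and mybucket are placeholders:

```shell
# Create the service account in the project that contains the source bucket
gcloud iam service-accounts create migration-reader --project=my-project

# Grant the account read permissions on the bucket's objects
gsutil iam ch \
  serviceAccount:migration-reader@my-project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://mybucket

# Create a JSON private key and save it to the local file key.json
gcloud iam service-accounts keys create key.json \
  --iam-account=migration-reader@my-project.iam.gserviceaccount.com
```

Store the downloaded key file securely; it cannot be downloaded again and grants read access to the bucket.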
Step 4: Create a destination bucket
Create a destination bucket in the OSS console to store the migrated data. For more information, see Create a bucket.
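If you use the command line instead of the OSS console, the destination bucket can also be created with the ossutil tool. A sketch, assuming ossutil is installed and configured with your credentials, and that my-dest-bucket is a placeholder:

```shell
# Create the destination bucket (mb = make bucket)
ossutil mb oss://my-dest-bucket
```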
Step 5: Create a RAM role that is used to migrate data
To ensure data security, we recommend that you create a RAM role and attach the required policies to the RAM role based on the principle of least privilege for data migration.
Log on to the RAM console.
In the left-side navigation pane, open the Roles page.
On the Roles page, click Create Role.
In the Select Role Type step of the Create Role wizard, select Alibaba Cloud Service and click Next.
In the Select Trusted Entity section, set Role Type to Normal Service Role, enter a role name in the RAM Role Name field, and select Data Online Migration from the Select Trusted Service drop-down list. Then, click OK.
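The trust relationship that the wizard configures for the role is a policy document similar to the following. This is a hedged example only; the service principal shown (hcs-mgw.aliyuncs.com) is an assumption and may differ in your account, so verify it on the role's Trust Policy Management tab:

```json
{
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "hcs-mgw.aliyuncs.com"
        ]
      }
    }
  ],
  "Version": "1"
}
```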
Step 6: Grant permissions to the RAM user
After the RAM user is created, go to the Users page in the RAM console. Find the RAM user that you want to manage and click Add Permissions in the Actions column. Grant the following permissions to the RAM user:
System policy: AliyunMGWFullAccess, which grants full permissions on Data Transport
Custom policy:
Grant the RAM user full permissions on the destination bucket.
Note: The following policy is for reference only. Replace mybucket with the name of the destination bucket. For more information about RAM policies for OSS, see Common examples of RAM policies.
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "oss:*",
      "Resource": [
        "acs:oss:*:*:mybucket",
        "acs:oss:*:*:mybucket/*"
      ]
    }
  ]
}
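The policies above can also be attached from the command line with the Alibaba Cloud CLI (aliyun), which invokes RAM API operations by name. This is a hedged sketch, assuming the CLI is installed and configured, and that migration-user and dest-bucket-full-access are hypothetical names you replace with your own:

```shell
# Attach the AliyunMGWFullAccess system policy to the RAM user
aliyun ram AttachPolicyToUser \
  --PolicyType System \
  --PolicyName AliyunMGWFullAccess \
  --UserName migration-user

# Create the custom bucket policy from a local file, then attach it
aliyun ram CreatePolicy \
  --PolicyName dest-bucket-full-access \
  --PolicyDocument "$(cat policy.json)"
aliyun ram AttachPolicyToUser \
  --PolicyType Custom \
  --PolicyName dest-bucket-full-access \
  --UserName migration-user
```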