| Category | Parameter | Description |
| --- | --- | --- |
| N/A | Task Name | The name of the task. DTS automatically assigns a name to the task. We recommend that you specify a descriptive name that makes the task easy to identify. The name does not need to be unique. |
| Source Database | Select an existing DMS database instance | The DMS database instance that you want to use. Determine whether to select an existing instance based on your business requirements. If you select an existing instance, DTS automatically populates the parameters of the instance. If you do not select an existing instance, you must manually configure the parameters for the source database. |
| | Database Type | The type of the source database. Select PolarDB-X 1.0. |
| | Access Method | The access method of the source database. Select Alibaba Cloud Instance. |
| | Instance Region | The region in which the source PolarDB-X 1.0 instance resides. |
| | Replicate Data Across Alibaba Cloud Accounts | Specifies whether to migrate data across Alibaba Cloud accounts. In this example, No is selected. |
| | Instance ID | The ID of the source PolarDB-X 1.0 instance. |
| | Database Account | The database account of the source PolarDB-X 1.0 instance. Grant permissions to the account based on the format in which data is stored in the destination ApsaraMQ for Kafka instance. |
| | Database Password | The password of the database account. |
| Destination Database | Select an existing DMS database instance | The DMS database instance that you want to use. Determine whether to select an existing instance based on your business requirements. If you select an existing instance, DTS automatically populates the parameters of the instance. If you do not select an existing instance, you must manually configure the parameters for the destination database. |
| | Database Type | The type of the destination database. Select Kafka. |
| | Access Method | The access method of the destination database. Select Express Connect, VPN Gateway, or Smart Access Gateway. |
| | Instance Region | The region in which the destination ApsaraMQ for Kafka instance resides. |
| | Connected VPC | The ID of the virtual private cloud (VPC) to which the destination ApsaraMQ for Kafka instance belongs. To obtain the VPC ID, log on to the ApsaraMQ for Kafka console and go to the Instance Details page of the instance. In the Configuration Information section of the Instance Information tab, view the VPC ID. |
| | IP Address or Domain Name | An IP address of the destination ApsaraMQ for Kafka instance. Note: To obtain an IP address of the instance, log on to the ApsaraMQ for Kafka console and go to the Instance Details page of the instance. In the Endpoint Information section of the Instance Information tab, obtain an IP address from the Default Endpoint parameter. |
| | Port Number | The service port number of the destination ApsaraMQ for Kafka instance. Default value: 9092. |
| | Database Account | The database account of the destination ApsaraMQ for Kafka instance. Note: The database account and database password are required only for ApsaraMQ for Kafka instances for which the access control list (ACL) feature is enabled. For more information about how to enable the ACL feature, see Grant permissions to SASL users. |
| | Database Password | The password of the database account. Like the database account, the password is required only if the ACL feature is enabled. |
| | Kafka Version | The version of the destination ApsaraMQ for Kafka instance. |
| | Encryption | Specifies whether to encrypt the connection to the destination instance. Select Non-encrypted or SCRAM-SHA-256 based on your business and security requirements. A connectivity sketch that uses the endpoint, account, and encryption settings is provided after this table. |
| | Topic | The topic that is used to receive the migrated data. Select a topic from the drop-down list. |
| | Topic That Stores DDL Information | The topic that is used to store the DDL information. Select a topic from the drop-down list. If you do not specify this parameter, the DDL information is stored in the topic that is specified by the Topic parameter. |
| | Use Kafka Schema Registry | Specifies whether to use Kafka Schema Registry, which provides a serving layer for your metadata and a RESTful API to store and retrieve your Avro schemas. Valid values: No: Kafka Schema Registry is not used. Yes: Kafka Schema Registry is used. In this case, you must enter the URL or IP address that is registered in Kafka Schema Registry for your Avro schemas. A sketch that checks the registry endpoint is provided after this table. |
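Before you start the task, you can verify from a client in the connected VPC that the IP address, port, database account, database password, and encryption settings above are reachable. The following is a minimal sketch, not part of the DTS configuration, that uses the kafka-python library (which this documentation does not prescribe); the broker address, topic name, consumer group, and credentials are placeholder assumptions and must be replaced with your own values.

```python
# Minimal connectivity check against the destination ApsaraMQ for Kafka instance.
# Assumptions: kafka-python is installed (pip install kafka-python), and the
# placeholder values below are replaced with your instance's actual settings.
from kafka import KafkaConsumer

BROKER = "192.0.2.10:9092"          # placeholder: default endpoint IP address and port
TOPIC = "dts-target-topic"          # placeholder: the topic selected in the Topic parameter
USER = "dts_user"                   # placeholder: database account (needed only if ACL is enabled)
PASSWORD = "dts_password"           # placeholder: database password

# Assumes Encryption is set to SCRAM-SHA-256 and SASL over plaintext transport;
# if Encryption is Non-encrypted and the ACL feature is disabled, drop the
# security_protocol and sasl_* arguments. Adjust security_protocol if your
# instance requires SSL.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=[BROKER],
    group_id="dts-connectivity-check",   # placeholder consumer group
    security_protocol="SASL_PLAINTEXT",
    sasl_mechanism="SCRAM-SHA-256",
    sasl_plain_username=USER,
    sasl_plain_password=PASSWORD,
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,           # stop iterating after 10 s without messages
)

# Print a few records already written to the topic (if any), then exit.
for i, record in enumerate(consumer):
    value = record.value[:80] if record.value else None
    print(record.topic, record.partition, record.offset, value)
    if i >= 4:
        break
consumer.close()
```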
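If you set Use Kafka Schema Registry to Yes, you can also confirm that the URL or IP address you plan to enter is reachable and serving Avro schemas. The following sketch assumes a Confluent-compatible Schema Registry REST API and a placeholder registry URL; it lists the registered subjects and fetches the latest schema for one of them.

```python
# Check that the Kafka Schema Registry endpoint entered for the
# Use Kafka Schema Registry parameter is reachable.
# Assumptions: the registry exposes the standard Confluent-compatible REST API,
# REGISTRY_URL is a placeholder, and requests is installed (pip install requests).
import requests

REGISTRY_URL = "http://192.0.2.20:8081"  # placeholder registry URL or IP address

# The standard Schema Registry REST API lists registered subjects at /subjects.
resp = requests.get(f"{REGISTRY_URL}/subjects", timeout=10)
resp.raise_for_status()
subjects = resp.json()
print("Registered subjects:", subjects)

# Optionally inspect the latest Avro schema of the first subject.
if subjects:
    latest = requests.get(
        f"{REGISTRY_URL}/subjects/{subjects[0]}/versions/latest", timeout=10
    ).json()
    print("Latest schema for", subjects[0], ":", latest.get("schema"))
```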