| Section | Parameter | Description |
| --- | --- | --- |
| N/A | Task Name | The name of the task. DTS automatically assigns a name to the task. We recommend that you specify a descriptive name that makes the task easy to identify. The task name does not need to be unique. |
| Source Database | Select an existing DMS database instance | The database instance that you want to use. You can select an existing instance based on your business requirements. If you select an existing instance, DTS automatically populates the parameters for the database. Otherwise, you must configure the parameters manually. |
| | Database Type | The type of the source database. Select PolarDB-X 2.0. |
| | Access Method | The access method of the source database. Select Alibaba Cloud Instance. |
| | Instance Region | The region in which the source PolarDB-X instance resides. |
| | Instance ID | The ID of the source PolarDB-X instance. |
| | Database Account | The database account of the source PolarDB-X instance. For information about the permissions required for the account, see the Permissions required for database accounts section of this topic. |
| | Database Password | The password of the database account. |
| Destination Database | Select an existing DMS database instance | The database instance that you want to use. You can select an existing instance based on your business requirements. If you select an existing instance, DTS automatically populates the parameters for the database. Otherwise, you must configure the parameters manually. |
| | Database Type | The type of the destination database. Select Kafka. |
| | Access Method | The access method of the destination database. Select Express Connect, VPN Gateway, or Smart Access Gateway. Note: DTS does not provide Message Queue for Apache Kafka as an access method. To configure data migration, treat the Message Queue for Apache Kafka instance as a self-managed Kafka cluster. |
| | Instance Region | The region in which the destination Message Queue for Apache Kafka instance resides. |
| | Connected VPC | The ID of the virtual private cloud (VPC) to which the destination Message Queue for Apache Kafka instance belongs. To obtain the VPC ID, log on to the Message Queue for Apache Kafka console and go to the Instance Details page of the instance. The VPC ID is displayed in the Configuration Information section of the Instance Information tab. |
| | IP Address or Domain Name | An IP address of the Message Queue for Apache Kafka instance. Note: To obtain an IP address, log on to the Message Queue for Apache Kafka console, go to the Instance Details page of the instance, and then obtain an IP address from the Default Endpoint parameter in the Endpoint Information section of the Instance Information tab. |
| | Port Number | The service port number of the destination Message Queue for Apache Kafka instance. Default value: 9092. |
| | Database Account | The database account of the destination Message Queue for Apache Kafka instance. Note: If the Message Queue for Apache Kafka instance is a VPC-connected instance, you do not need to set the Database Account or Database Password parameter. |
| | Database Password | The password of the database account. |
| | Kafka Version | The version of the destination Message Queue for Apache Kafka instance. |
| | Encryption | Specifies whether to encrypt the connection. Select Non-encrypted or SCRAM-SHA-256 based on your business and security requirements. |
| | Topic | The topic that receives the migrated data. Select a topic from the drop-down list. |
| | Topic That Stores DDL Information | The topic that stores the DDL information. Select a topic from the drop-down list. If you do not set this parameter, the DDL information is stored in the topic that is specified by the Topic parameter. |
| | Use Kafka Schema Registry | Specifies whether to use Kafka Schema Registry, which provides a serving layer for your metadata and a RESTful API for storing and retrieving your Avro schemas. Valid values: No (Kafka Schema Registry is not used) and Yes (Kafka Schema Registry is used; in this case, you must enter the URL or IP address that is registered in Kafka Schema Registry for your Avro schemas). |
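For reference, when Encryption is set to SCRAM-SHA-256, a Kafka client that connects to the same cluster typically needs matching SASL/SCRAM settings. The following is a minimal sketch of standard Apache Kafka client properties, assuming a placeholder endpoint (`192.168.0.10:9092`) and placeholder credentials; whether `security.protocol` is `SASL_SSL` or `SASL_PLAINTEXT` depends on whether your cluster also uses TLS:

```properties
# Placeholder endpoint: use the IP Address or Domain Name and Port Number
# values from the table above.
bootstrap.servers=192.168.0.10:9092
# Assumption: SASL_SSL if the cluster also uses TLS; otherwise SASL_PLAINTEXT.
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
# Placeholder credentials: the Database Account and Database Password values.
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="dts_user" \
  password="****";
```

This fragment only illustrates how the Encryption, Database Account, and Database Password parameters map onto standard Kafka client settings; it is not generated by DTS.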