AnalyticDB for MySQL allows you to import data from various data sources, such as ApsaraDB RDS for MySQL, ApsaraDB for MongoDB, Object Storage Service (OSS), MaxCompute, and Kafka, into data warehouses or data lakes. You can select an import method based on the data source.
Import data to data warehouses
| Category | Data source | Import method | Edition | References |
| --- | --- | --- | --- | --- |
| Database | ApsaraDB RDS for MySQL | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Database | ApsaraDB RDS for MySQL | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Use external tables to import data to Data Lakehouse Edition |
| Database | ApsaraDB RDS for MySQL | Data Transmission Service (DTS) | Data Warehouse Edition | |
| Database | ApsaraDB RDS for MySQL | Data Transmission Service (DTS) | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | ApsaraDB RDS for MySQL | DataWorks | Data Warehouse Edition | |
| Database | ApsaraDB RDS for MySQL | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | ApsaraDB RDS for MySQL | Zero-ETL | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DTS | Data Warehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DTS | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DataWorks | Data Warehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | PolarDB-X | DTS | Data Warehouse Edition | |
| Database | PolarDB-X | DTS | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | PolarDB-X | DataWorks | Data Warehouse Edition | |
| Database | PolarDB-X | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | PolarDB-X | One-stop synchronization | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Automatically synchronize the metadata of a PolarDB-X instance to a Data Lakehouse Edition cluster |
| Database | PolarDB for MySQL | Federated analytics | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Use federated analytics to synchronize data to Data Lakehouse Edition |
| Database | PolarDB for MySQL | DTS | Data Warehouse Edition | |
| Database | PolarDB for MySQL | DTS | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | PolarDB for MySQL | Zero-ETL | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | ApsaraDB for MongoDB | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | Lindorm | Zero-ETL | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | Oracle | DataWorks | Data Warehouse Edition | |
| Database | Oracle | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Database | Self-managed MySQL database | External table | Data Warehouse Edition | |
| Database | Self-managed HBase database | DTS | Data Warehouse Edition | |
| Storage | OSS | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Storage | OSS | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Use external tables to import data to Data Lakehouse Edition |
| Storage | OSS | DataWorks | Data Warehouse Edition | |
| Storage | OSS | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Storage | Tablestore | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Storage | HDFS | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Storage | HDFS | DataWorks | Data Warehouse Edition | |
| Storage | HDFS | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Big data | MaxCompute | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Big data | MaxCompute | External table | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Use external tables to import data to Data Lakehouse Edition |
| Big data | MaxCompute | DataWorks | Data Warehouse Edition | |
| Big data | MaxCompute | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Big data | Flink | Flink | Data Warehouse Edition | |
| Message queue | Kafka | Logstash | Data Warehouse Edition | |
| Message queue | Kafka | DataWorks | Data Warehouse Edition | |
| Message queue | Kafka | DataWorks | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | |
| Log data | Log data | Logstash | Data Warehouse Edition | |
| Log data | Simple Log Service | Simple Log Service | Data Warehouse Edition | Use Simple Log Service to import data to Data Warehouse Edition |
| Log data | Simple Log Service | Data synchronization | Data Warehouse Edition | Use data synchronization to synchronize data from Simple Log Service to Data Warehouse Edition |
| Log data | Simple Log Service | Data synchronization | Enterprise Edition, Basic Edition, or Data Lakehouse Edition | Use data synchronization to synchronize data from Simple Log Service to Data Lakehouse Edition |
| On-premises data | On-premises data | LOAD DATA | Data Warehouse Edition | |
| On-premises data | On-premises data | Import tool | Data Warehouse Edition | Use the AnalyticDB for MySQL import tool to import data to Data Warehouse Edition |
| On-premises data | On-premises data | Kettle | Data Warehouse Edition | |
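Most of the External table entries in the preceding table follow the same pattern: create a mapping table inside the AnalyticDB for MySQL cluster, then copy data with an INSERT ... SELECT statement. The following is a minimal sketch of that pattern for an ApsaraDB RDS for MySQL source, driven from Python over the MySQL protocol. The endpoints, credentials, database names, and table names are placeholders, and the exact external-table properties can vary by edition, so verify the statements against the referenced topics before use.

```python
# Minimal sketch: import data from ApsaraDB RDS for MySQL into an
# AnalyticDB for MySQL table through an external (mapping) table.
# All endpoints, credentials, and table names below are placeholders.
import pymysql

# Connect to the AnalyticDB for MySQL cluster (MySQL wire protocol).
conn = pymysql.connect(
    host="am-xxxx.ads.aliyuncs.com",   # placeholder cluster endpoint
    port=3306,
    user="adb_user",
    password="adb_password",
    database="adb_demo",
)

# 1. Create an external table that maps to the source RDS for MySQL table.
#    The ENGINE and TABLE_PROPERTIES layout follows the external-table
#    syntax described in the referenced topics; verify it for your edition.
create_external = """
CREATE TABLE IF NOT EXISTS ext_orders (
  order_id BIGINT,
  customer_id BIGINT,
  amount DECIMAL(10, 2),
  created_at DATETIME
) ENGINE = 'mysql'
TABLE_PROPERTIES = '{
  "url":"jdbc:mysql://rm-xxxx.mysql.rds.aliyuncs.com:3306/source_db",
  "tablename":"orders",
  "username":"rds_user",
  "password":"rds_password"
}'
"""

# 2. Create the local target table and copy the data across.
create_target = """
CREATE TABLE IF NOT EXISTS orders (
  order_id BIGINT,
  customer_id BIGINT,
  amount DECIMAL(10, 2),
  created_at DATETIME,
  PRIMARY KEY (order_id)
) DISTRIBUTED BY HASH(order_id)
"""

copy_data = "INSERT INTO orders SELECT * FROM ext_orders"

with conn.cursor() as cur:
    cur.execute(create_external)
    cur.execute(create_target)
    cur.execute(copy_data)
conn.commit()
conn.close()
```

The same two-step pattern generally applies to OSS, MaxCompute, and HDFS external tables; only the ENGINE value and TABLE_PROPERTIES of the mapping table change, as described in the referenced external-table topics.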
Import data to data lakes
When you import data to data lakes, data is written to the specified OSS bucket in the Hudi format.
You can import data to data lakes only in AnalyticDB for MySQL Enterprise Edition, Basic Edition, or Data Lakehouse Edition clusters.
| Category | Data source | Import method | References |
| --- | --- | --- | --- |
| Message queue | Kafka | Data synchronization | |
| Log data | Simple Log Service | Data synchronization | |
| Big data | Hive | Data migration | |
| Storage | OSS | Metadata discovery | Use metadata discovery to import data to Data Lakehouse Edition |
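Because these methods write the lake data to the specified OSS bucket as Hudi tables, the resulting files can in principle also be read by other Hudi-aware engines. The following is a minimal sketch, assuming a Spark environment that already has the Apache Hudi bundle and an OSS-compatible Hadoop connector configured; the bucket and table path are placeholders and do not correspond to a real cluster.

```python
# Minimal sketch: inspect lake data that an import job has written to OSS
# in Hudi format, using a Spark session whose environment already provides
# the Hudi bundle and OSS connector settings. Paths are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("read-adb-lake-hudi")
    .getOrCreate()
)

# Placeholder OSS path to a Hudi table produced by a lake import job.
hudi_path = "oss://my-lake-bucket/lakehouse/db_name/table_name"

df = spark.read.format("hudi").load(hudi_path)
df.printSchema()
df.show(10)
```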
References
AnalyticDB for MySQL allows you to asynchronously submit data import jobs. For more information, see Asynchronously submit an import job.
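In practice, asynchronous submission wraps an INSERT OVERWRITE ... SELECT statement in SUBMIT JOB so that the statement returns a job ID immediately instead of blocking the client. The following minimal sketch reuses the placeholder connection settings and table names from the earlier example; see the referenced topic for the complete usage, including how to check the status of the returned job.

```python
# Minimal sketch: submit an import job asynchronously so the client is not
# blocked while the data is copied. Connection details and table names are
# placeholders taken from the earlier example.
import pymysql

conn = pymysql.connect(
    host="am-xxxx.ads.aliyuncs.com",   # placeholder cluster endpoint
    port=3306,
    user="adb_user",
    password="adb_password",
    database="adb_demo",
)

async_import = "SUBMIT JOB INSERT OVERWRITE INTO orders SELECT * FROM ext_orders"

with conn.cursor() as cur:
    cur.execute(async_import)
    job_id = cur.fetchone()[0]   # the statement returns a job ID
    print("Submitted import job:", job_id)
    # Check the job status later by using the returned job ID, as described
    # in the referenced topic.

conn.close()
```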