AnalyticDB for MySQL allows you to import data from various sources, such as ApsaraDB RDS for MySQL, ApsaraDB for MongoDB, Object Storage Service (OSS), MaxCompute, and Kafka, into data warehouses or data lakes. Select an import method based on the data source.
Import data to data warehouses
| Category | Data source | Import method | Edition | References |
| --- | --- | --- | --- | --- |
| Database | ApsaraDB RDS for MySQL | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Database | ApsaraDB RDS for MySQL | External table | Data Lakehouse Edition | Use external tables to import data to Data Lakehouse Edition |
| Database | ApsaraDB RDS for MySQL | Data Transmission Service (DTS) | Data Warehouse Edition | |
| Database | ApsaraDB RDS for MySQL | DTS | Data Lakehouse Edition | |
| Database | ApsaraDB RDS for MySQL | DataWorks | Data Warehouse Edition | |
| Database | ApsaraDB RDS for MySQL | DataWorks | Data Lakehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DTS | Data Warehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DTS | Data Lakehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DataWorks | Data Warehouse Edition | |
| Database | ApsaraDB RDS for SQL Server | DataWorks | Data Lakehouse Edition | |
| Database | PolarDB-X | DTS | Data Warehouse Edition | |
| Database | PolarDB-X | DTS | Data Lakehouse Edition | |
| Database | PolarDB-X | DataWorks | Data Warehouse Edition | |
| Database | PolarDB-X | DataWorks | Data Lakehouse Edition | |
| Database | PolarDB for MySQL | Federated analytics | Data Lakehouse Edition | Use federated analytics to synchronize data to Data Lakehouse Edition |
| Database | PolarDB for MySQL | DTS | Data Warehouse Edition | |
| Database | PolarDB for MySQL | DTS | Data Lakehouse Edition | |
| Database | ApsaraDB for MongoDB | External table | Data Lakehouse Edition | |
| Database | Oracle | DataWorks | Data Warehouse Edition | |
| Database | Oracle | DataWorks | Data Lakehouse Edition | |
| Database | Self-managed MySQL database | External table | Data Warehouse Edition | |
| Database | Self-managed HBase database | DTS | Data Warehouse Edition | |
| Storage | OSS | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Storage | OSS | External table | Data Lakehouse Edition | Use external tables to import data to Data Lakehouse Edition |
| Storage | OSS | DataWorks | Data Warehouse Edition | |
| Storage | OSS | DataWorks | Data Lakehouse Edition | |
| Storage | Tablestore | External table | Data Lakehouse Edition | |
| Storage | Apsara File Storage for HDFS | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Storage | Apsara File Storage for HDFS | DataWorks | Data Warehouse Edition | |
| Storage | Apsara File Storage for HDFS | DataWorks | Data Lakehouse Edition | |
| Big data | MaxCompute | External table | Data Warehouse Edition | Use external tables to import data to Data Warehouse Edition |
| Big data | MaxCompute | External table | Data Lakehouse Edition | Use external tables to import data to Data Lakehouse Edition |
| Big data | MaxCompute | DataWorks | Data Warehouse Edition | |
| Big data | MaxCompute | DataWorks | Data Lakehouse Edition | |
| Big data | Flink | Flink | Data Warehouse Edition | |
| ApsaraMQ | Kafka | Logstash | Data Warehouse Edition | |
| ApsaraMQ | Kafka | DataWorks | Data Warehouse Edition | |
| ApsaraMQ | Kafka | DataWorks | Data Lakehouse Edition | |
| Log data | Log data | Logstash | Data Warehouse Edition | |
| Log data | Simple Log Service | Simple Log Service | Data Warehouse Edition | Use Simple Log Service to import data to Data Warehouse Edition |
| Log data | Simple Log Service | Data synchronization | Data Warehouse Edition | Use data synchronization to synchronize data from Simple Log Service to Data Warehouse Edition |
| Log data | Simple Log Service | Data synchronization | Data Lakehouse Edition | Use data synchronization to synchronize data from Simple Log Service to Data Lakehouse Edition |
| Local data | Local data | LOAD DATA | Data Warehouse Edition | |
| Local data | Local data | Import tool | Data Warehouse Edition | Use the AnalyticDB for MySQL import tool to import data to Data Warehouse Edition |
| Local data | Local data | Kettle | Data Warehouse Edition | |
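As an illustration of the external table method listed above, the following sketch maps an ApsaraDB RDS for MySQL table as an external table and copies its data into a local table. All table names, endpoints, and credentials are placeholders, and the exact `TABLE_PROPERTIES` fields depend on your cluster edition and network configuration; check the referenced external table topics before use.

```sql
-- Hypothetical sketch: map an RDS for MySQL table as an external table.
-- The endpoint, database, table, and credentials below are placeholders.
CREATE TABLE rds_orders_ext (
  order_id BIGINT,
  amount DECIMAL(10, 2)
) ENGINE = 'mysql'
TABLE_PROPERTIES = '{
  "url": "jdbc:mysql://rm-example.mysql.rds.aliyuncs.com:3306/demo_db",
  "tablename": "orders",
  "username": "demo_user",
  "password": "demo_password"
}';

-- Import by selecting from the external table into a local table.
INSERT INTO orders_local
SELECT order_id, amount FROM rds_orders_ext;
```

The same pattern applies to the other external table sources in the table, such as OSS and MaxCompute, with engine-specific properties.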
Import data to data lakes
Importing data to data lakes is supported only for AnalyticDB for MySQL Data Lakehouse Edition clusters.
When you import data to a data lake, the data is written to the specified OSS bucket in the Hudi format.
| Category | Data source | Import method | References |
| --- | --- | --- | --- |
| ApsaraMQ | Kafka | Data synchronization | |
| Log data | Simple Log Service | Data synchronization | |
| Big data | Hive | Data migration | |
References
AnalyticDB for MySQL allows you to asynchronously submit data import jobs. For more information, see Asynchronously submit an import job.
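For long-running imports, such as those from OSS or MaxCompute external tables, submitting the job asynchronously lets the statement return a job ID immediately instead of blocking until the import finishes. A minimal sketch, with placeholder table names:

```sql
-- Submit the import job asynchronously; the statement returns a job ID
-- that you can use to check the job status later.
SUBMIT JOB INSERT OVERWRITE INTO target_table
SELECT * FROM oss_orders_ext;
```

See the referenced topic for the exact syntax and for how to query the status of a submitted job.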