Use the Data Import feature in Data Management Service (DMS) to quickly import large volumes of data from files (SQL, CSV, Excel) into your database.
Prerequisites
The database is one of the following types:
MySQL: RDS for MySQL, PolarDB for MySQL, MyBase for MySQL, PolarDB-X, AnalyticDB for MySQL, or other MySQL sources.
SQL Server: RDS for SQL Server, MyBase for SQL Server, or other SQL Server sources.
PostgreSQL: RDS for PostgreSQL, PolarDB for PostgreSQL, MyBase for PostgreSQL, AnalyticDB for PostgreSQL, or other PostgreSQL sources.
MariaDB: RDS for MariaDB or other MariaDB sources.
OceanBase.
PolarDB for PostgreSQL (Compatible with Oracle).
DM.
DB2.
Oracle.
Redis.
MongoDB.
You are logged on to the database instance in the DMS console.
You have change permissions on the destination database.
Limitations
The size of a single imported SQL, CSV, or Excel file cannot exceed 5 GB.
DMS supports uploading ZIP attachments. To import multiple files in a single ticket, compress them into a single .zip archive before uploading.
You cannot modify multiple databases in a single Data Import ticket. To modify multiple databases, submit a normal data modify ticket.
For minor data changes, use a normal data modify or DML lock-free change ticket to ensure stability.
If a batch data import script contains schema change statements, these statements are executed in native mode. They are not executed in lock-free mode, even if lock-free schema change is enabled for the instance.
To avoid affecting database performance during a batch data import, use efficient SQL statements, such as INSERT statements, and UPDATE or DELETE statements that use a primary key (a hedged example follows these limitations).
The imported CSV or Excel file (see Sample files) must contain only field names and field values. Any other content in the file causes the import to fail.
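For reference, the following is a minimal sketch of the kind of primary-key-based statements that keep a batch import efficient. The table and column names (orders, id, customer_id, status) are illustrative placeholders, not part of any real schema.

```sql
-- Illustrative only: assumes a table named `orders` with primary key `id`.
-- Multi-row INSERT statements reduce round trips compared with one INSERT per row.
INSERT INTO orders (id, customer_id, status) VALUES
  (1001, 17, 'paid'),
  (1002, 23, 'pending'),
  (1003, 31, 'paid');

-- UPDATE and DELETE statements that filter on the primary key avoid full table scans.
UPDATE orders SET status = 'shipped' WHERE id = 1001;
DELETE FROM orders WHERE id = 1003;
```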
Procedure
This example uses an ApsaraDB RDS for MySQL database in Security Collaboration control mode.
- Log on to the DMS console V5.0.
Move the pointer over the icon in the upper-left corner of the DMS console and choose the data import entry.
Note: If you use the DMS console in normal mode, choose the corresponding entry in the top navigation bar.
On the Data Change Tickets page, configure the parameters. The following table describes some of the key parameters.
Parameter
Description
Database
Search for the destination database. You can select only one database.
Note: To make database instances in Flexible Management or Stable Change control mode appear in the drop-down list, log on to them in DMS first.
You can also search by entering the database connection endpoint, for example, test_db@rm-2zegs****.mysql.rds.aliyuncs.com:3306.
Business Background
State the business reason for the import to streamline the review process.
File Encoding
The character encoding of the import file. The system selects Automatic Identification by default. Other values include:
UTF-8
GBK
ISO-8859-1
Import Mode
The data import mode. Two modes are supported:
Speed Mode: Reads the file during the execution phase and directly runs the SQL statements against the specified destination database. This mode is faster but less secure than Security Mode.
Note: By default, security rules do not allow data import in Speed Mode. An administrator or DBA can enable this check item in the security rules for the instance.
Security Mode: Parses the file and caches the SQL or CSV data during the pre-check phase, then executes the task from the cache. This mode is more secure but slower than Speed Mode.
File Type
Select the type of the import file:
SQL Script: In Security Collaboration mode, only the INSERT and REPLACE command types are allowed by default. To change this, a DBA or administrator can adjust the setting in Security Rules > SQL Change > Batch Data Import.
CSV: The delimiter in the file must be a comma.
Excel: The Excel file can have a header row (for column names) or contain only data.
Note: For CSV or Excel files, the target table must already exist.
Destination Table Name
Select the destination table for the data import.
Note: This parameter is available only when you set File Type to CSV or Excel.
Data Location
Specify the data's starting position:
1st behavioral attributes: The first row of the table contains field names.
1st behavioral data: The first row of the table contains data.
Note: This parameter is available only when you set File Type to CSV or Excel.
Write Mode
The data writing method:
INSERT: The database checks the primary key during insertion. An error is reported if a duplicate key is found.
INSERT_IGNORE: If a record with the same key already exists, the new data is ignored.
REPLACE_INTO: If a row in the table has the same primary key or unique index value, the existing row is deleted and then the new row is inserted.
Note: This parameter is available only when you set File Type to CSV or Excel.
RDS for MySQL, PolarDB for MySQL, PolarDB-X, AnalyticDB for MySQL, and OceanBase databases support all three write methods: INSERT INTO, INSERT IGNORE, and REPLACE INTO. Other databases support only INSERT INTO. A hedged example of the three write methods follows the parameter table.
Attachment
Supports Upload File (local file) and Upload OSS Object.
To upload a file from OSS, you must first add the target OSS data source in DMS. After that, click Upload OSS Object, select the target bucket, and enter the file path, such as examplefolder/example.sql.
Note: Supported file types are SQL, CSV, TXT, XLSX, and ZIP.
The attachment size cannot exceed 5 GB.
To add an OSS data source, see Add an ApsaraDB database.
Other Options (Optional)
Select whether to ignore exceptions.
Unchecked (Default): When an exception occurs, the system stops executing subsequent SQL statements and reports an error.
Checked: The system ignores exceptions and continues executing subsequent SQL statements.
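To illustrate how the three write modes behave when a duplicate primary key is encountered, here is a minimal MySQL-flavored sketch. The table `t`, its columns, and the pre-existing row are hypothetical.

```sql
-- Hypothetical table with a primary key; assume a row (1, 'old') already exists.

-- INSERT: fails with a duplicate-key error because id 1 already exists.
INSERT INTO t (id, val) VALUES (1, 'new');

-- INSERT IGNORE: the conflicting row is skipped; the existing (1, 'old') row is kept.
INSERT IGNORE INTO t (id, val) VALUES (1, 'new');

-- REPLACE INTO: the existing row with id 1 is deleted, then (1, 'new') is inserted.
REPLACE INTO t (id, val) VALUES (1, 'new');
```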
Click Submit and wait for the precheck to complete. If the precheck fails, review the error message, resolve the issue, and click Retry.
Note: The system prechecks the uploaded SQL file. If you upload a CSV file, the system generates the corresponding INSERT statements.
If an error occurs during the type check in the precheck phase, you can adjust the security rules that are associated with the instance based on the error message. For more information, see Data change.
In the Approval section, click Submit for Approval, and then click OK in the Prompt dialog box.
After the ticket is approved, in the Execute section, click Execute Change.
In the Task Settings dialog box, select an execution method.
The execution methods are as follows:
Running Immediately: Default. The task executes immediately after you click Confirm Execution.
Schedule: Select a start time for the task. The task executes automatically at the specified time after you click Confirm Execution.
Note: During execution, the system reads the parsed SQL in a stream and executes it in batches in the database. The size of each batch is 1 MB.
You can view the task status, SQL check details, and scheduling logs in the Execution section.
When a paused task is restarted, its behavior depends on the import mode:
Speed Mode: The script execution or data file import starts from the beginning.
Security Mode: The script execution or data file import resumes from the pause point.
A completion message appears when the task finishes.
(Optional) Query the imported data.
Go to the SQL Console page to query the imported data. For more information, see Get started with SQL Console.
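As a quick sanity check after the import, you can run queries such as the following in the SQL Console. The table name `orders` and the `id` column are placeholders for your own destination table.

```sql
-- Illustrative verification queries; replace `orders` with your destination table.
SELECT COUNT(*) FROM orders;                      -- compare with the row count of the source file
SELECT * FROM orders ORDER BY id DESC LIMIT 10;   -- spot-check a few imported rows (assumes an `id` column)
```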
Sample files
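As an illustration of the layout described in the Limitations section, a minimal CSV import file contains only a header row of field names followed by data rows. The file, table, and values below are hypothetical examples.

```csv
id,name,price
1,apple,3.50
2,banana,1.20
```

Assuming a destination table like the following already exists, the precheck would generate INSERT statements along these lines (a hedged illustration of the behavior described in the procedure, not the exact output):

```sql
-- Hypothetical destination table; it must exist before a CSV or Excel import.
CREATE TABLE fruit (
  id INT PRIMARY KEY,
  name VARCHAR(64),
  price DECIMAL(10, 2)
);

-- INSERT statements of the kind generated from the CSV rows above (illustrative).
INSERT INTO fruit (id, name, price) VALUES (1, 'apple', 3.50);
INSERT INTO fruit (id, name, price) VALUES (2, 'banana', 1.20);
```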
FAQ
Q: What should I do if a "permission denied for schema XXXXX" error occurs during data import?
A: This error indicates that the database account used to log on to DMS does not have sufficient permissions. You can try to log on to DMS with a privileged database account and retry the import, or try to import the data to another public database. To switch database accounts, see Edit instance information.
If the issue persists, you can use Data Transmission Service (DTS) to migrate data. For more information, see Overview of migration solutions.
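For background, this error is typically raised by PostgreSQL-compatible databases when the logon account lacks privileges on the schema. If you administer the database directly, granting the privileges to the account is another option; the following is a minimal sketch assuming a PostgreSQL-compatible destination, with placeholder schema and account names.

```sql
-- Hypothetical: grant the DMS logon account access to the target schema.
GRANT USAGE ON SCHEMA example_schema TO example_user;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA example_schema TO example_user;
```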
Q: What should I do if the data import ticket fails the precheck and a message indicates that submitting XXXX change statements is not allowed?
A: An administrator or DBA must modify the SQL change security rules to allow that SQL type during import. The following example shows how to allow CREATE_TABLE statements:
In the DMS console, go to the Security Rules page.
Edit the destination rule. On the rule details page, click SQL Correct in the left-side pane.
Select Batch Data Import Rules and click Edit to the right of the Allow bulk import of insert statements rule.
Add the CREATE_TABLE SQL type and click Submit. For more information, see SQL Change.
After the rule is modified, return to the data import ticket details page and click Retry in the Precheck section.
Q: What should I do if the file to be uploaded is larger than 5 GB?
A: You can choose one of the following methods:
Split the file into smaller files and submit multiple tickets to upload them.
Use Navicat to remotely connect to the database and then upload the file.
Q: How can I delete data that was imported incorrectly?
A: The deletion method depends on the data volume.
Small amount of data
You can execute a DELETE statement in the DMS SQL console (a hedged example appears after this answer). For more information, see Use SQL statements to change table data.
Large amount of data
You can submit a Normal Data Modify or DML Lock-Free Change ticket to delete the table data. The DML Lock-Free Change feature lets you change large amounts of table data without locking the table. For more information, see Normal Data Modify and DML Lock-Free Change.
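For the small-data case, the following is a minimal sketch of such a cleanup, assuming the incorrectly imported rows can be identified by a primary-key range. The table and column names are placeholders.

```sql
-- Illustrative only: remove rows imported by mistake, identified here by a primary-key range.
DELETE FROM orders WHERE id BETWEEN 1001 AND 2000;

-- Verify the result with a count on the same condition before and after the deletion.
SELECT COUNT(*) FROM orders WHERE id BETWEEN 1001 AND 2000;
```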