Data Management: Data Import

Last Updated: Jan 08, 2026

Use the Data Import feature in Data Management Service (DMS) to quickly import large volumes of data from files (SQL, CSV, Excel) into your database.

Prerequisites

  • The database is one of the following types:

    • MySQL: RDS for MySQL, PolarDB for MySQL, MyBase for MySQL, PolarDB-X, AnalyticDB for MySQL, or other MySQL sources.

    • SQL Server: RDS for SQL Server, MyBase for SQL Server, or other SQL Server sources.

    • PostgreSQL: RDS for PostgreSQL, PolarDB for PostgreSQL, MyBase for PostgreSQL, AnalyticDB for PostgreSQL, or other PostgreSQL sources.

    • MariaDB: RDS for MariaDB or other MariaDB sources.

    • OceanBase.

    • PolarDB for PostgreSQL (Compatible with Oracle).

    • DM.

    • DB2.

    • Oracle.

    • Redis.

    • MongoDB.

  • You are logged on to the database instance in the DMS console.

  • You have the change permissions for the destination database.

Limitations

  • The size of a single imported SQL, CSV, or Excel file cannot exceed 5 GB.

  • DMS supports uploading ZIP attachments. To import multiple files in a single ticket, compress them into a single .zip archive before uploading.

  • You cannot modify multiple databases in a single Data Import ticket. To modify multiple databases, submit a normal data modify ticket.

  • For minor data changes, use a normal data modify or DML lock-free change ticket to ensure stability.

  • If a batch data import script contains schema change statements, these statements are executed in native mode. They are not executed in lock-free mode, even if lock-free schema change is enabled for the instance.

  • To avoid affecting database performance during a batch data import, use efficient SQL statements, such as INSERT statements, and UPDATE or DELETE statements that use a primary key.

  • The imported CSV or Excel file must contain only field names and field values. Any other content in the file causes the import to fail.
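For illustration, a minimal CSV file in the expected layout (a header row of field names, then only field values) can be produced and checked with Python's csv module. The file and column names here are hypothetical:

```python
# Sketch of the expected CSV layout: one header row of field names,
# followed only by rows of field values -- no titles, comments, or
# summary rows anywhere in the file.
import csv

with open("users.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name", "email"])            # field names
    writer.writerow([1, "alice", "alice@example.com"])  # field values only
    writer.writerow([2, "bob", "bob@example.com"])

# Read it back to confirm the layout.
with open("users.csv", newline="") as f:
    rows = list(csv.reader(f))
print(rows[0])  # ['id', 'name', 'email']
```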

Procedure

Note

This example uses an ApsaraDB RDS for MySQL database in Security Collaboration control mode.

  1. Log on to the DMS console V5.0.
  2. Move the pointer over the icon in the upper-left corner of the DMS console and choose All Features > Database Development > Data Change > Data Import.

    Note

    If you use the DMS console in normal mode, choose Database Development > Data Change > Data Import in the top navigation bar.

  3. On the Data Change Tickets page, configure the parameters. The following table describes some of the key parameters.

    Parameter

    Description

    Database

    Search for the destination database. You can select only one database.

    Note
    • To make database instances in Flexible Management or Stable Change control mode appear in the drop-down list, log on to them in DMS first.

    • You can also search by entering the database connection endpoint, for example, test_db@rm-2zegs****.mysql.rds.aliyuncs.com:3306.

    Business Background

    State the business reason for the import to streamline the review process.

    File Encoding

    The character encoding of the import file. The system selects Automatic Identification by default. Other options include:

    • UTF-8

    • GBK

    • ISO-8859-1

    Import Mode

    The data import mode. Two modes are supported:

    • Speed Mode: Reads the file during the execution phase and directly runs the SQL statements against the specified destination database. This mode is faster but less secure than Security Mode.

      Note

      By default, security rules do not allow data import in Speed Mode. An administrator or DBA can enable this check item in Security Rules > SQL Correct > Basic Configuration Item for the instance.

    • Security Mode: Parses the file and caches the SQL or CSV data during the pre-check phase, then executes the task from the cache. This mode is more secure but slower than Speed Mode.

    File Type

    Select the type of the import file:

    • SQL Script: In Security Collaboration mode, only INSERT and REPLACE command types are allowed by default. To change this, a DBA or administrator can adjust this in Security Rules > SQL Change > Batch Data Import.

    • CSV: The delimiter in the file must be a comma.

    • Excel: The Excel file can have a header row (for column names) or contain only data.

    Note

    For CSV or Excel files, the target table must already exist.

    Destination Table Name

    Select the destination table for the data import.

    Note

    This parameter is available only when you set File Type to CSV or Excel.

    Data Location

    Specify the data's starting position:

    • First Row Contains Field Names: The first row of the file contains field names, and the data starts from the second row.

    • First Row Contains Data: The data starts from the first row of the file.

    Note

    This parameter is available only when you set File Type to CSV or Excel.

    Write Mode

    The data writing method:

    • INSERT: The database checks the primary key during insertion. An error is reported if a duplicate key is found.

    • INSERT_IGNORE: If a record with the same key already exists, the new data is ignored.

    • REPLACE_INTO: If a row in the table has the same primary key or unique index value, the existing row is deleted and then the new row is inserted.

    Note
    • This parameter is available only when you set File Type to CSV or Excel.

    • RDS for MySQL, PolarDB for MySQL, PolarDB-X, AnalyticDB for MySQL, and OceanBase databases support all three write modes: INSERT, INSERT_IGNORE, and REPLACE_INTO. Other databases support only INSERT.
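The difference between the three write modes can be sketched with SQLite's INSERT, INSERT OR IGNORE, and INSERT OR REPLACE as stand-ins for MySQL's INSERT, INSERT IGNORE, and REPLACE INTO. The table and values are hypothetical; the exact syntax differs by database, but the conflict behavior is analogous:

```python
# Sketch of the three write modes' behavior on a duplicate primary key.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'old')")

# INSERT: a duplicate primary key raises an error.
try:
    conn.execute("INSERT INTO t VALUES (1, 'new')")
except sqlite3.IntegrityError:
    pass  # duplicate key reported as an error

# INSERT_IGNORE: the conflicting new row is silently skipped.
conn.execute("INSERT OR IGNORE INTO t VALUES (1, 'new')")
print(conn.execute("SELECT val FROM t WHERE id = 1").fetchone())  # ('old',)

# REPLACE_INTO: the existing row is deleted, then the new row is inserted.
conn.execute("INSERT OR REPLACE INTO t VALUES (1, 'new')")
print(conn.execute("SELECT val FROM t WHERE id = 1").fetchone())  # ('new',)
```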

    Attachment

    Supports Upload File (local file) and Upload OSS Object.

    To upload a file from OSS, you must first add the target OSS data source in DMS. After that, click Upload OSS Object, select the target bucket, and enter the file path, such as examplefolder/example.sql.

    Note
    • Supported file types are SQL, CSV, TXT, XLSX, and ZIP.

    • The attachment size cannot exceed 5 GB.

    • To add an OSS data source, see Add an ApsaraDB database.

    Other Options (Optional)

    Select whether to ignore exceptions.

    • Unchecked (Default): When an exception occurs, the system stops executing subsequent SQL statements and reports an error.

    • Checked: The system ignores exceptions and continues executing subsequent SQL statements.
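The two ignore-exception behaviors amount to stop-on-first-error versus skip-and-continue. The sketch below illustrates this; run_batch and fake_execute are hypothetical stand-ins, not DMS internals:

```python
# Sketch of the "ignore exceptions" option: stop at the first failing
# statement (default), or record the failure and continue.
def run_batch(statements, execute, ignore_errors=False):
    executed, errors = 0, []
    for stmt in statements:
        try:
            execute(stmt)
            executed += 1
        except Exception as exc:
            errors.append((stmt, exc))
            if not ignore_errors:
                break  # unchecked (default): stop on the first error
    return executed, errors

def fake_execute(stmt):
    """Stand-in executor that fails on a marker keyword."""
    if "bad" in stmt:
        raise ValueError("syntax error")

stmts = ["INSERT 1", "bad stmt", "INSERT 2"]
print(run_batch(stmts, fake_execute)[0])                      # 1
print(run_batch(stmts, fake_execute, ignore_errors=True)[0])  # 2
```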

  4. Click Submit and wait for the precheck to complete. If the precheck fails, review the error message, resolve the issue, and click Retry.

    Note
    • The system prechecks the uploaded SQL file. If you upload a CSV file, the system generates the corresponding INSERT statements.

    • If an error occurs during the type check in the precheck phase, you can adjust the security rules that are associated with the instance based on the error message. For more information, see Data change.
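The note above mentions that for a CSV upload the system generates the corresponding INSERT statements. A rough, hypothetical sketch of such a conversion follows, assuming the first row holds field names; the actual DMS implementation may differ, for example in quoting and type handling:

```python
# Hypothetical sketch: turn CSV rows into INSERT statements, treating
# the first row as the column list and every value as a string literal.
import csv
import io

def csv_to_inserts(text, table):
    rows = list(csv.reader(io.StringIO(text)))
    cols = ", ".join(rows[0])
    stmts = []
    for row in rows[1:]:
        # Escape single quotes by doubling them, SQL-style.
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        stmts.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return stmts

print(csv_to_inserts("id,name\n1,alice\n2,bob\n", "users")[0])
# INSERT INTO users (id, name) VALUES ('1', 'alice');
```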

  5. In the Approval section, click Submit for Approval, and then click OK in the Prompt dialog box.

  6. After the ticket is approved, in the Execute section, click Execute Change.

  7. In the Task Settings dialog box, select an execution method.

    The execution methods are as follows:

    • Running Immediately: Default. The task executes immediately after you click Confirm Execution.

    • Schedule: Select a start time for the task. The task executes automatically at the specified time after you click Confirm Execution.

    Note
    • During execution, the system reads the parsed SQL in a stream and executes it in batches in the database. The size of each batch is 1 MB.

    • You can view the task status, SQL check details, and scheduling logs in the Execution section.

    • When a paused task is restarted, its behavior depends on the import mode:

      • Speed Mode: The script execution or data file import starts from the beginning.

      • Security Mode: The script execution or data file import resumes from the pause point.
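The batched streaming described in the note can be sketched as a generator that groups statements until a batch reaches roughly 1 MB. This is an illustration of the batching idea only, not DMS code:

```python
# Sketch: stream SQL statements and yield them in batches whose total
# size stays under a byte cap (about 1 MB, as described above).
def batch_statements(statements, max_bytes=1 * 1024 * 1024):
    batch, size = [], 0
    for stmt in statements:
        n = len(stmt.encode("utf-8"))
        if batch and size + n > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(stmt)
        size += n
    if batch:
        yield batch

# Five statements of ~400 KB each fit two per 1 MB batch.
stmts = ["x" * 400_000 for _ in range(5)]
print([len(b) for b in batch_statements(stmts)])  # [2, 2, 1]
```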

  8. A completion message appears when the task finishes.

  9. (Optional) Query the imported data.

    Go to the SQL Console page to query the imported data. For more information, see Get started with SQL Console.

FAQ

  • Q: What should I do if a "permission denied for schema XXXXX" error occurs during data import?

    A: This error indicates that the database account used to log on to DMS does not have sufficient permissions. You can try to log on to DMS with a privileged database account and retry the import, or try to import the data to another public database. To switch database accounts, see Edit instance information.

    If the issue persists, you can use Data Transmission Service (DTS) to migrate data. For more information, see Overview of migration solutions.

  • Q: What should I do if the data import ticket fails the precheck and a message indicates that submitting XXXX change statements is not allowed?

    A: An administrator or DBA must modify the SQL change security rules to allow that SQL type during import. The following example shows how to allow CREATE_TABLE statements:

    1. In the DMS console, go to the Security Rules page.

    2. Edit the destination rule. On the rule details page, click SQL Correct in the left-side pane.

    3. Select Batch Data Import Rules and click Edit to the right of the Allow bulk import of insert statements rule.

    4. Add the CREATE_TABLE SQL type and click Submit. For more information, see SQL Change.

    5. After the rule is modified, return to the data import ticket details page and click Retry in the Precheck section.

  • Q: What should I do if the file to be uploaded is larger than 5 GB?

    A: You can choose one of the following methods:

    • Split the file into smaller files and submit multiple tickets to upload them.

    • Use Navicat to remotely connect to the database and then upload the file.
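Splitting an oversized SQL file must cut only on statement boundaries so each part remains valid on its own. A hypothetical sketch follows; the real limit is 5 GB, and a tiny max_bytes is used here only to make the example observable:

```python
# Hypothetical sketch: split a large SQL file into smaller part files,
# cutting only after lines that end a statement (';'), so each part can
# be uploaded in its own Data Import ticket.
def split_sql_file(path, max_bytes, prefix="part"):
    parts, buf, size, idx = [], [], 0, 1

    def flush():
        nonlocal buf, size, idx
        name = f"{prefix}{idx}.sql"
        with open(name, "w") as out:
            out.writelines(buf)
        parts.append(name)
        buf, size, idx = [], 0, idx + 1

    with open(path) as f:
        for line in f:
            # Cut before exceeding the cap, but only on a ';' boundary.
            if buf and size + len(line) > max_bytes and buf[-1].rstrip().endswith(";"):
                flush()
            buf.append(line)
            size += len(line)
    if buf:
        flush()
    return parts

# Example with a deliberately tiny limit.
with open("big.sql", "w") as f:
    f.write("INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\n")
parts = split_sql_file("big.sql", max_bytes=30)
print(parts)  # ['part1.sql', 'part2.sql']
```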

  • Q: How can I delete data that was imported incorrectly?

    A: The deletion method depends on the data volume.

    • Small amount of data

      You can execute a DELETE statement in the DMS SQL console. For more information, see Use SQL statements to change table data.

    • Large amount of data

      You can submit a Normal Data Modify or DML Lock-Free Change ticket to delete the table data. The DML Lock-Free Change feature lets you change large amounts of table data without locking the table. For more information, see Normal Data Modify and DML Lock-Free Change.