
DataWorks:Best practice for migrating data from Oracle to MaxCompute

Last Updated: Feb 26, 2026

This topic describes how to use the data integration feature of DataWorks to migrate data from Oracle to MaxCompute.

Prerequisites

  • Prepare a DataWorks environment.

    1. Activate MaxCompute.

    2. Create a workspace. This topic uses a workspace in basic mode as an example.

    3. Create a business flow in DataWorks. For more information, see Create a business flow.

  • Prepare an Oracle environment.

    In this topic, Oracle is installed on an Elastic Compute Service (ECS) instance. To ensure network connectivity, assign a public IP address to the ECS instance and configure its security group rules to allow inbound traffic on port 1521, the default port for Oracle databases. For more information, see Modify security group rules.

    This topic uses an ecs.c5.xlarge ECS instance that resides in a virtual private cloud (VPC) in the China (Hangzhou) region.

Background information

This topic uses the DataWorks Oracle Reader to read test data from Oracle. For more information, see Oracle Reader.

Prepare test data in Oracle

  1. Open the graphical user interface (GUI) for Oracle and create a table named DTSTEST.GOOD_SALE. The table must include the following seven columns: create_time, category, brand, buyer_id, trans_num, trans_amount, and click_cnt.
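
    This topic does not specify the Oracle-side column types. The following table definition is a minimal sketch for illustration only: VARCHAR2 is assumed for the text columns (including create_time, to match the sample rows inserted in the next step) and NUMBER for the numeric columns. Adjust the types and lengths to match your actual data.

    ```sql
    -- Hypothetical definition of DTSTEST.GOOD_SALE. All column types below
    -- are assumptions for illustration; adjust them to fit your data.
    CREATE TABLE DTSTEST.GOOD_SALE (
        create_time   VARCHAR2(20),
        category      VARCHAR2(64),
        brand         VARCHAR2(64),
        buyer_id      VARCHAR2(64),
        trans_num     NUMBER(10),
        trans_amount  NUMBER(10,2),
        click_cnt     NUMBER(10)
    );
    ```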

  2. Insert the test data. In this example, the data is inserted manually.

    insert into good_sale values('28-Dec-19','Kitchenware','BrandA','hanmeimei',6,80.6,4);
    insert into good_sale values('21-Dec-19','FreshFood','BrandB','lilei',7,440.6,5);
    insert into good_sale values('29-Dec-19','Apparel','BrandC','lily',12,351.9,9);
    commit;
  3. After you insert the data, execute the following statement to view the table data.

    select * from good_sale;

Migrate data from Oracle to MaxCompute using DataWorks

  1. Log on to the DataWorks console. In the top navigation bar, select the desired region. In the left-side navigation pane, choose Data Development and O&M > Data Development. On the page that appears, select the desired workspace from the drop-down list and click Go to Data Development.

  2. On the Data Studio page, create a destination table to store the data migrated from Oracle.

    1. Right-click the business flow and choose New Table > MaxCompute > Table.

    2. On the Create Table page, configure the basic information for the table.

    3. At the top of the table editor page, click DDL Mode.

    4. In the DDL dialog box, enter the table creation statement, and click Generate Table Schema.

      CREATE TABLE good_sale 
      (
          create_time     string,
          category        string,
          brand           string,
          buyer_id        string,
          trans_num       bigint,
          trans_amount    double,
          click_cnt       bigint
      );

      When you create the table, consider the data type mappings between Oracle and MaxCompute. For more information about the data types that the Oracle Reader supports, see Data type mappings.

    5. Click Submit to Production Environment.

  3. Create an Oracle data source. For more information, see Configure an Oracle data source.

  4. Create an offline synchronization node.

    1. On the Data Studio page, right-click the business flow and choose Create Node > Data Integration > Offline Synchronization.

    2. In the Create Node dialog box, configure the basic information for the node and click OK.

    3. After the offline data synchronization node is created, on the Network and Resource Configuration tab, set Data Source to the Oracle data source that you added and Data Destination to the MaxCompute data source. In the My Resource Group section, select the resource group that you want to use and click Test Connectivity. Then, click Next. On the Configure Data Source and Destination tab, set the Table parameter for both Data Source and Data Destination to the test table that you created. In the Field Mapping section, select Same-name Mapping. You can keep the default configurations for the other parameters.

    4. Click the Run icon in the toolbar to run the code.

    5. You can view the results on the Operation Log tab.

Validation results

  1. Right-click the business flow and choose Create Node > MaxCompute > ODPS SQL.

  2. In the Create Node dialog box, configure the basic information for the node and click OK.

  3. On the ODPS SQL node editor page, enter the following statement.

    -- Check whether the data is written to MaxCompute.
    select * from good_sale;
  4. Click the Run icon in the toolbar to run the code.

  5. You can view the results on the Runtime Log tab.
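
  Beyond inspecting the rows, you can also compare row counts between the source and destination as a quick sanity check. The following query is a simple sketch to run on the MaxCompute side; run the same count in Oracle and confirm the two results match.

    ```sql
    -- Run in the ODPS SQL node; compare the result with
    -- "select count(*) from good_sale;" executed in Oracle.
    -- With the three sample rows inserted earlier, both should return 3.
    select count(*) from good_sale;
    ```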