
DataWorks:CreateImportMigration

Last Updated:Sep 18, 2024

Creates an import task. The task imports a package that can contain data sources, nodes, and tables.

Operation description

The import package must be uploaded when you call this operation. The following example shows how to upload the package by using the Alibaba Cloud SDK for Java. The Config, Client, and RuntimeOptions classes come from the SDK; their import paths may vary with the SDK version that you use.

    Config config = new Config();
    config.setAccessKeyId(accessId);
    config.setAccessKeySecret(accessKey);
    config.setEndpoint(popEndpoint);
    config.setRegionId(regionId);

    Client client = new Client(config);

    CreateImportMigrationAdvanceRequest request = new CreateImportMigrationAdvanceRequest();
    request.setName("test_migration_api_" + System.currentTimeMillis());
    request.setProjectId(123456L);
    request.setPackageType("DATAWORKS_MODEL");
    // Upload the package file as a stream.
    request.setPackageFileObject(new FileInputStream("/home/admin/Downloads/test.zip"));

    RuntimeOptions runtime = new RuntimeOptions();
    CreateImportMigrationResponse response = client.createImportMigrationAdvance(request, runtime);
    ...

Debugging

OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer.

Authorization information

The following table describes the authorization information for this API operation. You can use the information in the Action element of a RAM policy to grant a RAM user or RAM role the permissions to call this operation. Description of the columns:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of the operation. Valid levels: read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or RAM role to perform the operation. Take note of the following items:
    • Required resource types are displayed in bold.
    • If permissions cannot be granted at the resource level, All Resources is displayed in the Resource type column.
  • Condition key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or RAM role must be authorized to perform before the current operation can complete.
Operation: dataworks:*
Access level: update
Resource type: All Resources (*)
Condition key: none
Associated operation: none
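
Based on the table above, a minimal RAM policy statement that allows this operation could look like the following sketch. Adapt the Action and Resource values to your own security requirements before use:

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "dataworks:*",
      "Resource": "*"
    }
  ]
}
```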

Request parameters

ProjectId (long, required)

The DataWorks workspace ID. You can log on to the DataWorks console and go to the Workspace page to obtain the workspace ID.

Example: 123456
Name (string, required)

The name of the import task. The name must be unique within the workspace.

Example: test_import_001
PackageType (string, required)

The type of the import package. Valid values:

  • DATAWORKS_MODEL (standard format)
  • DATAWORKS_V2 (Apsara Stack DataWorks V3.6.1 to V3.11)
  • DATAWORKS_V3 (Apsara Stack DataWorks V3.12 and later)

Example: DATAWORKS_MODEL
PackageFile (string, required)

The path of the import package. The import package must be uploaded when you call this operation. For an example of the upload method, see the code in the Operation description section above.

Example: /home/admin/xxx/import.zip
ResourceGroupMap (string, optional)

The mapping between the resource group for scheduling and the resource group for Data Integration. The keys and values in the mapping are the identifiers of the resource groups. Specify the mapping in the following format:

{
    "SCHEDULER_RESOURCE_GROUP": {
        "xxx": "yyy"
    },
    "DI_RESOURCE_GROUP": {
        "ccc": "ddd"
    }
}

Example: {"SCHEDULER_RESOURCE_GROUP": {"xxx":"yyy"},"DI_RESOURCE_GROUP":{"ccc":"ddd"}}
WorkspaceMap (string, optional)

The mapping between the prefixes for the names of the source and destination workspaces. When the system performs the import operation, the prefix for the name of the source workspace in the import package is replaced based on the mapping.

Example: {"test_workspace_src": "test_workspace_target"}
CalculateEngineMap (string, required)

The mapping between the source compute engine instance and the destination compute engine instance. The following types of compute engine instances are supported: MaxCompute, E-MapReduce (EMR), Hadoop CDH, and Hologres.

Example: { "ODPS": { "zxy_8221431_engine": "wzp_kaifazheban_engine" }, "EMR": { "aaaa": "bbb" } }
CommitRule (string, required)

The rule for automatically committing and deploying the import task. The rule contains the following parameters, each of which is set to true (enabled) or false (disabled):

  • resourceAutoCommit: specifies whether resources are automatically committed.
  • resourceAutoDeploy: specifies whether resources are automatically deployed.
  • functionAutoCommit: specifies whether functions are automatically committed.
  • functionAutoDeploy: specifies whether functions are automatically deployed.
  • tableAutoCommitToDev: specifies whether tables are automatically committed to the development environment.
  • tableAutoCommitToProd: specifies whether tables are automatically committed to the production environment.
  • ignoreLock: specifies whether the lock on a locked import task is ignored. If you set this parameter to true, you can forcefully update a locked task.
  • fileAutoCommit: specifies whether files are automatically committed.
  • fileAutoDeploy: specifies whether files are automatically deployed.

Example: { "resourceAutoCommit": false, "resourceAutoDeploy": false, "functionAutoCommit": false, "functionAutoDeploy": false, "tableAutoCommitToDev": false, "tableAutoCommitToProd": false, "ignoreLock": false, "fileAutoCommit": false, "fileAutoDeploy": false }
Description (string, optional)

The description of the import package.

Example: test description
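
Several of the parameters above (ResourceGroupMap, CalculateEngineMap, CommitRule) are passed as JSON text inside a string. As a minimal sketch of how such a value can be assembled with only the JDK, the following builds the CommitRule string from a map of flags. The class and helper names are illustrative; in real code the result would be set on the request object (the exact setter name depends on the SDK version):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class CommitRuleExample {
    // Serialize a flat map of boolean flags into the JSON shape that
    // the CommitRule parameter expects, e.g. { "ignoreLock": false }.
    static String toJson(Map<String, Boolean> rule) {
        return rule.entrySet().stream()
                .map(e -> "\"" + e.getKey() + "\": " + e.getValue())
                .collect(Collectors.joining(", ", "{ ", " }"));
    }

    public static void main(String[] args) {
        // LinkedHashMap preserves insertion order, so the printed JSON
        // lists the flags in the same order as the documentation above.
        Map<String, Boolean> rule = new LinkedHashMap<>();
        rule.put("resourceAutoCommit", false);
        rule.put("resourceAutoDeploy", false);
        rule.put("functionAutoCommit", false);
        rule.put("functionAutoDeploy", false);
        rule.put("tableAutoCommitToDev", false);
        rule.put("tableAutoCommitToProd", false);
        rule.put("ignoreLock", false);
        rule.put("fileAutoCommit", false);
        rule.put("fileAutoDeploy", false);
        System.out.println(toJson(rule));
    }
}
```

A production caller would more likely serialize the value with a JSON library (for example, Jackson or Fastjson) rather than hand-build the string.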

Response parameters

The response body is an object that contains the following parameters.

HttpStatusCode (integer)

The HTTP status code.

Example: 200

Data (long)

The import task ID. The ID is used as an input parameter if you want the system to run the import task or you want to obtain the running progress of the import task.

Example: 123456

ErrorMessage (string)

The error message.

Example: test error message

RequestId (string)

The request ID. You can locate logs and troubleshoot issues based on the ID.

Example: ADFASDFASDFA-ADFASDF-ASDFADSDF-AFFADS

ErrorCode (string)

The error code.

Example: 110001123456

Success (boolean)

Indicates whether the request was successful.

Example: true

Examples

Sample success responses

JSON format

{
  "HttpStatusCode": 200,
  "Data": 123456,
  "ErrorMessage": "test error message",
  "RequestId": "ADFASDFASDFA-ADFASDF-ASDFADSDF-AFFADS",
  "ErrorCode": "110001123456",
  "Success": true
}

Error codes

For a list of error codes, see Service error codes.

Change history

Change time: 2023-10-10
Summary of changes: The internal configuration of the API is changed, but the call is not affected.