DataWorks:Best practices for calling API operations to develop, commit, and run tasks

Last Updated: Mar 29, 2024

DataWorks provides various API operations. You can call the API operations to manage your business based on your requirements. This topic describes how to call DataWorks API operations to quickly develop, commit, and run tasks.

Background information

This topic describes the DataWorks API operations that can be called in the following business scenarios. Before you perform the steps that are described in this topic, we recommend that you understand the core capabilities and concepts related to the business scenarios.

  • Query and manage workspaces, workflows, node folders, and nodes, and commit and deploy nodes. DataStudio API operations, such as CreateBusiness and ListBusiness, are used.

  • Perform smoke testing and view run logs. Operation Center API operations, such as RunSmokeTest, are used.

The following sections describe the procedure and provide the core parts of the sample code.

  1. Backend code development

  2. Frontend code development

  3. Deploy and run the code on your on-premises machine

If you want to view or download the complete sample source code, see Reference: Download complete sample source code in this topic.

Backend code development

Step 1: Develop the ProjectService class to query workspaces

You need to develop the ProjectService class. The class defines the listProjects function, which calls the ListProjects operation to query workspaces. The operation returns the workspaces that can be used for frontend development. A sketch of the DataWorksOpenApiClient helper on which the class depends is provided after the code below.

package com.aliyun.dataworks.services;


import com.aliyuncs.dataworks_public.model.v20200518.ListProjectsRequest;
import com.aliyuncs.dataworks_public.model.v20200518.ListProjectsResponse;
import com.aliyuncs.exceptions.ClientException;
import com.aliyuncs.exceptions.ServerException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;


@Service
public class ProjectService {


    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;


    /**
     * Query workspaces by page.
     *
     * @param pageNumber the number of the page to return
     * @param pageSize   the number of entries to return on each page
     * @return the paged list of workspaces
     */
    public ListProjectsResponse.PageResult listProjects(Integer pageNumber, Integer pageSize) {
        try {
            ListProjectsRequest listProjectsRequest = new ListProjectsRequest();
            listProjectsRequest.setPageNumber(pageNumber);
            listProjectsRequest.setPageSize(pageSize);
            ListProjectsResponse listProjectsResponse = dataWorksOpenApiClient.createClient().getAcsResponse(listProjectsRequest);
            System.out.println(listProjectsResponse.getRequestId());
            return listProjectsResponse.getPageResult();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }
}
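
Each service class in this topic injects a DataWorksOpenApiClient helper and calls its createClient() method to obtain an SDK client. The helper itself is not shown in this topic. The following is a minimal sketch for orientation only, not the implementation from the sample project; it assumes that the region ID and the AccessKey pair are read from your own application configuration.

package com.aliyun.dataworks.services;

import com.aliyuncs.DefaultAcsClient;
import com.aliyuncs.IAcsClient;
import com.aliyuncs.profile.DefaultProfile;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

/**
 * Minimal sketch of the SDK client helper that the services in this topic depend on.
 * The property names are assumptions; supply the values from your own configuration.
 * For the official implementation, see the complete sample source code.
 */
@Component
public class DataWorksOpenApiClient {

    @Value("${dataworks.region-id}")
    private String regionId;

    @Value("${dataworks.access-key-id}")
    private String accessKeyId;

    @Value("${dataworks.access-key-secret}")
    private String accessKeySecret;

    public IAcsClient createClient() {
        // Bind the client to the region in which your DataWorks service is activated.
        DefaultProfile profile = DefaultProfile.getProfile(regionId, accessKeyId, accessKeySecret);
        return new DefaultAcsClient(profile);
    }
}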

Step 2: Develop the BusinessService class to process workflows

You need to develop the BusinessService class. The class defines the following functions:

  • The createBusiness function, which calls the CreateBusiness operation to create a workflow.

  • The listBusiness function, which calls the ListBusiness operation to query workflows.

The functions are used during frontend development to create a sample workflow and query workflows.

Note

You can also develop the FolderService class to display a directory tree. The directory tree consists of workflows, node folders, and nodes. The following sample code provides an example of the core process. For the FolderService functions that are related to node folders, see the complete sample code that is provided on GitHub; a simplified sketch of the class is also shown after the BusinessService code below.

package com.aliyun.dataworks.services;


import com.aliyun.dataworks.dto.CreateBusinessDTO;
import com.aliyun.dataworks.dto.DeleteBusinessDTO;
import com.aliyun.dataworks.dto.ListBusinessesDTO;
import com.aliyun.dataworks.dto.UpdateBusinessDTO;
import com.aliyuncs.dataworks_public.model.v20200518.*;
import com.aliyuncs.exceptions.ClientException;
import com.aliyuncs.exceptions.ServerException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.util.CollectionUtils;


import java.util.List;


/**
 * @author dataworks demo
 */
@Service
public class BusinessService {


    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;


    /**
     * Create a workflow (business).
     *
     * @param createBusinessDTO
     * @return the ID of the created workflow
     */
    public Long createBusiness(CreateBusinessDTO createBusinessDTO) {
        try {
            CreateBusinessRequest createBusinessRequest = new CreateBusinessRequest();
            // The name of the workflow.
            createBusinessRequest.setBusinessName(createBusinessDTO.getBusinessName());
            createBusinessRequest.setDescription(createBusinessDTO.getDescription());
            createBusinessRequest.setOwner(createBusinessDTO.getOwner());
            createBusinessRequest.setProjectId(createBusinessDTO.getProjectId());
            // The module to which the workflow belongs. Valid values: NORMAL and MANUAL_BIZ.
            createBusinessRequest.setUseType(createBusinessDTO.getUseType());
            CreateBusinessResponse createBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(createBusinessRequest);
            System.out.println("create business requestId:" + createBusinessResponse.getRequestId());
            System.out.println("create business successful,the businessId:" + createBusinessResponse.getBusinessId());
            return createBusinessResponse.getBusinessId();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Query workflows.
     *
     * @param listBusinessesDTO
     * @return the workflows on the current page
     */
    public List<ListBusinessResponse.Data.BusinessItem> listBusiness(ListBusinessesDTO listBusinessesDTO) {
        try {
            ListBusinessRequest listBusinessRequest = new ListBusinessRequest();
            listBusinessRequest.setKeyword(listBusinessesDTO.getKeyword());
            listBusinessRequest.setPageNumber(listBusinessesDTO.getPageNumber() < 1 ? 1 : listBusinessesDTO.getPageNumber());
            listBusinessRequest.setPageSize(listBusinessesDTO.getPageSize() < 10 ? 10 : listBusinessesDTO.getPageSize());
            listBusinessRequest.setProjectId(listBusinessesDTO.getProjectId());
            ListBusinessResponse listBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(listBusinessRequest);
            System.out.println("list business requestId:" + listBusinessResponse.getRequestId());
            ListBusinessResponse.Data data = listBusinessResponse.getData();
            System.out.println("total count:" + data.getTotalCount());
            if (!CollectionUtils.isEmpty(data.getBusiness())) {
                for (ListBusinessResponse.Data.BusinessItem businessItem : data.getBusiness()) {
                    System.out.println(businessItem.getBusinessId() + "," + businessItem.getBusinessName() + "," + businessItem.getUseType());
                }
            }
            return data.getBusiness();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * update a business
     * @param updateBusinessDTO
     * @return
     */
    public Boolean updateBusiness(UpdateBusinessDTO updateBusinessDTO) {
        try {
            UpdateBusinessRequest updateBusinessRequest = new UpdateBusinessRequest();
            updateBusinessRequest.setBusinessId(updateBusinessDTO.getBusinessId());
            updateBusinessRequest.setBusinessName(updateBusinessDTO.getBusinessName());
            updateBusinessRequest.setDescription(updateBusinessDTO.getDescription());
            updateBusinessRequest.setOwner(updateBusinessDTO.getOwner());
            updateBusinessRequest.setProjectId(updateBusinessDTO.getProjectId());
            UpdateBusinessResponse updateBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(updateBusinessRequest);
            System.out.println(updateBusinessResponse.getRequestId());
            System.out.println(updateBusinessResponse.getSuccess());
            return updateBusinessResponse.getSuccess();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }


    /**
     * delete a business
     * @param deleteBusinessDTO
     */
    public boolean deleteBusiness(DeleteBusinessDTO deleteBusinessDTO) {
        try {
            DeleteBusinessRequest deleteBusinessRequest = new DeleteBusinessRequest();
            deleteBusinessRequest.setBusinessId(deleteBusinessDTO.getBusinessId());
            deleteBusinessRequest.setProjectId(deleteBusinessDTO.getProjectId());
            DeleteBusinessResponse deleteBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(deleteBusinessRequest);
            System.out.println("delete business:" + deleteBusinessResponse.getRequestId());
            System.out.println("delete business" + deleteBusinessResponse.getSuccess());
            return deleteBusinessResponse.getSuccess();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }



}
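
The Note at the beginning of this step mentions the FolderService class that backs the directory tree. The complete implementation is available only in the sample source code on GitHub. The following is a rough sketch for orientation, not the official code: it simplifies the signature (the real class accepts a ListFoldersDTO, as shown in the IDE controller later in this topic), and the ParentFolderPath parameter and the getFolders() getter are assumptions about the ListFolders operation.

package com.aliyun.dataworks.services;

import com.aliyuncs.dataworks_public.model.v20200518.ListFoldersRequest;
import com.aliyuncs.dataworks_public.model.v20200518.ListFoldersResponse;
import com.aliyuncs.exceptions.ClientException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;

/**
 * Sketch of the FolderService class. Refer to the complete sample source code
 * on GitHub for the authoritative implementation.
 */
@Service
public class FolderService {

    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;

    public List<ListFoldersResponse.Data.FoldersItem> listFolders(Long projectId, String parentFolderPath) {
        try {
            ListFoldersRequest listFoldersRequest = new ListFoldersRequest();
            // The ID of the DataWorks workspace.
            listFoldersRequest.setProjectId(projectId);
            // The path of the parent folder, for example "Workflow/My first workflow/MaxCompute".
            // The parameter name is an assumption based on the ListFolders operation.
            listFoldersRequest.setParentFolderPath(parentFolderPath);
            listFoldersRequest.setPageNumber(1);
            listFoldersRequest.setPageSize(10);
            ListFoldersResponse listFoldersResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(listFoldersRequest);
            // The getFolders() getter is assumed to return the folders of the current page.
            return listFoldersResponse.getData().getFolders();
        } catch (ClientException e) {
            e.printStackTrace();
            System.out.println(e.getRequestId());
            System.out.println(e.getErrCode());
            System.out.println(e.getErrMsg());
        }
        return null;
    }
}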

Step 3: Develop the FileService class to process files

You need to develop the FileService class. The class defines the following functions that can be used to process files:

  • The listFiles function that can be used to call the ListFiles operation to query files.

  • The createFile function that can be used to call the CreateFile operation to create files.

  • The updateFile function that can be used to call the UpdateFile operation to update files.

  • The deployFile function that can be used to call the DeployFile operation to deploy files.

  • The runSmokeTest function that can be used to call the RunSmokeTest operation to perform smoke testing.

  • The getInstanceLog function that can be used to call the GetInstanceLog operation to query the logs of an instance.

The functions can be used to create a file, query files, save a file, and commit and run a file. A hypothetical example that chains these functions is provided after the class code below.

package com.aliyun.dataworks.services;


import com.aliyun.dataworks.dto.*;
import com.aliyuncs.dataworks_public.model.v20200518.*;
import com.aliyuncs.exceptions.ClientException;
import com.aliyuncs.exceptions.ServerException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.util.CollectionUtils;


import java.util.List;


/**
 * the ide files manager service
 *
 * @author dataworks demo
 */
@Service
public class FileService {


    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;


    public static final int CYCLE_NUM = 10;


    /**
     * Query files by page.
     * @param listFilesDTO
     * @return
     */
    public List<ListFilesResponse.Data.File> listFiles(ListFilesDTO listFilesDTO) {
        try {
            ListFilesRequest listFilesRequest = new ListFilesRequest();
            // The path of the file folder, in the format "Workflow/{workflow name}/{directory name}/{last-level folder name}",
            // for example "Workflow/My first workflow/MaxCompute/ODS layer". Do not prefix the path with "DataStudio".
            listFilesRequest.setFileFolderPath(listFilesDTO.getFileFolderPath());
            // The code type of the files. You can specify multiple code types for files. Separate the code types with commas (,), such as 10,23.
            listFilesRequest.setFileTypes(listFilesDTO.getFileTypes());
            // The keyword in the file names. Fuzzy match is supported.
            listFilesRequest.setKeyword(listFilesDTO.getKeyword());
            // The ID of the node that is scheduled.
            listFilesRequest.setNodeId(listFilesDTO.getNodeId());
            // The owner of the files.
            listFilesRequest.setOwner(listFilesDTO.getOwner());
            // The number of the page to return.
            listFilesRequest.setPageNumber(listFilesDTO.getPageNumber() <= 0 ? 1 : listFilesDTO.getPageNumber());
            // The number of entries to return on each page.
            listFilesRequest.setPageSize(listFilesDTO.getPageSize() <= 10 ? 10 : listFilesDTO.getPageSize());
            // The ID of the DataWorks workspace.
            listFilesRequest.setProjectId(listFilesDTO.getProjectId());
            // The module to which the files belong.
            listFilesRequest.setUseType(listFilesDTO.getUseType());
            ListFilesResponse listFilesResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(listFilesRequest);
            ListFilesResponse.Data fileData = listFilesResponse.getData();
            if (fileData.getFiles() != null && !fileData.getFiles().isEmpty()) {
                for (ListFilesResponse.Data.File file : fileData.getFiles()) {
                    // The ID of the workflow.
                    System.out.println(file.getBusinessId());
                    // The ID of the file.
                    System.out.println(file.getFileId());
                    // The name of the file.
                    System.out.println(file.getFileName());
                    // The code type of the file, such as 10.
                    System.out.println(file.getFileType());
                    // The ID of the node.
                    System.out.println(file.getNodeId());
                    // The ID of the folder.
                    System.out.println(file.getFileFolderId());
                }
            }
            return fileData.getFiles();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Create a file.
     * @param createFileDTO
     */
    public Long createFile(CreateFileDTO createFileDTO) {
        try {
            CreateFileRequest createFileRequest = new CreateFileRequest();
            // The advanced configurations of the node.
            createFileRequest.setAdvancedSettings(createFileDTO.getAdvancedSettings());
            // Specifies whether to enable the automatic parsing feature for the file. This parameter is required.
            createFileRequest.setAutoParsing(createFileDTO.getAutoParsing());
            // The interval between automatic reruns after an error occurs. Unit: milliseconds. Maximum value: 1800000 (30 minutes).
            createFileRequest.setAutoRerunIntervalMillis(createFileDTO.getAutoRerunIntervalMillis());
            // The number of automatic retries.
            createFileRequest.setAutoRerunTimes(createFileDTO.getAutoRerunTimes());
            // The name of the connected data source that you want to use to run the node.  This parameter is required.
            createFileRequest.setConnectionName(createFileDTO.getConnectionName());
            // The code of the file. This parameter is required.
            createFileRequest.setContent(createFileDTO.getContent());
            // The CRON expression that represents the periodic scheduling policy of the node. This parameter is required.
            createFileRequest.setCronExpress(createFileDTO.getCronExpress());
            // The type of the scheduling cycle. This parameter is required.
            createFileRequest.setCycleType(createFileDTO.getCycleType());
            // The IDs of the nodes on which the current node depends. The instance that is generated for the node in the current cycle depends on the instances that are generated for the specified nodes in the previous cycle.
            createFileRequest.setDependentNodeIdList(createFileDTO.getDependentNodeIdList());
            // The type of the cross-cycle scheduling dependency for the current node. This parameter is required.
            createFileRequest.setDependentType(createFileDTO.getDependentType());
            // The end timestamp of automatic scheduling, in milliseconds. 
            createFileRequest.setEndEffectDate(createFileDTO.getEndEffectDate());
            // The description of the file.
            createFileRequest.setFileDescription(createFileDTO.getFileDescription());
            // The path of the file. This parameter is required.
            createFileRequest.setFileFolderPath(createFileDTO.getFileFolderPath());
            // The name of the file. This parameter is required.
            createFileRequest.setFileName(createFileDTO.getFileName());
            // The code type of the file. This parameter is required.
            createFileRequest.setFileType(createFileDTO.getFileType());
            // The output name of the file on which the current file depends. If you specify multiple output names, separate them with commas (,). This parameter is required.
            createFileRequest.setInputList(createFileDTO.getInputList());
            // The ID of the Alibaba Cloud account that is used by the file owner. If this parameter is not configured, the ID of the Alibaba Cloud account of the user who calls the operation is used.  This parameter is required.
            createFileRequest.setOwner(createFileDTO.getOwner());
            // The scheduling parameter. 
            createFileRequest.setParaValue(createFileDTO.getParaValue());
            // The ID of the workspace. This parameter is required.
            createFileRequest.setProjectId(createFileDTO.getProjectId());
            // The rerun type for the node.
            createFileRequest.setRerunMode(createFileDTO.getRerunMode());
            // The resource group that you want to use to run the node. This parameter is required.
            createFileRequest.setResourceGroupIdentifier(createFileDTO.getResourceGroupIdentifier());
            // The scheduling type of the node.
            createFileRequest.setSchedulerType(createFileDTO.getSchedulerType());
            // The start timestamp of automatic scheduling, in milliseconds.
            createFileRequest.setStartEffectDate(createFileDTO.getStartEffectDate());
            // Specifies whether to suspend the scheduling of the node.
            createFileRequest.setStop(createFileDTO.getStop());
            CreateFileResponse createFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(createFileRequest);
            // requestId
            System.out.println(createFileResponse.getRequestId());
            // fileId
            System.out.println(createFileResponse.getData());
            return createFileResponse.getData();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Update a file.  
     *
     * @param updateFileDTO
     */
    public boolean updateFile(UpdateFileDTO updateFileDTO) {
        try {
            UpdateFileRequest updateFileRequest = new UpdateFileRequest();
            // The advanced configurations of the node. For more information, see the related documentation.
            updateFileRequest.setAdvancedSettings(updateFileDTO.getAdvancedSettings());
            // Specifies whether to enable the automatic parsing feature for the file.
            updateFileRequest.setAutoParsing(updateFileDTO.getAutoParsing());
            // The interval between automatic reruns after an error occurs. Unit: milliseconds. Maximum value: 1800000 (30 minutes). 
            updateFileRequest.setAutoRerunIntervalMillis(updateFileDTO.getAutoRerunIntervalMillis());
            // The number of automatic reruns that are allowed after an error occurs.
            updateFileRequest.setAutoRerunTimes(updateFileDTO.getAutoRerunTimes());
            // The name of the data source that you want to use to run the node.
            updateFileRequest.setConnectionName(updateFileDTO.getConnectionName());
            // The code of the file.
            updateFileRequest.setContent(updateFileDTO.getContent());
            // The CRON expression that represents the periodic scheduling policy of the node. 
            updateFileRequest.setCronExpress(updateFileDTO.getCronExpress());
            // The type of the scheduling cycle. Valid values: NOT_DAY and DAY. The value NOT_DAY indicates that the node is scheduled to run by minute or hour. The value DAY indicates that the node is scheduled to run by day, week, or month.
            updateFileRequest.setCycleType(updateFileDTO.getCycleType());
            // The ID of the node on which the node that corresponds to the file depends when the DependentType parameter is set to USER_DEFINE. If you specify multiple IDs, separate them with commas (,).
            updateFileRequest.setDependentNodeIdList(updateFileDTO.getDependentNodeIdList());
            // The type of the cross-cycle scheduling dependency for the node that corresponds to the file.
            updateFileRequest.setDependentType(updateFileDTO.getDependentType());
            // The end timestamp of automatic scheduling, in milliseconds. 
            updateFileRequest.setEndEffectDate(updateFileDTO.getEndEffectDate());
            // The description of the file.
            updateFileRequest.setFileDescription(updateFileDTO.getFileDescription());
            // The path where the file resides.
            updateFileRequest.setFileFolderPath(updateFileDTO.getFileFolderPath());
            // The ID of the file.
            updateFileRequest.setFileId(updateFileDTO.getFileId());
            // The name of the file.
            updateFileRequest.setFileName(updateFileDTO.getFileName());
            // The output name of the file on which the current file depends. If you specify multiple output names, separate them with commas (,).
            updateFileRequest.setInputList(updateFileDTO.getInputList());
            // The output of the file.
            updateFileRequest.setOutputList(updateFileDTO.getOutputList());
            // The ID of the file owner.
            updateFileRequest.setOwner(updateFileDTO.getOwner());
            // The scheduling parameter.
            updateFileRequest.setParaValue(updateFileDTO.getParaValue());
            // The ID of the DataWorks workspace.
            updateFileRequest.setProjectId(updateFileDTO.getProjectId());
            // The rerun type for the node that corresponds to the file. Set the value to ALL_ALLOWED.
            updateFileRequest.setRerunMode(updateFileDTO.getRerunMode());
            // The identifier of the resource group that you want to use to run the node.
            updateFileRequest.setResourceGroupIdentifier(updateFileDTO.getResourceGroupIdentifier());
            // The scheduling type of the node. Set the value to NORMAL.
            updateFileRequest.setSchedulerType(updateFileDTO.getSchedulerType());
            // The start timestamp of automatic scheduling, in milliseconds.
            updateFileRequest.setStartEffectDate(updateFileDTO.getStartEffectDate());
            // Specifies whether to immediately run the node after the node is deployed.
            updateFileRequest.setStartImmediately(updateFileDTO.getStartImmediately());
            // Specifies whether to suspend the scheduling of the node.
            updateFileRequest.setStop(updateFileDTO.getStop());
            UpdateFileResponse updateFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(updateFileRequest);
            // requestId
            System.out.println(updateFileResponse.getRequestId());
            // The update result. Valid values: True and False.
            System.out.println(updateFileResponse.getSuccess());
            return updateFileResponse.getSuccess();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }


    /**
     * Delete a file.
     * @param deleteFileDTO
     * @return
     * @throws InterruptedException
     */
    public boolean deleteFile(DeleteFileDTO deleteFileDTO) throws InterruptedException {
        try {


            DeleteFileRequest deleteFileRequest = new DeleteFileRequest();
            deleteFileRequest.setFileId(deleteFileDTO.getFileId());
            deleteFileRequest.setProjectId(deleteFileDTO.getProjectId());
            DeleteFileResponse deleteFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(deleteFileRequest);
            System.out.println(deleteFileResponse.getRequestId());
            System.out.println(deleteFileResponse.getDeploymentId());


            GetDeploymentRequest getDeploymentRequest = new GetDeploymentRequest();
            getDeploymentRequest.setProjectId(deleteFileDTO.getProjectId());
            getDeploymentRequest.setDeploymentId(deleteFileResponse.getDeploymentId());
            for (int i = 0; i < CYCLE_NUM; i++) {
                GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(getDeploymentRequest);
                // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed. 
                Integer deleteStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                // Poll the status of the deployment task.
                if (deleteStatus == 1) {
                    System.out.println("File deleted.");
                    break;
                } else {
                    System.out.println("Deleting file...");
                    Thread.sleep(1000L);
                }
            }


            GetProjectRequest getProjectRequest = new GetProjectRequest();
            getProjectRequest.setProjectId(deleteFileDTO.getProjectId());
            GetProjectResponse getProjectResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getProjectRequest);
            // The type of the environment. A workspace in standard mode provides both the development and production environments, and a workspace in basic mode provides only the production environment.
            Boolean standardMode = getProjectResponse.getData().getEnvTypes().size() == 2;
            if (standardMode) {
                // If the workspace is in standard mode, you must deploy the operation of deleting the file to the production environment to make the operation take effect.
                DeployFileRequest deployFileRequest = new DeployFileRequest();
                deployFileRequest.setProjectId(deleteFileDTO.getProjectId());
                deployFileRequest.setFileId(deleteFileDTO.getFileId());
                DeployFileResponse deployFileResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(deployFileRequest);
                getDeploymentRequest.setDeploymentId(deployFileResponse.getData());
                for (int i = 0; i < CYCLE_NUM; i++) {
                    GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                            .getAcsResponse(getDeploymentRequest);
                    // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed. 
                    Integer deleteStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                    // Poll the status of the deployment task.
                    if (deleteStatus == 1) {
                        System.out.println("File deleted.");
                        break;
                    } else {
                        System.out.println("Deleting file...");
                        Thread.sleep(1000L);
                    }
                }
            }
            return true;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }


    /**
     * Query the details of a file.
     * @param getFileDTO
     */
    public GetFileResponse.Data.File getFile(GetFileDTO getFileDTO) {
        try {
            GetFileRequest getFileRequest = new GetFileRequest();
            getFileRequest.setFileId(getFileDTO.getFileId());
            getFileRequest.setProjectId(getFileDTO.getProjectId());
            getFileRequest.setNodeId(getFileDTO.getNodeId());
            GetFileResponse getFileResponse = dataWorksOpenApiClient.createClient().getAcsResponse(getFileRequest);
            System.out.println(getFileResponse.getRequestId());
            GetFileResponse.Data.File file = getFileResponse.getData().getFile();
            System.out.println(file.getFileName());
            System.out.println(file.getFileType());
            System.out.println(file.getNodeId());
            System.out.println(file.getCreateUser());
            return file;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Commit and deploy a file.
     *
     * @param deployFileDTO
     * @return
     * @throws InterruptedException
     */
    public Boolean deployFile(DeployFileDTO deployFileDTO) throws InterruptedException {
        try {
            GetProjectRequest getProjectRequest = new GetProjectRequest();
            getProjectRequest.setProjectId(deployFileDTO.getProjectId());
            GetProjectResponse getProjectResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getProjectRequest);
            // The type of the environment. A workspace in standard mode provides both the development and production environments, and a workspace in basic mode provides only the production environment.
            Boolean standardMode = getProjectResponse.getData().getEnvTypes().size() == 2;
            if (standardMode) {
                SubmitFileRequest submitFileRequest = new SubmitFileRequest();
                submitFileRequest.setFileId(deployFileDTO.getFileId());
                submitFileRequest.setProjectId(deployFileDTO.getProjectId());
                SubmitFileResponse submitFileResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(submitFileRequest);
                System.out.println("submit file requestId:" + submitFileResponse.getRequestId());
                System.out.println("submit file deploymentId:" + submitFileResponse.getData());
                for (int i = 0; i < CYCLE_NUM; i++) {
                    GetDeploymentRequest getDeploymentRequest = new GetDeploymentRequest();
                    getDeploymentRequest.setProjectId(deployFileDTO.getProjectId());
                    getDeploymentRequest.setDeploymentId(submitFileResponse.getData());
                    GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                            .getAcsResponse(getDeploymentRequest);
                    // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed. 
                    Integer deleteStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                    // Poll the status of the deployment task.
                    if (deleteStatus == 1) {
                        System.out.println("File submitted.");
                        break;
                    } else {
                        (a) System.out.println("Submitting file...");
                        Thread.sleep(1000L);
                    }
                }
            }
            DeployFileRequest deployFileRequest = new DeployFileRequest();
            deployFileRequest.setFileId(deployFileDTO.getFileId());
            deployFileRequest.setProjectId(deployFileDTO.getProjectId());
            DeployFileResponse deployFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(deployFileRequest);
            System.out.println("deploy file requestId:" + deployFileResponse.getRequestId());
            System.out.println("deploy file deploymentId:" + deployFileResponse.getData());
            for (int i = 0; i < CYCLE_NUM; i++) {
                GetDeploymentRequest getDeploymentRequest = new GetDeploymentRequest();
                getDeploymentRequest.setProjectId(deployFileDTO.getProjectId());
                getDeploymentRequest.setDeploymentId(deployFileResponse.getData());
                GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(getDeploymentRequest);
                // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed. 
                Integer deleteStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                // Poll the status of the deployment task.
                if (deleteStatus == 1) {
                    System.out.println("File deployed.");
                    break;
                } else {
                    (a) System.out.println("Deploying file...");
                    Thread.sleep(1000L);
                }
            }
            return true;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;


    }


    /**
     * Run a smoke test for the node and query the generated instances.
     *
     * @param runSmokeTestDTO
     * @return the instances that are generated by the smoke test
     */
    public List<ListInstancesResponse.Data.Instance> runSmokeTest(RunSmokeTestDTO runSmokeTestDTO) {
        try {
            RunSmokeTestRequest runSmokeTestRequest = new RunSmokeTestRequest();
            runSmokeTestRequest.setBizdate(runSmokeTestDTO.getBizdate());
            runSmokeTestRequest.setNodeId(runSmokeTestDTO.getNodeId());
            runSmokeTestRequest.setNodeParams(runSmokeTestDTO.getNodeParams());
            runSmokeTestRequest.setName(runSmokeTestDTO.getName());
            runSmokeTestRequest.setProjectEnv(runSmokeTestDTO.getProjectEnv());
            RunSmokeTestResponse runSmokeTestResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(runSmokeTestRequest);
            System.out.println(runSmokeTestResponse.getRequestId());
            // DAGID
            System.out.println(runSmokeTestResponse.getData());


            ListInstancesRequest listInstancesRequest = new ListInstancesRequest();
            listInstancesRequest.setDagId(runSmokeTestResponse.getData());
            listInstancesRequest.setProjectId(runSmokeTestDTO.getProjectId());
            listInstancesRequest.setProjectEnv(runSmokeTestDTO.getProjectEnv());
            listInstancesRequest.setNodeId(runSmokeTestDTO.getNodeId());
            listInstancesRequest.setPageNumber(1);
            listInstancesRequest.setPageSize(10);
            ListInstancesResponse listInstancesResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(listInstancesRequest);
            System.out.println(listInstancesResponse.getRequestId());
            List<ListInstancesResponse.Data.Instance> instances = listInstancesResponse.getData().getInstances();
            if (CollectionUtils.isEmpty(instances)) {
                return null;
            }
            return instances;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Query the run log and the details of an instance.
     *
     * @param instanceId the ID of the instance
     * @param projectEnv the environment of the workspace, such as DEV or PROD
     * @return the instance details and the instance log
     */
    public InstanceDetail getInstanceLog(Long instanceId, String projectEnv) {
        try {
            GetInstanceLogRequest getInstanceLogRequest = new GetInstanceLogRequest();
            getInstanceLogRequest.setInstanceId(instanceId);
            getInstanceLogRequest.setProjectEnv(projectEnv);
            GetInstanceLogResponse getInstanceLogResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getInstanceLogRequest);
            System.out.println(getInstanceLogResponse.getRequestId());


            GetInstanceRequest getInstanceRequest = new GetInstanceRequest();
            getInstanceRequest.setInstanceId(instanceId);
            getInstanceRequest.setProjectEnv(projectEnv);
            GetInstanceResponse getInstanceResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getInstanceRequest);
            System.out.println(getInstanceResponse.getRequestId());
            System.out.println(getInstanceResponse.getData());


            InstanceDetail instanceDetail = new InstanceDetail();
            instanceDetail.setInstance(getInstanceResponse.getData());
            instanceDetail.setInstanceLog(getInstanceLogResponse.getData());
            return instanceDetail;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }
}
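
Taken together, the functions above cover the develop, commit, and run loop that this topic describes. The following hypothetical fragment is not part of the official sample; place it in a class that imports the DTO and SDK types used in this topic. The DTO setters, the file type 10 (ODPS SQL), and the Bizdate format are assumptions, so adjust them to your environment.

// Hypothetical fragment that chains the FileService functions above in the
// develop -> commit/deploy -> smoke test -> view logs order used in this topic.
// The DTO setters are assumed to mirror the getters used in FileService.
public void developCommitAndRun(FileService fileService, Long projectId) throws InterruptedException {
    // 1. Create an ODPS SQL file (file type 10) under the sample workflow.
    CreateFileDTO createFileDTO = new CreateFileDTO();
    createFileDTO.setProjectId(projectId);
    createFileDTO.setFileFolderPath("Workflow/My first workflow/MaxCompute");
    createFileDTO.setFileName("simpleSQL.mc.sql");
    createFileDTO.setFileType(10);
    createFileDTO.setContent("SELECT 1;");
    createFileDTO.setAutoParsing(true);
    Long fileId = fileService.createFile(createFileDTO);

    // 2. Commit and deploy the file. deployFile() also submits the file first
    //    if the workspace runs in standard mode.
    DeployFileDTO deployFileDTO = new DeployFileDTO();
    deployFileDTO.setProjectId(projectId);
    deployFileDTO.setFileId(fileId);
    fileService.deployFile(deployFileDTO);

    // 3. Read the node ID that the scheduling system assigned to the file.
    GetFileDTO getFileDTO = new GetFileDTO();
    getFileDTO.setProjectId(projectId);
    getFileDTO.setFileId(fileId);
    Long nodeId = fileService.getFile(getFileDTO).getNodeId();

    // 4. Run a smoke test and print the log of the first generated instance.
    RunSmokeTestDTO runSmokeTestDTO = new RunSmokeTestDTO();
    runSmokeTestDTO.setProjectId(projectId);
    runSmokeTestDTO.setProjectEnv("DEV");
    runSmokeTestDTO.setNodeId(nodeId);
    runSmokeTestDTO.setName("open api smoke test");
    // The expected Bizdate format is an assumption; check the RunSmokeTest reference.
    runSmokeTestDTO.setBizdate("2024-01-01 00:00:00");
    List<ListInstancesResponse.Data.Instance> instances = fileService.runSmokeTest(runSmokeTestDTO);
    if (instances != null && !instances.isEmpty()) {
        InstanceDetail detail = fileService.getInstanceLog(instances.get(0).getInstanceId(), "DEV");
        System.out.println(detail.getInstanceLog());
    }
}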

Step 4: Develop an IDE controller

You need to define an IDE controller that exposes the HTTP endpoints that the frontend calls during frontend development. A minimal application entry point is sketched after the controller code.

package com.aliyun.dataworks.demo;


import com.aliyun.dataworks.dto.*;
import com.aliyun.dataworks.services.BusinessService;
import com.aliyun.dataworks.services.FileService;
import com.aliyun.dataworks.services.FolderService;
import com.aliyun.dataworks.services.ProjectService;
import com.aliyuncs.dataworks_public.model.v20200518.*;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;


import java.util.List;


/**
 * @author dataworks demo
 */
@RestController
@RequestMapping("/ide")
public class IdeController {


    @Autowired
    private FileService fileService;


    @Autowired
    private FolderService folderService;


    @Autowired
    private BusinessService businessService;


    @Autowired
    private ProjectService projectService;


    /**
     * Query files.
     *
     * @param listFilesDTO
     * @return ListFilesResponse.Data.File
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listFiles")
    public List<ListFilesResponse.Data.File> listFiles(ListFilesDTO listFilesDTO) {
        return fileService.listFiles(listFilesDTO);
    }


    /**
     * Query folders.
     *
     * @param listFoldersDTO
     * @return ListFoldersResponse.Data.FoldersItem
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listFolders")
    public List<ListFoldersResponse.Data.FoldersItem> listFolders(ListFoldersDTO listFoldersDTO) {
        return folderService.listFolders(listFoldersDTO);
    }


    /**
     * Create a folder.
     *
     * @param createFolderDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/createFolder")
    public boolean createFolder(@RequestBody CreateFolderDTO createFolderDTO) {
        return folderService.createFolder(createFolderDTO);
    }


    /**
     * Update a folder.
     *
     * @param updateFolderDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/updateFolder")
    public boolean updateFolder(@RequestBody UpdateFolderDTO updateFolderDTO) {
        return folderService.updateFolder(updateFolderDTO);
    }


    /**
     * Query the details of a file.
     *
     * @param getFileDTO
     * @return GetFileResponse.Data.File
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/getFile")
    public GetFileResponse.Data.File getFile(GetFileDTO getFileDTO) {
        return fileService.getFile(getFileDTO);
    }


    /**
     * Create a file.
     *
     * @param createFileDTO
     * @return fileId
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/createFile")
    public Long createFile(@RequestBody CreateFileDTO createFileDTO) {
        return fileService.createFile(createFileDTO);
    }


    /**
     * Update a file.
     *
     * @param updateFileDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/updateFile")
    public boolean updateFile(@RequestBody UpdateFileDTO updateFileDTO) {
        return fileService.updateFile(updateFileDTO);
    }


    /**
     * Commit and deploy a file.
     *
     * @param deployFileDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/deployFile")
    public boolean deployFile(@RequestBody DeployFileDTO deployFileDTO) {
        try {
            return fileService.deployFile(deployFileDTO);
        } catch (Exception e) {
            System.out.println(e);
        }
        return false;
    }


    /**
     * Delete a file.
     *
     * @param deleteFileDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @DeleteMapping("/deleteFile")
    public boolean deleteFile(DeleteFileDTO deleteFileDTO) {
        try {
            return fileService.deleteFile(deleteFileDTO);
        } catch (Exception e) {
            System.out.println(e);
        }
        return false;
    }


    /**
     * Delete a folder.
     *
     * @param deleteFolderDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @DeleteMapping("/deleteFolder")
    public boolean deleteFolder(DeleteFolderDTO deleteFolderDTO) {
        return folderService.deleteFolder(deleteFolderDTO);
    }


    /**
     * list businesses
     *
     * @param listBusinessesDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listBusinesses")
    public List<ListBusinessResponse.Data.BusinessItem> listBusiness(ListBusinessesDTO listBusinessesDTO) {
        return businessService.listBusiness(listBusinessesDTO);
    }


    /**
     * create a business
     *
     * @param createBusinessDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/createBusiness")
    public Long createBusiness(@RequestBody CreateBusinessDTO createBusinessDTO) {
        return businessService.createBusiness(createBusinessDTO);
    }


    /**
     * update a business
     *
     * @param updateBusinessDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/updateBusiness")
    public boolean updateBusiness(@RequestBody UpdateBusinessDTO updateBusinessDTO) {
        return businessService.updateBusiness(updateBusinessDTO);
    }


    /**
     * delete a business
     *
     * @param deleteBusinessDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/deleteBusiness")
    public boolean deleteBusiness(@RequestBody DeleteBusinessDTO deleteBusinessDTO) {
        return businessService.deleteBusiness(deleteBusinessDTO);
    }



    /**
     * @param pageNumber
     * @param pageSize
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listProjects")
    public ListProjectsResponse.PageResult listProjects(Integer pageNumber, Integer pageSize) {
        return projectService.listProjects(pageNumber, pageSize);
    }


    /**
     * @param runSmokeTestDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PutMapping("/runSmokeTest")
    public List<ListInstancesResponse.Data.Instance> runSmokeTest(@RequestBody RunSmokeTestDTO runSmokeTestDTO) {
        return fileService.runSmokeTest(runSmokeTestDTO);
    }


    /**
     * @param instanceId
     * @param projectEnv
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/getLog")
    public InstanceDetail getLog(@RequestParam Long instanceId, @RequestParam String projectEnv) {
        return fileService.getInstanceLog(instanceId, projectEnv);
    }


}
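
The controller above completes the backend. To start the backend on your on-premises machine, a standard Spring Boot entry point is required. The complete sample source code already contains one; the following minimal sketch is only for orientation.

package com.aliyun.dataworks.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

/**
 * Minimal Spring Boot entry point for the backend demo. This is a sketch;
 * see the complete sample source code for the official application class.
 */
@SpringBootApplication(scanBasePackages = "com.aliyun.dataworks")
public class DataWorksDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DataWorksDemoApplication.class, args);
    }
}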

Frontend code development

  1. Initialize the editor, directory tree, and terminal.

    Sample code:

    const App: FunctionComponent<Props> = () => {
      const editorRef = useRef<HTMLDivElement>(null);
      const termianlRef = useRef<HTMLDivElement>(null);
      const [terminal, setTerminal] = useState<NextTerminal>();
      const [editor, setEditor] = useState<monaco.editor.IStandaloneCodeEditor>();
      const [expnadedKeys, setExpandedKeys] = useState<any[]>();
      const [workspace, setWorkspace] = useState<number>();
      const [workspaces, setWorkspaces] = useState<{ label: string, value: number }[]>([]);
      const [dataSource, setDataSource] = useState<any[]>();
      const [selectedFile, setSelectedFile] = useState<number>();
      const [loading, setLoading] = useState<boolean>(false);
      // Create an editor instance.
      useEffect(() => {
        if (editorRef.current) {
          const nextEditor = monaco.editor.create(editorRef.current, editorOptions);
          setEditor(nextEditor);
          return () => { nextEditor.dispose(); };
        }
      }, [editorRef.current]);
      // Add a keyboard input event that is used to save the file.
      useEffect(() => {
        editor?.addCommand(monaco.KeyMod.CtrlCmd | monaco.KeyCode.KeyS, () => {
          if (!workspace) {
            showTips('Please select workspace first');
            return;
          }
          saveFile(workspace, editor, selectedFile);
        });
      }, [editor, workspace, selectedFile]);
      // Create a terminal instance.
      useEffect(() => {
        if (termianlRef.current) {
          const term: NextTerminal = new Terminal(terminalOptions) as any;
          term.pointer = -1;
          term.stack = [];
          setTerminal(term);
          const fitAddon = new FitAddon();
          term.loadAddon(fitAddon);
          term.open(termianlRef.current);
          fitAddon.fit();
          term.write('$ ');
          return () => { term.dispose(); };
        }
      }, [termianlRef.current]);
      // Register a terminal input event.
      useEffect(() => {
        const event = terminal?.onKey(e => onTerminalKeyChange(e, terminal, dataSource, workspace));
        return () => {
          event?.dispose();
        };
      }, [terminal, dataSource, workspace]);
      // Query data sources in the directory tree.
      useEffect(() => {
        workspace && (async () => {
          setLoading(true);
          const nextDataSource = await getTreeDataSource(workspace, workspaces);
          const defaultKey = nextDataSource?.[0]?.key;
          defaultKey && setExpandedKeys([defaultKey]);
          setDataSource(nextDataSource);
          setLoading(false);
        })();
      }, [workspace]);
      // When you click a file in the directory tree, you can query the details and code of the file.
      useEffect(() => {
        workspace && selectedFile && (async () => {
          setLoading(true);
          const file = await getFileInfo(workspace, selectedFile);
          editor?.setValue(file.content);
          editor?.getAction('editor.action.formatDocument').run();
          setLoading(false);
        })();
      }, [selectedFile]);
      // Query workspaces.
      useEffect(() => {
        (async () => {
          const list = await getWorkspaceList();
          setWorkspaces(list);
        })();
      }, []);
      const onExapnd = useCallback((keys: number[]) => { setExpandedKeys(keys); }, []);
      const onWorkspaceChange = useCallback((value: number) => { setWorkspace(value) }, []);
      const onTreeNodeSelect = useCallback((key: number[]) => { key[0] && setSelectedFile(key[0]) }, []);
      return (
        <div className={cn(classes.appWrapper)}>
          <div className={cn(classes.leftArea)}>
            <div className={cn(classes.workspaceWrapper)}>
              Workspace:
              <Select
                value={workspace}
                dataSource={workspaces}
                onChange={onWorkspaceChange}
                autoWidth={false}
                showSearch
              />
            </div>
            <div className={cn(classes.treeWrapper)}>
              <Tree
                dataSource={dataSource}
                isNodeBlock={{ defaultPaddingLeft: 20 }}
                expandedKeys={expnadedKeys}
                selectedKeys={[selectedFile]}
                onExpand={onExapnd}
                onSelect={onTreeNodeSelect}
                defaultExpandAll
              />
            </div>
          </div>
          <div className={cn(classes.rightArea)}>
            <div
              className={cn(classes.monacoEditorWrapper)}
              ref={editorRef}
            />
            <div
              className={cn(classes.panelWrapper)}
              ref={termianlRef}
            />
          </div>
          <div className={cn(classes.loaderLine)} style={{ display: loading ? 'block' : 'none' }} />
        </div>
      );
    };
                            
  2. Query the sample workflow and file and display the directory tree.

    The following flowchart shows how to query the sample workflow and file.

    /**
     * Query data sources in the directory tree.
     * @param workspace The ID of the workspace.
     * @param dataSource The workspaces.
     */
    async function getTreeDataSource(workspace: number, dataSource: { label: string, value: number }[]) {
      try {
        const businesses = await services.ide.getBusinessList(workspace, openPlatformBusinessName);
        businesses.length === 0 && await services.ide.createBusiness(workspace, openPlatformBusinessName);
      } catch (e) {
        showError('You have no permission to access this workspace.');
        return;
      }
      const fileFolderPath = `Workflow/${openPlatformBusinessName}/MaxCompute`;
      const files = await services.ide.getFileList(workspace, fileFolderPath);
      let children: { key: number, label: string }[] = [];
      if (files.length === 0) {
        try {
          const currentWorkspace = dataSource.find(i => i.value === workspace);
          const file1 = await services.ide.createFile(workspace, currentWorkspace!.label, fileFolderPath, 'simpleSQL.mc.sql', 'SELECT 1');
          const file2 = await services.ide.createFile(workspace, currentWorkspace!.label, fileFolderPath, 'createTable.mc.sql', 'CREATE TABLE IF NOT EXISTS _qcc_mysql1_odps_source_20220113100903_done_ (\ncol string\n)\nCOMMENT \'Tables whose full data is synchronized are marked with done.\'\nPARTITIONED BY\n(\nstatus STRING   COMMENT \'Partitions marked with done\'\n)\nLIFECYCLE 36500;');
          children = children.concat([
            { key: file1, label: 'simpleSQL.mc.sql' },
            { key: file2, label: 'createTable.mc.sql' },
          ]);
        } catch (e) {
          showError('Create file failed. The datasource odps_source does not exist.');
          return;
        }
      } else {
        children = files.map((i) => ({ key: i.fileId, label: i.fileName }));
      }
      return [{ key: 1, label: openPlatformBusinessName, children }];
    }
  3. After you edit and save a file, pass the edited content to the backend to update the file.

    Sample code:

    /**
     * Press Ctrl+S to save the file.
     * @param workspace The ID of the workspace.
     * @param editor The editor instance.
     * @param selectedFile The file you selected.
     */
    async function saveFile(workspace: number, editor: monaco.editor.IStandaloneCodeEditor, selectedFile?: number) {
      if (!selectedFile) {
        showTips('Please select a file.');
        return;
      }
      const content = editor.getValue();
      const result = await services.ide.updateFile(workspace, selectedFile, { content });
      result ? showTips('Saved file') : showError('Failed to save file');
    }
  4. When you enter dw run ... in the terminal, the file is committed to the scheduling system and smoke testing is performed.

    The following flowchart shows how to commit the file to the scheduling system and perform smoke testing.

    /**
     * Process the keyboard input event of the terminal.
     * @param e The keyboard input event to process.
     * @param term The terminal instance.
     * @param dataSource The data source in the directory tree.
     * @param workspace The ID of the workspace.
     */
    function onTerminalKeyChange(e: { key: string; domEvent: KeyboardEvent; }, term: NextTerminal, dataSource: any, workspace?: number) {
      const ev = e.domEvent;
      const printable = !ev.altKey && !ev.ctrlKey && !ev.metaKey;
      term.inputText = typeof term.inputText === 'string' ? term.inputText : '';
      switch (ev.key) {
        case 'ArrowUp':
          term.pointer = term.pointer < (term.stack.length - 1) ? term.pointer + 1 : term.pointer;
          term.inputText = term.stack[term.pointer];
          term.write(`\x1b[2K\r$ ${term.inputText}`);
          break;
        case 'ArrowDown':
          term.pointer = term.pointer > -1 ? term.pointer - 1 : -1;
          term.inputText = term.pointer === -1 ? '' : term.stack[term.pointer];
          term.write(`\x1b[2K\r$ ${term.inputText}`);
          break;
        case 'ArrowLeft':
          (term as any)._core.buffer.x > 2 && printable && term.write(e.key);
          break;
        case 'ArrowRight':
          (term as any)._core.buffer.x <= (term.inputText.length + 1) && printable && term.write(e.key);
          break;
        case 'Enter':
          commandHandler(term, dataSource, workspace);
          break;
        case 'Backspace':
          if ((term as any)._core.buffer.x > 2) {
            term.inputText = term.inputText.slice(0, -1);
            term.write('\b \b');
          }
          break;
        default:
          if (printable) {
            term.inputText += e.key;
            term.write(e.key);
          }
      }
    }
    /**
     * Process the keyboard input event. This function is called when dw run ... is entered in the terminal.
     * @param term The terminal instance.
     * @param dataSource The data source in the directory tree.
     * @param workspace The ID of the workspace.
     */
    async function commandHandler(term: NextTerminal, dataSource: any, workspace?: number) {
      term.write('\r\n$ ');
      const input = term.inputText;
      term.inputText = '';
      if (['', undefined].includes(input)) {
        return;
      }
      term.stack = [input!, ...term.stack];
      term.pointer = -1;
      if (!workspace) {
        term.write(highlight.text('[ERROR] You should select workspace first.\r\n$ ', brush));
        return;
      }
      // Process the input command. If the command starts with dw and the command is run, process the input command. Otherwise, an error is reported.
      const words = input?.split(' ');
      const tag = words?.[0].toLowerCase();
      const command = words?.[1]?.toLowerCase();
      const fileName = words?.[2];
      if (tag !== 'dw' || !validCommands.includes(command!)) {
        term.write(highlight.text('[ERROR] Invalid command.\r\n$ ', brush));
        return;
      }
      // Query the input file.
      const source = dataSource?.[0]?.children.find((i: any) => i.label === fileName);
      if (!source) {
        term.write(highlight.text('[ERROR] File name does not exist.\r\n$ ', brush));
        return;
      }
      const file = await services.ide.getFile(workspace, source.key);
      if (!file) {
        term.write(highlight.text('[ERROR] File name does not exist.\r\n$ ', brush));
        return;
      }
      term.write(highlight.text('[INFO] Submitting file.\r\n$ ', brush));
      // Deploy the file to the scheduling system.
      const response = await services.ide.deployFile(workspace, source.key);
      if (response) {
        term.write(highlight.text('[INFO] Submit file success.\r\n$ ', brush));
      } else {
        term.write(highlight.text('[ERROR] Submit file failed.\r\n$ ', brush));
        return;
      }
      // Perform smoke testing and run the scheduling node.
      let dag: services.ide.Dag;
      try {
        term.write(highlight.text('[INFO] Start to run task.\r\n$ ', brush));
        dag = (await services.ide.runSmoke(workspace, file.nodeId, openPlatformBusinessName))[0];
        term.write(highlight.text('[INFO] Trigger sql task success.\r\n$ ', brush));
      } catch (e) {
        term.write(highlight.text('[ERROR] Trigger sql task failed.\r\n$ ', brush));
        return;
      }
      // Perform a round robin to query the logs of the node.
      const event = setInterval(async () => {
        try {
          const logInfo = await services.ide.getLog(dag.instanceId, 'DEV');
          let log: string;
          switch (logInfo.instance.status) {
            case 'WAIT_TIME':
              log = 'Waiting for the scheduling time to arrive';
              break;
            case 'WAIT_RESOURCE':
              log = 'Waiting for resources...';
              break;
            default:
              log = logInfo.instanceLog;
          }
          term.write(`${highlight.text(log, brush).replace(/\n/g, '\r\n')}\r\n$ `);
          const finished = ['SUCCESS', 'FAILURE', 'NOT_RUN'].includes(logInfo.instance.status);
          finished && clearInterval(event);
        } catch (e) {
          term.write(highlight.text('[ERROR] SQL Task run failed.\r\n$ ', brush));
          return;
        }
      }, 3000);
    }
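
    For example, after the sample files are created, you can enter the following command in the terminal to commit the simpleSQL.mc.sql file and trigger smoke testing. The command format is dw run <file name>.

    dw run simpleSQL.mc.sql

    The terminal then prints the [INFO] messages and polls the logs of the node every three seconds until the status of the instance becomes SUCCESS, FAILURE, or NOT_RUN.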

Deploy and run the code on your on-premises machine

You must follow the instructions provided on GitHub to prepare the environment. Make sure that the following dependencies are installed: Java 8 or later, Maven, a Node.js runtime environment, and pnpm.
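
If you want to confirm that the prerequisites are available, you can first check their versions. The following commands are only a quick sanity check; the exact versions that are required are described in the GitHub repository.

java -version
mvn -v
node -v
pnpm -v

Then, perform the initial installation.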

pnpm install

You must also update the AccessKey pair information in the project root path. Then, run the following command in the project root directory to start the sample code:

npm run example:ide

You can enter https://localhost:8080 in the address bar of a browser to verify the results.


Reference: Download complete sample source code

You can download the complete sample source code from GitHub. The following sample code covers all the operations that are described in this topic:

import { useEffect, useRef, useState, useCallback } from 'react';
import type { FunctionComponent } from 'react';
import cn from 'classnames';
import * as monaco from 'monaco-editor';
import { Terminal } from 'xterm';
import { FitAddon } from 'xterm-addon-fit';
import { Tree, Select, Message } from '@alifd/next';
import * as highlight from '../helpers/highlight';
import * as services from '../services';
import classes from '../styles/app.module.css';

export interface Props {}
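// The terminal instance is extended with an input buffer, a command history stack, and a history pointer.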
export interface NextTerminal extends Terminal {
  inputText?: string;
  stack: string[];
  pointer: number;
}

const brush = {
  rules: [
    { regex: /\bERROR\b/gmi, theme: 'red' },
    { regex: /\bWARN\b/gmi, theme: 'yellow' },
    { regex: /\bINFO\b/gmi, theme: 'green' },
    { regex: /^FAILED:.*$/gmi, theme: 'red' },
  ],
};
// The name of the sample workflow.
const openPlatformBusinessName = 'Sample workflow in Open Platform';
// The parameters for creating the editor instance.
const editorOptions = {
  content: '',
  language: 'sql',
  theme: 'vs-dark',
  automaticLayout: true,
  fontSize: 16,
};
// The parameters for creating the terminal instance.
const terminalOptions = {
  cursorBlink: true,
  cursorStyle: 'underline' as const,
  fontSize: 16,
};
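// The subcommands that are allowed after the dw prefix in the terminal.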
const validCommands = [
  'run',
];

/**
 * Method to display error messages in a pop-up window
 * @param message The error message.
 */
function showError(message: string) {
  Message.error({ title: 'Error Message', content: message });
}
/**
 * Method to display prompt messages in a pop-up window
 * @param message The prompt message.
 */
function showTips(message: string) {
  Message.show({ title: 'Tips', content: message });
}
/**
 * Process the keyboard input event. This function is called when dw run ... is entered in the terminal.
 * @param term The terminal instance.
 * @param dataSource The data source in the directory tree.
 * @param workspace The ID of the workspace.
 */
async function commandHandler(term: NextTerminal, dataSource: any, workspace?: number) {
  term.write('\r\n$ ');
  const input = term.inputText;
  term.inputText = '';
  if (['', undefined].includes(input)) {
    return;
  }
  term.stack = [input!, ...term.stack];
  term.pointer = -1;
  if (!workspace) {
    term.write(highlight.text('[ERROR] You should select workspace first.\r\n$ ', brush));
    return;
  }
  // Process the input command. If the command starts with dw and the command is run, process the input command. Otherwise, an error is reported.
  const words = input?.split(' ');
  const tag = words?.[0].toLowerCase();
  const command = words?.[1]?.toLowerCase();
  const fileName = words?.[2];
  if (tag !== 'dw' || !validCommands.includes(command!)) {
    term.write(highlight.text('[ERROR] Invalid command.\r\n$ ', brush));
    return;
  }
  // Query the input file.
  const source = dataSource?.[0]?.children.find((i: any) => i.label === fileName);
  if (!source) {
    term.write(highlight.text('[ERROR] File name does not exist.\r\n$ ', brush));
    return;
  }
  const file = await services.ide.getFile(workspace, source.key);
  if (!file) {
    term.write(highlight.text('[ERROR] File name does not exist.\r\n$ ', brush));
    return;
  }
  term.write(highlight.text('[INFO] Submitting file.\r\n$ ', brush));
  // Deploy the file to the scheduling system.
  const response = await services.ide.deployFile(workspace, source.key);
  if (response) {
    term.write(highlight.text('[INFO] Submit file success.\r\n$ ', brush));
  } else {
    term.write(highlight.text('[ERROR] Submit file failed.\r\n$ ', brush));
    return;
  }
  // Perform smoke testing and run the scheduling node.
  let dag: services.ide.Dag;
  try {
    term.write(highlight.text('[INFO] Start to run task.\r\n$ ', brush));
    dag = (await services.ide.runSmoke(workspace, file.nodeId, openPlatformBusinessName))[0];
    term.write(highlight.text('[INFO] Trigger sql task success.\r\n$ ', brush));
  } catch (e) {
    term.write(highlight.text('[ERROR] Trigger sql task failed.\r\n$ ', brush));
    return;
  }
  // Perform a round robin to query the logs of the node.
  const event = setInterval(async () => {
    try {
      const logInfo = await services.ide.getLog(dag.instanceId, 'DEV');
      let log: string;
      switch (logInfo.instance.status) {
        case 'WAIT_TIME':
          log = 'Waiting for the scheduling time to arrive';
          break;
        case 'WAIT_RESOURCE':
          log = 'Waiting for resources...';
          break;
        default:
          log = logInfo.instanceLog;
      }
      term.write(`${highlight.text(log, brush).replace(/\n/g, '\r\n')}\r\n$ `);
      const finished = ['SUCCESS', 'FAILURE', 'NOT_RUN'].includes(logInfo.instance.status);
      finished && clearInterval(event);
    } catch (e) {
      term.write(highlight.text('[ERROR] SQL Task run failed.\r\n$ ', brush));
      return;
    }
  }, 3000);
}
/**
 * Process the keyboard input event of the terminal.
 * @param e The keyboard input event to process.
 * @param term The terminal instance.
 * @param dataSource The data source in the directory tree.
 * @param workspace The ID of the workspace.
 */
function onTerminalKeyChange(e: { key: string; domEvent: KeyboardEvent; }, term: NextTerminal, dataSource: any, workspace?: number) {
  const ev = e.domEvent;
  const printable = !ev.altKey && !ev.ctrlKey && !ev.metaKey;
  term.inputText = typeof term.inputText === 'string' ? term.inputText : '';
  switch (ev.key) {
    case 'ArrowUp':
      term.pointer = term.pointer < (term.stack.length - 1) ? term.pointer + 1 : term.pointer;
      term.inputText = term.stack[term.pointer];
      term.write(`\x1b[2K\r$ ${term.inputText}`);
      break;
    case 'ArrowDown':
      term.pointer = term.pointer > -1 ? term.pointer - 1 : -1;
      term.inputText = term.pointer === -1 ? '' : term.stack[term.pointer];
      term.write(`\x1b[2K\r$ ${term.inputText}`);
      break;
    case 'ArrowLeft':
      (term as any)._core.buffer.x > 2 && printable && term.write(e.key);
      break;
    case 'ArrowRight':
      (term as any)._core.buffer.x <= (term.inputText.length + 1) && printable && term.write(e.key);
      break;
    case 'Enter':
      commandHandler(term, dataSource, workspace);
      break;
    case 'Backspace':
      if ((term as any)._core.buffer.x > 2) {
        term.inputText = term.inputText.slice(0, -1);
        term.write('\b \b');
      }
      break;
    default:
      if (printable) {
        term.inputText += e.key;
        term.write(e.key);
      }
  }
}
/**
 * Query workspaces.
 */
async function getWorkspaceList() {
  const response = await services.tenant.getProjectList();
  const list = response.projectList.filter(i => i.projectStatusCode === 'AVAILABLE').map(i => (
    { label: i.projectName, value: i.projectId }
  ));
  return list;
}
/**
 * Query data sources in the directory tree.
 * @param workspace The ID of the workspace.
 * @param dataSource The workspaces.
 */
async function getTreeDataSource(workspace: number, dataSource: { label: string, value: number }[]) {
  try {
    const businesses = await services.ide.getBusinessList(workspace, openPlatformBusinessName);
    businesses.length === 0 && await services.ide.createBusiness(workspace, openPlatformBusinessName);
  } catch (e) {
    showError('You have no permission to access this workspace.');
    return;
  }
  const fileFolderPath = `Workflow/${openPlatformBusinessName}/MaxCompute`;
  const files = await services.ide.getFileList(workspace, fileFolderPath);
  let children: { key: number, label: string }[] = [];
  if (files.length === 0) {
    try {
      const currentWorkspace = dataSource.find(i => i.value === workspace);
      const file1 = await services.ide.createFile(workspace, currentWorkspace!.label, fileFolderPath, 'simpleSQL.mc.sql', 'SELECT 1');
      const file2 = await services.ide.createFile(workspace, currentWorkspace!.label, fileFolderPath, 'createTable.mc.sql', 'CREATE TABLE IF NOT EXISTS _qcc_mysql1_odps_source_20220113100903_done_ (\ncol string\n)\nCOMMENT \'Tables whose full data is synchronized are marked with done.\'\nPARTITIONED BY\n(\nstatus STRING   COMMENT \'Partitions marked with done\'\n)\nLIFECYCLE 36500;');
      children = children.concat([
        { key: file1, label: 'simpleSQL.mc.sql' },
        { key: file2, label: 'createTable.mc.sql' },
      ]);
    } catch (e) {
      showError('Create file failed. The datasource odps_source does not exist.');
      return;
    }
  } else {
    children = files.map((i) => ({ key: i.fileId, label: i.fileName }));
  }
  return [{ key: 1, label: openPlatformBusinessName, children }];
}
/**
 * Query the details of the file.
 * @param workspace The ID of the workspace.
 * @param fileId The ID of the file.
 */
async function getFileInfo(workspace: number, fileId: number) {
  const response = await services.ide.getFile(workspace, fileId);
  return response;
}
/**
 * Press Ctrl+S to save the file.
 * @param workspace The ID of the workspace.
 * @param editor The editor instance.
 * @param selectedFile The file you selected.
 */
async function saveFile(workspace: number, editor: monaco.editor.IStandaloneCodeEditor, selectedFile?: number) {
  if (!selectedFile) {
    showTips('Please select a file.');
    return;
  }
  const content = editor.getValue();
  const result = await services.ide.updateFile(workspace, selectedFile, { content });
  result ? showTips('Saved file') : showError('Failed to save file');
}

const App: FunctionComponent<Props> = () => {
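  // Refs and state for the Monaco editor, the xterm terminal, the workspace selector, and the directory tree.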
  const editorRef = useRef<HTMLDivElement>(null);
  const terminalRef = useRef<HTMLDivElement>(null);
  const [terminal, setTerminal] = useState<NextTerminal>();
  const [editor, setEditor] = useState<monaco.editor.IStandaloneCodeEditor>();
  const [expandedKeys, setExpandedKeys] = useState<any[]>();
  const [workspace, setWorkspace] = useState<number>();
  const [workspaces, setWorkspaces] = useState<{ label: string, value: number }[]>([]);
  const [dataSource, setDataSource] = useState<any[]>();
  const [selectedFile, setSelectedFile] = useState<number>();
  const [loading, setLoading] = useState<boolean>(false);
  // Create an editor instance.
  useEffect(() => {
    if (editorRef.current) {
      const nextEditor = monaco.editor.create(editorRef.current, editorOptions);
      setEditor(nextEditor);
      return () => { nextEditor.dispose(); };
    }
  }, [editorRef.current]);
  // Add a keyboard input event that is used to save the file.
  useEffect(() => {
    editor?.addCommand(monaco.KeyMod.CtrlCmd | monaco.KeyCode.KeyS, () => {
      if (!workspace) {
        showTips('Please select workspace first');
        return;
      }
      saveFile(workspace, editor, selectedFile);
    });
  }, [editor, workspace, selectedFile]);
  // Create a terminal instance.
  useEffect(() => {
    if (terminalRef.current) {
      const term: NextTerminal = new Terminal(terminalOptions) as any;
      term.pointer = -1;
      term.stack = [];
      setTerminal(term);
      const fitAddon = new FitAddon();
      term.loadAddon(fitAddon);
      term.open(terminalRef.current);
      fitAddon.fit();
      term.write('$ ');
      return () => { term.dispose(); };
    }
  }, [terminalRef.current]);
  // Register a terminal input event.
  useEffect(() => {
    const event = terminal?.onKey(e => onTerminalKeyChange(e, terminal, dataSource, workspace));
    return () => {
      event?.dispose();
    };
  }, [terminal, dataSource, workspace]);
  // Query data sources in the directory tree.
  useEffect(() => {
    workspace && (async () => {
      setLoading(true);
      const nextDataSource = await getTreeDataSource(workspace, workspaces);
      const defaultKey = nextDataSource?.[0]?.key;
      defaultKey && setExpandedKeys([defaultKey]);
      setDataSource(nextDataSource);
      setLoading(false);
    })();
  }, [workspace]);
  // When you click a file in the directory tree, you can query the details and code of the file.
  useEffect(() => {
    workspace && selectedFile && (async () => {
      setLoading(true);
      const file = await getFileInfo(workspace, selectedFile);
      editor?.setValue(file.content);
      editor?.getAction('editor.action.formatDocument')?.run();
      setLoading(false);
    })();
  }, [selectedFile]);
  // Query workspaces.
  useEffect(() => {
    (async () => {
      const list = await getWorkspaceList();
      setWorkspaces(list);
    })();
  }, []);
  const onExpand = useCallback((keys: number[]) => { setExpandedKeys(keys); }, []);
  const onWorkspaceChange = useCallback((value: number) => { setWorkspace(value) }, []);
  const onTreeNodeSelect = useCallback((key: number[]) => { key[0] && setSelectedFile(key[0]) }, []);
  return (
    <div className={cn(classes.appWrapper)}>
      <div className={cn(classes.leftArea)}>
        <div className={cn(classes.workspaceWrapper)}>
          Workspace:
          <Select
            value={workspace}
            dataSource={workspaces}
            onChange={onWorkspaceChange}
            autoWidth={false}
            showSearch
          />
        </div>
        <div className={cn(classes.treeWrapper)}>
          <Tree
            dataSource={dataSource}
            isNodeBlock={{ defaultPaddingLeft: 20 }}
            expandedKeys={expandedKeys}
            selectedKeys={[selectedFile]}
            onExpand={onExpand}
            onSelect={onTreeNodeSelect}
            defaultExpandAll
          />
        </div>
      </div>
      <div className={cn(classes.rightArea)}>
        <div
          className={cn(classes.monacoEditorWrapper)}
          ref={editorRef}
        />
        <div
          className={cn(classes.panelWrapper)}
          ref={terminalRef}
        />
      </div>
      <div className={cn(classes.loaderLine)} style={{ display: loading ? 'block' : 'none' }} />
    </div>
  );
};

export default App;