
Simple Log Service: Use Simple Log Service SDK for Java to manage data shipping jobs

Last Updated: Sep 18, 2024

After you use Simple Log Service to collect data, you can use the data shipping feature to ship the collected data to other Alibaba Cloud services by using the Simple Log Service console or Simple Log Service SDK. This way, you can use other systems to store data or consume data. This topic describes how to use Simple Log Service SDK for Java to manage data shipping jobs.

Prerequisites

  • A Resource Access Management (RAM) user is created, and the required permissions are granted to the RAM user. For more information, see Create a RAM user and grant permissions to the RAM user.

  • The ALIBABA_CLOUD_ACCESS_KEY_ID and ALIBABA_CLOUD_ACCESS_KEY_SECRET environment variables are configured. For more information, see Configure environment variables in Linux, macOS, and Windows.

    Important
    • The AccessKey pair of an Alibaba Cloud account has permissions on all API operations. We recommend that you use the AccessKey pair of a RAM user to call API operations or perform routine O&M.

    • We recommend that you do not save the AccessKey ID or AccessKey secret in your project code. Otherwise, the AccessKey pair may be leaked, and the security of all resources within your account may be compromised.

  • Simple Log Service SDK for Java is installed. For more information, see Install Simple Log Service SDK for Java.

  • Logs are written to a Logstore. For more information, see Data collection overview.

  • The MaxCompute endpoint for the required region is obtained from MaxCompute. For more information, see Endpoints in different regions (VPC).

    You must specify this endpoint when you create or update a MaxCompute data shipping job.

Usage notes

In this example, the public Simple Log Service endpoint for the China (Hangzhou) region is used, which is https://cn-hangzhou.log.aliyuncs.com. If you want to access Simple Log Service by using other Alibaba Cloud services that reside in the same region as your project, you can use the internal Simple Log Service endpoint, which is https://cn-hangzhou-intranet.log.aliyuncs.com. For more information about the supported regions and endpoints of Simple Log Service, see Endpoints.
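The endpoint pattern described above can be illustrated with a small sketch. The helper below is hypothetical (it is not part of the SDK); it only shows how the public and internal endpoints for a region differ.

```java
public class EndpointDemo {
    // Hypothetical helper: assembles a Simple Log Service endpoint from a region ID.
    // Pass internal=true to get the intranet endpoint for same-region access.
    static String slsEndpoint(String region, boolean internal) {
        return "https://" + region + (internal ? "-intranet" : "") + ".log.aliyuncs.com";
    }

    public static void main(String[] args) {
        System.out.println(slsEndpoint("cn-hangzhou", false)); // https://cn-hangzhou.log.aliyuncs.com
        System.out.println(slsEndpoint("cn-hangzhou", true));  // https://cn-hangzhou-intranet.log.aliyuncs.com
    }
}
```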

Sample code

Query OSS or MaxCompute data shipping jobs

The following code provides an example on how to create a file named QuerySinkDemo.java. The file is used to query the configurations of a data shipping job in a specified project.

import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import com.aliyun.openservices.log.Client;
import com.aliyun.openservices.log.common.*;
import com.aliyun.openservices.log.exception.LogException;
import com.aliyun.openservices.log.request.*;
import com.aliyun.openservices.log.response.*;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class QuerySinkDemo {
    // The Simple Log Service endpoint. In this example, the Simple Log Service endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. 
    private static final String endpoint = "https://cn-hangzhou.log.aliyuncs.com";
    // In this example, the AccessKey ID and AccessKey secret are obtained from environment variables. 
    private static final String accessKeyId = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID");
    private static final String accessKeySecret = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET");
    // The name of the project. Enter a name based on your business requirements. You must specify the name of an existing project. 
    private static final String project = "ali-test-project";
    // The name of the data shipping job. You can view the name of the data shipping job in the Basic Information section. 
    private static final String jobName = "ali-test-job";

    // Create a Simple Log Service client. 
    private static final Client client = new Client(endpoint, accessKeyId, accessKeySecret);

    private static String getSinkExportJob() throws LogException {
        GetJobResponse getJobResponse = client.getJob(new GetJobRequest(project, jobName));
        Job job = getJobResponse.getJob();
        return JSONObject.toJSONString(job);
    }

    private static List<Export> listSinkExportJob() throws LogException {
        ListExportResponse listExportResponse = client.listExport(new ListExportRequest(project));
        return listExportResponse.getResults();
    }

    private static List<String> getOdpsSinkExportJob() throws LogException {
        List<String> odpsSinkList = new ArrayList<>();
        List<Export> listExports = listSinkExportJob();
        for (Export job:listExports) {
            DataSink type = job.getConfiguration().getSink();
            Map<String, Object> map = JSONObject.parseObject(JSONArray.toJSONString(type));
            Object sinkType = map.get("type");
            // Compare with the constant first to avoid a NullPointerException if "type" is absent.
            if ("AliyunODPS".equals(sinkType)) {
                odpsSinkList.add(JSONArray.toJSONString(job));
            }
        }
        return odpsSinkList;
    }
    private static List<String> getOssSinkExportJob() throws LogException {
        List<String> ossSinkList = new ArrayList<>();
        List<Export> listExports = listSinkExportJob();
        for (Export job:listExports) {
            DataSink type = job.getConfiguration().getSink();
            Map<String, Object> map = JSONObject.parseObject(JSONArray.toJSONString(type));
            Object sinkType = map.get("type");
            // Compare with the constant first to avoid a NullPointerException if "type" is absent.
            if ("AliyunOSS".equals(sinkType)) {
                ossSinkList.add(JSONArray.toJSONString(job));
            }
        }
        return ossSinkList;
    }

    public static void main(String[] args) throws LogException {
        // Query the specified data shipping job. 
        String jobConfig = getSinkExportJob();
        System.out.println("**********Query the configurations of the specified data shipping job**********");
        System.out.println(jobConfig);
        // Query all data shipping jobs. 
        listSinkExportJob();
        // Query all MaxCompute data shipping jobs. 
        List<String> odpsSinkList = getOdpsSinkExportJob();
        System.out.println("**********Query the configurations of all MaxCompute data shipping jobs**********");
        System.out.println(odpsSinkList);
        // Query all Object Storage Service (OSS) data shipping jobs. 
        List<String> ossSinkList = getOssSinkExportJob();
        System.out.println("**********Query the configurations of all OSS data shipping jobs**********");
        System.out.println(ossSinkList);
    }
}

Expected results:

**********Query the configurations of the specified data shipping job**********
{
    "configuration": {
    ......
        "fromTime": 1,
        "instanceType": "Standard",
        "logstore": "ali-test-logstore",
    ......
    "state": "Enabled",
    "status": "RUNNING",
    "type": "Export"
}
**********Query the configurations of all MaxCompute data shipping jobs**********
[{
    "configuration": {
    ......
    "status": "RUNNING",
    "type": "Export"
}, {
    "configuration": {
    ......
    "state": "Enabled",
    "status": "STOPPED",
    "type": "Export"
}]
**********Query the configurations of all OSS data shipping jobs**********
[{
    "configuration": {
    ......
        "fromTime": 0,
        "instanceType": "Standard",
        "logstore": "ali-test-logstore",
    ......
    "state": "Enabled",
    "status": "RUNNING",
    "type": "Export"
}]
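In QuerySinkDemo above, the sink type is read from the serialized sink configuration and compared against "AliyunODPS" or "AliyunOSS". As a standalone illustration of that check, the hypothetical sketch below extracts the "type" field from a sink JSON string with a regular expression; in production, a real JSON parser (such as fastjson, which the demo already uses) is preferable.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SinkTypeDemo {
    // Illustration only: extract the "type" field from a serialized sink JSON
    // string, mirroring the map.get("type") check in QuerySinkDemo.
    private static final Pattern TYPE = Pattern.compile("\"type\"\\s*:\\s*\"([^\"]+)\"");

    static String sinkType(String sinkJson) {
        Matcher m = TYPE.matcher(sinkJson);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(sinkType("{\"type\":\"AliyunODPS\",\"odpsProject\":\"p\"}")); // AliyunODPS
        System.out.println(sinkType("{\"type\":\"AliyunOSS\",\"bucket\":\"b\"}"));       // AliyunOSS
    }
}
```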

Create an OSS data shipping job

The following code provides an example on how to create a file named CreateOssSinkDemo.java. The file is used to create an OSS data shipping job.

package demo;

import com.aliyun.openservices.log.Client;
import com.aliyun.openservices.log.common.*;
import com.aliyun.openservices.log.exception.LogException;
import com.aliyun.openservices.log.request.*;

public class CreateOssSinkDemo {
    // The Simple Log Service endpoint. In this example, the Simple Log Service endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. 
    private static final String endpoint = "https://cn-hangzhou.log.aliyuncs.com";
    // In this example, the AccessKey ID and AccessKey secret are obtained from environment variables. 
    private static final String accessKeyId = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID");
    private static final String accessKeySecret = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET");

    // The name of the project. Enter a name based on your business requirements. You must specify the name of an existing project. 
    private static final String project = "ali-test-project";
    // The name of the Logstore. Enter a name based on your business requirements. You must specify the name of a Logstore that is obtained from your project. 
    private static final String logStore = "ali-test-logstore";
    // The Alibaba Cloud Resource Name (ARN) of the RAM role. Enter an ARN based on your business requirements. You must specify an ARN that is obtained from the required RAM role in the RAM console. 
    private static final String roleArn = "acs:ram::111111";
    // The name of the OSS bucket. Enter a name based on your business requirements. You must specify the name of an existing OSS bucket. 
    private static final String bucket = "yourBucketName";
    // The name of the data shipping job. 
    private static final String jobName = "ali-test-job-name";
    // The display name of the data shipping job. 
    private static final String displayName = "ali-test-job-displayname";
    // The description of the data shipping job. 
    private static final String description = "This is an OSS Shipper task.";
    // The directory to which you want to ship data in the OSS bucket. 
    private static final String preffix = "test";
    // The suffix of the OSS objects in which the shipped data is stored. 
    private static final String suffix = "11111";
    // The partition format that is used to generate subdirectories in the OSS bucket. A subdirectory is dynamically generated based on the shipping time. 
    private static final String pathFormat = "%Y/%m/%d/%H/%M";
    // The size of data in a shard. The value of this parameter is the size of the raw data that is shipped to OSS and is stored in an OSS object. Unit: MB. 
    private static final int bufferSize = 255;
    // The interval between two operations that ship the data of a shard. Unit: seconds. Valid values: 300 to 900. 
    private static final int bufferInterval = 300;
    // The storage format for data. After data is shipped to OSS, the data can be stored in different formats. Valid values: csv, json, parquet, and orc. 
    private static final String contentType = "json";
    // The start time of the data shipping job. The value 1 specifies that the job starts to ship historical data that is generated at the earliest point in time. If you specify a point in time, the job starts to ship data at the specified point in time. 
    private static final int fromtime = 1;
    // The end time of the data shipping job. The value 0 specifies that the job constantly ships data as long as data is available. If you specify a point in time, the job stops shipping data at the specified point in time. 
    private static final int totime = 0;

    private static void createOssExportJob(Client client) throws LogException {
        Export export = new Export();
        export.setName(jobName);
        export.setDisplayName(displayName);
        export.setDescription(description);

        ExportConfiguration exportConfiguration = new ExportConfiguration();
        AliyunOSSSink ossSink = new AliyunOSSSink();
        ossSink.setRoleArn(roleArn);
        ossSink.setBucket(bucket);
        ossSink.setPrefix(preffix);
        ossSink.setSuffix(suffix);
        ossSink.setPathFormat(pathFormat);
        ossSink.setBufferSize(bufferSize);
        ossSink.setBufferInterval(bufferInterval);
        ossSink.setContentType(contentType);

        ExportContentJsonDetail jsonDetail = new ExportContentJsonDetail();
        jsonDetail.setEnableTag(true);

        ossSink.setContentDetail(jsonDetail);
        exportConfiguration.setLogstore(logStore);
        exportConfiguration.setRoleArn(roleArn);
        exportConfiguration.setSink(ossSink);
        exportConfiguration.setFromTime(fromtime);
        exportConfiguration.setToTime(totime);
        exportConfiguration.setVersion("v2.0");
        export.setConfiguration(exportConfiguration);
        client.createExport(new CreateExportRequest(project, export));
    }

    public static void main(String[] args) throws LogException {
        Client client = new Client(endpoint, accessKeyId, accessKeySecret);
        createOssExportJob(client);
    }
}
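To see what the pathFormat value "%Y/%m/%d/%H/%M" means in practice, the hypothetical sketch below renders the strftime-style format for a given shipping time; it shows which OSS subdirectory (inside the configured prefix) an object would land in. The actual substitution is performed by the shipping job, not by client code.

```java
public class PathFormatDemo {
    // Illustration only: render the strftime-style pathFormat for a given
    // shipping time. %Y=year, %m=month, %d=day, %H=hour, %M=minute.
    static String renderPath(String pathFormat, int year, int month, int day, int hour, int minute) {
        return pathFormat
                .replace("%Y", String.format("%04d", year))
                .replace("%m", String.format("%02d", month))
                .replace("%d", String.format("%02d", day))
                .replace("%H", String.format("%02d", hour))
                .replace("%M", String.format("%02d", minute));
    }

    public static void main(String[] args) {
        // Data shipped at 2024-09-18 08:05 is written under this subdirectory.
        System.out.println(renderPath("%Y/%m/%d/%H/%M", 2024, 9, 18, 8, 5)); // 2024/09/18/08/05
    }
}
```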

Update an OSS data shipping job

The following code provides an example on how to create a file named UpdateOssSinkDemo.java. The file is used to update an OSS data shipping job and restart the job. The new configurations take effect after the job is restarted.

package demo;

import com.aliyun.openservices.log.Client;
import com.aliyun.openservices.log.common.*;
import com.aliyun.openservices.log.exception.LogException;
import com.aliyun.openservices.log.request.*;
import java.util.ArrayList;

public class UpdateOssSinkDemo {
    // The Simple Log Service endpoint. In this example, the Simple Log Service endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. 
    private static final String endpoint = "https://cn-hangzhou.log.aliyuncs.com";
    // In this example, the AccessKey ID and AccessKey secret are obtained from environment variables. 
    private static final String accessKeyId = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID");
    private static final String accessKeySecret = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET");

    // The name of the project. Enter a name based on your business requirements. You must specify the name of an existing project. 
    private static final String project = "ali-test-project";
    // The name of the Logstore. Enter a name based on your business requirements. You must specify the name of a Logstore that is obtained from your project. 
    private static final String logStore = "ali-test-logstore";
    // The ARN of the RAM role. Enter an ARN based on your business requirements. You must specify an ARN that is obtained from the required RAM role in the RAM console. 
    private static final String roleArn = "acs:ram::111111";
    // The name of the OSS bucket. Enter a name based on your business requirements. You must specify the name of an existing OSS bucket. 
    private static final String bucket = "yourBucketName";
    // The name of the data shipping job that you want to update. 
    private static final String jobName = "ali-test-job-name";
    // The display name of the data shipping job. 
    private static final String displayName = "ali-test-job-displayname";
    // The description of the data shipping job. 
    private static final String description = "This is an OSS Shipper task.";
    // The directory to which you want to ship data in the OSS bucket. 
    private static final String preffix = "test";
    // The suffix of the OSS objects in which the shipped data is stored. 
    private static final String suffix = "11111";
    // The partition format that is used to generate subdirectories in the OSS bucket. A subdirectory is dynamically generated based on the shipping time. 
    private static final String pathFormat = "%Y/%m/%d/%H/%M";
    // The size of data in a shard. The value of this parameter is the size of the raw data that is shipped to OSS and is stored in an OSS object. Unit: MB. 
    private static final int bufferSize = 255;
    // The interval between two operations that ship the data of a shard. Unit: seconds. Valid values: 300 to 900. 
    private static final int bufferInterval = 300;
    // The storage format for data. After data is shipped to OSS, the data can be stored in different formats. Valid values: csv, json, parquet, and orc. 
    private static final String contentType = "csv";
    // The start time of the data shipping job. The value 1 specifies that the job starts to ship historical data that is generated at the earliest point in time. If you specify a point in time, the job starts to ship data at the specified point in time. 
    private static final int fromtime = 1;
    // The end time of the data shipping job. The value 0 specifies that the job constantly ships data as long as data is available. If you specify a point in time, the job stops shipping data at the specified point in time. 
    private static final int totime = 0;
    private static final ArrayList<String> columns = new ArrayList<>();

    private static void updateWithRestartOssSinkJob(Client client) throws LogException {
        Export export = new Export();
        export.setName(jobName);
        export.setDisplayName(displayName);
        export.setDescription(description);

        ExportConfiguration exportConfiguration = new ExportConfiguration();

        AliyunOSSSink ossSink = new AliyunOSSSink();
        ossSink.setRoleArn(roleArn);
        ossSink.setBucket(bucket);
        ossSink.setPrefix(preffix);
        ossSink.setSuffix(suffix);
        ossSink.setPathFormat(pathFormat);
        ossSink.setBufferSize(bufferSize);
        ossSink.setBufferInterval(bufferInterval);
        ossSink.setContentType(contentType);
        ExportContentCsvDetail csvDetail = new ExportContentCsvDetail();
        csvDetail.setNullIdentifier("");
        csvDetail.setStorageColumns(columns);
        ossSink.setContentDetail(csvDetail);

        exportConfiguration.setLogstore(logStore);
        exportConfiguration.setRoleArn(roleArn);
        exportConfiguration.setSink(ossSink);
        exportConfiguration.setFromTime(fromtime);
        exportConfiguration.setToTime(totime);
        exportConfiguration.setVersion("v2.0");
        export.setConfiguration(exportConfiguration);

        client.restartExport(new RestartExportRequest(project, export));
    }

    public static void main(String[] args) throws LogException {
        Client client = new Client(endpoint, accessKeyId, accessKeySecret);
        // Configure custom columns for data in the CSV format.
        columns.add("bucket");
        columns.add("__topic__");
        updateWithRestartOssSinkJob(client);
    }
}
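UpdateOssSinkDemo switches the job to the CSV format, where storageColumns selects which log fields become columns and nullIdentifier is written for columns that a log does not contain. The hypothetical sketch below mimics that layout for a single log; the real formatting is done server side by the shipping job, and quoting/escaping of field values is omitted here.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.StringJoiner;

public class CsvRowDemo {
    // Illustration only: lay out one log as a CSV row, emitting the configured
    // null identifier for storage columns that are absent from the log.
    static String toCsvRow(List<String> storageColumns, Map<String, String> log, String nullIdentifier) {
        StringJoiner row = new StringJoiner(",");
        for (String column : storageColumns) {
            row.add(log.getOrDefault(column, nullIdentifier));
        }
        return row.toString();
    }

    public static void main(String[] args) {
        Map<String, String> log = new LinkedHashMap<>();
        log.put("bucket", "my-bucket");
        // "__topic__" is missing from this log, so the null identifier ("" here) is written.
        System.out.println(toCsvRow(List.of("bucket", "__topic__"), log, "")); // my-bucket,
    }
}
```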

Create a MaxCompute data shipping job

The following code provides an example on how to create a file named CreateOdpsSinkDemo.java. The file is used to create a MaxCompute data shipping job.

package demo;

import com.aliyun.openservices.log.Client;
import com.aliyun.openservices.log.common.*;
import com.aliyun.openservices.log.exception.LogException;
import com.aliyun.openservices.log.request.CreateExportRequest;

public class CreateOdpsSinkDemo {
    // The Simple Log Service endpoint. In this example, the Simple Log Service endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. 
    private static final String endpoint = "https://cn-hangzhou.log.aliyuncs.com";
    // In this example, the AccessKey ID and AccessKey secret are obtained from environment variables. 
    private static final String accessKeyId = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID");
    private static final String accessKeySecret = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET");
    // The ARN of the RAM role. Enter an ARN based on your business requirements. You must specify an ARN that is obtained from the required RAM role in the RAM console. 
    private static final String roleArn = "acs:ram::111111";
    // The name of the project. Enter a name based on your business requirements. You must specify the name of an existing project. 
    private static final String projectName = "ali-test-project";
    // The name of the Logstore. Enter a name based on your business requirements. You must specify the name of an existing Logstore. 
    private static final String logstore = "ali-test-logstore";
    // The display name of the data shipping job. 
    private static final String displayName = "ali-test-displayname";
    // The name of the data shipping job. 
    private static final String jobName = "ali-test-yname";
    // The description of the data shipping job. 
    private static final String description = "This is a MaxCompute Shipper task.";
    // The start time of the data shipping job. The value 1 specifies that the job starts to ship historical data that is generated at the earliest point in time. If you specify a point in time, the job starts to ship data at the specified point in time. 
    private static final int fromtime = 1;
    // The end time of the data shipping job. The value 0 specifies that the job constantly ships data as long as data is available. If you specify a point in time, the job stops shipping data at the specified point in time. 
    private static final int totime = 0;
    // The type of the data shipping job. The value AliyunODPS specifies that the job is used to ship data to MaxCompute. 
    private static final String type = "AliyunODPS";
    // The name of the MaxCompute project. Enter a name based on your business requirements. You can specify a name that is obtained from the MaxCompute console. 
    private static final String odpsProject = "yourodpsProjectName";
    // The name of the MaxCompute table. Enter a name based on your business requirements. You can specify a name that is obtained from the MaxCompute console. 
    private static final String odpsTable = "yourodpsTableName";
    // In this example, the MaxCompute endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. You can also check the MaxCompute endpoint on the configuration page of MaxCompute data shipping jobs in the Simple Log Service console or in the help center of the MaxCompute official website. 
    private static final String odpsEndpoint = "http://service.cn-hangzhou.maxcompute.aliyun-inc.com/api";
    private static final String odpsTunnelEndpoint = "http://dt.cn-hangzhou.maxcompute.aliyun-inc.com";
    // Use time to format partitions. 
    private static final String partitionTimeFormat = "%Y_%m_%d_%H_%M";
    private static final String timeZone = "+0800";
    // Specify whether to filter invalid fields. 
    private static final Boolean filterInvalid = true;
    // The ARN of the RAM role. Enter an ARN based on your business requirements. You must specify an ARN that is obtained from the required RAM role in the RAM console. 
    private static final String odpsRolearn = "acs:ram::111111";
    // The log field in Simple Log Service. 
    private static final String[] fields = {"11111","22222","33333"};
    // The name of the partition column. 
    private static final String[] partitionColumn = {"__partition_time__"};
    // The shipping mode. Valid values: stream and interval. 
    // stream: reads data from the Logstore and ships the data to MaxCompute in real time. interval: reads data that is generated 5 to 10 minutes earlier than the current time from the Logstore and ships the data to MaxCompute in a batch. 
    private static final String mode = "stream";

    private static void createOdpsSinkJob(Client client) throws LogException {
        Export export = new Export();
        export.setDisplayName(displayName);
        export.setDescription(description);
        export.setName(jobName);
        ExportConfiguration configuration = new ExportConfiguration();
        configuration.setLogstore(logstore);
        configuration.setVersion("v2.0");
        configuration.setAccessKeyId(accessKeyId);
        configuration.setAccessKeySecret(accessKeySecret);
        configuration.setRoleArn(roleArn);
        configuration.setFromTime(fromtime);
        configuration.setToTime(totime);
        ExportGeneralSink sink = new ExportGeneralSink();
        sink.put("type", type);
        sink.put("odpsProject", odpsProject);
        sink.put("odpsTable", odpsTable);
        sink.put("odpsEndpoint", odpsEndpoint);
        sink.put("odpsTunnelEndpoint", odpsTunnelEndpoint);
        sink.put("partitionTimeFormat", partitionTimeFormat);
        sink.put("timeZone", timeZone);
        sink.put("filterInvalid", filterInvalid);
        sink.put("odpsRolearn", odpsRolearn);
        sink.put("fields", fields);
        sink.put("partitionColumn", partitionColumn);
        sink.put("mode", mode);
        configuration.setSink(sink);
        export.setConfiguration(configuration);
        JobSchedule jobSchedule = new JobSchedule();
        jobSchedule.setType(JobScheduleType.RESIDENT);
        export.setSchedule(jobSchedule);
        CreateExportRequest createExportRequest = new CreateExportRequest(projectName, export);
        client.createExport(createExportRequest);
    }

    public static void main(String[] args) throws LogException {
        Client client = new Client(endpoint, accessKeyId, accessKeySecret);
        createOdpsSinkJob(client);
    }
}
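The MaxCompute job above combines partitionTimeFormat "%Y_%m_%d_%H_%M" with timeZone "+0800" to derive the value of the __partition_time__ partition column from each log's timestamp. The hypothetical sketch below reproduces that derivation for a given epoch-second timestamp; the real value is computed by the shipping job.

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class PartitionTimeDemo {
    // Illustration only: derive the __partition_time__ value for a log,
    // given its timestamp (epoch seconds) and the job's timeZone offset.
    static String partitionValue(long epochSeconds, String timeZone) {
        ZoneOffset offset = ZoneOffset.of(timeZone);
        // "%Y_%m_%d_%H_%M" corresponds to the java.time pattern below.
        return Instant.ofEpochSecond(epochSeconds)
                .atOffset(offset)
                .format(DateTimeFormatter.ofPattern("yyyy_MM_dd_HH_mm"));
    }

    public static void main(String[] args) {
        // Epoch second 0 (1970-01-01T00:00:00Z) viewed in UTC+8.
        System.out.println(partitionValue(0L, "+0800")); // 1970_01_01_08_00
    }
}
```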

Update a MaxCompute data shipping job

The following code provides an example on how to create a file named UpdateOdpsSinkDemo.java. The file is used to update a MaxCompute data shipping job and restart the job. The new configurations take effect after the job is restarted.

package demo;

import com.aliyun.openservices.log.Client;
import com.aliyun.openservices.log.common.*;
import com.aliyun.openservices.log.exception.LogException;
import com.aliyun.openservices.log.request.RestartExportRequest;

public class UpdateOdpsSinkDemo {
    // The Simple Log Service endpoint. In this example, the Simple Log Service endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. 
    private static final String endpoint = "https://cn-hangzhou.log.aliyuncs.com";
    // In this example, the AccessKey ID and AccessKey secret are obtained from environment variables. 
    private static final String accessKeyId = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID");
    private static final String accessKeySecret = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET");
    // The ARN of the RAM role. Enter an ARN based on your business requirements. You must specify an ARN that is obtained from the required RAM role in the RAM console. 
    private static final String roleArn = "acs:ram::111111";
    // The name of the project. Enter a name based on your business requirements. You must specify the name of an existing project. 
    private static final String projectName = "ali-test-project";
    // The name of the Logstore. Enter a name based on your business requirements. You must specify the name of an existing Logstore. 
    private static final String logstore = "ali-test-logstore";
    // The display name of the data shipping job. 
    private static final String displayName = "ali-test-displayname";
    // The name of the data shipping job. 
    private static final String jobName = "ali-test-job-name";
    // The description of the data shipping job. 
    private static final String description = "This is a MaxCompute Shipper task.";
    // The start time of the data shipping job. The value 1 specifies that the job starts to ship historical data that is generated at the earliest point in time. If you specify a point in time, the job starts to ship data at the specified point in time. 
    private static final int fromtime = 1;
    // The end time of the data shipping job. The value 0 specifies that the job constantly ships data as long as data is available. If you specify a point in time, the job stops shipping data at the specified point in time. 
    private static final int totime = 0;
    // The type of the data shipping job. The value AliyunODPS specifies that the job is used to ship data to MaxCompute. 
    private static final String type = "AliyunODPS";
    // The name of the MaxCompute project. Enter a name based on your business requirements. You can specify a name that is obtained from the MaxCompute console. 
    private static final String odpsProject = "yourodpsProjectName";
    // The name of the MaxCompute table. Enter a name based on your business requirements. You can specify a name that is obtained from the MaxCompute console. 
    private static final String odpsTable = "yourodpsTableName";
    // In this example, the MaxCompute endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. You can also check the MaxCompute endpoint on the configuration page of MaxCompute data shipping jobs in the Simple Log Service console. 
    private static final String odpsEndpoint = "http://service.cn-hangzhou.maxcompute.aliyun-inc.com/api";
    private static final String odpsTunnelEndpoint = "http://dt.cn-hangzhou.maxcompute.aliyun-inc.com";
    // Use time to format partitions. 
    private static final String partitionTimeFormat = "%Y_%m_%d_%H_%M";
    private static final String timeZone = "+0800";
    // Specify whether to filter invalid fields. 
    private static final Boolean filterInvalid = true;
    // The ARN of the RAM role. Enter an ARN based on your business requirements. You must specify an ARN that is obtained from the required RAM role in the RAM console. 
    private static final String odpsRolearn = "acs:ram::11111";
    // The log field in Simple Log Service. 
    private static final String[] fields = {"11111","22222","33333"};
    // The name of the partition column. 
    private static final String[] partitionColumn = {"__partition_time__"};
    // The shipping mode. Valid values: stream and interval. 
    // stream: reads data from the Logstore and ships the data to MaxCompute in real time. interval: reads data that is generated 5 to 10 minutes earlier than the current time from the Logstore and ships the data to MaxCompute in a batch. 
    private static final String mode = "stream";

    private static void updateWithRestartOdpsSinkJob(Client client) throws LogException {
        Export export = new Export();
        export.setDisplayName(displayName);
        export.setDescription(description);
        export.setName(jobName);
        ExportConfiguration configuration = new ExportConfiguration();
        configuration.setLogstore(logstore);
        configuration.setVersion("v2.0");
        configuration.setAccessKeyId(accessKeyId);
        configuration.setAccessKeySecret(accessKeySecret);
        configuration.setRoleArn(roleArn);
        configuration.setFromTime(fromtime);
        configuration.setToTime(totime);
        ExportGeneralSink sink = new ExportGeneralSink();
        sink.put("type", type);
        sink.put("odpsProject", odpsProject);
        sink.put("odpsTable", odpsTable);
        sink.put("odpsEndpoint", odpsEndpoint);
        sink.put("odpsTunnelEndpoint", odpsTunnelEndpoint);
        sink.put("partitionTimeFormat", partitionTimeFormat);
        sink.put("timeZone", timeZone);
        sink.put("filterInvalid", filterInvalid);
        sink.put("odpsRolearn", odpsRolearn);
        sink.put("fields", fields);
        sink.put("partitionColumn", partitionColumn);
        sink.put("mode", mode);
        configuration.setSink(sink);
        export.setConfiguration(configuration);
        JobSchedule jobSchedule = new JobSchedule();
        jobSchedule.setType(JobScheduleType.RESIDENT);
        export.setSchedule(jobSchedule);
        RestartExportRequest restartExportRequest = new RestartExportRequest(projectName, export);
        client.restartExport(restartExportRequest);
    }

    public static void main(String[] args) throws LogException {
        Client client = new Client(endpoint, accessKeyId, accessKeySecret);
        updateWithRestartOdpsSinkJob(client);
    }
}

Delete an OSS or a MaxCompute data shipping job

The following code provides an example on how to create a file named DeleteSinkDemo.java. The file is used to delete an OSS or a MaxCompute data shipping job.

package demo;

import com.alibaba.fastjson.JSONObject;
import com.aliyun.openservices.log.Client;
import com.aliyun.openservices.log.exception.LogException;
import com.aliyun.openservices.log.request.*;
import com.aliyun.openservices.log.response.*;

public class DeleteSinkDemo {
    // The Simple Log Service endpoint. In this example, the Simple Log Service endpoint for the China (Hangzhou) region is used. Replace the parameter value with the actual endpoint. 
    private static final String endpoint = "http://cn-hangzhou.log.aliyuncs.com";
    // In this example, the AccessKey ID and AccessKey secret are obtained from environment variables. 
    private static final String accessKeyId = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID");
    private static final String accessKeySecret = System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET");

    // The name of the project. Enter a name based on your business requirements. You must specify the name of an existing project. 
    private static final String project = "ali-test-project";
    // The name of the data shipping job that you want to delete. 
    private static final String jobName = "ali-test-job-name";
    private static void deleteSinkExportJob(Client client) throws LogException {
        DeleteExportResponse deleteExportResponse = client.deleteExport(new DeleteExportRequest(project, jobName));
        System.out.println(JSONObject.toJSONString(deleteExportResponse));
    }

    public static void main(String[] args) throws LogException {
        Client client = new Client(endpoint, accessKeyId, accessKeySecret);
        deleteSinkExportJob(client);
    }
}

References

  • If the response that is returned by Simple Log Service contains error information after you call an API operation, the call fails. You can handle errors based on the error codes that are returned when API calls fail. For more information, see Error codes.
  • Alibaba Cloud OpenAPI Explorer provides debugging capabilities, SDKs, examples, and related documents. You can use OpenAPI Explorer to debug Simple Log Service API operations without the need to manually encapsulate or sign requests. For more information, visit OpenAPI Portal.
  • Simple Log Service provides a command-line interface (CLI) to meet the requirements for automated configurations. For more information, see Simple Log Service CLI.
  • For more information about sample code, see Alibaba Cloud Log Service SDK for Java on GitHub.
  • Simple Log Service provides a console in which you can perform operations. For example, you can create a data shipping job, and view and manage the created job. For more information, see Create a data shipping job, Manage data shipping jobs of the new version for OSS, and Manage data shipping jobs of the new version for MaxCompute.