
ApsaraMQ for Kafka:Create Simple Log Service sink connectors

Last Updated:Dec 04, 2024

This topic describes how to create a Simple Log Service sink connector to export data from a topic on an ApsaraMQ for Kafka instance to Simple Log Service.

Prerequisites

For information about the prerequisites, see Prerequisites.

Step 1: Create Simple Log Service resources

In this example, a project named guide-sls-sink-project and a Logstore named guide-sls-sink-logstore are created.
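If you prefer to script this step, the project and Logstore can also be created with the Simple Log Service SDK instead of the console. The following is a minimal sketch assuming the aliyun-log-python-sdk package; the endpoint, credentials, retention period, and shard count shown are illustrative placeholders, not values from this guide.

```python
def create_sls_resources(endpoint, access_key_id, access_key_secret,
                         project="guide-sls-sink-project",
                         logstore="guide-sls-sink-logstore"):
    """Create the Simple Log Service project and Logstore used by the
    sink connector. Requires the aliyun-log-python-sdk package."""
    # Imported inside the function so the sketch can be read and tested
    # without the SDK installed.
    from aliyun.log import LogClient

    client = LogClient(endpoint, access_key_id, access_key_secret)
    client.create_project(project, "Project for the SLS sink connector")
    # ttl: log retention in days; shard_count: number of Logstore shards.
    client.create_logstore(project, logstore, ttl=30, shard_count=2)

# Example call (placeholder region endpoint and credentials):
# create_sls_resources("cn-beijing.log.aliyuncs.com",
#                      "<ACCESS_KEY_ID>", "<ACCESS_KEY_SECRET>")
```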

Step 2: Create and start a Simple Log Service sink connector

  1. Log on to the ApsaraMQ for Kafka console. In the Resource Distribution section of the Overview page, select the region where the ApsaraMQ for Kafka instance that you want to manage resides.

  2. In the left-side navigation pane, choose Connector Ecosystem Integration > Tasks.

  3. On the Tasks page, click Create Task.

    • Task Creation

      1. In the Source step, set the Data Provider parameter to ApsaraMQ for Kafka and follow the on-screen instructions to configure other parameters. Then, click Next Step. The following table describes the parameters.

      2. Configure the following parameters:

        • Region: The region where the ApsaraMQ for Kafka instance resides. Example: China (Beijing).

        • ApsaraMQ for Kafka Instance: The instance in which the messages that you want to route are produced. Example: alikafka_post-cn-jte3****.

        • Topic: The topic on the instance in which the messages that you want to route are produced. Example: topic.

        • Group ID: The ID of the group on the instance that is used to consume the messages that you want to route. Example: GID_http_1.

          • Quickly Create: Create a new group.
          • Use Existing Group: Select an existing group.

        • Consumer Offset: The offset from which messages are consumed. Example: Latest Offset.

        • Network Configuration: The type of the network over which you want to route messages. Example: Basic Network.

        • VPC: The ID of the virtual private cloud (VPC) in which the instance is deployed. This parameter is required only if you set the Network Configuration parameter to Self-managed Internet. Example: vpc-bp17fapfdj0dwzjkd****.

        • vSwitch: The ID of the vSwitch with which the instance is associated. This parameter is required only if you set the Network Configuration parameter to Self-managed Internet. Example: vsw-bp1gbjhj53hdjdkg****.

        • Security Group: The security group to which the instance belongs. This parameter is required only if you set the Network Configuration parameter to Self-managed Internet. Example: alikafka_pre-cn-7mz2****.

        • Messages: The maximum number of messages that can be sent in each request. A request is sent only when the number of messages in the backlog reaches the specified value. Valid values: 1 to 10000. Example: 100.

        • Interval (Unit: Seconds): The time interval at which requests are sent. The system sends the aggregated messages to Simple Log Service at the specified interval. Valid values: 0 to 15. Unit: seconds. The value 0 specifies that messages are sent immediately after aggregation. Example: 3.

      3. In the Filtering step, define a data pattern in the Pattern Content code editor to filter events. For more information, see Event patterns.

      4. In the Transformation step, specify a data cleansing method to implement data processing capabilities such as splitting, mapping, enrichment, and dynamic routing. For more information, see Data cleansing.

      5. In the Sink step, set the Service Type parameter to Simple Log Service and follow the on-screen instructions to configure other parameters. Then, click Save. The following table describes the parameters.

      Table 1. Simple Log Service parameters

      • Project: The Simple Log Service project that you created. Example: guide-sls-sink-project.

      • Logstore: The Simple Log Service Logstore that you created. Example: guide-sls-sink-logstore.

      • Topic: The method that is used to generate a topic in Simple Log Service. Example: Not Specified.

      • Role: The role that can be assumed by EventBridge to write data to Simple Log Service. If no role is available, you can follow the on-screen instructions to create a role. Example: sls_eb.

  4. Go back to the Tasks page, find the Simple Log Service sink connector that you created, and then click Enable in the Actions column.

  5. In the Note message, click OK.

    The connector requires 30 to 60 seconds to be enabled. You can view the progress in the Status column on the Tasks page.
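The Filtering step in the task above matches each event against a JSON event pattern. As a rough local illustration of the exact-match rule (each pattern leaf lists acceptable values, and nested objects are matched recursively), here is a simplified sketch; it is not the service implementation, and the sample field names are hypothetical:

```python
def matches(pattern: dict, event: dict) -> bool:
    """Return True if `event` satisfies `pattern`.

    Leaf pattern values are lists of allowed literals; nested dicts are
    matched recursively. Event fields absent from the pattern are ignored.
    """
    for key, cond in pattern.items():
        if key not in event:
            return False
        value = event[key]
        if isinstance(cond, dict):
            if not (isinstance(value, dict) and matches(cond, value)):
                return False
        elif value not in cond:  # list of allowed literal values
            return False
    return True

# Hypothetical pattern and event for illustration only.
pattern = {"source": ["acs.kafka"], "data": {"topic": ["topic"]}}
event = {"source": "acs.kafka", "data": {"topic": "topic", "value": "hello"}}
print(matches(pattern, event))  # the event satisfies every pattern field
```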

Step 3: Test the Simple Log Service sink connector

  1. On the Tasks page, find the Simple Log Service sink connector that you created and click the source topic in the Event Source column.

  2. On the Topic Details page, click Send Message.
  3. In the Start to Send and Consume Message panel, configure the test message parameters and click OK.

  4. On the Tasks page, find the Simple Log Service sink connector that you created and click the destination project in the Event Target column.

  5. On the Logstores page, view the log content.

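The Messages and Interval parameters configured in Step 2 define a count-or-time batching rule. The following sketch shows that rule in isolation, with an injectable clock for determinism; it illustrates the documented semantics and is not the connector's actual code:

```python
import time

class Batcher:
    """Flush when the backlog reaches max_messages, or when `interval`
    seconds have elapsed since the last flush (an interval of 0 means
    every message is sent immediately after aggregation)."""

    def __init__(self, max_messages=100, interval=3, clock=time.monotonic):
        self.max_messages = max_messages
        self.interval = interval
        self.clock = clock
        self.buffer = []
        self.last_flush = clock()

    def add(self, message):
        """Buffer a message; return a batch if a flush was triggered."""
        self.buffer.append(message)
        due = self.clock() - self.last_flush >= self.interval
        if len(self.buffer) >= self.max_messages or due:
            return self.flush()
        return None

    def flush(self):
        batch, self.buffer = self.buffer, []
        self.last_flush = self.clock()
        return batch

# With a frozen clock, only the message count can trigger a flush:
# the third add() returns the aggregated batch.
b = Batcher(max_messages=3, interval=10, clock=lambda: 0.0)
print(b.add("m1"), b.add("m2"), b.add("m3"))
```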