
Lindorm: Import incremental data from SLS

Last Updated: Jul 29, 2025

This topic describes how to import incremental data from Simple Log Service (SLS) to a Lindorm wide table by using Lindorm Tunnel Service (LTS).

Usage notes

This feature was discontinued on June 16, 2023. It is unavailable for Lindorm Tunnel Service (LTS) instances purchased after that date. If your LTS instance was purchased before June 16, 2023, you can continue to use this feature.

Prerequisites

Supported destination table types

Only tables created by using Lindorm SQL are supported as destination tables.

Procedure

  1. In the LTS web UI, click Import Lindorm/HBase > SLS Real-time Data Replication.

  2. On the SLS Real-time Data Replication page, click Create Task.

  3. Specify a custom Channel Name, select the source and destination clusters, and then enter the tables to be synchronized or migrated.

  4. Click Create. After the channel is created, you can view its details.

Parameters

{
  "reader": {
    "columns": [
      "__client_ip__",
      "C_Source",
      "id",
      "name"
    ],
    "consumerSize": 2, // The number of consumers that subscribe to the Loghub data. Default value: 1.
    "logstore": "LTS-test"
  },
  "writer": {
    "columns": [
      {
        "name": "col1",
        "value": "{{ concat('xx', name) }}" // Expressions are supported.
      },
      {
        "name": "col2",
        "value": "__client_ip__" // Column mapping.
      },
      {
        "isPk": true, // Specifies whether this column is a primary key.
        "name": "id", // You do not need to specify the column family for a primary key column.
        "value": "id"
      }
    ],
    "tableName": "default.sls"
  }
}
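The column mapping above can be illustrated with a short Python sketch. This is not LTS code: LTS performs the mapping internally, and the sample record values below are assumptions made for illustration. Given an SLS record containing the fields listed under reader.columns, the writer emits one wide-table row per the writer.columns configuration.

```python
# Hypothetical sketch of how the channel configuration maps an SLS record
# to a Lindorm wide-table row. Function and variable names are illustrative.

def map_record(record: dict) -> dict:
    """Apply the writer column mapping from the example configuration."""
    return {
        "col1": "xx" + record["name"],    # {{ concat('xx', name) }}
        "col2": record["__client_ip__"],  # direct column mapping
        "id": record["id"],               # primary key (isPk: true)
    }

# A sample SLS record with the fields listed under reader.columns.
record = {
    "__client_ip__": "192.0.2.10",
    "C_Source": "app",
    "id": "1001",
    "name": "alice",
}

row = map_record(record)
print(row)  # {'col1': 'xxalice', 'col2': '192.0.2.10', 'id': '1001'}
```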
            

Simple Jtwig syntax is supported in the value field of a writer column. Example:

{
  "name": "hhh",
  "value": "{{ concat(title, id) }}"
}
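Under the same semantics, this mapping concatenates the title and id fields of each record into the destination column hhh. A minimal sketch (the sample field values are assumptions):

```python
# Illustrative sketch (not LTS code): the Jtwig expression
# {{ concat(title, id) }} joins two record fields into column "hhh".
def concat_mapping(record: dict) -> dict:
    return {"hhh": record["title"] + record["id"]}

print(concat_mapping({"title": "report-", "id": "42"}))  # {'hhh': 'report-42'}
```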