If you want to consume data in a data table in real time, you can call the CreateTunnel operation to create a tunnel for the table. You can create multiple tunnels for a data table. When you create a tunnel, you must specify the data table name, tunnel name, and tunnel type.
Prerequisites
A TunnelClient instance is initialized, as shown in the sketch after this list.
A data table is created. For more information, see Create a data table.
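The following is a minimal initialization sketch for a TunnelClient instance. The endpoint, instance name, and AccessKey pair are placeholder values that you must replace with your own; reading the AccessKey pair from environment variables is only one possible approach.
import com.alicloud.openservices.tablestore.TunnelClient;

public class TunnelClientInit {
    public static void main(String[] args) {
        // Placeholder values: replace with your Tablestore endpoint and instance name.
        String endpoint = "https://your-instance.your-region.ots.aliyuncs.com";
        String instanceName = "your-instance";
        // Read the AccessKey pair from environment variables (one possible approach).
        String accessKeyId = System.getenv("TABLESTORE_ACCESS_KEY_ID");
        String accessKeySecret = System.getenv("TABLESTORE_ACCESS_KEY_SECRET");

        // Initialize the TunnelClient that is used to create and manage tunnels.
        TunnelClient client = new TunnelClient(endpoint, accessKeyId, accessKeySecret, instanceName);

        // ... create and consume tunnels as shown in the Examples section ...

        // Shut down the client when it is no longer required.
        client.shutdown();
    }
}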
Parameters
Request parameters
Parameter | Description |
TableName | The name of the data table. |
TunnelName | The name of the tunnel. |
TunnelType | The type of the tunnel. Valid values: BaseData (full data), Stream (incremental data), and BaseAndStream (differential data, which includes full and incremental data). If you set this parameter to Stream or BaseAndStream, the system considers the data that is written to the data table after the tunnel is created to be incremental data. If you want to consume only the incremental data that is generated after a specific point in time, specify the start time for the incremental data when you create the tunnel. |
Response parameters
Parameter | Description |
TunnelId | The ID of the tunnel. |
ResponseInfo | Other fields returned in the response. |
RequestId | The ID of the request. |
Examples
Create a tunnel that is used to consume full data
The following sample code provides an example on how to create a tunnel of the BaseData type:
// You can create a tunnel of the TunnelType.BaseData, TunnelType.Stream, or TunnelType.BaseAndStream type.
// In this example, a tunnel of the BaseData type is created. To create tunnels of other types, set the TunnelType parameter in CreateTunnelRequest to the required types.
private static void createTunnel(TunnelClient client, String tableName, String tunnelName) {
    CreateTunnelRequest request = new CreateTunnelRequest(tableName, tunnelName, TunnelType.BaseData);
    CreateTunnelResponse resp = client.createTunnel(request);
    System.out.println("RequestId: " + resp.getRequestId());
    System.out.println("TunnelId: " + resp.getTunnelId());
}
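The following sketch shows one way to invoke the preceding helper method. The endpoint, instance name, table name, and tunnel name are placeholder values.
// Placeholder invocation: the client initialization and names are examples only.
TunnelClient client = new TunnelClient("https://your-instance.your-region.ots.aliyuncs.com",
        System.getenv("TABLESTORE_ACCESS_KEY_ID"), System.getenv("TABLESTORE_ACCESS_KEY_SECRET"), "your-instance");
createTunnel(client, "your_table", "your_base_data_tunnel");
client.shutdown();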
Create a tunnel that is used to consume incremental or differential data
The following sample code provides an example on how to create a tunnel of the Stream or BaseAndStream type and specify the time range of the incremental data that you want to consume:
// Create a tunnel that is used to consume incremental or differential data and specify the start timestamp and the end timestamp for the time range of the incremental data that you want to consume. The configurations specified by StreamTunnelConfig do not take effect for tunnels of the BaseData type.
private static void createStreamTunnelByOffset(TunnelClient client, String tableName, String tunnelName, long startTime, long endTime) {
    CreateTunnelRequest createTunnelRequest = new CreateTunnelRequest(tableName, tunnelName, TunnelType.Stream); // Create a tunnel of the Stream type.
    //CreateTunnelRequest createTunnelRequest = new CreateTunnelRequest(tableName, tunnelName, TunnelType.BaseAndStream); // Create a tunnel of the BaseAndStream type.
    StreamTunnelConfig streamTunnelConfig = new StreamTunnelConfig();
    /*
    Specify the start timestamp and the end timestamp to read incremental data. Unit: milliseconds. Valid values: [CurrentSystemTime - StreamExpiration + 5 minutes, CurrentSystemTime).
    CurrentSystemTime is the current system time in milliseconds. StreamExpiration is the validity period of incremental logs in milliseconds. The maximum validity period of incremental logs is seven days. You can specify the validity period when you enable Stream for the data table.
    The value of endTime must be greater than the value of startTime.
    */
    streamTunnelConfig.setStartOffset(startTime);
    streamTunnelConfig.setEndOffset(endTime);
    createTunnelRequest.setStreamTunnelConfig(streamTunnelConfig);
    CreateTunnelResponse resp = client.createTunnel(createTunnelRequest);
    System.out.println("RequestId: " + resp.getRequestId());
    System.out.println("TunnelId: " + resp.getTunnelId());
}
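For example, to consume the incremental data that was generated during the last 24 hours, you can derive the offsets from the current system time, as in the following sketch. The table name and tunnel name are placeholder values, and the sketch assumes that the Stream validity period of the data table is longer than one day.
// Placeholder example: consume the incremental data of the last 24 hours.
// Both offsets must fall within [CurrentSystemTime - StreamExpiration + 5 minutes, CurrentSystemTime),
// so the end offset is set slightly before the current time.
long endTime = System.currentTimeMillis() - 60 * 1000L;      // 1 minute before the current time.
long startTime = endTime - 24 * 60 * 60 * 1000L;             // 24 hours before the end offset.
createStreamTunnelByOffset(client, "your_table", "your_incr_tunnel", startTime, endTime);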
References
For information about the API operation that you can call to create a tunnel, see CreateTunnel.
If you want to quickly use Tunnel Service to consume data, see Getting started.
You can query information about all tunnels of a table. For more information, see Query information about all tunnels of a data table.
You can query information about a tunnel. For more information, see Query information about a tunnel.
You can delete a tunnel that you no longer require. For more information, see Delete a tunnel.
You can use Tunnel Service to migrate data. For more information, see Synchronize data from one table to another table in Tablestore.
Realtime Compute for Apache Flink can use the tunnels of Tunnel Service as the source of streaming data to compute and analyze Tablestore data. For more information, see Tutorial for the Wide Column model and Tutorial (TimeSeries model).
You are not charged for Tunnel Service. However, you are charged for the read throughput that is generated when you use tunnels to consume data. For more information, see Billing overview.