
Simple Log Service: Import data from a MySQL database to Simple Log Service

Last Updated: Nov 15, 2024

You can import data from a self-managed MySQL database or a MySQL database on an ApsaraDB RDS for MySQL instance to Simple Log Service. This way, you can query, analyze, and transform the data in Simple Log Service. This topic describes how to use the MySQL - Data Import feature to import data from a MySQL database to Simple Log Service.

Important

The MySQL - Data Import feature will no longer be updated or maintained. The timeline for ending maintenance and the related migration details will be announced at a later date. We recommend that you use DataWorks for data synchronization. For more information, see MySQL data source.

Limits

  • Data import from a MySQL database is available in the following regions: China (Hangzhou), China (Shanghai), China (Qingdao), China (Beijing), China (Zhangjiakou), China (Hohhot), China (Ulanqab), China (Chengdu), China (Shenzhen), China (Heyuan), China (Guangzhou), China (Hong Kong), Singapore, Malaysia (Kuala Lumpur), Indonesia (Jakarta), Philippines (Manila), Thailand (Bangkok), Japan (Tokyo), South Korea (Seoul), UK (London), Germany (Frankfurt), US (Virginia), and US (Silicon Valley).

  • If the data source is AnalyticDB for MySQL, you must convert JSON fields to the VARCHAR type before import. For example, if the detail field is of the JSON type, you can convert it by using cast(detail as VARCHAR) as detail. For more information about the CAST function, see CAST functions.
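
    The following statement is a minimal sketch of this conversion. It assumes a hypothetical orders table with a JSON column named detail; the table and column names are for illustration only.

      -- Convert the JSON column "detail" to VARCHAR so that it can be imported.
      -- The table name "orders" and the column "order_id" are assumptions for illustration.
      SELECT
        order_id,
        cast(detail AS VARCHAR) AS detail
      FROM orders;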

Prerequisites

  • A MySQL database is created. A database account is created and granted the minimum read permissions. For sample statements that create such an account, see the sketch after this list.

  • An IP address whitelist is configured for the MySQL database.

    You must configure an IP address whitelist based on the region of the project to which the data import job belongs and the network of the database. For more information about the IP addresses that you can add to a whitelist, see IP address whitelists. For more information about how to add a whitelist to a MySQL database, see Configure an IP address whitelist.

  • A project and a Logstore are created. For more information, see Create a project and Create a Logstore.
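
  The following statements are a minimal sketch of how you can create a read-only account for the import job, as referenced in the first prerequisite. The database name log_db, the account name sls_reader, and the password are assumptions for illustration only; adjust them to your environment.

    -- Create a dedicated account for the data import job (account name and password are examples).
    CREATE USER 'sls_reader'@'%' IDENTIFIED BY 'Example-password-1';
    -- Grant only the read permission on the source database.
    GRANT SELECT ON log_db.* TO 'sls_reader'@'%';
    FLUSH PRIVILEGES;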

Import data

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. On the Log Storage > Logstores tab, click the > icon to the left of the target Logstore, move the pointer over Data Collection > Data Import, and then click the + icon.

  4. In the Quick Data Import dialog box, search for and click MySQL - Data Import.

  5. In the Import Configuration step, create a data import configuration.

    1. On the Configure Data Source tab, configure the following parameters.

      Basic parameters

      • Job Name: the unique name of the job.

      • Display Name: the display name of the job.

      • Job Description: the description of the job.

      • Use VPC Endpoint: configure this parameter based on how the MySQL database is accessed.

        • To use a MySQL database on an ApsaraDB RDS for MySQL instance that belongs to a VPC, select Use VPC Endpoint and ApsaraDB RDS Instance, and configure the following parameters:

          • VPC-based Instance ID: the ID of the virtual private cloud (VPC) to which the ApsaraDB RDS for MySQL instance belongs.

          • RDS Instance ID: the ID of the ApsaraDB RDS for MySQL instance. Set this parameter to the value of VpcCloudInstanceId that is specified for the instance. You can call the DescribeDBInstanceAttribute operation to obtain the value of VpcCloudInstanceId.

        • To use a self-managed MySQL database on an ECS instance that belongs to a VPC, select Use VPC Endpoint and Self-managed Database on ECS, and configure the following parameters:

          • VPC-based Instance ID: the ID of the VPC to which the Elastic Compute Service (ECS) instance belongs.

          • ECS Instance IP Address: the private IP address of the ECS instance.

        • To use a MySQL database on an ApsaraDB RDS for MySQL instance that is accessible over the Internet, clear Use VPC Endpoint and configure the following parameter:

          • Public IP Address or Domain Name of Database: the public IP address or domain name of the MySQL database.

      • MySQL Port: the port that is used to connect to the MySQL database. Default value: 3306.

      • Username: the username of the account that is used to log on to the MySQL database. We recommend that you grant the account only the read permissions on the MySQL database.

      • Password: the password of the account that is used to log on to the MySQL database.

      • Database: the name of the MySQL database.

      • Query: the query statement. You can enter a SELECT statement or a SHOW statement. If you enter a SHOW statement, you cannot configure Page Size in the Advanced Settings section.

        You can use question marks (?) as placeholders in the query statement. If you use a question mark, you must configure CheckPoint Field. For example, if you enter SELECT * FROM employees WHERE id > ? in Query, you must set CheckPoint Field to id and Starting Value of CheckPoint Field to 0. Each time the data import job is run, the system saves the ID of the last imported record as a checkpoint. The next time the job is run, the system replaces the question mark (?) in the query statement with the checkpoint.
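
      The following sketch illustrates how the placeholder described above is replaced across runs. The employees table and the parameter values are taken from the preceding example; the exact statements that the system executes may differ.

        -- Query configured in the console: SELECT * FROM employees WHERE id > ?
        -- First run: the placeholder is replaced with Starting Value of CheckPoint Field (0).
        SELECT * FROM employees WHERE id > 0;
        -- Suppose the last imported row has id = 4096; the system saves 4096 as the checkpoint.
        -- Next run: the placeholder is replaced with the saved checkpoint.
        SELECT * FROM employees WHERE id > 4096;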

      Advanced parameters

      • Maximum Number of Rows for Each Import: the maximum number of rows that can be imported each time the scheduled MySQL data import job is run. Default value: 0, which indicates that the number of rows is unlimited.

      • Connection Timeout Settings: the timeout period of connections to the MySQL database when the job is run. Default value: 3000. Unit: milliseconds.

      • Query Timeout Settings: the timeout period of queries against the MySQL database when the job is run. Default value: 5000. Unit: milliseconds.

      • Page Size: the number of data records on each page. The value 0 indicates that no pagination is performed. If you set this parameter to a value greater than 0, the system appends limit pageSize offset {checkpoint} to the SELECT statement and imports data page by page when the job is run.

      • Time Field: the time field of the MySQL data, such as create_time or update_time. The value of this field is used as the log time when data is imported to Simple Log Service.

      • Time Field Format: the format of the time field. Java-based time formats are supported, such as yyyy-MM-dd HH:mm:ss. For more information, see Time formats.

      • Time Zone: the time zone of the time field.

      • CheckPoint Field: the checkpoint field. A checkpoint indicates the position from which the system starts to import data. For example, if you enter select * from t where id > ? in Query, you must set CheckPoint Field to id, and id must be an auto-increment field. You can also specify other fields, such as create_time and update_time, for CheckPoint Field.

        Note: If you do not specify a time range in the SQL statement, we recommend that you configure CheckPoint Field to import only incremental data and prevent repeated data collection.

      • Starting Value of CheckPoint Field: the initial value of the checkpoint field.

      • Additional Fields: the additional fields. Format: map[string]string. For example, if you enter select a, b from t; in Query, the system automatically imports the a and b fields. You can also import other fields to Simple Log Service. Example: {"k1": "v1", "k2": "v2"}.
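
      As a minimal sketch of how these parameters fit together, assume a hypothetical orders table that has an update_time column of the DATETIME type. You could set Query to the statement below, CheckPoint Field to update_time, Starting Value of CheckPoint Field to 1970-01-01 00:00:00, Time Field to update_time, and Time Field Format to yyyy-MM-dd HH:mm:ss. The table and column names are assumptions for illustration only.

        -- Import rows incrementally based on the update_time column.
        -- The ? placeholder is replaced with the last saved update_time checkpoint on each run.
        SELECT order_id, status, update_time FROM orders WHERE update_time > ?;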

    2. Click Preview to preview the import result. Confirm the result and click Next Configuration.

    3. On the Specify Scheduling Interval tab, configure the following parameters.

      Parameter

      Description

      Import Interval

      The interval at which data is imported from the MySQL database to Simple Log Service.

      Execute Now

      If you turn on Execute Now, the data import job is immediately run.

    4. After the configuration is complete, click Next.

  6. Create indexes and preview data. Then, click Next. By default, full-text indexing is enabled in Simple Log Service. You can also manually create field indexes for the collected logs or click Automatic Index Generation so that Simple Log Service generates field indexes for you. For more information, see Create indexes.

    Important

    If you want to query all fields in logs, we recommend that you use full-text indexes. If you want to query only specific fields, we recommend that you use field indexes. This helps reduce index traffic. If you want to analyze fields, you must create field indexes. You must include a SELECT statement in your query statement for analysis.

  7. Wait approximately 1 minute for the indexes to take effect. Then, you can view the collected logs on the Raw Logs tab. For more information about how to query and analyze logs, see Query and analyze logs. A sample query statement is provided after this step.
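
    The following query statement is a minimal sketch of a query and analysis over the imported logs. It assumes that the logs contain a status field with a field index; the field name is an assumption for illustration only. The search part before the vertical bar (|) filters logs, and the SELECT statement after it performs analysis.

      * | SELECT status, count(*) AS pv GROUP BY status ORDER BY pv DESC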

View the data import configuration and reports

After you create a data import configuration, you can view the configuration and related reports in the Simple Log Service console.

  1. In the Projects section, click the project to which the data import configuration belongs.

  2. Find and click the Logstore to which the data import configuration belongs, choose Data Import > Data Import, and then click the data import configuration.

  3. On the Import Configuration Overview page, view the basic information about the data import configuration and the related reports.

What to do next

On the Import Configuration Overview page, you can perform the following operations:

  • Modify the data import configuration

    To modify the data import configuration, click Modify Settings. For more information, see Import data.

  • Delete the data import configuration

    To delete the data import configuration, click Delete Configuration.

    Warning

    After a data import configuration is deleted, it cannot be restored. Proceed with caution.