
Guide to log query and analysis

Updated at: 2025-02-16 03:28

Simple Log Service provides query and analysis features. You can query billions to hundreds of billions of logs within seconds and use SQL syntax to perform statistical analysis on the query results. This topic describes how to enable the indexing feature and efficiently query and analyze logs in the Simple Log Service console, using NGINX logs as an example.

Prerequisites

A project and a Standard Logstore are created, and logs are collected. For more information, see Create a project, Create a Logstore, and Data collection overview.

Step 1: Create indexes

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. On the Log Storage > Logstores tab, click the logstore that you want to manage.

  4. On the query and analysis page of the Logstore, click Enable.

    Note

    You can query the latest data approximately 1 minute after you click Enable.


  5. After you click Enable, Full-text Index is automatically turned on. In the Search & Analysis panel, click Automatic Index Generation. Simple Log Service automatically generates field indexes based on the first log in the preview results of data collection.

    Note

    You can retain the default values for other parameters. For more information, see Create indexes.


    (Figure: the settings of the generated field indexes.)

Step 2: Query and analyze logs

On the query and analysis page of a Logstore, specify a search statement or an analytic statement and click Search & Analyze.

  • Search statement

    A search statement is used to query and filter data and supports only simple queries. You can specify conditions in a search statement to filter data, such as time ranges, request types, and keywords. A search statement can be executed independently. For more information, see Query syntax and functions.

    For example, you can execute the following search statement to query the logs whose request status code is 200:

    status: 200

    For more information, see Examples of search statements.

  • Analytic statement

    An analytic statement is used to filter, convert, calculate, and aggregate data. For example, you can use an analytic statement to calculate an average value within a specific period of time or compare data in different periods of time. An analytic statement must be executed together with a search statement in the Search statement|Analytic statement format. For more information, see SQL syntax and functions.

    For example, you can execute the following analytic statement to query all log records and analyze the number of requests in different states:

    * | SELECT status, count(*) AS PV GROUP BY status

    For more information about query and analysis examples, see SQL functions and SQL clauses.
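    As a combined illustration of the Search statement|Analytic statement format, the following sketch counts requests per minute for a single URI. The request_uri field name is an assumption based on typical NGINX access logs; adjust it to match your own field indexes:

    ```
    request_uri: /index.html | SELECT date_trunc('minute', __time__) AS t, count(*) AS PV GROUP BY t ORDER BY t
    ```

    The part before the vertical bar (|) filters the logs, and the part after it aggregates the filtered results.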

Note

By default, when you click a Logstore in the Logstores list, the system automatically performs a query on the query and analysis page. To disable this feature or specify a query time range, click the Settings icon in the upper-right corner of the query and analysis page. On the Query Settings tab, turn off Enable queries the first time you access a page or configure the Custom Query Time Range parameter.


Specify a time range

You can use one of the following methods to specify a time range for the data that you want to query or analyze. If you specify a time range in an analytic statement, that time range takes precedence for query and analysis.

  • In the upper part of the query and analysis page, select a time range from the drop-down list. Example: Last 15 Minutes.

  • In the analytic statement, use the __time__ field to specify a time range, which is a closed time interval. Example:

    * | SELECT * FROM log WHERE __time__ > 1731297600 AND __time__ < 1731310038
  • In the analytic statement, use the from_unixtime or to_unixtime function to convert the format of the specified time. Examples:

    • * | SELECT * FROM log WHERE from_unixtime(__time__) > from_unixtime(1731297600) AND from_unixtime(__time__) < now()
    • * | SELECT * FROM log WHERE __time__ > to_unixtime(date_parse('2024-10-19 15:46:05', '%Y-%m-%d %H:%i:%s')) AND __time__ < to_unixtime(now())
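The __time__ field stores seconds since the Unix epoch, so relative time ranges can also be computed arithmetically. For example, the following sketch restricts the analysis to the last hour by subtracting 3,600 seconds from the current time:

    * | SELECT count(*) AS PV WHERE __time__ > to_unixtime(now()) - 3600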
Note

By default, only 100 rows of data are returned after you execute a query statement. To increase the number of rows of data that are returned, you can use a LIMIT clause. For more information, see LIMIT clause.
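For example, the following statement raises the number of returned rows from the default 100 to 1,000:

    * | SELECT status, count(*) AS PV GROUP BY status LIMIT 1000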

Description of the query and analysis page

Page overview

Histogram


  • When you move the pointer over a green rectangle, you can view the period of time that is represented by the rectangle and the number of returned logs within the period of time.

  • If you double-click a green rectangle, you can view log distribution in a finer-grained manner. You can also view the returned logs within the specified period of time on the Raw Logs tab.

Raw Logs

  • Log details

    • Click Table or Raw Data to switch between the display formats of logs.

    • Download icon > Download Log: allows you to download logs to your computer. For more information, see Download logs.

    • JSON Configurations: allows you to specify the display type of JSON data and the level to which JSON data is expanded.

    • Event Settings: allows you to configure events for raw logs. For more information, see Event settings.

    • Copy icon: allows you to copy log content.

    • Labeling icon: allows you to label specific information or query error information in log content. The icon also provides access to the copilot.

    • Contextual Query icon: allows you to view the context of a specific log in the raw log file. Contextual query is available only for logs that are collected by Logtail. For more information, see Contextual query.

    • LiveTail: allows you to monitor log content in real time and extract key log information. LiveTail is available only for logs that are collected by Logtail. For more information, see LiveTail.

  • Displayed fields

    • Below Displayed Fields, move the pointer over a field and click the remove icon to remove the field from Displayed Fields. The field is then no longer displayed in the log content on the right side.

    • Favorites icon: allows you to add a view to your favorites. After fields are displayed in Section 5, you can add the current view to your favorites. Then, you can select the view from the drop-down list above Section 4.

    • Tag Settings: allows you to add fields as system tags.

    • Alias: after you turn on Alias, field names are replaced with their aliases. If a field does not have an alias, its name is retained. For more information about how to specify an alias for a field, see Create indexes.

  • Indexed fields

    • Below Indexed Fields, move the pointer over a field and click the add icon to add the field to Displayed Fields. The field is then displayed in the log content on the right side.

    • Statistics icon: allows you to view field details, including Basic Distribution and Statistical Metrics. For more information, see Field settings.

Graph

Simple Log Service renders the results of a query statement as charts and provides various chart types, such as tables, line charts, and column charts. For more information, see Overview of charts (Pro) and Chart overview. After you execute a query statement, you can view the query and analysis results on the Graph tab.

Description of other features on the Graph tab:

  • Add to New Dashboard: Simple Log Service provides dashboards on which you can analyze data in real time. You can click Add to New Dashboard to save the query and analysis results as a chart to a dashboard. For more information, see Overview of visualization.

  • Save as Scheduled SQL Job: Simple Log Service provides the Scheduled SQL feature. You can use the feature to automatically analyze data at a scheduled time and aggregate data for storage. You can also use the feature to project and filter data. For more information, see How Scheduled SQL works.

  • Interaction Occurrences: Interaction occurrences are important for data analysis. You can use interaction occurrences to switch between the levels of data dimensions and the analysis granularities to obtain more detailed information. For more information, see Configure an interaction occurrence for a dashboard to perform drill-down analysis.

LogReduce

On the LogReduce tab, you can click Enable LogReduce to cluster similar logs during log collection. For more information, see LogReduce.

SQL Enhancement

You can click the Dedicated SQL icon in the upper-right corner to enable Dedicated SQL. If you use the Standard SQL feature to analyze a large amount of data that is generated over a period of time, Simple Log Service cannot analyze all of the data in a single query request. You can enable the Dedicated SQL feature to increase the computing resources and the amount of data that can be analyzed in a single query request. For more information, see Enable Dedicated SQL.

Alerting

You can click the Save as Alert icon in the upper-right corner to configure alerts for the query and analysis results. For more information, see Configure an alert rule in Simple Log Service.

Saved search

You can click the Saved Search icon in the upper-right corner to save a query statement as a saved search. You can use a saved search to quickly perform query and analysis operations. For more information, see Saved search.

Sharing

You can click the share icon in the upper-right corner to copy the link to the current page and share the link with other users. For more information, see Embed console pages and share log data.

Data transformation

You can click Data Transformation in the upper part of the query and analysis page to go to the data transformation page. The data transformation feature allows you to standardize data, extract information, cleanse and filter data, and distribute data to multiple Logstores. For more information, see Create a data transformation job (new version).
