Simple Log Service offers query and analysis capabilities that support second-level queries of billions to hundreds of billions of logs and enable statistical analysis of query results using SQL. This topic demonstrates how to quickly set up indexes and perform basic query and analysis operations in the console, using Nginx logs as an example.
Prerequisites
A project and a standard Logstore are created in Simple Log Service, and logs are collected. For more information, see Create a project, Create a Logstore, and Collect logs.
Step 1: Configure indexes
- Log on to the Simple Log Service console.
- In the Projects section, click the project that you want to manage.
- On the Logstores tab, click the Logstore that you want to manage.
- On the Query And Analysis page of the Logstore, click Enable Index.
Note: The latest log data becomes available for query about 1 minute after the index is enabled.
- After you click Enable Index, full-text indexing is enabled by default. On the Query Analysis page, click Auto-generate Index. Simple Log Service then automatically creates field indexes based on the first log entry in the preview of collected data.
Note: Keep the other configuration items at their default settings. For more information, see Create Indexes.
The resulting field index configuration is as follows:
Step 2: Query and analyze logs
- By default, the query and analysis page automatically runs a query when it opens and displays the results. To disable this behavior or to set the query time, click the icon in the upper-right corner and adjust the settings on the Query Settings tab.
- Enter a query statement or an analytic statement in the search box at the top and click Query/analyze.
- Query statement
A query statement is used to search and filter data. You can specify conditions such as time ranges, request types, and keywords to narrow your search. A query statement can be used on its own. For more information, see Query Syntax and Features.
For example, to find logs with a status code of 200, use the following statement:
status: 200
For additional query examples, see Query Statement Examples.
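Query conditions can also be combined with operators such as and, or, and not. As a sketch, assuming the Nginx fields request_method and status are both indexed (the field names here are illustrative), the following statement finds GET requests that returned a 404 status code:

```
request_method: GET and status: 404
```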
- Analytic statement
An analytic statement is used to filter, transform, calculate, and aggregate data. For example, you can calculate an average value within a specific period or compare data across different periods. An analytic statement must follow a search statement in the format search statement|analytic statement. For syntax details, see SQL Functions.
For example, to count the occurrences of each request status in the logs, use the following statement:
* | SELECT status, count(*) AS PV GROUP BY status
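The same pattern extends to other aggregations. As a sketch, assuming the Nginx field remote_addr is indexed with statistics enabled (an assumption about your index configuration), the following statement returns the ten client IP addresses that sent the most requests:

```
* | SELECT remote_addr, count(*) AS PV GROUP BY remote_addr ORDER BY PV DESC LIMIT 10
```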
- Specify the time range for your query or analysis by using one of the following methods. If a time range is specified in the analytic statement, that range takes precedence for the query and analysis:
- Select a time range, such as Last 15 Minutes, from the drop-down list at the top of the page.
- In an analytic statement, use the __time__ field to specify a time range. For example:
* | SELECT * FROM log WHERE __time__ > 1731297600 AND __time__ < 1731310038
- When specifying a time in an analytic statement, convert the time format by using the from_unixtime or to_unixtime function. For example:
- * | SELECT * FROM log WHERE from_unixtime(__time__) > from_unixtime(1731297600) AND from_unixtime(__time__) < now()
- * | SELECT * FROM log WHERE __time__ > to_unixtime(date_parse('2024-10-19 15:46:05', '%Y-%m-%d %H:%i:%s')) AND __time__ < to_unixtime(now())
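The conversion functions can also be combined with aggregation in a single statement. As an illustrative sketch, the following statement counts the logs generated after a fixed point in time by converting a formatted time string to a UNIX timestamp:

```
* | SELECT count(*) AS PV WHERE __time__ > to_unixtime(date_parse('2024-10-19 00:00:00', '%Y-%m-%d %H:%i:%s'))
```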
- By default, a query statement returns only 100 rows of data. To change this limit, use a LIMIT clause. For more information, see LIMIT Clause.
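For example, the following statement raises the limit so that up to 1,000 rows are returned:

```
* | SELECT status, count(*) AS PV GROUP BY status LIMIT 1000
```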
Console description
Overview
Select a period and click Query/analyze to view log query and analysis results.
Histogram
- Hover over a green rectangle to view the time period that it represents and the number of logs returned within that period.
- Double-click a green rectangle to drill down into the log distribution at a finer time granularity. The query results for the specified time range are also displayed on the Raw Logs tab.
Raw logs
- Log details area
  - Switch between Table and Raw formats for log display.
  - Download: Choose the download range and tool. For more details, see Download Logs.
  - JSON Settings: Adjust the JSON display type and level.
  - Event Configuration: Set up event handling for raw log data. For more information, see Configure Events.
  - Copy: Copy log content.
  - SLS Copilot: Summarize information based on log content and identify error messages.
  - Contextual Query: View the context of a specific log in the original file. Contextual queries are supported only for log data collected by Logtail. For more information, see Contextual Query.
  - LiveTail: Monitor log content in real time and extract key information. LiveTail is available only for log data collected by Logtail. For more details, see LiveTail.
- Display fields area
  - Favorites: Add the current view to your favorites for easy access later.
  - Tag Settings: Designate a field as a system tag.
  - Alias: Enable this feature to replace field names with their aliases. Fields without aliases display their original names. For details about setting field aliases, see Create Indexes.
- Indexed fields area
  - In the Indexed Fields area, click the icon next to a field to add it to Display Fields and show it in the log information on the right. In the Display Fields area, click the icon next to a field to remove it from Display Fields and stop showing it in the log information.
  - Field settings: View the Basic Distribution and Statistics of a field. For more information, see Field Settings.
Chart
View visualized query and analysis results on the Chart tab after executing a query statement.
- View query and analysis results: Simple Log Service renders charts to visualize the data returned by analytic statements. A variety of chart types is available, including tables, line charts, and column charts, in both Pro and Standard versions. For more information, see Chart Overview.
- Add charts to a dashboard: A dashboard is a Simple Log Service panel for real-time data analysis. Click Add To Dashboard to save your query and analysis results as charts on a dashboard. For more information, see Visualization Overview.
- Configure interaction occurrences: Interaction occurrences let you switch between data dimension levels and analysis granularities for more detailed insights. For more details, see Add Interaction Occurrences to the Dashboard for Drill-down Analysis.
- Create scheduled SQL tasks: The Scheduled SQL feature periodically analyzes data, stores aggregated results, and applies projections and filters to data. For more information, see Scheduled SQL.
LogReduce
On the LogReduce tab, click Enable LogReduce to aggregate logs with high similarity during collection. For more details, see LogReduce.
SQL enhancement
Click the icon in the upper right corner to enable Dedicated SQL for a single session. This feature increases computing resources and the data volume that can be analyzed in a single query request. For default enablement settings, see Default Enablement of Dedicated SQL.
Alert
Click the icon in the upper right corner to set alerts for query and analysis results. For more information, see Quickly Set Log-based Alerts.
Saved search
Click the icon in the upper right corner to save a query and analysis statement as a saved search. For more details, see Saved Search.
Share
Click the icon in the upper right corner to copy the link to this page and share it with other users.
Scan
If you have not created indexes, or you want to query or analyze logs without using indexes, use the scan feature. For more information, see Scan Logs.