This topic describes how to query and analyze operation logs of Data Management (DMS) in the Simple Log Service console.
Background information
Operation logs record the details of all operations that are performed in DMS. An operation log includes information such as the operator, the feature that was used, the time when the operation was performed, the type of the operation, and the executed SQL statements. For more information, see the "Features" section of the Use the operation audit feature topic.
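After the logs are exported to Simple Log Service by using the procedure in this topic, these details can be retrieved with a query statement. The following statement is a minimal sketch that assumes the dmstest Logstore used in the examples in Step 3:
__topic__ : DMS_LOG_DELIVERY | SELECT operUserName, moudleName, subModule, gmtCreate, actionDesc FROM dmstest ORDER BY gmtCreate DESC LIMIT 20;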
Prerequisites
Simple Log Service is activated. For more information, see Resource management overview.
A Simple Log Service project and a Logstore are created. For more information, see the "Create a project" section of the Manage a project topic and the "Create a Logstore" section of the Manage a Logstore topic.
The Logstore is empty, and full-text indexing is enabled for the Logstore. For more information about how to enable full-text indexing, see Create indexes.
Billing method
You can export operation logs of DMS to Simple Log Service free of charge. However, Simple Log Service charges you for storage. For more information about the billing of Simple Log Service, see Billable items of pay-by-feature.
Procedure
Step 1: Register the Simple Log Service project with DMS
Log on to the DMS console V5.0 as a DMS administrator.
On the Home page of the DMS console, click the icon next to Database Instances in the left-side navigation pane.
Note: If you log on to the DMS console in simple mode, click Database Instances in the left-side navigation pane. In the instance list that appears, click the icon.
In the Add Instance dialog box, configure the following parameters to register the Simple Log Service project with DMS.
Section | Parameter | Description |
Data Source | N/A | The service provider of the data source. In this example, Alibaba Cloud is selected. |
Basic Information | Database Type | The type of the data source. In this example, SLS is selected. |
 | Instance Region | The region in which the Simple Log Service project resides. |
 | Connection Method | The method that you use to register the Simple Log Service project. Default value: Connection String Address. |
 | Connection String Address | The connection string address of the Simple Log Service project. DMS automatically generates a connection string address after you configure the Instance Region parameter. |
 | Project name | The name of the Simple Log Service project. |
 | AccessKey ID | The AccessKey ID of your Alibaba Cloud account. The AccessKey ID is used for identity authentication when you register the Simple Log Service project with DMS. Note: For more information about how to obtain the AccessKey ID, see Create an AccessKey pair. |
 | AccessKey Secret | The AccessKey secret that corresponds to the AccessKey ID. Note: For more information about how to obtain the AccessKey secret, see Create an AccessKey pair. |
 | Advanced Feature Pack | No feature packs can be added for the data source. By default, the data source supports the Flexible Management mode. |
Advanced Information | Environment Type | The type of the environment in which the Simple Log Service project is deployed. Valid values: Dev, Test, Product, Pre, SIT, UAT, Pet, and STAG. For more information, see the "Environment types" section of the Change the environment type of an instance topic. |
 | Instance Name | The name of the Simple Log Service project in DMS. Note: You can change the display name of the project when you manage the project in the DMS console. For more information, see Modify database instances. |
 | DBA | Select the database administrator (DBA) of the Simple Log Service project. In subsequent operations, you can request permissions from the DBA. |
 | Query Timeout(s) | Specify the timeout period for the execution of an SQL query statement. If the execution of an SQL query statement lasts longer than the specified timeout period, the execution of the statement is terminated to ensure database security. |
 | Export Timeout (s) | Specify the timeout period for the execution of an SQL export statement. If the execution of an SQL export statement lasts longer than the specified timeout period, the execution of the statement is terminated to ensure database security. |
Note: After you complete the configurations in the Basic Information section, click Test Connection in the lower-left corner of the dialog box. Wait until the connectivity test passes.
If the error message "The execution result of the 'getProject' command is null" appears, check whether the Simple Log Service project is created by the Alibaba Cloud account that you use to log on to DMS.
Click Submit.
Step 2: Create a task to export operation logs of DMS
Log on to the DMS console V5.0.
In the top navigation bar, choose Security and Specifications > Operation Audit.
Note: If you log on to the DMS console in simple mode, click the icon in the upper-left corner of the console and choose All Features > Security and Specifications > Operation Audit.
Click the Export logs tab. In the upper-right corner of the tab, click New Task.
In the Create Export Task dialog box, configure the parameters that are described in the following table.
Parameter | Required | Description |
Task Name | Yes | The name of the export task. Specify a descriptive name for easy identification. |
Destination Log Service | Yes | The Simple Log Service project that is used to manage resources. |
SLS Logstore | Yes | The Logstore to which you want to export operation logs of DMS. Select the destination Logstore from the drop-down list. Note: If the destination Logstore is not displayed in the drop-down list, click Sync Dictionary. In the message that appears, click OK. Then, DMS automatically collects the metadata of the Logstore. |
Function Module | Yes | The functional modules whose logs you want to export. You can view the modules on the Operation Logs tab. Valid values include Instance management, User Management, Permissions, and Cross-database Query Result Export. By default, All is selected. |
Scheduling Method | Yes | The scheduling method of the export task. Valid values: One-time Tasks: After you create an export task, the task exports the logs only once. Periodic Tasks: You can select Day, Week, or Month to export logs to the destination Logstore on a periodic basis. The first time a periodic task runs, all operation logs that are generated in DMS from the log start time to the scheduled start time are exported. Subsequent runs export only incremental logs. For more information, see the "Periodic scheduling" section of the Export operation logs of DMS to Simple Log Service topic. |
Log Time Range | No | The time range within which the operation logs to be exported are generated. If you do not configure this parameter, all logs generated in the last three years are exported. Note: This parameter is displayed only if you set the Scheduling Method parameter to One-time Tasks. |
Log Start Time | No | The start time of the logs to be exported. If you do not configure this parameter, the default start time is three years before the date on which the export task is created. Note: This parameter is displayed only if you set the Scheduling Method parameter to Periodic Tasks. Periodic tasks do not have an end time. |
Click OK. A log export task is created. The system also creates index fields such as dbId, dbName, and dbUser in your Logstore for querying and analyzing data.
If a one-time task is in the Successful state, the logs are exported.
Note: Indexes in the Logstore take effect only a short while after they are created. Therefore, a one-time task starts approximately 90 seconds after it is created.
A periodic task exports logs multiple times and is in the Pending Scheduling state before and after the logs are exported. You can view the task logs to determine whether a task is successful.
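To confirm that logs have arrived in the destination Logstore, you can also run a quick count query in the Simple Log Service console. The following statement is a minimal sketch that assumes the dmstest Logstore used in the examples in Step 3:
__topic__ : DMS_LOG_DELIVERY | SELECT COUNT(*) cnt FROM dmstest;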
You can perform the following operations in the Operation column of a task:
Query: After you click Query, you are navigated to the SQL Console page. On the page that appears, click Query. In the Execution History section in the lower part of the page, you can view the logs that are exported to the Logstore.
Task Logs: Click Task Logs to view the information about the task, such as the start time, end time, number of log entries to be exported, and task status.
Pause: Click Pause. In the dialog box that appears, click OK. Then, the periodic task is paused.
Restart: Click Restart. In the dialog box that appears, click OK to restart the paused task.
Note: You cannot perform the Restart operation on a one-time task. You can perform all supported operations, such as Query and Pause, on a periodic task.
For more information about how to create an export task, see Export operation logs of DMS to Simple Log Service.
Step 3: Query and analyze the exported operation logs of DMS in the Simple Log Service console
Log on to the Simple Log Service console.
In the Projects section, click the project that you want to manage.
In the left-side navigation pane, click Log Storage. In the Logstores list, click the Logstore that you want to manage.
Enter a query statement in the search box.
A query statement consists of a search statement and an analytic statement in the following format:
Search statement|Analytic statement
For more information, see Search syntax and Aggregate functions. For example, you can query and analyze the following information in the Simple Log Service console.
Note: In this example, the dmstest Logstore is used.
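The following minimal statements illustrate the difference between the two parts of a query statement. They assume the dmstest Logstore and the fields used in the examples below: the first statement is a pure search statement that returns raw SQL Console logs, and the second appends an analytic statement that counts log entries per functional module.
__topic__ : DMS_LOG_DELIVERY AND moudleName : SQL_CONSOLE
__topic__ : DMS_LOG_DELIVERY | SELECT moudleName, COUNT(*) cnt FROM dmstest GROUP BY moudleName ORDER BY cnt DESC;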
Execute the following SQL statement to query the users who failed to log on to a specific database the most times:
__topic__ : DMS_LOG_DELIVERY AND subModule : LOGIN | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest WHERE state = '0' GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;
Execute the following SQL statement to query the users whose source IP addresses are not allowed to access a specific instance. In this example, 127.0.0.1 is used.
Note: The source IP address of an instance is the IP address that you use when you register the instance with DMS. This IP address is used to identify the source of the instance.
__topic__ : DMS_LOG_DELIVERY | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest WHERE state = '0' and requestIp in ('127.0.0.1') GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;
Execute the following SQL statement to query the users who log on to the DMS console the most times:
__topic__ : DMS_LOG_DELIVERY| SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;
Execute the following SQL statement to query the users who access and manage multiple databases on the same day:
__topic__: DMS_LOG_DELIVERY | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, date_trunc('day', gmtCreate) time, dbId, COUNT(*) qpd from dmstest GROUP BY time, operUserId, operUserName, dbId ORDER BY time, qpd DESC;
Execute the following SQL statement to query the users who failed to perform operations on a database in DMS:
__topic__ : DMS_LOG_DELIVERY AND moudleName : SQL_CONSOLE | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, actionDesc as sqlStatement, subModule as sqlType, remark as failReason FROM dmstest WHERE state = '-1' order by id;
Execute the following SQL statement to query the users who download sensitive data the most times:
__topic__ : DMS_LOG_DELIVERY AND moudleName : DATA_EXPORT | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest WHERE hasSensitiveData = 'true' GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;
Execute the following SQL statement to query the SQL statements that are executed to delete and modify sensitive data in batches:
__topic__ : DMS_LOG_DELIVERY | SELECT subModule, COUNT(*) cnt, COUNT(affectRows) affectRow FROM dmstest WHERE subModule != '' GROUP BY subModule ORDER BY cnt DESC;
Execute the following SQL statement to query the users who enable or disable the data watermark feature during data export:
__topic__ : DMS_LOG_DELIVERY AND moudleName : DATA_EXPORT | SELECT targetId as orderId, concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest where actionDesc like '%enable data watermark: false' GROUP BY targetId, operUserId, operUserName ORDER BY cnt DESC LIMIT 10;
Note: To query the users who enable the data watermark feature, use '%enable data watermark: true' in the query statement. To query the users who disable the feature, use '%enable data watermark: false'.
Execute the following SQL statement to query the users who download the SQL result set in the Execution History section on the SQL Console page:
__topic__ : DMS_LOG_DELIVERY AND moudleName : SQL_CONSOLE_EXPORT | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;
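As an additional sketch that combines the fields used above, the following statement queries the databases with the most failed operations. It assumes that state = '-1' indicates a failed operation, as in the earlier example, and that the dbName index field created by the export task is available.
__topic__ : DMS_LOG_DELIVERY | SELECT dbName, COUNT(*) cnt FROM dmstest WHERE state = '-1' GROUP BY dbName ORDER BY cnt DESC LIMIT 10;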
For more information about the query statements, see the "Step 1: Enter a query statement" section of the Query and analyze logs topic.
Fields in raw logs of Simple Log Service
The following table describes important fields in a DMS operation log that is exported to Simple Log Service.
Field | Description |
id | The unique ID of the log. |
gmt_create | The time when the log was created. |
gmt_modified | The time when the log was modified. |
oper_user_id | The user ID of the operator. |
oper_user_name | The name of the operator. |
moudle_name | The name of the functional module from which the log is exported, such as SQL_CONSOLE, DATA_EXPORT, or SQL_CONSOLE_EXPORT. |
sub_module | The sub-functional module. For example, a sub-functional module on the SQL Console page refers to the type of an executed SQL statement. |
db_id | The ID of the database that you use. The ID is assigned in DMS. |
db_name | The name of the database that you use. |
is_logic_db | Specifies whether the database is a logical database. |
instance_id | The ID of the instance that you use. The ID is assigned in DMS. |
instance_name | The name of the instance that you use. |
action_desc | The description of the operation. |
remark | The remarks. |
has_sensitive_data | Specifies whether the log contains sensitive information. |
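As a hedged example of how these fields can be used together, the following statement counts log entries per functional module and per sensitive-data flag. It uses the camelCase field names that appear in the query examples in Step 3 and assumes the dmstest Logstore.
__topic__ : DMS_LOG_DELIVERY | SELECT moudleName, hasSensitiveData, COUNT(*) cnt FROM dmstest GROUP BY moudleName, hasSensitiveData ORDER BY cnt DESC;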