The threat analysis and response feature provides hot data and cold data storage solutions that allow you to store and query the logs of cloud services added to the feature. The solutions help you accurately identify alerts and trace attack sources, which improves the efficiency of responses to potential threats, simplifies log management across environments, and strengthens your overall defense system. The solutions are compliant with the Cybersecurity Law and Multi-Level Protection Scheme (MLPS) 2.0 standards. This topic describes how to use the hot data and cold data storage solutions.
To enhance the user experience of the log management feature of threat analysis and response, Alibaba Cloud Security Center plans to unpublish and end the public preview of the cold data storage feature. For more information, see [Notice] Public preview of the cold data storage feature of threat analysis and response ends and the feature is unpublished.
How the cold data storage solution works
The threat analysis and response feature, Object Storage Service (OSS), and Alibaba Cloud Data Lake Formation (DLF) jointly launched the cold data storage solution to provide log storage and analysis capabilities based on self-developed secure data lakes.
After you enable the cold data storage solution, the feature automatically creates the following buckets in OSS:
security-lake-<Alibaba Cloud account ID>-<Data lake name>: This bucket stores the log data delivered to the cold data storage solution. The system retains the logs for the specified storage duration and automatically deletes them when the duration ends.
security-lake-<Alibaba Cloud account ID>-<Data lake name>-query-result: This bucket stores log query results. The results are stored for a long period of time. You can delete historical results that you no longer require.
The buckets are created in the China (Shanghai) region.
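For example, assuming a hypothetical Alibaba Cloud account ID of 1234567890123456 and a data lake named default, the buckets would be named security-lake-1234567890123456-default and security-lake-1234567890123456-default-query-result.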
Billing
You are charged based on the size of log data stored in OSS buckets. The fees are included in the bills of OSS. For more information, see Billing overview.
When you query data, the capabilities of DLF are used, which may generate fees. The fees are included in the bills of DLF. For more information, see Billing.
Important: At present, all features of DLF are free of charge. You are not charged even if the number of metadata objects or the number of metadata requests exceeds one million.
The cold data storage solution is in public preview and is provided free of charge.
Multi-account management
If you configure the multi-account management feature and use the global administrator account to log on to the Security Center console, you must select the appropriate view before you can manage logs on the Log Management page. The following list describes the supported views:
Current Account View: You can view and manage hot and cold data within the current account.
Global Account View: You can view and manage hot and cold data within the Alibaba Cloud accounts that are managed by the threat analysis and response feature.
If you enable cold data delivery in either Current Account View or Global Account View, dedicated buckets are created in OSS for the global administrator account. OSS fees are included in the bills of the Alibaba Cloud account that is specified as the global administrator account, and the delivered log data is stored within the global administrator account.
Prerequisites
OSS is activated. For more information, see Step 1: Activate OSS.
Step 1: Enable log delivery
From September 12, 2024, Alibaba Cloud users can no longer enable the public preview of the cold data storage solution. Users who enabled the public preview before this date can perform the following steps to enable log delivery.
Log on to the Security Center console. In the top navigation bar, select China as the region of the assets that you want to manage.
In the left-side navigation pane, choose .
In the upper-right corner of the Service Integration page, click Log Storage Management.
In the Log Delivery Management section of the panel that appears, find a type of log that you want to deliver for cold storage, and turn on the switch in the Deliver Log to Cold Data/Enabled and Disabled At column.
You can select multiple log types, click Batch Deliver Log To, and select Cold Data.
If you do not want to store a specific type of log for a cloud service, you can turn off the switch for the log type. New logs of the log type are no longer delivered for cold storage.
Step 2: Query logs
In the left-side navigation pane, choose .
On the Cold Data tab, click Query. Enter the SQL statement that you want to execute and click Run.
The left-side section of the tab shows the log fields that you can use in queries.
You can use the following common statements:
Simple query
select u_name,file_path,activity_name,category_name,proc_path,intra_ip,container_host_name,asset_id,container_machine_ip,ecs_instance_id,parent_proc_path,cmd_line_format,container_file_path,k8s_name_space,asset_name,euid_name,main_user_id,inter_ip,parent_proc_start_time,cloud_code,parent_file_path,raw_data,file_uid,comm,k8s_cluster_id,container_image_id,pcomm,log_name,proc_id,index,file_uid_name,parent_file_name,srv_cmd_line,start_time,container_image_name,client_mode,cmd_chain_index,os_type,container_type,proc_name,parent_proc_id,docker_image_name,container_id,gid,perm,docker_container_id,proc_start_time,product_code,host_uuid,file_gid_name,sid,uid,k8s_node_name,file_gid,occur_time,vpc_instance_id,egroup_name,asset_type,sub_user_id,parent_cmd_line,docker_image_id,class_name,asset_list,k8s_node_id,euid,file_name,host_instance_id,end_time,k8s_pod_name,time_zone,egroup_id,log_time,cwd,gid_name,cmd_line,container_name,log_code,parent_proc_name,docker_file_path,tty,os_name,cmd_chain,scan_time,host_name from test****_17661858941*****.cloud_siem_aegis_proc where year = '2024' AND month = '03' AND day = '26' limit 200
Statistical analysis
select asset_id, `year`, `month`, `day`, `hour`, count(1) from test****_17661858941*****.cloud_siem_aegis_proc where year = '2024' AND month = '03' AND day = '26' group by asset_id, year, month, day, hour
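Filtered query
The following statement is a sketch that assumes the same sample table and the same year, month, and day partition filters as the preceding queries. The proc_name value is only a placeholder; replace the table name and filter values with your own.
select occur_time, asset_name, proc_name, proc_path, cmd_line from test****_17661858941*****.cloud_siem_aegis_proc where year = '2024' AND month = '03' AND day = '26' AND proc_name = 'sshd' limit 100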
Click Download below the query result to download it as a CSV file.
Each file can contain up to 10,000 records.
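To check whether a query result will be truncated by this limit, you can count the matching records first. The following statement is a sketch that reuses the sample table and partition filters from the preceding queries:
select count(1) from test****_17661858941*****.cloud_siem_aegis_proc where year = '2024' AND month = '03' AND day = '26'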
More operations
Change the log storage duration
By default, logs of cloud services that you deliver for cold storage are permanently stored. You can specify a log storage duration based on your business requirements.
After you specify a storage duration, the system immediately deletes logs that have been stored for longer than that duration. For example, if you set the duration to 180 days, logs older than 180 days are deleted right away. We recommend that you specify an appropriate storage duration.
In the left-side navigation pane, choose .
In the upper-right corner of the Service Integration page, click Log Storage Management.
In the Log Management panel, change the storage duration of cold data.
Manage the log storage capacity
You can delete objects from the OSS buckets to reduce the storage space occupied by cold data.
In the left-side navigation pane, choose .
In the upper-right corner of the Service Integration page, click Log Storage Management.
In the Log Management panel, click Go to the OSS Console to Clear Capacity.
In the OSS console, delete the logs that you no longer require.
For more information, see Delete objects.
References
Security Center plans to unpublish the cold data storage feature. If you need to store the logs of cloud services that are added to the threat analysis and response feature, we recommend that you use the log management feature of threat analysis and response. For more information, see Manage logs.
To view the logs delivered by the cold data storage solution, you can use the data exploration feature in the Data Lake Formation (DLF) console to access the data stored in Object Storage Service (OSS). For more information, see Overview.