You can import log data from Object Storage Service (OSS) buckets to Simple Log Service and perform operations on the data in Simple Log Service. For example, you can query, analyze, and transform the data. You can import only OSS objects that do not exceed 5 GB in size to Simple Log Service. If you want to import a compressed object, the size of the object after compression cannot exceed 5 GB.
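Because objects larger than 5 GB cannot be imported, you can screen a bucket listing before you create the import job. A minimal sketch; the sample keys and sizes are placeholders for a real listing (for example, one obtained through the OSS SDK):

```python
# Screen out OSS objects that exceed the 5 GB per-object import limit.
# The (key, size) pairs below are placeholders; in practice they would
# come from a bucket listing, e.g. via the oss2 SDK.

MAX_IMPORT_SIZE = 5 * 1024**3  # 5 GB, the per-object limit for import

def oversized_objects(objects):
    """Return the keys of objects whose size exceeds the import limit."""
    return [key for key, size in objects if size > MAX_IMPORT_SIZE]

sample = [
    ("logs/2024-05-01/app.log.gz", 120 * 1024**2),  # 120 MB, importable
    ("logs/2024-05-01/audit.log", 6 * 1024**3),     # 6 GB, too large
]
print(oversized_objects(sample))  # ['logs/2024-05-01/audit.log']
```

For compressed objects, compare the compressed size against the limit, because the 5 GB limit applies to the size after compression.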
Billing
You are not charged for the data import feature of Simple Log Service. However, the feature calls the OSS API, and you are charged for the OSS traffic and requests that the import generates. For more information about the pricing of the related billable items, see Pricing of OSS. The daily OSS fee that is generated when you import data from OSS consists of the traffic fee for the data that is read and the fee for the API requests that are issued.
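As a rough model of that composition, the daily fee can be estimated from the volume of data read and the number of requests. The function and the unit prices below are illustrative placeholders, not real OSS prices; look up the current prices for your region in Pricing of OSS:

```python
# Rough estimate of the daily OSS fee generated by a data import job.
# Unit prices are placeholders, not real OSS prices.

def estimate_daily_oss_fee(read_gb, api_requests,
                           traffic_price_per_gb, request_price_per_10k):
    """Fee = traffic read from OSS + API requests issued by the job."""
    traffic_fee = read_gb * traffic_price_per_gb
    request_fee = api_requests / 10_000 * request_price_per_10k
    return traffic_fee + request_fee

# Example: 100 GB read and 20,000 requests per day, placeholder prices.
print(estimate_daily_oss_fee(100, 20_000,
                             traffic_price_per_gb=0.50,
                             request_price_per_10k=0.01))  # 50.02
```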
Prerequisites
Log files are uploaded to an OSS bucket. For more information, see Upload objects.
A project and a Logstore are created. For more information, see Create a project and Create a Logstore.
Simple Log Service is authorized to assume the AliyunLogImportOSSRole role to access your OSS resources. You can complete authorization on the Cloud Resource Access Authorization page.
The Resource Access Management (RAM) user that you want to use is granted the oss:ListBuckets permission to access OSS buckets. For more information, see Attach a custom policy to a RAM user.
If you use a RAM user, you must grant the PassRole permission to the RAM user. The following example shows a policy that you can use to grant the permission. For more information, see Create custom policies and Grant permissions to a RAM user.
```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "ram:PassRole",
      "Resource": "acs:ram:*:*:role/aliyunlogimportossrole"
    },
    {
      "Effect": "Allow",
      "Action": "oss:GetBucketWebsite",
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "oss:ListBuckets",
      "Resource": "*"
    }
  ]
}
```
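If you manage RAM policies with scripts, you can build and validate the same policy document programmatically before you paste it into the RAM console or pass it to the RAM API. A sketch; the function name is ours:

```python
import json

# Build the PassRole policy document from this topic as a Python dict,
# so it can be serialized and sanity-checked before use.

def build_passrole_policy():
    return {
        "Version": "1",
        "Statement": [
            {"Effect": "Allow",
             "Action": "ram:PassRole",
             "Resource": "acs:ram:*:*:role/aliyunlogimportossrole"},
            {"Effect": "Allow",
             "Action": "oss:GetBucketWebsite",
             "Resource": "*"},
            {"Effect": "Allow",
             "Action": "oss:ListBuckets",
             "Resource": "*"},
        ],
    }

policy_json = json.dumps(build_passrole_policy(), indent=2)
actions = [s["Action"] for s in build_passrole_policy()["Statement"]]
print(actions)
```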
Create a data import configuration
If an OSS object is imported to Simple Log Service and new data is appended to the OSS object, all data of the OSS object is re-imported to Simple Log Service when a data import job for the OSS object is run.
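Because appended data causes the entire object to be re-imported and its earlier data to be duplicated in the Logstore, it is safer to write each batch of logs to a new object instead of appending. One way is to generate immutable, time-partitioned object keys; the key layout below is an assumption of ours, not a Simple Log Service requirement:

```python
from datetime import datetime, timezone
from uuid import uuid4

# Write each batch of logs to a new, immutable object so that no object
# is ever appended to after it is imported. The "logs" prefix and the
# key layout are hypothetical.

def new_object_key(prefix="logs", now=None, suffix=None):
    now = now or datetime.now(timezone.utc)
    suffix = suffix or uuid4().hex[:8]
    return f"{prefix}/{now:%Y/%m/%d/%H}/batch-{suffix}.log"

print(new_object_key(now=datetime(2024, 5, 1, 9, tzinfo=timezone.utc),
                     suffix="a1b2c3d4"))
# logs/2024/05/01/09/batch-a1b2c3d4.log
```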
Log on to the Simple Log Service console.
In the Import Data section, click the Data Import tab. Then, click OSS - Data Import.
Select the project and Logstore. Then, click Next.
In the Import Configuration step, configure the parameters of the data import configuration.
Click Preview to preview the import result.
After you confirm the result, click Next.
Create indexes and preview data. Then, click Next. By default, full-text indexing is enabled in Simple Log Service. You can also manually create field indexes for the collected logs or click Automatic Index Generation. Then, Simple Log Service generates field indexes. For more information, see Create indexes.
Important: If you want to query all fields in logs, we recommend that you use full-text indexes. If you want to query only specific fields, we recommend that you use field indexes, which helps reduce index traffic. If you want to analyze fields, you must create field indexes and include a SELECT statement in your query statement.
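For example, assuming a field index exists on a hypothetical `status` field, an analytic query must contain a SELECT statement:

```sql
-- Count log entries per status value; `status` is a hypothetical
-- indexed field from your own logs.
* | SELECT status, count(*) AS cnt GROUP BY status ORDER BY cnt DESC
```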
Click Query Log. On the query and analysis page that appears, check whether OSS data is imported.
Wait for approximately 1 minute. If the required OSS data exists, the data is imported.
What to do next
After you create a data import configuration, you can view the configuration details and related statistical reports in the Simple Log Service console.
In the Projects section, click the project to which the data import configuration belongs.
Click the Logstore to which the data import configuration belongs, and then click the name of the data import configuration.
View the data import job
On the Import Configuration Overview page, view the basic information and statistical reports of the data import configuration.
Modify the data import job
To modify the data import configuration, click Edit Configurations. For more information, see Create a data import configuration.
Delete the data import job
To delete the data import configuration, click Delete Configuration.
Warning: After the data import configuration is deleted, it cannot be restored.
Stop the data import job
To stop the data import job, click Stop.
FAQ
| Problem description | Possible cause | Solution |
| --- | --- | --- |
| No data is displayed during preview. | The OSS bucket contains no objects, the objects contain no data, or no objects meet the filter conditions. | Check whether the OSS bucket contains objects that contain data and meet the filter conditions. Then, retry the preview. |
| Garbled characters exist. | The data format, compression format, or encoding format is not configured as expected. | Check the actual format of the OSS objects and modify Data Format, Compression Format, or Encoding Format. To handle the existing garbled characters, create a Logstore and a data import configuration. |
| The log time displayed in Simple Log Service is different from the actual log time. | No time field is specified in the data import configuration, or the specified time format or time zone is invalid. | Specify a time field or specify a valid time format and time zone. For more information, see Create a data import configuration. |
| After data is imported, the data cannot be queried or analyzed. | The data is not within the query time range, or no indexes are created for the Logstore. | Check whether the log time of the imported data is within the specified query time range and whether indexes are created. For more information, see Create indexes. |
| The number of imported data entries is less than expected. | Some OSS objects contain lines that are larger than 3 MB in size. In this case, the lines are discarded during the import. For more information, see Limits on collection. | When you write data to an OSS object, make sure that the size of a line does not exceed 3 MB. |
| The number of OSS objects and the total volume of data are large, but the import speed does not meet expectations. In most cases, the import speed can reach 80 MB/s. | The number of shards in the Logstore is excessively small. For more information, see Limits on performance. | If the number of shards in a Logstore is small, increase the number of shards to 10 or more and check the latency. For more information, see Manage shards. |
| OSS buckets cannot be selected during the creation of a data import configuration. | Simple Log Service is not authorized to assume the AliyunLogImportOSSRole role. | Complete authorization based on the descriptions in the "Prerequisites" section of this topic. |
| Some OSS objects failed to be imported to Simple Log Service. | The settings of the filter conditions are invalid, or the size of an object exceeds 5 GB. For more information, see Limits on collection. | Modify the filter conditions, or make sure that the size of each object does not exceed 5 GB before the import. |
| Archive objects are not imported to Simple Log Service. | Import Archive Files is turned off. For more information, see Limits on collection. | Turn on Import Archive Files in the data import configuration. |
| An error occurred in parsing an OSS object that is in the Multi-line Text Logs format. | The regular expression that is specified to match the first line or the last line in a log is invalid. | Check whether the regular expression that is specified to match the first line or the last line in a log is valid. |
| The latency to import new OSS objects is higher than expected. | The number of existing OSS objects that meet the conditions specified by File Path Prefix Filter exceeds the upper limit, and OSS Metadata Indexing is turned off in the data import configuration. | If the number of existing OSS objects that meet the conditions specified by File Path Prefix Filter exceeds one million, turn on OSS Metadata Indexing in the data import configuration. Otherwise, new files are discovered at low efficiency. |
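As noted in the FAQ, lines larger than 3 MB are discarded during import. A minimal pre-upload check, written by us for illustration:

```python
# Flag log lines that exceed the 3 MB per-line limit before the file is
# uploaded to OSS, so that no lines are silently discarded during import.

MAX_LINE_BYTES = 3 * 1024 * 1024  # 3 MB per-line limit

def oversized_lines(lines):
    """Return the 1-based numbers of lines that exceed the limit."""
    return [i for i, line in enumerate(lines, start=1)
            if len(line.encode("utf-8")) > MAX_LINE_BYTES]

sample = ["short line\n", "x" * (MAX_LINE_BYTES + 1) + "\n"]
print(oversized_lines(sample))  # [2]
```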
Error handling
| Error | Description |
| --- | --- |
| File read failure | If an OSS object fails to be completely read because of a network exception or because the object is damaged, the data import job automatically retries the read. If the object still fails to be read after three retries, the object is skipped. The retry interval is the same as the value of New File Check Cycle. If New File Check Cycle is set to Never Check, the retry interval is 5 minutes. |
| Compression format parsing error | If an OSS object is in an invalid compression format, the data import job skips the object during decompression. |
| Data format parsing error | If the content of an OSS object cannot be parsed in the specified data format, the data import job skips the data that fails to be parsed. |
| OSS bucket absence | If the OSS bucket does not exist, the data import job periodically retries. After the OSS bucket is re-created, the data import job automatically resumes the import. |
| Permission error | If a permission error occurs when data is read from the OSS bucket or written to the Simple Log Service Logstore, the data import job periodically retries and does not skip any OSS objects. After the error is fixed, the data import job automatically imports data from the unprocessed objects in the OSS bucket to the Logstore. |
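The file read failure policy above (up to three retries, then skip the object) can be sketched as follows; the reader callback and the return convention are ours for illustration:

```python
# Sketch of the read-failure policy described above: retry a failed read
# up to three times, then skip the object. The reader function and the
# (data, skipped) return convention are illustrative.

DEFAULT_RETRY_INTERVAL_MIN = 5  # used when New File Check Cycle is Never Check

def read_with_retries(read_object, key, max_retries=3):
    """Return (data, skipped): retries up to max_retries times, then skips."""
    for attempt in range(1 + max_retries):  # first attempt + retries
        try:
            return read_object(key), False
        except IOError:
            continue  # in the real job, each retry waits one interval
    return None, True  # skipped after exhausting the retries

# Hypothetical reader that always fails, to show the skip behavior.
def always_fails(key):
    raise IOError(f"cannot read {key}")

print(read_with_retries(always_fails, "logs/app.log"))  # (None, True)
```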