Serverless App Engine (SAE) allows you to collect standard output (stdout) of application instances and logs from the specified directories of application instances to ApsaraMQ for Kafka. Then, you can deliver data from ApsaraMQ for Kafka to other persistent databases, such as Elasticsearch databases, based on your business requirements. This way, you can manage and analyze logs in a centralized manner. This topic describes how to collect logs to ApsaraMQ for Kafka in the SAE console.
Prerequisites
SAE
At least 25% of CPU and 250 MB of memory are reserved for each instance of your application.
Kafka
ApsaraMQ for Kafka is activated and a topic is created. SAE supports instances of ApsaraMQ for Kafka 2.X or later.
If your ApsaraMQ for Kafka instance is created in a virtual private cloud (VPC) and cannot access the Internet, the ApsaraMQ for Kafka instance and the SAE instance must reside in the same VPC.
An IP address whitelist of the ApsaraMQ for Kafka instance is configured.
If the ApsaraMQ for Kafka instance and the SAE instance reside in the same VPC but are associated with different vSwitches, you must add the vSwitch CIDR block of the SAE instance to the IP address whitelist of the ApsaraMQ for Kafka instance on the Instances page in the ApsaraMQ for Kafka console.
If you want to enable access over a VPC, use 0.0.0.0/0.
Background information
If you cannot collect logs to Log Service projects or RAM users cannot view logs in Log Service, you can collect logs from SAE to ApsaraMQ for Kafka.
For information about the usage limits of ApsaraMQ for Kafka, see Limits.
You are charged when you use ApsaraMQ for Kafka. For information about the pricing of ApsaraMQ for Kafka, see Billing overview.
Configure file log collection
Configure log collection when you create an application
Log on to the SAE console.
In the left-side navigation pane, click Applications. In the top navigation bar, select a region. Then, click Create Application.
In the Basic Information step, configure the parameters and click Next: Application Deployment Configurations.
In the Deployment Configurations step, configure the Technology Stack Programming Language and Application Deployment Method parameters and the corresponding settings.
In the Log Collection Service section, click the Log Collection to Kafka tab and turn on Enable Log Collection to Kafka.
Kafka Instance: Select the ApsaraMQ for Kafka instance to which you want to collect logs from SAE.
Log Type: Select a log type. Valid values:
File Logs (Log Path in Container): the file logs. This is the default value. You can add multiple collection rules to collect file logs from different log sources.
Container Standard Output Logs: the container stdout logs. You can configure only one collection rule to collect stdout logs. This log type is available in the Log Type drop-down list only when your vSwitch resides in one of the recommended zones. For more information, see Change the security group and vSwitch of an application.
Log Source: Enter the path of the log file. The path must include the file name. Example: /tmp0/cjsc.log. You can specify a regular expression to match the file name and path. If a directory contains a large number of log files in the same format, you can specify the log source in the /xxx/xxx/xxx/*.log format. If you set Log Type to Container Standard Output Logs, you do not need to configure this parameter.
Important: Do not store important files of other types in the log source directory. Otherwise, the files are overwritten by log files.
Kafka Topic Name: Select the ApsaraMQ for Kafka topic to which you want to deliver the logs.
Click Next: Confirm Specifications.
In the Specification Confirmation step, view the details of the application and the fee for the selected specifications. Then, click Confirm.
The Creation Completed step appears. You can click Application Details to go to the Basic Information page of the application.
Check the result.
After you deploy the application, SAE collects logs from the specified log source and delivers them to the specified ApsaraMQ for Kafka topic based on the configured log collection rule.
In the left-side navigation pane of the application details page, go to the Persistent Logs page and view the information about the collected logs. If logs exist, the log collection rule takes effect, and you can perform business analysis based on the logs.
Configure log collection when you deploy applications
After you redeploy an application, the application is restarted. To prevent unpredictable errors such as business interruptions, we recommend that you deploy applications during off-peak hours.
The procedure that can be performed to update an application varies based on the number of instances in the application. This section provides an example on how to configure the required features for an application in which the number of instances is greater than or equal to 1. For information about how to update an application in which the number of instances is 0, see Update an application.
Log on to the SAE console.
In the left-side navigation pane, click Applications. In the top navigation bar, select a region. Then, click the name of an application.
In the upper-right corner of the Basic Information page, click Deploy Application.
In the Log Collection Service section of the Deploy Configurations page, click the Log Collection to Kafka tab and turn on Enable Log Collection to Kafka.
Kafka Instance: Select the ApsaraMQ for Kafka instance to which you want to collect logs from SAE.
Log Type: Select a log type. Valid values:
File Logs (Log Path in Container): the file logs. This is the default value. You can add multiple collection rules to collect file logs from different log sources.
Container Standard Output Logs: the container stdout logs. You can configure only one collection rule to collect stdout logs. This log type is available in the Log Type drop-down list only when your vSwitch resides in one of the recommended zones. For more information, see Change the security group and vSwitch of an application.
Log Source: Enter the path of the log file. The path must include the file name. Example: /tmp0/cjsc.log. You can specify a regular expression to match the file name and path. If a directory contains a large number of log files in the same format, you can specify the log source in the /xxx/xxx/xxx/*.log format. If you set Log Type to Container Standard Output Logs, you do not need to configure this parameter.
Important: Do not store important files of other types in the log source directory. Otherwise, the files are overwritten by log files.
Kafka Topic Name: Select the ApsaraMQ for Kafka topic to which you want to deliver the logs.
After you configure the settings, click Confirm.
Important: You cannot collect a log file to two ApsaraMQ for Kafka topics at the same time. If you deploy your application in phased release mode or canary release mode and retain the configured log source but change the topic for log collection, your logs are still collected to the original topic until all configurations of your application are deployed.
Check the result.
After you deploy the application, SAE collects logs from the specified log source and delivers them to the specified ApsaraMQ for Kafka topic based on the configured log collection rule.
In the left-side navigation pane of the application details page, go to the Persistent Logs page and view the information about the collected logs. If logs exist, the log collection rule takes effect, and you can perform business analysis based on the logs.
Formats
After you enable Log Collection to Kafka, collected logs are in the following format:
{
"file":"/home/admin/apache-tomcat-8.5.42/logs/localhost.2022-03-01.log",
"host":"test-kafka-9527eec8-b2c1-4f03-9178-5dac0fe16d07-*****",
"message":"01-Mar-2022 15:09:36.016 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log No Spring WebApplicationInitializer types detected on classpath",
"topic":"test2"
}
where:
file: the path of the collected log file.
host: the name of the instance from which the logs are collected.
message: the content of the collected log.
topic: the ApsaraMQ for Kafka topic to which the collected logs are delivered.
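For reference, a downstream consumer can deserialize each record with an ordinary JSON parser. The following Python sketch parses the sample record shown above; the Kafka consumer itself is omitted, and only the payload handling is shown:

```python
import json

# One collected record, as delivered to the Kafka topic (format shown above).
raw = (
    b'{"file":"/home/admin/apache-tomcat-8.5.42/logs/localhost.2022-03-01.log",'
    b'"host":"test-kafka-9527eec8-b2c1-4f03-9178-5dac0fe16d07-*****",'
    b'"message":"01-Mar-2022 15:09:36.016 INFO [localhost-startStop-1] '
    b'org.apache.catalina.core.ApplicationContext.log No Spring '
    b'WebApplicationInitializer types detected on classpath",'
    b'"topic":"test2"}'
)

def parse_sae_log(value: bytes) -> dict:
    """Decode one collected log record into its file, host, message, and topic fields."""
    record = json.loads(value)
    return {key: record[key] for key in ("file", "host", "message", "topic")}

parsed = parse_sae_log(raw)
print(parsed["topic"])  # -> test2
```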
Collect multi-line logs
In Java applications, a log that reports an error or exception spans multiple lines instead of being merged into one line. The following example shows a Java exception log:
java.lang.RuntimeException: testLog
    at cn.niutong.controller.TestController.heathc(TestController.java:141)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at ...
Each time a line break (\n) is detected, the collector writes the subsequent log data as a new record. As a result, a multi-line log such as the exception above is split into multiple records. We recommend that you package each log entry into a JSON string in your business program and export the string as a single line.
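As a sketch of that recommendation, serializing the record with a JSON encoder escapes the embedded line breaks, so the entire stack trace travels as one physical line. The example below uses Python for brevity; in a Java application the same idea applies through a JSON log layout:

```python
import json
import traceback

def exception_as_json_line(exc: BaseException) -> str:
    """Serialize an exception, including its multi-line stack trace, as one JSON line.
    json.dumps escapes embedded newlines as \\n, so the collector sees a single line."""
    payload = {
        "level": "ERROR",
        "message": str(exc),
        "stack": "".join(
            traceback.format_exception(type(exc), exc, exc.__traceback__)
        ),
    }
    return json.dumps(payload)

try:
    raise RuntimeError("testLog")
except RuntimeError as e:
    line = exception_as_json_line(e)
    print("\n" in line)  # -> False: the whole record is one physical line
```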
If you have additional requirements, such as merging multiple lines of logs, join the DingTalk group 32874633 for technical support.
FAQ
Are wildcard characters supported after I enable Log Collection to Kafka?
Yes, wildcard characters are supported after you enable Log Collection to Kafka. For example, you can use asterisks (*) as wildcard characters to specify all files in a specific folder, such as /tmp/logs/*.log.
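To sanity-check which file names such a pattern matches, you can use shell-style matching, for example with Python's fnmatch module. This is only an approximation; SAE's own matcher may differ in edge cases:

```python
import fnmatch

# The wildcard pattern from the example above.
pattern = "/tmp/logs/*.log"

# Files that would and would not be picked up by the rule.
print(fnmatch.fnmatch("/tmp/logs/app.log", pattern))          # -> True
print(fnmatch.fnmatch("/tmp/logs/access.2022.log", pattern))  # -> True
print(fnmatch.fnmatch("/tmp/logs/app.txt", pattern))          # -> False
```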
Why do logs fail to be collected?
Logs may fail to be collected due to network exceptions. To troubleshoot the issue, perform the following steps:
Log on to the SAE webshell. Run the telnet command on the endpoint of the ApsaraMQ for Kafka instance to check whether the network connection of the instance is normal. For more information about the webshell, see Use the webshell feature to check the health status of applications.
Check the network status.
Check whether the SAE application and the ApsaraMQ for Kafka instance reside in the same VPC and whether the IP address whitelist is configured.
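If telnet is unavailable in the container, a plain TCP check works as well. The following Python sketch tests whether an endpoint is reachable; the host name and port are placeholders, so substitute the VPC endpoint of your ApsaraMQ for Kafka instance:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical endpoint; replace with your instance's actual VPC endpoint and port.
# if not can_reach("alikafka-xxx.alikafka.aliyuncs.com", 9092):
#     print("Endpoint unreachable: check the VPC, vSwitch, and IP address whitelist.")
```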
If the network meets the requirements for log collection, join the DingTalk group 32874633 for technical support.