You can write an SCF function to process logs collected by the CLS service. CLS invokes the function with the collected logs as a parameter, and the function code can then process and analyze the data or ship it to other Tencent Cloud services.
Characteristics of CLS triggers:
After consuming messages, the CLS backend consumption module encapsulates them into event structures according to the maximum waiting time, the delivery process, and the message body size, and then initiates an asynchronous function invocation. If an invocation fails with AccessDeniedException or ResourceNotFoundException, the CLS trigger pauses its delivery retry logic.
Note:
- During message delivery, messages may be aggregated in varying combinations; that is, the number of messages in each event structure ranges from one up to the aggregation limit. If the configured maximum waiting time is too long, the aggregation window of an event structure may never be filled before delivery.
- After the function obtains the event content, it should process each message in a loop; do not assume that the number of messages delivered in each invocation is constant.
When the specified CLS trigger receives a message, the CLS backend consumption module consumes it, encapsulates it into an event, and asynchronously invokes your function with that event. To keep data transfer efficient in a single invocation, the value of the data field is a Base64-encoded, Gzip-compressed document.
{
    "clslogs": {
        "data": "ewogICAgIm1lc3NhZ2VUeXBlIjogIkRBVEFfTUVTU0FHRSIsCiAgICAib3duZXIiOiAiMTIzNDU2Nzg5MDEyIiwKICAgICJsb2dHcm91cCI6I..."
    }
}
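Decoding the data field reverses the two encoding steps: Base64-decode, then decompress the Gzip payload. Below is a minimal sketch in Python; `main_handler` is the conventional SCF Python entry point, and the field names follow the event structure above:

```python
import base64
import gzip
import json

def main_handler(event, context):
    # The "data" field is a Base64-encoded, Gzip-compressed JSON document.
    raw = event["clslogs"]["data"]
    decompressed = gzip.decompress(base64.b64decode(raw))
    logs = json.loads(decompressed)
    # "logs" now holds the decoded log body (topic_id, topic_name, records, ...).
    return logs
```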
After being decoded and decompressed, the log data will look like the following JSON body (using decoded CLS Logs message data as an example):
{
    "topic_id": "xxxx-xx-xx-xx-yyyyyyyy",
    "topic_name": "testname",
    "records": [{
        "timestamp": "1605578090000000",
        "content": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    }, {
        "timestamp": "1605578090000000",
        "content": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    }]
}
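Because the number of records per event varies, the function should iterate over the records array rather than assume a fixed count. A hedged sketch (the processing step is a placeholder):

```python
def process_records(logs: dict) -> int:
    """Process each log record in a loop; the record count per event varies."""
    processed = 0
    for record in logs.get("records", []):
        # Placeholder processing: in practice, analyze the record or
        # forward it to another Tencent Cloud service.
        _ = (record["timestamp"], record["content"])
        processed += 1
    return processed
```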
The data structures are as detailed below:
| Structure | Description |
|---|---|
| topic_id | Log topic ID |
| topic_name | Log topic name |
| timestamp | Log production time (microsecond-level Unix timestamp) |
| content | Log content |
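Since timestamp is a microsecond-level Unix timestamp delivered as a string, converting it to a datetime requires dividing by 1,000,000. A small sketch:

```python
from datetime import datetime, timezone

def record_time(timestamp_us: str) -> datetime:
    # "timestamp" is a microsecond-level Unix timestamp in string form.
    return datetime.fromtimestamp(int(timestamp_us) / 1_000_000, tz=timezone.utc)
```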