Overview
The ES Serverless service is a fully managed, cloud-native Elasticsearch service from Tencent Cloud, built on a self-developed serverless architecture with no cluster concept. You can create and use indexes on demand, benefiting from auto-scaling and fully maintenance-free operation, which effectively reduces the high resource costs caused by peak and off-peak fluctuations in log analysis and metric monitoring scenarios. Fully compatible with the ELK ecosystem, it offers end-to-end data writing, data management, and data visualization for a plug-and-play log analysis experience.
Quick Start
The ES Serverless service supports writing data into indexes through methods such as native ES APIs, Logstash, Flink, or Kafka. If you need log collection for services such as CVM, TKE, or TCHouse-C, a one-stop visualized configuration option is also available: by simply setting up the data source and index information, you can collect logs into indexes for efficient retrieval and analysis. This document walks you through the full process of index creation > data writing > retrieval and analysis, giving you a quick overview of using the ES Serverless service in log analysis scenarios.
Basic Concepts
Before diving into the experience, let us review several relevant basic concepts:
| Concept | Description |
|---------|-------------|
| Project space | A project space is the fundamental resource unit in the ES Serverless service. You can create indexes related to the same business within a single project space, facilitating index management. |
| Index | An index is the smallest unit for data storage and management, providing log storage and near real-time query capabilities. Collected log data is stored in indexes. |
| Kibana | Kibana is a data analysis and visualization platform integrated with ES, supporting log writing, retrieval, and chart creation (such as maps and line charts). |
| Logs | Logs are records generated during the operation of application systems, including operation logs, access logs, and error logs. |
Creating a Space
1. Log in to the ES Serverless console to enter the space list.
2. In the space list, click Create Project to enter the project creation page.
3. On the project creation page, configure the following settings:
Project Name: Use this name to identify the project. Follow the naming guidelines provided on the page.
VPC / AZ and Subnet: The project space is created within a VPC to ensure secure access. Select the appropriate VPC, availability zone, and subnet. If you need to create them, see Create New VPC and Create New Subnet.
4. After completing the information, click Confirm to create the project.
Creating an Index
An index can be created in two ways: from the Project list page or from the Project Basic page. The following example uses the Project list page.
1. On the Project list page, enter the Quick Access Data page and select your data source. Here, we will use API write as an example.
2. Review the writing prompts, then click Next.
3. On the Index Settings page, enter the basic information and index configuration, then click Create.
Region: Select the region from the drop-down list.
Project: Choose the project space to organize the index for easier management. If no options are available in the dropdown, click Create Project and follow the previous instructions to create one.
Index name: This name will be used for subsequent data writing and querying. Follow the naming prompts shown on the page.
Field mapping: Used to set the field details of the data. You can select Dynamic creation, which will automatically generate field settings based on the data you write in, or choose to customize the field settings.
Time field: Select or enter a date-type field from your data. After the index is created, this field cannot be modified.
Data retention period: The retention period of the data. For example, if it is set to Limited 30 days, data will be deleted on the 30th day after being written to the index.
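If you choose to customize the field settings rather than use Dynamic creation, the field mapping is a set of field-name-to-type entries. The sketch below builds an explicit mapping matching the sample log entry used later in this document; all field names and types here are illustrative, so adjust them to your own data.

```python
import json

# Illustrative explicit field settings for the sample log entry in this
# document. "@timestamp" must be a date type so it can serve as the
# index's time field.
field_mapping = {
    "properties": {
        "id":         {"type": "keyword"},
        "routing_no": {"type": "keyword"},
        "region":     {"type": "keyword"},
        "user_name":  {"type": "keyword"},
        "user_type":  {"type": "keyword"},
        "ip":         {"type": "ip"},        # IP type enables CIDR-style queries
        "now_local":  {"type": "keyword"},
        "@timestamp": {"type": "date"},      # the time field selected at creation
    }
}

print(json.dumps(field_mapping, indent=2))
```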
Data Writing
1. In the project list, click the name of the desired project to enter the Project Management page.
Note:
Kibana's relevant modules are embedded directly into the Tencent Cloud console, so you can use search and analysis features directly: Search and Analysis corresponds to Discover, and Development Tools corresponds to Dev Tools. This embedded feature requires third-party cookies to be enabled in your browser; if you experience issues, enable third-party cookies. To access Kibana externally for data writing, see Writing Data.
2. Enter the Development Tools page.
3. Enter the following statement and click the triangle icon to write data. Each click writes one document (the content within {} represents one complete log entry). Click several times to generate enough entries for the data retrieval demonstration below.
Sample statement:
POST index name/_doc
{
"id":"090798",
"routing_no":"4087",
"region":"10002424",
"user_name":"user-Ufa9Yee1P",
"user_type":"01",
"ip":"119.147.10.191",
"now_local":"gz",
"@timestamp":1705648983762
}
Note:
Replace index name in the statement with your actual index name.
If your time field is not @timestamp, change @timestamp in the statement to your custom time field.
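Instead of clicking the run icon once per document, the Elasticsearch _bulk API can write several documents in one request. The sketch below builds the NDJSON request body; the index name and field values are illustrative, and the endpoint to POST to comes from your own index details page.

```python
import json
import time

index_name = "demo_log_index"  # assumption: replace with your index name

# The _bulk API body is NDJSON: for each document, an action line
# ({"index": ...}) followed by the document source line. The whole body
# must end with a newline.
docs = [
    {"user_name": f"user-{i}", "now_local": "gz",
     "@timestamp": int(time.time() * 1000)}
    for i in range(3)
]

lines = []
for doc in docs:
    lines.append(json.dumps({"index": {"_index": index_name}}))
    lines.append(json.dumps(doc))
bulk_body = "\n".join(lines) + "\n"

# POST this body to <your endpoint>/_bulk with the header
# Content-Type: application/x-ndjson.
print(bulk_body)
```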
Retrieval and Analysis
With the data successfully written into the ES Serverless service, the following steps demonstrate how to query this data.
Method 1: Using DSL
1. Copy the example statement below and click the triangle icon to execute a query on the written data.
GET index name/_search
{
"query":{
"match_all":{}
}
}
2. If the documents you wrote appear in the hits of the returned result, the data was queried successfully.
Method 2: Using Discover
1. Click Log Analysis, and select the index just written from the index drop-down list.
2. Filter by time. Because the sample data in this example carries a January 2024 timestamp, select a time range that covers it (such as the past year) to retrieve the data.
3. You can also enter keywords to retrieve entries that match specific criteria. For example, to retrieve entries whose now_local field has the value gz:
Click now_local in the field list.
Select the : operator.
Enter gz and click Refresh. All entries whose now_local field value is gz are highlighted.
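The Discover filter now_local : gz combined with the time range corresponds to a query DSL statement that you can also run in Development Tools. The sketch below builds that query body; field names follow the sample document written earlier, and "now-1y" is Elasticsearch date math relative to query time.

```python
import json

# DSL equivalent of filtering now_local = gz within the past year:
# a bool query with a match clause plus a range filter on the time field.
query = {
    "query": {
        "bool": {
            "must": [{"match": {"now_local": "gz"}}],
            "filter": [
                {"range": {"@timestamp": {"gte": "now-1y", "lte": "now"}}}
            ],
        }
    }
}

# Paste the JSON below after "GET index name/_search" in Development Tools.
print(json.dumps(query, indent=2))
```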
For more details on data retrieval and analysis methods, see Data Query.