| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| EMR ClickHouse is supported | When data is distributed to ClickHouse, EMR ClickHouse can be selected as the data warehouse type. | 2022-05-24 | |
| Tasks can be restarted | An abnormal task can be restarted. The data already processed and the CKafka instance involved are not affected. | 2022-05-24 | |
| Tasks can be replicated and recreated | If you need many tasks with similar configurations, create the first task and then use the task replication feature to quickly create the rest. If a task fails to be created, for example because of an incorrect configuration, you can recreate it manually. | 2022-05-24 | |
| The consumption progress can be displayed during data processing and data distribution | - | 2022-05-24 | |
| The latest message can be viewed | - | 2022-05-24 | |
| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| The schema management feature is supported | You can associate a created schema with a specific data access task to verify the format of the accessed data against the schema. | 2022-04-08 | |
| Schema is supported for reporting data over HTTP | You can use the specified schema to verify the format of the data reported over HTTP. | 2022-04-08 | |
| You can specify the start offset for a data sync task | - | 2022-04-08 | |
| Data can be distributed to COS | You can use DataHub to distribute CKafka data to COS for data analysis and download. | 2022-04-08 | |
| COS becomes a supported data source | DataHub supports pulling data from COS for unified management and distribution to downstream offline/online processing systems, forming a clear data flow channel. | 2022-04-08 | |
| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Messages with parsing failures can be discarded in a data distribution task with ES or ClickHouse as the data target | If the data target is ES or ClickHouse, messages that fail to be parsed can be discarded. If they are not discarded, parsing exceptions may occur and the data dump will stop. | 2022-03-09 | |
| JSONPATH is a supported data processing type | JSONPATH is used to parse nested JSON data. An expression starts with the `$` symbol and uses the `.` symbol to locate specific fields in nested JSON data (see the sketch below this table). | 2022-03-09 | |
| Data can be distributed to CLS | You can use DataHub to distribute CKafka data to CLS for troubleshooting, metric monitoring, and security audit. | 2022-04-08 | |
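As a rough illustration of the JSONPATH syntax described above (not DataHub's actual parser), the following minimal Python sketch resolves a dotted expression such as `$.payload.user.name` against nested JSON data; the message and field names are hypothetical.

```python
import json

def resolve_jsonpath(data, path):
    """Resolve a simple JSONPATH expression of the form `$.a.b.c`
    against nested JSON data (plain field access only, no wildcards or filters)."""
    if not path.startswith("$"):
        raise ValueError("A JSONPATH expression must start with '$'")
    current = data
    # Drop the leading '$' and walk each '.'-separated field name.
    for field in path.lstrip("$").strip(".").split("."):
        if not field:
            continue
        current = current[field]
    return current

# Hypothetical nested message, e.g. one record consumed from a CKafka topic.
message = json.loads('{"payload": {"user": {"name": "alice", "id": 42}}}')
print(resolve_jsonpath(message, "$.payload.user.name"))  # -> alice
```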
| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| DTS becomes a supported data source | DataHub supports pulling data from DTS for unified management and distribution to downstream offline/online processing systems, forming a clear data flow channel. | 2022-01-23 | |
| Data can be distributed to TDW | You can use DataHub to distribute CKafka data to TDW for data storage, query, and analysis. | 2022-01-23 | |
| Data can be distributed to CLS | You can use DataHub to distribute CKafka data to CLS for troubleshooting, metric monitoring, and security audit. | 2022-01-23 | |
| Data can be distributed to ClickHouse | You can use DataHub to distribute CKafka data to ClickHouse for data storage, query, and analysis. | 2022-01-23 | |
| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| DataHub is officially launched | DataHub is a data access and processing platform in Tencent Cloud for one-stop data access, processing, and distribution. It can continuously receive and collect data from applications, the web, cloud product logs, and other sources, and process the data in real time, helping you build a low-cost data flow link between data sources and data processing systems. | 2021-12-21 | |