Connector Release Notes

Last updated: 2024-09-11 15:29:49

May 2024

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| K2K supports cross-cloud synchronization. | Cross-cloud synchronization enables data and metadata from Kafka instances of other cloud service providers, as well as from self-built Kafka instances within Tencent Cloud, to be synchronized to Tencent Cloud CKafka. | 2024-05-24 | - |
| Data Subscription Binlog type supports lock method configuration. | Adds a row lock configuration to narrow the lock scope during data synchronization and avoid lock waits. | 2024-05-24 | - |

December 2023

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| MySQL data collection supports filtering empty messages. | If the collected MySQL Binlog logs contain MySQL commands such as BEGIN TRANSACTION, they are delivered to CKafka as empty strings. This feature filters out such empty messages. | 2023-12-06 | - |
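The filtering behavior described above can be sketched as a simple predicate over a message batch. This is a minimal illustration, not the connector's actual implementation; the message shape is a hypothetical assumption.

```python
def filter_empty_messages(messages):
    """Drop messages whose payload is empty, e.g. Binlog control
    statements such as BEGIN that serialize to empty strings."""
    return [m for m in messages if m.get("value", "").strip()]

# Hypothetical batch: two row changes and one empty control message.
batch = [
    {"offset": 101, "value": '{"op":"INSERT","id":1}'},
    {"offset": 102, "value": ""},  # e.g. a BEGIN statement
    {"offset": 103, "value": '{"op":"UPDATE","id":1}'},
]
kept = filter_empty_messages(batch)
print([m["offset"] for m in kept])  # → [101, 103]
```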

November 2023

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Data synchronization supports syncing offsets. | When performing Kafka instance-level data synchronization with the connector, consumer offsets can be selected and synchronized along with the data. | 2023-11-01 | - |
| Supports synchronizing source database change operations subscribed by the connector to the target MySQL database. | The connector can synchronize data changes (inserts, updates, and deletes) from a MySQL or PostgreSQL database, subscribed to a Topic, to a target MySQL database. Changes are identified so that the target table stays consistent with the source table. | 2023-11-01 | - |
| Connector MySQL subscription supports a custom time zone. | - | 2023-11-01 | - |

September 2023

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Data distributed from the connector to COS supports minute-level aggregation. | - | 2023-09-20 | - |

August 2023

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Custom field mapping is supported when subscribing to change data from relational databases to ES. | When using the connector to subscribe to change data from relational databases to ES, message field names can be custom-mapped to target index fields. | 2023-08-22 | - |
| Connector data distributed to ES supports specifying the index time. | When distributing data to ES, a field in the source data can be specified as the index time; by default, the message delivery time is used. | 2023-08-22 | - |

May 2023

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Data distributed to Kafka supports instance-level data synchronization. | Supports Kafka instance-level and topic-level data synchronization, including replication and migration between instances across regions, as well as data transfer and automatic synchronization between topics in different CKafka instances. This enhances both business continuity and data reliability. | 2023-05-08 | - |

January 2023

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| The data distribution target now supports SCF. | With SCF and the CKafka Connector, CKafka messages can easily be exported to COS, ES, CDW, and other targets. | 2023-01-13 | - |
| Data synchronization feature migration | The Cross-Region Disaster Recovery feature has been moved to Instance List > Cluster Backup, and the Data Synchronization feature to Connector > Task List > Data Distribution to CKafka. These migrations do not affect normal usage. | 2023-01-13 | - |
| Data processing and parsing rules support JSON object data parsing. | New parsing modes, JSON Object Array - Single Line Output and JSON Object Array - Multi-line Output, improve data flow efficiency. | 2023-01-13 | - |
| Data processing rules support exporting/importing task configuration. | Data processing rules can be exported as templates and reused in subsequent data tasks, reducing the operating costs of repetitive configuration. | 2023-01-13 | - |
| Data processing supports automatic generation of regular expressions. | Suitable for log text where each line is an original log entry that can be parsed into multiple key-value pairs based on the regular expression. | 2023-01-13 | - |
| Data distributed to COS supports aggregation by time. | Data distributed to COS now supports time-based aggregation. You can select the aggregation interval based on the message volume. | 2023-01-13 | COS |
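Time-based aggregation groups messages into fixed intervals, with each interval's batch written as one aggregated object. The bucketing logic can be sketched as follows; the interval, timestamp field, and message shape are illustrative assumptions.

```python
from collections import defaultdict

def bucket_by_time(messages, interval_s=300):
    """Group messages into fixed time windows (default 5 minutes);
    each bucket would become one aggregated COS object."""
    buckets = defaultdict(list)
    for msg in messages:
        # Align the message timestamp to the start of its window.
        window_start = msg["ts"] - msg["ts"] % interval_s
        buckets[window_start].append(msg["value"])
    return dict(buckets)

msgs = [
    {"ts": 1000, "value": "a"},
    {"ts": 1100, "value": "b"},
    {"ts": 1400, "value": "c"},
]
out = bucket_by_time(msgs, interval_s=300)
print(sorted(out))  # → [900, 1200]
```

A larger interval trades upload frequency for object size, which is why the console lets you pick the interval based on message volume.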

November 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| The database subscription task supports automatic Topic creation. | When you create a database subscription task, the data target configuration supports automatically creating a Topic or selecting an existing one. | 2022-11-14 | - |
| MySQL subscription supports regular expression matching of databases. | When creating a MySQL or TDSQL-C MySQL database subscription task, regular expression matching can be used to subscribe to an entire database or to specific tables when selecting databases and tables as data sources. | 2022-11-14 | - |
| PostgreSQL data distribution supports distributing data from different tables to different Topics. | When you create a PostgreSQL or TDSQL-C PostgreSQL data subscription task, the data target configuration supports distributing data from different tables to different Topics. | 2022-11-14 | - |
| Supports adding subscribed tables. | When editing a database subscription task, you can add new subscribed tables; tables already being monitored keep their original collection logic. | 2022-11-14 | - |
| The data access task supports data compression. | - | 2022-11-14 | - |
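Selecting subscription tables by regular expression works like standard full-name matching. A minimal sketch in Python terms; the pattern and table names below are made-up examples, not a real subscription configuration.

```python
import re

def match_tables(pattern, tables):
    """Return tables whose 'db.table' name fully matches the pattern."""
    rx = re.compile(pattern)
    return [t for t in tables if rx.fullmatch(t)]

tables = ["shop.orders_2022", "shop.orders_2023", "shop.users", "log.audit"]
# Subscribe to every yearly orders_* table in the shop database.
selected = match_tables(r"shop\.orders_\d{4}", tables)
print(selected)  # → ['shop.orders_2022', 'shop.orders_2023']
```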

October 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Supports the task orchestration feature. | For scenarios with a single data source and a single data target, preset templates let users quickly build data flow tasks that move data from the source to the specified target. | 2022-10-28 | - |
| Supports new data target types. | Data targets now include the Time Series Database (CTSDB) and the analytical database Doris. | 2022-10-28 | - |
| CLS sink log time precision now supports milliseconds. | - | 2022-10-28 | - |
| MariaDB and TDSQL-C MySQL data subscription optimization | Supports subscribing to multiple databases; three table selection methods (all database tables, batch selection, and regular expression matching); and sending subscription data from different database tables to different Topics. | 2022-10-28 | - |
| TDSQL-C PostgreSQL and PostgreSQL data subscription | The TDSQL-C PostgreSQL and PostgreSQL data source configuration capabilities are now aligned with MySQL data subscription. | 2022-10-28 | - |

September 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Data distributed to PostgreSQL supports default field matching. | When the structure of upstream MySQL Binlog/PostgreSQL row-level change data tables changes, the changes can be synchronized to downstream PostgreSQL. | 2022-09-15 | - |
| MySQL data subscription supports non-Schema format output. | When creating a data access task and configuring data source information, you can toggle the Schema switch to decide whether the KEY and VALUE content in the output message includes the Schema. | 2022-09-15 | - |
| When data is distributed to CLS, a time field can be specified as the log time. | When creating a data distribution task and configuring the data target, you can specify a time field in the preview data as the log time. | 2022-09-15 | - |
| New supported types for Connection Management | Connection Management now supports Cloud Data Warehouse - PostgreSQL and TDSQL for PostgreSQL. | 2022-09-15 | - |
| New supported types for data distribution | Data distribution now supports PostgreSQL, Cloud Data Warehouse - PostgreSQL, and TDSQL-C. | 2022-09-15 | - |
| MySQL data subscription optimization | Supports subscribing to multiple databases; three table selection methods (all database tables, batch selection, and regular expression matching); and sending subscription data from different database tables to different Topics. | 2022-09-01 | - |
| Supports self-built services on the cloud (based on CLB). | Supports self-built MySQL and PostgreSQL databases on the cloud accessed via CLB. | 2022-09-01 | - |
| Data processing supports ROW format output. | - | 2022-09-01 | - |

August 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| MariaDB and TDSQL-C for MySQL support Binlog subscription. | - | 2022-08-22 | - |
| Supports MySQL JDBC source and sink connections. | The data access and data distribution targets support MySQL JDBC. | 2022-08-22 | - |
| Data processing and data distribution tasks are merged. | When creating data distribution tasks, you can set data processing rules; after processing, data is transferred directly to the downstream target, reducing redundancy in the CKafka Topic-to-Topic pipeline. Kafka has been added as a data distribution target. | 2022-08-22 | - |
| Supports the Event Center. | The Event Center manages, stores, analyzes, and displays event data generated by connectors in a unified manner. You can view detailed event data in the Event Center and configure alarm notification rules for events to detect and address issues promptly. | 2022-08-22 | - |
| Enhanced monitoring metrics | The monitoring feature now includes task monitoring and database monitoring. | 2022-08-22 | - |
| Data processing supports regular expression replacement. | When setting data processing rules, the processing value supports a regular expression replacement mode. | 2022-08-22 | - |
| Supports delivering failure messages to the user's CLS. | When the data target is ClickHouse, MySQL, or ES, failed messages can be delivered to the user's CLS. | 2022-08-22 | - |
| When data is distributed to COS, object names can be specified. | - | 2022-08-22 | - |
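A regular expression replacement rule behaves like a standard regex substitution applied to a field value. In Python terms (the rule below, masking phone numbers, is a made-up example, not a built-in rule):

```python
import re

def regex_replace(value, pattern, replacement):
    """Apply a regex-replacement processing rule to a field value."""
    return re.sub(pattern, replacement, value)

# Mask an 11-digit phone number in a log line.
masked = regex_replace("call 13800138000 now", r"\d{11}", "***")
print(masked)  # → call *** now
```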

July 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Supports Connection Management. | Connections can be created independently. Once created, a connection can be linked directly to data tasks as a data source or data target, avoiding repeated configuration and reducing operating costs. | 2022-07-08 | - |
| New data sources for data access | Data access now supports PostgreSQL, TDSQL-C, and MySQL data subscriptions. | 2022-07-08 | - |
| Independent Topic sales support | Topics can now be created separately in the console and used as data sources or data targets for data tasks. | 2022-07-08 | - |
| Task creation page redesign | Data access, data processing, and data distribution tasks are now managed in a unified task list. | 2022-07-08 | - |

May 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| EMR ClickHouse is supported. | When data is distributed to ClickHouse, EMR ClickHouse can be selected as the data warehouse type. | 2022-05-24 | - |
| Tasks can be restarted. | Abnormal tasks can be restarted without affecting previously processed data or the CKafka instance involved. | 2022-05-24 | - |
| Tasks can be replicated and recreated. | When many tasks share similar configurations, the task replication feature lets you create additional tasks quickly after the first one succeeds. Task creation failures may be caused by incorrect configurations; in that case, you can manually recreate the tasks. | 2022-05-24 | - |
| The consumption progress can be displayed during data processing and data distribution. | - | 2022-05-24 | - |
| The latest message can be viewed. | - | 2022-05-24 | - |

April 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| The schema management feature is supported. | You can associate a created schema with a specific data access task to verify the format of the incoming data against the schema. | 2022-04-08 | - |
| Schema is supported for reporting data over HTTP. | You can use a specified schema to verify the format of data reported over HTTP. | 2022-04-08 | - |
| You can specify the start offset for a data sync task. | - | 2022-04-08 | Data Sync |
| Data can be distributed to COS. | You can use DataHub to distribute CKafka data to COS for data analysis and download. | 2022-04-08 | - |
| COS becomes a supported data source. | DataHub supports pulling data from COS for unified management and distribution to downstream offline/online processing systems, forming a clear data flow channel. | 2022-04-08 | COS |
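Schema verification of reported data can be sketched as a check of each record against declared field types. The schema format below is an illustrative assumption, not the console's actual schema definition.

```python
def validate(record, schema):
    """Check that every schema field is present with the declared type."""
    errors = []
    for field, ftype in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"wrong type for {field}")
    return errors

# Hypothetical schema: an integer id and a string name.
schema = {"id": int, "name": str}
print(validate({"id": 1, "name": "a"}, schema))  # → []
print(validate({"id": "1"}, schema))             # → two errors
```

Records that fail validation would be rejected or routed aside rather than written to the Topic.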

March 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| Messages with parsing failures can be discarded when the data target is ES or ClickHouse. | If the data target is ES or ClickHouse, messages that fail to be parsed can be discarded. If they are not discarded, exceptions may occur and data dumping will stop. | 2022-03-09 | - |
| JSONPATH is a supported data processing type. | JSONPATH is used to parse nested JSON data. It starts with the $ symbol and uses the . symbol to locate specific fields in nested JSON data. | 2022-03-09 | - |
| Data can be distributed to CLS. | You can use DataHub to distribute CKafka data to CLS for troubleshooting, metric monitoring, and security audit. | 2022-04-08 | - |
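The $ and . syntax described for JSONPATH can be demonstrated with a minimal resolver. This sketch handles simple dot paths only; real JSONPath implementations also support array indexes, wildcards, and filters.

```python
def resolve_jsonpath(data, path):
    """Resolve a simple JSONPATH like '$.a.b' against nested dicts."""
    if not path.startswith("$"):
        raise ValueError("JSONPATH must start with $")
    node = data
    # Strip the leading '$.' and walk one key at a time.
    for key in path.lstrip("$.").split("."):
        if key:
            node = node[key]
    return node

doc = {"user": {"address": {"city": "Shenzhen"}}}
print(resolve_jsonpath(doc, "$.user.address.city"))  # → Shenzhen
```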

January 2022

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| DTS becomes a supported data source. | DataHub supports pulling data from DTS for unified management and distribution to downstream offline/online processing systems, forming a clear data flow channel. | 2022-01-23 | DTS |
| Data can be distributed to TDW. | You can use DataHub to distribute CKafka data to TDW for data storage, query, and analysis. | 2022-01-23 | - |
| Data can be distributed to CLS. | You can use DataHub to distribute CKafka data to CLS for troubleshooting, metric monitoring, and security audit. | 2022-01-23 | - |
| Data can be distributed to ClickHouse. | You can use DataHub to distribute CKafka data to ClickHouse for data storage, query, and analysis. | 2022-01-23 | - |

December 2021

| Update | Description | Release Date | Documentation |
| --- | --- | --- | --- |
| DataHub is officially launched. | DataHub is a data access and processing platform on Tencent Cloud for one-stop data access, processing, and distribution. It continuously receives and collects data from applications, web services, cloud product logs, and other sources, and processes the data in real time. This helps build low-cost data flow linkages between data sources and data processing systems. | 2021-12-21 | - |
    