Error Codes

Last updated: 2024-09-04 11:18:20
    DLC provides error codes and descriptions to help you quickly identify the type of error, understand the cause, and find recommended solutions.

    Error Code Descriptions

    Error Code
    Error Description
    50000
    Business system error. Please retry or submit a ticket to contact us.
    50001
    Parameter not found or parameter is empty.
    50002
    The specified file path format is not compliant. Currently, only cosn:// or lakefs:// is supported.
    50003
    The specified file compression format is not compliant. Currently, only tar.gz, tar, or tgz is supported.
    50004
    Date parameter error; for example, the end time is earlier than the start time.
    50005
    Invalid number of filter conditions. When filters are specified, there should be at least 1 and no more than 5 conditions.
    50006
    Specified parameter Base64 decoding failed. Please convert to Base64 encoding (see the encoding sketch after this table).
    50007
    The specified time format is not compliant. Currently, only YYYY-mm-dd HH:MM:SS is supported.
    50008
    This is an allowlist feature. Please contact us to enable it.
    50009
    The specified URL format is not compliant. See the example: //ip:port/xxx.
    50010
    The specified FiltersKey is not supported. Please see the API documentation.
    50011
    Filter.Value cannot be empty.
    51001
    The specified script name exceeds the length limit, which should be 128 characters or fewer.
    51002
    The specified script status type does not match. Currently supported types are
    0: Initialization; 1: Completed; 2: Deleted.
    51003
    The specified script enable status does not match. Currently supported types are
    0: Disabled and 1: Enabled.
    51004
    The specified ViewKey type does not match. Currently supported types are BatchId, SessionSQL, ComputerEngine, ComputeResource, DataAmount, DataNumber, DeployStatus, SubmitTime, TotalTime and UsedTime.
    51005
    The user does not have permissions to modify the current script.
    51006
    The specified script does not exist. Please adjust and try again.
    51007
    The specified script already exists. Please adjust and try again.
    51008
    When you change the script name, IsMove must be set to 1.
    53001
    The current file compression format is not supported. Supported formats are snappy, gzip, and none.
    53002
    The current file format is not supported. Supported formats are json, csv, avro, orc, and parquet.
    53003
    COS path error. Please check whether the COS path is correct.
    53004
    COS file error. Please select the file again.
    53005
    File inference failed. Please select the file again, or submit a ticket to contact us.
    53006
    The specified data connection type is not supported. Currently supported types are DataLakeCatalog, Mysql, HiveCos, HiveHdfs, HiveCHdfs, PostgreSql, SqlServer, and ClickHouse.
    53007
    Metadata error. Please retry, or submit a ticket to contact us.
    53008
    Data governance error. Please retry, or submit a ticket to contact us.
    53009
    System file error. Please retry, or submit a ticket to contact us.
    53010
    Data source connection configuration error. Please retry, or submit a ticket to contact us.
    53011
    Data source connection is not unique. Please retry, or submit a ticket to contact us.
    53012
    Data source connection does not exist. Please retry, or submit a ticket to contact us.
    53013
    The specified filter condition is not supported. Please see the API documentation for parameters.
    53014
    Invalid DecimalType setting: Precision must be greater than or equal to Scale, and Precision must be less than 38.
    53015
    Invalid table format. Currently supported formats are TextFile, CSV, Json, Parquet, ORC, and AVRO.
    53016
    The current user does not have permissions to create UDFs. Please obtain authorization and try again.
    53017
    The current user does not have permissions to modify UDFs. Please obtain authorization and try again.
    53018
    The current user does not have permissions to delete UDFs. Please obtain authorization and try again.
    53019
    UDF parameter error
    53020
    UDF duplication
    53021
    The UDF does not exist. Please retry, or submit a ticket to contact us.
    53022
    The view does not exist. Please retry, or submit a ticket to contact us.
    53023
    The directory already exists.
    53024
    The current user does not have permissions to delete the directory.
    53025
    The directory does not exist.
    53026
    The current user is not allowed to modify the directory; only administrators or the creator can modify.
    53027
    The parent directory does not exist.
    53028
    Table path configuration error. Please check and reset the path.
    53029
    Table format configuration error. Only one format can be set.
    53030
    Table name configuration error. The table name must be 128 bytes or fewer.
    53031
    Field type configuration error. Supported types are string, tinyint, smallint, int, bigint, boolean, float, double, decimal, timestamp, date, binary, array<>, map<>, struct<>, and uniontype<>.
    53032
    Field count configuration error. The number of fields must be 4096 or fewer.
    53033
    Field name configuration error. Field names must be 128 bytes or fewer.
    53034
    The table does not exist. Please retry, or submit a ticket to contact us.
    53035
    Function name must consist of alphanumeric characters or underscores, and be no longer than 20 characters.
    53036
    Function class name format error. A fully qualified Java class name is required.
    53037
    Only JAR file types are supported for packages.
    53038
    TableType configuration error. Supported types are TABLE, VIEW, MANAGED_TABLE, EXTERNAL_TABLE, LAKEFS, ICEBERG, HIVE, and OTHER.
    53039
    Column not found. Please retry, or submit a ticket to contact us.
    55001
    The Spark Shuffle storage path was not found. Please go to the Console > Data Explorer page > Storage Configuration to set it up.
    55002
    The Warehouse storage path was not found. Please go to the Console > Data Explorer page > Storage Configuration to set it up.
    56001
    Specified task resources exceed the remaining cluster resource limits. Please adjust and retry.
    56002
    The specified Spark task does not exist. Please adjust and retry.
    56003
    The specified Spark task already exists. Please adjust and retry.
    56004
    The specified Spark task type does not match. Currently supported types are
    1: Batch Tasks; 2: Streaming Tasks; 3: PySpark Tasks; 4: SQL Tasks
    56005
    The specified Spark task package file format does not match. Currently, only .jar or .py files are supported.
    56006
    The specified Spark task Filter.Key does not match. Currently supported keys are spark-app-type, user-name, spark-job-name, spark-job-id, and key-word.
    56007
    Spark tasks can only be run using the Spark Job Engine.
    56008
    The specified Spark task sorting type does not match. Currently supported types are create-time, update-time, user-name, and data-engine-name.
    56009
    The specified Spark task IsInherit type does not match. Currently supported values are
    0: Inherit; 1: Do not inherit.
    56010
    The specified Spark task RoleArn does not exist. Please adjust and retry.
    56011
    The specified TCR Spark image format does not match. See the example: my-image/ndf/python/latest.
    56012
    Insufficient resources for the specified Spark job. Please adjust driver/executor specifications.
    57001
    The current statement only supports SQL type.
    57002
    SQL script Base64 decoding failed. Please convert to Base64 encoding (see the encoding sketch after this table).
    57003
    SQL parameter preprocessing failed.
    57004
    The number of submitted SQL statements must be between 1 and 50.
    57005
    Failed to obtain the result storage path. Please go to the Console > Data Exploration Page to set it up.
    57006
    The current session only supports the following types: spark, pyspark, sparkr, and sql.
    57007
    The specified DriverSize specification only supports small, medium, large, xlarge, m.small, m.medium, m.large, and m.xlarge.
    57008
    The specified ExecutorSize specification only supports small, medium, large, xlarge, m.small, m.medium, m.large, and m.xlarge.
    57009
    The specified dynamic number of executors must be set to the current maximum value.
    57010
    The current task only supports running on the Spark batch job engine.
    57011
    Insufficient resources to create the session. Please adjust the driver/executor specifications.
    57012
    Session not found.
    57013
    Session state is dead.
    57014
    The specified interactive SQL task does not exist.
    57015
    The specified SortBy type for the interactive SQL task does not match. Currently supported types are create-time and resource-usage.
    57016
    The specified interactive SQL task Filter.Key does not match. Currently supported keys are task-sql-keyword, task-operator, batch-id, session-id, and task-state.
    57017
    The specified interactive SQL task is not unique.
    58001
    The specified engine does not exist.
    58002
    The specified engine already exists.
    58003
    The specified engine is not running.
    58004
    The specified ENI resource does not exist.
    58005
    The specified ENI resource already exists.
    58006
    The specified RoleArn does not exist.
    58007
    The specified isPublic value does not match. Currently supported values are
    1: Public; 2: Private
    58008
    The specified cluster image version does not exist.
    58009
    The specified cluster image version already exists.
    58010
    The specified state value does not match. Currently supported values are
    1: Initializing; 2: Online; 3: Offline
    58011
    The specified engine type does not match. Currently supported types are SparkSQL, PrestoSQL, and SparkBatch.
    58012
    The specified cluster image session configuration does not exist.
    58013
    The specified cluster image session configuration already exists.
    58014
    The default engine was not found.
    58015
    The specified cluster image UserRecords value does not match. Currently supported values are
    1: parentVersion; 2: childVersion; 3: pySpark
    58016
    The specified cluster image ParameterSubmitMethod does not match. Currently supported values are
    1: session; 2: common; 3: cluster
    58017
    The specified cluster image ParameterSubmitMethod does not match. Currently supported values are User and BackGround.
    58018
    The specified cluster parameter is invalid. Please check and retry.
    58019
    The specified cluster parameter already exists.
    58020
    The specified cluster is not in a running status.
    58021
    The specified cluster is not multi-versioned and does not support this operation.
    58022
    The specified cluster is not of the Spark batch job type and does not support this operation.
    58023
    The specified cluster configuration instance does not exist.
    58024
    The specified cluster configuration instance already exists.
    58025
    The specified cluster has unfinished tasks. Please wait for the tasks to complete and retry.
    58026
    The specified cluster configuration has not changed.
    58027
    The specified cluster billing type does not match. Currently supported values are
    0: Postpaid; 1: Prepaid
    58028
    The specified cluster image constant is not in JSON format.
    58029
    The specified cluster image session parameter is not in JSON format.
    58030
    The specified cluster ExecType does not match. Currently supported values are SQL or BATCH.
    58031
    The specified cluster image Config operation mode does not match. Currently supported values are
    1: Modify Session configuration; 2: Modify Cluster configuration
    58032
    The specified cluster image is not activated.
    58033
    The specified cluster image name does not meet the required standards.
    58034
    The specified private settings for the cluster image already exist.
    58035
    The specified cluster specification does not meet the required standards.
    58036
    The specified cluster image parameter does not exist.
    58037
    The specified cluster image Cluster parameter is not in JSON format.
    58038
    The specified cluster image operation does not match. Currently supported operations are InitImage, UpgradeImage, SwitchImage, RollbackImage, and ModifyResource.
    58039
    The specified cluster type does not match. Currently supported types are spark and presto.
    58040
    The specified cluster specification type does not match. Currently supported types are
    Standard_CU (General) and Memory_CU (Memory-optimized; not supported for SQL type clusters)
    58041
    The specified cluster resource type does not match. Currently supported types are spark_cu (for Spark clusters) and presto_cu (for Presto clusters).
    58042
    The specified cluster elasticity number does not match. The minimum cluster size should be greater than 1, the maximum should be 10 or fewer, and the maximum should be greater than the minimum.
    58043
    The specified cluster elasticity specification does not match. The upper limit of the specified elastic specification should be less than the cluster specification.
    58044
    The specified cluster CIDR format does not match. See the example: 192.0.2.1/24.
    58045
    The specified cluster start/stop policy is invalid. If the auto-suspend policy is enabled, the scheduled start/stop policy cannot be enabled, and vice versa.
    58046
    The specified cluster scheduled suspend policy is invalid. Currently supported options are
    1: Suspend after tasks are completed; 2: Force suspend (tasks will fail)
    58047
    The specified unit for cluster resource usage duration does not match. Postpaid: h; Prepaid: m. The default is h.
    58048
    The specified cluster resource usage duration does not match. Postpaid: must be set to 3600; Prepaid: the minimum is 1 (representing one month) and the maximum is 120. The default is 3600.
    58049
    The specified cluster billing mode does not match. Currently supported modes are
    1: Pay-as-you-go; 2: Monthly or annual subscription
    58050
    The specified cluster auto-renewal mode does not match.
    Postpaid does not require renewal and must be set to 0.
    For prepaid: 0 indicates manual renewal; 1 indicates automatic renewal; 2 indicates no renewal.
    58051
    The specified cluster auto-suspend policy does not match. Subscription clusters do not support this.
    58052
    The specified cluster auto-start policy does not match. Subscription clusters do not support this.
    58053
    The specified postpaid cluster policy does not match. Subscription clusters do not support this.
    58054
    Insufficient resources to create the cluster. Please retry or submit a ticket to contact us.
    58055
    EMR-LIVY cluster limit exceeded.
    58056
    This feature is only supported by batch job clusters.
    58057
    User does not exist. Please enter a valid user.
    58058
    The specified NetworkConnectionType is invalid. Currently supported values are
    2: Cross-source; 4: Enhanced
    58059
    This VPC is already bound to the specified data engine.
    58060
    No runnable EKS cluster exists.
    58061
    Unsupported Hive version. Supported versions are 2.1.1, 2.3.2, 2.3.3, 2.3.5, 2.3.7, 3.1.1, and 3.1.2.
    58062
    This operation is not supported for Cloud Ladder users.
    58063
    Information.type is invalid. Currently supported values are User, Group, DataAuth, EngineAuth, and RowFilter.
    58064
    DatasourceConnectionType is invalid. Currently supported values are Mysql, HiveCos, HiveHdfs, HiveCHdfs, Kafka, MysqlEni, OtherDatasourceConnection, PostgreSql, TDSQLPostgreSql, SqlServer, ClickHouse, and Elasticsearch.
    58506
    CIDR is invalid.
    59001
    The current user does not have permissions to grant authorization.
    59002
    The current user does not have permissions to revoke authorization.
    59003
    The current user does not have permissions to perform this operation.
    59004
    The current user does not have permissions to add users to the workgroup.
    59005
    No permission to bind a job to a user.
    59006
    The user does not have permissions to create a workgroup.
    59007
    No permission to create a user.
    59008
    The user does not have permissions to remove a user from a workgroup.
    59009
    User does not have permissions to delete a workgroup.
    59010
    The user does not have permissions to modify user information.
    59011
    The user does not have permissions to modify user type.
    59012
    The user does not have permissions to modify workgroup information.
    59013
    The user does not have permissions to unbind a workgroup.
    59014
    The user does not have permissions to create an administrator.
    59015
    The user does not have permissions to use the engine.
    59016
    The user does not have permissions to update the engine.
    59017
    The user does not have permissions to delete the engine.
    59018
    The user does not have permissions to monitor the engine.
    59019
    The user does not have permissions to operate the compute engine.
    59020
    The user does not have permissions to modify the compute engine.
    59021
    The sub-user does not exist.
    59022
    The user does not have permissions to create a UDF.
    59023
    The user does not have permissions to modify the UDF.
    59024
    The user does not have permissions to delete the UDF.
    59025
    The user does not have permissions to create a Catalog.
    59026
    Insufficient permissions for sub-account; payment not allowed.
    59027
    The user does not have permissions to use the specified engine.
    59028
    Failed to retrieve user information. Please retry or submit a ticket to contact us.
    59029
    Invalid username. Usernames can only contain numbers and must be 64 characters or fewer.
    59030
    The workgroup does not exist.
    59031
    The network connection does not exist.
    59032
    Policy is invalid. Please see the API documentation.
    59033
    The field does not support sorting. Please see the API documentation.
    59034
    The workgroup does not exist.
    59035
    Sorting is invalid. Currently supported values are desc/asc.
    59036
    Modifying administrator permissions is not allowed.
    59037
    UserType is invalid. Currently supported values are ADMIN/COMMON.
    59038
    The workgroup already exists.
    60001
    TaskType error. For Spark Engine, the task type is SparkSQLTask; for Presto Engine, the task type is SQLTask.
    60002
    Task fault tolerance type error. Currently supported values are Proceed/Terminate.
    60003
    Number of SQL statements error. The number of SQL statements must be between 1 and 50.
    60004
    Failed to retrieve the allowlist. Please retry, or submit a ticket to contact us.
    60005
    Allowlist validation failed. Please retry, or submit a ticket to contact us.
    60006
    SQL parameter validation failed. Please adjust the parameters, or submit a ticket to contact us.
    60007
    The task has already failed.
    60008
    The task has already completed.
    60009
    The specified task parameter Value length exceeds the limit.
    60010
    The specified task parameter Value does not meet the requirements.
    60011
    The specified task parameter Key does not exist.
    60012
    The specified SQL task does not exist.
    60013
    The specified SQL task already exists.
    60014
    The number of SQL task results retrieved at once must be greater than 0 and less than 1000.
    60015
    The specified SQL task SortBy type does not match. Currently supported types are create-time, data-amount, used-time, and resource-usage.
    60016
    The specified number of Filter.Values exceeds the limit. Currently, it should be 50 or fewer.
    60017
    The specified task status does not match. Currently supported statuses are
    0: Initializing; 1: Running; 2: Successful; 3: Data Writing; 4: Queued; -1: Failed; -3: Deleted
    60018
    The specified Filter.Key does not match. Currently supported keys are: task-id, task-sql-keyword, task-kind, task-operator, batch-id, session-id, and task-state.
    60019
    You are currently allowed to view only 100 result data entries. If you need to adjust this limit, please contact us.
    60020
    SQL execution failed. Please validate and retry.
    60021
    Syntax parsing failed. Please validate and retry.
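    Several of the codes above (for example, 50006 and 57002) mean that a parameter or SQL script was submitted without Base64 encoding. The following is a minimal sketch of how a statement could be encoded before being placed in a request parameter; the SQL text and variable names are only illustrative.

    import base64

    # Illustrative SQL statement; replace with your own.
    sql = "SELECT * FROM demo_db.demo_table LIMIT 10"

    # Base64-encode the UTF-8 bytes of the statement, then decode the result
    # back to a string so it can be passed as a request parameter.
    encoded_sql = base64.b64encode(sql.encode("utf-8")).decode("utf-8")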

    Error Format

    API Format

    If the response contains an Error field, it indicates that the API call failed. For example:
    {
        "Response": {
            "Error": {
                "Action": "AddUsersToWorkGroup",
                "Code": "InternalError.InternalSystemException",
                "Detail": "{\"errStr\":\"InternalError.InternalSystemException\",\"errDesc\":\"Business system error. Please retry or submit a ticket to contact us.\",\"errCode\":\"50000\",\"errMsg\":\"\"}",
                "Message": "Business system error. Please retry or submit a ticket to contact us.\n{\"errStr\":\"InternalError.InternalSystemException\",\"errDesc\":\"Business system error. Please retry or submit a ticket to contact us.\",\"errCode\":\"50000\",\"errMsg\":\"\"}"
            },
            "RequestId": "27c536fe-351b-41a6-8156-24330989a742"
        }
    }
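    The Detail field is itself a JSON-encoded string that carries the numeric business error code (errCode) listed in the table above. As a rough illustration only, not part of any official SDK, the sketch below shows how a client could detect the Error field in a parsed response and extract that code; the function name and variables are illustrative.

    import json

    def extract_error(response_body: str):
        """Return (api_code, business_code, message) if the call failed, or None on success."""
        error = json.loads(response_body).get("Response", {}).get("Error")
        if error is None:
            return None  # no Error field means the API call succeeded
        # Detail is a JSON string; its errCode matches the codes in the table above.
        detail = json.loads(error.get("Detail", "{}"))
        return error.get("Code"), detail.get("errCode"), error.get("Message")

    With the example response above, this would return the API-level code InternalError.InternalSystemException together with the business code 50000.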

    Frontend Format

    If an error occurs, the frontend will display the error as follows:
    Title: Error Cause Identification
    Error code: 123456
    Error description: Explanation related to the error

    Handling Procedure

    If you encounter an error code during production operations, you can follow the steps below:
    1. Follow the recommended actions based on the returned error code, such as retrying the operation or adjusting configuration settings; a minimal retry sketch follows these steps.
    2. Copy the error code, error description, and other relevant information, and then include them in a ticket or contact after-sales support for assistance.
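    For codes whose description recommends retrying (such as 50000), a small bounded retry before opening a ticket is usually sufficient. The sketch below is only an illustration under that assumption; call_dlc_api stands in for whatever request function you use, and the retryable code set is an assumption rather than an official list.

    import time

    RETRYABLE_CODES = {"50000"}  # assumption: codes whose description says to retry

    def call_with_retry(call_dlc_api, max_attempts=3, delay_seconds=2):
        """Retry a call a few times when a retryable business error code is returned."""
        for attempt in range(1, max_attempts + 1):
            result = call_dlc_api()
            error = result.get("Response", {}).get("Error")
            if error is None:
                return result
            if attempt == max_attempts or not any(code in error.get("Detail", "") for code in RETRYABLE_CODES):
                # Not retryable, or retries exhausted: keep the details for a ticket.
                raise RuntimeError(f"DLC error after {attempt} attempt(s): {error.get('Message')}")
            time.sleep(delay_seconds)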
    