DescribeSparkAppTasks

Last updated: 2024-08-08 15:32:03

    1. API Description

    Domain name for API request: dlc.tencentcloudapi.com.

    This API is used to query the list of running task instances of a Spark job.

    A maximum of 40 requests can be initiated per second for this API.

    We recommend you use API Explorer, which provides online calls, signature authentication, SDK code generation, and API quick search, and lets you view the request, the response, and auto-generated examples.

    2. Input Parameters

    The following request parameter list only provides API request parameters and some common parameters. For the complete common parameter list, see Common Request Parameters.

    Parameter Name Required Type Description
    Action Yes String Common Params. The value used for this API: DescribeSparkAppTasks.
    Version Yes String Common Params. The value used for this API: 2021-01-25.
    Region Yes String Common Params. For more information, please see the list of regions supported by the product.
    JobId Yes String Spark job ID
    Offset No Integer Paginated query offset
    Limit No Integer Paginated query limit
    TaskId No String Execution instance ID
    StartTime No String The update start time in the format of yyyy-MM-dd HH:mm:ss.
    EndTime No String The update end time in the format of yyyy-MM-dd HH:mm:ss.
    Filters.N No Array of Filter Filter criteria. The only supported filter key is task-state.
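A minimal sketch of assembling the JSON request body from the parameters above. The helper name is hypothetical; the field names come from the table, StartTime/EndTime use the documented yyyy-MM-dd HH:mm:ss format, and each Filter is assumed to have the standard Tencent Cloud shape of a Name and a Values array.

```python
from datetime import datetime

def build_describe_spark_app_tasks_body(job_id, offset=0, limit=10,
                                        start=None, end=None, task_state=None):
    """Build the JSON body for DescribeSparkAppTasks (sketch only; field
    names follow the input parameter table)."""
    body = {"JobId": job_id, "Offset": offset, "Limit": limit}
    # StartTime/EndTime must be formatted as yyyy-MM-dd HH:mm:ss.
    fmt = "%Y-%m-%d %H:%M:%S"
    if start is not None:
        body["StartTime"] = start.strftime(fmt)
    if end is not None:
        body["EndTime"] = end.strftime(fmt)
    # Filters is an array of Filter objects; task-state is the supported key.
    if task_state is not None:
        body["Filters"] = [{"Name": "task-state", "Values": [str(task_state)]}]
    return body

body = build_describe_spark_app_tasks_body(
    "batch_133e005d-6486-4517-8ea7-b6b97b183a6b",
    start=datetime(2024, 8, 1, 0, 0, 0),
    task_state=2,
)
```

The common parameters (Action, Version, Region) travel in the signed request headers rather than in this body, as the input example below shows.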

    3. Output Parameters

    Parameter Name Type Description
    Tasks TaskResponseInfo Task result (this field has been deprecated)
    Note: This field may return null, indicating that no valid values can be obtained.
    TotalCount Integer Total number of tasks
    SparkAppTasks Array of TaskResponseInfo List of task results
    Note: This field may return null, indicating that no valid values can be obtained.
    RequestId String The unique request ID generated by the server and returned with every response. (If the request fails to reach the server, no RequestId is returned.) Provide the RequestId when reporting a problem so it can be located.
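Since Offset and Limit paginate the results against TotalCount, fetching every task takes a small loop. This is a sketch: call_api is a placeholder for whatever transport you use (an SDK client or a signed HTTP call) and is assumed to return the parsed Response object; note SparkAppTasks may be null, per the field notes above.

```python
def fetch_all_spark_app_tasks(call_api, job_id, page_size=100):
    """Page through SparkAppTasks with Offset/Limit until TotalCount
    is exhausted. `call_api(body)` must return the parsed Response."""
    tasks, offset = [], 0
    while True:
        resp = call_api({"JobId": job_id, "Offset": offset, "Limit": page_size})
        batch = resp.get("SparkAppTasks") or []  # field may be null
        tasks.extend(batch)
        offset += len(batch)
        # Stop on an empty page as well, in case TotalCount changes mid-scan.
        if not batch or offset >= resp.get("TotalCount", 0):
            return tasks
```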

    4. Example

    Example1 Querying the Running Task List of a Spark Job

    This example shows you how to query the list of running tasks of a Spark job.

    Input Example

    POST / HTTP/1.1
    Host: dlc.tencentcloudapi.com
    Content-Type: application/json
    X-TC-Action: DescribeSparkAppTasks
    <Common request parameters>
    
    {
        "JobId": "batch_133e005d-6486-4517-8ea7-b6b97b183a6b",
        "Offset": 0,
        "Limit": 10
    }
    

    Output Example

    {
        "Response": {
            "Tasks": {
                "DatabaseName": "abc",
                "DataAmount": 0,
                "Id": "abc",
                "UsedTime": 0,
                "OutputPath": "abc",
                "CreateTime": "abc",
                "State": 0,
                "SQLType": "abc",
                "SQL": "abc",
                "ResultExpired": true,
                "RowAffectInfo": "abc",
                "DataSet": "abc",
                "Error": "abc",
                "Percentage": 0,
                "OutputMessage": "abc",
                "TaskType": "abc",
                "ProgressDetail": "abc",
                "UpdateTime": "abc",
                "DataEngineId": "abc",
                "OperateUin": "abc",
                "DataEngineName": "abc",
                "InputType": "abc",
                "InputConf": "abc",
                "DataNumber": 0,
                "CanDownload": true,
                "UserAlias": "abc",
                "SparkJobName": "abc",
                "SparkJobId": "abc",
                "SparkJobFile": "abc",
                "UiUrl": "abc",
                "TotalTime": 0,
                "CmdArgs": "abc",
                "ImageVersion": "abc",
                "DriverSize": "abc",
                "ExecutorSize": "abc",
                "ExecutorNums": 1,
                "ExecutorMaxNumbers": 1
            },
            "TotalCount": 0,
            "SparkAppTasks": [
                {
                    "DatabaseName": "abc",
                    "DataAmount": 0,
                    "Id": "abc",
                    "UsedTime": 0,
                    "OutputPath": "abc",
                    "CreateTime": "abc",
                    "State": 0,
                    "SQLType": "abc",
                    "SQL": "abc",
                    "ResultExpired": true,
                    "RowAffectInfo": "abc",
                    "DataSet": "abc",
                    "Error": "abc",
                    "Percentage": 0,
                    "OutputMessage": "abc",
                    "TaskType": "abc",
                    "ProgressDetail": "abc",
                    "UpdateTime": "abc",
                    "DataEngineId": "abc",
                    "OperateUin": "abc",
                    "DataEngineName": "abc",
                    "InputType": "abc",
                    "InputConf": "abc",
                    "DataNumber": 0,
                    "CanDownload": true,
                    "UserAlias": "abc",
                    "SparkJobName": "abc",
                    "SparkJobId": "abc",
                    "SparkJobFile": "abc",
                    "UiUrl": "abc",
                    "TotalTime": 0,
                    "CmdArgs": "abc",
                    "ImageVersion": "abc",
                    "DriverSize": "abc",
                    "ExecutorSize": "abc",
                    "ExecutorNums": 1,
                    "ExecutorMaxNumbers": 1
                }
            ],
            "RequestId": "abc"
        }
    }
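A short sketch of consuming a response shaped like the output example above: parse the JSON, fall back to an empty list when SparkAppTasks is null, and collect each task's Id and State. (State is an integer status code whose values are defined in the TaskResponseInfo data type, not listed here; the sample values below mirror the placeholder example.)

```python
import json

# Trimmed-down response in the shape of the output example above.
response_json = """
{"Response": {"TotalCount": 1,
  "SparkAppTasks": [{"Id": "abc", "State": 0, "SparkJobName": "abc",
                     "CreateTime": "abc", "UpdateTime": "abc"}],
  "RequestId": "abc"}}
"""

resp = json.loads(response_json)["Response"]
# SparkAppTasks may be null when there are no results, per the field notes.
tasks = resp.get("SparkAppTasks") or []
summary = [(t["Id"], t["State"]) for t in tasks]
```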
    

    5. Developer Resources

    SDK

    TencentCloud API 3.0 integrates SDKs that support various programming languages to make it easier for you to call APIs.

    Command Line Interface

    6. Error Code

    The following only lists the error codes related to the API business logic. For other error codes, see Common Error Codes.

    Error Code Description
    FailedOperation The operation failed.
    InternalError.InternalSystemException The business system is abnormal. Please try again or submit a ticket to contact us.
    InvalidParameter.FiltersValuesNumberOutOfLimit The number of specified Filter.Values parameters exceeds the limit. It must be less than or equal to 50.
    InvalidParameter.InvalidTimeFormat The specified time format is not compliant. Currently, only yyyy-MM-dd HH:mm:ss is supported.
    InvalidParameter.InvalidTimeParameter The date parameters are abnormal. For example, the end time is earlier than the start time.
    InvalidParameter.ParameterNotFoundOrBeNone The parameter is not found or empty.
    InvalidParameter.SparkJobFiltersKeyTypeNotMath The specified Spark task Filter.Key is not supported. Currently, only spark-app-type/user-name/spark-job-name/spark-job-id/key-word is supported.
    InvalidParameter.SparkJobNotFound The specified Spark task does not exist.
    InvalidParameter.SparkJobNotUnique The specified Spark task already exists.
    ResourceUnavailable.WhiteListFunction This feature is currently in the allowlist stage. Please contact us to activate it.
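Two of the errors above (InvalidParameter.InvalidTimeFormat and InvalidParameter.InvalidTimeParameter) can be avoided with a client-side pre-check. A hedged sketch, assuming the documented yyyy-MM-dd HH:mm:ss format and that the end time must not be earlier than the start time:

```python
from datetime import datetime

TIME_FMT = "%Y-%m-%d %H:%M:%S"  # yyyy-MM-dd HH:mm:ss

def time_range_is_valid(start_str, end_str):
    """Pre-check StartTime/EndTime before calling the API."""
    try:
        start = datetime.strptime(start_str, TIME_FMT)
        end = datetime.strptime(end_str, TIME_FMT)
    except ValueError:
        return False  # would trigger InvalidParameter.InvalidTimeFormat
    # End earlier than start would trigger InvalidParameter.InvalidTimeParameter.
    return start <= end
```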