DescribeSparkAppJob

Last updated: 2024-08-08 15:32:04

    1. API Description

    Domain name for API request: dlc.tencentcloudapi.com.

    This API is used to query the information of a Spark job.

    A maximum of 20 requests can be initiated per second for this API.
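    The 20-requests-per-second quota is enforced server side; a batch caller can stay under it with a simple client-side limiter. A minimal sketch, assuming you pace requests yourself (the `RateLimiter` class is illustrative, not part of any Tencent Cloud SDK):

    ```python
    import threading
    import time


    class RateLimiter:
        """Space calls so that at most `rate` happen per second."""

        def __init__(self, rate=20):
            self.interval = 1.0 / rate   # minimum gap between two calls
            self.lock = threading.Lock()
            self.next_allowed = 0.0      # monotonic time of the next free slot

        def acquire(self):
            """Block until a call may proceed, then reserve the next slot."""
            with self.lock:
                now = time.monotonic()
                wait = self.next_allowed - now
                self.next_allowed = max(now, self.next_allowed) + self.interval
            if wait > 0:
                time.sleep(wait)


    limiter = RateLimiter(rate=20)
    limiter.acquire()  # call once before each DescribeSparkAppJob request
    ```

    The limiter is thread-safe, so one instance can be shared by all workers that call this API.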

    We recommend using API Explorer, which provides a range of capabilities, including online calls, signature authentication, SDK code generation, and quick API search. It enables you to view the request, the response, and auto-generated examples.

    2. Input Parameters

    The following list includes only the API-specific request parameters and some common parameters. For the complete list of common parameters, see Common Request Parameters.

    Parameter Name | Required | Type   | Description
    Action         | Yes      | String | Common parameter. The value used for this API: DescribeSparkAppJob.
    Version        | Yes      | String | Common parameter. The value used for this API: 2021-01-25.
    Region         | Yes      | String | Common parameter. For more information, see the list of regions supported by the product.
    JobId          | No       | String | The Spark job ID. If both JobId and JobName are supplied, JobId takes precedence and JobName is ignored. At least one of the two must be provided.
    JobName        | No       | String | The Spark job name.
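    Per the parameter list above, JobId and JobName are both optional, at least one must be supplied, and JobId wins when both are present. A small helper that validates this before building the request body might look like the following (the function name is illustrative, not part of the API):

    ```python
    import json


    def build_describe_spark_app_job_params(job_id=None, job_name=None):
        """Build the JSON payload for DescribeSparkAppJob.

        At least one of job_id / job_name is required; when both are
        given, the service ignores JobName, so we send only JobId.
        """
        if job_id is None and job_name is None:
            raise ValueError("Either JobId or JobName must be provided")
        if job_id is not None:
            return {"JobId": job_id}
        return {"JobName": job_name}


    payload = build_describe_spark_app_job_params(
        job_id="batch_133e005d-6486-4517-8ea7-b6b97b183a6b")
    body = json.dumps(payload)  # JSON body of the POST request
    ```

    Sending only the effective parameter keeps the request unambiguous even though the service would tolerate both.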

    3. Output Parameters

    Parameter Name | Type         | Description
    Job            | SparkJobInfo | Spark job details. Note: this field may return null, indicating that no valid value can be obtained.
    IsExists       | Boolean      | Whether the queried Spark job exists.
    RequestId      | String       | The unique request ID generated by the server and returned for every request that reaches it (if the request fails to reach the server, no RequestId is returned). Provide the RequestId when reporting a problem.
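    Because Job may be null, a caller should check IsExists (and the Job field itself) before reading job details, and keep the RequestId for troubleshooting. A sketch of handling a decoded response, using the field names from the table above:

    ```python
    import json

    # A "not found" response shaped like the output parameters above.
    raw = ('{"Response": {"Job": null, "IsExists": false, '
           '"RequestId": "2ae4707a-9f72-44aa-9fd4-65cb739d6301"}}')

    resp = json.loads(raw)["Response"]
    if resp["IsExists"] and resp["Job"] is not None:
        job = resp["Job"]
        print(job["JobName"], job["JobStatus"])
    else:
        # Job may be null; keep RequestId for support tickets.
        print("job not found, RequestId:", resp["RequestId"])
    ```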

    4. Example

    Example 1: Querying Spark Job Information

    This example shows you how to query the information of a Spark job.

    Input Example

    POST / HTTP/1.1
    Host: dlc.tencentcloudapi.com
    Content-Type: application/json
    X-TC-Action: DescribeSparkAppJob
    <Common request parameters>
    
    {
        "JobId": "batch_133e005d-6486-4517-8ea7-b6b97b183a6b",
        "JobName": "spark_app"
    }
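    The `<Common request parameters>` placeholder above covers, among other headers, the TC3-HMAC-SHA256 Authorization header described under Common Request Parameters. A condensed sketch of the signing chain, with dummy credentials (consult the Signature v3 documentation for the full canonical-request rules; header set and casing here follow the common JSON/POST case):

    ```python
    import hashlib
    import hmac
    from datetime import datetime, timezone


    def sign_tc3(secret_id, secret_key, payload, timestamp,
                 host="dlc.tencentcloudapi.com", service="dlc",
                 action="DescribeSparkAppJob"):
        """Compute a TC3-HMAC-SHA256 Authorization header value."""
        date = datetime.fromtimestamp(
            timestamp, tz=timezone.utc).strftime("%Y-%m-%d")
        # 1. Canonical request (POST /, empty query string).
        hashed_payload = hashlib.sha256(payload.encode()).hexdigest()
        canonical_headers = ("content-type:application/json\n"
                             f"host:{host}\n"
                             f"x-tc-action:{action.lower()}\n")
        signed_headers = "content-type;host;x-tc-action"
        canonical_request = "\n".join(
            ["POST", "/", "", canonical_headers, signed_headers,
             hashed_payload])
        # 2. String to sign, scoped to date/service.
        scope = f"{date}/{service}/tc3_request"
        string_to_sign = "\n".join(
            ["TC3-HMAC-SHA256", str(timestamp), scope,
             hashlib.sha256(canonical_request.encode()).hexdigest()])
        # 3. Derive the signing key and sign.
        def _hmac(key, msg):
            return hmac.new(key, msg.encode(), hashlib.sha256).digest()
        k_date = _hmac(("TC3" + secret_key).encode(), date)
        k_service = _hmac(k_date, service)
        k_signing = _hmac(k_service, "tc3_request")
        signature = hmac.new(k_signing, string_to_sign.encode(),
                             hashlib.sha256).hexdigest()
        return (f"TC3-HMAC-SHA256 Credential={secret_id}/{scope}, "
                f"SignedHeaders={signed_headers}, Signature={signature}")


    auth = sign_tc3("AKID_EXAMPLE", "SECRET_EXAMPLE",
                    '{"JobId": "batch_133e005d-6486-4517-8ea7-b6b97b183a6b"}',
                    1652769991)
    ```

    In practice the official SDKs perform this signing for you; hand-rolling it is mainly useful for debugging signature mismatches.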
    

    Output Example

    {
        "Response": {
            "Job": {
                "JobId": "batch_e6c5ae75-fb02-4831-a5b8-88999d09003c",
                "JobName": "abc",
                "JobType": 1,
                "DataEngine": "testjar3",
                "Eni": "testeni2",
                "IsLocal": "cos",
                "JobFile": "cosn://danierwei-test-1305424723/sparkjar/spark-ckafka-1.0-SNAPSHOT.jar",
                "RoleArn": 3,
                "MainClass": "org.apache.spark.examples.SparkPi",
                "CmdArgs": "testArgs",
                "JobConf": "",
                "IsLocalJars": "abc",
                "JobJars": "lakefs://4000002928ef2638d7ab6aabb088bd51b7db914729a5c43b13a998ffa9750511f511d0ab@dlcda57-100018379117-1636704841-100017307912-1304028854/1305424723/.system/sparkAppJar/20220513/dd3c6ad3-a746-40d8-806c-fa8b15b5e9f9/spark-examples_2.12-3.1.2.jar",
                "IsLocalFiles": "lakefs",
                "JobFiles": "",
                "JobDriverSize": "small",
                "JobExecutorSize": "small",
                "JobExecutorNums": 1,
                "JobMaxAttempts": 1,
                "JobCreator": "admin",
                "JobCreateTime": 1652769991248,
                "JobUpdateTime": 1652769991248,
                "CurrentTaskId": "2aedsa7a-9ds2-44ds-9fdd-65cbds9d6301",
                "JobStatus": 1,
                "StreamingStat": {
                    "StartTime": "2022-01-01 12:12:12",
                    "Receivers": 0,
                    "NumActiveReceivers": 0,
                    "NumInactiveReceivers": 0,
                    "NumActiveBatches": 0,
                    "NumRetainedCompletedBatches": 0,
                    "NumTotalCompletedBatches": 0,
                    "AverageInputRate": 0,
                    "AverageSchedulingDelay": 0,
                    "AverageProcessingTime": 0,
                    "AverageTotalDelay": 0
                },
                "DataSource": "DataLakeCatalog",
                "IsLocalPythonFiles": "cos",
                "AppPythonFiles": "",
                "IsLocalArchives": "cos",
                "JobArchives": "",
                "SparkImage": "Spark 3.2",
                "JobPythonFiles": "cos",
                "TaskNum": 1,
                "DataEngineStatus": 0,
                "JobExecutorMaxNumbers": 1,
                "SessionId": "xxssd-dsakjj-dkslk-doeks"
            },
            "IsExists": true,
            "RequestId": "2ae4707a-9f72-44aa-9fd4-65cb739d6301"
        }
    }
    

    5. Developer Resources

    SDK

    TencentCloud API 3.0 provides SDKs for various programming languages, making it easier to call the APIs.

    Command Line Interface

    6. Error Code

    The following list includes only the error codes related to this API's business logic. For other error codes, see Common Error Codes.

    Error Code                                 | Description
    InternalError                              | An internal error occurred.
    InternalError.InternalSystemException      | The business system is abnormal. Please try again or submit a ticket to contact us.
    InvalidParameter                           | A parameter is incorrect.
    InvalidParameter.InvalidSparkAppParam      | The SparkAppParam is invalid.
    InvalidParameter.ParameterNotFoundOrBeNone | The parameter is not found or is empty.
    InvalidParameter.SparkJobNotFound          | The specified Spark job does not exist.
    InvalidParameter.SparkJobNotUnique         | The specified Spark job already exists.
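    When handling these codes programmatically, a common pattern is to retry only the transient InternalError family and surface the InvalidParameter.* codes immediately, since resending the same bad parameters cannot succeed. A hypothetical helper based on the table above (the retry classification is an assumption, not something the API documents):

    ```python
    # Codes from the table above treated as transient; assumed retryable.
    RETRYABLE_PREFIXES = ("InternalError",)


    def is_retryable(error_code):
        """Return True for error codes worth retrying with backoff."""
        return error_code.startswith(RETRYABLE_PREFIXES)


    print(is_retryable("InternalError.InternalSystemException"))  # True
    print(is_retryable("InvalidParameter.SparkJobNotFound"))      # False
    ```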