ModifySparkAppBatch

Last updated: 2024-08-08 15:32:00

    1. API Description

    Domain name for API request: dlc.tencentcloudapi.com.

    This API is used to modify Spark job parameters in batches.

    A maximum of 20 requests can be initiated per second for this API.
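    For scripts that modify a large backlog of jobs, it can help to pace successive calls so they stay under this quota. The following is a minimal, illustrative Python pacing loop; it is not part of any Tencent Cloud SDK, and send_request is a placeholder for however you actually issue the request.

    import time

    def paced_calls(payloads, send_request, max_per_second=20):
        """Issue one request per payload without exceeding max_per_second calls per second."""
        min_interval = 1.0 / max_per_second
        results = []
        for payload in payloads:
            started = time.monotonic()
            results.append(send_request(payload))
            elapsed = time.monotonic() - started
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)
        return results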

    We recommend you use API Explorer.

    API Explorer provides a range of capabilities, including online calls, signature authentication, SDK code generation, and quick API search. It enables you to view the request, response, and auto-generated examples.

    2. Input Parameters

    The following request parameter list includes only the parameters specific to this API and some common parameters. For the complete list of common parameters, see Common Request Parameters.

    Parameter Name Required Type Description
    Action Yes String Common Params. The value used for this API: ModifySparkAppBatch.
    Version Yes String Common Params. The value used for this API: 2021-01-25.
    Region Yes String Common Params. For more information, please see the list of regions supported by the product.
    SparkAppId.N Yes Array of String The list of the IDs of the Spark job tasks to be modified in batches.
    DataEngine No String The engine ID.
    AppDriverSize No String The driver size.
    Valid values for the standard resource type: small, medium, large, and xlarge.
    Valid values for the memory resource type: m.small, m.medium, m.large, and m.xlarge.
    AppExecutorSize No String The executor size.
    Valid values for the standard resource type: small, medium, large, and xlarge.
    Valid values for the memory resource type: m.small, m.medium, m.large, and m.xlarge.
    AppExecutorNums No Integer The executor count. The minimum value is 1, and the maximum value must be less than the cluster specification.
    AppExecutorMaxNumbers No Integer The maximum executor count (in dynamic configuration scenarios). The minimum value is 1, and the maximum value must be less than the cluster specification. If AppExecutorMaxNumbers is set to a value smaller than AppExecutorNums, it is automatically raised to the value of AppExecutorNums (see the sketch after this list).
    IsInherit No Integer Whether to inherit the task resource configuration from the cluster template. Valid values: 0 (default): No; 1: Yes.
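    The sketch below shows one way to assemble this parameter set in Python. The helper name and the VALID_SIZES table are assumptions for illustration only and are not part of any Tencent Cloud SDK; the clamping of AppExecutorMaxNumbers is mirrored client-side only so the payload matches what the service will apply.

    # Illustrative helper (hypothetical; not part of any Tencent Cloud SDK).
    VALID_SIZES = {
        "small", "medium", "large", "xlarge",          # standard resource type
        "m.small", "m.medium", "m.large", "m.xlarge",  # memory resource type
    }

    def build_modify_spark_app_batch_params(spark_app_ids, data_engine=None,
                                            driver_size=None, executor_size=None,
                                            executor_nums=None, executor_max_numbers=None,
                                            is_inherit=0):
        """Assemble a request body using the parameter names documented above."""
        if not spark_app_ids:
            raise ValueError("SparkAppId must contain at least one task ID")
        for size in (driver_size, executor_size):
            if size is not None and size not in VALID_SIZES:
                raise ValueError("unsupported resource size: %s" % size)

        params = {"SparkAppId": list(spark_app_ids), "IsInherit": is_inherit}
        if data_engine is not None:
            params["DataEngine"] = data_engine
        if driver_size is not None:
            params["AppDriverSize"] = driver_size
        if executor_size is not None:
            params["AppExecutorSize"] = executor_size
        if executor_nums is not None:
            params["AppExecutorNums"] = executor_nums
        if executor_max_numbers is not None:
            if executor_nums is not None and executor_max_numbers < executor_nums:
                # The service raises AppExecutorMaxNumbers to AppExecutorNums anyway.
                executor_max_numbers = executor_nums
            params["AppExecutorMaxNumbers"] = executor_max_numbers
        return params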

    3. Output Parameters

    Parameter Name Type Description
    RequestId String The unique request ID generated by the server and returned for every request that reaches the server (a request that fails to reach the server does not obtain a RequestId). Provide the RequestId when locating a problem.

    4. Example

    Example 1: Modifying Spark job parameters in batches

    This example shows you how to modify Spark job parameters in batches.

    Input Example

    POST / HTTP/1.1
    Host: dlc.tencentcloudapi.com
    Content-Type: application/json
    X-TC-Action: ModifySparkAppBatch
    <Common request parameters>
    
    {
        "SparkAppId": [
            "batch_a7dca867-b941-4294-af9e-3dsefc086f1e"
        ],
        "DataEngine": "DataEngine-dde2f7vq",
        "AppDriverSize": "small",
        "AppExecutorSize": "small",
        "AppExecutorNums": 1,
        "AppExecutorMaxNumbers": 1,
        "IsInherit": 0
    }
    

    Output Example

    {
        "Response": {
            "RequestId": "b8sd7dd7-ekd4-4e5e-993e-e5db64fa21c1"
        }
    }
    

    5. Developer Resources

    SDK

    TencentCloud API 3.0 integrates SDKs that support various programming languages to make it easier for you to call APIs.
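    As a sketch only, the call from the example in section 4 could look roughly like the following with the Python SDK (tencentcloud-sdk-python). It assumes the DLC client and request model follow the SDK's usual naming (dlc_client.DlcClient, models.ModifySparkAppBatchRequest) and that "ap-guangzhou" is replaced with your own region; verify the names against your installed SDK version.

    import json

    from tencentcloud.common import credential
    from tencentcloud.common.profile.client_profile import ClientProfile
    from tencentcloud.common.profile.http_profile import HttpProfile
    from tencentcloud.dlc.v20210125 import dlc_client, models

    # Load credentials from your own configuration; never hard-code keys.
    cred = credential.Credential("SecretId", "SecretKey")

    http_profile = HttpProfile()
    http_profile.endpoint = "dlc.tencentcloudapi.com"
    client_profile = ClientProfile()
    client_profile.httpProfile = http_profile

    client = dlc_client.DlcClient(cred, "ap-guangzhou", client_profile)

    # Same payload as the input example in section 4.
    req = models.ModifySparkAppBatchRequest()
    req.from_json_string(json.dumps({
        "SparkAppId": ["batch_a7dca867-b941-4294-af9e-3dsefc086f1e"],
        "DataEngine": "DataEngine-dde2f7vq",
        "AppDriverSize": "small",
        "AppExecutorSize": "small",
        "AppExecutorNums": 1,
        "AppExecutorMaxNumbers": 1,
        "IsInherit": 0
    }))

    resp = client.ModifySparkAppBatch(req)
    print(resp.to_json_string())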

    Command Line Interface

    6. Error Code

    The following only lists the error codes related to the API business logic. For other error codes, see Common Error Codes.

    Error Code Description
    FailedOperation The operation failed.
    InternalError An internal error occurred.
    InternalError.InternalSystemException An exception occurred in the business system. Try again later or submit a ticket to contact us.
    InvalidParameter The parameter is incorrect.
    InvalidParameter.InvalidSQL SQL parsing failed.
    InvalidParameter.ParameterNotFoundOrBeNone The parameter is not found or empty.
    InvalidParameter.SparkJobIsInheritTypeNotMatch The IsInherit value specified for the Spark task is not supported. Supported values: 0: inherit; 1: do not inherit.
    ResourceInsufficient.SparkJobInsufficientResources The resources for the specified Spark job are insufficient. Adjust the driver/executor specifications.
    ResourceNotFound The resource does not exist.
    ResourceNotFound.DataEngineNotFound The specified engine does not exist.
    ResourceNotFound.DataEngineNotUnique The specified engine already exists.
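    When calling through the SDK, these codes surface as exceptions. The following sketch assumes the Python SDK's TencentCloudSDKException and its usual accessors (get_code, get_message, get_request_id); it maps only a few of the codes above to hints and is not an exhaustive handler.

    from tencentcloud.common.exception.tencent_cloud_sdk_exception import TencentCloudSDKException

    def modify_with_diagnostics(client, req):
        """Call ModifySparkAppBatch and translate documented error codes into hints."""
        try:
            return client.ModifySparkAppBatch(req)
        except TencentCloudSDKException as err:
            code = err.get_code() or ""
            if code == "ResourceNotFound.DataEngineNotFound":
                print("Check the DataEngine parameter:", err.get_message())
            elif code == "ResourceInsufficient.SparkJobInsufficientResources":
                print("Adjust the driver/executor specifications:", err.get_message())
            elif code.startswith("InvalidParameter"):
                print("Review the request parameters:", err.get_message())
            else:
                print("Unhandled error; RequestId:", err.get_request_id())
            raise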