Data Lake Compute
CreateSparkSessionBatchSQL
Last updated: 2024-08-08 15:32:08

1. API Description

Domain name for API request: dlc.tencentcloudapi.com.

This API is used to submit a Spark SQL batch task to the job engine.

A maximum of 30 requests can be initiated per second for this API.

We recommend you use API Explorer.
API Explorer provides a range of capabilities, including online calls, signature authentication, SDK code generation, and quick API search. It lets you view the request, the response, and auto-generated examples.

2. Input Parameters

The following request parameter list only provides API request parameters and some common parameters. For the complete common parameter list, see Common Request Parameters.

Parameter Name Required Type Description
Action Yes String Common Params. The value used for this API: CreateSparkSessionBatchSQL.
Version Yes String Common Params. The value used for this API: 2021-01-25.
Region Yes String Common Params. For more information, please see the list of regions supported by the product.
DataEngineName Yes String The name of the engine for executing the Spark job.
ExecuteSQL Yes String The SQL statement to run. It must be Base64-encoded.
DriverSize No String The driver size. Valid values: small (default, 1 CU), medium (2 CUs), large (4 CUs), and xlarge (8 CUs).
ExecutorSize No String The executor size. Valid values: small (default, 1 CU), medium (2 CUs), large (4 CUs), and xlarge (8 CUs).
ExecutorNumbers No Integer The executor count, which defaults to 1.
ExecutorMaxNumbers No Integer The maximum executor count, which defaults to 1. This parameter applies when the "Dynamic" mode is selected; otherwise, its value is the same as that of ExecutorNumbers.
TimeoutInSecond No Integer The session timeout period in seconds. Default value: 3600.
SessionId No String The unique ID of a session. If this parameter is specified, the task will be run using the specified session.
SessionName No String The name of the session to create.
Arguments.N No Array of KVPair The session configurations. Valid values: 1. dlc.eni: user-defined ENI gateway information; 2. dlc.role.arn: user-defined roleArn configurations; 3. dlc.sql.set.config: user-defined cluster configurations.
IsInherit No Integer Whether to inherit the cluster's resource configuration. Valid values: 0 (default): do not inherit; 1: inherit.
CustomKey No String A user-defined primary key. It must be unique.
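Since ExecuteSQL must be Base64-encoded, it is easy to trip the SQLBase64DecodeFail error by sending raw SQL. The sketch below shows one way to build a valid request body with the Python standard library; the engine name and extra keyword parameters are illustrative values, not ones mandated by the API.

```python
import base64
import json

def build_request_body(engine_name: str, sql: str, **extra) -> str:
    """Build the JSON body for CreateSparkSessionBatchSQL.

    ExecuteSQL is Base64-encoded as required by the parameter table.
    Any extra keyword arguments (e.g. DriverSize) are passed through
    verbatim as additional request parameters.
    """
    body = {
        "DataEngineName": engine_name,
        "ExecuteSQL": base64.b64encode(sql.encode("utf-8")).decode("ascii"),
    }
    body.update(extra)
    return json.dumps(body)

# "select 1" Base64-encodes to "c2VsZWN0IDE=", as in the input example.
payload = build_request_body("data_engine_1", "select 1", DriverSize="small")
```

Note that the common parameters (Action, Version, Region) travel in the signed HTTP headers or alongside the body, depending on the calling method; see Common Request Parameters.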

3. Output Parameters

Parameter Name Type Description
BatchId String The unique identifier of a batch task.
Statements Array of StatementInformation The statement task list.
Note: This field may return null, indicating that no valid values can be obtained.
RequestId String The unique request ID generated by the server and returned for every request (if the request fails to reach the server, no RequestId is obtained). The RequestId is required for locating a problem.

4. Example

Example 1: Creating and Executing Spark SQL Batch Tasks

This API (CreateSparkSessionBatchSQL) is used to submit Spark SQL batch tasks to the Spark job engine.

Input Example

POST / HTTP/1.1
Host: dlc.tencentcloudapi.com
Content-Type: application/json
X-TC-Action: CreateSparkSessionBatchSQL
<Common request parameters>

{
    "DataEngineName": "data_engine_1",
    "ExecuteSQL": "c2VsZWN0IDE=",
    "DriverSize": "small",
    "ExecutorSize": "small",
    "ExecutorNumbers": 1,
    "ExecutorMaxNumbers": 1,
    "TimeoutInSecond": 2,
    "SessionId": "",
    "SessionName": "livy-session-123",
    "Arguments": [
        {
            "Value": "eni",
            "Key": "test_eni"
        }
    ]
}

Output Example

{
    "Response": {
        "RequestId": "b8sd7dd7-ekd4-4e5e-993e-e5db64fa21c1",
        "BatchId": "d3018ad4-9a7e-4f64-a3f4-f38507c69742"
    }
}
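A successful response carries BatchId inside the Response envelope, as shown above. The sketch below parses that envelope; the error-object shape ("Error" with "Code" and "Message" inside "Response") follows the general TencentCloud API 3.0 convention rather than anything stated on this page, so treat it as an assumption.

```python
import json

def parse_batch_response(raw: str) -> str:
    """Extract BatchId from a CreateSparkSessionBatchSQL response.

    Assumes the TencentCloud API 3.0 envelope: business fields live in
    "Response"; failed calls carry an "Error" object with "Code" and
    "Message" instead (convention assumed, not stated on this page).
    """
    resp = json.loads(raw)["Response"]
    if "Error" in resp:
        raise RuntimeError(
            f'{resp["Error"]["Code"]}: {resp["Error"]["Message"]} '
            f'(RequestId: {resp["RequestId"]})'
        )
    return resp["BatchId"]

# The sample output from above:
sample = """{
    "Response": {
        "RequestId": "b8sd7dd7-ekd4-4e5e-993e-e5db64fa21c1",
        "BatchId": "d3018ad4-9a7e-4f64-a3f4-f38507c69742"
    }
}"""
batch_id = parse_batch_response(sample)
```

Keep the RequestId from every response; per the output parameter table, it is required when locating a problem.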

5. Developer Resources

SDK

TencentCloud API 3.0 integrates SDKs that support various programming languages to make it easier for you to call APIs.

Command Line Interface

6. Error Code

The following only lists the error codes related to the API business logic. For other error codes, see Common Error Codes.

Error Code Description
FailedOperation The operation failed.
FailedOperation.NoPermissionToUseTheDataEngine The user does not have permission to specify the engine.
InternalError An internal error occurred.
InternalError.InternalSystemException An internal system exception occurred. Please try again or submit a ticket to contact us.
InvalidParameter The parameter is incorrect.
InvalidParameter.BatchSQLCustomKeyNotUnique The custom primary key of the specified interactive SQL task is not unique.
InvalidParameter.DataEngineOnlySupportSparkBatch The current task only supports the Spark batch job processing engine.
InvalidParameter.ImageEngineTypeNotMatch The specified engine type does not match. Currently, only SparkSQL, PrestoSQL, and SparkBatch are supported.
InvalidParameter.ImageIsPublicNotMatch The specified isPublic does not match. Currently, it only supports 1: public and 2: private.
InvalidParameter.ImageParameterSubmitMethodNotMatch The specified cluster image ParameterSubmitMethod does not match. Currently, only User and BackGround are supported.
InvalidParameter.ImageParameterTypeNotMatch The specified cluster image ParameterType does not match. Currently, it only supports 1: session; 2: common; 3: cluster.
InvalidParameter.ImageStateNotMatch The specified state does not match. Currently, it only supports 1: initializing, 2: online, 3: offline.
InvalidParameter.InvalidDriverSize The current DriverSize specification only supports small/medium/large/xlarge/m.small/m.medium/m.large/m.xlarge.
InvalidParameter.InvalidDynamicAllocationMaxExecutors The specified dynamic number of executors must be the current maximum value.
InvalidParameter.InvalidExecutorSize The current ExecutorSize specification only supports small/medium/large/xlarge/m.small/m.medium/m.large/m.xlarge.
InvalidParameter.InvalidFileCompressionFormat The specified file compression format is not compliant. Currently, only .tar.gz, .tar, and .tgz are supported.
InvalidParameter.InvalidFilePathFormat The specified file path format is not compliant. Currently, only cosn:// or lakefs:// is supported.
InvalidParameter.InvalidSQL SQL parsing failed.
InvalidParameter.InvalidSessionKindType Currently, types supported by Session only include spark/pyspark/sparkr/sql.
InvalidParameter.InvalidStatementKindType Currently, Statement only supports the type of sql.
InvalidParameter.InvalidWhiteListKey There is an error in getting an allowlist. Please try again or submit a ticket to contact us.
InvalidParameter.NumberOfSQLExceedsTheLimit The number of submitted SQL statements must be between 1 and 50.
InvalidParameter.ParameterNotFoundOrBeNone The parameter is not found or empty.
InvalidParameter.SQLBase64DecodeFail Base64 parsing of the SQL script failed.
InvalidParameter.SQLParameterPreprocessingFailed SQL parameter preprocessing failed.
ResourceNotFound The resource does not exist.
ResourceNotFound.DataEngineNotFound The specified engine does not exist.
ResourceNotFound.DataEngineNotRunning The specified engine is not running.
ResourceNotFound.DataEngineNotUnique The specified engine already exists.
ResourceNotFound.ImageSessionConfigNotFound The specified cluster image Session configuration does not exist.
ResourceNotFound.ImageSessionConfigNotUnique The specified cluster image Session configuration already exists.
ResourceNotFound.ImageVersionNotFound The specified cluster image version does not exist.
ResourceNotFound.ResultSavePathNotFound Obtaining the result storage path failed. Please go to the Console -> Data Exploration page for settings.
ResourceNotFound.RoleArnResourceNotFound The specified RoleArn does not exist.
ResourceNotFound.SessionInsufficientResources There are currently no resources to create a session. Please try again later or use an annual or monthly subscription cluster.
ResourceNotFound.SessionNotFound The session does not exist.
ResourceNotFound.SessionStateDead The session has expired.
ResourceNotFound.ShuffleDirNotFound The Spark Shuffle storage path cannot be found. Please go to the Console -> Data Exploration page -> Storage Configuration to set it.
ResourceNotFound.WarehouseDirNotFound The Warehouse storage path cannot be found. Please go to the Console -> Data Exploration page -> Storage Configuration to set it.
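Some of the codes above indicate transient conditions (internal exceptions, temporarily insufficient session resources), while parameter and permission errors will fail identically on retry. The classification below is purely illustrative; this page does not prescribe a retry policy, so adjust the set to your own needs.

```python
# Illustrative only: transient-looking codes from the table above are
# treated as retryable; parameter and permission errors are not.
RETRYABLE = {
    "InternalError",
    "InternalError.InternalSystemException",
    "ResourceNotFound.SessionInsufficientResources",
}

def is_retryable(error_code: str) -> bool:
    """Return True if a failed call is worth retrying with backoff."""
    return error_code in RETRYABLE
```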