Domain name for API request: dlc.tencentcloudapi.com.
This API is used to start a Spark job.
A maximum of 20 requests can be initiated per second for this API.
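To stay under that cap when submitting many tasks in a loop, the caller can throttle client-side. A minimal stdlib sketch (the 20/s limit comes from this API; the class name and design are ours, not part of the API):

```python
import time
from collections import deque


class RateLimiter:
    """Sliding-window throttle: at most `max_per_second` calls in any 1 s window."""

    def __init__(self, max_per_second: int = 20):
        self.max_per_second = max_per_second
        self.stamps = deque()  # monotonic timestamps of recent calls

    def acquire(self) -> None:
        now = time.monotonic()
        # Discard timestamps that have fallen out of the 1-second window.
        while self.stamps and now - self.stamps[0] >= 1.0:
            self.stamps.popleft()
        if len(self.stamps) >= self.max_per_second:
            # Sleep until the oldest call leaves the window, then release it.
            time.sleep(1.0 - (now - self.stamps[0]))
            self.stamps.popleft()
        self.stamps.append(time.monotonic())
```

Call `limiter.acquire()` immediately before each API request; excess calls block instead of triggering server-side rate-limit errors.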
The following request parameter list only provides API request parameters and some common parameters. For the complete common parameter list, see Common Request Parameters.
Parameter Name | Required | Type | Description |
---|---|---|---|
Action | Yes | String | Common Params. The value used for this API: CreateSparkAppTask. |
Version | Yes | String | Common Params. The value used for this API: 2021-01-25. |
Region | Yes | String | Common Params. For more information, please see the list of regions supported by the product. |
JobName | Yes | String | Spark job name |
CmdArgs | No | String | Input parameters of the Spark job, separated by spaces; generally used for scheduled invocations. |
The following parameters are returned in the response body, in addition to the common parameters.
Parameter Name | Type | Description |
---|---|---|
BatchId | String | Batch ID |
TaskId | String | Task ID |
RequestId | String | The unique request ID, generated by the server and returned for every request (no RequestId is returned if the request never reaches the server). RequestId is required for locating a problem. |
This example shows you how to start a Spark job.
```
POST / HTTP/1.1
Host: dlc.tencentcloudapi.com
Content-Type: application/json
X-TC-Action: CreateSparkAppTask
<Common request parameters>

{
    "JobName": "spark-app-test",
    "CmdArgs": "10 test 20"
}
```
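Requests like the one above must carry a TC3-HMAC-SHA256 `Authorization` header. The full algorithm is specified under Common Request Parameters; the sketch below is a stdlib-only illustration of the three signing steps with placeholder credentials, not the vendor SDK:

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone


def build_authorization(secret_id, secret_key, host, service, payload, timestamp):
    """Sketch of TC3-HMAC-SHA256 signing for a JSON POST to "/"."""
    date = datetime.fromtimestamp(timestamp, tz=timezone.utc).strftime("%Y-%m-%d")
    body = json.dumps(payload)

    # Step 1: canonical request (POST to "/", empty query string).
    canonical_headers = f"content-type:application/json\nhost:{host}\n"
    signed_headers = "content-type;host"
    hashed_body = hashlib.sha256(body.encode("utf-8")).hexdigest()
    canonical_request = "\n".join(
        ["POST", "/", "", canonical_headers, signed_headers, hashed_body]
    )

    # Step 2: string to sign, scoped to date/service/tc3_request.
    scope = f"{date}/{service}/tc3_request"
    string_to_sign = "\n".join([
        "TC3-HMAC-SHA256",
        str(timestamp),
        scope,
        hashlib.sha256(canonical_request.encode("utf-8")).hexdigest(),
    ])

    # Step 3: derive the signing key, then sign.
    def _hmac(key, msg):
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

    k_date = _hmac(("TC3" + secret_key).encode("utf-8"), date)
    k_service = _hmac(k_date, service)
    k_signing = _hmac(k_service, "tc3_request")
    signature = hmac.new(
        k_signing, string_to_sign.encode("utf-8"), hashlib.sha256
    ).hexdigest()

    return (f"TC3-HMAC-SHA256 Credential={secret_id}/{scope}, "
            f"SignedHeaders={signed_headers}, Signature={signature}")
```

In practice the official SDKs handle signing, so hand-rolling this is only needed when calling the HTTP endpoint directly.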
A successful response returns the batch and task IDs:

```
{
    "Response": {
        "RequestId": "2ae4707a-9f72-44aa-9fd4-65cb739d6301",
        "BatchId": "batch-9vsx3lh0",
        "TaskId": "4a7cad6bb86211ec9c616e6f30623d72"
    }
}
```
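A response body like the one above can be unpacked with nothing but the standard JSON module; the IDs below are the sample values from this page:

```python
import json

# Sample successful response body from this page.
raw = """
{
  "Response": {
    "RequestId": "2ae4707a-9f72-44aa-9fd4-65cb739d6301",
    "BatchId": "batch-9vsx3lh0",
    "TaskId": "4a7cad6bb86211ec9c616e6f30623d72"
  }
}
"""

# All payload fields sit under the top-level "Response" key.
resp = json.loads(raw)["Response"]
batch_id, task_id = resp["BatchId"], resp["TaskId"]
print(batch_id, task_id)  # → batch-9vsx3lh0 4a7cad6bb86211ec9c616e6f30623d72
```

`TaskId` is what you would feed into task-status queries; `BatchId` identifies the underlying batch.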
TencentCloud API 3.0 integrates SDKs that support various programming languages to make it easier for you to call APIs.
The following only lists the error codes related to the API business logic. For other error codes, see Common Error Codes.
Error Code | Description |
---|---|
FailedOperation | The operation failed. |
FailedOperation.NoPermissionToUseTheDataEngine | The user does not have permission to specify the engine. |
InternalError.InternalSystemException | The business system is abnormal. Please try again or submit a ticket to contact us. |
InvalidParameter.ImageEngineTypeNotMatch | The specified engine type does not match. Currently, only SparkSQL, PrestoSQL, and SparkBatch are supported. |
InvalidParameter.ImageIsPublicNotMatch | The specified isPublic does not match. Currently, it only supports 1: public and 2: private. |
InvalidParameter.ImageParameterSubmitMethodNotMatch | The specified cluster image ParameterSubmitMethod does not match. Currently, only User and BackGround are supported. |
InvalidParameter.ImageParameterTypeNotMatch | The specified cluster image ParameterType does not match. Currently, it only supports 1: session; 2: common; 3: cluster. |
InvalidParameter.ImageSessionParametersFormatNotJson | The specified cluster image Session parameter format is not JSON. |
InvalidParameter.ImageStateNotMatch | The specified state does not match. Currently, it only supports 1: initializing, 2: online, 3: offline. |
InvalidParameter.ImageUserRecordsTypeNotMatch | The specified cluster image UserRecords does not match. Currently, it only supports: 1: parentVersion; 2: childVersion; 3: pySpark. |
InvalidParameter.InvalidRoleArn | The CAM role arn is invalid. |
InvalidParameter.InvalidSparkAppParam | The SparkAppParam is invalid. |
InvalidParameter.InvalidSparkConfigFormat | The specified Spark job configuration format is invalid. Refer to the format spark.network.timeout=120s. |
InvalidParameter.InvalidTcrSparkImageFormat | The specified TCR Spark image format does not match. The example for reference is my-image/ndf/python/latest. |
InvalidParameter.InvalidWhiteListKey | There is an error in getting an allowlist. Please try again or submit a ticket to contact us. |
InvalidParameter.NumberOfSQLExceedsTheLimit | The number of submitted SQL statements must be between 1 and 50. |
InvalidParameter.ParameterBase64DecodeFailed | Base64 parsing of the specified parameter failed. |
InvalidParameter.ParameterNotFoundOrBeNone | The parameter is not found or empty. |
InvalidParameter.SQLBase64DecodeFail | Base64 parsing of the SQL script failed. |
InvalidParameter.SQLParameterPreprocessingFailed | SQL parameter preprocessing failed. |
InvalidParameter.SparkJobNotFound | The specified Spark task does not exist. |
InvalidParameter.SparkJobNotUnique | The specified Spark task already exists. |
InvalidParameter.SparkJobRoleArnNotFound | The specified Spark task RoleArn does not exist. |
ResourceInsufficient.SparkJobInsufficientResources | The specified Spark job resources are insufficient. Please adjust the driver/executor specifications. |
ResourceNotFound.DataEngineConfigInstanceNotFound | The specified cluster configuration instance does not exist. |
ResourceNotFound.DataEngineConfigInstanceNotUnique | The specified cluster configuration instance already exists. |
ResourceNotFound.DataEngineNotActivity | The specified cluster is not running. |
ResourceNotFound.DataEngineNotFound | The specified engine does not exist. |
ResourceNotFound.DataEngineNotUnique | The specified engine already exists. |
ResourceNotFound.ImageVersionNotFound | The specified cluster image version does not exist. |
ResourceNotFound.ImageVersionNotUnique | The specified cluster image version already exists. |
ResourceNotFound.ResourceUsageOutOfLimit | The specified task resources exceed the limit of the remaining resources of the cluster. Please try again after adjustment. |
ResourceNotFound.ShuffleDirNotFound | The Spark shuffle storage path cannot be found. Please go to Console > Data Exploration > Storage Configuration to set it. |
ResourceNotFound.WarehouseDirNotFound | The warehouse storage path cannot be found. Please go to Console > Data Exploration > Storage Configuration to set it. |
ResourceUnavailable | The resource is unavailable. |
UnauthorizedOperation.UseComputingEngine | The sub-user does not have permission to use the compute engine. |
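Failed calls still come back as HTTP 200, with the code from the table above nested under `Response.Error` (the platform's common error format). A hedged sketch of surfacing that, using a hypothetical helper name and a fabricated-for-illustration error payload:

```python
import json


def raise_for_api_error(body: str) -> dict:
    """Return the Response dict on success; raise with code/message on error."""
    resp = json.loads(body)["Response"]
    if "Error" in resp:
        err = resp["Error"]
        raise RuntimeError(
            f'{err["Code"]}: {err["Message"]} (RequestId {resp["RequestId"]})'
        )
    return resp


# Illustrative failed-response body (values are made up for the sketch).
failed = json.dumps({"Response": {
    "Error": {"Code": "InvalidParameter.SparkJobNotFound",
              "Message": "The specified Spark task does not exist."},
    "RequestId": "00000000-0000-0000-0000-000000000000",
}})
```

Checking for `Error` before reading `BatchId`/`TaskId` avoids masking an API failure as a `KeyError`.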