Domain name for API request: dlc.tencentcloudapi.com.
This API is used to update a Spark job.
A maximum of 20 requests can be initiated per second for this API.
The following request parameter list only provides API request parameters and some common parameters. For the complete common parameter list, see Common Request Parameters.
Parameter Name | Required | Type | Description |
---|---|---|---|
Action | Yes | String | Common Params. The value used for this API: ModifySparkApp. |
Version | Yes | String | Common Params. The value used for this API: 2021-01-25. |
Region | Yes | String | Common Params. For more information, please see the list of regions supported by the product. |
AppName | Yes | String | The Spark job name. |
AppType | Yes | Integer | The Spark job type. Valid values: 1 for Spark JAR job and 2 for Spark streaming job. |
DataEngine | Yes | String | The data engine executing the Spark job. |
AppFile | Yes | String | The path of the Spark job package. |
RoleArn | Yes | Integer | The data access policy (CAM role arn). |
AppDriverSize | Yes | String | The driver size. Valid values: small (default, 1 CU), medium (2 CUs), large (4 CUs), and xlarge (8 CUs). |
AppExecutorSize | Yes | String | The executor size. Valid values: small (default, 1 CU), medium (2 CUs), large (4 CUs), and xlarge (8 CUs). |
AppExecutorNums | Yes | Integer | The number of Spark job executors. |
SparkAppId | Yes | String | The Spark job ID. |
Eni | No | String | This field has been deprecated. Use the DataSource field instead. |
IsLocal | No | String | The source of the Spark job package. Valid values: cos for COS and lakefs for the local system (for use in the console, but this method does not support direct API calls). |
MainClass | No | String | The main class of the Spark job. |
AppConf | No | String | Spark configurations, separated by line breaks. |
IsLocalJars | No | String | The source of the dependency JAR packages of the Spark job. Valid values: cos for COS and lakefs for the local system (for use in the console, but this method does not support direct API calls). |
AppJars | No | String | The dependency JAR packages of the Spark JAR job, separated by commas. |
IsLocalFiles | No | String | The source of the dependency files of the Spark job. Valid values: cos for COS and lakefs for the local system (for use in the console, but this method does not support direct API calls). |
AppFiles | No | String | The dependency files of the Spark job (files other than JAR and ZIP packages), separated by commas. |
IsLocalPythonFiles | No | String | The source of the PySpark dependencies. Valid values: cos for COS and lakefs for the local system (for use in the console, but this method does not support direct API calls). |
AppPythonFiles | No | String | The PySpark dependencies (Python files), separated by commas; .py, .zip, and .egg formats are supported. |
CmdArgs | No | String | The input parameters of the Spark job, separated by commas. |
MaxRetries | No | Integer | The maximum number of retries, valid for Spark streaming tasks only. |
DataSource | No | String | The data source name. |
IsLocalArchives | No | String | The source of the dependency archives of the Spark job. Valid values: cos for COS and lakefs for the local system (for use in the console, but this method does not support direct API calls). |
AppArchives | No | String | The dependency archives of the Spark job, separated by commas; .tar.gz, .tgz, and .tar formats are supported. |
SparkImage | No | String | The Spark image version. |
SparkImageVersion | No | String | The Spark image version name. |
AppExecutorMaxNumbers | No | Integer | The maximum executor count, which defaults to 1. This parameter takes effect only when the "Dynamic" mode is selected; otherwise, the executor count equals AppExecutorNums. |
SessionId | No | String | The associated Data Lake Compute query script. |
IsInherit | No | Integer | Whether to inherit the task resource configuration from the cluster configuration template. Valid values: 0 (default) for no and 1 for yes. |
IsSessionStarted | No | Boolean | Whether to run the task with the session SQLs. Valid values: false for no and true for yes. |
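Before sending a request, it can help to assemble and sanity-check the JSON body against the required parameters above. The following is a minimal sketch using only the Python standard library; the helper name and the field values are illustrative, not part of the API.

```python
import json

# Required parameters for ModifySparkApp, taken from the table above.
REQUIRED_FIELDS = (
    "SparkAppId", "AppName", "AppType", "DataEngine", "AppFile",
    "RoleArn", "AppDriverSize", "AppExecutorSize", "AppExecutorNums",
)

def build_modify_spark_app_body(**params):
    """Hypothetical helper: serialize the request body, failing fast
    if any required parameter is missing."""
    missing = [f for f in REQUIRED_FIELDS if f not in params]
    if missing:
        raise ValueError(f"missing required parameters: {missing}")
    return json.dumps(params)

# Illustrative values mirroring the request example below.
body = build_modify_spark_app_body(
    SparkAppId="batch_sadfafd",
    AppName="spark-test",
    AppType=1,                 # 1 = Spark JAR job, 2 = Spark streaming job
    DataEngine="spark-engine",
    AppFile="test.jar",
    RoleArn=12,
    AppDriverSize="small",
    AppExecutorSize="small",
    AppExecutorNums=1,
)
```

The resulting string is what goes into the POST body; common parameters (Action, Version, Region, signature headers) are handled separately per Common Request Parameters.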
Parameter Name | Type | Description |
---|---|---|
RequestId | String | The unique request ID generated by the server, returned for every request that reaches the server (if a request fails to reach the server, no RequestId is returned). Provide this RequestId when troubleshooting a problem. |
This example shows you how to update a Spark job.
POST / HTTP/1.1
Host: dlc.tencentcloudapi.com
Content-Type: application/json
X-TC-Action: ModifySparkApp
<Common request parameters>
{
"SparkAppId": "batch_sadfafd",
"AppName": "spark-test",
"AppType": 1,
"DataEngine": "spark-engine",
"Eni": "kafka-eni",
"IsLocal": "cos",
"AppFile": "test.jar",
"RoleArn": 12,
"MainClass": "com.test.WordCount",
"AppConf": "spark-default.properties",
"IsLocalJars": "cos",
"AppJars": "com.test2.jar",
"IsLocalFiles": "cos",
"AppFiles": "spark-default.properties",
"AppDriverSize": "small",
"AppExecutorSize": "small",
"AppExecutorNums": 1,
"AppExecutorMaxNumbers": 1
}
{
"Response": {
"RequestId": "2ae4707a-9f72-44aa-9fd4-65cb739d6301"
}
}
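A caller typically unwraps the `Response` envelope, checks for an `Error` object (the TencentCloud API 3.0 convention for business errors, not specific to this API), and keeps the RequestId for troubleshooting. A minimal sketch:

```python
import json

# Response body from the example above.
raw = '{"Response": {"RequestId": "2ae4707a-9f72-44aa-9fd4-65cb739d6301"}}'

resp = json.loads(raw)["Response"]
if "Error" in resp:
    # Business errors arrive as {"Error": {"Code": ..., "Message": ...}}.
    raise RuntimeError(f'{resp["Error"]["Code"]}: {resp["Error"]["Message"]}')
request_id = resp["RequestId"]  # keep this for support tickets
```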
TencentCloud API 3.0 integrates SDKs for various programming languages to make it easier for you to call APIs.
The following only lists the error codes related to the API business logic. For other error codes, see Common Error Codes.
Error Code | Description |
---|---|
FailedOperation | The operation failed. |
InternalError.InternalSystemException | The business system is abnormal. Please try again or submit a ticket to contact us. |
InvalidParameter.InvalidAppFileFormat | The format of the specified Spark task package file is not supported. Currently, only .jar and .py are supported. |
InvalidParameter.InvalidDataEngineName | The data engine name is invalid. |
InvalidParameter.InvalidDriverSize | The current DriverSize specification only supports small/medium/large/xlarge/m.small/m.medium/m.large/m.xlarge. |
InvalidParameter.InvalidExecutorSize | The current ExecutorSize specification only supports small/medium/large/xlarge/m.small/m.medium/m.large/m.xlarge. |
InvalidParameter.InvalidFileCompressionFormat | The specified file compression format is not compliant. Currently, only .tar.gz/.tar/.tgz is supported. |
InvalidParameter.InvalidFilePathFormat | The specified file path format is not compliant. Currently, only cosn:// or lakefs:// is supported. |
InvalidParameter.SQLBase64DecodeFail | Base64 parsing of the SQL script failed. |
InvalidParameter.SparkJobNotFound | The specified Spark task does not exist. |
InvalidParameter.SparkJobOnlySupportSparkBatchEngine | Spark tasks can only be run using the Spark job engine. |
ResourceInsufficient.SparkJobInsufficientResources | The specified Spark job resources are insufficient. Please adjust the driver/executor specifications. |
ResourceNotFound.DataEngineNotFound | The specified engine does not exist. |
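When handling these codes, most point at request problems that retrying will not fix. The sketch below treats only InternalError.InternalSystemException as retriable; this split is an assumption based on its description ("Please try again"), not something the API documents.

```python
# Assumed retriable set: only the transient system error suggests a retry.
RETRIABLE = {"InternalError.InternalSystemException"}

def should_retry(error_code: str) -> bool:
    """Return True if the error is plausibly transient (assumption:
    parameter and resource errors are permanent for a given request)."""
    return error_code in RETRIABLE
```

For example, `InvalidParameter.SparkJobNotFound` or `ResourceNotFound.DataEngineNotFound` should surface to the caller immediately rather than being retried.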