./tdlc [command] --help
Operating System | TDLC Binary Package Download Address |
Windows | |
Mac | |
Linux | |
Open the command line on your client and switch to the download directory. On Mac/Linux, first grant execute permission with the chmod +x tdlc command. Then run ./tdlc; if the following content is displayed, the installation succeeded and the tool is ready to use.
Tencentcloud DLC command tools is used to play around with DLC.
With TDLC user can manger engines, execute SQLs and submit Spark Jobs.

Usage:
  tdlc [flags]
  tdlc [command]

Available Commands:
  config
  help        Help about any command
  spark       Submit spark app to engines.
  sql         Executing SQL.
  version

Flags:
      --endpoint string     Endpoint of Tencentcloud account. (default "dlc.tencentcloudapi.com")
      --engine string       DLC engine. (default "public-engine")
  -h, --help                help for tdlc
      --region string       Region of Tencentcloud account.
      --role-arn string     Required by spark jar app.
      --secret-id string    SecretId of Tencentcloud account.
      --secret-key string   SecretKey of Tencentcloud account.
      --token string        Token of Tencentcloud account.

Use "tdlc [command] --help" for more information about a command.
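The permission step above can be sketched as follows. A stub file stands in for the real downloaded binary here, since the actual download addresses are listed in the table above:

```shell
# Sketch of the Mac/Linux install steps. "touch tdlc" is only a stand-in
# for the binary you downloaded from the table above.
touch tdlc                  # stand-in for the downloaded binary
chmod +x tdlc               # grant execute permission (required on Mac/Linux)
test -x tdlc && echo "tdlc is executable"
```

After this, running ./tdlc with no arguments should print the usage text shown above.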
Global Parameter | Description |
--endpoint string | Service endpoint; defaults to dlc.tencentcloudapi.com |
--engine string | DLC data engine name; defaults to public-engine. A dedicated data engine is recommended. |
--region string | Region to use, such as ap-nanjing, ap-beijing, ap-guangzhou, ap-shanghai, ap-chengdu, ap-chongqing, na-siliconvalley, ap-singapore, ap-hongkong |
--role-arn string | Role ARN that grants access to COS files; required when submitting a Spark job. For details on roleArn, see Configuring Data Access Policy. |
--secret-id string | SecretId of the Tencent Cloud account |
--secret-key string | SecretKey of the Tencent Cloud account |
--token string | (Optional) Temporary token of the Tencent Cloud account |
Command | Description |
list | List the current configuration |
set | Set configuration values |
unset | Reset (clear) a configuration value |
./tdlc config list
./tdlc config set secret-id={1} secret-key={2} region={b}
./tdlc config unset region
Parameter | Description |
-e, --exec | Execute a SQL statement |
-f, --file | Execute a SQL file; to run multiple SQL files, separate them with ; |
--no-result | Do not fetch results after execution |
-p, --progress | Display execution progress |
-q, --quiet | Quiet mode: submit the task without waiting for the execution status |
./tdlc sql -e "SELECT 1" --secret-id aa --secret-key bb --region ap-beijing --engine public-engine
./tdlc sql -f ~/biz.sql --no-result
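As a hypothetical illustration of the -f flag (the file names here are made up), you can prepare several SQL files and pass them in one call, separated by ;. The tdlc invocation itself is commented out because it requires real credentials:

```shell
# Hypothetical example: prepare two SQL files for a single tdlc call.
cat > init.sql <<'EOF'
CREATE TABLE IF NOT EXISTS demo (id INT, name STRING);
EOF
cat > query.sql <<'EOF'
SELECT * FROM demo LIMIT 10;
EOF
# Multiple files passed to -f are separated by ';' (quote the argument so
# the shell does not treat ';' as a command separator):
# ./tdlc sql -f "init.sql;query.sql" --no-result
ls init.sql query.sql
```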
Command | Description |
submit | Submit a task in spark-submit style |
run | Execute a Spark job |
log | View execution logs |
list | View the Spark job list |
kill | Terminate a task |
Parameter | Description |
--driver-size | Driver specification; defaults to small. Available values: small, medium, large, xlarge; for memory-optimized clusters, use m.small, m.medium, m.large, m.xlarge |
--executor-size | Executor specification; defaults to small. Available values: small, medium, large, xlarge; for memory-optimized clusters, use m.small, m.medium, m.large, m.xlarge |
--executor-num | Number of executors |
--files | Dependent files, use , to separate |
--archives | Dependencies in compressed files |
--class | Main class of the Java/Scala application |
--jars | Dependent JAR packages, use , to separate |
--name | Program name |
--py-files | Dependent Python files; supports .zip, .egg, and .py formats |
--conf | Additional configuration |
./tdlc spark submit --name spark-demo1 --engine sparkjar --jars /root/sparkjar-dep.jar --class com.demo.Example /root/sparkjar-main.jar arg1
./tdlc spark submit --name spark-demo2 cosn://bucket1/abc.py arg1
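To make the second example above concrete, here is a hypothetical minimal PySpark application matching the abc.py name used in it; the file contents and bucket path are illustrative assumptions, not part of the tool. The tdlc line is commented out because it requires real credentials and an uploaded file:

```shell
# Hypothetical minimal PySpark app for the spark-demo2 example above.
cat > abc.py <<'EOF'
import sys
from pyspark.sql import SparkSession

# Build a session; the app name appears in the DLC job list.
spark = SparkSession.builder.appName("spark-demo2").getOrCreate()
print("args:", sys.argv[1:])   # arg1 from the command line arrives here
spark.range(10).show()
spark.stop()
EOF
# Upload abc.py to COS (e.g. cosn://bucket1/abc.py) or use a local path, then:
# ./tdlc spark submit --name spark-demo2 cosn://bucket1/abc.py arg1
test -s abc.py && echo "abc.py written"
```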