TDLC Command Line Interface Tool Access
Last updated: 2024-07-31 17:33:04
TDLC is the command-line client tool provided by Tencent Cloud Data Lake Compute (DLC). With TDLC, you can submit SQL and Spark tasks to DLC data engines.
TDLC is written in Go, built on the Cobra framework, and supports configuring multiple buckets and cross-bucket operations. You can view TDLC usage with ./tdlc [command] --help.

Download and Installation

TDLC offers binary packages for the Windows, Mac, and Linux operating systems; they are ready to use after simple installation and configuration. Download the package that matches your client's operating system.

| Operating system | TDLC binary package download address |
| --- | --- |
| Windows | Download link |
| Mac | Download link |
| Linux | Download link |
Rename the downloaded file to tdlc. Open a command line on your client and switch to the download directory. On Mac/Linux, you need to grant the file execute permission with the chmod +x tdlc command before running it.
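For example, on Mac/Linux (assuming the binary was downloaded to the current directory):
chmod +x tdlc
./tdlc
If running ./tdlc prints the following help text, the installation succeeded and the tool is ready to use: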
Tencentcloud DLC command tools is used to play around with DLC.
With TDLC user can manage engines, execute SQLs and submit Spark Jobs.

Usage:
tdlc [flags]
tdlc [command]

Available Commands:
config
help Help about any command
spark Submit spark app to engines.
sql Executing SQL.
version

Flags:
--endpoint string Endpoint of Tencentcloud account. (default "dlc.tencentcloudapi.com")
--engine string DLC engine. (default "public-engine")
-h, --help help for tdlc
--region string Region of Tencentcloud account.
--role-arn string Required by spark jar app.
--secret-id string SecretId of Tencentcloud account.
--secret-key string SecretKey of Tencentcloud account.
--token string Token of Tencentcloud account.

Use "tdlc [command] --help" for more information about a command.

Usage Instructions

Global Parameters

TDLC provides the following global parameters.
| Global parameter | Description |
| --- | --- |
| --endpoint string | Service connection address. Default: dlc.tencentcloudapi.com. |
| --engine string | DLC data engine name. Default: public-engine. A dedicated data engine is recommended. |
| --region string | Region to use, such as ap-nanjing, ap-beijing, ap-guangzhou, ap-shanghai, ap-chengdu, ap-chongqing, na-siliconvalley, ap-singapore, ap-hongkong. |
| --role-arn string | When submitting a Spark job, specifies the role (rolearn) that grants permission to access COS files. For details on rolearn, see Configuring Data Access Policy. |
| --secret-id string | SecretId of the Tencent Cloud account. |
| --secret-key string | SecretKey of the Tencent Cloud account. |
| --token string | (Optional) Temporary token of the Tencent Cloud account. |
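Global parameters can be combined on any invocation. Below is a sketch of a Spark submission with explicit credentials; the SecretId, SecretKey, engine name, rolearn, and JAR path are placeholders, not real values:
./tdlc spark submit --name demo --region ap-singapore --secret-id <SecretId> --secret-key <SecretKey> --engine <engine-name> --role-arn <rolearn> cosn://bucket1/app.jar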

CONFIG Command

The config command can be used to store commonly used parameters, which then act as default values. Command-line parameters override the parameters set via config.
| Command | Description |
| --- | --- |
| list | List the current default configuration. |
| set | Set a configuration value. |
| unset | Reset a configuration value. |
Example:
./tdlc config list
./tdlc config set secret-id={1} secret-key={2} region={3}
./tdlc config unset region
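Since command-line parameters take precedence, a flag passed at run time overrides the stored value. For example, assuming region was previously set via config set, the following runs against ap-hongkong for this invocation only:
./tdlc sql -e "SELECT 1" --region ap-hongkong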

SQL Subcommand

The SQL subcommand currently supports only Presto and SparkSQL clusters. The parameters it supports are listed below.
| Parameter | Description |
| --- | --- |
| -e, --exec | Execute a SQL statement. |
| -f, --file | Execute a SQL file; separate multiple SQL files with ;. |
| --no-result | Do not fetch results after execution. |
| -p, --progress | Display execution progress. |
| -q, --quiet | Quiet mode; submit the task without waiting for its execution status. |
Example:
./tdlc sql -e "SELECT 1" --secret-id aa --secret-key bb --region ap-beijing --engine public-engine
./tdlc sql -f ~/biz.sql --no-result 
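The remaining flags combine in the same way; the commands below are illustrative only (the file path and table name are placeholders):
./tdlc sql -f ~/biz.sql -p
./tdlc sql -e "INSERT INTO demo_table SELECT 1" -q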

SPARK Subcommand

The spark subcommand includes the following commands, which can be used to submit Spark jobs, view run logs, and terminate tasks.
| Command | Description |
| --- | --- |
| submit | Submit a task in spark-submit style. |
| run | Run a Spark job. |
| log | View execution logs. |
| list | View the Spark job list. |
| kill | Terminate a task. |
Below are the parameters supported by the spark submit subcommand. File-related parameters accept local files or the cosn:// protocol.
| Parameter | Description |
| --- | --- |
| --driver-size | Driver specification. Default: small. Options: small, medium, large, xlarge; for memory-optimized clusters, use m.small, m.medium, m.large, m.xlarge. |
| --executor-size | Executor specification. Default: small. Options: small, medium, large, xlarge; for memory-optimized clusters, use m.small, m.medium, m.large, m.xlarge. |
| --executor-num | Number of executors. |
| --files | Dependent files, separated by ,. |
| --archives | Dependencies packaged in archive files. |
| --class | Main class for Java/Scala applications. |
| --jars | Dependent JAR packages, separated by ,. |
| --name | Application name. |
| --py-files | Dependent Python files; .zip, .egg, and .py formats are supported. |
| --conf | Additional configuration. |
Example:
./tdlc spark submit --name spark-demo1 --engine sparkjar --jars /root/sparkjar-dep.jar --class com.demo.Example /root/sparkjar-main.jar arg1
./tdlc spark submit --name spark-demo2 cosn://bucket1/abc.py arg1
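Resource flags and additional configuration can be attached to submit as well; the values below are illustrative only:
./tdlc spark submit --name spark-demo3 --driver-size medium --executor-size medium --executor-num 4 --conf spark.sql.shuffle.partitions=100 cosn://bucket1/abc.py
A sketch of the task lifecycle commands follows; the task-id argument is an assumption here, so check ./tdlc spark [command] --help for the exact usage:
./tdlc spark list
./tdlc spark log <task-id>
./tdlc spark kill <task-id>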