Data Lake Compute allows you to quickly query and analyze COS data. Currently, CSV, ORC, Parquet, JSON, Avro, and text files are supported.
Note: If you already have the required permissions or you are the root account admin, skip this step.
If you are logging in as a sub-account for the first time, in addition to the necessary CAM authorization, you also need Data Lake Compute permissions, which can be granted by a Data Lake Compute admin or root account admin through Permission management on the left sidebar in the Data Lake Compute console. For permission details, see Permission Overview.
Note: By default, the system activates a shared public engine based on the Presto kernel, so you can quickly try out features without purchasing a private cluster.
For detailed directions, see Sub-Account Permission Management.
If you are familiar with SQL statements, you can write the CREATE DATABASE statement directly in the query editor and skip the creation wizard.
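For example, a minimal statement might look like the following (the database name demo_db is a placeholder):

CREATE DATABASE IF NOT EXISTS demo_db;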
If you are familiar with SQL statements, you can write the CREATE TABLE statement directly in the query editor and skip the creation wizard.
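For example, a minimal table definition might look like the following sketch (the database, table, and column names are placeholders; the exact column types and options supported depend on the engine you use):

CREATE TABLE IF NOT EXISTS demo_db.demo_table (
  id INT,
  name STRING
);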
Note: An external table generally refers to a table whose data files are stored in a COS bucket under your account. It can be created directly in Data Lake Compute for analysis, with no need to load additional data. Because the table is external, running DROP TABLE deletes only its metadata; your original data remains in COS.
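As an illustration, a Hive-style external table over CSV files might be defined as follows; the bucket name, path, and all table and column names here are placeholder assumptions, and the exact serde and format options supported may vary by engine:

CREATE EXTERNAL TABLE IF NOT EXISTS demo_db.demo_csv_table (
  id INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'cosn://examplebucket-125xxxxxxxx/demo_path/';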
Note: Structure inference is an auxiliary tool for table creation and may not be 100% accurate. Check the inferred field names and types and modify them as needed.
After the data is prepared, write the SQL analysis statement, select an appropriate compute engine, and start data analysis.
Write a SQL statement that queries all records whose result is SUCCESS, select a compute engine, and run the statement.
SELECT * FROM `DataLakeCatalog`.`demo2`.`demo_audit_table` WHERE _c5 = 'SUCCESS'