pyspark.sql.Catalog.listColumns
Catalog.listColumns(tableName: str, dbName: Optional[str] = None) → List[pyspark.sql.catalog.Column]
Returns a list of columns for the given table/view in the specified database.

New in version 2.0.0.

Parameters
tableName : str
    name of the table to list columns from.

    Changed in version 3.4.0: tableName may be qualified with the catalog name when dbName is None (see the sketch after the Examples).
dbName : str, optional
    name of the database in which to find the table.
 
Returns
list
    A list of Column.
 
Notes
The order of arguments here is different from that of its JVM counterpart because Python does not support method overloading.

If no database is specified, the current database and catalog are used. This API includes all temporary views.

Examples
>>> _ = spark.sql("DROP TABLE IF EXISTS tblA")
>>> _ = spark.sql("CREATE TABLE tblA (name STRING, age INT) USING parquet")
>>> spark.catalog.listColumns("tblA")
[Column(name='name', description=None, dataType='string', nullable=True, ...
>>> _ = spark.sql("DROP TABLE tblA")
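A minimal sketch of the two equivalent lookup styles, assuming a Spark 3.4.0+ session with the default spark_catalog; the database my_db and table tblB are hypothetical names created only for illustration:

>>> _ = spark.sql("CREATE DATABASE IF NOT EXISTS my_db")
>>> _ = spark.sql("CREATE TABLE IF NOT EXISTS my_db.tblB (id INT, label STRING) USING parquet")
>>> # Look up the table via the explicit dbName argument ...
>>> spark.catalog.listColumns("tblB", dbName="my_db")
[Column(name='id', description=None, dataType='int', nullable=True, ...
>>> # ... or via a catalog-qualified name, leaving dbName as None (Spark 3.4.0+).
>>> spark.catalog.listColumns("spark_catalog.my_db.tblB")
[Column(name='id', description=None, dataType='int', nullable=True, ...
>>> # Each Column is a simple record; its fields can be read as attributes.
>>> [(c.name, c.dataType, c.nullable) for c in spark.catalog.listColumns("tblB", dbName="my_db")]
[('id', 'int', True), ('label', 'string', True)]
>>> _ = spark.sql("DROP TABLE my_db.tblB")
>>> _ = spark.sql("DROP DATABASE my_db")

The qualified-name form is convenient when browsing tables across catalogs, since it avoids switching the session's current database or catalog.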