pyspark.sql.Catalog.uncacheTable

Catalog.uncacheTable(tableName: str) → None

Removes the specified table from the in-memory cache.

New in version 2.0.0.

Parameters
tableName : str

name of the table to remove from the in-memory cache.

Changed in version 3.4.0: Allow tableName to be qualified with catalog name.

Examples

>>> _ = spark.sql("DROP TABLE IF EXISTS tbl1")
>>> _ = spark.sql("CREATE TABLE tbl1 (name STRING, age INT) USING parquet")
>>> spark.catalog.cacheTable("tbl1")
>>> spark.catalog.uncacheTable("tbl1")
>>> spark.catalog.isCached("tbl1")
False

Throws an AnalysisException when the table does not exist.

>>> spark.catalog.uncacheTable("not_existing_table")
Traceback (most recent call last):
    ...
AnalysisException: ...
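
If the table may not exist, the exception can be caught explicitly. This is a minimal sketch, assuming PySpark 3.4+, where AnalysisException can be imported from pyspark.errors (older releases expose it as pyspark.sql.utils.AnalysisException).

>>> from pyspark.errors import AnalysisException
>>> try:
...     spark.catalog.uncacheTable("not_existing_table")
... except AnalysisException:
...     pass  # nothing to uncache; the table is not defined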

Using the fully qualified name for the table.

>>> spark.catalog.uncacheTable("spark_catalog.default.tbl1")
>>> spark.catalog.isCached("tbl1")
False
>>> _ = spark.sql("DROP TABLE tbl1")