org.apache.spark.sql.connector.catalog

SessionConfigSupport

trait SessionConfigSupport extends TableProvider

A mix-in interface for TableProvider. Data sources can implement this interface to propagate session configs with the specified key-prefix to all data source operations in this session.
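
For example (a minimal sketch, assuming a hypothetical source registered under the short name "myds" whose keyPrefix() returns "myds"), a session config set under spark.datasource.myds is handed to every read/write of that source as an ordinary option:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// Hypothetical source: keyPrefix() returns "myds", so Spark strips the
// "spark.datasource.myds." prefix and propagates "region" -> "us-east-1"
// to every operation on this source in the session.
spark.conf.set("spark.datasource.myds.region", "us-east-1")

// Behaves as if .option("region", "us-east-1") were set on each read.
val df = spark.read.format("myds").load()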

Annotations
@Evolving()
Source
SessionConfigSupport.java
Since

3.0.0

Linear Supertypes
TableProvider, AnyRef, Any

Abstract Value Members

  1. abstract def getTable(schema: StructType, partitioning: Array[Transform], properties: Map[String, String]): Table

    Return a Table instance with the specified table schema, partitioning and properties to do read/write. The returned table should report the same schema and partitioning as the specified ones, or Spark may fail the operation.

    schema

    The specified table schema.

    partitioning

    The specified table partitioning.

    properties

    The specified table properties. It's case preserving (contains exactly what users specified) and implementations are free to use it case sensitively or insensitively. It should be able to identify a table, e.g. file path, Kafka topic name, etc.

    Definition Classes
    TableProvider
  2. abstract def inferSchema(options: CaseInsensitiveStringMap): StructType

    Infer the schema of the table identified by the given options.

    options

    an immutable case-insensitive string-to-string map that can identify a table, e.g. file path, Kafka topic name, etc.

    Definition Classes
    TableProvider
  3. abstract def keyPrefix(): String

    Key prefix of the session configs to propagate, which is usually the data source name. Spark will extract all session configs that start with spark.datasource.$keyPrefix, turn spark.datasource.$keyPrefix.xxx -> yyy into xxx -> yyy, and propagate them to all data source operations in this session. See the implementation sketch after this list.
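
A minimal implementation sketch of the three abstract members (all names are hypothetical; a real table would also declare capabilities such as BATCH_READ and implement SupportsRead or SupportsWrite to be usable):

import java.util

import org.apache.spark.sql.connector.catalog.{SessionConfigSupport, Table, TableCapability}
import org.apache.spark.sql.connector.expressions.Transform
import org.apache.spark.sql.types.{StringType, StructField, StructType}
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class SimpleProvider extends SessionConfigSupport {

  // Session configs under "spark.datasource.simple." are propagated to
  // the options of every operation on this source.
  override def keyPrefix(): String = "simple"

  // A fixed schema for illustration; a real source would inspect the
  // options (e.g. a file path) to derive it.
  override def inferSchema(options: CaseInsensitiveStringMap): StructType =
    StructType(Seq(StructField("value", StringType)))

  // Must report the same schema/partitioning it was given, or Spark may
  // fail the operation.
  override def getTable(
      schema: StructType,
      partitioning: Array[Transform],
      properties: util.Map[String, String]): Table =
    new SimpleTable(schema)

  private class SimpleTable(tableSchema: StructType) extends Table {
    override def name(): String = "simple-table"
    override def schema(): StructType = tableSchema
    // Empty for brevity; a readable table would include
    // TableCapability.BATCH_READ.
    override def capabilities(): util.Set[TableCapability] =
      util.Collections.emptySet[TableCapability]()
  }
}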

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  8. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  9. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  10. def inferPartitioning(options: CaseInsensitiveStringMap): Array[Transform]

    Infer the partitioning of the table identified by the given options.

    By default this method returns empty partitioning; override it if this source supports partitioning. An illustrative override appears in the sketch after this list.

    options

    an immutable case-insensitive string-to-string map that can identify a table, e.g. file path, Kafka topic name, etc.

    Definition Classes
    TableProvider
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  15. def supportsExternalMetadata(): Boolean

    Returns true if the source can accept external table metadata when getting tables. The external table metadata includes:

    • For table reader: user-specified schema from DataFrameReader/DataStreamReader and schema/partitioning stored in the Spark catalog.
    • For table writer: the schema of the input DataFrame of DataFrameWriter/DataStreamWriter.

    By default this method returns false, which means the schema and partitioning passed to getTable(StructType, Transform[], Map) are from the infer methods. Override it if this source has expensive schema/partitioning inference and wants external table metadata to avoid inference; see the sketch after this list.

    Definition Classes
    TableProvider
  16. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  17. def toString(): String
    Definition Classes
    AnyRef → Any
  18. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  19. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  20. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
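
A sketch of opting in to external metadata and custom partitioning inference, extending the hypothetical SimpleProvider from the sketch above (the "region" column is illustrative):

import org.apache.spark.sql.connector.expressions.{Expressions, Transform}
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class ExpensiveInferenceProvider extends SimpleProvider {

  // Accept a user-specified schema (DataFrameReader.schema(...)) or one
  // stored in the Spark catalog, so the infer methods can be skipped.
  override def supportsExternalMetadata(): Boolean = true

  // Report identity partitioning on "region" when inference is still
  // needed; the default implementation returns an empty array.
  override def inferPartitioning(options: CaseInsensitiveStringMap): Array[Transform] =
    Array(Expressions.identity("region"))
}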

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)
