org.apache.spark.sql

SparkSessionExtensions

class SparkSessionExtensions extends AnyRef

:: Experimental :: Holder for injection points to the SparkSession. We make NO guarantee about the binary or source compatibility of the methods here.

This currently provides the following extension points:

  • Analyzer Rules.
  • Check Analysis Rules.
  • Optimizer Rules.
  • Pre CBO Rules.
  • Planning Strategies.
  • Customized Parser.
  • (External) Catalog listeners.
  • Columnar Rules.
  • Adaptive Query Stage Preparation Rules.

The extensions can be used by calling withExtensions on the SparkSession.Builder, for example:

SparkSession.builder()
  .master("...")
  .config("...", true)
  .withExtensions { extensions =>
    extensions.injectResolutionRule { session =>
      ...
    }
    extensions.injectParser { (session, parser) =>
      ...
    }
  }
  .getOrCreate()

The extensions can also be used by setting the Spark SQL configuration property spark.sql.extensions. Multiple extensions can be set using a comma-separated list. For example:

SparkSession.builder()
  .master("...")
  .config("spark.sql.extensions", "org.example.MyExtensions")
  .getOrCreate()

class MyExtensions extends Function1[SparkSessionExtensions, Unit] {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    extensions.injectResolutionRule { session =>
      ...
    }
    extensions.injectParser { (session, parser) =>
      ...
    }
  }
}
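
To make the skeleton above concrete, the following is a minimal, self-contained sketch of such an extensions class; the rule name is hypothetical and the rule itself is a pass-through that only marks where custom analysis logic would go:

import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical rule: returns the plan unchanged.
case class MyRule(session: SparkSession) extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan
}

class MyExtensions extends Function1[SparkSessionExtensions, Unit] {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    extensions.injectResolutionRule { session => MyRule(session) }
  }
}

The same class can be passed directly to withExtensions(new MyExtensions) on the builder, or named in spark.sql.extensions as shown above.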

Note that the injected builders should not assume that the SparkSession is fully initialized and should not touch the session's internals (e.g. the SessionState).

Annotations
@DeveloperApi() @Experimental() @Unstable()
Source
SparkSessionExtensions.scala
Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new SparkSessionExtensions()

Type Members

  1. type CheckRuleBuilder = (SparkSession) ⇒ (LogicalPlan) ⇒ Unit
  2. type ColumnarRuleBuilder = (SparkSession) ⇒ ColumnarRule
  3. type FunctionDescription = (FunctionIdentifier, ExpressionInfo, FunctionBuilder)
  4. type ParserBuilder = (SparkSession, ParserInterface) ⇒ ParserInterface
  5. type QueryStagePrepRuleBuilder = (SparkSession) ⇒ Rule[SparkPlan]
  6. type RuleBuilder = (SparkSession) ⇒ Rule[LogicalPlan]
  7. type StrategyBuilder = (SparkSession) ⇒ Strategy

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  10. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  11. def injectCheckRule(builder: CheckRuleBuilder): Unit

    Inject a check analysis Rule builder into the SparkSession.

    Inject a check analysis Rule builder into the SparkSession. The injected rules will be executed after the analysis phase. A check analysis rule is used to detect problems with a LogicalPlan and should throw an exception when a problem is found.
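
    For example, a check rule builder might look like the sketch below. The invariant checked here is an assumption made up for illustration; Spark's built-in check rules typically throw AnalysisException, but a plain exception keeps the sketch self-contained:

    extensions.injectCheckRule { session => plan =>
      // Hypothetical invariant: a resolved plan must produce at least one output column.
      if (plan.resolved && plan.output.isEmpty) {
        throw new IllegalStateException(s"Plan produces no output columns:\n$plan")
      }
    }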

  12. def injectColumnar(builder: ColumnarRuleBuilder): Unit

    Inject a rule that can override the columnar execution of an executor.
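
    A minimal sketch, assuming a hypothetical rule whose transitions are pass-throughs; a real rule would replace supported operators with columnar implementations:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.catalyst.rules.Rule
    import org.apache.spark.sql.execution.{ColumnarRule, SparkPlan}

    // Hypothetical columnar rule: both transition phases return the plan unchanged.
    case class MyColumnarRule(session: SparkSession) extends ColumnarRule {
      override def preColumnarTransitions: Rule[SparkPlan] = new Rule[SparkPlan] {
        override def apply(plan: SparkPlan): SparkPlan = plan
      }
      override def postColumnarTransitions: Rule[SparkPlan] = new Rule[SparkPlan] {
        override def apply(plan: SparkPlan): SparkPlan = plan
      }
    }

    // extensions.injectColumnar { session => MyColumnarRule(session) }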

  13. def injectFunction(functionDescription: FunctionDescription): Unit

    Injects a custom function into the org.apache.spark.sql.catalyst.analysis.FunctionRegistry at runtime for all sessions.
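
    For example, the sketch below registers a hypothetical "my_upper" function that simply reuses the built-in Upper expression (the function name is an assumption for illustration):

    import org.apache.spark.sql.catalyst.FunctionIdentifier
    import org.apache.spark.sql.catalyst.expressions.{Expression, ExpressionInfo, Upper}

    // (identifier, documentation, builder from child expressions to an expression)
    val myUpper = (
      FunctionIdentifier("my_upper"),
      new ExpressionInfo(classOf[Upper].getName, "my_upper"),
      (children: Seq[Expression]) => Upper(children.head))

    // extensions.injectFunction(myUpper)
    // After registration: spark.sql("SELECT my_upper('hello')")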

  14. def injectOptimizerRule(builder: RuleBuilder): Unit

    Inject an optimizer Rule builder into the SparkSession.

    Inject an optimizer Rule builder into the SparkSession. The injected rules will be executed during the operator optimization batch. An optimizer rule is used to improve the quality of an analyzed logical plan; these rules should never modify the result of the LogicalPlan.
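
    A minimal sketch, assuming a hypothetical observe-only rule that logs the plan during the operator optimization batch and returns it unchanged:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.catalyst.rules.Rule

    // Hypothetical rule: logs the plan and leaves it unmodified.
    case class LogPlanRule(session: SparkSession) extends Rule[LogicalPlan] {
      override def apply(plan: LogicalPlan): LogicalPlan = {
        logDebug(s"Optimizing plan:\n$plan")
        plan
      }
    }

    // extensions.injectOptimizerRule { session => LogPlanRule(session) }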

  15. def injectParser(builder: ParserBuilder): Unit

    Inject a custom parser into the SparkSession.

    Inject a custom parser into the SparkSession. Note that the builder is passed a session and an initial parser. The latter allows for a user to create a partial parser and to delegate to the underlying parser for completeness. If a user injects more parsers, then the parsers are stacked on top of each other.

  16. def injectPlannerStrategy(builder: StrategyBuilder): Unit

    Inject a planner Strategy builder into the SparkSession.

    Inject a planner Strategy builder into the SparkSession. The injected strategy will be used to convert a LogicalPlan into an executable org.apache.spark.sql.execution.SparkPlan.
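
    A minimal sketch, assuming a hypothetical strategy; returning Nil signals that the strategy does not apply, so planning falls through to the built-in strategies (a real strategy would match specific logical operators and return their physical counterparts):

    import org.apache.spark.sql.{SparkSession, Strategy}
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.execution.SparkPlan

    // Hypothetical strategy: never matches any logical operator.
    case class MyStrategy(session: SparkSession) extends Strategy {
      override def apply(plan: LogicalPlan): Seq[SparkPlan] = Nil
    }

    // extensions.injectPlannerStrategy { session => MyStrategy(session) }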

  17. def injectPostHocResolutionRule(builder: RuleBuilder): Unit

    Inject an analyzer Rule builder into the SparkSession.

    Inject an analyzer Rule builder into the SparkSession. These analyzer rules will be executed after resolution.

  18. def injectPreCBORule(builder: RuleBuilder): Unit

    Inject into the SparkSession an optimizer Rule builder that rewrites logical plans.

    Inject into the SparkSession an optimizer Rule builder that rewrites logical plans. The injected rules will be executed once after the operator optimization batch and before any cost-based optimization rules that depend on statistics.

  19. def injectQueryStagePrepRule(builder: QueryStagePrepRuleBuilder): Unit

    Inject a rule that can override the query stage preparation phase of adaptive query execution.
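
    A minimal sketch, assuming a hypothetical pass-through rule over the physical plan; a real rule would rewrite the SparkPlan before query stages are created:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.catalyst.rules.Rule
    import org.apache.spark.sql.execution.SparkPlan

    // Hypothetical rule: returns the physical plan unchanged.
    case class MyQueryStagePrepRule(session: SparkSession) extends Rule[SparkPlan] {
      override def apply(plan: SparkPlan): SparkPlan = plan
    }

    // extensions.injectQueryStagePrepRule { session => MyQueryStagePrepRule(session) }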

  20. def injectResolutionRule(builder: RuleBuilder): Unit

    Inject an analyzer resolution Rule builder into the SparkSession.

    Inject an analyzer resolution Rule builder into the SparkSession. These analyzer rules will be executed as part of the resolution phase of analysis.

  21. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  22. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  23. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  24. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  25. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  26. def toString(): String
    Definition Classes
    AnyRef → Any
  27. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
