Package org.apache.spark.sql
Class SparkSession.Builder

java.lang.Object
  org.apache.spark.sql.SparkSession.Builder

All Implemented Interfaces:
  org.apache.spark.internal.Logging
Enclosing class:
  SparkSession

public static class SparkSession.Builder
extends Object
implements org.apache.spark.internal.Logging
Builder for SparkSession.

Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging:
  org.apache.spark.internal.Logging.SparkShellLoggingFilter

Constructor Summary
Constructors:
  Builder()
Method Summary

SparkSession.Builder appName(String name)
  Sets a name for the application, which will be shown in the Spark web UI.
SparkSession.Builder config(String key, String value)
  Sets a config option.
SparkSession.Builder config(String key, long value)
  Sets a config option.
SparkSession.Builder config(String key, double value)
  Sets a config option.
SparkSession.Builder config(String key, boolean value)
  Sets a config option.
SparkSession.Builder config(scala.collection.Map<String,Object> map)
  Sets a config option.
SparkSession.Builder config(java.util.Map<String,Object> map)
  Sets a config option.
SparkSession.Builder config(SparkConf conf)
  Sets a list of config options based on the given SparkConf.
SparkSession.Builder enableHiveSupport()
  Enables Hive support, including connectivity to a persistent Hive metastore, support for Hive serdes, and Hive user-defined functions.
SparkSession getOrCreate()
  Gets an existing SparkSession or, if there is no existing one, creates a new one based on the options set in this builder.
SparkSession.Builder master(String master)
  Sets the Spark master URL to connect to, such as "local" to run locally, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
SparkSession.Builder withExtensions(scala.Function1<SparkSessionExtensions,scala.runtime.BoxedUnit> f)
  Inject extensions into the SparkSession.

Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq 
Constructor Details

Builder
public Builder()

Method Details
- 
appName
public SparkSession.Builder appName(String name)
Sets a name for the application, which will be shown in the Spark web UI. If no application name is set, a randomly generated name will be used.
Parameters:
  name - the application name
Returns:
  this builder
Since:
  2.0.0
 
 - 
config
public SparkSession.Builder config(String key, String value)
Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
Parameters:
  key - the configuration key
  value - the configuration value
Returns:
  this builder
Since:
  2.0.0
 
 - 
config
public SparkSession.Builder config(String key, long value)
Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
Parameters:
  key - the configuration key
  value - the configuration value
Returns:
  this builder
Since:
  2.0.0
 
 - 
config
public SparkSession.Builder config(String key, double value)
Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
Parameters:
  key - the configuration key
  value - the configuration value
Returns:
  this builder
Since:
  2.0.0
 
 - 
config
public SparkSession.Builder config(String key, boolean value)
Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
Parameters:
  key - the configuration key
  value - the configuration value
Returns:
  this builder
Since:
  2.0.0
 
 - 
config
public SparkSession.Builder config(scala.collection.Map<String,Object> map)
Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
Parameters:
  map - a map of configuration keys to values
Returns:
  this builder
Since:
  3.4.0
 
 - 
config
public SparkSession.Builder config(java.util.Map<String,Object> map)
Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
Parameters:
  map - a map of configuration keys to values
Returns:
  this builder
Since:
  3.4.0
 
 - 
config
public SparkSession.Builder config(SparkConf conf)
Sets a list of config options based on the given SparkConf.
Parameters:
  conf - the SparkConf to copy options from
Returns:
  this builder
Since:
  2.0.0
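The chaining behavior shared by all of the config overloads above can be illustrated with a minimal, self-contained sketch. This mimic (the class and its fields are hypothetical, not Spark's implementation) shows how each call records an option and returns `this` so calls can be chained, with typed overloads normalized to strings:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a chained config(...) builder. Each call stores an
// option in a single map and returns `this`, which is what makes chaining work.
class ConfigBuilderSketch {
    private final Map<String, String> options = new LinkedHashMap<>();

    ConfigBuilderSketch config(String key, String value) {
        options.put(key, value);
        return this; // returning `this` enables method chaining
    }

    ConfigBuilderSketch config(String key, long value) {
        return config(key, String.valueOf(value)); // typed overloads normalize to strings
    }

    Map<String, String> options() {
        return options;
    }

    public static void main(String[] args) {
        Map<String, String> opts = new ConfigBuilderSketch()
                .config("spark.some.config.option", "some-value")
                .config("spark.sql.shuffle.partitions", 4L)
                .options();
        System.out.println(opts); // accumulated options, in insertion order
    }
}
```

The later overloads taking a Map or a SparkConf follow the same pattern, copying every entry into the accumulated options before returning the builder.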
 
 - 
enableHiveSupport
public SparkSession.Builder enableHiveSupport()
Enables Hive support, including connectivity to a persistent Hive metastore, support for Hive serdes, and Hive user-defined functions.
Returns:
  this builder
Since:
  2.0.0
 
 - 
getOrCreate
Gets an existingSparkSessionor, if there is no existing one, creates a new one based on the options set in this builder.This method first checks whether there is a valid thread-local SparkSession, and if yes, return that one. It then checks whether there is a valid global default SparkSession, and if yes, return that one. If no valid global default SparkSession exists, the method creates a new SparkSession and assigns the newly created SparkSession as the global default.
In case an existing SparkSession is returned, the non-static config options specified in this builder will be applied to the existing SparkSession.
- Returns:
 - (undocumented)
 - Since:
 - 2.0.0
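The resolution order described above (thread-local session, then global default, then create-and-register) can be sketched in plain Java. This is a hypothetical mimic of the documented behavior, using Object as a stand-in for the session type; it is not Spark's implementation:

```java
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of the getOrCreate() resolution order:
// 1) prefer a valid thread-local session,
// 2) fall back to the global default,
// 3) otherwise create a new session and install it as the global default.
class SessionRegistrySketch {
    static final InheritableThreadLocal<Object> activeSession = new InheritableThreadLocal<>();
    static final AtomicReference<Object> defaultSession = new AtomicReference<>();

    static Object getOrCreate() {
        Object session = activeSession.get();
        if (session != null) {
            return session;              // step 1: thread-local session wins
        }
        session = defaultSession.get();
        if (session != null) {
            return session;              // step 2: global default session
        }
        session = new Object();          // step 3: create a new "session"...
        defaultSession.set(session);     // ...and make it the global default
        return session;
    }
}
```

A consequence of this order is that repeated calls from the same application return the same session, which is why builder options on a second call are applied to the existing session rather than producing a new one.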
 
 - 
master
public SparkSession.Builder master(String master)
Sets the Spark master URL to connect to, such as "local" to run locally, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
Parameters:
  master - the Spark master URL
Returns:
  this builder
Since:
  2.0.0
 
 - 
withExtensions
public SparkSession.Builder withExtensions(scala.Function1<SparkSessionExtensions, scala.runtime.BoxedUnit> f) Inject extensions into theSparkSession. This allows a user to add Analyzer rules, Optimizer rules, Planning Strategies or a customized parser.- Parameters:
 f- (undocumented)- Returns:
 - (undocumented)
 - Since:
 - 2.2.0
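The injection pattern behind withExtensions can be illustrated without Spark: the builder stores caller-supplied functions and applies each one to a mutable extensions object when the session is built. The sketch below uses java.util.function.Consumer in place of scala.Function1, and every name in it is hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical mimic of withExtensions-style injection: functions are
// collected now and applied to a mutable extensions object at build time.
class ExtensionsSketch {
    static class Extensions {
        final List<String> rules = new ArrayList<>();
        void injectRule(String name) { rules.add(name); } // stand-in for rule injection
    }

    private final List<Consumer<Extensions>> extensionFns = new ArrayList<>();

    ExtensionsSketch withExtensions(Consumer<Extensions> f) {
        extensionFns.add(f);   // stored now, applied when build() runs
        return this;
    }

    Extensions build() {
        Extensions ext = new Extensions();
        for (Consumer<Extensions> f : extensionFns) {
            f.accept(ext);     // each registered function mutates the extensions
        }
        return ext;
    }
}
```

Deferring the functions until build time mirrors why the parameter is a function rather than a pre-built extensions object: the extensions instance does not exist until the session is actually constructed.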
 
 