Package org.apache.spark.sql.jdbc
Class DatabricksDialect

java.lang.Object
  org.apache.spark.sql.jdbc.JdbcDialect
    org.apache.spark.sql.jdbc.DatabricksDialect

All Implemented Interfaces:
Serializable, org.apache.spark.internal.Logging, NoLegacyJDBCError, scala.Equals, scala.Product

public class DatabricksDialect
extends JdbcDialect
implements NoLegacyJDBCError, scala.Product, Serializable
Nested Class Summary

Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging:
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
Constructor Summary

Constructors:
DatabricksDialect()

Method Summary

abstract static R apply()
boolean canHandle(String url)
    Check if this dialect instance can handle a certain jdbc url.
scala.Option<DataType> getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md)
    Get the custom datatype mapping for the given jdbc meta information.
scala.Option<JdbcType> getJDBCType(DataType dt)
    Retrieve the jdbc / sql type for a given datatype.
String getTableSample(org.apache.spark.sql.execution.datasources.v2.TableSampleInfo sample)
boolean isObjectNotFoundException(SQLException e)
boolean isSyntaxErrorBestEffort(SQLException exception)
    Attempts to determine if the given SQLException is a SQL syntax error.
String[][] listSchemas(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
    Lists all the schemas available in the database.
String quoteIdentifier(String colName)
    Quotes the identifier.
boolean schemasExists(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options, String schema)
    Checks whether the given schema exists.
boolean supportsHint()
boolean supportsLimit()
    Returns true if the dialect supports the LIMIT clause.
boolean supportsOffset()
    Returns true if the dialect supports the OFFSET clause.
boolean supportsTableSample()
static String toString()

Methods inherited from class org.apache.spark.sql.jdbc.JdbcDialect:
alterTable, beforeFetch, classifyException, classifyException, compileAggregate, compileExpression, compileValue, convertJavaDateToDate, convertJavaTimestampToTimestamp, convertJavaTimestampToTimestampNTZ, convertTimestampNTZToJavaTimestamp, createConnectionFactory, createIndex, createSchema, createTable, dropIndex, dropSchema, dropTable, functions, getAddColumnQuery, getDayTimeIntervalAsMicros, getDeleteColumnQuery, getFullyQualifiedQuotedTableName, getJdbcSQLQueryBuilder, getLimitClause, getOffsetClause, getRenameColumnQuery, getSchemaCommentQuery, getSchemaQuery, getTableCommentQuery, getTableExistsQuery, getTruncateQuery, getTruncateQuery, getUpdateColumnNullabilityQuery, getUpdateColumnTypeQuery, getYearMonthIntervalAsMonths, indexExists, insertIntoTable, isCascadingTruncateTable, isSupportedFunction, listIndexes, removeSchemaCommentQuery, renameTable, renameTable, supportsJoin, updateExtraColumnMeta

Methods inherited from class java.lang.Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface scala.Equals:
canEqual, equals

Methods inherited from interface org.apache.spark.internal.Logging:
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logBasedOnLevel, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, MDC, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext

Methods inherited from interface org.apache.spark.sql.jdbc.NoLegacyJDBCError:
classifyException

Methods inherited from interface scala.Product:
productArity, productElement, productElementName, productElementNames, productIterator, productPrefix
Constructor Details

DatabricksDialect
public DatabricksDialect()

Method Details

apply
public abstract static R apply()
toString
public static String toString()

canHandle
public boolean canHandle(String url)
Description copied from class: JdbcDialect
Check if this dialect instance can handle a certain jdbc url.
Specified by:
canHandle in class JdbcDialect
Parameters:
url - the jdbc url.
Returns:
True if the dialect can be applied on the given jdbc url.
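As an illustration of this contract, a dialect typically matches on a case-insensitive URL prefix. The sketch below is hypothetical: the exact prefix ("jdbc:databricks:") is an assumption, not taken from the dialect's source.

```java
// Hypothetical sketch of a canHandle-style check. The URL prefix
// "jdbc:databricks:" is an assumption; the real implementation may differ.
public final class UrlCheckSketch {
    public static boolean canHandle(String url) {
        // Dialects commonly normalize the case, then test a fixed prefix.
        return url.toLowerCase(java.util.Locale.ROOT).startsWith("jdbc:databricks:");
    }
}
```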
isObjectNotFoundException
public boolean isObjectNotFoundException(SQLException e)
Overrides:
isObjectNotFoundException in class JdbcDialect
getCatalystType
public scala.Option<DataType> getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md) Description copied from class:JdbcDialectGet the custom datatype mapping for the given jdbc meta information.Guidelines for mapping database defined timestamps to Spark SQL timestamps:
-
TIMESTAMP WITHOUT TIME ZONE if preferTimestampNTZ ->
TimestampNTZType -
TIMESTAMP WITHOUT TIME ZONE if !preferTimestampNTZ ->
TimestampType(LTZ) - TIMESTAMP WITH TIME ZONE ->
TimestampType(LTZ) - TIMESTAMP WITH LOCAL TIME ZONE ->
TimestampType(LTZ) -
If the TIMESTAMP cannot be distinguished by
sqlTypeandtypeName, preferTimestampNTZ is respected for now, but we may need to add another option in the future if necessary.
- Overrides:
getCatalystTypein classJdbcDialect- Parameters:
sqlType- Refers toTypesconstants, or other constants defined by the target database, e.g.-101is Oracle's TIMESTAMP WITH TIME ZONE type. This value is returned byResultSetMetaData.getColumnType(int).typeName- The column type name used by the database (e.g. "BIGINT UNSIGNED"). This is sometimes used to determine the target data type whensqlTypeis not sufficient if multiple database types are conflated into a single id. This value is returned byResultSetMetaData.getColumnTypeName(int).size- The size of the type, e.g. the maximum precision for numeric types, length for character string, etc. This value is returned byResultSetMetaData.getPrecision(int).md- Result metadata associated with this type. This contains additional information fromResultSetMetaDataor user specified options.-
isTimestampNTZ: Whether read a TIMESTAMP WITHOUT TIME ZONE value asTimestampNTZTypeor not. This is configured byJDBCOptions.preferTimestampNTZ. -
scale: The length of fractional partResultSetMetaData.getScale(int)
-
- Returns:
- An option the actual DataType (subclasses of
DataType) or None if the default type mapping should be used.
-
TIMESTAMP WITHOUT TIME ZONE if preferTimestampNTZ ->
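The timestamp guidelines above can be sketched as a small decision function. This is an illustration of the stated rules only, not Spark's actual implementation; the enum names are hypothetical stand-ins for TimestampNTZType and TimestampType.

```java
// Illustrative sketch of the timestamp-mapping guidelines: a zone-aware
// TIMESTAMP always maps to TimestampType (LTZ), while TIMESTAMP WITHOUT
// TIME ZONE honors the preferTimestampNTZ option.
public final class TimestampMappingSketch {
    public enum CatalystType { TIMESTAMP_NTZ, TIMESTAMP_LTZ }

    public static CatalystType mapTimestamp(boolean hasTimeZone, boolean preferTimestampNTZ) {
        if (hasTimeZone) {
            // TIMESTAMP WITH TIME ZONE / WITH LOCAL TIME ZONE -> TimestampType (LTZ)
            return CatalystType.TIMESTAMP_LTZ;
        }
        // TIMESTAMP WITHOUT TIME ZONE: respect preferTimestampNTZ
        return preferTimestampNTZ ? CatalystType.TIMESTAMP_NTZ : CatalystType.TIMESTAMP_LTZ;
    }
}
```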
getJDBCType
public scala.Option<JdbcType> getJDBCType(DataType dt)
Description copied from class: JdbcDialect
Retrieve the jdbc / sql type for a given datatype.
Overrides:
getJDBCType in class JdbcDialect
Parameters:
dt - The datatype (e.g. StringType)
Returns:
The new JdbcType if there is an override for this DataType
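The "override or fall back to the default" shape of this method can be sketched with a lookup table. The target type names below (STRING, BOOLEAN, etc.) are assumptions modeled on Databricks SQL type names, not the dialect's verified mappings.

```java
import java.util.Map;

// Hypothetical sketch of a getJDBCType-style override table. A null result
// plays the role of scala.Option's None: "use the default type mapping".
public final class JdbcTypeSketch {
    private static final Map<String, String> OVERRIDES = Map.of(
            "StringType", "STRING",     // assumed Databricks SQL names,
            "BooleanType", "BOOLEAN",   // not taken from the dialect source
            "ShortType", "SMALLINT",
            "ByteType", "TINYINT");

    public static String getJDBCType(String catalystTypeName) {
        return OVERRIDES.get(catalystTypeName); // null => default mapping
    }
}
```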
isSyntaxErrorBestEffort
public boolean isSyntaxErrorBestEffort(SQLException exception)
Description copied from class: JdbcDialect
Attempts to determine if the given SQLException is a SQL syntax error. This check is best-effort: it may not detect all syntax errors across all JDBC dialects. However, if this method returns true, the exception is guaranteed to be a syntax error.

This is used to decide whether to wrap the exception in a more appropriate Spark exception.

Overrides:
isSyntaxErrorBestEffort in class JdbcDialect
Parameters:
exception - (undocumented)
Returns:
true if the exception is confidently identified as a syntax error; false otherwise.
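One common best-effort heuristic, shown here as an illustration and not necessarily the check DatabricksDialect performs, is to inspect the SQLSTATE: class "42" denotes syntax errors or access-rule violations in the SQL standard.

```java
import java.sql.SQLException;

// Minimal sketch of a best-effort syntax-error check via SQLSTATE class "42".
// This is a generic heuristic, not the dialect's actual implementation.
public final class SyntaxErrorSketch {
    public static boolean isSyntaxErrorBestEffort(SQLException exception) {
        String state = exception.getSQLState();
        // Report true only when confident; unknown or missing states are false.
        return state != null && state.startsWith("42");
    }
}
```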
quoteIdentifier
public String quoteIdentifier(String colName)
Description copied from class: JdbcDialect
Quotes the identifier. This is used to put quotes around the identifier in case the column name is a reserved keyword, or in case it contains characters that require quotes (e.g. space).
Overrides:
quoteIdentifier in class JdbcDialect
Parameters:
colName - (undocumented)
Returns:
(undocumented)
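A minimal sketch of identifier quoting, assuming Databricks-style backticks with embedded backticks doubled; the real dialect's quoting rules may differ in detail.

```java
// Hypothetical sketch: wrap the identifier in backticks so reserved words
// and special characters (e.g. spaces) survive, doubling embedded backticks.
public final class QuoteSketch {
    public static String quoteIdentifier(String colName) {
        return "`" + colName.replace("`", "``") + "`";
    }
}
```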
supportsLimit
public boolean supportsLimit()
Description copied from class: JdbcDialect
Returns true if the dialect supports the LIMIT clause.
Note: Some built-in dialects support the LIMIT clause with some trickery; see OracleDialect.OracleSQLQueryBuilder and MsSqlServerDialect.MsSqlServerSQLQueryBuilder.
Overrides:
supportsLimit in class JdbcDialect
Returns:
(undocumented)
supportsOffset
public boolean supportsOffset()
Description copied from class: JdbcDialect
Returns true if the dialect supports the OFFSET clause.
Note: Some built-in dialects support the OFFSET clause with some trickery; see OracleDialect.OracleSQLQueryBuilder and MySQLDialect.MySQLSQLQueryBuilder.
Overrides:
supportsOffset in class JdbcDialect
Returns:
(undocumented)
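To illustrate why these capability flags matter: when a dialect reports LIMIT/OFFSET support, Spark's JDBC query builder can append the clauses directly instead of resorting to dialect-specific tricks. The builder below is a simplified assumption, not Spark's actual JdbcSQLQueryBuilder.

```java
// Illustrative sketch of LIMIT/OFFSET pushdown. A zero value means the
// clause is absent, mirroring how an unsupported clause would be skipped.
public final class LimitOffsetSketch {
    public static String build(String baseQuery, int limit, int offset) {
        StringBuilder sb = new StringBuilder(baseQuery);
        if (limit > 0) sb.append(" LIMIT ").append(limit);    // when supportsLimit()
        if (offset > 0) sb.append(" OFFSET ").append(offset); // when supportsOffset()
        return sb.toString();
    }
}
```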
supportsTableSample
public boolean supportsTableSample()
Overrides:
supportsTableSample in class JdbcDialect

getTableSample
public String getTableSample(org.apache.spark.sql.execution.datasources.v2.TableSampleInfo sample)
Overrides:
getTableSample in class JdbcDialect

supportsHint
public boolean supportsHint()
Overrides:
supportsHint in class JdbcDialect
schemasExists
public boolean schemasExists(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options, String schema)
Description copied from class: JdbcDialect
Checks whether the given schema exists.
Overrides:
schemasExists in class JdbcDialect
Parameters:
conn - (undocumented)
options - (undocumented)
schema - (undocumented)
Returns:
(undocumented)
listSchemas
public String[][] listSchemas(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
Description copied from class: JdbcDialect
Lists all the schemas available in the database.
Overrides:
listSchemas in class JdbcDialect
Parameters:
conn - (undocumented)
options - (undocumented)
Returns:
(undocumented)