public class DayTimeIntervalType extends DataType implements scala.Product, scala.Serializable
DayTimeIntervalType represents positive as well as negative day-time intervals.

param: startField The leftmost field that the type comprises. Valid values: 0 (DAY), 1 (HOUR), 2 (MINUTE), 3 (SECOND).
param: endField The rightmost field that the type comprises. Valid values: 0 (DAY), 1 (HOUR), 2 (MINUTE), 3 (SECOND).
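A minimal Scala sketch of constructing the type with explicit start and end fields and using it inside a schema (the column name "elapsed" is illustrative; a Spark SQL dependency is assumed on the classpath):

```scala
import org.apache.spark.sql.types.{DayTimeIntervalType, StructField, StructType}

// An interval type spanning the full DAY..SECOND range, and a narrower HOUR..MINUTE one.
val full         = DayTimeIntervalType(DayTimeIntervalType.DAY, DayTimeIntervalType.SECOND)
val hourToMinute = DayTimeIntervalType(DayTimeIntervalType.HOUR, DayTimeIntervalType.MINUTE)

// The type can be used like any other DataType, for example as a column type in a schema.
val schema = StructType(Seq(StructField("elapsed", full)))
```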
| Constructor and Description |
| --- |
| DayTimeIntervalType(byte startField, byte endField) |
| Modifier and Type | Method and Description |
| --- | --- |
| static DayTimeIntervalType | apply() |
| static DayTimeIntervalType | apply(byte field) |
| abstract static boolean | canEqual(Object that) |
| static byte | DAY() |
| static scala.collection.Seq<Object> | dayTimeFields() |
| static DayTimeIntervalType | DEFAULT() |
| int | defaultSize() The day-time interval type has constant precision. |
| byte | endField() |
| abstract static boolean | equals(Object that) |
| static String | fieldToString(byte field) |
| static byte | HOUR() |
| static byte | MINUTE() |
| abstract static int | productArity() |
| abstract static Object | productElement(int n) |
| static scala.collection.Iterator<Object> | productIterator() |
| static String | productPrefix() |
| static byte | SECOND() |
| byte | startField() |
| static scala.collection.immutable.Map<String,Object> | stringToField() |
| String | typeName() Name of the type used in JSON serialization. |
| static boolean | unapply(org.apache.spark.sql.catalyst.expressions.Expression e) Enables matching against AtomicType for expressions. |
Methods inherited from class org.apache.spark.sql.types.DataType:
canWrite, catalogString, equalsStructurally, equalsStructurallyByName, fromDDL, fromJson, json, parseTypeWithFallback, prettyJson, simpleString, sql

Methods inherited from class java.lang.Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
public static byte DAY()
public static byte HOUR()
public static byte MINUTE()
public static byte SECOND()
public static scala.collection.Seq<Object> dayTimeFields()
public static String fieldToString(byte field)
public static scala.collection.immutable.Map<String,Object> stringToField()
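A hedged sketch of how the field constants and the name mappings relate; it assumes stringToField is keyed by the names fieldToString produces, which is not stated above:

```scala
import org.apache.spark.sql.types.DayTimeIntervalType

// The field codes are plain bytes: DAY = 0, HOUR = 1, MINUTE = 2, SECOND = 3.
val codes = DayTimeIntervalType.dayTimeFields  // DAY, HOUR, MINUTE, SECOND

// Round-trip a field code through its string name and back (assumption noted above).
val name = DayTimeIntervalType.fieldToString(DayTimeIntervalType.MINUTE)
val code = DayTimeIntervalType.stringToField(name)  // MINUTE again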
public static DayTimeIntervalType DEFAULT()
public static DayTimeIntervalType apply()
public static DayTimeIntervalType apply(byte field)
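The factory methods can be sketched as follows; that the no-argument apply() and DEFAULT cover DAY to SECOND is an assumption, not spelled out in the signatures above:

```scala
import org.apache.spark.sql.types.DayTimeIntervalType

val dflt     = DayTimeIntervalType.DEFAULT                    // presumably DAY to SECOND
val noArgs   = DayTimeIntervalType()                          // no-arg factory returning a DayTimeIntervalType
val hourOnly = DayTimeIntervalType(DayTimeIntervalType.HOUR)  // single-field factory: HOUR to HOUR (assumed)
```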
public abstract static boolean canEqual(Object that)
public abstract static boolean equals(Object that)
public abstract static Object productElement(int n)
public abstract static int productArity()
public static scala.collection.Iterator<Object> productIterator()
public static String productPrefix()
public byte startField()
public byte endField()
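startField() and endField() expose the bounds of a concrete instance, which is handy when dispatching on a DataType. A small illustrative helper (coversDays is a hypothetical name, not part of the API):

```scala
import org.apache.spark.sql.types.{DataType, DayTimeIntervalType}

// Returns true when the given type is a day-time interval that starts at the DAY field.
def coversDays(dt: DataType): Boolean = dt match {
  case d: DayTimeIntervalType => d.startField == DayTimeIntervalType.DAY
  case _ => false
}

coversDays(DayTimeIntervalType.DEFAULT)                    // true if DEFAULT starts at DAY
coversDays(DayTimeIntervalType(DayTimeIntervalType.HOUR))  // false
```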
public int defaultSize()
The day-time interval type has constant precision. A value of the type always occupies 8 bytes, the same as a Long.
Specified by:
defaultSize in class DataType
public String typeName()
Name of the type used in JSON serialization.
Overrides:
typeName in class DataType
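As an illustration, the type name mirrors SQL interval syntax; the exact strings in the comments below are assumptions, so verify them against your Spark version:

```scala
import org.apache.spark.sql.types.DayTimeIntervalType

DayTimeIntervalType.DEFAULT.typeName                    // e.g. "interval day to second"
DayTimeIntervalType(DayTimeIntervalType.HOUR).typeName  // e.g. "interval hour"
```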
public static boolean unapply(org.apache.spark.sql.catalyst.expressions.Expression e)
Enables matching against AtomicType for expressions:

  case Cast(child @ AtomicType(), StringType) =>
    ...

Parameters:
e - (undocumented)
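A hedged sketch of this pattern-match usage, assuming unapply returns true when the expression's data type is a day-time interval (as the Cast example above suggests). Note that Expression and Literal live in Spark's internal Catalyst package and are not a stable public API:

```scala
import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
import org.apache.spark.sql.types.DayTimeIntervalType

// The pattern DayTimeIntervalType() invokes the Boolean unapply above, so it matches any
// expression whose data type is a day-time interval (assumption based on the doc's example).
def isDayTimeInterval(e: Expression): Boolean = e match {
  case DayTimeIntervalType() => true
  case _ => false
}

// Example scrutinee: a literal day-time interval, internally an 8-byte count of microseconds.
val twentySixHoursMicros = 26L * 60 * 60 * 1000 * 1000
val lit = Literal(twentySixHoursMicros, DayTimeIntervalType.DEFAULT)
assert(isDayTimeInterval(lit))
```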