DecimalType

class pyspark.sql.types.DecimalType(precision: int = 10, scale: int = 0)[source]

Decimal (decimal.Decimal) data type.

The DecimalType must have fixed precision (the maximum total number of digits) and scale (the number of digits to the right of the decimal point). For example, (5, 2) can support values in the range [-999.99, 999.99].

The precision can be up to 38; the scale must be less than or equal to the precision.
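The precision/scale semantics above can be illustrated with Python's standard-library decimal module alone (no Spark required); this is an illustrative sketch, not PySpark API usage:

```python
from decimal import Decimal, ROUND_HALF_UP

# A (5, 2) decimal has at most 5 total digits, 2 of them fractional.
# Rounding an arbitrary value to scale 2 keeps two fractional digits.
value = Decimal("123.456")
scaled = value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(scaled)  # 123.46

# The largest (5, 2) value: 3 integer digits + 2 fractional digits = 5 digits.
max_value = Decimal("999.99")
assert len(max_value.as_tuple().digits) == 5
```

A value exceeding the precision (e.g. 1000.00 for a (5, 2) column) cannot be represented and would produce a null or an error in Spark, depending on the ANSI mode setting.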

When creating a DecimalType, the default precision and scale are (10, 0). When inferring a schema from decimal.Decimal objects, the inferred type is DecimalType(38, 18).

Parameters
precision : int, optional

the maximum (i.e. total) number of digits (default: 10)

scale : int, optional

the number of digits to the right of the decimal point (default: 0)

Methods

fromInternal(obj)

Converts an internal SQL object into a native Python object.

json()

jsonValue()

needConversion()

Does this type need conversion between Python object and internal SQL object?

simpleString()

toInternal(obj)

Converts a Python object into an internal SQL object.

typeName()

Methods Documentation

fromInternal(obj: Any) → Any

Converts an internal SQL object into a native Python object.

json() → str
jsonValue() → str[source]
needConversion() → bool

Does this type need conversion between Python object and internal SQL object?

This is used to avoid the unnecessary conversion for ArrayType/MapType/StructType.

simpleString() → str[source]
toInternal(obj: Any) → Any

Converts a Python object into an internal SQL object.

classmethod typeName() → str