MapType

class pyspark.sql.types.MapType(keyType: pyspark.sql.types.DataType, valueType: pyspark.sql.types.DataType, valueContainsNull: bool = True)

Map data type.

Parameters
keyType : DataType

DataType of the keys in the map.

valueType : DataType

DataType of the values in the map.

valueContainsNull : bool, optional

Indicates whether values can contain null (None) values (default: True).

Notes

Keys in a map data type are not allowed to be null (None).
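As a hedged sketch of how this constraint surfaces (assuming a local SparkSession; the exact exception type and message vary across Spark versions), building a map with a null key fails at runtime:

>>> from pyspark.sql import SparkSession
>>> from pyspark.sql.functions import create_map, lit
>>> spark = SparkSession.builder.master("local[1]").getOrCreate()
>>> try:
...     spark.range(1).select(
...         create_map(lit(None).cast("string"), lit(1))
...     ).collect()
... except Exception:
...     print("null map key rejected")
null map key rejected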

Examples

>>> from pyspark.sql.types import IntegerType, FloatType, MapType, StringType

The example below demonstrates how to create a MapType:

>>> map_type = MapType(StringType(), IntegerType())

The values of the map can contain null (None) values by default; two instances compare equal only when the key type, value type, and valueContainsNull flag all match:

>>> (MapType(StringType(), IntegerType())
...        == MapType(StringType(), IntegerType(), True))
True
>>> (MapType(StringType(), IntegerType(), False)
...        == MapType(StringType(), FloatType()))
False
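
Beyond equality checks, here is a hedged sketch of end-to-end usage (assuming a local SparkSession is available; the imports above are reused). Because valueContainsNull defaults to True, a None value is accepted in the map:

>>> from pyspark.sql import SparkSession
>>> from pyspark.sql.types import StructType, StructField
>>> spark = SparkSession.builder.master("local[1]").getOrCreate()
>>> schema = StructType([
...     StructField("name", StringType()),
...     StructField("scores", MapType(StringType(), IntegerType())),
... ])
>>> df = spark.createDataFrame([("alice", {"math": 90, "art": None})], schema)
>>> df.select(df.scores["math"]).first()[0]
90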

Methods

fromInternal(obj)

Converts an internal SQL object into a native Python object.

fromJson(json)

Constructs a MapType from a JSON-compatible dictionary.

json()

Returns the JSON string representation of this type.

jsonValue()

Returns a JSON-serializable dictionary representation of this type.

needConversion()

Does this type need conversion between Python objects and internal SQL objects.

simpleString()

Returns a compact string representation of this type.

toInternal(obj)

Converts a Python object into an internal SQL object.

typeName()

Returns the canonical name of this type.

Methods Documentation

fromInternal(obj: Dict[T, Optional[U]]) → Dict[T, Optional[U]]

Converts an internal SQL object into a native Python object.

classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.MapType

Constructs a MapType from a JSON-compatible dictionary, such as one produced by jsonValue().

json() → str

Returns the JSON string representation of this type.

jsonValue() → Dict[str, Any]

Returns a JSON-serializable dictionary representation of this type.

needConversion() → bool

Does this type need conversion between Python objects and internal SQL objects.

This is used to avoid unnecessary conversion for ArrayType/MapType/StructType.
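
For illustration (a sketch, not part of the original page): a map of primitive types needs no conversion, while a TimestampType value does, since datetimes are stored internally as microsecond counts; toInternal and fromInternal then round-trip the values:

>>> from datetime import datetime
>>> from pyspark.sql.types import TimestampType
>>> MapType(StringType(), IntegerType()).needConversion()
False
>>> mt = MapType(StringType(), TimestampType())
>>> mt.needConversion()
True
>>> internal = mt.toInternal({"t": datetime(2024, 1, 1)})
>>> mt.fromInternal(internal) == {"t": datetime(2024, 1, 1)}
True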

simpleString() → str

Returns a compact string representation of this type, e.g. map<string,int>.

toInternal(obj: Dict[T, Optional[U]]) → Dict[T, Optional[U]]

Converts a Python object into an internal SQL object.

classmethod typeName() → str

Returns the canonical name of this type.
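
A short sketch (assumed, not from the original page) tying these serialization helpers together:

>>> mt = MapType(StringType(), IntegerType(), False)
>>> d = mt.jsonValue()
>>> MapType.fromJson(d) == mt
True
>>> mt.simpleString()
'map<string,int>'
>>> MapType.typeName()
'map'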