pyspark.sql.functions.try_make_timestamp_ntz
- pyspark.sql.functions.try_make_timestamp_ntz(years, months, days, hours, mins, secs)
Try to create local date-time from years, months, days, hours, mins, secs fields. The function returns NULL on invalid inputs.
New in version 4.0.0.
- Parameters
  - years: Column or column name. The year to represent, from 1 to 9999.
  - months: Column or column name. The month-of-year to represent, from 1 (January) to 12 (December).
  - days: Column or column name. The day-of-month to represent, from 1 to 31.
  - hours: Column or column name. The hour-of-day to represent, from 0 to 23.
  - mins: Column or column name. The minute-of-hour to represent, from 0 to 59.
  - secs: Column or column name. The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the secs argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp (see Example 3 below for an illustration).
- Returns
  - Column: A new column that contains a local date-time, or NULL in case of an error.
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
Example 1: Make local date-time from years, months, days, hours, mins, secs.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.try_make_timestamp_ntz('year', 'month', df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+--------------------------------------------------------+
|try_make_timestamp_ntz(year, month, day, hour, min, sec)|
+--------------------------------------------------------+
|2014-12-28 06:30:45.887                                 |
+--------------------------------------------------------+
Example 2: Make local date-time with invalid input.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 13, 28, 6, 30, 45.887]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.try_make_timestamp_ntz('year', 'month', df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+--------------------------------------------------------+
|try_make_timestamp_ntz(year, month, day, hour, min, sec)|
+--------------------------------------------------------+
|NULL                                                    |
+--------------------------------------------------------+
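Example 3: Rolling over when secs equals 60. This is an illustrative sketch rather than one of the original examples; the expected result follows from the secs parameter description above, which says the seconds field is set to 0 and one minute is added.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 60]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.try_make_timestamp_ntz('year', 'month', df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)  # per the secs=60 rule above, this should show 2014-12-28 06:31:00 rather than NULL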
>>> spark.conf.unset("spark.sql.session.timeZone")