I have a DataFrame with a bigint column. How do I convert a bigint column to timestamp in Scala Spark?
1 Answer
You can use the `from_unixtime`/`to_timestamp` functions in Spark to convert a bigint column (epoch seconds) to a timestamp.
Example:
spark.sql("select timestamp(from_unixtime(1563853753,'yyyy-MM-dd HH:mm:ss')) as ts").show(false)
+-------------------+
|ts |
+-------------------+
|2019-07-22 22:49:13|
+-------------------+
(or)
spark.sql("select to_timestamp(1563853753) as ts").show(false)
+-------------------+
|ts |
+-------------------+
|2019-07-22 22:49:13|
+-------------------+
Schema:
spark.sql("select to_timestamp(1563853753) as ts").printSchema
root
|-- ts: timestamp (nullable = false)
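The same conversion works with the DataFrame API. A minimal sketch, assuming a DataFrame `df` with a bigint column named `ts` holding epoch seconds (the names `df` and `converted` are illustrative, not from the original):

```scala
import org.apache.spark.sql.functions.{col, from_unixtime, to_timestamp}

// Assumption: `df` has a bigint column `ts` containing Unix epoch seconds.
// to_timestamp interprets a numeric column directly as epoch seconds.
val converted = df.withColumn("ts", to_timestamp(col("ts")))

// Equivalent route via from_unixtime: format to a string first,
// then cast the string to timestamp.
val converted2 = df.withColumn("ts",
  from_unixtime(col("ts"), "yyyy-MM-dd HH:mm:ss").cast("timestamp"))
```

Both variants replace the bigint `ts` column with a `timestamp` column in place; use a different name in `withColumn` if you want to keep the original.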
Refer to this link for more details on converting different timestamp formats in Spark.
Is there a command to convert the column? Say I have a df with a column named 'ts' and I want that column converted from bigint to timestamp, how do I do that? – Cr4zyTun4 Jan 19 '22 at 11:23