How to use built-in Spark UDFs

In an earlier post I talked about how to create a custom UDF in Scala for Spark. But before you do that, always check the UDFs that are already built into Spark. I have a sample Spark DataFrame with a list of users, and I wanted to sort the users in descending order of age, so I used the following two lines: the first imports the functions that come with Spark, and the second uses the desc function to order by age in descending order.

import org.apache.spark.sql.functions._  // built-in functions such as desc, asc, sum
display(userDF.orderBy(desc("age")))     // display is a Databricks notebook helper; .show() works elsewhere
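
For context, here is a minimal sketch of how a userDF like this could be constructed; the column names and sample rows are my own assumptions, since the post only uses the age column.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("builtin-functions-demo").getOrCreate()
import spark.implicits._

// hypothetical users DataFrame; only "age" is referenced by the examples in this post
val userDF = Seq(
  ("Alice", 34),
  ("Bob", 28),
  ("Charlie", 45)
).toDF("name", "age")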
Now, if I wanted to sort the DataFrame records by age in ascending order:

display(userDF.orderBy(asc("age")))
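
As a side note (not from the original post), the same ordering can also be written with column expressions, which is handy when you are already working with Column objects.

import org.apache.spark.sql.functions.col

// equivalent to orderBy(asc("age")) and orderBy(desc("age"))
userDF.orderBy(col("age").asc).show()
userDF.orderBy(col("age").desc).show()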
And here is a sample of how to use the sum() function to aggregate the age column:

userDF.select(sum("age")).show()
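
The same org.apache.spark.sql.functions import gives you many more aggregate functions. As a rough sketch (assuming the userDF above), a few of them can be combined in one select, or applied per group with groupBy:

import org.apache.spark.sql.functions.{avg, max, min, sum}

// several aggregates over the whole DataFrame in one pass
userDF.select(avg("age"), min("age"), max("age")).show()

// per-group aggregation; grouping by "name" is just an assumption for illustration
userDF.groupBy("name").agg(sum("age").alias("total_age")).show()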
