How to measure the execution time of a query on Spark

In the spark-shell (Scala), you can use spark.time().

For example, wrap the query in spark.time(). Note that spark.sql(query) alone is lazy and returns immediately; you need to trigger an action such as show() for the query to actually execute and be timed:

spark.time(spark.sql(query).show())

The output would be:

Time taken: xxx ms
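spark.time() is only available in the Scala shell, not in PySpark. A common workaround (my own sketch, not part of the spark.time() API) is to time the action manually with the time module. The helper below mimics the "Time taken" output; in PySpark you would pass something like `lambda: spark.sql(query).show()`, but a dummy workload stands in here so the example is self-contained:

```python
import time

def time_action(action):
    """Run a callable, print wall-clock time like spark.time(), return the result."""
    start = time.perf_counter()
    result = action()
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"Time taken: {elapsed_ms:.0f} ms")
    return result, elapsed_ms

# In PySpark: time_action(lambda: spark.sql(query).show())
# Dummy workload used here for illustration:
result, ms = time_action(lambda: sum(range(1_000_000)))
```

Remember that this measures wall-clock time on the driver, including any time spent collecting or printing results, not just cluster execution time.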

Related: On Measuring Apache Spark Workload Metrics for Performance Troubleshooting.
