countByValue() is an action that returns a local Map of each unique value in the RDD paired with its count.
Syntax
def countByValue()(implicit ord: Ordering[T] = null): Map[T, Long]

Returns the count of each unique value in this RDD as a local map of (value, count) pairs. Note that this method should only be used if the resulting map is expected to be small, as the whole thing is loaded into the driver's memory. To handle very large results, consider using rdd.map(x => (x, 1L)).reduceByKey(_ + _), which returns an RDD[(T, Long)] instead of a map.
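The reduceByKey alternative mentioned above can be sketched as follows. This is a minimal sketch, assuming an existing SparkContext `sc`; the variable names are illustrative only.

```scala
// Assumes a live SparkContext `sc` (e.g. inside spark-shell).
val inputrdd = sc.parallelize(Seq(10, 4, 3, 3))

// Distributed equivalent of countByValue(): the counts stay in an
// RDD[(Int, Long)] instead of being collected into driver memory.
val counts = inputrdd.map(x => (x, 1L)).reduceByKey(_ + _)

// Only collect once the result is known to be small enough.
counts.collect().foreach(println)
```

Because the aggregation happens on the executors, this scales to value sets far larger than the driver's heap; countByValue() is only safe when the number of distinct values is small.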
Example
scala> val inputrdd = sc.parallelize{ Seq(10, 4, 3, 3) }
inputrdd: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[28] at parallelize at <console>:47

scala> inputrdd.countByValue()
res34: scala.collection.Map[Int,Long] = Map(10 -> 1, 3 -> 2, 4 -> 1)
Reference
Learning Spark : 41
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.rdd.RDD