30 May 2015

Modes of Operation

Spark can run either in Local Mode or in Cluster Mode; by default it runs in Local Mode. To run in Cluster Mode, Spark must be configured to use a Cluster Manager. Currently Spark supports three Cluster Managers:
  1. YARN (Hadoop)
  2. Mesos
  3. Standalone Scheduler (available as part of the Spark installation)
We can specify the mode in which Spark runs while submitting the application (via the spark-submit script), as shown in the sketch below.
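
As a rough illustration, here is a minimal Scala sketch of how the master URL selects the mode. The application name, host names and ports are placeholders, and the master is hard-coded only for demonstration; it is normally supplied from outside the code.

  import org.apache.spark.{SparkConf, SparkContext}

  object ModeExample {
    def main(args: Array[String]): Unit = {
      // The master URL decides where the application runs:
      //   local[*]           -> Local Mode, using all cores of this machine
      //   spark://host:7077  -> Standalone Scheduler cluster
      //   mesos://host:5050  -> Mesos cluster
      //   yarn               -> YARN (cluster details come from the Hadoop configuration)
      val conf = new SparkConf()
        .setAppName("ModeExample")
        .setMaster("local[*]")   // hard-coded here only for illustration

      val sc = new SparkContext(conf)
      println(sc.parallelize(1 to 100).sum())
      sc.stop()
    }
  }

In practice the master URL is usually not hard-coded at all; it is passed on the command line through spark-submit's --master option (optionally together with --deploy-mode), so the same application jar can run in Local Mode during development and against a cluster in production without any code change.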