Spark Standalone mode installation steps

Based on

Download Spark from

I used the prebuilt version of Spark for this post. For building from source, please see the instructions on the website


Extract it to some location, say
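If you grabbed the prebuilt tarball, the extract step is just the following (the filename and target directory are placeholders; use the archive you actually downloaded and whatever location you prefer):

```shell
# Extract the prebuilt Spark tarball into your home directory
# (replace spark-x.y.z-prebuilt.tgz with the file you downloaded)
cd ~/Downloads
tar -xzf spark-x.y.z-prebuilt.tgz -C ~/
```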


To run this you need both Scala and Java.

I downloaded them and configured the following things.

In the /etc/environment file:




You may need to change the above paths depending on your system.
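The entries I added there looked roughly like this (a sketch, reusing the paths from the exports later in this post; adjust them to wherever you unpacked the JDK and Scala):

```shell
# /etc/environment (sketch): point JAVA_HOME and SCALA_HOME at your installs
JAVA_HOME="/home/jagat/Downloads/jdk1.7.0_25"
SCALA_HOME="/home/jagat/Downloads/scala-2.9.3"
```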

Go to


Rename the file to

Add the following values:

export SCALA_HOME="/home/jagat/Downloads/scala-2.9.3"

export JAVA_HOME="/home/jagat/Downloads/jdk1.7.0_25"
export PATH=$PATH:$JAVA_HOME/bin
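A minimal sketch of the rename-and-edit steps above, assuming the file in question is the stock conf/spark-env.sh.template shipped with the Spark distribution (the usual convention) and using a placeholder install path:

```shell
# SPARK_DIR is a placeholder for wherever you extracted Spark
SPARK_DIR=~/spark

# Create spark-env.sh from the shipped template, then append the exports
cp $SPARK_DIR/conf/spark-env.sh.template $SPARK_DIR/conf/spark-env.sh
cat >> $SPARK_DIR/conf/spark-env.sh <<'EOF'
export SCALA_HOME="/home/jagat/Downloads/scala-2.9.3"
export JAVA_HOME="/home/jagat/Downloads/jdk1.7.0_25"
export PATH=$PATH:$JAVA_HOME/bin
EOF
```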

Now check your hosts file to make sure your system resolves hostnames correctly, especially if you are on Ubuntu like me.

Go to /etc/hosts

Change the following

jagat@Dell9400:~$ cat /etc/hosts    localhost    Dell9400

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

On Ubuntu the loopback address for your host is; change it to your exact IP.
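A quick way to verify the change took effect (getent queries the same resolver the JVM will use; "Dell9400" is just the hostname from this post, yours will differ):

```shell
# Check that both localhost and this machine's hostname resolve
getent hosts localhost >/dev/null && echo "localhost resolves"
getent hosts "$(hostname)" >/dev/null \
  && echo "$(hostname) resolves" \
  || echo "$(hostname) does NOT resolve - fix /etc/hosts"
```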

Now we are ready to go

Go to


Start Spark Master

./run spark.deploy.master.Master

Check the URL where the master started (the master web UI runs on port 8080 by default)


The logs will show that the master has started with

URL: spark://Dell9400:7077

This is the URL we will need in all our applications.

Let's start one worker by telling it about the master:

./run spark.deploy.worker.Worker spark://Dell9400:7077

This registers the worker with the master.

Now refresh the master page


You can see that a worker has been added on the page.


Connecting a Job to the Cluster

To run a job on the Spark cluster, simply pass the spark://IP:PORT URL of the master to the SparkContext constructor.

To run an interactive Spark shell against the cluster, run the following command:

MASTER=spark://IP:PORT ./spark-shell
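Once the shell is up, you can sanity-check the cluster with a tiny job at the scala> prompt. A sketch of such a session (the master URL is the one the master printed earlier; the job itself is just an arbitrary example):

```shell
# Launch the interactive shell against the standalone master
MASTER=spark://Dell9400:7077 ./spark-shell

# Then, at the scala> prompt, try a small distributed count:
#   sc.parallelize(1 to 1000).count
```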

That's it!

I admit these were very raw steps, but I kept it simple and quick for first-time users.

Happy Sparking :)