Caching is straightforward: call .persist() on the RDD produced by the desired computation. Storage levels also make it possible to persist a computation on disk, or to replicate it across multiple nodes. There is additionally the option to persist objects in an in-memory shared file system such as Tachyon.
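For example (a minimal sketch assuming a spark-shell session, where sc is the predefined SparkContext; the file name and computation are illustrative):

import org.apache.spark.storage.StorageLevel

// Illustrative computation: line lengths of a text file
val lengths = sc.textFile("data.txt").map(_.length)

lengths.persist(StorageLevel.MEMORY_ONLY)   // keep in memory; equivalent to .cache()
// Other storage levels:
//   StorageLevel.MEMORY_AND_DISK  -- spill partitions that do not fit in memory to disk
//   StorageLevel.MEMORY_ONLY_2    -- replicate each partition on two nodes
//   StorageLevel.OFF_HEAP         -- store off-heap, e.g. in Tachyon

lengths.count()   // the first action computes and caches the RDD
lengths.count()   // later actions reuse the cached partitions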
Connecting to a master
There are two ways to connect to a master.
Programmatically: build a SparkConf with the master URL and pass it to a SparkContext. Setting the master to "local[8]", for example, connects to a local master running with 8 cores:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName(appName).setMaster(master)  // e.g. master = "local[8]"
new SparkContext(conf)
From the shell: pass the master URL to spark-shell with the --master flag, e.g. to run against a local master:

$ ./bin/spark-shell --master local
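Other master URLs are passed the same way; a sketch, where host and port are placeholders for a standalone cluster's master:

$ ./bin/spark-shell --master spark://host:7077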