Caching
Caching is simple: call .persist() at the end of the desired computation. There are also options to persist a computation on disk or to replicate it across multiple nodes, and objects can be persisted in an in-memory shared file system such as Tachyon.
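As a quick sketch (assuming an existing SparkContext named sc; the input path and names below are placeholders), persisting an RDD with an explicit storage level:
import org.apache.spark.storage.StorageLevel
// sc is an existing SparkContext; the input path is a placeholder
val words = sc.textFile("hdfs://...").flatMap(_.split(" "))
// MEMORY_AND_DISK spills partitions to disk when memory runs out;
// MEMORY_ONLY_2 would replicate each partition on two nodes, OFF_HEAP targets Tachyon
val cached = words.persist(StorageLevel.MEMORY_AND_DISK)
cached.count()                     // the first action materializes the cache
cached.filter(_.nonEmpty).count()  // later actions reuse the cached partitions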
Connecting to a master
There are two ways to connect to a master. The first is programmatically, by passing a SparkConf to a SparkContext:
import org.apache.spark.{SparkConf, SparkContext}
// appName and master are values you supply (e.g. a cluster URL or "local[8]")
val conf = new SparkConf().setAppName(appName).setMaster(master)
val sc = new SparkContext(conf)
The second way is from the shell. This connects to a local master using 8 cores:
$ ./bin/spark-shell --master local[8]
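The same local setup can also be expressed programmatically; a minimal sketch, with an assumed application name:
import org.apache.spark.{SparkConf, SparkContext}
// "MyLocalApp" is just an assumed name shown in the Spark UI
val conf = new SparkConf().setAppName("MyLocalApp").setMaster("local[8]")
val sc = new SparkContext(conf)
sc.parallelize(1 to 100).count()   // quick check that the local context works
sc.stop()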