Monday: A SparkSession bundles the older entry points (SQLContext, HiveContext) and wraps a SparkContext underneath; I've mostly been using the SQL side of it. The SparkContext is what connects your application driver (maybe a Jupyter notebook) to a Spark cluster, with the help of a cluster manager (Spark Standalone, YARN, or Apache Mesos). Here's a basic example to get you started.
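A minimal sketch of what that looks like in PySpark, assuming a local run from a notebook; the app name and the `local[*]` master are just placeholders you'd swap for your own cluster settings.

```python
from pyspark.sql import SparkSession

# Build (or reuse) the unified entry point.
# "local[*]" runs Spark locally on all cores; point .master() at your
# cluster manager's URL instead when you have one (placeholder values).
spark = (
    SparkSession.builder
    .appName("notebook-example")   # hypothetical app name
    .master("local[*]")
    .getOrCreate()
)

# The underlying SparkContext is still there when you need it.
sc = spark.sparkContext

# A tiny DataFrame and a SQL query through the session's SQL interface.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])
df.createOrReplaceTempView("letters")
spark.sql("SELECT id, letter FROM letters WHERE id > 1").show()

spark.stop()
```

`getOrCreate()` is handy in notebooks because re-running the cell reuses the existing session instead of trying to start a second one.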