Python Jupyter
Spark notebooks
To launch a Spark notebook, use the script at /software/astro/scripts/spark_notebook.sh.
Available arguments are:
-e: name of Python environment (in ~/env/)
-c: Spark executor cores (default: 5)
-m: Spark executor memory (default: 15G)
-o: Spark executor overhead memory (default: 7.5G)
-n: number of executors (default: dynamic)
-p: port to listen on; may be different if the requested port is busy (default: 8888)
-d: Spark driver memory (default: 4G)
-r: Spark driver max result size (default: 2G)
-g: whether to enable Python profiling (default: false)
Example
/software/astro/scripts/spark_notebook.sh -e mocks -n 20 -g
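This launches a notebook using the mocks environment in ~/env/, requests 20 executors, and enables Python profiling. As a further illustration, the invocation below is a sketch combining several of the options listed above; the environment name myenv is a placeholder for whatever environment you have created in ~/env/, and the resource values are arbitrary:
/software/astro/scripts/spark_notebook.sh -e myenv -c 4 -m 10G -n 10 -p 8890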