JupyterHub

Revision as of 09:48, 27 October 2022 by Neissner

Introduction

PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. Usage is similar to running notebooks on your personal computer, but with the advantage of developing and testing your code on different hardware configurations. It also eases scaling up, since the code is tested in the same environment in which it would eventually run at scale.

Since the service is strictly intended for development and small-scale testing, a shutdown policy for sessions is in place:

  1. The maximum duration for a session is 48h.
  2. After an idle period of 2 hours, the session will be closed.

In practice this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.

How to connect to the service

Go to jupyter.pic.es to reach the login screen.

Login screen

Sign in with your PIC user credentials. This takes you to the following screen.


Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing Start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on overall resource usage.

Screen02.png

On the next screen you can choose the tool you want to use for your work: a Python notebook, a Python console or a plain bash terminal. For the Python environment (either notebook or console) you have two default options:

  • the ipykernel version of Python 3
  • the XPython version of Python 3.9, which allows you to use the integrated debugging module.

Furthermore, there is an icon labelled "D" (Desktop), which starts a VNC session allowing the use of programs with graphical user interfaces.

More recently, an icon for Visual Studio, an integrated development environment, has been added as well.

Screen03.png

Your Python environments appear under the Notebook and Console headers. A later section shows how to create a new environment and how to remove an existing one.

Terminate your session and logout

It is important to terminate your session before you log out. To do so, go to the top menu "File -> Hub Control Panel" and you will see the following screen.

Screen04.png

Here, click the Stop My Server button. After that, you can log out by clicking the Logout button in the upper right corner.

Python virtual environments

This section covers the use of Python virtual environments with Jupyter.

Initialize conda (we highly recommend the use of mambaforge)

Before using conda/mamba in your bash session, you have to initialize it. For access to an available conda/mamba installation, please contact your project liaison at PIC, who will give you the actual value for the /path/to/mambaforge placeholder.

Log into Jupyter and start a session. On the homepage of your Jupyter session, click the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path shown in the example.

First, let's initialize conda/mamba for our bash sessions:

[neissner@td110 ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init

Running this modifies the .bashrc file in your home directory so that the base environment is activated on login. To prevent the base environment from being activated every time you log on to a node, run:

[neissner@td110 ~]$ conda config --set auto_activate_base false
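You can optionally confirm that the setting took effect. A minimal check, guarded in case conda is not on the PATH of your current shell:

```shell
# Show the current value of auto_activate_base (should be False after the
# command above). The guard makes the snippet harmless where conda is absent.
if command -v conda >/dev/null 2>&1; then
    conda config --show auto_activate_base
else
    echo "conda not found on PATH; open a new shell after initializing"
fi
```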

For now you can exit the terminal.

[neissner@td110 ~]$ exit

Link an existing environment to Jupyter

You can find instructions on how to create your own environments in the section Create virtual environments with venv or conda below.

Log into Jupyter, start a session. From the session dashboard choose the bash terminal.

Inside the terminal, activate your environment.

For conda/mamba environments:

  • if you created the environment without a prefix:
[neissner@td110 ~]$ mamba activate environment
(...) [neissner@td110 ~]$ 

The name in parentheses (...) in front of your bash prompt is the name of your environment.

  • if you created the environment with a prefix:
[neissner@td110 ~]$ mamba activate /path/to/environment
(...) [neissner@td110 ~]$ 

The path in parentheses (...) in front of your bash prompt is the absolute path of your environment.

For venv environments:

[neissner@td110 ~]$ source /path/to/environment/bin/activate
(...) [neissner@td110 ~]$ 

Link the environment to a Jupyter kernel. This step is the same for conda/mamba and venv:

(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name
Installed kernelspec whatever_kernel_name in 
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name
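To check that the kernel was registered, you can list all kernelspecs Jupyter knows about. A quick sanity check, guarded in case jupyter is not on your PATH:

```shell
# Your new kernel should show up in this list alongside the defaults.
if command -v jupyter >/dev/null 2>&1; then
    jupyter kernelspec list
else
    echo "jupyter not found on PATH"
fi
```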

Deactivate your environment.

For conda/mamba:

(...) [neissner@td110 ~]$ mamba deactivate

For venv:

(...) [neissner@td110 ~]$ deactivate

Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name appears in the dashboard. In this example, test has been used as the kernel name.

Screen05.png

Unlink an environment from Jupyter

Log into Jupyter, start a session and choose the bash terminal from the session dashboard. To remove your environment/kernel from Jupyter, run:

[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name
Kernel specs to remove:
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name
Remove 1 kernel specs [y/N]: y
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name

Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.

Create virtual environments with venv or conda

Before creating a new environment, please contact your project liaison at PIC, as a suitable environment for your needs may already be in place.

If none of the existing environments suits your needs, you can create a new one. First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path on a shared storage volume visible to all of them.

Once you have the location (e.g. /path/to/env), create the environment with the following commands:

For venv environments (recommended)

Assuming your_env is to be installed at /path/to/env/your_env:

[neissner@td110 ~]$ cd /path/to/env
[neissner@td110 ~]$ python3 -m venv your_env

Now you should be able to activate your environment and install additional modules:

[neissner@td110 ~]$ cd /path/to/env
[neissner@td110 ~]$ source your_env/bin/activate
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...
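As a self-contained illustration of the venv workflow above, the following sketch creates a throwaway environment in a temporary directory (on the real system you would use the path agreed with your project liaison instead):

```shell
# Create a disposable venv in a temporary directory (illustration only).
ENV_DIR="$(mktemp -d)/your_env"
python3 -m venv "$ENV_DIR"

# Activating prepends the environment's bin/ directory to PATH.
. "$ENV_DIR/bin/activate"
python -c 'import sys; print(sys.prefix)'  # the environment's own path

# Leave the environment again.
deactivate
```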

For conda/mamba environments

[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...

The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: python=3 scipy

Now you should be able to activate your environment and install additional modules

[neissner@td110 ~]$ mamba activate /path/to/env/your_env
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...

You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.

Software of particular interest

SageMath

SageMath is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. The following code derives the Einstein equations for the FLRW metric:

Screenshot Sage01.png

Screenshot Sage02.png

Screenshot Sage03.png

Screenshot Sage04.png

You can find the corresponding Notebook in any PIC terminal at this location: /data/astro/software/notebooks/FLRW_cosmology.ipynb

The notebook at /data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb uses known analytical solutions of FLRW cosmology and produces this image of the evolution of the scale factor:

Screenshot Sage05.png

Enabling SageMath environment in Jupyter

[neissner@td110 ~]$ mamba activate /data/astro/software/envs/sage
(/data/astro/software/envs/sage) [neissner@td110 ~]$ python -m  ipykernel install --user --name=sage
....
(/data/astro/software/envs/sage) [neissner@td110 ~]$ mamba deactivate

This creates a file in your home:

~/.local/share/jupyter/kernels/sage/kernel.json

which has to be modified to look like this:

{
 "argv": [
  "/data/astro/software/envs/sage/bin/sage",
  "--python",
  "-m",
  "sage.repl.ipython_kernel",
  "-f",
  "{connection_file}"
 ],
 "display_name": "sage",
 "language": "sage",
 "metadata": {
  "debugger": true
 }
}
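If you prefer to script this modification, the file can be written and sanity-checked from a terminal. A sketch, using a temporary directory as a stand-in; on the real system the target is ~/.local/share/jupyter/kernels/sage/kernel.json:

```shell
# Write the SageMath kernelspec shown above and verify it parses as JSON.
KERNEL_DIR="$(mktemp -d)/kernels/sage"   # stand-in for ~/.local/share/jupyter/kernels/sage
mkdir -p "$KERNEL_DIR"
cat > "$KERNEL_DIR/kernel.json" <<'EOF'
{
 "argv": [
  "/data/astro/software/envs/sage/bin/sage",
  "--python",
  "-m",
  "sage.repl.ipython_kernel",
  "-f",
  "{connection_file}"
 ],
 "display_name": "sage",
 "language": "sage",
 "metadata": {
  "debugger": true
 }
}
EOF
python3 -m json.tool "$KERNEL_DIR/kernel.json" >/dev/null && echo "kernel.json is valid JSON"
```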

Next time you go to your Jupyter dashboard you will find the sage environment listed there.