<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://pwiki.pic.es/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Torradeflot</id>
	<title>Public PIC Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://pwiki.pic.es/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Torradeflot"/>
	<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Special:Contributions/Torradeflot"/>
	<updated>2026-05-15T08:46:42Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.35.14</generator>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Transferring_data_to/from_PIC&amp;diff=1356</id>
		<title>Transferring data to/from PIC</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Transferring_data_to/from_PIC&amp;diff=1356"/>
		<updated>2026-05-14T15:18:18Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Configuration (only once) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= How to provide data access to PIC massive storage (dCache) =&lt;br /&gt;
&lt;br /&gt;
== Requirements ==&lt;br /&gt;
&lt;br /&gt;
* Install and configure Rclone&lt;br /&gt;
* PIC credentials or a macaroon&lt;br /&gt;
&lt;br /&gt;
== Install Rclone ==&lt;br /&gt;
&lt;br /&gt;
You can directly download the binary without installing anything. For instance, on a 64-bit Linux machine:&lt;br /&gt;
&lt;br /&gt;
    $ curl -JLO https://downloads.rclone.org/rclone-current-linux-amd64.zip&lt;br /&gt;
    [...]&lt;br /&gt;
    $ unzip rclone-current-linux-amd64.zip&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Alternatively, you can install Rclone from a package, as in the following example on an Ubuntu machine:&lt;br /&gt;
&lt;br /&gt;
    $ cd /tmp&lt;br /&gt;
    $ curl -JLO 'https://downloads.rclone.org/rclone-current-linux-amd64.deb'&lt;br /&gt;
    $ sudo apt install ./rclone-current-linux-amd64.deb&lt;br /&gt;
&lt;br /&gt;
== Configure Rclone ==&lt;br /&gt;
&lt;br /&gt;
You should have been given credentials and the URL of a WebDAV endpoint at PIC.&lt;br /&gt;
With these, create the corresponding remote in rclone:&lt;br /&gt;
&lt;br /&gt;
    $ rclone config&lt;br /&gt;
    No remotes found, make a new one?&lt;br /&gt;
    n) New remote&lt;br /&gt;
    s) Set configuration password&lt;br /&gt;
    q) Quit config&lt;br /&gt;
    n/s/q&amp;gt; n&lt;br /&gt;
    &lt;br /&gt;
    Enter name for new remote.&lt;br /&gt;
    name&amp;gt; pic&lt;br /&gt;
    &lt;br /&gt;
    Option Storage.&lt;br /&gt;
    Type of storage to configure.&lt;br /&gt;
    Choose a number from below, or type in your own value.&lt;br /&gt;
     1 / 1Fichier&lt;br /&gt;
       \ (fichier)&lt;br /&gt;
    [...]&lt;br /&gt;
    Storage&amp;gt; webdav&lt;br /&gt;
    &lt;br /&gt;
    Option url.&lt;br /&gt;
    URL of http host to connect to.&lt;br /&gt;
    E.g. https://example.com.&lt;br /&gt;
    Enter a value.&lt;br /&gt;
    url&amp;gt;  https://webdav.pic.es/PATH_TO_YOUR_STORAGE_SPACE&lt;br /&gt;
    &lt;br /&gt;
    Option vendor.&lt;br /&gt;
    Name of the WebDAV site/service/software you are using.&lt;br /&gt;
    Choose a number from below, or type in your own value.&lt;br /&gt;
    Press Enter to leave empty.&lt;br /&gt;
     1 / Nextcloud&lt;br /&gt;
       \ (nextcloud)&lt;br /&gt;
    [...]&lt;br /&gt;
     5 / Other site/service or software&lt;br /&gt;
       \ (other)&lt;br /&gt;
    vendor&amp;gt; other&lt;br /&gt;
    &lt;br /&gt;
===  Using your PIC credentials ===&lt;br /&gt;
&lt;br /&gt;
If you have a PIC user, enter it and the corresponding password in this step. Otherwise, leave these fields blank.&lt;br /&gt;
&lt;br /&gt;
    Option user.&lt;br /&gt;
    User name.&lt;br /&gt;
    In case NTLM authentication is used, the username should be in the format 'Domain\User'.&lt;br /&gt;
    Enter a value. Press Enter to leave empty.&lt;br /&gt;
    user&amp;gt; YOUR_PIC_USERNAME&lt;br /&gt;
&lt;br /&gt;
    Option pass.&lt;br /&gt;
    Password.&lt;br /&gt;
    Choose an alternative below. Press Enter for the default (n).&lt;br /&gt;
    y) Yes, type in my own password&lt;br /&gt;
    g) Generate random password&lt;br /&gt;
    n) No, leave this optional password blank (default)&lt;br /&gt;
    y/g/n&amp;gt; y&lt;br /&gt;
    Enter the password:&lt;br /&gt;
    password: YOUR_PIC_PASSWORD&lt;br /&gt;
    Confirm the password:&lt;br /&gt;
    Password: YOUR_PIC_PASSWORD&lt;br /&gt;
&lt;br /&gt;
=== Using a Macaroon token ===&lt;br /&gt;
&lt;br /&gt;
If you have been given a Macaroon token, leave the user and password blank and provide the Macaroon as a bearer token.&lt;br /&gt;
&lt;br /&gt;
    Option bearer_token.&lt;br /&gt;
    Bearer token instead of user/pass (e.g. a Macaroon).&lt;br /&gt;
    Enter a value. Press Enter to leave empty.&lt;br /&gt;
    bearer_token&amp;gt; YOUR_MACAROON_TOKEN&lt;br /&gt;
&lt;br /&gt;
=== Using an OIDC token ===&lt;br /&gt;
&lt;br /&gt;
If you will authenticate with OIDC tokens, leave the user, password and bearer token blank, then set the advanced option bearer_token_command to a command that prints a fresh token (see the oidc-agent section below):&lt;br /&gt;
&lt;br /&gt;
    Option bearer_token.&lt;br /&gt;
    Bearer token instead of user/pass (e.g. a Macaroon).&lt;br /&gt;
    Enter a value. Press Enter to leave empty.&lt;br /&gt;
    bearer_token&amp;gt; &lt;br /&gt;
    &lt;br /&gt;
    Edit advanced config?&lt;br /&gt;
    y) Yes&lt;br /&gt;
    n) No (default)&lt;br /&gt;
    y/n&amp;gt; y&lt;br /&gt;
    &lt;br /&gt;
    Option bearer_token_command.&lt;br /&gt;
    Command to run to get a bearer token.&lt;br /&gt;
    Enter a value. Press Enter to leave empty.&lt;br /&gt;
    bearer_token_command&amp;gt; oidc-token OIDC_AGENT_ACCOUNT_SHORTNAME&lt;br /&gt;
&lt;br /&gt;
=== Review settings ===&lt;br /&gt;
&lt;br /&gt;
At the end, just review the information you entered and confirm.&lt;br /&gt;
&lt;br /&gt;
    Edit advanced config?&lt;br /&gt;
    y) Yes&lt;br /&gt;
    n) No (default)&lt;br /&gt;
    y/n&amp;gt; n&lt;br /&gt;
    &lt;br /&gt;
    Configuration complete.&lt;br /&gt;
    Options:&lt;br /&gt;
    - type: webdav&lt;br /&gt;
    - url: https://webdav.pic.es/PATH_TO_YOUR_STORAGE_SPACE&lt;br /&gt;
    - vendor: other&lt;br /&gt;
    - user: YOUR_PIC_USERNAME&lt;br /&gt;
    - pass: *** ENCRYPTED ***&lt;br /&gt;
    Keep this &amp;quot;pic&amp;quot; remote?&lt;br /&gt;
    y) Yes this is OK (default)&lt;br /&gt;
    e) Edit this remote&lt;br /&gt;
    d) Delete this remote&lt;br /&gt;
    y/e/d&amp;gt; y&lt;br /&gt;
    &lt;br /&gt;
    Current remotes:&lt;br /&gt;
    &lt;br /&gt;
    Name                 Type&lt;br /&gt;
    ====                 ====&lt;br /&gt;
    pic                  webdav&lt;br /&gt;
    &lt;br /&gt;
    e) Edit existing remote&lt;br /&gt;
    n) New remote&lt;br /&gt;
    d) Delete remote&lt;br /&gt;
    r) Rename remote&lt;br /&gt;
    c) Copy remote&lt;br /&gt;
    s) Set configuration password&lt;br /&gt;
    q) Quit config&lt;br /&gt;
    e/n/d/r/c/s/q&amp;gt; q&lt;br /&gt;
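For reference, the answers above end up as a remote section in rclone's config file (typically ~/.config/rclone/rclone.conf). A sketch with the placeholder values used in this guide — note that rclone stores the password obscured, not in plain text:

```ini
# ~/.config/rclone/rclone.conf -- sketch; values are placeholders
[pic]
type = webdav
url = https://webdav.pic.es/PATH_TO_YOUR_STORAGE_SPACE
vendor = other
user = YOUR_PIC_USERNAME
pass = OBSCURED_PASSWORD
```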
&lt;br /&gt;
Once done, you can use the command line to browse, download and upload data.&lt;br /&gt;
&lt;br /&gt;
=== Usage ===&lt;br /&gt;
&lt;br /&gt;
* List a remote PIC directory:&lt;br /&gt;
  rclone lsd &amp;lt;name&amp;gt;:&amp;lt;path&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Download a remote directory from PIC:&lt;br /&gt;
  rclone copy &amp;lt;name&amp;gt;:&amp;lt;path&amp;gt; &amp;lt;local_path&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Upload a local directory to PIC:&lt;br /&gt;
  rclone ${UPLOAD_FLAGS} copy &amp;lt;local_dir&amp;gt; &amp;lt;name&amp;gt;:&amp;lt;path&amp;gt;&lt;br /&gt;
&lt;br /&gt;
When uploading data, we recommend the following flags, where n_transfers can be set as high as 350 when transferring many small files:&lt;br /&gt;
  --check-first -P --stats-one-line --transfers &amp;lt;n_transfers&amp;gt; --size-only&lt;br /&gt;
&lt;br /&gt;
If uploading many files into directories that already contain many files (&amp;gt;1000), also use:&lt;br /&gt;
  --no-traverse&lt;br /&gt;
&lt;br /&gt;
If uploading files larger than 200 MB, also use:&lt;br /&gt;
  --multi-thread-streams 1&lt;br /&gt;
&lt;br /&gt;
If uploading very large files (&amp;gt;10G), also use the following to allow more time for computing checksums:&lt;br /&gt;
  --timeout=15m&lt;br /&gt;
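As an illustration, the recommendations above can be combined into a single invocation. The remote name pic, the paths and the transfer count below are placeholders; the command is echoed rather than executed, since it needs a configured remote:

```shell
# Hypothetical upload: adjust remote name, paths and --transfers to your case.
UPLOAD_FLAGS="--check-first -P --stats-one-line --transfers 32 --size-only"
# Add --no-traverse for target dirs with >1000 files,
# --multi-thread-streams 1 for files >200 MB, --timeout=15m for files >10G.
echo rclone copy $UPLOAD_FLAGS ./local_dir pic:remote/path
```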
&lt;br /&gt;
See the rclone manual for more extensive documentation: https://rclone.org/docs/&lt;br /&gt;
&lt;br /&gt;
== Configuring oidc-agent for obtaining OIDC tokens ==&lt;br /&gt;
&lt;br /&gt;
Make sure &amp;lt;code&amp;gt;oidc-agent&amp;lt;/code&amp;gt; is available.&lt;br /&gt;
&lt;br /&gt;
=== Load oidc-agent === &lt;br /&gt;
&lt;br /&gt;
Initialize &amp;lt;code&amp;gt;oidc-agent&amp;lt;/code&amp;gt; in the terminal session.&lt;br /&gt;
&lt;br /&gt;
    $ eval `oidc-agent`&lt;br /&gt;
&lt;br /&gt;
=== Configuration (only once) ===&lt;br /&gt;
&lt;br /&gt;
To run this step you need a recent version of oidc-agent (&amp;gt; 5.0.0).&lt;br /&gt;
'''Ask your PIC contact for the client secret that replaces the placeholder below.'''&lt;br /&gt;
&lt;br /&gt;
Configure a &amp;lt;code&amp;gt;pic-dcache&amp;lt;/code&amp;gt; account to retrieve tokens from PIC. Open the URL that is displayed and enter the code provided (or simply scan the QR code shown).&lt;br /&gt;
After authenticating in the web browser, return to the terminal and enter an encryption password twice. You will need it when refreshing or reloading the &amp;lt;code&amp;gt;oidc-agent&amp;lt;/code&amp;gt; account.&lt;br /&gt;
&lt;br /&gt;
    $ oidc-gen -m --client-id dcache-view \&lt;br /&gt;
      --client-secret XXXXXXXXXXXXXXXXXX \&lt;br /&gt;
      --pub --flow=device \&lt;br /&gt;
      --discovery-endpoint=https://idp.pic.es/realms/PIC/.well-known/openid-configuration \&lt;br /&gt;
      --scope=&amp;quot;openid profile offline_access&amp;quot; --redirect-uri=edu.kit.data.oidc-agent:/ pic-dcache&lt;br /&gt;
    &lt;br /&gt;
    No account exists with this short name. Creating new configuration ...&lt;br /&gt;
    Generating account configuration ...&lt;br /&gt;
    accepted&lt;br /&gt;
    &lt;br /&gt;
    Using a browser on any device, visit:&lt;br /&gt;
    https://idp.pic.es/realms/PIC/device&lt;br /&gt;
    &lt;br /&gt;
    And enter the code: ASDF-GHJK&lt;br /&gt;
    Alternatively you can use the following QR code to visit the above listed URL.&lt;br /&gt;
        &lt;br /&gt;
    [ QR CODE ]&lt;br /&gt;
    &lt;br /&gt;
    Enter encryption password for account configuration 'pic-dcache': &lt;br /&gt;
    Confirm encryption password: &lt;br /&gt;
    Everything setup correctly!&lt;br /&gt;
&lt;br /&gt;
=== Reauthenticating (if refresh token has expired) ===&lt;br /&gt;
&lt;br /&gt;
If the oidc-agent process gets restarted, or if your refresh token expires due to inactivity, you will need to reauthenticate before retrieving further tokens:&lt;br /&gt;
&lt;br /&gt;
    $ oidc-gen --reauthenticate pic-dcache&lt;br /&gt;
    Enter decryption password for account config 'pic-dcache': &lt;br /&gt;
    Generating account configuration ...&lt;br /&gt;
    accepted&lt;br /&gt;
    &lt;br /&gt;
    Using a browser on any device, visit:&lt;br /&gt;
    https://idp.pic.es/realms/PIC/device&lt;br /&gt;
    &lt;br /&gt;
    And enter the code: ASDF-GHJK&lt;br /&gt;
    Alternatively you can use the following QR code to visit the above listed URL.&lt;br /&gt;
    &lt;br /&gt;
    [ QR CODE ]&lt;br /&gt;
    &lt;br /&gt;
    Enter encryption password for account configuration 'pic-dcache' [***]: &lt;br /&gt;
    Everything setup correctly!&lt;br /&gt;
&lt;br /&gt;
=== Testing ===&lt;br /&gt;
&lt;br /&gt;
After loading and configuring, you can get a token by running the following command:&lt;br /&gt;
&lt;br /&gt;
    $ oidc-token pic-dcache&lt;br /&gt;
      eyJhbGciOiJSUzI1[...]4YjAwg&lt;br /&gt;
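For the OIDC variant configured earlier, the corresponding remote section in rclone's config file would look roughly like this (a sketch; the URL and account shortname are the placeholders from this guide):

```ini
# ~/.config/rclone/rclone.conf -- OIDC variant (sketch)
[pic]
type = webdav
url = https://webdav.pic.es/PATH_TO_YOUR_STORAGE_SPACE
vendor = other
bearer_token_command = oidc-token pic-dcache
```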
&lt;br /&gt;
== Obtaining a macaroon (for contacts) ==&lt;br /&gt;
&lt;br /&gt;
Macaroons are valid for up to 7 days.&lt;br /&gt;
&lt;br /&gt;
For downloading data (read-only permissions on the path):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$ curl -u ${USER} -X POST -H 'Content-Type: application/macaroon-request' \&lt;br /&gt;
-d '{&amp;quot;caveats&amp;quot;: [&amp;quot;activity:DOWNLOAD,LIST&amp;quot;], &amp;quot;validity&amp;quot;: &amp;quot;P7D&amp;quot;}' \&lt;br /&gt;
https://door04.pic.es:8460/${RESTRICTED_PATH}&lt;br /&gt;
&lt;br /&gt;
{&lt;br /&gt;
    &amp;quot;macaroon&amp;quot;: &amp;quot;MDA2MGxvY2F0aW&amp;quot;,&lt;br /&gt;
    &amp;quot;uri&amp;quot;: {&lt;br /&gt;
        &amp;quot;targetWithMacaroon&amp;quot;: &amp;quot;https://door04.pic.es:8460/${RESTRICTED_PATH}?authz=MDA2MGxvY2F0aW&amp;quot;,&lt;br /&gt;
        &amp;quot;baseWithMacaroon&amp;quot;: &amp;quot;https://door04.pic.es:8460/?authz=MDA2MGxvY2F0aW&amp;quot;,&lt;br /&gt;
        &amp;quot;target&amp;quot;: &amp;quot;https://door04.pic.es:8460/${RESTRICTED_PATH}&amp;quot;,&lt;br /&gt;
        &amp;quot;base&amp;quot;: &amp;quot;https://door04.pic.es:8460/&amp;quot;&lt;br /&gt;
    }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
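As a sketch of how the response is consumed: the token can be pulled out of the JSON and appended to any URL under the path as the authz query parameter. The response below is the sample from above, with a hypothetical path; a real response would come from the curl call:

```shell
# Extract the macaroon from a saved response and build an authenticated URL.
cat > response.json <<'EOF'
{
    "macaroon": "MDA2MGxvY2F0aW",
    "uri": {
        "target": "https://door04.pic.es:8460/some/restricted/path",
        "base": "https://door04.pic.es:8460/"
    }
}
EOF
MACAROON=$(sed -n 's/.*"macaroon": *"\([^"]*\)".*/\1/p' response.json)
# Anyone holding this URL can access the path without further credentials:
echo "https://door04.pic.es:8460/some/restricted/path?authz=${MACAROON}"
```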
&lt;br /&gt;
For uploading data (full permissions on the path):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$ curl -u ${USER} -X POST -H 'Content-Type: application/macaroon-request' \&lt;br /&gt;
-d '{&amp;quot;validity&amp;quot;: &amp;quot;P7D&amp;quot;}' \&lt;br /&gt;
https://door04.pic.es:8460/${RESTRICTED_PATH}&lt;br /&gt;
&lt;br /&gt;
{&lt;br /&gt;
    &amp;quot;macaroon&amp;quot;: &amp;quot;MDA2MGxvY2F0aW&amp;quot;,&lt;br /&gt;
    &amp;quot;uri&amp;quot;: {&lt;br /&gt;
        &amp;quot;targetWithMacaroon&amp;quot;: &amp;quot;https://door04.pic.es:8460/${RESTRICTED_PATH}?authz=MDA2MGxvY2F0aW&amp;quot;,&lt;br /&gt;
        &amp;quot;baseWithMacaroon&amp;quot;: &amp;quot;https://door04.pic.es:8460/?authz=MDA2MGxvY2F0aW&amp;quot;,&lt;br /&gt;
        &amp;quot;target&amp;quot;: &amp;quot;https://door04.pic.es:8460/${RESTRICTED_PATH}&amp;quot;,&lt;br /&gt;
        &amp;quot;base&amp;quot;: &amp;quot;https://door04.pic.es:8460/&amp;quot;&lt;br /&gt;
    }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1338</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1338"/>
		<updated>2026-04-08T08:14:19Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* GPUs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tl;dr: Connect to https://jupyter.pic.es/. Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but with the advantage of developing and testing your code on different hardware configurations; it also eases scaling up, since the code is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing, a shutdown policy for sessions is in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with so that a session can process it in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session, as well as the experiment (project) you will be working on. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before starting. This usually takes less than a minute, but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-extensive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please contact your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can repeat it every time you need it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parenthesis (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parenthesis (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to everyone involved.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
The following creates your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/folder/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where conda stores downloaded packages. E.g.:&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If pkgs_dirs and envs_dirs are on the same storage, conda will use hard links, saving disk space.&lt;br /&gt;
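Putting both parameters together, a minimal ~/.condarc might look like this (the paths are the examples above; adapt them to your own scratch areas):

```yaml
# ~/.condarc -- sketch combining the options above
envs_dirs:
  - /data/pic/scratch/torradeflot/envs
pkgs_dirs:
  - /data/pic/scratch/torradeflot/pkgs
```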
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab [https://jupyterlab.readthedocs.io/en/4.2.x/ here], with instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
In the launcher you will also see an icon with a &amp;quot;D&amp;quot; (desktop); it starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
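As a sketch, pairing can be enabled for a whole repository through a jupytext.toml file at its root (the py:percent format is one common choice, not a requirement):

```toml
# jupytext.toml -- pair every notebook with a py:percent script
formats = "ipynb,py:percent"
```

Individual notebooks can also be paired from the command line with jupytext --set-formats ipynb,py:percent notebook.ipynb, after which jupytext --sync keeps both files up to date.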
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment; it makes network/web services running on the same host as the jupyterlab server accessible from outside, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
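As a local illustration of the idea, any service bound to a port inside the session becomes reachable through that proxy path. Here a stand-in HTTP server on a hypothetical port 8899:

```shell
# Stand-in for any web app: serve the current directory on a local port.
python3 -m http.server 8899 --bind 127.0.0.1 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
# From inside a session this would be reachable as
# https://jupyter.pic.es/user/<username>/proxy/8899/ (hypothetical port).
curl -s http://127.0.0.1:8899/ -o listing.html
kill $SERVER_PID
```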
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
&lt;br /&gt;
=== ROOT ===&lt;br /&gt;
&lt;br /&gt;
Using ROOT from a jupyter notebook requires a few extra steps.&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation:&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a python shell, but it does not yet work from a notebook:&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
'''Standard cosmology examples'''&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
'''Enabling SageMath environment in Jupyter'''&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
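The kernelspec edit above can also be scripted. The following sketch (assuming the same paths as above; adjust them to your system) builds the same kernel.json with Python's json module:

```python
import json
from pathlib import Path

# Kernel spec matching the hand-edited file above; the sage binary path is
# the one used on this wiki and may differ on your system.
spec = {
    "argv": [
        "/data/astro/software/envs/sage/bin/sage",
        "--python", "-m", "sage.repl.ipython_kernel",
        "-f", "{connection_file}",
    ],
    "display_name": "sage",
    "language": "sage",
    "metadata": {"debugger": True},
}

kernel_file = Path.home() / ".local/share/jupyter/kernels/sage/kernel.json"
# Uncomment to overwrite the installed kernelspec:
# kernel_file.write_text(json.dumps(spec, indent=1))
print(json.dumps(spec, indent=1))
```

Note that "{connection_file}" is a literal placeholder that Jupyter substitutes at kernel launch; it must not be expanded by the script.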
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
    &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The GPUs assigned to your job are listed in the environment variable CUDA_VISIBLE_DEVICES as a comma-separated list of GPU ids. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot; to see how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
    [torradeflot@gpu05 ~]$ echo $CUDA_VISIBLE_DEVICES &lt;br /&gt;
    GPU-a2361e9d-c520-684b-b21b-60cc3a59b05b&lt;br /&gt;
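The same information can be read programmatically from inside a notebook; a minimal sketch (the GPU ids below are made-up examples):

```python
import os

def assigned_gpus():
    """Return the list of GPU ids assigned to the job (empty if none)."""
    value = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu for gpu in value.split(",") if gpu]

# Example value as it might appear inside a job (hypothetical ids):
os.environ["CUDA_VISIBLE_DEVICES"] = "GPU-a2361e9d,GPU-b7c01f22"
print(len(assigned_gpus()))  # -> 2
```

Inside a real session you would not set the variable yourself; HTCondor exports it into the job environment.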
&lt;br /&gt;
You can track the GPU utilization (memory and load) using the GPU Dashboard in the left sidebar&lt;br /&gt;
&lt;br /&gt;
[[File:GPU_utilization_20260408.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known problems =&lt;br /&gt;
&lt;br /&gt;
== HOME folder is full ==&lt;br /&gt;
&lt;br /&gt;
Your HOME folder located at &amp;quot;/nfs/pic.es/user/X/XYZTUV&amp;quot; has a quota assigned, usually 25GB. &lt;br /&gt;
&lt;br /&gt;
If you see the error message &amp;quot;Disk quota exceeded&amp;quot; when spawning a notebook server or a generic 500 error, it could be because your $HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; (through SSH) and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error meaning that the jupyterlab server failed. Possible reasons include:&lt;br /&gt;
&lt;br /&gt;
* [[#HOME_folder_is_full|Your HOME folder is full]]&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server, ultimately receiving a 504 error.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, cleaning the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
To make the proxy work correctly, please create it the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=File:GPU_utilization_20260408.png&amp;diff=1337</id>
		<title>File:GPU utilization 20260408.png</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=File:GPU_utilization_20260408.png&amp;diff=1337"/>
		<updated>2026-04-08T08:11:29Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1336</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1336"/>
		<updated>2026-04-08T08:10:10Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* GPUs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should make sure the test data volume you work with during a session can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the included packages:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can repeat it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment, however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are in the same storage, conda will use hard links, saving disk space.&lt;br /&gt;
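To see why hard links save disk space, here is a small self-contained sketch (plain Python, using hypothetical file names) that links two paths to the same inode, as conda does between `pkgs_dirs` and `envs_dirs` when both sit on one filesystem:

```python
import os
import tempfile

# Two directory entries pointing at one inode: the file data is stored once,
# no matter how many environments link to the cached package file.
with tempfile.TemporaryDirectory() as d:
    pkg_copy = os.path.join(d, "pkgs_numpy.so")   # the package cache copy
    env_copy = os.path.join(d, "envs_numpy.so")   # the environment's copy
    with open(pkg_copy, "w") as f:
        f.write("shared payload")
    os.link(pkg_copy, env_copy)  # hard link: no data is duplicated
    same_inode = os.stat(pkg_copy).st_ino == os.stat(env_copy).st_ino
    print(same_inode)  # -> True
```

Hard links only work within a single filesystem, which is why the tip above asks for both directories to be in the same storage.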
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (4.2) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon for Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
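As an illustration of why the paired text file diffs cleanly, here is a minimal sketch (plain Python, not the actual jupytext implementation) that extracts only the source of code cells from a notebook's JSON structure, dropping outputs and execution counts:

```python
import json

# A tiny stand-in for a notebook: one code cell with an embedded image output.
# (Illustrative structure only; real notebooks carry more metadata.)
nb_json = json.dumps({
    "cells": [{
        "cell_type": "code",
        "execution_count": 3,
        "source": ["import numpy as np\n", "np.random.random([10, 10])\n"],
        "outputs": [{"data": {"image/png": "iVBORw0KGgo..."}}],
    }]
})

def sources_only(notebook_json):
    """Keep only the code: roughly what ends up in the paired .py file."""
    nb = json.loads(notebook_json)
    return "".join(
        line
        for cell in nb["cells"] if cell["cell_type"] == "code"
        for line in cell["source"]
    )

print(sources_only(nb_json))
```

Re-executing the notebook changes "outputs" and "execution_count" but not "source", so the extracted text, and hence its git diff, stays the same.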
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment. It allows external access to network/web services running on the same host as the jupyterlab server, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
&lt;br /&gt;
=== ROOT ===&lt;br /&gt;
&lt;br /&gt;
Using ROOT from a jupyter notebook requires some extra steps.&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate /data/pic/scratch/torradeflot/envs/mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a python shell, but it does not work from a notebook. This is because:&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
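The kind of symbolic computation involved can be sketched outside SageMath too. The example below uses sympy (swapped in for illustration, not SageMath itself) to verify that the matter-dominated scale factor a(t) = t^(2/3) satisfies the Friedmann relation H^2 a^3 = constant:&lt;br /&gt;
&lt;br /&gt;
```python
import sympy as sp

t = sp.symbols('t', positive=True)
a = t**sp.Rational(2, 3)          # matter-dominated scale factor a(t)
H = sp.diff(a, t) / a             # Hubble rate H = a'/a
const = sp.simplify(H**2 * a**3)  # Friedmann: H^2 * a^3 should be constant
# const evaluates to 4/9, independent of t
```
&lt;br /&gt;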
&lt;br /&gt;
'''Standard cosmology examples'''&lt;br /&gt;
&lt;br /&gt;
* The Friedman equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
'''Enabling SageMath environment in Jupyter'''&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that, you can enable SageMath for use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
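A minimal taste of what Dask provides, assuming the dask package from the prebuilt environment is available (the cluster-launching extension is separate):&lt;br /&gt;
&lt;br /&gt;
```python
import dask

@dask.delayed
def inc(x):
    # each call becomes a node in a lazy task graph
    return x + 1

# build a graph of five inc() tasks plus a final sum, then execute it
total = dask.delayed(sum)([inc(i) for i in range(5)])
result = total.compute()  # runs the tasks, possibly in parallel
```
&lt;br /&gt;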
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
For Python notebooks, the singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
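The two steps above can also be scripted. A sketch, assuming you point kernels_dir at ~/.local/share/jupyter/kernels and adapt the image path (the install_singularity_kernel helper is hypothetical):&lt;br /&gt;
&lt;br /&gt;
```python
import json
from pathlib import Path

def install_singularity_kernel(kernels_dir, image, name='singularity'):
    # Write a kernel.json that wraps the image's python in singularity exec.
    # 'kernels_dir' would normally be ~/.local/share/jupyter/kernels
    kdir = Path(kernels_dir) / name
    kdir.mkdir(parents=True, exist_ok=True)
    spec = {
        'argv': [
            'singularity', 'exec', '--cleanenv', str(image),
            'python', '-m', 'ipykernel', '-f', '{connection_file}',
        ],
        'language': 'python',
        'display_name': f'{name}-kernel',
    }
    (kdir / 'kernel.json').write_text(json.dumps(spec, indent=1))
    return kdir / 'kernel.json'
```
&lt;br /&gt;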
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The GPUs assigned to your job are listed in the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
    [torradeflot@gpu05 ~]$ echo $CUDA_VISIBLE_DEVICES &lt;br /&gt;
    GPU-a2361e9d-c520-684b-b21b-60cc3a59b05b&lt;br /&gt;
&lt;br /&gt;
You can then track the GPU utilization (memory and load) using the GPU Dashboard in the left sidebar.&lt;br /&gt;
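The same check can be done from inside a notebook. A small sketch (assigned_gpus is a hypothetical helper, not a PIC tool):&lt;br /&gt;
&lt;br /&gt;
```python
import os

def assigned_gpus():
    # CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids;
    # an unset or empty variable means no GPUs were assigned to the job.
    raw = os.environ.get('CUDA_VISIBLE_DEVICES', '')
    return [gpu for gpu in raw.split(',') if gpu]
```
&lt;br /&gt;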
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are written once the jupyterlab server job has finished.&lt;br /&gt;
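To quickly find the newest log file after a session ends, a helper along these lines can be used (hypothetical; adapt the directory if needed):&lt;br /&gt;
&lt;br /&gt;
```python
from pathlib import Path

def latest_log(log_dir='~/.jupyter'):
    # Return the most recently modified file in log_dir, or None if empty.
    files = [p for p in Path(log_dir).expanduser().iterdir() if p.is_file()]
    return max(files, key=lambda p: p.stat().st_mtime, default=None)
```
&lt;br /&gt;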
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known problems =&lt;br /&gt;
&lt;br /&gt;
== HOME folder is full ==&lt;br /&gt;
&lt;br /&gt;
Your HOME folder located at &amp;quot;/nfs/pic.es/user/X/XYZTUV&amp;quot; has a quota assigned, usually 25GB. &lt;br /&gt;
&lt;br /&gt;
If you see the error message &amp;quot;Disk quota exceeded&amp;quot; when spawning a notebook server or a generic 500 error, it could be because your $HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; (through SSH) and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the jupyterlab server failed. Possible reasons include:&lt;br /&gt;
&lt;br /&gt;
* [[#HOME_folder_is_full|Your HOME folder is full]]&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server, and ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Gitlab&amp;diff=1326</id>
		<title>Gitlab</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Gitlab&amp;diff=1326"/>
		<updated>2026-03-23T09:29:51Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Best practices / recommendations */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Introduction ==&lt;br /&gt;
&lt;br /&gt;
[https://about.gitlab.com/ Gitlab] is a Dev(Sec)Ops platform to handle all the elements in the software development life cycle.&lt;br /&gt;
&lt;br /&gt;
Main features:&lt;br /&gt;
* Git repository management&lt;br /&gt;
* Software development planning&lt;br /&gt;
* Continuous Integration and Continuous Deployment (CI/CD)&lt;br /&gt;
&lt;br /&gt;
The service can be accessed at [https://gitlab.pic.es gitlab.pic.es] with your PIC account.&lt;br /&gt;
&lt;br /&gt;
The groups inside Gitlab are not synced with the LDAP groups; if you want access to a specific group or project in Gitlab, you have to request it through the application.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Official Gitlab documentation ==&lt;br /&gt;
&lt;br /&gt;
The official Gitlab documentation can be found here: https://docs.gitlab.com/&lt;br /&gt;
&lt;br /&gt;
Make sure that the documentation you are viewing applies to the Free (Community Edition) distribution and the version deployed at '''gitlab.pic.es'''. You can easily find the version number in the Help menu.&lt;br /&gt;
&lt;br /&gt;
The official documentation can be overwhelming; here are some useful links:&lt;br /&gt;
&lt;br /&gt;
* [https://gitlab.pic.es/help/user/ssh.md#generate-an-ssh-key-pair Generate an SSH key pair]&lt;br /&gt;
* [https://docs.gitlab.com/ee/topics/git/lfs/ git LFS]&lt;br /&gt;
* [https://docs.gitlab.com/ee/user/project/web_ide/index.html Web IDE]&lt;br /&gt;
* [https://docs.gitlab.com/ee/ci/yaml/ CI/CD .yaml file syntax reference]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Best practices / recommendations ==&lt;br /&gt;
&lt;br /&gt;
=== General rules ===&lt;br /&gt;
&lt;br /&gt;
* '''Make small and atomic changes''' Commit / push often.&lt;br /&gt;
* '''Use branches for dedicated/long developments'''&lt;br /&gt;
* '''Keep the main branch stable''' The tests should always pass (yes, you should have tests!)&lt;br /&gt;
* '''Write descriptive commit messages''' Avoid messages like “changes”, “test”  or “.”&lt;br /&gt;
* '''Adopt a branching strategy''' [https://nvie.com/posts/a-successful-git-branching-model/ gitflow], trunk-based,...&lt;br /&gt;
* '''Do code reviews if possible''' Especially if working in a team&lt;br /&gt;
&lt;br /&gt;
=== Our contributions ===&lt;br /&gt;
&lt;br /&gt;
* '''Use git!!''' Whether you work alone or in a team, use it!&lt;br /&gt;
* '''Do not upload big binary files to a git repository''' Git is not for data, it is for code. If you still need to version large files, you can use git LFS (see above).&lt;br /&gt;
* '''Use .gitignore to track only relevant files''' [https://git-scm.com/docs/gitignore official docs] and [https://github.com/github/gitignore/tree/main some examples] &lt;br /&gt;
* '''Do not track jupyter notebooks (.ipynb)''' Pair them with a script using [[JupyterHub#jupytext|jupytext]]&lt;br /&gt;
* '''Do not upload confidential data''' Passwords, ssh keys, etc. will be there forever&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
=== Errors in CI/CD jobs ===&lt;br /&gt;
&lt;br /&gt;
* ''pods XXXX is forbidden: exceeded quota'': There's a limit on the amount of resources that can be used simultaneously for CI/CD jobs. This takes into account your own jobs and jobs from other projects. You should be able to retry the pipeline after some period, once resources have been freed. If it doesn't work, contact the administrator.&lt;br /&gt;
* ''ERROR: Preparation failed: couldn't prepare overwrites: invalid build requests specified: the resource &amp;quot;XXXXXX&amp;quot; requested &amp;quot;X&amp;quot; is higher than limit allowed &amp;quot;X&amp;quot;'': Your job requested more than the allowed resources. Lower the amount of resources requested.&lt;br /&gt;
* ''ERROR: Job failed (system failure): Error in container build: exit code: 137, reason: 'OOMKilled''': The job exceeded the memory limit and was killed. Try increasing the memory requested for your CI/CD job. If this is not enough, contact the service administrator.&lt;br /&gt;
&lt;br /&gt;
=== Configure resource limits in CI/CD jobs ===&lt;br /&gt;
&lt;br /&gt;
Gitlab CI/CD jobs run as pods on one of PIC's kubernetes clusters. Pods can have resource requests and limits. Resource requests are effectively allocated for the pod, which will be killed if it exceeds the resource limits.&lt;br /&gt;
&lt;br /&gt;
Default resources are:&lt;br /&gt;
* 1GiB memory and 0.5 cpu requested&lt;br /&gt;
* 2GiB memory and 1 cpu limit&lt;br /&gt;
&lt;br /&gt;
Default resources can be overwritten by setting environment variables in the job configuration in .gitlab-ci.yml like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cicd_job:&lt;br /&gt;
  image: python:3.11-slim&lt;br /&gt;
  variables:&lt;br /&gt;
    KUBERNETES_CPU_REQUEST: &amp;quot;1&amp;quot;&lt;br /&gt;
    KUBERNETES_CPU_LIMIT: &amp;quot;2&amp;quot;&lt;br /&gt;
    KUBERNETES_MEMORY_REQUEST: &amp;quot;4Gi&amp;quot;&lt;br /&gt;
    KUBERNETES_MEMORY_LIMIT: &amp;quot;6Gi&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The maximum resources that can be requested when overwriting the defaults are:&lt;br /&gt;
* 6 GiB memory and 2 cpu requested&lt;br /&gt;
* 8 GiB memory and 4 cpu limit&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1320</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1320"/>
		<updated>2026-02-24T14:26:19Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Known problems */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tl;dr: Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling the code up, since it is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should choose a test data volume that can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which will make some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To avoid the base environment being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask him/her for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
For example, to install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
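The same venv creation can also be done from Python with the stdlib venv module, e.g. from a notebook cell. A sketch (create_env is a hypothetical wrapper):&lt;br /&gt;
&lt;br /&gt;
```python
import venv
from pathlib import Path

def create_env(env_path):
    # Create a virtual environment at env_path with the stdlib venv module;
    # pass with_pip=True to venv.create if pip should be bootstrapped too.
    venv.create(env_path)
    return Path(env_path) / 'pyvenv.cfg'   # marker file of a valid venv
```
&lt;br /&gt;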
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where conda packages are stored&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
    - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
    - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are in the same storage, conda will use hard links, thus saving disk space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here]; there you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (desktop); it starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
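To see why the diff explodes: a .ipynb file is JSON whose code cells carry an `outputs` field holding the rendered images. A jupytext-paired script implicitly never contains that field. A rough sketch of this normalization (hypothetical helper, not part of jupytext):&lt;br /&gt;

```python
import json

def clear_outputs(nb_json):
    """Drop outputs and execution counts from a notebook's JSON text,
    leaving only the content a paired text file would also carry."""
    nb = json.loads(nb_json)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return json.dumps(nb, indent=1, sort_keys=True)
```

Two runs of the same notebook that produced different images normalize to identical text, so their diff is empty, which is exactly the property the paired .py file gives you for free.&lt;br /&gt;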
&lt;br /&gt;
== Git ==&lt;br /&gt;
A sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment and allows access from outside to network/web services running on the same host as the jupyterlab server, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
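The URL scheme described above can be illustrated with a small helper (hypothetical, for illustration only):&lt;br /&gt;

```python
def proxy_url(username, port):
    """Build the jupyter-server-proxy URL for a service listening
    on `port` inside the user's jupyterlab session."""
    return f"https://jupyter.pic.es/user/{username}/proxy/{port}"
```

For example, a Dash or Bokeh app listening on port 8050 inside your session would be reachable at the URL this returns for your username.&lt;br /&gt;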
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
&lt;br /&gt;
=== ROOT ===&lt;br /&gt;
&lt;br /&gt;
Using ROOT from a jupyter notebook requires a few tricks.&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
'''Standard cosmology examples'''&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
'''Enabling SageMath environment in Jupyter'''&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
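Since the kernel spec is plain JSON, it can also be generated programmatically. A hedged sketch (the helper name and layout are illustrative, not an official tool), writing the same spec as above into the per-user kernels directory:&lt;br /&gt;

```python
import json
from pathlib import Path

def write_singularity_kernel(image, name="singularity", kernels_dir=None):
    """Write a kernel.json so jupyter launches ipykernel inside
    the given Singularity image."""
    base = Path(kernels_dir) if kernels_dir else Path.home() / ".local/share/jupyter/kernels"
    kernel_dir = base / name
    kernel_dir.mkdir(parents=True, exist_ok=True)
    spec = {
        "argv": [
            "singularity", "exec", "--cleanenv", str(image),
            "python", "-m", "ipykernel", "-f", "{connection_file}",
        ],
        "language": "python",
        "display_name": name,
    }
    path = kernel_dir / "kernel.json"
    path.write_text(json.dumps(spec, indent=2))
    return path
```

The &amp;quot;{connection_file}&amp;quot; placeholder must be left verbatim; jupyter substitutes it at kernel launch time.&lt;br /&gt;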
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi. In a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7)&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
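The environment-variable check above can also be done from inside a notebook (hypothetical helper, for illustration):&lt;br /&gt;

```python
import os

def assigned_gpus():
    """Parse CUDA_VISIBLE_DEVICES into a list of GPU id strings.
    An unset or empty variable means no GPUs are assigned."""
    value = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id for gpu_id in value.split(",") if gpu_id.strip()]
```

`len(assigned_gpus())` then gives the number of GPUs assigned to your job.&lt;br /&gt;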
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known problems =&lt;br /&gt;
&lt;br /&gt;
== HOME folder is full ==&lt;br /&gt;
&lt;br /&gt;
Your HOME folder located at &amp;quot;/nfs/pic.es/user/X/XYZTUV&amp;quot; has a quota assigned, usually 25GB. &lt;br /&gt;
&lt;br /&gt;
If you see the error message &amp;quot;Disk quota exceeded&amp;quot; when spawning a notebook server or a generic 500 error, it could be because your $HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; (through SSH) and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error. It means that the jupyterlab server failed. This could be for different reasons:&lt;br /&gt;
&lt;br /&gt;
* [[#HOME_folder_is_full|Your HOME folder is full]]&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive based storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time, it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This could be for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, cleaning the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We found recently that the usage of proxies within a Jupyter session might cause problems because the environment changes certain standard locations such as '''/tmp'''&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1307</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1307"/>
		<updated>2026-01-15T06:21:55Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* SageMath */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tl;dr: Connect to https://jupyter.pic.es/. Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at large scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing, a shutdown policy for sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that the test data volume you work with during a session should be small enough to be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, this one allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be made persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The text in parentheses (...) in front of your bash prompt shows the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The text in parentheses (...) in front of your bash prompt shows the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page your whatever_kernel_name appears in the dashboard. In this example '''test''' has been used for whatever_kernel_name&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask him/her for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda can use hard links between the package cache and the environments, saving disk space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
In the launcher you will also see an icon with a &amp;quot;D&amp;quot; (desktop); it starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's Jupyter environment; it makes network/web services running on the same host as the JupyterLab server accessible from outside through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
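&lt;br /&gt;
For example, a web app listening on a local port inside your session would be reached at a URL of the form above. A small sketch of building that URL (the username and port below are placeholders):&lt;br /&gt;

```python
def proxy_url(username: str, port: int) -> str:
    """Build the jupyter-server-proxy URL for a service on a local port."""
    return f"https://jupyter.pic.es/user/{username}/proxy/{port}"

# e.g. a dashboard listening on port 8050 inside the session of user "jdoe":
url = proxy_url("jdoe", 8050)
```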
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
&lt;br /&gt;
=== ROOT ===&lt;br /&gt;
&lt;br /&gt;
Using ROOT from a Jupyter notebook requires a few tricks.&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
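&lt;br /&gt;
The effect of prepending a directory to PATH can be checked with '''shutil.which'''. The sketch below uses a throwaway directory and tool name to stand in for the environment's compiler tools:&lt;br /&gt;

```python
import os
import shutil
import tempfile
from pathlib import Path

# A throwaway "bin" directory with one executable, standing in for the
# environment's compiler tools (the directory and tool name are made up).
bin_dir = Path(tempfile.mkdtemp())
tool = bin_dir / "mytool"
tool.write_text("#!/bin/sh\necho ok\n")
tool.chmod(0o755)

assert shutil.which("mytool") is None      # not on PATH yet

# Prepend the directory, exactly as done for the conda tools above.
os.environ["PATH"] = f"{bin_dir}:{os.environ['PATH']}"
found = shutil.which("mytool")             # now resolvable
```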
&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
'''Standard cosmology examples'''&lt;br /&gt;
&lt;br /&gt;
* The Friedman equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
'''Enabling SageMath environment in Jupyter'''&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that, you can enable SageMath for use in a Jupyter notebook session:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
For Python, the Singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the JupyterLab interface and the Singularity kernel should appear in the launcher tab.&lt;br /&gt;
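&lt;br /&gt;
Equivalently, the kernelspec above can be written programmatically. A sketch in Python (the image path and kernel name are placeholders):&lt;br /&gt;

```python
import json
from pathlib import Path

def install_singularity_kernel(image: str, name: str, kernels_dir: Path) -> Path:
    """Write a kernelspec that runs ipykernel inside a Singularity image."""
    spec = {
        "argv": ["singularity", "exec", "--cleanenv", image,
                 "python", "-m", "ipykernel", "-f", "{connection_file}"],
        "language": "python",
        "display_name": name,
    }
    kernel_dir = kernels_dir / name
    kernel_dir.mkdir(parents=True, exist_ok=True)
    path = kernel_dir / "kernel.json"
    path.write_text(json.dumps(spec, indent=2))
    return path

# The usual per-user location is ~/.local/share/jupyter/kernels, e.g.:
# install_singularity_kernel("/path/to/the/singularity/image.sif", "singularity",
#                            Path.home() / ".local/share/jupyter/kernels")
```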
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into one command:&lt;br /&gt;
&lt;br /&gt;
    nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
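&lt;br /&gt;
The first step above (reading CUDA_VISIBLE_DEVICES) can be sketched in Python, e.g. from inside a notebook:&lt;br /&gt;

```python
import os

def assigned_gpus() -> list:
    """GPU ids assigned to this job, parsed from CUDA_VISIBLE_DEVICES."""
    raw = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu.strip() for gpu in raw.split(",") if gpu.strip()]
```

An empty list means no GPUs are assigned to the job.&lt;br /&gt;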
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The following documentation explains&lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor]].&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the JupyterLab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the JupyterLab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known problems =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the JupyterLab server failed. This can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server; ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the JupyterLab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general, it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1306</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1306"/>
		<updated>2026-01-15T06:20:22Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at a massive scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, that means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will bring you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes, depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-extensive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
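&lt;br /&gt;
To check from a notebook whether the modules you need are importable in the current kernel, a small stdlib-only sketch (note that import names can differ from package names):&lt;br /&gt;

```python
from importlib.util import find_spec

def missing(module_names):
    """Return the module names that cannot be imported in this kernel."""
    return [name for name in module_names if find_spec(name) is None]

# Import names can differ from conda package names (e.g. scikit-image -> skimage).
print(missing(["numpy", "pandas", "skimage"]))
```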
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can repeat it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask him/her for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment, however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If '''pkgs_dirs''' and '''envs_dirs''' are on the same storage, conda will use hard links, thus optimizing the disk space.&lt;br /&gt;
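&lt;br /&gt;
The hard-link behaviour can be illustrated with a short Python sketch (the file names are made up; '''st_nlink''' counts the names pointing at the same data):&lt;br /&gt;

```python
import os
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())

# A cached package file and its copy inside an environment (names made up).
pkg = root / "pkgs" / "libfoo.so"
pkg.parent.mkdir()
pkg.write_bytes(b"x" * 1024)

env = root / "envs" / "libfoo.so"
env.parent.mkdir()
os.link(pkg, env)        # hard link: a second name, no second copy of the data

same_inode = os.stat(pkg).st_ino == os.stat(env).st_ino
link_count = os.stat(pkg).st_nlink
```

Hard links only work within one filesystem, which is why the two directories must live on the same storage.&lt;br /&gt;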
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of JupyterLab [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial JupyterLab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (desktop); it starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's Jupyter environment; it makes network/web services running on the same host as the JupyterLab server accessible from outside through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
&lt;br /&gt;
=== ROOT ===&lt;br /&gt;
&lt;br /&gt;
Using ROOT from a Jupyter notebook requires a few tricks.&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a Python shell, but it still fails from a notebook, for the following reasons:&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, g++, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = str(bin_dir) + ':' + os.environ['PATH']&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
==== Standard cosmology examples ====&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
==== Enabling SageMath environment in Jupyter ====&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The Singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
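The kernel folder and kernel.json above can also be created programmatically. A minimal sketch (the function name is illustrative, and the image path is a placeholder just like in the text above):

```python
import json
from pathlib import Path


def write_singularity_kernel(kernels_dir: Path, image: str,
                             name: str = "singularity") -> Path:
    """Write a kernel.json that launches ipykernel inside a Singularity image.

    `kernels_dir` is typically ~/.local/share/jupyter/kernels.
    Returns the path of the written kernel.json.
    """
    spec = {
        "argv": [
            "singularity", "exec", "--cleanenv", image,
            "python", "-m", "ipykernel", "-f", "{connection_file}",
        ],
        "language": "python",
        "display_name": f"{name}-kernel",
    }
    kernel_dir = kernels_dir / name
    kernel_dir.mkdir(parents=True, exist_ok=True)
    kernel_file = kernel_dir / "kernel.json"
    kernel_file.write_text(json.dumps(spec, indent=2))
    return kernel_file
```

Calling it as `write_singularity_kernel(Path.home() / ".local/share/jupyter/kernels", "/path/to/the/singularity/image.sif")` produces the same layout as the manual steps above.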
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Note their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If your job has only a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
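The id lookup described above can also be scripted from Python. A sketch (assuming, as described above, that the variable holds comma-separated integer ids) that returns the assigned GPU indexes, or an empty list when no GPU is assigned:

```python
import os


def assigned_gpus() -> list[int]:
    """Return the GPU indexes assigned to this job, as integers.

    An unset or empty CUDA_VISIBLE_DEVICES means no GPUs were assigned.
    """
    raw = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [int(part) for part in raw.split(",") if part.strip()]
```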
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The following documentation explains [[notebook_htcondor|how to run a notebook through HTCondor]].&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. Note that the log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
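The same cleanup can be scripted; a cautious sketch (the function name is illustrative) that removes only regular files directly inside the workspaces folder, mirroring the commands above:

```python
from pathlib import Path


def clean_workspaces(workspaces: Path) -> int:
    """Delete saved workspace files so jupyterlab starts with a fresh layout.

    Only regular files directly inside `workspaces` are removed.
    Returns the number of files deleted.
    """
    removed = 0
    for entry in workspaces.iterdir():
        if entry.is_file():
            entry.unlink()
            removed += 1
    return removed

# Typically: clean_workspaces(Path.home() / ".jupyter/lab/workspaces")
```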
&lt;br /&gt;
= Known problems =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the jupyterlab server failed. This can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located, because the path stored in the variable is relative. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1305</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1305"/>
		<updated>2026-01-15T06:18:08Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Known errors */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should choose a test data volume that can be processed within a session, i.e. in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the included packages:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. The initialization can be made persistent, which will make some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
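To check beforehand whether the currently active environment can be linked, you can test for the ipykernel module with the standard library; a small sketch (the function name is illustrative):

```python
import importlib.util


def has_ipykernel() -> bool:
    """True if the ipykernel module is importable in the current environment,
    i.e. `python -m ipykernel install` is expected to work."""
    return importlib.util.find_spec("ipykernel") is not None
```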
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
Assuming your_env will be installed at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: the list of directories to search for named environments, e.g. the different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: the folders where conda stores downloaded packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
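If you manage several such locations, the fragment can be generated from Python lists. A minimal sketch using plain string formatting (the function name is illustrative, and the example paths are the ones from the text; for anything more complex, a proper YAML library would be preferable):

```python
def condarc_fragment(envs_dirs, pkgs_dirs):
    """Render envs_dirs/pkgs_dirs as a minimal .condarc YAML fragment."""
    lines = ["envs_dirs:"]
    lines += [f"  - {d}" for d in envs_dirs]
    lines.append("pkgs_dirs:")
    lines += [f"  - {d}" for d in pkgs_dirs]
    return "\n".join(lines) + "\n"


fragment = condarc_fragment(
    ["/data/pic/scratch/torradeflot/envs"],
    ["/data/pic/scratch/torradeflot/pkgs"],
)
```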
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment. It allows access, from outside, to network/web services running on the same host as the jupyterlab server, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
==== Standard cosmology examples ====&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
==== Enabling SageMath environment in Jupyter ====&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
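If you prefer, the manual modification of the kernel file can be scripted. A minimal sketch (the sage launcher path is the one used on this page; treat it as an assumption for other installations):&lt;br /&gt;

```python
import json
from pathlib import Path

def patch_sage_kernel(kernel_file):
    """Rewrite a kernel.json so it launches the SageMath kernel."""
    spec = json.loads(Path(kernel_file).read_text())
    spec["argv"] = [
        "/data/astro/software/envs/sage/bin/sage",  # sage launcher from this page
        "--python", "-m", "sage.repl.ipython_kernel",
        "-f", "{connection_file}",
    ]
    spec["language"] = "sage"
    Path(kernel_file).write_text(json.dumps(spec, indent=1))
    return spec

# Usage on a PIC node (uncomment):
# patch_sage_kernel(Path.home() / ".local/share/jupyter/kernels/sage/kernel.json")
```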
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
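The two steps above can also be scripted; a minimal sketch (the image path, kernel name and target folder are placeholders or defaults, not fixed by the service):&lt;br /&gt;

```python
import json
from pathlib import Path

def install_singularity_kernel(image, name="singularity", kernels_dir=None):
    """Write a kernel spec that runs ipykernel inside a Singularity image."""
    kernels_dir = Path(kernels_dir or Path.home() / ".local/share/jupyter/kernels")
    kdir = kernels_dir / name
    kdir.mkdir(parents=True, exist_ok=True)
    spec = {
        "argv": ["singularity", "exec", "--cleanenv", str(image),
                 "python", "-m", "ipykernel", "-f", "{connection_file}"],
        "language": "python",
        "display_name": f"{name}-kernel",
    }
    (kdir / "kernel.json").write_text(json.dumps(spec, indent=2))
    return kdir / "kernel.json"

# install_singularity_kernel("/path/to/the/singularity/image.sif")
```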
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, so it also tells you how many GPUs are assigned to your job. If the variable is not set, there are no GPUs assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
    nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
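A variant that also works with several assigned GPUs can be sketched in Python. Hedged: the &amp;quot;nvidia-smi -L&amp;quot; output below is simulated, and the fallback device list is only an example.&lt;br /&gt;

```python
import os

# Simulated output of "nvidia-smi -L"; on a worker node you would obtain it
# with: subprocess.check_output(["nvidia-smi", "-L"], text=True)
smi_output = """GPU 0: Tesla V100-SXM2-32GB (UUID: GPU-aaaa)
GPU 1: Tesla V100-SXM2-32GB (UUID: GPU-bbbb)
GPU 2: Tesla V100-SXM2-32GB (UUID: GPU-cccc)"""

# CUDA_VISIBLE_DEVICES is a comma-separated list of ids, e.g. "0,2";
# the fallback value here is only for illustration
assigned = os.environ.get("CUDA_VISIBLE_DEVICES") or "0,2"
ids = {i.strip() for i in assigned.split(",") if i.strip()}

# Keep only the lines describing the GPUs assigned to this job
my_gpus = [line for line in smi_output.splitlines()
           if line.split(":", 1)[0].removeprefix("GPU ") in ids]
print("\n".join(my_gpus))
```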
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are only written once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known problems =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error meaning that the jupyterlab server failed. This can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== ROOT Installation ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, g++, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f"{bin_dir}:{os.environ['PATH']}"&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments on storage backed by hard drives can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We found recently that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1304</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1304"/>
		<updated>2026-01-15T06:17:00Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Using a Singularity image as a Jupyter kernel */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should choose a test data volume that can be processed within a single 48-hour session.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute, but it can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
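To check which kernels are currently linked (before or after removing one), '''jupyter kernelspec list''' does the job; the same information can also be gathered by scanning the per-user kernels folder, as in this sketch:&lt;br /&gt;

```python
from pathlib import Path

def list_user_kernels(kernels_dir=None):
    """Names of the kernel specs installed under the per-user kernels folder."""
    kernels_dir = Path(kernels_dir or Path.home() / ".local/share/jupyter/kernels")
    if not kernels_dir.is_dir():
        return []
    # A kernel spec is a subfolder containing a kernel.json file
    return sorted(p.name for p in kernels_dir.iterdir()
                  if (p / "kernel.json").is_file())

# print(list_user_kernels())
```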
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
For example, to install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The folders where conda packages are stored&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are in the same storage, conda will use hard links, thus saving disk space.&lt;br /&gt;
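The space saving comes from the fact that a hard link is just an additional directory entry for the same file data. A quick self-contained illustration of the mechanism (not conda-specific):&lt;br /&gt;

```python
import os
import tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "package_copy")   # stands in for a file in pkgs_dirs
env = os.path.join(tmp, "env_copy")       # stands in for the copy inside an environment

with open(pkg, "w") as f:
    f.write("package contents")

os.link(pkg, env)  # hard link: a second name for the same data, nothing is duplicated

# Both names point at the same inode, and the link count of the file is now 2
same_inode = os.stat(pkg).st_ino == os.stat(env).st_ino
link_count = os.stat(pkg).st_nlink
print(same_inode, link_count)
```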
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
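To see why the diffs explode, recall that an .ipynb file is JSON in which cell outputs are stored next to the code. A self-contained sketch with hand-built, minimal notebook dicts (the image payloads are fake, and these are not real notebook files):&lt;br /&gt;

```python
import json

def make_notebook(image_payload):
    """Minimal stand-in for an .ipynb file: outputs live next to the code."""
    return {"cells": [{
        "cell_type": "code",
        "source": ["plt.imshow(np.random.random([10, 10]))"],
        "outputs": [{"data": {"image/png": image_payload}}],  # embedded image
    }]}

# Two executions of the same code embed two different (fake) image payloads
run1 = make_notebook("iVBORw0KGgo...payload-from-run-1")
run2 = make_notebook("iVBORw0KGgo...payload-from-run-2")

code_unchanged = run1["cells"][0]["source"] == run2["cells"][0]["source"]
files_differ = json.dumps(run1) != json.dumps(run2)
print(code_unchanged, files_differ)
```

A synced .py file would contain only the source lines, so it is identical for both runs.&lt;br /&gt;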
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment. It makes network/web services running on the same host as the jupyterlab server accessible from outside through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
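For example, if you start a web app on port 8050 inside your session, its proxied address can be built like this (a sketch; the username argument is a placeholder for your PIC username):&lt;br /&gt;

```python
import getpass

def proxy_url(port, username=None):
    """Proxied address for a web service listening on `port` inside the session."""
    username = username or getpass.getuser()  # stand-in for your PIC username
    return f"https://jupyter.pic.es/user/{username}/proxy/{port}"

# e.g. a web app started on port 8050 inside the session:
print(proxy_url(8050, username="torradeflot"))
```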
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
==== Standard cosmology examples ====&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
==== Enabling SageMath environment in Jupyter ====&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The Singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
    &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If you only have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index:&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known errors =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error. It means that the jupyterlab server failed, which can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Eventually, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, fortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f"{bin_dir}:{os.environ['PATH']}"&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. This service is currently being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general, it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting to &amp;quot;jupyter.pic.es&amp;quot; again.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1303</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1303"/>
		<updated>2026-01-15T06:16:35Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Running notebooks through HTCondor */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. Usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling your code up, since it is tested in the same environment in which it would run at a mass scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will bring you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before starting. This usually takes less than a minute, but it can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the included packages:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
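&lt;br /&gt;
To check which of these packages (and which versions) are actually visible from your current kernel, a small stdlib-only helper can be used (a sketch; the queried package names are just examples):&lt;br /&gt;

```python
import importlib.metadata as md

def check_packages(names):
    """Map each package name to its installed version, or None if absent."""
    versions = {}
    for name in names:
        try:
            versions[name] = md.version(name)
        except md.PackageNotFoundError:
            versions[name] = None
    return versions

# example: query a few of the packages listed above
print(check_packages(["numpy", "pandas", "scipy"]))
```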
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. The initialization can be made persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parenthesis (...) in front of your bash prompt shows the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parenthesis (...) in front of your bash prompt shows the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path on a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
If your_env is to be installed at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of folders where conda stores downloaded packages. E.g.:&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus saving disk space.&lt;br /&gt;
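&lt;br /&gt;
Whether two copies of a package file were actually hard-linked can be checked with a small stdlib helper (a sketch; the two paths you compare are hypothetical examples, e.g. the same file under pkgs_dirs and inside an environment):&lt;br /&gt;

```python
import os

def same_file(path_a, path_b):
    """True if both paths are hard links to the same underlying file
    (same device and inode), which is how conda saves space when
    pkgs_dirs and envs_dirs live on the same filesystem."""
    sa, sb = os.stat(path_a), os.stat(path_b)
    return (sa.st_dev, sa.st_ino) == (sb.st_dev, sb.st_ino)
```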
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
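&lt;br /&gt;
Assuming the jupytext command-line tool is available in your environment, pairing could look like this (a sketch; notebook.ipynb is a placeholder name):&lt;br /&gt;

```shell
# pair the notebook with a .py script in the percent format
jupytext --set-formats ipynb,py:percent notebook.ipynb
# afterwards, keep the two representations in sync
jupytext --sync notebook.ipynb
```

You would then add only the .py file to git and keep the .ipynb file out of version control.&lt;br /&gt;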
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment and allows accessing network/web services running on the same host as the jupyterlab server from outside, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
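&lt;br /&gt;
As an illustration of the URL pattern above, a tiny helper (hypothetical, not part of the extension) that builds the public address for a service listening on a given port inside your session:&lt;br /&gt;

```python
def proxy_url(username, port):
    """Public URL for a service on `port` inside the user's Jupyter
    session, following the jupyter-server-proxy URL pattern above."""
    return f"https://jupyter.pic.es/user/{username}/proxy/{port}"

# e.g. a web app started on port 8890 inside the session
print(proxy_url("myuser", 8890))
```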
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
==== Standard cosmology examples ====&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object whose radius is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
==== Enabling SageMath environment in Jupyter ====&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The Singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
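The two steps above (creating the folder and writing '''kernel.json''') can also be scripted. A minimal Python sketch, where the function name is hypothetical and the image path is a placeholder you must replace:

```python
import json
from pathlib import Path


def write_singularity_kernel(image, base_dir, name="singularity"):
    """Write a Jupyter kernel spec that runs ipykernel inside a Singularity image."""
    kernel_dir = Path(base_dir) / name
    kernel_dir.mkdir(parents=True, exist_ok=True)
    spec = {
        "argv": [
            "singularity", "exec", "--cleanenv", str(image),
            "python", "-m", "ipykernel", "-f", "{connection_file}",
        ],
        "language": "python",
        "display_name": f"{name}-kernel",
    }
    spec_file = kernel_dir / "kernel.json"
    spec_file.write_text(json.dumps(spec, indent=1))
    return spec_file


# For the user-level kernels folder used above:
# write_singularity_kernel("/path/to/the/singularity/image.sif",
#                          Path.home() / ".local/share/jupyter/kernels")
```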
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the JupyterLab interface and the Singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If your job has a single assigned GPU, the two steps above can be combined in the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
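The CUDA_VISIBLE_DEVICES check can also be done from inside a notebook. A small generic sketch (nothing here is PIC-specific):

```python
import os


def assigned_gpus(env=os.environ):
    """Return the list of GPU ids assigned to the job.

    CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids, e.g. "2,5".
    If it is unset, no GPUs are assigned to the job.
    """
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id for gpu_id in value.split(",") if gpu_id]


print(f"{len(assigned_gpus())} GPU(s) assigned: {assigned_gpus()}")
```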
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it as a script with different configurations. The following documentation explains&lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor]].&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the JupyterLab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files will be created once the JupyterLab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known errors =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the JupyterLab server failed. This could be for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard drives can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time and does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general, it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1302</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1302"/>
		<updated>2026-01-15T06:15:50Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at large scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that the test data volume you work with during a session should be processable in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path on a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
Assuming your_env is to be installed at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If '''pkgs_dirs''' and '''envs_dirs''' are on the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
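The hard-link behaviour can be verified from Python: two paths point at the same underlying file exactly when they share device and inode numbers. A small generic sketch (the function name is our own):

```python
import os


def same_inode(path_a, path_b):
    """True if both paths are hard links to the same underlying file."""
    stat_a, stat_b = os.stat(path_a), os.stat(path_b)
    return (stat_a.st_dev, stat_a.st_ino) == (stat_b.st_dev, stat_b.st_ino)


# Example check: compare a package file under pkgs_dirs with its copy inside
# an environment; if conda used a hard link, same_inode(...) returns True.
```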
&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions are installed to provide additional functionalities&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment and allows accessing network/web services running on the same host as the jupyterlab server from outside, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
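As an illustration, a service started inside the session, e.g. the small HTTP server sketched below, would then be reachable (assuming the default proxy configuration) at https://jupyter.pic.es/user/{username}/proxy/8000/:

```python
import http.server
import socketserver
import threading


def start_server(port=8000):
    """Serve the current directory in a background thread; return the server."""
    httpd = socketserver.TCPServer(
        ("", port), http.server.SimpleHTTPRequestHandler
    )
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd


# httpd = start_server(8000)   # then open .../proxy/8000/ in the browser
# httpd.shutdown()             # stop the server when done
```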
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Tips &amp;amp; Tricks =&lt;br /&gt;
&lt;br /&gt;
== Software of particular interest ==&lt;br /&gt;
=== SageMath ===&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
==== Standard cosmology examples ====&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
==== Enabling SageMath environment in Jupyter ====&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Dask ==&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Using a Singularity image as a Jupyter kernel ==&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The Singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
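The two steps above (creating the folder and writing '''kernel.json''') can also be scripted. A minimal Python sketch, where the function name is hypothetical and the image path is a placeholder you must replace:

```python
import json
from pathlib import Path


def write_singularity_kernel(image, base_dir, name="singularity"):
    """Write a Jupyter kernel spec that runs ipykernel inside a Singularity image."""
    kernel_dir = Path(base_dir) / name
    kernel_dir.mkdir(parents=True, exist_ok=True)
    spec = {
        "argv": [
            "singularity", "exec", "--cleanenv", str(image),
            "python", "-m", "ipykernel", "-f", "{connection_file}",
        ],
        "language": "python",
        "display_name": f"{name}-kernel",
    }
    spec_file = kernel_dir / "kernel.json"
    spec_file.write_text(json.dumps(spec, indent=1))
    return spec_file


# For the user-level kernels folder used above:
# write_singularity_kernel("/path/to/the/singularity/image.sif",
#                          Path.home() / ".local/share/jupyter/kernels")
```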
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the JupyterLab interface and the Singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
== GPUs ==&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If your job has a single assigned GPU, the two steps above can be combined in the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
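The CUDA_VISIBLE_DEVICES check can also be done from inside a notebook. A small generic sketch (nothing here is PIC-specific):

```python
import os


def assigned_gpus(env=os.environ):
    """Return the list of GPU ids assigned to the job.

    CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids, e.g. "2,5".
    If it is unset, no GPUs are assigned to the job.
    """
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id for gpu_id in value.split(",") if gpu_id]


print(f"{len(assigned_gpus())} GPU(s) assigned: {assigned_gpus()}")
```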
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Code samples ==&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
== Running notebooks through HTCondor ==&lt;br /&gt;
After developing a notebook, you might want to run it with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. Note that the log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known errors =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the jupyterlab server failed. This can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to compare your usage against your quota. If it is full, you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server; ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate /data/pic/scratch/torradeflot/envs/mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a python shell, but it does not work from a notebook, for the following reasons:&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f'{bin_dir}:{os.environ[&amp;quot;PATH&amp;quot;]}'&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
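As a quick sanity check after setting those variables, the following minimal sketch verifies that the toolchain is discoverable. The compiler names are assumptions (typical conda-forge names); replace them with whatever your environment actually provides.&lt;br /&gt;

```python
import os
import shutil

# Check that the compiler tools can be found on the adjusted PATH.
# The default names here are assumptions; adjust to your environment.
def check_toolchain(names=("gcc", "g++")):
    found = {name: shutil.which(name) for name in names}
    missing = [name for name, path in found.items() if path is None]
    return found, missing

found, missing = check_toolchain()
print("found:", found)
print("missing:", missing)
print("CONDA_BUILD_SYSROOT =", os.environ.get("CONDA_BUILD_SYSROOT"))
```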
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive based storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store your environments. This service is currently being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general, it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1301</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1301"/>
		<updated>2026-01-15T06:07:38Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Jupytext */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily thought for code developing or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code since it is being tested in the same environment in which it would run on a mass scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly thought for development and small scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means that you should estimate the volume of test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-extensive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
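If you want to check which of these packages (and which versions) are visible from the kernel you are running, a minimal sketch using only the standard library:&lt;br /&gt;

```python
from importlib.metadata import version, PackageNotFoundError

# Report the installed version of each package, or flag it as missing.
# The package names passed in mirror part of the master environment list.
def report_versions(packages):
    result = {}
    for name in packages:
        try:
            result[name] = version(name)
        except PackageNotFoundError:
            result[name] = "not installed"
    return result

print(report_versions(["numpy", "pandas", "scipy"]))
```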
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be made persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parenthesis (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parenthesis (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
Assuming your_env is to be installed at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: the folders where conda stores downloaded packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
    - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
    - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus saving disk space.&lt;br /&gt;
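The hard-link behaviour can be illustrated with a small standard-library sketch (the file and directory names below are made up for the demonstration):&lt;br /&gt;

```python
import os
import tempfile

# Hard-linked files share one inode, so the data exists on disk only once;
# this is how conda avoids duplicating package files between pkgs_dirs and
# envs_dirs when both live on the same filesystem.
with tempfile.TemporaryDirectory() as tmp:
    pkg_file = os.path.join(tmp, "pkgs", "libfoo.so")   # hypothetical names
    env_file = os.path.join(tmp, "envs", "libfoo.so")
    os.makedirs(os.path.dirname(pkg_file))
    os.makedirs(os.path.dirname(env_file))
    with open(pkg_file, "w") as f:
        f.write("payload")
    os.link(pkg_file, env_file)                         # hard link, not a copy
    print("same inode:", os.stat(pkg_file).st_ino == os.stat(env_file).st_ino)
    print("link count:", os.stat(pkg_file).st_nlink)    # -> 2
```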
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Using a Singularity image as a Jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface, and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, so it also tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If only a single GPU is assigned, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
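In a notebook, the same check can be done from Python; a minimal sketch using only the standard library:&lt;br /&gt;

```python
import os

# CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids; if it is
# unset or empty, no GPUs are assigned to the job.
def assigned_gpus():
    value = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id.strip() for gpu_id in value.split(",") if gpu_id.strip()]

gpus = assigned_gpus()
print(f"{len(gpus)} GPU(s) assigned: {gpus}")
```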
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop); this one starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
'''Example'''&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
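The underlying mechanics can be sketched with plain JSON; the payload strings below are made-up stand-ins for the embedded base64 images:&lt;br /&gt;

```python
import json

# A .ipynb file embeds cell *outputs* alongside the source, so re-running a
# cell that produces a random image changes the file even though the code
# did not change. A jupytext-paired script stores only the source.
def notebook_json(image_payload):
    return json.dumps({
        "cells": [{
            "source": ["plt.imshow(np.random.random([10, 10]))"],
            "outputs": [{"data": {"image/png": image_payload}}],
        }]
    }, indent=1)

run1 = notebook_json("iVBORw0K...run1")  # stand-in for a base64 image
run2 = notebook_json("iVBORw0K...run2")  # a re-execution: different image
print("notebook file unchanged between runs:", run1 == run2)  # False
same_source = (json.loads(run1)["cells"][0]["source"]
               == json.loads(run2)["cells"][0]["source"])
print("paired script (source only) unchanged:", same_source)  # True
```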
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repo management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment; it lets you access network/web services running on the same host as the jupyterlab server from outside, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
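For example, to work out the external URL for a service listening on a port inside your session (the username and port below are placeholders; substitute your own):&lt;br /&gt;

```python
# Build the external URL under which a service listening on a local port
# becomes reachable through jupyter-server-proxy.
def proxy_url(username: str, port: int) -> str:
    return f"https://jupyter.pic.es/user/{username}/proxy/{port}"

# e.g. a dashboard app listening on port 8050 inside the session:
print(proxy_url("neissner", 8050))
# -> https://jupyter.pic.es/user/neissner/proxy/8050
```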
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Running notebooks through HTCondor =&lt;br /&gt;
After developing a notebook, you might want to run it with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. Note that the log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known errors =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the jupyterlab server failed. This can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to compare your usage against your quota. If it is full, you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This usually means there was an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a Python shell, but it does not work from a notebook. The reasons:&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f"{bin_dir}:{os.environ['PATH']}"&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments on hard-drive-backed storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store your environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not indicate any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session may cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1300</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1300"/>
		<updated>2026-01-15T06:06:22Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but with the advantage of developing and testing your code on different hardware configurations. It also makes scaling up easier, since the code is tested in the same environment in which it would run at a mass scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should choose a test data volume that can be processed within a session of less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-extensive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
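You can also check for the module from Python before running the command; a minimal standard-library sketch:&lt;br /&gt;
&lt;br /&gt;
```python
import importlib.util

def has_module(name):
    """Return True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# False here means the environment needs: pip install ipykernel
print(has_module('ipykernel'))
```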
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although no longer available in Jupyter, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
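The kernels linked this way are just directories under '''~/.local/share/jupyter/kernels'''. A standard-library sketch that lists them; the demonstration below uses a throwaway directory instead of the real Jupyter data directory:&lt;br /&gt;
&lt;br /&gt;
```python
import tempfile
from pathlib import Path

def list_user_kernels(jupyter_data_dir):
    """Return the names of kernel specs under <data_dir>/kernels,
    i.e. directories that contain a kernel.json file."""
    kernels_dir = Path(jupyter_data_dir) / 'kernels'
    if not kernels_dir.is_dir():
        return []
    return sorted(p.name for p in kernels_dir.iterdir()
                  if (p / 'kernel.json').is_file())

# Demonstration on a temporary directory (path layout only, no Jupyter needed):
with tempfile.TemporaryDirectory() as tmp:
    spec_dir = Path(tmp) / 'kernels' / 'whatever_kernel_name'
    spec_dir.mkdir(parents=True)
    (spec_dir / 'kernel.json').write_text('{}')
    print(list_user_kernels(tmp))  # ['whatever_kernel_name']
```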
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where conda stores downloaded packages. E.g.:&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
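Whether hard links are possible can be checked beforehand: they only work within a single filesystem. A minimal standard-library sketch (the paths in a real check would be your envs_dirs and pkgs_dirs locations):&lt;br /&gt;
&lt;br /&gt;
```python
import os

def same_filesystem(path_a, path_b):
    """Hard links are only possible within one filesystem, so compare the
    device ids reported by stat() for the two locations."""
    return os.stat(path_a).st_dev == os.stat(path_b).st_dev

# Substitute your actual envs_dirs / pkgs_dirs locations here:
print(same_filesystem('.', '.'))  # True: a path shares a filesystem with itself
```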
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Using a Singularity image as a Jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements. These depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
The Singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
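The two manual steps above (create the folder, write '''kernel.json''') can also be scripted. A standard-library sketch; the image path and kernel name are placeholders, and the demonstration writes into a throwaway directory rather than the real ~/.local/share/jupyter:&lt;br /&gt;
&lt;br /&gt;
```python
import json
import tempfile
from pathlib import Path

def write_singularity_kernel(kernels_dir, name, image):
    """Create a Jupyter kernel spec that runs python -m ipykernel inside a
    Singularity image; returns the path of the written kernel.json."""
    kernel_dir = Path(kernels_dir) / name
    kernel_dir.mkdir(parents=True, exist_ok=True)
    spec = {
        "argv": ["singularity", "exec", "--cleanenv", str(image),
                 "python", "-m", "ipykernel", "-f", "{connection_file}"],
        "language": "python",
        "display_name": name,
    }
    path = kernel_dir / "kernel.json"
    path.write_text(json.dumps(spec, indent=1))
    return path

# Demonstration on a temporary directory (placeholder image path):
with tempfile.TemporaryDirectory() as tmp:
    spec_file = write_singularity_kernel(tmp, "singularity-kernel",
                                         "/path/to/the/singularity/image.sif")
    print(spec_file.name)  # kernel.json
```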
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If only a single GPU is assigned, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
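The CUDA_VISIBLE_DEVICES parsing described above can be sketched in Python (standard library only; nvidia-smi itself is not invoked here):&lt;br /&gt;
&lt;br /&gt;
```python
import os

def assigned_gpu_ids(env=None):
    """Return the list of GPU ids assigned to this job, parsed from the
    CUDA_VISIBLE_DEVICES variable; an empty list means no GPU is assigned."""
    if env is None:
        env = os.environ
    raw = env.get('CUDA_VISIBLE_DEVICES', '')
    return [part.strip() for part in raw.split(',') if part.strip()]

print(assigned_gpu_ids({'CUDA_VISIBLE_DEVICES': '2,5'}))  # ['2', '5']
print(assigned_gpu_ids({}))                               # []
```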
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here]; there you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Sidecar Apps: Remote desktop &amp;amp; Visual Studio IDE ==&lt;br /&gt;
Among the icons you will see one with a &amp;quot;D&amp;quot; (desktop); it starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
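The noisy-diff problem is visible directly in the notebook JSON. A standard-library sketch that strips outputs and execution counts, which is roughly the information a paired text file never contains (the minimal cell below is a hypothetical example following the nbformat 4 layout):&lt;br /&gt;
&lt;br /&gt;
```python
import json

def strip_outputs(notebook_json):
    """Remove cell outputs and execution counts from an nbformat-4 notebook,
    leaving only the source that would be synced to a paired text file."""
    nb = json.loads(notebook_json)
    for cell in nb.get('cells', []):
        if cell.get('cell_type') == 'code':
            cell['outputs'] = []
            cell['execution_count'] = None
    return json.dumps(nb, indent=1)

# A hypothetical one-cell notebook whose output embeds changing data:
nb = {'cells': [{'cell_type': 'code', 'source': ['1 + 1'],
                 'outputs': [{'data': {'text/plain': ['2']}}],
                 'execution_count': 3}]}
print(strip_outputs(json.dumps(nb)))
```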
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment. It allows access, from outside, to network/web services running on the same host as the jupyterlab server, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Running notebooks through HTCondor =&lt;br /&gt;
After developing a notebook, you might want to run it with different configurations. The following documentation explains [[notebook_htcondor|how to run a notebook through HTCondor]].&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known errors =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error meaning that the jupyterlab server failed. Possible causes:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This usually means there was an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a Python shell, but it does not work from a notebook. The reasons:&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f'{bin_dir}:{os.environ[&amp;quot;PATH&amp;quot;]}'&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout because the output didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not indicate any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For it to work correctly, please create the proxy as follows (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1299</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1299"/>
		<updated>2026-01-15T06:00:18Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;tldr; Connect to https://jupyter.pic.es/ . Enjoy!&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at large scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. You will then be taken to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
== Virtual desktop ==&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, the icon of Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under '''~/env'''. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
For example, to install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of folders where conda packages are stored&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
    - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
    - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are in the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
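Putting the two fragments above together, a complete '''$HOME/.condarc''' might look like this (the paths are examples; use your own storage locations):&lt;br /&gt;

```
envs_dirs:
  - /data/pic/scratch/torradeflot/envs
  - /data/astro/scratch/torradeflot/envs
pkgs_dirs:
  - /data/pic/scratch/torradeflot/pkgs
```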
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for use in a Jupyter notebook session:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Using a Singularity image as a Jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
In some projects, the software stack is provided as a Singularity image. In such cases, it can be convenient to use this image directly as a Jupyter kernel, allowing notebooks on jupyter.pic.es to run within the same controlled software environment.&lt;br /&gt;
&lt;br /&gt;
To be used as a Jupyter kernel, the Singularity image must satisfy certain requirements, which depend on the programming language used inside the notebook.&lt;br /&gt;
&lt;br /&gt;
For Python, the image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
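As a sketch, the same kernel spec can also be written from Python instead of editing the file by hand (the image path below is a placeholder you must replace):&lt;br /&gt;

```python
import json
from pathlib import Path

# Placeholder: replace with the actual path to your Singularity image.
image = "/path/to/the/singularity/image.sif"

# Folder that hosts the kernel definition, as described above.
kernel_dir = Path.home() / ".local/share/jupyter/kernels/singularity"
kernel_dir.mkdir(parents=True, exist_ok=True)

# Same content as the kernel.json shown above.
spec = {
    "argv": [
        "singularity", "exec", "--cleanenv", image,
        "python", "-m", "ipykernel", "-f", "{connection_file}",
    ],
    "language": "python",
    "display_name": "singularity-kernel",
}
(kernel_dir / "kernel.json").write_text(json.dumps(spec, indent=1))
```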
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs that are assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep  $CUDA_VISIBLE_DEVICES&lt;br /&gt;
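The CUDA_VISIBLE_DEVICES check can also be sketched in Python from inside a notebook (assigned_gpus is a hypothetical helper; the value set below is only illustrative):&lt;br /&gt;

```python
import os

def assigned_gpus():
    """Return the list of GPU ids assigned to this job, [] if none."""
    ids = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id for gpu_id in ids.split(",") if gpu_id]

# Illustrative value: pretend HTCondor assigned GPUs 2 and 5 to the job.
os.environ["CUDA_VISIBLE_DEVICES"] = "2,5"
print(assigned_gpus())  # ['2', '5']
```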
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== Jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. Doing a '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== Git ==&lt;br /&gt;
Sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== Variable inspector ==&lt;br /&gt;
Variable Inspector provides an interactive interface for inspecting the current state of variables in a JupyterLab session. It allows users to view variable names, types, shapes, and values in a structured table, facilitating exploratory analysis and debugging workflows similar to variable inspection tools available in environments such as MATLAB.&lt;br /&gt;
&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== Jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment and allows access from outside to network/web services running on the same host as the jupyterlab server, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
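As an illustrative sketch (not an official example), the snippet below starts a minimal Python web service on a free port; inside a session, that service would then be reachable through the proxy URL pattern shown above:&lt;br /&gt;

```python
import http.server
import socketserver
import threading
import urllib.request

# Start a minimal web service on an OS-assigned free port.
server = socketserver.TCPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Inside a PIC Jupyter session, this service would now be reachable at
# https://jupyter.pic.es/user/{username}/proxy/{port}/ via the proxy extension.
# Here we just check it locally.
status = urllib.request.urlopen(f"http://127.0.0.1:{port}/").getcode()
print(status)
server.shutdown()
```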
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Running notebooks through HTCondor =&lt;br /&gt;
After developing a notebook, you might want to run it with different configurations. The&lt;br /&gt;
following documentation explains  &lt;br /&gt;
&lt;br /&gt;
[[notebook_htcondor|how to run a notebook through HTCondor.]]&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
= Known errors =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error meaning that the jupyterlab server failed. Possible causes include:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user can not access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f'{bin_dir}:{os.environ[&amp;quot;PATH&amp;quot;]}'&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout because the output didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not indicate any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This could be for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
== Proper usage of X509 based proxies ==&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the session environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located, because the relative path only resolves from the directory where the proxy was created. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1242</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1242"/>
		<updated>2025-09-15T06:38:10Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Python virtual environments */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means that the test data volume you work with during a session should be processable in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (Desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Prebuilt environments == &lt;br /&gt;
&lt;br /&gt;
PIC's jupyterhub service comes with a collection of prebuilt environments located at '''/data/jupyter/software/envs'''.&lt;br /&gt;
&lt;br /&gt;
The master environment located at '''/data/jupyter/software/envs/master''' is the one used to start the jupyterlab service and the default for new notebooks.&lt;br /&gt;
&lt;br /&gt;
This is a non-exhaustive list of the packages included:&lt;br /&gt;
  - astropy=6.1.0&lt;br /&gt;
  - bokeh=3.4.1&lt;br /&gt;
  - dash=2.17.0&lt;br /&gt;
  - dask=2024.5.1&lt;br /&gt;
  - findspark=2.0.1&lt;br /&gt;
  - matplotlib=3.8.4&lt;br /&gt;
  - numpy=1.26.4&lt;br /&gt;
  - pandas=2.2.2&lt;br /&gt;
  - pillow=10.3.0&lt;br /&gt;
  - plotly=5.22.0&lt;br /&gt;
  - pyhive=0.7.0&lt;br /&gt;
  - python=3.12&lt;br /&gt;
  - pywavelets=1.4.1&lt;br /&gt;
  - scikit-image=0.22.0&lt;br /&gt;
  - scikit-learn=1.5.0&lt;br /&gt;
  - scipy=1.13.1&lt;br /&gt;
  - seaborn=0.13.2&lt;br /&gt;
  - statsmodels=0.14.2&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be made persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where conda stores downloaded packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
    - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
    - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are in the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
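&lt;br /&gt;
As an illustration of why this saves space (a self-contained sketch with made-up file and directory names, not part of any PIC setup): a hard link shares the inode, and therefore the disk blocks, of the original file, so a file linked from the package cache into an environment costs no extra storage.&lt;br /&gt;

```python
# A hard link shares the inode (and disk blocks) of the original file.
# Directory and file names here are made up for the demo.
import os
import tempfile

pkgs_dir = tempfile.mkdtemp(prefix="pkgs_")  # stands in for pkgs_dirs
envs_dir = tempfile.mkdtemp(prefix="envs_")  # stands in for envs_dirs

pkg_file = os.path.join(pkgs_dir, "libfoo.so")
with open(pkg_file, "w") as f:
    f.write("shared library contents")

env_file = os.path.join(envs_dir, "libfoo.so")
os.link(pkg_file, env_file)  # only possible when both are on the same filesystem

same_inode = os.stat(pkg_file).st_ino == os.stat(env_file).st_ino
print("hard link shares inode:", same_inode)
```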
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the session environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located, because the relative path only resolves from the directory where the proxy was created. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motions for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
Dask supports parallel computations in Python. The PIC Jupyterlab has an extension for launching&lt;br /&gt;
your own Dask clusters. For more information, see [[Dask|Dask documentation]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project may be provided in the form of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable is not set, there are no GPUs assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep  $CUDA_VISIBLE_DEVICES&lt;br /&gt;
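&lt;br /&gt;
If you prefer doing the check from Python (for example inside the notebook itself), the parsing of CUDA_VISIBLE_DEVICES described above can be sketched as follows; this is an illustration, not an official PIC tool:&lt;br /&gt;

```python
# Count the GPUs assigned to the job from CUDA_VISIBLE_DEVICES,
# a comma-separated list of GPU ids that is unset when no GPU is assigned.
import os

raw = os.environ.get("CUDA_VISIBLE_DEVICES", "")
gpu_ids = [tok.strip() for tok in raw.split(",") if tok.strip()]
print(f"{len(gpu_ids)} GPU(s) assigned: {gpu_ids}")
```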
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment. It allows network/web services running on the same host as the jupyterlab server to be accessed from outside through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
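&lt;br /&gt;
As a minimal sketch of the kind of service the proxy exposes (the handler and response message below are invented for the example), the following Python snippet starts a throwaway HTTP service on a free local port; inside a Jupyter session, that service would then be reachable at &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}/&amp;quot;:&lt;br /&gt;

```python
# Start a minimal HTTP service on a free local port. Inside a Jupyter
# session this service would be reachable through the hub under the
# /user/{username}/proxy/{port}/ URL prefix.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"hello from the proxied service")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0: pick a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Locally the service answers on the raw port; through the hub the same
# service is served under the /proxy/{port}/ prefix instead.
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read().decode()
print(body)
server.shutdown()
```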
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error. It means that the jupyterlab server failed, which could be for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs the quota. If it is full, you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there was an error starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f'{bin_dir}:' + os.environ['PATH']&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-backed storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This could be for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1230</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1230"/>
		<updated>2025-06-06T07:11:04Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Jupyterlab user guide */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of your code, since it is tested in the same environment in which it would run at a mass scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should choose a test data volume that can be processed within a single 48-hour session.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which will make some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name will appear in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are in the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
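Putting the two parameters together, a minimal '''~/.condarc''' could look like the fragment below; the directory paths are the examples shown above, so substitute your own scratch locations.

```yaml
# Example ~/.condarc; directory paths are illustrative, use your own.
envs_dirs:
  - /data/astro/scratch/torradeflot/envs
pkgs_dirs:
  - /data/aai/scratch_ssd/torradeflot/pkgs
  - /data/astro/scratch/torradeflot/pkgs
```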
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We found recently that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motions for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home directory, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
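If you prefer to script this edit, a minimal sketch is shown below. It writes the kernel spec shown above; the sage environment path is taken from the activation commands earlier in this section, so adjust it if yours differs.

```python
import json
from pathlib import Path

# Location of the kernel spec created by "ipykernel install" above.
path = Path.home() / ".local/share/jupyter/kernels/sage/kernel.json"

# The kernel.json content shown above, as a Python dict.
spec = {
    "argv": [
        "/data/astro/software/envs/sage/bin/sage",
        "--python", "-m", "sage.repl.ipython_kernel",
        "-f", "{connection_file}",
    ],
    "display_name": "sage",
    "language": "sage",
    "metadata": {"debugger": True},
}

# Overwrite the generated file with the modified spec.
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(spec, indent=1))
```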
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
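The two steps above can also be scripted, as in the sketch below. The image path is a placeholder from the example above; replace it with your actual .sif file.

```python
import json
from pathlib import Path

# Placeholder image path; replace with your actual .sif file.
image = "/path/to/the/singularity/image.sif"

# Folder that hosts the kernel definition, as created above.
kernel_dir = Path.home() / ".local/share/jupyter/kernels/singularity"
kernel_dir.mkdir(parents=True, exist_ok=True)

# The kernel.json content shown above.
spec = {
    "argv": [
        "singularity", "exec", "--cleanenv", image,
        "python", "-m", "ipykernel", "-f", "{connection_file}",
    ],
    "language": "python",
    "display_name": "singularity-kernel",
}
(kernel_dir / "kernel.json").write_text(json.dumps(spec, indent=2))
```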
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The environment variable contains a comma-separated list of GPU ids, so it also tells you how many GPUs are assigned to your job. If the variable does not exist, there are no GPUs assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* the two steps above can be combined into the following command (when you have a single assigned GPU):&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep  $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
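From a notebook, the CUDA_VISIBLE_DEVICES check described above can be sketched in Python; the example assignments are hypothetical.

```python
import os

def assigned_gpus(env=None):
    """Return the list of GPU ids assigned to the job.

    Parses CUDA_VISIBLE_DEVICES as described above: a comma-separated
    list of GPU ids, or unset/empty when no GPU is assigned.
    """
    if env is None:
        env = os.environ
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [int(i) for i in value.split(",") if i.strip()]

# Hypothetical assignment of GPUs 2 and 5:
print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "2,5"}))  # [2, 5]
print(assigned_gpus({}))  # [] -> no GPUs assigned to the job
```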
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here]; there you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionalities:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
== jupyter server proxy ==&lt;br /&gt;
&lt;br /&gt;
This extension is installed in PIC's jupyter environment. It allows access from outside to network/web services running on the same host as the jupyterlab server, through the &amp;quot;https://jupyter.pic.es/user/{username}/proxy/{port}&amp;quot; URL.&lt;br /&gt;
&lt;br /&gt;
Full documentation here: https://jupyter-server-proxy.readthedocs.io/en/latest/index.html&lt;br /&gt;
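As a sketch of the URL scheme quoted above, the helper below builds the proxied address; &quot;username&quot; and &quot;port&quot; are placeholders for your PIC user and the local port your service listens on.

```python
# Sketch of the proxy URL scheme quoted above; "username" and "port"
# are placeholders for your PIC user and the service's local port.
def proxy_url(username: str, port: int) -> str:
    return f"https://jupyter.pic.es/user/{username}/proxy/{port}"

# e.g. a web service listening on port 8050 inside the session:
print(proxy_url("jdoe", 8050))  # https://jupyter.pic.es/user/jdoe/proxy/8050
```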
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error meaning that the jupyterlab server failed. This could happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
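As a quick way to inspect the most recent log, the one-liner below can be used; the ~/.jupyter location is from above, but the exact log file names are an assumption and may differ on your setup.

```shell
# Show the tail of the most recent jupyterlab log file, if any exist.
# Log file names under ~/.jupyter are an assumption; adjust the glob.
ls -t ~/.jupyter/*.log 2>/dev/null | head -n 1 | xargs -r tail -n 50
```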
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, fortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive based storage can be extremely slow to load. If you encounter this problem, please ask your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. This service is currently being tested and deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This can happen for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting again to &amp;quot;jupyter.pic.es&amp;quot;.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1226</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1226"/>
		<updated>2025-05-19T10:22:20Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of your code, since it is tested in the same environment in which it would run at a mass scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should choose a test data volume that can be processed within a single 48-hour session.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options:&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can repeat it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
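&lt;br /&gt;
Whether '''ipykernel''' is importable in the currently active environment can also be checked programmatically. A minimal stdlib-only Python sketch (the helper name is made up for illustration):&lt;br /&gt;
&lt;br /&gt;
```python
import importlib.util

def has_module(name):
    # find_spec returns None when the module cannot be imported
    # in the currently active environment
    return importlib.util.find_spec(name) is not None

print(has_module("json"))       # stdlib module, always available -> True
print(has_module("ipykernel"))  # False if ipykernel is missing from the env
```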
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
Assuming your_env is to be installed at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
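&lt;br /&gt;
The same creation step can also be driven from Python itself via the stdlib venv module, which is occasionally handy in setup scripts. A sketch, not a replacement for the commands above (with_pip=False keeps it fast and offline; you would normally keep pip enabled):&lt;br /&gt;
&lt;br /&gt;
```python
import os
import tempfile
import venv

# Programmatic equivalent of `python3 -m venv your_env`.
with tempfile.TemporaryDirectory() as d:
    env_dir = os.path.join(d, "your_env")
    venv.EnvBuilder(with_pip=False).create(env_dir)
    # every venv gets a pyvenv.cfg marker file at its root
    created = os.path.isfile(os.path.join(env_dir, "pyvenv.cfg"))

print(created)  # → True
```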
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
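&lt;br /&gt;
The hard-link optimisation can be illustrated with a small stdlib-only Python sketch (file names are made up for illustration):&lt;br /&gt;
&lt;br /&gt;
```python
import os
import tempfile

# Create a file and a hard link to it: both names point to the same inode,
# so the data is stored only once. This is the mechanism conda uses to save
# disk space when pkgs_dirs and envs_dirs share a filesystem.
with tempfile.TemporaryDirectory() as d:
    pkg_copy = os.path.join(d, "pkg_copy")
    env_copy = os.path.join(d, "env_copy")
    with open(pkg_copy, "w") as f:
        f.write("package payload")
    os.link(pkg_copy, env_copy)          # hard link, not a data copy
    link_count = os.stat(env_copy).st_nlink

print(link_count)  # → 2 (two directory entries, one inode)
```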
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We found recently that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because a relative path was used. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
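&lt;br /&gt;
The fix above amounts to replacing the relative proxy path with an absolute one. A minimal Python sketch of the expansion (the user name and uid are illustrative placeholders):&lt;br /&gt;
&lt;br /&gt;
```python
import os

def absolute_proxy_path(relative_proxy, working_dir):
    # X509_USER_PROXY should hold an absolute path: inside the Jupyter
    # session the working directory and /tmp may differ from where the
    # proxy was created, so a relative "./x509up_u..." cannot be resolved.
    return os.path.normpath(os.path.join(working_dir, relative_proxy))

print(absolute_proxy_path("./x509up_u1234", "/nfs/pic.es/user/n/neissner"))
# → /nfs/pic.es/user/n/neissner/x509up_u1234
```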
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motions for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project may be provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
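&lt;br /&gt;
Rather than typing the JSON by hand, the kernel spec can be generated with a short, stdlib-only Python sketch (the image path is the same placeholder as above; writing the result to the kernel.json file is left to the reader):&lt;br /&gt;
&lt;br /&gt;
```python
import json

def singularity_kernel_spec(image_path, display_name):
    # Mirrors the kernel.json shown above; image_path and display_name
    # are placeholders you would adapt to your own image and kernel name.
    return {
        "argv": [
            "singularity", "exec", "--cleanenv", image_path,
            "python", "-m", "ipykernel", "-f", "{connection_file}",
        ],
        "language": "python",
        "display_name": display_name,
    }

spec = singularity_kernel_spec("/path/to/the/singularity/image.sif",
                               "singularity-kernel")
print(json.dumps(spec, indent=2))
```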
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have only a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep  $CUDA_VISIBLE_DEVICES&lt;br /&gt;
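&lt;br /&gt;
The check can also be done programmatically; a minimal Python sketch of the parsing (the helper name is made up for illustration):&lt;br /&gt;
&lt;br /&gt;
```python
import os

def assigned_gpus(env=None):
    # CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids;
    # an unset or empty variable means no GPUs are assigned to the job.
    env = os.environ if env is None else env
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id for gpu_id in value.split(",") if gpu_id]

print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "2,5"}))  # → ['2', '5']
print(assigned_gpus({}))                               # → []
```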
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
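&lt;br /&gt;
The effect can be illustrated with a toy sketch of the notebook structure (a real .ipynb file carries more metadata; this is a deliberate simplification):&lt;br /&gt;
&lt;br /&gt;
```python
def code_sources_only(notebook):
    # Keep only the code cell sources, dropping outputs and metadata --
    # roughly what a jupytext-paired .py file contains.
    return [c["source"] for c in notebook["cells"] if c["cell_type"] == "code"]

# Two runs of the same cell: identical code, different embedded image blobs.
run_a = {"cells": [{"cell_type": "code",
                    "source": "plt.imshow(np.random.random([10, 10]))",
                    "outputs": ["<base64 png A>"]}]}
run_b = {"cells": [{"cell_type": "code",
                    "source": "plt.imshow(np.random.random([10, 10]))",
                    "outputs": ["<base64 png B>"]}]}

print(run_a == run_b)                                        # → False
print(code_sources_only(run_a) == code_sources_only(run_b))  # → True
```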
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error, meaning that the jupyterlab server failed. This could happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check the usage vs quota. If it is full you'll have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files will be created once the jupyter lab server job is finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shutdown the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;br /&gt;
&lt;br /&gt;
== 403 : Forbidden. XSRF cookie does not match POST argument ==&lt;br /&gt;
&lt;br /&gt;
The value of the &amp;quot;_xsrf&amp;quot; cookie sent by the browser does not match the expected value. This could be for many reasons: temporary high load on the server, race conditions, temporary network instability, many open tabs in the browser, etc.&lt;br /&gt;
&lt;br /&gt;
In general it can be solved by closing all tabs pointing to &amp;quot;jupyter.pic.es&amp;quot;, clearing the cookies and connecting back to &amp;quot;jupyter.pic.es&amp;quot;.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Spark_on_Hadoop&amp;diff=1220</id>
		<title>Spark on Hadoop</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Spark_on_Hadoop&amp;diff=1220"/>
		<updated>2025-05-08T10:24:03Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* How to configure Kerberos authentication with Firefox */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Monitoring your Spark jobs =&lt;br /&gt;
&lt;br /&gt;
Spark jobs submitted to Yarn can be monitored through the web interface at http://hsrv02.pic.es:8088&lt;br /&gt;
&lt;br /&gt;
You need to have a VPN and Kerberos authentication configured in your browser if you want to access it.&lt;br /&gt;
&lt;br /&gt;
== How to configure Kerberos authentication with Firefox ==&lt;br /&gt;
&lt;br /&gt;
Tested on an Ubuntu 22 server with Firefox installed with apt. Configuring Kerberos authentication with other browser &amp;amp; OS combinations may differ or not be supported. In particular, on Ubuntu 22, Firefox installed with snap won't work.&lt;br /&gt;
&lt;br /&gt;
You need to have a Kerberos client installed on your PC.&lt;br /&gt;
&lt;br /&gt;
Update the &amp;quot;/etc/krb5.conf&amp;quot; file to contain the PIC realm. Do not remove additional content in the &amp;quot;krb5.conf&amp;quot; file:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    [realms]&lt;br /&gt;
    	PIC.ES = {&lt;br /&gt;
    		kdc = ipa01.pic.es:88&lt;br /&gt;
    		master_kdc = ipa01.pic.es:88&lt;br /&gt;
    		admin_server = ipa01.pic.es:749&lt;br /&gt;
    		kpasswd_server = ipa01.pic.es:464&lt;br /&gt;
    		default_domain = pic.es&lt;br /&gt;
    	}&lt;br /&gt;
     &lt;br /&gt;
    [domain_realm]&lt;br /&gt;
      .pic.es = PIC.ES&lt;br /&gt;
      pic.es = PIC.ES&lt;br /&gt;
      hsrv03.pic.es = PIC.ES&lt;br /&gt;
&lt;br /&gt;
Update the Firefox configuration&lt;br /&gt;
* Open Firefox&lt;br /&gt;
* Access the configuration in &amp;quot;about:config&amp;quot; in the navigation bar&lt;br /&gt;
* look for the options with the &amp;quot;negotiate&amp;quot; text and set these values:&lt;br /&gt;
** network.negotiate-auth.allow-non-fqdn: true&lt;br /&gt;
** network.negotiate-auth.delegation-uris: pic.es&lt;br /&gt;
** network.negotiate-auth.trusted-uris: pic.es&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Spark_on_Hadoop&amp;diff=1219</id>
		<title>Spark on Hadoop</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Spark_on_Hadoop&amp;diff=1219"/>
		<updated>2025-05-08T10:17:56Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Monitoring your Spark jobs =&lt;br /&gt;
&lt;br /&gt;
Spark jobs submitted to Yarn can be monitored through the web interface at http://hsrv02.pic.es:8088&lt;br /&gt;
&lt;br /&gt;
You need to have a VPN and Kerberos authentication configured in your browser if you want to access it.&lt;br /&gt;
&lt;br /&gt;
== How to configure Kerberos authentication with Firefox ==&lt;br /&gt;
&lt;br /&gt;
Tested on an Ubuntu 22 server with Firefox installed with apt. Configuring Kerberos authentication with other browser &amp;amp; OS combinations may differ or not be supported.&lt;br /&gt;
&lt;br /&gt;
You need to have a Kerberos client installed on your PC.&lt;br /&gt;
&lt;br /&gt;
Update the &amp;quot;/etc/krb5.conf&amp;quot; file to contain the PIC realm:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[realms]&lt;br /&gt;
	PIC.ES = {&lt;br /&gt;
		kdc = ipa01.pic.es:88&lt;br /&gt;
		master_kdc = ipa01.pic.es:88&lt;br /&gt;
		admin_server = ipa01.pic.es:749&lt;br /&gt;
		kpasswd_server = ipa01.pic.es:464&lt;br /&gt;
		default_domain = pic.es&lt;br /&gt;
	}&lt;br /&gt;
&lt;br /&gt;
[domain_realm]&lt;br /&gt;
  .pic.es = PIC.ES&lt;br /&gt;
  pic.es = PIC.ES&lt;br /&gt;
  hsrv03.pic.es = PIC.ES&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Spark_on_Hadoop&amp;diff=1218</id>
		<title>Spark on Hadoop</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Spark_on_Hadoop&amp;diff=1218"/>
		<updated>2025-05-08T10:10:11Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: Created blank page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1215</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1215"/>
		<updated>2025-04-08T10:04:11Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Troubleshooting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at mass scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes, depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, this one allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
There is also an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You will also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
Run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/data/astro/software/miniforge3/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, if you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/data/astro/software/miniforge3/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
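As an alternative to refreshing the page, you can also check which kernels are installed from the terminal. A minimal sketch (the '''list_kernels''' helper is illustrative, not part of Jupyter; the path is the standard user kernels directory shown in the output above):&lt;br /&gt;

```python
from pathlib import Path

def list_kernels(base: Path):
    """Return the names of the kernel specs installed under *base*,
    e.g. ~/.local/share/jupyter/kernels (empty list if none exist)."""
    if not base.is_dir():
        return []
    return sorted(p.name for p in base.iterdir() if p.is_dir())

# The kernel linked above should appear in this listing
print(list_kernels(Path.home() / ".local/share/jupyter/kernels"))
```

The built-in command '''jupyter kernelspec list''' prints the same information.&lt;br /&gt;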
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
Assuming your_env will be created at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where downloaded conda packages are stored&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus saving disk space.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates the file '''~/.local/share/jupyter/kernels/sage/kernel.json''' in your home directory,&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
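If you prefer to apply this change with a script rather than editing the file by hand, here is a minimal sketch (the '''write_sage_kernelspec''' helper is hypothetical; the path and kernelspec contents are exactly those shown above):&lt;br /&gt;

```python
import json
from pathlib import Path

# The kernelspec contents exactly as shown above
SAGE_KERNEL = {
    "argv": [
        "/data/astro/software/envs/sage/bin/sage",
        "--python",
        "-m",
        "sage.repl.ipython_kernel",
        "-f",
        "{connection_file}",
    ],
    "display_name": "sage",
    "language": "sage",
    "metadata": {"debugger": True},
}

def write_sage_kernelspec(spec_path: Path) -> None:
    """Overwrite kernel.json at *spec_path* with the contents above."""
    spec_path.parent.mkdir(parents=True, exist_ok=True)
    spec_path.write_text(json.dumps(SAGE_KERNEL, indent=1))

# Usage (the real kernelspec lives in your home directory):
# write_sage_kernelspec(Path.home() / ".local/share/jupyter/kernels/sage/kernel.json")
```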
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. In that case it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which depend on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs that are assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
 nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
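The parsing of CUDA_VISIBLE_DEVICES described above can be sketched in Python (the '''assigned_gpus''' helper is illustrative):&lt;br /&gt;

```python
import os

def assigned_gpus(env=None):
    """Return the list of GPU ids assigned to the job, parsed from the
    comma-separated CUDA_VISIBLE_DEVICES variable (empty list if unset)."""
    env = os.environ if env is None else env
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [int(i) for i in value.split(",") if i.strip()]

print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "0,2"}))  # two GPUs: [0, 2]
print(assigned_gpus({}))                               # no GPUs: []
```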
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions are installed to provide additional functionalities&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. A '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so different executions of the same notebook always leave the diff empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Error 500: Internal Server Error ==&lt;br /&gt;
&lt;br /&gt;
This is a generic error: it means that the jupyterlab server failed. This can happen for different reasons:&lt;br /&gt;
&lt;br /&gt;
* Your HOME folder is full. Log in to &amp;quot;ui.pic.es&amp;quot; and run &amp;quot;quota&amp;quot; to check your usage against your quota. If it is full, you will have to free up space.&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f'{bin_dir}:{os.environ[&amp;quot;PATH&amp;quot;]}'&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive-based storage can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1205</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1205"/>
		<updated>2025-03-24T06:21:25Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Loading libraries is very slow */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, with the advantage that you can develop and test your code on different hardware configurations, and that scaling up is easier because the code is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should choose a test data volume that can be processed within 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on overall resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, this one allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
There is also an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You will also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/conda/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/micromamba shell init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't want the initialization to be persisted, you can run it in every session:&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge, the initialization hook needs to be run using the '''conda''' executable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/micromamba shell hook -s posix)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running '''pip install ipykernel''' inside the activated environment.&lt;br /&gt;
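Before running the install command, you can check from Python whether '''ipykernel''' is importable in the active environment. This is a minimal, generic check (the helper name is ours, not part of any PIC or Jupyter tooling):

```python
import importlib.util

def has_module(name):
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None

print(has_module("sys"))        # → True: part of the standard library
print(has_module("ipykernel"))  # True only if ipykernel is installed here
```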
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as the kernel name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as a suitable environment for your needs may already be in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of those users.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If `pkgs_dirs` and `envs_dirs` are on the same storage, conda will use hard links, thus optimizing disk space.&lt;br /&gt;
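The hard-link behaviour can be illustrated with a few lines of Python. This is a generic filesystem demo, not conda code: two names for the same file share one copy of the data on disk.

```python
import os
import tempfile

# Hard links give one file two names without duplicating its data:
# this is what conda does between pkgs_dirs and envs_dirs when both
# directories live on the same filesystem.
with tempfile.TemporaryDirectory() as d:
    pkg_copy = os.path.join(d, "package_file")
    env_copy = os.path.join(d, "env_file")
    with open(pkg_copy, "w") as f:
        f.write("x" * 1000)
    os.link(pkg_copy, env_copy)  # second name, no second copy on disk
    same_inode = os.stat(pkg_copy).st_ino == os.stat(env_copy).st_ino
    link_count = os.stat(pkg_copy).st_nlink
    print(same_inode)   # → True: both names point at the same inode
    print(link_count)   # → 2
```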
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We found recently that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project may be provided in the form of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If only a single GPU is assigned, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
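The first step can also be sketched in Python. The helper below is a hypothetical convenience function (not part of any PIC tooling) and assumes numeric GPU ids, as described above:

```python
import os

def assigned_gpu_ids():
    """Parse CUDA_VISIBLE_DEVICES into a list of GPU indices.

    Returns an empty list when the variable is unset, i.e. when
    no GPUs are assigned to the job.
    """
    raw = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [int(i) for i in raw.split(",") if i.strip()]

# Simulate a job that was assigned GPUs 2 and 5
os.environ["CUDA_VISIBLE_DEVICES"] = "2,5"
print(assigned_gpu_ids())  # → [2, 5]
```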
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here]. There you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. As a result, a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. Outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
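The effect can be illustrated with a toy example. The dictionaries below are hand-written stand-ins for notebook cells, not real .ipynb content:

```python
import json

# Two executions of the same cell: the source is identical, but the
# embedded (base64-like) image payload differs between runs.
run1 = {"source": ["plt.imshow(np.random.random([10, 10]))"],
        "outputs": [{"image/png": "iVBORw0KGgoAAAA..."}]}
run2 = {"source": ["plt.imshow(np.random.random([10, 10]))"],
        "outputs": [{"image/png": "iVBORw0KGgoBBBB..."}]}

print(run1["source"] == run2["source"])      # → True: the code did not change
print(json.dumps(run1) == json.dumps(run2))  # → False: the .ipynb file still differs
```

A jupytext-synced text file keeps only the source, so its git diff stays empty across runs.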
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files will be created once the jupyter lab server job is finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, g++, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments on storage backed by hard drives can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;br /&gt;
&lt;br /&gt;
== Spawn failed: The 'ip' trait of a PICCondorSpawner instance expected a unicode string, not the NoneType None ==&lt;br /&gt;
&lt;br /&gt;
Jupyterhub could not get the host name from HTCondor's stdout, because it didn't match the expected regular expression.&lt;br /&gt;
&lt;br /&gt;
This error happens randomly from time to time; it does not imply any major problem in any of the services.&lt;br /&gt;
&lt;br /&gt;
Try to request a new notebook server.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1204</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1204"/>
		<updated>2025-03-19T06:52:01Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Conda / mamba configuration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at a mass scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that the test data volume you work with during a session should be processable in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will bring you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can run it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/conda/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/micromamba shell init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't want the initialization to be persisted, you can run it for every session:&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge, the initialization hook needs to be run using the '''conda''' executable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/micromamba shell hook -s posix)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running '''pip install ipykernel''' inside the activated environment.&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as the kernel name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as a suitable environment for your needs may already be in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of those users.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
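The same steps can also be scripted with Python's standard '''venv''' module; a minimal sketch (with_pip=False is used here only to keep the sketch fast; the command-line python3 -m venv bootstraps pip by default):&lt;br /&gt;

```python
import venv
from pathlib import Path

def create_env(env_dir):
    """Create a venv at env_dir and return the path of its pyvenv.cfg marker."""
    venv.create(env_dir, with_pip=False)  # with_pip=True to also bootstrap pip
    return Path(env_dir) / "pyvenv.cfg"
```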
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/folder/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
  - /data/aai/scratch_ssd/torradeflot/pkgs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/pkgs&lt;br /&gt;
  - /data/pic/scratch/torradeflot/pkgs&lt;br /&gt;
&lt;br /&gt;
If '''pkgs_dirs''' and '''envs_dirs''' are on the same storage, conda will use hard links, saving disk space.&lt;br /&gt;
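The hard-link optimisation can be verified directly: a hard link shares the inode of the original file, so a package "copied" into an environment on the same filesystem costs no extra space. A small stdlib sketch (the helper name is ours):&lt;br /&gt;

```python
import os

def link_and_count(src, dst):
    """Create a hard link dst -> src and return the resulting link count of src.

    A count of 2 means both names point at the same on-disk data.
    """
    os.link(src, dst)
    return os.stat(src).st_nlink
```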
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the session environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
To make it work correctly, please create the proxy as follows (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path no longer resolves. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
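In other words, X509_USER_PROXY must hold an absolute path that stays valid regardless of the current working directory. A hypothetical helper sketching this (the x509up_u&amp;lt;uid&amp;gt; file name pattern comes from the example above):&lt;br /&gt;

```python
from pathlib import Path

def proxy_path(home, uid):
    """Return an absolute path for the proxy file, suitable for X509_USER_PROXY.

    Resolving against the home directory avoids the relative-path problem
    shown above when the working directory changes inside the session.
    """
    return str(Path(home, f"x509up_u{uid}").resolve())
```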
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided as a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface, and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
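Writing the kernelspec above can also be automated; a minimal sketch in Python (the function name and the default kernels path are assumptions; the argv matches the kernel.json shown above):&lt;br /&gt;

```python
import json
from pathlib import Path

def install_singularity_kernel(image, name="singularity-kernel", kernels_dir=None):
    """Write a kernel.json that runs ipykernel inside a Singularity image."""
    kernels_dir = Path(kernels_dir or Path.home() / ".local/share/jupyter/kernels")
    spec_dir = kernels_dir / "singularity"
    spec_dir.mkdir(parents=True, exist_ok=True)
    spec = {
        # Same argv as the kernel.json listed above
        "argv": ["singularity", "exec", "--cleanenv", str(image),
                 "python", "-m", "ipykernel", "-f", "{connection_file}"],
        "language": "python",
        "display_name": name,
    }
    (spec_dir / "kernel.json").write_text(json.dumps(spec, indent=1))
    return spec_dir / "kernel.json"
```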
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* The two steps above can be combined into a single command (this works only when the job has a single assigned GPU):&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
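The CUDA_VISIBLE_DEVICES parsing described above can be sketched in Python (the helper name is ours; unlike the single-GPU grep, it handles any number of assigned GPUs):&lt;br /&gt;

```python
import os

def assigned_gpus(env=os.environ):
    """Return the list of GPU ids assigned to this job ([] if none).

    CUDA_VISIBLE_DEVICES holds a comma-separated list of ids; an absent
    variable means no GPUs were assigned.
    """
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu.strip() for gpu in value.split(",") if gpu.strip()]
```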
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here]. There you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
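To see why the synced text file stays diff-friendly, here is a stdlib-only sketch (not jupytext itself) that removes exactly what jupytext leaves out of the paired file: cell outputs and execution counts:&lt;br /&gt;

```python
import json

def strip_outputs(nb_json):
    """Return notebook JSON with outputs and execution counts removed,
    mimicking what a jupytext-synced text file omits."""
    nb = json.loads(nb_json)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []          # embedded images live here
            cell["execution_count"] = None
    return json.dumps(nb)
```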
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job finishes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
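A slightly safer equivalent, which removes the folder's contents but keeps the folder itself, can be sketched in Python (the helper name is ours; the default path is the one given above):&lt;br /&gt;

```python
import shutil
from pathlib import Path

def clean_workspaces(workspaces_dir=None):
    """Delete the contents of the jupyterlab workspaces folder and
    return the names of the removed entries."""
    workspaces = Path(workspaces_dir or Path.home() / ".jupyter/lab/workspaces")
    removed = []
    for entry in workspaces.iterdir():
        shutil.rmtree(entry) if entry.is_dir() else entry.unlink()
        removed.append(entry.name)
    return removed
```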
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor, but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate /data/pic/scratch/torradeflot/envs/mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools (gcc, g++, gfortran)&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     # The environment's bin directory, containing the conda compiler tools&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard drives can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1203</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1203"/>
		<updated>2025-03-19T06:49:11Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Initialize conda (we highly recommend the use of mamba/micromamba) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should choose a test data volume that can be processed within 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Furthermore, you see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section, we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which makes some changes to your '''~/.bashrc''' file, or you can run it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/conda/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/micromamba shell init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't want the initialization to be persisted, run it once per session instead:&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge, the initialization hook needs to be run using the '''conda''' executable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/micromamba shell hook -s posix)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter and start a session. From the session dashboard, choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
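You can check beforehand whether ipykernel is importable in the active environment; a small stdlib sketch (the helper name is ours):&lt;br /&gt;

```python
import importlib.util

def has_module(name):
    """True if `name` is importable in the current environment,
    without actually importing it."""
    return importlib.util.find_spec(name) is not None
```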
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your kernel (whatever_kernel_name) appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/folder/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the session environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
To make it work correctly, please create the proxy as follows (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path no longer resolves. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
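Instead of editing the file by hand, the kernel spec can also be written with a short Python script; a sketch that reproduces the JSON above, using the paths shown in this section:

```python
import json
from pathlib import Path

# Write the sage kernel spec shown above directly to the user's
# jupyter kernels directory.
kernel_file = Path.home() / ".local/share/jupyter/kernels/sage/kernel.json"
spec = {
    "argv": [
        "/data/astro/software/envs/sage/bin/sage",
        "--python",
        "-m",
        "sage.repl.ipython_kernel",
        "-f",
        "{connection_file}",
    ],
    "display_name": "sage",
    "language": "sage",
    "metadata": {"debugger": True},
}
kernel_file.parent.mkdir(parents=True, exist_ok=True)
kernel_file.write_text(json.dumps(spec, indent=1))
```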
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have only a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
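The CUDA_VISIBLE_DEVICES check can also be done from Python, e.g. at the top of a notebook; a minimal sketch:

```python
import os

# CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids;
# an absent or empty variable means no GPU is assigned to the job.
raw = os.environ.get("CUDA_VISIBLE_DEVICES", "")
gpu_ids = [g.strip() for g in raw.split(",") if g.strip()]
print(f"{len(gpu_ids)} GPU(s) assigned: {gpu_ids}")
```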
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, fortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them explicitly in the notebook.&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard drives can be extremely slow to load. If you encounter this problem, please ask &lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1202</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1202"/>
		<updated>2025-03-14T08:43:11Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Initialize conda (we highly recommend the use of miniforge/mambaforge) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at a mass scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means that the test data volume you work with during a session should be processable in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9; this one allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Furthermore, you will see an icon with a &amp;quot;D&amp;quot; (desktop); this one starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mamba/micromamba) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/conda/mamba''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba/micromamba installation, there are two recommended options&lt;br /&gt;
** '''miniforge''': a distribution with conda and mamba executables in a minimal base environment, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
** '''micromamba''': a self-contained executable (micromamba) with no base environment, instructions [https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
In order to use conda/mamba/micromamba you need to initialize the shell. This initialization can be persistent, which will make some changes to your '''~/.bashrc''' file, or you can do it every time you want to use it.&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge and you want to persist the initialization:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/conda/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/path/to/micromamba shell init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't want the initialization to be persisted, you can run it for every session:&lt;br /&gt;
&lt;br /&gt;
If you are using miniforge, the initialization hook needs to be run using the '''conda''' executable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are using micromamba:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eval &amp;quot;$(/path/to/micromamba shell hook -s posix)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The persistent initialization changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The folder where conda packages are stored&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
To ensure correct functioning, please create the proxy as follows (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path no longer resolves from the session's working directory. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
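From inside a notebook you can apply the same fix programmatically; a minimal Python sketch, assuming the proxy was created with the default x509up_uUID filename in your home directory:

```python
import os
from pathlib import Path

# Point X509_USER_PROXY at the absolute proxy path so tools find it
# regardless of the current working directory (assumes the default
# x509up_uUID filename in the home directory).
proxy = Path.home() / f"x509up_u{os.getuid()}"
os.environ["X509_USER_PROXY"] = str(proxy)
print(os.environ["X509_USER_PROXY"])
```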
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
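Instead of editing the file by hand, the kernel spec can also be written with a short Python script; a sketch that reproduces the JSON above, using the paths shown in this section:

```python
import json
from pathlib import Path

# Write the sage kernel spec shown above directly to the user's
# jupyter kernels directory.
kernel_file = Path.home() / ".local/share/jupyter/kernels/sage/kernel.json"
spec = {
    "argv": [
        "/data/astro/software/envs/sage/bin/sage",
        "--python",
        "-m",
        "sage.repl.ipython_kernel",
        "-f",
        "{connection_file}",
    ],
    "display_name": "sage",
    "language": "sage",
    "metadata": {"debugger": True},
}
kernel_file.parent.mkdir(parents=True, exist_ok=True)
kernel_file.write_text(json.dumps(spec, indent=1))
```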
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
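For jobs with more than one assigned GPU, the same matching can be done in Python. A minimal sketch, where the sample nvidia-smi output and UUIDs are illustrative:&lt;br /&gt;

```python
import os

def assigned_gpus(smi_output, visible_devices):
    """Return the nvidia-smi -L lines whose GPU index appears in
    the comma-separated CUDA_VISIBLE_DEVICES string."""
    ids = [d.strip() for d in visible_devices.split(",") if d.strip()]
    lines = smi_output.strip().splitlines()
    # nvidia-smi -L prints one line per GPU, prefixed with "GPU N:"
    return [line for line in lines
            if line.split(":")[0].replace("GPU ", "") in ids]

# Illustrative output; real lines come from running "nvidia-smi -L"
sample = """GPU 0: Tesla V100 (UUID: GPU-aaaa)
GPU 1: Tesla V100 (UUID: GPU-bbbb)
GPU 2: Tesla V100 (UUID: GPU-cccc)"""

print(assigned_gpus(sample, os.environ.get("CUDA_VISIBLE_DEVICES", "0,2")))
```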
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. A '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor, but the user cannot access the notebook server; ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there was an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, fortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f&amp;quot;{bin_dir}:{os.environ['PATH']}&amp;quot;&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard-drive based storage can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store environments. This service is currently being tested and&lt;br /&gt;
deployed as needed.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1201</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1201"/>
		<updated>2025-03-14T07:58:57Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Python virtual environments */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but with the advantage of developing and testing your code on different hardware configurations, and of easier scaling, since the code is tested in the same environment in which it would later run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means that you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session, as well as the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path on a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== Conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where conda packages are stored&lt;br /&gt;
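A minimal .condarc combining both parameters can be written as plain text. A sketch, where the pkgs_dirs path is illustrative and a temporary directory stands in for $HOME:&lt;br /&gt;

```python
import os
import tempfile

# Example .condarc content; the pkgs_dirs path is illustrative
condarc = """envs_dirs:
  - /data/pic/scratch/torradeflot/envs
  - /data/astro/scratch/torradeflot/envs
pkgs_dirs:
  - /data/pic/scratch/torradeflot/pkgs
"""

# Written to a temporary directory here; in practice this
# file lives at $HOME/.condarc
path = os.path.join(tempfile.mkdtemp(), ".condarc")
with open(path, "w") as f:
    f.write(condarc)
print(path)
```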
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We have recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
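The fix above amounts to always exporting an absolute path. The same resolution can be sketched in Python, following the x509up_u$(id -u) naming convention shown above:&lt;br /&gt;

```python
import os

# A relative value like "./x509up_u1234" breaks as soon as the
# working directory changes; resolve it to an absolute path
# before exporting it to the environment.
uid = os.getuid()
proxy = os.path.abspath("x509up_u%d" % uid)
os.environ["X509_USER_PROXY"] = proxy
print(os.environ["X509_USER_PROXY"])
```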
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motions for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home directory, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes a project's software stack is provided in the form of a singularity image. In that case it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
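The kernel.json above can also be written programmatically, which avoids quoting mistakes. A minimal sketch, where the .sif path is a placeholder and a temporary directory stands in for $HOME/.local/share/jupyter/kernels/singularity:&lt;br /&gt;

```python
import json
import os
import tempfile

# Kernel spec equivalent to the kernel.json shown above;
# the .sif path is a placeholder
spec = {
    "argv": [
        "singularity", "exec", "--cleanenv",
        "/path/to/the/singularity/image.sif",
        "python", "-m", "ipykernel", "-f", "{connection_file}",
    ],
    "language": "python",
    "display_name": "singularity-kernel",
}

# In practice the target directory is
# $HOME/.local/share/jupyter/kernels/singularity;
# a temporary directory is used here for illustration
kernel_dir = tempfile.mkdtemp()
with open(os.path.join(kernel_dir, "kernel.json"), "w") as f:
    json.dump(spec, f, indent=2)
```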
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
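For jobs with more than one assigned GPU, the same matching can be done in Python. A minimal sketch, where the sample nvidia-smi output and UUIDs are illustrative:&lt;br /&gt;

```python
import os

def assigned_gpus(smi_output, visible_devices):
    """Return the nvidia-smi -L lines whose GPU index appears in
    the comma-separated CUDA_VISIBLE_DEVICES string."""
    ids = [d.strip() for d in visible_devices.split(",") if d.strip()]
    lines = smi_output.strip().splitlines()
    # nvidia-smi -L prints one line per GPU, prefixed with "GPU N:"
    return [line for line in lines
            if line.split(":")[0].replace("GPU ", "") in ids]

# Illustrative output; real lines come from running "nvidia-smi -L"
sample = """GPU 0: Tesla V100 (UUID: GPU-aaaa)
GPU 1: Tesla V100 (UUID: GPU-bbbb)
GPU 2: Tesla V100 (UUID: GPU-cccc)"""

print(assigned_gpus(sample, os.environ.get("CUDA_VISIBLE_DEVICES", "0,2")))
```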
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. A '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor, but the user cannot access the notebook server; ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there was an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this ROOT can be imported from a python shell, but it does not work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, fortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by '''.bashrc''', '''conda activate''', etc., so we need to set them explicitly in the notebook:&lt;br /&gt;
&lt;br /&gt;
     import os&lt;br /&gt;
     import sys&lt;br /&gt;
     from pathlib import Path&lt;br /&gt;
     bin_dir = Path(sys.executable).parent&lt;br /&gt;
     os.environ['PATH'] = f"{bin_dir}:{os.environ['PATH']}"&lt;br /&gt;
     os.environ['CONDA_BUILD_SYSROOT'] = str(bin_dir.parent / 'x86_64-conda-linux-gnu/sysroot')&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on volumes backed by hard drives can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store your environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1197</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1197"/>
		<updated>2025-02-24T22:21:36Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Install ROOT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module&lt;br /&gt;
&lt;br /&gt;
Furthermore, there is an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid the base environment being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home directory under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Folder where to store conda packages&lt;br /&gt;
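&lt;br /&gt;
For example, a hypothetical ~/.condarc combining both parameters (the paths below are placeholders, not actual PIC locations):&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/yourusername/envs&lt;br /&gt;
  pkgs_dirs:&lt;br /&gt;
    - /data/pic/scratch/yourusername/pkgs&lt;br /&gt;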
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motions for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
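&lt;br /&gt;
If you prefer, this modification can also be scripted. Below is a minimal sketch, assuming the sage path shown above ('''sagify_kernel''' is a hypothetical helper name, not part of any PIC tooling):&lt;br /&gt;

```python
import json
from pathlib import Path

def sagify_kernel(kernel_file):
    """Rewrite a kernel.json so the kernel runs through sage --python."""
    kernel_file = Path(kernel_file)
    spec = json.loads(kernel_file.read_text())
    # Replace the plain python invocation with sage's ipython kernel.
    spec["argv"] = [
        "/data/astro/software/envs/sage/bin/sage",
        "--python",
        "-m",
        "sage.repl.ipython_kernel",
        "-f",
        "{connection_file}",
    ]
    spec["language"] = "sage"
    kernel_file.write_text(json.dumps(spec, indent=1))
```

For example, calling sagify_kernel on ~/.local/share/jupyter/kernels/sage/kernel.json applies the change shown above.&lt;br /&gt;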
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a list of comma-separated GPU ids, so it also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* If you have a single assigned GPU, the two steps above can be done with the following command:&lt;br /&gt;
&lt;br /&gt;
 nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
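The checks above can also be done from Python inside a notebook. A minimal sketch ('''assigned_gpus''' is a hypothetical helper, not part of any PIC tooling):&lt;br /&gt;

```python
import os

def assigned_gpus(env=None):
    """Return the list of GPU ids assigned to this job (empty if none)."""
    env = os.environ if env is None else env
    # The variable holds comma-separated GPU ids, e.g. "0,2";
    # if it is absent, no GPUs were assigned to the job.
    raw = env.get("CUDA_VISIBLE_DEVICES", "")
    return [int(i) for i in raw.split(",") if i.strip()]
```

For example, with CUDA_VISIBLE_DEVICES set to 0,2 it returns [0, 2], and an empty list when the variable is unset.&lt;br /&gt;
&lt;br /&gt;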
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here]; there you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file), tracked in a git repository, containing only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary (base64) format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
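&lt;br /&gt;
This behaviour can be sketched with plain Python, using a toy stand-in for the .ipynb format (simplified structure; no jupytext involved):&lt;br /&gt;

```python
import base64
import json
import os

def fake_notebook(image_bytes):
    """Toy stand-in for an .ipynb file: a cell source plus a base64 image output."""
    return json.dumps({
        "cells": [{
            "source": ["plt.imshow(np.random.random([10, 10]))"],
            "outputs": [{"data": {"image/png": base64.b64encode(image_bytes).decode()}}],
        }],
    })

run1 = fake_notebook(os.urandom(64))  # first execution
run2 = fake_notebook(os.urandom(64))  # re-execution: a new random image

def cell_source(nb_text):
    """What a jupytext-paired text file would keep: only the code."""
    return json.loads(nb_text)["cells"][0]["source"]
```

Here run1 and run2 differ as serialized notebooks (the embedded image changed), while cell_source(run1) and cell_source(run2) are identical, which is what the paired text file captures.&lt;br /&gt;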
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. Note that they are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server; ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This usually means that there was an error while starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
Environment creation and kernel installation&lt;br /&gt;
&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
After doing this, ROOT can be imported from a Python shell, but it does not yet work from a notebook.&lt;br /&gt;
&lt;br /&gt;
* ROOT uses JIT compilation with Cling: https://root.cern/cling/&lt;br /&gt;
* conda provides its own set of compilation tools: gcc, gxx, gfortran&lt;br /&gt;
* Some environment variables are necessary to ensure that these two pieces work well together:&lt;br /&gt;
** PATH: to be able to find the compiler tools&lt;br /&gt;
** CONDA_BUILD_SYSROOT: needed to configure the compiler call&lt;br /&gt;
&lt;br /&gt;
These environment variables are not propagated to the notebook, because they are set by .bashrc, conda activate, etc., so we need to set them &amp;quot;by hand&amp;quot; in the notebook.&lt;br /&gt;
&lt;br /&gt;
    import os&lt;br /&gt;
    # values picked from the shell&lt;br /&gt;
    os.environ['PATH'] = ('/data/pic/scratch/torradeflot/envs/root3/bin:bash/condabin:/data/jupyter/software/envs/master/bin'&lt;br /&gt;
    ':/bin:/data/jupyter/software/envs/master/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin'&lt;br /&gt;
    ':/data/jupyter/software/envs/master/bin:/data/jupyter/software/tigervnc/1.10.0/usr/bin'&lt;br /&gt;
    ':/opt/puppetlabs/bin:/opt/hadoop-3.2.3/bin:/opt/tez-0.10.1/bin:/opt/hive-3.1.2/bin'&lt;br /&gt;
    ':/opt/spark-3.4.2/bin:/nfs/pic.es/user/t/torradeflot/bin')&lt;br /&gt;
    os.environ['CONDA_BUILD_SYSROOT'] = '/data/pic/scratch/torradeflot/envs/root3/x86_64-conda-linux-gnu/sysroot'&lt;br /&gt;
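&lt;br /&gt;
A more portable sketch derives both values from the location of the interpreter running the kernel, instead of hard-coding them (assuming the usual conda environment layout, where the sysroot sits in the environment root):&lt;br /&gt;

```python
import os
import sys
from pathlib import Path

# The environment's bin directory is where the current interpreter lives.
bin_dir = Path(sys.executable).parent
os.environ["PATH"] = f"{bin_dir}:{os.environ['PATH']}"
# The conda sysroot sits next to bin/ in the environment root.
os.environ["CONDA_BUILD_SYSROOT"] = str(bin_dir.parent / "x86_64-conda-linux-gnu/sysroot")
```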
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on volumes backed by hard drives can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for an &amp;quot;SSD scratch&amp;quot; location to store your environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1195</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1195"/>
		<updated>2025-02-24T07:49:20Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Install ROOT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module&lt;br /&gt;
&lt;br /&gt;
Furthermore, there is an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you do not need a specific version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The text in parentheses (...) in front of your bash prompt shows the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The text in parentheses (...) in front of your bash prompt shows the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask him/her for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments, e.g. the different locations where you created environments:&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The directory where downloaded conda packages are cached&lt;br /&gt;
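A minimal &amp;quot;$HOME/.condarc&amp;quot; combining both parameters could look like this (the paths below are illustrative placeholders, not actual PIC locations):&lt;br /&gt;

```yaml
# Sketch of a ~/.condarc; replace the paths with your own storage locations.
envs_dirs:                               # where named environments are searched for
  - /data/astro/scratch/<user>/envs
pkgs_dirs:                               # where downloaded package tarballs are cached
  - /data/astro/scratch/<user>/conda_pkgs
```
&lt;br /&gt;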
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session can cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly because of the relative path. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
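For reference, for the homogeneous and isotropic FLRW metric these are the standard Friedmann equations governing the scale factor a(t), written here with curvature k and cosmological constant Λ:&lt;br /&gt;

```latex
\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{k c^2}{a^2} + \frac{\Lambda c^2}{3},
\qquad
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}
```
&lt;br /&gt;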
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the form of a singularity image. In that case it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have the '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface, and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a list of comma-separated GPU ids, so it already tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if your job has a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
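The steps above can be sketched in bash. The fallback value &amp;quot;0,2&amp;quot; for CUDA_VISIBLE_DEVICES is purely illustrative; on a worker node the scheduler sets this variable for your job:&lt;br /&gt;

```shell
# Count and list the GPU ids assigned to this job.
# If CUDA_VISIBLE_DEVICES is unset, fall back to an illustrative value.
CUDA_VISIBLE_DEVICES="${CUDA_VISIBLE_DEVICES:-0,2}"
IFS=',' read -r -a gpu_ids <<< "$CUDA_VISIBLE_DEVICES"
echo "assigned ${#gpu_ids[@]} gpu(s): ${gpu_ids[*]}"
# On a node with nvidia-smi available you could then match each id:
#   for id in "${gpu_ids[@]}"; do nvidia-smi -L | grep "GPU ${id}:"; done
```
&lt;br /&gt;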
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (4.2) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you have a notebook (.ipynb file), tracked in a git repository, that contains only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
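As a toy illustration of the problem (plain json, not jupytext itself; the fake image payloads are invented for the example):&lt;br /&gt;

```python
import json

def notebook_json(source, fake_png_payload):
    # Minimal stand-in for the .ipynb structure: cell outputs (here a fake
    # base64-like image payload) are serialized into the file itself.
    return json.dumps({
        "cells": [{
            "cell_type": "code",
            "source": source,
            "outputs": [{"data": {"image/png": fake_png_payload}}],
        }]
    })

run1 = notebook_json("plt.imshow(...)", "aGVsbG8=")  # first execution
run2 = notebook_json("plt.imshow(...)", "d29ybGQ=")  # re-run, new image bytes
print(run1 == run2)  # False: same code, different file contents
```

A paired .py file produced by jupytext would contain only the source line, which is identical in both runs.&lt;br /&gt;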
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor, but the user cannot access the notebook server, and ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This probably means there is an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
When installing ROOT in a conda environment from a terminal in a workernode, there is a conflict with the value of `LD_LIBRARY_PATH=/opt/hadoop-3.2.3/lib/native:/usr/lib64/` enforced through `/etc/profile.d`. This configuration does not apply to jupyter kernels, so some headers in `/usr/include` cannot be found (e.g. `/usr/include/dlfcn.h`) and ROOT cannot be imported in a notebook.&lt;br /&gt;
&lt;br /&gt;
So, to install ROOT in a conda environment named &amp;quot;mcdata&amp;quot; from a terminal in jupyter.pic.es and enable its use as a kernel, you first have to unset the LD_LIBRARY_PATH variable:&lt;br /&gt;
&lt;br /&gt;
    $ unset LD_LIBRARY_PATH&lt;br /&gt;
    $ micromamba env create -p /data/pic/scratch/torradeflot/envs/mcdata root ipykernel&lt;br /&gt;
    $ micromamba activate mcdata&lt;br /&gt;
    $ python -m ipykernel install --user --name mcdata&lt;br /&gt;
&lt;br /&gt;
== Loading libraries is very slow ==&lt;br /&gt;
Conda environments stored on hard drives can be extremely slow to load. If you encounter this problem, please ask&lt;br /&gt;
your support contact for a &amp;quot;SSD scratch&amp;quot; location to store environments. Currently this service is being tested and&lt;br /&gt;
deployed as needed.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1192</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1192"/>
		<updated>2025-02-20T07:42:05Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Install ROOT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up the code, since it is tested in the same environment in which it would later run at a mass scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should choose a test data volume that can be processed in less than 48 hours during a session.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will bring you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but it can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Next to these you will see an icon with a &amp;quot;D&amp;quot; (Desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
An icon for Visual Studio, an integrated development environment, has also been added recently.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you do not need a specific version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The text in parentheses (...) in front of your bash prompt shows the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The text in parentheses (...) in front of your bash prompt shows the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask him/her for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where downloaded conda packages are cached&lt;br /&gt;
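Putting the two parameters together, a minimal &amp;quot;$HOME/.condarc&amp;quot; might look like the sketch below (the paths are illustrative; substitute your own scratch locations):&lt;br /&gt;

```yaml
# Example ~/.condarc (illustrative paths, not prescribed by PIC)
envs_dirs:
  - /data/astro/scratch/your_user/envs   # searched for named environments
pkgs_dirs:
  - /data/astro/scratch/your_user/pkgs   # package cache outside your home
```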
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session may cause problems, because the session environment relocates certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For the proxy to work correctly, create it as follows (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located through the relative path. Therefore, put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
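Since the home directory follows a fixed pattern, the absolute path can also be built without typing it by hand. A minimal sketch, assuming the proxy file sits in your home directory:&lt;br /&gt;

```shell
# Build an absolute path to the proxy instead of a relative "./" one,
# so it stays valid regardless of the working directory or /tmp changes.
X509_USER_PROXY="$HOME/x509up_u$(id -u)"
export X509_USER_PROXY
echo "$X509_USER_PROXY"
```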
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
The software stack of a project may be provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
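The two steps above can also be done from the terminal in one go. A sketch (the image path is the same placeholder used in the text; point it at your actual .sif file):&lt;br /&gt;

```shell
# Create the kernel folder and write kernel.json in a single step.
# "/path/to/the/singularity/image.sif" is a placeholder, not a real path.
KDIR="$HOME/.local/share/jupyter/kernels/singularity"
mkdir -p "$KDIR"
cat > "$KDIR/kernel.json" <<'EOF'
{
  "argv": [
    "singularity", "exec", "--cleanenv",
    "/path/to/the/singularity/image.sif",
    "python", "-m", "ipykernel", "-f", "{connection_file}"
  ],
  "language": "python",
  "display_name": "singularity-kernel"
}
EOF
```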
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Note their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* The two steps above can be combined into the following command, provided only a single GPU is assigned to the job:&lt;br /&gt;
&lt;br /&gt;
  nvidia-smi -L | grep $CUDA_VISIBLE_DEVICES&lt;br /&gt;
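For jobs with more than one GPU, the comma-separated list can be turned into a grep pattern matching every assigned id. A sketch (the value of CUDA_VISIBLE_DEVICES below is a hypothetical example; in a real job it is already set by the batch system):&lt;br /&gt;

```shell
# Turn "1,3" into the extended-regex pattern "GPU 1:|GPU 3:",
# matching the "GPU <id>: ..." lines printed by nvidia-smi -L.
CUDA_VISIBLE_DEVICES="1,3"   # example value; do not set this yourself in a job
pattern=$(echo "$CUDA_VISIBLE_DEVICES" | sed 's/[0-9][0-9]*/GPU &:/g; s/,/|/g')
echo "$pattern"
# On a worker node you would then run: nvidia-smi -L | grep -E "$pattern"
```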
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the installed version of jupyterlab [https://jupyterlab.readthedocs.io/en/4.2.x/ here]. There you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file), tracked in a git repository, containing only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so different executions of the same notebook always produce an empty diff.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. Note that the log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is an error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
When installing ROOT in a conda environment from a terminal in a workernode, there is a conflict with the value of `LD_LIBRARY_PATH=/opt/hadoop-3.2.3/lib/native:/usr/lib64/` enforced through `/etc/profile.d`. This configuration does not apply to jupyter kernels, so some headers in `/usr/include` cannot be found (e.g. `/usr/include/dlfcn.h`) and ROOT cannot be imported in a notebook.&lt;br /&gt;
&lt;br /&gt;
To install ROOT in a conda environment from a terminal in jupyter.pic.es and enable it as a kernel, first unset the LD_LIBRARY_PATH variable:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    $ unset LD_LIBRARY_PATH&lt;br /&gt;
    $ micromamba create -p /data/pic/scratch/torradeflot/envs/root root ipykernel&lt;br /&gt;
    $ micromamba activate root&lt;br /&gt;
    $ python -m ipykernel install --user --name root&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1191</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1191"/>
		<updated>2025-02-20T07:41:35Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* 504 Gateway timeout */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is submitted to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. A later section shows how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter and start a session. From the session dashboard, choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The list of directories where downloaded conda packages are cached&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session may cause problems, because the session environment relocates certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For the proxy to work correctly, create it as follows (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located through the relative path. Therefore, put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
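The two steps above can also be scripted. A minimal sketch in Python; the .sif path and the kernel name are placeholders, as in the example above:&lt;br /&gt;

```python
import json
import os

# Sketch: create the kernel folder and write the kernel.json from the
# example above. The .sif path is a placeholder; adjust it to your image.
kernel_dir = os.path.expanduser("~/.local/share/jupyter/kernels/singularity")
os.makedirs(kernel_dir, exist_ok=True)

spec = {
    "argv": [
        "singularity", "exec", "--cleanenv",
        "/path/to/the/singularity/image.sif",
        "python", "-m", "ipykernel", "-f", "{connection_file}",
    ],
    "language": "python",
    "display_name": "singularity-kernel",
}
kernel_file = os.path.join(kernel_dir, "kernel.json")
with open(kernel_file, "w") as f:
    json.dump(spec, f, indent=2)
```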
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep  $CUDA_VISIBLE_DEVICES&lt;br /&gt;
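The same parsing can be done from inside a notebook. A small sketch; the assignment of CUDA_VISIBLE_DEVICES at the end is only a simulation for illustration, in a real session the scheduler sets it for you:&lt;br /&gt;

```python
import os

def assigned_gpus():
    """Return the list of GPU ids assigned to the job ([] if none)."""
    ids = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [int(i) for i in ids.split(",") if i.strip()]

# Simulated assignment for illustration only.
os.environ["CUDA_VISIBLE_DEVICES"] = "2,5"
print(assigned_gpus())  # → [2, 5]
```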
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. Doing a '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
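The point can be illustrated without jupytext: two runs of the same notebook differ only in their outputs, which is exactly what a synced text file leaves out. A minimal sketch with the notebook JSON heavily abbreviated and placeholder base64 payloads:&lt;br /&gt;

```python
import json

# Two runs of the same cell: identical source, different embedded image.
def notebook(image_data):
    return {"cells": [{
        "cell_type": "code",
        "source": "plt.imshow(np.random.random([10, 10]))",
        "outputs": [{"data": {"image/png": image_data}}],
    }]}

run1 = notebook("iVBORw0KGgoA...")  # placeholder payloads
run2 = notebook("iVBORw0KGgoB...")

# The raw .ipynb files differ (noisy git diff)...
print(json.dumps(run1) == json.dumps(run2))  # → False
# ...but the source, which is what jupytext syncs, is unchanged.
print(run1["cells"][0]["source"] == run2["cells"][0]["source"])  # → True
```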
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job finishes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user can not access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;br /&gt;
&lt;br /&gt;
== Install ROOT ==&lt;br /&gt;
&lt;br /&gt;
When installing ROOT in a conda environment from a terminal on a worker node, there is a conflict with the value of &amp;lt;code&amp;gt;LD_LIBRARY_PATH=/opt/hadoop-3.2.3/lib/native:/usr/lib64/&amp;lt;/code&amp;gt; enforced through &amp;lt;code&amp;gt;/etc/profile.d&amp;lt;/code&amp;gt;. This configuration does not apply to jupyter kernels, so some headers in &amp;lt;code&amp;gt;/usr/include&amp;lt;/code&amp;gt; (e.g. &amp;lt;code&amp;gt;/usr/include/dlfcn.h&amp;lt;/code&amp;gt;) cannot be found and ROOT cannot be imported in a notebook.&lt;br /&gt;
&lt;br /&gt;
To install ROOT in a conda environment from a terminal in jupyter.pic.es and enable using it as a kernel, you first have to unset the LD_LIBRARY_PATH variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$ unset LD_LIBRARY_PATH&lt;br /&gt;
$ micromamba create -p /data/pic/scratch/torradeflot/envs/root root ipykernel&lt;br /&gt;
$ micromamba activate root&lt;br /&gt;
$ python -m ipykernel install --user --name root&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1185</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1185"/>
		<updated>2025-02-04T09:35:09Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Troubleshooting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up, since the code is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that the test data volume you work with during a session should be small enough to be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further down you will see an icon with a &amp;quot;D&amp;quot; (Desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment, you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments, e.g. the different locations where you have created environments:&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The folders where conda stores downloaded packages (the package cache)&lt;br /&gt;
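Putting the two together, a minimal '''$HOME/.condarc''' could look like this (the paths are illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
envs_dirs:&lt;br /&gt;
  - /data/astro/scratch/&amp;lt;user&amp;gt;/envs&lt;br /&gt;
pkgs_dirs:&lt;br /&gt;
  - /data/astro/scratch/&amp;lt;user&amp;gt;/pkgs&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;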
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be properly located. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
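The fix above amounts to always exporting an absolute path. A minimal sketch of building that path programmatically, assuming the proxy file lives in your home directory:&lt;br /&gt;

```python
import os

# Build the absolute proxy path from HOME and the numeric uid, equivalent
# to the /nfs/pic.es/user/... path exported in the example above.
proxy = os.path.join(os.environ["HOME"], "x509up_u{}".format(os.getuid()))
os.environ["X509_USER_PROXY"] = proxy
print(proxy)
```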
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
* if you have a single assigned GPU, the two steps above can be combined into the following command:&lt;br /&gt;
&lt;br /&gt;
nvidia-smi -L | grep  $CUDA_VISIBLE_DEVICES&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the gpus are identified with their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. Doing a '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job finishes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shut down the notebook server and [[#Logs|check the logs]] to better identify the problem. If you don't see the source of the error, try to [[#Clean_workspaces|clean the workspaces]] and launch a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Storage&amp;diff=1177</id>
		<title>Storage</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Storage&amp;diff=1177"/>
		<updated>2025-01-09T15:43:16Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Tape */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Tape =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Status and attributes of tape-backed storage can be accessed through special dot commands.&lt;br /&gt;
&lt;br /&gt;
Some of them are listed here: https://www.dcache.org/manuals/Book-10.2/rf-dot-commands.shtml&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Check folder tags ==&lt;br /&gt;
&lt;br /&gt;
To check if a folder is backed by tape or not, you can use the &amp;lt;code&amp;gt;grep &amp;quot;&amp;quot; $(cat  &amp;quot;.(tags)()&amp;quot;)&amp;lt;/code&amp;gt; command.&lt;br /&gt;
&lt;br /&gt;
On a tape-backed folder you will see something like this:&lt;br /&gt;
&lt;br /&gt;
    [torradeflot@tds280 albag20240319044]$ grep &amp;quot;&amp;quot; $(cat  &amp;quot;.(tags)()&amp;quot;)&lt;br /&gt;
    .(tag)(AccessLatency):NEARLINE&lt;br /&gt;
    .(tag)(file_family):tape&lt;br /&gt;
    .(tag)(file_family_width):1&lt;br /&gt;
    .(tag)(file_family_wrapper):cpio_odc&lt;br /&gt;
    .(tag)(library):IBML8&lt;br /&gt;
    .(tag)(OSMTemplate):StoreName vo-incaem&lt;br /&gt;
    .(tag)(RetentionPolicy):CUSTODIAL&lt;br /&gt;
    .(tag)(sGroup):incaem&lt;br /&gt;
    .(tag)(storage_group):vo-incaem&lt;br /&gt;
&lt;br /&gt;
In a non-tape folder:&lt;br /&gt;
&lt;br /&gt;
    [torradeflot@tds280 incaem]$ grep &amp;quot;&amp;quot; $(cat  &amp;quot;.(tags)()&amp;quot;)&lt;br /&gt;
    .(tag)(AccessLatency):ONLINE&lt;br /&gt;
    .(tag)(file_family):incaem&lt;br /&gt;
    .(tag)(RetentionPolicy):REPLICA&lt;br /&gt;
    .(tag)(storage_group):vo-incaem&lt;br /&gt;
&lt;br /&gt;
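If you need to read these tags from a script, the output lines are easy to parse. A small illustrative Python sketch (not a dCache tool), assuming the .(tag)(Name):value format shown above:&lt;br /&gt;
&lt;br /&gt;
```python
def parse_tags(lines):
    """Parse dCache '.(tag)(Name):value' lines into a dict."""
    tags = {}
    for line in lines:
        line = line.strip()
        # Expected format: .(tag)(Name):value
        if line.startswith(".(tag)(") and "):" in line:
            name, _, value = line[len(".(tag)("):].partition("):")
            tags[name] = value
    return tags

sample = [
    ".(tag)(AccessLatency):NEARLINE",
    ".(tag)(RetentionPolicy):CUSTODIAL",
]
print(parse_tags(sample)["AccessLatency"])  # NEARLINE
```
&lt;br /&gt;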
== File locality ==&lt;br /&gt;
&lt;br /&gt;
To check whether a file is currently on disk (ONLINE) or only on tape (NEARLINE), query its locality with the corresponding dot command:&lt;br /&gt;
&lt;br /&gt;
    $ cat &amp;quot;.(get)(test_500M_1)(locality)&amp;quot;&lt;br /&gt;
    ONLINE&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Storage&amp;diff=1176</id>
		<title>Storage</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Storage&amp;diff=1176"/>
		<updated>2025-01-09T15:37:50Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: Created page with &amp;quot;= Tape =   Status and attributes of tape-backed storage can be accessed through special dot commands.  Some of them are listed here: https://www.dcache.org/manuals/Book-10.2/r...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Tape =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Status and attributes of tape-backed storage can be accessed through special dot commands.&lt;br /&gt;
&lt;br /&gt;
Some of them are listed here: https://www.dcache.org/manuals/Book-10.2/rf-dot-commands.shtml&lt;br /&gt;
&lt;br /&gt;
For example:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
To check if a folder is backed by tape or not, you can use the `grep &amp;quot;&amp;quot; $(cat  &amp;quot;.(tags)()&amp;quot;)` command&lt;br /&gt;
&lt;br /&gt;
On a tape-backed folder you will see something like this:&lt;br /&gt;
&lt;br /&gt;
    [torradeflot@tds280 albag20240319044]$ grep &amp;quot;&amp;quot; $(cat  &amp;quot;.(tags)()&amp;quot;)&lt;br /&gt;
    .(tag)(AccessLatency):NEARLINE&lt;br /&gt;
    .(tag)(file_family):tape&lt;br /&gt;
    .(tag)(file_family_width):1&lt;br /&gt;
    .(tag)(file_family_wrapper):cpio_odc&lt;br /&gt;
    .(tag)(library):IBML8&lt;br /&gt;
    .(tag)(OSMTemplate):StoreName vo-incaem&lt;br /&gt;
    .(tag)(RetentionPolicy):CUSTODIAL&lt;br /&gt;
    .(tag)(sGroup):incaem&lt;br /&gt;
    .(tag)(storage_group):vo-incaem&lt;br /&gt;
&lt;br /&gt;
In a non-tape folder:&lt;br /&gt;
&lt;br /&gt;
    [torradeflot@tds280 incaem]$ grep &amp;quot;&amp;quot; $(cat  &amp;quot;.(tags)()&amp;quot;)&lt;br /&gt;
    .(tag)(AccessLatency):ONLINE&lt;br /&gt;
    .(tag)(file_family):incaem&lt;br /&gt;
    .(tag)(RetentionPolicy):REPLICA&lt;br /&gt;
    .(tag)(storage_group):vo-incaem&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1175</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1175"/>
		<updated>2024-12-03T11:09:37Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Jupyterlab user guide */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it makes scaling up easier since the code is tested in the same environment in which it would later run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on overall resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here] or [https://github.com/mamba-org/mamba mamba/micromamba]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you don't need a specific version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid activating the base environment every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment, however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: the directories where downloaded conda packages are cached&lt;br /&gt;
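For instance, a hypothetical pkgs_dirs entry that moves the package cache to a scratch volume (the path below is a placeholder; choose one that exists for your project):&lt;br /&gt;
&lt;br /&gt;
```yaml
pkgs_dirs:
  - /data/astro/scratch/youruser/conda_pkgs
```
&lt;br /&gt;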
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. In that case, it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the JupyterLab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
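A quick way to catch typos in a kernel definition is to parse it with Python's json module and check the fields Jupyter expects. This is an illustrative check, not an official tool; the spec is inlined here instead of being read from the kernels folder:&lt;br /&gt;
&lt;br /&gt;
```python
import json

# The kernel.json content from above, inlined for illustration.
kernel_json = """
{
  "argv": ["singularity", "exec", "--cleanenv",
           "/path/to/the/singularity/image.sif",
           "python", "-m", "ipykernel",
           "-f", "{connection_file}"],
  "language": "python",
  "display_name": "singularity-kernel"
}
"""

spec = json.loads(kernel_json)  # raises ValueError if the JSON is malformed
# Jupyter needs these keys, and argv must carry the connection-file placeholder.
assert set(["argv", "display_name", "language"]).issubset(spec)
assert "{connection_file}" in spec["argv"]
print("kernel spec looks OK")
```
&lt;br /&gt;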
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* check the CUDA_VISIBLE_DEVICES environment variable: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7)&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
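The steps above can also be done from Python inside a notebook. A minimal sketch (not a PIC-provided helper) that parses CUDA_VISIBLE_DEVICES the same way, with an unset variable meaning no GPUs:&lt;br /&gt;
&lt;br /&gt;
```python
import os

def assigned_gpus(env=os.environ):
    """Return the list of GPU ids assigned to the job; empty if none."""
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu.strip() for gpu in value.split(",") if gpu.strip()]

# Example: a job that was assigned GPUs 2 and 5.
print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "2,5"}))  # ['2', '5']
# No variable set: no GPUs assigned.
print(assigned_gpus({}))  # []
```
&lt;br /&gt;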
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/4.2.x/ here], where you will find instructions on how to &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/4.2.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial JupyterLab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. As a result, a '''git diff''' of the .ipynb file produces a huge output (because the image changed) even when the code itself did not change. It is therefore convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git instead. Outputs, including images, as well as some additional metadata, are not added to the synced text file, so different executions of the same notebook always produce an empty diff.&lt;br /&gt;
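To see why the synced text file gives stable diffs, here is an illustrative, stdlib-only sketch (not part of jupytext) comparing two notebook documents that differ only in their outputs:&lt;br /&gt;
&lt;br /&gt;
```python
import json

# Two minimal notebook documents with the same code cell but different
# outputs, as happens when the same notebook is executed twice.
nb_run1 = json.dumps({"cells": [{"cell_type": "code",
                                 "source": ["plt.imshow(img)"],
                                 "outputs": ["base64-image-A"]}]})
nb_run2 = json.dumps({"cells": [{"cell_type": "code",
                                 "source": ["plt.imshow(img)"],
                                 "outputs": ["base64-image-B"]}]})

def sources(nb_json):
    """Keep only the code, which is what a paired text file contains."""
    return [cell["source"] for cell in json.loads(nb_json)["cells"]]

print(nb_run1 == nb_run2)                    # False: the .ipynb files differ
print(sources(nb_run1) == sources(nb_run2))  # True: the code is identical
```
&lt;br /&gt;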
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the JupyterLab server are stored in &amp;quot;~/.jupyter&amp;quot;. Note that the log files are only written once the JupyterLab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there's some error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1172</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1172"/>
		<updated>2024-11-21T11:35:03Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* conda / mamba configuration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it makes scaling up easier since the code is tested in the same environment in which it would later run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on overall resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that, you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
For example, to create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
  envs_dirs:&lt;br /&gt;
    - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
    - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
    - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: The folder where conda packages are stored&lt;br /&gt;
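&lt;br /&gt;
For instance, a minimal &amp;quot;$HOME/.condarc&amp;quot; combining these parameters could look like the sketch below (the paths are placeholders, not actual PIC locations):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
auto_activate_base: false&lt;br /&gt;
envs_dirs:&lt;br /&gt;
  - /path/to/your/envs&lt;br /&gt;
pkgs_dirs:&lt;br /&gt;
  - /path/to/your/pkgs&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;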
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We found recently that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project may be provided in the form of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, so you immediately know how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
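&lt;br /&gt;
If you prefer to do this check from inside a notebook, the following minimal Python sketch (illustrative only, not PIC-specific code) parses CUDA_VISIBLE_DEVICES in the same way as the terminal check above:&lt;br /&gt;
&lt;br /&gt;
```python
import os

# CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids;
# if it is unset or empty, no GPUs are assigned to the job.
raw = os.environ.get("CUDA_VISIBLE_DEVICES", "")
gpu_ids = [gid for gid in raw.split(",") if gid]
print(f"{len(gpu_ids)} GPU(s) assigned: {gpu_ids}")
```
&lt;br /&gt;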
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to: &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. Outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1171</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1171"/>
		<updated>2024-11-21T11:26:54Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Create virtual environments with venv or conda */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating scalability, since the code is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that, you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of those users.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
== conda / mamba configuration ==&lt;br /&gt;
&lt;br /&gt;
The behaviour of conda/mamba can be configured through the &amp;quot;$HOME/.condarc&amp;quot; file, described [https://docs.conda.io/projects/conda/en/latest/configuration.html here]. Some interesting parameters:&lt;br /&gt;
&lt;br /&gt;
* envs_dirs: The list of directories to search for named environments. E.g.: different locations where you created environments&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
envs_dirs:&lt;br /&gt;
  - /data/pic/scratch/torradeflot/envs&lt;br /&gt;
  - /data/astro/scratch/torradeflot/envs&lt;br /&gt;
  - /data/aai/scratch/torradeflot/envs&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* pkgs_dirs: Where to store temporary/cached data&lt;br /&gt;
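As an illustration, a pkgs_dirs entry in &amp;quot;$HOME/.condarc&amp;quot; can be set the same way; the fragment below is a sketch with placeholder paths, not PIC-specific values:&lt;br /&gt;

```yaml
# Fragment of $HOME/.condarc (placeholder paths, adjust to your setup)
pkgs_dirs:
  # Package cache; pointing it to scratch keeps it out of your home quota
  - /path/to/scratch/conda_pkgs
```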
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path is not resolved as expected. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have the '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
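Putting the first step into code, here is a minimal sketch of reading CUDA_VISIBLE_DEVICES from Python; the function name is our own, not part of any PIC tooling:&lt;br /&gt;

```python
import os

def assigned_gpus(env=os.environ):
    """Return the list of GPU ids assigned to the job, as integers.

    An empty list means no GPUs are assigned to the job.
    """
    value = env.get("CUDA_VISIBLE_DEVICES", "")
    return [int(i) for i in value.split(",") if i.strip()]

# Example: with CUDA_VISIBLE_DEVICES="1,3" the job owns GPUs 1 and 3.
print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "1,3"}))  # [1, 3]
print(assigned_gpus({}))                               # []
```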
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to: &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
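The effect can be illustrated with a small sketch; the dictionaries below are hypothetical, heavily reduced stand-ins for the real .ipynb structure, used only to show why stripping outputs stabilises the diff:&lt;br /&gt;

```python
import json

# Two runs of the same notebook: identical code, different embedded output
# (hypothetical minimal .ipynb-like structures, for illustration only).
def notebook(image_b64):
    return {
        "cells": [{
            "cell_type": "code",
            "source": ["plt.imshow(np.random.random([10, 10]))"],
            "outputs": [{"data": {"image/png": image_b64}}],
        }]
    }

run1 = notebook("iVBORw0KGgoA...")
run2 = notebook("iVBORw0KGgoB...")

# The raw .ipynb files differ even though the code is the same:
print(json.dumps(run1) == json.dumps(run2))  # False

def strip_outputs(nb):
    # What a jupytext-synced text file effectively keeps: the sources only.
    return [cell["source"] for cell in nb["cells"]]

# The synced representation is identical across runs, so the git diff is empty:
print(strip_outputs(run1) == strip_outputs(run2))  # True
```

jupytext keeps the paired text file in sync on save, so this stable representation is what ends up tracked in git.&lt;br /&gt;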
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor, but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1170</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1170"/>
		<updated>2024-11-21T11:16:09Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Initialize conda (we highly recommend the use of miniforge/mambaforge) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at a massive scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, that means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further on, you see an icon with a &amp;quot;D&amp;quot; (Desktop); this one starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Also, you can now find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution (instructions [https://github.com/conda-forge/miniforge here]) or [https://github.com/mamba-org/mamba mamba/micromamba].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid the base environment being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter and start a session. From the session dashboard, choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of those users.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using X509 proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path is not resolved as expected. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. As a result, a '''git diff''' of the .ipynb file produces a huge output (because the image changed), even if the code did not change at all. It is therefore convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. Outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1169</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1169"/>
		<updated>2024-11-18T10:13:33Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* How to connect to the service */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should choose a test data volume that can be processed within a session, i.e. in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but it can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|900px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that, you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory in order to activate the base environment on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
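To see which environments are currently linked, you can run '''jupyter kernelspec list''', or inspect the per-user kernels folder directly. A minimal sketch in Python, assuming the default kernel location under your home (the helper name is ours):

```python
import json
from pathlib import Path

def list_kernels(base_dir=None):
    """Return {kernel_name: argv} for every kernel.json found under base_dir."""
    if base_dir is None:
        base_dir = Path.home() / ".local/share/jupyter/kernels"
    base_dir = Path(base_dir)
    kernels = {}
    if not base_dir.is_dir():
        return kernels
    # Each linked kernel lives in its own subfolder containing a kernel.json.
    for spec in base_dir.glob("*/kernel.json"):
        kernels[spec.parent.name] = json.loads(spec.read_text())["argv"]
    return kernels

for name, argv in list_kernels().items():
    print(name, "->", argv[0])
```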
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We have recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
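An alternative to typing the absolute path by hand is to resolve the relative path before exporting the variable. A minimal sketch, assuming GNU coreutils' readlink is available and the proxy file was created in the current directory as shown above:

```shell
# Resolve the proxy location to an absolute path so that tools
# inside the Jupyter session can always find it, regardless of
# the current working directory.
proxy="./x509up_u$(id -u)"
export X509_USER_PROXY="$(readlink -f "$proxy")"
echo "$X509_USER_PROXY"
```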
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home '''~/.local/share/jupyter/kernels/sage/kernel.json'''&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
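The manual edit of kernel.json can also be scripted. A minimal sketch in Python, assuming the environment path shown above and the default per-user kernel location (both are placeholders for your actual paths):

```python
import json
from pathlib import Path

def sage_kernel_spec(sage_bin="/data/astro/software/envs/sage/bin/sage"):
    """Build the kernel.json contents shown above."""
    return {
        "argv": [sage_bin, "--python", "-m", "sage.repl.ipython_kernel",
                 "-f", "{connection_file}"],
        "display_name": "sage",
        "language": "sage",
        "metadata": {"debugger": True},
    }

# Only overwrite the spec previously created by "ipykernel install".
kernel_file = Path.home() / ".local/share/jupyter/kernels/sage/kernel.json"
if kernel_file.exists():
    kernel_file.write_text(json.dumps(sage_kernel_spec(), indent=1))
```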
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. In that case, it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
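The CUDA_VISIBLE_DEVICES check above can also be done from inside a notebook. A minimal sketch in Python (the helper name is ours, not part of any PIC tooling):

```python
import os

def assigned_gpus(value=None):
    """Return the list of GPU ids assigned to this job,
    parsed from the CUDA_VISIBLE_DEVICES environment variable."""
    if value is None:
        value = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    # Splitting "" would yield [""], so filter out empty entries.
    return [gpu_id for gpu_id in value.split(",") if gpu_id]

gpus = assigned_gpus()
print(f"{len(gpus)} GPU(s) assigned: {gpus}")
```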
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. As a result, a '''git diff''' of the .ipynb file produces a huge output (because the image changed), even if the code did not change at all. It is therefore convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. Outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
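Pairing can be set up from the jupytext commands in the JupyterLab command palette, or from the command line. A sketch using the jupytext CLI, assuming a notebook named notebook.ipynb (see the jupytext documentation for the available formats):

```shell
# Pair the notebook with a .py file in the percent format
jupytext --set-formats ipynb,py:percent notebook.ipynb

# After editing either file, propagate the changes to its pair
jupytext --sync notebook.ipynb
```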
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1168</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1168"/>
		<updated>2024-11-18T10:10:48Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should choose a test data volume that can be processed within a session, i.e. in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but it can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
On the next screen you can choose the tool you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session and allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. A later section shows how to create a new environment and remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend installing the minimal '''miniforge''' distribution; instructions are available [https://github.com/conda-forge/miniforge here].&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you do not need a specific version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, install it by running '''pip install ipykernel'''.&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path on a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
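The venv steps above can be sketched end to end; the following is a minimal sanity check that uses a temporary directory as a stand-in for /path/to/env (the real location comes from your project liaison):&lt;br /&gt;

```shell
# Create a venv in a temporary directory standing in for /path/to/env,
# activate it, and confirm that python now resolves inside the environment.
ENV_ROOT="$(mktemp -d)"
cd "$ENV_ROOT"
python3 -m venv your_env
. your_env/bin/activate
python -c 'import sys; print(sys.prefix)'
deactivate
```

From here, pip install works exactly as in the transcript above, and modules land only inside the environment.&lt;br /&gt;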
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''. &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the path stored in the variable is relative. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
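The manual step above can be scripted by building the absolute path from $HOME instead of typing it out. A minimal sketch (it only sets the variable; voms-proxy-init must still be run to create the proxy file):&lt;br /&gt;

```shell
# Point X509_USER_PROXY at an absolute path so the proxy is found
# regardless of the current working directory.
export X509_USER_PROXY="$HOME/x509up_u$(id -u)"
echo "$X509_USER_PROXY"
```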
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the form of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
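A malformed kernel.json simply never shows up in the launcher, so it can help to validate the file before refreshing. The sketch below writes a spec like the one above into a temporary directory; in real use the target is ~/.local/share/jupyter/kernels/singularity, and the image path is a placeholder:&lt;br /&gt;

```shell
# Write the kernel spec into a temp dir and check that it parses as JSON.
KDIR="$(mktemp -d)/singularity"
mkdir -p "$KDIR"
cat > "$KDIR/kernel.json" <<'EOF'
{
  "argv": [
    "singularity", "exec", "--cleanenv",
    "/path/to/the/singularity/image.sif",
    "python", "-m", "ipykernel",
    "-f", "{connection_file}"
  ],
  "language": "python",
  "display_name": "singularity-kernel"
}
EOF
python3 -m json.tool "$KDIR/kernel.json" > /dev/null && echo "kernel.json is valid JSON"
```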
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
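The CUDA_VISIBLE_DEVICES check above can be scripted. In this sketch the value &amp;quot;0,3&amp;quot; is a hypothetical example; in a real session the batch system sets the variable for you:&lt;br /&gt;

```shell
# Count the GPUs assigned to the job by splitting CUDA_VISIBLE_DEVICES on commas.
CUDA_VISIBLE_DEVICES="0,3"   # hypothetical example value; normally set by the batch system
if [ -z "${CUDA_VISIBLE_DEVICES:-}" ]; then
  ngpus=0
else
  ngpus=$(printf '%s' "$CUDA_VISIBLE_DEVICES" | awk -F',' '{print NF}')
fi
echo "GPUs assigned: $ngpus"
```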
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file), tracked in a git repository, containing only the cell below.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. Consequently, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so different executions of the same notebook yield an empty diff.&lt;br /&gt;
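The effect can be illustrated without jupytext itself: two heavily simplified, hypothetical notebook files whose code cell is identical but whose embedded output payload differs still produce a non-empty diff:&lt;br /&gt;

```shell
# Two toy .ipynb files: same source cell, different embedded output payload.
WORK="$(mktemp -d)"
cat > "$WORK/run1.ipynb" <<'EOF'
{"cells": [{"source": ["plt.imshow(img)"], "outputs": [{"data": "iVBORw0KGgoAAAA"}]}]}
EOF
cat > "$WORK/run2.ipynb" <<'EOF'
{"cells": [{"source": ["plt.imshow(img)"], "outputs": [{"data": "iVBORw0KGgoBBBB"}]}]}
EOF
# The code is identical, yet the raw diff is non-empty because of the output blob.
diff "$WORK/run1.ipynb" "$WORK/run2.ipynb" || echo "files differ although the code is identical"
```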
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management:&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately, a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there was an error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1167</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1167"/>
		<updated>2024-11-18T10:10:13Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* How to connect to the service */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. Usage is similar to running notebooks on your personal computer, but with the advantage of developing and testing your code on different hardware configurations, and of easier scaling, since the code is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is intended strictly for development and small-scale testing, a shutdown policy for sessions is in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on overall resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
On the next screen you can choose the tool you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session and allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. A later section shows how to create a new environment and remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you do not need a specific version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path on a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To install your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the session environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
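Equivalently, the absolute path can be built from $HOME instead of being typed out, which avoids mistakes with the NFS prefix. A minimal sketch (it assumes the proxy file was already created in your home directory):

```shell
# Build the absolute proxy path from $HOME instead of hard-coding it.
# Assumes the proxy file was created in your home directory beforehand.
export X509_USER_PROXY="$HOME/x509up_u$(id -u)"
# The variable now holds an absolute path, so tools running in any working
# directory (including Jupyter's relocated /tmp) can find the proxy.
echo "$X509_USER_PROXY"
```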
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
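Editing the JSON by hand is error-prone; the same spec can be generated with Python's json module. A sketch that writes the kernel spec shown above (paths are the ones from this guide):

```python
import json
from pathlib import Path

# Kernel spec from the example above (sage paths as used in this guide).
spec = {
    "argv": [
        "/data/astro/software/envs/sage/bin/sage",
        "--python",
        "-m",
        "sage.repl.ipython_kernel",
        "-f",
        "{connection_file}",
    ],
    "display_name": "sage",
    "language": "sage",
    "metadata": {"debugger": True},
}

# Location that `ipykernel install --user --name=sage` creates.
path = Path.home() / ".local/share/jupyter/kernels/sage/kernel.json"
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(spec, indent=1))

# Sanity check: the written file parses back to the same spec.
assert json.loads(path.read_text()) == spec
```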
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
The software stack of some projects may be provided as a singularity image. In that case it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES. In a terminal run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
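The CUDA_VISIBLE_DEVICES check can also be done from Python inside a notebook; a small sketch:

```python
import os

def assigned_gpus():
    """Return the list of GPU ids assigned to this job.

    An unset or empty CUDA_VISIBLE_DEVICES means no GPUs are assigned.
    """
    ids = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [i for i in ids.split(",") if i]

print(f"{len(assigned_gpus())} GPU(s) assigned: {assigned_gpus()}")
```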
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you have a notebook (.ipynb file), tracked in a git repository, containing only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
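A toy illustration of the effect (not the jupytext implementation itself): once cell outputs are stripped, only the code remains, so reruns that change the embedded image do not change the tracked file. The base64 strings below are hypothetical placeholders.

```python
import json

def strip_outputs(notebook: dict) -> dict:
    """Return a copy of a minimal notebook dict with cell outputs removed."""
    cleaned = {"cells": []}
    for cell in notebook["cells"]:
        cleaned["cells"].append({"cell_type": cell["cell_type"],
                                 "source": cell["source"]})
    return cleaned

# Two runs of the same cell: identical code, different embedded image output
# (the base64 payloads are hypothetical placeholders).
run1 = {"cells": [{"cell_type": "code", "source": "plt.imshow(...)",
                   "outputs": [{"image/png": "iVBORw0KGgoA..."}]}]}
run2 = {"cells": [{"cell_type": "code", "source": "plt.imshow(...)",
                   "outputs": [{"image/png": "R0lGODlhAQAB..."}]}]}

assert json.dumps(run1) != json.dumps(run2)        # raw .ipynb files differ
assert strip_outputs(run1) == strip_outputs(run2)  # tracked text is identical
```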
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. They are created once the jupyterlab server job finishes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user cannot access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there is some error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1164</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1164"/>
		<updated>2024-10-22T12:30:32Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Code samples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that you should choose a test data volume that can be processed within a session in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter and start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new one.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the session environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
The software stack of some projects may be provided as a singularity image. In that case it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs that are assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
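The CUDA_VISIBLE_DEVICES check described above can be scripted. A minimal Python sketch, using nothing beyond the standard library:&lt;br /&gt;

```python
import os

def assigned_gpus():
    """Return the list of GPU ids assigned to the job, read from
    CUDA_VISIBLE_DEVICES; an empty list means no GPUs are assigned."""
    value = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu.strip() for gpu in value.split(",") if gpu.strip()]

print("GPUs assigned to this job:", assigned_gpus())
```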
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here]. There you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file), tracked in a git repository, containing only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. A '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
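To see concretely what the pairing avoids, here is a sketch (not how jupytext itself is implemented) that strips outputs and execution counts from a notebook's JSON; what remains is the stable, diff-friendly part that ends up in the synced text file. The notebook dictionary is a hypothetical minimal example:&lt;br /&gt;

```python
import json

# Minimal notebook with one code cell whose output embeds an image
# (hypothetical content, standing in for a real .ipynb file).
nb = {
    "cells": [
        {
            "cell_type": "code",
            "source": ["plt.imshow(np.random.random([10, 10]))"],
            "outputs": [{"data": {"image/png": "iVBORw0KGgo..."}}],
            "execution_count": 3,
        }
    ],
    "nbformat": 4,
    "nbformat_minor": 5,
}

def strip_outputs(notebook):
    """Remove outputs and execution counts from code cells; the result
    no longer changes when the same code is simply re-executed."""
    for cell in notebook["cells"]:
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return notebook

print(json.dumps(strip_outputs(nb), indent=1))
```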
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;br /&gt;
&lt;br /&gt;
= Troubleshooting =&lt;br /&gt;
&lt;br /&gt;
== Logs ==&lt;br /&gt;
&lt;br /&gt;
The log files for the jupyterlab server are stored in &amp;quot;~/.jupyter&amp;quot;. The log files are only created once the jupyterlab server job has finished.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Clean workspaces ==&lt;br /&gt;
&lt;br /&gt;
Jupyterlab stores the workspace status in the &amp;quot;~/.jupyter/lab/workspaces&amp;quot; folder. If you want to start with a fresh (empty) workspace, delete all the content of this folder before launching the notebook.&lt;br /&gt;
&lt;br /&gt;
    cd ~/.jupyter/lab/workspaces&lt;br /&gt;
    rm *&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 504 Gateway timeout ==&lt;br /&gt;
&lt;br /&gt;
The notebook job is running in HTCondor but the user can not access the notebook server. Ultimately a 504 error is received.&lt;br /&gt;
&lt;br /&gt;
This is probably because there was an error when starting the jupyterlab server. First of all, shut down the notebook server and check the logs to better identify the problem. If you don't see the source of the error, try cleaning the workspaces and launching a notebook again.&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1158</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1158"/>
		<updated>2024-06-13T10:07:28Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Dask */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that the test data volume you work with during a session should be small enough to be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but it can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To avoid the base environment being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home directory under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We found recently that using X509 proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home directory, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. In that case it is convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs that are assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable is not set, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you've been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here]. There you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file), tracked in a git repository, containing only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. A '''git diff''' of the .ipynb file would therefore produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1157</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1157"/>
		<updated>2024-06-13T10:07:18Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Dask */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it facilitates scaling up your code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means that the test data volume you work with during a session should be small enough to be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen will show the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute, but it can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you will see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
You can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
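&lt;br /&gt;
To check which kernels are currently linked, you can list the installed kernelspecs:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec list&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;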
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
 &amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
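&lt;br /&gt;
Putting it together, a minimal sketch creating a python3 environment with scipy at a hypothetical prefix and adding one more module (astropy stands in for any additional module):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env python=3 scipy&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install astropy&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;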
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~] mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
A notebook with instructions on how to run Dask at PIC can be found [https://gitlab.pic.es/services/code-samples/-/blob/main/computing/dask/dask_htcondor.ipynb here]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project may be provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
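&lt;br /&gt;
If the kernel does not appear, a quick sanity check is to verify that the image can actually import '''ipykernel''' (using the same placeholder image path as above):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ singularity exec /path/to/the/singularity/image.sif python -c &amp;quot;import ipykernel&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
If this command prints an ImportError, install ipykernel inside the image first.&lt;br /&gt;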
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs that are assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
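&lt;br /&gt;
For instance, a quick sketch to count the assigned GPUs from the terminal (the ids shown are hypothetical):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ echo $CUDA_VISIBLE_DEVICES&lt;br /&gt;
0,2&lt;br /&gt;
[neissner@td110 ~]$ echo $CUDA_VISIBLE_DEVICES | tr ',' '\n' | wc -l&lt;br /&gt;
2&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;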
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
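&lt;br /&gt;
As a sketch, a notebook can be paired with a percent-format script from the command line (notebook.ipynb is a hypothetical file name):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ jupytext --set-formats ipynb,py:percent notebook.ipynb&lt;br /&gt;
(...) [neissner@td110 ~]$ jupytext --sync notebook.ipynb&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The generated notebook.py can then be tracked with git instead of the .ipynb file.&lt;br /&gt;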
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1156</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1156"/>
		<updated>2024-06-13T09:55:21Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Code base */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means the test data volume you work with during a session should be small enough to be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the Jupyter session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on our resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, you can also find the icon of Visual Studio, an integrated development environment.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed you can use the link provided in the example.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used for whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
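&lt;br /&gt;
To check which kernels are currently linked, you can list the installed kernelspecs:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec list&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;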
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
 &amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
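&lt;br /&gt;
Putting it together, a minimal sketch creating a python3 environment with scipy at a hypothetical prefix and adding one more module (astropy stands in for any additional module):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env python=3 scipy&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install astropy&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;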
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that the usage of proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home directory, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
[[Media:dask_htcondor.pdf|Dask + HTCondor manual]]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
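&lt;br /&gt;
Instead of editing '''kernel.json''' by hand, you can also generate it from Python. A minimal sketch (the .sif path is the placeholder used above, not a real image):&lt;br /&gt;
&lt;br /&gt;
```python
import json

# the kernel spec from above, built as a Python dict; the .sif path
# is a placeholder that you must replace with your actual image
spec = {
    'argv': ['singularity', 'exec', '--cleanenv',
             '/path/to/the/singularity/image.sif',
             'python', '-m', 'ipykernel', '-f', '{connection_file}'],
    'language': 'python',
    'display_name': 'singularity-kernel',
}

# write this text to the file created in the previous step, i.e.
# ~/.local/share/jupyter/kernels/singularity/kernel.json
text = json.dumps(spec, indent=2)
print(text)
```
&lt;br /&gt;
Generating the file programmatically guarantees valid JSON; a kernel spec with malformed JSON will typically not show up in the launcher.&lt;br /&gt;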
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
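&lt;br /&gt;
The CUDA_VISIBLE_DEVICES check above can also be done from inside a notebook. A minimal Python sketch (the variable itself is standard CUDA behaviour; the helper function is illustrative):&lt;br /&gt;
&lt;br /&gt;
```python
import os

def assigned_gpus():
    # CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids;
    # if it is unset or empty, no GPUs are assigned to the job
    value = os.environ.get('CUDA_VISIBLE_DEVICES', '')
    if value == '':
        return []
    return [int(i) for i in value.split(',')]

# example: pretend the job was assigned GPUs 0 and 2
os.environ['CUDA_VISIBLE_DEVICES'] = '0,2'
print(assigned_gpus())   # [0, 2]
```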
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
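&lt;br /&gt;
The effect can be sketched with plain Python. This is only an illustration of what pairing achieves, not the actual jupytext implementation, and the notebook structure below is heavily simplified:&lt;br /&gt;
&lt;br /&gt;
```python
# heavily simplified one-cell notebook: the code is one line, but the
# output embeds a base64 image (shortened here) that changes on every run
cell = {
    'source': ['plt.imshow(np.random.random([10, 10]))'],
    'outputs': [{'data': {'image/png': 'iVBORw0KGgoAAAANSUhEUg...'}}],
}

# the paired text file keeps only the source, so re-running the notebook
# leaves it byte-for-byte identical and the git diff stays empty
paired_text = ''.join(cell['source'])
print(paired_text)   # plt.imshow(np.random.random([10, 10]))
```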
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code samples =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1155</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1155"/>
		<updated>2024-06-13T09:54:52Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Jupyterlab user guide */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
Further, you see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
Also, recently an icon for Visual Studio, an integrated development environment, has been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. They will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If no specific version is needed, you can use the path provided in the example.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment.&lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment.&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name kernel appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as there may already be a suitable environment in place for your needs.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask them for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (i.e. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy''&lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session might cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, so we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home directory, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
[[Media:dask_htcondor.pdf|Dask + HTCondor manual]]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a singularity image. It is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Code base =&lt;br /&gt;
&lt;br /&gt;
A repository with sample code can be found here: https://gitlab.pic.es/services/code-samples/&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1143</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1143"/>
		<updated>2024-04-03T10:19:19Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Initialize conda (we highly recommend the use of mambaforge) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development or prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but offers the advantage of developing and testing your code on different hardware configurations, as well as facilitating the scalability of the code, since it is tested in the same environment in which it would run at scale.&lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data volume you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to see your login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes depending on resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of miniforge/mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you don't need a specific conda version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/alma9/conda/miniforge-24.1.2/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in order to activate the base environment, you will have to run this command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ eval &amp;quot;$(/data/astro/software/alma9/conda/miniforge-24.1.2/bin/conda shell.bash hook)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
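&lt;br /&gt;
For example, with the environment still active, install the module and retry the kernel registration:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ pip install ipykernel&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;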
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
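&lt;br /&gt;
At any time you can check which kernels are currently linked to Jupyter with the following command, which lists each kernel name together with the directory holding its kernel.json:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec list&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;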
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as a suitable environment for your needs may already be in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session may cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For the proxy to work correctly, please create it the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path breaks once the working directory changes. Therefore we have to put the absolute path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
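&lt;br /&gt;
To double-check that the variable points at a valid proxy, you can use '''voms-proxy-info''', which reads '''X509_USER_PROXY''' and prints the proxy subject and remaining lifetime:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ voms-proxy-info&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;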
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m  ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
[[Media:dask_htcondor.pdf|Dask + HTCondor manual]]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project may be provided in the form of a singularity image. It is then convenient to use this image as a kernel for the notebooks on jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have '''python''' with the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
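You can quickly check that a candidate image meets this requirement; the image path below is a placeholder:&lt;br /&gt;
&lt;br /&gt;
  singularity exec /path/to/the/singularity/image.sif python -c 'import ipykernel'&lt;br /&gt;
&lt;br /&gt;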
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the jupyterlab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs that are assigned to your job:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
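&lt;br /&gt;
The first check can also be done from inside a notebook; a minimal Python sketch using only the standard library:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
import os&lt;br /&gt;
&lt;br /&gt;
# CUDA_VISIBLE_DEVICES holds a comma-separated list of GPU ids, or is unset&lt;br /&gt;
ids = os.environ.get(&amp;quot;CUDA_VISIBLE_DEVICES&amp;quot;, &amp;quot;&amp;quot;)&lt;br /&gt;
gpus = [i for i in ids.split(&amp;quot;,&amp;quot;) if i]&lt;br /&gt;
print(&amp;quot;GPUs assigned:&amp;quot;, gpus if gpus else &amp;quot;none&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;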
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of jupyterlab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here]; there you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official jupyterlab extensions is installed to provide additional functionality:&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file), tracked in a git repository, containing only the cell below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and those images are embedded in a pseudo-binary format inside the notebook file. In this case, a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there was no change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so different executions of the same notebook leave the diff empty.&lt;br /&gt;
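&lt;br /&gt;
One way to set up such a pairing is with the jupytext command line tool; notebook.ipynb is a placeholder here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# pair the notebook with a .py script in the percent format&lt;br /&gt;
jupytext --set-formats ipynb,py:percent notebook.ipynb&lt;br /&gt;
# afterwards, keep both representations up to date&lt;br /&gt;
jupytext --sync notebook.ipynb&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;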
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repo management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;br /&gt;
&lt;br /&gt;
...&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1135</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1135"/>
		<updated>2024-02-21T13:39:56Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Dask */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, as well as easing later scale-up, since the code is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for sessions is in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice, this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you will be working on during the session. After choosing a configuration and pressing start, the next screen shows the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system and waits for available resources before being started. This usually takes less than a minute but can take up to a few minutes, depending on overall resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has also been added.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. In a later section we will show you how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the upper right corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC, who will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click on the terminal button on the session dashboard on the right to open a bash terminal. If you don't need a specific conda version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This modifies the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m  ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' has been used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC, as a suitable environment for your needs may already be in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env/folder), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
To create your_env at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session may cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For the proxy to work correctly, please create it the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located, because the relative path is resolved against a different working directory. Therefore we have to put the absolute path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
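&lt;br /&gt;
The same fix can be applied programmatically. This is a minimal sketch; ''export_proxy_path'' is an illustrative helper name, and the file name x509up_u&amp;lt;uid&amp;gt; matches the --out option used above.&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal sketch: build an absolute path for X509_USER_PROXY so it stays
# valid regardless of the current working directory.
import os

def export_proxy_path(proxy_dir):
    # x509up_u<uid> is the file name created by voms-proxy-init --out above
    proxy = os.path.join(os.path.abspath(proxy_dir), f"x509up_u{os.getuid()}")
    os.environ["X509_USER_PROXY"] = proxy
    return proxy

print(export_proxy_path("."))
```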
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric.&lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding notebook on any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object that is significantly larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for use in a Jupyter notebook session:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home directory, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
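&lt;br /&gt;
The manual edit above can also be scripted. This is a minimal sketch using only the stdlib json module; ''patch_sage_kernel'' is an illustrative helper name, not part of any PIC tooling, and it writes the exact spec shown above.&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal sketch: write the Sage kernel spec shown on this page.
import json

def patch_sage_kernel(kernel_json_path,
                      sage_bin="/data/astro/software/envs/sage/bin/sage"):
    spec = {
        "argv": [sage_bin, "--python", "-m", "sage.repl.ipython_kernel",
                 "-f", "{connection_file}"],
        "display_name": "sage",
        "language": "sage",
        "metadata": {"debugger": True},
    }
    with open(kernel_json_path, "w") as f:
        json.dump(spec, f, indent=1)
    return spec

# Point this at ~/.local/share/jupyter/kernels/sage/kernel.json in practice:
spec = patch_sage_kernel("/tmp/sage_kernel.json")
```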
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
[[Media:dask_htcondor.pdf|Dask + HTCondor manual]]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes the software stack of a project is provided in the shape of a Singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The Singularity image to be used as a kernel needs to fulfil some requirements, which differ depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The Singularity image needs to have '''python''' and the '''ipykernel''' module installed.&lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or restart the JupyterLab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
To identify the GPUs that are assigned to your job:&lt;br /&gt;
* Check the environment variable CUDA_VISIBLE_DEVICES. In a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a list of comma-separated GPU ids, so it also tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* List the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indices (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* In the GPU dashboard, the GPUs are identified by their index.&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
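&lt;br /&gt;
The first step above can be done programmatically. This is a minimal sketch; ''assigned_gpus'' is an illustrative helper name.&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal sketch: parse CUDA_VISIBLE_DEVICES into the list of GPU ids
# assigned to the job; an empty list means no GPUs were assigned.
import os

def assigned_gpus(env=None):
    env = os.environ if env is None else env
    raw = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu_id.strip() for gpu_id in raw.split(",") if gpu_id.strip()]

print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "2,5"}))  # ['2', '5']
```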
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of JupyterLab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here]. There you will find instructions on how to:&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of unofficial JupyterLab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
Suppose you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. As a result, a '''git diff''' of the .ipynb file produces a huge output (because the image changed) even if the code did not change at all. It is therefore convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track that file with git. The outputs, including images, as well as some additional metadata, are not added to the synced text file, so for different executions of the same notebook the diff will always be empty.&lt;br /&gt;
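&lt;br /&gt;
A stdlib-only illustration of why this happens: outputs live inside the .ipynb JSON, so re-running a cell changes the file even when the code is identical, and stripping them leaves the stable content the paired text file keeps. The notebook dicts and ''strip_outputs'' helper below are toy data for illustration, not jupytext's actual implementation.&lt;br /&gt;
&lt;br /&gt;
```python
# Toy illustration of the problem jupytext solves.
import json

def strip_outputs(nb):
    # Remove execution results and counters from all code cells.
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return nb

# Two runs of the same code cell, producing different images:
run1 = {"cells": [{"cell_type": "code", "source": "plt.imshow(img)",
                   "outputs": [{"data": {"image/png": "imgA"}}],
                   "execution_count": 1}]}
run2 = {"cells": [{"cell_type": "code", "source": "plt.imshow(img)",
                   "outputs": [{"data": {"image/png": "imgB"}}],
                   "execution_count": 2}]}
print(json.dumps(run1) == json.dumps(run2))                                # False
print(json.dumps(strip_outputs(run1)) == json.dumps(strip_outputs(run2)))  # True
```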
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=Dask_wiki&amp;diff=1134</id>
		<title>Dask wiki</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=Dask_wiki&amp;diff=1134"/>
		<updated>2024-02-21T13:38:49Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: Created page with &amp;quot;= Dask + HTCondor =  == Creating a cluster ==  === using dask-labextension ===  To create a Dask cluster from the extension. Go to the dask-labextension tab and click &amp;quot;+ NEW&amp;quot;...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Dask + HTCondor =&lt;br /&gt;
&lt;br /&gt;
== Creating a cluster ==&lt;br /&gt;
&lt;br /&gt;
=== using dask-labextension ===&lt;br /&gt;
&lt;br /&gt;
To create a Dask cluster from the extension, go to the dask-labextension tab and click &amp;quot;+ NEW&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
![kk](static/dask_labextension_1.png)&lt;br /&gt;
&lt;br /&gt;
You can see the cluster information and have some shortcuts:&lt;br /&gt;
* &amp;quot;&amp;lt;&amp;gt;&amp;quot; inserts a cell with the lines needed to create a Dask client. Some environment variables have been set so that a &amp;quot;regular&amp;quot; Dask Client can be used.&lt;br /&gt;
* &amp;quot;SCALE&amp;quot; increases/decreases the number of workers.&lt;br /&gt;
* &amp;quot;SHUTDOWN&amp;quot; shuts down the cluster.&lt;br /&gt;
&lt;br /&gt;
![kk](static/dask_add_cluster_cell.png)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    from dask.distributed import Client&lt;br /&gt;
    client = Client(&amp;quot;tls://192.168.100.56:45722&amp;quot;)&lt;br /&gt;
    client&lt;br /&gt;
&lt;br /&gt;
    /data/astro/scratch2/torradeflot/envs/dask/lib/python3.11/site-packages/distributed/client.py:1388: &lt;br /&gt;
    VersionMismatchWarning: Mismatched versions found&lt;br /&gt;
    &lt;br /&gt;
    +---------+----------------+----------------+----------------+&lt;br /&gt;
    | Package | Client         | Scheduler      | Workers        |&lt;br /&gt;
    +---------+----------------+----------------+----------------+&lt;br /&gt;
    | lz4     | 4.3.3          | 4.3.2          | 4.3.2          |&lt;br /&gt;
    | msgpack | 1.0.7          | 1.0.5          | 1.0.5          |&lt;br /&gt;
    | numpy   | 1.26.3         | 1.24.3         | 1.24.3         |&lt;br /&gt;
    | pandas  | 2.2.0          | 2.0.2          | 2.0.2          |&lt;br /&gt;
    | python  | 3.11.7.final.0 | 3.11.5.final.0 | 3.11.5.final.0 |&lt;br /&gt;
    | toolz   | 0.12.1         | 0.12.0         | 0.12.0         |&lt;br /&gt;
    +---------+----------------+----------------+----------------+&lt;br /&gt;
      warnings.warn(version_module.VersionMismatchWarning(msg[0][&amp;quot;warning&amp;quot;]))&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
client.cluster&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
The encryption of the traffic between the different Dask components is enforced through environment variables&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
import os&lt;br /&gt;
[f'{k}={v}' for k, v in os.environ.items() if 'DASK' in k]&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    ['DASK_DISTRIBUTED__COMM__TLS__CLIENT__KEY=/nfs/pic.es/user/t/torradeflot/.config/dask/security/key.pem',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__TLS__SCHEDULER__CERT=/nfs/pic.es/user/t/torradeflot/.config/dask/security/cert.pem',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__TLS__WORKER__CERT=/nfs/pic.es/user/t/torradeflot/.config/dask/security/cert.pem',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__TLS__CA_FILE=/nfs/pic.es/user/t/torradeflot/.config/dask/security/ca_file.pem',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__TLS__WORKER__KEY=/nfs/pic.es/user/t/torradeflot/.config/dask/security/key.pem',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__TLS__SCHEDULER__KEY=/nfs/pic.es/user/t/torradeflot/.config/dask/security/key.pem',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__REQUIRE_ENCRYPTION=true',&lt;br /&gt;
     'DASK_DISTRIBUTED__COMM__TLS__CLIENT__CERT=/nfs/pic.es/user/t/torradeflot/.config/dask/security/cert.pem']&lt;br /&gt;
&lt;br /&gt;
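The naming convention behind these variables can be sketched as follows: Dask reads any environment variable starting with DASK_, with a double underscore marking one level of config nesting and single underscores standing in for hyphens (assumption: the standard dask.config environment-variable convention; ''dask_env_var'' is an illustrative helper name).&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal sketch: build a Dask environment-variable name from a dotted
# config key, following the DASK_ / double-underscore convention.
def dask_env_var(dotted_key, value):
    name = "DASK_" + dotted_key.replace(".", "__").replace("-", "_").upper()
    return f"{name}={value}"

print(dask_env_var("distributed.comm.require-encryption", "true"))
# DASK_DISTRIBUTED__COMM__REQUIRE_ENCRYPTION=true
```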
&lt;br /&gt;
&lt;br /&gt;
![kk](static/dask_labextension_2.png)&lt;br /&gt;
&lt;br /&gt;
== From python ==&lt;br /&gt;
&lt;br /&gt;
=== Using the default environment ===&lt;br /&gt;
&lt;br /&gt;
This could be run from a notebook or from a script submitted to HTCondor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
from pic_jupyterhub.dask_condor import SecureHTCondor&lt;br /&gt;
from dask.distributed import Client&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster = SecureHTCondor()  # optionally: scheduler_options={'dashboard_address': &amp;quot;:8788&amp;quot;}&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
    Certificate files already exist&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The cluster is initially created without workers. We need to scale it.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster.scale(2)&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
client = Client(cluster)&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster.close()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
=== Using a custom environment ===&lt;br /&gt;
&lt;br /&gt;
The environment needs to have:&lt;br /&gt;
&lt;br /&gt;
* `dask-jobqueue` to be able to start Dask clusters with HTCondor&lt;br /&gt;
* `ipykernel` to use it as a notebook kernel&lt;br /&gt;
* `numpy` and `pandas` to be able to create Dask arrays and dataframes&lt;br /&gt;
* `bokeh` for the Dask dashboard&lt;br /&gt;
&lt;br /&gt;
`mamba create (-n {env_name} | -p {env_path}) dask-jobqueue ipykernel numpy pandas bokeh`&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
from dask_jobqueue import HTCondorCluster&lt;br /&gt;
from dask.distributed import Client&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster = HTCondorCluster(cores=1, memory='2GB', disk='10 GB',&lt;br /&gt;
                         job_extra_directives={'getenv': 'True'}) # needed to propagate the security environment variables&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    HTCondorCluster  a980977e&lt;br /&gt;
        Dashboard: http://192.168.102.42:8787/status   Workers: 1&lt;br /&gt;
        Total threads: 1   Total memory: 1.86 GiB&lt;br /&gt;
    Scheduler  Scheduler-78cf6c5f-5546-44b4-b4c7-54b6caea8f9b&lt;br /&gt;
        Comm: tls://192.168.102.42:40855   Started: Just now&lt;br /&gt;
    Worker: HTCondorCluster-0&lt;br /&gt;
        Comm: tls://192.168.100.27:40884   Nanny: tls://192.168.100.27:39432&lt;br /&gt;
        Dashboard: http://192.168.100.27:45160/status&lt;br /&gt;
        Total threads: 1   Memory: 1.86 GiB&lt;br /&gt;
        Local directory: /tmp/dask-scratch-space/worker-zscas2e0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster.scale(2)&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
c = Client(cluster)&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
Security is inherited from the environment variables:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
cluster.security&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    Security&lt;br /&gt;
        require_encryption: true&lt;br /&gt;
        tls_ca_file: /nfs/pic.es/user/t/torradeflot/.config/dask/security/ca_file.pem&lt;br /&gt;
        tls_client_cert, tls_scheduler_cert, tls_worker_cert: /nfs/pic.es/user/t/torradeflot/.config/dask/security/cert.pem&lt;br /&gt;
        tls_client_key, tls_scheduler_key, tls_worker_key: /nfs/pic.es/user/t/torradeflot/.config/dask/security/key.pem&lt;br /&gt;
        tls_min_version: 771&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Connect to existing cluster ==&lt;br /&gt;
&lt;br /&gt;
Meaning a cluster that was launched from outside your JupyterLab instance, e.g. an independent HTCondor job or somebody else's cluster.&lt;br /&gt;
&lt;br /&gt;
Check the IP address of the running job:&lt;br /&gt;
```&lt;br /&gt;
nslookup $(condor_q $JOB_ID -af RemoteHost | cut -d &amp;quot;@&amp;quot; -f 2)&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
from dask.distributed import Client&lt;br /&gt;
&lt;br /&gt;
client = Client(&amp;quot;tls://192.168.100.5:42166&amp;quot;)&lt;br /&gt;
client&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
    /data/astro/scratch2/torradeflot/envs/dask/lib/python3.11/site-packages/distributed/client.py:1388: VersionMismatchWarning: Mismatched versions found&lt;br /&gt;
    &lt;br /&gt;
    +---------+----------------+----------------+----------------+&lt;br /&gt;
    | Package | Client         | Scheduler      | Workers        |&lt;br /&gt;
    +---------+----------------+----------------+----------------+&lt;br /&gt;
    | lz4     | 4.3.3          | 4.3.2          | 4.3.2          |&lt;br /&gt;
    | msgpack | 1.0.7          | 1.0.5          | 1.0.5          |&lt;br /&gt;
    | numpy   | 1.26.3         | 1.24.3         | 1.24.3         |&lt;br /&gt;
    | pandas  | 2.2.0          | 2.0.2          | 2.0.2          |&lt;br /&gt;
    | python  | 3.11.7.final.0 | 3.11.5.final.0 | 3.11.5.final.0 |&lt;br /&gt;
    | toolz   | 0.12.1         | 0.12.0         | 0.12.0         |&lt;br /&gt;
    +---------+----------------+----------------+----------------+&lt;br /&gt;
      warnings.warn(version_module.VersionMismatchWarning(msg[0][&amp;quot;warning&amp;quot;]))&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div&amp;gt;&lt;br /&gt;
    &amp;lt;div style=&amp;quot;width: 24px; height: 24px; background-color: #e1e1e1; border: 3px solid #9D9D9D; border-radius: 5px; position: absolute;&amp;quot;&amp;gt; &amp;lt;/div&amp;gt;&lt;br /&gt;
    &amp;lt;div style=&amp;quot;margin-left: 48px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;h3 style=&amp;quot;margin-bottom: 0px;&amp;quot;&amp;gt;Client&amp;lt;/h3&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: #9D9D9D; margin-bottom: 0px;&amp;quot;&amp;gt;Client-589eb004-c406-11ee-8584-001e67f120a8&amp;lt;/p&amp;gt;&lt;br /&gt;
        &amp;lt;table style=&amp;quot;width: 100%; text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&amp;lt;strong&amp;gt;Connection method:&amp;lt;/strong&amp;gt; Direct&amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;tr&amp;gt;&lt;br /&gt;
                &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                    &amp;lt;strong&amp;gt;Dashboard: &amp;lt;/strong&amp;gt; &amp;lt;a href=&amp;quot;http://192.168.100.5:41069/status&amp;quot; target=&amp;quot;_blank&amp;quot;&amp;gt;http://192.168.100.5:41069/status&amp;lt;/a&amp;gt;&lt;br /&gt;
                &amp;lt;/td&amp;gt;&lt;br /&gt;
                &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;details&amp;gt;&lt;br /&gt;
            &amp;lt;summary style=&amp;quot;margin-bottom: 20px;&amp;quot;&amp;gt;&amp;lt;h3 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Scheduler Info&amp;lt;/h3&amp;gt;&amp;lt;/summary&amp;gt;&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;div&amp;gt;&lt;br /&gt;
        &amp;lt;div style=&amp;quot;width: 24px; height: 24px; background-color: #FFF7E5; border: 3px solid #FF6132; border-radius: 5px; position: absolute;&amp;quot;&amp;gt; &amp;lt;/div&amp;gt;&lt;br /&gt;
        &amp;lt;div style=&amp;quot;margin-left: 48px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h3 style=&amp;quot;margin-bottom: 0px;&amp;quot;&amp;gt;Scheduler&amp;lt;/h3&amp;gt;&lt;br /&gt;
            &amp;lt;p style=&amp;quot;color: #9D9D9D; margin-bottom: 0px;&amp;quot;&amp;gt;Scheduler-24c09727-b6fc-4703-b25c-6d3bfbf750ee&amp;lt;/p&amp;gt;&lt;br /&gt;
            &amp;lt;table style=&amp;quot;width: 100%; text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                &amp;lt;tr&amp;gt;&lt;br /&gt;
                    &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                        &amp;lt;strong&amp;gt;Comm:&amp;lt;/strong&amp;gt; tls://192.168.100.5:42166&lt;br /&gt;
                    &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                        &amp;lt;strong&amp;gt;Workers:&amp;lt;/strong&amp;gt; 5&lt;br /&gt;
                    &amp;lt;/td&amp;gt;&lt;br /&gt;
                &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;tr&amp;gt;&lt;br /&gt;
                    &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                        &amp;lt;strong&amp;gt;Dashboard:&amp;lt;/strong&amp;gt; &amp;lt;a href=&amp;quot;http://192.168.100.5:41069/status&amp;quot; target=&amp;quot;_blank&amp;quot;&amp;gt;http://192.168.100.5:41069/status&amp;lt;/a&amp;gt;&lt;br /&gt;
                    &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                        &amp;lt;strong&amp;gt;Total threads:&amp;lt;/strong&amp;gt; 5&lt;br /&gt;
                    &amp;lt;/td&amp;gt;&lt;br /&gt;
                &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;tr&amp;gt;&lt;br /&gt;
                    &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                        &amp;lt;strong&amp;gt;Started:&amp;lt;/strong&amp;gt; 42 minutes ago&lt;br /&gt;
                    &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                        &amp;lt;strong&amp;gt;Total memory:&amp;lt;/strong&amp;gt; 9.30 GiB&lt;br /&gt;
                    &amp;lt;/td&amp;gt;&lt;br /&gt;
                &amp;lt;/tr&amp;gt;&lt;br /&gt;
            &amp;lt;/table&amp;gt;&lt;br /&gt;
        &amp;lt;/div&amp;gt;&lt;br /&gt;
    &amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 48px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h3 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Workers&amp;lt;/h3&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;div style=&amp;quot;margin-bottom: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;div style=&amp;quot;width: 24px; height: 24px; background-color: #DBF5FF; border: 3px solid #4CC9FF; border-radius: 5px; position: absolute;&amp;quot;&amp;gt; &amp;lt;/div&amp;gt;&lt;br /&gt;
            &amp;lt;div style=&amp;quot;margin-left: 48px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;details&amp;gt;&lt;br /&gt;
                &amp;lt;summary&amp;gt;&lt;br /&gt;
                    &amp;lt;h4 style=&amp;quot;margin-bottom: 0px; display: inline;&amp;quot;&amp;gt;Worker: SecureHTCondor-0&amp;lt;/h4&amp;gt;&lt;br /&gt;
                &amp;lt;/summary&amp;gt;&lt;br /&gt;
                &amp;lt;table style=&amp;quot;width: 100%; text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Comm: &amp;lt;/strong&amp;gt; tls://192.168.101.127:38421&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Total threads: &amp;lt;/strong&amp;gt; 1&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Dashboard: &amp;lt;/strong&amp;gt; &amp;lt;a href=&amp;quot;http://192.168.101.127:34923/status&amp;quot; target=&amp;quot;_blank&amp;quot;&amp;gt;http://192.168.101.127:34923/status&amp;lt;/a&amp;gt;&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Memory: &amp;lt;/strong&amp;gt; 1.86 GiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Nanny: &amp;lt;/strong&amp;gt; tls://192.168.101.127:38180&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Local directory: &amp;lt;/strong&amp;gt; /tmp/dask-scratch-space/worker-vzw0w_hh&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks executing: &amp;lt;/strong&amp;gt; &lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks in memory: &amp;lt;/strong&amp;gt; &lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks ready: &amp;lt;/strong&amp;gt; &lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks in flight: &amp;lt;/strong&amp;gt;&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;CPU usage:&amp;lt;/strong&amp;gt; 4.0%&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Last seen: &amp;lt;/strong&amp;gt; Just now&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Memory usage: &amp;lt;/strong&amp;gt; 125.28 MiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Spilled bytes: &amp;lt;/strong&amp;gt; 0 B&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Read bytes: &amp;lt;/strong&amp;gt; 604.32 kiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Write bytes: &amp;lt;/strong&amp;gt; 0.95 MiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/details&amp;gt;&lt;br /&gt;
            &amp;lt;/div&amp;gt;&lt;br /&gt;
        &amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        [Analogous HTML repr blocks for workers SecureHTCondor-1 (tls://192.168.102.27:46440), SecureHTCondor-2 (tls://192.168.102.37:36278) and SecureHTCondor-3 (tls://192.168.100.12:45248) omitted; each shows the same fields as SecureHTCondor-0.]&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;div style=&amp;quot;margin-bottom: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;div style=&amp;quot;width: 24px; height: 24px; background-color: #DBF5FF; border: 3px solid #4CC9FF; border-radius: 5px; position: absolute;&amp;quot;&amp;gt; &amp;lt;/div&amp;gt;&lt;br /&gt;
            &amp;lt;div style=&amp;quot;margin-left: 48px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;details&amp;gt;&lt;br /&gt;
                &amp;lt;summary&amp;gt;&lt;br /&gt;
                    &amp;lt;h4 style=&amp;quot;margin-bottom: 0px; display: inline;&amp;quot;&amp;gt;Worker: SecureHTCondor-4&amp;lt;/h4&amp;gt;&lt;br /&gt;
                &amp;lt;/summary&amp;gt;&lt;br /&gt;
                &amp;lt;table style=&amp;quot;width: 100%; text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Comm: &amp;lt;/strong&amp;gt; tls://192.168.100.14:42847&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Total threads: &amp;lt;/strong&amp;gt; 1&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Dashboard: &amp;lt;/strong&amp;gt; &amp;lt;a href=&amp;quot;http://192.168.100.14:36788/status&amp;quot; target=&amp;quot;_blank&amp;quot;&amp;gt;http://192.168.100.14:36788/status&amp;lt;/a&amp;gt;&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Memory: &amp;lt;/strong&amp;gt; 1.86 GiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Nanny: &amp;lt;/strong&amp;gt; tls://192.168.100.14:46489&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot; style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Local directory: &amp;lt;/strong&amp;gt; /tmp/dask-scratch-space/worker-f3u75i_8&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks executing: &amp;lt;/strong&amp;gt; &lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks in memory: &amp;lt;/strong&amp;gt; &lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks ready: &amp;lt;/strong&amp;gt; &lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Tasks in flight: &amp;lt;/strong&amp;gt;&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;CPU usage:&amp;lt;/strong&amp;gt; 2.0%&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Last seen: &amp;lt;/strong&amp;gt; Just now&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Memory usage: &amp;lt;/strong&amp;gt; 121.30 MiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Spilled bytes: &amp;lt;/strong&amp;gt; 0 B&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Read bytes: &amp;lt;/strong&amp;gt; 20.58 MiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;&lt;br /&gt;
                            &amp;lt;strong&amp;gt;Write bytes: &amp;lt;/strong&amp;gt; 4.19 MiB&lt;br /&gt;
                        &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/details&amp;gt;&lt;br /&gt;
            &amp;lt;/div&amp;gt;&lt;br /&gt;
        &amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
            &amp;lt;/details&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    2024-02-05 11:21:41,076 - distributed.client - ERROR - Failed to reconnect to scheduler after 30.00 seconds, closing client&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Troubleshooting&lt;br /&gt;
&lt;br /&gt;
## Compatibility issues&lt;br /&gt;
&lt;br /&gt;
If you try to connect a notebook to a Dask cluster whose environment differs from the one used to launch the cluster, you may encounter compatibility issues.&lt;br /&gt;
&lt;br /&gt;
You will typically receive a &amp;quot;Mismatched versions found&amp;quot; warning like this:&lt;br /&gt;
&lt;br /&gt;
```&lt;br /&gt;
/data/astro/scratch2/torradeflot/envs/dask/lib/python3.11/site-packages/distributed/client.py:1388: VersionMismatchWarning: Mismatched versions found&lt;br /&gt;
&lt;br /&gt;
+---------+----------------+----------------+----------------+&lt;br /&gt;
| Package | Client         | Scheduler      | Workers        |&lt;br /&gt;
+---------+----------------+----------------+----------------+&lt;br /&gt;
| lz4     | 4.3.3          | 4.3.2          | 4.3.2          |&lt;br /&gt;
| msgpack | 1.0.7          | 1.0.5          | 1.0.5          |&lt;br /&gt;
| numpy   | 1.26.3         | 1.24.3         | 1.24.3         |&lt;br /&gt;
| pandas  | 2.2.0          | 2.0.2          | 2.0.2          |&lt;br /&gt;
| python  | 3.11.7.final.0 | 3.11.5.final.0 | 3.11.5.final.0 |&lt;br /&gt;
| toolz   | 0.12.1         | 0.12.0         | 0.12.0         |&lt;br /&gt;
+---------+----------------+----------------+----------------+&lt;br /&gt;
  warnings.warn(version_module.VersionMismatchWarning(msg[0][&amp;quot;warning&amp;quot;]))&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
Some mismatches may be blocking, so it is recommended to match at least the major and minor versions of each package. A mismatch in the patch version shouldn't be a problem.&lt;br /&gt;
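
Before connecting a client, you can compare the package versions installed in the notebook environment against the Scheduler/Workers columns of the warning. Below is a minimal sketch using only the standard library (the package list is copied from the warning above); with a live client, `client.get_versions(check=True)` performs the same comparison and raises on mismatch:

```python
# Print locally installed versions of the packages from the
# mismatch warning, for comparison with the scheduler/workers columns.
from importlib.metadata import version, PackageNotFoundError

def local_version(name):
    '''Return the installed version of a distribution, or None if absent.'''
    try:
        return version(name)
    except PackageNotFoundError:
        return None

for pkg in ['lz4', 'msgpack', 'numpy', 'pandas', 'toolz']:
    print(pkg, local_version(pkg))
```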
&lt;br /&gt;
## Problems with security setup&lt;br /&gt;
&lt;br /&gt;
If you launch a Dask cluster from a notebook but you have never launched a cluster from the dask-labextension or using the `pic_jupyterhub` module, you may encounter a problem because encryption is enforced but the certificates do not exist.&lt;br /&gt;
&lt;br /&gt;
If this is the case, first launch a cluster using one of those options, as shown above. This will generate the certificate files, and subsequent creation of a Dask cluster from a notebook should succeed.&lt;br /&gt;
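&lt;br /&gt;
For reference, the enforced encryption corresponds to TLS settings like the following in the Dask configuration (for example in `~/.config/dask/distributed.yaml`). The certificate paths below are placeholders; at PIC the actual files are generated for you when you launch a cluster with one of the options above:&lt;br /&gt;
&lt;br /&gt;
```yaml&lt;br /&gt;
distributed:&lt;br /&gt;
  comm:&lt;br /&gt;
    require-encryption: true&lt;br /&gt;
    tls:&lt;br /&gt;
      ca-file: /path/to/ca.crt      # placeholder path&lt;br /&gt;
      scheduler:&lt;br /&gt;
        cert: /path/to/cert.pem     # placeholder path&lt;br /&gt;
        key: /path/to/key.pem&lt;br /&gt;
      worker:&lt;br /&gt;
        cert: /path/to/cert.pem&lt;br /&gt;
        key: /path/to/key.pem&lt;br /&gt;
      client:&lt;br /&gt;
        cert: /path/to/cert.pem&lt;br /&gt;
        key: /path/to/key.pem&lt;br /&gt;
```&lt;br /&gt;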
&lt;br /&gt;
# Examples&lt;br /&gt;
## Dask example 1&lt;br /&gt;
&lt;br /&gt;
Adapted from https://docs.dask.org/en/stable/10-minutes-to-dask.html&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
import numpy as np&lt;br /&gt;
import pandas as pd&lt;br /&gt;
&lt;br /&gt;
import dask.dataframe as dd&lt;br /&gt;
import dask.array as da&lt;br /&gt;
import dask.bag as db&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
### DataFrame&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
index = pd.date_range(&amp;quot;2021-09-01&amp;quot;, periods=2400, freq=&amp;quot;1H&amp;quot;)&lt;br /&gt;
df = pd.DataFrame({&amp;quot;a&amp;quot;: np.arange(2400), &amp;quot;b&amp;quot;: list(&amp;quot;abcaddbe&amp;quot; * 300)}, index=index)&lt;br /&gt;
ddf = dd.from_pandas(df, npartitions=10)&lt;br /&gt;
ddf&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
    /tmp/ipykernel_291/1125038151.py:1: FutureWarning: 'H' is deprecated and will be removed in a future version, please use 'h' instead.&lt;br /&gt;
      index = pd.date_range(&amp;quot;2021-09-01&amp;quot;, periods=2400, freq=&amp;quot;1H&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div&amp;gt;&amp;lt;strong&amp;gt;Dask DataFrame Structure:&amp;lt;/strong&amp;gt;&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div&amp;gt;&lt;br /&gt;
&amp;lt;style scoped&amp;gt;&lt;br /&gt;
    .dataframe tbody tr th:only-of-type {&lt;br /&gt;
        vertical-align: middle;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
    .dataframe tbody tr th {&lt;br /&gt;
        vertical-align: top;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
    .dataframe thead th {&lt;br /&gt;
        text-align: right;&lt;br /&gt;
    }&lt;br /&gt;
&amp;lt;/style&amp;gt;&lt;br /&gt;
&amp;lt;table border=&amp;quot;1&amp;quot; class=&amp;quot;dataframe&amp;quot;&amp;gt;&lt;br /&gt;
  &amp;lt;thead&amp;gt;&lt;br /&gt;
    &amp;lt;tr style=&amp;quot;text-align: right;&amp;quot;&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;a&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;b&amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;npartitions=10&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
  &amp;lt;/thead&amp;gt;&lt;br /&gt;
  &amp;lt;tbody&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-01 00:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;object&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-11 00:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;...&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-11-30 00:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-12-09 23:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
  &amp;lt;/tbody&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div&amp;gt;Dask Name: from_pandas, 1 graph layer&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
ddf.divisions&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    (Timestamp('2021-09-01 00:00:00'),&lt;br /&gt;
     Timestamp('2021-09-11 00:00:00'),&lt;br /&gt;
     Timestamp('2021-09-21 00:00:00'),&lt;br /&gt;
     Timestamp('2021-10-01 00:00:00'),&lt;br /&gt;
     Timestamp('2021-10-11 00:00:00'),&lt;br /&gt;
     Timestamp('2021-10-21 00:00:00'),&lt;br /&gt;
     Timestamp('2021-10-31 00:00:00'),&lt;br /&gt;
     Timestamp('2021-11-10 00:00:00'),&lt;br /&gt;
     Timestamp('2021-11-20 00:00:00'),&lt;br /&gt;
     Timestamp('2021-11-30 00:00:00'),&lt;br /&gt;
     Timestamp('2021-12-09 23:00:00'))&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
ddf.partitions[1]&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div&amp;gt;&amp;lt;strong&amp;gt;Dask DataFrame Structure:&amp;lt;/strong&amp;gt;&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div&amp;gt;&lt;br /&gt;
&amp;lt;style scoped&amp;gt;&lt;br /&gt;
    .dataframe tbody tr th:only-of-type {&lt;br /&gt;
        vertical-align: middle;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
    .dataframe tbody tr th {&lt;br /&gt;
        vertical-align: top;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
    .dataframe thead th {&lt;br /&gt;
        text-align: right;&lt;br /&gt;
    }&lt;br /&gt;
&amp;lt;/style&amp;gt;&lt;br /&gt;
&amp;lt;table border=&amp;quot;1&amp;quot; class=&amp;quot;dataframe&amp;quot;&amp;gt;&lt;br /&gt;
  &amp;lt;thead&amp;gt;&lt;br /&gt;
    &amp;lt;tr style=&amp;quot;text-align: right;&amp;quot;&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;a&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;b&amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;npartitions=1&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
  &amp;lt;/thead&amp;gt;&lt;br /&gt;
  &amp;lt;tbody&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-11&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;object&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-21&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
  &amp;lt;/tbody&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div&amp;gt;Dask Name: blocks, 2 graph layers&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
ddf[&amp;quot;2000-10-01&amp;quot;: &amp;quot;2021-10-09 5:00&amp;quot;].compute()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div&amp;gt;&lt;br /&gt;
&amp;lt;style scoped&amp;gt;&lt;br /&gt;
    .dataframe tbody tr th:only-of-type {&lt;br /&gt;
        vertical-align: middle;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
    .dataframe tbody tr th {&lt;br /&gt;
        vertical-align: top;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
    .dataframe thead th {&lt;br /&gt;
        text-align: right;&lt;br /&gt;
    }&lt;br /&gt;
&amp;lt;/style&amp;gt;&lt;br /&gt;
&amp;lt;table border=&amp;quot;1&amp;quot; class=&amp;quot;dataframe&amp;quot;&amp;gt;&lt;br /&gt;
  &amp;lt;thead&amp;gt;&lt;br /&gt;
    &amp;lt;tr style=&amp;quot;text-align: right;&amp;quot;&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;a&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;b&amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
  &amp;lt;/thead&amp;gt;&lt;br /&gt;
  &amp;lt;tbody&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-01 00:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;0&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;a&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-01 01:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;1&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;b&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-01 02:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;2&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;c&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-01 03:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;3&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;a&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-09-01 04:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;4&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;d&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;...&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;...&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-10-09 01:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;913&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;b&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-10-09 02:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;914&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;c&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-10-09 03:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;915&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;a&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-10-09 04:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;916&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;d&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;th&amp;gt;2021-10-09 05:00:00&amp;lt;/th&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;917&amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;d&amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
  &amp;lt;/tbody&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;918 rows × 2 columns&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
### Array&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
import numpy as np&lt;br /&gt;
import dask.array as da&lt;br /&gt;
&lt;br /&gt;
data = np.arange(100_000).reshape(200, 500)&lt;br /&gt;
a = da.from_array(data, chunks=(100, 100))&lt;br /&gt;
a&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
            &amp;lt;table style=&amp;quot;border-collapse: collapse;&amp;quot;&amp;gt;&lt;br /&gt;
                &amp;lt;thead&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Array &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Chunk &amp;lt;/th&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/thead&amp;gt;&lt;br /&gt;
                &amp;lt;tbody&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Bytes &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 781.25 kiB &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 78.12 kiB &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Shape &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (200, 500) &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (100, 100) &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Dask graph &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; 10 chunks in 1 graph layer &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Data type &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; int64 numpy.ndarray &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/tbody&amp;gt;&lt;br /&gt;
            &amp;lt;/table&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
        &amp;lt;svg width=&amp;quot;170&amp;quot; height=&amp;quot;98&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;24&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;24&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;48&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;48&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;48&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;24&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;24&amp;quot; y2=&amp;quot;48&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;48&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;48&amp;quot; y2=&amp;quot;48&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;72&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;72&amp;quot; y2=&amp;quot;48&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;96&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;96&amp;quot; y2=&amp;quot;48&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;120&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;48&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 120.0,0.0 120.0,48.0 0.0,48.0&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;60.000000&amp;quot; y=&amp;quot;68.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;500&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;140.000000&amp;quot; y=&amp;quot;24.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(-90,140.000000,24.000000)&amp;quot;&amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
a.chunks&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    ((100, 100), (100, 100, 100, 100, 100))&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
a.blocks[1, 3]&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
            &amp;lt;table style=&amp;quot;border-collapse: collapse;&amp;quot;&amp;gt;&lt;br /&gt;
                &amp;lt;thead&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Array &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Chunk &amp;lt;/th&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/thead&amp;gt;&lt;br /&gt;
                &amp;lt;tbody&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Bytes &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 78.12 kiB &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 78.12 kiB &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Shape &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (100, 100) &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (100, 100) &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Dask graph &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; 1 chunks in 2 graph layers &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Data type &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; int64 numpy.ndarray &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/tbody&amp;gt;&lt;br /&gt;
            &amp;lt;/table&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
        &amp;lt;svg width=&amp;quot;170&amp;quot; height=&amp;quot;170&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;120&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;120&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;120&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;120&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;120&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 120.0,0.0 120.0,120.0 0.0,120.0&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;60.000000&amp;quot; y=&amp;quot;140.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;100&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;140.000000&amp;quot; y=&amp;quot;60.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(-90,140.000000,60.000000)&amp;quot;&amp;gt;100&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
a[:50, 200]&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
            &amp;lt;table style=&amp;quot;border-collapse: collapse;&amp;quot;&amp;gt;&lt;br /&gt;
                &amp;lt;thead&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Array &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Chunk &amp;lt;/th&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/thead&amp;gt;&lt;br /&gt;
                &amp;lt;tbody&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Bytes &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 400 B &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 400 B &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Shape &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (50,) &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (50,) &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Dask graph &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; 1 chunks in 2 graph layers &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Data type &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; int64 numpy.ndarray &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/tbody&amp;gt;&lt;br /&gt;
            &amp;lt;/table&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
        &amp;lt;svg width=&amp;quot;170&amp;quot; height=&amp;quot;79&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;29&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;29&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;29&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;120&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;29&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 120.0,0.0 120.0,29.030629010473877 0.0,29.030629010473877&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;60.000000&amp;quot; y=&amp;quot;49.030629&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;50&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;140.000000&amp;quot; y=&amp;quot;14.515315&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(0,140.000000,14.515315)&amp;quot;&amp;gt;1&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
a[:50, 200].compute()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    array([  200,   700,  1200,  1700,  2200,  2700,  3200,  3700,  4200,&lt;br /&gt;
            4700,  5200,  5700,  6200,  6700,  7200,  7700,  8200,  8700,&lt;br /&gt;
            9200,  9700, 10200, 10700, 11200, 11700, 12200, 12700, 13200,&lt;br /&gt;
           13700, 14200, 14700, 15200, 15700, 16200, 16700, 17200, 17700,&lt;br /&gt;
           18200, 18700, 19200, 19700, 20200, 20700, 21200, 21700, 22200,&lt;br /&gt;
           22700, 23200, 23700, 24200, 24700])&lt;br /&gt;
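Note that the slicing itself is lazy: `a[:50, 200]` only builds a small task graph, and no chunks are read or computed until `.compute()` is called. A minimal sketch of this behaviour (the array construction here is an assumption that mirrors the `(200, 500)` array used above):

```python
import numpy as np
import dask.array as da

# Rebuild an array like the one above: shape (200, 500), int64
a = da.arange(100000, chunks=10000).reshape(200, 500)

view = a[:50, 200]        # still lazy: a dask array, nothing computed yet
print(type(view))         # <class 'dask.array.core.Array'>

result = view.compute()   # triggers execution of only the needed chunks
print(result[:3])         # array([ 200,  700, 1200])
```

Only the chunks overlapping the requested slice are materialized, which is what makes this pattern cheap on arrays far larger than memory.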
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
a.mean()&lt;br /&gt;
a.mean().compute()&lt;br /&gt;
np.sin(a)&lt;br /&gt;
np.sin(a).compute()&lt;br /&gt;
a.T&lt;br /&gt;
a.T.compute()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    array([[    0,   500,  1000, ..., 98500, 99000, 99500],&lt;br /&gt;
           [    1,   501,  1001, ..., 98501, 99001, 99501],&lt;br /&gt;
           [    2,   502,  1002, ..., 98502, 99002, 99502],&lt;br /&gt;
           ...,&lt;br /&gt;
           [  497,   997,  1497, ..., 98997, 99497, 99997],&lt;br /&gt;
           [  498,   998,  1498, ..., 98998, 99498, 99998],&lt;br /&gt;
           [  499,   999,  1499, ..., 98999, 99499, 99999]])&lt;br /&gt;
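NumPy ufuncs such as `np.sin` dispatch to dask, so they too return lazy arrays, and their computed values agree with plain NumPy. A small check, again assuming the same array construction as above:

```python
import numpy as np
import dask.array as da

x = da.arange(100000, chunks=10000).reshape(200, 500)

lazy_sin = np.sin(x)   # dispatches to da.sin: still a lazy dask array
lazy_t = x.T           # transpose is lazy as well

# Computed results match the eager NumPy equivalents
nx = np.arange(100000).reshape(200, 500)
print(np.allclose(lazy_sin.compute(), np.sin(nx)))   # True
print((lazy_t.compute() == nx.T).all())              # True
```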
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
b = a.max(axis=1)[::-1] + 10&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
b[:10].compute()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    array([100009,  99509,  99009,  98509,  98009,  97509,  97009,  96509,&lt;br /&gt;
            96009,  95509])&lt;br /&gt;
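The expression for `b` chains a reduction, a reversal and a scalar addition into a single graph; only `b[:10].compute()` above actually runs it. A sanity check against the eager NumPy equivalent (array construction again assumed):

```python
import numpy as np
import dask.array as da

a = da.arange(100000, chunks=10000).reshape(200, 500)
b = a.max(axis=1)[::-1] + 10      # one lazy graph: max, reverse, add

# The eager NumPy version produces the same leading values
na = np.arange(100000).reshape(200, 500)
nb = na.max(axis=1)[::-1] + 10
print(nb[0])                                  # 100009, as in the output above
print((b[:10].compute() == nb[:10]).all())    # True
```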
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
b.dask&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div&amp;gt;&lt;br /&gt;
    &amp;lt;div&amp;gt;&lt;br /&gt;
        &amp;lt;div style=&amp;quot;width: 52px; height: 52px; position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;svg width=&amp;quot;76&amp;quot; height=&amp;quot;71&amp;quot; viewBox=&amp;quot;0 0 76 71&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot;&amp;gt;&lt;br /&gt;
                &amp;lt;circle cx=&amp;quot;61.5&amp;quot; cy=&amp;quot;36.5&amp;quot; r=&amp;quot;13.5&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D); fill: var(--jp-layout-color1, #F2F2F2);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;circle cx=&amp;quot;14.5&amp;quot; cy=&amp;quot;14.5&amp;quot; r=&amp;quot;13.5&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D); fill: var(--jp-layout-color1, #F2F2F2);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;circle cx=&amp;quot;14.5&amp;quot; cy=&amp;quot;56.5&amp;quot; r=&amp;quot;13.5&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D); fill: var(--jp-layout-color1, #F2F2F2);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;path d=&amp;quot;M28 16L30.5 16C33.2614 16 35.5 18.2386 35.5 21L35.5 32.0001C35.5 34.7615 37.7386 37.0001 40.5 37.0001L43 37.0001&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;1.5&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;path d=&amp;quot;M40.5 37L40.5 37.75L40.5 37.75L40.5 37ZM35.5 42L36.25 42L35.5 42ZM35.5 52L34.75 52L35.5 52ZM30.5 57L30.5 57.75L30.5 57ZM41.5001 36.25L40.5 36.25L40.5 37.75L41.5001 37.75L41.5001 36.25ZM34.75 42L34.75 52L36.25 52L36.25 42L34.75 42ZM30.5 56.25L28.0001 56.25L28.0001 57.75L30.5 57.75L30.5 56.25ZM34.75 52C34.75 54.3472 32.8472 56.25 30.5 56.25L30.5 57.75C33.6756 57.75 36.25 55.1756 36.25 52L34.75 52ZM40.5 36.25C37.3244 36.25 34.75 38.8243 34.75 42L36.25 42C36.25 39.6528 38.1528 37.75 40.5 37.75L40.5 36.25Z&amp;quot; style=&amp;quot;fill: var(--jp-ui-font-color2, #1D1D1D);&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;circle cx=&amp;quot;28&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;2.25&amp;quot; fill=&amp;quot;#E5E5E5&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;1.5&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;circle cx=&amp;quot;28&amp;quot; cy=&amp;quot;57&amp;quot; r=&amp;quot;2.25&amp;quot; fill=&amp;quot;#E5E5E5&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;1.5&amp;quot;/&amp;gt;&lt;br /&gt;
                &amp;lt;path d=&amp;quot;M45.25 36.567C45.5833 36.7594 45.5833 37.2406 45.25 37.433L42.25 39.1651C41.9167 39.3575 41.5 39.117 41.5 38.7321V35.2679C41.5 34.883 41.9167 34.6425 42.25 34.8349L45.25 36.567Z&amp;quot; style=&amp;quot;fill: var(--jp-ui-font-color2, #1D1D1D);&amp;quot;/&amp;gt;&lt;br /&gt;
            &amp;lt;/svg&amp;gt;&lt;br /&gt;
        &amp;lt;/div&amp;gt;&lt;br /&gt;
        &amp;lt;div style=&amp;quot;margin-left: 64px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h3 style=&amp;quot;margin-bottom: 0px;&amp;quot;&amp;gt;HighLevelGraph&amp;lt;/h3&amp;gt;&lt;br /&gt;
            &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin-bottom:0px;&amp;quot;&amp;gt;&lt;br /&gt;
                HighLevelGraph with 6 layers and 30 keys from all layers.&lt;br /&gt;
            &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;svg width=&amp;quot;24&amp;quot; height=&amp;quot;24&amp;quot; viewBox=&amp;quot;0 0 32 32&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot; style=&amp;quot;position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;circle cx=&amp;quot;16&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;14&amp;quot; fill=&amp;quot;#8F8F8F&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/svg&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 32px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 10px; margin-top: 10px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h4 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Layer1: array&amp;lt;/h4&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin: -0.25em 0px 0px 0px;&amp;quot;&amp;gt;&lt;br /&gt;
            array-5f1b3c0ca172b03296ed6d431d3c9df7&lt;br /&gt;
        &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;table&amp;gt;&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;layer_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;MaterializedLayer&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;is_materialized&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;True&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;number of outputs&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;10&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;shape&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(200, 500)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;dtype&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunksize&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(100, 100)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;dask.array.core.Array&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunk_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;numpy.ndarray&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;svg width=&amp;quot;250&amp;quot; height=&amp;quot;130&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;40&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;40&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;80&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;80&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;80&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;40&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;40&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;80&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;80&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;120&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;160&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;160&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;200&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;80&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 200.0,0.0 200.0,80.0 0.0,80.0&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;100.000000&amp;quot; y=&amp;quot;100.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;500&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;220.000000&amp;quot; y=&amp;quot;40.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(-90,220.000000,40.000000)&amp;quot;&amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;svg width=&amp;quot;24&amp;quot; height=&amp;quot;24&amp;quot; viewBox=&amp;quot;0 0 32 32&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot; style=&amp;quot;position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;circle cx=&amp;quot;16&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;14&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D); fill: var(--jp-layout-color1, #F2F2F2);&amp;quot; stroke-width=&amp;quot;2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/svg&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 32px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 10px; margin-top: 10px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h4 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Layer2: chunk_max&amp;lt;/h4&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin: -0.25em 0px 0px 0px;&amp;quot;&amp;gt;&lt;br /&gt;
            chunk_max-22359f6e7c214a78df6cd2673d3d603f&lt;br /&gt;
        &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;table&amp;gt;&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;layer_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;Blockwise&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;is_materialized&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;False&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;number of outputs&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;10&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;shape&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(200, 500)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;dtype&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunksize&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(100, 100)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;dask.array.core.Array&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunk_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;numpy.ndarray&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                        &amp;lt;tr&amp;gt;&lt;br /&gt;
                            &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt; depends on &amp;lt;/th&amp;gt;&lt;br /&gt;
                            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;array-5f1b3c0ca172b03296ed6d431d3c9df7&amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;svg width=&amp;quot;250&amp;quot; height=&amp;quot;130&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;40&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;40&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;80&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;80&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;80&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;40&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;40&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;80&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;80&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;120&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;160&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;160&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;200&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;80&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 200.0,0.0 200.0,80.0 0.0,80.0&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;100.000000&amp;quot; y=&amp;quot;100.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;500&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;220.000000&amp;quot; y=&amp;quot;40.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(-90,220.000000,40.000000)&amp;quot;&amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;svg width=&amp;quot;24&amp;quot; height=&amp;quot;24&amp;quot; viewBox=&amp;quot;0 0 32 32&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot; style=&amp;quot;position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;circle cx=&amp;quot;16&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;14&amp;quot; fill=&amp;quot;#8F8F8F&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/svg&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 32px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 10px; margin-top: 10px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h4 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Layer3: chunk_max-partial&amp;lt;/h4&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin: -0.25em 0px 0px 0px;&amp;quot;&amp;gt;&lt;br /&gt;
            chunk_max-partial-1dbed0d768d7fbafb3cb682ddb58870a&lt;br /&gt;
        &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;table&amp;gt;&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;layer_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;MaterializedLayer&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;is_materialized&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;True&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;number of outputs&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;4&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;shape&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(200, 2)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;dtype&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunksize&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(100, 1)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;dask.array.core.Array&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunk_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;numpy.ndarray&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                        &amp;lt;tr&amp;gt;&lt;br /&gt;
                            &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt; depends on &amp;lt;/th&amp;gt;&lt;br /&gt;
                            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;chunk_max-22359f6e7c214a78df6cd2673d3d603f&amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;svg width=&amp;quot;92&amp;quot; height=&amp;quot;250&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;42&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;100&amp;quot; x2=&amp;quot;42&amp;quot; y2=&amp;quot;100&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;200&amp;quot; x2=&amp;quot;42&amp;quot; y2=&amp;quot;200&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;200&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;21&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;21&amp;quot; y2=&amp;quot;200&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;42&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;42&amp;quot; y2=&amp;quot;200&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 42.354360857637474,0.0 42.354360857637474,200.0 0.0,200.0&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;21.177180&amp;quot; y=&amp;quot;220.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;2&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;62.354361&amp;quot; y=&amp;quot;100.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(-90,62.354361,100.000000)&amp;quot;&amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;svg width=&amp;quot;24&amp;quot; height=&amp;quot;24&amp;quot; viewBox=&amp;quot;0 0 32 32&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot; style=&amp;quot;position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;circle cx=&amp;quot;16&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;14&amp;quot; fill=&amp;quot;#8F8F8F&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/svg&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 32px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 10px; margin-top: 10px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h4 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Layer4: max-aggregate&amp;lt;/h4&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin: -0.25em 0px 0px 0px;&amp;quot;&amp;gt;&lt;br /&gt;
            max-aggregate-f1f4671fcd497326d08ac769e2eeff52&lt;br /&gt;
        &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;table&amp;gt;&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;layer_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;MaterializedLayer&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;is_materialized&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;True&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;number of outputs&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;2&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;shape&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(200,)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;dtype&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunksize&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(100,)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;dask.array.core.Array&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunk_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;numpy.ndarray&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                        &amp;lt;tr&amp;gt;&lt;br /&gt;
                            &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt; depends on &amp;lt;/th&amp;gt;&lt;br /&gt;
                            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;chunk_max-partial-1dbed0d768d7fbafb3cb682ddb58870a&amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;svg width=&amp;quot;250&amp;quot; height=&amp;quot;92&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;42&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;100&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;100&amp;quot; y2=&amp;quot;42&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;200&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 200.0,0.0 200.0,42.354360857637474 0.0,42.354360857637474&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;100.000000&amp;quot; y=&amp;quot;62.354361&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;220.000000&amp;quot; y=&amp;quot;21.177180&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(0,220.000000,21.177180)&amp;quot;&amp;gt;1&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;svg width=&amp;quot;24&amp;quot; height=&amp;quot;24&amp;quot; viewBox=&amp;quot;0 0 32 32&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot; style=&amp;quot;position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;circle cx=&amp;quot;16&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;14&amp;quot; fill=&amp;quot;#8F8F8F&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D);&amp;quot; stroke-width=&amp;quot;2&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/svg&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 32px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 10px; margin-top: 10px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h4 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Layer5: getitem&amp;lt;/h4&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin: -0.25em 0px 0px 0px;&amp;quot;&amp;gt;&lt;br /&gt;
            getitem-65be51186a8fbeccfd51c0cf1245f77e&lt;br /&gt;
        &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;table&amp;gt;&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;layer_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;MaterializedLayer&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;is_materialized&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;True&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;number of outputs&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;2&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;shape&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(200,)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;dtype&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunksize&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(100,)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;dask.array.core.Array&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunk_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;numpy.ndarray&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                        &amp;lt;tr&amp;gt;&lt;br /&gt;
                            &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt; depends on &amp;lt;/th&amp;gt;&lt;br /&gt;
                            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;max-aggregate-f1f4671fcd497326d08ac769e2eeff52&amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;svg width=&amp;quot;250&amp;quot; height=&amp;quot;92&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;42&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;100&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;100&amp;quot; y2=&amp;quot;42&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;200&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 200.0,0.0 200.0,42.354360857637474 0.0,42.354360857637474&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;100.000000&amp;quot; y=&amp;quot;62.354361&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;220.000000&amp;quot; y=&amp;quot;21.177180&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(0,220.000000,21.177180)&amp;quot;&amp;gt;1&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;div style=&amp;quot;&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;svg width=&amp;quot;24&amp;quot; height=&amp;quot;24&amp;quot; viewBox=&amp;quot;0 0 32 32&amp;quot; fill=&amp;quot;none&amp;quot; xmlns=&amp;quot;http://www.w3.org/2000/svg&amp;quot; style=&amp;quot;position: absolute;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;circle cx=&amp;quot;16&amp;quot; cy=&amp;quot;16&amp;quot; r=&amp;quot;14&amp;quot; style=&amp;quot;stroke: var(--jp-ui-font-color2, #1D1D1D); fill: var(--jp-layout-color1, #F2F2F2);&amp;quot; stroke-width=&amp;quot;2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/svg&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;details style=&amp;quot;margin-left: 32px;&amp;quot;&amp;gt;&lt;br /&gt;
        &amp;lt;summary style=&amp;quot;margin-bottom: 10px; margin-top: 10px;&amp;quot;&amp;gt;&lt;br /&gt;
            &amp;lt;h4 style=&amp;quot;display: inline;&amp;quot;&amp;gt;Layer6: add&amp;lt;/h4&amp;gt;&lt;br /&gt;
        &amp;lt;/summary&amp;gt;&lt;br /&gt;
        &amp;lt;p style=&amp;quot;color: var(--jp-ui-font-color2, #5D5851); margin: -0.25em 0px 0px 0px;&amp;quot;&amp;gt;&lt;br /&gt;
            add-c42a5ea4600e83a48a31b107d8933476&lt;br /&gt;
        &amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;table&amp;gt;&lt;br /&gt;
        &amp;lt;tr&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;layer_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;Blockwise&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;is_materialized&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;False&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;number of outputs&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;2&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;shape&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(200,)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;dtype&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;int64&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunksize&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;(100,)&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;dask.array.core.Array&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt;chunk_type&amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;numpy.ndarray&amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                        &amp;lt;tr&amp;gt;&lt;br /&gt;
                            &amp;lt;th style=&amp;quot;text-align: left; width: 150px;&amp;quot;&amp;gt; depends on &amp;lt;/th&amp;gt;&lt;br /&gt;
                            &amp;lt;td style=&amp;quot;text-align: left;&amp;quot;&amp;gt;getitem-65be51186a8fbeccfd51c0cf1245f77e&amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
                &amp;lt;/table&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
            &amp;lt;td&amp;gt;&lt;br /&gt;
                &amp;lt;svg width=&amp;quot;250&amp;quot; height=&amp;quot;92&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;42&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;100&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;100&amp;quot; y2=&amp;quot;42&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;200&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;200&amp;quot; y2=&amp;quot;42&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 200.0,0.0 200.0,42.354360857637474 0.0,42.354360857637474&amp;quot; style=&amp;quot;fill:#ECB172A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;100.000000&amp;quot; y=&amp;quot;62.354361&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;200&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;220.000000&amp;quot; y=&amp;quot;21.177180&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(0,220.000000,21.177180)&amp;quot;&amp;gt;1&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
            &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;/tr&amp;gt;&lt;br /&gt;
        &amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;/details&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
        &amp;lt;/div&amp;gt;&lt;br /&gt;
    &amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
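The layer tables above show the pattern of a chunked reduction: each chunk produces a partial result (the chunk_max-partial layer), which the max-aggregate layer then combines into the final value. A simplified plain-NumPy sketch of that idea (an illustration only, not Dask's actual implementation):&lt;br /&gt;

```python
import numpy as np

# Simplified sketch of a chunked max reduction, mirroring the
# chunk_max-partial -> max-aggregate layer structure above.
data = np.arange(200)
chunks = [data[0:100], data[100:200]]   # chunksize (100,)

partials = [c.max() for c in chunks]    # one partial max per chunk
result = np.max(partials)               # aggregate across chunks

print(result)  # 199
```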
&lt;br /&gt;
&lt;br /&gt;
## Dask example 2&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
import dask.array as da&lt;br /&gt;
x = da.random.random((30_000, 30_000), chunks=(1000, 1000))&lt;br /&gt;
x&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
            &amp;lt;table style=&amp;quot;border-collapse: collapse;&amp;quot;&amp;gt;&lt;br /&gt;
                &amp;lt;thead&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Array &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Chunk &amp;lt;/th&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/thead&amp;gt;&lt;br /&gt;
                &amp;lt;tbody&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Bytes &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 6.71 GiB &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; 7.63 MiB &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Shape &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (30000, 30000) &amp;lt;/td&amp;gt;&lt;br /&gt;
                        &amp;lt;td&amp;gt; (1000, 1000) &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Dask graph &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; 900 chunks in 1 graph layer &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                    &amp;lt;tr&amp;gt;&lt;br /&gt;
                        &amp;lt;th&amp;gt; Data type &amp;lt;/th&amp;gt;&lt;br /&gt;
                        &amp;lt;td colspan=&amp;quot;2&amp;quot;&amp;gt; float64 numpy.ndarray &amp;lt;/td&amp;gt;&lt;br /&gt;
                    &amp;lt;/tr&amp;gt;&lt;br /&gt;
                &amp;lt;/tbody&amp;gt;&lt;br /&gt;
            &amp;lt;/table&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
        &amp;lt;td&amp;gt;&lt;br /&gt;
        &amp;lt;svg width=&amp;quot;170&amp;quot; height=&amp;quot;170&amp;quot; style=&amp;quot;stroke:rgb(0,0,0);stroke-width:1&amp;quot; &amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Horizontal lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;0&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;4&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;4&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;12&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;12&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;16&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;16&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;24&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;24&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;28&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;28&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;36&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;36&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;44&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;44&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;48&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;48&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;56&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;56&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;60&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;60&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;68&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;68&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;72&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;72&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;80&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;80&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;88&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;88&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;92&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;92&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;100&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;100&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;104&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;104&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;112&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;112&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;120&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;120&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Vertical lines --&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;0&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;0&amp;quot; y2=&amp;quot;120&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;4&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;4&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;12&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;12&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;16&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;16&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;24&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;24&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;28&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;28&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;36&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;36&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;44&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;44&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;48&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;48&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;56&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;56&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;60&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;60&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;68&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;68&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;72&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;72&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;80&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;80&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;88&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;88&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;92&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;92&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;100&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;100&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;104&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;104&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;112&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;112&amp;quot; y2=&amp;quot;120&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;line x1=&amp;quot;120&amp;quot; y1=&amp;quot;0&amp;quot; x2=&amp;quot;120&amp;quot; y2=&amp;quot;120&amp;quot; style=&amp;quot;stroke-width:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Colored Rectangle --&amp;gt;&lt;br /&gt;
  &amp;lt;polygon points=&amp;quot;0.0,0.0 120.0,0.0 120.0,120.0 0.0,120.0&amp;quot; style=&amp;quot;fill:#8B4903A0;stroke-width:0&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Text --&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;60.000000&amp;quot; y=&amp;quot;140.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; &amp;gt;30000&amp;lt;/text&amp;gt;&lt;br /&gt;
  &amp;lt;text x=&amp;quot;140.000000&amp;quot; y=&amp;quot;60.000000&amp;quot; font-size=&amp;quot;1.0rem&amp;quot; font-weight=&amp;quot;100&amp;quot; text-anchor=&amp;quot;middle&amp;quot; transform=&amp;quot;rotate(-90,140.000000,60.000000)&amp;quot;&amp;gt;30000&amp;lt;/text&amp;gt;&lt;br /&gt;
&amp;lt;/svg&amp;gt;&lt;br /&gt;
        &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
y = x + x.T&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
y.sum().compute()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    899986697.6624435&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
```python&lt;br /&gt;
y[:, :10].compute()&lt;br /&gt;
```&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    array([[1.06732263, 1.44918471, 0.97747399, ..., 0.33531255, 0.81407412,&lt;br /&gt;
            1.18334048],&lt;br /&gt;
           [1.44918471, 0.75954053, 0.94024149, ..., 0.92522134, 0.47216556,&lt;br /&gt;
            1.0685656 ],&lt;br /&gt;
           [0.97747399, 0.94024149, 0.43425583, ..., 1.31146439, 1.13960695,&lt;br /&gt;
            1.17543105],&lt;br /&gt;
           ...,&lt;br /&gt;
           [0.94341018, 0.91293267, 0.4737179 , ..., 0.95118153, 0.68525142,&lt;br /&gt;
            0.9561541 ],&lt;br /&gt;
           [0.68742121, 0.76071136, 0.78407903, ..., 0.96844288, 0.57027703,&lt;br /&gt;
            0.44666467],&lt;br /&gt;
           [1.05163969, 1.174239  , 0.61947849, ..., 0.22588196, 0.7914612 ,&lt;br /&gt;
            0.8904959 ]])&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
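Note that the printed values are symmetric across the diagonal (e.g. 1.44918471 appears at both [0, 1] and [1, 0]): since y = x + x.T, y equals its own transpose. A minimal NumPy sketch of this property (illustrative values only):&lt;br /&gt;

```python
import numpy as np

# y = x + x.T is symmetric by construction: (x + x.T).T = x.T + x
rng = np.random.default_rng(0)
x = rng.random((4, 4))
y = x + x.T

assert np.allclose(y, y.T)   # symmetric
print(y[0, 1] == y[1, 0])    # True
```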
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
	<entry>
		<id>https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1133</id>
		<title>JupyterHub</title>
		<link rel="alternate" type="text/html" href="https://pwiki.pic.es/index.php?title=JupyterHub&amp;diff=1133"/>
		<updated>2024-02-21T13:23:33Z</updated>

		<summary type="html">&lt;p&gt;Torradeflot: /* Dask */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
PIC offers a service for running Jupyter notebooks on CPU or GPU resources. This service is primarily intended for code development and prototyping rather than data processing. The usage is similar to running notebooks on your personal computer, but it offers the advantage of developing and testing your code on different hardware configurations, and it makes scaling up easier, since the code is tested in the same environment in which it would run at scale. &lt;br /&gt;
&lt;br /&gt;
Since the service is strictly intended for development and small-scale testing tasks, a shutdown policy for the sessions has been put in place:&lt;br /&gt;
&lt;br /&gt;
# The maximum duration for a session is 48h.&lt;br /&gt;
# After an idle period of 2 hours, the session will be closed. &lt;br /&gt;
&lt;br /&gt;
In practice this means you should size the test data you work with during a session so that it can be processed in less than 48 hours.&lt;br /&gt;
&lt;br /&gt;
== How to connect to the service ==&lt;br /&gt;
&lt;br /&gt;
Go to [https://jupyter.pic.es jupyter.pic.es] to reach the login screen.&lt;br /&gt;
&lt;br /&gt;
[[File:login.png|frameless|700px|Login screen]]&lt;br /&gt;
&lt;br /&gt;
Sign in with your PIC user credentials. This will take you to the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterSpawn.png|700px|current]]&lt;br /&gt;
&lt;br /&gt;
Here you can choose the hardware configuration for your Jupyter session. You also have to choose the experiment (project) you are working on during the session. After choosing a configuration and pressing start, the next screen will show you the progress of the initialisation process. Keep in mind that a job containing your Jupyter session is actually sent to the HTCondor queuing system, where it waits for available resources before being started. This usually takes less than a minute, but can take up to a few minutes depending on overall resource usage.&lt;br /&gt;
&lt;br /&gt;
[[File:screen02.png|600px]]&lt;br /&gt;
&lt;br /&gt;
In the next screen you can choose the tool that you want to use for your work: a Python notebook, a Python console or a plain bash terminal.&lt;br /&gt;
For the Python environment (either notebook or console) you have two default options:&lt;br /&gt;
* the ipykernel version of Python 3&lt;br /&gt;
* the XPython version of Python 3.9, which allows you to use the integrated debugging module.&lt;br /&gt;
&lt;br /&gt;
You will also see an icon with a &amp;quot;D&amp;quot; (desktop), which starts a VNC session that allows the use of programs with graphical user interfaces.&lt;br /&gt;
&lt;br /&gt;
More recently, an icon for Visual Studio, an integrated development environment, has been added as well.&lt;br /&gt;
&lt;br /&gt;
[[File:ScreenshotJupyterlab20231103.png|700px]]&lt;br /&gt;
&lt;br /&gt;
Your Python environments should appear under the Notebook and Console headers. A later section shows how to create a new environment and how to remove an existing one.&lt;br /&gt;
&lt;br /&gt;
== Terminate your session and logout ==&lt;br /&gt;
&lt;br /&gt;
It is important that you terminate your session before you log out. In order to do so, go to the top page menu &amp;quot;'''File''' -&amp;gt; '''Hub Control Panel'''&amp;quot; and you will see the following screen.&lt;br /&gt;
&lt;br /&gt;
[[File:screen04.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Here, click on the '''Stop My Server''' button. After that you can log out by clicking the '''Logout''' button in the right upper corner.&lt;br /&gt;
&lt;br /&gt;
=  Python virtual environments =&lt;br /&gt;
&lt;br /&gt;
This section covers the use of Python virtual environments with Jupyter.&lt;br /&gt;
&lt;br /&gt;
== Initialize conda (we highly recommend the use of mambaforge) ==&lt;br /&gt;
&lt;br /&gt;
Before using conda/mamba in your bash session, you have to initialize it.&lt;br /&gt;
* For access to an available conda/mamba installation, please get in contact with your project liaison at PIC. He/she will give you the actual value for the '''/path/to/mambaforge''' placeholder.&lt;br /&gt;
* If you want to use your own conda/mamba installation, we recommend you to install the minimal '''miniforge''' distribution, instructions [https://github.com/conda-forge/miniforge here]&lt;br /&gt;
&lt;br /&gt;
Log onto Jupyter and start a session. On the homepage of your Jupyter session, click the terminal button on the session dashboard to open a bash terminal. If you do not need a specific conda/mamba version, you can use the installation path shown in the example below.&lt;br /&gt;
&lt;br /&gt;
First, let's initialize conda for our bash sessions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This actually changes the .bashrc file in your home directory so that the base environment is activated on login.&lt;br /&gt;
To prevent the base environment from being activated every time you log on to a node, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now you can exit the terminal.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ exit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Link an existing environment to Jupyter ==&lt;br /&gt;
&lt;br /&gt;
You can find instructions on how to create your own environments, e.g. [[#Create_virtual_environments_with_venv_or_conda | here]].&lt;br /&gt;
&lt;br /&gt;
Log into Jupyter, start a session. From the session dashboard choose the bash terminal.&lt;br /&gt;
&lt;br /&gt;
Inside the terminal, activate your environment.&lt;br /&gt;
&lt;br /&gt;
For '''conda/mamba''' environments:&lt;br /&gt;
* if you created the environment without a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the name of your environment. &lt;br /&gt;
* if you created the environment with a prefix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/environment&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The parentheses (...) in front of your bash prompt show the absolute path of your environment. &lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ source /path/to/environment/bin/activate&lt;br /&gt;
(...) [neissner@td110 ~]$ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Link the environment to a Jupyter kernel. For both, '''conda/mamba''' and '''venv''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ python -m ipykernel install --user --name=whatever_kernel_name&lt;br /&gt;
Installed kernelspec whatever_kernel_name in &lt;br /&gt;
                         /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you don't have the '''ipykernel''' module installed in your environment you may receive an error message like the one below when trying to run the previous command.&lt;br /&gt;
&amp;lt;pre&amp;gt;No module named ipykernel&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If this is the case, you need to install it by running: '''pip install ipykernel'''&lt;br /&gt;
&lt;br /&gt;
Deactivate your environment. &lt;br /&gt;
&lt;br /&gt;
For conda:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For venv:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(...) [neissner@td110 ~]$ deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now you can exit the terminal. After refreshing the Jupyter page, your whatever_kernel_name appears in the dashboard. In this example, '''test''' was used as whatever_kernel_name.&lt;br /&gt;
&lt;br /&gt;
[[File:screen05.png|700px]]&lt;br /&gt;
&lt;br /&gt;
== Unlink an environment from Jupyter ==&lt;br /&gt;
Log onto Jupyter, start a session and from the session dashboard choose the bash terminal. To remove your environment/kernel from Jupyter run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ jupyter kernelspec uninstall whatever_kernel_name&lt;br /&gt;
Kernel specs to remove:&lt;br /&gt;
  whatever_kernel_name     /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
Remove 1 kernel specs [y/N]: y&lt;br /&gt;
[RemoveKernelSpec] Removed /nfs/pic.es/user/n/neissner/.local/share/jupyter/kernels/whatever_kernel_name&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Keep in mind that, although not available in Jupyter anymore, the environment still exists. Whenever you need it, you can link it again.&lt;br /&gt;
&lt;br /&gt;
== Create virtual environments with venv or conda ==&lt;br /&gt;
&lt;br /&gt;
Before creating a new environment, please get in contact with your project liaison at PIC as there may be already a suitable environment for your needs in place.&lt;br /&gt;
&lt;br /&gt;
If none of the existing environments suits your needs, you can create a new environment.&lt;br /&gt;
First, create a directory in a suitable place to store the environment. For single-user environments, place them in your home under ~/env. For environments that will be shared with other project users, contact your project liaison and ask him/her for a path in a shared storage volume that is visible to all of them.&lt;br /&gt;
&lt;br /&gt;
Once you have the location (e.g. /path/to/env), create the environment with the following commands:&lt;br /&gt;
&lt;br /&gt;
For '''venv''' environments '''(recommended)'''&lt;br /&gt;
&lt;br /&gt;
Assuming your_env will be installed at /path/to/env/your_env:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ python3 -m venv your_env&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ cd /path/to/env&lt;br /&gt;
[neissner@td110 ~]$ source your_env/bin/activate&lt;br /&gt;
(...)[neissner@td110 ~]$ pip install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
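The same venv creation can be sketched from Python itself with the standard-library '''venv''' module (the temporary directory below is only a placeholder standing in for /path/to/env):&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch: create a venv programmatically with the stdlib venv module.
# The temporary directory is a placeholder for /path/to/env.
import tempfile
import venv
from pathlib import Path

target = Path(tempfile.mkdtemp()) / "your_env"
venv.EnvBuilder(with_pip=False).create(target)  # with_pip=True would also install pip

# The layout matches what "python3 -m venv your_env" produces on Linux:
print((target / "pyvenv.cfg").exists())        # the file marking the env root
print((target / "bin" / "activate").exists())  # the activation script
```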
&lt;br /&gt;
For '''conda/mamba''' environments&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba create --prefix /path/to/env/your_env module1 module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The list of modules (module1, module2, ...) is optional. For instance, for a python3 environment with scipy you would specify: ''python=3 scipy'' &lt;br /&gt;
&lt;br /&gt;
Now you should be able to activate your environment and install additional modules&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[neissner@td110 ~]$ mamba activate /path/to/env/your_env&lt;br /&gt;
(...)[neissner@td110 ~]$ mamba install additional_module1 additional_module2 ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can use pip install inside a mamba environment; however, resolving dependencies might require installing additional packages manually.&lt;br /&gt;
&lt;br /&gt;
= Proper usage of X509 based proxies =&lt;br /&gt;
&lt;br /&gt;
We recently found that using proxies within a Jupyter session can cause problems, because the environment changes certain standard locations such as '''/tmp'''.&lt;br /&gt;
&lt;br /&gt;
For correct functioning, please create the proxy in the following way (example for Virgo):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /bin/voms-proxy-init --voms virgo:/virgo/ligo --out ./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=./x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
ls: cannot access /cvmfs/ligo.osgstorage.org: Permission denied&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the proxy cannot be located properly, because the relative path no longer resolves. Therefore, we have to put the complete path into the variable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ pwd&lt;br /&gt;
/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ export X509_USER_PROXY=/nfs/pic.es/user/&amp;lt;letter&amp;gt;/&amp;lt;user&amp;gt;/x509up_u$(id -u)&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ ls /cvmfs/ligo.osgstorage.org&lt;br /&gt;
frames  powerflux  pycbc  test_access&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
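The fix above can also be sketched in Python: resolve the relative proxy path to an absolute one before exporting it (the file name follows the voms-proxy-init example above):&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch: X509_USER_PROXY must hold an absolute path inside a Jupyter
# session; resolve the relative proxy location before exporting it.
import os

relative = f"./x509up_u{os.getuid()}"      # what voms-proxy-init --out wrote
absolute = os.path.abspath(relative)       # anchored at the current directory

os.environ["X509_USER_PROXY"] = absolute   # equivalent to the export above
print(os.path.isabs(os.environ["X509_USER_PROXY"]))
```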
&lt;br /&gt;
= Software of particular interest =&lt;br /&gt;
== SageMath ==&lt;br /&gt;
&lt;br /&gt;
[https://www.sagemath.org/ SageMath] is particularly interesting for Cosmology because it allows symbolic calculations, e.g. deriving the equations of motion for the scale factor starting from a customised space-time metric. &lt;br /&gt;
&lt;br /&gt;
=== Standard cosmology examples ===&lt;br /&gt;
&lt;br /&gt;
* The Friedmann equations for the FLRW solution of the Einstein equations.&lt;br /&gt;
You can find the corresponding Notebook in any PIC terminal at '''/data/astro/software/notebooks/FLRW_cosmology.ipynb'''&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/FLRW_cosmology_solutions.ipynb''' uses known analytical solutions of the FLRW cosmology and produces this image for the evolution of the scale factor:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage05.png|300px]]&lt;br /&gt;
&lt;br /&gt;
* The notebook you can find at '''/data/astro/software/notebooks/Interior_Schwarzschild.ipynb''' shows the formalism for the interior Schwarzschild metric and displays the solutions for density and pressure of a static celestial object whose radius is sufficiently larger than its Schwarzschild radius. The pressure for an object with constant density is shown in the image:&lt;br /&gt;
&lt;br /&gt;
[[File:Screenshot_Sage06.png|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Enabling SageMath environment in Jupyter ===&lt;br /&gt;
&lt;br /&gt;
If you have never initialized mamba, run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ /data/astro/software/centos7/conda/mambaforge_4.14.0/bin/mamba init&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ conda config --set auto_activate_base false&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After that you can enable SageMath for its use in a Jupyter notebook session: &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba activate /data/astro/software/envs/sage&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ python -m ipykernel install --user --name=sage&lt;br /&gt;
....&lt;br /&gt;
(/data/astro/software/envs/sage) [&amp;lt;user&amp;gt;@&amp;lt;hostname&amp;gt; ~]$ mamba deactivate&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This creates a file in your home, '''~/.local/share/jupyter/kernels/sage/kernel.json''',&lt;br /&gt;
which has to be modified to look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
 &amp;quot;argv&amp;quot;: [&lt;br /&gt;
  &amp;quot;/data/astro/software/envs/sage/bin/sage&amp;quot;,&lt;br /&gt;
  &amp;quot;--python&amp;quot;,&lt;br /&gt;
  &amp;quot;-m&amp;quot;,&lt;br /&gt;
  &amp;quot;sage.repl.ipython_kernel&amp;quot;,&lt;br /&gt;
  &amp;quot;-f&amp;quot;,&lt;br /&gt;
  &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
 ],&lt;br /&gt;
 &amp;quot;display_name&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;language&amp;quot;: &amp;quot;sage&amp;quot;,&lt;br /&gt;
 &amp;quot;metadata&amp;quot;: {&lt;br /&gt;
  &amp;quot;debugger&amp;quot;: true&lt;br /&gt;
 }&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next time you go to your Jupyter dashboard you will find the sage environment listed there.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Dask =&lt;br /&gt;
&lt;br /&gt;
[[Media:dask_htcondor.pdf|Dask + HTCondor manual]]&lt;br /&gt;
&lt;br /&gt;
[[Dask wiki]]&lt;br /&gt;
&lt;br /&gt;
= Using a singularity image as a jupyter kernel =&lt;br /&gt;
&lt;br /&gt;
Sometimes a project's software stack is provided in the form of a singularity image; it is then convenient to use this image as a kernel for the notebooks in jupyter.pic.es.&lt;br /&gt;
&lt;br /&gt;
The singularity image to be used as a kernel needs to fulfill some requirements. Different requirements apply depending on the programming language.&lt;br /&gt;
&lt;br /&gt;
== python jupyter kernel in a singularity image ==&lt;br /&gt;
&lt;br /&gt;
The singularity image needs to have the '''python''' and the '''ipykernel''' module installed. &lt;br /&gt;
&lt;br /&gt;
* Create the folder that will host the kernel definition&lt;br /&gt;
&lt;br /&gt;
  mkdir -p $HOME/.local/share/jupyter/kernels/singularity&lt;br /&gt;
&lt;br /&gt;
* Create the '''kernel.json''' file inside it with the following content:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 {&lt;br /&gt;
   &amp;quot;argv&amp;quot;: [&lt;br /&gt;
     &amp;quot;singularity&amp;quot;,&lt;br /&gt;
     &amp;quot;exec&amp;quot;,&lt;br /&gt;
     &amp;quot;--cleanenv&amp;quot;,&lt;br /&gt;
     &amp;quot;/path/to/the/singularity/image.sif&amp;quot;,&lt;br /&gt;
     &amp;quot;python&amp;quot;,&lt;br /&gt;
     &amp;quot;-m&amp;quot;,&lt;br /&gt;
     &amp;quot;ipykernel&amp;quot;,&lt;br /&gt;
     &amp;quot;-f&amp;quot;,&lt;br /&gt;
     &amp;quot;{connection_file}&amp;quot;&lt;br /&gt;
   ],&lt;br /&gt;
   &amp;quot;language&amp;quot;: &amp;quot;python&amp;quot;,&lt;br /&gt;
   &amp;quot;display_name&amp;quot;: &amp;quot;singularity-kernel&amp;quot;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Refresh or start the JupyterLab interface and the singularity kernel should appear in the launcher tab.&lt;br /&gt;
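The kernel.json above can also be written programmatically with the standard-library json module (a sketch; the image path and kernel name are the same placeholders as in the example, and the temporary directory stands in for $HOME/.local/share/jupyter/kernels/singularity):&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch: generate the singularity kernel.json shown above.
# The image path is a placeholder; the temporary directory stands in
# for $HOME/.local/share/jupyter/kernels/singularity.
import json
import tempfile
from pathlib import Path

kernel_dir = Path(tempfile.mkdtemp()) / "singularity"
kernel_dir.mkdir(parents=True)

spec = {
    "argv": [
        "singularity", "exec", "--cleanenv",
        "/path/to/the/singularity/image.sif",
        "python", "-m", "ipykernel", "-f", "{connection_file}",
    ],
    "language": "python",
    "display_name": "singularity-kernel",
}
(kernel_dir / "kernel.json").write_text(json.dumps(spec, indent=2))

# Read it back to confirm the file round-trips as valid JSON:
print(json.loads((kernel_dir / "kernel.json").read_text())["display_name"])
```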
&lt;br /&gt;
= GPUs =&lt;br /&gt;
&lt;br /&gt;
The way to identify the GPUs assigned to your job is:&lt;br /&gt;
* check the environment variable CUDA_VISIBLE_DEVICES: in a terminal, run &amp;quot;echo $CUDA_VISIBLE_DEVICES&amp;quot;. The variable contains a comma-separated list of GPU ids, which already tells you how many GPUs are assigned to your job. If the variable does not exist, no GPUs are assigned to the job.&lt;br /&gt;
&lt;br /&gt;
* list the GPUs with nvidia-smi: in a terminal, run &amp;quot;nvidia-smi -L&amp;quot; and look for the GPUs you have been assigned. Remember their indexes (integers from 0 to 7).&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_id_highlighted.png]]&lt;br /&gt;
&lt;br /&gt;
* in the GPU dashboard, the GPUs are identified by their index&lt;br /&gt;
&lt;br /&gt;
[[File:check_gpu_resources_highlighted.png]]&lt;br /&gt;
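The CUDA_VISIBLE_DEVICES check described above can be sketched in Python (the helper function name is ours, not part of any API):&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch: read the GPU ids assigned to the job from CUDA_VISIBLE_DEVICES.
# An unset or empty variable means no GPUs were assigned.
import os

def assigned_gpus(env=None):
    env = os.environ if env is None else env
    raw = env.get("CUDA_VISIBLE_DEVICES", "")
    return [gpu for gpu in raw.split(",") if gpu]

print(assigned_gpus({"CUDA_VISIBLE_DEVICES": "0,2"}))  # ['0', '2']
print(assigned_gpus({}))                               # [] means no GPUs assigned
```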
&lt;br /&gt;
= Jupyterlab user guide =&lt;br /&gt;
&lt;br /&gt;
You can find the official documentation of the currently installed version of JupyterLab (3.6) [https://jupyterlab.readthedocs.io/en/3.6.x/ here], where you will find instructions on how to: &lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/commands.html Access the command palette]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/toc.html Build a Table Of Contents]&lt;br /&gt;
* [https://jupyterlab.readthedocs.io/en/3.6.x/user/debugger.html Debug your code]&lt;br /&gt;
&lt;br /&gt;
A set of non-official JupyterLab extensions is installed to provide additional functionality.&lt;br /&gt;
&lt;br /&gt;
== jupytext ==&lt;br /&gt;
Pair your notebooks with text files to enhance version tracking.&lt;br /&gt;
https://jupytext.readthedocs.io&lt;br /&gt;
&lt;br /&gt;
=== Example ===&lt;br /&gt;
&lt;br /&gt;
If you had a notebook (.ipynb file) containing only the cell below, tracked in a git repository&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%matplotlib inline&lt;br /&gt;
import numpy as np&lt;br /&gt;
import matplotlib.pyplot as plt&lt;br /&gt;
plt.imshow(np.random.random([10, 10]))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Different executions of the cell would produce different images, and the images are embedded in a pseudo-binary format inside the notebook file. In this case, doing a '''git diff''' of the .ipynb file would produce a huge output (because the image changed), even if there wasn't any change in the code. It is thus convenient to sync the notebook with a text file (e.g. a .py script) using the jupytext extension and track this one with git. The outputs, including images, as well as some additional metadata, won't be added to the synced text file. So in the case of different executions of the same notebook, the diff will always be empty.&lt;br /&gt;
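For illustration (an assumed py:percent pairing, not output generated by jupytext itself), the paired .py counterpart of the cell above would look roughly like this, with outputs and images omitted and the magic kept as a comment:&lt;br /&gt;
&lt;br /&gt;
```python
# %%
# %matplotlib inline
import numpy as np
import matplotlib.pyplot as plt

plt.imshow(np.random.random([10, 10]))
```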
&lt;br /&gt;
== git ==&lt;br /&gt;
Sidebar GUI for git repository management.&lt;br /&gt;
https://github.com/jupyterlab/jupyterlab-git&lt;br /&gt;
&lt;br /&gt;
== variable inspector ==&lt;br /&gt;
Variable inspection à la Matlab&lt;br /&gt;
https://github.com/jupyterlab-contrib/jupyterlab-variableInspector&lt;/div&gt;</summary>
		<author><name>Torradeflot</name></author>
	</entry>
</feed>