From d2b9b16e2f31f72711bd78ee23bf8e500590cbad Mon Sep 17 00:00:00 2001
From: Hauke Kirchner <hauke.gronenberg@gwdg.de>
Date: Thu, 15 Aug 2024 13:56:08 +0000
Subject: [PATCH] Update README.md

---
 code/container/README.md | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/code/container/README.md b/code/container/README.md
index 22f042f..5d8fad8 100644
--- a/code/container/README.md
+++ b/code/container/README.md
@@ -29,12 +29,12 @@ Credits: Here the self-hosted model LLaMA 3 70B Instruct was used to generate th
 ## Building the container 🛠️
 
 ```
-me@local-machine~ % ssh grete
+me@local-machine~ % ssh glogin
 [me@glogin9 ~]$ cd /path/to/deep-learning-with-gpu-cores/code/container
 [me@glogin9 ~]$ bash build_dlc-dlwgpu.sh
 ```
 
-The container can be built on our various HCP clusters using the `build_dlc-*.sh` scripts. Details on how to use Apptainer and the process of building containers are described in our [documentation](https://docs.hpc.gwdg.de/software/apptainer/) and our blog article [Decluttering your Python environments](https://gitlab-ce.gwdg.de/hpc-team-public/science-domains-blog/-/blob/main/20230907_python-apptainer.md).
+The container can be built on our various HPC clusters using the `build_dlc-*.sh` scripts. Details on how to use Apptainer and the process of building containers are described in our [documentation](https://docs.hpc.gwdg.de/software/apptainer/) and our blog article [Decluttering your Python environments](https://gitlab-ce.gwdg.de/hpc-team-public/science-domains-blog/-/blob/main/20230907_python-apptainer.md).
 Running `build_dlc-dlwgpu.sh` will build an image with the software used for our workshop example, as defined in `dlc-dlwgpu.def`. In contrast to the traditional approach of using conda to install all the packages defined in a `requirements.txt` file, pip is used here to keep the number of installed software packages small. However, there are good reasons to use conda, so `build_dlc-conda-example.sh` provides a minimal example of installing Python packages in a container with conda (see `dlc-conda-example.def`).
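
For illustration only, a minimal definition file that installs Python packages with conda could look roughly like the sketch below; the base image and package list are placeholders and not necessarily the contents of `dlc-conda-example.def`.

```
Bootstrap: docker
From: condaforge/miniforge3:latest

%post
    # placeholder package list, install whatever the example actually needs
    conda install -y -c conda-forge numpy pandas
    conda clean -afy

%runscript
    exec python "$@"
```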
 
 If you encounter errors while building the container, have a look at `build_dlc-dlwgpu.log` or `build_dlc-conda-example.log`. You can also run `cat` on the log file (in a different terminal) to follow the progress of the image build.
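
The wrapper scripts themselves are not shown here, but conceptually a `build_dlc-*.sh` script boils down to calling `apptainer build` and capturing the output in the corresponding log file, along the lines of this sketch (file names and options are assumptions; the real scripts may differ):

```
#!/bin/bash
# hypothetical sketch of a build_dlc-*.sh wrapper, not the actual script
apptainer build dlc.sif dlc-dlwgpu.def 2>&1 | tee build_dlc-dlwgpu.log
```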
@@ -82,8 +82,12 @@ cd path/to/deep-learning-with-gpu-cores/code/
 # list available GPUs with the following command (see docs.hpc.gwdg.de/usage_guide/slurm/gpu_usage/)
 # sinfo -o "%25N  %5c  %10m  %32f  %10G %18P " | grep gpu
 
+# grete
 salloc -t 01:00:00 -p grete:interactive -N1 -G 3g.20gb
 
+# kisski
+salloc -t 01:00:00 -p kisski -N1 -G A100:1
+
 module load apptainer
 
 apptainer exec --nv --bind /scratch dlc.sif bash
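
To verify that the GPU is actually visible inside the container, a quick check (assuming the image is named `dlc.sif` as above) is:

```
apptainer exec --nv dlc.sif nvidia-smi
```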
-- 
GitLab