Commit d2b9b16e authored by Hauke Kirchner

Update README.md

parent ab56af0b
@@ -29,12 +29,12 @@ Credits: Here the self-hosted model LLaMA 3 70B Instruct was used to generate th
## Building the container 🛠️
```
me@local-machine~ % ssh glogin
[me@glogin9 ~]$ cd /path/to/deep-learning-with-gpu-cores/code/container
[me@glogin9 ~]$ bash build_dlc-dlwgpu.sh
```
The container can be built on our various HPC clusters using the `build_dlc-*.sh` scripts. Details on how to use Apptainer and the process of building containers are described in our [documentation](https://docs.hpc.gwdg.de/software/apptainer/) and our blog article [Decluttering your Python environments](https://gitlab-ce.gwdg.de/hpc-team-public/science-domains-blog/-/blob/main/20230907_python-apptainer.md).
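Such a build script typically boils down to an `apptainer build` call on the corresponding definition file. The sketch below only illustrates that step (file names are taken from this README, the exact flags and log handling are assumptions); the actual commands live in `build_dlc-dlwgpu.sh`:
```
# rough sketch only -- see build_dlc-dlwgpu.sh for the real build commands
module load apptainer
# build the image from the definition file and keep the build output as a log
apptainer build dlc.sif dlc-dlwgpu.def 2>&1 | tee build_dlc-dlwgpu.log
```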
Running `build_dlc-dlwgpu.sh` will build an image with the software used for our workshop example, as defined in `dlc-dlwgpu.def`. In contrast to the traditional approach of using conda to install all packages defined in a `requirements.txt` file, pip is used here to reduce the number of software packages installed. However, there are good reasons to use conda, so `build_dlc-conda-example.sh` shows a minimal example of installing Python packages in a container using conda (see `dlc-conda-example.def`).
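For orientation, a pip-based definition file can look roughly like the sketch below. The base image and file paths are placeholders, not the contents of `dlc-dlwgpu.def`; the definition files in this repository are the authoritative versions.
```
# illustrative sketch only, not the actual dlc-dlwgpu.def
Bootstrap: docker
From: python:3.11-slim              # placeholder base image

%files
    requirements.txt /opt/requirements.txt

%post
    pip install --no-cache-dir -r /opt/requirements.txt
```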
If you encounter errors while building the container, have a look at `build_dlc-dlwgpu.log` or `build_dlc-conda-example.log`. You can also use `cat` (in a different terminal) to see the progress of the image building process.
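For example, from a second terminal on the login node, `tail -f` follows the log as it grows (an alternative to repeated `cat` calls; the file name assumes the workshop build script):
```
# follow the build output while the image is being built
tail -f build_dlc-dlwgpu.log
```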
@@ -82,8 +82,12 @@ cd path/to/deep-learning-with-gpu-cores/code/
# list available GPUs with the command below (see docs.hpc.gwdg.de/usage_guide/slurm/gpu_usage/)
# sinfo -o "%25N %5c %10m %32f %10G %18P " | grep gpu
# grete
salloc -t 01:00:00 -p grete:interactive -N1 -G 3g.20gb
# kisski
salloc -t 01:00:00 -p kisski -N1 -G A100:1
module load apptainer
apptainer exec --nv --bind /scratch dlc.sif bash
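Here `--nv` makes the host's NVIDIA driver and GPUs visible inside the container, and `--bind /scratch` mounts the scratch file system so your data stay accessible. A quick, optional sanity check (not part of the original instructions) is to run `nvidia-smi` inside the container:
```
# should list the GPU(s) granted by salloc, e.g. a 3g.20gb MIG slice or a full A100
apptainer exec --nv --bind /scratch dlc.sif nvidia-smi
```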