@@ -29,12 +29,12 @@ Credits: Here the self-hosted model LLaMA 3 70B Instruct was used to generate th
## Building the container 🛠️
```
me@local-machine~ % ssh glogin
[me@glogin9 ~]$ cd /path/to/deep-learning-with-gpu-cores/code/container
[me@glogin9 ~]$ bash build_dlc-dlwgpu.sh
```
The container can be built on our various HPC clusters using the `build_dlc-*.sh` scripts. Details on how to use Apptainer and the process of building containers are described in our [documentation](https://docs.hpc.gwdg.de/software/apptainer/) and our blog article [Decluttering your Python environments](https://gitlab-ce.gwdg.de/hpc-team-public/science-domains-blog/-/blob/main/20230907_python-apptainer.md).
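Under the hood, each `build_dlc-*.sh` script essentially wraps an Apptainer build. A rough sketch of the equivalent manual command is shown below; the exact flags, file names, and log handling used by the scripts may differ:

```
# build the image from the definition file and keep a copy of the build output
apptainer build dlc-dlwgpu.sif dlc-dlwgpu.def 2>&1 | tee build_dlc-dlwgpu.log
```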
Running `build_dlc-dlwgpu.sh` builds an image with the software used for our workshop example, as defined in `dlc-dlwgpu.def`. In contrast to the traditional approach of letting conda install all packages listed in a `requirements.txt` file, pip is used here to keep the number of installed software packages small. There are nevertheless good reasons to use conda, so `build_dlc-conda-example.sh` provides a minimal example of installing Python packages in a container with conda (see `dlc-conda-example.def`).
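For orientation, a pip-based Apptainer definition file looks roughly like the sketch below. The base image and package list are placeholders; the actual definitions live in `dlc-dlwgpu.def` and `dlc-conda-example.def`.

```
Bootstrap: docker
From: python:3.11-slim

%post
    # install the Python packages the workload needs (placeholder list)
    pip install --no-cache-dir torch numpy

%runscript
    # forward all arguments to the Python interpreter inside the container
    exec python "$@"
```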
If you encounter errors while building the container, have a look at `build_dlc-dlwgpu.log` or `build_dlc-conda-example.log`. You can also use `cat` (in a different terminal) to follow the progress of the image build.
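For example, from a second terminal on the login node (assuming the log file is written next to the build script; adjust the path if yours differs):

```
# print the build log collected so far
cat build_dlc-dlwgpu.log

# or follow it live while the build is running
tail -f build_dlc-dlwgpu.log
```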
...
@@ -82,8 +82,12 @@ cd path/to/deep-learning-with-gpu-cores/code/
# get available GPUs with (see docs.hpc.gwdg.de/usage_guide/slurm/gpu_usage/)
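# for example (assumed command, not necessarily the one used in the original script;
# it lists node names, their GPU resources, and partitions via Slurm's sinfo):
#   sinfo -o "%25N %10G %18P" | grep gpu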