Developing inside the AiiDAlab Docker environment

When setting up a new development environment for AiiDA/AiiDAlab, there are many choices, e.g., conda, Docker, or a VM. In this post, I’d like to share my firsthand experience using the AiiDAlab Docker image, in the hope that my insights can benefit others.

Creating a New Profile
To begin, we’ll create a new profile named ‘dev-aiidalab’ using aiidalab-launch:

aiidalab-launch profile add --image aiidalab/qe:edge dev-aiidalab
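With the profile in place, the container can be started through the same tool (a sketch; this assumes the -p/--profile option of aiidalab-launch, as also used by the reset command later in this post):

```shell
# Start the container for the newly created profile
aiidalab-launch start -p dev-aiidalab
```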

Installing Packages as Root Inside the Docker Container

Open a root shell in the running container (replace xxx_container_id with your container’s ID or name):

docker exec -it -u root xxx_container_id /bin/bash
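If you don’t know the container ID, one way to look it up (assuming the default aiidalab-launch container naming, which includes “aiidalab” in the name) is:

```shell
# List running containers whose name contains "aiidalab"
docker ps --filter "name=aiidalab" --format "{{.ID}}\t{{.Names}}"
```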

Inside the container, you can update the package list and install any necessary packages. For instance, if you require ‘plocate’ for ‘pgtest’, you can run:

sudo apt update
# used for pgtest
sudo apt install plocate

More insights and tips will be added shortly.

RabbitMQ

If you want to use the verdi devel rabbitmq command to interact with RabbitMQ, first enable the management plugin:

# rabbitmq is installed in a separate conda environment.
conda activate aiida-core-services
rabbitmq-plugins enable rabbitmq_management
conda deactivate
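Once the management plugin is enabled, you can sanity-check it via the standard RabbitMQ management HTTP API (a sketch; this assumes the default guest/guest credentials and the default management port 15672):

```shell
# Query the management API; returns a JSON overview of the broker
curl -u guest:guest http://localhost:15672/api/overview
```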

GitHub gh command

Install the GitHub CLI from conda-forge:

conda install -c conda-forge gh

Log in to your GitHub account:

gh auth login

Backup

In some cases, the container can no longer start for unknown reasons. Don’t worry: your data is still there. Typically, it is in the folder

/var/lib/docker/volumes/aiidalab_dev-aiida_home/_data/
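One common way to back up such a Docker volume is to tar its contents from a throwaway helper container (a sketch; adjust the volume name to match your profile, and the backup filename is just an example):

```shell
# Archive the volume's contents into the current host directory
docker run --rm \
  -v aiidalab_dev-aiida_home:/data:ro \
  -v "$PWD":/backup \
  busybox tar czf /backup/aiidalab-home-backup.tar.gz -C /data .
```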

You can back up this data, reset the profile

aiidalab-launch reset -p dev-aiida

and create a new profile again.

Share Your AiiDAlab Docker Experience

If you have any experiences or insights to share regarding the AiiDAlab Docker environment, please feel free to contribute by adding a comment below. Your input can be invaluable to the community.


Thanks! Once we collect some feedback, this would be good for a blog post?

Thanks @Xing, it is super nice you started using it and provided your experience on how to use the images.

Per @giovannipizzi’s suggestion, I was planning to convert Building Docker Image with Pre-configured AiiDA Computer? - #3 by jusong.yu into a blog post. But I think you are right: using and developing inside the container deserves a dedicated blog post as well.
Although some users have started using it, we have not yet advertised it. Still, some minor things need to be settled, such as how to back up the data.

@Xing’s post uses the AiiDAlab image; there are now also aiida-core-with-services images for aiida-core specifically. They include some developer-oriented configuration, such as a preinstalled plocate (it is in the AiiDAlab image as well, but not in the base conda environment; another way to get pgtest working for pytest in the AiiDAlab container is to run conda activate aiida-core-services).

Let me use this post to list the things we need to do to make the images more accessible and useful:

  • Polish the documentation (Run AiiDA via a Docker image — AiiDA 2.4.0.post0 documentation) and add a section on the recommended way to back up the data or the entire container.
  • A tutorial-like blog post, including videos on how to easily get Docker running from Docker Desktop, plus user stories such as @Xing’s post.
  • From my side, I also want to push for developing inside the container using VS Code, which brings a lot of convenience, e.g., no need to further set up GitHub credentials.

To add to the topic, I use the aiida-core-with-services image as a burn-after-reading solution to play with AiiDA archives downloaded from Materials Cloud.

First, I download the acwf-verification_unaries-verification-PBE-v1_results_abinit_PseudoDojo_0.5b1_PBE_SR_standard_psp8.aiida archive from the Materials Cloud Archive.

Then I start a container from the aiida-core-with-services:latest image with the --rm option, so that the container is destroyed after exiting the shell:

docker run --rm -it --name test-container aiidateam/aiida-core-with-services:latest bash

Open a new shell on the host and copy the archive into the container:

docker cp acwf-verification_unaries-verification-PBE-v1_results_abinit_PseudoDojo_0.5b1_PBE_SR_standard_psp8.aiida test-container:/home/aiida

Inside the container shell, import the archive:

verdi archive import acwf-verification_unaries-verification-PBE-v1_results_abinit_PseudoDojo_0.5b1_PBE_SR_standard_psp8.aiida

Then it is ready to run verdi shell and play with queries. The advantage is that the tedious profile-creation process is avoided, and the container is destroyed once you exit the container shell. To keep the container so it can be reached next time, simply omit the --rm option; to restart it, run docker start -i test-container, and the shell will come back.

Good stuff @jusong.yu! As an alternative, for people who don’t have Docker or are not used to it, once this PR is merged and released with v2.5, users will be able to do the following:

verdi profile setup core.sqlite_zip -n --profile temp-profile --filepath path/to/archive.aiida

This will create a profile based on the archive. They can then run verdi shell to inspect the contents, or run any Python script that loads the profile. Once they are done, they run

verdi profile delete temp-profile

to delete the profile.

Great suggestions and comments! I also would be interested in easy development with VS code and a container.

@sphuber, one question: when you do verdi profile delete, will it also delete the .aiida file? To be consistent with the usual delete behaviour, I think it should; on the other hand, it might be unexpected? (Or maybe not, and people should be OK with the file being deleted?) In the latter case, would there be an option to just “unsetup” the profile without deleting the .aiida file?

Yes, there is the --delete-storage/--no-delete-storage option that is turned on by default, but can be switched off. When passing --no-delete-storage the archive is kept and the profile is just removed from the config.

I think the current default is to not delete the storage right?

At least I think this should be the default, else it’d indeed be risky that users create a temp profile from an archive, delete it and also accidentally delete the archive.

But let’s continue that discussion on the PR.

I made a mistake, it is off by default, so a user has to explicitly switch it on for the data to be deleted.