AiiDA-vibroscopy: problem running the code

Hi all,

I am new to AiiDA in general and I want to calculate Raman spectra with aiida-vibroscopy.
I prepared the following overrides.yaml and submit.py files (please see below).

The submit.py script was submitted successfully with PK=42, but something is not right.
I am not sure what is wrong; can you please tell me what is going on here?

Then when I check the status:

(aiidaENV) rkarkee@ch-fe1:/lustre/scratch5/rkarkee> verdi process status 42
IRamanSpectraWorkChain<42> Finished [400] [1:inspect_process]
    └── HarmonicWorkChain<44> Finished [400] [2:inspect_processes]
        ├── generate_preprocess_data<45> Finished [0]
        ├── PhononWorkChain<50> Finished [400] [3:inspect_base_supercell]
        │   ├── generate_preprocess_data<51> Finished [0]
        │   ├── get_supercell<57> Finished [0]
        │   └── PwBaseWorkChain<60> Finished [301] [2:while_(should_run_process)(2:inspect_process)]
        │       └── PwCalculation<63> Excepted
        └── DielectricWorkChain<55> Excepted [2:run_base_scf]

And the process report for PK=44 shows:

(aiidaENV) rkarkee@ch-fe1:/lustre/scratch5/rkarkee> verdi process report 44
2024-02-14 20:49:59 [5  | REPORT]: [44|HarmonicWorkChain|run_phonon]: submitting `PhononWorkChain` <PK=50>
2024-02-14 20:49:59 [6  | REPORT]: [44|HarmonicWorkChain|run_dielectric]: submitting `DielectricWorkChain` <PK=55>
2024-02-14 20:50:00 [7  | REPORT]:   [55|DielectricWorkChain|on_except]: Traceback (most recent call last):
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1969, in _exec_single_context
    self.dialect.do_execute(
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 922, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: database is locked

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/plumpy/base/state_machine.py", line 324, in transition_to
    self._enter_next_state(new_state)
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/plumpy/base/state_machine.py", line 388, in _enter_next_state
    self._fire_state_event(StateEventHook.ENTERED_STATE, last_state)
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/plumpy/base/state_machine.py", line 300, in _fire_state_event
    callback(self, hook, state)
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/plumpy/processes.py", line 331, in <lambda>
    lambda _s, _h, from_state: self.on_entered(cast(Optional[process_states.State], from_state)),
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/aiida/engine/processes/process.py", line 424, in on_entered
    set_process_state_change_timestamp(self)
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/aiida/engine/utils.py", line 287, in set_process_state_change_                                                                                                     timestamp
    backend.set_global_variable(key, value, description)
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/aiida/storage/psql_dos/backend.py", line 411, in set_global_va                                                                                                     riable
    session.query(DbSetting).filter(DbSetting.key == key).update(dict(val=value))
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/orm/query.py", line 3271, in update
    result: CursorResult[Any] = self.session.execute(
                                ^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2308, in execute
    return self._execute_internal(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2190, in _execute_internal
    result: Result[Any] = compile_state_cls.orm_execute_statement(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/orm/bulk_persistence.py", line 1617, in orm_execute                                                                                                     _statement
    return super().orm_execute_statement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/orm/context.py", line 293, in orm_execute_statement
    result = conn.execute(
             ^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1416, in execute
    return meth(
           ^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/sql/elements.py", line 517, in _execute_on_connecti                                                                                                     on
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1639, in _execute_clauseeleme                                                                                                     nt
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1848, in _execute_context
    return self._exec_single_context(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1988, in _exec_single_context
    self._handle_dbapi_exception(
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2344, in _handle_dbapi_except                                                                                                     ion
    raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1969, in _exec_single_context
    self.dialect.do_execute(
  File "/users/rkarkee/conda/envs/aiidaENV/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 922, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) database is locked
[SQL: UPDATE db_dbsetting SET val=?, time=? WHERE db_dbsetting."key" = ?]
[parameters: ('"2024-02-14T13:50:00.839680-07:00"', '2024-02-14 13:50:00.842878', 'process|state_change|work')]
(Background on this error at: https://sqlalche.me/e/20/e3q8)

2024-02-14 20:50:00 [8  | REPORT]:   [55|DielectricWorkChain|on_terminated]: remote folders will not be cleaned
2024-02-14 20:50:01 [9  | REPORT]:   [50|PhononWorkChain|run_base_supercell]: launching base supercell scf PwBaseWorkChain<60>
2024-02-14 20:50:02 [10 | REPORT]:     [60|PwBaseWorkChain|run_process]: launching PwCalculation<63> iteration #1
2024-02-14 20:50:02 [12 | REPORT]:     [60|PwBaseWorkChain|on_terminated]: remote folders will not be cleaned
2024-02-14 20:50:02 [13 | REPORT]:   [50|PhononWorkChain|inspect_base_supercell]: base supercell scf failed with exit status 301
2024-02-14 20:50:03 [14 | REPORT]:   [50|PhononWorkChain|on_terminated]: remote folders will not be cleaned
2024-02-14 20:50:03 [15 | REPORT]: [44|HarmonicWorkChain|inspect_processes]: the child `PhononWorkChain` with <PK=50> failed
2024-02-14 20:50:03 [16 | REPORT]: [44|HarmonicWorkChain|on_terminated]: remote folders will not be cleaned

Inputs are:

%%%%%%%%%%%%% overrides.yaml %%%%%%%%%%%%%%%%
clean_workdir: false # whether to clean the working directories
dielectric:
  clean_workdir: false
  kpoints_parallel_distance: 0.2 # kpoints distance in Angstrom^-1 to sample the BZ parallel to the electric field. If used, it should help the final results converge faster
  property: raman
  # central_difference: # if you know what you are doing, custom numerical derivatives with respect to electric field
  #   accuracy: 2
  #   electric_field_step: 0.0005
  scf:
    pseudo_family: new_PBE
    kpoints_distance: 0.4 # kpoints distance in Angstrom^-1 to sample the BZ
    kpoints_force_parity: false
    max_iterations: 5
    pw:
      metadata:
        options:
          max_wallclock_seconds: 43200
          resources:
            num_machines: 1
            num_mpiprocs_per_machine: 256
          # queue_name: standard # for SLURM
          # account: account_name # for SLURM, also for project etc
          withmpi: true
      parameters:
        ELECTRONS:
          conv_thr: 2.0e-10
          electron_maxstep: 100
          mixing_beta: 0.2
        SYSTEM:
          ecutrho: 280.0
          ecutwfc: 70.0
          vdw_corr: Grimme-D2


  settings:
    sleep_submission_time: 1.0
phonon:
  clean_workdir: false
  displacement_generator:
    distance: 0.01 # atomic displacements for phonon calculation, in Angstrom
  scf:
    pseudo_family: new_PBE
    kpoints_distance: 0.15 # kpoints distance in Angstrom^-1 to sample the BZ
    kpoints_force_parity: false
    max_iterations: 5
    pw:
      metadata:
        options:
          max_wallclock_seconds: 43200
          resources:
            num_machines: 1
            num_mpiprocs_per_machine: 1
          # queue_name: partition_name # for SLURM
          # account: account_name # for SLURM, also for project etc
          withmpi: true
      settings:
        cmdline: ['-nk', '8']
        # gamma_only: True # to use only if KpointsData has only a mesh 1 1 1 0 0 0 (i.e. Gamma not shifted)
      parameters:
        ELECTRONS:
          conv_thr: 2.0e-12
          electron_maxstep: 80
          mixing_beta: 0.4
        SYSTEM:
          ecutwfc: 70.0
          ecutrho: 280
          vdw_corr: Grimme-D2

  settings:
    sleep_submission_time: 1.0 # waiting time in seconds between different submission of SCF calculation. Recommended to be at least 1 second, to not overload.
settings:
  run_parallel: true
  use_primitive_cell: false
symmetry:
  distinguish_kinds: false
  is_symmetry: true
  symprec: 1.0e-05



%%%%%      submit.py  %%%%%%%%%%%%%%%%%
# -*- coding: utf-8 -*-
# pylint: disable=line-too-long,wildcard-import,pointless-string-statement,unused-wildcard-import
"""Submit an IRamanSpectraWorkChain via the get_builder_from_protocol using the overrides."""
from pathlib import Path

from aiida import load_profile
from aiida.engine import submit
from aiida.orm import *
from aiida_quantumespresso.common.types import ElectronicType

from aiida_vibroscopy.workflows.spectra.iraman import IRamanSpectraWorkChain

load_profile()

# =============================== INPUTS =============================== #
# Please, change the following inputs.
mesh = [[4, 4, 2], [0.5, 0.5, 0.5]]
pseudo_family_name='pbe_psp'
pw_code_label = 'qe-7.3@newhpc'
structure_id = 21  # PK or UUID of your AiiDA StructureData
protocol = 'fast'  # also 'moderate' and 'precise'; 'moderate' should be good enough in general
overrides_filepath = './overrides.yaml'  # should be a path, e.g. /path/to/overrides.yaml. Format is YAML
# Consult the documentation for HOW-TO for how to use properly the overrides.
# !!!!! FOR FULL INPUT NESTED STRUCTURE: https://aiida-vibroscopy.readthedocs.io/en/latest/topics/workflows/spectra/iraman.html
# You can follow the input structure provided on the website to fill further the overrides.
# ====================================================================== #
# If you don't have a StructureData, but you have a CIF or XYZ, or similar, file
# you can import your structure uncommenting the following:
# from ase.io import read
# atoms = read('/path/to/file.cif')
# structure = StructureData(ase=atoms)
# structure.store()
# structure_id =  structure.pk
# print(f"Your structure has been stored in the database with PK={structure_id}")


def main():
    """Submit an IRamanSpectraWorkChain calculation."""
    code = load_code(pw_code_label)
    structure = load_node(structure_id)
    kwargs = {'electronic_type': ElectronicType.INSULATOR}

    kpoints = KpointsData()
    kpoints.set_kpoints_mesh(mesh[0], mesh[1])

    #pseudo_family = load_group(pseudo_family_name)
    #pseudos = pseudo_family.get_pseudos(structure=structure)

    builder = IRamanSpectraWorkChain.get_builder_from_protocol(
        code=code,
        structure=structure,
        protocol=protocol,
        overrides=Path(overrides_filepath),
        **kwargs,
    )

    builder.dielectric.scf.kpoints = kpoints
    builder.dielectric.pop('kpoints_parallel_distance', None)
    builder.dielectric.scf.pop('kpoints_distance', None)
    builder.phonon.scf.kpoints = kpoints

    #builder.dielectric.scf.pw.pseudos = pseudos
    #builder.phonon.scf.pw.pseudos = pseudos

    calc = submit(builder)
    print(f'Submitted IRamanSpectraWorkChain with PK={calc.pk} and UUID={calc.uuid}')
    print('Register *at least* the PK number, e.g. in your submit script.')
    print('You can monitor the status of your calculation with the following commands:')
    print('  * verdi process status PK')
    print('  * verdi process list -L IRamanSpectraWorkChain # list all running IRamanSpectraWorkChain')
    print(
        '  * verdi process list -ap1 -L IRamanSpectraWorkChain # list all IRamanSpectraWorkChain submitted in the previous 1 day'
    )
    print('If the WorkChain finishes with exit code 0, then you can inspect the outputs and post-process the data.')
    print('Use the command')
    print('  * verdi process show PK')
    print('To show further information about your WorkChain. When finished, you should see some outputs.')
    print('The main output can be accessed via `load_node(PK).outputs.vibrational_data.numerical_accuracy_*`.')
    print('You have to complete the remaining `*`, which depends upon the accuracy of the calculation.')
    print('See also the documentation and the reference paper for further details.')


if __name__ == '__main__':
    """Run script."""
    main()

Hi @bastonero, any suggestions on this?

Hi @rkarkee, thank you for your question. I recommend enhancing the readability of your post by tidying up the code formatting. For guidance on how to properly format the code in your post, you might find this resource helpful: Discourse Guide: Code Formatting - Meta - Stonehearth Discourse. Thank you!

@rkarkee I formatted your post a bit with Markdown syntax for the code snippets. :slight_smile:

I think this is the issue:

sqlite3.OperationalError: database is locked

I assume you are using the recently released core.sqlite_dos storage for your profile? Although this variant is easy to set up, since SQLite does not require a running service like PostgreSQL, its performance is not the same. One known issue is concurrent writing (i.e. the SQL UPDATE statement), see

And indeed, the error reports the database is still locked when trying to perform an UPDATE:

sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) database is locked
[SQL: UPDATE db_dbsetting SET val=?, time=? WHERE db_dbsetting."key" = ?]
[parameters: ('"2024-02-14T13:50:00.839680-07:00"', '2024-02-14 13:50:00.842878', 'process|state_change|work')]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
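
To double-check which storage backend your profile is configured with, the following should work (a quick sketch; the exact output layout depends on your AiiDA version):

verdi profile show   # the storage section reports the backend, e.g. core.sqlite_dos or core.psql_dos
verdi status         # also summarizes the storage backend and whether the daemon is running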

Are you running with multiple daemon workers?
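
If so, a possible workaround (just a sketch, assuming an otherwise default daemon setup) is to restart the daemon with a single worker, so that only one process writes to the SQLite database at a time:

verdi daemon status    # shows how many daemon workers are currently running
verdi daemon stop
verdi daemon start 1   # restart with a single worker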