
aiida-ase

The official AiiDA plugin for ASE.

Status: beta | Compatible with AiiDA >=2.0,<3.0

General information

Install: `pip install aiida-ase`
Python import: `import aiida_ase`
Latest version: 3.0.0
Released: 2023-10-04

Registry checks

W008: Unable to reach documentation URL: https://aiida-ase.readthedocs.io/

Plugins provided

Calculations: 1 | Parsers: 2 | Workflows: 1
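For quick reference, the counts above correspond to the entry points documented in the tables that follow, collected here as plain data (this summary is assembled from this page, not generated by the plugin):

```python
# Entry points provided by aiida-ase, grouped by AiiDA entry point group,
# as listed on this registry page.
ENTRY_POINTS = {
    "aiida.calculations": ["ase.ase"],
    "aiida.parsers": ["ase.ase", "ase.gpaw"],
    "aiida.workflows": ["ase.gpaw.base"],
}

# Per-group counts, matching the "Plugins provided" summary above.
counts = {group: len(names) for group, names in ENTRY_POINTS.items()}
print(counts)  # -> {'aiida.calculations': 1, 'aiida.parsers': 2, 'aiida.workflows': 1}
```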

Entry points

CalcJobs and calculation functions (`aiida.calculations`)
  • ase.ase

    class: aiida_ase.calculations.ase:AseCalculation

    `CalcJob` implementation that can be used to wrap around the ASE calculators.

    Inputs (name, required, valid types, description):
      • parameters (required, Dict): Input parameters for the namelists.
      • structure (required, StructureData): The input structure.
      • code (optional, AbstractCode | NoneType): The `Code` to use for this job. This input is required, unless the `remote_folder` input is specified, which means an existing job is being imported and no code will actually be run.
      • kpoints (optional, KpointsData | NoneType): The k-points to use for the calculation.
      • metadata (optional)
      • monitors (optional, Dict): Add monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job.
      • remote_folder (optional, RemoteData | NoneType): Remote directory containing the results of a calculation job that was already completed without AiiDA. The inputs should be passed to the `CalcJob` as normal, but instead of launching the actual job, the engine will recreate the input files and proceed straight to the retrieve step, where the files of this `RemoteData` will be retrieved as if the job had actually been launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual.
      • settings (optional, Dict | NoneType): Optional settings that control the plugin.
    Outputs (name, required, valid types, description):
      • remote_folder (required, RemoteData): Input files necessary to run the process will be stored in this folder node.
      • retrieved (required, FolderData): Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in `CalcInfo.retrieve_list`.
      • array (optional, ArrayData)
      • parameters (optional, Dict)
      • remote_stash (optional, RemoteStashData): Contents of the `stash.source_list` option are stored in this remote folder after job completion.
      • structure (optional, StructureData)
      • trajectory (optional, TrajectoryData)
    Exit codes (status, message):
      • 1: The process has failed with an unspecified error.
      • 2: The process failed with legacy failure mode.
      • 10: The process returned an invalid output.
      • 11: The process did not register a required output.
      • 100: The process did not have the required `retrieved` output.
      • 110: The job ran out of memory.
      • 120: The job ran out of walltime.
      • 131: The specified account is invalid.
      • 140: The node running the job failed.
      • 150: {message}
      • 300: One of the expected output files was missing.
      • 301: The log file from the DFT code was not written out.
      • 302: Relaxation did not complete.
      • 303: SCF failed.
      • 305: Cannot identify what went wrong.
      • 306: gpaw could not find the PAW potentials.
      • 307: Attribute Error found in the stderr file.
      • 308: Fermi level is infinite.
      • 400: The calculation ran out of walltime.
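The input table above encodes one non-obvious rule: `code` is optional only because `remote_folder` can stand in for it when importing an already completed job. A minimal sketch of that validation logic, using plain Python dictionaries in place of AiiDA node types (the function and all names here are illustrative, not part of the plugin's API; AiiDA performs this validation itself when the process is launched):

```python
# Illustrative check of the ``ase.ase`` input rules described above, using
# plain dicts instead of AiiDA nodes (Dict, StructureData, ...).
REQUIRED = {"parameters", "structure"}

def validate_inputs(inputs):
    """Return a list of problems with an input dictionary (sketch only)."""
    problems = [f"missing required input: {name}" for name in REQUIRED - inputs.keys()]
    # ``code`` is required unless ``remote_folder`` is specified, in which
    # case an existing job is imported and no code is actually run.
    if "code" not in inputs and "remote_folder" not in inputs:
        problems.append("either `code` or `remote_folder` must be given")
    return problems

# The nested layout of ``parameters`` below is an assumption for illustration.
inputs = {"parameters": {"calculator": {"name": "gpaw"}}, "structure": "<StructureData>"}
print(validate_inputs(inputs))  # -> ['either `code` or `remote_folder` must be given']
```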
CalcJob parsers (`aiida.parsers`)
  • ase.ase

    aiida_ase.parsers.ase:AseParser
  • ase.gpaw

    aiida_ase.parsers.gpaw:GpawParser
WorkChains and work functions (`aiida.workflows`)
  • ase.gpaw.base

    class: aiida_ase.workflows.base:GpawBaseWorkChain

    Workchain to run a GPAW calculation with automated error handling and restarts.

    Inputs (name, required, valid types, description):
      • gpaw (required, Data)
      • structure (required, StructureData): The input structure.
      • clean_workdir (optional, Bool): If `True`, work directories of all called calculation jobs will be cleaned at the end of execution.
      • handler_overrides (optional, Dict | NoneType): Mapping where keys are process handler names and the values are dictionaries, each of which can define the ``enabled`` and ``priority`` keys, used to toggle the values set on the original process handler declaration.
      • kpoints (optional, KpointsData | NoneType): The k-points to use for the calculation.
      • max_iterations (optional, Int): Maximum number of times the work chain will restart the process to finish successfully.
      • metadata (optional)
    Outputs (name, required, valid types, description):
      • remote_folder (required, RemoteData): Input files necessary to run the process will be stored in this folder node.
      • retrieved (required, FolderData): Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in `CalcInfo.retrieve_list`.
      • array (optional, ArrayData)
      • parameters (optional, Dict)
      • remote_stash (optional, RemoteStashData): Contents of the `stash.source_list` option are stored in this remote folder after job completion.
      • structure (optional, StructureData)
      • trajectory (optional, TrajectoryData)
    Exit codes (status, message):
      • 1: The process has failed with an unspecified error.
      • 2: The process failed with legacy failure mode.
      • 10: The process returned an invalid output.
      • 11: The process did not register a required output.
      • 301: The sub process excepted.
      • 302: The sub process was killed.
      • 401: The maximum number of iterations was exceeded.
      • 402: The process failed for an unknown reason, twice in a row.
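The 401/402 exit codes describe the work chain's restart contract: re-run the calculation until it succeeds, bail out if it fails the same way twice in a row, and give up once `max_iterations` is exhausted. A toy model of that control flow (all names here are hypothetical; the real logic lives in AiiDA's generic restart machinery, which the input and exit-code tables above mirror, plus the GPAW-specific error handlers):

```python
# Toy model of the restart loop implied by the exit codes above; this is a
# sketch for illustration, not the work chain's actual implementation.
def run_with_restarts(run_once, max_iterations=5):
    """Call ``run_once()`` (returning an exit status, 0 == success) until it
    succeeds or the iteration budget is spent; return a summary exit status."""
    last_status = None
    for _ in range(max_iterations):
        status = run_once()
        if status == 0:
            return 0           # finished successfully
        if status == last_status:
            return 402         # failed the same way twice in a row
        last_status = status
    return 401                 # maximum number of iterations exceeded
```

For example, a job that hits a transient failure (say, exit code 65) and then converges on the retry would yield 0, while two consecutive identical failures would short-circuit to 402 instead of burning the remaining iterations.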