aiida-ase
The official AiiDA plugin for ASE.
Status: beta
Compatible with AiiDA >=2.0,<3.0
General information
Registry checks
W008: Unable to reach documentation URL: https://aiida-ase.readthedocs.io/
Plugins provided
Calculations: 1, Parsers: 2, Workflows: 1
Entry points
Calculations:
- ase.ase (aiida_ase.calculations.ase:AseCalculation): `CalcJob` implementation that can be used to wrap around the ASE calculators.
Inputs:
- parameters (required, Dict): Input parameters for the namelists.
- structure (required, StructureData): The input structure.
- code (optional, AbstractCode | NoneType): The `Code` to use for this job. This input is required, unless the `remote_folder` input is specified, which means an existing job is being imported and no code will actually be run.
- kpoints (optional, KpointsData | NoneType): The k-points to use for the calculation.
- metadata (optional)
- monitors (optional, Dict): Add monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job.
- remote_folder (optional, RemoteData | NoneType): Remote directory containing the results of an already completed calculation job without AiiDA. The inputs should be passed to the `CalcJob` as normal but instead of launching the actual job, the engine will recreate the input files and then proceed straight to the retrieve step where the files of this `RemoteData` will be retrieved as if it had been actually launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual.
- settings (optional, Dict | NoneType): Optional settings that control the plugin.

Outputs:
- remote_folder (required, RemoteData): Input files necessary to run the process will be stored in this folder node.
- retrieved (required, FolderData): Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in `CalcInfo.retrieve_list`.
- array (optional, ArrayData)
- parameters (optional, Dict)
- remote_stash (optional, RemoteStashData): Contents of the `stash.source_list` option are stored in this remote folder after job completion.
- structure (optional, StructureData)
- trajectory (optional, TrajectoryData)

Exit codes:
- 1: The process has failed with an unspecified error.
- 2: The process failed with legacy failure mode.
- 10: The process returned an invalid output.
- 11: The process did not register a required output.
- 100: The process did not have the required `retrieved` output.
- 110: The job ran out of memory.
- 120: The job ran out of walltime.
- 131: The specified account is invalid.
- 140: The node running the job failed.
- 150: {message}
- 300: One of the expected output files was missing.
- 301: The log file from the DFT code was not written out.
- 302: Relaxation did not complete.
- 303: SCF Failed.
- 305: Cannot identify what went wrong.
- 306: gpaw could not find the PAW potentials.
- 307: Attribute Error found in the stderr file.
- 308: Fermi level is infinite.
- 400: The calculation ran out of walltime.
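To make the required/optional split above concrete, here is a minimal sketch, in plain Python, of a validator for the `AseCalculation` input namespace. It deliberately uses plain dictionaries and placeholder values rather than real AiiDA nodes (`orm.Dict`, `orm.StructureData`, a process builder), since those require a configured AiiDA profile; only the input names and the `code`-vs-`remote_folder` rule are taken from the tables above.

```python
# Sketch of the AseCalculation input namespace using plain Python
# dictionaries. In real AiiDA usage these values would be orm nodes
# built through a loaded profile; here only the required/optional
# split from the documentation above is modelled.

REQUIRED_INPUTS = {"parameters", "structure"}
OPTIONAL_INPUTS = {"code", "kpoints", "metadata", "monitors",
                   "remote_folder", "settings"}


def validate_inputs(inputs: dict) -> list:
    """Return a list of problems with a prospective input dictionary."""
    problems = []
    for name in REQUIRED_INPUTS:
        if name not in inputs:
            problems.append(f"missing required input: {name}")
    for name in inputs:
        if name not in REQUIRED_INPUTS | OPTIONAL_INPUTS:
            problems.append(f"unknown input: {name}")
    # Per the input table: `code` is required unless `remote_folder`
    # is given (i.e. an already-completed job is being imported).
    if "code" not in inputs and "remote_folder" not in inputs:
        problems.append("either `code` or `remote_folder` must be provided")
    return problems


# Placeholder objects stand in for the real AiiDA data nodes.
inputs = {"parameters": object(), "structure": object(), "code": object()}
print(validate_inputs(inputs))  # -> []
```

The same check passes when `remote_folder` replaces `code`, matching the import-an-existing-job workflow described in the input table.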
Parsers:
- ase.ase (aiida_ase.parsers.ase:AseParser)
- ase.gpaw (aiida_ase.parsers.gpaw:GpawParser)
Workflows:
- ase.gpaw.base (aiida_ase.workflows.base:GpawBaseWorkChain): Workchain to run a GPAW calculation with automated error handling and restarts.
Inputs:
- gpaw (required, Data)
- structure (required, StructureData): The input structure.
- clean_workdir (optional, Bool): If `True`, work directories of all called calculation jobs will be cleaned at the end of execution.
- handler_overrides (optional, Dict | NoneType): Mapping where keys are process handler names and the values are a dictionary, where each dictionary can define the ``enabled`` and ``priority`` key, which can be used to toggle the values set on the original process handler declaration.
- kpoints (optional, KpointsData | NoneType): k-points to use for the calculation.
- max_iterations (optional, Int): Maximum number of iterations the work chain will restart the process to finish successfully.
- metadata (optional)

Outputs:
- remote_folder (required, RemoteData): Input files necessary to run the process will be stored in this folder node.
- retrieved (required, FolderData): Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in `CalcInfo.retrieve_list`.
- array (optional, ArrayData)
- parameters (optional, Dict)
- remote_stash (optional, RemoteStashData): Contents of the `stash.source_list` option are stored in this remote folder after job completion.
- structure (optional, StructureData)
- trajectory (optional, TrajectoryData)

Exit codes:
- 1: The process has failed with an unspecified error.
- 2: The process failed with legacy failure mode.
- 10: The process returned an invalid output.
- 11: The process did not register a required output.
- 301: The sub process excepted.
- 302: The sub process was killed.
- 401: The maximum number of iterations was exceeded.
- 402: The process failed for an unknown reason, twice in a row.
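The `handler_overrides` input above has a small but specific shape: a mapping from process-handler names to dictionaries that may carry `enabled` and `priority` keys. The sketch below illustrates that shape with plain Python; the handler name used is hypothetical (not taken from the plugin), only the dictionary structure comes from the input description.

```python
# Sketch of the `handler_overrides` mapping accepted by GpawBaseWorkChain.
# Keys are process-handler names; each value may define `enabled` and/or
# `priority`. The handler name below is a made-up example; only the
# dict shape is taken from the documented input description.

handler_overrides = {
    "handle_out_of_walltime": {  # hypothetical handler name
        "enabled": True,
        "priority": 600,
    },
}


def check_overrides(overrides: dict) -> bool:
    """Check each override uses only the documented `enabled`/`priority` keys."""
    allowed = {"enabled", "priority"}
    return all(set(value) <= allowed for value in overrides.values())


print(check_overrides(handler_overrides))  # -> True
```

In a real submission this mapping would be wrapped in an `orm.Dict` node and passed as the `handler_overrides` input of the work chain.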