# aiida-bigdft

Translation layer for AiiDA-PyBigDFT

- Development status: beta
- Compatible AiiDA versions: `>=1.6.3,<3`

Entry-point groups used below:

| Group | Plugin type |
|---|---|
| `aiida.calculations` | CalcJobs and calculation functions |
| `aiida.cmdline.data` | verdi data commands |
| `aiida.data` | Data node types |
| `aiida.parsers` | CalcJob parsers |
| `aiida.workflows` | WorkChains and work functions |
## General information

Registry checks:

- **W005**: Development status in classifiers (alpha) does not match `development_status` in metadata (beta).
- **W006**: The `development_status` key is deprecated. Use PyPI Trove classifiers in the plugin repository instead.
## Plugins provided

| Calculations | Parsers | Data | Workflows | Other |
|---|---|---|---|---|
| 1 | 1 | 3 | 2 | 1 |
## Entry points
### `aiida.calculations`

**`bigdft`** = `aiida_bigdft.calculations:BigDFTCalculation`

AiiDA plugin wrapping a BigDFT calculation. Requires a valid BigDFT install and a copy of `bigdft.py` on the target machine.

Inputs:

| Input | Required | Valid types | Description |
|---|---|---|---|
| `structure` | yes | `StructureData` | Input structure (AiiDA format) |
| `code` | no | `AbstractCode`, `NoneType` | The `Code` to use for this job. This input is required, unless the `remote_folder` input is specified, which means an existing job is being imported and no code will actually be run. |
| `extra_files_recv` | no | `List` | Extra files to retrieve from the calculation |
| `extra_files_send` | no | `List` | Extra files to send with the calculation |
| `metadata` | no | | |
| `monitors` | no | `Dict` | Add monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job. |
| `parameters` | no | `BigDFTParameters` | BigDFT input-file parameters, as a `Dict` |
| `params_fname` | no | `Str` | Name override for the parameters file |
| `remote_folder` | no | `RemoteData`, `NoneType` | Remote directory containing the results of a calculation job already completed without AiiDA. The inputs should be passed to the `CalcJob` as normal, but instead of launching the actual job, the engine will recreate the input files and proceed straight to the retrieve step, where the files of this `RemoteData` will be retrieved as if the job had actually been launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual. |
| `structure_fname` | no | `Str` | Name override for the structure file |

Outputs:

| Output | Required | Valid types | Description |
|---|---|---|---|
| `energy` | yes | `Float` | Final energy estimate taken from the logfile |
| `logfile` | yes | `BigDFTLogfile` | BigDFT calculation logfile |
| `remote_folder` | yes | `RemoteData` | Input files necessary to run the process will be stored in this folder node. |
| `retrieved` | yes | `FolderData` | Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in `CalcInfo.retrieve_list`. |
| `timefile` | yes | `BigDFTFile` | BigDFT calculation time log |
| `ttotal` | yes | `Float` | Estimated total run time (excluding queue) |
| `remote_stash` | no | `RemoteStashData` | Contents of the `stash.source_list` option are stored in this remote folder after job completion. |

Exit codes:

| Exit status | Message |
|---|---|
| 1 | The process has failed with an unspecified error. |
| 2 | The process failed with legacy failure mode. |
| 10 | The process returned an invalid output. |
| 11 | The process did not register a required output. |
| 100 | The process did not have the required `retrieved` output. |
| 100 | Calculation did not produce all expected output files. |
| 101 | Calculation did not produce all expected output files. |
| 110 | The job ran out of memory. |
| 120 | The job ran out of walltime. |
| 131 | The specified account is invalid. |
| 140 | The node running the job failed. |
| 150 | {message} |
| 400 | Calculation did not finish because of a walltime issue. |
| 401 | Calculation did not finish because of a memory limit. |
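The `aiida_bigdft.calculations:BigDFTCalculation` string above follows the usual `module:attribute` entry-point convention; AiiDA's `CalculationFactory('bigdft')` performs an equivalent lookup through the entry-point machinery. A small stdlib-only sketch of how such a spec resolves to an object (demonstrated with a stdlib spec, since `aiida_bigdft` is only importable where the plugin is installed):

```python
import importlib

def resolve_entry_point_spec(spec: str):
    """Resolve a 'module:attribute' entry-point spec to the object it names."""
    module_name, _, attr = spec.partition(":")
    return getattr(importlib.import_module(module_name), attr)

# Example with a stdlib spec:
json_dumps = resolve_entry_point_spec("json:dumps")
```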
### `aiida.cmdline.data`

**`bigdft`** = `aiida_bigdft.cli:data_cli`
### `aiida.data`

**`bigdft`** = `aiida_bigdft.data.BigDFTParameters:BigDFTParameters`

**`bigdftfile`** = `aiida_bigdft.data.BigDFTFile:BigDFTFile`

**`bigdftlogfile`** = `aiida_bigdft.data.BigDFTFile:BigDFTLogfile`
### `aiida.parsers`

**`bigdft`** = `aiida_bigdft.parsers:BigDFTParser`
### `aiida.workflows`

**`bigdft`** = `aiida_bigdft.workflows.base:BigDFTBaseWorkChain`

Base workchain for running a BigDFT calculation.

Inputs:

| Input | Required | Valid types | Description |
|---|---|---|---|
| `BigDFT` | yes | `Data` | |
| `clean_workdir` | no | `Bool` | If `True`, work directories of all called calculation jobs will be cleaned at the end of execution. |
| `handler_overrides` | no | `Dict`, `NoneType` | Mapping where keys are process handler names and the values are a dictionary, where each dictionary can define the `enabled` and `priority` keys, which can be used to toggle the values set on the original process handler declaration. |
| `max_iterations` | no | `Int` | Maximum number of iterations the work chain will restart the process to finish successfully. |
| `metadata` | no | | |

Outputs:

| Output | Required | Valid types | Description |
|---|---|---|---|
| `energy` | yes | `Float` | Final energy estimate taken from the logfile |
| `logfile` | yes | `BigDFTLogfile` | BigDFT calculation logfile |
| `remote_folder` | yes | `RemoteData` | Input files necessary to run the process will be stored in this folder node. |
| `retrieved` | yes | `FolderData` | Files that are retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler will be added, but one can add more by specifying them in `CalcInfo.retrieve_list`. |
| `timefile` | yes | `BigDFTFile` | BigDFT calculation time log |
| `ttotal` | yes | `Float` | Estimated total run time (excluding queue) |
| `remote_stash` | no | `RemoteStashData` | Contents of the `stash.source_list` option are stored in this remote folder after job completion. |

Exit codes:

| Exit status | Message |
|---|---|
| 1 | The process has failed with an unspecified error. |
| 2 | The process failed with legacy failure mode. |
| 10 | The process returned an invalid output. |
| 11 | The process did not register a required output. |
| 300 | The calculation encountered an unrecoverable error. |
| 301 | The sub process excepted. |
| 302 | The sub process was killed. |
| 401 | The maximum number of iterations was exceeded. |
| 402 | The process failed for an unknown reason, twice in a row. |
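As the `max_iterations` input and exit code 401 indicate, the base workchain follows AiiDA's restart pattern: it re-launches the calculation until it succeeds or the iteration cap is reached. A schematic, stdlib-only sketch of that control flow (the function names and the default cap of 5 are illustrative, not the workchain's actual API):

```python
def run_with_restarts(run_once, max_iterations: int = 5) -> int:
    """Schematic restart loop: retry a process until it returns success (0)
    or the iteration cap is hit, then report exit code 401 as in the
    exit-code table above."""
    for _ in range(max_iterations):
        if run_once() == 0:
            return 0  # process finished successfully
    return 401  # the maximum number of iterations was exceeded

# A process that fails twice (memory, walltime), then succeeds:
attempts = iter([110, 120, 0])
assert run_with_restarts(lambda: next(attempts)) == 0
```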
**`bigdft.relax`** = `aiida_bigdft.workflows.relax:BigDFTRelaxWorkChain`
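For reference, the entry points listed on this page correspond to declarations of the `name = module:attribute` form in the plugin's packaging metadata. The mapping below is an illustrative Python summary of this page's listing, not a copy of the plugin's actual packaging file:

```python
# Illustrative summary of the entry points listed above.
ENTRY_POINTS = {
    "aiida.calculations": [
        "bigdft = aiida_bigdft.calculations:BigDFTCalculation",
    ],
    "aiida.cmdline.data": [
        "bigdft = aiida_bigdft.cli:data_cli",
    ],
    "aiida.data": [
        "bigdft = aiida_bigdft.data.BigDFTParameters:BigDFTParameters",
        "bigdftfile = aiida_bigdft.data.BigDFTFile:BigDFTFile",
        "bigdftlogfile = aiida_bigdft.data.BigDFTFile:BigDFTLogfile",
    ],
    "aiida.parsers": [
        "bigdft = aiida_bigdft.parsers:BigDFTParser",
    ],
    "aiida.workflows": [
        "bigdft = aiida_bigdft.workflows.base:BigDFTBaseWorkChain",
        "bigdft.relax = aiida_bigdft.workflows.relax:BigDFTRelaxWorkChain",
    ],
}
```

The counts match the "Plugins provided" summary: one calculation, one parser, three data types, two workflows, and one other (the `verdi data` command).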