External API
SCons Extensions
- waves.scons_extensions.abaqus_extract(program: str = 'abaqus', **kwargs) Builder [source]
Abaqus ODB file extraction Builder
This builder executes the `odb_extract` command line utility against an ODB file in the source list. The ODB file must be the first file in the source list. If there is more than one ODB file in the source list, all but the first file are ignored by `odb_extract`.

This builder is unique in that no targets are required. The Builder emitter will append the builder managed targets and `odb_extract` target name constructions automatically. The first target determines the working directory for the emitter targets. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then at least one target must be provided with the build subdirectory, e.g. `parameter_set1/target.h5`. When in doubt, provide the expected H5 file as a target, e.g. `source[0].h5`.

The target list may specify an output H5 file name that differs from the ODB file base name as `new_name.h5`. If the first file in the target list does not contain the `*.h5` extension, or if there is no file in the target list, the target list will be prepended with a name matching the ODB file base name and the `*.h5` extension.

The builder emitter appends the CSV file created by the `abaqus odbreport` command as executed by `odb_extract` unless `delete_report_file` is set to `True`.

This builder supports the keyword arguments `output_type`, `odb_report_args`, and `delete_report_file` with behavior as described in the ODB Extract command line interface.

Warning

`odb_extract` requires Abaqus arguments for `odb_report_args` in the form of `option=value`, e.g. `step=step_name`.
```
/                           # Top level group required in all hdf5 files
/<instance name>/           # Groups containing data of each instance found in an odb
    FieldOutputs/           # Group with multiple xarray datasets for each field output
        <field name>/       # Group with datasets containing field output data for a specified set or surface
                            # If no set or surface is specified, the <field name> will be
                            # 'ALL_NODES' or 'ALL_ELEMENTS'
    HistoryOutputs/         # Group with multiple xarray datasets for each history output
        <region name>/      # Group with datasets containing history output data for specified history region name
                            # If no history region name is specified, the <region name> will be 'ALL NODES'
    Mesh/                   # Group written from an xarray dataset with all mesh information for this instance
/<instance name>_Assembly/  # Group containing data of assembly instance found in an odb
    Mesh/                   # Group written from an xarray dataset with all mesh information for this instance
/odb/                       # Catch all group for data found in the odbreport file not already organized by instance
    info/                   # Group with datasets that mostly give odb meta-data like name, path, etc.
    jobData/                # Group with datasets that contain additional odb meta-data
    rootAssembly/           # Group with datasets that match odb file organization per Abaqus documentation
    sectionCategories/      # Group with datasets that match odb file organization per Abaqus documentation
/xarray/                    # Group with a dataset that lists the location of all data written from xarray datasets
```
```python
import waves

env = Environment()
env["abaqus"] = waves.scons_extensions.add_program(["abaqus"], env)
env.Append(BUILDERS={"AbaqusExtract": waves.scons_extensions.abaqus_extract()})
env.AbaqusExtract(target=["my_job.h5", "my_job.csv"], source=["my_job.odb"])
```
- Parameters:
program – An absolute path or basename string for the abaqus program
- Returns:
Abaqus extract builder
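The supported keyword arguments can also be passed at task definition time. A minimal SConstruct sketch, assuming the builder is attached as `AbaqusExtract` and using a hypothetical step name for `odb_report_args`:

```python
import waves

env = Environment()
env["abaqus"] = waves.scons_extensions.add_program(["abaqus"], env)
env.Append(BUILDERS={"AbaqusExtract": waves.scons_extensions.abaqus_extract()})
env.AbaqusExtract(
    target=["my_job.h5"],
    source=["my_job.odb"],
    odb_report_args="step=step_name",  # hypothetical step name; must use the option=value form
    delete_report_file=True,           # skip appending the intermediate odbreport CSV target
)
```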
- waves.scons_extensions.abaqus_input_scanner() Scanner [source]
Abaqus input file dependency scanner
Custom SCons scanner that searches for the `INPUT=` parameter and associated file dependencies inside Abaqus `*.inp` files.
- Returns:
Abaqus input file dependency Scanner
- Return type:
SCons.Scanner.Scanner
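A minimal sketch of attaching the scanner to a construction environment so SCons re-builds when included input files change. This follows the standard SCons `SCANNERS` convention and is an assumption, not an excerpt from the WAVES documentation:

```python
import waves

env = Environment()
# Register the scanner so *.inp sources are searched for INPUT= file dependencies
env.Append(SCANNERS=waves.scons_extensions.abaqus_input_scanner())
```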
- waves.scons_extensions.abaqus_journal(program: str = 'abaqus', post_action: list = [], **kwargs) Builder [source]
Abaqus journal file SCons builder
This builder requires that the journal file to execute is the first source in the list. The builder returned by this function accepts all SCons Builder arguments and adds the keyword argument(s):
- `journal_options`: The journal file command line options provided as a string.
- `abaqus_options`: The Abaqus command line options provided as a string.

At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the journal file.

The Builder emitter will append the builder managed targets automatically. Appends `target[0].abaqus_v6.env` and `target[0].stdout` to the `target` list.

The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. `parameter_set1/my_target.ext`. When in doubt, provide a STDOUT redirect file as a target, e.g. `target.stdout`.
```
cd ${TARGET.dir.abspath} && abaqus cae -noGui ${SOURCE.abspath} ${abaqus_options} -- ${journal_options} > ${TARGETS[-1].abspath} 2>&1
```
```python
import waves

env = Environment()
env["abaqus"] = waves.scons_extensions.add_program(["abaqus"], env)
env.Append(BUILDERS={"AbaqusJournal": waves.scons_extensions.abaqus_journal()})
env.AbaqusJournal(target=["my_journal.cae"], source=["my_journal.py"], journal_options="")
```
- Parameters:
program (str) – An absolute path or basename string for the abaqus program.
post_action (list) – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect the Abaqus log for error keywords and throw a non-zero exit code even if Abaqus does not. Builder keyword variables are available for substitution in the `post_action` action using the `${}` syntax. Actions are executed in the first target’s directory as `cd ${TARGET.dir.abspath} && ${post_action}`
- Returns:
Abaqus journal builder
- Return type:
SCons.Builder.Builder
- waves.scons_extensions.abaqus_solver(program: str = 'abaqus', post_action: list[str] = [], emitter: Literal['standard', 'explicit', 'datacheck'] | None = None, **kwargs) Builder [source]
Abaqus solver SCons builder
This builder requires that the root input file is the first source in the list. The builder returned by this function accepts all SCons Builder arguments and adds the keyword argument(s):
- `job_name`: The job name string. If not specified, `job_name` defaults to the root input file stem. The Builder emitter will append common Abaqus output files as targets automatically from the `job_name`, e.g. `job_name.odb`.
- `abaqus_options`: The Abaqus command line options provided as a string.
- `suffixes`: Override the emitter targets with a new list of extensions, e.g. `AbaqusSolver(target=[], source=["input.inp"], suffixes=[".odb"])` will emit only one file named `job_name.odb`.
The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the solver.
This builder is unique in that no targets are required. The Builder emitter will append the builder managed targets automatically. The target list only appends those extensions which are common to Abaqus analysis operations. Some extensions may need to be added explicitly according to the Abaqus simulation solver, type, or options. If you find that SCons isn’t automatically cleaning some Abaqus output files, they are not in the automatically appended target list.
The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. `parameter_set1/job_name.odb`. When in doubt, provide a STDOUT redirect file as a target, e.g. `target.stdout`.

The `-interactive` option is always appended to the builder action to avoid exiting the Abaqus task before the simulation is complete. The `-ask_delete no` option is always appended to the builder action to overwrite existing files in programmatic execution, where it is assumed that the Abaqus solver target(s) should be re-built when their source files change.

```python
import waves

env = Environment()
env["abaqus"] = waves.scons_extensions.add_program(["abaqus"], env)
env.Append(BUILDERS={
    "AbaqusSolver": waves.scons_extensions.abaqus_solver(),
    "AbaqusStandard": waves.scons_extensions.abaqus_solver(emitter="standard"),
    "AbaqusOld": waves.scons_extensions.abaqus_solver(program="abq2019"),
    "AbaqusPost": waves.scons_extensions.abaqus_solver(post_action="grep -E '\<SUCCESSFULLY' ${job_name}.sta"),
})
env.AbaqusSolver(target=[], source=["input.inp"], job_name="my_job", abaqus_options="-cpus 4")
env.AbaqusSolver(target=[], source=["input.inp"], job_name="my_job", suffixes=[".odb"])
```
```
cd ${TARGET.dir.abspath} && ${program} -job ${job_name} -input ${SOURCE.filebase} ${abaqus_options} -interactive -ask_delete no > ${TARGETS[-1].abspath} 2>&1
```
- Parameters:
program – An absolute path or basename string for the abaqus program
post_action – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect the Abaqus log for error keywords and throw a non-zero exit code even if Abaqus does not. Builder keyword variables are available for substitution in the `post_action` action using the `${}` syntax. Actions are executed in the first target’s directory as `cd ${TARGET.dir.abspath} && ${post_action}`.
emitter – Emit file extensions based on the value of this variable. Overridden by the `suffixes` keyword argument that may be provided in the Task definition.
  - "standard": [".odb", ".dat", ".msg", ".com", ".prt", ".sta"]
  - "explicit": [".odb", ".dat", ".msg", ".com", ".prt", ".sta"]
  - "datacheck": [".odb", ".dat", ".msg", ".com", ".prt", ".023", ".mdl", ".sim", ".stt"]
  - default value: [".odb", ".dat", ".msg", ".com", ".prt"]
- Returns:
Abaqus solver builder
- waves.scons_extensions.add_cubit(names: list[str], env) str [source]
Modifies environment variables with the paths required to `import cubit` in a Python3 environment.

Returns the absolute path of the first program name found. Appends `PATH` with the first program’s parent directory if a program is found and the directory is not already on `PATH`. Prepends `PYTHONPATH` with `parent/bin`. Prepends `LD_LIBRARY_PATH` with `parent/bin/python3`.

Returns None if no program name is found.
```python
import waves

env = Environment()
env["cubit"] = waves.scons_extensions.add_cubit(["cubit"], env)
```
- Parameters:
names – list of string program names. May include an absolute path.
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
- Returns:
Absolute path of the found program. None if none of the names are found.
- waves.scons_extensions.add_program(names: list[str], env) str [source]
Search for a program from a list of possible program names. Add the first found to the system `PATH`.

Returns the absolute path of the first program name found. Appends `PATH` with the first program’s parent directory if a program is found and the directory is not already on `PATH`. Returns None if no program name is found.

```python
import waves

env = Environment()
env["program"] = waves.scons_extensions.add_program(["program"], env)
```
- Parameters:
names – list of string program names. May include an absolute path.
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
- Returns:
Absolute path of the found program. None if none of the names are found.
- waves.scons_extensions.alias_list_message(env=None, append: bool = True, keep_local: bool = True) None [source]
Add the alias list to the project’s help message
See the SCons Help documentation for appending behavior. Adds text to the project help message formatted as

```
Target Aliases:
    Alias_1
    Alias_2
```

where the aliases are recovered from `SCons.Node.Alias.default_ans`.
- Parameters:
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
append – Append to the `env.Help` message (default). When False, the `env.Help` message will be overwritten if `env.Help` has not been previously called.
keep_local – Limit help message to the project specific content when True. Only applies to SCons >=4.6.0
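A minimal usage sketch; the alias definition is a hypothetical example of aliases registered earlier in the configuration:

```python
import waves

env = Environment()
env.Alias("documentation", ["build/docs"])  # hypothetical alias
# Append the "Target Aliases:" list to the output of `scons -h`
waves.scons_extensions.alias_list_message(env=env)
```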
- waves.scons_extensions.append_env_path(program: str, env) None [source]
Append the SCons construction environment `PATH` with the program’s parent directory

Raises a `FileNotFoundError` if the `program` absolute path does not exist. Uses the SCons AppendENVPath method. If the program parent directory is already on `PATH`, the `PATH` directory order is preserved.

```python
import waves

env = Environment()
env["program"] = waves.scons_extensions.find_program(["program"], env)
if env["program"]:
    waves.scons_extensions.append_env_path(env["program"], env)
```
- Parameters:
program – An absolute path for the program to add to the SCons construction environment `PATH`
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
- waves.scons_extensions.catenate_actions(**outer_kwargs)[source]
Decorator factory to apply the `catenate_builder_actions` to a function that returns an SCons Builder.

Accepts the same keyword arguments as the `waves.scons_extensions.catenate_builder_actions()`

```python
import SCons.Builder
import waves

@waves.scons_extensions.catenate_actions
def my_builder():
    return SCons.Builder.Builder(action=["echo $SOURCE > $TARGET", "echo $SOURCE >> $TARGET"])
```
- waves.scons_extensions.catenate_builder_actions(builder: Builder, program: str = '', options: str = '') Builder [source]
Catenate a builder’s arguments and prepend the program and options
```
${program} ${options} "action one && action two"
```
- Parameters:
builder – The SCons builder to modify
program – wrapping executable
options – options for the wrapping executable
- Returns:
modified builder
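A sketch of wrapping a multi-action builder so both actions run in a single shell invocation; the echo actions and the `time` wrapper program are illustrative:

```python
import SCons.Builder
import waves

builder = SCons.Builder.Builder(action=["echo $SOURCE > $TARGET", "echo $SOURCE >> $TARGET"])
# Expected to produce a single action of the form:
#     time "echo $SOURCE > $TARGET && echo $SOURCE >> $TARGET"
catenated = waves.scons_extensions.catenate_builder_actions(builder, program="time")
```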
- waves.scons_extensions.conda_environment() Builder [source]
Create a Conda environment file with `conda env export`

This builder is intended to help WAVES workflows document the Conda environment used in the current build. At least one target file must be specified for the `conda env export --file ${TARGET}` output. Additional options to the Conda `env export` subcommand may be passed as the builder keyword argument `conda_env_export_options`.

At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to creating the Conda environment file.

```
cd ${TARGET.dir.abspath} && conda env export ${conda_env_export_options} --file ${TARGET.file}
```
The modsim owner may choose to re-use this builder throughout their project configuration to provide various levels of granularity in the recorded Conda environment state. It’s recommended to include this builder at least once for any workflows that also use the `waves.scons_extensions.python_builder()`. The builder may be re-used once per build sub-directory to provide more granular build environment reproducibility in the event that sub-builds are run at different times with variations in the active Conda environment. For per-Python script task environment reproducibility, the builder source list can be linked to the output of a `waves.scons_extensions.python_builder()` task with a target environment file name to match.

The first recommendation, always building the project wide Conda environment file, is demonstrated in the example usage below.

```python
import waves

env = Environment()
env.Append(BUILDERS={"CondaEnvironment": waves.scons_extensions.conda_environment()})
environment_target = env.CondaEnvironment(target=["environment.yaml"])
env.AlwaysBuild(environment_target)
```
- Returns:
Conda environment builder
- Return type:
SCons.Builder.Builder
- waves.scons_extensions.construct_action_list(actions: list[str], prefix: str = 'cd ${TARGET.dir.abspath} &&', postfix: str = '') list[str] [source]
Return an action list with a common pre/post-fix
Returns the constructed action list with pre/post fix strings as `f"{prefix} {new_action} {postfix}"`, where SCons action objects are converted to their string representation. If a string is passed instead of a list, it is first converted to a list. If an empty list is passed, an empty list is returned.
- Parameters:
actions – List of action strings
prefix – Common prefix to prepend to each action
postfix – Common postfix to append to each action
- Returns:
action list
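A sketch of the expected behavior under the default prefix; the action string is illustrative:

```python
import waves

# With the default prefix and empty postfix, each action is expected to be
# returned as "cd ${TARGET.dir.abspath} && <action> " per the f-string above
actions = waves.scons_extensions.construct_action_list(["echo 'hello world'"])
```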
- waves.scons_extensions.copy_substitute(source_list: list, substitution_dictionary: dict | None = None, env: ~SCons.Environment.Base = <SCons.Environment.Base object>, build_subdirectory: str = '.', symlink: bool = False) NodeList [source]
Copy source list to current variant directory and perform template substitutions on `*.in` filenames

Warning

This is a Python function and not an SCons builder. It cannot be added to the construction environment `BUILDERS` list. The function returns a list of targets instead of a Builder object.

Creates an SCons Copy task for each source file. Files are copied to the current variant directory matching the calling SConscript parent directory. Files with the name convention `*.in` are also given an SCons Substfile Builder, which will perform template substitution with the provided dictionary in-place in the current variant directory and remove the `.in` suffix.

To avoid dependency cycles, the source file(s) should be passed by absolute path.
```python
import pathlib

import waves

env = Environment()
current_directory = pathlib.Path(Dir(".").abspath)
source_list = [
    "#/subdir3/file_three.ext",              # File found with respect to project root directory using SCons notation
    current_directory / "file_one.ext",      # File found in current SConscript directory
    current_directory / "subdir2/file_two",  # File found below current SConscript directory
    current_directory / "file_four.ext.in",  # File with substitutions matching substitution dictionary keys
]
substitution_dictionary = {"@variable_one@": "value_one"}
waves.scons_extensions.copy_substitute(source_list, substitution_dictionary, env)
```
- Parameters:
source_list – List of pathlike objects or strings. Will be converted to list of pathlib.Path objects.
substitution_dictionary – key: value pairs for template substitution. The keys must contain the optional template characters if present, e.g. `@variable@`. The template character, e.g. `@`, can be anything that works in the SCons Substfile builder.
env – An SCons construction environment to use when defining the targets.
build_subdirectory – build subdirectory relative path prepended to target files
symlink – Whether symbolic links are created as new symbolic links. If true, symbolic links are shallow copies as a new symbolic link. If false, symbolic links are copied as a new file (dereferenced).
- Returns:
SCons NodeList of Copy and Substfile target nodes
- waves.scons_extensions.default_targets_message(env=None, append: bool = True, keep_local: bool = True) None [source]
Add a default targets list to the project’s help message
See the SCons Help documentation for appending behavior. Adds text to the project help message formatted as

```
Default Targets:
    Default_Target_1
    Default_Target_2
```

where the targets are recovered from `SCons.Script.DEFAULT_TARGETS`.
- Parameters:
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
append – Append to the `env.Help` message (default). When False, the `env.Help` message will be overwritten if `env.Help` has not been previously called.
keep_local – Limit help message to the project specific content when True. Only applies to SCons >=4.6.0
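A minimal usage sketch, typically placed near the end of the project configuration after default targets are registered; the target name is hypothetical:

```python
import waves

env = Environment()
Default("build/target.ext")  # hypothetical default target
# Append the "Default Targets:" list to the output of `scons -h`
waves.scons_extensions.default_targets_message(env=env)
```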
- waves.scons_extensions.find_program(names: list[str], env) str [source]
Search for a program from a list of possible program names.
Returns the absolute path of the first program name found. If path parts contain spaces, the part will be wrapped in double quotes.
- Parameters:
names – list of string program names. May include an absolute path.
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
- Returns:
Absolute path of the found program. None if none of the names are found.
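A minimal sketch searching a preference-ordered name list; the version-specific program names are illustrative:

```python
import waves

env = Environment()
# Returns the absolute path of the first name found, or None if no name is found
env["abaqus"] = waves.scons_extensions.find_program(["abq2024", "abaqus"], env)
```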
- waves.scons_extensions.matlab_script(program: str = 'matlab', post_action: list[str] = [], **kwargs) Builder [source]
Matlab script SCons builder
Warning
Experimental implementation is subject to change
This builder requires that the Matlab script is the first source in the list. The builder returned by this function accepts all SCons Builder arguments and adds the keyword argument(s):
- `script_options`: The Matlab function interface options in Matlab syntax and provided as a string.
- `matlab_options`: The Matlab command line options provided as a string.

The parent directory absolute path is added to the Matlab `path` variable prior to execution. All required Matlab files should be co-located in the same source directory.

At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the Matlab script.
The Builder emitter will append the builder managed targets automatically. Appends `target[0].matlab.env` and `target[0].stdout` to the `target` list.

The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. `parameter_set1/my_target.ext`. When in doubt, provide a STDOUT redirect file as a target, e.g. `target.stdout`.

```
cd ${TARGET.dir.abspath} && ${program} ${matlab_options} -batch "path(path, '${SOURCE.dir.abspath}'); ${SOURCE.filebase}(${script_options})" > ${TARGETS[-1].abspath} 2>&1
```
- Parameters:
program – An absolute path or basename string for the Matlab program.
post_action – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect a log for error keywords and throw a non-zero exit code even if Matlab does not. Builder keyword variables are available for substitution in the `post_action` action using the `${}` syntax. Actions are executed in the first target’s directory as `cd ${TARGET.dir.abspath} && ${post_action}`
- Returns:
Matlab script builder
- waves.scons_extensions.print_build_failures(print_stdout: bool = True) None [source]
On exit, query the SCons reported build failures and print the associated node’s STDOUT file, if it exists
- Parameters:
print_stdout – Boolean to set the exit behavior. If False, don’t modify the exit behavior.
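A minimal usage sketch, typically placed in the project SConstruct so failed task STDOUT files are echoed at exit:

```python
import waves

# Register the exit-time callback before builds begin
waves.scons_extensions.print_build_failures()
```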
- waves.scons_extensions.project_help_message(env=None, append: bool = True, keep_local: bool = True) None [source]
Add default targets and alias lists to project help message
See the SCons Help documentation for appending behavior. Thin wrapper around `waves.scons_extensions.default_targets_message()` and `waves.scons_extensions.alias_list_message()`.
- Parameters:
env (SCons.Script.SConscript.SConsEnvironment) – The SCons construction environment object to modify
append – Append to the `env.Help` message (default). When False, the `env.Help` message will be overwritten if `env.Help` has not been previously called.
keep_local – Limit help message to the project specific content when True. Only applies to SCons >=4.6.0
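A minimal usage sketch combining both help message additions:

```python
import waves

env = Environment()
# Adds both the default targets list and the alias list to `scons -h`
waves.scons_extensions.project_help_message(env=env)
```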
- waves.scons_extensions.python_script(post_action: list[str] = []) Builder [source]
Python script SCons builder
This builder requires that the python script to execute is the first source in the list. The builder returned by this function accepts all SCons Builder arguments and adds the keyword argument(s):
- `script_options`: The Python script command line arguments provided as a string.
- `python_options`: The Python command line arguments provided as a string.
At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the python script.
The Builder emitter will append the builder managed targets automatically. Appends `target[0].stdout` to the `target` list.

The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. `parameter_set1/my_target.ext`. When in doubt, provide a STDOUT redirect file as a target, e.g. `target.stdout`.

```
cd ${TARGET.dir.abspath} && python ${python_options} ${SOURCE.abspath} ${script_options} > ${TARGETS[-1].abspath} 2>&1
```
```python
import waves

env = Environment()
env.Append(BUILDERS={"PythonScript": waves.scons_extensions.python_script()})
env.PythonScript(target=["my_output.stdout"], source=["my_script.py"], python_options="", script_options="")
```
- Parameters:
post_action (list) – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect a log for error keywords and throw a non-zero exit code even if Python does not. Builder keyword variables are available for substitution in the `post_action` action using the `${}` syntax. Actions are executed in the first target’s directory as `cd ${TARGET.dir.abspath} && ${post_action}`
- Returns:
Python script builder
- Return type:
SCons.Builder.Builder
- waves.scons_extensions.quinoa_solver(charmrun: str = 'charmrun', inciter: str = 'inciter', charmrun_options: str = '+p1', inciter_options: str = '', prefix_command: str = '', post_action: list[str] = []) Builder [source]
Quinoa solver SCons builder
This builder requires at least two source files provided in the order:

1. Quinoa control file: `*.q`
2. Exodus mesh file: `*.exo`

The builder returned by this function accepts all SCons Builder arguments. Except for `post_action`, the arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the builder keyword arguments override the values provided to this function.

The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing quinoa.
The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. `parameter_set1/target.ext`. When in doubt, provide a STDOUT redirect file as a target, e.g. `target.stdout`.

Warning

This is an experimental builder for Quinoa support. The only emitted file is the `target[0].stdout` redirected STDOUT and STDERR file. All relevant application output files, e.g. `out.*`, must be specified in the target list.

```python
import waves

env = waves.scons_extensions.shell_environment("module load quinoa")
env.Append(BUILDERS={
    "QuinoaSolver": waves.scons_extensions.quinoa_solver(charmrun_options="+p1"),
})
# Serial execution with "+p1"
env.QuinoaSolver(target=["flow.stdout"], source=["flow.q", "box.exo"])
# Parallel execution with "+p4"
env.QuinoaSolver(target=["flow.stdout"], source=["flow.q", "box.exo"], charmrun_options="+p4")
```
```
${prefix_command} cd ${TARGET.dir.abspath} && ${charmrun} ${charmrun_options} ${inciter} ${inciter_options} --control ${SOURCES[0].abspath} --input ${SOURCES[1].abspath} > ${TARGETS[-1].abspath} 2>&1
```
- Parameters:
charmrun – The relative or absolute path to the charmrun executable
charmrun_options – The charmrun command line interface options
inciter – The relative or absolute path to the inciter (quinoa) executable
inciter_options – The inciter (quinoa executable) command line interface options
prefix_command – Optional prefix command intended for environment preparation. Primarily intended for use with `waves.scons_extensions.sbatch_quinoa_solver()` or when wrapping the builder with `waves.scons_extensions.ssh_builder_actions()`. For local, direct execution, users should prefer to create an SCons construction environment with `waves.scons_extensions.shell_environment()`. When overriding in a task definition, the prefix command must end with `' &&'`.
post_action – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect the Quinoa log for error keywords and throw a non-zero exit code even if Quinoa does not. Builder keyword variables are available for substitution in the `post_action` action using the `${}` syntax. Actions are executed in the first target’s directory as `cd ${TARGET.dir.abspath} && ${post_action}`
- Returns:
Quinoa builder
- waves.scons_extensions.sbatch(program: str = 'sbatch', post_action: list[str] = [], **kwargs) Builder [source]
SLURM sbatch SCons builder

The builder does not use a SLURM batch script. Instead, it requires the `slurm_job` variable to be defined with the command string to execute.

At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the `slurm_job` command.

The Builder emitter will append the builder managed targets automatically. Appends `target[0].stdout` to the `target` list.

```
cd ${TARGET.dir.abspath} && sbatch --wait --output=${TARGETS[-1].abspath} ${sbatch_options} --wrap ${slurm_job}
```
```python
import waves

env = Environment()
env.Append(BUILDERS={"SlurmSbatch": waves.scons_extensions.sbatch()})
env.SlurmSbatch(target=["my_output.stdout"], source=["my_source.input"], slurm_job="cat $SOURCE > $TARGET")
```
- Parameters:
program – An absolute path or basename string for the sbatch program.
post_action – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect a log for error keywords and throw a non-zero exit code even if the wrapped command does not. Builder keyword variables are available for substitution in the `post_action` action using the `${}` syntax. Actions are executed in the first target’s directory as `cd ${TARGET.dir.abspath} && ${post_action}`
- Returns:
SLURM sbatch builder
- waves.scons_extensions.sbatch_abaqus_journal(*args, **kwargs)[source]
Thin pass through wrapper of `waves.scons_extensions.abaqus_journal()`

Catenate the actions and submit with SLURM sbatch. Accepts the `sbatch_options` builder keyword argument to modify sbatch behavior.

```
sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "cd ${TARGET.dir.abspath} && abaqus cae -noGui ${SOURCE.abspath} ${abaqus_options} -- ${journal_options} > ${TARGETS[-1].abspath} 2>&1"
```
- waves.scons_extensions.sbatch_abaqus_solver(*args, **kwargs)[source]
Thin pass through wrapper of `waves.scons_extensions.abaqus_solver()`

Catenate the actions and submit with SLURM sbatch. Accepts the `sbatch_options` builder keyword argument to modify sbatch behavior.

```
sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "cd ${TARGET.dir.abspath} && ${program} -job ${job_name} -input ${SOURCE.filebase} ${abaqus_options} -interactive -ask_delete no > ${TARGETS[-1].abspath} 2>&1"
```
- waves.scons_extensions.sbatch_python_script(*args, **kwargs)[source]
Thin pass through wrapper of `waves.scons_extensions.python_script()`

Catenate the actions and submit with SLURM sbatch. Accepts the `sbatch_options` builder keyword argument to modify sbatch behavior.

```
sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "cd ${TARGET.dir.abspath} && python ${python_options} ${SOURCE.abspath} ${script_options} > ${TARGETS[-1].abspath} 2>&1"
```
- waves.scons_extensions.sbatch_quinoa_solver(*args, **kwargs)[source]
Thin pass through wrapper of waves.scons_extensions.quinoa_solver()
Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.
sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap ""
- waves.scons_extensions.sbatch_sierra(*args, **kwargs)[source]
Thin pass through wrapper of waves.scons_extensions.sierra()
Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.
sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "cd ${TARGET.dir.abspath} && ${program} ${sierra_options} ${application} ${application_options} -i ${SOURCE.file} > ${TARGETS[-1].abspath} 2>&1"
- waves.scons_extensions.shell_environment(command: str, cache: str | None = None, overwrite_cache: bool = False) Base [source]
Return an SCons shell environment from a cached file or by running a shell command
If the environment is created successfully and a cache file is requested, the cache file is _always_ written. The overwrite_cache behavior forces the shell command execution, even when the cache file is present.
Warning
Currently only supports bash shells
import waves
env = waves.scons_extensions.shell_environment("source my_script.sh")
- Parameters:
command – the shell command to execute
cache – absolute or relative path to read/write a shell environment dictionary. Will be written as YAML formatted file regardless of extension.
overwrite_cache – Ignore previously cached files if they exist.
- Returns:
SCons shell environment
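The source-then-capture behavior can be approximated with the standard library. The sketch below is illustrative only; the function name and the env-parsing approach are assumptions, not the WAVES implementation, which also handles caching and SCons construction environment creation:

```python
import subprocess

def capture_shell_environment(command: str) -> dict:
    """Run ``command`` in a shell and return the resulting environment as a dict.

    Hypothetical sketch: appends ``env`` so the subshell prints its environment
    after the command runs, then parses ``KEY=value`` lines. Multi-line values
    and the command's own STDOUT are not handled.
    """
    result = subprocess.run(
        ["sh", "-c", f"{command} && env"],
        capture_output=True, text=True, check=True,
    )
    environment = {}
    for line in result.stdout.splitlines():
        # Keep only simple KEY=value lines
        if "=" in line:
            key, _, value = line.partition("=")
            environment[key] = value
    return environment

# Variables exported by the sourced command appear in the returned dictionary
env_dict = capture_shell_environment("MY_VARIABLE=42; export MY_VARIABLE")
```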
- waves.scons_extensions.sierra(program: str = 'sierra', application: str = 'adagio', post_action: list[str] = []) Builder [source]
Sierra SCons builder
This builder requires that the root input file is the first source in the list. The builder returned by this function accepts all SCons Builder arguments and adds the keyword argument(s):
sierra_options: The Sierra command line options provided as a string.
application_options: The application (e.g. adagio) command line options provided as a string.
The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing sierra.
The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/target.ext. When in doubt, provide a STDOUT redirect file as a target, e.g. target.stdout.
Warning
.Warning
This is an experimental builder for Sierra support. The only emitted files are the application’s version report in target[0].env and the target[0].stdout redirected STDOUT and STDERR file. All relevant application output files, e.g. genesis_output.e, must be specified in the target list.
import waves
env = waves.scons_extensions.shell_environment("module load sierra")
env.Append(BUILDERS={
    "Sierra": waves.scons_extensions.sierra(),
})
env.Sierra(target=["output.e"], source=["input.i"])
cd ${TARGET.dir.abspath} && ${program} ${sierra_options} ${application} ${application_options} -i ${SOURCE.file} > ${TARGETS[-1].abspath} 2>&1
- Parameters:
program – An absolute path or basename string for the Sierra program
application – The string name for the Sierra application
post_action – List of shell command string(s) to append to the builder’s action list. Implemented to allow post target modification or introspection, e.g. inspect the Sierra log for error keywords and throw a non-zero exit code even if Sierra does not. Builder keyword variables are available for substitution in the post_action action using the ${} syntax. Actions are executed in the first target’s directory as cd ${TARGET.dir.abspath} && ${post_action}.
- Returns:
Sierra builder
- waves.scons_extensions.sphinx_build(program: str = 'sphinx-build', options: str = '', builder: str = 'html', tags: str = '') Builder [source]
Sphinx builder using the -b specifier
This builder does not have an emitter. It requires at least one target.
${program} ${options} -b ${builder} ${TARGET.dir.dir.abspath} ${TARGET.dir.abspath} ${tags}
import waves
env = Environment()
env.Append(BUILDERS={
    "SphinxBuild": waves.scons_extensions.sphinx_build(options="-W"),
})
sources = ["conf.py", "index.rst"]
targets = ["html/index.html"]
html = env.SphinxBuild(
    target=targets,
    source=sources,
)
env.Clean(html, [Dir("html")] + sources)
env.Alias("html", html)
- Parameters:
program – sphinx-build executable
options – sphinx-build options
builder – builder name. See the Sphinx documentation for options
tags – sphinx-build tags
- Returns:
Sphinx builder
- waves.scons_extensions.sphinx_latexpdf(program: str = 'sphinx-build', options: str = '', builder: str = 'latexpdf', tags: str = '') Builder [source]
Sphinx builder using the -M specifier. Intended for latexpdf builds.
This builder does not have an emitter. It requires at least one target.
${program} -M ${builder} ${TARGET.dir.dir.abspath} ${TARGET.dir.dir.abspath} ${tags} ${options}
import waves
env = Environment()
env.Append(BUILDERS={
    "SphinxPDF": waves.scons_extensions.sphinx_latexpdf(options="-W"),
})
sources = ["conf.py", "index.rst"]
targets = ["latex/project.pdf"]
latexpdf = env.SphinxPDF(
    target=targets,
    source=sources,
)
env.Clean(latexpdf, [Dir("latex")] + sources)
env.Alias("latexpdf", latexpdf)
- Parameters:
program (str) – sphinx-build executable
options (str) – sphinx-build options
builder (str) – builder name. See the Sphinx documentation for options
tags (str) – sphinx-build tags
- Returns:
Sphinx latexpdf builder
- waves.scons_extensions.sphinx_scanner() Scanner [source]
SCons scanner that searches for directives
.. include::
.. literalinclude::
.. image::
.. figure::
.. bibliography::
inside .rst and .txt files
- Returns:
Sphinx document dependency Scanner
- Return type:
SCons.Scanner.Scanner
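The directive search can be sketched with a plain regular expression. The pattern and function below are hypothetical illustrations of the scanner's documented behavior, not the SCons.Scanner implementation:

```python
import re

# Directive list matches the scanner documentation above; the regex and
# function name are illustrative assumptions, not the WAVES implementation.
DIRECTIVES = ("include", "literalinclude", "image", "figure", "bibliography")
PATTERN = re.compile(
    r"^\s*\.\.\s+(?:" + "|".join(DIRECTIVES) + r")::\s+(\S+)",
    re.MULTILINE,
)

def scan_for_dependencies(text: str) -> list:
    """Return the file arguments of recognized directives in an RST string."""
    return PATTERN.findall(text)

rst = """
.. include:: common.txt
.. figure:: stress_strain.png
"""
dependencies = scan_for_dependencies(rst)
```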
- waves.scons_extensions.ssh_builder_actions(builder: Builder, remote_server: str = '${remote_server}', remote_directory: str = '${remote_directory}') Builder [source]
Wrap a builder’s action list with remote copy operations and ssh commands
By default, the remote server and remote directory strings are written to accept (and require) task-by-task overrides via task keyword arguments. Any SCons replacement string patterns, ${variable}, will make that variable a required task keyword argument. Only if the remote server and/or remote directory are known to be constant across all possible tasks should those variables be overridden with a string literal containing no ${variable} SCons keyword replacement patterns.
Warning
The waves.scons_extensions.ssh_builder_actions() function is a work-in-progress solution with some assumptions specific to the action construction used by WAVES. It _should_ work for most basic builders, but adapting this function to users’ custom builders will probably require some advanced SCons knowledge and inspection of the waves.scons_extensions.ssh_builder_actions() implementation.
Design assumptions
Creates the remote_directory with mkdir -p. mkdir must exist on the remote_server.
Copies all source files to a flat remote_directory with rsync -rlptv. rsync must exist on the local system.
Replaces instances of cd ${TARGET.dir.abspath} && with cd ${remote_directory} && in the original builder actions.
Replaces instances of SOURCE.abspath or SOURCES.abspath with SOURCE[S].file in the original builder actions.
Prefixes all original builder actions with cd ${remote_directory} &&.
All original builder actions are wrapped in single quotes as '{original action}' to preserve the && as part of the remote_server command. Shell variables, e.g. $USER, will not be expanded on the remote_server. If quotes are included in the original builder actions, they should be double quotes.
Returns the entire remote_directory to the original builder ${TARGET.dir.abspath} with rsync. rsync must exist on the local system.
import getpass
import waves
user = getpass.getuser()
env = Environment()
env.Append(BUILDERS={
    "SSHAbaqusSolver": waves.scons_extensions.ssh_builder_actions(
        waves.scons_extensions.abaqus_solver(program="/remote/server/installation/path/of/abaqus"),
        remote_server="myserver.mydomain.com"
    )
})
env.SSHAbaqusSolver(target=["myjob.sta"], source=["input.inp"], job_name="myjob",
                    abaqus_options="-cpus 4",
                    remote_directory="/scratch/${user}/myproject/myworkflow", user=user)
import SCons.Builder
import waves

def print_builder_actions(builder):
    for action in builder.action.list:
        print(action.cmd_list)

def cat(program="cat"):
    return SCons.Builder.Builder(action=[
        f"{program} ${{SOURCES.abspath}} | tee ${{TARGETS.file}}",
        "echo \"Hello World!\""
    ])

build_cat = cat()
ssh_build_cat = waves.scons_extensions.ssh_builder_actions(
    cat(), remote_server="myserver.mydomain.com",
    remote_directory="/scratch/roppenheimer/ssh_wrapper"
)
>>> import my_package
>>> my_package.print_builder_actions(my_package.build_cat)
cat ${SOURCES.abspath} | tee ${TARGETS.file}
echo "Hello World!"
>>> my_package.print_builder_actions(my_package.ssh_build_cat)
ssh myserver.mydomain.com "mkdir -p /scratch/roppenheimer/ssh_wrapper"
rsync -rlptv ${SOURCES.abspath} myserver.mydomain.com:/scratch/roppenheimer/ssh_wrapper
ssh myserver.mydomain.com 'cd /scratch/roppenheimer/ssh_wrapper && cat ${SOURCES.file} | tee ${TARGETS.file}'
ssh myserver.mydomain.com 'cd /scratch/roppenheimer/ssh_wrapper && echo "Hello World!"'
rsync -rltpv myserver.mydomain.com:/scratch/roppenheimer/ssh_wrapper/ ${TARGET.dir.abspath}
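The string rewrites listed in the design assumptions can be sketched in plain Python. This is an illustration only; the function name is hypothetical, and the real implementation operates on SCons action objects rather than strings:

```python
# Illustrative sketch of the documented action rewrites; not the WAVES
# implementation, which wraps SCons action objects.
def ssh_wrap_action(action: str, remote_server: str, remote_directory: str) -> str:
    # Replace the local working-directory change with the remote directory
    action = action.replace("cd ${TARGET.dir.abspath} &&", f"cd {remote_directory} &&")
    # Sources are copied flat, so absolute paths become bare file names
    action = action.replace("SOURCES.abspath", "SOURCES.file")
    action = action.replace("SOURCE.abspath", "SOURCE.file")
    # Prefix any action that does not already change directory
    if not action.startswith(f"cd {remote_directory} &&"):
        action = f"cd {remote_directory} && {action}"
    # Single quotes preserve '&&' as part of the remote command
    return f"ssh {remote_server} '{action}'"

wrapped = ssh_wrap_action(
    "cat ${SOURCES.abspath} | tee ${TARGETS.file}",
    "myserver.mydomain.com",
    "/scratch/roppenheimer/ssh_wrapper",
)
```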
- Parameters:
builder – The SCons builder to modify
remote_server – remote server where the original builder’s actions should be executed. The default string requires every task to specify a matching keyword argument string.
remote_directory – absolute or relative path where the original builder’s actions should be executed. The default string requires every task to specify a matching keyword argument string.
- Returns:
modified builder
- waves.scons_extensions.substitution_syntax(substitution_dictionary: dict, prefix: str = '@', postfix: str = '@') dict [source]
Return a dictionary copy with the pre/postfix added to the key strings
Assumes a flat dictionary with keys of type str. Keys that aren’t strings will be converted to their string representation. Nested dictionaries can be supplied, but only the first layer keys will be modified. Dictionary values are unchanged.
- Parameters:
substitution_dictionary (dict) – Original dictionary to copy
prefix (string) – String to prepend to all dictionary keys
postfix (string) – String to append to all dictionary keys
- Returns:
Copy of the dictionary with key strings modified by the pre/postfix
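The documented behavior amounts to a dictionary comprehension. The sketch below is a minimal re-implementation for illustration, not the waves.scons_extensions.substitution_syntax source:

```python
# Minimal sketch of the documented behavior: wrap every first-layer key as
# <prefix>key<postfix>; non-string keys are converted via f-string formatting.
def substitution_syntax(substitution_dictionary: dict, prefix: str = "@", postfix: str = "@") -> dict:
    return {f"{prefix}{key}{postfix}": value for key, value in substitution_dictionary.items()}

# Values are unchanged; the integer key 2 becomes the string "2"
substitutions = substitution_syntax({"width": 1.0, 2: "two"})
```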
Parameter Generators
- class waves.parameter_generators.CartesianProduct(parameter_schema: dict, output_file_template: str = None, output_file: str = None, output_file_type: str = 'yaml', set_name_template: str = 'parameter_set@number', previous_parameter_study: str = None, overwrite: bool = False, dryrun: bool = False, write_meta: bool = False, **kwargs)[source]
Bases:
_ParameterGenerator
Builds a cartesian product parameter study
- Parameters:
parameter_schema (dict) – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. CartesianProduct expects “schema value” to be an iterable. For example, when read from a YAML file, “schema value” will be a Python list.
output_file_template (str) – Output file name template. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string.
output_file (str) – Output file name for a single file output of the parameter study. May contain pathseps for an absolute or relative path. output_file and output_file_template are mutually exclusive. Output file is always overwritten.
output_file_type (str) – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template (str) – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study (str) – A relative or absolute file path to a previously created parameter study Xarray Dataset
overwrite (bool) – Overwrite existing output files
dryrun (bool) – Print contents of new parameter study output files to STDOUT and exit
write_meta (bool) – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
Example
>>> import waves
>>> parameter_schema = {
...     'parameter_1': [1, 2],
...     'parameter_2': ['a', 'b']
... }
>>> parameter_generator = waves.parameter_generators.CartesianProduct(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:             (data_type: 1, parameter_set_hash: 4)
Coordinates:
  * data_type           (data_type) object 'samples'
    parameter_set_hash  (parameter_set_hash) <U32 'de3cb3eaecb767ff63973820b2...
  * parameter_sets      (parameter_set_hash) <U14 'parameter_set0' ... 'param...
Data variables:
    parameter_1         (data_type, parameter_set_hash) object 1 1 2 2
    parameter_2         (data_type, parameter_set_hash) object 'a' 'b' 'a' 'b'
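The cartesian product expansion itself can be sketched with the standard library; the real generator additionally hashes each set and stores the study in an Xarray Dataset. The set naming below assumes the default set_name_template:

```python
import itertools

# Standard-library sketch of the cartesian product expansion; the hashing and
# Xarray storage done by the real CartesianProduct generator are omitted.
parameter_schema = {
    "parameter_1": [1, 2],
    "parameter_2": ["a", "b"],
}
names = list(parameter_schema.keys())
parameter_sets = {
    f"parameter_set{number}": dict(zip(names, combination))
    for number, combination in enumerate(itertools.product(*parameter_schema.values()))
}
```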
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- parameter_study_to_dict(*args, **kwargs) dict [source]
Return parameter study as a dictionary
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Parameters:
data_type (str) – The data_type selection to return - samples or quantiles
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- Return type:
dict - {str: {str: value}}
- write() None [source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- class waves.parameter_generators.CustomStudy(parameter_schema: dict, output_file_template: str = None, output_file: str = None, output_file_type: str = 'yaml', set_name_template: str = 'parameter_set@number', previous_parameter_study: str = None, overwrite: bool = False, dryrun: bool = False, write_meta: bool = False, **kwargs)[source]
Bases:
_ParameterGenerator
Builds a custom parameter study from user-specified values
- Parameters:
parameter_schema (dict) – Dictionary with two keys: parameter_samples and parameter_names. Parameter samples in the form of a 2D array with shape M x N, where M is the number of parameter sets and N is the number of parameters. Parameter names in the form of a 1D array with length N. When creating a parameter_samples array with mixed types (e.g. strings and floats), use dtype=object to preserve the mixed types and avoid casting all values to a common type (e.g. all your floats will become strings).
output_file_template (str) – Output file name template. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string.
output_file (str) – Output file name for a single file output of the parameter study. May contain pathseps for an absolute or relative path. output_file and output_file_template are mutually exclusive. Output file is always overwritten.
output_file_type (str) – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template (str) – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study (str) – A relative or absolute file path to a previously created parameter study Xarray Dataset
overwrite (bool) – Overwrite existing output files
dryrun (bool) – Print contents of new parameter study output files to STDOUT and exit
write_meta (bool) – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
Example
>>> import waves
>>> import numpy
>>> parameter_schema = dict(
...     parameter_samples = numpy.array([[1.0, 'a', 5], [2.0, 'b', 6]], dtype=object),
...     parameter_names = numpy.array(['height', 'prefix', 'index']))
>>> parameter_generator = waves.parameter_generators.CustomStudy(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:             (data_type: 1, parameter_set_hash: 2)
Coordinates:
  * data_type           (data_type) object 'samples'
    parameter_set_hash  (parameter_set_hash) <U32 '50ba1a2716e42f8c4fcc34a90a...
  * parameter_sets      (parameter_set_hash) <U14 'parameter_set0' 'parameter...
Data variables:
    height              (data_type, parameter_set_hash) object 1.0 2.0
    prefix              (data_type, parameter_set_hash) object 'a' 'b'
    index               (data_type, parameter_set_hash) object 5 6
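How the schema's rows map to parameter sets can be sketched with plain lists standing in for the numpy arrays above. The set naming assumes the default set_name_template, and the sketch omits the hashing and Xarray storage done by the real generator:

```python
# Plain-Python sketch of CustomStudy's row-to-set mapping: each row of
# parameter_samples becomes one parameter set keyed by parameter_names.
parameter_names = ["height", "prefix", "index"]
parameter_samples = [[1.0, "a", 5], [2.0, "b", 6]]
parameter_sets = {
    f"parameter_set{number}": dict(zip(parameter_names, row))
    for number, row in enumerate(parameter_samples)
}
```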
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- parameter_study_to_dict(*args, **kwargs) dict [source]
Return parameter study as a dictionary
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Parameters:
data_type (str) – The data_type selection to return - samples or quantiles
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- Return type:
dict - {str: {str: value}}
- write() None [source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- class waves.parameter_generators.LatinHypercube(*args, **kwargs)[source]
Bases:
_ScipyGenerator
Builds a Latin-Hypercube parameter study from the scipy Latin Hypercube class
The h5 output_file_type is the only output type that contains both the parameter samples and quantiles.
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameters and the parameter study samples and quantiles.
- Parameters:
parameter_schema (dict) – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. LatinHypercube expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template (str) – Output file name template. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string.
output_file (str) – Output file name for a single file output of the parameter study. May contain pathseps for an absolute or relative path. output_file and output_file_template are mutually exclusive. Output file is always overwritten.
output_file_type (str) – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template (str) – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study (str) – A relative or absolute file path to a previously created parameter study Xarray Dataset
overwrite (bool) – Overwrite existing output files
dryrun (bool) – Print contents of new parameter study output files to STDOUT and exit
write_meta (bool) – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
To produce consistent Latin Hypercubes on repeat instantiations, the **kwargs must include {'seed': <int>}. See the scipy.stats.qmc.LatinHypercube class documentation for details. The d keyword argument is internally managed and will be overwritten to match the number of parameters defined in the parameter schema.
Example
>>> import waves
>>> parameter_schema = {
...     'num_simulations': 4,  # Required key. Value must be an integer.
...     'parameter_1': {
...         'distribution': 'norm',  # Required key. Value must be a valid scipy.stats
...         'loc': 50,               # distribution name.
...         'scale': 1
...     },
...     'parameter_2': {
...         'distribution': 'skewnorm',
...         'a': 4,
...         'loc': 30,
...         'scale': 2
...     }
... }
>>> parameter_generator = waves.parameter_generators.LatinHypercube(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:             (data_type: 2, parameter_set_hash: 4)
Coordinates:
    parameter_set_hash  (parameter_set_hash) <U32 '1e8219dae27faa5388328e225a...
  * data_type           (data_type) <U9 'quantiles' 'samples'
  * parameter_sets      (parameter_set_hash) <U14 'parameter_set0' ... 'param...
Data variables:
    parameter_1         (data_type, parameter_set_hash) float64 0.125 ... 51.15
    parameter_2         (data_type, parameter_set_hash) float64 0.625 ... 30.97
- Variables:
self.parameter_distributions – A dictionary mapping parameter names to the scipy.stats distribution
self.parameter_study – The final parameter study XArray Dataset object
- parameter_study_to_dict(*args, **kwargs) dict [source]
Return parameter study as a dictionary
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Parameters:
data_type (str) – The data_type selection to return - samples or quantiles
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- Return type:
dict - {str: {str: value}}
- write() None [source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- class waves.parameter_generators.SALibSampler(sampler_class, *args, **kwargs)[source]
Bases:
_ParameterGenerator, ABC
Builds a SALib sampler parameter study from a SALib.sample sampler_class
Samplers must use the N sample count argument. Note that in SALib.sample, N is not always equivalent to the number of simulations. The following samplers are tested for parameter study shape and merge behavior:
fast_sampler
finite_diff
latin
sobol
morris
Warning
For small numbers of parameters, some SALib generators produce duplicate parameter sets. These duplicate sets are removed during parameter study generation. This may cause the SALib analyze method(s) to raise errors related to the expected parameter set count.
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameters and the parameter study samples.
- Parameters:
sampler_class (str) – The SALib.sample sampler class name. Case sensitive.
parameter_schema (dict) – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. SALibSampler expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template (str) – Output file name template. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string.
output_file (str) – Output file name for a single file output of the parameter study. May contain pathseps for an absolute or relative path. output_file and output_file_template are mutually exclusive. Output file is always overwritten.
output_file_type (str) – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template (str) – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study (str) – A relative or absolute file path to a previously created parameter study Xarray Dataset
overwrite (bool) – Overwrite existing output files
dryrun (bool) – Print contents of new parameter study output files to STDOUT and exit
write_meta (bool) – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
Keyword arguments for the SALib.sample sampler_class sample method.
Example
>>> import waves
>>> parameter_schema = {
...     "N": 4,  # Required key. Value must be an integer.
...     "problem": {  # Required key. See the SALib sampler interface documentation
...         "num_vars": 3,
...         "names": ["parameter_1", "parameter_2", "parameter_3"],
...         "bounds": [[-1, 1], [-2, 2], [-3, 3]]
...     }
... }
>>> parameter_generator = waves.parameter_generators.SALibSampler("sobol", parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:             (data_type: 1, parameter_sets: 32)
Coordinates:
  * data_type           (data_type) object 'samples'
    parameter_set_hash  (parameter_sets) <U32 'e0cb1990f9d70070eaf5638101dcaf...
  * parameter_sets      (parameter_sets) <U15 'parameter_set0' ... 'parameter...
Data variables:
    parameter_1         (data_type, parameter_sets) float64 -0.2029 ... 0.187
    parameter_2         (data_type, parameter_sets) float64 -0.801 ... 0.6682
    parameter_3         (data_type, parameter_sets) float64 0.4287 ... -2.871
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- Raises:
ValueError – If the SALib sobol or SALib morris sampler is specified and there are fewer than 2 parameters.
AttributeError –
N is not a key of parameter_schema
problem is not a key of parameter_schema
names is not a key of parameter_schema['problem']
TypeError –
parameter_schema is not a dictionary
parameter_schema['N'] is not an integer
parameter_schema['problem'] is not a dictionary
parameter_schema['problem']['names'] is not a YAML compliant iterable (list, set, tuple)
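The documented exception rules can be expressed as a small validator. This is a hypothetical sketch mirroring the Raises list above, not the actual SALibSampler checks:

```python
# Hypothetical validator mirroring the documented AttributeError/TypeError
# rules; the real checks live inside SALibSampler.
def validate_salib_schema(parameter_schema):
    if not isinstance(parameter_schema, dict):
        raise TypeError("parameter_schema is not a dictionary")
    if "N" not in parameter_schema:
        raise AttributeError("N is not a key of parameter_schema")
    if not isinstance(parameter_schema["N"], int):
        raise TypeError("parameter_schema['N'] is not an integer")
    if "problem" not in parameter_schema:
        raise AttributeError("problem is not a key of parameter_schema")
    if not isinstance(parameter_schema["problem"], dict):
        raise TypeError("parameter_schema['problem'] is not a dictionary")
    if "names" not in parameter_schema["problem"]:
        raise AttributeError("names is not a key of parameter_schema['problem']")
    if not isinstance(parameter_schema["problem"]["names"], (list, set, tuple)):
        raise TypeError("parameter_schema['problem']['names'] is not a YAML compliant iterable")
```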
- parameter_study_to_dict(*args, **kwargs) dict [source]
Return parameter study as a dictionary
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Parameters:
data_type (str) – The data_type selection to return - samples or quantiles
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- Return type:
dict - {str: {str: value}}
- write() None [source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- class waves.parameter_generators.ScipySampler(sampler_class, *args, **kwargs)[source]
Bases:
_ScipyGenerator
Builds a scipy sampler parameter study from a scipy.stats.qmc sampler_class
Samplers must use the d parameter space dimension keyword argument. The following samplers are tested for parameter study shape and merge behavior:
Sobol
Halton
LatinHypercube
PoissonDisk
The h5 output_file_type is the only output type that contains both the parameter samples and quantiles.
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameters and the parameter study samples and quantiles.
- Parameters:
sampler_class (str) – The scipy.stats.qmc sampler class name. Case sensitive.
parameter_schema (dict) – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. ScipySampler expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template (str) – Output file name template. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string.
output_file (str) – Output file name for a single file output of the parameter study. May contain pathseps for an absolute or relative path. output_file and output_file_template are mutually exclusive. Output file is always overwritten.
output_file_type (str) – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template (str) – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study (str) – A relative or absolute file path to a previously created parameter study Xarray Dataset
overwrite (bool) – Overwrite existing output files
dryrun (bool) – Print contents of new parameter study output files to STDOUT and exit
write_meta (bool) – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
Keyword arguments for the scipy.stats.qmc sampler_class. The d keyword argument is internally managed and will be overwritten to match the number of parameters defined in the parameter schema.
Example
>>> import waves
>>> parameter_schema = {
...     'num_simulations': 4,  # Required key. Value must be an integer.
...     'parameter_1': {
...         'distribution': 'norm',  # Required key. Value must be a valid scipy.stats
...         'loc': 50,               # distribution name.
...         'scale': 1
...     },
...     'parameter_2': {
...         'distribution': 'skewnorm',
...         'a': 4,
...         'loc': 30,
...         'scale': 2
...     }
... }
>>> parameter_generator = waves.parameter_generators.ScipySampler("LatinHypercube", parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:             (data_type: 2, parameter_set_hash: 4)
Coordinates:
    parameter_set_hash  (parameter_set_hash) <U32 '1e8219dae27faa5388328e225a...
  * data_type           (data_type) <U9 'quantiles' 'samples'
  * parameter_sets      (parameter_set_hash) <U14 'parameter_set0' ... 'param...
Data variables:
    parameter_1         (data_type, parameter_set_hash) float64 0.125 ... 51.15
    parameter_2         (data_type, parameter_set_hash) float64 0.625 ... 30.97
- Variables:
parameter_distributions – A dictionary mapping parameter names to the scipy.stats distribution
self.parameter_study – The final parameter study XArray Dataset object
- parameter_study_to_dict(*args, **kwargs) dict [source]
Return parameter study as a dictionary
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Parameters:
data_type (str) – The data_type selection to return - samples or quantiles
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- Return type:
dict - {str: {str: value}}
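As a sketch of how the returned dictionary might feed parameter substitution in a workflow, the snippet below wraps each parameter name in a substitution key. The study dictionary mirrors the example output above; the "@name@" key convention is an illustrative assumption, not part of this API:

```python
# Hypothetical sketch: converting a parameter study dictionary into
# per-set substitution dictionaries. The "@name@" delimiters are an
# assumed convention for template substitution, not a WAVES requirement.
study = {
    "parameter_set0": {"parameter_1": 1, "parameter_2": "a"},
    "parameter_set1": {"parameter_1": 1, "parameter_2": "b"},
}
substitutions = {
    set_name: {f"@{key}@": value for key, value in parameters.items()}
    for set_name, parameters in study.items()
}
assert substitutions["parameter_set0"] == {"@parameter_1@": 1, "@parameter_2@": "a"}
```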
- write() None [source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires a non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when the contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print the file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
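The YAML set file layout above can be reproduced with a short sketch (assuming PyYAML is available; WAVES performs this serialization internally):

```python
import yaml  # PyYAML, assumed available for this sketch

# Serialize one parameter set in the YAML layout shown above
parameter_set = {"parameter_1": 1, "parameter_2": "a"}
text = yaml.safe_dump(parameter_set, default_flow_style=False)
print(text)  # parameter_1: 1
             # parameter_2: a
```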
- class waves.parameter_generators.SobolSequence(*args, **kwargs)[source]
Bases:
_ScipyGenerator
Builds a Sobol sequence parameter study from the scipy Sobol class random method.
The h5 output_file_type is the only output type that contains both the parameter samples and quantiles.
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameters and the parameter study samples and quantiles.
- Parameters:
parameter_schema (dict) – The YAML loaded parameter study schema dictionary - {parameter_name: schema value} SobolSequence expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template (str) – Output file name template. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string.
output_file (str) – Output file name for a single file output of the parameter study. May contain pathseps for an absolute or relative path. output_file and output_file_template are mutually exclusive. The output file is always overwritten.
output_file_type (str) – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template (str) – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study (str) – A relative or absolute file path to a previously created parameter study Xarray Dataset
overwrite (bool) – Overwrite existing output files
dryrun (bool) – Print contents of new parameter study output files to STDOUT and exit
write_meta (bool) – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
To produce consistent Sobol sequences on repeat instantiations, the **kwargs must include either scramble=False or seed=<int>. See the scipy.stats.qmc.Sobol class documentation for details. The d keyword argument is internally managed and will be overwritten to match the number of parameters defined in the parameter schema.
Example
>>> import waves
>>> parameter_schema = {
...     'num_simulations': 4,  # Required key. Value must be an integer.
...     'parameter_1': {
...         'distribution': 'uniform',  # Required key. Value must be a valid scipy.stats
...         'loc': 0,                   # distribution name.
...         'scale': 10
...     },
...     'parameter_2': {
...         'distribution': 'uniform',
...         'loc': 2,
...         'scale': 3
...     }
... }
>>> parameter_generator = waves.parameter_generators.SobolSequence(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:             (data_type: 2, parameter_sets: 4)
Coordinates:
    parameter_set_hash  (parameter_sets) <U32 'c1fa74da12c0991379d1df6541c421...
  * data_type           (data_type) <U9 'quantiles' 'samples'
  * parameter_sets      (parameter_sets) <U14 'parameter_set0' ... 'parameter...
Data variables:
    parameter_1         (data_type, parameter_sets) float64 0.0 0.5 ... 7.5 2.5
    parameter_2         (data_type, parameter_sets) float64 0.0 0.5 ... 4.25
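The reproducibility requirement described above (scramble=False or a fixed seed) can be demonstrated directly against scipy, independent of WAVES:

```python
import numpy
from scipy.stats import qmc

# Two scrambled Sobol samplers with the same seed yield identical draws,
# which is what makes repeat SobolSequence instantiations consistent
draw_a = qmc.Sobol(d=2, scramble=True, seed=42).random(4)
draw_b = qmc.Sobol(d=2, scramble=True, seed=42).random(4)
assert numpy.allclose(draw_a, draw_b)

# With scramble=False the sequence is deterministic without any seed,
# and the first unscrambled Sobol point is the origin
unscrambled = qmc.Sobol(d=2, scramble=False).random(4)
assert numpy.allclose(unscrambled[0], [0.0, 0.0])
```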
- parameter_study_to_dict(*args, **kwargs) dict [source]
Return parameter study as a dictionary
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Parameters:
data_type (str) – The data_type selection to return - samples or quantiles
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- Return type:
dict - {str: {str: value}}
- write() None [source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires a non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when the contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print the file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a