The incremental.py module
Summary
Provides an API to transform an existing workflow into an incrementally evaluating one.
split_workflow_in_chunks – Transform a workflow into an incrementally evaluating one.
Description
Incremental.
Module detail
- incremental.split_workflow_in_chunks(start_op: ansys.dpf.core.Operator, end_op: ansys.dpf.core.Operator, scoping: ansys.dpf.core.Scoping, rescope: bool = False, max_bytes: int = 1024**3, dict_inputs: Dict[int, Any] = {}, chunk_size: int = None, scoping_pin: int = None, end_input_pin: int = 0)
Transform a workflow into an incrementally evaluating one.
It wraps in one function the functionality of the IncrementalHelper class as well as the estimation of the chunk size.
If no chunk_size is specified, the function attempts to estimate a value by calling IncrementalHelper.estimate_size(max_bytes, dict_inputs).
If no scoping_pin is specified, the function attempts to deduce the correct pin, which is the first input pin matching a scoping type.
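The estimation itself is performed by IncrementalHelper.estimate_size; a rough pure-Python sketch of the underlying arithmetic is shown below. The helper names here are hypothetical illustrations, not part of the DPF API: the idea is that, given a measured output size per scoping entity (e.g. from a single-entity pilot evaluation), each chunk is capped so that its output stays under max_bytes.

```python
import math

def estimate_chunk_size(scoping_ids, bytes_per_id, max_bytes=1024**3):
    # Hypothetical mirror of the estimation step: cap the number of
    # scoping entities per chunk so one chunk's output fits in max_bytes.
    ids_per_chunk = max(1, max_bytes // bytes_per_id)
    return min(len(scoping_ids), ids_per_chunk)

def num_chunks(scoping_ids, chunk_size):
    # Number of partial evaluations the transformed workflow will run.
    return math.ceil(len(scoping_ids) / chunk_size)

# Example: 10 000 time steps, ~1 MiB of output each, 1 GiB budget.
chunk = estimate_chunk_size(range(10_000), 1024**2)  # -> 1024 ids per chunk
runs = num_chunks(range(10_000), chunk)              # -> 10 evaluations
```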
- Parameters:
start_op (Operator) – Initial operator of the workflow to convert
end_op (Operator) – Last operator of the workflow to convert
scoping (Scoping) – Scoping to split across multiple evaluations
rescope (bool, optional) – If enabled, will rescope final outputs with the given scoping (default = False)
max_bytes (int, optional) – Max allowed size for the output from the first operator (default = 1024**3)
dict_inputs (dict[int, any], optional) – Inputs to pass to the first operator, used only for the estimation run (default = {})
chunk_size (int, optional) – Maximum number of scoping elements to process in an iteration (default = None)
scoping_pin (int, optional) – The pin number on the first operator to bind the scoping (default = None)
end_input_pin (int, optional) – Pin number of the output to use from the first operator (default = 0)
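A minimal usage sketch, assuming a transient result file and that split_workflow_in_chunks is importable from the ansys.dpf.core namespace; the file path and the displacement/norm operator chain are illustrative only, not a prescribed workflow:

```python
from ansys.dpf import core as dpf

# Illustrative model and time scoping; the result file path is a placeholder.
model = dpf.Model("path/to/transient_result.rst")
time_scoping = model.metadata.time_freq_support.time_frequencies.scoping

# A two-operator chain: displacement results fed into a norm.
displacement = model.results.displacement()
norm = dpf.operators.math.norm_fc(displacement)

# Replace the chain with an incrementally evaluating equivalent.
# With the defaults used here, chunk_size is estimated from max_bytes
# and scoping_pin is deduced from the first scoping-typed input pin.
new_end_op = dpf.split_workflow_in_chunks(displacement, norm, time_scoping)
fields = new_end_op.outputs.fields_container()
```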