Workflow#

class ansys.dpf.core.workflow.Workflow(workflow=None, server=None)#

Represents a workflow.

A workflow is a black box containing operators and exposing only the inputs and outputs needed to compute a given algorithm.

Parameters:
  • server (DPFServer, optional) – Server with channel connected to the remote or local instance. The default is None, in which case an attempt is made to use the global server.

  • workflow (ctypes.c_void_p, workflow_message_pb2.Workflow, optional)

Examples

Create a generic workflow computing the minimum of displacement by chaining the 'U' and 'min_max_fc' operators.

>>> from ansys.dpf import core as dpf
>>> disp_op = dpf.operators.result.displacement()
>>> max_fc_op = dpf.operators.min_max.min_max_fc(disp_op)
>>> workflow = dpf.Workflow()
>>> workflow.add_operators([disp_op, max_fc_op])
>>> workflow.set_input_name("data_sources", disp_op.inputs.data_sources)
>>> workflow.set_output_name("min", max_fc_op.outputs.field_min)
>>> workflow.set_output_name("max", max_fc_op.outputs.field_max)
>>> from ansys.dpf.core import examples
>>> data_src = dpf.DataSources(examples.find_multishells_rst())
>>> workflow.connect("data_sources", data_src)
>>> min = workflow.get_output("min", dpf.types.field) 
>>> max = workflow.get_output("max", dpf.types.field) 

Overview#

connect

Connect an input on the workflow using a pin name.

get_output

Retrieve the output of the workflow on a named pin.

set_input_name

Set the name of the input pin of the workflow to expose it for future connection.

set_output_name

Set the name of the output pin of the workflow to expose it for future connection.

add_operators

Add operators to the list of operators of the workflow.

add_operator

Add an operator to the list of operators of the workflow.

record

Add the workflow to DPF’s internal registry with an ID returned by this method.

connect_with

Prepend a given workflow to the current workflow.

create_on_other_server

Create a new instance of a workflow on another server.

view

Run a viewer to show a rendering of the workflow.

to_graphviz

Save the workflow to a GraphViz file.

get_topology

Get the topology of the workflow.

progress_bar

Whether to display a progress bar when requesting workflow output. The default is True.

info

Dictionary with the operator names and the exposed input and output names.

operator_names

List of the names of operators added in the workflow.

input_names

List of the input names exposed in the workflow with set_input_name.

output_names

List of the output names exposed in the workflow with set_output_name.

get_recorded_workflow

Retrieve a workflow recorded with workflow.record().

__del__

Clean up resources associated with the instance.

__str__

Describe the entity.

Import detail#

from ansys.dpf.core.workflow import Workflow

Property detail#

property Workflow.progress_bar: bool#

Whether to display a progress bar when requesting workflow output. The default is True.

Type:

bool
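
Examples

A minimal sketch of disabling the progress bar before requesting outputs (assumes a running DPF server):

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> workflow.progress_bar = False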

property Workflow.info#

Dictionary with the operator names and the exposed input and output names.

Returns:

info – Dictionary with "operator_names", "input_names", and "output_names" keys.

Return type:

dictionary str -> list of str
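
Examples

A minimal sketch of inspecting a workflow through this property (assumes a running DPF server):

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> disp_op = dpf.operators.result.displacement()
>>> workflow.add_operator(disp_op)
>>> workflow.set_input_name("data_sources", disp_op.inputs.data_sources)
>>> input_names = workflow.info["input_names"]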

property Workflow.operator_names#

List of the names of operators added in the workflow.

Returns:

names

Return type:

list str

property Workflow.input_names#

List of the input names exposed in the workflow with set_input_name.

Returns:

names

Return type:

list str

property Workflow.output_names#

List of the output names exposed in the workflow with set_output_name.

Returns:

names

Return type:

list str

Method detail#

Workflow.connect(pin_name, inpt, pin_out=0)#

Connect an input on the workflow using a pin name.

Parameters:
  • pin_name (str) – Name of the pin to connect. This name must first be exposed with wf.set_input_name.

  • inpt (str, int, double, bool, list of int, list of doubles, Field, FieldsContainer, Scoping, ScopingsContainer, MeshedRegion, MeshesContainer, DataSources, Operator) – Object to connect to.

  • pin_out (int, optional) – If the input is an operator, the output pin of the input operator. The default is 0.

Examples

Create a generic workflow computing the minimum of displacement by chaining the 'U' and 'min_max_fc' operators.

>>> from ansys.dpf import core as dpf
>>> disp_op = dpf.operators.result.displacement()
>>> max_fc_op = dpf.operators.min_max.min_max_fc(disp_op)
>>> workflow = dpf.Workflow()
>>> workflow.add_operators([disp_op, max_fc_op])
>>> workflow.set_input_name("data_sources", disp_op.inputs.data_sources)
>>> workflow.set_output_name("min", max_fc_op.outputs.field_min)
>>> workflow.set_output_name("max", max_fc_op.outputs.field_max)
>>> from ansys.dpf.core import examples
>>> data_src = dpf.DataSources(examples.find_multishells_rst())
>>> workflow.connect("data_sources", data_src)
>>> min = workflow.get_output("min", dpf.types.field) 
>>> max = workflow.get_output("max", dpf.types.field) 
Workflow.get_output(pin_name, output_type)#

Retrieve the output of the workflow on a named pin.

A progress bar following the workflow state is printed.

Parameters:
  • pin_name (str) – Name of the pin to retrieve. This name must first be exposed with wf.set_output_name.

  • output_type (core.type enum) – Type of the requested output.
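
Examples

A sketch mirroring the class-level example, retrieving a named output once the exposed input is connected (assumes a running DPF server):

>>> from ansys.dpf import core as dpf
>>> from ansys.dpf.core import examples
>>> disp_op = dpf.operators.result.displacement()
>>> max_fc_op = dpf.operators.min_max.min_max_fc(disp_op)
>>> workflow = dpf.Workflow()
>>> workflow.add_operators([disp_op, max_fc_op])
>>> workflow.set_input_name("data_sources", disp_op.inputs.data_sources)
>>> workflow.set_output_name("max", max_fc_op.outputs.field_max)
>>> workflow.connect("data_sources", dpf.DataSources(examples.find_multishells_rst()))
>>> field_max = workflow.get_output("max", dpf.types.field)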

Workflow.set_input_name(name, *args)#

Set the name of the input pin of the workflow to expose it for future connection.

Parameters:
  • name (str) – Name to give to the input pin that is exposed on the workflow.

  • *args (core.Operator, core.Input, int) – Operator with its input pin number or input to name.

Examples

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> disp_op = dpf.operators.result.displacement()
>>> max_fc_op = dpf.operators.min_max.min_max_fc(disp_op)
>>> workflow.set_input_name("data_sources", disp_op.inputs.data_sources)
>>> from ansys.dpf.core import examples
>>> data_src = dpf.DataSources(examples.find_multishells_rst())
>>> workflow.connect("data_sources", data_src)
Workflow.set_output_name(name, *args)#

Set the name of the output pin of the workflow to expose it for future connection.

Parameters:
  • name (str) – Name to give to the output pin that is exposed on the workflow.

  • *args (core.Operator, core.Output, int) – Operator with its output pin number or output to name.

Examples

>>> from ansys.dpf import core as dpf
>>> from ansys.dpf.core import examples
>>> workflow = dpf.Workflow()
>>> model = dpf.Model(examples.find_simple_bar())
>>> disp_op = model.results.displacement()
>>> max_fc_op = dpf.operators.min_max.min_max_fc(disp_op)
>>> workflow.set_output_name("contour", disp_op.outputs.fields_container)
>>> fc = workflow.get_output("contour", dpf.types.fields_container) 
Workflow.add_operators(operators)#

Add operators to the list of operators of the workflow.

Parameters:

operators (dpf.core.Operator, list of dpf.core.Operator) – Operators to add to the list.

Examples

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> disp_op = dpf.Operator("U")
>>> max_op = dpf.Operator("min_max")
>>> workflow.add_operators([disp_op, max_op])
Workflow.add_operator(operator)#

Add an operator to the list of operators of the workflow.

Parameters:

operator (dpf.core.Operator) – Operator to add to the list.

Examples

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> disp_op = dpf.Operator("U")
>>> workflow.add_operator(disp_op)
Workflow.record(identifier='', transfer_ownership=True)#

Add the workflow to DPF’s internal registry with an ID returned by this method.

The workflow can be recovered by dpf.core.Workflow.get_recorded_workflow(id).

Parameters:
  • identifier (str, optional) – Name given to the workflow.

  • transfer_ownership (bool) – Whether to transfer the ownership. The default is True. If the ownership is not transferred, the workflow is removed from the internal registry as soon as the workflow has been recovered by its ID.

Return type:

int

Examples

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> disp_op = dpf.Operator("U")
>>> workflow.add_operator(disp_op)
>>> # ...
>>> id = workflow.record()
>>> workflow_copy = dpf.Workflow.get_recorded_workflow(id)
static Workflow.get_recorded_workflow(id, server=None)#

Retrieve a workflow recorded with workflow.record().

Parameters:

id (int) – ID returned by the record method.

Returns:

workflow – Workflow registered in DPF’s registry (server side).

Return type:

core.Workflow

Examples

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> disp_op = dpf.Operator("U")
>>> workflow.add_operator(disp_op)
>>> # ...
>>> id = workflow.record()
>>> workflow_copy = dpf.Workflow.get_recorded_workflow(id)
Workflow.connect_with(left_workflow, output_input_names=None)#

Prepend a given workflow to the current workflow.

Updates the current workflow to include all the operators of the workflow given as argument. Outputs of the given workflow are connected to inputs of the current workflow according to the map. All outputs of the given workflow become outputs of the current workflow.

Parameters:
  • left_workflow (core.Workflow) – The given workflow’s outputs are chained with the current workflow’s inputs.

  • output_input_names (str tuple, str dict optional) – Map used to connect the outputs of the given workflow to the inputs of the current workflow. Check the names of available inputs and outputs for each workflow using Workflow.input_names and Workflow.output_names. The default is None, in which case it tries to connect each output of the left_workflow with an input of the current workflow with the same name.

Examples

+-------------------------------------------------------------------------------------------------+
|  INPUT:                                                                                         |
|                                                                                                 |
|input_output_names = ("output","field" )                                                         |
|                      _____________                                  ____________                |
|  "data_sources"  -> |left_workflow| ->  "stuff"        "field" -> |     this   | -> "contour"   |
|"time_scoping"    -> |             |             "mesh_scoping" -> |            |                |
|                     |_____________| ->  "output"                  |____________|                |
|  OUTPUT                                                                                         |
|                    ____                                                                         |
|"data_sources"  -> |this| ->  "stuff"                                                            |
|"time_scoping" ->  |    | ->  "contour"                                                          |
|"mesh_scoping" ->  |____| -> "output"                                                            |
+-------------------------------------------------------------------------------------------------+
>>> import ansys.dpf.core as dpf
>>> left_wf = dpf.Workflow()
>>> op1 = dpf.operators.utility.forward()
>>> left_wf.set_input_name("op1_input", op1.inputs.any)
>>> left_wf.set_output_name("op1_output", op1.outputs.any)
>>> op2 = dpf.operators.utility.forward()
>>> left_wf.set_input_name("op2_input", op2.inputs.any)
>>> left_wf.set_output_name("op2_output", op2.outputs.any)
>>> left_wf.add_operators([op1, op2])
>>> print(f"{left_wf.input_names=}")
left_wf.input_names=['op1_input', 'op2_input']
>>> print(f"{left_wf.output_names=}")
left_wf.output_names=['op1_output', 'op2_output']
>>> current_wf = dpf.Workflow()
>>> op3 = dpf.operators.utility.forward()
>>> current_wf.set_input_name("op3_input", op3.inputs.any)
>>> current_wf.set_output_name("op3_output", op3.outputs.any)
>>> op4 = dpf.operators.utility.forward()
>>> current_wf.set_input_name("op4_input", op4.inputs.any)
>>> current_wf.set_output_name("op4_output", op4.outputs.any)
>>> current_wf.add_operators([op3, op4])
>>> print(f"{current_wf.input_names=}")
current_wf.input_names=['op3_input', 'op4_input']
>>> print(f"{current_wf.output_names=}")
current_wf.output_names=['op3_output', 'op4_output']
>>> output_input_names = {"op2_output": "op3_input"}
>>> current_wf.connect_with(left_wf, output_input_names)
>>> print(f"New {current_wf.input_names=}")
New current_wf.input_names=['op1_input', 'op2_input', 'op4_input']
>>> print(f"New {current_wf.output_names=}")
New current_wf.output_names=['op1_output', 'op2_output', 'op3_output', 'op4_output']

Notes

Available with server version 3.0 and later.

Workflow.create_on_other_server(*args, **kwargs)#

Create a new instance of a workflow on another server.

The new workflow has the same operators and the same exposed input and output pins as this workflow. Connections between operators and between data and operators are kept (except for exposed pins).

Parameters:
  • server (server.LegacyGrpcServer, optional) – Server with channel connected to the remote or local instance. When None, attempts to use the global server.

  • ip (str, optional) – IP address on which to create the new instance. Always specify a port as well.

  • port (str, int, optional) – Port on which to create the new instance.

  • address (str, optional) – Full address on which to create the new instance, in the form “ip:port”.

Return type:

Workflow

Examples

Create a generic Workflow computing the minimum of displacement by chaining the 'U' and 'min_max_fc' operators.

>>> from ansys.dpf import core as dpf
>>> disp_op = dpf.operators.result.displacement()
>>> max_fc_op = dpf.operators.min_max.min_max_fc(disp_op)
>>> workflow = dpf.Workflow()
>>> workflow.add_operators([disp_op, max_fc_op])
>>> workflow.set_input_name("data_sources", disp_op.inputs.data_sources)
>>> workflow.set_output_name("min", max_fc_op.outputs.field_min)
>>> workflow.set_output_name("max", max_fc_op.outputs.field_max)
>>> #other_server = dpf.start_local_server(as_global=False)
>>> #new_workflow = workflow.create_on_other_server(server=other_server)
>>> #assert 'data_sources' in new_workflow.input_names
Workflow.view(title: None | str = None, save_as: None | str | os.PathLike = None, off_screen: bool = False, keep_dot_file: bool = False) str | None#

Run a viewer to show a rendering of the workflow.

Warning

The workflow is rendered using GraphViz, which requires:

  • installation of GraphViz on your computer (see https://graphviz.org/download/)

  • installation of the graphviz library in your Python environment.

Parameters:
  • title – Name to use in intermediate files and in the viewer.

  • save_as – Path to a file to save the workflow view as.

  • off_screen – Render the image off_screen.

  • keep_dot_file – Whether to keep the intermediate DOT file generated.

Return type:

The path to the rendered image file if save_as is provided, None otherwise.
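
Examples

A sketch of rendering a small workflow to an image file (assumes GraphViz and the graphviz package are installed; the file name is illustrative):

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> workflow.add_operator(dpf.operators.utility.forward())
>>> image_path = workflow.view(save_as="my_workflow.png", off_screen=True)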

Workflow.to_graphviz(path: os.PathLike | str)#

Save the workflow to a GraphViz file.
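
Examples

A sketch of exporting a small workflow to a DOT file (the file name is illustrative; assumes a running DPF server):

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> workflow.add_operator(dpf.operators.utility.forward())
>>> workflow.to_graphviz("my_workflow.dot")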

Workflow.get_topology()#

Get the topology of the workflow.

Returns:

workflow_topology

Return type:

workflow_topology.WorkflowTopology
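
Examples

A minimal sketch (assumes server version 10.0 or later):

>>> from ansys.dpf import core as dpf
>>> workflow = dpf.Workflow()
>>> workflow.add_operator(dpf.operators.utility.forward())
>>> topology = workflow.get_topology()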

Notes

Available with server version 10.0 and later.

Workflow.__del__()#

Clean up resources associated with the instance.

This method calls the deleter function to release resources. If an exception occurs during deletion, a warning is issued.

Raises:

Warning – If an exception occurs while attempting to delete resources.

Workflow.__str__()#

Describe the entity.

Returns:

description

Return type:

str