ansys.dpf.core.operators.mesh.mesh_provider#
- class ansys.dpf.core.operators.mesh.mesh_provider(time_scoping=None, streams_container=None, data_sources=None, read_cyclic=None, region_scoping=None, laziness=None, config=None, server=None)#
Bases: ansys.dpf.core.dpf_operator.Operator
Reads a mesh from result files.
- Parameters:
time_scoping (int, optional) – Optional time/frequency set ID of the mesh, supported for adaptive meshes.
streams_container (StreamsContainer, optional) – Result file container allowed to be kept open to cache data.
data_sources (DataSources) – Result file path container, used if no streams are set.
read_cyclic (int, optional) – If 1, cyclic symmetry is ignored. If 2, cyclic expansion is done (default is 1).
region_scoping (Scoping or int, optional) – Region ID (integer), vector of region IDs with one entity (vector), or region scoping with one ID (scoping). A region corresponds to a zone for Fluid results or a part for LS-DYNA results.
laziness (DataTree, optional) – Configures whether lazy evaluation can be performed and to what extent. Supported attributes are: "num_named_selections" -> number of named selections to read (-1 reads all, int32, default is -1); careful: the other named selections will not be available, use the mesh_property_provider operator to read them. The mesh property fields "mat", "named_selection", "apdl_element_type", "section" -> if set to 1, the property is not read and a workflow is bound to the property so it can be evaluated on demand; with 0 the property is read (default is 0). "all_available_properties" -> set to 0 to return all possible properties. See the sketch after the examples below.
- Returns:
mesh
- Return type:
MeshedRegion
Examples
>>> from ansys.dpf import core as dpf
>>> # Instantiate operator
>>> op = dpf.operators.mesh.mesh_provider()
>>> # Make input connections
>>> my_time_scoping = int()
>>> op.inputs.time_scoping.connect(my_time_scoping)
>>> my_streams_container = dpf.StreamsContainer()
>>> op.inputs.streams_container.connect(my_streams_container)
>>> my_data_sources = dpf.DataSources()
>>> op.inputs.data_sources.connect(my_data_sources)
>>> my_read_cyclic = int()
>>> op.inputs.read_cyclic.connect(my_read_cyclic)
>>> my_region_scoping = dpf.Scoping()
>>> op.inputs.region_scoping.connect(my_region_scoping)
>>> my_laziness = dpf.DataTree()
>>> op.inputs.laziness.connect(my_laziness)
>>> # Instantiate operator and connect inputs in one line
>>> op = dpf.operators.mesh.mesh_provider(
...     time_scoping=my_time_scoping,
...     streams_container=my_streams_container,
...     data_sources=my_data_sources,
...     read_cyclic=my_read_cyclic,
...     region_scoping=my_region_scoping,
...     laziness=my_laziness,
... )
>>> # Get output data
>>> result_mesh = op.outputs.mesh()
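The following sketch reads a mesh from a result file through a DataSources instance and configures the laziness input with a DataTree. The result file path is hypothetical, and the exact DataTree.add usage shown here is an assumption that may need adjustment for your DPF version.
>>> # Hypothetical result file path; replace with your own
>>> ds = dpf.DataSources(r"D:/data/model.rst")
>>> lazy = dpf.DataTree()
>>> lazy.add(num_named_selections=-1)  # read all named selections (assumed DataTree.add signature)
>>> op = dpf.operators.mesh.mesh_provider(data_sources=ds, laziness=lazy)
>>> mesh = op.outputs.mesh()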
- _inputs#
- _outputs#
- static _spec() ansys.dpf.core.operators.specification.Specification #
- static default_config(server: ansys.dpf.core.server_types.AnyServerType = None) ansys.dpf.core.config.Config #
Returns the default config of the operator.
This config can then be changed to suit the user's needs and used to instantiate the operator. The Configuration allows customizing how the operation is processed by the operator.
- Parameters:
server – Server with channel connected to the remote or local instance. When None, attempts to use the global server.
- Returns:
A new Config instance equivalent to the default config for this operator.
- Return type:
config
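As a minimal sketch of the flow described above (assuming the dpf import from the examples), the default config can be retrieved and passed back when instantiating the operator:
>>> config = dpf.operators.mesh.mesh_provider.default_config()
>>> op = dpf.operators.mesh.mesh_provider(config=config)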
- property inputs: InputsMeshProvider#
Enables connecting inputs to the operator.
- Returns:
An instance of InputsMeshProvider.
- Return type:
inputs
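For example, an input can be connected through this property, as in the examples above:
>>> op.inputs.data_sources.connect(my_data_sources)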
- property outputs: OutputsMeshProvider#
Enables getting outputs of the operator by evaluating it.
- Returns:
An instance of OutputsMeshProvider.
- Return type:
outputs
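For example, evaluating the operator and retrieving the mesh through this property, as in the examples above:
>>> result_mesh = op.outputs.mesh()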