Create a plug-in package with multiple operators#
This example shows how to create a plug-in package with multiple operators. The benefits of writing a package rather than simple scripts are:

- Componentization: You can split the code into several Python modules or files.
- Distribution: You can use standard Python tools to upload and download packages.
- Documentation: You can add README files, documentation, tests, and examples to the package.
For this example, the plug-in package contains two different operators:

- One that returns all scoping IDs having data higher than the average
- One that returns all scoping IDs having data lower than the average
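To make the behavior of the two operators concrete, here is the filtering they perform, sketched in plain Python. This is illustrative only: the real operators work on DPF fields and scopings, not Python lists, and the function names below are hypothetical.

```python
def ids_with_data_higher_than_average(ids, data):
    """Return the IDs whose value is strictly above the average of all values."""
    average = sum(data) / len(data)
    return [i for i, d in zip(ids, data) if d > average]


def ids_with_data_lower_than_average(ids, data):
    """Return the IDs whose value is strictly below the average of all values."""
    average = sum(data) / len(data)
    return [i for i, d in zip(ids, data) if d < average]


ids = [1, 2, 3, 4]
data = [0.5, 1.5, 2.0, 4.0]  # average is 2.0
print(ids_with_data_higher_than_average(ids, data))  # prints [4]
print(ids_with_data_lower_than_average(ids, data))   # prints [1, 2]
```

Note that entries exactly equal to the average are returned by neither function.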
Note
This example requires DPF 4.0 (Ansys 2022 R2) or above. For more information, see Compatibility.
Create the plug-in package#
Each operator implementation derives from the ansys.dpf.core.custom_operator.CustomOperatorBase class, and the plug-in package records its operators through a call to the ansys.dpf.core.custom_operator.record_operator() method.
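The sketch below shows the shape of that pattern: an operator subclass with a run() method and a name property, plus a load_operators() function that records the operator. So the sketch runs without a DPF install, it substitutes trivial stand-ins for the DPF base class and for record_operator, omits the required specification property, and uses illustrative class and variable names; the real plug-in imports these from ansys.dpf.core.custom_operator.

```python
class CustomOperatorBase:
    """Stand-in for ansys.dpf.core.custom_operator.CustomOperatorBase,
    so this sketch runs without a DPF install."""

    def __init__(self):
        self._inputs = {}
        self._outputs = {}

    def get_input(self, pin, type=None):
        return self._inputs[pin]

    def set_output(self, pin, value):
        self._outputs[pin] = value

    def set_succeeded(self):
        self.succeeded = True


class IdsWithDataLowerThanAverage(CustomOperatorBase):
    @property
    def name(self):
        # The string later passed to dpf.Operator(...) to instantiate it.
        return "ids_with_data_lower_than_average"

    def run(self):
        # Pin 0 carries the input data; an (ids, values) pair in this sketch.
        ids, values = self.get_input(0)
        average = sum(values) / len(values)
        self.set_output(0, [i for i, v in zip(ids, values) if v < average])
        self.set_succeeded()


RECORDED = []


def record_operator(operator_class, *args):
    """Stand-in for ansys.dpf.core.custom_operator.record_operator."""
    RECORDED.append(operator_class)


def load_operators(*args):
    # This function name is the third argument later passed to load_library().
    record_operator(IdsWithDataLowerThanAverage, *args)


load_operators()
op = RECORDED[0]()
op._inputs[0] = ([1, 2, 3], [1.0, 2.0, 6.0])  # average is 3.0
op.run()
print(op._outputs[0])  # prints [1, 2]
```

In the real package, load_operators lives in the plug-in's __init__ file and records one class per operator.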
Download the average_filter_plugin plug-in package, which has already been created for you.
from ansys.dpf.core import examples
plugin_folder = examples.download_average_filter_plugin()
Load the plug-in package#
You use the ansys.dpf.core.core.load_library() function to load the plug-in package. It takes three arguments:

- The first argument is the path to the directory where the plug-in package is located.
- The second argument is py_<package>, where <package> is the name identifying the plug-in package.
- The third argument is the name of the function exposed in the __init__ file of the plug-in package that is used to record operators.
from ansys.dpf import core as dpf
from ansys.dpf.core import examples
# Python plugins are not supported in-process, so start a gRPC server.
dpf.start_local_server(config=dpf.AvailableServerConfigs.GrpcServer)
tmp = dpf.make_tmp_dir_server()
dpf.upload_files_in_folder(dpf.path_utilities.join(tmp, "average_filter_plugin"), plugin_folder)
dpf.load_library(
dpf.path_utilities.join(tmp, "average_filter_plugin"),
"py_average_filter",
"load_operators",
)
'py_average_filter successfully loaded'
Instantiate the operator.
new_operator = dpf.Operator("ids_with_data_lower_than_average")
Connect a workflow#
Connect a workflow that computes the norm of the displacement to the ids_with_data_lower_than_average operator. Methods of the ids_with_data_lower_than_average class are dynamically added because specifications for the operator are defined in the plug-in package.
Use the operator#
ds = dpf.DataSources(dpf.upload_file_in_tmp_folder(examples.find_static_rst()))
displacement = dpf.operators.result.displacement(data_sources=ds)
norm = dpf.operators.math.norm(displacement)
new_operator.inputs.connect(norm)
new_scoping = new_operator.outputs.scoping()
print("scoping in was:", norm.outputs.field().scoping)
print("----------------------------------------------")
print("scoping out is:", new_scoping)
scoping in was: DPF Scoping:
with Nodal location and 81 entities
----------------------------------------------
scoping out is: DPF Scoping:
with Nodal location and 35 entities
Total running time of the script: (0 minutes 4.017 seconds)