DPF Collections#

This tutorial demonstrates how to create and work with some DPF collections: FieldsContainer, MeshesContainer and ScopingsContainer.

You can store DPF entities of a given type as a DPF collection and further categorize them according to labels and associated values, which allows you to organize and keep track of data. Collections are essential for handling multiple time steps, frequency sets, or other labeled datasets in your analysis workflows.


What You’ll Learn#

This tutorial covers the following topics:

  • Working with FieldsContainer: Extract results across time steps, access individual fields, and create custom containers with multiple labels

  • Working with ScopingsContainer: Create and manage selections, and use them with operators for targeted result extraction

  • Working with MeshesContainer: Store and organize multiple mesh variations or time-dependent meshes

  • Collection operations: Iterate through collections, filter by labels, and access metadata

  • Advanced usage: Learn about other built-in collection types and create custom collections using the collection factory

By the end of this tutorial, you’ll have a basic understanding of how to effectively organize and manipulate DPF data using collections in your analysis workflows.

Introduction to Collections#

Collections in DPF serve as containers that group related objects with labels. The main collection types are the FieldsContainer for Field objects, the MeshesContainer for MeshedRegion objects, and the ScopingsContainer for Scoping objects.

Each collection provides methods to:

  • Add, retrieve, and iterate over contained objects

  • Access objects by label (time, frequency, set ID, and so on)

  • Perform operations across all contained objects
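
For a quick preview of this shared API, here is a minimal sketch using a ScopingsContainer (the container, label, and scoping below are purely illustrative; each of these methods is demonstrated in detail later in this tutorial):

import ansys.dpf.core as dpf

# Create a container with a single "step" label (illustrative values only)
sc = dpf.ScopingsContainer()
sc.labels = ["step"]

# Add a scoping together with its label space
scoping = dpf.Scoping(ids=[1, 2, 3], location=dpf.locations.nodal)
sc.add_scoping({"step": 1}, scoping)

# Retrieve it by label space or by index
same_scoping = sc.get_scoping({"step": 1})
also_same = sc[0]

# Iterate over the contained objects
for s in sc:
    print(s.ids)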

Collections are used in DPF workflows to provide operators with vectorized data, allowing you to process data in bulk or in parallel whenever possible.

The LabelSpace#

DPF collections use labels to categorize and organize contained objects.

Labels can be thought of as categories or dimensions along which the objects in the collection are organized.

Each object in the collection is associated with an integer value for each of the collection's labels.

A LabelSpace is a dictionary of one or more labels (e.g., “time”, “frequency”, “set ID”), each associated with specific values.

The LabelSpace is used to identify and access the objects within the collection.

It can be partial and target multiple objects in the collection, or it can be complete and define a value for each label of the collection, resulting in a unique object.

It is similar to multi-dimensional indexing in arrays or dataframes, or to a filter or query in databases.

Here are some examples:

  • A collection of fields (a FieldsContainer) across time uses the label time, with each field associated with an integer time value (the time step ID). The LabelSpace {"time": 3} would uniquely identify the field for time step 3.

  • A collection of fields (a FieldsContainer) across frequency and stage uses the labels frequency and stage, with each field associated with an integer frequency value (the frequency ID) and an integer stage value (the stage ID). The LabelSpace {"frequency": 2, "stage": 1} would uniquely identify the field for frequency ID 2 at stage ID 1.

  • A collection of meshes (a MeshesContainer) across different parts, further split by element type, uses the labels part and element_type, with each mesh associated with an integer part value (the part ID) and an integer element_type value (the element type code). The LabelSpace {"part": 2, "element_type": 1} would uniquely identify the mesh for part 2 with element type code 1.

  • A collection of scopings (a ScopingsContainer) across different fluid zones in a transient analysis uses the labels zone and time, with each scoping associated with an integer zone value (the zone ID) and an integer time value (the time step ID). The LabelSpace {"zone": 1} would identify the scopings for zone 1 only, across all time steps.
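
To make this concrete, the following sketch builds a small two-label FieldsContainer with made-up data and contrasts access through a complete LabelSpace with access through a partial one:

import ansys.dpf.core as dpf

# Hypothetical container labeled on "time" and "zone" (illustrative data only)
fc = dpf.FieldsContainer()
fc.labels = ["time", "zone"]
for time in [1, 2, 3]:
    for zone in [1, 2]:
        field = dpf.Field(location=dpf.locations.nodal, nature=dpf.natures.scalar)
        field.scoping.ids = [1]
        field.data = [float(time * zone)]
        fc.add_field({"time": time, "zone": zone}, field)

# Complete LabelSpace: a value for every label -> identifies a single field
single_field = fc.get_field({"time": 3, "zone": 1})

# Partial LabelSpace: only some labels -> may target several fields
zone_1_fields = fc.get_fields({"zone": 1})
print(len(zone_1_fields))  # one field per time step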

Load an example file#

First, import the required modules and load a transient analysis result file.

A transient analysis is a typical example where collections are useful, as data is available at multiple time steps.

# Import the ansys.dpf.core module
import ansys.dpf.core as dpf

# Import the examples module
from ansys.dpf.core import examples

# Load a transient analysis with multiple time steps
result_file_path = examples.find_msup_transient()

# Create a DataSources object
data_sources = dpf.DataSources(result_path=result_file_path)

# Create a Model from the data sources
model = dpf.Model(data_sources=data_sources)

# Display basic model information
print(model)
DPF Model
------------------------------
Transient analysis
Unit system: MKS: m, kg, N, s, V, A, degC
Physics Type: Mechanical
Available results:
     -  node_orientations: Nodal Node Euler Angles
     -  displacement: Nodal Displacement
     -  velocity: Nodal Velocity      
     -  acceleration: Nodal Acceleration
     -  reaction_force: Nodal Force   
     -  stress: ElementalNodal Stress 
     -  elemental_volume: Elemental Volume
     -  stiffness_matrix_energy: Elemental Energy-stiffness matrix
     -  artificial_hourglass_energy: Elemental Hourglass Energy
     -  kinetic_energy: Elemental Kinetic Energy
     -  co_energy: Elemental co-energy
     -  incremental_energy: Elemental incremental energy
     -  thermal_dissipation_energy: Elemental thermal dissipation energy
     -  elastic_strain: ElementalNodal Strain
     -  elastic_strain_eqv: ElementalNodal Strain eqv
------------------------------
DPF  Meshed Region: 
  393 nodes 
  40 elements 
  Unit: m 
  With solid (3D) elements
------------------------------
DPF  Time/Freq Support: 
  Number of sets: 20 
Cumulative     Time (s)       LoadStep       Substep         
1              0.010000       1              1               
2              0.020000       1              2               
3              0.030000       1              3               
4              0.040000       1              4               
5              0.050000       1              5               
6              0.060000       1              6               
7              0.070000       1              7               
8              0.080000       1              8               
9              0.090000       1              9               
10             0.100000       1              10              
11             0.110000       1              11              
12             0.120000       1              12              
13             0.130000       1              13              
14             0.140000       1              14              
15             0.150000       1              15              
16             0.160000       1              16              
17             0.170000       1              17              
18             0.180000       1              18              
19             0.190000       1              19              
20             0.200000       1              20              

Working with FieldsContainer#

A FieldsContainer is the most commonly used collection in DPF. It stores multiple Field objects, each associated with a label such as time step or frequency.

Extract Results into a FieldsContainer#

Extract displacement results for all time steps, which will automatically create a FieldsContainer.

# Get displacement results for all time steps
displacement_fc = model.results.displacement.on_all_time_freqs.eval()

# Display FieldsContainer information
print(displacement_fc)
DPF displacement(s)Fields Container
  with 20 field(s)
  defined on labels: time 

  with:
  - field 0 {time:  1} with Nodal location, 3 components and 393 entities.
  - field 1 {time:  2} with Nodal location, 3 components and 393 entities.
  - field 2 {time:  3} with Nodal location, 3 components and 393 entities.
  - field 3 {time:  4} with Nodal location, 3 components and 393 entities.
  - field 4 {time:  5} with Nodal location, 3 components and 393 entities.
  - field 5 {time:  6} with Nodal location, 3 components and 393 entities.
  - field 6 {time:  7} with Nodal location, 3 components and 393 entities.
  - field 7 {time:  8} with Nodal location, 3 components and 393 entities.
  - field 8 {time:  9} with Nodal location, 3 components and 393 entities.
  - field 9 {time:  10} with Nodal location, 3 components and 393 entities.
  - field 10 {time:  11} with Nodal location, 3 components and 393 entities.
  - field 11 {time:  12} with Nodal location, 3 components and 393 entities.
  - field 12 {time:  13} with Nodal location, 3 components and 393 entities.
  - field 13 {time:  14} with Nodal location, 3 components and 393 entities.
  - field 14 {time:  15} with Nodal location, 3 components and 393 entities.
  - field 15 {time:  16} with Nodal location, 3 components and 393 entities.
  - field 16 {time:  17} with Nodal location, 3 components and 393 entities.
  - field 17 {time:  18} with Nodal location, 3 components and 393 entities.
  - field 18 {time:  19} with Nodal location, 3 components and 393 entities.
  - field 19 {time:  20} with Nodal location, 3 components and 393 entities.
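
You can also inspect which labels a container defines and which values are available for each label. A brief sketch (this assumes the get_available_ids_for_label helper of collections and the time_freq_support property of the FieldsContainer):

# Labels defined on the container
print(displacement_fc.labels)

# Integer values available for the "time" label
print(displacement_fc.get_available_ids_for_label("time"))

# The time/frequency support maps those IDs to physical time values
print(displacement_fc.time_freq_support.time_frequencies.data)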

Access Individual Fields in the Container#

You can access individual fields by their label or index.

# Access field by index (first time step)
first_field = displacement_fc[0]
print(f"First field info:")
print(first_field)

# Access field by label (specific time step)
second_time_field = displacement_fc.get_field({"time": 2})
# Equivalent to:
second_time_field = displacement_fc.get_field_by_time_id(2)
print(f"\nSecond time step field:")
print(second_time_field)
First field info:
DPF displacement_0.01s Field
  Location: Nodal
  Unit: m
  393 entities 
  Data: 3 components and 393 elementary data 

  Nodal
  IDs                   data(m)
  ------------          ----------
  9                     1.623646e-14   1.476283e-04   1.964400e-06   
                        
  96                    2.676501e-08   1.477594e-04   1.966312e-06   
                        
  95                    2.906660e-08   1.669437e-04   1.868636e-06   
                        
  ...



Second time step field:
DPF displacement_0.02s Field
  Location: Nodal
  Unit: m
  393 entities 
  Data: 3 components and 393 elementary data 

  Nodal
  IDs                   data(m)
  ------------          ----------
  9                     8.622437e-14   5.809883e-04   9.631718e-06   
                        
  96                    -4.960618e-08  5.807333e-04   9.646653e-06   
                        
  95                    -2.174825e-08  6.778664e-04   9.752991e-06   
                        
  ...


Create a Custom FieldsContainer#

You can create your own FieldsContainer and add fields with custom labels.

# Create an empty FieldsContainer
custom_fc = dpf.FieldsContainer()

# Set up labels for the container
custom_fc.labels = ["time", "zone"]

# Create sample fields for different time steps and zones
field1 = dpf.Field(location=dpf.locations.nodal, nature=dpf.natures.scalar)
field2 = dpf.Field(location=dpf.locations.nodal, nature=dpf.natures.scalar)
field3 = dpf.Field(location=dpf.locations.nodal, nature=dpf.natures.scalar)
field4 = dpf.Field(location=dpf.locations.nodal, nature=dpf.natures.scalar)

# Add some sample nodes and data
field1.scoping.ids = [1, 2, 3, 4, 5]
field1.data = [float(1 * i) for i in range(1, 6)]
field2.scoping.ids = [1, 2, 3, 4, 5]
field2.data = [float(2 * i) for i in range(1, 6)]
field3.scoping.ids = [1, 2, 3, 4, 5]
field3.data = [float(3 * i) for i in range(1, 6)]
field4.scoping.ids = [1, 2, 3, 4, 5]
field4.data = [float(4 * i) for i in range(1, 6)]

# Add field to container with labels
custom_fc.add_field({"time": 1, "zone": 1}, field1)
custom_fc.add_field({"time": 2, "zone": 1}, field2)
custom_fc.add_field({"time": 1, "zone": 2}, field3)
custom_fc.add_field({"time": 2, "zone": 2}, field4)

# Display the custom FieldsContainer
print(custom_fc)
DPF  Fields Container
  with 4 field(s)
  defined on labels: time zone 

  with:
  - field 0 {time:  1, zone:  1} with Nodal location, 1 components and 5 entities.
  - field 1 {time:  2, zone:  1} with Nodal location, 1 components and 5 entities.
  - field 2 {time:  1, zone:  2} with Nodal location, 1 components and 5 entities.
  - field 3 {time:  2, zone:  2} with Nodal location, 1 components and 5 entities.
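
Because this container defines two labels, a complete LabelSpace (a value for both time and zone) identifies exactly one field:

# A complete LabelSpace identifies a single field
field_time2_zone1 = custom_fc.get_field({"time": 2, "zone": 1})
print(field_time2_zone1.data)  # the data assigned to field2 above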

Working with ScopingsContainer#

A ScopingsContainer holds multiple Scoping objects, which define sets of entity IDs (nodes, elements, etc.).

Create and Populate a ScopingsContainer#

Create different node selections and organize them in a ScopingsContainer.

# Get the mesh from our model
mesh = model.metadata.meshed_region

# Create a ScopingsContainer
scopings_container = dpf.ScopingsContainer()

# Set labels for different selections
scopings_container.labels = ["selection_type"]

# Selection 1: First 10 nodes
first_nodes = dpf.Scoping(location=dpf.locations.nodal)
first_nodes.ids = list(range(1, 11))
scopings_container.add_scoping(label_space={"selection_type": 0}, scoping=first_nodes)

# Selection 2: Every 10th node (sample)
all_node_ids = mesh.nodes.scoping.ids
every_tenth = dpf.Scoping(location=dpf.locations.nodal)
every_tenth.ids = all_node_ids[::10]  # Every 10th node
scopings_container.add_scoping(label_space={"selection_type": 1}, scoping=every_tenth)

# Selection 3: Last 10 nodes
last_nodes = dpf.Scoping(location=dpf.locations.nodal)
last_nodes.ids = all_node_ids[-10:]
scopings_container.add_scoping(label_space={"selection_type": 2}, scoping=last_nodes)

# Display ScopingsContainer information
print(scopings_container)
DPF  Scopings Container
  with 3 scoping(s)
  defined on labels: selection_type 

  with:
  - scoping 0 {selection_type:  0, } located on Nodal 10 entities.
  - scoping 1 {selection_type:  1, } located on Nodal 40 entities.
  - scoping 2 {selection_type:  2, } located on Nodal 10 entities.

Use ScopingsContainer with Operators#

ScopingsContainer objects can be used with operators to apply operations to multiple selections.

# Create an operator to extract displacement on specific node sets
displacement_op = dpf.operators.result.displacement()
# Connect the data source
displacement_op.inputs.data_sources(data_sources)
# Connect the scopings container which defines the node selections
displacement_op.inputs.mesh_scoping(scopings_container)

# Evaluate to get results for all scopings
scoped_displacements = displacement_op.eval()

print(f"Displacement results for different node selections:")
print(scoped_displacements)
Displacement results for different node selections:
DPF displacement(s)Fields Container
  with 3 field(s)
  defined on labels: selection_type time 

  with:
  - field 0 {selection_type:  0, time:  20} with Nodal location, 3 components and 10 entities.
  - field 1 {selection_type:  1, time:  20} with Nodal location, 3 components and 40 entities.
  - field 2 {selection_type:  2, time:  20} with Nodal location, 3 components and 10 entities.
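
In the output above, only the last time step (time 20) was evaluated, which is the operator's default behavior. To combine the node selections with specific time steps, you can also connect a time scoping. A brief sketch, assuming the same operator inputs as above:

# Request the first three time steps for each node selection
displacement_op.inputs.time_scoping([1, 2, 3])
scoped_displacements_in_time = displacement_op.eval()

# The resulting FieldsContainer is labeled on both "selection_type" and "time"
print(scoped_displacements_in_time)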

Working with MeshesContainer#

A MeshesContainer stores multiple MeshedRegion objects. This is useful when working with different mesh variations or time-dependent meshes.

Create a MeshesContainer#

Create a MeshesContainer with mesh data for different cases.

# Create a MeshesContainer
meshes_container = dpf.MeshesContainer()

# Set labels for different mesh variations
meshes_container.labels = ["variation"]

# Get the original mesh
original_mesh = model.metadata.meshed_region

# Add original mesh
meshes_container.add_mesh({"variation": 0}, original_mesh)

# Create a modified mesh (example: subset of elements)
# Get element scoping for first half of elements
all_element_ids = original_mesh.elements.scoping.ids
subset_element_ids = all_element_ids[:len(all_element_ids)//2]

# Create element scoping for subset
element_scoping = dpf.Scoping(location=dpf.locations.elemental)
element_scoping.ids = subset_element_ids

# Extract subset mesh using an operator
mesh_extract_op = dpf.operators.mesh.from_scoping()
mesh_extract_op.inputs.mesh(original_mesh)
mesh_extract_op.inputs.scoping(element_scoping)
subset_mesh = mesh_extract_op.eval()

# Add subset mesh to container
meshes_container.add_mesh({"variation": 1}, subset_mesh)

# Display MeshesContainer information
print(meshes_container)
DPF  Meshes Container
  with 2 mesh(es)
  defined on labels: variation 

  with:
  - mesh 0 {variation:  0, } with 393 nodes and 40 elements.
  - mesh 1 {variation:  1, } with 203 nodes and 20 elements.
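
Meshes can be retrieved from the container by label space or by index, just like fields and scopings:

# Retrieve the subset mesh back from the container by its label space
subset_from_container = meshes_container.get_mesh({"variation": 1})
print(subset_from_container)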

Collection Operations and Iteration#

Collections support various operations for data manipulation and analysis.

Iterate Through Collections#

You can iterate through collections using different methods.

# Iterate through a FieldsContainer
print("Iterating through displacement fields by index:")
for i in range(3):  # Show the first three fields in the collection
    # Get the field at index i
    field = displacement_fc[i]
    # Get the label space for the field at index i
    label_space = displacement_fc.get_label_space(i)
    # Print field information
    max_value = field.data.max()
    print(f"  Field {i}: {label_space}, max value: {max_value:.6f}")

# Enumerate the scopings in a ScopingsContainer
print("\nEnumerate the scopings in a ScopingsContainer:")
for i, scoping in enumerate(scopings_container):
    # Get the label space for the scoping at index i
    label_space = scopings_container.get_label_space(i)
    # Print scoping information
    print(f"  Scoping {i}: {label_space}, size: {scoping.size}")
Iterating through displacement fields by index:
  Field 0: {'time': 1}, max value: 0.000315
  Field 1: {'time': 2}, max value: 0.001632
  Field 2: {'time': 3}, max value: 0.004094

Enumerate the scopings in a ScopingsContainer:
  Scoping 0: {'selection_type': 0}, size: 10
  Scoping 1: {'selection_type': 1}, size: 40
  Scoping 2: {'selection_type': 2}, size: 10

Filter and Select from Collections#

You can filter collections based on label values.

# Get specific fields from a FieldsContainer with criteria on label values
# Get all fields of ``custom_fc`` where ``zone=1``
zone_1_fields = custom_fc.get_fields({"zone": 1})
print(f"\nFields in custom_fc with zone=1:")
for field in zone_1_fields:
    print(field)

Fields in custom_fc with zone=1:
DPF  Field
  Location: Nodal
  Unit: 
  5 entities 
  Data: 1 components and 5 elementary data 

  IDs                   data
  ------------          ----------
  1                     1.000000e+00   
                        
  2                     2.000000e+00   
                        
  3                     3.000000e+00   
                        
  ...


DPF  Field
  Location: Nodal
  Unit: 
  5 entities 
  Data: 1 components and 5 elementary data 

  IDs                   data
  ------------          ----------
  1                     2.000000e+00   
                        
  2                     4.000000e+00   
                        
  3                     6.000000e+00   
                        
  ...
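
As mentioned in the introduction, operators can also consume an entire collection at once. As a sketch, the min_max_fc operator (from the min_max operator category) scans every field of a FieldsContainer and returns the overall minimum and maximum:

# Compute the overall minimum and maximum across all fields of the container
min_max_op = dpf.operators.min_max.min_max_fc(fields_container=displacement_fc)
overall_min = min_max_op.outputs.field_min()
overall_max = min_max_op.outputs.field_max()
print(overall_max)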


Other Built-in Collection Types#

DPF provides several built-in collection types for common DPF objects, implemented in their respective modules: FieldsContainer for Field objects, MeshesContainer for MeshedRegion objects, and ScopingsContainer for Scoping objects.

Additionally, specialized collection types for basic values, such as IntCollection, FloatCollection, and StringCollection, are available from collection_base.py.

These built-in collections are optimized for their respective DPF types and should be used when working with fields, meshes, scopings, or basic types. For other supported types, you can use the ansys.dpf.core.collection.Collection.collection_factory() method to create a custom collection class at runtime.

Using the Collection Factory#

Note

Collections can only be made for types supported by DPF. Attempting to use unsupported or arbitrary Python types will result in an error.

The ansys.dpf.core.collection.Collection.collection_factory() method allows you to create a collection class for any supported DPF type at runtime. This is useful when you want to group and manage objects that are not covered by the built-in collection types (such as FieldsContainer, MeshesContainer, or ScopingsContainer).

For example, you can create a collection for ansys.dpf.core.DataSources objects:

from ansys.dpf.core import DataSources
from ansys.dpf.core import examples
from ansys.dpf.core.collection import Collection

# Create a collection class for DataSources
DataSourcesCollection = Collection.collection_factory(DataSources)
ds_collection = DataSourcesCollection()
ds_collection.labels = ["case"]

# Add DataSources objects to the collection
ds1 = DataSources("path/to/first/result/file.rst")
ds2 = DataSources("path/to/second/result/file.rst")
ds_collection.add_entry({"case": 0}, ds1)
ds_collection.add_entry({"case": 1}, ds2)

# Show the collection
print(ds_collection)
DPF   Collection
  with 2 DataSources(s) 
  defined on labels: case 

  with:
  - data_sources 0 {case:  0} with . 
  - data_sources 1 {case:  1} with . 

This approach allows you to leverage the powerful labeling and grouping features of DPF collections for any supported DPF type, making your workflows more flexible and organized.