
Available online at www.sciencedirect.com

Mathematics and Computers in Simulation 82 (2012) 2951–2961

Original Articles

OOF3D: An image-based finite element solver for materials science


Valerie R. Coffman a,b, Andrew C.E. Reid b,c,∗, Stephen A. Langer a,b, Gunay Dogan a,b,d
a Information Technology Laboratory, National Institute of Standards and Technology, Gaithersburg, MD 20899, United States
b Material Measurement Laboratory, National Institute of Standards and Technology, Gaithersburg, MD 20899, United States
c Computer Integration and Programming Solutions, 4520 East West Highway, Suite 620, Bethesda, MD 20814, United States
d Theiss Research, 7411 Eads Avenue, La Jolla, CA 92037, United States

Received 26 April 2010; received in revised form 13 December 2011; accepted 13 March 2012
Available online 28 March 2012

Abstract
Recent advances in experimental techniques (micro-CT scans, automated serial sectioning, electron back-scatter diffraction, and
synchrotron radiation X-rays) have made it possible to characterize the full, three-dimensional structure of real materials. Such
new experimental techniques have created a need for software tools that can model the response of these materials under various
kinds of loads. OOF (Object Oriented Finite Elements) is a desktop software application for studying the relationship between the
microstructure of a material and its overall mechanical, dielectric, or thermal properties using finite element models based on real
or simulated micrographs. OOF provides methods for segmenting images, creating meshes of complex geometries, solving PDEs
using finite element models, and visualizing 3D results. We discuss the challenges involved in implementing OOF in 3D and create
a finite element mesh of trabecular bone as an illustrative example.
Published by Elsevier B.V. on behalf of IMACS.

PACS: 02.70.Dh; 02.70.−c

Keywords: Meshing; Mesh refinement; Microstructures; Finite element modeling; Shape quality; Homogeneity

1. Introduction

Future advances in materials science will come from integrating simulations and experiments. One of the key
challenges of realizing this lies in finding computationally tractable representations of complex geometries such as
multiphase material microstructures and biological materials. In this paper, we describe OOF3D (named for “Object Oriented
Finite Elements”, in three dimensions), a forthcoming software package for addressing this challenge. Previous
publications describe the current and previous two-dimensional versions of OOF [12,14–16,18,19].
The 2D OOF tool has been used successfully in a wide variety of applications, including studying the relationship
between thermal properties and the structure of plasma-sprayed zirconia coatings [13], the effects of microstructure
on the macroscopic mechanical properties of glass–matrix composites [5], the role of texture in the macroscopic
response of polycrystalline piezoelectric materials [10], and the modeling and design of electrode microstructures in
rechargeable lithium-ion batteries [11].

∗ Corresponding author at: Material Measurement Laboratory, National Institute of Standards and Technology, 100 Bureau Drive, Gaithersburg, MD 20899, United States. Tel.: +1 301 975 4946; fax: +1 301 975 4553.



http://dx.doi.org/10.1016/j.matcom.2012.03.003

When the OOF project began, fully 3D microstructure characterization data was rare. Since then, advances in
experimental techniques such as automated serial sectioning [25], 3D electron back-scatter diffraction (EBSD) [21],
and synchrotron X-rays [2,8], have produced an abundance of fully 3D microstructure characterization data. The
new data sets have created a greater need for software tools that can model the properties of materials with complex
microstructure geometries.
OOF works by generating a finite-element mesh from a two- or three-dimensional micrograph. Most meshing
algorithms require as input numerical representations of straight lines, flat planes, and simple curves that make up the
boundaries of the structure to be simulated. But for material microstructures, converting the boundaries to a simple
numerical representation by hand is a tedious task. Another method for forming a finite element mesh from an image is
to do so directly, defining a cube element for each voxel [3,9,20,27]. This has two major pitfalls. First, it usually creates
too many elements—homogeneous regions can adequately be described by larger elements. Second, it introduces
jagged edges where the boundary in the real material is smooth, which can lead to artificial features such as stress
concentrations at voxel corners.
OOF3D avoids these pitfalls by using a unique, image-based, adaptive meshing technique, which is a generalization
of the two-dimensional technique described in detail in [18]. This meshing scheme begins with a coarse, regular,
well-formed, space-filling mesh of tetrahedra on the image, and then brings that mesh into correspondence with the
image by a series of mesh-modifying steps which preserve the space-filling and well-formed character of the mesh.
Mesh modifying steps may refine elements, replacing them with smaller elements, or may move nodes and boundaries
around to align them with image features. Because the well-formed, space-filling character of the mesh is preserved,
unmeshable voids cannot arise, and illegal (inverted) elements can be avoided.
Simulations with OOF3D follow the same general procedure as with OOF2: the user loads a set of experimental
or simulated images or EBSD data constituting slices of the volumetric microstructure sample, groups the voxels into
material categories, assigns properties to material components, generates a finite element mesh, and performs virtual
experiments. OOF is used to visualise the microscopic response of the microstructure to external conditions, to compute
effective macroscopic material properties, or to design material microstructures with desired characteristics. For a brief
description of using OOF2 to analyse a particular 2D microstructure, see [19].
As with OOF2, OOF3D runs on Linux and Mac OS X. The installation requirements include a C++ compiler, a Python
interpreter, the X11 window system, and GTK2. A new requirement of OOF3D is VTK [22], which is used for 3D
visualization of microstructures, meshes, and numerical results.
In this paper, we discuss the new and interesting aspects involved in implementing the OOF image-based meshing
scheme in 3D. We discuss meshing trabecular bone as an illustrative example to show how OOF meshing methods
improve upon previous methods such as voxel to brick conversion.

2. 3D image-based meshing

As with OOF2, the OOF3D meshing scheme begins with image or EBSD data that has been segmented, that is,
broken up into distinct sets of voxels that correspond to the material phases of the microstructure. The OOF program
provides a suite of tools to assist the user in achieving a segmentation which matches the correct, user-provided physical
interpretation of the image. Image segmentation, and the obstacles to its full automation, is a rich topic beyond the
scope of this paper. For our purposes here, we will assume that a segmentation of the image exists, and also treat “voxel
group”, “voxel category”, and “material” as synonyms, although they have slightly differing technical meanings within
the OOF3D program.
While OOF2 allows mixed meshes of triangles and quadrilaterals, the current OOF3D code allows only tetrahedra.
This is a simplifying strategy for the initial approach to 3D mesh construction, which avoids complications in measuring
shape quality associated with meshes with mixed element types. These complications arise from the fact that the same
nodal displacement can give rise to different responses in differently-shaped elements, because the measures used to
assess shape quality have differing stiffnesses. Optimizing a mixed mesh as a whole will tend to give better shapes for
the elements with the stiffest shape quality measures, at the expense of the shapes with the softer quality measures, as
discussed in [18]. In OOF2, with triangular and quadrilateral element types, it was possible to make ad hoc changes
to the shape measures resulting in quality meshes. For the 3D case, the challenge is greater, as mixed meshes require
at least three types of elements (tetrahedra, hexahedra, and wedges, for instance), and it is more difficult to assess the

Fig. 1. The homogeneity of an OOF element is the ratio of the maximum element volume occupied by one material to the total element volume.
Black lines here represent the tetrahedral element, and red lines represent the boundary between two voxelized material regions. The voxels
(not shown) above and to the right of the red surface are one material, and the voxels below and to the left are another. Calculating the homogeneity
requires finding the volume overlaps of the element with the voxelized regions corresponding to the enclosed materials. (For interpretation of the
references to color in this figure legend, the reader is referred to the web version of the article.)

utility of the quantitative tools. Since the benefits of mixed meshes do not clearly outweigh these difficulties, we have
implemented only tetrahedral 3D meshes in this work.

2.1. Mesh quality

In order for the mesh modification routines to make decisions between various possible local modifications, two
mesh quality measures are introduced: element homogeneity, which reflects the degree to which the mesh represents
the microstructure, and shape quality, which aids in good convergence behavior in the finite-element solution steps.
We define two element functionals which we call “energies”. The homogeneity energy measures how well the mesh
matches the voxel regions, and the shape energy quantifies the quality of the shapes of elements. We use the word
“energy” because some of our mesh modification routines, such as the Anneal routine [18], move the mesh nodes as if
they were physical particles with potential energies given by the shape and homogeneity functionals. If the weighted
average of the two energies, summed over elements, is low, the mesh is a good finite element representation of the
microstructure.
The homogeneity (see Fig. 1) is a measure of how close an element is to enclosing a region that contains a single
material:
H = max_i {v_i} / V    (1)
where v_i is the volume within the element that belongs to material i and V is the total volume of the element. (Note that
v_i includes the fractional volume of voxels that intersect the element boundaries so that the functional is a continuous
function of the element node positions.) If an element is filled with only one material, H = 1. The corresponding
homogeneity energy is defined by

E_hom = 1 − H    (2)
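
To make Eqs. (1) and (2) concrete, here is a minimal sketch (in Python, which OOF already uses for its scripting layer; this is illustrative, not OOF3D source code) that evaluates H and E_hom for a single element. It assumes the per-material overlap volumes have already been computed, for example by the intersection algorithm described below.

```python
def homogeneity_energy(material_volumes, element_volume):
    """Homogeneity energy of one element, per Eqs. (1) and (2).

    material_volumes: dict mapping material id -> volume of the element
        occupied by that material (fractional voxel overlaps included).
    element_volume: total element volume V.
    """
    if element_volume <= 0.0:
        raise ValueError("element volume must be positive")
    H = max(material_volumes.values()) / element_volume   # Eq. (1)
    return 1.0 - H                                         # Eq. (2)

# Example: an element of volume 1.0 split 0.8 / 0.2 between two materials.
print(homogeneity_energy({"bone": 0.8, "void": 0.2}, 1.0))  # -> 0.2
```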

Calculating the element homogeneity in 3D involves a challenging computational geometry problem: finding the
volume intersection of a tetrahedral element with a non-convex, voxelized material region of an image. We have devel-
oped an algorithm that ignores the shared internal faces of contiguous material regions and uses careful bookkeeping
of topological data to find the element homogeneity in a way that is fast and robust. This algorithm is essential for
OOF3D to create efficient and well-formed meshes of complex geometries quickly from experimental data and will
be described in more detail in a future publication [6].

The core of the algorithm is to identify the facets of the volume intersection of a given tetrahedral element with
all the voxels of a given material region. This is a potentially complicated shape, but it is guaranteed to be bounded
by (possibly complicated) planar facets. For each facet, it is possible to compute a volume contribution by taking the
signed volume of the pyramid formed by the facet and an arbitrary point in the space—facets whose outward normals
point away from the arbitrary point count positively, and facets whose outward normals point towards the arbitrary
point count negatively. The volume of the shape is then the sum of the volumes of these pyramids.
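
A minimal sketch of this signed-pyramid summation follows, under the simplifying assumptions that the bounding facets have already been identified, are convex (so that a simple fan triangulation is valid), and are supplied with outward-oriented vertex order; producing those facets consistently is the hard part described in the rest of this section. The function name and input format are illustrative and are not part of OOF3D.

```python
import numpy as np

def volume_from_facets(facets, ref_point=(0.0, 0.0, 0.0)):
    """Volume enclosed by planar polygonal facets with outward-oriented
    (counter-clockwise when seen from outside) vertex order, computed as a
    sum of signed pyramid volumes with apex at an arbitrary reference point.

    Facets whose outward normals point away from the reference point
    contribute positively; those whose normals point toward it contribute
    negatively, as described in the text.
    """
    r = np.asarray(ref_point, dtype=float)
    volume = 0.0
    for facet in facets:
        pts = [np.asarray(p, dtype=float) - r for p in facet]
        # Fan-triangulate the planar facet and accumulate signed tetrahedron volumes.
        for i in range(1, len(pts) - 1):
            volume += np.dot(pts[0], np.cross(pts[i], pts[i + 1])) / 6.0
    return volume

# Sanity check: the six outward-oriented faces of a unit cube give volume 1.0.
cube = [
    [(0, 0, 0), (0, 1, 0), (1, 1, 0), (1, 0, 0)],   # z = 0 face, normal -z
    [(0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)],   # z = 1 face, normal +z
    [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)],   # y = 0 face, normal -y
    [(0, 1, 0), (0, 1, 1), (1, 1, 1), (1, 1, 0)],   # y = 1 face, normal +y
    [(0, 0, 0), (0, 0, 1), (0, 1, 1), (0, 1, 0)],   # x = 0 face, normal -x
    [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)],   # x = 1 face, normal +x
]
print(volume_from_facets(cube))  # -> 1.0
```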
In the general case, there are two classes of these facets, and they are discovered in different phases of the algorithm.
The first class consists of facets that are subregions of the planar voxel boundary regions. These facets are internal
to an element, and may or may not be delimited in the voxel boundary plane by the triangular cross section of the
tetrahedron. As these facets are found, the details of the intersection topology are recorded.
The second class of facets are those which are subregions of the tetrahedral element faces and delimited either by
the element edges or the intersection of the element face and voxel group boundary planes. For each tetrahedral face, it
is possible to construct an oriented trajectory consisting of the intersection of the voxel group boundary with the plane
of the tetrahedral face. The trajectory is oriented so that, when viewed from outside the tetrahedron, the interior of the
voxel group is on the left side of the trajectory. Because the tetrahedra are convex, this identification of the orientation
is unique. Using intersection data gathered in the first phase, and enumerating the signed intersections where the voxel
group trajectory enters and exits the element face, all of the bounding segments of the facet can be identified, after
which the volume contribution of the facet can be computed.
Because homogeneous elements are desirable, it is often the case that element boundaries and voxel group boundaries
coincide, either exactly or within round-off error. Indeed, many of the OOF meshing tools promote precisely this
situation. This illuminates the central difficulty of implementing this algorithm, which is that topological data must
be maintained consistently throughout the computation, even when iterating over separate objects. For example, if
an intersection point lies on one face of a tetrahedron, some distance from the edge, it cannot lie on another face,
even though, when independently computed for that face, it may appear to do so due to round-off errors. Inconsistent
topologies can lead to mis-identification of facet boundaries, and consequently large numerical errors in the voxel
group volume contributions. Careful attention to book-keeping, round-off, and order-of-operations, detailed in a future
publication [6], ensures that correct results are obtained efficiently.
For the shape energy, it is useful to have an energy function which has high values for tetrahedra that have high
aspect ratios, resembling needles or plates. These elements can lead to slow convergence of the finite element solver
[23]. For tetrahedral elements, the shape energy is taken to be

E_shape = 1 − 3^(7/4) V / (2^(3/2) A_rms^(3/2))    (3)

where A_rms represents the root-mean-square face area of the tetrahedron. E_shape is zero for regular tetrahedra and 1 for
flat tetrahedra. This measure is chosen because it is scale-invariant and varies smoothly with node positions [23,24].
These two measures can be combined into an effective element energy

E = α E_hom + (1 − α) E_shape    (4)

where α is a user-tunable parameter. In several of the mesh adaptation routines, the user may adjust α according to
whether the priority is improving shape quality or homogeneity.
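
For reference, a short sketch of Eqs. (3) and (4) for a single tetrahedral element, taking the four node positions as input; the homogeneity energy is passed in as a precomputed value, and the function names are illustrative rather than OOF3D's.

```python
import numpy as np

def shape_energy(nodes):
    """Shape energy of a tetrahedron, Eq. (3): 0 for a regular tetrahedron,
    approaching 1 for needle-like or plate-like (flat) elements.

    nodes: four 3D node positions.
    """
    a, b, c, d = (np.asarray(p, dtype=float) for p in nodes)
    volume = abs(np.dot(b - a, np.cross(c - a, d - a))) / 6.0
    faces = [(a, b, c), (a, b, d), (a, c, d), (b, c, d)]
    areas = [0.5 * np.linalg.norm(np.cross(q - p, r - p)) for p, q, r in faces]
    a_rms = np.sqrt(np.mean(np.square(areas)))     # root-mean-square face area
    return 1.0 - 3.0**1.75 * volume / (2.0**1.5 * a_rms**1.5)

def element_energy(e_hom, e_shape, alpha):
    """Combined element energy, Eq. (4), with user-tunable weight alpha."""
    return alpha * e_hom + (1.0 - alpha) * e_shape

# A regular tetrahedron (four alternating corners of a unit cube) has shape energy ~0.
regular = [(1, 1, 0), (1, 0, 1), (0, 1, 1), (0, 0, 0)]
print(shape_energy(regular))                                   # -> ~0.0
print(element_energy(0.2, shape_energy(regular), alpha=0.7))   # -> ~0.14
```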

2.2. Adaptive methods

The two primary categories for mesh adaptation routines used in OOF3D are routines that (1) move nodes but
preserve the connectivity or topology of the starting mesh (e.g. Anneal, Snap Nodes), and (2) routines that change
the topology by adding, removing, or reconnecting nodes (e.g. Refine, Snap Refine). Some mesh modifiers in the
first category (Anneal, Relax, Smooth, Fix Illegal, and Manual node motion) generalize in a straightforward way for
OOF3D by simply allowing the nodes to move in all three dimensions. See Ref. [18] for details of these methods.
OOF2 mesh modification routines that are specific to triangles and quadrilaterals (e.g. Split Quads, Swap Edges, and
Merge Triangles) have not been generalized for OOF3D.

In the following sections, we describe the 3D generalizations of the routines described in [18] that are significantly
different from their 2D counterparts, including Snap Nodes, Fix Illegal Elements, Refine, Snap Refine, and Rationalize.
We also describe a new method for smoothing the interfaces between materials.

2.2.1. Snap Nodes


Snap Nodes attempts to move (“snap”) element nodes directly to the boundaries of voxel categories. In OOF3D,
Snap Nodes follows a procedure generally similar to that in OOF2:

(1) Loop over the elements and mark the transition points between voxel categories along each edge of each element.
(2) Find all possible ways of moving an element’s nodes to the marked edge positions and assign a priority to each.
Consider moves that involve any number of the element’s nodes, unlike OOF2 which only considered one or two
nodes per move. Each marking configuration has a predetermined set of rules indicating how the nodes can be
moved to the transition points.
(3) Starting with the highest priority, and choosing randomly from snaps with equal priorities, choose node movements
and accept or reject them according to the criterion on E (Eq. 4) supplied by the user.

A key difference in Snap Nodes in 3D versus 2D is the number of marking configurations and snapping rules. In
2D, there were five marking configurations (topologically equivalent sets of edges marked by a transition point) for
two element types and a total of 10 different snapping rules. In 3D, there are eight different possible edge marking
configurations (one for one edge, two for two edges, three for three edges, and two for four edges) and a total of 142
different snapping rules. Fig. 2 shows examples of the 3D snap rules.
Currently, the priorities are assigned by the number of marked edges, with highest priority given to the elements
with the most edge marks. However, with such a large number of snap rules (each of which takes computer time to
consider) a systematic study of which rules improve homogeneity most efficiently is needed to optimize this method.
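
The priority-ordered accept/reject logic of steps (2) and (3) above can be sketched as follows. Here candidate_snaps, apply_move, undo_move, and energy are hypothetical callbacks standing in for the machinery described in the text, and the acceptance test shown (reject any move that raises E) is only one possible user-supplied criterion.

```python
import random

def snap_nodes_pass(elements, candidate_snaps, apply_move, undo_move, energy):
    """One illustrative Snap Nodes pass (not the OOF3D implementation).

    candidate_snaps(element) -> list of (priority, move) pairs; higher
        priority corresponds to more marked edges being snapped at once.
    apply_move / undo_move perform and revert a proposed node movement.
    energy(move) -> weighted element energy E (Eq. 4) of the region the
        move affects, evaluated at the current node positions.
    """
    candidates = [pm for el in elements for pm in candidate_snaps(el)]
    random.shuffle(candidates)                            # random order breaks ties...
    candidates.sort(key=lambda pm: pm[0], reverse=True)   # ...within each priority level
    for _, move in candidates:
        e_before = energy(move)
        apply_move(move)
        if energy(move) > e_before:   # example criterion: never increase E
            undo_move(move)
```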

Fig. 2. An example of a snap rule for each of the eight possible marking configurations. The original tetrahedra are indicated in gray, the red dots
indicate the transition points, the snapped tetrahedra are shown in blue, and the green arrows indicate the movement of the nodes. (For interpretation
of the references to color in this figure legend, the reader is referred to the web version of the article.)

2.2.2. Refine and Snap Refine


Refine and Snap Refine subdivide tetrahedral elements into smaller tetrahedra, creating more degrees of freedom in
the mesh. This is done in two steps. First the segments are marked for subdivision, and then each element is subdivided
according to a refinement rule chosen to be compatible with the subdivision of its segments. (A refinement rule is a
scheme for subdividing an element, given an edge marking configuration.) For Refine, the segments are subdivided
according to a user criterion whereas for Snap Refine, the positions of the new nodes are chosen to be the transition
points along the element segments. Second, each element with marked edges is replaced with a set of smaller elements.
While the Refine method in OOF2 allowed trisection and bisection of elements as well as the choice to preserve only
quadrilaterals or triangles, in OOF3D, only bisection is allowed and only tetrahedra are created. As with Snap Nodes,
the set of predetermined rules is much greater for 3D than for 2D. In 3D there are 10 different edge marking types and
a total of 49 different refinement rules.
In many cases, more than one refinement is consistent with a set of edge markings. These refinements will differ in
the way in which the element’s faces are subdivided. This leads to a constraint in 3D that is not present in 2D: the
faces of neighboring elements must be subdivided compatibly. If there are multiple possible refinements after taking
the constraints into account, the refinement that produces the lowest energy is selected.
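
When several refinement rules are compatible with an element's edge markings and with the face subdivisions already fixed by its neighbors, the choice can be sketched as a minimum-energy selection over the rule table; rules_for_marking, faces_compatible, and trial_energy are hypothetical hooks, not the OOF3D API.

```python
def choose_refinement(element, rules_for_marking, faces_compatible, trial_energy):
    """Select a subdivision rule for one element during Refine or Snap Refine.

    rules_for_marking: candidate refinement rules for the element's edge
        marking configuration (the table may include the center-node rules
        of Fig. 3 for markings that could otherwise dead-end).
    faces_compatible(element, rule): True if the rule subdivides each face
        the same way as any already-refined neighbor sharing that face.
    trial_energy(element, rule): total E (Eq. 4) of the child elements the
        rule would create, evaluated without committing the subdivision.
    """
    candidates = [r for r in rules_for_marking if faces_compatible(element, r)]
    return min(candidates, key=lambda r: trial_energy(element, r))
```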
For certain marking types, if all the neighbors have already been refined, a situation can arise in which the only way
to refine the element is to add a node to the center. Fig. 3 shows examples of refinement rules used in such cases. Since
this leads to a greater number of degrees of freedom, which we wish to minimize, elements with markings that have
the potential to fall into this trap are sorted out during the edge marking step and handled first.

2.2.3. Rationalize
Rationalize fixes badly shaped tetrahedra by performing local remeshing. Unlike OOF2, Rationalize in OOF3D does
not have subalgorithms. Rationalize searches for tetrahedra that have extreme angles in the triangular faces, extreme
dihedral angles between pairs of faces, or extreme solid angles at nodes. For elements with small face-corner angles
or dihedral angles (Fig. 4(a)), the nodes opposite the small angle are merged, removing all elements that shared the
removed segment. For an element with one node that has a large solid angle (Fig. 4(b)), the element and the neighboring
element that shares the face opposite the node with the large angle are rearranged into three elements. For elements

Fig. 3. Examples of refinement rules where a node must be added to the center. These cases arise when the chosen refinements of the neighboring
elements have predetermined the face refinements of the element in question. The red dots represent the nodes added at transition points along the
edges of the original tetrahedra. The light blue dot represents the additional new node, located at the center of the original element. (For interpretation
of the references to color in this figure legend, the reader is referred to the web version of the article.)

Fig. 4. Examples of different types of badly shaped elements. The node on the left in (a) has small face-corner angles, making the element needle-like.
The top node in (b) has a large solid angle, making the overall element flat. The two segments in (c) that nearly form an ‘X’ have very large dihedral
angles, also making the element very flat.

for which all four nodes are nearly in the same plane, with two segments that nearly form an ‘X’ (Fig. 4(c)), a node is
added where the segments nearly intersect and all neighbors that share one of these segments are divided in half.
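
As an illustration of the detection step, the sketch below computes the interior dihedral angles of a tetrahedron and flags extreme values; the thresholds are arbitrary examples, not the values used by Rationalize, and the corrective remeshing itself is not shown.

```python
import itertools
import numpy as np

def dihedral_angles(nodes):
    """Interior dihedral angles (radians) of a tetrahedron, one per edge."""
    pts = [np.asarray(p, dtype=float) for p in nodes]
    angles = []
    for i, j in itertools.combinations(range(4), 2):
        k, l = [m for m in range(4) if m not in (i, j)]
        edge = pts[j] - pts[i]
        edge /= np.linalg.norm(edge)
        # Project the two opposite vertices into the plane normal to the edge;
        # the dihedral angle is the angle between the projected directions.
        u = (pts[k] - pts[i]) - np.dot(pts[k] - pts[i], edge) * edge
        v = (pts[l] - pts[i]) - np.dot(pts[l] - pts[i], edge) * edge
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angles.append(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return angles

def is_badly_shaped(nodes, lo=np.radians(10.0), hi=np.radians(165.0)):
    """Flag a tetrahedron with extreme dihedral angles (illustrative thresholds)."""
    angles = dihedral_angles(nodes)
    return min(angles) < lo or max(angles) > hi

# A regular tetrahedron passes; a nearly flat, 'X'-like tetrahedron is flagged.
print(is_badly_shaped([(1, 1, 0), (1, 0, 1), (0, 1, 1), (0, 0, 0)]))        # -> False
print(is_badly_shaped([(0, 0, 0), (1, 0, 0), (0, 1, 0.01), (1, 1, 0.01)]))  # -> True
```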

2.2.4. Surface Smooth


The Surface Smooth routine, like the Smooth routine described in Ref. [18], implements an algorithm similar to
Laplacian smoothing. Surface Smooth only acts on the nodes on the internal boundaries between elements with different
materials. It moves each node towards the average position of its neighbors on the boundary if the move meets the
given criterion (such as lowering the local energy) and does not create illegal elements. This tends to smooth out the
surface, eliminating dents and spikes.
The procedure for a single iteration of Surface Smooth is as follows:

(1) Create a list of the nodes that are on the boundary of elements that represent different materials and reorder them
randomly to remove any potential artifacts from the original ordering of nodes.
(2) Give each node a chance to move towards the average position of its neighbors on the surface, subject to a damping
factor: OOF computes the new position as x_new = x_old + γ (x_avg − x_old), where γ is the damping factor.
(3) After moving each node, an acceptance criterion is used to decide whether the move is kept. Moves that create
illegal elements are automatically rejected.
(4) If a legal move is unacceptable according to the acceptance criterion, OOF may still accept the move if the
smoothing is done at a non-zero temperature. See Ref. [18] for more on the criteria for accepting moves.

The damping factor γ should be between 0 and 1. The apparently obvious selection of γ = 1 is undesirable, because
of the node-at-a-time nature of the algorithm. Moving nodes directly to the average of their neighbors causes secondary
surface oscillations, and does not efficiently converge to the true surface. We have found empirically that relatively
low values of γ, less than 0.5, work well. A value of γ = 0.2 is used in the example below.
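
A minimal sketch of one such pass follows, with the surface-neighbor lookup, the legality check, and the acceptance criterion left as hypothetical callbacks; the finite-temperature acceptance of step (4) is omitted, and this is illustrative rather than the OOF3D implementation.

```python
import random

def surface_smooth_pass(boundary_nodes, surface_neighbors, position, move_node,
                        is_acceptable, gamma=0.2):
    """One damped, node-at-a-time pass of Surface Smooth (illustrative only).

    surface_neighbors(node): neighboring nodes on the same material boundary.
    position(node): current coordinates of a node as an (x, y, z) tuple.
    move_node(node, new_pos): attempt the move; returns False (leaving the
        node unmoved) if the move would create an illegal element.
    is_acceptable(node, old_pos, new_pos): the user-supplied criterion,
        e.g. that the local energy does not increase.
    """
    nodes = list(boundary_nodes)
    random.shuffle(nodes)            # step (1): remove ordering artifacts
    for node in nodes:
        neighbors = surface_neighbors(node)
        if not neighbors:
            continue
        old = position(node)
        avg = tuple(sum(position(n)[k] for n in neighbors) / len(neighbors)
                    for k in range(3))
        # Step (2), damped update: x_new = x_old + gamma * (x_avg - x_old)
        new = tuple(old[k] + gamma * (avg[k] - old[k]) for k in range(3))
        # Step (3): reject illegal moves (move_node returns False) and
        # revert legal moves that fail the acceptance criterion.
        if move_node(node, new) and not is_acceptable(node, old, new):
            move_node(node, old)
```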

3. Trabecular bone

Trabecular bone is composed mostly of calcium hydroxyapatite in a complex, sponge-like microstructure that is
difficult to mesh. Fig. 5 shows a sample of trabecular bone visualized with OOF3D. This data set was collected by
Perilli et al. and provided publicly to other research groups [1,17].
Recent studies suggest that the risks of osteoporosis are linked not only to bone density, but that the microscale
structure of trabecular bone also plays an important role [3]. Previous finite element studies of trabecular bone have
used meshes created by converting voxels to brick elements, an approach with the pitfalls described above: too many elements
and the introduction of voxelization artifacts [3,27]. These studies are limited to length scales of less than about 5 mm and
to simulating linear elastic and yield properties. For the OOF3D mesh described below, the overall homogeneity index, defined
as the volume-averaged homogeneity of all the elements, is 0.977.
Fig. 6 shows an OOF3D mesh of the data shown in Fig. 5. The steps for creating this mesh are representative of
the general procedure, and can serve as a tutorial of sorts in operating the mesh modification tools. The quantitative
choices for the parameters here are largely empirical, although there is also an underlying informal strategy which

Fig. 5. 3D micro-CT scan of a trabecular bone biopsy taken from the femoral neck of a 61-year-old human male. This sample is 200 × 200 ×
195 voxels and represents approximately 3.9 mm × 3.9 mm × 3.8 mm. The solid bone is represented in white and the red fog is included to add
a perception of depth. We have used the “burn” function within OOF to find the one contiguous portion of solid bone, discarding the fragments
along the boundaries. Of 7.8 × 10^6 total voxels, approximately 1.33 × 10^6 are classified as bone, giving a bone density of approximately 17.1%.
(For interpretation of the references to color in this figure legend, the reader is referred to the web version of the article.)

Fig. 6. A finite element mesh created with OOF3D from the data shown in Fig. 5.

emphasizes homogeneity in the early steps of the mesh refinement, and element shape later on, once the surface has
been well characterized and surface nodes placed correctly. Additionally, because many of the steps proceed element
by element, and can generate new modification candidates as a side-effect of their operation, it is often useful to apply
node-movement modifiers more than once, as in the fourth item below.
The detailed steps are as follows:

(1) Create an initial mesh grid with 10 × 10 × 10 elements for a total of 5000 initial elements.1
(2) Refine all elements with homogeneity less than 1 using α = 0.7.
(3) Refine all elements with homogeneity less than 0.95 using α = 0.7.
(4) Snap Nodes three times on all elements with a homogeneity of less than 0.9, subject to the criterion of increasing
average energy with α = 0.9.
(5) Smooth with internal boundary nodes pinned, subject to the criterion of increasing average energy with α = 0.3 for
5 iterations.
(6) Rationalize badly shaped elements three times with a shape energy greater than 0.9 subject to the criterion of
increasing average energy with α = 0.3.
(7) Snap Refine all elements with homogeneity less than 0.9 using α = 0.7.
(8) Snap Nodes three times on all elements with a homogeneity of less than 0.9, subject to the criterion of increasing
average energy with α = 0.9.
(9) Surface Smooth subject to the criterion of increasing average energy with α = 1 and γ = 0.2, for 30 iterations.

The resulting mesh contains a total of 144,475 elements, 52,458 of which represent bone. Relative to voxel to brick
conversion, which would require one element per voxel, this is a greater than 50-fold reduction in element count for the
overall mesh (7.8 × 10^6 voxels versus 144,475 elements) and a 25-fold reduction for the bone elements alone (1.33 × 10^6
bone voxels versus 52,458 bone elements). The volume fraction of the mesh representing bone is 16.6%, compared to 17.1% of the image volume representing
bone.
Using the current version of the code, the trabecular bone example above is manageable on a typical desktop or
laptop computer. It requires several gigabytes of memory, and takes a few tens of hours to complete the script. It is
principally CPU bound, and so is likely to benefit substantially from future efforts at parallelization, which will be
required in any case for the OOF application to scale well into the 3D realm.

4. Conclusion

We have described the updates to measures of mesh quality and mesh modification routines from Ref. [18] necessary
to generalize the OOF tool to three dimensions. We have demonstrated our three-dimensional image-based meshing
technique on trabecular bone and found a 50-fold improvement in mesh efficiency versus meshing techniques that
create a number of elements on the same order as the number of voxels. In the case of trabecular bone, this will
drastically increase the scope of possible simulations. This more powerful meshing capability will allow larger scale
simulations with a greater range of physical phenomena including plasticity, fracture and remodeling (the process by
which bone is broken down and rebuilt).
Future plans include parallelizing the mesh-modifying methods, adding interface properties for cohesive zone
models of fracture [4,26,28], and integrating the multiscale method of overlapping finite elements and molecular dynamics
(OFEMD) [7] with the 3D image-based meshing of OOF3D. The result will be the capability to perform fully atomistic,
fully continuum, or multiscale fracture simulations based on realistic geometries. If the microstructure characterization
technique is nondestructive, such fracture simulations can be compared directly to fracture experiments done on the
same sample. By refining simulation techniques through direct comparisons to experiments, we could potentially
create a simulation method that can accurately predict the fracture behavior of a microstructure, allowing us not only to
gain greater insight into which structural features of microstructures determine fracture properties, but also to design new
material microstructures with desired properties.

1 The microstructure space is divided into hexahedra, which are each divided into 5 tetrahedra.

Acknowledgments

We acknowledge the valuable contributions of Seung-Ill Haan, Rhonald C. Lua, R. Edwin García, Craig Carter, and
Ed Fuller to previous versions of OOF on which OOF3D is based. We also acknowledge NIST support for OOF3D
through the grant #70NANB6H6149.
The OOF2 program, including source code, is available at http://www.ctcms.nist.gov/oof/oof2/. OOF2 was produced
by NIST, an agency of the U.S. government, and by statute is not subject to copyright in the United States. Recipients
of the software assume all responsibilities associated with its operation, modification and maintenance.
To be notified when OOF3D is available, add yourself to the OOF email list at http://www.ctcms.nist.gov/oof/
mail.html.

References

[1] F. Baruffaldi, E. Perilli, X-ray MicroCT scans of human bone biopsies, Webpage, 2009, http://www.biomedtown.org/biomedtown/LHDL/Reception/datarepository/repositories/BelRepWikiPages/XRayMicroCTScansOfHumanBoneBiopsies.
[2] D.H. Bilderback, C. Sinclair, S.M. Gruner, The status of the energy recovery linac source of coherent hard X-rays at Cornell University,
Synchrotron Radiation News 19 (6) (2006) 30.
[3] B. Borah, G.J. Gross, T.E. Dufresne, T.S. Smith, M.D. Cockman, P.A. Chmielewski, M.W. Lundy, J.R. Hartke, E.W. Sod, Three-dimensional
microimaging (MRμI and μCT), finite element modeling, and rapid prototyping provide unique insights into bone architecture in osteoporosis,
The Anatomical Record 265 (2001) 101–110.
[4] G.T. Camacho, M. Ortiz, Computational modelling of impact damage in brittle materials, International Journal of Solids and Structures 33
(1996) 2899–2938.
[5] V. Cannillo, T. Manfredini, M. Montorosi, A. Boccaccini, Investigation of the mechanical properties of Mo-reinforced glass–matrix composites,
Journal of Non-Crystalline Solids 344 (2004) 88–93.
[6] V.R. Coffman, A.C. Reid, S.A. Langer, Element homogeneity: calculating the intersection of finite elements with voxelized regions, in
preparation.
[7] V.R. Coffman, J.P. Sethna, J. Bozek, A. Ingraffea, N.P. Bailey, E.I. Barker, Challenges in continuum modeling of intergranular fracture, Strain,
47 (Suppl. 2) (2011) 99–104.
[8] K. Finkelstein, I. Bazarov, M. Liepe, Q. Shen, D. Bilderback, S. Gruner, A. Kazimirov, Energy recovery linac: a next generation source for
inelastic X-ray scattering, Journal of Physics and Chemistry of Solids 66 (2005) 2310.
[9] E. Garboczi, D. Bentz, N. Martys, Digital images and computer modeling, in: P.-Z. Wong (Ed.), Methods in the Physics of Porous Media,
Academic Press, San Diego, CA, 1999, pp. 1–41.
[10] R.E. García, W.C. Carter, S.A. Langer, The effect of texture and microstructure on the macroscopic properties of polycrystalline piezoelectrics:
Application to barium titanate and PZN-PT, Journal of the American Ceramic Society 88 (3) (2005) 750–757.
[11] R.E. García, Y.-M. Chiang, W.C. Carter, P. Limthongkul, C.M. Bishop, Microstructural modeling and design of rechargeable lithium-ion
batteries, Journal of the Electrochemical Society 152 (1) (2005) A255–A263.
[12] R.E. García, A.C.E. Reid, S.A. Langer, W.C. Carter, Microstructural modeling of multifunctional material properties: the OOF project, in: D.
Raabe, F. Roters, F. Barlat, L.-Q. Chen (Eds.), Continuum Scale Simulation of Engineering Materials, Wiley-VCH, 2004.
[13] A.D. Jadhav, N.P. Padture, E.H. Jordan, M. Gell, P. Miranzo, E.R. Fuller Jr., Low-thermal conductivity plasma-sprayed thermal barrier coatings
with engineered microstructures, Acta Materialia 54 (12) (2006) 3343–3349.
[14] S.A. Langer, E.R. Fuller Jr., W.C. Carter, OOF: An image-based finite-element analysis of material microstructures, Computers in Science and
Engineering 3 (3) (2001) 15–23.
[15] S.A. Langer, A. Reid, S.-I. Haan, R.E. García, The OOF2 Manual, website, http://www.ctcms.nist.gov/∼langer/oof2man/index.html.
[16] S.A. Langer, A.C.E. Reid, R.E. García, S.-I. Haan, R.C. Lua, W.C. Carter, E.R. Fuller Jr., A. Roosen, OOF: Analysis of Real Material
Microstructures, Webpage, http://www.ctcms.nist.gov/oof/index.html.
[17] E. Perilli, F. Baruffaldi, M. Visentin, B. Bordini, F. Traina, A. Cappello, M. Viceconti, MicroCT examination of human bone specimens: effects
of polymethylmethacrylate embedding on structural parameters, Journal of Microscopy-Oxford 225 (2) (2007) 192–200.
[18] A.C.E. Reid, S.A. Langer, R.C. Lua, V.R. Coffman, S.-I. Haan, R.E. García, Image-based finite element mesh construction for material
microstructures, Computational Materials Science 43 (2008) 989–999.
[19] A.C.E. Reid, R.C. Lua, R.E. García, V.R. Coffman, S.A. Langer, Modelling microstructures with OOF2, International Journal of Materials and
Product Technology 35 (3–4) (2009) 361–373.
[20] A. Roberts, E. Garboczi, Elastic properties of model porous ceramics, Journal of the American Ceramic Society 83 (2000) 3041–3048.
[21] A.D. Rollett, S.-B. Lee, R. Campman, G. Rohrer, Three-dimensional characterization of microstructure by electron back-scatter diffraction,
Annual Review of Materials Research 37 (1) (2007) 627–658, http://arjournals.annualreviews.org/doi/abs/10.1146/annurev.matsci.37.052506.084401.
[22] W. Schroeder, K. Martin, B. Lorensen, The Visualization Toolkit: An Object-Oriented Approach to 3D Graphics, Prentice Hall, 2006.
[23] J. Shewchuk, What is a good linear element? interpolation, conditioning, and quality measures, in: Eleventh International Meshing Roundtable,
Sandia National Laboratories, Ithaca, NY, 2002, pp. 115–126.

[24] J. Shewchuk, What is a good linear finite element? Interpolation, conditioning, anisotropy, and quality measures, preprint,
http://www.cs.cmu.edu/∼jrs/jrspapers.html.
[25] J.E. Spowart, Automated serial sectioning for 3-D analysis of microstructures, Scripta Materialia 55 (2006) 5–10.
[26] V. Tvergaard, J.W. Hutchinson, The relation between crack growth resistance and fracture process parameters in elastic–plastic solids, Journal
of the Mechanics and Physics of Solids 40 (6) (1992) 1377–1397.
[27] B. van Rietbergen, H. Weinans, R. Huiskes, A. Odgaard, A new method to determine trabecular bone elastic properties and loading using
micromechanical finite-element models, Journal of Biomechanics 28 (1) (1995) 69–81.
[28] X.P. Xu, A. Needleman, Numerical simulations of fast crack growth in brittle solids, Journal of the Mechanics and Physics of Solids 42 (9)
(1994) 1397–1434.
