Using a Graph Neural Network to Learn Mechanical Properties From 3D Lattice Geometry

Elissa Ross
MESH Consultants
6 min read · Oct 7, 2020


The study of metamaterials aims to extract extraordinary behaviours from ordinary materials through careful design on the scale of microstructure. At MESH we are particularly interested in the role of geometry in determining the macrostructural behaviour of such an “architected material”.

An example of a metamaterial: the geometry of the lattice determines the emergent material property of the volume.

Additive manufacturing is a promising method for developing metamaterials: while the material (resin, polymer etc.) printed by the machine is typically of one type (with a predetermined stiffness etc.), we can achieve varying properties and compressive behaviours by changing the geometry of the print. Symmetric lattices are particularly appealing from a design perspective, and it is possible to attain a huge spectrum of material behaviour through a variation of the underlying geometry (one of my favourite papers on this topic is Panetta et al., 2015, which we draw on to generate our lattice data). However, the material properties of these lattices are typically assessed through a finite element simulation, which can be costly and time-consuming. The question motivating our recent research was:

Can we use machine learning to create a model that will predict lattice performance from lattice geometry?

Lattice Data & Learning Framework

The first challenge in this project is that lattice data is complex and multi-faceted. It involves both combinatorial (how many edges are present, and which nodes do they connect?) and geometric (where are those nodes in space, and what symmetries are present?) considerations. As a result, the space of possible lattices with cubic symmetry is infinite, which necessitates the definition of a sampling strategy. We chose to limit our attention to lattices that might, on some advanced hardware, be 3D-printable (e.g. low edge counts and low node valence).
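The combinatorial/geometric split can be made concrete with a small data structure. This is an illustrative sketch only (the class and field names are our own, not from the published code): the edge list carries the combinatorics, while the node positions carry the geometry.

```python
from dataclasses import dataclass

@dataclass
class UnitCell:
    """A cubic lattice unit cell: combinatorics (edges) + geometry (positions)."""
    positions: list  # node coordinates, each a (x, y, z) tuple in [0, 1]^3
    edges: list      # pairs (i, j) of node indices

    def node_valence(self, i):
        """Number of edges incident to node i (kept low for printability)."""
        return sum(1 for a, b in self.edges if i in (a, b))

# A minimal example: two opposite cube corners braced through the centre node.
cell = UnitCell(
    positions=[(0, 0, 0), (1, 1, 1), (0.5, 0.5, 0.5)],
    edges=[(0, 2), (2, 1)],
)
print(cell.node_valence(2))  # the centre node connects to both corners
```

Sampling strategies can then filter candidate cells by properties like `node_valence` before any simulation is run.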

Simple schematic of the GNN used to predict lattice compression.

Given that the lattices we are interested in are 3-dimensional, we considered a number of learning approaches within the emerging field of Geometric Deep Learning (aka deep learning on graphs and manifolds). We settled on using a graph neural network, as implemented in PyTorch Geometric. This allowed us to pass the lattice information to the network as an adjacency matrix, together with features on the nodes and edges of the lattices that describe the geometry. Specifically we used the implementation of the paper Neural message passing for quantum chemistry by Gilmer et al. (2017). In that work, the authors view graph neural networks as “message-passing” schemes, in which nodes have messages, and these are passed and aggregated according to the connections in the underlying graph.
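The message-passing idea can be illustrated with one toy round in plain NumPy: each edge computes a message from its source node's features together with the edge's own features, messages arriving at a node are summed, and a nonlinearity is applied. The shapes and update rule below are our own simplification for illustration, not the actual Gilmer et al. architecture or the PyTorch Geometric implementation.

```python
import numpy as np

def message_passing_step(h, edge_index, edge_attr, W):
    """One round of message passing with sum aggregation.
    h: [N, F] node features; edge_index: [E, 2] (src, dst) pairs;
    edge_attr: [E, G] edge features; W: [F + G, F] weight matrix."""
    h_new = np.zeros_like(h)
    for (src, dst), e in zip(edge_index, edge_attr):
        msg = np.concatenate([h[src], e]) @ W  # message from src along this edge
        h_new[dst] += msg                      # sum-aggregate at the destination
    return np.tanh(h_new)                      # simple elementwise nonlinearity

# Two nodes joined by one edge (listed in both directions, as in a graph
# adjacency); the edge feature could be, e.g., the strut length.
h = np.array([[1.0, 0.0], [0.0, 1.0]])
edge_index = np.array([[0, 1], [1, 0]])
edge_attr = np.array([[0.5], [0.5]])
W = np.eye(3)[:, :2]  # fixed weights just for the demo; these would be learned
print(message_passing_step(h, edge_index, edge_attr, W))
```

Stacking several such rounds lets information propagate along lattice struts, after which a readout over all nodes produces a single per-lattice prediction.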

Datasets and Simulations

To train the model we created several datasets consisting of lattice unit cells (a basic fundamental unit that can be repeated over and over to obtain what we think of as a lattice; for a cubic lattice like the ones studied here, the unit cell sits inside a cube). We conducted a basic compression simulation using Kangaroo, a plugin for Rhinoceros3D and Grasshopper. Each lattice cell is defined on the interior of a 1m cube, and we apply periodic boundary conditions to approximate a larger section of lattice geometry. A virtual force is then applied to this cell, and we measure the maximum compression under this force. While this is only an approximation of full finite element analysis, the technique quickly gave us some “black box” simulation data that we could use to train a machine learning model.
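One ingredient of a periodic boundary setup is identifying nodes on opposite faces of the unit cube that are translates of one another, so the cell behaves like a sample of an infinite tiling. The helper below is our own minimal sketch of that pairing step (the actual simulation used Kangaroo, whose internals differ):

```python
import numpy as np

def periodic_pairs(positions, axis, tol=1e-9):
    """Pair each node on the face x[axis] = 0 with its translate on x[axis] = 1."""
    pos = np.asarray(positions, dtype=float)
    low = [i for i, p in enumerate(pos) if abs(p[axis]) < tol]
    high = [i for i, p in enumerate(pos) if abs(p[axis] - 1.0) < tol]
    pairs = []
    for i in low:
        shifted = pos[i].copy()
        shifted[axis] = 1.0  # translate across the cube by one edge length
        for j in high:
            if np.allclose(pos[j], shifted, atol=tol):
                pairs.append((i, j))
    return pairs

# A strut crossing the cell: the two face nodes get identified along x.
nodes = [(0.0, 0.5, 0.5), (1.0, 0.5, 0.5), (0.5, 0.5, 0.5)]
print(periodic_pairs(nodes, axis=0))
```

Each identified pair is then constrained to move together in the simulation, which approximates the larger lattice section mentioned above.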

Top Row: Samples from the “All Lattices” dataset. Middle Row: Samples from the “One Type” dataset (these are three different geometric realizations of a single lattice topology). Bottom Row: Samples from the “One Type Morphed” dataset.

With this simulation engine we processed over 100K lattices in a number of different datasets including:

  1. The “All Lattices” dataset consists of approximately 6K different combinatorial lattice types (i.e. 6K non-isomorphic graphs), each with 4 different geometric realizations (i.e. embeddings in 3-space) of the nodes, resulting in ~24K lattices.
  2. The “One Type” dataset consists of a single combinatorial lattice, with 25K different positions for the nodes.
  3. The “One Type Morphed” dataset consists of 25K morphed versions of the “One Type” dataset. The morphs were defined by a random box morph of a cube enclosing the lattice unit cell.
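A random box morph can be realized as trilinear interpolation: displace the 8 corners of the enclosing cube, then move every lattice node by the trilinearly weighted average of those corner displacements. The corner ordering and jitter magnitude below are our own choices for illustration:

```python
import numpy as np

def box_morph(points, corner_offsets):
    """points: [N, 3] coordinates in [0, 1]^3.
    corner_offsets: [2, 2, 2, 3] array of displacements for corner
    (i, j, k), where (i, j, k) ranges over {0, 1}^3."""
    pts = np.asarray(points, dtype=float)
    offs = np.asarray(corner_offsets, dtype=float)
    out = pts.copy()
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    for i in (0, 1):
        for j in (0, 1):
            for k in (0, 1):
                # trilinear weight of corner (i, j, k) at each point
                w = ((x if i else 1 - x) * (y if j else 1 - y)
                     * (z if k else 1 - z))
                out += w[:, None] * offs[i, j, k]
    return out

rng = np.random.default_rng(0)
offsets = rng.uniform(-0.1, 0.1, size=(2, 2, 2, 3))  # random corner jitter
morphed = box_morph([[0.5, 0.5, 0.5]], offsets)
```

A node at a cube corner moves by exactly that corner's offset, while interior nodes blend all eight, so the morph deforms the whole cell smoothly.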

Results

When we applied the graph neural network to the lattice data, we did not see satisfactory results for the “All Lattices” dataset. While the model did perform slightly better than a dummy regressor, it did not learn enough to be applicable as a predictive model. This is likely because the dataset is simply not large enough to capture the heterogeneity of the space of all symmetric lattices.

In contrast, the accuracies for the “One Type” and “One Type Morphed” datasets were 92.42% and 86.29% respectively (measured using mean absolute percentage error). Perhaps more interestingly, we discovered that we could also train the model on much smaller subsets of the original datasets (reducing from 25K to 10K or even 2K lattices) with only small reductions in accuracy. This suggests that it would be possible to generate a much larger version of the “All Lattices” dataset with 2K random samples per lattice topology. By the standards of contemporary deep learning, even this enlarged dataset is not particularly large.
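One natural reading of an accuracy figure measured via mean absolute percentage error is accuracy = 100% − MAPE. The sketch below uses made-up numbers and our own helper name, purely to show the arithmetic:

```python
import numpy as np

def mape_accuracy(y_true, y_pred):
    """Accuracy as 100% minus the mean absolute percentage error."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mape = 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
    return 100.0 - mape

# e.g. simulated vs predicted maximum compressions (illustrative values)
print(mape_accuracy([0.10, 0.20, 0.40], [0.11, 0.19, 0.42]))
```

Note that MAPE weights errors relative to the true value, so small-compression lattices contribute as much to the score as very compressive ones.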

Application to Design

As a design application, we consider geometries consisting of volumes filled with lattices that are being modified (pushed, pulled, stretched, etc.). In this setting, our model offers real-time structural feedback to the designer. In the examples shown here, green lattice blocks are stiffer than expected (0.5X compression), yellow blocks represent the expected (1.0X) compression, and red blocks are more compressive than expected (1.5X).

The colours represent the ratio of the predicted compression to the baseline undeformed simulation compression. Yellow is 1.0 (the same compression), green is 0.5X compression and red is 1.5X compression.

A challenge of this prediction methodology is that it is not possible to directly apply the trained model to predict compression behaviour for arbitrary finite chunks of lattices. For instance, one may be interested in the compressive behaviour of a manufactured part that is “filled” with lattice geometry. The ML model described here will not be able to predict this behaviour as a whole.

Conclusion & Further Questions

To our knowledge, this is the first instance of a machine learning application to predict material properties from 3-dimensional lattice geometries (learning the property from the structure). However, we are really more interested in the property-structure relationship, especially in the context of design. That is, we want to solve the inverse problem: given a target stiffness (or other material characteristic), what lattice possesses this property? This problem is naturally much more challenging: lattices are determined by a large number of design parameters that can vary independently. The solutions are therefore not unique, and defining a loss function that can cope with this non-uniqueness is critical to the success of the inverse problem. The forward model established through this research gives some clues about how we might potentially run the problem backward to output lattices.

A key part of this project was that the compression simulation was essentially a stand-in for any kind of analysis results on lattice datasets. The same machine learning pipeline could be applied to other types of properties, which we believe has great potential for the development of performance-aware design tools.

A write-up of the full details of the method described here has been accepted for publication in the proceedings of Advances in Architectural Geometry 2020 (now taking place in April 2021). This is joint work with Daniel Hambleton (MESH Consultants Inc.).
