Designing optimized specimens for composite materials: from normative testing to simulation-assisted identification
This article is based on the M2 internship work of Antoine Vintache, carried out in collaboration with François Hild of the Laboratoire de Mécanique de Paris-Saclay and presented at the ECCM21 conference [1]. This work is supported by ArianeGroup.
In experimental mechanics, tests on composite materials are often based on standardized specimen geometries. These geometries – uniaxial tension, three-point bending, Iosipescu shear, etc. – were designed in a context where measurement resources were limited to point sensors, mainly strain gauges. They were therefore often developed with the aim of preserving a homogeneous strain state, and they are still widely used today, particularly in the context of standards.
However, developments in measurement tools, particularly full-field techniques such as Digital Image Correlation (DIC), are calling into question the suitability of these geometries. These methods make it possible to observe displacement and strain fields in detail over the entire surface, opening up a much richer exploitation of each test.
At the same time, the use of model updating tools, based on test results, makes it possible to identify several material parameters from a single test. In this context, specimen geometry becomes an optimization variable, which can be adapted to maximize the amount of useful information extracted from the test.
The problem of parameter identifiability
Not all material parameters are necessarily identifiable from a given test. The sensitivity of a parameter is defined as the relative variation in the measured quantities (e.g., displacement fields) induced by a variation of this parameter. If this sensitivity is low relative to the measurement uncertainties, identifying the parameter becomes difficult.
Tests that comply with existing standards can therefore fail to identify certain constitutive parameters. This lack of identifiability can remain invisible if it is not explicitly evaluated, leading to identification errors, notably linked to couplings between parameters, or to excessive dependence on numerical regularization.
In this context, it becomes necessary to rely on a quantitative approach to parameter identifiability, integrating measurement uncertainties and couplings between parameters. This makes it possible not only to assess the relevance of a given test, but also to optimize its design to maximize the amount of information that can be exploited.
Model fitting approach with EikoTwin Digital Twin
The identification of material parameters can be formulated as an inverse optimization problem: adjusting the parameters of a finite element model (FEM) to reduce the discrepancy between simulation results and experimental measurements. The method used here is based on Weighted Finite Element Model Updating (FEMU), integrated into the EikoTwin Digital Twin environment.
The aim is to minimize a cost function based on the Mahalanobis distance between measured and simulated data, taking explicit account of experimental uncertainties:

χ²(p) = [Uₘ − U_FE(p)]ᵀ C_U⁻¹ [Uₘ − U_FE(p)] + [Fₘ − F_FE(p)]ᵀ C_F⁻¹ [Fₘ − F_FE(p)]

where F denotes the force, U the displacement fields, p the vector of material parameters, the subscripts m and FE refer to measured and simulated data, respectively, and the covariance matrices C_U and C_F model the uncertainties on each data source.
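As a minimal illustration (not EikoTwin's actual implementation), this Mahalanobis-weighted discrepancy between measured and simulated displacements and forces can be sketched in a few lines of NumPy; the function name and array shapes are assumptions for the example:

```python
import numpy as np

def femu_cost(u_m, u_fe, f_m, f_fe, cov_u, cov_f):
    """Weighted FEMU cost: Mahalanobis distance between measured (m)
    and simulated (FE) displacements U and forces F, each weighted by
    the inverse covariance of its data source."""
    du = u_m - u_fe
    df = f_m - f_fe
    # x @ solve(C, x) computes x^T C^{-1} x without forming C^{-1}
    return du @ np.linalg.solve(cov_u, du) + df @ np.linalg.solve(cov_f, df)
```

With identity covariances the cost reduces to a plain sum of squared residuals; non-trivial covariances down-weight noisy measurement points.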
The identification problem is solved with a Gauss-Newton algorithm, based on an estimate of the Hessian matrix of the cost function. This Hessian is constructed from the sensitivity vectors of the simulated data with respect to the material parameters.
In cases where certain parameters have low sensitivity, the problem becomes ill-conditioned. A Tikhonov-type regularization is then introduced to stabilize the solution. It consists of adding a term that penalizes deviations from the initial parameter values, with a weight adjusted iteratively.
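A single update step of such a regularized Gauss-Newton scheme can be sketched as follows; this is a generic illustration under stated assumptions, not the solver used in EikoTwin Digital Twin, and all names are hypothetical:

```python
import numpy as np

def gauss_newton_step(residual, S, cov_inv, p, p0, mu):
    """One Gauss-Newton update for a FEMU-type cost with Tikhonov regularization.
    residual: measured minus simulated data, shape (n,)
    S: sensitivity matrix d(simulated data)/d(parameters), shape (n, k)
    cov_inv: inverse covariance of the data, shape (n, n)
    p, p0: current and initial parameter vectors, shape (k,)
    mu: regularization weight penalizing deviation from p0."""
    H = S.T @ cov_inv @ S            # approximate Hessian built from sensitivities
    b = S.T @ cov_inv @ residual     # gradient contribution of the data misfit
    H_reg = H + mu * np.eye(len(p))  # Tikhonov term stabilizes low-sensitivity directions
    dp = np.linalg.solve(H_reg, b - mu * (p - p0))
    return p + dp
```

For a linear model and mu = 0, a single step recovers the least-squares solution; a nonzero mu pulls poorly sensitive parameters back toward their initial values p0.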
This approach makes full use of the measurement fields provided by image correlation tools, while providing a rigorous basis for assessing the influence of parameters and their identifiability in a given test.
Sensitivity analysis as a design tool
Sensitivity analysis is the central element of the specimen geometry optimization process. It enables us to assess, for a given test, which material parameters have a significant influence on the measured data, taking into account experimental uncertainties. It can be carried out using EikoTwin Digital Twin or even a dedicated script.
Sensitivity analysis: Hessian matrix, eigenvalues and parameters, and identifiability classes.
The main tool is the Hessian matrix of the FEMU cost function, expressed here in a weighted version according to the signal-to-noise ratio (SNR). Each diagonal term in this matrix corresponds to the sensitivity of a parameter, while off-diagonal terms reflect correlations between parameters.
To interpret this Hessian, a diagonalization is performed, which provides:
- an orthogonal basis of “eigenparameters” (linear combinations of the initial parameters);
- a set of eigenvalues, expressed on a logarithmic SNR scale, quantifying the identifiability of each eigenparameter.
Based on these quantities, we propose a classification into “identifiability classes” [1]:
- Eigenparameters whose class (integer part of the decimal logarithm of the eigenvalue) is negative are considered unidentifiable (sensitivity < uncertainty);
- Those with a positive or zero class are identifiable.
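This classification step can be sketched as follows, assuming the SNR-weighted Hessian has already been assembled; the function name is hypothetical, and the aggregate reprojection rule of [1] is not reproduced here:

```python
import numpy as np

def identifiability_classes(H):
    """Diagonalize the (symmetric) SNR-weighted FEMU Hessian and assign
    each eigenparameter an identifiability class: the integer part of
    the decimal logarithm of its eigenvalue. A negative class means the
    sensitivity is below the measurement uncertainty (unidentifiable)."""
    eigvals, eigvecs = np.linalg.eigh(H)   # eigenvalues in ascending order
    eigvals = np.maximum(eigvals, 1e-300)  # guard against log10(0)
    classes = np.floor(np.log10(eigvals)).astype(int)
    return classes, eigvecs
```

For example, a Hessian with eigenvalues 100 and 0.01 yields classes 2 and −2: the first eigenparameter is comfortably identifiable, the second is not.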
These classes are then reprojected onto the initial material parameters to assign each an aggregate class. This determines the number of identifiable parameters for each specimen geometry tested. This approach is particularly useful in the design phase. By exploring a space of geometries (e.g. stressed element length, lamination angles), it becomes possible to map the number of identifiable parameters, and locate the geometric configurations that maximize the information extracted.
Case study: optimizing an Iosipescu test for a laminate composite
The case studied concerns an Iosipescu-type test on a laminated composite material consisting of 20 symmetrical plies, oriented at two angles noted α and β. The mechanical behavior is modeled as orthotropic and elastic, with 9 constitutive parameters to be identified.
Schematic representation of the test and the geometric parameters to be optimized (𝐿, 𝛼, 𝛽)
Three geometric parameters of the specimen are used as optimization variables:
- L, the distance between the jaws of the Iosipescu fixture;
- α, the orientation of the first group of plies;
- β, the orientation of the second group of plies.
The aim is to determine the values of L, α and β that maximize the identifiability of the constitutive parameters, based on the sensitivity analysis presented earlier.
Rossi and Pierron [2] showed how to optimize a test geometry for improved identifiability using accurate reproductions of test data from simulations. Whereas their work carried out a full identification procedure for each candidate geometry, the present study introduces a method to optimize the test geometry before any specimen is manufactured, and at a lower computational cost.
Exploring configurations
Three exploration plans have been defined:
- (α, L) for a unidirectional composite (β = 0°);
- (α, L) for a 0°–90° cross-ply composite (β = 90°);
- (α, β) with L = 5 mm, a value deemed favorable for identifiability.
For each configuration, a series of finite element simulations is performed, and a sensitivity is computed for a unit variation of each constitutive parameter (10 simulations per configuration: one reference plus one per parameter). This represents several thousand simulations, analyzed in terms of sensitivity classes. The criterion used is the average of the classes over all the parameters, which must be maximized to find the configuration that is most advantageous overall for the 9 parameters.
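The exploration loop amounts to mapping the average-class criterion over the geometry grid and locating its maximum. A minimal sketch follows; `build_hessian` is a hypothetical callback standing in for the finite element simulations of one geometry, and the eigenvalue-based classes are used as a proxy for the per-parameter criterion:

```python
import numpy as np

def average_class(hessian):
    """Mean identifiability class over all eigenparameters (to maximize)."""
    eigvals = np.maximum(np.linalg.eigvalsh(hessian), 1e-300)
    return np.floor(np.log10(eigvals)).mean()

def scan_geometries(alphas, lengths, build_hessian):
    """Map the average-class criterion over an (alpha, L) exploration plan.
    build_hessian(alpha, L) runs the simulations for that geometry and
    assembles its SNR-weighted Hessian (hypothetical callback)."""
    grid = np.empty((len(alphas), len(lengths)))
    for i, a in enumerate(alphas):
        for j, L in enumerate(lengths):
            grid[i, j] = average_class(build_hessian(a, L))
    # the optimum is the global maximum of the map
    i_opt, j_opt = np.unravel_index(np.argmax(grid), grid.shape)
    return grid, (alphas[i_opt], lengths[j_opt])
```

In practice the resulting map is what is interpolated on a fine grid to pinpoint the optimal configuration.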
Average class maps associated with each exploration plan. (a) 𝛽 = 0°; (b) 𝛽 = 90°; (c) 𝐿 = 5 mm. The red circle represents the optimum (global maximum).
The analysis shows that low values of L are overall more favorable to identifiability. In the (α, β) plane with L = 5 mm, up to five parameters reach classes greater than or equal to 0: 𝐸₁, 𝐸₂, ν₁₂, ν₂₃ and 𝐺₁₂.
Interpolating the results on a fine grid yielded an optimal configuration at L = 5 mm, α = 69°, β = 46° (red dot in the figure above). This geometry was selected to generate a virtual test designed to assess the identification capability on a synthetic case.
Virtual test and identification
A virtual dataset is generated by deforming a specimen with the optimized geometry and randomly perturbed material parameters (~8% standard deviation). From the virtual images generated with EikoTwin Virtual, digital image correlation is performed with EikoTwin DIC to obtain displacement fields. The applied load is also simulated, with noise added.
Diagram of the algorithm used to validate the method
All material parameters are identified, and the regularization is relaxed at each convergence. An L-curve analysis is used to determine the regularization level (here 10⁻³) that minimizes the two functionals (FEMU and regularization). FEMU identification with this regularization recovers the initial parameters with a mean square error reduced to ~5.7%. The most sensitive parameters (𝐸₁, 𝐸₂, 𝐺₁₂) are identified with a low level of error (<1%), while the others remain close to their initial values under the effect of the regularization.
Algorithm progression and L-curve
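One common corner heuristic for selecting the regularization weight from an L-curve can be sketched as follows; this is a generic distance-to-origin criterion on the normalized log-log curve, not necessarily the exact procedure used here, and all names are hypothetical:

```python
import numpy as np

def l_curve_corner(mus, femu_norms, reg_norms):
    """Pick a regularization weight from an L-curve: normalize both
    functionals on a log scale and take the point closest to the origin,
    i.e. the corner where neither functional dominates."""
    x = np.log10(np.asarray(femu_norms, dtype=float))
    y = np.log10(np.asarray(reg_norms, dtype=float))
    x = (x - x.min()) / (x.max() - x.min())  # normalized data-misfit axis
    y = (y - y.min()) / (y.max() - y.min())  # normalized regularization axis
    return mus[np.argmin(np.hypot(x, y))]
```

The corner balances the two terms: smaller weights let the regularization norm explode, larger weights degrade the data fit.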
Prospects for real-life testing
The complete chain is based on three components:
- EikoTwin Virtual: generation of realistic images from finite element simulations, enabling test preparation and prediction of measurement quality.
- EikoTwin DIC: processing of real images by digital image correlation, producing displacement fields consistent with simulated geometry.
- EikoTwin Digital Twin: sensitivity analysis, then updating of the material parameters by minimizing the FEMU cost function, taking uncertainties into account.
In a context of costly tests and complex materials, this approach makes it possible to:
- evaluate a priori whether a geometrical configuration can be used to identify the desired parameters;
- adjust specimen geometry before the test campaign;
- reduce the number of tests required to achieve a given objective (recalibration, certification, model validation).
This methodology is particularly relevant for anisotropic, heterogeneous or complex-behavior materials, for which normative tests often fail to sufficiently constrain all parameters. Building on existing work [3], we can also imagine designing a plan to identify several complementary geometries by carrying out this optimization on all these specimens simultaneously, as they share the same material law. This would make it possible to identify less sensitive parameters, while still limiting the number of tests required for complete identification of a material.
Conclusion
The design of specimens for the identification of mechanical parameters can be optimized by integrating, from the outset, current capabilities in full-field measurement and numerical simulation. The conventional approach, based on normative geometries, does not guarantee the identifiability of the targeted parameters, particularly in the case of orthotropic composite materials, and leads to test campaigns involving a large number of specimens.
The use of EikoTwin Digital Twin makes it possible to quantitatively assess this identifiability via a sensitivity analysis based on the Hessian matrix of the identification cost function. Classification into sensitivity classes provides a direct indicator of the informative potential of each test, and guides the choice of geometric parameters.
The example presented shows that by optimizing the geometry of an Iosipescu test, it is possible to identify at least five parameters with an accuracy compatible with experimental uncertainties, thus reducing the need for multiple test campaigns.
This approach, coupled with the EikoTwin tools, provides a systematic method for designing efficient tests, tailored to precise identification objectives, within a rigorous framework integrating simulation, uncertainties and experimental measurements.
References
[1] A. Vintache et al., Test optimization for the identification of elastic and orthotropic parameters, ECCM21 conference, 2024.
[2] M. Rossi, F. Pierron, On the use of simulated experiments in designing tests for material characterization from full-field measurements, International Journal of Solids and Structures, 49(3):420–435, 2012.
[3] J. Neggers, F. Mathieu, F. Hild, S. Roux, Simultaneous full-field multi-experiment identification, Mechanics of Materials, 133:71–84, 2019.