Monday, 8 July 2024

Generative AI and digital twinning for the simulation of ductile/brittle fracture in segregated steel

 
Resilience of segregated forged steels: experimental and numerical studies carried out within the French Fab project led by Anne-Françoise Gourgues.

 

AI-aided digital twinning for recycled-PET bottle blowing


Work carried out within the federating Carnot Mines project "recyclage des polymères" (polymer recycling), led by Sabine Cantournet.



Tuesday, 28 May 2024

Job offers

 

PhD thesis offers

  

- JIT code generation for complex computational kernels in solid mechanics
https://www.mat.minesparis.psl.eu/formation/doctorat/propositions-de-sujets-de-these/?id=57541

 
- Machine-learning reconstruction of melt pools in additive manufacturing (LPBF) from surface thermal measurements
https://www.mat.minesparis.psl.eu/formation/doctorat/propositions-de-sujets-de-these/?id=57593

Monday, 13 March 2023

Teaching material

Experimental Mechanics @ Mines Paris

https://1drv.ms/b/s!AjM6vw3llOZ-juAaNfBsZMDTp9MIPQ?e=6MjYqD

 

Nonlinear Computational Mechanics @ Mines Paris (ATHENS MP06)

Linear FEA
https://1drv.ms/b/s!AjM6vw3llOZ-i7YQB0fAgby8fOfL4g

Nonlinear FEA
https://1drv.ms/b/s!AjM6vw3llOZ-i5dUYh-0_iEKJlaLXA

 

IDSC @ Mines Paris

Inverse problems and Physics-Informed Neural Networks
https://1drv.ms/b/s!AjM6vw3llOZ-i9ontPMPQmn6ibtMCw?e=SrYbQX

Graph Neural Networks for geometric learning
https://1drv.ms/b/s!AjM6vw3llOZ-jqltsbDdeAMidrr97w

 

DIMA @ Mines Paris

Multifidelity surrogate modelling
https://1drv.ms/b/s!AjM6vw3llOZ-i5I0v-aZCC9_Rjfo_w

 

DMS Computer Vision week @ Centre des Matériaux, Mines Paris

Surrogate modelling 1
https://1drv.ms/b/s!AjM6vw3llOZ-i_F4z7W5jle8r9BpMg

Surrogate modelling 2
https://1drv.ms/b/s!AjM6vw3llOZ-i_F3D_65lZ94-DyCsg

 

AI week @ PSL University
https://1drv.ms/b/s!AjM6vw3llOZ-jvVCVnD_6MrBOAP8lQ


Convolutional Neural Networks @ Mines Albi
https://onedrive.live.com/?authkey=%21ACKCg1ApHZEr7OA&cid=7EE694E50DBF3A33&id=7EE694E50DBF3A33%21243416&parId=7EE694E50DBF3A33%21243342&o=OneUp

 

Brief introduction to AI in engineering sciences
https://1drv.ms/b/s!AjM6vw3llOZ-jrxTWMqCm8yiqEtp3Q

Wednesday, 18 January 2023

Fracture of a homogeneous medium due to a thermal shock


 

 

FEniCSx input file

https://colab.research.google.com/drive/1FIFjp6h8nSGpX_02RO5TEKF2Om1AEc9q?usp=sharing

Tuesday, 5 July 2022

A probabilistic data assimilation framework to reconstruct finite element error fields from sparse error estimates: application to sub-modelling

https://onlinelibrary.wiley.com/doi/10.1002/nme.7090

In the present work, we propose a computational pipeline to recover full finite element error fields from a few estimates of errors in scalar quantities of interest (QoI). The approach is weakly intrusive, as it is motivated by large-scale industrial applications whereby modifying the finite element models is undesirable. The goal-oriented error estimation methodology that is chosen is the traditional Zienkiewicz-Zhu (ZZ) approach, which is coupled with the adjoint methodology to deliver goal-oriented results. The novelty of the work is that we consider a set of computed error estimates in QoIs as partial observations of an underlying error field, which is to be recovered. We then deploy a Bayesian probabilistic estimation framework, introducing a sparse Gaussian prior for the error field by means of a linear stochastic partial differential equation (SPDE), with two adjustable parameters that may be tuned via maximum likelihood (which is made tractable by the SPDE approach). As estimating the posterior state of the error field is a numerical bottleneck, despite the employment of the SPDE-based prior, we propose a projection-based reduced order modelling strategy to reduce the cost of using the SPDE model. The projection basis is constructed adaptively, using a goal-oriented divisive clustering approach that is subsequently used to construct a family of radial basis functions satisfying the partition-of-unity property over the computational domain. We show that the Bayesian reconstruction approach, accelerated by the proposed model reduction technology, yields good probabilistic estimates of full error fields, with a computational complexity that is acceptable compared to the evaluation of the ZZ goal-oriented error estimates that must be provided as input to the algorithm. The strategy is applied to submodelling, whereby the global model is solved using a relatively coarse finite element discretisation, and the effect of the numerical error on submodelling results is to be controlled. To achieve this, we probabilistically recover full error fields over the boundaries of submodelling regions, which we propagate to the submodels using a standard Monte Carlo approach. Future improvements of the method include the optimal selection of goal-oriented error measures to be acquired prior to the error field reconstruction.
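For illustration, here is a minimal sketch of the linear-Gaussian update that underpins such a reconstruction, with an SPDE/Matérn-type prior precision assembled from finite element mass and stiffness matrices. The dense numpy linear algebra, the observation operator H, the toy 1D matrices and all hyperparameter values are assumptions made for readability, not the paper's implementation.

import numpy as np

def reconstruct_error_field(M, K, H, y_obs, kappa=1.0, tau=1.0, sigma_obs=1e-3):
    """Posterior mean/covariance of a nodal error field under an SPDE-type prior."""
    # SPDE-type prior precision (Lindgren-Rue construction, dense here for clarity):
    # A = kappa^2 M + K,  Q_prior = tau * A^T M^{-1} A
    A = kappa**2 * M + K
    Q_prior = tau * A.T @ np.linalg.solve(M, A)
    # Linear-Gaussian conditioning on y_obs = H e + Gaussian noise of std sigma_obs
    Q_post = Q_prior + H.T @ H / sigma_obs**2
    mean_post = np.linalg.solve(Q_post, H.T @ y_obs / sigma_obs**2)
    return mean_post, np.linalg.inv(Q_post)

# Toy 1D illustration: n nodes, lumped mass, standard Laplacian stiffness, and three
# "goal-oriented error estimates" observed as local averages of the unknown error field
n = 50
h = 1.0 / (n - 1)
M = h * np.eye(n)
K = (np.diag(2*np.ones(n)) - np.diag(np.ones(n-1), 1) - np.diag(np.ones(n-1), -1)) / h
H = np.zeros((3, n))
for i, c in enumerate([10, 25, 40]):
    H[i, c-2:c+3] = 0.2            # each observation averages five neighbouring nodes
y_obs = np.array([0.1, -0.3, 0.2])  # synthetic error estimates (illustrative values)

mean_e, cov_e = reconstruct_error_field(M, K, H, y_obs)
print("posterior mean at observed regions:", mean_e[[10, 25, 40]])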

 

 

Monday, 10 January 2022

Multiscale stress surrogates via probabilistic graph-based geometric deep learning

https://arxiv.org/abs/2205.06562


 

Fast stress predictions in multiscale materials & structures: graph-based probabilistic geometric deep learning with online physics-based corrections

https://618.euromech.org/slides/

Saturday, 1 May 2021

Simple implementation of a Physics-Informed Neural Network in PyTorch for inverse problems

 https://colab.research.google.com/drive/1ycheGcblVg38FK662NzShspXudTE3ugg?usp=sharing

import torch
import torch.nn as nn

# Select a GPU if one is available (the original notebook defines `device` elsewhere)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")


class Net_IP(nn.Module):

    def __init__(self):
        super(Net_IP, self).__init__()
        # sub-network for the temperature field u(x, t)
        self.input_layer_u = nn.Linear(2, 100)
        self.hidden_layer1 = nn.Linear(100, 10)
        self.output_layer_u = nn.Linear(10, 1)
        # sub-network for the unknown diffusion field mu(x), the inverse-problem target
        self.input_layer_mu = nn.Linear(1, 10)
        self.hidden_layer2 = nn.Linear(10, 3)
        self.output_layer_mu = nn.Linear(3, 1)

    def forward(self, x, t):
        # temperature field
        inputs_u = torch.cat([x, t], axis=1)  # spatio-temporal coordinate
        layer1_out = torch.sigmoid(self.input_layer_u(inputs_u))
        layer2_out = torch.sigmoid(self.hidden_layer1(layer1_out))
        output_u = self.output_layer_u(layer2_out)
        # diffusion field
        inputs_mu = x  # spatial coordinate
        layer3_out = torch.sigmoid(self.input_layer_mu(inputs_mu))
        layer4_out = torch.sigmoid(self.hidden_layer2(layer3_out))
        # softplus keeps the identified diffusion coefficient positive
        output_mu = torch.nn.functional.softplus(self.output_layer_mu(layer4_out))
        # concatenation
        output = torch.cat([output_u, output_mu], axis=1)
        return output


net_IP = Net_IP()
net_IP = net_IP.to(device)
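For context, here is a sketch of how a PDE residual and the physics part of the loss could be assembled with autograd for this network. The governing equation (a 1D heat equation u_t - d/dx(mu(x) u_x) = 0), the collocation points and the loss weighting are illustrative assumptions, not necessarily those used in the linked notebook.

def pde_residual(net, x, t):
    # enable differentiation with respect to the inputs
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    out = net(x, t)
    u, mu = out[:, 0:1], out[:, 1:2]
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    flux = mu * u_x
    flux_x = torch.autograd.grad(flux, x, torch.ones_like(flux), create_graph=True)[0]
    return u_t - flux_x   # residual of u_t - d/dx(mu u_x) = 0

# Collocation points in (0,1) x (0,1) (illustrative); a data-misfit term on measured
# temperatures would be added to this physics loss to solve the inverse problem
x_c = torch.rand(1000, 1, device=device)
t_c = torch.rand(1000, 1, device=device)
loss_pde = torch.mean(pde_residual(net_IP, x_c, t_c)**2)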

 

Thursday, 12 November 2020

The Bayesian Finite Element Method: blending machine learning and classical algorithms to compute "thick" solutions to partial differential equations

 

 
Partial Differential Equation-based numerical simulators are ubiquitous in engineering. However, the scaling of PDE solvers with increasing spatial, temporal and parametric resolutions is rather poor, leading to computational costs that are exponentially increasing with the complexity of the physical system of interest. As a consequence, discretisation schemes are often coarser than desired, in a pragmatic push towards applications such as physics-based modelling in interaction with reality, aka digital twins.

A way forward is to treat all sources of uncertainty consistently, and to subsequently approach model refinement as a unified, uncertainty-driven task. To take modelling error into account, classical Bayesian model calibration and state estimation methodologies treat model parameters and model outputs as random variables, which are then conditioned on data in order to yield posterior distributions with appropriate credible intervals. However, the traditional way to quantify discretisation errors is through deterministic numerical analysis, yielding point estimates or bounds without a distribution, which makes these approaches incompatible with a Bayesian treatment of model uncertainty.

Recently, significant developments have been made in the area of probabilistic solvers for PDEs. The idea is to formulate discretisation schemes as Bayesian estimation problems, yielding not a single parametrised spatio-temporal field but a distribution of such fields. Most methods use Gaussian Processes as their fundamental building block. The basic idea is to condition a Gaussian random field to satisfy the PDE at particular points of the computational domain. This gives rise to probabilistic variants of the meshless methods traditionally used to solve PDEs. To date, however, such approaches are not available for finite element solvers, which are typically based on integral formulations over arbitrary simplexes, leading to analytically intractable integrals.

We propose what we believe is the first probabilistic finite element methodology and apply it to steady heat diffusion. It is based on the definition of a discrete Gaussian prior over a p-refined finite element space. This prior is conditioned to satisfy the PDE weakly, using the non-refined finite element space to generate a linear observation operator. The hyperparameters of the Gaussian process are optimised using maximum likelihood. We also provide an efficient solver based on Monte Carlo sampling of the analytical posterior, coupled with an approximate multigrid sampler for the p-refined Gaussian prior. We show that this sampler ensures that the overall cost of the methodology is of the order of the p-refined deterministic FE technology, whilst delivering valuable probability distributions for the continuous solution to the PDE system.
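A minimal sketch of the linear-Gaussian conditioning step described above: a zero-mean Gaussian prior over the coefficients of a refined finite element space is conditioned on satisfying the weak form observed through a coarser test space. The matrices, the prior covariance and the small observation-noise regularisation are assumptions made for illustration, not the paper's implementation.

import numpy as np

def bayesian_fe_update(C_prior, A_obs, b_obs, noise=1e-10):
    """Condition a zero-mean Gaussian prior on the linear observations A_obs @ u = b_obs."""
    S = A_obs @ C_prior @ A_obs.T + noise * np.eye(A_obs.shape[0])  # observation covariance
    G = C_prior @ A_obs.T @ np.linalg.inv(S)                        # Kalman-type gain
    mean_post = G @ b_obs                      # posterior mean (prior mean assumed zero)
    C_post = C_prior - G @ A_obs @ C_prior     # posterior covariance
    return mean_post, C_post

# Toy usage: 8 fine-space unknowns observed through 4 coarse-space weak-form equations
rng = np.random.default_rng(1)
C_prior = np.eye(8)                      # illustrative prior covariance
A_obs = rng.standard_normal((4, 8))      # stands in for the coarse-test/fine-trial operator
b_obs = rng.standard_normal(4)           # stands in for the coarse load vector
u_mean, u_cov = bayesian_fe_update(C_prior, A_obs, b_obs)
print(u_mean.shape, np.diag(u_cov)[:3])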


 

Tuesday, 3 November 2020

Internship EDF/Mines/AFH (Paris region): Advanced data assimilation framework for WAAM additive layer manufacturing processes

EDF is currently investigating the capabilities of emerging additive layer manufacturing technologies such as WAAM (wire + arc additive manufacturing). This novel manufacturing process leverages existing welding technologies, whilst promising to allow engineers to build or repair large engineering components in a flexible and reliable manner. As of today, this process is not mature enough to be used for industrial production. This project focusses on establishing a robust numerical pipeline between numerical simulation of WAAM processes on the one hand, and data-rich lab experiments on the other hand. This pipeline will help researchers advance current understanding and control capabilities of this emerging class of additive manufacturing processes. 
 


One of the major difficulties limiting the capabilities of today’s numerical simulators is the multiscale and multiphysics nature of additive manufacturing processes, and WAAM in particular. Predicting how the shape of manufactured parts deviates from nominal geometries proves incredibly challenging, as fine-scale couplings between electromagnetics, thermodynamics, fluid and solid mechanics need to be resolved over large spatial domains and long periods of time. To make simulations possible, it is usually proposed to adopt a simplified, thermo-mechanical macroscopic point of view. However, in order to take unrepresented physics into account, model inputs (heat source models, material deposition models, ...) need to be reliably inferred from appropriately generated experimental data.


The project aims to establish a cutting-edge, two-way experiment-to-simulation pipeline to improve and automate this inference process. Today's labs are equipped with high-resolution scanners that may be used to acquire the full geometry of built objects. In turn, we wish to calibrate EDF's thermo-mechanical model so that the predicted shape deviation from CAD matches that observed in the real world. It will then be possible to virtually predict the shape deviation from CAD for a new process or component, without manufacturing it physically, thereby paving the way towards the virtual design and optimisation of ALM operations.


The technical outlines of the project are as follows.

  • The candidate will construct geometrical algorithms to assimilate point cloud data generated by 3D scanning of manufactured parts, i.e. to allow inference algorithms to compare real surface profiles to simulated ones (a minimal sketch of such a comparison is given after this list). The algorithms will be developed in Python and subsequently interfaced with EDF's solid mechanics finite element code code_aster.
  • The candidate will develop robust data-assimilation algorithms to tune/learn simplified computational models (of inherent-strain type) based on the 3D-scan data available at EDF. The procedure will be validated against its ability to blindly predict the shape of new WAAM products.
  • The candidate will deploy a data mining strategy to improve the transferability of the calibrated model parameters over a range of manufacturing conditions and part geometries.
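For illustration, a minimal sketch (scipy-based, not EDF's actual pipeline) of the geometric comparison mentioned in the first task: distances from a 3D-scanned point cloud to the nodes of a simulated surface, usable as a misfit measure during calibration. The synthetic data below is an assumption for demonstration purposes.

import numpy as np
from scipy.spatial import cKDTree

def shape_deviation(scan_points, simulated_surface_nodes):
    """Per-point distance from the scan to the simulated surface nodes, plus an RMS misfit."""
    tree = cKDTree(simulated_surface_nodes)
    d, _ = tree.query(scan_points)   # nearest simulated node for each scanned point
    return d, float(np.sqrt(np.mean(d**2)))

# Toy usage with synthetic clouds (in practice: a 3D scan vs. a deformed code_aster surface mesh)
rng = np.random.default_rng(0)
simulated = rng.uniform(size=(2000, 3))
scanned = simulated[:500] + 0.01 * rng.standard_normal((500, 3))   # noisy "measurements"
distances, rms = shape_deviation(scanned, simulated)
print("RMS shape deviation:", rms)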

The work will be hosted by Mines ParisTech (Centre des Matériaux, http://www.mat.mines-paristech.fr/Research/Scientific-clusters/SIMS/), in partnership with EDF Chatou. The duration of the internship is 6 months minimum, up to 9 months (expected start: winter/spring 2021). The candidate may take part in designing new sets of experiments as part of the project. The work is sponsored by the Additive Factory Hub (AFH), a group of high-tech industries teaming up to advance the state of the art in metal additive layer manufacturing through shared research. The candidate is expected to take an active part in the dissemination of the results in the AFH network: https://www.additivefactoryhub.com/.

Requirements:

  • Proven experience in computational engineering & numerical simulation 
  • Strong interest in manufacturing and digital twinning 
  • Interest in machine learning and data mining 
  • Excellent analytical skills   
  • Scientific curiosity and strong interest in digital industry

Application and additional enquiries:

Send CV and statement of motivation to Pierre Kerfriden, Mines ParisTech pierre.kerfriden@mines-paristech.fr
CC: Sami Hilal, EDF Chatou,
sami.hilal@edf.fr , Djamel Missoum-Benziane, djamel.missoum-benziane@mines-paristech.fr 

 

 
 
Keywords:  
Additive Layer Manufacturing, Computational Engineering, Applied Mathematics, Finite Element Method, Data Assimilation, Machine Learning, Industry 4.0
 

Friday, 30 October 2020

Gaussian processes by stochastic differential equations on manifolds

 
FEniCS code:

from dolfin import *
from mshr import *
import numpy as np

# Boundary mesh of a unit sphere: a 2D manifold embedded in 3D
mesh = BoundaryMesh(generate_mesh(Sphere(Point(0, 0, 0), 1), 50), "exterior")

V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)

# Stiffness (Laplace-Beltrami) and mass matrices on the manifold
k = inner(grad(u), grad(v))*dx
K = assemble(k).array()

m = u*v*dx
M = assemble(m).array()

# SPDE parameters: alpha scales the marginal variance, beta the correlation length
alpha = 1.0
beta = 0.2

np.random.seed(12)
L = np.linalg.cholesky(M)   # factor of the mass matrix, used to sample white noise

file_realisation = File('results/realisation_sphere.pvd')
func = Function(V)
func.rename('realisation', 'realisation')
for i in range(10):
    # Gaussian white noise, projected onto the finite element space
    W = np.random.randn(M.shape[0], 1)
    y = np.matmul(L, W)
    # Solve the discretised SPDE (M + beta^2 K) e = alpha y to obtain one realisation
    e = np.linalg.solve(M + (beta**2)*K, alpha*y)
    func.vector().set_local(e.flatten())
    file_realisation << func
 
Generation on the Cardiff dinosaur, Eddy

Tuesday, 10 September 2019

HDR defence (Habilitation à Diriger des Recherches) of Pierre Kerfriden

This public event will take place on 17/09/2019 at 9:30 at the École des Mines, 60 Boulevard Saint-Germain, Paris, room L109 (Le Chatelier).

------------------------------
Towards the next generation of high-fidelity simulators for online computing: adaptive modelling through the scales

https://zenodo.org/record/3404685
------------------------------

High-fidelity modelling and simulation have profoundly transformed the area of material and structural design. Through advances in computer hardware and software, material failure can be reliably predicted using multiscale high-fidelity models coupled with appropriately designed discretisation strategies. Yet, such heavy numerical tasks are restricted to "one-shot" virtual experiments. Emerging applications such as real-time control or interactive design require performing thousands of repeated analyses, with potentially limited computational facilities. Models used for such applications require extreme robustness and swiftness of execution. To unleash the full potential of high-fidelity computational mechanics, we need to develop a new generation of numerical tools that will bridge the gap between, on the one hand, heavy numerical solvers and, on the other hand, computationally demanding "online" engineering tasks. This thesis introduces and summarises research contributions that aim to help bridge this gap, through the development of robust model reduction approaches to control the cost associated with multiscale and physically detailed numerical simulations, with a particular emphasis on reliability assessment for composite materials and fracture.

Materials and structural engineering has been profoundly transformed by the generalisation of numerical simulation. Thanks to advances in scientific computing tools, material failure can be reliably predicted by multiscale models, in conjunction with high-performance numerical solution methods. However, these expensive simulations remain limited to one-off virtual experiments. Modern applications such as real-time control or interactive design require execution speeds and levels of model robustness that remain out of reach. The potential of high-fidelity mechanical simulation will only be realised through the development of a new generation of numerical tools tasked with reducing computational costs, so as to allow the use of detailed numerical models in applications involving "on-the-fly" computations. This thesis presents research contributions aiming to bridge this technological gap. The emphasis is on the development of model reduction methods to control the computational costs associated with high-fidelity simulations, with a particular interest in the mechanics of composites and the multiscale prediction of fracture.

------------------------------
Examining committee
------------------------------

Julien Yvonnet, Professor at Université Paris-Est Marne-la-Vallée
Anthony Gravouil, Professor at INSA Lyon
Piotr Breitkopf, Professor at Université de Technologie de Troyes
Francisco Chinesta, Professor at ENSAM Paris
Christian Rey, Senior Engineer at Safran Tech
Ludovic Chamoin, Professor at ENS Paris-Saclay
Olivier Allix, Professor at ENS Paris-Saclay

Tuesday, 11 June 2019

Manufacturing the Yorkshire Pudding

Simulation of additive layer manufacturing by directed energy deposition using CutFEM

 
 
S. Claus, S. Bigot and P. Kerfriden, CutFEM Method for Stefan--Signorini Problems with Application in Pulsed Laser Ablation, SIAM J. Sci. Comput., 40(5), 2018
 

Thursday, 7 March 2019

PhD Project / Cardiff University / University of Luxembourg

Synopsys NE Ltd (https://www.synopsys.com/simpleware.html), Cardiff University and the University of Luxembourg invite applications for 2 Early Stage Researcher positions (Doctoral Candidates) as part of the Rapid Biomechanics and Simulation for Personalized Clinical Design (RAINBOW) MCSA European Training Network. RAINBOW is funded under the European Union's Horizon 2020 research and innovation programme.

The post holder will be employed on a fixed-term (36-month) contract and be principally based at the Synopsys-Simpleware offices in Exeter, UK, but will also be enrolled as a full-time graduate student at either Cardiff University (http://www.cardiff.ac.uk/) or the University of Luxembourg, undertaking research towards a PhD degree award. The candidate will be expected to spend periods of time in Cardiff or Luxembourg as well as with other partners in the consortium.

The post holders will develop numerical methods at the intersection between machine learning, biomechanical simulations and image processing. In particular, they will contribute to bridging the gap between advanced 3D imaging techniques and physics-based computer simulations in order to improve current capabilities in the area of computer-aided diagnostic and surgical planning. A thorough knowledge of software development is essential.

This is a full-time (37.5 hours per week) position on a fixed-term basis of 36 months. Strong programming and analytical skills are required. Some advanced knowledge in computational physics or mechanics would be a plus. Applications including a CV and a cover letter are required no later than 18/03/2019. Applications should be sent to pierre.kerfriden@gmail.com with title field: "Application RAINBOW PHD XXX", where XXX is the name of the applicant.

Additive Layer Manufacturing simulations (metal deposition) with CutFEM


Flexible multiresolution finite element solver

Tuesday, 1 January 2019

CutFEM: 1D fibrous reinforcements embedded in 3D structures

P. Kerfriden, S. Claus, I. Mihai, A mixed-dimensional CutFEM methodology for the simulation of fibre-reinforced composites, Advanced Modeling and Simulation in Engineering Sciences, 2020

 

 

We develop a novel unfitted finite element solver for composite materials with quasi-1D fibrous reinforcements. The method belongs to the class of mixed-dimensional non-conforming finite element solvers. The fibres are treated as 1D structural elements that may intersect the mesh of the embedding structure in an arbitrary manner. No meshing of the unidimensional elements is required. Instead, fibre solution fields are described using the trace of the background mesh. A regularised "cut" finite element formulation is carefully designed to ensure that analyses using such non-conforming finite element descriptions are stable. We also design a dedicated primal/dual operator splitting scheme to resolve the coupling between the structure and the fibrous reinforcements efficiently. The novel computational strategy is applied to the solution of stiff computational models whereby fibrous reinforcements may lose their bond to the embedding material above a certain level of stress. It is shown that the primal-dual 1D/3D CutFEM scheme is convergent and well-behaved in a variety of scenarios involving such highly nonlinear structural computations.
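As an illustration of the geometric setting only (plain legacy FEniCS, not the paper's CutFEM implementation), the sketch below samples points along an unmeshed 1D fibre and identifies the background cells it cuts through; the background mesh and the fibre endpoints are arbitrary assumptions.

from dolfin import *
import numpy as np

mesh = UnitSquareMesh(16, 16)                 # background (embedding) mesh
tree = mesh.bounding_box_tree()

# A straight fibre between two endpoints (illustrative geometry)
p0, p1 = np.array([0.1, 0.2]), np.array([0.9, 0.7])
samples = [p0 + s*(p1 - p0) for s in np.linspace(0.0, 1.0, 50)]

cut_cells = set()
for p in samples:
    c = tree.compute_first_entity_collision(Point(float(p[0]), float(p[1])))
    if c < mesh.num_cells():                  # a valid collision was found
        cut_cells.add(c)
print("background cells cut by the fibre:", sorted(cut_cells))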

Tuesday, 17 July 2018

Researcher position in Cardiff. Closing 16 August 2018

Cardiff University invites applications for an Early Stage Researcher position (Doctoral Candidate) as part of the Rapid Biomechanics and Simulation for Personalised Clinical Design (RAINBOW) MCSA European Training Network. RAINBOW is funded under the European Union’s Horizon 2020 research and innovation programme. The post holder will undertake research on “Meta Modelling for Soft Tissue Contact and Cutting Simulation” leading to a PhD degree award. The post holder will develop numerical methods to simulate the deformations of soft-tissues in the context of computer-aided surgery. In particular, he/she will contribute to bridging the gap between advanced 3D imaging techniques and physics-based computer simulations in order to improve current capabilities in the area of computer-aided diagnostic and surgical planning. A thorough knowledge of numerical methods is essential.

Description
Thanks to recent advances in medical imaging and computer tomography, medical surgery is undergoing a revolution. Surgeons routinely have access to 3D reconstructions of patient anatomy that help them perform diagnoses and plan their operations, and they may even have access to real-time feedback during surgical operations. However, good-quality imaging is invasive and potentially harmful. Within the RAINBOW MSCA network, Cardiff's research team aims to use physics-based computer simulations to help reconstruct patient anatomy from partial information (noisy snapshots in space and/or time), and to further predict the evolution of biomechanical processes in order to improve diagnoses and planning.

More information and application pages
https://rainbow.ku.dk/open-positions/
Job at Cardiff University

Tuesday, 10 July 2018

Funded PhD studentship in Cardiff: Machine learning techniques for the optimisation and simulation of Metal Additive Layer Manufacturing process chains

Project Description

The aim of this PhD is to develop new data analytics tools (e.g. machine learning, data mining) to support the understanding, the optimisation and the multi-scale, multi-physics simulation of metal Additive Layer Manufacturing (ALM) process chains. 

These data analytics tools should meet the needs of the H2020-funded project MANUELA; in particular, they should enable "intelligent" feedback loops for "online" manufacturing optimisation, design optimisation, and the tuning of the multi-scale, multi-physics models used for simulation and for the implementation of accurate digital twins of the investigated pilot lines. 

Depending on the type of data available (e.g. temperature maps, machining parameters, localised acoustic information) and on the available controllable factors, various types of process modelling approaches could be used to extract knowledge and features. State-of-the-art modelling, data mining and machine learning tools will be reviewed (e.g. techniques for data regression/classification/clustering such as deep neural networks, support vector machines and dimension-reduction learning models, as well as image processing algorithms), and the most relevant will be implemented and enhanced to meet the demands of real data collected at different stages of the pilot line. 

Specialist Equipment / Resources available: 

Among other standard computing and manufacturing equipment (manufacturing workshop, 3D printers, ...), the student will have access to the following resources specific to the project needs: 
- Metal Additive Layer Manufacturing Machine 
- High-Performance Computing cluster 
- Machine learning tool kit 
- Indirect access to the H2020 project partners' equipment (e.g. ALM machines, data analysis/control/simulation software) 

Student Required Expertise/skills: 

The work will require the development of software-based solutions in the context of ALM pilot lines (real manufacturing, simulation and optimisation); the student should have a strong interest and knowledge in the following: 
- Object oriented programming (C++ or equivalent) 
- Data mining/machine learning 
- Additive Layer Manufacturing 


https://www.findaphd.com/search/projectDetails.aspx?PJID=99195

Sunday, 8 July 2018

CutFEM method to simulate composite fracture



Phase-field in the bulk; zero-thickness cohesive elements with frictional contact at the matrix/inclusion interfaces

Thursday, 5 April 2018

Simulation of micro-EDM die-sinking


Crater-by-crater geometric simulations of micro-EDM using fast Octree computations. Results from the PhD thesis of Anthony Surleraux. Supervision by S. Bigot and P. Kerfriden

http://orca.cf.ac.uk/80776/1/2015SurlerauxABPhD.pdf

Sunday, 18 March 2018

Laser ablation: micro-cavity

We have developed a cut finite element method for one-phase Stefan problems, with applications in laser manufacturing. The geometry of the workpiece is represented implicitly via a level set function. Material above the melting/vaporisation temperature is represented by a fictitious gas phase. The moving interface between the workpiece and the fictitious gas phase may cut arbitrarily through the elements of the finite element mesh, which remains fixed throughout the simulation, thereby circumventing the need for cumbersome remeshing operations. The primal/dual formulation of the linear one-phase Stefan problem is recast into a primal non-linear formulation using a Nitsche-type approach, which avoids the difficulty of constructing inf-sup stable primal/dual pairs. Through the careful derivation of stabilisation terms, we show that the proposed Stefan-Signorini-Nitsche CutFEM method remains stable independently of the cut location. In addition, we obtain optimal convergence with respect to space and time refinement. Several 2D and 3D examples are proposed, highlighting the robustness and flexibility of the algorithm, together with its relevance to the field of micro-manufacturing.


 

Simulations by S. Claus, S. Bigot and P. Kerfriden in the FEniCS library CutFEM.

S. Claus, S. Bigot and P. Kerfriden,
CutFEM Method for Stefan--Signorini Problems with Application in Pulsed Laser Ablation, SIAM J. Sci. Comput., 40(5), 2018

Funding: Sêr Cymru National Research Network

Saturday, 3 March 2018

Parameter study of natural convection models




Figures: velocity and temperature fields for Rayleigh numbers of 10^4 and 10^8.

 

Simulations by P. Kerfriden 
Software: COMSOL Multiphysics

Tuesday, 19 December 2017

Melting and heat convection on a Daruma




The leftmost picture represents the evolution of the temperature field with time. The boundary of the domain is set to a cold temperature of 0 degrees. The mouth of the mask is heated up to 4 degrees. The temperature of the left eye is 3 degrees, whilst the temperature of the right eye is set to 2 degrees. The melting temperature above which the solid material melts is chosen to be 1 degree. The rightmost picture represents the evolution of the phase field that blends the solid and liquid phases (the fully liquid material is red). It also shows the flow of fluid that develops within the liquid phase, under the action of a buoyancy force driven by the temperature gradient.
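As a side note, a minimal sketch of the kind of smoothed phase indicator described above; the tanh smoothing and its width are assumptions, and only the melting temperature of 1 degree comes from the post.

import numpy as np

T_melt, eps = 1.0, 0.1   # melting temperature (from the post) and smoothing width (assumed)

def liquid_fraction(T):
    """0 in the solid, 1 in the liquid, smooth transition of width ~eps around T_melt."""
    return 0.5 * (1.0 + np.tanh((T - T_melt) / eps))

print(liquid_fraction(np.array([0.0, 1.0, 2.0])))   # -> approximately [0, 0.5, 1]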

Simulation by Dr Susanne Claus and Dr Pierre Kerfriden.

Wednesday, 8 November 2017

Computational homogenisation in FEniCS







Simulations by Dr P. Kerfriden
Finite element software: https://fenicsproject.org/
Meshing software: http://gmsh.info/
Visualisation tool: https://www.paraview.org/
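For reference, a minimal sketch of what a linear computational homogenisation script can look like in legacy FEniCS. The unit-cell geometry, the material contrast and the use of kinematic-uniform (affine) boundary conditions rather than periodic ones are illustrative assumptions, not the settings used for the simulations shown in this post.

from dolfin import *
import numpy as np

# Square unit cell with a stiffer circular inclusion, described by a conditional Expression
mesh = UnitSquareMesh(64, 64)
E = Expression("(x[0]-0.5)*(x[0]-0.5) + (x[1]-0.5)*(x[1]-0.5) < 0.0625 ? E_i : E_m",
               E_i=10.0, E_m=1.0, degree=0)
nu = 0.3
lmbda = E*nu/((1+nu)*(1-2*nu))
mu = E/(2*(1+nu))

def eps(v):
    return sym(grad(v))

def sigma(v):
    return lmbda*tr(eps(v))*Identity(2) + 2*mu*eps(v)

V = VectorFunctionSpace(mesh, "CG", 1)
u, v = TrialFunction(V), TestFunction(V)
a = inner(sigma(u), eps(v))*dx
L_zero = dot(Constant((0.0, 0.0)), v)*dx

# Impose three macroscopic strain states via affine Dirichlet conditions u = E_macro x,
# then volume-average the stress to fill one column of the apparent stiffness (Voigt: 11, 22, 12)
E_macros = [((1.0, 0.0), (0.0, 0.0)),
            ((0.0, 0.0), (0.0, 1.0)),
            ((0.0, 0.5), (0.5, 0.0))]
vol = assemble(Constant(1.0)*dx(domain=mesh))
C_app = np.zeros((3, 3))
for j, Em in enumerate(E_macros):
    u_bc = Expression(("E00*x[0] + E01*x[1]", "E10*x[0] + E11*x[1]"),
                      E00=Em[0][0], E01=Em[0][1], E10=Em[1][0], E11=Em[1][1], degree=1)
    bc = DirichletBC(V, u_bc, "on_boundary")
    uh = Function(V)
    solve(a == L_zero, uh, bc)
    s = sigma(uh)
    C_app[:, j] = [assemble(s[0, 0]*dx)/vol, assemble(s[1, 1]*dx)/vol,
                   assemble(s[0, 1]*dx)/vol]

print("Apparent stiffness (Voigt notation):")
print(C_app)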

Thursday, 26 October 2017

Unsteady heat transfer in a building



Navier-Stokes simulation for convection-dominated temperature analysis in a room. The window is closed, then opened, and closed again. The temperature-flow coupling is obtained via the Boussinesq approximation. The colours of the walls represent the value of the temperature field, while the colours of the streamlines represent the amplitude of the flow velocity field.

Simulations by Dr S. Claus

Friday, 16 June 2017

Robust model selection for Bootstrap-Aggregated Neural Network regression applied to small, noisy datasets



This is a small piece of work in the area of machine learning. Bootstrap aggregating is known to be an excellent approach to fit regression models when the dataset is small, and potentially polluted by noise. 

The picture given below represents, in blue, a reference 2D function, polluted by white noise and cut by a plane for visualisation purposes. We want to reconstruct this surface from a small sample of function values. We do this by drawing repeatedly from the small dataset, with replacement, and systematically fitting a standard Neural Network (NN) regression (all the thin red curves). Then, all the NN replicates are averaged out. We obtain the thick black curve. Whilst each individual bootstrap replicate clearly overfits (we have only 86 2D datapoints), the bootstrap aggregated regression behaves nicely.



The 2D plot of the aggregated regression is displayed below.

The novelty of the work lies in the estimation of the prediction error. We rely on an Out-Of-Bag approach to estimate the predictive coefficient of determination, and derive a frequentist confidence interval for this statistic through a non-parametric bootstrap. The confidence interval allows us to
- select the number of replicates that leads to a correct estimation of the predictive power of the regression;
- select the optimal number of neurons of the aggregated Neural Network in a robust manner, through early stopping.
A minimal sketch of the bagging and Out-Of-Bag estimation steps is given below.
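The sketch below illustrates bootstrap aggregation with an Out-Of-Bag estimate of the predictive R^2. The synthetic dataset, network size and number of replicates are illustrative assumptions, and scikit-learn's MLPRegressor stands in for the neural network regression used in the post.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Small, noisy synthetic dataset (the post's example also uses 86 2D data points)
n = 86
X = rng.uniform(-1, 1, size=(n, 2))
y = np.sin(3*X[:, 0]) * np.cos(2*X[:, 1]) + 0.1*rng.standard_normal(n)

n_rep = 50          # number of bootstrap replicates (illustrative)
nets, oob_pred, oob_count = [], np.zeros(n), np.zeros(n)

for _ in range(n_rep):
    idx = rng.integers(0, n, size=n)            # draw with replacement
    oob = np.setdiff1d(np.arange(n), idx)       # points left out of this replicate
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000).fit(X[idx], y[idx])
    nets.append(net)
    oob_pred[oob] += net.predict(X[oob])        # accumulate out-of-bag predictions
    oob_count[oob] += 1

# Aggregated (bagged) prediction = average over all replicates
bagged = lambda Xnew: np.mean([net.predict(Xnew) for net in nets], axis=0)
print("bagged prediction at the origin:", bagged(np.zeros((1, 2))))

# Out-Of-Bag estimate of the predictive coefficient of determination
mask = oob_count > 0
res = y[mask] - oob_pred[mask]/oob_count[mask]
R2_oob = 1 - np.sum(res**2)/np.sum((y[mask] - y[mask].mean())**2)
print("OOB predictive R^2:", R2_oob)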


Sunday, 19 February 2017

Eddy, Cardiff sliding Dinosaur



Eddy is held by the tail and gravity acts in direction [1 -1] in the plane of the picture. The dinosaur will either slip or stumble forward depending on the roughness of the contact between its feet and the support.


Eddy is not meshed. Instead, the .stl file that describes its boundary is converted into a continuous level set, whose negative values indicate Eddy's spatial occupancy. The zero isoline can cut arbitrarily through the elements.
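A minimal sketch of this STL-to-level-set conversion, using the trimesh library rather than the original CutFEM tooling; the file name, grid resolution and sign handling are illustrative assumptions, and the surface is assumed to be watertight.

import numpy as np
import trimesh

surf = trimesh.load("eddy.stl")   # hypothetical file name

# Regular background grid covering the bounding box of the surface
lo, hi = surf.bounds
axes = [np.linspace(l, h, 20) for l, h in zip(lo, hi)]
X, Y, Z = np.meshgrid(*axes, indexing="ij")
pts = np.column_stack([X.ravel(), Y.ravel(), Z.ravel()])

# trimesh returns positive signed distance inside the surface, so flip the sign
# to obtain "negative = inside", matching the convention described in the post
phi = -trimesh.proximity.signed_distance(surf, pts).reshape(X.shape)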

The simulations were performed using the CutFEM FEniCS library.
 
S. Claus & P. Kerfriden, A stable and optimally convergent LaTIn-Cut Finite Element Method for multiple unilateral contact problems, IJNME, 2017
https://onlinelibrary.wiley.com/doi/abs/10.1002/nme.5694

Burman, E., Claus, S., Hansbo, P., Larson, M. G., and Massing, A. (2015) CutFEM: Discretizing geometry and partial differential equations. Int. J. Numer. Meth. Engng, 104: 472–501. doi: 10.1002/nme.4823.

The 3D dinosaur model was created by ThinkerThing: http://www.thingiverse.com/thing:343924

Monday, 30 January 2017

Unilateral Contact simulations with a stable LaTIn solver for non-conforming "Cut" finite elements

We have recently developed an unfitted finite element solver for sets of solids that interact through unilateral contact. The analysis mesh is regular and the geometry of the solids is allowed to cut through the elements. The two pictures below illustrate the capabilities of our solver.


The solver itself combines the best of two worlds. On the one hand, we use elements of the CutFEM technology pioneered by P. Hansbo, E. Burman and M.G. Larson in order to allow interfaces to cut through the mesh without altering the convergence rate associated with finite element solvers. On the other hand, the different phases of the composite are coupled using the LaTIn solver, first proposed by P. Ladevèze, whose versatility has been proven over the years. We modified the discrete mixed formulation associated with LaTIn in order to stabilise the interface solution, ensuring that the condition number of successive linear systems of equations is controlled and that the convergence with mesh refinement is optimal. More details about our general strategy can be found in the wide-audience paper that we have written for the NAFEMS Benchmark magazine, available in draft form here.



Some more pictures of the geometry of the 3D woven composite material. The fibres are easily described using a set of appropriate level set functions. The matrix block can be meshed "by hand", as it is not required to conform to the complex geometry of the interfaces between the phases of the composite.

The simulations were performed using the CutFEM FEniCS library.

Susanne Claus, Pierre Kerfriden, A stable and optimally convergent LaTIn-Cut Finite Element Method for multiple unilateral contact problems, IJNME, 2016

Saturday, 7 January 2017

Certified Defeaturing of CAD models

 
 
Defeaturing is routinely used by engineers to simplify their numerical analyses. Typically, small geometrical features are removed a priori from the CAD model, leading to more affordable simulation stages. We develop a method to estimate the error made when ignoring such features. The error on the chosen quantity of interest is bounded from above and below, using dedicated methodological derivations that find their roots in convex analysis. Deriving the bounds only necessitates the availability of the defeatured solution, and some affordable post-processing of this solution in the vicinity of the feature.

This is the outcome of the PhD work of Dr Rahimi, supervised by Dr Kerfriden, Dr Langbein and Prof. Martin at Cardiff University.