
Monday, 8 July 2024

AI-aided digital twinning for recycled-PET bottle blowing


Work carried out within the framework of the Carnot Mines federating programme "polymer recycling", led by Sabine Cantournet.



Thursday, 12 November 2020

The Bayesian Finite Element Method: blending machine learning and classical algorithms to compute "thick" solutions to partial differential equations

 

 
Partial Differential Equation (PDE)-based numerical simulators are ubiquitous in engineering. However, PDE solvers scale rather poorly with increasing spatial, temporal and parametric resolution, leading to computational costs that increase exponentially with the complexity of the physical system of interest. As a consequence, discretisation schemes are often coarser than desired, a pragmatic compromise made in the push towards applications such as physics-based modelling in interaction with reality, a.k.a. digital twins.

A way forward is to treat all sources of uncertainty consistently and to subsequently approach model refinement as a unified, uncertainty-driven task. To take modelling error into account, classical Bayesian model calibration and state estimation methodologies treat model parameters and model outputs as random variables, which are then conditioned on data in order to yield posterior distributions with appropriate credible intervals. However, discretisation errors are traditionally quantified through deterministic numerical analysis, yielding point estimates or bounds without associated probability distributions, which makes these approaches incompatible with a Bayesian treatment of model uncertainty.

Recently, significant developments have been made in the area of probabilistic solvers for PDEs. The idea is to formulate discretisation schemes as Bayesian estimation problems, yielding not a single parametric/spatio-temporal field but a distribution over such fields. Most methods use Gaussian processes as their fundamental building block. The basic idea is to condition a Gaussian random field to satisfy the PDE at particular points of the computational domain, giving rise to probabilistic variants of the meshless methods traditionally used to solve PDEs. To date, however, such approaches are not available for finite element solvers, which are typically based on integral formulations over arbitrary simplices, leading to analytically intractable integrals.
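To make the collocation idea concrete, here is a minimal, self-contained Python sketch (distinct from the finite element method described below): a Gaussian process prior is conditioned on the 1D Poisson equation -u'' = f holding at a few interior points, plus homogeneous Dirichlet boundary conditions. The squared-exponential kernel, the length-scale and the point counts are illustrative choices only.

import numpy as np

# Squared-exponential kernel and the derived covariances needed to observe
# the differential operator L u = -u''. For k(x, x') = exp(-r^2 / (2 l^2)),
# with r = x - x':
#   cov(u(x),   L u(x')) = k * (l^2 - r^2) / l^4
#   cov(L u(x), L u(x')) = k * (r^4 - 6 l^2 r^2 + 3 l^4) / l^8
def k_uu(x, y, l):
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * l**2))

def k_uL(x, y, l):
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * l**2)) * (l**2 - r**2) / l**4

def k_LL(x, y, l):
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * l**2)) * (r**4 - 6 * l**2 * r**2 + 3 * l**4) / l**8

l = 0.2                                        # kernel length-scale (illustrative)
f = lambda x: np.pi**2 * np.sin(np.pi * x)     # source term; exact solution is sin(pi x)

xc = np.linspace(0.05, 0.95, 15)               # interior collocation points
xb = np.array([0.0, 1.0])                      # Dirichlet boundary points
xs = np.linspace(0.0, 1.0, 101)                # evaluation grid

# Joint covariance of the observations [L u(xc); u(xb)] = [f(xc); 0].
K_obs = np.block([[k_LL(xc, xc, l), k_uL(xb, xc, l).T],
                  [k_uL(xb, xc, l), k_uu(xb, xb, l)]])
K_obs += 1e-8 * np.eye(K_obs.shape[0])         # jitter for numerical stability
K_cross = np.hstack([k_uL(xs, xc, l), k_uu(xs, xb, l)])

y = np.concatenate([f(xc), np.zeros(2)])
mean = K_cross @ np.linalg.solve(K_obs, y)     # posterior mean of u on the grid
cov = k_uu(xs, xs, l) - K_cross @ np.linalg.solve(K_obs, K_cross.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # pointwise credible band

The pointwise standard deviation collapses near the collocation and boundary points and widens between them, acting as a built-in indicator of where the PDE has not been resolved.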

We propose what we believe is the first probabilistic finite element methodology and apply it to steady heat diffusion. It is based on the definition of a discrete Gaussian prior over a p-refined finite element space. This prior is conditioned to satisfy the PDE weakly, using the non-refined finite element space to generate a linear observation operator. The hyperparameters of the Gaussian process are optimised using maximum likelihood. We also provide an efficient solver based on Monte Carlo sampling of the analytical posterior, coupled with an approximate multigrid sampler for the p-refined Gaussian prior. We show that this sampler keeps the overall cost of the methodology of the same order as that of the p-refined deterministic FE technology, whilst delivering valuable probability distributions for the continuous solution to the PDE system.
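Stripped of the finite element machinery, the construction above is a linear-Gaussian conditioning step. The sketch below illustrates that algebra with random placeholder matrices: in the actual method, Sigma would be the structured prior covariance over the p-refined FE coefficients, H the observation operator generated by testing the PDE weakly with the non-refined basis, and the brute-force Cholesky sampler would be replaced by the approximate multigrid sampler mentioned above. All names and dimensions here are illustrative.

import numpy as np

rng = np.random.default_rng(0)
n_fine, n_obs = 40, 10          # fine (p-refined) space dim, number of weak-form tests

# Gaussian prior over fine-space coefficients, u ~ N(0, Sigma).
# A generic SPD placeholder stands in for the structured FE prior.
L = rng.standard_normal((n_fine, n_fine)) / np.sqrt(n_fine)
Sigma = L @ L.T + 0.1 * np.eye(n_fine)

# Linear observation operator H and data b, standing in for the coarse
# weak-form residual equations H u = b.
H = rng.standard_normal((n_obs, n_fine))
b = rng.standard_normal(n_obs)

# Condition the prior on H u = b (small nugget for numerical stability).
S = H @ Sigma @ H.T + 1e-6 * np.eye(n_obs)
gain = np.linalg.solve(S, H @ Sigma).T          # Kalman-type gain, Sigma H^T S^-1
mean_post = gain @ b
Sigma_post = Sigma - gain @ H @ Sigma

# Monte Carlo sampling of the analytical posterior (dense Cholesky here; the
# multigrid sampler achieves the same at near-deterministic FE cost).
C = np.linalg.cholesky(Sigma_post + 1e-8 * np.eye(n_fine))
samples = mean_post[:, None] + C @ rng.standard_normal((n_fine, 5))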


 

Tuesday, 3 November 2020

Internship EDF/Mines/AFH (Paris region): Advanced data assimilation framework for WAAM additive layer manufacturing processes

EDF is currently investigating the capabilities of emerging additive layer manufacturing technologies such as WAAM (wire + arc additive manufacturing). This novel manufacturing process leverages existing welding technologies, whilst promising to allow engineers to build or repair large engineering components in a flexible and reliable manner. As of today, this process is not mature enough to be used for industrial production. This project focusses on establishing a robust numerical pipeline between numerical simulation of WAAM processes on the one hand, and data-rich lab experiments on the other hand. This pipeline will help researchers advance current understanding and control capabilities of this emerging class of additive manufacturing processes. 
 


One of the major difficulties limiting the capabilities of today’s numerical simulators is the multiscale and multiphysics nature of additive manufacturing processes, and of WAAM in particular. Predicting how the shape of manufactured parts deviates from nominal geometries proves incredibly challenging, as fine-scale couplings between electromagnetics, thermodynamics, fluid and solid mechanics need to be resolved over large spatial domains and long periods of time. To make simulations tractable, a simplified, macroscopic thermo-mechanical point of view is usually adopted. However, in order to take unrepresented physics into account, model inputs (heat source models, material deposition models, ...) need to be reliably inferred from appropriately generated experimental data.


The project aims to establish a cutting-edge two-way experiment-to-simulation pipeline to improve and automatise this inference process. Today’s labs are equipped with high-resolution scanners that may be used to acquire the full geometry of built objects. In turn, we wish to calibrate EDF’s thermo-mechanical model so that the predicted shape deviation from CAD matches that observed in the real world. It will then be possible to virtually predict the shape deviation from CAD for a new process or component, without manufacturing it physically, thereby paving the way towards virtual design and optimisation of ALM operations.
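To illustrate the kind of calibration loop this implies, here is a hedged Python sketch. A hypothetical simulate_deviation(params) function stands in for a full code_aster run, and a synthetic signal stands in for the scan-derived deviation field; nothing below reflects EDF’s actual models or data.

import numpy as np
from scipy.optimize import least_squares

# Hypothetical forward model: maps process parameters (e.g. heat-source
# magnitude, inherent-strain coefficients) to a predicted deviation-from-CAD
# profile sampled at the scan points. In practice this would wrap a
# code_aster simulation.
def simulate_deviation(params, n_points=200):
    a, b = params
    x = np.linspace(0.0, 1.0, n_points)
    return a * np.sin(np.pi * x) + b * x * (1.0 - x)   # toy surrogate

# Synthetic "measured" deviations standing in for 3D-scan data.
rng = np.random.default_rng(1)
true_params = np.array([0.8, -0.3])
measured = simulate_deviation(true_params) + 0.01 * rng.standard_normal(200)

# Least-squares calibration: tune the model until the predicted deviation
# matches the scanned one.
res = least_squares(lambda p: simulate_deviation(p) - measured,
                    x0=np.zeros(2))
print("calibrated parameters:", res.x)

In the project itself, this plain least-squares fit would be replaced by the more robust data-assimilation algorithms listed below, but the structure of the loop, i.e. simulate, compare to scan, update parameters, is the same.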


The technical outline of the project is as follows:

  • The candidate will construct geometrical algorithms to assimilate point cloud data generated by 3D scanning of manufactured parts, i.e. to allow inference algorithms to compare real surface profiles to simulated ones (a toy version of such a comparison is sketched after this list). The algorithms will be developed in Python and subsequently interfaced with EDF’s solid mechanics finite element code code_aster.
  • The candidate will develop robust data-assimilation algorithms to tune/learn simplified computational models (of inherent-strain type) based on the 3D-scan data available at EDF. The procedure will be validated against its ability to blindly predict the shape of new WAAM products. 
  • The candidate will deploy a data mining strategy to improve the transferability of the calibrated model parameters over a range of manufacturing conditions and part geometries.
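As a toy illustration of the point-cloud comparison mentioned in the first bullet, the Python sketch below computes per-point deviations between a synthetic scanned surface and a synthetic simulated one using a KD-tree nearest-neighbour query. Real scan data would additionally require registration and outlier handling.

import numpy as np
from scipy.spatial import cKDTree

# Synthetic stand-ins for a 3D scan of the built part and for the simulated
# surface; in practice these would come from the scanner and from code_aster.
rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, 5000)
z = rng.uniform(0.0, 1.0, 5000)
scan = np.column_stack([np.cos(theta), np.sin(theta), z])
# Simulated part predicted slightly "swollen" with respect to the scan:
simulated = np.column_stack([1.02 * np.cos(theta), 1.02 * np.sin(theta), z])

# Nearest-neighbour distance from each scanned point to the simulated
# surface: a simple per-point deviation field an inference algorithm can use.
tree = cKDTree(simulated)
dist, _ = tree.query(scan, k=1)
print(f"mean deviation: {dist.mean():.4f}, max deviation: {dist.max():.4f}")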

The work will be hosted by Mines ParisTech (Centre des Matériaux, http://www.mat.mines-paristech.fr/Research/Scientific-clusters/SIMS/), in partnership with EDF Chatou. The duration of the internship is 6 months minimum, up to 9 months (expected start: winter/spring 2021). The candidate may take part in designing new sets of experiments as part of the project. The work is sponsored by the Additive Factory Hub (AFH), a group of high-tech industries teaming up to advance the state of the art in metal additive layer manufacturing through shared research. The candidate is expected to take an active part in the dissemination of the results within the AFH network: https://www.additivefactoryhub.com/.

Requirements:

  • Proven experience in computational engineering & numerical simulation
  • Strong interest in manufacturing and digital twinning
  • Interest in machine learning and data mining 
  • Excellent analytical skills   
  • Scientific curiosity and strong interest in digital industry

Application and additional enquiries:

Send CV and statement of motivation to Pierre Kerfriden, Mines ParisTech: pierre.kerfriden@mines-paristech.fr
CC: Sami Hilal, EDF Chatou: sami.hilal@edf.fr, and Djamel Missoum-Benziane, Mines ParisTech: djamel.missoum-benziane@mines-paristech.fr

 

 
 
Keywords:  
Additive Layer Manufacturing, Computational Engineering, Applied Mathematics, Finite Element Method, Data Assimilation, Machine Learning, Industry 4.0