Leibniz MMS Days 2024
April 10 - April 12, 2024
Kaiserslautern


Posters

A poster session will take place on April 10, 2024, starting at 18:00, in IVW Building 58 (Entrance / Level 1 / Level 2, Room 210), with the following posters:

From Pixels to Airflow: Exploring Image Similarity Methods for Barn Ventilation Analysis

Ali Alai (ATB Potsdam)

Accurate determination of air exchange rates in agricultural environments is critical for animal comfort, worker well-being, and economic sustainability. Current measurement methods face challenges in spatial and temporal variability, resource-intensive processes, and ambiguity in defining inlets and outlets. This study introduces a novel approach utilizing Computational Fluid Dynamics (CFD) and image similarity methods for barn ventilation analysis. A diverse dataset of barn environments was synthetically generated through ANSYS software, ensuring precise control over variables. Image similarity metrics such as Mean Squared Error (MSE), Structural Similarity Index (SSI), and others were selected for evaluation. Baseline and specialized testing were conducted to assess the capability of image similarity methods to quantify modifications of airflow under various conditions and their sensitivity to the degree of change. Statistical analysis revealed significant differences between image similarity metrics. The MSE method emerged as a robust choice for quantifying air exchange rates in naturally ventilated barns and excelled in detecting subtle differences. Normalized cross-correlation (NCC) demonstrated superior performance in maximizing image similarity. Practical implications and contextual considerations for metric selection, as well as potential contributions to advancing methods for evaluating barn ventilation, will be discussed in this contribution.
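
For illustration, the two metrics highlighted above can be computed in a few lines of NumPy. The sketch below uses synthetic image data rather than the study's CFD snapshots and shows one common definition of MSE and of zero-mean normalized cross-correlation.

    import numpy as np

    def mse(img_a, img_b):
        # Mean squared error between two equally sized grayscale images
        return float(np.mean((img_a.astype(float) - img_b.astype(float)) ** 2))

    def ncc(img_a, img_b):
        # Zero-mean normalized cross-correlation (1.0 for identical images)
        a = img_a.astype(float) - img_a.mean()
        b = img_b.astype(float) - img_b.mean()
        return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical example: compare a baseline airflow snapshot with a perturbed one.
    rng = np.random.default_rng(0)
    baseline = rng.random((256, 256))
    modified = baseline + 0.05 * rng.random((256, 256))
    print(f"MSE: {mse(baseline, modified):.4e}, NCC: {ncc(baseline, modified):.4f}")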

Integral Method for the Accelerated Characterization & Modeling of Continuous Fiber-Reinforced Thermoplastics (cFRTP)

Christian Andriss (IVW Kaiserslautern)

An integral approach has been developed at IVW that provides an efficient characterization method and a corresponding material model [1]. Referring to the material theory proposed by Haupt [2], the developed approach is based on the concept of an equilibrium curve. The equilibrium curve describes an internal state of mechanical equilibrium within the material that is approached for the case of vanishing loading rates or by infinitely prolonged creep or relaxation processes. Therefore, this concept yields a clear separation of the stress state measured in an experiment into a time dependent overstress that vanishes over time and a time independent equilibrium stress. A stepwise relaxation (SR) test has been developed that utilizes a stepwise loading and unloading with a relaxation period at the end of each step in order to determine an approximation to the equilibrium curve. Utilizing an analytical expression for the extrapolation of the stress relaxation response, the duration of the relaxation periods can be reduced significantly in the SR test. With just one test, stress relaxation, stiffness degradation and the evolution of plastic strains can be determined in an efficient manner. The method is complemented by a corresponding material model that can be calibrated using just the data of the SR test. It has been demonstrated for continuous fiber-reinforced polycarbonate [1] that the proposed method yields extensive material data in an efficient manner that can be complemented by accurate model predictions.

[1] C. Andriss, A. Kenf, and S. Schmeer, “Experimental characterization and phenomenological modeling of nonlinear viscoelasticity, plasticity and damage of continuous carbon fiber-reinforced thermoplastics,” Composites Part B: Engineering, p. 110734, Apr. 2023, doi: 10.1016/j.compositesb.2023.110734.
[2] P. Haupt, Continuum Mechanics and Theory of Materials, Advanced Texts in Physics. Berlin, Heidelberg: Springer, 2002, doi: 10.1007/978-3-662-04775-0.
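
The equilibrium-curve concept described above amounts to the following additive decomposition of the measured stress (schematic notation, not quoted verbatim from [1] or [2]):

    \[
      \sigma(t) = \sigma_{\mathrm{eq}}\bigl(\varepsilon(t)\bigr)
                + \sigma_{\mathrm{ov}}\bigl(\varepsilon(t), t\bigr),
      \qquad
      \sigma_{\mathrm{ov}} \to 0 \quad \text{for } t \to \infty
      \text{ (relaxation/creep) or } \dot{\varepsilon} \to 0,
    \]

so that each relaxation period of the SR test, extrapolated to infinite time, yields one point of the equilibrium curve \(\sigma_{\mathrm{eq}}(\varepsilon)\).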

Modelling the meaning of quantifiers in use

Anton Benz (ZAS Berlin)

Half a century ago, linguists realised that context-independent ('literal') sentence meaning could be modelled by translating natural language into a language of formal logic. A paradigmatic application is the meaning of quantifiers in sentences, e.g. 'some', 'all', 'none'. However, when confronted with people's interpretation of quantifiers in concrete contexts, a puzzling discrepancy between literal meaning and communicated (pragmatic) meaning emerges. There are numerous approaches that attempt to explain these discrepancies in formal logic or combine logic with decision-theoretic techniques. They all have in common that they abstract away all aspects of processing. The high level of abstraction arguably leads to problems when considering pragmatic meaning in concrete contexts with larger sets of alternative formulations from which the speaker can choose. In this poster I present recent work on a statistical model, derived from an underlying symbolic algorithm, that represents different strategies for translating visual information into linguistic utterances. The strategies are distinguished by parameter settings, so that the paths from the visual situation to concrete utterances naturally lead to a Bayesian network model with multiple multinomial latent variables. I collected a corpus of experimental data with about 3500 situation-utterance tokens for production in English and 1500 tokens for German. By fitting the model to the production data, the proportions of language users who follow different possible strategies can be estimated. First results show that only a small number of all possible strategies are used. The complexity of the statistical model is much higher than usually encountered in linguistic research. The initial analyses are based on point estimates. In the future, a Bayesian approach will be explored. The statistical complexities make this a good example of linguistic research that can greatly benefit from collaboration with mathematicians.
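
To sketch how such proportions can be obtained from production data, the following toy example estimates the mixture weights of a small set of fixed, hypothetical strategies by expectation-maximization; the strategies, utterance categories and counts are invented for illustration and are not the poster's actual model or data.

    import numpy as np

    # Hypothetical: 3 strategies, each assigning probabilities to 4 utterance types
    # (rows: strategies, columns: utterance categories).
    strategy_likelihoods = np.array([
        [0.70, 0.20, 0.05, 0.05],
        [0.10, 0.60, 0.20, 0.10],
        [0.25, 0.25, 0.25, 0.25],
    ])

    # Observed utterance counts over a corpus (illustrative numbers only).
    counts = np.array([900, 400, 120, 80])

    weights = np.full(3, 1.0 / 3.0)              # initial strategy proportions
    for _ in range(200):                         # EM iterations
        # E-step: responsibility of each strategy for each utterance category
        joint = weights[:, None] * strategy_likelihoods
        resp = joint / joint.sum(axis=0, keepdims=True)
        # M-step: re-estimate strategy proportions from expected counts
        weights = (resp * counts).sum(axis=1)
        weights /= weights.sum()

    print("estimated strategy proportions:", np.round(weights, 3))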

Towards a Machine-Learned Poisson Solver for Low-Temperature Plasma Simulations in Complex Geometries

Ihda Chaerony Siffa (INP Greifswald)

In electrostatic self-consistent low-temperature plasma (LTP) simulations, Poisson's equation is solved at each simulation time step, which can amount to a significant computational cost for the entire simulation. In this contribution, we present the development of a generic machine-learned Poisson solver specifically designed for the requirements of LTP simulations in complex 2D reactor geometries on structured Cartesian grids. Here, the reactor setups can consist of various objects such as electrodes and/or dielectric materials. We leverage a hybrid CNN-transformer network architecture in combination with a highly randomized synthetic training dataset to ensure the generalizability of the learned solver to unseen reactor setups. To achieve the numerical accuracy of the solution required in LTP simulations, we refine the raw predictions with a GPU-based conventional iterative solver. This especially recovers the high-frequency features not resolved by the initial prediction. For large computational domains, the present approach is shown to reduce the solution time by up to 10–20× compared to a GPU-based conventional iterative solver alone.
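
The refinement step mentioned above can be illustrated with a plain Jacobi iteration that polishes an initial guess of the Poisson problem on a uniform Cartesian grid; the network prediction is mocked by zeros here, and all details are illustrative assumptions rather than the authors' implementation.

    import numpy as np

    def jacobi_refine(phi0, f, h, n_iter=500):
        # Refine an initial guess phi0 of  laplace(phi) = f  (5-point stencil,
        # grid spacing h); boundary values of phi0 act as Dirichlet conditions.
        phi = phi0.copy()
        for _ in range(n_iter):
            phi[1:-1, 1:-1] = 0.25 * (
                phi[2:, 1:-1] + phi[:-2, 1:-1] + phi[1:-1, 2:] + phi[1:-1, :-2]
                - h**2 * f[1:-1, 1:-1]
            )
        return phi

    # Stand-in for the CNN-transformer prediction: here simply zeros.
    n, h = 65, 1.0 / 64
    f = np.zeros((n, n)); f[n // 2, n // 2] = -1.0 / h**2   # point-like source term
    phi = jacobi_refine(np.zeros((n, n)), f, h)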

Higher-Order Auxiliary Field Quantum Monte Carlo Methods

Florian Goth (University of Wuerzburg)

The auxiliary field quantum Monte Carlo (AFQMC) method has been a workhorse in the field of strongly correlated electrons for a long time and has found its most recent implementation in the ALF package (alf.physik.uni-wuerzburg.de). The use of the Trotter decomposition to decouple the interaction from the non-interacting Hamiltonian makes this method inherently second order in the imaginary-time step. We show that, due to the use of the Hubbard-Stratonovich transformation (HST), a semigroup structure is imposed on the time evolution, which necessitates the introduction of a new family of complex-Hermitian splitting methods in order to reach higher order. We will give examples of these new methods and study their efficiency, as well as perform comparisons with other established second- and higher-order methods in the realm of the AFQMC method.
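
For context, the standard symmetric Trotter splitting referred to above is second order in the imaginary-time step \(\Delta\tau\):

    \[
      e^{-\Delta\tau (H_0 + H_1)}
      = e^{-\frac{\Delta\tau}{2} H_0}\, e^{-\Delta\tau H_1}\, e^{-\frac{\Delta\tau}{2} H_0}
      + \mathcal{O}(\Delta\tau^3),
    \]

where \(H_0\) denotes the non-interacting part and \(H_1\) the interaction to be decoupled by the HST; reaching higher order requires compositions of such exponentials with more general coefficients, which is where the complex-Hermitian splittings discussed in the poster come in.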

A model for a magneto-mechanical device: forward and inverse uncertainty quantification

Olaf Klein (WIAS Berlin)

Many technical devices involve hysteresis effects that are modeled by hysteresis operators. The parameters therein are identified using results from measurements. Therefore, they are subject to uncertainties and the methods of Uncertainty Quantification (UQ) are used to deal with them. Here, the parameters for a magnetostrictive actuator and their uncertainties are determined by inverse UQ and the result is afterwards used for forward UQ.
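
As a generic illustration of the forward-UQ step (not the poster's actual actuator model), the sketch below propagates uncertainty in two parameters of a simple scalar play-type hysteresis operator by Monte Carlo sampling; the operator, parameters and distributions are placeholders.

    import numpy as np

    def play_operator(u, r, w0=0.0):
        # Scalar play (backlash) hysteresis operator with half-width r
        w = np.empty_like(u)
        w_prev = w0
        for k, u_k in enumerate(u):
            w_prev = min(u_k + r, max(u_k - r, w_prev))
            w[k] = w_prev
        return w

    # Forward UQ: sample an uncertain gain and half-width, propagate through the
    # model, and summarize the spread of the output at the final time.
    rng = np.random.default_rng(42)
    t = np.linspace(0.0, 4.0 * np.pi, 400)
    u = np.sin(t) * np.exp(-0.1 * t)             # hypothetical input signal
    outputs = np.stack([
        rng.normal(0.3, 0.05) * play_operator(u, r=abs(rng.normal(0.2, 0.03)))
        for _ in range(1000)
    ])
    print("output at final time:", outputs[:, -1].mean(), "+/-", outputs[:, -1].std())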

Physics-Based Machine Learning for Simulation Intelligence in Composite Process Design

Denis Korolev (WIAS Berlin)

Accurate modelling of composites can be challenging due to the complex geometries of composite media and the presence of multiple scales with different physics. This requires computational grids with very fine resolution and imposes severe computational constraints on standard numerical solvers for the underlying partial differential equations (PDEs). Homogenization techniques are often used to overcome such difficulties; they allow efficient extraction or upscaling of material properties from the microscale for further use in macroscale simulations. In addition, mesh-free physics-informed neural networks can be used as PDE solvers. In this poster, we discuss applications of physics-informed machine learning to composite process design and homogenization, the possibilities of combining neural networks with standard numerical solvers, and the advantages, disadvantages and future perspectives of our machine learning approach in the context of material modelling.
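
A generic physics-informed loss of the type used by such mesh-free PDE solvers combines the residual of the governing equation with boundary terms (schematic form; the poster's concrete model problems are not spelled out here):

    \[
      \mathcal{L}(\theta)
      = \frac{1}{N_r}\sum_{i=1}^{N_r} \bigl\lVert \mathcal{N}[u_\theta](x_i) - f(x_i) \bigr\rVert^2
      + \frac{\lambda}{N_b}\sum_{j=1}^{N_b} \bigl\lVert \mathcal{B}[u_\theta](x_j) - g(x_j) \bigr\rVert^2,
    \]

where \(u_\theta\) is the neural network ansatz, \(\mathcal{N}\) the differential operator, \(\mathcal{B}\) the boundary operator and \(\lambda\) a weighting parameter.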

Impact of variably saturated flow and seasonally changing underground temperatures on the performance of large fields of individual geothermal heat collectors: extremely shallow geothermal energy

Ernesto Meneses Rioseco (Georg-August-Universität Göttingen)

Within the context of the energy crisis in Germany and the subsequent boost in the renewable energy sector, the utilization of shallow geothermal energy is experiencing a remarkable development. With the increasing number of very shallow geothermal facilities, typical for single-family houses, their optimal deployment and design has received special attention in recent years. Optimizing a field of exceedingly shallow geothermal installations in a sustainable way requires a detailed understanding of the thermo-hydraulic interaction between neighboring geothermal heat collectors under varying operational schemes and hydrogeological and thermal conditions. Especially for small-power geothermal facilities (<30 kW), which are typical for single-family houses, the thermo-hydraulic interaction between tightly deployed horizontal geothermal collectors has not been fully elucidated yet. Using numerical modelling and simulation, we focus in this work on the optimal and sustainable production of shallow geothermal energy from large fields of individual small-power facilities under variably saturated flow and seasonally changing underground temperatures. Combining the finite-element method with optimization algorithms, we systematically study the influence that variably saturated flow, and consequently transient thermal parameters, has on different design patterns typical for very shallow geothermal collectors, such as heat baskets, trench collectors, and different horizontal assemblages. COMSOL Multiphysics® is used as the simulation framework for the comprehensive numerical study. We discuss here our most recent modelling and simulation findings.

SampcompR: A new R Package for Research Data Comparison

Björn Rohr (GESIS - Leibniz Institute for the Social Sciences)

For many research projects, we have different datasets that should, in principle, represent the same statistical population. However, in each data collection, biases can occur that undermine comparability and affect analytical potential. The new R package SampcompR was created to provide easy-to-apply functions to compare such datasets on a univariate, bivariate, and multivariate level. To illustrate the package, we compare two web surveys conducted in Africa in 2023, which used Meta advertisements as a recruitment method, to benchmarks from the cross-national Demographic and Health Survey. Our poster will show examples of package output, including visualizations and tables. As to the specific content of our example, we will see that the social media surveys show a high amount of bias. Our R package provides a toolkit for performing such data comparisons; using the same or similar procedures and visualizations across comparisons will increase comparability and standardization.

Registration-based data assimilation of aortic blood flow

Francesco Romor (WIAS Berlin)

Real-world applications often require the execution of expensive numerical simulations on different geometrical domains, represented by computational meshes. This is especially true when performing shape optimization or when these domains model shapes from the real world, such as anatomical organs of the human body. How to develop surrogate models or perform data assimilation in these contexts is the main concern of this work. Even if a rich database of solutions is acquired for a specific application, it sometimes cannot be exploited completely because each solution is supported on its own computational mesh. For example, neural network architectures tailored to this setting are not yet well established in the literature. A solution is offered by shape registration: a reference template geometry is constructed from a cohort of available geometries, which can then be mapped onto it. Through registration, a correspondence between all geometries in the database of solutions is established. We employ the latest methodologies from shape registration to perform data assimilation of clinically relevant biomarkers within the context of aortic blood flows and associated pathologies.
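
To give a flavor of the registration idea in code, the sketch below performs a drastically simplified rigid (Procrustes) alignment of a point cloud onto a template with known point correspondence; the non-rigid registration methods actually needed for aortic geometries go well beyond this, and the data here are synthetic.

    import numpy as np

    def rigid_procrustes(source, template):
        # Best rotation R and translation t mapping `source` onto `template`
        # (both arrays of shape (n_points, 3) with known correspondence).
        mu_s, mu_t = source.mean(axis=0), template.mean(axis=0)
        H = (source - mu_s).T @ (template - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # exclude reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, mu_t - R @ mu_s

    # Synthetic example: a rotated and shifted copy of the template is mapped back.
    rng = np.random.default_rng(0)
    template = rng.random((500, 3))
    c, s = np.cos(0.3), np.sin(0.3)
    Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    source = template @ Rz.T + np.array([0.1, -0.2, 0.05])
    R, t = rigid_procrustes(source, template)
    aligned = source @ R.T + t                   # approximately equal to template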

Mass-Conservative Reduced Basis Approach for Heterogeneous Catalysis

Daniel Runge (WIAS Berlin)

This poster introduces a numerical framework for simulating heterogeneous catalytic reactions as they occur in in-situ surface characterization experiments. It comprises a finite-element solver for the fluid flow and its mass-conservative integration into a finite-volume scheme for the chemical species transport in both Cartesian and axisymmetric coordinates. We employ an efficient reduced boundary basis scheme for solving the species transport, where the basis elements are independent of the catalytic reaction rate model and of environmental parameters. One objective is the replication and explanation of hysteresis behavior in temperature-dependent XPS measurements.
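
The offline/online structure of a reduced basis method can be sketched generically as follows; this is plain POD-Galerkin on a toy parametric linear system, not the specific reduced boundary basis scheme of the poster.

    import numpy as np

    # Toy parametric problem  (A0 + mu * A1) u = b  on n unknowns.
    n, n_snap, r = 400, 20, 5
    A0 = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
          + np.diag(-np.ones(n - 1), -1))
    A1 = np.diag(np.linspace(0.0, 1.0, n))
    b = np.ones(n)

    # Offline stage: collect snapshots over training parameters, build a POD basis.
    mus = np.linspace(0.1, 2.0, n_snap)
    snapshots = np.column_stack([np.linalg.solve(A0 + mu * A1, b) for mu in mus])
    V = np.linalg.svd(snapshots, full_matrices=False)[0][:, :r]

    # Online stage: cheap Galerkin-projected solve for an unseen parameter.
    mu_new = 1.37
    A_red = V.T @ (A0 + mu_new * A1) @ V
    u_rb = V @ np.linalg.solve(A_red, V.T @ b)
    u_ref = np.linalg.solve(A0 + mu_new * A1, b)
    print("relative error:", np.linalg.norm(u_rb - u_ref) / np.linalg.norm(u_ref))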

Children and Large Language Models: A Comparison

Uli Sauerland (ZAS Berlin)

Both children and current large language models (LLMs) such as ChatGPT are fed with language data and learn to use language on the basis of this input. The output of both preschool children and current LLMs is in some respects non-adult, but the types of errors differ. While children's errors reflect human nature and language understanding, LLMs' errors do not conform to these generalizations, and some reflect LLMs' essentially unlimited memorization capacity.

Characterization and simulation methods for carbon fiber sheet molding compounds

Dominic Schommer (IVW Kaiserslautern)

The mechanical properties of parts made from fiber-reinforced polymer composites are highly influenced by their manufacturing process. It is therefore important to gather as much knowledge as possible about both the material and the process itself in order to correctly predict the behavior of the component in its final use. Over the last few years, we have developed accurate and efficient methods for the characterization of carbon fiber sheet molding compounds (C-SMC) processed via compression molding. The output of the characterization tests is used to directly generate the required input parameters for compression molding simulations. The simulation is based on a user-defined material model developed specifically for modelling C-SMC materials, which has been implemented for general use within the FEA software LS-DYNA®. The purpose of this poster is to provide an overview of the entire work and to show how the individual results are combined to form a more complete digital process chain.

Nucleation patterns of polymer crystals analyzed by machine learning models

Marco Werner (IPF Dresden)

We employ machine learning algorithms to investigate conformation patterns during polymer crystallization from undercooled melts, using data from coarse-grained molecular dynamics simulations. We make use of autoencoders to compress local conformation descriptors into low-dimensional fingerprints that reflect the degree of conformational and orientational order in a monomer's environment. Using a Gaussian mixture model, a data-driven decision boundary between ordered (crystalline) and disordered (amorphous) states is detected. We demonstrate that the resulting monomer labels are consistent with crystallinity measures based on known order parameters such as stem length and orientational order, but do not require the manual setting of a decision boundary.
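
The final classification step, separating ordered from disordered monomers with a Gaussian mixture in fingerprint space, can be sketched as follows; the two-dimensional fingerprints here are synthetic placeholders rather than autoencoder output from the simulations.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic stand-in for autoencoder fingerprints: a diffuse "amorphous" cloud
    # and a compact "crystalline" cloud in a 2D latent space.
    rng = np.random.default_rng(0)
    amorphous = rng.normal(loc=[0.0, 0.0], scale=0.8, size=(4000, 2))
    crystalline = rng.normal(loc=[2.5, 2.0], scale=0.2, size=(1000, 2))
    fingerprints = np.vstack([amorphous, crystalline])

    # A two-component Gaussian mixture supplies a data-driven decision boundary.
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    labels = gmm.fit_predict(fingerprints)

    # Take the component with the smaller covariance volume as "crystalline".
    crystal = int(np.argmin([np.linalg.det(c) for c in gmm.covariances_]))
    print("estimated crystalline fraction:", float(np.mean(labels == crystal)))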

Data Acquisition in a Novel Filament Winding Process

Marvin Wolf, Jens Schlimbach (IVW Kaiserslautern)

The acquisition of crucial process parameters in filament winding supports economic and ecological aspects of sustainability in the production of hydrogen pressure vessels. The obtained data are necessary to improve the understanding of the process and to optimize it through parameter and process adjustments. Furthermore, the establishment of in-line quality control systems and real-time control loops makes a significant contribution to ensuring product quality and reducing reject rates. Within the project DigiTain - Digitalization for Sustainability, various measurement methods and sensors for roving speed, temperature, tension and geometry have been examined. Considerations encompassed feasibility, operating requirements, installation location, and measurement standard deviations. The sensors will be integrated into the SpeedPreg production line, a novel direct towpreg winding process, to achieve the benefits mentioned above.