Kaemika is an app available on the four major app stores. It provides deterministic and stochastic simulation, supporting natural chemical notation enhanced with recursive and conditional generation of chemical reaction networks. It includes a liquid-handling protocol sublanguage that compiles to a virtual digital microfluidic device, and chemical and microfluidic simulations can be interleaved to model the full experimental cycle. A novel, unambiguous representation of directed multigraphs is used to lay out chemical reaction networks in graphical form.
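The deterministic simulation mode described above integrates mass-action kinetics for a chemical reaction network. The following is a minimal, generic sketch of that idea in Python (explicit Euler integration of a single toy reaction A + B -> C); it does not reproduce Kaemika's actual notation or simulation engine, and the rate constant and step size are illustrative assumptions.

```python
# Deterministic mass-action simulation of a toy CRN: A + B -> C with rate k.
# Generic sketch only; Kaemika's own syntax and numerics differ.

def simulate(a0, b0, c0, k, dt, steps):
    a, b, c = a0, b0, c0
    for _ in range(steps):
        flux = k * a * b * dt          # mass-action rate * time step
        a, b, c = a - flux, b - flux, c + flux
    return a, b, c

# Equal initial concentrations of A and B; integrate to t = 10.
a, b, c = simulate(1.0, 1.0, 0.0, k=1.0, dt=0.001, steps=10_000)
```

For equal initial concentrations the closed form is a(t) = a0 / (1 + a0*k*t), so at t = 10 roughly 91% of A has been converted to C; mass (a + c) is conserved by construction.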
Glioblastoma is a highly invasive brain tumor whose cells infiltrate surrounding normal brain tissue beyond the lesion outlines visible in current medical scans. These infiltrative cells are treated mainly by radiotherapy. Existing radiotherapy plans for brain tumors derive from population studies and scarcely account for patient-specific conditions. Here we provide a Bayesian machine learning framework for the rational design of improved, personalized radiotherapy plans using mathematical modeling and patient multimodal medical scans. Our method, for the first time, integrates complementary information from high-resolution MRI scans and highly specific FET-PET metabolic maps to infer tumor cell density in glioblastoma patients. The Bayesian framework quantifies imaging and modeling uncertainties and predicts patient-specific tumor cell density with confidence intervals. The proposed methodology relies only on data acquired at a single time point and is thus applicable in standard clinical settings. An initial clinical population study shows that radiotherapy plans generated from the inferred tumor cell infiltration maps spare more healthy tissue, thereby reducing radiation toxicity, while achieving accuracy comparable to standard radiotherapy protocols. Moreover, the inferred regions of high tumor cell density coincide with the tumor's radioresistant areas, providing guidance for personalized dose escalation. The proposed integration of multimodal scans and mathematical modeling provides a robust, non-invasive tool to assist personalized radiotherapy design.
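The core mechanism here — a Bayesian posterior over model parameters that yields predictions with credible intervals — can be illustrated with a deliberately simplified sketch. The single parameter, the identity forward model, the flat prior, and all numbers below are hypothetical stand-ins, not the paper's actual tumor-growth model or imaging likelihood.

```python
import math

# Hedged sketch: grid-based Bayesian posterior over one scalar model
# parameter (e.g., an infiltration length scale), showing how imaging
# noise translates into a posterior mean and spread. Everything here
# (prior, likelihood, data) is an illustrative assumption.

grid = [0.01 * i for i in range(1, 301)]      # candidate parameter values
prior = [1.0 / len(grid)] * len(grid)         # flat prior over the grid

observed, sigma = 1.5, 0.3                    # synthetic "imaging" datum + noise

def likelihood(theta):
    # Gaussian measurement model around the forward prediction
    # (the forward model is the identity in this toy example).
    return math.exp(-0.5 * ((observed - theta) / sigma) ** 2)

# Bayes' rule on the grid: posterior ∝ prior * likelihood, then normalize.
post = [p * likelihood(t) for p, t in zip(prior, grid)]
z = sum(post)
post = [p / z for p in post]

mean = sum(t * p for t, p in zip(grid, post))
var = sum((t - mean) ** 2 * p for t, p in zip(grid, post))
```

With a flat prior the posterior concentrates around the observation, and the posterior standard deviation (sqrt of `var`) tracks the assumed imaging noise — the same logic, in many dimensions and with a tumor-growth forward model, underlies the confidence intervals on cell density described above.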
To date, analytical solutions are not available for complex molecular association processes (e.g., molecular recognition in signaling or catalysis). Instead, Brownian dynamics (BD) simulations are commonly used to estimate the rate of diffusional association, e.g., for later use in mesoscopic simulations. Meanwhile, a portfolio of diffusional association (DA) methods has been developed that exploit BD. However, DA methods do not clearly distinguish between modeling, simulation, and experiment settings. This makes it hard to classify and compare existing methods with respect to, for instance, model assumptions, simulation approximations, or specific optimization strategies for steering the computation of trajectories. To address this deficiency we propose FADA (Flexible Architecture for Diffusional Association), an architecture that allows the flexible definition of the experiment, comprising a formal description of the model in SpacePi, different simulators, as well as validation and analysis methods. Based on the NAM (Northrup-Allison-McCammon) method, which forms the basis of many existing DA methods, we illustrate the structure and functioning of FADA. A discussion of future validation experiments illuminates how FADA can be exploited to estimate reaction rates and how validation techniques may be applied to validate additional features of the model.
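The NAM method mentioned above combines an analytically known diffusion-limited rate with a reaction probability estimated from BD trajectories: trajectories are launched on a sphere of radius b around the target and truncated when they escape past a larger radius q, and the fraction beta that react is corrected for escaped trajectories that would eventually have returned. A minimal sketch of that rate expression (for a spherically symmetric, force-free diffusion model, which is an assumption of this illustration):

```python
import math

# Sketch of the Northrup-Allison-McCammon (NAM) rate expression.
# k_D(r) = 4*pi*D*r is the Smoluchowski diffusion-limited rate to an
# absorbing sphere of radius r (no interaction forces assumed).

def k_D(D, r):
    return 4.0 * math.pi * D * r

def nam_rate(D, b, q, beta):
    # beta: fraction of BD trajectories started at radius b that react
    # before escaping past radius q. Correct for returning trajectories:
    # beta_inf = beta / (1 - (1 - beta) * k_D(b)/k_D(q)), and note that
    # k_D(b)/k_D(q) reduces to b/q here.
    beta_inf = beta / (1.0 - (1.0 - beta) * (b / q))
    return k_D(D, b) * beta_inf
```

If every trajectory reacts (beta = 1) the estimate recovers the diffusion-limited rate k_D(b); for partial reactivity the correction raises the raw beta toward its infinite-horizon value. In a framework like FADA, the BD simulator supplies beta while this formula belongs to the model/analysis layer — exactly the kind of separation the architecture is meant to make explicit.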
Large, complex, multi-scale, multi-physics simulation codes, running on high performance computing (HPC) platforms, have become essential to advancing science and engineering. These codes simulate multi-scale, multi-physics phenomena with unprecedented fidelity on petascale platforms, and are used by large communities. The continued ability of these codes to run on future platforms is as crucial to their communities as continued improvements in instruments and facilities are to experimental scientists. However, the ability of code developers to port and evolve these codes faces a serious challenge with the paradigm shift underway in platform architecture. The complexity and uncertainty of future platforms make it essential to approach this challenge cooperatively as a community. We need to develop common abstractions, frameworks, programming models, and software development methodologies that can be applied across a broad range of complex simulation codes, and common software infrastructure to support them. In this position paper we express and discuss our belief that such an infrastructure is critical to the deployment of existing and new large, multi-scale, multi-physics codes on future HPC platforms.
Simulation experiments are typically conducted repeatedly during the model development process, for example, to re-validate whether a behavioral property still holds after several model changes. Approaches for automatically reusing and generating simulation experiments can support modelers in conducting simulation studies in a more systematic and effective manner. They rely on explicit experiment specifications and, so far, on user interaction for initiating the reuse; consequently, they are constrained to supporting the reuse of simulation experiments in a specific setting. Our approach goes one step further by automatically identifying and adapting the experiments to be reused for a variety of scenarios. To achieve this, we exploit provenance graphs of simulation studies, which provide valuable information about previous modeling and experimenting activities and contain meta-information about the different entities that were used or produced during the simulation study. We define provenance patterns and associate them with a semantics, which allows us to interpret the different activities and to construct transformation rules for provenance graphs. Our approach is implemented in a Reuse and Adapt framework for Simulation Experiments (RASE), which can interface with various modeling and simulation tools. In case studies, we demonstrate the utility of our framework for a) the repeated sensitivity analysis of an agent-based model of migration routes, and b) the cross-validation of two models of a cell signaling pathway.
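The idea of querying a provenance graph to find experiments worth reusing can be sketched with a toy graph of activities and entities connected by "used"/"generated" edges (in the style of standard provenance models). All node names and the traversal below are invented for illustration; RASE's actual data model, patterns, and transformation rules are richer.

```python
# Hedged sketch: walk a provenance graph backwards from a new model
# version, collecting experiment specifications used with its
# predecessors -- candidates for reuse after the model change.
# Edge triples are (activity, relation, entity); all names invented.

edges = [
    ("calibrate", "used", "model_v1"),
    ("calibrate", "used", "experiment_spec_1"),
    ("calibrate", "generated", "model_v2"),
    ("revise", "used", "model_v2"),
    ("revise", "generated", "model_v3"),
]

def experiments_to_reuse(edges, new_model):
    specs, frontier = set(), {new_model}
    while frontier:
        nxt = set()
        # Activities that generated any model currently on the frontier...
        for act in {a for a, r, e in edges if r == "generated" and e in frontier}:
            # ...contribute the entities they used: experiment specs are
            # collected, predecessor models extend the backward walk.
            for a, r, e in edges:
                if a == act and r == "used":
                    (specs if e.startswith("experiment") else nxt).add(e)
        frontier = nxt
    return specs

reusable = experiments_to_reuse(edges, "model_v3")
```

Walking back from `model_v3` passes through `revise` and `calibrate` and surfaces `experiment_spec_1` as a reuse candidate; associating such patterns with a semantics (e.g., "this spec validated an ancestor of the changed model") is what lets the framework decide how the experiment must be adapted, not just that it exists.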
A high-fidelity multi-physics Eulerian computational framework is presented for the simulation of supersonic parachute inflation during Mars landing. Unlike previous investigations in this area, the framework takes into account the initial folding pattern of the parachute, the effect of flow compressibility on the fabric material porosity, and the interactions between the supersonic fluid flow and the suspension lines. Several adaptive mesh refinement (AMR)-enabled, large eddy simulation (LES)-based simulations of a full-size disk-gap-band (DGB) parachute inflating in the low-density, low-pressure, carbon dioxide (CO2) Martian atmosphere are reported. Comparison of the drag histories and first peak forces between the simulation results and experimental data collected during the NASA Curiosity Rover's Mars atmospheric entry shows reasonable agreement. Furthermore, a rudimentary material failure analysis is performed to provide an estimate of the safety factor for the parachute decelerator system. The proposed framework demonstrates the potential of using computational fluid dynamics (CFD) and fluid-structure interaction (FSI)-based simulation tools for future supersonic parachute design.
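The rudimentary safety-factor estimate mentioned above amounts to comparing a simulated peak load against rated material strength. The sketch below assumes an idealized equal sharing of the peak inflation force across the suspension lines; the function, its parameters, and all numbers are illustrative assumptions, not values from the paper or from MSL flight data.

```python
# Hedged sketch of a rudimentary safety-factor estimate for a parachute
# decelerator: peak inflation force shared equally across suspension
# lines, compared with each line's rated strength. Numbers are invented.

def safety_factor(rated_line_strength_kN, peak_force_kN, n_lines):
    # Idealization: uniform load distribution over all suspension lines.
    per_line_load = peak_force_kN / n_lines
    return rated_line_strength_kN / per_line_load

sf = safety_factor(rated_line_strength_kN=4.0, peak_force_kN=65.0, n_lines=80)
```

A safety factor above 1 indicates margin against line failure under the simulated first peak force; a real analysis would also account for uneven load sharing, dynamic amplification, and fabric (canopy) stresses.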