
A farewell to Professor RNDr. Věra Trnková, DrSc.

Posted by: Jiri Adamek
Publication date: 2019
Research field: Computer science
Paper language: English
Author: Jiří Adámek





This is the introduction to the volume of the journal Commentationes Mathematicae Universitatis Carolinae dedicated to the memory of Věra Trnková.



Read also

Some of the most obviously correct physical theories - namely string theory and the multiverse - make no testable predictions, leading many to question whether we should accept something as scientific even if it makes no testable predictions and hence is not refutable. However, some far-thinking physicists have proposed instead that we should give up on the notion of Falsifiability itself. We endorse this suggestion but think it does not go nearly far enough. We believe that we should also dispense with other outdated ideas, such as Fidelity, Frugality, Factuality and other F words. And we quote a lot of famous people to support this view.
Xi-Wen Guan, Feng He, 2019
In the 1960s Professor Chen Ping Yang, together with Professor Chen Ning Yang, published several seminal papers on Bethe's hypothesis for various problems in physics. The works on the lattice gas model, critical behaviour in the liquid-gas transition, the one-dimensional (1D) Heisenberg spin chain, and the thermodynamics of 1D delta-function interacting bosons are significantly important and influential in the fields of mathematical physics and statistical mechanics. In particular, the work on the 1D Heisenberg spin chain led to subsequent developments in many problems using Bethe's hypothesis. The method which Yang and Yang proposed to treat the thermodynamics of the 1D system of bosons with a delta-function interaction leads to significant applications in a wide range of problems in quantum statistical mechanics. The Yang-Yang thermodynamics has found beautiful experimental verification in recent years.
The Geant4 toolkit is used extensively in high energy physics to simulate the passage of particles through matter and to predict effects such as detector efficiencies and smearing. Geant4 uses many underlying models to predict particle interaction kinematics, and uncertainty in these models leads to uncertainty in high energy physics measurements. The Geant4 collaboration recently made free parameters in some models accessible through partnership with Geant4 developers. We present a study of the impact of varying parameters in three Geant4 hadronic physics models on agreement with thin target datasets and describe fits to these datasets using the Professor model tuning framework. We find that varying parameters produces substantially better agreement with some datasets, but that more degrees of freedom are required for full agreement. This work is a first step towards a common framework for propagating uncertainties in Geant4 models to high energy physics measurements, and we outline future work required to complete that goal.
U. Dammalapati, K. Jungmann, 2016
Energy levels, wavelengths, lifetimes and hyperfine structure constants for the isotopes of the first and second spectra of radium, Ra I and Ra II, have been compiled. Wavelengths and wave numbers are tabulated for 226Ra and for other Ra isotopes. Isotope shifts and hyperfine structure constants of even- and odd-A isotopes of the neutral radium atom and singly ionized radium are included. Experimental lifetimes of the states of both neutral and ionic Ra are also added, where available. The information is beneficial for present and future experiments aimed at different physics motivations using neutral Ra and singly ionized Ra.
The rapid recent progress in machine learning (ML) has raised a number of scientific questions that challenge the longstanding dogma of the field. One of the most important riddles is the good empirical generalization of overparameterized models. Overparameterized models are excessively complex with respect to the size of the training dataset, which results in them perfectly fitting (i.e., interpolating) the training data, which is usually noisy. Such interpolation of noisy data is traditionally associated with detrimental overfitting, and yet a wide range of interpolating models -- from simple linear models to deep neural networks -- have recently been observed to generalize extremely well on fresh test data. Indeed, the recently discovered double descent phenomenon has revealed that highly overparameterized models often improve over the best underparameterized model in test performance. Understanding learning in this overparameterized regime requires new theory and foundational empirical studies, even for the simplest case of the linear model. The underpinnings of this understanding have been laid in very recent analyses of overparameterized linear regression and related statistical learning tasks, which resulted in precise analytic characterizations of double descent. This paper provides a succinct overview of this emerging theory of overparameterized ML (henceforth abbreviated as TOPML) that explains these recent findings through a statistical signal processing perspective. We emphasize the unique aspects that define the TOPML research area as a subfield of modern ML theory and outline interesting open questions that remain.
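The interpolation behaviour the abstract describes can be seen directly in the simplest setting it mentions, the linear model. The sketch below (not taken from the paper; the dimensions and noise level are illustrative assumptions) fits an overparameterized linear regression with many more random features than training samples: the minimum-norm least-squares solution fits the noisy training labels exactly.

```python
import numpy as np

# Illustrative sketch of overparameterized linear regression:
# with d features >> n samples, the design matrix X has full row rank
# (with probability 1 for Gaussian features), so exact interpolation
# of the noisy labels is possible.
rng = np.random.default_rng(0)
n, d = 20, 200                                # n samples, d features (d >> n)
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d) / np.sqrt(d)  # hypothetical ground-truth weights
y = X @ w_true + 0.1 * rng.standard_normal(n)  # noisy training labels

# np.linalg.pinv(X) @ y is the minimum-l2-norm solution of X w = y,
# the interpolator analyzed in the double descent literature.
w_hat = np.linalg.pinv(X) @ y

train_residual = np.linalg.norm(X @ w_hat - y)
print(f"training residual: {train_residual:.2e}")  # ~0: noisy data interpolated
```

Among the infinitely many weight vectors that interpolate the data when d > n, the pseudoinverse picks the one with smallest l2 norm; it is this implicit regularization that the theory connects to benign generalization despite perfect fitting of noise.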