Bruxism is a disorder characterised by teeth grinding and clenching, and many sufferers are unaware of it until their dental health professional notices permanent tooth wear. Stress and anxiety are often listed among the factors exacerbating bruxism, which may explain why the COVID-19 pandemic gave rise to a surge in bruxism cases. It is therefore essential to develop tools that allow early diagnosis of bruxism in an unobtrusive manner. This work explores the feasibility of detecting bruxism-related events using earables in a mimicked in-the-wild setting. Using an inertial measurement unit (IMU) for data collection, we apply traditional machine learning to detect teeth grinding and clenching. We observe superior performance from models based on gyroscope data, achieving 88% and 66% accuracy on grinding and clenching activities, respectively, in a controlled environment, and 76% and 73%, respectively, in an in-the-wild environment.
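A minimal sketch of the kind of pipeline such a study might use, assuming windowed 3-axis gyroscope data and a standard scikit-learn classifier; the window length, features and random-forest choice are illustrative assumptions, not the authors' actual pipeline:

```python
# Illustrative sketch (not the authors' pipeline): window 3-axis gyroscope
# data from an earable IMU and train a conventional classifier to separate
# grinding / clenching / other activity.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(window):
    """Simple per-axis statistics for one gyroscope window of shape (n_samples, 3)."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        np.abs(window).max(axis=0),
        np.sqrt((window ** 2).mean(axis=0)),  # RMS energy per axis
    ])

def windows(signal, length=200, step=100):
    """Slice a (n_samples, 3) gyroscope stream into overlapping windows."""
    return [signal[i:i + length] for i in range(0, len(signal) - length, step)]

# gyro: (n_samples, 3) array; labels: one activity label per window (assumed given)
# X = np.array([extract_features(w) for w in windows(gyro)])
# clf = RandomForestClassifier(n_estimators=200, random_state=0)
# print(cross_val_score(clf, X, labels, cv=5).mean())
```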
We present an example of the practical implementation of a protocol for experimental bifurcation detection based on on-line identification and feedback control ideas. The approach couples the experiment with an on-line, computer-assisted identification/feedback protocol so that the closed-loop system converges to the open-loop bifurcation points. We demonstrate the applicability of this instability detection method by real-time, computer-assisted detection of period-doubling bifurcations of an electronic circuit; the circuit implements an analog realization of the Rössler system. The method succeeds in locating the bifurcation points even in the presence of modest experimental uncertainties, noise and limited resolution. The results presented here include bifurcation detection experiments that rely on measurements of a single state variable and delay-based phase space reconstruction, as well as an example of tracing entire segments of a codimension-1 bifurcation boundary in two-parameter space.
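As context, the sketch below simulates the Rössler dynamics that the analog circuit realizes and reconstructs a phase portrait from a single measured variable by delay embedding; the parameter values (a=0.2, b=0.2, c=5.7), sampling rate and delay are standard textbook choices, not taken from the experiment described above:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Roessler vector field; a, b, c are standard textbook values (assumed).
def roessler(t, state, a=0.2, b=0.2, c=5.7):
    x, y, z = state
    return [-y - z, x + a * y, b + z * (x - c)]

# Integrate and sample uniformly so a fixed delay in samples is meaningful.
t_eval = np.arange(0.0, 250.0, 0.05)
sol = solve_ivp(roessler, (0.0, 250.0), [1.0, 1.0, 0.0], t_eval=t_eval)
x = sol.y[0]          # pretend only x(t) is measurable (single-sensor case)

# Delay-coordinate reconstruction of a 3-D phase portrait from x(t) alone.
tau = 30              # delay in samples (illustrative choice)
embedded = np.column_stack([x[:-2 * tau], x[tau:-tau], x[2 * tau:]])
```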
In our experience of working with domain experts who use today's AutoML systems, a common problem we encountered is what we call unrealistic expectations: users face a very challenging task with a noisy data acquisition process, whilst being expected to achieve startlingly high accuracy with machine learning (ML). Consequently, many computationally expensive AutoML runs and labour-intensive ML development processes are predestined to fail from the beginning. In traditional software engineering, this problem is addressed via a feasibility study, an indispensable step before developing any software system. In this paper, we present ease.ml/snoopy with the goal of performing an automatic feasibility study before building ML applications or collecting too many samples. A user provides inputs in the form of a dataset, which is representative of the task and data acquisition process, and a quality target (e.g., expected accuracy > 0.8). The system returns its assessment of whether this target is achievable using ML given the input data. We approach this problem by estimating the irreducible error of the underlying task, also known as the Bayes error. The key technical contribution of this work is the design of a practical Bayes error estimator. We carefully evaluate the benefits and limitations of running ease.ml/snoopy prior to training ML models on datasets that are too noisy to reach the desired target accuracy. By including the automatic feasibility study in the iterative label cleaning process, users are able to save substantial labeling time and money.
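One standard practical route to such an estimate is a 1-NN (Cover-Hart) bound computed on feature representations of the data; the sketch below is illustrative only and is not necessarily the estimator implemented in ease.ml/snoopy:

```python
# Illustrative Bayes-error bounds from the 1-NN error rate on feature
# embeddings; the lower-bound formula assumes binary classification.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsClassifier

def bayes_error_bounds(features, labels):
    """Cover-Hart style lower/upper bounds on the Bayes error."""
    preds = cross_val_predict(KNeighborsClassifier(n_neighbors=1),
                              features, labels, cv=10)
    err_1nn = np.mean(preds != labels)
    lower = 0.5 * (1.0 - np.sqrt(max(0.0, 1.0 - 2.0 * err_1nn)))
    upper = err_1nn
    return lower, upper

# Feasibility check against a target accuracy of 0.8 (i.e. error <= 0.2):
# lower, upper = bayes_error_bounds(embeddings, labels)
# feasible = lower <= 0.2
```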
We present an idea for the creation of a crystalline undulator and report its first realization. One face of a silicon crystal was given periodic micro-scratches (trenches) by means of a diamond blade. X-ray tests of the crystal deformation induced by this periodic pattern of surface scratches show that a sinusoidal shape is observed on both the scratched surface and the opposite (unscratched) face of the crystal; that is, a periodic sinusoidal deformation extends through the bulk of the crystal. This opens up the possibility of experiments with high-energy particles channeled in a crystalline undulator, a novel compact source of radiation.
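For orientation, the relations below are the standard textbook undulator formulas (not quoted from the abstract) suggesting why a sinusoidal deformation of period \(\lambda_u\) and amplitude \(A\) acts as a compact radiation source for channeled particles of Lorentz factor \(\gamma\):

```latex
u(z) = A \sin\!\left(\frac{2\pi z}{\lambda_u}\right), \qquad
K = \frac{2\pi \gamma A}{\lambda_u}, \qquad
\lambda_{\text{on-axis}} \simeq \frac{\lambda_u}{2\gamma^2}\left(1 + \frac{K^2}{2}\right).
```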
We explore the feasibility of a next-generation Mu2e experiment that uses Project-X beams to achieve a sensitivity approximately a factor of ten better than that of the currently planned Mu2e facility.
Persistent Homology is a fairly new branch of Computational Topology that combines geometry and topology for effective shape description of use in Pattern Recognition. In particular, it registers, through Betti numbers, the presence of holes and their persistence while a parameter (the filtering function) is varied. In this paper, some recent developments in this field are integrated into a k-Nearest Neighbor search algorithm suited to the automatic retrieval of melanocytic lesions. Dermatologists have long used five morphological parameters (A = Asymmetry, B = Boundary, C = Color, D = Diameter, E = Elevation or Evolution) to assess the malignancy of a lesion. The algorithm is based on a qualitative assessment of the segmented images, computing both 1- and 2-dimensional Persistent Betti Number functions related to the ABCDE parameters and to the internal texture of the lesion. The results of a feasibility test on a set of 107 melanocytic lesions are reported in the section dedicated to the numerical experiments.
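As an illustration of the building blocks, the sketch below computes persistence diagrams of a grayscale lesion image via a sublevel-set cubical filtration (using the gudhi library) and retrieves the k nearest lesions by bottleneck distance between diagrams; the grayscale filtering function and the bottleneck metric are assumptions for the example, whereas the paper uses filtering functions tied to the ABCDE parameters and lesion texture:

```python
# Illustrative persistence-based retrieval; not the paper's exact algorithm.
import gudhi
import numpy as np

def persistence_diagram(image, dim):
    """Persistence intervals of homology degree `dim` for a 2-D grayscale image,
    using a sublevel-set filtration of the pixel intensities."""
    cc = gudhi.CubicalComplex(top_dimensional_cells=image.astype(float))
    cc.persistence()
    return cc.persistence_intervals_in_dimension(dim)

def k_nearest_lesions(query_img, database_imgs, k=5, dim=1):
    """Indices of the k database images closest to the query, measured by
    bottleneck distance between their persistence diagrams."""
    q = persistence_diagram(query_img, dim)
    dists = [gudhi.bottleneck_distance(q, persistence_diagram(img, dim))
             for img in database_imgs]
    return np.argsort(dists)[:k]
```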