The new Belle II experiment at the asymmetric $e^+ e^-$ accelerator SuperKEKB at KEK in Japan is designed to deliver a peak luminosity of $8\times10^{35}\,\text{cm}^{-2}\text{s}^{-1}$. To perform high-precision track reconstruction, e.g. for measurements of time-dependent CP-violating decays and secondary vertices, the Belle II detector is equipped with a highly segmented pixel detector (PXD). The high instantaneous luminosity and short bunch crossing times result in a large stream of data in the PXD, which needs to be significantly reduced for offline storage. The data reduction is performed using an FPGA-based Data Acquisition Tracking and Concentrator Online Node (DATCON), which uses information from the Belle II silicon strip vertex detector (SVD) surrounding the PXD to carry out online track reconstruction, extrapolation to the PXD, and Region of Interest (ROI) determination on the PXD. The data stream is reduced by a factor of ten with an ROI finding efficiency of more than 90% for PXD hits inside the ROIs, for stable particles with $p_\text{T}$ down to 50 MeV. We present the current status of the implementation of the track reconstruction using Hough transformations, and the results obtained for simulated $\Upsilon(4S) \rightarrow B\bar{B}$ events.
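As a purely illustrative sketch of Hough-transform track finding (a software toy, not the DATCON FPGA implementation; the binning, parameter names, and the use of conformal-mapped transverse-plane coordinates are assumptions), each hit is mapped onto a curve in a two-dimensional parameter space, and hits belonging to the same track intersect in one accumulator cell:

```python
import numpy as np

def hough_accumulator(hits, n_alpha=180, n_d=100, d_max=0.05):
    """Fill an (alpha, d) Hough accumulator for 2D hits.

    Each hit (u, v) contributes the sinusoid d = u*cos(alpha) + v*sin(alpha);
    curves from hits of the same (straightened) track overlap in one cell,
    so accumulator maxima correspond to track candidates.
    """
    alphas = np.linspace(0.0, np.pi, n_alpha, endpoint=False)
    d_edges = np.linspace(-d_max, d_max, n_d + 1)
    acc = np.zeros((n_alpha, n_d), dtype=int)
    for u, v in hits:
        d = u * np.cos(alphas) + v * np.sin(alphas)
        d_bins = np.digitize(d, d_edges) - 1
        valid = (d_bins >= 0) & (d_bins < n_d)
        acc[np.arange(n_alpha)[valid], d_bins[valid]] += 1
    return acc, alphas, d_edges

# The parameters of each accumulator maximum define a track candidate,
# which is then extrapolated to the PXD to open a region of interest.
```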
We report on the first calibration of the standard Belle II $B$-flavor tagger using the full data set collected at the $\Upsilon(4S)$ resonance in 2019 with the Belle II detector at the SuperKEKB collider, corresponding to 8.7 fb$^{-1}$ of integrated luminosity. The calibration is performed by reconstructing various hadronic charmed $B$-meson decays with flavor-specific final states. We use simulation to optimize our event selection criteria and to train the flavor tagging algorithm. We determine the tagging efficiency and the fraction of wrongly identified tag-side $B$~candidates from a measurement of the time-integrated $B^0-\overline{B}^0$ mixing probability. The total effective efficiency is measured to be $\varepsilon_{\rm eff} = \big(33.8 \pm 3.6(\text{stat}) \pm 1.6(\text{sys})\big)\%$, which is in good agreement with the predictions from simulation and comparable with the best value obtained by the Belle experiment. The results show a good understanding of the detector performance and offer a basis for future calibrations.
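For reference, the effective efficiency quoted above follows the standard flavor-tagging convention: it combines the raw tagging efficiency with the wrong-tag fraction, typically in bins $i$ of the tagger output, as $\varepsilon_{\rm eff} = \sum_i \varepsilon_i \,(1 - 2 w_i)^2$, where $\varepsilon_i$ is the fraction of events tagged in bin $i$ and $w_i$ the corresponding wrong-tag fraction. The statistical power of a tagged sample of $N$ events is then equivalent to that of $\varepsilon_{\rm eff} N$ perfectly tagged events.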
We describe the conversion of data simulated and recorded by the Belle experiment to the Belle~II format with the software package \texttt{b2bii}, which is part of the Belle~II Analysis Software Framework. This allows the validation of the analysis software and the improvement of analyses of the recorded Belle dataset using newly developed analysis tools.
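A minimal usage sketch, assuming the \texttt{b2biiConversion} helper shipped with recent releases of the Belle~II Analysis Software Framework (basf2); the input file name is a placeholder and the exact function signature may differ between releases:

```python
import basf2
from b2biiConversion import convertBelleMdstToBelleIIMdst

main = basf2.create_path()

# Convert Belle MDST input into Belle II data-store objects on the fly,
# so that Belle II analysis modules can run on the converted Belle data.
convertBelleMdstToBelleIIMdst('belle_events.mdst', path=main)  # placeholder file name

# ... append Belle II analysis modules here ...

basf2.process(main)
```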
From April to July 2018, a data sample at the peak energy of the $\Upsilon(4S)$ resonance was collected with the Belle~II detector at the SuperKEKB electron-positron collider. This is the first data sample of the Belle~II experiment. Using Bhabha and digamma events, we measure the integrated luminosity of the data sample to be ($496.3 \pm 0.3 \pm 3.0$)~pb$^{-1}$, where the first uncertainty is statistical and the second is systematic. This work provides a basis for future luminosity measurements at Belle~II.
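Schematically, such a measurement relates the integrated luminosity to the number of selected Bhabha or digamma candidates $N$ (after background subtraction), the selection efficiency $\varepsilon$, and the visible cross section $\sigma_{\rm vis}$ within the acceptance, via $L_{\rm int} = N / (\varepsilon \,\sigma_{\rm vis})$; the symbols here are generic placeholders, not those used in the analysis.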
We present an FPGA-based online data reduction system for the pixel detector of the future Belle II experiment. The occupancy of the pixel detector is estimated at 3 %. This corresponds to a data output rate of more than 20 GB/s after zero suppression, dominated by background. The Online Selection Nodes (ONSEN) system aims to reduce the background data by a factor of 30. It consists of 33 MicroTCA cards, each equipped with a Xilinx Virtex-5 FPGA and 4 GiB DDR2 RAM. These cards are hosted by 9 AdvancedTCA carrier boards. The ONSEN system buffers the entire output data from the pixel detector for up to 5 seconds. During this time, the Belle II high-level trigger PC farm performs an online event reconstruction, using data from the other Belle II subdetectors. It extrapolates reconstructed tracks to the layers of the pixel detector and defines regions of interest around the intercepts. Based on this information, the ONSEN system discards all pixels not inside a region of interest before sending the remaining hits to the event builder system. During a beam test with one layer of the pixel detector and four layers of the surrounding silicon strip detector, including a scaled-down version of the high-level trigger and data acquisition system, the pixel data reduction using regions of interest was exercised. We investigated the data produced in more than 20 million events and verified that the ONSEN system behaved correctly, forwarding all pixels inside regions of interest and discarding the rest.
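The selection step itself can be pictured as a simple containment test per pixel; the following Python sketch is only a functional illustration with placeholder names, not the ONSEN firmware logic:

```python
from dataclasses import dataclass

@dataclass
class Roi:
    """Rectangular region of interest on a single pixel sensor."""
    sensor_id: int
    row_min: int
    row_max: int
    col_min: int
    col_max: int

def keep_pixel(sensor_id, row, col, rois):
    """Return True if the pixel lies inside any region of interest
    defined on the same sensor; all other pixels are discarded."""
    return any(
        r.sensor_id == sensor_id
        and r.row_min <= row <= r.row_max
        and r.col_min <= col <= r.col_max
        for r in rois
    )

# Example: a single ROI on sensor 7 keeps pixel (120, 45) and drops (10, 400).
rois = [Roi(sensor_id=7, row_min=100, row_max=200, col_min=20, col_max=80)]
assert keep_pixel(7, 120, 45, rois)
assert not keep_pixel(7, 10, 400, rois)
```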
The Full Event Interpretation is presented: a new exclusive tagging algorithm used by the high-energy physics experiment Belle II. The experimental setup of Belle II allows the precise measurement of otherwise inaccessible $B$ meson decay modes. The Full Event Interpretation algorithm enables many of these measurements. The algorithm relies on machine learning to automatically identify plausible $B$ meson decay chains based on the data recorded by the detector. Compared to similar algorithms employed by previous experiments, the Full Event Interpretation provides a greater efficiency, yielding a larger effective sample size usable in measurements.
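The hierarchical idea can be sketched as follows: candidates are built stage by stage, and the classifier output attached at one stage becomes an input feature of the next. The sketch below is purely conceptual, with a toy classifier and placeholder names, and is not the actual Full Event Interpretation implementation:

```python
from itertools import product

def build_stage(parent_name, daughter_lists, classify):
    """Combine candidate lists of daughter particles into parent candidates
    and attach a signal probability from a (placeholder) classifier."""
    parents = []
    for combo in product(*daughter_lists):
        features = [c['prob'] for c in combo]
        parents.append({'name': parent_name,
                        'daughters': combo,
                        'prob': classify(features)})
    return parents

# Toy classifier: the real algorithm trains a multivariate classifier
# (boosted decision trees) for every decay channel.
toy_classifier = min

# Stage 1: final-state particles with toy identification probabilities.
pions = [{'name': 'pi', 'prob': 0.9}, {'name': 'pi', 'prob': 0.6}]
kaons = [{'name': 'K', 'prob': 0.8}]

# Stage 2: D candidates from K pi; a further stage would combine D mesons
# with pions into B candidates, reusing the probabilities assigned here.
d_candidates = build_stage('D0', [kaons, pions], toy_classifier)
```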