When an object impacts the free surface of a liquid, it ejects a splash curtain upwards and creates an air cavity below the free surface. As the object descends into the liquid, the air cavity eventually closes under the action of hydrostatic pressure (deep seal). In contrast, the splash curtain may either splash outwards or dome over and close, creating a surface seal. In this paper we experimentally investigate how the splash curtain dynamics are governed by the interplay of the cavity pressure difference, gravity, and surface tension, and how this interplay determines whether or not a surface seal occurs. Based on the experimental observations and measurements, we develop an analytical model for the trajectory and dynamics of the splash curtain. The model enables us to reveal a scaling relationship for the dimensionless surface seal time and to discover the existence of a critical dimensionless number that predicts the occurrence of surface seal. This scaling indicates that the most significant parameter governing the occurrence of surface seal is the velocity of the airflow rushing into the cavity, in contrast to the current understanding, which considers the impact velocity to be the determining parameter.
Most prior work in the sequence-to-sequence paradigm focused on datasets with input sequence lengths in the hundreds of tokens due to the computational constraints of common RNN and Transformer architectures. In this paper, we study long-form abstractive text summarization, a sequence-to-sequence setting with input sequence lengths up to 100,000 tokens and output sequence lengths up to 768 tokens. We propose SEAL, a Transformer-based model featuring a new encoder-decoder attention that dynamically selects input snippets to sparsely attend to for each output segment. Using only the original documents and summaries, we derive proxy labels that provide weak supervision for the extractive layers alongside the regular supervision from the abstractive summaries. The SEAL model achieves state-of-the-art results on existing long-form summarization tasks, and outperforms strong baseline models on a new dataset/task we introduce, Search2Wiki, with much longer input text. Since content selection is explicit in the SEAL model, a desirable side effect is that the selection can be inspected for enhanced interpretability.
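The snippet-selection idea in this abstract can be illustrated with a minimal sketch: score candidate input snippets against a decoder-segment query, keep only the top-k, and run ordinary softmax attention over the tokens of the selected snippets. The shapes, the mean-key scoring rule, and the function name are illustrative assumptions, not the published SEAL architecture.

```python
import numpy as np

def sparse_snippet_attention(decoder_q, snippet_keys, snippet_values, k=2):
    """Hedged sketch of top-k snippet selection followed by dense attention.

    decoder_q:      (d,)                query for one output segment
    snippet_keys:   (n_snippets, L, d)  per-snippet token keys
    snippet_values: (n_snippets, L, d)  per-snippet token values
    Scoring and shapes are assumptions for illustration only.
    """
    # Score each snippet by its mean key's dot product with the query.
    snippet_scores = snippet_keys.mean(axis=1) @ decoder_q        # (n_snippets,)
    top = np.argsort(snippet_scores)[-k:]                         # top-k snippet indices

    # Flatten the selected snippets and attend densely over their tokens.
    d = snippet_keys.shape[-1]
    keys = snippet_keys[top].reshape(-1, d)                       # (k*L, d)
    vals = snippet_values[top].reshape(-1, d)
    logits = keys @ decoder_q / np.sqrt(d)
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()

    # Return the context vector plus the selection, which stays inspectable.
    return weights @ vals, sorted(top.tolist())
```

Returning the selected snippet indices alongside the context vector mirrors the abstract's point that explicit content selection can be inspected for interpretability.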
The Santa Cruz Extreme AO Lab (SEAL) is a new visible-wavelength testbed designed to advance the state of the art in wavefront control for high contrast imaging on large, segmented, ground-based telescopes. SEAL provides multiple options for simulating atmospheric turbulence, including rotating phase plates and a custom Meadowlark spatial light modulator that delivers phase offsets of up to 6π at 635 nm. A 37-segment IrisAO deformable mirror (DM) simulates the W. M. Keck Observatory segmented primary mirror. The adaptive optics system consists of a woofer/tweeter deformable mirror system (a 97-actuator ALPAO DM and a 1024-actuator Boston Micromachines MEMS DM, respectively) and four wavefront sensor arms: 1) a high-speed Shack-Hartmann WFS, 2) a reflective pyramid WFS, designed as a prototype for the ShaneAO system at Lick Observatory, 3) a vector-Zernike WFS, and 4) a Fast Atmospheric Self Coherent Camera Technique (FAST) demonstration arm, consisting of a custom focal plane mask and a high-speed sCMOS detector. Finally, the initial science arms include a classical Lyot-style coronagraph as well as FAST (which doubles as a WFS and science camera). SEAL's real-time control system is based on the Compute and Control for Adaptive Optics (CACAO) package, and is designed to support the efficient transfer of software between SEAL and the Keck II AO system. In this paper, we present an overview of the design and first light performance of SEAL.
The Greenland Sea is an important breeding ground for harp and hooded seals. Estimates of the annual seal pup production are critical inputs to the abundance estimation needed for management of the species. These estimates are usually based on counts from aerial photographic surveys. However, due to the large extent of the whelping region, only a minor part of it can be photographed. To estimate the total seal pup production, we propose a Bayesian hierarchical modeling approach motivated by viewing the seal pup appearances as a realization of a log-Gaussian Cox process, using covariate information from satellite imagery as a proxy for ice thickness. For inference, we utilize the stochastic partial differential equation (SPDE) module of the integrated nested Laplace approximation (INLA) framework. In a case study using survey data from 2012, we compare our results with existing methodology in a comprehensive cross-validation study. The results of the study indicate that our method improves local estimation performance, and that the increased prediction uncertainty of our method is required to obtain calibrated count predictions. This suggests that the sampling density of the survey design may not be sufficient to obtain reliable estimates of the seal pup production.
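The observation model behind a log-Gaussian Cox process can be sketched in a few lines: on each surveyed grid cell, the log-intensity is an intercept plus a covariate effect plus a Gaussian random term, and the observed count is Poisson with that intensity. In this minimal sketch the random term is drawn independently per cell for brevity (the paper fits a spatially correlated field via INLA's SPDE module), and all parameter values and the covariate's role as an ice-thickness proxy are illustrative assumptions.

```python
import numpy as np

def simulate_lgcp_counts(cov_value, beta0=-2.0, beta1=1.5, sigma=0.5, rng=None):
    """Hedged sketch of LGCP-style count generation on a grid of cells.

    cov_value: array of per-cell covariate values (e.g. a satellite-derived
               ice-thickness proxy); parameter values are illustrative.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Gaussian term per cell (iid here; a real LGCP uses a correlated field).
    u = rng.normal(0.0, sigma, size=np.shape(cov_value))
    # Log-linear intensity, then Poisson-distributed observed counts.
    log_intensity = beta0 + beta1 * np.asarray(cov_value) + u
    return rng.poisson(np.exp(log_intensity))
```

The doubly stochastic structure (a random intensity underneath the Poisson counts) is what produces the wider, better-calibrated prediction intervals the abstract refers to.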
In channel flows, a step on the route to turbulence is the formation of streaks, often due to algebraic growth of disturbances. While a variation of viscosity in the gradient direction often plays a large role in laminar-turbulent transition in shear flows, we show that it has, surprisingly, little effect on the algebraic growth. Non-uniform viscosity therefore may not always work as a flow-control strategy for maintaining the flow as laminar.
Instrumental variables (IVs) are extensively used to estimate treatment effects when the treatment and outcome are confounded by unmeasured confounders; however, weak IVs are often encountered in empirical studies and can render inference unreliable. Many studies have considered building a stronger IV from the original, possibly weak, IV in the design stage of a matched study, at the cost of not using some of the samples in the analysis. It is widely accepted that strengthening an IV tends to render nonparametric tests more powerful and will increase the power of sensitivity analyses in large samples. In this article, we re-evaluate this conventional wisdom to bring new insights into this topic. We consider matched observational studies from three perspectives. First, we evaluate the trade-off between IV strength and sample size for nonparametric tests assuming the IV is valid, and exhibit conditions under which strengthening an IV increases power and, conversely, conditions under which it decreases power. Second, we derive a necessary condition for a valid sensitivity analysis model with continuous doses. We show that the $\Gamma$ sensitivity analysis model, which has been previously used to come to the conclusion that strengthening an IV increases the power of sensitivity analyses in large samples, does not apply to the continuous IV setting, and thus this previously reached conclusion may be invalid. Third, we quantify the bias of the Wald estimator with a possibly invalid IV under an oracle and leverage it to develop a valid sensitivity analysis framework; under this framework, we show that strengthening an IV may amplify or mitigate the bias of the estimator, and may or may not increase the power of sensitivity analyses. We also discuss how to better adjust for the observed covariates when building an IV in matched studies.
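The Wald estimator discussed in this abstract is the ratio of covariances $\widehat{\beta} = \widehat{\mathrm{Cov}}(Z, Y) / \widehat{\mathrm{Cov}}(Z, D)$ for instrument $Z$, treatment $D$, and outcome $Y$. A minimal simulation, with all data-generating values chosen for illustration, shows why IV strength matters: a weak IV makes the denominator small and the ratio unstable, which is what design-stage "strengthening" aims to address.

```python
import numpy as np

def wald_estimator(z, d, y):
    """Wald (ratio) IV estimator: Cov(z, y) / Cov(z, d).

    With a weak IV, Cov(z, d) is close to zero and the estimator
    becomes unstable; strengthening the IV enlarges the denominator.
    """
    return np.cov(z, y)[0, 1] / np.cov(z, d)[0, 1]

# Illustrative simulation with an unmeasured confounder u (values assumed).
rng = np.random.default_rng(1)
n = 100_000
z = rng.normal(size=n)                     # instrument
u = rng.normal(size=n)                     # unmeasured confounder
d = 0.8 * z + u + rng.normal(size=n)       # treatment, confounded by u
y = 2.0 * d + 3.0 * u + rng.normal(size=n) # outcome; true effect is 2.0
est = wald_estimator(z, d, y)              # close to 2.0 despite confounding
```

A naive regression of `y` on `d` would be badly biased here because `u` drives both; the IV ratio recovers the effect as long as `z` is valid, which is exactly the assumption the paper's sensitivity analysis framework relaxes.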