
Metasurface-mediated bound states in the continuum (BICs) provide a versatile platform for light manipulation at subwavelength dimensions, with diverging radiative quality factors and extreme optical localization. In this work, we employ the magnetic dipole quasi-BIC resonance in asymmetric silicon nanobar metasurfaces to realize giant Goos-Hänchen (GH) shift enhancement, exceeding the wavelength by more than three orders of magnitude. In sharp contrast to GH shifts based on the Brewster dip or transmission-type resonances, the maximum GH shift here is located at the reflection peak with unity reflectance, which can be conveniently detected in experiment. By adjusting the asymmetry parameter of the metasurfaces, the $Q$-factor and the GH shift can be modulated accordingly. More interestingly, we find that the GH shift exhibits an inverse quadratic dependence on the asymmetry parameter. Furthermore, we design an ultrasensitive environmental refractive index sensor based on the quasi-BIC-enhanced GH shift, with a maximum sensitivity of 1.5$\times$10$^{7}$ $\mu$m/RIU. Our work not only reveals the essential role of BICs in engineering basic optical phenomena, but also suggests a way to push the performance limits of optical communication devices, information storage, wavelength division de/multiplexers, and ultrasensitive sensors.
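The link between the quasi-BIC linewidth and the GH shift can be illustrated with the stationary-phase formula, $D = -\partial\varphi/\partial k_x$, applied to a model unity-reflectance Lorentzian resonance. This is a hedged numerical sketch: the Lorentzian line shape and the rate values are assumptions for illustration, not results taken from the metasurface simulations.

```python
import numpy as np

# Stationary-phase estimate of the GH shift, D = -d(phi)/d(kx), for a model
# unity-reflectance resonance r(kx) = (1j*(kx - k0) - g)/(1j*(kx - k0) + g),
# which has |r| = 1 everywhere but a rapidly varying reflection phase.
k0 = 10.0
peaks = []
for g in (0.1, 0.05, 0.025):  # halving the linewidth g doubles the Q-factor
    kx = np.linspace(k0 - 1.0, k0 + 1.0, 20001)
    r = (1j * (kx - k0) - g) / (1j * (kx - k0) + g)
    phi = np.unwrap(np.angle(r))       # continuous reflection phase
    D = -np.gradient(phi, kx)          # GH shift (in units of 1/kx)
    peaks.append(D.max())
print(peaks)  # peak shift ~ 2/g, i.e. it grows linearly with Q
```

Since the peak shift scales as $1/g \propto Q$ and, for quasi-BICs, $Q \propto 1/\alpha^{2}$, this toy model reproduces the inverse quadratic dependence on the asymmetry parameter described above.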
Enhancing absorption in optically thin semiconductors is key to the development of high-performance optical and optoelectronic devices. In this paper, we resort to the concept of degenerate critical coupling and design an ultra-thin semiconductor absorber composed of free-standing GaAs nanocylinder metasurfaces operating in the near infrared. The numerical results show that perfect absorption can be achieved by overlapping two Mie modes of opposite symmetry, with each mode contributing its theoretical maximum of 50% at its respective critical coupling condition. The absorption is also robust, being polarization-independent and angle-insensitive. This work, together with the design concept, opens up great opportunities for the realization of high-efficiency metasurface devices, including optical emitters, modulators, detectors, and sensors.
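In temporal coupled-mode theory, two spectrally overlapped modes of opposite symmetry decay into orthogonal channel combinations, so their absorption spectra simply add. The following minimal sketch illustrates the degenerate critical coupling argument; the linewidth values are illustrative placeholders, not parameters fitted to the GaAs design.

```python
def mode_absorption(w, w0, g_rad, g_loss):
    """Single-mode absorption in coupled-mode theory:
    A = 2*g_rad*g_loss / ((w - w0)^2 + (g_rad + g_loss)^2)."""
    return 2 * g_rad * g_loss / ((w - w0) ** 2 + (g_rad + g_loss) ** 2)

w0 = 1.0
# Two degenerate modes of opposite symmetry, each at critical coupling
# (g_rad == g_loss): their channels are orthogonal, so absorptions add.
A_total = mode_absorption(w0, w0, 0.010, 0.010) + mode_absorption(w0, w0, 0.004, 0.004)
print(A_total)  # 1.0: perfect absorption from two 50% contributions
```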
Enhancing the light-matter interaction in two-dimensional (2D) materials with high-$Q$ resonances in photonic structures has boosted the development of optical and photonic devices. Herein, we build a bridge between radiation engineering and the bound states in the continuum (BIC), and present a general method to control light absorption at critical coupling through the quasi-BIC resonance. In a single-mode two-port system composed of graphene coupled with silicon nanodisk metasurfaces, the maximum absorption of 0.5 is achieved when the radiation rate of the magnetic dipole resonance equals the dissipative loss rate of graphene. Furthermore, the absorption bandwidth can be tuned over more than two orders of magnitude, from 0.9 nm to 94 nm, by simultaneously changing the asymmetry parameter of the metasurfaces, the Fermi level, and the layer number of graphene. This work reveals the essential role of BIC in radiation engineering and provides promising strategies for controlling light absorption in 2D materials for the next-generation optical and photonic devices, e.g., light emitters, detectors, modulators, and sensors.
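The 0.5 bound follows directly from single-mode two-port coupled-mode theory: on resonance, $A = 2\gamma\delta/(\gamma+\delta)^{2}$, which is maximized exactly when the radiative rate $\gamma$ equals the loss rate $\delta$. A minimal numerical check (the rate values below are arbitrary, chosen only to illustrate the three coupling regimes):

```python
def absorption_on_resonance(g_rad, g_loss):
    """On-resonance absorption of one mode in a two-port system:
    A = 2*g_rad*g_loss / (g_rad + g_loss)^2, bounded above by 0.5."""
    return 2 * g_rad * g_loss / (g_rad + g_loss) ** 2

print(absorption_on_resonance(0.01, 0.01))   # 0.5 at critical coupling
print(absorption_on_resonance(0.03, 0.01))   # < 0.5 when over-coupled
print(absorption_on_resonance(0.003, 0.01))  # < 0.5 when under-coupled
```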
Recent progress in nanophotonics is driven by the desire to engineer light-matter interaction in two-dimensional (2D) materials using high-quality resonances in plasmonic and dielectric structures. Here, we demonstrate a link between radiation control at critical coupling and the physics of metasurface-based bound states in the continuum (BIC), and develop a generalized theory to engineer light absorption of 2D materials in coupled-resonance metasurfaces. In a typical example of hybrid graphene-dielectric metasurfaces, we present the manipulation of the absorption bandwidth by more than one order of magnitude by simultaneously adjusting the asymmetry parameter of the silicon resonators governed by BIC and the graphene surface conductivity, while the absorption efficiency remains at its maximum. This work reveals the generalized role of BIC in radiation control at critical coupling and provides promising strategies for engineering light absorption of 2D materials for high-efficiency optoelectronic device applications, e.g., light emission, detection, and modulation.
Xisen Jin, Arka Sadhu, Junyi Du (2020)
We explore task-free continual learning (CL), in which a model is trained to avoid catastrophic forgetting without being provided any explicit task boundaries or identities. A common remedy is to replay a memory of stored examples seen earlier in the stream; however, since CL models are continually updated, the utility of stored seen examples may diminish over time. Here, we propose Gradient based Memory EDiting (GMED), a framework for editing stored examples in continuous input space via gradient updates, in order to create a wide range of more ``challenging'' examples for replay. GMED-edited examples remain similar to their unedited forms, but yield increased loss under the upcoming model updates, thereby making future replays more effective in overcoming catastrophic forgetting. By construction, GMED can be seamlessly applied in conjunction with other memory-based CL algorithms to bring further improvements. Experiments on six datasets validate that GMED is effective, and our single best method significantly outperforms existing approaches on three datasets. Code and data can be found at https://github.com/INK-USC/GMED.
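The core editing step can be sketched in a few lines: take gradient-ascent steps on the stored input so that its loss under the current model increases. This is a toy logistic-regression sketch of the idea only; the real GMED edits examples against the upcoming model update and regularizes the edit size, both of which are omitted here.

```python
import numpy as np

def loss_and_grad_x(w, x, y):
    """Logistic-regression cross-entropy loss and its gradient w.r.t. the *input* x."""
    p = 1.0 / (1.0 + np.exp(-np.dot(w, x)))
    loss = -(y * np.log(p) + (1 - y) * np.log(1 - p))
    return loss, (p - y) * w  # dL/dx for a logistic model

def gmed_edit(w, x, y, alpha=0.1, steps=5):
    """GMED-style edit: gradient *ascent* on the input makes the stored example harder."""
    x = x.copy()
    for _ in range(steps):
        _, g = loss_and_grad_x(w, x, y)
        x += alpha * g
    return x

rng = np.random.default_rng(0)
w = rng.normal(size=4)
x, y = rng.normal(size=4), 1
loss_before, _ = loss_and_grad_x(w, x, y)
loss_after, _ = loss_and_grad_x(w, gmed_edit(w, x, y), y)
print(loss_before < loss_after)  # True: the edited replay example yields higher loss
```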
Xisen Jin, Junyi Du, Arka Sadhu (2020)
Humans acquire language continually, with much more limited access to data samples at a time than contemporary NLP systems. To study this human-like language acquisition ability, we present VisCOLL, a visually grounded language learning task which simulates the continual acquisition of compositional phrases from streaming visual scenes. In the task, models are trained on a paired image-caption stream with a shifting object distribution, while being continually evaluated by a visually grounded masked language prediction task on held-out test sets. VisCOLL compounds the challenges of continual learning (i.e., learning from a continuously shifting data distribution) and compositional generalization (i.e., generalizing to novel compositions). To facilitate research on VisCOLL, we construct two datasets, COCO-shift and Flickr-shift, and benchmark them using different continual learning methods. Results reveal that state-of-the-art continual learning approaches provide little to no improvement on VisCOLL, since storing examples of all possible compositions is infeasible. We conduct further ablations and analysis to guide future work.
Large pre-trained sentence encoders like BERT start a new chapter in natural language processing. A common practice for applying pre-trained BERT to sequence classification tasks (e.g., classification of sentences or sentence pairs) is to feed the embedding of the [CLS] token (in the last layer) to a task-specific classification layer, and then fine-tune the model parameters of BERT and the classifier jointly. In this paper, we conduct a systematic analysis over several sequence classification datasets to examine the embedding values of the [CLS] token before the fine-tuning phase, and present the biased embedding distribution issue: embedding values of [CLS] concentrate on a few dimensions and are non-zero-centered. Such biased embeddings pose a challenge to the optimization process during fine-tuning, as gradients of the [CLS] embedding may explode and result in degraded model performance. We further propose several simple yet effective normalization methods to modify the [CLS] embedding during fine-tuning. Compared with the previous practice, a neural classification model with the normalized embedding shows improvements on several text classification tasks, demonstrating the effectiveness of our method.
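One of the simplest fixes the described issue suggests is to re-center and re-scale the [CLS] embeddings per dimension before the classification layer. The numpy sketch below is only one plausible instantiation using batch statistics; the paper proposes several normalization variants, and this particular form is an assumption for illustration.

```python
import numpy as np

def normalize_cls(emb, eps=1e-6):
    """Re-center and re-scale [CLS] embeddings across the batch so each
    dimension is zero-centered with (near-)unit variance."""
    mu = emb.mean(axis=0, keepdims=True)
    sigma = emb.std(axis=0, keepdims=True)
    return (emb - mu) / (sigma + eps)

rng = np.random.default_rng(1)
# Simulated biased [CLS] embeddings: non-zero-centered, with most of the
# variance concentrated on one dimension (mimicking the reported issue).
emb = rng.normal(loc=3.0, scale=0.1, size=(32, 8))
emb[:, 0] *= 50
out = normalize_cls(emb)
print(out.mean(), out.std())  # ~0 mean, ~1 standard deviation
```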
Xisen Jin, Zhongyu Wei, Junyi Du (2019)
The impressive performance of neural networks on natural language processing tasks is attributed to their ability to model complicated word and phrase compositions. To explain how the model handles semantic compositions, we study hierarchical explanations of neural network predictions. We identify non-additivity and context-independent importance attributions within hierarchies as two desirable properties for highlighting word and phrase compositions. We show that some prior efforts on hierarchical explanations, e.g., contextual decomposition, do not satisfy the desired properties mathematically, leading to inconsistent explanation quality across models. In this paper, we start by proposing a formal and general way to quantify the importance of each word and phrase. Following the formulation, we propose the Sampling and Contextual Decomposition (SCD) algorithm and the Sampling and Occlusion (SOC) algorithm. Human and automatic-metric evaluations of both LSTM models and BERT Transformer models on multiple datasets show that our algorithms outperform prior hierarchical explanation algorithms. Our algorithms help to visualize the semantic compositions captured by models, extract classification rules, and improve human trust in models. Project page: https://inklab.usc.edu/hiexpl/
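The occlusion half of SOC can be sketched with a toy scorer: the importance of a phrase is the average score drop when it is masked, marginalized over resampled surrounding contexts. Everything concrete below (the additive word-weight scorer, the uniform context sampler, the vocabulary) is a made-up stand-in; the real algorithm samples contexts from a language model and scores with the trained classifier.

```python
import numpy as np

def soc_importance(score_fn, sample_context, tokens, span, n_samples=20):
    """Sampling and Occlusion (toy sketch): average score drop from masking
    tokens[span], marginalized over resampled surrounding contexts."""
    lo, hi = span
    total = 0.0
    for _ in range(n_samples):
        ctx = sample_context(tokens, span)                      # resample context only
        masked = ctx[:lo] + ["<mask>"] * (hi - lo) + ctx[hi:]   # occlude the phrase
        total += score_fn(ctx) - score_fn(masked)
    return total / n_samples

# Toy additive scorer over made-up word weights (stands in for a classifier).
weights = {"the": 0.0, "movie": 0.1, "was": 0.0, "great": 2.0, "ok": 0.3, "<mask>": 0.0}
score_fn = lambda toks: sum(weights[t] for t in toks)

rng = np.random.default_rng(0)
vocab = [w for w in weights if w != "<mask>"]
def sample_context(tokens, span):
    lo, hi = span
    return [t if lo <= i < hi else str(rng.choice(vocab)) for i, t in enumerate(tokens)]

tokens = ["the", "movie", "was", "great"]
print(soc_importance(score_fn, sample_context, tokens, (3, 4)))  # 2.0: weight of "great"
```

For this additive scorer the attribution is exactly the phrase's weight regardless of the sampled context, which is precisely the context-independence property the paper asks of a sound explanation method.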