
Invited Article: miniTimeCube

Posted by Glenn Jocher
Publication date: 2016
Research field: Physics
Paper language: English





We present the development of the miniTimeCube (mTC), a novel compact neutrino detector. The mTC is a multipurpose detector, aiming to detect not only neutrinos but also fast/thermal neutrons. Potential applications include the counterproliferation of nuclear materials and the investigation of antineutrino short-baseline effects. The mTC is a plastic 0.2% $^{10}$B-doped scintillator (13 cm)$^3$ cube surrounded by 24 Micro-Channel Plate (MCP) photon detectors, each with an $8\times8$ anode, totaling 1536 individual channels/pixels viewing the scintillator. It uses custom-made electronics modules which mount on top of the MCPs, making our detector compact and able to both distinguish different types of events and reject noise in real time. The detector is currently deployed and being tested at the National Institute of Standards and Technology (NIST) Center for Neutron Research (NCNR) nuclear reactor (20 MW$_\mathrm{th}$) in Gaithersburg, MD. A shield for further tests is being constructed, and calibration and upgrades are ongoing. The mTC's improved spatiotemporal resolution will allow for determination of incident particle directions beyond previous capabilities.
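As a quick sanity check on the readout scale quoted above, the short Python sketch below recomputes the channel count and scintillator volume from the numbers in the abstract; the variable names are illustrative only and are not part of the mTC software.

```python
# Back-of-the-envelope check of the mTC readout scale quoted in the abstract.
# All inputs come from the text above; variable names are illustrative only.

cube_side_cm = 13.0      # (13 cm)^3 plastic scintillator cube
n_mcps = 24              # MCP photon detectors surrounding the cube
anode_pixels = 8 * 8     # each MCP carries an 8x8 anode

total_channels = n_mcps * anode_pixels   # 24 * 64 = 1536 channels/pixels
cube_volume_cm3 = cube_side_cm ** 3      # ~2197 cm^3 of scintillator

print(f"readout channels: {total_channels}")
print(f"scintillator volume: {cube_volume_cm3:.0f} cm^3")
```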



Read also

Nowadays, editors tend to separate different subtopics of a long Wikipedia article into multiple sub-articles. This separation seeks to improve human readability. However, it also has a deleterious effect on many Wikipedia-based tasks that rely on the article-as-concept assumption, which requires each entity (or concept) to be described solely by one article. This underlying assumption significantly simplifies knowledge representation and extraction, and it is vital to many existing technologies such as automated knowledge base construction, cross-lingual knowledge alignment, semantic search and data lineage of Wikipedia entities. In this paper we provide an approach to match the scattered sub-articles back to their corresponding main-articles, with the intent of facilitating automated Wikipedia curation and processing. The proposed model adopts a hierarchical learning structure that combines multiple variants of neural document pair encoders with a comprehensive set of explicit features. A large crowdsourced dataset is created to support the evaluation and feature extraction for the task. Based on the large dataset, the proposed model achieves promising results of cross-validation and significantly outperforms previous approaches. Large-scale serving on the entire English Wikipedia also proves the practicability and scalability of the proposed model by effectively extracting a vast collection of newly paired main- and sub-articles.
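The abstract above describes a hierarchical model that combines neural document pair encoders with explicit features. The sketch below only illustrates that general pattern (an encoder-derived similarity concatenated with hand-crafted pair features and passed to a downstream classifier); it is not the authors' architecture, and every name in it is hypothetical.

```python
import numpy as np

def encoder_similarity(main_vec: np.ndarray, sub_vec: np.ndarray) -> float:
    """Cosine similarity between document embeddings, standing in for the
    output of a neural document pair encoder."""
    denom = np.linalg.norm(main_vec) * np.linalg.norm(sub_vec)
    return float(main_vec @ sub_vec / denom) if denom else 0.0

def pair_features(main_vec: np.ndarray, sub_vec: np.ndarray, explicit: dict) -> np.ndarray:
    """Concatenate the encoder score with explicit, hand-crafted features
    (e.g. title overlap, link statistics) into one feature vector that a
    downstream classifier can score."""
    return np.array([
        encoder_similarity(main_vec, sub_vec),
        explicit.get("title_overlap", 0.0),
        explicit.get("inlink_ratio", 0.0),
    ])

# A classifier (e.g. logistic regression) trained on crowdsourced
# main/sub-article pairs would then decide whether the pair matches.
```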
RedPRL is an experimental proof assistant based on Cartesian cubical computational type theory, a new type theory for higher-dimensional constructions inspired by homotopy type theory. In the style of Nuprl, RedPRL users employ tactics to establish behavioral properties of cubical functional programs embodying the constructive content of proofs. Notably, RedPRL implements a two-level type theory, allowing an extensional, proof-irrelevant notion of exact equality to coexist with a higher-dimensional proof-relevant notion of paths.
Matthieu Sozeau (2021)
Proof assistants are getting more widespread use in research and industry to provide certified and independently checkable guarantees about theories, designs, systems and implementations. However, proof assistant implementations themselves are seldom verified, although they take a major share of the trusted code base in any such certification effort. In this area, proof assistants based on Higher-Order Logic enjoy stronger guarantees, as self-certified implementations have been available for some years. One cause of this difference is the inherent complexity of dependent type theories together with their extensions with inductive types, universe polymorphism and complex sort systems, and the gap between theory on paper and practical implementations in efficient programming languages. MetaCoq is a collaborative project that aims to tackle these difficulties to provide the first fully-certified realistic implementation of a type checker for the full calculus underlying the Coq proof assistant. To achieve this, we refined the sometimes blurry, if not incorrect, specification and implementation of the system. We show how theoretical tools from this community, such as bidirectional type-checking, the Tait-Martin-Löf/Takahashi confluence proof technique, and monadic and dependently-typed programming, can help construct the following artefacts: a specification of Coq's syntax and type theory, the Polymorphic Cumulative Calculus of (Co)-Inductive Constructions (PCUIC); a monad for the manipulation of raw syntax and interaction with the Coq system; a verification of PCUIC's metatheory, whose main results are the confluence of reduction, type preservation and principality of typing; a realistic, correct and complete type-checker for PCUIC; and a sound type and proof erasure procedure from PCUIC to untyped lambda-calculus, i.e., the core of the extraction mechanism of Coq.
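Bidirectional type-checking, one of the techniques named in the abstract above, separates typing into an inference (synthesis) mode and a checking mode: lambdas are checked against an expected arrow type, while variables, applications and annotations synthesize types. The Python sketch below shows the idea for a toy simply-typed lambda calculus; it is only an illustration of the general technique, not MetaCoq's verified PCUIC checker.

```python
from dataclasses import dataclass

# Toy simply-typed lambda calculus: types ...
@dataclass(frozen=True)
class Base:
    name: str                # e.g. Base("nat")

@dataclass(frozen=True)
class Arrow:
    dom: object              # argument type
    cod: object              # result type

# ... and terms.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    name: str
    body: object             # lambdas are checked, never inferred

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

@dataclass(frozen=True)
class Ann:
    term: object
    ty: object               # an annotation switches from checking to inference

def infer(ctx, t):
    """Inference (synthesis) mode: compute a type from the term."""
    if isinstance(t, Var):
        return ctx[t.name]
    if isinstance(t, Ann):
        check(ctx, t.term, t.ty)
        return t.ty
    if isinstance(t, App):
        fty = infer(ctx, t.fn)
        assert isinstance(fty, Arrow), "applying a non-function"
        check(ctx, t.arg, fty.dom)
        return fty.cod
    raise TypeError("cannot infer this term; add an annotation")

def check(ctx, t, ty):
    """Checking mode: verify the term against an expected type."""
    if isinstance(t, Lam):
        assert isinstance(ty, Arrow), "a lambda must be checked at an arrow type"
        check({**ctx, t.name: ty.dom}, t.body, ty.cod)
    else:
        assert infer(ctx, t) == ty, "type mismatch"

# The identity function checks against nat -> nat.
nat = Base("nat")
check({}, Lam("x", Var("x")), Arrow(nat, nat))
print("\\x. x : nat -> nat  checks OK")
```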
Giselle Reis (2021)
Structural proof theory is praised for being a symbolic approach to reasoning and proofs, in which one can define schemas for reasoning steps and manipulate proofs as a mathematical structure. For this to be possible, proof systems must be designed as a set of rules such that proofs using those rules are correct by construction. Therefore, one must consider all the ways these rules can interact and prove that they satisfy certain properties which make them well-behaved. This is called the meta-theory of a proof system. Meta-theory proofs typically involve many cases on structures with lots of symbols. The majority of cases are usually quite similar, and when a proof fails, it might be because of a sub-case on a very specific configuration of rules. Developing these proofs by hand is tedious and error-prone, and their combinatorial nature suggests they could be automated. There are various approaches to automating meta-theory proofs, either partially or completely. In this paper, I will present some techniques that I have been involved in for facilitating meta-theory reasoning.
To better understand the effect of social media on the dissemination of scholarly articles, we analyze the relationship between social media attention and article visitors directed by social media, using daily updated referral data for 110 PeerJ articles collected over a period of 345 days. Our results show that the social media presence of PeerJ articles is high: about 68.18% of the papers receive at least one tweet from Twitter accounts other than @PeerJ, the journal's official account. Social media attention increases the dissemination of scholarly articles. Altmetrics not only complement traditional citation measures but also play an important role in increasing article downloads and promoting the impact of scholarly articles. There is also a significant correlation among the online attention received from different social media platforms; articles with more Facebook shares tend to get more tweets. The temporal trends show that social attention comes immediately after publication but does not last long, and the same holds for article views directed by social media.