
Applications of Nanoparticles for Particle Physics: A Whitepaper for Snowmass 2013

Added by Lindley Winslow
Publication date: 2013
Field: Physics
Language: English





The last decade has been the decade of nanotechnology, a length scale of particular interest because it is here that we see the transition from the classical to the quantum world. In this transition to the quantum regime, new phenomena appear that have proven valuable in a wide range of applications. This whitepaper focuses on the simplest nanotechnology, spherical nanoparticles, and their possible applications to particle physics.




Related research

A. Avetisyan (2013)
Snowmass is a US long-term planning study for the high-energy community organized by the American Physical Society's Division of Particles and Fields. For its simulation studies, opportunistic resources are harnessed using the Open Science Grid infrastructure. Late-binding grid technology, GlideinWMS, was used for distributed scheduling of the simulation jobs across many sites, mainly in the US. The pilot infrastructure also uses the Parrot mechanism to dynamically access CvmFS in order to ensure a homogeneous environment across the nodes. This report presents the resource usage and the storage model used for simulating the large-statistics Standard Model backgrounds needed for Snowmass Energy Frontier studies.
We present several benchmark points in the phenomenological Minimal Supersymmetric Standard Model (pMSSM). We select these models as experimentally well-motivated examples of the MSSM which predict the observed Higgs mass and dark matter relic density while evading the current LHC searches. We also use these benchmarks to generate spokes in parameter space by scaling the mass parameters in a manner which keeps the Higgs mass and relic density approximately constant.
Hans Krueger (2005)
The demands on detectors for particle detection, as well as for medical and astronomical X-ray imaging, are continuously pushing the development of novel pixel detectors. The state of the art in pixel detector technology to date is the hybrid pixel detector, in which sensor and read-out integrated circuits are processed on different substrates and connected via high-density interconnect structures. While these detectors are technologically mastered, such that large-scale particle detectors can be and are being built, the demand for improved performance in the next generation of particle detectors calls for the development of monolithic or semi-monolithic approaches. Given that the demands for medical imaging differ in some key aspects, developments for these applications, which started as a particle physics spin-off, are becoming rather independent. New approaches are leading to novel signal processing concepts and interconnect technologies to satisfy the need for very high dynamic range and large-area detectors. The present state of hybrid and (semi-)monolithic pixel detector development, and the different approaches for particle physics and imaging applications, are reviewed.
This article describes the physics and nonproliferation goals of WATCHMAN, the WAter Cherenkov Monitor for ANtineutrinos. The baseline WATCHMAN design is a kiloton-scale gadolinium-doped (Gd) light water Cherenkov detector, placed 13 kilometers from a civil nuclear reactor in the United States. In its first deployment phase, WATCHMAN will be used to remotely detect a change in the operational status of the reactor, providing a first-ever demonstration of the potential of large Gd-doped water detectors for remote reactor monitoring for future international nuclear nonproliferation applications. During its first phase, the detector will provide a critical large-scale test of the ability to tag neutrons and thus distinguish low-energy electron neutrinos and antineutrinos. This would make WATCHMAN the only detector capable of providing both direction and flavor identification of supernova neutrinos. It would also be the third-largest supernova detector, and the largest underground in the western hemisphere. In a follow-on phase incorporating the IsoDAR neutrino beam, the detector would have world-class sensitivity to sterile neutrino signatures and to non-standard electroweak interactions (NSI). WATCHMAN will also be a major, U.S.-based integration platform for a host of technologies relevant for the Long-Baseline Neutrino Facility (LBNF) and other future large detectors. This white paper describes the WATCHMAN conceptual design, and presents the results of detailed simulations of sensitivity for the project's nonproliferation and physics goals. It also describes the advanced technologies to be used in WATCHMAN, including high quantum efficiency photomultipliers, Water-Based Liquid Scintillator (WbLS), picosecond light sensors such as the Large Area Picosecond Photo Detector (LAPPD), and advanced pattern recognition and particle identification methods.
Sebastian White (2013)
In planning for the Phase II upgrades of CMS and ATLAS, major considerations are: (1) being able to deal with the degradation of tracking and calorimetry up to the radiation doses expected with an integrated luminosity of 3000 fb$^{-1}$, and (2) maintaining physics performance at a pileup level of ~140. Here I report on work started within the context of the CMS Forward Calorimetry Task Force and continuing in an expanded CERN RD52 R&D program, integrating timing (i.e. measuring the time of arrival of physics objects) as a potential tool for pileup mitigation, along with ideas for forward calorimetry. For the past 4 years our group has focused on precision timing at the level of 10-20 picoseconds in an environment with rates of $10^6$-$10^7$ Hz/cm$^2$, as is appropriate for the future running of the LHC (HL-LHC era). A time resolution of 10-20 picoseconds is one of the few clear criteria for pileup mitigation at the LHC, since the interaction time of a bunch crossing has an rms of 170 picoseconds. While work on charged-particle timing in other contexts (i.e. ALICE R&D) is starting to approach this precision, there have been essentially no technologies that can sustain performance at these rates. I will present results on a tracker we developed within the DOE Advanced Detector R&D program which is now meeting these requirements. I will also review some results from calorimeter projects developed within our group (PHENIX EMCAL and ATLAS ZDC) which achieved calorimeter timing precision < 100 picoseconds.
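The pileup-rejection arithmetic in the abstract above can be illustrated with a short numerical sketch. This is not from the paper: the function name and the choice of a ±2σ timing window are illustrative assumptions. With vertex times spread across the bunch crossing with an rms of 170 ps, a 20 ps time resolution confines the pileup considered "in time" with a given vertex to a small fraction of the ~140 vertices.

```python
import math

def pileup_in_window(n_pu=140, sigma_bx_ps=170.0, sigma_t_ps=20.0, n_sigma=2.0):
    """Estimate how many of n_pu pileup vertices fall within +/- n_sigma * sigma_t
    of a vertex at the center of the bunch crossing (t = 0), assuming vertex
    times are Gaussian with rms sigma_bx_ps.  All parameters are illustrative."""
    half_window = n_sigma * sigma_t_ps
    # Probability that a single Gaussian-distributed vertex time lands
    # inside [-half_window, +half_window]:
    p_in_window = math.erf(half_window / (sigma_bx_ps * math.sqrt(2.0)))
    return n_pu * p_in_window

print(f"{pileup_in_window():.1f}")  # ~26 of 140 vertices remain in a +/-40 ps window
```

With these defaults, roughly 26 of the 140 pileup vertices survive a ±40 ps cut, i.e. a reduction of about a factor of five, which is the sense in which a 10-20 ps resolution is a meaningful criterion against a 170 ps bunch-crossing spread.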
