
NeuroStorm: Accelerating Brain Science Discovery in the Cloud

Posted by Gregory Kiar
Publication date: 2018
Research field: Biology
Paper language: English





Neuroscientists are now able to acquire data at staggering rates across spatiotemporal scales. However, our ability to capitalize on existing datasets, tools, and intellectual capacities is hampered by technical challenges. The key barriers to accelerating scientific discovery correspond to the FAIR data principles: findability, global access to data, software interoperability, and reproducibility/re-usability. We conducted a hackathon dedicated to making strides in each of these areas. This manuscript is a technical report summarizing those achievements, and we hope it serves as an example of the effectiveness of focused, deliberate hackathons in advancing our quickly evolving field.


Read also

Recent advances in fluorescence microscopy techniques and tissue clearing, labeling, and staining provide unprecedented opportunities to investigate brain structure and function. Images from these experiments make it possible to catalog brain cell types and define their location, morphology, and connectivity in a native context, leading to a better understanding of normal development and disease etiology. Consistent annotation of metadata is needed to provide the context necessary to understand, reuse, and integrate these data. This report describes an effort to establish metadata standards for 3D microscopy datasets for use by the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative and the neuroscience research community. These standards were built on existing efforts and developed with input from the brain microscopy community to promote adoption. The resulting Essential Metadata for 3D BRAIN Microscopy includes 91 fields organized into seven categories: Contributors, Funders, Publication, Instrument, Dataset, Specimen, and Image. Adoption of these metadata standards will ensure that investigators receive credit for their work, promote data reuse, facilitate downstream analysis of shared data, and encourage collaboration.
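To make the structure concrete, here is a minimal, hypothetical sketch of what a dataset-level record organized around the seven categories named above could look like in Python. The specific field names and values are illustrative assumptions, not the actual 91 fields defined by the Essential Metadata for 3D BRAIN Microscopy standard.

# Hypothetical sketch only: field names and values are illustrative assumptions,
# not the standard's actual fields.
example_record = {
    "Contributors": [{"name": "A. Researcher", "role": "data collector"}],
    "Funders":      [{"agency": "NIH BRAIN Initiative", "award_id": "placeholder"}],
    "Publication":  {"doi": None, "title": None},
    "Instrument":   {"microscope_type": "light-sheet", "objective_na": 0.8},
    "Dataset":      {"title": "Example whole-brain volume", "license": "CC-BY-4.0"},
    "Specimen":     {"species": "Mus musculus", "clearing_method": "iDISCO"},
    "Image":        {"voxel_size_um": [1.8, 1.8, 2.0], "channels": 2},
}

# A downstream tool could at least verify that every top-level category is present.
required = {"Contributors", "Funders", "Publication", "Instrument",
            "Dataset", "Specimen", "Image"}
assert required.issubset(example_record), "missing metadata categories"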
Xin Li, Xuli Tang (2021)
Despite the significant advances in life science, it still takes decades to translate a basic drug discovery into a cure for human disease. To accelerate the process from bench to bedside, interdisciplinary research (especially research involving both basic research and clinical research) has been strongly recommended by many previous studies. However, the patterns and the roles of the interdisciplinary characteristics in drug research have not been deeply examined in extant studies. The purpose of this study was to characterize interdisciplinary characteristics in drug research from the perspective of translational science, and to examine the role of different kinds of interdisciplinary characteristics in translational research for drugs.
Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, the focal point can shift towards answering the question of how we can analyze and understand the massive amounts of data in front of us. Unfortunately, the lack of standardized sharing mechanisms and practices often makes reproducing or extending scientific results very difficult. With the creation of data organization structures and tools which drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called science in the cloud (sic). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to launch a computer in the cloud and run a web service which enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results which will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended.
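As an illustration of the container-plus-web-service pattern described above (not the authors' actual "science in the cloud" tooling), the sketch below uses the Docker SDK for Python to launch a containerized analysis environment whose web interface can then be explored interactively. The image name, port, and data path are placeholder assumptions.

# Illustrative sketch of the container + web-service pattern; not the authors' tool.
import docker  # assumes a local Docker Engine and the docker SDK for Python

client = docker.from_env()

container = client.containers.run(
    "jupyter/scipy-notebook",      # placeholder image holding the analysis code
    ports={"8888/tcp": 8888},      # expose the notebook web service
    volumes={"/data/shared": {"bind": "/home/jovyan/data", "mode": "ro"}},  # placeholder data mount
    detach=True,
)

print(f"Analysis container {container.short_id} is running; "
      "open http://localhost:8888 to interact with the tools and data.")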
It is known that the Arrhenius equation, based on the Boltzmann distribution, can model only part (e.g. about half of the activation energy) of the retinal discrete dark noise observed for vertebrate rod and cone pigments. Luo et al (Science, 332, 1307-1312, 2011) presented a new approach to explain this discrepancy by showing that applying the Hinshelwood distribution instead of the Boltzmann distribution in the Arrhenius equation solves the problem successfully. However, a careful reanalysis of the methodology and results shows that the approach of Luo et al is questionable and that the results found do not solve the problem completely.
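For reference, the two rate laws being compared can be written in their standard textbook forms (the exact parameterization used by Luo et al may differ): with a Boltzmann energy distribution the thermal activation rate follows the usual Arrhenius law, while the Hinshelwood treatment pools thermal energy over m vibrational modes, which raises the fraction of pigment molecules whose energy exceeds the activation barrier E_a.

\begin{align}
  k_{\mathrm{Boltzmann}}   &= A\, e^{-E_a/(k_B T)}, \\
  k_{\mathrm{Hinshelwood}} &= A\, e^{-E_a/(k_B T)} \sum_{j=0}^{m-1} \frac{1}{j!}\left(\frac{E_a}{k_B T}\right)^{j}.
\end{align}

In the limit m = 1 the Hinshelwood expression reduces to the ordinary Boltzmann/Arrhenius form.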
Astrophysics lies at the crossroads of big datasets (such as the Large Synoptic Survey Telescope and Gaia), open source software to visualize and interpret high dimensional datasets (such as Glue, WorldWide Telescope, and OpenSpace), and uniquely skilled software engineers who bridge data science and research fields. At the same time, more than 4,000 planetariums across the globe immerse millions of visitors in scientific data. We have identified the potential for critical synergy across data, software, hardware, locations, and content that -- if prioritized over the next decade -- will drive discovery in astronomical research. Planetariums can and should be used for the advancement of scientific research. Current facilities such as the Hayden Planetarium in New York City, Adler Planetarium in Chicago, Morrison Planetarium in San Francisco, the Iziko Planetarium and Digital Dome Research Consortium in Cape Town, and Visualization Center C in Norrkoping are already developing software which ingests catalogs of astronomical and multi-disciplinary data critical for exploration research, primarily for the purpose of creating scientific storylines for the general public. We propose a transformative model whereby scientists become the audience and explorers in planetariums, utilizing software for their own investigative purposes. In this manner, research benefits from the authentic and unique experience of data immersion contained in an environment bathed in context and equipped for collaboration. Consequently, in this white paper we argue that over the next decade the research astronomy community should partner with planetariums to create visualization-based research opportunities for the field. Realizing this vision will require new investments in software and human capital.