
The FAIR Funder pilot programme to make it easy for funders to require and for grantees to produce FAIR Data

Published by Erik Schultes
Publication date: 2019
Research field: Informatics Engineering
Language: English





There is a growing acknowledgement in the scientific community of the importance of making experimental data machine findable, accessible, interoperable, and reusable (FAIR). Recognizing that high-quality metadata are essential to make datasets FAIR, members of the GO FAIR Initiative and the Research Data Alliance (RDA) have initiated a series of workshops to encourage the creation of Metadata for Machines (M4M), enabling any self-identified stakeholder to define and promote the reuse of standardized, comprehensive machine-actionable metadata. The funders of scientific research recognize that they have an important role to play in ensuring that experimental results are FAIR, and that high-quality metadata and careful planning for FAIR data stewardship are central to these goals. We describe the outcome of a recent M4M workshop that has led to a pilot programme involving two national science funders, the Health Research Board of Ireland (HRB) and the Netherlands Organisation for Health Research and Development (ZonMW). These funding organizations will explore new technologies to define, at the time a request for proposals is issued, the minimal set of machine-actionable metadata that they would like investigators to use to annotate their datasets; to enable investigators to create such metadata to help make their data FAIR; and to develop data-stewardship plans that ensure that experimental data will be managed appropriately, abiding by the FAIR principles. The FAIR Funders design envisions a data-management workflow having seven essential stages, where solution providers are openly invited to participate. The initial pilot programme will launch using existing computer-based tools of those who attended the M4M Workshop.
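To make the notion of machine-actionable metadata concrete, the following is a minimal, hypothetical sketch of the kind of dataset record a funder-defined template might require at proposal time. The field names, the schema.org vocabulary, and the example values are illustrative assumptions; the abstract does not specify the pilot's actual template or tooling.

```python
import json

# Hypothetical sketch of a machine-actionable dataset record of the kind a
# funder might require when a request for proposals is issued. The schema.org
# terms and example values are illustrative assumptions, not the pilot's
# actual template.
dataset_metadata = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Example clinical cohort measurements",
    "identifier": "https://doi.org/10.9999/example-dataset",      # persistent identifier (findability)
    "license": "https://creativecommons.org/licenses/by/4.0/",    # explicit reuse terms (reusability)
    "creator": {"@type": "Person", "name": "Jane Investigator"},
    "funder": {"@type": "Organization",
               "name": "Health Research Board of Ireland"},
    "keywords": ["FAIR", "health research", "metadata for machines"],
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",                              # open, interoperable format
        "contentUrl": "https://repository.example.org/datasets/42.csv",
    },
}

# Serializing the record as JSON keeps it both human-readable and
# harvestable/validatable by machines.
print(json.dumps(dataset_metadata, indent=2))
```

Expressing such a record with resolvable identifiers and a shared vocabulary is one common way to make funder-required metadata actionable by software rather than only readable by reviewers.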




Read also

The lack of scientific openness is identified as one of the key challenges of computational reproducibility. In addition to Open Data, Free and Open-source Software (FOSS) and Open Hardware (OH) can address this challenge by introducing open policies, standards, and recommendations. However, while both FOSS and OH are free to use, study, modify, and redistribute, there are significant differences in sharing and reusing these artifacts. FOSS is increasingly supported with software repositories, but support for OH is lacking, potentially due to the complexity of its digital format and licensing. This paper proposes leveraging FAIR principles to make OH findable, accessible, interoperable, and reusable. We define what FAIR means for OH, how it differs from FOSS, and present examples of unique demands. Also, we evaluate dissemination platforms currently used for OH and provide recommendations.
Humanity has been fascinated by the pursuit of fortune since time immemorial, and many successful outcomes benefit from strokes of luck. But success is subject to complexity, uncertainty, and change - and at times is becoming increasingly unequally distributed. This leads to tension and confusion over the extent to which people actually get what they deserve (i.e., fairness/meritocracy). Moreover, in many fields, humans are over-confident and pervasively confuse luck for skill (I win, it's skill; I lose, it's bad luck). In some fields, there is too much risk taking; in others, not enough. Where success derives in large part from luck - and especially where bailouts skew the incentives (heads, I win; tails, you lose) - it follows that luck is rewarded too much. This incentivizes a culture of gambling, while downplaying the importance of productive effort. And short-term success is often rewarded irrespective of, and potentially to the detriment of, long-term system fitness. However, much success is truly meritocratic, and the problem is to discern and reward based on merit. We call this the fair reward problem. To address this, we propose three different measures to assess merit: (i) raw outcome; (ii) risk-adjusted outcome; and (iii) prospective. We emphasize the need, in many cases, for the deductive prospective approach, which considers the potential of a system to adapt and mutate in novel futures. This is formalized within an evolutionary system, comprised of five processes, inter alia handling the exploration-exploitation trade-off. Several human endeavors - including finance, politics, and science - are analyzed through these lenses, and concrete solutions are proposed to support a prosperous and meritocratic society.
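As a rough illustration of the first two measures proposed in the abstract above (the prospective measure requires a full evolutionary model and is not sketched here), the snippet below treats an agent's track record as a series of per-period outcomes and uses the ratio of mean outcome to volatility as the risk adjustment; this is a conventional choice assumed for illustration, not necessarily the paper's exact formalization.

```python
from statistics import mean, stdev

def raw_outcome(returns):
    """Measure (i): total realized outcome, ignoring the risk taken."""
    return sum(returns)

def risk_adjusted_outcome(returns):
    """Measure (ii): reward per unit of volatility (a Sharpe-like ratio).
    One conventional risk adjustment, assumed here purely for illustration."""
    vol = stdev(returns)
    return mean(returns) / vol if vol > 0 else float("inf")

# Two hypothetical agents with the same raw outcome but very different risk.
steady  = [0.02, 0.03, 0.02, 0.03]    # consistent small gains
gambler = [0.40, -0.35, 0.45, -0.40]  # large swings, similar total

for name, r in [("steady", steady), ("gambler", gambler)]:
    print(name, round(raw_outcome(r), 2), round(risk_adjusted_outcome(r), 2))
```

Both agents score identically on raw outcome, but the risk-adjusted measure separates the steady contributor from the lucky gambler, which is the distinction the fair reward problem turns on.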
To balance the load and to discourage free-riding in peer-to-peer (P2P) networks, many incentive mechanisms and policies have been proposed in recent years. Global peer ranking is one such mechanism. In this mechanism, peers are ranked based on a metric called the contribution index. The contribution index is defined in such a manner that peers are motivated to share resources in the network. Fairness, in terms of the upload-to-download ratio at each peer, can be achieved by this method. However, calculation of the contribution index is not trivial. It is computed distributively and iteratively across the entire network and requires strict clock synchronization among the peers. A very small error in clock synchronization may lead to wrong results. Furthermore, the iterative calculation requires substantial message overhead and storage capacity, which makes its implementation more complex. In this paper, we propose a simple incentive mechanism based on the contributions of peers, which can balance the upload and download amount of resources at each peer. It does not require iterative calculation and can therefore be implemented with less message overhead and storage capacity, and without strict clock synchronization. This approach is efficient, as there are very few rejections among cooperative peers. It can be implemented in a truly distributed fashion with $O(N)$ time complexity per peer.
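A rough sketch of such a non-iterative, locally computed incentive check is shown below; the abstract does not give the exact contribution formula, so the simple upload-to-download ratio and the admission threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Peer:
    peer_id: str
    uploaded: float = 0.0    # total data served to other peers
    downloaded: float = 0.0  # total data received from other peers

    def contribution_ratio(self) -> float:
        """Local upload-to-download ratio; 1.0 means uploads balance downloads.
        This simple ratio is an illustrative stand-in for the paper's
        contribution metric, which is not fully specified in the abstract."""
        if self.downloaded == 0:
            return float("inf")
        return self.uploaded / self.downloaded

def should_serve(requester: Peer, threshold: float = 0.5) -> bool:
    """Grant an upload slot only to peers whose ratio meets the threshold,
    discouraging free-riding without any network-wide iterative computation
    or clock synchronization."""
    return requester.contribution_ratio() >= threshold

cooperative = Peer("A", uploaded=800, downloaded=1000)
free_rider  = Peer("B", uploaded=50,  downloaded=1000)
print(should_serve(cooperative), should_serve(free_rider))  # True False
```

Because each decision uses only the requesting peer's own counters, the check is local and constant-time, which is the property the abstract contrasts with iterative global ranking.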
For models of concurrent and distributed systems, it is important and also challenging to establish correctness in terms of safety and/or liveness properties. Theories of distributed systems consider equivalences fundamental, since they (1) preserve desirable correctness characteristics and (2) often allow for component substitution making compositional reasoning feasible. Modeling distributed systems often requires abstraction utilizing nondeterminism which induces unintended behaviors in terms of infinite executions with one nondeterministic choice being recurrently resolved, each time neglecting a single alternative. These situations are considered unrealistic or highly improbable. Fairness assumptions are commonly used to filter system behaviors, thereby distinguishing between realistic and unrealistic executions. This allows for key arguments in correctness proofs of distributed systems, which would not be possible otherwise. Our contribution is an equivalence spectrum in which fairness assumptions are preserved. The identified equivalences allow for (compositional) reasoning about correctness incorporating fairness assumptions.
Inti Lehmann (2009)
The standard model and Quantum Chromodynamics (QCD) have undergone rigorous tests at distances much shorter than the size of a nucleon. Up to now, the predicted phenomena are reproduced rather well. However, at distances comparable to the size of a nucleon, new experimental results keep appearing which cannot be described consistently by effective theories based on QCD. The physics of strange and charmed quarks holds the potential to connect the two energy domains, interpolating between the limiting scales of QCD. This is the regime which will be explored using the future Antiproton Annihilations at Darmstadt (PANDA) experiment at the Facility for Antiproton and Ion Research (FAIR). In this contribution some of the most relevant physics topics are detailed, and the reason why PANDA is the ideal detector to study them is given. Precision studies of hadron formation in the charmonium region will greatly advance our understanding of hadronic structure. It may reveal particles beyond the two and three-quark configuration, some of which are predicted to have exotic quantum numbers in that mass region. It will deepen the understanding of the charmonium spectrum, where unpredicted states have been found recently by the B-factories. To date the structure of the nucleon, in terms of parton distributions, has been mainly investigated using scattering experiments. Complementary information will be acquired measuring electro-magnetic final states at PANDA.