
Evaluation of a length-based method to estimate discard rate and the effect of sampling size

Submitted by Erla Sturludottir
Publication date: 2019
Research field: Mathematical Statistics
Paper language: English





The common fisheries policy aims at eliminating discarding, which has been part of fisheries for centuries. It is important to monitor compliance with the new regulations, but estimating the discard rate is a challenging task, especially where the practice is illegal. The aim of this study was to review a length-based method that has been used to estimate the discard rate in Icelandic waters and to explore the effects of different monitoring schemes. The length-based method estimates the minimum discard rate, and bootstrapping can be used to quantify the uncertainty of that estimate. This study showed that the number of ships sampled is the most important factor to consider in order to decrease the uncertainty.
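As an illustration of the bootstrap step described above, the sketch below resamples ships with replacement and recomputes a minimum discard rate for each replicate to obtain a percentile confidence interval. The function minimum_discard_rate, the data layout, and the synthetic numbers are assumptions for demonstration only, not the authors' implementation.

```python
import numpy as np

def minimum_discard_rate(ship_samples):
    """Placeholder for the length-based estimator: given length-sampling
    results from a set of ships, return an estimated minimum discard rate.
    Here it simply averages a hypothetical per-ship discard fraction."""
    return float(np.mean([s["discard_fraction"] for s in ship_samples]))

def bootstrap_ci(ship_samples, n_boot=2000, alpha=0.05, seed=None):
    """Bootstrap over ships (the dominant source of uncertainty) to get a
    percentile confidence interval for the minimum discard rate."""
    rng = np.random.default_rng(seed)
    n = len(ship_samples)
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample ships with replacement
        estimates[b] = minimum_discard_rate([ship_samples[i] for i in idx])
    lower, upper = np.quantile(estimates, [alpha / 2, 1 - alpha / 2])
    return minimum_discard_rate(ship_samples), (lower, upper)

# Synthetic example: 30 monitored ships with made-up discard fractions
ships = [{"discard_fraction": f}
         for f in np.random.default_rng(1).beta(2, 30, size=30)]
point, (lo, hi) = bootstrap_ci(ships, seed=1)
print(f"minimum discard rate: {point:.3f}  95% CI: ({lo:.3f}, {hi:.3f})")
```

Because ships, not individual fish, drive the between-sample variation, the resampling unit in the sketch is the ship, which is consistent with the study's finding that the number of ships matters most for reducing uncertainty.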


Read also

Segmented regression is a standard statistical procedure used to estimate the effect of a policy intervention on time series outcomes. This statistical method assumes normality of the outcome variable, a large sample size, no autocorrelation in the observations, and a linear trend over time. Segmented regression is also very sensitive to outliers. In a small-sample study, if the outcome variable does not follow a Gaussian distribution, then using segmented regression to estimate the intervention effect leads to incorrect inferences. To address the small-sample problem and non-normality in the outcome variable, including outliers, we describe and develop a robust statistical method to estimate the policy intervention effect in a series of longitudinal data. A simulation study is conducted to demonstrate the effect of outliers and non-normality in the outcomes by calculating the power of the test statistics under the segmented regression and the proposed robust statistical methods. Moreover, since finding the sampling distribution of the proposed robust statistic is analytically difficult, we use a nonparametric bootstrap technique to study the properties of the sampling distribution and make statistical inferences. Simulation studies show that the proposed method has more power than the standard t-test used in segmented regression analysis under a non-normal error distribution. Finally, we use the developed technique to estimate the intervention effect of the Istanbul Declaration on illegal organ activities. The robust method detected more significant effects compared to the standard method and provided shorter confidence intervals.
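For context, standard segmented (interrupted time series) regression fits an intercept, a pre-intervention trend, a post-intervention level change, and a post-intervention slope change by ordinary least squares. The sketch below is a generic illustration on simulated data, not the robust estimator proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, t0 = 48, 24                             # 48 monthly observations, intervention at month 24
t = np.arange(T)
post = (t >= t0).astype(float)             # level-change indicator
t_since = np.where(t >= t0, t - t0, 0.0)   # slope change after the intervention

# Simulated outcome: baseline trend plus a drop in level after the intervention
y = 10 + 0.2 * t - 3.0 * post + 0.1 * t_since + rng.normal(0, 1, T)

# Design matrix: [intercept, pre-intervention trend, level change, slope change]
X = np.column_stack([np.ones(T), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated level change at intervention:", round(beta[2], 2))
print("estimated slope change after intervention:", round(beta[3], 3))
```

The paper's point is that when the errors are non-Gaussian or contaminated by outliers, inference on these coefficients with the usual t-test is unreliable, which motivates the robust estimator and the bootstrap for its sampling distribution.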
The determination of the infection fatality rate (IFR) for the novel SARS-CoV-2 coronavirus is a key aim for many of the field studies that are currently being undertaken in response to the pandemic. The IFR, together with the basic reproduction number $R_0$, are the main epidemic parameters describing severity and transmissibility of the virus, respectively. The IFR can also be used as a basis for estimating and monitoring the number of infected individuals in a population, which may subsequently be used to inform policy decisions relating to public health interventions and lockdown strategies. The interpretation of IFR measurements requires the calculation of confidence intervals. We present a number of statistical methods that are relevant in this context and develop an inverse problem formulation to determine correction factors to mitigate time-dependent effects that can lead to biased IFR estimates. We also review a number of methods to combine IFR estimates from multiple independent studies, provide example calculations throughout this note, and conclude with a summary and best practice recommendations. The developed code is available online.
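One simple interval construction treats deaths among infected individuals as a binomial count, so the IFR is a proportion and a Wilson score interval applies. The sketch below is a generic illustration with hypothetical numbers, not necessarily one of the specific methods reviewed in the paper.

```python
import math

def wilson_interval(deaths, infections, z=1.96):
    """Wilson score interval for a binomial proportion,
    here the infection fatality rate deaths / infections."""
    p = deaths / infections
    denom = 1 + z**2 / infections
    centre = (p + z**2 / (2 * infections)) / denom
    half = z * math.sqrt(p * (1 - p) / infections + z**2 / (4 * infections**2)) / denom
    return p, (centre - half, centre + half)

# Hypothetical field-study numbers: 7 deaths among 3,300 estimated infections
ifr, (lo, hi) = wilson_interval(7, 3300)
print(f"IFR = {ifr:.4f}, 95% CI ({lo:.4f}, {hi:.4f})")
```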
The recently published GWTC-1 - a journal article summarizing the search for gravitational waves (GWs) from coalescing compact binaries in data produced by the LIGO-Virgo network of ground-based detectors during their first and second observing runs - quoted estimates for the rates of binary neutron star, neutron star black hole binary, and binary black hole mergers, as well as assigned probabilities of astrophysical origin for various significant and marginal GW candidate events. In this paper, we delineate the formalism used to compute these rates and probabilities, which assumes that triggers above a low ranking statistic threshold, whether of terrestrial or astrophysical origin, occur as independent Poisson processes. In particular, we include an arbitrary number of astrophysical categories by redistributing, via mass-based template weighting, the foreground probabilities of candidate events, across source classes. We evaluate this formalism on synthetic GW data, and demonstrate that this method works well for the kind of GW signals observed during the first and second observing runs.
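In the Poisson-mixture framework described above, the probability that a candidate is astrophysical is the foreground rate density at its ranking statistic divided by the sum of the foreground and terrestrial rate densities. The toy function below illustrates that ratio for several source categories; the numbers and the simplified inputs are made up for illustration and do not reproduce the full template-weighting formalism.

```python
def p_astro(foreground_densities, terrestrial_density):
    """Toy per-category astrophysical probability for one candidate.

    foreground_densities: dict mapping category (e.g. 'BNS', 'NSBH', 'BBH')
        to the expected astrophysical rate density of that category at the
        candidate's ranking statistic.
    terrestrial_density: the terrestrial (noise) rate density at the same value.
    """
    total = terrestrial_density + sum(foreground_densities.values())
    return {k: v / total for k, v in foreground_densities.items()}

# Made-up rate densities for a single marginal candidate
print(p_astro({"BNS": 0.02, "NSBH": 0.005, "BBH": 0.3}, terrestrial_density=0.1))
```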
Cheoljoon Jeong, 2019
This study investigated the mortality rate of parties at real estate auctions compared to that of the overall population in South Korea, using variables including age, real estate usage, cumulative number of real estate auction events, disposal of real estate, and appraisal price. In each case, there was a significant difference between the mortality rate of parties at real estate auctions and that of the overall population, which provides new insight into how information on real estate auctions can be utilized. Further detailed analysis of the correlation between real estate auction events and death is still needed, but because the present result is meaningful in itself, it is summarized here for informational purposes.
Motivation: There is a growing need to integrate mechanistic models of biological processes with computational methods in healthcare in order to improve prediction. We apply data assimilation in the context of Type 2 diabetes to understand parameters associated with the disease. Results: The data assimilation method captures how well patients improve glucose tolerance after their surgery. Data assimilation has the potential to improve phenotyping in Type 2 diabetes.