In this paper, an outlier elimination algorithm for ellipse/ellipsoid fitting is proposed. This two-stage algorithm employs a proximity-based outlier detection algorithm (using the graph Laplacian), followed by a model-based outlier detection algorithm similar to random sample consensus (RANSAC). These two stages compensate for each other so that outliers of various types can be eliminated with reasonable computation. The outlier elimination algorithm considerably improves the robustness of ellipse/ellipsoid fitting, as demonstrated by simulations.
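The paper's exact procedure is not reproduced here, but the RANSAC-like second stage can be sketched as follows. This is a minimal illustration for the 2-D (ellipse) case, assuming an algebraic least-squares conic fit and hypothetical iteration/threshold settings; the graph-Laplacian proximity stage is omitted:

```python
import numpy as np

def conic_features(pts):
    # Design matrix for the algebraic conic a*x^2 + b*xy + c*y^2 + d*x + e*y = 1
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([x * x, x * y, y * y, x, y])

def fit_conic(pts):
    # Least-squares conic fit minimizing algebraic distance; returns 5 coefficients.
    return np.linalg.lstsq(conic_features(pts), np.ones(len(pts)), rcond=None)[0]

def ransac_ellipse_inliers(pts, n_iter=200, thresh=0.05, seed=0):
    # Model-based outlier rejection: repeatedly fit a conic to a random
    # 5-point minimal sample and keep the largest consensus set.
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iter):
        sample = rng.choice(len(pts), size=5, replace=False)
        theta = fit_conic(pts[sample])
        resid = np.abs(conic_features(pts) @ theta - 1.0)
        mask = resid < thresh
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask
```

In a full two-stage pipeline, the proximity stage would first discard gross outliers so that the minimal samples drawn here are more likely to be outlier-free.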
Graphical models have been widely applied to distributed inference problems in sensor networks. In this paper, the problem of coordinating a network of sensors to train a unique ensemble estimator under communication constraints is discussed. The information structure of graphical models with specific potential functions is employed, which converts the collaborative training task into a problem of local training plus global inference. Two important classes of graphical-model inference algorithms, message passing and sampling, are employed to tackle low-dimensional parametrized and high-dimensional non-parametrized problems, respectively. The efficacy of this approach is demonstrated by concrete examples.
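As a rough illustration of the message-passing class, here is a minimal sum-product computation on a three-node binary chain. The unary potentials `phi` and pairwise potential `psi` are made-up values, not quantities from the paper:

```python
import numpy as np

# Chain x1 - x2 - x3 with binary variables.
phi = [np.array([0.6, 0.4]), np.array([0.5, 0.5]), np.array([0.3, 0.7])]
psi = np.array([[1.0, 0.5],
                [0.5, 1.0]])  # same pairwise potential on both edges

# Forward messages (left to right) and backward messages (right to left):
# m_ij(x_j) = sum_{x_i} phi_i(x_i) * psi(x_i, x_j) * (incoming message to i)
m12 = psi.T @ phi[0]
m32 = psi @ phi[2]
m21 = psi @ (phi[1] * m32)

# Node beliefs: unary potential times all incoming messages, then normalized.
b1 = phi[0] * m21
b1 /= b1.sum()
b2 = phi[1] * m12 * m32
b2 /= b2.sum()
```

On a tree-structured graph such as this chain, the resulting beliefs equal the exact marginals; sampling methods become the practical choice once the variables are high-dimensional or the potentials non-parametric.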
This paper introduces a modeling framework for distributed regression with agents/experts observing attribute-distributed data (heterogeneous data). Under this model, a new algorithm, the iterative covariance optimization algorithm (ICOA), is designed to reshape the covariance matrix of the training residuals of individual agents so that the linear combination of the individual estimators minimizes the ensemble training error. Moreover, a scheme (Minimax Protection) is designed to provide a trade-off between the number of data instances transmitted among the agents and the performance of the ensemble estimator without undermining the convergence of the algorithm. This scheme also provides an upper bound (with high probability) on the test error of the ensemble estimator. The efficacy of ICOA combined with Minimax Protection and the comparison between the upper bound and actual performance are both demonstrated by simulations.
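The abstract does not spell out ICOA itself, but the principle it builds on — weighting individual estimators according to the covariance of their training residuals — can be sketched as follows. This is the standard combination-of-estimators result, not the paper's algorithm:

```python
import numpy as np

def optimal_weights(residuals):
    # residuals: (n_samples, n_agents) array of each agent's training residuals.
    # Returns the weights, summing to 1, that minimize the variance of the
    # weighted residual sum: w = Sigma^{-1} 1 / (1^T Sigma^{-1} 1).
    cov = np.cov(residuals, rowvar=False)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / (ones @ w)
```

Under this view, "reshaping" the residual covariance matrix is what lets the subsequent linear combination drive the ensemble training error down: the more the agents' residuals decorrelate, the more the weighted combination can cancel them.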
This paper introduces a framework for regression with dimensionally distributed data and a fusion center. A cooperative learning algorithm, the iterative conditional expectation algorithm (ICEA), is designed within this framework. The algorithm can effectively discover linear combinations of individual estimators trained by each agent without transferring and storing large amounts of data among the agents and the fusion center. The convergence of ICEA is explored. Specifically, for a two-agent system, each complete round of ICEA is guaranteed to be a non-expansive map on the function space of each agent. The advantages and limitations of ICEA are also discussed for data sets with various distributions and various hidden rules. Moreover, several techniques are designed to leverage the algorithm to effectively learn more complex hidden rules that are not linearly decomposable.
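ICEA's alternation resembles backfitting: each agent repeatedly fits the residual left by the other agent's current estimator, using only its own attribute. A rough two-agent sketch with a crude binned-mean smoother (the smoother, bin count, and round count are illustrative assumptions, not the paper's choices):

```python
import numpy as np

def binned_mean_fit(x, r, n_bins=10):
    # Crude 1-D smoother: piecewise-constant fit by per-bin averages of r.
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    means = np.zeros(n_bins)
    for b in range(n_bins):
        sel = idx == b
        if sel.any():
            means[b] = r[sel].mean()
    return lambda xq: means[np.clip(np.digitize(xq, edges) - 1, 0, n_bins - 1)]

def two_agent_backfit(x1, x2, y, n_rounds=20):
    # ICEA-style alternation: each agent fits the residual left by the other
    # agent's current estimator on its own attribute, round after round.
    f1 = lambda x: np.zeros_like(x)
    f2 = lambda x: np.zeros_like(x)
    for _ in range(n_rounds):
        f1 = binned_mean_fit(x1, y - f2(x2))
        f2 = binned_mean_fit(x2, y - f1(x1))
    return f1, f2
```

When the hidden rule is additively (linearly) decomposable across the agents' attributes, this alternation drives the training residual down; rules that are not linearly decomposable are exactly the case the paper's additional techniques target.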
This paper considers estimation of a univariate density from an individual numerical sequence. It is assumed that (i) the limiting relative frequencies of the numerical sequence are governed by an unknown density, and (ii) there is a known upper bound for the variation of the density on an increasing sequence of intervals. A simple estimation scheme is proposed, and is shown to be $L_1$ consistent when (i) and (ii) apply. In addition, it is shown that there is no consistent estimation scheme for the set of individual sequences satisfying only condition (i).
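The proposed scheme is not detailed in this abstract. As a loose illustration of the kind of simple estimator involved — a piecewise-constant density estimate built from the first $n$ terms of the sequence — consider a histogram over a fixed interval (the bin count and support here are assumptions for the sketch):

```python
import numpy as np

def histogram_density(sample, n_bins, support):
    # Piecewise-constant density estimate from an initial segment of the
    # sequence: relative frequency per bin divided by bin width.
    lo, hi = support
    counts, edges = np.histogram(sample, bins=n_bins, range=(lo, hi))
    width = (hi - lo) / n_bins
    heights = counts / (len(sample) * width)
    return edges, heights
```

In a consistency argument of this type, both the bin width and the interval on which the estimate lives would shrink/grow with $n$ at rates tied to the known variation bound in condition (ii).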
We consider univariate regression estimation from an individual (non-random) sequence $(x_1,y_1),(x_2,y_2),\ldots \in \mathbb{R} \times \mathbb{R}$, which is stable in the sense that for each interval $A \subseteq \mathbb{R}$, (i) the limiting relative frequency of $A$ under $x_1, x_2, \ldots$ is governed by an unknown probability distribution $\mu$, and (ii) the limiting average of those $y_i$ with $x_i \in A$ is governed by an unknown regression function $m(\cdot)$. A computationally simple scheme for estimating $m(\cdot)$ is exhibited, and is shown to be $L_2$ consistent for stable sequences $\{(x_i,y_i)\}$ such that $\{y_i\}$ is bounded and there is a known upper bound for the variation of $m(\cdot)$ on intervals of the form $(-i,i]$, $i \geq 1$. Complementing this positive result, it is shown that there is no consistent estimation scheme for the family of stable sequences whose regression functions have finite variation, even under the restriction that $x_i \in [0,1]$ and $y_i$ is binary-valued.
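A partition-based (regressogram) estimate conveys the flavor of such a computationally simple scheme: average the $y_i$ whose $x_i$ fall in each cell of a partition of a truncated support. The fixed partition and support below are illustrative choices, not the paper's construction, in which the resolution would grow with the sample size:

```python
import numpy as np

def regressogram(x, y, n_bins, support):
    # Partition-based regression estimate: within each cell of a partition of
    # [lo, hi], estimate m by the average of the y_i whose x_i land there.
    lo, hi = support
    edges = np.linspace(lo, hi, n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    m_hat = np.zeros(n_bins)
    for b in range(n_bins):
        sel = idx == b
        if sel.any():
            m_hat[b] = y[sel].mean()
    return edges, m_hat
```

By condition (ii) of stability, these cell averages converge to the average of $m(\cdot)$ over each cell, which is where the known variation bound enters the consistency proof.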