
Modeling Spacing Distribution of Queuing Vehicles in Front of a Signalized Junction Using Random-Matrix Theory

Added by Li Li
Publication date: 2008
Field: Physics
Language: English





Modeling of the headway/spacing between two consecutive vehicles has many applications in traffic flow theory and transport practice. Most existing approaches study only vehicles running on freeways. In this paper, we propose a model based on random-matrix theory to explain the spacing distribution of queuing vehicles in front of a signalized junction. We show that recently measured spacing data are well fitted by the spacing distribution of a Gaussian symplectic ensemble (GSE). These results are also compared with the spacing distribution observed in the car-parking problem. The reason why stationary queuing vehicles and parked vehicles exhibit different spacing distributions (GSE versus GUE) appears to lie in the difference between the underlying driving patterns.
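
A quick way to see the kind of comparison the paper makes is to overlay normalized spacing data on the Wigner-surmise forms of the GUE and GSE nearest-neighbor spacing distributions. The sketch below uses the standard textbook surmises; the `spacings` array is placeholder data standing in for measured queue gaps, not the paper's dataset.

```python
# Hedged sketch: comparing normalized spacings against the Wigner surmises
# for the GUE (beta = 2) and GSE (beta = 4) ensembles.
import numpy as np

def p_gue(s):
    """Wigner surmise for the Gaussian unitary ensemble (beta = 2)."""
    return (32.0 / np.pi**2) * s**2 * np.exp(-4.0 * s**2 / np.pi)

def p_gse(s):
    """Wigner surmise for the Gaussian symplectic ensemble (beta = 4)."""
    return (2**18 / (3**6 * np.pi**3)) * s**4 * np.exp(-64.0 * s**2 / (9.0 * np.pi))

# Placeholder data: replace with measured spacings, normalized to unit mean.
spacings = np.random.default_rng(0).gamma(shape=4.0, size=500)
spacings = spacings / spacings.mean()

hist, edges = np.histogram(spacings, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Crude goodness-of-fit: sum of squared residuals against each surmise.
sse_gue = np.sum((hist - p_gue(centers))**2)
sse_gse = np.sum((hist - p_gse(centers))**2)
print(f"SSE vs GUE: {sse_gue:.4f}, SSE vs GSE: {sse_gse:.4f}")
```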



Related research

Airborne gamma-ray surveys are useful for many applications, ranging from geology and mining to public health and nuclear security. In all these contexts, the ability to decompose a measured spectrum into a linear combination of background source terms can provide useful insights into the data and lead to improvements over techniques that use spectral energy windows. Multiple methods for the linear decomposition of spectra exist but are subject to various drawbacks, such as allowing negative photon fluxes or requiring detailed Monte Carlo modeling. We propose using Non-negative Matrix Factorization (NMF) as a data-driven approach to spectral decomposition. Using aerial surveys that include flights over water, we demonstrate that the mathematical approach of NMF finds physically relevant structure in aerial gamma-ray background, namely that measured spectra can be expressed as the sum of nearby terrestrial emission, distant terrestrial emission, and radon and cosmic emission. These NMF background components are compared to the background components obtained using Noise-Adjusted Singular Value Decomposition (NASVD), which contain negative photon fluxes and thus do not represent emission spectra in as straightforward a way. Finally, we comment on potential areas of research that are enabled by NMF decompositions, such as new approaches to spectral anomaly detection and data fusion.
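
For readers unfamiliar with NMF, the kind of decomposition described above can be sketched in a few lines with scikit-learn; the `spectra` matrix below is random placeholder data standing in for measured gamma-ray spectra, not the survey data used in the paper.

```python
# Illustrative sketch (not the authors' code): decomposing a matrix of
# spectra into non-negative components with scikit-learn's NMF.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
spectra = rng.random((200, 128))              # rows: spectra, columns: energy bins

model = NMF(n_components=3, init="nndsvd", max_iter=500)
weights = model.fit_transform(spectra)        # per-spectrum component weights
components = model.components_                # non-negative component spectra

# Each measured spectrum is approximated as weights @ components,
# with every entry constrained to be non-negative.
reconstruction = weights @ components
print("relative error:", np.linalg.norm(spectra - reconstruction) / np.linalg.norm(spectra))
```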
Correlation and similarity measures are widely used across the sciences and social sciences. Often the variables are not numbers but qualitative descriptors known as categorical data. We define and study the similarity matrix, as a measure of similarity, for the case of categorical data. This is of interest due to the deluge of categorical data, such as movie ratings, top-10 rankings, and data from social media, now in the public domain and requiring analysis. We show that the statistical properties of the spectra of similarity matrices constructed from categorical data follow those of random matrix theory. We demonstrate this approach by applying it to data from Indian general elections and sea-level pressures in the North Atlantic Ocean.
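
A minimal illustration of the idea, assuming a simple matching-based similarity (the paper's exact construction may differ): build a symmetric similarity matrix from categorical records, then examine its eigenvalue spacings, which can in turn be compared against random-matrix predictions.

```python
# Sketch: matching-based similarity matrix for categorical data and its spectrum.
import numpy as np

rng = np.random.default_rng(1)
data = rng.integers(0, 5, size=(300, 40))     # 300 items, 40 categorical variables

# Similarity matrix: fraction of variables on which two items share a category.
matches = (data[:, None, :] == data[None, :, :]).mean(axis=2)

# Spectrum of the symmetric similarity matrix; its spacing statistics can be
# compared against random-matrix theory (no proper unfolding is done here).
eigenvalues = np.linalg.eigvalsh(matches)
spacings = np.diff(np.sort(eigenvalues))
spacings = spacings / spacings.mean()
print(spacings[:10])
```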
The distribution of the ratios of nearest-neighbor level spacings has become a popular indicator of spectral fluctuations in complex quantum systems, such as interacting many-body localized and thermalized phases, quantum chaotic systems, and atomic and nuclear physics. In contrast to the level spacing distribution, which requires the cumbersome and at times ambiguous unfolding procedure, the ratios of spacings do not require unfolding and are easier to compute. In this work, for the class of Wigner-Dyson random matrices with nearest-neighbor spacing ratios $r$ distributed as $P_{\beta}(r)$ for the three ensembles indexed by $\beta = 1, 2, 4$, their $k$-th order spacing ratio distributions are shown to be identical to $P_{\beta'}(r)$, where $\beta'$, an integer, is a function of $\beta$ and $k$. This result is shown for Gaussian and circular ensembles of random matrix theory and for several physical systems such as spin chains, chaotic billiards, Floquet systems, and measured nuclear resonances.
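
The surmise for $P_{\beta}(r)$ has a simple closed form, $P_{\beta}(r) \propto (r + r^2)^{\beta} / (1 + r + r^2)^{1 + 3\beta/2}$. The sketch below evaluates it and computes spacing ratios directly from an illustrative, randomly generated spectrum, with no unfolding step; the spectrum is a placeholder, not data from the paper.

```python
# Sketch: spacing-ratio surmise P_beta(r) and ratios computed without unfolding.
import numpy as np

NORMALIZATION = {1: 8.0 / 27.0,
                 2: 4.0 * np.pi / (81.0 * np.sqrt(3.0)),
                 4: 4.0 * np.pi / (729.0 * np.sqrt(3.0))}

def p_ratio(r, beta):
    """Spacing-ratio surmise for beta = 1 (GOE), 2 (GUE), 4 (GSE)."""
    z = NORMALIZATION[beta]
    return (1.0 / z) * (r + r**2)**beta / (1.0 + r + r**2)**(1.0 + 1.5 * beta)

# Ratios are formed directly from ordered levels; no unfolding is required.
levels = np.sort(np.random.default_rng(2).normal(size=1000))  # placeholder spectrum
s = np.diff(levels)
r = s[1:] / s[:-1]
print("mean measured ratio:", r.mean())
print(p_ratio(np.array([0.5, 1.0, 2.0]), beta=2))
```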
The in situ measurement of the particle size distribution (PSD) of a suspension of particles presents huge challenges. Various effects from the process could introduce noise to the data from which the PSD is estimated. This in turn could lead to the occurrence of artificial peaks in the estimated PSD. Limitations in the models used in the PSD estimation could also lead to the occurrence of these artificial peaks. This could pose a significant challenge to in situ monitoring of particulate processes, as there will be no independent estimate of the PSD to allow a discrimination of the artificial peaks to be carried out. Here, we present an algorithm which is capable of discriminating between artificial and true peaks in PSD estimates based on fusion of multiple data streams. In this case, chord length distribution and laser diffraction data have been used. The data fusion is done by means of multi-objective optimisation using the weighted sum approach. The algorithm is applied to two different particle suspensions. The estimated PSDs from the algorithm are compared with offline estimates of PSD from the Malvern Mastersizer and Morphologi G3. The results show that the algorithm is capable of eliminating an artificial peak in a PSD estimate when this artificial peak is sufficiently displaced from the true peak. However, when the artificial peak is too close to the true peak, it is only suppressed but not completely eliminated.
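
The weighted-sum scalarization mentioned above reduces the two-objective problem to a single objective; a minimal sketch is shown below, with placeholder misfit terms `j_cld` and `j_ld` standing in for the chord-length-distribution and laser-diffraction fits used in the paper.

```python
# Sketch of the weighted-sum approach to fusing two data streams.
import numpy as np
from scipy.optimize import minimize

target_cld = np.array([0.2, 0.5, 0.3])        # placeholder measurements
target_ld = np.array([0.25, 0.45, 0.30])

def j_cld(x):
    return np.sum((x - target_cld) ** 2)      # misfit against CLD data

def j_ld(x):
    return np.sum((x - target_ld) ** 2)       # misfit against LD data

def weighted_objective(x, w=0.5):
    # The weighted sum collapses the two objectives into a single scalar.
    return w * j_cld(x) + (1.0 - w) * j_ld(x)

x0 = np.full(3, 1.0 / 3.0)
result = minimize(weighted_objective, x0, args=(0.5,))
print(result.x)
```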
We present a Bayesian approach for the Contamination Source Detection problem in Water Distribution Networks. Given an observation of contaminants in one or more nodes of the network, we try to give a probable explanation for it, assuming that contamination is a rare event. We introduce extra variables to characterize the place and pattern of the first contamination event, and then write down the posterior distribution of these extra variables given the observation obtained by the sensors. Our method relies on Belief Propagation to evaluate the marginals of this posterior distribution and to determine the most likely origin. The method is implemented on a simplified binary forward-in-time dynamics. Simulations on two networks, with data coming from the realistic simulation software EPANET, show that the simplified model is nevertheless flexible enough to capture crucial information about contaminant sources.
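
To make the Bayesian formulation concrete, the toy sketch below enumerates candidate sources exactly on a tiny hypothetical network instead of using Belief Propagation; the posterior over sources is proportional to how well each candidate reproduces the sensor observation under a binary forward-in-time spread. All network data here are invented for illustration.

```python
# Toy sketch: exact posterior over contamination sources on a tiny line network.
# (Exhaustive enumeration stands in for Belief Propagation on this small example.)
import numpy as np

# Hypothetical 5-node line network; contamination spreads one hop per time step.
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

def contaminated_after(source, steps):
    """Binary forward-in-time dynamics: nodes reached from `source` in `steps` hops."""
    frontier, reached = {source}, {source}
    for _ in range(steps):
        frontier = {n for node in frontier for n in adjacency[node]} - reached
        reached |= frontier
    return reached

observation = {2, 3}          # sensors that reported contamination
steps = 1

# Uniform prior over sources; likelihood 1 if the observation is reproduced.
posterior = {}
for source in adjacency:
    reached = contaminated_after(source, steps)
    posterior[source] = 1.0 if observation <= reached else 0.0
total = sum(posterior.values())
posterior = {k: v / total for k, v in posterior.items()}
print(posterior)
```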
