
On the statistical description of the inbound air traffic over Heathrow airport

Added by Carlo Lancia
Publication date: 2013
Language: English





We present a model to describe the inbound air traffic over a congested hub. We show that this model gives a very accurate description of the traffic by comparing our theoretical queue-length distribution with the actual distribution observed at Heathrow airport. We also discuss the robustness of our model.
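The abstract does not spell out the model, so the sketch below is only a generic stand-in, not the authors' construction: pre-scheduled arrivals perturbed by random delays feed a single-server queue with deterministic service, and the simulated queue-length distribution is what one would compare against the observed Heathrow distribution. All parameter values are hypothetical.

```python
# Minimal illustrative sketch (not the paper's exact model): arrivals are
# pre-scheduled every `tau` minutes, each perturbed by an exponential delay,
# and served by a single runway with deterministic service time `s`.
# All parameter values below are made up.
import numpy as np

rng = np.random.default_rng(0)

def simulate_queue_lengths(n_flights=5000, tau=1.5, delay_scale=20.0, s=1.2):
    """Simulate the queue and return its length just before each arrival."""
    scheduled = tau * np.arange(n_flights)
    actual = np.sort(scheduled + rng.exponential(delay_scale, n_flights))
    depart = np.zeros(n_flights)               # service-completion times
    queue_at_arrival = np.zeros(n_flights, dtype=int)
    for i, t in enumerate(actual):
        # earlier flights still waiting or in service at time t
        queue_at_arrival[i] = np.sum(depart[:i] > t)
        start = max(t, depart[i - 1]) if i else t
        depart[i] = start + s
    return queue_at_arrival

queue = simulate_queue_lengths()
values, counts = np.unique(queue, return_counts=True)
distribution = counts / counts.sum()           # simulated queue-length law
print(dict(zip(values.tolist(), np.round(distribution, 3).tolist())))
# In the spirit of the paper, this distribution would then be compared with
# the one observed from actual arrival data, e.g. via a distance between laws.
```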



Related research

This paper introduces a statistical model for the arrival times of connection events in a computer network. Edges between nodes in a network can be interpreted and modelled as point processes where events in the process indicate information being sent along that edge. A model of normal behaviour can be constructed for each edge in the network by identifying key network user features such as seasonality and self-exciting behaviour, where events typically arise in bursts at particular times of day. When monitoring the network in real time, unusual patterns of activity could indicate the presence of a malicious actor. Four different models for self-exciting behaviour are introduced and compared using data collected from the Imperial College and Los Alamos National Laboratory computer networks.
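The self-exciting behaviour mentioned above is commonly captured with a Hawkes process; the sketch below evaluates the conditional intensity of an exponential-kernel Hawkes process on made-up event times. It illustrates the general technique only, not the four specific models compared in the paper.

```python
# Hedged sketch of a self-exciting (Hawkes) conditional intensity with an
# exponential kernel: lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
# The parameters and event times are invented for illustration.
import numpy as np

def hawkes_intensity(t, event_times, mu=0.2, alpha=0.8, beta=1.5):
    """Conditional intensity at time t given past events (self-excitation)."""
    past = np.asarray([s for s in event_times if s < t])
    return mu + alpha * np.exp(-beta * (t - past)).sum()

events = [0.5, 0.7, 0.9, 4.0]          # a burst followed by an isolated event
for t in (1.0, 2.0, 5.0):
    print(f"lambda({t}) = {hawkes_intensity(t, events):.3f}")
# The intensity is high just after the burst and decays back toward the
# baseline mu, which is the signature of bursty, self-exciting edge activity.
```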
In order to maintain consistent quality of service, computer network engineers face the task of monitoring the traffic fluctuations on the individual links making up the network. However, due to resource constraints and limited access, it is not possible to directly measure all the links. Starting with a physically interpretable probabilistic model of network-wide traffic, we demonstrate how an expensively obtained set of measurements may be used to develop a network-specific model of the traffic across the network. This model may then be used in conjunction with easily obtainable measurements to provide more accurate prediction than is possible with only the inexpensive measurements. We show that the model, once learned, may be used for the same network over many different periods of traffic. Finally, we show an application of the prediction technique to create relevant control charts for detection and isolation of shifts in network traffic.
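As a rough illustration of the control-chart idea in the last sentence, the following sketch flags a shift in link traffic when the prediction residual leaves a 3-sigma band; the data, the calibration window, and the threshold are hypothetical choices, not the paper's.

```python
# Hedged sketch: flag a shift in link traffic when the residual between the
# model's prediction and the cheap measurement leaves a 3-sigma control band.
# The data, the in-control window and the 3-sigma rule are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
predicted = np.full(200, 100.0)              # model forecast (Mb/s), hypothetical
observed = predicted + rng.normal(0, 5, 200)
observed[150:] += 25                         # injected mean shift to detect

residual = observed - predicted
mu, sigma = residual[:100].mean(), residual[:100].std()   # in-control calibration
alarms = np.where(np.abs(residual - mu) > 3 * sigma)[0]
print("first alarm at sample:", alarms.min() if alarms.size else None)
```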
We consider the design of skyport locations for air taxis accessing airports and adopt a novel use of the classic hub location problem to properly make trade-offs on access distances for travelers to skyports from other zones, which is shown to reduce costs relative to a clustering approach from the literature. Extensive experiments on data from New York City show the method outperforms the benchmark clustering method by more than 7.4%. Results suggest that six skyports located between Manhattan and Brooklyn can adequately serve the airport access travel needs and are sufficiently stable against travel time or transfer time increases.
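For intuition about the underlying placement trade-off, here is a minimal greedy sketch that picks k candidate skyport sites to minimise total access distance from demand zones; it is a generic heuristic with made-up coordinates, not the hub location formulation or data used in the paper.

```python
# Hedged sketch: greedy selection of k candidate sites minimising the total
# access distance from demand zones -- a simple stand-in for the hub location
# trade-off described above. Coordinates and k are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
zones = rng.uniform(0, 10, size=(50, 2))       # demand-zone centroids
candidates = rng.uniform(0, 10, size=(15, 2))  # candidate skyport sites
k = 6                                          # the abstract suggests six skyports

dist = np.linalg.norm(zones[:, None, :] - candidates[None, :, :], axis=2)

chosen = []
for _ in range(k):
    best = min(
        (j for j in range(len(candidates)) if j not in chosen),
        key=lambda j: dist[:, chosen + [j]].min(axis=1).sum(),
    )
    chosen.append(best)

print("chosen sites:", chosen)
print("total access distance:", dist[:, chosen].min(axis=1).sum().round(2))
```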
Transport operators have a range of intervention options available to improve or enhance their networks. Such interventions are often made in the absence of sound evidence on resulting outcomes. Cycling superhighways were promoted as a sustainable and healthy travel mode, one of the aims of which was to reduce traffic congestion. Estimating the impacts that cycle superhighways have on congestion is complicated due to the non-random assignment of such interventions over the transport network. In this paper, we analyse the causal effect of cycle superhighways utilising pre-intervention and post-intervention information on traffic and road characteristics along with socio-economic factors. We propose a modelling framework based on the propensity score and an outcome regression model. The method is also extended to the doubly robust set-up. Simulation results show that the proposed method outperforms existing competitors. The method is applied to analyse a real dataset on the London transport network. The proposed methodology can assist in effective decision making to improve network performance.
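The propensity-score and doubly robust ideas mentioned above can be illustrated with a generic augmented inverse-probability-weighting (AIPW) estimator on simulated data; the sketch below is a textbook version of that estimator, not the specific framework proposed in the paper.

```python
# Hedged sketch of the doubly robust (AIPW) idea: combine a propensity-score
# model with an outcome regression. Data are simulated; covariates, effect
# size and model choices are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=(n, 2))                              # road / socio-economic covariates
treat = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))      # non-random assignment
y = 2.0 * treat + x @ np.array([1.0, -0.5]) + rng.normal(size=n)

ps = LogisticRegression().fit(x, treat).predict_proba(x)[:, 1]   # propensity scores
m1 = LinearRegression().fit(x[treat == 1], y[treat == 1]).predict(x)
m0 = LinearRegression().fit(x[treat == 0], y[treat == 0]).predict(x)

ate = np.mean(m1 - m0
              + treat * (y - m1) / ps
              - (1 - treat) * (y - m0) / (1 - ps))
print("doubly robust ATE estimate:", round(ate, 3))      # true effect here is 2.0
```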
Duncan Lee, Gavin Shaddick (2012)
The relationship between short-term exposure to air pollution and mortality or morbidity has been the subject of much recent research, in which the standard method of analysis uses Poisson linear or additive models. In this paper we use a Bayesian dynamic generalised linear model (DGLM) to estimate this relationship, which allows the standard linear or additive model to be extended in two ways: (i) the long-term trend and temporal correlation present in the health data can be modelled by an autoregressive process rather than a smooth function of calendar time; (ii) the effects of air pollution are allowed to evolve over time. The efficacy of these two extensions is investigated by applying a series of dynamic and non-dynamic models to air pollution and mortality data from Greater London. A Bayesian approach is taken throughout, and a Markov chain Monte Carlo simulation algorithm is presented for inference. An alternative likelihood-based analysis is also presented, in order to allow a direct comparison with the only previous analysis of air pollution and health data using a DGLM.
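To make the two extensions concrete, the sketch below forward-simulates data with the DGLM structure described: a random-walk pollution effect (extension ii) and an AR(1) long-term trend (extension i) driving Poisson counts. All values are simulated, not the Greater London data, and the inference step is only indicated in a comment.

```python
# Hedged sketch of the dynamic GLM structure: daily mortality counts are
# Poisson with a log-linear predictor in which the pollution effect beta_t
# follows a random walk and the long-term trend is AR(1). Values are made up.
import numpy as np

rng = np.random.default_rng(4)
T = 365
pollution = rng.gamma(shape=4.0, scale=5.0, size=T)      # e.g. daily PM10 levels

beta = np.cumsum(rng.normal(0, 0.002, T)) + 0.01         # evolving pollution effect
trend = np.zeros(T)
for t in range(1, T):
    trend[t] = 0.9 * trend[t - 1] + rng.normal(0, 0.05)  # AR(1) long-term trend

log_rate = np.log(15.0) + trend + beta * pollution       # baseline ~15 deaths/day
deaths = rng.poisson(np.exp(log_rate))
print("mean daily deaths:", deaths.mean().round(2))
# Inference for beta_t and the trend would proceed via MCMC (e.g. Metropolis
# within Gibbs), as in the Bayesian analysis described in the abstract.
```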
