Empirical observations and theoretical studies indicate that the overall travel time of vehicles in a traffic network can be optimized by means of ramp-metering control systems. Here, we present an analysis of traffic data from the highway network of North Rhine-Westphalia in order to identify and characterize the sections of the network that limit its performance, i.e., the bottlenecks. We clarify whether the bottlenecks are of a topological nature or whether they are constituted by on-ramps. This allows us to judge possible optimization mechanisms and reveals in which areas of the network they have to be applied.
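To make this kind of bottleneck identification concrete, the sketch below flags sections where a jam front comes to rest; all detector data, the speed threshold, and the section layout are hypothetical, and this is not the analysis procedure used in the study.

```python
import numpy as np

# Hypothetical loop-detector speeds (km/h): rows are consecutive highway
# sections in driving direction, columns are time intervals.
speeds = np.array([
    [108, 104,  42,  38, 101],   # section 0 (upstream)
    [110,  47,  39,  92, 106],   # section 1: congestion forms here first
    [112, 109, 107, 111, 110],   # section 2 (downstream, stays free)
])

V_CONG = 60.0                    # assumed threshold separating jam from free flow
congested = speeds < V_CONG

# A section is a bottleneck candidate when it is congested while the
# section immediately downstream still flows freely: the head of the
# queue, and hence the capacity restriction, is located there.
candidate = congested[:-1] & ~congested[1:]
for s in range(candidate.shape[0]):
    bins = np.flatnonzero(candidate[s])
    if bins.size:
        print(f"section {s}: bottleneck candidate in time bins {bins.tolist()}")
```

Cross-referencing the flagged sections with the positions of on-ramps and topological features such as lane drops or junctions then distinguishes ramp-induced from topological bottlenecks.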
Simple cellular automaton models are able to reproduce the basic properties of highway traffic. Comparison with empirical data for microscopic quantities, however, requires a more detailed description of the elementary dynamics. Based on existing cellular automaton models, we propose an improved discrete model incorporating anticipation effects, reduced acceleration capabilities, and an enhanced interaction horizon for braking. The modified model is able to reproduce the three phases observed in real traffic: free flow, synchronized traffic, and stop-and-go waves. Furthermore, we find good agreement with detailed empirical single-vehicle data in all phases.
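A schematic single-lane update in this spirit is sketched below; the parameters are assumed, and the rules are a simplified stand-in (standard Nagel-Schreckenberg dynamics plus an anticipation term), not the model's exact definition.

```python
import random

L, N = 200, 40            # ring length in cells and number of cars (assumed)
V_MAX, P_BRAKE = 5, 0.2   # maximum speed and randomization probability (assumed)

pos = sorted(random.sample(range(L), N))  # cars ordered along the ring
vel = [0] * N

def step(pos, vel):
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % L
        gap_next = (pos[(i + 2) % n] - pos[(i + 1) % n] - 1) % L
        # Anticipation: the predecessor will advance at least this many
        # cells, so part of its headway can be used safely (the one-cell
        # security margin keeps the parallel update collision-free).
        v_anti = max(min(gap_next, vel[(i + 1) % n]) - 1, 0)
        v = min(vel[i] + 1, V_MAX, gap + v_anti)  # accelerate by at most 1
        if v > 0 and random.random() < P_BRAKE:
            v -= 1                                # random braking
        new_vel.append(v)
    new_pos = [(p + v) % L for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

for _ in range(1000):
    pos, vel = step(pos, vel)
```

The extended braking horizon enters through `v_anti`: a car effectively looks beyond its immediate predecessor to the gap in front of it, which smooths the approach to jams.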
Numerous groups have applied a variety of deep learning techniques to computer vision problems in highway perception scenarios. In this paper, we present a number of empirical evaluations of recent deep learning advances. Computer vision, combined with deep learning, has the potential to bring about a relatively inexpensive, robust solution to autonomous driving. To prepare deep learning for industry uptake and practical applications, neural networks will require large data sets that represent all possible driving environments and scenarios. We collect a large data set of highway data and apply deep learning and computer vision algorithms to problems such as car and lane detection. We show how existing convolutional neural networks (CNNs) can be used to perform lane and vehicle detection at the frame rates required for a real-time system. Our results lend credence to the hypothesis that deep learning holds promise for autonomous driving.
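As a minimal illustration of such a detection pipeline, and emphatically not the networks evaluated in the paper, the following PyTorch sketch defines a tiny patch classifier that could be slid across highway frames to mark vehicle locations; the architecture and sizes are assumptions.

```python
import torch
import torch.nn as nn

# A deliberately small vehicle/background patch classifier, sketched only
# to make the CNN pipeline concrete; real detectors are far larger.
class PatchNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # scores for vehicle vs. background

    def forward(self, x):             # x: (batch, 3, H, W) image patches
        return self.head(self.features(x).flatten(1))

net = PatchNet().eval()
with torch.no_grad():
    scores = net(torch.randn(8, 3, 64, 64))   # 8 dummy 64x64 patches
print(scores.shape)                            # torch.Size([8, 2])
```

In practice, fully convolutional variants of such networks score all patch positions in a single forward pass, which is one common route to the frame rates a real-time system needs.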
Weighted least-squares fitting to a database of quantum mechanical calculations can determine the optimal parameters of empirical potential models. While algorithms exist to provide optimal potential parameters for a given fitting database of structures and their structure-property functions, and to estimate prediction errors using Bayesian sampling, defining an optimal fitting database based on potential predictions remains elusive. A testing set of structures and their structure-property functions provides an empirical measure of potential transferability. Here, we propose an objective function for fitting databases based on testing-set errors. The objective function allows the optimization of the weights in a fitting database, the assessment of the inclusion or removal of structures in the fitting database, or the comparison of two different fitting databases. To showcase this technique, we consider an example Lennard-Jones potential for Ti, where modeling multiple complicated crystal structures is difficult for a radial pair potential. The algorithm finds different optimal fitting databases, depending on the objective function of potential prediction error for a testing set.
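The following toy sketch shows the structure of such an optimization for a Lennard-Jones pair potential: an inner weighted least-squares fit of the potential parameters, wrapped in an outer minimization of the testing-set error over the database weights. The data, the "structures" (reduced here to single pair distances), and all starting values are fabricated for illustration and are not the Ti database of the paper.

```python
import numpy as np
from scipy.optimize import least_squares, minimize

def lj_energy(r, eps, sig):
    """Lennard-Jones pair energy at separation r (toy model)."""
    x = (sig / r) ** 6
    return 4.0 * eps * (x * x - x)

# Hypothetical databases: pair distances and reference energies.
r_fit,  e_fit  = np.array([2.6, 2.9, 3.3]), np.array([-1.9, -2.1, -1.4])
r_test, e_test = np.array([2.7, 3.1]),      np.array([-2.0, -1.8])

def fit_potential(weights):
    """Inner loop: weighted least-squares fit of (eps, sigma)."""
    res = least_squares(
        lambda p: np.sqrt(weights) * (lj_energy(r_fit, *p) - e_fit),
        x0=[2.0, 2.5])
    return res.x

def testing_error(log_w):
    """Outer objective: RMS error of the fitted potential on the testing set."""
    eps, sig = fit_potential(np.exp(log_w))   # log-weights keep weights positive
    return np.sqrt(np.mean((lj_energy(r_test, eps, sig) - e_test) ** 2))

opt = minimize(testing_error, x0=np.zeros(3), method="Nelder-Mead")
print("optimal weights:", np.exp(opt.x), "test RMSE:", opt.fun)
```

Because only the relative weights matter for the inner fit, fixing their overall scale (or normalizing them) would remove a flat direction from the outer search.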
Information flow analysis prevents secret or untrusted data from flowing into public or trusted sinks. Existing mechanisms cover a wide array of options, ranging from lightweight taint analysis to heavyweight information flow control that also considers implicit flows. Dynamic analysis, which is particularly popular for languages such as JavaScript, faces the question of whether to invest in analyzing flows caused by not executing a particular branch, so-called hidden implicit flows. This paper addresses the questions of how common different kinds of flows are in real-world programs, how important these flows are for enforcing security policies, and how costly it is to consider them. We address these questions in an empirical study of 56 real-world JavaScript programs that suffer from various security problems, such as code injection vulnerabilities, denial-of-service vulnerabilities, memory leaks, and privacy leaks. The study is based on a state-of-the-art dynamic information flow analysis and a formalization of its core. We find that implicit flows are expensive to track in terms of permissiveness, label creep, and runtime overhead. We find a lightweight taint analysis to be sufficient for most of the studied security problems, while for some privacy-related code, tracking observable implicit flows is sometimes required. In contrast, we do not find any evidence that tracking hidden implicit flows reveals otherwise missed security problems. Our results help security analysts and analysis designers understand the cost-benefit tradeoffs of information flow analysis and provide empirical evidence that analyzing implicit flows in a cost-effective way is a relevant problem.
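To make the distinction concrete, here is a minimal taint tracker, written in Python rather than JavaScript for brevity, that propagates labels through explicit flows only; the implicit flow at the end slips past it, which is precisely the class of flows whose tracking cost the study quantifies. All names are illustrative.

```python
class Tainted(str):
    """A string carrying a taint label; everything else is untainted."""

class TaintError(Exception):
    pass

def concat(a, b):
    """Explicit flow: the result is tainted if either operand is."""
    out = str(a) + str(b)
    return Tainted(out) if isinstance(a, Tainted) or isinstance(b, Tainted) else out

def sink(query):
    """Sensitive sink (e.g., a database query): reject tainted data."""
    if isinstance(query, Tainted):
        raise TaintError("tainted value reached a sensitive sink")
    print("executing:", query)

user = Tainted("alice'; DROP TABLE users; --")      # untrusted source
try:
    sink(concat("SELECT * FROM users WHERE name='", concat(user, "'")))
except TaintError as e:
    print("blocked:", e)

# An implicit flow the tracker misses: the branch copies the secret bit
# into `leak` without any tainted value flowing into it directly.
secret = Tainted("1")
leak = "1" if secret == "1" else "0"   # plain, untainted str
sink(leak)                             # passes unnoticed
```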
A two-lane extension of a recently proposed cellular automaton model for traffic flow is discussed. The analysis focuses on the reproduction of the lane-usage inversion and the density dependence of the number of lane changes. It is shown that the single-lane dynamics can be extended to the two-lane case without changing the basic properties of the model, which are known to be in good agreement with empirical single-vehicle data. Therefore, it is possible to reproduce various empirically observed two-lane phenomena, such as the synchronization of the lanes, without fine-tuning the model parameters.
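A schematic version of such a two-lane extension is sketched below: a lane-change sub-step with an incentive criterion (more headway in the other lane) and a safety criterion (sufficient back gap), after which the single-lane update runs on each lane independently. The concrete rules and parameters are illustrative simplifications, not the model's exact criteria, and velocities are omitted for brevity.

```python
import random

L = 100            # ring length in cells (assumed)
V_MAX = 5          # worst-case speed assumed for the new follower
P_CHANGE = 0.5     # lane-change probability when both criteria hold (assumed)

def gap(pos, occupied, direction):
    """Free cells from `pos` to the next occupied cell ahead (+1) or behind (-1)."""
    for d in range(1, L):
        if (pos + direction * d) % L in occupied:
            return d - 1
    return L - 1

def lane_change_step(lanes):
    """lanes: two sets of occupied cell positions, one per lane."""
    moves = []
    for own, other in ((0, 1), (1, 0)):
        for p in lanes[own]:
            if p in lanes[other]:
                continue                                   # target cell occupied
            incentive = gap(p, lanes[other], +1) > gap(p, lanes[own], +1)
            safe = gap(p, lanes[other], -1) >= V_MAX       # follower need not brake
            if incentive and safe and random.random() < P_CHANGE:
                moves.append((own, other, p))
    for own, other, p in moves:                            # apply in parallel
        lanes[own].discard(p)
        lanes[other].add(p)
```

In the full model, each car's velocity moves with it and a single-lane update (such as the earlier sketch) then advances both lanes. The rules above are symmetric; asymmetric European-style rules would add a keep-right incentive to the criterion.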