The efficacy of sensor data in modern bridge condition evaluations has been undermined by inaccessible technologies. While the links between vibrational properties and structural health have been well established, high costs associated with specialized sensor networks have prevented the integration of such data with bridge management systems. In the last decade, researchers predicted that crowd-sourced mobile sensor data, collected ubiquitously and cheaply, would revolutionize our ability to maintain existing infrastructure; yet no such applications have successfully overcome the challenge of extracting useful information in the field with sufficient precision. Here we fill this knowledge gap by showing that critical physical properties of a real bridge can be determined accurately from everyday vehicle trip data. We collected smartphone data from controlled field experiments and UBER rides on the Golden Gate Bridge and developed an analytical method to recover modal properties, which paves the way for scalable, cost-effective structural health monitoring based on this abundant data class. Our results are consistent with a comprehensive study on the Golden Gate Bridge. We assess the benefit of continuous monitoring with reliability models and show that the inclusion of crowd-sourced data in a bridge maintenance plan can add over fourteen years of service (a 30% increase) to a bridge without additional costs. These results certify the immediate value of large-scale data sources for studying the health of existing infrastructure, whether the data are crowdsensed or generated by organized vehicle fleets such as ridesourcing companies or municipalities.
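The core idea of recovering modal properties from trip data can be illustrated with a much simpler stand-in: picking the dominant frequency of a vertical-acceleration trace from its magnitude spectrum. This is a minimal sketch only, not the paper's identification method; the 0.23 Hz mode, sampling rate, and noise level are invented for illustration.

```python
import numpy as np

def dominant_frequency(accel, fs):
    """Return the dominant frequency (Hz) of an acceleration trace via a
    magnitude-spectrum peak pick (a toy stand-in for modal identification)."""
    accel = accel - np.mean(accel)                 # remove DC / gravity offset
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic "trip": a hypothetical 0.23 Hz bridge mode buried in vehicle
# noise, sampled at 100 Hz for 60 s.
np.random.seed(0)
fs = 100.0
t = np.arange(0, 60, 1.0 / fs)
signal = np.sin(2 * np.pi * 0.23 * t) + 0.5 * np.random.randn(len(t))
f_hat = dominant_frequency(signal, fs)             # close to 0.23 Hz
```

In practice the challenge the abstract describes is exactly what this sketch ignores: separating the bridge's response from vehicle dynamics and road roughness across many uncontrolled trips.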
Ice-covered ocean worlds possess diverse energy sources and associated mechanisms that are capable of driving significant seismic activity, but to date no measurements of their seismic activity have been obtained. Such investigations could probe their transport properties and radial structures, with possibilities for locating and characterizing trapped liquids that may host life and yielding critical constraints on redox fluxes, and thus on habitability. Modeling efforts have examined seismic sources from tectonic fracturing and impacts. Here, we describe other possible seismic sources, their associations with science questions constraining habitability, and the feasibility of implementing such investigations. We argue, by analogy with the Moon, that detectable seismic activity on tidally flexed ocean worlds should occur frequently. Their ices fracture more easily than rocks, and they dissipate more tidal energy than the <1 GW dissipated by the Moon and Mars. Icy ocean worlds also should create less thermal noise for a seismometer, owing to their greater distance from the Sun and consequently smaller diurnal temperature variations. They also lack substantial atmospheres (except in the case of Titan) that would create additional noise. Thus, seismic experiments could be less complex and less susceptible to noise than prior or planned planetary seismology investigations of the Moon or Mars.
The tragedy of the digital commons does not prevent the copious voluntary production of content that one witnesses on the web. We show, through an analysis of a massive dataset from YouTube, that productivity in crowdsourcing exhibits a strong positive dependence on attention, measured by the number of downloads. Conversely, a lack of attention leads to a decrease in the number of videos uploaded and a consequent drop in productivity, which in many cases asymptotes to no uploads whatsoever. Moreover, uploaders compare themselves to others when their productivity is low, and to their own past performance once it exceeds a threshold.
In this paper we present the first population-level, city-scale analysis of application usage on smartphones. Using deep packet inspection at the network operator level, we obtained a geo-tagged dataset covering more than 6 million unique devices that launched more than 10,000 unique applications across the city of Shanghai over one week. We develop a technique that leverages transfer learning to predict which applications are most popular at a given location, and to estimate the full usage distribution there, based on that location's Point of Interest (POI) information. We demonstrate that our technique achieves an 83.0% hit rate in identifying the top five popular applications, and a 0.15 RMSE when estimating usage from just 10% sampled sparse data, outperforming existing state-of-the-art approaches by about 25.7%. Our findings pave the way for predicting which apps are relevant to a user given their current location, and which applications are popular where. The implications of our findings are broad: they enable a range of systems to benefit from such timely predictions, including operating systems, network operators, app stores, advertisers, and service providers.
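The intuition behind predicting app usage from POI information can be sketched with a deliberately simple baseline (not the paper's transfer-learning model): represent each location by its POI profile and copy the usage distribution of the most similar known location. All POI categories, usage shares, and numbers below are invented for illustration.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two POI profile vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def predict_usage(poi, known_pois, known_usage):
    """Predict a location's app-usage distribution as that of the most
    POI-similar known location (a nearest-neighbor baseline)."""
    sims = [cosine(poi, k) for k in known_pois]
    return known_usage[int(np.argmax(sims))]

# POI profiles: counts of [restaurants, offices, transit stops] (made up).
known_pois = np.array([[9.0, 1.0, 2.0],    # a dining district
                       [1.0, 8.0, 3.0]])   # a business district
known_usage = np.array([[0.5, 0.3, 0.2],   # app shares at each location
                        [0.1, 0.2, 0.7]])
new_loc = np.array([8.0, 2.0, 1.0])        # restaurant-heavy, unlabeled
pred = predict_usage(new_loc, known_pois, known_usage)
```

Here the restaurant-heavy query inherits the dining district's usage distribution; the paper's contribution is doing this robustly at city scale from sparse samples, which a raw nearest-neighbor lookup cannot.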
Thanks to rapid advances in technologies like GPS and Wi-Fi positioning, smartphone users are able to determine their location almost everywhere they go. This is not true, however, of people who are traveling in underground public transportation networks, one of the few types of high-traffic areas where smartphones do not have access to accurate position information. In this paper, we introduce the problem of underground transport positioning on smartphones and present SubwayPS, an accelerometer-based positioning technique that allows smartphones to determine their location substantially better than baseline approaches, even deep beneath city streets. We highlight several immediate applications of positioning in subway networks in domains ranging from mobile advertising to mobile maps and present MetroNavigator, a proof-of-concept smartphone and smartwatch app that notifies users of upcoming points-of-interest and alerts them when it is time to get ready to exit the train.
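One building block of accelerometer-only subway positioning is segmenting the trace into "moving" and "stopped" stretches, so that station stops can be counted along a known line. The sketch below shows that idea only; it is not the SubwayPS algorithm, and the window length, threshold, and noise levels are invented.

```python
import numpy as np

def count_station_stops(accel_mag, fs, window_s=2.0, thresh=0.05):
    """Count stopped->moving transitions in an acceleration-magnitude trace.

    Per-window standard deviation is low while the train sits at a station;
    each low->high transition marks the train pulling away from a stop.
    """
    win = int(window_s * fs)
    n = len(accel_mag) // win
    stds = accel_mag[: n * win].reshape(n, win).std(axis=1)
    moving = stds > thresh
    return int(np.sum(~moving[:-1] & moving[1:]))

# Synthetic ride: three cycles of 10 s at a station (quiet) followed by
# 20 s in motion (vibration), sampled at 50 Hz.
fs = 50.0
rng = np.random.default_rng(0)
segments = []
for _ in range(3):
    segments.append(0.01 * rng.standard_normal(int(10 * fs)))  # stopped
    segments.append(0.20 * rng.standard_normal(int(20 * fs)))  # moving
trace = np.concatenate(segments)
stops = count_station_stops(trace, fs)
```

Counting three departures on this trace lets a phone advance its position estimate one station at a time without any radio signal, which is the setting the abstract targets.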
The City of Detroit maintains an active fleet of over 2,500 vehicles, spending an annual average of over $5 million on new vehicle purchases and over $7.7 million on maintaining this fleet. Understanding the patterns and trends in this data could be useful to a variety of stakeholders, particularly as Detroit emerges from Chapter 9 bankruptcy, but the patterns in such data are often complex and multivariate, and the city lacks dedicated resources for detailed analysis. This work, a data collaboration between the Michigan Data Science Team (http://midas.umich.edu/mdst) and the City of Detroit's Operations and Infrastructure Group, seeks to address this unmet need by analyzing data from the City of Detroit's entire vehicle fleet from 2010 to 2017. We utilize tensor decomposition techniques to discover and visualize unique temporal patterns in vehicle maintenance; apply differential sequence mining to demonstrate the existence of common and statistically unique maintenance sequences by vehicle make and model; and, after showing these time dependencies in the dataset, demonstrate an application of a predictive Long Short-Term Memory (LSTM) neural network model to predict maintenance sequences. Our analysis shows both the complexities of municipal vehicle fleet data and useful techniques for mining and modeling such data.
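The differential sequence mining step can be illustrated in miniature: compute the support of an ordered maintenance pattern in two vehicle groups and compare. This is a toy sketch of the idea, not the paper's method; the maintenance histories and job names below are invented.

```python
def support(histories, pattern):
    """Fraction of maintenance histories containing `pattern` as an
    ordered (not necessarily contiguous) subsequence."""
    def contains(seq):
        it = iter(seq)
        # Membership tests consume the iterator, enforcing the ordering.
        return all(job in it for job in pattern)
    return sum(contains(s) for s in histories) / len(histories)

# Invented maintenance histories for two vehicle groups.
sedans = [["oil", "brakes", "tires"], ["brakes", "tires"], ["oil"]]
trucks = [["oil", "tires"], ["tires", "brakes"], ["oil", "brakes"]]

pattern = ("brakes", "tires")
# A large support gap flags the pattern as characteristic of one group.
diff = support(sedans, pattern) - support(trucks, pattern)
```

In the full analysis such support differences are tested for statistical significance across make/model groups; here the gap simply shows "brakes then tires" is common for the sedans and absent for the trucks.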