PERFORMANCE MEASURES FOR TRAFFIC SIGNAL PEDESTRIAN BUTTON AND DETECTOR MAINTENANCE


by

Corresponding Author
Jay Grossman
Elkhart County Highway Department
610 Steury Ave., Goshen, IN 46528
Phone 574-534-9394, Fax 574-533-7103
jgrossman@elkcohwy.org

Charles McKenzie
Trine University
54821 Kristi Lane, Osceola, IN 46561
Phone 574-210-5164
cpmckenzie@my.trine.edu

Darcy M. Bullock
Purdue University
West Lafayette, IN 47097-1284
Phone 765-494-2226
darcy@purdue.edu

August 1, 2014
Word Count: 4,120 words + 10 * 250 words/table = 6,620 words

ABSTRACT
Over the past ten years, high-resolution data collected by traffic signal controllers has been used to develop performance measures for engineering purposes. The same data can also be used to develop maintenance-related performance measures that help signal system operators find and correct faulty or misconfigured equipment in a timely manner, returning the system to optimal operation and efficiency. This research looked specifically at pedestrian buttons and vehicle detectors. Pedestrian button performance measures were developed to identify abnormal button output. Vehicle detector faults were identified using an algorithm that compares the current operation of a detector, based on the number of calls placed, to a historic baseline for the same detector. The length of the window of historic data needed to create a useful baseline was evaluated, as well as the standard deviation threshold used to indicate errors. To reduce the number of false errors reported, an error was identified only after three consecutive values above the threshold. The methodologies described were shown to be effective in detecting both complete detector failures and intermittent failures.

INTRODUCTION
Performance measures relying on high-resolution data are increasingly used within the industry to evaluate and optimize the operation of traffic signals [1]. These performance measures have generally been used for engineering functions: analysis of split failures, arrivals on green, amount of green time, and coordination. The log files downloaded from traffic signal controllers that enable these performance measures also allow for uses more closely aligned with maintenance functions. The log files contain information about all detector calls placed, phases serviced, pedestrian activations, etc. From this data, useful details about the operation of a traffic signal can be derived. Detection failures, abnormalities, and other errors can be identified, alerting the system operator and allowing the problems to be corrected in a timely manner. Examples presented include: verification that pedestrian call buttons are operational, detection of abnormal call button activity, and detection of abnormal vehicle detection. Parameters for developing the historical baseline for comparison are developed, as well as recommendations for identifying errors against this baseline.

MOTIVATION
Finding intermittent errors and abnormal behavior in signal system detectors is a difficult task. Currently, agency staff typically look for these types of errors only in response to obvious operational problems and/or complaints from the public. Scalable longitudinal techniques that examine the behavior of each detector (pedestrian or vehicle) in a system over a number of days are needed to proactively manage modern traffic signal infrastructure. The goal of this research is to develop a method that can be automated and run on a recurring basis by a central signal system. The central system can then identify potential detection errors and produce a manageable list of potential problems for agency staff to investigate.

Detection errors in signal systems can create large losses in operational efficiency. For instance, if a pedestrian call button malfunctions and places a constant call for a walk phase, the signal may be continually forced out of coordination, and user delay will increase dramatically. Malfunctioning vehicle detectors may have similar effects on system efficiency or, if a phase's vehicles are not detected at all, create unsafe conditions as drivers decide to proceed illegally on red after waiting through a number of cycles.

Another motivation for this research is to better identify detector errors. While some tools for identifying detection failures already exist [2, 3, 4], notably for inductive loop systems, a tool that can use any type of system detector to identify abnormal behavior is becoming increasingly important as detection at signals grows more diverse and moves beyond inductive loops. Video systems, thermal sensors, and radar detectors do not lend themselves as easily to the classic detection failure tools that look at changes in the electrical properties of the circuit. Misalignment or atmospheric effects can cause detection problems in these newer systems that are difficult to detect: the detector may still be working and placing calls, but not at a normal rate.

LITERATURE REVIEW
Audelo et al. [3] proposed using event log (high-resolution) data to determine detector errors. Their approach focused on calculating the cumulative duration that detectors are on or off and comparing these values to 90th, 95th, and 99th percentile values. These thresholds were then used to determine whether a detector was in error mode. Stop bar and advance detector thresholds were calculated separately, and recommendations for detector fault thresholds were determined.

TEST SITES AND INFRASTRUCTURE
This study was conducted using data collected largely from the County Road 17 corridor in Elkhart County, Indiana. This corridor was connected via fiber optics in 2011, with controller, cabinet, and server upgrades allowing high-resolution data from each controller to be uploaded and stored in a central database. Traffic volumes along County Road 17, a ten-mile, four-lane expressway, range from 7,000 to over 30,000 vehicles per day. The County Road 17 corridor is shown in Figure 1.

Figure 1 Signal locations in Elkhart County, Indiana, and the County Road 17 corridor. (Source: Google Maps)

Intersections in this corridor have a variety of detection technologies deployed, including inductive loop, video, thermal, and radar. This variety provides a useful test bed for detection-related research.

Data from the City of Mishawaka, Indiana, was also used for validation of the error detection algorithms. Mishawaka gathers and records high-resolution controller data from a large portion of the city's signals following a system upgrade in 2013. The use of Mishawaka's data provided a check on the transferability of the methodologies developed on the Elkhart County system.

PEDESTRIAN CALLS
Operational Verification
An ongoing issue for system operators is verifying that signal equipment was installed and configured properly. High-resolution data can serve as a type of as-built verification that detectors and other equipment were properly installed during construction and are operational.

Pedestrian call buttons at intersections in suburban locations may be activated infrequently. Elkhart County has two intersections with pedestrian call buttons, and at each location there is little activity on two of the walk phases, as there is no development in those quadrants. If call buttons at these locations were inoperable, it is unlikely the agency would be aware of it until a citizen complaint was received. As a test of the system, and as a demonstration of the ability of high-resolution data to verify operations, each of the pedestrian call buttons at the intersection of County Road 10 and County Road 17 was activated by a staff member in the same hour of one day. The data log file on the server was then analyzed, and graphs were created for that day. Figure 2 shows the number of pedestrian calls recorded on the test day, per hour and by phase assignment. As can be seen, all of the pedestrian buttons can be verified as operational. Figures 2b and 2c show that, except for the staff member conducting the test, phases 4 and 6 are seldom used.
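As an illustration of the underlying query, the following minimal sketch bins rows of a hypothetical high-resolution log, (timestamp, event code, parameter), into pedestrian-call counts per phase and hour. The event code value (90) and the phase-in-parameter convention are assumptions for illustration only; the actual enumeration depends on the controller's high-resolution data specification.

    from collections import Counter
    from datetime import datetime

    # Hypothetical log rows: (timestamp, event_code, parameter).
    # PED_CALL_CODE = 90 is an assumed code; the parameter is assumed
    # to carry the phase number for pedestrian call events.
    PED_CALL_CODE = 90

    def ped_calls_by_hour_and_phase(events):
        """Count pedestrian call events per (phase, hour-of-day) bin."""
        counts = Counter()
        for ts, code, phase in events:
            if code == PED_CALL_CODE:
                counts[(phase, ts.hour)] += 1
        return counts

    events = [
        (datetime(2014, 7, 15, 14, 5), 90, 2),   # staff test press, phase 2
        (datetime(2014, 7, 15, 14, 7), 90, 4),   # staff test press, phase 4
        (datetime(2014, 7, 15, 14, 9), 90, 6),   # staff test press, phase 6
        (datetime(2014, 7, 15, 14, 11), 90, 8),  # staff test press, phase 8
        (datetime(2014, 7, 15, 17, 30), 90, 2),  # organic call, phase 2
    ]
    counts = ped_calls_by_hour_and_phase(events)
    print(counts[(4, 14)])  # 1: the phase 4 button registered in the 14:00 hour

A nonzero count in the test hour for every phase confirms that each button is wired, mapped to the intended phase, and logging correctly.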

Figure 2 Pedestrian calls by hour and phase at the intersection of CR10 and CR17, demonstrating that all call buttons are operational after staff activated each button in the 14:00 hour. (Panels: a) Phase 2, b) Phase 4, c) Phase 6, d) Phase 8; pedestrian calls per hour of day.)

Call Button Error Detection
In suburban situations, traffic signals are often not programmed with the phase time required to accommodate pedestrian walk times. Because pedestrian calls are infrequent, it is more efficient to optimize the phase splits for vehicular traffic, allow the signal to exit the coordinated pattern when a pedestrian phase is called, and then transition back into coordination after the call has been served. If a pedestrian call button malfunctions and errantly places numerous (or constant) calls for the pedestrian phase, major losses of efficiency and coordination can occur. A check on pedestrian calls at an intersection is therefore a useful maintenance-related performance measure.

As a study of pedestrian call activity, the Elkhart County intersection of County Road 10 and County Road 15 was analyzed. This intersection can be seen in Figure 3.

Figure 3 County Road 10 at County Road 15, Elkhart County, Indiana. (Source: Google Maps)

Figure 4 shows the number of pedestrian calls at this intersection, per hour, for a three-day period in July. The data from July 15, when compared to the day before and the day after, is abnormally high for this period. As seen in Figure 3, this location is not near any businesses or events that might generate late-night pedestrian activity, and no underlying pedestrian demand was found to explain this behavior. Analysis of the problem revealed that an electrical storm in the area on July 15 had put the call buttons into an erratic pattern, probably due to a voltage surge. This is an equipment error that should be identified so the agency can correct the problem.

Figure 4 Pedestrian calls, per hour, at the CR10/CR15 intersection for three consecutive days.

Figure 5 shows pedestrian calls per hour at the intersection of Main and Mishawaka Streets in the City of Mishawaka for July 4th and 5th. The average number of calls per hour at this intersection over the previous two weeks, for the same days of the week, is also plotted. As can be seen, there is an abnormal peak on the evening of July 4th. Figure 6 shows that this location is urban and adjacent to a riverside park. Unlike the abnormal activity at County Road 10 and County Road 15, the peak at this location is explainable: a fireworks show took place in the adjacent park that night. While a system should be able to detect abnormalities and alert operators, it still takes judgment to determine whether these abnormalities are equipment errors.

Figure 5 Pedestrian calls, by hour, at Main/Mishawaka on July 4 and 5 with the two-week historic baseline, by day of week, plotted.
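A minimal sketch of how such a baseline can be constructed, assuming each day's activity is stored as a 24-element vector of hourly counts keyed by date; the plotted baseline is simply the element-wise mean over the prior same-weekday days (the counts below are illustrative, not the study data):

    import numpy as np
    from datetime import date, timedelta

    def same_weekday_baseline(hourly_counts_by_date, day, weeks=2):
        """Average hourly pedestrian calls over the prior `weeks`
        occurrences of the same day of the week."""
        prior = [hourly_counts_by_date[day - timedelta(weeks=k)]
                 for k in range(1, weeks + 1)]
        return np.mean(prior, axis=0)

    # July 4, 2014 (a Friday) compared against the two prior Fridays.
    data = {d: np.zeros(24) for d in
            (date(2014, 6, 20), date(2014, 6, 27), date(2014, 7, 4))}
    data[date(2014, 6, 20)][18] = 3   # a few evening calls
    data[date(2014, 6, 27)][18] = 5
    data[date(2014, 7, 4)][21] = 40   # fireworks crowd in the 21:00 hour

    baseline = same_weekday_baseline(data, date(2014, 7, 4))
    print(baseline[18])                               # 4.0 expected calls at 18:00
    print(data[date(2014, 7, 4)][21] - baseline[21])  # 40.0 calls above baseline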

Figure 6 Location of the Main and Mishawaka intersection, City of Mishawaka. (Source: Google Maps)

VEHICLE DETECTOR ANALYSIS
Methodology
Developing a methodology for identifying potential errors in vehicle detectors is also important. A central system can then analyze the data for detectors system-wide and alert operators to any potential problems that need to be reviewed. The major components of this task are: creation of an appropriate historical baseline from which comparisons can be made, determination of a threshold that defines a probable error, and a means of reducing false error identifications.

High-resolution log data communicated to a central database and stored there provides a good platform from which to build a baseline of historic data for a detector, offering a range of timeframes and data types that can be incorporated into a baseline. In the initial creation of a baseline for a detector, it was determined that the best baseline for comparison would use only historical data from the same detector. Given the large variation in detector types, functions, and traffic, this may seem obvious. A second determination was to compare detector activity for a given day of the week only to data from the same day of the week.

There are large differences in traffic volumes, peak times, etc., by day of week, so detector data for a Monday is compared only to other Monday data. The final point in creating a baseline is determining how many previous weeks of data are needed. A long analysis period requires more computational resources, while a short period may not build an adequate representation of normal operation. Determination of this period is explored in the examples that follow.

This study used binned hourly detector calls to form the baseline. As shown in the following examples, hourly bins provided enough resolution to detect even intermittent errors. Smaller bins may provide more resolution, but at a cost of computational time. These bins were averaged by hour of the day and day of the week, and standard deviations were then calculated for each hourly bin over the study period. A detector's current call volume, per hour, was then compared to this baseline, and a range of standard deviation thresholds was used to generate potential error flags. Trial and error determined that a standard deviation threshold between 1.5 and 2.0 produced meaningful results. This is explored in more depth in the following examples.

The final step in this study's methodology was to reduce the number of isolated potential errors identified, reporting just those most likely to be indicative of true errors. As with any real-world system, there is some randomness and inherent variability in the data from a detector. Instead of reporting every individual instance of an hourly detector count falling outside the standard deviation threshold, the system reported an event as an error only if it was the third hourly count in a row outside the threshold. This, too, is shown in detail in the following examples, and a sketch of the complete flagging logic is given below.

County Road 17 and Beck
Figure 7 looks at a single day's data for a detector at the intersection of County Road 17 and Beck Drive. The historical baseline for this detector is shown as a dashed line, with the current day's counts shown as a solid line. Figure 7a plots the historical average for just one prior day, the same weekday one week before. It then identifies any event above a standard deviation threshold of 1.5 as a potential error. This is an example of a short historical comparison window, with no cleaning of isolated potential error identifications.
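As a concrete illustration of the methodology just described, the sketch below (assumed data layout; the threshold and run length are left user-definable, consistent with the discussion above) compares each hour's count to a same-weekday baseline and reports an error only at the third consecutive hour outside the threshold:

    import numpy as np

    def flag_detector_errors(current, prior_weeks, k_sigma=1.5, run_length=3):
        """Flag hourly detector counts that deviate from a historical baseline.

        current     : 24-element array of the current day's hourly call counts
        prior_weeks : list of 24-element arrays, same weekday, prior weeks
        k_sigma     : standard deviation threshold (1.5-2.0 worked well here)
        run_length  : consecutive out-of-threshold hours required to report
        """
        hist = np.array(prior_weeks)
        mean = hist.mean(axis=0)
        std = hist.std(axis=0)
        outside = np.abs(current - mean) > k_sigma * std
        flags, run = [], 0
        for hour in range(24):
            run = run + 1 if outside[hour] else 0
            if run >= run_length:
                flags.append(hour)  # third (and later) consecutive bad hour
        return flags

    # Illustrative example: a detector goes silent from 10:00 through 15:00.
    rng = np.random.default_rng(42)
    baseline_weeks = [np.full(24, 60.0) + rng.normal(0, 3, 24) for _ in range(3)]
    today = np.full(24, 60.0)
    today[10:16] = 0
    print(flag_detector_errors(today, baseline_weeks))  # e.g. [12, 13, 14, 15]

Note that the first two silent hours are suppressed by the run-length rule; only from the third consecutive out-of-threshold hour onward is the event reported as a probable error.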

Figure 7 Comparison of error analysis for detector 18 at CR17/Beck for March 12, 2014, highlighting error periods based on a single hour above a standard deviation threshold of 1.5 (panels a, c, e, g) and three error hours in a row above the threshold (panels b, d, f, h), with one through four weeks of historical data used to create the baseline for comparison. (Panels: a) indicated errors, one week of prior data; b) indication of third consecutive error, one week of prior data; c) indicated errors, two weeks of prior data; d) indication of third consecutive error, two weeks of prior data; e) indicated errors, three weeks of prior data; f) indication of third consecutive error, three weeks of prior data; g) indicated errors, four weeks of prior data; h) indication of third consecutive error, four weeks of prior data.)

Figure 7g shows the same detector analysis, but using data from the four prior weeks to create the baseline for comparison. As can be seen, the number of events flagged does not change much from the case of Figure 7a, which used only one week of prior data. Figures 7c and 7e repeat the analysis with two and three weeks of prior data, respectively. None of these panels attempts to remove isolated events. Figures 7b, 7d, 7f, and 7h show the same detector analysis, but with events flagged only if they are the third consecutive event above the threshold of 1.5 times the standard deviation. As can be seen, applying this requirement reduces the number of events identified and helps single out events more likely to be true errors rather than random variations.

The number of prior weeks of data needed to create a good baseline for comparison can also be seen in this data. One week of prior data, as in Figures 7a and 7b, produces a number of isolated events identified as potential errors. With two weeks of prior data, as in Figures 7c and 7d, the error analysis is already noticeably improved, with fewer random or isolated identified events. Adding three and four weeks of prior data makes some minor improvement over two weeks, but not a considerable one. There is little change between the graphs using three and four weeks of data, suggesting that using four weeks of data is probably not required for accurate error identification.

Figure 7 clearly indicates there may be an error on this detector on the analyzed day, with detections dropping to zero in the middle part of the day. This is a video detector aimed to the north, and there was a wet, sticky snow event on this day, blowing strongly from the north and coating the north faces of traffic equipment with a heavy layer of snow. The effect of this event on a signal head can be seen in Figure 8. The on-board heater in the video detector eventually cleared the snow from the lens, and the detector resumed normal operation. The central system's built-in detector error reporting feature did not identify any problem with this detector during this period.

Figure 8 Coating of snow on traffic signal lenses, March 12, 2014. (Source: The Elkhart Truth)

County Road 17 at County Road 14, Early January
Figure 9 shows a similar analysis of detector counts at the intersection of County Road 14 and County Road 17. In the interest of brevity, only two evaluation cases are shown. Figure 9a identifies all individual errors (no smoothing) with a baseline using just one week of prior data. Figure 9b indicates only the third potential error in a series, with four weeks of prior data. Figure 9b can be seen to provide a more accurate identification of errors than Figure 9a.

The results for January 8th in both Figures 9a and 9b are interesting. The effect of the New Year's Day holiday one week before can be seen in the plotted averages of prior data for both the one-week and four-week baselines, creating larger than normal standard deviations in the data and causing both analyses to miss what should be identified as potential errors (zero detections). The four-week baseline used in Figure 9b allows the potential zero-count errors on January 6th and 7th to be identified, which the one-week baseline of Figure 9a did not. In this instance, the apparent errors in the detector calls between January 6th and January 9th are the result of a county-wide snow emergency and travel ban; evidently the ban worked better on January 6th than on the subsequent days. This is an example of an operator being able to determine whether potential errors identified by the system are accurate.
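The masking effect is easy to see numerically. With illustrative counts (not the study data), a single holiday near-zero value in the baseline both drags the mean down and inflates the standard deviation, so a later zero count no longer falls outside a 2.0 standard deviation threshold:

    import numpy as np

    normal_weeks = np.array([520.0, 540.0, 530.0, 525.0])  # typical 08:00 counts
    with_holiday = np.array([520.0, 540.0, 530.0, 15.0])   # holiday in the window

    for hist in (normal_weeks, with_holiday):
        mean, std = hist.mean(), hist.std()
        z = abs(0.0 - mean) / std  # sigmas between a zero count and the mean
        print(f"mean={mean:.0f}  std={std:.0f}  zero-count deviation={z:.1f} sigma")

    # Normal baseline: std is small, so a zero count sits ~71 sigma out and is flagged.
    # Holiday baseline: std balloons, so the same zero count sits ~1.8 sigma out and,
    # against a 2.0 sigma threshold, is missed.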

Figure 9 Error analysis of detector 12 at the CR14/CR17 intersection, showing indication of errors based on a standard deviation threshold of 2.0 with one week and four weeks of prior data. (Panels: a) indicated errors, one week of prior baseline data; b) indication of third consecutive error, four weeks of prior baseline data.)

County Road 17 at County Road 14, Late January
Figure 10 shows another example of a potential intermittent detector error identified through the use of high-resolution data. Figures 10a and 10b show the week of January 19-25 for detector 12 at County Road 14 and County Road 17. Starting about January 23rd, the detector reports zero calls in the overnight hours but operates at near-normal levels during the midday. Figures 10c and 10d show the following week of January 26-February 1. The odd nighttime detection problem persists, more or less, until January 31. The error threshold used in this example was 2.0 times the standard deviation. Figures 10a and 10c identify all individual errors; Figures 10b and 10d identify the third potential error in a series. Again, Figures 10b and 10d, with smoothing, appear to identify the appropriate errors with fewer random identifications.

Noticing a detection error at this location, a maintenance technician was dispatched to investigate. This detector is a thermal sensor connected to an in-cabinet video processing unit. While checking the cables between the video feed and the processing units, the technician found a short in this unit's cable. During the very cold winter, the cable apparently contracted at night enough to short out the video feed; when temperatures increased during the day, the video feed resumed and the detector operated normally. This type of intermittent error is very difficult to detect with traditional means, and the central system's existing detector fault reporting mechanism did not identify this detector as operating improperly.

An interesting result in this case is that the baselines used for comparison in the week of January 26-February 1 included the data from the prior week, when the detector had begun to fail. The earlier week's data was therefore actually better at identifying the problem, as its baseline was cleaner. If a detection problem persists for multiple weeks, it is conceivable that the methodology employed in this study will eventually consider that behavior normal and identify no errors. Holiday periods or other abnormal traffic flow periods may also cause false error detections, suggesting that a one- or two-week period of prior data for comparison may not be as good as three weeks or more.

Figure 10 Detector 12 at the CR14/CR17 intersection, showing intermittent errors at off-peak times; the problem was corrected on January 31. (Panels: a) errors above st. dev. 2.0; b) three consecutive errors above st. dev. 2.0; c) errors above st. dev. 2.0; d) three consecutive errors above st. dev. 2.0; baselines use three and four weeks of prior data.)

CONCLUSIONS
High-resolution data collected by a signal system can be used to effectively verify the installation and configuration of traffic signal components, ensuring that detectors are mapped to the right channels and phases and place calls when activated. This data can also indicate malfunctioning equipment, whether it is placing too many calls or too few. By comparing the operation of a detector or pedestrian push button to a historical baseline, and then defining an error threshold based on standard deviation, more difficult transient errors can also be identified.

Results from the analysis of Elkhart County data suggest that a useful error threshold lies between 1.5 and 2.0 standard deviations. However, it seems useful for this value to remain user-definable, given the variations between intersections and detector traffic patterns. Analysis also showed that using four weeks of data to develop a comparison baseline yielded the most accurate error determinations, though with little improvement in accuracy over three weeks of data. Using only one week of data was not very accurate and resulted in a large number of identified errors. Reporting errors only after three consecutive periods outside the error threshold reduced the number of extraneous potential errors identified, while still flagging real detection problems.

This approach to identifying detector errors appears well suited to newer detection technologies. It was able to identify intermittent errors, as well as errors where the equipment was technically still functional but obscured or misaligned. Incorporating these algorithms into a commercially produced central system should allow for more efficient use of an agency's staff time and maintain a higher level of system maintenance and efficiency. The central system can identify suspect devices, allowing agency staff to investigate the root cause of any abnormalities and catch detector failures sooner.

ACKNOWLEDGMENTS
This work was supported by the Indiana Local Technical Assistance Program, Elkhart County Highway Department, City of Mishawaka, Econolite Control Products, and Purdue University. The contents of this paper reflect the views of the authors, who are responsible for the facts and the accuracy of the data presented herein, and do not necessarily reflect the official views or policies of the sponsoring organizations. These contents do not constitute a standard, specification, or regulation.

REFERENCES
1. Smaglik, E.J., A. Sharma, D.M. Bullock, J.R. Sturdevant, and G. Duncan. Event-Based Data Collection for Generating Actuated Controller Performance Measures. Transportation Research Record, No. 2035, TRB, National Research Council, Washington, DC, 2007, pp. 97-106.
2. Chen, L., and A.D. May. Traffic Detector Errors and Diagnostics. Transportation Research Record: Journal of the Transportation Research Board, No. 1132, Transportation Research Board of the National Academies, Washington, DC, 1987, pp. 82-93.
3. Audelo, M., C.-S. Chou, T. Chen, and A.P. Nichols. Empirical Analysis of Controller Event Data to Select Vehicle Detector Fault Triggers. Transportation Research Board 2014 Annual Meeting, Paper No. 14-5182, 2014.
4. Peeta, S., and I. Anastassapoulos. Automatic Real-Time Detection and Correction of Erroneous Detector Data Using Fourier Transforms for On-Line Traffic Control Architectures. Transportation Research Record: Journal of the Transportation Research Board, No. 1811, 2002, pp. 1-11.