How To Ensure Your AVL/MDC Data Is As Accurate As Possible

Maintenance decision support systems (MDSS) offer transportation agencies a wide range and high volume of vital atmospheric and roadway data, from which actionable insights can be extracted to benefit commercial carriers and the traveling public. While this information is central to most MDSS users’ day-to-day activities, the tools are capable of providing much more data than meets the eye. One example is the integration of automatic vehicle locator/mobile data computer (AVL/MDC) technology.

What is MDC/AVL?

MDC and AVL are two different data sets that can be interpreted in many different ways. For the purposes of this article, AVL refers to the collection of basic vehicle location data, while MDC covers the data collected about weather conditions, road conditions and the actions performed by the winter maintenance vehicle, including application rate, material type and plow position.

City and state transportation departments across the United States that use web-based MDSS platforms put this data to work in a variety of ways. One major use case is providing a rationale for how dollars are spent maintaining roads by tracking material usage during winter storm events. Others include justifying actions and documenting when certain roads have been plowed, especially when the public may claim otherwise. Whatever the goal, making sure the data are valid and reliable is essential for all users, which is why we have compiled the following list of best practices so you can be confident that the data you use can be trusted.

MDC/AVL Best Practices

DOs

  • Verify your data
  • Work with your AVL vendor to validate data coming from the truck
  • Calibrate your spreaders
  • Verify the calibration of your spreaders by capturing and weighing the material dispensed
  • Verify that the data in your reports are correct
  • Make it easy for operators to enter data
  • Require operators to enter data at desired intervals
  • Train people to troubleshoot and maintain the system
  • Maintain all of the equipment so that it continues to work as intended
  • Make sure data are later reported if gathered while the truck is out of coverage/communications
  • Have a well-defined, consistent reporting glossary where every term means exactly one thing (e.g., don’t use “MIX” to mean a 50-50 sand-salt blend in one shop and an 80-20 blend in another)

DON’Ts

  • Assume your data are correct
  • Expect that getting accurate data from MDCs will be easy

Understanding exactly how your units report actions and what the implications are

Depending on your equipment and AVL provider, there may be cumulative application counters that let you see that the counter read X before a shift and Y after, so that Y − X was put down in total during the shift. But for an MDSS such as WebMDSS or ClearPath Weather to perform at its best, what really matters is the complete picture of how that amount was distributed – i.e., how much and where.
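The cumulative-counter arithmetic itself is trivial, which is exactly why it falls short. A minimal sketch with hypothetical counter readings:

```python
# Hypothetical cumulative spreader counter readings (lbs), read off
# the in-cab display before and after a shift.
x = 12_500.0   # counter at shift start
y = 18_900.0   # counter at shift end

total_for_shift = y - x
print(total_for_shift)  # 6400.0 lbs applied -- but nothing about where or when
```

The difference tells you the total, but nothing about how that material was distributed along the route, which is the picture MDSS needs.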

Iteris’ MDSS does not have smooth, continuous accounting. It has to draw the picture by “connecting the dots” of the MDC/AVL reports it receives. If the truck sends observations on a schedule of once every two minutes, MDSS draws a line between each pair of observation “dots”: it can only assume that whatever was reported at one time still applies for the next two minutes, until a new observation arrives.
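This “connect the dots” accounting amounts to holding each reported value until the next report arrives. A minimal sketch of that assumption (hypothetical report format and rates, not Iteris’ actual schema):

```python
def total_applied(reports):
    """Total material, assuming each reported rate (lbs/min) holds
    until the next report -- the 'connect the dots' assumption."""
    total = 0.0
    for (t0, r0), (t1, _) in zip(reports, reports[1:]):
        total += r0 * (t1 - t0)  # hold rate r0 across the whole gap
    return total

# Scheduled reports every 2 minutes: (minute, lbs/min)
reports = [(0, 0), (2, 100), (4, 100), (6, 0)]
print(total_applied(reports))  # 400.0 -- assumes applying from minute 2 to 6
```

Whatever actually happened between minutes 0–2 or 4–6 is invisible; the estimate is only as good as the reports.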

It might be that a truck reported a zero application rate in one report and again in the next, but in between there were multiple manual blast actions, for example on a bridge or within a curve. If no reports indicated those actions, MDSS cannot know they happened. Similarly, with only scheduled reporting, MDSS would know nothing of a period with the spreader off that falls between two reports of it being on.

If one report showed the truck applying and the next did not, MDSS assumes it was applying for the whole period in between, when in reality the application may have stopped right after the first report. Conversely, MDSS counts nothing between a report of not applying and one of applying, when in reality application may have started right after the first report.
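Both skews fall out directly from the assumption that the last reported state holds until the next report. A small numeric illustration (hypothetical rates and timings):

```python
interval = 2.0   # minutes between scheduled reports
rate = 100.0     # lbs/min while the spreader is on

# Case 1: one report shows "applying", the next shows "off".
# MDSS assumes applying for the whole interval...
mdss_estimate = rate * interval       # 200 lbs
# ...but the operator actually shut off right after the first report.
actual = rate * 0.1                   # ~10 lbs: MDSS overestimates

# Case 2: one report shows "off", the next shows "applying".
# MDSS counts nothing for the interval...
mdss_estimate_2 = 0.0
# ...but application actually started right after the first report.
actual_2 = rate * (interval - 0.1)    # ~190 lbs: MDSS undercounts
```

Each individual interval can be off by nearly the full amount of material spread in it; only the direction of the error differs.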

Over a larger scale (longer time spans, more trucks), the inaccuracies of this “connecting the dots” might largely balance out: the situations where MDSS overestimates and underestimates could be roughly equal, like flips of a coin. But in the shorter term (say, one shift for one truck) they can strongly skew one way or the other, just as a few coin flips can show a noticeable imbalance of heads versus tails. The errors may not even be symmetrical: most of the mis-estimates may come from missed blast actions, causing MDSS amount totals to run low.

This issue is largely resolved if application reports are event-based, meaning a report is sent not only on a schedule but any time a change is made, such as adjusting the application rate or triggering a blast. In that case, MDSS gets a much better picture because the “dots” it connects are the actual changes in activity, so connecting them yields a more accurate accounting.
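To sketch why event-based reporting reconstructs totals so much better, compare the same shift sampled both ways, assuming the last reported rate is held until the next report (hypothetical rates and timings):

```python
def total_applied(reports):
    """Sum rate * duration, holding each reported rate (lbs/min)
    until the next report arrives."""
    return sum(r0 * (t1 - t0)
               for (t0, r0), (t1, _) in zip(reports, reports[1:]))

# Ground truth: a blast on a bridge from minute 0.5 to 1.5 at 100 lbs/min,
# spreader off otherwise -> 100 lbs actually applied.

# Scheduled reports every 2 minutes happen to straddle the blast:
scheduled = [(0, 0), (2, 0), (4, 0)]
# Event-based reports fire at every change of state:
event_based = [(0, 0), (0.5, 100), (1.5, 0), (4, 0)]

print(total_applied(scheduled))    # 0.0   -- the blast vanishes entirely
print(total_applied(event_based))  # 100.0 -- matches reality
```

With event-based reports, the “dots” coincide with the actual state changes, so holding each value until the next report is no longer an approximation.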

Lastly, as part of determining the amount of material, MDSS has to estimate the distance between report points. It takes the point-to-point distance between two reports and multiplies it by the rate value from the first report to get a total amount applied. For a straight road segment, that works well. But consider an extreme case: suppose the truck made a loop between two reports, so that the two points coincide. MDSS sees a distance of zero (and thus an amount of zero), because all it knows is the two overlapping points, even though the truck may have driven several miles applying the whole time.
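The distance-times-rate step, and the loop pitfall, can be sketched with a straight-line (great-circle) distance between consecutive GPS fixes. The coordinates and rate below are hypothetical, and MDSS’ actual distance calculation may differ:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in miles."""
    r = 3958.8  # mean Earth radius, miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

rate = 300.0  # lbs per lane-mile, taken from the FIRST report
a = (41.590, -93.620)
b = (41.600, -93.620)   # ~0.69 miles due north of a

# Normal case: amount = point-to-point distance * rate
print(haversine_miles(*a, *b) * rate)  # ~207 lbs

# Loop pitfall: the truck drives a loop and reports from the same spot.
c = (41.590, -93.620)
print(haversine_miles(*a, *c) * rate)  # 0.0 -- miles of application vanish
```

More frequent position reports shrink this error, since each short segment is closer to the straight line the calculation assumes.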

The Bottom Line

Whether you have been using MDC/AVL data to make better maintenance decisions for years or are just browsing vendors and exploring options, keep these items in mind so you can fully understand your data and its limitations.