Using Artificial Intelligence to Reduce Distracted Driving and Enhance Fleet Safety

Stefanie Valentic, Editorial Director

October 28, 2020

Distraction behind the wheel, even for a split second, can mean serious injury or death.

The Federal Motor Carrier Safety Administration (FMCSA) indicates that distraction or inattention is the second most common driver-related cause of fatalities for commercial truck operators.

Data analyzed in 2019 from San Diego-based Lytx’s client base showed that drivers who multitask, such as eating, drinking, smoking or using a phone, increase their risk of an accident by 100%.

The company’s vice president of safety services, Del Lisk, who has been in the safety industry for 30 years, has witnessed technology become more sophisticated at preventing distracted driving incidents.

Lisk spoke with Waste360 about machine learning, artificial intelligence and how data gathered through video telematics can aid companies with risk reduction.

Waste360: How does video telematics work, and how can it reduce distracted driving incidents?

Lisk: [Lytx] has been collecting video of driving for many, many years. In prior years, it was analyzed by a team of about 500 people who were literally reviewing exception-based video, analyzing potential unsafe driving risk and things like that. So, we've been capturing all this video footage over literally billions of miles, analyzing it, and essentially capturing data that we feed back into our system to make it smarter and smarter.

Over the last five years, we've been using that data to really train our technology to understand what it's seeing. The machine vision is the video; it's seeing things. Then the artificial intelligence, through the data we feed back to it from our analysis, is interpreting what it's seeing and understanding what it means from a safety perspective. Now that we've integrated machine vision and artificial intelligence into video telematics, we have a much more precise tool for identifying distracted driving than we had before.

In prior years, we would capture video on an exception basis, as a lot of our competitors still primarily do: if a vehicle slammed on the brakes, made a sudden swerve, had a crash, those sorts of things. Sometimes it would reveal that the driver was on a phone or distracted, but it was capturing that only coincidentally; the trigger wasn't the distraction itself.

So, over the most recent years, we've developed machine vision and artificial intelligence and trained it to a 95-percent-plus accuracy rate, so that it can identify if a driver is, for example, picking up a phone and putting it to their ear. It will recognize that. Or if a driver is, and we've all had this happen, continually looking over and reaching for something on the passenger seat, the net of it being that their eyes are off the road multiple times, several seconds at a time. That machine vision and artificial intelligence can now identify that correctly, and then can alert the driver, notify supervisors, capture video, all sorts of things. But to the question of what makes it so effective in reducing distracted driving incidents: it is the addition of machine vision and artificial intelligence, and their ability to identify with high accuracy when drivers are distracted, that really wasn't available five years ago.
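
In code terms, that shift from exception triggers to continuous interpretation might look something like the following minimal sketch. Everything here is an assumption for illustration: the frame classifier, the thresholds and the event names are hypothetical, not Lytx's actual system.

```python
# A minimal sketch of continuous, vision-based distraction monitoring.
# classify_frame, FrameResult and the thresholds are illustrative assumptions.
from dataclasses import dataclass

FPS = 10                     # assumed analysis frame rate
EYES_OFF_ROAD_SECONDS = 2.0  # assumed alert threshold

@dataclass
class FrameResult:
    phone_in_hand: bool
    eyes_on_road: bool

def classify_frame(frame) -> FrameResult:
    """Placeholder for a trained machine-vision model."""
    raise NotImplementedError

def monitor(frames):
    """Yield an alert whenever a phone appears or eyes leave the road too long."""
    off_road_frames = 0
    for frame in frames:
        result = classify_frame(frame)
        if result.phone_in_hand:
            yield "ALERT: handheld phone detected"
        off_road_frames = 0 if result.eyes_on_road else off_road_frames + 1
        if off_road_frames / FPS >= EYES_OFF_ROAD_SECONDS:
            yield f"ALERT: eyes off road for {off_road_frames / FPS:.1f} s"
```

The key contrast with the older approach is that the trigger here is the distraction itself, not a hard brake or swerve that happens to follow it.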

Waste360: Have there been any challenges to the development and implementation of this technology?

Lisk: There are always challenges any time you're trying something new, including some at the beginning with making sure you have it correct. We certainly worked with some of our best clients through a beta process to make sure the technology was accurately identifying these behaviors. By doing that, we were able to fine-tune the algorithms so that, with 95-percent-plus confidence, they accurately identify those behaviors. In the very early days, during testing, there was the potential risk of too many false signals. Through the beta process, we really eliminated that before rolling it out.
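
To make that kind of tuning concrete, here is a minimal sketch of choosing a detector confidence threshold against reviewer-labeled beta footage so that flagged events stay above a 95% precision target. The data and names are fabricated for illustration; this is not Lytx's pipeline.

```python
# Pick the lowest score threshold that keeps precision (share of true
# alerts among all alerts fired) at or above 95% on labeled beta clips.
import numpy as np
from sklearn.metrics import precision_recall_curve

# y_true: 1 = reviewer confirmed the behavior, 0 = false signal (fabricated)
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1])
scores = np.array([0.92, 0.40, 0.88, 0.75, 0.55, 0.97, 0.30, 0.81, 0.92, 0.68])

precision, recall, thresholds = precision_recall_curve(y_true, scores)
# precision has one more entry than thresholds; align on thresholds
ok = precision[:-1] >= 0.95
if ok.any():
    threshold = thresholds[ok][0]  # lowest threshold meeting the target
    print(f"fire alerts at score >= {threshold:.2f}")
else:
    print("no threshold reaches 95% precision; keep training")
```

Raising the threshold trades away some recall to suppress false signals, which is exactly the trade-off a beta period with real client video lets a vendor calibrate.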

I think maybe the other kind of “challenge,” and it doesn't appear to be as big a challenge as some might think, is driver acceptance. Here's another technology that is monitoring their driving, and at first glance, drivers don't necessarily understand it. So, there's potentially some pushback simply because they don't understand how the technology works. That's where communication and education about how it actually works are so critical to getting driver acceptance.

Waste360: What opportunities does this technology present to waste management companies in managing their fleets?

Lisk: I joined Lytx in 2003. And soon after that, some of the waste companies started to come on board with what was then a fairly primitive in-cab video technology. And they got a lot of benefit out of that.

But again, as I described before, there were a lot of risky things that the video didn't have the ability to identify. Think about it: with continuous video, there's no way in the world a human can watch it all, analyze it on a continuous basis, call up the moments where someone did something unsafe, and alert the driver or give a heads-up to management. It's just too inefficient.

So, in the earlier days, the exception-based triggers, which were typically accelerometer- or GPS-based (a driver making a sudden maneuver, or speeding), would capture exception-based video that was analyzed, but nothing beyond that. It was missing a lot of other risk that was going on, especially in the areas of fatigue and distracted driving, which were not being captured very effectively.
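
A rough illustration of that older style of trigger shows why it misses so much. The threshold values below are assumptions chosen for illustration, not figures from Lytx.

```python
# A minimal sketch of an old-style exception trigger: accelerometer and
# GPS thresholds only, with no understanding of what the driver is doing.
HARD_BRAKE_G = -0.45   # longitudinal g-force threshold, assumed
SWERVE_G = 0.40        # lateral g-force threshold, assumed
SPEED_LIMIT_MPH = 65

def exception_trigger(accel_long_g, accel_lat_g, speed_mph):
    """Return True when an exception clip should be saved for review."""
    return (accel_long_g <= HARD_BRAKE_G
            or abs(accel_lat_g) >= SWERVE_G
            or speed_mph > SPEED_LIMIT_MPH)

# A distracted or fatigued driver who happens not to brake hard or swerve
# generates no exception at all, so that risk goes unrecorded:
print(exception_trigger(accel_long_g=-0.05, accel_lat_g=0.02, speed_mph=55))  # False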

What this enables now is for companies to much more effectively capture a bigger pool of risk, and in many cases the risks that typically lead to higher-severity crashes, and to empower the driver to self-correct. For example, if the technology identifies a driver who is on their cell phone, it can be configured so the device actually sends an alert warning them that they're on a cell phone, and they'll put it down; that driver is empowered to self-correct. But it can also capture that video so the driver's manager can sit down later, review the video, and use it as the basis for a good conversation about what risk occurred and what needs to be done differently. So it is ultimately uncovering a lot more risk, risk that I think is more correlated with the higher-severity crashes, with much more opportunity to identify it and correct it than we had before.
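
The configurable response Lisk describes, an immediate in-cab alert plus a saved clip for later coaching, could be sketched as a simple per-event policy. The event names and policy fields below are hypothetical.

```python
# A minimal sketch of a configurable event response: real-time driver alert
# plus a retained clip for a later coaching conversation. Illustrative only.
from dataclasses import dataclass

@dataclass
class EventPolicy:
    in_cab_alert: bool       # warn the driver immediately so they self-correct
    save_clip: bool          # retain video for a manager coaching session
    notify_supervisor: bool

POLICIES = {
    "handheld_phone": EventPolicy(in_cab_alert=True, save_clip=True, notify_supervisor=True),
    "eyes_off_road":  EventPolicy(in_cab_alert=True, save_clip=True, notify_supervisor=False),
}

coaching_queue = []

def handle_event(event_type, clip):
    policy = POLICIES[event_type]
    if policy.in_cab_alert:
        print(f"in-cab alert: {event_type}")       # stand-in for an audible warning
    if policy.save_clip:
        coaching_queue.append((event_type, clip))  # reviewed with the driver later
    if policy.notify_supervisor:
        print(f"supervisor notified: {event_type}")

handle_event("handheld_phone", clip="clip_0042.mp4")
```

Separating the detection from the response is what lets fleets decide, per behavior, whether the feedback loop runs through the driver, the manager, or both.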

Waste360: You mentioned these capabilities weren't possible five years ago. Where do you see this technology going in the next five to ten years?

Lisk: It’s been really fascinating, especially over the last couple of years, as we really fine-tune that MBAI (machine-based artificial intelligence) as it relates to distracted driving issues. On one end, we're absolutely seeing drivers using handheld cell phones more than we ever realized before. And we're not just seeing the occurrence; we're also able to measure what percentage of the day they're doing it, because the longer you do something unsafe, the greater the chance that something bad is going to happen. So, we're able to track not just who does it, but how often it occurs and what part of their day it's consuming. And certainly, handheld cell phones are the most notable piece of this.
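
Turning individual detections into that exposure measure, what share of a shift a driver spends on the phone, is a straightforward aggregation. The event data below is fabricated for illustration.

```python
# A minimal sketch of converting detected phone-use episodes into an
# exposure metric per driver. Data and shift length are assumptions.
from collections import defaultdict

events = [  # (driver_id, seconds_on_phone) per detected episode, fabricated
    ("driver_17", 12.0), ("driver_17", 45.0), ("driver_17", 8.0),
    ("driver_23", 4.0),
]
SHIFT_SECONDS = 8 * 3600  # assumed eight-hour shift

totals = defaultdict(float)
counts = defaultdict(int)
for driver, seconds in events:
    totals[driver] += seconds
    counts[driver] += 1

for driver in sorted(totals):
    pct = 100 * totals[driver] / SHIFT_SECONDS
    print(f"{driver}: {counts[driver]} episodes, {pct:.2f}% of shift on phone")
```

The point of the metric is the one Lisk makes: duration of exposure, not just the count of occurrences, is what scales the chance of a bad outcome.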

What’s been interesting is that I'm seeing a lot more of the very natural things people do in the cab of a vehicle without realizing how long they're taking their eyes off the road. Things such as fumbling with the controls of the vehicle: no matter what kind of vehicle you have, newer vehicles have so many screens and buttons that it's really hard to be familiar with them all, and we can unintentionally spend several seconds looking down at them, trying to figure out what we're trying to do. In a lot of instances, drivers are spending several seconds looking down at an instrument panel or something else. In some cases, it's pretty clear they probably have an iPad or a phone propped up on the console and they're watching a movie as they're driving.

These are things we really had no idea were occurring, and being able to capture them now helps those drivers understand that risk so they can correct it. It's really been very revealing in that way.

In terms of where this is all going: certainly, in the world we live in, there's going to be continued growth in the improvement and application of machine vision and artificial intelligence to flesh out the real, material risk in driving, to help drivers identify it and ideally self-correct at the driver level, and, for the handful who don't, to give management the opportunity to be aware of it and do targeted coaching with the few who might need that sort of focus. I think it'll continue to evolve, get better, and identify more risky behaviors.

One of the areas we're going down is related to fatigue. We call it our inattentive trigger, but it's certainly correlated with fatigue. Instances where a driver's head is dropping for extended periods, significant eye closures, or eyes simply wandering off the roadway are all indicators of fatigue. As these triggers come into more common use, I think we can make a significant dent in incidents involving distracted driving and fatigue for those using this technology.
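
One common way to build a fatigue-style signal from eye state is a rolling measure of eye closure over time, similar in spirit to the PERCLOS measure used in drowsiness research. The sketch below assumes an upstream eye-closure detector, which is not shown, and the threshold is an illustrative assumption rather than anything Lytx has published.

```python
# A minimal sketch of a PERCLOS-like inattentive/fatigue trigger built
# from per-frame eye state. All parameters are illustrative assumptions.
from collections import deque

FPS = 10
WINDOW_SECONDS = 60
PERCLOS_THRESHOLD = 0.15  # assumed: >15% eye closure in the window flags fatigue

window = deque(maxlen=FPS * WINDOW_SECONDS)

def update(eye_closed: bool) -> bool:
    """Add one frame's eye state; return True when the fatigue trigger fires."""
    window.append(1 if eye_closed else 0)
    if len(window) < window.maxlen:
        return False  # not enough history yet
    return sum(window) / len(window) >= PERCLOS_THRESHOLD
```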

For the next five to ten years, the other place [technology] is clearly going is convergence. Vehicles have so many different sensors in them these days. You have the engine control module, and in the trucking world you've got ELDs (electronic logging devices) and things like that. There are all kinds of different devices collecting data in the vehicle. It's really going toward convergence over the next five to ten years: not only collecting the data but putting it together for a more holistic understanding of the person behind the wheel and their performance. But I think the integration of some of these devices into fewer devices is certainly another evolutionary part of this, too.

About the Author

Stefanie Valentic

Editorial Director, Waste360

Stefanie Valentic is the editorial director of Waste360. She can be reached at [email protected].
