Driving in the snow is hard enough. Preparing autonomous vehicles to maneuver in snowy conditions might be harder.

But Ford is pushing forward, continuing to test its autonomous cars in the snow to prepare them for whatever adverse weather conditions might throw at them, including skidding.

We spoke with Ford's global manager of autonomous driving, Greg Stevens, about the three challenges of preparing autonomous cars to drive in the snow, why self-driving cars handle skidding better than humans and the relationship between driver-assist technologies and full autonomy.

What are the biggest challenges of preparing an autonomous car to drive in the snow or other adverse weather conditions?

There's three main challenges at the highest level. The first is sensing, right, because just like humans have trouble seeing in snowy conditions, sensors at times also struggle. Low-light conditions can make it hard for cameras to see because they're sort of passive devices. They can only use whatever light is available to them, and if that gets blocked because there's snow either in the way or covering a lens or something like that, that's an issue for cameras. LIDAR is an active ... it's a laser, so we're sending out energy and having it come back, and we're using that to measure the distance to all the objects around us.

If the laser beam is going out through the sensor and bouncing off snowflakes and coming back, that's not really the useful information we want. We want to see the car that's beyond those snowflakes. That's the challenge there that we have to deal with. Radars actually tend to be the best sensor in terms of robustness to weather. We have a lot of experience using radars in adaptive cruise control and the forward-collision warning systems that we have in production today, but there's still a lot of situations where the weather's heavy enough that the radar struggles. So that's one major challenge: how we get good sensing information back to the vehicle.
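
To make the snowflake problem concrete, here is a minimal sketch of one de-noising idea from the research literature (a dynamic-radius outlier filter), not necessarily what Ford's pipeline does: snowflakes produce sparse, isolated returns while real obstacles produce dense clusters, so points with too few neighbors are dropped. The function name and thresholds are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def filter_snow_returns(points, base_radius=0.2, min_neighbors=3):
    """Drop likely snowflake returns from a LIDAR point cloud.

    Snowflakes give sparse, isolated returns; solid objects (cars,
    walls) give dense clusters. A point whose search radius, scaled
    with range because the beams diverge, holds too few neighbors is
    treated as precipitation noise.

    points: (N, 3) array of x, y, z in the sensor frame.
    """
    tree = cKDTree(points)
    ranges = np.linalg.norm(points[:, :2], axis=1)        # horizontal distance
    radii = base_radius * np.maximum(ranges / 10.0, 1.0)  # grow radius with range
    keep = np.array([
        len(tree.query_ball_point(p, r)) > min_neighbors  # count includes the point
        for p, r in zip(points, radii)
    ])
    return points[keep]
```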

Another major challenge is figuring out where you are, which we call localization. When the ground is covered with snow, you can't see the lane lines, so sometimes it's difficult to know exactly where you are on the road. The way we solve that challenge with our vehicle is with these highly annotated maps that we have loaded.

The easiest way to know where you are is to just use a camera, look at the roadway and see where the lane lines are. But if the lane lines are covered with snow and you can't see them, what we do is look out to the sides of the vehicle and see all the other landmarks around it, because the corner of a building, even in a snowstorm, still looks like the corner of a building. You can still see all the things that stick up out of the ground, like lampposts and trees.

If we know where those things are in the map, then we place ourselves and know where our vehicle is in that map. Since the map has all the information about where the lane lines are, then the vehicle knows where the lane lines are, even though we can't physically see the lane lines. That's another challenge — localization. It's actually one where we think autonomy can have an advantage over humans because the human doesn't know where those lane lines are, but we have a way where we can figure it out. It's still far more challenging in the snow than it is on a sunny, clear day.
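
As an illustration of that localization step (a generic alignment technique, not Ford's actual implementation), matched landmark observations can pin down the vehicle's pose with a classic least-squares fit; once the pose is known, the lane lines stored in the map can be projected into the vehicle's frame even when snow hides them. Names here are hypothetical.

```python
import numpy as np

def estimate_pose_from_landmarks(observed, mapped):
    """Estimate a 2-D vehicle pose by aligning landmarks seen in the
    vehicle frame with their known positions in the map frame, using
    the Kabsch/Umeyama least-squares rigid alignment.

    observed, mapped: (N, 2) arrays of matched landmark positions.
    Returns (R, t) such that mapped ~= R @ observed + t.
    """
    obs_c = observed - observed.mean(axis=0)
    map_c = mapped - mapped.mean(axis=0)
    U, _, Vt = np.linalg.svd(obs_c.T @ map_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mapped.mean(axis=0) - R @ observed.mean(axis=0)
    return R, t
```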

The third major challenge area is controlling in slippery conditions. Again, that's a challenge that exists for humans also; driving in the snow is a significant challenge. The advantage we have is that expert drivers who have been trained to drive in the snow know a lot of techniques because of that specialized training.

We know about understeer and oversteer, we know about the friction circle of the tire, we know the effect of weight transfer on the vehicle, so those are things we know as experts that the public generally doesn't know. A lot of that knowledge can then be transferred into driving algorithms that we load into the vehicle, so the vehicle can understand what the effect of the steering, the throttle and the brakes at different points in a maneuver will be, given that there's a low-grip situation because of the snow.
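
A minimal sketch of how that expert knowledge might look in code, assuming a simple planner that caps its commands at the tire friction circle (the limits and values here are illustrative, not Ford's):

```python
import math

def clamp_to_friction_circle(a_long, a_lat, mu, g=9.81):
    """Scale commanded accelerations so the combined demand stays
    inside the tire friction circle: sqrt(ax^2 + ay^2) <= mu * g.

    On snow mu may be ~0.3 versus ~0.9 on dry asphalt, so the same
    braking-while-steering request must be trimmed well before the
    tires saturate and the vehicle starts to slide.
    """
    limit = mu * g
    demand = math.hypot(a_long, a_lat)
    if demand <= limit:
        return a_long, a_lat
    scale = limit / demand                  # shrink both components equally
    return a_long * scale, a_lat * scale

# Braking at 4 m/s^2 while cornering at 3 m/s^2 on snow (mu = 0.3)
# gets scaled back to a combined ~2.9 m/s^2 of available grip.
print(clamp_to_friction_circle(4.0, 3.0, mu=0.3))
```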

How does Ford begin to prepare self-driving cars for the likelihood of other cars skidding in proximity on the road?

Well, one of the real strengths of an autonomous vehicle is that we have full 360-degree sensing and it's always vigilant. It never gets distracted. An autonomous vehicle can track all the vehicles around it, which is something humans can't do. We tend to see the vehicles in front of us or, when we check our mirrors, which sometimes isn't frequent, what's behind us, but we're not tracking all the vehicles around us all the time.

An autonomous vehicle is going to know — because of the vigilant 360 sensing — about what's happening to vehicles around it and probably be more aware of that than the typical driver would be if the vehicle that's starting to skid nearby happens to be outside of the driver's normal line of sight, looking forward.

The driving task that we give the vehicle ... another primary role we give the vehicle is to avoid collisions, and those collisions could be in the forward-sensing region, or it could be another vehicle that we detect coming toward us that we need to avoid. Because we have that sensing, there's a real possibility that the autonomous vehicle would know about other vehicles having trouble controlling themselves quicker than even a human driver would.
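
One simple way such a system could flag a skidding neighbor, sketched here as an assumption rather than Ford's method, is to compare each tracked vehicle's velocity direction against its body heading; a large sideslip angle suggests the car is no longer under control:

```python
import math

def is_skidding(heading_rad, vx, vy, slip_threshold_rad=math.radians(15)):
    """Flag a tracked vehicle as likely skidding when its velocity
    vector diverges from the direction its body points.

    A vehicle under control travels roughly where its nose points,
    so a large angle between heading and course suggests a skid.
    heading_rad: tracked body orientation; vx, vy: estimated velocity.
    """
    if math.hypot(vx, vy) < 1.0:            # too slow to judge slip reliably
        return False
    course = math.atan2(vy, vx)
    slip = abs(math.atan2(math.sin(course - heading_rad),
                          math.cos(course - heading_rad)))  # wrap to [-pi, pi]
    return slip > slip_threshold_rad

# A neighbor pointed due north (90 deg) but traveling northeast (45 deg),
# i.e. sliding 45 degrees off its heading:
print(is_skidding(math.radians(90), vx=5.0, vy=5.0))  # True
```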

Have Velodyne's Ultra Pucks been used in Ford's autonomous vehicle testing yet?

We don't have the Ultra Pucks yet. They're coming. The Ultra Puck puts a lot more beams in the field of view we care about for driving. On the ones we have now, the beam spacing is more spread out, which is actually good for mapping because you want a more comprehensive picture of the world around you. The Ultra Pucks will be focused more on the driving task rather than the map making.

Earlier this year, Ford announced it's tripling the size of its autonomous development team. General Motors followed suit and announced the creation of its own autonomous driving team. Can this be done and sped up without having this separate, dedicated unit?

Well, my boss is Randy Visintainer. He's the director of autonomous vehicles at Ford. We have a unit that's very focused on autonomous vehicles, but we're also very integrated with the rest of Ford Motor Company. There are some new sensing challenges and some new computing and computer-algorithm challenges that are fairly new to autonomous vehicles, where having a fairly dedicated team working on those things is helpful.

When you get to how the autonomous sensing and computing algorithms interact with the vehicle, then you're talking about needing to command what kind of steering, throttle, brake, shifting, turn signals — all those sort of things that are interfaced with the vehicle — and that's where we very heavily leverage all of our experience in creating vehicles, so that work is leveraging a lot of [what] our normal product development teams do.

Once we figure out what steering, brake, shifting, etc. we want the vehicle to do, we have a set of interfaces to the vehicle that allow that kind of control from the computer to the control modules, rather than requiring hands on the steering wheel and feet on the pedals. The experience we have in designing and developing robust vehicles really helps us with autonomous driving.
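
For a sense of what such an interface might carry, here is a purely hypothetical drive-by-wire command message; Ford has not published its actual interface, so every field below is an assumption:

```python
from dataclasses import dataclass

@dataclass
class ActuationCommand:
    """One illustrative drive-by-wire message (hypothetical fields;
    not Ford's actual interface to its control modules).

    The autonomy computer publishes commands like this instead of a
    human turning the wheel and pressing the pedals.
    """
    steering_angle_rad: float   # road-wheel angle request
    throttle_pct: float         # 0.0 to 1.0
    brake_pct: float            # 0.0 to 1.0
    gear: str                   # "P", "R", "N" or "D"
    turn_signal: str            # "off", "left" or "right"

cmd = ActuationCommand(steering_angle_rad=0.05, throttle_pct=0.10,
                       brake_pct=0.0, gear="D", turn_signal="off")
```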

2020 has been the target year for self-driving cars to hit the roads, but as time draws near, some automakers have said that they think fully autonomous cars will probably take a little longer than that and in 2020, we're likely to see more robust driver-assist technologies than actual fully autonomous vehicles. Where does Ford stand on this?

We're not focused on the dates per se. We are focused on developing features. Within the group, I have responsibility not only for the fully autonomous features, but also for those driver-assist features. So we view it as sort of an integrated continuum of increasing capability, from driver assist all the way up through full autonomy. That's why it's all coordinated in my department: we ensure that the learnings from what we're building in driver-assist technology are available to the fully autonomous team, and that the significant advances the fully autonomous team is making in sensing, computer algorithms and actuation can be shared back with my team doing driver-assistance work.

We're working on that entire space and we're progressing it as fast as we can. As for the crystal ball of who will be done at what point, it's difficult to predict that accurately, especially with full autonomy, because people are trying to predict when something that nobody has ever done before will be done. That's difficult to do with precision.

Is it fair to say that full autonomy can't be reached without maximizing driver-assist technologies' potential?

It's more like a lot of the tools, techniques and algorithms that are developed on the driver-assist side can be reused for autonomy, and some of the new technologies that are developed on the full-autonomy side can be reused on the driver-assist side.

To take a simple example, on the driver-assist side we have adaptive cruise control, and adaptive cruise control has to have algorithms that figure out how to manage the gap to the vehicle in front of you. The fully autonomous vehicle needs the same sort of algorithms, so as we make improvements to the vehicle-following algorithms on the driver-assist side, those learnings can be reused on the full-autonomy side. As we develop more powerful, capable sensing, that could potentially be shared with the driver-assist features to make them more capable. It's really that we see synergy between the two, rather than it being a case of having to have one before you can have the other.
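
The gap-management algorithm he describes is, in textbook form, a constant time-gap following law. A minimal sketch follows; the gains and limits are illustrative values from the ACC literature, not Ford's calibration:

```python
def acc_acceleration(gap_m, ego_speed, lead_speed,
                     time_gap_s=1.8, standstill_m=5.0,
                     k_gap=0.23, k_speed=0.74, a_max=2.0, a_min=-3.5):
    """Constant time-gap car following, the core of adaptive cruise
    control. The desired gap grows with speed plus a standstill
    buffer; the command is a weighted sum of gap error and relative
    speed, clipped to comfortable acceleration limits.
    """
    desired_gap = standstill_m + time_gap_s * ego_speed
    gap_error = gap_m - desired_gap
    rel_speed = lead_speed - ego_speed
    a_cmd = k_gap * gap_error + k_speed * rel_speed
    return max(a_min, min(a_max, a_cmd))

# Closing on a slower lead: a 30 m gap at 25 m/s behind a 20 m/s lead
# car commands firm braking, clipped at -3.5 m/s^2.
print(acc_acceleration(gap_m=30.0, ego_speed=25.0, lead_speed=20.0))
```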

Last month, reports surfaced about Ford collaborating with Google on autonomy. Can you tell us anything more about that partnership?

There's lots and lots of companies we're talking to in the whole mobility space and all those conversations are private for competitive reasons.

If fellow automakers and companies are more transparent with each other, does that benefit the chances of autonomous vehicles advancing, and possibly even getting regulated quicker?

There's certain areas where we compete with each other and there's certain areas where collaboration is helpful. The one you mentioned is an area where collaboration is helpful. If we get consensus on what the standards, requirements and regulatory structure [are] for autonomous vehicles, and if we have a single nationwide standard for that, that benefits everyone rather than having a patchwork of different rules in 50 different states.

If we can all work together, from the auto industry to companies and government (state, local and federal) to come up with consensus approaches and common approaches, that benefits everybody. There are other areas where we keep everything we're doing very close to the vest, obviously some of the algorithms, software and sensing, for competitive reasons. Those are kept more private.

Ford announced its FordPass service and a collaboration with IBM for smart parking. How do you see those advances in mobility being paired with autonomous vehicles?

Well, historically our business has been selling vehicles to people. As we move forward, there will be a significant business opportunity in selling mobility to people instead of just selling them a vehicle, and we're very interested in that. That's the whole smart mobility effort: figuring out how to participate in that space. We see autonomy as one of the significant enablers for that mobility space, along with some of the connectivity things.
