One reason it might be helpful for Ford to build these vehicles with a steering wheel and pedals (and a driver’s seat positioned where a driver would sit) is that they would operate at SAE Level 4. They would run only in geo-fenced areas, and only when the weather is conducive to the operation of the cameras, radar and lidar. That could mean a run of snow or rain results in a lot of parked autonomous vehicles (AVs), so having the means for people to drive them when necessary could be beneficial to the fleet.
In February 2017, Ford announced that it was making a $1-billion investment, over a five-year period, in a Pittsburgh-based startup, Argo AI, thereby becoming the majority shareholder in the company.
Argo was founded by Bryan Salesky, the CEO, and Peter Rander, president and COO. Both are veterans of the Carnegie Mellon National Robotics Engineering Center, where work is done on autonomous technology even for things like agricultural equipment. Rander went on to be an engineering lead at the Uber Advanced Technologies Center and had a stint as a consultant in the self-driving space before establishing Argo AI, which works with artificial intelligence, machine learning and computer vision to facilitate things including, but not restricted to, delivering Domino’s pizzas to the home without a ballcap-wearing kid behind the wheel.
Rander points out, “Ford is not exactly looking to Argo AI to tell it how to design a car and about its suspensions and manufacturing and supply chain.” There’s what an automaker does, and then there’s what a “cyberphysical system” developer does.
But he acknowledges that it is important to work closely with the OEM in order to execute a system that will operate as intended. That is, he says that one metaphor often used in the context of artificial intelligence for an AV is that it is the “brain.” But in the Argo AI world, it is thought of as the “head,” having more to do than merely think — it also has to sense (as in those various radars and cameras) and then send signals to the nervous system (the electrical architecture) that, in turn, causes the muscles to do something (like steer and brake).
“There may be kits out there someday,” Rander says, “but the head and body are pretty intertwined.”
While some people express impatience with how long it seems to be taking to get self-driving vehicles on the road in notable numbers, as Rander walks through the various things that need to be programmed, it seems remarkable that it will ever happen. Consider that the vehicle needs to know where it is in space, which calls for precise digital maps and GPS. Then there are the sensors: they might provide information about color and shapes, but, as Rander puts it, “It doesn’t tell me what it is: a person, a car or a semitrailer.”
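The gap Rander describes, between raw sensor geometry and actual meaning, can be sketched in miniature. The toy function below guesses a label from nothing but a bounding box’s dimensions; the thresholds and labels are purely illustrative (real perception stacks use learned models, not hand-set rules like these):

```python
# Hypothetical, drastically simplified labeler: turns a raw detection's
# bounding-box size into a class guess. Thresholds are made up for
# illustration; production systems rely on trained classifiers.

def label_detection(length_m: float, height_m: float) -> str:
    if length_m > 10:
        return "semitrailer"
    if length_m > 3:
        return "car"
    if height_m > 1 and length_m < 1.5:
        return "person"
    return "unknown"

print(label_detection(15.0, 4.0))  # semitrailer
print(label_detection(4.5, 1.5))   # car
print(label_detection(0.6, 1.8))   # person
```

Even this caricature makes the point: the sensors hand over numbers, and something else has to decide what those numbers mean.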
Then there are the aspects of motion planning (how to get from A to B) and prediction. (“How do I do that given the situation around me and the prediction of things that I anticipate will happen? If I move over, will the person coming up in the lane behind me slow down and shrug their shoulders or will they hit the brakes, beep the horn and make interesting gestures, or will they rear-end us because we did something unsafe?”)
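One common way to frame the prediction problem Rander poses — shrug, brake, honk, or rear-end us — is as an expected-cost calculation over predicted reactions. The sketch below is an assumption about how such a step could look, not Argo AI’s actual planner; every name, probability and cost here is invented for illustration:

```python
# Hypothetical planning-plus-prediction step: before committing to a lane
# change, weigh each predicted reaction of the trailing driver by its
# probability and an assigned cost. All values are illustrative.

from dataclasses import dataclass

@dataclass
class PredictedReaction:
    name: str           # e.g. "yields_calmly", "brakes_and_honks"
    probability: float  # model's estimate that this reaction occurs
    cost: float         # penalty assigned to this outcome

def expected_cost(reactions: list[PredictedReaction]) -> float:
    """Probability-weighted cost of attempting the maneuver."""
    return sum(r.probability * r.cost for r in reactions)

def should_change_lanes(reactions: list[PredictedReaction],
                        threshold: float = 1.0) -> bool:
    # Commit only when the expected cost of the maneuver is low enough.
    return expected_cost(reactions) < threshold

reactions = [
    PredictedReaction("yields_calmly", 0.80, 0.1),
    PredictedReaction("brakes_and_honks", 0.18, 2.0),
    PredictedReaction("rear_end_risk", 0.02, 100.0),
]
print(should_change_lanes(reactions))  # False: the small chance of a
                                       # rear-end collision dominates
```

The design point is that a rare-but-catastrophic reaction (the rear-end case) can veto a maneuver even when the most likely outcome is benign.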
Rander points out that every day, people drive inches away from a row of cars parked along roadways and think nothing of it — until they see that one of the cars has its brake lights on, a sign that it’s probably going to pull out. Which means the driver must make a decision about what to do. Which means that in order to develop an AV system, the developers have to tell it what to do under those circumstances.
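Encoded as a rule, that judgment might look something like the toy function below. The clearance values are assumptions made up for illustration — the point is only that the implicit human heuristic (“brake lights on means give it room”) has to be made explicit for the machine:

```python
# Toy version of the parked-car rule: if a parked car's brake lights come on,
# treat it as likely to pull out and keep more lateral clearance (or slow
# down). The specific distances are illustrative assumptions.

def lateral_clearance_m(parked_car_brake_lights_on: bool) -> float:
    """Return how much lateral clearance (meters) to keep from a parked car."""
    NORMAL_CLEARANCE = 0.3   # drivers routinely pass within inches
    CAUTION_CLEARANCE = 1.2  # widen out if the car may pull out

    return CAUTION_CLEARANCE if parked_car_brake_lights_on else NORMAL_CLEARANCE

print(lateral_clearance_m(False))  # 0.3
print(lateral_clearance_m(True))   # 1.2
```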
The test fleets that Ford and Argo AI are running include two safety drivers, both in the front seat of the vehicles, not only to assure the safe operation of the vehicles but also to make sure that information is obtained. “The bulk of the work,” Rander says, “is getting out in the real world and experiencing it in context. That proves to be the fastest, most-effective way to get the most accurate data.” He adds, “If I can recognize a car in a studio, that’s great, but it is under studio lighting. If I take it outside, the lighting won’t be so perfect.” And people don’t drive in studios…
Then there is the issue of how an AV will fit into a system where it is outnumbered by non-AVs. Consider the situation at a four-way stop, Rander says. “If you hesitate too long, the traffic behind you gets frustrated because everyone else says you’re letting me go.” It might be legal and technically acceptable, but “It’s not socially acceptable, and so not very compatible.” And AVs need to be compatible with the flow of traffic.
He cites what is known as a “Pittsburgh left.” Apparently, in downtown Pittsburgh, a driver stopped at a traffic signal on a road with no turn lane who is making a left will take the turn as soon as the signal turns green, ahead of oncoming traffic. This is a cultural thing that people have become familiar with. “Try that in New York City,” Rander says. “I’ll bet you get a different reaction.” (Or in Detroit, where you’ll probably get T-boned.)
And there are posted speed limits versus the flow of traffic. What if the speed limit is 40 and everyone is going at least 45? What does the AV do? “We want the vehicles to be able to keep up with the flow of traffic,” Rander says, but then goes on to explain that it is going to be important to talk with officials at the city, state and federal levels because, “We can’t have every AV pulled over.”
The year 2021 isn’t that far into the future. Will Ford and Argo AI get to that SAE Level 4 by then? A short time will tell.