Jeremy Kahn reports for Bloomberg:
You’re crossing the street wrong.
That is essentially the argument some self-driving car boosters have fallen back on in the months after the first pedestrian death attributed to an autonomous vehicle and amid growing concerns that artificial intelligence capable of real-world driving is further away than many predicted just a few years ago.

“What we tell people is, ‘Please be lawful and please be considerate,’” says Andrew Ng, a well-known machine learning researcher who runs a venture fund that invests in AI-enabled companies, including self-driving startup Drive.AI. In other words: no jaywalking...
Whether self-driving cars can correctly identify and avoid pedestrians crossing streets has become a burning issue since March after an Uber self-driving car killed a woman in Arizona who was walking a bicycle across the street at night outside a designated crosswalk. The incident is still under investigation, but a preliminary report from federal safety regulators said the car’s sensors had detected the woman but its decision-making software discounted the sensor data, concluding it was likely a false positive...
With these timelines slipping, driverless proponents like Ng say there’s one surefire shortcut to getting self-driving cars on the streets sooner: persuade pedestrians to behave less erratically. If they use crosswalks, where there are contextual clues—pavement markings and stop lights—the software is more likely to identify them.
But to others the very fact that Ng is suggesting such a thing is a sign that today’s technology simply can’t deliver self-driving cars as originally envisioned. “The AI we would really need hasn’t yet arrived,” says Gary Marcus, a New York University professor of psychology who researches both human and artificial intelligence. He says Ng is “just redefining the goalposts to make the job easier,” and that if the only way we can achieve safe self-driving cars is to completely segregate them from human drivers and pedestrians, we already had such technology: trains.
Rodney Brooks, a well-known robotics researcher and an emeritus professor at the Massachusetts Institute of Technology, wrote in a blog post critical of Ng’s sentiments that “the great promise of self-driving cars has been that they will eliminate traffic deaths. Now [Ng] is saying that they will eliminate traffic deaths as long as all humans are trained to change their behavior? What just happened?”
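The detail in the excerpt about the Uber crash (the sensors detected the woman, but the decision-making software discounted the detection as a likely false positive) is easier to see in miniature. What follows is only an illustrative sketch of that kind of confidence-gating logic; the names, the threshold, and the crosswalk bonus are invented for the example and are not taken from Uber's actual software or any real self-driving stack.

    # Illustrative sketch of confidence-gated detection; all names and numbers
    # are hypothetical, not taken from any real self-driving system.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str          # e.g. "pedestrian", "bicycle", "unknown"
        confidence: float   # classifier confidence in [0, 1]
        in_crosswalk: bool  # contextual cue: near pavement markings / signals

    FALSE_POSITIVE_THRESHOLD = 0.6  # arbitrary illustrative value

    def should_brake(detection: Detection) -> bool:
        """Treat the detection as real (and brake) only above a confidence threshold."""
        confidence = detection.confidence
        # Contextual cues like crosswalk markings raise effective confidence,
        # which is why pedestrians outside crosswalks are easier to discount.
        if detection.in_crosswalk:
            confidence += 0.2
        return confidence >= FALSE_POSITIVE_THRESHOLD

    print(should_brake(Detection("unknown", 0.5, in_crosswalk=False)))    # False: discounted
    print(should_brake(Detection("pedestrian", 0.5, in_crosswalk=True)))  # True: brakes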
But I think the self-driving car versus pedestrian problem is bigger than this. I reported in January:

It is not uncommon to see self-driving cars being tested on the streets of San Francisco. I probably see one every other day. There are usually two people in the car, one on the driver's side in case of an emergency.

Is the "pedestrian problem" the real reason that Uber, according to reports, is considering selling its self-driving car development division?
The other day, a self-driving car was coming down a narrow street. My plan was to cross after it had passed. I was facing a "do not walk" sign, but it's the kind of intersection where everyone practices anarchist calisthenics. The self-driving car had a green light, and I lazily stepped off the curb, intending to cross once it had gone by, but the car stopped at my crosswalk, apparently because I had stepped off the curb.
If a human had been driving, my posture and lack of focus would have made it easy to tell that I had no intention of crossing until the car had passed. The self-driving car didn't seem able to pick up on this.
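The car's behavior is consistent with a rule that keys on the bare geometric fact that someone has left the curb and ignores posture or attention entirely. Here is a minimal sketch of such a rule; the function name, the distance cutoff, and the attentiveness flag are all hypothetical, not anyone's actual planner.

    # Hypothetical yield rule: reacts to a pedestrian leaving the curb,
    # deliberately ignoring posture, gaze, or apparent intent.
    def should_yield(pedestrian_off_curb: bool, distance_m: float,
                     looks_attentive: bool) -> bool:
        # looks_attentive is accepted but unused: that is the point of the
        # sketch, and it matches a car that stops for someone clearly waiting.
        return pedestrian_off_curb and distance_m < 15.0

    # A pedestrian who steps off the curb but plainly intends to wait still
    # triggers a stop:
    print(should_yield(pedestrian_off_curb=True, distance_m=8.0, looks_attentive=False))  # True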
Too many engineers work from the premise that humans are the error problem. You see the same problem with mainstream economics. Quantification and aggregation of humanity are a waste of time when there are no constants. You might as well try to do statistical analysis on beauty or love (or hate). The only reason traffic is as safe as it is today is that humans are driving. Adding machine drivers to the system only adds more complexity, which will only result in new types of outcomes (many of which will be considered failures). And the circle of engineering 'better' solutions will go around again (and again) until the basic lesson is learned.
Years ago I read the very entertaining book "A Canticle for Leibowitz". It had a scene where cars were running over people, but they handled that by having another machine that would clear the roadway automatically. Progress.
Seriously? A woman walking a bike at night, not in a crosswalk, into the path of a moving vehicle? Even a conscientious, fully alert human driver could probably not have avoided that accident.
ReplyDeleteExactly. And AI machines learn from experience (one technique is called back-propagation). And with self-driving cars, they have potential to learn from all the other self-driving cars to become proficient very quickly.
As to Robert's "I stepped into the crosswalk" argument, as I pointed out at the time, in California it is technically illegal for a car to cross the crosswalk while he's standing in it, and people get tickets for this every day. That's a difficult problem for an AI to work out, as generating tickets for its owner is not a great selling point.
You've obviously never avoided hitting a deer, which aren't known for using crosswalks. And in my part of the country, this is an urban, not a rural, problem. On top of this is the local ordinance that gives pedestrians the right-of-way everywhere, not just in crosswalks, and the pedestrians are notorious for crossing wherever and whenever they please. Yet human drivers seem to avoid getting blood all over their bumpers.
Geoih, I have in fact successfully avoided hitting a deer. That being said, I have also seen a great many deer carcasses by the side of the road, and heard of many pedestrians losing their lives and third dimension. I agree with Pandemic that self-driving cars will most likely eventually far exceed the safety of the fully alert motorist, not to mention drugged-up humans.
DeleteIf "we" can "persuade pedestrians to behave less erratically," why not just persuade drivers to behave less erratically and forget the whole self-driving car nonsense?
Pedestrian deaths are just acceptable collateral damage on the road to that bright AI future, eh? But one thing I don't get: if every driverless car needs a human "safety" driver, why not just let the human drive?