Wednesday, January 17, 2018

A Report on an Experience with a Self-Driving Car on the Streets of San Francisco

It is not uncommon to see self-driving cars being tested on the streets of San Francisco. I probably see one every other day. There are usually two people in the car, one in the driver's seat ready to take over in case of an emergency.

The other day, a self-driving car was coming down a narrow street. My plan was to cross that street after the car had passed. I was facing a "don't walk" sign, but it is the kind of intersection where everyone practices anarchist calisthenics. The self-driving car had a green light. I lazily stepped off the curb, intending to cross after the car passed, but the car stopped at my crosswalk, apparently because I had stepped off the curb.

If a human had been driving the car, it would have been easy to tell from my posture and lack of focus that I had no intention of crossing until the car had passed. The self-driving car didn't seem to be able to grasp this.



  1. I thought I read somewhere that self-driving cars are causing more accidents at this point because they don't react to other vehicles the way human drivers do, so human drivers in other cars can't anticipate what the self-driving car is going to do.

  2. I'm pretty sure California law gives pedestrians the right-of-way in any crosswalk -- even if the pedestrian is breaking the law. So, the car was programmed to obey the law -- something any company wanting to be granted authority to operate, and not wanting to be sued, would have to accomplish.

    1. This is not about technicalities of the law but how real people drive versus self-driving cars.

    2. My point is it's unlikely things like this will change, as companies will be loath to blatantly have their products break the law, and litigious American society would swoop in on any accident in which an autonomous vehicle was breaking the law.

      Once people figure out these cars will stop for any pedestrian, it will likely spawn a new type of robbery where criminals or gangs of criminals step out in front of autonomous cars and attack the occupants. It's a difficult problem to solve -- especially in an over-regulated, litigious country.

  3. As a professional software engineer, I am in awe that autonomous vehicles are able to do what they do. They, of course, will continue to improve, perhaps to the point where even Wenzel will be impressed.

  4. It is a great conceit to think that all things are tractable. There is a great difference between driving-as-imagined and driving-as-done. Safe driving and unsafe driving are outcomes of the same process. There are millions of variable actions taken by each driver within a time-constrained environment, and these variations can either accumulate or cancel each other out, with the result being an accident, or no accident. The process is intractable and the outcomes are emergent.

  5. I see real safety issues with self-driving cars (in addition to the liberty-related issue that, since they are more reliant on software than older cars, they will be far more trackable by the state, whether through hacking the software or requiring OEMs to grant the state access).

    It's not hard to imagine a number of instances when circumstances require human insight and not slavish, mechanical obedience to traffic laws. You're at a red light and see a truck bearing down on you from behind, seemingly out of control, requiring you to make the decision to run the red light. On a snowy road where there isn't much traction you're driving up a hill to a stop sign or red light, and it makes sense to roll through. You have to rush someone to the ER and need to break a number of traffic rules. You need to (safely) change lanes even though the line marking is unbroken, or you need to cross the double yellow lines for good reason. You're being chased down in a parking lot and need to get in your car and get out of there quickly, traffic rules be damned (because, of course, the state doesn't want you to be armed). And so on.