
The difference between this collision and others was that the Chrysler Pacifica was a Waymo Autonomous Vehicle. Before the passenger vehicle cut in front to avoid the box truck, the Waymo vehicle was in self-driving mode. Acting on instinct, Waymo’s test driver took manual control of the AV, disengaging from self-driving mode and colliding with the motorcycle.
Waymo reports a decade of testing over 10 million miles on city streets and private facilities. In a recent corporate blog post, CEO John Krafcik admitted that the AV driver took the wrong course of action.

“Testing on public roads is vital to the safe development of self-driving technology, and we’re sorry that a member of the community was injured in a collision with one of our cars,” he added. “We recognize the impact this can have on community trust. We hold ourselves to the highest standard, and we are always working to improve and refine our testing program.
“As professional vehicle operators, our test drivers undergo rigorous training that includes defensive driving courses, including guidance on responding to fast-moving scenarios on the road. However, some dynamic situations still challenge human drivers. People are often called upon to make split second decisions with insufficient context. In this case, our test driver reacted quickly to avoid what he thought would be a collision, but his response contributed to another.”

According to the DMV permit, Waymo will be able to test around 36 self-driving vehicles without a driver in Palo Alto, Mountain View, Los Altos, Los Altos Hills, and Sunnyvale. Waymo employees will be the first to take rides in the unmanned vehicles. The self-driving company has been allowed to test autonomous vehicles with safety drivers since 2014 and is one of 60 companies authorized to do so, the DMV said.
But a report from The Information (“Waymo’s Big Ambitions Slowed by Tech Trouble,” August 28, 2018) suggested Waymo’s self-driving technology struggles with more driving tasks than the Mountain View company has indicated. The publication said Waymo vehicles have difficulty making unprotected left turns, distinguishing between individuals in a large group, and merging into turn lanes and highway traffic, among other trouble areas.
“Incidents like this are what motivate all of us at Waymo to work diligently and safely to bring our technology to roads, because this is the type of situation self-driving vehicles can prevent,” Waymo CEO Krafcik wrote. “We designed our technology to see 360 degrees in every direction, at all times. This constant, vigilant monitoring of the car’s surroundings informs our technology’s driving decisions and can lead to safer outcomes.”
How Waymo tech works
As motorcyclists, should we take comfort in the idea of true driverless vehicles deciding what’s best for us, even when one of our own used poor judgment on a cold Friday morning in October? Would the Waymo van’s AI have made the right decision to avoid colliding with the Honda Rebel?
Apparently ‘the motorcyclist is at fault’ will be the defense used in all autonomous incidents involving collisions with motorcycles. There were two moto/autonomous collisions in San Francisco in 2017; both were blamed on the riders, as was the rider in this story. It’s the same as it ever was: bikers are almost always blamed for accidents, and killers of bikers are seldom prosecuted. We’ve already seen robot cars kill pedestrians and drivers: autonomous driving is the most fearsomely complex computer problem ever, according to the coders working on it, with so many variables and so much at stake. While I like the idea of idiot/aggressive drivers being replaced by slow robots, I’d rather not be killed by an underdeveloped computer program…
Sorry Paul, I don’t see how you came to your conclusion. Nowhere did it say “the motorcyclist is at fault”. In fact, the human who was driving the car made an error that the computer would not have made. The motorcyclist would NOT have been injured if the computer had been driving. (Note that the car was not being driven by the computer at the time of the accident; the human inside the car had taken over control.) I don’t think you need to worry about being killed by an underdeveloped computer program nearly as much as you need to worry about the humans who are all around you.
Sorry Pete, but you’re wrong. Dead wrong. See my reply below to Speed for the details.
Paul D’Orléans wrote, “Apparently ‘the motorcyclist is at fault’ will be the defense used in all autonomous incidents involving collisions with motorcycles.”
At the instant of the accident and just before, the Chrysler was not an autonomous vehicle — it was under the control of the test driver. “Acting on instinct, Waymo’s test driver took manual control of the AV, disengaging from self-driving mode”
“In a recent corporate blog written by CEO John Krafcik, he admitted that the AV driver took the wrong course of action.”
[ … ]
“Crucially, our technology correctly anticipated and predicted the future behavior of both the merging vehicle and the motorcyclist. Our simulation shows the self-driving system would have responded to the passenger car by reducing our vehicle’s speed, and nudging slightly in our own lane, avoiding a collision.”
Perhaps the proper response from members of the motorcycling community would be to push for faster integration of autonomous vehicles into the transportation system. It would be more effective than the “Watch for Motorcycles”, “Look for Motorcycles” and “Look Twice, Save a Life — Motorcycles Are Everywhere” signs and bumper stickers.
I’m on P d’O’s side on this 150%. Why?
Because the two replies from Pete and Speed sound like a whole lotta autonomous manufacturers’ propaganda to me … and to any other thinking person with a modicum of common sense and intelligence. Fact is, both comments/replies sound downright scripted. Probably because they are.
Which is to say to both Speed and Pete … yes … I’m accusing you both flat out of being blatant autonomous industry shills, if not robots. Because the reality is that the genuine investigations into all the recent autonomous accidents have shown the technology was at fault, not the driver, which is why the manufacturers are being held liable, not the human behind the wheel (NYTimes, AW, WSJ, WP, etc., et al.).
Facts, gentlemen … facts … not blatant propaganda desperate to promote an entirely useless, superfluous technology for technology’s sake, solely for the purpose of making a profit … consequences be damned.
Dr TJ Martin ( retired ) aka GuitarSlinger
Minor correction:
Dr TJ Martin PhD ( retired )
You mention facts, but can you cite a single source?
I’m glad that you have a PhD, how many years have you worked in automation? I spent my grad school and 25 more years on it, and have been developing Waymo’s radars for the past 5 years. I have access to real data and have studied the accidents.
Accuse me of whatever you want. You’re way out of your element, and it shows. Have a nice day.
Let’s face one simple fact when it comes to autonomous vehicles on public roads:
They are NOT ready for prime time. The creators know it. The engineers know it. The manufacturers considering adopting it know it. The insurance industry and the legal profession know it all too well (just champing at the bit for all those $$$ they’ll be making), etc., et al., ad nauseam.
Yet they carry on … not giving a damn about public safety, detriments to society, nor the inherent consequences therein … as they pursue blatant profit over benefit, as per the Gospel of Ayn Rand.