Matt McFarland | CNN Business
When a dozen small children crossed the street in front of our Tesla on “Full Self-Driving,” I had every reason to be nervous.
I spent my morning in the backseat of a Model 3 using “Full Self-Driving” – a system that Tesla says will change the world by enabling safe and reliable autonomous vehicles. I watched the software nearly crash into a construction site, try to turn into a stopped truck, and attempt to drive down the wrong side of the road. Angry drivers honked as the system hesitated, sometimes right in the middle of an intersection.
The Model 3’s “Full Self-Driving” needed human intervention to protect us and everyone else on the road. Sometimes that meant hitting the brakes to disengage the software so it wouldn’t swerve around the car in front of us. In other cases, we quickly grabbed the wheel to avoid a collision. (Tesla tells drivers to pay attention to the road at all times and be ready to act immediately.)
I hoped the car wouldn’t make another foolish mistake. After what felt like an eternity, the children finished crossing the street. I exhaled.
It was clear that it was our turn to go. At first the car seemed overly hesitant, but then I noticed a cyclist approaching on our left. The car was waiting.
As soon as the cyclist cleared the intersection, the car accelerated and made a smooth turn.
Over the past year, I have watched more than a hundred videos of Tesla owners using “Full Self-Driving” technology and have spoken to many of them about their experiences.
“Full Self-Driving” is a $10,000 driver-assistance option offered by Tesla. While all new Teslas are built with the hardware to run the “Full Self-Driving” software, buyers must pay for the pricey add-on if they want access to the feature. The software is still in beta testing and is currently available only to a select group of Tesla owners, although CEO Elon Musk has said a wider rollout is imminent. Musk promises that “Full Self-Driving” will soon be capable of taking the car all the way to its destination.
But it isn’t there yet. Not even close.
Tesla owners have described the technology as impressive but flawed. One moment it drives perfectly; the next, it nearly crashes into something.
Jason Tallman, a Tesla owner who documents his “Full Self-Driving” trips on YouTube, suggested that I experience it for myself.
I asked Tallman to meet me on Flatbush Avenue in Brooklyn, an urban artery that carries thousands of cars, trucks, cyclists and pedestrians toward Manhattan. Even for experienced drivers, it can be a challenge.
RELATED: Apple Accelerates Electric Vehicle Project, Aims for Fully Autonomous Vehicle
Driving in the city is chaotic, with cars running red lights and pedestrians on almost every block. It’s a far cry from the suburban neighborhoods and predictable highways around Tesla’s California offices, or the wide streets of Arizona, where Alphabet’s Waymo operates fully autonomous vehicles.
Cruise, GM’s self-driving car company, recently completed its first fully autonomous rides in San Francisco. But those took place after 11 p.m., when there is little traffic and there are few pedestrians or cyclists.
Brooklyn gave us a chance to see how close Tesla’s autonomous driving software is to replacing human drivers. These are streets people drive because they need to, not places a corporate headquarters chose. It’s where self-driving cars could have their greatest impact.
At one point we were driving in the right lane of Flatbush Avenue. A construction site loomed ahead. The car continued at full speed toward the row of metal barriers.
I felt déjà vu, remembering a video of a Tesla owner slamming the brakes after his car appeared ready to plow head-on into a construction site.
But this time I was sitting in the back seat. I instinctively threw up my right arm, like a Heisman Trophy pose, as if bracing for a collision.
This was a moment when I wanted “Full Self-Driving” to quickly change lanes. At other times, I wished it would tone down its aggressive turns.
“Full Self-Driving” sometimes turns in jerks. The wheel begins to spin, then snaps back toward its original position before turning again in the intended direction. Halting turns like these generally aren’t a problem on gentle suburban corners, but in a densely populated city built largely before the automobile, they are unnerving.
There is also the braking, which can seem random. At one point, a car behind us nearly rear-ended us after we braked abruptly, which startled me. Getting honked at was common. I never felt like I knew what “Full Self-Driving” would do next. Asking it to navigate Brooklyn was like asking a student driver to take a road test they weren’t ready for.
What “Full Self-Driving” did well was impressive, but ultimately unnerving. I can’t imagine using it regularly in the city. I noticed that I never wanted to look down at the Model 3’s dashboard, for example to check our speed, because I didn’t want to take my eyes off the road.
Tesla owners often tell me how Autopilot, the highway-focused precursor to “Full Self-Driving,” makes their trips less stressful. They arrive at their destinations feeling less tired. Some have told me they are more likely to take long trips because of Autopilot.
But “Full Self-Driving” was the opposite. I felt I had to be constantly on guard against the car doing something wrong.
Ultimately, my experience with “Full Self-Driving” in Brooklyn reminded me of the subtle aspects of driving that are difficult for an AI-driven vehicle to master. Things like easing slightly into an intersection on a narrow road while waiting to turn left, so the cars behind you have room to pass. “Full Self-Driving” just sat there as frustrated drivers honked behind us.
At this point, “Full Self-Driving” seems more like a party trick than a must-have feature.
™ & © 2021 Cable News Network, Inc., a WarnerMedia company. All rights reserved.