“Some people look at it and say, ‘Oh, what’s this crazy dad doing?’ Well, I do a lot of things like that, but I make sure my son doesn’t get hit,” Kupani told CNN Business.
But it’s another example of the unintended consequences of deploying an unfinished, disruptive technology in the wild — and it shows just how ready some Tesla devotees are to defend the company. Enough people appear to be running their own experiments that a government agency has taken the unusual step of warning against testing vehicle technology on children.
“Consumers should never attempt to create their own test conditions or test the performance of vehicle technology on real people, especially children,” the National Highway Traffic Safety Administration (NHTSA) said in a statement Wednesday. The agency called this approach “highly dangerous.”
Testing Tesla
Earlier this month, Tad Park, a California resident, saw another Tesla fan looking to test “full self-driving” with a child and volunteered his own two children. Park told CNN Business that it was “a little hard” to get his wife to agree; she did once he promised that he would be the one behind the wheel.
“My kids mean more to me than anything else, so I don’t push the limits,” Park said. “I will not risk their lives in any way.”
The Tesla slowed as it approached the “box boy,” then sped up again and struck the cardboard cutout. Kadamuro speculates that this happens because the cameras can’t see short objects immediately in front of the bumper, so the system effectively forgets they’re there.
Kadamuro says the video started out as entertainment. But he also wanted people to see that “full self-driving” isn’t perfect.
“I’ve found that a lot of people have two extreme ideas about the ‘full self-driving’ beta,” Kadamuro said. “People like Dan think it’s the worst thing in the world. I know some friends who think it’s close to perfect.”
Kadamuro also said that he has run other experiments in which his Tesla successfully maneuvers around the “box boy” when traveling at higher speeds.
Quickly and accurately spotting small objects, such as small children, will generally be harder for the computer vision systems Tesla vehicles rely on than detecting larger objects and adults, said Raj Rajkumar, a Carnegie Mellon University professor who researches autonomous vehicles.
The more pixels an object occupies in a camera image, the more information the system has to pick out features and identify the object. The system is also shaped by the data it is trained on — for example, how many young children appear in its training images.
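The pixel point can be made concrete with basic pinhole-camera geometry: an object’s on-image height scales with its real height and shrinks with distance. This is a rough illustration only — the focal length below is a hypothetical example value, not a Tesla camera specification.

```python
# Illustration of why small objects yield fewer pixels for a vision system.
# Pinhole model: image_height_px = focal_length_px * object_height / distance.

FOCAL_LENGTH_PX = 1000  # hypothetical focal length expressed in pixels


def pixel_height(object_height_m: float, distance_m: float) -> float:
    """Approximate height in pixels of an object seen at a given distance."""
    return FOCAL_LENGTH_PX * object_height_m / distance_m


for label, height_m in [("small child", 0.9), ("adult", 1.8)]:
    for distance_m in (5, 20, 50):
        px = pixel_height(height_m, distance_m)
        print(f"{label} ({height_m} m) at {distance_m} m -> {px:.0f} px tall")
```

Under this model, a 0.9 m child at 50 m spans only about 18 pixels — half the footprint of an adult at the same distance — leaving the detector far less image information to work with.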
“Computer vision with machine learning is not 100% foolproof,” Rajkumar said. “Like any disease diagnosis, there are always false positives and false negatives.”
Tesla did not respond to a request for comment and generally does not engage with the professional news media.
“The rules of Wild West chaos”
Some Tesla fans criticized O’Dowd’s use of cones as lane markers in his first test, which may have limited the sedan’s ability to steer around the mannequin. Others believe O’Dowd’s test driver forced the Tesla to hit the mannequin by pressing the accelerator, which is not seen in the videos O’Dowd has released. Some Tesla fans also pointed to blurred messages on the vehicle’s screen as evidence that O’Dowd’s test driver was pressing the accelerator to rig the tests.
O’Dowd told CNN Business that the blurred messages related to Supercharging being unavailable and to uneven tire wear. O’Dowd did not provide a clear video of what happened inside the car during the tests, so CNN Business could not independently verify what the messages said.
O’Dowd is the founder of the Dawn Project, an effort to make computers safe for humanity. He ran unsuccessfully for the United States Senate this year, campaigning almost solely on his criticism of “full self-driving.”
NHTSA is currently investigating Tesla’s driver-assist technology, so changes may be coming.
“The software that controls the lives of billions of people in self-driving cars has to be the best software ever written,” O’Dowd said. “We’re using the rules of absolute Wild West chaos, and we’ve got something really scary.”