Tesla’s ‘Full Self-Driving’ Controversy Now Features Homemade Mannequins and Tests on Real Kids


A North Carolina resident has set out to debunk a widely circulated video of a Tesla with the company’s “Full Self-Driving” beta software — which lets the car steer, brake and accelerate, but requires an attentive human driver ready to take the wheel — plowing through a child-sized mannequin.
Dan O’Dowd, the software company CEO who posted that video earlier this month, says the National Highway Traffic Safety Administration should ban “Full Self-Driving” until Tesla CEO Elon Musk’s company proves the software “doesn’t mow down children.”
That’s when Carmine Cupani, who runs an auto shop, got involved with his own Tesla and enlisted his son. Although he’s a self-described “BMW guy,” Cupani says nothing else compares to the software Tesla offers. It also wasn’t the first time Cupani had enlisted his 11-year-old son in a viral car test: Earlier this year, he posted a video of his son driving a Model S Plaid — capable of going 0-60 mph in 1.99 seconds — in a private parking lot. It has been viewed over 250,000 times.

“Some people look at it and say, ‘Oh, what’s this crazy dad doing?’ Well, I do a lot of things like that, but I make sure my son doesn’t get hit,” Cupani told CNN Business.

Carmine Cupani and his 11-year-old son with their Tesla
Cupani filmed a “Full Self-Driving” experiment in a parking lot. The boy stood at the end of a lane with a smartphone to record the test. Cupani accelerated the Tesla from the other side of the lot with “Full Self-Driving” engaged and reached 35 mph. The Tesla braked on its own and came to a complete stop — just before reaching the boy.
Cupani did another test on a road with his son, using Tesla’s more basic driver-assistance software, Autopilot, and found that it also stopped for the boy. “This Dan guy says, he’s an expert in this, he’s an expert in that,” Cupani said. “Well, I’m an automotive, futuristic technology, pro driving instructor.”
Cupani is among the Tesla fans who took issue with O’Dowd’s video and set out to create their own experiments. Some asked their children to help. Others built homemade mannequins or used dolls.
The impassioned defenses and criticisms of “Full Self-Driving” highlight how the technology has become a flashpoint for the industry. The California DMV recently said the name “Full Self-Driving” is misleading and is grounds for suspending or revoking Tesla’s license to sell the vehicles in the state. Ralph Nader, whose criticism of the auto industry in the 1960s helped create the National Highway Traffic Safety Administration (NHTSA), joined the critics of “Full Self-Driving” this month.

But it’s also another example of the unintended consequences of deploying an unfinished, disruptive technology in the wild — and it shows just how far some Tesla devotees will go to defend the company and its software. Enough people appear to be running their own experiments that a government agency has taken the unusual step of warning against testing car technology on children.

“Consumers should never attempt to create their own test conditions or test the performance of vehicle technology on real people, especially children,” NHTSA said in a statement Wednesday. The agency called this approach “highly dangerous.”

Testing Tesla

Earlier this month, Tad Park, a California resident, saw another Tesla fan looking to try “Full Self-Driving” with a child and volunteered his own two children. Park told CNN Business that it was “a little hard” to get his wife to agree; she came around once he promised that he would be the one driving the vehicle.

“My kids mean more to me than anything else, so I don’t push the limits,” Park said. “I will not risk their lives in any way.”

Park’s tests, unlike O’Dowd’s, began with the Tesla at a standstill. The Tesla stopped in all of Park’s trials, including those in which his two children — one of them 5 years old — appeared in the video. Park wasn’t comfortable replicating a higher-speed test at 40 mph — like O’Dowd’s mannequin runs — with the kids.
Toronto resident Franklin Cadamuro created a “box boy,” a child-like figure made from old Amazon cardboard boxes. “Don’t blame me for what the car does or doesn’t do,” he says at the beginning of his video. “I’m a big Tesla fan.”
A “Full Self-Driving” test with the “box boy” mannequin — a child-like figure that Franklin Cadamuro made from old Amazon cardboard boxes.

His Tesla slowed as it approached the “box boy.” Then it sped up again and hit the cardboard figure. Cadamuro speculated that this happened because the cameras can’t see the short boxes immediately in front of the bumper, so the car effectively forgets they are there.

Humans learn at about eight months old that objects still exist when out of sight — many years before they’re even eligible for a driver’s license. But that ability may still elude some artificial intelligence systems, such as Tesla’s “Full Self-Driving.” Another Tesla fan reported the same result.

Cadamuro says the video started out as entertainment. But he wanted people to see that “Full Self-Driving” isn’t perfect.

“I’ve found that a lot of people have two extreme ideas about the ‘Full Self-Driving’ beta,” Cadamuro said. “People like Dan think it’s the worst thing in the world. I know some friends who think it’s close to perfect.”

Cadamuro also said he has run other experiments in which his Tesla, traveling at higher speeds, successfully maneuvered around the “box boy.”

Quickly and accurately detecting small objects, such as small children, will generally be harder than detecting large objects and adults for the camera-based computer vision systems that Tesla vehicles rely on, said Raj Rajkumar, a Carnegie Mellon University professor who researches autonomous vehicles.

The more pixels an object occupies in a camera image, the more information the system has to find features and identify the object. The system is also shaped by the data it was trained on — for instance, how many images of young children it has been exposed to.
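The pixel-count point can be illustrated with a simple pinhole-camera model. The sketch below uses hypothetical camera parameters (focal length and pixel pitch are illustrative, not Tesla’s actual specifications) to show how many vertical pixels a small child versus an adult would span at increasing distances — the child always projects to far fewer pixels, leaving the detector less to work with:

```python
# Illustrative pinhole-camera model. The focal length and pixel pitch below
# are hypothetical example values, NOT specifications of any real vehicle camera.

def projected_height_px(object_height_m, distance_m,
                        focal_length_mm=6.0, pixel_pitch_um=4.2):
    """Approximate number of vertical pixels an object spans in the image.

    Pinhole projection: image_height = focal_length * object_height / distance,
    then converted from meters to pixels via the sensor's pixel pitch.
    """
    focal_length_m = focal_length_mm / 1000.0
    pixel_pitch_m = pixel_pitch_um / 1e6
    image_height_m = focal_length_m * object_height_m / distance_m
    return image_height_m / pixel_pitch_m

if __name__ == "__main__":
    for distance in (10, 30, 60):
        child = projected_height_px(1.0, distance)   # ~1.0 m tall small child
        adult = projected_height_px(1.8, distance)   # ~1.8 m tall adult
        print(f"{distance:>3} m: child ~ {child:5.0f} px, adult ~ {adult:5.0f} px")
```

At 60 meters the hypothetical child spans only a couple of dozen pixels — roughly half what the adult gets — which is why small, distant pedestrians are the hard case for vision-based detectors.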

“Computer vision with machine learning is not 100% foolproof,” Rajkumar said. “Like any disease diagnosis, there are always false positives and negatives.”

Tesla did not respond to requests for comment and generally does not engage with professional news outlets.

“Rules of absolute wild west chaos”

After Tesla fans criticized his first tests, O’Dowd posted another video this month.

Some Tesla fans criticized O’Dowd’s use of cones as lane markers in his first test, which may have limited the sedan’s ability to steer around the mannequin. Others alleged that O’Dowd’s test driver forced the Tesla to hit the mannequin by pressing the accelerator, which is not visible in the videos O’Dowd has released. Some fans also pointed to blurred messages on the Tesla’s screen as evidence that the test driver had manipulated the tests.

Dan O’Dowd conducted an experiment using a child-sized mannequin.

O’Dowd told CNN Business that the blurred messages related to supercharging and uneven tire wear. O’Dowd did not provide clear video of what happened inside the car during the tests, so CNN Business could not independently verify what the messages said.

In the second video, O’Dowd tested the Tesla without the cones on a residential street, and included interior footage showing the speedometer. As in O’Dowd’s other tests, the Tesla hit the child-sized mannequin.
O’Dowd said in an interview with CNN Business earlier this year that no one outside Tesla is able to examine the code behind “Full Self-Driving.” The US government has no performance standards for automated driver-assistance technology such as Autopilot.

O’Dowd is the founder of the Dawn Project, an effort to make computers safer for humanity. He ran unsuccessfully for the United States Senate this year, campaigning solely on his criticism of “Full Self-Driving.”

NHTSA is currently investigating Tesla’s driver-assistance technology, so changes may be coming.

“The software that controls the lives of billions of people in self-driving cars has to be the best software ever written,” O’Dowd said. “We’re using the rules of absolute wild west chaos and we’ve got something really scary.”
