YouTube removes video that tests Tesla’s full self-driving beta against real kids

YouTube has removed a video that shows Tesla drivers performing their own safety tests to determine whether the EV's (electric vehicle's) Full Self-Driving (FSD) capabilities would make it stop for children walking into or standing in the road, as previously reported by CNBC.

The video, titled “Does Tesla Full-Self-Driving Beta really run over kids?”, was originally posted on Whole Mars Catalog’s YouTube channel and features Tesla owner and investor Tad Park testing Tesla’s FSD feature with his own children. During the video, Park drives a Tesla Model 3 toward one of his children standing in the street, and then tries again with his other child crossing the street. Both times, the vehicle stops before reaching the children.

As noted on its support page, YouTube has specific rules against content that “endangers the emotional and physical well-being of minors.” YouTube spokesperson Elena Hernandez told CNBC that the video violated its policies against harmful and dangerous content and that the platform does not allow content that “represents a minor who participates in dangerous activities or encourages minors to perform dangerous activities.” YouTube did not immediately respond to The Verge’s request for comment.

“I’ve tried FSD beta before, and I would trust my kids’ lives with them,” Park says during the now-deleted video. “So I’m pretty confident that it will detect my kids, and I’m also in control of the wheel, so I can apply the brakes at any time.” Park told CNBC that the car was never traveling at more than eight mph and that he “made sure the car recognized the child.”

As of August 18, the video had over 60,000 views on YouTube. The video was also posted on Twitter, where it is still available to watch. The Verge reached out to Twitter to see if there are any plans to take it down but didn’t immediately hear back.

The crazy idea of testing FSD with real, living and breathing kids emerged after a video and advertising campaign posted on Twitter showed Tesla vehicles apparently failing to detect, and colliding with, child-sized dummies placed in front of the vehicle. Tesla fans weren’t buying it, sparking a debate on Twitter about the feature’s limitations. Whole Mars Catalog, an EV-focused Twitter account and YouTube channel run by Tesla investor Omar Qazi, later hinted at making a video involving real children in an attempt to disprove the original results.

In response to the video, the National Highway Traffic Safety Administration (NHTSA) issued a warning against using children to test automated driving technology. “No one should risk their life or that of anyone else to test the performance of vehicle technology,” the agency said in a statement to Bloomberg. “Consumers should never attempt to create their own test scenarios or use real people, and especially children, to test the performance of vehicle technology.”

Tesla’s FSD software does not make a vehicle fully autonomous. It’s available to Tesla drivers for an additional $12,000 (or a $199-per-month subscription). Once Tesla determines that a driver meets a certain safety score, it unlocks access to the FSD beta, enabling drivers to input a destination and have the vehicle’s advanced driver-assistance system (ADAS) drive there using Autopilot. Drivers must still keep their hands on the wheel and be ready to take control at any time.

Earlier this month, the California DMV accused Tesla of making false claims about Autopilot and FSD. The agency alleged that the names of both features, as well as Tesla’s descriptions of them, wrongly imply that they enable vehicles to operate autonomously.

In June, NHTSA released data about driver-assistance crashes for the first time, finding that Tesla vehicles using Autopilot were involved in 273 crashes from July 20, 2021 to May 21, 2022. NHTSA is currently investigating a number of incidents in which Tesla vehicles using driver-assistance technology collided with parked emergency vehicles, in addition to more than two dozen Tesla crashes, some of which have been fatal.


