Tesla drivers are testing whether the Full Self-Driving software stops for children, with varying results.
One video that did not use FSD went viral after it showed a Tesla plowing over a child-sized mannequin.
The tests are not subject to US testing standards and Elon Musk called one of the videos a "scam."
Tesla drivers are staging their own tests to determine if the carmaker's Full Self-Driving (FSD) software stops for children.
On Tuesday, a video from Tesla critic Taylor Ogan went viral after it showed a side-by-side comparison of a Tesla next to a Lexus that appears to be equipped with Luminar autonomous driving sensors. In the clip, the Tesla plows over a child-sized mannequin, while the Lexus slows to a halt mere feet from another mannequin.
The video, which was taken in May in San Mateo, California, and now has over 12 million views on Twitter, has since spurred multiple users to test how Tesla's FSD interacts with children.
However, Ogan told Insider the original video shows a failure of Tesla's Automatic Emergency Braking (AEB) system, and FSD was not in use. Tesla's AEB is designed to work at speeds between 3 and 90 miles per hour, though the company has added the disclaimer that it is "designed to reduce the severity of an impact" and not to "avoid a collision."
"Of course it's not going to stop for a child consistently on an advanced software if it's not stopping on AEB," Ogan said.
Independent tests yield varying results
Ogan has been known to repost videos of Tesla FSD gaffes. His recent viral video was posted after tech CEO Dan O'Dowd shared a similar video of a Tesla that was allegedly using FSD driving through a toddler-sized mannequin at about 25 miles per hour.
A spokesperson for O'Dowd said the CEO has been in touch with Ogan, but their videos were performed and posted independently.
O'Dowd's video spurred debates on Twitter as to whether FSD was truly activated during the test, and several users attempted to recreate the test with varying results.
The tests conducted by O'Dowd, Ogan, and other Twitter users were all done independently without the oversight of a US regulator, which means they are not subject to testing standards.
A Tesla spokesperson did not respond to a request for comment from Insider, but Tesla CEO Elon Musk called O'Dowd's video a "scam" in a comment on Twitter.
Tesla has told drivers that the system does not replace a licensed driver and instructs them to keep their hands on the wheel and be prepared to take over when the system is running.
Despite its name, FSD does not make a Tesla fully self-driving. It is an optional add-on that enables Teslas to automatically change lanes, enter and exit highways, recognize stop signs and traffic lights, and park. The software is still in beta testing and requires a licensed driver to monitor it at all times.
FSD currently has over 100,000 subscribers, whose driving Tesla can use to test the software in real time and to let the system's AI learn from experienced drivers.
Still, some drivers remain skeptical of the technology. One Twitter user posted a video that appeared to show Tesla's FSD identifying a cardboard cut-out of a child and slowing for it multiple times, both when it was stationary in the street and when it was in motion.
A YouTuber also tried his own test with a balloon imitation of a child. In the video, the Tesla slams into the dummy as it moves across the road.
"Generally speaking, I am very pro-Tesla and I really don't want to expose a situation like this, but I have to state the facts," the YouTuber known as TechGeek Tesla said. "Tesla has some work to do here."
The driver said he felt his figure was representative of an actual child, though he noted that the dummy did not move like a real person.
Tesla fan blog @WholeMarsBlog reposted a series of videos that appeared to show Tesla FSD stopping for pedestrians.
"I drive with FSD Beta with my kids in the car all the time," Twitter user @TeslaDriver2022 told EV blog Teslarati. "I see how safe it is. It's safer than anything else that's out there."