Mark Rober recently sparked heated debates with his latest YouTube experiment comparing Tesla’s Autopilot system to a LiDAR-equipped vehicle. The video, which quickly amassed over 10 million views, aimed to test the reliability of Tesla’s camera-only approach by setting up an obstacle—a painted wall—and observing how the two vehicles reacted.
In the test, the LiDAR-equipped vehicle detected the wall and came to a stop, while the Tesla Model Y, operating with its vision-based Autopilot, drove straight through the obstacle. Rober presented this as a major flaw in Tesla’s reliance on cameras rather than LiDAR technology. However, the video soon came under scrutiny, with many questioning whether the test was conducted fairly.
Backlash and Fact-Checking
Critics and Tesla enthusiasts pointed out inconsistencies in the test, questioning whether Autopilot was actually engaged or if Tesla’s Full Self-Driving (FSD) mode was used. Others noted that the lack of standard Autopilot indicators on Tesla’s screen raised doubts about the experiment's authenticity. Many Tesla supporters accused Rober of misrepresenting the capabilities of Tesla’s self-driving technology, arguing that proper use of Autopilot might have prevented the crash. Others defended Rober’s approach, stating that his test highlighted real concerns about Tesla’s camera-only system compared to LiDAR-based solutions used by other autonomous vehicles.
The controversy quickly spread across social media, particularly on X (formerly Twitter), where users debated the validity of Rober’s test.
Blackout Trades argued that Tesla should pull data from the car itself to verify exactly what happened, including how many attempts were made, the speeds involved, and whether Autopilot was actually engaged.
You know what really needs to happen with this Mark Rober fiasco? Tesla needs to pull the data from the car itself and show us exactly what happened. How many attempts, speeds, Autopilot engagement/disengagement, the whole thing. I know what the data will show, that this was a… pic.twitter.com/bD0uATJW8B
effTutorials accused Rober of making false claims, particularly his statement that FSD does not engage without a destination and his implication that Autopilot and FSD are the same thing. He even suggested that Tesla should sue Rober for misinformation.
Mark Rober claims FSD doesn’t engage without a destination — this is false
Mark Rober claims Autopilot and FSD are the same thing — this is false
Can Tesla please sue this guy already? pic.twitter.com/uGUGOCHKgB
Algo Trader expressed disappointment in Rober’s test, arguing that in 2025, people are too knowledgeable about Tesla’s systems to be misled. He claimed that Rober did not use FSD properly and that nothing was actually enabled when the test was conducted. Parker echoed this sentiment, saying that while he did not want to believe Rober was being deceptive, the footage clearly showed that Autopilot was not engaged when the car hit the wall. Tesla Owners Silicon Valley went a step further, calling Rober a fraud and claiming that FSD was only engaged three seconds before the collision.
I always liked him, but it’s 2025 and you can’t make shit up anymore. The @Tesla Model Y was the best selling car in the world last year, people know how it works unless they’re intentionally being ignorant.@MarkRober - you did not use FSD, and nothing was even enabled when you… https://t.co/yF39g4HeCF pic.twitter.com/sTCQ2uz7G8
Rober responded by releasing raw footage of the test, stating that he was unsure why the system disengaged 17 frames before impact but that his feet were not touching the brake or gas. Meet Kevin, a well-known finance YouTuber and investor, suggested that Tesla could sue Rober under the Lanham Act, which covers misleading advertising. He argued that Tesla would not even have to prove financial damages—only that the video had the potential to deceive a significant portion of its audience.
Here is the raw footage of my Tesla going through the wall. Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas. pic.twitter.com/ddmeyqO3ww
Alex Finn focused on what the video meant for Rober’s personal reputation. He noted that Rober had spent years building a strong presence on social media, becoming one of the most well-respected science communicators online. Yet with this one 18-minute video, Finn claimed, Rober had “obliterated” his credibility. He framed the incident as a cautionary tale of greed, deception, and cowardice, strongly suggesting that Rober had knowingly manipulated the test to push a misleading narrative.
This is Mark Rober
He spent years building one of the strongest reputations in social media. Loved by tens of millions of followers
Yesterday he obliterated it all with one 18 minute video
It's a story of greed, deception, and cowardice
Here's the story of what happened: pic.twitter.com/FQTl0FAtrb
What This Means for Self-Driving Tech
Beyond the controversy, the situation has reignited discussions about the future of self-driving cars. Tesla CEO Elon Musk has consistently argued that LiDAR is unnecessary and that a camera-based system, combined with artificial intelligence, is the best path forward. Meanwhile, companies like Waymo continue to invest heavily in LiDAR, believing it provides superior obstacle detection. With the conversation around autonomous driving heating up, Rober’s experiment—whether flawed or not—has added another layer to the debate over the technology’s future.