Camera vs. LIDAR: YouTuber Mark Rober’s Tesla Autopilot Test

YouTuber and ex-NASA engineer Mark Rober just put Tesla’s camera-only self-driving approach through the wringer—and the results aren’t pretty. In his latest video, Rober tackles head-on the company’s decision to ditch LIDAR and radar, a choice Elon Musk once defended by dismissing LIDAR as “fricking stupid” and unnecessarily expensive.

Tesla stands alone in this vision-only gamble. Every other serious player in the autonomous game—Waymo, Cruise, Zoox—relies on multiple sensors working together, combining cameras with radar and LIDAR so that each sensor can cover for the others when one fails or gets fooled. Tesla’s thinking seems simple enough: humans drive using just their eyes, so why can’t AI?

Well, Rober’s tests suggest why not.

The Road Runner Test

In one eye-opening demonstration, Rober set up a wall painted with a realistic road image—think Wile E. Coyote cartoon territory. The Tesla, totally fooled by this optical illusion, crashed into it at full speed without even tapping the brakes.

“I can definitively say, for the first time in history, that Tesla’s optical camera system will absolutely obliterate a fake wall without even a tap on the brakes,” Rober jokes in the video.

Sure, you won’t find cartoon tunnels painted on real highways. But this raises a scary thought—if a simple printed image tricks the system this badly, what happens with sun glare, weird shadows, or digital billboards?
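Part of the answer is redundancy. Here’s a toy Python sketch (purely illustrative, not any automaker’s actual code) of why a fused sensor stack is harder to fool than a camera alone: a painted wall may blind the camera, but radar and LIDAR still measure a solid surface ahead.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    distance_m: float  # estimated distance to the obstacle
    confidence: float  # detection confidence, 0.0 to 1.0

def should_brake(detections: list[Detection], threshold: float = 1.2) -> bool:
    """Toy fusion rule: brake when the combined confidence of all
    sensors reporting an obstacle crosses a threshold. A single
    fooled sensor matters less when the others can outvote it."""
    return sum(d.confidence for d in detections) >= threshold

# A visual illusion may leave the camera blind, but radar and LIDAR
# still register a solid surface ahead, so the fused stack brakes.
fused = [Detection("radar", 29.5, 0.8), Detection("lidar", 29.8, 0.95)]
print(should_brake(fused))        # True

# A camera-only stack has nothing to outvote a fooled camera.
camera_only = []                  # the fooled camera reports no obstacle
print(should_brake(camera_only))  # False
```

Real stacks use far more sophisticated probabilistic fusion than this, but the principle holds: independent sensors tend to fail in different ways, so agreement across them is hard to fake.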

The Mannequin Showdown

It gets worse. In another test, Rober had both a Tesla and a LIDAR-equipped Lexus approach a child-sized mannequin standing in the road.

The Lexus spotted the “child” immediately and stopped. The Tesla? It recognized something was there but plowed right through it anyway.

This is exactly what safety experts have warned about. Cameras struggle in bad weather and tricky lighting, and they can’t gauge depth as accurately as LIDAR. Yet Musk keeps pushing ahead with plans for an unsupervised version of Full Self-Driving (FSD) and dreams of robotaxi fleets.
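That depth gap comes down to physics: LIDAR measures distance directly by timing a laser pulse, while a camera has to infer depth from 2-D pixels. A quick back-of-the-envelope Python sketch (illustrative numbers only):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """LIDAR's ranging math: time a laser pulse's round trip,
    multiply by the speed of light, and halve it (out and back)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after ~200 nanoseconds puts the object
# roughly 30 meters away, measured directly rather than inferred.
print(f"{lidar_distance(200e-9):.1f} m")  # ~30.0 m
```

A camera has no equivalent direct measurement; its depth estimates come from machine-learned inference, which is exactly what an optical illusion can exploit.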

Testing Flaws and Real-World Concerns

Rober’s test isn’t perfect. He used an older version of Tesla’s Autopilot, not the newest FSD software that runs on cars made since 2023. Would the latest version have done better? Maybe—but we don’t know from this video.

What we do know is that no fully driverless car currently on public roads relies solely on cameras. Every successful autonomous vehicle uses a mix of sensors. Tesla’s unique approach has potential, but it also creates unique risks.

The real issue isn’t whether a Tesla can spot a fake painted tunnel—it’s whether it can tell the difference between a plastic bag blowing across the road and a toddler. NHTSA has already investigated numerous crashes involving Tesla’s Autopilot, some fatal, where the system missed obstacles or reacted too late.

If Musk really plans to release an unsupervised self-driving system this year, we need answers:

  • Can a camera-only system ever be truly safe?
  • How does it stack up against competitors using multiple sensors?
  • Who’s responsible when AI makes a deadly mistake?

For now, Rober’s experiment leaves us with an uncomfortable truth: Tesla’s vision-based system might work when everything’s perfect, but life rarely is. And for technology making life-or-death decisions, “mostly works” isn’t good enough.

Until Tesla can prove its cameras can handle everything from bright sun to midnight fog, its fully autonomous future remains just that—future tense, with safety still a big question mark.

Watch Rober’s YouTube video here, or visit his YouTube channel.

