NASA Engineer Mark Rober Exposes a Huge Tesla Problem!
In a world where the future of transport is increasingly leaning toward automation, the promise of self-driving cars is as exciting as it is controversial. Tesla, one of the major players in this domain, has consistently pushed the boundaries of what autonomous driving should look like. Their decision to exclude LIDAR technology in favor of camera-based systems has sparked a heated debate across the tech and automotive communities.
Elon Musk, Tesla’s outspoken CEO, argues that since humans drive using vision and neural processing, cars should be able to do the same using cameras and advanced AI. But when this philosophy is put to the test in real-world conditions, it quickly reveals its limitations. The brain is a powerful processor, and our ability to intuitively assess movement, depth, and intent cannot be easily replicated by AI, especially when it's fed only visual input.
In recent comparisons, a Lexus equipped with LIDAR outperformed Tesla’s camera-only system in almost every significant way. Both vehicles stopped for a stationary object, such as a mannequin placed in the middle of the road, but the real difference emerged in more dynamic scenarios. One particularly telling test simulated a child suddenly running onto the road. Both cars reacted, but the Lexus did so faster and more reliably. This is likely because a LIDAR system continuously maps its surroundings in 3D, firing millions of laser pulses every second, while Tesla’s cameras must first infer depth from 2D images before a decision can be made. It’s a race between perception and reaction, and in these tests LIDAR consistently came out ahead.
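The reason LIDAR can skip the interpretation step is that distance falls directly out of the physics: each laser pulse's round-trip time gives the range as d = c·t/2, with no image understanding required. As a minimal illustrative sketch (the pulse timing below is a made-up example value, not data from the test):

```python
# Sketch of LIDAR time-of-flight ranging: distance d = c * t / 2,
# where t is the pulse's round-trip travel time.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflecting object, given the pulse's round-trip time."""
    return C * t_seconds / 2.0

# Example: a return arriving ~200 nanoseconds after emission
# corresponds to an object roughly 30 m away.
print(round(distance_from_round_trip(200e-9), 1))  # → 30.0
```

A camera-based system, by contrast, must estimate that same 30 m by inferring depth from flat images, which is where the extra latency and uncertainty come from.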
Fog and rain, common challenges for any driver, posed significant problems for Tesla’s camera-based system. Visibility dropped, and the AI struggled to interpret what was in front of it. LIDAR, in contrast, was unfazed, mapping obstacles accurately through the dense fog and heavy rain. These are not rare conditions either: in many regions the weather can change quickly, and drivers, or their automated systems, must adapt immediately. The limitations of Tesla’s vision-only system were exposed again under direct glare from the sun or other bright lights, which overexposed and essentially blinded the cameras. LIDAR navigated this challenge without issue: because it emits its own laser pulses, typically at near-infrared wavelengths, it is largely unaffected by the intensity of visible light.