Why This Self-Driving Tesla Car Hit That Truck | Bumper 2 Bumper | Donut Media

**Tesla's Autonomy Day for Investors: Elon Musk Declares LiDAR a "Fool's Errand"**

In 2019, at Tesla's Autonomy Day for Investors, Elon Musk made a bold declaration.

During the presentation, Musk said, "LiDAR is a fool's errand. Anyone relying on LiDAR is doomed." For context, LiDAR stands for Light Detection and Ranging, a type of object detection sensor used by almost every manufacturer developing a self-driving car.

However, there is one exception: Tesla, Musk's own company, does not use LiDAR sensors in its self-driving cars. And six months before this episode was released, a shocking video surfaced showing a Tesla Model 3 barreling straight into a completely stationary overturned semi-truck.

The driver wasn't harmed, but claimed that the car was on Autopilot before and during the crash. This raises two important questions: why did the Tesla fail to detect the semi-truck, and how do autonomous vehicles actually see the world around them?

And would a car using LiDAR, the sensor Musk dismissed as a "fool's errand," have been more likely to avoid the same crash? These questions sit at the heart of the debate over autonomous vehicle technology.

In this episode, we'll break down the four main sensors autonomous vehicles use to perceive their surroundings, examine the role LiDAR plays in self-driving cars, and look at why Tesla's approach differs from nearly everyone else's.

Thanks to Omaze for sponsoring this week's episode of "Bumper 2 Bumper."

**Full Transcript**

- In 2019, at Tesla's (upbeat music) Autonomy Day for Investors, Elon Musk made a bold declaration.
- LiDAR is a fool's errand. Anyone relying on LiDAR is doomed.
- For context, LiDAR is a type of object detection sensor, and it's used by almost every manufacturer that's developing a self-driving car. Every manufacturer, that is, except Tesla. But just six months ago, this video was released showing a Tesla Model 3 barreling straight into a completely stationary overturned semi truck. The driver wasn't harmed, but claimed that the car was on autopilot before and during the crash. So why did the Tesla fail to detect the semi truck? How does an autonomous vehicle actually see the world around it? And would a car using LiDAR sensors, the sensors that Mr. Musk called a fool's errand, have been more likely to avoid the same crash? (music ends) We're gonna get into it. Let's go.

(upbeat music) (electricity buzzes)

Thanks to Omaze for sponsoring this week's episode of "Bumper 2 Bumper." Not yet, Doug. Not yet. Next week. Love you, though. You're my homie. If you guys couldn't tell already, we love teaming up with Omaze (soft music) because they give you, the fans, chances to win once-in-a-lifetime dream cars, all while supporting amazing causes, like the Ronald Reagan UCLA Medical Center, the same place that saved our very own Kentucky Cobra Mr. James Pumphrey's life, so we love them over there. The cars that Omaze offers are sick. I'm talking about a Porsche Cayenne GTS Coupe. A Ford F-250 that's fully customized by LGE-CTS. And how about this sweet Dodge Demon? And you could win. Just ask Sebastian, who won the Corvette Stingray we helped Omaze give away earlier this year. Hey, Sebastian. So don't miss out on the chance to win your dream car and support a great cause at the same time. Head on over to omaze.com/cars to check out some of the sickest cars. And while you're there, make a donation. 'Cause who knows? You could win the car of your dreams. Let's get back to some "B2B."

There are four main sensors (soft upbeat music) that autonomous cars use to detect and analyze their surroundings. Before we dive into exactly what might have caused the Tesla accident, we need to understand how each of these sensors works.

Probably the most common object-detecting sensor found in cars today is the ultrasonic sensor. Now, these sensors work by emitting a pulse of sound waves (device beeping) and measuring the time it takes for that pulse to reflect back off an object and return to the sensor. The more time it takes for the sound to return, the further away the object is. It's literally how bats work. (device beeping) We, of course, don't hear these sound waves because they're outside the range of human hearing. And ultrasonic sensors are cheap and often reliable, so they're probably the first type of detection system you would opt for if you were building a car. However, they do have one major drawback: they don't have a very long range. The reason sonar is so popular for marine applications is because sound travels much more effectively through water than it does through air. It's like this line of pool balls. If they're tightly packed together like water molecules, when you hit the ball on one end, that energy is quickly and efficiently transferred to the ball on the opposite end. However, if you space them out like molecules in air, when you try the same thing, (balls clinking) that energy is quickly dispersed. The energy from our initial hit can't travel very far. For this reason, ultrasonic sensors are most useful for detecting objects within about three meters of a car. Great for parking and blind-spot detection and understanding immediate surroundings, but not so great for seeing a car slam on its brakes 100 meters in front of you. (metal clanking)
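To make the time-of-flight idea concrete, here's a minimal Python sketch (not any manufacturer's actual code) of how an echo time becomes a distance; the speed of sound and the example echo time are assumed values for illustration.

```python
# Minimal ultrasonic time-of-flight sketch (illustrative values only).
SPEED_OF_SOUND_AIR = 343.0  # m/s, near room temperature (assumption)

def echo_time_to_distance(echo_seconds: float) -> float:
    """The pulse travels out to the object and back, so halve the round trip."""
    return SPEED_OF_SOUND_AIR * echo_seconds / 2.0

# A parking obstacle about 2 m away produces a ~11.7 ms round trip:
print(echo_time_to_distance(0.0117))  # ~2.0 m
# Beyond roughly 3 m, too little sound energy returns for a clean echo,
# which is the range limit described above.
```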
If only there were something like ultrasonics, but instead of sound, it used a signal that could travel through air over further distances. (whooshes) Hello? It's called radar? (letters crash) Oh, thanks, Mom.

Radar, or radio detection and ranging, works a lot like ultrasonic sensors, but it uses radio waves in place of sound waves. Because radio waves have long wavelengths, they can cut through fog, dust, and rain with little interference, allowing radar systems to work no matter the weather conditions. Now, these systems are a bit more expensive than ultrasonics, but they can detect objects from a very far distance, which is why you'll usually see them on the front of cars detecting objects further down the road. Radar is great at determining an object's location and velocity, but it's not the most accurate at determining its size or composition. Because of the nature of radio waves, something highly reflective and small, like an aluminum can, can generate a similar signal to something larger but not so bright, like your mom. (record scratches) A radar sensor can be like, "Hey, there's something over there," or, "Oh, there's something over there, something down there." But it can never be like, "Hey, that's a car, that's a guy on a bike." It's just not possible. Radar just doesn't have the resolution to differentiate objects to that level of accuracy.
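Radar ranging follows the same round-trip logic as ultrasonics, just at the speed of light, and the velocity measurement comes from the Doppler shift of the returned wave. Here's a hedged sketch; the 77 GHz carrier is a common automotive radar band, but it's an assumption for illustration, not a spec from the video.

```python
# Sketch: radar range and Doppler closing speed (assumed example values).
C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # typical automotive radar carrier, Hz (assumption)

def echo_time_to_range(echo_seconds: float) -> float:
    # Same halve-the-round-trip logic as ultrasonics, but at light speed.
    return C * echo_seconds / 2.0

def doppler_to_closing_speed(freq_shift_hz: float) -> float:
    # Monostatic radar Doppler shift: shift = 2 * v * f0 / c, solved for v.
    return freq_shift_hz * C / (2.0 * F_CARRIER)

print(echo_time_to_range(1.0e-6))          # 1 microsecond echo -> 150.0 m
print(doppler_to_closing_speed(15400.0))   # ~15.4 kHz shift -> ~30 m/s closing
```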
If only there was a system like radar that used such precise signals of electromagnetic waves that it could recreate an accurate three-dimensional reading of its entire surroundings. (whooshes) Oh, hello? My insurance rates are about to go up? That's a scam. I don't have insurance. Thought that was gonna be my mom, huh? Well, luckily, there is a system that does just that: LiDAR. (lasers buzzing)

LiDAR is a combination of the words "light" and "radar," but it is now also accepted to mean light detection and ranging. It basically substitutes the radio waves of radar with...
- Lasers.
- Yeah, actual lasers, for real. A LiDAR sensor usually sits on the roof of the vehicle, and it emits millions of pulses of light in a radial pattern to build a 3D model of its surroundings. This high-resolution model can help decipher objects in a way that would be impossible with radar systems. So while a radar or ultrasonic system can recognize that there's an object alongside you, a LiDAR system can recognize that it's a motorcycle and whether or not the rider is even wearing a helmet. However, because the lasers must use electromagnetic waves with shorter wavelengths, the light can't cut through things like heavy fog or rain. I also have to say they kinda look pretty ugly on top of a car. I mean, I don't know if you've seen them, but they're an expensive sensor that is not pretty to look at. (melancholy music) Kinda look pretty ugly on top of a car. An expensive sensor that is not pretty to look at. (melancholy music continues)
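Each of those millions of pulses comes back as an angle-plus-distance measurement, and stacking them up is what produces the 3D point cloud. Here's a minimal sketch of that conversion for a single return; the axis conventions (x forward, y left, z up) are assumptions for illustration, not any particular sensor's output format.

```python
# Sketch: one LiDAR return -> one 3D point (assumed axis conventions).
import math

def lidar_return_to_point(range_m: float, azimuth: float, elevation: float):
    """Spherical-to-Cartesian conversion for a single laser pulse (radians)."""
    horizontal = range_m * math.cos(elevation)
    x = horizontal * math.cos(azimuth)  # forward
    y = horizontal * math.sin(azimuth)  # left
    z = range_m * math.sin(elevation)   # up
    return (x, y, z)

# A return 20 m out, 10 degrees to the left, aimed slightly downward:
print(lidar_return_to_point(20.0, math.radians(10), math.radians(-2)))
# Millions of these points per second form the high-resolution 3D model
# that lets LiDAR tell a motorcycle from a guardrail.
```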
(soft upbeat music) But probably the biggest drawback of all three of these systems so far is that they can't actually see anything. If your car is going to drive itself, it needs to be able to read signs and tell if a light is red or green. If only there could be some sort of device that could... (whooshes) Hello?
- What are you talking to right now?
- Well, I'm talking to you.
- No, not me. What are you talking to right now? You're looking at it.
- Well, I guess I'm talking to a video camera. Oh ho!
- There you go.
- That was good.
- Oh, my gosh. How are we related?
- That was good, Uncle Jerry. Yeah, thanks for that. Okay, bye. Cameras. (letters crash)

Almost every autonomous vehicle integrates some sort of camera system. The reason cameras are so useful is that they're very similar to the human eye, which is what our current road network is built around. We don't use sound to tell us when to yield, we don't use radio waves to indicate where the turning lane is, and we don't use different 3D shapes to tell us when a light is about to turn red. Because of this, cameras are the first step when it comes to seeing our road systems in a very human way. The computer can use camera footage to detect lane lines, street signs, and, if it's smart enough, just about anything else.

But getting from a 2D image to a 3D interpretation takes a lot of work. Remember, an image has no three-dimensional data on its own. However, there are a couple of tricks we can use to get some three-dimensional data out of a bunch of two-dimensional images. Look at these two images. (camera shutter clicking) They were both taken by two cameras offset from each other by one meter. Notice as we switch between the two images that the objects in the foreground move more than the objects in the background. This is called stereo vision, and it's how humans use both of our eyes to perceive depth. It also shows how autonomous cars with multiple cameras can tell how far away an object is. Now, look at these two images. These were taken by the same camera. However, in the second image, the camera has moved forward a bit. Notice how objects closer to the camera once again moved a greater distance than the objects further away? This form of linear perspective can be used by a single camera as it travels through space.
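That foreground-moves-more observation is exactly what stereo depth estimation measures: the pixel shift (disparity) of an object between the two images is inversely proportional to its distance. Here's a minimal sketch of the standard pinhole-stereo relation; the focal length is an assumed value, and the one-meter baseline is the camera offset mentioned above.

```python
# Sketch: depth from stereo disparity (assumed camera numbers, illustration only).
FOCAL_PX = 1000.0   # focal length in pixels (assumption)
BASELINE_M = 1.0    # the one-meter offset between the two cameras

def disparity_to_depth(disparity_px: float) -> float:
    """Classic pinhole-stereo relation: depth = focal * baseline / disparity."""
    return FOCAL_PX * BASELINE_M / disparity_px

# Nearby objects shift a lot between the two images; far objects barely move.
print(disparity_to_depth(100.0))  # 100 px shift -> 10 m away
print(disparity_to_depth(10.0))   # 10 px shift  -> 100 m away
```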
But these tricks alone can only get you so far. They won't help you read a street sign or tell the difference between a plastic bag and a tire, which is, I guess, a common problem in autonomous cars. Making those types of interpretations requires something you've probably heard of called machine learning. (letters trilling) We don't have enough time to get into the nitty-gritty of how machine learning works, but it allows a computer program to learn and evolve over time. And if you've ever wondered why those little CAPTCHA tests always involve street signs and different types of vehicles, it's because you're helping train these AI systems. Frickin' stealing your data, dude, and you didn't even know it. I honestly just found out (laughing) about this.

The fact that camera systems rely so heavily on machine learning and are more difficult for computers to analyze in general is where the whole debate between LiDAR and cameras really kicks off, and where Tesla and seemingly everyone else disagree. A LiDAR sensor generates data that doesn't require a ton of interpretation to be useful. It can immediately inform the car's computer of an object's size and distance (fingers snap) right off the bat. Because of this, most autonomous cars in development use LiDAR as their primary means of interpreting the car's surroundings and hope to rely on cameras only to interpret signs, lane markers, and traffic signals. Now, Elon Musk, on the other hand, (slurping) he's banking that, with machine learning, the car's cameras can essentially do most of the heavy lifting, with some radar and ultrasonic sensors to help with general surroundings. It seems like his belief is that we are trying to replace human drivers who have two eyes and a brain, so we might as well use the technological equivalent of two cameras and a neural network.

So back to that accident (soft music) that we talked about in the intro. Why did this Tesla crash? And if it had a LiDAR system, would it have stopped in time? The car in question here is a Tesla Model 3, and it has 12 ultrasonic sensors, eight cameras, and one forward-facing radar system. With a range of 160 meters, it is unlikely that the forward-facing radar failed to produce a detection. The issue was more likely in how the computer interpreted that detection. Cars using radar have some issues with stationary objects. One theory suggests this is because we fly past stationary objects on the freeway all the time. Usually, they're side barricades or overpasses or signs. So the car's computer might have interpreted the overturned truck as consistent with one of these common unmoving objects (a sketch of this filtering logic follows at the end of the transcript). I mean, I can see how, with the low resolution of radar, that truck would generate a signal similar to an overpass. But as long as you have another reliable system to cross-reference, the computer should be able to determine if the approaching object has the potential for collision. And in this case, that system should have been the car's cameras. So why didn't Tesla's computer analyze the camera footage and realize that there was an overturned semi truck in the road? That's the million-dollar question, and I can't tell you. Maybe it just hadn't been trained on many situations that involved an overturned truck, so it couldn't make sense of what it was seeing. Now, if Tesla had been using a system that more precisely detected objects, like LiDAR, might it have been able to tell that the motionless object was actually a threat? I think so. LiDAR's pretty frigging good. There's a reason people are using it.

But I really hate to make any of this sound like Tesla's fault. When you're in Autopilot mode, you're supposed to still have your eyes on the road. And there are way more videos out there of self-driving cars actually saving people from accidents than there are of these very rare hiccups. So I think it's up in the air whether LiDAR will come out on top, or whether machine learning will advance enough that just a couple of cameras and a powerful computer will be able to navigate any road or scenario you throw at it. It's like iPod versus Zune. (crew laughing) Zune.

So let me know what you guys think in the comments below. Thank you guys so much for watching this episode of "B2B." You can follow us on Instagram here at Donut, all the Donut guys, all the Donut fun, @donutmedia. You can follow me on Instagram @jeremiahburton. If there's a topic you guys are interested in that you want to see here on "B2B," put a comment down below. We'll see if we can make it happen, cap'n. And until then, bye for now.
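And here's that stationary-object theory from the crash discussion as a short sketch. To be clear, this is hypothetical logic for illustration, not Tesla's actual Autopilot code: to a radar, an object fixed to the road closes at exactly the car's own speed, just like an overpass or a sign, so a plausible radar-led system suppresses those detections unless a second sensor confirms an obstacle.

```python
# Hypothetical sketch of the radar stationary-object problem described above.
# NOT Tesla's actual Autopilot logic; just an illustration of why a radar-led
# system might ignore an overturned truck.

def looks_stationary(ego_speed: float, closing_speed: float, tol: float = 0.5) -> bool:
    # Anything fixed to the ground closes at exactly our own speed.
    return abs(closing_speed - ego_speed) < tol

def should_brake(ego_speed: float, closing_speed: float,
                 camera_confirms_obstacle: bool) -> bool:
    if looks_stationary(ego_speed, closing_speed):
        # Overpasses, signs, and barricades all look like this to radar,
        # so the detection is only trusted if another sensor agrees.
        return camera_confirms_obstacle
    return True  # in this sketch, anything moving relative to the road is a threat

# At 30 m/s (~108 km/h), an overturned truck closes at 30 m/s, same as an overpass:
print(should_brake(30.0, 30.0, camera_confirms_obstacle=False))  # False -> no braking
print(should_brake(30.0, 30.0, camera_confirms_obstacle=True))   # True  -> brake
```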