Tesla Has a Bicycle Problem

The Never-Ending Drama of Tesla's Full Self-Driving Software

As we all know, only a few things in life are certain: death, taxes, and Tesla stans willing to defend Elon and his technology no matter what happens. We also know that sometimes truth is stranger than fiction, and this is certainly one case where that adage rings true.

Case in point: a couple of those stans just posted video evidence of one of the most glaring safety problems with Elon's machines while trying to prove the exact opposite. On today's Wheel House, we're diving into the never-ending drama coming from Tesla and its Full Self-Driving software.

A huge thank you to NordVPN for sponsoring today's video. Mr. Nord, please give them one more chance. I understand, thank you. Mr. Nord is tired of calling me, literally. He can't sleep at night knowing that you're not taking your online safety seriously. We talked about this back in February. You gotta protect your online privacy by using NordVPN.

It's got an easy-to-use Chrome extension connecting you to super-fast service all around the world, so you can browse the internet privately. But NordVPN isn't like other VPNs: they released a new feature called Threat Protection, which gives you even more comprehensive security against malicious ads and harmful websites, and it even blocks trackers.

Speaking of tracking, NordVPN doesn't share or collect your data or do any tracking whatsoever. So you can safely Google how to beat Radagon of the Golden Order without anyone knowing that you're cheating to beat Elden Ring. Don't wait: download NordVPN on Windows, Android, macOS, iOS, and even Linux. And the best part is it's all risk-free thanks to Mr. Nord's 30-day money-back guarantee.

So get our exclusive NordVPN offer by going to Nordvpn.com/donut. Check it out and stay safe out there.

Now, let's get back to the show. In case you missed it, Galileo Russell and Omar Qazi recently decided to take their Tesla out for a spin to see how Tesla's newest self-driving software update would handle the crowded streets of San Francisco.

In the video, the guys confidently cruised around praising the new tech when suddenly something truly ridiculous happened: the car swerved toward a cyclist. "Are we gonna have to cut?" one of them asked. A near collision with a cyclist is bad enough, but the clearly rattled men went on to essentially defend the mishap: "I mean, that's what you're supposed to do." "Yeah, that was, like, fine." It's like something out of a bad sitcom.

The origins of this debacle can be traced back to October 2021, when Tesla's Full Self-Driving version 10.3 update implemented the rolling stop feature, which, blatantly against the law mind you, lets cars roll through stop signs at intersections without coming to a complete stop if Autopilot doesn't detect other cars or pedestrians nearby.

If you're saying to yourself, "Wow, testing this on public roads sounds like a really, really bad idea," then you won't be surprised to learn that just a few weeks ago Tesla issued a recall for nearly 54,000 vehicles to disable rolling stops.

All of this self-inflicted drama raises the question: how are we supposed to be any more confident in FSD software when it's blatantly obvious that there are still serious problems? And more importantly, considering this has been a years-long issue, is it even ethical for manufacturers to test self-driving technology on public streets?

As it turns out, we're not the only ones concerned about this problem. Tesla is facing allegations in multiple countries that its technology is putting not only drivers but also the general public at serious risk.

Shortly after the recent recall, US senators on the Commerce, Science, and Transportation Committee penned a letter to Elon saying:

Advanced driver assistance must be implemented reasonably and comply with existing traffic laws. When these systems do not meet the essential requirements, they put all of those who use our roads at risk of injury or death.

The committee was also influenced by a report last August, when the National Highway Traffic Safety Administration announced that it is investigating Tesla's Autopilot system in a probe that applies to roughly 765,000 vehicles built since 2014. One especially concerning element of the NHTSA report is that a total of 12 Autopilot-enabled Teslas have struck emergency vehicles that were stopped or parked, often at the scene of an earlier accident, resulting in 17 injuries and one death.

The report notes that in every Tesla crash analyzed, some version of the self-driving software was active when the collision occurred, stating: "The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes."

The most recent of these crashes came in late August, when the Florida Highway Patrol said a state trooper who had stopped to help a disabled motorist on the highway was struck by a Tesla that the driver said was in Autopilot mode. According to the police report, the trooper "narrowly missed being struck as he was outside of his patrol car."

Okay, this is pretty bad. So how do lawmakers intend to shape any new FSD legislation? Well, for starters, the terms "Autopilot" and "Full Self-Driving" are misleading. Tesla Autopilot keeps the car centered in its lane while maintaining a set speed, but drivers are still responsible for looking out for things in the road like parked cars, animals, and, I don't know, flesh-and-blood real human beings. So despite being able to somewhat relax in the driver's seat, you're still required to keep your hands on the wheel. Videos like the biker incident, as I'm gonna call it from now on, are the strongest proof: if these guys weren't paying attention, they could have seriously messed that guy up.

With this in mind, the Department of Transportation is also reviewing Tesla's use of the term Full Self-Driving, and in Britain the highway commissions have recommended new laws to stop driver-assistance features from being marketed as self-driving.

There is also growing momentum to change laws regarding traffic accidents when FSD is in use. The Law Commission for England and Wales and the Scottish Law Commission have proposed creating an Automated Vehicles Act to reflect the "profound legal consequences" of self-driving cars. They're suggesting that the person in the driver's seat could no longer be responsible for infractions or accidents; instead, the company that obtained authorization for the self-driving vehicle would face the regulatory sanctions.

Many other industries are grappling with how to regulate artificial intelligence similar to self-driving systems. For example, in the medical industry, if someone is misdiagnosed or, worse, dies because of a glitch in AI technology, who's responsible: the manufacturer or the doctor? I gotta think that if laws like these start to pass, we'll see far fewer manufacturers eager to release FSD features until they're much, much safer.

One of the best places to gauge how FSD is evolving is right here in California. There are currently over 50 companies licensed to test autonomous cars on the streets of the Golden State. The reason Tesla is being singled out by critics is the fact that Elon is using his own customers, rather than trained safety drivers, to monitor his self-driving technology. That's right: before the recall, there were some 60,000 customers using Tesla's FSD beta software on public roads around the world. Considering the startling NHTSA report and the recall, the California DMV said that it was "revisiting its decision not to regulate Tesla's FSD beta testing."

All of this legalese makes a guy wonder: is there really that much FSD market demand? While it may not be for me, there's obviously enthusiastic interest in self-driving technologies. Outside of the industry, public figures such as Andrew Yang have predicted a near future where most of the 2 million semi trucks we see on the highway will be driven by AI. He also predicted that he would win the presidency, and we know how that turned out. But even if that happens, clearly we're still gonna need human drivers behind the wheel as a safety precaution. I mean, think about trains. A lot of them are fully green and computer-driven, but they still need an engineer to oversee the process. And a train is on a freaking track, not in the middle lane of the 101 during rush hour.

Speaking of trains, even if FSD tech were flawless, it still wouldn't solve the issue of urban gridlock. Just because your car is driving itself in traffic doesn't take away from the fact that you're in traffic. If we invested more in public transit, there would be less traffic, and maybe your drive to work would be something you actually want to do. We all love the idea of less pollution, but if it doesn't affect my brutal commute, is it really the revolutionary step forward that Elon says it is?

There's also the issue of implementing speed limits and other traffic laws in FSD systems for driving in major cities. If self-driving cars all stick to the speed limit, not to mention the many other rules humans often disregard, they would be a serious problem. Going too slow on the freeway, for example, would actually make traffic worse and possibly result in more accidents. Now, to give Tesla credit, I assume that's what they were thinking when they came up with the rolling stop feature: hey, how about we make the car think more like a person? But until they significantly dial in the more important aspects of human instinct, like not suddenly swerving into a cyclist, they should save their gimmicky, potentially dangerous human-instinct options for use down the line.

The cyclist debacle isn't a total disaster for Tesla, but with the crashes piling up and federal regulators investigating, it is another gut check. The biker video shows that FSD tech is not being deployed or tested responsibly. And don't think I'm against the idea of self-driving technology. I'm not. I'm just saying that letting Tesla customers experiment with risky new features on public roads is more than a little problematic. There are many ways to approach the problem of automated driving, but we clearly have a ways to go before we find a proper solution. And, you know, we'll keep you posted.

Go get yourself our new tool shirt at donutmedia.com. That's right, we came out with our all-new tools shirt. It's made out of 100% cotton, so it's the perfect shirt to wear while you work on your car, or to give to that one friend who forgets the name of that one tool. Maybe it'll help you get that project car up and running that you've been planning on working on. This is possibly my favorite shirt we've ever made. First off, I love the color purple, and it looks sick against this gray. And it's like a cheat sheet: if you're in automotive school, have the person who sits in front of you wear this shirt and you'll get everything right on the test. Donutmedia.com, get you one today. Trust me, best shirt ever.

A big thank you to you for watching Wheel House. Let me know what you think of self-driving technology down in the comments. Do you want a car that drives itself? You might, you might not; I wanna know what your opinion is. I still like driving cars, but maybe my kids will be like, "That's so lame, why would you drive it yourself? You're old." I don't know, we'll see. If you liked the video, hit the like button, and subscribe to the channel if you liked it even more. Thanks. All right, be kind, see you next time.