Drift into distraction: How close are we to watching movies in self-driving cars?

Self-driving cars could allow drivers to watch movies on the motorway, under changes to the Highway Code proposed by the Department for Transport.

The planned updates aim to pave the way for the adoption of autonomous vehicles on British roads. But they have alarmed some, who fear that new regulations will be introduced before the technology is there to support them.

What is a self-driving car?

The actual definition of a “self-driving” car is hotly contested. At one end of the spectrum, simple driver-assistance technologies such as cruise control are decades old and have largely been incorporated into existing rules without difficulty. At the other end, the fully autonomous vehicle that can handle any driving a human can remains the stuff of science fiction.

Between the poles is where the differences lie. Tesla’s “autopilot” technology, for example, can follow highway lanes and handle intersections without intervention, but even drivers who pay to upgrade to what the company calls “full self-driving” must remain at the wheel and alert at all times, in case the vehicle’s software encounters something unexpected it cannot handle.

The industry uses a six-point scale, from 0 to 5, to cover the differences, and considers anything at level 3 or above to be meaningfully “automated”. A level 3 car, like a Tesla, can perform “most” driving tasks, but sometimes requires a human to take over. A level 4 vehicle, like the automated taxis being tested in San Francisco and Phoenix, can perform all driving under specific conditions, such as within a defined city area, but still retains the option of handing control back to a human. Only a level 5 vehicle, which never needs a human to take over and can be built without a steering wheel at all, is considered “full automation”.

Why would I be able to watch a movie but not use my phone?

The proposal would allow drivers to view “non-driving-related content on the integrated displays, while the self-driving car is in control”. Mobile phones would still be specifically banned, however, “due to the greater risk they pose in distracting drivers, as indicated in research”. For a level 5 self-driving car, this distinction would be moot, as drivers would never be expected to take over at all.

However, for less advanced automation, the distinction is important: the integrated display can be closely linked to the vehicle’s systems, making it easier to alert the driver that they need to pay attention to the road.

Can it be safe to watch a movie in a moving car?

If the technology delivers on its promise, it should be. A good implementation of a level 3 or level 4 self-driving car – one in which drivers are expected to take control sometimes – will account for the fact that people are naturally poor at monitoring a machine they do not need to actively control. This is known as the “automation paradox”: the more efficient an automated system is, the more crucial the human contribution becomes when it is required.

If you have a conventional car, the vast majority of your driving is likely to be routine and uneventful. But if you have a self-driving car that can handle 99% of the driving, you will only be put back in charge for the toughest 1% of situations.

Many of the setbacks to self-driving cars over the past decade have involved this problem: how do you make sure a driver is ready to take over at any moment, when the promise of the technology is to free them to do other things?

But the newer generation of self-driving cars prioritizes “safe disengagement”: pulling over to the side of the road and stopping when the car runs into difficulty, rather than handing control back to the driver at 70 mph. If such safeguards are required, it could indeed be safe to watch a movie from the driver’s seat.

Who is liable in a crash?

This is a battle still being fought. The British proposals warn that “motorists should be prepared to resume control in a timely manner if required”, which is effectively the definition of level 3 automation. In most accidents involving self-driving cars to date, the driver has technically been at fault, because they failed to take control in the split second before the crash. Drivers have been charged over accidents involving Tesla cars and Uber’s self-driving test car.

But experts have referred to human drivers in these situations as “moral crumple zones”: parts of the system designed to absorb legal and moral responsibility without having the ability to actually improve safety. “While the crumple zone in a car is meant to protect the human driver, the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator,” says Madeleine Clare Elish, who coined the term in 2019.

But will self-driving cars really come to the UK?

Level 3 automation is already on British streets, and level 4 is close. Companies in Oxford and Milton Keynes have been testing cars on the road for two years, with increasingly positive results. A simpler version of the “driverless car” could see a company pair level 4 AI with wireless broadband, allowing remote safety drivers who do not need to sit behind the wheel.

But the industry has always struggled with the hardest part of driving: other people. Dense pedestrian areas, busy unmarked intersections and pulling out into heavy traffic pose major problems that may prevent level 5 automation from ever becoming a reality.
