Geofences: The Invisible Walls Surrounding Autonomous Cars

May 3, 2019

When Robert Frost wrote “Good fences make good neighbors,” he wasn’t being entirely sincere. The words are actually a quote from a neighbor, and Frost, or at least Frost’s narrator, isn’t so sure that’s good advice. Before building a wall, Frost wants to know “What I was walling in or walling out.” The neighbor’s platitude, though dubious in many instances, is probably good advice when it comes to autonomous cars.

Most companies attempting advanced driving automation follow certain prerequisites to keep things safe. Vehicles at Level 4 or higher autonomy (i.e., a car that drives itself in most situations) currently need to have some idea of the road they’re driving on, and they typically need some limit on speeds. In practice, this means restricting cars to areas already mapped in detail using LIDAR and other sensing techniques.

As reported by The New Yorker in an article about ex-Google/ex-Uber head of autonomous driving Anthony Levandowski, deviating from known and predictable environments can have severe consequences:

One day in 2011, a Google executive named Isaac Taylor learned that, while he was on paternity leave, Levandowski had modified the cars’ software so that he could take them on otherwise forbidden routes. A Google executive recalls witnessing Taylor and Levandowski shouting at each other. Levandowski told Taylor that the only way to show him why his approach was necessary was to take a ride together. The men, both still furious, jumped into a self-driving Prius and headed off. The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario.

The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.

It’s probably no surprise that Google’s current self-driving arm, Waymo, began its more public-facing rollout of self-driving cars in places like Chandler, Arizona, which has an almost perfect grid layout and low city speed limits. Riders in the area can use Waymo to get around as part of a pilot program there, but if they want to take one of the company’s self-driving Pacificas anywhere, they’re limited to where Waymo says the vehicle can go.

Want to get a bucket of sunflower seeds at Walmart? Waymo will happily take you there. Want to grab lunch at the Space Age Restaurant in Gila Bend? Better call your weird uncle, because that’s currently outside of Google’s geofence.

What Is A Geofence?

A geofence is just an artificial spatial boundary with a fancy name. A familiar example from your childhood might be your mom instructing you, as you run out the front door, to “stay between Mr. Johnson’s mailbox and Mrs. Hernandez’s minivan.” A modern example of a geofence would be the “no-fly zones” placed around public buildings, airports, and other landmarks to prevent drones from going where they’re not supposed to go.

In either case, there’s no physical fence in place, but rather a series of coordinates or landmarks delineating where it is and isn’t okay to fly, or where your mom has deemed it safe to play. There are numerous ways to set a non-physical spatial barrier–ask any puppy whose owner has installed an invisible fence. But since the U.S. military stopped intentionally degrading civilian GPS accuracy in 2000, the most common way to create a geofence has been to simply tell connected devices which range of geographic coordinates to avoid.

When a self-driving vehicle approaches 33.2335702/-111.85853320, it knows that, for whatever reason, it’s been programmed to turn around and isn’t supposed to go any farther. While this may be annoying for you as a passenger, there are good reasons why you want your self-driving car to be limited.
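In its simplest form, that check is just a coordinate comparison. Here’s a minimal sketch in Python, assuming a rectangular service area; the boundary values are hypothetical, and a real system would use polygons (or corridors along mapped roads) rather than a simple bounding box:

```python
# Minimal geofence sketch: a rectangular service area defined by
# latitude/longitude bounds. All boundary values are hypothetical.
LAT_MIN, LAT_MAX = 33.24, 33.36
LON_MIN, LON_MAX = -111.97, -111.77

def inside_geofence(lat: float, lon: float) -> bool:
    """Return True if the coordinate falls inside the permitted area."""
    return LAT_MIN <= lat <= LAT_MAX and LON_MIN <= lon <= LON_MAX

# The coordinate from the example above falls just outside this
# hypothetical box, so the vehicle would be told to turn around.
print(inside_geofence(33.2335702, -111.8585332))  # False
```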

Why Most Self-Driving Cars Use Them

As a human being, you can drive somewhere without a physical map, and you only need a vague sense of where you are to do so safely. If you’re German and travel to the United States, you can roughly understand what to do when you pull your rental car out of the hotel parking garage, and that understanding is usually enough to keep yourself and others safe.

For driverless cars, though, it’s important that they understand precisely where they are in space at all times. Watch this TED Talk from a Google engineer and you’ll see why. The self-driving car takes a pre-existing, detailed map of the world and projects its sensor data on top of it, giving the car enough information to make the safest possible decision about where to go.

So, first and foremost, geofencing limits vehicles to areas where the company running the driverless car feels comfortable it has properly mapped the environment. Ford, for instance, has explicitly said it won’t let its vehicles out of strictly geofenced urban zones anytime soon. This mapping constraint is especially important when a vehicle loses its ability to download data over a satellite or cellular network; in those situations, restricting the vehicle to an area it has sufficient onboard mapping data for becomes critical.
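As a sketch of that last point: before committing to a route, a planner might confirm that every map tile along the way is already cached onboard. Everything below is hypothetical (the tile scheme, the cached set, the function names); it only illustrates the gating logic:

```python
from math import floor

# Hypothetical set of map tiles already cached onboard, keyed by a
# very coarse 1-degree (lat, lon) tile ID purely for illustration.
ONBOARD_TILES = {(33, -112), (33, -111)}

def tile_id(lat: float, lon: float) -> tuple[int, int]:
    """Bucket a coordinate into its 1-degree tile."""
    return (floor(lat), floor(lon))

def route_is_covered(waypoints) -> bool:
    """Allow a route only if every waypoint lies on a cached tile."""
    return all(tile_id(lat, lon) in ONBOARD_TILES for lat, lon in waypoints)

print(route_is_covered([(33.23, -111.86), (33.30, -111.84)]))  # True
print(route_is_covered([(32.90, -112.50)]))                    # False
```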

“The whole purpose of AVs is to guarantee mobility for everyone while at the same time ensuring higher safety than conventional human drivers,” says Dr. Francesca Favaro, an assistant professor at San Jose State University who studies safety issues related to autonomous vehicles. “To live up to that expectation you need to have a higher reliability of the vehicle itself, which needs to cope with hundreds of possible failure scenarios.”

There’s also one big, less obvious reason: Not every community or state has rules allowing self-driving cars. Even if the technology is good enough, it’s not worth the legal risk to run a car somewhere until there’s some sort of local blanket approval.

“Geofencing is becoming quite popular not just for AVs but for aerial autonomous systems as well,” says Dr. Favaro. “If you put yourself in the shoes of a regulator or policy-making person, it is a lot easier to allow testing of those technologies if you can restrict in which areas they are going to operate.”

This report from the Brookings Institution shows the differences in self-driving laws across the country. Texas, for instance, had already passed laws relating to the operation of driverless cars, whereas neighboring New Mexico hadn’t even taken up a law as of the time of the report.

While I love the idea of some sort of autonomous Smokey & The Bandit moment, with a manned police car chasing down a self-driving car, my guess is that any company developing self-driving tech for public streets is smart enough not to let that happen.

What You Can Learn From Geofences

A vehicle with advanced driver assistance systems (ADAS), like a Tesla Model S or a Cadillac fitted with Super Cruise, can go wherever its driver wants and doesn’t need to be fenced in; these are not fully self-driving cars, and their assistance features are theoretically limited to use in certain situations.

By looking closely at where an autonomous vehicle can go, you can also determine a lot about the sophistication of the system, which is why some carmakers are hesitant to say explicitly where their geofences begin and end.

If a company won’t allow its car to drive in areas with speed limits above 35 mph, or in markets likely to see snow in the winter, we can make some assumptions about that vehicle’s level of development.

How Long Will Autonomous Cars Be Fenced In?

Imagine an owner of a self-driving Kia from Brighton, in England, decides she wants to pop down to Calais, in France, for dinner via the Chunnel. Because her car drives itself, she plans to catch up on some work both on the ride down and on the French side. One problem: Her Kia is programmed only for the UK, and when she arrives in France it’s suddenly shocked to see everyone driving on the wrong side of the road.

For reasons like this one, truly self-driving cars will probably be “fenced in” for the foreseeable future. While well-mapped urban areas in geographically large countries like the United States may become easier and easier to navigate, self-driving cars will inevitably be asked to go places where driving customs, signage, and laws can vary considerably over small distances.

While the instinct to push technology forward by breaking things can be a good one, walling in driverless cars so they can operate safely is a practice that’s hard to fault, because most humans do the same thing. I learned to drive by moving my grandpa’s wagon around a junior high parking lot, then venturing out into the neighborhood, and finally onto the highway and open road. There’s no doubt that self-driving cars can be programmed to deal with different laws and different driving behaviors, but we’re still in a period of development where safety is the highest priority, for both moral and practical reasons; no matter how good your tech is, it’s not a good look to kill someone with your self-driving car.