It's scarier than I thought.
I tried not to let it show, but when the car first took over the wheel I wanted to scream, "Stop, let me out!"
And I'm not alone. It turns out that many people are leery of riding in one.
But I forced myself to sit passively, and it wasn't long before fear turned to amazement.
Its lane-keeping was first-rate. We flowed flawlessly with the traffic.
It maintained a remarkably consistent distance from the car in front of us, and every time it needed to slow down, it braked with smooth precision.
The car was driving as well as I could. Better, actually.
It made turns with uncanny skill, and at times it even seemed to know when each traffic light we approached would change.
It automatically eased off the gas just enough that by the time we reached the intersection the light had turned green and traffic was moving again, yet we still had momentum.
That saves fuel.
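The timing trick behind that fuel saving is simple arithmetic: if the light is d meters ahead and turns green in t seconds, coasting at roughly d/t meters per second arrives just as it changes. A minimal sketch (the function name and the numbers are mine, purely illustrative):

```python
def glide_speed(distance_m: float, seconds_to_green: float, current_mps: float) -> float:
    """Speed (m/s) to coast at so the car arrives just as the light turns green.

    The car only eases off; it never speeds up to beat the light.
    """
    target = distance_m / seconds_to_green
    return min(current_mps, target)

# 200 m from the light, green in 10 s, currently doing 25 m/s (~90 km/h):
print(glide_speed(200, 10, 25))  # 20.0 -> ease off and keep rolling
```

If the car is already slower than the target speed, it simply holds its pace.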
My apprehension melted away and I started to trust this thing, this amazing machine. I didn't have to pay attention to the road at all. And that gave me time to think about the staggering implications.
I wondered, could some future version of this car take the kids to school by itself? Pick them up?
What sort of control would a parent need, and how could it be enforced? What about unauthorized drivers… er, users?
And then I thought, once this technology matures, how will the car know when and where it's OK to take which people?
Above all, I wondered: does the end of driving mean the end of driver's licenses? What's the point of a license to drive once cars don't even have pedals or steering wheels?
And then it hit me:
Self-driving cars will lead directly to a mandatory biometric ID system, even for kids. In fact, especially for kids.
I mean, if the car is picking up children on its own, how can it know it's OK to go unless it can precisely identify everyone in the car? Most especially someone who's not supposed to be there.
It must know who's who, no matter who gets in.
Perhaps one of the greatest benefits of autonomous vehicles will be the increased safety they offer. For example, they never follow too closely, fail to signal, or brake too late.
Intentionally dangerous driving, the need for high-speed police pursuits and the emotional triggers for road rage will all but disappear.
Although there may be perceptual or technical malfunctions that lead to accidents, many experts assert that the overall rate of failure, and therefore the carnage on the highways, will be substantially reduced.
Autonomous cars measure traction on slippery roads, and manage speed, acceleration and braking far better than humans. They don't get tired. And they don't fall asleep at the wheel.
Autonomous operation will not be limited to just cars. Trucks, buses and all sorts of local delivery vehicles will also drive themselves, further reducing costs and increasing safety.
And they'll do it more efficiently, thereby reducing fuel consumption.
School buses will still need adult supervision to make sure the precious little monsters don't get out of hand or hang out windows, but now that person will be able to pay full attention to the kids because the bus drives itself.
Of course, this is of little comfort to professional drivers worldwide, most of whom stand to lose their livelihoods or be forced into radical career shifts. That's not an immediate threat to them, but it is clearly on the horizon.
However, these concerns are offset by the tremendous increase in safety. For example, when picking up or dropping off kids, the school bus will transmit that fact to all the cars near it, and they will all come to a safe stop.
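To make the bus scenario concrete, here's a toy sketch of that broadcast. The message format, field names, and crude distance math are my own invention, not any real vehicle-to-vehicle standard (real deployments use signed messages over protocols like DSRC or C-V2X):

```python
from dataclasses import dataclass

@dataclass
class V2VMessage:
    sender: str
    event: str        # e.g. "CHILDREN_LOADING"
    lat: float
    lon: float
    radius_m: float   # cars inside this radius must come to a stop

def should_stop(msg: V2VMessage, car_lat: float, car_lon: float) -> bool:
    """Decide whether a nearby car must stop for the bus.

    Crude flat-earth distance check (~111,320 m per degree); good
    enough for a sketch, not for navigation.
    """
    dlat = (car_lat - msg.lat) * 111_320
    dlon = (car_lon - msg.lon) * 111_320
    dist = (dlat**2 + dlon**2) ** 0.5
    return msg.event == "CHILDREN_LOADING" and dist <= msg.radius_m

bus_msg = V2VMessage("bus-42", "CHILDREN_LOADING", 40.0000, -75.0000, 100.0)
print(should_stop(bus_msg, 40.0003, -75.0000))  # ~33 m away -> True
```

A car a kilometer away would ignore the same message and keep driving.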
This is awesome technology, but nobody's talking about how it might catalyze a very Orwellian outcome.
Many countries do not have a national ID system. Instead they rely on driver's licenses as a de facto standard ID.
This is especially true in the United States. And it's worked well for decades because it's a very car-oriented culture, so a license is something more or less everyone is assumed to have.
But soon enough no one will need one; a license to drive will become pointless. Something else — almost certainly something biometric — must necessarily take its place.
The thing to understand about biometric ID is that you don't carry anything like a card: your body itself is the ID.
Your identity is verified by comparing a real-time scan of you with information previously stored in a central authoritative database. If the scan matches the BioID stored in the government computers, then it's you.
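In outline, that verification is a nearest-match search over stored templates. The sketch below uses toy three-number "templates" and a cosine-similarity threshold of my own choosing; real systems store high-dimensional face, iris, or fingerprint embeddings, but the shape of the comparison is the same:

```python
import math

# Hypothetical enrolled templates: name -> feature vector
# (real systems store biometric embeddings, not toy tuples)
ENROLLED = {
    "alice": (0.12, 0.80, 0.33),
    "bob":   (0.90, 0.10, 0.55),
}

def cosine_similarity(a, b) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(scan, threshold: float = 0.95):
    """Return the best-matching enrolled identity, or None if no
    template is similar enough to the live scan."""
    best_name, best_score = None, 0.0
    for name, template in ENROLLED.items():
        score = cosine_similarity(scan, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

A scan close to Alice's stored template returns "alice"; a scan that matches no one falls below the threshold and returns None, which is exactly the "who's not supposed to be there" case.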
Once everyone has been biometrically identified the temptation to apply such technology to almost everything will be overwhelming.
For example, it could eliminate the need to carry credit or bank cards, and with their demise we'll finally have an effective fix for our fraud-riddled credit card system.
Plus, there'll be no need for passwords because everything from your smartphone to the front door of your house will simply scan you.
The problem is, the system will also track and record everyone all the time. No one will be able to avoid this.
By the time you're old enough to walk you'll be biometrically immortalized in a myriad of databases.
Cars need enough intelligence to understand that it's not OK for hyperactive little 10-year-old Billy to order it to take him to town alone, or to change the destination after road rhythms lull his family to sleep.
On the other hand, it must also know it's OK to go if 7-year-old Mary says, "Emergency, take me to the hospital!"
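The Billy-versus-Mary distinction boils down to a policy check layered on top of biometric identification. This is a toy rule set of my own devising, just to show the logic, not how any real vehicle decides:

```python
ADULT_AGE = 18

def may_depart(passengers, requester: str, emergency: bool = False) -> bool:
    """Return True if the car should obey `requester`'s order to go.

    Toy policy, purely illustrative:
    - every occupant must be biometrically identified (authorized=True)
    - an identified adult requester is always obeyed
    - an identified minor is obeyed only in a declared emergency
    """
    if not all(p["authorized"] for p in passengers):
        return False  # unidentified rider aboard: refuse to move
    req = next(p for p in passengers if p["name"] == requester)
    if req["age"] >= ADULT_AGE:
        return True
    return emergency  # Billy alone: no; Mary's emergency: yes

billy = {"name": "Billy", "age": 10, "authorized": True}
mary  = {"name": "Mary",  "age": 7,  "authorized": True}
print(may_depart([billy], "Billy"))                # False
print(may_depart([mary], "Mary", emergency=True))  # True
```

Of course, as footnote #1 notes, a policy like this only works if the car can actually identify everyone who gets in, which is the whole point.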
The cool part happens once all the cars are speaking to one another fluently, because little Mary's car will inform the others in front of her that there's an emergency, and traffic will magically part.
Her car will race to meet medical help en route, reaching it in less time than it would take for an ambulance alone to get to her.
But for any of this to work properly, the car must know for certain who's in it, and who is giving orders. The only reliable way to do that is by some sort of biometric ID, of everyone¹.
And not just for cars. You will also be continuously identified and tracked as you walk down city streets, enter and exit buildings, or go shopping or to the doctor. And everyone you associate with will be known and monitored.
Your life will become more convenient, but you will also come under nearly continuous surveillance.
You will be monitored for violations of the literally millions of laws, rules and regulations of modern society. This will vastly reduce crime, of course, but it will also vastly diminish that most precious of human needs, freedom.
All because cars no longer require drivers.
Autonomous cars are already driving themselves on streets in Texas, Michigan, Pennsylvania, Washington, Arizona and California, although currently they can only be operated in specific areas. We're approaching a time when they'll become commonplace.
The government of the United Kingdom also allows autonomous vehicle testing on public roads.
Countries like France and Switzerland have likewise authorized testing of automated vehicles on public roads. And the legislative framework is becoming favorable in many other countries as well.
Tesla Motors now builds all its models with the hardware necessary to eventually enable autonomous operation. Google has run numerous tests totaling over a million kilometers without any fatal accidents. And they've introduced prototypes with no steering wheel or pedals.
Yet despite impressive technical and legal advances, developing completely autonomous vehicles remains a much harder problem than the breathlessly excited media leads you to believe.
The reality is we're still some serious distance from fully autonomous vehicles.
And that's a good thing, because it gives us time to consider not only the Orwellian aspects of universal biometric IDs, but also numerous related issues.
I'm not saying these are unsolvable problems; we'll eventually work them out. But it won't be easy.
Most especially, we'll struggle with human factors because driving is more than just precision guidance… profound ethical issues will arise.
Imagine an unavoidable accident is unfolding. Do you slam into the group of construction workers, or the mother with the baby carriage? In terms of lives lost, the logical answer is to plow into the mother.
But no one I know would ever make such a decision, despite the fact that the death toll would be lower. Can the car do the right thing³?
Or what if the choice is between hitting the mom and baby carriage, or going off the road into a ravine, probably killing you? What would HAL 9000 do?
Admittedly, some of this is far-fetched. But not that far.
Autonomous cars will protect us from road risks, but who will protect society from the Orwellian consequences of autonomous cars?
Despite potentially grave consequences, we don't seem to be talking about the sorts of 'big brotherism' this can lead to. Instead, we're as mesmerized by automotive autonomy as 11-year-olds are by video games.
There's nothing wrong with the technology itself; it's great. But there's tremendous social risk if we proceed without proper public discourse about the implications.
#1 Some would argue that the system will be set up using an 'exclusive' design such that it only knows the BioID of pre-authorized people who are permitted to control it. Unless of course there's an emergency, in which case it's required to listen to anyone. But then what's to stop our little Dennis the Menace from claiming that? Moreover, if the car is picking up your kids from the mall, and a stranger tries to get in, don't you want the car to be able to identify that person? I sure do, and I realize that the only way it can happen is if biometric IDs are universal.
#2 As a radical example, will the car obey a direct order to accelerate to an illegal speed to escape an oncoming tornado that the car can't see but the passengers can?
#3 For the sake of argument, let's say the car's capable of such a decision. Ask yourself how that happens. The answer is, there's an algorithm buried deep in its computer that tallies… wait for it… the relative value of those human lives. Oh really? As originally weighed by whom? This is part of something called the trolley problem.