Rapid progress means self-driving cars are in the fast lane to consumer reality. Is the law up to speed too, asks legal expert Bryant Walker Smith
EVER since the 1930s, self-driving cars have been just 20 years away. Many of those earlier visions, however, depended on changes to physical infrastructure that never came about - like special roads embedded with magnets.
Fast forward to today, and many of the modern concepts for such vehicles are intended to work with existing technologies. These supercomputers-on-wheels use a variety of onboard sensors - and, in some cases, stored maps or communications from other vehicles - to assist or even replace human drivers under specific conditions. And they have the potential to adapt to changes in existing infrastructure rather than requiring it to alter for them.
Infrastructure, however, is more than just roads, pavements, signs and signals. In a broad sense, it also includes the laws that govern motor vehicles: driver licensing requirements, rules of the road and principles of product liability, to name but a few. One major question remains, though: will tomorrow's cars and trucks have to adapt to today's legal infrastructure, or will that infrastructure adapt to them?
Consider the most basic question: are self-driving vehicles legal today? For the US, the short answer is that they probably are (the long answer runs to nearly 100 pages). Granted, such vehicles must have drivers, and drivers must be able to control their vehicles - these are international requirements that date back to 1926, when horses and cattle were far more likely to be "driverless" than cars. Nevertheless, these rules, and many others that assume a human presence, do not necessarily prohibit vehicles from steering, braking and accelerating by themselves. Indeed, three US states - Nevada, Florida and most recently California - have passed laws to make that conclusion explicit, at least to a point.
Still unclear, even with these early adopters, is the precise responsibility of the human user, assuming one exists. Must the "driver" remain vigilant, their hands on the wheel and their eyes on the road? If not, what are they allowed to do inside, or outside, the vehicle? Under Nevada law, the person who tells a self-driving vehicle to drive becomes its driver. Unlike the driver of an ordinary vehicle, that person may send text messages. However, they may not "drive" drunk - even if sitting in a bar while the car is self-parking. Broadening the practical and economic appeal of self-driving vehicles may require releasing their human users from many of the current legal duties of driving.
For now, however, the appropriate role of a self-driving vehicle's human operator is not merely a legal question; it is also a technical one. At least at normal speeds, early generations of such vehicles are likely to be joint human-computer systems; the computer may be able to direct the vehicle on certain kinds of roads in certain kinds of traffic and weather, but its human partner may need to be ready to take over in some situations, such as unexpected road works.
A great deal of research will be done on how these transitions should be managed. Consider, for example, how much time you would need to stop reading this article, look up at the road, figure out where you are and resume steering and braking. And consider how far your car would travel in that time. (Note: do not attempt this while driving your own car.)
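To make that thought experiment concrete, here is a minimal back-of-the-envelope sketch in Python. The speed and handover times are assumptions chosen purely for illustration, not figures from the article or from any handover study.

```python
# Illustrative calculation: how far a car travels while its occupant
# looks up, regains awareness and resumes steering and braking.
# The 100 km/h speed and the handover times below are hypothetical assumptions.

def distance_travelled(speed_kmh: float, handover_seconds: float) -> float:
    """Distance covered, in metres, during the handover period."""
    speed_ms = speed_kmh * 1000 / 3600  # convert km/h to m/s
    return speed_ms * handover_seconds

for seconds in (2, 5, 10):  # assumed range of handover times
    metres = distance_travelled(100, seconds)
    print(f"At 100 km/h, a {seconds}-second handover covers roughly {metres:.0f} m")
```

Even under these rough assumptions, a few seconds of inattention translates into a football pitch or more of road, which is why the handover problem attracts so much research attention.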
Technical questions like this mean it will be a while before your children are delivered to school by taxis automatically dispatched and driven by computers, or your latest online purchases arrive in a driverless delivery truck. That also means we have time to figure out some of the truly futuristic legal questions: How do you ticket a robot? Who should pay? And can it play (or drive) by different rules of the road?
Data protection is a more pressing issue. Many cars and trucks available today already collect driving data through onboard sensors, computers and cellular devices. But imagine taking a dozen smartphones, turning on all of their sensors and cameras, linking them to your social media accounts, and affixing them to the inside and outside of your vehicle. Even that would understate a self-driving vehicle's potential for data collection. Because consumer versions of such vehicles do not yet exist, we don't know what data will actually be collected or how it will be transmitted and used. However, legal issues related to disclosure, consent and ownership will mix with important policy questions about the costs and benefits of data sharing. Indeed, some research vehicles in Germany already have privacy notices printed on their sides to warn other road users.
Finally, what happens when things go wrong - or at least not as right as they might? Given that the vast majority of crashes are caused at least in part by human error, self-driving vehicles have huge potential to save lives. But they will not be perfect; after all, humans will remain in the design loop even after they are out of the driving loop. To what standard, then, should these vehicles be held? Must they perform as well as a perfect human driver for any conceivable manoeuvre? Or must they perform merely as well as an average human in a statistical sense? In any case, how should that performance be measured?
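One way of framing that "statistical sense" is to compare crash rates per distance driven. The sketch below uses invented figures purely for illustration; they are not real crash statistics, and the comparison glosses over the measurement difficulties the article raises.

```python
# Hypothetical comparison of human and automated safety records.
# All numbers are made up for the example; none are real crash data.

human_crashes = 188          # assumed: crashes observed in a human-driven fleet
human_miles = 100_000_000    # assumed: miles driven by that fleet

automated_crashes = 3        # assumed: crashes observed in an automated fleet
automated_miles = 2_000_000  # assumed: miles driven by that fleet

human_rate = human_crashes / human_miles * 1_000_000
automated_rate = automated_crashes / automated_miles * 1_000_000

print(f"Human-driven: {human_rate:.2f} crashes per million miles")
print(f"Automated:    {automated_rate:.2f} crashes per million miles")

# A lower point estimate is not the whole story: with only a few million
# automated miles on record, the uncertainty around the automated rate is
# large - which is precisely the measurement problem regulators will face.
```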
These questions will be considered, explicitly or implicitly, by the regulators who create new standards, by the judges and juries who decide who should pay for injuries and how much, and by the consumers who decide what kind of car to buy. The uncertainty that surrounds the answers will affect the speed and price at which these new technologies are introduced.
Why do these questions matter so much? Because ultimately their most meaningful answers will, one hopes, be expressed in terms of lives saved.
Bryant Walker Smith is a fellow at the Center for Internet and Society at Stanford Law School and the Center for Automotive Research at Stanford University, California. His analysis of the legality of self-driving vehicles in the US came out in November (bit.ly/SVCe32)