The field of robotics has seen many exciting developments in the last decade. Ever-evolving facial recognition, artificial intelligence, touch-sensing and walking technologies have allowed robotics engineers to create some amazingly sophisticated robots.
Robots have reached a level where they’re now a viable replacement for many human roles. Technological advancements over the last several hundred years have already rendered many human positions obsolete, and the giant leap robotics has taken in recent years means many more will become replaceable. The Pentagon announced it wants one third of its military forces to be robotic by 2015, while many in the health sector are hopeful robots will be performing routine operations in place of doctors and nurses within the decade.
Machines are forecast to replace humans in progressively more complex tasks, and this raises serious questions about rights and accountability. There have already been hundreds of instances in which robots were directly involved in a human injury or death since the first recorded robot-related death, that of Robert Williams in 1979. We’ve already seen some of the issues we could be faced with, but what steps should we be taking now, before robotic technology reaches the next level?
Issue 1: As robots become more intelligent and sophisticated, who is responsible if they injure or kill a human? Does the blame fall on the designer, the person in charge of the robot or the robot itself?
Issue 2: If a robot can be held accountable for its actions, then which laws apply? Our laws were written with only humans in mind. Does that mean we need to create a separate set of laws for robotic beings, or must we amend our existing laws to cover machines?
Issue 3: Is there strong enough legislation governing the creation of robots? As it stands, there are no specific standards the robotics industry is obliged to follow other than the general industry practices that apply to all industries.
Issue 4: Should robots be allowed to occupy a position of power or authority within human society? Given the intended military applications, robots could be programmed and used to kill within the next 10 years. Are real-life robopolice in our future? Are people willing to accept a robot as an authority figure?
Science fiction author Isaac Asimov famously drafted the rules affectionately known as the Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These are not real laws, but they appear frequently in science fiction writing and films. As a society, we need to begin talking about a future lived side-by-side with robotic beings and decide how we plan to deal with it.
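Asimov's ordering — each law yields to the ones above it — can be sketched as a toy priority check. Everything below is hypothetical: predicates like `harms_human` stand in for judgments no real robot can reliably make, which is exactly what the accountability questions above turn on.

```python
# Toy sketch of Asimov's Three Laws as a prioritized rule check.
# The dictionary fields are hypothetical placeholders; real robots have
# no reliable "harms_human" oracle, and that gap is part of the debate.

def permitted(action):
    """Check an action against the Three Laws, highest priority first."""
    # First Law: never injure a human or allow harm through inaction.
    if action["harms_human"]:
        return False
    # Second Law: obey human orders, unless the order breaks the First Law.
    if action["disobeys_order"] and not action["order_harms_human"]:
        return False
    # Third Law: self-preservation, subordinate to protecting a human.
    if action["endangers_self"] and not action["protects_human"]:
        return False
    return True

# Refusing an order that would harm a human is permitted (First Law
# overrides the Second):
print(permitted({"harms_human": False, "disobeys_order": True,
                 "order_harms_human": True, "endangers_self": False,
                 "protects_human": False}))  # True
```

Even this toy version shows why the laws are fiction rather than engineering: each branch assumes the robot can already predict harm, intent, and consequence — the very capabilities at issue in Issues 1 through 3.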