First, do no harm: new guidelines for ethical robot design in industry
13 December 2016
In Westworld, which has just concluded its first season on Sky Atlantic, a looming robot uprising is an ever-present threat to the humans who once controlled the robots’ every move. Despite being a recurring theme in entertainment for decades, all the way back to HAL in 2001: A Space Odyssey, robot ethics has received little attention in industry until now.
However, recent technological advances in the field have led to the introduction of a new UK standard for robot designers. How will the new standard affect industry in that jurisdiction and further afield?
The British Standards Institution (BSI) has devised a new guideline for the ethical design and application of robots and robotic systems, entitled ‘Robots and robotic devices: guide to the ethical design and application of robots and robotic systems’. It recognises potential ethical issues that can arise from the increasing number of automated and autonomous systems being introduced into industrial and consumer environments.
It also emphasises that it must always be transparent who is responsible for the behaviour of the robot, even if it behaves autonomously. The standard is relevant to all robots and robotic systems including autonomous cars, medical robots, industrial robots and those used for personal care.
A committee of scientists, academics, philosophers, ethicists and users developed the standard which is intended for use by robot and robotic device designers and managers. The standard, BS 8611:2016, was originally presented in September 2016 at a conference in Oxford, UK, and is available for purchase on the BSI website.
Asimov’s three laws of robotics
The new standard begins similarly to Isaac Asimov’s three laws of robotics, first proposed in his science fiction short story ‘Runaround‘ in 1942. Asimov’s first law states that a robot may not injure a human being or allow a human to come to harm through inaction.
The second law rules that a robot must obey all instructions given by humans, except those that conflict with the first law. Finally, the third law dictates that a robot must protect its own existence as long as doing so does not conflict with the first two laws. The BSI standard echoes this principle: robots should always be safe, secure and fit for purpose.
The BSI guidelines also advise manufacturers on previously unaddressed hazards, including robot deception, robot addiction and the potential for a learning system to exceed its remit. The standard also covers whether forming an emotional bond with a robot is desirable; a particularly contentious subject if the robot interacts with the elderly or children.
The standard also discusses the risk of a robot becoming sexist or racist, an issue that prominently surfaced when Twitter users taught Microsoft’s AI chatbot, Tay, to post offensive messages.
According to Alan Winfield, professor of robotics at the University of the West of England, this is the first published standard on robot ethics. However, the EU is also working on robot ethics standards, with a draft report issued in May 2016. This covers the ethical issues of an automated workforce and aims to lay the groundwork for the ethical development and design of robots.
If approved, the EU report would become the first legal framework on the issue of robot ethics. The introduction of the new BSI standard could provide the impetus for bodies such as the EU, and others further afield, to consider legislation to safeguard humans from the ethical issues associated with the growing number of industrial and commercial robots.
In industry, standards on the ethical use of robots are of particular use. Traditionally, industrial machines were guarded and caged to keep them safely away from humans. Newer generations of robots, equipped with sensors, the ability to learn and other safety features, can work alongside and even in collaboration with human workers.
Examples of collaborative industrial robots include ABB’s YuMi and Rethink Robotics’ Baxter. These collaborative robots work alongside humans and make it easier to integrate automation into an industrial process.
Although collaborative robots are becoming more popular, many manufacturers still operate legacy industrial automation systems, which offer the benefits of automation without raising these new ethical concerns.
For manufacturers concerned about the upkeep of their industrial automation systems, but not yet ready to upgrade to the latest generation of cobots, sourcing legacy industrial parts doesn’t have to be difficult.
A supplier of new and obsolete industrial automation parts, such as EU Automation, can provide replacement parts to safeguard the system’s future until the manufacturer is sure that an upgrade is necessary.
The BS 8611:2016 standard is one of the first signs that industry is starting to concern itself with ensuring robot behaviour is accountable, truthful and unprejudiced.
The dystopian future of Westworld remains unlikely, but if we want to introduce robots into industrial and consumer environments on a wider scale, ethical questions should be at the forefront of our minds.
Leroy Spence, head of sales development at EU Automation