In this paper, I discuss what I call a new control problem related to AI in the form of humanoid robots, and I compare it to what I call the old control problem related to AI more generally. The old control problem – discussed by authors such as Alan Turing, Norbert Wiener, and Roman Yampolskiy – concerns the worry that we might lose control over advanced AI technologies, which is seen as something that would be instrumentally bad. The new control problem is that there might be certain types of AI technologies – in particular, AI technologies in the form of lifelike humanoid robots – for which there is something problematic, at least from a symbolic point of view, about wanting to control them completely. The reason is that such robots might be seen as symbolizing human persons, and wanting to control them might therefore be seen as symbolizing something non-instrumentally bad: persons controlling other persons. Stated more generally, the new control problem is the problem of describing under what circumstances having complete control over AI technologies is unambiguously good from an ethical point of view. This paper sketches an answer to that question by also discussing AI technologies that do not take the form of humanoid robots and over which control can be conceptualized as a form of extended self-control.