Special Issue - Introduction
David J. Gunkel & Joanna Bryson, Editors

Moral philosophy, although historically defined in different ways by different intellectual traditions, is essentially about intersubjectivity. It concerns how a subject responds to and takes responsibility for their thinking and acting in the face of an other who is, in one way or another, recognized as subject to the same moral consideration as the actor. Consequently, moral philosophy concerns, whether this is ever explicitly recognized or not, who (or what) gets to be considered a subject of moral consideration, a moral subject. Moral subjectivity has two recognized aspects: moral agency, the capacity to be responsible for moral actions, and moral patiency, the property of being the recipient of moral obligations. This special issue of Philosophy and Technology is one of the first coordinated efforts to consider explicitly both sides of the moral relationship in situations involving artificially intelligent artifacts and machines.
When it comes to deciding these fundamental questions of moral standing, we typically distinguish who is a moral subject from what is not. And here the two small and seemingly insignificant words "who" and "what" make, as Jacques Derrida (2005, 80) points out, all the difference. Unfortunately, the distinction between who is a moral subject and what remains a mere object has never been completely resolved. In fact, the history of moral philosophy can be interpreted as a progressive unfolding whereby what had been a mere object, something we do not need to care about in moral terms, comes to be considered another moral subject. "When god-like Odysseus returned from the wars in Troy," Aldo Leopold (1966, 237) reminds us, "he hanged all on one rope a dozen slave-girls of his household whom he suspected of misbehavior during his absence. This hanging involved no question of propriety. The girls were property. The disposal of property was then, as now, a matter of expediency, not of right and wrong." During the time of Odysseus (at least as interpreted by Leopold), only the male head of the household was considered a proper moral subject; everything else, his women, his children, his slaves, was property under his possession, ostensibly objects that could be used, abused, and disposed of as he saw fit, at least with respect to the legal conventions of that period. Since that time, these previously excluded others, whether slaves, women, foreigners, or children, have been admitted (often only after considerable struggle and resistance) into the gated community of moral subjects, becoming other "persons" who count, or who are at least deserving of some modicum of protection or respect.
Recently, formal philosophical consideration has been given to extending moral consideration to non-human animals (Singer 1975; Regan 1993) and even the environment (Stone 1974; Birch 1993), which normally comprise the excluded other of what can be called "the anthropocentric tradition" in ethics. Even in the biological sciences, referring to an animal by name, or using gendered pronouns ("he" or "she") or even the generic "who," is considered a political act, one excluded on principle by many scientific journals (ref.) and, more recently, permitted in some areas of research, at least in the case of non-human primates. Yet while the formal literature often gets bogged down in questions about the level of consciousness, sentience, or awareness that would be necessary for moral standing, ordinary language and daily behavior indicate that not only the environment but many mundane objects, such as paintings, laws, or cars (Floridi 2008), may deserve a status beyond that of mere possessions of their owners or creators.
Currently we stand on the verge of a potentially revolutionary challenge to many of the standard practices in moral philosophy. We find ourselves faced with another set of entities that have routinely (though not exclusively) been considered nothing more than objects: not someone to care about, but something to be possessed, used, and disposed of, like Odysseus's slave girls or the animals of multinational agri-business.
This other form of otherness is the machine: interactive and learning algorithms that, once programmed and installed, make decisions with little or no human oversight; autonomous robots deployed as weapons on the battlefield, rescuers in disasters, and caregivers and assistants in the home or hospital; and the intelligent systems now incorporated into what had previously been mere technological conveniences, like the instruments of transportation, communication, and media. If moral philosophy in the final decades of the twentieth century was defined by the animal question, it is entirely possible that in the early decades of the twenty-first it will be characterized by the machine question.
Although there have been previous efforts to grapple with this problem, we recognized a need to examine more thoroughly the costs, benefits, reasons, and rationales for considering any form of technical entity to be any type of moral subject, and to pursue its various aspects and complications.
While a single journal issue will not finally resolve the machine question, our goal has been to at least open the door to the question, providing an ontology of moral discourse and charting its range of perspectives and consequences. We therefore advance the modest but nevertheless important objective described by G. E. Moore at the beginning of his Principia Ethica. "It appears to me," Moore (2005/1903, xvii) writes, "that in Ethics, as in all other philosophical studies, the difficulties and disagreements, of which its history is full, are mainly due to a very simple cause: namely to the attempt to answer questions, without first discovering precisely what question it is which you desire to answer." This special issue is offered as one effort to begin to discover the questions that we believe will define the opportunities and challenges of moral thinking in the coming decades.