by Denis Larrivee
Artificial
intelligence and its companion technology robotics promise to revolutionize
human-machine relations through their capabilities for analyzing, interpreting,
and executing human action. While stimulating both excitement and concern, these
capabilities have also invited reflection on the ethics and values guiding technology
development. Factors that induce value evolution are of interest, therefore,
for influencing the forms the technology we may adopt.
In broad terms these are seen to operate at two levels: 1) by
epistemological inference, often through neuroscientific observation – humans
are like machines, and 2) by ontological predication, that is, as an imputed
analogue of human meta properties – machines are like humans. Due to a design
intent of reducing the onus of human intervention, AI devices are increasingly
given over to servicing a spectrum of human needs, from lower order motoric
assistance to higher-order computational and social functions, e.g., living
assistance companions and work colleagues; accordingly, they invite analogy at
multiple levels.
Simulation of higher-order cognition, especially, is understood to
drive value attribution, which flows from ontological inferences about the operational
resemblance of these technologies to higher-order, human cognition. That is, through
replication of these uniquely human abilities, there is a growing ontological
incursion in the technology, which propels value evolution under the guise of
simulating ontological equivalence. Breazeal’s Kismet robot, for instance,
explores not merely the social gestures essential to promoting human-machine
interactions but also the construction of human social intelligence and even
what it means to be human. Recent trends in roboethics, in consequence, no
longer assume the normative referencing implicit in Asimov's three laws of
robotics, which prioritize human value over robot rights, having moved on to a
more egalitarian value premise.
Simulation thus challenges the traditional value prioritization placing
human beings at the apex of organismal life and grounding ethical, bioethical,
and neuroethical praxis, a prioritization that has promoted human flourishing
while also restricting harmful intervention into the human being. Rather than
emphasizing the centrality of human value, simulation promotes a value
architecture that is more inclusive, democratic, and horizontal, a trend recently
taken up in ethical parity models. Seen through the lens of ethical parity,
simulation poses a multidimensional challenge to an ethical system where value
is contingent on the human being, a challenge mediated at the level of the
ethical subject, i.e., in the siting of value contingency, in its theory of
ethics, i.e., in how ethics is normatively anchored, and in ethical praxis. In
consequence, it modifies ethical mediation as an intentionalized moral
enactment, which is framed by a referential ontology.
The pursuit of value equivalence between robotic technology and the
human being has notably highlighted the symbiotic nature of human-machine
relations, which is evoked by the reciprocity of ontological exchange. Rather
than the merely instrumentalist association identified in Aristotelian and
scholastic philosophy, the appropriation of ontological status motivates a
physical reciprocity that lies at the intersection of the human and the machine;
that is, behind the human lies hidden the machine, and behind the machine lies
the human. Hence, symbiosis is understood to actuate an a priorism that is
physically operative at the locus of intersection between the two.
Elucidating the philosophical roots of this a priorism is,
nonetheless, infrequently considered. While revealing the presence of a
physical ‘a priorism’ can be expected to constitute a meta valorization
of the processual form of ontological appropriation that distinguishes
simulation, that is, of the mutual endowment of ontological identity, the
epistemological sources that might reveal such consilience have yet to trace
the physical reciprocity invoked by symbiosis to a meta-physical ground.
Modern physics, moreover,
broadly views the world as consisting of individual entities embedded in
space-time, a conception apparently contravened by the sort of symbiosis
invoked in human-machine chimaeras.
Higher-order cognition, for instance, is thought to align with human
ontology (lower-order human capabilities are rarely considered in these
ontological derivations) and is widely regarded as emerging from neural
activity, which recapitulates machine-like functioning. Indeed, Levy’s
functionalist interpretation of cognition is traced to the semblance between
neural activity and computational capabilities. Neural operation, on the other
hand, is deeply physical, and neural architectures can be expected to adhere
to meta principles
governing the physical world, including the formation of human entities. How
these positions may be reconciled by their grounding in a physical a priorism,
therefore, is unclear.
This paper argues that the computational neuroscience generally invoked to establish semblance with machine technology traces its philosophy-of-science commitments not to an a prioristic meta field reflective of the physical structure of the world but to the world’s properties. Recent integrationist accounts, on the other hand, reveal a consilience with a notion of dynamic entities; that is, neural architectures reveal an a priorism grounded in the unity of their operation, a finding of relevance for ontology, which is characterized by individuation rather than semblance.
Simulation through Functionalism to Heidegger
And, in spite of the victory of the new quantum theory, and the conversion of so many physicists to indeterminism, de La Mettrie's doctrine that man is a machine has perhaps more defenders than before among physicists, biologists and philosophers; especially in the form of the thesis that man is a computer.
As Karl Popper notes, the hypothesis that human cognition simulates the computational abilities of machines has propelled a widely held notion that humans share ontological equivalence with computational machines. Indeed, over the last half century, computationalism - whether in classicist, connectionist, or neurocomputational forms - has dominated thinking on cognition. Beginning with McCulloch and Pitts (1943), Karl Lashley, and others, this thesis has evolved through several incarnations. Marr and Poggio extended early computationalism to information processing, building on lower-level computational processes to construct a representational and algorithmic, tri-level scheme for cognition, while Fodor’s version entailed the manipulation of symbols by means of a Turing-style computation, which he proposed enabled decision making, perception, and linguistic processing. Fodor’s transposition of machine-like computational events to abilities distinguished as human properties of mind, particularly, introduced simulation as a methodological paradigm for arriving at an ontological status of parity. Implicitly claimed, in fact, is an absence of ontological distinction, an absence that flows from the a priorism of material semblance and grounds the physical reciprocity of human and machine, highlighted in human-machine symbioses.
The equating of symbolic computation with cognitive capacities,
moreover, has been understood to bridge the divide between computational events
and functions carried out by the mind; that is, functions are built on
computational processes which link human and machine at the level of capacities
that are operative in the human mind. On this basis, Putnam posited that mental
states configure these functions; hence, he identified the mind as
constitutively functional. Understood this way, the mind lacks a unique
physical contingency; hence, its properties cannot predicate from a holistic
origin.
Functionalism, therefore, emerges from an a priorism of material semblance and
is inconsistent with ontological distinction. Clark and Chalmers’ extended
mind hypothesis, for example, is notably distinguished by its lack of a unique
physical origin to which the mind is contingent.
The lack of distinction, however, contrasts with traditional
subject/object dichotomies that view the human in opposition to the machine, a
dichotomy that has motivated efforts for its removal and the accommodation of
ontological parity. The imagery of the cyborg, especially, has been used as a
medium for conceptualizing beyond binary oppositions [Rae], which would otherwise
foreclose the physical reciprocity evoked by semblance. This conceptualization is said to require
replacement of a monadic derivation of ontology, whereby ‘two distinct entities
face one another and define themselves independently of one another’, with a novel
process of mutual endowment whereby each entity only ‘is’ by virtue of and
through its relationship with another [Haraway; Rae]. That is, the basis for
inference about ontology would no longer be drawn from an a priorism determined
by the meta ‘structure’ of the world, but by shared attributes that bind the
two relationally. As Onishi points out, the emphasis on a least common
denominator (a main tenet of the transhumanist vision, for example, is the
belief ‘that the world's only underlying and universal feature is information’)
has the serious ontological consequence of allowing technology development to
reshape material existence at will, especially the human body. Indeed, such
thinking emphasizes the ‘entwined nature of beings’ [Rae]. Such a derivation,
therefore, denies the existence of a ‘meta-physical’ order that is the ground of
physical reality.
For computationalism the machine-human metaphor has gained traction,
nonetheless, from Heidegger’s critique of metaphysical humanism that likewise
challenged subject/object dichotomies, but did so at the level of being, a
critique that subsequently laid the foundation for the “anti”-humanism of
structuralist, post-structuralist, and deconstructionist thought. Heidegger’s
challenge to the Cartesian metaphysical legacy of binary oppositions (which
itself challenged scholastic notions of a priori form and purpose) rooted
itself in an understanding of being as that which enabled ‘things to be’
rather than a feature contingent on their reality; that is, he proposed that
being, rather than being synonymous with beings, was something fundamentally
different, an excess that, in the case of the human being, allowed the human
being to ‘exist’ rather than being made evident by his existence. Heidegger’s
a priorism of a ‘murky’ being led him to
posit a certain ‘nullity’ that now defines the postmodern subject, and indeed
all entities; hence, in the absence of predicating properties, the human subject
must be recreated from the merger of interactions with external reality; that
is, through a relational reconstruction. Indeed, much of the fluid, networked
understanding incorporated in posthumanist strands of thinking emerges from
this separation of being from its anchorage in entities, and the ensuing
requirement to restructure the entity through network interactions.
Human Action and Dynamic Entities in a Metaphysics of Nature
While
Heidegger's critique is crucial for structuring ontological parity between humans
and machines by means of a novel metaphysical paradigm of being, this latter is
not widely invoked as an a priori, meta conception of the physical world.
Esfeld, for example, points out that according to modern mainstream meta-physical
thought, the physical world consists of independent and individuated things
that are embedded in space–time. These things are individual because they have
a unique spatio-temporal location, and entities because they are (a) each the
subject of the predication of properties and (b) distinguished by qualitative
properties from all other individuals.
This broad – indeed historical - recognition that entities comprise the
physical meta-structure of the world underscores the significance of individuation
to the ordering of physical reality. By contrast, Heidegger’s premise that
entities can ‘be’ apart from their qualities leaves open the question of
whether being is one or many, thereby denying that individuation is a
constitutive feature of reality. Hence,
the understanding of individuation has repercussions for how ontology is conceived.
Individuation reveals, especially, that unity is constitutive, not solely for
property predication, but constitutive to what things ‘are’ and the basis for
their persistence; hence, in contemporary physical understanding entities are
individuated because they are unified. Meta understandings of the physical
world, critically, now prominently feature an a priori operational dynamic that
is a unifying principle; thus also, the a priori presence of an operational
dynamic that ‘individuates’ the entity.
“The act of existence is not a state, it is an act, the act of all acts, and
therefore must not be understood as a static definable object of conception”
(Phelan).
Here, Phelan
implicitly (and merely) recapitulates Aquinas’ dynamic notion of a holism:
“every substance exists for the sake of its operation”. Hence, the feature of
being is to act - “to be ‘is’ to act” - and to act is to be individuated.
In living systems – here understood as living entities – it is increasingly evident that unity is autonomously mediated through a dynamic of action execution. Indeed, the coherence and unity made evident through living systems’ autonomous engagement in action argues for the presence of a self-organizing principle evoked as a dynamic locus of action origin. Their presence in the world is therefore consilient with an a prioristic principle of self-organizing, self-actionable individuation that emerges from the meta structure of reality.
Crucially, human unity likewise flows from a global operational dynamic, where functions predicate from this dynamic. The neuroscience of behavioral action, especially, reveals that actions are embedded within a global operative order that is autonomously evoked during action execution; that is, a physical a priorism of unity mediated through operation. Human ontology, thereby, is an emergent qualification defined by unity, operation, and self-presence; hence, an ontology that is subjectively distinct and that flows from the world’s a priori features.
This physical a priorism is widely evident:
In the coordinated activity of primitive organisms like C. elegans, where,
despite the participation of hundreds of sampled neurons, activity is
coordinated and meaningful signals are reduced to far fewer dimensions.
In the multisensory integration of the individual, who becomes the subject of experience.
In mechanisms of action identification and action contextualization. For dynamic motor trajectories – events necessarily occurring in space and time – it is critical that individual motions be set in context with respect to the body’s spatiotemporal framework so that all motions may be coordinated. This framework functions to unify discrete motions into a coherent matrix in which they can be related each to another.
In action attribution and goal directed
activity. Individual motions perform functions necessarily in relation to
objectives dictated by the body; hence, the body is understood to be the source
of discrete motions.
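The dimensional collapse cited above for C. elegans can be sketched numerically. The toy simulation below is a hypothetical illustration, not an analysis of actual worm recordings; all parameters (numbers of units, latent signals, noise level) are invented. It drives a large population of simulated units with a few shared latent signals and then applies principal component analysis, showing that the population's variance concentrates in a handful of dimensions, the signature of a unified, low-dimensional operational dynamic.

```python
import numpy as np

# Hypothetical sketch: many recorded units driven by a few shared latent
# signals, loosely modeled on reports of low-dimensional C. elegans dynamics.
rng = np.random.default_rng(0)

n_neurons, n_timepoints, n_latents = 100, 2000, 3
latents = rng.standard_normal((n_latents, n_timepoints))      # shared dynamics
mixing = rng.standard_normal((n_neurons, n_latents))          # per-unit weights
noise = 0.1 * rng.standard_normal((n_neurons, n_timepoints))  # private variability
activity = mixing @ latents + noise                           # (neurons x time)

# PCA via SVD: fraction of total variance captured by each component
centered = activity - activity.mean(axis=1, keepdims=True)
singular_values = np.linalg.svd(centered, compute_uv=False)
explained = singular_values**2 / np.sum(singular_values**2)

# With this noise level, the top 3 of 100 dimensions carry nearly all variance
print(f"variance in top {n_latents} of {n_neurons} dimensions: "
      f"{explained[:n_latents].sum():.2%}")
```

The point of the sketch is only that coordination among many units, when it stems from a common underlying dynamic, is detectable as dimensionality far below the number of units sampled.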
Humans and Machines in a Physical World
Development of
sophisticated AI and robotics technologies is propelling an increasingly
intense interaction between humans and the machines they create. This has
motivated recognition of a physical semblance in models of cognition with an
ensuing emphasis on ontological and value parity. The absence of consilience at
global levels with an a priori meta model for semblance, however, weakens the
foundation for structuring ontological parity and siting value contingency and
is at odds with a general recognition of the individuation of operationally
dynamic entities that emerge from meta features of the physical world. These
features reveal the presence of dynamic holisms throughout the natural world
that recapitulate ontological distinctiveness along an increasing hierarchy of neural
complexity, culminating with the emergence of human subjectivity. Physically
grounding ontology in a meta world thus offers a basis for siting value
contingency and for informing the evolution of human-machine interaction.