Cosmology and Complexity, edited by John Barrow, Paul Davies and
Charles Harper Jr. A telling illustration is our discussion of cosmic rays and
their influence on physical implements.
Trust in a given
physical implement can be modulated without the physical implement itself changing.
Trust can be reinforced both by multiplying its sources and by solidifying its
foundation through cross-references. For example, when elements of trust are built
on mathematics, the same mathematics used for one physical implement can be
reused for another, and convergent results consolidate the overall trust
mechanism involved. Trust can also be diminished if the channel through which trust
is conveyed is damaged, or if the message it conveys is garbled. Time can
lower trust, as it gives more opportunities for elements of trust to erode.
Sometimes that very mechanism is reversed, with time bringing new elements
of trust. In all these cases, the original physical implement remains
constant and is therefore not itself the source of change. Its illumination
varies, and its trust level changes accordingly.
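To make the mechanics of this modulation concrete, here is a minimal, purely illustrative Python sketch of a trust score attached to an unchanging implement. The fields, weights, and decay rule are invented for the example and are not a model proposed in the text; they simply mirror the factors just discussed: multiple sources, cross-references, an intact channel, and the passage of time.

    # Toy model: the implement never changes; only its assessment does.
    # All weights and rules below are arbitrary and for illustration only.
    from dataclasses import dataclass, field

    @dataclass
    class TrustAssessment:
        sources: set[str] = field(default_factory=set)  # independent sources vouching for the implement
        cross_references: int = 0                        # corroborations between sources (e.g. shared mathematics)
        channel_intact: bool = True                      # is the conveyance channel undamaged?
        message_garbled: bool = False                    # did the conveyed message arrive intact?
        years_elapsed: float = 0.0                       # time since trust was last refreshed

        def score(self) -> float:
            """Illustrative trust score in [0, 1]; the weights are arbitrary."""
            if not self.channel_intact or self.message_garbled:
                return 0.0                               # a broken or garbled conduit conveys no trust
            base = min(1.0, 0.2 * len(self.sources))     # multiplying sources reinforces trust
            base = min(1.0, base + 0.05 * self.cross_references)  # cross-references consolidate it
            return base * (0.95 ** self.years_elapsed)   # time erodes trust unless refreshed

    # The implement is the same object throughout; only its "illumination" varies.
    assessment = TrustAssessment(sources={"manufacturer", "independent lab"},
                                 cross_references=3, years_elapsed=2)
    print(round(assessment.score(), 2))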
Conversely, the
conveyance of trust might be impeccable while the physical implement does not
itself warrant trust. If the information is bad to start with, a good conduit
will make it neither better nor worse. Trust in the physical implement is typically
limited to its boundary and possibly that of its extensions. It also extends to
its fabrication, its distribution, and its usage. Together with the physical
implement itself, this forms a gestalt that stands apart from the rest of
the environment. However, the environment also affects trust in the implement,
temperature being a case in point. While trust can be decomposed into pieces,
in a very deep sense it always, at the limit, involves all parts of the universe,
for the very reason that it is, as we mentioned, recursive.
We find the
technical foundation of trust in processes, such as the Common Criteria; in logic
and mathematics, for information management; in craft, by guarding against
certain intrusions; in principles, such as the principle that only
communication between secure cores can create higher levels of trust; and in
conservation: if a situation doesn't change, its evaluation is made easier.
Moreover, we have seen that layers of technical evaluation can create
additional trust, as they essentially bring higher concepts to bear on the
sensori-motor experience. To the extent that these higher concepts are
themselves trusted, they can convey trust to otherwise suspect sensori-motor
representations.
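As a way to picture how layers of evaluation can convey trust downward, here is a small, purely illustrative Python sketch in which each evaluation layer endorses a digest of the layer below it; trust granted to the most abstract layer reaches the sensori-motor level only if the whole chain verifies. The layer names and report contents are invented for the example and are not taken from the Common Criteria or any other scheme.

    # Each layer's report embeds a digest of the report beneath it, so trust
    # in the top layer propagates downward only when every link holds.
    import hashlib

    def digest(report: str) -> str:
        return hashlib.sha256(report.encode()).hexdigest()

    sensor_report = "raw sensori-motor readings"
    design_report = f"design review over {digest(sensor_report)}"
    formal_report = f"formal analysis over {digest(design_report)}"

    def chain_verifies(reports: list[str]) -> bool:
        """True if every report correctly embeds the digest of the report below it."""
        return all(digest(lower) in upper
                   for lower, upper in zip(reports, reports[1:]))

    print(chain_verifies([sensor_report, design_report, formal_report]))      # True
    print(chain_verifies(["tampered readings", design_report, formal_report]))  # False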
Trust can apply
to physical goods, and it can apply to processes. Where human artifacts are
concerned, processes take on additional weight, because trust in the physical
goods derives from trust in the process of their assembly. And because these
processes have their origin in humans, everything comes back to humans, the origin of
the implements. Trust in physical implements ends up being based largely on
trust in humans. For natural goods, trust would seem to rest on
human-independent foundations. However, that would ignore the fact that our
interface with natural goods is entirely constructed on our sensori-motor
experience, itself subject to varying levels of trust. So we still end up
basing our trust on human considerations. Today, computers have very little in
the way of built-in trust mechanisms, so it's not possible to draw an immediate
comparison between the way they establish trust and the way humans do. Certainly, the
difference in sensori-motor experience is bound to make a difference, and
we'll investigate that further later on.
Now that we have
the means for a more structured understanding of the secure core's
place in the trust infrastructure, or at least in constitutive components of
the trust infrastructure, we are ready to look at the trusted core from an
evolutionary perspective.