The net result
of these various requirements for rules within the conceptualization, at least
to those of us from the technical computer world, is that rules will
involve both descriptive and procedural elements. It further implies
that the conceptualization must also anticipate some means of invoking
the procedural elements.
The thrust of
the conceptualization that we want to develop is aimed at explaining
interactions: interactions among groups, interactions among individuals,
interactions between individuals and groups, and interactions between
individuals and other elements of their social ecosystem, including the
physical ecosystem that is subsumed by the social ecosystem. To enable an
interaction, its various participants must first be brought into the physical
or logical contact that the interaction requires. This bringing together is
the function of a specific role
within the ecosystem, usually termed a broker. For the moment,
consider a broker as an abstract concept. We’ll try to give it more substance
in the next chapter.
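As a rough sketch of the broker role described above, one might picture a registry that matches parties seeking an interaction with parties able to provide it. The class and method names below are our own illustrative inventions, not part of the conceptualization itself.

```python
# Hypothetical sketch of a broker: a registry that brings parties
# offering a kind of interaction into logical contact with parties
# seeking one. Names here are illustrative only.
class Broker:
    def __init__(self):
        self._offers = {}  # interaction kind -> list of providers

    def register(self, kind, provider):
        """A provider announces that it can take part in a given interaction."""
        self._offers.setdefault(kind, []).append(provider)

    def connect(self, kind, seeker):
        """Bring a seeker into logical contact with a matching provider."""
        providers = self._offers.get(kind)
        if not providers:
            return None  # no participant available for this interaction
        return (seeker, providers[0])  # the pair is now 'in contact'

broker = Broker()
broker.register("payment", "bank-gateway")
pair = broker.connect("payment", "customer-17")
```

The broker itself takes no part in the interaction; it only establishes the contact, which is what makes it an abstract role rather than a participant.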
The goal of the
ontology is to fully specify the mechanics of transactions. Once the
participants of a transaction are brought into proximity, the rules under which
the transaction will occur must be established. This is the function of yet
another abstract entity, usually termed an arbiter. The service
performed by an arbiter is an abstract concept as well. With regard to
establishing the rules and consequences of transactions, we will term the
functions provided by policy arbiters those of arbitration
and adjudication.
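The two functions just named can be sketched in code: arbitration establishes the rules a transaction must satisfy, and adjudication judges a particular transaction against them. The rules and field names below are invented for illustration.

```python
# Hypothetical sketch of an arbiter: it fixes the rules under which a
# transaction may occur (arbitration) and then judges a proposed
# transaction against those rules (adjudication).
class Arbiter:
    def __init__(self):
        self._rules = []  # each rule: transaction -> bool

    def arbitrate(self, rule):
        """Establish a rule that all transactions must satisfy."""
        self._rules.append(rule)

    def adjudicate(self, transaction):
        """Judge a transaction against the established rules."""
        return all(rule(transaction) for rule in self._rules)

arbiter = Arbiter()
arbiter.arbitrate(lambda t: t.get("amount", 0) > 0)          # amounts positive
arbiter.arbitrate(lambda t: "buyer" in t and "seller" in t)  # parties named
ok = arbiter.adjudicate({"buyer": "a", "seller": "b", "amount": 10})
```

Separating the two functions mirrors the distinction drawn in the text: arbitration happens before the transaction, adjudication applies to its instances.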
Interactions
involving computers can be classified into two main forms: human-computer
interactions and computer-computer interactions. As we noted in the second
chapter, the main epochs of computer evolution over the last half-century or so
can be characterized by a blending of the evolving physiology of computers with
the manner in which a person and a computer interact. In a parallel development,
the mechanics of computer-computer interaction have arrived at the dawn of
fully networked systems, a prospect sometimes characterized as the grid. Functioning much like a
collection of imprintable neurons within the mind, grid computing assumes an
ability to imprint semantic functionality on arbitrary collections of networked
computers.
The dominant
form of interaction found on the Internet today is one of client-server
computing. Through this model, both computer-to-computer and human-to-computer
interactions can take place, effecting computational activities on behalf of
either a set process or a human user. Within the computer domain, as is often the
case within purely human-to-human interactions, there is often a requirement
for formally defined interactions. We have referred to these as transactions.
As we saw, a
transaction is the atomic unit of interaction. It is of finite length, with a
well-defined beginning and end. While a transaction may entail
peer-to-peer exchanges, the establishment of the beginning and end of a
transaction generally entails a client and server relationship between any two
parties to the transaction. By that we mean that in order to proceed through
any type of protocol to facilitate a transaction, it is necessary for the two
parties to agree on which is performing which steps of the protocol. We can define
a generic model for such a transaction environment by establishing specific
roles for the two parties: those of supplicant and sentinel.
The sentinel is the guardian of the integrity of the transaction while the
supplicant is the party that desires to establish an interaction relationship
with the sentinel in order to conduct the transaction.
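The supplicant and sentinel roles can be given a minimal sketch: the sentinel guards the transaction's integrity by marking its beginning, admitting only a properly credentialed supplicant, and marking its end. The credential mechanism and class names are our own assumptions, used only to make the division of roles concrete.

```python
# Hypothetical sketch of the supplicant/sentinel roles in a transaction.
# The sentinel guards the transaction's integrity; the supplicant seeks
# to establish the interaction relationship.
class Sentinel:
    def __init__(self, credential):
        self._credential = credential
        self.open = False

    def begin(self, offered):
        """Open a transaction only for a supplicant with the right credential."""
        self.open = offered == self._credential
        return self.open

    def end(self):
        """Close the transaction, marking its well-defined end."""
        was_open = self.open
        self.open = False
        return was_open

class Supplicant:
    def __init__(self, credential):
        self._credential = credential

    def transact(self, sentinel, steps):
        """Run the protocol steps inside a sentinel-guarded transaction."""
        if not sentinel.begin(self._credential):
            return False  # sentinel refused the relationship
        for step in steps:
            step()  # each agreed protocol step runs within the transaction
        return sentinel.end()

done = Supplicant("token-42").transact(Sentinel("token-42"), [lambda: None])
```

Note how the asymmetry of the client-server relationship appears here: the two parties have agreed in advance which performs which steps, which is exactly what lets the transaction have a definite beginning and end.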