Standardization of a number of such
common operations was one of the first consolidating events in the evolutionary
progression of secure core software.
This effort
began in the late 1980s within the International Organization for
Standardization (ISO) to establish a series of international standards related
to secure core technology. The effort was driven largely by the financial
industry, but it garnered development support from both manufacturers and
large-scale issuers of tokens, such as mobile phone operators. The result was the
establishment of a family of standards for tokens based on integrated circuit
chips, the foundation of which defines the basic physical interfaces between
integrated circuit chips and interface devices, the communication protocols
used by tokens, and the inter-industry services provided by compliant tokens.
Through the years, this ISO/IEC 7816 family of standards has continued to evolve.
Using these
standards, the software to be resident on the token was designed, developed and
installed on the token prior to its issuance to the token bearer. In the early
1990s, development began on what became the Multos system, as we discussed
earlier. This new paradigm involved the installation of a virtual machine
interpreter on the token, allowing trusted installation of new
software even after the token had been issued. In the late 1990s, this was
followed by a similar effort that made use of the Java language as the basis
for the on-token virtual machine. By the end of the millennium, these efforts
had resulted in a significant transformation of the secure core software
environment. As we have previously noted, a major driver in this transformation
came through the standardization efforts of the European Telecommunications
Standards Institute in the form of standards for the use of such tokens as Subscriber
Identity Modules in GSM cellular phones worldwide.
In parallel with the development of so-called post-issuance
programmable tokens, an evolution was also occurring in the form of
enhanced on-token cryptographic capabilities. The addition of a cryptographic
co-processor allowed complex algorithms to be executed efficiently. A prominent
example is the RSA algorithm, named after its inventors, Ron Rivest, Adi
Shamir, and Leonard Adleman, one of the foundational algorithms of asymmetric
key cryptography. This capability facilitated the deployment of public key
infrastructures,
thus laying the groundwork for tokens to become the ubiquitous purveyors of
identification services. When coupled with the financial community's efforts,
under a new organization called GlobalPlatform, to specify common protection
mechanisms for the applications on the token, these capabilities gave tokens
much greater utility within wide area computer networks.
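The asymmetry that makes RSA useful for such infrastructures can be sketched with textbook-sized numbers. The tiny primes below are purely illustrative; deployed keys run to 2048 bits or more, and real implementations add padding and other safeguards.

```python
# Toy RSA with textbook-sized primes (illustrative only; real
# deployments use keys of 2048 bits or more, plus padding schemes).
p, q = 61, 53              # two secret primes
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # only the private key (d, n) can decrypt
assert recovered == message
```

Because only the token ever holds the private exponent, and the cryptographic co-processor performs the modular exponentiation on-chip, the key need never leave the token: this is precisely what lets a token vouch for its bearer's identity within a public key infrastructure.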
Indeed, the
first decade of the new millennium has seen the evolution of token systems
centered upon their integration into more comprehensive computer platforms.
Essentially, moves are afoot to integrate seamless secure core support into the
operating systems of general computer platforms. This presents something of an
evolutionary quandary. Tokens may become easier to incorporate into widespread
application systems, or they may be supplanted by competing mechanisms within
these general computer systems. We will consider this in more detail in
the remaining chapters of this book. For the moment though, let us step back
just a bit and see if we can identify some parallels between the manner in
which secure core tokens are made operational and the way that people are made
operational.