
Threat Analysis, Security Mechanisms and Security Services

Every security-related activity starts with threat analysis. Although threat analysis may vary from one specific environment to another, the basic approach is as follows (Raepple, 2001). Threats are first identified, and the probability of successful realization of each identified threat is determined. Afterwards, the expected damage is calculated. This is the basis for setting priorities for countermeasures. Investment in countermeasures should certainly not exceed damage costs.
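
The prioritization step lends itself to a simple calculation. The following is a minimal sketch in Python, assuming per-threat estimates of probability and damage are available; the threat names and figures are illustrative, not taken from the text.

```python
# Rank threats by expected damage (probability of successful realization
# multiplied by damage), as described above. The expected damage also acts
# as the ceiling for investment in countermeasures against that threat.

threats = [
    # (threat, probability of successful realization, damage in EUR)
    ("phishing of employee credentials", 0.30, 50_000),
    ("web server defacement", 0.10, 20_000),
    ("database breach", 0.05, 400_000),
]

for name, probability, damage in sorted(
    threats, key=lambda t: t[1] * t[2], reverse=True
):
    expected = probability * damage
    print(f"{name}: expected damage {expected:,.0f} EUR "
          f"(countermeasure budget ceiling)")
```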

From a technological point of view, threats are countered with security mechanisms and security services (ISO, 1995a). Mechanisms include symmetric algorithms, e.g., AES (Foti, 2001); asymmetric cryptographic algorithms, e.g., RSA (RSA Labs, 2002); one-way hash functions, e.g., SHA-1 (Eastlake, 2001); and physical mechanisms. For devices with weak processing capabilities, like smart cards, elliptic-curve-based systems should be mentioned, e.g., ECDSA (ANSI, 1998). The advantage of these systems is that they require shorter keys than ordinary asymmetric algorithms for a comparable strength of cryptographic transformation.
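
The key-length advantage can be made concrete. Below is a minimal sketch using the pyca/cryptography Python library (an implementation choice of ours, not something the text prescribes): an ECDSA key over a 256-bit curve is commonly considered comparable in strength to an RSA key of roughly 3072 bits.

```python
# Compare an "ordinary" asymmetric key (RSA) with its elliptic-curve
# counterpart (ECDSA) of comparable cryptographic strength.
# Requires: pip install cryptography
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, rsa, padding

message = b"order #4711: ship 10 units"

# Ordinary asymmetric algorithm: RSA with a 3072-bit modulus.
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
rsa_sig = rsa_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Elliptic-curve counterpart: ECDSA over the 256-bit P-256 curve.
ec_key = ec.generate_private_key(ec.SECP256R1())
ec_sig = ec_key.sign(message, ec.ECDSA(hashes.SHA256()))

print(f"RSA key size: {rsa_key.key_size} bits, signature: {len(rsa_sig)} bytes")
print(f"EC  key size: {ec_key.key_size} bits, signature: {len(ec_sig)} bytes")
```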

The same key is used for encryption and decryption with symmetric algorithms, while asymmetric algorithms use one key for encryption and another for decryption. The first key is called a private key, and the second, which can be communicated to anyone, is called a public key. This is a very desirable property and the basis for digital signatures: anyone in possession of the corresponding public key can decrypt a message that has been encrypted with the private key, which assures the origin and integrity of this message. But there are drawbacks. In comparison to symmetric algorithms, asymmetric algorithms are computationally more complex. Next, to ensure that a particular public key indeed belongs to the claimed person, a trusted third party called a certification authority (CA) has to be introduced. A CA issues a certificate, a digitally signed electronic document that binds an entity to the corresponding public key (certificates can be verified with the CA's public key). A CA also maintains certificate revocation lists (CRLs), which should be checked every time a certificate is processed, in order to assure that a private/public key pair is still valid. One possible reason for keys becoming invalid is the growing processing power of computing devices, which prompts the need for ever-longer keys. Further, private keys may become compromised, and finally, a user may be using a certificate in an unacceptable way. This is the point where public key infrastructure comes in.

Regarding digital signatures, one should bear in mind that they are actually produced with one-way hash functions, which map a text of arbitrary length to an output of fixed length. A one-way hash function produces a fingerprint of a document, and this fingerprint is what is actually signed: the document is hashed, and its hash value is encrypted with the private key; this constitutes the signature. The recipient computes the hash value of the received document and decrypts the signature with the public key. If the two values match, the document is successfully verified.
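
A minimal sketch of this hash-and-sign procedure is given below, again using pyca/cryptography; RSA with PKCS#1 v1.5 padding and SHA-256 stand in for the generic asymmetric algorithm and one-way hash function, which is our choice rather than the text's.

```python
# Hash-and-sign: the signer hashes the document and applies the private-key
# operation to the hash; the recipient re-hashes and checks the signature
# with the public key. sign()/verify() perform both steps internally.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

document = b"contract: party A pays party B 1000 EUR"

# Signer side.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

# Recipient side: verify() raises if the hash values do not match.
public_key = private_key.public_key()
try:
    public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
    print("document successfully verified")
except InvalidSignature:
    print("document or signature has been tampered with")
```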

Protocols that use cryptographic primitives are called cryptographic protocols. They are used to implement security services, which are: authentication, access control, data confidentiality, data integrity, non-repudiation, and auditing.

Security Infrastructure

With the exception of auditing, security services are implemented with cryptographic protocols. To provide authentication in a global network, asymmetric algorithms are used because of their low key-management complexity. To compensate for their computational complexity, symmetric algorithms are used to protect session traffic once the entities have been authenticated. The session keys are exchanged with an asymmetric algorithm during the authentication phase.
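
A minimal sketch of this hybrid approach follows, again with pyca/cryptography; RSA-OAEP for the key exchange and AES-GCM for the session traffic are our illustrative choices, as the text names no specific protocol.

```python
# Hybrid scheme: an asymmetric algorithm protects the exchange of a fresh
# symmetric session key; the fast symmetric algorithm protects the bulk
# session traffic afterwards.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# The server's long-term asymmetric key pair (its public key would be
# authenticated via a certificate, as discussed in the PKI sections below).
server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Client side: generate a symmetric session key and wrap it with the
# server's public key.
session_key = AESGCM.generate_key(bit_length=256)
wrapped_key = server_key.public_key().encrypt(session_key, OAEP)

# Server side: unwrap the session key with the private key.
unwrapped = server_key.decrypt(wrapped_key, OAEP)

# All subsequent session traffic uses the cheap symmetric algorithm.
nonce = os.urandom(12)
ciphertext = AESGCM(unwrapped).encrypt(nonce, b"session data", None)
print(AESGCM(session_key).decrypt(nonce, ciphertext, None))
```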

To enable the basic procedures described above for digital signatures and the establishment of secure sessions, a so-called public key infrastructure has to be set up. Besides CAs, a directory is needed for distributing certificates and CRLs; an example of such a directory is the X.500 directory (ITU, 1997). A so-called registration authority (RA), which serves as an interface between a user and a CA, identifies users and submits certificate requests to the CA. In addition, a synchronized time base is needed for proper operation. All these elements, together with appropriate procedures, form a public key infrastructure (PKI).

The main specification for certificates and certificate revocation lists is the X.509 standard, version 3 (ITU, 2000). The basic certificate fields are the serial number, the issuer (the trusted third party), the subject (the owner of the public key), the public key itself, the validity period, and the signature of the certificate. Other fields carry processing instructions, while extensions are needed to support important issues that are yet to be fully resolved: automatic CRL retrieval, CRL placement and distribution, security policy issues, etc. One should note that before using a public key, the certificate always has to be checked against the corresponding CRL (or CRLs).
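
The basic fields are easy to inspect programmatically. Below is a minimal sketch with pyca/cryptography; "cert.pem" is a hypothetical PEM-encoded certificate file, and the *_utc accessors assume cryptography version 42 or later.

```python
# Read the basic X.509 v3 certificate fields listed above.
from cryptography import x509

with open("cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

print("serial number:", cert.serial_number)
print("issuer:       ", cert.issuer.rfc4514_string())
print("subject:      ", cert.subject.rfc4514_string())
print("public key:   ", cert.public_key())
print("validity:     ", cert.not_valid_before_utc, "-", cert.not_valid_after_utc)
print("signature:    ", cert.signature.hex()[:32], "...")

# Version 3 extensions, e.g. CRL distribution points for automatic retrieval:
for extension in cert.extensions:
    print("extension:    ", extension.oid)
```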

A procedure for initial key exchange within the web environment goes as follows. When contacting the RA, a user signs a request and is identified on the basis of a valid formal document. A user who has enrolled at the RA is sent two secret strings through two different channels, e.g., e-mail and ordinary mail. After obtaining these strings, the user connects to the CA's server, which supports the SSL protocol (Freier, 1996) and has the CA's certificate installed. By connecting to this server through a browser, the SSL protocol is automatically activated and a secure session is established. Based on the data about the CA's certificate, the user can be assured of being connected to the appropriate server; usually this is done by checking the key fields and the fingerprint of the certificate, which are obtained at the RA during initial registration. Confidential exchange of subsequent data, along with integrity, is then enabled. The user starts by entering his/her personal data and the secret strings, which authenticate the user to the server. Next, the server triggers the browser to produce a key pair, and the public key is transmitted over the network for signing. When the certificate is produced, the user can download it to his/her computer, as every certificate is a public document. Regarding revocation, the most straightforward procedure goes as follows: the user makes a revocation request containing the serial number of the certificate and signs it with the compromised private key.
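
The fingerprint check can be sketched as follows, under the same library assumption; the host name and expected fingerprint are hypothetical placeholders.

```python
# Fetch the CA server's certificate over an SSL/TLS connection and compare
# its SHA-256 fingerprint with the value obtained out of band at the RA.
import socket
import ssl
from cryptography import x509
from cryptography.hazmat.primitives import hashes

CA_HOST = "ca.example.org"       # hypothetical CA server
EXPECTED_FINGERPRINT = "..."     # hex string handed over at the RA

# Chain validation is deliberately disabled here: in this scenario, trust
# is established manually through the out-of-band fingerprint instead.
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

with socket.create_connection((CA_HOST, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=CA_HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
fingerprint = cert.fingerprint(hashes.SHA256()).hex()

if fingerprint == EXPECTED_FINGERPRINT:
    print("connected to the genuine CA server")
else:
    print("fingerprint mismatch: do not proceed")
```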

PKI efforts date back to the late eighties. Many standards now exist, and a good introduction from the technological point of view can be found in Arsenault et al. (2002).

However, there are still many open issues (Gutmann, 2002). There is no support for automatic certificate distribution and none for automatic revocation checks. Further, there are problems with the atomicity of certificate revocation transactions and with the frequent issuing of CRLs. Finally, there are the costs associated with the distribution of CRLs and the problems of finding certification chains, i.e., determining the appropriate sequence of certificates in a non-centralized environment.

Additional Elements of Security Infrastructure—Commercial Off-the-Shelf Solutions

Security infrastructure is not limited to PKI, which forms its basis, but also includes other systems that are mainly available as commercial off-the-shelf solutions.

Security Issues of New Paradigms in IT

New paradigms include objects, components, mobile code (mobile computing) and intelligent agents. These are all based on recent trends in software development, i.e., object-oriented design and implementation, and network awareness (network awareness means that the code has to be highly integrated into the network environment and react accordingly).

The Java language has been designed in line with the above requirements and is becoming the de facto standard for programming modern, network-aware applications. Java is based on the object paradigm, where objects are self-contained pieces of software code with their own data and methods. Nowadays, objects are usually grouped into components, i.e., independent modules that can be inserted into, or removed from, an application without requiring other changes to the application.

Nevertheless, objects are the generic elements, so security considerations have to start with the proper treatment of objects. Every piece of code (and every object) can be treated as an electronic document: the creator defines its initial data and behavior (methods) and optionally signs it. The signature on the code allows a user to be assured of the proper functioning of the object; the problem is analogous to that of assuring authentication and integrity for ordinary electronic documents. The mainstream of software-system development is moving in a direction where objects/components will be available over the network for installation and execution at local (or remote) premises. To ensure security for the local environment, which means protection from malicious code, these objects have to be signed by the producer, and the signatures must be checked before deployment. If the source of the objects (components) is trusted, the code can be executed or installed.
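
A minimal sketch of such a check, under the same library assumption as the earlier examples: the file names are hypothetical, and the producer is assumed to have signed the component with an RSA key whose certificate has already been validated (including the CRL check described earlier).

```python
# Before installing a downloaded component, verify the producer's signature
# over it with the producer's public key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

with open("component.jar", "rb") as f:
    component = f.read()
with open("component.sig", "rb") as f:
    signature = f.read()
with open("producer_pub.pem", "rb") as f:
    producer_key = serialization.load_pem_public_key(f.read())

try:
    # The producer signed the component's hash with its private key
    # (an RSA key, by assumption).
    producer_key.verify(signature, component,
                        padding.PKCS1v15(), hashes.SHA256())
    print("signature valid: component may be installed or executed")
except InvalidSignature:
    print("signature invalid: reject the component")
```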

An important new paradigm in IT is the proliferation of mobile computing, which has significant implications for the business environment. One should bear in mind that we are reaching the point where the number of wireless devices will exceed the number of fixed nodes. This poses new requirements on security, as handheld mobile devices have limited processing power and memory capacities. Thanks to elliptic curve cryptography, it is possible to provide strong cryptographic mechanisms for these devices. The problem for the wireless world, however, is PKI. Besides the open issues already mentioned, PKI in the wireless world requires extensive computation for processing certificates and CRLs and further narrows the available throughput. Appropriate standards that would enable wide-scale secure deployment are yet to come (Miller, 2001).

Mobile code, and especially mobile intelligent agents, present a fundamentally different approach in the contemporary computing environment: besides mobility, such code exhibits autonomy, adaptability, intelligence, the capability to cooperate, and persistence (Griss, 2001). Agents are passed from one processing environment to another, where they use the computing resources of the host. Intelligent agents act on behalf of their users to find the best offers, bid at auctions, etc. Therefore, their security is of utmost importance. Fundamental threats include uncontrolled read and write access to core agent services, the privacy and integrity of agent messages, and denial-of-service problems. The reason is that agents operate in unpredictable environments and have to be protected from malicious hosts. Put another way, mobile agents are vulnerable to code peeping and to code modification through false computation. These important issues are yet to be resolved (FIPA, 2001).
