Tonge, Rzepa, Yoshida, page 1
Proofs to: H. S. Rzepa, Dept Chemistry, Imperial College, London, SW7 2AY.
------------------------------------
Authentication of Internet-based Distributed Computing Resources in
Chemistry.
Henry S. Rzepa (a)* and Hiroshi Yoshida (b)
(a) Department of Chemistry, Imperial College of Science, Technology and Medicine, London
SW7 2AY. (b) Department of Chemistry, Faculty of Science, Hiroshima University,
Higashi-Hiroshima 739-8526, Japan
Abstract: The evolution of the World Wide Web from a model allowing only the public and
open exchange of information on the global Internet via simple HTML-based pages, to one
involving the additional development of Intranets or Extranets with secure interactive client-
server processes is discussed. These client-server processes can be customised on the client
browser by the use of embedded applications such as Java Applets that allow the client to
communicate with other remote applications or databases via a distributed computing
architecture. We show[1] how software authentication using digital object signing procedures in
conjunction with X.509 certificates was used to develop three authenticated distributed
chemical applications. COS (Chemical Object Store) is a database application based on Java
Remote Method Invocation, MoldaNet invokes Java3D to create a molecular visualisation tool
and JSpec delivers analytical spectral data. We argue that such authentication of chemical
resources on the Internet provides one mechanism for increasing the perception of quality and
integrity of molecular information disseminated within the chemical community and creates an
architecture for electronic commerce to develop in the molecular science community.
Introduction
The dominant computing model during the 1970s and 1980s was of a mainframe computer
with a centralised CPU located in a special room and providing computing services to local
dumb terminals. This model was superseded by the advent of powerful and relatively
inexpensive desktop workstations in the late 1980s, which permitted the introduction of the
so-called interactive client-server or network distributed computing model. This allowed
applications to be split amongst multiple processors, with actions on one local machine
creating processes on a second remote machine, for example a computationally intensive
scientific calculation or a database query. The separate widespread deployment of Internet
technology using TCP/IP networking protocols in the mid-1990s saw the introduction of
another model, involving addressable information pages marked up using HTML (Hypertext
Markup Language) and served by a HTTP (Hypertext Transport Protocol) server, the so-
called World-Wide Web model.[2] The server in this architecture is a networked computer,
which can receive HTTP requests from any TCP/IP-compliant computer in the world and
can return appropriate pages to be accessed and viewed by a suitable Web browser at the remote
client.
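The request-response exchange underlying this model can be illustrated with a short sketch. The fragment below, written in Java (the language used for the applications described later), constructs the raw text of a minimal HTTP/1.0 GET request as a Web browser would transmit it to a server; the host name and path shown are purely illustrative and not part of any system described in this paper.

```java
public class HttpRequestDemo {
    // Builds the raw text of a minimal HTTP/1.0 GET request, exactly as a
    // client (e.g. a Web browser) would send it over a TCP connection to
    // an HTTP server. The server replies with a status line, headers and
    // the requested HTML page.
    static String buildGetRequest(String host, String path) {
        return "GET " + path + " HTTP/1.0\r\n"
             + "Host: " + host + "\r\n"
             + "\r\n";  // the blank line terminates the request headers
    }

    public static void main(String[] args) {
        // Hypothetical host and path, for illustration only
        System.out.print(buildGetRequest("www.example.org", "/index.html"));
    }
}
```

The server's reply to such a request is simply a stream of text (headers followed by an HTML body), which is why any TCP/IP-compliant machine can act as either client or server.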
These two distributed computing and publishing models are now fusing in the late 1990s into
a more closely integrated system. Thus Web browsers have developed from being capable of
simply displaying static HTML pages to also having the ability to become interactive clients