LIBRES: Library and Information Science Research
Electronic Journal ISSN 1058-6768
2000 Volume 10 Issue 2; September


 Editorial note:

 This section contains items culled from various Internet news services, discussion lists and other announcements.  Unless specifically noted, I have not visited the sites, used any of the software, reviewed the literature, or written the news items.  I present this digest to you in good faith but cannot vouch for the accuracy of its content.

 Kerry Smith



New SIG/Digital Libraries

Date: Thu, 20 Jul 2000 17:00:46 -0400
Sender: Open Lib/Info Sci Education Forum <>
From: Suzie Allard <slalla0@POP.UKY.EDU>
Subject: New ASIS SIG on Digital Libraries

Are you interested in Digital Libraries? If you are, please read on!

The American Society for Information Science has chartered a new SIG
focusing on Digital Libraries. SIG/DL provides a forum for discussion
about research, development and use of digital libraries in corporate,
academic and public contexts. It is concerned with all aspects of digital
library design and implementation including, but not limited to: copyright
issues, networked infrastructures, interoperability of distributed
databases, database management, policies and standards in relation to the
digitization of material, metadata, preservation, providing electronic
access, authentication, software, hardware, peopleware, and project
management. SIG/DL also works to facilitate increased international,
interdisciplinary and inter-societal collaboration focusing on digital
libraries and related issues.

The vitality of SIG/DL is enhanced by the multi-faceted talents of its
members whether they be accomplished experts in DL issues or relative
newcomers to this exciting information frontier. As always, ASIS encourages
scholars, professionals and students to participate. I hope that you will
add your voice and ideas to this SIG.

If you have any questions, please feel free to contact me or to write to our Chair, Allison Kopczynski.

Best Regards,
Suzie Allard
Chair-Elect, SIG/DL

(Membership information is below my signature)
Suzie Allard
University of Kentucky (859)257-8876
College of Communications and Information Studies
School of Library and Information Science
502 King Library South, Lexington, KY 40506-0039

At the following site you may add SIG/DL to your existing membership or join
ASIS designating SIG/DL as your SIG choice:

Adding SIG/DL to an existing membership: $6. If you have an existing
membership, you can print out the form, make a note that you are adding a
SIG, include the $6 (check or charge), and then mail it to the address on the
form.
Joining ASIS: $115 annual membership dues which includes $19 as payment for
the Bulletin AND $23 as payment for JASIS.

ASIS values students and offers them a special membership rate of just
$30 a year. There is a UK ASIS student chapter. The benefits are too
lengthy to include here, so check them out, along with eligibility requirements, at:


Controlled Vocabulary and the Internet

Date: Sun, 30 Jul 2000 12:02:32 -0500
To: <>
Subject: ASIS-L: CENDI Conference on _Controlled Vocabulary and the Internet_

CENDI Conference on _Controlled Vocabulary and the Internet_

I recently learned about the availability of the PowerPoint presentations made at the CENDI conference on _Controlled Vocabulary and the Internet_ held September 29, 1999. CENDI is a governmental interagency Working Group composed of senior Scientific and Technical Information (STI) managers from major programs in several U.S. federal agencies [ ]. 

The presentations are available at 

The following is a listing of the conference presentations and their authors:

Joseph A. Busch, DataFusion, Inc. - From Authority Files to Ontologies: Knowledge Management in a Networked Environment (PowerPoint File, 298 KB) 

Patricia Harpring, Vocabulary Program Senior Editor, The Getty - Documenting & Access:
Indexing with the Getty Vocabularies (PowerPoint File, 980 KB) 

Stuart J. Nelson, MD, National Library of Medicine - The Role of the Unified Medical
Language System (UMLS) in Vocabulary Control (PowerPoint File, 185 KB) 

Dagobert Soergel, College of Library and Information Services, University of Maryland - Enriched Thesauri as Networked Knowledge Bases for People and Machines (PDF File, 350 KB) 

Dr. Elizabeth D. Liddy, Center for Natural Language Processing, School of Information Studies, Syracuse University - Whither Come the Words? (PowerPoint File, 116 KB) 

Larry Fitzwater, EPA/OIRM/EIMD, and Linda Spencer, EPA/OIRM/EIMD - EPA Terminology Reference System (TRS) (PowerPoint File, 533 KB) 

Joyce Ward, Northern Light Technology, Inc. - Indexing and Classification at Northern Light
(PowerPoint File, 356 KB) 

Quin J. Hart, University of California at Davis - The CERES/NBII Thesaurus Partnership
Project (Project Web Site) 

Stephen M. Griffin, National Science Foundation - Digital Libraries Initiative: An
Interagency Program of Research and Applications (PowerPoint File, 728 KB) 

In addition to these presentations, the site contains the PowerPoint presentations from other CENDI programs, including: Reference/Citation Linking: The Federal Perspective - A Joint CENDI/FLICC Workshop, held June 21, 2000; GovTechNet '99, held in June 1999; and
Open Source Solutions 97: Global Security and Global Competitiveness, held in June 1997.

I believe that MyWebColleagues will find each of these of great value and interest!

/Gerry McKiernan
Theoretical Librarian 
Iowa State University
Ames IA 50011 

"Life is What Happens While You're Making Other Plans"


Council of Europe's Congress of Local and Regional Authorities
series of meetings on
Local and Regional Information Society.

Date: Tue, 11 Apr 2000 12:22:58 +0100
Subject: FW: Council of Europe/Information Society
From: "Cano, Virginia" <>
To: "''" <>

-----Original Message-----
From: Michael Macpherson [] 
Sent: 10 April 2000 11:13
Subject: Council of Europe/Information Society

key words: local, regional, government, governance, citizen, participation,
democracy, society, economy, electronic

Colleagues and Friends,

During 1998 and 1999 the Council of Europe's

Congress of Local and Regional Authorities
organised a series of meetings on

Local and Regional Information Society.

A rapporteur's memorandum of these meetings, and a recommendation of the
Congress, may be found on the World Wide Web. From the memorandum: "More than
six hundred local and regional representatives and experts from twenty-four
European countries took part in the three Seminars on "Local and Regional
Information Society", held in
Helsinki, Finland (21-23 January 1998),
Miercurea Ciuc, Romania (8-9 October 1998) and
Hradec Kralove, Czech Republic (15-16 March 1999)."

Council of Europe reports etc. are available in both French and English.



Council of Europe/The Congress of Local and Regional Authorities of Europe

I was a speaker at two of these meetings (in Romania and Czech Republic)
and am pleased to see that there is a report, and that it's available
on-line :-)

It would be good to discuss some of the issues raised in the Council's
memorandum and recommendation on Local and Regional Information Society.
One virtual place to do that is Democr@cy Forum, hosted by John Gotze and
myself at

Michael Macpherson

Dr. Michael Macpherson,
PSAMRA/Integral Studies,
Berlin FRG
tel.: +49 30 262 3768

Join: Democr@cy Forum



Digital Preservation - Two Major Reports

Date: Sun, 30 Jul 2000 14:43:45 -0500
To: <>
Subject: ASIS-L: Two Major Reports on Digital Preservation

_Two Major Reports on Digital Preservation_

I recently learned about two major reports relating to the issues surrounding digital preservation and believe that they will be of interest to MyWebColleagues.

The first of these was prepared by Gail Hodge of Information International Associates on behalf of the International Council for Scientific and Technical Information (ICSTI) [ ]. The publication, _Digital Electronic Archiving: The State of the Art and the State of the Practice_, was published in April 1999. It is accessible at the following addresses: 
[Table of Contents] 
[Executive Summary] 
[Main Report]

The second is a report prepared by a committee for the National Academy of Sciences (NAS) and provides recommendations on the digital preparedness of the Library of Congress in collecting and preserving digital resources. A general summary of the report recently appeared in the New York Times on July 27, 2000 ["Saving the Nation's Digital Legacy" by Katie Hafner 
( )]

[NB: This article was also published as "Library of Congress Lags in Archiving Digital Preservation" in the July 26th NYTimes Web edition ]

[NOTE: A free account is required to access this article]

The title of the report is _LC21: A Digital Strategy for the Library of Congress_. The Library of Congress (LoC) commissioned the study. The report's recommendations are more fully described in a press release available from the LoC [ ]

A *pre-publication* copy of the report is available at:

/Gerry McKiernan
Theoretical Librarian
Iowa State University
Ames IA 50011 

"Life is What Happens While You're Making Other Plans"


ECAI - Electronic Cultural Atlas Initiative

Date: Fri, 9 Jun 2000 11:41:38 -0700 (PDT)
From: Michael Buckland <buckland@SIMS.BERKELEY.EDU>
Subject: ASIS-L: ECAI meeting, London

The Electronic Cultural Atlas Initiative (ECAI) is a rapidly growing
cooperative effort among scholars to increase the use of geo-referencing
on datasets and so expand the analyses that can be performed, especially
with datasets of historical and cultural heritage interest. The emphasis
is on sharing data that are coded by both place and time, to distinguish,
say, Kyoto in 1500 from Kyoto in 1900, and on making data freely
available.

ECAI Atlas Teams of area specialists, in conjunction with ECAI Technical
Teams, are producing an interactive electronic atlas of the world from
which selected data from regions, eras, and disciplines can be
instantaneously accessed.
ECAI participants hold six-monthly meetings. The next is in London,
hosted by the British Library, June 26-28, 2000; after that, Hong Kong
in mid-January 2001.
Details, including the program for the London meeting and of the January
2000 meeting in Berkeley, can be found at:

Michael Buckland, School of Information Management & Systems,
University of California, Berkeley, CA 94720-4600
(510) 642 3159



Date: Tue, 04 Jul 2000 09:28:34 +0200
Organization: International Federation for Information and Documentation (FID)
Subject: FID Publication 720
From: Theresa Stanton <>


'Defining and Assessing the Impact of Information on Development:
Building Research and Action Agendas'. Edited by Forest Woody Horton Jr.

FID Occasional Paper No. 16, 136 pages, Price: US Dollars 60 / 56 Euros
(including p&p). ISBN 92-66-00-720-X. Available from: FID Secretariat,
P.O. Box 90402, 2509 LK The Hague, Netherlands. Fax: +31-70-3140667,

At a time when all nations are struggling to develop their national
information infrastructures so that they can link quickly and
effectively to the global Internet society, this timely publication from
the International Federation for Information and Documentation (FID)
offers indispensable guidance to governments, professional societies,
universities, and practicing information professionals, sharing the very
latest research results to help them better understand how to define and
measure the impact of information on economic and social development. 

This landmark monograph is the result of an extensive series of research
projects over nine years that spanned five continents and involved
hundreds of development experts and practitioners from around the
world. It describes these projects, discusses a working framework for
measuring impact, and makes a series of recommendations for future
work. Every government official concerned with development, as well as
every foundation and organization concerned with planning, evaluating,
and implementing projects and programs in the development sphere, should
read this report and keep a copy on their bookshelf! 

Please note: Developing countries receive a special reduced rate, and
additional discounts are given for bulk orders. More details from FID: Tel.:
+31-703140671 - Email:

Order your copy(ies) now!

Simply complete this form and return it by email or fax to FID at the
address below:

Please send me ……..copy(ies) of: FID Occasional Paper 16: 'Defining and
Assessing the Impact of Information on Development: Building Research
and Action Agendas'. Edited by Forest Woody Horton, Jr. 136 pages.
Price: US Dollars 60 / 56 Euros (including p&p). 

Payment: Cheque sent, payable to FID (cheques in major currencies accepted):……
Invoice me:…… 
Invoice my institution:……. 
VAT number (if in European Union)……………………………………….. 
Please send your order by return email or fax/mail to: Magda Bouwens,
FID Secretariat, P.O. Box 90402, 2509 LK The Hague, Netherlands. Fax:


From Gutenberg to the Global Information Infrastructure:
Access to Information in the Networked World

Date: Mon, 12 Jun 2000 11:22:21 -0400
Sender: Open Lib/Info Sci Education Forum <>
From: Gretchen Whitney <>
Subject: From Gutenberg to the Global Information Infrastructure (fwd)

---------- Forwarded message ----------
Date: Sat, 20 May 2000 12:06:03 -0700 (PDT)
From: Phil Agre <>
To: Red Rock Eater News Service <>
Subject: [RRE]From Gutenberg to the Global Information Infrastructure

[Heavily reformatted; apologies for any glitches.]

Date: Sat, 20 May 2000 11:22:20 -0700
From: Christine Borgman <>

From Gutenberg to the Global Information Infrastructure:
Access to Information in the Networked World

Christine L. Borgman

MIT Press, March 2000

Table of Contents

1 The Premise and the Promise of a Global Information Infrastructure
2 Is It Digital or Is It a Library? Digital Libraries and
Information Infrastructure
3 Access to Information
4 Books, Bytes, and Behavior
5 Why Are Digital Libraries Hard to Use?
6 Making Digital Libraries Easier to Use
7 Whither, or Wither, Libraries?
8 Acting Locally, Thinking Globally
9 Toward a Global Digital Library: Progress and Prospects

Chapter 1

The Premise and the Promise of a Global Information Infrastructure

Let us build a global community in which the people of neighboring
countries view each other not as potential enemies, but as potential
partners, as members of the same family in the vast, increasingly
interconnected human family. -- Vice-President Al Gore (1994a)

The information society has the potential to improve the quality of
life of Europe's citizens, the efficiency of our social and economic
organization and to reinforce cohesion. -- Bangemann Report (1994)

The premise of a global information infrastructure is that
governments, businesses, communities, and individuals can cooperate
to link the world's telecommunication and computer networks together
into a vast constellation capable of carrying digital and analog
signals in support of every conceivable information and communication
application. The promise is that this constellation of networks will
promote an information society that benefits all: peace, friendship,
and cooperation through improved interpersonal communications;
empowerment through access to information for education, business,
and social good; more productive labor through technology-enriched
work environments; and stronger economies through open competition in
global markets.

The promise is exciting and the premise appears rational. Information
technologies are advancing at a rapid pace and becoming ever more
ubiquitous. Many scholars, policy makers, technologists, business
people, and pundits contend that changes wrought by these new
technologies are revolutionary and will result in profound
transformations of society. Physical location will cease to matter.
More and more human activities in working, learning, conducting
commerce, and communicating will take place via information
technologies. Online access to information resources will provide
a depth and breadth of resources never before possible. Most print
publication will cease; electronic publication and distribution
will become the norm. Libraries, archives, museums, publishers,
bookstores, schools, universities, and other institutions that rely on
artifacts in physical form will be transformed radically or will cease
to exist. Fundamental changes are predicted in the relationships
between these institutions, with authors less dependent on publishers,
information seekers less dependent on libraries, and universities less
dependent on traditional models of publication to evaluate scholarship.
Networks will grease the wheels of commerce, improve education,
increase the amount of interpersonal communication, provide
unprecedented access to information resources and to human expertise,
and lead to greater economic equity.

In contrast, others argue that we are in the process of evolutionary,
not revolutionary, social change toward an information-oriented
society. People make social choices which lead to the development of
desired technologies. Computer networks are continuations of earlier
communication technologies such as the telegraph and telephone,
radio and television, and similar devices that rely on networked
infrastructures. All are dependent on institutions, and these evolve
much more slowly than do technologies. Digital and digitized media
are extensions of earlier media, and the institutions that manage them
will adapt them to their practices as they have adapted many media
before them. Electronic publishing will become ever more important,
but only for certain materials that serve certain purposes. Print
publishing will co-exist with other forms of distribution. Although
relationships between institutions will evolve, publishers, libraries,
and universities serve gatekeeping functions that will continue
to be essential in the future. More activities will be conducted
online, with the result that face-to-face relationships will become
ever more valued and precious. Telecommuting, distance-independent
learning, and electronic commerce will supplement, but not supplant,
physical workplaces, classrooms, and shopping malls. Communication
technologies often increase, rather than decrease, inequities, and
we should be wary of the economic promises of a global information
infrastructure.

Which of these scenarios is more likely to occur? Proponents of each
offer historical precedent and argue rationally for their cases. Many
other scenarios exist, some between those presented above and some at
the far ends of the spectrum. The extremes include science-fiction-
like scenarios in which technology controls all aspects of daily
life, resulting in a police state where every activity is monitored,
and survivalist scenarios in which some catastrophe destroys all
technology, with the result that new societies are reinvented
without it. The science fiction and survivalist scenarios are easily
discounted because checks and balances are in place to prevent them.
Choosing between the revolutionary, discontinuity scenario and the
evolutionary, continuity scenario described above is more problematic.
Each has merit and each is the subject of scholarly inquiry and
informed public debate.

In view of the undisputed magnitude of some of these developments, it
is reasonable to speak of a new world emerging. It is not reasonable,
however, to conclude that these changes are absolute, that they will
affect all people equally, or that no prior practices or institutions
will carry over to a new world. Nor is it reasonable to assume that
any individual institutions, whether libraries, archives, museums,
universities, schools, governments, or businesses, will survive
unscathed and unchanged into the next millennium. Strong claims in
either direction are dangerous and misleading, as well as lacking in
intellectual rigor. The arguments for these scenarios, the underlying
assumptions, and the evidence offered must be examined. Upon
close examination, it will often be found that strong claims about
the effects of information technologies on society, and vice versa,
are based on simplistic assumptions about technology, behavior,
organizations, and economics. None of these factors exists in a
vacuum; they interact in complex and often unpredictable ways.

I argue throughout this book that the most likely future scenario
lies somewhere between the discontinuity and continuity scenarios.
Information technology makes possible all sorts of new activities
and new ways of doing old activities. But people do not discard all
their old habits and practices with the advent of each new technology.
Nor are new technologies created without some expectations of how they
will be employed. The probable scenario is neither revolution nor
evolution, but co-evolution of information technology, human behavior,
and organizations. People select and implement technologies that are
available and that suit their practices and goals. As they use them,
they adapt them to suit their needs, often in ways not anticipated
by their designers. Designers develop new technologies on the basis
of technological advances, marketing data, available standards, human
factors studies, and educated guesses about what will sell. Products
evolve in parallel with the uses for which they are employed. To use
a simplistic aphorism: Technology pushes, while demand pulls.

The central concern of this book is access to information in a
networked world. Information access is among the primary arguments
for constructing a global information infrastructure. Information
resources are essential for all manner of human affairs, including
commerce, education, research, participatory democracy, government
policy, and leisure activities. Access to information for all these
purposes is at the center of the discontinuity-continuity debates.
Some argue that computer networks, digital libraries, electronic
publishing, and similar developments will lead to radically different
models of information access. The technologies of creation,
distribution, and preservation will undergo dramatic transformation,
as will information institutions such as libraries, archives, museums,
schools, and universities. Relationships among these and other
stakeholders, including authors, readers, users, and publishers, will
evolve as well. Others argue that stakeholders, relationships, and
practices are so firmly entrenched that structural changes will be
slow and incremental because most new technologies are variations on
those that came before. My view is that some degree of truth exists
in each of these statements. These and other arguments are examined
throughout the book.

Much has been written about technology, human behavior, and policy
regarding access to information. Most of the writing, however,
focuses on one of these three aspects with little attention to the
other two. In this book I endeavor to bring all three together,
drawing on themes, theories, results, and practices from multiple
disciplines and perspectives to illustrate the complex challenges that
we face in creating a global information infrastructure. Technical
issues in digital libraries and information retrieval systems are
addressed, but not in the depth provided in recent books by Lesk
(1997a) and Korfhage (1997). Nor are design issues addressed to
the degree covered by Winograd et al. (1996). Information-related
behavior in electronic environments is covered, but in less depth
than in Marchionini 1995. Institutional and organizational issues are
treated more fully in Bishop and Star 1996, Bowker et al. 1996, and
Sproull and Kiesler 1991. Policy issues of the Internet are addressed
in more depth in Branscomb and Kahin 1995, Kahin and Abbate 1995, and
Kahin and Keller 1995. In this book I draw on these and many other
resources to weave a rich discussion of access to information in a
networked world. In view of the early stages of these developments,
more questions are raised than yet can be answered. My hope is to
provoke informed discussion between the many interested parties around
the world.

Converging Tasks and Technologies

People use computer networks for a vast array of activities, such
as communicating with other individuals and groups, performing tasks
requiring remote resources, exchanging resources, and entertainment
(whether with interactive games or passive media such as videos).
Among the few common threads in predictions of future technology (see,
e.g., Next 50 Years 1997 and Pontin 1998) is that we will see more
convergence of information and communication technologies, blurring
the lines between tasks and activities and between work and play.
We will have "ubiquitous computing" (Pontin 1998) and "pervasive
information systems" (Birnbaum 1997). We will become "intimate with
our technology" (Hillis 1997), and "information overload" (Berghel
1997a) will be more of a problem than ever.

An underlying theme of such predictions is "digital convergence",
indicating that more and more information products will be created in
digital form or will be digitized, allowing applications to be blended
more easily. Digital technologies will co-exist with analog and
other forms of information technologies yet to be invented. Analog
technology is based on continuous flows, rather than the discrete
bits of digital technology. Computer and communication networks
are an example of the bridge between these technologies. The word
"modem" was coined from "modulate" and "demodulate", which describe
the device's function in converting digital data produced by computers
into analog signals that could be sent over telephone lines designed
for voice communication and vice versa. Predictions of ubiquitous
computing are based on an increasing reliance on small communication
devices and embedded systems such as those that control heating and
lighting in homes and offices. Future computer networks are expected
to link these devices just as they now link personal computers, data
storage, printers, and other peripherals (Pontin 1998).

Modes of Communication

No matter what technologies gird the framework of the global
information infrastructure, human activities involving the network
will be intertwined. As the editors of Wired magazine (1997, p. 14)
put it,

... broader and deeper new interfaces for electronic media are
being born. ... What they share are ways to move seamlessly
between media you steer (interactive) and media that steer you
(passive). ... These new interfaces work with existing media,
such as TV, yet they also work on hyper-linked text. But most
important, they work on the emerging universe of networked media
that are spreading across the telecosm.

Despite the hyperbole, this quotation highlights a useful distinction
between "pull" technology (which requires explicit action by the user)
and "push" technology (which comes to the user without the user's
explicit action). Some activities are easily categorized by this
dichotomy, but others have characteristics of each. Composing and
sending an email message and searching a database require explicit
"pull" actions, for example. Although both the broadcast mass media
and the emerging media services that deliver tailored selections of
content to workstations during idle time can be classified as push
technologies (editors of Wired 1997), the latter form also could
be considered "pull", because the user presumably took action to
subscribe to the service. Similarly, if composing and sending email
is pull technology, then receiving mail can be viewed as a form
of "push". Opening and reading messages requires explicit actions,
but users can decide what to read, delete, or ignore. They also can
sort desirable and undesirable messages by means of automatic filters.
Because subscribing to desirable content and filtering out undesirable
content require parallel actions, both can be viewed as forms of push
technology if one accepts the Wired definitions of "push" and "pull".

Push and pull combine in other ways as well. People subscribe to
distribution lists, which then send messages at regular or irregular
intervals. They also subscribe to services that alert them when new
resources are posted on a specific network site, but they must take
explicit action to view or retrieve the resources from that site.

Truly interactive forms of communication are difficult to categorize
as push or pull. People engage in conversations in "chat rooms",
play roles in MUDS and MOOS, and hold conferences, meetings, and
classes online in real time. All require explicit actions, but
the characteristics of these two-way or multi-way conversations are
far richer than the solo-action pull of searching a database or
sending a message. Some of these are the "demassified" communication
technologies that Rogers (1986) predicted, more tailored to individual
users and to small audiences. However, the "push" technologies of
customized desktop news delivery touted by Wired in 1997, in which
messages continually scroll across the subscriber's screen, have yet
to become the commercial success that was predicted. Perhaps they
were not sufficiently customized or "demassified". Perhaps people
found them too disruptive, preferring "pull" modes in which they could
acquire the desired content at their convenience.

The intertwining of communication modes in electronic environments
adds new dimensions to information access. Although more study
has been devoted to "active" than to "passive" information seeking,
even these categories are problematic in this new environment.
These are but a few of many communication definitions and concepts
being reconsidered in the light of new information technologies.

Task Independence and Task Dependence

The more intertwined tasks and activities become, the more difficult
it becomes to isolate any one task for study. In the past, most
theory and research presumed that the human activities involved in
access to information could be isolated sufficiently to be studied
independently. This is particularly true of information-seeking
behavior, a process often viewed as beginning when a person recognizes
the need for information and ending when the person acquires some
information resources that address the need. Such a narrow view of
the process of seeking information simplifies the conduct of research.
For example, information seekers' activities can be studied from
the time they log onto an information retrieval system until they
log off with results in hand. The process can be continued further
by following subsequent activities to determine which resources
discovered online were used, how, and for what purposes. Another
approach is to constrain the scope of study to library-based
information seeking. People can be interviewed when they first
enter a library building to identify their needs as they understood
them at that time. Researchers can follow users around the building
(with permission, of course), and can interview the users again before
departure to determine what they learned or accomplished.

Narrowly bounded studies such as these provide insights into detailed
activities and are useful for evaluating specific systems, services,
and buildings. However, their value and validity are declining for
the purposes of studying the information environment of today and
assessing the needs of the future. In the early days of information
retrieval, people might reasonably conduct most or all of their
searching on one retrieval system. Only a few systems existed, and
each had a limited number of databases. These were complex systems
requiring lengthy training. Information seekers, often with the
assistance of skilled searchers, would devote considerable effort to
constructing, executing, and iterating a search on a single system
(Borgman, Moghdam, and Corbett 1984). A close analysis of user-system
interaction could provide a rich record of negotiating a single
search query. Even so, such studies provide little insight into
the circumstances from which the information need arose or into
the relationship between a particular system and the use of other
information resources.

In today's environment, most people have access to a vast array of
online resources via the Internet and online resources provided by
libraries, archives, universities, businesses, and other organizations
with which they are affiliated, as well as print and other hard-copy
resources. They are much less dependent on any single system or
database. Rather, they are grazing through a vast array of resources,
perhaps "berrypicking" (Bates 1989) from multiple sources and
systems. Studying any individual system is far less likely to provide
a comprehensive view of information-seeking activities than it was
in the past. Similarly, people have fewer reasons to spend time in
library buildings, now that they can use many library resources from
the convenience of home, office, dorm, coffee shop, or anywhere else
with network access. And they can do so at hours of day or night
when library buildings normally are closed. Thus, time spent in the
library building may be for narrower and more specific purposes, and
may occur only at critical stages in the search process. The use of
library buildings also reflects patterns that are influenced by age,
generation, culture, discipline of study, and many other factors.
Such research should yield insights into the design of future
buildings and services, provided it is set in a larger context of
overall information-use patterns.

Future research on access to information must consider the complex
relationships between information-related activities and the context
of work and leisure practices in which these activities are conducted.
Although all scholarship is constrained by the necessity of studying
that which can be studied, particular caution is necessary when
studying tasks that tend to be interdependent.

Technology Adoption and Adaptation

Underlying the design of any information technology are assumptions
about how and why people will use it. The assumptions are sometimes
explicit and sometimes only implicit, whether for individual
communication devices, for information systems, or for the design
of a global information infrastructure. In identifying design
criteria, and making implicit assumptions explicit, many methods and
perspectives can be applied. We can evaluate which prior technologies
were adopted and which were not, the processes by which they were
adopted, how similar technologies are used, what features and
functions are most popular and most effective, and how their users
adapt them to new purposes.

I will highlight three perspectives on assessing how and why people
use information technologies. Though many other perspectives and
methods exist, these three are applicable to our concerns for access
to information.


Of the vast number of information technologies that are invented,
only a few make it to the marketplace, and of these, even fewer are
successful. The quality of the product is only one determinant of
market success. Many products that receive critical acclaim fail to
garner large market shares. The Beta video recording technology and
the Macintosh computer are the best-known examples. In contrast, many
products whose reviews range from skepticism to scorn achieve great
market success. Business factors such as timing, marketing, and
pricing are determinants of success. Other determinants are social
factors involving how and why people choose to adopt any particular
innovation. Rogers (1983, 1986) summarizes the results of a large
number of adoption studies using a five-stage model. The first stage
of adoption is knowledge, or becoming aware of the existence of a new
technology that might be useful. This stage is influenced by factors
such as previous practices, felt needs or problems, tendencies
toward being innovative, and norms of the individual's social system.
The second stage is persuasion, which in turn is influenced by the
perceived characteristics of the innovation, how well it might work,
how easy it is to try, and how easily the outcome can be observed.
In the third stage, the adopter makes a tentative decision to accept
or to reject the technology. Acceptance may lead to implementation
(fourth stage) and, if the innovation is deemed sufficiently
useful, to a confirmation to continue its use (fifth stage). If the
innovation is rejected, the individual still may revisit the decision
and adopt it later.

Electronic mail (email) provides an instructive example of the
adoption process. A person may first become aware of its existence
through news reports or through discussions with friends, family, or
co-workers. Someone surrounded by email users will hear about it more
quickly and frequently than someone whose acquaintances are nonusers.
Even today, elderly Americans who have minimal contact with computer
users may have at most a vague idea of what email is, for example. In
countries with minimal telecommunications and computing penetration,
only the elite may be aware of email as a potentially useful
technology. In the persuasion stage, a person who has many potential
email correspondents will find the technology more attractive than
a person who knows no one else with an email address. Similarly,
a person who already owns a computer with a modem will find it far
easier to try email than one who must acquire the technology and the
skills to use it. Once they have tried it, some people will find
email sufficiently useful, affordable, and worth the time and effort
to continue using it. Others will not. Thus, once people become
aware of email, only some will consider trying it, a smaller number
will make the effort to try it; of these, only some will acquire it
and continue using it, and they may abandon it later. Conversely,
some who rejected email at any of these adoption stages may consider
it again at some later time.

This adoption pattern also operates in the aggregate. The "early
adopters" typically are risk takers who are willing to try unproven
techniques, often at great expense. If they adopt the new technology,
their successes may convince more risk-averse individuals to try
it. Conversely, if the early adopters reject it, others may be
more reluctant to try it. By the time the low-risk late adopters
decide to implement a technology, the early adopters may have moved
on to something yet newer and more innovative. Some technologies
reach a critical mass of adoption in a short period of time and
are great market successes. Others are unable to find a match
with early adopters fast enough, and the entrepreneurs fail
before finding their niche in the market. Others fail because they
do not fill a perceived need. Yet others succeed because they are
good enough, cheap enough, and in the right place at the right time,
though not necessarily optimally designed. Though this explanation
is a gross simplification of the adoption process, it illustrates a
few of the many social variables that influence the success of new
information technologies.

Again, email provides a useful case example. Email filled a perceived
need early in the development of computer networks and reached a
critical mass of computer users fairly quickly. Spreadsheets were
a similarly attractive technology that contributed to the adoption
of personal computers. Early adopters of both technologies were
sophisticated computer users who tolerated complex user interfaces,
often unreliable software, and minimal functionality because the
technology was sufficiently valuable for their purposes. People who
are early adopters of one technology tend to be early adopters of
others, willing to tolerate immature technologies in return for their
benefits, and often enjoy the challenge of working at the "bleeding
edge" of technical frontiers.

Conversely, late adopters of one technology tend to be late adopters
of others. These people are far less likely to appreciate technology
for its own sake, preferring mature, easy-to-use technologies with
a high perceived payoff relative to the effort required in learning
to use them. They are happy to let others "work the bugs out" before
spending the time, effort, and money to adopt them. This distinction
between the personality characteristics and social context of early
and late adopters is an important one to bear in mind when considering
technologies intended for a mass market. If a global information
infrastructure is to achieve wide acceptance, it must be attractive to
late adopters.


Theories of diffusion and adoption are valuable in understanding
the social processes involved in choosing to employ a particular
technology. The "diffusion of innovations" theory originated in rural
sociology to explain farmers' choices of agricultural innovations
such as farming equipment, hybrid plants, pesticides, and techniques
for planting, harvesting, and storing crops. The theory was later
extended to study the adoption of a diverse array of innovations
including solar energy during a fossil-fuels shortage and family
planning methods in developing countries. One weakness of applying
the "diffusion of innovations" theory to information technologies
is the implicit assumption that the innovation is relatively static.
Information technologies tend to be more dynamic and flexible than
farming equipment, for example. Any communication device may be
short-lived, making it difficult to compare the actions of someone who
adopted the first crude implementation to those of someone who adopted
a more sophisticated and less expensive version only months later.
Moreover, information technologies are more malleable and adaptable to
individual purposes than are most other technologies. Thus, we must
look not just at the adoption of information technologies as a binary
(adopt / not adopt) decision, but also at how technologies, once
adopted, are adapted over time.

Books provide an early example of how people adapt information
technologies to their purposes. Manuscripts (meaning, literally,
hand-written) were the first form of written record. Manuscripts on
sheepskin or parchment were easier to create and read than chiseled
stone tablets, but still could be read only by one person in one place
at a time. Manuscripts could be loaned for manual copying, which
enabled duplication, however laborious. Gutenberg's improvements
in movable type in the fifteenth century made multiple copies
economically feasible for the first time. Early printed books
retained the shape and size of manuscripts, following the earlier
technology. Although the distribution of multiple copies meant that
more people could own and read a work concurrently, books still were
too bulky for portable use, except by the very rich. Greenberg (1998)
recounts the oft-told story of Abdul Kassem Ismael, who was said to
have had a library of 117,000 books in tenth-century Persia. Not
only did he carry his library with him while he traveled, on the backs
of 400 camels, he trained the camels to walk in alphabetical order.
Later innovations led to publishing books in more portable sizes that
fit not only in the saddlebags of yesteryear, but in the backpacks and
briefcases of today.

We find similar adaptations in the use of computer networks. The
ARPANET, precursor to the Internet, was created for remote access to
scarce computing resources. Electronic mail was a feature intended to
serve as an ancillary communication function. Email proved so useful
for general communication that it became the dominant use of the
network, much to the surprise of the ARPANET's designers (Licklider
and Vezza 1978; Quarterman 1990). Email was the "killer application"
that attracted most people to the Internet (Anderson et al. 1995;
Quarterman 1990), and it remains the most important reason for becoming
an Internet user (Katz and Aspden 1997).

Email is a far different application today than it was in the early
days of the ARPANET, however. Early email consisted of very short
plain text messages. Less than a decade ago, messages could take
several days to arrive, with delays caused whenever a server in a
store-and-forward network went down. Email was neither fast enough,
reliable enough, nor functional enough to replace most other forms of
communication. The technology advanced, as did users' perceived needs
for more capabilities and better services. Today's email supports
long messages of formatted text and is fast, reliable, convenient,
and inexpensive (Berghel 1997b). Increasingly, email software
allows people to send and receive file attachments that preserve the
integrity of text, images, graphics, and sound. For many purposes,
email is a suitable substitute for telephone, fax, post, or express
delivery services.

Email now combines the features of word processors, file transfer
(ftp), and multimedia file management. It also provides a bridge to
the World Wide Web by embedding live links to web sites. By including
a URL (uniform resource locator) address in an email message, a user
can click on an address to launch a browser application and link to
the web site. And the reverse is true. Once at the web site, a user
can click on "email" and send a message to the web site.
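The email-to-web bridge described above can be sketched in a few lines of Python: a hypothetical illustration, not any particular mail client's implementation, showing how a client might find URLs in a plain-text message body and hand them to the system's default browser. The sample message text and the simple URL pattern are assumptions for the example.

```python
import re
import webbrowser

# A simple pattern for web addresses in plain text; an illustrative
# sketch, not a fully RFC-compliant URL parser.
URL_PATTERN = re.compile(r"https?://[^\s<>\"]+")

def extract_urls(message_body: str) -> list:
    """Return the web addresses found in an email message body."""
    return URL_PATTERN.findall(message_body)

def open_links(message_body: str) -> None:
    """Launch the default browser for each embedded URL, much as an
    email client does when the user clicks on a live link."""
    for url in extract_urls(message_body):
        webbrowser.open(url)

if __name__ == "__main__":
    body = "The new issue is online at http://libres.curtin.edu.au/ today."
    print(extract_urls(body))
```

A real client would also render the link as clickable text and handle mailto: addresses in the reverse direction, from web page back to email.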

Email has evolved from a simple application to one that combines
a rich array of services. As users realized its value and its
constraints, they identified further improvements that could be made.
Yet today's complex email technology has too much functionality to be
feasible for some purposes. Thus, we also find evidence of complex
applications being stripped down to the bare elements that suit newly
identified needs. An example is the convergence of email with pocket
pagers, which themselves were initially a simple, single-function
technology. Some of today's more elaborate pagers include a full,
albeit tiny, QWERTY keyboard and alphanumeric display, on which people
can send and receive terse messages. Other pagers include function
keys for common responses to email-type messages: yes, no, time, date,
etc. Such devices can convey cryptic but critical messages, such as
"When do you arrive?" (answer: "AA 75, 8:44pm LAX"), "Did we win the
case?", "Running late, resched Tu at 3?" (answer: "no. Tu 2pm ok?"),
"pls get milk", or "get KT @ school".

These are but a few examples of how people adapt information
technologies by using them. People sometimes adopt only part of
a technology, as illustrated by the example of stripped-down email.
Other times they disable or circumvent features of a technology.
Email file attachments are a case in point. They are extremely useful
for exchanging files quickly between team members, co-authors, authors
and editors, authors or publishers and readers, or teachers and
students. But they are useful only when they work. When exchange
partners have identical hardware and software platforms, fast
connections, and (better yet) the ability to scan for viruses before
receipt, file exchange may be seamless.

System designers, as well as those who send file attachments, often
are unaware of the difficulties involved in receiving attachments
intact and in a usable form, however. Despite considerable progress,
the necessary platform independence and software independence required
for reliable exchange of attachments over networks has yet to be
achieved. File exchanges between different platforms (e.g., PC and
Macintosh) and different operating systems (Windows 95, Windows 98,
Windows NT, Macintosh OS 7.5, Macintosh OS 8.0, Unix, etc.) introduce
compatibility problems. Files created with widely used word
processing software such as Microsoft Word and Corel WordPerfect often
fail to transfer intact. Text may transfer but formatting may be
corrupted, and the likelihood of accurate transfer decreases with the
inclusion of software-specific features such as tables, graphics, and
macros. The more recent the version of the software used to create a
file, the less likely that earlier versions of the same software or of
competing software can open it intact. Exchanging files of graphics
or sound is yet more problematic. Adding another layer of concern
is the ability of attachments to carry computer viruses that can
contaminate the receiver's computer.
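The mechanics behind these exchange problems can be seen in how an attachment is actually packaged for transport. The sketch below, using Python's standard `email` library, shows a file being wrapped as a base64-encoded MIME part inside a multipart message; the addresses, subject, and file contents are hypothetical. Because the file travels as an encoded part rather than as plain text, a receiving client that mishandles the encoding or the content type can leave the message body readable while the attachment arrives unusable.

```python
from email.message import EmailMessage

# Build a message with one file attachment. All names here are
# illustrative stand-ins, not real addresses or files.
msg = EmailMessage()
msg["From"] = "author@example.edu"
msg["To"] = "editor@example.org"
msg["Subject"] = "Draft chapter"
msg.set_content("Draft attached; please confirm it opens intact.")

report = b"%PDF-1.3 ..."  # stand-in for the real file's bytes
msg.add_attachment(
    report,
    maintype="application",
    subtype="pdf",
    filename="chapter.pdf",
)

# Adding the attachment turns the message into a multipart container,
# with the file carried as a base64-encoded part.
print(msg.get_content_type())  # multipart/mixed
for part in msg.iter_attachments():
    print(part.get_filename(), part["Content-Transfer-Encoding"])
```

The declared content type ("application/pdf" here) is only a label; whether the receiver can actually open the decoded bytes still depends on having compatible software, which is the interoperability gap the text describes.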

Unsolicited file attachments containing job applications,
advertisements, jokes, cartoons, greeting cards, and myriad other
materials clog network and modem lines and fill disk space. Owing
to problems with technical compatibility, viruses, and bandwidth,
many people are making minimal use of file attachments, and some
are setting their email parameters to reject them entirely. Local
network managers are introducing delays in email delivery to scan all
attachments for viruses, adding another layer of complexity. Sending
faxes, or mailing paper and disks, can be faster, more reliable, and
less labor intensive.

The email examples offer several lessons in the adoption and
adaptation of information technologies. One lesson is that early
adopters are willing to use an immature technology. As they use it,
they will identify problems, recognize new possibilities, and demand
improvements. Later adopters will identify yet more problems and
more desirable capabilities as they integrate it into their practices,
refining the technology further. Another lesson is that one simple
technology may spawn so many features that it subdivides into
component parts, as email has done. We also see that advanced
features that are extremely useful in some situations may result in
unintended and undesirable consequences in others, as is the present
case with file attachments. When people have positive experiences
with a technology, they often are more inclined to adopt another
technology. Conversely, when they have negative experiences, they
trust the technology less than before, and are less inclined to try
something new. All these lessons argue for the importance of studying
the use of information technologies in actual working situations.
Though laboratory experiments are extremely valuable for improving
technologies under ideal conditions, field studies are essential to
determine how technologies are adopted and adapted.

Organizational Adaptation

Though some technology adoption and adaptation is attributable to
individual choices by individual users, much of it takes place in
the context of organizations. Organizations such as businesses,
governments, universities, and schools make decisions about what
hardware, software, and services to purchase for use by their
constituencies. Individuals may have little choice in which computing
platform, Internet provider, or services they use. Organizations
usually set policies about how services such as email and information
resources are used. Even within these constraints, individuals
often have considerable latitude in how they employ these technologies
in their work practices.

Sproull and Kiesler (1991) explain the unpredictable effects
of introducing technology into organizations from a "two-level
perspective". They argue that most inventors and early adopters
of technology think primarily about efficiency of the technology.
System designers, as well as early adopters, focus on the instrumental
uses to which the technology is put, whether reducing "telephone tag"
through the use of electronic mail or lowering secretarial costs by
replacing typing with word processing. These are the "first-level
effects" of a technology.

Users rarely implement a new technology in precisely the way that
designers intend, however. Organizations find it difficult to
make accurate estimates of direct costs, much less to gauge
the first-level effects of technology on work practices, productivity,
or profits. Because technologies interact with routine work practices
and policies, implementation leads to "long-term changes in how people
work, treat one another, and structure their organizations" (Sproull
and Kiesler 1991, p. 1). It is these "second-level effects" on the
social system of interdependent people, events, and behaviors that are
most pervasive and most important for organizations. These effects
are also the most difficult to predict.

Again, email offers illustrations of first- and second-level effects
of introducing an information technology into organizations. The
instrumental uses of email are many: it offers rapid interpersonal
communication within the organization and between the organization and
the external world, whether clients, suppliers, members, customers,
citizens, colleagues, friends, or family. Email is convenient and
portable. Because it is asynchronous, it can improve time management
by enabling people to send and receive messages at their convenience.
It serves as a broadcast technology, allowing an organization
to deliver the same message to a mass audience of its employees,
students, or other groups simultaneously. Email has radically
increased the speed and volume of communication for most people who
use it.

We are finding many second-level effects of email that were not
anticipated at the time of its initial development or adoption.
Email is easily abused, whether by broadcasting messages that are of
interest only to a few or by sending rude and inappropriate messages
that are unlikely to be communicated by other means. Junk email
can proliferate, resulting in inefficient use of staff time to sort
through it, rather than the efficiency of communication intended.
Once an organization adopts email, usually everyone who is provided
access is expected to use it regularly. People are expected to
respond to messages, and to do so quickly. As a result, memos and
other communications that did not require a response in paper form now
result in a flurry of acknowledgments and responses, adding another
layer of communication activity.

Communications that once were oral, or confined to one or a few
paper copies that were controlled by the individuals involved, are
now captured in permanent form on an organization's email servers.
As a result, organizations are faced with a difficult balance between
controlling their resources and the rights of individuals to their
privacy (Anderson et al. 1995; Berghel 1997b). Organizations that
read employees' email may defend this practice on the grounds that
email is organizational documentation and that it resides on computers
owned by the organization. Individuals, particularly those who have
lost jobs over the content of email messages, may contend that email
is the equivalent of telephone or other oral communications and is
subject to reasonable expectations of privacy.

Conversely, organizations are learning that email can have unexpected
and adverse legal consequences. Conversations that once were oral and
now are recorded can be treated as legal evidence. Among the evidence
that convicted Oliver North in the Iran-Contra affair were email
messages that he had deleted; they were recovered from backup storage
as part of the legal discovery process. Similarly, email messages
internal to the Microsoft Corporation are being used by the US
government as evidence in an antitrust case against the corporation.
As a result of these and other cases, many organizations are expanding
the scope of their email policies to limit the content of email
messages and to minimize the archival storage of email transactions
(Harmon 1998).

These are only a few of many examples of the positive and negative
effects that email has had on organizational communication.
(For more, see Anderson et al. 1995; Berghel 1997b; Markus 1994.)
People's experiences with email and their perceptions of its role in
an organization combine to determine how they will adapt it to their
own practices.

As information technologies are more widely adopted, concern about
their second-level effects is increasing. These concerns cross
many disciplines, levels of analysis, and research methods. "Social
informatics" is an emerging research area that brings together the
concerns of information, computer, and social scientists with those
in the domains of study (Bishop and Star 1996; Borgman et al. 1996;
Bowker et al. 1996). Social informatics scholars are attempting to
build upon research in the design and the use of information systems
and upon social studies of science and technology. This book brings
a social informatics perspective to bear on access to information
in digital libraries and in a global information infrastructure,
considering first-level effects when these are all that can be known
and second-level effects where possible.

Creating a Global Information Infrastructure

The integration, interaction, and interdependence of information-
related tasks and activities lead us to think in terms of an
information infrastructure. Rather than relying on separate devices
for producing text (e.g., typewriters and personal computers),
producing images (e.g., personal computers, photocopy machines,
drawing pads), communicating with individuals (e.g., telephones,
telefacsimile (fax) machines, mailboxes and stamps), and searching for
information resources (e.g., personal computers, local servers, print
technologies), all these tasks can be accomplished via a personal
computer connected to the Internet. Conversely, these tasks can be
divided up in many new ways by means of specialized devices such as
cell phones, pagers, palmtops, and other "information appliances" that
can share information. Computer and communication networks enable the
integration of tasks and activities involved in creating, seeking, and
using information, increase the interaction between these activities,
and make them ever more interdependent.

In considering the premise and the promise of a "global information
infrastructure", we must determine what is meant by this phrase.
Already it is used in a variety of contexts, with meanings that include
a set of technologies, a set of principles for an international
computing and communications network, and a loose aggregation of
people, technology, and content.

What Is Infrastructure?

Terms such as "national information infrastructure" and "global
information infrastructure" are being bandied about with minimal
discussion of what is meant by "infrastructure". Social scientists
and historians are beginning to take a research interest in this
concept, particularly as it relates to organizational communication
and work practices. Star and Ruhleder (1996, pp. 111-112) describe
infrastructure as follows:

It is both engine and barrier for change; both customizable
and rigid; both inside and outside organizational practices.
It is product and process. ... With the rise of decentralized
technologies used across wide geographical distance, both the
need for common standards and the need for situated, tailorable
and flexible technologies grow stronger.

Star and Ruhleder are among the first to describe infrastructure
as a social and technical construct. Their eight dimensions (ibid.,
p. 113) can be paraphrased as follows: An infrastructure is embedded
in other structures, social arrangements, and technologies. It is
transparent, in that it invisibly supports tasks. Its reach or scope
may be spatial or temporal, in that it reaches beyond a single event
or a single site of practice. Infrastructure is learned as part of
membership of an organization or group. It is linked with conventions
of practice of day-to-day work. Infrastructure is the embodiment of
standards, so that other tools and infrastructures can interconnect in
a standardized way. It builds upon an installed base, inheriting both
strengths and limitations from that base. And infrastructure becomes
visible upon breakdown, in that we are most aware of it when it fails
to work: when the server is down, the electrical power grid fails, or
the highway bridge collapses.

As a means to explore the technical and public policy implications
of information infrastructure, the Corporation for National Research
Initiatives has sponsored a series of studies that address historical
examples of large-scale infrastructure. These include studies of
the growth of railroads, telephony and telegraphy, electricity and
light, and banking (Friedlander 1995a,b, 1996a,b). In each case,
the technologies involved took some time to be adopted, to stabilize,
and to achieve the critical mass necessary to form an infrastructure.
Railroads, telephones, power companies, and banks all provided local
services for years, or even decades, before reaching nationwide
connectivity. Each developed with some combination of public and
private investment and government regulation. The means by which
an integrated infrastructure evolved varied, and each involved
experimentation with different forms of technology, regulation, and
social arrangements.

Models of infrastructure for railroads, telephones, energy,
and banking could have taken far different forms than they did.
Indeed, with the possible exception of railroads, each of these
infrastructures is still evolving actively. Telephony underwent
extensive restructuring in the United States during the 1980s
and the 1990s due to changes in regulatory structure, mergers
and acquisitions, and technological advances. Similar regulatory
restructuring is now underway in Europe and elsewhere. Meanwhile,
technology advances and mergers and acquisitions continue apace. On
the energy front, models for service provision are changing as energy
companies are privatized and global power relationships shift with
variations in supplies and prices of fossil fuels. On the financial
front, models for banking infrastructure are under scrutiny as markets
for stocks, commodities, currencies, and other financial instruments
are becoming much more tightly coupled.

Each of these infrastructures is deeply embedded in our social fabric,
relies on technical standards, and builds upon an installed base
within the scope of its own and other infrastructures. A corollary to
the notion that infrastructure becomes visible upon breakdown is that
we rarely are aware of it when it is functioning adequately. We often
fail to recognize these as essential infrastructures until telephone
service becomes more complex and expensive, energy services change
in cost and character, or the stock market takes a precipitous fall
in value. And, although Americans make minimal use of railroads,
railroads are an essential form of transportation in much of the
world, where people are very much aware of changes in schedules,
routes, prices, and services.

Star and Ruhleder's (1996) set of eight infrastructure dimensions
highlights the complex interaction of technology, social and work
practices, and standards. They also emphasize social context
by noting that infrastructure builds upon an installed base.
An information infrastructure is built upon an installed base of
telecommunications lines, electrical power grids, and computing
technology, as well as on available information resources,
organizational arrangements, and people's practices in using all these
aspects. An installed base establishes a set of capabilities and a
set of constraints that influence future developments. For example,
mobile telecommunications must interoperate with land-based networks,
and new computers should be able to read files that were created on
the preceding generation of technology.

The concepts of embeddedness, transparency, and visibility are
especially relevant to a discussion of a global information
infrastructure. To be effective, a GII must be embedded in the
technical and social infrastructure of the many nations and cultures
it reaches, so much so that the infrastructure is invisible most of
the time. Whether this degree of embeddedness is possible across
countries and cultures is examined throughout this book. When
an information infrastructure works well, people depend on it for
critical work, education, and leisure tasks, taking its reliability
for granted. When it breaks down (for example, when email cannot
be sent or received, when transferred files cannot be read, or when
online information stores cannot be reached), then the information
infrastructure becomes very visible. People may resort to alternative
means to complete the task, if those means exist; they may create
redundant systems at considerable effort and expense; and they will
trust the infrastructure a bit less each time it breaks down.

Infrastructure as Public Policy

Infrastructures of many kinds are subject to public policy. For
example, the Clinton administration (1997, 1998) set forth a policy
on "critical infrastructure protection" that is noteworthy for our
concerns. The white paper on Presidential Decision Directive 63
(Clinton Administration 1998) defines "critical infrastructures"
as "those physical and cyber-based systems essential to the minimum
operations of the economy and government. They include, but are
not limited to, telecommunications, energy, banking and finance,
transportation, water systems, and emergency services, both
governmental and private". In the past, these infrastructures were
physically and functionally separate. However, with advances in
information technology these systems are increasingly linked and
interdependent. The significance of this interdependence is that
critical systems are ever more vulnerable to "equipment failures,
human error, weather and other natural causes, and physical and cyber
attacks". PDD 63 has the goal of protecting critical infrastructure
from intentional attack and minimizing service disruptions due to any
other form of failure.

Information technologies link these critical infrastructures, making
them interdependent, and thus all information technologies could
be considered parts of an information infrastructure. Information
infrastructure usually is more narrowly defined in public policy
documents, however. Typically the scope includes computing and
communications networks, associated information resources, and perhaps
a set of regulations and policies governing use.

Metaphors for Information Infrastructure

Clever metaphors for information infrastructure have helped to capture
public attention. The concept of information infrastructure is best
known in common parlance as the "information superhighway" (Gore
1994b), or sometimes as the "I-way" or the "Infobahn". These metaphors
for information infrastructure emphasize the roads or pipes over
which data flow, whether telecommunications, broadcast, cable, or
other channels. The highway metaphor captures only a narrow sense
of infrastructure, as it does not encompass information content,
communication processes, or the larger social, political, and economic
context. The superhighway metaphor is misleading both because it
skews public understanding toward a low-level infrastructure and
because it suggests that the government would pay the direct costs
of the highway's construction. The Internet was constructed with a
combination of government and private funds. Current public policy,
especially in the United States, is oriented toward private funding
for further expansion (Branscomb and Kahin 1995; Kahin and Abbate
1995; Kahin and Keller 1995).

Though metaphors such as the information superhighway have been
extremely effective in marshalling support for information
infrastructure development, far more is involved than laying roads
over which information will travel.

National and International Policies

Individual countries began plans for national information
infrastructures in the early 1990s (see, e.g., Information
Infrastructure Program 1992; Karnitas 1996). In the United States,
there was the National Information Infrastructure Act of 1993.
In Europe, there was the European Union's proposal for a European
Information Infrastructure (Bangemann Report 1994). The installed
base of technology on which these plans are predicated includes the
Internet, which began in the late 1960s with the ARPANET (National
Research Council 1994; Quarterman 1990), the "intelligent network"
of telecommunications that followed the deregulation of telephony
(Mansell 1993), and related technologies such as cable and satellite
television networks.

In the mid 1990s, national information infrastructure plans began
to converge. In 1994 the United States proposed formal principles
for a global information infrastructure. The following principles
were incorporated into the International Telecommunication Union's
"Buenos Aires Declaration on Global Telecommunication Development for
the 21st Century" (1994) and the United States' "Global Information
Infrastructure: Agenda for Cooperation" (Brown et al. 1995):

o encouraging private sector investment
o promoting open competition
o providing open access to the network for all information providers and users
o creating a flexible regulatory environment that can keep pace with rapid technological and market changes
o ensuring universal service

A few months later, the Group of Seven (seven leading industrialized
nations, known as "G-7") met to discuss these principles and agreed to
collaborate "to realize their common vision of the Global Information
Society" and to work cooperatively to construct a global information
infrastructure (G-7 Ministerial Conference on the Information Society
1995a, pp. 1-2). These principles emerged from the 1995 G-7 meeting:

o promoting dynamic competition
o encouraging private investment
o defining an adaptable regulatory framework
o providing open access to networks
o promoting equality of opportunity to the citizen
o promoting diversity of content, including cultural and linguistic diversity
o recognizing the necessity of worldwide cooperation with particular
attention to less developed countries.

The G-7 document also included the following.

These principles will apply to the Global Information Infrastructure
by means of:

o promotion of interconnectivity and interoperability
o developing global markets for networks, services, and applications
o ensuring privacy and data security
o protecting intellectual property rights
o cooperating in R&D and in the development of new applications
o monitoring the social and societal implications of the information society.

The Buenos Aires and G-7 statements have much in common: they
are concerned with technical capabilities ("interconnectivity",
"interoperability", "open access"), promises of rights to provide
network services ("open competition", "dynamic competition"),
guarantees of network services ("universal service", "equality of
opportunity"), a means of funding network development ("encouraging
private investment"), and a means of regulating various aspects of
its development and use ("flexible regulatory environment", "adaptable
regulatory framework"). However, they vary on their treatment of
content: the G-7 principles promote diversity of content and offer
some general protections ("privacy", "data security", "intellectual
property"), while the telecommunications principles do not
mention content, addressing only the development and regulation of
communication channels.

Implementing Global Policy

Statements by the G-7 and other multinational bodies such as the
United Nations promote policy agendas of the countries involved, but
they lack the force of law and they provide little if any funding
for implementation. Some of the language offers more platitudes than
policy, such as the claim in the European Information Infrastructure
plan that, "as a strategic creation for the whole Union", it will lead
to "a more caring European society with a significantly higher quality
of life" (Bangemann Report 1994).

The G-7 policy statements that frame a global information
infrastructure have raised considerable concern about human
rights and social protections from adverse consequences of its use.
Though the G-7 principles include a general statement about privacy
and comment on the need to monitor the social implications of the
information society, they do not ensure legal protection of rights
such as privacy, free expression, and access to information. Despite
requests by human rights groups, the G-7 principles omit references to
assurances in the United Nations Declaration of Human Rights that were
approved in 1948 (see United Nations 1998). Particularly relevant are
articles 12 and 19:

Article 12: No one shall be subjected to arbitrary interference with
his privacy, family, home or correspondence, nor to attacks upon his
honor and reputation. Everyone has the right to the protection of
the law against such interference or attacks.

Article 19: Everyone has the right to freedom of opinion and
expression; this right includes freedom to hold opinions without
interference and to seek, receive and impart information and ideas
through any media and regardless of frontiers.

These principles are receiving renewed attention upon the fiftieth
anniversary of their adoption (United Nations 1998). Computer
networks offer unanticipated capabilities for free speech and access
to information. Because transactions and interactions are easily
trackable, computer networks also can create unanticipated intrusions
into privacy (Kang 1998). Many privacy advocates promote an
alternative design model, known as "privacy-enhancing technologies"
(Burkert 1997), in which individuals can acquire access to most
information services without revealing their identity if they so
choose. Privacy, freedom of speech, and freedom of access to
information are tenets of democracy (Dervin 1994; Lievrouw 1994a,b).
People cannot speak freely or seek information freely if their
movements are being tracked and if they cannot protect and control
data about themselves (Agre and Rotenberg 1997; Diffie and Landau
1998; Information Freedom and Censorship 1988, 1991).

These are contentious issues in the United States. One example
is that the federal policy on critical infrastructure protection,
discussed above, is being challenged on the basis of its potential to
erode civil liberties (Electronic Privacy Information Center 1998).
Public policy on social aspects of information infrastructure is
subject to the laws, the norms, and the practices of individual
countries and jurisdictions, despite the global reach of computer
networks. When local activities took place only locally, variances
in policy and regulation were less apparent and jurisdiction was
rarely an issue. Now that individual communications and information
resources flow quickly and in vast quantities across borders,
variances in policy and regulation can be highly visible and
jurisdiction can be highly contentious. Privacy rights and
regulations have become an international battlefield where many of
these issues are being played out.

The European Union Data Directive, which took effect in late 1998,
highlights fundamental differences in policy approaches to privacy
protection. The United States long has taken a "sector approach",
with specific laws governing credit reports, library borrowing
records, videotape rentals, federal government databases, etc. In the
new arena of computer networks, US policy has favored self-regulation
by the private sector over government-imposed regulation. In
contrast, European countries have favored generalized policies over
the control of personal data, assigning stronger rights to individuals
to control information about themselves than to organizations that
collect and manage personal data. The EU Data Directive consolidates
the policies of individual countries and regulates privacy protections
throughout the European Union. In view of the extensive commerce
between the United States and the European Union and the volumes
of data about personnel, customers, clients, and suppliers that are
subject to regulation, the policies of these jurisdictions often are
in conflict.

For overviews of the rapidly evolving landscape of electronic privacy,
see Agre and Rotenberg 1997, Diffie and Landau 1998, Kang 1998,
Rotenberg 1998, and Schneier and Banisar 1997. Updates, including
pointers to government documents and other primary sources, can be
found at and at

Information Infrastructure as a Technical Framework

"Information infrastructure" can refer to a technical framework rather
than to a public policy. As defined by the (US) National Research
Council (1994, p. 22), an information infrastructure is "a framework
in which communications networks support higher-level services for
human communication and access to information. Such an infrastructure
has an architectural aspect-a structure and design-that is manifested
in standard interfaces and in standard objects (voice, video, files,
email, and so on) transmitted over the interfaces".

One of the key components in defining an information infrastructure
as a technical framework is for it to have an open architecture that
will enable all parties to interconnect electronically and to exchange
data. The "Open Data Network" concept (National Research Council
1994) follows both from the Internet (a successful open architecture
for computing) and from established telecommunications policy
principles (Mansell 1993; National Research Council 1994). Under
the G-7 principles, closed networks can interconnect with the open
network; closed service networks such as cable television are allowed
under other communications regulations as well. As we move toward
ubiquitous computing, a wider array of devices must interconnect; this
makes open systems and interoperability much more essential.

The emerging global network that interconnects a wide variety of
computing devices located around the world offers great utility
for communication between individuals and organizations, whether
for education, work, leisure, or commerce. The technical framework
for such an information infrastructure is now expected to support
a range of tasks and activities far wider than that for which it
was originally designed, however. The original ARPANET and the early
generations of the Internet were constructed by and for the research,
development, and education communities (Quarterman 1990). Benign uses
by a collegial community were presumed when its technical architecture
was designed (Oppliger 1997).

Substantial enhancements are being made to the technical architecture
of the Internet to support a vastly larger volume and variety
of users, capabilities, and services than was anticipated in the
original design. Two new network services illustrate the scope
of the improvements that are under way (Lawton 1998; Lynch 1998).
One is "quality of service": the ability to reserve a set amount of
bandwidth, at a predetermined level of quality, in advance. Rather
than the current model, which is largely "first come, first served"
for bandwidth usage, mostly at flat pricing, the new model supports
differential pricing for differential services. Many organizations
are willing to pay a premium to guarantee adequate bandwidth at a
specified time (for a teleconference or a distance-education course,
for example). Conversely, many individuals are willing to tolerate
delays in email delivery or Web access in return for lower costs.
In view of the complexity of Internet architecture and the number
of political and service-provider boundaries crossed by an individual
transmission, guaranteeing quality of service will not be a simple
accomplishment. Though quality of service is considered an essential
capability of an information infrastructure, precise assessments
of what can be guaranteed and how it can be measured have yet to be
established (Lynch 1998).
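
The reservation model described above can be illustrated with a small
sketch. The capacity figure, slot labels, and function names below are
hypothetical and not drawn from the text; real quality-of-service
guarantees involve reservation protocols and agreements across many
service providers, as the passage notes.

```python
# A toy sketch of advance bandwidth reservation, the essence of the
# "quality of service" model: customers reserve a set amount of
# bandwidth in advance rather than competing first come, first served.
LINK_CAPACITY = 100  # hypothetical bandwidth units per time slot

def try_reserve(reservations, slot, amount):
    """Grant a reservation only if the slot still has capacity;
    otherwise the caller falls back to best-effort service."""
    used = reservations.get(slot, 0)
    if used + amount > LINK_CAPACITY:
        return False
    reservations[slot] = used + amount
    return True

reservations = {}
print(try_reserve(reservations, "14:00", 60))  # True: 60 of 100 in use
print(try_reserve(reservations, "14:00", 30))  # True: 90 of 100 in use
print(try_reserve(reservations, "14:00", 30))  # False: would exceed capacity
```

Under differential pricing, a premium customer's teleconference would
hold a guaranteed reservation like the first two above, while
lower-cost traffic waits for leftover capacity.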

Multicasting is another long-awaited service improvement for the
technical framework of a global information infrastructure. At
present, most communications are point-to-point ("unicasting"): copies
of a message are sent individually to each intended recipient. The
alternative is broadcasting, in which one message is sent to all users
of the network, whether they want it or not. An intermediate model is
"multicasting": one message is sent to a selected group of recipients,
reducing the amount of bandwidth required. Technically, under
multicasting, the originating server sends one message to each network
router on which intended recipients are located and that router
re-sends to its local subscribers (Lawton 1998). As with quality of
service, the number of providers involved makes multicasting a complex
process, but one that is necessary for efficient use of bandwidth
on a global information infrastructure (Lynch 1998). A variety of
economic and technical models for network service provision are under
consideration for the next generation of network architecture (Shapiro
and Varian 1998).
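
The bandwidth savings of multicasting over unicasting can be sketched
with a small example. The router names, user counts, and function names
are hypothetical, chosen only to illustrate the counting argument in
the paragraph above.

```python
from collections import Counter

def unicast_messages(recipients):
    """Point-to-point: one copy of the message leaves the origin
    server for each intended recipient."""
    return len(recipients)

def multicast_messages(recipients):
    """Multicast: the origin server sends one copy per router serving
    at least one intended recipient; each router then re-sends the
    message to its own local subscribers."""
    routers = Counter(router for router, _user in recipients)
    return len(routers)

# Hypothetical example: 9 recipients spread across 3 routers.
recipients = ([("router-A", f"user{i}") for i in range(4)]
              + [("router-B", f"user{i}") for i in range(3)]
              + [("router-C", f"user{i}") for i in range(2)])

print(unicast_messages(recipients))    # 9 copies leave the origin server
print(multicast_messages(recipients))  # 3 copies leave the origin server
```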

The Internet is already a "network of networks". A global information
infrastructure will be even more so. Though we speak metaphorically
of a single open network, in actuality the Internet links many layers
of networks within organizations, within local geographic areas,
within countries, and within larger geographical regions. These go
by various names, including intranets, extranets, local-area networks
(LANs), metropolitan-area networks (MANs), and even tiny-area networks
(TANs). Suffice it to say that the information infrastructure
topography is becoming increasingly complex, linking together internal
organizational networks, closed networks such as cable TV, and the
international Internet.

The boundaries of individual networks can be controlled to varying
degrees. A common technique is to protect organizational or even
national networks with "firewalls" that limit the abilities of
authorized users to exit and of outsiders to enter. Some internal
resources can be publicly accessible while others are restricted
to internal use, for example. Similarly, firewalls and filtering
techniques can be used to limit external sites that can be reached.
Parents can limit their children's ability to connect to sites known
to contain pornography or other undesirable material. The definition
of "undesirable" varies by context. Companies can limit access to
known sites containing games. Countries can limit access to sites
known to provide undesirable political views. China, for example,
currently attempts to control access to sites outside the country
through a single gateway, so that specific sites deemed objectionable
can be blocked. Chinese Internet users are required to register
with the police to gain access to the network (Tan, Mueller, and
Foster 1997). A key phrase here is "known sites". As the Internet
proliferates, new sites appear daily, and sites change names,
location, and content frequently. Reliable filtering software that
can distinguish between acceptable and unacceptable materials is not
yet feasible, and may never be.
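
The "known sites" limitation can be made concrete with a minimal
blocklist sketch. The site names are invented for illustration; real
filtering products maintain large, frequently updated lists and still
face the same problem.

```python
# A minimal sketch of blocklist-style filtering. Only sites already
# on the list are blocked, so a renamed or newly created site passes
# through unchecked -- the "known sites" problem described above.
BLOCKLIST = {"badsite.example", "games.example"}

def is_allowed(hostname):
    """Allow any host that is not on the known-sites blocklist."""
    return hostname not in BLOCKLIST

print(is_allowed("badsite.example"))   # False: a known blocked site
print(is_allowed("badsite2.example"))  # True: same content, new name
```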

For most businesses and governments, security and risk management are
far greater concerns than is pornography. After connectivity, the
most important enabling technology for electronic commerce is security
(Dam and Lin 1996; Geer 1998; Oppliger 1997). One model being studied
and implemented is "trust management", in which mechanisms such as
cryptography are employed to verify the identities of all parties
involved in electronic transactions. Such transactions include
buying and selling goods or services, transferring secure data
(such as financial transactions between banks and stock markets),
and proprietary communications within organizations or between
organizations and their clients, customers, and suppliers. Both
retail transactions between individuals and companies and wholesale
transactions between companies can be accommodated. An alternative
model is "risk management", which focuses on the likelihood of losses
and the size of potential losses from electronic commerce. Rather
than assume that trust can be guaranteed in all transactions, parties
try to determine the degree of risk exposure and to insure against it.
Cryptography is essential to both models as a means of assuring the
authenticity of transactions to the extent possible. The frontiers
of electronic commerce are being tested in the financial markets
today. In view of the size and volume of transactions among banks,
stock markets, investors, and other parties, many technical and policy
aspects of information infrastructure are likely to be tested first in
this arena (Geer 1998).
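
The role of cryptography in verifying the parties to a transaction can
be sketched with a message-authentication example. This is illustrative
only: it uses a single shared secret key (a simplifying assumption),
whereas trust-management systems of the kind discussed above typically
rely on certificates and public-key infrastructure.

```python
# Sketch: verifying that a transaction message comes from a party
# holding the shared secret key, and that it was not altered in
# transit. Key and message contents are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"hypothetical-shared-key"

def sign(message: bytes) -> str:
    """Compute an authentication tag over the message."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    """Accept the message only if the tag matches."""
    return hmac.compare_digest(sign(message), signature)

order = b"transfer 100 units to account 42"
tag = sign(order)
print(verify(order, tag))                  # True: message is authentic
print(verify(b"transfer 999 units", tag))  # False: altered message fails
```

A risk-management approach would add to this a calculation of exposure:
how likely a forged or disputed transaction is, and how large the loss
would be, so that the residual risk can be insured against.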

Information Infrastructure as Technology, People, and Content

Among the broadest conceptualizations of an information infrastructure
is that presented in National Information Infrastructure: Agenda
for Action 1993, where an NII is defined as encompassing a nation's
networks, computers, software, information resources, developers, and
producers. This definition comes closer to capturing the larger sense
of infrastructure as a complex set of interactions between people
and technology than do most other public policy statements, technical
definitions, or metaphors.

The above definition is compelling, if vague, because it recognizes
that we are creating something new and something that is more than a
sum of its parts. The information infrastructure is not a substitute
for telephone, broadcast, or cable networks, for computer systems,
for libraries, archives, or museums, for schools and universities,
for banks, or for governments. Rather, it is a new entity that
incorporates and supplements all these technologies and institutions
but is not likely to replace any of them. However, a GII is likely
to change each of these institutions, and how people use them, in
profound ways.

The term "global information infrastructure" is used in this broad
sense throughout the present book. A GII consists of a technical
framework of computing and communications technologies, information
content, services, and people, all of which interact in complex and
often unpredictable ways. No single entity owns, manages, or controls
the technical framework of a GII, although many governments, vast
numbers of public and private organizations, and millions of people
contribute to it and use it. The GII is perhaps best understood
by the metaphor of the elephant being examined by a group of blind
people-each one touches a different part of the beast, and thus senses
a different entity. From this perspective, a global information
infrastructure is a means for access to information. However, it can
be viewed from many complementary perspectives that also are valid.


These are exciting times. Information technologies are increasing in
speed, power, and sophistication, and they now can link together a
vast array of devices into a network that spans the globe. They offer
new ways of learning, working, and playing, as well as conducting
global commerce. Some contend that these changes are revolutionary
and will change the world; others argue that the changes are
evolutionary, and that individuals and organizations will incorporate
networked information technologies into their practices just as they
incorporated many earlier media and technologies. In this book I take
the view that these changes are neither revolutionary nor evolutionary
but somewhere between: that they are co-evolutionary. New
technologies are based on perceived needs and available capabilities.
People adopt these new technologies if and when they deem the
technologies useful and when they deem the effort and the costs
appropriate. Sometimes individuals make these decisions; sometimes
organizations make them. The result is that some technologies
are adopted by some of the people some of the time. No matter
how voluntary or involuntary the adoption process, individuals and
organizations adapt technologies to their interests and practices,
often in ways not anticipated by the designers of those technologies.
Information technologies are more flexible and malleable to individual
practices than are most other innovations, and this makes them
especially adaptable. They also evolve more quickly than most other
innovations, with new and improved versions appearing at a dizzying pace.

Adoption and adaptation of technology are difficult to predict, owing
to the complex interactions between characteristics of information
technologies, practices of individuals and organizations, economics,
public policy, local cultures, and a host of other factors.
Organizations acquiring new technologies find that estimates of
first-level effects, such as those on productivity and profits,
are unreliable. Reliable predictions of longer-term, second-level
effects, such as those on organizational communication and structure,
are nearly impossible. One reason is that external factors, such as
changes in the legal status of electronic communications, can have
profound effects on how individuals and organizations use information technologies.

We are in the process of creating a global information infrastructure
that will interconnect computer networks and various forms of
information technologies around the world. A review of some of the
many meanings of "information infrastructure" shows that the concept
incorporates people, technology, and content, and the interactions
among them. This broad definition incorporates
definitions of information infrastructure as a set of public policies
and as a technical framework. The broader definition is best suited
to studying the co-evolution of technology and behavior as related
to access to information, which is the primary concern of this book.
An information infrastructure is only one of several infrastructures
that are essential to a well-functioning society. Others include
energy, transportation, telecommunications, banking and finance,
water systems, and emergency services. Because
each of these infrastructures is increasingly reliant on information
technologies, they are more interconnected and interdependent. Their
interdependence means that more and more aspects of daily life depend
on the emerging global information infrastructure.



GrayLIT Network Now Available

Date: Sun, 27 Aug 2000 14:48:06 -0500
To: <>
Subject: ASIS-L: GrayLIT Network Now Available

_ GrayLIT Network Now Available_ 

I recently learned that the GrayLIT Network is now available


GrayLIT Network provides a portal for over 100,000 full-text technical reports located at the Department of Energy, Department of Defense, Environmental Protection Agency (EPA), and National Aeronautics and Space Administration (NASA). Collections in the GrayLIT collaboration include the DOE Information Bridge; the Defense Technical Information Center (DTIC) Report Collection; the EPA National Environmental Publications Internet Site (NEPIS); the NASA Jet Propulsion Lab Reports; and the NASA Langley Technical Reports. 


[The U.S. Interagency Gray Literature Working Group, "Gray Information Functional Plan," 18 January 1995, defines gray literature as "foreign or domestic open source material that usually is available through specialized channels and may not enter normal channels or systems of publication, distribution, bibliographic control, or acquisition by booksellers or subscription agents."
Both "gray" literature and "grey" literature are commonly used to describe this body of information. The decision often hinges on country of origin for the literature, or alternately country of publication.] 




Developed by the Department of Energy's Office of Scientific and Technical Information (OSTI), in collaboration with DOD/DTIC, NASA, and EPA, the GrayLIT Network is a portal for technical report information generated through federally funded research and development projects. The GrayLIT Network was released in early response to recommendations from a May 2000 Workshop held at the National Academy of Sciences. 



GrayLIT Network ... [is] being made available to the public in partnership with the Government Printing Office through GPO Access ( These tools are maintained by OSTI, a part of the DOE Office of Science. The Director of OSTI is Dr. Walter L. Warnick, (301) 903-7996.


I initially learned about the GrayLIT Network from a response to a previous posting [Thanks, Valerie]


requesting recommendations for "New Products in Grey Literature", a review column I write for the _International Journal on Grey Literature_ 


published by MCB University Press and edited by Julia Gelfand, Applied Sciences Librarian at the University of California, Irvine.

BTW: My latest column, "The Los Alamos National Laboratory E-Print Server", was published earlier this month [IJGL 1(3): 127-138]

[ bin/EMRbrowcite.cgi?recno=55&index=jt ]

Recommendations for Any and All services, systems, or software that relate 
to the management, access, and control of Grey Literature for review in a future column are Most Welcome! [Of course, I will be reviewing GrayLIT!]

/Gerry McKiernan
Science and Technology Librarian and Bibliographer
Iowa State University Library
Ames IA 50011 

"The Best Way To Predict the Future is to Invent It!" 
Alan Kay



Return-path: <>
From: Vicky Williams <>
To: "''" <>
Subject: New tool for creating awareness of your special collections
Date: Mon, 4 Sep 2000 15:05:58 +0100 


Does your organisation spend time and money collecting, storing, and
maintaining collections of reports, standards, offprints, working
papers, and similar material?

The Grey Literature Network Service - GreyNet - recognises that this
material is of high value, but often experiences relatively low usage. 

GreyNet is developing a database of information on collections of this "grey
literature" - what exists, where it is held, and terms of access etc. It
will raise awareness of collections for both holders of collections and
potential users.

Records can be added to the database at: 
The editorial scope of the database is at:

Contribute to, and use this new resource.


Eileen Breen
GreyNet - the Grey Literature Network Service

PS Don't forget to take a look at GreyNet's journal - the International
Journal on Grey Literature, which is free to you all until the end of the
year.



Functional Requirements for Bibliographic Records (FRBR) - Final Report 

Date: Mon, 10 Apr 2000 16:27:46 -0400
Reply-To: International Federation of Library Associations mailing list 

Sender: International Federation of Library Associations mailing list <IFLA-L@INFOSERV.NLC-BNC.CA>
From: John Byrum <jbyr@LOC.GOV>
Subject: ISBD(M) Revision Proposals: Comments due July 15

In 1998, the IFLA Study Group on the Functional Requirements for
Bibliographic Records (FRBR) published its Final Report after its
recommendations were approved by the IFLA Section on Cataloguing's
Standing Committee (available at: The Standing Committee
agreed that the ISBD Review Group should initiate a full-scale review of
IFLA's "family of ISBDs" to ensure conformity between the provisions of
the ISBDs and those of FRBR - in particular, to achieve consistency with
FRBR's data requirements for the "basic level national bibliographic record".

The ISBD Review Group has now concluded its review of the
International Standard Bibliographic Description for Monographic
Publications (ISBD(M)), last revised in 1987. The changes which the
Review Group proposes to make in the next iteration of this standard are
listed on the IFLANET at:

You are invited to submit your written comments regarding these
changes to indicate your approval or your reservations by directing your
comments by July 15, 2000, to:
John D. Byrum, Jr. Chair
ISBD Review Group
Regional and Cooperative Cataloging Division
Library of Congress
Washington, D. C. 20540-4380

If you prefer you may fax your comments to Mr. Byrum at
+202-707-2824 or Email them to Other members of the group
are: Francoise Bourdon, Ton Heijligers, Lynne Howarth, Dorothy McGarry,
Glenn Patton, Reinhard Rinn, and Maria Witt.

The ISBD Review Group appreciates your interest.

John D. Byrum, Jr.
Chief, Regional & Cooperative Cataloging Division
Library of Congress LM-535
Washington, D.C. 20540-4380
(202) 707-6511
FAX (202) 707-2824


UAP - New Publication

Date: Wed, 24 May 2000 15:24:05 +0100
Sender: International Federation of Library Associations mailing list <IFLA-L@INFOSERV.NLC-BNC.CA>
From: Sara Gould <Sara.Gould@MAIL.BL.UK>
Subject: New Publication from IFLA UAP


Interlending and Document Supply. Proceedings of the Sixth 
International Conference held in Pretoria, South Africa, 25-29 October 

Edited by Sara Gould 

The Sixth International Conference on Interlending and Document Supply 
brought leading experts from all over the world to South Africa, to 
discuss the latest developments in the field and common concerns. Almost
200 librarians from 25 different countries, many from Africa and the
developing world, met to consider all aspects of document delivery and
interlending under the theme, Empowering society through the global
flow of information.

The globalisation of information is a trend which seemingly threatens 
to disempower the developing world. Yet, properly harnessed, it could 
be the gateway to the information superhighway, even for disadvantaged 
countries. The recent developments in information technologies are 
also having a profound impact on ILDS services. Although some 
maintain that old style library services have been superseded by new 
technology, it is clear that an extensive need for traditional 
services still exists and has to be addressed.

The Conference provided a sounding board for these issues, with
opportunities to be informed and to learn from others. With good
representation from most parts of the world, it offered a comprehensive
overview of the major developments and best practice in the field at this
stage. The challenge to delegates and readers of the proceedings is to
identify the developments that will contribute most to improving their
own interlending and document supply, as a means of providing excellent
service to their users.

ISBN 0 9532439 9 0 PRICE £20.00

The proceedings include 32 papers, with abstracts and index, and are 
available from:

IFLA Offices for UAP and International Lending 
c/o British Library
Boston Spa
W Yorkshire
LS23 7BQ

Tel: 01937 546124
Fax: 01937 546478



Report on the 5th Joint Technical Symposium (Paris, January 2000)

Date: Thu, 13 Apr 2000 14:21:55 +0200
Sender: International Federation of Library Associations mailing list <IFLA-L@INFOSERV.NLC-BNC.CA>
From: Joelle Garcia <JOELLE.GARCIA@BNF.FR>
Subject: Report on the 5th Joint Technical Symposium (Paris, January 2000)

A scientific and technical event first organised in Stockholm in 1983,
then held in Berlin (1987), Ottawa (1990) and London (1995), the JTS
gathers, at the initiative and with the support of UNESCO, the three
international organisations involved in the preservation and restoration
of original image and sound materials: the Fédération Internationale des
Archives de Film (FIAF), the International Federation of Television
Archives (FIAT/IFTVA), and the International Association of Sound
Archives (IASA), together with the audiovisual sub-committees of ICA
(International Council of Archives) and of IFLA (International Federation
of Library Associations and Institutions).
It is a platform for specialists in audiovisual, cinema and sound
archives to share scientific and technical research as well as practical
experience, in order to provide guidelines for action for curators,
technicians, researchers and others.
The 5th JTS Paris 2000 was organised by the CNC (Centre National de la
Cinématographie), assisted by the CST (Commission Supérieure Technique de
l'Image et du Son), with the collaboration of INA (Institut National de
l'Audiovisuel) and the BnF (Bibliothèque nationale de France), and in
association with institutions such as AMIA (Association of Moving Image
Archivists), the ARCHIMEDIA network, ARSAG (Association pour la Recherche
Scientifique sur les Arts Graphiques), BKSTS (British Kinematograph, Sound
and Television Society), and the GAMMA group.
The subject of JTS Paris 2000, "Image and Sound Archiving and Access:
the challenges of the 3rd Millennium", focused clearly on the implications
and changes introduced by the new digital and Internet environments for
moving image and sound preservation activities and strategies.
JTS Paris 2000 presented 30 papers and 8 posters organised in three
sections corresponding to the main present and future challenges:
· Risk assessment in the preservation of image and sound materials
· Transfer and restoration of original image and sound
· Data management systems and migration strategies

1- Risk assessment in the preservation of image and sound materials

Films, magnetic tapes or discs, all original and preservation duplicate
media can suffer physical and chemical degradations. If these degradations
are not detected, analysed and evaluated in time, the original or
duplicated data may disappear.
The vinegar syndrome that affects films on a cellulose triacetate base is
now fully acknowledged: the spontaneous chemical decay of triacetate film
base leads to deacetylation and chain scission of the polymer. The acetic
acid produced catalyses further decay of the polymer, so the critical
autocatalytic point can be reached quickly.
The task now is not to understand the film degradation process further but
rather to formulate practical preservation strategies for film collections.
The quality of the storage environment and the state of preservation of
acetate film collections are the major determining factors in implementing
a rational preservation strategy.
The conclusions reached by the Image Permanence Institute through its
work show that low temperature is the most effective means of improving
the chemical stability of triacetate-base films. Macro-climate control at
low temperature and relative humidity below 50%, along with control of
air quality (to eliminate the degradation products), is the best-adapted
option.
In the case of large collections, a statistical approach can help in
evaluating the extent of vinegar syndrome by providing a probability-based
sampling model corresponding to the composition of a collection.

There are many analogies between vinegar syndrome and the pigment-binder
deterioration of magnetic tapes. Under the effect of humidity, the
cross-linking of the binder system decomposes into fragments, which
migrate to the surface of the magnetic layer. This leads, amongst other
things, to the shedding of pigment particles, which accumulate on tape
guides and on replay heads.
It is estimated that audio archives worldwide are storing around 30
million hours of tape, and video archives around 10 million hours. Their
transfer onto new digital carriers, which takes 2-3 times the duration of
the programme, will take many years if not decades. Estimating the end of
life (EOL) of existing stocks has therefore become an important issue and
will make it possible to define strategies. The best blank media for
recording or copying must be selected according to the risk of degradation
inherent in their composition, and, in the longer term, we need to monitor
the condition of the recordings and of the media in order to determine
when to transfer or migrate content.
The estimated life of the data recorded on these media logically seems to
depend on the estimated life of the media themselves, but several
contributions confirmed that the permanence of information also depends
greatly on the conditions under which it was written.
For audio recordings, several studies examined CD-R as a possible medium.
Given the many in-depth analyses needed to interpret the complex
phenomenon of CD-R writing and the evolution of degradation due to
different factors or to "natural ageing", CD-R cannot for the moment be
considered a preservation medium for sound recordings, still images, text
or digital data, unless appropriate checking procedures based on relevant
parameters are run.
DVD-R for moving images will be the next medium to be examined.

2- Transfer and restoration of original image and sound

Risk appreciation helps decision making. The issue may be deciding which
restoration technology to use when degradation does not allow simple
duplication onto a more stable medium, or establishing a methodology for
transfer onto a medium that is compatible with modern playback equipment
and provides easy and fast access to data. The decision must always be
adapted to strategies in terms of technological evolution, costs and
expected results.
Besides the fact that media may disappear through the obsolescence of
formats or through physical and chemical degradation, transfers are also
undertaken because the constraints of time and of the cost of access to
audiovisual content are no longer accepted by users in the Internet era.
Last but not least, backup and digitisation programmes frequently have to
be defined urgently, when new mass storage media offering better long-term
preservation are still not available and storage and communication formats
are not yet standardised.
Needs must be analysed to determine priorities and methods:
- characteristics of the archive
- criteria of selection and priorities (cases of emergency, preservation of
unique originals, priorities related to potential reuse of contents)
- decisions on the new formats and on transfer procedures. Experts agree
that magnetic tape will remain for many years the most suitable medium for
preserving television images. The only widely accepted criterion is that
the new preservation format must be digital and its quality must be
compatible with the future use of programmes.

3- Data management systems and migration strategies

The EBU is working on short- and mid-term migration projects:
- transfer to a digital tape format that will be managed automatically by
robot systems
- faster-than-real-time automatic transfer to a digital data mass storage
system in which the data encoding format is independent of the recording
format.
In transfer/migration operations it is necessary not only to transfer the
recorded contents but also to manage the information about those contents.
A number of vendors offer a variety of solutions for managing content and
metadata. The functions that vendors build into their software often
reflect the business processes they have worked most closely with
(pre-press, newspaper publishing, stock photo sales). Vendors are
generally more concerned with adapting the solutions they have already
developed to new needs than with developing specific software.
Diverse communities contribute to the definition and development of
metadata: on one side, the traditional management community, which brings
together the categories of information producers (authors, etc., producing
primary information and managing its communication) and libraries,
archives, etc. (producing traditional documentary information); on the
other, the IT world (encoding, storage, communication), largely dominated
by Web-oriented needs, which imposes a global strategy of setting joint
standards for platforms and metadata.
Based on practical experience in the field of preserving space-mission
data, some pragmatic rules on data preservation have been defined.
Rules applicable to the data themselves:
- ensure that data is completely independent of the systems used for its
creation and management. This rule concerns file structure and encoding
methods, and assumes that all "proprietary" data structures and
non-standard encoding methods are systematically rejected;
- ensure that data is described in terms of both syntax and semantics,
with the description accompanying the data.
A rule applicable to archival systems: to cope with technological change
as well as possible and limit its negative effects, separate the main
functions into services that are as autonomous as possible (data
ingestion, storage, management, access).
The "Reference Model for an Open Archival Information System (OAIS)" sets
up a framework for a general and common understanding of the problem of
long-term archiving of digital data and constitutes a base from which to
develop complementary standards in the area.
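The rules above (system-independent data, explicit syntax and semantics descriptions, autonomous archival services) can be illustrated with a minimal sketch. This is not from the report: the function names, metadata fields and sample payload are hypothetical, chosen only to show the idea of a self-describing package whose description travels with the data, handled by separate ingestion, storage and access functions.

```python
import json

# A minimal self-describing archival package: the payload is kept in a
# non-proprietary form (plain UTF-8 text), and a description records both
# syntax (format, encoding) and semantics (what each field means).
def make_package(payload: bytes, syntax: dict, semantics: dict) -> dict:
    return {
        "payload": payload.decode("utf-8"),
        "description": {
            "syntax": syntax,        # e.g. format name, encoding, delimiter
            "semantics": semantics,  # e.g. meaning and units of each field
        },
    }

# Ingestion, storage and access are kept as separate functions (autonomous
# services), so a technology change in one need not break the others.
def ingest(raw: bytes) -> dict:
    return make_package(
        raw,
        syntax={"format": "CSV", "encoding": "UTF-8", "delimiter": ","},
        semantics={"t": "time in seconds", "level": "audio level in dB"},
    )

def store(package: dict) -> str:
    # Serialise to an open, standard representation (JSON) for storage.
    return json.dumps(package)

def access(stored: str) -> dict:
    # A future system can interpret the data using only the stored description.
    return json.loads(stored)

pkg = access(store(ingest(b"0,-12\n1,-13\n")))
print(pkg["description"]["syntax"]["format"])  # CSV
print(pkg["payload"].splitlines()[0])          # 0,-12
```

The point of the sketch is that no step depends on the software that created the data: anything that can read JSON can recover both the content and its meaning.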

Report written by Richard Billeaud (co-organizer of the JTS Paris 2000).

The Proceedings of the 5th JTS will be published in May 2000.

Abstracts can be downloaded (Word or PDF) from the JTS web site:

Joelle Garcia (Section on Audiovisual and Multimedia, Chair)

Librarians in a coalition on scholarly publishing

Date: Fri, 16 Jun 2000 16:52:23 -1000
From: Elizabeth Bryson <>
Subject: Librarians in a coalition on scholarly publishing

From Library Juice 3:23
Librarians in a coalition on scholarly publishing

This article in the Chronicle of Higher Ed. is free on the website:

THE CRISIS IN SCHOLARLY PUBLISHING: A coalition of librarians,
university administrators, scholars, and publishers has
recommended nine steps to fix what they call a "broken system."

--> SEE


NTCIR Workshop 2: Chinese/Japanese IR & Summarization

Date: Mon, 12 Jun 2000 05:22:12 +0900
From: Noriko Kando <>
To:, nancy@CNI.ORG
Subject: ASIS-L: CFP: NTCIR Workshop 2: Chinese/Japanese IR & Summarization

Apologies if you have received multiple copies of this announcement...

NTCIR Workshop 2
Evaluation of Chinese & Japanese Text Retrieval
and Text Summarization

May 2000 - Feb 2001
An evaluation workshop of Chinese and Japanese text retrieval
and text summarization will be held from May 2000 to February
2001. Participation is invited from anyone interested in
Chinese and/or Japanese text retrieval and English-Chinese
and English-Japanese cross-lingual information retrieval from
large-scale collections and text summarization of Japanese texts.


- To encourage research in information retrieval, cross-lingual
information retrieval and text summarization by providing reusable
test collections.
- To provide a forum for research groups interested in comparing
results and exchanging ideas or opinions in an informal atmosphere.
- To improve the quality of the test collections based on the
feedback from participants.


- CHINESE IR TASKS (Chinese and English-Chinese IR)
- Chinese news documents will be used for the Chinese IR Tasks.
Details will be announced in June.

- JAPANESE & ENGLISH IR TASKS (Japanese, English and
English-Japanese IR)
- Training set: NTCIR-1 CD, more than 330,000 author abstracts
of conference papers; more than half are Japanese-English
paired (document alignments); alignments are known and
usable for training;
- Test set: NTCIR-1 and NTCIR-2.
NTCIR-2 consists of two document subfiles;
(1) ca.300,000 extended summaries of the research reports;
about 25% are Japanese-English paired.
(2) ca.100,000 author abstracts of conference papers;
more than half are Japanese-English paired; the
alignments are not announced before result submission
- Segmented Japanese texts are available for both Japanese
documents and topics in the NTCIR-1&2; each sentence is
segmented into terms and term components (similar to
phrases and words); use of this data is optional.

- Various types of articles from Japanese newspapers.


- By June 30, 2000: Submit application.
- NTCIR-1 is available to those who have returned the required forms.
- For Chinese IR & Text Summarization: application deadlines may vary
and will be announced later.
- August 10, 2000: NTCIR-2 CD (new documents and fifty topics)
will be distributed to the participants of Japanese & English 
IR tasks.
- September 18, 2000: Search results submission (Japanese & English IR)
- January 10, 2001: Results of Relevance Assessments for the new
topics will be distributed to the participants
- February 21-23, 2001: Workshop meeting at NII, Tokyo, Japan.
Day 1: Open to public, Days 2-3: Active participants only


Below is a brief summary of the tasks envisaged for the Workshop. A
participant will conduct one or more of the tasks or subtasks below.
Participation in only one subtask (for example, Japanese monolingual IR,
the J-J task) is possible:

- Chinese Information Retrieval Task: Chinese monolingual IR;
English-Chinese cross-lingual IR; to investigate the search
effectiveness of systems that search a static set of Chinese
documents using new Chinese and/or English topics.

- Japanese & English Information Retrieval Task: Japanese and/or
English monolingual IR; cross-lingual IR of single language document
and mixed-language documents of English and Japanese by Japanese
and/or English topics; to investigate the search effectiveness of
systems that search a static set of documents

- Text Summarization Task (Japanese description only): automatic
text summarization of Japanese texts. The aims are (1) to collect
qualified text data for summarization in Japanese (we will have
newspaper articles summarized by hand and make them available for
research purposes) and (2) to evaluate text summarization systems
through an extrinsic, task-based evaluation.


- A. FULL: Submit retrieval results and describe the system. The
correspondence between the group name and the group ID will
be announced.
- B. ANONYMOUS: Submit retrieval results; the details of the system
need not be reported. The correspondence between the group name
and the group ID is not announced. This category is mainly for
participants from companies that have difficulty reporting such
details.

The list of participating groups will be made public, but the evaluation
results will be announced using group IDs only. Whichever type of
participation is chosen, every participating group must submit (1) a
paper for the workshop proceedings, (2) a system description form
describing its system, and (3) bibliographic references and a copy of
all its papers using NTCIR test collections.


Online application is available at:

For the text version of the application form, please complete and return
it via e-mail, fax, or postal mail to:

ATTN: Noriko Kando
NTCIR Project
National Institute of Informatics (NII)
2-1-2 Hitotsubashi, Chiyoda-ku,Tokyo 101-8430, Japan
fax: +81-3-3556-1916 phone: +81-3-4212-2529


Financial support to attend the NTCIR Workshop meeting will be available
for a limited number of active overseas participants who will present
material at the workshop meeting in February 2001 and who are not
receiving other funding to attend. Priority will be given to younger
researchers. Details will be announced later.


- Please send email to Noriko Kando, project manager, or to the NTCIR
Project administrators.
- About the Chinese IR Task, please send email to the Task Chairs,
Hsin-Hsi Chen or Kuang-Hua Chen.
- About the Text Summarization Task, please send email to the Task
Chairs, Manabu Okumura or Takahiro Fukushima.


- The first day of the Workshop meeting will be an open forum for
researchers interested in the topics. The second and third days will
be open only to active participating groups that have submitted
results and to selected people from the organizing agencies.
- The proceedings will be published online as well as in printed form.
- Dissemination of the research results using the NTCIR collections
other than in the Workshop's Proceedings is welcome. However, the
conditions of participation preclude specific advertising claims
based on the results using the Collection or the Workshop.
- International participants are welcome. Announcements will be made in
English and Japanese, and in English and Chinese for the Chinese IR Task.
- The official language for the proceedings papers and presentations
at the Workshop meeting in February 2001 is English.

- An evaluation of Korean text retrieval is being organized by Prof. Sung
Hyon Myaeng, Korea. The two efforts maintain a close relationship.

For more information, please visit:



PictureAustralia

Date: Fri, 16 Jun 2000 08:45:17 +1000

----- Original Message -----
From: Australian Libraries Gateway <>
To: <>
Sent: Thursday, June 15, 2000 6:02 PM
Subject: <Psst> - PictureAustralia

PictureAustralia is an image database service which snuck online
this week but won't be officially launched for a while yet. It provides a
single point of access to over 450,000 digitised images from the pictorial
collections of many of Australia's leading cultural heritage institutions.
We thought you might be interested in a peek:
There is a link from the Australian Libraries Gateway 'Online Australian
image collections' page
<> - under 'Pathways to
information'. Please have a look at the new service and send us your
comments.
 ALG Administration
 Australian Libraries Gateway
 ... discover Australian libraries
 phone 02 6262 1137 / fax 02 6273 2545


Resource: The Council for Museums, Archives and Libraries

Date: Thu, 20 Apr 2000 15:54:07 +0100
Subject: Resource: The Council for Museums, Archives and Libraries
From: (Henry Girling)

**Apologies for cross posting**

20 April 2000

Resource: The Council for Museums, Archives and Libraries

I am writing to inform you of the new name and corporate identity that we 
have chosen for the organisation formerly known as the Museums, Libraries 
and Archives Council (MLAC). As from today our full name is Resource: The 
Council for Museums, Archives and Libraries. Our shorthand title is simply 
Resource - a word which we feel captures effectively the main elements of 
commonality between museums, libraries and archives and which we hope 
reflects some of the dynamism within the sector. 

The next few months will be a transitional period for us while the remaining 
members of our Board are appointed, we consider the responses made to the 
MLAC consultation exercise, develop our manifesto and undertake a series of 
projects and reviews based around our new agenda. 

We are currently undertaking a staffing review and as soon as this has been 
finalised we will contact you again with information about our new structure 
and contact details. In the meantime, please deal with your former MGC/LIC 
contacts and we will update you as soon as we can. 

On behalf of everyone at Resource, we look forward to working with you in 
the future. 

Neville Mackay
Chief Executive


Archives Advisor 

Date: Wed, 28 Jun 2000 09:52:42 +0100
Subject: News from Resource: The Council for Museums, Libraries and Archives
From: Henry Girling <>

** apologies for cross posting**
Archives Advisor for Resource

London, 28 June 2000 -- Justin Frost has been appointed as the Archives
Advisor for Resource: The Council for Museums, Archives and Libraries.
Currently Head of the Archive Inspection Services Department at the Public
Record Office (PRO), Justin has been seconded for a minimum period of two
years. He will join Resource in mid-September.

Neville Mackay, Chief Executive of Resource, commented: "Developing our
archives agenda is a priority area for Resource. I am delighted that Justin
is joining us to take forward this important area of work."

Justin Frost added: "I am delighted to take up this post. Like museums and
libraries, archives make a major contribution to creativity, lifelong
learning and economic development. This role presents an exciting
opportunity to develop their profile further."

Justin joined the Public Record Office in 1991 as a Documentation Officer in
the Records Management Department. Following internal promotion he was
appointed to the post of Deputy Archive Inspection Officer in January 1999
and Head of Archive Inspection in April of this year. Recently he has
managed the English Archival Mapping Project and is currently writing the
Phase Two Report which will be completed before Justin joins Resource. 

For further information please contact Emma Wright, Press Officer, or Julie
Taylor, Director of Communications on 020 7233 4200.

Resource: The Council for Museums, Archives and Libraries is a new strategic
agency which will work with museums, libraries and archives across the UK.
It replaced the Museums & Galleries Commission and the Library & Information
Commission on 3 April 2000.


New Millennium Awards Administrator

Date: Fri, 30 Jun 2000 16:24:55 +0100
Subject: News from Resource: The Council for Museums, Libraries and Archives
From: Henry Girling <>

Resource Appoints New Millennium Awards Administrator

London, 30 June 2000 - Resource: The Council for Museums, Archives and
Libraries has appointed Catherine Atkinson as the Millennium Awards
Administrator for the Sharing Museum Skills Awards Scheme. She takes over
from Annie Hollobone who has moved to the Millennium Commission.

Laura Drysdale, Director of Sector & Professional Services at
Resource, commented: "Sharing Museum Skills offers museum staff and
volunteers a fantastic opportunity to share and extend their experience.
The consortium of Resource, the Museums Association, the Association of
Independent Museums, the Committee of Area Museum Councils and the National
Museum Directors' Conference is delighted to have Catherine administering
the scheme. Annie did a wonderful job over the past two years and I am sure
Catherine will continue to take the scheme from strength to strength."

Catherine Atkinson was formerly Assistant Environmental Officer at
the Museums & Galleries Commission (MGC), working with the Environmental
Adviser and offering advice on applications for both lottery funding and the
Government Indemnity Scheme. She joined the MGC from the British Council,
where she was Assistant to the Director of Arts. Catherine is a qualified

The Sharing Museum Skills Millennium Awards Scheme draws upon the experience
and skills of national and non-national museums and their staff and
volunteers throughout the UK. Through an exchange programme the scheme
benefits both parties and helps develop links between UK museums and
galleries. The next deadline for applications is 10 August 2000, followed
by quarterly deadlines until May 2001. Grants awarded will normally be in
the region of £2,000-£4,000. More information can be found at <> or by contacting Catherine
Atkinson on 020 7233 4200.
# # #
Note : To interview Catherine Atkinson, Millennium Awards Administrator,
please contact Emma Wright on 020 7233 4200. 

The Sharing Museum Skills Millennium Awards scheme is funded by the
Millennium Commission, which awarded a grant of £1.1 million to a consortium
comprising: Resource - The Council for Museums, Archives and Libraries; the
Museums Association; the Association of Independent Museums; the Committee
of Area Museum Councils and the National Museum Directors' Conference. The
Millennium Commission is one of the six good causes supported by the
National Lottery.

Resource is a new strategic agency which will work with museums, libraries
and archives across the UK. It replaced the Museums & Galleries Commission
and the Library & Information Commission on 3 April 2000. The Resource
website can be found at <> .


New Advisers

Date: Wed, 5 Jul 2000 12:30:47 +0100 
Subject: News from Resource: The Council for Museums, Archives and Libraries
From: Henry Girling <>

Resource Appoints New Advisers

London, 5 July 2000 -- Resource: The Council for Museums, Archives and
Libraries has appointed Oliver Gillman as Gates Fund Adviser and Nick Poole
as ICT Adviser.

Chris Batt, Director of Access and Learning, commented: "Resource's
agenda has a strong focus on Information and Communication Technology (ICT)
across the three sectors. Oliver and Nick will play an important role in
taking these aims forward and developing ICT across the board."

Oliver Gillman will manage the allocation of the $4.2 million (£2.5 million)
gift from the Bill & Melinda Gates Foundation for the provision of ICT
learning centres in public libraries in the UK. Under the plan, 47 central
and regional libraries will each be given 11 public access computers for
library users to develop their information and communication technology
skills. A further 322 community libraries will receive two computers and
associated equipment to provide similar facilities.

Nick Poole will help devise IT strategy for the UK's museums, libraries and
archives, develop the People's Network, evaluate Resource IT projects and
assist in the running of the IT Challenge Fund. Formerly the Resource
Website Manager, Nick developed the organisation's interim website at <> . 
# # #

Notes To interview Oliver Gillman, Nick Poole or Chris Batt please contact
Emma Wright on 020 7233 4200. Resource is a new strategic agency which will
work with museums, libraries and archives across the UK. It replaced the
Museums & Galleries Commission and the Library & Information Commission in
April 2000.
The Bill & Melinda Gates Foundation is dedicated to improving people's lives
by sharing advances in health and learning with the global community.
The People's Network is a project to connect all public libraries to the
Information Superhighway by the end of 2002.
£500,000 is available from the Department for Culture, Media and Sport IT
Challenge Fund to help museums devise ways to use new technology to display
and interpret collections, and find new ways of attracting the public. 

Henry Girling
Communications Assistant
Resource: the Council for Museums, Archives and Libraries
16 Queen Anne's Gate
Tel: 020 7233 4200
Fax: 020 7233 3686
email <> 


Risk Management of Digital Information: A File Format Investigation

Date: Thu, 15 Jun 2000 20:33:49 -0400
Sender: Management & Preservation of Electronic Records
Subject: Preservation of Digital Information

Members of this list may be interested in the following recent publication described in RLG DigiNews:

Risk Management of Digital Information: A File Format Investigation

The Council on Library and Information Resources (CLIR) sponsored a risk assessment study conducted by Cornell University Library (CUL) in 1999 that focused on the file format risks inherent in migration as a preservation strategy for digital materials. Project participants included: Gregory Lawrence (PI), William Kehoe, Anne R. Kenney, Oya Y. Rieger, and William Walters.

The issue of DigiNews cited below provides a summary of the full report. The full report will be available from CLIR in late June 2000.

Visit the article at:

This study describes efforts to develop a risk assessment analysis method and tools for migrating digital data. The study focused only on Lotus 1-2-3 spreadsheets and on TIFF files.

Given the current discussion of the migration vs. emulation options for long-term preservation of digital materials, this report may be timely.

Tom Ruller
Moderator, ERECS-L

A posting from ERECS-L, an edited listserv for discussions about the
preservation and management of records in electronic form.


Subject Index to Literature on Electronic Sources of Information

May 1st 2000 edition

Date: Mon, 08 May 2000 07:36:57 -0600 (CST)
From: Marian Dworaczek <marian.dworaczek@USASK.CA>
Subject: ASIS-L: Subject Index to Literature on Electronic Sources of Information

The May 1, 2000 edition of the "Subject Index to Literature on
Electronic Sources of Information" is available at:

The page-specific "Subject Index to Literature on Electronic Sources of
Information" and the accompanying "Electronic Sources of Information: A
Bibliography" (listing all indexed items) deal with all aspects of
electronic publishing and include print and non-print materials,
periodical articles, monographs and individual chapters in collected
works. This edition includes 1,239 titles. Both the Index and the
Bibliography are continuously updated.

The Introduction, which includes a sample search and instructions on how
to use the Subject Index and the Bibliography, is located at:

This message has been crossposted to several mailing lists. Please excuse
any duplication.


August 1st 2000 edition


Date: Tue, 8 Aug 2000 13:13:10 -0600
Sender: International Federation of Library Associations mailing list <IFLA-L@INFOSERV.NLC-BNC.CA>
From: Marian Dworaczek <Marian.Dworaczek@USASK.CA>
Subject: Subject Index to Literature on Electronic Sources of Information

The August 1, 2000 edition of the "Subject Index to Literature on
Electronic Sources of Information" is available at:

The page-specific "Subject Index to Literature on Electronic Sources
of Information" and the accompanying "Electronic Sources of Information:
A Bibliography" (listing all indexed items) deal with all aspects of
electronic publishing and include print and non-print materials,
periodical articles, monographs and individual chapters in collected
works. This edition includes 1,283 titles. Both the Index and the
Bibliography are continuously updated.

The Introduction, which includes a sample search and instructions on how to use
the Subject Index and the Bibliography, is located at:

This message has been crossposted to several mailing lists. Please
excuse any duplication.

Marian Dworaczek
Head, Acquisitions Department
University of Saskatchewan Libraries
3 Campus Drive, Main Library
Saskatoon, SK S7N 5A4, Canada
E-mail:
Phone: (306) 966-6016
Fax: (306) 966-5919
Home Page:


Technology Leapfrogging in Developing Countries - An
Inevitable Luxury?

Date: Mon, 19 Jun 2000 09:51:48 +0100
Subject: FW: article on leapfrogging in developing countries thanks to ICT

From: "Cano, Virginia" <>

-----Original Message-----
From: Roger Harris [] 
Sent: 16 June 2000 02:19
Subject: Re: [GKD] Wireless in Mali: Leapfrogging

Anyone interested in a discussion of leapfrogging ICTs for LDCs will
find an article "Technology Leapfrogging in Developing Countries - An
Inevitable Luxury?" in the Electronic Journal on Information Systems in
Developing Countries, at

Dr. Roger Harris
Head of the Information Systems Core Group
Faculty of Information Technology - Universiti Malaysia Sarawak
Editor-in-Chief - Electronic Journal on Information Systems in
Developing Countries
Track Chair - HICSS 2001 - Community Informatics

[***Moderator's Note: For GKD Members without Web access, the following
summary of the article mentioned above might prove helpful. The authors
of this article examine the opportunities and requirements for
developing countries to "leapfrog" into more advanced stages of
technology use. They point out that "successful use of IT requires much
more than mere installation and application of systematised knowledge.
It also requires the application of implied knowledge regarding the
organisation and management of the technology and its application to the
contextual environment in which it is to be used. Often the requisite
knowledge about technology is accumulated from experience with
successive stages of technology, over time." They warn, "... while
leapfrogging may appear as an attractive option for the late adopters,
it may not provide the intended results in all circumstances. The
greatest danger is that ... the developing economies observe the benefits
which later and succeeding generations of IT bring to the industrialised
nations. Hurrying to acquire the same technology, developing countries
rely on the blind belief that similar benefits will quickly accrue to
themselves. Such opportunities may exist, but a reality check is
appropriate in order to protect the investment of the scarce resources
available for IT in most developing nations and to distinguish between
the circumstances where leapfrogging may or may not be successful."

They consider the factors that are required for successful
"leapfrogging." These include:

* National policies and international coordination to ensure the sharing
of information and resources to benefit all stakeholders
* An enabling regulatory environment
* National Information Infrastructure plans that consider connectivity
within the country and between it and the rest of the world
* Requisite training, both for technology professionals and for end-users
* A "social context" with values that support use of ICT, as well as a
technology adoption approach that respects and responds to the
particular social context and local needs
* Financial resources, from aid agencies, reconstruction and development
banks, international organisations, and the private sector. The authors
note that the ITU endeavors to establish partnerships between network
operators, governments and private sector interests
* An investment climate that is mutually favourable to investors as well
as the public

The authors provide a number of references to international
organisations that are promoting leapfrogging in effective ways, as well
as examples of efforts at leapfrogging. These include:

* Egypt's Information and Decision Support Center (IDSC), where gender
issues and obstacles for women may have undermined the effort
* Malaysia's Multimedia Super Corridor (MSC), a far-reaching project
that may be slowed by the decrease in economic growth rates
* A number of successful projects conducted by Canada's International
Development Research Centre (IDRC)
* Grameen, in Bangladesh, which provides technology that helps women
build revenue-generating sources
* Mongolian telecommunications networks
* The Navrongo Health Research Centre (NHRC) in Ghana, which is part of
an international initiative funded by IDRC and other aid agencies called

The authors also note that countries have alternatives to leapfrogging;
but warn that countries that fail to update their technologies are
likely to be excluded from the global economy, to the detriment of their
people. Most importantly, countries must recognise that simply
introducing technology will not contribute to development; rather the
public must understand how to use the technologies available (be they
radio, the Internet, or whatever) to help them achieve their own goals.

The authors also call for discussion about leapfrogging: lessons
learned, successes and failures, and economic, political, and social impacts.***]



UNESCO reform of the former PGI and Informatics programmes

Date: Tue, 9 May 2000 13:36:58 +0100 
From: "Cano, Virginia" <>
To: "''" <>

-----Original Message-----
Sent: 09 May 2000 12:09

UNESCO is reforming and refocusing the former PGI and Informatics 
programmes. The documentation is on the Web at:

According to the timescale, there will be a paper out to consultation 
by member states at the end of this month with responses due by the 
end of July. 


* Ian M. Johnson 
* Head, School of Information and Media, 
* The Robert Gordon University, 
* Garthdee Road, ABERDEEN AB10 7QE, Scotland, U.K. 

* Telephone: National 01224 263902; International + 44 1224 263902 
* Fax: National - 01224 263939; International + 44 1224 263939 

* URL -



To subscribe to LIBRES send an e-mail message to
with the text:
subscribe libres <your first name> <your last name>

Return to Contents Page
Return to Libres Home Page

CRICOS provider code: 00301J