The first difference is that, whereas a museum or special exhibit presents a relatively small amount of data to a relatively large audience, the initial record comprises a huge amount of data with almost no audience at all. As a result, the large overhead required to present this data to the public in these flashy formats will never be justified, and other modes of presentation must be developed.
The other difference that needs to be addressed is the long-term accessibility of this information. These records must be maintained in perpetuity, long past the memory of the recorder or archivist, using formats and procedures independent of hardware and software standards as well as of arbitrary indexing schemes such as cadastral identification and even political designations.
This paper will propose a structure for such a database with an indexing schema based on geographic position, storage procedures based on constant renewal, and data standards accessible to all hardware and software. Current data-processing realities will be acknowledged by the presentation of tools, currently under development, that will convert archival data to these standards while creating the interface that would allow public access over the Internet.
We are constantly being made aware of the changes taking place in information technology: the collection, assembly, storage and dissemination of data. This paper will consider the question of how ICOMOS might fit into this new order.
"Recently I saw a virtual reality demonstration featuring
the square of St. Marco in Venice. They had built a
computer model of the site and you could almost walk through
of it. We just need to buy this software then we could do
the same with the world heritage list, put them on cdroms
and all the world would appreciate these assets."
"The heritage record is different. Electronic media are not stable enough for an archival record so we will have to print store and index the information anyways. We have been doing this successfully since our inception and there is no reason to change now.
These are not position statements. They are two views of how the heritage record might relate to the information revolution going on around us. Both are wrong.
To be truthful, the sentiments of the latter statement are seldom actually verbalized these days. Almost everyone recognizes the need to adapt our records to the information revolution. Within ICOMOS, however, the absence of any funding for the electronic storage and distribution efforts of the last few years suggests that this feeling may still exist behind the scenes.
In many ways, however, it is the former approach, the belief that computers can do it all, which poses the greatest hazard. It is true that the hardware and software exist to build extraordinary models and to present data in ever more interesting ways. Furthermore, these tools become cheaper, better and easier to use with each passing year. The corollary, however, is that when using such technology, this year's effort will quickly become obsolete.
There are several other problems with the state of the art as well. The computers necessary to run such tools will always be relatively expensive and unavailable to many ICOMOS members and to the public we wish to reach. If, however, we maintain our records in the simplest of formats, it is reasonable to expect that most of the world will soon have access to machinery adequate for viewing them.
Another problem is that the human resources needed to create these virtual realities are immense and far beyond the capabilities of this organization. By far the greatest drawback to these sophisticated presentation tools, however, is that the model is not a true representation of the resource but a geometrically idealized "reconstruction" of someone's interpretation of it.
The heritage record
The fundamental difference between the heritage record and virtual reality, at least from the perspective of a systems analyst, is that the "record" consists of large amounts of data for small audiences, while the "presentation" tools that capture the imagination dispense relatively small amounts to a much larger public.
For any given site there is a vast amount of information, all of which must be stored and kept accessible, but the complete archive is of interest to only a small audience of conservationists. A television program or museum exhibit, by contrast, is prepared for a much larger audience but presents only a small subset of the available data. It stands to reason that the procedures appropriate for collecting and storing the primary record will be quite different from those needed to demonstrate and popularize a site. It is even possible that the tools we need will be unique to ICOMOS, in that no other body is trying to document such resources or archive such material.
It is necessary that we examine our mandate to determine what is essential and our resources to determine what is possible. Within the bounds of these two restraints technological possibility is not likely to be an issue.
The mandate of ICOMOS is to advise UNESCO on the maintenance of the assets on the World Heritage List, and it stands to reason that we must have at our disposal the information necessary to do this effectively. There are vast amounts of such data, stored all over the world in widely differing formats, which, when analyzed for the characteristics that help support this mandate, can be grouped as follows:
Readers and viewers for all formats in the data set should be stored with the data or, in the case of the World Wide Web, made available for free download from the same site.
Just as it is necessary to maintain a computer and operating system that can read the media on which the data is stored, so it is necessary to maintain the software necessary to fully exploit the archive. If both the data and the software necessary to read it are on the same media (CD, hard disk, web site, etc.), the archive is complete and usable as long as the hardware works, even if the whole system has been obsolete for some time. Imagine a future archaeologist finding a stash of CDs or one of the backup systems that was put in someone's basement before the apocalypse. It is not hard to imagine these future researchers finding ways to access this information if we keep all the components together.
Keeping the data and the necessary software together would also provide a solution for those who want to present their data in more complex formats. To distribute such efforts, the data provider need only supply the necessary software, which could then be stored with the data, though ideally the text and images would be reduced to the archival formats as well.
When categorized as to their storage and access characteristics, however, this complexity falls away and, with few exceptions, we are dealing with text and images. The migration from a paper-based to a digital archive will take different forms depending on the type of data, but the first step must be the creation of an index of all the sites, coupled with lists of all the data available concerning each. A unique and meaningful key for such a database would be the geographic location of the site (indeed, I can think of no other). The other fields in this primary database could be limited to indications of the type and location of the data available.
The result of such an exercise would be a data structure that would allow the efficient use of the computer to locate existing data in whatever format it happens to be. A book identified by library, shelf and title would be as easy to locate as a computer file identified by computer, directory and file name.
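A minimal sketch of such an index follows, assuming only what the text states: each site is keyed by its geographic position, and each record merely points to documents wherever they are held, whether a book on a library shelf or a file on a computer. All class and field names here are illustrative, not part of any existing ICOMOS system.

```python
from dataclasses import dataclass, field


@dataclass
class DocumentRef:
    kind: str      # e.g. "book", "image file", "report"
    location: str  # library/shelf/title, or computer/directory/file name


@dataclass
class SiteRecord:
    name: str
    documents: list = field(default_factory=list)


class SiteIndex:
    """Primary database keyed on a site's geographic position."""

    def __init__(self):
        self._sites = {}

    @staticmethod
    def _key(lat, lon):
        # Round coordinates so the same site always yields the same key.
        return (round(lat, 4), round(lon, 4))

    def add_site(self, lat, lon, name):
        self._sites[self._key(lat, lon)] = SiteRecord(name)

    def add_document(self, lat, lon, kind, location):
        self._sites[self._key(lat, lon)].documents.append(
            DocumentRef(kind, location))

    def lookup(self, lat, lon):
        return self._sites.get(self._key(lat, lon))


index = SiteIndex()
index.add_site(45.4343, 12.3388, "Piazza San Marco, Venice")
index.add_document(45.4343, 12.3388, "book",
                   "Library / shelf 12 / 'Survey of the Campanile'")
record = index.lookup(45.4343, 12.3388)
```

The point of the sketch is that the key carries meaning on its own: no cadastral number or political designation is needed to find a site, only its position on the earth.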
Of course the file on the computer would be much easier for the researcher to use, so the next step would be to convert the existing documentation to these easier-to-access formats. If each document, when requested, were converted to the appropriate electronic format, this information would only have to be handled once, as all subsequent requests for the document could be processed electronically. As a first step, the paper documents could be scanned and distributed as digital images. This would make them immediately available and would take no more time than photocopying them. Later, as resources become available or on an as-needed basis, the words could be converted to text using optical character recognition (OCR) software.
With this procedure in place, researchers browsing the database over the internet would request a document and, if it was already in a digital format, they would get it within minutes with no input from documentation centre staff. If the document is not yet on-line, the request would cause it to be processed and added to the available selection.
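The on-demand digitization path described above can be sketched in a few lines. This is a hedged illustration of the workflow only: a requested document is served at once if it is already digital; otherwise it is scanned (a stub here stands in for the physical scan) and the result joins the archive so that later requests need no staff input. The function and identifier names are assumptions for illustration.

```python
archive = {}  # document id -> digital content (scanned image or text)


def scan_paper_document(doc_id):
    # Stub standing in for the one-time physical scan of a paper document.
    return f"<scanned image of {doc_id}>"


def request_document(doc_id):
    if doc_id not in archive:
        # First request: digitize once, then keep the result on-line.
        archive[doc_id] = scan_paper_document(doc_id)
    return archive[doc_id]


first = request_document("report-017")   # triggers the scan
second = request_document("report-017")  # served electronically, no staff input
```

The design choice worth noting is that each paper document is handled exactly once, at the moment of first demand, so the conversion effort is spread over time and concentrated on the material people actually use.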
The need for hard copy to supply libraries, classrooms and the dwindling numbers of individuals still without computers could be met by maintaining this archive in a printable format. The ICOMOS internet server can already provide copies of such files to anyone requesting them, and national committees or local institutions could provide printing services for their members.
If the current distribution service can be maintained and improved with so little effort the questions remaining concern the storage and long-term maintenance of the data.
The most carefully archived digital data is totally useless if the storage media fails or if the machinery is no longer available to read it. Thus it is incumbent on the archivists to maintain not only the data itself but working examples of the computers, operating systems and software necessary to read everything in their custody.
The fact is that no medium is permanent, not even the stone of the ancients. We can only guess at the long-term storage characteristics of our current technology, though we do know it is much more likely to be measured in decades than in millennia or even centuries. An archive can only be maintained by constant renewal, and in this case that means there must be an ongoing program of copying the data to new devices long before there is any question of media failure. Curiously, in this we are approaching the model of bardic societies, which rely on word of mouth to maintain their legends. The reason this works is that the information is always fresh in someone's memory. We would know precious little about the Trojan War if we had had no input from the bards.
A program is needed that will identify appropriate storage technologies, purchase the necessary hardware and software, and copy the database to the new media, perhaps on a five-year cycle, which is about the current life expectancy of a computer. Without such a commitment no long-term archive can be said to exist.
Catastrophe is the other eventuality that needs to be addressed when one is entrusted with an irreplaceable archive, and this too can be dealt with within the regime proposed here. All data storage is subject to physical destruction, whether by earthquake, fire, sabotage or war, but a program of constant renewal will create multiple copies of the entire database well before the storage media are in danger. If the old media and equipment are moved off site once a conversion is complete, they would provide a backup for as long as the media remain readable.
Also, using the facilities of the internet, copies can be created and stored at widely separated sites which would be unlikely to be affected by any single event. These remote sites, probably managed by national committees, could also be used to provide alternate access points, called mirror sites, where the world could find our data.
Having a stable environment for the data allows the discussion to shift to the question of software and data file formats. Just as there is a partnership between computer and media there is one between file formats and software. There are three functions in this relationship, data preparation, storage, and presentation.
Those acquiring and preparing the primary data are our most valuable asset, and no hindrance or inconvenience should be put in their way. They will prepare their reports on a variety of equipment using an ever-changing assortment of software, and in many cases this record, stored in its native format, will also provide acceptable or even excellent presentation characteristics when used with the software that created it.
However, this will not do for the archive. The necessary software may not be available in five years, and even if it is, there may be nobody available who is knowledgeable in its use. As a result the data it supported will become less useful or even inaccessible.
For posterity we must forfeit the potential elegance of these formatted reports and the convenience of simply taking electronic copies. Instead, standardized formats for the raw text and images should be chosen. These files must be accessible to the widest possible audience and readable by as many computer platforms as possible.
If we are forced to store our data in these primitive formats, then the presentation techniques must provide the finish we are used to from word processors and desktop publishing packages. Here we have a model in the World Wide Web, which is becoming the preeminent presentation mode of the Internet.
On thousands of computers around the world images and text are stored in these simple formats while the people viewing this data see it in various ways depending partly on their hardware and software and partly on whether or not they are interested in the sophisticated presentation options.
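The Web model described above can be made concrete with a small sketch: the archival holdings stay in their simple formats (plain text, standard image files), and a trivial HTML wrapper is generated around them so that any browser can present them, with images left as optional links for readers who choose to skip them. The file names and the helper function are hypothetical, for illustration only.

```python
def html_page(title, text_file, image_files):
    """Wrap archival files in minimal HTML without altering them."""
    # Images appear only as links, so a reader on a slow connection
    # can reach the text without ever fetching them.
    links = "\n".join(
        f'<p><a href="{img}">{img}</a></p>' for img in image_files)
    return (f"<html><head><title>{title}</title></head><body>\n"
            f"<h1>{title}</h1>\n"
            f'<p><a href="{text_file}">Full text</a></p>\n'
            f"{links}\n"
            f"</body></html>")


page = html_page("Site survey", "survey.txt", ["plan.gif", "elevation.gif"])
```

Because the wrapper is generated rather than hand-made, the underlying text and image files remain the archival record; the presentation layer can be discarded and regenerated at will as fashions change.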
These presentation tools already far exceed what is necessary, or even appropriate, for the dissemination of the kinds of data we need to provide. Not only can text be distributed worldwide much faster than you can read it, but images too can be readily presented. However, an image that may be worth a thousand words as far as information is concerned may cost the equivalent of several million in terms of access time, so this option is often disabled by the serious browser, and excessive reliance on images actually reduces the browser's exposure to the underlying information. Still, the presenters always seem to want the fancy output, so as long as we ensure that every computer can at least read the text and make the connections, we can get as sophisticated with our presentation as our resources permit.
These people will also need direction. The technicians can build on such guidelines, but they will require constant monitoring by non-technical conservationists and constant input from the data providers. What this means is that the executive of ICOMOS must become knowledgeable enough to use the system or they will lose control of it. This may be the most difficult requirement to fulfil.
The first step in establishing this control would be to set out the principles and policy of ICOMOS in regard to the handling of electronic data. This must be done not in terms of hardware, software and procedures, but at the level of basic principle.
As a first step in this regard, I would propose the discussion of the following points with a view to drafting a resolution concerning data management.
What can or should ICOMOS do?
From a technical perspective, the first step would be to create a storage structure for any conservation data offered, together with a global database, indexed on an asset's unique location on earth, which would provide a way to find the information it references.
In the beginning, most of the information linked to this index would be hard copy on the shelves of the documentation centre and in libraries around the world, but whenever a document is requested it would be scanned and put on-line as image files. Later, as time allows or demand warrants, these files would be processed into the standard file formats of the archive. Of course, all new documents coming in would be converted to the standard file types and put on-line immediately.
The long-term security of the archive would at first be assured by the establishment of a mirror site remote from the main ICOMOS server, but within a few years the process of copying the data/software set to new media should commence, with the old systems being put into storage for posterity.
What MUST ICOMOS do?
ICOMOS as an organization has three options in the face of this