Paper and Network Scholarships:
The Logistical Limits and Futures of Cultural Studies
Dr T. Matthew Ciolek,
Research School of Pacific and Asian Studies,
Australian National University, Canberra ACT 0200, Australia
tmciolek@coombs.anu.edu.au
http://www.ciolek.com/PEOPLE/ciolek-tm.html
To be presented at the
ECAITech session of the
Pacific Neighborhood
Consortium (PNC) Annual Meeting,
University of California at Berkeley, Berkeley, USA,
13-17 January 2000
Document created: 4 Dec 1999. Last revised: 15 Dec 1999.
0. Introduction
This paper reaches three conclusions: (i) that traditional, i.e.
paper-based, scholarship suffers from several permanent mechanical and
logistical limitations, and that, therefore, no major progress is likely
in cultural studies until another, less handicapped medium, possibly the
Internet, is employed; (ii) that unfortunately the Internet, despite
having been introduced over 30 years ago, is not yet used in an
efficient and imaginative manner; (iii) that serious use of the Net
might be able to bring about a renaissance of cultural studies, albeit
at the cost of significant social and political dislocations to the
existing research practices and institutions.
The paper commences with an abstract and rather assertive discourse.
Midway, however, it waxes philosophical and does a good deal of
intellectual meandering and back-tracking. Finally, it offers a
series of hesitant, tentative, and very sketchy hypotheses. This is
inevitable, for it is our past that we perceive best. The present has a
tendency to coalesce into an unexpected pattern, one which reveals both
its uncalled-for precedents and its inferred consequences, while the
future is inevitably a dazzling array of competing trajectories and
blueprints. Experience shows that most of our projections, however
firm or tentative, prove half-baked and misguided.
Any merit of this paper, even the slightest, I dedicate fully to my dear
ECAITech friends: Janice M. Glowski, Jeanette Zerneke, Susan Whitfield, Ian
Johnson, Larry Crissman and Andrew Wilson. Without steady and mould-breaking
contacts with them, my synapses would be disappointingly sluggish.
1. Three types of cultural studies
Cultural studies, like Caesar's Gaul, form three major groups.
Firstly, there are studies which create verbal models. This
type of work usually takes the form of a narrative, a dissertation or
an encyclopedic dictionary. In narratives, such as stories or
travelogues, a sequential temporal theme prevails. In dissertations,
materials and arguments are presented and discussed according to the
structure of a subject-tree of issues and data. Finally, in
dictionaries, available information is fragmented and shaped into a
multitude of self-contained nuggets. These are presented in an
arbitrary, usually alphabetic, order.
Secondly, there are studies which start with an array of variables.
Each of these variables is given a distinct value or attribute.
Some of the variables are numeric, while others can be
qualitative. Their common feature, however, is that together they form
an invisible documentary basis for a graphic display: a graph,
a chart, or a map. Such a graphic display is subsequently annotated with
verbal labels, comments and explanatory notes. However, within the realm of
graphic models words play only a minor role. Maps and charts
communicate chiefly nonverbally. They show their wealth of
information wholeheartedly but do not say much about the revealed
connections and regularities amongst the source data.
Any relationships are revealed through the skilful use of lines, points,
and areas, as well as symbols, patterning and colours.
The third category of studies, numeric models and analyses,
also commences its existence with a body of data. However, in such models
the many variables to which the data pertain are put together in the
form of a matrix or an equation. In all cases the
assembled data form a dynamic lattice of quantitative relationships.
In numeric studies each of the constituent cells or elements plays a dual role:
it both represents (reports) its own value and forms a logical
link with the values assigned to other parts of the model. Naturally,
numeric studies do make use of words. However, such words merely
elucidate the arithmetic patterns embedded in the assembled information.
They carry a meta-commentary and do not
constitute the primary information itself.
All three categories of social science and humanities research have a
long and respectable history. The verbal model can be said to
originate, at least in the Western world, with the writings of
Herodotus of Halicarnassus (485-425 BCE), a man who is commonly
regarded as the 'Father of History.' Maps, as tools for systematic
presentation of large volumes of complex and interrelated information,
also have an ancient heritage. They were known already to Herodotus
himself. However, it might be convenient, for mnemonics' sake, to link
them to the research of Claudius Ptolemaeus (ca.100-ca.168 CE). He was
a Greek geographer who worked in Roman Egypt and who compiled the seminal
Introduction to Geography, accompanied by an extensive map of
the Graeco-Roman world and a gazetteer of ca. 8,000 partially
geo-referenced placenames (PWN 1967:601). Numeric studies, by
contrast, are a more recent invention. They can be said to originate
with the quantitative demographic investigations (see Note 1) of Sir William
Petty (1623-1687) and Gregory King (1648-1712), the economic studies of
Francois Quesnay (1694-1774), and the works of the Marquis Nicolas de
Caritat de Condorcet (1743-1794).
Condorcet, a French mathematician and philosopher of culture, was
among the first Europeans to suggest that if social science research
was based on the use of statistics, starting with descriptive
statistics, then it could be conducted rigorously and objectively,
just like the natural sciences (PWN 1963:593).
Handy examples of verbal models of social and cultural realities
are provided by both S. Runciman's superbly detailed chronological
accounts of the Crusades (Runciman 1978abc), and F. Braudel's discussions
of the grammar, growth and transformations of the world civilisations
(Braudel 1995). The graphic model, of course, is best exemplified by
a plethora of maps and atlases dealing with social, historical,
economic and cultural variables and processes. Some more recent
specimens of these are the maps found in Putzger
(1963), Shepherd (1976), Scarre (1988), Stone (1989), and Vidal-Naquet
(1992).
The numeric model approach also has many examples, so
two of them will suffice here. The first is a large set of demographic
data for Asia (ESCAP 1994). The model is composed of a matrix of 17
columns and 59 rows. The columns cover variables ranging from such
topics as 'mid-1994 population estimates', 'annual growth rate' and
'no of persons per km2' to 'population doubling time at current rate'
and '1992 GNP per capita'. Figures are provided for 53 countries and
territories. There are also separate totals for East Asia, South East
Asia, South Asia, Central Asia, Pacific areas and the grand total for
all of the Asia-Pacific region. In sum, the table is made of a grid of
1003 directly and indirectly interdependent cells.
Another example of a numeric model is the study of the supply
requirements and arrangements of the Macedonian army during
Alexander the Great's campaigns against the Persians, Scythians and Indians
(Engels 1978). Although the bulk of the study is made up of words, and the
book itself contains several charts showing the roads taken by Macedonian
troops at various phases of their war for the East, all key
deliberations are predicated upon a simple arithmetic formula. The
formula balances the daily food and water requirements of the troops, as
well as those of the accompanying animals (both the war-horses and the
horses, mules and camels of the baggage trains), against the physical
capacity of humans and animals to carry weights (such as supplies,
armour and weapons, and tools) over extended distances. It is through
the use of this equation that Engels can show convincingly that, before
the introduction of railway transport in the 19th century, it was
impossible for a body of men or animals, regardless of how many of them
travelled together, to proceed for "more than four days without
replenishing its water in a terrain where no water or grain was
obtainable" (Engels 1978:63).
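The logic of Engels' equation can be sketched in a few lines of code. The
numeric figures below are illustrative assumptions of mine, not Engels'
published values; only the structure of the calculation follows his
argument. The crucial point is that the number of carriers cancels out,
which is why no column, however large, could exceed a few days in barren
terrain:

    # A minimal sketch of an Engels-style supply calculation (Python).
    # All numeric values are illustrative assumptions, not Engels' data.

    DAILY_NEEDS_KG = 10.0   # assumed food + water consumed per carrier per day
    CARRY_LIMIT_KG = 36.0   # assumed load one carrier can haul besides equipment

    def days_sustainable(carriers: int) -> float:
        """Days a self-supplied column can march without resupply.
        Every extra man or animal adds carrying capacity and consumption
        in equal measure, so the column size cancels out of the result."""
        total_supply = carriers * CARRY_LIMIT_KG
        total_daily_consumption = carriers * DAILY_NEEDS_KG
        return total_supply / total_daily_consumption

    print(days_sustainable(100))     # 3.6 days
    print(days_sustainable(65_000))  # still 3.6 days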
Of course, in the majority of contemporary studies of cultures and
societies all three approaches are freely used side by side,
often within the context of a single investigation.
2. The mechanical problems of paper-based scholarship
Throughout the 2,500 years since Herodotus all three types of
scholarship have depended on the extensive use of a medium such as papyrus
(invented ca. 3000 BCE), subsequently replaced with parchment
(invented ca. 200 BCE) and, finally, with paper (invented ca. 100
BCE). All three materials played a vital role in
promoting the speedy production and distribution of knowledge. In all
cases, the medium of publication, such as paper (let us focus, for
the sake of simplicity, on the most recent and most widely used
invention), invariably performs a dual function. It acts, in the
private and public spheres of circulation of information alike, both
as a storage and as a display device.
In the first instance it is a tool, a mechanism for keeping data
(and any associated commentaries) recorded, preserved and ready for any
subsequent uses. In its second aspect, it is used to provide a
convenient listing or presentation of the patterns and regularities
contained within the recorded data. It is a subtle but important
distinction, one which became apparent only recently, with
the introduction of communication networks and client-server
technology. To use the Internet's parlance: paper, when used as a
tool for communication, behaves like both a server and a browser
system.
The widespread and continuing reliance on paper for the storage and
distribution of information is favoured by many factors. These include
cellulose's impressive resilience to the damage caused by the elements or
vermin. They also include the clarity and high precision with which
letters, images and numbers can be imprinted on the surface of a page.
The ease of folding, trimming and binding a large paper leaf (i.e.
plano) into formats such as folio, quarto, octavo and so on (Ventura
Pacific 1999) is also important. However, paper's chief
advantage is its remarkable cheapness. For example, each of
the Gutenberg Bibles managed to deliver information in the form of an
accessibly priced volume which previously would have required
parchment made from the skins of over 200 sheep (Manguel 1996:135).
Similarly, the massive history of Europe (Davies 1997), printed on some
1385 pages, can be purchased for AU$29.95, that is, for about
two cents a page.
Nevertheless, convenient and popular as they are, all paper-based
publications suffer from two major and, as we shall see, essentially
unavoidable handicaps. These are: (i) stringent limitations on the
overall volume of information a paper document can adequately handle;
and (ii) the built-in incorrigibility of paper-stored data.
Normally, we are not conscious of these shortcomings, and if we are, we
do not take them too seriously. The last five hundred years of print
technology (or eleven hundred, if we consider China) and the
complete ubiquity of paper have greatly desensitised us to the
existence of these 'hidden' and stubborn physical problems.
Firstly, there are ergonomic restrictions on a publication's overall
size and weight. The human hand favours only those objects which can
be held comfortably in its grasp for extended periods of time. Some of
these pragmatic stipulations are quite old. For example, in 1527 King
Francois I decreed a set of standard sizes for all books published in
France, and any printer who broke this rule was to be punished by a
term in prison (Manguel 1996:127). In addition to purely ergonomic
considerations, the maximum weight of a publication is also controlled
by the existing schedules of postal charges (see Note 2).
After five centuries of trial and error, the final range of
formats for printed publication is quite narrow (Ventura Pacific
1999). In the late 20th century, at the top end of the spectrum there
are books such as Le Petit Larousse (Larousse 1993), an encyclopaedic
dictionary some 1872 pages long. The entire volume is 8 cm thick,
weighs 4 kg and contains some 2 million words. There is also the already
mentioned history of Europe (Davies 1997). That volume is 6 cm thick,
weighs about 1.7 kg and contains some 600,000 words.
The lower end of the spectrum contains miniature documents.
There are books (e.g. those in the 'Shambhala Pocket Classics' and
'Shambhala Centaur Editions' series) which are issued in the tiny
tricesimo-secundo (32mo) format. They are no more than 10.5 cm wide
and 12.5 cm long. They have about 140 pages, weigh approximately 0.05 kg,
and are no more than 1.5 cm thick. On average they contain about
17,000 words. Despite their small dimensions their individual pages
can still be opened without much difficulty and the text they deliver
is still ample and readily legible.
The middle ground, by definition, is occupied by publications falling
between these two extremes.
Table 1
Some physical characteristics of recently published books on Asian studies
----------------------------------------------------------------------------------------
Aspect/Book format* Duodecimo Octavo Quarto Total sample
----------------------------------------------------------------------------------------
Numbers in the PBO catalogue 3 81 16 100
No of pages - average 289 224 233
No of pages - range 216-346 42-432 76-691
Dimensions - average (cm) 11.5x17.9 17.1x22.2 20.3x26.2
Thickness** - average (cm) 1.9 1.0 1.1
Area p/page - average (cm2) 206 380 532
Weight - average (kg) 0.12 0.26 0.66
Weight - range (kg) 0.10-0.17 0.07-0.50 0.16-1.53
No of words per book*** 77,000 111,000 161,000
----------------------------------------------------------------------------------------
* For book sizes terminology see Note 3 and Roberts and Etherington (1999).
** Estimated, assuming the average no. of pages
*** Estimated, assuming 1.30 words/cm2 of page and rounded to the nearest thousand
An analysis of the details of 100 books listed in the catalogues of an
electronic bookshop (Philippine Bookstore Online 1997) indicates (see
Table 1) that approximately 80% of commercial publications dealing
with Asian culture, history and current affairs are octavo-sized
books (approx. 17x22 cm). These books are 1 cm thick, contain
approximately 230 pages and weigh about a quarter of a kilogram. These
portable containers of information carry about 111,000 words. Only
three percent of the books in the analysed sample have the smaller
(i.e. duodecimo) format, and sixteen percent are issued in the larger
(i.e. quarto) format. Finally, only two percent of the books in the
publisher's catalogue weigh over 1 kg, and about one in five
(18 percent) have more than 300 pages.
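As a cross-check, the word-count estimates of Table 1 can be reproduced
from the table's own averages. The following minimal sketch (in Python)
uses the table's stated assumption of 1.30 words per cm2 of page:

    # Reconstructing the 'No of words per book' row of Table 1.
    FORMATS = {
        # name: (average pages, average page area in cm2)
        "duodecimo": (289, 206),
        "octavo":    (224, 380),
        "quarto":    (233, 532),
    }
    WORDS_PER_CM2 = 1.30  # the table's stated assumption

    for name, (pages, area_cm2) in FORMATS.items():
        words = pages * area_cm2 * WORDS_PER_CM2
        print(f"{name:9s} ~{words:,.0f} words")
    # duodecimo ~77,394 words  (rounded to 77,000 in Table 1)
    # octavo    ~110,656 words (111,000)
    # quarto    ~161,143 words (161,000)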
The above data suggest that in the world of paper documents - such as
manuscripts, journals and books - there is a dynamic
relationship amongst a host of variables. While the smallest
and largest physical parameters of a document are controlled by the
economics of the postal distribution system and the anatomical
characteristics of the human body, their intermediate values are dictated
by a series of tradeoffs between the document's legibility, portability
and the amount of information it attempts to deliver.
The type-face, the point-size, the spacing between lines and the width
of margins all influence the number of pages a book needs to have. For
instance, a recent decision by the editors of the ANU RSPAS journal
'Australian Archaeology' (see Note 2) to use 10-point Times,
instead of the hitherto prevailing 11-point, has succeeded in packing
14% more words onto each page of the periodical (Andrews 1999). The number
of pages is controlled, in turn, by their dimensions and their
cumulative weight. At the same time, the weight and thickness of a
sheet of paper are determined by the information's planned durability
and the circumstances of expected use. In other words, paper as a
medium for written communication offers an ample and convenient but
always firmly circumscribed field of action.
As the work by N. Davies testifies, a volume published in one of the
smaller formats (such as quarto or octavo) can contain nearly 1500
pages and still fit within a tolerable weight-limit. This is not
possible, however, with larger formats such as folio, elephant folio
(i.e. a newspaper page, or 40x58 cm), atlas folio, and, finally,
double elephant folio. Here, any increase in the overall dimensions of
a page needs to be fully compensated for by a corresponding reduction
in the overall number of pages.
So, if the overall amount of space available
for words, images and numbers is limited, then such physical
constraints strictly govern the amount and quality of information which can be
carried between the covers of a manuscript, newspaper or book.
A couple of trends can be discerned here.
Firstly, the more
information a document attempts to deliver, the greater the likelihood
that in a verbal model this information will be geographically and
chronologically circumscribed or, alternatively, ruthlessly
simplified in order to match the available display space and
weight. The already cited work by S. Runciman exemplifies the first
tactic, F. Braudel's the second.
In the case of a numeric model, the physical aspect of a publication
will influence both the granularity of the data to be presented and
the number of associated explanatory notes and glosses. If the
physical dimensions of a book prove restrictive, then large data sets
will tend to limit the number of
variables they deal with. They will concentrate instead on providing a
wealth of detail pertinent to just a handful of phenomena.
Alternatively, they can remain wide in coverage, but then they
must surely remain short on specifics. In this case information
will be aggregated and retabulated in order to meet the space and weight
standards of the publication.
Comparable compromises can also be seen in the world of graphic
models. Here, the physical size of a leaf of paper dictates the scale
and hence the amount of detail a map can convey. For instance, a chart
of the whole world, printed on a single page of a standard
atlas (say, 24 cm wide) needs to be at 1:125 mln scale.
Similarly, a map of the whole world fitting on two pages of an atlas
(say, 48 cm wide) needs to be drafted at no more than the 1:66 mln
scale. This means that a more detailed, say 1:1 mln, and still physically
manageable map is possible only for small places like Sri Lanka,
Belgium or Maine. However, a seamless map of the whole world at 1:1
mln scale, which is an approach adopted very successfully across the
electronic medium by the Digital Chart of the World - DCW datasets
(see NCSU Libraries 1999), is most unlikely to be created and used, as
it would have to be approx 20 meters tall and 40 meters wide.
There is also an additional complication. For both technical and ergonomic
reasons the narrowest line which can be printed on a sheet
of paper is about 0.1 mm wide (Johnson 1999). This means that in the case
of a map drawn to the 1:66 mln scale, the positional error is inevitably
about 6.6 km. The same error, in the case of a map drawn to the 1:1 mln
scale, would be about 100 meters.
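These figures are easy to verify. The back-of-envelope sketch below (in
Python) takes the Earth's equatorial circumference of roughly 40,000 km
as its only outside datum; everything else follows from the scales and
page widths quoted above:

    # Ground distance covered by a page of a given width at a given scale:
    def ground_width_km(page_cm: float, scale_denominator: int) -> float:
        return page_cm * scale_denominator / 100_000  # cm on the ground -> km

    print(ground_width_km(24, 125_000_000))  # 30,000 km: one atlas page
    print(ground_width_km(48, 66_000_000))   # ~31,700 km: a two-page spread

    # Physical width of a seamless 1:1 mln world map (40,000 km extent):
    print(40_000 * 100_000 / 1_000_000 / 100, "m")  # 40.0 m wide

    # Positional error of the narrowest printable (0.1 mm = 0.01 cm) line:
    print(0.01 * 66_000_000 / 100_000, "km")  # 6.6 km at 1:66 mln
    print(0.01 * 1_000_000 / 100, "m")        # 100.0 m at 1:1 mln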
All this means that paper as a medium for scholarly communication
inevitably suffers from a series of mechanical handicaps. Naturally,
these limitations are not fully apparent when the data sets are small,
or when the need for detail or great precision is not a consideration.
However, if more serious work is undertaken, the overall
mechanical clumsiness of paper as a storage/display device requires us
to start making invidious choices:
- In the case of verbal models, authors are forced to determine
how much space will be allocated to the presentation of primary sources and
extracted data, and how much of it to the ensuing commentaries,
interpretations and discussions.
- In the case of graphic models, authors - draughtsmen or
cartographers - are forced to decide whether each chart will show
precise but necessarily local information or whether it should
endeavour to provide readers with generalised but necessarily
fuzzy information.
- Finally, in the case of numerical models, the choice is
between detailed or plentiful information.
It is clear then that the scholar's skill and training do not
reside only in his/her ability to locate data, or in the capacity to
analyse them thoroughly. Nor do they reside solely in the speed and
aptitude with which findings can be juxtaposed with those of other
researchers. The skill in question also goes well beyond one's ability
to dress thoughts in adequate words and present them as a cogent
and elegant series of premises, hypotheses, findings and conclusions.
Ultimately, it necessarily includes a gradually acquired ability to
choose and juggle, gracefully yet judiciously, between the ever-present
mechanical properties of the dominant medium, i.e. paper, where all the
scholarly work takes shape. This medium is always limited as well as
limiting, and it hence requires an inevitable compromise in regard
to the (i) scope, (ii) quality or (iii) accuracy of the presentation of
one's work.
In addition, as if the picture were not worrying
enough already, there is still another factor which needs to be considered.
3. The logistical problems of paper-based scholarship
Information, once printed and bound in the form of a journal or a book,
is extremely difficult to amend. Any change, any correction, any
addition, however minor, always implies a major technical procedure which is
both labour-intensive and costly. Modified pages have to
be printed afresh. If more than a few lines of information are added
to, or subtracted from, the text, the pagination of the rest of the
document is also adversely affected. In this situation all relevant
pages need to be reformatted, reprinted and rebound. In short, paper as a
carrier of information is characterised by two built-in logistical or
technical biases.
Firstly, the overall complexity, time and cost of production of a
printed document make its authors and editors aim for the utmost
perfection of their work, in both content and appearance. This
perfection is certainly welcome but it comes, nevertheless, at a
certain price. Not all of the addenda and corrigenda which reach the
authors' hands are likely to be included in their work. This is
because sooner or later the publication schedule starts taking
precedence over considerations of accuracy and completeness of the work
which is about to be printed. As a result, authors and publishers
tacitly agree to be 'realistic' about their striving for
perfection and freedom from errors.
Secondly, the overall complexity and costliness of revisions to
already printed and disseminated documents very strongly
discourage the production of any subsequent errata and modifications. A
book, or an atlas, or a table of statistical materials is always
treated as a singular, once-in-a-lifetime operation. Thus a
body of work, as soon as it is completed and released into the hands of the
public, tends to operate as a relic. It turns into an object which is
either revered or laughed at, but is unlikely ever to be improved and
worked on. Printed information never gets replaced, however erroneous
and inadequate it is found to be. Instead, if the work is bad, it is
merely supplemented (logically but not physically) by another batch of
print on the same topic. Bad paper-based information never gets
repaired and, interestingly enough, almost never gets removed from
circulation. A number of practical, financial as well as cultural
reasons make the burning or pulping of erroneous materials a rare
practice, at least in the late 20th century. Once inadequate material
starts getting distributed, its retraction and emendation becomes a
logistical nightmare. For an account of some of these nightmares see Note 4.
Consequently, all paper-based social and cultural studies depend on
skilful navigation in an ever-growing ocean of information of diverse
quality. In an extensive mass of data and commentaries a single page may
contain startling truths or outrageous lies, and the difference
between the two may be discovered and highlighted only by words
printed on some other page, some other time, in some other
publication. The chances that a direct physical link between two such
documents will ever be formed are non-existent.
4. The hierarchy of scholarly publications
Obviously, no cultural research publication occurs on its
own. Cultural research is always situated within the overlapping contexts
of an already existing body of relevant knowledge. These contexts are always
present in the form of references to, and acknowledgments of, the key
aspects of earlier research and publications. The amount and kind of
references made to previous studies naturally vary from one author
to another. However, certain basic trends can be discerned (see Table
2).
Table 2
The volume of earlier material quoted by scholarly publications
--------------------------------------------------------------------
                            Materials used*
Publ. type          primary sources    articles      books
--------------------------------------------------------------------
Research papers                   4           9         27
--------------------------------------------------------------------
Monographs                       17         103         93
--------------------------------------------------------------------
Syntheses
  atlases**                   4,335      26,265     23,742
  maps***                        20         119        107
--------------------------------------------------------------------
* For details of these materials see Note 5
** An 'average' historical atlas is based on data from 27 earlier
   atlases and 255 books.
*** An 'average' historical atlas contains approx. 221 maps.
Thus, an average research paper in the field of Asian studies seems to
use about 40 other publications. The corresponding knowledge-base of
a historical monograph, in turn, is about 200 items strong. The
knowledge-base of large-scale syntheses such as historical atlases is,
unsurprisingly, much wider. Such works make use of about 54,000
publications.
Of course, one should expect a considerable overlap between the
bibliographies used to produce each of those publications. Unfortunately,
there are no readily available data on the extent of
such overlaps. Therefore, the figures in Table 2 should be treated only as
very rough estimates. They are useful because they point to the order of
magnitude of the phenomenon in question. They confirm that in a world
where storage and display space is at a great premium it is not really
possible for higher-level publications such as encyclopedias or
atlases to use verbatim all the information contained in the
publications on which they have drawn.
According to the proverb, a single picture is worth 1000 words. Table
2 above suggests that a single map delivers data selected from over
100 books. This information is neatly synthesised in the form of a
graphic chart ranging in dimensions from a 'generous' 25x35 cm (i.e. 875 cm2)
to a half, one third, or even a meagre quarter of that area.
It is natural, then, that even the best background materials cannot
help but be used in an increasingly succinct and abstract manner.
Therefore data, where possible, are aggregated. Their detailed
annotations and discursive commentaries are either abbreviated or
dropped altogether. Finally, details of their authors and
publications start getting curtailed and, eventually, omitted. This
wholesale process of progressive abstraction of the originally detailed
information can be expressed in the form of Table 3.
Table 3
Four levels of generality in scholarly publications
-------------------------------------------------------------------------
Level of Range of Type of
generality information publication
-------------------------------------------------------------------------
1st data + source + context + methodology research papers, journal articles
2nd data + source + context monographs
3rd data + source overviews, text books
4th data syntheses, maps, encyclopedias
[also, newspapers & propaganda]
-------------------------------------------------------------------------
The table suggests that existing publications always face a fundamental
dilemma.
The information that they offer is either documented in detail, but
hardly of universal relevance, or is relevant, albeit terribly disembodied.
In other words, a necessary level of generalisation is always achieved at a
cost. Each time a publication aims to cover an increasingly large
range of issues, geography or chronology, it inevitably risks
the unwitting introduction of errors of omission, as well as a
progressive loss of integrity. The latter happens because, without the
necessary qualifying annotations, factual findings
stemming from incompatible methodologies
can be (and often are) wrongly conjoined.
5. The presence of errors in cultural studies
Blunders and mistakes present in paper publications tend to form
two large groups: contagious and non-contagious ones. On
the whole, verbal models seem to provide an environment where a
factual error can frequently occur, due to poor copy-editing or
inadequate typesetting, but where it remains largely innocuous. This is
because verbal models rely on the integrity of the overall logic of
the entire argument and not on any particular value of its
variables. Some of these, such as dates, place names, or the names
of the chief characters, are of course crucial and, as such, tend to be
proofread and checked very carefully. But other
details are less important, and thus any likely fault with them does not
undermine the validity of the study in which they arise. For instance,
Sir Steven Runciman, in his monumental history of the Crusades,
commented on the march of some 20,000 people led in May 1096 by
Peter the Hermit across the Kingdom of Hungary:
"The vast majority travelled on foot. Where roads were good they
managed to cover twenty-five miles a day." (Runciman 1978a:124)
The point made by the sentence in question is that a greater speed of
movement was achieved when pilgrims and crusaders walked along
well-built and well-maintained roads. It is a truthful
observation, and it is not undermined by the fact that the
stated value of 25 miles/day is impossibly high (Engels 1978,
Elting 1997:463, Lewis 1997:142). An identical
error occurs in his comment on two places in Asia Minor:
Dorylaeum is "22 hours' marching distance from Leuce ... To reach
this point the vanguard would have had to cover some 85 miles in
four days." (Runciman 1978a:186-187).
Here once again the author incorrectly assumes that an army of several
thousand men was able to sustain a forced march for four consecutive
days at the improbably high speed of 6.2 km/hr. For a discussion of
what constitutes real-life marching speeds see Note 6.
A third example of a non-contagious error is offered by Map no. 13,
"Central Europe", published in vol. 10 of 'The Cambridge Ancient
History' (Cook et al. 1934:346-347). There, in addition to the locations
and names of the major cities of the Augustan Empire
(44 BCE-70 CE), there is also a dot surreally marking the city of
Berlin, that is, a settlement which came into existence only in
medieval times, in the 1230s (New Encyclopaedia Britannica 1974:733).
This blunder is so simple, so monumental, and so divorced from the other
information conveyed by the map, that it is unlikely ever to be
transferred to other publications. It can be concluded, therefore, that
as long as blunders do not alter essential relationships between other
variables within a verbal or graphic model, they stay mostly harmless. Such
mistakes tend to remain isolated, and are eventually forgotten.
Unfortunately, this is not the case with some other errors. It is
especially so if they occur in the context of a more rigorous and
more demanding graphic or numeric model. In these more tightly
integrated studies, the variables, by definition, form the sinews of
an argument, and the foundations of a thesis. Therefore, any errors,
if present, are very difficult to eradicate.
For instance, among the many charts comprising the '"Times" Atlas of
World History' there is a map entitled "The economic life of the Roman
Empire" (Stone 1989:91). In that map, among the many lines signalling
the existence of communication links between various parts of the
empire of the 2nd c. CE there is one representing a merchant sea route
joining Italy and Egypt. A legend attached to the line states:
"Alexandria-Puteoli 15-20 days (fastest 9 days)." (Stone 1989:91)
This is a curious and treacherous observation. It is true that
Roman merchant ships (as distinct from oar-powered warships),
equipped with a square sail, could catch the NW winds prevailing
throughout most of the sailing season along the route between
Italy and North Africa, and thus reach the Egyptian port in no more than
15-20 days. However, any journey in the opposite direction - one that
would face the adverse winds without the aid of a movable triangular,
i.e. 'lateen', sail (an Arabic invention of the 8th c. CE) - would
necessarily have been more cumbersome and slower. Casson estimates
(1984:15) that such a journey's length would be in the vicinity of 40
days. He also points out that this miscalculation is a simple, but
nevertheless common, error of reasoning. It was first committed by Oertel
(1934:387) in an article commissioned for vol. 10 of 'The Cambridge
Ancient History' (Cook et al. 1934). This error remained undetected by
the editorial team in charge, was published and accepted in good faith
by other historians, and was reproduced without any qualms 55 years later
by the editorial team of the '"Times" Atlas of World History'.
A similar problem can be spotted in numeric models. If we look at the
already mentioned ESCAP Population Data Sheet (ESCAP Population
Division 1994), we notice that the table, printed on a 49x60 cm
(2940 cm2) chart, aims to provide historic data plus their
extrapolations for 53 individual countries and five major regions of
the Asia-Pacific area. However, despite its authoritative parentage -
there is no doubt that the UN's ESCAP Population Division is a well
resourced and a serious research organisation - the Data Sheet
presents a number of problems.
Firstly, there is a key omission, motivated by political, i.e.
un-scholarly, considerations. The table provides absolutely no data on
the approximately 18 million people living in Taiwan, while being
quite detailed about the demographics of the two thousand inhabitants
of the Pacific island of Niue. At the same time it offers no
indication of the fate of the missing information. It does not state
whether the demographic and economic data for Taiwan were or were not
included in the calculations for some other country (such as, for
example, the PRC). Secondly, although the statistics for each country are
annotated with details of a source from which they were derived, there is
no indication of the methodology employed to calculate
reference values for the 37 countries in question. In some cases more
than one source was consulted, and we are not told which of the sources
were ultimately used. Thirdly, the table offers no explanation of the
methodology used in the calculations. Was it an extrapolation of some kind,
derived from the analysis of figures available for the years 1990
through 1993? Or was it a simple carry-over of the values of the most
recent year? There are tacit problems as well. The ESCAP numeric model
of Asia-Pacific populations provides no indication of the temporal
range of data (i.e. the last two years, the last five years, the last
decade, etc.) used in computing such key variables as the annual growth
rates, life expectancy at birth, fertility rate per woman and so forth.
In short, the ESCAP figures collated and made available world-wide are not
only incomplete but also distributed without any
reference to the underlying methodologies and assumptions.
The statistical figures are collated, printed in thousands of copies,
and distributed without any built-in means for subsequent independent
checks and controls. The ESCAP materials can therefore be concluded to be
unscientific. They are unscientific because they represent public
and oft-quoted authoritative statements, but of the type which cannot be
critically scrutinised by their recipients and users (Popper 1969).
Unfortunately, as we have already observed, there is no logistical
mechanism through which authors and publishers, upon discovering a mistake
in the printed data, could systematically flag and correct the error,
all over the world, on the pages of all copies of all books which are known
to contain it. Therefore, it is inevitable that blunders of the
magnitudes just described, and greater, will persist unchecked and
will have the potential to contaminate and hinder subsequent
research.
This pessimism is justified. The separation of
data from their sources and their methodologies, so obvious in the
ESCAP numeric model, is a common trait of many contemporary higher-level
syntheses. An analysis of 106 maps with information on the trade routes
of Africa, Asia and Europe proves the point (see Table 4).
Table 4
Problems with the maps published in five selected* historical atlases
--------------------------------------------------------------
Shortcoming Percentage of maps
displaying the problem
--------------------------------------------------------------
No time-frame 9%
No legend/key to symbols 12%
Factual errors present 20%
No lat-long grid 53%
No source of information stated 68%
Incomplete annotations of data 69%
No scale is specified 74%
No projection is specified 100%
--------------------------------------------------------------
Total 106 cases (100%)
--------------------------------------------------------------
* atlases analysed: Putzger 1963, Shepherd 1976, Scarre 1988,
Stone 1989, Vidal-Naquet 1992
There is, therefore, no doubt that the above atlases suffer from a
number of serious data-management problems. That some 9 percent of the
maps published in historical atlases carry no information on the time
period to which they are supposed to pertain is only one of the nasty
surprises. The discovery that no less than 20 percent of these
publications contain a major error of omission or commission is
another revelation. However, the greatest of them all is the truly
unnerving discovery that in 68 percent of cases the maps are
unable to provide even the slightest indication of the identity and
nature of the publications which they supposedly summarise and whose
intellectual content they present urbi et orbi in full colour
and typographic splendour.
Does it mean that, upon approaching a certain level of abstraction,
even the most ambitious of projects turn painstakingly collated
data into mere coffee-table books? Does it mean that once we try to
distil wisdom from some 50,000 books and journal articles we are
forced, by the very nature and economics of the prevailing technology,
to produce a mere collection of untrustworthy, and therefore useless,
pictures? Does it mean that high-level syntheses are something to
look at, but not to take seriously and definitely not to rely on in one's
own work? Does it mean that cultural studies might essentially be a
massive yet non-cumulative enterprise? I am afraid it might be so.
6. The dilemma of cultural studies
From what has been considered so far we can conclude that paper-based
cultural studies seem to be afflicted by a deeply rooted
and possibly ineradicable contradiction:
Low-level, grainy studies are cumbersome for the task of developing
a universally legible global picture of cultural phenomena. This is
simply because they are not constructed in a manner which facilitates
a global interchange of data and a dovetailing of conclusions. Such
studies also tend to be much too idiosyncratic, too 'messy', and far
too context-specific. High-level studies, on the other hand, are able
to provide much-prized clarity, better understanding of the
subject-matter and fresh insight, but ultimately - as we have
observed before - they are not fully checkable. This happens - as has
been the case with the best-selling Cultural Atlas of China (Blunden
and Elvin 1998) - when the work is prepared to exacting scholarly
standards but, subsequently, most of the apparatus gets suppressed by
the packager or publisher for reasons of space and cost (Elvin 1999).
The sad truth is that, some 25 hundred years since Herodotus, and 19
hundred years since Ptolemaeus, we are still unable to know with much
certainty:
- what primary materials have been actually used to create a particular set
of conclusions;
- what conclusions have been reached on the basis of a particular set
of data;
- how to check and then repair data speedily, simply and
inexpensively and re-draft related conclusions when an error in either of
them has been located.
This surely is an unhappy state of affairs. My purpose in pointing
to it is not to whinge and bemoan it. Nor is this an occasion
for witty observations and clever remarks. Rather, the above analyses and
observations are undertaken to ascertain, for myself (and interested
colleagues), the likely invisible boundaries to the work we
have been doing for so long and so energetically.
Therefore, this paper is akin to an investigation of the technical
limitations of car traffic as opposed to railway traffic or air
traffic. The realisation that trains, upon reaching a certain speed,
inevitably lose traction and are prevented from travelling any
faster is a very practical finding. Similarly useful is the reminder that
passenger and cargo planes are unable to travel at speeds below their
built-in stalling speed. These observations simply tell us about the
performance and efficiency one can legitimately expect of a given
technology in a given range of contexts.
So a question arises: if the use of paper for the storage and display of
scholarly work inevitably leads, as soon as our syntheses become
increasingly global in scale, to drastic over-simplifications,
incompleteness of glosses, and occurrences of structural as well as
random errors, what is the key reason these problems
cannot be rectified? Well, the answer to this question should by now be
self-evident.
The loss of confidence with which we can approach high-level
generalisations occurs because the necessary permanent links
between data and interpretations, that is, links between
the source materials and inferences, are typically lost
or broken.
Let me reiterate: it is not the existence of
mistakes and errors in the published materials per se that
worries me here. Let's be realistic - we are all human. We have all
made, at one point or another of our research, an error or two or
more. The problem I am concerned with is more fundamental. Given that
errors and oversimplifications are unavoidable and always find a
way into our work, the question is: will we ever be able to track
them down if necessary and repair them as a matter of course, as another
part of our work routine?
7. Is a shift to the electronic medium a solution? 3 answers...
So, paper-based scholarship seems to have reached its natural
limitations. Its ongoing growth and wellbeing are daily undermined by
the logic of the very medium that the scholarship depends on. A question
therefore arises with respect to the future of cultural studies. Can
we, should we, bypass these limitations by switching our work to
another medium, say to the electronic medium, to the realm of digital
files, intelligent software and the Internet?
The answer to this question is again, like Caesar's Gaul, tripartite:
yes, no and perhaps.
Yes, we can take advantage of the electronic medium because it
has a number of favourable characteristics:
- Electronic storage of even very large volumes of information is
now inexpensive. Hence, there is no need to aggregate raw data.
Our scholarly apparatus and appendices can be as detailed and complete
as we wish. Our verbal, graphic and numeric models can be as ample and
rich, or as skinny and tentative as necessary.
- The cataloguing of digitally stored information can be made
automatic and reliable, without the author's or librarian's
intervention. A multitude of web-crawling spiders and agents already
roams the Net on an hourly basis. Entire collections of documents and
data-sets, no matter how small or how large they are, can be fully and
adequately tracked;
- The location and retrieval of networked data can be made very fast.
Any string of characters - and increasingly often, any sequence of
pixels - is now globally findable and globally downloadable.
- Online GIS systems are becoming ubiquitous. Complex information can be
arranged into arrays of individually or jointly accessible layers.
Each layer of information can be generated afresh in a few moments from
both highly aggregated and highly granular data.
- Vector-based GIS systems are not constrained by the scale of the
objects they handle. A single electronic display system can be as
detailed or as general as it needs to be. It can handle information
at all scales, ranging from 1:1 (in order to show the right
arm of an ancient sculpture in a historic garden) to 1:125 mln (in
order to show patterns of distribution of languages, political
systems, and religions across the entire planet).
- Live data sets (as opposed to 'canned', or prepackaged
ones) can be easily maintained online. Therefore, data can be not only made
accessible to all interested parties, but they also can be easily repaired
and enhanced, whenever the need for such intervention arises.
- For the first time in humanity's long history,
the Internet offers a unique opportunity for the endless, iterative
error-correction and perfectibility (Raymond 1998, Ditlea 1999) of
information released into the public domain.
The initial product does not need to suffer from the inflexible
publication-deadline syndrome. It can be worked on, refined and updated
before as well as after the publications stemming from it have been
placed online (see also Note 7).
- Dynamic, i.e. on-the-fly, graphic and numeric displays (as opposed
to more traditional prepackaged displays) can fairly easily be arranged
for. High-level syntheses can always be generated afresh, on the
basis of the most current (and, hopefully, most correct) rich and
un-aggregated data;
However, to be honest with ourselves we need to also answer the above
question in the negative. No, we cannot really take advantage
of the electronic medium because:
- Despite the fact that the Net is now over 30 years old (The New York
Times 1999) and the Web is no less than 8.5 years old (Ciolek 1998), we
still keep using them as substitutes for the paper-world. We continue to
think of the electronic media as a special case of inexpensive and
fast-transmitting faxes. Even the repertoire of our Internet keywords
alludes to paper-made objects: we talk excitedly about
electronic papers, electronic books and e-journals.
- We also continue to propagate, consciously and otherwise, the
primacy of conclusions over the data from which they stem. We
appear to value the elegance of verbal arguments more than the
cleanliness and integrity of the information on which those arguments
rest. The higher social ranking enjoyed by those who write papers and
books, as compared with those who verify data, compile bibliographies and
maintain resource catalogues, confirms this point.
- The truly new uses of the Internet, those which go beyond our current
attempts to replicate in digital format the tools and
resources hitherto known from the world of paper, imply the
emergence of distributed, open-ended, and large-scale collaborative
research initiatives.
Such projects will inevitably centre on the construction and use of
countless arrays of electronic data-sets whose maintenance and
evolution might continue well beyond the life-time of a particular
task, of a particular publication or funding
arrangement, or even the life-time of their original creators and
maintainers. But if prestige and importance are attached to
projects and not to their personnel, how shall we rate a scholar's
performance?
- The identity and well-being of such novel projects will
necessarily have to take precedence over those of the contributing
scholars and data-librarians. Such a hypothetical social arrangement,
however, goes against the very grain of the current employment and
promotion policies of the Academia (see Note 8).
This means that an effective shift to the new technology is, at
present, actively undermined and inhibited by the prevailing social
and micro-political practices and customs. The contrary statement can
also be made: the existing social structures and policies of research
institutions are very much threatened by the Internet-influenced shifts
in the role and definition of scholarly work.
- There is no known mechanism, no known software (so far) which can
be used to establish and preserve a simple, persistent and, importantly,
bi-directional online coupling between
(i) data;
(ii) all their transformations;
(iii) all conclusions based on them.
(A toy sketch of what such a coupling might look like is given after
this list.)
While the digital world promises, in principle, seamless tracking of
the data's legacy information, all such operations would need to be
made explicit and fully meaningful to a human operator. This means
that any postulated bi-directional hypertext links need to be not only
interoperable across a wide range of software and platforms but fully
legible to daily users too. Also, any problems the legacy data may
present need to be easily correctable by ordinary users. In short, if
the digital world is not going to tyrannise and dummify its
beneficiaries, all its component parts need to be made simple, intuitive,
transparent, and repairable - not by software wizards and networking
experts - but, like the WWW files, at the user's end. A tall order
indeed.
- The network, although capable of transcending paper's traditional
limitations of space, is itself severely constrained by issues of
time. Software-engineering and ergonomics studies confirm that
a task whose length is perfectly legitimate and acceptable in the
world of paper is perceived as excessively and insufferably long
whenever it involves computers and exceeds the 10-second (sic!) limit
(Nielsen 1994, 1997; Buckingham 1996). Thus electronic operations which
do not provide instant gratification to their audience, and ample
feedback on the progress of the transaction, are unlikely to be
readily engaged in. This is a serious issue, especially as studies
by Northeast Consulting Resources (1999) hint that, despite overall
improvements in network speeds, the Internet may not really be
able to sustain performance-based applications in the near future.
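To make the notion of bi-directional coupling more concrete, here is a
toy sketch (entirely hypothetical; no such system exists, which is
precisely the point of the bullet above) of a record in which every
conclusion keeps links back to the data and transformations it rests on,
and every datum keeps forward links to the conclusions built upon it, so
that a repair at either end becomes visible at the other:

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        """A datum, a transformation, or a conclusion in the scholarly record."""
        label: str
        derived_from: list = field(default_factory=list)  # back-links
        used_by: list = field(default_factory=list)       # forward-links

    def derive(label: str, *sources: Node) -> Node:
        """Create a new node, registering the link in both directions."""
        node = Node(label, derived_from=list(sources))
        for src in sources:
            src.used_by.append(node)
        return node

    def downstream(node: Node) -> list:
        """Everything that would need re-checking if 'node' were corrected."""
        found = []
        for n in node.used_by:
            found.append(n.label)
            found.extend(downstream(n))
        return found

    data = Node("census figures, 1990-1993")
    rates = derive("annual growth rates", data)
    claim = derive("projected doubling time", rates)
    print(downstream(data))  # ['annual growth rates', 'projected doubling time']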
Moreover, a crucial point: it is likely that the Net contains a
multitude of other mechanical, logical or logistical limitations,
whose existence and possible adverse consequences for our scholarly
work are, at present, simply unknown and invisible to us.
Finally, an answer to the question about transcending the boundaries
of paper-based scholarship by switching to the electronic medium
might be simply given as a hesitant and inelegant perhaps. Who
knows, maybe the networked digital medium will provide the required
solution after all. It might be so, because:
- Our prevailing attitudes to, and relationship with information are
not static and they may change. We may earnestly wish one day, for
example, to do the work we do, not only more easily and more quickly -
but also more accurately and more reliably.
- There is also another reason for qualified optimism. The
Internet and its tools do not cease to evolve and improve. What seemed
revolutionary a few years ago is now part of the elementary
technical and intellectual milieu. After a few months on the Net, even
the most magical and unfathomable invention is domesticated, taken for
granted and incorporated into routine. Innovation trots on.
The last major reshaping of the Internet occurred in 1995 (Ciolek
1998). It was the moment when the Web crawlers commenced roaming the
Net in search of data and started bringing them back to feed
batteries of ever-cleverer search engines. The previous revolution was
that of May 1991, when Tim Berners-Lee, Robert Cailliau and a team of
software engineers at CERN, Geneva devised a simple method of building a
(i) general-purpose
(ii) Internet-based archipelago of
(iii) unstructured data (texts, images, numbers)
(iv) connected by a world wide web of
(v) one-directional hypertext links.
Therefore, some eight or nine years after that momentous invention,
we may assume fairly safely that it is time to take the next step.
This may happen in the next few days, or in the next few months -
no one really knows when: another team of boffins is bound to devise
simple tools for creating a brand new
(i) general-purpose
(ii) Internet-based archipelago of data, this time carefully
(iii) structured ones, and
(iv) spanned by a world wide web of
(v) bi-directional hypertext links.
This, if we think about it, is not such an impossible scenario after
all. If this happens, a two-way communication and well calibrated
interaction between low-level and high-level studies might become
possible for the first time since Herodotus.
When this happens, it will truly be a development of unimaginable
consequences, both for our research practices and for all the social
institutions of which we are a vital part.
8. Notes
Note 1 - Numeric models of society
I am grateful to Prof. Mark Elvin for drawing my attention to the role
Gregory King and Francois Quesnay played in developing 'Political
Arithmetic' (population statistics) as a methodical and scholarly
operation.
Note 2 - Postal charges and publication weight
The maximum weight of a publication depends on existing schedules of
postal charges. For instance, each issue of a semi-annual journal,
'Australian Archaeology' published by the Research School of Pacific
and Asian Studies, ANU, in A4 (i.e. quarto) format, contains exactly 76
pages, so that it can stay just under 200 g (Andrews 1999). This 0.2 kg
limit is Australia Post's cut-off point for concessionally priced
printed matter.
Note 3 - Book and paper sizes
Book sizes were traditionally measured by the number of folds applied to
a leaf of paper of a base size. One leaf (i.e. plano),
if folded, creates a folio; folded again, it creates a quarto, and so forth.
These logical sizes have been assigned various numerical values. For
example, the generally accepted formats of books in the 20th century are:
Name Abbrev Max Book Length
-----------------------------------------------------------------------
Tricesimo-secundo 32mo. 5" 12.5 cm
Sextodecimo 16mo. 6" 15 cm
Duodecimo 12mo. 7" 18 cm
Octavo 8vo. 10" 25 cm
Quarto 4to. 12" 31 cm
Folio fo. over 12" 31 cm
Elephant Folio up to 23" 58.5 cm
Atlas Folio up to 25" 63.6 cm
Double Elephant Folio up to 50" 127.0 cm
-----------------------------------------------------------------------
Src: Ventura Pacific Ltd 1999
Nowadays, toward the end of the 20th century, world book sizes are
governed by the ISO 216 paper size system (Kuhn 1996). In the "A"
(European) series, the base leaf (A0) has an area of 1 m2 (841x1189 mm,
or 33.11x46.81 in).
The ISO specification generates the following sequence of
formats: 4A0, 2A0 - special publications; A0, A1 -
technical drawings, posters; A2, A3 - drawings, diagrams, large tables;
A4 - letters, magazines, forms, catalogues, laser printer and copying
machine output; A5 - note pads; A6 - postcards; B5, A5, B6, A6 - books
(e.g. the width and height of a B series format are the geometric means
of the corresponding A format and the next larger A format);
B4, A3 - newspapers.
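The "A" series can be generated mechanically: A0 is the sheet of area
1 m2 with a sqrt(2) height-to-width ratio, and each subsequent format
halves the longer side, rounding down to whole millimetres. A minimal
sketch (in Python):

    # Generating ISO 216 'A' formats by successive halving of A0.
    def a_series(n: int) -> tuple:
        w, h = 841, 1189  # A0 in mm: the 1 m2, sqrt(2)-ratio base leaf
        for _ in range(n):
            w, h = h // 2, w  # halve the longer side, rounding down
        return w, h

    for n in range(5):
        print(f"A{n}: {a_series(n)[0]} x {a_series(n)[1]} mm")
    # A0: 841 x 1189, A1: 594 x 841, A2: 420 x 594,
    # A3: 297 x 420, A4: 210 x 297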
In addition to the world-wide ISO 216 standard there is also a slightly
different set of North American standards (ANSI/ASME Y14.1 and ANSI
X3.151-1987), not fully compatible with ISO 216, which uses as its base
a paper leaf of 34x44 in.
For details of the
Text Encoding Initiative's (TEI) work on the SGML markup of data dealing
with the physical sizes of various books see Bauman and Catapano (1997).
Note 4 - The removal of printed material from circulation
It was only in totalitarian countries like Soviet Russia (1917-1991)
or communist Poland (1945-1989) that a costly, complicated and
ineffective retrieval and substitution of an inconvenient printed
material could ever be contemplated. In the case of the Soviet Union,
the editors of the Great Soviet Encyclopaedia made an energetic approach,
circa 1954, to all libraries and research institutes owning the
reference book in question. They requested in writing that the owners
cut out from one of the volumes (and, of course, destroy) a page with
an article about the Bering Strait and replace it with a supplied
page containing a brand new and greatly extended article about that
geographic region. The actual point of the exercise, however, was to
accomplish a quick and surreptitious removal of a nearby article with
a laudatory biography of the KGB's head, Lavrenti Beria. Beria was one
of Stalin's henchmen who, during the power-struggle following
Stalin's death in March 1953, was ambushed, arrested and subsequently
executed by his Politburo colleagues. The editors of the
Encyclopaedia were then instructed by the new rulers to replace the now
compromising material with a shorter and less brown-nosing note.
Apparently, the ploy succeeded only partially, as many owners of the
Encyclopaedia disobeyed the official directives and simply stored the
old and new Beria biographies side by side (Besemeres 1974, Elvin
1999). A similar procedure, with a similar result, was attempted in
Warsaw in 1966, when the wording of an encyclopaedic article on the Nazi
concentration camps of WWII drew criticism from the Polish
communist party bosses. A replacement article was promptly produced by
the PWN publishing house and distributed free of charge to all
subscribers to the 13-volume Wielka Encyklopedia Powszechna.
Note 5 - The size of scholarly apparatus
The number of books and journals required to produce
a new publication can be calculated as follows:
Journals:
An inspection of the bibliographies contained in five consecutive
articles published in a recent issue of the ASAA's Asian Studies
Review reveals the following values: Morris-Suzuki (1998) - 1 primary
source, 13 articles, 28 books; Reid (1998) - 2 primary sources, 9
articles, 28 books; Goodman (1998) - 15 primary sources, 12 articles,
21 books; Bapat (1998) - 0 primary sources, 7 articles, 16 books; and
Ip et al. (1998) - 0 primary sources, 6 articles, 41 books.
These preliminary figures suggest that an average journal article uses
4 primary sources, 9 other articles and 27 books.
Monographs:
An inspection of the bibliographies contained in three publications -
Adams (1976), Engels (1978) and Casson (1984) - provides the required
data. Adams in his monograph made use of 17 primary sources, 95
journal articles and 76 books. Engels in his study referred to 10
primary sources, 120 articles and 126 books. Finally, a collection of
12 essays by Casson was estimated to contain references to
approximately 24 primary sources, 94 journal articles and 76 books.
These preliminary figures suggest that an average book represents
information derived from 17 primary sources, 103 articles and 93 books.
Syntheses:
An inspection of the bibliographies contained in five historical
atlases - Putzger (1963), Shepherd (1976), Scarre (1988), Stone (1989)
and Vidal-Naquet (1992) - shows that an average historical atlas
contains approx. 221 maps, which are based on 27 earlier atlases and
255 books.
Since the monograph data above indicate that each of these books in
turn recapitulates 17 primary sources, 103 articles and 93 still other
books, we are able to obtain the following estimates: primary sources
(255*17) = 4,335; articles (255*103) = 26,265; books
(255*93 + 27 atlases) = 23,742.
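The note's arithmetic can be restated compactly. The following Python
sketch uses only the per-publication counts quoted above; the
tabulation itself is mine.
-----------------------------------------------------------------------
# Re-derive the averages and the atlas estimates quoted in this note.
# Each tuple holds (primary sources, articles, books) for one of the
# sampled publications.
article_counts = [(1, 13, 28), (2, 9, 28), (15, 12, 21),
                  (0, 7, 16), (0, 6, 41)]       # five journal articles
book_counts = [(17, 95, 76), (10, 120, 126), (24, 94, 76)]  # 3 books

def averages(rows):
    return [int(round(float(sum(col)) / len(rows)))
            for col in zip(*rows)]

print(averages(article_counts))   # -> [4, 9, 27]
print(averages(book_counts))      # -> [17, 103, 93]

# An average atlas rests on 27 earlier atlases and 255 books; each of
# those books, in turn, recapitulates the monograph averages above.
src, art, bks = averages(book_counts)
print(255 * src)         # ->  4335 primary sources
print(255 * art)         # -> 26265 articles
print(255 * bks + 27)    # -> 23742 books and atlases
-----------------------------------------------------------------------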
Note 6 - Movement rates of armies
The 25 miles/day (40.2 km/day) rate of movement reported by Runciman
(1978a:124) could be attained only by an individual rider or two. It
is, however, absolutely unachievable by large masses of men and
animals moving along the narrow ribbon of a road. Most probably, then,
Runciman means a speed of 25 km/day, i.e. 15.5 miles/day. This revised
value is still very high, but more in step with the marching speeds of
the highly fit, superbly drilled (Dodge 1890) and disciplined
Macedonian army (comprising infantry, cavalry, followers and a baggage
train), which covered the distance from Babylon to Susa at an average
rate of 12.3 miles/day [19.8 km/day] (Engels 1978:153).
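A three-line check of the unit conversions involved (mine, in Python)
makes the discrepancy easy to see:
-----------------------------------------------------------------------
MILES_TO_KM = 1.609344               # statute-mile conversion factor

print("%.1f" % (25 * MILES_TO_KM))   # -> 40.2 km/day, as printed
print("%.1f" % (25 / MILES_TO_KM))   # -> 15.5 mi/day, if 25 km/day
                                     #    was actually meant
print("%.1f" % (12.3 * MILES_TO_KM)) # -> 19.8 km/day, Engels' rate
-----------------------------------------------------------------------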
Note 7 - Perfectibility
According to Ditlea (1999), iterative perfectibility is a major
concern of Donald Knuth, the now-retired Stanford University professor
and computer programming guru. "The only e-mail address Knuth
maintains gathers reports of errata from readers of his books,
offering $2.56 for each previously unreported error. (The amount is an
inside joke: 256 equals 2 to the 8th power - the number of values a
byte can represent.) Knuth's reward
checks are among computerdom's most prized trophies; few are actually
cashed.
He takes this error business very seriously. Engraved in the entryway
to his home are the words of Danish poet Piet Hein:
The road to wisdom?
Well it's plain
and simple to express:
Err
and err
and err again
but less
and less
and less."
Note 8 - Promotion criteria
As far as today's norms are concerned, it could be very difficult to
argue successfully that Ms X or Mr Z deserves tenure not so much on
the basis of their intellectual contribution to the body of the
world's knowledge in the form of peer-reviewed articles and books on,
say, the burial practices of Ch'an monks of Song Dynasty China, or the
numismatics of Cyprus, but rather on the strength of their
contribution to an SGML markup of a series of primary sources dealing
with those Ch'an burials, or to the geo- and chrono-referencing of all
archaeological sites world-wide where Cypriot coins have been
uncovered.
In other words, in order to transcend the existing boundaries of paper
scholarship we need to pay less attention to the scholarly uses of
information and more to the scholarly creation, structuring,
enhancement and maintenance of such information.
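By way of illustration only, a scrap of such infrastructural work
might look as follows. The element and attribute names below are
invented for this sketch and do not represent the TEI's actual tag
set; the Python merely assembles an SGML-like record carrying geo- and
chrono-references.
-----------------------------------------------------------------------
# A purely hypothetical example of geo- and chrono-referenced markup
# for a single numismatic find. All tag and attribute names are
# invented for illustration; they are not part of any real DTD.
find = {
    "text": "Hoard of 14 Cypriot silver coins",
    "lat": 34.92, "lng": 33.62,                 # decimal degrees
    "notbefore": "-0450", "notafter": "-0400",  # years BCE, ISO style
}

record = ('<find geo="%(lat).2f %(lng).2f" '
          'notbefore="%(notbefore)s" notafter="%(notafter)s">'
          '%(text)s</find>' % find)
print(record)
-----------------------------------------------------------------------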
9. About the Author
Dr T. Matthew Ciolek, a social scientist, heads the Internet
Publications Bureau, Research
School of Pacific and Asian Studies, The Australian National
University, Canberra, Australia. His work and contact details can be
found online at http://www.ciolek.com/PEOPLE/ciolek-tm.html
10. Acknowledgements
I am grateful to Olaf Ciolek and Mark Elvin for their critical comments on the
earlier version of this essay.
11. References
[The great volatility of online information means that some of the
URLs listed below may change by the time this article is printed. The
date in round brackets indicates the version of the document in
question. For current pointers please consult the online copy of this
paper at http://www.ciolek.com/PAPERS/pnc-berkeley-01.html]
- Adams, John Paul. 1976. Logistics of the Roman Imperial Army:
Major Campaigns on the Eastern Front in the First Three Centuries A.D.
A Ph.D. thesis. Yale University.
- Andrews, Ann. 1999. Personal communication, Nov 1999, Coombs Academic Publishing,
Research School of Pacific and Asian Studies, ANU, Canberra.
- Anonymous. nd. Weights and Sizes of Papers.
www.tssphoto.com/sp/dg/weight.html
- Bapat, Jayant Bhalchandra. 1998. A Jatipura (Clan-history Myth) of
the Gurav Temple Priests of Maharashtra. Asian Studies Review, 22(1),
March 1998. pp.63-78.
- Bauman, Syd and Terry Catapano. 1997.
TEI and the Encoding of the Physical Structure of Books (v. 14 Nov 1997).
www.stg.brown.edu/webs/tei10/tei10.papers/bauman_catapano.html
- Besemeres, John. 1974. Personal communication, Apr 1974,
Research School of Pacific and Asian Studies, ANU, Canberra.
- Blunden, Caroline and Mark Elvin. 1998. Cultural Atlas of China.
(Revised edition, first published 1983). New York: Facts on File/Checkmark Books.
- Braudel, Fernand. 1995. A History of Civilizations. London: Penguin Books.
- Buckingham Shum, S. 1996. The Missing Link: Hypermedia Usability Research & The Web. Interfaces, British HCI Group
Magazine, Summer, 1996.
kmi.open.ac.uk/~simonb/missing-link/ml-report.html
- Casson, Lionel. 1984. Ancient Trade and Society. Detroit: Wayne
State University Press.
- Ciolek, T. Matthew. 1998. Asian Studies and the WWW: a Quick Stocktaking at the Cusp of two Millennia.
In: PNC Secretariat (ed.). 1998. Proceedings of the Annual Meeting of the Pacific Neighborhood
Consortium (PNC), Academia Sinica, Taipei, Taiwan, 15-18 May 1998, pp.101-142.
www.ciolek.com/PAPERS/pnc-taipei-98.html
- Ciolek, T.M. 1999. Internet
Structure and Development: On Strategic Uses of the Archetypes of the
Networked Mind. In: PNC Secretariat (ed.). 1999.
Proceedings of the 1999 EBTI, ECAI, SEER & PNC Joint Meeting 18-21 Jan 1999,
Academia Sinica, Taipei, Taiwan, pp. 21-50.
www.ciolek.com/PAPERS/pnc-taipei-99.html
- Cook, S.A., Adcock, F.E. and Charlesworth, M.P. (eds.). 1934. The Cambridge
Ancient History, Vol. 10, The Augustan Empire 44 B.C.- A.D.70. Cambridge: Cambridge
University Press.
- Davies, Norman. 1997. Europe: A History. London: Pimlico-Random House.
- Dietz, Steve. 1998. New Media Initiatives. (v. Nov 1998).
www.walkerart.org/nmi/g9_nmioverview.html
- Ditlea, Steve. 1999. Rewriting the Bible in 0s and 1s
Technology Review, Sep/Oct 1999.
www.techreview.com/articles/oct99/ditlea.htm
- Dodge, Theodore Ayrault. 1890. Alexander - A History of the Origin and Growth of the Art of War from the Earliest Times
to the Battle of Ipsus, 301 BC, with a Detailed Account of the Campaigns of the Great Macedonian.
Boston: Houghton Mifflin. (Reprinted: 1996. New York: Da Capo Press).
- Elting, John R. 1997. Swords Around a Throne: Napoleon's Grande
Armee. London: George Weidenfeld and Nicolson.
- Elvin, Mark. 1999. Personal communication, Dec 1999,
Research School of Pacific and Asian Studies, ANU, Canberra.
- Engels, Donald W. 1978. Alexander the Great and the Logistics of
the Macedonian Army. Berkeley: University of California Press.
- ESCAP Population Division. 1994. ESCAP Population Data Sheet:
Demographic Estimates for Asian and Pacific Countries and Areas, 1994.
Bangkok: United Nations Economic and Social Commission for Asia and
the Pacific (ESCAP).
- Goodman, David S.G. 1998. In Search of China's New Middle Classes: the
Creation of Wealth and Diversity in Shanxi during the 1990s. Asian
Studies Review, 22(1), March 1998. pp.39-62.
- Ip, David, Chung-tong Wu and Christine Inglis. 1998. Settlement
experience of Taiwanese Immigrants in Australia. Asian Studies
Review, 22(1), March 1998. pp.79-97.
- Johnson, Ian (ed.) 1999. ECAI Metadata Manual (v. Mar 1999).
www.ecai.org/metadata/ecai_standard/
- Kuhn, Markus. 1996. International Standard Paper Sizes (v. 9 Nov 1999).
www.cl.cam.ac.uk/~mgk25/iso-paper.html
- Larousse. 1993. Le Petit Larousse: Grand Format. Paris: Larousse.
- Lewis, Jon E. (ed.) 1997. The Handbook of the SAS and Elite Forces.
London: The Book Company.
- Manguel, Alberto. 1996. A History of Reading. New York: Viking and Penguin Books.
- Morris-Suzuki, Tessa. 1998. Invisible Countries: Japan and the
Asian Dream. Asian Studies Review, 22(1), March 1998. pp.5-22.
- The New Encyclopaedia Britannica, 15th Edition. 1974. Macropaedia - Vol 14.
Chicago: Encyclopaedia Britannica, Inc.
- The New York Times, October 12, 1999.
From Two Small Nodes, a Mighty Web Has Grown.
www10.nytimes.com/library/national/science/101299sci-internet-anniversary.html
- Nielsen, Jakob. 1994. Response Times: The Three Important
Limits (v. nd.). An excerpt from Chapter 5 of: Nielsen, Jakob. 1994.
Usability Engineering. Boston: AP Professional.
www.useit.com/papers/responsetime.html
- Nielsen, Jakob. 1997. The Need for Speed (v. Mar 1997)
www.useit.com/alertbox/9703a.html
- North Carolina State University (NCSU) Libraries. 1999.
Digital Chart of the World (DCW) (v. 20 Feb 1999).
www.lib.ncsu.edu/stacks/gis/dcw.html
- Northeast Consulting Resources Inc. 1999.
Internet Performance
Slowing Down (v. 12 Oct 1999).
www.nua.ie/surveys/?f=VS&art_id=905355353&rel=true
- Oertel, F. 1934. The Economic Unification of the Mediterranean
Region: Industry, Trade and Commerce. pp. 382-424. In: Cook, S.A. et
al. (eds.). 1934. The Cambridge Ancient History, Vol. 10, The
Augustan Empire 44 B.C.- A.D.70. Cambridge: Cambridge University
Press.
- Panstwowe Wydawnictwo Naukowe (PWN). 1963 (Vol. 2), 1967 (Vol. 9).
Wielka Powszechna Encyclopedia PWN. Warszawa: Panstwowe Wydawnictwo
Naukowe.
- Philippine Bookstore Online. 1997. Book Catalogues: Culture,
History, and Current Issues
(v. Dec 1997).
www.philbooks.com
- Popper, Karl Raimund. 1969. Conjectures and Refutations: The Growth of Scientific Knowledge. London: Routledge and Kegan
Paul.
- Putzger, Friedrich Wilhelm. 1963. Historischer Weltatlas. Bielefeld: Velhagen
& Klasing.
- Raymond, Eric S. 1998. The
Cathedral and the Bazaar (v. 22 Nov 1998).
www.tuxedo.org/~esr/writings/cathedral-bazaar/cathedral-bazaar.html
- Reid, Anthony. 1998. Political "Tradition" in Indonesia: the One
and the Many.
Asian Studies Review, 22(1), March 1998. pp.23-38.
- Roberts, Matt T. and Don Etherington. 1999.
Bookbinding and the Conservation of Books: A Dictionary of Descriptive Terminology (v. Oct 1999).
palimpsest.stanford.edu/don/don.html
- Runciman, Steven. 1978a. A History of the Crusades: Vol. I, The First
Crusade and the Foundation of the Kingdom of Jerusalem. London:
Penguin Books.
- Runciman, Steven. 1978b. A History of the Crusades: Vol. II, The
Kingdom of Jerusalem and the Frankish East 1100-1187. London: Penguin
Books.
- Runciman, Steven. 1978c. A History of the Crusades: Vol. III, The
Kingdom of Acre and the Later Crusades. London: Penguin
Books.
- Scarre, Chris (ed.). 1988. Past Worlds: The "Times" Atlas of Archaeology.
London: Times Books Ltd.
- Shepherd William R. 1976. Shepherd's Historical Atlas. 9th
edition, revised and updated. New York: Barnes & Noble Books.
- Stone, Norman (ed.). 1989. "The Times" Atlas of World History.
Third edition. London: Times Books Ltd.
- Ventura Pacific Ltd. 1999. A Book Collector's Glossary.
www.west.net/~books/glossary.htm
- Vidal-Naquet, Pierre. (ed.). 1992. The Harper Atlas of World History. New York:
HarperCollins.
12. Version and Change History
- 15 Dec 1999 - added notes on Gregory King and Francois Quesnay.
- Revisions, so far, incorporate minor editorial and markup fixes.
Maintainer: Dr T.Matthew Ciolek (tmciolek@ciolek.com)
Copyright (c) 1999 by T.Matthew Ciolek. All rights reserved. This Web page may be freely linked
to other Web pages. Contents may not be republished, altered or plagiarized.
URL http://www.ciolek.com/PAPERS/pnc-berkeley-01.html