

Scientific Communication and the Dematerialization of Scholarship
(Released March 2007)

  by Douglas Brown  


A New Age
Let us fast forward to 2003: a landmark report of the National Science Foundation on initiating the development of 'cyberinfrastrucure' suggests that 'a new age has dawned in scientific and engineering research'. (4) The report goes on:

The amounts of calculation and the quantities of information that can be stored, transmitted, and used are exploding at a stunning, almost disruptive rate. Vast improvements in raw computing power, storage capacity, algorithms, and networking capabilities have led to fundamental scientific discoveries inspired by a new generation of computational models . . . Powerful 'data mining' techniques operating across huge sets of multi-dimensional data open new approaches to discovery. Global networks can link all these together and support more interactivity and broader collaboration. (5)
[Image: interactive map showing nine of the world's largest computing grids]
This new wave of developments in very high-performance computing is opening up entirely new possibilities for transdisciplinary or 'problem-centred' research. Emphasis is shifting on a massive scale to the capture, processing and analysis of primary data, while networking greatly facilitates remote collaboration and pooled expertise. Further, computation is no longer simply an aid to scientific research but is integrated into its fabric, effectively changing the whole notion of a scientific instrument. The report notes that the rise of computationally intense, networked science is being accompanied by a changing pattern in the way scholars communicate: 'The traditional, linear, batch processing approach is changing to a process of continuous refinement as scholars write, review, annotate, and revise in near real time using the Internet.' (6)

Since the report was published, the explosion in the use of social writing platforms, such as weblogs and wikis, and the practices of creating 'mashups' and 'workflows' may be amplifying this trend towards 'informalisation' and 'massively distributed collaboration', perhaps shifting the World Wide Web somewhat closer to Berners-Lee's initial conception of it as a collaborative, editable, peer-to-peer space for scientists to develop their thinking. In line with the constellation of developments now known as 'Web 2.0', researchers across disciplines increasingly expect access to raw, primary data as well as finished documents, and the 'journal article' seems to be becoming part of a continuum of research communication, with the conventional distinction between formal and informal communication falling away.

e-Science, The Grid and The God Particle
In parallel with the development of the cyberinfrastructure initiative in the US, Research Councils UK (comprising the eight publicly funded UK research councils) has focused on the idea of 'e-Science', which it defines as: "the large scale science that will increasingly be carried out through distributed global collaborations enabled by the Internet . . . such collaborative scientific enterprises . . . will require access to very large data collections, very large scale computing resources and high performance visualization." (7)

[Image: the Large Hadron Collider at the particle physics laboratory CERN]
Increasing collaboration has been a trend in many branches of science throughout the latter half of the twentieth century, reflecting intensified specialization and the need for a mix of expertise, and reinforced by improvements in technology. (8) Underpinning e-Science research is the development of grid computing, a group of technologies that will facilitate very large-scale distributed computation and data storage, and enable flexible, coordinated resource sharing and problem solving. The best-known application so far is probably the Human Genome Project; test-bed experiments include GridPP, led by the University of Glasgow, which will analyze the enormous amounts of data produced by the Large Hadron Collider at the particle physics laboratory CERN during the upcoming investigation into the fundamental nature of matter, scheduled for 2007. This massive experiment will initially involve 200 scientists in 150 academic institutions worldwide. The effort to isolate the elusive particle known as the Higgs boson, sometimes called 'The God Particle', is expected to generate 10 terabytes of data for each 8-hour run of the Collider.
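To give the 10-terabytes-per-run figure some scale, a back-of-envelope calculation shows the sustained data rate it implies (a rough sketch only, assuming for illustration that data accumulate at a uniform rate over the 8-hour run):

```python
# Implied sustained data rate for 10 TB per 8-hour run
# (hypothetical uniform-rate assumption for illustration).
TB = 10**12                      # bytes per terabyte (decimal convention)
data_per_run = 10 * TB           # 10 TB per run
run_seconds = 8 * 3600           # 8-hour run in seconds
rate_mb_per_s = data_per_run / run_seconds / 10**6
print(f"~{rate_mb_per_s:.0f} MB/s sustained")  # ~347 MB/s
```

At roughly 347 MB/s sustained, a single run would saturate the storage and network capacity of any one institution, which is precisely why the analysis is distributed across a computing grid.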

Next: Towards a Natively Digital Model?

© 2007, ProQuest-CSA LLC. All rights reserved.
