
Digital Humanities Summer School Bern, 1st edition

Author of the report: Peter Dängeli
Citation: Dängeli, Peter: Digital Humanities Summer School Bern, 1st edition, infoclio.ch Tagungsberichte, 2013. Online: infoclio.ch, <http://dx.doi.org/10.13098/infoclio.ch-tb-0066>, accessed:


The (inter-)discipline of Digital Humanities is slowly but surely taking shape within the Swiss research landscape. Following a number of regional and national conferences, the installation of university chairs, and a pioneering THATCamp -- a notably informal and participatory conference format -- that also attracted participants from neighbouring countries, the first Digital Humanities Summer School at the University of Bern (26-29 June 2013) constituted another DH event of importance and great interest. Like the 2009 THATCamp in Lausanne, the DH Summer School was initiated and organised by an established group encompassing proponents of the history platforms infoclio.ch (the main organiser) and hist.net, the Swiss Association of History and Computing, the editorial project DODIS, and researchers from the universities of Bern and Lausanne (LADHUL, DHLab). Owing to an attractive programme featuring internationally renowned lecturers and workshop leaders, the summer school was fully booked within just a few weeks, its seats taken by participants from twenty countries.

The four-day event took off on Wednesday, 26 June, with an opening lecture by Ray Siemens (University of Victoria, CA, and ADHO chair) that shed light on the relation of Digital Humanities to the more traditional humanities disciplines and their intersections. Cathy Davidson's notion of 'Digital Humanities 2.0' and Willard McCarty's 'Methodological Commons' (pp. 118-119) served as important points of reference. While Digital Humanities may appear, from an outside point of view, to be first of all a positivist endeavour, the internal perspective reveals numerous critical aspects and blurred borderlines. Siemens advocated confronting rather than avoiding the problem of definition, in order to allow for a degree of clarity that is a prerequisite for research progress.

Susan Schreibman (Trinity College Dublin) went on to deliver some historical context on the genesis of the Digital Humanities and the changing self-conceptions of its stakeholders, running the gamut from Father Roberto Busa's punch-card-based Index Thomisticus to virtual worlds built to help answer particular research questions -- such as the ongoing Easter Rising project, whose model of Dublin's city centre re-examines the number of casualties during the 1916 event, taking situational conditions and the firepower of the weapons in use equally into account. The anecdote of the inadvertent online publication of the book 'Digital Humanities', which happened to boost its sales, led to a discussion of changing modes of publication and the accompanying questions of how to credit and recognise different kinds of scholarly contributions, and finally to the much debated opposition of 'building and making' versus 'reading and critique'. Against the backdrop of shifting weights and the emphasis on long-term curation of research data, Schreibman anticipates (and to some degree observes) a convergence of the hitherto distinct fields of scholars and documentalists (librarians, archivists).

While these two introductory courses mapped the terrain of Digital Humanities in a productive manner, they may have fallen somewhat short of expectations in terms of inspiration. To give an example, both lecturers made use of the ubiquitous visualisation technique of the word cloud (Wordle, infamously dubbed 'the mullets of the internet'), which prompted questions from the audience: the absence of the topics 'library' and 'archive' from a word cloud based on the abstracts of a DH conference entailed a discussion of how the use of tools without (fully) understanding their inner workings ('black boxes') is likely to produce biased results.
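
To make the 'black box' point concrete, the following minimal Python sketch (the sample text and stopword list are invented for illustration) shows what a word cloud computes underneath: raw token frequencies, silently shaped by tokenisation and stopword choices.

```python
# Minimal sketch of a word cloud's inner workings: token frequencies.
# The sample text and stopword list are arbitrary choices -- and it is
# precisely such hidden choices that can bias the resulting picture.
import re
from collections import Counter

abstracts = (
    "digital editions and the archive: encoding texts for the library, "
    "visualising texts, mining the archive"
)
STOPWORDS = {"the", "and", "for"}  # an arbitrary but consequential choice

tokens = re.findall(r"[a-z]+", abstracts.lower())
frequencies = Counter(t for t in tokens if t not in STOPWORDS)

# a word cloud scales type size by exactly these counts
print(frequencies.most_common(5))
```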

In her course on digital textual editing, Elena Pierazzo (King's College London) concisely connected aspects of communication and text theory with the multifaceted practical challenges that editors face. Starting from the basic differentiation between linguistic constructions (texts) and physical objects (documents), Pierazzo discussed the editorial spectrum that ranges from simple, well-structured neo-platonic texts to reading editions, critical editions, and diplomatic editions, up to facsimile editions that closely resemble the document. The material dimension in particular has gained in importance over the last decades (critique génétique, material/new philology), adding to the complexity and conventions of editorial practice that had been in place previously. Transferring the different editorial forms to the digital realm gives rise to a host of questions, the most pertinent of which Pierazzo addressed and exemplified by pointing to impressive existing and emerging digital editions. Her own prototype for editing Proust's manuscripts, developed in connection with the work of the TEI Manuscripts SIG, allows the reader to move sequentially through text stages, as if looking over Proust's shoulder during the process of writing. The position of the editor, whose understanding of a text becomes manifest in the encoding and whose role and authority need defining in crowdsourced edition projects, kept coming up during her talk. The interested reader will find plenty of information on digital editing in the recent three-volume work by Patrick Sahle (Sahle 2013) and in Pierazzo's forthcoming book on digital scholarly editions.
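
As a rough illustration of the kind of encoding that underlies such stage-by-stage reading, consider how TEI <del>/<add> markup can yield two sequential text stages from a single encoded source. This is a minimal sketch, not the code of Pierazzo's prototype; the TEI fragment and the deleted variant are invented for the example.

```python
# A toy rendering of writing stages from TEI-style <del>/<add> markup.
# Invented illustration, not the Proust prototype's actual implementation.
from lxml import etree

TEI_NS = "{http://www.tei-c.org/ns/1.0}"
SOURCE = """<p xmlns="http://www.tei-c.org/ns/1.0">Longtemps, je me suis
<del>endormi</del><add>couché</add> de bonne heure.</p>"""


def render_stage(xml, drop):
    """Render one text stage by dropping either <add> (first stage)
    or <del> (revised stage) while preserving surrounding text."""
    root = etree.fromstring(xml)
    for el in root.iter(TEI_NS + drop):
        parent, prev = el.getparent(), el.getprevious()
        tail = el.tail or ""  # keep text that follows the removed element
        if prev is not None:
            prev.tail = (prev.tail or "") + tail
        else:
            parent.text = (parent.text or "") + tail
        parent.remove(el)
    return " ".join("".join(root.itertext()).split())


print(render_stage(SOURCE, "add"))  # stage 1: "... je me suis endormi ..."
print(render_stage(SOURCE, "del"))  # stage 2: "... je me suis couché ..."
```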

In his second course, concerned with social knowledge construction and creation, Ray Siemens made the case that larger amounts of data, accelerated work practices, instantaneous and denser communication, and increased participation of the audience in the research process and scientific discourse have the potential to significantly influence research questions and methods. Tying in with the work of the Electronic Textual Cultures Lab (ETCL, U Victoria), and against the background that knowledge is oftentimes generated in chaotic environments and under very different conditions -- in which informal research communication is particularly influential -- Siemens described a role change of researchers in digital contexts, from didactic authorities towards facilitators of knowledge. Collaborative tools that assist in offering materials and in their annotation, itemisation, and analysis -- illustrated by Siemens with a number of examples, e.g. implementations of gamification aspects that honour voluntary participation, and supplemented by an outstanding annotated bibliography -- become essential parts of the production of knowledge. The fact that this production involves a greater number of actors and a wider public implies a shift from institutional structures to less formalised environments, in academia as well as in other spheres (e.g. journalism). These considerations demand a fair degree of awareness from scholars: in order to contribute to the shaping of new methodologies and modes of communication, humanities scholars should study the potential of new media -- understand them, fathom their peculiarities and limits, know the mechanisms of collaborative tools and methods, expose underlying ideologies, and challenge simplistic positivist approaches. A critical theory that keeps reflecting on ethical and disciplinary standards should supplement the crafting and building aspects of knowledge creation.

David Berry (Swansea University) delivered a very substantial course on critical digital humanities, dense with the manifold implications of the humanities' widening towards new approaches and methods, and of the repercussions of new methodologies on research practices. Berry, too, began with matters of definition, giving an account of Stephen Ramsay's DH typology, which opposes the primarily practice-oriented and relatively institutionalised community that formed in the early 1990s under the moniker Computing in the Humanities (DH type I) to a newer, less clear-cut movement with broad interests and, at times, a prevailing mood of revolution with regard to a true renewal of the humanities (DH type II: media theorists, culture critics, digital artists). Berry approached the question of what really changes in the configuration of humanities research from three perspectives: quantification/calculation, organisation, and the potential for acceleration. As a metric paradigm becomes more prevalent, one that tends to substitute mathematics for hermeneutics and to formalise research, the organisational framework sees fundamental changes too: research is carried out in project-oriented fashion, and administrators and project leaders trump professors. At the same time, the pace of research is increased through efficient forms of analysis, highly responsive means of communication and, in many cases, an exploitative division of labour. According to Berry, allegations that the Digital Humanities are first and foremost an opportunistic alignment of research agendas with cost-value ratios cannot be entirely dispelled, given obvious parallels between some strands of DH discourse and management speak. A closer look, however, reveals genuinely humanistic endeavours that do not thoughtlessly trade tried and tested research principles for the most up-to-date technologies and approaches. Yet there is clear evidence that the notion 'digital' has become a key term in the political economy of universities, one that assures relatively easy access to funding. Berry rightly calls for more acceptance of the critical work done to date and more support for future work. The possible slowdown caused by an emphasis on critical perspectives may be a productive one, thanks to the gains of the reflective process, especially beyond the project level in a mid- and long-term view.

In contrast to Berry's comprehensive approach, the following course, held by Claire Lemercier (CNRS-Sciences Po, Paris) on the topic of network analysis, was confined to a single research method, without omitting references to quantitative methodologies in general. Lemercier foregrounded not specific analytical tools but the formal possibilities of examining research data, lens-like, from different angles and distances. According to her, the potential of the method lies not so much in attractive visual displays of complex (but often analytically rather insubstantial) networks as in the possibility of analysing existing and missing relationships at different levels and of detecting isolated effects. The research process should from its outset (e.g. data collection in an archive) be accompanied by formal considerations that prospectively take the dimensionality of the dataset into account and avoid interferences (e.g. by restricting dependencies) later in the research process. Detailed knowledge of the structure of a dataset and of the pursued research questions informs the choice of an analytical tool at a very basic level. Methods related to network analysis, such as geometrical analysis, multiple correspondence analysis, sequence analysis, but also conventional descriptive statistics, may be a better fit for many scenarios than social network analysis, which in fact denotes a range of different variants. This said, the network perspective may be fruitful in many areas -- relations between historical events and sources, or between linguistic entities, can be regarded as networks just as relations among and between people and organisations can -- in order to detect correlations and relationship patterns, but the method should (nearly) always be complemented by a qualitative perspective on relevant segments of the dataset.
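
By way of illustration, the following sketch (with invented letter-exchange data, using the networkx library, which the course did not prescribe) shows the kind of formal reading Lemercier describes: simple measures point to central actors and, just as importantly, to missing ties, both of which then call for qualitative scrutiny of the sources.

```python
# A toy correspondence network built from hypothetical data.
import networkx as nx

letters = [  # (sender, recipient, number of letters) -- invented
    ("Meyer", "Favre", 12),
    ("Meyer", "Keller", 3),
    ("Favre", "Keller", 7),
    ("Keller", "Bianchi", 1),
]

G = nx.Graph()
for sender, recipient, count in letters:
    G.add_edge(sender, recipient, weight=count)

# who is well connected, and who brokers between otherwise separate actors?
print(nx.degree_centrality(G))
print(nx.betweenness_centrality(G))

# the missing relationships matter as much as the documented ones
print(sorted(nx.non_edges(G)))
```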

In the last plenary course of the Summer School, Frédéric Kaplan (EPFL Lausanne) presented the newly started Venice Time Machine project of EPFL and Ca' Foscari University (Digital Humanities Venice). The project is driven by the vision of making past times more readily accessible by means of efficient digitisation and data extrapolation; its findings should ultimately contribute to the temporal mapping of cities, countries, and continents. Such an endeavour requires the aggregation of enormous amounts of reliable data on past events and facts. While we may possess this kind of data for present times, the density decreases sharply as we move back in time. For this reason, the Venice project plans to digitise and process some 80 kilometres of archival material -- more than 100 million documents -- over the next ten years, which should someday allow querying an estimated 10 billion 'events' (data points, factlets). Kaplan and his team face a number of challenges that they intend to meet with innovative methods: optimised scanning robots (perhaps even sensing the information without needing to open the books), text recognition improved by contextual knowledge (as is common in speech recognition), the help of thousands of motivated students in exchange for course credits in massive open online courses (MOOCs), graph-based semantic information modelling (RDF), and the extrapolation and simulation of missing information. Uncertainty in the data might be alleviated by allocating 'fictional spaces' that allow for probability statements on an aggregate level and thus for singling out scenarios of relatively high plausibility. Without doubt, this is a highly interesting project, with prospects that could prove useful elsewhere. Whether the envisaged efficient semantic exploitation will become reality (with satisfying results) within just a few years remains to be seen, however.
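
A minimal sketch of what graph-based semantic modelling of such 'events' could look like in RDF follows; the namespace, vocabulary, and identifiers are invented for illustration and are not the project's actual schema.

```python
# Modelling a single archival 'event' as RDF triples with rdflib.
# Namespace, terms, and the confidence property are hypothetical.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

VTM = Namespace("http://example.org/venice/")  # invented namespace

g = Graph()
contract = VTM["event/contract-1434-001"]
g.add((contract, RDF.type, VTM.PropertySale))
g.add((contract, VTM.seller, VTM["person/marco-contarini"]))
g.add((contract, VTM.buyer, VTM["person/pietro-grimani"]))
g.add((contract, VTM.date, Literal("1434-03-12", datatype=XSD.date)))
# keeping uncertainty explicit enables probability statements in aggregate
g.add((contract, VTM.confidence, Literal(0.7, datatype=XSD.double)))

print(g.serialize(format="turtle"))
```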

These seven multifaceted, rich, and entertaining courses were complemented by twelve workshops and seven unconference sessions that in many cases bore upon the course contents in a productive manner, but also introduced new aspects and topics. Theoretical concerns (critical digital humanities, accessibility and source criticism of digital/digitised sources), disciplinary concerns (DH and academia/pedagogy/curricula), methodological concerns (data processing, text analysis, digital editing of text and music, visualisation methods and techniques, individual and collective work practices), and their intersections (semantically modelled research environments) were all addressed. In a project slam, roughly one third of the participants gave very short presentations of their current research, which demonstrated the considerable disciplinary and topical variety of an audience that had contributed to engaging discussions throughout the summer school. A relaxed dinner, cosy evenings out, a spontaneously organised report on the Taksim Gezi Park protests in Istanbul by two participants, and a concluding picnic that unfortunately had to be held indoors created great opportunities to socialise, deepen discussions, and share visions. One can only hope that there will be many follow-up summer schools in Bern or other Swiss cities; the development of DH in Switzerland indeed offers a positive outlook in this regard, with the upcoming DH 2014 conference in Lausanne (6-12 July 2014) possibly providing a great setting as early as next year.

Report cross-posted from the blog of the Verein Geschichte und Informatik

Event:
Digital Humanities Summer School Switzerland
Organised by:
infoclio.ch et al.
Event date:
26.06.2013 to 29.06.2013
Venue:
Universität Bern
Language:
English
Type of report:
Conference