Historians are interested in the past and claim that, as specialists, they are uniquely situated to parse and decipher it, and by doing so to project how that past has shaped the present and possible futures. Historians of the French Annales school took this broad mandate further, creating the concept of the longue durée. In very broad strokes, it posits that for any historical moment to have meaning, it must be examined in the context of what preceded and succeeded it, on a very long scale and with a very broad lens: one must examine the landscapes, cultures, and environments in which events happened across a long period of time in order to see the patterns that led to any particular moment. Charitably, it can be said that this is an exhausting way of researching history, both for the historian and the reader. Microhistories arose in reaction to this macro-history, making the counter-claim that massive histories miss the minutiae of the moment: they are so broad as to be indecipherable and useless for understanding the past at a human scale.
What happens now that big data has arrived on the historical scene, along with AI to parse it? After all, with the mass digitization of older material and the proliferation of born-digital records, history seems poised for a new renaissance of interpretation: increased access to older content and a vast volume of new content should allow researchers to see and understand both the macro and the micro. However, as alluded to in previous posts, these new tools are not a replacement for the trained historian. Greater quantity does not equate to clarity, nor does it substitute for considered and careful analysis of both the macro and the micro by trained historians. Take a simple example: AI combing vast amounts of political discourse, counting certain key words or phrases to discover which topics were of greatest interest during a period. This is not without value, since it shows what weight is being given to particular ideas, but the data without interpretation misses how politicians at polar opposite ends of the political spectrum might use the same words, such as patriotism, family, or race, in the same quantity but in starkly different ways. Similarly, these tools can only examine what is fed into them. They have no awareness of what has gone unrecorded, let alone undigitized: no awareness of marginalized communities whose records are not prioritized for inclusion, of oral sources, or of records that remain unprocessed.
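The limits of raw counting can be sketched in a few lines of Python. The speakers, texts, and keyword list below are invented for illustration; the point is that identical word frequencies say nothing about meaning or intent.

```python
from collections import Counter
import re

# Hypothetical speeches from two politicians at opposite ends of the
# spectrum; the names and texts are invented for illustration.
speeches = {
    "speaker_a": "Patriotism means loyalty to family and flag. Patriotism unites us.",
    "speaker_b": "True patriotism means dissent. My family taught me patriotism is questioning power.",
}

KEYWORDS = {"patriotism", "family", "race"}

def keyword_counts(text):
    """Count occurrences of tracked keywords in a text (case-insensitive)."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w in KEYWORDS)

counts = {speaker: keyword_counts(text) for speaker, text in speeches.items()}
# Both speakers use "patriotism" twice and "family" once, yet mean very
# different things by them; the raw counts alone cannot tell them apart.
```

The frequencies come out identical for both speakers, which is exactly the problem: the interpretive work of distinguishing their meanings still falls to the historian.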
The trained historian can, however, use these tools and the opportunities they create to produce new historical analysis. A fine example of this kind of project is Mapping the Republic of Letters (http://republicofletters.stanford.edu/). Drawing on vast quantities of digitized manuscript collections, historians can now chart in detail the exchange of ideas between historical figures and trace how the dissemination of those ideas led to subsequent changes in society and thought: the micro of individual exchanges leading to the macro of societal change. This model is a driver for digitizing and documenting the Constable papers. The Constables interacted with significant historical figures such as Thomas Paine and Thomas Jefferson, as well as with often-marginalized communities, including African Americans and Native Americans. Through digitization, their encounters and observations can join the digital record for larger analysis, and through mapping and considered description of individual moments they can be brought into both the macro and micro narratives of the past.
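In spirit, the aggregation such a correspondence-mapping project performs can be sketched very simply: individual letters (the micro) are rolled up into a weighted network of correspondents (the macro). The letter records below are invented for illustration, not drawn from any actual dataset.

```python
from collections import Counter

# Hypothetical letter metadata of the kind a digitized collection might
# yield; senders, recipients, and dates are invented for illustration.
letters = [
    {"sender": "Voltaire", "recipient": "d'Alembert", "year": 1760},
    {"sender": "Voltaire", "recipient": "d'Alembert", "year": 1762},
    {"sender": "d'Alembert", "recipient": "Voltaire", "year": 1761},
    {"sender": "Voltaire", "recipient": "Rousseau", "year": 1755},
]

# Aggregate individual exchanges into a weighted, directed network:
# each edge weight is the number of letters from sender to recipient.
edges = Counter((l["sender"], l["recipient"]) for l in letters)
```

The resulting edge weights can then feed a map or network visualization, while the individual records remain available for close reading of each exchange.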