An excellent discussion of how OA serves authors more than their communities (their readers), which can negatively affect the incentives for a fair vetting process.
Some Tools to Add to Your Digital Toolbox
A great resource on DH tools by Nick Sacco: Some Tools to Add to Your Digital Toolbox.
I have a confession: I’m a renrat. I’ve enjoyed Renaissance faires since I was a kid, and from at least the age of 14 I’ve made my own costumes, collected various historically accurate pieces (or “faire approved” garb), and learned sixteenth-century colloquialisms and Shakespearean insults so that I can play along with the actors, successfully enough to have been mistaken for one.
In reading Jeremiah McCall’s and Adam Chapman’s articles on historical video games, I found myself going through the same intellectual transformation I experienced as a Renaissance faire playtron: my mental model of experiencing bits of real history overshadowed superficial checks for accuracy. When I first began filling my massive trunk with bodices, skirts, and chemises, I paid close attention to historical accuracy in terms of fabrics and colours, and I would shake my fist at peasants wearing royal purple, demanding punitive measures for their disregard of sumptuary laws (even though I can’t really say my own costumes are 100 percent accurate). Over the years, as movies like “Pirates of the Caribbean” came out, the sixteenth century started to resemble the eighteenth, and men wearing excessive eyeliner stole Sir Francis Drake’s spotlight while the local faire moved from a shire- to a port-themed set-up. I could go on about the uncomfortable corporatization of Renaissance faires and its implications, but I’ll spare you. As annoyed as I was about the many historical inaccuracies that abounded (exploded, really), a couple of years ago I realized there wasn’t much I could do except continue enjoying what early modern life still existed, which expanded the more I studied early modern European and Atlantic World history. Discursive engagement has never ceased, because I’m playing history, not writing or reading it.
Like historical video games, Renaissance faires and the actors and playtrons involved undergo the same scrutiny and criticism. What’s more, I approach faires the way I do adventures in video games (except that I always hope the faire becomes the Millennial Fair in Chrono Trigger so I can try out the teleporter; I also tend to do side missions more than the primary quest…). However, as Adam Chapman saliently states,
“Simply focusing on the accuracy of the game [or a Renaissance faire] often re-informs us about popular history rather than recognizing the opportunities for engaging with discourse about the past (and the nature of this discourse) that this new historical form can offer.”
Discursive engagement with history occurs at less obvious levels in video games and Renaissance faires, and both are media through which history can be consumed as well as experienced. However, while neither is fully representational of the past, at the very least they are mimetic cultural products in which players enjoy “typical historical environments, characters, scenarios, and experiences,” from Maypole dances and the celebration of spring, to danse macabre; from staving off the plague with foxtails and horns, to chasing fairies with bells and trinkets; from yelling “God Save the Queen” during her processions, to drinking an ale in a pewter mug while listening to songs of performers with friends.
While Renaissance faires may provide a means to experience history, it is not clear how they serve as “problem spaces” in the same way video games do. This is not to say faires are not problem spaces; they are, but in different ways, because there really are no overarching goals and conflicts, unless we count fulfilling a multitude of mini missions to complete your experience (these might include attending the market, seeing a few shows, bowing for all the courtiers, sneering and yelling back at Puritans, losing children in the maze, playing games, and concluding by singing songs with friends while sitting on grass in the shade and sampling things brought back from the New World thanks to a generously bearded courtier). In this light, the idea that “the choice of problem space, or more specifically the choice of whose problem spaces to represent necessarily locks the game [faire] into certain portrayals of the past” holds true, particularly for Elizabethan England (with packs of Jack Sparrows running amok). And even when the facts are not straight, Renaissance faire characters, like video game characters though not to the same degree, generally have historically legitimate roles and goals, at least among the major figures.
DH and Knowledge Design
Digital Humanities is about experimentation. DHers are pioneers. By creating and discovering new ideas and facilitating new kinds of research questions, DHers build tools and develop new methods that perpetuate scholarly conversation and keep the humanities relevant. In addition to creating tools like many of the ones we’ve encountered, DHers are also designers. As the dissemination of knowledge takes new digital form in our evolving communication ecosystem, DHers design and present materials in ways that printed text did not so easily allow. Jeffrey Schnapp, the director of metaLAB at Harvard University, succinctly points out the following:
When you move from a universe where the rules with respect to a scholarly essay or monograph have been fully codified, to a universe of experimentation in which the rules have yet to be written, characterized by shifting toolkits and skillsets, in which genres of scholarship are undergoing constant redefinition, you become by necessity a knowledge designer. (Shaw 2012)
In expressing our scholarship digitally, we are also in command of how to present it in innovative ways. And others, it seems, have taken notice of DH’s designs. For instance, Jerome McGann, co-founder of NINES at the University of Virginia, has worked with Gale Cengage on their Nineteenth Century Collections Online (NCCO) database. Since McGann works with Gale, I don’t find it a coincidence that they have included text analysis tools such as term clusters and term frequencies, which look like some of the tools we have used in our DH training.
Perhaps as we create scholarship in digital form and with various new tools, we as librarians and DHers may also want to consider how to effectively present knowledge in ways that make sense for new publishing models, whether in the form of an online journal or a website with pedagogically motivated visualizations in databases. Furthermore, it seems important to ask: how might what we produce, and how we choose to disseminate it, enhance academic study? Ultimately, the answers will develop from our experiments with new modes of research and expression, as well as from reflection on, and innovation in, how we communicate in the digital world.
[On a related note, check out Peter Katz’s recent post “There is no outside the medium: Interface essentialism and the death of print (and the digital)“]
Libraries as Laboratories
When I think of laboratories, images come to mind of the 10th-century Persian alchemist Abu Bakr Muhammad ibn Zakariya al-Razi, or maybe Beaker and Dr. Bunsen Honeydew from the Muppets. They probably have various burners, beakers and flasks, and winding tubes of travelling smoky gases that eventually drip as strangely colored liquid into a tiny glass potion bottle, or something… Whatever the experiment, the goal might be to confirm a bit of information or further articulate some paradigm or another. Ultimately, from the technologies available in the lab and by means of some methodology, what comes from the lab is a set of information contributing to a conversation that will be disseminated and further discussed by a specific community of scholars.
Although not littered with broken glass and mad scientists (maybe…?), libraries too serve laboratory-like functions, especially for humanists interacting with texts and images, processing the data and information they glean in conducting mental experiments in the library.
As digital humanities gains a growing presence in libraries, particular challenges are posed, especially over whether one can characterize it as a service. Despite the disagreement over digital humanities as a service in libraries, Bethany Nowviskie, Trevor Muñoz, and Micah Vandegrift seem to agree on one thing: libraries facilitate the creation of knowledge and assist in its preservation and dissemination. Libraries function as laboratories, even think tanks, for humanities scholars. Indeed, the library is the humanist’s laboratory. So it only seems natural that librarians and researchers (who can be one and the same) should collaborate, rather than draw a line between them as Muñoz does in stating that the focus on faculty inhibits DH initiatives in libraries. Not only can libraries provide space and technological support, they can also serve the researcher whose digital scholarship is to be disseminated by ensuring that proper preservation, licensing, metadata, and the like are taken care of. As Nowviskie points out, for example, “Library technologists, more used to working collaboratively and for broader audiences, can more easily do open source right – and thereby demonstrate its value.” Additionally, “libraries and library-embedded digital humanities centers [help] to beat what we might call a ‘path to production,’ both for innovative scholarship and for its supporting technical and social frameworks.” In this sense, while librarians are collaborators in terms of content and especially methodology (e.g. a librarian knowing something about a methodology that will serve the researcher’s purposes), the librarian can also be seen as providing a form of service in line with traditional research consultations. I’m not sure what’s wrong with that.
Like Muñoz, I think that “good digital humanities work is exploratory and innovative,” but also that librarians, many of them DHers themselves, can help walk the path of scholarly processes in the library as a DH laboratory. As Micah Vandegrift notes, “the library, as a staid institution of knowledge and exploration, should then blend in with the multitude of ways that the user discovers information,” which it can do by offering assistance with digital humanities initiatives and sharing its know-how.
In reading these articles, discussions from the start of the semester came back to mind: is DH a discipline? A field? A set of methodologies? I can see how it can be considered a field of its own, but because of its interdisciplinarity, I see things like topic modeling, text analysis, and GIS largely (though not entirely in and of themselves) as methodologies available to various fields. I wonder if this confusion over the nature of DH is what contributes to the conversation about who should own DH (librarians or faculty). Which makes me wonder: can one own a methodology? My thought is, probably not.
Dynamic Reading of Maps vis-à-vis GIS
This week, we’re discussing Geographic Information Systems (GIS) in Digital Humanities. Prior to reading our course materials, when I thought about GIS, what generally came to mind were images of historical maps overlaid on modern maps, mostly to show changes in cartography, comparing new natural and human-built structures to features that earlier cartographers had left out or measured incorrectly. Largely because I confined my thoughts to GIS uses in tandem with my current technological abilities and historical understandings heavily influenced by the early modern period, I thought about how I have read maps over the years. Maps are texts, and as David Rumsey points out, they include meta- or “supra” narratives. These narratives have driven my interest because they illuminate relative worldviews. For instance, today most world maps place the Atlantic Ocean at the center and the North Pole at the top, which reflects a modern Eurocentric / Western worldview. Conversely, this sixteenth-century world map shows the world “upside down,”
and this Ptolemaic-based fifteenth-century map draws on the centrality of trading powers in the Middle East, placing it at the center.
While I considered these sorts of historical conceptions of place as shaped by their worldviews, I also thought about them in terms of narratives in relation to one another. As for the potential of GIS, D.J. Bodenhamer sums it up rather nicely in “The Potential of Spatial Humanities” (2010):
[H]uman activity is about time and space, and GIS provides a way to manage, relate, and query events, as well as to visualize them, that should be attractive to researchers.
This sounds wonderful, doesn’t it? While still blurring the lines between text and image, and moving away from simple side-by-side comparisons of maps, GIS allows for what David Rumsey calls dynamic readings of maps. With GIS, we can combine information and visualizations of maps into multi-layered stacks, such as multiple historical maps on top of each other along with topographical maps, embedded in Google Earth for digital interaction. See David Rumsey’s 3D map of Los Angeles, for instance.
However, while I did indeed think about maps as a means of reading human activity and GIS as multilayered, one aspect I hadn’t considered as closely was what Rumsey pointed out in his DH2011 keynote speech: one of the challenges of GIS is that maps are already in themselves visualizations of information, so GIS techniques essentially create visualizations of visualizations. It would seem, then, that the interactivity of these visualizations, and thus the digital interface, is extremely important.
What can we gain from GIS? Like the other DH methodologies, such as text analysis and topic modeling, it seems that while GIS can most certainly help illustrate “the known” through what Cooper and Gregory (2010) call “mere spatial visualization,” serving as a thinking tool, GIS techniques can also drive new research questions, serving as research tools for “critical interpretation.”
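One technical step behind stacking a historical map on a modern one is georeferencing: fitting a transform from the scanned image’s pixel coordinates to geographic coordinates using ground control points. The sketch below is my own illustration, assuming a simple axis-aligned scale-and-offset fit and invented control points; real GIS software fits richer transforms (affine, polynomial, rubber-sheeting).

```python
def fit_axis(pixels, coords):
    """Least-squares fit of coord = a * pixel + b for one axis,
    from paired ground control points."""
    n = len(pixels)
    mx = sum(pixels) / n
    my = sum(coords) / n
    a = sum((p - mx) * (c - my) for p, c in zip(pixels, coords)) / \
        sum((p - mx) ** 2 for p in pixels)
    b = my - a * mx
    return a, b

# Invented control points: (pixel_x, pixel_y) -> (longitude, latitude)
ctrl = [((100, 200), (-118.5, 34.2)),
        ((900, 200), (-118.1, 34.2)),
        ((100, 800), (-118.5, 33.9))]

xs = [p[0] for p, _ in ctrl]; lons = [g[0] for _, g in ctrl]
ys = [p[1] for p, _ in ctrl]; lats = [g[1] for _, g in ctrl]
ax_, bx = fit_axis(xs, lons)   # x pixels -> longitude
ay_, by = fit_axis(ys, lats)   # y pixels -> latitude

def pixel_to_lonlat(x, y):
    """Georeference one pixel of the scanned historical map."""
    return ax_ * x + bx, ay_ * y + by
```

Once every pixel can be mapped to a longitude and latitude, the scanned map can be warped and draped over a modern base layer, which is what makes the stacked, interactive comparisons possible.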
[This was probably my favorite GIS example from David Rumsey’s DH11 talk and his Map Collection, and I wanted to post it here because I thought it was neat… and I love Cassini: Cassini Celestial and Terrestrial Globes]
Will DH Save the Humanities?
Currently, I’m experimenting with text analysis tools such as AntConc and Voyant for topic modeling. Ted Underwood’s blog post “Topic Modeling Made Just Simple Enough” introduces the uses and purpose of topic modeling, showing that researchers can extrapolate topics and infer discourse from a great number of texts. Text analysis on its own can seem like a superficial close reading, letting readers get a sense of topics and themes, while topic modeling allows for more customization and a macroscopic view of a corpus. While certainly not a replacement for close readings of texts, topic modeling enables humanist researchers to conduct wider analyses of many texts in far less time than they otherwise could, which means they can present findings and add to scholarly discussions more quickly, and with more nuance, than traditional methods allow.
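For a concrete sense of what a topic model starts from: tools built on LDA consume a “bag of words,” a document-term matrix of raw counts, before inferring any topics. Here is a minimal sketch using only Python’s standard library; the toy documents are invented for illustration.

```python
from collections import Counter

# Toy corpus: each "document" is a short snippet (illustrative only).
docs = [
    "the plague spread through london and the plague took many lives",
    "the queen held court and the queen received ambassadors at court",
]

def doc_term_matrix(documents):
    """Build the bag-of-words counts that topic models such as LDA
    consume; word order is discarded, only frequencies remain."""
    vocab = sorted({w for d in documents for w in d.split()})
    matrix = []
    for d in documents:
        counts = Counter(d.split())
        matrix.append([counts[w] for w in vocab])
    return vocab, matrix

vocab, matrix = doc_term_matrix(docs)
# A topic model would now infer per-document topic mixtures from `matrix`.
```

The deliberate loss of word order is why topic modeling scales to thousands of texts, and also why it complements rather than replaces close reading.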
Research produced with a much wider scope that breaks traditional disciplinary boundaries, perhaps completed in a few years rather than a decade… So this is digital humanities. It has huge implications for academia, as it redefines humanist scholarship, which calls for a re-evaluation not just of the importance of the humanities, but also of the reward system in academia and how scholars are promoted and funded. After all, most funding agencies, in whatever form, seem to fund research that provides quick results. Moreover, it seems that digital humanities can help solve the crisis that has plagued both the humanities and the pure sciences in academia.
While I attended Cal State Fullerton, the California State University system suffered draconian budget cuts, forcing faculty to take furloughs and pay cuts while departments slashed course offerings and suspended hiring, which in turn forced students to prolong their stay, unless they chose to drop out after even required courses disappeared from registration catalogs. At Cal State Fullerton, the Modern Language Department suspended all but a handful of programs, while other departments cancelled many classes. Suffice it to say that we experienced a metaphorical drought in morale, and the humanities and pure sciences were hit the worst. Now, a few years later, it’s not as bad, but many in the Cal State system, including CSU Fullerton administrators, continue to regard the humanities as “esoteric” and less worthy of funding, supporting presentist research and preferring that STEM reign. I’m hopeful that digital humanities will reshape this discourse about the humanities and show that they are just as worthy of funding and research attention.
After all, digital humanities provides new forms of scholarly communication (I know, I’m stating the obvious). But if it resembles the methodologies of disciplines our culture deems “more important,” as we see with computational analysis such as topic modeling, maybe the digital will bring a more positive view of the humanities.
Matthew Jockers on Topic Modeling
This week, I attended Dr. Matthew Jockers’ talk “Correlating Theme, Geography, and Sentiment in the 19th Century Literary Imagination,” which was yet another great Catapult Center event at IU.
Dr. Matthew Jockers, a leader in text analysis from the University of Nebraska, discussed his latest research using topic modeling for the geometrization of narrative, that is, the literary function of “place,” in a macroanalysis of over 3,500 British and Irish novels. Topic modeling, combined with named entity recognition (NER), helps solve some of the problems with identifying places caused by ambiguity in texts, such as shared place names (Georgia, for instance, is a country as well as a state in the US) or places invoked as concepts. Additionally, using LDA to develop word clusters and differentiate the contexts of words, Dr. Jockers was able to get a larger sense of place, not in terms of coordinates but in “placeness.” Primarily interested in representations of place, Dr. Jockers found interesting, commonly addressed themes in his data set, including “peasant dwellings” and “war victories,” each discussed in different words depending on whether the author or audience was Irish or English. Generally, Irish authors spoke more positively of home, while British authors depicted themselves with a sense of superiority and, conversely, the Irish with wretchedness. As Dr. Jockers points out, these are macro, general tendencies, and so his findings should not be taken to speak for every individual’s perspective.
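To illustrate the place-name ambiguity problem, here is a toy gazetteer lookup in Python. This is my own hypothetical sketch of how context words might disambiguate a name like “Georgia,” not a representation of Dr. Jockers’ actual NER pipeline; the entries and cue words are invented.

```python
# A toy gazetteer: each ambiguous name maps to candidate places,
# each with context "cues" that suggest which sense is meant.
GAZETTEER = {
    "georgia": [
        {"place": "Georgia (US state)", "cues": {"atlanta", "savannah", "plantation"}},
        {"place": "Georgia (country)", "cues": {"tbilisi", "caucasus"}},
    ],
    "dublin": [{"place": "Dublin, Ireland", "cues": set()}],
}

def resolve(name, context_words):
    """Pick the gazetteer entry whose cues best overlap the words
    surrounding the mention; return None if the name is unknown."""
    candidates = GAZETTEER.get(name.lower())
    if not candidates:
        return None
    context = {w.lower() for w in context_words}
    return max(candidates, key=lambda c: len(c["cues"] & context))["place"]

print(resolve("Georgia", ["the", "plantation", "near", "Savannah"]))
```

Real systems replace the hand-built cue sets with statistical context models (which is roughly where LDA-style word clusters come in), but the underlying idea of scoring candidates against surrounding words is the same.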
The greatest lessons learned from Dr. Jockers’ talk, aside from the fact that he enjoys food-related analogies and metaphors when it comes to text analysis (or anything, for that matter), are 1) that topic modeling via text analysis allows for further confirmation of what one already knows, and 2) that the process facilitates the discovery of new categories of analysis and research questions (for instance, Dr. Jockers found that Irish-Americans were less sympathetic toward free Blacks in the nineteenth century, largely because of employment competition, while Irish people in Ireland related to them, probably out of an understanding of oppression!).
UAM and AntConc
Two weeks ago, I attended Tim Tangherlini’s talk, “Challenges for a Humanities Macroscope,” in which he presented a project that traces themes and patterns in Danish folklore tales. Then, last Friday, I attended Markus Dickinson’s workshop on Basic Text Analysis Tools. Thinking of both presentations in tandem illustrates the importance of text analysis tools for macroscopic or “distant” reading and for contextualizing large corpora, as Tangherlini’s project shows.
Text analysis allows for a different kind of reading of large text corpora; it facilitates the development of new research questions and assists in answering established ones through the discovery of word and phrase patterns. In other words, text analysis tools let one see the larger picture, view texts macroscopically, and grasp their “aboutness.” To conduct such a reading, Dickinson introduced two tools, AntConc and UAM. Both have user-friendly interfaces and simple designs, and best of all, both are free. AntConc seems most useful when you have something you want to look for but don’t yet have a research question. In historical inquiry, we want texts to “speak” to us, and I suspect this tool, in performing a distant reading, will allow texts to do just that, but in a wider context. AntConc also allows users to determine word frequency, collocation, and distribution (in some ways it reminds me of a glorified version of Google’s n-gram viewer for Google Books), thus allowing for both micro- and macroscopic readings. UAM has the same search capacity, but it is primarily an annotation tool built for those with computational skills, and it seems most useful for those who already have established research questions, particularly in linguistics and literature, since users can build hierarchical schemes and parse a corpus into various pieces. Ultimately, AntConc seems to have wider applications beyond linguistics.
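To make the frequency and collocation features concrete, here is a minimal sketch of the kind of counting AntConc performs, using only Python’s standard library; the sample sentence, tokenizer, and window size are my own illustrative choices, not anything from Dickinson’s workshop.

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split into word-like tokens, a simple stand-in
    for a concordancer's tokenizer."""
    return re.findall(r"[a-z']+", text.lower())

def collocates(tokens, node, window=2):
    """Count words appearing within `window` tokens of `node`,
    like AntConc's Collocates view."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t for j, t in enumerate(tokens[lo:hi], lo) if j != i)
    return counts

corpus = ("God save the Queen. The Queen rides in procession, "
          "and the crowd cheers the Queen.")
tokens = tokenize(corpus)
freq = Counter(tokens)                    # word frequency (Word List)
near_queen = collocates(tokens, "queen")  # words that keep "queen" company

print(freq.most_common(3))
print(near_queen.most_common(3))
```

Scaled up to a real corpus, these same two counts underwrite both the macroscopic “aboutness” view and the micro-level question of what company a given word keeps.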
The librarian who sat next to me asked me bluntly, “So, what can we actually do with this?” From a historian’s perspective, I am most excited about two possibilities with text analysis, broad as they are. First, the UAM corpus tool has a function for rhetorical structure theory (RST), which allows for discursive analysis. The possibilities for studying changes in discourse over time, culling the frequency and collocation of words and phrases, are endless. I can see this being useful not just for primary texts, but also for historiographical purposes (e.g. “over time, have scholars’ ideas changed about this?”). Second, along the same lines, those with an interest in mythopoetic discourse (battle myths, for example) can upload corpora into one of these tools to trace evolving portrayals of events, people, and so on, as well as shifts in language in terms of contextualization (“You shall know a word by the company it keeps”!). The design of these tools suggests that text analysis comes naturally to computational linguistics. However, the study of language is also the study of culture, which applies just as easily to the study of history and the human condition.
If there’s one lesson I can take from these tools (text analysis and distant reading) in light of digital humanities, it’s that digital humanities is by nature not just interdisciplinary, but transdisciplinary. Research questions and their products, whatever the form of the resulting scholarship, are enriched by humanists and computer scientists interacting and coming together.
[I also created a blog post on text analysis with other tools and resources for the Reference Department at IUB’s Wells Library. The link will work come next Monday.]
History and DH as Interdisciplinary
During last week’s session of our digital humanities (DH) course, we discussed the definition of DH, a long and drawn-out debate among DHers for decades. In that conversation, others dissected the meaning of both humanities and digital, and I found myself at odds with a few assertions:
“The humanities are opinions, not science.”
“The humanities are not social science.”
These are not verbatim quotes, but rather the gist of what I remember being said.
True, the humanities are not “science,” nor are they necessarily social science, but are they not at times scientific? Do we not use the scientific method to develop our hypotheses, to help shape our methodologies? In doing history, our current knowledge and our worldviews inform how we think about history, but the historian is supposed to be objective, not delineating personal opinions. However, history can be subjective, which the objective historian understands. We must shed all of our preconceptions and immerse ourselves in the distant lands of the past to fully engage with what things meant then. History is about perspective, indeed, but what matters most are the perspectives of the people of the time being studied. Just as objectivity is necessary for historians, and perhaps for all the humanities, so it is for the scientific method. Perhaps they are intertwined.
This leads me to my next point. Why is it so important to differentiate the humanities from the social sciences, which also use scientific methods? The humanities don’t rely on just close readings of texts; they quantify and qualify data. Computational techniques and analysis, for instance, have significant effects on all disciplines, not just the sciences, whether social, hard, or “pure.” In other words, the humanities use the same tools as the social sciences, which enriches our studies, research, and findings. Again, I use history as an example: it is interdisciplinary, and it should be, to help inform us about the past and deepen our understanding of the human condition. It’s more than telling a story; it’s a matter of placing meaning and using a variety of disciplines as tools for creating categories of analysis. For instance, in studying “last dying speeches,” which are accounts of criminals’ last words before being executed in early modern England, psychology comes into play to inform our understanding of the power of apologies, as the public spectacle, expectation, and delivery of an apology from a transgressor helped to heal the body social. Surveying the demographics of the criminals detailed in the Ordinaries’ Accounts, meanwhile, perhaps comes from sociology. Ultimately, I would argue that interdisciplinarity deepens and supplements such a study because, after all, history is not just retelling stories with facts and dates. It’s about crafting understanding and enhancing our knowledge, enlightening us about the past as well as ourselves, which is aided by a toolbox filled with measures that extend beyond the humanities.
DH is interdisciplinary as well, and it seems that it’s not so much what you study as what you do with it. Any medievalist (or early modernist, as seen here) can have a blog or a website. But what does that website do? What’s its purpose? For our DH class we read and discussed the notion that DH is about “building,” using Stephen Ramsay’s point that “If you are not making anything, you are not…a digital humanist.” This seems quite true, but I think it goes beyond building and comes down to the idea of sharing. Mark Sample elucidates this rather well in his blog. As the digital landscape redefines how we produce, disseminate, and consume scholarship, the boundaries by which we share knowledge become, for lack of a better term, boundless. This allows us to engage in scholarly conversations, and perhaps these conversations are what scholarship is about: sharing ideas, re-evaluating our current knowledge through new evidence brought forth by other scholars. Digital Humanities, then, broadens the arena for discussion.