IDRH Events

Recent Submissions

  • Publication
    The Public Digital Humanities Institute - A National Endowment for the Humanities Institute to Support Academic & Community Collaborations in the Digital Humanities
    (2024-01) Rosenblum, Brian; Tell, Dave; Fernández, Sylvia; Dwyer, Kaylen; Bishop, Sarah
    The Public Digital Humanities Institute (PDHI) brought together teams of academics and community partners from 12 community-based digital humanities projects for an intensive week of digital humanities training and discussion at the University of Kansas (KU) in Lawrence, Kansas. The PDHI was funded by the National Endowment for the Humanities through the Institute for Advanced Topics in the Digital Humanities program. It was organized and carried out by KU’s Institute for Digital Research in the Humanities (IDRH), under the direction of co-PIs Brian Rosenblum, digital scholarly initiatives librarian, and Dave Tell, professor of communication studies. This white paper discusses the PDHI's origins and goals, participating projects, curriculum and activities, and outcomes. The PDHI Handbook provides access to slides, handouts, and other resources presented at the Institute.
  • Publication
    Reasserting Thing-Power: Roughness as a Response to Antimaterialism
    (2013-09-14) Sullivan, Rachael
    Overwhelmingly, contemporary interface design principles aim for an experience of immateriality. As Bill Buxton, pioneering interface designer at Microsoft Research, recently told Ars Technica in an interview, “if you’re aware there’s a computer there, we’ve failed.” The fluid movement of fingers and the immediate responsiveness of a glowing screen envelop users in effortless interaction with information. Drawing on software studies and neo-materialist theory, this presentation first shows that ease-of-use and user-friendliness as priorities enforce a misunderstanding of digital textuality and encourage composers (“writers” in classrooms, “content producers” on the social web) to look for and expect readymade composing surfaces. Many media and digital humanities scholars have interpreted this black-boxing of functionality as an ideological preference for immateriality and ephemerality in our writing technologies. As Matthew Kirschenbaum writes (2008), “Computers have been intentionally and deliberately engineered to produce the illusion of immateriality” (135). Such an illusion, which Matthew Fuller (2008) blames partly on rhetoric that enforces a caesura between the automaticity of computing and messier industrial or craft forms of production, “is ultimately trivializing and debilitating” (Fuller 4). Turning to political philosopher Jane Bennett’s theory of materiality (2010), which she calls “thing-power,” the second half of my presentation argues that the dis-appearance of writing materials, encouraged by revenue-generating, smoothing innovations like Facebook’s “frictionless sharing” and Amazon’s “one-click buying,” is detrimental to inventive practice in new media. Writing technologies are not neutral and unproblematic bearers of language; a confrontation with thing-power means “acknowledgment, respect, and sometimes fear of the materiality of the thing [and the] ways in which human being and thinghood overlap” (Bennett 349). Stronger than rhetoric of immateriality, an “anti-materiality bias” (Bennett 350) actively devalues the ways that computers shape what is possible in writing and how that writing happens between the electrical conductivity and movement of fingertips, sensitive surfaces, and increasingly complex layers of software and software developers. When antimaterialism moves from interface design to the digital humanities classroom or studio, what is lost is the rewarding encounter with the rough-edged energy and difficulty not only of language but also of the stuff of composing.
  • Publication
    3DGIS for Analysis of Ancient Maya Architecture & Landscapes
    (2015-10-19) Richards-Rissetto, Heather
    3DGIS for Discourse, Analysis, and Interpretations of Ancient Maya Architecture and Landscapes. Archaeological projects increasingly acquire and create 3D data of objects, buildings, and even landscapes; however, it is still a challenge to make these data accessible to researchers and cultural heritage managers and to link these models to geo-referenced data sets for visualization and analysis. To address this issue, the MayaArch3D Project (www.mayaarch3d.org) is working to develop a 3D WebGIS, called QueryArch3D, that allows 3D models and GIS to “talk to each other” for studies of architecture and landscapes, in this case the eighth-century Maya kingdom of Copan, Honduras. In this talk, I will discuss how we are using 3D WebGIS to develop new methods for exploring the visibility or inter-visibility of monuments and buildings to or from common pathways that inhabitants of different social quarters may have taken while moving through the city of Copan. I will also present an affiliated project, MayaCityBuilder, recently begun at the University of Nebraska-Lincoln, which uses procedural modeling, the rapid prototyping of 3D models from a set of rules, to allow for the efficient and low-cost creation of alternative ancient Maya landscapes in order to foster discourse, analysis, and interpretations.
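    The visibility analyses described above are carried out inside the project's 3D WebGIS, and none of that tooling is reproduced here. As a rough, generic illustration of what an inter-visibility question involves, the following Python sketch tests line of sight between two cells of a hypothetical elevation grid; the function, grid, and values are invented for illustration and are not drawn from QueryArch3D.

        import numpy as np

        def line_of_sight(dem, a, b, observer_height=1.6, samples=200):
            # dem: 2D array of terrain heights; a and b: (row, col) cells.
            # Sample the straight line between the two points and test whether
            # any intervening terrain cell rises above the sight line.
            (r0, c0), (r1, c1) = a, b
            h0 = dem[r0, c0] + observer_height   # eye level at the observer
            h1 = dem[r1, c1]                     # ground level at the target
            for t in np.linspace(0.0, 1.0, samples)[1:-1]:
                r = int(round(r0 + t * (r1 - r0)))
                c = int(round(c0 + t * (c1 - c0)))
                if dem[r, c] > h0 + t * (h1 - h0):   # terrain blocks the view
                    return False
            return True

        # Toy landscape with two arbitrary locations.
        rng = np.random.default_rng(0)
        dem = rng.uniform(0.0, 10.0, size=(50, 50))
        print(line_of_sight(dem, (5, 5), (40, 40)))

    A full GIS viewshed additionally accounts for earth curvature, cell resolution, and observer and target offsets, but the basic question, whether anything blocks the line between two places, is the same.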
  • Publication
    Editing Walt Whitman’s Marginalia Today: Digital Humanities Methods at the Edge
    (2014-05-01) Cohen, Matt
    This talk is about methodology in the humanities. It begins with a discussion of the most basic practice of humanities research: note-taking. Annotations, marginalia, all of the methods of sifting, highlighting, and gathering: these are the substrate of our larger claims and discoveries. Such is the case even when we are working with “big data,” topic modeling, natural language processing, and other automated techniques for what Franco Moretti has called “distant reading.” The talk then reflects on the claims for methodology in and as what is being called the digital humanities. These observations emerge at the junction of two occasions. The first is a project to digitize the poet Walt Whitman’s annotations and marginalia, his personal metadata on his reading. This NEH-funded project is at the end of its first phase, and will be published later this year for free access at the Walt Whitman Archive (http://www.whitmanarchive.org/). The second spur is the active conversation about the digital humanities as a methodological crucible or fountain; both the tenor and the content of that conversation are occasions for considering the status of method in the humanities.
  • Publication
    Multi-Disciplinary Teaching Across Computing and the Humanities
    (2014-04-03) Fishwick, Paul
    Can the connections between the humanities and computer science include arts and humanities informing computer science? We are familiar with the idea that computer science results in technologies, and that these are then used as tools by artists and humanists. Going in the other direction is also possible, where deep concepts in computing are covered through cultural artifacts. We will include practical examples of this approach, including al-Jazari’s water clock. These examples create new possibilities for humanist-computer science collaborations, and they also suggest that computer science can be viewed as empirically driven rather than existing purely as an “artificial science.”
  • Publication
    XML as a Tool for Domain-Specific Languages
    (2011-09-23) Sperberg-McQueen, Michael
    Abstract: Computers are general-purpose machines for manipulation of symbols, which means they can be applied in almost any field whose problems can be expressed in terms of symbols. But the creators of computer systems and the potential users of those systems do not always think the same way and do not always find communication easy. Much of the history of information technology can be glossed as a series of attempts to bridge this communication gap. One current approach to this problem is to design ‘domain-specific languages’ (DSLs): formal languages suitable for computer processing, with vocabulary and semantics drawn from the intended application domain. In retrospect, the design of the Extensible Markup Language (XML) can be viewed as an attempt to encourage domain-specific languages and make them easier to specify. Like DSLs as conventionally conceived, XML vocabularies allow concise descriptions of interesting states of affairs in a particular application area and tend to be more accessible to domain experts than conventional programming languages. Unlike conventional DSLs, most XML vocabularies are specified as having declarative rather than imperative semantics; this is both a blessing (declarative information is almost always easier to verify and easier to apply in new and unexpected ways) and a curse (many conventional programmers find declarative semantics hard to come to terms with). Examples will be drawn largely from XML vocabularies for the encoding of culturally significant textual materials.
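    As a minimal sketch of the idea (not an example from the talk), the fragment below invents a tiny XML vocabulary for a letter, whose element names come from the application domain rather than from programming constructs, and reads it declaratively with Python's standard library; the vocabulary and document are hypothetical.

        import xml.etree.ElementTree as ET

        # A tiny, invented vocabulary: the element names describe things in the
        # application domain (a letter), not programming constructs.
        document = """
        <letter date="1855-07-21">
          <sender>Ralph Waldo Emerson</sender>
          <recipient>Walt Whitman</recipient>
          <body>I greet you at the beginning of a great career.</body>
        </letter>
        """

        root = ET.fromstring(document)
        # The markup declares what the parts are; any program may decide what to do with them.
        print(root.get("date"), root.findtext("sender"), "->", root.findtext("recipient"))

    That separation, markup stating what the parts are while leaving the processing open, is the declarative quality the abstract describes as both blessing and curse.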
  • Publication
    Digital Humanities Forum 2015. Afternoon session
    (2015)
    1:15 - 2:00 Panel Session: Up in Arms: The Collision of Intellectual Property and Collaborative Practices, Rachel Mann (University of South Carolina); The More the Merrier: Tapping into the Power of Librarians to Collaborate on Undergraduate Digital Humanities Assignments, Stewart Varner (University of North Carolina); Overlapping Hierarchies: Academic Libraries and Digital Humanities, Andrew Rouner (Washington University in St. Louis)
    2:00 - 2:30 Performing archives: sensitive data, social justice, and the performative frame, Jacqueline Wernimont (Arizona State University)
    2:30 - 3:00 The computer-assisted identification of meter and rhyme: How Russian is not English, David Birnbaum (University of Pittsburgh)
    3:00 - 3:15 Break
    3:15 - 4:15 Panel Session: Digital Cuba: Problems and Possibilities, Jonathan Dettman (University of Nebraska-Kearney); Critical Making, Platform Politics and Open Source in the Study of Digital Artworks, Andy Stuhl (Massachusetts Institute of Technology); Decolonizing Digital Humanities: Africa in Perspective, Titilola Babalola Aiyegbusi (University of Lethbridge); eLaboraHd: Project of Digital Experimentation, Adriana Álvarez and Miriam Peña (National Autonomous University of Mexico (UNAM))
    4:15 - 5:15 Closing Keynote: “Networking Peripheries: Technological Futures, Digital Memory and the Myth of Digital Universalism”, Anita Say Chan, Assistant Research Professor of Communications, University of Illinois
  • Publication
    DH Forum 2015 Morning Session
    (2015)
    9:00 - 10:00 Keynote Talk: “Push Pause: Slowing down digital humanities practices”, Kim Christen Withey (Director of the Digital Technology and Culture Program and Co-Director of the Center for Digital Scholarship and Curation)
    10:00 - 10:30 A Postcolonial Reading of (Digital) Archival Structure, Dhanashree Thorat (University of Florida)
    10:30 - 11:00 The Archive Gap: the Digital Humanities and the Western Canon, Amardeep Singh (Lehigh University)
    11:15 - 11:45 Visualizing History: The Malone Community Center: A Platform for the Malone Community to Rediscover their History, Jennifer Isasi and Alex Kinnaman (University of Nebraska-Lincoln)
    11:45 - 12:15 Take Back the Narrative: Rethinking the History of Diverse Digital Humanities, Amy Earhart (Texas A&M University)
  • Publication
    Revising Ekphrasis: Using Topic Modeling to Tell the Sister Arts’ Story
    (2013-11-07) Rhody, Lisa
    For the past 20 years, the story of ekphrasis—poetry to, for, and about the visual arts—has been told as a long-standing, gendered contest between rival media, fraught with political, cultural, and religious anxieties. Although skeptical of the necessity of gendered rivalry as a principle of ekphrastic creation, literary scholars have struggled to present a compelling alternative model that sufficiently accounts for the genre’s representational complexity. This talk begins by asking if computational methods might offer new insights into the canon and tradition of ekphrastic poetry and suggests how topic modeling—one form of computational text analysis—might begin to refocus the aperture of our critical lens on the genre’s conventions. Oriented toward the non-expert, this presentation will assume no prior knowledge of topic modeling or social network analysis. I will provide a gentle introduction that builds toward an understanding of the potential uses for topic modeling and network analysis as a means for exploring large collections of poetic texts. Poetic collections, dense and rich with figurative language, require revising how we as humanists interpret topic modeling results. Therefore, this presentation will also address how changes in interpretation affect the questions we might ask and the assumptions we can make about “topics” generated by latent Dirichlet allocation (LDA)—one type of topic modeling algorithm.
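    For readers who want a concrete sense of what topic modeling produces, here is a minimal, generic LDA sketch using scikit-learn on a toy corpus; it is not the speaker's model, corpus, or workflow, only an illustration of the kind of word-cluster output LDA yields.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # A toy corpus standing in for a large collection of poems.
        poems = [
            "the painter's canvas holds a silent sea of color and light",
            "marble statue, cold stone eyes that never meet my gaze",
            "the gallery hums, frames of gold around each quiet scene",
            "waves of paint, a storm the brush has frozen on the wall",
        ]

        # Convert the texts to word counts, dropping common English stopwords.
        vectorizer = CountVectorizer(stop_words="english")
        counts = vectorizer.fit_transform(poems)

        # Fit a two-topic LDA model; real studies use far more texts and topics.
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        lda.fit(counts)

        # Print the most heavily weighted words in each inferred "topic".
        words = vectorizer.get_feature_names_out()
        for i, topic in enumerate(lda.components_):
            top = [words[j] for j in topic.argsort()[-5:][::-1]]
            print(f"topic {i}:", ", ".join(top))

    Reading such word lists against figurative language is itself an interpretive act, which is exactly where the talk's argument picks up.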
  • Publication
    Playing without Power in videogames
    (2012-11-06) Sample, Mark
    Players and scholars alike have characterized videogames as fantasies about unlimited power. In this talk I explore how some videogames have rejected the core mechanic of “leveling up”—in which the player’s character grows increasingly more powerful—and have instead emphasized the vulnerability of the game’s protagonist. Such games test the limits of playing the powerless and the doomed in videogames, allowing us to explore the outer edges of our empathy and our imagination.
  • Publication
    Museum Collecting in the Age of ‘Big Data’: Opportunities for Collaboration
    (2012-09-22) Welsh, Peter
    Museums, particularly museums of cultural history, face a constant challenge of deciding which objects to add to the collection, knowing that acquiring any object brings obligations to provide long-term stable environments, appropriate documentation, and ongoing access. Established practice is for each museum to evaluate potential acquisitions in accordance with its own written collections policy. However, because institutions act independently, significant duplication of objects has resulted, straining the resources of each museum. Some museums are exploring collaborative approaches to this “big data” problem by sharing information on collections, reallocating objects among museums, and making collections available to one another for exhibitions.
  • Publication
    Putting Moses in The Matrix: Academic Biblical Studies in a “Post-Development” Digital World
    (2013-09-14) Welch, Eric
    Despite its place as one of the oldest disciplines in the academy, the field of biblical studies has led the way in the adoption of digital avenues for scholarly analysis and the lay consumption of biblical texts. Beginning in the late 1970s, digital pioneers began the process of digitizing the text of the Bible. As texts in the original languages of Greek and Hebrew were digitized, grammatical data was organized in large databases that ultimately enabled complex and sophisticated searching of the biblical text. Three decades later, the development of software for biblical research has grown into a small industry, the products of which are a foundational element of modern biblical scholarship. This paper will demonstrate the ways in which the ubiquity of digital resources has facilitated a return to the material in the field of biblical studies. First, this paper explores the implications of conducting digital research in a field where a number of powerful digital tools already exist. Specifically, this paper will illustrate how research in a “post-development” world has reinvigorated the scholarly exploration of the biblical text itself as an object of study. With the new dimensions of textual research afforded by digitization, scholarly workflow is improved and research design is fundamentally changed, opening the door to questions never imaginable by scholars of the past. Moving from the theoretical to the practical, this paper will then illustrate how software dramatically shortens the feedback loop in research, allowing for low-risk experimentation with rapid results. As an example, the author will demonstrate how he recently used grammatically tagged primary sources, linked secondary sources, and graphic statistical feedback to analyze a poem in the book of Zephaniah for publication in a peer-reviewed journal. The field of academic biblical studies offers a powerful long-term view of digital research in the humanities. There is little doubt that digital biblical research has flourished due to the large market for religious texts in digital form; however, the success of digital biblical scholarship has important implications for humanities research beyond the study of religion. The drawn-out lifespan of digital analysis in biblical studies demonstrates the remarkable potential for scholarly innovation as a field moves from developing tools to employing them as a natural part of the research process. Even in a traditional field such as biblical studies, in which a finite corpus of texts has been subject to careful analysis for over 2,000 years, the return to the material in digital form is constantly producing new and exciting insights.
  • Publication
    Making the Most of Free, Unrestricted Texts—the Text Creation Partnership
    (2011-09-24) Welzenbach, Rebecca
    Abstract: In April 2011, the Text Creation Partnership announced that 2,231 transcribed and SGML/XML-encoded texts from the Eighteenth Century Collections Online (ECCO) corpus were freely available to the public, with no restrictions on their use or distribution. This is the first set of TCP texts to have all restrictions lifted. We have already seen significant interest in studying, manipulating, and publishing these texts, which has given us a peek at what might happen in a few years, when the much larger EEBO-TCP archive also becomes available to the public. The release was met with enthusiasm by power users who were eager to work directly with the XML files, but with frustration by those who expected a full-service platform for interacting with the texts. This presentation will discuss the mixed reactions to the release of the ECCO-TCP texts; offer examples of how people are starting to work with them; and highlight some of the questions, challenges, and opportunities that have arisen for the TCP as a result.
  • Publication
    Neoliberalism & Literary Geography of the 20th Century: Statistical Models
    (2015-04-22) Wilkens, Matthew
    Computational methods allow literary scholars to test their claims against a much larger and more diverse body of texts than would otherwise be possible. Recent examples include work on the evolution of poetic diction in the nineteenth century, on comparative social networks in American and Asian modernism, and on urban space in several centuries of British fiction. But there has been very little such research on contemporary literature, where problems of scale are most acute. This talk presents new computational work on neoliberalism and the literary geography of the twentieth century. To shed light on the extent to which fiction today is shaped by the logic of late capitalism, it assesses the relationship between the century’s significant changes in economic output and the shifting distribution of geographic attention in 10,000 American novels published between 1880 and 1990, finding a surprising — and growing — degree of geographic conservatism in postwar US fiction. This result calls into question the widespread critical assumption that neoliberal ideology demands an increasingly close alignment between market functions and aesthetic production.
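    The talk's actual statistical models are not reproduced here. As a schematic of the general approach of relating two historical series, the sketch below correlates an invented per-decade measure of geographic attention with an invented economic index; every number is a placeholder.

        import numpy as np

        # Entirely invented numbers: the share of geographic mentions devoted to one
        # region in each decade, alongside an economic output index for the same decades.
        decades = np.arange(1880, 2000, 10)
        geographic_share = np.array([0.42, 0.41, 0.40, 0.39, 0.38, 0.38,
                                     0.37, 0.37, 0.36, 0.36, 0.35, 0.35])
        economic_index = np.array([1.0, 1.2, 1.5, 1.8, 2.1, 2.0,
                                   2.6, 3.4, 4.3, 5.1, 6.0, 7.2])

        # Pearson correlation between the two series; a real analysis would model many
        # regions at once and control for publication volume, genre, and so on.
        r = np.corrcoef(geographic_share, economic_index)[0, 1]
        print(f"correlation across {len(decades)} decades: r = {r:.2f}")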
  • Publication
    Emerging Opportunities for Visual Analytics in the Digital Humanities
    (2012-02-07) Weaver, Chris
    Research is a complex process of exploration and analysis that encompasses observation, collection, interpretation, discourse, and collaboration. That the digital humanities community aims to marry human and computational capabilities puts it squarely in the vanguard of emerging methodologies. As a growing methodological subdiscipline of the information sciences, visual analytics seeks to facilitate the research process by augmenting innate human visual and cognitive capabilities with interactive computational tools. The commonalities and potential for exchange between the digital humanities and visual analytics are conspicuous. Useful but specialized applications of visual analysis now exist in numerous domains that tackle complex, voluminous information sources; well-represented domains include intelligence analysis, emergency response, business logistics, finance, and epidemiology. However, there is as yet little support for an open-ended, user-driven process of broad and deep digital engagement in which data processing, graphical depiction, and human interaction adapt to evolving research needs and goals, particularly in examinations of idiosyncrasy. In this talk, Chris Weaver will offer a vision of humanities scholarship infused with highly interactive, visual, computational facilities for interpretation and discourse. He will also present concrete progress on developing methods, techniques, and tools in support of that vision.
  • Publication
    Student Showcase: Who/Where/When/Why/How Tells the Story—A Roundtable Discussion
    (2017-09-29) Greene, Danyelle; Frank, Adrienne; Sasala, An
  • Publication
    The Moral Role of DH in a Data-Driven World
    (2014-09-13) Weingart, Scott
    Networks are increasingly invoked in the humanities and computational social sciences both metaphorically and formally to interrogate ourselves. Simultaneously, individuals, corporations, and governments employ networks as a means to prestige, profit, and power. When in 1696 Leibniz compared the scientific method to putting nature “on the rack,” he was not literally connecting torture to evidence gathering. In the intervening centuries, however, the metaphor has become frighteningly apt. Network analysis, an ostensibly scientific method, is used to justify targeting of terrorists and is instrumental in inferring private lives from public sharing. This lecture will address the relationship between networks and the digital humanities; what DH can learn from network analysis elsewhere; and importantly, how DH can contribute to these broader ethical discussions. Indeed, if we do not contribute our ethical concerns to the discussion, it is unclear who will.
  • Publication
    Making Research “Come Alive” Through Digital Storytelling
    (2017-09-29) Walters, Lynne; Green, Martha