SAIC/NU Data Viz Collaborative

August 16–22
Reception: Friday, August 16, 4:00–6:00 p.m.
Gallery X, 280 S. Columbus Dr., room 113

Twenty-one students and nine faculty members from Northwestern University and the School of the Art Institute of Chicago (SAIC) are combining big data with collaborative research, studio arts, and visual communication design this summer at SAIC’s downtown campus. The results—creative approaches to information visualization developed in an intensive new course, called Data Viz Collaborative—will go on view at SAIC’s Gallery X from August 16 through 22 with related installations in the lobby of the LeRoy Neiman Center from August 16 through September 13.

The free exhibition will showcase the latest developments at SAIC in a long history of connecting artistic and scientific practices via their shared processes of discovery. Divided into three research groups, each set of participants was given six weeks and a $500 budget to develop the experimental projects that will be on display. The areas of concentration are Big Data and School Choice in Chicagoland; Mapping Genealogy and Ancestry; and Eye-Tracking: Tracing the Gaze in an Image.

“In today’s increasingly data-driven world, artists and designers have much to contribute to innovation alongside scientists and engineers,” says SAIC President Walter E. Massey. “The complexity and scale of the issues presented by visualizing information in the age of big data require a creativity of approach and mindset in both research and problem-solving. Only by combining the interpretive powers of artists and scientists can we continue to achieve the kinds of breakthroughs necessary to sustain an innovative society and economy.”

http://www.saic.edu/academics/areasofstudy/artandscience/datavizcollaborative/

Talk Today, 4/22: Sarah Igo, Tracking the ‘Surveillance Society’

DH-relevant talk today over at the Program in Science in Human Culture:

SARAH IGO (Vanderbilt)

“Tracking the ‘Surveillance Society’”

Description: This talk explores the cultural effects of new ways of housing and accessing personal data in the U.S. in the 1960s and 1970s. It was in this period that citizens first mobilized around what they had known, in low-grade fashion, since at least the 1930s: that many agencies, public and private, were collecting information about them. New suspicions attended the mundane data-gathering operations of agencies like the Internal Revenue Service and Census Bureau, while hidden monitoring devices, vast warehouses of private information, and menacing bureaucracies loop through the cultural and political texts of the period. Faster computers, larger bureaucracies, and expanding databanks, I will argue, generated novel claims and claimants for the protection of “private” information. They also led to a distinctive understanding of the postindustrial U.S. as a “surveillance society,” which depended on the collection of personal data for its very operation.

Hagstrum Room (University Hall, Room 201), Mondays, 4:00–5:30 p.m.

http://www.shc.northwestern.edu/klopsteg/index.html

See also this fascinating and DH-relevant article: Sarah Igo, “Knowing Citizens,” Sensate Journal.

X-Post: Notes on McGann’s Radiant Textuality

X-posted from Issues in Digital History.

I am going to write a longer commentary on Jerome McGann’s Radiant Textuality: Literature After the World Wide Web (Palgrave, 2001) in an upcoming post, but a few sections of his preface and introduction (“Beginning Again: Humanities and Digital Culture, 1993–2000”) are striking for how relevant they remain over ten years after he wrote the book.

McGann organizes his book around two main arguments:

The first is that understanding the structure of digital space requires a disciplined aesthetic intelligence. Because our most developed models for that kind of intelligence are textual models, we would be foolish indeed not to study those models in the closest possible ways. Our minds think in textual codes. Because the most advanced forms of textual codings are what we call ‘poetical,’ the study and application of digital codings summons us to new investigations into our textual inheritance (xi).

McGann’s second argument is as follows:

Digital technology used by humanities scholars has focused almost exclusively on methods of sorting, accessing, and disseminating large bodies of materials, and on certain specialized problems in computational stylistics and linguistics. In this respect the work rarely engages those questions about interpretation and self-aware reflection that are the central concerns for most humanities scholars and educators. Digital technology has remained instrumental in serving the technical and precritical occupations of librarians and archivists and editors. But the general field of humanities education and scholarship will not take the use of digital technology seriously until one demonstrates how its tools improve the ways we explore and explain aesthetic works—until, that is, they expand our interpretational procedures [italics in original] (xi-xii).

Here is McGann asking 11 years ago that we not view the computer in opposition to the book, but as a continuation of the history of the book. Perhaps more crucially, he argues that we should fit what would become known, a few short years later, as the digital humanities (not yet a popular term for the field in 2001) into the critical traditions of inquiry that are the precinct of modern humanities scholars:

We have to break away from questions like ‘will the computer replace the book?’ So much more interesting are the intellectual opportunities that open at a revelatory historical moment such as we are passing through. These opportunities come with special privileges for certain key disciplines—now, for engineering, for the sciences, for certain areas of philosophy (studies in logic), and the social sciences (cognitive modeling). But unapparent as it may at first seem, scholarship devoted to aesthetic materials has never been more needed than at this historical moment (xii).

To the end of developing “scholarship devoted to aesthetic materials,” McGann posits the following imagined debate between a pro-digital humanities scholar and an anti-digital humanities scholar:

Computational systems…are designed to negotiate disambiguated, fully commensurable signifying structures.

‘Indeed! And so why should machines of that kind hold any positive interest for humanities scholars, whose attention is always focused on human ambiguities and incommensurables?’

‘Indeed! But why not also ask: How shall these machines be made to operate in a world that functions through such ambiguities and incommensurables?’ (xiv).

Finally, McGann notices how the digital humanities potentially reunites the “Lower Criticism” of philology with the “Higher Criticism” of historicism and aesthetic inquiry, which Nietzsche had divided. The digital does not reduce the critical insights of “Higher Criticism,” McGann believes. Rather, it asks, perhaps even demands, that humanities scholars reimagine the higher levels of advanced critical inquiry in relation to the fundamentally transformed foundations of “Lower Criticism,” once those foundations of text, source, evidence, and archive are placed into the digital medium:

In our day the authority of this Nietzschean break has greatly diminished. Modern computational tools are extremely apt to execute one of the two permanent functions of scholarly criticism—the editorial and the archival function, the remembrance of things past. So great is their aptitude in this foundational area that we stand on the edge of a period that will see the complete editorial transformation of our inherited cultural archive. That event is neither a possibility nor a likelihood; it is a certainty. As it emerges around us, it exposes our need for critical tools of the same material and formal order that can execute our other permanent scholarly function: to imagine what we don’t know in a disciplined and deliberated fashion. How can digital tools be made into prosthetic extensions of that demand for critical reflection? (18).