Well, they didn’t throw me to the Oompa Loompas.
This weekend I attended THATCamp, a BarCamp-style “unconference” on the humanities and technology (hence, THATCamp) hosted by the Center for History and New Media. It was terrific, and my earlier wibbling proved unjustified. While I was certainly in awe of the digital kung fu being thrown down, I could in fact follow 95% of the conversations, and I had a great time. Many, many thanks to the CHNM crew, especially Jeremy Boggs and Dave Lester, who I gather were the real architects of THATCamp, and to all the other great folks I met. Now I’ve got that post-conference power-up of enthusiasm, not to mention a lot of new blogs to follow, friends to correspond with, and things to think about. All should be fodder for future posts. But if I had to summarize what I took away from the weekend as a whole, I’d say this:
First, a lot of very smart people are thinking very hard about how best to apply the tools of the digital world to history and the humanities. It’s actually not an obvious or an easy question to answer, and I have to say I don’t think we as a community have entirely cracked it yet. There seemed to be more exciting and promising tools at the conference than there were obvious problems to apply them to. That’s not a dismissal. I think “more tools than problems” is a great position to be in. I just thought many sessions were stronger on “here’s what you can do with these tools” than on “here’s why you’ll want to do it.” Case in point: the NEH’s Office of Digital Humanities is seeking ideas for humanities supercomputing. Supercomputing! They want to give historians and other humanists access to supercomputers! But there’s an unfortunate dearth of historians who need a trillion calculations done in one second. This is what I really want and need to put my brain to. Not supercomputing, but the whole “OK, so what should we do with these tools” question. We really need some canonical projects that anybody can point to and say “oh, so that’s why this stuff is valuable to the humanities.” It’s going to happen soon–like I said, there are some very smart people thinking very hard about it. Once it does, we’ll probably stop calling this endeavor “digital history” at all. It will just be “history”, part of how it’s done.
Second, there’s money in them digital hills. If you’re a history or humanities graduate student looking to set yourself apart from the crowd, I strongly suggest thinking about getting involved in digital research. I’m afraid I don’t just mean a blog about robots. Demonstrate some programming chops along with your humanities education and there ought to be people who’ll want very much to hire you. (Edit: See? Here’s some THATCampers wondering where to find programmers.) Better yet, come up with some answers to the questions in my last paragraph. You don’t need a compsci degree, and you don’t need to be a math whiz. But you can’t be scared of your computer, and you do need to put in some time.
Third, I really like these people, the ones tearing down the wall between the two Cold War cultures of science and the humanities. You could say there’s an element of preaching to the choir at any meeting like this. Nobody at THATCamp was unsympathetic to the project of digital humanities. But so what? Choirs need to get together, to practice and to sing. A big reason to go to any conference is for validation–the formation in physical space of a community linked more by outlook and interest than geography. As I said before, these people feel like my tribe. So even if I don’t crack the digital humanities riddle, I’m going to keep turning up for things like THATCamp as long as they’ll have me.
My notes on specific sessions are below the fold (it’s a long post), along with links to many shiny gewgaws that were demoed or displayed.
Teaching Digital Humanities
The first session I went to (here’s the whole schedule, fairly spontaneously generated in the first hour of the conference) was on teaching digital humanities. Or digital history. The H in THATCamp stands for humanities, but the H in CHNM stands for history, and historians at THATCamp outnumbered other sorts of scholars to the point that a lot of us fell into using the two words interchangeably. Which they aren’t, of course. But interdisciplinarity was so basic to the goals of everybody there that I think (I hope?) it wasn’t too difficult for non-historians to relate.
Jeffrey McClurken, Bill Ferster, and Paula Petrik all described their experiences teaching digital history courses (Dan Cohen and Bill Turkel’s courses were also referenced), and Patrick Juola, who teaches computer science, offered some complementary and some dissenting advice. All the courses sounded excellent and yielded cool things. (Check out the student animation of Thomas Jefferson’s mail, here.)
One theme that came up was the myth of the digital generation. It’s just not true that everyone born after 1989 (yes, most of this year’s incoming freshmen will be, literally, children of the 1990s) comes into this world coding, hacking, searching, or networking like a pro. There are many skills to be taught, and there is real room for the humanities in empowering students to live and thrive in a world of digital texts. Something else Jeff, Bill, Paula, and Patrick all seemed to agree on was that their goal was not to teach a specific set of tools–in five years, any given software will likely be gone or unrecognizable–but rather to instill a foundation of basic concepts, a mindset of experimentation and procedural thinking. Patrick said that teaching computer programming was really about teaching math: if you understand the underlying mathematics, learning a programming language is easy–but if you only acquire the language, learning the math is like swimming upstream. But what is the equivalent, for the digital humanities, of the “underlying math”?
Another subtheme was that digital history courses to date are pretty improvisational–both because the approach is so new but also because of the temperaments of the people likely to teach it. Students have a lot of freedom to create their own projects. Thing is, students don’t always love having lots of freedom. I wonder what a more structured digital history course would look like.
(Apparently in Dan Cohen’s digital history class for grad students, the final project was not to produce a work of digital history but to write a grant application proposing one. Now there’s a class teaching students the way the world works.)
Games, Stories, and Worlds
It’s a shame more people didn’t come to the session on world and story building, especially since the session on games immediately after that was a little over-stuffed. (I think everyone was over at Bill’s session on actually making stuff with electronics. “See, Homer? That’s why your robot never worked.”) But Paula Petrik, James Smith, and I had a very nice conversation about games and simulation and narrative. James is involved in the world of MUDs, and I, you will be shocked to learn, have played some roleplaying games in my time. Anybody thinking about computer games and narrative, especially for teaching purposes, really ought to talk to somebody with tabletop gaming experience (and yes, I am available for consultation). In Here Comes Everybody, Clay Shirky talks about the innovation that comes from making failure essentially free. Computer gaming is a much bigger industry than its tabletop cousin, but pen and paper gamers can experiment much more rapidly, cheaply, and easily than computer game developers. And it sounds like the MUD/interactive fiction world is similar in many ways. Everyone oohs and ahs at the interactivity of, say, Grand Theft Auto, but nothing it does in terms of narrative or structure seems that astonishing to me. The indie RPG scene has its blowhards, but the conversations there about how narratives emerge from play are ahead of anything I’ve seen in the computer game world. Plus the nonexistent budgets of the indie tabletops are probably closer to conditions in most classrooms.
The second session on games was crowded, as I say, both with people and topics. Trevor Owens and, I believe, Dave Lester described Playing History, their proposed database linking to and evaluating all the little historical games out there for educational purposes. We talked about how to review games for classroom use (mimesis is not historicity) and how one might create a community of educators willing to review games for others. Marjee Chmiel demoed The JASON Project, a marine biology game she’s designing for National Geographic. The coolest part of the game, next to the part where you make sharks barf and then sift through their vomit for clues, was the “argument constructor,” where the kids playing have to use the evidence they’ve gathered along the way to convince the government that humans, not sharks, are killing the ocean’s seals. There’s a kind of GUI for matching individual chunks of evidence to arguments and counterarguments that is very clever–apparently it comes from a Nintendo game about lawyers, go figure–and it could certainly be used in a history game. What a neat way to get elementary school students actively constructing and dissecting historical arguments.
I made a point about how meta-gaming, the interaction around a game, is always a crucial part of the experience, and ought to be designed for too. I also plugged Sam Wineburg’s work on historical thinking, as I often do.
Later, I had a conversation with Mikel Maron, who is involved in OpenStreetMap (see below) and a keen game called SF0, and Josh Greenberg, newly ensconced in both the NYPL and the Park Slope daddy demographic, about possibilities for historical “alternate reality” games, mobile phone / GPS games, and the like. How cool would such a game in and around the NYPL be? Though what Josh really wanted to do was not to run a game but to design a software platform that would make it easy for other people to run their own games. You can take the boy out of CHNM, but you can’t take CHNM out of the boy.
My notes for this won’t translate into a blog post, but I actually did go to an honest-for-true programming session, led by the LoC’s Dan Chudnov on the visualization language Processing. Dan showed us how to draw an oval and a line of colored balls, and it was dead easy. With a little more work, we can apparently do things like this, or these, or these. I can’t help but feel there’s an underpants gnome issue here, but who am I to say?
On Sunday morning, Raymond Yee, who literally wrote the book on web mashups, led an excellent session on them too. These are not, alas, those songs that remix Enya with Prodigy, but rather things like the housing maps you can make by combining Craigslist with Google Maps. I was pleased that one of the mashups Raymond led off with was this Amazon-to-your-local-library button I’ve been using for ages. He also showed us this site that maps the location of every New York Times story in real time, and the blog from a course on mixing and remixing information he taught at Berkeley.
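In spirit, a mashup like that housing map is just a join across services: take records from one feed, look up something about each of them somewhere else, and hand the combined result to a map or a timeline. Here’s a toy sketch of the pattern in Python, with made-up listings and coordinates standing in for what a real mashup would pull from a Craigslist feed and a geocoding API:

```python
# Toy illustration of the mashup pattern: join one source's records
# (rental listings) with another source's data (geocoded coordinates)
# on a shared key, producing pins a map widget could plot.
# All data here is invented for the example.

listings = [
    {"address": "123 Elm St", "rent": 1200},
    {"address": "456 Oak Ave", "rent": 950},
]

# Pretend these came back from a geocoding service, keyed by address.
geocoded = {
    "123 Elm St": (38.8977, -77.0365),
    "456 Oak Ave": (38.9072, -77.0369),
}

def mash(listings, geocoded):
    """Attach coordinates to every listing the geocoder resolved."""
    pins = []
    for item in listings:
        coords = geocoded.get(item["address"])
        if coords is not None:
            pins.append({**item, "lat": coords[0], "lon": coords[1]})
    return pins

for pin in mash(listings, geocoded):
    print(pin["address"], pin["lat"], pin["lon"], pin["rent"])
```

The interesting work in a real mashup is almost entirely in the fetching and cleaning; the join itself really is this small.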
(Yes, Raymond has a wiki instead of a blog. He’s just that 2.0.)
(You can see me sitting next to Raymond (standing) in Dan Cohen’s picture here. I’m not looking at him, but neither is anyone else. We’re all 2.0!)
Josh asked good questions about how to connect these mashups with humanistic practices. Can a mashup make an argument? People in the library world were also interested in best practices for marking up data, to make their information accessible to applications like this, and to make the web less reliant on commercial sites like Amazon and Google Maps. Apparently there are efforts underway to create an open source library cataloging system with stable URLs for each item. The Library of Congress has something like this already, and Open Library wants to create a permanent URL for every book ever published. Neat.
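Part of what makes identifier-based permalinks attractive is that anyone can construct them from data they already hold, with no screen-scraping. As a small piece of that plumbing, here’s a sketch that normalizes an ISBN-10 to its ISBN-13 form and builds a permalink from it; the `catalog.example.org` base URL is a placeholder of my own, not any real service’s scheme, though the ISBN-13 check-digit arithmetic is the standard one:

```python
def isbn10_to_isbn13(isbn10):
    """Convert an ISBN-10 to ISBN-13: prefix "978", recompute the check digit.

    ISBN-13 check digit: weight digits alternately 1 and 3, sum them,
    and take (10 - sum mod 10) mod 10.
    """
    core = "978" + isbn10.replace("-", "")[:9]  # drop old check digit
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(core))
    return core + str((10 - total % 10) % 10)

def permalink(isbn10, base="https://catalog.example.org/isbn/"):
    # "catalog.example.org" is a made-up stand-in for a library catalog
    # that mints one stable URL per item.
    return base + isbn10_to_isbn13(isbn10)

print(permalink("0-306-40615-2"))  # ends in 9780306406157
```

Stable, guessable URLs like this are what let other people’s mashups treat your catalog as infrastructure.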
Mapping and GIS
Sean Gillies kicked off with Pleiades, which is mapping the places and place names of the ancient world in KML files easily readable by things like Google Maps and Earth. There were a bunch of other projects laying historical maps over Google Map-like interfaces, but either they’re not public yet or I didn’t take note of the URLs. There was considerable talk about the virtues of KML versus GML which I admit I lost track of, but I zoned in again for Mikel’s description of OpenStreetMap, which is a user-generated wiki-philosophy alternative to Google Maps, quite superior in some areas (especially in England and Norway, apparently, and in areas geared to pedestrians rather than cars) but still a work in progress in many less enlightened regions.
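For the curious: a KML placemark is a very small thing, which is much of its charm. Here’s a minimal sketch, using only the Python standard library, of generating one; Pleiades’ actual output is much richer than this, but the shape is the same. The one genuinely treacherous detail is that KML puts longitude before latitude in `<coordinates>`. The place below (Ephesus) and its rough coordinates are real; everything else is just the bare bones of the format.

```python
# Generate a minimal KML document containing a single named Placemark,
# the kind of file Google Maps or Google Earth can open directly.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)  # emit unprefixed tags in the KML namespace

def placemark_kml(name, lat, lon):
    kml = ET.Element(f"{{{KML_NS}}}kml")
    pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML coordinate order is longitude,latitude -- not the reverse.
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

print(placemark_kml("Ephesus", 37.94, 27.34))
```

Wrap a loop around that function and a gazetteer becomes a map overlay, which is more or less the trick Pleiades is pulling off at scale.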
“Density of information, not decoration” is the mantra for mapping and data visualization in general. These animations of historical housing data are perhaps prettier than they are revealing–but the housing booms (try the Madison, WI one) look a lot like the final stages of the old game Missile Command.
Enough of THAT
I had an early flight so I missed the final session, but as you can see, they had already packed a lot into a day and a half. Plus I had umpteen great conversations I haven’t blogged, met a passel of cool people, heard some juicy insider Zotero gossip (there’s just no way to say that and retain any hope of sounding cool), and the NEH begged me to come play with the Department of Energy’s supercomputers.
Thanks again, all. See you next year?