Virtual Peace is alive as of last evening.

For the last gosh-don’t-recall-how-many-months I’ve been working as a Project Collaborator on a project envisioned by the other half (more than half) of the Jenkins Chair here at Duke, Tim Lenoir.  For those of you who don’t know Tim, he’s been a leading historian of science for decades now, having helped found the History and Philosophy of Science program at Stanford.  Tim is notable in part for changing areas of expertise multiple times over his career, and most recently he’s shifted into new media studies.  This is the shift that brought him here to Duke, and I can’t say enough what an incredible opportunity it is to work for him.  We seem to serve a pivotal function for Duke as people who bring innovation together with interdisciplinarity.

What does that mean? Well, like the things we study, there are no easy simple narratives to cover it.  But I can speak through examples.  And the Virtual Peace Project is one such example.

Tim, in his latest intellectual foray, has developed an uncanny and unparalleled understanding of the role of simulation in society.  He has studied the path, no, the wide swath of simulation through the history of personal computing, and he has developed a course teaching contemporary video game criticism in relation to the historical context of simulation development.

It’s not enough to just attempt to study these things in some antiquated objective sense, however.  You’ve got to get your hands on these things, do these things, make these things, get some context. And the Virtual Peace project is exactly that. A way for us to understand and a way for us to actually do something, something really fantastic.

The Virtual Peace project is an initiative funded by the MacArthur Foundation and HASTAC through their DML grant program. Tim’s vision was to appropriate the first-person shooter (FPS) interface for immersive collaborative learning.  In particular, Virtual Peace simulates an environment in which multiple agencies coordinate and negotiate relief efforts in the aftermath of Hurricane Mitch in Honduras and Nicaragua.  The simulation, built on the Unreal game engine in collaboration with Virtual Heroes, allows 16 people to play different roles as representatives of various agencies, all trying to maximize the collective outcome of the relief effort.  It’s sort of like Second Life crossed with America’s Army, everyone armed not with guns but with private agendas and a common goal of humanitarian relief. The simulation is designed to take about an hour, perfect for classroom use. And with its review components, instructors have detailed means for evaluating the efforts and performance of each player.

I can’t say enough how cool this thing is.  Each player has a set of gestures he or she may deploy in front of another player.  The simulation has some new gaming innovations, including proximity-based sound attenuation and full-screen, full-session, multi-POV video capture.  And the instructor can choose from a palette of “curveballs” to make the simulation run interesting.  Those changes to the scenario are communicated to each player through a PDA his or her avatar carries. I was pushing for a heads-up display, but that’s not quite realistic yet, I guess. 😉
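Proximity-based sound attenuation of this kind is usually just a distance falloff applied to each speaker’s voice. Here’s a minimal sketch of the general technique in Python; the function name, reference distances, and rolloff exponent are all invented for illustration, and the actual Virtual Peace implementation lives inside the Unreal engine:

```python
import math

def attenuate(volume, listener, source, ref_dist=2.0, rolloff=1.5, max_dist=30.0):
    """Scale a speaker's volume by the distance between two avatars.

    Inverse-distance falloff: full volume inside ref_dist, silence
    beyond max_dist. All parameter values here are illustrative.
    """
    d = math.dist(listener, source)  # Euclidean distance between positions
    if d <= ref_dist:
        return volume
    if d >= max_dist:
        return 0.0
    return volume * (ref_dist / d) ** rolloff
```

So a player standing one meter away hears you at full volume, while one across the room hears a fraction of it, and one outside `max_dist` hears nothing.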

The project pairs the simulation with a course-oriented website.  While a significant amount of web content is visible to the public, most of the web site is intended as a sort of simulation preparation and role-assignment course site.  We custom-built an authentication and authorization package that is simple and lightweight and user-friendly, a system that allows instructors to assign each student a role in the simulation, track the assignments, distribute hidden documents to people with specific roles, and allow everyone to see everything, including an after-action review, after the simulation run.
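The authorization logic described above — role assignment, role-gated documents, and everything opening up after the run — can be sketched roughly as follows. This is an illustrative data model, not the project’s actual package; every class and field name here is invented:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    roles: set = field(default_factory=set)  # empty set = visible to everyone

@dataclass
class CourseSite:
    assignments: dict = field(default_factory=dict)  # student -> assigned role
    documents: list = field(default_factory=list)
    run_complete: bool = False  # after-action phase opens everything

    def assign(self, student, role):
        """Instructor assigns a student a role in the simulation."""
        self.assignments[student] = role

    def visible_to(self, student):
        """Hidden documents show only to matching roles until the
        simulation run ends, after which everyone sees everything."""
        if self.run_complete:
            return list(self.documents)
        role = self.assignments.get(student)
        return [d for d in self.documents if not d.roles or role in d.roles]
```

The design choice worth noting is the phase flip: visibility is a function of both role and simulation state, so the after-action review requires no re-permissioning of individual documents.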

Last evening, Wednesday October 08, 2008, the Virtual Peace game simulation enjoyed its first live classroom run at the new Link facility in Perkins Library at Duke University.  A class of Rotary Fellows affiliated with the Duke-UNC Rotary Center were the first players in the simulation and there was much excitement in the air.

Next up:

I never miss a beat here, it seems, for I am already onto my next project, something that has been my main project since starting here: reading research and patent corpora mediated through text mining methods.  Yes, that’s right: in an age where we struggle to get people to read at all (imagine what it’s like to be a poet in 2008), we’re moving forward with a new form of reading: reading everything at once, reading across the dimensions of text. I bet you’re wondering what I mean.  Well, I just can’t tell you what I mean, at least, not yet.

At the end of October I’ll be presenting with Tim in Berlin at the “Writing Genomics: Historiographical Challenges for New Historical Developments” workshop at the Max Planck Institute for the History of Science. We’ll be presenting some results related to our work with the Center for Nanotechnology in Society at UCSB.  Basically, we’ll be showing some of our methods for analyzing large document collections (scientific research literature, patents) as applied to the areas of bio/geno/nano/pharma in both China and the US. We’ll demonstrate two main areas of interest: the semiotic maps of idea flows over time that I’ve developed working with Tim and Vincent Dorie, and the spike in the Chinese nano scientific literature at the intersection of bio/geno/nano/pharma.  This will be perfect for a historiography workshop. The stated purpose of the workshop:

Although a growing corpus of case-studies focusing on different aspects of genomics is now available, the historical narratives continue to be dominated by the “actors” perspective or, in studies of science policy and socio-economical analysis, by stories lacking the fine-grained empirical content demanded by contemporary standards in the history of science.[…] Today, we are at the point in which having comprehensive narratives of the origin and development of this field would be not only possible, but very useful. For scholars in the humanities, this situation is simultaneously a source of difficulties and an opportunity to find new ways of approaching, in an empirically sound manner, the complexities and subtleties of this field.

I can’t express enough how excited I am about this. The end of easy narratives and the opportunity for intradisciplinary work (nod to Oury and Guattari) is just fantastic.  So, to be working on two innovations, platforms of innovation really, in just one week.  I told you my job here was pretty cool. Busy, hectic, breakneck, but also creative and multimodal.

It’s the dawn of the posthuman century, and so perhaps the irony of phrases such as “virtual help” and “simulated peace” contains the echoes of nostalgia redolent in an ever-accelerating technological era. I’m excited to attend a presentation on humanitarian aid & development sims by Ryan Kelsey and colleagues from Columbia at the CNMTL.

I’m interested in this primarily because of two dimensions in which I work: teaching How They Got Game at Duke, and participating as a project collaborator for the Virtual Peace project. Both gaming and pedagogy are in some ways new subjects for me, new in the sense of analyzing and building both games and courses (and courses about games and building games, in the case of How They Got Game).

The talk focused on two sim projects from the CNMTL, one begun last year and another first started back in 2001. Tucker Harding of Columbia spoke about ReliefSim, a health-related turn-based learning simulation used in the classroom to help students develop a deeper understanding of dealing with and working under the conditions of a humanitarian crisis. ReliefSim’s development began in 2001.

The crisis in ReliefSim is a forced migration. Students enter ReliefSim by first viewing a text-heavy HTML interface with a long series of interactive selections. The initial interface reflects the overall idea that the sim is not really training so much as educational augmentation. Display categories include assessments, interventions, information gauges, team, and age breakdown. With this display a player does not get a picture of the greater context of the crisis (e.g., that it was caused by warring factions along national borders), but it immediately gives a sense of the sim’s features and depth of impact.

With the panel the player chooses actions and assigns those actions to members of the team. In turn one, for example, we assign a water supply assessment to Eric, a food supply assessment to Marilyn, and a population assessment to Ryan. When we click “end turn,” the interface gives us back data generated by the assessments. Good information for our crisis: 10,000 people involved, 1,600 under 5, 3,000 between 5 and 14, and 5,400 aged 15 and up (no assessment for the elderly and/or infirm at this point). We also see we have a 15,000k kcal food supply, where each individual needs a minimum of 600 calories per day. We also have 100k liters of water, with a 5 liter per person daily water demand. Our food supply seems good, as we can feed everyone an average of 1,500 calories per day. We also have 10 liters per day per person. However, will our population grow? We can support up to 20,000 people on our minimum water supply and 25,000 based on food.
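These back-of-envelope checks reduce to a few divisions. (The 25,000-person figure implies the food supply is 15 million kcal, i.e., 15,000k kcal; the variable names below are mine, not ReliefSim’s.)

```python
population = 10_000
water_supply_l = 100_000        # liters on hand
water_min_l = 5                 # liters per person per day (minimum)
food_supply_kcal = 15_000_000   # 15,000k kcal
food_min_kcal = 600             # kcal per person per day (minimum)

water_per_person = water_supply_l / population    # current L/person/day
kcal_per_person = food_supply_kcal / population   # current kcal/person/day
max_pop_water = water_supply_l // water_min_l     # carrying capacity by water
max_pop_food = food_supply_kcal // food_min_kcal  # carrying capacity by food
```

Running these gives 10 liters and 1,500 kcal per person per day, with maximum supportable populations of 20,000 (water) and 25,000 (food), matching the sim’s readout: water, not food, is the binding constraint if the population grows.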

The second game, presented by Rob Garfield of Columbia, is the Millennium Village Simulation, developed by Jeffrey Sachs. The game’s conceit is that you the player are a sub-Saharan farmer trying to support your family as you move from subsistence farming to generating income. The Millennium Village Simulation reflects Sachs’s full-spectrum approach to treating poverty. You can’t just build schools, for example, if your village suffers from occasional malaria epidemics that wipe out entire groups of children.

The sim interface for the turn-based game is similar to, albeit sexier than, the interface for ReliefSim. Not limited to tabular/textual representation and selection, it shows the player a simple visual representation of the farmer in the context of a village, and the village in the context of greater environmental factors.

Each turn, the player allocates the farmer’s time (including his wife’s) across a set of development tasks, such as collecting water, farming, or organizing a small business. If we choose to assign hours to farming, we are given choices as to whether we want to perform subsistence farming (growing maize) or income-generating farming (growing cotton). At this point we don’t have any idea how much effort translates into a result. We selected four hours of water collection, but we have no idea how many hours are needed to meet basic needs.

As we took a turn I noticed that the daily allocation was being set in the interface for an entire season; each turn is a season. (Which season?) The game takes a general approach to location (sub-Saharan Africa varies widely in seasonal conditions, for example) and a rationalist, optimization-oriented approach to helping a student learn to support a farmer in such a location.
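At its core, the allocation mechanic is a constrained budgeting problem: a fixed number of household hours per day, spread across tasks, held for a whole season. A toy version of the structure might look like this; the task names, hour figures, and function are all invented, not the Millennium Village Simulation’s actual model:

```python
def allocate_season(hours_available, allocation):
    """Validate one season's daily time budget for the household.

    allocation maps task name -> hours per day. Illustrative only;
    a real model would also map hours to yields, income, and health.
    """
    total = sum(allocation.values())
    if total > hours_available:
        raise ValueError(f"over-allocated: {total} > {hours_available} hours")
    return hours_available - total  # leftover hours per day

leftover = allocate_season(
    16,  # say, two adults at 8 working hours each per day
    {"water_collection": 4, "maize": 6, "cotton": 3, "small_business": 2},
)
```

The opacity the talk surfaced lives outside this sketch: the player can budget the hours, but the mapping from hours to outcomes (how much water four hours actually collects) stays hidden until the turn resolves.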

As with the previous presentation, the presenters displayed stunning tools built with a general knowledge/time-management orientation. The SN-LMS presenters evaluated server logs within a site to understand the character of students; the game tests a student’s ability to delegate time in order to reach optimally managed conditions for the economic development of a farmer. Both suffer somewhat from a lack of the specificity that can only be gained from the detail of greater context. It’s not clear why cases aren’t more strongly relied upon, even as frameworks for developing evolving and dynamic game scenarios.