Danah Boyd’s skepticism about — and deconstruction of — the terms “digital native” and “digital immigrant” have done much to strip away the false veneer attending those particularly loaded phrases. I’ve always been skeptical of these characterizations, especially after witnessing so many members of my own generation engage in what one might call “digital innovation.” Many of the most useful digital tools and software we enjoy today have come from people like Steve Jobs, Bill Gates, Roy Rosenzweig, and others who grappled in adulthood with the promises that technological advances offered society and the humanities at large.
For me, the distinction between “native” and “immigrant” is not simply generational; it rests instead on adeptness of use. Younger people are more patient, more adept users of new media and digital technology, while those of us born in the 1950s, 1960s, and early 1970s are less likely to embrace it. Only when the world at large “tests” a new technology and deems it appropriate (and often essential) for everyday use do we “old fogies” embrace those new digital tools. Granted, younger people born in the 1980s and 1990s are not just users; they are also producing the innovative, graphically stunning video games that have flooded the market. But these younger generations don’t quite seem to “own” the market as fully as I had previously imagined; skill with a computer does not equal wisdom with a computer. We label these younger folks “digital natives,” but they certainly don’t have all the answers. I’ve always considered my son (born in 1985) to be one of those stereotypical “digital natives.” Thus, I routinely deferred to his digital skill in navigating the Internet responsibly while I faithfully played the role of “digital bumbler.”
But my hands-off attitude changed over time as I began to oversee and question how he was using the Internet and digital tools. I quickly learned that he had jumped on innovations like Facebook and Twitter (and some very violent online games) without questioning what they might or might not do to his online reputation — or to his psyche! His judgment in these cases was not always sound. My initial lack of interest in supervising his use of these online tools stemmed from my deference to him as a “digital native.” In other words, I told myself, “he knows better than I do the dangers inherent in these online digital tools.” Not so. He got himself into a pickle on more than one occasion, and I had to help pull him out. He was an excellent “user” of digital tools, but he used them reflexively and without much forethought.
Thus, Danah Boyd is on target when she claims that the rhetoric attending these terms is not just inaccurate, it’s dangerous. The “networked world” she describes is fraught with politicized language and trapdoors that can snare the unsuspecting user, “native” and “immigrant” alike (Boyd, 197). An inclination to use digital tools and platforms, which is how I have come to understand “digital natives,” does not automatically make one a discriminating user. Older generations of less adept but interested users (like me) therefore still have a role in guiding the generations growing up with this technology, helping them use it properly and recognize its potential pitfalls.
We must also keep in mind that many members of the younger generations are turned off by technology because of the pitfalls they have experienced or that await them: the posting of unflattering images, online bullying, false information, and the like. Perhaps these online dangers explain why Allison C. Marsh’s museum studies students (supposedly “digital natives”) demonstrated “little interest in the digital world” (Marsh, 279). Equally disturbing to her was her students’ inability to use a simple program like Omeka to build an online exhibit that flowed logically. Granted, innovative experiments such as T. Mills Kelly’s “historical hoax” class can help draw reluctant students from the “digital native” generations toward a more discerning and responsible use of the Web and other digital tools. Such an approach is a double-edged sword, however, primarily because the thrill attending a hoaxing exercise can produce students who go on to become “serial Web hoaxers.” In other words, it could create a whole new category of user — the “digital monster.”
What I find truly puzzling, though, is that none of the professors I have had at GMU has employed any digital innovation whatsoever in class. Of my 10 courses in the Master’s program, only one demonstrated (and for a single class session only) how to use the Web to find online primary sources: we spent the session with a librarian explaining how to find newspapers online using ProQuest — and that was it. I find this state of affairs remarkably ironic given Daniel J. Cohen’s admission that digital history is what prompted the Virginia State Council of Higher Education to approve GMU’s PhD in History as the “PhD with a Difference.” The only class I have taken that involved anything digital is this one — HIST 696, Clio-Wired — and only as part of the PhD program. If GMU is the standard-bearer of digital history, why are the History Department’s faculty members not on board? In my Master’s courses, the traditional model of sitting in a circle to discuss the “monograph of the week” dominated the teaching approach. The digital tools available today allow for excellent visualizations and immersive experiences; I would have loved to see some PowerPoint slides with images from the period, or audio and video clips from documentaries, to supplement the in-class learning experience. These are cheap and easy tools to use. I run an academic institution for the Army, and, strapped as we are for cash thanks to a gridlocked Congress, we still employ a variety of digital and audio-visual tools to enhance learning, most of which cost nothing. If we are to undergo a true “digital turn” in academic history, then our history professors should set the example, especially at the very institution that prides itself on leading the digital charge.