All posts by MBHP

What is Genre? (A Video, and a Beta Test)

Below is my “Concept in 60” on the problem of genre–really, the super-basic-est of issues at the heart of genre theory and genre studies and every other thing with “genre” in front of it. The nice thing about video is that it lets me draw (or show you, generally) the problem in a way that’s easy to understand.

Warning: it’s a bit quiet–I need to reconfigure the gain on my microphone.

The other reason this video exists is that I find the “watch while I draw with a sharpie” genre of educational YouTube videos to be visually efficient, attractive, and pedagogically effective. So I’m beta-testing my ability to create these videos with only the equipment I have on hand (namely, a smartphone, two PCs, and Audacity).

Curated Reality: Directives, the Collapse of Collaboration, and Technology in School

If we all took the Invisible Gorilla experiment (1-3), and you were one of the people who saw the gorilla, you’d probably try to figure out who else saw the gorilla, work out what they have in common with you that let them see it, and find a way to say “Hey, you saw the gorilla too. What’s the deal with the gorilla?”

Since you’d been directed to count all the passes, though, you would then convene with the Basketball Counters (who, all this time, have been doing to their calculations what you’ve been doing to the gorilla) in an attempt to arrive at an accurate picture of what’s going on here. That’s the collaboration Davidson gestures ambitiously towards in her introduction (5); it’s the framework for the book’s objectives–to examine how we might adapt our schools and workplaces to account for this human tendency to pay attention to some things and not others, and to help people seek out new information on what they’re missing when they need to problem-solve.

But let’s suppose (in a little mental experiment) a version of the Invisible Gorilla experiment in which no one directed the audience to count the passes–there is no clear problem to solve. Without this directive, people would watch the video with a more open filter: the counting-inclined counting all sorts of passes in different categories, the sports-inclined watching the form of the passes, perhaps, the literature-inclined attempting to close read the scene for symbolic meaning, and a fair number of people just watching. A larger number of people, without their attention externally directed elsewhere, might see the gorilla–without telling everyone to count (as Joe deliberately neglected to do on Friday), most people see the gorilla. It’s a gorilla.

In a group large enough, without this counting directive, people might generally see the gorilla, understand the passes and the coding of the colored shirts, and talk about what it all means. But this central consensus on the (now obvious) gorilla would still generate a series of outlier groups: people who don’t think the gorilla is important, people who counted all the passes between black shirts, or white shirts, or from one color of shirt to the other, and so on. They form their own small groups, reinforcing each other’s beliefs.

This is basically the internet in a nutshell. There is a mass of data, to be processed by people, with no filtering directives and no directive towards problem-solving. Like Baby Andy (47), it’s just spitting data at us, and we’re selecting parts, giving them value, reinforcing the reproduction of that data, and grouping up with other people to form cultures where “Dada” is a word and “Mada” is not, where the gorilla doesn’t matter but the black-shirt passes do.

Internet Opinions

Collaboration under these circumstances may or may not be as prevalent as under the “count the passes” directive, but this collaboration is fragmentary, and self-reinforcing, non-collaborative (or intra-group collaborative) activity is just as common. Team Gorilla and Team Mathematics don’t always talk. They have no reason to. This self-selecting group identity without an impetus to collaboration creates what I call a Curated Reality (sometimes called a bubble world, or a pundit sphere, or, when properly financed, a cable news network). Davidson seems particularly unconcerned about this (the book is deliberately “optimistic” [back cover], after all), but it does make me nervous. I’m usually one of those “the internet is THE BEST THING THAT EVER HAPPENED” people, but there’s no ignoring the fact that people (all of us) select data that we are already pre-inclined to find interesting, accessible, or agreeable, filter out as “crap” all the stuff that is uninteresting or contrarian, and remain generally blind to these filtering activities.

What this has to do with my experiences as a student and as an instructor grows out of the fact that, like Duke, my undergrad institution distributed iPods to its incoming freshmen (in the 2006-2007 school year). The problem was that, without Duke’s financial resources or the Apple branding help garnered from Duke’s large public profile, the iPods were distributed only by specific programs in specific colleges. Unlike Duke’s student population, which as Davidson indicates had been directed towards education their whole lives (64), the population at my not-quite-Ivy institution was less inclined to go along with the overtly experimental program, effectively fragmenting the student population (and the school) into iPod Education Advocates (developing apps and doing work), Happy iPod Hijackers (who laughed at this heavy-handed idea that if you just dump new tech into an old classroom things will get more efficient, and just used it to listen to music), and The Humanities Students (who did not receive the technology at all, despite appealing to the administration). The collaboration that Davidson commendably notes at Duke (65) collapsed before it even formed, except for isolated, intrepid pockets of iPod Education Enthusiasts and iPod owners. Like miniature Dukes.

In the following years, whole colleges in the university abandoned the program. The programs that abandoned the iPod did so because collaboration and innovation were stifled–ironically, stifled because these programs had implemented free iPods in an unequal fashion and hoped that crowdsourcing without directives would somehow magically collaborate them straight into the information age. iPod education became a Curated Reality–those who had it said it worked, those who had given up on it said it was worthwhile but not exemplary, and those who never had it scoffed at the idea that technology had anything new to offer, and none of these groups was really interested in talking because there was no directive, no problem to solve. Collaboration became in-group only, and attention blindness became the mode of the day.

While Davidson says Duke’s program never came with a directive (62-63), it did implicitly have one. Duke distributed the technology to a student population already inclined to work outside class time on improving the university, with specifically branded partnerships with Apple, under an educational initiative undertaken by the whole university with the direction of Davidson herself (64). In essence, she did the Invisible Gorilla experiment on a room full of professional counters at a conference on counting basketball passes–a directive is implicit in the context, creating an object of, and impetus for, collaboration.

The excerpt from a YouTube video that follows, posted by user Gabgorilla on October 20, 2011, stands as a prime example both of an argument for technologically enhanced education and of an artifact of collapsed collaborative possibility and implicit, limiting directives, forming a Curated Reality:

In the video, the user (a student or professor, perhaps, in a digital composition course) juxtaposes the “classrooms of today” (which are filled with laptops, mostly Apples) with the “classrooms of the past” (with patriarchal paintings and warped desks) (see 00:16 to 00:19), using Dictionary.com and proprietary clip-art to make a point about technology and classrooms in a painfully artificial use of technology that students would giggle a bit at. The video transitions from talking about technology in education generally to focusing implicitly on composition, challenging the notion that technology can only be used for “word processing” (01:48 to 01:50) while it fails to cite any uses that are not composition-oriented. The end result is commendable, but fails to reach outward beyond its implicit focus on composition technologies, proposing to enable a collaboration it implicitly fails to imagine. The video’s author challenges us to use technology in new ways, which in the video seems to mean making essays with more expensive software than a word processor.

This video gets caught up (as Davidson does, a bit) in the rhetoric of technology as panacea for education–a Curated Reality built on enthusiasm for technology and education, a laudable enthusiasm that frequently erases the dangers of technology inequality and of shoehorning technology into a classroom without regard for its actual pedagogical usefulness or for the ways in which technology has already impacted the classroom. Technology, despite everything said, insistently remains a replacement for or enhancement of older technology, and paying attention to it at all is grounds for self-congratulation (see all of Davidson, Chapter 3). Likewise, it remains bound up in an implicit economic language where the cost of these technologies, and their accessibility, are ignored. iPods are used to record and distribute spoken lectures to other iPod users (Davidson 66), and Duke (and Davidson) congratulate themselves on crowdsourcing new ways to use technology to make education accessible to everyone (with the several hundred dollars necessary to purchase an iPod in 2006).

Selective attention to one aspect of educational technology by a specialized group of educators with a specialized group of students (Davidson’s Duke and its implicitly elite student body), under the directive (implicit or otherwise–it was certainly obvious to Duke students) of modernizing educational practices, creates a small group which can collaborate but collapses the possibility of collaboration outside that context–no one cites the problem of unequal implementation, or of the social forces built into educational systems which disqualify certain approaches (and which contaminated Davidson’s experimental control of not telling students what to do). Davidson, pointedly, recognizes this skewed basis but continues to universalize her experience at Duke anyway (64). She claims there are no directives or conditions–but directives were built in everywhere. Likewise, at my undergraduate institution, the unequal implementation of the initiative put further directives in the mix, rapidly enabling very specific kinds of collaboration and utterly destroying any other kind.

My much-belabored point is this: much like the video’s limited embrace of technology, Davidson’s ideas in Chapter 3 of where this technology goes perpetuate some of the problems she wants to fight: they disable the awareness of attention blindness and collaboration that she champions. Like many education technology enthusiasts (me, and Davidson, and others), we have challenged the conditions of an old, conservatively anti-technology Curated Reality of education and, in the process, perpetuated our own Curated Reality, blind to our structural preconceptions. We have enabled some forms of collaboration by disabling others, blind to our own implicit directives while claiming to be “open.” Our utopia is smaller than we imagined, because membership and collaborative knowledge are governed by criteria we pretend aren’t there.

Eating on the Wall

So I tend to go out to eat a lot, particularly when I end up back in Philadelphia (even if I’m there for only an hour and this is why). This is one of those central things my significant other and I agree upon, though I do sometimes frustrate her with my constant desire to eat a whole meal at literally any time of day.

As of the first paragraph, I’m actually doing that right now. For atmosphere, put this on in another tab, wait through the ad and the promo (because apparently noise can’t sell itself), and then come back. We can pretend to be in a coffeeshop together.

You done? Okay. Fiddle with the volume. I can wait. Ignore the sidebar–YouTube is trying to distract you with paid content.

All the time in the world here.

Anyway, full disclosure: the S.O. uses Instagram (and no, I don’t have the link to her profile). Her fascination with Instagram has made her a rather good photographer, doubtless in some small part due to a family tradition of taking artful pictures all the time. This practice extended to our frequent mealtimes and created a ritual I have come to call Saying Hipster Grace. In this daily ritual, the adherents await the presentation of the meal, at which point they make a ritual ablution with their phones: aggressively rearranging the table to get all the food in frame and then taking a picture.

The food then begins to cool as they select filters and post to Instagram. A sigh of satisfaction signals the end of the ritual, as the picture uploads and the phone goes away.

My being rather agnostic about Saying Hipster Grace (I refuse to get baptized into Instagram, and just store pictures of good meals on my phone) is uncharacteristic and difficult to explain. In Standage’s terms, where we’re all Romans running down to the dock to get our mail (26), I’m the literate plebe that taught himself how to swim. I need data, all the time, and as many of you found out today, I’m constantly trying to keep my social media presence organized.

But I just won’t share my frankly spectacular meals on social media. Something about that just seems like oversharing, like Cicero tweeting from the restroom. And this makes me meditate on Standage’s assertion throughout Writing on the Wall that social media has, in some form, “been around for centuries” (250)–and I agree, it certainly has, even if his history sometimes chafed the postcolonialist in me for being really traditional, Eurocentric, and a bit self-fulfilling. But I can’t help but think that, as Standage himself notes, there’s something a little different this time around (239), something more pervasive, and more centralized (248).

My S.O.’s picture of cinnamon bun french toast from two days ago on Instagram was linked by 49 people in 4 hours, and by noon had been incorporated into the restaurant’s online marketing presence, all without her explicit permission. As with Cicero’s letters, she’s happy to have it reproduced and used (the dish was lovely, after all, and the attention is her gift to the restaurant), but unlike Cicero’s letters, her social media is being put to a pretty thoroughly centralized, wholly commercial end. This is Facebook Corporation via Instagram, not Cicero’s scribe, and breakfast ads don’t usually save the Roman Republic from ruin. Something is different here, but I’m not sure it’s just corporate centralization (one of Standage’s constant perils).

Super-Serious.

I take pictures of my best meals because I want to remember them, not to share them. I get the impression I’m carrying around a model of “what is important information” that Standage might attribute to the broadcast media age–that magically approved authorities like CNN (which constantly posts pictures of food) have a greater claim to talk about their lunch than I do. Sponsored results like the ones at the top right of the YouTube video I’m having you listen to, and the annoying voice-over at the beginning of the video, perpetuate this broadcast-privilege model of importance–the business that provides this service has more authority to sponsor or review than the average user, despite my profligate linking to things I like. Despite Standage’s closing note, I still feel like I have to “squeeze through the bottleneck of broadcast media” (250), even though nothing actually bars me from having that scale of web presence. This rebirth of social media is having some trouble shaking off its broadcast media phase, like a horrible Nazi-connected (202-203) puberty that has left its marks. That might be why a lot of people (and the language of Twitter, with “followers”) imagine a me-as-central-broadcaster-to-audience model of social media, even though the constant desire for response and “likes” (and followers talking back) shows how this is different and interactional.

I guess, to declare my independence from broadcast-media thinking, I should take and post a picture of the cappuccino I was drinking in Elixr as I wrote that last paragraph, but I logged out of the coffeeshop without logging in to take a picture.

A Little Preemptive Digital Archaeology

Asimov Type Faster

I may have odd digital writerly habits. For instance, the text you are reading right now was composed in the Windows 7 version of Notepad, the almost-totally-formatting-free, ASCII-based text editor. While I look at MSWord the way many people look at a sibling, whenever I compose for web-based reading I compose in Notepad. I’ve been poking around at writing on the internet since the early 1990s, and I still can’t shake the (apparently justified) feeling that someway, somehow, if I type this in Word and then copy-paste it to the webpage, it will find a way to become ugly.

While reading Baron, I couldn’t help but notice he, and many of the writers he refers to, have similar strange or archaic digital habits. I might be born-digital-and-maladjusted, though, since I started writing not with a PC but with an already-outdated IBM Selectric III (see 79), had a plastic toy manual typewriter which jammed constantly, handwrote my first publishable stories, and continue to handwrite notes for classes in a leatherbound quarto notebook.

I also had intense brand loyalty to pens and certain pencils for creative writing, and I kept and archived the pens exhausted in my fiction-making, complete with the date of final drying-out and what project they served on. At least until I moved on to just using MSWord.

But, contrary to Baron’s example (51), I am nearly as particular about my keyboard as I am about my pens–I’ve disqualified laptop purchasing options based on the closeness or spacing of their keys, and I once had to mail order the last remaining unit of a specific model of keyboard from a forgotten office supply warehouse in Texas. And I still remember the feel of the slightly concave, chocolate-colored keys of the Selectric.

What this makes me think about is how many of us bring archaic practices to digital writing, and how we might begin to separate the archaic from the necessary in order to see what it is that digital media are actually making us do. I’m sure, for instance, that avid science fiction writer and gonzo futurist Charles Stross could probably find a better way to distribute his copyleft drafts than as .rtf documents, and that I could find a better composing medium than a program which, in essence, imitates a console interface. Certain digital habits have mostly fallen by the wayside–like Baron’s “handwriting fonts” (66) and fiddling with different fonts in email and papers (83), both of which are now strangely nostalgic enterprises that usually indicate that someone is new to digital composition and just basking in the endless stylistic possibility. Meanwhile, other incredibly frustrating old practices remain–the insurance industry, for example, still depends on faxing, which elicits the following reaction from anyone new to that industry:

This is Alan Rickman flipping a table. Your eyes do not deceive you.
My boss learned quickly not to mention the faxosaurus by name.

Certainly, there are countless complex forces determining which (sometimes frankly unnecessary) things we bring to digital composition from its predecessors–the lack of a white background was, according to Baron, a major obstacle to the popularity of early word processing software (105)–but we have to wonder what sort of impact these alien-to-computers techniques and technai will have on the future of digital writing. How long will it be before nobody knows what the “save” icon means? How long before people stop seeing the computer as “a better pencil”? What weird writing customs and cultural practices are we going to leave them with because we hold on to things that look like paper?

–Michael
(because for all my technological savvy, I can’t get my WordPress username changed)

Embarrassing Email Addresses and the New Person

The most interesting difference since 2007 is a strange one, since it’s conventional to imagine that the relationships between people have changed. What this doesn’t acknowledge is that the actual definition of what a person is has changed. Prior to the really big expansion of internet use (when I started undergrad), when you talked about a person on the internet, you were talking about a person *using* the internet, a USER, who could just as easily drop the little bits of themselves they’d put out there and disappear. You had a friend from high school; their MySpace page got deleted, and they were gone until you went and found them in “real life”. But at some point, after we came to depend on this technology enough, it became inseparable from us–you cannot, after all, ever really delete a Facebook page, or stop using an email address, because the information is not wholly private, and thus not wholly containable. A part of you is in here.

Think, for instance, about how email use has changed. The embarrassingly verbose, ridiculously fetishistic email addresses of the early century are nearly gone (except for socially awkward moments of realization, when they pop back up), and people tend to centralize around one or two or three addresses, instead of just getting a new one and naming it something silly. It went from being a screenname to being an address that you have to give out to other people.

There’s also the moment of absolute panic when something is accidentally shared or deleted–it’s mourning and self-panic now, instead of the same phenomenon as misplacing car keys. It’s the idea of the person that’s been redefined.