
Frontier notes on metaphors: the digital as landscape and playground

“Try to imagine a culture where arguments are not viewed in terms of war, where no one wins or loses, where there is no sense of attacking or defending, gaining or losing ground.” – George Lakoff and Mark Johnson, Metaphors We Live By (1980)

Bluntly put, time spent on Facebook stops us from giving love and affection to others or from furthering projects that undermine capitalism. What’s more, many people still labor on farms and in factories, and let’s not forget the working poor, undocumented workers, and youth in rural areas for whom access to the Internet is not a given. “The digital” does not sum up our entire condition. – Trebor Scholz, Why does digital labor matter now? (2013)

Google’s new campaign for teaching digital citizenship & safety to children – ‘Be Internet Awesome’ – invokes the idea of the internet as a playground. It recalls the kind of sign that elementary schools might post on the doors to remind students how to behave at recess. “These lessons are delivered via Interland, a playful browser-based game that makes learning about digital safety interactive and fun — just like the Internet itself.”

The lesson on security and privacy focuses on building strong passwords and controlling what information is visible in a profile. Remember kids, don’t share your locker combination with anyone and use your words carefully. Now go have fun!

In many ways, this message strikes an age-appropriate tone, and were it the creation of an elementary school teacher, I wouldn’t have much to say about it. But of course the deeper privacy issue with Google is not about what we choose to put in our profiles, but what information Google harvests from us for profit.

If the Internet is a playground, we are encouraged to conceive of our time and cognitive effort as play rather than work. And we are encouraged to ‘gamify’ our classrooms, to reproduce the seemingly engaging logic of ranks, rewards, and levels that makes gaming so addictive. Yet McKenzie Wark cautions us to attend to the underlying dynamic of ‘playbor’, in which we participate by posting, liking, and conversing while Facebook and other companies profit. In this case, the Internet as playground is not an ‘open’ space, but one which Wark argues is based on “proprietary algorithms for managing networks” and “the data that can be extracted from those networks and that remains resolutely proprietary.”

We might imagine an innocent origin of gamification – the kids playing D&D in Stranger Things – but as Daphne Dragona points out, the “term only started being used in 2010 after it was reintroduced by the technology company BunchBall.com as a new form of game based marketing strategy.” She writes that “gamification succeeds in applying new forms of measurement and capitalisation” and “processes of homogenization.” Gamifying math homework or reading simply introduces incentives to get students to do what we want them to do, much like issuing approval in the form of grades.

According to the New York Times, Uber uses gamification tactics to keep its drivers on the road “without giving off a whiff of coercion.”

“To keep drivers on the road, the company has exploited some people’s tendency to set earnings goals — alerting them that they are ever so close to hitting a precious target when they try to log off. It has even concocted an algorithm similar to a Netflix feature that automatically loads the next program, which many experts believe encourages binge-watching. In Uber’s case, this means sending drivers their next fare opportunity before their current ride is even over.”

I am concerned with the broader class of metaphors that suggest the Internet is an inert and open place for us to roam. Scott McLeod often uses the metaphor of a ‘landscape’: “One of schools’ primary tasks is to help students master the dominant information landscape of their time.”

McLeod’s central metaphor – mastering the information landscape – fits into a larger historical narrative that depicts the Internet as a commons in the sense of “communally-held space, one which it is specifically inappropriate for any single individual or subset of the community (including governments) to own or control.” Adriane Lapointe continues, “The internet is compared to a landscape which can be used in various ways by a wide range of people for whatever purpose they please, so long as their actions do not interfere with the actions of others.”

I suspect that the landscape metaphor resonates with people because it captures how they feel the Internet should work. Sarah T. Roberts argues that we are tempted to imagine the digital as “valueless, politically neutral and as being without material consequences.” However, the digital information landscape is an artifact shaped by capitalism, the US military, and corporate power. It’s a landscape that actively tracks and targets us, buys and sells our information. And it’s mastered only by the corporations, CEOs and venture capitalists.

Be brave? I have no idea what it would mean to teach students how to ‘master’ the digital landscape. The idea of ‘mastering’ recalls the popular frontier and pioneer metaphors that have fallen out of fashion since the 1990s as the Internet became ubiquitous, as Jan Rune Holmevik notes. There is of course a longer history of the “frontiers of knowledge” metaphor going back to Francis Bacon and passing through Vannevar Bush, and thinking this way has become, according to Gregory Ulmer, “ubiquitous, a reflex, a habit of mind that shapes much of our thinking about inquiry” – and one that needs to be rethought if we take the postcolonial movement seriously.

While we might worry about being alert online, we aren’t exposed to enough stories about the physical and material implications of the digital. It’s far too easy to think that the online landscape exists only on our screens, never intersecting with the physical landscape in which we live. Yet, the Washington Post reports that in order to pave the way for new data centers, “the Prince William County neighborhood [in Virginia] of mostly elderly African American homeowners is being threatened by plans for a 38-acre computer data center that will be built nearby. The project requires the installation of 100-foot-high towers carrying 230,000-volt power lines through their land. The State Corporation Commission authorized Dominion Virginia Power in late June to seize land through eminent domain to make room for the towers.” In this case, the digital is transforming the physical landscape with hostile indifference to the people who live there.

Our students cannot be digitally literate citizens if they don’t know stories about the material implications of the digital. Cathy O’Neil has developed an apt metaphor for algorithms and data – Weapons of Math Destruction – which have the potential to destroy lives because they feed on systemic biases. In her book, O’Neil explains that while attorneys cannot cite the neighborhood people live in as a reason to deny prisoners parole, it is permissible to package that judgment into an algorithm that generates a prediction of recidivism.

“… we have this belief—which is just wrong—that data itself is inherently objective. That it is somehow created in an objective manner. And in the cases of predictive policing, or recidivism risk algorithms, when the data itself is so completely biased, every single problem of that system follows from the data bias. We could talk endlessly about what it is we’re doing when we give someone a high risk of recidivism and then send them to prison longer based partly on where they were born rather than what they’ve actually done. But at the end of the day, what we’re talking about is biased data. And it’s biased again because of systemic biases, systemic racism, …”

Safiya Noble’s research on racist bias in Google search, and her work with Sarah T. Roberts on the way that Google Glass normalizes surveillance, suggest that Google has effectively reshaped our sensibilities so that we accept judgment and surveillance: “Google has effectively worked to convince the public that issues like invasion of privacy and inaccurate information in its search engine results would not be a problem for anyone unless they are doing something that they would need to hide.”

When I talk to students about the implications of their searches being tracked, I have no easy answers for them. How can youth use the net for empowerment when there’s always the possibility that their queries will count against them? Yes, we can use Google to ask frank questions about our sexuality, diet, and body – or any of the other ways we worry about being ‘normal’ – but when we do so, we do not wander a non-invasive landscape. And there are few cues that we need to be alert or smart.

Our starting point should not be the guiding metaphors of the digital as a playground where we need to practice safety or a landscape that we can master, but Shoshana Zuboff’s analysis of surveillance capitalism: “The game is selling access to the real-time flow of your daily life – your reality – in order to directly influence and modify your behavior for profit. This is the gateway to a new universe of monetization opportunities: restaurants who want to be your destination. Service vendors who want to fix your brake pads. Shops who will lure you like the fabled Sirens.”

So what do we teach students? I think that Chris Gilliard provides the right pedagogical insight to end on:

Students are often surprised (and even angered) to learn the degree to which they are digitally redlined, surveilled, and profiled on the web and to find out that educational systems are looking to replicate many of those worst practices in the name of “efficiency,” “engagement,” or “improved outcomes.” Students don’t know any other web—or, for that matter, have any notion of a web that would be different from the one we have now. Many teachers have at least heard about a web that didn’t spy on users, a web that was (theoretically at least) about connecting not through platforms but through interfaces where individuals had a significant amount of choice in saying how the web looked and what was shared. A big part of the teaching that I do is to tell students: “It’s not supposed to be like this” or “It doesn’t have to be like this.”
