I promised my friend and fellow blogger Samir Chopra that I would write something about the recent uptick in conversation about technology in the classroom, the overwhelming majority of which has been condemnatory. Full disclosure: I’m a big fan of technology (even in the classroom) and I consider myself a cautious techno-optimist, so I am usually in the minority among my educator-friends and colleagues when these conversations arise. Allow me to make a case for my side here.
If you work in or near a classroom, you no doubt saw your newsfeeds filled (about a month ago) by professors and teachers sharing Susan Dynarski’s NYT op-ed piece “Laptops Are Great. But Not During a Lecture or Meeting.” In it, Dynarski references two recent studies allegedly testing the effects of electronics in the classroom on student learning– one by psychologists at Princeton and UCLA (here) and the other by psychologists at McMaster and York Universities (here)– which Dynarski found compelling enough to justify her decision to “ban electronics, including laptops, in [her] classes and research seminars.” I will grant that there can be, and often are, problems with students’ use of technology in the classroom, but I think there are also a lot of problems with the studies that Dynarski references, as well as with the “tech ban” decision she believed they justified.
Garbage In, Garbage Out
Let’s talk about those studies first. The problem with poorly-designed social science studies is that they cannot help but produce garbage results. As Catherine Prendergast pointed out in her excellent Twitter thread on the topic– much of my account here is lifted from her thread– the hypothesis that “pen-and-paper notetakers learned more than those taking notes with a laptop” was allegedly confirmed in two studies, one at UCLA with 67 students and the other at Princeton with 169 students. Currently, there are roughly 5.4 million students enrolled in colleges and universities in the United States, and the overwhelming majority of them are not attending Princeton or UCLA. (I’m no statistician, but 236 students at two elite universities seems like a severely limited, and severely unrepresentative, sample.) In the first study, students were shown a TED talk and asked to take notes with or without a laptop. (The students with laptops were not connected to the internet. This variable will be important later.) The students were then taken to a lab, distracted for a half-hour, and asked to recall “factual” vs. “conceptual” content of the TED talk. The laptop-notetakers did better on factual recall, but not on conceptual recall. Ergo, it was surmised, laptop-notetakers didn’t “learn” as much as the pen-and-paper notetakers.
The second study was almost an exact repeat of the first, except that students were instructed in advance to “take notes on a lecture, just like you would in class.” (Leaving aside the whole problem of students’ largely deficient understanding of what effective note-taking on a lecture even is!) The results of the first study were more or less duplicated in the second. Now, I think it’s worth noting that the widely-held presumption that laptops interfere with students’ retention of (“factual”) information was not confirmed in these studies. If anything, that presumption was debunked. And that’s an important thing to note because, in reality, the verbatim reproduction of factual information was all this study was really set up to measure.
As Prendergast rightly observed (here), “AT NO POINT did this now oft-cited study EVER measure learning of students IN AN ACTUAL CLASSROOM or with laptops hooked to the internet, where students could look up info that they didn’t understand.” Nor did the study consider that students’ note-taking skills, with or without technology, are largely underdeveloped in college, which is the first (maybe only) place they encounter “lectures.” And the study didn’t account for the different types of “learning” that different classrooms and different subjects demand, some of which require more conceptual comprehension and others of which emphasize the recording and retention of factual information. (Not every class is like a TED talk, thank god.) Most importantly, as Prendergast noted (and I agree), the laptop-notetakers in these studies were more or less hamstrung by not being connected to the internet. That restriction not only manufactured a set of test-subjects with little or no resemblance to actual laptop users in actual college classrooms, but it also manufactured a “static” learning model for the laptop users and subsequently measured their learning against the “dynamic” pen-and-paper model.
In sum, junk science.
Dynamic Learning
The real irony of these oft-cited studies is that they seem to assume a dynamic model of “learning” that emphasizes conceptual comprehension, synthesis, and understanding over a more static model of “mere” factual retention and regurgitation, an assumption that I very much endorse. BUT, the unfortunate reality is that the design of these studies only really measures students’ efficiency or deficiency at the latter. I’ve been at the head of a college classroom for ten years now, and I don’t think anyone in my position could get away with “learning objectives” that figure students as empty info containers anymore. (And that’s a good thing!) We have to think of students not only as recipients of knowledge, but also as producers of it. Students occupy both of those roles in the classroom. The ideal “learning” environment, in my view, is one in which students are receiving and producing, retaining and synthesizing, remembering and comprehending.
That’s a “dynamic” process and, for centuries now, we’ve been incorporating new technologies– with varying degrees of resistance– to make the learning environment a more dynamic, more effective, more interactive, and more generative place. And then, in a split second, the internet came along and the kids got new toys and they started bringing them to school and now we’re all of a sudden a sect of evangelical Luddites.
Here’s the thing: trying to assess the merits or demerits of laptop note taking without giving students an internet connection is like trying to assess the merits or demerits of pen-and-paper note taking without giving students ink. It’s just a fundamental misunderstanding of how the technology being assessed works IRL. And this, I think, is where the real failing lies when it comes to how we assess the benefits or dangers of “new technologies” in the classroom. We’re not doing a very good job of adjusting our assessment models to the real world in which we (and, more importantly, our students) live.
“You’ll Shoot Your Eye Out!”
In the classic 1983 film “A Christmas Story,” the nine-year-old protagonist, Ralph Parker, desperately wants a BB gun for Christmas. Ralph’s mother refuses. Throughout the film, we hear Mother Parker repeat what becomes a hyper-maternal and Ralph-haunting refrain: you’ll shoot your eye out! you’ll shoot your eye out! you’ll shoot your eye out!
Of course, Ralph gets the gun and, of course, he immediately misfires on his first go at it. He breaks his glasses, but he doesn’t shoot his eye out. No blood, no foul, as my grandfather used to say.
Putting aside the many problems with the laptop “learning” studies above for the moment, I also think we in the professoriate have a tendency to be waaaaayyyy too “Mother Parker” about students’ use of technology in the classroom. Yes, some students are going to misfire. There are going to be some broken glasses. Some students will be distracted or will distract others, some will disengage, some won’t pay attention, a couple of them may even shoot their eye out. And, to be sure, some won’t learn. Just like students in every other class, including those classes in which technology is “banned.”
Our role as educators ought to be to facilitate an environment in which we make it possible not only for students to learn, but also to learn how to learn. And that can’t happen in some artificial, sterile, classroom-laboratory that has been wiped free of distractions. It has to happen in a classroom that resembles the real world in which they live and learn– the one with blinking screens and political ambiguities and boring profs who blather on and breakups and the flu and assholes and jobs that suck and immediate access to a zillion other attention merchants demanding that students not learn.
Let’s bring more technology into the classroom, I say. Figuring out how to incorporate it into our lives-with-others in healthy and productive ways is a skill we all have to learn, professors and students alike. We’re all still learning how to do that. Remember when we used to have dinner or stand in a line or just have regular conversations before smartphones? Remember how hard it was to adjust? See how normal it is now? I get it that some people still want to resist and go back to the Norman Rockwell days of old, but that train has already left the station. Like, more than a decade ago.
Here’s How To Do It
I don’t have all the answers. I’m still figuring it out myself. And I still have days when I want to snatch up a student’s smartphone and smash it underfoot… but each semester I think I find a few more ways to adjust. Here are a few things that have worked for me:
- Have a conversation with your students, on the first day of class, about technology in the classroom. All of my courses have at least one unit that focuses on new technologies, and the course I teach the most frequently (Contemporary Moral Values, a sort of “intro” Ethics course) has almost 40% of the course content devoted to technology, so having a conversation about laptops and smartphones is a little more organic for me than it may be for others. Nevertheless, I think it’s absolutely critical to speak to students about your expectations regarding laptops and smartphones on the first day, if for no other reason than to let them know that (a) you are not some 15th-century monk residing in an Ivory Tower and (b) you have expectations. The important thing to remember is that this should be a conversation. Ask them what they think about technology in the classroom. Listen to what they say and adjust your expectations where necessary. Talk about the ways that tech can help with their learning and not just the ways it can interfere with it.
- Require that students using laptops sit in the front row. This is a rule that I include in my syllabus. As I mentioned above, I think we’re all still learning how to use classroom tech in non-distracting ways and so, until we get better at it, the laptops-in-the-front-row rule is helpful. First, if your experience is at all like mine has been, you’ll find that FAR FEWER students will insist on using laptops. Second, those who do use laptops will be FAR less likely to spend their time on social media if everyone behind them can see what they’re doing. Third, you’ll be able to tell from the people sitting behind the front row if the front-row-laptop-users are screwing around. It just takes asking one screwing-around student to leave the classroom to send a message. (Fwiw, the same goes for smartphones, though I don’t require everyone with a smartphone to sit in the front row because my front row is not infinite.) The point is to reinforce the message that the classroom is for learning. Laptops and smartphones can be a part of that, but if they prove to be a distraction, I will firmly but politely ask the offending student to take their business outside the classroom.
- Don’t ban smartphones. I mean, really, there’s no point. You’ll only invite resentment and, anyway, your efforts are much better spent working with students to help them learn how to manage their attention, their responsibilities, their distractions, and their interactions with others. That said, give students one warning when they err. Next time, ask them to leave.
- When the opportunity arises, encourage students to use their technology to be “producers” of knowledge in the classroom. I find that these opportunities arise frequently in my classroom, especially now that I look for them. When a student begins a claim with “I read an article/study/etc that said x,” I can say “oh really? where did you read that?” or “who was the author?” The rest of us will move on with the conversation, of course, but I’ll let the student Google it and tell the rest of us when they find it. Or, similarly, there are many times in my class when we’ll be talking about a contemporary moral or political issue and it’s helpful to locate some specific fact– when were the Confederate statues erected? how many people are incarcerated in the U.S.? which country just awarded citizenship to a robot? It only takes a second to let students Google these things and, again, doing so allows them the opportunity to see themselves as more than mere recipients of knowledge. Not for nothing, but it’s also how they will learn in the real world outside the classroom.
- Talk to students about your own use (or lack thereof) of new technologies. I’m a technophile, so I tell my students on the first day that it’s just as difficult for me to set my smartphone aside for 50 minutes as it is for them. I frequently refer to things I read on social media, or saw on Instagram, or watched on YouTube, or posted on my blog, etc. I talk about interesting things I just learned about androids or machine learning or blockchains or stem cell research. I do these things because I want to model for my students the way in which new technologies facilitate my understanding of myself as a teacher, a researcher, and a learner. But even if you’re not tech-savvy, I can imagine that having conversations about whatever it is about tech that makes you feel uncomfortable or uninformed or just plain irritated can also be productive. The point is to acknowledge that the world we live in is dominated by new technologies and to show that you are not unaware of that.
- Give your students assignments that require them to learn about, critique, or employ new technologies. As I’ve written about many times before on this blog, my students complete a final project called the “Technology and Human Values Project.” Every term, I’m amazed at what they produce. The fact is that new technologies are being invented faster than we can properly think about them these days. The young people sitting in our classrooms are the ones who are going to not only develop these technologies but also (hopefully!) make the decisions about how they are best used. The college classroom is the last place they will be able to think and talk about these decisions before their views are coerced by all manner of financial, employment-related, or health and safety considerations. Let’s not let them loose into the world unprepared for the task before them.