Oxford University Press

English Language Teaching Global Blog



Glue sniffing and classroom technology

Paul Davies, co-author of Solutions second edition, takes a look back at when technology first appeared in the classroom and offers a warning about its use in today’s classrooms.

Visiting my children’s primary school the other day, I picked up a bottle of PVA glue from a table and gave it a little sniff. I’m not a habitual glue-sniffer, but I’d noticed that it was the same type of adhesive we used to have at my own primary school decades earlier and I knew the smell would be evocative. For a moment, I was reliving my schooldays.

I left secondary school in 1984, around the time when computers were beginning to have an impact in education. That year, the eminent British semiotician Daniel Chandler wrote: “The microcomputer is a tool of awesome potency which is making it possible for educational practice to take a giant step backwards.”

What did he mean? Chandler was no Luddite: he embraced new technology and worked to develop early educational software in collaboration with the BBC. His fear, however, was that educators might be so beguiled by the novelty of the latest classroom technology (in those days, a PC the size of a fridge) that they would fail to pay enough attention to the underlying pedagogy. He warned that computers should be viewed not as potential teaching machines but as aids to student expression because, put bluntly, computers can’t teach. They deal in information, not knowledge.

More than a quarter of a century later, Chandler’s warning still applies. Even today, many on-screen language games are basically stimulus and response, often with canned applause or some other audio/visual reward for a correct answer. Short of locking students in a box and dispensing food pellets through a chute if they pull the right lever, this is about as close to Skinnerian behaviourism as you can get. It is an approach to education that has been out of vogue for over half a century.

While Skinner deliberately excluded as irrelevant anything which goes on inside the mind so that he could focus solely on directly observable behaviour, subsequent theories of learning have taken the mind as a starting point: constructivism, brain-based learning, NLP, and so on.

Today, educationalists talk about how students construct knowledge through their interaction with information; they don’t talk about how best to condition students to respond in a certain way (except perhaps with certain aspects of classroom management). However, with the advent of new technology, behaviourism has re-emerged in the classroom – not because the pedagogy involved has been reconsidered but because, more often than not, it hasn’t been considered at all.

Leaving aside distinctions between the various platforms (PC, laptop, tablet, phone), which in any case appear to be converging, you can divide technology-based activities into two broad categories: A) things which simply couldn’t be done before the relevant technology was on offer, and B) things which have a more traditional equivalent. We shouldn’t assume that activities in either category are necessarily worthwhile, although they might well be.

In category A, a live chat with a class of children on another continent could prove a rich learning experience, while a video game in which you zap adjectives with a ray gun may do little more than keep students quiet for a while. In category B, the key question is whether the technology-based activity is a clear improvement on its precursor. Using an app to plan and monitor your revision timetable makes a lot of sense. But why should we always opt for PowerPoint projects over physical posters? ‘Because we’ve just bought a load of iPads’ is not a good enough reason.

And what about the children whose learning styles are better suited to physical, rather than on-screen, cutting and pasting? Shouldn’t they have the opportunity to put the electronic devices away for a while and get out the scissors and glue? After all, you can’t sniff an iPad.




How to boost your students’ chances in an oral exam

In this post, Paul Davies, co-author of Solutions 2nd edition, talks about boosting your students’ chances in oral exams.

Pity the poor student who has to sit an oral exam. Speaking in public is daunting enough – but in a foreign language? On a topic you haven’t chosen? With the knowledge that a poor performance could adversely affect the rest of your life? What a nightmare!

But sit them they must. So how should you help them prepare? Everything you do to improve their general level of English will contribute to success, but only if they manage to show off what they know come the big day. Here’s how to give them the best chance of doing that:

  • Resist the temptation to over-correct in class or fill every silence. Students need to get used to the sound of their own voice – including the sound of their own voice petering out. You’ll never learn to swim if you don’t take your armbands off.
  • Cover the topics you expect to come up in the exam. Sounds obvious, perhaps, but there is a temptation to gloss over some of the more hackneyed topics like recycling, education, childhood memories, and so on, in search of areas that are more stimulating and original. Be careful: off-beat topics are likely to generate little that is transferable, and well-worn topics are very likely to crop up in the exam.
  • Get them talking in pairs and groups. Not only will some students be more forthcoming in the relative privacy of a pair or small group, but it also increases the total amount of speaking that each student can do within the time available. You can’t monitor every conversation – which may not be a bad thing in itself – but you can spot check.
  • Whatever topics you cover, expose students to a range of ready-made opinions and insights. Examiners do not give marks for original thoughts, mainly because they have no way of detecting them. So don’t be hard on students who repeat verbatim what they’ve just heard or read – just make sure they understand it!
  • Teach set phrases for the functions they’ll need in the exam: presenting opinions, justifying yourself, presenting counter-arguments, and so on. And practise them until they are second nature. For example, arm your students with three or four different ways to frame an opinion: “To my mind, …”; “The fact is, …”; “The way I see it …”. (To really sound like a native speaker, opt for ungrammatical constructions like: “The key thing is, is that …”)
  • Discuss the details of the exam format. Some people love surprises, but this is not the time or place. Make sure your students know exactly what to expect, and when you practise exam tasks, do it in a way which is as close to the actual exam format as possible. And as the big day approaches, they need a few dry runs in exam conditions.

Once you’ve done your bit to boost their chances, the rest is in the lap of the gods – and your students’ own hands.




Teaching ‘screenagers’ – how the digital world is changing learners

Ahead of his talk at IATEFL 2012 with co-author Tim Falla about how best to exploit currently available digital resources, Paul Davies looks at how the digital world is changing learners.

The term screenager was coined 15 years ago by the author Douglas Rushkoff in his book Playing the Future. He used the term to refer to young people who have been reared from infancy on a diet of TV, computers and other digital devices. On the surface, screenager is just another mildly annoying made-up word, like edutainment and infomercial. Look deeper, however, and the word carries a clear implication: that today’s teenagers are somehow different from the teenagers of previous generations because their brains have been permanently altered by constant exposure to technology.

In the media, headline writers love to seize on reports which appear to confirm that implication. “Facebook and Bebo risk ‘infantilizing’ the human mind,” warned the Guardian on 24th February 2009; “How the internet is rewiring our brains,” lectured the Daily Mail on 7th June 2010; “Web addicts have brain changes,” claimed the BBC news site on 11th January 2012.

But go to the primary sources and you’ll find that very few of these studies actually claim to show what the headline writers claim they claim. For example, while the study of web addicts did indeed show that their brains were different from non-addicts’, the differences are just as likely to explain their addiction as to be caused by it. The researchers took no ‘before’ and ‘after’ pictures of the addicts’ brains and could therefore make no claims about changes – but you would never know that from the media coverage.

Perhaps it is not surprising that newspapers should have a grudge against today’s teenagers (who don’t buy them) and against the internet (which is killing them off). So, claims which reflect badly on both are given top billing. When Susan Greenfield, the Oxford-based brain scientist, recently suggested a link between the Internet and autism, it was splashed over several front pages. But again, the headlines turned out to be misleading and Greenfield later clarified her position: “I point to the increase in autism and I point to internet use. That’s all.”

Not all the headlines are negative, of course. Some claim that modern technology has boosted young people’s cognitive skills and ‘rewired’ their brains (whatever that means) in positive ways. Today’s youngsters are supreme multi-taskers with brains that are more active and more efficient than those of previous generations. They may appear to lack focus, or be unable to concentrate, but that’s because we adults don’t quite get what they’re like. In fact, they’re fully evolved to live in a digital environment which has, to a greater or lesser extent, left us behind. Personally, I find these positive claims more refreshing, but the science behind them is equally shaky.
