
Are We Finally Bringing Placement And Practice Testing Into The 21st Century?

2009 proved to be the year when long periods of investment by a number of publishers and exam boards in the application of new technology to language testing finally came to fruition, resulting in a range of significant new additions to the repertoire of resources available to those involved in ELT. In particular, the Oxford Placement Test represents a remarkable step forward in the online provision available to learners, teachers and language teaching institutions in two key areas: placement testing and practice testing.

As someone who has been involved for many years both in test design and as a test user, my initial professional interest in the site was primarily in the new placement testing facility developed by OUP, where the benefits of the latest custom-built software can be seen not only in the learning management system (LMS), which allows for efficient and flexible administration of the test, but most crucially in the design of the placement test itself. The Oxford Placement Test is a CAT, a computer adaptive test, ‘the test you do with a mouse’. CATs are very difficult and expensive to produce, but a CAT approach brings huge benefits to placement testing, because placement testing by its very nature involves determining the language levels (and ideally the language profiles) of learners whose levels are unknown. Traditional placement tests, even the best of them, are unavoidably inefficient in the sense that a proportion of the items will always be wasted, being either far too difficult or far too easy for a learner at a given level. CATs, because they draw on item banks from which each item is chosen on the basis of the testee’s previous responses, tailor the test to each test taker. They are thus able to achieve the ‘holy grail’ of placement testing: they can be quick and accurate, as well as far more secure than their pen-and-paper cousins.
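To make the adaptive principle concrete, here is a minimal sketch in Python of how a CAT might select items. It is emphatically not OUP’s algorithm: real CATs use statistically calibrated item banks (typically based on item response theory), whereas the item bank, step size and scoring rule below are invented purely for illustration.

```python
# Hypothetical item bank: each item has a difficulty on an arbitrary 0-100
# scale. Real CAT banks are statistically calibrated; these values are made up.
ITEM_BANK = [{"id": i, "difficulty": d} for i, d in enumerate(
    [10, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 90])]

def nearest_item(ability, used):
    """Pick the unused item whose difficulty is closest to the current estimate."""
    candidates = [it for it in ITEM_BANK if it["id"] not in used]
    return min(candidates, key=lambda it: abs(it["difficulty"] - ability))

def run_cat(answer_fn, n_items=8, start=50.0, step=12.0):
    """Crude adaptive loop: each response nudges the ability estimate up or
    down, with a shrinking step, so the test homes in on the learner's level."""
    ability, used = start, set()
    for _ in range(n_items):
        item = nearest_item(ability, used)
        used.add(item["id"])
        ability += step if answer_fn(item) else -step
        step = max(2.0, step * 0.7)
    return ability

# Simulated testee whose 'true' level is 62: answers correctly whenever
# the item is at or below that level.
print(round(run_cat(lambda item: item["difficulty"] <= 62)))
```

The efficiency argument falls out of the loop: because each item is chosen near the current estimate, no testee sits through long runs of questions wildly above or below their level, so far fewer items are wasted.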

The OOPT works really well in terms of its prime function: defining each learner’s level both in CEFR terms and as a score. I have used it in my own institution, and I have also explored its accuracy in concurrent validity trials against other tests; it produces consistently accurate results. It also produces those results immediately, with very useful underlying detail, and, through the LMS, in ways that allow for efficient integration into a range of possible school management systems.

Like any objectively scorable test so far available, the OOPT does not measure the productive skills of writing and speaking; however, it is based on constructs of language learning and testing which are explicitly documented, and the description of the test is refreshingly honest about what it can and cannot do. The test is divided into two main sections, ‘Use of English’ and ‘Listening’, the results of which are reported both separately and as a total score. The LMS also allows a range of other kinds of information to be integrated into the final report, including locally administered ‘Writing’ and ‘Speaking’ test scores and whatever general candidate information is required: age, gender, first language, existing certification and the like.
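The shape of such a combined report can be pictured as a simple record. The sketch below is hypothetical: the field names and scoring scale are my own, not the OOPT’s actual report schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PlacementReport:
    """Illustrative shape of a combined report: the two objectively scored
    sections plus optional locally administered scores and candidate details.
    All field names here are assumptions, not the OOPT's real schema."""
    candidate_id: str
    use_of_english: int               # scored automatically by the test
    listening: int                    # scored automatically by the test
    writing: Optional[int] = None     # locally administered, entered by hand
    speaking: Optional[int] = None    # locally administered, entered by hand
    age: Optional[int] = None
    gender: Optional[str] = None
    first_language: Optional[str] = None
    existing_certification: Optional[str] = None

    @property
    def total(self) -> int:
        """Total reported score: the two automatically scored sections."""
        return self.use_of_english + self.listening

report = PlacementReport("c042", use_of_english=68, listening=61,
                         writing=70, first_language="Spanish")
print(report.total)  # 129
```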

What I also really liked about the OOPT was what I could do with the detailed results from each testee. At NILE (the Norwich Institute for Language Education, where I work) we run professional development courses for language teachers and teacher trainers. All of these include a significant component of language work for non-native speaker teachers. With really advanced users of English it is often difficult to know what to focus on in language classes. The question-by-question detail available in each testee’s OOPT results allows anyone with access to those individual results to see exactly what patterns of error each testee has. I was able to build a sequence of highly effective language lessons based on both the common areas of weakness of the group as a whole and the individual-specific points I wanted to touch on. Some of those areas were things I could never have predicted in advance, but the OOPT gave me hard diagnostic evidence.
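As an illustration of how that item-level detail can drive lesson planning, the sketch below aggregates some hypothetical per-question results into group-wide and individual error patterns. The record shape and the language-point tags are invented; the OOPT’s actual export will differ.

```python
from collections import Counter

# Invented per-item results: one record per question per testee, each tagged
# with the language point it tests (the tags and shape are assumptions).
results = [
    {"testee": "Ana",  "point": "articles",      "correct": False},
    {"testee": "Ana",  "point": "passive voice", "correct": True},
    {"testee": "Berk", "point": "articles",      "correct": False},
    {"testee": "Berk", "point": "phrasal verbs", "correct": False},
    {"testee": "Chen", "point": "articles",      "correct": False},
    {"testee": "Chen", "point": "phrasal verbs", "correct": True},
]

# Group-level weaknesses: the language points most often answered wrongly,
# a basis for whole-class lessons.
group_errors = Counter(r["point"] for r in results if not r["correct"])
print(group_errors.most_common())   # [('articles', 3), ('phrasal verbs', 1)]

# Individual profiles: which points each testee got wrong, for one-to-one work.
by_testee = {}
for r in results:
    if not r["correct"]:
        by_testee.setdefault(r["testee"], []).append(r["point"])
print(by_testee)
```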

As someone who has been very closely involved in placement test design for some 30 years, I would expect to be highly critical of test instruments which on close examination were not up to the job, or for which exaggerated claims were made. As with any innovative design when it first enters the public domain, the OOPT had some early flaws, difficulties with the delivery of the sound files in certain contexts and problems with some of the emailing functions, but these seem to have been very effectively (and pleasingly promptly) solved.

I am continuing to use the OOPT across a range of contexts, in my own institution and in a number of international projects in which NILE is involved and I regard it as one of the most significant advances in placement testing in a generation.

When I first went to the Oxford Placement Test website, my professional interest in placement testing meant that my initial focus was very much on the OOPT and the LMS through which it can be so effectively managed. My first reaction to the practice tests also offered there was something like ‘More Cambridge main suite practice tests – haven’t we got enough already?’. However, my increasing interest in the flexibility of the LMS and its ability to integrate an important range of learning tasks and management information systems made me look more closely at the other side of the package. I was surprised to find how much more was on offer to learners and teachers than just the traditional range of practice tasks. The combination of ‘tips’ and question-specific feedback available to learners working autonomously provides a sequence of learning loops that different learners could quickly learn to use according to their particular learning style.

The practice test design provides learners with a very helpful repertoire of learning supports across all the skills and exam paper types:

- automatic and immediately available marking of each item;
- instant feedback on each item (or you can go back to them all at the end);
- exam tips for each question;
- a double-click facility for any word to see an online dictionary (this facility hasn’t always worked for me);
- audio scripts to check for yourself on the listening questions;
- textual and visual stimuli for the speaking questions, plus useful phrases for the speaking sections of the tests;
- a range of other pedagogic support features.

Among these, the feature I liked best was the ability for learners and teachers to engage in an online dialogue about written scripts, a kind of online ‘process writing’ involving the learner’s own script, the model answers provided, and the teacher’s ability to develop the learner’s writing skills by providing not just corrections but reformulations, using a sort of ‘track changes’ feature. This is an aspect I want to explore a lot further, and I’m now wondering if the LMS is something that NILE could adopt, and perhaps adapt, to facilitate one part of the tutorial supervision that is an important part of our MA programme. What keeps striking me about the LMS is how much more it could do than it is presently being used for.
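To show the idea behind that ‘track changes’ style of reformulation, here is a small sketch using Python’s standard difflib module. It illustrates the concept only; it is not how the OOPT’s LMS actually implements the feature, and the example sentences are invented.

```python
import difflib

learner = "I am agree that the childrens learn quickly the languages."
teacher = "I agree that children learn languages quickly."

# Word-level diff: '-' marks wording removed from the learner's script,
# '+' marks the teacher's reformulation, much like a 'track changes' view.
for token in difflib.ndiff(learner.split(), teacher.split()):
    if token.startswith(("- ", "+ ")):
        print(token)
```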

So, this year I’ve found something that has opened up all sorts of new ways of doing some of the things we all need to do. If you haven’t yet had the chance to find what I have, I suggest you go to the Oxford Placement Test site and start exploring.

By Dave Allan.
