Subramoni Iyer describes how his experience with students and teachers in Syria led to localised test instructions and a pre-test video.
Clarity’s Technical Director, Adrian Raper, considers the issue of fairness when students are taking the same test on different devices.
In this video, Andrew Stokes discusses how we can ensure that a placement test — or any other kind of language test — is culturally fair.
Surely the more questions you answer in a placement test, the more points you get and the higher your score? If you can’t finish, you can’t do yourself justice. And that must invalidate the result.
In the second of a series of short videos, testing expert Laura Edwards looks at the roles of output and input in language testing.
Should a placement test include speaking and writing? Is it important that it is adaptive? Does a test-taker have to attempt every question? What, in fact, is a placement test?
When Clarity and telc first conceptualised the Dynamic Placement Test, a key objective was to devise a democratic test — a computer-based level test available to schools whatever their digital setup. At the same time, we didn’t want to compromise on the technology: it needed to be a test that went well beyond multiple choice questions and gap fills. So within these constraints, the team prioritised three areas.
Can a test run on a student’s device ever be secure? What’s to stop a test taker looking up the answers on the Internet? What, in fact, does ‘secure’ mean in the context of a placement test?
Sean McDonald of telc catches up with Adrian Raper at the IATEFL Conference in Glasgow. Adrian discusses his philosophy of testing, and the steady move from paper-based exams towards digital language assessment.
‘We like your online placement test,’ said the teacher at Taiwan’s Asia University, ‘but with 1,000 freshers and only 20 computers, we’d be halfway through the first semester before we could even sort out our classes.’