
Setting up a level test: challenges and solutions

When Clarity and telc first conceptualised the Dynamic Placement Test, a key objective was to devise a democratic test — a computer-based level test available to schools whatever their digital setup. At the same time, we didn’t want to compromise on the technology: it needed to be a test that went well beyond multiple choice questions and gap fills. So within these constraints, the team prioritised three areas.

  1. It should be able to run in less developed areas, such as remote parts of India or the Philippines. This means the test must function even when the connection is intermittent, or worse, lost.
  2. Many centres do not have banks of computers, and this has prevented them from running digital tests in the past. But their students often have smartphones. According to the World Bank, the Philippines has 109 mobile subscriptions per 100 people, and these phones are really just computers in disguise. So the test has to run on mobiles.
  3. Finally, schools should be able to administer the test remotely to students pre-arrival. This means that the teacher Admin Panel has to have an effective communication system.

This was challenging in theory — and way more difficult in practice. Here are three stories that illustrate the types of problems that arise when a new test makes it out into the field.

Asia University, Taiwan: communication breakdown

The objective was to test 1,000 first year students. They would bring their own devices to take the test, and would do it in an invigilated lecture theatre. Simultaneous downloads to 1,000 phones were likely to place a strain on bandwidth even in a hi-tech country like Taiwan, so each student was emailed by the University email system and instructed to download the Dynamic Placement Test app beforehand. They were also told to bring headphones or earphones.

The problem was that 90% of students did not check their University emails, so they had not downloaded the app, and many of them, perhaps surprisingly, did not have earphones with them either. Without the app, they had not been through the familiarisation section that ensures test takers understand the different task types. In their hurry to start the test, many reached a screen they weren't sure about and simply stopped.

This test was run again with another 1,000 students. Again, 90% hadn’t read the email, but this time the English Department provided teaching assistants who were able to help students. They also brought in a box of headphones for those who didn’t have them. As a result, this round was much more successful, but an important question remains: what is the most reliable way of contacting students, and how much does this change from institution to institution?

Universitas Andalas, Indonesia: no Internet at all

This university wanted to level test 4,000 students over the course of a semester — an ambitious undertaking. Once again the students did not read their emails, and turned up at the lecture hall without having downloaded the app. This time there was no Internet at all, so anyone who hadn't prepared was sent away, told to find wifi, download the app and equip themselves with earphones, and only then let back in.

This flexible and practical approach worked quite well. The students had time to download the app and to familiarise themselves with the question types. At the time of writing, about 2,500 have completed the test successfully.

Bahrain Polytechnic: preparation paid off

Bahrain Polytechnic had only 200 students to test, and they decided to do it in 10 shifts in a language lab with 20 computers. They put a considerable amount of preparation into it. Anticipating this time that students would not access their emails, they set up DPT accounts with placeholder emails (e.g. student01@bahrainpoly, student02@bahrainpoly, etc.) and Excel-generated passwords. They made sure all students had been told in class where and when to go for the test. The 10 sessions then ran without a hitch.
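The account-generation step doesn't need a spreadsheet at all. The sketch below shows one way to produce a CSV of placeholder emails and random passwords in a few lines of Python; the function name, output filename and password format are illustrative assumptions, not details of Bahrain Polytechnic's actual setup.

```python
import csv
import secrets
import string

def generate_accounts(count, domain="bahrainpoly", out_path="dpt_accounts.csv"):
    """Write one row per student: a synthetic email such as
    student01@<domain> plus a random 10-character password."""
    alphabet = string.ascii_letters + string.digits
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["email", "password"])
        for i in range(1, count + 1):
            email = f"student{i:02d}@{domain}"
            # secrets (rather than random) gives cryptographically
            # strong passwords, which matters even for throwaway accounts
            password = "".join(secrets.choice(alphabet) for _ in range(10))
            writer.writerow([email, password])
    return out_path

generate_accounts(200)
```

The resulting file can be imported into the test's Admin Panel or handed out on paper slips at the door, sidestepping the unread-email problem entirely.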

It is perhaps predictable that a digital test run in the ‘traditional’ way in a lab will be smoother than one run in the wild world of the smartphone. But this doesn’t have to be the case. It is intriguing and satisfying to help set up tests in such a variety of environments, and to make the journey a little easier each time, tackling one problem at a time.
