Elinor Stokes is the director of Atlas English, ClarityEnglish’s official UK distributor. In a joint submission to the e-Assessment Awards 2021, the HOPES project was selected as a finalist for the Best International Implementation award.
The Dynamic Placement Test (DPT) is an adaptive, randomised English placement test, accessible online on any device. It was co-published by ClarityEnglish and telc Language Tests, and has been deployed for English language assessment by educational institutions and companies worldwide.
The focus of this case study is the use of the Dynamic Placement Test in the HOPES project. HOPES was set up by a consortium of four European partners (DAAD — the German Academic Exchange Service, the British Council, Campus France, and Nuffic) as a response to the Syrian crisis. The project aimed to address the lack of higher education opportunities for Syrian refugees in the countries surrounding Syria. Across two phases of the project, DPT was used to assess the English language level of just under 9,000 students across Egypt, Turkey, Lebanon, Jordan, and Iraq in challenging testing conditions.
The Dynamic Placement Test was implemented in two phases of the HOPES project, each with a distinct objective. The first, in 2018, put 8,000 students through an English programme with the objective of helping them improve their proficiency by one CEFR level. Headed by the British Council, it took a three-step approach: the candidates were given a placement test, followed by a 100-hour (level-appropriate) face-to-face course, and a proficiency test. The placement test, for which DPT was employed, had several requirements. Given the students’ challenging circumstances, it had to be deliverable online, accessible on a wide range of devices, and unaffected by intermittent internet access. For test administrators, it had to provide a quick yet accurate English assessment, and an easily scalable setup process.
Phase two, headed by DAAD, took place in August-September 2020 and was largely a response to the Covid-19 pandemic. Among the pandemic’s many disruptions, in-person language tests such as IELTS were cancelled. This left aspiring students without a way of determining their English level and certifying it for universities or potential employers. HOPES turned to DPT to provide 750 tests to such candidates. Even more so than in 2018, the need for an online test that could be taken directly on the students’ own devices was imperative. Two other factors became crucial: test security and the generation of certificates for test-takers.
Upon DPT’s selection, ClarityEnglish made small improvements to the Admin Panel (DPT’s administration system) to aid the international implementation. Customisation options were developed, including presenting the test’s familiarisation section in Arabic and specifying the test-takers’ time zone. More was to come – but this signalled Clarity’s ability and willingness to adapt the test to the new and challenging circumstances in which it would be deployed.
The objective of phase one was to improve the English of 8,000 students by one CEFR level. As a part of the three-step process, DPT’s sub-aim was to test all students and place the majority (>90%) into the correct course level. The objective of phase two was straightforward: assess and certify the English level of 750 candidates.
A four-step approach was devised to meet these objectives:
First, a series of webinars and training sessions familiarised the client with DPT and the Admin Panel. An issue that arose in phase one was the time difference between Clarity (based in Hong Kong) and the client (based in Egypt). In phase two, Atlas English – Clarity’s European partner and distributor – was enlisted to ensure the client received round-the-clock support.
Second, a list of test-taker names and emails was compiled. This was handled by the client through a combination of on-the-ground teams in the host countries and an online self-registration system (phase two relied exclusively on the latter).
Third, tests were set up. Phase one test-takers were staggered over a year (<500 students at a time), while all 750 phase two tests were taken on the same day. Due to the large numbers, a systematic process was developed: each batch of test-takers was compiled in a spreadsheet, which was uploaded to the Admin Panel – automatically creating user accounts. A test was set up (with time-zone and language preferences adapted to the specific group), and an automated ‘Welcome Email’ was sent to the test-takers with all relevant test information and login details.
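The batch setup described above can be illustrated with a minimal sketch. This is not the actual Admin Panel API: the spreadsheet columns, the `create_accounts` helper, and the email template are illustrative assumptions that show the spreadsheet-to-accounts-to-email flow.

```python
import csv
import io
import secrets

# Hypothetical sketch of the batch setup process: read a spreadsheet of
# test-takers, create one user account per row, and draft a 'Welcome Email'
# with login details. Column names and the template are assumptions.

CSV_DATA = """name,email
Amal Haddad,amal@example.org
Omar Said,omar@example.org
"""

WELCOME_TEMPLATE = (
    "Dear {name},\n"
    "Your Dynamic Placement Test login: {username} / {password}\n"
    "Time zone: {tz}. Familiarisation language: {lang}.\n"
)

def create_accounts(csv_text, tz="Africa/Cairo", lang="ar"):
    """Create an account per spreadsheet row and draft its welcome email."""
    accounts = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        account = {
            "name": row["name"],
            "email": row["email"],
            "username": row["email"].split("@")[0],
            "password": secrets.token_urlsafe(8),  # one-off test credential
        }
        account["welcome_email"] = WELCOME_TEMPLATE.format(
            name=account["name"], username=account["username"],
            password=account["password"], tz=tz, lang=lang,
        )
        accounts.append(account)
    return accounts

accounts = create_accounts(CSV_DATA)
print(len(accounts), accounts[0]["username"])  # prints: 2 amal
```

In practice the time-zone and language parameters would vary per batch, mirroring the per-group test settings mentioned above.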
The last step in phase one was to download results to a spreadsheet and categorise test-takers by level to put them into appropriate classes. The last step in phase two was to automatically share the test-taker results and generate certificates.
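The phase-one final step – grouping downloaded results by level to form classes – amounts to a simple bucketing operation. The sketch below uses made-up (name, level) records; the real export format from the Admin Panel is not documented here.

```python
from collections import defaultdict

# Hypothetical sketch of the phase-one final step: group downloaded results
# by CEFR level so test-takers can be placed into level-appropriate classes.
# The (name, level) records below are illustrative.

results = [
    ("Candidate 1", "A2"), ("Candidate 2", "B1"),
    ("Candidate 3", "A2"), ("Candidate 4", "B2"),
]

def group_by_level(records):
    """Map each CEFR level to the list of test-takers placed at it."""
    classes = defaultdict(list)
    for name, level in records:
        classes[level].append(name)
    return dict(classes)

print(group_by_level(results))
```

Each resulting bucket corresponds to one level-appropriate class in the 100-hour course.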
Quantitatively, both phases of the project met their objectives. In phase one, DPT successfully assessed the English of all students who would later complete the course. A small percentage (<5%) dropped out of the process after registration and thus did not attempt DPT, and <1% experienced technical difficulties and repeated the test. Given the circumstances on the ground, this fell well within acceptable bounds.
In phase two, all candidates who attempted DPT completed it and received a certificate. Of 750 registered test-takers, 77 did not attempt the test (a completion rate of 89.7%). This was due to the particularly challenging circumstances in which the client compiled the list of test-takers: relying solely on social media campaigns and self-registration over a short period resulted in a small proportion of inaccurate candidate information (incorrect email addresses).
While quantitatively successful, this raises the question: were the test-taker scores accurate? Looking at the inputs, test security features ensured the risk of cheating was all but eliminated. Sophisticated question types (such as word placement and sentence reconstruction) left students unable to ‘look up’ or guess correct answers; a built-in Anomaly Tracker flagged any suspicious test-taker behaviour. Moreover, there is anecdotal evidence that results were accurate: in phase one, course teachers were all satisfied that the English levels of their assigned students were appropriate. A testimonial from the lead administrator in phase two, Rania Helou (HOPES Scholarship Manager), also suggests a successful implementation strategy:
‘Having used DPT and the Admin Panel for the HOPES project, it gives me great pleasure to recommend it to all interested parties. It is easy to use, assesses accurately and reliably the three skills of the students, and can be run at your convenience on any device within 30 minutes. Not to forget that students can get an immediate certificate attesting their CEFR level. During the process, the Atlas team provided prompt and effective support!’
Language barriers presented a consistent challenge throughout the project. Most candidates were lower-level English speakers (scoring in the A2-B1 range). This did not present an issue for the test itself – DPT is designed to adapt to the English levels of candidates from A1 through to C2. Rather, it posed challenges from an administrative perspective: how to share test information and login instructions with over 500 test-takers at a time, many of whom understood English at an elementary level?
Fortunately, despite the candidates being spread across five countries, over 90% spoke Arabic fluently. As outlined in ‘Project Requirements’, one of the first developments made was an Arabic version of the question-familiarisation section of DPT. A second change made was to introduce customisable text options in the automated ‘Welcome Emails’ (outlined in ‘Project Management’) – allowing test administrators to share relevant information to all test-takers in Arabic.
While these methods proved effective in preparing candidates for the test, a small subset of test-takers reported feeling underprepared for the questions they would come across. Additional measures were taken to ensure they would be well prepared: a ‘Video Guide for Test Takers’ was created, translated into Arabic, and made available for test administrators to share through the ‘Welcome Email’.
Technical / Digital Challenges
A key challenge encountered in both phases of the HOPES project was the unreliable and intermittent internet access that virtually all test-takers experienced. This is articulated by Hala Ahmed, the Regional Academic Manager for HOPES and the British Council in Cairo, who said ‘we are working with universities and areas which don’t have a sophisticated [technical] infrastructure’. While accessibility issues made online testing more challenging, the geographic dispersion of the test-takers and their social and economic circumstances (particularly in pandemic-struck 2020) demanded that all testing be done remotely – a considerable challenge.
The ‘Offline Mode’ of the Dynamic Placement Test was developed as a solution to these challenging testing conditions. Operationally, the Offline Mode works by downloading a package of test content (<30MB) to the test-taker’s device before the test commences. Once the test begins, there is no difference (either from the test-taker’s perspective or with regards to the test construct) between the Offline or Online Modes; but the Offline Mode will be uninterrupted by any drops in the network connection. As soon as the test-taker has completed the assessment and a network connection has been (re)established, the result is automatically uploaded to the cloud.
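The offline-first pattern described above can be sketched in a few lines. This is a conceptual illustration, not DPT’s implementation: the class, method names, and stubbed network calls are all assumptions standing in for the real download, local storage, and upload logic.

```python
import json
import queue
import time

# Hypothetical sketch of the offline-first pattern: test content is
# downloaded up front, answers are recorded locally, and the result is
# uploaded only once a network connection is available. The sync() call
# is a stub standing in for a real HTTP upload.

class OfflineTestSession:
    def __init__(self, package):
        self.package = package            # pre-downloaded test content (<30MB)
        self.answers = {}
        self.pending_upload = queue.Queue()

    def record_answer(self, item_id, answer):
        # Recording is purely local, so network drops cannot interrupt it.
        self.answers[item_id] = answer

    def finish(self):
        # Queue the completed result for upload rather than sending directly.
        result = {"answers": self.answers, "finished_at": time.time()}
        self.pending_upload.put(json.dumps(result))

    def sync(self, network_up):
        """Upload queued results; a no-op while the connection is down."""
        uploaded = []
        while network_up and not self.pending_upload.empty():
            uploaded.append(self.pending_upload.get())  # stub for an HTTP POST
        return uploaded

session = OfflineTestSession(package={"items": ["q1", "q2"]})
session.record_answer("q1", "B")
session.record_answer("q2", "reconstructed sentence")
session.finish()
assert session.sync(network_up=False) == []    # offline: result stays queued
assert len(session.sync(network_up=True)) == 1  # back online: result uploads
```

The key design point mirrored here is that the test itself never touches the network; only the pre-test download and post-test upload do.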
The Offline Mode of DPT strikes a fine balance: it retains the benefits of online delivery (principally, remote assessment) while allowing the test experience to remain unaffected by differences in internet access. As such, it proved to be a key feature of the test, allowing both phases of the project to be carried out successfully.
Scale and Complexity
The assessment program involved just under 9,000 test-takers, based in Egypt, Turkey, Lebanon, Jordan, and Iraq. In the first phase of the project (2018) the majority of candidates were based in Lebanon and Egypt, and consisted of 70% Syrian migrants (displaced by the Syrian crisis) and 30% local candidates. In the second phase the majority of test-takers were based in Lebanon (>80%), with a larger proportion of local candidates than in the first phase.
While the first phase involved many test-takers (8,000), the implementation of the English assessment was spread over the course of a year – with planning taking place in 2017. While certainly complex, a systematic assessment process (see ‘Project Management’) and the development of solutions to allay technical and linguistic concerns (see ‘Regional Challenges’ and ‘Digital Challenges’), allowed this project to proceed as planned. The number of DPTs delivered in this project accounted for 5-10% of the total number of tests delivered that year.
Surprisingly, considering the smaller number of tests delivered, the second phase posed more serious administrative challenges. The reasons were threefold: the project took place fully remotely due to the Covid-19 pandemic; Lebanon, which served as the base of operations for the DAAD team, was reeling from the Beirut explosion; and the project’s total timeframe for planning and implementation was a mere two months. Moreover, the different aims of the project (as outlined in ‘Project Management’) made the administrative task more challenging, requiring more on-hand support from Atlas.
With supplier teams based in Hong Kong and London, and tests deployed in five Middle Eastern countries, this project was certainly international in its implementation. A systematic approach, sophisticated technology, and a willingness to adapt to the needs of the local market all contributed to making the project a success.