The subject of this case study asked us to anonymise it. Names have therefore been changed, but all other details are accurate.
GCC Petroleum Training* (GCCPT) runs training courses for oil and gas companies across the Arabian Gulf region. As well as technical programs, these include English language coaching and a talent and soft skills development course. One of the challenges GCCPT faces is identifying staff across its affiliate companies who will benefit from its programs, and determining each learner’s level of English so as to place them in the most appropriate course. A key criterion is that this must be done quickly, efficiently and accurately.
Mr. Warith Adnan, Chief Training Officer, laid out further criteria as follows: ‘The test has to be routed through our internal Learning Management System and to be administered centrally from our Head Office, with test takers from several sister companies accessing the test within a stipulated time frame. We determined that we require an adaptive test, as we can allocate a maximum of 45 minutes per trainee for the English language component of the selection process. Finally, we do not want to provide dedicated infrastructure for this, so we have a preference for a test that would run on the trainees’ own devices. This also raises the question of whether we require a proctored solution.’
In 2022 it became clear that the diagnostic tools they were using were not providing the specified outcomes and they started looking for alternatives. For English language level testing, they trialed Clarity’s Dynamic Placement Test (DPT), which is used extensively across the Gulf, both in the corporate sector and by governmental educational and training bodies.
Concerns and the trial
The trial was undertaken by members of the English language teaching team. Each of them took the test three times: once as themselves, and twice simulating learners at different levels (see note below*). For example, Mr. Sandeep Agarwal, Training Officer, scored C2 as himself, and then A2 and B1 in his simulations. His conclusion was: ‘I recommended adoption of the test not just because it seemed accurate to me, but because both the adaptiveness and the randomisation of the questions solved other issues too. On my three attempts, I saw different questions each time. Each time, the questions were suitable for my level, which is motivating. Also, I couldn’t work out how a test taker could cheat.’
Ms. Gayathri, a Team Leader, added that: ‘The output is concise, focused and instant, whereas we found that other, broader English tests just gave too much information on Reading, Writing, Speaking and Listening which aren’t of interest to us. If there’s human marking required that means an unacceptable delay. With DPT you immediately get the CEFR level (the key item for us) as well as relative strengths between the skills, and a further Relative Numeric locating the trainee in the correct place within the CEFR level. That answers our needs.’
Ms. Salma Aslam, an English language teacher, said that: ‘The test was interactive, consistent, easy to navigate, and the language used was quite easy for both low-level and proficient users of English. One more thing: our test takers are working adults, and we felt that not only was the content suitable, but the format also gave them the flexibility of being able to take the test at a convenient time, perhaps between meetings.’
A concern was that DPT does not assess test takers’ productive skills. However, it was recognised that there is a tension between administering a fast and efficient test to give an accurate snapshot of a test taker’s level on the one hand, and the time required to deliver and grade speaking and writing tests on the other – and the 45-minute requirement was not negotiable. In the end the GCCPT team agreed that ‘although the test does not tackle productive skills (writing and speaking) explicitly, it does evaluate soft productive skills’. This is done through questions such as dialogue reconstruction, an item type which encourages the test taker to analyse speech patterns and produce a structured conversation.
Once the trial had been successfully concluded towards the end of 2022, GCCPT approved the adoption of the Dynamic Placement Test. It began rolling out the test as the key placement tool for English language courses run across 16 of its affiliate companies, as well as for placing staff in soft skills programs. Finally, the team identified a further function: certifying the English language level of staff selected to attend external training courses or overseas conferences.
GCCPT placed an initial order for 10,000 tests. Tests are set up centrally twice a month; in the early months of 2023 each session was taken by around 150 test takers, rising to around 300 at the time of writing, with further increases expected in the autumn. Initially GCCPT’s supplier, Language Elements in the UAE, assisted in setting up the tests, but the team quickly realised how easy Admin Panel, the dedicated administration tool, makes this process.
‘Previously, each centre would have to purchase its own tests,’ said Mr. Sandeep Agarwal. ‘Now, it’s so much more convenient to do everything centrally. Also, we can set up groups giving access to data only to the officer directly concerned, so data privacy can be maintained.’
Following the success of the adoption of the Dynamic Placement Test, GCCPT is currently trialling Clarity’s General English programs: Tense Buster, Clear Pronunciation, Active Reading and others. The organisation has a policy of additionally making learning resources available to participants’ families, and it believes that this will be particularly welcome in the case of English language programs.
* Note that this is not a recommended method for trialling DPT. If you are a C2 teacher, it is extraordinarily difficult to answer consistently as an A2 candidate. It is better to run the trial with real test takers at different levels. However, it seems to have worked satisfactorily in this case.