English proficiency tests. Speaking tests in particular. Who likes them? In my last post, I talked about the student who came up to me and asked for advice on how she could quickly improve her speaking test score. The gist of that post was somewhat critical of the learners’ mentality, and I wondered why some Koreans would ask for the quick way to get a higher score. But perhaps it’s not just our learners’ fault. Perhaps, when it comes to speaking tests, the very nature of the test itself is to blame for this attitude. I mean, after all, what do these tests really do?
The test in question that my student was asking about was the OPIc. I should disclose that I know very little about the OPIc, and for that matter about many of the other speaking tests. I know a bit about the IELTS speaking test, and occasionally help students, on an informal basis, who are preparing for it. Also, a few years ago, SK, who at the time ran BULATS, considered moving to a computerised version of the test, and through my academy I took the online certification programme to become an examiner. Ultimately SK dropped the computerised version, so I never got to do any actual examining, but I did at least get a little insight into what an examiner has to do. But that’s really the extent of my knowledge/experience of English language speaking tests.
From what I gather, no-one really likes these tests. Take another example: the notorious TOEIC, which until quite recently was extremely popular in Korea. I understand that the TOEIC is now beginning to lose its popularity there, after Samsung (and maybe others) announced back in 2012 that they would no longer accept TOEIC scores. Of course, students are still doing it. But what does it measure, especially in Korea? I’ve met quite a few students who have very high TOEIC scores (around the 800+ mark), but who can’t actually speak/use English all that well. I remember at a KOTESOL presentation in 2012, a Taiwanese researcher mentioned that the average score of some of her students was around the 400 mark, to the audible gasps of the Koreans in the room. They were shocked at just how low those scores were. It seems clear, then, that in Korea, where students spend hours memorizing paragraphs for the tests, they’re not really learning English. Most of my students who say they have the TOEIC coming up are doing it for the 10th, 11th or 12th time. Perhaps they just keep going until the right questions come up and they luck out?
Obviously, one of the key problems is that a lot of these tests are done on a computer with no human interaction. Okay, it’s an efficient and convenient process, but it’s totally unnatural. At least the speaking part of the IELTS is done face-to-face, with a real person. But even the IELTS speaking test lasts for only up to 14 minutes. Is that really enough time to accurately judge someone’s English ability? I highly doubt it. I find the whole thing similar to job interviews. I’ve done a few interviews with potential teachers over the last few years, and in the 45 minutes I have with each person, it’s not all that easy to see whether they would fit in at the company. I’m sure there have been some people who would have made good teachers but just didn’t shine in the interview, and I know there are some who I thought did really well in the interview but, after I employed them, were perhaps not what I expected. And I know this is a problem common across all sectors. When I was doing my undergraduate degree in law, we were constantly told how competitive law jobs were; for some of the City firms, there would be 300-400 applicants for each position. In the UK, for example, many of the big law firms hold assessment days instead of, or in addition to, traditional interviews. The idea behind these assessment days is to give the assessors more of a chance to see candidates working naturally, and they typically include a variety of tasks.
So, recently, this has got me thinking about an alternative to the current English language proficiency tests. In fact, when I did my CertTESOL course, in our second week we began our Teaching Practice component. In total there were 12 trainees on the course, and with the exception of one member, we were all total newbies to TEFL. Nevertheless, our course instructors informed us that, as a group, we would have to come up with a way to place all of the students who had signed up for the free classes we would be teaching. As far as I can remember, there were around 40 students to place. Because it was a while ago, and because there was just so much to do, the whole thing is a bit of a blur. But I do remember how we organised the students. We split them into two teams of 20. Six of the trainee teachers worked with one team, and the other six with the second. We further divided the teams into smaller subgroups of around 4 or 5 students each. The students then did a number of activities while we, the trainees, observed. The activities included a partner gap fill, where students were paired up and took turns reading sentences while their partner listened and wrote down what they said. Then there was a free discussion, where the groups spent around 15 minutes discussing some questions we had prepared. And there was also some kind of vocabulary test, which, if I remember correctly, the students did by themselves. During the whole time, we trainees rotated between the groups, until each of us had spent some time with every group. The whole thing took around 90 minutes. Afterwards, the trainees all got together to discuss the students and place them into one of three levels. The surprising thing was that there were very few disagreements among us, and we were very quickly able to group the students by level.
At the end of it all, our tutors even said how surprised they were by our efforts, and that we had finished so much more quickly than previous groups.
Put together my experience on the TESOL course, the way some companies hold assessment days and the major shortcomings of the current English language proficiency tests, and I got to wondering whether there is an alternative to the way these tests are currently run. Would it be feasible to run tests where groups of test-takers visit a centre for a morning or afternoon, and a group of observers gets to know them before placing them at a level? The idea poses many more questions than answers at the moment, but I wonder whether there is something there. As teachers, most of us claim that we should be teaching communicatively – whatever that may mean – but the current tests are far from communicative. I understand that there is a need for tests; companies and universities generally do need to know whether the people they are letting in can speak English. It’s just that the test scores they rely on aren’t always all that reliable.
There is so much more to a student’s level than just their grammatical or lexical ability. We need to know how fluent they are, how they work with others, what they are interested in, their motivation for learning the language and so on. Rarely do we get the chance to learn these things in a 10-minute interview. When I had to place students into levels at my academy, I had a list of questions to ask, and I was essentially trying to elicit a target structure to see whether they had acquired the language at that level yet. I found this difficult, and not very accurate. The thing that struck me time and time again was how students would be able to use the language before and after the interview, but not during it. I eventually gave up asking the questions in such a formal manner and realised that by just talking to the student, I could get a better picture of their ability. Sometimes I’d ask a student to introduce themselves, prompt with a few questions, and before you knew it, 15 minutes had gone by. I’d then suggest a level, and to the student’s surprise they had just done a level test without knowing it. Perhaps an assessment-day-style language test, in groups, over a longer period, would help put students at ease and allow them to be themselves.
Who could employ this style of assessment-day test (or is anyone out there already doing such a thing)? There are, of course, a number of issues. For a language academy, holding an assessment day for new incoming students might be possible once a week or month. You’d need to arrange for the students to all arrive on time, and for a relatively large number of teachers to be present as observers. But doable, surely?
What about in a company? Don’t they already hold interview days where candidates turn up, wait around for their test time, speak to the interviewer for five or ten minutes and then are done? What about putting everyone together and giving them the chance to communicate instead of waiting around for an hour? Give them the chance to get comfortable and relax.
If it worked at a local level, what about on a larger scale? Instead of the tests we have now, would it be possible to create a standardised test, like the IELTS or TOEIC, that could be taken by anyone? At the end of the assessment, the students could be awarded a certificate showing their level.
I think if done right, it would certainly be a step forward in creating an assessment that really did assess a candidate’s overall English ability. Perhaps I’m an optimist and the whole thing is just a bit idealistic. Maybe it would never work, and I’m talking rubbish. But what other alternatives are there? Show me someone who genuinely thinks that current tests are a good indicator of a learner’s English – so long as they’re not an ETS employee or someone working for one of the many test-prep academies across Korea.
Maybe, just maybe, someone can pick through my waffling, poorly-thought-out idea and see some potential?