
The first 1000 screens are in!

Apologies for the radio silence, but it’s been a busy few weeks here behind the scenes at SCALES. Debbie and I have now written to, emailed and phoned every school in Surrey (mainstream, special, independent and even home-school families). That is about 340 schools. There are a few that haven’t replied definitively, but 70% have signed up (that’s almost 9,000 children)! The screening data are also now coming in thick and fast – this week we passed the 1,000-screen mark, proving that Surrey teachers are FABULOUS! Every once in a great while I let myself believe for a moment or two that this might actually work.

On the whole it is working well – many teachers seem to be logging on and completing the screens without a hitch. The database time-stamps all of the entries, so we also know that the average time to complete a screen is about 5 minutes – exactly what all of our piloting indicated. Obviously some take longer. I suspect these are the screens of children who are having difficulties, where the teachers are pondering exactly what these children are doing and how much it is interfering with classroom success. Of course we also have some outliers – unfortunately here we don’t know whether it is a technical problem or whether a teacher has stopped mid-report to answer the phone, grab a cup of tea or run to the loo.
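
For anyone curious how we get those timings, it boils down to subtracting the “started” time stamp from the “submitted” one for each screen and averaging – a minimal sketch in Python below, with entirely invented field names and example rows (our real database and scripts may well look different):

```python
# Sketch: estimating screen completion times from database time stamps.
# Field names and example rows are invented for illustration.
from datetime import datetime

screens = [
    {"screen_id": 1, "started_at": "2012-06-13 09:02:00", "submitted_at": "2012-06-13 09:06:40"},
    {"screen_id": 2, "started_at": "2012-06-13 09:10:00", "submitted_at": "2012-06-13 09:15:30"},
    {"screen_id": 3, "started_at": "2012-06-13 09:20:00", "submitted_at": "2012-06-13 10:05:00"},  # likely interrupted
]

FMT = "%Y-%m-%d %H:%M:%S"
durations = []
for s in screens:
    start = datetime.strptime(s["started_at"], FMT)
    end = datetime.strptime(s["submitted_at"], FMT)
    durations.append((s["screen_id"], (end - start).total_seconds() / 60))

average = sum(minutes for _, minutes in durations) / len(durations)
print(f"average completion time: {average:.1f} minutes")

# Anything much longer than the 5-minute average is more likely a teacher
# interrupted mid-report (phone, tea, loo) than a genuinely slow screen.
for screen_id, minutes in durations:
    if minutes > 20:
        print(f"screen {screen_id} took {minutes:.0f} minutes - worth a second look")
```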

But there have been a few stresses too, and these really do upset me because I don’t want anyone to have a negative experience. I’m also beginning to realise that the schools must have a very different perception of the research set-up from the dull reality of just Debbie and me in our offices, trying to field phone calls and email questions as quickly as possible. So I thought I’d try to explain why things have been set up the way they have been, bearing in mind I really had no idea what was involved in trying to do a study like this before we set off (probably a good thing)…

Where we depart from almost every other screening study in the world is that we are asking teachers to complete the screens, and this is clearly a very big ask. Bruce Tomblin’s group sent an army of researchers out to assess the children themselves. We didn’t do that for a variety of reasons. One is practical – it would take lots of people and quite a bit of time to screen upwards of 9,000–12,000 children, and that just wasn’t going to happen here. The main reason is far more important though – all studies to date that involve screening for language difficulties have been hampered by high rates of false positives. In other words, children look as though they are having difficulties on the screen, but no one is worried about them and they don’t tend to look impaired on formal testing. We wanted a screening procedure that would pick up children whose language difficulties were interfering with their everyday experiences. We also wanted to know how language difficulties affected classroom success from the earliest stages. The beauty of doing this study in the UK is that there is a universal assessment of scholastic attainment at the end of a child’s first year of school – the Early Years Foundation Stage Profile. My idea was to include the language and communication screen with the EYFSP so that we could see how variation in language and communication related to variation in school success. And my previous research experience suggested that teachers are indeed very good at picking up the kids who are having difficulties, which I hoped would improve the sensitivity and specificity of our instrument.
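
For readers who haven’t met sensitivity and specificity before, here is a rough illustration of why false positives are such a headache for screening studies – the numbers below are entirely invented and are not SCALES data:

```python
# Illustration only: invented numbers, not SCALES data.
# Imagine screening 1,000 children, of whom 70 genuinely have language difficulties.
true_positives = 60    # flagged by the screen and genuinely having difficulties
false_negatives = 10   # genuine difficulties missed by the screen
false_positives = 180  # flagged by the screen, but fine on formal testing
true_negatives = 750   # correctly passed by the screen

sensitivity = true_positives / (true_positives + false_negatives)   # ~86%
specificity = true_negatives / (true_negatives + false_positives)   # ~81%
# Positive predictive value: of the children the screen flags, how many are true cases?
ppv = true_positives / (true_positives + false_positives)           # only ~25%

print(f"sensitivity: {sensitivity:.0%}, specificity: {specificity:.0%}, PPV: {ppv:.0%}")
```

Even a screen that misses very few children can flag far more false alarms than true cases when language difficulties are relatively uncommon, which is why we want teachers’ everyday knowledge of the children built into the screening itself.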

When we first applied for the grant, I think we madly suggested that we might have paper-and-pen versions of the screen. Since then, web surveys have become more commonly available, and we learned that the EYFSP was also completed online, so we opted for a web version of the screen. Even at 5 minutes a screen, this is considerable teacher time, so we are paying for supply cover so that each teacher has a full day to do the screening. Here I would stress to beleaguered teachers that our funding and the time to get all of this up and running were fairly limited. So the system we are using does the job, but it does have some eccentricities.

One thing that is clearly driving teachers crazy is the need to input pupil numbers. This is because the data are anonymous to us, but we need some way to link the screens up to the EYFSP at a later date. We have warned everyone about this and suggest they get a list of numbers ready, but of course it would be better if our system could link directly with the databases schools already have and use. Goodness knows how this would be accomplished, but it is obviously the better option – sorry!
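
To give a sense of why the pupil numbers matter, the eventual linkage will look something like the sketch below – field names and records are invented, and the real matching will of course be done far more carefully:

```python
# Sketch of the eventual linkage: join anonymised screen records to EYFSP
# records on the pupil number. All field names and values are invented.
screens = {
    "P1234": {"screen_score": 42},
    "P5678": {"screen_score": 17},
}
eyfsp = {
    "P1234": {"eyfsp_total": 95},
    "P5678": {"eyfsp_total": 61},
    "P9999": {"eyfsp_total": 88},  # EYFSP record with no matching screen
}

linked = []
for pupil_number, screen in screens.items():
    profile = eyfsp.get(pupil_number)
    if profile is None:
        continue  # no EYFSP record for this pupil number yet
    linked.append({"pupil_number": pupil_number, **screen, **profile})

print(linked)
# The pupil number is the only identifier we ever see, which keeps the data
# anonymous to us while still letting the two datasets be joined later on.
```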

The other thing that is not ideal is that if a teacher exits partway through a child’s screen, the data for that child are not saved. I can appreciate that this is a pain – I don’t know why the system works this way, but it does. Hence the occasional outlier for completion time.

The final thing that some people have been less than happy about is the inclusion of an extra 17 questions at the end. This has occurred because, just as we started the project, the Government announced that it was replacing the existing EYFSP from next year. So our cohort would be the last with the current measure, and our study would be hopelessly out of date before it even started! We pestered the DfE until they gave us the preliminary version of the new assessment. We thought long and hard about including it, but in the end thought it would be best to do so and say something about how this might be used to identify children with language difficulties. But obviously not ideal.

We have also had a few minor glitches, like teachers copying the wrong website link, entering numbers in the wrong order, or an occasional warning that pops up when teachers move quickly to the next screen and makes them worry their efforts are not being saved (don’t worry – they are all there!). In these instances teachers phone or email to get help and are probably surprised if we don’t answer immediately. This, I’m recognising, is a big problem, so all the answerphones and automatic email replies now include my mobile number. Of course, the first day we rolled this out, Debbie and I were both in a two-hour-long Athena Swan meeting! As I’ve said before, I could do the SCALES project full-time and still be extremely busy. But unfortunately academic jobs rarely work like that, so I’m trying to cover SCALES and juggle a large number of other things at the same time. But we are working on it and trying to get back to people straight away.

So it is no wonder that I’m not sleeping very well and am generally anxious about collecting the remaining 8,000 screens – we are very definitely learning as we go along and hoping that not too many teachers will be hassled! We are keen to help teachers so that things run as smoothly as possible.

And on a very positive note, the data we have look miraculously sensible and I think are going to be extremely interesting. For our first 1,000 screens, the gender split is 50:50; 11% of the children have English as an additional language; 13% have reported concerns about speech, language or communication; 3% already have a statement for language disorder; and 1% have a statement for ASD. Every day, when we save the growing database, Debbie and I get more excited. And we are extremely grateful to the hard-working teachers of Surrey who are making it happen!