SpLD assessment tools
The list of suitable tests for the assessment of specific learning difficulties (SpLD) in Higher Education is a key part of the National Assessment Framework for Applications for Disabled Students' Allowances. The purpose of the list is to promote quality and consistency in the Disabled Students' Allowances (DSA) process.
[Please note: This list is specifically aimed at students aged 16 and over. There are a significant number of other assessment materials that will be relevant to younger ages. You should evaluate these in terms of their reliability, validity and standardisation sample, and whether the area they assess is relevant to your needs.]
STEC [SpLD Test Evaluation Committee]
STEC is a sub-committee of the SpLD Assessment Standards Committee (SASC). Its purpose is to provide guidance on assessment materials to SASC. Its responsibilities are:
- To review and evaluate assessment materials on a regular basis.
- To revise and update the test recommendations of the SpLD Working Group 2005/DfES Guidelines.
- To maintain a list of approved assessment materials for SpLDs in higher education.
- To engage with the publishers and distributors of assessment materials.
- To carry out other business as advised by and agreed with SASC.
- To report to SASC on a regular basis, including an annual written report to be submitted to SASC’s AGM by the Chair.
STEC’s updated guidance and other relevant documents can be accessed from this page. Updated guidance was released in December 2014 and is downloadable here. Interim updates are below.
Please share this advice with colleagues and other stakeholders.
Academic Achievement Battery (AAB)
The Academic Achievement Battery (AAB) can be used for DSA assessments. The AAB is a large battery of tests which can be purchased as a complete test or in parts. STEC has produced guidance you can access on our downloads page which may assist assessors if they are considering purchasing and using this test. [Update issued October 2015]
Revised Guidance re Adult Reading Test (ART)
SASC has updated its review of the Adult Reading Test (ART). This latest guidance [REVISED guidelines-December 2014-ART Revision1, downloadable from the SASC site] has been revised following information from the publishers.
The ART is currently being restandardised and a new version is planned for 2016. Given this, the advice restricting its use to a qualitative tool only has been removed, as has the advice regarding the 2-minute writing task. The ART remains on the list of approved tests for oral prose reading. [update issued July 2015]
Assessment of Writing Speed
Whilst many assessors find the various subtests of the DASH 17+ useful, it is important to bear in mind that this test battery is designed to assess speed of handwriting rather than speed of free writing. The DASH 17+ free writing task is designed to be as undemanding as possible in order to reduce the impact of any cognitive processing difficulties, such as slow processing speed and/or weak short-term/working memory. Hence, for a Higher Education report, many assessors believe this subtest does not sufficiently challenge the student and will opt to implement a longer free writing task: one that is subject-related and more closely mimics the demands that will be made on the student at HE level. The words-per-minute rate can be calculated and compared to the SpLD Test Evaluation Committee (STEC) expected rate for undergraduate study, which is 25 wpm, and the work analysed in line with the guidance given. [update April 2015]
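The rate comparison above is simple arithmetic. As a minimal sketch (the word count and task length below are hypothetical illustrations, not STEC figures):

```python
# Expected free-writing rate for undergraduate study, per STEC guidance.
STEC_EXPECTED_WPM = 25

def words_per_minute(word_count: int, minutes: float) -> float:
    """Words-per-minute rate for a timed free writing task."""
    return word_count / minutes

# Hypothetical example: a student writes 460 words in a 20-minute task.
rate = words_per_minute(460, 20)
print(f"{rate:.1f} wpm against an expected {STEC_EXPECTED_WPM} wpm")
```

Here the student's rate of 23.0 wpm falls just below the expected rate, which would then be interpreted alongside the qualitative analysis of the writing itself.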
Confirming guidance agreed in October, percentiles can be used at the discretion of the assessor but are not considered to be mandatory for APC renewal. If an assessor chooses to include percentiles in a report, particular care must be taken: they can magnify small differences within the average range (SS 85-115) which may not be significant, yet large differences near the outer limits of the average range can seem reduced. The standard score must always be given alongside the percentile. [Update April 2015]
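The magnification effect described above can be illustrated numerically. This sketch assumes standard scores follow the usual convention of a normal distribution with mean 100 and standard deviation 15:

```python
from statistics import NormalDist

# Standard scores conventionally follow a normal distribution
# with mean 100 and standard deviation 15.
ss_dist = NormalDist(mu=100, sigma=15)

def percentile(standard_score: float) -> float:
    """Percentile rank corresponding to a standard score."""
    return ss_dist.cdf(standard_score) * 100

# A 5-point difference near the centre of the average range...
mid_gap = percentile(105) - percentile(100)
# ...versus the same 5-point difference at its upper edge.
edge_gap = percentile(120) - percentile(115)
print(f"SS 100->105 spans {mid_gap:.1f} percentile points; "
      f"SS 115->120 spans {edge_gap:.1f}")
```

A 5-point standard-score gap spans roughly 13 percentile points near the mean but only about 7 near the edge of the average range, which is why the standard score must accompany any percentile reported.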
Maintaining test standards
SASC advises that it is best practice for assessors not to include examples of errors taken from test papers in reports at all. Advice for training and assessment should be not to state test items, but only to give examples of the types of errors being made. Assessors are also reminded that working papers from assessments should not be made available, as these could affect the standardisation of the test. [update issued August 2014]
TOMAL 2 [Test of Memory and Learning, 2nd edition]
All 5 subtests of the Attention/Concentration Index [ACI] should ideally be administered to enable a fully accurate composite to be calculated. If for any reason a subtest score has been prorated, it would be important for the assessor to indicate clearly in the report that the ACI score has been prorated and hence can only be used for a ‘statistical rather than a clinical purpose’ (TOMAL2 p.59). The manual (p.59) also states ‘Although composites scores that contain a prorated value may be profiled, a specific prorated subtest standard score should neither be profiled nor taken as a reflection of an examinee’s memory performance on the subtest the prorated score represent.’ Consequently, we would not recommend that the individual prorated subtest score be documented in the report. [update issued August 2014]
Wechsler Abbreviated Scale of Intelligence® - Second Edition (WASI®-II)
The WASI-II is acceptable for DSA assessments in all areas in which the WASI was listed. The WASI-II is an updated version of the assessment battery. The original WASI is still acceptable as long as the forms are available.
Woodcock Reading Mastery Tests: 3rd Edition (WRMT-III). Pearson. 5-75+ years
STEC approves the revised edition of this test battery with the following cautionary note:
In the absence of the publishers/importers providing (a) a re-standardisation to accommodate UK culture and normative standards and (b) guidance/substituted materials on word/picture/phrase substitutions for Americanisms and other cultural features, STEC advises users to apply the necessary caution when administering the test. Notwithstanding this point, STEC considers this test useful and appropriate to administer for the purposes intended.
STEC response to recent debate about WRIT
The purpose of an ability test is to rule out general learning difficulties and to examine potential – the WRIT can do both. A comparison between the WRIT and the WAIS is not particularly helpful, as Verbal Analogies and Similarities are not measuring the same thing. Most people using the WRIT are well aware that there is some cultural bias, as the test was not developed in the UK.
‘Tests don't diagnose, people do.’ An assessment is a differential diagnosis which uses a battery of tests to come to a conclusion about strengths and weaknesses; it does not rely on any individual test for a diagnosis. Specialist dyslexia assessors do not rely only on the WRIT scores or a working memory deficit to make a diagnosis of dyslexia. They use data from other tests and take into account the verbal abilities of the student throughout the whole assessment process, noting receptive and expressive language skills. The assessment will also look at difficulties with working memory, phonological weakness and speed of processing, literacy weaknesses, and specific skills associated with reading and writing. Additional testing may be carried out if supplementary evidence is required.
Terms of Reference and Minutes of STEC meetings are accessible from the downloads page.