Bathmaker, A.-M. (2007) The impact of Skills for Life on adult basic skills in
England: how should we interpret trends in participation and achievement?
International Journal of Lifelong Education, 26, 3, pp.295-313. DOI: 10.1080/02601370701362283.
Abstract
The English Skills for Life strategy symbolises the prominent place that adult basic
skills have claimed in education and training policy in England since the
beginning of this century. The strategy aims to improve the skills of a large
number of learners over a ten year period (2001-2010). This paper explores what
we can learn about the impact of the strategy from an analysis of available
statistical data. The paper presents trends in participation and achievement over
the first four years of the strategy, which indicate a pattern of diminishing
returns to numbers participating over time, and which may well reflect the
growing difficulties the policy will face of engaging ‘hard to reach’ learners.
Alongside this analysis, the paper raises a number of issues concerning the
limitations of available statistical data in providing answers to questions such as
the progress made by learners and their subsequent progression, both within and
beyond adult basic skills provision. The paper goes on to argue that the strong
emphasis on a numerical target related to qualification outcomes may serve to
focus both practitioners’ and policy makers’ attention on this aspect alone. This,
it is argued, may serve the interests of international benchmarking of skills levels
in the population, but may do rather less in helping to improve learners’ lives
and capabilities.
Keywords: Adult Basic Skills, Skills for Life, lifelong learning, participation,
achievement
Correspondence: Ann-Marie Bathmaker, Faculty of Education, University of the
West of England, Coldharbour Lane, Bristol BS16 1QY, UK
Ann-marie.Bathmaker@uwe.ac.uk
Introduction
In 2001 the UK government launched a major policy strategy in England to
address adult basic skills needs, entitled Skills for Life (SfL). The strategy
responds to growing concern that the basic skills in literacy, language and
numeracy of a considerable number of adults are inadequate for functioning
successfully in 21st century society. Skills for Life represents the first large-scale
intervention in the area of adult basic skills education since the 1970s. This
paper is concerned with the impact of the strategy on levels of skill in the adult
population. It focuses specifically on the picture that we can gain from available
statistical data of trends in participation and achievement over the duration of
the Skills for Life strategy in its first phase from 2001 to 2004.
The work reported in this paper forms part of a study of the impact of the Skills
for Life strategy on learners’ lives, funded by the National Research and
Development Centre for adult literacy and numeracy (NRDC) [1]. The study as a
whole involves both quantitative and qualitative strands. The quantitative work
aims to investigate the impact of Skills for Life on learners, by examining existing
statistical data, and also by gathering new data using specially-devised tests for
literacy, numeracy and English for Speakers of Other Languages (ESOL). This
paper considers what can be learned from existing statistical data [2].
The first part of the paper outlines what is meant by Skills for Life and identifies
what statistical data are available on adult basic skills in England. The second
part of the paper presents an overview of trends which can be identified in
participation and achievement of adult basic skills over the first four years of the
strategy, using data collected by the Learning and Skills Council (LSC). The final
section discusses issues which arise from using these data to understand the
impact of Skills for Life on adult basic skills provision, and considers trends
identified in the data, the limitations of the data, and implications for the future.
What is Skills for Life?
The primary aim of Skills for Life is to ‘make sure that England has one of the best
adult literacy and numeracy rates in the world’ (National Audit Office, 2004,
p.20). The long term vision is ‘ultimately to eliminate the problem’ of poor levels
of adult literacy and numeracy (ibid). As the name given to the policy suggests,
literacy, language and numeracy skills are deemed essential to people’s lives, to
enable them to participate or function effectively in work and in society more
widely. Increased interest in adult basic skills in current government policy in
England is closely linked to concerns for economic competitiveness in a
globalised economy, and to concerns about social exclusion and promoting active
citizenship. The rise of policy interest in adult basic skills at the beginning of the
21st century and the reasons underlying it follow a similar pattern to other
advanced industrialised countries (see Hamilton and Barton (2000) for comments
on OECD policy; Maclachlan and Cloonan (2003) for comments on Scottish
policy, and Searle (2004) for comments on Australian policy). There is broad
agreement that current education and training policy is based on a human capital
model of skill, within a new work order vision of global capitalism. Here,
upskilling is considered essential if individuals are to be employable and able to
compete with others in a global economy, and basic and key skills in literacy,
language and numeracy are seen as fundamental to wider skill development.
The vision implies that economic prosperity follows from achieving high levels
of skill and credentials, although this vision has been strongly critiqued by
researchers such as Keep and Mayhew (1999).
The ‘problem’ of adult basic skills
While policy interest in adult literacy and adult basic skills in England goes back
to the first half of the 20th century, Skills for Life can be seen as representing one of
two significant periods of campaigning in this area by national government. The
first was in the 1970s, when the Adult Literacy Resource Agency (ALRA)3 was
established, but as Hamilton and Hillier (2006) have observed in a historical
account of policy in this area, adult basic skills have not formed a priority for
politicians in England in the intervening period. Current policy interest was
sparked by the findings of the International Adult Literacy Survey (IALS) carried
out in the 1990s by the Organisation for Economic Co-operation and
Development (OECD, 1997). The OECD reported that the UK had a greater
percentage of adults with low levels of literacy and numeracy than 13 of the 20
countries included in the survey, which were its ‘international competitors’.
The Moser Report (DfEE, 1999) which followed IALS, suggested that up to 7
million adults in England had poor levels of literacy, and even more had
problems with numeracy. Moser’s arguments were influential with the DfEE
(now DfES) as indicated in the following quote from the DfEE Skills for Life
strategy document in 2001:
The ground-breaking report, A Fresh Start, published in March 1999
following the review chaired by Sir Claus Moser, identified up to 7
million adults in England who cannot read or write at the level we
would expect of an 11-year-old. Even more have trouble with
numbers. (DfEE, 2001, p.8)
While IALS and the Moser Report acted as significant influences on the then
newly-elected Labour Government, and were used to gain financial support for a
large-scale basic skills initiative, the findings of IALS in particular have faced
strong critiques. These critiques focus on two aspects of how literacy is
measured. One involves the technical procedures and assumptions in IALS (see
for example, the work of Blum, Goldstein and Guérin-Pace, 2001). The other
concerns the idea that there can be a common definition of literacy across
cultures, which conflicts with understandings of literacies as socially-situated
practices (see for example Hamilton and Barton, 2000).
Despite these concerns, which were published prior to the launch of Skills for Life,
the subsequent survey carried out in 2002/03 as part of Skills for Life (DfES,
2003b), reported that poor levels of literacy affected 17.8 million, well over
double Moser’s figure, but using a different baseline criterion. This survey has in
turn faced critique from Sticht (2004), who believes that:
The Skills for Life survey has limited value as a measure of the literacy
skills of the adult population 16 to 65 years old in England. It lacks
construct validity, meaning that it is not certain what skills and
knowledge the survey is assessing. It is inconsistent with the adults’
own perceptions of the adequacy of their literacy skills for meeting
everyday needs [.] (Sticht, 2004, p.4)
Yet in the National Audit Office Report of 2004, the figure given for adults with
literacy and numeracy needs was 26 million (National Audit Office, 2004, p.6),
using yet another criterion, different from that used in the Skills for Life survey. Even
the smallest of these figures, Moser’s 7 million, suggests a major problem, but the
figures given in the DfES Skills for Life survey and the Audit Office report turn
the problem into what seems more like a major crisis. A front page Guardian
report in January 2006 (Smithers, 2006) entitled ‘12m workers have reading age
of children’ indicates precisely how such numbers are used to create a sense of
crisis, and pays scant attention to the issues raised in the research debate over
IALS, which question how such figures are generated.
The differences in the numbers quoted above result at least in part from how
levels which count as insufficient are defined. Whereas Moser talked of adults
whose skills did not match those expected of an 11 year old, the Skills for Life
strategy extends its remit to a much higher level of skill: level 2 in the English
national qualifications framework [4]. This is the equivalent of a good GCSE, the
achievement goal for 16 year olds at the end of compulsory schooling. Moreover,
the scale of the problem is understood in terms of those who have not achieved a
qualification outcome. Thus, the DfES and Audit Office figures embrace any
adult (aged 16-65) who has not achieved a qualification at level 2 in the national
qualifications framework.
This point is not insignificant in helping to drive and define the current Skills for
Life strategy. The focus on certificated achievement is emphasised in the
headline target for the strategy, which is the number of individuals achieving
qualification outcomes. The strategy as a whole covers three levels of
achievement: Entry Level 3, Level 1 or Level 2, which represent some of the
lower (but not the lowest) levels of the English National Qualifications
Framework (NQF) [5] and which should not be confused with National Curriculum
levels in schools.
The scope of the strategy (from very low levels of literacy, language and
numeracy skills to GCSE-equivalent level 2 skills) links in to the government’s
wider occupational skills strategy, which sets level 2 qualifications as a key
benchmark for vocational skills levels in the working population (DfES, 2003c).
Moreover, the definition of adult within the Skills for Life strategy as persons aged
16-65, which follows the definition used in the IALS surveys, and reflects the
current standard working age in England, reinforces the connection with
employability and economic competitiveness. The point here is that the Skills for
Life strategy appears to redefine the nature, not to mention the size of the
‘problem’, placing strong emphasis on qualification outcomes rather than other
evidence of progress, and encouraging also a stronger interest in qualification
levels that are perceived to have wider economic value, that is, level 2 and above
in the national qualifications framework. This is certainly not the way adult
basic skills needs have been understood in the past (see Hamilton, 1996), and
indicates how Skills for Life is not simply the latest means of addressing the long-
term issue of adult basic skills, but is acting to change how that issue is
understood and dealt with.
Available statistical data on adult basic skills
Central to the Skills for Life strategy are quantifiable targets for improvement.
When the government introduced the strategy in 2001, it established a target to
be met by the end of the decade, with two interim targets along the way. The
initial target was to improve the literacy and numeracy skills of 750,000 adults in
England by July 2004 (DfEE, 2001), increasing to 1.5 million by 2007 and 2.25
million by 2010. These targets embrace anyone engaged in learning over the age
of 16 (and up to the age of 65) in any form of provision, except for schools and
universities, and they are measured by the achievement of accredited outcomes.
This has meant that the collection of statistical data on adult basic skills,
particularly on the achievement of qualifications, has become a highly important
part of the work of the strategy. The next section of the paper gives an overview
of what data are collected on adult basic skills, and indicates what they can tell
us about skills levels amongst the adult population.
There are three types of statistical data available in England: firstly, there are
data on the scale of need; secondly, there are (very limited) data on learners’
progress; and thirdly, there are data on learners’ levels of achievement.
Although they all contribute to an overall picture, each type of data offers a
different perspective, and it turns out to be very difficult to gauge trends over
time, as no data have been collected consistently over a longer period.
Data on scale of need
There have been a number of surveys in England which provide data on the scale
of need in adult basic skills (reviewed by Brooks et al, 2001a). The earliest survey
which Brooks et al identify was carried out in 1972 as part of the National Survey
of Health and Development (Rodgers, 1986). The most recent was undertaken in
2002/3 by the DfES (DfES, 2003b). These surveys collected their data using two
main approaches: firstly, self-reporting by adults on their level of skill in literacy,
numeracy or ESOL, and secondly, one-off performance tests undertaken by
individuals to assess their level of skill. They indicate scale of need, rather than
progress over time, and it is difficult to compare scale of need over time, as
different approaches to collecting data have been used from one survey to
another.
In addition to the above studies, a baseline survey was commissioned by the
DfES at the commencement of the Skills for Life strategy, carried out between June
2002 and May 2003 in England (DfES, 2003b). The purpose of the survey was to
produce a national profile of levels of competence in literacy and numeracy, and
to assess the impact of different levels of skill on people’s lives, the latter broken
down into work and everyday life. 8,730 randomly selected adults completed a
questionnaire, which gathered behavioural and demographic data, and
respondents completed two assessments, one for literacy and one for numeracy.
The percentage responses were then applied to the population of England as a
whole. The data have been used to suggest that in 2002/03 66% or 17.8 million
adults (16-65 year olds) had literacy skills at level 1 or below, and that 75% or
23.8 million adults had numeracy skills at level 1 or below. Thus the scale of
need, based on this survey, would appear to be enormous.
Data on progress
Only two studies have been undertaken specifically to assess learners’ progress
using a skills assessment instrument, where learners are tested on their skill
level, and then re-tested at a later date to evaluate progress. Both of these studies
investigated adult literacy, and not numeracy or ESOL. The first was in 1976-79,
carried out for the Department of Education and Science by the National
Foundation for Educational Research (NFER) (Gorman, 1981). The second was
undertaken twenty years later in 1998-1999 by NFER for the Basic Skills Agency
(Brooks et al, 2001b). These two surveys found gains in both reading and
writing. Brooks et al’s (2001a) evaluation of the data suggests that in the earlier
survey, the gains in writing were considered to be significant in educational
terms, whilst in the second study, the gains in reading could be considered
significant educationally.
In addition to the above research, there are two further studies, both of which
form part of the lifetime cohort studies undertaken in England, where
comparable data have been collected over time. The first involved 3000+ people
in the 1946 lifetime cohort study who took an identical reading test in 1961 at age
15 and again in 1972 at age 26; the average score had risen significantly (reported
in Rodgers, 1986). The second was part of the 2004 sweep of the British Cohort
Study (BCS70), which used some of the same items as were used in the previous
sweep with this lifetime cohort in 1991-2; the results are only now beginning to
appear (Bynner and Parsons, 2005; Parsons and Bynner, 2005). As this paper
goes on to discuss, recent data used to evaluate progress in relation to the Skills
for Life strategy do not actually assess learners’ progress, but learners’
achievement, which is not the same thing.
Data on levels of achievement
More extensive data are available on levels of achievement than on levels of need
and progress over time, if achievement is understood as completion of
certificated outcomes. Awarding bodies hold data on the number of candidates
achieving their qualifications. In addition, since the introduction of the Skills for
Life strategy, a number of different organisations are involved in providing data
on levels of achievement, as shown in Table 3.
Table 3: Organisations which collect data on achievement of Skills for Life targets
Learning and Skills Council (LSC): data on all provision which is funded in the Learning and Skills sector, particularly further education colleges.
Offenders Learning and Skills Unit (OLSU): data on prisoners and those on probation.
Jobcentre Plus (based in the Department for Work and Pensions): data on unemployed people and jobseekers who have basic skills needs.
Qualifications awarding bodies such as OCR, Edexcel and City and Guilds: data on achievement of awarding body qualifications.
The Department for Education and Skills (DfES) statistical branch: data from all the above organisations, which are then reviewed and monitored by the DfES.
Data on offenders (gathered by OLSU) and data on jobseekers (gathered by
Jobcentre Plus) are global figures collected on a regular basis, which simply give
the total number of individuals achieving qualifications which count towards the
Skills for Life target in the collection period. Awarding bodies for qualifications
gather more detailed data than this, but data gathered by them which are
relevant to Skills for Life targets are almost all incorporated into the Learning and
Skills Council database. The Learning and Skills Council dataset is the most
detailed, offering a breakdown of the data using a range of factors, which include
for example age, gender and ethnicity. The work undertaken by the DfES
statistical branch checks LSC data against that collected by other organisations,
and carries out further analysis, but the DfES does not undertake additional data-
gathering. It is for this reason that the analysis below is based on Learning and
Skills Council data. Moreover, reports on Skills for Life by the National Audit
Office (2004) and by the House of Commons Committee of Public Accounts
(2006) have both relied on LSC data.
The original aim of exploring existing statistical data was to develop a picture of
the impact of Skills for Life on learners’ progress based on such data. However,
this aim had to be modified, because none of the datasets identified above,
including the LSC data, record individual progress with any accuracy. Whilst
available data provide information about learners’ achievement of certificated
outcomes, this is not necessarily the same as progress. This is because the
records do not provide accurate data on learners’ levels of achievement at the
start of a programme of learning, so they cannot offer an accurate picture of
subsequent progress. Furthermore, from the point of view of assessing learners’
skill levels, researchers who use standardised assessment instruments would be
concerned that even when a record is kept on learners’ achievement before and
after a programme, these data are not based on matched tests, whereby learners’
skills are assessed at the beginning of a programme of study and at a later point
using the same or a statistically equated instrument.
Whilst it is therefore important to be cautious about what we can learn about the
impact of Skills for Life on learners from the statistical data gathered by the
Learning and Skills Council, it is nevertheless possible to detect trends in
participation and achievement using these data. The next section of this paper
presents a picture of these trends.
Trends in participation and achievement: an analysis of LSC data
Source and analysis of data
The analysis in this paper focuses on the first ‘milestone’ target for the Skills for
Life strategy, which was to improve the literacy and numeracy skills of 750,000
adults in England between 2001 and July 2004, a target that was met successfully.
According to LSC estimates, the number of learners achieving outcomes counting
towards the target was 862,000 [6], exceeding the target of 750,000 by over 100,000.
The LSC data were a key source of evidence for measuring progress towards the
target, and have been used in a number of reports on Skills for Life (for example
National Audit Office, 2004; House of Commons Committee of Public Accounts,
2006; Smithers, 2006). The LSC collects detailed data on learners participating in
LSC-funded provision, many of whom are learning in further education colleges.
The purpose of the LSC dataset is to monitor funding and this influences the
information that is collected. The Individualised Learner Record records
participation in learning by individuals in the form of what are called ‘learning
aims’. Records are kept at three stages: firstly, at the outset, when learning
opportunities are taken up; secondly, on completion of an agreed programme of
study, and finally, when learning aims are achieved, usually in the form of a
qualification outcome. A list of qualifications approved as counting towards the
Skills for Life target is provided by the Qualifications and Curriculum Authority.
Within the LSC dataset, one learner is not equal to one
learning aim, because an individual learner may have several learning aims. For
example, a learner who is working on both literacy and numeracy will have two
different learning aims, as will a learner who is attending ESOL classes alongside
a vocational qualification. This does not make it straightforward to quantify the
number of individual learners who have participated in provision.
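To make the distinction between learning aims and individual learners concrete, the minimal sketch below counts both from a handful of invented records. The record structure and field names are hypothetical and are not the actual ILR schema; the point is simply that the number of learning aims can exceed the number of individual learners.

```python
# Illustrative only: hypothetical records and field names, not the actual ILR schema.
records = [
    {"learner_id": "A01", "aim": "literacy Entry 3"},
    {"learner_id": "A01", "aim": "numeracy Level 1"},  # same learner, second aim
    {"learner_id": "B02", "aim": "ESOL Level 1"},
    {"learner_id": "B02", "aim": "NVQ unit"},          # aim taken alongside a vocational course
    {"learner_id": "C03", "aim": "literacy Level 2"},
]

total_aims = len(records)                                    # 5 learning aims
distinct_learners = len({r["learner_id"] for r in records})  # 3 individual learners
print(total_aims, distinct_learners)                         # -> 5 3
```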
The data used here were supplied by the Learning and Skills Council data
division in 2005. They provided a subset of data from the LSC Individualised
Learner Record (ILR) relating specifically to adult basic skills. This subset
consisted of aggregated Skills for Life data collected by the LSC for the years 2000-
04. It included learners on adult basic skills courses which counted towards the
Skills for Life target, and figures for learners on other courses which did not count
towards the policy target. The data were provided by the LSC in yearly files,
with one complete file for each year from 2000/01 to 2003/04.
The data were collected in the same way each year by the LSC. Institutions
submit data on a number of census dates throughout the year, and the LSC
updates their dataset a number of times. Data for all years shown here are taken
from the ‘final freeze’ of data for these years, and considered to be accurate in
February 2006 (personal correspondence with LSC). Only the data for learners
who counted towards the target are used in the paper, and the data have been
used here to make a comparison across years. Thus the current analysis
discusses what we can learn about trends over time. Future analysis of the ILR is
planned, which will explore in more detail what lies behind this overview.
Presentation of data
The data presented here show numbers of learning aims in the form of learning
opportunities taken up, completed and achieved, and not numbers of individual
learners.
The Figures below show the pattern over four years of take up, completion and
achievement of qualifications which relate to the Skills for Life target. The first
three figures all show an upward trend for participation and achievement
between 2000 and 2004. Figure 1 shows how many learning opportunities were
taken up which counted towards the Skills for Life target each year from 2000-01
through to 2003-04. Over the four data collection periods, learning opportunities
taken up rose in 2001-02 and then fell in 2002-03, despite the addition of work-
based learning (WBL) in this third year of data collection. They then rose again
in 2003-04.
[Figure 1: Learning opportunities taken up per year, 2000/01 to 2003/04, rising to 1,066,085 in 2003/04 (see Figure 4 for the full yearly values). Source: Aggregated data files for SfL data 2000/01 to 2003/04 supplied by LSC July 2005]
Figure 2 shows how many learning aims were completed. Learning aims
completed includes those which were recorded as achieved, not achieved, and
where the outcome was unknown.
As with take-up, over the four data collection periods, numbers for learning
completed rose between 2000/01 and 2001/02, fell back in 2002/03, and then
rose considerably in 2003/04.
[Figure 2: Learning opportunities completed per year: 410,572 (2000/01), 492,343 (2001/02), 454,867 (2002/03), 636,639 (2003/04). Source: Aggregated data files for SfL data 2000/01 to 2003/04 supplied by LSC July 2005]
Figure 3 shows how many learning aims were achieved each year, using only the
records for aims which were completed and achieved. These show a similar
pattern to learning opportunities taken up; there was a rise between 2000/01 and
2001/02, a small fall in 2002/03 and then a significant rise in 2003/04.
[Figure 3: Learning aims achieved per year: 216,127 (2000/01), 269,592 (2001/02), 265,072 (2002/03), 370,955 (2003/04). Source: Aggregated data files for SfL 2000/01 to 2003/04 supplied by LSC July 2005]
Figure 4 is a summary of charts 1, 2 and 3, and shows numbers of learning
opportunities taken up, completed and achieved together. Here we can see
clearly that while there is an overall rising trend in participation, completion and
achievement, there is at the same time a big step down in overall numbers at
each stage.
[Figure 4: Learning opportunities taken up, learning opportunities completed and learning aims achieved per year. Taken up: 537,877 (2000/01), 686,755 (2001/02), 633,622 (2002/03), 1,066,085 (2003/04). Completed: 410,572; 492,343; 454,867; 636,639. Achieved: 216,127; 269,592; 265,072; 370,955. Source: Aggregated data files for SfL 2000/01 to 2003/04 supplied by LSC July 2005]
The extent of these diminishing returns is more apparent when considering the
ratio of completion to uptake, and achievement to uptake, as shown in the next
two Figures. In contrast to the upward trend in total numbers over the four
years, Figures 5 and 6 indicate a downward trend in percentage completion and
achievement over the same period. Figure 5 is a result of expressing the numbers
in Figure 2 (learning opportunities completed) as percentages of the numbers in
Figure 1 (learning opportunities taken up) and shows what proportion of the
learning opportunities initially taken up were followed through to completion.
Here the ratio of completion to uptake falls by just over 16 percentage points
over the four-year period, from 76.3% of learning opportunities completed in
2000/01 to 59.7% in 2003/04.
[Figure 5: Learning opportunities completed as a percentage of learning opportunities taken up per year: 76.3% (2000/01), 71.7% (2001/02), 71.8% (2002/03), 59.7% (2003/04). Source: Aggregated data files for SfL data 2000/01 to 2003/04 supplied by LSC July 2005]
Figure 6 similarly is a result of expressing the numbers in Figure 3 (learning aims
achieved) as percentages of the numbers in Figure 1 (learning opportunities
taken up). Thus it shows what proportion of the learning opportunities
embarked on led to successful achievement. These percentages hovered at
around 40% for the first three years, and then dropped to 34.8% in 2003/04.
[Figure 6: Learning aims achieved as a percentage of learning opportunities taken up per year: 40.2% (2000/01), 39.3% (2001/02), 41.8% (2002/03), 34.8% (2003/04). Source: Aggregated data files for SfL data 2000/01 to 2003/04 supplied by LSC July 2005]
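As a cross-check, the percentages reported in Figures 5 and 6 can be reproduced from the yearly totals shown in the data labels of Figures 1 to 4. The short Python sketch below does this; it works only from those published totals, not from the underlying ILR records.

```python
# Yearly totals taken from the data labels in Figures 1-4 above.
taken_up  = {"2000/01": 537877, "2001/02": 686755, "2002/03": 633622, "2003/04": 1066085}
completed = {"2000/01": 410572, "2001/02": 492343, "2002/03": 454867, "2003/04": 636639}
achieved  = {"2000/01": 216127, "2001/02": 269592, "2002/03": 265072, "2003/04": 370955}

for year in taken_up:
    completion_rate  = 100 * completed[year] / taken_up[year]   # basis of Figure 5
    achievement_rate = 100 * achieved[year] / taken_up[year]    # basis of Figure 6
    print(f"{year}: completed {completion_rate:.1f}%, achieved {achievement_rate:.1f}%")
# 2000/01 gives 76.3% and 40.2%; 2003/04 gives 59.7% and 34.8%, matching the charts.
```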
So in contrast to total numbers for participation, completion and achievement,
which rose over the four-year period (as shown in Figure 4), there is a downward
trend in the proportion of opportunities taken up that were completed, and in the
proportion of opportunities taken up that led to aims achieved. What Figures 5 and
6 make clear are the diminishing returns to participation, if measured in terms of
completion and achievement of outcomes. Thus, by 2003/04, it took considerably
more initial participation to achieve the rise in the total number of learning aims
achieved in that year.
However, there are different ways in which the LSC outcomes data can be
aggregated to gauge the rate of achievement to initial participation. Figure 7
shows a breakdown of how all the outcomes for Skills for Life provision were
recorded in the LSC database for the period from 2000 to 2004.
[Figure 7: Outcomes of learning opportunities taken up by year, 2000/01 to 2003/04, broken down into: completed - achieved; completed - unknown outcome; continuing; completed - not achieved; withdrawn. Source: Aggregated data files for SfL data 2000/01 to 2003/04 supplied by LSC July 2005]
Five possible outcomes are shown here: completion and successful achievement;
completion but outcome unknown; continuation of learning; completion without
achievement, and withdrawn. If the numbers for the learners who were
continuing to study and whose outcome was unknown are removed from the
initial number of learning aims taken up, then the number of learning aims
achieved as a percentage of learning opportunities taken up appears higher than
in Figure 6 above. This is shown in Figure 8 below. The reason for pointing out
these two different ways of presenting the data is that the version presented in
Figure 8, not that in Figure 6, was provided in the dataset supplied by the LSC.
[Figure 8: Learning aims achieved as a percentage of learning opportunities taken up, omitting continuing and unknown outcomes: 51.5% (2000/01), 44.8% (2001/02), 46.3% (2002/03), 46.4% (2003/04). Source: Aggregated data files for SfL data 2000/01 to 2003/04 supplied by LSC July 2005]
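The difference between Figure 6 and Figure 8 is purely one of denominator: Figure 6 divides aims achieved by all learning opportunities taken up, while Figure 8 first removes aims recorded as continuing or as completed with an unknown outcome. The sketch below illustrates the two calculations for 2000/01; the counts for continuing and unknown outcomes are placeholders chosen only so that the result matches the published 51.5%, since the actual values sit in the LSC files behind Figure 7 and are not reproduced here.

```python
# 2000/01 figures from the charts above; 'continuing' and 'unknown' are placeholders,
# not values taken from the LSC dataset.
taken_up   = 537877   # learning opportunities taken up
achieved   = 216127   # learning aims achieved
continuing = 90000    # placeholder
unknown    = 28000    # placeholder

rate_fig6 = 100 * achieved / taken_up                           # all take-up in denominator
rate_fig8 = 100 * achieved / (taken_up - continuing - unknown)  # continuing/unknown removed
print(f"Figure 6 basis: {rate_fig6:.1f}%  Figure 8 basis: {rate_fig8:.1f}%")  # 40.2% vs 51.5%
```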
Discussion: unravelling what we can learn from the LSC data
What we can learn from the data
A number of issues arise from the data presented above. Firstly, efforts to
increase participation would appear to be working. However, there is a
considerable attrition rate between enrolment, completion and achievement of
qualifications. Moreover, this appears to be getting worse. In other words,
although more learners were embarking on learning over the four year period,
proportionally fewer were completing and achieving. This is a significant
concern in relation to the cost of the Skills for Life strategy, if it is measured in
terms of achievement of qualifications. The above pattern raises a further issue,
which relates to the difficulty of reaching what are described in policy terms as
the ‘hard to reach’. As the data show, it is not just a matter of reaching learners,
but getting them to complete courses, and then undertake assessment and
achieve qualifications.
A further issue which arises lies in the way that the data are analysed and
presented, which may be influenced by the message that one wishes to convey.
Thus the desire to present a more positive picture of the success of Skills for Life
would lead to the choice of Figure 8 over Figure 6. However, tidying away
complexity may obscure important pointers and questions for practice, for the
different outcomes recorded in Figure 7 raise questions such as: What does it
mean to complete but not achieve? Does it mean that the learners in question left
without taking the test? Does it mean that they failed the test? Does it mean that
they have yet to take the test? Does it mean that they are still participating in
learning and will take the test at some future date?
More fundamentally, in terms of progress with adult literacy, language and
numeracy, does completing a course but not having a certificated outcome mean
that a person has not improved their skills? For whilst qualification outcomes
may offer an apparently straightforward way of providing an overview of ‘the
state of the nation’ in terms of basic skills levels, they may do rather less in terms
of providing evidence of progress and improvement by individual learners. This
is not intended to suggest that accreditation is not important to learners. As
Hamilton and Merrifield (2000, p.271) have observed: ‘Many students want
credentials, both to boost their self-confidence and to gain access to further
training and education.’ However, to measure the gains from adult basic skills
provision in terms of qualifications alone ignores wider benefits of learning,
which are discussed further below.
The limitations of the LSC data
Although the LSC data comprise the most comprehensive dataset available, there
are limitations to using these data both to measure progress towards the Skills for
Life targets, and to evaluate progress and progression. It is important to point
out here that this is not a criticism of the LSC Individualised Learner Record
database specifically, because it was not set up for these purposes. However, the
limitations need to be heeded when using LSC data to make claims about Skills
for Life.
The Skills for Life strategy aims to improve the skills of learners. However, what
counts as improvement within the strategy is a rather complex affair.
Improvement can be in literacy, numeracy or English for Speakers of Other
Languages (ESOL), but it has to be through nationally approved qualifications,
and must be at specified levels. Yet there are problems with this specification,
for at Entry Level, which is split into three further ‘sub-levels’, awarding bodies
did not disaggregate achievement across the three levels until 2004.
The picture is complicated further, in that the 2001 Skills for Life target of 750,000
is intended to mean 750,000 different learners. If a learner achieves in more than
one basic skill, in literacy and numeracy, or in numeracy and ESOL for example,
then that learner should still only count once towards the target. Moreover, if a
learner achieves at Level 1 and then moves on and successfully completes a
qualification at Level 2, that learner should still only be counted once.
In order to identify the number of individuals who count towards the Skills for
Life target from the above data, a formula has to be used by the DfES, which is
based on an estimate of the number of individuals who are working towards and
achieving more than one Skills for Life target (personal communication, DfES,
2004 [7]). It is difficult therefore to be sure that each learner is only counted once.
This may prove even more difficult in working towards the 2007 target of 1.5
million, where learners who were included in the 2004 results are not supposed
to be counted again.
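To illustrate why such an adjustment is needed, the sketch below shows one very simple way of moving from a count of qualifying achievements to an estimate of individual learners. It is not the DfES formula, which is not published in this paper; both the totals and the assumed overlap rate are invented purely for illustration.

```python
# Illustrative only: NOT the DfES formula. All numbers here are invented.
aims_achieved   = 1_000_000  # hypothetical count of qualifying achievements in a period
assumed_overlap = 0.16       # hypothetical share of achievements that are a learner's
                             # second (or later) qualifying outcome in the same period

estimated_individuals = round(aims_achieved * (1 - assumed_overlap))
print(estimated_individuals)  # 840000 individuals counted towards the target
```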
These issues may have some purpose in the context of attempting to increase the
overall number of individuals participating in basic skills provision and
achieving qualifications which match the Skills for Life approved qualifications.
However, it means that a considerable amount of energy and time goes into
attempting to check that individuals are not double counted. Moreover, it
appears to value one-off participation over progression and participation in a
range of basic skills, and, as further LSC data show, a considerable amount of
basic skills provision takes place which does not count towards the target.
A further limitation of the LSC data lies in evaluating progress and progression.
It is not possible to speak with any certainty about the distance travelled by a
learner, because there is no systematic assessment or recording of the level of a
learner’s achievement on entry to a learning programme. This should raise
alarm bells for the Skills for Life strategy, if it is supposed to be concerned with
improvement by learners. Improvement is defined in the Skills for Life strategy as
the achievement of a qualification outcome, yet there is no direct evidence that
achievement of a certificated outcome actually represents progress by an
individual learner from a previous level of skill to another, rather than
certification of where that learner already was.
Nor is there accurate information about subsequent progress following the
achievement of a stated learning aim. Only learners who remain with the same
provider can be easily tracked from one course or year to the next, and even then
this may only apply to further education provision which is funded by the LSC.
The same issue applies to progression more widely, that is, what happens to
learners in terms of development beyond basic skills: do they progress to further
qualifications? Do they find jobs if they are unemployed? Do they move on in
their job if they are in employment? The lack of such data has been noted by
Brooks et al (2001a) in the past. They found no representative data on learner
progression from general basic skills or ESOL provision to employment or
further training and education in their survey of the research in the field.
Although research is currently under way to investigate this (Ananiadou et al,
2004; Metcalf and Meadows, 2005), it has not as yet formed part of the data that
are being used to measure the impact of Skills for Life.
A further area that is not recorded by the LSC database is progress in a broader
sense of the benefits gained by learners as a result of developing their basic skills.
For example, one gain identified regularly in research studies is the positive
effect on self-image, confidence and self-esteem (see Brooks et al, 2001a). This
may be linked to a second gain, which is the decision to continue with education
and take further courses. In addition, family literacy programmes have been
found to enhance children’s learning as well as that of adults, by encouraging
and enabling adults to help children with language, literacy and numeracy
development and by encouraging them to become involved in their child’s
school (Brooks et al, 1996, 1997; Hannon and Bird, 2004).
It is quite possible for policy to be concerned with the wider benefits of adult
basic skills provision, and to measure success against criteria other than the
achievement of qualification outcomes, as the work of Tett and colleagues in
Scotland (Tett et al, 2005) shows. They have developed and applied a social
capital index to evaluate the impact of the Scottish adult literacy and numeracy
strategy, and this has formed a key part of the findings reported to the Scottish
Executive, with the goal of ‘Closing the Gap’ between the disadvantaged and
advantaged (Tett et al, 2005, p.9).
Finally, it is worth pointing out that there is much debate about what is assessed,
and the way that people are assessed, in order to determine their literacy and
numeracy skills. The tests carried out as part of the birth cohort studies started
in 1958 and 1970 in England (Ekinsmyth and Bynner, 1994) were intended to
assess what is referred to as ‘functional literacy and numeracy’ (Bynner and
Parsons, 2001, p.283) which Bynner and Parsons define as ‘an individual’s ability
to deal with everyday situations requiring the use of literacy and numeracy
skills.’ ‘Functional’ definitions are prevalent in the way that language, literacy
and numeracy skills are understood in the context of Skills for Life. Nevertheless,
the qualifications that count towards the Skills for Life target include basic skills,
key skills and GCSE qualifications, and these do not automatically represent
achievement of the same ‘skills’. It might be hoped, in any case, that at least
Level 2 qualifications within the English National Qualifications Framework
would, by any definition, be some way above ‘functional’.
Conclusion: The impact of Skills for Life on adult basic skills
The analysis of LSC data in this paper has shown that we can detect trends in
participation and achievement in Skills for Life provision that raise important
questions concerning current adult basic skills policy in England. The data show
a picture of overall increases in numbers participating and achieving in Skills for
Life provision in the four years leading up to the first milestone target, but at the
same time, a trend of overall diminishing returns. For practitioners, the figures
raise important questions, such as what happens between enrolment, completion
and achievement: why are attrition rates as high as they appear in the data?
policymakers, the trends suggest that they need to be aware that Skills for Life
targets are likely to become increasingly hard to achieve, and to require greater
investment in participation to secure a continued rise in achievement in the
future.
However, the paper has also drawn attention to the limitations of using the LSC
data to make claims about learners’ progress (which are specifically not the
‘fault’ of the LSC data). There is currently no accurate means of measuring
individual progress. We can comment on numbers for participation in provision
and achievement of target qualifications, but these numbers do not demonstrate
that individuals have improved their basic skills. This has implications for
monitoring progress towards the 2007 and 2010 targets, and suggests the need to
improve the way that individual learners’ progress and progression are
recorded.
At the same time, the pursuit of more detailed and ‘accurate’ data may have
more to do with a policy agenda focused on national economic competitiveness
than on improving learners’ lives and capabilities. In this respect, the paper has
suggested that the policy emphasis on targets in the form of qualification
outcomes contributes to a framing of the ‘problem’ of adult basic skills in
particular ways. It constructs the issue as one which is strongly focused towards
the economic competitiveness of England in comparison with other countries,
particularly OECD countries. In relation to this concern, we are encouraged to
believe that many adults cannot gain employment or function adequately in
employment with their current levels of literacy and numeracy skills. Yet a
considerable number of adults are in employment, albeit possibly low-skilled
employment, even though they have limited literacy and numeracy skills
(Bynner and Parsons, 1998; DfES, 2003b). The major policy focus on qualification
outcomes may in fact have more to do with establishing national benchmark data
than with enabling individuals to make progress.
The issues focused on in this paper, the analysis of statistical data on Skills for
Life and how we should interpret them, reflect a significant shift in the
positioning and purposes of adult basic skills provision in England. In the space
of a few years, such provision has moved from the margins to the mainstream,
and from beyond the gaze of politicians and education policy makers in
government departments to one of their central concerns. What this paper
demonstrates is how this shift has served to refocus attention towards
participation in centrally approved courses, leading to the achievement of
approved qualification outcomes, rather than towards more broadly-defined
goals of adult education. That this does not have to be so is demonstrated in the
somewhat different emphasis that there has been until now in similar work in
Scotland.
A strategy which is based on narrowly-defined functional definitions of literacy,
language and numeracy, and which centres around the pursuit of targets related
to qualification outcomes, concentrates policy-makers’ and practitioners’ energy
on counting and quantifying outcomes. Whether this best serves the long-term
improvement of adults’ capabilities in basic skills, and their participation in
society as citizens as well as workers, is a question to which we should
constantly return.
Acknowledgements
Thanks to Sammy Rashid (University of Sheffield) for his work on the statistical
data analysis and to Greg Brooks (University of Sheffield), for critical comments
and advice.
Thanks to the LSC data division for their help in supplying the data and
responding to queries concerning their datasets.
References
Ananiadou, K., Emslie-Henry, R., Evans, K and Wolf, A. (2004) Identifying
Effective Workplace Basic Skills Strategies for Enhancing Employee Productivity and
Development, London: NRDC.
Basic Skills Agency (1997) Staying the Course: the Relationship Between Basic Skills
Support, Drop Out, Retention and Achievement in Further Education Colleges,
London: Basic Skills Agency.
Blum, A., Goldstein, H. and Guérin-Pace, F. (2001) International Adult Literacy
Survey (IALS): an analysis of international comparisons of adult literacy,
Assessment in Education, 8, 2, pp.225-246.
Brooks, G., Gorman, T.P., Harman, J., Hutchison, D. and Wilkin, A. (1996) Family
Literacy Works: The NFER Evaluation of the Basic Skills Agency’s Family Literacy
Demonstration Programmes, London: Basic Skills Agency.
Brooks, G., Gorman, T.P., Harman, J., Hutchison, D., Kinder, K., Moor, H. and
Wilkin, A. (1997) Family Literacy Lasts: the NFER Follow-up Study of the Basic Skills
Agency’s Demonstration Programmes, London: Basic Skills Agency.
Brooks, G. et al (2001a) Assembling the Fragments: A Review of Research on Adult
Basic Skills. Research Report RR220, Norwich: Department for Education and
Employment.
Brooks, G., Davies, R., Duckett, L., Hutchison, D., Kendall, S. and Wilkin, A.
(2001b) Progress in Adult Literacy: Do Learners Learn? London: Basic Skills Agency.
Bynner, J. and Parsons, S. (1998) Use it or Lose it? The Impact of Time Out of Work on
Literacy and Numeracy Skills, London: Basic Skills Agency.
Bynner, J. and Parsons, S. (2005) New Light on Literacy and Numeracy, NRDC
Miscellaneous Reports, www.nrdc.org.uk.
Department for Education and Employment (1999) A Fresh Start. Improving
Literacy and Numeracy. The Report of the Working Group Chaired by Sir Claus Moser.
http://www.lifelonglearning.co.uk/mosergroup/ Accessed 21.12.04.
Department for Education and Employment (2001) Skills for Life. The National
Strategy for Improving Adult Literacy and Numeracy Skills, Nottingham: DfEE
Publications.
Department for Education and Skills (2002) 14-19: Extending Opportunities, Raising
Standards. Summary, Norwich: The Stationery Office.
Department for Education and Skills (2003a) Skills for Life. The national strategy for
improving adult literacy and numeracy skills. Focus on delivery to 2007, Nottingham:
DfES Publications.
Department for Education and Skills (2003b) The Skills for Life Survey. A national
needs and impact survey of literacy, numeracy and ICT skills, Norwich: HMSO.
Department for Education and Skills (2003c) 21st Century Skills. Realising Our
Potential. Individuals, Employers, Nation, Cm 5810, London: Stationery Office.
Department for Education and Skills (2005) Skills: Getting on in Business, Getting
on at Work, White Paper Cm 6483-I, Norwich: HMSO.
Ekinsmyth, C. and Bynner, J. (1994) The Basic Skills of Young Adults, London:
Adult Literacy and Basic Skills Unit.
Further Education Funding Council (1998) Numeracy, Literacy and ESOL:
Evaluation of Entry and Level 1 Awards. National Report from the Inspectorate,
Coventry: FEFC.
Further Education Funding Council (1999) Basic Education: Curriculum Area
Survey Report, Coventry: FEFC.
Gorman, T.P. (1981) A Survey of Attainment and Progress of Learners in Adult
Literacy Schemes, Educational Research, 23, 3, pp.190-198.
Hamilton, M. (1996) A History of Adult Basic Education, in Fieldhouse, R. (ed.) A
History of Modern Adult Education, Leicester: National Institute of Adult
Continuing Education, pp.142-165.
Hamilton, M and Barton, D. (2000) The International Adult Literacy Survey:
What does it really measure?, International Review of Education, 46, 5, pp.377-389.
Hamilton, M. and Hillier, Y. (forthcoming) Adult Literacy, Language and Numeracy:
The Changing Face of Policy and Practice, London: Trentham Books.
Hamilton, M. and Merrifield, J. (2000) Adult Learning and Literacy in the United
Kingdom IN Comings, J., Garner, B. and Smith, C. (eds) Annual Review of Adult
Learning and Literacy, Volume 1. A Project of the National Center for the Study of
Adult Learning and Literacy, Cambridge, MA: Jossey-Bass.
Hannon, P. and Bird, V. (2004) Family literacy in England: theory, practice, research
and policy, In Wasik, B.H. (Ed.) Handbook of Family Literacy. Mahwah, NJ:
Lawrence Erlbaum Associates.
House of Commons Committee of Public Accounts (2006) Skills for Life: Improving
Adult Literacy and Numeracy. Twenty-first Report of Session 2005-06, London:
The Stationery Office.
Jessup, G. (1991) NVQs and the Emerging Model of Education and Training, London:
Falmer.
Keep, E. and Mayhew, K. (1999) The Assessment: Knowledge, Skills and
Competitiveness, Oxford Review of Economic Policy, 15, 1, pp.1-15.
McGivney, V. (1999) Returning Women: their Training and Employment Choices and
Needs, Leicester: NIACE.
Metcalf, H. and Meadows, P. (2005) Does literacy and numeracy training for
adults increase employment and employability? Evidence from the Skills for Life
programme in England. Paper presented to the Second International Conference
on Training, Employability and Employment, Monash University Centre, Prato,
Italy, 23 September 2005. www.niesr.ac.uk. Accessed 2.2.06.
National Audit Office (2004) Skills for Life: Improving Adult Literacy and Numeracy.
Report by the Comptroller and Auditor General, London: The Stationery Office.
OECD (1997) Literacy Skills for the Knowledge Society, Paris: OECD.
Parsons, S. and Bynner, J. (2005) Measuring basic skills for longitudinal study, NRDC
Report, October, www.nrdc.org.uk.
Rodgers, B. (1986) Change in the reading attainment of adults: a longitudinal
study, British Journal of Developmental Psychology, 4, 1, pp.1-17.
Searle (2004) Policy Silos and Red Ochre Men: An examination of a decade of
adult literacy policy and program development in Australia, Journal of Vocational
Education and Training, 56, 1, pp.81-96.
Smithers, R. (2006) 12m workers have reading age of children. Poor results from
£6bn skills scheme, The Guardian, Tuesday 24 January, p.1.
Sticht, T. (2004) Critique of the Skills for Life Survey, Basic Skills Agency,
http://www.basic-skills.co.uk/ accessed 15 July 2006.
Tett, L., Hall, S., Maclachlan, K., Thorpe, G., Edwards, V. and Garside, L. (2005)
Evaluation of the Scottish Adult Literacy and Numeracy (ALN) Strategy, Edinburgh:
Scottish Executive Social Research.
APPENDIX 1: APPROVED SKILLS FOR LIFE QUALIFICATIONS AND THEIR
EQUIVALENCE
Whilst the list of recognised and approved qualifications is long, in summary it includes:
1 national literacy and numeracy qualifications accredited by the Qualifications
and Curriculum Authority (QCA)
2 national ESOL qualifications accredited by QCA (currently includes
qualifications being submitted for accreditation)
3 key skills test in communication or application of number at level 1 or level 2
4 national tests for adult literacy and numeracy at levels 1 and 2. These are
identical to the key skills tests on offer, and learners may then go on to build a
portfolio of evidence to achieve a key skills qualification.
5 full key skills qualification in communication or application of number at level 1
or level 2 (a test and a portfolio of evidence)
6 GCSE Maths or English at grade D-G (level 1) or C and above (level 2)
Table 4 shows the levels of equivalence of these qualifications, highlighting where these
count towards the Skills for Life targets.
Table 4: levels of equivalence of different qualifications relevant to Skills for Life targets
NQF Level 5: no equivalent qualifications listed
NQF Level 4: key skills level 4
NQF Level 3: key skills level 3; A levels
NQF Level 2: key skills level 2; adult literacy, numeracy and ESOL standards Level 2; GCSE grades A*-C
NQF Level 1: key skills level 1; adult literacy, numeracy and ESOL standards Level 1; GCSE grades D-G; National Curriculum levels 4 to 5 (11-15 years)
NQF Entry Level: adult literacy, numeracy and ESOL standards Entry 3 (National Curriculum level 3, 9-11 years), Entry 2 (National Curriculum level 2, 7-9 years) and Entry 1 (National Curriculum level 1, 5-7 years)
Below Entry Level: pre-entry
Note: in the original table, shaded boxes show which qualification levels count towards the Skills for Life targets; as stated in the main text, these are Entry 3, Level 1 and Level 2.
1 National Research and Development Centre for adult literacy and numeracy (NRDC) Study of
the Impact of the Skills for Life Learning Infrastructure on Learners, Project no. PG5.4, 2004-2007.
2 The views expressed in this paper are those of the author and not necessarily those of the
NRDC.
3 ALRA was an agency of The National Institute of Adult Continuing Education (NIACE). ALRA
eventually became the Adult Literacy Unit (ALU) and then the Adult Literacy and Basic Skills
Unit (ALBSU) and is now known as the Basic Skills Agency (BSA).
4 See the Qualifications and Curriculum Authority (QCA) website for up-to-date information on
the framework at http://www.qca.org.uk/.
5 See Appendix 1 for qualifications which count towards the Skills for Life target, and for a table of
equivalences across qualifications. The English Qualifications and Curriculum Authority
(www.qca.org.uk) provides details of the full national qualifications framework.
6 Figure taken from LSC Headline Stats Spreadsheet, overall summary sheet (estimate as at April
2005).
7 Personal interview with representative of DfES statistical branch, 1 July 2004.