National Assessment and Overseas Experience

  • Brian Donnelly
Associate Minister of Education (Early Childhood Education and Maori Education)

You are all obviously aware of the recent trip I made to Toronto and England to examine the impact of large-scale assessment upon those two education systems. So I would like to open today by giving a very brief summary of my findings.

Ontario has only very recently embarked on large-scale assessment. Ontario education not only lacks a tradition of large-scale assessment of any sort, such as we have in our senior secondary schools, but it also has no tradition of a provincial curriculum. In Canada, education is almost the exclusive domain of the provinces. In Ontario there had been a further devolution of responsibilities to local education boards.

In the early 1990s a royal commission on education reported and made a number of recommendations. Amongst those recommendations were proposals for cohort testing. As a result, an independent agency, the Education Quality and Accountability Office (EQAO), was established to develop and administer these tests.

During our visit we spent time with EQAO Board members and officials, with parents, with the Minister and the Deputy Minister of Education (the equivalent of Howard Fancy), with teachers, with officials from the Ministry, and with Dr Michael Fullan, a leading academic in the field of educational management: in short, the broadest spectrum of stakeholders. We also visited a marking site where 350 or so teachers were going through the process of marking the grade 3 tests.

For those of you who have been led to believe that there is widespread resistance to the testing, consider the following. Over 2000 teachers are going to be required for marking the literacy and numeracy tests. Applications for marking positions were four times greater than the number required. I would have thought that if there were large-scale resistance from the profession, then there would have been a struggle to achieve the required numbers, especially as the marking process demanded an eleven-day commitment during the first two weeks of the summer break. The people I saw involved in the process were one hundred percent focused and absorbed by the task. There was one finding which was indisputable: all parties agreed that the marking process was the best professional development that had ever taken place in the province.

I specifically asked the teachers whether the students found the tests tedious. The idea was rejected out of hand. What a number of teachers told us was that the tests put students under stress. We investigated this idea further, and it would seem that some teachers found the process stressful for themselves and that this stress was transmitted to the students. Nowhere did we find any evidence of children being "drugged".

However, a group of francophone teachers told us that their children had absolutely loved the experience and that they were extremely excited by what was happening.

The assessment was carried out through an integrated package that teachers worked through with their children over a period of 15 hours stretched across one week. There were very high levels of quality assurance built into the process, and the tests involved quite a degree of higher-level skills. So these were not simple pencil and paper, short-answer or multichoice tests. We were told of one child who complained because the tests made her brain hurt.

Ontario was undergoing a range of reforms, including the establishment of a provincial, outcomes-based curriculum, the reduction of the number of education boards, new ways of funding schools and the removal of principals from teachers' collective contracts. As we delved deeper into what was happening, it became increasingly obvious that the process was one of parents wresting back some control of their children's education from the teachers. It became clear that the political agenda was being pushed by a very real sense of alienation and powerlessness felt by parents.

At the same time, educationists were exploiting this energy for slightly different agendas. The first was the development of an assessment culture within professional practice, one which would utilise achievement-level data to inform programme construction. In this regard I believe that New Zealand has a far more developed assessment culture than Ontario.

The second agenda was far more subtle, and one which I believe we do need to heed. They continued to talk about the establishment of "quality conversations" between parents and teachers. It was a strongly held belief that the two groups were talking past each other and were not using a common language or shared set of understandings. If the returns to the Green Paper are as I suspect they will be, then we will have some pretty strong evidence that this is also the case in New Zealand. And if that is so, then we had better start doing something about it very soon.

If we don't, then I suspect we shall end up with a situation similar to what we found in Britain, where the politicians are dictating to schools not only what they must teach but, to a quite specific degree, how they must teach it. For example, children under the age of 8 are not allowed to use calculators, there must be an hour of literacy instruction per day (they have provided literacy lunchboxes which determine how teachers must use the literacy hour) and an hour of numeracy teaching per day.

League tables are not only espoused, they are considered to be the driving force for improvement. The new government, which I remind you is a Labour government, even embarked on a policy of naming and shaming the worst schools. We visited one of those named and shamed schools. Admittedly it was one which had pulled itself out of the "failing" category, and admittedly it had been provided with support a la our Supporting Schools project. Nevertheless, we were told that the children had been bullied by children from other schools and had been too ashamed to tell people what school they went to.

The English have had key stage tests since 1994. There are three of these during the primary years. They tend to be narrow pencil-and-paper tests in reading, maths and science, and from the results league tables are published. There is no doubt in my mind that the way in which they have tested and published has had serious educational downsides, and it surprised me that some of those driving the system had little concern for these downsides, even though they admitted their possibility. I am prepared to go into more detail on what I found in England during question time.

Which brings us back to New Zealand. Those who have read the Green Paper will recognise that it is a package. Only one of the components is the testing proposal and you will know that the implementation of this part of the package has been postponed to enable more time for development. However, I need to say that the postponement does not mean it has been removed from the package.

What is important is to outline the thinking behind the assessment package. It is certainly not something out of the blue. It builds on proposals from the 1989 Assessment for Better Learning report, the conclusions of which were incorporated in the National Curriculum Framework. To me, measurement of learning achievement is as intrinsic to teaching as physical examinations are to a GP.

I believe that teachers have put a great deal of effort into developing tools which will allow measurement of their students against the learning outcomes in the new curriculum statements. However, there are few, if any, measurement tools which will provide comparative data against national norms. People will quickly quote PATs. Let us be quite clear about PATs. They do not measure against the learning outcomes of the new national curricula. They are simply anachronistic. The very assumptions upon which they are based, i.e. that learning is normally distributed, are not the assumptions underpinning the National Curriculum Framework.

So what else is there? Certainly not NEMP. I have been bewildered by statements to the effect that NEMP provides the information against which teachers and school administrators are able to judge the quality of their programmes. That would be like a doctor saying to a patient, "I don't need to examine you. I've read the national study." Because that is what NEMP is: a project which provides information on how the system is performing. It gives no information about individuals or schools.

On one point I do agree with some of the more vocal critics of the testing proposal. I believe that it is teacher assessment which should be predominant. However, how does a teacher validate his or her own assessments? How does a parent validate them? How does a school administration validate them?

The Green Paper is suggesting some validation mechanisms. It is proposing to enhance the Assessment Resource Banks so that items from the banks can be used by teachers to develop their own idiosyncratic measurements. The items would be in concordance with the learning outcomes of the curriculum statements and would be normed within broad bands.

Secondly, there are a number of areas in the curriculum which are difficult, if not impossible, to evaluate through traditional testing means. It is proposed that exemplars be developed to provide some benchmarking for teachers to use in their assessments of their students. I am thinking of such areas of the curriculum as oral language and art.

There are a number of measures which have already been developed and which meet the tests of reliability and validity, but of which schools are unaware. It is proposed that the existence of such measures be communicated clearly to schools so that they might consider them within their own assessment policy and practice. I refer to such tools as the Otago Fitness Test.

The package also proposes the development of diagnostic tools which may signal where children are having particular difficulty and where prioritised emphasis might be placed in any future programme. You will be aware of Marie Clay's six-year diagnostic survey of reading skills. However, what do we have for other areas of the curriculum, such as mathematics?

Finally, the national tests. If teachers are universally opposed and parents largely support this part of the proposal, then I would suggest that we have a serious "talking past each other" problem. The notion of conversation which people like Michael Fullan and Veronica Lacey were talking about is not one-way communication. It involves both listening and speaking (or writing).

The national testing proposal can be looked on as the validation process: it ensures that what is being communicated about the progress of a child or the efficacy of a programme is checked against externally-referenced standards.

Almost all of the downsides that I mentioned earlier, which are recognised in the English system, are created by the publication of league tables and the manner in which this is done. It is for this reason that the government has said it won't allow this to happen. Those who believe this is not possible need look no further than across the Tasman to Victoria and New South Wales, both of which have had cohort testing for some time but have not had any publication of league tables. Therefore, in your thinking on the Green Paper, start from the viewpoint that there will be no league tables.

Can I say to you that we already have league tables in this country. They have been created as a result of a measure that we put in place to provide a degree of social equity: our socio-economic decile rankings. Researchers Professor Helen Ladd and Ted Fiske, who have recently carried out work on the effects of Tomorrow's Schools, claim that parents are ranking schools in accordance with their socio-economic rating. Regardless of achievement results, a decile 1 school is going to be perceived as worse than a decile 5 school.

We all know that there are some very high performing decile 1 schools and some cruising decile 10 schools.

National assessment will provide proof of how they are performing.

As I mentioned earlier, upon my return from overseas the government agreed to extend the timetable for the introduction of national testing by up to a year. It became obvious to me that the timetable we had set in the Green Paper was simply too tight to allow the many complex issues and decisions necessary to be considered without repeating the mistakes made in the initial planning in Ontario and England.

Among the questions we have to consider are who should be responsible for the tests, the development of the tests themselves, quality assurance processes, and processing logistics.

There are significant advantages in continuing the dialogue with all education stakeholders. In particular we want to talk further with teachers about the advantages of the assessment package and how best to ensure that the proposals for national testing can be used to add value to the education system.

Any constructive assessment system must highlight the predominant role of the teacher in assessing pupils, using tests devised both within the school and by an external agency.

I found in my investigations overseas that some teachers were starting to rely on nationally or provincially-set tests for assessing their own pupils. This to me seems back-to-front. Surely the teacher should use externally-referenced tests to back up his or her own judgement about student achievement.

Credibility with stakeholders is essential if the results of national testing are to be used constructively in the classroom. It therefore seems sound to take a little longer and make sure we have a faultless design.

Meanwhile we will press ahead with the development of externally-referenced assessment material as an additional tool for establishing how our pupils are performing. This will include assessment resource banks and exemplar material.

We still believe that nation-wide testing of all students at a particular level is an important part of a total assessment package.

However we must make sure that we allow ourselves the time to get it right.

Throughout the development of "Assessment for Success" I have urged people to keep in mind certain principles:

- the interest of students is to be paramount. There's nothing radical about that. As I mentioned earlier, it was a key principle in the 1989 Assessment for Better Learning document.

- we will value only that information which has the potential to bring about constructive change and improvement.

- we value the dedication and expertise of our educators and will work to involve them in all our activities.

We now have two weeks to the deadline for submissions on the Green Paper. I continue to urge everyone with the best interests of our young people's education at heart to think carefully about what's been suggested and to let us have the benefit of their deliberations, if they haven't already done so.

In the development of our thinking we will be taking on board the experience of other educational systems. To me the issue is about improved learning.

None of us can hold our heads too high after the results of the TIMSS study. In particular, the results of Maori and Pacific Island students in that study must be a cause for common concern.

My goal in this process is to see the development of an assessment matrix: a set of tools which will provide teachers and principals with information about their students' learning and inform them of the strengths and weaknesses in their programmes.

As I have said previously, measuring a piece of string does not make it longer. The challenge to all of you as principals is to analyse what you are presently measuring, recording and communicating to parents, to ask whether this measurement is against the objectives in the current curriculum statements and, most of all, to ask: are we using this information for any constructive purpose?

Then you need to ask yourselves: what tools are missing? What assessment materials would I like to see developed which could be used to provide better information about the learning of the students in my school? What better information could be provided for parents?

Once you have gone through that process, return to the Green Paper and evaluate what it is saying and then respond.

Finally I wish to stress that this Green Paper is very green.