Newsletter – Autumn 2013

Editorial

Jo-Anne Baird

The Association has been established for some 13 years and we can safely say that it has been launched.  During that time, not only have we met annually and enjoyed the intellectual stimulation of our conferences, we have networked and met new colleagues and friends.  This has resulted in joint projects across Europe and beyond, a professional accreditation programme, journal publications, a successful Newsletter and training workshops.  The next steps for the Association are to consolidate these achievements and to establish the Association’s structures firmly, so that we can support assessment progress across Europe to a greater extent than has been possible so far.  Full details of the following developments will be shared at the Business Meeting in Paris; the outline plans follow.

Secretariat

For a number of years, the strategic plan has noted the need for a member of staff to administer the Association’s affairs.  We are now in a position to take this forward.  This will be an excellent opportunity for someone to work across Europe, with senior assessment people in influential organisations.

Association Committees

At the May meeting of the Council, some strategic changes to the structure of the Association’s committees were agreed to further promote our effectiveness and efficiency.  Some of our committees had been in place for several years and it was time to reconsider their functions.  We now plan to have the following standing committees:

Professional Development Committee – chaired by Antonella Poce
Publications Committee  – chaired by Gabriella Agrusti

We also have ad hoc committees associated with each conference, as follows:

Conference organising committee – chaired by Sandra Johnson for Paris
Scientific programme committee – chaired by Henk Moelands for Paris

A committee on Standards will be required in a few years’ time, to update the framework that was published last year.  More information on the roles of these committees and on forthcoming vacancies for contribution to their work will be given at the Business Meeting.

Forthcoming Council Vacancies

Several terms of office on the Council come to an end in 2014.  A vacancy will arise when Gabriella Agrusti’s term ends.

Additionally, we will need to elect a Vice President of the Association.  It is planned that my term as President will end in 2014 and Guri Nortvedt will become President in November 2014.  We had an excellent field of candidates for the election this year and I am giving advance warning of the elections to encourage you to stand for election or to put forward suitable candidates.

Arrangements will also need to be made for the posts of Executive Secretary and Treasurer, as those terms will also need to be renewed in 2014.

I hope you enjoy the contents of this Newsletter, which is informative, as always.  I look forward to seeing you in the beautiful Sorbonne in November.

Jo-Anne Baird

Pearson Professor of Educational Assessment & Director of the Oxford University Centre for Educational Assessment (jo-anne.baird@education.ox.ac.uk)

2013 Annual Conference in Paris

It won’t be April in Paris.  But it will be Paris – a vibrant and exciting city at any time of year, even in early November! The venue for our 14th annual conference is the Sorbonne, one of the earliest foundations of the then University of Paris, itself the third oldest university in Europe after Bologna and Oxford. More specifically, the conference will be held in the elegant Palais Académique, which, located in the famous Latin Quarter on the Left Bank of the Seine, has housed the Rectorat of the Universities of Paris since the 19th century. The conference theme is International surveys, policy borrowing and national assessment, one of the most topical issues in the world today. What could be better?

Large-scale cross-border attainment survey programmes face formidable technical and logistic challenges, and their comparative results are subject to the most in-depth scrutiny from national stakeholders around the world.  When policy initiatives are inspired by cross-border comparisons, and national assessment programmes modelled on the international surveys are newly spawned as policy evaluation tools, questions can readily be raised about the validity, and wisdom, of uncritical transfer. The wide range of presentations, discussions and workshops will address a broad variety of related issues, and will not fail to stimulate reflection as well as simply to inform.

We are fortunate to have secured the participation of an impressive group of keynote speakers for this conference: Andreas Schleicher (OECD), a passionate speaker on the value and power of PISA findings for guiding policy change around the world; Nathalie Mons (University of Cergy-Pontoise), a leading figure in policy debates at governmental level in France, and a supporter as well as critic of the ‘PISA policy influence’; Pierre Vrignaud (University of Paris 10), who has studied international survey programmes from a technical standpoint; Paul Newton (Institute of Education, University of London), well-known for his prolific writings on assessment purposes and validity; and Fabienne van der Kleij (Cito, the Netherlands), winner of the Kathleen Tattersall New Assessment Researcher Award, describing her award-winning research.

The conference will be larger than any of its predecessors in terms of the number of paper presentations, discussion groups and pre-conference workshops on offer. Whether this is attributable to the attractive location, the politically charged conference theme, or both, the event promises to be a professionally stimulating experience for all who participate.

But it won’t all be work! Apart from good food and wine, the welcome reception will this year offer the opportunity to admire the magnificent elegance of Europe’s oldest science museum in the Conservatoire National des Arts et Métiers (CNAM), whilst chatting with old friends and new before the serious business begins the following day. As a ‘thank you’ to our Fellows for their invaluable support in reviewing the many conference proposals submitted this year, the Fellows Event features a unique pre-cocktail cultural tour of the most historic parts of the Sorbonne, including the iconic 17th century chapel.
The gala dinner, always a conference highlight, takes place in the majestic banqueting hall of the Cercle National des Armées (CNA), with entertainment provided by one of the most accomplished jazz manouche groups in the city. The CNA is located a few steps away from the famous Madeleine church, still in the heart of Paris but, like the CNAM, on the other side of the Seine. An evening not to be missed!

If you can spend time in Paris after the conference, and think you still have energy left to explore what else the city has to offer, then visit the Paris Tourist Office website and plan your post-conference escapade!

Sandra Johnson

Chair of the Paris Conference Organising Committee and member of the Scientific Programme Committee

Postscript: We owe an enormous debt of gratitude to our co-host, DEPP, the division of the Ministry of Education responsible for national and international assessment in France, whose local organisational support has been invaluable.

A Literary Taste of Paris

Sandra Johnson has given everyone a wonderful taste of both Paris and the delights of the conference but you may wish to get into the spirit of Paris in advance through a little selected reading. The list here is just a sample (and by no means a representative one!) but there might be something here that will interest you.

(With thanks and acknowledgments to Malcolm Burgess (The Guardian 19.5.2011) and others.)

Victor Hugo, Notre Dame de Paris (The Hunchback of Notre Dame), 1831

As a visitor it’s almost impossible not to see the splendid Notre-Dame Cathedral through the eyes of Victor Hugo and his creation Quasimodo.

“When, after groping your way lengthily up the gloomy spiral staircase, which rises vertically up through the thick wall of the bell towers, you abruptly emerged at last on to one of the two lofty platforms, flooded with air and daylight, a beautiful panorama unfolded itself …”

T E Carhart, The Piano Shop on the Left Bank, 2000

The sights, the smells, the atmosphere of a very special part of the city, evoked by T E Carhart, an American living in Paris, who discovers a piano repair shop on the Left Bank.

“Summer set in early and the sidewalks in the quartier came alive after hours. In a city where few apartments are air-conditioned, the terraces of cafés become the common refuge from a withering heat in the evening.”

Jeremy Mercer, Books, Baguettes and Bedbugs, 2005

The world around the Left Bank’s famous bookshop Shakespeare and Company – haunt of literary giants from Hemingway and Joyce to Ginsberg and Burroughs and still going strong.

“Shakespeare and Company sits on the very left edge of the Left Bank. The store is close enough to the Seine that when one is standing in the front doorway, a well-thrown apple core will easily reach river water.”

Muriel Barbery, The Elegance of the Hedgehog, 2008 (or, if you’re brave, try it in French: L’Élégance du hérisson)

Life behind the façades of a grand Parisian apartment building in the very respectable 7th arrondissement, an insight into the secrets of the building’s concierge and its residents.

“My name is Renée. I am fifty-four years old. For twenty-seven years I have been the concierge at number 7, rue de Grenelle, a fine hôtel particulier with a courtyard and private gardens, divided into eight luxury apartments …”

Claude Izner, Murder on the Eiffel Tower, 2007

The brand-new Eiffel Tower – the glory of the 1889 Universal Exhibition – is at the centre of this dazzling murder mystery set in late 19th-century Paris.

“Pointing straight up into the sky on the other side of the Seine, Gustave Eiffel’s bronze-coloured tower was reminiscent of a giant streetlamp topped with gold. Panic-stricken, Eugénie searched for a pretext to get out of climbing it.”

Hilary Mantel, A Place of Greater Safety, 2006

It is 1789, and three young provincials have come to Paris to make their way. Georges-Jacques Danton, an ambitious young lawyer, is energetic, pragmatic, debt-ridden–and hugely but erotically ugly. Maximilien Robespierre, also a lawyer, is slight, diligent, and terrified of violence. His dearest friend, Camille Desmoulins, is a conspirator and pamphleteer of genius. A charming gadfly, erratic and untrustworthy, bisexual and beautiful, Camille is obsessed by one woman and engaged to marry another, her daughter. In the swells of revolution, they each taste the addictive delights of power, and the price that must be paid for it.

Plus Proust, Simenon (for Maigret), Dickens’s A Tale of Two Cities (for a contrast with A Place of Greater Safety) and many, many more!

Work in Progress

Our Colleagues at DEPP

The following two articles introduce some recent and ongoing work of our colleagues at DEPP, who are co-hosting this year’s AEA-Europe conference.

   
Sandra Andreu
Project Manager
Pascal Bessonneau
Psychometrician
Marion Le Cam
Statistician
DEPP
(the Assessment, Forecasting and Performance Directorate of the French Ministry of Education)

Large-scale assessments, especially international assessments, will be the topic of the next annual meeting of AEA-Europe in November 2013. These assessments are a large part of our work, and the conference will be an opportunity to discuss them and the problems associated with them.

In addition, we would like to highlight a part of our work in the DEPP, the division of the French Ministry of Education responsible for producing these surveys. This aspect of our work differs from the usual production of national and international indicators in that it emphasises ways in which our experience of large-scale assessments can be applied effectively to learning practices.

Computer-based assessments are now widely used and will be mandatory for future PISA cycles. Going beyond measuring the correlation between paper-based and computer-based tests, we want to estimate the impact of a shift from paper to computer in terms of difficulty and dimensionality. This need stems partly from the literature, partly from our own experience, and partly from the requirement to assure the comparability of our annual indicators. We therefore constructed an original experimental design to test the comparability of paper-based and computer-based tests; the results of this study will be presented in November in Paris.
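By way of illustration only: a first step in a mode-effect analysis of this kind is often to compare classical item difficulties (proportion correct) between the two delivery conditions. The minimal Python sketch below does this for simulated data; the column names and figures are invented and it does not represent the DEPP design or analysis.

import numpy as np
import pandas as pd

# Hypothetical scored responses: one row per pupil, one column per item
# (1 = correct, 0 = incorrect), plus the delivery mode of the test form.
rng = np.random.default_rng(0)
n_pupils, n_items = 400, 20
data = pd.DataFrame(rng.integers(0, 2, size=(n_pupils, n_items)),
                    columns=[f"item_{i + 1}" for i in range(n_items)])
data["mode"] = rng.choice(["paper", "computer"], size=n_pupils)

# Classical item difficulty (proportion correct) in each mode, and the shift.
difficulty = data.groupby("mode").mean(numeric_only=True).T
difficulty["shift"] = difficulty["computer"] - difficulty["paper"]
print(difficulty.sort_values("shift"))

A dimensionality check would go further (for example, factor-analysing each mode separately), but even this simple comparison flags the items whose difficulty moves most when the medium changes.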

As another example, the DEPP is an active participant in a set of experiments aimed at reducing academic failure. The experiment “Lecture”, which has been implemented since 2011 in a number of volunteer schools, is one of them and is presented in detail below.

Experiment “Lecture”: What is the impact of early intervention on stimulating emergent literacy skills?

A large-scale assessment

The context

Since 2011, the DEPP has collaborated with researchers in the field of education science to test the impact of teaching phonology in pre-primary schools on literacy during the first years of primary school. The teaching strategy and research design were developed by researchers from the University of Lyon, who have worked with the DEPP to test the quality of assessment tools and to manage large longitudinal surveys.

Educational framework of the experiment “Lecture”

The framework recommends early interventions in the teaching of reading, adapting a report of the National Reading Panel (NRP, 2000) to the French context. The intervention protocol comprised a guide for teachers of pupils in the final grade of pre-primary school and in the 1st and 2nd grades. This guide – a series of lessons and lesson plans – provided for the structured teaching of skills which have been identified as necessary for learning to read: phonological skills, oral comprehension and letter knowledge. These sequences were explicit, systematic, gradual and intensive. Teachers were supported and monitored by advisers throughout the intervention.

Method

During the three years of the intervention (final grade of pre-primary school, 1st and 2nd grades), pupils were assessed at the beginning and at the end of each year. “Control group” schools were selected to have the same features as schools in the experimental group (geographical location, priority education status, school size), and children in this control group followed ‘normal’ academic teaching.

Priority education in this context means those areas which are allocated additional resources because of social deprivation.

In 2011-2012, 3,569 pupils in the final grade of 80 pre-primary schools participated in the first two rounds of measurement; of these, 2,067 were in the experimental group.

In 2012-2013, 30 primary schools were still engaged in the programme, with a total of 6,500 pupils involved; of these, 1,800 were in the experimental group.

The intervention is continuing in 2013-2014, in particular for pupils in second grade who were in the experimental group during the first two years.                                 

What is happening?

First results of the experiment

So far, the impact of the experiment has been evaluated by comparing the scores of the two groups and how they evolved between the first two measurement points. After an analysis of the global effect, further analyses will investigate the differential effects of the intervention.

During the first year of the intervention, pupils in the experimental group progressed more on the three dimensions which were the primary focus of the research: “letter knowledge”, “phonological skills” and “decoding”. For phonology and oral comprehension, the effect of the intervention was greater for pupils with the lowest levels.
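As a purely illustrative aside (not DEPP’s actual analysis, which will need to allow, for example, for the clustering of pupils within schools), a group comparison of this kind amounts to contrasting pupils’ progression between the two measurement points across the experimental and control groups. A minimal sketch with invented numbers:

import pandas as pd
from scipy import stats

# Hypothetical pupil-level data: pre- and post-test scores on one dimension
# (e.g. phonological skills) plus group membership; all values are invented.
df = pd.DataFrame({
    "group": ["experimental"] * 3 + ["control"] * 3,
    "pre":  [10, 12, 9, 11, 10, 12],
    "post": [18, 19, 15, 14, 13, 16],
})

# Progression between the two measurement points, compared across groups.
df["gain"] = df["post"] - df["pre"]
gain_exp = df.loc[df["group"] == "experimental", "gain"]
gain_ctl = df.loc[df["group"] == "control", "gain"]
t, p = stats.ttest_ind(gain_exp, gain_ctl, equal_var=False)

print(f"mean gain (experimental): {gain_exp.mean():.2f}")
print(f"mean gain (control):      {gain_ctl.mean():.2f}")
print(f"Welch t = {t:.2f}, p = {p:.3f}")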

Continuation of the experiment in the first year of primary school (2012-2013)

The continuation of the evaluation should make it possible to identify any transfer of the effect observed in the final grade of pre-primary school (GS) to the first grade (CP), particularly in reading performance at the end of the year.

Results from the third assessment confirm the benefit observed at the end of GS. Data from the third and fourth assessments are still under analysis, and full results are expected by the end of September.

Although the English-language scientific literature abounds with research on the positive effects of early intervention in phonology, no large-scale study had previously been carried out on the subject in the French context. This study is the first, and should provide useful information for research; positive results, especially long-lasting ones, could lead to changes in teaching policy for the last grade of pre-primary school.

Why inform members?

We are pleased to share and discuss this research in particular, and also to discuss the key role of national agencies and ministries in supporting researchers. It would also be very stimulating to discuss with researchers from across Europe their experience of these kinds of structures.

AEA-Europe gives us the opportunity to follow what is going on in Europe. It is especially important to share these kinds of interventions because, although political and social contexts vary from country to country, in each one they are influenced both by international research and by research from other countries.

More information

For further information please contact:

sandra.andreu@education.gouv.fr
marion.lecam@education.gouv.fr
pascal.bessonneau@education.gouv.fr


Comparison of student performance in paper and computer-based tests

Anastassia Voronina
Senior Specialist
Estonian Foundation Innove Examination Centre

The context

Test and item writing in different electronic learning environments is not just “trendy”; it is also an important tool in measuring student achievement.

In the 20th century, paper was the main medium of information exchange. People learned things mostly by reading books and other print media. Similarly, schoolchildren used printed textbooks and were assessed with paper-and-pencil tests.

In the modern-day world of Wikipedia, Google, Facebook and social networks, people learn things differently than they did, for example, twenty years ago. In the learning process the role of computers is increasing, and fewer and fewer paper materials are used.  As a result, many questions arise: for example, how reasonable is it to give students a paper-and-pencil test if they have acquired their knowledge through a computer? Are the results of such paper-based tests reliable, given that the knowledge was acquired from a source other than paper?

What is happening

We asked each other these sorts of questions a few years ago when we started a discussion about the need to write electronic tests and develop an e-bank of item pools. We were mostly interested in whether today’s student solves the same task presented on a computer screen in a different way than if it were presented on paper. The system that we use in e-item and e-test writing provides us with quite detailed results in assessing both computer and paper based tests and it also allows their comparison.

Computer-based testing was not a completely new experience for schools in Estonia: many schools had taken part in the computer-based assessment option in the field trial and main study of PISA 2012, and this research built on that experience. The pilot sample comprised 30 schools with 860 students from grade 10. Half of them took the e-test using the electronic information system (EIS); the other half solved the same items on paper. Random selection determined whether a student took the paper-based or the computer-based test.

For our pilot, a comparable Chemistry test for year 9 of comprehensive school was compiled. In contrast to the traditional way of testing, both types of test were administered – on paper as well as on the computer.

The main objective of this testing was to find out whether the paper or computer medium played a role in student performance when students solved the same problems. To reach this objective, two tests were constructed – a computer test and a paper test using exactly the same items. They were identical in content (questions and multiple-choice answers) and similar in terms of pictures, page layout, etc. Both groups of students were given two hours to take the test.  The only difference was that half of the students solved the problems on paper while the other half did so on the computer. Marking took place immediately after the test. Data entry of the paper-based test results was done manually and took considerably more time.

Why inform members

Preliminary analyses of the data showed some very interesting tendencies in favour of the computer based test.

Some of the main advantages of the computer test:

  • there were no difficulties in understanding students’ handwriting
  • an absence of “typo”-type mistakes that, in paper-based tests, result from student carelessness (the responses had technical limitations on their form)
  • no accidental mistakes caused by human marking (the computer marks automatically)
  • during the computer test the student cannot be distracted by other activities, such as drawing in the margins of the test or writing poems in it, so there is a greater probability that the student concentrates on the test

Another interesting observation that needs further analysis is that there was quite a big difference in the number of missed responses between the two types of test, with considerably fewer missed responses in the computer-based test than in the paper-and-pencil one. It is interesting to note that for multiple-choice questions (MCQs) this difference depended on the type of MCQ.  In the paper-based test, between 1% and 25% of test takers skipped a multiple-choice question; in the computer test, the numbers of missed responses to the same questions were up to 10 times smaller.

Quite a big difference in the number of missed responses was observed when, for example, the computer test offered a drop-down menu of responses repeated for each part of the question, whereas in the paper-based test the choice of responses was given once before the item and the student had to write out the correct response by hand.

In such cases the rates of missed responses differed markedly (2-4% in the computer test against 17-21% in the paper test), and average performance on those questions was more than 10% higher in the computer-based test than in the paper-and-pencil test.  There could be many reasons for such a difference in the number of missed responses; unfortunately, we cannot draw more substantial conclusions from just one test.
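For readers who want to see what such a comparison looks like in practice, the short Python sketch below computes per-item missed-response rates and mean scores by delivery mode. The data frame and its values are invented for illustration; this is not the EIS output format.

import pandas as pd

# Hypothetical item-level results: one row per student-item combination, with
# the delivery mode, whether any response was given, and the score obtained.
results = pd.DataFrame({
    "mode":      ["paper", "paper", "paper", "computer", "computer", "computer"],
    "item":      ["Q1", "Q2", "Q3", "Q1", "Q2", "Q3"],
    "responded": [1, 0, 1, 1, 1, 1],
    "score":     [1, 0, 0, 1, 1, 0],
})

# Missed-response rate and average score per item, split by mode.
summary = results.groupby(["item", "mode"]).agg(
    missed_rate=("responded", lambda r: 1 - r.mean()),
    mean_score=("score", "mean"),
).unstack("mode")
print(summary)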

In conclusion, it is very important to consider different types of computer testing and their influence on student performance. Since more and more everyday activities are carried out on computers, paper testing does not measure skills in the format used in real life. Computer-based tests clearly give us more possibilities for asking questions; not only the format of testing but also the types of items and the technical solutions can change student performance.  Further research should be done on how these factors influence student performance.

More information

For further information please contact:

anastassia.voronina@innove.ee


Validity in Education and Psychological Assessment

Paul Newton, Institute of Education (University of London), and Stuart Shaw, Cambridge International Examinations, Cambridge Assessment

The context

Validity is the hallmark of quality for educational and psychological measurement. But what does quality mean in this context? And to what exactly does the concept of validity actually apply? What does it mean to claim validity? And how can a claim to validity be substantiated? A new book entitled ‘Validity in Educational and Psychological Assessment’, which is due to be published by SAGE in early 2014, attempts to explore answers to these fundamental questions.

What is happening

‘Validity in Educational and Psychological Assessment’ adopts an historical perspective, providing a narrative through which to understand the evolution of validity theory from the nineteenth century to the present day. Even today, the meaning of validity is heavily contested and no clear professional consensus exists over a precise, technical meaning.

We describe the history of validity in five broad phases, mapped to the periods between:

1. the mid-1800s and 1920: gestation
2. 1921 and 1951: crystallisation
3. 1952 and 1974: fragmentation
4. 1975 and 1999: (re)unification
5. 2000 and 2012: deconstruction.

We explain how each of these phases can be characterised by different answers to the question at the heart of any validation exercise: how much and what kind of evidence and analysis is required to substantiate a claim of validity?

The book comprises six chapters. In Chapter 1 we set the scene for the historical account which follows. Chapters 2 through 5 offer readers a chronological account that delineates the phases of development of validity theory and validation practice. In Chapter 6 we propose a revision of Messick’s progressive matrix – a neo-Messickian framework for testing policy evaluation. Our revision of the progressive matrix aims to dispel some of the confusion engendered by its original presentation.

Why inform members

We believe that this book will be of interest to anyone with a professional or academic interest in evaluating the quality of educational or psychological assessments, measurements and diagnoses.

More information

For further information please contact:

Paul Newton: p.newton@ioe.ac.uk

Stuart Shaw: shaw.s@cie.org.uk

Doctoral work in Progress

Social and personal development in post-primary education: Assessing the Transition Year in Ireland

Aidan Clerkin
Educational Research Centre, St Patrick’s College, Dublin

Context

The Transition Year programme is an optional year that is offered to students midway through post-primary education in Ireland, when students are typically about 15 years old.  It is very unusual in international terms, and is designed to provide students with space for social and personal development in the absence of any high-stakes examinations.  It is somewhat akin to a gap year that has been embedded into the formal education system (Clerkin, 2012).

Transition Year gives students an opportunity to learn about the world outside their normal school experience – they take on unpaid work experience placements in real workplaces, participate in a range of project work and modules that would not otherwise be available, and go on trips with classmates and teachers.  Students are expected to take on more responsibility, learn to manage their time and organise themselves, try new experiences, and learn to communicate productively and work as part of a group.  The main goal of Transition Year is to prepare young people for “their role as autonomous, participative and responsible members of society” (Department of Education, 1993).  Having grown in popularity over the past two decades, more than half of the eligible cohort of students now takes part each year (Clerkin, 2013).

Research methodology

My PhD research is centred on a quantitative longitudinal assessment of some of the social and personal outcomes that have been qualitatively linked with Transition Year participation – including social self-efficacy, engagement with school, student-teacher relationships, life and school satisfaction, work orientation, self-reliance, and study habits. Three waves of longitudinal data have been collected, in March/April 2011, 2012, and 2013.  This spans the period from before participating students had the option of Transition Year to one full year after the end of their involvement with the programme (or two years if they did not participate).  Approximately 1,500 students in 20 schools around the country, randomly selected from a stratified sampling frame, returned questionnaires in the first wave, and the same students have been approached to participate each time.  The next stage of the research will focus on establishing longitudinal linkages and using latent growth curve modelling to examine change over time.
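A crude observed-variable analogue of that growth analysis, with invented scores and variable names, is sketched below: it simply fits a straight line through each student’s three waves and compares the average slope by participation status. The planned latent growth curve models are considerably more sophisticated than this.

import numpy as np
import pandas as pd

# Hypothetical wide-format data: one engagement-with-school score per wave
# (2011, 2012, 2013) for each student, plus Transition Year participation.
df = pd.DataFrame({
    "student": [1, 2, 3, 4],
    "ty":      [1, 1, 0, 0],            # 1 = took Transition Year
    "wave1":   [3.2, 2.8, 3.0, 3.1],
    "wave2":   [3.5, 3.0, 2.9, 3.0],
    "wave3":   [3.9, 3.3, 2.8, 3.0],
})

# Individual growth rates: slope of a straight line through the three waves
# (a simplified stand-in for the latent slope in a growth curve model).
waves = np.array([0, 1, 2])
df["slope"] = df[["wave1", "wave2", "wave3"]].apply(
    lambda row: np.polyfit(waves, row.values, deg=1)[0], axis=1)

print(df.groupby("ty")["slope"].mean())  # average growth by participation status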

Research and policy relevance

First, this research will provide a detailed profile of the students who do and do not take part in Transition Year, incorporating quantitative psychosocial, academic and demographic information as well as a qualitative attitudinal component.  The most recent such profile of students was published in 2004, drawing on data from an older (1994) survey.  The current study updates this and also includes the significant addition of data on students’ social and psychological development, which has been noted as lacking from previous evaluations of the Transition Year (Smyth, Byrne & Hannan, 2004).

Second, the collection of longitudinal data will allow a detailed examination of the extent to which participation in Transition Year is associated with changes in students’ social and personal development, which has not been possible to date.  Promoting such development is an explicit aim of the programme.  With some critics suggesting that Transition Year is a luxury that should be abolished as a cost-saving measure in these difficult economic times (Clerkin, 2012), additional information on the extent to which the programme contributes to students’ development is timely and necessary.

References

Clerkin, A. (2012). Personal development in secondary education: The Irish Transition Year. Education Policy Analysis Archives, 20(38). [Open access from http://epaa.asu.edu/ojs/article/view/1061]

Clerkin, A. (2013). Growth of the ‘Transition Year’ programme, nationally and in schools serving disadvantaged students, 1992-2011.  Irish Educational Studies.
DOI: 10.1080/03323315.2013.770663

Department of Education. (1993). Transition Year programme: Guidelines for schools. Dublin: Author.

Smyth, E., Byrne, D. & Hannan, C. (2004). The Transition Year programme: An assessment. Dublin: ESRI/Liffey Press.

  

Supervisors: Dr Michael O’Leary, Prof. Mark Morgan.

Contact: aidan.clerkin@erc.ie


Who am I and what can I achieve?

A study of students preparing for high-stakes, A-level examinations


Carol Brown

Oxford University Centre for Educational Assessment

Supervisors:

    

Context

The current research is based on the theoretical framework of Eccles’ expectancy-value model of achievement (Eccles et al., 1983) and examines the relationships between expectations, values and high-stakes A-level assessment (the main examinations taken for university entrance in England). It investigates Eccles’ assumptions that students’ beliefs about their ability and expectations for success are the strongest predictors of grades (Eccles & Wigfield, 1995), that gender differences in task value underlie gender differences in role choice behaviour and achievement (Eccles et al., 1983), and that parental/family characteristics provide an indirect link between SES and educational outcomes (Eccles, 1992; Guo & Harris, 2000).

It also explores the role identity plays in the behavioural choices students make in preparation for their A-levels, and in their achievement, since Eccles (2009) assumed a link between personal and collective identities and the subsequent value attached to a task. Value is determined by the degree to which a task fulfils needs and facilitates or affirms personal goals. According to Eccles (2009), perceptions of skills, characteristics and competence, and perceptions of values and goals, play a role in motivation by influencing expectations for success and the importance people attach to a range of tasks and choices – in this case, A-level examinations. Therefore, if specific identities are important to the individual, the activities, behaviours and tasks associated with them will take on high subjective task value and the individual will be motivated to act them out.

Research Question (s)

What are the relationships between expectations, subjective task value and A-level achievement in high stakes assessment?

  • What is the relationship between expectations and A-level achievement?
  • What is the relationship between expectations and subjective task value in A-level students?
  • What is the relationship between (1) intrinsic value, (2) attainment value, (3) utility value, (4) perceived cost and A-level achievement?
  • How do gender, SES and type of school affect the subjective task value (STV) attached to A-level achievement?

Methodology

A pilot study² (n = 134) was conducted in a city and a town comprehensive school. Students completed a questionnaire comprising three parts:

  • Information on family demographics, socioeconomic status and social capital was collected using items from the Programme for International Student Assessment (PISA) student and parental questionnaires (2009, 2012)
  • The subjective values attached to A-levels were examined using items adapted from Eccles and Wigfield’s (1995) self and task perceptions questionnaire
  • Future educational, economic and occupational expectancies and values were measured using items from the Michigan Study of Adult Life Transitions (Eccles, Adler, & Meece, 1984)

The main study uses an explanatory sequential mixed-methods design. 1,000 students will complete the questionnaires, and a purposefully sampled subsample of 10 students will take part in qualitative interviews to examine further the relationships between expectations, values and achievement. In accordance with the extensive analyses conducted on the expectancy-value model (Eccles & Wigfield, 1995; Wigfield, 1997; Wigfield, Eccles, & Roesner, 1998), inferential analysis will use structural equation modelling (SEM).

1 Examinations taken at the end of secondary education at age 18 in England and Wales
2 Results of the pilot study will be presented at the AEA Conference, Paris 2013
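
For illustration, the sketch below runs an ordinary least-squares regression of a simulated A-level outcome on the expectancy and value components – a deliberately simplified stand-in for the planned SEM, using fabricated data and invented variable names.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated student-level data: scale scores for the expectancy-value
# constructs and an A-level points outcome. Everything here is invented.
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "expectancy": rng.normal(3.5, 0.6, n),
    "intrinsic":  rng.normal(3.2, 0.7, n),
    "attainment": rng.normal(3.4, 0.6, n),
    "utility":    rng.normal(3.3, 0.7, n),
    "cost":       rng.normal(2.5, 0.8, n),
})
# Fabricated outcome, loosely reflecting the assumption that expectations
# are the strongest predictor of grades.
df["alevel_points"] = (40 * df["expectancy"] + 15 * df["attainment"]
                       - 10 * df["cost"] + rng.normal(0, 25, n) + 150)

# Regress the outcome on the expectancy and value components.
model = smf.ols(
    "alevel_points ~ expectancy + intrinsic + attainment + utility + cost",
    data=df,
).fit()
print(model.summary())

Unlike this regression sketch, an SEM treats the constructs as latent variables measured by the questionnaire items, which is why it is the preferred approach for the main study.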

Why is this research of interest?

This research is interesting because it addresses the role of both psychological and social influences on achievement behaviour in A-level students. Researching these links is important because it identifies individual differences in the motivational factors that contribute to A-level achievement, and also identifies the socio-cultural factors underlying differences in behavioural choice and differential achievement in these students. This is especially important to consider as the current A-level system faces reform and the age of compulsory schooling is raised to 18.

References

Eccles, J. S. (1992). School and family effects on the ontogeny of children’s interests, self-perceptions, and activity choices. Nebraska Symposium on Motivation, 40, 145-208.

Eccles, J. S. (2009). Who Am I and What Am I Going to Do With My Life? Personal and Collective Identities as Motivators of Action. Educational Psychologist, 44, 78-89. doi: 10.1080/00461520902832368

Eccles, J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., & Midgley, C. (1983). Expectations, values and academic behaviours. In J. T. Spence (Ed.), Perspective on achievement and achievement motivation (pp. 75-146). San Francisco: Freeman.

Eccles, J. S., Adler, T. F., & Meece, J. L. (1984). Sex differences in achievement: A test of alternative theories. Journal of Personality and Social Psychology, 46(1), 26-43.

Eccles, J. S., & Wigfield, A. (1995). In the mind of the actor: the structure of adolescents’ achievement task values and expectancy-related beliefs. Personality and Social Psychology Bulletin, 21(3), 215-225.

Guo, G., & Harris, K. M. (2000). The mechanisms mediating the effects of poverty on children’s intellectual development. Demography, 37, 431-447.

Wigfield, A. (1997). Predicting children’s grades from their ability beliefs and subjective task values: Developmental and domain differences. Paper presented at the Biennial meeting of the Society for Research in Child Development.

Wigfield, A., Eccles, J. S., & Roesner, R. (1998). Relations of young children’s ability-related beliefs to their subjective task values, performance and effort. Paper presented at the invited symposium on ‘Motivation and affect in the classroom’.

Members’ News

Zeeshan Rahman (zeeshan.rahman@cityandguilds.com) writes:

“Assessment is central to City & Guilds and so are the people who are involved in the development, administration and evaluation of assessment within the organisation. We value and support the AEA-Europe professional accreditation scheme because it recognises the skills, knowledge and experience of assessment professionals within City & Guilds and beyond. It encourages people to reflect on their achievements and think about future development. Several people have been awarded an accreditation status whilst working here thanks to encouragement from colleagues and Alastair Pollitt who came to our offices to explain the accreditation scheme, the application process and the benefits of applying.”

From left to right:  James Wise, Angela Bapuji, Giusy Poliseno and Thomas Harding


Introducing Guri A. Nortvedt, our new Vice President, in her own words

Introduce yourself (educational background, professional experience etc)
My professional career started as a secondary school teacher in 1989, teaching mathematics and Norwegian. I soon became interested in assessment, and after only a few years of teaching I completed a Cand Scient thesis (a kind of master’s thesis) in mathematics education on the topic of diagnostic assessment. This was my way into research. A few months before I was examined on my thesis I started working at the University of Oslo, as a researcher attached to the project I was still studying. I was very excited about this.

In the following years I divided my time between teaching in secondary school and teacher education, and doing research. I had many happy years where my main activity was test development and research, in Oslo and in Trondheim.

In 2006 I was fortunate enough to get the opportunity to do a “late career” PhD when I was offered a research fellow position at the University of Oslo. That was four very interesting years where I designed and carried out a mixed methods study of relationships between reading comprehension and solving arithmetic word problems, involving two assessment designs.

Since 2010 I have been employed as a researcher at the University of Oslo. I lead a project in which we develop mapping (or screening) tests in primary mathematics. In addition, I am part of the Norwegian PISA team.

Best / favourite career moments?
Last winter, on holiday on Gran Canaria and needing some time out of the sun, I was sitting in bed with my seven-year-old niece: I was working on a report on one of our pilots for the mapping tests, and she was helping out by working through two pilots for older students. Turning the page, she said happily: ‘This is a very good item; this is the kind of item the children in my classroom who struggle with mathematics cannot do.’

Did I feel proud?!

Worst career moments?
When, a few minutes later, she laughingly told me why another item was rubbish… and was perfectly correct, as it turned out when the pilot data came in.

What do you think are the most important aspects of assessment? Where/how does assessment do most good?
I think transparency and openness are the two most important words when it comes to assessment; that students understand why and how they are assessed, how criteria are used, how the results are used. Only then can they truly learn what we want them to understand from being assessed.

Still, openness and transparency are equally important when assessment is used as part of a research design, or when designing assessment.

What would you hope to see AEA-Europe achieve in the next 2 years? 10 years?
In the next two years I hope to see us grow in membership and the level of activity reflect this. I would like AEA-Europe to be a platform for communication between conferences as well. This is something that the Council is working towards, and if we are successful, in ten years we might, for instance, have a very successful proceedings or journal. I would also like to see that in ten years we have many members from countries that are currently under-represented in the membership.

What would you like people to remember you for at AEA-Europe?
A good sense of humour, a clear head, and a constructive attitude toward the life of the organisation.

What do you most value in AEA-Europe?
The membership – all the various people, experiences and attitudes that make up the membership of AEA-Europe.

What are you hoping to bring to your new role?
I have three aims for my vice-presidency and the presidency to follow. Firstly, I would like to contribute to an even more open and inclusive organisation. Next, I want to contribute to the growth of AEA-Europe while maintaining the diversity of its membership: I want AEA-Europe to remain at a size where it is possible to have an overview of the full membership, and I want to keep the “mix” of academics, practitioners and policy-level members. Finally, I want AEA-Europe to be democratic and transparent. You might say that my aims are not to change the organisation, but to refine what is already good.

What are you most looking forward to at the next conference?
I am a bit childish about conferences. I look forward to everything – plenaries, paper presentations, discussion groups and poster sessions – I really enjoy listening, discussing, contributing, wondering, discovering or learning something new. In addition, I really enjoy talking to the other participants. My aim is to talk to as many as possible. So I really look forward to the social parts of the programme too: the welcoming reception, coffee breaks, lunches, the dinner….

Is there anything else you would like to add?
Only a big thank you for the confidence you (the members) have shown in me by electing me Vice President.


Thanks from the publications committee

Guri, along with Jo-Anne, was one of the original members of the publications and communications committee. The other founding members were Chris Whetton, who led it for the first few years, and Emma Nardi. Julie Sewell went along to the initial meeting to take notes, but by the end of the meeting had agreed to edit the newsletter and was, as a result, the fifth member of the first committee.

The committee – and AEA-E members – owe a special debt of gratitude to both Jo-Anne and Guri as they took on the responsibility of editing the two special editions of Cadmo in spite of the heavy workloads both have. Clearly, being part of this committee leads to even greater involvement in the association.

Thank you, Guri and Jo-Anne, for all you do – and thanks to Chris and Emma for all they did. Thanks, too, to our current chair, Gabriella Agrusti, and our doctoral ‘helper’, Yasmine El Masri – as well as our previous one, Lucy Simpson.

Conferences

European Association of Test Publishers – Growing Talent in Europe: Gaining Advantage through Assessment
Sept 25-27, 2013, St Julian’s, Malta
http://www.eatpconference.eu.com/

39th Annual IAEA conference,
October 20-25, 2013, Tel Aviv
http://iaea-2013.com/

Ireland International Conference on Education (IICE 2013)
October 21-23, 2013, Dublin, Ireland
http://www.iicedu.org/

AEA-Europe 14th Annual Conference 2013
Nov 7-9, 2013, Paris, France
https://www.aea-europe.net

ATP Innovations in Testing 2014
March 2 – 5, 2014, Scottsdale, Arizona
http://www.testpublishers.org/events 

AERA 2014 – The Power of Education Research for Innovation in Practice and Policy
April 3 – 7, 2014, Philadelphia, Pennsylvania
http://www.aera.net/tabid/10208/Default.aspx

North Central Association Higher Learning Commission’s 2014 Annual Conference
April 10 – 14, 2014, Chicago
http://annualconference.ncahlc.org/

CICE-2014 – Canada International Conference on Education
June 16 – 19, 2014, Nova Scotia, Canada
http://www.ciceducation.org/

EEL 4th annual international conference on education and e-learning
August 18-19, 2014, Bangkok, Thailand
http://www.e-learningedu.org/

Contributions & Deadlines

Publications committee members: Gabriella Agrusti, Jo-Anne Baird, Christina Wikstrom, Julie Sewell, Yasmine El Masri (temporary member).

The AEA-Europe Publications and Communications Committee would like to make a call for contributions to the Association’s newsletter. The Newsletter is published twice a year, in the Spring and the Autumn; the deadline for the next Spring Newsletter is 20th February 2014. We would like help from members to make the information as up to date and relevant as possible. In particular we would like the following:

  • Articles for the Work in progress section. These should be a maximum of 500 words long and be formatted under the headings “Context”, “What is happening”, “Why inform members” and “More information” (brief details of a website or references). Please see previous newsletters for more information.
  • Articles for the Doctoral students’ Work in progress section. These should also be 500-600 words long and follow a similar format, covering the research and the context, with some details of methodology, potential impacts or directions.
  • What’s new section: Information on conferences and courses at master and PhD-level. For conferences we need to know: Title, date, place, website address. For courses at master and PhD-level we would like information on courses relating to assessment open to international students, bearing in mind any language constraints. We will need the name of the institution, the title for the course in your own language as well as in English, plus dates, town, country, website or a contact address.
  • Members’ news: any information on new appointments, promotions, etc.

We would also like some additional AEA-Europe national representatives who would be willing to send information about recent developments in assessment in their countries and approach relevant people to contribute to the Work in progress section. Please let Julie (e-mail address below) know if you would be interested in this role. We already have some volunteers but need more!

Julie Sewell (on behalf of the Publications and Communications Committee)
AEA-Europe
