
Newsletter – Autumn 2016

President’s Report

Good news! In this edition, we are pleased to welcome the new AEA Europe Newsletter Editor, Amina Afif, who officially replaces Julie Sewell after the post had long remained vacant. Amina is a member of the new Publication Committee chaired by Gill Stewart. One of the committee’s first tasks is to reflect upon and define a publication strategy for the AEA. Should you have any ideas for strengthening the identity of the AEA through the publication of its work, please feel free to share your views with the committee members during the upcoming conference in Limassol.

The AEA intends to improve the content and frequency of its newsletter in future. We hope to be able to count on all of our members to share news about their work, which we plan to publish in regular or special thematic issues more than twice a year, depending of course on the number of articles received.

This autumn was spent organizing and planning the next two annual AEA conferences, in Limassol and Prague. We are putting the finishing touches to our conference in Limassol, where we will meet in warm and welcoming (in more than one way) Cyprus. To a Norwegian, this will feel like having an extra summer holiday. The programme looks as exciting as ever. As we say each time, we received even more submissions this year than the year before. There is a growing interest in our association, and I hope we can convince those who attend their first or second AEA conference to become regular members and return year after year, as so many do. All across Europe, a great deal of interesting work is being undertaken and we can learn much from one another. I will cite a simple example. Coming from a culture that relies heavily on teacher assessment and one that trusts teachers, I find it interesting to observe and learn how teacher assessment is moderated in other countries. For assessment cultures like mine, teacher moderation, at first glance, seems to be rooted in a lack of trust in teachers. Looking deeper into the established mechanisms, however, I think that it could also mean that teachers are given the opportunity to develop common criteria and a shared understanding of what it means to master curricular goals across classrooms and schools. This is something we would like to achieve, and learn to do, without taking away any of the autonomy given to our teachers. I am looking forward to discussing this and other interesting topics with you all in Limassol!

This is also the last time I am writing the ‘President’s Word’ for the AEA Europe Newsletter. As my term comes to an end at the close of our conference in Limassol, I will step down and confidently hand over to Thierry Rocher, who will be the AEA Europe President for the coming two years. Together with him and Jannette Elwood, our new Vice-President and President-elect, the AEA council will steer us safely through the next years, prepare upcoming conferences, handle the budget and all other business of AEA Europe, and build new platforms for the association. Busy times lie ahead, but I am optimistic that the council will sail through comfortably. I will certainly miss working with my fellow AEA Europe council members: intelligent, warm, giving and enthusiastic individuals who, over time, have become my friends and with whom I have shared many decisions. I fully trust that they have in mind the best interests of all AEA Europe members, my own included, and I wish them nothing but the very best for the next two years.

Welcome to Limassol

It is that exciting time of the year again, and we are proud to be holding the 17th annual conference of AEA-Europe in Limassol, Cyprus (3-5 November 2016)! Our provider, Easy Conferences, is hosting this conference, and you can find all the relevant information at http://www.cyprusconferences.org/aea2016/index.html.

The theme of the conference, “Social and Political underpinnings of educational assessment: Past, present and future”, is both exciting and timely. Our community received the theme with enthusiasm, and a very large number of high-quality abstracts were submitted. After a blind review procedure, we were able to put together a very rich academic programme, with four parallel sessions including open papers, discussion groups and a poster session. We were also thrilled to secure the participation of outstanding keynote speakers; visit our web page for more information on their presentations and their CVs. The pre-conference workshops are shaping up to be an excellent start to the conference!

In addition to the excellent academic programme, the local committee has put together one of the most promising and exciting social programmes. Picture yourself enjoying your welcome drink by the sea! Indulge in the pleasures of the local dishes and wine at the gala dinner… Oh, and don’t miss out on the optional tours around the island!

We, the local committee, are eagerly waiting to see you all very soon in Limassol!

An attempt to assess students’ collaboration and problem solving competences: A step towards 21st century education

Students in most western societies are immersed in information and communication technology (ICT), and they tend to take it for granted. Furthermore, technological innovations and advancements have changed knowledge requirements, with “an emphasis on what students can do with knowledge, rather than what units of knowledge they have” (Silva, 2009, p. 631). However, what happens in classrooms with regard to the use of ICT does not necessarily reflect the use of ICT in society in general. Researchers, policymakers and practitioners therefore largely focus on the need to narrow the gap between ICT integration in education and students’ use of ICT at home, and several initiatives have promoted the integration of ICT in teaching and learning practices and its benefits for students and teachers. It has often been stated that assessment in particular is lagging behind. Even though there is a current shift from traditional paper-and-pencil towards computer-based assessments (Csapó, Ainley, Bennett, Latour, & Law, 2012), the tests often resemble paper-and-pencil tests (e.g., multiple-choice), just on a screen this time (Darling-Hammond, 2010). Thus, the potential of ICT is not fully exploited in these assessments.

The Assessment and Teaching of 21st Century Skills (ATC21S) project has dealt with some of these issues and developed the test Learning in Digital Networks (LDN-ICT). It measures a number of constructs, collectively called 21st century skills (Griffin & Care, 2015). The LDN-ICT test, through its tasks, attempts to mimic real-world problems and includes innovative and complex item and task designs. For instance, the students interact with a test environment which facilitates synchronous communication and collaboration between students through different tools (e.g., Google Docs and chat). The students have access to the world-wide web without restrictions (barring district-related restrictions). The test attempts to measure students’ ability to handle digital information, to communicate, and to collaborate during problem solving.

As stated by Hohlfeld and colleagues (2010), “Validation of measurement instruments is an ongoing process. This is especially true when dealing with the measurement of technology literacy while using technology, because technology is perpetually changing” (p. 384). Following these arguments, a joint project between the University of California, Berkeley and the University of Oslo was initiated. Within the scope of this project, the LDN-ICT test was adapted and further developed to fit the Norwegian language, cultural, and school context. We have conducted a pilot study and gathered evidence for the internal and external validity of this innovative test. The analyses showed sound psychometric properties. Although further refinements of the test are encouraged and under way, we present evidence that the test is already delivering promising results and is ready for large-scale implementation in classrooms.

If you are interested in further details about our validation study or the test, join our presentation at the next AEA-Europe conference in Cyprus in November 2016. If you cannot make it to the conference, please keep an eye on our ResearchGate page, or email us.

Fazilat Siddiq, University of Oslo

ResearchGate: https://www.researchgate.net/profile/Fazilat_Siddiq

Email: f.s.ullah@ils.uio.no or  fazilatu@gmail.com

Perman Gochyyev, University of California, Berkeley

Email: perman@berkeley.edu or permang@gmail.com

Student school reports in Portugal using only qualitative descriptors

The Portuguese independent institute IAVE, I.P., which is responsible for the external assessment of students, has introduced a new approach to reporting student performance that sets aside all reference to the long-standing traditional quantitative scores and focuses exclusively on qualitative information. This new approach was applied to Portuguese, Mathematics and Social Studies for 2nd grade students, as well as to Portuguese and Mathematics for 5th and 8th grade students.

It is based on a complex and time-consuming framework designed to turn the analysis of student responses into an automated process that produces individual performance descriptors. For each student report, information from groups of items is organized into four categories:

  1. good achievement (the answers matched the expected result);
  2. expressed difficulties (the answers showed relevant faults or mistakes, meaning that there is room for a future pedagogical intervention);
  3. failed to achieve properly (the answers define a set of curricular domains in which students need assistance);
  4. no answer provided.

These descriptors are organized by subject and further by domain. For example, Portuguese is broken down into Reading, Grammar and Writing, while Mathematics is divided into Algebra, Geometry and Statistics. The approach attempts to eliminate the traditional tendency to label a student simply as a “poor performer in Portuguese”.
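To make the mechanics concrete, here is a minimal sketch of how scored item responses could be grouped into the four categories and organized by domain. The data structures, scoring rule, item identifiers and category labels are illustrative assumptions, not IAVE’s actual implementation.

```python
from collections import defaultdict

# Illustrative only: the item/response structures and the scoring rule are
# assumptions for this sketch, not IAVE's actual data model.
def classify(response):
    """Map a single scored response to one of the four report categories."""
    if response is None:
        return "no answer provided"
    if response["score"] == response["max_score"]:
        return "good achievement"
    if response["score"] > 0:
        return "expressed difficulties"
    return "failed to achieve properly"

def student_report(items, responses):
    """Build a qualitative report: domain -> category -> list of item ids."""
    report = defaultdict(lambda: defaultdict(list))
    for item in items:
        category = classify(responses.get(item["id"]))
        report[item["domain"]][category].append(item["id"])
    return report

# Toy example: a small Portuguese test covering three domains.
items = [
    {"id": "P01", "domain": "Reading"},
    {"id": "P02", "domain": "Grammar"},
    {"id": "P03", "domain": "Writing"},
]
responses = {
    "P01": {"score": 2, "max_score": 2},  # fully correct
    "P02": {"score": 1, "max_score": 2},  # partially correct
    # P03 was left unanswered
}
for domain, categories in student_report(items, responses).items():
    print(domain, dict(categories))
```

The same per-domain structure can then be aggregated upwards to class, school, regional and national levels, as described below.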

This new, more sophisticated and thorough report can be likened to a medical report from a doctor, as it provides the opportunity to identify improvement strategies that address specific learning needs. It also offers a means of saving time and costs related to individual support for students.

In addition, all the individual information may be aggregated to the classroom and school levels, which themselves may be compared with aggregated data at regional and national levels. Furthermore, a global overview by the four performance categories may be observed at the national, regional, school and class levels.

E-assessment strategy in Portugal

As part of renewing its external assessment approach, IAVE piloted an e-assessment strategy in which online tests for Portuguese and Mathematics were delivered to 2,500 students in grades 4 and 6 in 43 private schools. This initiative is part of a medium-term strategy to include all public and private schools in a similar process, thereby covering a cohort of 100,000 students in each of grades 2, 5 and 8.

The e-assessment also covers the scoring procedures and was extremely well accepted by students and markers, although some skepticism still exists among a few parents and school officials. The tests, which are delivered to all schools at the same time, are loaded from a USB drive attached to each computer (laptop or tablet). Responses may be uploaded online to a central server whenever an internet connection is active, but they are always recorded on the USB drive as well. This option proved to be a secure solution, avoiding loss of information, especially in schools where the online upload procedure failed.
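The pattern described here is essentially “record locally first, upload when possible”. The sketch below illustrates that idea in a few lines; the file location, upload endpoint and data format are hypothetical stand-ins, not details of the Portuguese infrastructure.

```python
import json
import urllib.request
from pathlib import Path

# Hypothetical stand-ins for the USB mount point and the central server.
LOCAL_DIR = Path("usb_drive/responses")
UPLOAD_URL = "https://example.org/upload"

def save_response(student_id: str, answers: dict) -> Path:
    """Always persist the response locally (on the USB drive) first."""
    LOCAL_DIR.mkdir(parents=True, exist_ok=True)
    path = LOCAL_DIR / f"{student_id}.json"
    path.write_text(json.dumps(answers))
    return path

def try_upload(path: Path) -> bool:
    """Attempt the online upload; on failure the local copy simply remains."""
    try:
        request = urllib.request.Request(
            UPLOAD_URL,
            data=path.read_bytes(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request, timeout=10)
        return True
    except OSError:
        return False  # no connection: the response stays on the USB drive

path = save_response("student-0042", {"item1": "B", "item2": "D"})
print("uploaded" if try_upload(path) else "stored locally only")
```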

For more information, please contact:

Helder Sousa
helder.dsousa@gmail.com

Transition from paper-based to computer-based assessment: a case study of practical implementation in France

Standardized tests used to measure student performance in Europe and around the world are gradually transitioning from a predominantly paper-and-pencil format to a fully digital environment. This phase raises several questions, from both theoretical and practical perspectives. Two specific cases recently implemented in France by the DEPP are described below, illustrating the practical challenges of such approaches.

The DEPP is the statistical department of the French Ministry of Education, which provides stakeholders with objective information regarding students’ knowledge, skills and attitudes, using sample-based assessments (sample sizes typically range between 8,000 and 12,000 students, with a total of 8 to 10 programmes per year). By 2020, the DEPP is committed to gradually switching all paper-based assessments to computer-based ones. From a practical point of view, this transition is currently handled differently according to the type of school: online computer-based assessments in secondary schools and offline tablet-based assessments in primary schools.

For the secondary schools: in November 2015, the DEPP successfully conducted a very large-scale assessment of 160,000 Grade 6 students in more than 4,000 schools across the entire country. The assessment was aimed at providing performance indicators for each of the 30 education authorities (academies). A representative sample was therefore selected from each of these authorities, corresponding overall to the high proportion of one out of five Grade 6 students being assessed.
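As a rough illustration of the stratification idea (drawing roughly one student in five within each authority), here is a minimal sketch. It is not the DEPP’s actual sampling design, which in practice works with whole schools and classes; the authority names and sampling fraction are invented for the example.

```python
import random

# Illustrative only: stratify by education authority and draw ~20% of Grade 6
# students within each one. Not the DEPP's real (school-based) sampling design.
random.seed(2015)

def sample_per_authority(students_by_authority, fraction=0.2):
    """students_by_authority maps an authority name to a list of student ids."""
    sample = {}
    for authority, students in students_by_authority.items():
        n = max(1, round(fraction * len(students)))
        sample[authority] = random.sample(students, n)
    return sample

# Toy population with two hypothetical authorities.
population = {
    "Academie A": [f"A-{i}" for i in range(100)],
    "Academie B": [f"B-{i}" for i in range(50)],
}
sampled = sample_per_authority(population)
print({name: len(chosen) for name, chosen in sampled.items()})
# -> {'Academie A': 20, 'Academie B': 10}
```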

In practical terms, this assessment required a highly organized setup. A network of local IT platforms was created using the existing IT and statistics departments of the local authorities. Such support was a key factor, given the strikingly wide range of IT equipment found, at least in France, even in secondary schools. The quality of internet access also varied widely, which ruled out fully web-based test administration. However, in spite of these constraints and owing to the high commitment of the local actors, over 90% of students were ultimately able to participate in the assessment under good conditions.

For the primary schools: IT equipment and internet access are much poorer and so heterogeneous that online assessment using local computers, as in secondary schools, is not feasible. For this reason, another solution was devised, using tablets and external test administrators. An interface was designed to allow administrators to receive and store the data from the tablets. In June 2016, the DEPP successfully piloted this solution in 120 schools (3,000 students) in one education authority. Each test administrator brought a briefcase containing 15 tablets to around 15-20 schools, assessing one class in each.

This project confirmed the feasibility of a large-scale assessment on mobile devices administered in an offline environment. The next step is to generalize this approach in 2017, using 90 briefcases to assess students in all 30 education authorities. These successes have confirmed the DEPP’s strategic plan to adapt procedures to contexts in the coming years: offline in primary schools and online in secondary schools.

New Masters Course in Educational Measurement – Netherlands

Most teachers in the Dutch education system have not received formal training in the field of educational measurement. This is particularly problematic in intermediate vocational education (IVE) and higher vocational education (HVE), where certain parts of the examination programmes are developed by the teachers themselves. Consequently, these exams have been the subject of public debate for over 20 years.

In order to address this concern, specialists in educational measurement from universities and other institutes have long strived, without success, to include knowledge of educational measurement in the curriculum of teacher-training colleges. The situation changed when the Dutch government recently demanded, firstly, that teachers involved in examinations provide formal proof of their expertise in educational measurement and, secondly, that all HVE teachers hold at least a Master’s degree. Together, these decisions paved the way for developing a two-year Master’s programme in educational measurement, starting in September 2016. The initiative was led by Désirée Joosten-Ten Brinke of Fontys University of Applied Sciences and Theo Eggen, an AEA-Europe Fellow, of Cito and the Research Centre for Examination and Certification (RCEC).

The first year of the course will start at Fontys University and is already fully booked. It is intended both for people working in the public and private educational sectors and for those from businesses and public authorities involved in educational assessment. Certain parts of the programme will be tailored to enable students to apply their learning directly in their daily work.

For more information about the courses, please contact Cor Sluijter (Cor.Sluijter@cito.nl) or Theo Eggen.

Interested in co-editing an online journal in educational measurement?

The editing team of the online Journal of Applied Testing Technology (JATT) is currently seeking a third co-editor from Europe working in the field of educational measurement, in order to help expand the content coverage of the journal. Since 2015, this journal, published by the Association of Test Publishers, has been under the editorship of Reid Klion, Chief Science Officer of Pan (a provider of internet-based human resource assessment services in the US, based in Carmel, Indiana), and myself, Cor Sluijter, Director of the Department of Psychometrics and Research at Cito.

If you are an experienced researcher and are interested in joining the co-editing team, please feel free to contact me for more information.

Cor Sluijter, PhD

Director Department of Psychometrics and Research, CITO

Treasurer AEA-Europe

Tel: +3126 3521 398

Email: Cor.Sluijter@cito.nl 

MathemaTIC: students in Luxembourg learn mathematics on a digital adaptive multilingual platform

In response to Luxembourg’s increasingly heterogeneous school population, the Ministry of National Education, Children and Youth (MENJE) introduced MathemaTIC in September 2015, a project whereby 1,700 students aged 10-11 years, in grades 5 and 6 from 40 schools, can choose to learn mathematics in German, French, Portuguese or English in an adaptive digital environment. MathemaTIC capitalizes on technology to adapt the learning of mathematics to individual student needs using a “language-free” solution. This brings high added value for students who have a lower command of the language of instruction and whose achievement would be hindered if language became a barrier to understanding a mathematical text. This is particularly significant for Luxembourg, the European country with the highest proportion of students who do not speak the language of instruction at home, the highest number of foreign languages taught at school and the highest number of hours dedicated to the teaching of these foreign languages (Eurydice and Eurostat, 2012).

With MathemaTIC, students and teachers have 24/7 access to mathematics resources on any device, including computers, laptops, tablets and smartphones, at school and at home. The interface is divided into a student view and a teacher view, both accessible to the teacher. Students sign in with a personal identifier and access interactive, visual, audio and video mathematical items or exercises, all organized into learning modules based on the national curriculum. A 10-minute diagnostic test in each module first determines the student’s initial knowledge of the content; this result is then used to unlock items of varying difficulty and to guide the sequence of items proposed to the learner. Depending on the student’s success in solving the items, access to further items may be completely or partially locked. Exercises include technology simulations of real-world scenarios that connect abstract mathematical concepts to practical applications, helping to demystify the subject.

Teachers normally use MathemaTIC with the whole class during lessons, alongside the textbook or as part of a group discussion, and students may also work alone to revise or complete homework. Student performance and progress over time are tracked through graphs and tables, which enable both the teacher and the student to visualize individual learning paths in real time. Through an integrated system of continuous feedback, real-time online tips and strategies adapted to individual needs are delivered to students: those who are falling behind are offered remedial help, while faster learners receive more advanced tasks to move ahead. With the time gained, teachers can offer supplementary explanations or exercises, targeted exactly to whom and where they are needed. This optimises lesson time and maximises the impact on teaching and learning.

At the end of each learning module, MathemaTIC proposes a 10-minute summative test that the teacher can administer to the class as a whole. Comparing the results of the summative test with the initial diagnostic test gives a measure of learning for that module and helps identify and reduce gaps in learning.
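The diagnostic-then-unlock mechanism can be pictured with a small sketch. The item pool, difficulty scale, threshold and unlocking rule below are invented purely for illustration; they are not MathemaTIC’s actual algorithm.

```python
# Illustrative only: a toy module where a diagnostic score (0-1) sets a
# difficulty ceiling, unlocking items up to that level and proposing the
# easiest unsolved item next. Not MathemaTIC's real adaptive logic.
MODULE_ITEMS = [
    {"id": "frac-01", "difficulty": 1},
    {"id": "frac-02", "difficulty": 2},
    {"id": "frac-03", "difficulty": 3},
    {"id": "frac-04", "difficulty": 4},
]

def unlocked_items(diagnostic_score: float):
    """Unlock items up to a difficulty ceiling derived from the diagnostic score."""
    ceiling = 1 + round(diagnostic_score * 3)  # score 0.0 -> ceiling 1, score 1.0 -> ceiling 4
    return [item for item in MODULE_ITEMS if item["difficulty"] <= ceiling]

def next_item(diagnostic_score: float, solved_ids: set):
    """Propose the easiest unlocked item the student has not yet solved."""
    candidates = [item for item in unlocked_items(diagnostic_score)
                  if item["id"] not in solved_ids]
    return min(candidates, key=lambda item: item["difficulty"]) if candidates else None

# A student who scored 0.5 on the 10-minute diagnostic test:
print([item["id"] for item in unlocked_items(0.5)])  # items up to difficulty 3 unlocked
print(next_item(0.5, {"frac-01"}))                   # frac-02 is proposed next
```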

Both teachers and students are particularly excited about MathemaTIC because it acts like an intelligent tutor, offering immediate low-stakes feedback while learning occurs rather than after it has occurred, as in the case of adaptive testing. Overall, students have the opportunity both to master routine mathematical tasks and to tackle problem-solving tasks, in the classroom and at home, and the multilingual platform serves the pedagogical objectives of differentiation, individualization and personalization while meeting the wide range of student needs, irrespective of their migration, language and social origins. To ensure the highest quality, MathemaTIC has been carefully tailored to meet the specific needs of students through close collaboration and partnership with pedagogy, technology and research experts from Luxembourg, Canada and France.

To learn more and participate in the MathemaTIC project, please contact Amina Afif (aminath.afif@men.lu) or info@mathematic.lu. Alternatively, feel free to visit the MathemaTIC website www.mathematic.lu.

PhD on formative assessment of writing

In my dissertation (Burner, 2016) I explored teacher and student perceptions and practices of formative assessment in English as a foreign language (EFL) writing classes. Four EFL teachers and their students (N=100) took part in the study. Mixed methods were used to collect, analyze and report data. The dissertation consists of three journal articles and an extended abstract of 96 pages.

The first article (Burner, 2014) was a review of research on writing portfolios, restricted to the period 1998-2013. The findings showed that writing portfolios have several formative potentials for second and foreign language learners. However, too little research has been conducted in primary and secondary education, and too few studies have used observation as a method of validating findings to see how portfolios can make assessment more formative.

The second article (Burner, 2015a) formed the start-up phase of the study, focusing on how teachers and students perceive and act on formative assessment of writing. The findings showed that, despite the school’s long tradition of working with formative assessment, there were significant contradictions both within the student group and between the students and their teachers regarding how they perceive and act on formative assessment of writing. One of the main implications is that more time and space are needed to discuss, try out and follow up formative assessment in writing classes. The contradictions laid the foundation for interventions, using the writing portfolio as a mediating artifact.

Three classes (N=70) completed a one-year intervention period, using portfolios as an assessment tool. The third article (Burner, 2015b) reported the results from this period. The findings indicated that assessment became more formative in some respects, but there were exceptions, as well as some issues with student differentiation and computer technology. Notably, the writing portfolio seemed to be most beneficial for the high-performing students, although it created time and space for writing for all students. The study provides an example of how a formative tool can contribute to bridging the gap between perceptions and practices, and between theory and practice, of formative assessment. However, it also indicates how challenging it is to conduct interventions at one end, with teachers, and expect results at the other end, with students.

The contribution of this dissertation is increased knowledge about the relationship between portfolio assessment and formative assessment in second/foreign language learning contexts, and about how teachers and students perceive and act on formative assessment in writing classes. It suggests a possible framework for studying the formative activities involved in writing assessment at the juncture of teacher and student perspectives, theory and practice.

Tony Burner
University of Southeast Norway
E-mail: tony.burner@hbv.no
Academia: https://hisn.academia.edu/TonyBurner

Introducing ICILS 2018

In 2013, the International Association for the Evaluation of Educational Achievement (IEA) launched an unprecedented study of the advancement of young people in computer and information literacy (CIL): the IEA International Computer and Information Literacy Study (ICILS).

This was the first international research study aimed at assessing and comparing the level of CIL of young people on a global scale, and it was developed to meet a growing need for high-quality data on digital literacy trends and associated teaching practices and contextual factors. ICILS 2013 collected data from close to 60,000 Grade 8 (or equivalent) students from 21 countries and education systems.

The IEA, together with the international study center at the Australian Council for Educational Research (ACER), is now launching a second cycle of ICILS for 2018, which will build upon the research conducted in 2013, using ICILS 2013 as a baseline study while further broadening the scope of the digital skills and capacities to be measured. Three trend modules will provide key linkages between ICILS 2013 and 2018, allowing countries that participated in the previous cycle to monitor changes in CIL over time and to identify correlated teaching and learning contexts. Two newly developed modules will be field tested, and two optional modules covering computational thinking (the process of working out exactly how computers can help us solve problems) will be made available to participating education systems.

As in 2013, the student modules for ICILS 2018 will be administered online. As a new feature of the ICILS 2018 field trial, all ICILS instruments (including updated teacher, principal, and ICT-coordinator questionnaires) will be adapted, translated, verified, and made available for online administration using the new IEA eAssessment System, a single web-based platform with different functionalities for the different processes. This streamlining, integral to the administration of international large-scale assessments and to the international comparability of the resulting data, represents an exciting advance in the field of educational evaluation.

Governments are now including digital literacy in a wide range of curricula, prioritizing CIL in policy goals and asking teachers to integrate digital processes into pedagogy at an ever-increasing rate. ICILS 2018 provides an opportunity for education systems to gather and analyze data on their young people’s CIL and to help answer the question: what can education systems and schools do to improve students’ digital competencies in an age in which such skills will become ever more crucial to success, on both a micro and a macro scale?

The IEA welcomes contributions from all organizations interested in supporting this critical international research endeavor. Organizations and companies interested in exploring options for partnership are invited to contact the IEA (secretariat@iea.nl) to discuss mutual opportunities for collaboration.

For more information, please visit the ICILS 2018 website at http://www.iea.nl/icils_2018.html.