The AEA Europe Autumn/Winter 2019 Newsletter

Editor’s Note

Dear Readers,

Summer has gone by and the autumn leaves are turning orange. In my part of the world the sun still shows up for a few hours every other day. More comforting still, we can look forward to some more sunshine in Lisbon – yes, that is where the IAVE team is welcoming us for the 20th AEA-Europe annual conference. We are looking forward to meeting you there. Meanwhile, here is what our members want to share with you about their activities related to assessment in education.

Our AEA-Europe President, Jannette Elwood, has already completed one year of her term. The association continues to be successful, but she reminds us not to be complacent and to keep working hard so that we remain a force for good within the European educational assessment context.

Indeed, much is happening, as our contributors this month have sent in a great variety of articles on assessment. Within AEA-Europe, our own eAssessment SIG will be very active at the Lisbon conference. Come visit their stand near the conference registration desk, join their pre-conference workshop or their Ignite session, or participate in the SIG’s first business meeting to learn what the group has been up to in 2019.

How much do you know about the assessment culture of your country? Do you know to what extent your national examinations are influenced by the testing culture? Read two articles based on doctoral research focusing on assessment cultures, which examined how American approaches to standardised testing during the twentieth century were received or resisted in Norway and Sweden, and how the external examination system in England and Wales still prevails in spite of much opposition. The two authors are considering setting up an AEA-Europe SIG on “assessment cultures”, if there is sufficient interest. Please consider joining!

Critical thinking has long been a buzzword in the context of 21st century skills, but how do we assess this skill? Read on to find out how research in France has led to an improved definition of critical thinking, which will be used to develop tests to assess this skill in primary and secondary school children.

Moving on, you can read about the promising work of a young researcher who examined how machine learning could be applied to large-scale international studies using TIMSS 2015 U.S. national public-use data. This is very useful going forward, as the analysis of growing volumes of big data in educational assessment still employs conventional statistical methods and has focused on only a few theory-driven variables.

Next, how do teachers feel about standardised testing? We include here an article about a large-scale investigation of Irish primary teachers’ practices, beliefs/attitudes, policy advice and professional development needs with respect to standardised testing in English reading and mathematics. Another article describes how one organization has joined a partnership to customize an online maths program for its initial teacher trainees, given that providers of such training in the UK are now responsible for ensuring the adequate literacy and numeracy levels of their trainees.

Would you like to find out how well students are prepared for study, work and life in a digital world? Well, the whole educational assessment world is awaiting the results of IEA’s International Computer and Information Literacy Study 2018, to be released on the 5th November 2019. Read on to find out how to access the study results when they go live. Finally, in the last article, you can read about the congress the International Observatory on School Climate and Violence Prevention is organizing for educators in Mexico in April 2020, addressing issues related to school safety and children’s healthy development.

You will agree with me that once more our contributors have shared with us exciting initiatives and research findings. A big thank you to you all! And to you other readers, you must surely have a lot happening on your side too! Please write to us and share your work in this space. Let our association members know about your work on research, educational assessment and policy programs. Use this space to connect with the broader community of assessment in education.

To include your story in the next issue of our newsletter, please feel free to reach out to me at aminath.afif@men.lu. To those of you coming to Lisbon, I hope to bump into you there!

Happy reading!

Amina Afif
AEA-Europe Newsletter Editor
Publications Committee

A word from the President

I cannot believe that it is nearly that time again – November and the AEA-Europe Conference! Nor can I believe that it means I have been President for nearly a year – time flies when there is a lot of work to do! This year, for our 20th Annual Conference, we are heading to the beautiful capital of Portugal – Lisbon. The team at IAVE (our hosts this November) has been putting all their energy and enthusiasm into welcoming us and making sure the conference will be a success – I have every confidence it will be, and we will remember Lisbon with fondness as we do all our host cities! There will be plenty for you to discuss, debate and engage with at the conference – but also much that will attract (and distract!) you in the great city of Lisbon – so make sure you bring all your energy to this historical city that has many stories to tell. Again there will be a mobile app to use at the conference, on which you will be able to set up your own conference agenda, access the programme, set reminders for the presentations you wish to attend, link up with other attendees and vote for your favourite poster.

Meanwhile, throughout the year, besides organising the conference, the association has been working hard to introduce new ideas, put in place new communication tools and refine its procedures and rules – all in response to the feedback we obtained from our members. We continue to improve the website as a one-stop shop for information, communication and interaction with members. Also in response to your feedback, we made sure that there was only one site to visit to access the conference website and registration page – we hope this made finding out about the conference, registration and accommodation easier for everyone this year.

The association’s activities also included successful webinars hosted by the e-Assessment SIG, and discussions have been taking place with interested members about widening the number and scope of SIGs associated with AEA-Europe. Professional development is another aspect of our work that has been strengthened this year through the accreditation process – we have had a large number of applications for this, and it shows that colleagues see accreditation as a positive activity to engage with and a kite-mark of quality when it comes to assessment practice. We will also be promoting the network of PhD students at this conference – we hope to gain ideas and suggestions from them as to how we can sustain engagement and support students throughout the year between annual events. During the coming months, the association will continue to grow and promote the new tools and ideas that are now in place. Our goal for next year is to establish a strategy to increase and diversify our membership so that even more European countries are represented in AEA-Europe.

We have had another great year, and I am confident that with all the commitment and hard work shown by you all (the members of AEA-Europe), 2020 will be just as great – but we cannot be complacent, and it is truly up to all of us to keep AEA-Europe sustainable and a force for good within the European educational assessment context – even in these very turbulent times. Please do get in touch with me or other Council members if you have any ideas, suggestions and/or contributions you wish to make to help us achieve our aims of sustainability and development.

Thank you again for all you do to spread the word about the good work of AEA-Europe across Europe and beyond.

Looking forward to meeting you all soon in Lisbon – enjoy the conference!

See you in November at the AEA-Europe 2019 Conference in Lisbon!

It is conference time again! Are you all set for Lisbon? From the 13th to the 16th of November 2019, the Institute for Educational Assessment (IAVE) in Portugal will welcome you to the 20th AEA-Europe Conference, where we will be reflecting on Assessment for transformation: teaching, learning and improving educational outcomes.

IAVE has been a public institute since 2014 and is responsible for designing and implementing external assessment in Portugal. We are legally acknowledged as having technical and professional independence, and we act as a partner of the services and organisations within the Ministry of Education that are responsible for curriculum development and teacher training. Our main goal is to contribute to the continuous improvement of education, promoting its quality, efficacy and efficiency in a way that leads to better learning outcomes. We promote innovation and technological advances in teacher training and in test development, administration and marking, and we are committed to innovative reporting that can be shared by teachers, parents and students, thus fostering quality feedback.

Within our scope of work, we are delighted this year to host the 20th annual AEA-Europe Conference, bringing together experts from different countries who will share knowledge, experience and views in the field of assessment. A large number of participants have already registered, and the Local Organising Committee aims to provide a positive environment for professional growth and exchange. As every year, this will include plenary sessions with keynote addresses, open paper sessions, discussion groups, symposia, ignite sessions and poster presentations. The conference venue, in the heart of the city of Lisbon, provides easy access to historic sites, adding a cultural dimension to our conference. We are looking forward to meeting you in Lisbon!

Learn about the AEA SIG E-Assessment activities in Lisbon!

As we are all gearing up for Lisbon, the SIG E-Assessment preparations for the 20th AEA-Europe conference are well underway. We are really looking forward to seeing AEA-Europe members with a specific interest in digital assessment joining our SIG activities at the conference.

All AEA-Europe members interested in digital assessment in all its forms are invited to come by our stand near the registration desk. SIG members will be there to answer your questions and you can register to join the SIG. If you are already a SIG member, come and say hello! We would love to hear your suggestions on how the SIG can help you, and we are always looking for members willing to host a webinar or post a blog.

Like last year, we will hold a pre-conference workshop on digital assessment on Wednesday: Innovative on-screen assessment – exploring item types and the paper-to-digital transition. In this workshop, SIG board members Caroline Jongkamp and Rebecca Hamer will present a new framework for categorising digital assessments, inspired by Kathleen Scalise’s 2009 taxonomy, which proved so easy to use. The new framework builds on the findings from the SIG’s workshop in 2019 and includes many more digital item types that have become available since 2009. We will examine the link between digital item type and what is being assessed. In the afternoon, participants will work on redesigning an existing paper-based assessment for a digital format.

On Friday morning between 8:30 and 9 am, the SIG will host its first business meeting, where we will report back on the past year, ask for your feedback and ideas, and invite anyone interested in joining our Steering Group to make themselves known. In the Ignite session on Friday between 4 and 5 pm, Martyn Ware (SIG chair) will present an overview of how the SIG supports the sharing of practice and experience in e-assessment. Come along to see if Martyn can deliver on the challenging format of 20 slides that advance automatically at 15-second intervals. We would also like to draw your attention to the various other sessions and discussion groups with an eAssessment focus.

If you have questions or ideas for the SIG, please come and find us at our stand or elsewhere throughout the conference. All SIG board members will be recognisable by our SIG buttons, which are based on our new, refreshed logo. For more information, contact Mary Richardson at mary.richardson@ucl.ac.uk.

Understanding our Assessment Cultures: Norway and Sweden

In my recent PhD study, I examined the policy legitimation of educational assessment reforms in Norway and Sweden. Part of the study is a historical piece, published in Alarcón and Lawn’s (2018) book on Assessment Cultures, which traces the history of the two countries’ contemporary national examination and testing policies. The study integrates Hopmann’s (2003) perspectives on process- and product-controlled education systems with Carson’s (2008) comparison of the concept of merit in the French and American republics during the twentieth century. These perspectives help describe the emergence of two distinctly different approaches to educational assessment in the continental European countries and the United States: the examination culture (with an emphasis on professional [subjective] judgments) and the testing culture (with an emphasis on external [objective] measurements), as shown in Figure 1.

Figure 1. Curriculum steering in process-controlled and product-controlled education systems

Premises for controlling the curriculum and teachers:
- Process-control: The national curriculum provides guidelines to teachers, who are recognised as qualified through national teacher education.
- Product-control: The school sector is divided between private and public providers, with no unified concept of teacher education. Thus, the emphasis is on external product control instead.

Assessment instruments used to govern the education system and its certification procedures rely on:
- Examinations (process-control): Professional (subjective) judgement – members of the profession control each other’s assessments to facilitate the validity and comparability of assessments.
- Tests (product-control): External (objective) measurement – standardised tests developed by measurement experts facilitate the validity and comparability of assessments.

Based on these typologies, the study details the reception of and resistance to American approaches to standardised testing during the twentieth century in Sweden and Norway.

Norway sustained the continental European tradition of professional judgments, associated with the German Abitur and the French Baccalauréat, whereas Sweden adopted a more American-inspired approach, more reliant on externally developed tests. A shared feature, however, is that teachers undertake the assessments, unlike in the UK, which employs external examination agencies for its A-levels.

The study demonstrates how countries that began with an examination culture have – to a greater or lesser extent – been influenced by the testing culture, which can in part be related to engagement with transnational trends throughout the twentieth century. Research collaboration undertaken in the International Examination Inquiry of the 1930s influenced the emphasis on psychometric approaches to educational assessment in Sweden, which ultimately replaced its examinations with psychometric tests in 1968. In Norway, it was not until the increased emphasis on accountability from the 1990s onwards, and especially the “PISA shock” in 2001, that psychometric approaches to educational assessment gained legitimacy and were implemented through the national testing scheme in 2004. As a result, Norway has one examination system used for certification purposes and another set of national tests used for governing purposes, whereas Sweden combines the two in one assessment instrument, the national tests.

The national tests in Sweden are subject- and discipline-based, reflecting how they are used to certify (subject) learning and instruction. In Norway, the national tests are interdisciplinary and skills-based, which reflects the emphasis on skills in PISA, the most influential international testing programme associated with the increased emphasis on accountability.

Based on the present study, there are grounds for speculating that comparable traits exist between Norway and Germany, given Germany’s corresponding resistance to standardised testing and its sustained examination tradition. The Netherlands, on the other hand, can be assumed to share Sweden’s legacy and contemporary epistemological orientation, being another early adopter of standardised testing. These traits should be explored further in research studies on assessment cultures.

For further information about this study, please contact Sverre Tveit, University of Agder, Norway at sverre.tveit@uia.no.

Understanding our Assessment Cultures: England and Wales

Alarcón and Lawn’s book Assessment Cultures: Historical Perspectives, mentioned by Sverre Tveit in the previous article, illustrates the fact that national examination systems grow from within their home cultures. They are the result of decisions made in the past and the cumulative effect of those decisions is what has produced the systems we have today.

My recent PhD study focused on the first national school examination system for 16 and 18-year-olds in England and Wales. The scheme, which was called the “School Certificate Examinations” (SCE), was set up by the government in 1918, and it ran until 1950, when it was replaced by ‘O’ and ‘A’ levels. Several key aspects of the SCE system were controversial a hundred years ago and those controversies are still with us.

Firstly, the government decided that the SCE should be externally run by examination boards, rather than internally run by teachers. Alternatives to external exams had had support since as far back as the mid-nineteenth century, when the Abitur system of school-leaving examinations in Germany was well known in Britain and strongly promoted by several leading British educationalists. The story of why England and Wales did not take up that internal model has been a main line of enquiry in my work.

Secondly, rather than setting up one national examination board, the Ministry of Education invited eight examination boards to run the system, while schools were allowed to choose which board’s examinations they might use. The influence of the Abitur might be seen here, as it was assumed at the time that there would be close relationships between examining boards and ‘their’ schools.

Thirdly, those eight boards were based in the nation’s universities. It was felt by the government that to gain popular support the SCE system needed the universities’ prestige. But that decision was strongly opposed by those who said that the secondary school curriculum would become dominated by the universities and be “too academic”.

Lastly, there was opposition to external examinations from teachers and leading educationalists for the entire School Certificate period. Many teachers saw national external examinations as interfering with their professional autonomy. That argument was victorious in 1943 when a government committee recommended that external examinations for students under the age of seventeen should be abolished. This proposal became the Ministry of Education’s accepted policy for the next eight years.

And yet, the examination system in England and Wales is still based on external examinations. My main interest has been to answer the question, how did the external national examination system survive when there was so much opposition to it?

For further information about this study, please contact Andrew Watts, Wolfson College, Cambridge at aw253@cam.ac.uk.

Proposal to create an AEA-Europe Special Interest Group on “Assessment Cultures”

Based on the two previous articles on understanding assessment cultures, one can observe how different combinations of forces acting on individual societies make each national assessment system unique.

A possible forum for further sharing explanations of different national assessment cultures would be an AEA-Europe Special Interest Group (SIG) on Assessment Cultures. Members could discuss how those explanations might relate to decisions made long ago that still influence the way we work, or how they might feed into comparative accounts in which we reflect on differences between the assessment systems in our different countries.

If this idea sounds appealing to you, write to the authors and let them know. If you are planning to attend the conference in Lisbon and are interested in this topic, they could arrange an informal meeting there to discuss the proposal. If you want to see what could emerge from such studies, reading a few of the chapters in Alarcón and Lawn’s book would be an instructive and enjoyable experience. To indicate your interest, just send a brief e-mail to Andrew Watts at aw253@cam.ac.uk or Sverre Tveit at sverre.tveit@uia.no, who are willing to put the proposal onto a more formal footing within the Association.

Assessing Critical Thinking early in education: French research focusing on primary school students

Critical thinking is an increasingly discussed matter in the world of education. The OECD considers critical thinking (CT) as one of the most important skills of the 21st century to help students succeed in modern and globalised economies based on knowledge and innovation. However, the ability of schools to develop students’ critical thinking is limited by the lack of information on the subject. Indeed, how can an education system implement the teaching of critical thinking without a way to assess this skill and a way to evaluate the effectiveness of classroom interventions? How can we train teachers if we do not know which interventions will be useful for the development of students’ CT?

To address these issues, educational and scientific initiatives are emerging and developing, most of them focusing on critical thinking in higher education. It is indeed very important to teach young adults how to think more rationally, to know their own reasoning biases, and to distinguish good-quality from poor-quality information. But these abilities are so important that we should integrate them into the curriculum earlier, before higher education. The first benefit of early critical thinking education is that it offers this knowledge to all children, not only to those who pursue higher education. A second benefit is that habits (of thinking) develop more easily earlier in life than later.

In line with these remarks, the French National Agency for Research has funded a scientific project on critical thinking assessment and education. Launched in January 2019, the project focuses on CT in primary and secondary schools in France. Among its research questions: How is CT evaluated at an age when written language is still being learned? What are the cognitive foundations of CT in children? What are the stages of children’s CT development? Which educational interventions have a significant positive impact on children’s CT?

The first phase of this project consisted in converging on a precise, minimal and operational definition of critical thinking, in order to overcome two shortcomings: first, the impossibility of assessing a capacity that was too broadly conceived, and second, the difficulty of developing capacities that were too general. This first “work package” is now completed and led to a simple definition: CT is the ability to properly calibrate one’s confidence in a given piece of information, based on an assessment of its quality (likelihood of the information, reliability of the source, soundness of the reasoning offered, etc.). That is, to evaluate the trustworthiness of the information at our disposal, whether on the Internet or on other media, or even in discussions at home or school, with friends and family. This evaluation also needs to be reflexive, assessing the epistemic quality of our own original ideas and opinions (“don’t believe everything you think!”).

Thanks to this improved definition, CT is clearly differentiated from other cognitive capacities such as decision-making or problem-solving, although it contributes to these skills by feeding them with trusted information. The definition also led to the identification of natural cognitive abilities that humans already possess for evaluating information. These basic abilities, present in all children, are the foundational skills that we can improve and equip in order to help children develop expert critical thinking.

Based on these results, the second phase of the project will develop tests to assess CT in primary and secondary school children. These tests will be used to measure progress across ages and grade levels. The team will then use the new tests to evaluate the effectiveness of several CT teaching modules (which already exist in France thanks to the personal initiatives of a few teachers). Ultimately, the research project will make it possible to verify the impact of CT education on young children in order to advise decision-makers and teachers on better integrating CT education into the curriculum.

For further information about this research, feel free to contact audrey.bedel@ephe.psl.eu and/or see our website (in French) http://espritcritique.info.

Predicting mathematics achievement: A machine learning approach using international assessment data

At the 60th meeting of the International Association for the Evaluation of Educational Achievement (IEA) Annual General Assembly in Ljubljana, Slovenia, in October 2019, Yuqi Liao from the American Institutes for Research (AIR) was invited to present his work on “Predicting mathematics achievement: A machine learning approach using TIMSS 2015 U.S. national public-use data”, which was recognised as the best presentation at IEA’s research conference held earlier in the same year.

Poor mathematics achievement creates difficulties for students in everyday activities and leads to serious consequences in educational attainment and career advancement. Therefore, identifying characteristics that could predict poor mathematics achievement would help educators allocate resources effectively and intervene in time. Existing literature has explored various factors that may be associated with students’ mathematics achievement. Characteristics ranging from students’ gender, race, and family background, to teacher qualification and school resources, to name a few, are believed to be significant determinants of achievement. However, most studies have employed conventional statistical methods and have focused on only a few theory-driven variables. With the recent development of machine learning (ML), a number of data-driven techniques are now utilized by education researchers. However, only a handful of studies have focused on applying ML techniques to large-scale international assessment data, and none have compared the performance of different ML methods.

Using TIMSS 2015 U.S. national public-use data, Yuqi’s study shows how machine learning could be applied to large-scale international studies. The analysis provides a comparison of six commonly used machine learning models in identifying students with low mathematics performance, which would allow educators to effectively allocate resources to intervene and thus improve their performance. The results show that logistic regression, elastic net, and extreme gradient boosting all perform well in terms of balanced accuracy and sensitivity. Leveraging the high dimensionality of the data, this research analyzes 142 student and school variables and identifies characteristics that are associated with mathematics performance, including students’ home education resources, attitudes toward mathematics, and use and possession of computers or tablets.
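The two evaluation metrics reported above – balanced accuracy and sensitivity – can be computed directly from a model’s predictions. The sketch below (an illustrative, stdlib-only Python example with made-up labels; it is not the study’s actual code or TIMSS data) shows how the two metrics relate for a binary “low performance” classifier:

```python
# Balanced accuracy and sensitivity for a binary classifier that flags
# low-performing students (1 = low performance, 0 = otherwise).
# Illustrative sketch only; the labels below are made up.

def confusion_counts(y_true, y_pred):
    """Return (tp, fp, tn, fn) for binary 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, tn, fn

def sensitivity(y_true, y_pred):
    """Share of truly low-performing students the model catches (recall)."""
    tp, _, _, fn = confusion_counts(y_true, y_pred)
    return tp / (tp + fn)

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity and specificity; robust to class imbalance."""
    tp, fp, tn, fn = confusion_counts(y_true, y_pred)
    specificity = tn / (tn + fp)
    return (sensitivity(y_true, y_pred) + specificity) / 2

# Made-up example: 4 low performers, 4 others.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 1, 1]
print(sensitivity(y_true, y_pred))        # 0.75
print(balanced_accuracy(y_true, y_pred))  # 0.625
```

Because far fewer students fall into the low-performing group than outside it, balanced accuracy is a more informative summary than raw accuracy in this setting, which is presumably why the study reports it alongside sensitivity.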

The manuscript and the presentation are available online. Please contact Yuqi Liao at yliao@air.org for any questions or inquiries.

Standardised Testing in English Reading and Mathematics in the Irish Primary School: a Survey of Irish Primary Teachers

Recent policy changes in Ireland mean that primary schools are now required to administer standardised tests in English reading and mathematics in classes at the 2nd, 4th and 6th grades, and to report the aggregated results to their Boards of Management and the Department of Education and Skills (DES). Schools are also required to share the results with parents. As of September 2017, the outcomes of standardised tests are used at national level as part of the process of determining the allocation of special educational teaching resources to schools.

The international literature suggests that when standardised test scores are shared widely and used for purposes beyond internal planning, the associated sense of accountability can result in pressure to perform, narrowing of the curriculum and other negative consequences.

The first large-scale investigation of Irish primary teachers’ practices, beliefs/attitudes, policy advice and professional development needs with respect to standardised testing in English reading and mathematics has been conducted by the Centre for Assessment Research Policy and Practice in Education (CARPE) at Dublin City University and the Irish National Teachers’ Organisation (INTO). Involving a random sample of 1,564 teachers, the findings indicate that Irish teachers were neither wholly supportive of, nor wholly opposed to, standardised testing. Good practices in terms of planning and use were reported by about one third of the sample. However, there was also evidence that test data were underutilised for formative purposes, and that the process of constructing standardised tests, as well as the interpretation of norm scores, was not well understood by teachers.

Somewhat worryingly, the data provide a snapshot in time that highlights the increased status of standardised testing in the Irish primary system. The vast majority of teachers claimed that they felt pressure from within themselves to improve their pupils’ standardised test scores. About half reported feeling pressure from parents, a third from inspectors, principals or colleagues, and a quarter from pupils and the media. Some teachers indicated that they were aware of colleagues within their own schools who ‘teach to the test’ and/or knew of other schools in which the practice occurs. Approximately one in ten indicated that some of their pupils were getting private tuition before standardised testing in May. Individual teachers lamented the negative impact of standardised tests on pupils with SEN, reiterating the need for alternative and/or differentiated assessments.

The findings give educators and policy makers in Ireland (and anywhere standardised tests are being used) much food for thought and, potentially, a basis for informing decision-making, planning and action. The recommendation here is that a set of principles should always underpin standardised testing and that a number of initiatives should be put in place to ensure that problems associated with it in other jurisdictions do not escalate in the Irish context.

For more information contact michael.oleary@dcu.ie or access the full report at: https://www.dcu.ie/sites/default/files/carpe/carpeinto_standardised_testing_survey_web.pdf

Assessing and Improving the Numeracy of New Teachers in the UK

In July, the UK Schools Minister announced that the national system of Professional Skills Tests in literacy and numeracy for trainee teachers would be removed and replaced by a new policy under which providers of initial teacher training (ITT) are responsible for ensuring the adequate literacy and numeracy levels of their trainees.

As a result of this policy change, the National Association of School-Based Teacher Trainers (NASBTT), an organization for school-based ITT providers in the UK, entered into a partnership with Vretta to provide the award-winning Elevate My Maths (EMM) program to support its members.

EMM is an online program for tertiary-level students that incorporates a diagnostic assessment to determine each student’s strengths and weaknesses in maths, followed by upgrading lessons and a summative assessment to ensure that the student’s numeracy level is adequate for moving forward. The EMM program was first announced at the AEA-Europe conference in 2014 in Tallinn, Estonia. Since then it has been used extensively in Canada, and has also seen significant usage in the USA, Sweden and Australia, where it was used by teacher-training students. In 2018, EMM received an award from the e-Assessment Association for the Best Transformational Project.

NASBTT has some 175 members, providing ITT to new teachers in schools throughout England. Under the new NASBTT-Vretta partnership, these providers will have access to a customized diagnostic assessment covering the trainees’ achievement in four areas of fundamental maths skills. Each of these topics has a series of questions in the diagnostic assessment, and trainees must answer 80% of them correctly to complete the topic mini-assessment. Those who score lower are recommended a series of online upgrading lessons on that topic. Finally, when all the upgrading lessons have been completed, they retake the assessment, this time as a summative assessment to confirm their maths proficiency.
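The per-topic pass rule described above can be sketched in a few lines of code. This is purely an illustrative sketch of the decision logic, not Vretta’s actual implementation; the topic names, question counts and the 80% threshold applied per topic are assumptions drawn from the description here.

```python
# Illustrative sketch of the EMM-style diagnostic rule: a topic
# mini-assessment is completed when at least 80% of its questions
# are answered correctly; otherwise upgrading lessons on that
# topic are recommended.

PASS_THRESHOLD = 0.8  # assumed pass mark per topic


def topics_needing_upgrade(results):
    """Given a mapping of topic -> (correct, total) question counts,
    return the topics for which upgrading lessons are recommended."""
    needs_upgrade = []
    for topic, (correct, total) in results.items():
        if correct / total < PASS_THRESHOLD:
            needs_upgrade.append(topic)
    return needs_upgrade


# Example: a trainee's diagnostic results on four hypothetical topics
results = {
    "number": (9, 10),
    "algebra": (7, 10),
    "geometry": (8, 10),
    "statistics": (10, 10),
}
print(topics_needing_upgrade(results))  # → ['algebra']
```

Once the recommended lessons are completed, the same question set would be re-administered as the summative assessment.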

EMM is fully customizable, which means that its content is not exactly the same for all students. Each partnership selects the topics and test items that are appropriate for their own students’ needs.  For example, the University of Derby has created different versions of EMM for engineering students, business students, nursing students, and apprentices.

Overall, both NASBTT and Vretta are committed to raising teachers’ numeracy levels and look forward to developing the EMM assessments further on the basis of this year’s experience. In particular, we would welcome information from other AEA-Europe members with experience of providing numeracy assessment for teachers in training.

For further information about this partnership, email graham.orpwood@vretta.com.

International Computer and Information Literacy Study 2018: Release of results on 5th November 2019

The International Computer and Information Literacy Study (ICILS) responds to a question of critical interest today:

How well are students prepared for study, work and life in a digital world?

More than 46,000 students and 26,000 teachers from 12 countries (Chile, Denmark, Finland, France, Germany, Italy, Kazakhstan, Korea, Luxembourg, Portugal, Uruguay, and the United States) and two benchmarking education systems (Moscow (Russian Federation) and North Rhine-Westphalia (Germany)) participated in the 2018 cycle of the assessment. ICILS 2018 also gathered valuable background information about students’ and teachers’ use of, and attitudes towards, technology.

Over the past four decades, information and computer technology has had a profound impact on our daily lives, work and social interactions. ICILS deals with the core knowledge, skills and understanding that students need to succeed in this dynamic information environment. Participating in ICILS provides countries with reliable, comparable data about young people’s development of 21st-century computer and information literacy (CIL) skills. In addition, ICILS is unique in directly assessing students’ computational thinking skills.

ICILS 2018 results will be released on 5 November 2019 at an event held in Washington D.C., which viewers may also join via web stream. For more information, including access to study results, infographics and video interviews, please visit the IEA website.

International Congress on Violence in Schools: Views from research, interventions and policies as means for the construction of peace

The International Observatory on School Climate and Violence Prevention (the Observatory) is organizing a three-day international congress on Violence in Schools from 1st to 3rd April 2020 in Puerto Vallarta, México. The objective is to discuss the range of ways in which educators and others can work together to foster social, emotional, moral, civic and academic development and optimal climates for learning, as well as to prevent school violence.

The Observatory is a research/policy/practice organization. It was formed twenty years ago in Paris, France, and initially addressed school violence prevention. It has recently moved to Flinders University, Australia, and its focus is expanding to include leaders from policy, practice and research. Today the Observatory contributes to best practice and to the science of school climate and other prosocial approaches (e.g. social emotional learning/SEL, character education, wellbeing and mental health promotion) to K-12 school improvement. Its work also addresses violence prevention efforts that support school safety, children’s healthy development, and school and life success across the world.

The deadline for submitting papers is November 15th, 2019.

To find out more, please visit www.congresocmve.com or contact Jonathan Cohen at jc273@tc.columbia.edu.