Efforts to broaden assessments to include more higher-order thinking skills are an ongoing and challenging process. Typically, assessment focuses on the traditional content knowledge that multiple-choice tests are suited to measure. The Programme for International Student Assessment (PISA), sponsored by the Organisation for Economic Co-operation and Development (OECD), is administered every three years in a cycle that covers reading, mathematical and scientific literacy for 15-year-old students in countries representing more than 90% of the world economy. PISA assessments focus on the capacity of students to extrapolate from what they have learned and to analyze and reason as they pose, solve and interpret problems in a variety of situations. Additionally, the test is designed to have students self-report on their motivation to learn, their beliefs about themselves and their attitudes toward what they are learning, so that countries have a reliable and comprehensive way to inform educational policy and practice from a wide variety of international viewpoints, including those of students.
Pioneering work done by the ETS Assessment Training Institute, led by Rick Stiggins, who has long advocated for and designed systems for student-involved assessment for learning, also acknowledges that students themselves can be and should be their own best critics. In a classroom where one-way communication is the norm, it is typically only the teacher’s judgment that counts; students, like teachers, need the skills and tools through which to learn to critique their own work and that of others. Encouraging such multi-dimensional ways of assessing students’ work and process skills is essential in guiding and informing teachers and students alike about expectations and performance.
Though multiple-choice tests are still the norm, there is a rich variety of classroom assessment tools that go “beyond the bubble”: portfolio analysis, rubrics, and tools for analyzing media creations and communications. As technology for assessing student work improves and costs for alternative types of testing go down, more varied and more appropriate solutions will continue to emerge.
Although assessments designed to measure higher-order thinking skills and to involve students may not be called “media literacy,” these types of assessments lend themselves well to the types of skills and teaching methodology that media literacy fosters.
Sample assessments that have been developed and used by CML are contained in CML’s Media Literacy: A System for Change, and all of CML’s curricular programs contain assessments. Both a sample pre-/post-test and a student self-assessment for production, utilizing CML’s Q/TIPS framework, are included. In a typical implementation program, CML has found that students’ acquisition of content knowledge, as well as their attitudes and even behaviors, is positively impacted by a sound approach to media literacy education.