As people play computer games – to explore simulated worlds, combat foes and overcome challenges – the computer software monitors their progress. It continually collects data about players’ actions, making inferences about their goals and strategies to set appropriate new challenges.
This approach of continually tracking a person’s progress while providing immediate automated responses has been termed ‘stealth assessment’ and it is starting to be applied to educational games and simulations.
The claim is that stealth assessment can test hard-to-measure aspects of learning, such as perseverance, creativity and strategic thinking.
It can also collect information about students’ learning states and processes without asking them to stop and take a test. In principle, stealth assessment techniques could provide teachers with continual data on how each learner is progressing.
Inquiry learning
The term ‘stealth assessment’ was first used by Valerie Shute in 2005 to describe the automated assessment process in Smithtown, a system designed to teach principles of microeconomics (for example, the laws of supply and demand).
Students explored the Smithtown simulated world and altered variables, such as the price of coffee and the incomes of inhabitants.
They engaged in inquiry learning by forming hypotheses and testing predictions. The software employed methods from artificial intelligence to monitor and analyse the students’ actions, giving them feedback to support their inquiry skills without disrupting the game.
Stealth assessment extends adaptive teaching by making continual adjustments to a simulated environment rather than selecting a path or exercise based on the diagnosis of a learner’s knowledge and misconceptions.
The adjustments are based on the learner’s actions while playing the game, such as what evidence the learner collects in the simulated world before making a prediction, or which game characters the learner asks for help.
The assessment is embedded within the flow of the game and the student may not be aware that this dynamic process of monitoring and response is taking place.
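To make this monitoring-and-response loop concrete, here is a minimal sketch in Python. It is an illustration only, not the method of any particular system: the single ability score, the fixed menu of challenges and the simulated outcomes are all assumptions made for the example.

```python
# A minimal sketch of the monitor-and-respond loop described above.
# The single ability score, the fixed menu of challenges and the simulated
# outcomes are illustrative assumptions, not part of any real system.

import random
from dataclasses import dataclass


@dataclass
class Challenge:
    name: str
    difficulty: float  # 0.0 (easy) to 1.0 (hard)


CHALLENGES = [
    Challenge("collect evidence", 0.2),
    Challenge("make a prediction", 0.5),
    Challenge("design an experiment", 0.8),
]


def update_estimate(ability: float, success: bool, rate: float = 0.1) -> float:
    """Nudge the running ability estimate towards the observed outcome."""
    target = 1.0 if success else 0.0
    return ability + rate * (target - ability)


def next_challenge(ability: float) -> Challenge:
    """Pick the challenge whose difficulty best matches the current estimate."""
    return min(CHALLENGES, key=lambda c: abs(c.difficulty - ability))


ability = 0.5  # start with no strong assumption about the player
for step in range(10):
    challenge = next_challenge(ability)
    # In a real game the outcome would come from observing the player's
    # actions; here it is simulated at random for the sake of the example.
    success = random.random() < 0.5 + (ability - challenge.difficulty)
    ability = update_estimate(ability, success)
    print(f"step {step}: {challenge.name}, success={success}, ability={ability:.2f}")
```

A real stealth assessment system would track many competencies at once and treat the game’s own events, rather than random numbers, as the evidence.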
The principles of stealth assessment (see the end of this article) can be complex when carried out by computer, but they are essentially what a good sports coach does when teaching tennis or soccer.
The coach watches students as they practise and sets each one new challenges matched to their level of ability. These challenges are part of the gameplay (such as a serve in tennis or a penalty kick in soccer) rather than a separate test.
All the time, the coach is forming an understanding of each student’s skills and weaknesses.
Assessment design
The pedagogy that underlies stealth assessment is competency learning. The teacher (in the case of stealth assessment, the computer) estimates what the student knows and can do, continually providing tasks and assessments that are matched to the student’s competency.
To do this, the teacher or teaching system must diagnose how the student is performing on specific problems and then infer levels of competency across a network of skills.
The objective is to assess the student’s problem-solving skills, involving knowledge, comprehension and application, while also uncovering the higher-level abilities of creativity and critical thinking.
A successful method for developing stealth assessment games is ‘evidence-centred design’. First, the educational game designer needs to determine what knowledge, skills and competencies will be assessed, so that they can be built into the gameplay.
These attributes cannot be assessed directly (since the game has no direct way of knowing what the student is thinking, and the stealth approach does not set explicit tests of knowledge), so the designer has to work out which behaviours and interactions will provide evidence of a player’s knowledge, skills and competencies.
The game designer then chooses actions that are appropriate to the player’s abilities: setting goals to be achieved, managing conflict and introducing challenges.
The designer builds measures of success and failure into the game as the learner undertakes a mission or solves a game problem.
These measures link together to form a network of probabilities that the learner has gained the desired skill or reached the required level of competency.
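As a rough illustration of how such linked measures might behave, the sketch below applies Bayes’ rule to revise an estimate of a single competency as evidence arrives from gameplay. The competency, the observable behaviours and the likelihood numbers are invented for this example; evidence-centred designs in practice use richer models, often Bayesian networks over several related competencies.

```python
# A sketch of how linked in-game measures might update the probability that a
# learner has a target competency, in the spirit of evidence-centred design.
# The competency, the observable behaviours and the likelihoods are invented
# for this illustration; a real design would derive them from its assessment model.

def update(prior: float, p_obs_if_competent: float, p_obs_if_not: float) -> float:
    """One step of Bayes' rule: revise P(competent) after one piece of evidence."""
    numerator = p_obs_if_competent * prior
    denominator = numerator + p_obs_if_not * (1.0 - prior)
    return numerator / denominator


# Likelihood of observing each behaviour, given competency or its absence.
EVIDENCE_MODEL = {
    "collected data before predicting": (0.8, 0.3),
    "revised hypothesis after failure": (0.7, 0.2),
    "asked a character for a hint": (0.4, 0.6),
}

p_competent = 0.5  # neutral prior about the learner's inquiry skill
for behaviour, (p_if_yes, p_if_no) in EVIDENCE_MODEL.items():
    p_competent = update(p_competent, p_if_yes, p_if_no)
    print(f"after '{behaviour}': P(competent) = {p_competent:.2f}")
```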
Opportunities and challenges
Stealth assessment works best when the assessment strategies, the game and the simulated world are developed together, with evidence-centred design applied not only to the assessment but also to the gameplay, so that game elements are included to stimulate engagement and learning.
A less-successful approach is to add dynamic assessment to an existing game or simulation.
Stealth assessment techniques can give learners immediate feedback on their actions and provide teachers with information on how each learner is developing skills of inquiry, critical thinking, decision-making and creativity.
This work is at an early stage, and it is not yet clear whether the methods of stealth assessment need to be developed afresh for each game and topic, or whether general methods of design can be adopted.
Assessment in practice
An example of a computer game that employs stealth assessment is Portal 2, developed by Valve Corporation.
The player takes the role of Chell, who has to explore an advanced science laboratory, realised as a complex mechanised maze, and find an exit door using a set of tools. The educational aims are for the player to learn aspects of physics, gain visual-spatial skills and develop critical-thinking abilities.
Another, very different, example is TAALES (Tool for the Automatic Analysis of Lexical Sophistication). This analyses the lexical properties of students’ essays (such as word frequency and use of academic language) to assess the students’ vocabulary knowledge. Stealth assessment of student essays with TAALES is being coupled with a system that helps students to improve their essay-writing skills.
Shute and colleagues embedded stealth assessment in the educational game Use Your Brainz, designed for middle-school students to learn problem-solving skills. A study with 55 school students over three days (an hour a day) showed that the computer’s stealth assessment matched standard measures of problem-solving ability, though a large-scale trial is still needed to validate the approach.
Conclusions
The term ‘stealth assessment’ provokes debate. Is it ethical to design a computer system that monitors students’ actions, assessing their skills of problem solving or creativity while purporting to give them an entertaining game?
Would it be more acceptable if students knew they were being continually monitored and assessed – which, after all, is exactly what a good human coach does?
For research projects, these systems can, and should, be developed within strict ethical guidelines: telling learners how they are being monitored and how the information will be used, and gaining informed and willing consent from participants.
But stealth assessment is already being embedded into commercial games and might, for example, be used without players’ knowledge to assess insurance risks.
Stealth assessment offers engaging ways to teach competencies, such as creativity, problem solving, persistence and collaboration, by incorporating dynamic assessment and feedback into computer games.
The methods need to be introduced with care and sensitivity, but early results show promise in combining the engagement of simulation games with the diagnostic power of dynamic assessment.
Principles of stealth assessment
The key principles of stealth assessment are that:
- The software analyses the activities of students within a computer game or simulation
- The system continually adjusts the structure of the game to support learning, for example by offering new challenges matched to the student’s performance
- The system maintains the flow of the game, so that teaching and assessment are part of the game and not separate tests or exams
- The system builds a dynamic model of the learners to indicate their abilities and competencies (a minimal sketch of such a model follows this list)
- It is intended to reduce learners’ anxiety about taking tests by blurring the distinction between assessment and learning while carrying out an accurate diagnosis.
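For illustration only, the dynamic learner model mentioned in the list above might, in its simplest form, look like the sketch below. The skill names, the update rule and the reporting format are assumptions made for this example, not taken from any published system.

```python
# A minimal, hypothetical learner model of the kind referred to above: it keeps
# a running estimate for each competency and updates it from in-game
# observations without interrupting play. The skill names and the update rule
# are assumptions made for this sketch.

class LearnerModel:
    def __init__(self, skills):
        # Start every skill estimate at 0.5 (no evidence either way).
        self.estimates = {skill: 0.5 for skill in skills}

    def record(self, skill: str, success: bool, weight: float = 0.15) -> None:
        """Move the estimate for one skill towards the observed outcome."""
        current = self.estimates[skill]
        target = 1.0 if success else 0.0
        self.estimates[skill] = current + weight * (target - current)

    def report(self) -> dict:
        """The kind of summary a teacher dashboard might display."""
        return {skill: round(value, 2) for skill, value in self.estimates.items()}


model = LearnerModel(["inquiry", "critical thinking", "persistence"])
model.record("inquiry", success=True)
model.record("persistence", success=False)
print(model.report())
```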
Mike Sharples is Emeritus Professor of Educational Technology at The Open University and Honorary Visiting Professor at Anglia Ruskin University, Centre for Innovation in Higher Education.
This article is based on an extract from his book Practical Pedagogy, published by Routledge; PSM readers can receive a 20% discount and free delivery when ordering from routledge.com by entering the code A016; offer ends 31/7/19.