Game-based psychometric assessments: how much ‘game’ is enough?

Game-based psychometric assessments have well and truly arrived on the testing scene. Like any new development, some are excited by this evolutionary stage in the life cycle of psychometric tools, while others are wary of the change. Irrespective of which side of the fence you sit on, there’s one great outcome we can all agree on: developers of psychometric tools are finally starting to think seriously about user experience as part of the development process.

 

The shift towards consumer experience and consumer-focused products is not new in many industries, but in psychometrics there are additional sensitivities and considerations. Compared to other types of technology, the rate of change is slow, and adoption of new approaches predominantly rests on the psychometric qualities of reliability and validity. Can we build a reliable game-based assessment (GBA) that consistently measures the construct of interest? Can we build a valid tool that uses game design and game thinking to target and profile a specific construct?

 

I believe we can, and – what’s more – we have. We provided our GBA tools to Old Dominion University and a global tech company, and their findings support our own conclusions: that we can indeed build valid, reliable game-based assessments that consistently measure a particular construct.

 

I also believe that GBAs will always be preferable to traditional assessments. An interactive assessment that presents multi-dimensional stimuli and scenarios, and tracks players’ behaviour and activity, the outcomes they achieve and the context in which they achieve them, is always going to be better than a conventional question-and-answer assessment.

 

Having said that, building a highly controlled, valid and reliable game-based experience is hard. Really, really hard. It takes a lot more time and involves a much more creative process, one that taps into many disciplines outside Industrial/Organisational Psychology: computer science, game design, front-end development, user experience, instructional design and more.

 

So, how much ‘game’ is enough?

 

While this new technology is exciting – and, to be perfectly honest, very enjoyable to create – we also have to remember that we’re not building recreational games. We’re building tools that provide value to an employer by helping them to identify high-potential candidates. These tools are deployed as part of a high-stakes testing process.

 

On the other hand, we’re also trying to make that testing experience as enjoyable as possible for the candidate, while ensuring it’s fair and equitable, brief and, of course, predictive of their future work behaviour. That’s a lot to consider.

 

So, the question I’d like to analyse here is, how much game is enough? What works in terms of enhancing the candidate experience in what is often a serious and weighty scenario, and what doesn’t?

 

Let’s take a look at three important facets of using GBAs in a recruitment context.

 

1. Design for the User

 

Recruitment has changed a lot since Web 2.0. Applicant Tracking Systems, the infamous Heineken recruitment campaigns, interactive web campaigns, Easter egg hunts and many other initiatives are designed to shorten and improve the application process for candidates, and to engage them more at every stage.

 

And it makes sense: the primary user of your recruitment process is the candidate. However, all too often, organisations take a few steps in the right direction (optimising the recruitment process for the candidate) but don’t look at the full picture. It’s great if you have a fabulous ‘Why work with us?’ page, a nice welcome email, an engaging video message and so on. But then, Bam! Your candidate is asked to complete pages and pages of testing.

 

Forty-five minutes of high-tempo psychometric testing is draining. I personally think it’s on a par with going to the dentist. Most people don’t enjoy it. They see it as a necessary step and understand why it’s important, but trust me, they don’t like it.

 

This also gives us our greatest opportunity. How can we make the testing experience more enjoyable? Well, it’s actually pretty easy. Make candidates the cornerstone of every decision and design choice.

 

Fortunately, the game industry incorporates many disciplines, and player reactions, feelings and experience have been among its primary concerns for decades. Borrowing knowledge from one of the largest industries – one in which great experiences are a serious scientific, technical and artistic endeavour – can challenge traditional psychometric development processes, but it is necessary.

 

During the early development phase of our GBAs, psychometric considerations take a back seat. The complete focus is on game mechanics, game design, reward schedules (if there are any), implicit versus explicit rule sets, feedback systems, signposting and how increasing levels of difficulty are handled, to mention just a few.
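
To make those design levers a little more concrete, here is a minimal sketch, in TypeScript, of the kind of configuration a GBA prototype might expose during playtesting. Every name and value here is hypothetical – it is not drawn from any Revelian product – and serves only to illustrate the decisions listed above.

```typescript
// Hypothetical sketch of the design levers a GBA prototype might expose.
// The names are illustrative only; they show the kinds of decisions
// (reward schedule, rule visibility, feedback, difficulty ramp) that
// dominate early development.

type RuleVisibility = "explicit" | "implicit";

interface DifficultyStep {
  level: number;
  stimuliPerMinute: number;   // pace of the task
  distractors: number;        // competing on-screen elements
}

interface GameDesignConfig {
  rewardSchedule: "none" | "fixed" | "variable";
  ruleVisibility: RuleVisibility;
  feedback: { onCorrect: boolean; onError: boolean; delayMs: number };
  signposting: boolean;       // show hints about what to do next
  difficultyRamp: DifficultyStep[];
}

// One candidate configuration to put in front of playtesters and iterate on.
const prototypeConfig: GameDesignConfig = {
  rewardSchedule: "fixed",
  ruleVisibility: "explicit",
  feedback: { onCorrect: true, onError: true, delayMs: 250 },
  signposting: true,
  difficultyRamp: [
    { level: 1, stimuliPerMinute: 10, distractors: 0 },
    { level: 2, stimuliPerMinute: 16, distractors: 2 },
    { level: 3, stimuliPerMinute: 24, distractors: 4 },
  ],
};
```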

 

Effectively, this equates to a lot of testing, individual interviews, focus groups and iteration. Iterate, iterate, iterate! Even if that means changing, re-designing or dropping ideas within minutes, in the hope of achieving player nirvana. It means challenging our position as experts and embracing the fact that “players know best”. Inevitably, players will experience games in their own meaningful ways, and often in ways that we couldn’t even imagine.

 

So now, instead of simply considering technical aspects of assessment such as reliability, validity, internal consistency, equitable item pools, counterbalancing etc., we have to shift our mindset to include psychological concepts such as flow, autonomy, achievement, onboarding, transitions between tasks and player motivation.

 

“I found that the [GBA] was a very different way to run an online recruitment test. Also as it was quite a fun game you actually forgot that the game was part of an online test! (Participant 34)”

 

Fortunately, motivation is largely taken care of in a recruitment testing process. Testing is an unavoidable aspect of recruitment, but that doesn’t mean we can’t make the candidate experience awesome.

 

More than just awesomeness, however, many candidates expressed that the sense of “challenge”, “variety” and “fun” deeply engaged them in the task at hand and greatly contributed to relieving any test “anxiety” they experienced before commencing the GBA. The GBA “took the nervy part out of a traditional assessment away” (Participant 1122).

 

2. Technology-Enhanced Assessments “For The Win”

 

The technology stack is critical in enabling positive experiences for users. A low-overhead, flexible, native solution allows the majority of users on modern browsers to experience the assessment with minimal challenges. While the technology allows GBAs to process thousands of events per minute and to create psychometric models that provide organisations with an assessment of the candidate, candidates also enjoy GBAs for a variety of other reasons.
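
As a rough illustration of what capturing ‘thousands of events per minute’ can look like on the client side, here is a minimal TypeScript sketch of timestamped event capture with batched uploads. The endpoint, field names and batching threshold are assumptions for the sake of the example, not a description of our actual stack.

```typescript
// Minimal sketch of browser-side event capture for a GBA, assuming a
// hypothetical /telemetry endpoint. Every interaction is timestamped and
// batched; scoring happens server-side against the full event stream.

interface GameplayEvent {
  type: string;        // e.g. "click", "task_complete", "error"
  target: string;      // which game element was acted on
  timestamp: number;   // high-resolution time, ms since page load
  payload?: Record<string, unknown>;
}

class EventRecorder {
  private buffer: GameplayEvent[] = [];

  constructor(private endpoint: string, private flushEvery = 50) {}

  record(type: string, target: string, payload?: Record<string, unknown>): void {
    this.buffer.push({ type, target, timestamp: performance.now(), payload });
    if (this.buffer.length >= this.flushEvery) {
      void this.flush();
    }
  }

  async flush(): Promise<void> {
    if (this.buffer.length === 0) return;
    // Drain the buffer and post the batch as JSON.
    const batch = this.buffer.splice(0, this.buffer.length);
    await fetch(this.endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(batch),
    });
  }
}

// Usage: const recorder = new EventRecorder("/telemetry");
// recorder.record("click", "conveyor-belt", { correct: true });
```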

 

A Dynamic Testing Experience

 

“It was frantic and fast-paced which really got you focused and engaged. It was great that because I became so focused I didn’t feel as nervous as I did before starting. (Participant 299)”

 

Users of GBAs enjoy the technically superior nature of the assessment, as it provides a highly dynamic testing experience. Candidates frequently comment on the differences between traditional online assessments and the GBA. The digitally rich presentation of the assessment, which includes aspects such as the “animations”, “audio”, “design” and “activity”, really frames the GBA as positively different from a regular assessment. As one candidate put it:

 

“The game vibe and different dynamic of testing. Really different. (Participant 309)”

 

Many candidates liked the fact that GBAs are very different from traditional testing and equated this difference with something “new”, “refreshing”, “modern” and “interesting”. Perhaps the underlying factor contributing to the majority of these conclusions is that a large proportion of candidates value the “interactivity” of the assessment. They felt like they were in “control”.

 

“The game-style to the assessment was refreshing. I really loved being able to complete tasks while waiting for other things to finish at my own pace.” (Participant 1367)

 

Others also pointed out the importance of the “practical component” of the assessment or the fact that they were actually “doing something” rather than just “responding to questions”. Most of the candidate commentary relating to interactivity can be summed up by the following comment:

 

“It was very good to do something interactive rather than simply answering questions. Good test that required me to judge multiple things at one and was an enjoyable way to do it. (Participant 1491)”

 

The combination of carefully designed games that promote autonomy and control, and the dynamism of technology-enhanced assessments, results in significant endorsement rates from candidates. In four separate studies, we found that a significant proportion of candidates (at least around 75%) preferred GBAs over traditional assessments and wished that other employers would use them in their recruitment processes. Contrary to popular myths, game-based assessments have been perceived as fairer than traditional assessments because they allow candidates to actually showcase their abilities.

 

The Double-Edged Sword of Technology

 

With technology really coming to the fore, it’s important to understand that it can also be a double-edged sword. Firstly, we have to make sure that the simple things are done right. Candidates have concerns over “game responsiveness”, “input latency”, “click detection”, and “sensitivity”. For example, one candidate felt that whenever she was playing the game her laptop would become very “unresponsive and slow and would not do anything” when she “clicked on certain things”.

 

Because the GBA is timed, input lag, latency and missed inputs were perceived as negative experiences. We take this feedback and do our best to continually improve our technology stack to prevent this kind of situation as much as possible. However, given the level of technology involved, there will always be a small number of people who experience technical issues. It’s important that we recognise this and build quality support mechanisms that help candidates work through these challenges, whether through self-help or vendor intervention.
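
As an illustration of the kind of safeguard this feedback points towards, the sketch below shows one way a client could measure input delay and flag a session that looks laggy, so that device problems can be taken into account rather than silently penalising the candidate. The threshold and the flagging idea are assumptions, not our production approach.

```typescript
// Hedged sketch: detect sluggish input on the client so a timed GBA can
// flag the session. The threshold and flagging mechanism are illustrative.

const LATENCY_THRESHOLD_MS = 150;
const samples: number[] = [];

document.addEventListener("pointerdown", (event) => {
  // event.timeStamp marks when the browser received the input;
  // performance.now() marks when our handler actually ran.
  const handlerDelay = performance.now() - event.timeStamp;
  samples.push(handlerDelay);
});

function sessionLooksLaggy(): boolean {
  if (samples.length < 10) return false;
  const sorted = [...samples].sort((a, b) => a - b);
  const p90 = sorted[Math.floor(sorted.length * 0.9)];
  return p90 > LATENCY_THRESHOLD_MS;
}

// At the end of a level, the result submission could include this flag so
// support staff (or a scoring correction) can account for device issues.
```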

 

Experience Across Devices

 

While technology is an enabling factor and provides positive testing experiences for the majority of candidates, one of the greatest challenges right now is offering an equitable experience across devices of different sizes. Input methods, screen size, device-specific design and affordances all make it tremendously difficult to ensure an equitable testing experience for candidates completing assessments on different devices.

 

If you’re not confident that you can achieve equivalence through affordances or scoring correction, the best advice is to lock the assessment down to a single type of device. The general variability between flagship mobile devices and some of the cheaper models is vast; these differences aren’t as pronounced on desktops and laptops.
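
A simple device gate along these lines might look like the following sketch. The heuristics used here (a coarse pointer and a small viewport) are assumptions for illustration; a real check would be more thorough.

```typescript
// Illustrative sketch of locking a GBA down to desktop/laptop devices when
// cross-device equivalence can't be guaranteed.

interface DeviceCheck {
  allowed: boolean;
  reason?: string;
}

function checkDevice(): DeviceCheck {
  // Touch-first devices report a "coarse" primary pointer.
  const coarsePointer = window.matchMedia("(pointer: coarse)").matches;
  const smallViewport = Math.min(window.innerWidth, window.innerHeight) < 768;

  if (coarsePointer || smallViewport) {
    return {
      allowed: false,
      reason: "Please complete this assessment on a desktop or laptop computer.",
    };
  }
  return { allowed: true };
}

// Before the game starts:
// const check = checkDevice();
// if (!check.allowed) showBlockingMessage(check.reason);
```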

 

Known Psychometric Constructs

 

Similarly, scoring models, engines and processes can be difficult to understand, because GBAs generally assess candidates using a stealthier approach. If you are considering using machine learning or AI to assist with candidate analytics, it’s important that your approach is supported by a known theory of personality, cognition or another established construct.

 

Psychometrics are already mysterious enough for people, and mixing in a ‘black box’ layer of magic over the top may prove to be just a little bit too much for candidates and employers alike.
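
One way to keep the scoring layer transparent is to map behavioural features from the event stream onto a construct through weights that a psychologist can inspect and defend. The sketch below is purely illustrative: the features, weights and normalisation are hypothetical and are not Revelian’s scoring model.

```typescript
// A deliberately transparent scoring sketch: behavioural features extracted
// from the event stream are combined into a construct score using weights
// tied to an explicit rationale. Feature names, weights and ranges are
// hypothetical; the point is interpretability over black boxes.

interface BehaviouralFeatures {
  accuracy: number;          // proportion of correct responses, 0-1
  medianResponseMs: number;  // central tendency of response time
  errorRecoveryRate: number; // how often an error is followed by a correct response
}

// Weights a psychologist can inspect and justify, not learned opaquely.
const weights = {
  accuracy: 0.6,
  speed: 0.25,          // derived from medianResponseMs
  errorRecovery: 0.15,
};

function constructScore(f: BehaviouralFeatures): number {
  // Normalise response time into a 0-1 "speed" index (faster = higher),
  // assuming a plausible 300-3000 ms range for this hypothetical task.
  const speed = Math.max(0, Math.min(1, (3000 - f.medianResponseMs) / 2700));
  return (
    weights.accuracy * f.accuracy +
    weights.speed * speed +
    weights.errorRecovery * f.errorRecoveryRate
  );
}

// Example: constructScore({ accuracy: 0.85, medianResponseMs: 900, errorRecoveryRate: 0.7 })
```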

 

However, with all this said, technological challenges and limitations are not new. They were present when computer-based assessments were introduced three decades ago, and they will continue to be a challenge with modern iterations of psychometric testing.

 

3. What’s the End Game?

 

Employer Brand Implications

 

We know that many candidates perceive the use of GBAs as a positive indicator of an organisation’s uniqueness and innovation. When telling us what they think of employers who use GBAs, candidates used terms such as “market leaders”, “employer of choice”, “innovative” and “modern”. As one candidate put it:

 

“I liked the idea generally. I think it is a good idea to make the recruitment process a bit more fun and this reflects positively on [employer name] as a whole (Participant 772).”

 

Another candidate stated:

 

“I am impressed at [employer name] for this innovative and unique method of assessment (Participant 1064).”

 

The Importance of Face Validity

 

There is one critical question when using any new category of assessment, GBAs included: Why? In other words, what are you looking to achieve? Are you using them because you want candidates to have a positive brand experience? They will. The numbers show it.

 

We do need to consider the fact that not all assessments are created equal. Setting aside the psychometric concepts of reliability and validity, it’s important that assessments are high in face validity – that candidates can see they are assessing something relevant to their suitability for a job. Applicants’ perceptions of face validity have been shown to greatly influence their intentions with regard to discrimination and legal challenges (Smither, Reilly, Millsap, Pearlman & Stoffey, 1993).

 

Factors such as face validity, fairness, control, and performance have been found to influence candidate perceptions of an organisation and also their evaluations of the selection process as a whole (Macan, Avedon, Paese & Smith, 1994).

 

This is actually what separates GBAs from recreational games. There is absolutely no point in chasing moths around a screen, clicking on snowflakes, kicking soccer balls or invading an alien colony unless these activities are related to job performance and appear to measure something of worth, rather than being used for the sake of appearing ‘edgy’.

 

We actually conducted two separate validation studies with two of Revelian’s game-based tools. One of these – Theme Park Hero – is heavily themed, while the other – Cognify – has a more generic, professional design. The endorsement rates, quality of experience and scores for these two assessments were almost identical across two samples totalling a little north of 2,000 candidates.

 

The constant between these two assessments is the high level of face validity across both tools: candidates felt confident that they were being assessed on attributes that were highly relevant to their ability to perform the job.

 

From this, we learned that having a theme and making games fun is a great goal to have. It’s just critical that we also make sure that the games make sense in a high-stakes process, so that candidates feel fairly judged on worthwhile qualities that appear relevant to the job they are applying for.

 

Top Takeaways

 

I’m pretty certain that technology-enhanced assessments such as game-based assessments will always win out over traditional assessment methods, and it won’t be long before they become the norm.

 

That being said, here are my top recommendations to keep in mind if you’re considering using GBAs for your organisation.

 

  • Make sure that your recruitment and assessment processes have the end user firmly in mind. Their onboarding, sense of fairness, engagement and psychology all matter. We have shifted towards a consumer-driven economy, and social media has certainly become the review soapbox of our times.
  • Use technology to create dynamic experiences, so that your candidates feel great while they’re completing an assessment.
  • Don’t try to be edgy just because you want to be part of a cool trend. Ensure that assessments fit into your overall strategy and process.
  • Embed assessments into your broader organisational strategy.
  • Measure success.
  • Listen to candidates and make sure that the assessment you are using feels like it’s measuring something worthwhile.

 


About the Author

 

Salih Mujcic works as a Program Manager in Revelian’s Engineering department, acting as the customer voice and helping to shape product vision and strategy. Recently, Salih’s main responsibility has been driving the development of innovative recruitment products using game-based technology. He is a registered Psychologist with post-graduate studies investigating the impact of game-based assessments on recruitment and selection processes.

 

Salih has a deep interest in games and gaming culture, technology, start-ups, people analytics, organisational culture, and web-based psychometric assessment.