These are the main elements of the review flow that will be very easy to follow once you get started:
- Getting started with an invitation email
- Getting a quick overview of the review you're about to make on the intro page of your reviewer link
- Checking the question index
- Going through the scoring cycle for each question
- Checking your team's review stats (including your scores and the candidate leaderboard!)
Email with invitation to review applications
This email is the first way to know that you'll be a reviewer of short answers submitted by candidates. This email will:
- let you know which role you've been invited to review for.
- contain a unique reviewer link. No painful log-in or password details needed. Just this link, which you can return to at any time in case you can't review all the answers in one go.
The intro page of your reviewer link
After you click on your reviewer link, you'll first see an overview of what to expect as a reviewer. Then on the main landing page, you'll see:
A. The total number of answers that you'll rate.
B. An approximation of how much time it will take to review and score all the answers.
C. A reminder of the job description and the main skills that your team are looking for.
Once you have this overall framework in mind, just click on the 'Review your first question' button.
You'll be taken to the first question. Questions will be displayed in the same order they were shown to candidates.
For each question you'll see:
- number of answers reviewed by you out of the total allocated to you; and,
- the text of each question with its marking criteria and the skills you are looking for, so you have full information before you start rating responses.
You'll see that the platform will take you to question no. 1 by default. However, you can click on the "Next question" button on the top-right if you would like to skip certain question groups or if you'd prefer rating answers in a different order.
|Note: The total number of answers to be reviewed might not be the same for all questions, because it depends on the number of reviews per answer that were set and the number of reviewers assigned to review applications (click here to learn more about the review allocation process).|
Behavioural science in the reviewer flow
Chunking is one of the behavioural design principles that comes into action at this point of the reviewer flow. Evidence shows that it's hard to compare candidates in multiple areas at once, which increases cognitive load. We can also fall prey to the halo effect: if a candidate starts off strong (or weak), that affects everything we read thereafter. That's why, instead of reviewing a candidate's application in full vertically, we make it easy for you to do direct, horizontal comparisons of candidates by reviewing a batch of answers to question 1, then to question 2, and so on.
Scoring all the answers to the same question
Apologies if you find this section a bit long. However, since it's the page where you'll spend most of your time, we want to make sure you have all the details so you can use your time wisely.
The typical scoring cycle
Answers to questions will appear as soon as you decide to score them. This is how the scoring cycle works:
1. You'll be able to see one answer at a time and then score it on a 1-to-5 scale.
2. Below each rating scale you'll see the relevant skills being assessed, known as your Review Guide. With this Review Guide as a framework, click on the relevant star on the scale to indicate how well the candidate answered the question.
3. Once you score an answer, you'll be automatically cycled on to the next answer waiting to be reviewed. Every score you give will be automatically saved (no need to click any "Save" buttons).
4. Points 1-3 repeat until you reach the last answer to a given question. Once you score it, you'll be taken to the next question to start a new rating cycle. When you score the last answer to the last question (i.e. your very last review), you'll be taken to the review stats.
Behavioural science in the reviewer flow
Rules for Scoring, Anonymisation and Randomisation are behavioural design principles that come into action at this point of the reviewer flow:
Rules for Scoring. Your Review Guide will help you to be objective and consistent when identifying good, not-so-good and bad answers (click here to learn more about the typical structure of a Review Guide).
Anonymisation. Names and other socio-demographic details can be distracting and result in inadvertent bias. That's why the review flow removes names and all candidate personal details so you just focus on who's given good answers.
Randomisation. Our brains are heavily affected by ordering effects and small contextual factors around us. If we're hungry or tired, our scores are less reliable. All of this leads to lots of noise in the reviews, and a lack of objectivity about who's really good. That's why the review flow randomises the order of all candidate responses every time you start a scoring cycle. So answer 1 of question 1 and answer 1 of question 2 are not necessarily from the same candidate. That way, no candidate is disproportionately advantaged or disadvantaged by where they show up in the pile.
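For the technically curious, the independent shuffling described above can be sketched like this. This is a minimal illustration only, not Applied's actual implementation; all names here are hypothetical:

```python
import random

# Hypothetical data: each question holds answers from the same four candidates.
candidates = ["candidate_a", "candidate_b", "candidate_c", "candidate_d"]
answers_by_question = {
    "q1": list(candidates),
    "q2": list(candidates),
}

def start_scoring_cycle(question):
    """Return a freshly shuffled answer order for one question.

    Because each question's answers are shuffled independently, the
    candidate shown first for q1 need not be the candidate shown
    first for q2 -- position in the pile carries no information.
    """
    order = list(answers_by_question[question])
    random.shuffle(order)
    return order

print(start_scoring_cycle("q1"))
print(start_scoring_cycle("q2"))
```

The key design point is that the shuffle happens per question and per cycle, so ordering effects can't systematically favour or penalise any one candidate.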
Scoring the last answer to the last question of the review flow
Once you do, you'll be taken to the review stats page. So we recommend you go SLOW at this point of the review flow, because it won't be possible to go back and re-rate responses.
Rechecking the scores you've given until a certain point
If it's possible to recheck a specific score (click here to learn about possible scenarios), you just need to click on the coloured grid under the progress bar. The number of cells indicates the total number of answers per question to be reviewed. The colour coding of each cell helps you easily identify the distribution of scores: the redder the cell, the lower the score; the greener, the higher.
Getting review stats
We love to give feedback. We are also aware of the effort you've put into reviewing applications so you deserve a cup of tea and a cake, along with some cool insights that will let you know more about the review process. At this point of the review flow, you'll be able to check:
- The distribution of scores given by each member of the reviewing team.
- The level of disagreement between you and the rest of the reviewers, and their review progress.
- The candidate leaderboard.
- An opportunity to give us feedback. We'd love to know more about your experience using Applied.
What happens after finishing a review?
You and your team can shortlist candidates based on the scores you've given (click here to learn how).