As part of the reviewing process on Applied, we recommend choosing a team of three reviewers to assess candidate applications. The Applied platform lets you use a series of Sift questions chosen to evaluate candidates' skills and suitability for the role.
Reviewers rate each answer on a scale of 1-5 stars based on how well the candidate has answered the question. Answers are randomised and anonymised, so reviewers have no information about the candidates and rate each answer on its own merits alone. A review guide is included for each question to keep the process as unbiased as possible!
Reviews are then tallied up, and each candidate's ratings are averaged into an overall score that is used to decide whether they move on to further assessment stages.
Why does Applied work like this?
According to research, CVs and cover letters are not the best way to find quality candidates: they invite a high degree of subjectivity and are poor predictors of how well a candidate will perform on the job. This is what Applied attempts to solve by using role-related questions.
When a reviewer has finished scoring candidates, their score is averaged with other reviewers’ scores. This is to make use of the “wisdom of the crowd”, and to avoid candidates being penalised by harsher reviewers, or boosted by generous ones. This averaged score can then be used to rate candidates overall.
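To make the averaging step concrete, here is a minimal sketch of how per-answer scores could be combined across reviewers. This is an illustration only, not Applied's actual implementation; the reviewer names and ratings are made up.

```python
# Hypothetical sketch: averaging 1-5 star ratings across reviewers so that
# no single harsh or generous reviewer dominates a candidate's final score.
# Data and names are illustrative, not from the Applied platform.
from statistics import mean

# Each reviewer's star ratings for one candidate's three answers.
reviewer_scores = {
    "reviewer_a": [4, 3, 5],
    "reviewer_b": [2, 3, 4],
    "reviewer_c": [3, 3, 5],
}

# Average each answer's ratings across the three reviewers ("wisdom of the crowd").
per_answer = [mean(scores) for scores in zip(*reviewer_scores.values())]

# The candidate's overall score is the mean of their per-answer averages.
candidate_score = mean(per_answer)
```

The key point is the order of aggregation: each answer is averaged across reviewers first, then the per-answer averages are combined into one overall candidate score.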
Is Applied biased towards candidates who can write well?
Wait a minute! Isn’t the Applied way biased toward candidates with stronger writing ability? No, not at all! Application questions are not designed to reward a candidate’s writing ability, but rather to assess the content of the answer as evidence of the skills needed for the role. To keep this as fair as possible, remember that each question also comes with a review guide specifying how to review each answer.
How the reviewing process works
1. Reviewers receive an email inviting them to anonymously review applications.
2. Using the link in the email, reviewers see a summary of the questions that they will be reviewing.
3. The reviewing screen has the following sections:
- Question - the question that candidates were asked to answer.
- Answer - one of the candidates' answers. It has been anonymised, grouped with other candidates' answers and shown to the reviewer in a random order.
- How to rate - the marking criteria outlining what a poor (1-star), average (3-star) and outstanding (5-star) answer looks like.
- Rate this answer - where the answer is scored on a scale of 1-5 stars.
4. Reviewers are guided to consider the question, the relevant skills and the review guide when making their decision.
5. Reviewers rate each answer for a specific question, before moving on to answers for the next question.
6. Once finished, reviewers get a chance to go back to check their scoring or to submit their scores.
Once submitted, reviewers cannot change their scores. Scores are sent back to the Applied platform where they are combined with other reviewers' scores to calculate an average.
Feedback on reviewing style
All reviewers are given feedback on their reviewing style. The intention of this feedback is to:
- Illustrate how they have contributed to the reviewing process.
- Show which of their reviewing peers they are most similar/different to.
- Give an insight into their own personal reviewing style.
What happens once reviewing is finished?
Once all review scores have been submitted, they are combined with the scores from the other reviewers, re-aggregated at the individual candidate level, and used to determine who is shortlisted for interview.
This is what the review scores look like to the administrator running the role:
Once all reviewers have submitted their scores, the admin will use this information to move candidates above a certain score (set by the organisation) to interview.
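The shortlisting step above can be sketched as a simple threshold filter. This is a hypothetical illustration: the cutoff value, candidate names and scores are invented, and the real platform handles this for the admin.

```python
# Hypothetical sketch: moving candidates above an organisation-set score
# threshold to interview. All names and numbers are illustrative only.
THRESHOLD = 3.5  # assumed cutoff chosen by the organisation

# Averaged scores after all reviewers have submitted.
averaged_scores = {
    "candidate_1": 4.2,
    "candidate_2": 3.1,
    "candidate_3": 3.8,
}

# Candidates at or above the threshold are shortlisted for interview.
shortlist = [name for name, score in averaged_scores.items() if score >= THRESHOLD]
# shortlist -> ["candidate_1", "candidate_3"]
```

Because scores were already averaged across reviewers, the threshold compares like with like: every candidate is judged against the same cutoff on the same aggregated scale.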
We hope you found this article useful! If you have any more questions, please get in touch at firstname.lastname@example.org