Setting up an online application review workflow

Using Reviewr to set up an efficient and engaging online application review workflow

In this webinar we discuss the key elements within Reviewr used to set up the most efficient and engaging online application review workflows for scholarships, grants, recognition awards, and more.

In this webinar we discuss:
  • Common methods for assigning applicants to review committees
  • Blinding of data
  • Giving the review teams access to submissions
  • Collecting online evaluations and note taking
  • Team deliberations

Assigning applicants to review teams

You’ve launched your website, collected online applications, and now it’s time to review them – easy, right? Not so fast. How do you create a review workflow that is efficient for your review committee while also being fair, equitable, compliant, and engaging? How do you ensure the review committee has all the content they need to review without overwhelming them? Let’s dive into the most common workflows for assigning applications to review teams.

  1. Option 1 – Assigning all applications to all reviewers
    1. Simple approach
    2. Can assign all applicants to all reviewers in bulk
    3. Can manually adjust if needed
    4. Potentially overwhelming to the committee if there is a large number of applications
  2. Option 2 – “Bucket method”
    1. This approach is based on the concept of review committee groups. Examples include:
      1. Categories
      2. Phases
      3. Workflows
      4. Locations
      5. Review teams
    2. This option allows you to segment applications into more manageable groupings. 
    3. Allows each group to have its own review team
    4. Each group can also have a “manager”.
  3. Option 3 – Hybrid approach
    1. Leverage buckets with advanced pairing options
      1. Randomization (see the sketch after this list)
      2. Manual Pairings within buckets
    2. Also works for progression-based reviews.
      1. Each bucket is a “phase”
      2. Each phase can use its own review team and evaluation criteria
      3. Top ranked applicants in a bucket can advance to other buckets
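
As a rough illustration of the randomized pairing idea, here is a short Python sketch. It is not Reviewr’s actual implementation; the function names, data structures, and reviewer counts are made up for illustration, but it shows how applications in each bucket could be randomly paired with a fixed number of reviewers from that bucket’s team.

    import random

    def assign_to_buckets(applications, teams, reviewers_per_app=3, seed=42):
        """Randomly pair each application with reviewers from its bucket's team.

        `applications` maps bucket name -> list of application IDs;
        `teams` maps bucket name -> list of reviewer names.
        All names and structures here are illustrative, not Reviewr's API.
        """
        random.seed(seed)  # fixed seed so the pairing can be reproduced if audited
        assignments = {}
        for bucket, app_ids in applications.items():
            reviewers = teams[bucket]
            for app_id in app_ids:
                # sample without replacement so no reviewer gets the same application twice
                assignments[app_id] = random.sample(
                    reviewers, k=min(reviewers_per_app, len(reviewers))
                )
        return assignments

    # Example: two buckets (e.g., categories or phases), each with its own review team
    apps = {"STEM": ["A-101", "A-102"], "Arts": ["A-201"]}
    teams = {"STEM": ["Dana", "Lee", "Priya", "Sam"], "Arts": ["Kim", "Jo", "Alex"]}
    print(assign_to_buckets(apps, teams))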

Blinding Data

Blinding of applicant data from review teams has become increasingly popular with Reviewr customers. This is typically done for two reasons:

  1. To ensure a fair, equitable review process
  2. To hide information from review teams that does not impact their review (avoiding overwhelming them with irrelevant details)

What data can be blinded?

  1. Form data
    1. Can hide specific form fields from review teams (a simple sketch follows this list). Examples include:
      1. Name
      2. Address
      3. Email
      4. Demographic info
      5. PII
    2. Other form field data that can be hidden includes open-ended text fields. However, this type of blinding hides the entire text box, not just the PII within the text. If a customer wants to redact content written within a text field, they can edit the submission form themselves.
    3. Uploads
      1. While content WITHIN an upload cannot be blinded (without downloading, redacting, and re-uploading), the actual files themselves can be hidden.
      2. These files are often hidden from the applicants as well if they are submitted externally, such as a guest attachment for a reference letter.
    4. User interface
      1. Within Reviewr there are user interfaces and columns for things such as name, company, etc. These columns also need to be hidden from the review team’s view to ensure compliance in blinding.
    5. Data within the evaluation
      1. It is common practice for the actual review evaluation and/or notes to be shared back to applicants as a feedback sharing mechanism. 
        1. If this is desired, the name of the person that left the evaluation can be blinded.
        2. The evaluation itself can have certain elements blinded, but not others. For example, share the notes but not the scores, or vice versa
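
To make the whole-field blinding idea concrete, here is a minimal Python sketch. The field names and function below are hypothetical, not Reviewr functionality; it simply shows a submission being stripped of designated fields before a reviewer sees it.

    # Fields assumed to contain PII for this illustration; the real list depends on the program
    BLINDED_FIELDS = {"name", "address", "email", "demographics"}

    def blind_submission(submission: dict) -> dict:
        """Return a copy of the submission with blinded fields removed.

        Whole fields are dropped; PII written inside free-text answers is NOT
        detected or redacted, mirroring the whole-field blinding described above.
        """
        return {k: v for k, v in submission.items() if k not in BLINDED_FIELDS}

    submission = {
        "name": "Jordan Rivera",
        "email": "jordan@example.com",
        "essay": "My project focuses on...",
        "budget": 5000,
    }
    print(blind_submission(submission))  # reviewers see only 'essay' and 'budget'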

Granting access and conducting reviews

Submissions have been assigned to review teams. Check. Data has been blinded. Check. Now it’s time to begin reviewing.

First, let’s ensure that the review team has been added and assigned.

  1. Review teams can be added individually or uploaded
  2. Once added, the review team then gets assigned to a review bucket (see item #1 in this blog).
  3. Once the review team is added, and assigned, it’s time to trigger credentials.
    1. Invite them to register
    2. Assign them a temp password
  4. Once the review team logs in, they are presented with a “menu” of submissions to review.

Having an online review process is critical to ensuring both an efficient and compliant workflow. Tools like Reviewr not only streamline operations but also eliminate a large percentage of the human error involved in tabulating results and collecting evaluations.

  1. Scorecards are a powerful tool to leverage math-based scoring for fairness (see the sketch after this list).
    1. Simple scorecards
    2. Weighted scoring
    3. Averages
  2. Note taking
    1. Log internal notes to other users as a way to “virtually deliberate”.
    2. Leave external questions and comments directly to applicants if desired.
  3. Deliberation
    1. Sometimes scores do not tell the full story. 
    2. Leverage Reviewr’s advancement user interface to view total scores, average scores, and the underlying data as a team.
    3. Let the scores and notes power internal discussions and team deliberations.
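
As a rough sketch of how weighted scoring and averaging typically work (the weights, criteria, and functions below are illustrative, not Reviewr’s actual formulas):

    def weighted_score(scores: dict, weights: dict) -> float:
        """Combine one reviewer's criterion scores into a single weighted total.

        Weights are normalized so the result stays on the same scale as the raw scores.
        """
        total_weight = sum(weights.values())
        return sum(scores[c] * weights[c] for c in scores) / total_weight

    def committee_average(evaluations: list, weights: dict) -> float:
        """Average the weighted scores left by each reviewer on one application."""
        return sum(weighted_score(e, weights) for e in evaluations) / len(evaluations)

    weights = {"impact": 3, "feasibility": 2, "clarity": 1}
    evaluations = [
        {"impact": 9, "feasibility": 7, "clarity": 8},  # Reviewer 1
        {"impact": 8, "feasibility": 6, "clarity": 9},  # Reviewer 2
    ]
    print(round(committee_average(evaluations, weights), 2))  # 7.83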
