Review Scoring

Collect scores for your submissions by asking your reviewers any number of scoring questions.

Give numeric scores friendly labels, such as Excellent or Good, to help reviewers with their assessment.

Score submissions on different criteria, e.g. quality of content, relevance, or quality of English, and weight the scores so that the most important criteria carry the most significance when it comes to making decisions.

Combine scores from multiple reviewers using averages, standard deviation, and other measures to give you a complete picture of your submissions.

  • Customise Labels: customise the numerical score labels.
  • Multiple Criteria: collect scores on multiple criteria.
  • Weighted Scoring: give different criteria different weighting.
  • Combine Scores: combine and tally scores for easy decision making.

Customise Labels

For most scoring, a numerical scale will provide a clear means for grading a submission.

But sometimes numbers alone are not enough, and textual descriptions can make the scoring much more tangible for reviewers.

Within Firebird, each numerical score can be given its own label to help reviewers make their choice. Scores can be displayed alongside the labels or hidden as required.

The numerical scores implied by the selected choice can then be combined and weighted as needed.

Example

In this example, the numerical value for the score is hidden from reviewers:

Label       Score
Excellent       5
Good            4
Average         3
Poor            2
Very Poor       1
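The label-to-score mapping above can be sketched in code. This is a minimal illustration in Python; the names are hypothetical, not Firebird's actual configuration format:

```python
# Map each reviewer-facing label to its hidden numeric score.
# Labels and values mirror the example table above; the dictionary
# and function names are illustrative, not Firebird's API.
SCORE_LABELS = {
    "Excellent": 5,
    "Good": 4,
    "Average": 3,
    "Poor": 2,
    "Very Poor": 1,
}

def score_for(label: str) -> int:
    """Return the numeric score implied by a reviewer's chosen label."""
    return SCORE_LABELS[label]

print(score_for("Good"))  # → 4
```

Because reviewers only ever see the labels, the numeric values behind them can be adjusted later without changing the review form itself.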

Multiple Criteria

You can collect scoring and feedback on any number of criteria in your review forms.

Each criterion can be given its own weighting to give an accurate summary of the submission.

Examples of criteria that could be scored include:

  • Relevance of Topic
  • Quality of English
  • Technical Merit
  • Originality
  • Presenter Quality

Weighted Scoring

Scores collected in the review form can be weighted, as required, to produce the most meaningful information for your submissions and reviews.

These scores can then be combined into a total score that is correctly weighted. This means that a single combined score can be quickly assessed to determine the quality of any one submission and to make decisions about that submission.

Example


In this example, a review has two questions:

  • Technical Merit
  • Relevance of Topic

Each question allows the reviewer to choose from a range of multiple-choice options in their review form. In this example, the scores for each question follow a simple linear scale with no weighting:

Technical Merit

Grade       Score
Excellent       5
Good            4
Average         3
Poor            2
Very Poor       1

Relevance of Topic

Grade               Score
Very Relevant           3
Slightly Relevant       2
Not Relevant            1

However, it may be that Technical Merit is more important than Relevance of Topic when assessing a submission and so the individual scores could be given a weighting. Also, it may be that Excellent Technical Merit needs a stronger weighting so that it stands out from the rest even if the submission is not relevant.

So, this revised scoring could be implemented to give a better idea of how good a submission is for the project:

Technical Merit

Grade       Weighted Score
Excellent               10
Good                     6
Average                  4
Poor                     2
Very Poor                0

Relevance of Topic

Grade               Weighted Score
Very Relevant                    2
Slightly Relevant                1
Not Relevant                     0

For each project, any number of questions, options, and scores can be configured to help you get the perfect scoring system.
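The revised weighted scales above can be sketched as follows. This is an illustrative example in Python; the data structures are assumptions for the example, not Firebird's internals:

```python
# Weighted scales from the tables above: each grade maps directly to
# its weighted score. The structure is illustrative, not Firebird's.
WEIGHTED_SCALES = {
    "Technical Merit": {
        "Excellent": 10, "Good": 6, "Average": 4, "Poor": 2, "Very Poor": 0,
    },
    "Relevance of Topic": {
        "Very Relevant": 2, "Slightly Relevant": 1, "Not Relevant": 0,
    },
}

def weighted_total(answers: dict) -> int:
    """Sum the weighted scores implied by one review's chosen grades."""
    return sum(WEIGHTED_SCALES[question][grade]
               for question, grade in answers.items())

# An Excellent but irrelevant submission still scores 10, so strong
# technical merit stands out, as the weighting intends.
print(weighted_total({"Technical Merit": "Excellent",
                      "Relevance of Topic": "Not Relevant"}))  # → 10
```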

Combine Scores

Each review can have any number of criteria questions that can be weighted and combined in a number of ways.

Scores can be combined for a single review and also combined for a single submission.

This flexibility enables Firebird to collect and summarise the right data for fast and effective decision making.

Scores can be combined in many different ways including:
  • Sum
  • Mean
  • Mode
  • Median
  • Standard Deviation
  • Tally
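The combination methods listed above correspond to standard statistical operations. A quick sketch using Python's standard library, with the scores being the three Technical Merit values from the worked example that follows:

```python
import statistics
from collections import Counter

scores = [10, 8, 4]  # Technical Merit scores from three reviews

print(sum(scores))                # Sum → 22
print(statistics.mean(scores))    # Mean → 7.33...
print(statistics.mode(scores))    # Mode (first most common value)
print(statistics.median(scores))  # Median → 8
print(statistics.stdev(scores))   # Sample standard deviation
print(Counter(["Accept", "Accept", "Reject"]))  # Tally of recommendations
```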

Example

In this example, a submission has 3 reviews and each review collects scores for two criteria (Technical Merit and Relevance of Topic) plus a recommendation from the reviewer:

Review 1

Question             Score
Technical Merit      10
Relevance of Topic   2
Recommendation       Accept

Review 2

Question             Score
Technical Merit      8
Relevance of Topic   3
Recommendation       Accept

Review 3

Question             Score
Technical Merit      4
Relevance of Topic   3
Recommendation       Reject

A total score and mean score could be calculated for each review, and a mean score and recommendation tally could be calculated for the submission:

Combined Scores

Combined Score   Review 1   Review 2   Review 3
Total Score      12         11         7
Average Score    6          5.5        3.5

Submission

Combined Score            Value
Technical Merit Mean      7.33
Relevance of Topic Mean   2.67
Recommendation Tally      Accept x 2, Reject x 1
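The combined figures above can be reproduced with a short calculation. Here is a sketch in Python; the data layout is an assumption for illustration, not Firebird's data model:

```python
from collections import Counter
from statistics import mean

# The three reviews from the example above.
reviews = [
    {"Technical Merit": 10, "Relevance of Topic": 2, "Recommendation": "Accept"},
    {"Technical Merit": 8,  "Relevance of Topic": 3, "Recommendation": "Accept"},
    {"Technical Merit": 4,  "Relevance of Topic": 3, "Recommendation": "Reject"},
]

# Per-review combined scores.
totals = [r["Technical Merit"] + r["Relevance of Topic"] for r in reviews]
averages = [t / 2 for t in totals]  # two scored questions per review

# Per-submission summaries.
merit_mean = round(mean(r["Technical Merit"] for r in reviews), 2)
relevance_mean = round(mean(r["Relevance of Topic"] for r in reviews), 2)
tally = Counter(r["Recommendation"] for r in reviews)

print(totals)          # → [12, 11, 7]
print(averages)        # → [6.0, 5.5, 3.5]
print(merit_mean)      # → 7.33
print(relevance_mean)  # → 2.67
print(tally)           # → Counter({'Accept': 2, 'Reject': 1})
```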

The summary information in the submission can then be used for fast and effective decision making.

Need some help?

Talk to us

Every event is different, and it can be hard to know what you need for your particular project.

We also understand that for first timers, the entire process can be extremely daunting which is why we provide more than software. For us, it's about giving you a personalised service too.

Come and talk to our experts who have personally supported thousands of events across every imaginable industry and academic subject.

© Firebird Conference Systems Ltd 2020. All rights reserved.