Reviewing Submissions
Reviewing submissions is where the real value of Castora comes through. Instead of watching tapes in isolation and sharing opinions via email, your entire casting team scores submissions against the same criteria, in the same place, with full visibility into each other's assessments.
The review workflow
When a performer submits their self-tape, every assigned reviewer is notified. The typical review process follows these steps:
1. Watch the submission — view the self-tape and any accompanying materials.
2. Score against criteria — rate each criterion on the scorecard independently.
3. Add comments — leave notes on specific moments or overall impressions.
4. Set a verdict — once all reviewers have scored, mark the final decision.
Watching submissions
The submission player
When you open a submission, you'll see the self-tape video player alongside the performer's details and any materials they've included (headshots, CVs, additional clips).
The player includes:
- Playback controls — play, pause, scrub, and adjust playback speed (0.5×, 1×, 1.5×, 2×).
- Timestamp markers — click any existing comment marker to jump to that moment in the tape.
- Full-screen mode — for detailed evaluation of framing, expression, and physical performance.
Reviewing materials
Below the video player, you'll find any additional files the performer submitted. Click to open headshots, CVs, or supplementary clips in a preview panel without leaving the review screen.
Scoring submissions
How scoring works
Each submission is evaluated against the criteria defined in the audition's scorecard. Every reviewer scores independently — you won't see other reviewers' scores until you've completed your own.
This blind scoring approach prevents anchoring bias, where early scores influence later reviewers' assessments.
Completing your scorecard
For each criterion, select a rating from 1 to 5:
| Rating | Meaning |
|---|---|
| 1 | Does not meet requirements |
| 2 | Below expectations |
| 3 | Meets expectations |
| 4 | Exceeds expectations |
| 5 | Outstanding |
Consider each criterion on its own merits. A performer might score a 5 on performance quality but a 3 on technical execution — that's expected and exactly what the weighted system is designed to handle.
Weighted scores
Once you submit your ratings, Castora calculates the weighted average based on the criteria weights set during audition creation. For example:
| Criterion | Weight | Your Rating | Weighted Score |
|---|---|---|---|
| Performance quality | 40% | 5 | 2.0 |
| Character fit | 30% | 4 | 1.2 |
| Technical execution | 20% | 3 | 0.6 |
| Direction potential | 10% | 4 | 0.4 |
| Total | 100% | | 4.2 |
The weighted total gives you a single comparable score across all submissions while respecting the priorities your team has set.
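To make the arithmetic explicit, here is a minimal sketch of the weighted-total calculation, assuming weights are stored as fractions that sum to 1. The `CriterionScore` shape and `weightedTotal` function are illustrative only, not Castora's actual data model or API.

```ts
// Illustrative only: a criterion's weight (as a fraction) and a reviewer's 1-5 rating.
interface CriterionScore {
  name: string;
  weight: number; // e.g. 0.4 for 40%
  rating: number; // 1 to 5
}

// Weighted total = sum of (weight x rating), assuming the weights sum to 1.
function weightedTotal(scores: CriterionScore[]): number {
  return scores.reduce((total, s) => total + s.weight * s.rating, 0);
}

const example: CriterionScore[] = [
  { name: "Performance quality", weight: 0.4, rating: 5 },
  { name: "Character fit", weight: 0.3, rating: 4 },
  { name: "Technical execution", weight: 0.2, rating: 3 },
  { name: "Direction potential", weight: 0.1, rating: 4 },
];

console.log(weightedTotal(example)); // ≈ 4.2, matching the table above (allow for floating-point rounding)
```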
Aggregate scores
Once multiple reviewers have scored a submission, Castora shows:
- Average weighted score — the mean of all reviewers' weighted totals.
- Score range — the highest and lowest weighted totals, highlighting any significant disagreements.
- Per-criterion breakdown — average scores for each criterion across all reviewers.
- Reviewer-by-reviewer detail — individual scores from each team member (visible after you've submitted your own).
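As a rough sketch of this aggregation, assuming each reviewer's weighted total is available as a plain number (the `aggregate` helper below is hypothetical, not part of Castora):

```ts
// Illustrative aggregation over each reviewer's weighted total for one submission.
function aggregate(weightedTotals: number[]) {
  const average =
    weightedTotals.reduce((sum, t) => sum + t, 0) / weightedTotals.length;
  const high = Math.max(...weightedTotals);
  const low = Math.min(...weightedTotals);
  return { average, high, low };
}

// Three reviewers' weighted totals for the same submission:
console.log(aggregate([4.2, 3.6, 4.4]));
// { average: ≈4.07, high: 4.4, low: 3.6 } — a wide high/low gap flags a disagreement worth discussing
```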
Adding comments
Timestamped comments
While watching a self-tape, you can leave comments tied to specific moments in the video. Click the comment icon or press C at any point during playback to create a timestamped note.
Timestamped comments appear as markers on the video timeline, making it easy for other reviewers to jump directly to the moment you're referencing.
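Conceptually, a timestamped comment is just a note paired with a playback position. The sketch below shows one plausible shape and how a position might be rendered as a timeline marker label; the `TimestampedComment` interface and `markerLabel` helper are assumptions for illustration, not Castora's internals.

```ts
// Illustrative shape for a timestamped comment: the note plus the playback
// position (in seconds) it refers to.
interface TimestampedComment {
  author: string;
  atSeconds: number;
  text: string;
}

// Format a playback position as an m:ss marker label.
function markerLabel(atSeconds: number): string {
  const minutes = Math.floor(atSeconds / 60);
  const seconds = Math.floor(atSeconds % 60);
  return `${minutes}:${seconds.toString().padStart(2, "0")}`;
}

const note: TimestampedComment = {
  author: "Priya",
  atSeconds: 83,
  text: "Lovely shift in tone on the second read of the line.",
};

console.log(`[${markerLabel(note.atSeconds)}] ${note.text}`); // "[1:23] Lovely shift in tone..."
```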
General comments
For observations about the overall submission — not tied to a specific moment — use the general comment field below the video player. These are visible to all reviewers on the audition.
Comment visibility
Comments are visible to reviewers only. Performers cannot see internal review comments or scores. This gives your team a safe space to discuss candidly without worrying about external visibility.
Setting verdicts
Once your team has reviewed a submission, it's time to make a decision. Verdicts categorise submissions into three outcomes:
Shortlisted
The performer is being seriously considered for the role. Shortlisted submissions are flagged at the top of your submission list for easy access during callbacks or final decisions.
Held
The performer shows promise but isn't a definitive yes yet. Use this for strong candidates you want to keep in the running while you continue reviewing other submissions.
Passed
The performer won't be moving forward for this role. Setting a pass verdict helps your team focus attention on remaining candidates.
Who can set verdicts?
By default, any reviewer can propose a verdict. The audition creator or an admin can configure verdicts to require consensus — meaning all reviewers must agree before a verdict is finalised.
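For illustration, a consensus rule can be thought of as a check that every assigned reviewer has proposed the same outcome. The `Verdict` type and `consensusVerdict` helper below are a hypothetical sketch, not Castora's implementation.

```ts
type Verdict = "shortlisted" | "held" | "passed";

// Under a consensus rule, a verdict is only finalised when every assigned
// reviewer has proposed the same outcome; otherwise it stays open (null).
function consensusVerdict(proposed: Verdict[], reviewerCount: number): Verdict | null {
  if (proposed.length < reviewerCount) return null; // still waiting on reviewers
  const [first, ...rest] = proposed;
  return rest.every((v) => v === first) ? first : null; // null = no agreement yet
}

console.log(consensusVerdict(["shortlisted", "shortlisted", "shortlisted"], 3)); // "shortlisted"
console.log(consensusVerdict(["shortlisted", "held"], 3)); // null (not all reviewers have weighed in)
console.log(consensusVerdict(["shortlisted", "held", "shortlisted"], 3)); // null (no agreement)
```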
Comparing candidates
Side-by-side comparison
Select two or more submissions from the submission list to enter comparison mode. This view shows:
- Video players for each selected submission, synchronised for simultaneous playback
- Score breakdowns side by side
- Reviewer comments for each candidate
Comparison mode is particularly useful during the final stages of casting when you're choosing between shortlisted candidates.
Score rankings
The Rankings tab shows all submissions ordered by their aggregate weighted score. This gives you a data-driven view of your candidate pool, though scores should always be considered alongside qualitative review and discussion.
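Conceptually, a ranking like this amounts to sorting by average weighted score, highest first. The sketch below illustrates the idea with hypothetical names and scores; it is not Castora's implementation.

```ts
// Illustrative ranking: order submissions by their average weighted score, highest first.
interface RankedSubmission {
  performer: string;
  averageWeightedScore: number;
}

function rankSubmissions(subs: RankedSubmission[]): RankedSubmission[] {
  return [...subs].sort((a, b) => b.averageWeightedScore - a.averageWeightedScore);
}

const pool: RankedSubmission[] = [
  { performer: "A. Okafor", averageWeightedScore: 3.9 },
  { performer: "L. Moreau", averageWeightedScore: 4.3 },
  { performer: "S. Reyes", averageWeightedScore: 4.1 },
];

console.log(rankSubmissions(pool).map((s) => s.performer));
// [ "L. Moreau", "S. Reyes", "A. Okafor" ]
```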
Review best practices
- Score before reading other comments — complete your scorecard before looking at other reviewers' assessments to avoid bias.
- Use the full scale — if everyone rates everything a 3 or 4, the scoring system loses its value. Be willing to give 1s and 5s when warranted.
- Timestamp your key observations — future-you (and your team) will appreciate being able to jump straight to the moment that made you sit up.
- Don't rush — watch the full tape at least once before scoring. First impressions matter, but so does giving every performer a fair evaluation.
- Discuss disagreements — if two reviewers are far apart on a score, use the Audition Room to talk it through. Different perspectives often lead to better decisions.
- Set verdicts promptly — performers are waiting. The faster you can move through reviews and decisions, the better experience everyone has.