Review Workflow — Rubrics, Reviewers & Submissions

The review workflow ensures fair, structured evaluation of every abstract submission. Build rubrics with weighted criteria, assign reviewers, and track review progress.

Go to Settings → Rubric to build your evaluation criteria.

  1. Go to Settings → Rubric.
  2. Click + Add Criteria.
  3. Choose the criteria type:
    • Scale — numeric rating (e.g., 1–5 or 1–10)
    • Multiple Choice — select from predefined options
    • Yes / No — binary evaluation
    • Text — free-form text feedback
  4. Fill in:
    • Title — criteria name (e.g., “Originality”, “Methodology”, “Relevance”)
    • Description — what the reviewer should evaluate
    • Weight — percentage weight in the final score; all weights should sum to 100% (see the validation sketch after these steps)
    • Required — whether the reviewer must complete this criterion
  5. Configure type-specific settings:
    • Scale: Min value, max value, min label (“Poor”), max label (“Excellent”)
    • Multiple Choice: Define options list
    • Yes/No: Points for Yes, points for No
  6. Click Save.
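
The 100% constraint in step 4 is easy to break when criteria are added or re-weighted later. Below is a minimal sketch of that check; the `Criterion` class and its field names are illustrative assumptions, not the product's actual data model.

```python
from dataclasses import dataclass, field

# Hypothetical representation of a rubric criterion; the fields mirror the
# form above but are assumptions, not the product's actual schema.
@dataclass
class Criterion:
    title: str                 # e.g., "Originality"
    ctype: str                 # "scale", "multiple_choice", "yes_no", or "text"
    weight: float              # percentage weight in the final score
    required: bool = True
    settings: dict = field(default_factory=dict)   # type-specific settings

def check_weights(criteria: list[Criterion]) -> None:
    # Step 4 above: all weights should sum to 100%.
    total = sum(c.weight for c in criteria)
    if abs(total - 100.0) > 0.01:
        raise ValueError(f"Criterion weights sum to {total}%, expected 100%")
```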

Example rubric:

| Criteria | Type | Weight | Scale |
| --- | --- | --- | --- |
| Originality | Scale | 25% | 1–10 (Poor → Exceptional) |
| Methodology | Scale | 25% | 1–10 (Weak → Rigorous) |
| Relevance to Theme | Scale | 20% | 1–10 (Off-topic → Highly Relevant) |
| Clarity of Writing | Scale | 15% | 1–10 (Unclear → Crystal Clear) |
| Potential Impact | Scale | 15% | 1–10 (Minimal → Transformative) |
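
Given the weights above, a plausible reading of the final score is a weighted sum of the per-criterion ratings. A sketch under that assumption (the product's actual aggregation formula isn't documented here):

```python
# Weighted final score for the example rubric above; assumes the overall
# score stays on the same 1-10 scale as the individual criteria.
weights = {
    "Originality": 0.25,
    "Methodology": 0.25,
    "Relevance to Theme": 0.20,
    "Clarity of Writing": 0.15,
    "Potential Impact": 0.15,
}

def final_score(scores: dict[str, float]) -> float:
    return sum(weights[name] * scores[name] for name in weights)

# A strong but narrowly relevant abstract:
scores = {"Originality": 9, "Methodology": 8, "Relevance to Theme": 5,
          "Clarity of Writing": 7, "Potential Impact": 8}
print(final_score(scores))  # 2.25 + 2.0 + 1.0 + 1.05 + 1.2 = 7.5
```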

Drag criteria to reorder them. Reviewers see criteria in the order you set.

Click Preview to see the rubric exactly as reviewers will see it. Verify that instructions are clear and the scoring makes sense.


Go to Settings → Reviewers to add and manage your reviewers.

  1. Click + Add Reviewer.
  2. Fill in:
    • Name — reviewer’s full name
    • Email — for sending review assignments and notifications
    • Role — Reviewer, Lead Reviewer, Technical Reviewer, or Panel Reviewer
    • Designation — job title
    • Company — organization
    • Description — expertise areas
    • Photo — profile image
  3. Optionally enable Social Media links (LinkedIn, etc.).
  4. Click Save.

Save time by importing reviewer profiles from LinkedIn:

  1. Click Fetch from LinkedIn.
  2. Paste the reviewer’s LinkedIn profile URL.
  3. The system auto-fills name, designation, company, description, and photo.
  4. Review and save.

Reviewer roles:

| Role | Responsibility |
| --- | --- |
| Reviewer | Evaluates assigned submissions using the rubric |
| Lead Reviewer | Senior reviewer; may have final say on borderline submissions |
| Technical Reviewer | Evaluates technical accuracy and methodology |
| Panel Reviewer | Reviews for specific tracks or panels |

Go to Settings → Submissions to see all submitted abstracts.

| Status | Meaning |
| --- | --- |
| Pending | Submitted, waiting for review assignment |
| Under Review | Reviewers assigned, evaluation in progress |
| Reviewed | All assigned reviewers have completed their evaluation |
| Approved | Accepted; submitter notified |
| Rejected | Declined; submitter notified |
| Revision Requested | Submitter asked to revise and resubmit |
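
These statuses imply a simple lifecycle: Pending → Under Review → Reviewed → a final decision, with Revision Requested looping back. A sketch of those transitions as inferred from the table (the product's actual transition rules may differ):

```python
# Assumed submission lifecycle inferred from the status table above; the
# product's actual allowed transitions may differ.
TRANSITIONS = {
    "Pending": {"Under Review"},
    "Under Review": {"Reviewed"},
    "Reviewed": {"Approved", "Rejected", "Revision Requested"},
    "Revision Requested": {"Pending"},  # a resubmission re-enters the queue
    "Approved": set(),                  # terminal
    "Rejected": set(),                  # terminal
}

def advance(current: str, new: str) -> str:
    if new not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current!r} to {new!r}")
    return new

status = advance("Pending", "Under Review")  # ok
status = advance(status, "Reviewed")         # ok
```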

To assign reviewers to a submission:

  1. Open a submission.
  2. Click Assign Reviewers.
  3. Select reviewers from your reviewer list.
  4. Each selected reviewer receives an email notification (if enabled).
  5. Reviewers log in to the reviewer portal to evaluate.

If Flag Conflicts is enabled and two reviewers’ scores differ by more than the threshold (e.g., one gives 9/10, another gives 5/10), the system flags the submission for organizer review.
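
A sketch of the kind of check Flag Conflicts likely performs, assuming the threshold applies to the spread between the highest and lowest overall scores (the exact rule isn't documented here):

```python
def has_score_conflict(scores: list[float], threshold: float = 3.0) -> bool:
    # Flag when reviewers' overall scores diverge by more than the threshold.
    # The 3-point default is an assumption for illustration.
    return len(scores) >= 2 and (max(scores) - min(scores)) > threshold

print(has_score_conflict([9, 5]))     # True: a spread of 4 exceeds 3
print(has_score_conflict([7, 8, 6]))  # False: a spread of 2 does not
```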


  • Weight criteria thoughtfully — originality and relevance should typically carry more weight than writing style for research conferences.
  • Use blind review for fairness — especially for competitive academic conferences where reviewer bias could be an issue.
  • Set a realistic review period — give reviewers 2–3 weeks to complete evaluations. Send reminders 3 days before the deadline.
  • Assign an odd number of reviewers — 3 reviewers per submission makes tie-breaking easier than 2 (see the voting sketch after this list).
  • Test the full workflow — submit a test abstract, assign yourself as a reviewer, complete the evaluation, and verify the scoring and notification workflow.
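
The tie-breaking point is simple arithmetic: three accept/reject votes always produce a majority, while two can split evenly. A toy illustration, assuming a plain majority vote (your committee may weigh scores instead):

```python
from collections import Counter

def majority_decision(votes: list[str]) -> str | None:
    # Returns the majority vote, or None when the panel is tied.
    top, count = Counter(votes).most_common(1)[0]
    return top if count > len(votes) / 2 else None

print(majority_decision(["accept", "accept", "reject"]))  # accept
print(majority_decision(["accept", "reject"]))            # None: tied
```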