Grading & Results

Result Analytics

Result Analytics provides data-driven insights into exam performance at multiple levels — individual exams, classes, branches, and the entire organization. These analytics help instructors and administrators identify trends, evaluate exam effectiveness, and make informed decisions about curriculum and assessment design.

Exam-Level Insights

Each exam has an insights panel on its detail page showing real-time performance metrics. These metrics update automatically as submissions are graded:

| Field | Required | Type | Description |
|---|---|---|---|
| Total Attempts | Required | number | Number of learners who started the exam. |
| Completed Attempts | Required | number | Number of learners who submitted the exam. |
| Average Score | Required | number | Mean score across all completed attempts. |
| Passing Rate | Required | percentage | Percentage of completed attempts that met the passing marks threshold. |
| Top Score | Required | number | Highest score among completed attempts. |
| Lowest Score | Required | number | Lowest score among completed attempts. |
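The metrics above can all be derived from graded attempt data. A minimal sketch in Python, where the `Attempt` shape and field names are illustrative assumptions rather than BeamEdUp's actual data model:

```python
from dataclasses import dataclass
from statistics import mean
from typing import Optional


@dataclass
class Attempt:
    score: Optional[float]  # None until the attempt is submitted and graded
    submitted: bool


def exam_insights(attempts: list, passing_marks: float) -> dict:
    """Derive the exam-level insight metrics from a list of attempts."""
    completed = [a.score for a in attempts if a.submitted and a.score is not None]
    if not completed:
        return {"total_attempts": len(attempts), "completed_attempts": 0}
    return {
        "total_attempts": len(attempts),          # learners who started
        "completed_attempts": len(completed),     # learners who submitted
        "average_score": round(mean(completed), 2),
        "passing_rate": round(100 * sum(s >= passing_marks for s in completed) / len(completed), 1),
        "top_score": max(completed),
        "lowest_score": min(completed),
    }


attempts = [Attempt(82, True), Attempt(45, True), Attempt(91, True), Attempt(None, False)]
print(exam_insights(attempts, passing_marks=50))
```

Note that in-progress attempts count toward Total Attempts but are excluded from every score-based metric, which is why the panel can update in real time as grading proceeds.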

Class & Branch Performance

Performance data can be viewed at the class and branch levels for broader analysis:

| Field | Required | Type | Description |
|---|---|---|---|
| Exam Title | Required | text | Name of the exam being analyzed. |
| Average Score | Required | number | Mean score for the exam across all learners. |
| Total Attempts | Required | number | Number of learners who attempted the exam. |
| Pass Rate | Required | percentage | Percentage of learners who passed the exam. |
| Completion Rate | Required | percentage | Percentage of learners who submitted out of those who started. |
| Trends | Optional | object | Change indicators for average score, pass rate, and completion rate compared to previous periods. |

Branch-level analytics extend these metrics with additional fields for branch name, exams completed out of total exams, and comparative performance across branches.
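Completion rate and the trend indicators follow the same pattern as the exam-level metrics. A hedged sketch, with field names assumed for illustration rather than taken from the platform's API:

```python
from typing import Optional


def class_performance(started: int, submitted: int, scores: list,
                      passing_marks: float, previous: Optional[dict] = None) -> dict:
    """Compute class-level metrics; `previous` holds last period's metrics for trend deltas."""
    metrics = {
        "total_attempts": started,
        "average_score": round(sum(scores) / len(scores), 2) if scores else None,
        "pass_rate": round(100 * sum(s >= passing_marks for s in scores) / len(scores), 1) if scores else None,
        "completion_rate": round(100 * submitted / started, 1) if started else None,
    }
    if previous:
        # Trend indicators: positive delta means improvement over the previous period
        metrics["trends"] = {
            k: round(metrics[k] - previous[k], 1)
            for k in ("average_score", "pass_rate", "completion_rate")
            if metrics.get(k) is not None and previous.get(k) is not None
        }
    return metrics
```

Completion rate uses *started* attempts as the denominator, while pass rate uses only *submitted* scores; keeping the two denominators distinct is what makes the metrics comparable across classes of different sizes.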

Item Analysis

Item analysis examines individual questions within an exam to evaluate their effectiveness. For each question, you can review:

  • Correct Response Rate — Percentage of learners who answered the question correctly. Very high or very low rates may indicate issues with the question.
  • Discrimination Index — How well the question differentiates between high-performing and low-performing learners. A good question should be answered correctly more often by high scorers.
  • Option Distribution — For MCQ questions, how responses were distributed across options. This reveals whether distractors are effective.
  • Average Time Spent — How long learners spent on the question on average, indicating difficulty or clarity issues.
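The first two statistics can be sketched as follows. This uses the standard upper/lower-group form of the discrimination index (comparing the top and bottom 27% of scorers), offered as an illustration of the technique rather than BeamEdUp's exact formula:

```python
def correct_response_rate(responses: list) -> float:
    """responses: (total_exam_score, answered_correctly) pairs, one per learner, for one question."""
    return round(100 * sum(1 for _, ok in responses if ok) / len(responses), 1)


def discrimination_index(responses: list, fraction: float = 0.27) -> float:
    """Difference between the correct-rate of the top- and bottom-scoring groups.

    Values near +1 mean high scorers answer correctly far more often than low
    scorers (a good item); values near 0 or negative flag a problematic item.
    """
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    k = max(1, int(len(ranked) * fraction))          # group size (upper/lower 27%)
    upper = sum(1 for _, ok in ranked[:k] if ok) / k
    lower = sum(1 for _, ok in ranked[-k:] if ok) / k
    return round(upper - lower, 2)
```

For example, a question answered correctly by every top scorer and by no bottom scorer has a discrimination index of 1.0, while a question everyone gets right (or wrong) has an index of 0 and contributes nothing to ranking learners.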

Reports & Exports

BeamEdUp includes a Report Builder for creating custom reports from analytics data. Reports can be configured with:

  • Custom data sources (exam results, learner performance, branch analytics)
  • Selectable fields and columns to include in the report
  • Filters to narrow data by date range, branch, class, or exam
  • Grouping and sorting options for organizing the data
  • Visualization types for graphical representation of trends
  • Export formats including CSV, PDF, and XLSX
  • Scheduled reports that run automatically and email results to recipients
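Putting these options together, a report configuration might look like the sketch below. Every key and value here is hypothetical, shown only to illustrate how the pieces listed above combine; it is not the Report Builder's actual schema:

```python
# Hypothetical report configuration -- keys and values are illustrative only
report_config = {
    "data_source": "exam_results",            # or "learner_performance", "branch_analytics"
    "fields": ["exam_title", "average_score", "pass_rate", "completion_rate"],
    "filters": {                              # narrow by date range, branch, class, or exam
        "branch": "Main Campus",
        "date_range": ["2024-01-01", "2024-03-31"],
    },
    "group_by": "class",
    "sort": {"field": "average_score", "order": "desc"},
    "visualization": "bar",                   # graphical representation of trends
    "export_format": "csv",                   # or "pdf", "xlsx"
    "schedule": {                             # run automatically and email results
        "frequency": "weekly",
        "recipients": ["admin@example.com"],
    },
}
```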