
Student Learning Assessment Resources

Results Overview

Results

This is the section where you share the data you collected with your assessment assignments and measurement tools, following your data collection process. It is meant to give your reader a quick, big-picture view of how students performed on the assessment. That does not simply mean reporting average grades by student or aggregated test data: the goal is to present information about student learning aligned with your student learning outcomes (SLOs). Group data by SLO, not by student, and consider including averages, percentages, and ranges to convey the nuances of student learning.
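
For instance, you might report, per SLO, the number of students assessed, the average score, the range, and the percentage of students meeting a benchmark. The sketch below shows one way such a summary could be computed; the column names and the 70-point benchmark are illustrative assumptions, not part of any required template.

    # A minimal sketch of summarizing assessment scores by SLO rather than by
    # student. The column names and the 70-point benchmark are hypothetical.
    import pandas as pd

    scores = pd.DataFrame({
        "student_id": [1, 1, 2, 2, 3, 3],
        "slo":        ["SLO 1", "SLO 2", "SLO 1", "SLO 2", "SLO 1", "SLO 2"],
        "score":      [85, 70, 92, 64, 78, 88],
    })

    # Per-SLO sample size, average, and range.
    summary = scores.groupby("slo")["score"].agg(
        n="count", average="mean", low="min", high="max"
    )
    # Percentage of students meeting the (hypothetical) 70-point benchmark.
    summary["pct_meeting"] = (
        scores.assign(met=scores["score"] >= 70)
              .groupby("slo")["met"].mean() * 100
    )
    print(summary)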

Tips for Results

  • This is where you will report the data you collected from your assessment. Keep in mind that “discussion” is a separate section where you will offer an interpretation of this data.
  • Data presented must be broken out by campus and mode of instruction.
  • We encourage the use of tables, charts, and graphs in this section to make your results easy for the reviewer to scan and understand. Be sure to label data clearly.
  • Don’t assume the data speaks for itself. Adding a description to the table or chart gives your reviewer important context for understanding what the data means.
  • If you are using an analytic rubric, present your results by trait (see the sketch after this list).
  • If you are using an objective test, refer back to your test blueprint and present your results by learning outcome or topic and by level of difficulty (Bloom’s Taxonomy); the same sketch illustrates this grouping.
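
If your data lives in a spreadsheet, the last two tips can be carried out with a few lines of code. The sketch below first tabulates rubric scores by trait and level, then rolls objective-test items up by SLO and Bloom’s level; the trait names, item labels, and scores are all hypothetical.

    # Hypothetical rubric scores: one row per student per trait, levels 1-4.
    import pandas as pd

    rubric = pd.DataFrame({
        "trait": ["Thesis", "Thesis", "Thesis", "Evidence", "Evidence", "Evidence"],
        "level": [3, 4, 2, 2, 3, 3],
    })

    # Counts and percentages of students at each level, per trait.
    counts = pd.crosstab(rubric["trait"], rubric["level"])
    percents = pd.crosstab(rubric["trait"], rubric["level"], normalize="index") * 100
    print(counts)
    print(percents.round(1))

    # Hypothetical objective-test items keyed to a test blueprint: each item
    # maps to an SLO and a Bloom's Taxonomy level.
    items = pd.DataFrame({
        "item": ["Q1", "Q2", "Q3", "Q4"],
        "slo": ["SLO 1", "SLO 1", "SLO 2", "SLO 2"],
        "bloom": ["Remember", "Apply", "Apply", "Analyze"],
        "pct_correct": [91.0, 74.5, 68.2, 55.0],
    })
    print(items.groupby(["slo", "bloom"])["pct_correct"].mean())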

Examples

For examples of results from Georgia Southern University academic programs, please see the Academic Program Assessment Document Handbook or the Core Course Assessment Document Handbook.


Results Peer-review Criteria

Academic Program Student Learning Outcome Assessment Results

At Georgia Southern, the Academic Assessment Steering Committee (AASC) reviews all academic program assessment documents on an annual basis. When reviewing the results, the committee uses the following rubric criteria to provide feedback to the program:

1 - BEGINNING: No results are presented, or it is unclear how the results relate to the SLO.

2 - DEVELOPING: Results are presented and relate to the SLO, but a lack of specificity does not allow useful conclusions to be drawn. Results are presented by measure instead of by SLO. Presentation is insufficiently detailed; only overall student scores or averages are presented.

3 - ACCEPTABLE: Results are presented by SLO. Tables and graphs effectively communicate results, including sample size, count, averages, percentages, and ranges, as appropriate to the measurement tool. For objective tests, results are presented according to items or groups of items connected to an SLO, as demonstrated in the test blueprint. For rubrics, results are presented according to rubric trait and level, including counts and percentages. Results are included from all applicable campuses and/or delivery modes.

4 - EXEMPLARY: Results are easily understood, as are their implications. Strengths and weaknesses in student learning are easily identified. For an objective test, results are presented according to the test blueprint and include item analysis information. For rubrics, inter-rater reliability is ensured through reconciliation of scores across multiple raters. New findings are compared to past trends, as appropriate. Results are presented for all applicable campuses and/or delivery modes at an equivalent level of rigor and detail.

General Education Student Learning Outcome Assessment Results

At Georgia Southern, the General Education and Core Curriculum (GECC) committee reviews all core course assessment documents on an annual basis. When reviewing the results, the committee uses the following rubric criteria to provide feedback to the course:

1 - BEGINNING: No results are presented, or it is unclear how the results relate to the Area Student Learning Outcome.

2 - DEVELOPING: Presentation of results is insufficiently detailed; only overall student scores or averages are presented. Results are missing from some applicable campuses and/or delivery modes.

3 - ACCEPTABLE: Tables and graphs effectively communicate results, including sample size, count, averages, percentages, and ranges, as appropriate to the measurement tool. For objective tests, results are presented according to items or groups of items connected to an SLO, as demonstrated in the test blueprint. For rubrics, results are presented according to rubric trait and level, including counts and percentages. Results are included from all applicable campuses and/or delivery modes, but they are not equivalent in rigor or level of detail.

4 - EXEMPLARY: Results are easily understood, as are their implications. Strengths and weaknesses in student learning are easily identified. For an objective test, results are presented according to the test blueprint and include item analysis information. For rubrics, inter-rater reliability is ensured through reconciliation of scores across multiple raters. New findings are compared to past trends, as appropriate. Results are presented for all applicable campuses and/or delivery modes at an equivalent level of rigor and detail.

Results Additional Resources

Inter-rater Reliability Resources
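
One common way to check agreement between two raters before reconciling their scores is Cohen’s kappa, which corrects raw percent agreement for the agreement expected by chance. The sketch below computes it from scratch for two hypothetical raters scoring the same set of artifacts on a 1-4 rubric scale.

    # A minimal inter-rater agreement sketch: two raters, same artifacts,
    # 1-4 rubric scale. The scores are hypothetical.
    from collections import Counter

    rater_a = [3, 4, 2, 3, 4, 1, 3, 2]
    rater_b = [3, 4, 3, 3, 4, 2, 3, 2]

    n = len(rater_a)
    # Observed agreement: proportion of artifacts scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's marginal distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[k] * freq_b[k] for k in set(freq_a) | set(freq_b)) / n**2

    kappa = (observed - expected) / (1 - expected)
    print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")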

Item Analysis Resources
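
Item analysis typically reports at least two statistics per test item: difficulty (the proportion of students answering correctly) and discrimination (how well the item separates stronger from weaker students). The sketch below computes both from a hypothetical 0/1 response matrix, using the item-total correlation as a simple discrimination index.

    # A minimal item-analysis sketch: difficulty and discrimination per item.
    # Responses are hypothetical (1 = correct, 0 = incorrect); rows are
    # students, columns are items.
    import pandas as pd

    responses = pd.DataFrame({
        "Q1": [1, 1, 1, 0, 1, 1],
        "Q2": [1, 0, 1, 0, 1, 0],
        "Q3": [0, 0, 1, 0, 1, 0],
    })

    total = responses.sum(axis=1)
    analysis = pd.DataFrame({
        "difficulty": responses.mean(),  # proportion correct per item
        # Item-total correlation (item included in the total, for simplicity).
        "discrimination": responses.apply(lambda col: col.corr(total)),
    })
    print(analysis.round(2))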