Discussion
Your academic program assessment should be meaningful to the faculty in your department and should lead to evidence-based strategies for improving student learning. The Discussion section of your assessment document is where you interpret the data collected and presented in the Results section, explaining what the analysis revealed about student learning. This is also the place to propose conclusions about what factors may have produced specific results. The way you interpret your data will directly shape the action items you choose to implement in the final section of your assessment document.
As with all other sections, be sure to show how your interpretation of the data connects to your student learning outcomes.
Tips for Discussion
Examples
For examples of discussion from Georgia Southern University academic programs, please see the Academic Program Assessment Document Handbook or the Core Course Assessment Document Handbook.
Academic Program Student Learning Outcome Assessment Results
At Georgia Southern, the Academic Assessment Steering Committee (AASC) reviews all academic program assessment documents on an annual basis. When reviewing the discussion, the committee uses the following rubric criteria to provide feedback to the program:
1 - Beginning: No interpretation is attempted, or the interpretation does not relate to the SLO and/or the results.

2 - Developing: Interpretation is attempted and relates to the SLO and/or results, but it is insufficient to support programmatic decisions, is not aligned with the program's previous action plans, or offers excuses for results rather than thoughtful interpretations leading to improvements in student learning.

3 - Acceptable: Interpretation is aligned with the program's SLOs. Interpretation is explained in terms of the desired levels of student performance and is based on student achievement of those levels. Interpretation is justified through current disciplinary standards, previous results, and/or benchmarks. Interpretation includes how courses, experiences, and/or the assessment process might have affected results. Interpretation indicates appropriate collaboration and consensus among multiple internal stakeholders (e.g., program faculty committees, staff, and/or students). Interpretation is detailed enough to justify programmatic decisions concerning changes in instruction and/or curriculum.

4 - Exemplary: Interpretation directly addresses the program's mission, SLOs, and action plans. Interpretation addresses past trends in student performance, as appropriate. Interpretation identifies possible areas of improvement, thus initiating future actions.
General Education Student Learning Outcome Assessment Results
At Georgia Southern, the General Education and Core Curriculum (GECC) committee reviews all core course assessment documents on an annual basis. When reviewing the discussion, the committee uses the following rubric criteria to provide feedback to the course:
1 - Beginning: No interpretation is attempted, or the interpretation does not relate to the Area Student Learning Outcome and/or the results.

2 - Developing: Interpretation is attempted and relates to the Area Student Learning Outcome and/or results, but it is insufficient to support programmatic decisions, is not aligned with the program's previous action plans, or offers excuses for results rather than thoughtful interpretations leading to improvements in student learning.

3 - Acceptable: Interpretation is aligned with the Area Student Learning Outcome. Interpretation is explained in terms of the desired levels of student performance and is based on student achievement of those levels. Interpretation is justified through current disciplinary standards, previous results, and/or benchmarks. Interpretation includes how course content, experiences, and/or the assessment process might have affected results. Interpretation indicates appropriate collaboration and consensus among multiple internal stakeholders (e.g., program faculty committees, staff, and/or students). Interpretation is detailed enough to justify programmatic decisions concerning changes in instruction and/or curriculum. Interpretation is provided for data from all applicable campuses and/or delivery modes.

4 - Exemplary: Interpretation directly addresses the Area Student Learning Outcome and results, leading to an action plan. Interpretation addresses past trends in student performance, as appropriate. Interpretation identifies possible areas of improvement, thus initiating future actions. Interpretation of data includes an analysis of equivalencies across all applicable campuses and/or delivery modes.