
Student Learning Assessment Resources

Assessment Methods Overview

The assessment methods section of the report explains the "how" of assessment for each student learning outcome (SLO). In the measurement tools and assignments section, you should explain not only the assessment tools you use, but also how they were selected and developed to collect meaningful data for your department.

You should provide details on both parts of this section:

  • Assignments -- the prompt and guidelines given to students that explain what they must do to demonstrate their learning (e.g., the test questions, or the written assignment prompt for a performance-based assessment)
  • Measurement Tools -- the tools used to evaluate students’ achievement levels on the assignment (e.g., a test answer key, a test blueprint showing the alignment of test items to SLOs and cognitive levels, an analytic rubric)

You should also clarify whether your assessment is a direct or indirect measure of student learning:

  • Direct -- examines completed student work that requires application of the skills and knowledge specified in the SLO (e.g., assessment based on an objective test, essay, presentation, lab report, or other assignment)
  • Indirect -- information collected about students’ own reflections on or interpretations of the skills and knowledge gained (e.g., assessment based on a self-assessment, reflection essay, or exit survey)

Direct measures are required to fulfill the guidelines included on the Academic Program Assessment Rubric, but a combination of direct and indirect measures can provide a more nuanced understanding of student learning that can better inform decisions about the program.

You may also select either formative or summative assessments to evaluate student learning in your program, or a combination of the two.

  • Formative assessment -- carried out throughout the course, project, or time frame to provide feedback on whether the objective is being met (e.g., minute papers, discussion posts, “muddiest point” surveys)
  • Summative assessment -- carried out at the end of a course, project, or time frame to evaluate whether the objective was achieved (e.g., the overall performance demonstrated in a final paper, final exam, or final presentation)

It should be clear that the methods and tools you’ve selected are appropriate for the student learning outcomes you are measuring. You should also explain how you have established that these tools are reliable and valid measures of the intended learning -- for example, by pilot testing exam items or by checking inter-rater agreement on rubric scores.

Developing strong measurement tools and processes requires careful planning, testing, revising, and implementation. Be sure to highlight the work you have done and who your partners have been in the process of building your measurement tools and assignments.

Tips for Measurement Tools and Assignments

  • Explain why a specific measurement tool was selected and, if it is an internally developed instrument, the process by which it was developed. How long did development take? Were any of the questions pilot tested? How many faculty participated in the development?
  • If you are using an exam, include a copy of the exam or some example questions. You should also include a test blueprint that shows which questions measure learning relevant to each course SLO and at what level of difficulty, highlighting connections back to program SLOs (an illustrative blueprint follows this list).
  • Typically, multiple-choice tests should include at least 25 items. These can all be included in one exam, or they can be distributed over a series of exams or quizzes throughout the semester. If there are multiple sections of your course, all faculty should use the same test items to assess learning across sections.
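A test blueprint can be as simple as a table mapping blocks of test items to the outcomes and cognitive levels they target. The excerpt below is purely illustrative -- the SLO labels, item ranges, and Bloom’s levels are invented placeholders, not a prescribed format:

  Test Items    Course SLO    Program SLO    Cognitive Level (Bloom’s)
  1-6           CSLO 1        PSLO 2         Remember/Understand
  7-14          CSLO 2        PSLO 2         Apply
  15-21         CSLO 3        PSLO 3         Analyze
  22-28         CSLO 3        PSLO 4         Evaluate

Note that this hypothetical blueprint covers 28 items, meeting the 25-item guideline above, and that every item traces to both a course SLO and a program SLO.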

If you are using a performance-based assessment, include a copy of the essay or assignment prompt. Add narrative that explains how the assignment demonstrates learning linked back to the program student learning outcome. You should also include the analytic rubric used to evaluate student work produced from the assignment, and indicate how the rubric traits link back to SLOs.
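An analytic rubric pairs each trait with a descriptor for every performance level, so that scorers rate each trait separately. The single-trait excerpt below is invented for illustration -- your traits, levels, and descriptors should come from your own assignment and the SLO it measures:

  Trait: Use of Evidence (maps to PSLO 1)
  4 - Exemplary:  Integrates multiple credible sources and evaluates their relevance to the argument.
  3 - Acceptable: Supports claims with credible sources; relevance is stated but not evaluated.
  2 - Developing: Uses sources inconsistently or without a clear connection to claims.
  1 - Beginning:  Claims are largely unsupported by evidence.

A full rubric repeats this structure for each trait, with every trait mapped to an SLO (or an element of an SLO).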

Examples

For examples of assessment measurement tools and assignments from Georgia Southern University academic programs, please see the Academic Program Assessment Document Handbook or the Core Course Assessment Document Handbook.

Measurement Tools and Assignments Peer-review Criteria


Academic Program Measurement Tools and Assignments Criteria

At Georgia Southern, the Academic Assessment Steering Committee (AASC) reviews all academic program assessment documents on an annual basis. When reviewing assignment design, the committee uses the following rubric criteria to provide feedback to the program:

1 - Beginning: The SLO is assessed with only indirect measure(s) (e.g., surveys). No information is provided about how the measurement tool(s) and assignment(s) relate to the SLO.

2 - Developing: The SLO is assessed with direct measure(s) (e.g., objective tests, rubrics). A general description of the measurement tool(s) and assignment(s) is provided, along with general information about how they relate to the SLO.

3 - Acceptable: A detailed description of the measurement tool(s) and their alignment with the SLO is provided. This includes: for an objective test measurement tool, a test blueprint that maps individual questions to the SLO (or an element of the SLO), with expected levels of mastery indicated; for an analytic rubric measurement tool, each trait mapped to the SLO (or an element of the SLO), with each level detailing expectations. A detailed description of the assignment(s) and their alignment with the SLO is also provided. This includes: for an objective test assignment, representative test items described to indicate relevance to the SLO and the expected level of mastery; for a performance-based assignment evaluated with an analytic rubric, the assignment prompt described to indicate relevance to the SLO and the expected level of mastery. The measurement tool(s) provide a direct/observable result and are appropriate to the SLO and the level of mastery expected, and the assignment(s) are appropriate to the SLO and the level of mastery expected.

4 - Exemplary: Direct measures may be supplemented with indirect measures. Both formative and summative measures are included. A description of the development process for the measurement tool(s) and assignment(s) is included to illustrate their appropriateness to the SLO.

Core Course Measurement Tools and Assignments Criteria

At Georgia Southern, the General Education and Core Curriculum (GECC) Committee reviews all core course assessment documents on an annual basis. When reviewing the assignment design of core curriculum courses, the committee uses the following rubric criteria to provide feedback to the program:

1 - Beginning: No information is provided about how the measurement tool(s) and assignment(s) relate to the Area Student Learning Outcome. The Area Student Learning Outcome is assessed with only indirect measure(s) (e.g., surveys).

2 - Developing: The Area Student Learning Outcome is assessed with direct measure(s) (e.g., objective tests, rubrics). A general description of the measurement tool(s) and assignment(s) is provided, along with general information about how they relate to the Area Student Learning Outcome.

3 - Acceptable: A detailed description of the measurement tool(s) and their alignment with the Area Student Learning Outcome is provided. This includes: for an objective test measurement tool, a test blueprint that maps individual questions to expected levels of mastery from Bloom's Taxonomy; for an analytic rubric measurement tool, each trait described by multiple levels of possible performance. A detailed description of the assignment(s) and their alignment with the Area Student Learning Outcome is also provided. This includes: for an objective test assignment, representative test items described to indicate relevance to the Area Student Learning Outcome and the expected level of mastery; for a performance-based assignment evaluated with an analytic rubric, the assignment prompt described to indicate relevance to the Area Student Learning Outcome and the expected level of mastery. The measurement tool(s) provide a direct/observable result and are appropriate to the Area Student Learning Outcome and the level of mastery expected, and the assignment(s) are appropriate to the Area Student Learning Outcome and the level of mastery expected.

4 - Exemplary: Direct measures of the Area Student Learning Outcome may be supplemented with indirect measures. A description of the development process for the measurement tool(s) is included to illustrate the appropriateness and accuracy (validity and reliability) of the tools.