Assessment Related Policies

The College of Education and Affiliated Programs is committed to a robust assessment system that informs program improvement through documenting student learning and program effectiveness. This system will be characterized by ongoing data collection, timely data analysis, and informed action to ensure that program improvement efforts are responsive to student and faculty needs.

Data Collection and Reporting

On December 14, 2007, the Assessment Committee adopted the following expectations for the Unit Assessment System (UAS):

  1. Program faculty collect student learning outcome data for each outcome each semester, as appropriate (e.g., when the course linked to that outcome is offered, when a particular assessment takes place).
  2. Program faculty analyze data from a minimum of two student learning outcomes each year and take actions relevant to their findings/interpretations. The analysis, interpretation, and actions are documented in an annual report.
  3. Program faculty analyze and act upon all student learning outcomes at least once within a three-year period.

On April 22, 2013, the Assessment Committee amended the above, in light of the new capacity in the college to report data across multiple years. The following was passed unanimously by the committee:

  1. The college expectation is that all programs will report data on each student learning outcome (SLO), as evidenced through a signature assignment, at least once during a three-year period. Programs wishing to vary from this cycle must request an exception from the Assessment Office. Programs should identify their desired reporting cycle for the Assessment Office.

On March 3, 2014, the Assessment Committee affirmed a proposal presented in November 2013 moving the program reporting cycle to a biennial cycle that aligns with the reporting cycle of the Commission on Teacher Credentialing (CTC). The November 2013 proposal read as follows:

The College of Education will adopt the “biennial reporting cycle” followed by the CTC for all programs in the college. Thus, programs will review, interpret, and report on SLO and program effectiveness data in Years 1, 3, and 5 of the CTC accreditation cycle. This means:

  • Programs will review data for the preceding two academic years in the fall when a report is due (e.g., in Fall 2014, review data for 2012-13 and 2013-14).
  • Data for all SLOs will be reported and reviewed in each report.

The Assessment Office will prepare data charts and graphs that reflect multiple years of data, including a “trailing year” representing data that have already been analyzed but are presented purely for comparison purposes.

In the 7th year of the cycle, the college will hold a “beyond compliance” symposium. Each symposium will be planned by the Assessment Committee based on needs at the time and feedback from the preceding accreditation visit. At a minimum, symposium goals include:

  • programs reflecting on SLO and effectiveness data for the program over the preceding 6 years of the cycle;
  • programs charting goals and next steps to support student learning in the coming 5-6 years;
  • providing an opportunity for programs and faculty to learn about assessment practices and strategies for supporting student success from their peers;
  • gathering feedback on program needs to further enhance the UAS.

Signature Assignment and Rubric Implementation

The Unit Assessment System (UAS) of the College of Education and Affiliated Programs aims to ensure that signature assignments and rubrics are:

  • Fair: assess the knowledge, skills, and dispositions that have been taught in the program in a way that is transparent and clearly communicated;
  • Accurate: assess what is stated, aligned with specific outcomes and standards, and reflect the appropriate format for the outcome and level of complexity for the candidates;
  • Consistent: produce dependable results that remain relatively stable over time;
  • Free of Bias: assess candidate knowledge, skills, and dispositions in a way that accurately reflects what candidates know and can do, without bias based on candidate characteristics or inappropriate assessment conditions.

To ensure high-quality signature assignments and rubrics, and to provide data that are useful for program improvement, the Assessment Committee has articulated the following guidelines. Signature assignments and rubrics should:

  1. Reflect the consensus of program faculty. Since these assignments and rubrics are designed to measure program learning outcomes, they should be developed with the involvement and investment of faculty in the program.
  2. Reflect the content taught in the program and assess that content at the appropriate level of complexity and with a method appropriate to the outcome.
  3. Be reviewed and refined by program faculty at least once during a four-year period to ensure they reflect current program content and outcomes, as well as current standards of the field. This review can be done through actions such as program faculty:
    1. reviewing the signature assignment and rubric, in the context of the program’s assessment plan, and making revisions to ensure both are up to date.
    2. discussing the data resulting from the signature assignment and rubric, comparing the data on the same assignment over time as well as criteria data from this assignment to similar criteria on other assignments, to examine consistency.
    3. reviewing candidate work exemplars and scoring these exemplars against the rubric to examine consistency.
    4. comparing and contrasting candidate performance on signature assignments to what is known about their performance overall.
  4. Have conclusions regarding them, including any revisions, presented in the appropriate Biennial Report for the College of Education and Affiliated Programs.
  5. Have any revisions forwarded to the Assessment Office.

Approved unanimously by the Assessment Committee, May 5, 2014