California State University, Long Beach
 

Division of Student Services Tables of Student Learning Outcomes

Click on the departments below to view student learning outcomes and status reports for 2010-2011:

 

Associated Students, Inc. (ASI)

Career Development Center (CDC)

Center for Scholarship Information

Disabled Student Services (DSS)

Educational Equity Services

Educational Opportunity Program (EOP)

Housing and Residential Life

President's Scholars Program

Student Health Services

Student Life & Development (SLD)

Student Orientation, Advising, and Registration (SOAR)

Student Recreation and Wellness Center

Testing, Evaluation & Assessment (TEA)

University Student Union (USU)

Women's Resource Center

Timeline for 2011-2012

Associated Students, Inc. (ASI)

Student Learning Outcome
Students serving as an elected or appointed officer in Student Government will demonstrate increased utilization of at least three of the five leadership practices identified in Kouzes’ and Posner’s The Leadership Challenge.
Measurement Tool
Student Leadership Practices Inventory
Relevant Framework
Leadership Development
Status

The Student Leadership Practices Inventory (SLPI) was administered at the ASI Student Leadership Retreat on August 22, 2010; 47 students completed the questionnaire. For utilization of a practice to be considered “frequent,” the rating of the behavior must fall above the 70th percentile ranking for the more than 2,200 student leaders who have taken the SLPI.

Data from the self-assessment exercise revealed that the following percentages of student leaders frequently engage in these leadership practices:

 

  • Modeling the Way: 25.53%
  • Inspiring a Shared Vision: 46.81%
  • Challenging the Process: 44.68%
  • Enabling Others To Act: 38.30%
  • Encouraging the Heart: 14.89%

 

Only 27.66% of the student participants assessed themselves above the 70th percentile on three or more of the leadership behaviors.
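The classification rule above (a student meets the outcome when three or more of the five practices are rated above the 70th percentile) can be sketched in code. The percentile data below are hypothetical, for illustration only; the actual SLPI dataset is not reproduced here.

```python
# Sketch of the "frequent use" classification described above.
# Percentile ranks are hypothetical, not the actual SLPI data.
THRESHOLD = 70  # percentile rank above which use counts as "frequent"

def frequent_practices(percentiles):
    """Count practices rated above the 70th percentile."""
    return sum(1 for p in percentiles if p > THRESHOLD)

# Hypothetical ratings for three students (one percentile per practice:
# Model, Inspire, Challenge, Enable, Encourage).
students = [
    [75, 80, 72, 40, 30],  # 3 frequent practices -> meets outcome
    [65, 50, 71, 30, 20],  # 1 frequent practice  -> does not
    [90, 85, 88, 91, 60],  # 4 frequent practices -> meets outcome
]

meets_outcome = [frequent_practices(s) >= 3 for s in students]
share = 100 * sum(meets_outcome) / len(students)
```

With the real data, this calculation is what produced the 27.66% figure reported above.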

The same group was instructed to select 5-8 observers to whom the SLPI Observer instrument would be sent. The observers’ responses were analyzed to see if the students had been observed engaging in practices identified in The Leadership Challenge.

Twenty-nine (61.70%) of the student leaders who completed the self-assessment participated in the observer assessment. Data from the observer instrument revealed that the percentages of student leaders who frequently engage in these leadership practices were:

 

  • Modeling the Way: 34.48%
  • Inspiring a Shared Vision: 48.28%
  • Challenging the Process: 34.48%
  • Enabling Others To Act: 82.76%
  • Encouraging the Heart: 41.38%

 

For the group as a whole, observer ratings were higher in 4 of the 5 practices. Of the student leaders participating in the observer exercise, 48.3% received higher average scores from their observers than they had assigned to themselves. According to the observers, 41.38% of the students engaged in three or more of the five leadership practices on a frequent basis.

Use of Findings

Upon further analysis, it was revealed that the most growth occurred in the practice of Encouraging the Heart, with an average increase of 1.84 points between self and observer ratings and 65.52% of the students receiving higher observer ratings. This was followed by the practice of Enabling Others to Act, with 55.17% of students receiving higher observer ratings, and Inspiring a Shared Vision, with 51.72% receiving higher observer ratings. Not surprisingly, two of these practices were among those students reported utilizing less frequently. As a result, leadership development efforts tended to concentrate on these two areas.

What was surprising was that the observers reported the student leaders engaging in the practice of Challenging the Process with less frequency than the student leaders reported themselves. Observers scored 58.62% of the student leaders lower than the students self-reported. We also noted that the observers rated 51.72% of the student leaders lower in the area of Modeling the Way.

 

Based on these findings, ASI will work on balancing its leadership development efforts to address all five practices instead of only concentrating on those that students reported using less frequently.

Go to Top

Career Development Center (CDC)  
Student Learning Outcome

As a result of participating in an interview preparation counseling session or workshop, students will correctly identify four out of six possible components to prepare for a professional interview. 

Measurement Tool

Pre & post assessment

Relevant Framework
Knowledge Acquisition, Integration, and Application
Status

The Career Center collected a total of 124 pre- and post-assessments for the Interview Preparation workshops.  Students attending the presentation were first asked to complete the pre-assessment prior to the start of the presentation.  The workshop was then given and the post-assessment was administered immediately afterwards.  Below is a summary of the results:

 

  • 116 of the 124 pre- and post-assessment pairs were acceptable for data collection; 8 pairs were not

  • 66% of the students had no prior instruction on interview preparation and 31% had received prior instruction (2.6% did not respond)

  • The average pre score was 2.37 and the average post score was 7.44, indicating an improvement of 5.07 points

  • 84% of students (97 in total) were able to successfully identify 6 or more components
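The pre/post summary above amounts to a simple set of averages and counts, sketched below with hypothetical scores (not the Career Center's actual data):

```python
# Sketch of the pre/post scoring summary described above.
# Scores are hypothetical, not the Career Center's actual data.
pre_scores = [2, 3, 1, 4, 2]
post_scores = [7, 8, 6, 9, 7]

avg_pre = sum(pre_scores) / len(pre_scores)
avg_post = sum(post_scores) / len(post_scores)
improvement = avg_post - avg_pre  # average point gain per student

# Share of students identifying at least six components on the post-test.
passed = sum(1 for s in post_scores if s >= 6)
pass_rate = 100 * passed / len(post_scores)
```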

Use of Findings
As a result of the findings, career counselors have streamlined and edited the workshop materials to place more emphasis on the nine critical interview components. Moreover, career counselors now use the nine interview components as a focus when conducting mock interviews with students.

Go to Top

Center for Scholarship Information

Student Learning Outcome

Students who complete the scholarship workshop will be able to list the four main components of a scholarship.

Measurement Tool
Post questionnaire
Relevant Framework
Knowledge Acquisition, Integration and Application
Status
In fall 2010 and spring 2011, the Center for Scholarship Information held five workshops for various student organizations to educate students about where to look for scholarships and best practices when applying for scholarships. At the beginning of each workshop, students were asked to complete a short questionnaire to determine their knowledge of scholarship resources and the four main components of a scholarship application. Only 19 out of 84 students (roughly 23%) were able to answer the questions accurately. At the conclusion of the workshop, students responded to the same questions. Sixty-nine (69) out of 84 students (roughly 82%) could list the four main components of a scholarship application.
Use of Findings
Based on the results of the pre- and post-workshop questionnaire, the Center for Scholarship Information has modified its scholarship workshop for the 2011-2012 academic year to direct students’ focus back to the main components and away from extra information provided at the end of the workshop. For example, many students listed thank-you letters as a main component, so we have altered the manner in which this section is presented to make it clear that it is secondary to the main components.

Go to Top

Disabled Student Services (DSS)

Student Learning Outcome

Upon completion of the 2011 Spring semester, new students registered with Disabled Student Services will be able to  identify functional limitations resulting from their disability, demonstrate the ability to determine and state their appropriate accommodation needs, and describe the process involved in requesting disability accommodation for curriculum requirements.

Measurement Tool

Post Assessment

Relevant Framework

Knowledge Acquisition, Integration and Application

Self-appraisal

Status

Eighty-seven new DSS students were tracked to determine if they understood their disabilities well enough to request appropriate and necessary accommodations from their respective faculty/instructors.

 

The results below pertain to learning outcomes of students who received advising in DSS:

 

  • Identifying their functional limitations in the classroom: 100% of the 87 students were able to effectively identify their functional limitations and inform their faculty.

  • Ability to determine and state their functional needs for accommodations in the classroom: 86% of the 87 new students were able to state their functional needs in the classroom. The remaining 12 students who had difficulty with this process had a specific disability (Asperger's syndrome, autism, or traumatic brain injury). DSS worked directly with faculty and students to ensure they received assistance.

  • When appropriate, describe the process for curriculum modification: Of the 87 new students, 18 required curriculum modifications (extended time to complete course assignments, frequent absences due to medical conditions, for example). 100% of these students were able to successfully communicate to their faculty and department chairs about their specific need for modifications in the course curriculum.

Use of Findings

DSS will utilize these results to improve our advising and counseling, especially for our students with Asperger's syndrome, autism, and traumatic brain injury (TBI).

Go to Top

Educational Equity Services

Student Learning Outcome

After completing the six-week McNair Scholars Summer Research Internship Program (SRIP), 90% of participants will indicate that they have increased their oral, written, and research skills, and the faculty mentors of 90% of participants will indicate that the participants have increased their oral, written, and research skills.

Measurement Tool

Faculty pre and post assessment & observation / student self assessment

Relevant Framework

Intellectual Growth

Personal & Educational Goals

Status
Our data in two cycles of assessment show that while these benchmarks are being met, in that participants and mentors are indicating growth in these skills, the amount of growth is showing what may be an interesting pattern. In the 2010-11 cohort, writing skills were rated as improving by 50%, while research and presentation skills were reported as rising 80%. The 2011-12 cohort reports a 50% gain in each of the three measures. There was an important systematic difference between the two cohorts, however. The 2010-11 group was assessed at the immediate end of the summer research internship, which represents a significant emotional high point for scholars. The 2011-12 group was assessed at the beginning of the Fall semester, far removed from this singular achievement.
Use of Findings
To investigate further whether this variation in achievement is a result of natural fluctuations or a systematic effect of the timing of the assessment we will gather assessment data for the 2012-13 cohort both at the end of the summer internship and at the start of the Fall semester. Given that we are meeting a rather high benchmark, however, it seems clear to us in the program that the basic features of our offering to our students should remain unchanged.

Go to Top

Educational Opportunity Program (EOP)

Student Learning Outcome

First year EOP students who participate in EOP math tutoring will increase their subject proficiency.

Measurement Tool
Pretest in math; final grade in math course
Relevant Framework

Intellectual Growth

Status

We first hypothesized that students who attended more EOP mandatory tutoring sessions during the Fall 2010 semester would pass their course at a higher rate than those who attended fewer or no tutoring sessions. Students could have attended a maximum of 12 group contact sessions throughout the semester.

  • We ran a Pearson's correlation to see if there was a relationship between the number of contact sessions attended and passage of the MAPB 1 course. However, there was no significant correlation between these two variables: r(142) = -.149, p = .077.

  • We then looked at whether students who attended at least half of the 12 contact sessions (6 or more) passed the course at a higher rate than those who attended fewer than half (5 or fewer). We ran a Pearson's correlation and found a significant negative correlation between attending half or more of the mandatory tutoring sessions and passage of the MAPB 1 course: r(142) = -.190, p = .023. In other words, students who attended 6 or more tutoring sessions were less likely to pass the course. This did not support our hypothesis.
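For readers wanting to reproduce this kind of analysis, a Pearson correlation and its associated t statistic (with n - 2 degrees of freedom) can be computed directly from the textbook formulas. The attendance and pass/fail values below are hypothetical, not the EOP dataset:

```python
import math

# Sketch of the correlation analysis described above, using the
# textbook Pearson formula; the data are hypothetical, not EOP's.
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def t_statistic(r, n):
    """Convert r to a t statistic with n - 2 degrees of freedom."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

# Hypothetical: tutoring sessions attended vs. pass (1) / fail (0).
sessions = [2, 4, 6, 8, 10, 12]
passed = [1, 1, 1, 0, 0, 1]

r = pearson_r(sessions, passed)
t = t_statistic(r, len(sessions))
```

The significance of r can then be read from a t table using n - 2 degrees of freedom.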

Since our first method of evaluation did not provide us with positive results, we sought to hear directly from the students who attended MAPB 1 mandatory tutoring during Fall 2010 semester to determine whether they felt EOP mandatory tutoring was helpful towards their passage of the course that semester.  We created an online survey which we administered from May 16 through June 3, 2011.  The questions and student responses are as follows:

 

  • Do you feel that attending EOP mandatory tutoring helped you in your MAPB 1 class during Fall 2010 semester?

  • 29/31 (94%) said YES

  • 2/31 (6%) said NO

  • Did the EOP tutors provide you with a range of strategies that you utilized on your homework and tests?

  • 28/32 (88%) said YES

  • 4/32 (12%) said NO

  • Did these strategies have a positive impact on your understanding of the material?

  • 29/31 (94%) said YES

  • 2/31 (6%) said NO

  • Did you feel more confident and prepared for coursework and exams after attending a tutoring session?

  • 29/31 (94%) said YES

  • 2/31 (6%) said NO

  • In your ideal mandatory tutoring session, which would be most helpful.....?

  • Group tutoring: 18/32 (56%)

  • One-on-one tutoring: 11/32 (35%)

  • Both: 2/32 (6%)

  • Group tutoring consisting of two students: 1/32 (3%)

  • In your ideal mandatory tutoring session, which would be most helpful.....?

  • Meeting once a week with a tutor 13/32 (44%)

  • Meeting more than once a week with a tutor 15/32 (47%)

  • Depends on the need of the student and whenever they would like to go 3/32 (9%)

  • In your ideal mandatory tutoring session, which would be most helpful.....?

  • 30 minutes appointment 8/32 (25%)

  • 60 minute appointments 20/30 (66%)

  • However long it takes the student to fully understand the material 2/32 (6%)

  • 30 minutes, unless the student needs longer 1/32 (3%)

  • How many tutoring appointments per semester do you feel would be most helpful with passing your MAPB 1 course?

  • 0-10 (4/32) (13%)

  • 11-20 (14/32) (44%)

  • 21-30   (5/32) (16%)

  • 31-40 (3/32) (9%)

  • 41-50 (2/32) (6%)

  • 90 (1/32) (3%)

  • As needed (2/32) (6%)

  • No answer (1/32) (3%)

 

Out of the 141 students enrolled in MAPB 1 during the Fall 2010 semester and required to attend EOP mandatory tutoring, 32 took our survey at the end of the Spring 2011 semester. While the number of responses a full semester later was small, the answers of those who took the survey were overwhelmingly positive and show that students felt EOP mandatory math tutoring helped them pass MAPB 1 during the Fall 2010 semester.

Use of Findings
Based on the results of the math tutoring analysis and student surveys, EOP has modified the tutorial program to address areas of student need and to determine whether the pre-test is a positive tool for student success in math.

Go to Top

Housing and Residential Life

Student Learning Outcome

As a result of participating in the Resident Assistant (RA) Training, the RA will be able to identify at least six different steps used in the Conflict Resolution Model, at least three resources on campus, at least five different residence hall regulations, and four basic steps in responding to a fire alarm in the residence halls.

Measurement Tool

Pre and Post Assessment

Relevant Framework

Knowledge Acquisition, Integration and Application

Status

Fifty-two Resident Assistants (RAs) participated in the Resident Assistant Training. A pre-assessment was given at the start of the training, and a post-assessment was given to all RAs at the end.

Data from the assessment exercise revealed the following percentages:

  • Conflict Resolution:

    • Pre assessment: 17%

    • Post assessment: 80%

  • Campus Resource:

    • Pre assessment: 31%

    • Post assessment: 96%

  • Rules and Regulations:

    • Pre assessment: 62%

    • Post assessment: 94%

  • Fire Alarm Response:

    • Pre assessment: 27%

    • Post assessment: 93%

Use of Findings

Based on the results of the post-assessment, University Housing has made some minor changes to the resident assistant training sessions and has put more emphasis on conflict resolution sessions and materials.

Go to Top

President's Scholars Program

Student Learning Outcome

As a result of attending the mandatory Freshmen Success Skills Seminar, freshmen President's Scholars will acquire time management and goal setting skills.

Measurement Tool

Post assessment

Relevant Framework

Personal & Educational Goals

Status
Two sessions were offered and 100% of the freshmen (30 Scholars total) attended one of the two dates. At the conclusion of the session, which included an overview of scholarship requirements, a review of on-campus resources, time management assessments and presentations, and a goal setting exercise, participants completed a post assessment to see if they understood information presented at the seminar. All Scholars were able to list, write down, and identify the requested items.
Use of Findings
These findings will be used to continue offering seminars in this manner to enhance the time management and goal setting skills of the Scholars.

Go to Top

Student Health Services

Student Learning Outcome

As a result of attending the Sexual Health Awareness Workshop (SHAW), students will use a reliable form of birth control and will report they feel more confident in their decisions to prevent pregnancy.

Measurement Tool

Pretest and posttest

Relevant Framework

Healthy Behavior

Status

Study outcomes indicate SHAW is a beneficial workshop that can lead to increased knowledge, self-efficacy, and positive behavior change.

Participants:

  • Retained knowledge gained in the workshop
  • Stated they had more confidence in preventing pregnancy and STIs
  • Stated they felt more comfortable talking about sex
  • Self-reported positive behaviors, including:
    • Having a Well Woman Exam and getting tested for STIs
    • Using effective forms of contraception
    • Using effective safer sex practices to reduce the risk of future STIs

Knowledge:

  • There was a significant increase in knowledge from pre-test to post-test (p < .001).
  • There was also a significant increase in knowledge from pre-test to the six-month post-test (p = .002).

Self-Efficacy:

  • 95% of participants said they felt more confident in their decisions to prevent pregnancy after attending SHAW.
  • 95% of participants said they felt more confident in their decisions to prevent STIs after attending SHAW.
  • 86% of participants said they felt more comfortable talking about sex with their partners after attending SHAW.
  • There was a statistically significant increase (p = .018) in confidence in preventing STIs.

Behavior:

  • 35% increase in the number of participants who discussed STIs prior to having sex (from 40% to 75%)
  • 30% increase in the number of participants who had been tested for STIs
  • 10% increase in condom use
  • 38% increase in use of birth control pills
  • 72% increase in the number of participants who had had a Well Woman Exam
  • 21% increase in the number of participants who discussed birth control prior to having sex
  • 10% decrease in the number of participants using no contraception (to 0%)

The full report can be viewed here.

Use of Findings
Findings will be used to evaluate and improve SHAW. Results indicate that additional discussion on partner communication should be added to the workshop. We will also share our findings with other college health professionals.

Go to Top

Student Life & Development (SLD)

Student Learning Outcome

Students who participate in the Alternative Spring Break Program will demonstrate the ability to identify and analyze the pros and cons of the governmental aid responses to the victims of Hurricane Katrina and formulate their own position regarding the government response.

Measurement Tool
Essay with rubric
Relevant Framework
Cognitive Complexity
Status

Thirty-three students participated in the Alternative Spring Break program. All were enrolled in the service learning course "University 300i, The Politics of Disaster." Participants attended a seven day service learning trip to New Orleans during spring break of 2011.

Students’ essay responses on the course’s final exam were used to measure student learning outcomes. The essay question asked students to compare their initial beliefs with their post-course beliefs concerning local and federal responses to the 2005 disaster. Essays were rated according to five criteria: a) preparation, b) content, c) integration, d) cognitive complexity, and e) a personal stance on the subject.

The vast majority of essays received high ratings for preparation, content, and personal stance, with each essay including multiple references to course materials and activities. While student responses demonstrated their ability to identify politicians and agencies, cause-and-effect relationships among events were mentioned less often. Most students were able to take a personal stance on the disaster and often cited personal experiences garnered from the service learning aspect of the course.

To summarize, students’ strengths were reflected in their content recall. They were able to identify the names and positions of the major decision makers. However, student essays were often weak in critical analysis of the actions of the decision makers. On the other hand, students tended to be more confident in their analyses during class discussions.

 

Use of Findings

The findings indicate the need for additional class assignments with more emphasis on the effects of the event. Six years after Hurricane Katrina, much literature exists that documents and examines the government's immediate response and subsequent actions. Course reading material was selected at the inception of the course four years ago and should be updated with more current literature.

Go to Top

Student Orientation, Advising, & Registration (SOAR)

Student Learning Outcome

SOAR advisors who participate in three pre-training workshops will identify all General Education Categories.

Measurement Tool
Posttest
Relevant Framework

Practical Competence

Status
After a comprehensive screening and interview process by Student Orientation, Advising and Registration (SOAR), 16 undergraduate students were hired to serve as SOAR Advisors for Summer 2011.  SOAR then conducted three, three-hour pre-training sessions for these newly-hired advisors on Friday, April 22, 29 and May 6, 2011.  These pre-training sessions were designed to introduce California State University (CSU) policies, California State University, Long Beach (CSULB) academic advising guidelines, and the CSULB 2008 General Education Pattern.  During the first session, advisors were asked to individually identify the six categories that comprise the CSULB 2008 General Education Pattern, and only 1 of the 16 advisors (6.25%) could accurately identify all categories.  At the conclusion of the third pre-training session, advisors completed an assessment covering areas related to academic advising guidelines and policies, in addition to identifying the various categories of the general education pattern.  All 16 advisors (100%) were able to correctly identify the six categories of the CSULB 2008 General Education Pattern.
Use of Findings
Based on the results of the post assessment, Student Orientation, Advising and Registration will hone the content and structure of future pre-training sessions to reflect the changes in CSU policies and CSULB guidelines along with addressing the wide range of learning styles of today’s college student.

Go to Top

Student Recreation and Wellness Center

Student Learning Outcome

Intramural Sports student assistants who complete the volleyball officiating clinic will outline proper rules and regulations and demonstrate competent officiating mechanics.

Measurement Tool
NIRSA test and non-participant observation
Relevant Framework
Practical Competence
Status

On February 2, 2011, the Intramural Sports Program conducted its first volleyball officiating clinic. Fifteen Intramural Sports student assistants attended the clinic. The clinic consisted of a two-hour theoretical session and a two-hour practical skills session. Both parts of the clinic were assessed through the NIRSA Level 1 Volleyball Officiating Theoretical Exam and the Practical Skills Rating Form. The NIRSA Level 1 Volleyball Officiating Theoretical Exam is a 50-question multiple-choice exam administered open book. The Practical Skills Rating Form rates the official on a Likert scale (1-Poor to 5-Superior). The following aggregate data were collected:

NIRSA Level 1 Theoretical Exam:

  • 93% of participants scored 70% or better on the exam.

  • 20% of participants scored 80% or better on the exam.

  • 0% of participants scored 90% or better on the exam.

Practical Skills Rating Form:

Criteria (5-Superior; 4-Good; 3-Average; 2-Fair; 1-Poor)

  • Judgment (consistency between skill level, team, action): The mean score of the participants was 3.33

  • Mechanics (proper signals and technique): The mean score of the participants was 3.47

  • Positioning (focus on the action, quick reaction to play): The mean score of the participants was 3.27

  • Knowledge of the game and rules: The mean score of the participants was 3.27

Use of Findings
Based on the results of the clinic, ASI Recreation has modified the clinic for future training and incorporated the same type of approach into training for other leagues.

Go to Top

Testing, Evaluation & Assessment (TEA)
Student Learning Outcome

Students who receive GWAR advising and/or take a GWAR course will successfully meet the GWAR within one year.

Measurement Tool

Student records; CS link report of WPE placement and outcomes

Relevant Framework

Persistence and Academic Achievement

Status

In September 2010, 297 students scored below 11 on the Writing Proficiency Exam and were required to see an advisor and/or enroll in a GWAR writing-intensive course. Of these 297 students, all but 12 did so. In November 2010, 242 students were also required to see an advisor and/or enroll in a GWAR course. Of these, all but 20 did so. GWAR course portfolio pass rates for these cohorts will be monitored in December 2011 to see how many students met the GWAR in one year.

Students were sent numerous communications about the requirement to seek advising and/or enroll in a GWAR course after earning a WPE score lower than 11. Students who did not seek advising after receiving the communications received a GWAR hold, which suspended registration privileges. Once students saw a GWAR advisor or agreed to enroll in a required GWAR course, the hold was lifted. The communication and hold strategy may explain why the vast majority of students sought advising and enrolled in required courses.

Use of Findings
Testing, Evaluation & Assessment will continue to monitor compliance with GWAR policy in the same manner by encouraging students to seek advisement from GWAR advisors and enroll in appropriate classes in a timely manner.

Go to Top

University Student Union (USU)

Student Learning Outcome
Student media volunteers who participate in the College Beat Technical Workshop will produce two three-minute video segments using proper filming and editing skills.
Measurement Tool
Video evaluation and non-participant observation
Relevant Framework
Knowledge Acquisition, Integration and Application
Status

The primary goal of the College Beat (CB) Technical Workshop is to provide CB volunteers with the tools and techniques needed to successfully film and edit for television production. The training lasts most of the day and includes information about pre-production, production, and post-production. The workshop is run primarily by the CB Executive Producers and utilizes a PowerPoint presentation as well as hands-on experience with CB’s equipment.

Out of 25 students who attended the Fall 2010 College Beat Boot Camp, 17 (68%) were active participants in the two Fall 2010 30-minute College Beat shows. These 17 students participated in one of the following areas: Editing, Producing, Camera Operations, or Hosting/Correspondent.

Eight (47%) of the 17 students were active in more than one of these areas.  Ten (59%) of the 17 active students participated in the editing process. Nine (53%) of the 17 active students participated as camera operators. Seven (41%) of the 17 active students participated by producing segments. Six (35%) of the 17 active students were segment hosts or correspondents.

 

By the end of the fall semester, participants had produced at least two three-minute video segments utilizing the proper filming and editing skills.

Use of Findings
Based on the results of the workshop and subsequent work produced by the participants throughout the year, College Beat has modified the workshop to address areas where students need more emphasis.

Go to Top

Women's Resource Center

Student Learning Outcome

As a result of attending a Project Safe classroom presentation about healthy relationships, attendees will correctly identify five behaviors considered to be red flags for potential relationship violence.

Measurement Tool
Posttest: a series of questions including true and false, multiple choice, and subjective items  
Relevant Framework
Knowledge Acquisition
Status

Six students attended the workshop; three completed it.

At the end of the presentation, participants identified thirteen distinctly different red flags, both personal and general.  Included in their lists were: “yelling at his mother and sister,” “throwing his books across the room or at you when he’s mad,” “blames other people for stuff that happens to him and calls them stupid,” “hit you,” and listed by all, “calls you names” in public and in private.

In response to the question about an important point they will remember participants wrote about needing to pay attention to the way their boyfriends and potential boyfriends treat others, and the importance of listening and supporting friends in relationship trouble. 

The most surprising information learned at the program was the multiple resources available to them on campus – and in the communities in which they live. 

Additional written comments related to the red and blue flag making exercise: sharing the comments they had written on their flags felt like “making a connection.” 

 

Use of Findings
Though the workshop was smaller than anticipated, students who attended were satisfied and appreciated the discussion and hands-on activity.  The timing of the workshop, including its length and time of the year, will be reassessed for future workshops to enable more students to be involved.

 

Timeline for 2011-2012

Go to Top

Due dates

Activity

Participants

By October 3, 2011

Draft student learning outcome for your department

Managers and staff from Student Services Departments

 

By October 24, 2011

Finalize learning outcome for 2011-2012

Managers and staff from Student Services Departments

By December 16, 2011

Student learning outcomes posted on assessment website

Learning Outcomes Committee

 

By June 8, 2012

Student learning outcomes results submitted and posted on assessment website

Learning Outcomes Committee

Managers and staff from Student Services Departments

By August 19, 2012

Use of results from student learning outcomes submitted and posted on assessment website

Learning Outcomes Committee

Managers and staff from Student Services Departments

Go to Top