Outcomes and Assessment Committee
February 12
Of Assessments and Cycles: Why the Three-Year Cycle is Our Friend

By Greg Turner, English Professor

Long before COS received a “show cause” sanction from the ACCJC, we were told to assess courses and programs on a cycle. In the English department, we did this while continuing to spend four days a year assessing our pre-collegiate English 251 course. Even though we were doing the assessments--and that helped each of us grow in the classroom--we never got to the point where we could honestly say we were completing cycles. Did we even know what completing a cycle meant?

An assessment cycle means different things to different groups. “A cycle is the time between more comprehensive assessments that go beyond the teacher’s isolated assessment and grading.” That is the most common description I’ve heard, and it seemed to cover how we had been doing things. However, if we are going to think of assessments of courses and programs as three-year cycles, we need to think more about what happens between those years of comprehensive assessments--and more importantly, what a cycle should mean.

It doesn’t do us any good to assess the same things every three years if nothing has changed. This is where the TracDat program provides assistance. Notice that the assessment information required in TracDat does not stop at the level of assessment results. In English, this is where we always stopped in our assessment cycle. This isn’t to say that we did not have discussions about student performance and how faculty sought to increase that performance. What we didn’t have was a systematic evaluation of the data. Which outcomes (or even parts of outcomes) demonstrated competency at a rate much lower than it should have been? Then, what could teachers do to better enable students to meet that outcome? TracDat asks for an analysis of the assessment results. What are the implications of the data? It doesn’t stop there either. TracDat also asks what improvements are planned.
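To make that structure concrete, here is a minimal sketch of those levels as a single record. The class and field names are hypothetical illustrations of the workflow described above, not TracDat’s actual schema:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class OutcomeAssessment:
        """One assessment record for a single course outcome.

        Hypothetical fields mirroring the levels described above:
        results, analysis of those results, planned improvements,
        and (eventually) a follow-up on how the improvements worked.
        """
        outcome: str                      # e.g., "write a documented essay"
        students_assessed: int
        students_meeting_outcome: int
        analysis: Optional[str] = None    # implications of the data
        improvements: List[str] = field(default_factory=list)
        follow_up: Optional[str] = None   # added after improvements are tried

        def competency_rate(self) -> float:
            """Share of assessed students who met the outcome."""
            return self.students_meeting_outcome / self.students_assessed

        def closes_the_loop(self) -> bool:
            """The cycle is complete only when every level is filled in."""
            return bool(self.analysis and self.improvements and self.follow_up)

In this sketch, a record holding results alone (where English always used to stop) reports closes_the_loop() as False; only analysis, improvements, and a follow-up together complete the cycle.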

We need to train ourselves to think more deeply about why students are or are not meeting outcomes and what we can try as a group to improve consistently. In the past, I have walked out of department discussions about student achievement with new ideas. Rarely were they implemented, though. I’m sure many instructors learned from others as well, but until recently, I’m not aware of any group that decided to try something new together since portfolio assessment began in our department. Now we know that we need to analyze the data and come up with a plan to address shortcomings in our students’ abilities to meet our outcomes. But what then? Again, the assessment cycle provides a clue. It wants a follow-up. While a follow-up could mean many things, it is important that we use it with our goals in mind. If we completed an assessment, analyzed the data, and came up with improvements, then the follow-up must report how we implemented those improvements and how effectively they worked. There is more than one way for this to work.

First, the implementation of improvements may take a semester or two. It may take even another semester or two to tweak the improvements in ways that work best. Essentially, we may end up using the years in between comprehensive evaluations to create and adjust those recommendations. Then, when the next comprehensive evaluation comes around three years later, we will be assessing those changes we made in the classroom. That is the cycle--closing the loop, many call it. The cycle isn’t really complete until an assessment of the improvements is made. At the same time, we also need to write about it in the new assessment analysis in order to understand how we cycled through the process: gathering data, analyzing it, making improvements, and then assessing the success of those improvements. All of that must be tied back to the original assessment, even though much of that old assessment work and reporting may have long been completed.

The second method might allow for a quicker follow-up. If there aren’t significant changes, it might be best to implement improvements and assess them long before the next comprehensive assessment. Then the follow-up could be posted in TracDat well before the next comprehensive cycle begins.

This view came to me as I’ve discussed these issues with colleagues. Many are of the belief that we should always assess, assess, assess. I don’t necessarily disagree with that, but I don’t think we do a good job of closing the loop if we continually assess and come up with new issues during the years between the three-year comprehensive assessments. My argument here is that we take the three years to own what we said we would do during the last comprehensive assessment. I work in a department with too many courses for us to work together on everything every year. The most productive way for us to move forward, in my opinion, is to focus on what we say we will do and assess those results. There will be plenty of time to focus on new issues once we have the next comprehensive assessment. This will force us to think more deeply as a department and as instructors about these assessment issues, as we won’t be talking about everything in our outcomes every time we have an assessment, as we have been doing for (believe it or not) decades. So how will your department use its three years?

October 07
Naming a Course Outcome

By Milli Owens

Naming a course outcome isn’t difficult and may be something that you don’t give much thought to.  However, a little thought may help in the long run.  A helpful title is one that is specific enough to describe the outcome, yet general enough to include a variety of assessment plans that might be used with that outcome.

First, a few things not to use in the title:

· Dates: these can restrict the ability to use the outcome for several assessments.

· The number of the outcome: numbers, like “outcome #1,” usually refer to a previous (and out-of-date) tracking method.
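Because these two guidelines are mechanical, a department tracking many outcomes could even check titles automatically. Here is a minimal, hypothetical helper (the function name and patterns are illustrations, not part of any COS system) that flags dates and outcome numbers:

    import re

    def check_outcome_name(name):
        """Flag outcome titles that break the two guidelines above."""
        warnings = []
        # Four-digit years tie the title to a single assessment window.
        if re.search(r"\b(19|20)\d{2}\b", name):
            warnings.append("contains a date")
        # Numbered titles usually point at an out-of-date tracking method.
        if re.search(r"outcome\s*#?\s*\d+", name, re.IGNORECASE):
            warnings.append("uses an outcome number")
        return warnings

    print(check_outcome_name("Outcome #1 (Fall 2012)"))  # both warnings
    print(check_outcome_name("Historical Baskets"))      # []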

The faculty member teaching the course, or at least a faculty member familiar with the course content, would be the ideal person to develop the course outcome name.  They are most familiar with how the topic fits into the course.  For example, if John Weaver were a professor of basket weaving, he would be familiar with the different courses within the discipline of basket weaving.  Let’s look at how he might name some of his course outcomes:

 

Course Title                    | Course Outcome Name | Course Outcome
BSKT 101: Intro to Baskets      | Historical Baskets  | Students will be able to differentiate between modern and historical baskets.
BSKT 20: Early American Baskets | Colonial Baskets    | Students will be able to differentiate between Colonial baskets and Native American baskets.

By looking at the two examples above, you can see how the course outcome names are specific to the course as well as the outcome.  If Dr. Weaver had used the name “Colonial Baskets” for a BSKT 101 outcome, it probably would have been too restrictive.  If he had used the name “Historical Baskets” for a BSKT 20 outcome, it probably would have been too broad.  By using an appropriate title and outcome, Dr. Weaver will probably be able to develop several different methods of assessment to truly analyze his students’ ability to differentiate between basket types as appropriate for the course in which each student is enrolled.

October 02
CurricUNET vs. TracDat

In the flurry of researching, purchasing, and implementing TracDat, including the moving of outcomes from CurricUNET, questions have arisen about the purposes of the two systems.  Is COS replacing CurricUNET with TracDat?

The answer is no!  CurricUNET will continue to be the district’s course management system, the place where faculty develops, processes, and approves courses and programs.  Although the college used CurricUNET as a vehicle for housing outcomes when there was no other electronic option, now that TracDat, a dedicated system for the management and reporting of outcomes and assessments, is in place, all the information and work around outcomes will be housed within it.

If you are doing outcomes/assessment work or need information about that work, go to TracDat!

If you are updating, changing, or building a course or program, go to CurricUNET!

April 12
SLO Info

 

“[Learning Outcomes are the] knowledge, skills, abilities, and attitudes that a student has attained at the end (or as a result) of his or her engagement in a particular set of collegiate experiences.” (Accrediting Commission for Community and Junior Colleges)

   

Student Learning Outcomes (SLOs) focus on the learning that is expected to occur between the beginning and end of a course or program (certificate or degree). 

SLOs are measurable so that the difference between the planned achievement and the learned achievement can be determined. Once outcomes are assessed and discussed among colleagues, faculty adjust their courses or programs to better accomplish the published learning outcomes.

No longer do we primarily ask, “What did the faculty cover in the class or with the student in a student services appointment?” but rather, “What did the student take from the experience?”  In other words, the focus moves away from what the instructor does to what the student learns.  Most importantly, SLOs are a way for faculty to reflect on the outcome of their efforts.

SLOs can be incorporated into an instructor's syllabus, the course outline, and program descriptions. Thus, a student will know what outcomes to expect and to achieve for each course or program.

SLOs are also a way for the public to understand the expected results of taking courses and whether COS students are achieving those expectations.  As such, progress in achieving SLOs is recorded in college-wide documents such as Program Review and Accreditation Reports.

March 19
Good Grief! Where Are We With Outcomes and Assessment at COS?

The National Institute for Learning Outcomes Assessment (NILOA) recently published a paper that is burning through academic listservs. The paper, entitled “From Denial to Acceptance: The Stages of Assessment” and written by Margaret A. Miller (a professor in the Center for the Study of Higher Education at the Curry School of Education at the University of Virginia), constructs an analogy between the development and use of learning outcomes and assessment in higher education and Elisabeth Kübler-Ross’s seminal theory of the stages of grief. The abstract of the paper sets up the initial analogy: “During the initial denial stage, faculty and staff could not understand why assessment was necessary, which led to anger that outside forces were trying to mandate it. However, demands for accountability continued to create pressure for colleges and universities to assess student learning, leading institutions to try bargaining with state officials and regional accreditation agencies. Unflattering national evaluations of American higher education, such as Measuring Up and the Spellings Commission report, propelled many institutions into depression. But eventually, reluctantly, slowly, and unevenly, many institutions came to an acceptance of assessment and its role in higher education.”

Of course, it’s difficult to read the paper and NOT think about the stage that might best describe one’s own campus. Like many in the California Community College system, COS has experienced several stages of grief, and not always in neat order. Certainly there were some years of denial, when the idea of exploring what and how students were learning (not so much what professors were teaching) seemed… well, preposterous! As education gurus came to campus to help explore and create SLOs, many felt that the work might be an interesting academic exercise, but really…who had the time or energy to give this any serious consideration? Outcomes were pretty easy to avoid until they started showing up in places that required thinking about them, such as course outlines and program review. Suddenly, faculty was confronted with describing and measuring what students learned in courses and programs. So the questions began: How to articulate what had always been second nature in the practice of teaching? And why does it have to be done THAT way? Who is going to look at outcomes and how am I supposed to know if these are “right”? Confronting the initial wave of outcome creation resulted in frustration, resentment and…yes, anger.

In what stage does COS find itself presently? There’s still plenty of anger, particularly once it became clear that outcomes were here to stay for much, much longer than had been hoped. Now they show up as a major part of accreditation work, and, chillingly, 15 of the 21 college campuses currently on sanction were found deficient in using outcome assessment results in planning. Understanding that the stakes have now been raised to levels that no longer allow mere anger, COS finds itself at the stage of bargaining. As COS winds down a year of self-study in advance of an accreditation visit by ACCJC, it’s clear that the college won’t quite meet the proficiency level of accomplishments in ACCJC’s SLO rubric, which requires: (1) student learning outcomes and authentic assessment are in place for courses, programs and degrees; (2) there is widespread institutional dialogue about the results of assessment and identification of gaps; and (3) decision-making includes dialogue on the results of assessment and is purposefully directed toward aligning institution-wide practices to support and improve student learning, among other elements. However, the Accreditation Steering Committee has struck a bargain with ACCJC trainers to begin widespread outcomes assessment, starting with the assessment of one outcome per course per year. This first step will help to build the culture of assessment, allowing it to grow to full compliance and a functioning system of assessment, reporting and decision-making at all levels of instruction. The enormity of that task has some faculty already in a depressive funk. The new experience of completing one cycle of assessment for a single course outcome demonstrates the additional hours and work it will take to multiply that effort for all courses, programs and outcomes. Nevertheless, given not only the state/regional pressures to create a climate for and complete assessment work, but also on-going federal mandates for accountability in higher education, some years hence, we’ll likely wonder what the fuss was all about as the habits of assessment gain acceptance and become part of the academic routine.

Until those halcyon days arrive, however, there will surely be more “grieving” of the processes and purposes of outcomes and assessment. C. S. Lewis noted, “No one ever told me that grief felt so like fear,” a nexus that appropriately describes the feelings of many as they approach this work. Changing the culture of an institution is not for the faint of heart and requires commitment and energy from everyone.

January 31
Outcomes blog

Great idea! I think a blog will provide a space to share, as well as research, outcomes assessment experiences at COS.

January 30
Have an OUTBURST!

Welcome to the Outcomes and Assessment Committee’s blog!  It is our hope that you’ll use this space to find information about outcomes and assessment work, post questions, join a conversation or find a model of outcome creation or assessing learning that works for you.  As outcomes develop at COS, there is a rich tapestry of experience and thinking growing around the processes and findings.  We’ll share some of those with you and hope you’ll find this space a useful support in all the important work you do!

January 10
Using Windows Live Writer

 

Using the Windows Live Writer application to add a blog post is extremely easy!  I first downloaded the product from the Live Essentials download site.  There are several other products that come along for the ride, but I was only interested in using the Live Writer application.

Once the product was downloaded and installed (by the way, less than a minute for this operation), I opened the product and was presented with a choice of “blog providers.”  I selected the SharePoint option and was then asked for the URL of the SharePoint blog provider (in this test case, I entered http://www.cos.edu/testblog). After some churning, downloading, and syncing, a blank editing screen appeared.

Simple, easy, cheap, and it works.  Tomorrow I will try the product on a Mac (we live in a multicultural world).
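For anyone who would rather script a post than install a client, Windows Live Writer talks to SharePoint-era blogs over the standard MetaWeblog XML-RPC API, so a few lines of Python can do the same job. The endpoint path, blog id, and credentials below are assumptions for illustration; verify them against your own site before relying on this sketch:

    import xmlrpc.client

    # Assumed endpoint: SharePoint commonly exposes MetaWeblog at
    # <site>/_layouts/metaweblog.aspx, but confirm this for your install.
    ENDPOINT = "http://www.cos.edu/testblog/_layouts/metaweblog.aspx"

    server = xmlrpc.client.ServerProxy(ENDPOINT)
    post = {
        "title": "Posted without Live Writer",
        "description": "<p>Body HTML goes here.</p>",
    }
    # Standard MetaWeblog call: newPost(blogid, username, password, struct, publish).
    # The blog id and credentials are placeholders.
    post_id = server.metaWeblog.newPost("testblog", "USERNAME", "PASSWORD", post, True)
    print("Created post", post_id)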

 

About this blog

Welcome to the Outcomes & Assessment blog!  This blog supports the work of COS faculty in learning outcomes and assessment.

O&A Committee

Co-Chairs:
Dr. Robert Urtecho
Joni Jordan

Faculty:
Milli Owens
Greg Turner
Marla Prochnow
Julie Rodriguez
Jeanne Draper
Anne Morrow

Research:
Dr. Mehmet Ozturk

Technology:
Tim Hollabaugh