Tuesday, August 15, 2006

Nuts & Bolts Aug. 2006 ARCHIVE

NUTS & BOLTS

An electronic assessment newsletter
Springfield College in Illinois
-----------------------------------------
August 2006
Vol. 7 No. 1
-----------------------------------------
Editor's Note. Until I can get access to SCI's new
assessment website, I will publish the
newsletter by email and archive current issues on an
interim basis on my personal weblog at
http://www.teachinglogspot.blogspot.com/ ... SCI’s
Common Student Learning Objectives and instructors’
guide "Classroom Assessment for Continuous
Improvement" are linked to SCI’s homepage at
http://www.sci.edu. [One change has been made in the archived copy since the original was emailed Aug. 14, to reflect a change in schedule. The third workshop will now be from 2:30 to 3:30 p.m. Tuesday, Sept. 12.]

* * *

CLASSROOM ASSESSMENT WORKSHOPS IN SEPTEMBER

Next month new (and not-so-new) instructors are
invited to a workshop at which I will briefly explain
how SCI’s Common Student Learning Objectives (SLOs)
were derived from our mission statement, and how
course-based SLOs relate to daily lessons and
assignments. I will also help instructors choose
Classroom Assessment Techniques appropriate to the
SCI and/or Benedictine University mission statement
and to the goals, objectives and outcomes of the
courses they teach. The workshop, in the Resource
Center on the lower level of SCI's Becker Library,
will be offered at three times:
(1) Thursday, Sept. 7, from 5:30 to 6:30 p.m.;
(2) Monday, Sept. 11, from 5:30 to 6:30 p.m.; and
(3) Tuesday, Sept. 12, from 2:30 to 3:30 p.m.
Attendance is voluntary, and all interested SCI and
Benedictine University faculty are welcome to
participate.

Here's a four-point summary of SCI’s philosophy of
assessment, prepared for a recent meeting of teachers
in the “Triple A” or adult accelerated associate’s
degree program. That’s 10 fewer points than W. Edwards
Deming, the management guru whose theories are
reflected in our learning outcomes assessment program
at SCI. But assessment isn’t rocket science – I want
to keep it simple.

1. Assessment is externally mandated, but it can be a
valuable part of what we do in the classroom. Let’s be
blunt about it. In higher ed no less than in the
public schools, we are mandated by outside
stakeholders – mostly the state and federal
governments – to do assessment. It’s part of the
political demand for “accountability” that gave us the
No Child Left Behind Act at the K-12 level, and this
summer some of us are nervously watching a blue-ribbon
federal commission as it debates ways of politicizing
higher ed as well. But assessment isn’t rocket science
– I define it as nothing more than using several
different ways of finding out what our students learn.
Some are embedded in work they do for grades; others
aren’t. But they can all help us teach better. At SCI,
we have designed an assessment program that addresses
accountability to outside stakeholders mostly at the
institutional level, by requiring standardized tests
of our sophomores and making sure our course offerings
and objectives square with the statewide Illinois
Articulation Initiative. That leaves our classroom
teachers free to assess student learning outcomes
(which basically means what the students learn) to
improve our teaching over the course of the semester –
when there’s still time to plug the assessment results
back into our planning processes.

2. Classroom assessment at SCI is “formative,” which
means we use the results immediately to improve our
teaching. There are several highly effective classroom
assessment techniques (known as CATs for short). One
that many of us like is the “one-minute paper.” At the
end of class, we’ll have the students write briefly on
two questions designed to get at what they learned:
(1) What was the clearest point in tonight’s class?
(2) What was the most confusing point? I think it’s
very useful. It’s humbling when I realize I led my
students off on a tangent when some off-the-cuff
remark keeps showing up as the clearest point, but
it’s good to know so I can get us all back on track
the following week. And I know to clear up the most
confusing point while I’m at it. This approach is what
educators call “formative assessment.” Carol Boston of
the University of Maryland defines it as the
“diagnostic use of assessment to provide feedback to
teachers and students over the course of instruction.”
It’s what we stress at SCI.

3. Classroom assessment is not rocket science, but it
is grounded in the scientific method of testing and
refining our data. Most of our classes are too small
for us to attempt statistical analysis with any rigor.
So classroom assessment, at least at SCI, is more an
art than a science. There’s a quote I like from Peter
Ewell, one of the pioneers in learning outcomes
assessment, on the Southern Illinois University
Edwardsville classroom assessment website: “Why do we insist on
measuring it with a micrometer when we mark it with
chalk and cut it with an axe?” My answer: We don’t
try for micrometer precision, but we do learn how to
heft an axe. That SIUE
website, by the way, is one of the best places to
start learning about CATs, and I recommend it highly.
I also recommend our instructors’ guide, Classroom
Assessment for Continuous Improvement, available as a
PDF document on the SCI website. I like it partly
because I wrote it. But it shares some good ideas from
other SCI instructors, and I think it explains the
philosophy behind assessment at SCI. It’s called
planning for continuous improvement, and it boils down
to a four-step process: (1) Plan something, a lesson
or a course; (2) Do it, at least get it started and
measure its interim success; (3) Study the data from
those measurements; and (4) Act or adjust your
procedures in light of your analysis of the data. The
idea is borrowed from industrial management, where
it’s known as a PDSA cycle, but behind it is nothing
more complicated than the scientific method. Most
important, it works in the classroom as well as it
does on the shop floor.

4. Our organizational culture at SCI is receptive to
assessment, and you can find plenty of help. Just ask
us. Your syllabi, for example, are full of numbers and
letters that relate the goals and objectives of
individual courses to the SCI mission statement. They
may be puzzling at first! It’s a new system, and we’re
still working out kinks. But most of us are adapting
to it. So we’ll be able to help you figure out what
all the letters and numbers mean, and how they relate
to what you do in the classroom. But we’ll also be
very sympathetic. We’ve been puzzled ourselves.
Sometimes I still get confused! But I take comfort in
point No. 3 above: Assessment isn’t rocket science, it
involves a continuous learning process and it takes
time to master. Ask your division chairs for help. Or
please feel free to contact me. I’m easiest to reach
by email … at pellertsen@sci.edu.

FEDERAL TESTING MANDATE?

A federal Commission on the Future of Higher Education
that threatens to change the way we do assessment at
SCI, and everywhere else in higher ed, appears to have
backed off on its most extreme proposals for a “one
size fits all” federally mandated standardized testing
program. The commission, chaired by Bush
administration insider Charles Miller of Texas,
adopted its final report this month. An Associated
Press story sums it up like this:

“A national commission charged with
plotting the future of American higher education
approved its final recommendations Thursday, calling
on the government to provide more aid based on
financial need, while telling colleges to be more
accountable for what students learn.

A commission member representing nonprofit colleges
declined to sign on, however, saying the report
reflected too much of a "top down" approach to reform.

The report, which will be delivered to Education
Secretary Margaret Spellings in final form next month,
recommends that the federal government consolidate its
more than 20 financial aid programs and ensure that
Pell Grants - the main aid program for low-income
students - cover at least 70 percent of in-state
tuition costs. In 2004-2005, the grants covered less
than half.

But it says that colleges should do more to hold down
costs, and to better measure what students learn.
The 19-member commission, created by Spellings, has no
direct power, but has been closely watched by
policy-makers. Because of its diverse membership -
industry, government and for-profit and traditional
colleges are represented - any recommendations all
members agreed on would carry substantial weight as
Congress, the White House and state governments
consider education measures in the future.” (Pope)


The vote was 18-1. David Ward, president of the
American Council on Education, was the dissenting
member. Associated Press education reporter Justin
Pope noted that Ward “was the primary voice of
traditional colleges on the commission, and his
refusal to sign on could dilute the report's
influence.”

In the meantime, a snippet tucked into a report in
the online newsletter Inside Higher Ed suggests a
partial retreat from commission chair Charles Miller's
insistence on a uniform national standardized testing
regimen. It also suggests testing will be one of the
footballs the politicians plan to kick around. The
snippet reads as follows:

Speaking to reporters after the vote,
Miller said his preference would be for “the academy
[itself] to address” the changes called for in the
report, and as evidence of his desire not to impose
mandates on higher education, he noted that the report
the commission approved Thursday had dropped language
(which was in last week’s draft) that called for
states to require public institutions to measure
student learning using a set of tests and other
measures. (The new language, which college leaders
pushed hard for in the last few days, just says that
“higher education institutions should measure student
learning using....")

If higher education is “not responsive to change” and
“doesn’t have a strategic vision,” Miller predicted,
then “things are going to be mandated.”
(Lederman)


I want to see the final draft before I try to read too
much into this. But I think it may be a hopeful sign
that whatever new testing regimen emerges from all this
won't be too intrusive. A fuller discussion is posted
to my "teaching b/log" at
http://teachinglogspot.blogspot.com/

-- Pete Ellertsen, editor, Nuts & Bolts

Works Cited

Boston, Carol. “The concept of formative assessment.”
Practical Assessment, Research & Evaluation 8.9
(2002). 7 Aug. 2006.
http://pareonline.net/getvn.asp?v=8&n=9

Classroom Assessment for Continuous Improvement: A
Guide for Instructors. SCI. 2005. 7 Aug. 2006. PDF
file linked to http://www.sci.edu/assessment-site.htm

“Classroom Assessment Techniques.” Southern Illinois
University Edwardsville. n.d. 7 Aug. 2006.
http://www.siue.edu/~deder/assess/catmain.html

Lederman, Doug. “18 Yesses, 1 Major No.” Inside Higher
Ed 11 Aug. 2006. 14 Aug. 2006.
http://www.insidehighered.com/news/2006/08/11/commission

Pope, Justin. “Higher Education Report Gets OK.”
Seattle Post-Intelligencer 10 Aug. 2006. 14 Aug. 2006.
http://seattlepi.nwsource.com/national/1110AP_Higher_Education_Commission.html
