Thursday, August 31, 2006

Nuts & Bolts Sept. 2006 ARCHIVE

NUTS & BOLTS

An electronic assessment newsletter
Springfield College in Illinois
-----------------------------------------
September 2006
Vol. 7 No. 2
-----------------------------------------
Editor's Note. Since I am still unable to post to
SCI's assessment website, I am publishing the
newsletter by email and archiving it in the interim on my
personal weblog at http://www.teachinglogspot.blogspot.com/ ... if
representatives of the North Central Association, the
Illinois Board of Higher Education or other outside
stakeholders wish to see SCI's annual Assessment
Report for 2005-2006 or other current information
regarding our assessment program, please direct them
to my personal blog. -- Pete Ellertsen, assessment
chair

* * *

CLASSROOM ASSESSMENT WORKSHOP

September's issue of Nuts & Bolts is coming out a day
early (Aug. 31) to catch faculty members before the
last minute leading up to the Labor Day weekend. (OK,
OK, it's the *next* to last minute.) That's to give
you more time to plan on attending one of the workshop
sessions on Classroom Assessment Techniques we'll
conduct in the next couple of weeks.

The workshop, in the Resource Center on the
lower level of SCI's Becker Library, will be offered
at three times:

(1) Thursday, Sept. 7, from 5:30 to 6:30 p.m.;
(2) Monday, Sept. 11, from 5:30 to 6:30 p.m.; and
(3) Tuesday, Sept. 12, from 2:30 to 3:30 p.m.

Attendance is voluntary, and all interested SCI and
Benedictine University faculty are welcome. We'll talk
about how to find assessment techniques that are
appropriate to the learning goals and objectives in
our syllabi, and I'll show interested instructors some
aids on the World Wide Web that I've found helpful. I
expect the sessions will be small and informal.

The workshop would be especially valuable for new
instructors who may not have experience with CATs (as
classroom assessment techniques are called), but
seasoned instructors are welcome to exchange ideas and
share their experience and insights as well.

MORE POLITICS

In a move that surprised many, U.S. Education
Secretary Margaret Spellings has announced four
hearings nationwide on whether recommendations of the
federal Commission on the Future of Higher Education
"can be put in place through federal regulation"
through a procedure known as the negotiated
rule-making process. One area up for review, according
to an Aug. 18 announcement in the Federal Register, is
accreditation.

No one in higher education is able to say exactly what
the Department wants to do about accreditation,
according to Doug Lederman of the online newsletter
Inside Higher Ed:

"The Spellings commission’s report takes broad shots
at the perceived ineffectiveness and dysfunction of
the system of voluntary regional and national
accreditation, but offers relatively few firm
proposals for transforming it. So while there are no
obvious changes in accreditation that might emerge
from regulatory negotiations, the department could see
itself as having broad latitude to impose new
requirements on accreditors and, in turn, on colleges,
some observers speculate."

While staff-written working papers presented to the
commission in the spring and early drafts of its
report were quite acrimonious about accreditation,
along with other perceived failures in higher ed, the
hostile tone had largely dropped out of the final
draft, which was approved early in August. Along with
the hostility, most references to accreditation were
moved to the back burner.

In an article in the current issue of The Chronicle of
Higher Education, Kelly Field catches the uncertainty
that has greeted the commission's latest regulatory
tack. Field also ties the issue to President Bush's
larger political agenda:

"... some college lobbyists still wondered why an
administration that had shown little interest in
higher education during its first term was suddenly so
concerned with its future. Some speculated that the
administration was trying to divert attention from its
unpopular No Child Left Behind Act, the 2002 law that
imposed testing on the nation's elementary and
secondary schools; others suspected that it was
seeking to extend that law's reach into the college
classroom.

"To the suspicious, the secretary's choice of a
commission chairman seemed proof of a plot to
institute standardized testing at colleges. Charles
Miller, a millionaire investor and close friend of
both Ms. Spellings and President Bush, was best known
for devising a Texas public-school accountability
system that became the model for No Child Left Behind.
He was also associated with accountability testing at
the University of Texas System, where he led the Board
of Regents from 2001 to 2004."

Field adds:

"The swiftness of the secretary's response took some
college lobbyists by surprise. They said the
administration's announcement, which appeared in the
August 18 edition of the Federal Register, signaled
that the secretary did not want to lose any momentum
for change created by the commission's deliberations.

"So far, the administration has given few clues about
which recommendations it might consider as part of the
negotiated rule making — a process by which federal
agencies work with affected parties as regulations are
drafted."

All of this may well affect us at SCI and Benedictine,
because assessment is a politically driven process and
its political underpinnings may be changing.

Hearings will be held in California, Florida,
Washington, D.C., and Chicago. The Chicago hearing
will be Oct. 19 at Loyola University.

Works Cited

Field, Kelly. "Uncertainty Greets Report on Colleges
by U.S. Panel." Chronicle of Higher Education Sept. 1,
2006. http://chronicle.com/free/v53/i02/02a00101.htm

Lederman, Doug. "Regulatory Activism?" Inside Higher
Ed Aug. 21, 2006.
http://www.insidehighered.com/news/2006/08/21/regs

Nassirian, Barmak. "U.S. Department of Education
Formally Plans for 'Negotiated Rulemaking'." AACRAO
Transcript [American Association of Collegiate
Registrars and Admissions Officers] Aug. 30, 2006. http://www.aacrao.org/transcript/index.cfm?fuseaction=show_view&doc_id=3294

Tuesday, August 15, 2006

Nuts & Bolts Aug. 2006 ARCHIVE

NUTS & BOLTS

An electronic assessment newsletter
Springfield College in Illinois
-----------------------------------------
August 2006
Vol. 7 No. 1
-----------------------------------------
Editor's Note. Until I can get access to SCI's new
assessment website, I will publish the
newsletter by email and archive current issues on an
interim basis on my personal weblog at
http://www.teachinglogspot.blogspot.com/ ... SCI’s
Common Student Learning Objectives and instructors’
guide "Classroom Assessment for Continuous
Improvement" are linked to SCI’s homepage at
http://www.sci.edu. [One change has been made in the archived copy since the original was emailed Aug. 14, to reflect a change in schedule. The third workshop will now be from 2:30 to 3:30 p.m. Tuesday, Sept. 12.]

* * *

CLASSROOM ASSESSMENT WORKSHOPS IN SEPTEMBER

Next month new (and not-so-new) instructors are
invited to a workshop at which I will briefly explain
how SCI’s Common Student Learning Objectives (SLOs)
were derived from our mission statement; and how
Course Based SLOs relate to daily lessons and
assignments. I will assist instructors in choosing
Classroom Assessment Techniques appropriate to the
SCI and/or Benedictine University mission statement
and the goals, objectives and outcomes in the courses
you teach. The workshop, in the Resource Center on the
lower level of SCI's Becker Library, will be offered
at three times:
(1) Thursday, Sept. 7, from 5:30 to 6:30 p.m.;
(2) Monday, Sept. 11, from 5:30 to 6:30 p.m.; and
(3) Tuesday, Sept. 12, from 2:30 to 3:30 p.m.
Attendance is voluntary, and all interested SCI and
Benedictine University faculty are welcome to
participate.

Here's a four-point summary of SCI’s philosophy of
assessment, prepared for a recent meeting of teachers
in the “Triple A” or adult accelerated associate’s
degree program. That’s 10 fewer points than W. Edwards
Deming, the management guru whose theories are
reflected in our learning outcomes assessment program
at SCI. But assessment isn’t rocket science – I want
to keep it simple.

1. Assessment is externally mandated, but it can be a
valuable part of what we do in the classroom. Let’s be
blunt about it. In higher ed no less than in the
public schools, we are mandated by outside
stakeholders – mostly the state and federal
governments – to do assessment. It’s part of the
political demand for “accountability” that gave us the
No Child Left Behind Act at the K-12 level, and this
summer some of us are nervously watching a blue-ribbon
federal commission as it debates ways of politicizing
higher ed as well. But assessment isn’t rocket science
– I define it as nothing more than using several
different ways of finding out what our students learn.
Some are embedded in work they do for grades; others
aren’t. But they can all help us teach better. At SCI,
we have designed an assessment program that addresses
accountability to outside stakeholders mostly at the
institutional level, by requiring standardized tests
of our sophomores and making sure our course offerings
and objectives square with the statewide Illinois
Articulation Initiative. That leaves our classroom
teachers free to assess student learning outcomes
(which basically means what the students learn) to
improve our teaching over the course of the semester –
when there’s still time to plug the assessment results
back into our planning processes.

2. Classroom assessment at SCI is “formative,” which
means we use the results immediately to improve our
teaching. There are several highly effective classroom
assessment techniques (known as CATs for short). One
that many of us like is the “one-minute paper.” At the
end of class, we’ll have the students write briefly on
two questions designed to get at what they learned:
(1) What was the clearest point in tonight’s class?
(2) What was the most confusing point? I think it’s
very useful. It’s humbling when I realize I led my
students off on a tangent when some off-the-cuff
remark keeps showing up as the clearest point, but
it’s good to know so I can get us all back on track
the following week. And I know to clear up the most
confusing point while I’m at it. This approach is what
educators call “formative assessment.” Carol Boston of
the University of Maryland defines it as the
“diagnostic use of assessment to provide feedback to
teachers and students over the course of instruction.”
It’s what we stress at SCI.

3. Classroom assessment is not rocket science, but it
is grounded in the scientific method of testing and
refining our data. Most of our classes are too small
for us to attempt statistical analysis with any rigor.
So classroom assessment, at least at SCI, is more an
art than a science. There’s a quote I like from Peter
Ewell, one of the pioneers in learning outcomes
assessment, on the Southern Illinois University
Edwardsville classroom assessment website: “Why do we
insist on measuring it with a micrometer when we mark
it with chalk and cut it with an axe?” My answer: We don’t
try, but we do learn how to heft an axe. That SIUE
website, by the way, is one of the best places to
start learning about CATs, and I recommend it highly.
I also recommend our instructors’ guide, Classroom
Assessment for Continuous Improvement, available as a
PDF document on the SCI website. I like it partly
because I wrote it. But it shares some good ideas from
other SCI instructors, and I think it explains the
philosophy behind assessment at SCI. It’s called
planning for continuous improvement, and it boils down
to a four-step process: (1) Plan something, a lesson
or a course; (2) Do it, at least get it started and
measure its interim success; (3) Study the data from
those measurements; and (4) Act or adjust your
procedures in light of your analysis of the data. The
idea is borrowed from industrial management, where
it’s known as a PDSA cycle, but behind it is nothing
more complicated than the scientific method. Most
important, it works in the classroom as well as it
does on the shop floor.
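
For readers who like to see a process spelled out step by step, the PDSA cycle can be sketched as a simple loop. The sketch below is purely illustrative; the function names and the sample one-minute-paper topics are hypothetical, not part of any SCI system or guide.

```python
# Illustrative sketch only: the Plan-Do-Study-Act (PDSA) cycle as a loop.
# The function names and sample "one-minute paper" findings below are
# hypothetical; they are not part of any SCI process or document.

def pdsa_cycle(plan, do, study, act, rounds=3):
    """Run several improvement rounds, feeding each round's findings
    back into the next round's plan."""
    findings = None
    for _ in range(rounds):
        lesson = plan(findings)    # 1. Plan (adjusted for prior findings)
        results = do(lesson)       # 2. Do it, and measure interim success
        findings = study(results)  # 3. Study the measurement data
        act(findings)              # 4. Act: adjust procedures accordingly
    return findings

# Toy walk-through: each round, the "most confusing point" flagged by the
# one-minute papers gets re-taught and drops off the list.
confusing = ["thesis statements", "MLA citations", "comma splices"]

def plan(findings):
    return findings or confusing[0]          # re-teach what was flagged

def do(lesson):
    if lesson in confusing:
        confusing.remove(lesson)             # the re-taught point clears up
    return confusing

def study(results):
    return results[0] if results else None   # next most confusing point

def act(findings):
    pass                                     # e.g., revise next week's plan

pdsa_cycle(plan, do, study, act, rounds=3)   # after 3 rounds, list is empty
```

The point of the sketch is the feedback loop: each round’s findings become the input to the next round’s plan, which is exactly what the one-minute paper does for the following week’s lesson.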

4. Our organizational culture at SCI is receptive to
assessment, and you can find plenty of help. Just ask
us. Your syllabi, for example, are full of numbers and
letters that relate the goals and objectives of
individual courses to the SCI mission statement. They
may be puzzling at first! It’s a new system, and we’re
still working out kinks. But most of us are adapting
to it. So we’ll be able to help you figure out what
all the letters and numbers mean, and how they relate
to what you do in the classroom. But we’ll also be
very sympathetic. We’ve been puzzled ourselves.
Sometimes I still get confused! But I take comfort in
point No. 3 above: Assessment isn’t rocket science, it
involves a continuous learning process and it takes
time to master. Ask your division chairs for help. Or
please feel free to contact me. I’m easiest to reach
by email … at pellertsen@sci.edu.

FEDERAL TESTING MANDATE?

A federal Commission on the Future of Higher Education
that threatens to change the way we do assessment at
SCI, and everywhere else in higher ed, appears to have
backed off on its most extreme proposals for a “one
size fits all” federally mandated standardized testing
program. The commission, chaired by Bush
administration insider Charles Miller of Texas,
adopted its final report this month. An Associated
Press story sums it up like this:

“A national commission charged with
plotting the future of American higher education
approved its final recommendations Thursday, calling
on the government to provide more aid based on
financial need, while telling colleges to be more
accountable for what students learn.

“A commission member representing nonprofit colleges
declined to sign on, however, saying the report
reflected too much of a "top down" approach to reform.

“The report, which will be delivered to Education
Secretary Margaret Spellings in final form next month,
recommends that the federal government consolidate its
more than 20 financial aid programs and ensure that
Pell Grants - the main aid program for low-income
students - cover at least 70 percent of in-state
tuition costs. In 2004-2005, the grants covered less
than half.

“But it says that colleges should do more to hold down
costs, and to better measure what students learn.
The 19-member commission, created by Spellings, has no
direct power, but has been closely watched by
policy-makers. Because of its diverse membership -
industry, government and for-profit and traditional
colleges are represented - any recommendations all
members agreed on would carry substantial weight as
Congress, the White House and state governments
consider education measures in the future.”
(Pope)


The vote was 18-1. David Ward, president of the
American Council on Education, was the dissenting
member. Associated Press education reporter Justin
Pope noted that Ward “was the primary voice of
traditional colleges on the commission, and his
refusal to sign on could dilute the report's
influence.”

In the meantime, a snippet tucked into a report in
the online newsletter Inside Higher Ed suggests a
partial retreat from commission chair Charles Miller's
insistence on a uniform national standardized testing
regimen. It also suggests testing will be one of the
footballs the politicians plan to kick around. The
snippet reads as follows:

Speaking to reporters after the vote,
Miller said his preference would be for “the academy
[itself] to address” the changes called for in the
report, and as evidence of his desire not to impose
mandates on higher education, he noted that the report
the commission approved Thursday had dropped language
(which was in last week’s draft) that called for
states to require public institutions to measure
student learning using a set of tests and other
measures. (The new language, which college leaders
pushed hard for in the last few days, just says that
“higher education institutions should measure student
learning using....")

If higher education is “not responsive to change” and
“doesn’t have a strategic vision,” Miller predicted,
then “things are going to be mandated.”
(Lederman)


I want to see the final draft before I try to read too
much into this. But I think it may be a hopeful sign
that whatever new testing regimen emerges from all
this won't be too intrusive. A fuller discussion is posted
to my "teaching b/log" at
http://teachinglogspot.blogspot.com/

-- Pete Ellertsen, editor, Nuts & Bolts

Works Cited

Boston, Carol. “The concept of formative assessment.”
Practical Assessment, Research & Evaluation 8.9
(2002). 7 Aug. 2006.
http://pareonline.net/getvn.asp?v=8&n=9

Classroom Assessment for Continuous Improvement: A
Guide for Instructors. SCI. 2005. 7 Aug. 2006. PDF
file linked to http://www.sci.edu/assessment-site.htm

“Classroom Assessment Techniques.” Southern Illinois
University Edwardsville. n.d. 7 Aug. 2006.
http://www.siue.edu/~deder/assess/catmain.html

Lederman, Doug. “18 Yesses, 1 Major No.” Inside Higher
Ed 11 Aug. 2006. 14 Aug. 2006.
http://www.insidehighered.com/news/2006/08/11/commission

Pope, Justin. “Higher Education Report Gets OK.”
Seattle Post-Intelligencer 10 Aug. 2006. 14 Aug. 2006.
http://seattlepi.nwsource.com/national/1110AP_Higher_Education_Commission.html

Monday, August 14, 2006

Miller panel backs off on testing?

Now that the U.S. Education Department's blue-ribbon Commission on the Future of Higher Education has approved a draft report, there will be plenty of time to look at its implications for testing and assessment. It goes to Education Secretary Margaret Spellings next month, and then it is expected to be threshed out in a political process involving any number of government, industry and, hopefully, educational stakeholders.

In the meantime, a snippet tucked into a report in the online newsletter Inside Higher Ed suggests a partial retreat from commission chair Charles Miller's insistence on a uniform national standardized testing regimen. It also suggests testing will be one of the footballs the politicians plan to kick around. The snippet reads as follows:
Speaking to reporters after the vote, Miller said his preference would be for “the academy [itself] to address” the changes called for in the report, and as evidence of his desire not to impose mandates on higher education, he noted that the report the commission approved Thursday had dropped language (which was in last week’s draft) that called for states to require public institutions to measure student learning using a set of tests and other measures. (The new language, which college leaders pushed hard for in the last few days, just says that “higher education institutions should measure student learning using....")

If higher education is “not responsive to change” and “doesn’t have a strategic vision,” Miller predicted, then “things are going to be mandated.”
I want to see the final draft before I try to read too much into this. But I think it may be a hopeful sign that whatever new testing regimen emerges from all this won't be too intrusive.

SCI annual assessment report / ARCHIVE

ANNUAL ASSESSMENT REPORT
Springfield College in Illinois
Academic Year 2005-2006


* * *

Editor’s Note. Since SCI went over to a new website in
July 2006, I have been unable to access the assessment
portion of the website. Until the remaining technical
bugs can be worked out, I am archiving current
reports, newsletters and other postings relating to
student learning outcomes assessment at SCI on my
personal weblog at
http://www.teachinglogspot.blogspot.com/ -- Peter
Ellertsen, chair, Assessment Committee.

* * *

Because SCI was reaccredited during the 2005-2006
academic year, the Assessment Committee’s activities
were heavily influenced by the site visit for
reaccreditation purposes that took place in November
2005. Before the visit, the committee’s focus was on
getting ready for the site visit; afterward, its focus
shifted to the preliminary stages of planning to
maintain and further develop elements of the college’s
Assessment Plan as adopted in 1996, amended in 2001
and implemented in the years since.

A key part of the plan, and one that received a great
deal of attention as it was initiated over the summer
and fall terms of 2005, was the implementation of a
new syllabus format built around the Common Student
Learning Objectives (CSLOs) adopted at a faculty
workshop in December 2004 and derived from SCI’s
stated mission of preparing students for lives of
“learning, leadership and service in a diverse world.”
The new format carries the CSLOs into Course-Based
Student Learning Objectives (CBSLOs), into individual
assignments and assessment activities by individual
instructors, and into the college’s program and
institutional effectiveness assessment programs.
Beginning in the fall semester,
all syllabi submitted to the Office of the Dean of
Academic Affairs from traditional and adult
accelerated associate’s level courses have followed the
new format, and workshops were held in the summer and fall
of 2005 to help instructors follow the new format and
choose Classroom Assessment Techniques that will help
them perform both formative assessment during the
course of the semester and summative assessment at
semester’s end in a cycle of continuous improvement of
classroom instruction. In addition, the chair of the
Assessment Committee wrote a 45-page booklet entitled
"Classroom Assessment for Continuous Improvement." It
was given to workshop attendees in summer of 2005 and
posted to the college’s website at www.sci.edu as a
PDF document. Additional workshops for new faculty are
scheduled in September 2006.

During the site visit in November, members of the
Assessment Committee were informed verbally that
members of the site visit team were favorably
impressed with the degree to which SCI has developed
an organizational culture that is receptive to
assessment, and this impression was repeated in the
written report issued in December and formalized in
June 2006 (please see below for details). At the same
time, members of the site visit panel made it clear in
verbal communication that SCI’s progress to date
is expected to continue as the 1996/2001 Assessment
Plan is fleshed out and further implemented. Along
with the accolade came what members of the Assessment
Committee interpreted as further marching orders.

After the site visit, the Committee’s focus shifted
toward maintenance of ongoing parts of the Assessment
Plan and planning toward expansion of the assessment
program as the Plan is further implemented. Program
assessment continued apace, as members of the
Assessment Committee continued to develop a matrix
showing which CSLOs and CBSLOs are reflected in
General Education courses and worked with outside
stakeholders in the evaluation and improvement of
curricula, particularly with regard to science.
Standardized tests from ACT Inc. were purchased and
administered at the end of March, and efforts began to
study and interpret the results over time; the reading
module has been administered since 2003, and a math
test was added this year. The small size of
SCI’s student population makes it imperative that data
accrue over time, and that they be interpreted
carefully since the data pool is only beginning to be
large enough, at least in the case of reading scores,
for valid statistical analysis. Details are reported
below.

Priorities for the coming 2006-2007 academic year will
be set by the committee in its September and October
meetings. It is expected that they will continue to
focus on fuller implementation of the 1996/2001
Assessment Plan, especially with regard to program
assessment, further efforts to reflect specific parts
of the mission statement and CSLOs in classroom
assessment of individual lessons and assignments, and
the completion of feedback loops and other
communication of learning outcomes data throughout the
college so these data can be consciously utilized in
decision-making processes.

Accreditation



After a site visit in November, the Higher Learning
Commission of the North Central Association of
Colleges and Schools formally renewed Springfield
College in Illinois' accreditation for 10 years. The
NCA site visit team's Comprehensive Evaluation report
said its inspection "confirm[ed] the institution's
capacity and responsibility to identify and address
issues," including a good half dozen issues of
long-standing concern to the accrediting body. The
panel noted that SCI's partnership with Benedictine
University was a crucial factor in granting continued
accreditation. After noting "major improvements since
the inception of the partnership," it reported in its
summary of findings:
The faculty and staff are qualified, dedicated, and
hopeful; in addition, recently hired staff are
bringing new perspectives to the institution. It was
clear from discussions with the Benedictine University
President and the Chair of its Board of Trustees, that
Benedictine is fully committed to the partnership.
With the University's leadership and its advantageous
presence in the state capital, Springfield College
should be able to continue to fulfill its mission.
While the team anticipates growing pains in relation
to the partnership, it should be possible to overcome
them. In short, it seems clear that Springfield
College in Illinois, in partnership with Benedictine
University, is now a viable institution with prospects
for a positive future.

Regarding assessment, the site visit committee
reported, “It is clear from considerable
documentation and a variety of personal conversations
that SCI has made considerable progress in creating a
culture of assessment on campus, with a specific focus
on classroom-level assessment.” The following evidence
was cited:
1. The development of common student learning outcomes
across the curriculum [citation omitted].
2. The requirement that each faculty member declare
the methods of course assessment as part of each
course syllabus.
3. The requirement that each faculty member submit an
end-of-course assessment report to the Dean which
identifies specific classroom assessment techniques
used, the findings from those assessments, and action
taken [citation omitted].
4. Course syllabi in both the traditional two-year
program and the accelerated degree program routinely
list objectives related to either Common Student
Learning Objectives or Course-Based Learning
Objectives identified by the College and explicitly
related to the College's mission.
5. Testimony from students indicated that faculty use
classroom assessment techniques daily and that these
assessments result in clear changes in classes.
6. Interviews with a number of faculty demonstrate a
high degree of awareness of the assessment effort, and
a desire to use that process to improve
classroom-level instruction.


The site visit committee noted two other aspects of
the assessment plan - an annual review of courses to
"ensure that Illinois Articulation Initiatives are
met," and our program review process. "Following a
review," the panel noted, "the theater program was
placed on indefinite inactive status. Review of the
forensics program to determine the future of the
program has included two external evaluators. These
program reviews indicate the institution is reviewing
the effectiveness of the programs."

In addition, the site visit team noted a long-standing
overall commitment to good teaching and student
learning at SCI. It cited the way faculty members are
"evaluated by department chairs, students, and the
Dean of Academic Affairs," and the "classroom visits
and evaluations are discussed with faculty and affect
tenure decisions," as well as decisions on rehiring
adjunct instructors. Also credited were the LaFata and
Distinguished Teaching Awards and SCI's computer labs
and utilization of "limited resources to improve
classrooms and maintain the cleanliness of the
grounds, common spaces, and classrooms." Especially
commended was the new Resource Center on the lower
level of Becker Library.

Standardized testing



The Collegiate Assessment of Academic Proficiency
(CAAP) tests in reading and math were administered at
the end of March. On the CAAP reading test, the SCI
students who took it (n = 87) scored an average of
59.0; the nationwide reference group of second-year
students in private two-year colleges scored an
average of 60.4. That is slightly less than the
national average. But SCI students in 2005 averaged
61.2 on the reading test, compared to 60.4 nationwide,
and in 2004 SCI students' score was 59.9 compared to
60.3 nationwide. The college’s first math scores were
as follows: students averaged 56.5, compared to 56.1
nationally. Differences in math scores cannot be judged
for statistical significance until the test has been
administered one or two years longer and more data
accrue.
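
To see why sample size and score spread both matter when comparing means like these, here is a rough one-sample z-test sketch. The standard deviation used below is a made-up placeholder (the real figure would come from the CAAP score reports), so the resulting p-value illustrates the method only, not an actual finding about SCI's scores.

```python
import math

# Compare SCI's 2006 mean CAAP reading score with the national reference
# mean via a one-sample z-test. CAUTION: assumed_sd is a hypothetical
# placeholder, NOT a figure from the actual CAAP score report, so the
# p-value below illustrates the method rather than a real result.
sci_mean, n = 59.0, 87          # SCI mean and number of test-takers
national_mean = 60.4            # national reference-group mean
assumed_sd = 5.0                # HYPOTHETICAL standard deviation

z = (sci_mean - national_mean) / (assumed_sd / math.sqrt(n))
p_two_sided = math.erfc(abs(z) / math.sqrt(2))
print(f"z = {z:.2f}, p = {p_two_sided:.3f}")
```

Whether a 1.4-point gap is meaningful depends entirely on the spread of scores and the number of test-takers, which is why the data pool needs to accrue over several years before conclusions are drawn.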

In addition, SCI purchased from ACT Inc. a linkage
report providing a “value added” benchmark for
measuring how much SCI students learned about reading
and math in their college years. ACT Inc., the vendor,
explains: "This report contains an analysis of
performance for students who tested with the ACT
Assessment on entry to college and CAAP after general
education work has been completed. ... Because the
content specifications of some pairs of ACT Assessment
and CAAP tests are similar, it is possible to track
student performance for your cohort." The linkage
results:
· In reading, 18 percent of those SCI students who
took both the ACT test in high school and the CAAP
test this year (n = 60) made lower than expected
progress on the CAAP test as compared to 14 percent of
the nationwide reference group; 75 percent made
expected progress, compared to 75 percent of the
national group; and 7 percent made higher than expected
progress on CAAP, compared to 11 percent of the
national group.

· In math, 10 percent of the SCI students who took
both tests made lower than expected progress on the
CAAP test compared to 12 percent of the reference
group; 82 percent made expected progress, compared to
79 percent nationally; and 8 percent made higher than
expected progress, as compared to 9 percent of the
national group.

Over the summer, a subcommittee was empaneled to take
an exploratory look at all the CAAP test results and
make some preliminary decisions on how they can be
utilized as a planning tool for continuous improvement
of instruction. Serving on it were Academic Affairs
dean John Cicero, Languages and Literature chair Amy
Lakin, and math instructor Barb Tanzyus. Peter
Ellertsen, chair of the assessment committee, convened
the subcommittee. It met in June, and took the
following action:

(1) Ms. Lakin volunteered to suggest a flow chart
whereby student learning outcomes assessment data are
to be transmitted through the Office of the Dean of
Academic Affairs to the Board of Trustees, the chief
operating and fiscal officer and others engaged in
making budgetary decisions, in response to a
suggestion in the NCA site visit committee's report
that "As the institution advances the Outcomes
Assessment program, it may consider integrating
requests developed as a result of assessment into the
budgeting and planning processes. Many of the
recommendations for change will be for curriculum or
pedagogical changes. However, other recommendations
will require resource allocations which must be
weighed against other budgetary requests. When the
institution gives priority to the assessment generated
resource requests, the result is to create even more
interest in the assessment outcomes." (p. 4). Ms.
Lakin’s recommendation will be submitted to the full
Assessment Committee at the beginning of the 2006-2007
school year.

(2) Ms. Tanzyus volunteered to assign the CAAP test
data to her baccalaureate statistics students for
analysis during the 2006-2007 school year. Their work
is a preliminary step toward analyzing the data
received to date and determining how these data can be
used in planning and budgeting for continuous
improvement of teaching and learning at SCI, as well
as for maximizing student learning outcomes and
institutional effectiveness in the academic domain.
This will be an ongoing project.

Friday, August 11, 2006

Gray lady inks higher ed report

Final adoption of a report by the federal Commission on the Future of Higher Education got some media play. Most of it came from an Associated Press story in papers including The Los Angeles Times, Forbes and The Dallas Morning News, and from a New York Times news service story that ran in the Gray Lady herself and got picked up by papers including The Arkansas Democrat-Gazette, The Minneapolis Star Tribune, The Register-Guard in Eugene, Ore., and The Gainesville (Fla.) Sun.

Today's Chicago Trib carried the AP story, with a graf contributed by staff reporter Jodi S. Cohen. It was a quote from a top University of Illinois administrator, and it demonstrates why some educators wonder if the commission didn't get in over its head just a little, especially on testing issues:
Richard Herman, chancellor of the University of Illinois at Urbana-Champaign, said the best measure of success is what students do after college.

"I am not opposed to the idea of additional measurements, but . . . using a limited number of metrics to measure the success of a college education is inaccurate," Herman said. "Our mission, I believe, is to prepare tomorrow's leaders. I would argue on those grounds that we have been enormously successful. Tell me what written test measures that."
In spite of the sweeping nature of the blue-ribbon commission's mandate, Sam Dillon's lede in The Times managed to get it all in:
WASHINGTON, Aug. 10 — A federal commission approved a final report on Thursday that urges a broad shake-up of American higher education. It calls for public universities to measure learning with standardized tests, federal monitoring of college quality and sweeping changes in financial aid.

The panel also called on policy makers and leaders in higher education to find new ways to control costs, saying college tuition should grow no faster than median family income, although it opposed price controls.

The report recommended bolstering Pell grants, the basic building block of federal student aid, by making the program cover a larger percentage of public college tuition. That proposal could cost billions of dollars.
Dillon's story, like the AP story, noted that David Ward of the American Council on Education refused to sign off on the report. His refusal is significant, Dillon explained, because ACE is "the largest association of colleges and universities [and Ward] was the most powerful representative of the higher education establishment on the commission."

Dillon's story noted that controversial language in earlier drafts of the report was toned down at the last minute. Some of it involved standardized testing:
... in the last six weeks, the commission issued six drafts, watering down passages that had drawn criticism and eliminating one this week, written by Mr. Miller, that had encouraged expanding private loans as a share of student financial aid.

A proposal on standardized tests was also weakened at the last moment. Previous drafts said that “states should require” public universities to use standardized tests, but the final version said simply that universities “should measure student learning” with standardized tests.
How that policy recommendation translates into actual mandates, of course, remains to be seen.

The commission was formed in September 2005 to discuss access, accountability and cost issues and to report to U.S. Education Secretary Margaret Spellings in a year's time. What happens next is not clear, although commission chair Charles Miller envisions more consultation with corporate and government leaders. He didn't mention educators, but that may be an oversight in the New York Times story. Dillon reported:
The members seemed at odds on how to carry their recommendations forward. Some, like former Gov. James B. Hunt Jr. of North Carolina, called on President Bush to incorporate them in the Congressional agenda.

Mr. Miller said the next step should be more “national dialogue” with governors and corporate leaders. He seemed upset by what he characterized as wrangling with representatives of the status quo.

“You can’t act on the recommendations today because you encounter one set of defenders and then behind them another set of defenders, and you get into all these battles,” he told reporters after the panel voted.
Dillon's story noted that some member organizations represented in ACE have endorsed drafts of the report, including the American Association of State Colleges and Universities and the American Association of Community Colleges. Other reaction was more in line with Ward's. Said Dillon:
Other important groups in the council issued withering critiques.

The Association of American Universities, which represents 60 top research universities, noted that the report “deals almost exclusively with undergraduate education.”

Robert M. Berdahl, a former chancellor at the University of California, Berkeley, who is president of the universities association, said, “What is needed is something much richer, with a more nuanced understanding of the educational engagement and how it is undertaken.”

Another council member, the National Association of Independent Colleges and Universities, which represents 900 private institutions including liberal arts colleges, major research universities and church- and other faith-related colleges, attacked the recommendation to develop a national database to follow individual students’ progress as a way of holding colleges accountable for students’ success.

The association called the proposal a dangerous intrusion on privacy, saying, “Our members find this idea chilling.”

Several groups said the report spent much ink discussing increases in students’ work skills, while slighting the mission of colleges and universities to educate students as citizens.

Thursday, August 10, 2006

Higher ed head nixes higher ed report

As expected, the blue-ribbon Commission on the Future of Higher Education has gotten behind the third draft of a report to U.S. Education Secretary Margaret Spellings. The Associated Press is moving the story on today's wire and, surprisingly, The Seattle Post-Intelligencer and other papers are picking it up.

The final draft is toned down considerably from the hostile and abusive language of earlier versions, but David Ward, president of the American Council on Education, refused to sign on, saying, as The AP put it, "the report reflected too much of a 'top down' approach to reform." In a bylined story, AP education writer Justin Pope reported:
In the end, after weeks of negotiations and several drafts, Chairman Charles Miller brought all but one commissioner on board. However the one holdout, David Ward of the American Council on Education, was the primary voice of traditional colleges on the commission, and his refusal to sign on could dilute the report's influence.

Ward said he supported many of the commission's objectives, but opposed "one-size fits all" prescriptions that fail to reflect the differing mission of colleges.

Still, Ward noted several current and past college presidents on the commission signed on to the report at a meeting in Washington, D.C. He said colleges would pay close attention to its calls for reform.

"They now realize if they don't do it to themselves, somebody will do it to them," he said.
One of those "one size fits all" recommendations deals with mandated standardized testing. Others relate to unspecified standard accountability measures that would allow national comparisons of student learning (which may be pedagese for more standardized testing). We'll see.

But assessment and accountability are not the only, or even the major, focus of the commission. Pope's summary for AP is brief, but accurate:
The report, which will be delivered to Education Secretary Margaret Spellings in final form next month, recommends that the federal government consolidate its more than 20 financial aid programs and ensure that Pell Grants - the main aid program for low-income students - cover at least 70 percent of in-state tuition costs. In 2004-2005, the grants covered less than half.

But it says that colleges should do more to hold down costs, and to better measure what students learn.

The 19-member commission, created by Spellings, has no direct power, but has been closely watched by policy-makers. Because of its diverse membership - industry, government and for-profit and traditional colleges are represented - any recommendations all members agreed on would carry substantial weight as Congress, the White House and state governments consider education measures in the future.
Not all the implications of this panel's recommendations are clear yet. But I'll bet somebody makes a lot of money out of them!