NUTS & BOLTS
An electronic assessment newsletter
Springfield College in Illinois
-----------------------------------------
September 2006
Vol. 7 No. 2
-----------------------------------------
Editor's Note. Since I am still unable to post to
SCI's assessment website, I am publishing the
newsletter by email and archiving it in the interim on my
personal weblog at http://www.teachinglogspot.blogspot.com/ ... if
representatives of the North Central Association, the
Illinois Board of Higher Education or other outside
stakeholders wish to see SCI's annual Assessment
Report for 2005-2006 or other current information
regarding our assessment program, please direct them
to my personal blog. -- Pete Ellertsen, assessment
chair
* * *
CLASSROOM ASSESSMENT WORKSHOP
September's issue of Nuts & Bolts is coming out a day
early (Aug. 31) to catch faculty members before the
last minute leading up to the Labor Day weekend. (OK,
OK, it's the *next* to last minute.) That's to give
you more time to plan on attending one of the workshop
sessions on Classroom Assessment Techniques we'll
conduct in the next couple of weeks.
The workshop, in the Resource Center on the
lower level of SCI's Becker Library, will be offered
at three times:
(1) Thursday, Sept. 7, from 5:30 to 6:30 p.m.;
(2) Monday, Sept. 11, from 5:30 to 6:30 p.m.; and
(3) Tuesday, Sept. 12, from 2:30 to 3:30 p.m.
Attendance is voluntary, and all interested SCI and
Benedictine University faculty are welcome. We'll talk
about how to find assessment techniques that are
appropriate to the learning goals and objectives in
our syllabi, and I'll show interested instructors some
aids on the World Wide Web that I've found helpful. I
expect the sessions will be small and informal.
The workshop would be especially valuable for new
instructors who may not have experience with CATs (as
classroom assessment techniques are called), but
seasoned instructors are welcome to exchange ideas and
share their experience and insights as well.
MORE POLITICS
In a move that surprised many, U.S. Education
Secretary Margaret Spellings has announced four
hearings nationwide on whether recommendations of the
federal Commission on the Future of Higher Education
"can be put in place through federal regulation"
through a procedure known as the negotiated
rule-making process. One area up for review, according
to an Aug. 18 announcement in the Federal Register, is
accreditation.
No one in higher education is able to say exactly what
the Department wants to do about accreditation,
according to Doug Lederman of the online newsletter
Inside Higher Ed:
"The Spellings commission’s report takes broad shots
at the perceived ineffectiveness and dysfunction of
the system of voluntary regional and national
accreditation, but offers relatively few firm
proposals for transforming it. So while there are no
obvious changes in accreditation that might emerge
from regulatory negotiations, the department could see
itself as having broad latitude to impose new
requirements on accreditors and, in turn, on colleges,
some observers speculate."
Staff-written working papers presented to the
commission in the spring, and early drafts of its
report, were quite acrimonious about accreditation and
other perceived failures in higher ed. But the hostile
tone had largely dropped out of the final draft, which
was approved early in August, and most references to
accreditation were backburnered along with it.
In an article in the current issue of The Chronicle of
Higher Education, Kelly Field catches the uncertainty
that has greeted the commission's latest regulatory
tack. Field also ties the issue to President Bush's
larger political agenda:
"... some college lobbyists still wondered why an
administration that had shown little interest in
higher education during its first term was suddenly so
concerned with its future. Some speculated that the
administration was trying to divert attention from its
unpopular No Child Left Behind Act, the 2002 law that
imposed testing on the nation's elementary and
secondary schools; others suspected that it was
seeking to extend that law's reach into the college
classroom.
"To the suspicious, the secretary's choice of a
commission chairman seemed proof of a plot to
institute standardized testing at colleges. Charles
Miller, a millionaire investor and close friend of
both Ms. Spellings and President Bush, was best known
for devising a Texas public-school accountability
system that became the model for No Child Left Behind.
He was also associated with accountability testing at
the University of Texas System, where he led the Board
of Regents from 2001 to 2004."
Field adds:
"The swiftness of the secretary's response took some
college lobbyists by surprise. They said the
administration's announcement, which appeared in the
August 18 edition of the Federal Register, signaled
that the secretary did not want to lose any momentum
for change created by the commission's deliberations.
"So far, the administration has given few clues about
which recommendations it might consider as part of the
negotiated rule making — a process by which federal
agencies work with affected parties as regulations are
drafted."
All of this may well affect us at SCI and Benedictine,
because assessment is a politically driven process and
its political underpinnings may be changing.
Hearings will be held in California, Florida,
Washington, D.C., and Chicago. The Chicago hearing
will be Oct. 19 at Loyola University.
Works Cited
Field, Kelly. "Uncertainty Greets Report on Colleges
by U.S. Panel." Chronicle of Higher Education Sept. 1,
2006. http://chronicle.com/free/v53/i02/02a00101.htm
Lederman, Doug. "Regulatory Activism?" Inside Higher
Ed Aug. 21, 2006.
http://www.insidehighered.com/news/2006/08/21/regs
Nassirian, Barmak. "U.S. Department of Education
Formally Plans for 'Negotiated Rulemaking'." AACRAO
Transcript [American Association of Collegiate
Registrars and Admissions Officers] Aug. 30, 2006. http://www.aacrao.org/transcript/index.cfm?fuseaction=show_view&doc_id=3294
Tuesday, August 15, 2006
Nuts & Bolts Aug. 2006 ARCHIVE
NUTS & BOLTS
An electronic assessment newsletter
Springfield College in Illinois
-----------------------------------------
August 2006
Vol. 7 No. 1
-----------------------------------------
Editor's Note. Until I can get access to SCI's new
assessment website, I will publish the
newsletter by email and archive current issues on an
interim basis on my personal weblog at
http://www.teachinglogspot.blogspot.com/ ... SCI’s
Common Student Learning Objectives and instructors’
guide "Classroom Assessment for Continuous
Improvement" are linked to SCI’s homepage at
http://www.sci.edu. [One change has been made in the archived copy since the original was emailed Aug. 14, to reflect a change in schedule. The third workshop will now be from 2:30 to 3:30 p.m. Tuesday, Sept. 12.]
* * *
CLASSROOM ASSESSMENT WORKSHOPS IN SEPTEMBER
Next month new (and not-so-new) instructors are
invited to a workshop at which I will briefly explain
how SCI’s Common Student Learning Objectives (SLOs)
were derived from our mission statement; and how
Course Based SLOs relate to daily lessons and
assignments. I will assist instructors in choosing
Classroom Assessment Techniques appropriate to the
SCI and/or Benedictine University mission statement
and the goals, objectives and outcomes in the courses
you teach. The workshop, in the Resource Center on the
lower level of SCI's Becker Library, will be offered
at three times:
(1) Thursday, Sept. 7, from 5:30 to 6:30 p.m.;
(2) Monday, Sept. 11, from 5:30 to 6:30 p.m.; and
(3) Tuesday, Sept. 12, from 2:30 to 3:30 p.m.
Attendance is voluntary, and all interested SCI and
Benedictine University faculty are welcome to
participate.
Here's a four-point summary of SCI’s philosophy of
assessment, prepared for a recent meeting of teachers
in the “Triple A” or adult accelerated associate’s
degree program. That’s 10 fewer points than W. Edwards
Deming, the management guru whose theories are
reflected in our learning outcomes assessment program
at SCI. But assessment isn’t rocket science – I want
to keep it simple.
1. Assessment is externally mandated, but it can be a
valuable part of what we do in the classroom. Let’s be
blunt about it. In higher ed no less than in the
public schools, we are mandated by outside
stakeholders – mostly the state and federal
governments – to do assessment. It’s part of the
political demand for “accountability” that gave us the
No Child Left Behind Act at the K-12 level, and this
summer some of us are nervously watching a blue-ribbon
federal commission as it debates ways of politicizing
higher ed as well. But assessment isn’t rocket science
– I define it as nothing more than using several
different ways of finding out what our students learn.
Some are embedded in work they do for grades; others
aren’t. But they can all help us teach better. At SCI,
we have designed an assessment program that addresses
accountability to outside stakeholders mostly at the
institutional level, by requiring standardized tests
of our sophomores and making sure our course offerings
and objectives square with the statewide Illinois
Articulation Initiative. That leaves our classroom
teachers free to assess student learning outcomes
(which basically means what the students learn) to
improve our teaching over the course of the semester –
when there’s still time to plug the assessment results
back into our planning processes.
2. Classroom assessment at SCI is “formative,” which
means we use the results immediately to improve our
teaching. There are several highly effective classroom
assessment techniques (known as CATs for short). One
that many of us like is the “one-minute paper.” At the
end of class, we’ll have the students write briefly on
two questions designed to get at what they learned:
(1) What was the clearest point in tonight’s class?
(2) What was the most confusing point? I think it’s
very useful. It’s humbling when I realize I led my
students off on a tangent when some off-the-cuff
remark keeps showing up as the clearest point, but
it’s good to know so I can get us all back on track
the following week. And I know to clear up the most
confusing point while I’m at it. This approach is what
educators call “formative assessment.” Carol Boston of
the University of Maryland defines it as the
“diagnostic use of assessment to provide feedback to
teachers and students over the course of instruction.”
It’s what we stress at SCI.
3. Classroom assessment is not rocket science, but it
is grounded in the scientific method of testing and
refining our data. Most of our classes are too small
for us to attempt statistical analysis with any rigor.
So classroom assessment, at least at SCI, is more an
art than a science. There’s a quote I like from Peter
Ewell, one of the pioneers in learning outcomes
assessment, on the Southern Illinois University Edwardsville
classroom assessment website: “Why do we insist on
measuring it with a micrometer when we mark it with
chalk and cut it with an axe?” My answer: We don’t
try, but we do learn how to heft an axe. That SIUE
website, by the way, is one of the best places to
start learning about CATs, and I recommend it highly.
I also recommend our instructors’ guide, Classroom
Assessment for Continuous Improvement, available as a
PDF document on the SCI website. I like it partly
because I wrote it. But it shares some good ideas from
other SCI instructors, and I think it explains the
philosophy behind assessment at SCI. It’s called
planning for continuous improvement, and it boils down
to a four-step process: (1) Plan something, a lesson
or a course; (2) Do it, at least get it started and
measure its interim success; (3) Study the data from
those measurements; and (4) Act or adjust your
procedures in light of your analysis of the data. The
idea is borrowed from industrial management, where
it’s known as a PDSA cycle, but behind it is nothing
more complicated than the scientific method. Most
important, it works in the classroom as well as it
does on the shop floor.
4. Our organizational culture at SCI is receptive to
assessment, and you can find plenty of help. Just ask
us. Your syllabi, for example, are full of numbers and
letters that relate the goals and objectives of
individual courses to the SCI mission statement. They
may be puzzling at first! It’s a new system, and we’re
still working out kinks. But most of us are adapting
to it. So we’ll be able to help you figure out what
all the letters and numbers mean, and how they relate
to what you do in the classroom. But we’ll also be
very sympathetic. We’ve been puzzled ourselves.
Sometimes I still get confused! But I take comfort in
point No. 3 above: Assessment isn’t rocket science, it
involves a continuous learning process and it takes
time to master. Ask your division chairs for help. Or
please feel free to contact me. I’m easiest to reach
by email … at pellertsen@sci.edu.
FEDERAL TESTING MANDATE?
A federal Commission on the Future of Higher Education
that threatens to change the way we do assessment at
SCI, and everywhere else in higher ed, appears to have
backed off on its most extreme proposals for a “one
size fits all” federally mandated standardized testing
program. The commission, chaired by Bush
administration insider Charles Miller of Texas,
adopted its final report this month. An Associated
Press story sums it up like this:
A national commission charged with
plotting the future of American higher education
approved its final recommendations Thursday, calling
on the government to provide more aid based on
financial need, while telling colleges to be more
accountable for what students learn.
A commission member representing nonprofit colleges
declined to sign on, however, saying the report
reflected too much of a "top down" approach to reform.
The report, which will be delivered to Education
Secretary Margaret Spellings in final form next month,
recommends that the federal government consolidate its
more than 20 financial aid programs and ensure that
Pell Grants - the main aid program for low-income
students - cover at least 70 percent of in-state
tuition costs. In 2004-2005, the grants covered less
than half.
But it says that colleges should do more to hold down
costs, and to better measure what students learn.
The 19-member commission, created by Spellings, has no
direct power, but has been closely watched by
policy-makers. Because of its diverse membership -
industry, government and for-profit and traditional
colleges are represented - any recommendations all
members agreed on would carry substantial weight as
Congress, the White House and state governments
consider education measures in the future.
(Pope)
The vote was 18-1. David Ward, president of the
American Council on Education, was the dissenting
member. Associated Press education reporter Justin
Pope noted that Ward “was the primary voice of
traditional colleges on the commission, and his
refusal to sign on could dilute the report's
influence.”
In the meantime, a snippet tucked into a report in
the online newsletter Inside Higher Ed suggests a
partial retreat from commission chair Charles Miller's
insistence on a uniform national standardized testing
regimen. It also suggests testing will be one of the
footballs the politicians plan to kick around. The
snippet reads as follows:
Speaking to reporters after the vote,
Miller said his preference would be for “the academy
[itself] to address” the changes called for in the
report, and as evidence of his desire not to impose
mandates on higher education, he noted that the report
the commission approved Thursday had dropped language
(which was in last week’s draft) that called for
states to require public institutions to measure
student learning using a set of tests and other
measures. (The new language, which college leaders
pushed hard for in the last few days, just says that
“higher education institutions should measure student
learning using....")
If higher education is “not responsive to change” and
“doesn’t have a strategic vision,” Miller predicted,
then “things are going to be mandated.”
(Lederman)
I want to see the final draft before I try to read too
much into this. But I think it may be a hopeful sign
that whatever new testing regimen emerges from all this
won't be too intrusive. A fuller discussion is posted
to my "teaching b/log" at
http://teachinglogspot.blogspot.com/
-- Pete Ellertsen, editor, Nuts & Bolts
Works Cited
Boston, Carol. “The concept of formative assessment.”
Practical Assessment, Research & Evaluation 8.9
(2002). 7 Aug. 2006.
http://pareonline.net/getvn.asp?v=8&n=9
Classroom Assessment for Continuous Improvement: A
Guide for Instructors. SCI. 2005. 7 Aug. 2006. PDF
file linked to http://www.sci.edu/assessment-site.htm
“Classroom Assessment Techniques.” Southern Illinois
University Edwardsville. n.d. 7 Aug. 2006.
http://www.siue.edu/~deder/assess/catmain.html
Lederman, Doug. “18 Yesses, 1 Major No.” Inside Higher
Ed 11 Aug. 2006. 14 Aug. 2006.
http://www.insidehighered.com/news/2006/08/11/commission
Pope, Justin. “Higher Education Report Gets OK.”
Seattle Post-Intelligencer 10 Aug. 2006. 14 Aug. 2006.
http://seattlepi.nwsource.com/national/1110AP_Higher_Education_Commission.html
Monday, August 14, 2006
Miller panel backs off on testing?
Now that the U.S. Education Department's blue-ribbon Commission on the Future of Higher Education has approved a draft report, there will be plenty of time to look at its implications for testing and assessment. It goes to Education Secretary Margaret Spellings next month, and then it is expected to be threshed out in a political process involving any number of government, industry and, hopefully, educational stakeholders.
In the meantime, a snippet tucked into a report in the online newsletter Inside Higher Ed suggests a partial retreat from commission chair Charles Miller's insistence on a uniform national standardized testing regimen. It also suggests testing will be one of the footballs the politicians plan to kick around. The snippet reads as follows:
Speaking to reporters after the vote, Miller said his preference would be for “the academy [itself] to address” the changes called for in the report, and as evidence of his desire not to impose mandates on higher education, he noted that the report the commission approved Thursday had dropped language (which was in last week’s draft) that called for states to require public institutions to measure student learning using a set of tests and other measures. (The new language, which college leaders pushed hard for in the last few days, just says that “higher education institutions should measure student learning using....")
If higher education is “not responsive to change” and “doesn’t have a strategic vision,” Miller predicted, then “things are going to be mandated.”
I want to see the final draft before I try to read too much into this. But I think it may be a hopeful sign that whatever new testing regimen emerges from all this won't be too intrusive.
SCI annual assessment report / ARCHIVE
ANNUAL ASSESSMENT REPORT
Springfield College in Illinois
Academic Year 2005-2006
* * *
Editor’s Note. Since SCI went over to a new website in
July 2006, I have been unable to access the assessment
portion of the website. Until the remaining technical
bugs can be worked out, I am archiving current
reports, newsletters and other postings relating to
student learning outcomes assessment at SCI on my
personal weblog at
http://www.teachinglogspot.blogspot.com/ -- Peter
Ellertsen, chair, Assessment Committee.
* * *
Because SCI was reaccredited during the 2005-2006
academic year, the Assessment Committee’s activities
were heavily influenced by the site visit for
reaccreditation purposes that took place in November
2005. Before the visit, the committee’s focus was on
preparing for it; afterward, the focus shifted to the
preliminary stages of planning to maintain and further
develop elements of the college’s Assessment Plan,
adopted in 1996, amended in 2001 and implemented in
the years since.
A key part of the plan, and one that received a great
deal of attention as it was initiated over the summer
and fall terms of 2005, was the implementation of a new
syllabus format. The format incorporates the Common
Student Learning Objectives (CSLOs) adopted at a
faculty workshop in December 2004 and derived from
SCI’s stated mission of preparing students for lives of
“learning, leadership and service in a diverse world.”
It translates the CSLOs into Course Based Student
Learning Objectives (CBSLOs), and from there into
individual assignments and assessment activities by
individual instructors and into the college’s program
and institutional effectiveness assessment programs.
Beginning in the fall semester, all syllabi submitted
to the Office of the Dean of Academic Affairs from
traditional and adult accelerated associate’s level
courses have followed the new
format, and workshops were held in the summer and fall
of 2005 to help instructors follow the new format and
choose Classroom Assessment Techniques that will help
them perform both formative assessment during the
course of the semester and summative assessment at
semester’s end in a cycle of continuous improvement of
classroom instruction. In addition, the chair of the
Assessment Committee wrote a 45-page booklet entitled
"Classroom Assessment for Continuous Improvement." It
was given to workshop attendees in summer of 2005 and
posted to the college’s website at www.sci.edu as a
PDF document. Additional workshops for new faculty are
scheduled in September 2006.
During the site visit in November, members of the
Assessment Committee were informed verbally that
members of the site visit team were favorably
impressed with the degree to which SCI has developed
an organizational culture that is receptive to
assessment, and this impression was repeated in the
written report issued in December and formalized in
June 2006 (please see below for details). At the same
time, members of the site visit panel made it clear in
verbal communication that SCI’s progress to date
is expected to continue as the 1996/2001 Assessment
Plan is fleshed out and further implemented. Along
with the accolade came what members of the Assessment
Committee interpreted as further marching orders.
After the site visit, the Committee’s focus shifted
toward maintenance of ongoing parts of the Assessment
Plan and planning toward expansion of the assessment
program as the Plan is further implemented. Program
assessment continued apace, as members of the
Assessment Committee continued to develop a matrix
showing with CSLOs and CBSLOs are reflected in General
Education courses and worked with outside stakeholders
in the evaluation and improvement of curricula,
particularly with regard to science. Standardized
tests purchased from ACT Inc. were purchased and
administered at the end of March, and efforts began to
study and interpret test results over time since the
reading module has been administered now since 2003
and a math test has been added. The small size of
SCI’s student population makes it imperative that data
accrue over time, and that they be interpreted
carefully since the data pool is only beginning to be
large enough, at least in the case of reading scores,
for valid statistical analysis. Details are reported
below.
Priorities for the coming 2006-2007 academic year will
be set by the committee in its September and October
meetings. It is expected that they will continue to
focus on fuller implementation of the 1996/2001
Assessment Plan, especially with regard to program
assessment, further efforts to reflect specific parts
of the mission statement and CSLOs in classroom
assessment of individual lessons and assignments, and
the completion of feedback loops and other
communication of learning outcomes data throughout the
college so these data can be consciously utilized in
decision-making processes.
After a site visit in November, the Higher Learning
Commission of the North Central Association of
Colleges and Schools formally renewed Springfield
College in Illinois' accreditation for 10 years. The
NCA site visit team's Comprehensive Evaluation report
said its inspection "confirm[ed] the institution's
capacity and responsibility to identify and address
issues," including a good half dozen issues of
long-standing concern to the accrediting body. The
panel noted that SCI's partnership with Benedictine
University was a crucial factor in granting continued
accreditation. After noting "major improvements since
the inception of the partnership," it reported in its
summary of findings:
Regarding assessment, the site visit committee
reported, “"It is clear from considerable
documentation and a variety of personal conversations
that SCI has made considerable progress in creating a
culture of assessment on campus, with a specific focus
on classroom-level assessment." The following evidence
was cited:
The site visit committee noted two other aspects of
the assessment plan - an annual review of courses to
"ensure that Illinois Articulation Initiatives are
met," and our program review process. "Following a
review," the panel noted, "the theater program was
placed on indefinite inactive status. Review of the
forensics program to determine the future of the
program has included two external evaluators. These
program reviews indicate the institution is reviewing
the effectiveness of the programs."
In addition, the site visit team noted a long-standing
overall commitment to good teaching and student
learning at SCI. It cited the way faculty members are
"evaluated by department chairs, students, and the
Dean of Academic Affairs," and the "classroom visits
and evaluations are discussed with faculty and affect
tenure decisions," as well as decisions on rehiring
adjunct instructors. Also credited were the LaFata and
Distinguished Teaching Awards and SCI's computer labs
and utilization of "limited rsources to improve
classrooms and maintain the cleanliness of the
grounds, common spaces, and classrooms." Especially
commended was the new Resource Center on the lower
level of Becker Library
The Collegiate Assessment of Academic Proficiency
(CAAP) tests in reading and math were administered at
the end of March. On the CAAP reading test, SCIr
students who took it (n = 87) scored an average of
59.0; the nationwide reference group of second-year
students in private two-year colleges scored an
average of 60.4. That is slightly less than the
national average. But SCI students in 2005 averaged
61.2 on the reading test, compared to 60.4 nationwide,
and in 2004 SCI students' score was 59.9 compared to
60.3 nationwide. The college’s first math scores were
as follows: Students students averaged 56.5 as
compared to 56.1 nationally. Math scores will not be
statistically significant until the test has been
administered one or two years longer and more data
accrue.
In addition, SCI purchased from ACT Inc. a linkage
report providing a “value added” benchmark for
measuring how much SCI students learned about reading
and math in their college years. ACT Inc., the vendor,
explains: "This report contains an analysis of
performance for students who tested with the ACT
Assessment on entry to college and CAAP after general
education work has been completed. ... Because the
content specifications of some pairs of ACT Assessment
and CAAP tests are similar, it is possible to track
student performance for your cohort." The linkage
results:
Over the summer, a subcommittee was empaneled to take
an exploratory look at all the CAAP test results and
make some preliminary decisions on how they can be
utilized as a planning tool for continuous improvement
of instruction. Serving on it were Academic Affairs
dean John Cicero, Languages and Literature chair Amy
Lakin, and math instructor Barb Tanzyus. Peter
Ellertsen, chair of the assessment committee, convened
the subcommittee. It met in June, and took the
following action:
(1) Ms. Lakin volunteered to suggest a flow chart
whereby student learning outcomes assessment data are
to be transmitted through the Office of the Dean of
Academic Affairs to the Board of Trustees, the chief
operating and fiscal officer and others engaged in
making budgetary decisions, in response to a
suggestion in the NCA site visit committee's report,
that "As the institution advances the Outcomes
Assessment program, it may consider integrating
requests developed as a result of assessment into the
budgeting and planning processes. Many of the
recommendations for change will be for curriculum or
pedagogical changes. However, other recommendations
will require resource allocations which must be
weighed against other budgetary requests. When the
institution gives priority to the assessment generated
resource requests, the result is to create even more
interest in the assessment outcomes." (p. 4). Ms.
Lakin’s recommendation will be submitted to the full
Assessment Committee at the beginning of the 2006-2007
school year.
(2) Ms. Tanzyus volunteered to assign the CAAP test
data to her baccalaureate statistics students for
analysis during the 2006-2007 school year, as a
preliminary step toward analysis of the data received
to date and determination of how these data can be
used as a tool for planning and budgeting for
continuous improvement of teaching and learning at SCI
as well as maximizing student learning outcomes and
institutional effectiveness in the academic domain. This will be an ongoing project.
Springfield College in Illinois
Academic Year 2005-2006
* * *
Editor’s Note. Since SCI went over to a new website in
July 2006, I have been unable to access the assessment
portion of the website. Until the remaining technical
bugs can be worked out, I am archiving current
reports, newsletters and other postings relating to
student learning outcomes assessment at SCI on my
personal weblog at
http://www.teachinglogspot.blogspot.com/ -- Peter
Ellertsen, chair, Assessment Committee.
* * *
Because SCI was reaccredited during the 2005-2006
academic year, the Assessment Committee’s activities
were heavily influenced by the site visit for
reaccreditation purposes that took place in November
2005. Before the visit, the committee’s focus was on
getting ready for the site visit; afterward, its focus
shifted to the preliminary stages of planning to
maintain and further develop elements of the college’s
Assessment Plan as adopted in 1996, amended in 2001
and implemented during the time intervening between
those dates and the present.
A key part of the plan, and one that received a great
deal of attention as it was initiated over the summer
and fall terms of 2005, was the implementation of a
new syllabus format. The format carries the Common
Student Learning Objectives (CSLOs), adopted at a
faculty workshop in December 2004 and derived from
SCI's stated mission of preparing students for lives
of "learning, leadership and service in a diverse
world," into Course-Based Student Learning Objectives
(CBSLOs), and from there into individual assignments
and assessment activities by individual instructors
and into the college's program and institutional
effectiveness assessment programs. Beginning in the
fall semester, all syllabi submitted to the Office of
the Dean of Academic Affairs for traditional and adult
accelerated associate's-level courses have followed
the new format. Workshops were held in the summer and
fall of 2005 to help instructors follow the format and
choose Classroom Assessment Techniques that support
both formative assessment during the semester and
summative assessment at semester's end, in a cycle of
continuous improvement of classroom instruction. In
addition, the chair of the Assessment Committee wrote
a 45-page booklet entitled "Classroom Assessment for
Continuous Improvement," which was given to workshop
attendees in the summer of 2005 and posted to the
college's website at www.sci.edu as a PDF document.
Additional workshops for new faculty are scheduled for
September 2006.
During the site visit in November, members of the
Assessment Committee were told verbally that the site
visit team was favorably impressed with the degree to
which SCI has developed an organizational culture
receptive to assessment, an impression repeated in the
written report issued in December and formalized in
June 2006 (see below for details). At the same time,
members of the site visit panel made it clear in
verbal communication that SCI's progress to date is
expected to continue as the 1996/2001 Assessment Plan
is fleshed out and further implemented. Along with the
accolade came what members of the Assessment Committee
interpreted as further marching orders.
After the site visit, the Committee's focus shifted
toward maintaining ongoing parts of the Assessment
Plan and planning for expansion of the assessment
program as the Plan is further implemented. Program
assessment continued apace: members of the Assessment
Committee continued to develop a matrix showing which
CSLOs and CBSLOs are reflected in General Education
courses, and worked with outside stakeholders on the
evaluation and improvement of curricula, particularly
in the sciences. Standardized tests purchased from ACT
Inc. were administered at the end of March, and
efforts began to study and interpret test results over
time, since the reading module has been administered
since 2003 and a math test has now been added. The
small size of SCI's student population makes it
imperative that data accrue over time and be
interpreted carefully, since the data pool is only
beginning to be large enough, at least in the case of
reading scores, for valid statistical analysis.
Details are reported below.
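The sample-size concern can be made concrete with a standard back-of-the-envelope power calculation. Everything in this sketch is an assumption for illustration: the newsletter reports no score standard deviations, so the 5.0-point SD is a placeholder, and 1.96 and 0.84 are the conventional constants for a two-sided 5-percent significance level with roughly 80-percent power.

```python
def n_required(delta, sigma, z_alpha=1.96, z_beta=0.84):
    """Approximate sample size needed to detect a true mean
    difference `delta` in scores with standard deviation `sigma`,
    at a two-sided 5% level with about 80% power."""
    return ((z_alpha + z_beta) * sigma / delta) ** 2

# Detecting a 1.4-point gap (roughly the size of the year-to-year
# swings in the reading averages), assuming an illustrative SD of
# 5.0 points:
print(f"about {n_required(delta=1.4, sigma=5.0):.0f} students per cohort")
```

Under those assumptions a cohort of roughly 100 test-takers is needed, which is why accumulating reading scores across several administrations matters more than any single year's result.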
Priorities for the coming 2006-2007 academic year will
be set by the committee at its September and October
meetings. They are expected to continue to focus on
fuller implementation of the 1996/2001 Assessment
Plan, especially with regard to program assessment; on
further efforts to reflect specific parts of the
mission statement and the CSLOs in classroom
assessment of individual lessons and assignments; and
on the completion of feedback loops and other
communication of learning outcomes data throughout the
college, so these data can be consciously used in
decision-making.
Accreditation
After a site visit in November, the Higher Learning
Commission of the North Central Association of
Colleges and Schools formally renewed Springfield
College in Illinois' accreditation for 10 years. The
NCA site visit team's Comprehensive Evaluation report
said its inspection "confirm[ed] the institution's
capacity and responsibility to identify and address
issues," including a good half dozen issues of
long-standing concern to the accrediting body. The
panel noted that SCI's partnership with Benedictine
University was a crucial factor in granting continued
accreditation. After noting "major improvements since
the inception of the partnership," it reported in its
summary of findings:
The faculty and staff are qualified, dedicated, and
hopeful; in addition, recently hired staff are
bringing new perspectives to the institution. It was
clear from discussions with the Benedictine University
President and the Chair of its Board of Trustees, that
Benedictine is fully committed to the partnership.
With the University's leadership and its advantageous
presence in the state capital, Springfield College
should be able to continue to fulfill its mission.
While the team anticipates growing pains in relation
to the partnership, it should be possible to overcome
them. In short, it seems clear that Springfield
College in Illinois, in partnership with Benedictine
University, is now a viable institution with prospects
for a positive future.
Regarding assessment, the site visit committee
reported, "It is clear from considerable
documentation and a variety of personal conversations
that SCI has made considerable progress in creating a
culture of assessment on campus, with a specific focus
on classroom-level assessment." The following evidence
was cited:
1. The development of common student learning outcomes
across the curriculum [citation omitted].
2. The requirement that each faculty member declare
the methods of course assessment as part of each
course syllabus.
3. The requirement that each faculty member submit an
end-of-course assessment report to the Dean which
identifies specific classroom assessment techniques
used, the findings from those assessments, and action
taken [citation omitted].
4. Course syllabi in both the traditional two-year
program and the accelerated degree program routinely
list objectives related to either Common Student
Learning Objectives or Course-Based Learning
Objectives identified by the College and explicitly
related to the College's mission.
5. Testimony from students indicated that faculty use
classroom assessment techniques daily and that these
assessments result in clear changes in classes.
6. Interviews with a number of faculty demonstrate a
high degree of awareness of the assessment effort, and
a desire to use that process to improve
classroom-level instruction.
The site visit committee noted two other aspects of
the assessment plan - an annual review of courses to
"ensure that Illinois Articulation Initiatives are
met," and our program review process. "Following a
review," the panel noted, "the theater program was
placed on indefinite inactive status. Review of the
forensics program to determine the future of the
program has included two external evaluators. These
program reviews indicate the institution is reviewing
the effectiveness of the programs."
In addition, the site visit team noted a long-standing
overall commitment to good teaching and student
learning at SCI. It cited the way faculty members are
"evaluated by department chairs, students, and the
Dean of Academic Affairs," and the way "classroom
visits and evaluations are discussed with faculty and
affect tenure decisions," as well as decisions on
rehiring adjunct instructors. Also credited were the
LaFata and Distinguished Teaching Awards, SCI's
computer labs, and the utilization of "limited
resources to improve classrooms and maintain the
cleanliness of the grounds, common spaces, and
classrooms." Especially commended was the new Resource
Center on the lower level of Becker Library.
Standardized testing
The Collegiate Assessment of Academic Proficiency
(CAAP) tests in reading and math were administered at
the end of March. On the CAAP reading test, the SCI
students who took it (n = 87) scored an average of
59.0, slightly below the 60.4 average of the
nationwide reference group of second-year students in
private two-year colleges. By comparison, SCI students
in 2005 averaged 61.2 on the reading test against a
nationwide average of 60.4, and in 2004 SCI students
averaged 59.9 against 60.3 nationwide. On the
college's first administration of the math test,
students averaged 56.5, compared to 56.1 nationally.
The math results will not support statistically
meaningful comparisons until the test has been
administered for another year or two and more data
accrue.
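For readers who want to sanity-check such comparisons, the sketch below runs a one-sample z-test of the 2006 SCI reading mean against the national reference mean. It is illustrative only: the newsletter does not report score standard deviations, so the 5.0-point SD is an assumed placeholder, and a different SD would change the result.

```python
import math

def z_test_one_sample(sample_mean, ref_mean, sd, n):
    """Two-sided z-test of a sample mean against a fixed reference mean.

    Returns the z statistic and an approximate two-sided p-value,
    using the normal approximation (reasonable for n near 87).
    """
    se = sd / math.sqrt(n)                # standard error of the sample mean
    z = (sample_mean - ref_mean) / se     # standardized difference
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal CDF
    return z, p

# 2006 CAAP reading: SCI mean 59.0 (n = 87) vs. national mean 60.4.
# The 5.0-point standard deviation is an assumption for illustration.
z, p = z_test_one_sample(59.0, 60.4, sd=5.0, n=87)
print(f"z = {z:.2f}, two-sided p = {p:.3f}")
```

Whether a 1.4-point gap is detectable at n = 87 depends entirely on the true score spread, which is why the committee's plan to accumulate data over several administrations is the prudent course.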
In addition, SCI purchased from ACT Inc. a linkage
report providing a “value added” benchmark for
measuring how much SCI students learned about reading
and math in their college years. ACT Inc., the vendor,
explains: "This report contains an analysis of
performance for students who tested with the ACT
Assessment on entry to college and CAAP after general
education work has been completed. ... Because the
content specifications of some pairs of ACT Assessment
and CAAP tests are similar, it is possible to track
student performance for your cohort." The linkage
results:
· In reading, 18 percent of the SCI students who took
both the ACT test in high school and the CAAP test
this year (n = 60) made lower than expected progress
on the CAAP test, compared to 14 percent of the
nationwide reference group; 75 percent made expected
progress, compared to 75 percent of the national
group; and 7 percent made higher than expected
progress on CAAP, compared to 11 percent of the
national group.
· In math, 10 percent of the SCI students who took
both tests made lower than expected progress on the
CAAP test compared to 12 percent of the reference
group; 82 percent made expected progress, compared to
79 percent nationally; and 8 percent made higher than
expected progress, as compared to 9 percent of the
national group.
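One hedged way to ask whether SCI's progress-category breakdown differs meaningfully from the national pattern is a chi-square goodness-of-fit calculation. The sketch below uses the reading percentages above with n = 60; the counts are reconstructed from rounded percentages, so it is illustrative only.

```python
def chi_square_gof(observed_pct, expected_pct, n):
    """Chi-square goodness-of-fit statistic comparing a school's
    progress-category percentages to a national reference pattern.

    Percentages are converted to counts using the cohort size n.
    """
    stat = 0.0
    for obs, exp in zip(observed_pct, expected_pct):
        o = n * obs / 100.0  # observed count in this category
        e = n * exp / 100.0  # count expected under the national pattern
        stat += (o - e) ** 2 / e
    return stat

# Reading linkage, 2006: SCI 18/75/7 percent (lower / expected /
# higher than expected progress, n = 60) vs. national 14/75/11.
stat = chi_square_gof([18, 75, 7], [14, 75, 11], n=60)
# With 3 categories (df = 2), the 5-percent critical value is 5.99.
print(f"chi-square = {stat:.2f} (critical value 5.99 at df = 2)")
```

With the reconstructed counts the statistic falls well short of the critical value, consistent with the newsletter's caution that the cohort is still too small to read much into year-to-year differences.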
Over the summer, a subcommittee was empaneled to take
an exploratory look at all the CAAP test results and
make some preliminary decisions on how they can be
utilized as a planning tool for continuous improvement
of instruction. Serving on it were Academic Affairs
dean John Cicero, Languages and Literature chair Amy
Lakin, and math instructor Barb Tanzyus. Peter
Ellertsen, chair of the assessment committee, convened
the subcommittee. It met in June, and took the
following action:
(1) Ms. Lakin volunteered to suggest a flow chart
whereby student learning outcomes assessment data are
to be transmitted through the Office of the Dean of
Academic Affairs to the Board of Trustees, the chief
operating and fiscal officer, and others engaged in
making budgetary decisions. The flow chart responds to
a suggestion in the NCA site visit committee's report:
"As the institution advances the Outcomes Assessment
program, it may consider integrating requests
developed as a result of assessment into the budgeting
and planning processes. Many of the recommendations
for change will be for curriculum or pedagogical
changes. However, other recommendations will require
resource allocations which must be weighed against
other budgetary requests. When the institution gives
priority to the assessment generated resource
requests, the result is to create even more interest
in the assessment outcomes" (p. 4). Ms. Lakin's
recommendation will be submitted to the full
Assessment Committee at the beginning of the 2006-2007
school year.
(2) Ms. Tanzyus volunteered to assign the CAAP test
data to her baccalaureate statistics students for
analysis during the 2006-2007 school year. This is a
preliminary step toward analyzing the data received to
date and determining how these data can be used as a
tool for planning and budgeting for continuous
improvement of teaching and learning at SCI, as well
as for maximizing student learning outcomes and
institutional effectiveness in the academic domain.
This will be an ongoing project.
Friday, August 11, 2006
Gray lady inks higher ed report
Final adoption of a report by the federal Commission on the Future of Higher Education got some media play, mostly from an Associated Press story in papers including The Los Angeles Times, Forbes and The Dallas Morning News, and a New York Times news service story that ran in the Gray Lady herself and got picked up by papers including The Arkansas Democrat-Gazette, The Minneapolis Star-Tribune, The Register-Guard in Eugene, Ore., and The Gainesville (Fla.) Sun.
Today's Chicago Trib carried the AP story, with a graf contributed by staff reporter Jodi S. Cohen. It was a quote from a top University of Illinois administrator, and it demonstrates why some educators wonder if the commission didn't get in over its head just a little, especially on testing issues:
Richard Herman, chancellor of the University of Illinois at Urbana-Champaign, said the best measure of success is what students do after college.
"I am not opposed to the idea of additional measurements, but . . . using a limited number of metrics to measure the success of a college education is inaccurate," Herman said. "Our mission, I believe, is to prepare tomorrow's leaders. I would argue on those grounds that we have been enormously successful. Tell me what written test measures that."
In spite of the sweeping nature of the blue-ribbon commission's mandate, Sam Dillon's lede in The Times managed to get it all in:
WASHINGTON, Aug. 10 — A federal commission approved a final report on Thursday that urges a broad shake-up of American higher education. It calls for public universities to measure learning with standardized tests, federal monitoring of college quality and sweeping changes in financial aid.
The panel also called on policy makers and leaders in higher education to find new ways to control costs, saying college tuition should grow no faster than median family income, although it opposed price controls.
The report recommended bolstering Pell grants, the basic building block of federal student aid, by making the program cover a larger percentage of public college tuition. That proposal could cost billions of dollars.
Dillon's story, like the AP story, noted that David Ward of the American Council on Education refused to sign off on the report; his refusal to sign is significant because ACE is "the largest association of colleges and universities [and Ward] was the most powerful representative of the higher education establishment on the commission."
Dillon's story also noted that controversial language in earlier drafts of the report was toned down at the last minute. Some of it involved standardized testing:
... in the last six weeks, the commission issued six drafts, watering down passages that had drawn criticism and eliminating one this week, written by Mr. Miller, that had encouraged expanding private loans as a share of student financial aid.
A proposal on standardized tests was also weakened at the last moment. Previous drafts said that “states should require” public universities to use standardized tests, but the final version said simply that universities “should measure student learning” with standardized tests.
How those policy recommendations translate into actual mandates, of course, remains to be seen.
The commission was formed in September 2005 to discuss access, accountability and cost issues and to report to U.S. Education Secretary Margaret Spellings in a year's time. What happens next is not clear, although commission chair Charles Miller envisions more consultation with corporate and government leaders. He didn't mention educators, but that may be an oversight in the New York Times story. Dillon reported:
The members seemed at odds on how to carry their recommendations forward. Some, like former Gov. James B. Hunt Jr. of North Carolina, called on President Bush to incorporate them in the Congressional agenda.
Mr. Miller said the next step should be more “national dialogue” with governors and corporate leaders. He seemed upset by what he characterized as wrangling with representatives of the status quo.
“You can’t act on the recommendations today because you encounter one set of defenders and then behind them another set of defenders, and you get into all these battles,” he told reporters after the panel voted.
Dillon's story noted that some member organizations represented in ACE have endorsed drafts of the report, including the American Association of State Colleges and Universities and the American Association of Community Colleges. Other reaction was more in line with Ward's. Said Dillon:
Other important groups in the council issued withering critiques.
The Association of American Universities, which represents 60 top research universities, noted that the report “deals almost exclusively with undergraduate education.”
Robert M. Berdahl, a former chancellor at the University of California, Berkeley, who is president of the universities association, said, “What is needed is something much richer, with a more nuanced understanding of the educational engagement and how it is undertaken.”
Another council member, the National Association of Independent Colleges and Universities, which represents 900 private institutions including liberal arts colleges, major research universities and church- and other faith-related colleges, attacked the recommendation to develop a national database to follow individual students’ progress as a way of holding colleges accountable for students’ success.
The association called the proposal a dangerous intrusion on privacy, saying, “Our members find this idea chilling.”
Several groups said the report spent much ink discussing increases in students’ work skills, while slighting the mission of colleges and universities to educate students as citizens.
Thursday, August 10, 2006
Higher ed head nixes higher ed report
As expected, the blue-ribbon Commission on the Future of Higher Education has gotten behind the third draft of a report to U.S. Education Secretary Margaret Spellings. The Associated Press is moving the story on today's wire and, surprisingly, The Seattle Post-Intelligencer and other papers are picking it up.
The final draft is toned down considerably from the hostile and abusive language of earlier versions, but David Ward, president of the American Council on Education, refused to sign on, saying, as The AP put it, "the report reflected too much of a 'top down' approach to reform." In a bylined story, AP education writer Justin Pope reported:
In the end, after weeks of negotiations and several drafts, Chairman Charles Miller brought all but one commissioner on board. However, the one holdout, David Ward of the American Council on Education, was the primary voice of traditional colleges on the commission, and his refusal to sign on could dilute the report's influence.
Ward said he supported many of the commission's objectives, but opposed "one-size fits all" prescriptions that fail to reflect the differing missions of colleges.
Still, Ward noted several current and past college presidents on the commission signed on to the report at a meeting in Washington, D.C. He said colleges would pay close attention to its calls for reform.
"They now realize if they don't do it to themselves, somebody will do it to them," he said.
One of those "one-size fits all" recommendations deals with mandated standardized testing. Others relate to unspecified standard accountability measures that would allow national comparisons of student learning (which, in pedagese, may be a way of saying more standardized testing). We'll see.
But assessment and accountability are not the only, or even the major, focus of the commission. Pope's summary for AP is brief, but accurate:
The report, which will be delivered to Education Secretary Margaret Spellings in final form next month, recommends that the federal government consolidate its more than 20 financial aid programs and ensure that Pell Grants - the main aid program for low-income students - cover at least 70 percent of in-state tuition costs. In 2004-2005, the grants covered less than half.
But it says that colleges should do more to hold down costs, and to better measure what students learn.
The 19-member commission, created by Spellings, has no direct power, but has been closely watched by policy-makers. Because of its diverse membership - industry, government and for-profit and traditional colleges are represented - any recommendations all members agreed on would carry substantial weight as Congress, the White House and state governments consider education measures in the future.
All the implications of this panel's recommendations are not clear yet. But I'll bet somebody makes a lot of money out of them!
Friday, July 21, 2006
Nuts & Bolts July 2006 / ARCHIVE
NUTS & BOLTS
An electronic assessment newsletter
Springfield College in Illinois
-----------------------------------------
July 2006
Vol. 6 No. 11
-----------------------------------------
Editor's Note. It now looks like it'll be a while
before I can get SCI's assessment website up and
running again. In the meantime, I plan to publish the
newsletter by email and archive current issues on an
interim basis on my personal weblog at
http://www.teachinglogspot.blogspot.com/ ... back
issues through June 2006, as well as the teaching
blog, can be accessed from my faculty page at
http://www.sci.edu/classes/ellertsen/welcome.html
* * *
Stuff happens, to paraphrase (but not quote) a popular
bumper sticker. I had planned to put Nuts & Bolts on
hiatus while I reorganized parts of SCI's assessment
website, but there's information I think I should get
out to faculty on a timely basis. So this email
message will serve as a short version of Nuts & Bolts,
SCI's monthly assessment newsletter, updating you on:
(1) reminders, tips and links relating to fall
semester syllabi, which are due in late July and early
August; and (2) developments on the federal Commission
on the Future of Higher Education, which is
deliberating radical changes in the way we do
institutional assessment.
1. Syllabi
If you've taught before at SCI and/or Benedictine
University at SCI, you're in luck. You don't have any
changes in the syllabus format to wrestle with this
year. Mary Jo Rappe of the Academic Affairs Office is
sending out detailed instructions with deadlines for
SCI's traditional and adult accelerated programs, as
well as the various Benedictine modules.
If you're new, Mary Jo's instructions will show you
how to format a syllabus. And your division chair will
be able to help you work with student learning
objectives, learning outcomes and the other details of
a college syllabus.
In either event, syllabi are to be submitted this year
to your division chairs for approval.
With government and other outside stakeholders
dictating more and more of what goes on in the
classroom, our syllabi may seem more complicated than
what you remember from when you were in school. But
once you get the hang of it, it'll make sense. And
you'll wonder what all the fuss was about.
As assessment coordinator, I will be happy to offer
informal advice on how to incorporate goals,
objectives and assessment criteria into your syllabi.
I can be reached by email at pellertsen@sci.edu ...
and we have on the SCI website a 45-page PDF document
entitled "Classroom Assessment for Continuous
Improvement" that walks you through SCI's Common
Student Learning Objectives and other details.
Published in 2005, the classroom assessment guide
summarizes some basic principles of quality
improvement planning and offers tips on how to carry
it out in the classroom by means of formative
assessment. Unlike other parts of the assessment
website at the moment, it can be reached from our
homepage at www.sci.edu ... click on the Quick Link to
"Faculty and Student Websites" and then on "Assessment
Program Goals and Objectives" in the website directory
that opens. That will take you to a new page headed
"Program Goals and Objectives." Scroll down to the
heading "Classroom assessment" and click on the link
that says "Guide for Instructors (pdf)." It's
important to keep scrolling down, because on most
browsers you won't be able to see the classroom
assessment links at first.
If your head's swimming from all these details,
remember all of this stuff is like walking, breathing
or riding a bicycle. It's a lot easier to just *do* it
than it is to try to explain it!
2. Federal politicking
The blue-ribbon Commission on the Future of Higher
Education, empaneled in September 2005 and due to
issue a report in September of this year, has released
a second draft report considerably less hostile to
classroom educators than its first draft. Assessment
is hardly even mentioned in this draft, at least as
reflected in press coverage, but nationwide
standardized testing is still looming in the
background.
Reports the online newsletter Inside Higher Ed:
"Taken together, the changes made in response to
commissioners’ criticisms of the initial report — many
of which focused on its tendency to favor
harsh-sounding and simplistic rhetoric and
recommendations over practical, well-conceived
analysis and answers — do not radically alter the
panel’s bottom line view: that higher education must
perform better in educating students and in proving
its value to the American public.
"And many if not most of the initial draft’s findings
and recommendations remain intact, a fact many college
officials will rue. The second draft, like the first,
calls for the creation of a national “unit records”
system to track students’ performance through their
academic careers and into the work place (though it
calls the proposal something else), and urges the
collection and publication of significantly more
information that colleges have either not collected
or, more often, held close to the vest.
"But in case after case, the second draft shuns the
instinct, so prevalent in the first, to “throw rocks”
at higher education, as one commissioner put it in
written comments to his colleagues. That doesn’t mean
the new report lets colleges off the hook or ignores
higher education’s real and serious problems; it just
does so in language that is more descriptive and less
inflamed."
Inside Higher Ed's story, dated July 17, can be
accessed at
http://www.insidehighered.com/news/2006/07/17/commission ...
The next day Inside Higher Ed's reporter Doug
Lederman, who has been following the issue all year
long, did a reaction story noting that members of the
commission were all over the map.
He quoted David Ward, president of the American
Council on Education (which represents college
presidents), as saying the second draft showed
"improvements in both tone and content" over the
first. But Ward added it "omitted the preamble that
contained the harshest rhetoric of the first draft,
and since 'these introductory comments will set the
tone for the rest of the report ... I am very anxious
to see what changes will be made in this area.'"
Lederman also quoted American Council of Trustees and
Alumni president Ann Neal as saying the second draft
dropped earlier criticism of "important curricular
issues - and their connection to the serious cultural
illiteracy that the commission recognizes." And
Richard Vedder, an adjunct scholar for a politically
conservative think tank, worried that "as we move to
maximize support within the commission [by toning down
the rhetoric], we run risk of making it more of a
pablum, inoffensive document that says relatively
little."
Lederman's headline, "Too Much Change, or Not
Enough?," catches the tone of things. His report is
available at
http://www.insidehighered.com/news/2006/07/18/commission
Media reaction to the draft, as with the commission's
other deliberations, ranged from muted to nonexistent.
But there were signs the political posturing isn't
quite over.
Writing on a blog titled "Phi Beta Cons: The *Right*
Take on Higher Ed" in the online edition of William
Buckley's National Review magazine, Candace de Russy
said "this draft’s regrettable dropping of focus on
declining undergraduate education should not surprise
us. There are too many higher education insiders
serving on the commission, and it is not in their
self-interest to demand serious curricular reform and
an end to grade inflation as well as to show
open-mindedness to innovative means for delivering
higher education."
She added, "Thus it’s the commission itself that ought
to be gutted and re-constituted with members with
(pardon the expression) real guts. Barring that, it is
likely that this entire exercise will in the end do
little or nothing to ameliorate higher education."
The permalink to de Russy's blog entry is http://phibetacons.nationalreview.com/post/?q=Zjk5NmQ0Yjc3YjJjMzU2MWQ3NjI5MzVlN2U4OThmMzg=
Also reacting to the new draft in the National
Review's higher ed blog was Charles Mitchell, program
director at the American Council of Trustees and
Alumni. He quoted ACTA president Neal's July 18
statement to Higher Ed Today: "In a time of global
competition and conflict, transparency and assessments
don’t matter if the product is not worthy. ... Access
and completion rates are simply irrelevant if the
education received is incoherent and fails to
guarantee the common ground of training and outlook on
which our society depends. Yet the commission remains
silent on these critical points."
Mitchell added, I think with good reason, "There is
certainly much more to come on this story."
Mitchell's permalink is http://phibetacons.nationalreview.com/post/?q=Nzg0ZmFiNmI3NzVlMTFkNDY3YzUzYWIyMDY0NWFlNzE=
National standardized testing
In the meantime, ETS has released a report calling for
"a broad national system to better understand student
learning in two- and four-year colleges and
universities." To do that, ETS specifically recommends
"a systematic, data-driven, comprehensive approach to
measuring student learning with direct, valid and
reliable measures."
The ETS report is titled "A Culture of Evidence:
Postsecondary Assessment and Learning Outcomes." It
notes the federal commission's deliberations and
recommends that the regional accrediting associations
develop a national plan for testing on "four
dimensions of student learning":
-- workplace readiness and general skills
-- domain-specific knowledge and skills
-- soft skills such as teamwork, communications and
creativity
-- student engagement with learning.
"Colleges and universities face continued pressure to
prove their effectiveness in an increasingly difficult
fiscal environment," said Mari Pearlman, Senior Vice
President of Higher Education at ETS, in a press
release posted to the MarketWire public relations
service. "We hope this paper will further the
discussion about how our system of higher education
might respond to this challenge."
The ETS press release, which contains a link to the
report in PDF format, is available at http://www.marketwire.com/mw/release_html_b1?release_id=145859 ...
I hope I don't sound cynical if I note that ETS
(originally known as the Educational Testing Service)
is a leader in the standardized test business. Its
products include the SAT, the GRE, the TOEFL and high
school advanced placement tests.
-- Pete Ellertsen is chairman of SCI's assessment
committee and editor of Nuts & Bolts.
Wednesday, July 19, 2006
Is this what the future looks like?
Proof, as if it were needed, that ideology has nothing to do with political meddling in the classroom comes today from Great Britain. It came in the form of a news report in The Guardian of a House of Commons committee hearing. Testifying was Alan Johnson, education secretary in Britain's Labour Party government. He defended the emphasis on standardized testing imposed by Labour's Office for Standards in Education ("Ofsted" for short). The Guardian reports:
Speaking to the House of Commons education select committee, Mr Johnson said staff at a school in Nottingham had told him recently that they would like to see league tables [ranking schools by test scores] scrapped.
"I accept the pressure it puts, and the extra intensity and stress it puts on teachers, but it's absolutely the right thing to do," he said.
Mr Johnson gave his backing to "the whole kit and caboodle" of accountability for schools - from Ofsted inspections to national tests and exams and league tables.
He added: "If anything, we need to intensify that rather than relax."
Mr Johnson said it was "fundamental" that children should leave primary school with a mastery of reading and maths.
Sound familiar?
Under Prime Minister Tony Blair, Labour has won elections since the late 1990s with a "New Labour" set of moderately liberal policies similar to former U.S. President Bill Clinton's. School reform; "league tables," well-publicized lists of schools' aggregate test scores; and pressure on classroom teachers to raise test scores are part of the "whole kit and caboodle" New Labour offers to the voters. It sounds more than a little bit like our No Child Left Behind regimen of mandatory testing and ranking of schools by aggregate test scores, doesn't it?
British educators, like the teachers in Nottingham or Harvey Goldstein of the Institute of Education in London, argue the league tables can't help but measure factors like "sex, ethnic origin and social class background" that the schools can't be held responsible for.
Sometimes we tie the failure of NCLB to President Bush and the Republican Congress, but we forget the NCLB bill was co-sponsored by Sen. Edward Kennedy, D-Mass., and passed Congress with broad bipartisan support. Again, the similarity between Ofsted's accountability measures and NCLB is striking.
If anything, the British system is more hostile to good classroom teaching than our own. And it's good politics. At today's committee hearing, Johnson was kidded about his political ambitions. Here's how the exchange went:
The Conservative MP for Reading East, Rob Wilson, told Mr Johnson he had "a few quid" on the outcome.
Mr Wilson asked: "When Tony Blair steps down next year and you take over as prime minister will your priority be, as his was, 'education, education, education'?"
The Labour chairman of the committee, Barry Sheerman, suggested at this point that the minister might like to restrict his answer to education policy.
In response to Mr Wilson's question, Mr Johnson said: "Yes. I would probably classify it as 'learning, learning, learning', but it's the same thing."
And I would classify it as politics, politics, politics. Unfortunately, bashing classroom teachers looks like good politics on both sides of the water.
Wednesday, July 05, 2006
Resources on Native music
Cross-posted to music and teaching blogs for potential use in HUM 221 (Native American cultures) in the spring of 2007.
A valuable article in the Jan.-Feb. 2003 issue of Sharing Our Pathways, newsletter of the Alaska Rural Systemic Initiative at UA-Fairbanks. It's by Vivian Martindale, and it's titled "Native American Songs as Literature." In addition to an ANKN (Alaska Native Knowledge Network) article on the Athabascan peoples, it mentions Joy Harjo, Canyon Records and other resources on Native cultures in the lower 48.
Says Martindale:
Classrooms don't have to be boring. Literature classes especially can be enhanced through the medium of song. In David Leedom Shaul's article "A Hopi Song-Poem in Context", he claims that the listener is similar to an audience during storytelling, in that the listener is also interacting with the music. The listener, as a participant, is not passive; the listener is hearing rhythms, words, patterns and much more. The listener does not have to understand the Native language in order to appreciate the song. Shaul calls attention to the genre called "song poems." These songs are in a category by themselves, separate from poetry and prose. "The text of song-poems in Hopi culture, like much poetry, seemingly create their own context by virtue of minimalist language" (Shaul 1992:230-31). Therefore it would be interesting to include the concept of song poems or poetry as music into a curriculum.
She quotes this from a Joy Harjo/Poetic Justice song called "My House is the Red Earth." (Poetic Justice is Harjo's band.):
My house is the red earth. It could be the center of the world. I've heard New York, Tokyo or Paris called the center of the world, but I say it is magnificently humble. You could drive by and miss it. Radio waves can obscure it. Words cannot construct it for there are some sounds left to sacred wordless form. For instance, that fool crow picking through trash near the corral, understands the center of the world as greasy scraps of fat. Just ask him. He doesn't have to say that the earth has turned scarlet through fierce belief, after centuries of heartbreak and laughter.
She also has tips and caveats on teaching traditional Native American music.
Tuesday, July 04, 2006
The Chronicle's take on Miller commission
Since The Chronicle of Higher Education usually hides its articles behind a subscription firewall, I haven't kept up with its coverage of the U.S. Education Department's blue-ribbon Commission on the Future of Higher Education. But every so often The Chronicle comes out from behind the firewall, and this week they've got a good takeout on the commission's draft report from the July 7 issue. Like practically everything else in The Chronicle, it's thorough and very well balanced.
Written by Kelly Field, the article catches the tone of the Commission's debate in the headline: "Draft Report From Federal Panel Takes Aim at Academe." A subhead notes the split between chairman Charles Miller and educators on the commission. It also details some of the substantive recommendations that have surfaced thus far:
A draft report released last week by the Commission on the Future of Higher Education called for overhauling the federal student-aid and accreditation systems, easing the process of transferring credits between institutions, and using testing to measure the "value added" by a college education.
The report, which the panel discussed during a closed meeting two days after it was released, also endorsed the creation of a national "unit record" system to track the educational progress of every college student in the United States.
With that on the record, Field goes on to sketch in the controversy on the commission over the tone of its deliberations. Some of the complaints are procedural, reflecting concern that the commission will railroad through a predetermined set of recommendations. Field says:
... several commission members were unhappy with both the substance and the tone of the preliminary report, which was written by an outside writer with assistance from commission staff members. Some said it favored the views of the consultants who drafted the commission's issue papers over the opinions of the commissioners themselves.
"This really reflects what the consultants put in the papers and what they would like the commission to say," said James J. Duderstadt, president emeritus of the University of Michigan at Ann Arbor. "It doesn't have any relationship to the kind of deliberations we had at the May meeting," when members began sifting through potential recommendations in an effort to reach an initial consensus.
David Ward, president of the American Council on Education, said the report was "based on a highly selective reading of testimony" and "in no way reflects the candid and creative discussions we have had during our yearlong process."
"I believe it is seriously flawed and needs significant revision," he wrote in a letter to college presidents.
On the other side of the issue, Field quoted Richard K. Vedder, an economist who writes for the right-wing American Enterprise Institute, who said the report represented "a good starting point," and Sara Martinez Tucker, president and chief executive of the Hispanic Scholarship Fund, who said she was "very pleased with the completeness of it." Field explains:
Ms. Tucker said she created a matrix of all the ideas that came out of the commission's task forces, cross-referenced it against the report, and found that only three of her colleagues' suggestions were missing.
"Some of the ideas may be buried, or not as prominent as people would want, but they're in there. You just have to look," she said, noting that the unit-record proposal — her No. 2 priority — is not mentioned until Page 22 of the 27-page report.
Still, there's this question of tone. It's dogged the Commission since day one, and it won't go away. Field reports:
Other panel members were troubled by the tone of the report, which began by noting that American higher education "has become one of our greatest success stories," but quickly turned to "the less inspiring realities of college life in our nation": the enrollment gap between rich and poor, the high use of remedial courses, rising costs, and a failure to prepare American workers for a changing global economy.
The report went on to describe colleges as "risk-averse, frequently self-satisfied, and unduly expensive," and blamed rising tuitions on colleges' "failure to seek institutional efficiencies and by their disregard for improving productivity."
Robert W. Mendenhall, president of Western Governors University, an online, nonprofit institution, called the report "overly negative and overly focused on the academy as the culprit." And Ms. Tucker said she worried that the report's get-tough tone could backfire, alienating, rather than engaging and inspiring, academe.
Mr. Miller defended the draft, noting that Secretary Spellings had called on the commission not to be "shy or mealy-mouthed." In an interview, he said panel members' repeated calls for "moderate" language have left him feeling "almost like I'm being censored."
Mr. Miller also stood by his decision to have the panel's outside writer produce a complete draft, rather than an outline or set of recommendations, as was initially planned. Several panel members who received the full report a week before it was released to the public said they had been surprised by the abrupt change in plans.
He called the idea of offering recommendations before documenting the problem "an Alice in Wonderland idea: 'answers first, questions later.'"
"My way is the honest way, the direct way," he said.
Note Miller's language. His way is not an honest way, it's the honest way, implying all other ways are something other than honest. Perhaps it's just a chance turn of phrase. Or perhaps Miller's tone is hostile and combative.
Field's article ends with a valuable list of specific recommendations so far on issues of Access; Affordability; Quality and Innovation; and Accountability (with the parties who would carry them out listed in parentheses). I'll quote the recommendations on accountability below:
- Require institutions to measure student learning using measures such as the National Survey of Student Engagement and the Community College Survey of Student Engagement, as well as the Collegiate Learning Assessment and the Measure of Academic Proficiency and Progress (states). Provide incentives for states, higher-education associations, systems, and institutions to develop outcomes-focused accountability systems (federal government).
- Make results of such measures available to students and report them publicly in the aggregate. They should also be included on transcripts and in national databases of accountability data. Institutions should make aggregate results publicly available in a consumer-friendly form.
- Administer the National Assessment of Adult Literacy every five years, instead of 10 (Education Department).
- Require the National Center for Education Statistics to prepare timely annual public reports on college revenue and expenditures, including analysis of the major changes from year to year, at the sector and state levels (secretary of education).
- Develop a national student unit-record tracking system to follow the progress of each student in the country, with appropriate privacy safeguards.
- Create a consumer-friendly information database on higher education that includes a search engine that allows parents, policy makers, and others to weigh and rank institutions based on variables of their choosing (Department of Education).
- Establish a national accreditation framework that contains a set of comparable performance measures on learning outcomes appropriate to degree levels and institutional missions, and that is suitable for accreditation, public reporting, and consumer profiles; that does not prescribe specific input and process standards; and that requires institutions to report progress relative to their national and international peers.
- Make accreditation more transparent. Make the findings of reviews easily accessible to the public, and increase the proportion of public representatives in the governance of accrediting organizations and members of review teams from outside higher education.
Saturday, July 01, 2006
Snake oil and 'Texas-style accountability'
I'm cross-posting this item to my blogs on newspapering and education, for reasons that should be obvious as we go along.
Are we seeing the beginning of an orchestrated effort to discredit American colleges and universities? After months of being mostly ignored by the news media, Charles Miller, the chairman of a blue-ribbon federal Commission on the Future of Higher Education gives an interview to his home-town paper. He blasts higher ed, and he blasts the members of his commission who dispute his rhetoric. Staff writer Ralph K.M. Haurwitz of The Austin (Tex.) American-Statesman reports in Friday's paper:
Charles Miller expected a fight from higher education administrators when he agreed to head a national panel for his old friend, Margaret Spellings, the U.S. secretary of education. He's getting one.

The headline catches the tone of Miller's remarks: "Chairman defends panel's call for reforms in higher education." But his proposed reforms - which are not yet the commission's because they haven't been adopted yet - have been roundly questioned in The New York Times and a few papers like The Boston Globe in major metro areas where the commission has conducted hearings. (The Harvard Crimson, a student paper with an understandable ax to grind, has followed the commission more faithfully than any of the dailies.) So why does Miller answer his critics in a paper down in Texas and not The Times, The Globe, The Harvard Crimson or the papers that have covered the commission's debate? And why, for that matter, does The American-Statesman write up Miller's defense without interviewing his critics?
The Commission on the Future of Higher Education issued a draft report this week recommending academic and fiscal reforms. Some higher education leaders, including a few on the commission, have criticized the draft as overly harsh in tone and too quick to condemn academia.
Miller, speaking from Houston on Thursday, a day after the commission met to review the draft, didn't sound like someone interested in backing down on substance and perhaps not too much on tone.
"I've been advised to say things in moderate terms, to not criticize the academy," Miller said, declining to say who offered such advice. "It's almost like being censored. Some of the language ... could be toned down, but the real issue is putting responsibility on the higher education system for things it's not doing well. It has some really bad flaws."
Now I'm going to assume The American-Statesman down in Austin got the story on its own. Miller is a former chairman of the University of Texas Board of Regents, and he might have mentioned it back home. Word might have gotten around town, and the paper might have decided to get to the bottom of it. Sometimes that's the way we got stories when I was on the courthouse beat. Of course Miller could have leaked it to a friendly paper, too, but I have no way of knowing that. So I won't speculate.
Miller's friend Margaret Spellings was in the news last week, too. At an international conference in Athens, Greece, she spoke on "higher education and the benefits of partnering with the private sector to prepare students for jobs in the 21st century." And by golly, she just happened to mention the Miller commission:
In launching this Higher Education Commission, we recognized that to remain a quality system we had to ask the tough questions and anticipate necessary changes that can and must be made if we are to have a robust system 50 years from now – especially as needs for all become greater.

I can't find any evidence on the internet that the media picked up the story, but Secretary Spellings' remarks were helpfully posted on the U.S. Education Department website.
As a nation, we spend more than $300 billion dollars a year on higher education – a third of which comes from the federal government. Yet, we have very little information on what we are getting in return for that investment. And what we do know is cause for action.
Miller noted the same statistical factoid in his interview. Here's how The American-Statesman reported his remark and put it in context:
The federal government covers a third of the nation's higher education spending but less than 10 percent of the K-12 investment. Yet the federal government exercises more control over primary and secondary education — through Texas-style accountability that Bush parlayed into a national policy — than it does over colleges and universities.

On that note, the paper segued to Miller's recommendations: streamlining and increasing financial aid, better record keeping, encouraging "colleges and universities to develop new and better methods of controlling costs and improving productivity," and encouraging "states to require public colleges to measure student learning using tests, such as the Collegiate Learning Assessment, that examine critical thinking, reading, math and other skills."
Miller, former head of the University of Texas System Board of Regents, said Thursday that he regards significant change as not only urgently needed but inevitable.
"If you have a very inefficient and very expensive enterprise, which higher education is now, and huge changes in technology and a cultural change in how people use this technology, that's almost a guarantee that some entity somewhere is going to develop a very effective way to deliver these skills at a much cheaper price," he said. "It could be in a country where they don't have a set of institutions to be angry about change.
"It'll have such demand that you'll have explosive growth that could sweep the higher education system like a tsunami. Supply creates a demand sometimes, not the other way around," Miller said, citing as an example the advent of personal computers and software to run them.
Miller said he knew from the start of the commission's work last year that some in higher education circles would be highly skeptical of his leanings. "I was from Texas and a businessman and worked on accountability and a Bush friend," he said. "I was in about the worst category you could be in."
All of this bears watching, but Miller's last points bear especially careful scrutiny. The membership of his commission is weighted toward industry and people with a vested interest in test prep and for-profit educational ventures rather than academicians, and he has consistently touted one specific standardized testing product every time he mentions the subject of testing.
What is it that makes this old courthouse reporter think Miller and his friends from Texas are peddling snake oil, and that if they have their way, somebody, somewhere is going to make a big ole Texas-size pile of money as we move into the future of higher education?
Wednesday, June 28, 2006
A standardized essay test?
A couple of weeks ago, lifestyle reporter Barbara Brotman of The Chicago Tribune had a first-person Sunday story on her experience taking the new SAT essay test. She made some interesting points along the way about standardized testing ... she actually seemed to enjoy filling in the bubbles! ... and her experience points up something I think journalism teachers and journalism students ought to be more aware of.
In a nutshell: The kind of writing we do for the public is not the kind of writing that is privileged in the academy. Certainly, as Brotman found out, it isn't the kind of writing that gets top scores on the SAT.
Which leads me to wonder whether a one-size-fits-all standardized test like the SAT discriminates against kids who want to be professional writers.
I've been reading Brotman's stuff in The Trib for 10 years now. Among other things, she writes engagingly about the experience of raising kids in the city of Chicago. And in one especially memorable column when Mayor Daley and then-Gov. Jim Edgar were squabbling about a third airport (to supplement O'Hare and Midway), she interviewed day-care service providers about how they might go about helping the politicians debate the airport issue with more civility and maturity. Several, as I recall, suggested putting both the mayor and the governor in the "time-out corner." So when Brotman took the SAT, along with a college-bound daughter, and wrote up the experience in the June 11 Perspective section, I settled in for some enjoyable reading. She didn't disappoint:
Inside [the testing site], I stood in a stairwell, waiting to check in. In front of me, high school students yawned, downing breakfast bars and trying not to stare at the mom in their midst. The staffer checking IDs gave mine the hairy eyeball, but my name was on the list. I was in.
I took a seat in the back of the classroom and put on my reading glasses. At 8:15 a.m., we began.
The essay was first. The question:
"Does the success of a community--whether it is a class, a team, a family, a nation, or any other group--depend upon people's willingness to limit their personal interests? Plan and write an essay in which you develop your point of view on this issue."
And I thought--nothing.
Maybe there was something, a combination of emptiness and panic that swirled into a single thought: "Huh?"
Suddenly I realized that I, too, had stakes riding on the test. No one expected me to do well on the math. But how humiliating would it be if I blew the essay?
I got cracking. It wasn't anything to write Harvard about--how strange it felt to write by hand--but I used the words "relinquish," "communal" and "dovetailing." Take that, holistic scorers!
And then it was on to the next section. And the next, in a rhythm of test segments and brief breaks that was to continue for 4 1/2 hours.

The verbal (or "critical reading," as the SAT now calls it) sections Brotman enjoyed, "like doing needlework," and math, well, she survived it. In fact, she knocked the top off the reading test and did better than she'd expected in math. But her essay fell below expectations, into a range that "exhibits adequate but inconsistent facility in the use of language" and "has some errors in grammar, usage, and mechanics." If you write for a major metro daily, this isn't exactly how most people judge your work.
Say what?
So Brotman did what comes naturally for any reporter for a major metro newspaper (or a county seat weekly, for that matter). She worked the phone, got ahold of a spokesman for the firm that administers the SAT and started asking questions. Her problem, according to Edward Hardin, content specialist in English language arts at the College Board, was "sound bites." Here's how Brotman tells the story:
My nose was somewhat out of joint. Edward Hardin, content specialist in English language arts at the College Board, graciously put it back.
Reached while he was meeting with six veteran SAT consultants to choose future essay subjects, he gave them my essay to read and grade again. Three gave it a 4; three gave it a 5 [out of 6 possible]. When he told them the author was a journalist, he said, they were not surprised.
My essay read like "sound bite writing," he said. "You have a lot of really interesting points and examples, but they tend to be one or two sentences, kind of scattershot. It jumps around a little bit, giving little chunks of information rather than sustained examples. There was some question as to whether it was building toward something or jumping around with random thoughts about the topic."

Brotman summed up her reaction in one word: "Ouch." I would give it a little more than that.
You see, the underlying structure of most good journalistic writing isn't sustained exemplification (to lapse into academic jargon for a minute) but narrative. When we cover an election or a three-alarm fire, we don't rush back to the newsroom and write an election essay or support a fire argument. We write a story. Our stories can be kind of scattershot, I'll admit, and sometimes they jump around a bit, because we're writing about people, and people's lives are like that. That's just the way of the world. But it all works out, because our readers need that narrative drive behind the story. If they don't get that sense of story from what we write as we cover the news, they move on to the sports page, the funnies or the school lunch menus.
"Story is the mother of all forms of writing," says Donald Murray, a Pulitzer Prize-winning newspaper columnist who made the transition to academia as a professor at the University of New Hampshire, and has also written fiction and poetry. "Despite some intellectuals' lack of respect for traditional narrative, it is the principal way we all, intellectuals included, explore, understand, and explain our world to each other. We live and believe the narratives we have woven from our past and our experience. More than we realize, we see the world through story" (152).
But story, of course, isn't what they're looking for on the SAT.
It's not that there isn't a legitimate place for the kind of writing that gets kids a 6 out of 6 on the SAT essay. When I teach freshman English composition, I'm basically teaching argumentation, and it can be a struggle to get my students used to thinking in terms of supporting an argument with sustained evidence. But teach it I must ... even though I have to turn around and un-teach some of the more sustained academic windbaggery that goes along with it when I get the same kids as sophomores in basic newswriting. I don't pretend to be a rocket scientist, but somehow I manage to accommodate more than one style of writing in my students. I would recommend that attitude to the standardized testing industry.
It's as simple as this: Something is going badly wrong in the academy when a professional writer at one of the top 10 newspapers in the country scores in the "adequate but inconsistent" range on what is supposed to be a valid, reliable measurement of writing ability.
I've tried both academic and professional newspaper writing, and I would submit that good journalistic writing requires more craft and discipline than a lot of what passes for academic discourse (including my own). It's also been my experience that kids who are attracted to journalism have a pretty well developed flair for narrative writing by the time they get to college. Why? They like to read the papers, and they like to write the same kind of copy they like to read. So I am troubled by a one-size-fits-all standardized test that valorizes one style of writing above all others and penalizes the very students who are most likely to flourish as professional writers.
Works Cited
Brotman, Barbara. "Taking the SAT -- It's Not Just for Teenagers." Chicago Tribune 11 June 2006. Online ed. 28 June 2006. http://www.chicagotribune.com/news/opinion/chi-0606110190jun11,1,7867589.story
Murray, Donald. Writing to Deadline: The Journalist at Work. Portsmouth, NH: Heinemann, 2000.
Friday, June 23, 2006
Quote from St. Angela Merici
Found while cleaning my office, an inspirational quote from St. Angela Merici, founder of the Ursuline order:
Do something,
get moving,
be confident,
risk new things,
stick with it,
then be ready for
BIG SURPRISES!
(as translated by Sr. Terry Eppridge, OSU)

I found it in a three-fold brochure promoting the Ursuline Companions in Mission, Central Region, Crystal City, Mo. I'm posting it to the teaching blog so I won't lose it again as the office-cleaning progresses.
Friday, June 09, 2006
Public policy T-shirt
On sale at this week's Central/Southern Illinois Lutheran Synod meeting was a black-on-yellow T-shirt promoting the www.lutheranadvocacy.org website. On the back it displays the following:
A Six-Point Plan for Effective Public Policy:
Truly I tell you, just as you do this for the least of my brothers and sisters, you do this for me.
- When I was hungry, you gave me something to eat;
- When I was thirsty, you gave me something to drink;
- When I was a stranger, you welcomed me;
- When I was naked, you clothed me;
- When I was sick, you comforted me;
- When I was in prison, you visited me.
-- Matthew 25

LutheranAdvocacy.org is a joint ministry of Lutheran Social Services of Illinois (LSSI), the three regional synods of the Evangelical Lutheran Church in America (ELCA) in the State of Illinois and the ELCA's Division for Church in Society. It is the ELCA state public policy office for the state of Illinois.
Flag pledge -- in Dena'ina language
Here's an interesting angle on the move for "English only" laws that accompany the current round of hysteria over immigration. It comes from an anthropologist's recollection of leading the Pledge of Allegiance with a Dena'ina Athabascan elder named Peter Kalifornsky at a school on Alaska's Kenai Peninsula.
I'll let the anthropologist, Alan Boraas of Kenai Peninsula College, tell the story as he wrote it up for The Anchorage Daily News several years later:
Two hundred grade schoolers make a lot of noise even when being shushed by their teachers, and I was a little ambivalent when we stepped to the microphone. I cleared my throat, Peter cleared his, and we began:
"Dek'nesh'uh bet'uhdi_t'ayich"' Peter read. "I pledge allegiance," I repeated. "Naq'ach' k'iniyich'," "to the flag," "ts'e_q'i k'i_anich'ina," "of the United States of America."
As we read, the children became curiously silent. Johnny stopped pulling Sally's pigtails, Betty and Amy stopped giggling, and Ricky, off in his own space, suddenly was captivated. As one, they stared intently at the frail old man speaking a strange language they didn't understand. They were not confused, but awed. Even the school district administrators paid attention.
The children seemed to sense that this was the language of their place. An ancient language with ancient roots. Though they came from many backgrounds, subconsciously they seemed to want to connect to those roots. After the program was over I stood to the side talking with some acquaintances, and I happened to look over toward Peter. Forty or so kids had gathered around him. They were quiet and respectful with a look not so much of admiration, but of wonder. It was as though there was something missing in their lives that this mysterious old man and his ancient language could satisfy. They would draw near and reach out their hand, and he would reach out his and touch them. Then they would drift away and others would press to the front for a chance to touch the hand of a man who held the secret to their connection to their place.

Now here's the kicker. Again, I'll quote Boraas:

In one of the supreme ironies of our time, reading the Pledge of Allegiance in a Native language could be illegal today. With the passage of Alaska's English-only law, English is the only language that can be used in government functions.

There's another level of irony here, too. Kalifornsky was descended from a Dena'ina Athabascan man who converted to Christianity in the mid-1800s when he worked at a Russian outpost in California. (Hence the name.) When Alaska was a Russian colony, the Russian Orthodox Church promoted the use of Native languages and it was not uncommon for people to be bilingual, even trilingual. After the U.S. took over, the new territorial government brought in Protestant missionaries and English-only schools like those in the "lower 48." Now, a hundred years later, the Native languages are dying out. Kalifornsky, who died in 1993, devoted his last years to developing a written Dena'ina language and writing down many of the old Athabascan stories in their Native language.
Saturday, May 27, 2006
Reader Response: A portal page
An awful lot of alliteration there!
As I look for material on reader response to use in the Native American course, I keep running across a website called The Expanding Canon: Teaching Multicultural Literature in High School produced in 2003 by Thirteen/WNET New York in collaboration with the National Council of Teachers of English. It's pretty extensive, and I need to make a project of downloading it and reading it. But in the meantime, here are a few quotes I've come across -- copied here just so I'll know where they are.
Here's the discussion that got me started. It's from an introduction to reader-response theory in the first lesson module:
Language arts teachers at all levels now widely accept central tenets of the theory, particularly the notion that learning is a constructive and dynamic process in which students extract meaning from texts through experiencing, hypothesizing, exploring, and synthesizing. Most importantly, teaching reader response encourages students to be aware of what they bring to texts as readers; it helps them to recognize the specificity of their own cultural backgrounds and to work to understand the cultural background of others.

It was that cultural angle that first attracted my attention. But I got the feeling a lot of the students haven't done much in the way of reacting to the arts in general.
Another passage I stumbled across in a Google search, from the fifth program on cultural studies in the classroom, suggests how cultural studies and reader response might work together:
Cultural studies ... is particularly valuable for teachers of multicultural literature because it focuses on the social divisions of class, gender, ethnicity, and race. Cultural studies looks at the ways in which meanings, stereotypes, and identities (both collective and individual) are generated within these social groups. The practice of cultural studies almost always involves the combination of otherwise discrete disciplines, including literature, sociology, education, history, philosophy, communications studies, and anthropology. An interdisciplinary approach is key to an understanding of these issues, because it allows students to study and compare multiple, varied texts that deal with the culture and history of a particular group.
And a bit that I particularly like expands the definition of "text" to something more like what I want to do in the humanities courses:
The central teaching strategy of cultural studies is intertextual reading: comparing each literary text to culturally related texts. By reading literature in the context of other cultural works, students learn how the literature they study both creates and reflects cultural beliefs.
I like that. "Intertextual response." Is that the word I'm looking for? More googling is in order, I guess, to find out what, if anything, "intertextual" means in literary theory.
Texts for this practice may be drawn from almost any source: advertising, television, historical documents, visual artwork, legal documents, theological writing, etc. It's best to contextualize literature with primary sources or compilations of primary sources. Teachers should also look for texts that raise issues with which their students can identify. For example, in this session, Ishmael Reed's poetry and Graciela Limón's novel both look at transformative journeys, which students may relate to their own experiences.
When using this intertextual approach, teachers will want to brief students before giving them materials to read. It's usually helpful to explain that students will be asked to look for ways in which the different texts address similar issues; it's also useful to explain that students will be asked how these texts reinforce or challenge our ideas about those issues. Teachers may also want to offer general information about the texts: when they were written, by whom, for what purpose, etc. Finally, teachers may want to provide background about the characters and images they'll find. For example, when teaching Reed's "Railroad Bill, A Conjure Man," teachers can describe the trickster figure and his role in African stories, African American folklore, and legends before encouraging students to look for trickster references in the poem.
Friday, May 19, 2006
HUM 221: Alaska Native values
Link to faculty page -- also add to syllabus along with the links on Haudenosaunee and Lakota values ...
The Alaska Native Knowledge Network at the University of Alaska in Fairbanks has a webpage on Native values with links to separate webpages on:
Alutiiq | Athabaskan/Athabascan | Iñupiaq | Cup'ik | Yup'ik | Tlingit
Here's the overview:
ALASKA NATIVE CULTURES all hold certain values to be paramount to their cultures. This website showcases some of the similarities and individualities of various Alaska Native groups and their values. Below is a list of some important values all Alaska Native Cultures share, and at the bottom of the page are links to individual culture pages.
A valuable resource, and a good way to start off the semester next year.
- Show Respect to Others - Each Person Has a Special Gift
- Share what you have - Giving Makes You Richer
- Know Who You Are - You Are a Reflection on Your Family
- Accept What Life Brings - You Cannot Control Many Things
- Have Patience - Some Things Cannot Be Rushed
- Live Carefully - What You Do Will Come Back to You
- Take Care of Others - You Cannot Live without Them
- Honor Your Elders - They Show You the Way in Life
- Pray for Guidance - Many Things Are Not Known
- See Connections - All Things Are Related
Thursday, May 18, 2006
Faculty report 2005-06
Note: I just finished my end-of-year faculty report for the Dean of Academic Affairs, and I'm posting it here for convenient reference.
Faculty Report, 2005-06
Peter Ellertsen
Courses taught:
Fall Semester: Communications 150 (intro to mass comm.), 15 students; COM 221 (intro to public relations), 20 students; English 111 (rhet./comp.), two sections totaling 30 students. New Horizons: COM 221-70 (intro to PR), 7 students.
Spring: COM 209 (basic newswriting), 12 students; COM 222 (intro to advertising), 18 students; ENG 111, one section with 10 students; Humanities 221 (Native American cultural expression), 28 students; COM 296 (capstone), 2 students; and COM 199 (independent study), 2 students. New Horizons: COM 150-70 (intro to mass comm.), 5 students. I taught HUM 221 and COM 199 for the first time this semester.
Committee assignments: Chair, assessment committee.
TEACHING STRATEGIES:
What changes did you make in your teaching this year? Please comment on any successes or failures you had in making changes to the syllabus, using a new textbook or material, or trying a new teaching strategy.
The biggest change, perhaps, was introducing Common Student Learning Objectives and Course Based Student Learning Objectives on my syllabi and beginning to use them more rigorously as I developed individual lesson plans. I’m finding it gives me more focus, as does the use of classroom assessment techniques over time.
I also taught a new course, in Native American cultural expressions, in a new field – the humanities. So I had to get up to speed on the field as well as course content; this involved a lot of experimentation with assignment formats, which is an ongoing process I expect to continue next school year. At the end of the spring semester, I opened a teaching weblog at http://teachinglogspot.blogspot.com/ in which I am exploring some of these issues more fully. This grows in part out of my experimentation with the use of blogs as a teaching tool during the spring semester, posting links to supplementary readings and prompts for writing assignments to the personal blogs linked to my faculty webpage at http://www.sci.edu/classes/ellertsen/welcome.html. The experience was encouraging, and I plan to do it in a more methodical way in the coming academic year.
Teaching mass communications majors in the new baccalaureate program will also require some adaptation of my teaching methods, since students’ motivational level and competencies should be higher than I have found to be the case with Gen Ed students.
In what area of teaching do you think you made the most improvement this year?
Lecturing (which is something I thought I would never do but turns out to be worth pursuing after all). I usually tuned out on classroom lectures in my student days, but this year I was assigned a classroom (Dawson 220) that required more lecturing on my part than in the past because of an instructor-centered (as opposed to student-centered) classroom design. It isn’t a style of presentation that comes easily to me, but I’ve reflected on lecturers who held my attention in grad school (there were a few) and read up on the Socratic method and lecture techniques on the Internet as I try to develop a style that fits my personality and teaching goals. I’m still awkward at it, since it involves a major change in the way I conduct classes. But I think I’m coming along. Only one student evaluation said I was too long-winded in class (fewer than in the past, come to think of it), so I think I’ve made a good start.
What would you like to improve on next year?
Lecturing! At my suggestion, I’ll meet lecture sections of my humanities classes in Becker L15 twice a week and a lab section in a computer lab once a week. This has been OK’d by the scheduling committee. Since the humanities courses involve music and other art forms in addition to printed literature, I plan to make more use of electronic media – videos, sound tapes, etc. – next year. This is another area I need to work on next year, since basically I am a technological klutz.
Were there any other factors that helped or hindered your performance this year?
As always, I was able to work in an environment that values classroom instruction even when I had to spend a lot of time on accreditation and other issues that took time away from teaching. That’s important, and I think I tend to overlook it when I do these end-of-year reports.
One factor that took some getting used to was teaching in a computer lab that was designed more for lecturing and PowerPoint presentations than for writing and classroom discussion. In D220, all the student desks face the front of the room and there isn’t enough space between rows for me to work individually with students at their computers as they are writing or researching issues on the Internet in class. In all other regards, I like the classroom and plan to keep adapting my introductory public relations and advertising courses to a lecture format in order to continue meeting there. I have asked members of the scheduling committee to assign my writing and editing courses in the future to computer labs like D22 or L16 that have sufficient space for me to intervene in student writing processes; their enrollment has tended to be between 10 and 15 students, so I anticipate no problem in being able to teach them in the smaller labs. My solution to the problems posed by the classroom design in my humanities courses is detailed above.
PROFESSIONAL DEVELOPMENT
In addition to your teaching, what other professional activities did you participate in during this academic year?
Membership in professional organizations:
- National Council of Teachers of English
- Society of Professional Journalists
Conferences/Workshops:
Presented paper “Sacred Harp Singing in a Living History Environment” to the Fall Conference of the Midwest Open Air Museums Coordinating Council, Eagle Creek Conference Center, Findlay, Illinois, November 10, 2005. Participated in Appalachian Dulcimer Week, Western Carolina University, for “learning and promoting the dulcimer’s history and traditional playing styles,” June 2005.
Other:
- Volunteer interpreter at Lincoln’s New Salem State Historic Site. I interpret the log schoolhouse during the summer and sing with the New Salem Shape Note Singers.
- I coordinate meetings of the Prairieland Dulcimer Strings, a community group of amateur musicians that meets at my church (Atonement Lutheran).
INTERACTION WITH STUDENTS
Did you serve as an academic adviser this year? No.
Did you serve as an adviser to any student organizations? Yes. If yes, please list. The Sleepy Weasel, campus literary magazine. My duties are primarily editorial.
In what ways have you supported other college activities this year?
Editor of Nuts & Bolts, the assessment newsletter. In addition, I consciously assigned my freshman English and journalism students to observe and write about Free Food Days and other on-campus activities, partly for the writing experience and partly to expose them to the student activities. I make a practice of donating copies of The Chicago Tribune and several music magazines to the Resource Center when I’ve finished reading them. I very much like what Joanna Beth Tweedy has done so far to create a welcoming, productive atmosphere in the Resource Center. It reminds me of the commons rooms in a more traditional school, and I try to do as much as I can to help her in these efforts.
ASSESSMENT
Have you used any means of assessing how students are learning (other than grading course work)?
I use several Classroom Assessment Techniques, primarily reflective essays. Often I embed them in the final exam, and perform a rudimentary content analysis on student answers to rather broad questions in order to see how often they mention topics that are covered in the CSLOs and CBSLOs on the syllabus. For more information on the technique, see the student prompt at http://www.sci.edu/classes/ellertsen/reflect.html linked to my faculty page.
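For what it's worth, the "rudimentary content analysis" described above can be sketched in a few lines of code: count how many student essays mention each topic from the syllabus at least once. This is only an illustrative sketch, not anything I actually run; the essay texts and topic keywords below are hypothetical examples, not drawn from my real CSLOs or student work.

```python
# Minimal sketch of a rudimentary content analysis: for each syllabus
# topic (a keyword or phrase), count how many student essays mention it
# at least once. Topics and essays below are hypothetical examples.
from collections import Counter
import re

def count_topic_mentions(essays, topics):
    """Return a Counter mapping each topic to the number of essays
    that mention it at least once (case-insensitive, whole words)."""
    counts = Counter()
    for essay in essays:
        text = essay.lower()
        for topic in topics:
            # \b anchors keep "art" from matching inside "partly"
            if re.search(r"\b" + re.escape(topic.lower()) + r"\b", text):
                counts[topic] += 1
    return counts

essays = [
    "I learned how oral tradition shapes storytelling.",
    "The unit on storytelling and music was clearest to me.",
]
topics = ["oral tradition", "storytelling", "music"]

print(count_topic_mentions(essays, topics))
# Counter({'storytelling': 2, 'oral tradition': 1, 'music': 1})
```

Even a tally this crude shows at a glance which syllabus objectives surface in student reflections and which go unmentioned.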
I often use a variation of the “one-minute essay” CAT – asking students the clearest point, the most confusing point and the point(s) they want to know more about -- in a variety of ways. These range from a quick survey, either at the end of class or not infrequently during the class when I suspect we’ve gotten off track, to framing reflective essay questions that embed a clearest-point, most-confusing-point rubric into final exams or other assignments at the end of the semester. It has gotten to be pretty integral to the way we move through the material in a course.
What would most enable you to use consistently some assessment tools or techniques that would tell you about the College and student learning?
Since I edit the assessment newsletter, as in the past I’m more interested in the answers other instructors give to this question. I use reflective essays a lot in my own classes, as stated above, and am getting more focused in the questions I embed in the essay prompts as I gain experience with the technique.