The benchmarks for the survey seem to me crucial to the usefulness of survey
results (see box, above). The five you selected are clearly central to
undergraduate education. Is the expectation that institutions will work
toward excelling in all five areas?
Kuh: Both strong and not-so-strong performance serve the
accountability function, but for improvement, I think schools are going to
focus on areas where they are below par with their peers. The good news is
that many schools appear strong enough to be competitive with their peers on
at least some of the items.
I don’t think it is possible for any institution, given the nature of
their students and the wide variety of important educational activities that
this instrument addresses, to be at the top in every category. On the other
hand, the key is not that an institution is very, very strong in one area
but whether students are engaged across the board on a meaningful level.
Ewell: Certainly institutions should be expected to do better in
some areas than others. But they ought to be paying attention to all five
areas. If they drop below expectations on the benchmarks, the bad news is as
useful as the good news because they know what they need to work on.
Kuh: One way to proceed is to go to various groups on campus to
ask, "At what level should our institution be performing in this area,
given our students and the academic mission of this particular department,
unit, or the overall institution?" Those are the kinds of discussions
that we hope the data will prompt on the respective campuses.
Ewell: That’s another really good plug for peer analysis and
consortium work because each participant will know what levels of attainment
are possible based on institutions like them.
Cambridge: Are you doing anything to help institutions think about
how to work with a consortium, or are you leaving that to institutions to
figure out on their own?
Kuh: Well, it’s fundamentally an institutional question, but
your query is exactly right. A hallmark of our effort has been to stay close
to participating schools, letting them know the process and asking them for
feedback on the quality of our reports. We’ve had telephone debriefings,
meetings at national and regional conferences, and focus groups. As new
schools come on board, we’re helping by supplying frequently asked
questions and uses of data from schools already using the survey.
Cambridge: Let’s move to another group that might benefit from NSSE
results: regional accreditors. Would the categories on the NSSE
be useful for determining, for example, educational effectiveness?
Ewell: Certainly the NSSE could provide a vehicle for accreditors
to ask campuses for more focus on educational effectiveness and good
practice. The survey could illuminate the kinds of questions that make the
campus’s inquiry into its own practices more precise.
Kuh: Let me jump in here. If an institution thinks that it is
distinctive at some things and not at others, NSSE can confirm or disconfirm
whether that is, in fact, the case. I see this, in terms of accreditation,
as a wonderful tool for campuses to get clearer about both the outcomes that
they care more about and the extent to which they have the processes in
their environment to produce those outcomes.
Ewell: A group of campuses might even want to do this together as
they come up for accreditation, forming a ready-made consortium where they
could share results. I’ve talked about this with the Western Association
of Schools and Colleges [one of the regional accrediting agencies], which
appears quite interested.
Kuh: One of the more important internal uses of NSSE data is
pointing out areas that ought to be researched or assessed in more depth.
The NSSE survey by itself can’t yield the level of detail that the
institution needs, but it will certainly identify very quickly the areas
that merit a closer look.
Cambridge: Let’s move from institutions and accreditors to the
students themselves. Why would students want to take this survey seriously?
Ewell: I’ll take a first crack at that. Students ought to be
asking certain questions about their own education, a kind of
self-monitoring. That’s a direct benefit; the indirect one is that the
testimony of students will help the institution get better, and therefore
benefit present and future students. Most of the time nobody really asks
students what the heck is going on at an institution. Students are often
flattered to be asked that question.
Kuh: In focus groups this last spring, students were very
thoughtful and reflective when they talked about the survey. Another point
is that in reading about the educationally enriching practices, like study
abroad, interdisciplinary course work, and internships, some students start
thinking about the opportunities that are or should be available to them.
Ewell: Yes, and another potential of these kinds of questions is
in shaping what parents and prospective students ought to be inquiring about
any college that they visit.
Cambridge: In the wake of the recent state report cards, which
stated that we don’t even know enough about student learning at colleges
and universities to rate them, states may be looking for ways to determine
and document student learning. If a state were to come to you with the
question "What would the NSSE offer that something like statewide
testing, some other national survey, or some other data collection system
does not?", how would you answer?
Ewell: This is something that the states might actually get
implemented, as opposed to standardized testing, which they often propose
but rarely carry out. It is relatively inexpensive, has enough precedent
that schools don’t run away screaming, and faculty will not rush to the
barricades over intrusiveness. It’s a practical answer to accountability.
The best answer, however, is that the NSSE focuses on things that
institutions can do something about. A good educational practice is
something that any campus can engage in. The questions point to practices
that are quite clear, that have policy handles for action, and that campus
leadership can push and really do something with.
Edgerton: The key point is that outcomes testing doesn’t give you
information that you can do anything about if you’re at the campus level.
The NSSE helps you explain the performance, not just record the performance.
Of course, you’d want a mix of strategies to get at a multifaceted picture
of what is going on on the campus for accountability. But the fact that NSSE
focuses on the processes that produce the outcomes means that a campus can
really do something with the information it gets.
Cambridge: How does the consortia idea fit when we’re talking
about the state level? Would NSSE prompt consortia of like kinds of
institutions within a state? Across state lines? Is there any way to fit
state needs and consortia together?
Kuh: With forethought. A state can act like a consortium because
it can add as many as 20 additional questions to the regular set. Any
consortium can do that. Consortia forming right now are developing
additional questions specific to their consortium. At least six schools need
to be in a consortium for it to work. During 2000 and 2001, there will be
about 500 different institutions using NSSE, so there is a rich set of
four-year colleges to choose from.
Ewell: We’re really in the beginning stages of working with
states on how they can best use this information, though. I think that is
one of the things we really want to work on.
Cambridge: How can a college participate in the next survey?
Kuh: Although we can’t accept new schools in the Pew-subsidized
survey administration, we can add institutions that can pay what we estimate
is full cost. That is about $7 to $8 per student. Anyone interested can
contact us.
Edgerton: We do need to say that the current administration is for
four-year schools. The good news is that the Pew Charitable Trusts will soon
underwrite a kind of daughter of NSSE in the community college sector. We
care deeply about providing this same basis for understanding student
engagement to two-year schools.
Cambridge: Any last words about why you have committed yourself to
be a part of this project?
Ewell: Of all the many agendas trying to improve undergraduate
education that we’ve been involved in, this is among the most concrete. I
feel very good about it in that regard.
Kuh: This work will be worthwhile as frontline faculty get their
hands on the data and use it to continue conversations about teaching and
learning. This plays hand in glove with the emphasis on the scholarship of
teaching and learning, anchoring those conversations.
Edgerton: Sandy Astin [director of the Higher Education Research
Institute, University of California, Los Angeles] said to me early on in the
project, "Don’t think of this as a survey. Think of it as a way to
give voice to an agenda and to deepen our understanding of how to improve
undergraduate education." I’d put a further point on this: We need to
educate the public, students, and parents about what questions they ought to
be asking about higher education. NSSE asks some of those right questions.
Cambridge: Thanks for your answers and for providing leadership in
this important effort.