Given a choice, most faculty and staff would say that the papers, presentations, and performances their students create are better sources of assessment information than national surveys and standardized tests. And yet, most institutions rely more heavily on the latter than the former in their assessment programs.
In an upcoming report that will be distributed by the National Institute for Learning Outcomes Assessment (NILOA), Kathy Wise and I argue that one of the reasons we are more likely to gather than to use assessment data is that, whatever their limitations, standardized measures make it easier to collect data.
One reason collecting evidence is easier than using it is that there are many nationally known standardized tests, surveys, predesigned rubrics, and e-portfolio systems that institutions can adopt to collect assessment data and, in some cases, receive detailed reports. We have heard these options referred to as “assessment in a box” or “plug and play assessment.” This does not mean that gathering assessment evidence is easy, but it cuts down on the things that institutions have to design from scratch.
Most of the schools in the 2010 Wabash Study use some combination of national surveys and standardized tests.
Percentage of schools using the following input measures:
Cooperative Institutional Research Program Freshman Survey – 63%
Beginning College Survey of Student Engagement – 33%
Wabash National Study Incoming Student Survey – 33%
Percentage of schools using the following measures of student experiences:
National Survey of Student Engagement – 93%
Higher Education Research Institute College Senior Survey – 30%
Noel-Levitz Student Satisfaction Inventory – 23%
Percentage of schools using the following outcome measures:
Collegiate Learning Assessment – 50%
Collegiate Assessment of Academic Proficiency Critical Thinking Test – 33%
Wabash Study Outcome Measures – 33%
One of the most challenging parts of the new Wabash Study is that all schools will be examining student work. This means that much of the data collection and analysis that is normally “outsourced” has to be done by people on campus. Given the goal of reviewing student work by this summer, we are on a very tight timeline.
The information we currently have about institutions' plans for collecting and reviewing student work is sparse. Therefore, we ask two things of you. First, please download and review the information we have about your institution's plans for reviewing student work by clicking here (MS Excel document). If you have revisions, just send them in an email to email@example.com. We will update the document. Second, we have a very short, 13-question survey on the specifics of your school’s plans that we would like you to complete. You can go to the survey by clicking here.
We would like you to complete the survey by November 12, and we will blog about the results by November 18.