
Inquiry-Based Assessment - Campus Examples


 

Inquiry-Based Assessment


Overview  |  Guides  |  Campus Examples  |  Documents to Download


The University of Richmond has been the lead institution in the Bonner network for developing and refining an inquiry-based approach to assessment, which can also be called High-Impact Assessment because of the ways it incorporates the characteristics of high-impact educational practices (see High-Impact Practices).

 

See the presentations on the Documents to Download page; the article introduction below may also be helpful. Additionally, see the materials from the workshop offered at the 2017 Bonner Summer Leadership Institute by Blake Stack, in conjunction with Ray Barclay, which discussed how University of Richmond staff now use rubrics to assess student work as part of the assessment process.

 

High-Impact Assessment by Bryan Figura and Sylvia Gale, University of Richmond


 

 

Last January, we attended a session at the AAC&U’s national conference in Washington, D.C. called “Developing Greater Impact with High-Impact Practices: Internships and Civic Engagement.” As we listened to Jillian Kinzie, associate director of the Center for Postsecondary Research at Indiana University, Bloomington, frame the session with a reminder about what makes a practice “high-impact,” it struck us that this list of defining characteristics of high-impact practices sounded very much like a description of the approach to assessment we have been developing in our Center for the past few years. 

 

Could we be engaged in something we might call “high-impact assessment”?

 

We took this idea right to our next session, where we were presenting “Two Approaches to Assessing Civically Engaged Student Learning” alongside colleagues affiliated with the Bonner program at Rhodes College. There, we scrapped our scripted intro and instead opened the session by reading the list of statements Jillian Kinzie had just shared, a list first developed by George Kuh (2008). This time, however, we adapted the list to describe practices by faculty and staff engaged in assessing students’ learning (rather than practices of engaged student learning itself). We asked audience members whether they would describe the assessment initiatives in which they are invested on their campuses in these ways:

 

  • Involves meaningful effort
  • Helps staff and faculty build substantive relationships with each other
  • Helps engage staff and faculty across differences (getting us out of our programs and silos)
  • Provides staff and faculty with rich feedback about their work
  • Helps staff and faculty apply what they are learning about student learning to their teaching and programming, and test it there
  • Provides opportunities for staff and faculty to reflect on the people their students are becoming

 

Not a single person in a room of 50+ people raised a hand. This did not surprise us.

 

In fact, as we have talked with colleagues around the country about assessing student learning, the frustrations that often emerge make assessment sound decidedly “low-impact” (from a teaching and learning perspective):  

 

  • Student learning assessment is often not connected back to teaching and programming, especially when we are measuring the impact of co-curricular activity. We may collect data, but we do not always close the loop, missing the chance to make meaning of the data in ways that bring fresh understanding to our work.
  • Assessment is something done to us rather than by us. Often we assess because we are asked to provide a specific kind of result to someone external to our center/program/initiative. We are “under the assessment gun,” rather than pursuing assessment motivated by our own inquiry about our work and its impact.
  • Assessment can turn up some interesting data, but it is definitely not a creative activity. When we present about assessment at non-assessment conferences, attendees often report that they are there because they feel they “should be,” not because they necessarily want to be. Assessment is bitter medicine, separate from the food that sustains us.
  • Assessment is something experts do. Assessment relies on specialized knowledge; therefore, doing it well demands that we outsource it whenever possible.

 

For these reasons, and likely more, many of us work with people who are skeptical of or even hostile to the idea of assessment (or we may feel this way ourselves). Such feelings extend far beyond assessment of civically engaged student learning to a larger critique of the gap between the predominant assessment tools at our disposal and the complex, iterative learning we are actually trying to measure. As David Scobey writes, “I have rarely seen evaluative tools that do justice to my experience or that of my students,” an experience he names as rooted in “meaning-making and reflection,” which in turn “nurture the student’s capacity for self-making and engagement (ethical, civic, vocational) in the world” (emphasis in original, 2009). This critique is especially sharp among academic humanists like Scobey, yet it resonates deeply with our own experiences as we have tried to bridge the gap between what we know and what we want to know about our students’ complicated engaged learning trajectories.

 

The centerpiece of our efforts to bridge that gap is something we call the “data lab,” a method we have developed over the past few years with our colleagues in the Bonner Center for Civic Engagement at the University of Richmond. In a data lab, stakeholders with a common investment in a program, center, class, or shared experience come together to look carefully at artifacts (data) that emerge from that common investment. There are three goals of the data lab approach: to build a culture of inquiry among our colleagues and allies, to measure student learning (or other foci) across our programs, and to develop and refine our programs based on evidence. 

 

Whatever its specific focus, each data lab asks two simple questions about the artifacts we have gathered: What are we learning about (focus of the data lab) from this data? And what else do we wish we knew?

 

The first question unifies the data lab inquiry and prevents us from slipping into a program evaluation mindset. Data labs are not an occasion to critique a specific program or initiative, which can make colleagues responsible for those programs feel vulnerable, but rather an opportunity to engage in a broader, common inquiry about the impact of our collective work. 

 

The second question, What else do we wish we knew (but can’t glimpse in this data)? reveals important gaps in our data collection processes, and, perhaps most importantly, points us toward future directions for our inquiry. As rich as our conversations about specific artifacts usually are, this question about what’s missing has actually been the most fruitful question for our Center to ask regularly. 

 

For example, our very first data lab, held in May 2011, revealed to us that while we could count students in our programs, we could not answer many meaningful questions about them. As a direct result of that conversation, we developed new and more robust tracking mechanisms that allow us to identify how our students travel across programs and years and to analyze participation through various demographic lenses. Gaining a better understanding of who our students are and how they intersect with us in turn provoked us to ask more complex questions about subsets of our student population. For instance, identifying a large gender gap across our programs provoked us to ask, what motivates college men to participate? We began an inquiry with the Dean of our men’s college, and have begun to pilot targeted programming to expand our outreach among male students. 
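 
As a purely hypothetical illustration (the article does not describe the Center’s actual tracking tools), here is a minimal Python sketch of the kind of cross-program, cross-year analysis such tracking mechanisms make possible, assuming a simple flat table of participation records; every name and value below is invented:

    # Hypothetical sketch only; not the Center's actual system. Assumes one
    # row per student per program per year, with basic demographic fields.
    import pandas as pd

    records = pd.DataFrame({
        "student_id": [101, 101, 102, 103, 103],
        "program": ["Bonner Scholars", "CBL Course", "Bonner Scholars",
                    "CBL Course", "Alternative Break"],
        "year": [2011, 2012, 2011, 2011, 2012],
        "gender": ["F", "F", "M", "F", "F"],
    })

    # How do individual students travel across programs and years?
    journeys = (records.sort_values("year")
                       .groupby("student_id")["program"]
                       .apply(list))

    # Participation through one demographic lens: the gender gap noted above.
    by_gender = records.drop_duplicates("student_id").groupby("gender").size()

    print(journeys)
    print(by_gender)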

 

Clearly, data labs are generative. But they are also playful. In fact, we have learned that intentionally encouraging creative interaction with the data we are studying is critical to a data lab’s success. While it is tempting to simply let participants discuss artifacts in the ways already familiar to most of us, we instead strive to set up distinct and sometimes surprising protocols or instructions for each data “station.”  Often we set a theme for the data lab that is then carried out in the instructions at each station. This sort of thing matters, we’ve found, because we are aiming to freshen people’s relationships with the act of assessment itself.

 

In our most recent data lab, for example, our colleagues entered the room and became students enrolled at Hogwarts School of Witchcraft and Wizardry from J.K. Rowling’s Harry Potter series. Since Hogwarts is divided into four houses, participants were also divided into four houses (or small groups) with whom they attended classes (or data lab stations). Each station included instructions about how to creatively analyze the particular artifacts. One station was the “Pensieve Class.” In Harry Potter, the Pensieve is described as a shallow basin made of stone in which users deposit memories, where they are stored in full forever. At the Pensieve station, data lab participants were given a stack of community-based learning (CBL) essays from a recent Biochemistry class and were instructed to:

 

Pick one essay and read it. The assignment in this class was for Biochemistry students to notice something at their service site [in Richmond] that raised a biochemical question that they could answer while researching biochemical literature.

 

Using the Pensieve worksheet, draw a picture of the moment that you see in the Pensieve (from the student essay), in which the student identifies the connection to biochemistry (see Figure 1).

 

Figure 1. Quote from biochemistry student’s essay, “This particular day made me interested in finding out what the heartworm disease was actually doing to the dogs because the dogs being treated seemed just as happy and energetic as all of the other dogs.” 

 

This exercise was useful in two ways: first, it forced us to slow our processing of the information enough to imagine and draw a detailed moment. Second, the accumulation of these images allowed us to examine singular artifacts collectively, in a way that encouraged us to identify patterns and links that inform our collective work. In fact, Harry Potter’s mentor, the great wizard Dumbledore, uses the Pensieve to achieve similar ends:

 

I use the Pensieve. One simply siphons the excess thoughts from one's mind, pours them into the basin, and examines them at one's leisure. It becomes easier to spot patterns and links, you understand, when they are in this form (Rowling, 2000).

 

Had data lab participants simply been instructed to read across the biochemistry essays for “patterns and links,” the exercise would not have been nearly as engaging, accessible, or fruitful. After this particular data lab, participants discussed why the infusion of creativity in our approach to assessment was important. One colleague said:

 

It’s really easy to quote someone else, but here you have to slow down and see through the eyes of the student. It also makes you own the moment, [which is] more labor intensive mentally to do because you have to understand it so you can represent it [in a drawing].

 

Making data labs creative and playful sharpens our understanding of the data by enabling us to see it in new ways and allowing the questions that emerge from our data to be ours, questions in which we are fully invested. For the staff in our Center, the majority of whom do not work with each other on a daily basis, data labs spur critical discourse and curiosity about the core of our work in a structured, generative way. This method has had tremendous influence on our evaluation processes and findings because it has made assessment a shared area for learning rather than a burden and has shifted us from a culture of “my students” to one of “our students.” 

 

Our development of the data lab method has been influenced by other assessment initiatives in higher education with which we’ve been fortunate to interact. One of these is the “Assessing the Practices of Public Scholarship” (APPS) initiative, a research group supported by the Imagining America consortium. APPS focuses on developing assessment frameworks and approaches that “advance the reciprocal benefits of publicly engaged scholarship and practice” (“Integrated Assessment”). While APPS emphasizes community impact assessment, the group’s efforts to accumulate and reflect on practices for inviting stakeholders into a collaborative process of assessment have been instructive. Likewise, we have been fortified by the work emerging from the Center of Inquiry at Wabash College, whose Teagle Assessment Scholar program aims to “help colleges and universities use evidence to strengthen the impact of liberal arts education for all students” (“Teagle Assessment Scholars”). As we have learned from our colleague Terry Dolson, who joined the Center of Inquiry’s Scholar program in 2011 and has been a critical thought partner in our development of the data lab, the extent to which we can use evidence effectively to refine and improve teaching and learning depends in part on how successfully we engage stakeholders in drawing out for themselves the implications for change that the data contain.

 

We have found data labs to be a distinctive and powerful tool for centering these broader assessment aims, in that they hold the tension of being creative and fun while also engaging us in critical and continuous reflection on our work. They are also immensely adaptable. To date, we have used the data lab method primarily to understand student learning, and we are currently exploring its use to better understand the impact of our sustained partnerships on the organizations and communities with which we work. The data lab is low-tech and low-budget, because it requires only word processing software for note-taking and copies of artifacts (data). It is resource-intensive, however, because staff must plan and participate in the data lab cycle, which includes follow-up reflection and action. Perhaps it is actually because the data lab method demands this kind of “meaningful effort” that it is decidedly high-impact, to return to George Kuh’s language. Certainly, the data lab has helped us find what David Scobey names as necessary: a tool that “do[es] justice” to our experience and enables us to reflect upon and make meaning of our collective work in the world.

 

REFERENCES

“Integrated Assessment.” Imagining America. Syracuse University, n.d. Web. 29 April 2015. http://imaginingamerica.org/research/assessment/

 

Kuh, George. High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter. Washington, DC: Association of American Colleges and Universities, 2008.

 

Rowling, J.K. Harry Potter and the Goblet of Fire. New York: Scholastic, 2000.

 

Scobey, David. “Meanings and Metrics.” Inside Higher Ed, 19 March 2009. Web. 29 April 2015. https://www.insidehighered.com/views/2009/03/19/scobey

 

“Teagle Assessment Scholars.” Wabash College, n.d. Web. 29 April 2015. http://www.liberalarts.wabash.edu/assessment-scholars/