VALUE Rubric Project - 2013 UW-Madison Pilot Project
Presentation Information - 2014 AAC&U Annual Meeting: Quality, E-Quality & Opportunity, Washington D.C.
Panel: The VALUE of Quality Degrees, Friday, January 24, 2014; 10:30-11:45am
Presenters: Terrel Rhodes, Vice President, Office of Quality, Curriculum, and Assessment, AAC&U; Mo Bischof, Assistant Vice Provost, and Jocelyn Milner, Director of Academic Planning and Institutional Research—both of the University of Wisconsin, Madison; Sam Hines, Provost and Dean of the College, and Tara McNealy, Associate Provost for Planning, Assessment and Evaluation—both of The Citadel, The Military College of South Carolina.
- Slide Deck
- Full ppt Slide Deck
In the spring and summer of 2013, we piloted the use of one of the VALUE rubrics as a way to evaluate student learning. The 2013 pilot project used the written communication rubric to evaluate samples of work from new freshmen and from students who are close to graduation.
The project comprised three major elements:
1. Samples of student work, referred to as artifacts: We collected more than 400 short writing samples from first-year students and from seniors who were nearing graduation. These writing samples were drawn from course assignments that met the following criteria: about 2-8 pages (double-spaced) in length, and requiring students to display an ability to understand the context, purpose, and audience of the assignment; to demonstrate their skill at executing proper forms of academic writing; to develop ideas and content appropriately through the use of credible evidence and sources; and to master the syntax and mechanics of good writing. The identities of the students and of the courses the artifacts came from were not disclosed.
2. Rubrics: The Written Communication VALUE Rubric used in this project was developed collaboratively by faculty and academic professionals from around the nation as part of an AAC&U project addressing the assessment of the Essential Learning Outcomes. It is one of 15 rubrics developed to assess learning outcomes that include, but are not limited to, critical thinking, information literacy, ethical reasoning, integrative learning, and quantitative literacy.
3. Faculty Scorers: A group of approximately 25 UW-Madison faculty met over two days in June 2013 and used the Written Communication VALUE Rubric to assess the student artifacts. The scorers were trained in applying the rubric and calibrated their scoring against one another. Each writing sample was evaluated by two faculty scorers.
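Because each artifact was scored independently by two raters, the agreement between raters can be summarized with a statistic such as Cohen's kappa, which the inter-rater reliability readings below discuss. The following sketch is purely illustrative (the scores and variable names are hypothetical, not the project's data), assuming rubric scores on a 0-4 scale:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical scores.

    kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Proportion of artifacts on which the two raters agree exactly.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's score distribution.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (0-4 scale) for eight artifacts.
scorer_1 = [3, 2, 4, 1, 3, 2, 0, 4]
scorer_2 = [3, 2, 3, 1, 3, 1, 0, 4]
print(round(cohens_kappa(scorer_1, scorer_2), 3))  # → 0.686
```

A kappa near 1 indicates strong agreement beyond chance; values near 0 indicate agreement no better than chance, which is why scorer training and calibration sessions matter.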
Project Timeline
February-March 2013: Instructors were contacted for permission to collect artifacts from their students, and to identify the assignment from their class that was appropriate for this assessment project.
March 2013: Potential faculty scorers were identified and invited to participate.
March-May 2013: Students were contacted via email and asked to submit writing samples.
June 18-19, 2013: Workshop - scorer training and artifact rating dates.
July 2013 and forward: Analysis and reporting of results.
Frequently Asked Questions (FAQ)
Q. Is this a research project and do you need FERPA approval?
A. The VALUE Rubrics project is not a research project. Rather, it is an institution-level assessment of student learning gains to be used for administrative purposes only. Individual student data will remain entirely confidential and no personally identifiable information about a student will be disclosed.
Q. How are you maintaining student privacy in this project?
A. Only project leaders have access to student submissions and know who has submitted work. When student work is submitted, all identifying information is removed from it. No one else will be able to connect the writing samples to a particular student, course, or instructor. Guaranteeing students’ privacy in this matter is a high priority for this project.
Q. Are you assessing particular instructors in this project?
A. No. We are not evaluating particular instructors, groups of instructors, or the quality of their teaching. We are assessing student learning outcomes at an institutional level.
Q. Are you assessing particular course materials used in instruction in this project?
A. No. We are not evaluating the quality or content of course materials from any course on campus. We are assessing student learning outcomes at an institutional level.
Q. How often will this assessment occur? Is it a one-time event?
A. The VALUE Rubric Project of 2013 is a pilot that will become part of an ongoing yearly process. There are a number of Essential Learning Outcomes (ELOs) that the university will assess over time. This year's pilot project is focused on Written Communication.
Q. Is this a longitudinal study?
A. This is not a longitudinal study; it is a cross-sectional study of student learning gains. In the language of the Voluntary System of Accountability (VSA), this year's project is a "Value-Added" assessment of student learning. Although we are collecting work from both first-year students and students who are near graduation, we are not tracking first-year students and then requiring them to submit samples again as seniors. Rather, this is an attempt to take representative samples of student learning at two levels and to compare them.
Q. The VALUE Rubric guidelines from the VSA talk about both Benchmark and Value-Added assessments. What is the difference between them?
A. The definitions of these different assessments can be found here. In short, a Benchmark assessment evaluates only a representative sample of student work from seniors in their last semester. A Value-Added assessment compares representative samples of student work from the same year that have been submitted by both first year students and graduating students in order to determine if there are student learning gains. This Value-Added assessment is cross-sectional rather than longitudinal in nature.
Q. Which student work was assessed in this study?
A. The VALUE Rubric project sampled student writing from courses across campus that enroll either large numbers of First Year Students (FYR) or large numbers of Nearly Graduating Students (NGR). In this process we sought writing samples from courses spanning the Colleges of Engineering, Agricultural & Life Sciences, and Letters & Science, as well as the Schools of Business, Nursing, Journalism & Mass Communication, Education, and Human Ecology.
Q. What if I have more questions?
A. Please contact Jocelyn Milner (email@example.com) or Mo Noonan Bischof (firstname.lastname@example.org).
This section contains a number of documents about assessment, rubrics, and inter-rater reliability; they supply the context and background information that was provided to faculty scorers.
1. What is a Rubric? by Merilee Griffin, appearing in Rhodes, Terrel L. (2010). Assessing outcomes and improving achievement: tips and tools for using rubrics. Washington, DC: Association of American Colleges and Universities.
2. Developing Rubrics: Lessons Learned by Wende Morgaine, appearing in Rhodes, Terrel L. (2010). Assessing outcomes and improving achievement: tips and tools for using rubrics. Washington, DC: Association of American Colleges and Universities.
3. Article on Inter-Rater Reliability using Rubrics: Bresciani et al. (2009) Examining Design and Inter-Rater Reliability of a Rubric Measuring Research Quality across Multiple Disciplines. Practical Assessment, Research & Evaluation, 14(12).
4. Research Review about the use of rubrics: Jonsson, A. & Svingby, G. (2007) The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review 2: 130-144.
5. Full issue of Peer Review (Fall 2011/Winter 2012) about Assessing Liberal Education Outcomes Using VALUE Rubrics. Numerous articles, especially the one by Ashley Finley, are helpful for understanding the project.
More about AAC&U VALUE Rubric Project
More about the Wisconsin Experience and UW-Madison's Student Learning Outcomes
More about the VSA Guidelines for Use of AAC&U VALUE Rubrics
This project is designed to meet UW-Madison's commitment to the VSA/College Portrait and the requirement to provide assessment information in specified formats. These guidelines establish the framework for this project.
Mo Noonan Bischof, Assistant Vice Provost, Office of the Provost
Jocelyn Milner, Director of Academic Planning and Institutional Research