VALUE Rubric Project - 2013 UW-Madison Pilot Project
In the spring and summer of 2013, we are piloting the use of one of the VALUE rubrics as a way to evaluate student learning. The 2013 pilot project will use the Written Communication rubric to evaluate samples of work from new freshmen and from students who are close to graduation.
The project comprises three major elements:
1. Samples of student work, referred to as artifacts: We will collect up to 350 short writing samples each from first-year students and from seniors nearing graduation. These writing samples are drawn from course assignments that meet the following criteria: they are about 2-8 pages (double-spaced) in length, and they ask students to demonstrate an understanding of the context, purpose, and audience of the assignment; skill at executing proper forms of academic writing; the ability to develop ideas and content through the use of credible evidence and sources; and mastery of the syntax and mechanics of good writing. The identities of the students and the courses from which the artifacts come will be kept private.
2. Rubrics: The Written Communication VALUE Rubric used in this project was developed collaboratively by academic professionals and faculty from around the nation in the context of an AAC&U project addressing the assessment of the Essential Learning Outcomes. It is one of 15 rubrics that were developed to assess learning outcomes including, but not limited to, critical thinking, information literacy, ethical reasoning, integrative learning, and quantitative literacy.
3. Faculty Raters: A group of approximately 40 UW-Madison faculty will use the Written Communication VALUE Rubric to assess the student artifacts. These raters will receive training on how to apply the rubric and calibrate their scoring. Each of the 700 writing samples will be evaluated by two faculty raters.
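Because each artifact is scored independently by two raters, the results can be checked for inter-rater reliability (see the readings on this topic listed below). As an illustrative sketch only, and not a description of the project's actual analysis plan, Cohen's kappa for two raters using a hypothetical 0-4 rubric scale could be computed as follows:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of artifacts on which both raters agree.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal score distribution.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    pe = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical scores for five artifacts, each rated by two faculty raters.
rater_1 = [4, 3, 3, 2, 4]
rater_2 = [4, 3, 2, 2, 4]
print(round(cohens_kappa(rater_1, rater_2), 2))  # prints 0.71
```

A kappa near 1.0 indicates strong agreement beyond chance; values near 0 suggest the raters need further calibration against the rubric.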
February-March 2013: Instructors are contacted for permission to collect artifacts from their students, and to identify the assignment from their class that is appropriate for this assessment project.
March 2013: Potential faculty raters are identified and invited to participate.
March-May 2013: Students are contacted via email and asked to submit writing samples.
June 18-19, 2013: Rater training and artifact rating dates.
July 2013: Analysis and reporting of results.
Frequently Asked Questions (FAQ)
Q. Is this a research project and do you need FERPA approval?
A. The VALUE Rubrics project is not a research project. Rather, it is an institution-level assessment of student learning, to be used for administrative purposes only. Individual student data will remain entirely confidential, and no personally identifiable information about any student will be disclosed.
Q. How are you maintaining student privacy in this project?
A. Only project leaders have access to student submissions and know who has submitted work. When student work is submitted, all identifying information is removed from it. No one else will be able to connect the writing samples to a particular student, course, or instructor. Guaranteeing students’ privacy in this matter is a high priority for this project.
Q. Are you assessing particular instructors in this project?
A. No. We are not evaluating particular instructors, groups of instructors, or the quality of their teaching. We are assessing student learning outcomes at an institutional level.
Q. Are you assessing particular course materials used in instruction in this project?
A. No. We are not evaluating the quality or content of course materials from any course on campus. We are assessing student learning outcomes at an institutional level.
Q. How often will this assessment occur? Is it a one-time event?
A. The VALUE Rubric Project of 2013 is a pilot project that will be part of an ongoing yearly process. There are a number of Essential Learning Outcomes (ELOs) that the university will assess over time. This year’s pilot project is focused on Written Communication.
Q. Is this a longitudinal study?
A. This is not a longitudinal study; rather, it is a cross-sectional study of student learning gains. In the language of the Voluntary System of Accountability (VSA), this year’s project is a “Value-Added” assessment of student learning. Although we are collecting work from both first-year students and students nearing graduation, we are not tracking the first-year students and requiring them to submit samples again when they are seniors. Rather, this is an attempt to take a representative sample of student learning at two levels and to compare them.
Q. The VALUE Rubric guidelines from the VSA talk about both Benchmark and Value-Added assessments. What is the difference between them?
A. The definitions of these different assessments can be found here. In short, a Benchmark assessment evaluates only a representative sample of work from seniors in their last semester. A Value-Added assessment compares representative samples of work submitted in the same year by both first-year students and graduating students, to determine whether there are student learning gains. This Value-Added assessment is cross-sectional rather than longitudinal in nature.
Q. When will information about the results of this project be available?
A. The assessment of student work is taking place in late June. Analysis of the results will take place in July 2013, and the results will become available after that.
Q. Which student work is being assessed in this study?
A. This year’s VALUE Rubric project is drawing writing samples from courses across campus that have large numbers of either first-year students (FYR) or nearly graduating students (NGR). In this process we sought writing samples from courses spanning the Colleges of Engineering, Agricultural & Life Sciences, and Letters & Science, as well as the Schools of Business, Nursing, Journalism & Mass Communication, Education, and Human Ecology. Within this group, we have obtained student writing samples that met the requirements of this study, which are listed above.
Q. How can I become involved in this project as a faculty rater?
A. Please contact Jocelyn Milner (email@example.com) or Mo Noonan Bischof (firstname.lastname@example.org) if you are interested in becoming a faculty rater for this project. Please note that raters must either be tenure-track or tenured faculty members.
This section contains a number of documents about assessment, rubrics, and inter-rater reliability that provide context and background information for those who will be raters for this project.
1. What is a Rubric? by Merilee Griffin, appearing in Rhodes, Terrel L. (2010). Assessing outcomes and improving achievement: tips and tools for using rubrics. Washington, DC: Association of American Colleges and Universities.
2. Developing Rubrics: Lessons Learned by Wende Morgaine, appearing in Rhodes, Terrel L. (2010). Assessing outcomes and improving achievement: tips and tools for using rubrics. Washington, DC: Association of American Colleges and Universities.
3. Article on inter-rater reliability using rubrics: Bresciani, M. J., et al. (2009). Examining design and inter-rater reliability of a rubric measuring research quality across multiple disciplines. Practical Assessment, Research & Evaluation, 14(12).
4. Research Review about the use of rubrics: Jonsson, A. & Svingby, G. (2007) The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review 2: 130-144.
5. Full issue of Peer Review (Fall 2011/Winter 2012) about Assessing Liberal Education Outcomes Using VALUE Rubrics. It contains numerous helpful articles; the article by Ashley Finley is especially useful for understanding the project.
More about AAC&U VALUE Rubric Project
More about the Wisconsin Experience and UW-Madison's Student Learning Outcomes
More about the VSA Guidelines for Use of AAC&U VALUE Rubrics
This project is designed to meet UW-Madison's commitment to the VSA/College Portrait and the requirement to provide assessment information in specified formats. These guidelines establish the framework for this project.
Joshua Kundert, Associate Academic Planner, Academic Planning and Institutional Research
Mo Noonan Bischof, Assistant Vice Provost, Office of the Provost
Jocelyn Milner, Director of Academic Planning and Institutional Research