In education, learning is often viewed as a straight continuum of ability that represents a student’s understanding of a particular subject area.
But with newly developed diagnostic assessments, Laine Bradshaw, an associate professor in UGA’s College of Education, said learning is more like a developing toolbox, and assessments can be designed to determine which tools, or skills, students have and which ones they still need to acquire.
Laine P. Bradshaw
College of Education
- Ph.D., Research, Evaluation, Measurement and Statistics, University of Georgia, 2011
- M.Ed., Mathematics Education, University of Georgia, 2007
- B.S., Mathematics Education, University of Georgia, 2007
- At UGA: 4.5 years
“Every educator, administrator and teacher you talk to wants diagnostic assessments that tell them what their students’ strengths and weaknesses are,” Bradshaw said. “One of the biggest issues in education is developing brief assessments that can be administered quickly in the classroom and also provide reliable and accurate feedback about what students know and what they don’t know.”
While interning with the U.S. Department of Education as a graduate student in mathematics education, Bradshaw noticed there was a disconnect between assessment results and teacher practice. Educators wanted the information from assessments to be more timely and detailed in order to use the feedback to inform instruction and help students succeed in the classroom.
“Quality research shows that assessments can be a really effective learning tool if we can get feedback to teachers in a fast and efficient manner,” said Bradshaw. “You also have to tell teachers something they don’t already know. Teachers know who is generally high or low performing, but they often don’t know what specific skills each student may be struggling with.”
Traditional psychometric methodology is not designed to give detailed feedback on student strengths and weaknesses. Giving detailed or diagnostic feedback using traditional methods would require administering a lot of items—more items than schools have time to administer—to yield accurate and reliable feedback.
Bradshaw’s work is focused on developing and leveraging diagnostic psychometric methodology not currently used in assessment systems in order to design assessments that are both detailed and efficient.
“My research focuses on assessments that provide a multivariate profile of what students understand,” she said. “It’s a different way to look at assessment while providing the kind of information that resonates with teachers. That’s my goal: to provide teachers with information they need, value and trust.”
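Diagnostic classification models make that multivariate profile concrete: instead of placing a student on a single ability continuum, they classify the student as having or not having each individual skill. The sketch below illustrates the idea with one simple model of this family (the DINA model); the article does not specify which models Bradshaw's work uses, and the items, skills, and slip/guess rates here are purely hypothetical.

```python
import math
from itertools import product

# Hypothetical Q-matrix: rows = test items, columns = skills.
# Q[j][k] = 1 means item j requires skill k.
Q = [
    [1, 0],  # item 1 requires skill A only
    [0, 1],  # item 2 requires skill B only
    [1, 1],  # item 3 requires both skills
]
SLIP, GUESS = 0.1, 0.2  # illustrative slip and guess rates, shared across items

def p_correct(profile, q_row):
    """DINA item response: a student answers correctly with probability
    1 - SLIP if they have every skill the item requires, else GUESS."""
    has_all = all(a >= q for a, q in zip(profile, q_row))
    return 1 - SLIP if has_all else GUESS

def classify(responses):
    """Return the skill profile (tuple of 0/1 mastery flags) with the
    highest likelihood for a right/wrong response pattern, assuming a
    uniform prior over profiles."""
    return max(
        product([0, 1], repeat=len(Q[0])),
        key=lambda prof: math.prod(
            p_correct(prof, Q[j]) if x else 1 - p_correct(prof, Q[j])
            for j, x in enumerate(responses)
        ),
    )

# A student who missed items 1 and 3 but answered item 2 correctly is
# most likely a master of skill B but not skill A.
print(classify([0, 1, 0]))  # → (0, 1)
```

The payoff is the feedback format: rather than one score, the teacher sees a per-skill mastery profile, which is the kind of detailed, actionable information the article describes.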
Bradshaw currently is an investigator on three federal grants—two from the National Science Foundation and one from the Institute of Education Sciences. For each of these projects, Bradshaw is advancing the psychometric methodology needed to design diagnostic assessments that will help researchers better understand the complex structures of how students and teachers reason mathematically.
“We’ve learned from research and have been able to advance the psychometric methodology to the point where we are ready to use psychometrically diagnostic assessments in practice,” said Bradshaw. “Research allows us to test the limits of assessment design and usability.”
To connect her research with practice, Bradshaw collaborated with Parcc, Inc., a testing company that began as a large assessment consortium, to design a first-of-its-kind psychometrically diagnostic classroom assessment system in reading comprehension and mathematics for students in grades two through eight. After students complete a short assessment online, teachers get immediate feedback on whether they are on track or need improvement in certain areas. Students can retake assessments as needed, and the system tracks the areas in which they have demonstrated mastery over the year.
“We can better refine the design of the system to make it more useful for teachers and students once we have data to understand how it works in practice,” said Bradshaw.