Part III of The Current State of Assessing Historical Thinking: How Do We Measure Historical Thinking Skills?

In order to measure historical thinking, we must first categorize the different types of cognitive processes that comprise it.

A clear breakdown is especially important because, as Stephane Levesque and Penney Clark point out in their handbook chapter on definitions of historical thinking, "…if the ability to think historically should go beyond the mere mastery of factual knowledge about the past ('know that'), it is still unclear what the alleged connections between 'history' and 'thinking' actually mean in conceptual and practical terms ('know how')" (2018, 119).

For more information on the breakdown of historical thinking skills, check out the History Forge page Skills Based Grading and Grading for Mastery.

The SCIM-C strategy is a useful way to break down historical thinking and reasoning, but does it go far enough?

The situation is not as dire as Levesque and Clark suggest, as there are several classifications of historical thinking skills. For example, Peter Doolittle and David Hicks break down historical reasoning into five separate inquiry categories (Virginia Tech). Although we do have a definition of historical thinking, and corresponding skills, a greater effort is needed to separate those skills so teachers may test for them. If these skills are not further delineated, then, as Denis Shemilt warns, "it may not be possible to make secure assessments of students' historical consciousness" (2018, 453).

Although there are many definitions of historical thinking, and corresponding skills, most state and classroom history assessments do not adequately test for them because they rely on exams composed primarily of multiple-choice questions. The questions on these standardized tests, at their best, measure only aspects of factual recall (Reich, 2009). The popularity of multiple choice is not surprising; as Denis Shemilt points out in his article "Assessment of Learning in History Education," teachers in the United States and Great Britain believed multiple-choice tests "improved the reliability" of assessments (2018, 449).

Not only are these tests unable to measure critical thinking; researchers such as Gabriel Reich argue that recall tests cannot even accurately measure historical knowledge (Reich, 2009).

Like a baker measuring ingredients using inches and feet, test creators are using the wrong measurement to assess historical thinking.

A popular alternative to factual recall tests is the document-based question (DBQ), which prompts students to analyze several primary documents, form a thesis, and defend it. In social studies courses, the most widespread use of DBQs is on the Advanced Placement U.S. History exam. If recall tests are like measuring how much flour goes into the bowl in inches, then DBQs are like using a hammer to crack the eggs.

DBQs are substantial questions that require over an hour to complete; students therefore draw on several types of inquiry and skill to form their answers. Because of the breadth of these questions, it is unclear which particular historical thinking skills are being measured. Analyzing questions that claim to test historical thinking, and what they actually measure, is necessary because assessments have a tremendous influence on classroom curriculum and pedagogy. Since assessments are here to stay, some researchers have examined how they can be designed to properly measure historical thinking.

There is a cartoon that describes what it is like to conduct qualitative inquiry: touching an elephant while blindfolded, you may grab its trunk and declare that you are holding a snake. DBQs are similar; to answer complex historical questions, students must examine many different perspectives. To gauge how well students are doing with each perspective, assessments must measure each inquiry method individually.

Coming Up Next

The next part will focus on an exemplar article that describes how educators can measure historical thinking. To read it, click on Part IV of the Current State of Assessing Historical Thinking: Exemplar Article of Measuring Historical Thinking.

Bibliography

Attewell, Paul and David Lavin (2011). "The Other 75%: College Education Beyond the Elite." In E. Lagemann and H. Lewis (Eds.), What is College For? The Public Purpose of Higher Education. New York: Teachers College Press.

Corliss, Stephanie B., and Marcia C. Linn (2011). Assessing Learning From Inquiry Science Instruction. In D. Robinson and G. Schraw (Eds.), Assessment of Higher Order Thinking Skills, 219-243.

Hicks, David, and Peter E. Doolittle (2008). Fostering Analysis in Historical Inquiry Through Multimedia Embedded Scaffolding. Theory and Research in Social Education, 36(3), 206-232.

Lee, John (unpublished chapter). Assessing Inquiry.

Lee, P., & Ashby, R. (2000). Progression in Historical Understanding Among Students Ages 7-14. In P. N. Stearns, P. Seixas, and S. Wineburg (Eds.), Knowing, Teaching, and Learning History: National and International Perspectives, 45-94. New York: New York University Press.

Levesque, Stephane and Penney Clark (2018). Historical Thinking: Definitions and Educational Applications. In S. Metzger and L. Harris (Eds.), The Wiley Handbook of History Teaching and Learning, 119-148. Hoboken, NJ: John Wiley & Sons.

McGrew, Sarah, Joel Breakstone, Teresa Ortega, Mark Smith, and Sam Wineburg. (2018). Can Students Evaluate Online Sources? Learning from Assessments of Civic Online Reasoning. Theory & Research in Social Education, 46(2), 165-193.

National Council for the Social Studies (NCSS) (2013). The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History. Silver Spring, MD: NCSS.

Reich, G. A. (2009). Testing historical knowledge: Standards, multiple-choice questions and student reasoning. Theory & Research in Social Education, 37(3), 325-360.

Renn, Kristen and Robert Reason. Characteristics of College Students in the United States. In College Students in the United States: Characteristics, Experiences, and Outcomes, 3-27. San Francisco, CA: Jossey-Bass.

Shemilt, Denis (2018). Assessment of Learning in History Education: Past, Present, and Possible Futures. In S. Metzger and L. Harris (Eds.), The Wiley Handbook of History Teaching and Learning, 449-472. Hoboken, NJ: John Wiley & Sons.

Selwyn, Doug (2014). Why Inquiry? In E. Ross (Ed.), The Social Studies Curriculum, 267-288. Albany: State University of New York Press.

VanSledright, Bruce (2014). Assessing Historical Thinking & Understanding: Innovative Designs for New Standards. New York: Routledge.

Virginia Tech. SCIM-C: Historical Inquiry. Retrieved from http://www.historicalinquiry.com/.

Waldis, Monika, et al. (2015). Material-Based and Open-Ended Writing Tasks for Assessing Narrative Among Students. In K. Ercikan and P. Seixas (Eds.), New Directions in Assessing Historical Thinking (pp. 117-131). New York: Routledge.

Adams, David Wallace (1987). The Past as Experience: A Qualitative Assessment of National History Day. The History Teacher, 20(2), 179-242.

Wineburg, Sam. (2001). Picturing the Past. In Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past (pp. 113-136). Philadelphia: Temple University Press.
