Part VII of The Current State of Assessing Historical Thinking: Problems with Current Research on Scaffolding

In the previous part of this article series, I began to describe why scaffolding needs to be a more prominent focus in research on historical inquiry assessments. In this section, I use two exemplar articles to further support that point.

Exemplar Article #1

Analysis of the article “Fostering Analysis in Historical Inquiry Through Multimedia Embedded Scaffolding” suggests that researchers do not consider how familiar students are with a historical topic. In this article, David Hicks and Peter Doolittle developed a strategy for historical thinking called SCIM-C, which stands for “summarizing,” “contextualizing,” “inferring,” “monitoring,” and “corroborating.”

These historical thinking skills are similar to how Sam Wineburg, Sarah McGrew, and Monika Waldis define their skills. Students and teachers can use the SCIM-C strategy as a rational system to answer historical questions and analyze documents and artifacts. First publishing their ideas for the strategy in 2004, Hicks and Doolittle join other scholars in arguing for a more evidence-based approach in history education. In their study, Hicks and Doolittle report their findings and answer the question: Does the SCIM Historical Inquiry Tutorial foster the development of historical source analysis?

For more information on these skills, visit Skills Based Grading and Grading for Mastery

Undergraduates were used even though historical thinking curricula are mostly being developed for secondary education students.

The participants of the study were seventy-seven college undergraduates who were enrolled in a general studies health education course. Students were chosen from the health course because they would have little knowledge of historical procedures. The study introduced the SCIM strategy (the researchers removed the C for this study) to the students over three instructional periods, and students’ knowledge was assessed using a single open-ended question. After being taught the SCIM strategy, many students applied their newfound skills as part of a cognitively sophisticated process of analyzing sources.

Despite the success of numerous students, Hicks and Doolittle found that students applied historical thinking skills unevenly. This unevenness could be due to students not receiving differentiated assessments. Using Monika Waldis’s theory on familiarity, students could have done poorly because they lacked knowledge of the historical time period used in the assessment questions. If this is the case, then Hicks and Doolittle have data that does not truly show mastery of isolated thinking skills, but rather a relationship between skills and knowledge of historical content. If K-12 teachers used the same strategy as Hicks and Doolittle to assess younger students, they would need to differentiate the content in the assessments. One such differentiated scaffold would be allowing students to choose the content on which they are assessed.

Exemplar Article #2

Along with familiarity with a topic, a person’s knowledge and experiences can affect their ability to master historical thinking skills. Instead of seeing knowledge and perspective as affecting the ability to think historically, Peter Lee and Rosalyn Ashby identified age as a more prominent factor in their study “Progression in Historical Understanding Among Students Ages 7-14” (2000).

As one of their central tasks, Lee and Ashby examined how students change their perceptions of history as they age. The philosophy of this research falls in line with Jean Piaget’s model of cognitive development, in which students learn as they grow older, with strict limits on what a student can do at a given age.

In the main investigation, Lee and Ashby collected responses from 320 children between the ages of seven and fourteen. They also interviewed one-third of the students in order to determine the reasoning behind their interpretations of history. Students responded to questions by examining secondary source accounts of the Romans in Britain, with each account differing in theme, tone, and time-scale. As students got older, they described differences among the stories in terms of their dates and abstract concepts.

More of Piaget’s ideas can be found at History Teaching and Jean Piaget.

Understanding the boundaries between types of historical thinking is important since educators want to help students understand their strengths and weaknesses.

Lee and Ashby took this observation and theorized that students progress in their understanding of history as they age. Lee and Ashby used several practical codes when measuring students’ historical thinking. These codes are especially useful because other theorists had not employed comparable measures. Some of these codes were “selection,” “legitimate viewpoint,” “intentional distortion,” “mistakes,” and “opinion unexplained” (Lee & Ashby, 58).

A limitation of Lee and Ashby’s findings is that they do not consider how familiarity with a topic, student perspectives, or the amount of knowledge a student possesses may affect the ability to think historically. Using age alone as a factor in how students develop prevents educators and researchers from developing scaffolds to assist students in mastering historical thinking skills.

The age of a person has less to do with their thinking ability than their experiences and perspectives.

Since Lee and Ashby believe age is a factor in how well students can understand abstract concepts, they perhaps did not see a reason to formulate any steps or methods that would help students progress through a cognitive learning model. Such models already exist, like Jerome Bruner’s “spiraling curriculum,” and have been effective in helping students learn deeper concepts. These models are beneficial because if a researcher or teacher can chart the progression of historical thinking, then scaffolds can likely be built between each step to help advance cognition.

Coming Up Next

The next part of this article series will focus on the question: how do inquiry and assessment motivate students?


Attewell, Paul, and David Lavin (2011). “The Other 75%: College Education Beyond the Elite.” In E. Lagemann & H. Lewis (Eds.), What is College For? The Public Purpose of Higher Education. New York: Teachers College Press.

Corliss, Stephanie B., and Marcia C. Linn (2011). Assessing Learning From Inquiry Science Instruction. In D. Robinson and G. Schraw (Eds). Assessment of Higher Order Thinking Skills, 219-243.

Hicks, David, and Peter E. Doolittle (2008). Fostering Analysis in Historical Inquiry Through Multimedia Embedded Scaffolding. Theory and Research in Social Education, 36(3), 206-232.

Lee, John (unpublished chapter). Assessing Inquiry.

Lee, P., & Ashby, R. (2000). Progression in Historical Understanding Among Students Ages 7-14. In P.N. Stearns, P. Seixas, & S. Wineburg (Eds.), Knowing, Teaching, and Learning History: National and International Perspectives, 45-94. New York: New York University Press.

Levesque, Stephane, and Penney Clark (2018). Historical Thinking: Definitions and Educational Applications. In S. Metzger & L. Harris (Eds.), The Wiley Handbook of History Teaching and Learning, 119-148. Hoboken, NJ: John Wiley & Sons.

McGrew, Sarah, Joel Breakstone, Teresa Ortega, Mark Smith, and Sam Wineburg. (2018). Can Students Evaluate Online Sources? Learning from Assessments of Civic Online Reasoning. Theory & Research in Social Education, 46(2), 165-193.

National Council for the Social Studies (NCSS) (2013). The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History. Silver Spring, MD: NCSS.

Reich, G. A. (2009) Testing historical knowledge: Standards, multiple-choice questions and student reasoning. Theory & Research in Social Education, 37(3), 325-360.

Renn, Kristen, and Robert Reason. “Characteristics of College Students in the United States.” In K. Renn & R. Reason, College Students in the United States: Characteristics, Experiences, and Outcomes, 3-27. San Francisco, CA: Jossey-Bass.

Shemilt, Denis (2018). Assessment of Learning in History Education: Past, Present, and Possible Futures. In S. Metzger & L. Harris (Eds.), The Wiley Handbook of History Teaching and Learning, 449-472. Hoboken, NJ: John Wiley & Sons.

Selwyn, Doug (2014). Why Inquiry? In E. Ross (Ed.), The Social Studies Curriculum, 267-288. Albany: State University of New York Press.

VanSledright, Bruce (2014). Assessing Historical Thinking & Understanding: Innovative Designs for New Standards. New York: Routledge.

Virginia Tech. SCIM-C: Historical Inquiry. Retrieved from

Waldis, Monika, et al. (2015). Material-Based and Open-Ended Writing Tasks for Assessing Narrative Among Students. In K. Ercikan and P. Seixas (Eds.), New Directions in Assessing Historical Thinking (pp. 117-131). New York: Routledge.

Adams, David Wallace (1987). The Past as Experience: A Qualitative Assessment of National History Day. The History Teacher, 20(2), 179-242.

Wineburg, Sam. (2001). Picturing the Past. In Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past (pp. 113-136). Philadelphia: Temple University Press.
