Part V of The Current State of Assessing Historical Thinking: What Data Can Be Pulled From Research in Inquiry and Historical Thinking

In the last section, I discussed how DBQ assessments cannot accurately measure specific cognitive processes in social studies, like sourcing or contextualization. Additionally, Reich demonstrated how students use unintended thinking processes to answer multiple-choice questions. Compounding the difficulty of creating assessments, historical thinking produces many categories of data, which complicates its measurement.

For example, David Adams Wallace examined motivation and belief in oneself when evaluating National History Day projects (Wallace, 1987); David Hicks and Peter E. Doolittle focused on the historical thinking skill of sourcing (Hicks & Doolittle, 2008); and Lee and Ashby traced students’ interpretation of tone, theme, and time-scale as they grew older (Lee & Ashby, 2000). Given the complexity of assessing skills and thinking, creators of assessments that measure reasoning must be deliberate in identifying what they are measuring and the possible issues with their evaluations.

Historical thinking is like a river: many streams of consciousness mix together to feed it, and as students use more reasoning skills, their river grows. Figuring out which streams a student’s reasoning was pulled from can be difficult, but it’s imperative we do so in order to confirm students’ abilities.

Exemplar Article #1

Wineburg, Sam. (2001). Picturing the Past. In Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past (pp. 113-136). Philadelphia: Temple University Press.

Sam Wineburg has modeled this kind of deliberateness since the late 1990s; his work with the Stanford History Education Group (SHEG) includes some of the best scholarship on historical thinking. Wineburg received his doctorate in Psychological Studies in Education, which is perhaps why he has been so thoroughly invested in investigating how students think about and learn history.

Evidence of Wineburg’s investment can be found in his numerous publications, one of which is his book Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past (Wineburg, 2001). In the book, Wineburg examines many reasons why students think about the past differently, such as the influence of gender on historical thinking.

The History Forge article “What Does It Mean To Master Historical Thinking” explores more of Sam Wineburg’s ideas and evaluates SHEG’s definition of mastery.

In the chapter “Picturing the Past” (2001), Wineburg focused on the question: how do boys and girls picture the past? The study sample consisted of 161 middle school students, who were asked to draw different historical figures, like Pilgrims, Western Settlers, and Hippies, in order to reveal how they pictured these people. Wineburg quantified the images by the gender of the figures, the number of people, and the types of actions the figures were performing. Additionally, Wineburg and his assistants conducted interviews so students could explain the reasoning behind their drawings.

Male students drew predominantly male characters who were isolated or alone, and they were more likely to depict violence. Girls were more likely than boys to draw female characters, but even their drawings split roughly 50/50 between female and male figures.

Girls also drew more groups of people, such as families. Most concerning was the girls’ propensity to fill their “historical world” with more men than would be realistic; the researchers wondered whether girls do the same when reading textbook accounts of historical events. If girls and boys have different outlooks on gender in history, how might this affect their historical reasoning? Would it be fair to examine boys and girls using the same prompts and rubrics?

How can assessments create gaps along gender lines?

If young girls and boys are “seeing” a different historical world, then they will likely interpret primary sources and historical arguments differently. If this is the case, then students’ mastery of concepts may be helped or hindered by their gender. For example, a female student may be less likely to disassociate violence or discrimination from historical events or people. If so, girls may conclude that historical figures and events are invalid sources because they do not meet present moral standards (Founding Fathers owning slaves, free Blacks settling on Native American lands, blaming Hitler alone for Germany’s antisemitism).

This surfaces when students say something like, “That person is racist, so nothing they did could be important.” It becomes especially problematic when teaching about figures such as Thomas Jefferson or Martin Luther King Jr., who, despite their many accomplishments, most likely committed adultery.

It is unhelpful to see historical figures as devils or heroes. Historical figures, no matter their actions, are human and were influenced by the forces of their time. Describing people as “evil” or “good” oversimplifies history.

When creating assessments, the creators of the measurements must be explicit about what is being measured and how students will demonstrate their command of the concepts or skills being assessed. Furthermore, there needs to be an understanding of how students stumble into theoretical pitfalls, and of how mastery may look different for various groups of students. For these reasons, scaffolds must be built to help students reach mastery of skills and concepts. For example, a simple pedagogical scaffold would be to model the analysis of primary documents. A curricular scaffold would be to provide multiple versions of the same reading, differing in reading level and/or theme.

Coming Up Next

The next part of this article series will focus on the question “What scaffolding exists and can it be improved?”

Bibliography

Attewell, Paul, and David Lavin (2011). “The Other 75%: College Education Beyond the Elite.” In E. Lagemann & H. Lewis (Eds.), What is College For? The Public Purpose of Higher Education. New York: Teachers College Press.

Corliss, Stephanie B., and Marcia C. Linn (2011). Assessing Learning From Inquiry Science Instruction. In D. Robinson & G. Schraw (Eds.), Assessment of Higher Order Thinking Skills, 219-243.

Hicks, David, and Peter E. Doolittle (2008). Fostering Analysis in Historical Inquiry Through Multimedia Embedded Scaffolding. Theory and Research in Social Education, 36(3), 206-232.

Lee, John (unpublished chapter). Assessing Inquiry.

Lee, P., & Ashby, R. (2000). Progression in Historical Understanding Among Students Ages 7-14. In P. N. Stearns, P. Seixas, & S. Wineburg (Eds.), Knowing, Teaching, and Learning History: National and International Perspectives, 45-94. New York: New York University Press.

Lévesque, Stéphane, and Penney Clark (2018). Historical Thinking: Definitions and Educational Applications. In S. Metzger & L. Harris (Eds.), The Wiley Handbook of History Teaching and Learning, 119-148. Hoboken, NJ: John Wiley & Sons.

McGrew, Sarah, Joel Breakstone, Teresa Ortega, Mark Smith, and Sam Wineburg. (2018). Can Students Evaluate Online Sources? Learning from Assessments of Civic Online Reasoning. Theory & Research in Social Education, 46(2), 165-193.

National Council for the Social Studies (NCSS) (2013). The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History. Silver Spring, MD: NCSS.

Reich, G. A. (2009). Testing Historical Knowledge: Standards, Multiple-Choice Questions and Student Reasoning. Theory & Research in Social Education, 37(3), 325-360.

Renn, Kristen, and Robert Reason. “Characteristics of College Students in the United States.” In College Students in the United States: Characteristics, Experiences, and Outcomes, 3-27. San Francisco, CA: Jossey-Bass.

Selwyn, Doug (2014). Why Inquiry? In E. Ross (Ed.), The Social Studies Curriculum, 267-288. Albany: State University of New York Press.

Shemilt, Denis (2018). Assessment of Learning in History Education: Past, Present, and Possible Futures. In S. Metzger & L. Harris (Eds.), The Wiley Handbook of History Teaching and Learning, 449-472. Hoboken, NJ: John Wiley & Sons.

VanSledright, Bruce (2014). Assessing Historical Thinking and Understanding: Innovative Designs for New Standards. New York: Routledge.

Virginia Tech. SCIM-C: Historical Inquiry. Retrieved from http://www.historicalinquiry.com/.

Waldis, Monika, et al. (2015). Material-Based and Open-Ended Writing Tasks for Assessing Narrative Among Students. In K. Ercikan and P. Seixas (Eds.), New Directions in Assessing Historical Thinking (pp. 117-131). New York: Routledge.

Wallace, David Adams (1987). The Past as Experience: A Qualitative Assessment of National History Day. The History Teacher, 20(2), 179-242.

Wineburg, Sam. (2001). Picturing the Past. In Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past (pp. 113-136). Philadelphia: Temple University Press.
