I chose to read the two physics papers and the chemistry paper that used interviews. The first physics paper (Physics 1) was about student self-efficacy, student views about the nature of science, and how these qualities change after an undergraduate research experience. The second (Physics 2) was about how students reason about physics problems and whether "expert" problem-solving strategy actually exemplifies conceptual understanding. The chemistry paper (Chemistry 1) was about student conceptual understanding of equilibrium and fundamental thermodynamics in terms of chemical reactions, and the alternative conceptions students hold.
I think it was very serendipitous that these were the papers I chose, because the way the interview data were analyzed and presented was different in each paper, giving a nice range of examples. In Physics 1, the interviews were semi-structured and did not seem to be originally intended for research purposes. I get this impression because the authors only interviewed one of the two cohorts mentioned and did not explicitly probe for self-efficacy or student views about the nature of science. The authors decided to present their data in a "case study" style, with a total of three students representing the total population. They used an emergent coding scheme and provided no statistics throughout the paper; only quotes from students.
In contrast, the Physics 2 interviews appeared to be more structured than those in Physics 1 (though they still fell into the semi-structured category), as the authors developed their questions to specifically probe student reasoning on quantitative problems. They also went with a "case study-esque" style, describing two students in detail, but then took it a step further and used the two student profiles to analyze the other students. They broke down two of the problems on the protocol and labeled each student's thinking as "Pat-like" or "Alex-like" (the pseudonyms for the two students). A chart describing the breakdown was the only numerical representation of the data. I believe they used a semi-emergent coding scheme (from video data, not transcripts), as the researchers were looking for specific things but did not predict what the reasoning patterns would look like. Physics 2 was also the only paper to mention a framework, though it was used as an explanation of the students' reasoning patterns, not as a predictor. They discussed symbolic forms, which is when people combine a symbolic template (e.g., □ = □ + □) with a conceptual schema (an informal idea or meaning that can be represented in a mathematical equation or expression).
(As an aside, I love how Physics 2 provided a link to the interview protocol and the full transcripts of Alex and Pat.)
Chemistry 1 had the most structured interviews (they were fully structured) and the most diverse pool of interviewees. Physics 1 and 2 interviewed undergraduate students from a single university, while Chemistry 1 interviewed undergraduates and graduate students from a total of four universities. This paper also had the most "numbers," as it provided a breakdown of the interviewed population as well as percentages of how many students showed the alternative conceptions they were searching for. These authors used a non-emergent coding scheme, as they started with a list of 25 alternative conceptions they were searching for (though they did expand this list to 30 after finding 5 more alternative conceptions).
Overall, I feel the claims the papers were trying to make were supported by the evidence the authors provided. Physics 1 argued that self-efficacy and views on the nature of science are related and presented three "case studies" where this seemed to be the case. Physics 2 wanted to document different reasoning patterns in students, used two students to develop their "reasoning schema," and then applied it to the rest of the students. Chemistry 1 wanted to measure how common certain alternative conceptions were in physical chemistry students and did this by analyzing student responses during interviews with these alternative conceptions in mind.