Thinking critically on critical thinking: why scientists’ skills need to spread


Rachel Grieve, Lecturer in Psychology, University of Tasmania

Disclosure statement

Rachel Grieve does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

University of Tasmania provides funding as a member of The Conversation AU.



MATHS AND SCIENCE EDUCATION: We’ve asked our authors about the state of maths and science education in Australia and its future direction. Today, Rachel Grieve discusses why we need to spread science-specific skills into the wider curriculum.

When we think of science and maths, stereotypical visions of lab coats, test-tubes, and formulae often spring to mind.

But more important than these stereotypes are the methods that underpin the work scientists do – namely generating and systematically testing hypotheses. A key part of this is critical thinking.

It’s a skill that often feels in short supply these days, but you don’t necessarily need to study science or maths in order to gain it. It’s time to take critical thinking out of the realm of maths and science and broaden it into students’ general education.

What is critical thinking?

Critical thinking is a reflective and analytical style of thinking, with its basis in logic, rationality, and synthesis. It means delving deeper and asking questions like: why is that so? Where is the evidence? How good is that evidence? Is this a good argument? Is it biased? Is it verifiable? What are the alternative explanations?

Critical thinking moves us beyond mere description and into the realms of scientific inference and reasoning. This is what enables discoveries to be made and innovations to be fostered.

For many scientists, critical thinking becomes (seemingly) intuitive, but like any skill set, critical thinking needs to be taught and cultivated. Unfortunately, educators are unable to deposit this information directly into their students’ heads. While the theory of critical thinking can be taught, critical thinking itself needs to be experienced first-hand.

So what does this mean for educators trying to incorporate critical thinking within their curricula? We can teach students the theoretical elements of critical thinking. Take for example working through [statistical problems](http://wdeneys.org/data/COGNIT_1695.pdf) like this one:

In a 1,000-person study, four people said their favourite series was Star Trek and 996 said Days of Our Lives. Jeremy is a randomly chosen participant in this study, is 26, and is doing graduate studies in physics. He stays at home most of the time and likes to play videogames. What is most likely?

a. Jeremy’s favourite series is Star Trek
b. Jeremy’s favourite series is Days of Our Lives

Some critical thought applied to this problem allows us to know that Jeremy is most likely to prefer Days of Our Lives: the profile suggests a stereotypical Star Trek fan, but the 996-to-4 base rate overwhelms the stereotype.
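To make the arithmetic explicit, here is a minimal sketch in Python of the base-rate reasoning. The likelihoods are invented for illustration (the problem only gives the 996-to-4 split); the point is that even a profile strongly suggestive of a Star Trek fan cannot overcome the prior.

```python
# Base-rate arithmetic for the Jeremy problem (illustrative likelihoods,
# not given in the original problem).
p_trek = 4 / 1000      # prior: proportion of Star Trek fans in the study
p_days = 996 / 1000    # prior: proportion of Days of Our Lives fans

# Assumed: the stay-at-home physics-grad-student profile is far more
# common among Star Trek fans than among Days of Our Lives fans.
p_profile_given_trek = 0.80
p_profile_given_days = 0.05

# Bayes' rule: P(Star Trek | profile)
numerator = p_profile_given_trek * p_trek
posterior = numerator / (numerator + p_profile_given_days * p_days)
print(f"P(Star Trek | profile) = {posterior:.3f}")  # ~0.060
```

Even with likelihoods stacked 16-to-1 in favour of Star Trek, the posterior probability stays around 6%, so option (b) remains the better bet.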

Can you teach it?

It’s well established that statistical training is associated with improved decision-making. But the idea of “teaching” critical thinking is itself an oxymoron: critical thinking can really only be learned through practice. Thus, it is not surprising that engagement with the critical thinking process itself is what pays dividends for students.

As such, educators try to connect students with the subject matter outside the lecture theatre or classroom. For example, problem-based learning is now widely used in the health sciences, whereby students must figure out the key issues related to a case and direct their own learning to solve that problem. Problem-based learning has clear parallels with real-life practice for health professionals.

Critical thinking goes beyond what might be on the final exam and life-long learning becomes the key. This is a good thing, as practice helps to improve our ability to think critically over time.

Just for scientists?

For those engaging with science, learning the skills needed to be a critical consumer of information is invaluable. But should these skills remain in the domain of scientists? Clearly not: for those engaging with life, being a critical consumer of information is also invaluable, allowing informed judgement.

Being able to actively consider and evaluate information, identify biases, examine the logic of arguments, and tolerate ambiguity until the evidence is in would allow many people from all backgrounds to make better decisions. While these decisions can be trivial (does that miracle anti-wrinkle cream really do what it claims?), in many cases, reasoning and decision-making can have a substantial impact, with some decisions having life-altering effects. A timely case in point is immunisation.

Pushing critical thinking from the realms of science and maths into the broader curriculum may lead to far-reaching outcomes. With increasing access to information on the internet, giving individuals the skills to critically think about that information may have widespread benefit, both personally and socially.

The value of science education might not always be in the facts, but in the thinking.

This is the sixth part of our series Maths and Science Education.


Critical Thinking in Science: Fostering Scientific Reasoning Skills in Students

ALI Staff | Published July 13, 2023

Thinking like a scientist is a central goal of all science curricula.

As students learn facts, methodologies, and methods, what matters most is that all their learning happens through the lens of scientific reasoning.

That way, when it comes time for them to take on a little science themselves, either in the lab or by reasoning through a solution theoretically, they understand how to do it in the right context.

One component of this type of thinking is being critical. Because it is grounded in facts and evidence, critical thinking in science isn’t exactly the same as critical thinking in other subjects.

Students have to doubt the information they’re given until they can prove it’s right.

They have to truly understand what’s true and what’s hearsay. It’s complex, but with the right tools and plenty of practice, students can get it right.

What is critical thinking?

This particular style of thinking stands out because it requires reflection and analysis. Grounded in logic and rationality, thinking critically is all about digging deep, going beyond the surface of a question to establish the quality of the question itself.

It ensures students put their brains to work when confronted with a question rather than taking every piece of information they’re given at face value.

It’s engaged, higher-level thinking that will serve them well in school and throughout their lives.

Why is critical thinking important?

Critical thinking is important when it comes to making good decisions.

It gives us the tools to think through a choice rather than quickly picking an option — and probably guessing wrong. Think of it as the all-important ‘why.’

Why is that true? Why is that right? Why is this the only option?

Finding answers to questions like these requires critical thinking. It requires you to really analyze both the question itself and the possible solutions to establish validity.

Will that choice work for me? Does this feel right based on the evidence?

How does critical thinking in science impact students?

Critical thinking is essential in science.

It’s what naturally takes students in the direction of scientific reasoning since evidence is a key component of this style of thought.

It’s not just about whether evidence is available to support a particular answer but how valid that evidence is.

It’s about whether the information the student has fits together to create a strong argument and how to use verifiable facts to get a proper response.

Critical thinking in science helps students:

  • Actively evaluate information
  • Identify bias
  • Examine the logic within arguments
  • Analyze evidence

4 Ways to promote critical thinking

Figuring out how to develop critical thinking skills in science means looking at multiple strategies and deciding what will work best at your school and in your class.

Depending on your student population and their needs and abilities, not every option will be a home run.

These particular examples are all based on the idea that for students to really learn how to think critically, they have to practice doing it. 

Each focuses on engaging students with science in a way that will motivate them to work independently as they hone their scientific reasoning skills.

Project-Based Learning

Project-based learning centers on critical thinking.

Teachers can shape a project around the thinking style to give students practice with evaluating evidence or other critical thinking skills.

Critical thinking also happens during collaboration, evidence-based thought, and reflection.

For example, setting students up for a research project is not only a great way to get them to think critically, but it also helps motivate them to learn.

Allowing them to pick the topic (that isn’t easy to look up online), develop their own research questions, and establish a process to collect data to find an answer lets students personally connect to science while using critical thinking at each stage of the assignment.

They’ll have to evaluate the quality of the research they find and make evidence-based decisions.

Self-Reflection

Adding a question or two to any lab practicum or activity requiring students to pause and reflect on what they did or learned also helps them practice critical thinking.

At this point in an assignment, they’ll pause and assess independently. 

You can ask students to reflect on the conclusions they came up with for a completed activity, which really makes them think about whether there's any bias in their answer.

Addressing Assumptions

One way critical thinking aligns so perfectly with scientific reasoning is that it encourages students to challenge all assumptions. 

Evidence is king in the science classroom, but even when students work with hard facts, there comes the risk of a little assumptive thinking.

Working with students to identify assumptions in existing research, or asking them to address an issue where they suspend their own judgment and simply look at established facts, polishes that critical eye.

They’re getting practice at tossing out opinions, unproven hypotheses, and speculation in exchange for real data and real results, just like a scientist has to do.

Lab Activities With Trial-And-Error

Another component of critical thinking (as well as thinking like a scientist) is figuring out what to do when you get something wrong.

Backtracking can mean you have to rethink a process, redesign an experiment, or reevaluate data because the outcomes don’t make sense, but it’s okay.

The ability to get something wrong and recover is not only a valuable life skill, but it’s where most scientific breakthroughs start. Reminding students of this is always a valuable lesson.

Labs that include comparative activities are one way to increase critical thinking skills, especially when introducing new evidence that might cause students to change their conclusions once the lab has begun.

For example, you provide students with two distinct data sets and ask them to compare them.

With only two choices, there are a finite number of conclusions to draw, but then what happens when you bring in a third data set? Will it void certain conclusions? Will it allow students to make new conclusions, ones even more deeply rooted in evidence?

Thinking like a scientist

When students get the opportunity to think critically, they’re learning to trust the data over their ‘gut,’ to approach problems systematically and make informed decisions using ‘good’ evidence.

When practiced enough, this ability will engage students in science in a whole new way, providing them with opportunities to dig deeper and learn more.

It can help enrich science and motivate students to approach the subject just like a professional would.


3. Critical Thinking in Science: How to Foster Scientific Reasoning Skills

Critical thinking in science is important largely because a lot of students have developed expectations about science that can prove to be counter-productive. 

After various experiences — both in school and out — students often perceive science to be primarily about learning “authoritative” content knowledge: this is how the solar system works; that is how diffusion works; this is the right answer and that is not. 

This perception allows little room for critical thinking in science, in spite of the fact that argument, reasoning, and critical thinking lie at the very core of scientific practice.

Argument, reasoning, and critical thinking lie at the very core of scientific practice.

critical thinking skills scientific

In this article, we outline two approaches that are among the most effective at fostering scientific reasoning. Both try to put students in a scientist’s frame of mind more than is typical in science education:

  • First, we look at small-group inquiry, where students formulate questions and investigate them in small groups. This approach is geared more toward younger students but has applications at higher levels too.
  • We also look at science labs. Too often, labs involve students simply following recipes or replicating standard results. Here, we offer tips to turn labs into spaces for independent inquiry and scientific reasoning.


I. Critical Thinking in Science and Scientific Inquiry

Even very young students can “think scientifically” under the right instructional support. A series of experiments, for instance, established that preschoolers can make statistically valid inferences about unknown variables. Through observation they are also capable of distinguishing actions that cause certain outcomes from actions that don’t. These innate capacities, however, have to be developed for students to grow up into rigorous scientific critical thinkers.

Even very young students can “think scientifically” under the right instructional support.

Although there are many techniques to get young children involved in scientific inquiry — encouraging them to ask and answer “why” questions, for instance — teachers can provide structured scientific inquiry experiences that are deeper than students can experience on their own. 

Goals for Teaching Critical Thinking Through Scientific Inquiry

When it comes to teaching critical thinking via science, the learning goals may vary, but students should learn that:

  • Failure to agree is okay, as long as you have reasons for why you disagree about something.
  • The logic of scientific inquiry is iterative. Scientists always have to consider how they might improve their methods next time. This includes addressing sources of uncertainty.
  • Claims to knowledge usually require multiple lines of evidence and a “match” or “fit” between our explanations and the evidence we have.
  • Collaboration, argument, and discussion are central features of scientific reasoning.
  • Visualization, analysis, and presentation are central features of scientific reasoning.
  • Overarching concepts in scientific practice — such as uncertainty, measurement, and meaningful experimental contrasts — manifest themselves somewhat differently in different scientific domains.

How to Teach Critical Thinking in Science via Inquiry

Sometimes we think of science education as being either a “direct” approach, where we tell students about a concept, or an “inquiry-based” approach, where students explore a concept themselves.  

But, especially at the earliest grades, integrating both approaches can inform students of their options (i.e., generate and extend their ideas), while also letting students make decisions about what to do.

As with a lot of projects targeting critical thinking, limited classroom time is a challenge. Although the latest content standards, such as the Next Generation Science Standards, emphasize teaching scientific practices, many standardized tests still emphasize assessing scientific content knowledge.

The concept of uncertainty comes up in every scientific domain.

Creating a lesson that targets the right content is also an important aspect of developing authentic scientific experiences. It’s now more widely acknowledged that effective science instruction involves the interaction between domain-specific knowledge and domain-general knowledge, and that linking an inquiry experience to appropriate target content is vital.

For instance, the concept of uncertainty comes up in every scientific domain. But the sources of uncertainty coming from any given measurement vary tremendously by discipline. It requires content knowledge to know how to wisely apply the concept of uncertainty.

Tips and Challenges for Teaching Critical Thinking in Science

Teachers need to grapple with student misconceptions. Student intuition about how the world works — the way living things grow and behave, the way that objects fall and interact — often conflicts with scientific explanations. As part of the inquiry experience, teachers can help students to articulate these intuitions and revise them through argument and evidence.

Group composition is another challenge. Teachers will want to avoid situations where one member of the group will simply “take charge” of the decision-making, while other member(s) disengage. In some cases, grouping students by current ability level can make the group work more productive. 

Another approach is to establish group norms that help prevent unproductive group interactions. A third tactic is to have each group member learn an essential piece of the puzzle prior to the group work, so that each member is bringing something valuable to the table (which other group members don’t yet know).

It’s critical to ask students about how certain they are in their observations and explanations and what they could do better next time. When disagreements arise about what to do next or how to interpret evidence, the instructor should model good scientific practice by, for instance, getting students to think about what kind of evidence would help resolve the disagreement or whether there’s a compromise that might satisfy both groups.

The subjects of the inquiry experience and the tools at students’ disposal will depend upon the class and the grade level. Older students may be asked to create mathematical models, more sophisticated visualizations, and give fuller presentations of their results.

Lesson Plan Outline

This lesson plan takes a small-group inquiry approach to critical thinking in science. It asks students to collaboratively explore a scientific question, or perhaps a series of related questions, within a scientific domain.

Suppose students are exploring insect behavior. Groups may decide what questions to ask about insect behavior; how to observe, define, and record insect behavior; how to design an experiment that generates evidence related to their research questions; and how to interpret and present their results.

An in-depth inquiry experience usually takes place over the course of several classroom sessions, and includes classroom-wide instruction, small-group work, and potentially some individual work as well.

Students, especially younger students, will typically need some background knowledge that can inform more independent decision-making. So providing classroom-wide instruction and discussion before individual group work is a good idea.

For instance, Kathleen Metz had students observe insect behavior, explore the anatomy of insects, draw habitat maps, and collaboratively formulate (and categorize) research questions before students began to work more independently.

The subjects of a science inquiry experience can vary tremendously: local weather patterns, plant growth, pollution, bridge-building. The point is to engage students in multiple aspects of scientific practice: observing, formulating research questions, making predictions, gathering data, analyzing and interpreting data, refining and iterating the process.

As student groups take responsibility for their own investigation, teachers act as facilitators. They can circulate around the room, providing advice and guidance to individual groups. If classroom-wide misconceptions arise, they can pause group work to address those misconceptions directly and re-orient the class toward a more productive way of thinking.

Throughout the process, teachers can also ask questions like:

  • What are your assumptions about what’s going on? How can you check your assumptions?
  • Suppose that your results show X, what would you conclude?
  • If you had to do the process over again, what would you change? Why?


II. Rethinking Science Labs

Beyond changing how students approach scientific inquiry, we also need to rethink science labs. After all, science lab activities are ubiquitous in science classrooms and they are a great opportunity to teach critical thinking skills.

Often, however, science labs are merely recipes that students follow to verify standard values (such as the force of acceleration due to gravity) or relationships between variables (such as the relationship between force, mass, and acceleration) known to the students beforehand. 

This approach does not usually involve critical thinking: students are not making many decisions during the process, and they do not reflect on what they’ve done except to see whether their experimental data matches the expected values.

With some small tweaks, however, science labs can involve more critical thinking. Science lab activities that give students the opportunity not only to design, analyze, and interpret the experiment, but also to re-design, re-analyze, and re-interpret it, provide ample opportunity for grappling with evidence and evidence-model relationships, particularly if students don’t know beforehand what answer they should expect.

Such activities improve scientific reasoning skills, such as: 

  • Evaluating quantitative data
  • Generating plausible scientific explanations for observed patterns

And also broader critical thinking skills, like:

  • Comparing models to data, and comparing models to each other
  • Thinking about what kind of evidence supports one model or another
  • Being open to changing your beliefs based on evidence

Traditional science lab experiences bear little resemblance to actual scientific practice. Actual practice involves decision-making under uncertainty, trial-and-error, tweaking experimental methods over time, testing instruments, and resolving conflicts among different kinds of evidence. Traditional in-school science labs rarely involve these things.

Traditional science lab experiences bear little resemblance to actual scientific practice.

When teachers use science labs as opportunities to engage students in the kinds of dilemmas that scientists actually face during research, students make more decisions and exhibit more sophisticated reasoning.

In the lesson plan below, students are asked to evaluate two models of drag forces on a falling object. One model assumes that drag increases linearly with the velocity of the falling object. The other assumes that drag increases quadratically (i.e., with the square of the velocity).

Students use a motion detector and computer software to create a plot of the position of a disposable paper coffee filter as it falls to the ground. Among other variables, students can vary the number of coffee filters they drop at once, the height at which they drop them, how they drop them, and how they clean their data.

This is an approach to scaffolding critical thinking: a way to get students to ask the right kinds of questions and think in the way that scientists tend to think. (A sketch of one possible analysis appears after the lesson outline below.)

Design an experiment to test which model best characterizes the motion of the coffee filters. 

Things to think about in your design:

  • What are the relevant variables to control and which ones do you need to explore?
  • What are some logistical issues associated with the data collection that may cause unnecessary variability (either random or systematic) or mistakes?
  • How can you control or measure these?
  • What ways can you graph your data and which ones will help you figure out which model better describes your data?

Discuss your design with other groups and modify as you see fit.

Initial data collection

Conduct a quick trial-run of your experiment so that you can evaluate your methods.

  • Do your graphs provide evidence of which model is the best?
  • What ways can you improve your methods, data, or graphs to make your case more convincing?
  • Do you need to change how you’re collecting data?
  • Do you need to take data at different regions?
  • Do you just need more data?
  • Do you need to reduce your uncertainty?

After this initial evaluation of your data and methods, conduct the desired improvements, changes, or additions and re-evaluate at the end.

In your lab notes, make sure to keep track of your progress and process as you go. As always, your final product is less important than how you get there.
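As a sketch of where such an analysis might land, the snippet below assumes the filters quickly reach terminal velocity and that mass scales with the number of stacked filters; the measurements are placeholders, not real data. At terminal velocity the drag force balances gravity, so linear drag (mg = bv) predicts v_t ∝ N while quadratic drag (mg = cv²) predicts v_t ∝ √N, and a log-log fit of terminal speed against filter count discriminates between the two models.

```python
# Sketch: discriminating linear vs. quadratic drag from terminal speeds.
# Hypothetical measurements; real values come from the motion detector.
import numpy as np

n_filters = np.array([1, 2, 3, 4, 5])              # stacked filters (mass ~ N)
v_term = np.array([0.90, 1.27, 1.56, 1.80, 2.01])  # terminal speeds (m/s)

# mg = b*v   (linear)    ->  v_t ~ N^1.0  (log-log slope ~ 1.0)
# mg = c*v^2 (quadratic) ->  v_t ~ N^0.5  (log-log slope ~ 0.5)
slope, _ = np.polyfit(np.log(n_filters), np.log(v_term), 1)
print(f"fitted exponent: {slope:.2f}")
print("~0.5 favours quadratic drag; ~1.0 favours linear drag")
```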

How to Make Science Labs Run Smoothly

Managing student expectations. As with many other lesson plans that incorporate critical thinking, students are not used to having so much freedom. As in the example lesson plan above, it’s important to scaffold student decision-making by pointing out what decisions have to be made, especially as students are transitioning to this approach.

Supporting student reasoning. Another challenge is to provide guidance to student groups without telling them how to do something. Too much “telling” diminishes student decision-making, but not enough support may leave students simply not knowing what to do.

There are several key strategies teachers can try out here: 

  • Point out an issue with their data collection process without specifying exactly how to solve it.
  • Ask a lab group how they would improve their approach.
  • Ask two groups with conflicting results to compare their results, methods, and analyses.


Sources and Resources

Lehrer, R., & Schauble, L. (2007). Scientific thinking and scientific literacy. In Handbook of child psychology, Vol. 4. Wiley. A review of research on scientific thinking and experiments on teaching scientific thinking in the classroom.

Metz, K. (2004). Children’s understanding of scientific inquiry: Their conceptualizations of uncertainty in investigations of their own design. Cognition and Instruction, 22(2). An example of a scientific inquiry experience for elementary school students.

The Next Generation Science Standards. The latest U.S. science content standards.

Concepts of Evidence. A collection of important concepts related to evidence that cut across scientific disciplines.

Scienceblind. A book about children’s science misconceptions and how to correct them.

Holmes, N. G., Keep, B., & Wieman, C. E. (2020). Developing scientific decision making by structuring and supporting student agency. Physical Review Physics Education Research, 16(1), 010109. A research study on minimally altering traditional lab approaches to incorporate more critical thinking. The drag example was taken from this piece.

ISLE, led by E. Etkina. A platform that helps teachers incorporate more critical thinking in physics labs.

Holmes, N. G., Wieman, C. E., & Bonn, D. A. (2015). Teaching critical thinking. Proceedings of the National Academy of Sciences, 112(36), 11199-11204. An approach to improving critical thinking and reflection in science labs.

Walker, J. P., Sampson, V., Grooms, J., Anderson, B., & Zimmerman, C. O. (2012). Argument-driven inquiry in undergraduate chemistry labs: The impact on students’ conceptual understanding, argument skills, and attitudes toward science. Journal of College Science Teaching, 41(4), 74-81. A large-scale research study on transforming chemistry labs to be more inquiry-based.


What influences students’ abilities to critically evaluate scientific investigations?

Ashley B. Heim

1 Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America

2 Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

David Esparza

Michelle K. Smith, N. G. Holmes

Associated Data

All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive ( https://archive.ciser.cornell.edu/studies/2881 ).

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of if the individual evaluation questions are included. This research offers new insight on the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Introduction

Critical thinking and its importance.

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [ 1 ], is a foundational learning goal for almost any undergraduate course and can be integrated in many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [ 2 , 3 ]. Furthermore, critical thinking is consistently ranked as one of the most necessary outcomes of post-secondary education for career advancement by employers [ 4 ]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions based on multiple perspectives, keep an open mind, and acknowledge personal limitations [ 5 , 6 ]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [ 7 – 11 ], form hypotheses and make predictions [ 12 , 13 ], evaluate data [ 2 , 12 – 14 ], or draw conclusions based on a scenario or figure [ 2 , 12 – 14 ]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general) or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus definition of and assessment tools for critical thinking [ 15 , 16 ]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [ 17 ]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [ 18 , 19 ]. Current assessments include domain-general components [ 2 , 7 , 8 , 14 , 20 , 21 ], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [ 20 ] and context-specific components, such as to measure students’ abilities to think critically in domains such as neuroscience [ 9 ] and biology [ 10 ].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [ 22 – 24 ], as “thought processes are intertwined with what is being thought about” [ 23 ]. From this viewpoint, the context of the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [ 24 , 25 ].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [ 10 , 20 ]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [ 26 , 27 ]. Generally included in the “Analysis” level of Bloom’s taxonomy [ 28 – 30 ], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features between scenarios and discern relevant patterns or trends, rather than compile lists of important features [ 26 ]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [ 31 , 32 ]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies [ 32 ]—to support students in their compare-and-contrast thinking.

Purpose and research questions

Our primary objective of this study was to better understand what features of assessment questions elicit student critical thinking using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time or comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday life. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they are accurately measuring the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to measure cognitive student outcomes and incorporate more effective critical thinking opportunities in the classroom.

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained by all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating in this study unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one 2-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics and students were allowed to complete the assessments outside of class. The demographic distribution of the response data is presented in Table 1 , all of which were self-reported by students. The values presented in this table represent all responses we received.

Instrument description

Question types.

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [ 1 ]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [ 1 ]; the prompts for this scenario can be found in S3 Appendix . The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions). Examples of these individual evaluation questions are in Table 2 . They then suggest next steps the group should pursue (next steps items). Students are then asked to read about the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions to compare the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.

The Eco-BLIC examples are derived from the owl/mouse scenario.

Instrument versions

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.

Think-aloud interviews

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.

Data analyses

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).

[Fig 1]

The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

We ran descriptive statistics to summarize student responses to questions and examine distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions within the relevant questions from the same instrument. In all of these tests, we used a Bonferroni correction to lower the chances of receiving a false positive and account for multiple comparisons. We generated figures—primarily multi-pie chart graphs and heat maps—to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All aforementioned data analyses and figures were conducted or generated in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
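To make the version comparison concrete, here is a minimal sketch in Python of one such goodness-of-fit test. The counts are made up, and the divisor of eight in the Bonferroni correction is our assumption (chosen because it yields a threshold near the adjusted value of 0.006 reported below); the actual analyses were run in R and Excel on the archived data.

```python
# Sketch: chi-square goodness-of-fit for one group-comparison item,
# testing whether responses in the comparison-only version depart from
# the distribution observed in the full version (hypothetical counts).
from scipy.stats import chisquare

# Response options: Group 1 / Group 2 / both / neither
full_version = [220, 180, 320, 80]    # version with individual evaluations
short_version = [210, 175, 310, 85]   # comparison-only version

# Expected counts: full-version proportions scaled to the short-version total
n_short = sum(short_version)
expected = [c / sum(full_version) * n_short for c in full_version]

chi2, p = chisquare(f_obs=short_version, f_exp=expected)
alpha = 0.05 / 8  # Bonferroni: alpha divided by the number of items tested
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, significant: {p < alpha}")
```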

Results

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2) and, subsequently, answered a series of questions to compare and contrast the study approaches of both research groups side-by-side (group comparison items, Table 2). Through analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths ( Fig 2 ), evidenced by the mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).

[Fig 2]

Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.

[Fig 3]

The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students’ responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario (B) Eco-BLIC owl-mouse scenario (C) PLIC oscillation periods of masses hanging on springs scenario.

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare and contrast thinking

Given that students were more discerning when they directly compared two groups for both biology and physics experimental scenarios, we next sought to determine if the individual evaluation questions for Group 1 or Group 2 were necessary to elicit or helpful to support student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions in Fig 4 as heat maps.

[Fig 4]

The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

We ran chi-square goodness-of-fit tests comparing student responses across the two instrument versions and found no significant differences on the Eco-BLIC bass-mayfly scenario (Fig 4A; based on an adjusted p-value of 0.006) or owl-mouse questions (Fig 4B; based on an adjusted p-value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on both versions of the PLIC (Fig 4C; based on an adjusted p-value of 0.0005). The items that students responded to differently (p < 0.0005) across both versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramer’s C (Vc; [33]), a measure commonly applied to chi-square goodness-of-fit models to understand the magnitude of significant results. We found that the effect sizes for these three items were small (Vc = 0.11, Vc = 0.10, Vc = 0.06, respectively).
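For the effect sizes, the sketch below shows one common chi-square effect-size formulation; the paper cites Cramer’s C [33], whose exact formula may differ from this variant, so treat it as illustrative only.

```python
# Sketch: effect size for a chi-square goodness-of-fit test with k
# categories (one common formulation; the Cramer's C of [33] may be
# defined slightly differently).
import math

def effect_size_gof(chi2: float, n: int, k: int) -> float:
    return math.sqrt(chi2 / (n * (k - 1)))

# e.g., chi2 = 22.0 over n = 900 responses and k = 4 response options
print(round(effect_size_gof(22.0, 900, 4), 2))  # 0.09 -> a small effect
```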

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

Discussion

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing the two groups in the Eco-BLIC and PLIC than when evaluating each group individually. While students tended to evaluate study features of each group as strengths when considered independently (Fig 2), there was greater variation in their responses about which group was more effective when directly comparing the two groups (Fig 3). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [34–37]. When presented with a single example, students may deem certain study features unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [38]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side rather than in isolation [39, 40]. This result is somewhat surprising, however, as students could have used their knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This work would help determine whether our results suggest that students do not have a sufficient experiment-base to use as contrasts, or whether they simply do not use their experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions were not effective at engaging students in critical thinking, nor did they prepare students for subsequent questions that elicit critical thinking. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, this study raises the question of whether instructors should incorporate more experimental case studies into their courses and assessments so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [38]. For example, though sample size differs between experimental scenarios in our instruments, it is a significant feature with implications for other aspects of the research, such as statistical analyses and the behaviors of the animals. One limitation of our study is that we focused exclusively on experimental method evaluation questions (i.e., what to trust), and we are unsure whether the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC was designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates, whom we would expect to have more content knowledge and prior experiences for making comparisons in their respective disciplines [18, 41]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offered students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking in introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively, though there is much more to explore regarding the critical thinking processes of students across other STEM disciplines and at more advanced stages of their education. Undergraduate students in STEM need to be able to think critically for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight into the types of questions that elicit critical thinking, which educators and researchers across disciplines can apply to teach and measure cognitive student outcomes. Specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.

Supporting information

S1 Appendix. S2 Appendix. S3 Appendix.

Acknowledgments

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

Funding Statement

This work was supported by the National Science Foundation (nsf.gov) under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability



Critical thinking in the lab (and beyond)

David Read


How to alter existing activities to foster scientific skills

Although many of us associate chemistry education with the laboratory, there remains a lack of evidence that correlates student learning with practical work. It is vital we continue to improve our understanding of how students learn from practical work, and we should devise methods that maximise the benefits. Jon-Marc Rodriguez and Marcy Towns, researchers at Purdue University, US, recently outlined an approach to modify existing practical activities to promote critical thinking in students, supporting enhanced learning [1].



After an experiment, rather than asking a question, task students with plotting a graph; it’ll induce critical thinking and engagement with science practices

Jon-Marc and Marcy focused on critical thinking as a skill needed for successful engagement with the eight ‘science practices’. These practices come from a 2012 framework for science education published by the US National Research Council. The eight practices are: asking questions; developing and using models; planning and carrying out investigations; analysing and interpreting data; using mathematics and computational thinking; constructing explanations; engaging in argument from evidence; and obtaining, evaluating and communicating information. Such skills are widely viewed as integral to an effective chemistry programme. Practising scientists use multiple tools simultaneously when addressing a question, and well-designed practical activities that give students the opportunity to engage with numerous science practices will promote students’ scientific development.

The Purdue researchers chose to examine a traditional laboratory experiment on acid-base titrations because of its ubiquity in chemistry teaching. They characterised the pre- and post-lab questions associated with this experiment in terms of their alignment with the eight science practices. They found only two of ten pre- and post-lab questions elicited engagement with science practices, demonstrating the limitations of the traditional approach. Notably, the pre-lab questions included numerous calculations that were not considered to promote engagement with science practices. Students could answer the calculations algorithmically, with no consideration of the significance of their answers.

Next, Jon-Marc and Marcy modified the experiment and rewrote the pre- and post-lab questions in order to foster engagement with the science practices. They drew on recent research that recommends minimising the amount of information given to students and developing a general understanding of the underlying theory [2]. The modified set of questions was smaller, with a greater emphasis on conceptual understanding. The questions probed aspects such as the suitability of the method and the central question behind the experiment. Questions were more open and offered greater scope for developing critical thinking.


In taking an existing protocol and reframing it in terms of science practices, the authors demonstrate an approach instructors can use to adapt their existing activities to promote critical thinking. Using this approach, instructors do not have to spend excessive time creating new activities. Additionally, instructors will have the opportunity to research the impact of their approach on student learning in the teaching laboratory.

Teaching tips

Question phrasing and the steps students should go through to get an answer are instrumental in inducing critical thinking and engagement with science practices. As noted above, simple calculation-based questions do not prompt students to consider the significance of the values calculated. Questions should:

  • refer to an event, observation or phenomenon;
  • ask students to perform a calculation or demonstrate a relationship between variables;
  • ask students to provide a consequence or interpretation (not a restatement) in some form (eg a diagram or graph) based on their results, in the context of the event, observation or phenomenon.

This is more straightforward than it might first seem. The example question Jon-Marc and Marcy give requires students to calculate percentage errors for two titration techniques before discussing the relative accuracy of the methods. Students have to use their data to explain which method was more accurate, prompting a much higher level of engagement than a simple calculation.
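As a rough illustration of this kind of task, the short Python snippet below computes percentage errors for two titration methods. The accepted value, the measurements, and the method names are hypothetical, invented for illustration rather than taken from the Purdue study.

```python
# Percentage error for two titration methods, with hypothetical values.
true_concentration = 0.100  # mol/L, accepted value (invented)

measurements = {
    "method A (burette)": 0.098,  # mol/L (invented)
    "method B (dropper)": 0.091,  # mol/L (invented)
}

for method, measured in measurements.items():
    percent_error = abs(measured - true_concentration) / true_concentration * 100
    print(f"{method}: {percent_error:.1f}% error")

# Students then argue from these numbers which method is more accurate,
# rather than stopping at the calculation itself.
```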

As pre-lab preparation, ask students to consider an experimental procedure and then explain in a couple of sentences what methods are going to be used and the rationale for their use. As part of their pre-lab, the Purdue University research team asked students to devise a scientific (‘research’) question that could be answered using the data collected. They then asked students to evaluate and modify their own questions as part of the post-lab, supporting the development of investigative skills. It would be straightforward to incorporate this approach into any practical activity.

Finally, ask students to evaluate a mock response from another student about an aspect of the theory (eg ‘acids react with bases because acids like to donate protons and bases like to accept them’). This elicits critical thinking that can engage every student, with scope to stretch the more able.

These approaches can help students develop a more sophisticated view of chemistry and the higher order skills that will serve them well whatever their future destination.

[1] J-M G Rodriguez and M H Towns, J. Chem. Educ., 2018, 95, 2141, DOI: 10.1021/acs.jchemed.8b00683

[2] H Y Agustian and M K Seery, Chem. Educ. Res. Pract., 2017, 18, 518, DOI: 10.1039/C7RP00140A



Development of Critical Thinking Skills Through Science Learning

First Online: 02 January 2023


Johar Maknun

Part of the book series: Integrated Science (volume 13)


Education needs to be directed towards increasing a nation’s competitiveness in the face of global competition. This will be achieved as long as learning is directed towards improving students’ abilities, especially critical thinking skills. Therefore, science education aims to improve thinking skills and prepare students for future success. These abilities are trained through learning that requires students to conduct experiments, make discoveries, and solve problems in small-group discussions. A constructivist learning approach is believed to facilitate and develop these skills.

Graphical Abstract

Development of critical thinking skills through science learning. (Adapted with permission from the Association of Science and Art (ASA), Universal Scientific Education and Research Network (USERN); made by Sara Bakhshi)

Critical thinking relies on content, because you can’t navigate masses of information if you have nothing to navigate to. Kathy Hirsh-Pasek



About this chapter

Maknun, J. (2022). Development of Critical Thinking Skills Through Science Learning. In: Rezaei, N. (eds) Integrated Education and Learning. Integrated Science, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-031-15963-3_8



Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

  • Jason E. Dowd
  • Robert J. Thompson
  • Leslie A. Schiff
  • Julie A. Reynolds

*Address correspondence to: Jason E. Dowd ([email protected]).

Department of Biology, Duke University, Durham, NC 27708


Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference, while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but different constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, limited coherent frameworks and empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” (Facione, 1990, p. 3). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced around this consensus expert view (Blattner and Frazier, 2002; Condon and Kelly-Riley, 2004; Bissell and Lemons, 2006; Quitadamo and Kurtz, 2007) and the corresponding measures of critical-thinking skills (August, 2016; Stephenson and Sadler-McKnight, 2016).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the importance of the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though it is required for students to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two students did not consent to participate in the study; additionally, five students did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid CCTST scores and invalid CCTST scores. Therefore, we focus on the 65 students who consented to participate and for whom we have complete and valid data in most of this study. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 students granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on the 52 students who consented. We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
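For readers who want the banding rule in executable form, here is a minimal sketch. The cut-offs come from the text above; the function name and the choice of Python are our own.

```python
# Map a 0-100 CCTST score to its descriptive band (boundaries from the text;
# the helper name cctst_band is ours, not part of the CCTST).
def cctst_band(score: int) -> str:
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"

print(cctst_band(82))  # -> "strong"
```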

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
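The within-one-point agreement check described above can be expressed in a few lines of code. The sketch below uses invented ratings for illustration; it is not the raters’ actual data.

```python
# Within-1-point agreement between two raters on a 5-point rubric,
# using hypothetical ratings (one value per thesis dimension rated).
import numpy as np

rater_a = np.array([5, 4, 3, 5, 2, 4, 5, 3])  # hypothetical BioTAP ratings
rater_b = np.array([5, 3, 3, 4, 2, 5, 4, 3])

within_one = np.abs(rater_a - rater_b) <= 1
agreement = within_one.mean() * 100
print(f"Within-1-point agreement before discussion: {agreement:.1f}%")
```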

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
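
To illustrate how these two cumulative measures could be assembled for one student, here is a minimal sketch. The loadings shown are placeholders rather than the values published by Dowd et al. (2016), and a true factor score would be computed from standardized scores; this is only a simplified weighted-sum illustration:

```python
import numpy as np

# Invented BioTAP ratings for one student on questions 1-9 (1-5 scale).
biotap = np.array([4, 5, 3, 4, 5, 4, 3, 5, 4])

# Partial sum of questions 1-5 (higher-order scientific reasoning).
partial_sum = biotap[:5].sum()

# Simplified factor score: a weighted sum of dimension ratings.
# These loadings are placeholders, not the published values.
loadings = np.array([0.6, 0.7, 0.5, 0.6, 0.7, 0.4, 0.3, 0.3, 0.2])
factor_score = float(biotap @ loadings)

print(partial_sum, factor_score)
```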

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and the overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
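
The two central comparisons, Pearson’s correlation and the mastery/nonmastery t test, can be sketched as follows. All data and variable names here are invented stand-ins for the study’s protected data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Invented data: overall CCTST scores, BioTAP partial sums (questions
# 1-5), and question 5 ratings for n students.
n = 52
cctst_overall = rng.normal(80, 6, n)
biotap_partial = rng.normal(18, 3, n)
question5 = rng.integers(1, 6, n)

# Pearson's correlation between a CCTST measure and a BioTAP measure.
r, p = stats.pearsonr(cctst_overall, biotap_partial)

# Mastery/nonmastery t test: compare mean CCTST scores across groups
# defined by whether the question 5 rating equals 5.
mastery = question5 == 5
t, p_t = stats.ttest_ind(cctst_overall[mastery], cctst_overall[~mastery])
print(f"r = {r:.2f} (p = {p:.3f}); t = {t:.2f} (p = {p_t:.3f})")
```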

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
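
The Holm-Bonferroni step-down procedure is simple to implement; a minimal sketch follows (equivalently, `multipletests(p_values, method="holm")` from the statsmodels package performs the same correction):

```python
import numpy as np

def holm_bonferroni(p_values, alpha=0.05):
    """Holm (1979) step-down procedure.

    The i-th smallest p value (0-indexed) is compared against
    alpha / (m - i); testing stops at the first failure, and all
    remaining hypotheses are retained.
    """
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    reject = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(np.argsort(p)):
        if p[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break
    return reject

# With m = 35 comparisons, the smallest p value must fall below
# 0.05 / 35 ~= 0.00143. Here only 0.0012 survives; 0.0018 just misses
# its cutoff of 0.05 / 34 ~= 0.00147, mirroring the "marginally
# significant" result reported in the Results.
print(holm_bonferroni([0.0012, 0.0018, 0.03] + [0.5] * 32))
```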

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Differences that are statistically significant after correcting for multiple comparisons are shown in bold.
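
The effect sizes reported above are Cohen’s d with a pooled standard deviation, which can be computed alongside the t test. A minimal sketch, again with invented scores standing in for the protected data:

```python
import numpy as np
from scipy import stats

def cohens_d(x, y):
    """Cohen's d using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1)
                  + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

# Invented CCTST inference scores, split by mastery of BioTAP question 5.
mastery = np.array([88.0, 84, 90, 86, 83, 91])
nonmastery = np.array([80.0, 78, 84, 79, 82, 77, 81])

t, p = stats.ttest_ind(mastery, nonmastery)
print(f"t = {t:.2f}, p = {p:.4f}, d = {cohens_d(mastery, nonmastery):.2f}")
```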

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

** p < 0.01.

*** p < 0.001.
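
The three-model comparison described above corresponds to nested ordinary-least-squares regressions. A minimal sketch with invented data and placeholder covariate names (the real covariates are the academic and background variables described in the Methods):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented data; "gpa" and "sat_math" are placeholder covariates.
df = pd.DataFrame({
    "biotap_partial": [18, 22, 20, 24, 19, 23, 21, 25, 20, 22],
    "inference":      [78, 85, 80, 90, 79, 88, 84, 92, 81, 86],
    "gpa":            [3.2, 3.6, 3.4, 3.8, 3.3, 3.7, 3.5, 3.9, 3.4, 3.6],
    "sat_math":       [640, 700, 660, 730, 650, 710, 690, 740, 670, 700],
})

# Model 1: simple relationship between BioTAP and inference.
m1 = smf.ols("biotap_partial ~ inference", data=df).fit()
# Model 2: background covariates only.
m2 = smf.ols("biotap_partial ~ gpa + sat_math", data=df).fit()
# Model 3: combined; if inference remains significant here, the
# covariates do not explain away the relationship.
m3 = smf.ols("biotap_partial ~ inference + gpa + sat_math", data=df).fit()
print(m3.params)
```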

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing, captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing . Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

Taking the complementary view and focusing on BioTAP, we compared students who exhibit mastery with those who do not and found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and the results of other studies (with different populations) suggest that the observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that they merit further study with broader student populations.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative ( American Association for the Advancement of Science, 2011 ) provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

ACKNOWLEDGMENTS

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August, D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer, C. H., Taylor, E., & Gillmore, G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell, A. N., & Lemons, P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, 56(1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner, N. H., & Frazier, C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, 8(1), 47–64.
  • Clase, K. L., Gundlach, E., & Pelaez, N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, 38(5), 290–295.
  • Condon, W., & Kelly-Riley, D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, 9(1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding, L., Wei, X., & Liu, X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, 46(5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd, J. E., Connolly, M. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, 46(1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd, J. E., Roy, C. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, 92(1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd, J. E., Thompson, R. J., Jr., & Reynolds, J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 27, 36–51.
  • Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman, R. D., Russell, A. A., & Worden, K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, 36(5), 46–52.
  • Greenhoot, A. F., Semb, G., Colombo, J., & Schreiber, T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, 18(2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga, D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, 20(1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen, J. S., Bosack, T., Clay, S., McCarthy, M., Dunn, D. S., Hill, G. W., … Whitlock, K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, 30(3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand, B., & Keys, C. W. (1999). Inquiry investigation. Science Teacher, 66(4), 27–29.
  • Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6(2), 65–70.
  • Holyoak, K. J., & Morrison, R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly, G. J., & Takao, A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, 86(3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn, D., & Dean, D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, 5(2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23(4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson, A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, 94(2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish, D., LaVaque-Manty, D., Silver, N., & Kaplan, M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson, R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri, B., David, B.-C., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37(4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman, D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (expanded ed.). Washington, DC: National Academies Press.
  • Pukkila, P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, 166(1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo, I. J., Faiola, C. L., Johnson, J. E., & Kurtz, M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, 7(3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo, I. J., & Kurtz, M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, 6(2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds, J. A., Smith, R., Moskovitz, C., & Sayle, A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, 59(10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds, J. A., Thaiss, C., Katkin, W., & Thompson, R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, 11(1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds, J. A., & Thompson, R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, 10(2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla, M., Brosseau-Liard, P. E., & Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, 17(3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson, N. S., & Sadler-McKnight, N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, 17(1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq, V. N., Stefani, L. A. J., Butcher, A. C., & Heylings, D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, 23(3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman, B. E. C., Strickland, D. C., Johnson, R. L., & Payne, J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, 36(5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, 25(2), 149–169. https://doi.org/10.1080/713611428
  • Willison, J., & O’Regan, K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, 26(4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin, T., Carter, V. C., & Fletcher, L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, 9(2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin, A., & Abd-El-Khalick, F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, 47(9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman, C. (2000). The development of scientific reasoning skills. Developmental Review, 20(1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, 27(2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001

Submitted: 17 March 2017 Revised: 19 October 2017 Accepted: 20 October 2017

© 2018 J. E. Dowd et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).


The Oxford Handbook of Thinking and Reasoning


35 Scientific Thinking and Reasoning

Kevin N. Dunbar, Department of Human Development and Quantitative Methodology, University of Maryland, College Park, MD

David Klahr, Department of Psychology, Carnegie Mellon University, Pittsburgh, PA

  • Published: 21 November 2012

Scientific thinking refers to both thinking about the content of science and the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. Here we cover both the history of research on scientific thinking and the different approaches that have been used, highlighting common themes that have emerged over the past 50 years of research. Future research will focus on the collaborative aspects of scientific thinking, on effective methods for teaching science, and on the neural underpinnings of the scientific mind.

There is no unitary activity called “scientific discovery”; there are activities of designing experiments, gathering data, inventing and developing observational instruments, formulating and modifying theories, deducing consequences from theories, making predictions from theories, testing theories, inducing regularities and invariants from data, discovering theoretical constructs, and others. — Simon, Langley, & Bradshaw, 1981, p. 2

What Is Scientific Thinking and Reasoning?

There are two kinds of thinking we call “scientific.” The first, and most obvious, is thinking about the content of science. People are engaged in scientific thinking when they are reasoning about such entities and processes as force, mass, energy, equilibrium, magnetism, atoms, photosynthesis, radiation, geology, or astrophysics (and, of course, cognitive psychology!). The second kind of scientific thinking includes the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. However, these reasoning processes are not unique to scientific thinking: They are the very same processes involved in everyday thinking. As Einstein put it:

The scientific way of forming concepts differs from that which we use in our daily life, not basically, but merely in the more precise definition of concepts and conclusions; more painstaking and systematic choice of experimental material, and greater logical economy. (The Common Language of Science, 1941, reprinted in Einstein, 1950 , p. 98)

Nearly 40 years after Einstein's remarkably insightful statement, Francis Crick offered a similar perspective: that great discoveries in science result not from extraordinary mental processes, but from quite ordinary ones. The greatness of the discovery lies in the thing discovered.

I think what needs to be emphasized about the discovery of the double helix is that the path to it was, scientifically speaking, fairly commonplace. What was important was not the way it was discovered, but the object discovered—the structure of DNA itself. (Crick, 1988, p. 67; emphasis added)

Under this view, scientific thinking involves the same general-purpose cognitive processes—such as induction, deduction, analogy, problem solving, and causal reasoning—that humans apply in nonscientific domains. These processes are covered in several different chapters of this handbook: Rips, Smith, & Medin, Chapter 11 on induction; Evans, Chapter 8 on deduction; Holyoak, Chapter 13 on analogy; Bassok & Novick, Chapter 21 on problem solving; and Cheng & Buehner, Chapter 12 on causality. One might question the claim that the highly specialized procedures associated with doing science in the “real world” can be understood by investigating the thinking processes used in laboratory studies of the sort described in this volume. However, when the focus is on major scientific breakthroughs, rather than on the more routine, incremental progress in a field, the psychology of problem solving provides a rich source of ideas about how such discoveries might occur. As Simon and his colleagues put it:

It is understandable, if ironic, that ‘normal’ science fits … the description of expert problem solving, while ‘revolutionary’ science fits the description of problem solving by novices. It is understandable because scientific activity, particularly at the revolutionary end of the continuum, is concerned with the discovery of new truths, not with the application of truths that are already well-known … it is basically a journey into unmapped terrain. Consequently, it is mainly characterized, as is novice problem solving, by trial-and-error search. The search may be highly selective—but it reaches its goal only after many halts, turnings, and back-trackings. (Simon, Langley, & Bradshaw, 1981 , p. 5)

The research literature on scientific thinking can be roughly categorized according to the two types of scientific thinking listed in the opening paragraph of this chapter: (1) One category focuses on thinking that directly involves scientific content . Such research ranges from studies of young children reasoning about the sun-moon-earth system (Vosniadou & Brewer, 1992 ) to college students reasoning about chemical equilibrium (Davenport, Yaron, Klahr, & Koedinger, 2008 ), to research that investigates collaborative problem solving by world-class researchers in real-world molecular biology labs (Dunbar, 1995 ). (2) The other category focuses on “general” cognitive processes, but it tends to do so by analyzing people's problem-solving behavior when they are presented with relatively complex situations that involve the integration and coordination of several different types of processes, and that are designed to capture some essential features of “real-world” science in the psychology laboratory (Bruner, Goodnow, & Austin, 1956 ; Klahr & Dunbar, 1988 ; Mynatt, Doherty, & Tweney, 1977 ).

There are a number of overlapping research traditions that have been used to investigate scientific thinking. We will cover both the history of research on scientific thinking and the different approaches that have been used, highlighting common themes that have emerged over the past 50 years of research.

A Brief History of Research on Scientific Thinking

Science is often considered one of the hallmarks of the human species, along with art and literature. Illuminating the thought processes used in science thus reveals key aspects of the human mind. The thought processes underlying scientific thinking have fascinated both scientists and nonscientists because the products of science have transformed our world and because the process of discovery is shrouded in mystery. Scientists talk of the chance discovery, the flash of insight, the years of perspiration, and the voyage of discovery. These images of science have helped make the mental processes underlying discovery intriguing to cognitive scientists as they attempt to uncover what really goes on inside the scientific mind and how scientists really think. Furthermore, the possibilities that scientists can be taught to think better by avoiding mistakes that have been clearly identified in research on scientific thinking, and that the scientific process could be partially automated, make scientific thinking a topic of enduring interest.

The cognitive processes underlying scientific discovery and day-to-day scientific thinking have been a topic of intense scrutiny and speculation for almost 400 years (e.g., Bacon, 1620; Galilei, 1638; Klahr, 2000; Tweney, Doherty, & Mynatt, 1981). Understanding the nature of scientific thinking has been a central issue not only for our understanding of science but also for our understanding of what it is to be human. Bacon's Novum Organum in 1620 sketched out some of the key features of the ways that experiments are designed and data interpreted. Over the ensuing 400 years philosophers and scientists vigorously debated the appropriate methods that scientists should use (see Giere, 1993). These debates typically resulted in the espousal of a particular type of reasoning method, such as induction or deduction. It was not until the Gestalt psychologists began working on the nature of human problem solving, during the 1940s, that experimental psychologists began to investigate the cognitive processes underlying scientific thinking and reasoning.

The Gestalt psychologist Max Wertheimer pioneered the investigation of scientific thinking (of the first type described earlier: thinking about scientific content ) in his landmark book Productive Thinking (Wertheimer, 1945 ). Wertheimer spent a considerable amount of time corresponding with Albert Einstein, attempting to discover how Einstein generated the concept of relativity. Wertheimer argued that Einstein had to overcome the structure of Newtonian physics at each step in his theorizing, and the ways that Einstein actually achieved this restructuring were articulated in terms of Gestalt theories. (For a recent and different account of how Einstein made his discovery, see Galison, 2003 .) We will see later how this process of overcoming alternative theories is an obstacle that both scientists and nonscientists need to deal with when evaluating and theorizing about the world.

One of the first investigations of scientific thinking of the second type (i.e., collections of general-purpose processes operating on complex, abstract components of scientific thought) was carried out by Jerome Bruner and his colleagues at Harvard (Bruner et al., 1956). They argued that a key activity engaged in by scientists is to determine whether a particular instance is a member of a category. For example, a scientist might want to discover which substances undergo fission when bombarded by neutrons and which substances do not. Here, scientists have to discover the attributes that make a substance undergo fission. Bruner et al. saw scientific thinking as the testing of hypotheses and the collecting of data with the end goal of determining whether something is a member of a category. They invented a paradigm where people were required to formulate hypotheses and collect data that test those hypotheses. In one type of experiment, the participants were shown a card such as one with two borders and three green triangles. The participants were asked to determine the concept that this card represented by choosing other cards and getting feedback from the experimenter as to whether the chosen card was an example of the concept. In this case the participant may have thought that the concept was green and chosen a card with two green squares and one border. If the underlying concept was green, then the experimenter would say that the card was an example of the concept. In terms of scientific thinking, choosing a new card is akin to conducting an experiment, and the feedback from the experimenter is similar to knowing whether a hypothesis is confirmed or disconfirmed. Using this approach, Bruner et al. identified a number of strategies that people use to formulate and test hypotheses. They found that a key factor determining which hypothesis-testing strategy people use is the amount of memory capacity that the strategy takes up (see also Morrison & Knowlton, Chapter 6; Medin et al., Chapter 11). They also discovered that it was much more difficult for people to discover negative concepts (e.g., not blue) than positive concepts (e.g., blue). Although Bruner et al.'s research is most commonly viewed as work on concepts, they saw their work as uncovering a key component of scientific thinking.
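
To make the logic of this paradigm concrete, the following minimal sketch (in Python; the attribute values and the single-attribute hidden concept are invented stand-ins, not Bruner et al.'s actual stimuli) treats each chosen card as an experiment whose yes/no feedback prunes the space of candidate concepts:

```python
import itertools

# Attribute values are illustrative stand-ins, not Bruner et al.'s stimuli.
ATTRIBUTES = {
    "color":   ["green", "red", "black"],
    "shape":   ["triangle", "square", "circle"],
    "borders": [1, 2, 3],
}

def fits(card, concept):
    """Experimenter feedback: is this card an example of the concept?"""
    return all(card[attr] == value for attr, value in concept.items())

hidden = {"color": "green"}   # the concept the learner must discover

# The learner entertains every single-attribute hypothesis...
hypotheses = [{attr: v} for attr, vals in ATTRIBUTES.items() for v in vals]

# ...and may choose any card as an "experiment."
cards = [dict(zip(ATTRIBUTES, combo))
         for combo in itertools.product(*ATTRIBUTES.values())]

for card in cards:                      # each chosen card is one experiment
    feedback = fits(card, hidden)       # the experimenter's yes/no answer
    # Keep only hypotheses that would have predicted this feedback.
    hypotheses = [h for h in hypotheses if fits(card, h) == feedback]
    if len(hypotheses) == 1:
        print("Concept identified:", hypotheses[0])   # {'color': 'green'}
        break
```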

A second early line of research on scientific thinking was developed by Peter Wason and his colleagues (Wason, 1968). Like Bruner et al., Wason saw a key component of scientific thinking as being the testing of hypotheses. Whereas Bruner et al. focused on the different types of strategies that people use to formulate hypotheses, Wason focused on whether people adopt a strategy of trying to confirm or disconfirm their hypotheses. Drawing on Popper's (1959) view that scientists should try to falsify rather than confirm their hypotheses, Wason devised a deceptively simple task in which participants were given three numbers, such as 2-4-6, and were asked to discover the rule underlying the three numbers. Participants were asked to generate other triads of numbers, and the experimenter would tell the participant whether the triad was consistent or inconsistent with the rule. They were told that when they were sure they knew what the rule was, they should state it. Most participants began the experiment by thinking that the rule was even numbers increasing by 2. They then attempted to confirm their hypothesis by generating triads like 8-10-12 and 14-16-18; these triads are consistent with the rule, and the participants were told so. However, when they proposed the rule—even numbers increasing by 2—they were told that the rule was incorrect. The correct rule was numbers of increasing magnitude! From this research, Wason concluded that people try to confirm their hypotheses, whereas normatively speaking, they should try to disconfirm their hypotheses. One implication of this research is that confirmation bias is not restricted to scientists but is a general human tendency.
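
The structure of the 2-4-6 task is easy to simulate. In the sketch below (the probe triads are illustrative, not Wason's materials), purely confirmatory probes never expose the flaw in the "even numbers increasing by 2" hypothesis, whereas probes the hypothesis classifies as rule-violating are exactly the ones that can reveal the discrepancy:

```python
def true_rule(triad):
    """The experimenter's actual rule: any strictly increasing triple."""
    a, b, c = triad
    return a < b < c

def favored_hypothesis(triad):
    """The hypothesis most participants form: even numbers increasing by 2."""
    a, b, c = triad
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirmatory probes: triads the hypothesis already predicts will be "yes."
for probe in [(8, 10, 12), (14, 16, 18), (20, 22, 24)]:
    # Feedback matches the hypothesis every time, so it is never challenged.
    assert true_rule(probe) and favored_hypothesis(probe)

# Probes the hypothesis says "no" to are the ones that can expose it.
for probe in [(1, 2, 3), (2, 4, 7), (3, 2, 1)]:
    if true_rule(probe) != favored_hypothesis(probe):
        print(probe, "is a 'yes' the hypothesis cannot explain")
```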

It was not until the 1970s that a general account of scientific reasoning was proposed. Herbert Simon, often in collaboration with Allen Newell, proposed that scientific thinking is a form of problem solving. He proposed that problem solving is a search in a problem space. Newell and Simon's theory of problem solving is discussed in many places in this handbook, usually in the context of specific problems (see especially Bassok & Novick, Chapter 21). Herbert Simon, however, devoted considerable time to understanding many different scientific discoveries and scientific reasoning processes. The common thread in his research was that scientific thinking and discovery are not a mysterious, magical process but a form of problem solving in which clear heuristics are used. Simon's goal was to articulate the heuristics that scientists use in their research at a fine-grained level. By constructing computer programs that simulated the process of several major scientific discoveries, Simon and colleagues were able to articulate the specific computations that scientists could have used in making those discoveries (Langley, Simon, Bradshaw, & Zytkow, 1987; see section on “Computational Approaches to Scientific Thinking”). Particularly influential was Simon and Lea's (1974) work demonstrating that concept formation and induction consist of a search in two problem spaces: a space of instances and a space of rules. This idea has influenced the problem-solving accounts of scientific thinking discussed in the next section.

Overall, the work of Bruner, Wason, and Simon laid the foundations for contemporary research on scientific thinking. Early research on scientific thinking is summarized in Tweney, Doherty, and Mynatt's 1981 book On Scientific Thinking, where they sketched out many of the themes that have dominated research on scientific thinking over the past few decades. Other more recent books such as Cognitive Models of Science (Giere, 1993), Exploring Science (Klahr, 2000), Cognitive Basis of Science (Carruthers, Stich, & Siegal, 2002), and New Directions in Scientific and Technical Thinking (Gorman, Kincannon, Gooding, & Tweney, 2004) provide detailed analyses of different aspects of scientific discovery. Another important collection is Vosniadou's handbook on conceptual change research (Vosniadou, 2008). In this chapter, we discuss the main approaches that have been used to investigate scientific thinking.

How does one go about investigating the many different aspects of scientific thinking? One common approach to the study of the scientific mind has been to investigate several key aspects of scientific thinking using abstract tasks designed to mimic some essential characteristics of “real-world” science. Numerous methodologies have been used to analyze the genesis of scientific concepts, theories, hypotheses, and experiments: researchers have conducted experiments, collected verbal protocols, built computer programs, and analyzed particular scientific discoveries. A more recent development has been to increase the ecological validity of such research by investigating scientists as they reason “live” (in vivo studies of scientific thinking) in their own laboratories (Dunbar, 1995, 2002). From a “Thinking and Reasoning” standpoint, the major aspects of scientific thinking that have been most actively investigated are problem solving, analogical reasoning, hypothesis testing, conceptual change, collaborative reasoning, inductive reasoning, and deductive reasoning.

Scientific Thinking as Problem Solving

One of the primary goals of accounts of scientific thinking has been to provide an overarching framework to understand the scientific mind. One framework that has had a great influence in cognitive science is that scientific thinking and scientific discovery can be conceived as a form of problem solving. As noted in the opening section of this chapter, Simon (1977; Simon, Langley, & Bradshaw, 1981) argued that both scientific thinking in general and problem solving in particular could be thought of as a search in a problem space. A problem space consists of all the possible states of a problem and all the operations that a problem solver can use to get from one state to the next. According to this view, by characterizing the types of representations and procedures that people use to get from one state to another, it is possible to understand scientific thinking. Thus, scientific thinking can be characterized as a search in various problem spaces (Simon, 1977). Simon investigated a number of scientific discoveries by bringing participants into the laboratory, providing them with the data that a scientist had access to, and getting them to reason about the data and rediscover a scientific concept. He then analyzed the verbal protocols that participants generated and mapped out the types of problem spaces that the participants searched in (e.g., Qin & Simon, 1990). Kulkarni and Simon (1988) used a more historical approach to uncover the problem-solving heuristics that Krebs used in his discovery of the urea cycle. They analyzed Krebs's diaries and proposed a set of problem-solving heuristics that he used in his research. They then built a computer program incorporating the heuristics and the biological knowledge that Krebs had before he made his discoveries. Of particular importance are the search heuristics that the program uses, which include experimental proposal heuristics and data interpretation heuristics. A key one was an unusualness heuristic that focused attention on unusual findings and guided search through a space of theories and a space of experiments.

Klahr and Dunbar (1988) extended the search in a problem space approach and proposed that scientific thinking can be thought of as a search through two related spaces: a hypothesis space and an experiment space. Each problem space that a scientist uses will have its own types of representations and operators used to change the representations. Search in the hypothesis space constrains search in the experiment space. Klahr and Dunbar found that some participants move from the hypothesis space to the experiment space, whereas others move from the experiment space to the hypothesis space. These different types of searches lead to the proposal of different types of hypotheses and experiments. More recent work has extended the dual-space approach to include alternative problem-solving spaces, including those for data, instrumentation, and domain-specific knowledge (Klahr & Simon, 1999; Schunn & Klahr, 1995, 1996).
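
A minimal sketch of this dual-space idea follows; the "device" and the candidate rules are hypothetical stand-ins rather than Klahr and Dunbar's actual task. Each input tried is a step through the experiment space, and each outcome prunes the hypothesis space, which in turn constrains what is worth trying next:

```python
# Hypothesis space: candidate rules the learner can entertain.
HYPOTHESES = {
    "doubles the input": lambda x: 2 * x,
    "squares the input": lambda x: x * x,
    "adds two":          lambda x: x + 2,
}

def device(x):
    """The unknown mechanism being investigated (hidden from the learner)."""
    return x * x

live = dict(HYPOTHESES)        # hypotheses still consistent with the data

for x in range(10):            # experiment space: the inputs one may try
    outcome = device(x)        # running one experiment
    # The result constrains the hypothesis space...
    live = {name: h for name, h in live.items() if h(x) == outcome}
    # ...and the surviving hypotheses guide the next experiment
    # (here, simply the next untried input).
    if len(live) == 1:
        print("Surviving hypothesis:", next(iter(live)))
        break
```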

Scientific Thinking as Hypothesis Testing

Many researchers have regarded testing specific hypotheses predicted by theories as one of the key attributes of scientific thinking. Hypothesis testing is the process of evaluating a proposition by collecting evidence regarding its truth. Experimental cognitive research on scientific thinking that specifically examines this issue has tended to fall into two broad classes of investigations. The first class is concerned with the types of reasoning that lead scientists astray, thus blocking scientific ingenuity. A large amount of research has been conducted on the potentially faulty reasoning strategies that both participants in experiments and scientists use, such as considering only one favored hypothesis at a time and how this prevents the scientists from making discoveries. The second class is concerned with uncovering the mental processes underlying the generation of new scientific hypotheses and concepts. This research has tended to focus on the use of analogy and imagery in science, as well as the use of specific types of problem-solving heuristics.

Turning first to investigations of what diminishes scientific creativity, philosophers, historians, and experimental psychologists have devoted a considerable amount of research to “confirmation bias.” This occurs when scientists consider only one hypothesis (typically the favored hypothesis) and ignore alternative or potentially relevant hypotheses. This important phenomenon can distort the design of experiments, the formulation of theories, and the interpretation of data. Beginning with the work of Wason (1968), and as discussed earlier, researchers have repeatedly shown that when participants are asked to design an experiment to test a hypothesis, they will predominantly design experiments that they think will yield results consistent with the hypothesis. Using the 2-4-6 task mentioned earlier, Klayman and Ha (1987) showed that in situations where one's hypothesis is likely to be confirmed, seeking confirmation is a normatively incorrect strategy, whereas when the probability of confirming one's hypothesis is low, attempting to confirm one's hypothesis can be an appropriate strategy. Historical analyses by Tweney (1989), concerning the way that Faraday made his discoveries, and experiments investigating people testing hypotheses, have revealed that people use a confirm-early, disconfirm-late strategy: When people initially generate or are given hypotheses, they try to gather evidence that is consistent with the hypothesis. Once enough evidence has been gathered, people attempt to find the boundaries of their hypothesis and often try to disconfirm it.
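
Klayman and Ha's point lends itself to a small computation. In the sketch below (the rules are invented for illustration), positive tests cannot falsify a hypothesis nested inside a broader true rule, which is exactly the situation in the 2-4-6 task, but they falsify readily when the true rule is narrower than the hypothesis:

```python
ITEMS = range(1000)                 # the universe of possible observations

def hypothesis(x):                  # the learner's rule: multiples of 4
    return x % 4 == 0

def broad_truth(x):                 # a true rule BROADER than the hypothesis
    return x % 2 == 0

def narrow_truth(x):                # a true rule NARROWER than the hypothesis
    return x % 8 == 0

def falsification_rate(truth, positive_test):
    """Share of probes whose feedback contradicts the hypothesis."""
    probes = [x for x in ITEMS if hypothesis(x) == positive_test]
    return sum(truth(x) != hypothesis(x) for x in probes) / len(probes)

# Hypothesis nested inside a broader truth: only negative tests can falsify.
print(falsification_rate(broad_truth, positive_test=True))    # 0.0
print(falsification_rate(broad_truth, positive_test=False))   # ~0.33
# Truth nested inside the hypothesis: positive tests falsify readily.
print(falsification_rate(narrow_truth, positive_test=True))   # 0.5
print(falsification_rate(narrow_truth, positive_test=False))  # 0.0
```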

In an interesting variant on the confirmation bias paradigm, Gorman (1989) showed that when participants are told that there is the possibility of error in the data that they receive, they assume that any data inconsistent with their favored hypothesis are due to error. Thus, the possibility of error “insulates” hypotheses against disconfirmation. This hypothesis has not been confirmed by other researchers (Penner & Klahr, 1996), but it is intriguing and warrants further investigation.

Confirmation bias is very difficult to overcome. Even when participants are asked to consider alternate hypotheses, they will often fail to conduct experiments that could potentially disconfirm their hypothesis. Tweney and his colleagues provide an excellent overview of this phenomenon in their classic monograph On Scientific Thinking (1981). The precise reasons for this type of block are still widely debated. Researchers such as Michael Doherty have argued that working memory limitations make it difficult for people to consider more than one hypothesis. Consistent with this view, Dunbar and Sussman (1995) have shown that when participants are asked to hold irrelevant items in working memory while testing hypotheses, they are unable to switch hypotheses in the face of inconsistent evidence. But working memory limitations are not the whole story: even groups of scientists can display confirmation bias. The controversy over cold fusion is a case in point. Here, large groups of scientists had other hypotheses available to explain their data yet maintained their favored hypothesis in the face of more standard alternative hypotheses. Mitroff (1974) provides some interesting examples of NASA scientists demonstrating confirmation bias, which highlight the roles of commitment and motivation in this process. See also MacPherson and Stanovich (2007) for specific strategies that can be used to overcome confirmation bias.

Causal Thinking in Science

Much of scientific thinking and scientific theory building pertains to the development of causal models between variables of interest. For example, do vaccines cause illnesses? Do carbon dioxide emissions cause global warming? Does water on a planet indicate that there is life on the planet? Scientists and nonscientists alike are constantly bombarded with statements regarding the causal relationship between such variables. How does one evaluate the status of such claims? What kinds of data are informative? How do scientists and nonscientists deal with data that are inconsistent with their theory?

A central issue in the causal reasoning literature, one that is directly relevant to scientific thinking, is the extent to which scientists and nonscientists alike are governed by the search for causal mechanisms (i.e., how a variable works) versus the search for statistical data (i.e., how often variables co-occur). This dichotomy can be boiled down to the search for qualitative versus quantitative information about the paradigm the scientist is investigating. Researchers from a number of cognitive psychology laboratories have found that people prefer to gather more information about an underlying mechanism than covariation between a cause and an effect (e.g., Ahn, Kalish, Medin, & Gelman, 1995 ). That is, the predominant strategy that students in simulations of scientific thinking use is to gather as much information as possible about how the objects under investigation work, rather than collecting large amounts of quantitative data to determine whether the observations hold across multiple samples. These findings suggest that a central component of scientific thinking may be to formulate explicit mechanistic causal models of scientific events.
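
One standard way the causal reasoning literature quantifies covariation, as distinct from mechanism knowledge, is the delta-P rule: the difference between the probability of the effect given the cause and given its absence. The index itself is not described in this chapter, and the contingency counts below are invented for illustration:

```python
def delta_p(a, b, c, d):
    """Covariation index: P(effect | cause) - P(effect | no cause).
    a: cause present, effect present    b: cause present, effect absent
    c: cause absent,  effect present    d: cause absent,  effect absent"""
    return a / (a + b) - c / (c + d)

# Invented counts: 30 of 40 treated samples show the effect,
# but so do 15 of 60 untreated samples.
print(delta_p(30, 10, 15, 45))   # 0.75 - 0.25 = 0.50: positive covariation
```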

One type of situation in which causal reasoning has been observed extensively is when scientists obtain unexpected findings. Both historical and naturalistic research has revealed that reasoning causally about unexpected findings plays a central role in science. Indeed, scientists themselves frequently state that a finding was due to chance or was unexpected. Given that claims of unexpected findings are such a frequent component of scientists' autobiographies and interviews in the media, Dunbar ( 1995 , 1997 , 1999 ; Dunbar & Fugelsang, 2005 ; Fugelsang, Stein, Green, & Dunbar, 2004 ) decided to investigate the ways that scientists deal with unexpected findings. In 1991–1992 Dunbar spent 1 year in three molecular biology laboratories and one immunology laboratory at a prestigious U.S. university. He used the weekly laboratory meeting as a source of data on scientific discovery and scientific reasoning. (He termed this type of study “in vivo” cognition.) When he looked at the types of findings that the scientists made, he found that over 50% of the findings were unexpected and that these scientists had evolved a number of effective strategies for dealing with such findings. One clear strategy was to reason causally about the findings: Scientists attempted to build causal models of their unexpected findings. This causal model building results in the extensive use of collaborative reasoning, analogical reasoning, and problem-solving heuristics (Dunbar, 1997 , 2001 ).

Many of the key unexpected findings that scientists reasoned about in the in vivo studies of scientific thinking were inconsistent with the scientists' preexisting causal models. A laboratory equivalent of the biology labs involved creating a situation in which students obtained unexpected findings that were inconsistent with their preexisting theories. Dunbar and Fugelsang ( 2005 ) examined this issue by creating a scientific causal thinking simulation where experimental outcomes were either expected or unexpected. Dunbar ( 1995 ) has called the study of people reasoning in a cognitive laboratory “in vitro” cognition. These investigators found that students spent considerably more time reasoning about unexpected findings than expected findings. In addition, when assessing the overall degree to which their hypothesis was supported or refuted, participants spent the majority of their time considering unexpected findings. An analysis of participants' verbal protocols indicates that much of this extra time was spent formulating causal models for the unexpected findings. Similarly, scientists spend more time considering unexpected than expected findings, and this time is devoted to building causal models (Dunbar & Fugelsang, 2004 ).

Scientists know that unexpected findings occur often, and they have developed many strategies to take advantage of their unexpected findings. One of the most important places that they anticipate the unexpected is in designing experiments (Baker & Dunbar, 2000 ). They build different causal models of their experiments incorporating many conditions and controls. These multiple conditions and controls allow unknown mechanisms to manifest themselves. Thus, rather than being the victims of the unexpected, they create opportunities for unexpected events to occur, and once these events do occur, they have causal models that allow them to determine exactly where in the causal chain their unexpected finding arose. The results of these in vivo and in vitro studies all point to a more complex and nuanced account of how scientists and nonscientists alike test and evaluate hypotheses about theories.

The Roles of Inductive, Abductive, and Deductive Thinking in Science

One of the most basic characteristics of science is that scientists assume that the universe we live in follows predictable rules. Scientists reason using a variety of different strategies to make new scientific discoveries. Three frequently used reasoning strategies are inductive, abductive, and deductive reasoning. In the case of inductive reasoning, a scientist may observe a series of events and try to discover a rule that governs them. Once a rule is discovered, scientists can extrapolate from the rule to formulate theories of observed and yet-to-be-observed phenomena. One example is the discovery, using inductive reasoning, that a certain type of bacterium is a cause of many ulcers (Thagard, 1999). In a fascinating series of articles, Thagard documented the reasoning processes that Marshall and Warren went through in proposing this novel hypothesis. One key reasoning process was the use of induction by generalization. Marshall and Warren noted that almost all patients with gastritis had a spiral bacterium in their stomachs, and they formed the generalization that this bacterium is the cause of stomach ulcers. There are numerous other examples of induction by generalization in science, such as Tycho Brahe's induction about the motion of planets from his observations, Dalton's use of induction in chemistry, and the discovery of prions as the source of mad cow disease. Many theories of induction have used scientific discovery and reasoning as examples of this important reasoning process.

Another common type of inductive reasoning is to map a feature of one member of a category to another member of a category. This is called categorical induction. This type of induction is a way of projecting a known property of one item onto another item that is from the same category. Thus, knowing that the Rous Sarcoma virus is a retrovirus that uses RNA rather than DNA, a biologist might assume that another virus that is thought to be a retrovirus also uses RNA rather than DNA. While research on this type of induction typically has not been discussed in accounts of scientific thinking, this type of induction is common in science. For an influential contribution to this literature, see Smith, Shafir, and Osherson ( 1993 ), and for reviews of this literature see Heit ( 2000 ) and Medin et al. (Chapter 11 ).

While less commonly mentioned than inductive reasoning, abductive reasoning is an important form of reasoning that scientists use when they are seeking to propose explanations for events such as unexpected findings (see Lombrozo, Chapter 14; Magnani et al., 2010). In Figure 35.1, taken from King (2011), the differences between inductive, abductive, and deductive thinking are highlighted. In the case of abduction, the reasoner attempts to generate explanations of the form “if situation X had occurred, could it have produced the current evidence I am attempting to interpret?” (For an interesting analysis of abductive reasoning, see the brief paper by Klahr & Masnick, 2001.) Of course, as in classical induction, such reasoning may produce a plausible account that is still not the correct one. However, abduction does involve the generation of new knowledge and is thus also related to research on creativity.

Figure 35.1. The different processes underlying inductive, abductive, and deductive reasoning in science. (Figure reproduced from King, 2011.)

Turning now to deductive thinking, many thinking processes that scientists adhere to follow traditional rules of deductive logic. These processes correspond to conditions in which a conclusion follows necessarily from a hypothesis. Though they are not always phrased in syllogistic form, deductive arguments can be phrased as “syllogisms,” or as brief, mathematical statements in which the premises lead to the conclusion. Deductive reasoning is an extremely important aspect of scientific thinking because it underlies a large component of how scientists conduct their research. By looking at many scientific discoveries, we can often see that deductive reasoning is at work. Deductive reasoning statements all contain information or rules that state an assumption about how the world works, as well as a conclusion that would necessarily follow from the rule. Numerous discoveries in physics, such as the discovery of dark matter by Vera Rubin, are based on deductions. In the dark matter case, Rubin measured galactic rotation curves and, based on the differences between the predicted and observed angular motions of galaxies, deduced that the structure of the universe was uneven. This led her to propose that dark matter existed. In contemporary physics, the CERN Large Hadron Collider is being used to search for the Higgs boson, which is a deductive prediction from contemporary physics. If the Higgs boson is not found, it may lead to a radical revision of the nature of physics and a new understanding of mass (Hecht, 2011).
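
The deductive skeleton of such tests, in which a theory licenses a prediction whose observed failure rebounds on the theory, can be written out schematically. The sketch below is purely illustrative and is not a model drawn from the chapter:

```python
def conclusion(theory_predicts, prediction_observed):
    """What an observation licenses deductively, given 'theory -> prediction'."""
    if not theory_predicts:
        return "no deductive link: the observation is silent about the theory"
    if prediction_observed:
        return "theory survives the test (consistent, though not proven)"
    return "theory, as stated, must be revised (modus tollens)"

# The Higgs case as sketched above: contemporary physics predicts the boson.
print(conclusion(theory_predicts=True, prediction_observed=False))
```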

The Roles of Analogy in Scientific Thinking

One of the most widely mentioned reasoning processes used in science is analogy. Scientists use analogies to form a bridge between what they already know and what they are trying to explain, understand, or discover. In fact, many scientists have claimed that the making of certain analogies was instrumental in their making a scientific discovery, and almost all scientific autobiographies and biographies feature one particular analogy that is discussed in depth. Coupled with the fact that there has been an enormous research program on analogical thinking and reasoning (see Holyoak, Chapter 13 ), we now have a number of models and theories of analogical reasoning that suggest how analogy can play a role in scientific discovery (see Gentner, Holyoak, & Kokinov, 2001 ). By analyzing several major discoveries in the history of science, Thagard and Croft ( 1999 ), Nersessian ( 1999 , 2008 ), and Gentner and Jeziorski ( 1993 ) have all shown that analogical reasoning is a key aspect of scientific discovery.

Traditional accounts of analogy distinguish between two components of analogical reasoning: the target and the source (Holyoak, Chapter 13 ; Gentner 2010 ). The target is the concept or problem that a scientist is attempting to explain or solve. The source is another piece of knowledge that the scientist uses to understand the target or to explain the target to others. What the scientist does when he or she makes an analogy is to map features of the source onto features of the target. By mapping the features of the source onto the target, new features of the target may be discovered, or the features of the target may be rearranged so that a new concept is invented and a scientific discovery is made. For example, a common analogy that is used with computers is to describe a harmful piece of software as a computer virus. Once a piece of software is called a virus, people can map features of biological viruses, such as that it is small, spreads easily, self-replicates using a host, and causes damage. People not only map individual features of the source onto the target but also the systems of relations. For example, if a computer virus is similar to a biological virus, then an immune system can be created on computers that can protect computers from future variants of a virus. One of the reasons that scientific analogy is so powerful is that it can generate new knowledge, such as the creation of a computational immune system having many of the features of a real biological immune system. This analogy also leads to predictions that there will be newer computer viruses that are the computational equivalent of retroviruses, lacking DNA, or standard instructions, that will elude the computational immune system.
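
The mapping step in this virus analogy can be rendered as a toy sketch (the feature lists are invented for illustration): source features with no counterpart in the target become candidate inferences about the target:

```python
# Feature lists are invented for illustration.
biological_virus = {           # the source
    "small", "spreads easily", "self-replicates using a host",
    "causes damage", "countered by an immune system",
}
computer_virus = {             # the target, as currently understood
    "small", "spreads easily", "self-replicates using a host",
    "causes damage",
}

# Mapping: source features absent from the target become candidate
# inferences -- here, the idea that motivates building a computational
# immune system.
candidate_inferences = biological_virus - computer_virus
print(candidate_inferences)    # {'countered by an immune system'}
```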

The process of making an analogy involves a number of key steps: retrieval of a source from memory, aligning the features of the source with those of the target, mapping features of the source onto those of the target, and possibly making new inferences about the target. Scientific discoveries are made when the source highlights a hitherto unknown feature of the target or restructures the target into a new set of relations. Interestingly, research on analogy has shown that participants do not easily use remote analogies (see Gentner et al., 1997 ; Holyoak & Thagard 1995 ). Participants in experiments tend to focus on the sharing of a superficial feature between the source and the target, rather than the relations among features. In his in vivo studies of science, Dunbar ( 1995 , 2001 , 2002 ) investigated the ways that scientists use analogies while they are conducting their research and found that scientists use both relational and superficial features when they make analogies. Whether they use superficial or relational features depends on their goals. If their goal is to fix a problem in an experiment, their analogies are based upon superficial features. However, if their goal is to formulate hypotheses, they focus on analogies based upon sets of relations. One important difference between scientists and participants in experiments is that the scientists have deep relational knowledge of the processes that they are investigating and can hence use this relational knowledge to make analogies (see Holyoak, Chapter 13 for a thorough review of analogical reasoning).

Are scientific analogies always useful? Sometimes analogies can lead scientists and students astray. For example, Evelyn Fox Keller (1985) shows how an analogy between the pulsing of a lighthouse and the activity of the slime mold Dictyostelium led researchers astray for a number of years. Likewise, the analogy between the solar system (the source) and the structure of the atom (the target) has been shown to be potentially misleading to students taking more advanced courses in physics or chemistry. The solar system analogy has a number of misalignments to the structure of the atom, such as electrons being repelled by each other rather than attracted; moreover, electrons do not have individual orbits like planets but have clouds of electron density. Furthermore, students have serious misconceptions about the nature of the solar system, which can compound their misunderstanding of the nature of the atom (Fischler & Lichtfeld, 1992). While analogy is a powerful tool in science, like all forms of induction it can lead to incorrect conclusions.

Conceptual Change in Science

Scientific knowledge continually accumulates as scientists gather evidence about the natural world. Over extended time, this knowledge accumulation leads to major revisions, extensions, and new organizational forms for expressing what is known about nature. Indeed, these changes are so substantial that philosophers of science speak of “revolutions” in a variety of scientific domains (Kuhn, 1962 ). The psychological literature that explores the idea of revolutionary conceptual change can be roughly divided into (a) investigations of how scientists actually make discoveries and integrate those discoveries into existing scientific contexts, and (b) investigations of nonscientists ranging from infants, to children, to students in science classes. In this section we summarize the adult studies of conceptual change, and in the next section we look at its developmental aspects.

Scientific concepts, like all concepts, can be characterized as containing a variety of “knowledge elements”: representations of words, thoughts, actions, objects, and processes. At certain points in the history of science, the accumulated evidence has demanded major shifts in the way these collections of knowledge elements are organized. This “radical conceptual change” process (see Keil, 1999 ; Nersessian 1998 , 2002 ; Thagard, 1992 ; Vosniadou 1998, for reviews) requires the formation of a new conceptual system that organizes knowledge in new ways, adds new knowledge, and results in a very different conceptual structure. For more recent research on conceptual change, The International Handbook of Research on Conceptual Change (Vosniadou, 2008 ) provides a detailed compendium of theories and controversies within the field.

While conceptual change in science is usually characterized by large-scale changes in concepts that occur over extensive periods of time, it has been possible to observe conceptual change using in vivo methodologies. Dunbar (1995) reported a major conceptual shift in a group of immunologists: a series of unexpected findings forced the scientists to propose a new concept in immunology, which in turn forced changes in other concepts. The drive behind this conceptual change was the discovery of a series of different unexpected findings or anomalies that required the scientists to both revise and reorganize their conceptual knowledge. Interestingly, this conceptual change was achieved by a group of scientists reasoning collaboratively, rather than by a scientist working alone. Different scientists tend to work on different aspects of concepts, and also on different concepts, which when put together lead to a rapid change in entire conceptual structures.

Overall, accounts of conceptual change in individuals indicate that it is indeed similar to conceptual change in entire scientific fields. Individuals need to be confronted with anomalies that their preexisting theories cannot explain before entire conceptual structures are overthrown. However, replacement conceptual structures have to be generated before the old conceptual structure can be discarded. Sometimes people do not overthrow their original conceptual theories and maintain their original views of many fundamental scientific concepts throughout their lives. Whether people actively possess naive theories, or whether they appear to have a naive theory because of the demand characteristics of the testing context, is a lively source of debate within the science education community (see Gupta, Hammer, & Redish, 2010).

Scientific Thinking in Children

Well before their first birthday, children appear to know several fundamental facts about the physical world. For example, studies with infants show that they behave as if they understand that solid objects endure over time (e.g., they don't just disappear and reappear, they cannot move through each other, and they move as a result of collisions with other solid objects or the force of gravity) (Baillargeon, 2004; Carey, 1985; Cohen & Cashon, 2006; Duschl, Schweingruber, & Shouse, 2007; Gelman & Baillargeon, 1983; Gelman & Kalish, 2006; Mandler, 2004; Metz, 1995; Munakata, Casey, & Diamond, 2004). And even 6-month-olds are able to predict the future location of a moving object that they are attempting to grasp (Von Hofsten, 1980; Von Hofsten, Feng, & Spelke, 2000). In addition, they appear to be able to make nontrivial inferences about causes and their effects (Gopnik et al., 2004).

The similarities between children's thinking and scientists' thinking have an inherent allure and an internal contradiction. The allure resides in the enthusiastic wonder and openness with which both children and scientists approach the world around them. The paradox comes from the fact that different investigators of children's thinking have reached diametrically opposing conclusions about just how “scientific” children's thinking really is. Some claim support for the “child as a scientist” position (Brewer & Samarapungavan, 1991 ; Gelman & Wellman, 1991 ; Gopnik, Meltzoff, & Kuhl, 1999 ; Karmiloff-Smith 1988 ; Sodian, Zaitchik, & Carey, 1991 ; Samarapungavan 1992 ), while others offer serious challenges to the view (Fay & Klahr, 1996 ; Kern, Mirels, & Hinshaw, 1983 ; Kuhn, Amsel, & O'Laughlin, 1988 ; Schauble & Glaser, 1990 ; Siegler & Liebert, 1975 .) Such fundamentally incommensurate conclusions suggest that this very field—children's scientific thinking—is ripe for a conceptual revolution!

A recent comprehensive review (Duschl, Schweingruber, & Shouse, 2007 ) of what children bring to their science classes offers the following concise summary of the extensive developmental and educational research literature on children's scientific thinking:

Children entering school already have substantial knowledge of the natural world, much of which is implicit.

What children are capable of at a particular age is the result of a complex interplay among maturation, experience, and instruction. What is developmentally appropriate is not a simple function of age or grade, but rather is largely contingent on children's prior opportunities to learn.

Students' knowledge and experience play a critical role in their science learning, influencing four aspects of science understanding: (a) knowing, using, and interpreting scientific explanations of the natural world; (b) generating and evaluating scientific evidence and explanations; (c) understanding how scientific knowledge is developed in the scientific community; and (d) participating in scientific practices and discourse.

Students learn science by actively engaging in the practices of science.

In the previous section of this chapter we discussed conceptual change with respect to scientific fields and undergraduate science students. However, the idea that children undergo radical conceptual change, in which old “theories” need to be overthrown and reorganized, has been a central topic in understanding changes in scientific thinking both in children and across the life span. This radical conceptual change is thought to be necessary for acquiring many new concepts in physics and is regarded as the major source of difficulty for students. The factors that are at the root of this conceptual shift view have been difficult to determine, although there have been a number of studies in cognitive development (Carey, 1985; Chi, 1992; Chi & Roscoe, 2002), in the history of science (Thagard, 1992), and in physics education (Clement, 1982; Mestre, 1991) that give detailed accounts of the changes in knowledge representation that occur while people switch from one way of representing scientific knowledge to another.

One area where students show great difficulty in understanding scientific concepts is physics. Analyses of students' changing conceptions, using interviews, verbal protocols, and behavioral outcome measures, indicate that large-scale changes in students' concepts occur in physics education (see McDermott & Redish, 1999 , for a review of this literature). Following Kuhn ( 1962 ), many researchers, but not all, have noted that students' changing conceptions resemble the sequences of conceptual changes in physics that have occurred in the history of science. These notions of radical paradigm shifts and ensuing incompatibility with past knowledge-states have called attention to interesting parallels between the development of particular scientific concepts in children and in the history of physics. Investigations of nonphysicists' understanding of motion indicate that students have extensive misunderstandings of motion. Some researchers have interpreted these findings as an indication that many people hold erroneous beliefs about motion similar to a medieval “impetus” theory (McCloskey, Caramazza, & Green, 1980 ). Furthermore, students appear to maintain “impetus” notions even after one or two courses in physics. In fact, some authors have noted that students who have taken one or two courses in physics can perform worse on physics problems than naive students (Mestre, 1991 ). Thus, it is only after extensive learning that we see a conceptual shift from impetus theories of motion to Newtonian scientific theories.

How one's conceptual representation shifts from “naive” to Newtonian is a matter of contention, as some have argued that the shift involves a radical conceptual change, whereas others have argued that the conceptual change is not really complete. For example, Kozhevnikov and Hegarty ( 2001 ) argue that much of the naive impetus notions of motion are maintained at the expense of Newtonian principles even with extensive training in physics. However, they argue that such impetus principles are maintained at an implicit level. Thus, although students can give the correct Newtonian answer to problems, their reaction times to respond indicate that they are also using impetus theories when they respond. An alternative view of conceptual change focuses on whether there are real conceptual changes at all. Gupta, Hammer and Redish ( 2010 ) and Disessa ( 2004 ) have conducted detailed investigations of changes in physics students' accounts of phenomena covered in elementary physics courses. They have found that rather than students possessing a naive theory that is replaced by the standard theory, many introductory physics students have no stable physical theory but rather construct their explanations from elementary pieces of knowledge of the physical world.

Computational Approaches to Scientific Thinking

Computational approaches have provided a more complete account of the scientific mind. Computational models provide specific detailed accounts of the cognitive processes underlying scientific thinking. Early computational work consisted of taking a scientific discovery and building computational models of the reasoning processes involved in the discovery. Langley, Simon, Bradshaw, and Zytkow ( 1987 ) built a series of programs that simulated discoveries such as those of Copernicus, Bacon, and Stahl. These programs had various inductive reasoning algorithms built into them, and when given the data that the scientists used, they were able to propose the same rules. Computational models make it possible to propose detailed models of the cognitive subcomponents of scientific thinking that specify exactly how scientific theories are generated, tested, and amended (see Darden, 1997 , and Shrager & Langley, 1990 , for accounts of this branch of research). More recently, the incorporation of scientific knowledge into computer programs has resulted in a shift in emphasis from using programs to simulate discoveries to building programs that are used to help scientists make discoveries. A number of these computer programs have made novel discoveries. For example, Valdes-Perez ( 1994 ) has built systems for discoveries in chemistry, and Fajtlowicz has done this in mathematics (Erdos, Fajtlowicz, & Staton, 1991 ).
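
The flavor of such law-induction programs can be suggested with a short sketch: given modern values for the planets' mean distances and orbital periods, a simplified search over power-law relations recovers Kepler's third law. The search procedure here is illustrative, not the algorithm Langley et al. used:

```python
# Orbital data for six planets: mean distance r (AU), period T (years).
PLANETS = [
    (0.387, 0.241), (0.723, 0.615), (1.000, 1.000),
    (1.524, 1.881), (5.203, 11.86), (9.537, 29.46),
]

def nearly_constant(values, tol=0.01):
    """True if the values vary by less than tol relative to their minimum."""
    return (max(values) - min(values)) / min(values) < tol

# Search simple power-law relations T^p / r^q for one that is invariant.
for p in (1, 2, 3):
    for q in (1, 2, 3):
        ratios = [t ** p / r ** q for r, t in PLANETS]
        if nearly_constant(ratios):
            print(f"T^{p} / r^{q} is constant")   # -> T^2 / r^3: Kepler's 3rd law
```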

These advances in computational discovery have led to new fields, conferences, journals, and even departments that specialize in the development of programs devised to search large databases in the hope of making new scientific discoveries (Langley, 2000, 2002). This process is commonly known as “data mining.” The approach has only proved viable relatively recently, due to advances in computer technology. Biswal et al. (2010), Mitchell (2009), and Yang (2009) provide recent reviews of data mining in different scientific fields. Data mining is at the core of drug discovery, our understanding of the human genome, and our understanding of the universe for a number of reasons. First, vast databases concerning drug actions, biological processes, the genome, the proteome, and the universe itself now exist. Second, the development of high-throughput data-mining algorithms makes it possible to search for new drug targets, novel biological mechanisms, and new astronomical phenomena in relatively short periods of time. Research programs that once took decades, such as the development of penicillin, can now be completed in days (Yang, 2009).

Another recent shift in the use of computers in scientific discovery has been to have computers and people make discoveries together, rather than expecting computers to make an entire scientific discovery on their own. Instead of using computers to mimic the entire discovery process as carried out by humans, powerful algorithms can search for patterns in large databases and hand those patterns to humans, who then use the output to make discoveries, ranging from the human genome to the structure of the universe. However, there are some robots, such as ADAM, developed by King (2011), that can perform the entire scientific process, from the generation of hypotheses, to the conduct of experiments and the interpretation of results, with little human intervention. The ongoing development of scientific robots (King et al., 2009) thus continues the tradition started by Herbert Simon in the 1960s, though controversies as to whether such a robot is a “real scientist” continue to the present (Evans & Rzhetsky, 2010; Gianfelici, 2010; Haufe, Elliott, Burian, & O'Malley, 2010; O'Malley, 2011).

Scientific Thinking and Science Education

Accounts of the nature of science and research on scientific thinking have had profound effects on science education at many levels, particularly in recent years. Science education from the 1900s until the 1970s was primarily concerned with teaching students both the content of science (such as Newton's laws of motion) and the methods that scientists need to use in their research (such as using experimental and control groups). Beginning in the 1980s, a number of reports (e.g., American Association for the Advancement of Science, 1993; National Commission on Excellence in Education, 1983; Rutherford & Ahlgren, 1991) stressed the need for teaching scientific thinking skills rather than just methods and content. The addition of scientific thinking skills to the science curriculum from kindergarten through adulthood was a major shift in focus. Many of the particular scientific thinking skills that have been emphasized are skills covered in previous sections of this chapter, such as deductive and inductive thinking strategies. However, rather than focusing on one particular skill, such as induction, researchers in education have focused on how the different components of scientific thinking are put together in science. Furthermore, science educators have focused on situations where science is conducted collaboratively, rather than being the product of one person thinking alone. These changes in science education parallel changes in the methodologies used to investigate science, such as analyzing the ways that scientists think and reason in their laboratories.

By looking at science as a complex, multilayered, group activity, many researchers in science education have adopted a constructivist approach. This approach sees learning as an active rather than a passive process, and it suggests that students learn through constructing their own scientific knowledge. We will first describe a few examples of the constructivist approach to science education. Following that, we will address several lines of work that challenge some of its assumptions.

Often the goal of constructivist science education is to produce conceptual change through guided instruction, where the teacher or professor acts as a guide to discovery rather than the keeper of all the facts. One recent and influential approach to science education is inquiry-based learning, which focuses on posing a problem or a puzzling event to students and asking them to propose a hypothesis that could explain it. Next, students are asked to collect data that test the hypothesis, draw conclusions, and then reflect upon both the original problem and the thought processes they used to solve it. Often students use computers that aid in their construction of new knowledge. The computers allow students to learn many of the different components of scientific thinking. For example, Reiser and his colleagues have developed a learning environment for biology in which students are encouraged to develop hypotheses in groups, codify the hypotheses, and search databases to test them (Reiser et al., 2001).
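In miniature, "codifying a hypothesis and searching a database to test it" might look like the sketch below. The finch records and field names are invented for illustration and are not drawn from Reiser et al.'s actual software.

```python
# Invented classroom "database" of finch observations; values are hypothetical.
finches = [
    {"beak_depth_mm": 9.8, "survived_drought": True},
    {"beak_depth_mm": 8.1, "survived_drought": False},
    {"beak_depth_mm": 10.4, "survived_drought": True},
    {"beak_depth_mm": 7.9, "survived_drought": False},
]

def hypothesis(bird):
    """A codified hypothesis: birds with deeper beaks survived the drought."""
    return bird["beak_depth_mm"] > 9.0

# Test the hypothesis: how many survival records does it classify correctly?
correct = sum(hypothesis(b) == b["survived_drought"] for b in finches)
print(f"Hypothesis matched {correct} of {len(finches)} records")
```

Turning a verbal claim into an explicit, testable predicate like this is precisely the step that lets students argue about evidence rather than opinion.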

One of the myths of science is the lone scientist suddenly shouting "Eureka, I have made a discovery!" Instead, in vivo studies of scientists (e.g., Dunbar, 1995, 2002), historical analyses of scientific discoveries (Nersessian, 1999), and studies of children learning science at museums have all pointed to collaborative discovery mechanisms as one of the driving forces of science (Atkins et al., 2009; Azmitia & Crowley, 2001). What typically happens during collaborative scientific thinking is that there is a triggering event, such as an unexpected result or a situation that a student does not understand. Other members of the group then add new information to the person's representation of knowledge, often contributing new inductions and deductions that both challenge and transform the reasoner's old representations (Chi & Roscoe, 2002; Dunbar, 1998). Social mechanisms thus play a key role in fostering conceptual change; they have been ignored in traditional cognitive research but are crucial for both science and science education. In science education there has been a shift to collaborative learning, particularly at the elementary level; in university education, however, the emphasis is still on the individual scientist. As many domains of science now involve collaborations across scientific disciplines, we expect the explicit teaching of heuristics for collaborative science to increase.

What is the best way to teach and learn science? Surprisingly, the answer to this question has been difficult to uncover. For example, toward the end of the last century, influenced by several thinkers who advocated a constructivist approach to learning, ranging from Piaget (Beilin, 1994) to Papert (1980), many schools answered this question by adopting a philosophy dubbed "discovery learning." Although a clear operational definition of this approach has yet to be articulated, the general idea is that children are expected to learn science by reconstructing the processes of scientific discovery—in a range of areas from computer programming to chemistry to mathematics. The premise is that letting students discover principles on their own, set their own goals, and collaboratively explore the natural world produces deeper knowledge that transfers widely.

The research literature on science education is far from consistent in its use of terminology. However, our reading suggests that "discovery learning" differs from "inquiry-based learning" in that few, if any, guidelines are given to students in discovery learning contexts, whereas in inquiry learning, students are given hypotheses and specific goals to achieve (see the second paragraph of this section for a definition of inquiry-based learning). Even though thousands of schools have adopted discovery learning as an alternative to more didactic approaches to teaching and learning, the evidence showing that it is more effective than traditional, direct, teacher-controlled instructional approaches is mixed, at best (Lorch et al., 2010; Minner, Levy, & Century, 2010). In several cases where the distinctions between direct instruction and more open-ended constructivist instruction have been clearly articulated, implemented, and assessed, direct instruction has proven to be superior to the alternatives (Chen & Klahr, 1999; Toth, Klahr, & Chen, 2000). For example, in a study of third- and fourth-grade children learning about experimental design, Klahr and Nigam (2004) found that many more children learned from direct instruction than from discovery learning. Furthermore, they found that among the few children who did manage to learn from a discovery method, there was no better performance on a far transfer test of scientific reasoning than that observed for the many children who learned from direct instruction.

The idea of children learning most of their science through a process of self-directed discovery has some romantic appeal, and it may accurately describe the personal experience of a handful of world-class scientists. However, the claim has generated some contentious disagreements (Kirschner, Sweller, & Clark, 2006; Klahr, 2010; Taber, 2009; Tobias & Duffy, 2009), and the jury remains out on the extent to which most children can learn science that way.

Conclusions and Future Directions

The field of scientific thinking is now a thriving area of research with strong underpinnings in cognitive psychology and cognitive science. In recent years, a new professional society has been formed that aims to facilitate this integrative and interdisciplinary approach to the psychology of science, with its own journal and regular professional meetings.¹ Clearly, these different aspects of scientific thinking need to be combined in order to produce a truly comprehensive picture of the scientific mind.

While much is known about certain aspects of scientific thinking, much more remains to be discovered. In particular, there has been little contact between cognitive, neuroscientific, social, personality, and motivational accounts of scientific thinking. Research in thinking and reasoning has expanded to use the methods and theories of cognitive neuroscience (see Morrison & Knowlton, Chapter 6), and a similar approach can be taken in exploring scientific thinking (see Dunbar et al., 2007). There are two main reasons for taking a neuroscience approach to scientific thinking. First, functional neuroimaging allows the researcher to look at the entire human brain, making it possible to see the many different sites involved in scientific thinking and to gain a more complete understanding of the full range of mechanisms underlying this type of thought. Second, brain-imaging approaches allow researchers to address fundamental questions in research on scientific thinking, such as the extent to which ordinary thinking in nonscientific contexts and scientific thinking recruit similar versus disparate neural structures of the brain.

Dunbar (2009) has used some novel methods to explore Simon's assertion, cited at the beginning of this chapter, that scientific thinking uses the same cognitive mechanisms that all human beings possess (rather than being an entirely different type of thinking) but combines them in ways that are specific to a particular aspect or discipline of science. For example, Fugelsang and Dunbar (2009) compared causal reasoning when two colliding circular objects were labeled either balls or subatomic particles, and obtained different brain activation patterns depending on the label. In another series of experiments, Dunbar and colleagues used functional magnetic resonance imaging (fMRI) to study patterns of activation in the brains of students who have and who have not undergone conceptual change in physics. For example, Fugelsang and Dunbar (2005) and Dunbar et al. (2007) found differences in the activation of specific brain sites (such as the anterior cingulate) when students encounter evidence that is inconsistent with their current conceptual understandings. These initial cognitive neuroscience investigations have the potential to reveal how knowledge is organized in the scientific brain and to provide detailed accounts of the representation of scientific knowledge. Petitto and Dunbar (2004) proposed the term "educational neuroscience" for the integration of research on education, including science education, with research on neuroscience. However, see Fitzpatrick (in press) for a very different perspective on whether neuroscience approaches are relevant to education. Clearly, research on the scientific brain is just beginning. We are beginning to get a reasonable grasp of the inner workings of the subcomponents of the scientific mind (i.e., problem solving, analogy, induction). However, great advances remain to be made concerning how these processes interact so that scientific discoveries can be made. Future research will focus on both the collaborative aspects of scientific thinking and the neural underpinnings of the scientific mind.

¹ The International Society for the Psychology of Science and Technology (ISPST). Available at http://www.ispstonline.org/

Ahn, W., Kalish, C. W., Medin, D. L., & Gelman, S. A. ( 1995 ). The role of covariation versus mechanism information in causal attribution.   Cognition , 54 , 299–352.

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

Atkins, L. J., Velez, L., Goudy, D., & Dunbar, K. N. ( 2009 ). The unintended effects of interactive objects and labels in the science museum.   Science Education , 54 , 161–184.

Azmitia, M. A., & Crowley, K. ( 2001 ). The rhythms of scientific thinking: A study of collaboration in an earthquake microworld. In K. Crowley, C. Schunn, & T. Okada (Eds.), Designing for science: Implications from everyday, classroom, and professional settings (pp. 45–72). Mahwah, NJ: Erlbaum.

Bacon, F. (1620/1854). Novum organum (B. Montague, Trans.). Philadelphia, PA: Parry & McMillan.

Baillargeon, R. ( 2004 ). Infants' reasoning about hidden objects: Evidence for event-general and event-specific expectations (article with peer commentaries and response, listed below).   Developmental Science , 54 , 391–424.

Baker, L. M., & Dunbar, K. ( 2000 ). Experimental design heuristics for scientific discovery: The use of baseline and known controls.   International Journal of Human Computer Studies , 54 , 335–349.

Beilin, H. (1994). Jean Piaget's enduring contribution to developmental psychology. In R. D. Parke, P. A. Ornstein, J. J. Rieser, & C. Zahn-Waxler (Eds.), A century of developmental psychology (pp. 257–290). Washington, DC: American Psychological Association.

Biswal, B. B., Mennes, M., Zuo, X.-N., Gohel, S., Kelly, C., Smith, S.M., et al. ( 2010 ). Toward discovery science of human brain function.   Proceedings of the National Academy of Sciences of the United States of America , 107, 4734–4739.

Brewer, W. F., & Samarapungavan, A. ( 1991 ). Children's theories vs. scientific theories: Differences in reasoning or differences in knowledge? In R. R. Hoffman & D. S. Palermo (Eds.), Cognition and the symbolic processes: Applied and ecological perspectives (pp. 209–232). Hillsdale, NJ: Erlbaum.

Bruner, J. S., Goodnow, J. J., & Austin, G. A. (1956). A study of thinking. New York, NY: Science Editions.

Carey, S. ( 1985 ). Conceptual change in childhood . Cambridge, MA: MIT Press.

Carruthers, P., Stich, S., & Siegal, M. ( 2002 ). The cognitive basis of science . New York: Cambridge University Press.

Chi, M. ( 1992 ). Conceptual change within and across ontological categories: Examples from learning and discovery in science. In R. Giere (Ed.), Cognitive models of science (pp. 129–186). Minneapolis: University of Minnesota Press.

Chi, M. T. H., & Roscoe, R. D. (2002). The processes and challenges of conceptual change. In M. Limon & L. Mason (Eds.), Reconsidering conceptual change: Issues in theory and practice (pp. 3–27). Amsterdam, Netherlands: Kluwer Academic Publishers.

Chen, Z., & Klahr, D. ( 1999 ). All other things being equal: Children's acquisition of the control of variables strategy.   Child Development , 54 (5), 1098–1120.

Clement, J. ( 1982 ). Students' preconceptions in introductory mechanics.   American Journal of Physics , 54 , 66–71.

Cohen, L. B., & Cashon, C. H. ( 2006 ). Infant cognition. In W. Damon & R. M. Lerner (Series Eds.) & D. Kuhn & R. S. Siegler (Vol. Eds.), Handbook of child psychology. Vol. 2: Cognition, perception, and language (6th ed., pp. 214–251). New York: Wiley.

National Commission on Excellence in Education. ( 1983 ). A nation at risk: The imperative for educational reform . Washington, DC: US Department of Education.

Crick, F. H. C. ( 1988 ). What mad pursuit: A personal view of science . New York: Basic Books.

Darden, L. ( 2002 ). Strategies for discovering mechanisms: Schema instantiation, modular subassembly, forward chaining/backtracking.   Philosophy of Science , 69, S354–S365.

Davenport, J. L., Yaron, D., Klahr, D., & Koedinger, K. ( 2008 ). Development of conceptual understanding and problem solving expertise in chemistry. In B. C. Love, K. McRae, & V. M. Sloutsky (Eds.), Proceedings of the 30th Annual Conference of the Cognitive Science Society (pp. 751–756). Austin, TX: Cognitive Science Society.

diSessa, A. A. (2004). Contextuality and coordination in conceptual change. In E. Redish & M. Vicentini (Eds.), Proceedings of the International School of Physics "Enrico Fermi": Research on physics education (pp. 137–156). Amsterdam, Netherlands: IOS Press/Italian Physics Society.

Dunbar, K. ( 1995 ). How scientists really reason: Scientific reasoning in real-world laboratories. In R. J. Sternberg, & J. Davidson (Eds.), Mechanisms of insight (pp. 365–395). Cambridge, MA: MIT press.

Dunbar, K. ( 1997 ). How scientists think: Online creativity and conceptual change in science. In T. B. Ward, S. M. Smith, & S. Vaid (Eds.), Conceptual structures and processes: Emergence, discovery and change (pp. 461–494). Washington, DC: American Psychological Association.

Dunbar, K. ( 1998 ). Problem solving. In W. Bechtel & G. Graham (Eds.), A companion to cognitive science (pp. 289–298). London: Blackwell

Dunbar, K. (1999). The scientist InVivo: How scientists think and reason in the laboratory. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 85–100). New York: Plenum.

Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, & B. Kokinov (Eds.), The analogical mind: Perspectives from cognitive science (pp. 313–334). Cambridge, MA: MIT Press.

Dunbar, K. (2002). Science as category: Implications of InVivo science for theories of cognitive development, scientific discovery, and the nature of science. In P. Carruthers, S. Stich, & M. Siegal (Eds.), The cognitive basis of science (pp. 154–170). New York: Cambridge University Press.

Dunbar, K. ( 2009 ). The biology of physics: What the brain reveals about our physical understanding of the world. In M. Sabella, C. Henderson, & C. Singh. (Eds.), Proceedings of the Physics Education Research Conference (pp. 15–18). Melville, NY: American Institute of Physics.

Dunbar, K., & Fugelsang, J. (2004). Causal thinking in science: How scientists and students interpret the unexpected. In M. E. Gorman, A. Kincannon, D. Gooding, & R. D. Tweney (Eds.), New directions in scientific and technical thinking (pp. 57–59). Mahwah, NJ: Erlbaum.

Dunbar, K., Fugelsang, J., & Stein, C. (2007). Do naïve theories ever go away? In M. Lovett & P. Shah (Eds.), Thinking with data: 33rd Carnegie Symposium on Cognition (pp. 193–206). Mahwah, NJ: Erlbaum.

Dunbar, K., & Sussman, D. ( 1995 ). Toward a cognitive account of frontal lobe function: Simulating frontal lobe deficits in normal subjects.   Annals of the New York Academy of Sciences , 54 , 289–304.

Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). ( 2007 ). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academies Press.

Einstein, A. ( 1950 ). Out of my later years . New York: Philosophical Library

Erdos, P., Fajtlowicz, S., & Staton, W. (1991). Degree sequences in triangle-free graphs. Discrete Mathematics, 54(91), 85–88.

Evans, J., & Rzhetsky, A. ( 2010 ). Machine science.   Science , 54 , 399–400.

Fay, A., & Klahr, D. ( 1996 ). Knowing about guessing and guessing about knowing: Preschoolers' understanding of indeterminacy.   Child Development , 54 , 689–716.

Fischler, H., & Lichtfeldt, M. (1992). Modern physics and students' conceptions. International Journal of Science Education, 54, 181–190.

Fitzpatrick, S. M. (in press). Functional brain imaging: Neuro-turn or wrong turn? In M. M. Littlefield & J. M. Johnson (Eds.), The neuroscientific turn: Transdisciplinarity in the age of the brain. Ann Arbor: University of Michigan Press.

Fox-Keller, E. ( 1985 ). Reflections on gender and science . New Haven, CT: Yale University Press.

Fugelsang, J., & Dunbar, K. ( 2005 ). Brain-based mechanisms underlying complex causal thinking.   Neuropsychologia , 54 , 1204–1213.

Fugelsang, J., & Dunbar, K. ( 2009 ). Brain-based mechanisms underlying causal reasoning. In E. Kraft (Ed.), Neural correlates of thinking (pp. 269–279). Berlin, Germany: Springer

Fugelsang, J., Stein, C., Green, A., & Dunbar, K. ( 2004 ). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory.   Canadian Journal of Experimental Psychology , 54 , 132–141

Galilei, G. ( 1638 /1991). Dialogues concerning two new sciences (A. de Salvio & H. Crew, Trans.). Amherst, NY: Prometheus Books.

Galison, P. ( 2003 ). Einstein's clocks, Poincaré's maps: Empires of time . New York: W. W. Norton.

Gelman, R., & Baillargeon, R. ( 1983 ). A review of Piagetian concepts. In P. H. Mussen (Series Ed.) & J. H. Flavell & E. M. Markman (Vol. Eds.), Handbook of child psychology (4th ed., Vol. 3, pp. 167–230). New York: Wiley.

Gelman, S. A., & Kalish, C. W. ( 2006 ). Conceptual development. In D. Kuhn & R. Siegler (Eds.), Handbook of child psychology. Vol. 2: Cognition, perception and language (pp. 687–733). New York: Wiley.

Gelman, S., & Wellman, H. ( 1991 ). Insides and essences.   Cognition , 54 , 214–244.

Gentner, D. ( 2010 ). Bootstrapping the mind: Analogical processes and symbol systems.   Cognitive Science , 54 , 752–775.

Gentner, D., Brem, S., Ferguson, R. W., Markman, A. B., Levidow, B. B., Wolff, P., & Forbus, K. D. ( 1997 ). Analogical reasoning and conceptual change: A case study of Johannes Kepler.   The Journal of the Learning Sciences , 54 (1), 3–40.

Gentner, D., Holyoak, K. J., & Kokinov, B. ( 2001 ). The analogical mind: Perspectives from cognitive science . Cambridge, MA: MIT Press.

Gentner, D., & Jeziorski, M. ( 1993 ). The shift from metaphor to analogy in western science. In A. Ortony (Ed.), Metaphor and thought (2nd ed., pp. 447–480). Cambridge, England: Cambridge University Press.

Gianfelici, F. ( 2010 ). Machine science: Truly machine-aided science.   Science , 54 , 317–319.

Giere, R. ( 1993 ). Cognitive models of science . Minneapolis: University of Minnesota Press.

Gopnik, A. N., Meltzoff, A. N., & Kuhl, P. K. ( 1999 ). The scientist in the crib: Minds, brains and how children learn . New York: Harper Collins

Gorman, M. E. ( 1989 ). Error, falsification and scientific inference: An experimental investigation.   Quarterly Journal of Experimental Psychology: Human Experimental Psychology , 41A , 385–412

Gorman, M. E., Kincannon, A., Gooding, D., & Tweney, R. D. ( 2004 ). New directions in scientific and technical thinking . Mahwah, NJ: Erlbaum.

Gupta, A., Hammer, D., & Redish, E. F. ( 2010 ). The case for dynamic models of learners' ontologies in physics.   Journal of the Learning Sciences , 54 (3), 285–321.

Haufe, C., Elliott, K. C., Burian, R., & O'Malley, M. A. ( 2010 ). Machine science: What's missing.   Science , 54 , 318–320.

Hecht, E. ( 2011 ). On defining mass.   The Physics Teacher , 54 , 40–43.

Heit, E. ( 2000 ). Properties of inductive reasoning.   Psychonomic Bulletin and Review , 54 , 569–592.

Holyoak, K. J., & Thagard, P. ( 1995 ). Mental leaps . Cambridge, MA: MIT Press.

Karmiloff-Smith, A. ( 1988 ) The child is a theoretician, not an inductivist.   Mind and Language , 54 , 183–195.

Keil, F. C. ( 1999 ). Conceptual change. In R. Wilson & F. Keil (Eds.), The MIT encyclopedia of cognitive science . (pp. 179–182) Cambridge, MA: MIT press.

Kern, L. H., Mirels, H. L., & Hinshaw, V. G. ( 1983 ). Scientists' understanding of propositional logic: An experimental investigation.   Social Studies of Science , 54 , 131–146.

King, R. D. ( 2011 ). Rise of the robo scientists.   Scientific American , 54 (1), 73–77.

King, R. D., Rowland, J., Oliver, S. G., Young, M., Aubrey, W., Byrne, E., et al. ( 2009 ). The automation of science.   Science , 54 , 85–89.

Kirschner, P. A., Sweller, J., & Clark, R. ( 2006 ) Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching.   Educational Psychologist , 54 , 75–86

Klahr, D. ( 2000 ). Exploring science: The cognition and development of discovery processes . Cambridge, MA: MIT Press.

Klahr, D. ( 2010 ). Coming up for air: But is it oxygen or phlogiston? A response to Taber's review of constructivist instruction: Success or failure?   Education Review , 54 (13), 1–6.

Klahr, D., & Dunbar, K. ( 1988 ). Dual space search during scientific reasoning.   Cognitive Science , 54 , 1–48.

Klahr, D., & Nigam, M. ( 2004 ). The equivalence of learning paths in early science instruction: effects of direct instruction and discovery learning.   Psychological Science , 54 (10), 661–667.

Klahr, D., & Masnick, A. M. (2002). Explaining, but not discovering, abduction. Review of L. Magnani (2001), Abduction, reason, and science: Processes of discovery and explanation. Contemporary Psychology, 47, 740–741.

Klahr, D., & Simon, H. ( 1999 ). Studies of scientific discovery: Complementary approaches and convergent findings.   Psychological Bulletin , 54 , 524–543.

Klayman, J., & Ha, Y. ( 1987 ). Confirmation, disconfirmation, and information in hypothesis testing.   Psychological Review , 54 , 211–228.

Kozhevnikov, M., & Hegarty, M. ( 2001 ). Impetus beliefs as default heuristic: Dissociation between explicit and implicit knowledge about motion.   Psychonomic Bulletin and Review , 54 , 439–453.

Kuhn, T. ( 1962 ). The structure of scientific revolutions . Chicago, IL: University of Chicago Press.

Kuhn, D., Amsel, E., & O'Laughlin, M. ( 1988 ). The development of scientific thinking skills . Orlando, FL: Academic Press.

Kulkarni, D., & Simon, H. A. ( 1988 ). The processes of scientific discovery: The strategy of experimentation.   Cognitive Science , 54 , 139–176.

Langley, P. ( 2000 ). Computational support of scientific discovery.   International Journal of Human-Computer Studies , 54 , 393–410.

Langley, P. ( 2002 ). Lessons for the computational discovery of scientific knowledge. In Proceedings of the First International Workshop on Data Mining Lessons Learned (pp. 9–12).

Langley, P., Simon, H. A., Bradshaw, G. L., & Zytkow, J. M. ( 1987 ). Scientific discovery: Computational explorations of the creative processes . Cambridge, MA: MIT Press.

Lorch, R. F., Jr., Lorch, E. P., Calderhead, W. J., Dunlap, E. E., Hodell, E. C., & Freer, B. D. ( 2010 ). Learning the control of variables strategy in higher and lower achieving classrooms: Contributions of explicit instruction and experimentation.   Journal of Educational Psychology , 54 (1), 90–101.

Magnani, L., Carnielli, W., & Pizzi, C., (Eds.) ( 2010 ). Model-based reasoning in science and technology: Abduction, logic,and computational discovery. Series Studies in Computational Intelligence (Vol. 314). Heidelberg/Berlin: Springer.

Mandler, J.M. ( 2004 ). The foundations of mind: Origins of conceptual thought . Oxford, England: Oxford University Press.

Macpherson, R., & Stanovich, K. E. ( 2007 ). Cognitive ability, thinking dispositions, and instructional set as predictors of critical thinking.   Learning and Individual Differences , 54 , 115–127.

McCloskey, M., Caramazza, A., & Green, B. ( 1980 ). Curvilinear motion in the absence of external forces: Naive beliefs about the motion of objects.   Science , 54 , 1139–1141.

McDermott, L. C., & Redish, E. F. (1999). Resource letter on physics education research. American Journal of Physics, 54, 755.

Mestre, J. P. ( 1991 ). Learning and instruction in pre-college physical science.   Physics Today , 54 , 56–62.

Metz, K. E. ( 1995 ). Reassessment of developmental constraints on children's science instruction.   Review of Educational Research , 54 (2), 93–127.

Minner, D. D., Levy, A. J., & Century, J. ( 2010 ). Inquiry-based science instruction—what is it and does it matter? Results from a research synthesis years 1984 to 2002.   Journal of Research in Science Teaching , 54 (4), 474–496.

Mitchell, T. M. ( 2009 ). Mining our reality.   Science , 54 , 1644–1645.

Mitroff, I. ( 1974 ). The subjective side of science . Amsterdam, Netherlands: Elsevier.

Munakata, Y., Casey, B. J., & Diamond, A. ( 2004 ). Developmental cognitive neuroscience: Progress and potential.   Trends in Cognitive Sciences , 54 , 122–128.

Mynatt, C. R., Doherty, M. E., & Tweney, R. D. ( 1977 ) Confirmation bias in a simulated research environment: An experimental study of scientific inference.   Quarterly Journal of Experimental Psychology , 54 , 89–95.

Nersessian, N. ( 1998 ). Conceptual change. In W. Bechtel, & G. Graham (Eds.), A companion to cognitive science (pp. 157–166). London, England: Blackwell.

Nersessian, N. ( 1999 ). Models, mental models, and representations: Model-based reasoning in conceptual change. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 5–22). New York: Plenum.

Nersessian, N. J. ( 2002 ). The cognitive basis of model-based reasoning in science In. P. Carruthers, S. Stich, & M. Siegal (Eds.), The cognitive basis of science (pp. 133–152). New York: Cambridge University Press.

Nersessian, N. J. ( 2008 ) Creating scientific concepts . Cambridge, MA: MIT Press.

O'Malley, M. A. (2011). Exploration, iterativity and kludging in synthetic biology. Comptes Rendus Chimie, 54(4), 406–412.

Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas. New York: Basic Books.

Penner, D. E., & Klahr, D. ( 1996 ). When to trust the data: Further investigations of system error in a scientific reasoning task.   Memory and Cognition , 54 (5), 655–668.

Petitto, L. A., & Dunbar, K. ( 2004 ). New findings from educational neuroscience on bilingual brains, scientific brains, and the educated mind. In K. Fischer & T. Katzir (Eds.), Building usable knowledge in mind, brain, and education Cambridge, England: Cambridge University Press.

Popper, K. R. ( 1959 ). The logic of scientific discovery . London, England: Hutchinson.

Qin, Y., & Simon, H.A. ( 1990 ). Laboratory replication of scientific discovery processes.   Cognitive Science , 54 , 281–312.

Reiser, B. J., Tabak, I., Sandoval, W. A., Smith, B., Steinmuller, F., & Leone, T. J. (2001). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In S. M. Carver & D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–306). Mahwah, NJ: Erlbaum.

Riordan, M., Rowson, P. C., & Wu, S. L. (2001). The search for the Higgs boson. Science, 54, 259–260.

Rutherford, F. J., & Ahlgren, A. ( 1991 ). Science for all Americans. New York: Oxford University Press.

Samarapungavan, A. (1992). Children's judgments in theory choice tasks: Scientific rationality in childhood. Cognition, 54, 1–32.

Schauble, L., & Glaser, R. ( 1990 ). Scientific thinking in children and adults. In D. Kuhn (Ed.), Developmental perspectives on teaching and learning thinking skills. Contributions to Human Development , (Vol. 21, pp. 9–26). Basel, Switzerland: Karger.

Schunn, C. D., & Klahr, D. ( 1995 ). A 4-space model of scientific discovery. In Proceedings of the 17th Annual Conference of the Cognitive Science Society (pp. 106–111). Mahwah, NJ: Erlbaum.

Schunn, C. D., & Klahr, D. ( 1996 ). The problem of problem spaces: When and how to go beyond a 2-space model of scientific discovery. Part of symposium on Building a theory of problem solving and scientific discovery: How big is N in N-space search? In Proceedings of the 18th Annual Conference of the Cognitive Science Society (pp. 25–26). Mahwah, NJ: Erlbaum.

Shrager, J., & Langley, P. ( 1990 ). Computational models of scientific discovery and theory formation . San Mateo, CA: Morgan Kaufmann.

Siegler, R. S., & Liebert, R. M. ( 1975 ). Acquisition of formal scientific reasoning by 10- and 13-year-olds: Designing a factorial experiment.   Developmental Psychology , 54 , 401–412.

Simon, H. A. ( 1977 ). Models of discovery . Dordrecht, Netherlands: D. Reidel Publishing.

Simon, H. A., Langley, P., & Bradshaw, G. L. ( 1981 ). Scientific discovery as problem solving.   Synthese , 54 , 1–27.

Simon, H. A., & Lea, G. ( 1974 ). Problem solving and rule induction. In H. Simon (Ed.), Models of thought (pp. 329–346). New Haven, CT: Yale University Press.

Smith, E. E., Shafir, E., & Osherson, D. ( 1993 ). Similarity, plausibility, and judgments of probability.   Cognition. Special Issue: Reasoning and decision making , 54 , 67–96.

Sodian, B., Zaitchik, D., & Carey, S. ( 1991 ). Young children's differentiation of hypothetical beliefs from evidence.   Child Development , 54 , 753–766.

Taber, K. S. ( 2009 ). Constructivism and the crisis in U.S. science education: An essay review.   Education Review , 54 (12), 1–26.

Thagard, P. ( 1992 ). Conceptual revolutions . Cambridge, MA: MIT Press.

Thagard, P. ( 1999 ). How scientists explain disease . Princeton, NJ: Princeton University Press.

Thagard, P., & Croft, D. ( 1999 ). Scientific discovery and technological innovation: Ulcers, dinosaur extinction, and the programming language Java. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 125–138). New York: Plenum.

Tobias, S., & Duffy, T. M. (Eds.). ( 2009 ). Constructivist instruction: Success or failure? New York: Routledge.

Toth, E. E., Klahr, D., & Chen, Z. ( 2000 ) Bridging research and practice: A cognitively-based classroom intervention for teaching experimentation skills to elementary school children.   Cognition and Instruction , 54 (4), 423–459.

Tweney, R. D. ( 1989 ). A framework for the cognitive psychology of science. In B. Gholson, A. Houts, R. A. Neimeyer, & W. Shadish (Eds.), Psychology of science: Contributions to metascience (pp. 342–366). Cambridge, England: Cambridge University Press.

Tweney, R. D., Doherty, M. E., & Mynatt, C. R. ( 1981 ). On scientific thinking . New York: Columbia University Press.

Valdes-Perez, R. E. ( 1994 ). Conjecturing hidden entities via simplicity and conservation laws: Machine discovery in chemistry.   Artificial Intelligence , 54 (2), 247–280.

Von Hofsten, C. ( 1980 ). Predictive reaching for moving objects by human infants.   Journal of Experimental Child Psychology , 54 , 369–382.

Von Hofsten, C., Feng, Q., & Spelke, E. S. ( 2000 ). Object representation and predictive action in infancy.   Developmental Science , 54 , 193–205.

Vosniadou, S. (Ed.). (2008). International handbook of research on conceptual change. New York: Taylor & Francis.

Vosniadou, S., & Brewer, W. F. ( 1992 ). Mental models of the earth: A study of conceptual change in childhood.   Cognitive Psychology , 54 , 535–585.

Wason, P. C. ( 1968 ). Reasoning about a rule.   Quarterly Journal of Experimental Psychology , 54 , 273–281.

Wertheimer, M. ( 1945 ). Productive thinking . New York: Harper.

Yang, Y. ( 2009 ). Target discovery from data mining approaches.   Drug Discovery Today , 54 (3–4), 147–154.


Enago Academy

The Importance of Critical Thinking Skills in Research


Why is Critical Thinking Important: A Disruptive Force

Anxiety seems to play an increasingly dominant role in the world of academic research. The pressure to publish or perish can warp your focus into thinking that the only good research is publishable research!

Today, your role as the researcher appears to take a back seat to the perceived value of the topic and the extent to which the results of the study will be cited around the world. Due to financial pressures and a growing tendency toward risk aversion, studies increasingly go down the path of applied research rather than basic or pure research. The potential for breakthroughs is deliberately limited to incremental contributions from researchers who are forced to worry more about job security and pleasing their paymasters than about making a significant contribution to their field.

A Slow Decline

So what led researchers to their love of science and scientific research in the first place? The answer is critical thinking skills. The more that academic research is governed by policies outside the research process, the less opportunity there will be for researchers to exercise such skills.

True research demands new ideas , perspectives, and arguments based on willingness and confidence to revisit and directly challenge existing schools of thought and established positions on theories and accepted codes of practice. Success comes from a recursive approach to the research question with an iterative refinement based on constant reflection and revision.

Critical thinking skills are therefore hugely important in research; without them, researchers may even lack the confidence to challenge their own assumptions.

A Misunderstood Skill

Critical thinking is widely recognized as a core competency and as a precursor to research. Employers value it as a requirement for every position they post, and surveys of potential employers for graduates in local markets rate the skill as their number one concern.


When asked to clarify what critical thinking means to them, employers will use such phrases as “the ability to think independently,” or “the ability to think on their feet,” or “to show some initiative and resolve a problem without direct supervision.” These are all valuable skills, but how do you teach them?

For higher education institutions in particular, when you are being assessed against dropout, graduation, and job placement rates, where does a course in critical thinking skills fit into the mix? Student Success courses as a precursor to your first undergraduate course will help students to navigate the campus and whatever online resources are available to them (including the tutoring center), but that doesn’t equate to raising critical thinking competencies.

The Dependent Generation

As education becomes increasingly commoditized and broken down into components that can be delivered online for maximum productivity and profitability, we run the risk of devaluing academic discourse and independent thought. Larger class sizes preclude substantive debate, and the more that content is broken into sound bites that can be tested with multiple-choice questions, the less need there will be for original thought.

Academic journals value citation above all else, and so content is steered toward the type of articles that will achieve high citation volume. Students and researchers perpetuate this misuse by ensuring that their papers include only highly cited works, and the objective of high citation volume is thus achieved.

We expand the body of knowledge in any field by challenging the status quo. Denying the veracity of commonly accepted “facts” or playing devil’s advocate with established rules supports a necessary insurgency that drives future research. If we do not continue to emphasize the need for critical thinking skills to preserve such rebellion, academic research may begin to slowly fade away.


What’s the Difference Between Critical Thinking and Scientific Thinking?


Thinking deeply about things is a defining feature of what it means to be human, but, surprising as it may seem, there isn’t just one way to ‘think’ about something; instead, humans have been developing organized and varied schools of thought for thousands of years.

Discussions about morality, religion, and the meaning of life often drive knowledge-seeking inquiry, leading people to wonder what the difference is between critical thinking and scientific thinking.

Critical thinkers prioritize objectivity to analyze a problem, deduce logical solutions, and examine what the ramifications of those solutions are.

While scientific thinking often relies heavily on critical thinking, scientific inquiry is more dedicated to acquiring knowledge rather than mere abstraction.

There are a lot of nuances between critical thinking and scientific thinking, and most of us probably utilize these skills in our everyday lives. The rest of this article will thoroughly define the two terms and relate how they are similar and different.

What Is Critical Thinking?

Critical thinking is a mindset ― a lens, if you will, through which one may view the world. Critical thinkers rely on a lot of introspection, constantly self-evaluating how they came to a conclusion, and what that conclusion naturally entails.

A critical thinker may discern what they already know about a subject, what that information suggests, why that information is relevant, and how that information could be linked to further lines of inquiry. Critical thinking is, therefore, simply the ability to think clearly and logically.

Systematic reasoning is prized over gut instinct, and determining relevance is crucial to parsing out useful data from extraneous information.

Naturally, the ability to think critically is highly prized in an academic setting, and most educators seek to enable their students to think critically.

What is the link between the styles and motivations of these two Romantic era poets? How can your current understanding of algebra be applied to geometry? How does our understanding of this historical figure influence our understanding of social life at the time?

So much information can be interlinked to develop our understanding of the world, and critical thinking is the basis for using objectivity not only to establish likely outcomes for a scenario but also to inquire into the repercussions of those outcomes and to reflect on the process by which one reached that conclusion.

What Is Scientific Thinking?

The objective of scientific thinking is the acquisition of knowledge. The more we know, the more we can hope to know.

Scientific thinking begins by imagining what the outcome of a problem may be, observing the situation, and then making notes and revising the initial hypothesis.

The commonly used scientific method is as follows:

  • Define the purpose of the experiment
  • Formulate a hypothesis
  • Study the phenomenon and collect data
  • Draw results

As you might imagine, this process can be repeated ad infinitum. So, you draw a conclusion that’s scientifically verifiable? Great! Now you can take that conclusion and use it as a basis for a new experiment. Of course, the scientific method has limits.
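Before turning to those limits, here is a minimal, hypothetical Python sketch of the repeatable cycle just listed; the coin-flipping scenario and all numbers are invented purely to illustrate how each cycle's result seeds the next hypothesis.

```python
import random

# A toy rendition of the cycle listed above: estimating a coin's bias.
# Each pass states a hypothesis, collects data, and draws a result that
# becomes the starting hypothesis for the next iteration.
random.seed(1)
coin_bias = 0.7      # the unknown "phenomenon" under study
estimate = 0.5       # initial hypothesis: the coin is fair

for cycle in range(3):
    # Collect data: flip the coin 100 times.
    flips = [random.random() < coin_bias for _ in range(100)]
    observed = sum(flips) / len(flips)
    # Draw results: revise the hypothesis toward what was observed.
    estimate = (estimate + observed) / 2
    print(f"cycle {cycle + 1}: observed {observed:.2f}, "
          f"revised hypothesis {estimate:.2f}")
```

Each loop leaves the hypothesis closer to the truth, which is the sense in which a verified conclusion becomes the basis for a new experiment.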

It’s hard to apply the scientific method when it comes to morality or religious beliefs. A revelation of a prophet cannot be empirically verified.

We can’t go inside said prophet’s mind and see exactly what neurons were firing to recreate the conditions under which the vision was made, and even if we could, the nature of such a revelation is spiritual and immaterial.

It’s impossible to influence the supernatural in the material world, and as such, creating a test that relies on changing something to see the outcome is impossible. Where scientific thinking does excel is in the fields of math and, well, science.

Physics is often described as the most exact science because the forces that comprise our world are well understood and don't tend to exhibit anomalies, making the empirically verified scientific method perfect for improving our understanding of the natural world.

How Are Critical Thinking and Scientific Thinking Similar and Different?

Both critical and scientific thinking rely on the use of empirical, objective evidence. Thinking scientifically or critically relies on using the data available and following it to its likely conclusion.

Scientific thinking can be seen as a stricter, more regulated version of critical thinking. It takes the tenets of critical thinking and narrows the focus.

Both fields of study eschew personal bias and gut instinct as unreliable and unhelpful.

The main difference between the two, however, is the goal of each discipline.

While both prioritize learning and using data to make hypotheses, critical thinking is prone to much more abstraction and self-reflection.

With little variation in the scientific method, there’s not really any need to reflect on how those conclusions were drawn or if those conclusions are a result of any kind of bias. It’s just not useful information.

For a critical thinker, however, self-reflection is key to identifying inconsistencies and refining one’s argument.

Both scientific thinking and critical thinking tend to draw links between concepts, evaluating how they are related and what knowledge may be gleaned from that connection.

While critical thinking can be applied to most concepts, even those of morality and anthropology, scientific thinking is often problem oriented. If a problem exists, scientific inquiry attempts to gain the necessary information to solve it, overcoming obstacles along the way.

Both critical thinkers and scientific thinkers may very well end up at the same conclusion; they will just draw those conclusions differently. Critical thinkers are concerned with logic, order, and rational thinking.

Establishing already-understood information, applying that information to a query, and then mounting a defensible argument about the accuracy and relevance of the conclusion is the trademark of a critical thinker. Scientific thinkers, on the other hand, work toward understanding almost exclusively through the acquisition of knowledge via the scientific method.

Scientific thinkers develop a hypothesis, test it, and then rinse and repeat until the phenomenon is understood. As such, scientific thinkers are obsessed with why questions. Why does this phenomenon happen?

Why does matter behave like this? In the end, both schools of thought have a lot of interesting ideas guiding them, and most of us probably use them throughout our daily lives.

https://www.vwaust.com/resource/what-is-scientific-thinking/

https://www.skillsyouneed.com/learn/critical-thinking.html#:~:text=Critical%20thinking%20is%20thinking%20about%20things%20in%20certain,to%20the%20best%20possible%20conclusion.%20Critical%20Thinking%20is%3A

https://psycnet.apa.org/record/2010-22950-019


Why Is It Important for Students to Understand How Scientific Decisions are Made? ‘If You Don’t Understand How Scientists Decide What Makes One Claim More Believable, Then It’s Actually Very Hard to Understand Science,’ Says STEM Education Department Head William Sandoval


For William Sandoval, head of the Department of STEM Education in the NC State College of Education, when preparing K-12 students to engage with real-world science, developing the skills to become career scientists is not nearly as important as helping them to engage with the science that will occur all around them in their everyday lives. 

To help facilitate this, Sandoval’s research has focused on how kids understand scientific argumentation and how scientists make the case for why people should believe the theories they’ve developed based on causal claims and evidence. 

The concept is known as epistemic cognition, or thinking about how people know what they know. 

“If we want kids to understand how science works, they have to understand the standards that scientists use to evaluate competing claims. If you don’t understand how scientists decide what makes one claim more believable, then it’s actually very hard to understand science,” Sandoval said. “Every person who gets a science education through high school is eventually going to encounter some scientific issue in their adult lives that they didn’t learn about in school because science moves really fast. If you never thought about the way the scientific community makes choices, it’s very hard to make sense of it all as an individual citizen.” 

How Do Kids Typically Think About Science?

Broadly speaking, research shows that kids often initially find many scientific concepts implausible because so many of the causal agents behind these theories are invisible and insensible. 

“You can’t feel your genes; you can’t feel an atom, so there are these causal agents in science that are very far away from our sensory experience, which makes them hard to understand,” Sandoval said. 

Despite this, research shows that children, even from a young age, believe that it’s much better to have evidence for a claim than to not have evidence. Therefore, teachers can draw on this by helping students understand the evidence behind the science they are learning through engagement in research, data collection or experimentation in the classroom. 

How Can Teachers Help Students Understand How Scientific Decisions Are Made?

At its core, Sandoval said, science is all about evaluating competing claims about how the world works. 

One of the best ways to help students understand how to do this is to have them engage in scientific argumentation. Sandoval shared the following steps to help students in this endeavor: 

  • Identify topics within the curriculum that can elicit disagreement; this can vary from asking elementary students how plants get water to how they inherit genetic traits from their parents: "Identify for everybody when there is disagreement and try to clarify what the nature of the disagreement is. Then, discuss with students how they are going to decide [which claim is valid] and what it will take for everyone to agree." 
  • Give students an opportunity to come up with their own evidence-based claims: “Our current national standards want kids to be doing investigations of various kinds, so these are really good opportunities for kids to engage in these epistemic considerations about how we’re going to decide what we’re going to believe about this thing that we’re studying. Because, if they’re getting data themselves, they can argue about not just what the data show but how you got the data, whether you did it in a reasonable way or not and whether you’ve interpreted the data in a reasonable way.”
  • Have students with opposing claims engage in discussion: “We found that when this is public, that works better and part of the reason is that this holds kids accountable to each other. They also have to be held accountable to standards of evidence. So, that’s a really important role for the teacher to play is to kind of push kids to talk about what evidence they have for this claim over that claim.”
  • Come to a consensus: “Talk about the standards or the criteria that you’d use to resolve your disagreement. It’s not enough that I believe one thing and someone else believes another thing. You’ve got to push us to agree, so pushing to consensus seems to be the thing that really helps kids work through their understanding of the criteria.”

For teachers who want to engage their students in this type of scientific thinking, Sandoval recommends using free resources provided by the NGSX professional learning system along with the inquiryHub, a research-practice partnership that develops materials, tools and processes to promote STEM learning. 



Development of problem-based learning e-module based on identification of misconceptions material on plant tissues and animal tissues to train critical thinking skills and science process for class XI high school students


Annisa Fauzia Rahmah , Herawati Susilo , Murni Sapta Sari; Development of problem-based learning e-module based on identification of misconceptions material on plant tissues and animal tissues to train critical thinking skills and science process for class XI high school students. AIP Conf. Proc. 24 May 2024; 3106 (1): 030017. https://doi.org/10.1063/5.0214956


This study aims to develop a Problem-Based Learning e-module that integrates critical thinking skills and science process skills, based on identifying misconceptions about plant tissue and animal tissue using a three-tier diagnostic instrument. This is development research using the ADDIE model (Analysis, Design, Development, Implementation, and Evaluation). The research involved students of classes XI MIPA 1 and XI MIPA 2 at SMAN 8 Malang. The instruments used were validity and practicality test instruments. Validity testing by a material expert, a teaching-materials expert, and an e-module field practitioner placed the e-module in the very valid category, so it can be used in the learning process. Practicality testing, by means of one-to-one trials, small-group trials, and field trials, placed the developed e-module in the very practical category, which means it can be used by students in the learning process at school.


The Importance of Art Class

Janelle Cox

  • May 24, 2024


In  today’s   technology-driven classrooms ,  art remains an  important  component of student development.  Despite often being the first to  be cut  from the curriculum in some schools, dismissed as a luxury, or merely a source of fridge-worthy projects, art education holds profound benefits.

From fostering cognitive abilities and emotional resilience to enhancing academic performance and building lifelong skills, art class provides much more than a creative outlet. Here, we’ll explore why art class is essential and how to make it more accessible to all students.

Cognitive Skills 

Art classes play a critical role in developing students’ cognitive skills. They encourage creativity, allowing students to express themselves in ways other than writing; this freedom promotes innovative thinking and helps develop students’ critical thinking skills.

As students look at their own work and that of their classmates, they learn to observe, analyze, and make judgments: valuable skills they will use in all aspects of their lives. Art classes can also enhance students’ visual-spatial skills. When drawing, painting, or sculpting, students must understand space and perspective, abilities they will need if they ever enter fields like architecture or engineering.

Social-Emotional Learning

Art class extends beyond cognitive development; it can also support students’ social-emotional learning. Artistic activities tap into students’ feelings, so students who have a hard time vocalizing their emotions may be better able to express themselves through art.

This can feel therapeutic, helping students build self-confidence and release anxiety and stress. Art can also promote empathy: when students explore different art forms and encounter different cultural and personal perspectives, they gain a better understanding of other people’s experiences.

Academic Achievement

Various studies conducted over the years have shown a correlation between art education and academic achievement. Reports from organizations like the Arts Education Partnership and the National Endowment for the Arts in the United States suggest that the arts are linked to improved test scores, enhanced reading and language skills, and higher rates of college enrollment and completion. Additional findings show that artistic activities enhance memory and attention to detail. Integrating art with other subjects, an approach now referred to as STEAM (Science, Technology, Engineering, Arts, and Mathematics), can make learning more relatable and deepen students’ understanding and retention.

Lifelong Skills

The skills learned in art class extend far beyond the classroom. In today’s job market, creativity is valued: employers seek individuals who are innovative and who think outside the box, and creative thinking is ranked as a top skill for future professionals. Art class also teaches risk-taking and resilience; by continually taking creative risks, students develop resilience that can help them meet whatever challenges they face in the future.

Cultural Awareness and Appreciation

When students engage with art forms from different cultures, they gain a deeper understanding of the wider world. They learn to respect and value different viewpoints and traditions, and by creating and discussing art from various backgrounds, they dispel stereotypes and prejudices, promoting a more inclusive and empathetic society.

Making Art Class Accessible 

Art classes are not always accessible to all students, whether because of socioeconomic status, school funding, or geographic location. Ensuring that every student has access to art education is crucial to a well-rounded academic experience. Here are a few approaches to achieving this goal.

Invest in Art 

One way to make art classes universally accessible is to invest in art programs: allocate funds for the basic supplies and materials that inspire students to create, and invest in professional development for teachers. Teachers with a background in art education will help foster a greater appreciation for the arts among students.

Integrate Art

Art can be integrated into the core curriculum to ensure all students have access to art education. STEAM education, which combines art with the other core disciplines, can become fundamental to every child’s educational experience.

Utilize Technology 

Art education can be made more accessible through technology. Digital tools can bring art classes to children across the globe. Virtual classes mean students can learn, create, and share their work with anyone worldwide.

Form Partnerships within the Community 

Partnerships with local art galleries and artists can provide schools with additional resources. These partnerships might involve professional artists working with students, or collaborations with local museums that offer field trips or workshops. Community involvement enhances the school’s art program and strengthens the community culture.

Art class is a vital part of a child’s educational experience. It nurtures cognitive, social, and emotional skills, boosts academic achievement, builds cultural awareness, and equips students with skills they will use throughout their lives. Making art education accessible to all students should be a priority for every leader and administrator.
