
  • Review Article
  • Open access
  • Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

  • Enwei Xu   ORCID: orcid.org/0000-0001-6424-8169 1 ,
  • Wei Wang 1 &
  • Qingxia Wang 1  

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)


  • Science, technology and society

Collaborative problem-solving has been widely embraced in classroom instruction on critical thinking, which is regarded both as the core of competency-based curriculum reform in education and as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This paper presents the major findings of a meta-analysis of 36 empirical studies published in international educational journals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving increases or decreases critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving significantly enhances students’ attitudinal tendency (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas its effect on students’ cognitive skills is smaller, reaching only an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all affect the development of critical thinking and can be viewed as important moderating factors. On the basis of these results, recommendations are made for further research and instruction to better support students’ critical thinking in the context of collaborative problem-solving.


Introduction

Although critical thinking has a long history in research, the concept, regarded as an essential competence for learners in the 21st century, has recently attracted growing attention from researchers and teaching practitioners (National Research Council, 2012). Critical thinking should be at the core of competency-based curriculum reform in education (Peng and Deng, 2017), because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after the knowledge itself is forgotten (Kek and Huijser, 2011). There is no universal definition of critical thinking (Ennis, 1989; Castle, 2009; Niu et al., 2013). In general, it is defined as a self-aware and self-regulated thought process (Facione, 1990; Niu et al., 2013): the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information, together with the attitudinal tendency to apply these skills (Halpern, 2001). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011; Leng and Lu, 2020), leading to educators’ efforts to foster it among students. In teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989). The first is an independent curriculum, in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum, in which critical thinking is integrated into the teaching of other disciplines as an explicit teaching goal; and the third is a mixed curriculum, in which critical thinking is taught in parallel with the teaching of other disciplines.
Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular instructional approaches for critical thinking is problem-based learning (Liu et al., 2020). Duch et al. (2001) noted that problem-based learning in group collaboration is a form of progressive active learning that can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning: it places learners at the center of the learning process and uses ill-structured problems in real-world situations as its starting point (Liang et al., 2017). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social methods such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004; Liang et al., 2017).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted systematic reviews and meta-analyses of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Examining how critical thinking instruction should be implemented within collaborative problem-solving is therefore essential, yet this issue remains largely unexplored, which leaves many teachers poorly equipped to teach critical thinking (Leng and Lu, 2020; Niu et al., 2013). For example, Huber (2016) reported meta-analysis findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking is truly teachable. The authors found that learners significantly improve their critical thinking while in college and that gains differ with factors such as teaching strategy, intervention duration, subject area, and teaching type. However, that study did not determine the usefulness of collaborative problem-solving in fostering students’ critical thinking, nor did it reveal whether significant variations existed among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. (2020) to assess the impact of problem-solving on college students’ critical thinking. The authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in follow-up studies to improve students’ critical thinking.
Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether, and to what extent, collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. (2008) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board, with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. Their research revealed that, through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, research by Naber and Wyatt (2014) and Sendag and Odabasi (2009) on undergraduate and high school students, respectively, found that collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation, but could not significantly improve students’ critical thinking compared with traditional classroom teaching.

The above studies show that the evidence on the effectiveness of collaborative problem-solving in promoting students’ critical thinking is inconsistent. Therefore, it is essential to conduct a thorough and trustworthy review to determine whether, and to what degree, collaborative problem-solving increases or decreases critical thinking. Meta-analysis is a quantitative approach for examining data from separate studies that all address the same research topic. It characterizes the effectiveness of an intervention by averaging the effect sizes of numerous quantitative studies, reducing the uncertainty of individual studies and producing more conclusive findings (Lipsey and Wilson, 2001).

To contribute to both research and practice, this paper carried out a meta-analysis of the effectiveness of collaborative problem-solving in promoting students’ critical thinking. The following research questions were addressed:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects of the various experimental designs in the included studies are heterogeneous, how do moderating variables explain the disparities between study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility checking, merging, duplicate removal, and analysis of included studies) of Cooper’s (2010) meta-analysis approach for examining quantitative data from separate studies focused on the same research topic. The relevant empirical research published in international educational journals in the 21st century was subjected to this meta-analysis using RevMan 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
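The inter-rater consistency check mentioned above can be illustrated with a short sketch of the standard two-rater form of Cohen’s kappa. This is not the authors’ code, and the coding labels below are hypothetical:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance-expected agreement from each rater's marginal frequencies
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical inclusion decisions made by two researchers on six records
a = ["include", "include", "exclude", "include", "exclude", "include"]
b = ["include", "include", "exclude", "exclude", "exclude", "include"]
print(round(cohens_kappa(a, b), 2))  # → 0.67
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it is preferred over a simple consensus rate for screening decisions.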

Data sources and search strategies

Data collection for this meta-analysis proceeded in three stages, as shown in Fig. 1, which reports the number of articles included and eliminated at each step of the selection process according to the study eligibility criteria.

figure 1

This flowchart shows the number of records identified, included and excluded in the article.

First, relevant articles were systematically searched in the journal papers of the Web of Science Core Collection and in the Chinese Core and Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms for scholarly, peer-reviewed literature, offer advanced search tools, and contain research relevant to our topic from reliable researchers and experts. The Boolean search string used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was January 1, 2000, to December 30, 2021. A total of 412 papers were obtained. The Boolean search string used in CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching), which yielded 56 studies over the same period. All duplicates and retractions were eliminated before exporting the references into EndNote, a program for managing bibliographic references. In all, 466 studies remained.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers carried out this entire process together, and a consensus rate of 94.7% was reached after discussion and negotiation to resolve any differences that emerged.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text could be obtained. Articles that did not meet the publication language and articles not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must be a randomized controlled experiment, a quasi-experiment, or a natural experiment; these designs have a higher degree of internal validity and can plausibly provide evidence that collaborative problem-solving and critical thinking are causally related. Articles using non-experimental methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The data-coding template consisted of three kinds of information. The descriptive information comprised basic details about each paper: publishing year, author, serial number, and title.

The variable information for the experimental design covered three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). In line with the topic of this study, the intervention strategy, as the independent variable, was coded as collaborative or non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skills and attitudinal tendency. Seven moderating variables were created by grouping and combining the experimental design variables found in the 36 studies (see Table 1): learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types as mixed, integrated, and independent courses; intervention durations as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds as teacher-supported, technique-supported, and resource-supported; measuring tools as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas according to the specific subjects used in the 36 included studies.
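As an illustration of how such a template can be operationalized, the moderator encodings described above might be captured in a lookup table. The field names and integer codes below are hypothetical, not the authors’ actual schema:

```python
# Hypothetical coding scheme mirroring the moderator encodings described in
# the text; field names and code order are illustrative only.
CODING_SCHEME = {
    "learning_stage": ["higher education", "high school", "middle school",
                       "primary school or lower"],
    "teaching_type": ["mixed courses", "integrated courses", "independent courses"],
    "intervention_duration": ["0-1 weeks", "1-4 weeks", "4-12 weeks", ">12 weeks"],
    "group_size": ["2-3 persons", "4-6 persons", "7-10 persons", ">10 persons"],
    "learning_scaffold": ["teacher-supported", "technique-supported",
                          "resource-supported"],
    "measuring_tool": ["standardized", "self-adapting"],
}

def encode(field, label):
    """Map a moderator's category label to an integer code for analysis."""
    return CODING_SCHEME[field].index(label)

print(encode("teaching_type", "mixed courses"))  # → 0
```

Fixing the encodings up front keeps the two coders’ spreadsheets comparable and makes the later subgroup analysis reproducible.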

The data information contained three metrics for measuring critical thinking: sample size, mean value, and standard deviation. It is important to note that studies with different experimental designs frequently require different formulas to determine the effect size. This paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
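As a sketch of the kind of effect-size computation involved, the following assumes the pretest-posttest-control SMD variant commonly attributed to Morris (2008): the difference in pre-to-post mean change between treatment and control is divided by the pooled pretest standard deviation and multiplied by a small-sample bias correction. The formula choice and all numbers below are assumptions for illustration, not taken from the paper:

```python
import math

def morris_smd(m_t_pre, m_t_post, m_c_pre, m_c_post,
               sd_t_pre, sd_c_pre, n_t, n_c):
    """Pretest-posttest-control standardized mean difference.

    Assumes the Morris (2008) d_ppc2-style variant: (treatment change minus
    control change) / pooled pretest SD, with a small-sample bias correction.
    """
    df = n_t + n_c - 2
    sd_pooled_pre = math.sqrt(((n_t - 1) * sd_t_pre**2 +
                               (n_c - 1) * sd_c_pre**2) / df)
    bias_correction = 1 - 3 / (4 * df - 1)
    return bias_correction * ((m_t_post - m_t_pre) -
                              (m_c_post - m_c_pre)) / sd_pooled_pre

# Hypothetical critical-thinking scores (pre/post means, pretest SDs, group sizes)
d = morris_smd(m_t_pre=60, m_t_post=72, m_c_pre=61, m_c_post=65,
               sd_t_pre=10, sd_c_pre=11, n_t=30, n_c=30)
print(round(d, 2))  # → 0.75
```

Using the pretest SD in the denominator avoids contaminating the scale of the effect with any post-intervention change in variance.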

Procedure for extracting and coding data

Following the data-coding template (see Table 1), two researchers retrieved the information from the 36 papers and entered it into Excel (see Supplementary Table S1). During data extraction, the results of each study were extracted separately if an article contained several studies on critical thinking or if a study assessed different dimensions of critical thinking. For instance, Tiwari et al. (2010) examined critical thinking outcomes at four time points, which were treated as separate studies, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency coefficient was 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., publishing year, author, serial number, and title), variable information (e.g., independent, dependent, and moderating variables), and data information (e.g., mean values, standard deviations, and sample sizes). Publication bias and heterogeneity tests were then run on the sample data using RevMan 5.4, and the test results were used to conduct the meta-analysis.

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the general state of research on the relevant subject, and it can compromise the reliability and accuracy of the results. A meta-analysis therefore needs to check the sample data for publication bias (Stewart et al., 2006). A popular check is the funnel plot: publication bias is unlikely when the data points are evenly dispersed on either side of the average effect size and concentrated toward the top of the plot. In the funnel plot for this analysis (see Fig. 2), the data are evenly dispersed within the upper portion of the funnel, indicating that publication bias is unlikely here.

figure 2

This funnel plot shows the result of publication bias of 79 effect quantities across 36 studies.

Heterogeneity test

The results of a heterogeneity test on the effect sizes are used to select the appropriate effect model for the meta-analysis. It is common practice to gauge the degree of heterogeneity using the I² value: I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for a random-effects model; otherwise, a fixed-effects model should be applied (Lipsey and Wilson, 2001). The heterogeneity test in this paper (see Table 2) yielded I² = 86%, displaying significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random-effects model.
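The I² statistic can be computed from per-study effect sizes and their sampling variances via Cochran’s Q. The sketch below is generic, with invented numbers rather than the paper’s data:

```python
def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic for a set of effect sizes."""
    weights = [1 / v for v in variances]          # inverse-variance weights
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    # Q: weighted squared deviations from the fixed-effect pooled estimate
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))
    df = len(effects) - 1
    # I^2: share of total variation attributable to between-study heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical effect sizes and sampling variances from five studies
effects = [0.2, 0.5, 0.9, 1.3, 0.7]
variances = [0.04, 0.05, 0.04, 0.06, 0.05]
q, i2 = i_squared(effects, variances)
print(round(q, 1), round(i2, 1))  # → 14.0 71.5
```

An I² above the conventional 50% threshold, as in this toy example, is the situation that motivates switching from a fixed-effects to a random-effects model.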

The analysis of the overall effect size

This meta-analysis used a random-effects model to examine the 79 effect quantities from the 36 studies in order to account for heterogeneity. Judged against Cohen’s criterion (Cohen, 1992), the analysis results shown in the forest plot of the overall effect (see Fig. 3) make clear that the cumulative effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]), indicating that it can effectively promote learners’ critical thinking.

figure 3

This forest plot shows the analysis result of the overall effect size across 36 studies.
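The random-effects pooling behind such a forest plot can be sketched numerically. The sketch assumes the DerSimonian-Laird between-study variance estimator, which RevMan’s inverse-variance random-effects model commonly uses; the per-study effect sizes and variances are invented for illustration:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect, z value, and 95% CI."""
    w = [1 / v for v in variances]
    pooled_fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - pooled_fixed) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    # DerSimonian-Laird estimate of between-study variance tau^2
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's sampling variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = 1 / math.sqrt(sum(w_star))
    z = pooled / se
    return pooled, z, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study effect sizes and variances
effects = [0.4, 0.8, 1.1, 0.6, 0.9]
variances = [0.05, 0.04, 0.06, 0.05, 0.04]
pooled, z, ci = random_effects_pool(effects, variances)
print(round(pooled, 2))  # → 0.76
```

Adding τ² to each study’s variance widens the confidence interval relative to a fixed-effects analysis, which is why the random-effects model is the conservative choice under heterogeneity.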

In addition, this study examined the two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although both dimensions improve, the gains in students’ attitudinal tendency are much more pronounced, with a large comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners’ cognitive skills are more modest, reaching only an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

A two-tailed test on the 79 effect quantities in the full forest plot revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors beyond sampling error. Subgroup analysis was therefore used to explore the moderating factors that might produce this heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area covered in the 36 experimental designs, in order to identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the various moderating factors all have beneficial effects on critical thinking. Subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not differ significantly between groups (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05), we are unable to explain why these two factors matter for cultivating critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows:
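The intergroup χ² values reported below are instances of the Q-between statistic from subgroup analysis, which can be sketched as follows under a fixed-effect model within subgroups. The moderator levels and numbers are hypothetical:

```python
def subgroup_q_between(groups):
    """Q-between statistic for a fixed-effect subgroup (moderator) analysis.

    `groups` maps a moderator level to (effects, variances). Q-between measures
    how far each subgroup's pooled effect sits from the overall pooled effect;
    under the null of no moderation it is chi-square distributed with k-1 df.
    """
    pooled, weights = {}, {}
    for name, (effects, variances) in groups.items():
        w = [1 / v for v in variances]
        pooled[name] = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
        weights[name] = sum(w)
    overall = (sum(weights[g] * pooled[g] for g in groups) /
               sum(weights.values()))
    q_between = sum(weights[g] * (pooled[g] - overall) ** 2 for g in groups)
    return q_between, len(groups) - 1

# Hypothetical effect sizes and variances grouped by teaching type
groups = {
    "mixed":       ([1.2, 1.4, 1.3], [0.05, 0.06, 0.05]),
    "integrated":  ([0.7, 0.9, 0.8], [0.05, 0.05, 0.06]),
    "independent": ([0.2, 0.3, 0.3], [0.04, 0.05, 0.05]),
}
q_b, df = subgroup_q_between(groups)
print(df, q_b > 5.99)  # compare against the chi-square 0.05 cutoff for 2 df
```

A Q-between exceeding the chi-square cutoff, as in this toy example, is what the significant moderator tests below correspond to.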

Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school ranked first in effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage’s beneficial influence on cultivating learners’ critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect sizes ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that mixed courses are the most effective teaching type for cultivating critical thinking through collaborative problem-solving.

Various intervention durations significantly improved critical thinking, with significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that the effect on critical thinking is positively correlated with intervention duration: the longer the intervention, the greater the effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The teacher-supported learning scaffold displayed a high level of impact (ES = 0.92, P < 0.01), while the resource-supported (ES = 0.69, P < 0.01) and technique-supported (ES = 0.63, P < 0.01) learning scaffolds attained medium-to-high levels of impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). Critical thinking showed a general declining trend with increasing group size. The overall effect size was largest for groups of 2–3 people (ES = 0.99, P < 0.01), and when the group size exceeded 7 people, the improvement in critical thinking fell to the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as the group grows, the overall impact declines.

Various measuring tools reflected positive influences on critical thinking, without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). The self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the standardized measurement tools had the larger overall effect size, reaching a significant level (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence associated with each measuring tool, we are unable to explain why the measuring tool matters for fostering the growth of critical thinking through collaborative problem-solving.

Different subject areas had varying impacts on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, reaching a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also reached significant levels. Programming technology was the least effective (ES = 0.39, P < 0.01), with only a medium-low degree of effect compared with education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking teaching has a considerable overall impact on cultivating learners’ critical thinking and a favorable promotional effect on both of its dimensions. Several studies have reported that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students’ critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004). This meta-analysis provides convergent data in support of these views. Thus, the findings not only effectively address the first research question regarding the overall effect of collaborative problem-solving on critical thinking and its impact on the two dimensions (i.e., attitudinal tendency and cognitive skills), but also strengthen our confidence in cultivating critical thinking through collaborative problem-solving interventions in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, whereas the corresponding improvements in cognitive skills are only marginally better. Some studies suggest that cognitive skills differ from attitudinal tendency in classroom instruction: the cultivation and development of the former, as a key ability, is a process of gradual accumulation, while the latter, as an attitude, is affected by the teaching context (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting, interesting, rewarding, and challenging: because it places learners at the center and examines ill-structured problems in real situations, it can inspire students to fully realize their problem-solving potential, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency impacts cognitive skills when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that collaborative problem-solving affects both specific dimensions of critical thinking as well as critical thinking as a whole, and this study illuminates the nuanced links between cognitive skills and attitudinal tendency. To fully develop students’ capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

To further explore the key factors that influence critical thinking, subgroup analysis was used to examine possible moderating effects that might produce considerable heterogeneity. The findings show that the moderating factors in the 36 experimental designs (teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area) could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for learning stage and measuring tool are not significant, so these two factors cannot be shown to be crucial in supporting the cultivation of critical thinking through collaborative problem-solving.
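The subgroup comparisons behind these chi-square statistics follow the usual between-groups heterogeneity test (Cochran’s Q_between), which contrasts inverse-variance-weighted subgroup means against the grand mean. A minimal sketch in Python, using hypothetical effect sizes and variances rather than the study’s actual data:

```python
def pooled(effects, variances):
    """Fixed-effect pooled mean and variance via inverse-variance weights."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return mean, 1.0 / sum(weights)

def q_between(subgroups):
    """Cochran's Q_between across subgroup pooled means.

    `subgroups` is a list of (effects, variances) pairs; the statistic is
    referred to a chi-square distribution with k - 1 degrees of freedom.
    """
    means, variances = zip(*(pooled(e, v) for e, v in subgroups))
    weights = [1.0 / v for v in variances]
    grand = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    return sum(w * (m - grand) ** 2 for w, m in zip(weights, means))

# Hypothetical data: two subgroups with clearly different mean effects.
high = ([1.1, 1.2, 1.0], [0.04, 0.05, 0.04])
low = ([0.4, 0.5, 0.45], [0.04, 0.05, 0.04])
q = q_between([high, low])  # a large Q signals a significant moderator
```

A Q exceeding the chi-square critical value (3.84 for df = 1 at P < 0.05) indicates that the moderator explains real between-subgroup differences, as reported for teaching type, intervention duration, subject area, group size, and learning scaffold.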

In terms of the learning stage, all learning stages influenced critical thinking positively but without significant intergroup differences, so we cannot conclude that this factor is crucial in fostering the growth of critical thinking.

Although higher education accounts for 70.89% of the empirical studies reviewed, high school may be the most appropriate learning stage for fostering students’ critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students’ cognitive development and needs to be examined further in follow-up research.

With regard to teaching type, mixed course teaching may be the best method for cultivating students’ critical thinking. Relevant studies have shown that, in actual teaching, if students are trained in thinking methods alone, the methods they learn remain isolated and divorced from subject knowledge, which hinders transfer; conversely, if students’ thinking is trained only within subject teaching, without systematic method training, it is difficult to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course in parallel with other subject teaching achieves the best effect on learners’ critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).

In terms of intervention duration, the overall effect size trends upward with longer interventions; duration and impact on critical thinking are thus positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention. Instead, it develops over a long period through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Future empirical studies should therefore take these constraints into account and extend the period of critical thinking instruction.

With regard to group size, groups of 2–3 have the highest effect size, and the comprehensive effect size generally decreases as group size increases. This is in line with earlier findings; for example, groups of two to four members have been reported as most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis also indicates that once group size exceeds 7, small groups no longer produce better interaction and performance than large groups. This may be because learning scaffolds (technique support, resource support, and teacher support) improve the frequency and effectiveness of interaction among group members, and a larger collaborative group may increase the diversity of views, which helps cultivate critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds enhance critical thinking. Among them, the teacher-supported scaffold has the largest overall effect size, demonstrating the interdependence of effective scaffolding and collaborative problem-solving. This is consistent with earlier findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); scaffolds can lower task complexity and reduce unpleasant feelings while drawing students into learning activities (Wood et al., 2006); and scaffolds help students apply learning approaches more successfully during collaborative problem-solving, with teacher-supported scaffolds exerting the greatest influence on critical thinking because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized instruments (such as the WGCTA, CCTT, and CCTST) are acknowledged worldwide as reliable and valid, only 54.43% of the studies in this meta-analysis adopted them, and the results indicated no intergroup differences. This suggests that standardized instruments are not appropriate for measuring critical thinking in all teaching circumstances. As Simpson and Courtney (2002, p. 91) note, “The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” To gauge more fully and precisely how learners’ critical thinking evolves, standardized measuring tools must therefore be properly adapted to collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size for science subjects (e.g., mathematics, science, medical science) is larger than that for language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can justify their judgments with accurate evidence and reasonable standards when facing challenges or ill-structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problems related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Beyond the points made in the discussion above, the following suggestions are offered for teaching critical thinking through collaborative problem-solving.

First, teachers should emphasize the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis supports the view that collaborative problem-solving has a strong synergistic effect on promoting students’ critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class instruction are key ways to teach critical thinking, rather than simply reading speculative articles without practice (Mulnix, 2012). Furthermore, students’ critical thinking improves through cognitive conflict with other learners in the problem situation (Yang et al., 2008). Teachers should therefore design real problems and encourage students to discuss, negotiate, and argue in collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners’ critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of enabling flexible transfer and application in real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners’ critical thinking. Teachers should therefore embed real collaborative problem-solving situations in conventional subject teaching, teach critical thinking methods and strategies through ill-structured problems, and provide practical activities in which students interact with each other to develop knowledge construction and critical thinking.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should be conscious of how teacher-supported learning scaffolds promote critical thinking. The teacher-supported scaffold had the greatest impact on learners’ critical thinking, being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students’ growth and use appropriate approaches when designing instructional activities (Forawi, 2016). To enable teachers to create scaffolds that cultivate learners’ critical thinking through collaborative problem solving, training in teaching critical thinking should therefore concentrate on teacher-supported scaffolds and be strengthened for teachers, especially preservice teachers.

Implications and limitations

This meta-analysis has certain limitations that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, reducing the number of articles reviewed. Second, some data in the included studies are missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, more studies were released while this meta-analysis was being conducted, so it is bounded in time. Future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of the effectiveness of collaborative problem-solving in promoting students’ critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem solving is an effective teaching approach to foster learners’ critical thinking, with a significant overall effect size (ES = 0.82, z  = 12.78, P  < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving can significantly and effectively improve students’ attitudinal tendency, and the comprehensive effect is significant (ES = 1.17, z  = 7.62, P  < 0.01, 95% CI [0.87, 1.47]); nevertheless, it falls short in terms of improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z  = 11.55, P  < 0.01, 95% CI [0.58, 0.82]).
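The reported effect sizes, z statistics, and confidence intervals are mutually consistent under the usual normal approximation, in which the standard error is recovered from the CI half-width and z = ES / SE. A quick arithmetic check in Python (an illustration of the relationship, not a re-analysis of the data):

```python
# Overall effect from the meta-analysis: ES = 0.82, 95% CI [0.69, 0.95].
es, lo, hi = 0.82, 0.69, 0.95

se = (hi - lo) / (2 * 1.96)   # CI half-width divided by 1.96 -> standard error
z = es / se                   # approximate z statistic

# se comes out near 0.066 and z near 12.4, close to the reported z = 12.78;
# the small gap reflects rounding of the published CI bounds.
```

The same check can be applied to the dimension-level results (ES = 1.17, CI [0.87, 1.47]; ES = 0.70, CI [0.58, 0.82]).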

As demonstrated by both the results and the discussion, the seven moderating factors examined across the 36 studies all have beneficial effects, to varying degrees, on students’ critical thinking. The teaching type (chi2 = 7.20, P < 0.05), intervention duration (chi2 = 12.18, P < 0.01), subject area (chi2 = 13.36, P < 0.05), group size (chi2 = 8.77, P < 0.05), and learning scaffold (chi2 = 9.03, P < 0.01) all have a positive impact on critical thinking and can be viewed as important moderating factors affecting how critical thinking develops. Since the learning stage (chi2 = 3.15, P = 0.21 > 0.05) and measuring tool (chi2 = 0.08, P = 0.78 > 0.05) did not show significant intergroup differences, we cannot conclude that these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2006) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08


Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and Affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received : 07 August 2022

Accepted : 04 January 2023

Published : 11 January 2023

DOI : https://doi.org/10.1057/s41599-023-01508-1



BRIEF RESEARCH REPORT

Effects of Online Problem-Solving Instruction and Identification Attitude Toward Instructional Strategies on Students’ Creativity

Yi-Ping Wang

  • College of International Relations, Huaqiao University, Xiamen, China

Problem-solving ability is an essential part of daily life. Thus, curiosity and a thirst for knowledge should be cultivated in students to help them develop problem-solving and independent-thinking skills. Along with positive attitudes and an active disposition, these abilities are needed to solve problems throughout the lifespan and to develop self-confidence. To achieve educational objectives in the context of globalization, creative ability is necessary for generating competitive advantages. Creative thinking, critical thinking, and problem-solving ability are therefore important basic competencies for future world citizens. Creativity should also be integrated into subject teaching to cultivate students’ lifelong learning and a creative attitude toward life. A questionnaire was distributed to 420 students in colleges and universities in Fujian, China. After removing invalid and incomplete responses, 363 copies were valid, a response rate of 86%. Findings indicate that the new generation requires high levels of support to develop creativity and to integrate diverse subjects such as nature, the humanities, and technology. A rich imagination is needed to root creativity in the new generation.

Introduction

Problem solving is ubiquitous in modern life and an essential skill for overcoming the problems we encounter daily. Problems can be overcome using problem-solving principles and creative inspiration from individuals ( Hao et al., 2016 ). Thus, students' curiosity and thirst for knowledge should be cultivated to develop their problem solving and independent thinking abilities. An active approach and positive attitude to solving problems may enhance self-confidence and the ability to cope with challenges.

Education aims to cultivate healthy personalities, thinking, judgment, and creativity ( Su et al., 2014 ). Essentially, education is the learning process to expand students' potential and cultivate their ability to adapt to—and improve—their environment. Basic goals of education should include self-expression, independent thinking, active inquiry, and problem solving. The curriculum goals should be life-centered to develop individuals' potential, cultivate scientific knowledge and skills, and help students adapt to the demands of modern life ( Atmatzidou et al., 2018 ). Education aims to deliver basic knowledge, cultivate physical and mental development, inquiry, and reflection, and create healthy citizens through activities involving interaction between individuals, individuals and society, and society and nature. To achieve educational objectives, students should be guided to develop their performance and creation abilities, research and active exploration abilities, independent thinking and problem-solving abilities. In the current globalization context, creative abilities are required for building competitive advantages. Accordingly, creative thinking, critical thinking, and problem-solving abilities are key skills for future world citizens. The cultivation of creativity should also be integrated into subject instruction, so that students develop their lifelong learning and creative attitudes toward life. Many countries are eager to cultivate creative new generations and promote the development of local business and humanistic technological education. It has become a national platform for the new generations of international technological art ( Zhang and Chu, 2016 ). In particular, the traditional productivity-oriented competition model is slowly being transformed to creativity-oriented industries. Innovation capability is likely to bring competitive advantages in the Internet information age. 
As the field of information technology grows exponentially, innovation capability has become more important. However, if opportunities for development are missed, it can be difficult to catch up as the need for creativity is likely to grow in the foreseeable future.

In this study we focus on student creativity and how it is affected by online problem-solving instruction and identification of attitudes toward instructional strategies. Our purpose is to help the new generation develop creativity and a rich imagination to integrate the power of nature, humanities, and technology.

Literature Review and Hypothesis

Su et al. (2017) proposed that teachers who use effective instructional strategies help students successfully negotiate the challenges of life, as such strategies may enhance students’ problem-solving ability; deeper teacher–student relationships also improve students’ learning motivation. Art-related activities have been used to observe the factors affecting preschool children’s problem-solving ability (Calvo et al., 2018), including cognition of problem goals, the development of perception ability, individual experience, interaction among peers, and the resource assistance provided by teachers’ instructional strategies. LaForce et al. (2017) pointed out that identifying problem-related data is an essential step in the problem-solving process, i.e., the process of acquiring data, judging data, reducing data coverage, or linking relevant data (Wu et al., 2020a). Teachers’ instructional strategies for online problem solving also affect student performance. The following hypothesis was therefore established for this study.

H1: Online problem-solving instruction has a significant positive correlation with identification attitude.

Lu et al. (2017) consider that teachers can enhance students’ problem-solving ability and cultivate their problem-finding skills through instructional strategies that guide discussion of current affairs. Instructional strategies and the use of multimedia in technology education can induce students’ identification attitudes and learning motivation, ultimately enhancing learning effectiveness and facilitating the development of imagination and creativity. Students with identification attitudes toward such strategies could design problem-solving methods using science (Newhouse, 2017). These students understood that innovation is not necessarily the novel creation of “something from nothing” but may involve modification and new development of existing things (Wu et al., 2020b). Achilleos et al. (2019) regard attitude toward instructional strategies as the most important factor in students’ creativity learning, with teachers, social and cultural factors, and experience in learning a foreign language all showing significant correlations. Our second hypothesis was therefore proposed for this study.

H2: Identification attitude shows strong positive correlations with creativity.

Hsieh et al. (2017) posit that science-related thinking, discovery, and creation can be regarded as the research component of problem solving. Creativity is characterized by keenness, fluency, flexibility, originality, and elaboration: a kind of mental ability to generate distinct new concepts from known experience or knowledge in order to solve problems with creative methods. Creativity can also be the application of known information, oriented toward targeted outcomes, to generate novel, unique, and valuable concepts, products, or technologies, or unexplored innovative concepts and problem-solving abilities (Wu et al., 2021). Joachim et al. (2018) consider creativity part of problem solving, as problem solving often involves novel thinking and strong motivation and determination, giving the solution an important status in the latent problem-solving process. However, Joachim et al.’s (2018) views on creativity and problem solving have largely been unexplored to date, and novel performance at any level of the creative process could be considered creation. Rietz et al. (2019) stated that life brings diverse problems and that the key to addressing them lies in creativity: only when people invest more attention in creativity can problems be solved, leading to optimal solutions for life’s challenges. This gives rise to our third hypothesis.

H3: Online problem-solving instruction reveals strong positive correlations with creativity.

Methodology

Operational Definitions

Online Problem-Solving Instruction

Referring to Chen et al. (2019) , the dimensions of online problem-solving instruction in this study were as follows.

1. Exercise example: Examples to illustrate teaching goals are provided as part of teachers' instruction. Students can learn effective problem-solving skills by observing experts' problem-solving interpretation and demonstration step-by-step.

2. Problem orientation: Problem-oriented learning refers to teachers giving carefully-designed situational problems to students, who start from a problem and proceed to problem solving and learning. After self-learning, students participate in team discussion or discussions with teachers. With constant trials, solutions are eventually proposed.

Identification Attitude

The dimensions for identification attitude toward learning are based on Tang et al. (2019) and contain the following three components.

1. Cognitive component: This refers to an individual's belief in or knowledge of specific matters. The cognition of attitude refers to evaluation of meaning from factual statements presented, i.e., an individual may form an attitude for or against a particular object. For instance, students understand that teachers have rich professional knowledge and can present materials with good organization.

2. Affective component: The affective or emotional component refers to an individual's emotions and feelings, including positive and negative feelings of respect and contempt, like and dislike, sympathy and exclusion. For example, students evaluating a teacher as a friendly person would have positive feelings about the teacher and want to develop that relationship.

3. Behavioral component: Behavior refers to an individual's response tendency to attitude objects, i.e., an individual's explicit behavioral performance when acting in relation to objects. Possible responses include approach, avoidance, or indifference. For instance, students might accept their teachers' arrangement of an activity with respect and actively ask teachers questions.

Creativity

Kim et al. (2019) consider that creativity includes the basic cognitive abilities of divergent thinking, and that such abilities can be assessed through testing tools or observation.

1. Fluency: Fluency refers to the quantity of a person's concept output, i.e., the ability to generate possible programs or solutions. A student with fluent thinking would propose several responses at the concept generation stage.

2. Flexibility: Flexibility is the ability to change thinking direction, i.e., being able to think of different methods when problems occur, to find out distinct applications or new concepts.

3. Originality: Originality refers to the generation of unique and novel ideas, i.e., doing the unexpected or seeing beyond conventional points of view.

4. Elaboration: Elaboration is a supplementary idea that refers to the ability to add new ideas to an original concept, i.e., the ability to increase novel concepts or build on existing ideas or basic concepts.
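Of these four dimensions, the first three can in principle be scored automatically once responses are pooled and coded; elaboration usually requires human rating. The following sketch is illustrative only and is not drawn from this study's instrument: the category coding, the rarity cutoff, and the sample data are all hypothetical.

```python
from collections import Counter

def score_divergent_thinking(responses, categories, corpus_counts, rarity_cutoff=0.05):
    """Score one participant's ideas on fluency, flexibility, and originality.

    responses     -- list of idea strings the participant produced
    categories    -- dict mapping each idea to a coder-assigned conceptual category
    corpus_counts -- Counter of how often each idea occurs across the whole sample
    rarity_cutoff -- ideas making up less than this share of all ideas count as original
    """
    total = sum(corpus_counts.values())
    fluency = len(responses)                               # quantity of concept output
    flexibility = len({categories[r] for r in responses})  # distinct thinking directions
    originality = sum(1 for r in responses
                      if corpus_counts[r] / total < rarity_cutoff)  # statistically rare ideas
    return {"fluency": fluency, "flexibility": flexibility, "originality": originality}

# Hypothetical sample: uses for a brick, pooled across 80 responses.
corpus = Counter({"paperweight": 40, "doorstop": 35, "plant pot": 3, "drum": 2})
print(score_divergent_thinking(
    ["paperweight", "plant pot"],
    {"paperweight": "weight", "plant pot": "container"},
    corpus,
))  # → {'fluency': 2, 'flexibility': 2, 'originality': 1}
```

In practice, published scoring rubrics vary in how they define rarity (per participant rather than per idea, for example), so the cutoff here should be read as a placeholder.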

Research Objective

There are 89 colleges and universities in Fujian, China (50 colleges and 39 universities). Students in these institutions in Fujian comprised the research sample, and we distributed 420 copies of our questionnaire to them. After removing invalid and incomplete questionnaires, a total of 363 valid copies were returned, with a response rate of 86%.

This research focused on online problem solving in teaching and teaching strategies. Using an experimental design, online problem-solving instruction was delivered for 2 hours every week over 24 weeks (48 hours in total). Structural Equation Modeling (SEM) was used to analyze the questionnaire data, following a two-stage procedure of goodness-of-fit testing and model verification. Confirmatory Factor Analysis (CFA) was executed first, aiming to test the latent variables in the model by deleting measured variables with negative effects on the cause-and-effect analysis; path analysis was then carried out on the modified model to estimate the path relationships among variables. Without first screening the latent variables through CFA, the path analysis might be distorted, resulting in poor goodness-of-fit or an insignificant model path. Amos 18.0 was used for the model fit test. CMIN/DF is considered good if lower than 5 and excellent if lower than 3; the Goodness-of-Fit Index (GFI), Adjusted Goodness-of-Fit Index (AGFI), Normed Fit Index (NFI), Incremental Fit Index (IFI), Tucker-Lewis Index (TLI), and Comparative Fit Index (CFI) are considered good if higher than 0.9; and the Root Mean Square Residual (RMR), Root Mean Square Error of Approximation (RMSEA), and Standardized Root Mean Square Residual (SRMR) are good if lower than 0.05.
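The cutoffs listed above can be gathered into a small checker applied to any SEM package's fit output. This is an illustrative sketch, not part of the Amos workflow used in the study; the dictionary layout and index labels are assumptions.

```python
def assess_model_fit(fit):
    """Check SEM fit indices against the cutoffs cited in this study.

    fit -- dict of index name -> value taken from an SEM package's output.
    Returns {index: True/False} for whether each index meets its cutoff.
    """
    # "Good" if at or below the cutoff.
    upper = {"CMIN/DF": 5.0, "RMR": 0.05, "RMSEA": 0.05, "SRMR": 0.05}
    # "Good" if at or above the cutoff.
    lower = {"GFI": 0.9, "AGFI": 0.9, "NFI": 0.9, "IFI": 0.9, "TLI": 0.9, "CFI": 0.9}
    return {name: (value <= upper[name] if name in upper else value >= lower[name])
            for name, value in fit.items() if name in upper or name in lower}

# The fit values reported later in this paper all pass their cutoffs.
reported = {"CMIN/DF": 2.422, "RMSEA": 0.044, "GFI": 0.951,
            "AGFI": 0.927, "RMR": 0.023, "NFI": 0.937}
print(assess_model_fit(reported))  # every index returns True
```

Note that the results section applies the laxer conventional cutoff of 0.08 for RMSEA; the reported value of 0.044 passes either threshold.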

Factor Analysis

Two factors, “exercise example” (eigenvalue = 4.638, α = 0.88) and “problem orientation” (eigenvalue = 3.751, α = 0.85), were extracted from the scale of instructional strategies for online problem solving, accounting for a cumulative variance of 72.683%. Three factors were extracted from the identification attitude scale: “cognitive component” (eigenvalue = 2.463, α = 0.81), “affective component” (eigenvalue = 1.754, α = 0.83), and “behavioral component” (eigenvalue = 1.491, α = 0.84), with a cumulative variance of 73.942%. Four factors were extracted from the creativity scale: “fluency” (eigenvalue = 2.461, α = 0.84), “flexibility” (eigenvalue = 2.055, α = 0.82), “originality” (eigenvalue = 1.976, α = 0.87), and “elaboration” (eigenvalue = 1.689, α = 0.86), with a cumulative variance of 79.317%.
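The α values above are Cronbach's alpha coefficients for internal consistency. For reference, alpha can be computed from a respondents-by-items score matrix as follows; the toy matrix is hypothetical, since the study's raw data are not available.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' scale totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Two perfectly correlated items on different scales: alpha = 8/9.
print(round(cronbach_alpha([[1, 2], [2, 4], [3, 6]]), 3))  # 0.889
```

Identical items would give exactly 1.0; in real scale data, values in the 0.8 range like those reported here indicate good internal consistency.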

Empirical Analysis of SEM

The CFA results were first examined for the convergent and discriminant validity of the model. Convergent validity was assessed through the reliability of the individually observed variables, construct reliability (CR), and average variance extracted (AVE). Factor loadings above 0.5 indicate good reliability of individually observed variables, and the factor loadings in the empirical analysis model were all higher than this suggested value. CR should exceed 0.6, although some researchers suggest that 0.5 or above is acceptable. The model calibration results reveal CR higher than 0.6 and AVE higher than 0.5, thus conforming to the suggested values.

Regarding the calibration results of the structural equations, χ2/df, RMSEA, GFI, AGFI, RMR, and NFI were also calculated, and all met their suggested standards: χ2/df = 2.422 (standard ≤5), RMSEA = 0.044 (≤0.08), GFI = 0.951 (≥0.9), AGFI = 0.927 (≥0.9), RMR = 0.023 (≤0.05), and NFI = 0.937 (≥0.9). The overall model fit is therefore good. The parameter calibration of the structural equation is shown in Table 1 and Figure 1. The results reveal a path from instructional strategies for online problem solving to identification attitude of 0.346***, supporting H1; from identification attitude to creativity of 0.375***, supporting H2; and from instructional strategies for online problem solving to creativity of 0.425***, supporting H3.


Table 1. Structural equation modeling results.


Figure 1. Model path diagram.

The results show that instructional strategies for online problem solving can enhance students' creativity. An expository teaching style is evidently no longer sufficient to cope with the challenges encountered; rather, teachers need to be willing to keep learning and to change their teaching behavior in response to the rapid development of new technology, thereby enhancing teaching efficiency. When conveying new knowledge to beginners, providing exercise examples may help students establish new schemas that they can apply to similar situations. Beginners lacking relevant schemas may try to solve problems by trial and error; in this case, exercise examples in which experts demonstrate problem-solving steps could benefit students' learning performance in the new field. Problems drawn from real life may also facilitate students' creativity: as students use available resources, they unconsciously apply existing knowledge and enhance their creative ability. Solutions to such problems are unpredictable and require the ability to manage interaction between people in a given culture or society in different situations. As a result, teachers should make decisions with consideration of situational changes at the teaching site, i.e., students' ability, performance, and the teaching schedule, rather than generalizing across all situations. We do not suggest limiting creative thinking or defining set times for enhancing students' creative thinking; instead, factors that influence creative efficiency, creative value, and curriculum schedules should be taken into account. As teachers plan their teaching activities, they should pay particular attention to students' academic performance and the vicarious experience of teachers or peers. Uysal (2014) believed people can develop their mental ability through learning even without any creative invention. When we face any new concept, it is better to keep an open mind; that way we will realize there is still much to be created (Fernández et al., 2018). Labusch et al. (2019) said the development of creativity involves not only creating positive thoughts but also turning these advantages into something more refined and broader. Teachers need to provide learning opportunities that students can apply in their daily lives, leading to a re-evaluation of their identification attitudes toward instructional strategies. In this case, enhancing students' self-efficacy may assist them in overcoming learning challenges and cultivating a more positive learning attitude.

The research results demonstrate that online problem solving supports students in examining their ideas, pursuing knowledge, and continually improving their learning. They can freely develop their imaginations and make choices, not limited to tools suited only to their own performance. They can concentrate on details, retain memories, and calmly think of more elaborate problem-solving approaches. Drawing on planning and organization, students make significant progress in the depth, novelty, flexibility, unique style, and functional diversity of their thinking. To cultivate habits of brainstorming, students must become familiar with the general use of contextual information and with flexibly changing approaches to seek answers. Training flexible thinking is essential so that students can cope with problems with ease, propose various options, and generate solutions. Lumsdaine and Lumsdaine (1995) let students learn from each other and modify their own thinking; this transition could help them achieve their potential. Solitary and monotonous learning material can no longer hold students' attention, so teachers need to provide a wider variety of materials and free choices without limit, and could also seek more suitable tools for teaching. For this reason, Treffinger and Isaksen (1992) provide no model answers: they want students to explore and develop without restriction, which can amplify personal experience and bring more options into it. This enhances students' uniqueness, which requires overall growth, both subjective and objective; neither should be venerated over the other. We should also learn to make good use of the conditions and things we already have, since the same thing can have an entirely different outcome depending on how we use it (Aşik and Erktin, 2019). Consequently, problem-solving instruction can assist in cultivating creativity in students' practical ability, as well as in cultivating independent thinking and problem-solving ability. Teachers should attempt to create beneficial educational environments, cultivate students' learning interests, and enhance their mental development. With accumulated experience, students can then be encouraged to develop more flexible skills, sensitive perception, and active thinking, along with the ability to express these experiences appropriately. This provides comprehensive preparation for enhancing students' creative thinking ability. Instructional strategies for online problem solving heavily emphasize cooperative discussion, brainstorming, and presentation. Tasks focusing on students' favorite novels and other relevant interests are valuable for sustaining long-term attention. Success in learning does not rely simply on rich knowledge and skillful techniques; affective attitudes also play an important part. Such characteristics may encourage students to face problems positively and actively and to enhance their learning attitudes logically, step by step.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by the Ethical Committee of the Huaqiao University. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

Y-PW performed the initial analyses and approved the submitted version of the manuscript.

Conflict of Interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

The authors thank the reviewers for their valuable comments.

Achilleos, A. P., Mettouris, C., Yeratziotis, A., Papadopoulos, G. A., Pllana, S., Huber, F., et al. (2019). SciChallenge: a social media aware platform for contest-based STEM education and motivation of young students. IEEE Trans. Learn. Technol . 12, 98–111. doi: 10.1109/TLT.2018.2810879


Aşik, G., and Erktin, E. (2019). Metacognitive experiences: mediating the relationship between metacognitive knowledge and problem solving. Egitim ve Bilim 44, 85–103. doi: 10.15390/EB.2019.7199

Atmatzidou, S., Demetriadis, S., and Nika, P. (2018). How does the degree of guidance support students' metacognitive and problem solving skills in educational robotics? J. Sci. Educ. Technol. 27, 70–85. doi: 10.1007/s10956-017-9709-x

Calvo, I., Cabanes, I., Quesada, J., and Barambones, O. (2018). A multidisciplinary PBL approach for teaching industrial informatics and robotics in engineering. IEEE Trans. Educ . 61, 21–28. doi: 10.1109/TE.2017.2721907

Chen, S.-Y., Lai, C.-F., Lai, Y.-H., and Su Y, S. (2019). Effect of project-based learning on development of students' creative thinking. Int. J. Electr. Eng. Educ . doi: 10.1177/0020720919846808

Fernández, J., Zúñiga, M. E., Rosas, M. V., and Guerrero, R. A. (2018). Experiences in learning problem-solving through computational thinking. J. Comput. Sci. Technol. 18, 136–142. doi: 10.24215/16666038.18.e15

Hao, J., Liu, L., von Davier, A., Kyllonen, P., and Kitchen, C. (2016). “Collaborative problem solving skills vs. collaboration outcomes: findings from statistical analysis and data mining,” in Proceedings of the 9th International Conference on Educational Data Mining (Raleigh, NC), 382–387.


Hsieh, J. S. C., Huang, Y.-M., and Wu, W.-C. V. (2017). Technological acceptance of LINE in flipped EFL oral training. Comput. Hum. Behav . 70, 178–190. doi: 10.1016/j.chb.2016.12.066

Joachim, V., Spieth, P., and Heidenreich, S. (2018). Active innovation resistance: an empirical study on functional and psychological barriers to innovation adoption in different contexts. Ind. Mark. Manag. 71, 95–107. doi: 10.1016/j.indmarman.2017.12.011

Kim, J., Jordan, S. S., Franklin, C., and Froerer, A. (2019). Is solution-focused brief therapy evidence-based: an update 10 years later. Fam. Soc. 100, 127–138. doi: 10.1177/1044389419841688

Labusch, A., Eickelmann, B., and Vennemann, M. (2019). “Computational thinking processes and their congruence with problem-solving and information processing,” in Proceedings of the Computational Thinking Education , 65–78. doi: 10.1007/978-981-13-6528-7_5

LaForce, M., Noble, E., and Blackwell, C. K. (2017). Problem-based learning (PBL) and student interest in STEM careers: the roles of motivation and ability beliefs. Educ. Sci. 7:92. doi: 10.3390/educsci7040092

Lu, X., Wang, D., and Yu, D. (2017). Effect of solution-focused brief therapy-based on exercise prescription intervention on adolescent mental health. Rev. Argentina de Clin. Psicol . 26, 347–354. doi: 10.24205/03276716.2017.1035

Lumsdaine, E., and Lumsdaine, M. (1995). Creative Problem Solving: Thinking Skills for a Changing World . New York, NY: McGraw-Hill. doi: 10.1109/45.464655

Newhouse, C. (2017). STEM the boredom: engage students in the Australian curriculum using ICT with problem-based learning and assessment. J. Sci. Educ. Technol. 26, 44–57. doi: 10.1007/s10956-016-9650-4

Rietz, T., Benke, I., and Maedche, A. (2019). “The impact of anthropomorphic and functional chatbot design features in enterprise collaboration systems on user acceptance,” in Proceedings of the 14th International Conference Wirtschaftsinformatik (Siegen), 1642-1656.

Su, Y.-S., Ding, T.-J., and Lai, C.-F. (2017). Analysis of students' engagement and learning performance in a social community supported computer programming course. Eurasia J. Math. Sci. Technol. Educ . 13, 6189–6201. doi: 10.12973/eurasia.2017.01058a

Su, Y. S., Yang, J. H., Hwang, W. Y., Huang, S. J., and Tern, M. Y. (2014). Investigating the role of computer-supported annotation in problem solving based teaching: an empirical study of a scratch programming pedagogy. Br. J. Educ. Technol. 45, 647–665. doi: 10.1111/bjet.12058

Tang, K. Y., Hsiao, C. H., and Su, Y. S. (2019). Networking for educational innovations: a bibliometric survey of international publication patterns. Sustainability 11:4608. doi: 10.3390/su11174608

Treffinger, D. J., and Isaksen, S. G. (1992). Creative Problem Solving: An Introduction . Montgomery: Center of Creative Learning, Inc.

Uysal, M. P. (2014). Improving first computer programming experiences: the case of adapting a web-supported and well-structured problem-solving method to a traditional course. Contemp. Educ. Technol . 5, 198–217. doi: 10.30935/cedtech/6125

Wu, T. J., Yuan, K. S., and Yen, D. C. (2021). Leader-member exchange, turnover intention and presenteeism–the moderated mediating effect of perceived organizational support. Curr. Psychol . doi: 10.1007/s12144-021-01825-1

Wu, T. J., Gao, J. Y., Wang, L. Y., and Yuan, K. S. (2020a). Exploring links between polychronicity and job performance from the person–environment fit perspective-the mediating role of well-being. Int. J. Environ. Res. Public Health 17, 3711–3722. doi: 10.3390/ijerph17103711


Wu, T. J., Xu, T., Li, L. Q., and Yuan, K. S. (2020b). “Touching with heart, reasoning by truth”! The impact of Brand cues on mini-film advertising effect. Int. J. Advert . 39, 1322–1350. doi: 10.1080/02650487.2020.1755184

Zhang, Y., and Chu, S. K. W. (2016). New ideas on the design of the web-based learning system oriented to problem solving from the perspective of question chain and learning community. Int. Rev. Res. Open Dis. 17, 176–189. doi: 10.19173/irrodl.v17i3.2115

Keywords: online problem, instructional strategies, identification attitude, affective component, creativity

Citation: Wang Y-P (2021) Effects of Online Problem-Solving Instruction and Identification Attitude Toward Instructional Strategies on Students' Creativity. Front. Psychol. 12:771128. doi: 10.3389/fpsyg.2021.771128

Received: 05 September 2021; Accepted: 27 September 2021; Published: 14 October 2021.


Copyright © 2021 Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Yi-Ping Wang, 1487774578@qq.com


  • Open access
  • Published: 05 February 2018

The role of problem solving ability on innovative behavior and opportunity recognition in university students

  • Ji Young Kim 1 ,
  • Dae Soo Choi 1 ,
  • Chang-Soo Sung 1 &
  • Joo Y. Park 2  

Journal of Open Innovation: Technology, Market, and Complexity volume  4 , Article number:  4 ( 2018 ) Cite this article


Universities engage in entrepreneurship education to increase social value creation through students' recognition of new opportunities. However, there is not enough empirical research on whether current entrepreneurship education can be differentiated from other curricula in improving the opportunity recognition process. This study argues that it is very important for cognitive abilities to be manifested as behavior when university students recognize new opportunities. For this purpose, the relationships among problem-solving ability, innovative behavior, and opportunity recognition were verified empirically. The study was conducted on 203 students who took entrepreneurship education courses at Korean universities. The results showed that problem-solving ability positively influenced both innovative behavior and opportunity recognition, and that innovative behavior was a key variable that partially mediated the relationship between problem-solving ability and opportunity recognition. By demonstrating the relationship between individual problem-solving ability and opportunity recognition through innovative behavior, in the context of Korean education, this study carries important implications for strategic pedagogy: it can inform a variety of learning strategies that help entrepreneurship educators design better courses and strengthen the behavioral elements of opportunity development.

It is the recognition of new opportunities that all firms focus on in the new economic paradigm (Ancona and Caldwell, 1992). Recognizing valuable opportunities can significantly improve profit, growth, and/or competitive positioning, and such new opportunities lead to innovation. From a conceptual point of view, research continues on the questions of 'what is opportunity' and 'where is opportunity' (Gartner and Carter, 2003; Venkataraman & Sarasvathy, 2001). Research on the discovery and realization of new opportunities is a very important area because it suggests how prospective entrepreneurs can discover and utilize creative opportunities that create new value and profit, and this is the ultimate goal of entrepreneurship education (Kim et al., 2016). There is particularly lively debate about the relationship between opportunity recognition and personal characteristics. Despite many arguments, however, research on individual characteristics and opportunity recognition is still insufficient, and no unified opinion has emerged due to differences between cognitive and behavioral theories (Ko & Butler, 2003). Research has nevertheless continued to demonstrate that organizational learning can influence opportunity recognition (Shane & Venkataraman, 2000): learning enhances cognitive ability, which leads to opportunity recognition through the manifestation of behavior (Lumpkin and Dess, 2004). Many studies have also demonstrated differences in behavior that contribute to successful entrepreneurs' ability to recognize opportunities and create innovative business ideas (Dyer et al., 2008; Kim et al., 2017).
For example, Alvarez and Barney (2005) use the metaphors of mountain climbing and mountain building to understand the implications of entrepreneurial behavior in relation to these theories. In other words, a new opportunity is not a passive object, like a mountain that simply exists to be found and climbed; rather, entrepreneurs create opportunities through their own actions, generating competition in existing markets and building new ones. Therefore, for a person's cognitive ability to lead to the recognition of a new opportunity, it must be focused on manifesting actions that can realize innovative ideas. In this regard, Kanter (1988) demonstrated the relationship between new opportunity recognition and people with innovative tendencies, regarding such recognition as innovation activity fostered through organizational education. Scott and Bruce (1994) integrated a number of research streams on innovation to develop and test a model of individual innovative behavior, arguing in particular that individual problem-solving styles are very important in inducing innovative behavior. Although there are numerous studies on problem-solving ability, innovative behavior, and new opportunities, most opportunity research has been conducted at the organizational level of firms, and research at the individual level remains insufficient. Furthermore, no unified opinion has emerged due to differences between cognitive theory and behavioral theory (Ko & Butler, 2003). It is also true that the effects of entrepreneurship education in universities have not been studied empirically, because such education has mainly focused on promoting cognitive ability through various kinds of teaching methods.

This study argues that it is very important for cognitive abilities to be manifested as behavior. In 'through' courses, students are given some (virtual or real) business so that they start performing some of the actions of an entrepreneur; in other words, it is very important to induce students to act via 'learning through process', i.e., behavioral learning, when university students are recognizing new opportunities. Entrepreneurship education, which ultimately focuses on recognizing new opportunities, must go beyond the cognitive emphasis of the general curriculum and induce action through behavioral learning. In particular, innovative behaviors that create and realize innovative ideas are very important for new opportunity recognition (Paine & Organ, 2000). To achieve this, universities pursue various teaching methods, but the effectiveness of behavioral learning has not yet been studied. Among the various teaching methods for behavioral learning that lead to innovative behavior, this study is based on team-based learning. We follow the team-learning instructional activity sequence designed by Michaelsen and Sweet (2008), the best-known form of team-based learning in entrepreneurship education, which combines primarily group work in class with primarily individual work outside class. In this way, we empirically demonstrate the relationship between individual problem-solving ability and opportunity recognition through innovative behavior, and we point out implications for strategic pedagogy: a variety of learning strategies can help entrepreneurship educators design better courses that strengthen the behavioral element.

The paper proceeds as follows. We first present the theory of innovative behavior, covering individual problem-solving ability, innovative behavior, and opportunity recognition. We then develop hypotheses to confirm its basic predictions in the student context. Finally, we link the findings to the wider social effects discussed in the entrepreneurship literature and highlight the theoretical contributions and practical implications.

Theoretical background

'Opportunity recognition' as the unit of analysis in entrepreneurship education

A common focus of analysis in entrepreneurship research over the last 30 years has been the 'opportunity', most simply defined as any situation in which new products or services can be developed (Casson, 1982; Shane & Venkataraman, 2000; Venkataraman, 1997). Opportunity recognition has been defined in many ways, but opportunity itself can be defined as a perceived means of generating economic value (i.e., profit) that has not been exploited previously and is not currently being exploited by others. If opportunity is defined in this way, opportunity recognition can be defined as the cognitive process by which an individual concludes that an opportunity has been identified (Baron and Ensley, 2006). Kirzner (1997) pointed out that the distribution of information in society affects the discovery of entrepreneurial opportunities and that only a few individuals can identify and recognize specific opportunities in the market. The process of finding opportunities also depends on the individual's ability and discovery (Stevenson & Gumpert, 1985). For example, people may miss opportunities due to a lack of the cognitive ability needed to perceive changes in the external environment (Stevenson & Gumpert, 1985). Only those who recognize and value the existence of an opportunity can benefit from it (Ardichvili et al., 2003a, b; Shane & Venkataraman, 2000). Opportunity recognition is an early step in transforming an idea into a business concept that creates value and generates revenue, and it is distinguished from the subsequent stages of detailed assessment and development of recognized opportunities and their potential economic value. The focus of a new venture is also an innovative opportunity to create new value rather than merely expanding or repeating existing business models (Gaglio & Katz, 2001). As a result, universities need to make use of a variety of initiatives to educate students to recognize innovative opportunities. Therefore, entrepreneurship education aimed at new opportunity recognition should provide learning opportunities based on various theories of the conditions favorable to new business creation and the types of traits required for new ventures (Garavan & O'Cinnéide, 1994).

Based on these considerations, we also define opportunity recognition as the formation of beliefs that can be translated into actions in order to understand the signals of change (new information on new conditions) and respond to these changes.

Problem-solving ability and innovative behavior in student education

Problem-solving ability has been proven to be one of the key factors for success in organizations and personal careers (Anderson & Anderson, 1995), and decades of research in organizations and schools have examined the factors that improve it. Problem-solving ability is defined in a number of prior studies; in a volatile and sophisticated knowledge- and technology-based industry, it is an essential ability for driving innovation and sustainable growth and development. Table 1 shows the concepts of problem-solving ability defined in previous research.

In a number of previous studies, emphasis has been placed on the importance of rational problem-solving processes for improving problem-solving ability, and research has focused on individual problem-solving styles (Woodman et al., 1993; Scott & Bruce, 1994). According to the individual innovative behavior model of Scott and Bruce (1994), climate shapes individual innovative behavior by signaling the organization’s expectations about behavior and the potential consequences of action. Innovative organizations, last but not least, supply resources such as equipment, facilities, and time in the direction of creativity and innovative change (Kanter, 1983; Siegel & Kaemmerer, 1978), and a proper supply of such resources is important to innovation (Amabile, 1988; Van de Ven & Angle, 1989; Dubickis & Gaile-Sarkane, 2017). Building on Koestler’s (1964) study of creative thinking, Jabri (1991) conceptualized problem solving as two independent thinking styles: a structured style that is based on associative thinking, follows a set of rules, and resolves problems logically, and an intuitive style that generates multiple ideas without being tied to existing rules. Intuitive problem solvers tend to process information from different paradigms simultaneously and are therefore more likely to create novel solutions (Isaksen, 1987; Kirton, 1976). However, assessing style alone is not sufficient, because the effect of problem-solving style differs across problem-solving situations (Scott & Bruce, 1994). Drawing on Scott and Bruce’s innovative behavior model and on diverse entrepreneurship education approaches, we propose a role for the university in encouraging innovative behavior based on the individuality of students so that they can recognize new opportunities.
The involvement of resources, such as entrepreneurship awareness programs, ultimately leads to the identification of individual characteristics and to innovation. In addition, current Korean entrepreneurship education focuses mainly on cognitive learning to improve problem-solving ability, and cognitive learning does play an important role in the learning processes of new venture firms. This study, however, focuses more directly on behavioral learning, such as team-based learning.

Hypothesis development

Problem-solving ability and innovative behavior

Problem solving is the process of discovering the knowledge and skills needed to reach a goal state when the solution is unknown or unfamiliar (Jonassen, 2004; Inkinen, 2015). There are various approaches to solving a problem. To solve problems successfully and to improve problem solving through successful solution experiences, one should adopt the method best suited to the problem, selecting appropriate inputs for the solution elements within a flexible process structure. Problem-solving ability has been recognized as a key element of innovative behavior in responding to rapid change: the ability to generate alternatives, predict the outcomes of those alternatives, and select the solution that maximizes positive results and minimizes negative consequences (Barron & Harrington, 1981; Jabri, 1991; Kirton, 1976). We pose the following hypothesis:

Hypothesis 1: Individual problem-solving ability has an effect on the innovative behavior of students.

Innovative behavior and opportunity recognition

Innovation involves introducing ideas from outside the organization through creative processes and linking these ideas to products or processes. Many scholars studying innovation recognize that generating ideas is only one step in the innovation process (Kanter, 1988). Innovation means change at the organizational or individual level, and Kanter (1988) and Scott and Bruce (1994) defined it at the individual level: an innovative act starts with the recognition of a problem, the adoption of a new idea, or the creation of a solution, and an individual with an innovative tendency then seeks to build a group that shares sympathy for the idea. Innovative individuals create prototypes that allow ideas to be realized concretely as goods or services and put to productive use. According to previous studies, opportunity recognition can be seen as an individual-level entrepreneurial strategy that focuses on perceiving and exploiting potential business ideas and opportunities and on finding the resources to create innovative outcomes (Manev et al., 2005). New venture ideas (NVI) are imaginary combinations of product/service offerings, potential markets or users, and means of bringing these offerings into existence (Davidsson, 2015). From the viewpoint of a potential entrepreneur such as a university student, entrepreneurship starts with an idea. The process continues with a range of practices, including judging the attractiveness and feasibility of the idea, gathering information to reduce uncertainty about its value, and assessing the idea’s fit with newly discovered needs (Hayton & Cholakova, 2012). Earlier we proposed that the program as a whole increases students’ innovative behavior and that the resulting innovative performance takes the form of new venture ideas. It is therefore logical to assume a relationship between innovative behavior and opportunity recognition. We pose the following hypothesis:

Hypothesis 2: Innovative behavior has a positive effect on opportunity recognition.

Problem-solving ability and opportunity recognition

Among the many factors influencing opportunity recognition, the problems that arise in the Fourth Industrial Revolution, the knowledge-based industry of the twenty-first century, are unpredictable and unstructured; they cannot be solved with existing solutions and require creative problem-solving skills. To determine how to solve problem situations that differ from the current situation and have unknown outcomes, problems are solved by drawing on previous experience, knowledge, and intuition (Charles & Lester, 1982). Experience, knowledge, and intuition are applied simultaneously to a single problem, not individually, and the intellectual and creative capacity to solve problems quickly and effectively is what is meant by problem-solving ability (Ardichvili et al., 2003a, b). Empirical studies of problem-solving ability and opportunity recognition have provided strong evidence of a positive relationship between such integrative processes and entrepreneurial opportunity recognition (Ucbasaran et al., 2009). Therefore, we hypothesized that:

Hypothesis 3: Problem-solving ability has an effect on opportunity recognition.

The respondents for this study were randomly selected from three universities in Korea. Most were Korean university students who had experienced team-based learning as the behavioral-learning component of entrepreneurship education. Two main criteria guided the choice of these universities. First, the entrepreneurship courses at these universities treat innovative behavior as central, reflecting the view that innovative behavior is an important factor in an individual’s survival and growth. Second, these courses run theoretical and behavioral learning in parallel to a high degree of satisfaction. A pilot study with 28 students at one university was conducted to verify the reliability and validity of the research measurements. The results showed high clarity and reliability (Cronbach’s alphas were all above 0.70). The pilot sample was not incorporated in the present study.

This study was conducted in four-year undergraduate entrepreneurship courses (across various majors) in Korean university programs. The courses mixed students who had previous entrepreneurship experience with those who had none. During the course, students received theoretical lessons for 8 weeks and team-based learning for the following 8 weeks. The questionnaire was administered during the last week of the course.

The data of 203 participants were analyzed, out of a total of 209; the remaining responses were not appropriate. Of the 203 participants, 27% were female and 73% were male, and the grade distribution was 3% freshmen, 12% sophomores, 26% juniors, and 59% seniors. The distribution of majors was 26% social science, 16% business and economics, 39% engineering, 11% music and athletics, and 7% others (see Table 2).

Measurement

The constructs of the model were measured by questionnaires (problem-solving ability, innovative behavior, and opportunity recognition) composed of scales taken from questionnaires validated in previous studies. Instrument selection was based on two criteria. First, the selected instrument had to measure the same construct (i.e., the originally measured construct had to be conceptually identical to the way the construct was defined in this study’s model). Second, the psychometric qualities of the instrument for students had to be high.

Assessment of the factors was carried out through principal component analyses (varimax rotation, retaining factors with eigenvalues of 1.0 or above) of the scales belonging to the same level of the model, to confirm the distinctiveness of the scales with respect to each other. This was supplemented by computing the internal-consistency reliability of the scales (Cronbach’s α). These analyses used the individual participants’ responses (Nunnally & Bernstein, 1994).
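The two checks described above, the Kaiser eigenvalue criterion for retaining components and Cronbach's α for internal consistency, can be sketched with plain NumPy. This is an illustrative implementation on simulated data, not the authors' actual analysis; the simulated four-item scale is an assumption for demonstration.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return float((k / (k - 1)) * (1 - item_vars.sum() / total_var))

def kaiser_n_factors(items: np.ndarray) -> int:
    """Kaiser criterion: count eigenvalues of the correlation matrix >= 1.0."""
    corr = np.corrcoef(items, rowvar=False)
    eigenvalues = np.linalg.eigvalsh(corr)
    return int((eigenvalues >= 1.0).sum())

# Simulated scale: four noisy indicators of a single latent trait
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.8 * rng.normal(size=(200, 4))

alpha = cronbach_alpha(items)      # should comfortably exceed the 0.70 threshold
n_factors = kaiser_n_factors(items)  # one dominant eigenvalue -> one factor retained
```

A varimax rotation would only matter when two or more factors are retained; with a single retained component, the eigenvalue check alone decides dimensionality.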

Problem-solving ability was measured on a 7-point Likert scale (1 = ‘completely disagree’; 7 = ‘completely agree’), using the instrument developed by Jabri (1991) for measuring individual problem-solving styles.

Innovative behavior was measured on a 7-point Likert scale (1 = ‘completely disagree’; 7 = ‘completely agree’). To measure innovative behavior, we adapted items from the questionnaires used by Scott and Bruce (1994) and Kim and Roh (2010) to fit the aims of this study.

Opportunity recognition was measured on a 7-point Likert scale (1 = ‘completely disagree’; 7 = ‘completely agree’). To measure opportunity recognition, we adapted items from the questionnaire used by Kim and Roh (2010) to fit the aims of this study.

Methods of analysis

The first two parts of the analysis were based primarily on (multiple) regression analyses; the last part was informed by path analysis. Model adequacy was assessed with AMOS 18 (Arbuckle & Wothke, 2003). All models were tested with standardized coefficients obtained from the principal component analysis. To ascertain model fit, we examined the comparative fit index (CFI), the normed fit index (NFI), the root mean square error of approximation (RMSEA), the standardized root mean square residual (SRMR), and the chi-square test statistic.

Reliability and validity are essential psychometric properties to report. The first step was to test the reliability of the proposed scales using Cronbach’s alpha and composite reliability; the usual threshold is 0.7 for newly developed measures (Fornell & Larcker, 1981). Values range from 0.69 to 0.79 for Cronbach’s alpha and from 0.85 to 0.92 for composite reliability (see Table 3), so these scales may be considered reliable. Next, we estimated the research model displayed in Fig. 1 using structural equation modeling (SEM) in AMOS 18 (Arbuckle & Wothke, 2003). The analysis revealed an adequate measurement model, with high factor loadings for all items on the expected factors and communalities of each item exceeding 0.50. We examined the fit indices generally considered important (Hu & Bentler, 1998). First, the RMSEA (root mean square error of approximation), which reflects the overall discrepancy between observed and predicted correlations, equals 0.04 with a 90% confidence interval between 0.03 and 0.05; this is well below the cut-off value of 0.08, suggesting that the hypothesized model resembles the actual correlations. Second, Bentler’s CFI (comparative fit index) exceeds the cut-off of 0.90 (Schumacker & Lomax, 1996). Third, the NFI likewise exceeds the cut-off of 0.90 (Schumacker & Lomax, 1996). Fourth, the SRMR (standardized root mean square residual) value of 0.0392 is well below the cut-off value of 0.05 (Hu & Bentler, 1998). Finally, the chi-square value is 3581.622 (p < 0.0005).
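The RMSEA fit index reported here follows directly from the model chi-square, its degrees of freedom, and the sample size. A minimal sketch of that formula is below; the chi-square and degrees-of-freedom values in the example are hypothetical illustrations (the paper does not report the model's degrees of freedom), and only the sample size of 203 comes from this study.

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical model: chi2 = 300 on df = 150, with n = 203 respondents
value = rmsea(300.0, 150, 203)   # about 0.070, below the 0.08 cut-off
perfect = rmsea(150.0, 150, 203) # chi2 <= df clamps to 0.0
```

The max(·, 0) clamp is why a model whose chi-square does not exceed its degrees of freedom reports an RMSEA of exactly zero.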

Analysis of mediation effect

The RMSEA value and its confidence interval fall below the cut-off of 0.1, which suggests a good, though not excellent, fit. Factor structure was verified by factor analysis using principal component analysis with orthogonal (varimax) rotation, retaining only factors with an eigenvalue of 1 or more; factor loadings of 0.5 or more were considered significant (Hair et al., 2006a, b). The retained factors cumulatively explained 72.4% of the total variance, and confirmatory factor analysis thus supported the differentiation of the three components. We also tested the convergent validity of the constructs by examining whether the average variance extracted (AVE) of each construct reached the recommended level of 0.50 (Fornell & Larcker, 1981). The AVE values ranged from 0.52 to 0.53; therefore, all constructs showed sufficient convergent validity (see Table 3).

As shown in Table 4, the square root of the AVE for each construct is higher than its correlations with the other constructs. Therefore, the discriminant validity of the proposed model can be judged as appropriate.
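The AVE, composite reliability, and Fornell-Larcker discriminant check used in these validity tests can be computed directly from standardized factor loadings. The sketch below uses hypothetical loadings chosen only to land near this study's reported AVE range (0.52 to 0.53); it is an illustration of the formulas, not the authors' data.

```python
import numpy as np

def ave(loadings) -> float:
    """Average variance extracted: mean of squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return float(num / (num + (1 - lam ** 2).sum()))

def fornell_larcker_ok(ave_values, corr) -> bool:
    """Discriminant validity: sqrt(AVE) of each construct must exceed
    its correlations with every other construct."""
    sqrt_ave = np.sqrt(np.asarray(ave_values, dtype=float))
    corr = np.asarray(corr, dtype=float)
    k = len(sqrt_ave)
    return all(corr[i, j] < min(sqrt_ave[i], sqrt_ave[j])
               for i in range(k) for j in range(k) if i != j)

hypothetical_loadings = [0.72, 0.75, 0.70, 0.74]
a = ave(hypothetical_loadings)                 # about 0.53, above the 0.50 threshold
cr = composite_reliability(hypothetical_loadings)

# Three constructs with AVEs near this study's range and moderate correlations
ok = fornell_larcker_ok([0.53, 0.53, 0.52],
                        [[1.0, 0.6, 0.6],
                         [0.6, 1.0, 0.6],
                         [0.6, 0.6, 1.0]])
```

With an inter-construct correlation of 0.75 or more, the same check would fail, since sqrt(0.52) is roughly 0.72.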

Means, standard deviations, and correlations among the study variables are shown in Table  5 .

The mean scores were as follows: problem-solving ability (M = 5.20, SD = 1.08), innovative behavior (M = 5.20, SD = 1.03), and opportunity recognition (M = 5.14, SD = 1.06). The means of all three variables were high, and the variables correlated positively with one another.

Figure 1 presents the model; all paths and their significance levels are given in Table 6. The path between the latent variables problem-solving ability and innovative behavior was significant (p < 0.001), consistent with Hypothesis 1. In addition, the path between innovative behavior and opportunity recognition was significant (p < 0.01), providing empirical support for Hypothesis 2.

H3 proposed that problem-solving ability is positively related to opportunity recognition. With innovative behavior included in the model, the coefficient between problem-solving ability and opportunity recognition weakened from 0.717 to 0.444 but remained significant (C.R. = 7.604, p < 0.001), indicating partial mediation. This supports H3 (see Table 6).

To verify the significance of the indirect effect, bootstrapping was performed in AMOS, and significance was assessed with a two-tailed test. The indirect effect was significant at p = 0.04 (p < 0.05) (see Table 7).
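The bootstrapped indirect-effect test can be illustrated outside AMOS with a plain percentile bootstrap: resample cases, re-estimate the a-path (predictor to mediator) and b-path (mediator to outcome, controlling for the predictor), and check whether the confidence interval of a*b excludes zero. The sketch below uses simulated data with assumed path strengths, not this study's dataset; only the sample size of 203 comes from the paper, and it is not the AMOS procedure itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 203  # sample size in this study
x = rng.normal(size=n)                              # problem-solving ability
m = 0.7 * x + 0.7 * rng.normal(size=n)              # innovative behavior (mediator)
y = 0.4 * m + 0.3 * x + 0.7 * rng.normal(size=n)    # opportunity recognition

def indirect_effect(x, m, y) -> float:
    """a*b estimate: a from regressing m on x,
    b from regressing y on m while controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([m, x, np.ones_like(x)])
    b = np.linalg.lstsq(design, y, rcond=None)[0][0]
    return float(a * b)

point = indirect_effect(x, m, y)

# Percentile bootstrap: the effect is significant if the CI excludes zero
idx = np.arange(n)
boot = [indirect_effect(x[s], m[s], y[s])
        for s in (rng.choice(idx, size=n, replace=True) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
```

Percentile intervals make no normality assumption about the a*b product, which is why bootstrapping is preferred over a simple Sobel test for mediation.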

Discussion and conclusion

We have attempted to demonstrate the effects and significance of behavioral learning by differentiating our model, in which problem-solving ability emerges as innovative behavior and then as opportunity recognition through university entrepreneurship education, from general curricula that emphasize cognitive effects. This supports the premise that entrepreneurship education can improve opportunity recognition through behavioral learning. The results of this study support the role of entrepreneurship education in creating opportunities through innovative behavior and problem-solving ability. Entrepreneurship education should provide different types of learning for new opportunities and focus on what is manifested in behavior.

In addition, based on previous research, we propose the following points and considerations about their effectiveness. First, innovative behavior emerging from problem-solving ability increases as the cognitive diversity of students, with diverse majors and backgrounds, increases. Second, the more entrepreneurial learning experiences students have, the greater their chance of recognizing new opportunities. Third, students’ problem-solving styles and problem-solving abilities should be investigated first, and a teaching strategy combining systematic and effective theory and practice should then be built on that basis. Of course, as many studies have demonstrated, it may be easier to enhance opportunity recognition through cognitive learning, because cognitive learning emphasizes the achievement of knowledge and understanding along with the acquisition of skills and competence. This process, however, is not enough for entrepreneurship education. At the same time, we do not advocate fully team-based behavioral learning of the kind designed by Michaelsen and Sweet (2008): as the results of this study show, problem-solving ability is directly and positively related to opportunity recognition, and, as demonstrated in previous studies, problem-solving ability can be enhanced by cognitive learning (Anderson et al., 2001; Charles & Lester, 1982).

It therefore appears more efficient to balance cognitive learning and behavioral learning at a level appropriate to the students in a course. This study also answers the call for empirical research by Lumpkin and Lichtenstein (2005), Robinson et al. (2016), and others, helping to improve understanding of how entrepreneurship training is linked to various learning models and their effectiveness, and to design better courses for the future. Finally, this study sought to position entrepreneurship education as a curriculum in which solutions evolve into innovative behaviors that create new value and ultimately new opportunities. It shows that entrepreneurship education can positively influence not only the cognitive outcomes of general pedagogy but also innovative behavior and the social effect of creating new value. In doing so, we have laid the groundwork for empirical research on entrepreneurship education that creates more opportunities for students and expands their capabilities.

Limitation and future research

Indeed, the concepts presented here and the limitations of this study have important implications that can fruitfully be addressed in future research. First, our sample consisted of college students taking entrepreneurship training; since it does not cover all Korean university students, it is difficult to generalize the results to all college students in Korea. Second, there is no prior research on the role of innovative behavior as a mediator among college students, so we had to proceed with an exploratory study.

The ability to recognize opportunities can provide significant benefits that help a firm remain competitive in an ever-changing environment. Future research should therefore expand these insights and empirically test further ways in which entrepreneurship pedagogy can integrate learning methods into venture creation and growth processes to help students pursue new opportunities. We hope this study helps university entrepreneurship education create more opportunities and expand the capacity of prospective entrepreneurs.

Alvarez, S. A., & Barney, J. B. (2005). How do entrepreneurs organize firms under conditions of uncertainty? Journal of Management, 31 (5), 776–793.


Amabile, T. M. (1988). A model of creativity and innovation in organizations. Research in Organizational Behavior, 10 (1), 123–167.


Ancona, D. G., & Caldwell, D. F. (1992). Demography and design: Predictors of new product team performance. Organization Science, 3 (3), 321–341.

Anderson, P. M., & Anderson, P. M. (1995). Analysis of faulted power systems (Vol. 445). New York: IEEE press.

Anderson, L. W., Krathwohl, D. R., Airasian, P., Cruikshank, K., Mayer, R., Pintrich, P., & Wittrock, M. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom’s taxonomy . New York: Longman Publishing.

Arbuckle, J. L., & Wothke, W. (2003). AMOS 5 user’s guide . Chicago: Smallwaters.

Ardichvili, A., Cardozo, R., & Ray, S. (2003a). A theory of entrepreneurial opportunity identification and development. Journal of Business Venturing, 18 (1), 105–123.

Ardichvili, A., Cardozo, R., & Ray, S. (2003b). A theory of entrepreneurial opportunity identification and development. Journal of Business Venturing, 18 (1), 105–123.

Baron, R. A., & Ensley, M. D. (2006). Opportunity recognition as the detection of meaningful patterns: Evidence from comparisons of novice and experienced entrepreneurs. Management Science, 52 (9), 1331–1344.

Barron, F., & Harrington, D. M. (1981). Creativity, intelligence, and personality. Annual review of psychology, 32 (1), 439–476.

Casson, M. (1982). The entrepreneur: An economic theory . Lanham: Rowman & Littlefield.

Charles, R., & Lester, F. (1982). Teaching problem solving: What, why & how . Palo Alto: Dale Seymour Publications.

Davidsson, P. (2015). Entrepreneurial opportunities and the entrepreneurship nexus: A re-conceptualization. Journal of Business Venturing, 30 (5), 674–695.

Dubickis, M., & Gaile-Sarkane, E. (2017). Transfer of know-how based on learning outcomes for development of open innovation. Journal of Open Innovation: Technology, Market, and Complexity, 3 (1), 4.

Dyer, J. H., Gregersen, H. B., & Christensen, C. (2008). Entrepreneur behaviors, opportunity recognition, and the origins of innovative ventures. Strategic Entrepreneurship Journal, 2 (4), 317–338.

D'zurilla, T. J., & Nezu, A. M. (1990). Development and preliminary evaluation of the social problem-solving inventory. Psychological Assessment: A Journal of Consulting and Clinical Psychology, 2 (2), 156.

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research , 39–50.

Gaglio, C. M., & Katz, J. A. (2001). The psychological basis of opportunity identification: Entrepreneurial alertness. Small Business Economics, 16 (2), 95–111.

Garavan, T. N., & O’Cinneide, B. (1994). Entrepreneurship education and training programmes: A review and evaluation-part 1. Journal of European Industrial Training, 18 (8), 3–12.

Gartner, W. B., & Carter, N. M. (2003). Entrepreneurial behavior and firm organizing processes. In Handbook of entrepreneurship research (pp. 195–221). New Mexico: Springer US.

Hair, E., Halle, T., Terry-Humen, E., Lavelle, B., & Calkins, J. (2006a). Children's school readiness in the ECLS-K: Predictions to academic, health, and social outcomes in first grade. Early Childhood Research Quarterly, 21 (4), 431–454.

Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006b). Multivariate Data Analysis (6th ed.). Upper Saddle River: Pearson Education, Inc..

Hayton, J. C., & Cholakova, M. (2012). The role of affect in the creation and intentional pursuit of entrepreneurial ideas. Entrepreneurship Theory and Practice, 36 (1), 41–68.

Hu, L. T., & Bentler, P. M. (1998). Fit indices in covariance structure modeling: Sensitivity to underparameterized model misspecification. Psychological Methods, 3 (4), 424.

Inkinen, T. (2015). Reflections on the innovative city: Examining three innovative locations in a knowledge-based framework. Journal of Open Innovation: Technology, Market, and Complexity, 1 (1), 8.

Isaksen, S. G. (1987). Frontiers of creativity research: Beyond the basics. Bearly Ltd.

Jabri, M. M. (1991). The development of conceptually independent subscales in the measurement of modes of problem solving. Educational and Psychological Measurement, 51 (4), 975–983.

Jonassen, D. H. (2004). Learning to solve problems: An instructional design guide (Vol. 6). Hoboken: John Wiley & Sons.

Kanter, R. M. (1983). The change masters: Innovation and entrepreneurship in the American corporation. Touchstone Book.

Kanter, R. M. (1988). Three tiers for innovation research. Communication Research, 15 (5), 509–523.

Kim, H. C., Song, C. H., & An, B. R. (2016). A study on effects of personal characteristics on start-up opportunity and entrepreneurial intention of start-up. Korean Management Consulting review, 16 (3), 75–87.

Kim, S. A., Ryoo, H. Y., & Ahn, H. J. (2017). Student customized creative education model based on open innovation. Journal of Open Innovation: Technology, Market, and Complexity, 3 (1), 6.

Kim, T. H., & Roh, J. H. (2010). A Study of the Impact of Public Service Motivation on Innovative Behavior of Organizational Members. Korean Journal of Public Administration, 48(3).

Kirton, M. (1976). Adaptors and innovators: A description and measure. Journal of Applied Psychology, 61 (5), 622.

Kirzner, I. M. (1997). Entrepreneurial discovery and the competitive market process: An Austrian approach. Journal of Economic Literature, 35 (1), 60–85.

Ko, S., & Butler, J. E. (2003). Alertness, bisociative thinking ability, and discovery of entrepreneurial opportunities in Asian hi-tech firms.

Koestler, A. (1964). The act of creation: A study of the conscious and unconscious processes of humor, scientific discovery and art.

Lumpkin, G. T., & Dess, G. G. (2004). E-Business Strategies and Internet Business Models: How the Internet Adds Value. Organizational Dynamics, 33 (2), 161–173.

Lumpkin, G. T., & Lichtenstein, B. B. (2005). The role of organizational learning in the opportunity-recognition process. Entrepreneurship Theory and Practice, 29 (4), 451–472.

Manev, I. M., Gyoshev, B. S., & Manolova, T. S. (2005). The role of human and social capital and entrepreneurial orientation for small business performance in a transitional economy. International Journal of Entrepreneurship and Innovation Management, 5 (3–4), 298–318.

Michaelsen, L. K., & Sweet, M. (2008). The essential elements of team-based learning. New directions for teaching and learning, 2008 (116), 7–27.

Nunnally, J. C., & Bernstein, I. H. (1994). Validity. Psychometric theory, 99–132.

Paine, J. B., & Organ, D. W. (2000). The cultural matrix of organizational citizenship behavior: Some preliminary conceptual and empirical observations. Human Resource Management Review, 10 (1), 45–59.

Robinson, S., Neergaard, H., Tanggaard, L., & Krueger, N. F. (2016). New horizons in entrepreneurship education: from teacher-led to student-centered learning. Education+ Training, 58(7/8), 661–683.

Schumacker, R. E., & Lomax, R. G. (1996). A beginner's guide to structural equation modeling . Mahwah: Laurence Erlbaum Google Scholar.

Scott, S. G., & Bruce, R. A. (1994). Determinants of innovative behavior: A path model of individual innovation in the workplace. Academy of Management Journal, 37 (3), 580–607.

Shane, S. A. (2003). A general theory of entrepreneurship: The individual-opportunity nexus . Cheltenham: Edward Elgar Publishing.


Shane, S., & Venkataraman, S. (2000). The promise of entrepreneurship as a field of research. Academy of Management Review, 25 (1), 217–226.

Siegel, S. M., & Kaemmerer, W. F. (1978). Measuring the perceived support for innovation in organizations. Journal of Applied Psychology, 63 (5), 553–562.

Spivack, G., Platt, J. J., & Shure, M. B. (1976). The problem-solving approach to adjustment . San Francisco: Jossey-Bass.

Stevenson, H., & Gumpert, D. (1985). The heart of entrepreneurship.

Stevenson, H. H. & J. C. Jarillo (1990). 'A paradigm of entrepreneurship: Entrepreneurial management', Strategic Management Journal, 11, pp. 17–27.

Ucbasaran, D., Westhead, P., & Wright, M. (2009). The extent and nature of opportunity identification by experienced entrepreneurs. Journal of Business Venturing, 24 (2), 99–115.

Van de Ven, A. H., & Angle, H. L. (1989). Suggestions for managing the innovation journey (No. 9). Strategic Management Research Center, University of Minnesota.

Venkataraman, S. (1997). The distinctive domain of entrepreneurship research. Advances in entrepreneurship, firm emergence and growth, 3 (1), 119–138.

Venkataraman, S., & Sarasvathy, S. D. (2001). Strategy and entrepreneurship: Outlines of an untold story.

Warner, M. (2002). Publics and counterpublics. Public Culture, 14 (1), 49–90.

Woodman, R. W., Sawyer, J. E., & Griffin, R. W. (1993). Toward a theory of organizational creativity. Academy of Management Review, 18 (2), 293–321.


Author information

Authors and affiliations.

Dept. of Technology Entrepreneurship (Graduate School), Dongguk University, 904 Chungmurogwn, Toegye-ro 36Gil, Jung-gu, Seoul, 100-272, South Korea

Ji Young Kim, Dae Soo Choi & Chang-Soo Sung

Yonsei School of Business, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul, 120-749, South Korea

Joo Y. Park


Corresponding author

Correspondence to Joo Y. Park .


Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article

Cite this article.

Kim, J.Y., Choi, D.S., Sung, CS. et al. The role of problem solving ability on innovative behavior and opportunity recognition in university students. J. open innov. 4 , 4 (2018). https://doi.org/10.1186/s40852-018-0085-4

Download citation

Received : 12 September 2017

Accepted : 22 January 2018

Published : 05 February 2018

DOI : https://doi.org/10.1186/s40852-018-0085-4


Keywords

  • Problem-solving ability
  • Innovative behavior
  • Opportunity recognition
  • Entrepreneurship education

problem solving skills journal article

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings
  • My Bibliography
  • Collections
  • Citation manager

Save citation to file

Email citation, add to collections.

  • Create a new collection
  • Add to an existing collection

Add to My Bibliography

Your saved search, create a file for external citation management software, your rss feed.

  • Search in PubMed
  • Search in NLM Catalog
  • Add to Search

Problem-solving skills, solving problems and problem-based learning

Affiliation.

  • 1 Department of Clinical Epidemiology and Biostatistics, McMaster University, Hamilton, Ontario, Canada.
  • PMID: 3050382
  • DOI: 10.1111/j.1365-2923.1988.tb00754.x

This paper reviews the empirical evidence in support of the three concepts in the title. To the extent that a skill should be a general strategy, applicable in a variety of situations, and independent of the specific knowledge of the situation, there is little evidence that problem-solving skills, as described and measured in medical education, possess these characteristics. Instead there is an accumulation of evidence that expert problem-solving in medicine is dependent on (1) a wealth of prior specific experiences which can be used in routine solution of problems by pattern recognition processes, and (2) elaborated conceptual knowledge applicable to the occasional problematic situation. The use of problem-based learning (PBL) as an educational strategy is explored. In particular, the evidence suggesting the compatibility of PBL with this theory of expertise is discussed. Finally, I review some issues in the design of PBL curricula from the perspective of the proposed model of expertise.

  • Vol 7, No 1 (2021)

Problem solving skills: essential skills challenges for the 21st century graduates

Akben, N. (2020). Effects of the problem-posing approach on students’ problem solving skills and metacognitive awareness in science education. Research in Science Education, 50(3), 1143–1165.

Araiza-Alba, P., Keane, T., Chen, W. S., & Kaufman, J. (2021). Immersive virtual reality as a tool to learn problem-solving skills. Computers & Education, 164, 104121.

Arifin, S., & Muslim, M. (2020). Tantangan Implementasi Kebijakan “Merdeka Belajar, Kampus Merdeka” pada Perguruan Tinggi Islam Swasta di Indonesia. Jurnal Pendidikan Islam Al-Ilmi, 3(1).

Care, E., Kim, H., Vista, A., & Anderson, K. (2018). Education System Alignment for 21st Century Skills: Focus on Assessment. Center for Universal Education at The Brookings Institution.

Chalkiadaki, A. (2018). A systematic literature review of 21st century skills and competencies in primary education. International Journal of Instruction, 11(3), 1–16.

Changwong, K., Sukkamart, A., & Sisan, B. (2018). Critical thinking skill development: Analysis of a new learning management model for Thai high schools. Journal of International Studies, 11(2).

Chen, X., Hertzog, C., & Park, D. C. (2017). Cognitive predictors of everyday problem solving across the lifespan. Gerontology, 63(4), 372–384.

Cheng, K. (2017). Advancing 21st century competencies in East Asian education systems. Center for Global Education. Asia Society, 2, 26.

Dörner, D., & Funke, J. (2017). Complex problem solving: what it is and what it is not. Frontiers in Psychology, 8, 1153.

Drigas, A. S., & Papoutsi, C. (2018). A new layered model on emotional intelligence. Behavioral Sciences, 8(5), 45.

Funke, J., Fischer, A., & Holt, D. V. (2018). Competencies for complexity: problem solving in the twenty-first century. In Assessment and teaching of 21st century skills (pp. 41–53). Springer.

Gale, A. J., Duffey, M. A., Park‐Gates, S., & Peek, P. F. (2017). Soft Skills versus hard skills: practitioners’ perspectives on interior design interns. Journal of Interior Design, 42(4), 45–63.

Georgiou, Y., & Kyza, E. A. (2018). Relations between student motivation, immersion and learning outcomes in location-based augmented reality settings. Computers in Human Behavior, 89, 173–181.

Graesser, A. C., Fiore, S. M., Greiff, S., Andrews-Todd, J., Foltz, P. W., & Hesse, F. W. (2018). Advancing the science of collaborative problem solving. Psychological Science in the Public Interest, 19(2), 59–92.

Husamah, H., Fatmawati, D., & Setyawan, D. (2018). OIDDE learning model: Improving higher order thinking skills of biology teacher candidates. International Journal of Instruction, 11(2), 249–264.

Ichsan, I. Z., Sigit, D. V., Miarsyah, M., Ali, A., Arif, W. P., & Prayitno, T. A. (2019). HOTS-AEP: Higher Order Thinking Skills from Elementary to Master Students in Environmental Learning. European Journal of Educational Research, 8(4), 935–942.

Johnson, B. T., & Acabchuk, R. L. (2018). What are the keys to a longer, happier life? Answers from five decades of health psychology research. Social Science & Medicine, 196, 218–226.

Kashani-Vahid, L., Afrooz, G., Shokoohi-Yekta, M., Kharrazi, K., & Ghobari, B. (2017). Can a creative interpersonal problem solving program improve creative thinking in gifted elementary students? Thinking Skills and Creativity, 24, 175–185.

Kraft, M. A. (2019). Teacher effects on complex cognitive skills and social-emotional competencies. Journal of Human Resources, 54(1), 1–36.

Kutlu, Ö., & Kartal, S. K. (2018). The Prominent Student Competences of the 21st Century Education and the Transformation of Classroom Assessment. International Journal of Progressive Education, 14(6), 70–82.

Martin, T. (2019). Review of student soft skills development using the 5Ws/H approach resulting in a realistic, experiential, applied, active learning and teaching pedagogical classroom. Journal of Behavioral and Applied Management, 19(1), 41–57.

Mefoh, P. C., Nwoke, M. B., Chukwuorji, J. C., & Chijioke, A. O. (2017). Effect of cognitive style and gender on adolescents’ problem solving ability. Thinking Skills and Creativity, 25, 47–52.

Mestry, R. (2017). Empowering principals to lead and manage public schools effectively in the 21st century. South African Journal of Education, 37(1).

Meyer, M. W., & Norman, D. (2020). Changing Design Education for the 21st Century. She Ji: The Journal of Design, Economics, and Innovation, 6(1), 13–49.

Miner, K. N., Walker, J. M., Bergman, M. E., Jean, V. A., Carter-Sowell, A., January, S. C., & Kaunas, C. (2018). From “her” problem to “our” problem: Using an individual lens versus a social-structural lens to understand gender inequity in STEM. Industrial and Organizational Psychology, 11(2), 267–290.

Özreçberoğlu, N., & Çağanağa, Ç. K. (2018). Making it count: Strategies for improving problem-solving skills in mathematics for students and teachers’ classroom management. Eurasia Journal of Mathematics, Science and Technology Education, 14(4), 1253–1261.

Peng, L., & Luo, S. (2021). Impact of social economic development on personality traits among Chinese college students: A cross-temporal meta-analysis, 2001–2016. Personality and Individual Differences, 171, 110461.

Pinter, R., & Cisar, S. M. (2018). Measuring Team Member Performance in Project Based Learning. Journal of Applied Technical and Educational Sciences, 8(4), 22–34.

Puccio, G. J. (2017). From the dawn of humanity to the 21st century: Creativity as an enduring survival skill. The Journal of Creative Behavior, 51(4), 330–334.

Putri, O. R. U., & Alfani, I. (2021). Mathematics Connection Process of Students With Low Mathematical Ability in Solving Contextual Problems Based on Gender. 4th Sriwijaya University Learning and Education International Conference (SULE-IC 2020), 549–555.

Reddy, M., & Panacharoensawad, B. (2017). Students Problem-Solving Difficulties and Implications in Physics: An Empirical Study on Influencing Factors. Journal of Education and Practice, 8(14), 59–62.

Rios, J. A., Ling, G., Pugh, R., Becker, D., & Bacall, A. (2020). Identifying critical 21st-century skills for workplace success: A content analysis of job advertisements. Educational Researcher, 49(2), 80–89.

Scoular, C., & Care, E. (2018). Teaching Twenty-First Century Skills: Implications at System Levels in Australia. In Assessment and Teaching of 21st Century Skills (pp. 145–162). Springer.

Shishigu, A., Hailu, A., & Anibo, Z. (2017). Problem-based learning and conceptual understanding of college female students in physics. Eurasia Journal of Mathematics, Science and Technology Education, 14(1), 145–154.

Silber‐Varod, V., Eshet‐Alkalai, Y., & Geri, N. (2019). Tracing research trends of 21st‐century learning skills. British Journal of Educational Technology, 50(6), 3099–3118.

Stoeffler, K., Rosen, Y., Bolsinova, M., & von Davier, A. A. (2020). Gamified performance assessment of collaborative problem solving skills. Computers in Human Behavior, 104, 106036.

Sulaiman, T., Muniyan, V., Madhvan, D., Hasan, R., & Rahim, S. S. A. (2017). Implementation of higher order thinking skills in teaching of science: A case study in Malaysia. International Research Journal of Education and Sciences (IRJES), 1(1), 2158–2550.

Tambunan, H. (2019). The Effectiveness of the Problem Solving Strategy and the Scientific Approach to Students’ Mathematical Capabilities in High Order Thinking Skills. International Electronic Journal of Mathematics Education, 14(2), 293–302.

Ulger, K. (2018). The effect of problem-based learning on the creative thinking and critical thinking disposition of students in visual arts education. Interdisciplinary Journal of Problem-Based Learning, 12(1), 10.

Urbani, J. M., Roshandel, S., Michaels, R., & Truesdell, E. (2017). Developing and modeling 21st-century skills with preservice teachers. Teacher Education Quarterly, 44(4), 27–50.

Virtanen, A., & Tynjälä, P. (2019). Factors explaining the learning of generic skills: a study of university students’ experiences. Teaching in Higher Education, 24(7), 880–894.

Wechsler, S. M., Saiz, C., Rivas, S. F., Vendramini, C. M. M., Almeida, L. S., Mundim, M. C., & Franco, A. (2018). Creative and critical thinking: Independent or overlapping components? Thinking Skills and Creativity, 27, 114–122.

Wijaya, T. T. (2021). How chinese students learn mathematics during the coronavirus pandemic. IJERI: International Journal of Educational Research and Innovation, 15, 1–16.

Jurnal EDUCATIO: Jurnal Pendidikan Indonesia is Nationally Accredited in SINTA 3

Accreditation Number

(Ministry of RTHE): 105/E/KTP/2022, No 61 pp 20, dated June 6, 2022.

ISSN: 2477-0302

Published by:  

Indonesian Institute for Counseling, Education and Therapy (IICET)

Address: Jl. Bunda I No. 19  Padang -  West Sumatera - Indonesia 25131 Telp. +62751 8970975| Email:  [email protected]

Collaborative Problem-Solving in Knowledge-Rich Domains: A Multi-Study Structural Equation Model

  • Open access
  • Published: 24 June 2024

  • Laura Brandl   ORCID: orcid.org/0000-0001-7974-7892 1 ,
  • Matthias Stadler 1 , 2 ,
  • Constanze Richters 1 ,
  • Anika Radkowitsch 3 ,
  • Martin R. Fischer 2 ,
  • Ralf Schmidmaier 4 &
  • Frank Fischer 1  

Collaborative skills are crucial in knowledge-rich domains, such as medical diagnosing. The Collaborative Diagnostic Reasoning (CDR) model emphasizes the importance of high-quality collaborative diagnostic activities (CDAs; e.g., evidence elicitation and sharing), influenced by content and collaboration knowledge as well as more general social skills, to achieve accurate, justified, and efficient diagnostic outcomes (Radkowitsch et al., 2022). However, it has not yet been empirically tested, and the relationships between individual characteristics, CDAs, and diagnostic outcomes remain largely unexplored. The aim of this study was to test the CDR model by analyzing data from three studies in a simulation-based environment and to better understand the construct and the processes involved ( N = 504 intermediate medical students) using a structural equation model including indirect effects. We found various stable relationships between individual characteristics and CDAs, and between CDAs and diagnostic outcome, highlighting the multidimensional nature of CDR. While both content and collaboration knowledge were important for CDAs, none of the individual characteristics directly related to diagnostic outcome. The study suggests that CDAs are important factors in achieving successful diagnoses in collaborative contexts, particularly in simulation-based settings. CDAs are influenced by content and collaboration knowledge, highlighting the importance of understanding collaboration partners’ knowledge. We propose revising the CDR model by assigning higher priority to collaboration knowledge compared with social skills, and dividing the CDAs into information elicitation and sharing, with sharing being more transactive. Training should focus on the development of CDAs to improve CDR skills.

Introduction

Collaborative skills are highly relevant in many situations, ranging from computer-supported collaborative learning to collaborative problem-solving in professional practice (Fiore et al., 2018). While several broad collaborative problem-solving frameworks exist (OECD, 2017), most of them are situated in knowledge-lean settings. One example of collaborative problem-solving in a knowledge-rich domain is collaborative diagnostic reasoning (CDR; Radkowitsch et al., 2022), which aligns closely with medical practice, a prototypical field that demands strong collaboration skills. In daily professional practice, physicians from different specialties often need to collaborate to solve complex problems, such as diagnosing, that is, determining the causes of a patient’s problem. Moreover, research in medical education and computer-supported collaborative learning suggests that the acquisition of medical knowledge and skills is significantly enhanced by collaborative problem-solving (Hautz et al., 2015; Koschmann et al., 1992). For problem-solving and learning, it is crucial that all relevant information (e.g., evidence and hypotheses) is elicited from and shared with the collaboration partner (Schmidt & Mamede, 2015). However, CDR is not unique to the medical field; it is also relevant in other domains, such as teacher education (Heitzmann et al., 2019).

The CDR model has been the basis of empirical studies and describes how individual characteristics and the diagnostic process are related to the diagnostic outcome. However, it has not yet been empirically tested, and the relationships between individual characteristics, diagnostic process, and diagnostic outcome remain mostly unexplored (Fink et al., 2023). The aim of this study is to test the CDR model by analyzing data from three studies with similar samples and tasks investigating CDR in a simulation-based environment. By undertaking these conceptual replications, we aspire to better understand the construct and the processes involved. As prior research has shown, collaboration needs to be performed at a high quality to achieve accurate problem solutions and learning outcomes (Pickal et al., 2023).

Collaborative Diagnostic Reasoning (CDR) Model

Diagnosing can be understood as the process of solving complex diagnostic problems through “goal-oriented collection and interpretation of case-specific or problem-specific information to reduce uncertainty” in decision-making through performing diagnostic activities at a high quality (Heitzmann et al., 2019 , p. 4). To solve diagnostic problems, that is, to identify the causes of an undesired state, it is increasingly important to collaborate with experts from different fields, as these problems become too complex to be solved individually (Abele, 2018 ; Fiore et al., 2018 ). Collaboration provides advantages such as the division of labor, access to diverse perspectives and expertise, and enhanced solution quality through collaborative sharing of knowledge and skills (Graesser et al., 2018 ).

The CDR model is a theoretical model focusing on the diagnostic process in collaborative settings within knowledge-rich domains (Radkowitsch et al., 2022 ). The CDR model is based on scientific discovery as a dual-search model (SDDS; Klahr & Dunbar, 1988 ) and its further development by van Joolingen and Jong ( 1997 ). The SDDS model describes individual reasoning as the coordinated search through hypothetical evidence and hypotheses spaces and indicates that for successful reasoning it is important not only that high-quality cognitive activities within these spaces are performed but also that one is able to coordinate between them (Klahr & Dunbar, 1988 ). In the extended SDDS model (van Joolingen & Jong, 1997 ) focusing on learning in knowledge-rich domains, a learner hypothesis space was added including all the hypotheses one can search for without additional knowledge. Although Dunbar ( 1995 ) found that conceptual change occurs more often in groups than in individual work, emphasizing the importance of collaborative processes in scientific thinking and knowledge construction, the SDDS model has hardly been systematically applied in computer-supported collaborative learning and collaborative problem-solving.

Thus, the CDR model builds upon these considerations and describes the relationship between individual characteristics, the diagnostic process, and the diagnostic outcome. As in the SDDS model, we assume that CDR involves activities within an evidence space and a hypotheses space; unlike in the SDDS model, however, these spaces are understood in the CDR model as cognitive storages of information, which aligns more closely with the extended dual-search model of scientific discovery learning (van Joolingen & Jong, 1997). In summary, we assume that coordinating between evidence (data) and hypotheses (theory) is essential for successful diagnosing. Further, the CDR model extends beyond individual to collaborative cognitive activities and describes the interaction of epistemic activities (F. Fischer et al., 2014) and collaborative activities (Liu et al., 2016) in constructing a shared problem representation (Rochelle & Teasley, 1995) and collaborating effectively. Thus, we define CDR as a set of skills for solving a complex problem collaboratively “by generating and evaluating evidence and hypotheses that can be shared with, elicited from, or negotiated among collaborators” (Radkowitsch et al., 2020, p. 2). The CDR model also makes assumptions about the factors necessary for successful CDR. In the following, we look at what successful CDR means, why people differ in it, and what the mediating processes are.

Diagnostic Outcome: Accuracy, Justification, and Efficiency

The primary outcome of diagnostic processes, such as CDR, is the accuracy of the given diagnosis, which indicates problem-solving performance or expertise (Boshuizen et al., 2020). However, competence in diagnostic reasoning, whether exercised individually or collaboratively, also includes justifying the diagnosis and reaching it efficiently. This is why, in addition to diagnostic accuracy, diagnostic justification and diagnostic efficiency should also be considered as secondary outcomes of the diagnostic reasoning process (Chernikova et al., 2022; Daniel et al., 2019). Diagnostic justification makes the reasoning behind the decision transparent and understandable for others (Bauer et al., 2022). Good reasoning entails a justification including evidence that supports the reasoning (Hitchcock, 2005). Diagnostic efficiency relates to how much time and effort is needed to reach the correct diagnosis; this is important for CDR, as diagnosticians in practice are usually under time pressure (Braun et al., 2017). Both diagnostic justification and diagnostic efficiency are thus indicators of a structured and high-quality reasoning process. So, while many studies assess diagnostic reasoning only by the accuracy of the given diagnosis (Daniel et al., 2019), the CDR model considers all three facets of the diagnostic outcome as relevant factors.
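To make the three outcome facets concrete, the sketch below shows one possible operationalization in Python: accuracy as a binary match with the case solution, justification as a rater score, and efficiency as accuracy relative to time on task. The field names and the efficiency formula are illustrative assumptions, not the measures used in the studies discussed here.

```python
from dataclasses import dataclass

@dataclass
class DiagnosticResult:
    """One participant's outcome on a diagnostic case (hypothetical fields)."""
    correct: bool          # accuracy: diagnosis matches the case solution
    justification: float   # justification: rater score, e.g., in [0, 1]
    time_minutes: float    # time needed to submit the diagnosis

def efficiency(result: DiagnosticResult) -> float:
    """One possible operationalization of diagnostic efficiency:
    accuracy relative to time on task, so correct diagnoses reached
    faster count as more efficient."""
    return (1.0 if result.correct else 0.0) / result.time_minutes

r = DiagnosticResult(correct=True, justification=0.8, time_minutes=12.5)
print(round(efficiency(r), 3))  # → 0.08
```

A wrong diagnosis scores zero efficiency regardless of speed under this rule, which mirrors the idea that efficiency only rewards fast *correct* solutions.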

Individual Characteristics: Knowledge and Social Skills

Research has shown that content knowledge, social skills, and, in particular, collaboration knowledge are important prerequisites for, and outcomes of, computer-supported collaborative learning (Jeong et al., 2019 ; Vogel et al., 2017 ). CDR has integrated these dependencies into its model structure. Thus, the CDR model assumes that people engaging in CDR differ with respect to their content knowledge, collaboration knowledge, and domain general social skills.

Content knowledge refers to conceptual and strategic knowledge in a specific domain (Förtsch et al., 2018 ). Conceptual knowledge encompasses factual understanding of domain-specific concepts and their interrelations. Strategic knowledge entails contextualized knowledge regarding problem-solving during the diagnostic process (Stark et al., 2011 ). During expertise development, large amounts of content knowledge are acquired and restructured through experience with problem-solving procedures and routines (Boshuizen et al., 2020 ). Research has repeatedly shown that having high conceptual and strategic knowledge is associated with the diagnostic outcome (e.g., Kiesewetter et al., 2020 ; cf. Fink et al., 2023 ).

In addition to content knowledge, the CDR model assumes that collaborators need collaboration knowledge. A key aspect of collaboration knowledge (i.e., being aware of knowledge distribution in the group; Noroozi et al., 2013 ) is the pooling and processing of non-shared information, as research shows that a lack of collaboration knowledge has a negative impact on information sharing, which in turn has a negative impact on performance (Stasser & Titus, 1985 ).

Finally, general social skills influence the CDR process. These skills mainly influence the collaborative aspect of collaborative problem-solving and less the problem-solving aspect (Graesser et al., 2018). Social skills are considered particularly important when collaboration knowledge is low (F. Fischer et al., 2013). The CDR model assumes that the abilities to share and negotiate ideas, to coordinate, and to take the collaboration partner’s perspective are particularly relevant for the diagnostic process and the diagnostic outcome (Radkowitsch et al., 2022; see also Liu et al., 2016, and Hesse et al., 2015).

Diagnostic Process: Collaborative Diagnostic Activities

The diagnostic process is thought to mediate the effect of the individual characteristics on the diagnostic outcome and is described in the CDR model using collaborative diagnostic activities (CDAs), such as evidence elicitation, evidence sharing, and hypotheses sharing (Heitzmann et al., 2019 ; Radkowitsch et al., 2022 ). One of the main functions of CDAs is to construct a shared problem representation (Rochelle & Teasley, 1995 ) by sharing and eliciting relevant information, as information may not be equally distributed among all collaborators initially. To perform these CDAs at a high quality, each collaborator needs to identify information relevant to be shared with the collaboration partner as well as information they need from the collaboration partner (OECD, 2017 ).

Evidence elicitation involves requesting information from a collaboration partner to access additional knowledge resources (Weinberger & Fischer, 2006 ). Evidence sharing and hypothesis sharing involve identifying the information needed by the collaborator to build a shared problem representation. This externalization of relevant information can be understood as the novelty aspect of transactivity (Vogel et al., 2023 ). However, challenges arise from a lack of relevant information due to deficient sharing, which can result from imprecise justification and insufficient clustering of information. In particular, research has shown that collaborators often lack essential information-sharing skills, such as identifying information relevant for others from available data, especially in the medical domain (Kiesewetter et al., 2017 ; Tschan et al., 2009 ).

It is crucial for the diagnostic outcome that all relevant evidence and hypotheses are elicited and shared for the specific collaborators (Tschan et al., 2009 ). However, diagnostic outcomes seem to be influenced more by the relevance and quality of the shared information than by their quantity (Kiesewetter et al., 2017 ; Tschan et al., 2009 ). In addition, recent research has shown that the diagnostic process is not only an embodiment of individual characteristics but also adds a unique contribution to diagnostic outcome (Fink et al., 2023 ). However, it remains difficult to assess and foster CDAs.

Collaboration in Knowledge-Rich Domains: Agent-Based Simulations

There are several challenges when it comes to modelling collaborative settings in knowledge-rich domains for both learning and research endeavors. First, many situations are not easily accessible, as they may be scarce (e.g., natural disasters) or too critical or overwhelming to be approached by novices (e.g., some medical procedures). In these cases, the use of simulation-based environments allows authentic situations approximating real-life diagnostic problems to be provided (Cook et al., 2013 ; Heitzmann et al., 2019 ). Further, the use of technology-enhanced simulations allows data from the ongoing CDR process to be collected in log files. This enables researchers to analyze process data without the need for additional assessments with dedicated tests. Analyzing process data instead of only product data (the assessment’s outcome) permits insights into the problem-solving processes leading to the eventual outcome (e.g., Goldhammer et al., 2017 ). Second, when using human-to-human collaboration, the results of one individual are typically influenced by factors such as group composition or motivation of the collaboration partner (Radkowitsch et al., 2022 ). However, we understand CDR as an individual set of skills enabling collaboration, as indicated by the broader definition of collaborative problem-solving (OECD, 2017 ). Thus, the use of simulated agents as collaboration partners allows a standardized and controlled setting to be created that would otherwise be hard to establish in collaborations among humans (Rosen, 2015 ). There is initial research showing that performance in simulations using computerized agents is moderately related to collaborative skills in other operationalizations (Stadler & Herborn et al., 2020 ). Thus, computerized agents allow for enhanced control over the collaborative process without significantly diverging from human-to-human interaction (Graesser et al., 2018 ; Herborn et al., 2020 ). 
Third, in less controlled settings it is hard to ensure that a specific process takes place during collaborative problem-solving. For example, in a human-to-human setting it is possible that, even though we intend to measure or foster a specific activity (e.g., hypothesis sharing), the student does not perform it. By using an agent-based simulated collaboration partner, we can ensure that all required processes take place while the problem is being solved (Rosen, 2015).
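As an illustration of how log-file process data might be turned into measures of collaborative diagnostic activities, the following sketch counts event types and scores sharing quality as the share of shared evidence that was relevant. The event schema and the `relevant` flag are invented for this example; the actual simulations' logs will differ.

```python
from collections import Counter

# Hypothetical log events from an agent-based diagnostic simulation.
log = [
    {"event": "evidence_elicitation", "relevant": True},
    {"event": "evidence_sharing", "relevant": True},
    {"event": "evidence_sharing", "relevant": False},
    {"event": "hypothesis_sharing", "relevant": True},
]

# Quantity: how often each collaborative diagnostic activity occurred.
counts = Counter(entry["event"] for entry in log)

# Quality: share of shared evidence that was actually relevant for the
# collaboration partner (the literature cited above suggests quality
# matters more than quantity).
shared = [entry for entry in log if entry["event"] == "evidence_sharing"]
sharing_quality = sum(entry["relevant"] for entry in shared) / len(shared)

print(counts["evidence_sharing"], sharing_quality)  # → 2 0.5
```

Separating counts from a relevance-based quality score reflects the finding that diagnostic outcomes depend more on what is shared than on how much.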

In sum, by providing a consistent and controlled setting, simulated agents facilitate the accurate measurement and enhancement of collaborative problem-solving. Evidence for the application of simulated agents spans a variety of contexts, including tutoring, collaborative learning, knowledge co-construction, and collaborative problem-solving itself, emphasizing their versatility and effectiveness in educational settings (Graesser et al., 2018; Rosen, 2015).

Research Question and Current Study

In computer-supported collaborative learning, a distinction has been made between approaches addressing collaboration to learn and approaches focusing on learning to collaborate. Our study is best understood as addressing the second approach, learning to collaborate. We want to better understand CDR in order to facilitate collaborative problem-solving skills in learners. Thus, in this paper, we examine what it takes to be able to collaborate in the professional practice of knowledge-rich domains, such as medical diagnosing.

When solving diagnostic problems, such as diagnosing a patient, it is often necessary to collaborate with experts from different fields (Radkowitsch et al., 2022 ). In CDR, the diagnostic outcome depends on effectively eliciting and sharing relevant evidence and hypotheses among collaborators, who often lack information-sharing skills (Tschan et al., 2009 ). Thus, the CDR model emphasizes the importance of high-quality CDAs influenced by content and collaboration knowledge as well as social skills to achieve accurate, justified, and efficient diagnostic outcomes (Radkowitsch et al., 2022 ).

This study tests the relationships postulated in the CDR model empirically across three studies and investigates the extent to which they are applicable across studies. By addressing this research question, the current study contributes to a better understanding of the underlying processes in collaborative problem-solving.

We derived a model (Fig. 1 ) from the postulated relationships made by the CDR model. We assume that the individual characteristics are positively related to the CDAs (Hypotheses 1–3), as well as that the CDAs are positively related to the diagnostic outcome (Hypotheses 4–6). Further, we expect that the relationship between the individual characteristics and the diagnostic outcome is partially mediated by the CDAs (Hypotheses 7–15).

figure 1

Visualization of hypothesized relationships between individual characteristics, collaborative diagnostic activities, and diagnostic outcome

We used data from three studies with similar samples and tasks investigating CDR in an agent-based simulation in the medical domain. The studies can therefore be considered conceptual replication studies. Furthermore, we decided to use an agent-based simulation of a typical collaboration setting in diagnostic reasoning, namely the interdisciplinary collaboration between an internist and a radiologist (Radkowitsch et al., 2022 ).

To test the hypotheses, three studies were analyzed. Study A was carried out in a laboratory setting in 2019 and included medical students in their third to sixth years. Study B included medical students in their fifth to sixth years; data collection for this study was conducted online due to the pandemic situation in 2020 and 2021. In both studies, participation was voluntary, and participants were paid 10 per hour. Study C was embedded as an online session in the curriculum of the third year of medical school in 2022. Participation was mandatory, but permission to use the data for research purposes was given voluntarily. All participants took part in only one of the three studies. All three studies received ethical approval from LMU Munich (approval numbers 18-261, 18-262 & 22-0436). For a sample description of each study, see Table 1. We would like to emphasize that none of the students were specializing in internal medicine, ensuring that the study results reflect the competencies of regular medical students without specialized expertise.

Each of the three studies was organized in the same way, with participants first completing a pretest that included a prior knowledge test, socio-demographic questions, and questions about individual motivational-affective characteristics (e.g., social skills, interest, and motivation). Participants then moved on to the CDR simulation and worked on the patient case. The patient case was the same for studies B and C, but was different for study A. The complexity and difficulty of the patient case did not vary systematically between the patient cases.

Simulation and Task

In the CDR simulation, which is also used as a learning environment, the task was to take on the role of an internist and collaborate with an agent-based radiologist, requesting radiological examinations to obtain further information and diagnose fictitious patient cases with the chief symptom of fever. Medical experts from internal medicine, radiology, and general medicine constructed the patient cases. Each case was structured in the same way: participants first studied the medical record individually, then collaborated with the agent-based radiologist, and finally reported the final diagnosis and its justification, again individually. For a detailed description of the development and validation of the simulation, see Radkowitsch and colleagues (2020).

Before working in the simulation, participants received an instruction for the simulated scenario explaining what they were expected to do. We then showed participants how to access further information in the medical record by clicking on hyperlinks and how to use the toolbar to make notes for later in the process. Furthermore, we acquainted the students with how to request further information by collaborating with the radiologist.

During the collaboration with the agent-based radiologist, participants were asked to fill out request forms to obtain further evidence from radiological examinations needed to diagnose the patient case. To collaborate effectively with radiologists, it is crucial for internists to clearly communicate the type of evidence required to reduce uncertainty (referred to as “evidence elicitation”) and to share any relevant patient information such as signs, symptoms, and medical history (referred to as “evidence sharing”) as well as the suspected diagnoses under investigation (referred to as “hypotheses sharing”) that may impact the radiologist’s diagnostic process. Only when participants shared evidence and hypotheses appropriately for their requested examination did they receive a description and evaluation of the radiologic findings. What was considered appropriate was determined by medical experts for each case and examination while preparing the cases. This scenario therefore involves more than a simple division of tasks, as the quality of one person’s activity (i.e., the description and evaluation of the radiologic findings) depends on the collaborative efforts (i.e., the CDAs) of the other person (OECD, 2017).

Measures—Individual Characteristics

The individual characteristics were measured in the pretest. The internal consistencies of each measure per study are displayed in Table 4 in the Results section. We want to point out that the internal consistency of knowledge as a construct—determined by the intercorrelations among knowledge pieces—typically exhibits a moderate level. Importantly, recent research argues that a moderate level of internal consistency does not undermine the constructs’ capacity to explain a significant amount of variance (Edelsbrunner, 2024 ; Stadler et al., 2021 ; Taber, 2018 ).

Content knowledge was separated into radiology and internal medicine knowledge, as these two disciplines play a major role in the diagnosis of the simulated patient cases. For each discipline, conceptual and strategic knowledge was assessed (Kiesewetter et al., 2020 ; Stark et al., 2011 ). The items in each construct were presented in a randomized way in each study. However, the items for study C were shortened due to the embedding of the data collection in the curriculum. Therefore, items with a very high or low item difficulty in previous studies were excluded (Table 2 ).

Conceptual knowledge was measured using single-choice questions including five options adapted from a database of examination questions from the Medical Faculty of the LMU Munich, focusing on relevant and closely related diagnoses of the patient cases used in the simulation. A mean score of 0–1 was calculated, representing the percentage of correct answers and indicating the average conceptual knowledge of the participant per medical knowledge domain.

Strategic content knowledge was measured contextually using key feature questions (M. R. Fischer et al., 2005). Short cases were introduced, followed by two to three follow-up questions (e.g., What is your most likely suspected diagnosis? What is your next examination? What treatment do you choose?). Each question had eight possible answers, from which the learners were asked to choose one. Again, a mean score of 0–1 was calculated, representing the percentage of correct responses and indicating the average strategic content knowledge of the participant per domain.

The measure of collaboration knowledge was consistent across the three studies and specific to the simulated task. Participants were asked to select all relevant information for seven different patient cases with the cardinal symptom of fever (internal medicine). The patient cases were presented in a randomized order and always included 12 pieces of information regarding the chief complaints, medical history, and physical examination of the patient cases. We then assessed whether each piece of information was handled correctly (i.e., whether relevant information was shared and irrelevant information was not shared), assigned 1 point per correct piece, and divided the sum by the maximum of 12 points to standardize the range of the measure to 0–1. We then calculated a mean score for each case and across all cases, resulting in a score ranging from 0 to 1 that indicates the participants’ collaboration knowledge.
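The scoring rule above can be sketched as follows; the function name and boolean encoding are illustrative, not taken from the study materials:

```python
def collaboration_knowledge_score(cases):
    """Mean proportion of correctly handled information pieces across cases.

    cases: list of per-case lists of 12 booleans, where True means the piece
    was handled correctly (relevant info shared, irrelevant info withheld).
    Returns a score in the range 0-1.
    """
    case_scores = [sum(pieces) / 12 for pieces in cases]  # per-case 0-1 score
    return sum(case_scores) / len(case_scores)            # mean across all cases
```

For instance, one fully correct case and one case with six of twelve pieces handled correctly would yield a score of 0.75.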

The construct of social skills was consistent across the three data collections and was measured on the basis of self-report on a 6-point Likert scale ranging from total disagreement to total agreement. The construct was measured using 23 questions divided into five subscales; for example items, see Table 3 . Five questions aimed to measure the overall construct, and the other four subscales were identified using the complex problem-solving frameworks of Liu et al. ( 2016 ) and Hesse et al. ( 2015 ): perspective taking (four questions), information sharing (five questions), negotiation (four questions), and coordination (five questions). For the final score, the mean of all subcategories was calculated, ranging from 1 to 6, representing general social skills.

Measures—Collaborative Diagnostic Activities (CDA)

We operationalize CDAs in the pretest case in terms of quality of evidence elicitation, evidence sharing, and hypotheses sharing. The internal consistencies of each measure per study are displayed in Table 4 in the Results section.

The quality of evidence elicitation was measured by assessing the appropriateness of the requested radiological examination for the indicated diagnosis. An expert solution was developed to indicate which radiological examinations were appropriate for each of the possible diagnoses. If participants requested an appropriate radiological examination for the indicated diagnoses, they received 1 point for that request attempt. Finally, a mean score across all request attempts (maximum of three) was calculated. Due to the categorical nature of the original data and its skewed distribution, with a majority of responses concentrated in a single category, the final mean score was transformed into a binary indicator, with 1 indicating that all requested radiological examinations were appropriate and 0 indicating that inappropriate radiological examinations were also requested.

The quality of evidence sharing was measured using a precision indicator. This was calculated as the proportion of shared relevant evidence out of all shared evidence. Relevant evidence is defined per case and per diagnosis and indicated by the expert solution. The precision indicator was first calculated per radiological request. We then calculated the mean score, summarizing all attempts in that patient case. This resulted in a range from 0 points, indicating that only irrelevant evidence was shared, to 1 point, indicating that only relevant evidence was shared.

The quality of hypotheses sharing was also measured using a precision indicator. For each patient case, the proportion of shared diagnoses relevant for the respective patient case to all shared diagnoses was calculated. Which diagnoses were considered relevant for a specific case was determined by an expert solution. As with evidence elicitation, due to the categorical nature of the original data and its skewed distribution, with a majority of responses concentrated in a single category, this score was converted into a binary variable, where 1 indicated that only relevant diagnoses were shared and 0 indicated that irrelevant diagnoses were also shared.
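Both sharing measures follow the same precision-then-binarize pattern. A minimal sketch, with hypothetical function names and the expert solution represented as a set of relevant items:

```python
def precision(shared, relevant):
    """Proportion of shared items that the expert solution marks as relevant."""
    if not shared:
        return 0.0
    return sum(1 for item in shared if item in relevant) / len(shared)

def binarize(score):
    """1 only if exclusively relevant items were shared, else 0."""
    return 1 if score == 1.0 else 0
```

Sharing one relevant and one irrelevant item thus yields a precision of 0.5, which binarizes to 0; only perfectly precise sharing yields 1.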

Measures—Diagnostic Outcome

We operationalize diagnostic outcome in the pretest case in terms of diagnostic accuracy, diagnostic justification, and diagnostic efficiency.

For diagnostic accuracy, a main diagnosis was assigned to each patient case as the expert solution. After working on the patient case and requesting the radiological examinations, participants indicated their final diagnosis. To do this, they typed in the first three letters of their desired diagnosis and then received suggestions from a list of 249 possible diagnoses. Diagnostic accuracy was then calculated by coding the agreement between the final diagnosis given and the expert solution. Accurate diagnoses (e.g., hospital-acquired pneumonia) were coded as 1, correct but inaccurate diagnoses (e.g., pneumonia) were coded as 0.5, and incorrect diagnoses were coded as 0. Due to the categorical nature of the original data and its skewed distribution, with a majority of responses concentrated in a single category, a binary indicator was used for the final diagnostic accuracy score, with 0 indicating an incorrect diagnosis and 1 indicating at least an inaccurate diagnosis.
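The three-level coding and its binarization can be illustrated as follows; the function and its arguments are hypothetical, since in the study the mapping was defined by the expert solution for each case:

```python
def code_accuracy(final_dx, accurate_dx, inaccurate_dx):
    """Three-level coding: 1 accurate, 0.5 correct but inaccurate, 0 incorrect."""
    if final_dx == accurate_dx:    # e.g., "hospital-acquired pneumonia"
        return 1.0
    if final_dx == inaccurate_dx:  # e.g., the broader "pneumonia"
        return 0.5
    return 0.0

def binary_accuracy(score):
    """1 for at least an inaccurate diagnosis (0.5 or 1), 0 for an incorrect one."""
    return 0 if score == 0.0 else 1
```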

A prerequisite for diagnostic justification and diagnostic efficiency is the provision of at least an inaccurate diagnosis. If a participant provided an incorrect diagnosis (coded as 0), diagnostic justification and diagnostic efficiency were immediately scored as 0.

After choosing a final diagnosis, participants were asked to justify their decision in an open text field. Diagnostic justification was then calculated as the proportion of relevant reported information out of all relevant information that would have fully justified the final accurate diagnosis. Again, medical experts agreed on an expert solution that included all information relevant to justifying the correct diagnosis. The participants’ responses were coded by two independent coders, each coding the full data, and differences in coding were discussed until the coders agreed. Scores range from 0 points, indicating a completely inadequate justification, to 1 point, indicating a fully justified final diagnosis.
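In contrast to the precision-based sharing scores, justification is a recall-type proportion (relevant information reported out of all relevant information). A sketch under the same assumptions as above:

```python
def justification_score(reported, relevant):
    """Proportion of expert-defined relevant information actually reported."""
    if not relevant:
        return 0.0
    return sum(1 for item in relevant if item in reported) / len(relevant)
```

Reporting two of three expert-defined relevant pieces of information would thus score roughly 0.67, regardless of how much additional irrelevant information was reported.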

Diagnostic efficiency was defined as diagnostic accuracy (non-binary version) divided by the minutes required to solve the case.
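Combined with the prerequisite rule above (incorrect diagnoses score 0 on the secondary outcomes), diagnostic efficiency reduces to:

```python
def diagnostic_efficiency(accuracy_score, minutes):
    """Non-binary accuracy (0, 0.5, or 1) divided by time on the case in minutes.
    Incorrect diagnoses are scored 0 regardless of time taken."""
    if accuracy_score == 0.0:
        return 0.0
    return accuracy_score / minutes
```

Note that under this definition a correct-but-inaccurate diagnosis reached quickly can be as efficient as an accurate diagnosis reached slowly.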

Statistical Analyses

To answer the research question, a structural equation model (SEM) was estimated using MPlus Editor, version 8 (Muthén & Muthén, 2017). We decided to use an SEM because it is a comprehensive statistical approach widely used in psychology and the educational sciences for its ability to model complex relationships among observed and latent variables while accounting for measurement error (Hilbert & Stadler, 2017). SEM supports the development and verification of theoretical models, enabling scholars to refine theories and interventions in psychology and education based on empirical evidence, as not just a single relationship but a whole system of regressions is investigated simultaneously (Nachtigall et al., 2003).

We included all links between the variables and applied a two-step approach, using mean- and variance-adjusted unweighted least squares (ULSMV; Savalei & Rhemtulla, 2013) as the estimator and THETA parametrization, first examining the measurement model and then the structural model. The assessment of model fit was based on the chi-square statistic (χ2), the root mean square error of approximation (RMSEA), and the comparative fit index (CFI). Model fit is generally indicated by small chi-square values, RMSEA values < 0.08 (acceptable) or < 0.06 (excellent), and CFI values ≥ 0.90. We do not consider the standardized root mean squared residual (SRMR) because, according to the definition used in MPlus, this index is not appropriate when the sample size is 200 or less, as natural variation in such small samples contributes to larger SRMR values (Asparouhov & Muthén, 2018). For Hypotheses 1–6, we excluded path coefficients < 0.1 from our interpretation, as they are relatively small. In addition, at least two interpretable path coefficients, of which at least one is statistically significant, are required to find support for a hypothesis. For Hypotheses 7–15, specific indirect effects (the effect of an individual characteristic on the diagnostic outcome through a specific CDA) and total indirect effects (the mediation of the effect of an individual characteristic on the diagnostic outcome through all mediators) were estimated.
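The fit criteria can be summarized as a simple decision rule; the cutoffs are those stated above, while the way they combine into a single label is our reading, not a standard statistical API:

```python
def classify_fit(rmsea, cfi):
    """Classify SEM model fit against the RMSEA and CFI cutoffs used here."""
    if cfi < 0.90:          # CFI below threshold: fit not acceptable
        return "poor"
    if rmsea < 0.06:        # excellent RMSEA
        return "excellent"
    if rmsea < 0.08:        # acceptable RMSEA
        return "acceptable"
    return "poor"
```

By this rule, study C's reported values (RMSEA = 0.036, CFI = 1.00) fall into the "excellent" band.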

We reported all measures in the study and outlined differences between the three samples. All data and analysis code have been made publicly available at the Open Science Framework (OSF) and can be accessed at https://osf.io/u8t62 . Materials for this study are available by email through the corresponding author. This study’s design and its analysis were not pre-registered.

The descriptive statistics of each measure per study are displayed in Table 4 . The intercorrelations between the measures per study can be found in Appendix Table 7 .

Overall Results of the SEM

All loadings were in the expected directions and statistically significant, except for conceptual knowledge in internal medicine in study C (λ = 0.241, p = .120), conceptual knowledge in radiology in study A (λ = 0.398, p = .018), and strategic knowledge in internal medicine (λ = 0.387, p = .206) and radiology (λ = −0.166, p = .302) in study B. Standardized factor loadings of the measurement model are shown in Appendix Table 8.

The SEM has a good fit for study A [χ2(75) = 74.086, p = .508, RMSEA = 0.000, CFI = 1.00], study B [χ2(75) = 68.309, p = .695, RMSEA = 0.000, CFI = 1.00], and study C [χ2(75) = 93.816, p = .070, RMSEA = 0.036, CFI = 1.00].

Paths between Individual Characteristics, CDAs, and Diagnostic Outcome

The standardized path coefficients and hypotheses tests for the theoretical model are reported in Table 5 . An overview of the paths supported by the data is shown in Fig. 2 .

figure 2

Evidence on supported relationships between individual characteristics, collaborative diagnostic activities, and diagnostic outcome

Overall, the R 2 for the CDAs ranged from medium to high for evidence elicitation and evidence sharing, depending on the study, and were consistently low for hypotheses sharing across all three studies. Looking at diagnostic outcome, R 2 is consistently large for diagnostic accuracy and medium to large for diagnostic justification and diagnostic efficiency (Table 6 ).

The path from content knowledge to evidence elicitation was positive and > 0.1 in all three studies, as well as statistically significant in two of them; therefore, we consider Hypothesis 1a supported. The path from content knowledge to evidence sharing was positive and > 0.1 in two studies, as well as statistically significant in one of them; therefore, Hypothesis 1b is also supported. In contrast, the path from content knowledge to hypotheses sharing was indeed also positive in two studies, but as neither was statistically significant, we conclude that Hypothesis 1c was not supported. The path from collaboration knowledge to evidence elicitation was positive and > 0.1 in only one study, but also not statistically significant. Thus, we found that Hypothesis 2a was not supported. For the path from collaboration knowledge to evidence sharing, we found relevant positive and statistically significant coefficients in all three studies. Hypothesis 2b is therefore fully supported by the data. This is not the case for Hypothesis 2c, for which we found no coefficient > 0.1 for the path from collaboration knowledge to hypotheses sharing. For the path from social skills to evidence elicitation, we found positive coefficients > 0.1 in two out of three studies, of which one was also statistically significant. Thus, we consider Hypothesis 3a to be supported. For the path from social skills to evidence sharing, we again found one statistically significant positive coefficient, but in the other two studies it was < 0.1. Therefore, we do not consider Hypothesis 3b to be supported by the data. The same applies to the path from social skills to hypotheses sharing, where the coefficient is < 0.1 in two studies. We therefore do not consider Hypothesis 3c to be supported.

The path from evidence elicitation to diagnostic accuracy was statistically significant and large in magnitude in two out of three studies. Hypothesis 4a is therefore supported. The path from evidence elicitation to diagnostic justification was only positive and > 0.1 in one study, and that coefficient was not statistically significant. Therefore, we find no support for Hypothesis 4b. In contrast, the path from evidence elicitation to diagnostic efficiency was positive and statistically significant in two out of three studies, with one large effect. Hypothesis 4c is therefore supported. The path from evidence sharing to diagnostic accuracy was only positive and reasonably large in one study. Therefore, we do not find support for Hypothesis 5a. The path from evidence sharing to diagnostic justification was positive and > 0.1 in two studies as well as statistically significant in one of them, so Hypothesis 5b is supported. In contrast, we did not find a positive coefficient > 0.1 for the path from evidence sharing to diagnostic efficiency. Therefore, Hypothesis 5c is not supported by the data. Although we found coefficients > 0.1 in two studies for the path from hypotheses sharing to diagnostic accuracy, we found no support for Hypothesis 6a, as neither of these was statistically significant. This is different for Hypothesis 6b, as we found two positive paths from hypotheses sharing to diagnostic justification, one of which was statistically significant and large. Finally, we found two positive paths from hypotheses sharing to diagnostic efficiency across the three studies, one of which was statistically significant. Hypothesis 6c is therefore supported.

Indirect Effects between Individual Characteristics, CDA, and Diagnostic Outcome

Indirect effects of CDAs on the effect of individual characteristics on the diagnostic outcome in CDR were estimated to test Hypotheses 7–15. Although we found a mediating effect of all CDAs (β = .31, p = .008), and specifically of evidence elicitation (β = .27, p = .021), on the effect of content knowledge on diagnostic accuracy in study C, as well as some significant overall and direct effects for other relationships (Appendix Table 9), none of these were consistent across all of the studies. Thus, we conclude that there is no consistent support for any of Hypotheses 7–15.

The aim of the current study was to investigate the extent to which the relationships specified in the CDR model (Radkowitsch et al., 2022 ) are applicable across studies, to better understand the processes underlying CDR in knowledge-rich domains. Not only is this exploration crucial for the medical field or collaborative problem-solving in knowledge-rich domains, but it also offers valuable insights for computer-supported collaborative learning research. Despite CDR’s specific focus, the principles and findings have relevant implications for understanding and enhancing collaborative processes in various educational and professional settings.

Specifically, we investigated how individual learner characteristics, the CDAs, and the diagnostic outcome are related. We therefore analyzed data from three independent studies, all from the same context, a simulation-based environment in the medical domain. Our study found positive relationships between content knowledge and the quality of evidence elicitation as well as the quality of evidence sharing, but not for the quality of hypotheses sharing. Furthermore, collaboration knowledge is positively related to the quality of evidence sharing, but not to the quality of evidence elicitation and the quality of hypotheses sharing. Social skills are only positively related to the quality of evidence elicitation. This underscores the multifaceted nature of collaborative problem-solving situations. Thus, effective CDR, a form of collaborative problem-solving, necessitates a nuanced understanding of the interplay between individual characteristics and CDAs.

The relevance of content knowledge for diagnostic competence is well established in research (Chernikova et al., 2020). To develop diagnostic skills in knowledge-rich domains, learners need to acquire large amounts of knowledge and restructure it through experience with problem-solving procedures and routines (Boshuizen et al., 2020). In the case of CDR, this enables the diagnostician to come up with an initial suspected diagnosis, which is likely to be relevant information for the collaboration partner and to guide the further CDAs effectively. The finding that content knowledge relates most clearly to the quality of evidence elicitation can be explained by the fact that evidence elicitation is the least transactive CDA within the collaborative decision-making process. When eliciting evidence, the collaboration partner is used as an external knowledge resource (Weinberger & Fischer, 2006). So, despite being a collaborative activity, evidence elicitation is about what information from the collaboration partner is needed rather than what the collaboration partner needs. Thus, elicitation is less transactive than sharing, which is focused on what the collaboration partner needs.

Not only content knowledge but also collaboration knowledge is related to the quality of evidence sharing. This finding implies that collaboration knowledge may influence the CDR above and beyond individual content knowledge. It also supports the differentiation of knowledge types made in the CDR model (Radkowitsch et al., 2022 ). Thus, it is important to learn not only the conceptual and strategic medical knowledge that is required for diagnosing but also knowledge about what information is relevant for specific collaboration partners when diagnosing collaboratively. This finding underpins the importance of being aware of the knowledge distribution among collaboration partners and the relevance of the transactive memory (Wegner, 1987 ). Thus, for collaborative problem-solving in knowledge-rich domains—as for computer-supported collaborative learning more generally—knowledge and information awareness is crucial (Engelmann & Hesse, 2010 ).

Thus, the relevance of collaboration knowledge in collaborative problem-solving is an important finding of our study, highlighting that it is critical in facilitating effective collaborative processes and outcomes. The current findings emphasize the need for educational strategies that explicitly target the development of collaborative knowledge to ensure that learners have the knowledge and skills necessary to participate in productive collaborative problem-solving and computer-supported collaborative learning processes. In doing so, the CDR model emphasizes the need for learners to master collaborative skills and build shared problem representations to take full advantage of collaborative learning opportunities.

As CDR is conceptualized to be an interplay of cognitive and social skills (Hesse et al., 2015 ), we also assumed that social skills are related to CDAs. However, we only found evidence of the expected relationship between social skills and CDAs for the quality of evidence elicitation. One explanation could be that collaboration knowledge was relatively high in all three samples, outweighing the influences of general skills. This is consistent with the assumption of the CDR model that the influence of more general social skills is reduced with an increasing level of professional collaboration knowledge (Radkowitsch et al., 2022 ). When collaboration knowledge is available to the diagnosticians, it becomes more important than social skills. This finding again underlines the importance of collaboration knowledge, which can be seen as a domain- and profession-specific development of social skills. However, another explanation could be that, when collaborating with an agent, the effect of social skills decreases, as the agent was not programmed to respond to social nuances. The design of the simulation would thus buffer against the effect of social skills. Although the study by Herborn et al. ( 2020 ) found no differences between human-to-human and human-to-agent collaboration, this does not necessarily invalidate the potential variability in outcomes associated with the social skills incorporated into the agent. For a thorough investigation into the impact of social skills, the agent would need variable social abilities, enabling the variation of the importance of basic social skills for successful collaboration.

Further, we need to conclude that there is no support for a relationship between the individual characteristics and hypotheses sharing, as we found no stable support for the relationship between any of the individual characteristics and the quality of hypotheses sharing. One possible explanation could be that the binary precision measure used to operationalize quality in hypotheses sharing is not sensitive enough or is not capturing the relevant aspect of quality in that activity. Another explanation could be that there is no direct relationship between the individual characteristics and hypotheses sharing, as this relationship is mediated by evidence sharing and thus influenced by the activated knowledge scripts (Schmidt & Rikers, 2007 ).

Looking at the relationships between CDAs and the diagnostic outcome, the current results highlight the need to distinguish between primary (diagnostic accuracy) and secondary (diagnostic justification and efficiency) outcomes of diagnostic reasoning (Daniel et al., 2019 ). Achieving diagnostic accuracy, a purely quantitative outcome measure, is less transactive than other aspects of the diagnostic outcome. This is also where we find the link to evidence elicitation, as we consider this to be the least transactive CDA within the collaborative decision-making process. However, the ability to justify and reach this decision efficiently is then highly dependent on evidence sharing and hypotheses sharing, activities that are more focused on transactivity within CDR (Weinberger & Fischer, 2006 ).

Although individual learner characteristics are found to have an effect on CDAs, and CDAs impact the diagnostic outcome, the effect is not mediated by CDAs across studies. Thus, we assume that, for effective collaborative problem-solving in knowledge-rich domains, such as CDR, it is not enough to have sufficient content and collaboration knowledge; it is also necessary to be able to engage in high quality CDAs to achieve a high-quality diagnostic outcome. This is consistent with research on individual diagnostic reasoning, which shows that diagnostic activities have a unique contribution to the diagnostic outcome after controlling for content knowledge (Fink et al., 2023 ).

In summary, we explored evidence elicitation, evidence sharing, and hypotheses sharing as crucial CDAs. The findings revealed diverse associations of these CDAs with individual characteristics and facets of the diagnostic outcome, supporting the notion that the CDR process involves a variety of different skills rather than one overarching skill. On the basis of these results, we propose categorizing CDAs into activities primarily focused on individual goals and needs (e.g., elicitation) and more transactive activities directly targeted at the collaborator (e.g., sharing). To enhance the quality of CDAs, instructional support should be considered. For instance, providing learners with an adaptive collaboration script has been shown to improve the quality of evidence sharing and to promote the internalization of collaboration scripts, fostering the development of collaboration knowledge (Radkowitsch et al., 2021). Further, group awareness tools, such as shared concept maps, could compensate for deficits in one’s collaboration knowledge (Engelmann & Hesse, 2010). However, what is required to engage in high-quality CDAs remains an open question. One starting point is domain-general cognitive skills, which could influence CDAs particularly in the early stages of skill development (Hetmanek et al., 2018). Previous research showed that, in diagnostic reasoning, instructional support is more beneficial when it is domain-specific rather than domain-general (Schons et al., 2022). Thus, there is still a need for further research on what such instructional support might look like.

Future Research

Although we used data from three studies, all of them were in the same domain; thus, it remains an open question whether these findings are applicable across domains. The CDR model claims that the described relationships are not limited to the medical domain, but rather hold for collaboratively solving complex problems in knowledge-rich domains in general. Future research should explore generalizability, for example, in teacher education, a distinct field that also requires diagnosing and complex problem-solving (Heitzmann et al., 2019).

Regardless of domain, the non-mediating role of CDAs between individual characteristics and diagnostic outcomes, together with the effects of the CDAs found in the current study, suggests that an isolated analysis of CDAs does not fully represent the complex interactions and relationships among activities, individual characteristics, and diagnostic outcomes. Future studies might assess CDAs as a bundle of necessary activities, including a focus on their possible non-linear interactions. We propose using process data analysis to account for the inherent complexity of the data, as different activities in different sequences can lead to the same outcome (Y. Chen et al., 2019). More exploratory analyses of fine-grained, theory-based sequence data are needed to provide insights into the more general and more specific processes involved in successfully solving complex problems collaboratively (Stadler et al., 2020).
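As one illustration of the kind of process data analysis proposed here, first-order transition frequencies between coded activities can be tallied across logged sequences. A minimal sketch — the activity codes and sequences below are hypothetical, not the studies' data:

```python
from collections import Counter

# Hypothetical coded activity sequences from process logs
# (EE = evidence elicitation, ES = evidence sharing, HS = hypotheses sharing).
sequences = [
    ["EE", "EE", "HS", "ES"],
    ["EE", "HS", "EE", "ES"],
    ["HS", "EE", "ES", "ES"],
]

def transition_counts(seqs):
    """Count first-order transitions (pairs of consecutive activities)."""
    counts = Counter()
    for seq in seqs:
        counts.update(zip(seq, seq[1:]))
    return counts

counts = transition_counts(sequences)
print(counts[("EE", "HS")])  # how often elicitation is directly followed by hypothesis sharing
```

Such transition tables are merely descriptive input to, e.g., event history or sequence mining analyses; by themselves they say nothing about outcome quality.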

As our results have shown, collaboration knowledge, and thus awareness of the knowledge distribution among collaboration partners, is highly relevant. While a recent meta-analysis showed a moderate effect of group awareness on students' performance in computer-supported collaborative learning (D. Chen et al., 2024), group awareness has so far not been systematically investigated in collaborative problem-solving. Thus, more research on the influence of collaboration knowledge in collaborative problem-solving is needed.

Further, additional factors associated with success in collaborative problem-solving—not yet incorporated into the model and thus not yet investigated systematically—include communication skills (OECD, 2017 ), the self-concept of problem-solving ability (Scalise et al., 2016 ), and positive activating emotions during problem-solving tasks (Camacho-Morles et al., 2019 ).

Limitations

There are, however, some limitations to be considered. One is that we have only considered CDAs and how they relate to individual characteristics and outcomes. However, the CDR model also introduces individual diagnostic activities, such as the generation of evidence and the drawing of conclusions. These occur before and after the CDAs and may therefore also have an impact on the described relationships. We nevertheless decided to focus on the CDAs within the CDR process because they are particularly relevant for constructing a shared problem representation, which is central to CDR. Future research might consider these individual diagnostic activities, as they could, for example, further explain how content knowledge is related to the diagnostic outcome.

Another limitation of the current analyses is the operationalization of quality for the CDAs. We operationalized the quality of evidence elicitation as the appropriateness of the radiological examination for the indicated diagnosis, and the quality of evidence sharing and hypotheses sharing as their precision. However, each of these sheds light on only one aspect of the respective activity, while possibly obscuring others. For example, content knowledge may be unrelated to the precision of hypotheses sharing but related to other quality indicators, such as sensitivity or specificity. We nevertheless chose the precision aspect of the activities, as research shows that collaborators often fail to identify relevant information and that the sheer amount of information is not related to performance (Tschan et al., 2009). Future research may explore a broader variety of quality indicators to assess the quality of CDAs as comprehensively as possible. It should also be noted that in study B a suppression effect (Horst, 1941) between hypothesis sharing and evidence elicitation artificially inflated the observed effect size. This is to be expected with process data, which can be highly correlated, and needs to be considered when interpreting the effect sizes.
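The suppression effect mentioned above can be made concrete with simulated data: a predictor that is nearly unrelated to the criterion but shares variance with another predictor can inflate that predictor's regression weight once both are entered jointly. A sketch under these assumptions — all data are simulated and purely illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

signal = rng.normal(size=n)             # criterion-relevant part of x1
noise = rng.normal(size=n)              # criterion-irrelevant part of x1
x1 = signal + noise
x2 = noise + 0.1 * rng.normal(size=n)   # suppressor: tracks x1's noise, not y
y = signal + rng.normal(size=n)

def slopes(y, *predictors):
    """OLS slopes (intercept dropped) via least squares."""
    X = np.column_stack([np.ones(n), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

b_alone = slopes(y, x1)[0]      # x1 alone: close to 0.5 analytically
b_joint = slopes(y, x1, x2)[0]  # with the suppressor partialled in: close to 1.0
print(b_alone, b_joint)
```

Here x2 "cleans" the irrelevant variance out of x1, so x1's weight roughly doubles even though x2 itself barely correlates with y — the kind of artificial inflation that must be kept in mind with highly correlated process data.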

In addition, it should be noted that the omega values obtained for the conceptual and strategic knowledge measures were below the commonly accepted threshold of 0.7. We chose omega as the more appropriate measure of reliability in our context, given the complex and multifaceted nature of the knowledge constructs, but these lower-than-expected values raise important questions about the quality of the data and the robustness of the findings. It is important to understand that knowledge constructs, by their very nature, may not always exhibit high levels of internal consistency because of the diverse and interrelated components they encompass (Edelsbrunner, 2024; Stadler et al., 2021; Taber, 2018). This complexity may be reflected in the moderate omega values observed; while seemingly counterintuitive, this does not invalidate the potential of the constructs to account for substantial variance in related outcomes. Nevertheless, findings related to these constructs should be interpreted with caution, and the results presented should be considered tentative. Future research should further explore the implications of using different reliability coefficients in assessing complex constructs within the learning sciences, potentially providing deeper insights into the nuanced nature of knowledge and its measurement.
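For readers unfamiliar with the coefficient: under a single-factor model with standardized indicators and uncorrelated residuals, McDonald's omega is (Σλ)² / ((Σλ)² + Σ(1 − λ²)), where λ are the standardized loadings. A minimal sketch with hypothetical loadings (not the study's estimates) shows why a heterogeneous construct with modest loadings can fall below the 0.7 threshold while a homogeneous scale does not:

```python
# McDonald's omega under a unidimensional factor model with standardized
# indicators and uncorrelated residuals:
#   omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum(1 - loading^2))
def mcdonalds_omega(loadings):
    common = sum(loadings) ** 2
    residual = sum(1 - l ** 2 for l in loadings)
    return common / (common + residual)

# Hypothetical loadings (not the study's estimates):
low = mcdonalds_omega([0.45, 0.50, 0.40, 0.35])   # heterogeneous construct
high = mcdonalds_omega([0.80, 0.85, 0.75, 0.80])  # homogeneous scale
print(round(low, 2), round(high, 2))  # → 0.47 0.88
```

The low-omega scale can still predict outcomes well; omega only summarizes how much of the composite's variance is attributable to the common factor.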

Another limitation of this study is related to the agent-based collaboration, as a predictive validation of collaborative problem-solving for later human-to-human collaboration in comparable contexts has not yet been systematically conducted. Although the agent-based collaboration situation used has been validated in terms of perceived authenticity, it still does not fully correspond to a real collaboration situation (Rosen, 2015). This could explain the low influence of social skills, as the setting might not require the application of a broad set of social skills (Hesse et al., 2015; Radkowitsch et al., 2020). In a real-life collaboration, the effects of social skills might be more pronounced. However, research showed that in the 2015 PISA study the human-to-agent approach did not lead to results in collaborative problem-solving different from those of the human-to-human approach, and correlations with other measures of collaborative skills have been found (Herborn et al., 2020; Stadler, Herborn et al., 2020). Future studies should specifically test the relevance of social skills for CDR in a human-to-human setting to strengthen the generalizability of our findings.

In conclusion, the current study highlights the importance of individual characteristics and CDAs as independent predictors of good diagnoses in collaborative contexts, at least in the simulation-based settings used in the studies included in our analysis. Collaboration knowledge emerged as a critical factor, demonstrating its importance over early acquired, general social skills. Therefore, it is imperative to revise the CDR approach by giving higher priority to proficiency in collaboration knowledge than to social skills. Furthermore, we conclude that, in simulation-based CDR, content knowledge does not play as crucial a role in predicting diagnostic success as it does in many other educational settings, most probably because of the ample opportunities for retrying and revising in simulation-based learning environments.

With respect to CDAs, we suggest refining the perspective on their quality and considering a revision of the CDR model that summarizes CDAs as information elicitation and information sharing, with the former being less transactive, and thus less demanding, than the latter. Adequate performance in both types of CDA is presumed to result in a high-quality shared problem representation and, in turn, a good diagnostic outcome. Collaborative problem-solving skills are highly relevant in the professional practice of knowledge-rich domains, highlighting the need to strengthen these skills in students engaged in CDR and to provide learning opportunities accordingly. Further, the ability to collaborate effectively and construct shared problem representations is important not only in CDR but also in collaborative problem-solving and computer-supported collaborative learning more generally, highlighting the need to integrate such skills into curricula and instructional design.

By emphasizing these aspects, we can improve the diagnostic skills of individuals in collaborative settings. Through advancing our understanding of CDR, we are taking a key step forward in optimizing collaborative problem-solving and ultimately contributing to improved diagnostic outcomes in various professional domains beyond CDR in medical education. In particular, integrating collaboration knowledge and skills into computer-supported collaborative learning environments can enrich learning experiences and outcomes in various knowledge-rich domains.

Please note that the data employed in this study have been used in previous publications (e.g., Brandl et al., 2021; Radkowitsch et al., 2021; Richters et al., 2022). However, the research question and the results reported here are unique to this study.

Abele, S. (2018). Diagnostic problem-solving process in professional contexts: theory and empirical investigation in the context of car mechatronics using computer-generated log-files. Vocations and Learning, 11 (1), 133–159. https://doi.org/10.1007/s12186-017-9183-x


Asparouhov, T., & Muthén, B. (2018). SRMR in Mplus . https://www.statmodel.com/download/SRMR2.pdf

Bauer, E., Sailer, M., Kiesewetter, J., Fischer, M. R., & Fischer, F. (2022). Diagnostic argumentation in teacher education: Making the case for justification, disconfirmation, and transparency. Frontiers in Education , 7 , Article 977631. https://doi.org/10.3389/feduc.2022.977631

Boshuizen, H. P., Gruber, H., & Strasser, J. (2020). Knowledge restructuring through case processing: the key to generalise expertise development theory across domains? Educational Research Review, 29 , 100310. https://doi.org/10.1016/j.edurev.2020.100310

Brandl, L., Richters, C., Radkowitsch, A., Obersteiner, A., Fischer, M. R., Schmidmaier, R., Fischer, F., & Stadler, M. (2021). Simulation-based learning of complex skills: Predicting performance with theoretically derived process features. Psychological Test and Assessment Modeling, 63 (4), 542–560. https://www.psychologie-aktuell.com/fileadmin/Redaktion/Journale/ptam-2021-4/PTAM__4-2021_6_kor.pdf

Braun, L. T., Zottmann, J. M., Adolf, C., Lottspeich, C., Then, C., Wirth, S., Fischer, M. R., & Schmidmaier, R. (2017). Representation scaffolds improve diagnostic efficiency in medical students. Medical Education, 51 (11), 1118–1126. https://doi.org/10.1111/medu.13355

Camacho-Morles, J., Slemp, G. R., Oades, L. G., Morrish, L., & Scoular, C. (2019). The role of achievement emotions in the collaborative problem-solving performance of adolescents. Learning and Individual Differences, 70 , 169–181. https://doi.org/10.1016/j.lindif.2019.02.005

Chen, D., Zhang, Y., Luo, H., Zhu, Z., Ma, J., & Lin, Y. (2024). Effects of group awareness support in CSCL on students’ learning performance: a three-level meta-analysis. International Journal of Computer-Supported Collaborative Learning, 19 (1), 97–129. https://doi.org/10.1007/s11412-024-09418-3

Chen, Y., Li, X., Liu, J., & Ying, Z. (2019). Statistical analysis of complex problem-solving process data: an event history analysis approach. Frontiers in Psychology , 10 , Article 486. https://doi.org/10.3389/fpsyg.2019.00486

Chernikova, O., Heitzmann, N., Fink, M. C., Timothy, V., Seidel, T., & Fischer, F. (2020). Facilitating diagnostic competences in higher education—a meta-analysis in medical and teacher education. Educational Psychology Review, 32 (1), 157–196. https://doi.org/10.1007/s10648-019-09492-2

Chernikova, O., Heitzmann, N., Opitz, A., Seidel, T., & Fischer, F. (2022). A theoretical framework for fostering diagnostic competences with simulations in higher education. In F. Fischer & A. Opitz (Eds.),  Learning to Diagnose with Simulations . Springer, Cham. https://doi.org/10.1007/978-3-030-89147-3_2

Cook, D. A., Brydges, R., Zendejas, B., Hamstra, S. J., & Hatala, R. M. (2013). Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Academic Medicine: Journal of the Association of American Medical Colleges, 88 (6), 872–883. https://doi.org/10.1097/ACM.0b013e31828ffdcf

Daniel, M., Rencic, J., Durning, S. J., Holmboe, E. S., Santen, S. A., Lang, V., Ratcliffe, T., Gordon, D., Heist, B., Lubarsky, S., Estrada, C. A., Ballard, T., Artino, A. R., Da Sergio Silva, A., Cleary, T., Stojan, J., & Gruppen, L. D. (2019). Clinical reasoning assessment methods: a scoping review and practical guidance. Academic Medicine: Journal of the Association of American Medical Colleges, 94 (6), 902–912. https://doi.org/10.1097/ACM.0000000000002618

Dunbar, K. (1995). How scientists really reason: scientific reasoning in real-world laboratories. In R. J. Sternberg & J. E. Davidson (Eds.), The nature of insight (pp. 365–395). MIT Press.


Edelsbrunner, P. A. (2024). Does interference between intuitive conceptions and scientific concepts produce reliable inter-individual differences? Science & Education. Advance online publication. https://doi.org/10.1007/s11191-024-00500-8


Engelmann, T., & Hesse, F. W. (2010). How digital concept maps about the collaborators’ knowledge and information influence computer-supported collaborative problem solving. International Journal of Computer-Supported Collaborative Learning, 5 (3), 299–319. https://doi.org/10.1007/s11412-010-9089-1

Fink, M. C., Heitzmann, N., Reitmeier, V., Siebeck, M., Fischer, F., & Fischer, M. R. (2023). Diagnosing virtual patients: the interplay between knowledge and diagnostic activities. Advances in Health Sciences Education: Theory and Practice, 1–20. https://doi.org/10.1007/s10459-023-10211-4

Fiore, S. M., Graesser, A. C., & Greiff, S. (2018). Collaborative problem-solving education for the twenty-first-century workforce. Nature Human Behaviour, 2 (6), 367–369. https://doi.org/10.1038/s41562-018-0363-y

Fischer, F., Kollar, I., Stegmann, K., & Wecker, C. (2013). Toward a script theory of guidance in computer-supported collaborative learning. Educational Psychologist, 48 (1), 56–66. https://doi.org/10.1080/00461520.2012.748005

Fischer, F., Kollar, I., Ufer, S., Sodian, B., Hussmann, H., Pekrun, R., Neuhaus, B. J., Dorner, B., Pankofer, S., Fischer, M. R., Strijbos, J.‑W., Heene, M., & Eberle, J. (2014). Scientific reasoning and argumentation: advancing an interdisciplinary research agenda in education. Frontline Learning Research , 2 (3), 28–45. https://doi.org/10.14786/flr.v2i2.96

Fischer, M. R., Kopp, V., Holzer, M., Ruderich, F., & Jünger, J. (2005). A modified electronic key feature examination for undergraduate medical students: validation threats and opportunities. Medical Teacher, 27 (5), 450–455. https://doi.org/10.1080/01421590500078471

Förtsch, C., Sommerhoff, D., Fischer, F., Fischer, M. R., Girwidz, R., Obersteiner, A., Reiss, K., Stürmer, K., Siebeck, M., Schmidmaier, R., Seidel, T., Ufer, S., Wecker, C., & Neuhaus, B. J. (2018). Systematizing professional knowledge of medical doctors and teachers: development of an interdisciplinary framework in the context of diagnostic competences. Education Sciences, 8 (4), 207. https://doi.org/10.3390/educsci8040207

Goldhammer, F., Naumann, J., Rölke, H., Stelter, A., & Tóth, K. (2017). Relating product data to process data from computer-based competency assessment. In D. Leutner, J. Fleischer, J. Grünkorn, & E. Klieme (Eds.), Methodology of educational measurement and assessment. competence assessment in education (pp. 407–425). Springer International Publishing. https://doi.org/10.1007/978-3-319-50030-0_24

Graesser, A. C., Fiore, S. M., Greiff, S., Andrews-Todd, J., Foltz, P. W., & Hesse, F. W. (2018). Advancing the science of collaborative problem solving. Psychological Science in the Public Interest: A Journal of the American Psychological Society, 19 (2), 59–92. https://doi.org/10.1177/1529100618808244

Hautz, W. E., Kämmer, J. E., Schauber, S. K., Spies, C. D., & Gaissmaier, W. (2015). Diagnostic performance by medical students working individually or in teams. JAMA, 313 (3), 303–304. https://doi.org/10.1001/jama.2014.15770

Heitzmann, N., Seidel, T., Hetmanek, A., Wecker, C., Fischer, M. R., Ufer, S., Schmidmaier, R., Neuhaus, B. J., Siebeck, M., Stürmer, K., Obersteiner, A., Reiss, K., Girwidz, R., Fischer, F., & Opitz, A. (2019). Facilitating diagnostic competences in simulations in higher education: a framework and a research agenda. Frontline Learning Research , 1–24. https://doi.org/10.14786/flr.v7i4.384

Herborn, K., Stadler, M., Mustafić, M., & Greiff, S. (2020). The assessment of collaborative problem solving in PISA 2015: can computer agents replace humans? Computers in Human Behavior, 104 , 105624. https://doi.org/10.1016/j.chb.2018.07.035

Hesse, F. W., Care, E., Buder, J., Sassenberg, K., & Griffin, P. (2015). A framework for teachable collaborative problem solving skills. In P. Griffin & E. Care (Eds.), Assessment and Teaching of 21 st Century Skills (pp. 37–56). Springer.


Hetmanek, A., Engelmann, K., Opitz, A., & Fischer, F. (2018). Beyond intelligence and domain knowledge. In F. Fischer, C. A. Chinn, K. Engelmann, & J. Osborne (Eds.), Scientific reasoning and argumentation (pp. 203–226). Routledge. https://doi.org/10.4324/9780203731826-12

Hilbert, S., & Stadler, M. (2017). Structural equation models. In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of Personality and Individual differences (pp. 1–9). Springer International Publishing. https://doi.org/10.1007/978-3-319-28099-8_1285-1

Hitchcock, D. (2005). Good reasoning on the Toulmin model. Argumentation, 19 (3), 373–391. https://doi.org/10.1007/s10503-005-4422-y

Horst, P. (1941). The prediction of personal adjustment. Social Science Research Council Bulletin, 48 , 431–436.

Jeong, H., Hmelo-Silver, C. E., & Jo, K. (2019). Ten years of computer-supported collaborative learning: a meta-analysis of CSCL in STEM education during 2005–2014. Educational Research Review, 28 , 100284. https://doi.org/10.1016/j.edurev.2019.100284

Kiesewetter, J., Fischer, F., & Fischer, M. R. (2017). Collaborative clinical reasoning—a systematic review of empirical studies. The Journal of Continuing Education in the Health Professions, 37 (2), 123–128. https://doi.org/10.1097/CEH.0000000000000158

Kiesewetter, J., Sailer, M., Jung, V. M., Schönberger, R., Bauer, E., Zottmann, J. M., Hege, I., Zimmermann, H., Fischer, F., & Fischer, M. R. (2020). Learning clinical reasoning: how virtual patient case format and prior knowledge interact. BMC Medical Education, 20 (1), 73–83. https://doi.org/10.1186/s12909-020-1987-y

Klahr, D., & Dunbar, K. (1988). Dual space search during scientific reasoning. Cognitive Science, 12 (1), 1–48. https://doi.org/10.1207/s15516709cog1201_1

Koschmann, T. D., Feltovich, P. J., Myers, A. C., & Barrows, H. S. (1992). Implications of CSCL for problem-based learning. ACM SIGCUE Outlook, 21 (3), 32–35. https://doi.org/10.1145/130893.130902

Liu, L., Hao, J., Davier, A. A. von, Kyllonen, P., & Zapata-Rivera, J.‑D. (2016). A tough nut to crack: measuring collaborative problem solving. In J. Keengwe, Y. Rosen, S. Ferrara, & M. Mosharraf (Eds.), Handbook of Research on Technology Tools for Real-World Skill Development (pp. 344–359). IGI Global. https://doi.org/10.4018/978-1-4666-9441-5.ch013

Muthén, L. K., & Muthén, B. O. (2017). Mplus: Statistical Analysis with Latent Variables: User’s Guide (Version 8) [Computer software]. Authors.

Nachtigall, C., Kroehne, U., Funke, F., & Steyer, R. (2003). (Why) should we use SEM? Pros and cons of structural equation modeling. Methods of Psychological Research Online, 8 (2), 1–22.

Noroozi, O., Biemans, H. J., Weinberger, A., Mulder, M., & Chizari, M. (2013). Scripting for construction of a transactive memory system in multidisciplinary CSCL environments. Learning and Instruction, 25 , 1–12. https://doi.org/10.1016/j.learninstruc.2012.10.002

OECD. (2017). PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving, revised edition. PISA, OECD Publishing. https://doi.org/10.1787/9789264281820-en

Pickal, A. J., Engelmann, K., Chinn, C. A., Girwidz, R., Neuhaus, B. J., & Wecker, C. (2023). Fostering the collaborative diagnosis of cross-domain skills in video-based simulations. In Proceedings of the 16th International Conference on Computer-Supported Collaborative Learning — CSCL 2023 (pp. 139–146). International Society of the Learning Sciences. https://doi.org/10.22318/cscl2023.638463

Radkowitsch, A., Fischer, M. R., Schmidmaier, R., & Fischer, F. (2020). Learning to diagnose collaboratively: Validating a simulation for medical students. GMS Journal for Medical Education, 37(5), Doc51. https://doi.org/10.3205/zma001344

Radkowitsch, A., Sailer, M., Fischer, M. R., Schmidmaier, R., & Fischer, F. (2022). Diagnosing collaboratively: A theoretical model and a simulation-based learning environment. In F. Fischer & A. Opitz (Eds.),  Learning to Diagnose with Simulations . Springer, Cham. https://doi.org/10.1007/978-3-030-89147-3_10

Radkowitsch, A., Sailer, M., Schmidmaier, R., Fischer, M. R., & Fischer, F. (2021). Learning to diagnose collaboratively—effects of adaptive collaboration scripts in agent-based medical simulations. Learning and Instruction, 75 , 101487. https://doi.org/10.1016/j.learninstruc.2021.101487

Richters, C., Stadler, M., Radkowitsch, A., Behrmann, F., Weidenbusch, M., Fischer, M. R., Schmidmaier, R., & Fischer, F. (2022). Making the rich even richer? Interaction of structured reflection with prior knowledge in collaborative medical simulations. In A. Weinberger, W. Chen, D. Hernández-Leo, & B. Chen (Eds.), International Society of the Learning Sciences. Hiroshima, Japan.

Roschelle, J., & Teasley, S. (1995). The construction of shared knowledge in collaborative problem solving. In C. O’Malley (Ed.), Computer-Supported Collaborative Learning (pp. 66–97). Springer.

Rosen, Y. (2015). Computer-based assessment of collaborative problem solving: exploring the feasibility of human-to-agent approach. International Journal of Artificial Intelligence in Education, 25 (3), 380–406. https://doi.org/10.1007/s40593-015-0042-3

Savalei, V., & Rhemtulla, M. (2013). The performance of robust test statistics with categorical data. British Journal of Mathematical and Statistical Psychology, 66 (2), 201–223. https://doi.org/10.1111/j.2044-8317.2012.02049.x

Scalise, K., Mustafic, M., & Greiff, S. (2016). Dispositions for collaborative problem solving. In S. Kuger, E. Klieme, N. Jude, & D. Kaplan (Eds.), Methodology of Educational Measurement and Assessment. Assessing Contexts of Learning (pp. 283–299). Springer International Publishing. https://doi.org/10.1007/978-3-319-45357-6_11

Schmidt, H. G., & Mamede, S. (2015). How to improve the teaching of clinical reasoning: a narrative review and a proposal. Medical Education, 49 (10), 961–973. https://doi.org/10.1111/medu.12775

Schmidt, H. G., & Rikers, R. M. J. P. (2007). How expertise develops in medicine: knowledge encapsulation and illness script formation. Medical Education, 41 (12), 1133–1139. https://doi.org/10.1111/j.1365-2923.2007.02915.x

Schons, C., Obersteiner, A., Reinhold, F., Fischer, F., & Reiss, K. (2022). Developing a simulation to foster prospective mathematics teachers’ diagnostic competencies: the effects of scaffolding. Journal für Mathematik-Didaktik. Advance online publication. https://doi.org/10.1007/s13138-022-00210-0

Stadler, M., Herborn, K., Mustafić, M., & Greiff, S. (2020). The assessment of collaborative problem solving in PISA 2015: an investigation of the validity of the PISA 2015 CPS tasks. Computers & Education, 157 , 103964. https://doi.org/10.1016/j.compedu.2020.103964

Stadler, M., Hofer, S., & Greiff, S. (2020). First among equals: log data indicates ability differences despite equal scores. Computers in Human Behavior, 111 , 106442. https://doi.org/10.1016/j.chb.2020.106442

Stadler, M., Sailer, M., & Fischer, F. (2021). Knowledge as a formative construct: a good alpha is not always better. New Ideas in Psychology , 60. https://doi.org/10.1016/j.newideapsych.2020.100832

Stark, R., Kopp, V., & Fischer, M. R. (2011). Case-based learning with worked examples in complex domains: two experimental studies in undergraduate medical education. Learning and Instruction, 21 (1), 22–33. https://doi.org/10.1016/j.learninstruc.2009.10.001

Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased information sampling during discussion. Journal of Personality and Social Psychology, 48 (6), 1467–1478.

Taber, K. S. (2018). The use of Cronbach’s alpha when developing and reporting research instruments in science education. Research in Science Education, 48 (6), 1273–1296. https://doi.org/10.1007/s11165-016-9602-2

Tschan, F., Semmer, N. K., Gurtner, A., Bizzari, L., Spychiger, M., Breuer, M., & Marsch, S. U. (2009). Explicit reasoning, confirmation bias, and illusory transactive memory: a simulation study of group medical decision making. Small Group Research, 40 (3), 271–300. https://doi.org/10.1177/1046496409332928

van Joolingen, W. R., & de Jong, T. (1997). An extended dual search space model of scientific discovery learning. Instructional Science, 25 (5), 307–346. https://doi.org/10.1023/A:1002993406499

Vogel, F., Wecker, C., Kollar, I., & Fischer, F. (2017). Socio-cognitive scaffolding with computer-supported collaboration scripts: a meta-analysis. Educational Psychology Review, 29 (3), 477–511. https://doi.org/10.1007/s10648-016-9361-7

Vogel, F., Weinberger, A., Hong, D., Wang, T., Glazewski, K., Hmelo-Silver, C. E., Uttamchandani, S., Mott, B., Lester, J., Oshima, J., Oshima, R., Yamashita, S., Lu, J., Brandl, L., Richters, C., Stadler, M., Fischer, F., Radkowitsch, A., Schmidmaier, R., . . . Noroozi, O. (2023). Transactivity and knowledge co-construction in collaborative problem solving. In Proceedings of the 16th International Conference on Computer-Supported Collaborative Learning — CSCL 2023 (pp. 337–346). International Society of the Learning Sciences. https://doi.org/10.22318/cscl2023.646214

Wegner, D. M. (1987). Transactive memory: a contemporary analysis of the group mind. In B. Mullen & G. R. Goethals (Eds.), Theories of Group Behavior (pp. 185–208). Springer New York. https://doi.org/10.1007/978-1-4612-4634-3_9

Weinberger, A., & Fischer, F. (2006). A framework to analyze argumentative knowledge construction in computer-supported collaborative learning. Computers & Education, 46 (1), 71–95. https://doi.org/10.1016/j.compedu.2005.04.003


Open Access funding enabled and organized by Projekt DEAL. The research presented in this contribution was funded by a grant from the Deutsche Forschungsgemeinschaft (DFG, FOR 2385) to Frank Fischer, Martin R. Fischer and Ralf Schmidmaier (FI 792/11-1 & FI 792/11-2).

Author information

Authors and Affiliations

Department of Psychology, Ludwig-Maximilians-Universität München, Leopoldstr. 13, 80802, Munich, Germany

Laura Brandl, Matthias Stadler, Constanze Richters & Frank Fischer

Institute of Medical Education, LMU University Hospital, Ludwig-Maximilians-Universität München, Munich, Germany

Matthias Stadler & Martin R. Fischer

IPN Leibniz Institute for Science and Mathematics Education, Department of Mathematics Education, Kiel, Germany

Anika Radkowitsch

Medizinische Klinik und Poliklinik IV, LMU University Hospital, Ludwig-Maximilians-Universität München, Munich, Germany

Ralf Schmidmaier


Corresponding author

Correspondence to Laura Brandl.

Ethics declarations

Conflict of Interest Statement

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

An initial version of this article was presented as a poster at ISLS 2024.

See Tables 7, 8 and 9.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Brandl, L., Stadler, M., Richters, C. et al. Collaborative Problem-Solving in Knowledge-Rich Domains: A Multi-Study Structural Equation Model. Intern. J. Comput.-Support. Collab. Learn (2024). https://doi.org/10.1007/s11412-024-09425-4


Received: 18 September 2023

Accepted: 09 May 2024

Published: 24 June 2024

DOI: https://doi.org/10.1007/s11412-024-09425-4


  • Collaborative Problem-solving
  • Simulation-based Learning Environment
  • Diagnostic Activities
  • Diagnostic Reasoning
  • Medical Education

Frontiers in Psychology

The Efficacy and Development of Students' Problem-Solving Strategies During Compulsory Schooling: Logfile Analyses

Gyöngyvér Molnár

1 Department of Learning and Instruction, University of Szeged, Szeged, Hungary

Benő Csapó

2 MTA-SZTE Research Group on the Development of Competencies, University of Szeged, Szeged, Hungary

The purpose of this study was to examine the role of the exploration strategies students used in the first phase of problem solving. The sample for the study was drawn from 3rd- to 12th-grade students (aged 9–18) in Hungarian schools (n = 4,371). Problems designed in the MicroDYN approach with different levels of complexity were administered to the students via the eDia online platform. Logfile analyses were performed to ascertain the impact of strategy use on the efficacy of problem solving. Students' exploration behavior was coded and clustered through latent class analyses. Several theoretically effective strategies were identified, including the vary-one-thing-at-a-time (VOTAT) strategy and its sub-strategies. The results of the analyses indicate that the use of a theoretically effective strategy, which extracts all the information required to solve the problem, did not always lead to high performance. Conscious VOTAT strategy users proved to be the best problem solvers, followed by non-conscious VOTAT strategy users and non-VOTAT strategy users. In the primary school sub-sample, six qualitatively different strategy class profiles were distinguished. The results shed new light on and provide a new interpretation of previous analyses of the processes involved in complex problem solving. They also highlight the importance of explicit enhancement of problem-solving skills and problem-solving strategies as a tool for knowledge acquisition in new contexts during and beyond school lessons.

Introduction

Computer-based assessment has presented new challenges and opportunities in educational research. A large number of studies have highlighted the importance and advantages of technology-based assessment over traditional paper-based testing (Csapó et al., 2012 ). Three main factors support and motivate the use of technology in educational assessment: (1) the improved efficiency and greater measurement precision in the already established assessment domains (e.g., Csapó et al., 2014 ); (2) the possibility of measuring constructs that would be impossible to measure by other means (e.g., Complex Problem Solving (CPS) 1 ; see Greiff et al., 2012 , 2013 ); and (3) the opportunity of logging and analyzing not only observed variables, but metadata as well (Lotz et al., 2017 ; Tóth et al., 2017 ; Zoanetti and Griffin, 2017 ). Analyzing logfiles may contribute to a deeper and better understanding of the phenomenon under examination. Logfile analyses can provide answers to research questions which could not be answered with traditional assessment techniques.

This study focuses on problem solving, especially on complex problem solving (CPS), which reflects higher-order cognitive processes. Previous research identified three different ways to measure CPS competencies: (1) Microworlds (e.g., Gardner and Berry, 1995 ), (2) formal frameworks (Funke, 2001 , 2010 ) and (3) minimal complex systems (Funke, 2014 ). In this paper, the focus is on the MicroDYN approach, which is a specific form of complex problem solving (CPS) in interactive situations using minimal complex systems (Funke, 2014 ). Recent analyses provide both a new theory and data-based evidence for a global understanding of different problem-solving strategies students employ or could employ in a complex problem-solving environment based on minimal complex systems.

The problem scenarios within the MicroDYN approach consist of a small number of variables and causal relations. From the perspective of the problem solver, solving a MicroDYN problem requires a sequence of continuous activities, in which the outcome of one activity is the input for the next. First, students interact with the simulated system, set values for the input variables, and observe the impacts of these settings on the target (dependent) variable. Then, they plot their conclusion about the causal relationships between the input and output variables on a graph (Phase 1). Next, they manipulate the independent variables again to set their values so that they result in the required values for the target variables (Phase 2).

When it comes to gathering information about a complex problem, as in the MicroDYN scenarios, there may be differences between the exploration strategies in terms of efficacy. Some of them may be more useful for generating knowledge about the system. Tschirgi ( 1980 ) identified different exploration strategies. When control of variables strategies (Greiff et al., 2014 ) were explored, findings showed that the vary-one-thing-at-a-time (VOTAT, Tschirgi, 1980 ; Funke, 2014 ) was the most effective strategy for identifying causal relations between the input and output variables in a minimal complex system (Fischer et al., 2012 ). Participants who employed this strategy tended to acquire more structural knowledge than those who used other strategies (Vollmeyer et al., 1996 ; Kröner et al., 2005 ). With the VOTAT strategy, the problem solver systematically varies only one input variable, while the others remain unchanged. This way, the effect of the variable that has just been changed can be observed directly by monitoring the changes in the output variables. There exist several types of VOTAT strategies.
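The VOTAT idea can be illustrated with a small coding sketch (hypothetical, not the authors' labeling system): a trial is represented by the values set for the input variables, and a trial isolates one input if that input is the only one set to a non-zero value; the all-zero trial corresponds to the PULSE strategy discussed below.

```python
# Sketch (not the authors' code): classifying exploration trials.
# A trial is a tuple of input settings, e.g. (-2, 0, 0) for three inputs.

def classify_trial(trial):
    """Return ('VOTAT', i) if exactly one input i is non-zero,
    'PULSE' if all inputs are zero, and 'OTHER' otherwise."""
    nonzero = [i for i, v in enumerate(trial) if v != 0]
    if not nonzero:
        return "PULSE"
    if len(nonzero) == 1:
        return ("VOTAT", nonzero[0])
    return "OTHER"

def isolates_all_inputs(trials):
    """True if every input variable was varied in isolation at least
    once during the exploration phase (a minimal VOTAT criterion)."""
    n_inputs = len(trials[0])
    isolated = {c[1] for c in map(classify_trial, trials)
                if isinstance(c, tuple)}
    return isolated == set(range(n_inputs))

print(classify_trial((0, 2, 0)))                                # ('VOTAT', 1)
print(isolates_all_inputs([(2, 0, 0), (0, -1, 0), (0, 0, 1)]))  # True
```

A sequence satisfying `isolates_all_inputs` lets the problem solver read off each input's direct effect separately, which is what makes VOTAT effective for identifying causal relations.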

Using this approach, in which the effectiveness of a strategy is defined on a conceptual level, independently of empirical effectiveness, we developed a labeling system and a mathematical model covering all theoretically effective strategies. Effectiveness was thus defined by the amount of information extracted: an exploration strategy was considered theoretically effective if it allowed the problem solver to extract all the information needed to solve the problem, regardless of how well that information was applied or of the final achievement. We thus distinguished the effectiveness of the exploration strategy itself from the use of the extracted information to solve the problem and control the system with respect to the target values based on the causal knowledge acquired. Systematicity was defined at the level of effectiveness, based on the amount of information extracted, and at the level of awareness, based on how systematically the trials were ordered in time.

Students' actions were logged and coded according to our input behavior model and then clustered for comparison. We were able to distinguish three different VOTAT strategies and two successful non-VOTAT ones. We empirically tested awareness of the input behavior used in time. Awareness of strategy usage was analyzed by the sequence of the trials used, that is, by the systematicity of the trials used in time. We investigated the effectiveness of and differences in problem-solving behavior between three age groups by conducting latent class analyses to explore and define patterns in qualitatively different VOTAT strategy uses.
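One way to operationalize the notion of awareness (an illustrative sketch under our own simplifying assumption, not the study's coding scheme) is to treat a VOTAT sequence as consciously applied when its isolating trials cover every input exactly once, consecutively and in a fixed order:

```python
# Hypothetical "awareness" check: isolating trials must form an
# ordered, uninterrupted block (input 0, then input 1, ...), rather
# than being scattered among other trials.

def votat_positions(trials):
    """Indices of trials that vary exactly one input,
    paired with that input's index."""
    out = []
    for pos, trial in enumerate(trials):
        nonzero = [i for i, v in enumerate(trial) if v != 0]
        if len(nonzero) == 1:
            out.append((pos, nonzero[0]))
    return out

def is_aware_votat(trials, n_inputs):
    """True if the isolating trials cover every input exactly once,
    consecutively and in ascending input order."""
    hits = votat_positions(trials)
    if not hits:
        return False
    positions = [p for p, _ in hits]
    inputs = [i for _, i in hits]
    consecutive = positions == list(range(positions[0],
                                          positions[0] + len(positions)))
    return consecutive and inputs == list(range(n_inputs))

print(is_aware_votat([(1, 0, 0), (0, 1, 0), (0, 0, 1)], 3))             # True
print(is_aware_votat([(1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 1)], 3))  # False
```

In the second call, a non-isolating trial interrupts the sequence, so under this operationalization the same amount of information is extracted but the behavior would not count as aware.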

Although the assessment of problem solving within the MicroDYN approach is a relatively new area of research, its processes have already been studied in a number of different contexts, including a variety of educational settings with several age groups. Our cross-sectional design allows us to describe differences between age groups and outline the developmental tendencies of input behavior and strategy use among children in the age range covered by our data collection.

Reasoning strategies in complex problem solving

Problem-solving skills have been among the most extensively studied transversal skills over the last decade; they have been investigated in today's most prominent comprehensive international large-scale assessments (e.g., OECD, 2014). The aspect common to the different theoretical models is that a problem is characterized by a gap between the current state and the goal state with no immediate solution available (Mayer and Wittrock, 1996).

Parallel to the definition of the so-called twenty first-century skills (Griffin et al., 2012 ), recent research on problem solving disregards content knowledge and domain-specific processes. The reason for this is that understanding the structure of unfamiliar problems is more effective when it relies on abstract representation schemas and metacognitive strategies than on specifically relevant example problems (Klahr et al., 2007 ). That is, the focus is more on assessing domain-general problem-solving strategies (Molnár et al., 2017 ), such as complex problem solving, which can be used to solve novel problems, even those arising in interactive situations (Molnár et al., 2013 ).

Logfile analyses make it possible to divide the continuum of a problem-solving process into several scoreable phases by extracting information from the logfile that documents students' problem-solving behavior. In our case, latent class analysis extracts information from the file that logs students' interaction with the simulated system at the beginning of the problem-solving process. The way students manipulate the input (independent) variables represents their reasoning strategy. Log data, on the one hand, make it possible to analyze qualitative differences in these strategies and then their efficiency in terms of how they generate knowledge resulting in the correct plotting of the causal relationship in Phase 1 and then the proper setting to reach the required target value in Phase 2. On the other hand, qualitative strategy data can be quantified, and an alternative scoring system can be devised.

From the perspective of the traditional psychometric approach and method of scoring, these problems form a test task consisting of two scoreable items. The first phase is a knowledge acquisition process, where scores are assigned based on how accurately the causal relationship was plotted. The second phase is knowledge application, where the correctness of the value for the target variable is scored. Such scoring based on two phases of solving MicroDYN problems has been used in a number of previous studies (e.g., Greiff et al., 2013 , 2015 ; Wüstenberg et al., 2014 ; Csapó and Molnár, 2017 ; Greiff and Funke, 2017 ).

To sum up, there is great potential to investigate and cluster the problem-solving behavior and exploration strategy usage of the participants at the beginning of the problem-solving process and correlate the use of a successful exploration strategy with the model-building solution (achievement in Phase 1) observed directly in these simulated problem scenarios. Using logfile analyses (Greiff et al., 2015), the current article aims to provide insights into students' approaches to exploring and solving problems related to minimal complex systems. By addressing research questions on the problem-solving strategies used, the study aims to understand students' exploration behavior in a complex problem-solving environment and the underlying causal relations. In this study, we show that such scoring can be developed through latent class analysis and that this alternative method of scoring may produce more reliable tests. Furthermore, such scoring can be automated and then employed in large-scale assessments.

There are two major theoretical approaches to cognition relevant to our study; both offer general principles to interpret cognitive development beyond the narrower domain of problem solving. Piaget proposed the first comprehensive theory to explain the development of children's thinking as a sequence of four qualitatively different stages, the formal operational stage being the last one (Inhelder and Piaget, 1958 ), while the information processing approach describes human cognition by using terms and analogies borrowed from computer science. The information processing paradigm was not developed into an original developmental theory; it was rather aimed at reinterpreting and extending Piaget's theory (creating several Neo-Piagetian models) and synthesizing the main ideas of the two theoretical frameworks (Demetriou et al., 1993 ; Siegler, 1999 ). One of the focal points of these models is to explain the development of children's scientific reasoning, or, more closely, the way children understand how scientific experiments can be designed and how causal relationships can be explored by systematically changing the values of (independent) variables and observing their impact on other (target) variables.

From the perspective of the present study, the essential common element of cognitive developmental research is the control of variables strategy. Klahr and Dunbar ( 1988 ) distinguished two related skills in scientific thinking, hypothesis formation and experimental design, and they integrated these skills into a coherent model for a process of scientific discovery. The underlying assumption is that knowledge acquisition requires an iterative process involving both. System control as knowledge application tends to include both processes, especially when acquired knowledge turns out to be insufficient or dysfunctional (J. F. Beckmann, personal communication, August 16, 2017). Furthermore, they separated the processes of rule induction and problem solving, defining the latter as a search in a space of rules (Klahr and Dunbar, 1988 , p. 5).

de Jong and van Joolingen ( 1998 ) provided an overview of studies in scientific discovery learning with computer simulations. They concluded that a number of specific skills are needed for successful discovery, like systematic variation of variable values, which is in a focus of the present paper, and the use of high-quality heuristics for experimentation. They identified several characteristic problems in the discovery process and stressed that learners often have trouble interpreting data.

In one of the earliest systematic studies of students' problem-solving strategies, Vollmeyer et al. (1996) explored the impact of strategy systematicity and effectiveness on complex problem-solving performance. Based on previous studies, they distinguished the VOTAT strategy from other possible strategies (the Change All (CA) and Heterogeneous (HT) strategies), as VOTAT allows systematic exploration of the behavior of a system and the disconfirmation of hypotheses. In one of their experiments, they examined the hypothesis that VOTAT was more effective for acquiring knowledge than less systematic strategies. According to the results, the 36 undergraduate students clearly showed strategy development: after interacting with the simulated system in several rounds, they tended to use the VOTAT strategy more frequently. A second experiment demonstrated that goal specificity influences strategy use as well (Vollmeyer et al., 1996).

Beckmann and Goode (2014) analyzed the systematicity of exploration behavior in a study involving 80 first-year psychology students, focusing on the semantic context of a problem and its effect on problem solvers' behavior in complex and dynamic systems. According to the results, a semantically familiar problem context invited a high number of a priori assumptions about the interdependency of system variables. These assumptions were less likely to be tested during the knowledge acquisition phase, which proved to be the main barrier to the acquisition of new knowledge. Unsystematic exploration behavior tended to produce non-informative system states that complicated the extraction of knowledge. A lack of knowledge ultimately led to poor control competency.

Beckmann et al. (2017) confirmed the research results of Beckmann and Goode (2014) and demonstrated how a differentiation between complexity and difficulty leads to a better understanding of the cognitive mechanism behind CPS. According to findings from a study with 240 university students, the performance differences observed in the context of the semantic effect were associated with differences in the systematicity of the exploration behavior, and the systematicity of the exploration behavior was reflected in a specific sequence of interventions. They argued that it is only the VOTAT strategy, supplemented with the vary-none-at-a-time strategy in the case of noting autonomous changes, that creates informative system state transitions which enable problem solvers to derive knowledge of the causal structure of a complex, dynamic system.

Schoppek and Fischer ( 2017 ) also investigated VOTAT and the related “PULSE” strategy (all input variables to zero), which enables the problem solver to observe the eigendynamics of the system in a transfer experiment. They proposed that besides VOTAT and PULSE, other comprehensive knowledge elements and strategies, which contribute to successful CPS, should be investigated.

In a study with 2nd- to 4th-grade students, Chen and Klahr found little spontaneous development when children interacted with physical objects (in situations similar to those in Piaget's experiments), while more direct teaching of the control of variables strategy resulted in good effect sizes, and older children were able to transfer the knowledge they had acquired (an improved control of variables strategy) to remote contexts (Chen and Klahr, 1999). In a more recent study, Kuhn et al. (2008) further extended the scope of studies on scientific thinking, identifying three further aspects beyond the control of variables strategy: coordinating the effects of multiple influences, understanding the epistemological foundations of science and engaging in argumentation. In their experiment with 91 6th-grade students, they explored how well students were able to estimate the simultaneous impact of five independent variables on a particular phenomenon, and they found that most students considered only one or two variables as possible causes.

In this paper, we explore several research questions on effective and less effective problem-solving strategies used in a complex problem-solving environment and detected by logfile analyses. We use logfile analyses to empirically test the success of different input behavior and strategy usage in CPS tasks within the MicroDYN framework. After constructing a mathematical model based on all theoretically effective strategies, which provide the problem solver with all the information needed to solve the problem, and defining several sub-strategies within the VOTAT strategy based on the amount of effort expended to extract the necessary information, we empirically distinguish different VOTAT and non-VOTAT strategies, which can result in good CPS performance and which go beyond the isolated variation strategy as an effective strategy for rule induction (Vollmeyer et al., 1996 ). We highlight the most and least effective VOTAT strategies used in solving MicroDYN problems and empirically investigate the awareness of the strategy used based on the sequence of the sub-strategies used. Based on these results, we conduct latent class analyses to explore and define patterns in qualitatively different VOTAT strategy uses.

We thus intend to answer five research questions:

  • RQ1: Does the use of a theoretically effective strategy occur prior to high performance? In other words, does the use of a theoretically effective strategy result in high performance?
  • RQ2: Do all VOTAT strategies result in a high CPS performance? What is the most effective VOTAT strategy?
  • RQ3: How does awareness of the exploration strategy used influence overall performance on CPS tasks?
  • RQ4: What profiles characterize the various problem solvers and explorers?
  • RQ5: Do exploration strategy profiles differ across grade levels, which represent different educational stages during compulsory schooling?

In this study, we investigated qualitatively different classes of students' exploration behavior in CPS environments. We used latent class analysis (LCA) to study effective and non-effective input behavior and strategy use, especially the principle of isolated variation, across several CPS tasks. We compared the effectiveness of students' exploration behavior based on the amount of information they extracted with their problem-solving achievement. We posed five separate hypotheses.

Hypothesis 1: We expect that high problem-solving achievement is not closely related to expert exploration behavior.

Vollmeyer et al. ( 1996 ) explored the impact of strategy effectiveness on problem-solving performance and reported that effectiveness correlated negatively and weakly to moderately with solution error ( r = −0.32 and r = −0.54, p < 0.05). They reported that “most participants eventually adopted the most systematic strategy, VOTAT, and the more they used it, the better they tended to perform. However, even those using the VOTAT strategy generally did not solve the problem completely” (p. 88). Greiff et al. ( 2015 ) confirmed that different exploration behaviors are relevant to CPS and that the number of sub-strategies implemented was related to overall problem-solving achievement.

Hypothesis 2: We expect that students who use the isolated variation strategy in exploring CPS problems have a significantly better overall performance than those who use a theoretically effective, but different strategy.

Sonnleitner et al. (2017) noted that “A more effective exploration strategy leads to a higher system knowledge score and the higher the gathered knowledge, the better the ability to achieve the target values. Thus, system knowledge can be seen as a reliable and valid measure of students' mental problem representations” (p. 169). According to Wüstenberg et al. (2012), students who consistently apply the principle of isolated variation, the most systematic VOTAT strategy, in CPS environments show better overall CPS performance than those who use different exploration strategies. Kröner et al. (2005) reported a positive correlation between using the principle of isolated variation and the likelihood of solving the overall problem.

Hypothesis 3: We expected that more aware CPS exploration behavior would be more effective than exploration behavior that generally results in extracting all the necessary information from the system to solve the problem, but within which the steps have no logically built structure and no systematicity in time.

Vollmeyer et al. ( 1996 ) explored the impact of strategy systematicity on problem-solving performance. They emphasized that “the systematicity of participants' spontaneous hypothesis-testing strategies predicted their success on learning the structure of the biology lab problem space” (p. 88). Vollmeyer and her colleagues restricted systematic strategy users to isolated variation strategy users; this corresponds to our terminology usage of aware isolated variation strategy users.

Hypothesis 4: We expected to find a distinct number of classes with statistically distinguishable profiles of CPS exploration behavior. Specifically, we expected to find classes of proficient, intermediate and low-performing explorers.

Several studies (Osman and Speekenbrink, 2011 ; Wüstenberg et al., 2012 ; Greiff et al., 2015 ) have indicated that there exist quantitative differences between different exploration strategies, which are relevant to a CPS environment. The current study is the first to investigate whether a relatively small number of qualitatively different profiles of students' exploration proficiency can be derived from their behavior detected in a CPS environment in a broad age range.

Hypothesis 5: We expected that more proficient CPS exploration behavior would be more dominant at later grade levels as an indication of cognitive maturation and of increasing abilities to explore CPS environments.

The cognitive development in children between Grades 3 and 12 is immense. According to Piaget's stage theory, they move from concrete operations to formal operations and become able to think logically and abstractly. According to Galotti (2011) and Molnár et al. (2013), the ability to solve problems effectively and to make decisions in CPS environments increases over this period; Grades 6–8 seem especially crucial for development. Thus, we expect that cognitive maturation will also be reflected in more proficient exploration behavior.

Participants

The sample was drawn from 3rd- to 12th-grade students (aged 9–18) in Hungarian primary and secondary schools (N = 4,371; Table 1). School classes formed the sampling unit. 180 classes from 50 schools in different regions were involved in the study, resulting in a wide-ranging distribution of students' background variables. The proportion of boys and girls was about the same.

Composition of samples.

Grade   n     Girls (%)   Mean age (SD)
3       584   –           –
4       679   –           –
5       608   –           –
6       677   49          11.92 (0.53)
7       607   51          12.94 (0.53)
8       942   49          13.89 (0.56)
9       30    48          15.00 (0.59)
10      84    51          16.79 (0.49)
11      102   68          17.02 (0.79)
12      58    64          17.93 (0.57)

The MicroDYN approach was employed to develop a measurement device for CPS. CPS tasks within the MicroDYN approach are based on linear structural equations (Funke, 2001), in which up to three input variables and up to three output variables are related (Greiff et al., 2013). Because of the small set of input and output variables, the MicroDYN problems can be understood completely through precise causal analyses (Funke, 2014). The relations are not presented to the problem solver in the scenario. To explore these relations, the problem solver must interact directly with the problem situation by manipulating the input variables (Greiff and Funke, 2010), an action that can influence the output variables (direct effects), and must use the feedback provided by the computer to acquire and employ new knowledge (Fischer et al., 2012). Output variables can also change spontaneously through internal dynamics, meaning they can change without any change in the input variables (indirect effects; Greiff et al., 2013). Both direct and indirect effects can be detected with an adequate problem-solving strategy (Greiff et al., 2012). The interactions between the problem situation and the test taker play an important role, but they can only be identified in a computerized environment based on log data collected during test administration.
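The linear structural equations behind a MicroDYN-style scenario can be sketched as follows. This is an illustrative model, not the authors' implementation; the exact update rule and weights are assumptions, but direct effects (matrix A) and eigendynamics (matrix B) follow the description above.

```python
# Illustrative MicroDYN-style system (assumed update rule): outputs
# evolve as y[t+1] = B*y[t] + A*x[t], where x holds the input settings,
# A encodes direct effects of inputs on outputs, and B encodes
# persistence, side effects and eigendynamics.

def microdyn_step(y, x, A, B):
    """One 'Apply' click: return the next output values."""
    n = len(y)
    return [
        sum(B[i][j] * y[j] for j in range(n))
        + sum(A[i][k] * x[k] for k in range(len(x)))
        for i in range(n)
    ]

# A 2-input/2-output example: input 0 drives output 0 (weight 2),
# input 1 drives output 1 (weight 1); output 1 also grows by 10% per
# step on its own (an eigendynamic, i.e., an indirect effect).
A = [[2.0, 0.0],
     [0.0, 1.0]]
B = [[1.0, 0.0],
     [0.0, 1.1]]

print(microdyn_step([0.0, 0.0], [1.0, 0.0], A, B))  # [2.0, 0.0]
print(microdyn_step([0.0, 1.0], [0.0, 0.0], A, B))  # [0.0, 1.1]
```

The second call is effectively a PULSE trial (all inputs zero): the outputs change even though no input was set, which is exactly how a problem solver can observe the eigendynamics of such a system.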

In this study, different versions with different levels of item complexity were used (Greiff et al., 2013), which varied by school grade (Table 2; six MicroDYN scenarios were administered in total in Grades 3–4; eight in Grade 5; nine in Grades 6–8; and twelve in Grades 9–12); however, we only involved the six tasks in which the principle of isolated variation was the optimal exploration strategy. That is, we excluded problems with an external manipulation-independent, internal dynamic effect or a multiple dependence effect from the analyses, and no delayed or accumulating effects were used in the problem environments created. Complexity was defined by the number of input and output variables and the number of relations, based on Cognitive Load Theory (Sweller, 1994): “Findings show that increases in the number of relations that must be processed in parallel in reasoning tasks consistently lead to increases in task difficulty” (Beckmann and Goode, 2017).

The design of the whole study: the complexity of the systems administered and the structure and anchoring of the tests applied in different grades.

[Table 2 lists the thirteen MicroDYN systems by complexity code (2-1-2; 2-2-2; 2-2-2; 2-2-2; 3-2-3; 3-3-3; 3-3-4; 3-2-1; 3-3-4; 3-2-2; 3-3-3; 3-3-3; 3-3-3) and marks with + the grades in which each system was administered; the grade-by-grade marks are not recoverable in this copy.]

The tasks were designed so that all causal relations could be identified with systematic manipulation of the inputs. The tasks contained up to three input variables and up to three output variables with different fictitious cover stories. The values of the input variables were changed by clicking on a button with a + or – sign or by using a slider connected to the respective variable (see Figure 1). The controllers of the input variables range from “– –” (value = −2) to “++” (value = +2). The history of the values of the input variables within the same scenario was presented on a graph connected to each input variable. Beyond the input and output variables, each scenario contained a Help, Reset, Apply and Next button. The Reset button set the system back to its original status. The Apply button made it possible to test the effect of the currently set values of the input variables on the output variables, which appeared in the form of a diagram for each output variable. According to the user interface, within the same phase of each of the problem scenarios, the input values remained at the level at which they were set for the previous input until the Reset button was pressed or they were changed manually. The Next button implemented the navigation between the different MicroDYN scenarios and the different phases within a MicroDYN scenario.


Figure 1. Exploration in phase 1 of the MicroDYN problems (two input variables and two output variables).

In the knowledge acquisition phase, participants were free to change the values of the input variables and could attempt as many trials for each MicroDYN scenario as they liked within 180 s. During these 180 s, they had to draw the concept map (or causal diagram; Beckmann et al., 2017); that is, they had to draw the arrows between the variables presented on the concept map under the MicroDYN scenario on screen. In the knowledge application phase, students had to control their respective system, using the correct concept map presented on screen, by reaching the given target values within a given time frame (90 s) in no more than four trials, that is, with a maximum of four clicks on the Apply button. This applied equally to all participants.

All of the CPS problems were administered online via the eDia platform. At the beginning, participants were provided with instructions about the usage of the user interface, including a warm-up task. Subsequently, participants had to explore, describe and operate unfamiliar systems. The assessment took place in the schools' ICT labs using the available school infrastructure. The whole CPS test took approximately 45 min to complete. Testing sessions were supervised by teachers who had been thoroughly trained in test administration. Students' problem-solving performance in the knowledge acquisition and application phases was automatically scored as a CPS performance indicator; thus, problem solvers received immediate performance feedback at the end of the testing session. We split the sample into three age groups, whose achievement differed significantly (Grades 3–5, N = 1,871; Grades 6–7, N = 1,284; Grades 8–12, N = 1,216; F = 122.56, p < 0.001; t(levels 1–2) = −6.22, p < 0.001; t(levels 2–3) = −8.92, p < 0.001). This grouping corresponds to the changes in the developmental curve relevant to complex problem solving; the most intensive development takes place in Grades 6–7 (see Molnár et al., 2013). Measurement invariance, that is, structural stability, has already been demonstrated for complex problem solving in the MicroDYN approach (e.g., Greiff et al., 2013) and was confirmed in the present study (Table 3). Between-group differences can therefore be interpreted as true differences in latent ability rather than as psychometric artifacts, and the comparisons across grade levels are valid.

Goodness of fit indices for measurement invariance of MicroDYN problems.

Model                         χ²       df   Δχ²      p (Δχ²)   CFI     TLI     RMSEA
Configural invariance         119.71   42   –        –         0.980   0.987   0.039
Strong factorial invariance   126.33   45   7.373    >0.05     0.986   0.980   0.038
Strict factorial invariance   145.49   52   15.028   >0.05     0.980   0.976   0.042

χ² and df were estimated with the weighted least squares mean and variance adjusted estimator (WLSMV). Δχ² and Δdf were estimated with the Difference Test procedure in MPlus. Chi-square differences between models cannot be compared by subtracting χ² and df values when WLSMV estimators are used. CFI, comparative fit index; TLI, Tucker–Lewis index; RMSEA, root mean square error of approximation.

The latent class analysis (Collins and Lanza, 2010) employed in this study seeks students whose problem-solving strategies show similar patterns. It is a probabilistic, model-based technique, a variant of traditional cluster analysis (Tein et al., 2013). The indicator variables observed were the re-coded strategy scores. Robust maximum likelihood estimation was used, and two- to seven-cluster solutions were examined. The process of latent class analysis is similar to that of cluster analysis. Information theory methods, likelihood ratio statistical tests and entropy-based criteria were used in reducing the number of latent classes. As measures of relative model fit, AIC (Akaike Information Criterion), which considers the number of model parameters, and BIC (Bayesian Information Criterion), which considers the number of parameters and the number of observations, are the two original and most commonly used information theory methods for model selection. The adjusted Bayesian Information Criterion (aBIC) is the sample size-adjusted BIC. Lower values indicate a better model fit for each criterion (see Dziak et al., 2012). Entropy represents the precision of the classification for individual cases. MPlus reports the relative entropy index of the model, which is a re-scaled version of entropy on a [0,1] scale: values near one indicate high certainty in classification, while values near zero indicate low certainty and thus a low level of homogeneity in the clusters. Finally, the Lo–Mendell–Rubin Adjusted Likelihood Ratio Test (Lo et al., 2001) was employed to compare the model containing n latent classes with that containing n−1 latent classes. A significant p-value (p < 0.05) indicates that the n−1-class model is rejected in favor of the model with n classes, as the latter fits better (Muthén and Muthén, 2012).
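For readers unfamiliar with these criteria, the quantities involved can be sketched in a few lines. This is an illustrative Python sketch with hypothetical inputs, not the MPlus implementation; the aBIC uses the common (n + 2)/24 sample-size re-scaling, which the paper itself does not spell out:

```python
import math


def information_criteria(log_lik, n_params, n_obs):
    """AIC, BIC and sample-size-adjusted BIC for one latent class solution."""
    aic = -2 * log_lik + 2 * n_params
    bic = -2 * log_lik + n_params * math.log(n_obs)
    abic = -2 * log_lik + n_params * math.log((n_obs + 2) / 24)  # common adjustment
    return aic, bic, abic


def relative_entropy(posteriors):
    """Re-scaled entropy on [0, 1]; values near 1 mean confident classification."""
    n, k = len(posteriors), len(posteriors[0])
    ent = -sum(p * math.log(p) for row in posteriors for p in row if p > 0)
    return 1 - ent / (n * math.log(k))
```

With crisp posterior class probabilities (each student clearly assigned to one class) the relative entropy approaches 1; with maximally uncertain probabilities it approaches 0.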

As previous research has found (Greiff et al., 2013 ), achievement in the first and second phases of the problem-solving process can be directly linked to the concept of knowledge acquisition (representation) and knowledge application (generating a solution) and was scored dichotomously. For knowledge acquisition, students' responses were scored as correct (“1”) if the connections between the variables were accurately indicated on the concept map (students' drawings fully matched the underlying problem structure); otherwise, the response was scored as incorrect (“0”). For knowledge application, students' responses were scored as correct (“1”) if students reached the given target values within a given time frame and in no more than four steps, that is, with a maximum of four clicks on the Apply button; otherwise, the response was scored as incorrect (“0”).
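A minimal sketch of this dichotomous scoring rule follows; the function and argument names are hypothetical (the eDia platform's internal implementation is not shown in the paper), but the logic mirrors the rule as stated:

```python
def score_knowledge_acquisition(drawn_edges, true_edges):
    """1 if the student's concept map exactly matches the underlying problem
    structure (same set of variable-to-variable connections), else 0."""
    return 1 if set(drawn_edges) == set(true_edges) else 0


def score_knowledge_application(targets_reached, n_apply_clicks, within_time):
    """1 if the target values were reached within the time frame using at most
    four clicks on the Apply button, else 0."""
    return 1 if targets_reached and within_time and n_apply_clicks <= 4 else 0
```

Note that both scores are all-or-nothing: a concept map with one missing or extra connection, or a solution needing a fifth Apply click, scores 0.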

We developed a labeling procedure to divide the continuum of the problem-solving process into more readily scorable phases and to score students' activity and behavior in the exploration phase at the beginning of the problem-solving process. For the different analyses and the most effective clustering, we applied a categorization distinguishing students' use of full, basic and minimal input behavior within a single CPS task (see the detailed description later). The unit of this labeling process was a trial: a setting of the input variables that was tested by clicking on the Apply button during the exploration phase of a problem, that is, between receiving the problem and clicking on the Next button to reach the second, application part of the problem. The sum of these trials within the same problem environment is called the input behavior. The input behavior was called a strategy if it followed meaningful regularities.

By our definition, the full input behavior model describes exactly what was done throughout the exploration phase and what kinds of trials were employed in the problem-solving process. It consists of all the activities with the sliders and Apply buttons in the order they were executed during the first, exploration phase of the problem-solving process. The basic input behavior model is, by definition, a subset of the full input behavior model: the order of the trials attempted is still taken into account, but it only includes activities through which students were able to acquire new information about the system. This means that the following activities and trials were not included in the basic input behavior model (they were deleted from the full input behavior model to obtain the basic behavior model):

  • trials in which the same scenario, the same slider adjustment, had been employed earlier within the task (that is, we excluded the role of ad hoc control behavior from the analyses);
  • trials in which the value (position) of more than one input variable (slider) was changed while the effect of those input variables on the operation of the system was still theoretically unknown to the problem solver;
  • trials in which a new setting or new slider adjustment was employed even though the effect of the input variables used was already known from previous settings.

As the basic input behavior preserves timing, that is, the order of the trials used, it is suitable for analyses of the awareness of the input behavior employed.

Finally, we generated the students' minimal input behavior model from the full input behavior model. By our definition, the minimal input behavior focuses on those untimed activities (a simple list, without the real order of the trials), where students were able to obtain new information from the system and were able to do so by employing the most effective trials.

Each of the activities in which the students engaged and each of the trials which they used were labeled according to the following labeling system to be able to define students' full input behavior in a systematic format (please note that the numerical labels are neither scores nor ordinal or metric information):

  • Only one single input variable was manipulated, whose relationship to the output variables was unknown (we considered a relationship unknown if its effect cannot be known from previous settings), while the other variables were set at a neutral value like zero. We labeled this trial +1.
  • One single input variable was changed, whose relationship to the output variables was unknown. The others were not at zero, but at a setting used earlier. We labeled this trial +2.
  • One single input variable was changed, whose relationship to the output variables was unknown, and the others were not at zero; however, the effect of the other input variable(s) was known from earlier settings. Even so, this combination was not attempted earlier. We labeled this trial +3.
  • Everything was maintained in a neutral (zero) position. This trial is especially important for CPS problems with their own internal dynamics. We labeled this +A.
  • The value of more than one input variable, whose relationship to the output variables was unknown, was changed at the same time, resulting in no additional information on the system. It was labeled –X.
  • The same trial, the slider adjustment, had already been employed earlier within the task, resulting in no additional information on the system. It was labeled −0.
  • A new slider adjustment was employed; however, the effect of the manipulated input variables was known from previous settings. This trial offered no additional information on the system and was labeled +0.

  • Although several input variables were changed by the scenario, it was theoretically possible to work out the effect of the input variables on the output variables based on the information from the previous and present settings by setting up and solving linear equations. We labeled this trial +4.

An extra code (+5) was employed in the labeling process, but only for the basic input behavior, when the problem solver was able to figure out the structure of the problem based on the information obtained in the last trial used. This labeling has no meaning in the case of the minimal input behavior.
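The labeling scheme above can be rendered algorithmically. The following Python sketch is a simplified, hypothetical implementation: it omits the +4 and +5 codes and some edge cases, writes the minus labels with an ASCII hyphen, and represents each trial as a tuple of slider values:

```python
def label_trials(trials, n_vars):
    """Label each trial (a tuple of slider values) with a simplified version of
    the paper's coding scheme. Hypothetical sketch: the +4 and +5 codes and
    some edge cases are omitted; '-' stands in for the minus sign."""
    labels, seen, known = [], set(), set()
    prev = (0,) * n_vars                   # sliders start at the neutral level
    for raw in trials:
        trial = tuple(raw)
        if trial in seen:
            labels.append("-0")            # repetition of an earlier setting
        elif all(v == 0 for v in trial):
            labels.append("+A")            # everything kept neutral
        else:
            changed = [i for i in range(n_vars) if trial[i] != prev[i]]
            unknown = [i for i in changed if i not in known]
            if not unknown:
                labels.append("+0")        # new setting, but no new information
            elif len(unknown) > 1:
                labels.append("-X")        # several unknown variables varied at once
            else:
                var = unknown[0]
                if all(v == 0 for i, v in enumerate(trial) if i != var):
                    labels.append("+1")    # pure VOTAT trial: all others neutral
                elif len(changed) == 1:
                    labels.append("+2")    # others held at the previous setting
                else:
                    labels.append("+3")    # others changed, but their effects known
                known.add(var)             # this variable's effect is now known
        seen.add(trial)
        prev = trial
    return labels


def basic_behavior(labels):
    """Basic input behavior: only the information-gaining trials, in order."""
    return [l for l in labels if l in {"+A", "+1", "+2", "+3"}]
```

Applied to the six trials of the cat-food example discussed below, this sketch reproduces the +A, −0, +1, −0, +2, −0 full coding and the +A, +1, +2 basic behavior.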

The full, basic and minimal input behavior models as well as the labeling procedure can be employed to analyze problem solvers' exploration behavior and strategies for problems that are based on minimal complex systems. The user interface preserves previous input values; the values are not reset to zero after each exploration input. According to Fischer et al. (2012), VOTAT strategies are best for identifying causal relations between variables, and they maximize successful strategic behavior in minimal complex systems such as those underlying these CPS problems. By using a VOTAT strategy, the problem solver systematically varies only one input variable, while the others remain unchanged. This way, the effect of the changed variable can be identified by monitoring the changes in the output variables. There exist several types of VOTAT strategies based on the different combinations of the VOTAT-centered trials +1, +2 and +3. The most obvious systematic strategy is when only one input variable differs from the neutral level in each trial and all the other input variables are systematically maintained at the neutral level; that is, the strategy is a combination of so-called +1 trials, one for each input variable. Known as the isolated variation strategy (Müller et al., 2013), this strategy has been covered extensively in the literature. It must be noted that the isolated variation strategy is not appropriate for detecting multiple dependence effects within the MicroDYN approach. We hypothesize that there are more and less successful input behaviors and strategies. We expect that theoretically effective, non-VOTAT strategies do not work as successfully as VOTAT strategies and that the most effective VOTAT strategy will be the isolated variation strategy.

We will illustrate the labeling and coding process and the course of generating a minimal input behavior out of a basic or full input behavior through the following two examples.

Figure 1 shows an example with two input variables and two output variables. (The word problem reads as follows: “When you get home in the evening, there is a cat lying on your doorstep. It is exhausted and can barely move. You decide to feed it, and a neighbor gives you two kinds of cat food, Miaow and Catnip. Figure out how Miaow and Catnip impact activity and purring.”). The student who mapped the operation of the system as demonstrated in the figure pressed the Apply button six times in all, using various settings for the Miaow and Catnip input variables.

In mapping the system, the problem solver kept the value of both input variables at 0 in the first two steps (making no changes to the base values of the input variables), as a result of which the values of the output variables remained unchanged. In steps 3 and 4, he or she set the value of the Miaow input variable at 2, while the value of the Catnip variable remained at 0 (the bar chart by the name of each variable shows the history of these settings). Even this change had no effect on the values of the output variables; that is, the graphs for the purring and activity variables remain constantly horizontal. In steps 5 and 6, the student left the value of the Miaow input variable at 2, but a value of 2 was now also set for the Catnip input variable. As a result, the values of both output variables (purring and activity) began to grow by the same amount. The coding containing all the information (the full input behavior) for this sequence of steps was as follows: +A, −0, +1, −0, +2, −0. Since steps 2, 4 and 6 were repetitions of previous combinations, we coded them as −0. Step 3 involved the purest use of a VOTAT strategy, changing the value of one input variable at a time while keeping the values of the other input variables at a neutral level (+1). The trial used in step 5 was also a VOTAT strategy: only the value of one input variable changed compared to step 4, but it is not the same trial as the one described in step 3, so it was coded +2. After step 5, all the necessary information was available to the problem solver. The basic input behavior for the same sequence of steps was +A, +1, +2, since the rest of the steps did not lead the problem solver to acquire unknown information. Independently of the time factor, the minimal input behavior in this case was also +A, +1, +2. The test taker was able to access new information on the operation of the system through these steps.
From the point of view of awareness, this +1+2 strategy falls under aware strategy usage, as the +1 and +2 sub-strategies were applied close together in time (disregarding the simple repetitions of already executed trials between them). A good indicator of aware strategy usage is that there is no difference between the minimal and basic input behavior.
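Under this heuristic, checking awareness reduces to comparing the two label lists. The sketch below is a hypothetical illustration; since the minimal behavior is an untimed list, the comparison ignores order:

```python
def is_aware(basic_labels, minimal_labels):
    """Aware strategy use (per the heuristic above): the timed basic input
    behavior contains exactly the same trials as the untimed minimal one."""
    return sorted(basic_labels) == sorted(minimal_labels)
```

For the first example (basic +A, +1, +2; minimal +A, +1, +2) this returns True; for the second example below (basic +1+2, minimal +1+1) it returns False, matching the non-aware classification.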

In the second example (Figure 2), we demonstrate the sequence of steps taken in mapping another problem as well as the coding we used. Here the students needed to solve a problem consisting of two input variables and one output variable. The word problem reads as follows: “Your mother has bought two new kinds of fruit drink mix. You want to make yourself a fruit drink with them. Figure out how the green and blue powders impact the sweetness of the drink. Plot your assumptions in the model.” The test taker attempted eight different trials in solving this problem, which were coded as follows: +1, +2, +0, +0, +0, +0, −0, −0. After step 2, the student had access to practically all the information required to plot the causal diagram. (In step 1, the problem solver checked the impact of one scoop of green powder and left the quantity of blue powder at zero. Once mixed, the resultant fruit drink became sweeter. In step 2, the problem solver likewise measured out one scoop of green powder for the drink but also added a scoop of blue powder. The sweetness of the drink changed as much as it had in step 1. After that, the student measured out various quantities of blue and then green powder and looked at the impact.) The basic input behavior coded from the full input behavior used by the problem solver was +1+2, and the minimal input behavior was +1+1, because the purest VOTAT trials were used in steps 1 and 6. (Thus, the effects of the green and the blue powder on the sweetness of the drink were each confirmed separately.) From the point of view of awareness, this +1+1 strategy falls under non-aware strategy usage, as the two applications of the +1 trial occurred far apart from each other in time.

Figure 2. Exploration in phase 1 of the problems based on minimal complex systems (two input variables and one output variable).

Based on students' minimal input behavior, we executed latent class analyses. We narrowed the focus to the principle of isolated variation, especially to the extent to which this special strategy was employed in the exploration phase as an indicator of students' ability to proficiently explore the problem environment. We added an extra variable to each of the problems, describing students' exploration behavior based on the following three categories: (1) no isolated variation at all (i.e., isolated variation was employed for none of the input variables – 0 points); (2) partially isolated variation (i.e., isolated variation was employed for some but not all of the input variables – 1 point); and (3) fully isolated variation (i.e., isolated variation was employed for all the input variables – 2 points). Thus, depending on the level of optimal exploration strategy used, all the students received new categorical scores based on their input exploration behavior, one for each of the CPS tasks. Let us return to the examples provided in Figures 1, 2. In the first example, a partially isolated strategy was applied, since the problem solver only used this strategy to test the effect of the Miaow input variable (in trials 3 and 4). In the second example, a fully isolated strategy was applied, as the problem solver used the isolated variation strategy for both input variables during the exploration phase, in the first and sixth trials.
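The three-category exploration score can be sketched as follows. This is a hypothetical simplification: it counts a variable as tested in isolation whenever it is the only non-zero slider in a trial, without checking whether its effect was still unknown at that point:

```python
def isolated_variation_score(trials, n_vars):
    """0 = no isolated variation, 1 = partially, 2 = fully isolated variation."""
    isolated = set()
    for trial in trials:
        nonzero = [i for i, v in enumerate(trial) if v != 0]
        if len(nonzero) == 1:          # exactly one slider moved away from neutral
            isolated.add(nonzero[0])
    if not isolated:
        return 0
    return 2 if len(isolated) == n_vars else 1
```

Applied to the six trials of the cat-food example (Figure 1), the sketch returns 1 (partially isolated variation), in line with the categorization above.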

The reliability of the test improved when scoring was based on the log data

The reliabilities of the MicroDYN problems as measures of knowledge acquisition and knowledge application, the traditional CPS indicators for phases 1 and 2, were acceptable at α = 0.72–0.86 in all grades (Table 4). After we re-scored the problem solvers' behavior at the beginning of the problem-solving process, coded the log data and assigned new variables for the effectiveness of strategy usage during the exploration phase for each task and person, the overall reliability of the test scores improved. This phenomenon was noted in all grades and in both coding procedures: when the amount of information obtained was examined (Cronbach's α ranged from 0.86 to 0.96) and when the level of optimal exploration strategy used was analyzed (Cronbach's α ranged from 0.83 to 0.98; the answers to the warm-up tasks were excluded from these analyses).

Internal consistencies in scoring the MicroDYN problems: analyses based on both traditional CPS indicators and re-coded log data based on student behavior at the beginning of the problem-solving process.

Grade | α, traditional scoring | α, log-data-based scorings
3 | 0.83 | 0.87, 0.80, 0.83
4 | 0.77 | 0.86, 0.85, 0.86
5 | 0.78 | 0.90, 0.88, 0.90
6 | 0.72 | 0.91, 0.88, 0.93
7 | 0.74 | 0.92, 0.89, 0.94
8 | 0.80 | 0.92, 0.90, 0.95
9 | 0.83 | 0.96, 0.93, 0.97
10 | 0.85 | 0.94, 0.93, 0.96
11 | 0.86 | 0.94, 0.93, 0.98
12 | 0.83 | 0.93, 0.92, 0.97

[The individual column labels for the three log-data-based scorings could not be recovered from the source text.]
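The Cronbach's α values reported here can be computed from a students × items score matrix in a few lines. The following is an illustrative, self-contained sketch with toy data, not the study's computation:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a students x items matrix of item scores."""
    n_items = len(scores[0])

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of the individual item variances
    item_vars = sum(variance([row[j] for row in scores]) for j in range(n_items))
    # Variance of the total (sum) scores
    total_var = variance([sum(row) for row in scores])
    return n_items / (n_items - 1) * (1 - item_vars / total_var)
```

When items co-vary strongly, as with the re-coded strategy scores here, the total-score variance dominates the summed item variances and α approaches 1.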

Use of a theoretically effective strategy does not result in high performance (RQ1)

Use of a theoretically effective strategy did not always result in high performance. The percentage of effective strategy use and high CPS performance varied from 20 to 80%, depending on the complexity of the CPS tasks and the age group. The percentage of theoretically effective strategy use in each cohort increased by about 20% with age when problems of the same complexity were compared (Table 5) and decreased by about 20% as the number of input variables in the problems increased.

Percentage of theoretically effective and non-effective strategy use and high CPS performance.

Within each strategy group, the first value gives the percentage of students with an incorrect solution, the second the percentage with the correct solution (in parentheses: percentage of the whole sample), and the third the group's total share of the sample.

Grades 3–5
2-1 (2) | effective: 19.9 (11.6), 80.1 (46.6), total 58.2 | non-effective: 28.2 (11.8), 71.8 (30.0), total 41.8
2-2 (2) | effective: 81.5 (39.8), 18.5 (9.0), total 50.2 | non-effective: 97.2 (46.8), 2.8 (1.4), total 49.8
3-2 (3) | effective: 65.9 (21.5), 34.1 (11.1), total 32.6 | non-effective: 89.3 (60.2), 10.7 (7.2), total 67.4
3-3 (3) | effective: 60.2 (21.9), 39.8 (14.5), total 36.4 | non-effective: 77.1 (49.0), 22.9 (14.6), total 63.6

Grades 6–7
2-1 (2) | effective: 28.3 (18.7), 71.6 (47.2), total 65.9 | non-effective: 26.9 (9.2), 73.1 (24.9), total 34.1
2-2 (2) | effective: 72.4 (47.0), 27.5 (18.0), total 59.0 | non-effective: 98.2 (34.4), 1.8 (0.6), total 41.0
3-2 (3) | effective: 50.8 (22.9), 49.2 (22.2), total 45.0 | non-effective: 85.9 (47.2), 14.1 (7.8), total 54.9
3-3 (3) | effective: 52.6 (25.7), 47.4 (23.2), total 49.0 | non-effective: 77.3 (39.5), 22.7 (11.6), total 51.0

Grades 8–12
2-1 (2) | effective: 28.7 (21.9), 71.3 (54.5), total 76.4 | non-effective: 25.5 (6.0), 74.5 (17.6), total 23.6
2-2 (2) | effective: 59.4 (43.2), 40.6 (29.5), total 72.7 | non-effective: 98.2 (26.8), 1.8 (0.5), total 27.3
3-2 (3) | effective: 42.0 (22.8), 58.0 (31.4), total 54.2 | non-effective: 81.9 (37.5), 18.1 (8.3), total 45.8
3-3 (3) | effective: 39.4 (22.8), 60.6 (35.2), total 58.0 | non-effective: 74.1 (31.2), 25.8 (10.9), total 42.0

The percentage of theoretically effective strategy use was the same for the less complex problems in Grades 3–5 and for the most complex tasks in Grades 8–12 (58%). More than 80% of these students solved the problem correctly in the first case, but only 60% had the correct solution in the second case. There was a 50% probability of effective and non-effective strategy use for problems with two input and two output variables in Grades 3–5 and for problems with three input and three output variables in Grades 6–7. In Grades 8–12, the use of a theoretically effective strategy was always higher than 50%, independently of the complexity of the problems (with no internal dynamics). The guessing factor, that is, ad hoc optimization (use of a theoretically non-effective strategy with a correct solution), also changed, mostly depending on the complexity and position of the tasks in the test. The results confirmed our hypothesis that the use of a theoretically effective strategy does not necessarily lead to the correct solution and that the correct solution does not always reflect the use of even a theoretically effective problem-solving strategy.

Not all the VOTAT strategies result in high CPS performance (RQ2)

On average, only 15% of the theoretically effective strategy uses involved non-VOTAT strategies. The isolated variation strategy comprised 45% of the VOTAT strategies employed. It was the only theoretically effective strategy which always resulted in the correct solution with higher probability, independently of problem complexity or the grade of the students. The real advantage of this strategy was most remarkable in the case of the third cohort, where an average of 80% of the students who employed this strategy solved the problems correctly (Figures 3, 4).

Figure 3. Efficacy of the most frequently employed VOTAT strategies on problems with two input variables and one or two output variables in Grades 3–5, 6–7, and 8–12.

Figure 4. Efficacy of the most frequently employed VOTAT strategies on problems with three input variables and one or two output variables in Grades 3–5, 6–7, and 8–12.

The second most frequently employed and successful VOTAT strategy was the +1+2 type or the +1+2+2 type, depending on the number of input variables. In the +1+2 type, only one single input variable was manipulated in the first step, while the other variable remained at a neutral value; in the second step, only the other input variable was changed and the first retained the setting used previously. This proved to be relatively successful on problems with a low level of complexity independently of age, but it generally resulted in a good solution with a low level of probability on more complex problems.

VOTAT strategies of the +1+3 type (in the case of two input variables) and of the +1+1+2 type (in the case of three input variables) were employed even less frequently and with a lower level of efficacy than all the other VOTAT strategies (+1+1+3, +1+2+1, +1+2+2, +1+2+3, +1+3+1, +1+3+2 and +1+3+3 in the case of three input variables) and theoretically effective, non-VOTAT strategies (e.g., +4 in the case of two input variables or +1+4, +4+2 and +4+3 in the case of three input variables). In the following, we provide an example of the +4+2 type, where the MicroDYN problem has three input variables (A, B, and C) and three output variables. In the first trial, the problem solver set the input variables to the following values: 0 (for variable A), 1 (for variable B) and 1 (for variable C); that is, he or she changed two input variables at the same time. In the second trial, he or she again changed the value of two input variables at the same time, applying the following setting: 0 (for variable A), −2 (for variable B) and −1 (for variable C). In the third trial, he or she set variable A to 1 and left variables B and C unchanged. That is, the problem solver's input behavior can be described with the following trials: −X +4 +2. Based on this strategy, it was possible to map the relationships between the input and output variables without using any VOTAT strategy in the exploration phase.

Aware explorers perform significantly higher on the CPS tasks (RQ3)

We compared the achievement of the aware, isolated strategy users with that of the non-aware explorers (Table 6). The percentage of high achievers among the non-aware explorers seemed to be almost independent of age, but strongly influenced by the complexity of the problem and the learning effect we noted in the testing procedure (see RQ5). Results for problems with two input variables and one output variable confirmed our previous results, which showed that the probability of providing the correct solution is very high even without the aware use of a theoretically effective strategy (60–70%). With more complex problems, the difference between the percentages of aware and non-aware explorers was substantial. Generally, 85% of the non-aware explorers failed on the problems, while at least 80% of the aware, isolated strategy users were able to solve the problems correctly.

Percentage of high achievers among aware and non-aware explorers by grade and problem complexity.

Problem | Grades 3–5: non-aware / aware / difference | Grades 6–7: non-aware / aware / difference | Grades 8–12: non-aware / aware / difference
2-1 | 57.99 / 74.51 / 16.52 | 70.72 / 78.98 / 8.26 | 72.29 / 83.43 / 11.14
2-2 | 3.59 / 62.94 / 59.35 | 2.87 / 70.55 / 67.68 | 3.52 / 80.92 / 77.4
3-2 | 12.04 / 77.88 / 65.84 | 17.91 / 84.51 / 66.6 | 19.28 / 88.29 / 69.01
3-3 | 24.68 / 79.47 / 54.79 | 23.17 / 88.61 / 65.44 | 26.84 / 91.28 / 64.44

Six qualitatively different explorer class profiles can be distinguished at the end of the elementary level and five at the end of the secondary level (RQ4 and RQ5)

In all three cohorts, each of the information theory criteria used (AIC, BIC, and aBIC) indicated a continuous decrease with an increasing number of latent classes. The likelihood ratio statistical test (Lo–Mendell–Rubin Adjusted Likelihood Ratio Test) showed the best model fit in Grades 3–5 for the 4-class model, in Grades 6–7 for the 6-class model and in Grades 8–12 for the 5-class model. The entropy-based criterion reached its maximum values for the 2- and 3-class solutions, but it was also high for the best-fitting models based on the information theory and likelihood ratio criteria. Thus, the entropy index for the 4-class model showed that 80% of the 3rd- to 5th-graders, 82% of the 6th- to 7th-graders and 85% of the 8th- to 12th-graders were accurately categorized based on their class membership (Table 7).

Information theory, likelihood ratio and entropy-based fit indices for latent class analyses.

Grades 3–5
Classes | AIC | BIC | aBIC | Entropy | L–M–R test | p
2 | 13,383 | 13,512 | 13,433 | 0.854 | 2,301 | 0.001
3 | 12,687 | 12,883 | 12,763 | 0.870 | 714 | 0.001
5 | 12,448 | 12,778 | 12,574 | 0.766 | 139 | 0.051
6 | 12,362 | 12,758 | 12,514 | 0.782 | 110 | 0.100

Grades 6–7
2 | 13,383 | 13,512 | 13,433 | 0.854 | 2,301 | 0.001
3 | 12,751 | 12,947 | 12,826 | 0.873 | 1,068 | 0.001
4 | 12,576 | 12,840 | 12,678 | 0.819 | 198 | 0.001
5 | 12,497 | 12,827 | 12,624 | 0.814 | 104 | 0.004
7 | 12,402 | 12,866 | 12,580 | 0.828 | 50 | 0.498

Grades 8–12
2 | 8,232 | 8,319 | 8,265 | 0.941 | 2,197 | 0.001
3 | 7,718 | 7,850 | 7,768 | 0.856 | 524 | 0.001
4 | 7,690 | 7,869 | 7,757 | 0.829 | 44 | 0.002
6 | 7,705 | 7,976 | 7,807 | 0.770 | 4 | 0.561

[The rows for the best-fitting solutions (4 classes in Grades 3–5, 6 classes in Grades 6–7 and 5 classes in Grades 8–12) could not be recovered from the source text.]

AIC, Akaike Information Criterion; BIC, Bayesian Information Criterion; aBIC, adjusted Bayesian Information Criterion; L–M–R test, Lo–Mendell–Rubin Adjusted Likelihood Ratio Test.

We distinguished four latent classes in the lower grades based on the exploration strategy employed and the level of isolated variation strategy used (Table 8): 40.5% of the students proved to be non-performing explorers on the basis of their strategic patterns in the CPS environments; they did not use any isolated or partially isolated variation at all. 23.6% of the students were among the low-performing explorers, who only rarely employed a fully or partially isolated variation strategy (with 0–20% probability on the less complex problems and 0–5% probability on the more complex problems). 24.7% of the 3rd- to 5th-graders were categorized as slow learners: intermediate performers, in terms of the efficiency of the exploration strategy used, on the easiest problems, showing a slow learning effect, but low-performing explorers on the complex ones. In addition, 11.1% of the students proved to be proficient explorers, who used the isolated or partially isolated variation strategy with 80–100% probability on all the proposed CPS problems.

Relative frequencies and average latent class probabilities across grade levels 3–5, 6–7, and 8–12.

Latent class | Grades 3–5: % (probability) | Grades 6–7: % (probability) | Grades 8–12: % (probability)
Non-performers | 40.5 (0.94) | 30.9 (0.92) | 32.2 (0.92)
Low performers | 23.6 (0.86) | 14.0 (0.84) | 16.2 (0.87)
Intermediate performers on the easiest problems, but low performers on complex ones, with a very slow learning effect | 24.7 (0.82) | 26.2 (0.86) | —
Rapid learners | — | 4.4 (0.86) | 7.7 (0.96)
Almost high performers on the easiest problems, but low performers on complex ones, with a slow learning effect | — | 10.3 (0.82) | 17.6 (0.79)
Proficient strategy users | 11.1 (0.97) | 14.2 (0.96) | 26.3 (0.97)

In Grades 6–7, in which achievement proved to be significantly higher on average, about ten percentage points fewer students were observed in each of the first two classes (non-performing explorers and low-performing explorers). The percentage of intermediate explorers remained almost the same (26%), and the analyses revealed two further classes: rapid learners (4.4%) and slow learners, who were almost proficient explorers on the easiest problems, employing the fully or partially isolated variation strategy with 60–80% probability, but low-performing explorers on the complex ones (10.3%). The frequency of proficient strategy users also increased (to 14.2%) compared to students in the lower grades. Finally, there was almost no change detected in the non- and low-performing explorers' classes in Grades 8–12. We did not detect anyone in the class of intermediate explorers; they must have developed further and become (1) rapid learners (7.7%), (2) slow learners with almost high achievement with regard to the exploration strategy used on the easiest problems, but low achievement on the complex ones (17.6%), or (3) proficient strategy users (26.3%), whose achievement was high on both the simplest and the most complex problems.

Based on these results, the percentage of non- and low-performing explorers, who have very low exploration skills and do not learn during testing, decreased from almost 65% to 50% between the lower and upper primary school levels and then remained constant at the secondary level. There was a slight increase in the percentage of rapid learners. The students in that group used the fully or partially isolated strategy at very low levels at the beginning of the test, but they learned very quickly and discovered these effective exploration strategies; thus, by the end of the test, their proficiency level with regard to exploration was equal to the top performers' achievement. However, we were unable to detect a class of rapid learners among the 3rd- to 5th-graders.

Generally, students' level of exploration expertise with regard to fully and partially isolated variation improved significantly with age (F = 70.376, p < 0.001). In line with our expectations based on the achievement differences among students in Grades 3–5, 6–7 and 8–12, there were also significant differences in the level of expertise in fully or partially isolated strategy use during problem exploration between the 3rd- to 5th- and 6th- to 7th-grade students (t = −6.833, p < 0.001, d = 0.03) and between the 6th- to 7th- and 8th- to 12th-grade students (t = −6.993, p < 0.001, d = 0.03).

In this study, we examined 3rd- to 12th-grade (aged 9–18) students' problem-solving behavior by means of a logfile analysis to identify qualitatively different exploration strategies. Students' activity in the first phase of the problem-solving process was coded according to a mathematical model that was developed based on strategy effectiveness and then clustered for comparison. Reliability analyses of students' strategy use indicated that the strategies used in the knowledge acquisition phase described students' development (ability level) better than traditional quantitative psychometric indicators, including the goodness of the model. The high reliability indices indicate that there are untapped possibilities in analyzing log data. Our analyses of logfiles extracted from a simulation-based assessment of problem solving have expanded the scope of previous studies and made it possible to identify a central component of children's scientific reasoning: the way students understand how scientific experiments can be designed and how causal relationships can be explored by systematically changing the values of (independent) variables and observing their impact on other (target) variables.

In this way, we have introduced a new labeling and scoring method that can be employed in addition to the two scores that have already been used in previous studies. We have found that using this scoring method (based on student strategy use) improves the reliability of the test. Further studies are needed to examine the validity of the scale based on this method and to determine what this scale really measures. We may assume that the general idea of varying the values of the independent variables and connecting them to the resultant changes in the target variable is the essence of scientific reasoning and that the systematic manipulation of variables is related to combinatorial reasoning, while summarizing one's observations and plotting a model is linked to rule induction. Such further studies have to place CPS testing in the context of other cognitive tests and may contribute to efforts to determine the place of CPS in a system of cognitive abilities (see e.g., Wüstenberg et al., 2012 ).

We have found that the use of a theoretically effective strategy does not always result in high performance. This is not surprising; it confirms the findings of de Jong and van Joolingen (1998), who argue that learners often have trouble interpreting data. As we observed earlier, using a systematic strategy requires combinatorial thinking, while drawing a conclusion from one's observations requires rule induction (inductive reasoning). Students who show systematic strategies but fail to solve the problem may possess combinatorial skills but lack the necessary level of inductive reasoning. It is more difficult to explain the other direction of discrepancy, when students solve the problem without an effective (complete) strategy. Evidently, solving the problem does not always require a strategy that provides the problem solver with sufficient information about the problem environment to form the correct solution. This finding is in line with previous research (e.g., Vollmeyer et al., 1996; Greiff et al., 2015). Goode and Beckmann (2010) reported two qualitatively different but equally effective approaches: knowledge-based and ad hoc control.

In the present study, the contents of the problems were not based on real knowledge, and the causal relationships between the variables were artificial. Content knowledge was therefore no help to the students in filling the gap between the insufficient information acquired from interaction and the successful solution to the problem. We may assume that students guessed intuitively in such a case. Further studies may ascertain how students guess in such situations.

The percentage of success is influenced by the complexity of the CPS tasks, the type of theoretically effective strategy used, the age group and, finally, the degree to which the strategy was consciously employed.

The most frequently employed effective strategies fell within the class of VOTAT strategies. Almost half of them were of the isolated variation type, which led to the correct solution with high probability regardless of problem complexity or student grade. As noted earlier, not all the VOTAT strategies resulted in high CPS performance; the other VOTAT strategies proved significantly less successful. Some of them worked with relative success on problems of low complexity but failed with high probability on more complex problems, independently of age group. Overall, the advantage of the isolated variation strategy (Wüstenberg et al., 2014), in which students examined the effect of each input variable on the output variables independently, is clearly evident from the outcomes: it produced a good solution with the highest probability and proved to be the most effective VOTAT strategy independently of student age or problem complexity.

Besides the type of strategy used, awareness also played an influential role. Aware VOTAT strategy users proved to be the most successful explorers, followed by non-aware VOTAT strategy users and then by theoretically effective but non-VOTAT strategy users. Aware users managed to represent the information they had obtained from the system more effectively and made better decisions in the problem-solving process than their peers.

We noted both qualitative and quantitative changes in problem-solving behavior in the age range under examination. Using latent class analyses, we identified six qualitatively different class profiles during compulsory schooling. (1) Non-performing and (2) low-performing students employed the fully or partially isolated variation strategy rarely, if at all; they basically demonstrated unsystematic exploration behavior. (3) Proficient strategy users consistently employed optimal exploration strategies from the very first problem, relying on the isolated variation strategy and only seldom on partially isolated variation; they must have had more elaborated schemas available. (4) Slow learners were intermediate performers on the easiest problems but low performers on the complex ones, while members of a further class (5) were high performers on the easiest problems but low performers on the complex ones. Most members of these groups managed to employ the principle of isolated or partially isolated variation and had an understanding of it, but they were only able to use it on the easiest tasks and then showed a rapid decline on the more complex CPS problems; they may have been cognitively overloaded by the increasingly difficult problem-solving environments they faced. (6) Rapid learners formed a very small group from an educational point of view. These students started out as non-performers in their exploration behavior on the first CPS tasks, showed a rapid learning curve afterwards, and began to use the partially isolated variation strategy and then the fully isolated variation strategy with increasing frequency. By the end of the test, they reached the same high level of exploration behavior as the proficient explorers. We observed no so-called intermediate strategy users, i.e., students who used the partially isolated variation strategy almost exclusively throughout the test.
As we expected, membership in the more proficient classes increased significantly at the higher grade levels due to the effects of cognitive maturation and schooling, but membership in the two lowest-level classes did not change noticeably.
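The study fitted its latent class models in specialized software (Mplus); as a rough, hypothetical stand-in for that clustering step, a simple k-means routine over per-task strategy scores conveys the idea of grouping students into qualitatively similar profiles.

```python
import numpy as np

def kmeans(profiles, k, iters=50, seed=0):
    """Tiny k-means used as a stand-in for latent class analysis:
    groups students with similar per-task strategy-score profiles.

    profiles: sequence of equal-length score vectors, one per student.
    Returns (labels, centers): a cluster index per student and the
    cluster mean profiles.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(profiles, dtype=float)
    # Initialize centers on k distinct student profiles.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each student to the nearest center (squared Euclidean).
        dists = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # Move each center to the mean of its assigned profiles.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Unlike latent class analysis, k-means gives no likelihood-based fit indices (BIC, Lo-Mendell-Rubin test) for choosing the number of profiles, which is why the study's actual class enumeration relied on mixture modeling.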

Limitations of the study include the low sample size for secondary school students; replication is therefore required for validation. The generalizability of the results is also limited by the effects of semantic embedding (i.e., cover stories and variable labels), that is, the use of different fictitious cover stories "with the intention of minimizing the uncontrollable effects of prior knowledge, beliefs or suppositions" (Beckmann and Goode, 2017). Assumptions triggered by semantic contexts have an impact on exploration behavior (e.g., the range of interventions or the strategies employed by the problem solver; Beckmann and Goode, 2014), that is, on how the problem solver interacts with the system. A further limitation lies in the characteristics of the interface used: in our view, analyses of VOTAT strategies are only meaningful in systems whose interface does not automatically reset the inputs to zero from one trial to the next (Beckmann and Goode, 2017), and we therefore excluded problem environments with such interfaces from the study. Finally, the generalizability of the results is limited by the fact that we omitted problems with autonomic changes from the analyses.

The main reason why we excluded systems containing autoregressive dependencies from the analyses is that such problems require different strategy usage: they also involve the use of trial +A (according to our coding of sub-strategies), which is not among the effective sub-strategies for problems without autonomic changes. Analyses of students' behavior on problems with autonomic changes will form part of further studies, as will a refinement of the definition of what makes a problem complex and difficult; we plan to adapt the Person, Task and Situation framework published by Beckmann and Goode (2017). The role of ad hoc control behavior was also excluded from the analyses; further studies are required to ascertain the importance of repetitive control behavior. Another limitation of the study could be the interpretation of the differences across age group clusters as indicators of development rather than as a lack of stability of the model employed.

These results shed new light on and provide a new interpretation of previous analyses of complex problem solving in the MicroDYN approach. They also highlight the importance of explicitly enhancing problem-solving skills and problem-solving strategies during school lessons as a tool for applying knowledge in new contexts.

Ethics statement

Ethical approval was not required for this study based on national and institutional guidelines. The assessments which provided data for this study were integrated parts of the educational processes of the participating schools. The coding system for the online platform masked students' identity; the data cannot be connected to the students. The results from the no-stakes diagnostic assessments were disclosed only to the participating students (as immediate feedback) and to their teachers. Because of the anonymity and no-stakes testing design of the assessment process, it was not required or possible to request and obtain written informed parental consent from the participants.

Author contributions

Both the authors, GM and BC, certify that they have participated sufficiently in the work to take responsibility for the content, including participation in the concept, design and analysis as well as the writing and final approval of the manuscript. Each author agrees to be accountable for all aspects of the work.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

1 With regard to terminology, please note that different terms are used for the subject at hand (e.g., complex problem solving, dynamic problem solving, interactive problem solving and creative problem solving). In this paper, we use the modifier “complex” (see Csapó and Funke, 2017 ; Dörner and Funke, 2017 ).

Funding. This study was funded by OTKA K115497.

References

  • Beckmann, J. F., and Goode, N. (2014). The benefit of being naïve and knowing it? The unfavourable impact of perceived context familiarity on learning in complex problem solving tasks. Instructional Sci. 42, 271–290. doi: 10.1007/s11251-013-9280-7
  • Beckmann, J. F., Birney, D. P., and Goode, N. (2017). Beyond psychometrics: the difference between difficult problem solving and complex problem solving. Front. Psychol. 8:1739. doi: 10.3389/fpsyg.2017.01739
  • Beckmann, J. F., and Goode, N. (2017). Missing the wood for the wrong trees: on the difficulty of defining the complexity of complex problem solving scenarios. J. Intell. 5:2. doi: 10.3390/jintelligence5020015
  • Chen, Z., and Klahr, D. (1999). All other things being equal: acquisition and transfer of the control of variables strategy. Child Dev. 70, 1098–1120. doi: 10.1111/1467-8624.00081
  • Collins, L. M., and Lanza, S. T. (2010). Latent Class and Latent Transition Analysis: With Applications in the Social, Behavioral, and Health Sciences. New York, NY: Wiley.
  • Csapó, B., Ainley, J., Bennett, R., Latour, T., and Law, N. (2012). "Technological issues of computer-based assessment of 21st century skills," in Assessment and Teaching of 21st Century Skills, eds P. Griffin, B. McGaw, and E. Care (New York, NY: Springer), 143–230.
  • Csapó, B., and Funke, J. (eds.). (2017). The Nature of Problem Solving. Using Research to Inspire 21st Century Learning. Paris: OECD.
  • Csapó, B., and Molnár, G. (2017). Potential for assessing dynamic problem-solving at the beginning of higher education studies. Front. Psychol. 8:2022. doi: 10.3389/fpsyg.2017.02022
  • Csapó, B., Molnár, G., and Nagy, J. (2014). Computer-based assessment of school-readiness and reasoning skills. J. Educ. Psychol. 106, 639–650. doi: 10.1037/a0035756
  • de Jong, T., and van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Rev. Educ. Res. 68, 179–201. doi: 10.3102/00346543068002179
  • Demetriou, A., Efklides, A., and Platsidou, M. (1993). The architecture and dynamics of developing mind: experiential structuralism as a frame for unifying cognitive developmental theories. Monogr. Soc. Res. Child Dev. 58, 1–205. doi: 10.2307/1166053
  • Dörner, D., and Funke, J. (2017). Complex problem solving: what it is and what it is not. Front. Psychol. 8:1153. doi: 10.3389/fpsyg.2017.01153
  • Dziak, J. J., Coffman, D. L., Lanza, S. T., and Li, R. (2012). Sensitivity and Specificity of Information Criteria. The Pennsylvania State University: The Methodology Center and Department of Statistics. Available online at: https://methodology.psu.edu/media/techreports/12-119.pdf
  • Fischer, A., Greiff, S., and Funke, J. (2012). The process of solving complex problems. J. Probl. Solving 4, 19–42. doi: 10.7771/1932-6246.1118
  • Funke, J. (2001). Dynamic systems as tools for analysing human judgement. Think. Reason. 7, 69–89. doi: 10.1080/13546780042000046
  • Funke, J. (2010). Complex problem solving: a case for complex cognition? Cogn. Process. 11, 133–142. doi: 10.1007/s10339-009-0345-0
  • Funke, J. (2014). Analysis of minimal complex systems and complex problem solving require different forms of causal cognition. Front. Psychol. 5:739. doi: 10.3389/fpsyg.2014.00739
  • Galotti, K. M. (2011). Cognitive Development. Thousand Oaks, CA: SAGE.
  • Gardner, P. H., and Berry, D. C. (1995). The effect of different forms of advice on the control of a simulated complex system. Appl. Cogn. Psychol. 9:7. doi: 10.1002/acp.2350090706
  • Goode, N., and Beckmann, J. F. (2010). You need to know: there is a causal relationship between structural knowledge and control performance in complex problem solving tasks. Intelligence 38, 345–352. doi: 10.1016/j.intell.2010.01.001
  • Greiff, S., and Funke, J. (2010). Systematische Erforschung komplexer Problemlösefähigkeit anhand minimal komplexer Systeme. Zeitschrift für Pädagogik 56, 216–227.
  • Greiff, S., and Funke, J. (2017). "Interactive problem solving: exploring the potential of minimal complex systems," in The Nature of Problem Solving. Using Research to Inspire 21st Century Learning, eds B. Csapó and J. Funke (Paris: OECD), 93–105.
  • Greiff, S., Wüstenberg, S., and Avvisati, F. (2015). Computer-generated log-file analyses as a window into students' minds? A showcase study based on the PISA 2012 assessment of problem solving. Comput. Educ. 91, 92–105. doi: 10.1016/j.compedu.2015.10.018
  • Greiff, S., Wüstenberg, S., Csapó, B., Demetriou, A., Hautamäki, H., Graesser, A. C., et al. (2014). Domain-general problem solving skills and education in the 21st century. Educ. Res. Rev. 13, 74–83. doi: 10.1016/j.edurev.2014.10.002
  • Greiff, S., Wüstenberg, S., and Funke, J. (2012). Dynamic problem solving: a new assessment perspective. Appl. Psychol. Meas. 36, 189–213. doi: 10.1177/0146621612439620
  • Greiff, S., Wüstenberg, S., Goetz, T., Vainikainen, M.-P., Hautamäki, J., and Bornstein, M. H. (2015). A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence. Front. Psychol. 6:1060. doi: 10.3389/fpsyg.2015.01060
  • Greiff, S., Wüstenberg, S., Molnár, G., Fischer, A., Funke, J., and Csapó, B. (2013). Complex problem solving in educational contexts – something beyond g: concept, assessment, measurement invariance, and construct validity. J. Educ. Psychol. 105, 364–379. doi: 10.1037/a0031856
  • Griffin, P., McGaw, B., and Care, E. (2012). Assessment and Teaching of 21st Century Skills. Dordrecht: Springer.
  • Inhelder, B., and Piaget, J. (1958). The Growth of Logical Thinking from Childhood to Adolescence. New York, NY: Basic Books.
  • Klahr, D., and Dunbar, K. (1988). Dual space search during scientific reasoning. Cogn. Sci. 12, 1–48. doi: 10.1207/s15516709cog1201_1
  • Klahr, D., Triona, L. M., and Williams, C. (2007). Hands on what? The relative effectiveness of physical versus virtual materials in an engineering design project by middle school children. J. Res. Sci. Teach. 44, 183–203. doi: 10.1002/tea.20152
  • Kröner, S., Plass, J. L., and Leutner, D. (2005). Intelligence assessment with computer simulations. Intelligence 33, 347–368. doi: 10.1016/j.intell.2005.03.002
  • Kuhn, D., Iordanou, K., Pease, M., and Wirkala, C. (2008). Beyond control of variables: what needs to develop to achieve skilled scientific thinking? Cogn. Dev. 23, 435–451. doi: 10.1016/j.cogdev.2008.09.006
  • Lo, Y., Mendell, N. R., and Rubin, D. B. (2001). Testing the number of components in a normal mixture. Biometrika 88, 767–778. doi: 10.1093/biomet/88.3.767
  • Lotz, C., Scherer, R., Greiff, S., and Sparfeldt, J. R. (2017). Intelligence in action – effective strategic behaviors while solving complex problems. Intelligence 64, 98–112. doi: 10.1016/j.intell.2017.08.002
  • Mayer, R. E., and Wittrock, M. C. (1996). "Problem-solving transfer," in Handbook of Educational Psychology, eds D. C. Berliner and R. C. Calfee (New York, NY; London: Routledge), 47–62.
  • Molnár, G., Greiff, S., and Csapó, B. (2013). Inductive reasoning, domain specific and complex problem solving: relations and development. Think. Skills Creat. 9, 35–45. doi: 10.1016/j.tsc.2013.03.002
  • Molnár, G., Greiff, S., Wüstenberg, S., and Fischer, A. (2017). "Empirical study of computer-based assessment of domain-general complex problem-solving skills," in The Nature of Problem Solving. Using Research to Inspire 21st Century Learning, eds B. Csapó and J. Funke (Paris: OECD), 125–140.
  • Müller, J. C., Kretzschmar, A., and Greiff, S. (2013). "Exploring exploration: inquiries into exploration behavior in complex problem solving assessment," in Proceedings of the 6th International Conference on Educational Data Mining, eds S. K. D'Mello, R. A. Calvo, and A. Olney, 336–337. Available online at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.666.6664&rep=rep1&type=pdf
  • Muthén, L. K., and Muthén, B. O. (2012). Mplus User's Guide, 7th Edn. Los Angeles, CA: Muthén and Muthén.
  • OECD (2014). PISA 2012 Results: Creative Problem Solving. Students' Skills in Tackling Real-Life Problems (Volume V). Paris: OECD.
  • Osman, M., and Speekenbrink, M. (2011). Cue utilization and strategy application in stable and unstable dynamic environments. Cogn. Syst. Res. 12, 355–364. doi: 10.1016/j.cogsys.2010.12.004
  • Schoppek, W., and Fischer, A. (2017). Common process demands of two complex dynamic control tasks: transfer is mediated by comprehensive strategies. Front. Psychol. 8:2145. doi: 10.3389/fpsyg.2017.02145
  • Siegler, R. S. (1999). Strategic development. Trends Cogn. Sci. 3, 430–435.
  • Sonnleiter, P., Keller, U., Martin, R., Latour, T., and Brunner, M. (2017). "Assessing complex problem solving in the classroom: meeting challenges and opportunities," in The Nature of Problem Solving. Using Research to Inspire 21st Century Learning, eds B. Csapó and J. Funke (Paris: OECD), 159–174.
  • Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learn. Instruct. 4, 295–312. doi: 10.1016/0959-4752(94)90003-5
  • Tein, J. Y., Coxe, S., and Cham, H. (2013). Statistical power to detect the correct number of classes in latent profile analysis. Struct. Equ. Modeling 20, 640–657. doi: 10.1080/10705511.2013.824781
  • Tóth, K., Rölke, H., Goldhammer, F., and Barkow, I. (2017). "Educational process mining: new possibilities for understanding students' problem-solving skills," in The Nature of Problem Solving. Using Research to Inspire 21st Century Learning, eds B. Csapó and J. Funke (Paris: OECD), 193–210.
  • Tschirgi, J. E. (1980). Sensible reasoning: a hypothesis about hypotheses. Child Dev. 51, 1–10. doi: 10.2307/1129583
  • Vollmeyer, R., Burns, B. D., and Holyoak, K. J. (1996). The impact of goal specificity on strategy use and the acquisition of problem structure. Cogn. Sci. 20, 75–100. doi: 10.1207/s15516709cog2001_3
  • Wüstenberg, S., Greiff, S., and Funke, J. (2012). Complex problem solving. More than reasoning? Intelligence 40, 1–14. doi: 10.1016/j.intell.2011.11.003
  • Wüstenberg, S., Stadler, M., Hautamäki, J., and Greiff, S. (2014). The role of strategy knowledge for the application of strategies in complex problem solving tasks. Technol. Knowledge Learn. 19, 127–146. doi: 10.1007/s10758-014-9222-8
  • Zoanetti, N., and Griffin, P. (2017). "Log-file data as indicators for problem-solving processes," in The Nature of Problem Solving. Using Research to Inspire 21st Century Learning, eds B. Csapó and J. Funke (Paris: OECD), 177–191.

Introduction: Games, Gamification, and Virtual Environments


Welcome to the games, gamification, and virtual environments special issue of the Journal of Technology-Integrated Lessons and Teaching ( JTILT ). Why a special issue on games? Because games have merit! Games can be played by anyone, can provide safe ways to practice essential skills, and can adapt to various interests and settings.

As more states and nations require computational thinking skills in P-12 education, connections between games and problem solving, algorithmic thinking, decomposition, and abstraction become visible. Players demonstrate these problem-solving skills when weighing the probability of success of certain moves, placing characters to minimize damage, or fine-tuning resources to maximize gains. The variation in complexity, type, genre, time requirements, player interactions, and so forth helps make games so popular: individuals can locate the exemplars that match their interests.
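As a toy illustration (our example, not drawn from the special issue), the in-game reasoning described above, weighing the odds of a move succeeding, amounts to a small probability calculation such as:

```python
from itertools import product

def p_roll_at_least(target, dice=2, sides=6):
    """Probability that the sum of `dice` fair `sides`-sided dice is >= target.

    This is the quick odds estimate a player makes before risking a move
    that only pays off on a sufficiently high roll.
    """
    rolls = list(product(range(1, sides + 1), repeat=dice))
    hits = sum(1 for r in rolls if sum(r) >= target)
    return hits / len(rolls)
```

For two six-sided dice, rolling at least 7 succeeds in 21 of the 36 equally likely outcomes, so a player needing a 7+ can expect to succeed a bit more than half the time.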

Article Details

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Craig Erschel Shepherd, University of Memphis

Craig E. Shepherd, Ph.D. is an Associate Professor of Instructional Design and Technology at the University of Memphis.

Cecil R Short, Emporia State University

Cecil R. Short is an assistant professor and Director of Secondary Education at Emporia State University.


Mathematical Modelling Abilities of Artificial Intelligence Tools: The Case of ChatGPT


Share and Cite

Spreitzer, C.; Straser, O.; Zehetmeier, S.; Maaß, K. Mathematical Modelling Abilities of Artificial Intelligence Tools: The Case of ChatGPT. Educ. Sci. 2024, 14, 698. https://doi.org/10.3390/educsci14070698





  18. Complex Problem Solving in Teams: The Impact of Collective Orientation

    Abstract. Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model ...

  19. Problem-solving skills, solving problems and problem-based learning

    This paper reviews the empirical evidence in support of the three concepts in the title. To the extent that a skill should be a general strategy, applicable in a variety of situations, and independent of the specific knowledge of the situation, there is little evidence that problem-solving skills, as described and measured in medical education, possess these characteristics.

  20. Problem solving skills: esssential skills challenges for the 21st

    Problem solving skills are the abilities to identify problems, search and select various alternative solutions and make decisions in solving all the problems at hand. Problem solving skills are 21st century skills that are needed by society and the world of work. This research is a descriptive quantitative research. The research sample ...

  21. Problem Solving and Teaching How to Solve Problems in Technology-Rich

    By drawing from the literature on technological pedagogical content knowledge, design thinking, general and specific methods of problem solving, and role of technologies for solving problems, this article highlights the importance of problem solving for future teachers and discusses strategies that can help them become good problem solvers and ...

  22. Collaborative Problem-Solving in Knowledge-Rich Domains: A ...

    Collaborative skills are highly relevant in many situations, ranging from computer-supported collaborative learning to collaborative problem-solving in professional practice (Fiore et al., 2018).While several broad collaborative problem-solving frameworks exist (OECD, 2017), most of them are situated in knowledge-lean settings.However, one example of collaborative problem-solving of knowledge ...

  23. The Efficacy and Development of Students' Problem-Solving Strategies

    Reasoning strategies in complex problem solving. Problem-solving skills have been among the most extensively studied transversal skills over the last decade; they have been investigated in the most prominent comprehensive international large-scale assessments today (e.g., OECD, 2014).The common aspects in the different theoretical models are that a problem is characterized by a gap between the ...

  24. Introduction: Games, Gamification, and Virtual Environments

    Welcome to the games, gamification, and virtual environments special issue of the Journal of Technology-Integrated Lessons and Teaching (JTILT). Why a special issue on games? Because games have merit! Games can be played by anyone, can provide safe ways to practice essential skills, and can adapt to various interests and settings. As more states and nations require computational thinking ...

  25. Full article: Measuring collaborative problem solving: research agenda

    The Human-Agent vs. Human-Human discussion. After the individual interactive problem-solving assessment in 2012, the OECD decided that problem-solving skills would be assessed again in 2015 (Csapó & Funke, Citation 2017).However, this time the focus of the assessment was the individual's capacity for solving problems collaboratively instead of on his or her own.

  26. Education Sciences

    This work explores the mathematical modelling capabilities of various iterations of ChatGPT, focusing on their performance across tasks of differing complexity and openness. The study examines the abilities of GPT-3.5, GPT-4.0, and a more instructed version, GPT-MM, in multiple scenarios. It is observed that all versions demonstrate basic mathematical problem-solving skills. However, their ...