
Using Simulation as an Investigative Methodology in Researching Competencies of Clinical Social Work Practice: A Scoping Review

Kenta Asakura

1 Carleton University School of Social Work, Ottawa, Canada

Ruxandra M. Gheorghe

Stephanie Borgen, Karen Sewell, Heather MacDonald

2 Carleton University MacOdrum Library, Ottawa, Canada

Abstract

This article reports a scoping review designed to synthesize the current literature that used simulation as an investigative methodology (simulation-based research; SBR) in researching practice competencies in clinical social work. Following Arksey and O’Malley’s scoping review framework, 24 articles were included in this scoping review. The majority of articles reported SBR studies conducted in Canada and the U.S. and were published in the last 10 years, signifying that this is a burgeoning area of research in clinical social work. Areas of clinical competency included professional decision-making (33%), the role of cognition and emotion (21%), attending to culture and diversity (21%), and others, such as supervision skills (8%). The studies used qualitative (46%), quantitative (42%), and mixed-methods (13%) designs, and more than half of them (54%) used live actors to simulate a realistic practice situation for research. The selected articles also identified both benefits and limitations of SBR in social work. We offer suggestions for when to use SBR in research on clinical social work practice and for strengthening collaboration between clinicians and researchers in advancing practice-informed research.

Introduction

While the use of simulation was originally introduced in medical education (Cleland et al. 2009), it is gaining considerable attention in clinical social work education (Bogo et al. 2011, 2014; Rawlings 2012), with the potential to enhance the knowledge, skill, judgement, and self-awareness of practicing clinical social workers. Simulation in social work generally refers to a situation in which a student or practitioner engages with a trained actor (often known as a “Standardized Patient,” or SP) or a virtual reality program that portrays a well-designed character and/or practice scenario. A growing body of research evidence suggests that simulation offers concrete experiential learning opportunities well suited to helping students apply knowledge, values, and skills in practice (Bogo et al. 2014; Kourgiantakis et al. 2019), and it has become a widely recognized approach in clinical social work education. While simulation has been used in the training and assessment of clinical practitioners, it is also emerging as a novel methodology for research on practice competencies. Understanding how simulation is used in this capacity can help practicing social workers evaluate simulation-based research contributions and translate the generated knowledge into their practice.

There is a burgeoning body of literature suggesting that simulation can serve as a promising research methodology for studies of practice competencies in medicine and other healthcare fields (e.g., Cheng et al. 2014; Halamek 2013). While these reports and guidelines on simulation-as-methodology from medicine and other fields offer important research innovations, they might not be directly applicable to social work. To date, no report discusses how simulation can be used as a research methodology in social work. The purpose of this scoping review was to systematically search and summarize the current state of the social work literature in which simulation (e.g., trained actors as standardized clients, virtual reality, staged environments) was used as an investigative methodology in researching practice competencies. As articulated by the Council on Social Work Education (2015), competencies of social work practice require the ability to integrate “knowledge, values, skills, and cognitive and affective processes that include … critical thinking, affective reactions, and exercise of judgment” (p. 6) in attending to each unique client and practice situation. In this review, we focus on competencies related to clinical practice, which “address(es) the needs of individuals, families, couples and groups affected by life changes and challenges” (National Association of Social Workers 2005, p. 8). Given that supervision is included in the NASW’s standards of clinical social work practice, we also consider supervision skills part of clinical social work competencies.

Simulation-Based Research

Cheng et al. (2014) defined simulation-based research (SBR) as a program of research involving the use of live SPs or other kinds of simulated practice situations (e.g., computerized mannequins, virtual reality), and categorized it into two types: (1) research that evaluates simulation-based teaching and learning, and (2) research that uses simulation as a methodological tool. Much is already known about the use of simulation in the teaching and learning of social work (e.g., Bogo et al. 2014; Kourgiantakis et al. 2019). In this article, we focus on the latter type of SBR: the use of simulation as a novel research methodology.

There is a growing body of research in medicine and related fields in which simulation is used as an investigative methodology for studying practice competencies. For instance, Geurtzen et al. (2014) examined cultural differences in prenatal counseling by observing how American and Dutch neonatologists worked with a simulated patient. Zhou et al. (2013) used data from Objective Structured Clinical Examinations (OSCEs), a type of experiential exam commonly used in professional schools in which students’ competency is assessed based on their engagement with SPs. Others hired and trained SPs to portray covert or “mystery” patients to examine the delivery of TB screening in clinics (Christian et al. 2018), service access and wait times in the public youth mental health system (Olin et al. 2016), and the quality of pharmacist practice (Ibrahim et al. 2018).

A few clinical researchers have published review papers or guidelines on the use of simulation as a research methodology, in fields including family medicine (Beullens et al. 1997), pediatrics (Cheng et al. 2014), neonatal medicine (Halamek 2013), pharmacy (Watson et al. 2006), and public health (Chandra-Mouli et al. 2018). These methodological reports are written for specific research foci, such as patient safety (Guise et al. 2017), quality of healthcare (Fitzpatrick and Tumlinson 2017), and provider behaviors and communication (Madden et al. 1997; Siminoff et al. 2011). In general, they suggest that simulation is a promising methodology for studying practice competencies. Because much of what happens in medical care between physicians and real patients is highly unpredictable, standardized patient presentations and highly controlled clinical environments allow researchers to assess healthcare delivery and practitioners’ skills objectively, especially when the authenticity of the simulation is ensured (e.g., Cheng et al. 2014; Halamek 2013). The use of simulated patients also makes research recruitment easier than recruiting real patients (e.g., Cheng et al. 2014). At the same time, SBR is time, labor, and cost intensive (e.g., researchers’ time, money, availability of technology) and can limit the number of research participants (e.g., Beullens et al. 1997; Guise et al. 2017). When relying on a simulated situation, research is often limited to one meeting between the physician and the patient, making it difficult for researchers to make longitudinal observations or track changes over time (Beullens et al. 1997).
Finally, these reviews warn that the novelty of this research methodology means that much remains unknown about the best practices in the areas of data collection (e.g., video capture, technological requirements and limitations), the use of appropriate outcome measures, and data analysis (e.g., how to analyze the video) (e.g., Cheng et al. 2014 ; Guise et al. 2017 ; Siminoff et al. 2011 ).

Psychotherapy Process Research

There is surprisingly little SBR in psychology (e.g., clinical and counseling psychology) and psychiatry. This is likely because psychologists and psychiatrists often engage in clinical research in which researchers examine actual therapy sessions, either through direct observation of audio- or video-recorded sessions or through self-reports from therapists or clients (Orlinsky et al. 2015). Psychotherapy process research (PPR) is a highly developed paradigm of clinical research in which researchers examine what happens, and how change happens, in psychotherapy between the client and the therapist, with a focus on client outcomes (Hardy and Llewelyn 2015; Knobloch-Fedders et al. 2015). Although it is beyond the scope of this paper to discuss the large body of PPR, its primary focus is on what happens between the therapist and the client (Hardy and Llewelyn 2015). By studying actual therapy sessions, those engaged in PPR explicate the mechanisms of treatment and client change processes, identify key elements of effective treatment, develop theories and models of psychotherapy practice, and use research results to improve psychotherapy training (Hardy and Llewelyn 2015). Although a number of PPR studies include social workers in their samples along with other professionals, PPR appears to be a research paradigm employed primarily by psychologists and psychiatrists. With a few exceptions, such as studies on cross-cultural engagement (Tsang et al. 2011; Lee and Horvath 2013, 2014), racial microaggressions (Lee et al. 2018), and whiteness (Lee and Bhuyan 2013), social work researchers rarely engage in PPR. This is likely due to the broad scope of social work practice (i.e., services beyond outpatient therapy) and ethical concerns around the vulnerabilities of social work clients.
Simulation is well suited to addressing these realities and, as such, is beginning to emerge in social work as a research methodology for the direct observation and analysis of clinical practice. While research on clinical social work practice has historically relied on retrospective data from clinicians and clients (Wilkins and Jones 2018*) gathered through surveys, interviews, and focus groups, SBR might provide a promising methodology for advancing knowledge and research on social work practice.

Study Purpose

The purpose of this scoping review was to map the current state of the literature in which simulation was used as an investigative methodology for studying practice competencies of clinical social work. While methodological reports have been published in other healthcare fields (e.g., Beullens et al. 1997; Cheng et al. 2014), there is no similar report focused on social work. Synthesizing the existing relevant literature can advance this burgeoning area of research and assist clinical social workers in understanding how SBR aligns with, and can contribute to, their work. A scoping review is particularly suitable for the current study because we seek to explicate the key concepts of SBR as a new area of research and to document its emerging evidence (Munn et al. 2018). The research questions guiding this scoping review were: (1) What are the characteristics of studies of clinical social work competencies that used simulation as a research methodology? (2) How are simulation-based data used in studies of practice competencies? (3) What are the benefits and limitations of using simulation as a research methodology in studying clinical social work practice competencies?

Methods

Scoping review is a knowledge synthesis methodology through which researchers “comprehensively summarize and synthesize evidence with the aim of informing practice, programs and policy and providing direction to future research priorities” (Colquhoun et al. 2014, p. 1291). We conducted this scoping review to synthesize existing knowledge on the use of simulation-based data in research on practice competencies, in order to guide future SBR and practice. We closely followed Arksey and O’Malley’s (2005) framework, which consists of the following five stages, as enhanced by Levac et al. (2010): (1) identifying research questions, (2) identifying relevant studies, (3) selecting studies based on inclusion and exclusion criteria, (4) charting data, and (5) collating, summarizing, and reporting the results. To enhance rigor, we also followed the PRISMA Extension for Scoping Reviews (PRISMA-ScR) guidelines (Tricco et al. 2018).

Identifying Relevant Studies

A librarian member of the team (HM) developed the search strategy for PsycINFO (1806–) in consultation with the research team. The search strategy was validated using a test set of 17 articles identified by the first author. We searched the following databases in April 2020: ASSIA (1987–), CINAHL (1981–), ERIC (1966–), PsycINFO (1806–), Social Services Abstracts (1979–), and Social Work Abstracts (1965–). Table 1 shows the concepts and search terms used.

Table 1 Search strategy for PsycINFO (ProQuest)

Concept: Social work

Search terms:
(((ab(MSW) OR ti(MSW))
OR (ab(BSW) OR ti(BSW))
OR (ab(social NEAR/3 (work* OR casework* OR case-work*)) OR ti(social NEAR/3 (work* OR casework* OR case-work*)))
OR (ab(case NEAR/3 work*) OR ti(case NEAR/3 work*))
OR (ab(socialwork*) OR ti(socialwork*))
OR (ab(social-work*) OR ti(social-work*))
OR (ab(casework*) OR ti(casework*))
OR (ab(case-work*) OR ti(case-work*))
OR ((ab(clinician* OR supervis*) OR ti(clinician* OR supervis*)) AND noft(social-work OR socialwork))
OR (ab(Child NEAR/1 (welfare OR protection)) OR ti(Child NEAR/1 (welfare OR protection))))
OR MAINSUBJECT.EXACT("Social Workers")
OR MAINSUBJECT.EXACT("Psychiatric Social Workers")
OR MAINSUBJECT.EXACT("Social Casework")
OR MAINSUBJECT.EXACT("Social Group Work")
OR MAINSUBJECT.EXACT("Social Work Education"))

Concept: Simulation

Search terms:
(((ab((simulat* OR sample OR standard* OR virtual* OR computer*) NEAR/3 (patient* OR client* OR practice)) OR ti((simulat* OR sample OR standard* OR virtual* OR computer*) NEAR/3 (patient* OR client* OR practice)))
OR (ab(objective NEAR/3 structur*) OR ti(objective NEAR/3 structur*))
OR (ab(("role play" OR "role played" OR "role player" OR "role players" OR "role playing" OR "role plays")) OR ti(("role play" OR "role played" OR "role player" OR "role players" OR "role playing" OR "role plays")))
OR (ab(roleplay*) OR ti(roleplay*))
OR (ab(vignette*) OR ti(vignette*))
OR (ab(patient NEAR/3 instructor*) OR ti(patient NEAR/3 instructor*))
OR (ab(SPI) OR ti(SPI))
OR ((ab(SP) OR ti(SP)) AND noft(patient*))
OR (ab(HPS) OR ti(HPS))
OR (ab(OSCE) OR ti(OSCE))
OR (ab(SBL) OR ti(SBL)))
OR MAINSUBJECT.EXACT("Simulation")
OR MAINSUBJECT.EXACT("Role Playing")
OR MAINSUBJECT.EXACT("Vignette Measure")
OR MAINSUBJECT.EXACT("Computer Simulation")
OR MAINSUBJECT.EXACT("Virtual Reality"))

Arksey and O’Malley (2005) suggested hand searches to identify studies that electronic searches might have missed. We searched the tables of contents of all articles published from 2010 to March 25, 2020 in the following relevant social work journals: Clinical Social Work Journal, Research on Social Work Practice, Journal of the Society for Social Work and Research, Journal of Social Work Education, Social Work Education: The International Journal, and Journal of Teaching in Social Work. This 10-year timeframe was set for the hand search because simulation is a relatively new method and was not common in social work prior to 2010. Finally, we checked the reference lists of the relevant articles and consulted a group of experts to ensure that we had not missed any other studies.

Selecting Studies

Using our content expertise, the lead author (KA) and a second team member (KS) worked together to develop the following inclusion criteria to identify empirical studies written in English and published in peer-reviewed journals: (1) used simulation-based data (e.g., live SPs, video recordings of SPs, virtual reality, data available from OSCEs), (2) examined practice competencies (i.e., knowledge, values, and skills) related to clinical social work, and (3) included study samples comprising social workers, social work students, or social work supervisors. We included any type of study design and applied no geographical or time restrictions. We excluded studies that used real clients or peer-based role plays, as simulation is designed to provide a realistic practice environment without engaging real clients or those with pre-existing relationships. We also excluded studies that used any static format of case study, such as written, audio, or image-based vignettes, as simulation is designed to create a dynamic, immersive practice situation for study participants. Additionally, we excluded studies in which: (1) the participation of social workers, social work students, or social work supervisors was unclear, (2) the primary purpose was the evaluation of simulation-based education or training, including studies in which competencies were measured (e.g., skills improvement among students as a result of simulation-based learning), or (3) the focus was on macro social work practice competencies (e.g., community organizing, policy-making). These inclusion and exclusion criteria were developed iteratively as the team became familiar with the existing literature. We contacted authors when the information provided was unclear and excluded studies when we did not receive clear answers.

All titles and abstracts from the initial database search (n = 4224) were reviewed independently for eligibility by two members (RG, SB). Inter-rater agreement was 75% for this screening step, based on a calibration exercise. Conflicting decisions were resolved by the lead author (KA) in consultation with another member (KS). Three members (KA, RG, SB) then screened full articles (n = 275) for eligibility; inter-rater agreement was 93% for this step, based on a calibration exercise. We used Covidence, a web-based platform for scoping and systematic reviews, for the screening process. Finally, we conducted a hand search of the reference lists of the included articles.
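The agreement figures above are simple percent agreement: the share of screening decisions on which the two independent reviewers made the same call. A minimal sketch of the calculation (the decision lists below are hypothetical illustrations, not the study's actual screening data):

```python
# Percent agreement between two independent screeners, as in the
# calibration exercises described above. The decision lists are
# hypothetical, chosen only to illustrate the arithmetic.

def percent_agreement(rater_a: list[bool], rater_b: list[bool]) -> float:
    """Share of items on which two raters made the same include/exclude call."""
    assert len(rater_a) == len(rater_b), "raters must screen the same items"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Eight hypothetical title/abstract decisions (True = include)
rater_a = [True, False, False, True, False, True, False, False]
rater_b = [True, False, True, True, False, False, False, False]
print(f"{percent_agreement(rater_a, rater_b):.0%}")  # 75% (6 of 8 calls agree)
```

Note that percent agreement does not correct for chance; review teams often report a chance-corrected statistic such as Cohen's kappa alongside it.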

Charting the Data

Two authors (KA, KS), in consultation, developed a data extraction form, which was piloted and used to chart the following categories of data: year of publication, study location, target competencies, study sample, scope of clinical practice, type of simulation-based data used (e.g., live SP, virtual reality), study design, rationale for using simulation-based data, simulation case development, data analysis, and methodological strengths and limitations. Three team members (KA, RG, SB) independently reviewed each of the identified articles (N = 24) and charted the data. Finally, we conducted qualitative content analysis (Sandelowski 2000) of the extracted data. Any discrepancies or disagreements were discussed and resolved in weekly team meetings. The lead author (KA) made final decisions in consultation with another member of the team (KS), drawing on our content expertise.

The initial database search yielded 4224 articles; after removing 128 duplicates, 4096 articles remained. Title and abstract screening excluded all but 275 articles, which were retained for full-text review. After three team members (KA, RG, SB) independently screened the 275 full-text articles, we excluded those that did not meet the inclusion criteria (e.g., the methods used were not considered simulation; the study purpose was not clinical practice competencies), retaining 18 articles. We identified an additional six articles through the hand search of relevant journals and reference lists. Taken together, a total of 24 articles met the criteria for this scoping review (see Fig. 1: PRISMA flowchart).

Fig. 1 PRISMA flowchart of the search and screening process

Study Characteristics

To address the first research question, we examined the characteristics of the social work studies in which simulation was used to study clinical practice competencies (e.g., knowledge, values, skills). Table 2 summarizes the characteristics of the selected articles. Most of the articles (n = 17, 71%) were published after 2010. The majority reported studies conducted in North America, with 12 articles from Canada (50%) and six from the U.S. (25%); the remaining articles were from the United Kingdom (n = 5, 21%) and Denmark (n = 1, 4%). Study designs were almost evenly divided between qualitative and quantitative, with 11 qualitative (46%) and 10 quantitative (42%) articles, while mixed-methods studies were reported in three articles (13%). Most articles involved social work students and/or social work practitioners: social work students (n = 7, 29%), social workers (n = 7, 29%), or both students and social workers (n = 5, 21%). A small number of articles involved interprofessional samples (n = 3, 13%) or social work supervisors (n = 2, 8%). The sample sizes of those who participated in simulation varied, ranging from 12 (Wilkins et al. 2018*) to 151 (MacDonell et al. 2019*). Approximately half of the articles (n = 13, 54%) reported a sample size of 50 or fewer, while a sample size larger than 100 was reported in three articles (13%).

Table 2 Characteristics of selected articles

Characteristic                        Articles, n (%)

Year of article
  Before 2000                         3 (13)
  2001–2010                           4 (17)
  2011–2015                           8 (33)
  2016–present                        9 (38)

Study location
  Canada                              12 (50)
  USA                                 6 (25)
  UK                                  5 (21)
  Other                               1 (4)

Targeted competencies studied
  Decision making                     8 (33)
  Culture and diversity               5 (21)
  Role of cognition and emotion       5 (21)
  Communication skills                3 (13)
  Supervision                         2 (8)
  Other                               1 (4)

Study sample
  Social work students                7 (29)
  Social workers                      7 (29)
  Students & social workers           5 (21)
  Interprofessional                   3 (13)
  Supervisors                         2 (8)

Study design
  Qualitative                         11 (46)
  Quantitative                        10 (42)
  Mixed methods                       3 (13)

Type of simulation data
  Live SP                             13 (54)
  Secondary data from the OSCE        6 (25)
  Video vignette                      4 (17)
  Virtual reality                     1 (4)

Rationale for using simulation data
  Yes                                 14 (58)
  No                                  10 (42)

Case development
  Yes                                 18 (75)
  No                                  6 (25)

N = 24. Due to rounding, percentages may not total 100
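The rounding note above can be made concrete: with N = 24, several cells round to whole percentages that sum past 100. A minimal sketch using the study-design counts from Table 2, assuming conventional half-up rounding of each cell:

```python
# Why independently rounded percentages may not total 100.
# Counts are the study-design row of Table 2 (11 qualitative,
# 10 quantitative, 3 mixed methods; N = 24); half-up rounding
# per cell is an assumption about the article's convention.
from math import floor

def pct(count: int, total: int = 24) -> int:
    """Percentage rounded half up to the nearest whole percent."""
    return floor(count / total * 100 + 0.5)

design = {"Qualitative": 11, "Quantitative": 10, "Mixed methods": 3}
shares = {name: pct(n) for name, n in design.items()}
print(shares)                # {'Qualitative': 46, 'Quantitative': 42, 'Mixed methods': 13}
print(sum(shares.values()))  # 101 -- the rounding note in action
```

The same effect explains other apparent mismatches in the text, such as 3/24 appearing as both 12.5% and 13%.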

In terms of the target competencies studied, the most common area was clinicians’ professional decision-making (n = 8, 33%), for instance in the reporting of child abuse (e.g., LeBlanc et al. 2012*; Tufford et al. 2015*) and the assessment of suicide risk (e.g., Bogo et al. 2017*; Regehr et al. 2015*, 2016*). Five articles (21%) examined the role of clinicians’ cognitive skills and/or emotional reactions and states (Bogo et al. 2013*; Katz et al. 2014*; Reeves et al. 2015*; Sewell et al. 2020*; Tufford et al. 2017*). An additional five articles (21%) examined culture and diversity in clinical practice, such as cultural empathy (Garcia et al. 2012*), stereotypes (Kurtz et al. 1989*), ethnicity (Maki 1999*), and working with LGBTQ populations (Logie et al. 2015*; Tyler and Franklin 2020*). Three articles (13%) examined communication skills among practitioners (Duehn and Mayadas 1979*; Forrester et al. 2008*; MacDonell et al. 2019*). In two articles (8%), supervision skills were studied as a clinical practice competency (Wilkins and Jones 2018*; Wilkins et al. 2018*).

Almost half of the articles (n = 11, 46%) did not specify a scope of clinical practice and addressed competencies relevant to clinical practice in general (e.g., Bogo et al. 2013*; Garcia et al. 2012*; Katz et al. 2014*; Tyler and Franklin 2020*). Child protection was the most common specialized scope of clinical practice (n = 7, 29%), with researchers examining the reporting of potential neglect and abuse (e.g., Forrester et al. 2008*; Reeves et al. 2015*; Regehr et al. 2010a*, 2010b*). Suicide risk assessment was the focus of three articles (Bogo et al. 2017*; Regehr et al. 2015*, 2016*). Other specialized scopes of practice reported in the articles were healthcare (MacDonell et al. 2019*), criminal justice (Stone 2019*), and employment services (Eskelinen and Caswell 2006*). Table 3 provides a synthesis of the selected articles.

Table 3 Synthesis of selected articles (N = 24)

Author | Year | Location | Competencies studied | Study sample | Study design | Type of simulation data | Simulation data rationale | Case development
Bogo et al. | 2013 | Canada | Role of cognition and emotion | N = 18 / Students and social workers | Qualitative | Data from the OSCE | No | Yes
Bogo et al. | 2017 | Canada | Decision making | N = 71 / Students and social workers | Qualitative | Live SP | No | Yes
Duehn & Mayadas | 1979 | USA | Communication skills | N = 75 / Social work students | Quantitative | Live SP | No | No
Eskelinen & Caswell | 2006 | Other | Other | N = 28 / Social workers | Qualitative | Video vignette | Yes | Yes
Forrester et al. | 2008 | UK | Communication skills | N = 24 / Social workers | Quantitative | Live SP | Yes | Yes
Garcia et al. | 2012 | USA | Culture and diversity | N = 20 / Social work students | Qualitative | Live SP | Yes | Yes
Katz et al. | 2014 | Canada | Role of cognition and emotion | N = 109 / Social work students | Qualitative | Data from the OSCE | Yes | No
Kurtz et al. | 1989 | USA | Culture and diversity | N = 79 / Social work students | Quantitative | Video vignette | No | Yes
LeBlanc et al. | 2012 | Canada | Decision making | N = 96 / Social workers | Quantitative | Live SP | Yes | Yes
Logie et al. | 2015 | Canada | Culture and diversity | N = 18 / Students and social workers | Qualitative | Data from the OSCE | No | Yes
MacDonell et al. | 2019 | USA | Communication skills | N = 151 / Interprofessional | Quantitative | Live SP | Yes | No
Maki | 1999 | USA | Culture and diversity | N = 120 / Social workers | Quantitative | Video vignette | No | No
Reeves et al. | 2015 | UK | Role of cognition and emotion | N = 24 / Interprofessional | Quantitative | Virtual reality | Yes | Yes
Regehr et al. | 2010(a) | Canada | Decision making | N = 96 / Social workers | Mixed methods | Live SP | No | Yes
Regehr et al. | 2010(b) | Canada | Decision making | N = 96 / Social workers | Quantitative | Live SP | No | Yes
Regehr et al. | 2015 | Canada | Decision making | N = 71 / Students and social workers | Quantitative | Live SP | Yes | Yes
Regehr et al. | 2016 | Canada | Decision making | N = 71 / Social workers | Mixed methods | Live SP | Yes | Yes
Sewell et al. | 2020 | Canada | Role of cognition and emotion | N = 57 / Social work students | Qualitative | Live SP | Yes | Yes
Stone | 2019 | UK | Decision making | N = 20 / Interprofessional | Qualitative | Video vignette | Yes | Yes
Tufford et al. | 2015 | Canada | Decision making | N = 23 / Students and social workers | Qualitative | Data from the OSCE | No | No
Tufford et al. | 2017 | Canada | Role of cognition and emotion | N = 20 / Social work students | Qualitative | Data from the OSCE | Yes | Yes
Tyler & Franklin | 2020 | USA | Culture and diversity | N = 37 / Social work students | Qualitative | Data from the OSCE | No | No
Wilkins & Jones | 2018 | UK | Supervision | N = 30 / Supervisors | Mixed methods | Live SP | Yes | Yes
Wilkins et al. | 2018 | UK | Supervision | N = 12 / Supervisors | Quantitative | Live SP | Yes | Yes

The Use of Simulation as a Research Methodology

To address the second research question of how simulation-based data are used in the studies of clinical competencies, we examined whether researchers provided a rationale for using simulation as a research methodology. We also looked into what types of simulation researchers used and whether and how researchers described the case development. Finally, we identified how simulation-based data were used and analyzed in studying practice competencies.

More than half of the articles (n = 14, 58%) reported a rationale for using simulation as a research methodology, while the remainder (n = 10, 42%) did not describe why simulation-based data were used to answer their research questions. The specific rationales located in the selected articles are discussed as methodological strengths in the subsequent section. About half of the articles (n = 13, 54%) reported using live SPs (i.e., trained actors) to simulate a social work practice scenario. In six articles (25%), researchers used data collected during OSCEs for secondary analysis; all of these studies (e.g., Bogo et al. 2013*; Logie et al. 2015*) used participants’ qualitative reflections on their engagement with SPs as a data source. In four studies (17%), researchers developed and used video-based case vignettes as a source of data. Researchers reported filming simulated interactions between SPs and practitioners in three of those articles (Eskelinen and Caswell 2006*; Kurtz et al. 1989*; Stone 2019*), while an SP monologue was filmed and used as a data source in one article (Maki 1999*). Virtual reality was used in one article (Reeves et al. 2015*) to create a virtual patient and an immersive practice environment for studying participants’ emotional responses in child protection work.

Most of the articles (n = 18, 75%) described the process of developing simulated case scenarios, while no relevant information was located in the remaining six articles (25%). Among the 18 articles that described how the case vignettes were developed, two common practices were identified: (1) consultation with practitioners or experts during the case development phase (e.g., Bogo et al. 2017*; Logie et al. 2015*; Regehr et al. 2010a*), and (2) pilot-testing with other practitioners for face validity (e.g., Regehr et al. 2015*, 2016*; Stone 2019*). A few researchers (e.g., Bogo et al. 2017*; LeBlanc et al. 2012*; Tufford et al. 2017*) reported SP training, though the level of detail varied: some provided no information beyond noting that SPs were trained, while others described assisting SPs to enact the case scenarios consistently in both the verbal (i.e., key phrases to use) and non-verbal (e.g., emotional intensity) elements of communication.

In the articles reporting qualitative research (n = 11), most researchers examined participants’ written or verbal reflections on their simulated practice (e.g., OSCEs, sessions with live SPs) in studying practice competencies. For the most part, qualitative methodologies were employed in studies that aimed to conceptualize practice competencies not yet well studied, such as cognitive and affective skills (e.g., Bogo et al. 2017*; Sewell et al. 2020*; Tufford et al. 2017*) and engaging culture and diversity in practice (e.g., Garcia et al. 2012*; Logie et al. 2015*; Tyler and Franklin 2020*). Thematic analysis (e.g., Bogo et al. 2013*, 2017*; Tyler and Franklin 2020*) and qualitative content analysis (Katz et al. 2014*; Tufford et al. 2015*) were the most common analytic methods used to study practice competencies inductively through participants’ perspectives. No clear patterns were found in how researchers used quantitative (n = 10) or mixed (n = 3) methodologies. Kurtz et al. (1989*) and Regehr’s team (e.g., Regehr et al. 2010a*, 2010b*; LeBlanc et al. 2012*), for instance, studied professional decision-making by having participants complete measures in conjunction with participation in a simulation. Some researchers, such as Duehn and Mayadas (1979*), Forrester et al. (2008*), and Wilkins et al. (2018*), studied practice competencies by recording participants’ simulated sessions and analyzing practice skills using a coding system. Others, such as Kurtz et al. (1989*) and Maki (1999*), examined whether individual participants engaged with multiple simulated scenarios similarly or differently based on client characteristics, while MacDonell et al. (2019*) compared social workers and other professionals in their engagements with SPs.

Strengths and Limitations in Social Work SBR

To address the third research question, we reviewed the selected articles to examine the benefits and limitations associated with using simulation as a research methodology. Although simulation is a novel and innovative methodology in social work research, surprisingly little relevant discussion was found. In terms of strengths, three themes emerged: simulation offers (1) opportunities for direct observation of practice, (2) standardization of practice situations, and (3) a solution to concerns related to research ethics. First, Wilkins and his colleagues (Wilkins and Jones 2018*; Wilkins et al. 2018*) suggested that the use of simulation provides an opportunity for researchers to directly observe practitioner behaviors. While social work researchers have traditionally relied on practitioners’ (often retrospective) self-reports of their practice (Wilkins and Jones 2018*), simulation enables researchers to answer research questions related to clinical practice through observation and analysis of real-time data on social workers. Second, researchers (e.g., Bogo et al. 2017*; Forrester et al. 2008*; LeBlanc et al. 2012*) discussed the process of developing and preparing standardized simulation scenarios. By providing consistent verbal and non-verbal information in a standardized practice environment, researchers are able to observe participants and their clinical competencies under comparable conditions. Third, several authors (e.g., Eskelinen and Caswell 2006*; Forrester et al. 2008*; Stone 2019*) suggested that simulation provides a solution to an ethical challenge in researching social work practice. Research involving real clients from vulnerable communities, or on sensitive topics such as suicide risk (e.g., Regehr et al. 2015*, 2016*), poses serious concerns from a research ethics perspective. Simulation enables researchers to study clinical practice through direct observation of clinicians and their competencies, while mitigating these ethical concerns.

In reviewing the selected articles, several limitations were noted in relation to using simulation as a research methodology. First, a few researchers (e.g., Eskelinen and Caswell 2006*; LeBlanc et al. 2012*; Regehr et al. 2015*) suggested that the very nature of simulation (the fact that it is simulated practice, not a real clinical encounter with a real client) poses methodological limitations. Simulation is a portrayal of a clinical situation and might not accurately reflect real-life practice. Similarly, a few researchers (e.g., Bogo et al. 2017*; Regehr et al. 2010a*; Wilkins et al. 2018*) identified the partial and constrained nature of simulation as a methodological limitation. These researchers cautioned that a short, single-session format of simulation (e.g., 15 min in Regehr et al. 2010a*; 30 min in Wilkins and Jones 2018*) might not fully capture the dynamic and process-oriented nature of social work practice, limiting the generalization of study results. Second, Forrester et al. (2008*) and Maki (1999*) cautioned that, because it does not involve real clients with real presenting problems, simulation is not suitable when research questions involve client outcomes (e.g., impacts of therapeutic alliance on clients; improvement in clients’ presenting concerns). Finally, two articles (Regehr et al. 2015*; Stone 2019*) raised issues related to socio-cultural diversity in simulated case scenarios (e.g., only white simulated clients). Especially in the study of social work practice, the use of simulation might limit its applicability and relevance to real-life clinical practice if the simulated cases do not reflect the sociocultural diversity present in contemporary social work practice.

Discussion & Implications for Clinical Social Work

We identified 24 articles in this scoping review focused on the use of simulation as an investigative methodology (i.e., SBR) for researching clinical social work competencies. We synthesized the characteristics of relevant studies, the ways in which simulation-based data are used in research on clinical competencies, and the relevant methodological benefits and limitations of SBR. Results on the characteristics of the selected articles suggest that the use of simulation is a relatively new methodology in clinical social work, with a majority of articles published in the last ten years. Given the small number of articles published prior to 2010 and the large number of articles using data available from OSCE-based student assessment, the emergence of simulation-based teaching in social work over the last ten years (Kourgiantakis et al. 2019) has likely played an important role in vitalizing SBR. While social work competence consists of a number of elements (the worker’s knowledge, values, skills, and cognitive and affective processes; CSWE 2015), only a few areas of competence were studied in the selected articles. Previously, a written case vignette method was often used in studies of various social work competencies, most notably professional decision-making (e.g., Ashton 1999; Stokes and Schmidt 2012), social work values (e.g., Walden et al. 1990; Wilks 2004), and attitudes about marginalized populations (e.g., Camilleri and Ryan 2006; Schoenberg and Ravdal 2000). In comparison to the one-dimensional and static nature of written vignettes, simulation creates dynamic and immersive practice situations for researchers.

As SBR becomes commonplace in social work, researchers might consider expanding the target competencies studied in the future, such as assessment, diagnosis, and treatment planning. Another important point to note is the use of SBR for conceptualizing the role of cognition and emotion in social work practice (e.g., Bogo et al. 2013*; Sewell et al. 2020*). The notion of competency was re-conceptualized a few years ago (CSWE 2015) to recognize the important role of clinicians’ cognitive and affective processes in social work practice. Direct observation of, and reflection on, practice allowed researchers to explicate and translate these rather abstract concepts into concrete competency-based skillsets. As SBR has played an essential role in advancing the conceptualization of cognitive and affective processes, a similar research process can perhaps be used to identify and explicate other important competencies relevant to clinical practice, such as navigating countertransference, working through a therapeutic impasse, and engaging in anti-oppressive clinical practice. Strengthening partnerships between researchers and practicing clinicians can ensure that complex practice realities frame future research efforts, generating relevant knowledge for enhancing practice.

In terms of the current use of simulation as a research methodology, most SBR studies in social work have hired and trained live SPs to create an immersive, dynamic, and realistic practice environment for research. One notable gap was the scarce use of virtual simulation as a platform for research. Given recent technological advancements and the great need for remote, often online-based, practice felt especially during the COVID-19 pandemic, virtual simulation holds much promise for enhancing the use of simulation in social work research. Much has been written about the utility of virtual simulation in the teaching and learning of social work practice (e.g., Asakura et al. 2018; Asakura et al. in press; Washburn et al. 2016). Given that data collected during OSCEs have been used and analyzed in studies of practice competencies, researchers might consider collecting data through these pedagogical innovations to research clinical competencies in a virtual environment.

Case development was another important area that emerged in our review. There seem to be three components that can guide case development in future SBR. First, the development of a realistic and trustworthy client case, an unarguably essential element of case development, can be further facilitated when the case is closely grounded in people’s real experiences. Here we can draw from a robust body of literature on simulation-based social work education. Scenarios should contain detailed information about the client’s history, current cognitive and emotional patterns, and key verbatim responses to use (Bogo et al. 2014; Kourgiantakis et al. 2019; Sewell et al. 2020*). Furthermore, scenarios can be developed in consultation with social work practitioners or experts in the area (e.g., Bogo et al. 2013*; Logie et al. 2015*; Regehr et al. 2010a*). Consultation with service users, when relevant and feasible, might also help researchers strengthen the trustworthiness of the scenarios. The second key component is SP training, for both live-SP and video-recorded simulation. This training might involve rehearsing the scenario with the SP and assisting the SP in demonstrating the verbal content and non-verbal communications (e.g., facial expression, tone of voice) relevant to the vignette (e.g., LeBlanc et al. 2012*; Sewell et al. 2020*). Finally, pilot-testing the case vignette with social workers and experts in a particular scope of practice might further strengthen the face validity and authenticity of the simulation used in the study.

Results of our review offer two key points about when and why to use this methodology. First, SBR offers an ethical fit for research on social work practice. By studying actual clinical sessions, PPR is a strong methodology for observing real-time data on clinical practice (Knobloch-Fedders et al. 2015). PPR, however, typically involves the observation of therapeutic relationships over time (i.e., longitudinal research) and requires longer-term, intensive involvement from clients (Tsang et al. 2003). Social workers also often work in ethically sensitive, mandated social and health service contexts (Bogo 2018), such as child protection, residential treatment, prisons, and inpatient hospitals. In these non-voluntary settings, social workers often work with vulnerable populations without alternative access to treatment. Our scoping review suggests that accessing actual client sessions might not be the most ethically appropriate research methodology for social work. By simulating a realistic and trustworthy practice situation, SBR offers a methodological solution for researching social work practice that otherwise could not be observed for legal and ethical reasons, without involving actual clients or putting them at risk of potential harm.

The second key point derived from this review is the methodological fit between simulation and research on practice competencies. Simulation allows for the alignment and standardization of the practice environment, client characters, and scenarios for a specific research purpose. This finding corroborates arguments made by SBR researchers in medicine and other healthcare disciplines (e.g., Cheng et al. 2014; Halamek 2013). Actual client-worker interactions, as seen in PPR, are unique and often unpredictable, depending on the client’s presenting concerns and the client-worker relational dynamics. While actual clinician-client therapeutic processes need to be observed (i.e., PPR) when a study involves client outcomes, SBR might be better suited when researchers’ primary goal is to theorize or examine practice competencies. Designing and standardizing a specific case scenario allows researchers to control these variables, at least to some extent, and to more easily identify similarities and differences in how research participants demonstrate various elements of practice competencies.

The current review found rather inconsistent inclusion of a methodological rationale for using simulation as an investigative tool in the social work SBR articles. Although half of the articles included a brief mention of a rationale, overall, a robust argument was missing as to why simulation was the most suitable methodology for the respective study purpose and inquiry. We suggest that researchers make concerted efforts to provide an explicit rationale for using simulation-based data to answer their respective research questions. Additionally, there was no clear consensus about how social work researchers might approach study design and data analysis of simulation-based data. There was insufficient information in the current study to suggest when and how to use quantitative or mixed methods in social work SBR. We found that qualitative approaches might be suitable for the collection and analysis of post-simulation practice reflections, especially for the purpose of identifying or assessing under-studied clinical competencies (e.g., attending to culture and diversity in Logie et al. 2015*; cognitive and affective processes in Sewell et al. 2020*). Nonetheless, no qualitative study was identified in which simulation-based data were analyzed inductively from the participants’ practice itself. This points to the importance of articulating promising data collection methods for directly observing clinical practice (e.g., using video equipment) and data analytic methods for qualitative coding as priorities for the further advancement of SBR. As those involved in medical and other healthcare SBR have pointed out, methodological best practices for this novel research methodology are lacking (e.g., Cheng et al. 2014; Guise et al. 2017; Siminoff et al. 2011); this is an area that merits continuing attention from social work SBR researchers as well.

A disconnect between researchers and clinicians, and between research and practice, has long been noted in social work (Gehlert et al. 2017). Consultation with practicing clinicians in developing and pilot-testing vignettes, as noted in many of the papers in this review (e.g., Bogo et al. 2017*; LeBlanc et al. 2012*), can enhance the accurate representation of client situations without reliance on stereotypes. This can also enhance the validity of simulations and the overall value of research findings. Given that the very purpose of simulation is to create a practice situation closely grounded in everyday clinical practice, active involvement from clinical social workers can only strengthen the quality of SBR and knowledge development in clinical social work. SBR provides a uniquely important opportunity for practice-informed research (CSWE 2015), in which researchers and clinicians can work collaboratively to design and conduct meaningful studies that serve the needs of practitioners and are grounded in the perspectives of clinicians and their clients.

Limitations

One limitation of any literature review, including this scoping review, is the potential omission of relevant studies (Peters et al. 2020). Due to time and resource constraints (e.g., access to databases), we focused our review on relevant articles published in peer-reviewed journals available through the databases in our university library system. Our review also excluded dissertations, books, book chapters, grey literature, and works published in non-English languages. As another limitation of this scoping review, we did not assess the quality of each empirical study included. Given the novelty of SBR in social work, however, this scoping review aimed to provide a snapshot of the current relevant literature, not to critically appraise research quality. In order to maintain rigor, we used multiple independent reviewers for every phase of the review (i.e., screening, study selection, and charting) and followed the PRISMA-ScR checklist for reporting (Tricco et al. 2018). These methods were designed to increase overall consistency and transparency and to safeguard against bias throughout the study.

In this scoping review, we synthesized the current social work literature (n = 24) in which simulation was used as an investigative methodology for studying clinical social work competencies. SBR offers a promising methodological solution for building knowledge about what clinical competencies might look like and how clinicians can bring various knowledge, values, and skills into a unique practice situation. Its proximity to practice means that findings from studies using this methodology can provide relevant insight and serve to support clinical social workers. Recognizing both the strengths and limitations of this novel research methodology, continued engagement with and investment in SBR can only advance the literature on clinical social work practice.

Kenta Asakura, MSW, LICSW, RSW, Ph.D., is an Associate Professor at Carleton University School of Social Work. He is the Director of the SIM Social Work Research Lab, in which he engages in a program of research on clinical social work education and practice.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References marked with an asterisk indicate articles included in the scoping review

  • Arksey H, O’Malley L. Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology. 2005; 8 (1):19–32. doi: 10.1080/1364557032000119616. [ CrossRef ] [ Google Scholar ]
  • Asakura, K., Bogo, M., Good, B., & Power, R. (2018). Teaching note—Social work serial: Using simulated client sessions to teach social work practice.  Journal of Social Work Education , 54 (2), 397–404.
  • Asakura, K., Occhiuto, K., Todd, S., Leithead, C., & Clapperton, R. A call to action on artificial intelligence and social work education: Lessons learned from a simulation project using natural language processing.  Journal of Teaching in Social Work (in press).
  • Ashton V. Worker judgements of seriousness about and reporting of suspected child maltreatment. Child Abuse and Neglect. 1999; 23 (6):539–548. doi: 10.1016/S0145-2134(99)00032-0. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Beullens J, Rethans JJ, Goedhuys J, Buntinx F. The use of standardized patients in research in general practice. Family Practice. 1997; 14 (1):58–62. doi: 10.1093/fampra/14.1.58. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Bogo M. Social work practice: Integrating concepts, processes & skills. 2. New York, NY: Columbia University Press; 2018. [ Google Scholar ]
  • Bogo M, Regehr C, Logie C, Katz E, Mylopoulos M, Regehr G. Adapting objective structured clinical examinations to assess social work students’ performance and reflections. Journal of Social Work Education. 2011; 47 (1):5–18. doi: 10.5175/JSWE.2011.200900036. [ CrossRef ] [ Google Scholar ]
  • *Bogo, M., Katz, E., Regehr, C., Logie, C., Mylopoulos, M., & Tufford, L. (2013). Toward understanding meta-competence: An analysis of students’ reflection on their simulated interviews. Social Work Education , 32 (2), 259–273. 10.1080/02615479.2012.738662
  • Bogo M, Rawlings M, Katz E, Logie C. Using simulation in assessment and teaching: OSCE adapted for social work. Alexandria, VA: CSWE Press; 2014. [ Google Scholar ]
  • *Bogo, M., Regehr, C., Baird, S., Paterson, J., & LeBlanc, V. R. (2017). Cognitive and affective elements of practice confidence in social work students and practitioners. British Journal of Social Work , 47 (3), 701–718. 10.1093/bjsw/bcw026
  • Camilleri P, Ryan M. Social work students’ attitudes toward homosexuality and their knowledge and attitudes toward homosexual parenting as an alternative family unit: An Australian study. Social Work Education. 2006; 25 (3):288–304. doi: 10.1080/02615470600565244. [ CrossRef ] [ Google Scholar ]
  • Chandra-Mouli V, Lenz C, Adebayo E, Lungren IL, Garbero LG, Chatteriee S. A systematic review of the use of adolescent mystery clients in assessing the adolescent friendliness of health services in high, middle, and low-income countries. Global Health Action. 2018; 11 (1):1536412. doi: 10.1080/16549716.2018.1536412. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Cheng A, Auerbach M, Hunt EA, Chang TP, Pusic M, Nadkarni V, Kessler D. Designing and conducting simulation-based research. Pediatrics. 2014; 133 (6):1091–1101. doi: 10.1542/peds.2013-3267. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Cleland JA, Abe K, Rethans JJ. The use of simulated patients in medical education: AMEE guide No. 42. Medical Teacher. 2009; 31 (6):477–486. doi: 10.1080/01421590903002821. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Council on Social Work Education. (2015). Educational policy and accreditation standards . Washington, D.C.: Author. Retrieved from: https://www.cswe.org/getattachment/Accreditation/Accreditation-Process/2015-EPAS/2015EPAS_Web_FINAL.pdf.aspx
  • Christian CS, Gerdtham U, Hompashe D, Smith A, Burger R. Measuring quality gaps in TB screening in South Africa using standardised patient analysis. International Journal of Environmental Research and Public Health. 2018; 15 :729. doi: 10.3390/ijerph15040729. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Colquhoun HL, Levac D, O’Brien KK, Straus S, Tricco AC, Perrier L, Moher D. Scoping reviews: Time for clarity in definitions, methods, and reporting. Journal of Clinical Epidemiology. 2014; 67 :1291–1294. doi: 10.1016/j.jclinepi.2014.03.013. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • *Duehn, W. D., & Mayadas, N. S. (1979). Starting where the client is: An empirical investigation. Social Casework, 60 (2), 67–74.
  • *Eskelinen, L., & Caswell, D. (2006). Comparison of social work practice in teams using a video vignette technique in a multi-method design. Qualitative Social Work , 5 (4), 489–503. 10.1177/1473325006070291
  • Fitzpatrick A, Tumlinson K. Strategies for optimal implementation of simulated clients for measuring quality of care in low- and middle-income countries. Global Health: Science and Practice. 2017; 5 (1):108–114. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *Forrester, D., Kershaw, S., Moss, H., & Hughes, L. (2008). Communication skills in child protection: How do social workers talk to parents? Child and Family Social Work, 13 , 41–51.
  • *Garcia, B., Lu, Y. E., & Maurer, K. (2012). Cultural empathy: Implications of findings from Social Work Objective-Structured Clinical Observation for field education. Field Educator , 2 (2), 1–8.
  • Geurtzen R, Hogeveen M, Rajani AK, Chitkara R, Antonius T, van Heijst A, Draaisma J, Halamek LP. Using simulation to study difficult clinical issues: Prenatal counseling at the threshold of viability across American and Dutch cultures. Simulation in Healthcare. 2014; 9 (3):167–173. doi: 10.1097/SIH.0000000000000011. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Gehlert S, Hall KL, Palinkas LA. Preparing our next-generation scientific workforce to address the Grand Challenges for Social Work. Journal of the Society for Social Work and Research. 2017; 8 (1):119–136. doi: 10.1086/690659. [ CrossRef ] [ Google Scholar ]
  • Guise J, Hansen M, Lambert W, O’Brien K. The role of simulation in mixed-methods research: A framework and application to patient safety. BMC Health Services Research. 2017; 17 :322. doi: 10.1186/s12913-017-2255-7. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Halamek LP. Simulation as a methodology for assessing the performance of healthcare professionals working in the delivery room. Seminars in Fetal & Neonatal Medicine. 2013; 18 :369–372. doi: 10.1016/j.siny.2013.08.010. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Hardy GE, Llewelyn S, et al. Introduction to psychotherapy process research. In: Gelo OCG, et al., editors. Psychotherapy Research. New York: Springer; 2015. pp. 183–193. [ Google Scholar ]
  • Ibrahim IR, Palaian S, Ibrahim MI. Assessment of diarrhea treatment and counseling in community pharmacies in Baghdad, Iraq: A simulated patient study. Pharmacy Practice. 2018; 16 (4):1313. doi: 10.18549/PharmPract.2018.04.1313. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • *Katz, E., Tufford, L., Bogo, M., & Regehr, C. (2014). Illuminating students’ pre-practicum conceptual and emotional states: Implications for field education. Journal of Teaching in Social Work , 34 (1), 96–108. 10.1080/08841233.2013.868391
  • Knobloch-Fedders LM, Elkin I, Kiesler DJ. Looking back, looking forward: A historical reflection on psychotherapy process research. Psychotherapy Research. 2015; 25 (4):383–395. doi: 10.1080/10503307.2014.906764. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kourgiantakis T, Sewell K, Hu R, Logan J, Bogo M. Simulation in social work education: A scoping review. Research on Social Work Practice. 2019; 30 (4):433–450. doi: 10.1177/1049731519885015. [ CrossRef ] [ Google Scholar ]
  • *Kurtz, M. E., Johnson, S. M., & Rice, S. (1989). Students’ clinical assessments: Are they affected by stereotyping? Journal of Social Work Education , 25 (1), 3–12. 10.1080/10437797.1989.10671264
  • *LeBlanc, V. R., C. Regehr, A. Shlonsky, & M. Bogo. (2012). Stress responses and decision making in child protection workers faced with high conflict situations. Child Abuse & Neglect , 36 , 404–412. 10.1016/j.chiabu.2012.01.003 [ PubMed ]
  • Lee E, Bhuyan R. Negotiating within whiteness in cross-cultural clinical encounters. Social Service Review. 2013; 87 (1):98–130. doi: 10.1086/669919. [ CrossRef ] [ Google Scholar ]
  • Lee E, Horvath AO. Early cultural dialogues in cross-cultural clinical practice. Smith College Studies in Social Work. 2013; 83 (2/3):185–212. doi: 10.1080/00377317.2013.802639. [ CrossRef ] [ Google Scholar ]
  • Lee E, Horvath AO. How a therapist responds to cultural versus noncultural dialogue in cross-cultural clinical practice. Journal of Social Work Practice. 2014; 28 (2):193–217. doi: 10.1080/02650533.2013.821104. [ CrossRef ] [ Google Scholar ]
  • Lee E, Tsang AKT, Bogo M, Johnstone M, Herschman J. Enactments of racial microaggression in everyday therapeutic encounters. Smith College Studies in Social Work. 2018; 88 (3):211–236. doi: 10.1080/00377317.2018.1476646. [ CrossRef ] [ Google Scholar ]
  • Levac D, Colquhoun H, O’Brien K. Scoping studies: Advancing the methodology. Implementation Science. 2010; 5 :69. doi: 10.1186/1748-5908-5-69. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • *Logie, C. H., Bogo, M., & Katz, E. (2015). “I didn’t feel equipped”: Social work students’ reflections on a simulated client “coming out.” Journal of Social Work Education, 51 (2), 315–328.
  • *Maki, M. T. (1999). The effects on clinician identification when clinician and client share a common ethnic minority background. Journal of Multicultural Social Work , 7 (1/2), 57–72. 10.1300/J285v07n01_04
  • *MacDonell, K. K., Pennar, A. L., King, L., Todd, L., Martinez, S., & Naar, S. (2019). Adolescent HIV healthcare providers’ competencies in motivational interviewing using a standard patient model of fidelity monitoring. AIDS and Behavior , 23 (10), 2837–2839. 10.1007/s10461-019-02445-4 [ PMC free article ] [ PubMed ]
  • Madden JM, Quick JD, Ross-Degnan D, Kafle KK. Undercover careseekers: Simulated clients in the study of health provider behavior in developing countries. Social Science & Medicine. 1997; 45 (1):1465–1482. doi: 10.1016/S0277-9536(97)00076-2. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Medical Research Methodology. 2018; 18 :143. doi: 10.1186/s12874-018-0611-x. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • National Association of Social Workers. (2005). NASW standards for clinical social work in social work practice . Retrieved from https://www.socialworkers.org/Practice/Practice-Standards-Guidelines
  • Olin SS, O’Connor BC, Storfer-Isser A, Clark LJ, Perkins M, Scholle SH, Whitmyre ED, Horwitz SM. Access to care for youth in a state mental health system: A simulated patient approach. Journal of the American Academy of Child & Adolescent Psychiatry. 2016; 55 (5):392–399. doi: 10.1016/j.jaac.2016.02.014. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Orlinsky D, Heinonen E, Hartmann A. Psychotherapy process research. International Encyclopedia of the Social & Behavioral Sciences (2nd ed) 2015; 19 :515–520. doi: 10.1016/B978-0-08-097086-8.21083-0. [ CrossRef ] [ Google Scholar ]
  • Peters, M. D. J., Godfrey, C., McInerney, P., Munn, Z., Tricco, A. C., & Khalil, H. (2020). Chapter 11: Scoping reviews, In E. Aromataris, E. & Z. Munn (Eds.). JBI Reviewer’s Manual. Adelaide: JBI. 10.46658/JBIRM-20-01
  • Rawlings MA. Assessing BSW student direct practice skill using standardized clients and self-efficacy theory. Journal of Social Work Education. 2012; 48 (2):553–576. doi: 10.5175/JSWE.2012.201000070. [ CrossRef ] [ Google Scholar ]
  • *Reeves, J., Drew, I., Shemmings, D., & Ferguson, H. (2015). ‘Rosie 2’ a child protection simulation: Perspectives on neglect and the ‘unconscious at work.’ Child Abuse Review , 24 (5), 346–364. 10.1002/car.2362
  • *Regehr, C., Bogo, M., LeBlanc, V. R., Baird, S., Paterson, J., & Birze, A. (2016). Suicide risk assessment: Clinicians’ confidence in their professional judgment. Journal of Loss and Trauma, 21 (1), 30–46.
  • *Regehr, C., Bogo, M., Shlonsky, A. & LeBlanc, V. (2010a). Confidence and professional judgment in assessing children’s risk of abuse. Research on Social Work Practice, 20 (6), 621–628.
  • *Regehr, C., LeBlanc, V. R., Bogo, M., Paterson, J. & Birze, A. (2015/2020). Suicide risk assessments: Examining influences on clinicians’ professional judgment. American Journal of Orthopsychiatry, 85 (4), 295–301. [ PubMed ]
  • *Regehr, C., LeBlanc, V., Shlonsky, A., & Bogo, M. (2010b). The influence of clinicians’ previous trauma exposure on their assessment of child abuse risk. Journal of Nervous and Mental Disease , 198 (9), 614–618. 10.1097/NMD.0b013e3181ef349e [ PubMed ]
  • Sandelowski M. Whatever happened to qualitative description? Research in Nursing and Health. 2000; 23 :334–340. doi: 10.1002/1098-240X(200008)23:4<334::AID-NUR9>3.0.CO;2-G. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Schoenberg NE, Ravdal H. Using vignettes in awareness and attitudinal research. International Journal of Social Research Methodology. 2000; 3 (1):63–74. doi: 10.1080/136455700294932. [ CrossRef ] [ Google Scholar ]
  • *Sewell, K. M., Sanders, J. E., Kourgiantakis, T., Katz, E., & Bogo, M. (2020). Cognitive and affective processes: MSW students’ awareness and coping through simulated interviews. Social Work Education . 10.1080/02615479.2020.1727875.
  • Siminoff LA, Rogers HL, Waller AC, Harris-Haywood S, Esptein RM, Carrio B, Longo DR. The advantages and challenges of unannounced standardized patients methodology to assess healthcare communication. Patient Education and Counseling. 2011; 82 (3):318–324. doi: 10.1016/j.pec.2011.01.021. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Stokes J, Schmidt G. Child protection decision making: A factorial analysis using case vignettes. Social Work. 2012; 57 (1):83–90. doi: 10.1093/sw/swr007. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • *Stone, K. (2019). Approved Mental Health Professionals and Detention: An Exploration of Professional Differences and Similarities. Practice , 31 (2), 83–96. 10.1080/09503153.2018.1445709
  • Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, Hempel S. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Annals of Internal Medicine. 2018; 169 :467–473. doi: 10.7326/M18-0850. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tsang AKT, Bogo M, George U. Critical issues in cross-cultural counseling research: Case example of an ongoing project. Multicultural Counseling and Development. 2003; 31 :63–78. doi: 10.1002/j.2161-1912.2003.tb00532.x. [ CrossRef ] [ Google Scholar ]
  • Tsang AKT, Bogo M, Lee E. Engagement in cross-cultural clinical practice: Narrative analysis of first sessions. Clinical Social Work Journal. 2011; 39 :79–90. doi: 10.1007/s10615-010-0265-6. [ CrossRef ] [ Google Scholar ]
  • *Tufford, L., Bogo, M., & Asakura, K. (2015). How do social workers respond to potential child neglect? Social Work Education: The International Journal, 34 (2), 229–243.
  • *Tufford, L., Bogo, M., & Katz, E. (2017). Examining metacompetence in graduating BSW students. The Journal of Baccalaureate Social Work , 22 , 93–110. 10.18084/1084-7219.22.1.93
  • *Tyler, T. R., & Franklin, A. E. (2020). Student reflections with a dyadic Objective Structured Clinical Examination (OSCE) adapted for social work. Journal of Social Work Education, 40 (3), 221–241. 10.1080/08841233.2020.1751774
  • Walden T, Wolock I, Demone HW. Ethical decision making in human services: A comparative study. Families in Society: The Journal of Contemporary Social Services. 1990; 71 (2):67–75. doi: 10.1177/104438949007100201. [ CrossRef ] [ Google Scholar ]
  • Washburn M, Bordnick P, Rizzo AS. A pilot feasibility study of virtual patient simulation to enhance social work students’ brief mental health assessment skills. Social Work in Health Care. 2016; 55 (9):675–693. doi: 10.1080/00981389.2016.1210715. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Watson MC, Norris P, Granas AG. A systematic review of the use of simulated patients and pharmacy practice research. International Journal of Pharmacy Practice. 2006; 14 :83–93. doi: 10.1211/ijpp.14.2.0002. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • *Wilkins, D., & Jones, R. (2018). Simulating supervision: How do managers respond to a crisis? European Journal of Social Work, 21 (3), 454–466.
  • *Wilkins, D., Khan, M., Stabler, L., Newlands, F., & Mcdonnell, J. (2018). Evaluating the quality of social work supervision in UK children’s services: Comparing self-report and independent observations. Clinical Social Work Journal , 46 (4), 350–360. 10.1007/s10615-018-0680-7 [ PMC free article ] [ PubMed ]
  • Wilks T. The use of vignettes in qualitative research into social work values. Qualitative Social Work. 2004; 3 (1):78–87. doi: 10.1177/1473325004041133. [ CrossRef ] [ Google Scholar ]
  • Zhou Y, Collinson A, Laidlaw A, Humphris G. How do medical students respond to emotional cues and concerns expressed by simulated patients during OSCE consultations?—A multilevel study. PLoS ONE. 2013; 8 (10):e79166. doi: 10.1371/journal.pone.0079166. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]

Research Simulation Task


This module is for Grades 9–10.

Welcome

Research is the act of finding information from multiple sources in order to address a topic or answer a question. It requires the researcher to read closely, observe for relevant evidence, and make inferences from the evidence found. In this module, you will engage in a research process by analyzing a variety of print and non-print texts on the subject of bullying, identifying evidence regarding the topic, and using this new information to respond to questions about the text and the topic.

Module Objectives

By the end of this module, you will be able to:

  • Provide textual evidence to support your analysis of what the text says explicitly and inferences drawn from the text.
  • Provide an analysis of various accounts of a subject told in both print and non-print text.

Conducting research using multiple sources can help you answer questions about a specific topic. As you work through this module, you will:

  • Identify important textual evidence about a research topic.
  • Make inferences from textual evidence.
  • Identify evidence in multiple text types.


This website is a production of Maryland Public Television/Thinkport in collaboration with the Maryland State Department of Education. The contents of this website were developed under a grant from the U.S. Department of Education. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government.


Central Idea

The big idea or the most important message that the author is trying to convey in a piece of literature. It is the unifying element of a story, tying together all of the other elements of fiction the author uses to tell the story.


New Jersey Department of Education


NJSLA-ELA Companion Guide: Grades 3–8

The New Jersey Student Learning Assessments for English Language Arts (NJSLA-ELA) measure student proficiency with grade-level skills, knowledge, and concepts that are critical to college and career readiness. On each assessment, students read and analyze passages from authentic fiction and nonfiction texts. Test forms can also include multimedia stimuli such as video or audio. The ELA assessments emphasize the importance of close reading, synthesizing ideas within and across texts, determining the meaning of words and phrases in context, and writing effectively when using and/or analyzing sources.

NJSLA-ELA Grades 3–8

The NJSLA-ELA blueprints define the total number of tasks and points for any given grade-level assessment. To maintain the content coverage while shortening the assessment, it was necessary to create two blueprints for grades 3 through 8. One test form was assembled according to each blueprint.

The grades 3 through 8 ELA assessments:

  • align to a representative sampling of standards and evidence statements;
  • reflect the balance between literary and informational texts;
  • include a writing task in each unit and associated scoring rubrics;
  • align with the ELA Task Models;
  • maintain all item types (Evidence-Based Selected Response; Technology-Enhanced Constructed Response; and Prose Constructed Response); and
  • report on all five subclaims and performance levels.

Blueprint 1 for grade 3 consists of a Literary Analysis Task and Research Simulation Task. Blueprint 2 is composed of a Narrative Writing Task, Short Passage Set, and Research Simulation Task. The units, ELA task types, and testing times for each blueprint are outlined in Tables 1 and 2.

Table 1: ELA Grade 3—Blueprint 1

Unit Task Time (minutes)
Unit 1 Literary Analysis Task 75 
Unit 2 Research Simulation Task 75

Table 2: ELA Grade 3—Blueprint 2

Unit Task Time (minutes)
Unit 1 Narrative Writing Task and Short Passage Set 75
Unit 2 Research Simulation Task 75

Blueprint 1 for grades 4 through 8 consists of a Literary Analysis Task, Short Passage Set, and Research Simulation Task. Blueprint 2 is composed of a Narrative Writing Task, Long or Paired Passage Set, and Research Simulation Task. The units, ELA task types, and testing times for each blueprint are outlined in Tables 3 and 4.

Table 3: ELA Grades 4 through 8—Blueprint 1

Unit Task Time (minutes)
Unit 1 Literary Analysis Task and Short Passage Set 90
Unit 2 Research Simulation Task 90

Table 4: ELA Grades 4 through 8—Blueprint 2

Unit Task Time (minutes)
Unit 1 Narrative Writing Task and Long or Paired Passage Set 90
Unit 2 Research Simulation Task 90

Comparability of Forms

The two NJSLA-ELA forms adhere to stringent content specifications and statistical requirements to ensure that they are comparable and fair for all students. Total points vary between the two forms because of the differing designs of the Literary Analysis Task and the Narrative Writing Task. However, both tasks ask students to read and respond to literary texts, and both report and align to the Literary Text subclaim, standards, and evidence statements. The forms are designed so that students, regardless of which form they are assigned, must demonstrate the same level of knowledge to meet a specific performance level.

Expert analysis was conducted to ensure that scores are comparable across forms. First, the two forms were built to be similar in content and difficulty. Then, the two forms were equated by means of a statistical process conducted to establish comparable scores on different forms of an assessment.

The two forms will be randomly assigned to students. Therefore, all students need to be prepared to respond to all three task types.

For More Information

The NJSLA-ELA blueprints and additional test support documents (e.g., evidence statements, scoring rubrics) can be found on the Test Content and Other Information webpage of the New Jersey Assessments Resource Center under Educator Resources. The New Jersey Assessments Resource Center also includes links to access the ELA Practice Tests and Released Items .

If you have any questions, please contact the Office of Assessments at [email protected] .
