Characteristic | n (%) |
---|---|
Year of article | |
Before 2000 | 3 (13) |
2001–2010 | 4 (17) |
2011–2015 | 8 (33) |
2016–Present | 9 (38) |
Study location | |
Canada | 12 (50) |
USA | 6 (25) |
UK | 5 (21) |
Other | 1 (4) |
Targeted competencies studied | |
Decision making | 8 (33) |
Culture and diversity | 5 (21) |
Role of cognition and emotion | 5 (21) |
Communication skills | 3 (13) |
Supervision | 2 (8) |
Other | 1 (4) |
Study sample | |
Social work students | 7 (29) |
Social workers | 7 (29) |
Students & social workers | 5 (21) |
Interprofessional | 3 (13) |
Supervisors | 2 (8) |
Study design | |
Qualitative | 11 (46) |
Quantitative | 10 (42) |
Mixed methods | 3 (13) |
Type of simulation data | |
Live SP | 13 (54) |
Secondary data from the OSCE | 6 (25) |
Video vignette | 4 (17) |
Virtual reality | 1 (4) |
Rationale for using simulation data | |
Yes | 14 (58) |
No | 10 (42) |
Case development | |
Yes | 18 (75) |
No | 6 (25) |
N = 24. Due to rounding, percentages may not total 100
In terms of the targeted competencies studied within the articles, the most common area was clinicians’ professional decision-making (n = 8, 33%), for instance, in the reporting of child abuse (e.g., LeBlanc et al. 2012 *; Tufford et al. 2015 *) and the assessment of suicide risk (e.g., Bogo et al. 2017 *; Regehr et al. 2015 *, 2016 *). Five articles (21%) examined the role of clinicians’ cognitive skills and/or emotional reactions and states (Bogo et al. 2013 *; Katz et al. 2014 *; Reeves et al. 2015 *; Sewell et al. 2020 *; Tufford et al. 2017 *). An additional five articles (21%) examined culture and diversity related to clinical practice, such as cultural empathy (Garcia et al. 2012 *), stereotypes (Kurtz et al. 1989 *), ethnicity (Maki 1999 *), and working with LGBTQ populations (Logie et al. 2015 *; Tyler and Franklin 2020 *). Three articles (13%) examined communication skills among practitioners (Duehn and Mayadas 1979 *; Forrester et al. 2008 *; MacDonell et al. 2019 *). In two articles (8%), supervision skills were studied as a clinical practice competence (Wilkins and Jones 2018 *; Wilkins et al. 2018 *).
Almost half of the articles (n = 11, 46%) did not specify a scope of clinical practice and instead addressed competencies relevant to clinical practice in general (e.g., Bogo et al. 2013 *; Garcia et al. 2012 *; Katz et al. 2014 *; Tyler and Franklin 2020 *). Child protection was the most common specialized scope of clinical practice (n = 7, 29%), in which researchers examined the reporting of potential neglect and abuse (e.g., Forrester et al. 2008 *; Reeves et al. 2015 *; Regehr et al. 2010a *, 2010b *). Suicide risk assessment was another scope of practice addressed in three articles (Bogo et al. 2017 *; Regehr et al. 2015 *, 2016 *). Other specialized scopes of practice reported in the articles were healthcare (MacDonell et al. 2019 *), criminal justice (Stone 2019 *), and employment services (Eskelinen and Caswell 2006 *). Table 3 shows a synthesis of the selected articles.
Synthesis of selected articles (N = 24)
Author | Year | Location | Competencies studied | Study sample | Study design | Type of simulation data | Simulation data rationale | Case development |
---|---|---|---|---|---|---|---|---|
Bogo et al | 2013 | Canada | Role of cognition and emotion | N = 18 / Students and social workers | Qualitative | Data from the OSCE | No | Yes |
Bogo et al | 2017 | Canada | Decision making | N = 71 / Students and social workers | Qualitative | Live SP | No | Yes |
Duehn & Mayadas | 1979 | USA | Communication skills | N = 75 / Social work students | Quantitative | Live SP | No | No |
Eskelinen & Caswell | 2006 | Other | Other | N = 28 / Social workers | Qualitative | Video vignette | Yes | Yes |
Forrester et al | 2008 | UK | Communication skills | N = 24 / Social workers | Quantitative | Live SP | Yes | Yes |
Garcia et al | 2012 | USA | Culture and diversity | N = 20 / Social work students | Qualitative | Live SP | Yes | Yes |
Katz et al | 2014 | Canada | Role of cognition and emotion | N = 109 / Social work students | Qualitative | Data from the OSCE | Yes | No |
Kurtz et al | 1989 | USA | Culture and diversity | N = 79 / Social work students | Quantitative | Video vignette | No | Yes |
LeBlanc et al | 2012 | Canada | Decision making | N = 96 / Social workers | Quantitative | Live SP | Yes | Yes |
Logie et al | 2015 | Canada | Culture and diversity | N = 18 / Students and social workers | Qualitative | Data from the OSCE | No | Yes |
MacDonell et al | 2019 | USA | Communication skills | N = 151 / Interprofessional | Quantitative | Live SP | Yes | No |
Maki | 1999 | USA | Culture and diversity | N = 120 / Social workers | Quantitative | Video vignette | No | No |
Reeves et al | 2015 | UK | Role of cognition and emotion | N = 24 / Interprofessional | Quantitative | Virtual reality | Yes | Yes |
Regehr et al | 2010a | Canada | Decision making | N = 96 / Social workers | Mixed methods | Live SP | No | Yes |
Regehr et al | 2010b | Canada | Decision making | N = 96 / Social workers | Quantitative | Live SP | No | Yes |
Regehr et al | 2015 | Canada | Decision making | N = 71 / Students and social workers | Quantitative | Live SP | Yes | Yes |
Regehr et al | 2016 | Canada | Decision making | N = 71 / Social workers | Mixed methods | Live SP | Yes | Yes |
Sewell et al | 2020 | Canada | Role of cognition and emotion | N = 57 / Social work students | Qualitative | Live SP | Yes | Yes |
Stone | 2019 | UK | Decision making | N = 20 / Interprofessional | Qualitative | Video vignette | Yes | Yes |
Tufford et al | 2015 | Canada | Decision making | N = 23 / Students and social workers | Qualitative | Data from the OSCE | No | No |
Tufford et al | 2017 | Canada | Role of cognition and emotion | N = 20 / Social work students | Qualitative | Data from the OSCE | Yes | Yes |
Tyler & Franklin | 2020 | USA | Culture and diversity | N = 37 / Social work students | Qualitative | Data from the OSCE | No | No |
Wilkins & Jones | 2018 | UK | Supervision | N = 30 / Supervisors | Mixed methods | Live SP | Yes | Yes |
Wilkins et al | 2018 | UK | Supervision | N = 12 / Supervisors | Quantitative | Live SP | Yes | Yes |
To address the second research question of how simulation-based data are used in studies of clinical competencies, we examined whether researchers provided a rationale for using simulation as a research methodology. We also looked at what types of simulation researchers used and whether and how they described case development. Finally, we identified how simulation-based data were used and analyzed in studying practice competencies.
Approximately half of the articles (n = 14, 58%) reported a rationale for using simulation as a research methodology, while the remaining articles (n = 10, 42%) did not explain why simulation-based data were used to answer their research questions. The specific rationales located in the selected articles are discussed as methodological strengths in the subsequent section. About half of the articles (n = 13, 54%) reported using live SPs (i.e., trained actors) to simulate a social work practice scenario. In six articles (25%), researchers used data collected during the OSCEs for secondary data analysis. All of these studies (e.g., Bogo et al. 2013 *; Logie et al. 2015 *) used participants’ qualitative reflections on their engagement with SPs as a data source. In four studies (17%), researchers developed and used video-based case vignettes as a source of data. Researchers reported filming simulated interactions between SPs and practitioners in three of those articles (Eskelinen and Caswell 2006 *; Kurtz et al. 1989 *; Stone 2019 *), while an SP monologue was filmed and used as a data source in one article (Maki 1999 *). Virtual reality was used in one article (Reeves et al. 2015 *) to develop a virtual patient and an immersive practice environment for studying participants’ emotional responses to child protection work.
Most of the articles (n = 18, 75%) described the process of developing simulated case scenarios, while no relevant information was located in the remaining six articles (25%). In these 18 articles, researchers described the specific processes by which the case vignettes were developed, and two common practices were identified: (1) consultation with practitioners or experts during the case development phase (e.g., Bogo et al. 2017 *; Logie et al. 2015 *; Regehr et al. 2010a *), and (2) pilot-testing with other practitioners for face validity (e.g., Regehr et al. 2015 *, 2016 *; Stone 2019 *). A few researchers (e.g., Bogo et al. 2017 *; LeBlanc et al. 2012 *; Tufford et al. 2017 *) reported SP training. The level of detail in these descriptions, however, varied. Some provided no information other than that SPs were trained, while others described assisting SPs in consistently enacting the case scenarios with respect to both verbal (i.e., key phrases to use) and non-verbal (e.g., emotional intensity) elements of communication.
In the articles that reported using qualitative research (n = 11), most researchers examined participants’ written or verbal reflections on their simulated practice (e.g., OSCEs with live SPs) in studying practice competencies. For the most part, qualitative methodologies were employed in studies that aimed to conceptualize practice competencies that are not yet well studied, such as cognitive and affective skills (e.g., Bogo et al. 2017 *; Sewell et al. 2020 *; Tufford et al. 2017 *) and engaging culture and diversity in practice (e.g., Garcia et al. 2012 *; Logie et al. 2015 *; Tyler and Franklin 2020 *). Thematic analysis (e.g., Bogo et al. 2013 *, 2017 *; Tyler and Franklin 2020 *) and qualitative content analysis (Katz et al. 2014 *; Tufford et al. 2015 *) were the most common analytic methods used in studying practice competencies inductively through the participants’ perspectives. On the other hand, no apparent patterns were found in how researchers used quantitative (n = 10) or mixed (n = 3) methodologies in studying practice competencies. Kurtz et al. ( 1989 *) and Regehr’s team (e.g., Regehr et al. 2010a *, 2010b *; LeBlanc et al. 2012 *), for instance, studied professional decision-making by having participants complete measures in conjunction with participating in a simulation. Some researchers, such as Duehn and Mayadas ( 1979 *), Forrester et al. ( 2008 *), and Wilkins et al. ( 2018 *), studied practice competencies by recording participants’ simulated sessions and analyzing their practice skills using a coding system. Others, such as Kurtz et al. ( 1989 *) and Maki ( 1999 *), looked at whether individual participants engaged with multiple simulated scenarios similarly or differently based on client characteristics, while MacDonell et al. ( 2019 *) compared social workers and other professionals in their engagements with SPs.
To address the third research question, we reviewed the selected articles to examine the benefits and limitations associated with using simulation as a research methodology. Although the use of simulation is a novel and innovative methodology in social work research, surprisingly little relevant discussion was found. In terms of the strengths of using simulation as a research methodology, three themes emerged: simulation offers (1) opportunities for direct observation of practice, (2) standardization of practice situations, and (3) a solution for research ethics related concerns. First, Wilkins and his colleagues (Wilkins and Jones 2018 *; Wilkins et al. 2018 *) suggested that the use of simulation provides an opportunity for researchers to directly observe practitioner behaviors. While social work researchers have traditionally relied on practitioners’ (often retrospective) self-reports on their practice (Wilkins and Jones 2018 *), simulation allows researchers to answer research questions related to clinical practice through the observation and analysis of real-time data on social workers. Second, researchers (e.g., Bogo et al. 2017 *; Forrester et al. 2008 *; LeBlanc et al. 2012 *) discussed the process of developing and preparing standardized simulation scenarios. By providing consistent verbal and non-verbal information in a standardized practice environment, researchers are able to observe participants and their clinical competencies under comparable conditions. Third, several authors (e.g., Eskelinen and Caswell 2006 *; Forrester et al. 2008 *; Stone 2019 *) suggested that simulation provides a solution to ethical challenges in researching social work practice. Researching real social work clients from vulnerable communities, or sensitive topics such as suicide risk (e.g., Regehr et al. 2015 *, 2016 *), poses serious concerns from a research ethics perspective. Simulation enables researchers to study clinical practice through direct observation of clinicians and their competencies while mitigating these ethical concerns.
In reviewing the selected articles, several limitations were noted in relation to using simulation as a research methodology. First, a few researchers (e.g., Eskelinen and Caswell 2006 *; LeBlanc et al. 2012 *; Regehr et al. 2015 *) suggested that the very nature of simulation, the fact that it is simulated practice rather than a real clinical encounter with a real client, poses methodological limitations. Simulation is a portrayal of a clinical situation and might not accurately reflect real-life practice. Similarly, a few researchers (e.g., Bogo et al. 2017 *; Regehr et al. 2010a *; Wilkins et al. 2018 *) identified the partial and constrained nature of simulation as a methodological limitation. These researchers cautioned that a short, single-session format of simulation (e.g., 15 min in Regehr et al. 2010a *; 30 min in Wilkins and Jones 2018 *) might not fully capture the dynamic and process-oriented nature of social work practice, limiting the generalization of study results. Second, Forrester et al. ( 2008 *) and Maki ( 1999 *) cautioned that, because it does not involve real clients with real presenting problems, simulation is not suitable when research questions involve client outcomes (e.g., impacts of therapeutic alliance on clients; improvement in client presenting concerns). Finally, two articles (Regehr et al. 2015 *; Stone 2019 *) raised issues related to socio-cultural diversity in simulated case scenarios (e.g., only white simulated clients). Especially for the study of social work practice, the use of simulation might limit its applicability and relevance to real-life clinical practice if the simulated cases do not reflect the sociocultural diversity present in contemporary social work practice.
We identified 24 articles in this scoping review focused on the use of simulation as an investigative methodology (i.e., SBR) for researching clinical social work competencies. We synthesized the characteristics of relevant studies, the ways in which simulation-based data are used in research on clinical competencies, and the methodological benefits and limitations of SBR. Results on the characteristics of the selected articles suggest that the use of simulation is a relatively new methodology in clinical social work, with a majority of articles published in the last ten years. Given the small number of articles published prior to 2010 and the large number of articles using data available from OSCE-based student assessment, the emergence of simulation-based teaching in social work over the last ten years (Kourgiantakis et al. 2019 ) has likely played an important role in vitalizing SBR. While social work competence consists of a number of elements, namely the worker’s knowledge, values, skills, and cognitive and affective processes (CSWE 2015 ), only a few areas of competence were studied in the selected articles. Previously, a written case vignette method was often used in studies of various social work competencies, most notably professional decision-making (e.g., Ashton 1999 ; Stokes and Schmidt 2012 ), social work values (e.g., Walden et al. 1990 ; Wilks 2004 ), and attitudes about marginalized populations (e.g., Camilleri and Ryan 2006 ; Schoenberg and Ravdal 2000 ). In comparison to the one-dimensional and static nature of written vignettes, simulation creates dynamic and immersive practice situations for researchers.
As SBR becomes commonplace in social work, researchers might consider expanding the target competencies studied in the future to include areas such as assessment, diagnosis, and treatment planning. Another important point to note is the use of SBR for conceptualizing the role of cognition and emotion in social work practice (e.g., Bogo et al. 2013 *; Sewell et al. 2020 *). The notion of competency was re-conceptualized a few years ago (CSWE 2015 ) to recognize the important role of clinicians’ cognitive and affective processes in social work practice. Direct observation of and reflection on practice allowed researchers to explicate and translate these rather abstract concepts into concrete competency-based skillsets. As SBR has played an essential role in advancing the conceptualization of cognitive and affective processes, perhaps a similar research process can be used to identify and explicate other important competencies relevant to clinical practice, such as navigating countertransference, working with a therapeutic impasse, and engaging in anti-oppressive clinical practice. Strengthening partnerships between researchers and practicing clinicians can ensure that complex practice realities frame future research efforts, generating relevant knowledge for enhancing practice.
In terms of the current use of simulation as a research methodology, most SBR studies in social work have hired and trained live SPs to create an immersive, dynamic, and realistic practice environment for research. One scarcity observed here was the use of virtual simulation as a platform for research. Given recent technological advances and the great need for remote, often online-based, practice felt especially during the COVID-19 pandemic, however, virtual simulation holds much promise for enhancing the use of simulation in social work research. Much has been written about the utility of virtual simulation in the teaching and learning of social work practice (e.g., Asakura et al. 2018 ; Asakura et al. in press; Washburn et al. 2016 ). Given that data collected during OSCEs have already been used and analyzed in studies of practice competencies, researchers might consider collecting data through these pedagogical innovations and researching clinical competencies in a virtual environment.
Case development was another important area that emerged in our review. There seem to be three components that can guide case development in future SBR. First, the development of a realistic and trustworthy client case, an undeniably essential element of case development, is facilitated when the case is closely grounded in people’s real experiences. Here we can draw from a robust body of literature on simulation-based social work education. Scenarios should contain detailed information about the client’s history, current cognitive and emotional patterns, and key verbatim responses to use (Bogo et al. 2014 ; Kourgiantakis et al. 2019 ; Sewell et al. 2020 *). Furthermore, scenarios can be developed in consultation with social work practitioners or experts in the area (e.g., Bogo et al. 2013 *; Logie et al. 2015 *; Regehr et al. 2010a *). Consultation with service users, when relevant and feasible, might also help researchers strengthen the trustworthiness of the scenarios. The second key component is SP training for live or video-recorded simulations. This training might involve rehearsing the scenario with the SP and assisting the SP in demonstrating the verbal content and non-verbal communication (e.g., facial expression, tone of voice) relevant to the vignette (e.g., LeBlanc et al. 2012 *; Sewell et al. 2020 *). Finally, pilot-testing the case vignette with social workers and experts in a particular scope of practice might further strengthen the face validity and authenticity of the simulation used in the study.
Results of our review offer two key points regarding when and why to use this methodology. First, SBR offers an ethical fit for research on social work practice. Because it studies actual clinical sessions, PPR is a strong methodology for observing real-time data on clinical practice (Knobloch-Fedders et al. 2015 ). PPR, however, typically involves the observation of therapeutic relationships over time (i.e., longitudinal research) and requires longer-term, intensive involvement from clients (Tsang et al. 2003 ). Social workers also often work in ethically sensitive, mandated social and health service contexts (Bogo 2018 ), such as child protection, residential treatment, prisons, and inpatient hospitals. In these non-voluntary settings, social workers often work with vulnerable populations without alternative access to treatment. Our scoping review suggests that accessing actual client sessions might not be the most ethically appropriate research methodology for social work. By simulating a realistic and trustworthy practice situation, SBR offers a methodological solution for researching social work practice that otherwise could not be observed for legal and ethical reasons, without involving actual clients or putting them at risk of harm.
The second key point derived from this review is the methodological fit between simulation and research on practice competencies. Simulation allows for the alignment and standardization of the practice environment, client characters, and scenarios for a specific research purpose. This finding corroborates the arguments made by SBR researchers in medicine and other healthcare disciplines (e.g., Cheng et al. 2014 ; Halamek 2013 ). Actual client-worker interactions, as seen in PPR, are unique and often unpredictable, depending on the client’s presenting concerns and the client-worker relational dynamics. While actual clinician-client therapeutic processes need to be observed (i.e., PPR) when a study involves client outcomes, SBR might be better suited when researchers’ primary goal is to theorize or examine practice competencies. Designing and standardizing a specific case scenario allows researchers to control these variables, at least to some extent, and more easily identify similarities and differences in how research participants demonstrate various elements of practice competencies.
The current review found inconsistent inclusion of a methodological rationale for using simulation as an investigative tool in the social work SBR articles. Although just over half of the articles included a brief mention of a rationale, a robust argument was generally missing as to why simulation was the most suitable methodology for the respective study purpose and inquiry. We suggest that researchers make concerted efforts to make explicit their rationale for using simulation-based data in answering their research questions. Additionally, there was no clear consensus about how social work researchers might approach study design and data analysis of simulation-based data. There was insufficient information in the current study to suggest when and how to use quantitative or mixed methods in social work SBR. We found that qualitative approaches might be suitable for the collection and analysis of post-simulation practice reflections, especially for the purpose of identifying or assessing under-studied clinical competencies (e.g., attending to culture and diversity in Logie et al. 2015 *; cognitive and affective processes in Sewell et al. 2020 *). Nonetheless, no qualitative study was identified in which simulation-based data from the participants’ practice itself were analyzed inductively. This points to the importance of articulating promising data collection methods for directly observing clinical practice (e.g., using video equipment) and data analytic methods for qualitative coding as a priority for the further advancement of SBR. As those involved in medical and other healthcare SBR have pointed out an absence of methodological best practices for this novel research methodology (e.g., Cheng et al. 2014 ; Guise et al. 2017 ; Siminoff et al. 2011 ), this is an area that also merits continuing attention from social work SBR researchers.
A disconnect between researchers and clinicians, and between research and practice, has long been noted in social work (Gehlert et al. 2017 ). Consultation with practicing clinicians in developing and pilot-testing vignettes, as noted in many of the papers in this review (e.g., Bogo et al. 2017 *; LeBlanc et al. 2012 *), can enhance the accurate representation of client situations without reliance on stereotypes. This can also enhance the validity of simulations and the overall value of research findings. Given that the very purpose of simulation is to create a practice situation closely grounded in everyday clinical practice, active involvement from clinical social workers can only strengthen the quality of SBR and knowledge development in clinical social work. SBR provides a uniquely important opportunity for practice-informed research (CSWE 2015 ), in which researchers and clinicians can work collaboratively in designing and conducting meaningful studies that serve the needs of practitioners and are grounded in the perspectives of clinicians and their clients.
One limitation of any literature review, including this scoping review, is the potential omission of some relevant studies (Peters et al. 2020 ). Due to time and resource constraints (e.g., access to databases), we focused our review on relevant articles published in peer-reviewed journals available through our university library system. Our review also excluded dissertations, books, book chapters, grey literature, and publications in non-English languages. Another limitation of this scoping review is that we did not assess the quality of each empirical study included. Given the novelty of SBR in social work, however, this scoping review aimed to provide a snapshot of the current relevant literature, not to critically appraise research quality. To maintain rigor, we used multiple independent reviewers for every phase of the review (i.e., screening, study selection, and charting) and followed the PRISMA-ScR checklist for reporting (Tricco et al. 2018 ). These methods were designed to increase overall consistency and transparency and to safeguard against bias throughout the study.
In this scoping review, we synthesized the current social work literature (n = 24) in which simulation was used as an investigative methodology in studying clinical social work competencies. SBR offers a promising methodological solution for building knowledge about what clinical competencies might look like and how clinicians can bring various knowledge, values, and skills to bear in a unique practice situation. This proximity to practice means that findings from studies using this methodology can provide relevant insight and support clinical social workers. Recognizing both the strengths and limitations of this novel research methodology, continued engagement with and investment in SBR can only advance the literature on clinical social work practice.
MSW, LICSW, RSW, Ph.D. is an Associate Professor at Carleton University School of Social Work. He is the Director of SIM Social Work Research Lab , in which he engages in a program of research on clinical social work education and practice.