A primary goal of the open science movement is to improve the credibility of research. Concerns have been raised about how publication bias may skew findings in education and special education research, such as the absence of null findings and the use of questionable research practices (Cook, 2014; Gehlbach & Robinson, 2021; Makel et al., 2021). Openness and transparency have been proposed as approaches to help address concerns about questionable research (Cook et al., 2018; Cook, Fleming, et al., 2022; van Dijk et al., 2021) and appear to be gaining momentum. Notably, a recent special issue of Exceptional Children, the flagship research journal in the special education research community, was dedicated to updating and expanding quality indicators of special education research (Toste et al., 2023). Open science practices were interwoven with considerations for diversity, equity, and inclusion throughout discussions of quality indicators for various research approaches in the field. Moreover, in recent research grant competitions, federal funding agencies have increasingly prioritized integrating open science practices into research projects, such as budgeting funds for data management and curation, data sharing in public repositories, and open access publishing (National Institutes of Health, 2023; National Science Foundation, 2023; United States Department of Education, 2023).
In this manuscript, we provide an overview of preregistration, one key open science practice. Then, we provide a summary of third-party preregistration sites, followed by discussion of preregistration across research methodologies commonly used in special education research: group design studies, single-case design studies, qualitative studies, secondary data analyses, and systematic literature reviews. For each methodology covered, we describe unique considerations and challenges. It is our hope to inform and inspire special education researchers to integrate preregistration into typical practice, specifically, to view preregistration as an opportunity to not only increase methodological rigor and credibility but also support mentoring and training.
Overview of Preregistration
Preregistration is a research plan publicly shared by researchers prior to conducting the study. Documenting and sharing key elements of a planned study, such as research questions, hypotheses, intended sample, recruitment plan, outcomes, other key variables (e.g., independent variable(s) in intervention research), and planned data analyses, makes the research process more transparent than has traditionally been the case. In particular, the preregistration process clarifies which study elements were decided upon a priori and which are post hoc decisions (e.g., made during the course of conducting or analyzing the study). There is nothing wrong with post hoc decision making in research, but it is important that research consumers understand whether a study is exploratory, with researchers making key decisions (e.g., which dependent variables to collect, how to analyze data) as they conduct and analyze the research (i.e., post hoc), or confirmatory, with a priori hypotheses tested following predetermined procedures (Reich, 2021). The aim in exploratory research is to identify potential relations between variables that can serve to generate hypotheses for subsequent confirmatory research. Exploratory research is vital to the evolution of knowledge bases, but it is important that it not be confused with confirmatory research in which researchers test a priori hypotheses with predetermined methods. To be transparent about which studies are exploratory and which are confirmatory, researchers can preregister hypotheses and study plans for the latter.
Transparently sharing key study elements prior to conducting the study provides at least two additional advantages:
- Researchers have a clearly articulated blueprint to test their a priori hypotheses and avoid "on the fly" decision-making that might unintentionally introduce bias (e.g., analyzing the data in multiple ways and choosing afterwards the approach that seems to make the most sense).
- As discussed by Fleming et al. (2023), the preregistration process can help research consumers identify questionable research practices, such as:
  - Selective outcome reporting (e.g., reporting only statistically significant findings) by specifying in advance all outcome measures
  - HARKing (hypothesizing after results are known) by specifying hypotheses before results are known
  - P-hacking (e.g., conducting multiple analyses and reporting the one that provides the most desired results) by specifying in advance how data will be analyzed
  - Data exclusion (i.e., deciding whether to exclude data, such as outliers, based on how the decision affects results) by specifying in advance how outliers will be identified and treated
  - Data peeking (i.e., deciding whether to increase the sample/collect more data after analyzing collected data and determining whether results are desirable) by specifying in advance a rule for stopping sample recruitment
By transparently specifying key study elements in advance, the preregistration process makes questionable research practices more readily discoverable by reviewers and other readers. If, for example, researchers preregister one proximal and one distal dependent variable but then report effects only for the proximal variable, without explaining the absence of the distal variable indicated in their study preregistration, readers may suspect the possibility of outcome-reporting bias (i.e., cherry-picking which results to report). When preregistered plans align with research reports, or when researchers note and explain any deviations from preregistered plans, research consumers can have greater confidence that the researchers did not engage in questionable research practices. In the absence of preregistration, research results are typically written up after the study was conducted and may reflect hindsight bias (i.e., researchers remembering things differently than how they actually occurred; Nosek et al., 2018), as well as contain insufficient detail about study procedures, due to page limitations, to discern risk of bias.
One misconception about the preregistration process is that it is "set in stone" (DeHaven, 2017). In other words, there is a concern that once a researcher preregisters a study, there is no room for deviation. This can incite apprehension or anxiety among researchers who know all too well the unexpected events that occur when conducting a study in applied settings. However, researchers need not worry; the intent is to provide a transparent research plan with the understanding that science does not always go according to plan. The point is for researchers to document their plan, hold themselves accountable to that plan, and make visible to the research community the process they went through, including the unexpected. In fact, some third-party online registries are designed to alleviate this concern by allowing updates and amendments to be posted to preregistrations (see Fleming et al., 2023). This feature allows researchers to remain accountable to the original plan while being transparent about their decision-making throughout the course of the study. Additionally, researchers may conduct non-preregistered analyses in addition to their preregistered analyses (DeHaven, 2017), so long as these are clearly reported as such in the research report. Science is about the journey, not the destination. The process of preregistration allows researchers to document that journey as an added layer of evidence that is not beholden to journal page limits.
Researchers who preregister their research no doubt have good intentions to carry out their plans, but circumstances may arise that require one or more deviations. A preregistration deviation is "any discrepancy between what the authors said they would do in the preregistration and what they actually did in the final manuscript" (Willroth & Atherton, 2024, p. 3). Discrepancies can range from minor (e.g., a slight change in participant inclusion criteria) to major (e.g., dropping a dependent variable). In fact, deviating from preregistrations appears to be the norm (Claesen et al., 2021). As such, Willroth and Atherton (2024) recently proposed a deviation protocol intended to support systematic and consistent deviation reporting across studies. The deviation protocol allows authors to share when the deviation occurred (e.g., before, during, or after data collection) and the reason it occurred (e.g., typo/error, plan not possible, new knowledge, peer review), as well as describe the extent of the deviation and its potential impact on the reader's interpretation of the study. Providing authors with a structured reporting template makes the process more systematic and consistent and normalizes deviations when they occur. Therefore, authors better understand that preregistration is not set in stone and that deviations can happen and should be reported to promote openness and transparency and to document the scientific journey.
Importantly, preregistration can also be carried out in the context of a more prescribed process, the registered report, with journals that support this mechanism. Registered reports involve researchers submitting a prospective Introduction and Method section (i.e., a study preregistration) for peer review. If and when the study is accepted in principle based on the preregistration, the study is conducted, written up, and submitted for a second stage of peer review to ensure that the researchers generally adhered to the preregistration, noted and explained when they did not, and reported and discussed findings appropriately. Importantly, after in-principle acceptance in the first stage of review, reviewers cannot reject a paper because of null results or findings that do not seem interesting. Integrating preregistration with peer review in this way incentivizes researchers to follow their study plans when possible, carefully report and justify deviations, and publish null findings when they occur. With an increasing number of social science journals adopting the registered report process, researchers in special education and related fields will have expanding opportunities to conduct and publish registered reports. There are promising preliminary findings that registered reports do, in fact, combat publication bias, with null findings reported more frequently (Scheel et al., 2021). See Cook, Fleming, et al. (2022) for more information on registered reports.
Online Registries
As previously mentioned, a preregistration is a research plan made public on a third-party registry site. While there are multiple registries to choose from, four online sites are the most prevalent among social science researchers: AsPredicted, ClinicalTrials.gov, the Open Science Framework (OSF), and the Registry of Efficacy and Effectiveness Studies (REES). Fleming et al. (2023) recently published a detailed review of these four registries to provide explicit guidance to special education researchers, with the goal of easing any perceived burden of preregistration. In the following paragraphs, we summarize the two registries that, in our experience, are used most often by researchers in special education and related fields (i.e., OSF and REES), including their user interfaces and templates.
OSF Registries is a non-domain-specific registry that allows researchers from any field to preregister their studies online. A benefit of OSF Registries is that researchers can connect a preregistration to other open research outputs (e.g., preprints, open data, open materials) on a centralized project page. OSF project pages are accessible to all team members and allow researchers to manage workflow and collaborative projects. Conversely, REES is an education-specific registry that provides preregistration templates using terms and concepts common to educational research. For example, REES templates prompt researchers to include information on variables, conditions, and interventions specific to educational research. Furthermore, REES is more prescriptive than OSF and offers unique instructions and prompts across different research designs, ensuring researchers provide the appropriate level of specificity when preregistering their studies.
In our experience, both OSF and REES provide a positive user experience. For example, researchers can add colleagues and collaborate on a preregistration together, save a draft until the authors are ready to publish it, and update the preregistration as needed. Both websites allow users to search their databases of registrations. Each registry also offers unique tools and features. For example, if researchers are worried about protecting intellectual property, OSF allows users to embargo (i.e., delay the public posting of) preregistrations for up to four years. OSF also allows researchers to generate a masked copy of the preregistration (with identifying information removed) that authors can include when submitting their completed manuscript for masked peer review. REES provides a "Demo Mode" in which researchers can explore its templates and process without creating an account or a permanent record.
When preregistering a study, researchers often use a template from an online registry to guide their work. Both OSF and REES offer multiple templates to account for the different methods associated with different types of research. For example, OSF and REES provide preregistration templates for quantitative designs such as randomized controlled trials (RCTs), quasi-experimental designs, and regression discontinuity designs. OSF also offers templates for secondary data analyses, meta-analyses, and systematic reviews, as well as qualitative designs such as case studies and ethnographies. In contrast, REES offers a template for single-case designs. When selecting an online registry, researchers should first consider whether the registry offers a template that aligns with their proposed research design. See Figure 1 for an overview of the REES and OSF registries.
Considerations for Preregistration by Methodology
There are unique considerations related to preregistration for different types of research methodology. In this section, we provide considerations related to preregistering research using research designs commonly applied in special education and related fields.
Group Designs
Preregistration of group design randomized clinical trials has a long history, beginning with clinical trial registries in medical research in the 1960s (see Dickersin & Rennie, 2003). As such, group experiments are typically a good fit for preregistration. For example, researchers typically have a priori hypotheses that they seek to test using predetermined methods, all of which can be included in a preregistration. Three of the likely many differences between clinical trials in medicine and group experiments in special education and related fields that have implications for preregistration are (a) study participants and outcomes in special education and related fields are dynamic and involve social elements and inherent variability, all of which are inextricable from each other; (b) the infrastructure, systems, and societal priorities and investment available to conduct this work; and (c) the overlapping nature of systems that are not controlled and are easily influenced by contextual factors (e.g., financial, sociopolitical). Moreover, many clinical trials in special education are planned and conducted by researchers without grant funding who are not under a mandate to preregister their research and who make decisions about their studies independently.
Because of these and other differences between the fields of medicine and special education, there is not a clear expectation to register clinical trials in the latter as there is in the former. We do not take a deep dive into the procedures of preregistration of group design studies here, because these standards have existed for some time, are widely used and available, and do not differ substantially across fields (e.g., Cashin et al., 2023; Cook, Wong, et al., 2022; Simmons et al., 2021). However, preregistration of experimental group research is not as common in special education as it is in other fields like medicine (see Cook et al., 2023). Therefore, this commentary focuses on considerations and key points of focus for researchers in special education and related fields conducting group design research that may support the uptake, use, and benefits of preregistration.
One interesting consideration is the relevance of the grant application process: to what extent can and should a funded application serve as a preregistration of group design studies (e.g., randomized controlled trials, adaptive intervention designs, quasi-experimental designs with participants allocated to conditions)? When abstracts of funded grants are published online, are these considered study preregistrations? Or do we expect more of researchers who have already received a high level of peer- and community-level approval for this research (i.e., peer reviewers and the agency have recommended the study for funding)? Changes are often expected in school-based and special education research given the nuances of recruitment; the unpredictability of the school year and calendar; and the known policies and procedures for hiring, training, retention, and turnover in university/research organization infrastructures. Given the direct communication that researchers have with their program officers about potential deviations or planned changes across the lifetime of a project, much of the information required for updating a preregistration will be collected in an expected and ongoing manner. A process to unify grant applications and the conduct of the subsequently funded project could serve as a type of preregistration. Even if grant applications and changes to grant projects discussed and approved with one's program officer do not automatically populate a preregistration template, the text from these sources can likely be used to complete most of the information needed to preregister funded group design studies and to update the preregistrations as needed.
Changes in research design and in the implementation of preregistered plans can occur at various stages of a study, caused by changes in the overlapping and integrated systems that interact with the study at points beyond the researcher's control. Having specific affordances built into preregistration forms and online systems for changes due to (a) the research team, (b) the local schools or other study partners, and (c) broader issues such as state politics and regional educational or healthcare policies can provide a more flexible and uninterrupted preregistration flow. Of course, researchers can update preregistrations when changes to study plans occur. Including these structural features in online preregistration templates would help to (a) normalize deviations and their documentation and (b) make it easier, and therefore more likely, for researchers to document deviations from preregistered plans (which often does not occur; see Claesen et al., 2021). These issues are not unique to group design research but may be more likely when studies include district-wide samples or research samples that cut across multiple systems and levels of influence.
In terms of resources and templates, OSF maintains a repository with multiple templates for preregistering research (https://osf.io/zab38/) as well as a wiki describing the templates, among other registration formats such as registered reports. A standard preregistration form is also available (https://osf.io/preprints/metaarxiv/epgjd/; Bowman et al., 2020). The site is a helpful resource that includes some discipline-specific templates addressing dimensions relevant to a particular field that may not be as important or applicable to others. For example, school psychology researchers have access to a specific preregistration form (https://osf.io/t6m9v) that is openly accessible, unlike the paywalled published version (see Van't Veer & Giner-Sorolla, 2016). Special education researchers may benefit from a special-education-specific repository or corpus of preregistration templates for research paradigms unique to special education across designs (randomized controlled trials, single-case, qualitative) that include tailored considerations for within-design variation. Although using a structured template may have limitations specific to certain designs, structured templates such as those found on OSF and REES (a) can support researchers preregistering studies by providing concrete guidance in the form of specific prompts and questions and (b) can result in more specific preregistrations (Bakker et al., 2020).
Single-Case Designs
It is important to understand questionable research practices unique to single-case designs (SCDs), as they can distort interpretation of findings by undermining the validity of causal inferences (e.g., Tincani & Travers, 2022). Beginning with participant selection, SCD researchers may choose participants who are more likely to respond to the intervention. Preregistering inclusion (and exclusion) criteria can help reduce cherry-picking participants based on undisclosed characteristics. This is particularly relevant for academic intervention research wherein individuals within a particular disability eligibility category (e.g., specific learning disability, autism spectrum disorder) are likely to be heterogeneous and/or prior research has identified learner characteristics or prerequisite skills that impact responsiveness to the intervention. For example, including only students who have not received any phonics-based reading instruction may affect the outcomes of a study examining the effects of a structured literacy intervention on reading fluency. While selecting participants who are likely to benefit from an intervention is an essential component of both social validity and ethical research, all relevant characteristics researchers considered in participant selection should be reported (Tincani & Travers, 2022). Screening measures and decision points should be included in SCD preregistrations, particularly when more potential participants are recruited and meet inclusion criteria than are intended to be brought into the intervention.
Relatedly, updating preregistrations after recruitment but prior to the onset of data collection can safeguard against "trimming" data, or not including all data from all participants when reporting results (Shadish et al., 2016). For example, researchers could decide to remove a data point that was particularly low because it occurred on the Friday before a holiday or because a fire drill took place right before the intervention session, impacting not only whether functional control is demonstrated but also subsequent estimates of effect size. This is a unique aspect of SCD compared to group designs, where removing data points that are considered outliers is widespread and acceptable (Andre, 2022). Preregistration provides an opportunity to make and document rules for these types of decisions beforehand, reducing the likelihood that decisions are influenced, consciously or subconsciously, by considerations of their effect on study outcomes. In other words, research teams can use their prior experience to proactively identify and report how these common situations will be handled.
Frequent, precise, and repeated measurement in SCD uniquely enables researchers to make response-guided adjustments to interventions without compromising internal validity. The ability of SCD researchers to make changes to the independent variable is dependent upon consistent measurement of the dependent variable. While “p-hacking” may not take place within SCD the way it does in group designs, SCD researchers can engage in questionable research practices similar to data dredging or HARKing by manipulating the measurement of the dependent variable in order to influence visual analysis. This could include changing the way dependent variables are operationally defined to artificially deflate baseline data or inflate intervention data, a unique consideration for SCD given that measurement of the dependent variable is generally defined or created by researchers (e.g., steps of a task analysis, requirement for correct written responses).
Preregistering operational definitions for dependent variables, proposed measurement systems, procedures for visual analysis, and frequency of data collection does not mean that post hoc exploratory analyses cannot take place. Rather, preregistration transforms the questionable research practice into an opportunity for scientific discovery and exploration through transparently HARKing in the discussion section, also known as THARKing (Hollenbeck & Wright, 2017). Changing how a dependent variable is operationally defined after data collection has begun (or even concluded) to influence visual analysis (i.e., HARKing) undermines the scientific process and misleads the broader research community. In contrast, openly acknowledging and explaining the rationale for changing operational definitions from the preregistration in a research report reflects the scientific process and may be ethically required given the high-stakes context of intervention research in schools, where data are costly to obtain in terms of personnel and student instructional time (Hollenbeck & Wright, 2017).
Similarly, in the absence of preregistered procedures, researchers might apply only some visual analysis criteria that highlight positive intervention effects. For example, if a researcher predicts an intervention will functionally control three dependent variables (e.g., steps of a task analysis for using a calculator, correct solutions, and fluency) but student data show inconsistent or no effects on one of the outcomes (e.g., fluency), they may be tempted to only report on the two outcomes under functional control (e.g., steps of task analysis and correct solutions). This would be an example of what Hollenbeck and Wright (2017) call secretive HARKing, or SHARKing. In contrast, preregistration facilitates THARKing by encouraging reporting data for all three preregistered outcomes and hypothesizing reasons that a functional relation was not established between the intervention and the third dependent variable in the discussion.
The intention to use inductive and dynamic reasoning can be signaled in preregistrations by distinguishing between exploratory and explanatory research questions. This distinction is critical because it influences the conclusions that should be drawn from SCD findings (Ledford et al., 2023). Preregistration does not preclude response-guided decision making, but rather can be used to specify the elements of SCD that are determined a priori and the basis for any response-guided decisions. Even when using response-guided decision making for phase-change decisions (e.g., intervention start time, cessation of intervention, timing of generalization or maintenance probes), researchers have a criterion or framework in mind that can be preregistered, as the "timing of phase changes is one of the most fundamental decisions made by researchers using SCD" (Cook, Johnson, et al., 2022, p. 365).
Rather than constraining decision-making, preregistration can foster collaboration and mentoring by providing a shared roadmap for research teams of any size. Centering this open science practice can transform solitary endeavors into collective ventures by protecting against the expertise-reversal effect. Experienced SCD researchers instinctively make decisions and use strategies based on their differing experiences, skills, and background knowledge. Students, collaborators, and research consumers may not fully understand the rationale behind these choices because of a more limited understanding of the concepts and principles of behaviorism, less exposure to extant research, or simply fewer experiences with the intervention or targeted behavior (i.e., the expertise-reversal effect). Even experienced SCD researchers likely vary in how they make decisions. The process of preregistration can set the occasion for teams using SCD to systematically consider and communicate methodological choices by anticipating potential challenges and their impacts.
Qualitative Studies
Preregistration has gained attention as a methodological approach to enhance collaboration, credibility, and transparency in qualitative research (e.g., Branney et al., 2023; Haven & Van Grootel, 2019). Those who engage in qualitative research employ many approaches to deconstruct the meaning and purpose of human experience and what constitutes truth through the perspectives of others. Negotiating how truth is formed often requires qualitative researchers to bring forward philosophical assumptions (e.g., epistemological stance), paradigms (e.g., the researcher's positionality and reflexivity), and interpretive and theoretical frameworks (e.g., the influence of theory) when designing qualitative research (Creswell & Poth, 2016; Merriam and Associates, 2019). One key to preregistration of qualitative research in special education and related fields could involve deconstructing the perspectives on ethical considerations and methodological procedures that qualitative researchers apply in their work (Levitt et al., 2022).
Several recent articles have provided strategies for conducting and evaluating qualitative research in special education. Leko and colleagues (2021) reported indicators of methodological rigor in special education research, emphasizing the role of the researcher as the instrument. The QR Collective (2023) extended this work, underscoring equity-driven qualitative research and the importance of reflexivity, or reflection on one's positionality and perspective throughout qualitative design (Patton, 2015). This recent work allows researchers to examine and reflect on important assumptions about the qualitative research process, including how they define their proximity to the research being conducted, how they make explicit the worldviews they bring to the research, and how they imagine the theories they pose will interact with these dynamics throughout the study. Critically for this manuscript, a preregistration offers room to pin down these reflections before the study is conducted.
Other key opportunities for preregistration of qualitative research have been recommended outside of special education research, including promoting transparency of research aims, codebooks, sampling methods, data collection sources, and data analytic methods (Haven et al., 2020). However, recommendations to preregister these and other qualitative procedures are not without debate, as critics argue that preregistration could, for example, stifle researchers' subjectivity and reflexivity, threaten anonymization, create biases in interpreting data, and pose accessibility challenges because qualitative data come in multiple formats (e.g., transcripts, recordings; Haven et al., 2020). These disagreements are real, are notable in anecdotal conversations in the field of special education, and could work against the widespread use of preregistration. Though speculative, this could explain why few qualitative studies are preregistered, as well as the notable absence of open science discussion, including preregistration, in some recent quality indicators articles in special education (e.g., Leko et al., 2021; The QR Collective, 2023). These issues must be carefully and thoughtfully addressed by following ethical policies and guidance in collaboration with review boards that protect human subjects. This should include eliminating any false distinction about preregistration opportunities between quantitative and qualitative research. Because qualitative researchers often tackle different types of research questions, use different data collection methods and approaches to sample size sufficiency, and often pursue different outcomes than quantitative studies, it is essential to rely on the judgment of these scholars about what is uniquely ethical for preregistration given the characteristics of their study.
Secondary Data Analyses
In our next two sections (secondary data analyses and systematic literature reviews), we focus on methods that use existing data to critically inform current and future research, practice, and policy. While preregistering secondary data analyses may seem counterintuitive (e.g., the data are already collected), preregistration can serve as an important tool to increase transparency and reduce bias when analyzing novel research questions using existing data with which the authors are likely already familiar (Lombardi et al., 2023). Despite the explicit structure it offers, preregistration remains scarce in special education research using secondary data analyses. Moreover, the preregistration process can be split into exploratory and confirmatory phases in order to explore phenomena that have not yet crystallized into testable hypotheses (Baldwin et al., 2022).
Secondary data analyses are prevalent in special education research. However, it is not uncommon for details to be left out of these research reports, specifically details that would allow other research teams to reproduce the work (e.g., how variables were computed). Over the years, authors may have attributed this lack of detail to journal page limits. That rationale is no longer valid, however, as most journals now allow supplemental materials that do not count toward manuscript page limits (Campbell et al., 2024). For research to be transparent and replicable, it is critical that researchers share their intended plans for data analyses and the decision-making that occurred along the way. This process can be documented step by step in annotated software code files (Lombardi et al., 2023). A lack of documentation could hinder replication and, ultimately, lead to misinterpretation of results that shapes implications for future research and practice.
Even with the same dataset, no two preregistrations are alike. Secondary data analyses are typically conducted with large-scale datasets that offer a wealth of variables to select from. For example, the same team of researchers carried out two different studies using data from the National Longitudinal Transition Study 2012 (Burghardt et al., 2017; Lombardi et al., 2022; Lombardi et al., 2024). Both studies were preregistered: Example 1 (https://osf.io/c9d4p) and Example 2 (https://osf.io/y3abz). They involved different sets of variables pertinent to the posed research questions and hypotheses. In the first example, no study deviations occurred; the analysis plan was carried out as described using logistic regression. In the second example, several deviations occurred, as documented in the preregistration (see sections marked "updated"). A major reason for deviation was the skip logic of the selected items; another complicating factor was the more complex analytic approach in Example 2, structural equation modeling. Thus, even when studies use the same dataset, experiences can vary widely depending on variable selection, sampling designs, and planned analyses. Preregistration and deviation protocols provide the tools for transparent decision-making, especially when the steps do not go according to plan.
Systematic Literature Reviews
Our final commentary focuses on preregistration of systematic literature reviews. Systematic reviews are an essential and high-yield research method in special education and related fields: when conducted rigorously and interpreted appropriately, they can readily influence research, practice, and policy (Chow et al., 2021). There is evidence across several fields in the social sciences and medicine that the prevalence of systematic reviews and meta-analyses has increased over time (Chow, 2018; Sandbank et al., 2023). Anecdotally, this trend is likely to continue, given the many projects, both student projects and faculty work, that shifted from primary data collection to systematic reviews during the COVID-19 pandemic.
In special education, new guidance appearing alongside other resources in the recent special issue on quality indicators promotes four dimensions of rigor in systematic reviews: coherence, contextualization, generativity, and transparency (Cumming et al., 2023). Though preregistration can improve the quality of a review on all of these dimensions, it is particularly well-suited to enhancing transparency. Scholars can preregister their reviews and promote transparency by disclosing positionality, fully reporting search procedures and screening criteria, clearly defining variables of interest and corresponding constructs, and clearly articulating design and analysis plans (Cumming et al., 2023; Page et al., 2021). Doing so not only provides important context and perspective on the purpose of the review, but also demonstrates foresight, planning, and adherence to a priori research questions and corresponding analytic decisions. Additionally, preregistration guards against the possibility of researchers altering inclusion/exclusion criteria mid-review to exclude studies based on their findings.
Preregistration may be less challenging to adopt for research synthesis than for some other types of research, offering researchers an excellent opportunity to practice, model, and transfer the process of preregistration to other designs. Although systematic reviews are technically a type of secondary data analysis, a major distinction is that a data collection process must be conceptualized and implemented, because the studies to be included have not yet been located and collected. In contrast to original research involving human participants, collecting data (i.e., studies) from the field in a systematic review is highly predictable once a plan is in place: there is no need for recruitment, consent/assent, or negotiating the many considerations that human subjects research entails.
Specific infrastructure exists for preregistering systematic literature reviews. For example, PROSPERO, supported by the United Kingdom's National Institute for Health and Care Research (https://www.crd.york.ac.uk/prospero), is a commonly used system for this exact purpose. On this registry, researchers can search existing protocols and follow guidance and resources so that teams conducting systematic reviews can not only preregister their reviews successfully, but also update them with transparency and reproducibility in mind. PROSPERO supports the registration of systematic reviews (as well as rapid reviews and umbrella reviews) but does not currently support scoping reviews or literature briefs, given the broader, more exploratory purpose of those review types. Fortunately, adopting preregistration into a research program can ensure that all types of reviews, even those not supported by existing systems, are preregistered and disseminated transparently to the field. For example, researchers can preregister a rigorous scoping review on OSF (even though the Generalized Systematic Review template on OSF is designed for systematic reviews and meta-analyses) rather than rationalizing not to preregister it because PROSPERO lacks a mechanism to do so.
Conclusion
Preregistration can help combat questionable practices in special education research, including selective outcome reporting, HARKing, p-hacking, data exclusion, and data peeking. Notable progress has been made in developing the infrastructure researchers need to openly share study plans along with data and materials (Fleming et al., 2023), and multiple federal funding agencies are calling for the integration of open practices, such as preregistration, into research designs. Now more than ever, it is easy for special education researchers to preregister studies using the budding infrastructure of online platforms and communities. While improvements should continue, it is important to change habits and integrate open practices now, and seasoned special education researchers should model these behaviors for early career scholars. In this paper, we described challenges and unique considerations across various methodologies to foster open and ongoing conversation about how we may adapt open practices, share resources, and support one another in our special education scholarly community.
Competing Interests
The authors have no competing interests to declare.
Author Contribution
Allison Lombardi: Conceptualization, Writing – Original Draft, Writing – Review & Editing. Jason Chow: Conceptualization, Writing – Original Draft, Writing – Review & Editing. Bryan G. Cook: Conceptualization, Writing – Original Draft, Writing – Review & Editing. LaRon Scott: Writing – Original Draft, Writing – Review & Editing. Jenny Root: Writing – Original Draft, Writing – Review & Editing. Jesse I. Fleming: Writing – Original Draft, Writing – Review & Editing.
References
André, Q. (2022). Outlier exclusion procedures must be blind to the researcher’s hypothesis. Journal of Experimental Psychology: General, 151(1), 213–223. http://doi.org/10.1037/xge0001069
Bakker, M., Veldkamp, C., van Assen, M., Crompvoets, E., Ong, H. H., Nosek, B. A., Soderberg, C. K., Mellor, D., & Wicherts, J. M. (2020). Ensuring the quality and specificity of preregistrations. PLOS Biology, 18(12), Article e3000937. http://doi.org/10.1371/journal.pbio.3000937
Baldwin, J. R., Pingault, J. B., Schoeler, T., Sallis, H. M., & Munafò, M. R. (2022). Protecting against researcher bias in secondary data analysis: Challenges and potential solutions. European Journal of Epidemiology, 37(1), 1–10. http://doi.org/10.1007/s10654-021-00839-0
Bowman, S. D., DeHaven, A. C., Errington, T. M., Hardwicke, T. E., Mellor, D. T., Nosek, B. A., & Soderberg, C. K. (2020, January 22). OSF Prereg Template. http://doi.org/10.31222/osf.io/epgjd
Branney, P. E., Brooks, J., Kilby, L., Newman, K., Norris, E., Pownall, M., Talbot, C. V., Treharne, G. J., & Whitaker, C. M. (2023). Three steps to open science for qualitative research in psychology. Social and Personality Psychology Compass, 17(4), e12728. http://doi.org/10.1111/spc3.12728
Burghardt J., Haimson J., Liu A. Y., Lipscomb S., Potter F., Waits T., & Wang S. (2017). National Longitudinal Transition Study 2012 design documentation (NCEE 2017–4021). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
Campbell, A. R., Brunsting, N., Landmark, L., Butler, B., & Cook, B. G. (2024). Sharing materials to heighten the impact of publications, preprint. http://doi.org/10.35542/osf.io/s7x4f
Cashin, A. G., Richards, G. C., DeVito, N. J., Mellor, D. T., & Lee, H. (2023). Registration of health and medical research. BMJ Evidence-Based Medicine, 28(1), 68–72. http://doi.org/10.1136/bmjebm-2021-111836
Chow, J. C. (2018). Prevalence of publication bias tests in speech, language, and hearing research. Journal of Speech, Language, and Hearing Research, 61(12), 3055–3063. http://doi.org/10.1044/2018_JSLHR-L-18-0098
Chow, J. C., Sjogren, A. L., & Zhao, H. (2021). Reporting and reproducibility of meta-analysis in speech, language, and hearing research. Journal of Speech, Language, and Hearing Research, 64(7), 2786–2793. http://doi.org/10.1044/2021_JSLHR-21-00047
Claesen, A., Gomes, S., Tuerlinckx, F., & Vanpaemel, W. (2021). Comparing dream to reality: An assessment of adherence of the first generation of preregistered studies. Royal Society Open Science, 8(10). http://doi.org/10.1098/rsos.211037
Cook, B. G. (2014). A call for examining replication and bias in special education research. Remedial and Special Education, 35(4), 233–246. http://doi.org/10.1177/0741932514528995
Cook, B. G., Fleming, J. I., Hart, S. A., Lane, K. L., Therrien, W. J., van Dijk, W., & Wilson, S. E. (2022). A how-to guide for open-science practices in special education research. Remedial and Special Education, 43(4), 270–280. http://doi.org/10.1177/07419325211019100
Cook, B. G., Johnson, A. H., Maggin, D. M., Therrien, W. J., Barton, E. E., Lloyd, J. W., Reichow, B., Talbott, E., & Travers, J. C. (2022). Open science and single-case design research. Remedial and Special Education, 43(5), 359–369. http://doi.org/10.1177/0741932521996452
Cook, B. G., Lloyd, J. W., Mellor, D., Nosek, B. A., & Therrien, W. J. (2018). Promoting open science to increase the trustworthiness of evidence in special education. Exceptional Children, 85(1), 104–118. http://doi.org/10.1177/0014402918793138
Cook, B. G., van Dijk, W., Vargas, I., Aigotti, S. M., Fleming, J. I., McDonald, S. D., Richmond, C. L., Griendling, L. M., McLucas, A. S., & Johnson, R. M. (2023). A targeted review of open practices in special education publications. Exceptional Children, 89(3), 238–255. http://doi.org/10.1177/00144029221145195
Cook, B. G., Wong, V. C., Fleming, J. I., & Solari, E. J. (2022). Preregistration of randomized controlled trials. Research on Social Work Practice. Advance online publication. http://doi.org/10.1177/10497315221121117
Creswell, J. W., & Poth, C. N. (2016). Qualitative inquiry and research design: Choosing among five approaches. Sage Publications.
Cumming, M. M., Bettini, E., & Chow, J. C. (2023). High-quality systematic literature reviews in special education: Promoting coherence, contextualization, generativity, and transparency. Exceptional Children, 89(4), 412–431. http://doi.org/10.1177/00144029221146576
DeHaven, A. (2017). Pre-registration: A plan, not a prison. Center for Open Science. https://www.cos.io/blog/preregistration-plan-not-prison
Dickersin, K., & Rennie, D. (2003). Registering clinical trials. JAMA, 290(4), 516–523. http://doi.org/10.1001/jama.290.4.516
Fleming, J. I., McLucas, A. S., & Cook, B. G. (2023). Review of four preregistration registries for special education researchers. Remedial and Special Education, 44(6), 495–505. http://doi.org/10.1177/07419325231160293
Gehlbach, H., & Robinson, C. D. (2021). From old school to open science: The implications of new research norms for educational psychology and beyond. Educational Psychologist, 56(2), 79–89. http://doi.org/10.1080/00461520.2021.1898961
Haven, T. L., Errington, T. M., Gleditsch, K. S., van Grootel, L., Jacobs, A. M., Kern, F. G., Piñeiro, R., Rosenblatt, F., & Mokkink, L. B. (2020). Preregistering qualitative research: A Delphi study. International Journal of Qualitative Methods, 19. http://doi.org/10.1177/1609406920976417
Haven, T. L., & Van Grootel, L. (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229–244. http://doi.org/10.1080/08989621.2019.1580147
Hollenbeck, J. R., & Wright, P. M. (2017). Harking, sharking, and tharking: Making the case for post hoc analysis of scientific data. Journal of Management, 43(1), 5–18. http://doi.org/10.1177/0149206316679487
Ledford, J. R., Lambert, J. M., Pustejovsky, J. E., Zimmerman, K. N., Hollins, N., & Barton, E. E. (2023). Single-case-design research in special education: Next-generation guidelines and considerations. Exceptional Children, 89(4), 379–396. http://doi.org/10.1177/00144029221137656
Leko, M. M., Cook, B. G., & Cook, L. (2021). Qualitative methods in special education research. Learning Disabilities Research & Practice, 36(4), 278–286. http://doi.org/10.1111/ldrp.12268
Levitt, H. M., Surace, F. I., Wu, M. B., Chapin, B., Hargrove, J. G., Herbitter, C., Lu, E. C., Maroney, M. R., & Hochman, A. L. (2022). The meaning of scientific objectivity and subjectivity: From the perspective of methodologists. Psychological Methods, 27(4), 589–605. http://doi.org/10.1037/met0000363
Lombardi, A. R., Rifenbark, G. G., Hicks, T. A., Taconet, A., & Challenger, C. (2022). College and career readiness support for youth with and without disabilities based on the National Longitudinal Transition Study 2012. Exceptional Children, 89, 5–21. http://doi.org/10.1177/00144029221088940
Lombardi, A. R., Rifenbark, G. G., & Taconet, A. V. (2023). Quality indicators of secondary data analyses in special education research: A pre-registration guide. Exceptional Children, 89(4), 397–411. http://doi.org/10.1177/00144029221141029
Lombardi, A., Rifenbark, G., & Taconet, A. V. (2024). An intersectional examination of economic hardship and Individualized Education Program meeting participation. Exceptional Children, 90(2), 148–163. http://doi.org/10.1177/00144029231184568
Makel, M. C., Hodges, J., Cook, B. G., & Plucker, J. A. (2021). Both questionable and open research practices are prevalent in education research. Educational Researcher, 50(8), 493–504. http://doi.org/10.3102/0013189X211001356
Merriam, S. B., & Grenier, R. S. (Eds.). (2019). Qualitative research in practice: Examples for discussion and analysis. John Wiley & Sons.
National Institutes of Health (2023). Data management and sharing policy. https://sharing.nih.gov/data-management-and-sharing-policy/about-data-management-and-sharing-policies/data-management-and-sharing-policy-overview#after
National Science Foundation (2023). Proposal & award policies & procedures guide (PAPPG) (NSF 23-1). https://new.nsf.gov/policies/pappg/23-1
Nosek, B. A., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600–2606. http://doi.org/10.1073/pnas.1708274114
Page M. J., Moher D., Bossuyt P. M., Boutron I., Hoffmann T. C., Mulrow C. D., Shamseer L., Tetzlaff J. M., Akl E. A., Brennan S. E., Chou R., Glanville J., Grimshaw J. M., Hróbjartsson A., Lalu M. M., Li T., Loder E. W., Mayo-Wilson E., McDonald S.,… McKenzie J. E. (2021). PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. BMJ, 372(160), 1–36. http://doi.org/10.1136/bmj.n160
Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Sage.
Reich, J. (2021). Preregistration and registered reports. Educational Psychologist, 56(2), 101–109. http://doi.org/10.1080/00461520.2021.1900851
Sandbank, M., Bottema-Beutel, K., LaPoint, S. C., Feldman, J. I., Barrett, D. J., Caldwell, N., Dunham, K., Crank, J., Albarran, S., & Woynaroski, T. (2023). Autism intervention meta-analysis of early childhood studies (Project AIM): Updated systematic review and secondary analysis. BMJ, 383. http://doi.org/10.1136/bmj-2023-076733
Scheel, A. M., Schijen, M. R., & Lakens, D. (2021). An excess of positive results: Comparing the standard psychology literature with registered reports. Advances in Methods and Practices in Psychological Science, 4(2). http://doi.org/10.1177/25152459211007467
Shadish, W. R., Zelinsky, N. A., Vevea, J. L., & Kratochwill, T. R. (2016). A survey of publication practices of single-case design researchers when treatments have small or large effects. Journal of Applied Behavior Analysis, 49(3), 656–673. http://doi.org/10.1002/jaba.308
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2021). Pre-registration: Why and how. Journal of Consumer Psychology, 31(1), 151–162. http://doi.org/10.1002/jcpy.1208
The QR Collective (2023). Reflexive quality criteria: Questions and indicators for purpose-driven special education qualitative research. Exceptional Children, 89(4), 449–466. http://doi.org/10.1177/00144029231168106
Tincani, M., & Travers, J. (2022). Questionable research practices in single-case experimental designs: Examples and possible solutions. In W. O’Donohue, A. Masuda, & S. Lilienfeld (Eds.), Avoiding questionable research practices in applied psychology (pp. 269–285). Springer International Publishing.
Toste, J. R., Talbott, E., & Cumming, M. M. (2023). Special issue preview: Introducing the next generation of quality indicators for research in special education. Exceptional Children, 89(4), 357–358. http://doi.org/10.1177/00144029231174106
United States Department of Education (2023). Special education research grants program request for applications (ALN 84: 324A). National Center for Special Education Research, Institute of Education Sciences.
van Dijk, W., Schatschneider, C., & Hart, S. A. (2021). Open science in education sciences. Journal of Learning Disabilities, 54(2), 139–152. http://doi.org/10.1177/0022219420945267
Van’t Veer, A. E., & Giner-Sorolla, R. (2016). Pre-registration in social psychology—A discussion and suggested template. Journal of Experimental Social Psychology, 67, 2–12. http://doi.org/10.1016/j.jesp.2016.03.004
Willroth, E. C., & Atherton, O. E. (2024). Best laid plans: A guide to reporting preregistration deviations. Advances in Methods and Practices in Psychological Science, 7(1). http://doi.org/10.1177/25152459231213802