SIPS Products

If you have a project that started or was developed at SIPS, or have updates on a listed product, we would love to hear about it! Please email sips@improvingpsych.org to have it added to this list or updated.

Resources
Templates & Tools
Papers & Other Writing

Resources

Replications & Reversals across Social Sciences (by FORRT)[SIPS 2023]

This community-driven collection of replicated effects (https://forrt.org/reversals/) lists papers, citations, sample sizes, study designs, and effect sizes of original and replication work. Scholars from varied backgrounds and disciplines are invited to contribute effects from their respective fields. The resource already includes 550+ effects from more than 20 disciplines. See this document for more information.

Data Management for Psychological Science: A Crowdsourced Syllabus [SIPS 2021]

This syllabus provides detailed descriptions of data management topics, resources, and activities that can be used to create a course or workshop on data management. It is based on a session facilitated by DeBolt, M., Herrera-Bennett, A., Lawson, K. M., Schiavone, S., & Wysocki, A. (please see the end of the document for a complete list of contributors).

Embedding open and reproducible science into teaching: A bank of lesson plans and resources [SIPS 2021]

Shared bank of teaching resources and lesson plans on the broad topics of open scholarship, open science, replication, and reproducibility. Pownall, M., Azevedo, F., Aldoh, A., Elsherif, M., Vasilev, M., Pennington, C. R., Robertson, O., Vel Tromp, M., Liu, M., Makel, M. C., Tonge, N. A., Moreau, D., Horry, R., Shaw, J., Tzavella, L., McGarrigle, R., Talbot, C., & Parsons, S. https://doi.org/10.1037/stl0000307

Teaching and Mentoring Open Science [SIPS 2020]

A mentoring statement and syllabus modules for teachers and mentors who want to teach and model open science to their students. Kim, M. H., Woods, A. D., Ellis, A., & Davis-Kean, P.

Nudging Open Science [SIPS 2019]

A toolbox of resources and nudges for advancing open-science practices. Robson, S. G., Baum, M. A., Beaudry, J., Beitner, J., Brohmer, H., Chin, J. M., Jasko, K., Kouros, C. D., Laukkonen, R. E., Moreau, D., Searston, R. A., Slagter, H. A., Steffens, N. K., & Tangen, J. M. https://psyarxiv.com/zn7vt/

BrainPower – Resources for power analysis in neuroimaging [SIPS 2019]

A hackathon group continues its work to develop tools for power analysis in neuroimaging. Read more here: https://brainpower.readthedocs.io/en/latest/index.html

Free, Open Datasets [SIPS 2019]

Brick, C., Botzet, L., Costello, C. K., Batruch, A., Arslan, R. C., Kline, M., Sommet, N., Green, J., Nuijten, M. B., Conley, M. A., Richardson, T., Sorhagen, N., Collentine, A. O., Feldman, G., Feingold, F., Manley, H., & Mullarkey, M. C. (2019). Directory of free, open psychological datasets. https://doi.org/10.17605/OSF.IO/TH8EW

Taxonomy for PsyArXiv Preprints [SIPS 2017]

Brown, B. T., Eerland, A., Condon, D. M., Pfeiffer, N., Hoplock, L., Columbus, S., Kline, M. E., Moshontz, H., Binion, G., Iskiwitch, C., & Forscher, P. S. (2017). Retrieved from link.

FORRT (Framework for Open and Reproducible Research Training) [SIPS 2018]

FORRT — the Framework for Open and Reproducible Research Training (forrt.org) — is an interdisciplinary community of 950+ early-career scholars that aims to integrate open science principles into higher education and to advance research transparency, reproducibility, rigor, and ethics through pedagogical reform aligned with justice, equity, diversity, inclusion, and accessibility.

Media training materials [SIPS 2018]

How to communicate with journalists: https://osf.io/2dk6x/

Responsible communication with the public (including discussion of how to handle preprints): https://osf.io/7g8et/

Five questions to ask when reading science news [SIPS 2018]

Strategies to identify research that is reliable: https://osf.io/qxv4n/

Tools & Tips to Manage Your Lab [SIPS 2018]

A list of tools for researchers to use to help manage their students, projects, and writing reports: https://osf.io/r5my6/

Psych-DS [SIPS 2018]

Psych-DS is an ongoing community effort to define a standard way of formatting and documenting scientific datasets.

Psychological Science Accelerator [SIPS 2017]

The Psychological Science Accelerator is a decentralized, massive, international collaboration of psychological researchers; read more here.

Open Science Video Series (2017, with COS and SPSP) [SIPS 2017]

During SIPS 2017, five short videos were recorded as an introduction to major topics in open science. http://spsp.org/resources/multimedia/experts/openscience

PsyArXiv [SIPS 2016]

PsyArXiv is a preprint server for psychology; read more here.

StudySwap [SIPS 2016]

StudySwap is a community “swap board” where researchers can connect to share or trade resources, including study participants; read more here.

Templates & Tools

EEG ERP Preregistration Template [SIPS 2019]

Govaart, G. H., Schettino, A., Helbling, S., Mehler, D. M. A., Ngiam, W. X. Q., Moreau, D., … Paul, M. (2022). EEG ERP Preregistration Template. https://osf.io/preprints/metaarxiv/4nvpt/

Tenzing: A tool for documenting contributorship with CRediT [SIPS 2019]

A web application for reporting which people made which contributions to a scientific project using the Contributor Roles Taxonomy. Holcombe, A., Kovacs, M., Aust, F., & Aczel, B. https://doi.org/10.31222/osf.io/b6ywe.

Session: “Replace journals’ writing-based authorship guidelines with a contributorship model.” Holcombe, A. O., Vazire, S., Chartier, C., Ebersole, C., Giner-Sorolla, R., Haroz, S., Moreau, D., Primbs, M., Ling, M., Werner, K., Schnyder, N., Adie, J., Crook, Z., Smout, C., Ribeiro, G., Tangen, J., Aczel, B., Thibault, R., Searston, R., Van ‘t Veer, A., & Schmalz, X. (2019). https://osf.io/pzyn3/

A Crowdsourced Effort to Develop a Lab Manual Template [SIPS 2019]

A large hackathon group has developed a template to help researchers develop their own lab manuals. https://bit.ly/2FdVt8F

Secondary Data Analysis Pre-Registration Template [SIPS 2018]

Preprint: Van den Akker, O., Weston, S. J., Campbell, L., Chopik, W. J., Damian, R. I., Davis-Kean, P., Hall, A., Kosie, J. E., Kruse, E., Olsen, J., Ritchie, S. J., Valentine, K. D., van ‘t Veer, A. E., & Bakker, M. (2019, November 20). Preregistration of secondary data analysis: A template and tutorial. https://doi.org/10.31234/osf.io/hvfmr

Project: Weston, S. J., Mellor, D. T., Bakker, M., Van den Akker, O., Campbell, L., Ritchie, S. J., Chopik, W. J., Damian, R. I., Kosie, J., Soderberg, C. K., Ebersole, C. R., Brown, B., Davis-Kean, P., Hall, A., Kruse, E., Olsen, J., Valentine, K. D., & Nguyen, T. T. (2018). Secondary data preregistration. Retrieved from https://osf.io/x4gzt

Papers & Other Writing

Guide for Editors on Getting Started with Open Science [SIPS 2022]

This guide by Silverstein et al. outlines steps that editors can take to implement open policies and practices within their journal, and goes through the what, why, how, and worries of each policy and practice. This paper introduces and summarizes the guide, which can be found here in its full version: https://doi.org/10.31219/osf.io/hstcx

Promoting Civility in Formal And Informal Open Science Contexts [SIPS 2022]

This paper by Darda et al. presents five scientifically supported recommendations for promoting interpersonal civility in formal and informal open-science contexts. https://doi.org/10.31234/osf.io/rfkyu

Transparency and Validity in Coding Open-Ended Data for Quantitative Analysis [SIPS 2021]

This paper by Conry-Murray et al. assesses current reporting practices for quantitative coding of open-ended data and proposes guidelines for transparent reporting informed by concerns with replicability, content validity, and statistical validity. https://doi.org/10.31234/osf.io/86y9f

Checklist for transparent reporting of reaction time pre-processing [SIPS 2021]

This checklist is meant to facilitate transparent reporting of reaction time pre-processing for research, editorial, and teaching purposes. It resulted from the unconference session and hackathon “(Too) many shades of reaction time data preprocessing” led by Krzysztof Cipora and Hannah D. Loenneker. All collaborators and more detailed information, including a systematic literature search, best practice survey, and multiverse analysis, can be found in the following publication: https://doi.org/10.1016/j.cortex.2023.11.012

Many Modelers [SIPS 2021]

van Dongen, N. N. N., Finnemann, A., de Ron, J., Tiokhin, L., Wang, S. B., Algermissen, J., Altmann, E. C., Chuang, L., Dumbravă, A., Bahník, Š., Fünderich, J. H., Geiger, S. J., Gerasimova, D., Golan, A., Herbers, J., Jekel, M., Lin, Y., Moreau, D., Oberholzer, Y., … Borsboom, D. (2022, August 24). Many Modelers. PsyArXiv. https://doi.org/10.31234/osf.io/r5yfz

Guidelines to Improve Internationalization in Psychological Science [SIPS 2021]

Putillam, A., Montilla Doble, L. J., Delos Santos, J. J., Elsherif, M. M., Steltenpohl, C. N., Moreau, D., Pownall, M., & Kapoor, H. Guidelines to improve internationalization in psychological science. https://psyarxiv.com/2u4h5/ 

Missing Data and Multiple Imputation Decision Tree [SIPS 2021]

Practical guidelines for researchers when examining their data for missingness and making decisions about how to handle that missingness. Woods, A. D., Davis-Kean, P., Halvorson, M., King, K., Logan, J., Xu, M., Bainter, S., Brown, D., Clay, J. M., Cruz, R. A., Elsherif, M. M., Gerasimova, D., Joyal-Desmarais, D., Moreau, D., Nissen, J., Schmidt, K., Uzdavines, A., Van Dusen, B., & Vasilev, M. https://psyarxiv.com/mdw5r/

Not All Effects Are Indispensable: Psychological Science Requires Verifiable Lines of Reasoning for Whether an Effect Matters [SIPS 2019]

Anvari, F., Kievit, R. A., Lakens, D., Pennington, C. R., Przybylski, A. K., Tiokhin, L., Wiernik, B. M., & Orben, A. (2022). Not all effects are indispensable: Psychological science requires verifiable lines of reasoning for whether an effect matters. Perspectives on Psychological Science. https://journals.sagepub.com/doi/full/10.1177/17456916221091565

Seven Steps Toward More Transparency in Statistical Practice [SIPS 2019]

Wagenmakers, E. J., Sarafoglou, A., Aarts, S., et al. (2021). Seven steps toward more transparency in statistical practice. Nature Human Behaviour, 5, 1473–1480.
https://doi.org/10.1038/s41562-021-01211-8

Global Engagement Task Force Report [SIPS 2019]

This report makes recommendations for improving inclusion and access for scholars from regions outside the United States, Canada, and Western Europe. Steltenpohl, C. N., Doble, L. J. M., Basnight-Brown, D., Dutra, N. B., Belaus, A., Kung, C. C., Onie, S., Seernani, D., Chen, S.-C., Burin, D., & Darda, K. M. (2021). Society for the Improvement of Psychological Science Global Engagement Task Force Report. https://psyarxiv.com/4upqd/

Non-Interventional, Reproducible, and Open (NIRO) Systematic Reviews [SIPS 2019]

Guidelines for preparing and preregistering a systematic review protocol of non-interventional studies.
Pickering, J. S., Topor, M., Barbosa Mendes, A., Bishop, D. V. M., Büttner, F. C., Henderson, E. L., … Kothe, E. J. (2020). Non-Interventional, Reproducible, and Open (NIRO) Systematic Review Guidelines v0.1. https://osf.io/f3brw/wiki/home/

Rule out conflicts of interest in psychology awards [SIPS 2019]

Stoevenbelt, A. H., Nuijten, M. B., Pauli, B. E., & Wicherts, J. M. (2019). Nature, 572, 312. doi: 10.1038/d41586-019-02429-3

Blog post: A fruitful rendezvous at SIPS: Neuroimagers meet study preregistration advocates [SIPS 2019]

Algermissen, J., Bartlett, J., Gau, R., Heunis, S., Klapwijk, E., Mazor, M., Paul, M., Schettino, A., & Mehler, D. (2019). Blog post featuring resources for neuroimagers on preregistration, power analysis, and more.

7 Easy Steps to Open Science: An Annotated Reading List [SIPS 2018]

Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J. C., Orben, A., Parsons, S., & Schulte-Mecklenbeck, M. (in press). Zeitschrift für Psychologie. Preprint retrieved from https://psyarxiv.com/cfzyx.

The emerging relationship between clinical psychology and the credibility movement [SIPS 2018]

Reardon, K. W., Corker, K. S., & Tackett, J. L. (In press). The emerging relationship between clinical psychology and the credibility movement. The Behavior Therapist. [Preprint at https://psyarxiv.com/46rk5]

Pros and cons of replication research [SIPS 2017]

Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2017). Making replications mainstream. Behavioral and Brain Sciences. doi: https://doi.org/10.1017/S0140525X17001972
OA version: https://psyarxiv.com/4tg9c/

Improving peer review [SIPS 2017]

Davis, W. E., Giner-Sorolla, R., Lindsay, D. S., Lougheed, J. P., Makel, M. C., Meier, M. E., Sun, J., Vaughn, L. A., & Zelenski, J. M. (2018). Peer-review guidelines promoting replicability and transparency in psychological science. Advances in Methods and Practices in Psychological Science, 1(4), 556–573. doi: https://doi.org/10.1177/2515245918806489
OA version: https://kar.kent.ac.uk/69135/

Increasing transparency of analysis for pre-existing data [SIPS 2017]

Weston, S. J., Ritchie, S. J., Rohrer, J. M., & Przybylski, A. K. (2019). Recommendations for increasing the transparency of analysis of preexisting data sets. Advances in Methods and Practices in Psychological Science, 2(3), 214–227.
OA version: https://psyarxiv.com/zmt3q/

Constraints on generality (COG) [SIPS 2016]

Simons, D. J., Shoda, Y., & Lindsay, D. S. (2017). Constraints on Generality (COG): A proposed addition to all empirical papers. Perspectives on Psychological Science, 12, 1123–1128. doi: https://doi.org/10.1177/1745691617708630
OA version: https://psyarxiv.com/w9e3r/

Experiences from the first ever SIPS meeting [SIPS 2016]

Wyble, B. (2016). How to win friends and improve science: Notes from the first meeting of SIPS (the Society for the Improvement of Psychological Science). Attention, Perception, & Psychophysics, 78, 1529-1530. doi: 10.3758/s13414-016-1180-x