SIPS Products

SIPS 2023

Resource: Replications & Reversals across Social Sciences (by FORRT)

This community-driven collection of replicated effects (https://forrt.org/reversals/) lists papers, citations, sample sizes, study designs, and effect sizes of original and replication work. Scholars from varied backgrounds and disciplines are invited to contribute effects from their respective fields. The resource already includes 550+ effects from more than 20 disciplines. See this document for more information.

SIPS 2022

Paper: Guide for Editors on Getting Started with Open Science

This guide by Silverstein et al. outlines steps that editors can take to implement open policies and practices at their journals, and goes through the what, why, how, and worries of each policy and practice. This paper introduces and summarizes the guide, the full version of which is available at https://doi.org/10.31219/osf.io/hstcx

Paper: Promoting Civility in Formal and Informal Open Science Contexts

This paper by Darda et al. presents five scientifically supported recommendations for promoting interpersonal civility in formal and informal open-science contexts. https://doi.org/10.31234/osf.io/rfkyu

SIPS 2021

Paper: Transparency and Validity in Coding Open-Ended Data for Quantitative Analysis

This paper by Conry-Murray et al. assesses current reporting practices for quantitative coding of open-ended data and proposes guidelines for transparent reporting informed by concerns with replicability, content validity, and statistical validity. https://doi.org/10.31234/osf.io/86y9f

Paper: Checklist for transparent reporting of reaction time pre-processing

This checklist is meant to facilitate transparent reporting of reaction time pre-processing for research, editorial, and teaching purposes. It resulted from the unconference session and hackathon “(Too) many shades of reaction time data preprocessing” led by Krzysztof Cipora and Hannah D. Loenneker. All collaborators and more detailed information, including a systematic literature search, best practice survey, and multiverse analysis, can be found in the following publication: https://doi.org/10.1016/j.cortex.2023.11.012

Paper: Many Modelers

van Dongen, N. N. N., Finnemann, A., de Ron, J., Tiokhin, L., Wang, S. B., Algermissen, J., Altmann, E. C., Chuang, L., Dumbravă, A., Bahník, Š., Fünderich, J. H., Geiger, S. J., Gerasimova, D., Golan, A., Herbers, J., Jekel, M., Lin, Y., Moreau, D., Oberholzer, Y., … Borsboom, D. (2022, August 24). Many Modelers. PsyArXiv. https://doi.org/10.31234/osf.io/r5yfz

Resource: Data Management for Psychological Science: A Crowdsourced Syllabus

This syllabus provides detailed descriptions of data management topics, resources, and activities that can be used to create a course or workshop on data management. It is based on a session facilitated by DeBolt, M., Herrera-Bennett, A., Lawson, K. M., Schiavone, S., & Wysocki, A. (please see the end of the document for a complete list of contributors).

Resource: Embedding open and reproducible science into teaching: A bank of lesson plans and resources

Shared bank of teaching resources and lesson plans on the broad topics of open scholarship, open science, replication, and reproducibility. Pownall, M., Azevedo, F., Aldoh, A., Elsherif, M., Vasilev, M., Pennington, C. R., Robertson, O., Vel Tromp, M., Liu, M., Makel, M. C., Tonge, N. A., Moreau, D., Horry, R., Shaw, J., Tzavella, L., McGarrigle, R., Talbot, C., & Parsons, S. https://doi.org/10.1037/stl0000307

Paper: Guidelines to Improve Internationalization in Psychological Science

Puthillam, A., Montilla Doble, L. J., Delos Santos, J. J., Elsherif, M. M., Steltenpohl, C. N., Moreau, D., Pownall, M., & Kapoor, H. Guidelines to improve internationalization in psychological science. https://psyarxiv.com/2u4h5/

Paper: Missing Data and Multiple Imputation Decision Tree

Practical guidelines for researchers on examining their data for missingness and deciding how to handle it. Woods, A. D., Davis-Kean, P., Halvorson, M., King, K., Logan, J., Xu, M., Bainter, S., Brown, D., Clay, J. M., Cruz, R. A., Elsherif, M. M., Gerasimova, D., Joyal-Desmarais, D., Moreau, D., Nissen, J., Schmidt, K., Uzdavines, A., Van Dusen, B., & Vasilev, M. https://psyarxiv.com/mdw5r/

SIPS 2020

Resource: Teaching and Mentoring Open Science

A mentoring statement and syllabus modules for teachers and mentors who want to teach and model open science to their students. Kim, M. H., Woods, A. D., Ellis, A., & Davis-Kean, P.

SIPS 2019

Paper: Not All Effects Are Indispensable: Psychological Science Requires Verifiable Lines of Reasoning for Whether an Effect Matters

Anvari, F., Kievit, R. A., Lakens, D., Pennington, C. R., Przybylski, A. K., Tiokhin, L., Wiernik, B. M., & Orben, A. (2022). Not all effects are indispensable: Psychological science requires verifiable lines of reasoning for whether an effect matters. Perspectives on Psychological Science.

https://journals.sagepub.com/doi/full/10.1177/17456916221091565

Template: EEG ERP Preregistration Template

Govaart, G. H., Schettino, A., Helbling, S., Mehler, D. M. A., Ngiam, W. X. Q., Moreau, D., … Paul, M. (2022). EEG ERP Preregistration Template. https://osf.io/preprints/metaarxiv/4nvpt/

Paper: Seven Steps Toward More Transparency in Statistical Practice

Wagenmakers, E. J., Sarafoglou, A., Aarts, S., et al. (2021). Seven steps toward more transparency in statistical practice. Nature Human Behaviour, 5, 1473–1480.
https://doi.org/10.1038/s41562-021-01211-8

Paper: Global Engagement Task Force Report

This report makes recommendations for improving inclusion and access for scholars from regions outside the United States, Canada, and Western Europe. Steltenpohl, C. N., Doble, L. J. M., Basnight-Brown, D., Dutra, N. B., Belaus, A., Kung, C. C., Onie, S., Seernani, D., Chen, S.-C., Burin, D., & Darda, K. M. (2021). Society for the Improvement of Psychological Science Global Engagement Task Force Report. https://psyarxiv.com/4upqd/

Resource: Nudging Open Science

A toolbox of resources and nudges for advancing open-science practices. Robson, S. G., Baum, M. A., Beaudry, J., Beitner, J., Brohmer, H., Chin, J. M., Jasko, K., Kouros, C. D., Laukkonen, R. E., Moreau, D., Searston, R. A., Slagter, H. A., Steffens, N. K., & Tangen, J. M. https://psyarxiv.com/zn7vt/

Tool: Tenzing: Documenting contributorship with CRediT

A web application for reporting which people made which contributions to a scientific project using the Contributor Roles Taxonomy (CRediT). Holcombe, A., Kovacs, M., Aust, F., & Aczel, B. https://doi.org/10.31222/osf.io/b6ywe

A product developed out of the SIPS 2019 conference session “Replace journals’ writing-based authorship guidelines with a contributorship model.” Holcombe, A. O., Vazire, S., Chartier, C., Ebersole, C., Giner-Sorolla, R., Haroz, S., Moreau, D., Primbs, M., Ling, M., Werner, K., Schnyder, N., Adie, J., Crook, Z., Smout, C., Ribeiro, G., Tangen, J., Aczel, B., Thibault, R., Searston, R., Van ‘t Veer, A., & Schmalz, X. (2019). https://osf.io/pzyn3/

Paper: Non-Interventional, Reproducible, and Open (NIRO) Systematic Reviews

Guidelines for preparing and preregistering a systematic review protocol of non-interventional studies.
Pickering, J.S., Topor, M., Barbosa Mendes, A., Bishop, D. V. M., Büttner, F. C., Henderson, E. L., … Kothe, E.J. (2020). Non-Interventional, Reproducible, and Open (NIRO) Systematic Review Guidelines v0.1. https://osf.io/f3brw/wiki/home/

Template: A Crowdsourced Effort to Develop a Lab Manual Template

A large hackathon group has developed a template to help researchers develop their own lab manuals. https://bit.ly/2FdVt8F

Editorial: Rule out conflicts of interest in psychology awards

Stoevenbelt, A. H., Nuijten, M. B., Pauli, B. E., & Wicherts, J. M. (2019). Nature, 572, 312. doi: 10.1038/d41586-019-02429-3

Resource: BrainPower – Resources for power analysis in neuroimaging

A hackathon group continues its work to develop tools for power analysis in neuroimaging. Read more here: https://brainpower.readthedocs.io/en/latest/index.html

Blog: A fruitful rendezvous at SIPS: Neuroimagers meet study preregistration advocates

Algermissen, J., Bartlett, J., Gau, R., Heunis, S., Klapwijk, E., Mazor, M., Paul, M., Schettino, A., & Mehler, D. (2019). Blog post featuring resources for neuroimagers on preregistration, power analysis, and more.

Resource: Free, Open Datasets

Brick, C., Botzet, L., Costello, C. K., Batruch, A., Arslan, R. C., Kline, M., Sommet, N., Green, J., Nuijten, M. B., Conley, M. A., Richardson, T., Sorhagen, N., Collentine, A. O., Feldman, G., Feingold, F., Manley, H., & Mullarkey, M. C. (2019). Directory of free, open psychological datasets. https://doi.org/10.17605/OSF.IO/TH8EW

SIPS 2018

Template: Secondary Data Analysis Pre-Registration

Van den Akker, O., Weston, S. J., Campbell, L., Chopik, W. J., Damian, R. I., Davis-Kean, P., Hall, A., Kosie, J. E., Kruse, E., Olsen, J., Ritchie, S. J., Valentine, K. D., van ‘t Veer, A. E., & Bakker, M. (2019, November 20). Preregistration of secondary data analysis: A template and tutorial. https://doi.org/10.31234/osf.io/hvfmr

Template: Secondary Data Analysis Pre-Registration

Weston, S. J., Mellor, D. T., Bakker, M., Van den Akker, O., Campbell, L., Ritchie, S. J., Chopik, W. J., Damian, R. I., Kosie, J., Soderberg, C. K., Ebersole, C. R., Brown, B., Davis-Kean, P., Hall, A., Kruse, E., Olsen, J., Valentine, K. D., & Nguyen, T. T. (2018). Secondary data preregistration. Retrieved from https://osf.io/x4gzt

Paper: 7 Easy Steps to Open Science: An Annotated Reading List

Crüwell, S., Van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J. C., Orben, A., Parsons, S., & Schulte-Mecklenbeck, M. (In press). 7 easy steps to open science: An annotated reading list. Zeitschrift für Psychologie. Preprint retrieved from https://psyarxiv.com/cfzyx.

Paper: The emerging relationship between clinical psychology and the credibility movement

Reardon, K. W., Corker, K. S., & Tackett, J. L. (In press). The emerging relationship between clinical psychology and the credibility movement. The Behavior Therapist. [Preprint at https://psyarxiv.com/46rk5]

Resource: FORRT (Framework for Open and Reproducible Research Training)

FORRT — the Framework for Open and Reproducible Research Training (forrt.org) — is an interdisciplinary community of 950+ early career scholars aiming to integrate open science principles into higher education as well as to advance research transparency, reproducibility, rigor, and ethics through pedagogical reform aligned with justice, equity, diversity, inclusion, and accessibility.

Resource: Media training materials

How to communicate with journalists: https://osf.io/2dk6x/

Responsible communication with the public (including discussion of how to handle preprints): https://osf.io/7g8et/

Resource: Five questions to ask when reading science news

Strategies to identify research that is reliable: https://osf.io/qxv4n/

Resource: Tools & Tips to Manage Your Lab

A list of tools to help researchers manage their students, projects, and report writing: https://osf.io/r5my6/

Resource: Psych-DS

Psych-DS is an in-progress community attempt to define a standard way of formatting and documenting scientific datasets.

SIPS 2017

Resource: Psychological Science Accelerator

The Psychological Science Accelerator is a decentralized, massive, international collaboration of psychological researchers; read more here.

Resource: Open Science Video Series (2017, with COS and SPSP)

During SIPS 2017, five short videos were recorded to serve as an introduction to major open science topics. http://spsp.org/resources/multimedia/experts/openscience

Paper: Pros and cons of replication research

Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2017). Making replications mainstream. Behavioral and Brain Sciences. doi: https://doi.org/10.1017/S0140525X17001972

OA version: https://psyarxiv.com/4tg9c/

Paper: Improving peer review

Davis, W. E., Giner-Sorolla, R., Lindsay, D. S., Lougheed, J. P., Makel, M. C., Meier, M. E., Sun, J., Vaughn, L. A., & Zelenski, J. M. (2018). Peer-review guidelines promoting replicability and transparency in psychological science. Advances in Methods and Practices in Psychological Science, 1(4), 556–573. doi: https://doi.org/10.1177/2515245918806489

OA version: https://kar.kent.ac.uk/69135/

Paper: Increasing transparency of analysis for pre-existing data

Weston, S. J., Ritchie, S. J., Rohrer, J. M., & Przybylski, A. K. (2019). Recommendations for increasing the transparency of analysis of preexisting data sets. Advances in Methods and Practices in Psychological Science, 2(3), 214–227.

OA version: https://psyarxiv.com/zmt3q/

Paper: Taxonomy for PsyArXiv Preprints

Brown, B. T., Eerland, A., Condon, D. M., Pfeiffer, N., Hoplock, L., Columbus, S., Kline, M. E., Moshontz, H., Binion, G., Iskiwitch, C., & Forscher, P. S. (2017). Retrieved from link.

SIPS 2016

Resource: PsyArXiv

PsyArXiv is a preprint server for psychology; read more here.

Resource: StudySwap

StudySwap is a community “swap board” where researchers can connect to share or trade resources, including study participants; read more here.

Paper: Constraints on generality (COG)

Simons, D. J., Shoda, Y., & Lindsay, D. S. (2017). Constraints on Generality (COG): A proposed addition to all empirical papers. Perspectives on Psychological Science, 12, 1123-1128. doi: https://doi.org/10.1177/1745691617708630 OA version: https://psyarxiv.com/w9e3r/

Paper: Experiences from the first ever SIPS meeting

Wyble, B. (2016). How to win friends and improve science: Notes from the first meeting of SIPS (the Society for the Improvement of Psychological Science). Attention, Perception, & Psychophysics, 78, 1529-1530. doi: 10.3758/s13414-016-1180-x