In psychology there is little tradition of making the data on which researchers base their statistical analyses freely available to others after publication. This makes it difficult for anyone to independently reanalyse research results, and prevents small data sets from being combined for meta-analysis, or large ones from being mined for fresh insights or perspectives.

Psychologists need to rethink their reluctance to share data. Their discipline is 'softer' than some others: rarely do data on issues such as playground bullying or the usefulness of psychotherapy reveal really clear-cut answers. This makes the rigour with which the data are handled fundamental to research outcomes — and increases the desirability of having them open to examination by peers.

The need for more data sharing has just been amply demonstrated by Jelte Wicherts, a psychologist specializing in research methods at the University of Amsterdam, who set out to check the robustness of statistical analyses in papers published in top psychology journals.

He selected the November and December 2004 issues of four journals published by the American Psychological Association (APA), which requires its authors to agree to share their data with other researchers after publication. In June 2005, Wicherts wrote to each corresponding author requesting data, in full confidence, for simple reanalysis. Six months and several hundred e-mails later, he abandoned the mission, having received only a quarter of the data sets. He reported his failure in an APA journal in October (J. M. Wicherts et al. Am. Psychol. 61, 726–728; 2006).

Researchers often have valid reasons for constraining access to their raw data, such as the privacy of research subjects. But data from most studies based on confidential information can be coded in a way that will guarantee their subjects' anonymity. The few cases where this is not possible can be exempted from the move towards data sharing.

A second factor deterring openness is a natural desire to retain exclusive access to data that took years of care and attention to collect. Like many researchers in other disciplines, psychologists fear that if different analytical approaches are brought to bear on their data, different conclusions could be drawn, casting doubt on their competence — or even their integrity. But in most cases, if data have been collected, selected and analysed correctly, researchers have little to fear in this regard, and the resulting discussion is likely to prove enlightening for the field as a whole.

An associated concern is that data could be wilfully misinterpreted by anyone with a political agenda. But this should not prevent the sharing of data sets: false interpretations of the data will fail to find any foothold in the community as a whole.

A less frequently articulated reason for resistance to data sharing is that some researchers are simply unable or unwilling to record and present their data in an unambiguous, reader-friendly and archivable form.

The APA's editors and publishers are now planning their response to Wicherts' report. One result should be the acceleration of moves, already under discussion, to require the deposition of data as supplementary electronic material in APA databases. Where the APA leads, other psychology journals are likely to follow.

Granting bodies must also play a part. In 2003, the US National Institutes of Health introduced rules requiring the public sharing of data in psychology studies for grants exceeding $500,000, allowing exemptions where confidentiality issues cannot be circumvented. Other agencies should follow suit. And university departments need to do more to teach the basics of note-keeping and data presentation, to prepare their students for an era in which data sharing will increasingly become the norm.