There are two competing tensions in medical information: researchers want access to and reuse of large amounts of patient data beyond the purposes for which those data were collected in the first place, while IRBs require that individual patient information be kept private and not revealed.
There are three locks that can be used to keep private information private: social, technological, and legal.
The social lock marks information as something that should not be reverse engineered beyond the limits permitted by the donor. This lock is the default first defense.
The technological lock makes it difficult to re-identify de-identified information. This lock kicks in after the social lock has been breached.
The legal lock provides penalties for those who breach both the first and the second locks.
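To make the technological lock concrete, here is a minimal sketch of one common de-identification technique: replacing a direct identifier with a keyed hash (pseudonymization). The function name, key, and record fields are illustrative assumptions, not part of any particular standard or system.

```python
# Sketch of the "technological lock": pseudonymization via keyed hashing.
# All names and values below are illustrative, not from any real dataset.
import hmac
import hashlib

def pseudonymize(patient_id: str, secret_key: bytes) -> str:
    """Replace a direct identifier with an HMAC-SHA256 pseudonym.

    Without the secret key, reversing the pseudonym requires guessing
    the identifier, which raises the cost of re-identification.
    """
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()

# The same ID always maps to the same pseudonym under a given key,
# so records can still be linked for research without exposing the ID.
key = b"keep-this-key-offline"       # hypothetical key, stored separately
record = {"patient_id": "MRN-001234", "diagnosis": "hypertension"}
deidentified = {
    "pid": pseudonymize(record["patient_id"], key),
    "diagnosis": record["diagnosis"],
}
```

Note that pseudonymization alone is not full anonymization: the remaining fields can still act as quasi-identifiers, which is why the social and legal locks matter even after this one is applied.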
A few points to keep in mind:
1. There is no fail-safe defense against a breach of privacy. If private information is collected and stored, it is only a matter of time and technology before it is breached.
2. The more information is collected and stored, the greater the chance of point 1 above. Both the amount of information and the length of time it is kept in one place increase the likelihood of a breach, by enlarging the corpus of information in which patterns can be found and the time available to find them. Of course, it is the same volume and duration of availability that make the information useful to researchers in the first place.
3. Expectations of privacy are changing.
4. Until privacy expectations have changed so much that keeping information private is no longer worthwhile, patients' willingness to donate their information will be proportional to the assurance of privacy they receive.
5. The expectation of privacy will be proportional to the potential harm from a breach.