Resources

This page lists some resources that may be useful. 

Fair Information Practices: A Basic History.  I maintain a short (now > 25 pages) history of Fair Information Practices that provides essential text, background, citations, some information about ongoing use of FIPs, and a touch of analysis.  I originally intended this history as an article for Wikipedia, but I found the process for submission unattractive, and I wanted to keep control of the document.  Last revised April 4, 2014.  The paper is here.  Readers are welcome to suggest additional materials, references, and corrections for the history. The paper is also at http://ssrn.com/abstract=2415020.

New OMB Privacy Standards for Commercial Database Vendors.  In early November 2013, Pam Dixon (World Privacy Forum) and I released a report on the federal government’s Do Not Pay Initiative. What is most important here is that an OMB memo establishes privacy standards for the use of commercial databases in the Initiative. The report praises (!) OMB for its novel use of the federal government’s marketplace power to demand better privacy for consumers. OMB’s action shows that the feds can establish private sector privacy standards without legislation or regulation. The federal government is a big customer for commercial databases, so it could accomplish more by requiring those selling other database products and services to comply with Fair Information Practices as a matter of contract. The states can do the same thing. There’s a lot more in the report.  Data Brokers and the Federal Government: A New Front in the Battle for Privacy Opens is available at the WPF website here.

Patient’s Guide to HIPAA: How to Use the Law to Guard your Health Privacy.  This guide published by the World Privacy Forum is available here.  It is hard to find any comprehensive information about HIPAA aimed at patients.  The guide includes more than 60 FAQs and offers practical advice about how to use the HIPAA rules strategically and otherwise.  Revised 2013 to reflect the changes to HIPAA taking effect on September 23, 2013.

Privacy and Missing Persons after Natural Disasters (2013) (with Professor Joel Reidenberg and others at Fordham Law's Center on Law and Information Policy), available here. This is a major report designed to help the Missing Persons Community of Interest – an independent, informally organized group of humanitarian organizations, companies and volunteers – address privacy in their work to aid victims in coping with natural disasters. Importantly, the report offers suggestions to assist privacy regulators and policy-makers in meeting the special needs of the disaster relief context. The subject is a good illustration of some of the complexities that privacy rules present in emergency situations, and the report identifies how those rules can be applied or changed to resolve those needs. The problems are real but solvable with some effort and good will.

Best Privacy Anywhere (2013).  I drafted a model state law that says that a company must provide consumers in a state the strictest privacy protections that the company applies in any other jurisdiction, domestic or foreign, where it conducts business online for substantially similar activities. The text of the bill is here.

Many of the usual objections to privacy protections for consumers will not be relevant under the legislation. A company cannot claim that a privacy policy is impossible because it will have to adopt the same privacy policy that the company applies elsewhere. A company cannot claim that a privacy policy is unprofitable because the company will be engaging in profitable activity on the same terms in another jurisdiction. A company cannot claim that a privacy policy is too cumbersome because it has to implement the same privacy policy that the company has already applied in another state or country.

For too long, we've watched Internet companies apply one set of privacy rules to their customers in Europe, in Australia, in Hong Kong, or in other countries because the laws in those countries require stronger privacy protections or because consumers in those countries demand better protections. Yet these companies provide consumers here weaker privacy protections largely because they can. There are no general federal privacy laws and no state privacy laws that set comprehensive standards for privacy.

My proposal would raise privacy standards in any state that enacts the bill, and it would do so without the need for lengthy rules or extensive retooling of business models. The same privacy policies and the same business models that work elsewhere will work here.

The approach could not be simpler. Each company need only find the most stringent privacy policy that it has adopted or to which it is subject and then provide the same policy for local consumers. Because all the legal questions and practical difficulties have already been resolved elsewhere, implementing the same policy here will be much easier, faster, and less expensive. Essentially, companies will be able to spread the cost of any privacy policy over a bigger base of customers. In effect, this bill will lower a company’s cost per customer for any given privacy policy.

How would this work in practice? If an Internet company offers social networking services to consumers in France under a privacy policy that says the personally identifiable information will not be shared with third parties for marketing without the express consent of consumers, then it must do the same thing in the state. If an Internet company offers search services to consumers in England and promises to erase personally identifiable data within six months, then it must do so when it does business here.

The legislation establishes a standard for determining whether a privacy policy is more stringent by using the general principles of Fair Information Practices. For each principle, the bill describes how to determine if a particular privacy policy is more stringent. A policy that results in less use or less disclosure of personally identifiable information is more stringent than one that allows more use or disclosure. A policy that provides notice to consumers is more stringent if it provides better notice or earlier notice. The application of these stringency standards will not be complex.

The bill could be enacted with slight modification in most U.S. states.

Lacking in Facts, Independence, and Credibility: The 2011 NAI Annual Compliance Report (2012).  This assessment of the 2011 NAI Annual Compliance Report offers a gloss on the value of privacy self-regulation.  The assessment is here.  A broader and more historical look at privacy self-regulation can be found a few documents lower on this page.

From the Executive Summary of the assessment:

Does the NAI’s 2011 compliance report offer evidence that self-regulation is effective and worthwhile? The conclusions here are that the NAI report does not offer meaningful evidence that self-regulation is effective, that the report hides more than it reveals about the NAI compliance process, and that the value of non-independent self-regulatory audits is unproved.

*****

The NAI report provides carefully selected and edited information about its members, the audit process, the qualifications of its auditors, and the independence of its auditors. The NAI report fails to provide enough context for the few facts that it does provide, uses weasel-worded statements that obscure the degree of compliance or non-compliance by NAI members, and claims credit for compliance with laws that are independent of NAI standards.

Online Privacy: A Reference Handbook (2011).   This book was published by ABC-CLIO as part of its Contemporary World Issues series.  The publisher's listing is here.  Pam Dixon and I wrote this book to provide a starting point for research by high school and college students, scholars, and general readers as well as by legislators, business people, activists, and others.  It includes a detailed background and history of online privacy, a review of current controversies, a perspective on worldwide developments, a chronology, sketches of important individuals, a directory of organizations, and a host of online resources.

Many Failures: A Brief History of Privacy Self-Regulation in the United States (2011).  This report by Pam Dixon and me focuses mostly on industry-supported privacy self-regulation started during the period just before and just after 2000.  Most of these programs disappeared entirely within a few years, once pressure for formal regulation and legislation faded and the political winds blew in other directions.  There is also discussion of the Department of Commerce Safe Harbor program, the Children's Online Privacy Protection Act self-regulatory structure, and the Platform for Privacy Preferences (P3P).  It is hard to find anything of a self-regulatory nature in the privacy arena during the study period that rises much above the level of abject failure.  The report is available at the World Privacy Forum website here.  Whether current privacy self-regulatory programs will be any different from past programs is not addressed in the report, but it appears that history is repeating itself.  The purpose of the report is to remind everyone what happened before.  A version of this paper appeared in the December 2011 issue of Privacy Laws and Business, where it is behind a paywall.  You can read the original (but longer) version here free.

Legislative Proposal for Deidentification of Personal Data.  The Deidentification Dilemma: A Legislative and Contractual Proposal, 21 Fordham Intellectual Property, Media & Entertainment Law Journal 33 (2010). Available here or here.  Deidentification is one method for protecting privacy while permitting other uses of personal information. However, deidentified data is often still capable of being reidentified. The main purpose of this article is to offer a legislative-based contractual solution for the sharing of deidentified personal information while providing protections for privacy. The legislative framework allows a data discloser and a data recipient to enter into a voluntary contract that defines responsibilities and offers remedies to aggrieved individuals. This idea won't solve all problems with the sharing of deidentified data for research or other activities, but it would establish rules and enforceable standards under a statutory scheme. The basic idea is for a federal law, but a state could enact the proposed statute for data within its borders.

The Story of the Banker, State Commission, Health Records, and the Called Loans: An Urban Legend?
I collected the history (with much help from John Fanning) of a story involving the misuse of health records by a banker who served on a state health commission. The question is whether the story is true or whether it constitutes an urban legend. The story has never been properly substantiated in any meaningful way and should not be given credibility. The point is not that health records are never abused; abuses do occur. The point is simply that the banker’s story cannot be accepted as true.

Here is a basic version of the banker story:  A banker on a state health commission had access to records of all the patients in his state who had been diagnosed with cancer. He cross-referenced that list with a list of patients who had outstanding loans at his bank, and called in the loans of those with cancer. The story varies somewhat from source to source. Sometimes the state is identified and sometimes not. Sometimes the commission is a state commission and sometimes a county commission. Sometimes the loans called are mortgages and sometimes they are other types of loans. The bank or the banker has never been named.

The conclusion offered here is that there is no firm basis for believing the story is true, and there are some reasons for concluding that the story is not true. Tracing the story back to its origins, there appear to be two independent points of origin for the story. Both of these sources acknowledged that they had no substantiation for the story. It appears that the story is nothing more than a rumor mentioned by “someone” at a conference. The story was then repeated, embellished, and recited by others so often that the story appeared to have considerable authority behind it. The document is here.  Last version dated December 12, 2011.

Proposal for FTC on Regulating Behavioral Targeting. On January 27, 2010, I submitted a short proposal to the FTC's Privacy Roundtable that suggests a way to regulate behavioral targeting activities based on the length of time that identifiable personal information is retained. I suggest four tiers. Tier 1 is for activities that keep information for no more than 24 hours. No regulations would apply. Tier 2 covers activities that keep information for up to six months. Consumers would have a right to see their information. Tier 3 covers activities that keep information for six months to a year. Consumers would have a right to delete their information. Tier 4 covers longer-term retention; consumers would have to be affirmatively provided a copy of their information and would also have full Fair Information Practice rights. The proposal is here.

HIT Policy Committee Statement.  On September 18, 2009, HHS's HIT Policy Committee invited me to talk about accounting and transparency measures for health care privacy.  My statement is here.

Notes and Observations on Selected Parts of Title XIII, Subtitle D, Privacy, American Recovery and Reinvestment Act Of 2009.  This document offers a preliminary analysis of how the privacy subtitle of the stimulus law will affect the HIPAA privacy rule. This analysis benefited greatly from assistance provided by several people whom I do not have permission to identify. Any errors are mine alone. The analysis is here.  I may change this analysis from time to time if I have further thoughts or receive comments from others.  The current version, dated September 8, 2009, is the eighth revision.  Note that the full text of the subtitle and the relevant parts of the conference committee report are also available on this website here.

Privacy in the Clouds: Risks to Privacy and Confidentiality from Cloud Computing, World Privacy Forum (2009).  The report is available at the WPF website here.

Summary:  This report discusses the issue of cloud computing and outlines its implications for the privacy of personal information as well as its implications for the confidentiality of business and governmental information. The report finds that for some information and for some business users, sharing may be illegal, may be limited in some ways, or may affect the status or protections of the information shared. The report discusses how even when no laws or obligations block the ability of a user to disclose information to a cloud provider, disclosure may still not be free of consequences. The report finds that information stored by a business or an individual with a third party may have fewer or weaker privacy or other protections than information in the possession of the creator of the information. The report, in its analysis and discussion of relevant laws, finds that both government agencies and private litigants may be able to obtain information from a third party more easily than from the creator of the information. A cloud provider’s terms of service, privacy policy, and location may significantly affect a user’s privacy and confidentiality interests.

Principal Findings:

•    Cloud computing has significant implications for the privacy of personal information as well as for the confidentiality of business and governmental information.
•    A user’s privacy and confidentiality risks vary significantly with the terms of service and privacy policy established by the cloud provider.
•    For some types of information and some categories of cloud computing users, privacy and confidentiality rights, obligations, and status may change when a user discloses information to a cloud provider.
•    Disclosure and remote storage may have adverse consequences for the legal status of or protections for personal or business information.
•    The location of information in the cloud may have significant effects on the privacy and confidentiality protections of information and on the privacy obligations of those who process or store the information.
•    Information in the cloud may have more than one legal location at the same time, with differing legal consequences.
•    Laws could oblige a cloud provider to examine user records for evidence of criminal activity and other matters.
•    Legal uncertainties make it difficult to assess the status of information in the cloud as well as the privacy and confidentiality protections available to users.
•    Responses to the privacy and confidentiality risks of cloud computing include better policies and practices by cloud providers, changes to laws, and more vigilance by users.

Red Flag and Address Discrepancy Requirements: Suggestions for Health Care Providers, World Privacy Forum (2008) (with Pam Dixon).  The report is available at the WPF website here. This report describes the FTC's rules defining obligations of creditors (including, perhaps surprisingly, many health care providers) to protect clients against identity theft.  The report suggests red flags that are suitable for health care providers.

Personal Health Records: Why Many PHRs Threaten Privacy.  I wrote this report for the World Privacy Forum in 2008 to show that PHRs can have significant negative consequences for the privacy of consumers who authorize the maintenance of their health records by PHR vendors.  Many, if not most, PHRs are not subject to the HIPAA health privacy rule.  Any consumer who agrees to put his or her health records in a PHR -- especially one that is a commercial, advertising-supported PHR -- is risking the loss of the information in those records to marketers, advertisers, and others.  The report is here and an accompanying consumer advisory is here.

Significant privacy consequences of PHRs not covered under HIPAA can include:

•    Health records in a PHR may lose their privileged status.
•    PHR records can be more easily subpoenaed by a third party than health records covered under HIPAA.
•    Identifiable health information may leak out of a PHR into the marketing system or to commercial data brokers.
•    In some cases, the information in a non-HIPAA covered PHR may be sold, rented, or otherwise shared.
•    It may be easier for consumers to accidentally or casually authorize the sharing of records in a PHR.
•    Consumers may think they have more control over the disclosure of PHR records than they actually do.
•    The linkage of PHR records from different sources may be embarrassing, cause family problems, or have other unexpected consequences.
•    Privacy protections offered by PHR vendors may be weaker than consumers expect and may be subject to change without notice or consumer consent.

• Privacy for Research Data.  I prepared a paper for the Panel on Confidentiality Issues Arising from the Integration of Remotely Sensed and Self-Identifying Data of the National Research Council. The paper attempts to describe privacy rules in the three most important areas relevant to research uses of information involving remotely-sensed and self-identifying data. The three issues are: (1) When is information sufficiently identifiable so that privacy rules apply or privacy concerns attach? (2) When does the collection of personal information fall under regulation? and (3) What rules govern the disclosure of personal information?  The paper appears at Appendix A in the Panel's report:  Putting People on the Map: Protecting Confidentiality with Linked Social-Spatial Data (2007) <http://books.nap.edu/catalog.php?record_id=11865>.

• Consent for Disclosure of Health Records:  Lessons from the Past (2007).  A 1998 Maine health privacy law that required written consent for many health disclosures was so unpopular and impractical that the legislature suspended the law shortly after it took effect. Many of the law’s requirements for written consent were later replaced with expanded authority for nonconsensual disclosures.  I wrote this short paper to provide a review of the history of the Maine law.  The paper is hosted at a discussion forum of the World Privacy Forum's website here.  The paper itself is here.

• Personal Health Records (2007). Joseph Turow, Judith Turow, and I wrote two pieces on privacy and other implications of personal health records.  One was for the American Medical Association's Virtual Mentor (Personalized Marketing of Health Products the 21st Century Way, available here), and the other appeared in the San Francisco Chronicle, Why Marketers Want Inside Your Medicine Cabinet (March 5, 2007), available here.

• Testimony on Health Privacy Studies (2006). The National Committee on Vital and Health Statistics (Department of Health and Human Services) held a hearing on November 30, 2006, on approaches to studying the HIPAA Privacy Rule. In my testimony (available here), I argued that privacy is a fundamental part of health care. There is no need to study the value of privacy. I also said that focusing on privacy knowledge of consumers or the costs of HIPAA would not be productive. I suggested instead that the committee look at privacy issues for health information networks. I proposed four areas of study:

Medical Identity Theft. A health information network is an identity thief’s dream.
Health Scores. I expect that a health network will contribute to the development of individual and family health scores that, like credit scores, will be used to make decisions about people.
Surveillance Capabilities of Health Information Networks. If a network contains information about medical appointments, it can be used by the police to find and detain anyone with an outstanding warrant, overdue tax bill, questionable immigration status, or overdue library book.
Preemption. Neither totally preemptive federal health privacy legislation nor a patchwork quilt of stronger state laws will work in a networked environment. We need to find a middle ground that recognizes structural state legislation while providing greater uniformity in a networked environment. Laws protecting psychiatric, substance abuse, HIV, and genetic information must be accommodated.

• Medical ID theft report (World Privacy Forum, 2006). This report on the theft of medical identities was conceived, researched, and written by Pam Dixon of the World Privacy Forum.  It was her brilliant insight and original research that brought this hidden and significant problem to public attention. I contributed in minor ways.  The report is available at the WPF website here.

• FAQs for victims of medical ID theft (World Privacy Forum, 2006). This FAQ tells victims of medical identity theft how they can use the HIPAA health privacy rule to determine the scope of the problem and to correct their health records. I did much of the work on this resource, along with Pam Dixon. The report is available at the WPF website here. The FAQs were revised in April 2012.

• Trafficking in Health Information: A Widespread Problem (Updated 2007). I prepared this short history of investigations that exposed widespread trafficking in health records by insurance companies, investigative firms, and others.  It includes some discussion of how pretexting has been used to obtain health records.  Investigations in the United States, Canada, and Great Britain dating back as far as the 1970s and as recently as 2006 show similar illicit activities.  Available here.  Updated August 2006 and April 2007.

• The American Way of Privacy (2005). In November 2005, the French National Commission for Information Technologies and Liberties (CNIL), French Senate, and University Paris II held a symposium on Information Technologies: Servitude or Freedom? I presented a paper titled The American Way of Privacy that was later published by the symposium sponsors in French under the title L’approche Américaine: la Régulation par le Congrès, le Marché et le Juge. My original English version is available here.

• Health Privacy Bibliography:  This bibliography is a bit old (2003), and I no longer keep it up to date.  But it may identify a document, hearing, or article on health privacy that you didn't know about.  Available here.

• Privacy: Finding A Balanced Approach to Consumer Options:  This short paper I wrote in 2002 discusses consumer choice regarding secondary use of personal information and considers application of opt-in or opt-out rules for determining how personal information can be used and disclosed.   Available here.

• Fair Information Practices:  A classic statement of Fair Information Practices is reproduced here for reference and convenience. Fair Information Practices are core standards for the privacy of personal information and are based on American and international sources.  See also the basic history of FIPs, listed above on this page.

• The Privacy of Health Information and the Challenge for Data Protection, Eighth International Conference of the Observatory "Giordano Dell'Amore" on the Relations Between Law and Economics, Stresa, Italy (May 1997), available here.  In this paper, I introduce the "paradox of informed consent" for the disclosure of health records.  While it predates the HIPAA health privacy rule, the paper remains relevant to debates over the role of consent in health care disclosures. The paradox of informed consent shows the severe limits of controlling health information using patient consent.

• Taking a Byte Out of History: The Archival Preservation of Federal Computer Records, House of Representatives Report Number 101-978, 101st Congress (1990).  This was the first congressional report that addressed the archival preservation of federal electronic records.  I wrote it for the House Committee on Government Operations when I worked on the staff of one of its subcommittees.  The report (4 MB) is mostly of historical interest today, but many of the issues it raised remain important.  The report is available here.

• Electronic Collection and Dissemination of Information by Federal Agencies: A Policy Overview, House of Representatives Report Number 99-560, 99th Congress (1986).  This congressional report followed a series of hearings that examined policy issues relating to federal agency electronic information.  I wrote this report for the House Committee on Government Operations and its Government Information, Justice, and Agriculture Subcommittee.  It is one of the earliest looks at electronic information policy, just before the explosion of the Internet into popular awareness. This report is mostly of historical interest today, but much of the discussion still has some relevance, especially at the state level where many state agencies continue to exercise inappropriate controls over government information.  The 1986 congressional report (10 MB) is available here.  My thanks to the unnamed individual (you know who you are!) who made this report (and the one above) into a PDF so that they are now available on the Net.

One of the most important points in the report is the recognition that federal agencies can control their information in the marketplace even though works of the federal government are not subject to copyright.  I later explored the issue of copyright-like controls in a law journal article, Twin Evils: Government Copyright and Copyright-Like Controls Over Government Information, 45 Syracuse Law Review 999 (1995).  That article is available here in a typescript version in PDF format.  The 1986 report ultimately led to a change in the Paperwork Reduction Act nearly ten years later (Public Law 104-13, 1995).  The PRA language sought to prevent the agency abuses that the 1986 report found.  The PRA language is:

44 U.S.C. 3506(d)

With respect to information dissemination, each agency shall—
    (1) ensure that the public has timely and equitable access to the agency's public information, including ensuring such access through—
         (A) encouraging a diversity of public and private sources for information based on government public information;
         (B) in cases in which the agency provides public information maintained in electronic format, providing timely and equitable access to the underlying data (in whole or in part); and
         (C) agency dissemination of public information in an efficient, effective, and economical manner;
    (2) regularly solicit and consider public input on the agency’s information dissemination activities;
    (3) provide adequate notice when initiating, substantially modifying, or terminating significant information dissemination products; and
    (4) not, except where specifically authorized by statute—
         (A) establish an exclusive, restricted, or other distribution arrangement that interferes with timely and equitable availability of public information to the public;
         (B) restrict or regulate the use, resale, or redissemination of public information by the public;
         (C) charge fees or royalties for resale or redissemination of public information; or
         (D) establish user fees for public information that exceed the cost of dissemination.

One of the lessons here is that it can take many years after issues are identified before the stars align and Congress manages to enact legislation to address those issues.  By the time the 1995 PRA amendments were enacted, I no longer worked for the House of Representatives, but others who appreciated information policy matters remained and worked to enact needed reforms.  The importance of the PRA restrictions on federal agency dissemination activities is underappreciated today.  Not all agencies know about the requirements or comply with the law.

Back to home