Shoppers: "That's Not What I Signed Up For!"


Attorney Mark D. Rasch is the former head of the U.S. Justice Department's computer crime unit and today serves as Director of Cybersecurity and Privacy Consulting at CSC in Virginia.

Target's ability to mine CRM data got some unwelcome exposure this week from a book excerpt in The New York Times Magazine (a Forbes blogger recapped the excerpt using the headline "How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did," which pretty much tells the story). But as retailers increasingly obtain personal data from consumers without their real knowledge—or any real ability to opt in or out—the legal implications get increasingly murky.

And while retailers are turning up characteristics that customers would never volunteer, some mobile payment systems are digging through a phone's data and raising equally troublesome privacy issues.

In one such system, a Seattle payments firm wants to push a "pay by device" approach, which would forgo a PIN and instead authenticate an Android phone not only through the payment application but also by scraping personal information from the phone itself.

Every device, once used, acquires personal characteristics: the names on your contact list, the applications you have downloaded, how frequently you use them and how you have configured them, each of which contributes to what amounts to a "digital signature" of the user. Thus, when you use a browser to go to a Web site, you are communicating not only your Internet protocol address, but also what type of browser you're using, what version, what settings, what fonts, what sites you have visited recently and a whole host of other information that can be used to personally identify you.
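To make that concrete, here is a minimal sketch, in TypeScript, of how any Web page could combine a handful of attributes the browser exposes automatically into a single identifier. The attribute list and function name are illustrative only, not any particular site's or vendor's actual method.

    // Minimal illustration: combine browser attributes exposed to every page
    // into one compact "digital signature." Attribute list is illustrative.
    async function browserSignature(): Promise<string> {
      const traits = [
        navigator.userAgent,                                      // browser type and version
        navigator.language,                                       // preferred language
        `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display characteristics
        String(new Date().getTimezoneOffset()),                   // time zone
        String(navigator.hardwareConcurrency ?? ""),              // CPU core count
      ].join("|");

      // Hash the combined traits so the site can store a short identifier
      // instead of the raw values.
      const digest = await crypto.subtle.digest(
        "SHA-256",
        new TextEncoder().encode(traits)
      );
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

No permission prompt, no cookie, no login is needed for any of this; the page simply reads what the browser volunteers.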

Indeed, the Electronic Frontier Foundation has on its Web site a tool called Panopticlick that will predict, based upon your browser settings and other information automatically transmitted to a Web site, just how identifiable you are. More than 80 percent of several million visitors to the site were uniquely identifiable, just based on those settings.
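The arithmetic behind that statistic is simple: singling out one visitor among N requires only about log2(N) bits of identifying information, and a browser's settings easily leak that much. A quick back-of-the-envelope calculation (mine, not EFF's):

    // How many bits of identifying information single out one visitor among N?
    const bitsToIdentify = (population: number): number =>
      Math.ceil(Math.log2(population));

    console.log(bitsToIdentify(1_000_000));     // 20 bits for a million visitors
    console.log(bitsToIdentify(7_000_000_000)); // 33 bits for roughly everyone on Earth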

Unfortunately, much like your actual DNA, you have very little ability to meaningfully change this. Sure, you can download a different browser, but all that does is give you a new signature that you have to worry about. So, in this age where data is collected and stored, cross-referenced and mined, is there any real meaning to the terms "opt in" or "opt out"?

In the United States, there is no general "privacy" law. Certain types of information, like financial information or medical information, are protected by specific statutes. Payment-card information is protected to some extent by the Payment Card Industry Data Security Standard (PCI DSS), a contractual obligation between merchants and their financial institutions. But other than that, it's really the wild, wild West out there. Thus, to protect privacy we tend to rely on the old standbys of contract law. Notice. Consent. An ability to opt in or opt out.

But in reality, what we have is a situation where thousands of companies simply post privacy policies on their Web sites, embed them in applications, send them in multipage notices or obscure them in one way or another, knowing that consumers have neither the time nor the inclination to read them. This may be fine for ordinary day-to-day transactions. But when a Web application or a device gathers information that most reasonable people would consider intimately personal, or uses that information in an unusual way, the ordinary "simply click here" of day-to-day life may not be sufficient.

The application in question authenticates a user and his or her device by gathering information about how the user has used the device. Thus, if you use your cell phone as an authenticating device, the payment system will examine how you have used the device, who you have called, who your most frequent contacts are, what applications you have installed and how often you have used them, and then essentially create a digital signature of the phone and its user.
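As a rough sketch of what such a profile might look like, and of how it could be boiled down to a comparable signature, consider the following. The profile shape, field names and hashing approach are hypothetical, my own illustration rather than the Seattle firm's actual design.

    // Hypothetical shape of a device-usage profile; field names are illustrative.
    import { createHash } from "crypto";

    interface UsageProfile {
      topContacts: string[];                    // most frequently dialed numbers
      installedApps: string[];                  // application package names
      appLaunchCounts: Record<string, number>;  // how often each app is opened
    }

    // Reduce the profile to a stable fingerprint that can be compared on a
    // later transaction to decide whether this is "the same" phone and user.
    function usageFingerprint(p: UsageProfile): string {
      const canonical = JSON.stringify({
        contacts: [...p.topContacts].sort(),
        apps: [...p.installedApps].sort(),
        launches: Object.entries(p.appLaunchCounts).sort(),
      });
      return createHash("sha256").update(canonical).digest("hex");
    }

Nothing about the hashing is exotic; the point is that building the profile at all requires reading exactly the kinds of data most people consider private.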

This type of data analytics reveals much more about the user than simply whether he or she is the authorized individual. The contact information, frequency of use and other personal details could certainly be used by the authenticator to market to that user's friends and relatives, to develop a profile of his or her personality, to learn whether that person is having an affair, and to infer a host of other personal facts. When you decided to opt into a mobile payment system, did you really think that this was what you were buying?

Another problem with this system, like Target's CRM data mining, is that it's all or nothing. If you don't like my privacy policies or you don't like my settings, well, find yourself another mobile payment system or get out of the loyalty program. It's my way, or the highway. In many cases, the consumer either has no choice or the available choices are so limited that it amounts to virtually no choice.

As a result, the permanent loss of intimate privacy ends up being a cost of doing business in a modern society. Once this privacy is lost to a specific merchant, it may be lost to all merchants and to other third parties. We can all anticipate a situation where, once the payment system has authenticated you, law enforcement or other government agents can now collect that information from the merchant in furtherance of some legitimate investigative need. Privacy, like virginity, once given up cannot be restored.

So for companies that are thinking of gathering intimate information, or information about which they believe consumers might be squeamish, I suggest a super opt-in. In addition to the normal privacy policy containing a host of terms and conditions that no consumer is really going to read, if you truly want the "benefit of the bargain," with consumers willingly giving away their privacy in return for some feature, then I suggest you tell them about it in bold print, with capital letters, in a 14-point font, and throw in a few exclamation points while you're at it.

If you are collecting the names of consumers' friends and relatives and other intimate personal information, or using that information for unusual or "non-standard" purposes, then I suggest you tell your customers. Something like, "Hey, this is not the ordinary privacy notice. This is important." might work, although I am not sure customers would even read that. What you really want to do is make sure customers truly know what they are getting into.

If you disagree with me, I'll see you in court, buddy. If you agree with me, however, I would love to hear from you.