Retail Facial Recognition Comes Of Age


Attorney Mark D. Rasch is the former head of the U.S. Justice Department's computer crime unit and today serves as Director of Cybersecurity and Privacy Consulting at CSC in Virginia.

Some years ago, I demoed an ATM that had no card, no chip, no PIN and only a limited keyboard. The ATM used facial recognition software to identify me (after registration), so I only had to walk up to the machine, key in "$20 from checking" and, voilà, money was dispensed. Assuming that everything works as promised and that facial recognition software is close to 100 percent accurate and reliable (more on this later), retailers should still consider the legal, privacy and compliance issues related to biometrics before rushing in. As with all innovative technologies (from credit cards to loss-prevention devices), it's not yet clear whether consumers will embrace or reject facial recognition, or how regulators will ultimately react.

The legal issues for biometric technology arise at each phase of its implementation: capture, enrollment, storage and protection, sharing, comparison, use, and de-enrollment and purging. And that says nothing of the technical issues.
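To make those phases concrete, here is a minimal sketch in Python of what capture, enrollment and comparison look like, using the open-source face_recognition library. The file names, the single-customer "database" and the 0.6 match tolerance are illustrative assumptions, not a description of any retailer's actual system.

```python
# A minimal sketch of the capture/enrollment/comparison phases using the
# open-source face_recognition library. File names, the one-entry database
# and the tolerance are illustrative assumptions only.
import face_recognition

# Capture + enrollment: extract a 128-dimension encoding from a known image
# and store it alongside an identity -- that pairing is the "enrollment."
enrolled_image = face_recognition.load_image_file("enrolled_customer.jpg")
enrolled_encoding = face_recognition.face_encodings(enrolled_image)[0]
enrolled_db = {"customer_0001": enrolled_encoding}

# Comparison: encode a new capture (say, a frame from an in-store camera)
# and measure its distance to every enrolled encoding.
frame = face_recognition.load_image_file("camera_frame.jpg")
for unknown_encoding in face_recognition.face_encodings(frame):
    for identity, known_encoding in enrolled_db.items():
        distance = face_recognition.face_distance([known_encoding],
                                                  unknown_encoding)[0]
        if distance < 0.6:  # the library's default match tolerance
            print(f"Possible match: {identity} (distance {distance:.2f})")
```

Note how nearly every legal question in this column attaches to one of those few lines: taking the frame is capture, building the dictionary is enrollment, and deciding what happens on a match is use.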

How do you get the image you are going to use for the facial recognition? Not an easy question. Sure, if it's an ATM or payment-card replacement, the person can voluntarily sit down and give consent for a picture to be taken. But what about passive capture, setting up a camera in a store or elsewhere and taking images of those who walk in? Benetton recently announced it was testing (but had not deployed) a technology called EyeSee, a camera and facial recognition software deployed inside mannequins. The technology captures shoppers at eye level, and it can be used for loss prevention, trend analysis (what kinds of people are doing what types of things in the store) and, ultimately, identification of customers by comparison with other databases.

This type of "passive capture" is particularly problematic from a legal perspective. Although we may have convinced the consuming public that they have no "right to privacy" in their images while they are in the store or mall (outside bathrooms or dressing rooms), the concept of creating a database of individual actions and movements based on facial recognition software takes that privacy expectation to a new level.

There's a fundamental difference between monitoring traffic and monitoring individuals. Do people in your parking lot know they are consenting to your capture of their license plate numbers (and images of the number, race, gender and age of the occupants of their vehicles)? Once you add the possibility of facial recognition to "ordinary" capture devices (like theft-prevention cameras), you have converted the data into personally identifiable information (PII). So how you get the image matters. If you get a picture taken at Costco for its membership, are you consenting to the chain's use of that image for facial recognition and tracking?

What's worse, retailers can "capture" images from publicly (or semi-publicly) available databases or social networking sites. Is it "legal" for a company like, say, WalMart to scour Facebook, LinkedIn or PhotoBucket to capture names and images to create a database? This would depend partly on the Terms of Use or Terms of Service of these entities and on whether each permits or prohibits both "scraping" and commercial use of its services, in addition to the privacy expectations of the users. Generally, if an image is placed on a publicly accessible portion of a social networking site, it is, well, publicly accessible. That doesn't mean the images are accurate, however. Just ask Notre Dame's Manti Te'o about that one! Moreover, even if it is legal, it's really creepy.

The next issue is enrollment. How do you link a captured image to a specific person? Again, people can voluntarily enroll—like those credit cards that have pictures on them. Or they can be forced to enroll—like a person who is arrested for shoplifting, has a picture taken and then is banned for life not only from the individual store but from all of the chain's stores and its affiliates. Stores use facial recognition software to create a nationwide database of such "banned" persons and to enforce the ban. Was consent required? Most likely not.

There are other ways to enroll people. When a person signs up for a loyalty card, a retailer could either take his or her picture with the consumer's knowledge and consent (voluntary enrollment) or just use a secret camera to take and enroll the image and then link it to the loyalty identity. Again, if consumers have no expectation of privacy in their image and have voluntarily provided their name, this practice may be legal. Legal, but creepy.

Or a retailer could link databases: capture a customer's license plate, or a name from a credit card (swiped or not), and then link that information to the image captured. The registration process ties the captured image to the identity provided or captured. If neither part is "private," is the combination of the two private? As with all legal opinions, the answer will depend on the circumstances.
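For illustration only, here is a sketch of how little code that linkage requires once both pieces exist. Every table and field name below is invented; the point is that joining a license-plate read to a loyalty identity, and then storing a face encoding under that identity, is a two-query operation.

```python
# Hypothetical sketch of linking a captured face encoding to an identity
# pulled from a separate database (loyalty card, license-plate reader, etc.).
# All table and field names are invented for illustration.
import sqlite3

conn = sqlite3.connect("retail.db")
conn.execute("""CREATE TABLE IF NOT EXISTS loyalty
                (plate TEXT PRIMARY KEY, name TEXT)""")
conn.execute("""CREATE TABLE IF NOT EXISTS enrollments
                (name TEXT, encoding BLOB)""")

def enroll_from_plate(plate: str, encoding_bytes: bytes) -> None:
    """Join a license-plate capture to a loyalty identity, then store the
    face encoding under that identity -- the 'registration' step."""
    row = conn.execute("SELECT name FROM loyalty WHERE plate = ?",
                       (plate,)).fetchone()
    if row is not None:
        conn.execute("INSERT INTO enrollments VALUES (?, ?)",
                     (row[0], encoding_bytes))
        conn.commit()
```

If neither database was sensitive on its own, the join is where the privacy question lives.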

Next is storage and protection. Although I can replace a debit or credit card, if a biometric is "hacked" or altered, it's really hard for me to get a new face. Moreover, there are issues related to the accuracy of the databases, the accuracy of the linking and the reliability of the image capture, all of which are exacerbated by any failure to protect and secure the database of biometrics all the way from capture to use.
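One baseline protection, sketched below, is to encrypt stored encodings at rest, so a stolen database yields ciphertext rather than reusable biometric templates. This example uses the Python cryptography package's Fernet recipe; key management, which is the genuinely hard part, is waved away here.

```python
# Minimal sketch: encrypting a face encoding at rest so a stolen database
# yields ciphertext, not a reusable biometric template. Key management --
# the genuinely hard part -- is elided.
from cryptography.fernet import Fernet
import numpy as np

key = Fernet.generate_key()    # in practice, held in an HSM or key vault
fernet = Fernet(key)

encoding = np.random.rand(128)             # stand-in for a real face encoding
ciphertext = fernet.encrypt(encoding.tobytes())

# Decrypt only at comparison time, inside a controlled process.
restored = np.frombuffer(fernet.decrypt(ciphertext), dtype=np.float64)
assert np.allclose(encoding, restored)
```

Encryption at rest does nothing, of course, about inaccurate enrollment or sloppy linking; it only narrows what a breach exposes.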

The biggest issue is how the biometric will be used. Most people would have little problem with taking a picture of an armed robber and using tools to either try and identify the person or provide an alert if the same person is seen again. But the creation and use of a comprehensive database of the observable activities of any person is different in both quantity and quality.

In London, for example, the average person is caught on camera more than 300 times a day. Link these cameras together, and we have a massive database of what ordinary people are doing. Add databases or readers to the mix, and we can link in-store activities with online activities, as Home Depot has done. With that, we can see not only what consumers bought but also what they looked at and didn't buy, what books and magazines they leafed through, what size pants they tried on and a host of other information.

We currently have no legal regime adequate to deal with the massive amount of data and data analytics that can be created through massive deployment and use of facial recognition software. The potentials for use—and misuse—of such systems are only just now being examined.

Related legal issues include false identification. What happens when my twin brother is arrested at gunpoint for a trespass I committed? Or when Diane O'Meara (the real-life woman behind the image) is arrested, or worse, for something "Lennay Kekua" (Manti Te'o's fictional girlfriend) allegedly did? How do you get your image back?

There is also the question of linking and sharing. Can a retailer link its cameras and its databases with other retailers? Can a company offer a "know your customer" service? Can the data be shared with law enforcement or intelligence agencies? Currently, there are few rules.

And, of course, all this assumes that facial recognition actually works. I am not sure which is creepier—that it doesn't work, or that it does.

So for retailers considering jumping onto the facial recognition bandwagon, I would recommend conducting a detailed privacy impact assessment, not only on the technology itself but also on its deployment and use. Don't start with the assumption that "public" images are "public" and can be used in any way you like. And don't be creepy.

If you disagree with me, I'll see you in court, buddy. If you agree with me, however, I would love to hear from you.