Did Walmart cross a line with its facial recognition tech test?


By Matthew Stern, RetailWire

The following appears courtesy of RetailWire.com, an online discussion forum for the retail industry.

Whether or not the public at large will accept the use of facial recognition in their daily lives remains to be seen, but it is clear that retailers are willing to investigate whether the controversial technology has a role to play in stores. Fortune, for example, recently reported that Walmart had been testing the use of facial recognition technology to identify shoplifters.

According to the article, Walmart used a solution called FaceFirst, which works by scanning the face of every shopper who comes through the door. The software then checks each face against a pre-existing database of suspected shoplifters. If a customer's face matches one in the database, the store's staff are alerted on their mobile devices.
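The mechanics of such a watchlist check are simple to sketch. The snippet below is a minimal, hypothetical illustration of the general pattern the article describes (compare a captured face's embedding against a stored watchlist and raise an alert on a sufficiently close match). It is not FaceFirst's actual implementation; the embedding size, similarity threshold, and function names are assumptions made for the example.

```python
import numpy as np
from typing import Dict, Optional

# Hypothetical sketch of the watchlist-matching pattern described above.
# In a real deployment, embeddings would come from a face-recognition model;
# here they are random placeholder vectors.

EMBEDDING_DIM = 128
MATCH_THRESHOLD = 0.8  # assumed similarity cutoff, not a FaceFirst parameter


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: Dict[str, np.ndarray]) -> Optional[str]:
    """Return the ID of the closest watchlist match above the threshold, else None."""
    best_id, best_score = None, MATCH_THRESHOLD
    for person_id, stored_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, stored_embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Pre-existing database of suspected shoplifters (placeholder vectors).
    watchlist = {f"subject-{i}": rng.normal(size=EMBEDDING_DIM) for i in range(3)}
    # Simulate a shopper whose embedding closely resembles one watchlist entry.
    shopper = watchlist["subject-1"] + rng.normal(scale=0.01, size=EMBEDDING_DIM)
    match = check_against_watchlist(shopper, watchlist)
    if match is not None:
        print(f"ALERT: possible match with {match}; notify store staff devices.")
    else:
        print("No match; no alert sent.")
```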

The software was in place in stores across several states for several months. Walmart discontinued the test after it failed to produce a positive return on investment.

Fortune contacted various other retailers about their use of FaceFirst, but only Walmart acknowledged having used it.

A consumer survey quoted in RetailWire in August of 2015 noted that 75 percent of consumers said they would not shop at a store that used facial recognition for marketing purposes.

Whether consumers are more comfortable with facial recognition being used for loss prevention remains to be seen, as does whether facial recognition intended for loss prevention will eventually creep into other areas of a business.

Privacy watchdogs have been attempting to set specific guidelines for the use of such potentially invasive technology, as well as similar technologies used on social media. The results have not been what they had hoped.

The Electronic Frontier Foundation (EFF), a longtime online privacy advocacy organization, had, according to a press release, been engaged in a National Telecommunications and Information Administration (NTIA) process to arrive at sensible limits on the private sector's use of facial recognition technology and on the sharing of collected information with government agencies.

According to a June 2015 statement on EFF.org, the EFF and eight other privacy organizations pulled out of the process after 16 months, citing companies' refusal to accept even the most basic privacy limitations on the use of the technology.

Discussion Questions: Do you see a practical use for facial recognition technology in retailers' loss prevention efforts? Do you expect privacy advocates to derail the practice?

Comments from the RetailWire BrainTrust:

I suppose truly identifying previous LP offenders is a decent use of facial recognition technology and is "better than racial profiling." But one question that must be answered is: Is there a statute of limitations? Or will one mistake follow the offender forever?

And what's to keep retailers from using facial recognition for profiling as well?

I think it's a DOA technology.
-Paula Rosenblum, Managing Partner, RSR Research

Sure, it's a practical use. This technology has done far more good than harm. Without it, the Boston Marathon bombers would still be free and would no doubt have killed more people in more attacks. Many murderers and rapists would never have been caught. Gimme a break. Common sense will, over time, erode the argument of privacy advocates.
-Warren Thayer, Editorial Director & Co-Founder, Frozen & Refrigerated Buyer

The issue of profiling will have to be addressed, as well as the statute of limitations and how else the pictures may be used. People may not know how their pictures are being used now. However, if consumers find out that they have been used inappropriately, there will be a huge backlash. Retailers will have to weigh the benefits against the actual cost of the system and the potential cost of a backlash from consumers. At this point, using this technology is a risky bet against a questionable payoff.
-Camille P. Schuster, Ph.D., President, Global Collaborations, Inc.

It's not the recognition that makes me queasy; it's the aggregation of the data.

When I'm out in a privately owned or public space, I have no expectation of privacy whatsoever. When my identity is captured, however, I have an interest in knowing that it will be used responsibly and protected from bad actors. Considering the current state of cybersecurity, I have zero confidence that my personal data is safe from breaches.

If a retailer uses facial recognition and machine learning to scan me, classify me, figure out my gender or ethnicity, track me in the store, and even compare my behaviors to the resultant purchases, that's really okay. The underlying intent there is to use sensing to understand behavior and improve customer experience.

If the same retailer uses facial tech to identify me personally and match my identity against a database, that's not okay. It's an invitation for abuses of various kinds, and it results in the creation of a data trove that is irresistible to data thieves.

Bottom line: You don't need my personal identity to catch me stealing. You just need an observation process. It's okay to use machine learning to make this more effective. It's okay to use the photo as evidence in an individual instance of shoplifting. It's not okay to build a persistent time-and-location stamped private database of shopper images. Leave that sort of thing to the NSA.
-James Tenser, Principal, VSN Strategies

Facial recognition, like other biometrics, will continue to be considered as an additional security safeguard against fraud and shoplifting, and even as a tool for VIP shopper identification. The key to mollifying consumers' privacy concerns lies in communicating to shoppers why these technologies benefit them.

Given the increasing number of credit card fraud cases, stolen identities and other personal information breaches, I believe most consumers understand that it is to their benefit to make sure they are who they say they are when a payment instrument is being used.

In addition, given the increasing number of security cameras both inside and outside retail stores, the majority of shoppers are already aware that they are being watched.

Surely there will be privacy advocates who will find fault with some of the more invasive techniques, but ultimately it will be the consumer who will decide if the measures provide more benefit than inconvenience or intrusion in their personal lives, not an advocacy group.
-Mark Heckman, Principal, Mark Heckman Consulting

Apparently not, if they discontinued the test because it didn't produce enough hits to pay out. It's not a privacy issue, it's an ROI issue.
-Dr. Stephen Needel, Managing Partner, Advanced Simulations

Read the entire RetailWire discussion.
 
