The Unexpected Benefits of Tokenization

PCI columnist Walt Conway, a QSA with 403 Labs, has worked in payments and technology for more than 30 years, 10 of them with Visa.

I am starting to see that one of the biggest benefits of tokenization might be the implementation process itself. That is, while using properly constructed tokens can reduce a merchant's PCI scope, the work of planning, designing and implementing them can produce significant benefits, too. One result of tokenization is that it restricts the further spread of cardholder data throughout the enterprise. Another is that the implementation process gives you a running start on complying with PCI version 2.0.

Tokenization requires a lot of work to implement. It would be a shame not to take full advantage of that work and of the benefits that come with it.

The first step in tokenization is to identify all the people, applications and departments that currently use cardholder data. This step is not easy, and it requires both business and process knowledge that the IT department is unlikely to have. That means you need to involve the business side in the tokenization—and by implication the PCI compliance—process.

Including the business experts in PCI compliance is, therefore, often the first benefit. To implement tokenization properly you need to go beyond the usual cardholder dataflow diagram that tracks a card transaction from authorization to settlement. For example, do you currently send or receive primary account number (PAN) data from your acquirer as part of the chargeback or dispute resolution process? Does your call center ever include PAN data in its escalation procedure or E-mail messages? Does the anti-fraud department use PAN data for velocity checking, or does it provide PAN data to law enforcement agencies?

If the answer to any of the above questions is "yes," then the next questions should be:

  • Is it worth the risk?
  • Do we still need to do things this way?
  • Can we change this process and reduce our risk and PCI scope?

The business people have detailed operational knowledge that IT staff working alone may lack. They can help pinpoint all the places card data (both electronic and paper) is used in the enterprise. On the other hand, they are also the ones who need to understand that "but we always did it that way" is not a compensating control.

A second benefit of tokenization is that it forces you to locate all your cardholder data. Many IT departments use an automated tool, such as a sensitive-number finder, to locate PAN data. For this approach to work, though, you need to know where to look. Although you should look for cardholder data everywhere, I don't know many organizations that run regular searches on departmental devices they believe are out of scope. A detailed joint IT-business review of all the enterprise's activities can give you more confidence that you have really found all your in-scope systems, processes and devices.
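To make the search concrete, below is a minimal sketch, in Python, of what a sensitive-number finder does: it flags digit runs of plausible card length and keeps only those that pass the Luhn check that card numbers satisfy. The file layout, function names and masking format are my own illustrative assumptions, not taken from any particular product.

```python
import os
import re

# Candidate PANs: 13 to 16 digits, allowing spaces or dashes as separators.
PAN_CANDIDATE = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

def luhn_ok(digits: str) -> bool:
    """Return True if the digit string passes the Luhn check."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_file(path: str):
    """Yield (line_number, masked_candidate) for likely PANs in a text file."""
    with open(path, errors="ignore") as fh:
        for lineno, line in enumerate(fh, 1):
            for match in PAN_CANDIDATE.finditer(line):
                digits = re.sub(r"[ -]", "", match.group())
                if luhn_ok(digits):
                    # Report only first six / last four, so the findings
                    # report doesn't become a PAN repository itself.
                    masked = digits[:6] + "*" * (len(digits) - 10) + digits[-4:]
                    yield lineno, masked

def scan_tree(root: str) -> None:
    """Walk a directory tree and print every likely PAN found."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            for lineno, masked in scan_file(path):
                print(f"{path}:{lineno}: {masked}")
```

Note that the sketch truncates everything it reports; a discovery tool that writes full PANs into its findings has just created one more data store you must protect.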

Let me say from the outset that the process I'm describing is a lot of work, and the meetings will fill a lot of whiteboards. The team needs to find all the data repositories in the organization, locate all the paper files and reports (which can be digitized, so check the document management systems, too), understand all the back-office processes that use cardholder data, and challenge historic practices and procedures. My point is that because you are already doing the work as part of the tokenization process, you might as well get all the benefits.

Another benefit of implementing tokenization is that because you have located all your PAN data, you are a step ahead in complying with the scoping requirement in PCI 2.0. One of the changes in PCI 2.0 is: "At least annually and prior to the annual assessment, the assessed entity should confirm the accuracy of their PCI DSS scope by identifying all locations and flows of cardholder data and ensuring they are included in the PCI DSS scope."

A third benefit of the tokenization process is that it forces the enterprise to reexamine who can access PAN data. PCI Requirement 7 tells you to restrict access to cardholder data on a strict need-to-know basis, according to job function. This principle is also known as "least privilege," because it grants only the minimum access a person requires to do his or her job.

Note that PCI (and good security practice) limits access to sensitive data based on a person's job requirements, not his or her place on the organization chart. A particular fraud analyst or chargeback clerk may therefore require access to PAN data to do his or her job, but another person in the department, or even the department head, should not automatically have those same privileges.

Implementing tokenization forces the organization to start from scratch, restricting access privileges to the token vault (and, therefore, to PAN data) to those with a job-related need. Speaking as a QSA, I regularly see organizations where I have to challenge the number of people with administrative privileges that let them see (and print and copy) PAN data. In most cases, the individuals report they don't need and never use the data. They were given access based on their department or position, not their job requirements.
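To illustrate the distinction, here is a minimal sketch of least privilege in code. The job functions and entitlement names are hypothetical, invented for this example; the point is only that rights attach to the function a person performs, never to a department or a title.

```python
# Entitlements are granted per job function, not per department or rank.
# These function and entitlement names are illustrative only.
FUNCTION_ENTITLEMENTS = {
    "chargeback_processing": {"vault:detokenize"},
    "fraud_investigation": {"vault:detokenize", "vault:search"},
    "reporting": {"vault:read_token"},  # tokens only, never the PAN
}

class User:
    def __init__(self, name: str, functions: set):
        self.name = name
        self.functions = functions  # assigned by job duty, not by org chart

def can(user: User, action: str) -> bool:
    """Allow an action only if some assigned job function requires it."""
    return any(action in FUNCTION_ENTITLEMENTS.get(f, set())
               for f in user.functions)

# The fraud analyst needs PAN access; the department head does not.
analyst = User("fraud analyst", {"fraud_investigation"})
manager = User("department head", set())
assert can(analyst, "vault:detokenize")
assert not can(manager, "vault:detokenize")
```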

A final benefit of the tokenization process is that it should stop PCI scope creep in future years. The token vault contains all the tokens and their associated PANs (encrypted, of course). This single location requires strong access controls and physical security, possibly stronger than those currently in place. Putting all the organization's PCI "eggs" in one basket triggers a rethinking of the logical and physical controls and restrictions on the data. Those controls should stop data from leaking to unauthorized people and systems, thereby helping to limit future expansion of the organization's PCI scope.
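For readers who want to picture that basket, here is a minimal sketch of a token vault's core mapping: a token is just a random value, and the vault is the only place it can ever be exchanged for the PAN. It assumes the third-party cryptography package for the encryption and deliberately omits the key management, durable storage, auditing and access controls a real vault requires.

```python
import secrets
from cryptography.fernet import Fernet  # assumes the `cryptography` package

class TokenVault:
    """A toy token vault: random tokens mapped to encrypted PANs."""

    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())  # demo key, in memory only
        self._store = {}  # token -> encrypted PAN

    def tokenize(self, pan: str) -> str:
        # The token is random, so it reveals nothing about the PAN and is
        # worthless to anyone who cannot reach the vault.
        token = secrets.token_urlsafe(16)
        self._store[token] = self._fernet.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        # In production this call sits behind the least-privilege
        # controls discussed above.
        return self._fernet.decrypt(self._store[token]).decode()

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# Downstream systems store and pass around the token; only the vault,
# the single locked-down location, can return the PAN.
assert vault.detokenize(token) == "4111111111111111"
```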

I do not know that tokenization is the best approach for all merchants. Point-to-point encryption is also very promising, if it is implemented properly. Each technology has costs and benefits, and each addresses different sets of needs. The good news on the tokenization front is that the Law of Unintended Consequences (which holds that we should be ready for unexpected outcomes from any action) may, in this one case, actually work in merchants' favor.

What do you think? Are you implementing tokenization or considering it? How did you implement it, and did you see any of the benefits I'm describing? Did you use the process to reexamine your data flows and identify process changes that could reduce your PCI scope? I'd like to hear your thoughts. Either leave a comment or E-mail me at [email protected].
