PCI Council's High-Value Token Definition Disappointing

A 403 Labs QSA, PCI Columnist Walt Conway has worked in payments and technology for more than 30 years, 10 of them with Visa.

When the PCI Council issued its PCI DSS Tokenization Guidelines on August 12, one aspect was especially surprising: the introduction of the concept of a "high-value token." A plain old token gets promoted to a high-value token when "the token itself can be used in lieu of cardholder data to perform a transaction."

Before reading the document (or at least until I got to page 20), I thought a token was a token. Whether a particular token, or tokenization approach, was in or out of PCI scope seemed to depend on how well it was constructed and how the tokenization engine and token vault were implemented. Now I learn that (with apologies to George Orwell) some tokens are more equal than others.
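To make the "plain old token" concrete, here is a minimal sketch of the kind of token vault the guidelines contemplate: a random token stands in for the PAN, and the mapping between the two exists only inside the vault. This is purely illustrative (the class and method names are my own, not anything the Council defines), and a real vault would also encrypt stored PANs and tightly control access.

```python
import secrets

class TokenVault:
    """Illustrative token vault: the only place the PAN-to-token
    mapping exists. A sketch, not a production implementation."""

    def __init__(self):
        self._pan_by_token = {}
        self._token_by_pan = {}

    def tokenize(self, pan: str) -> str:
        # Return the existing token for a PAN, or mint a new one.
        if pan in self._token_by_pan:
            return self._token_by_pan[pan]
        # A random token has no mathematical relationship to the PAN,
        # so it cannot be reversed without access to the vault.
        token = secrets.token_hex(8)
        self._pan_by_token[token] = pan
        self._token_by_pan[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the PAN.
        return self._pan_by_token[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert vault.detokenize(t) == "4111111111111111"
assert t != "4111111111111111"
```

The point of the sketch is that scope questions hinge on where that vault sits and who can call `detokenize`, which is exactly why implementations of the same approach can land differently.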

The rationale for the Council's distinction is that it considers high-value tokens to be payment instruments. High-value tokens can be used to initiate fraudulent (as well as valid) transactions. The Tokenization Task Force and the PCI Council deemed these high-value tokens to be as valuable to the bad guys as the original primary account numbers (PANs) they replaced. As a result, any such tokenization approach is subject to PCI DSS controls and the high-value tokens themselves are in your PCI scope.

This conclusion likely comes as a surprise to many retailers and other merchants (along with their tokenization providers). For example, E-Commerce merchants who use tokens for one-click ordering and repeat purchases (leaving the underlying PANs with their processor or another third party) just learned their tokens will still be in scope for PCI.

I wonder whether the hotel room keycard (or resort account) I use to charge meals is a high-value token, because it generates a payment-card transaction. I even wonder whether tokens used for exception-item processing, such as chargebacks and refunds, are high-value tokens, because they affect transactions (even if only to reverse them).

The document's discussion of high-value tokens concludes with what some will consider to be disappointing advice: "Merchants should therefore consult with their acquirer and/or the payment brands directly to determine specific requirements for tokens that can be used as payment instruments." I don't know how many "payment brands" will pick up the phone to offer tokenization advice to merchants, so I imagine acquirers, processors and QSAs will be getting a lot of questions in the coming weeks.

Some in the industry may think that in referring merchants back to their acquirer/processor, the Tokenization Task Force and the PCI Council avoided taking a stand. I don't see what else they could do. I have analyzed and evaluated tokenization approaches for both merchants and vendors. My experience is that there can be large differences across implementations of the same approach in different merchant environments.

(See our news story companion to this column: Fighting Words: How Specific Should PCI's Token Guidance Have Been?)

Reality is messy. Although I, and every QSA, might wish for a definitive answer, all we can reasonably expect is guidance that helps us and our merchant clients determine the scope for a tokenization approach.

The good news in the guidelines is that merchants who use tokenization to remove their post-purchase and other back-office systems from PCI scope should be in good shape.

To be more precise, here is what the guidelines actually say: "System components that are adequately segmented [isolated] from the tokenization system and the CDE [cardholder data environment]; and that store, process or transmit only tokens; and that do not store, process or transmit any cardholder data or sensitive authentication data, may be considered outside of the CDE and possibly out of scope for PCI DSS."

Unfortunately, there is disappointing news for retailers and E-Commerce merchants who use tokens to generate transactions. They need to dig into the details of their tokenization systems, and it seems they may also need to consider their high-value tokens to be in scope for PCI.

From a big-picture perspective, the guidelines confirm that in certain conditions (and the Council detailed seven of them) replacing PAN data with tokens may remove that data from your PCI scope. That was the good news.

There also was some not-so-good news for some retailers. Notice I said "may remove" and not something more definite. I used that term intentionally, because the guidelines also indicate that to minimize PCI scope you need to know how you will use the tokens, not just how you generated them. In other words, not all tokens are created equal. The bottom line is that some tokens will still be in scope for PCI.
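The distinction the guidelines draw could be sketched as follows. The `Token` type and `in_pci_scope` function are hypothetical names of my own invention, used only to illustrate the principle that scope turns on how a token can be used, not just how it was generated.

```python
from dataclasses import dataclass

@dataclass
class Token:
    value: str
    can_initiate_transaction: bool  # e.g., a one-click ordering token

def in_pci_scope(token: Token) -> bool:
    # Per the guidelines' high-value-token concept, a token that can be
    # used in lieu of cardholder data to perform a transaction remains
    # in scope, however well it was constructed.
    return token.can_initiate_transaction

back_office = Token("a3f9...", can_initiate_transaction=False)
one_click = Token("b7c2...", can_initiate_transaction=True)

assert not in_pci_scope(back_office)  # post-purchase reporting token
assert in_pci_scope(one_click)        # high-value token
```

In practice the determination is far less binary than a boolean flag, which is precisely why the Council refers merchants back to their acquirers; the sketch only captures the headline rule.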

The Tokenization Task Force included representatives from merchants, vendors and QSAs. A lot of people put in a lot of time, effort and energy into drafting these guidelines. The PCI Council then reviewed the recommendations, ultimately releasing the final document, which concludes: "The level of PCI DSS scope reduction offered by a tokenization solution will also need to be carefully evaluated for each implementation."

No one should expect a simple answer to a complex issue like tokenization. The first thing we need to keep in mind is that when the PCI Council releases "guidelines," that is exactly what they are: guidelines. It did not release pre-baked, ready-to-serve, one-size-fits-all answers that apply in all cases. The Council cannot—and should not be expected to—do that, because technologies, security and implementations will vary from merchant to merchant. In my opinion, then, it is not reasonable to expect anything more than guidelines, which is all the Council promised in the first place.

We have been waiting for more than a year for the report from the Tokenization Task Force and the PCI Council, and now we have it. Love it or hate it, the guidelines are what we all have to work with. My guess is that some tokenization RFPs—along with any number of vendor sell sheets—will be re-written this week.

What do you think? Have you implemented tokenization? Do you have, or do you expect to have, high-value tokens? I'd like to hear your thoughts. Either leave a comment or E-mail me at [email protected].