Tokens Are Not The Same As Encryption. Honest

A 403 Labs QSA, PCI Columnist Walt Conway has worked in payments and technology for more than 30 years, 10 of them with Visa.

It's now been four months since the PCI Council released its guidance on tokenization, and people are still mixing up tokenization and encryption and drawing parallels between them that do not hold. Tokenization is not encryption. Comparing the two is like comparing quarks to streetcars (or your other favorite silly simile), and doing so can lead to mistakes in scoping PCI.

By the way, after much effort, I think I finally found a real-world example of what a high-value token should be. Let's say I want to use my payment card at a merchant, but I don't want that merchant to have my PAN for whatever reason. Let's also say I can give my PAN to a trusted issuer or service provider, and that company in turn gives me a token to use at that untrusted merchant. Because that token—whatever it looks like—could be used by me (or anyone) to initiate a new transaction against the underlying payment card, that token would fit the definition of a high-value token requiring additional safeguards. I might even consider that high-value token to be in PCI scope.

The PCI Council's tokenization guidance was designed to help merchants (and QSAs) determine PCI scope with tokenization. The document specifies seven objectives to be met for tokens to be considered out of scope, plus another eight recommendations to reduce scope with tokenization.

Everyone should note that the guidance is just that: guidance. Although everyone reads the same document, some merchants, vendors and QSAs may come to different conclusions as to what is and is not in scope with tokenization. I attribute at least part of this situation to a muddling of tokenization and encryption in people's minds. Whatever the reason, the result for a merchant implementing tokenization is that it may find itself in a protracted argument internally, with competing vendors or maybe even with its QSA as to what is in and what is out of PCI scope.

Tokenization has the potential to both reduce a merchant's PCI scope and limit the risk of a cardholder data breach. The technology reduces scope by restricting cardholder data to the token vault, which should be properly segmented from all other systems and protected like the crown jewels. It reduces the risk of a data breach, because everybody in the organization now uses tokens instead of cardholder data. So, theoretically at least, there should be a lot less PAN data to be compromised.

Merchants implement tokenization in one of three general ways: The solution can be outsourced to a third party, such as a processor or specialized tokenization vendor; it can be developed internally by the merchant; or the merchant can adopt a hybrid approach using a third-party solution but hosting it internally.

Irrespective of the implementation, tokenization differs from encryption. For example, encryption is reversible while random tokens are not. In terms of scope, the PCI Council settled the question of whether encrypted data is in or out of scope in its FAQ #10359, which states: "encrypted data may be deemed out of scope if, and only if, it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it." Therefore, if anyone in the organization can decrypt the data, the encrypted data is in scope everywhere it appears. Conversely, if no one can reverse the encryption, the encrypted data is out of scope.
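
To make the distinction concrete, here is a minimal sketch in Python. It is illustrative only, and it assumes the third-party cryptography package for the encryption half. The ciphertext can always be turned back into the PAN by whoever holds the key; the random token has no key and nothing to decrypt.

    import secrets
    from cryptography.fernet import Fernet

    pan = b"4111111111111111"  # standard test PAN, not a real card number

    # Encryption is reversible: anyone holding the key can recover the PAN,
    # which is why FAQ #10359 keeps ciphertext in scope wherever the key exists.
    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(pan)
    assert Fernet(key).decrypt(ciphertext) == pan

    # A random token, by contrast, has no mathematical relationship to the PAN:
    # there is no key, and there is nothing to decrypt.
    token = secrets.token_hex(16)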

Tokens, however, are not (or at least should not be) reversible. Tokens should be randomly generated, and the token vault should be properly segmented from other systems and devices. The PCI Council (and Visa before it) stipulated that there should be no way mathematically to reverse the process and derive the PAN given only a token. This means that even if someone knew the PAN associated with one or more tokens, they would have no information to help de-tokenize the next one.
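
Put in code terms, de-tokenization is a privileged lookup, not a computation. Here is a hypothetical sketch, with a dictionary standing in for the vault; a real vault is a hardened, segmented system, not a data structure in application code.

    import secrets

    _vault: dict[str, str] = {}  # stand-in for the segmented token vault

    def tokenize(pan: str) -> str:
        """Issue a random token; the PAN is retained only inside the vault."""
        token = secrets.token_hex(8)  # drawn from a CSPRNG, independent of the PAN
        _vault[token] = pan
        return token

    def detokenize(token: str) -> str:
        """The only road back to the PAN is a privileged vault lookup;
        no computation derives the PAN from the token itself."""
        return _vault[token]

    token_a = tokenize("4111111111111111")  # test PANs only
    token_b = tokenize("5555555555554444")  # knowing one pair reveals nothing about the next

The lookup table is the whole story: the only way to de-tokenize is to get into the vault, which is exactly why the vault must be segmented and protected like the crown jewels.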

A security expert I respect greatly recently challenged me as to whether any tokenization system, particularly an internally hosted tokenization solution, could really reduce PCI scope. He compared tokenization with encryption, and he concluded that because encrypted data is considered in PCI scope if the organization can decrypt the data, then tokens, too, should be in scope if anyone in the organization can de-tokenize them.

I failed to see then (and I still fail to see) the parallel between encryption and tokenization, and I do not agree with applying the rules for encryption to tokenization. The sole fact that some people in the organization with appropriate privileges can access the token vault and retrieve a PAN does not make the tokens "reversible." That ability does not bring tokens into PCI scope. Certainly all the persons and processes that retrieve PANs and use them for a transaction are in scope. But the tokens themselves should be considered out of scope.

The PCI Council's guidance supports this position when it describes the special case of a "high-value" token, which may require additional safeguards.

High-value tokens are those that can be used to initiate a new card transaction. According to the PCI Council's guidance document, such tokens "might be in scope for PCI DSS, even if they cannot directly be used to retrieve PAN or other cardholder data." The use of the word "might" is hardly definitive, but the fact that the Council called out only these tokens for special treatment reinforces my argument that comparing tokenization and encryption is a false analogy.

Tokenization and encryption have a complex relationship. The two technologies are fundamentally different: encryption is reversible, whereas random tokens are not. At the same time, tokenization solutions require strong encryption to protect the card data stored in the token vault.
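
That relationship might look like the following sketch, in which the random token circulates among merchant systems while the vault holds the PAN encrypted at rest. It again assumes the cryptography package, and key management is reduced to a single variable purely for illustration; a real vault would keep its keys in an HSM or key-management service.

    import secrets
    from cryptography.fernet import Fernet

    _vault_key = Fernet.generate_key()  # data-at-rest key; illustration only
    _vault: dict[str, bytes] = {}

    def tokenize(pan: str) -> str:
        token = secrets.token_hex(8)  # random token; no relationship to the PAN
        _vault[token] = Fernet(_vault_key).encrypt(pan.encode())  # PAN encrypted at rest
        return token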

The fact that tokenization uses encryption does not justify applying the same PCI scoping considerations to both technologies. For that to be justified, one of two situations would have to exist. The first is a token solution that violates the rule that there be no mathematical way to reverse a token and derive the PAN. That would validate the comparison to encryption, but I would then consider such a solution to be encryption in the first place, negating the desired scope reduction. The only other situation is to consider all tokens to be high-value tokens, which seems inconsistent with the PCI Council's guidance. Therefore, I have difficulty accepting a parallel between tokenization and encryption.

What do you think? Have you implemented tokenization? Did you receive the scope reduction benefits you expected? I'd like to hear your thoughts. Either leave a comment or E-mail me at wconway@403labs.com.