Tokenization and end-to-end encryption are designed to secure information both in transit and at rest. In other words, the focus of each technology is security first. The fact that they can reduce PCI scope or make PCI compliance easier is a secondary benefit. That tokenization and end-to-end encryption vendors are starting to figure this point out is good news.
The smart ones—and the only vendors you should be talking to, by the way—will tell you that there is no silver bullet to make PCI go away, which is a victory for reality over marketing hype. But there is one thing you should know about reducing your PCI scope: it is pretty important, and it does not happen automatically. If retailers are going to get full value from their investment in either or both of these technologies, they had better look carefully at their implementations, or they may find they did not get everything they paid for.
There have been some pretty good arguments made that tokenization and end-to-end encryption each can reduce a retailer’s PCI scope. But if these technologies are not implemented properly, you may find that you are more secure (a good thing) but haven’t reduced your PCI scope (a bad thing).
Whether or not you insist that your tokens pass the Luhn check (the algorithm used to compute the check digit in the PAN) is not particularly relevant. Similarly, the issues of token collisions (two PANs mapping to the same token) and even format-preserving encryption or tokenization (the encrypted data or tokens look like a PAN) are secondary. The main question retailers need to ask is whether they will have the ability to de-tokenize or decrypt the data and return it to plain text. If you have this ability, through whatever means, that data is still in scope for PCI and you will have lost a lot of the value of your investment.
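For readers who have not run into it, here is a minimal Python sketch of the Luhn check mentioned above. The function name and the sample number are illustrative only (the sample is a well-known test value, not a real card number):

    def luhn_valid(pan: str) -> bool:
        """Return True if a digit string passes the Luhn check.

        Starting from the rightmost digit, double every second digit;
        if doubling produces a two-digit number, subtract 9. The value
        is valid when the total is divisible by 10.
        """
        total = 0
        for i, ch in enumerate(reversed(pan)):
            d = int(ch)
            if i % 2 == 1:   # every second digit from the right
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    print(luhn_valid("4111111111111111"))  # True for this standard test number

Whether a vendor's tokens pass or fail this check changes how the tokens look, not whether they can be reversed, which is why the article treats it as a side issue.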
As almost everyone should know, the PCI Council has decreed that encrypted cardholder data is in your PCI scope. The sole exception is “if, and only if, it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it.” The whole idea of investing in either tokenization or end-to-end encryption is to get as many of your systems and databases out of PCI scope as possible and to reduce your cost of PCI compliance.
Therefore, this one exception is important. The clear intent of the Council is to say that if you, the merchant, have access to any mechanism that lets you go from a token back to the clear text PAN--whether you call that mechanism decryption or de-tokenization or de-anything--that tokenized data is still in scope. You gained some extra security, but you just blew the opportunity to reduce your PCI scope.

The only way to get the data out of scope is if you never--and I mean never--have the ability to convert the tokens back to clear text data. At the very least, any mechanism that can reverse the token would have to reside, like the encryption keys in the case of encrypted data, with a separate “entity.” Segmenting your network is not enough to get you around this problem. If the bad guys can break into one part of your network to get the tokens, they will just as easily break into another segment to get the means to convert those tokens back to clear text data, whether you call that process decryption, a look-up table or some other form of prestidigitation.
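To make the look-up-table point concrete, here is a deliberately simplified Python sketch of a token vault; the class and method names are hypothetical and no real product works exactly this way. The point is not the mechanics but who holds the mapping:

    import secrets

    class TokenVault:
        """Hypothetical token vault, for illustration only.

        Whoever possesses this mapping can reverse any token it has
        issued, so the systems and the entity holding it stay in
        PCI scope.
        """

        def __init__(self):
            # The look-up table the article warns about.
            self._token_to_pan = {}

        def tokenize(self, pan):
            # A random surrogate with no mathematical relationship to
            # the PAN; the token alone reveals nothing.
            token = secrets.token_hex(8)
            self._token_to_pan[token] = pan
            return token

        def detokenize(self, token):
            # If the merchant can call this (or read the table), the
            # tokenized data has not left PCI scope.
            return self._token_to_pan[token]

If this vault lives anywhere on the merchant's own network, even a “segmented” part of it, the merchant still has the means to get back to clear text. Move the vault (or, for encryption, the keys) to a genuinely separate entity, and the tokens the merchant holds become nothing more than random values.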
This separate entity is ideally an outside organization physically separated from your operations and systems. A hardware security module (HSM) that does the encryption or tokenization and that is totally controlled by a third party might just qualify. But if you take that route, be prepared for your QSA to examine it pretty closely and ask some detailed, technical questions.
Once the cardholder data is encrypted or tokenized, and so long as the merchant never has the ability to retrieve the clear text data, all the downstream systems could be out of scope. Otherwise, all bets are off and all your cardholder data is in scope. You may have improved your security, but you have not reduced your PCI scope.
It all comes down to two separate but related objectives: security and PCI scope. Both tokenization and end-to-end encryption can increase security and lower risk when properly implemented. That is really good. But if you are going to spend the dollars, time and effort to implement either solution, why not also reduce your PCI scope while you are at it? After all, that is a lot of what you are paying for. To do that means you have to implement either technology (or maybe both?) such that you cannot ever, ever, ever get from the ciphertext back to the clear text data. It also means that you had better get out your checkbook, because a purely internal solution is pretty unlikely to reduce your PCI scope.
What do you think? I’d like to hear your thoughts. Either leave a comment or E-mail me at [email protected].