Next month, millions of adorable merchant IT executives will dress up and pretend to be responsible adults who are experts in all manner of security. They'll walk down Tokenization Street, going from one security vendor to another, holding out their brightly colored IT environment bags and asking, "Token Trick Or Treat?" Some will get delicious chocolate, which will cost-effectively protect their payment data. Others, unfortunately—like CSO Charlie Brown—will get a rock.
How can you make sure you don't get round dusty minerals that are overpriced and provide about as much true data security as a Jack-O-Lantern? It all comes down to knowing which houses to go to and which to avoid. Even more importantly, it's about knowing which questions to ask.
Many retail CIOs are evaluating tokenization as a way to reduce PCI scope and, thereby, the total cost to achieve and maintain PCI compliance. The problem is that many tokenization options are available. How can you be sure that you not only pick the right approach for your company but realize the benefits you are paying for?
I won't pretend to have all the answers. After all, with both software and hardware appliance packages in the market, things can get complicated quickly. Instead, I would like to offer a set of questions retailers need to ask—and have answered—before they commit to any approach. These questions are based on third-party products I've seen in the market, Visa's recent guidance on the subject and, to a great extent, my own clients' experiences. The list is not meant to be complete, but it should get you headed in the right direction—or at least help you avoid an expensive disappointment.
Tokenization is a data security technology that replaces primary account number (PAN) data with surrogate values, or tokens. Properly constructed tokens are not mathematically reversible, and the tokens can be removed from PCI scope. The appeal of tokenization is that it simplifies the process and reduces the cost of PCI compliance.
Encryption, on the other hand, is reversible and, as such, is not the same as tokenization. Furthermore, encryption does not take PAN data out of your PCI scope. Properly done, encryption may render your PAN data unreadable and, therefore, PCI compliant. But the data is still in scope.
Retailers have a number of tokenization options, including third-party token vendors, their existing application vendors who may offer a tokenization upgrade or even their payment processor or acquirer. A retailer also could develop a homegrown tokenization system. Although I would not recommend this last approach, the questions to be answered are pretty much the same.
Here, then, is this QSA's list of questions for your consideration. You may have other questions—in fact, you probably will—but this list should get you started.
In other words, are you really ready to begin? It will be impossible to scope a tokenization project until you know where all your cardholder data is generated, stored and transmitted. And I mean all your data. A good place to start is with a complete cardholder dataflow diagram (you know, the one you developed for your QSA). You may want to use automated data discovery tools to locate where PAN data may have leaked. A wide array of good open-source and vendor tools can help you.
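To give a sense of how those discovery tools work under the hood, here is a minimal sketch (not any particular vendor's product) that scans text for digit runs of plausible PAN length and filters out false positives with the Luhn checksum. The function names and the regex are my own illustration:

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum: weeds out random digit runs that are not card numbers."""
    digits = [int(d) for d in number][::-1]
    total = sum(digits[0::2])
    total += sum(sum(divmod(d * 2, 10)) for d in digits[1::2])
    return total % 10 == 0

# 13- to 16-digit runs, optionally separated by spaces or hyphens
PAN_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def find_candidate_pans(text: str) -> list[str]:
    """Return digit runs that look like PANs and pass the Luhn check."""
    hits = []
    for match in PAN_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits
```

Real discovery tools add file-format awareness, sampling and track-data patterns, but even this crude filter illustrates why a scan of logs, databases and file shares routinely turns up PAN data in places nobody expected.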
This is perhaps the most important question you can ask at the first meeting with any vendor (or with your IT security staff, if you are thinking of a homegrown approach).
Without going into a great deal of detail, there are at least three widely discussed methods to generate a token. The first and, in my opinion, best method is to replace the PAN with a string of random numbers. This approach is secure, and it is not reversible.
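Conceptually, the random-number approach amounts to a token vault: the token itself carries no information about the PAN, and the only link between the two is a lookup table held inside the secured tokenization system. A toy sketch, with illustrative names of my own choosing:

```python
import secrets

class TokenVault:
    """Toy vault: random, irreversible tokens mapped to PANs server-side.
    The mapping lives only inside the (secured) vault; nothing about the
    token can be reversed mathematically to recover the PAN."""

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]      # reuse the existing token
        while True:
            # 16 random digits, so the token fits PAN-shaped database fields
            token = "".join(secrets.choice("0123456789") for _ in range(16))
            if token not in self._token_to_pan:  # avoid rare collisions
                break
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token
```

Note that all the security rests on the vault itself: the vault (and anything that can query it for PANs) stays squarely in PCI scope, which is why the question of who hosts it matters so much.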
You or a vendor could generate tokens with a one-way hash using a cryptographic salt. The resulting value should not be reversible. However, depending on the implementation, this value is not as secure as a random number. You also cannot store or transmit the truncated PAN together with the tokenized data if you take this approach.
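A hashed token might look like the following sketch (again, my own illustration, not a vendor's implementation). It also shows why pairing the token with a truncated PAN is dangerous: the same PAN always hashes to the same token, so an attacker who obtains the salt and the first six plus last four digits only has to brute-force the few remaining digits:

```python
import hashlib
import hmac
import secrets

# One secret salt/key for the whole system; without it, an attacker
# cannot precompute a dictionary of hashes for every possible PAN.
SALT = secrets.token_bytes(32)

def hash_token(pan: str, salt: bytes = SALT) -> str:
    """One-way token via keyed HMAC-SHA256. Not reversible, but
    deterministic: the same PAN always yields the same token, which is
    what makes it weaker than a purely random token."""
    return hmac.new(salt, pan.encode(), hashlib.sha256).hexdigest()
```

A keyed HMAC is generally considered stronger than a bare salted hash, but the deterministic property remains, and with it the brute-force exposure described above.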
Alternatively, you could use encryption to generate a token. But this is encryption, not truly tokenization. Because encryption is reversible, any token generated in this way would be reversible, too. If this approach is appealing, I suggest you await further guidance from the PCI Council.
Other approaches are available for generating tokens, some of which are as simple as using a sequence number to replace parts of the PAN. Whichever approach you take to generate tokens, make sure it actually removes the PAN data from your PCI scope. You don't want your database to be "compliant," you want it out of scope.
Because the whole point of tokenization is to reduce your PCI scope, the fewer systems and network segments that actually touch PAN data, the better. Therefore, the closer the tokenization process is to your POS, the more your scope can be reduced.
In some implementations, a PAN goes from the POS to the application, where a token is created (either in the application or by sending it to the tokenization vendor). This approach is good, but it is even better if the PAN can go from the POS directly to the token engine—bypassing the application entirely. Ideally, tokenization would happen all the way out at the magnetic head in the card reader.
This question gets to the heart of the PCI scoping issue. Whether you have internal tokenization or rely on a third party, the tokenization system should not send a PAN to a token recipient. Wherever a PAN is sent, that system is in your PCI scope.
Retailers considering third-party packages should review the PCI Council's guidance on encrypted data: "encrypted data may be deemed out of scope if, and only if, it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it." Applying this guidance to tokenization means that if you can ever get back to a PAN—whether electronically or by calling the vendor and having it send you a few thousand account numbers for "testing"—you may not be reducing your scope as much as you expect.
This question ties back to the first question and assumes you have identified all the places you have PAN data. Certainly, the ideal situation would be to purge your old PAN data. If, however, this is not possible, you might want to check into provisions for tokenizing those databases, too.
We will go through the rest of the list of 10 questions in my next column. In the meantime, I'd like to hear what questions you are asking. Have you implemented tokenization? What questions did you ask and what lessons did you learn? I'd like to hear your thoughts. Either leave a comment or E-mail me at [email protected].