PCI 2.0: Major Step Forward, If You Value Vagueness

As PCI officially moves next month from 1.2.1 to 2.0, a series of small changes is opening the door to more QSA-to-QSA conflicts. For some, that move is good, because it will allow for more flexibility. For others, it will aggravate long-held concerns about interpretability, where a retailer may be ordered to do diametrically different things simply by switching from one QSA to another.

According to the draft of the PCI changes the PCI Council has been circulating for comment, there are indeed no major changes in next month's updated standard. But as they say in Purchase, the CVV devil is in the details.

For example, consider changes involving how encryption keys should be handled. The current rule requires the keys to be changed "at least annually." The concern about that rule had been that it didn't make sense in many instances, where the keys needed to be changed far more often than annually.

But instead of shortening the timeframe for those changes, the new rule will require the time periods to reflect how the keys are being used. So far, so good. Says the new rule: "Verify that key-management procedures are implemented to require periodic key changes at the end of the defined cryptoperiod," which it describes as "after a defined period of time has passed and/or after a certain amount of cipher-text has been produced by a given key." That's certainly reasonable.

The next line, though, expands the ways that duration could be determined, as in "as defined by the associated application vendor or key owner, and based on industry best practices and guidelines: for example, NIST Special Publication 800-57." Which industry best practices and guidelines? NIST was offered solely as an example. What if standards are in conflict? This is a classic area where equally experienced QSAs could go in different directions.

"It introduces more ambiguity to the larger PCI world," said Jonathan Lampe, a CISSP and the product management VP at Ipswitch. "It's bordering on the circular when they talk about industry best practices, because they are the industry's best practices. That's an ambiguity that would have been better to avoid."

Or consider PCI's new risk-based guidance, which simply concedes that not all risks are equally dangerous and that applying something akin to medical triage is wise. Note, from the new version: "The ranking of vulnerabilities as defined in 6.2.a is considered a best practice until June 30, 2012, after which it becomes a requirement." Here again, the Council's intent is good and the change is most welcome.

Unfortunately, the Council opted not to get specific about where retailers should look for guidance in determining such matters. And again, this choice invites QSA quarrels. PCI's instruction is that "risk rankings should be based on industry best practices. For example, criteria for ranking 'High' risk vulnerabilities may include a CVSS base score of 4.0 or above, and/or a vendor-supplied patch classified by the vendor as critical and/or a vulnerability affecting a critical system component."
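For what a "High" ranking could look like in practice, here is a minimal sketch built only from the criteria the standard itself offers as an example; the function name and the handling of anything below "High" are assumptions, since the text stops there.

```python
def rank_vulnerability(cvss_base_score: float,
                       vendor_rates_patch_critical: bool,
                       affects_critical_component: bool) -> str:
    """Rank a vulnerability using PCI 2.0's own example criteria for 'High'.

    The 'High' conditions mirror the quoted guidance: a CVSS base score
    of 4.0 or above, and/or a vendor-supplied patch classified as
    critical, and/or a vulnerability affecting a critical system
    component. Everything below 'High' is left undefined by the text,
    so the lower tier here is a placeholder.
    """
    if (cvss_base_score >= 4.0
            or vendor_rates_patch_critical
            or affects_critical_component):
        return "High"
    return "Lower"  # hypothetical bucket; the standard stops at "High"

# Example: a CVSS 5.0 flaw on a non-critical host, with no critical
# vendor patch, still ranks High under these criteria.
print(rank_vulnerability(5.0, False, False))  # -> High
```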

Code quality is another area where the changes drift toward ambiguity. The current wording encourages applications to be developed based on secure coding guidelines. Hard to argue with that. But the new version makes it explicit that the rule "applies to all custom-developed application types in scope, rather than only Web applications." No problem there.

The current version wants developers to rely on the not-for-profit Open Web Application Security Project (OWASP), while the new version gives retailers many more choices. "As industry best practices for vulnerability management are updated (for example, the OWASP Guide, SANS CWE Top 25, CERT Secure Coding, etc.), the current best practices must be used for these requirements."

By listing three sources, and by opening the door to an unlimited number of others with that "etc.," the new wording makes conflicting interpretations inevitable. It might not even take a genuine difference in interpretation: What if one QSA simply prefers a different group's guidelines than another does? By mandating use ("current best practices must be used") while offering so many options, the rule's promised clarity may morph into chaos.

Yet another example: PCI 2.0 speaks of password management requirements for non-consumer users, but it never defines non-consumer users.

In 8.5.9, the new rule says: "For a sample of system components, obtain and inspect system configuration settings to verify that user password parameters are set to require users to change passwords at least every 90 days. For service providers only, review internal processes and customer/user documentation to verify that non-consumer user passwords are required to change periodically and that non-consumer users are given guidance as to when, and under what circumstances, passwords must change."

"They are creating a special qualification for this, but it's not clear what a non-consumer is," Ipswitch's Lampe said. "It could be a cashier or a member of the IT department. No way to tell."

Blake Huebner, the director of information security at BHI SecureConnect, took exception to the changes in the wireless scanning area. The new rule (11.1) permits the quarterly testing for wireless access points to use additional methods, adding "flexibility that methods used may include wireless network scans, physical site inspections, network access control (NAC) or wireless IDS/IPS."

Huebner thinks allowing physical site inspections is an unwelcome change. "They regressed on wireless scanning," he said, adding that physical inspections rarely add much and may distract IT staff from more effective methods for detecting rogue wireless issues.
