Tokenization for PCI Compliance

Interview with Derek Tumulak,
VP of Product Management

Tokenization has taken center stage for many organizations looking for a seemingly quick approach to data protection. While it is a viable option in the interim, or as a component of a long-term data protection strategy, organizations need to carefully consider the framework of tokenization and its implications for their overall security strategy.

Question: Simply speaking, what is tokenization?

Tokenization is an approach in which sensitive data, typically in a database, is replaced with unique surrogate information that has the same field-type properties as the original data. This unique information, or “reference,” is used to recover the sensitive data and, if implemented correctly, does not compromise the overall security of the system. All forms of data can be tokenized, including credit card numbers, Social Security numbers, medical records, and other personally identifiable information. While this approach may be suitable in some instances, tokenization by itself is not a complete data protection offering.
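
To make the mechanism concrete, here is a minimal, hypothetical sketch of a token vault in Python. The names (TokenVault, tokenize, detokenize) are illustrative, not any vendor’s API, and a production vault would also encrypt its contents and enforce strict access control:

```python
import secrets


class TokenVault:
    """Maps sensitive values to format-preserving surrogate tokens."""

    def __init__(self):
        self._token_to_value = {}   # token -> original sensitive value
        self._value_to_token = {}   # sensitive value -> issued token

    def tokenize(self, card_number: str) -> str:
        """Replace a card number with a random token of the same length
        and character class, so downstream systems that only store or
        pass the field along need no schema changes."""
        if card_number in self._value_to_token:
            return self._value_to_token[card_number]
        while True:
            # Same field type as the original: all decimal digits.
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(card_number)))
            if token not in self._token_to_value:
                break
        self._token_to_value[token] = card_number
        self._value_to_token[card_number] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value. Only systems with access to the
        vault can do this, which is what makes tokens safe to store
        elsewhere."""
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```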

Question: What are the benefits or advantages of taking a tokenization approach?

The main benefits of tokenization are that systems need not be modified to handle ciphertext, data protection is transparent to systems that merely store or pass the data along, and systems that never access the original data are taken out of the scope of compliance audits.
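
As one illustration of that transparency, a token often preserves the parts of a field that downstream systems display, such as the last four digits of a card number, so receipt printers and lookup screens keep working unchanged. The helper below is a hypothetical sketch, not any specific product’s behavior:

```python
import secrets


def format_preserving_token(pan: str) -> str:
    """Return a surrogate that keeps the length and the last four
    digits of the card number, randomizing the rest."""
    random_part = "".join(secrets.choice("0123456789")
                          for _ in range(len(pan) - 4))
    return random_part + pan[-4:]


print(format_preserving_token("4111111111111111"))  # e.g. '9273645102841111'
```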

Question: What are pitfalls or key challenges that organizations should be aware of?

The primary challenge with tokenization is that every function of every system needs to be evaluated; there is no simple, phased approach to securing your environment. It is an all-or-nothing approach, whereas traditional data protection allows you to pilot your implementation in phases and demonstrate incremental success. Many people view tokenization as a silver bullet for protecting data or achieving compliance, when, in reality, it is an implementation approach that should be considered one possible part of your overall data protection strategy. Tokenization only makes sense when you integrate encryption, secure key management, and access control, as they are critical components of a comprehensive data protection strategy. Ultimately, there are still many instances where data needs to be moved throughout an enterprise and accessed; these are challenges that tokenization does not really solve.

Question: Who is using tokenization?

Several organizations are using tokenization today, many of them focused on protecting credit card information for PCI compliance. Over the last five years, SafeNet solutions have been deployed in several large retail environments, employing tokenization as part of an overall data protection strategy. We continue to help customers deploy our technology, as well as provide guidance on building both tactical and strategic plans for data protection, which may include tokenization.

Question: How does tokenization address compliance requirements?

Tokenization addresses compliance by consolidating the stored sensitive information in a single location, which in turn takes many of the “tokenized” systems out of scope from an audit perspective. Customers still need to make sure that the centralized sensitive data is protected properly with access control, encryption, and strong key management.
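
A hedged sketch of that point, assuming the third-party Python cryptography package: because the vault centralizes the sensitive data, the vault itself must be protected, with the encryption key held in a key manager rather than beside the data, and detokenization gated by access control. All names here are illustrative:

```python
from cryptography.fernet import Fernet


class EncryptedVault:
    def __init__(self, key: bytes):
        # In practice this key comes from a key-management service or
        # HSM, never hard-coded or stored alongside the token mapping.
        self._cipher = Fernet(key)
        self._store = {}  # token -> encrypted card number

    def put(self, token: str, card_number: str) -> None:
        self._store[token] = self._cipher.encrypt(card_number.encode())

    def get(self, token: str, caller_is_authorized: bool) -> str:
        # Detokenization is the sensitive operation, so it must be
        # gated by access control and audited.
        if not caller_is_authorized:
            raise PermissionError("detokenization denied")
        return self._cipher.decrypt(self._store[token]).decode()


vault = EncryptedVault(Fernet.generate_key())
vault.put("9273645102841111", "4111111111111111")
print(vault.get("9273645102841111", caller_is_authorized=True))
```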

Question: What’s the future of tokenization?

Tokenization is an implementation option for organizations looking to achieve compliance and implement overall security best practices. Customers really need to look at their overall data protection strategy, and where encryption, key management, data loss prevention (DLP), access control, and tokenization come into play both now and in the future. Many of these technologies solve a short-term need by “retrofitting” existing environments to provide enterprise security.

The best solutions are the ones that can solve an immediate need today but can also play a role in a longer-term strategic approach. This involves influencing, and becoming a part of, third-party vendors’ product architectures, and working with the industry as a whole to change the way people look at managing and securing data, as opposed to networks or infrastructure. The bottom line is that tokenization, while a valid approach, is only one of several pieces of the overall data protection puzzle.
