SafeNet Tokenization Manager protects sensitive data that enters an organization and facilitates compliance with regulations such as PCI DSS and HIPAA by reducing regulatory scope and audit costs.
How Tokenization Works
Tokenization is the process of replacing sensitive data (primary account numbers, Social Security numbers, etc.) with a surrogate value, called a token. Tokenization significantly reduces the risk of data exposure and data sprawl, because the sensitive data is stored in a central token vault in encrypted form.
Every token that is issued represents a single unique string of sensitive data. Assigning one token to each original Primary Account Number (PAN) lets merchants reuse the same token whenever that PAN appears in a transaction.
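The one-token-per-PAN behavior described above can be sketched with a minimal in-memory vault. The class and its storage are illustrative only (a real Token Vault encrypts the stored PANs and runs its crypto inside KeySecure); the point is that tokenizing the same PAN twice returns the same multi-use token, and only the vault can map a token back to the PAN:

```python
import secrets

class TokenVault:
    """Minimal in-memory sketch of a multi-use token vault.
    Illustrative only: a production vault stores PANs encrypted."""

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Multi-use token: the same PAN always yields the same token.
        if pan not in self._pan_to_token:
            token = secrets.token_hex(8)
            self._pan_to_token[pan] = token
            self._token_to_pan[token] = pan
        return self._pan_to_token[pan]

    def detokenize(self, token: str) -> str:
        # Only the vault can recover the original PAN from a token.
        return self._token_to_pan[token]

vault = TokenVault()
t1 = vault.tokenize("4111111111111111")
t2 = vault.tokenize("4111111111111111")
assert t1 == t2                                   # same PAN, same token
assert vault.detokenize(t1) == "4111111111111111"
```

Because downstream systems only ever see the token, a compromise of those systems exposes no cardholder data.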
SafeNet Tokenization Manager complies with the PCI Tokenization Guidelines (published August 2011) and the VISA Tokenization Best Practices.
SafeNet's Tokenization Manager Highlights:
Format Preserving Tokenization
Format Preserving Tokenization (FPT) uses tokens that preserve the length and format of the sensitive data, so legacy databases require no schema changes to support the tokenization process.
Tokenization Manager FPT supports multiple formats of credit card numbers, SSNs, and other PII, as well as alphanumeric data. It complies with the PCI DSS guidelines for token/PAN distinguishability, achieved through Luhn algorithm enforcement.
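One common way to achieve the distinguishability mentioned above is to generate tokens that keep the PAN's length and digits-only format but deliberately fail the Luhn checksum, so a token can never be mistaken for a real PAN. A hedged sketch of that idea (the function names, and the choice to preserve the last four digits, are illustrative assumptions, not SafeNet's actual algorithm):

```python
import random

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum: double every second digit from the right,
    subtract 9 from any result over 9, and require the sum to be mod-10 zero."""
    total = 0
    for i, d in enumerate(reversed(number)):
        n = int(d)
        if i % 2 == 1:
            n *= 2
            if n > 9:
                n -= 9
        total += n
    return total % 10 == 0

def fpt_token(pan: str) -> str:
    """Hypothetical format-preserving token: same length, all digits,
    last four preserved, guaranteed to FAIL the Luhn check."""
    while True:
        body = "".join(random.choice("0123456789") for _ in range(len(pan) - 4))
        token = body + pan[-4:]
        if not luhn_valid(token):
            return token

pan = "4111111111111111"          # well-known test PAN; passes Luhn
token = fpt_token(pan)
assert len(token) == len(pan) and token[-4:] == pan[-4:]
assert luhn_valid(pan) and not luhn_valid(token)
```

Because roughly nine out of ten random digit strings fail the Luhn check, the retry loop terminates almost immediately, and any system that validates PANs with Luhn will reject the token on sight.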
Scalability and Elasticity
Tokenization Manager is designed for scalability and elasticity, enabling organizations to implement the solution cost-effectively:
- Clustered deployment ensures high availability and scalability
- Multiple Tokenization Manager instances (on physical or virtual servers) can share a single Token Vault, avoiding token collisions
- Elasticity is achieved by deploying a variable number of instances or hardware servers, depending on transaction volume
- Targeted to enterprises and service providers
- Suitable for merchants to support “Peak Traffic Days” in a cost-effective way
Security and Robustness
To ensure a more secure solution, all Tokenization Manager cryptographic operations are performed within SafeNet KeySecure, a robust key-management and crypto-offload appliance.
Tokenization Manager, in conjunction with KeySecure and Crypto Pack, provides:
- Secure key-vault
- Trusted execution environment for all cryptographic operations
- Single interface for logging, auditing, and reporting access to protected data, keys, and tokens
- Support for key-rotation functionality for Token Vault encryption keys
- Support for single and multi-use tokens
- Compliance with NIST SP 800-57 key-management guidelines and with PCI DSS key-management requirements
Tokenization as a Service (TaaS)
SafeNet Tokenization Manager enables financial service providers and payment acquirers to expand their offering and create a new revenue stream by offering Tokenization as a Service to their customers.
By deploying Tokenization Manager and DataSecure on their premises, service providers can offer customers a full set of encryption and tokenization services, taking much of the customer's organization out of regulatory scope and sharply reducing PCI DSS auditing costs.
- SafeNet's Tokenization Manager solution fully complies with PCI DSS requirements
- API web services allow easy integration and clear segmentation between CDE and non-CDE systems
- An elastic deployment and business model that fits service-provider environments and pricing
- Support for separate Token Vaults for different merchants
Tokenization has taken center stage for many organizations looking for a seemingly quick approach to data protection. While it is a viable option in the interim, or as a component of a long-term data protection strategy, organizations need to carefully consider the framework of tokenization and its implications for their overall security strategy.
Simply speaking, what is tokenization?
Tokenization is an approach where sensitive data is replaced, typically in a database, with unique information that holds similar field-type properties as the original data. The unique information or “reference” is used to recover the sensitive information and, if implemented correctly, does not compromise the overall security of the system. All forms of data can be tokenized, including credit cards, Social Security numbers, medical records, personally identifiable information, etc. While this approach may be suitable in some instances, tokenization, in itself, is not a complete data protection offering.
What are the benefits or advantages of taking a tokenization approach?
The main benefits of tokenization are that systems need not be changed to handle ciphertext, data protection is transparent to systems that do not actually process the sensitive data, and systems that never access that data fall outside the scope of compliance audits.
What are pitfalls or key challenges that organizations should be aware of?
The primary challenge with tokenization is that every function of every system must be evaluated; there is no simple, phased path to securing your environment. It is an all-or-nothing approach, whereas traditional data protection lets you pilot your implementation in phases and demonstrate incremental success. Many people view tokenization as a silver bullet for protecting data or achieving compliance when, in reality, it is an implementation approach that should be considered one possible part of your overall data protection strategy. Tokenization only makes sense when you integrate encryption, secure key management, and access control, as these are critical components of a comprehensive data protection strategy. Ultimately, there are still many instances where data needs to be moved throughout an enterprise and accessed; these are challenges that tokenization does not really solve.
Who is using tokenization?
Several organizations use tokenization today, many of them focused on protecting credit card information for PCI compliance. Over the last five years, SafeNet has been deployed in several large retail environments that employ tokenization as part of their overall data protection strategy. We continue to help customers deploy our technology, as well as provide them with guidance on building both tactical and strategic plans for data protection, which may include tokenization.
How does tokenization address compliance requirements?
Tokenization addresses compliance by moving the stored sensitive information to a single location, which in turn takes several of the “tokenized” systems out of scope from an audit perspective. Customers still need to make sure the sensitive data itself is properly protected with access control, encryption, and strong key management.
What’s the future of tokenization?
Tokenization is an implementation option for organizations looking to achieve compliance and implement overall security best practices. Customers really need to look at their overall data protection strategy, and where encryption, key management, DLP, access control, and tokenization come into play both now and in the future. Many of these technologies solve a short-term need by “retrofitting” existing environments to provide enterprise security.
The best solutions are the ones that can solve an immediate need today but can also play a role in a longer term strategic approach. This involves influencing and becoming a part of product architectures for third-party vendors, and working with the industry as a whole to change the way people look at managing and securing data, as opposed to networks or infrastructure. The bottom line is that tokenization, while a valid approach, is only one of several pieces of the overall data protection puzzle.