Tokenization takes sensitive data values and replaces them with substitute values (tokens) of the same size and type. For example, a system that expects a 16-digit credit card number or a 9-digit Social Security number will receive a 16- or 9-digit token instead. These tokens reference the sensitive data but are not sensitive themselves; the original values are encrypted and stored in the tokenization system.
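The idea can be illustrated with a minimal sketch. This hypothetical `TokenVault` class (the name and design are illustrative, not a real product's API) maps random, format-preserving digit tokens to original values; a production system would encrypt and persist the vault rather than hold plaintext in memory.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault. A real tokenization system
    would encrypt the stored values and persist them securely."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        # Generate a random token with the same length and character
        # type (all digits) as the original value, retrying on the
        # unlikely event of a collision.
        while True:
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(value)))
            if token not in self._token_to_value:
                break
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization system can map a token back to the
        # original sensitive value.
        return self._token_to_value[token]

vault = TokenVault()
card = "4111111111111111"          # 16-digit test card number
token = vault.tokenize(card)
print(len(token), token.isdigit())  # token matches size and type
print(vault.detokenize(token) == card)
```

Because the token has the same length and character class as the original, downstream systems that validate field formats continue to work without modification.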
Tokenization can reduce the scope of PCI compliance: by replacing sensitive data with tokens, many applications and databases may no longer need to comply with PCI requirements, making audits easier and saving time and resources.
In addition, tokenization may perform faster than encryption in some cases. For example, an application that processes encrypted data may have to decrypt it first so that its logic isn't broken, which takes time; tokenized data can flow through the application without that step.
Encryption and tokenization solutions often coexist: encryption can protect sensitive data in transit, while tokenization protects sensitive data at rest.