Stefan Beissel, Ph.D., CISA, CISSP
The handling of sensitive data requires compliance with standards and laws that impose high demands on data security. Tokenization can restrict this handling: companies that process sensitive data do not always need the actual data content in every processing step; sometimes only the unique identification of the data is required. Tokenization replaces sensitive data with unique strings that cannot be converted back to the original by an algorithm. Systems that use only these strings no longer handle sensitive data.
Tokens can be generated with different techniques, such as encryption, hashing and number generation. Tokens generated with encryption can be converted back to their original state; thus, encryption techniques are less suitable for generating tokens. Hashing creates a digital fingerprint that is generally unique, but depending on the hashing algorithm used, collisions may occur and the uniqueness of the token is then no longer ensured. Other techniques for generating tokens are the use of a serial number or a random number. In principle, any string may be used as a token as long as it allows unique identification, produces almost no collisions and cannot be converted back to its original state by an algorithm.
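The random-number approach can be sketched as follows. This is a minimal illustration, not a production design; the class and method names are hypothetical. A random token is not algorithmically reversible, so uniqueness must be enforced by checking each candidate against the tokens already issued, and original values can be recovered only through the vault's lookup table.

```python
import secrets

class TokenVault:
    """Illustrative token vault using random-number token generation."""

    def __init__(self):
        self._token_to_data = {}  # token -> sensitive value
        self._data_to_token = {}  # sensitive value -> token (multi-use tokens)

    def tokenize(self, sensitive_value: str) -> str:
        # Return the existing token so the same value always maps to one token.
        if sensitive_value in self._data_to_token:
            return self._data_to_token[sensitive_value]
        # Draw random tokens until one is unique; collisions are extremely
        # unlikely at this length but must be handled to guarantee uniqueness.
        while True:
            token = secrets.token_hex(16)
            if token not in self._token_to_data:
                break
        self._token_to_data[token] = sensitive_value
        self._data_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back; no algorithm can.
        return self._token_to_data[token]
```

Because the token is drawn at random rather than derived from the data, there is nothing for an attacker to invert, which is the property that distinguishes this technique from encryption.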
An exemplary use case for a tokenization system is the integration of an e-commerce merchant who accepts credit card payments through a web store. It is most advantageous for the merchant to keep payment data outside of his/her network so that he/she is not bound to the regulations of the payment card industry. In a token-based method, the merchant must ensure that the web session is redirected to the systems of an external payment processor, e.g., by using a plug-in, before the customer enters the payment information. When customers enter their cardholder data, the data are sent directly to the processor, who operates a tokenization system. The processor assigns the cardholder data in the tokenization system to a multiusable token and sends the token to the merchant.
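The division of responsibilities in this flow can be sketched as below. The function names and data structures are illustrative assumptions, not a real processor API: the point is that cardholder data reaches only the processor's vault, while the merchant stores nothing but the token.

```python
import secrets

# Held only on the processor's side: token -> cardholder data.
PROCESSOR_VAULT = {}

def processor_receive_card(pan: str) -> str:
    """Processor side: vault the card number and return a multi-use token."""
    for token, stored in PROCESSOR_VAULT.items():
        if stored == pan:
            return token  # multi-use: the same card yields the same token
    token = secrets.token_urlsafe(24)
    PROCESSOR_VAULT[token] = pan
    return token

def merchant_record_order(order_id: str, token: str, orders: dict) -> None:
    """Merchant side: persist only the token, keeping card data out of scope."""
    orders[order_id] = {"payment_token": token}
```

For repeat purchases the merchant can resubmit the stored token to the processor, so recurring billing works without the card number ever entering the merchant's systems.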
By using tokenization, the scope of systems that handle sensitive data and, therefore, must meet compliance and audit requirements can be reduced. It facilitates a more restrictive handling of sensitive data without adjusting business processes. Hence, tokenization offers not only a security improvement, but also potential savings.
Read Stefan Beissel’s recent Journal article:
“Meeting Security and Compliance Requirements Efficiently With Tokenization,” ISACA Journal, volume 1, 2014.