What is random tokenization?
Random tokenization is a method of securing sensitive data so that, in the event of a breach or attack, the stolen data has no value. It masks sensitive information by generating a random string (a token) to stand in for the underlying data, with no mathematical relationship between the two; the token itself isn't encrypted and has no meaning on its own.
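As a minimal sketch of that idea, assuming Python's standard secrets module, a token can be drawn purely at random, independent of the value it will later represent:

```python
import secrets

def generate_token(length: int = 16) -> str:
    """Generate a random token with no mathematical relationship to any data.

    The token comes from a cryptographically secure random source, so it
    cannot be reversed, decrypted, or otherwise traced back to the original value.
    """
    return secrets.token_urlsafe(length)

# Tokenizing the same (hypothetical) card number twice yields two unrelated tokens.
card_number = "4111 1111 1111 1111"
print(generate_token())  # e.g. 'Qy3bR0...'
print(generate_token())  # e.g. 'zP9kWv...'
```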
Are tokens encrypted?
Tokens themselves aren't encrypted, but they stand in for actual values, which are. Because a token is random and not mathematically related to the value it represents, no information can be derived from it without access to a token vault, where the token-to-value mappings are stored and secured, commonly by encryption.
Lookups against the token vault are typically authenticated, so the underlying information is not exposed while a token is being verified. Random tokenization is widely used; credit institutions, for example, commonly use it to mask card numbers.
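To make the vault concept concrete, here is a minimal sketch in Python, assuming the third-party cryptography package for encryption; the class name TokenVault and the in-memory dictionary are illustrative choices, and key management and authentication are omitted:

```python
import secrets
from cryptography.fernet import Fernet  # third-party 'cryptography' package

class TokenVault:
    """Illustrative token vault: maps random tokens to encrypted values."""

    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())  # key management omitted for brevity
        self._store: dict[str, bytes] = {}            # token -> encrypted value

    def tokenize(self, value: str) -> str:
        """Store the encrypted value under a new random token and return the token."""
        token = secrets.token_urlsafe(16)
        self._store[token] = self._fernet.encrypt(value.encode())
        return token

    def detokenize(self, token: str) -> str:
        """Look up and decrypt the original value; unknown tokens raise KeyError."""
        return self._fernet.decrypt(self._store[token]).decode()

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
print(token)                    # random string, safe to pass around or store
print(vault.detokenize(token))  # original value, recoverable only through the vault
```

The key point the sketch illustrates is that the token reveals nothing by itself: recovering the original value requires both access to the vault's mapping and the key used to encrypt the stored values.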

Devron is a next-generation federated learning and data science platform that enables decentralized analytics. Learn more about our solutions, read more of our knowledge base articles about our federated learning platform, or schedule a demo with us today.