Tokenization

Tokenization replaces sensitive data with non-sensitive placeholder tokens that preserve the original data's format while being meaningless if stolen or exposed. The mapping between each token and its original value is kept in a secure lookup store (often called a token vault), so only systems with access to the vault can recover the real data.
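A minimal sketch of the idea, using a hypothetical in-memory vault and a 16-digit card number as the sensitive value. Real tokenization systems use hardened vaults or cryptographic format-preserving encryption; the names `tokenize`, `detokenize`, and `_vault` here are purely illustrative.

```python
import secrets

# Illustrative in-memory vault: maps each generated token back to the
# original sensitive value. A production system would use a secured,
# access-controlled data store instead.
_vault: dict[str, str] = {}


def tokenize(pan: str) -> str:
    """Replace a digit string with a random token of the same length
    and character set, so downstream systems that expect the original
    format keep working."""
    token = "".join(secrets.choice("0123456789") for _ in pan)
    _vault[token] = pan
    return token


def detokenize(token: str) -> str:
    """Recover the original value; only callers with vault access can do this."""
    return _vault[token]
```

Because the token is random, it reveals nothing about the original number if compromised, yet it still looks like a valid 16-digit value to systems that only need the format.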