Enable Tokenisation

1 vote

Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
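
A minimal sketch of the concept, assuming a hypothetical in-memory vault as the "tokenization system" (not this product's implementation):

```python
# Hypothetical illustration of tokenization: random tokens stand in for
# sensitive values, and only the vault can map a token back.
import secrets


class TokenVault:
    """Maps meaningless random tokens back to the original sensitive values."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it has no intrinsic or exploitable
        # meaning; the mapping lives only inside the vault.
        token = secrets.token_urlsafe(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # e.g. a card number
print(token)                    # safe to store or pass downstream
print(vault.detokenize(token))  # original value, recoverable only via the vault
```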

Status: Under consideration · Category: Miscellaneous · Suggested by: Andrew Dribbell · Upvoted: 16 Oct, '20
Comments: 0