Tokenization

Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
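To make the mapping concrete, here is a minimal sketch of a token vault in Python. It is illustrative only and assumes an in-memory store; the TokenVault class, its method names, and the sample card number are invented for this example and do not reflect any particular vendor's tokenization API. A production system would keep the mapping in a hardened, access-controlled vault.

```python
import secrets


class TokenVault:
    """Minimal in-memory token vault (illustrative sketch only).

    Maps randomly generated, non-sensitive tokens back to the original
    sensitive values, so the token itself carries no exploitable meaning.
    """

    def __init__(self):
        self._token_to_value = {}   # token -> sensitive value
        self._value_to_token = {}   # sensitive value -> existing token

    def tokenize(self, sensitive_value: str) -> str:
        # Reuse the existing token if this value was already tokenized.
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is random, so it has no mathematical relationship
        # to the sensitive value it stands in for.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization system can map the token back.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # hypothetical card number
print(token)                                   # safe to store or transmit
print(vault.detokenize(token))                 # recovers the original value
```

The key design point the sketch highlights is that, unlike encryption, the token is not derived from the sensitive value at all; recovering the original data requires a lookup in the tokenization system itself.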
Turkey Corporates Handed The eBanking Key: QR Codes
January 21, 2016  |  B2B Payments

Corporations may want the same kind of fast, technologically advanced banking solutions as consumers do, but corporate banking also has more demanding thresholds for security...

The Rest Of The Payments 2015 By The Letter: S Through Z
December 09, 2015  |  Karen Webster

A was for Apple Pay, B was for bitcoin and C was for cross-border. OK, maybe you saw those ones coming, but whatever could the...
