
Moving Beyond Tokenization’s ‘Automagic’ Buzz

Beware complacency in anything, but especially in data security. Technology has a way of lulling people into a false sense of security, promising a “set it and forget it” process that can be dangerous.

So it is with tokenization, said GEOBRIDGE CTO Jason Way in a recent edition of “Beyond the Buzzword” with PYMNTS’ Karen Webster.

“The word tokenization conveys a meaning of a ‘perfectly safe to share’ … alternative to a real value,” Way said.

Too many business leaders grab onto the concept, and some technology, then say to customers and partners alike that sensitive data has been protected — indeed, that all stakeholders are protected — in a way that Way described, tongue in cheek, as “automagically.”

However, protecting data in a true, real-time fashion, he continued, involves many considerations. Beyond the buzzword “tokenization,” with its connotations of data rendered inscrutable and untouchable, lies a sea of moving parts.


“There are a lot of interdependencies,” he told Webster. “Unfortunately, a single word does not convey all of the important aspects of what needs to occur with tokenization.”

The moving parts span the continuum of how a token is generated, the value it needs to replace, the protections that are in place for that value, how the tokens get where they need to go and what happens when they get there.

The Protection Gap And The Standards Gap

Sensitive operations must occur in perfect harmony for tokenization to prove effective, but that synchronization is rare. Call it a gap between mere replacement and real protection.

The gap exists because there are no real standards governing tokenization at all, and no uniform enforcement of how tokens are implemented or utilized. The Wild West aspect of tokenization, where many companies have staked claims to offer protection but deliver only replacement, stands in stark contrast to the payments industry at large.

That’s because, when it comes to card payments and the PCI Security Standards Council (PCI SSC), requirements abound on how sensitive data is created, maintained and protected, said Way. In the debit space (PCI PIN) and in point-to-point encryption (PCI P2PE), for example, the standards stretch across the implementation and mandate hardware to govern security.

As it stands now, tokenization in a world without standards becomes one where “it’s very much an à la carte in how these different terms and aspects come together,” he said, and inefficiencies and vulnerabilities become entrenched.

Start With The Concept

Finding a way toward standardization demands uniformity in concepts and in the implementation of those concepts. Way told Webster that, as a jumping-off point, “users must truly appreciate the differences between tokenization and encryption, and that means they must first digest that tokenization is worthless without encryption.” A surrogate value must be shared between parties in place of a real value.
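
To make that distinction concrete, here is a minimal, software-only sketch in Python (using the standard secrets module and the third-party cryptography package; the vault store and function names are illustrative, not anyone's product API). The token is a random surrogate, while the real value survives only in encrypted form. As Way argues below, production keys and cryptographic operations belong inside tamper-responsive hardware, so treat this as a conceptual picture rather than a pattern to deploy.

```python
# Conceptual sketch only: it illustrates why "tokenization is worthless without encryption."
# Per Way, production keys and crypto operations belong in tamper-responsive hardware, not here.
import secrets

from cryptography.fernet import Fernet  # third-party: pip install cryptography

key = Fernet.generate_key()   # encryption key (would live inside an HSM in production)
cipher = Fernet(key)
vault = {}                    # token -> encrypted real value (illustrative in-memory store)

def tokenize(real_value: str) -> str:
    token = secrets.token_urlsafe(16)                    # random surrogate, no link to the real value
    vault[token] = cipher.encrypt(real_value.encode())   # the real value is protected, not just removed
    return token                                         # only the token is shared between parties

def detokenize(token: str) -> str:
    return cipher.decrypt(vault[token]).decode()

pan_token = tokenize("4111111111111111")
print(pan_token)              # safe to share
print(detokenize(pan_token))  # recoverable only by a party holding the key
```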

He traced tokenization’s genesis to earlier efforts to satisfy PCI DSS requirements.

“The industry initially scrambled to learn how to recognize and eliminate the unnecessary presence of sensitive cardholder data. With that as a primary motivational factor (removing the presence of data you can be fined for having), the focus has always been limited to that concept. Users were primarily concerned with removal as opposed to protection,” Way said.

However, he continued, data protection and removal are starkly different concepts. In recent times, the mindset has shifted a bit to one where data is shared while simultaneously being protected, rather than simply being removed from the equation.

“And if a particular real value is sensitive enough to warrant the use of a token, instead of the real value,” said Way, “then the real value should be protected using the strongest technique available.”

Way noted that the strongest technique available is encryption, but he also cautioned against a mindset that encryption is the same across hardware and software. In fact, there’s a world of difference between the two.

“Software-based encryption is kind of like building a house out of six-inch steel, and then leaving the deadbolt key in all the doors,” he offered as an illustration. “The big bad wolf won’t blow it down because he doesn’t have to; he can just turn the key and walk right inside.”

Don’ts … And Dos

Way pointed to one key vulnerability in how today’s tokenization processes are put into practice.

Keys must only exist inside tamper-responsive hardware, he said, while encryption and decryption functions should only occur in the same tamper-responsive hardware. Too many solutions have keys and processes running in software, offering hackers a tempting target, as they can steal the key and the ostensibly protected data.
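
As a rough sketch of that boundary, the snippet below defines a hypothetical HsmTokenizer interface in Python, a stand-in for whatever vendor SDK or PKCS#11 wrapper a real deployment would use. Application code holds only a key identifier and passes plaintext across the boundary; the key material itself never appears in application memory.

```python
# Hypothetical boundary sketch: application code never touches key bytes.
# A real deployment would implement this against an HSM vendor SDK or a PKCS#11 wrapper.
from typing import Protocol

class HsmTokenizer(Protocol):
    """Facade for tamper-responsive hardware: keys exist and are used only inside it."""

    def encrypt(self, key_id: str, plaintext: bytes) -> bytes: ...
    def decrypt(self, key_id: str, ciphertext: bytes) -> bytes: ...

def protect(real_value: str, hsm: HsmTokenizer) -> bytes:
    # Only a key *identifier* appears in application code; the key itself never leaves the hardware.
    return hsm.encrypt(key_id="token-vault-key", plaintext=real_value.encode())
```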

With a nod to the plethora of offerings that exist when it comes to keys, Way said that some key management solutions can lessen the burden, yet existing solutions still carry unnecessary risk.

He noted that a handful of solution providers have attempted to leverage an “integration” with hardware manufacturers. However, in all the solutions he’s seen to date, the hardware is leveraged only to protect a “master” key, while working keys continue to be generated and exposed in software. He said of his own firm that “we’ve reapplied our decades of experience focusing on hardware encryption compliance standards, and applied these protection techniques to the discipline of tokenization.”

With its recently introduced TokenBRIDGE solution, GEOBRIDGE has leveraged RESTful APIs to enable secure TLS 1.2 connection profiles that interface with the solution and eliminate client software requirements.
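
GEOBRIDGE's actual TokenBRIDGE interface isn't documented here, but the pattern described, a RESTful call over a connection pinned to TLS 1.2 with nothing required beyond a standard HTTPS stack, looks roughly like the Python sketch below. The host name, path, payload fields and credential are placeholders.

```python
# Illustrative only: a RESTful tokenization request pinned to TLS 1.2.
# The host, path, payload fields and credential are placeholders, not the TokenBRIDGE API.
import json
import ssl
from http.client import HTTPSConnection

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse anything older than TLS 1.2
context.maximum_version = ssl.TLSVersion.TLSv1_2   # and, for this sketch, pin to exactly TLS 1.2

conn = HTTPSConnection("tokenization.example.com", context=context)
conn.request(
    "POST",
    "/v1/tokenize",                                 # placeholder path
    body=json.dumps({"value": "4111111111111111"}),
    headers={"Content-Type": "application/json", "Authorization": "Bearer <api-key>"},
)
response = conn.getresponse()
print(response.status, response.read().decode())
```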

Searching For Standards

“A safe token is one that is truly random and has no link whatsoever to the real value it represents,” Way said.

He told Webster that such randomness does not govern how tokens are produced today, and that tokens currently in the field carry vulnerabilities because their generation falls short of true randomization. Those who would seek to protect data through tokenization, he said, would do well to notice that the payments industry already has standards tied to such randomization efforts.

Consider that NIST certifies cryptographic modules, including their random number generators, to FIPS 140-2 Level 3; such modules are prevalent in hardware encryption solutions and can help guarantee a token’s uniqueness.
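
The practical difference is visible even in a toy comparison. The Python sketch below is purely illustrative: it draws randomness from the operating system's CSPRNG, whereas a compliant solution would rely on a FIPS 140-2 Level 3 validated hardware random number generator. The point is simply that a token derived from the real value can be correlated back to it, while a truly random token cannot.

```python
# Toy comparison: a derived token is linkable to the value it replaces; a random one is not.
# A compliant solution would draw randomness from FIPS 140-2 Level 3 validated hardware,
# not from the operating-system CSPRNG used here.
import hashlib
import secrets

def linkable_token(pan: str) -> str:
    # Deterministic derivation: anyone who can enumerate candidate PANs can re-create the link.
    return hashlib.sha256(pan.encode()).hexdigest()[:16]

def random_token() -> str:
    # Truly random surrogate: carries no information about the real value it stands in for.
    return secrets.token_hex(8)

pan = "4111111111111111"
print(linkable_token(pan))   # the same PAN always yields the same token, so tokens correlate
print(random_token())        # unrelated to the PAN; the mapping must live in a protected vault
```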

Looking to other corners of the payments industry for standardization makes sense, Way told Webster, because, after all: “Nobody needs to go much farther than the standards they are employing for different services. Why shouldn’t we protect everything the same?”
