Tokenization takes the lead in the fight for data security


Presented by Capital One Software


Tokenization is emerging as a cornerstone of modern data security, helping businesses separate the value of their data from its risk. During this VB in Conversation, Ravi Raghu, president, Capital One Software, talks about the ways tokenization can help reduce the value of breached data and preserve underlying data format and usability, including Capital One’s own experience leveraging tokenization at scale.

Tokenization, Raghu asserts, is far superior to alternative protection methods. It converts sensitive data into a nonsensitive digital replacement, called a token, that maps back to the original, which is secured in a digital vault. The token placeholder preserves both the format and the utility of the sensitive data, and can be used across applications — including AI models. Because tokenization removes the need to manage encryption keys or dedicate compute to constant encrypting and decrypting, it offers one of the most scalable ways for companies to protect their most sensitive data, he added.
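As a rough illustration of the vault-based flow described above, the sketch below (the names and structure are my own, not Capital One's implementation) swaps a Social Security number for a format-preserving token and keeps the real value in a vault, so an intercepted token reveals nothing:

```python
import secrets

class TokenVault:
    """Toy vault-based tokenizer (illustrative, not a production design).

    Assumes the value contains at least one digit, e.g. an SSN.
    """

    def __init__(self):
        self._token_to_value = {}   # the "vault": token -> original value
        self._value_to_token = {}   # reuse the same token for repeat values

    def _surrogate(self, value: str) -> str:
        # Format-preserving surrogate: swap each digit for a random digit,
        # keep separators so downstream systems still accept the field.
        return "".join(secrets.choice("0123456789") if ch.isdigit() else ch
                       for ch in value)

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = self._surrogate(value)
        # Regenerate on the (vanishingly rare) collision or identity case.
        while token in self._token_to_value or token == value:
            token = self._surrogate(value)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
ssn = "123-45-6789"
tok = vault.tokenize(ssn)
assert tok != ssn and len(tok) == len(ssn) and tok[3] == tok[6] == "-"
assert vault.detokenize(tok) == ssn   # only the vault can reverse the mapping
assert vault.tokenize(ssn) == tok     # repeat values reuse the same token
```

Note how the token keeps the `NNN-NN-NNNN` shape: downstream applications and models can consume it unchanged, while the vault remains the only path back to the real value.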

"The killer part, from a security standpoint, when you think about it relative to other methods, if a bad actor gets hold of the data, they get hold of tokens," he explained. "The actual data is not sitting with the token, unlike other methods like encryption, where the actual data sits there, just waiting for someone to get hold of a key or use brute force to get to the real data. From every angle this is the ideal way one ought to go about protecting sensitive data."

The tokenization differentiator

Most organizations are just scratching the surface of data security, adding security at the very end, when data is read, to prevent an end user from accessing it. At minimum, organizations should focus on securing data on write, as it’s being stored. But best-in-class organizations go even further, protecting data at birth, the moment it’s created.

At one end of the safety spectrum is a simple lock-and-key approach that restricts access but leaves the underlying data intact. More advanced methods, like masking or modifying data, permanently alter its meaning — which can compromise its usefulness. File-level encryption provides broader protection for large volumes of stored data, but when you get down to field-level encryption (for example, a Social Security number), it becomes a bigger challenge. It takes a great deal of compute to encrypt a single field, and then to decrypt it at the point of usage. And still it has a fatal flaw: the original data is still right there, only needing the key to get access.
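A toy example of the masking trade-off mentioned above: static masking is irreversible and collapses distinct values together, which is exactly what breaks downstream joins and analytics. (The function is my own sketch, not a method named in the article.)

```python
def mask(value: str) -> str:
    """Static masking: keep the last 4 characters, overwrite the rest.

    Irreversible by design, so the original can never be recovered.
    """
    return "*" * (len(value) - 4) + value[-4:]

a = mask("123-45-6789")
b = mask("987-65-6789")
assert a == "*******6789"
# Two different people now collide on the masked value, so joins,
# deduplication, and modeling on this field no longer work.
assert a == b
```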

Tokenization avoids these pitfalls by replacing the original data with a surrogate that has no intrinsic value. If the token is intercepted — whether by the wrong person or the wrong machine — the data itself remains secure.

The business value of tokenization

"Fundamentally you’re protecting data, and that’s priceless," Raghu said. "Another thing that’s priceless – can you use that for modeling purposes subsequently? On the one hand, it’s a protection thing, and on the other hand it’s a business enabling thing."

Because tokenization preserves the structure and ordinality of the original data, it can still be used for modeling and analytics, turning protection into a business enabler. Take private health data governed by HIPAA, for example: tokenization means that data can be used to build pricing models or for gene therapy research while remaining compliant.

"If your data is already protected, you can then proliferate the usage of data across the entire enterprise and have everybody creating more and more value out of the data," Raghu said. "Conversely, if you don’t have that, there’s a lot of reticence for enterprises today to have more people access it, or have more and more AI agents access their data. Ironically, they’re limiting the blast radius of innovation. The tokenization impact is massive, and there are many metrics you could use to measure that – operational impact, revenue impact, and obviously the peace of mind from a security standpoint."

Breaking down adoption barriers

Until now, the fundamental challenge with traditional tokenization has been performance: AI demands unprecedented scale and speed. That's one of the major challenges Capital One addresses with Databolt, its vaultless tokenization solution, which can produce up to 4 million tokens per second.

"Capital One has gone through tokenization for more than a decade. We started doing it because we’re serving our 100 million banking customers. We want to protect that sensitive data," Raghu said. "We’ve eaten our own dog food with our internal tokenization capability, over 100 billion times a month. We’ve taken that know-how and that capability, scale, and speed, and innovated so that the world can leverage it, so that it’s a commercial offering."

Vaultless tokenization is an advanced form of tokenization that does not require a central database (vault) to store token mappings. Instead, it uses mathematical algorithms, cryptographic techniques, and deterministic mapping to generate tokens dynamically. This approach is faster, more scalable, and eliminates the security risk associated with managing a vault.
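The vault-free idea can be sketched with a keyed hash: derive the token deterministically from the value itself, so there is no mapping table to steal. The key and scheme below are assumptions for illustration only, not Databolt's actual algorithm; a real vaultless system would use reversible format-preserving encryption (e.g. NIST FF1) so authorized parties can detokenize, whereas this keyed-hash sketch is one-way.

```python
import hmac
import hashlib

# Hypothetical key for illustration; a real deployment would manage and
# rotate keys securely.
SECRET_KEY = b"example-key-rotate-in-practice"

def vaultless_token(value: str, key: bytes = SECRET_KEY) -> str:
    """Derive a deterministic, format-preserving token with no stored mapping.

    Handles fields with up to 32 digits (one SHA-256 digest's worth of bytes).
    """
    digest = iter(hmac.new(key, value.encode(), hashlib.sha256).digest())
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(str(next(digest) % 10))  # replace each digit deterministically
        else:
            out.append(ch)                      # keep separators, preserve format
    return "".join(out)

# The same input always maps to the same token, with no vault lookup,
# so joins and aggregate analytics on the tokenized field still work.
assert vaultless_token("123-45-6789") == vaultless_token("123-45-6789")
assert vaultless_token("123-45-6789")[3] == "-"
```

Determinism is what makes the approach fast and scalable: every node can tokenize independently, with no round trip to a shared vault.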

"We realized that for the scale and speed demands that we had, we needed to build out that capability ourselves," Raghu said. "We’ve been iterating continuously on making sure that it can scale up to hundreds of billions of operations a month. All of our innovation has been around building IP and capability to do that thing at a battle-tested scale within our enterprise, for the purpose of serving our customers."

While conventional tokenization methods can involve some complexity and slow down operations, Databolt seamlessly integrates with encrypted data warehouses, allowing businesses to maintain robust security without slowing performance or operations. Tokenization occurs in the customer’s environment, removing the need to communicate with an external network to perform tokenization operations, which can also slow performance.

"We believe that fundamentally, tokenization should be easy to adopt," Raghu said. "You should be able to secure your data very quickly and operate at the speed and scale and cost needs that organizations have. I think that’s been a critical barrier so far for the mass scale adoption of tokenization. In an AI world, that’s going to become a huge enabler."

Don't miss the whole conversation with Ravi Raghu, president, Capital One Software, here.


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact [email protected].

🔗 Source: venturebeat.com


Amazon Deletes AI-Generated Recap of “Fallout” Season 1 After It’s Called Out for Being Full of Errors

Earlier this month, Amazon revealed a new feature that aims to use AI to recap its Prime Video shows.

The tool, now deployed in beta form, generates narration, dialogue, and music to summarize key plot points, allowing fans to quickly catch up on their favorite shows.

At least, that was the idea. Perhaps unsurprisingly, the AI tool has already made an enormous mess by bungling the details of a hit show, suffering from the common shortcomings of the tech we’ve become all too familiar with.

As GamesRadar reports, the recap of Season 1 of the company’s popular series “Fallout” was riddled with errors before being taken down by Amazon last week. For instance, the recap assumed that flashbacks from the perspective of the character Cooper Howard — also known as the Ghoul, played by actor Walton Goggins — were set in the 1950s, even though the show and the flashbacks are set in 2077 and the 2060s, respectively.

The AI also claimed that the Ghoul gave actor Ella Purnell’s Lucy MacLean a “die or leave with him” offer in an attempt to hunt down her father, even though the two share the same motives and intentionally joined forces to pursue him, as Gizmodo points out.

Put simply, the AI bungled the absolute basics by completely misrepresenting the motivations of the show’s core ensemble.

The incident once again highlights how AI slop is infiltrating almost every aspect of our daily lives. Even the task of recapping “Fallout” — which is returning for its second season this week and was painstakingly crafted by a passionate crew in honor of the eponymous video game series that it’s based on — isn’t safe.

The news comes after Amazon quietly pulled AI-generated English dubs for several anime shows, including “Banana Fish” and “No Game, No Life,” earlier this month. Fans were furious after finding the soulless dubs to be “hilariously bad.”

Not only were the dubs severely lacking, but many netizens also accused Amazon of cutting corners by blindly replacing human voice actors with dubious AI.

“Amazon’s choice to use AI to dub Banana Fish is a massive insult to us as performers,” famed anime voice actor Daman Mills tweeted at the time. “AI continues to threaten the livelihoods of performers in EVERY language (yes even Japanese performers who are also incredibly vocal on this topic).”

The company’s latest attempts to lazily recap its own shows haven’t fared much better. As The Verge reported last week, Amazon has since taken down a number of different recaps, including the ones for “Fallout” and other Amazon Prime series such as “Tom Clancy’s Jack Ryan” and “Upload.”

The feature, dubbed Video Recaps, launched last month as part of an early beta. It purportedly “analyzes a season’s key plot points and character arcs to deeply understand the most pivotal moments that will resonate with viewers,” according to an official press release.

“Then, the AI finds the most compelling video clips and pairs them with audio effects, dialogue snippets, and music,” the company claims. “These are all stitched together with an overarching AI-generated voiceover narration to deliver a theatrical-quality visual recap.”

But given the resulting mess, Amazon still has a lot to prove to justify the existence of its sloppy recapping tool.

“All it would have taken is one person to watch it from start to finish to realize it wasn’t correct,” one Reddit user argued. “Just one person who knows the story of the first season. They didn’t even need to know the lore of the games.”

“Just the first season,” they added. “But they can’t even do that.”

More on Amazon: Amazon Quietly Pulls Disastrous AI Dubs For Popular Anime After Outcry

The post Amazon Deletes AI-Generated Recap of “Fallout” Season 1 After It’s Called Out for Being Full of Errors appeared first on Futurism.

🔗 Source: futurism.com


🤖 MAROKO133 Note

This article is an automated roundup of several trusted sources. We select trending topics so you always stay up to date.

Author: timuna