UPDATED 09:00 EDT / SEPTEMBER 26 2023

New Baffle Data Protection for AI secures private data in generative AI projects

Data security platform startup Baffle Inc. today launched a new solution for securing private data in generative artificial intelligence projects that integrates with existing data pipelines.

Called Baffle Data Protection for AI, the new service is designed to address a core risk of publicly available generative AI services such as ChatGPT: however useful they are, private data shared with them can leak. That risk has led companies to ban employees from using public generative AI services out of security concerns.

Baffle Data Protection for AI addresses those concerns, allowing companies to secure sensitive data for use in generative AI projects. With Baffle Data Protection for AI, sensitive data is encrypted with the Advanced Encryption Standard (AES) algorithm as it’s ingested into the data pipeline. When that data is later used in a generative AI service, sensitive values are anonymized, so leakage of cleartext data — information stored or sent in unencrypted form — cannot occur even with prompt engineering or adversarial prompting.
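
Baffle hasn’t published implementation details, but the flow described above (AES encryption of sensitive fields at ingestion, with anonymized placeholders standing in for real values in prompts) can be illustrated with a minimal Python sketch. The field names, key handling and helper functions below are hypothetical assumptions, not Baffle’s API:

```python
# Minimal sketch (hypothetical, not Baffle's API): AES-GCM encryption of a
# sensitive field at ingestion, plus an anonymized placeholder for prompts.
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, held in a key-management service
aead = AESGCM(key)

def encrypt_field(value: str) -> bytes:
    """Encrypt a sensitive value as it enters the data pipeline."""
    nonce = os.urandom(12)                  # 96-bit nonce, as recommended for AES-GCM
    return nonce + aead.encrypt(nonce, value.encode(), None)

def anonymize_for_prompt(field_name: str, record_id: int) -> str:
    """Return an opaque placeholder so cleartext never appears in a prompt."""
    return f"<{field_name}_{record_id}>"

record = {"id": 42, "ssn": "123-45-6789"}
stored = {"id": record["id"], "ssn": encrypt_field(record["ssn"])}   # encrypted on ingestion
prompt = f"Summarize the account of customer {anonymize_for_prompt('ssn', record['id'])}"
print(prompt)   # the model sees only the placeholder, never the cleartext value
```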

With the service, sensitive data remains encrypted no matter where it is moved or transferred in the generative AI pipeline, preventing unauthorized users from seeing private data in cleartext. Using Baffle Data Protection for AI, companies can also meet specific compliance requirements, such as the European Union’s General Data Protection Regulation “right to be forgotten,” by shredding the associated encryption key.
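
Shredding a key to satisfy the right to be forgotten is commonly called crypto-shredding: if each data subject’s values are encrypted under their own key, destroying that key makes every copy of the ciphertext permanently unreadable. The sketch below illustrates the idea with a hypothetical in-memory key store standing in for a real key-management service; it is not Baffle’s implementation:

```python
# Crypto-shredding sketch (hypothetical in-memory key store, not Baffle's implementation).
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_store = {}  # subject_id -> AES key; a real system would use a KMS or HSM

def encrypt_for_subject(subject_id: str, value: str) -> bytes:
    """Encrypt a value under the data subject's own key."""
    key = key_store.setdefault(subject_id, AESGCM.generate_key(bit_length=256))
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, value.encode(), None)

def forget_subject(subject_id: str) -> None:
    """Shred the subject's key: every copy of their ciphertext, wherever it
    has moved in the pipeline, becomes permanently unrecoverable."""
    key_store.pop(subject_id, None)

blob = encrypt_for_subject("user-7", "jane@example.com")
forget_subject("user-7")   # key destroyed; blob can no longer be decrypted
```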

Baffle Data Protection for AI also prevents private data from being exposed in a public generative AI service, because the personally identifiable information is anonymized. The feature is an additional safeguard that complements processes and policies forbidding the use of private data with public generative AI services, covering both intentional and unintentional use.
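
For the public-service case, the anonymization step can be pictured as masking: personally identifiable values are replaced with opaque tokens before a prompt ever leaves the organization, and only a locally held map can tie tokens back to the originals. The regex patterns and token format below are illustrative assumptions rather than Baffle’s mechanism:

```python
# Illustrative PII masking before a prompt is sent to a public gen AI service.
# The regex patterns and token format are assumptions, not Baffle's mechanism.
import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def mask_pii(text: str) -> tuple[str, dict[str, str]]:
    """Replace PII with opaque tokens; the mapping stays inside the organization
    so responses can be re-identified locally if needed."""
    mapping: dict[str, str] = {}
    for label, pattern in PII_PATTERNS.items():
        for i, match in enumerate(pattern.findall(text)):
            token = f"<{label}_{i}>"
            mapping[token] = match
            text = text.replace(match, token)
    return text, mapping

prompt, mapping = mask_pii("Email jane.doe@example.com about SSN 123-45-6789.")
print(prompt)   # "Email <EMAIL_0> about SSN <SSN_0>." -- no cleartext PII leaves the pipeline
```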

Baffle’s no-code, data-centric approach is said to protect data as soon as it is created and then keep it protected throughout the generative AI prompt and response process.

“ChatGPT has been a corporate disruptor, forcing companies to either develop or accelerate their plans for leveraging [generative AI] in their organizations,” said Ameesh Divatia, founder and chief executive officer of Baffle. “But data security and compliance concerns have been stifling innovation — until now.”

Divatia added that “Baffle Data Protection for AI makes it easy to protect sensitive corporate data while using that data with private gen AI services so companies can meet their gen AI timelines safely and securely.”

Divatia (pictured) spoke with theCUBE, SiliconANGLE Media’s livestreaming studio, in September, where he discussed how Baffle seeks to provide easy data protection for apps, analytics and AI.

Photo: SiliconANGLE
