Dig has enhanced its data security offering to enable the detection, classification, and mapping of sensitive data used to train large language models (LLMs).

Cloud data security provider Dig Security has added new capabilities to its Dig Data Security offering to help secure data processed through the LLM architectures its customers use. With the new features, Dig’s data security posture management (DSPM) offering will enable customers to train and deploy LLMs while upholding the security, compliance, and visibility of the data being fed into the AI models, according to the company.

“Securing data is a prime concern for any organization, and the need to ensure sensitive data is not inadvertently exposed via AI models becomes more important as AI use increases,” said Jack Poller, an analyst at ESG Global. “This new data security capability puts Dig in a prime position to capitalize on the opportunity.”

All the new capabilities will be available to Dig’s existing customers within the Dig Data Security offering at launch.

Dig secures data going into LLMs

Dig’s DSPM scans every database across an organization’s cloud accounts, detects and classifies sensitive data (PII, PCI, etc.), and shows which users and roles can access that data. This helps detect whether any sensitive data is being used to train AI models.

“Organizations today struggle with both discovering data that needs to be secured and correctly classifying data,” Poller said. “The problem becomes more challenging with AI as the AI models are opaque.”

Dig’s data detection and response (DDR) lets users track data flows to understand and control what enters an AI model’s training corpus, for example flagging PII that is moved into a bucket used for model training.

“Once the model has been trained, it’s impossible to post-process the AI model to identify and remove any sensitive data that should not have been used for training,” Poller added. “Dig’s new capabilities enable organizations to reduce or eliminate the risk of unwanted sensitive data being used for AI training.”

Dig maps data access and identifies shadow models

Dig’s new data access governance capabilities can highlight AI models with API access to organizational data stores, along with the types of sensitive data this gives them access to.

“Dig’s agentless solution covers the entire cloud environment, including databases running on unmanaged virtual machines (VMs),” said Dan Benjamin, CEO and co-founder at Dig Security. “The feature alerts security teams to sensitive data stored or moved into these databases.”

“Dig will also detect when a VM is used to deploy an AI model or a vector database, which can store embeddings,” Benjamin added.

All these enhancements sit within Dig Data Security, which combines DSPM, data loss prevention (DLP), and DDR capabilities into a single platform. The sketches below illustrate, in simplified form, the kinds of checks these three capabilities perform.
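Dig has not published implementation details, so the following is a minimal, hypothetical Python sketch of what a DSPM-style classification scan involves in principle: walking records, matching fields against detectors for classes such as PII and PCI, and reporting which columns contain what. The pattern names, regexes, and sample data are illustrative only; production scanners use far richer detectors and validation.

```python
import re

# Hypothetical detectors for a few common sensitive-data classes.
# Real DSPM products use much richer, validated detection logic.
PATTERNS = {
    "PII/SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PII/email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PCI/credit_card": re.compile(r"\b(?:\d[ -]*?){13,16}\b"),
}

def classify_record(text: str) -> set[str]:
    """Return the set of sensitive-data classes detected in one field."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

def scan_table(rows: list[dict]) -> dict[str, set[str]]:
    """Scan each column of a table and report the classes it contains."""
    findings: dict[str, set[str]] = {}
    for row in rows:
        for column, value in row.items():
            findings.setdefault(column, set()).update(classify_record(str(value)))
    return {col: classes for col, classes in findings.items() if classes}

# Example: a table that should not feed an LLM training corpus as-is.
rows = [{"name": "Ada", "contact": "ada@example.com", "ssn": "123-45-6789"}]
print(scan_table(rows))  # -> {'contact': {'PII/email'}, 'ssn': {'PII/SSN'}}
```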
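The DDR-style flow tracking described above can similarly be pictured as a check over storage audit events: if an object previously classified as sensitive is written into a bucket designated for model training, raise an alert before the data enters the corpus. The bucket name, event fields, and classification map below are assumptions for illustration, not Dig's actual design.

```python
# Assumed inputs: a designated training bucket and a prior classification
# of objects (e.g., produced by a scan like the one sketched earlier).
TRAINING_BUCKETS = {"llm-training-corpus"}
SENSITIVE_OBJECTS = {"exports/customers.csv": {"PII/email", "PII/SSN"}}

def check_event(event: dict) -> str | None:
    """Alert when a sensitive object is written into a training bucket."""
    if event["action"] != "PutObject":
        return None
    classes = SENSITIVE_OBJECTS.get(event["key"])
    if classes and event["bucket"] in TRAINING_BUCKETS:
        return (f"ALERT: {event['key']} ({', '.join(sorted(classes))}) "
                f"written to training bucket {event['bucket']}")
    return None

event = {"action": "PutObject", "bucket": "llm-training-corpus",
         "key": "exports/customers.csv"}
print(check_event(event))
```

Catching the write event is the key design point: as Poller notes, once the data is baked into a trained model it cannot be identified and removed after the fact.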
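Finally, the data access governance idea reduces, at its simplest, to a join between a per-store classification map and the access grants held by each principal, surfacing which roles (including AI service roles) can read which classes of sensitive data. The store names, roles, and grant format here are hypothetical.

```python
# Hypothetical classification of data stores and simplified IAM-style grants.
STORE_CLASSES = {
    "customers-db": {"PII/email", "PII/SSN"},
    "payments-db": {"PCI/credit_card"},
}
GRANTS = [  # (principal, store, permission)
    ("role/llm-fine-tune-job", "customers-db", "read"),
    ("role/analytics", "payments-db", "read"),
]

def access_map() -> dict[str, set[str]]:
    """Map each principal to the sensitive-data classes it can read."""
    exposure: dict[str, set[str]] = {}
    for principal, store, perm in GRANTS:
        if perm == "read":
            exposure.setdefault(principal, set()).update(
                STORE_CLASSES.get(store, set()))
    return exposure

for principal, classes in access_map().items():
    print(principal, "->", sorted(classes))
# role/llm-fine-tune-job -> ['PII/SSN', 'PII/email']
# role/analytics -> ['PCI/credit_card']
```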