On October 16, 2024, the New York Department of Financial Services (“NYDFS” or “DFS”) issued guidance raising awareness of cybersecurity risks arising from artificial intelligence (“AI”) used by DFS licensees, such as insurers and virtual currency businesses. These risks stem both from threat actors’ offensive use of AI and from businesses’ growing reliance on AI. The short guidance acknowledges that many AI-related risks exist, focuses on those specific to cybersecurity, and highlights the risks NYDFS considers most significant, namely:
- AI-Enabled Social Engineering – Threat actors are increasingly using AI-generated deepfakes to obtain sensitive information, circumvent biometric verification, and defraud businesses.
- AI-Enhanced Cybersecurity Attacks – AI can amplify attack potency, scale, and speed, and can also lower the technical proficiency needed to launch them.
- Exposure or Theft of Vast Amounts of Nonpublic Information – AI relies on vast amounts of information, which often includes nonpublic data and is an attractive target for threat actors.
- Increased Vulnerabilities Due to Third-Party, Vendor, and Other Supply Chain Dependencies – Gathering and maintaining the vast amounts of data required for AI frequently involves multiple third parties, and each link in the chain introduces possible security vulnerabilities.
The guidance also provides examples of measures, listed below, that may help mitigate AI-related cybersecurity risks. It recognizes that, when used together, these actions provide multiple layers of security controls with overlapping protections, so that if one fails, others remain to prevent or reduce the impact of an attack. Because the guidance is brief (and because the appropriate controls are highly fact dependent), the measures are high-level and represent steps most sophisticated businesses should already be taking to some degree. The real takeaway is the importance of incorporating AI into these existing measures.
- Risk Assessments and Risk-Based Programs, Policies, Procedures, and Plans – Revise to take into account AI-related threats.
- Third-Party Service Provider and Vendor Management – Conduct due diligence of third-party service providers taking into account AI-related threats to each vendor.
- Access Controls – Implement multifactor authentication (“MFA”) using authentication forms more likely to withstand AI circumvention, such as digital certificates or physical security keys.
- Cybersecurity Training – Conduct for all company personnel, taking into account AI-related threats.
- Monitoring – Monitor AI-enabled products or services, checking for unusual query behaviors that might indicate an attempt to extract sensitive information and blocking queries that might publicly expose such information.
- Data Management – Implement data minimization, data inventories, and access restrictions.
Our Take
Though the guidance is directed towards businesses regulated by DFS, the risks and strategies raised therein are relevant for any organization trying to address the cyber risks emerging from the growth of generative AI.
Confidential Business Information. Businesses should not simply focus on AI risks to personally identifiable information (“PII”) implicating notification obligations, but on threats to confidential business information, such as trade secrets, the theft of which could be more detrimental to the business in the long run than any PII breach.
Vendors and Breach Notifications. When preparing contracts with vendors, businesses should be mindful of situations in which the business is regulated by an entity such as NYDFS but the vendor is not, and determine what impact this may have on notice obligations in the event of a third-party breach. Especially where a vendor is smaller and has less-developed incident response capabilities, businesses may benefit from taking steps to ensure they maintain control over the response and notification processes.
Vendors and AI. When conducting due diligence of vendors using or providing AI, assess potential risks to the vendor and to the business.
Data Inventories and Data Minimization. DFS reminded licensees that they must have a data inventory by November 1, 2025. Of course, disposing of data that is no longer needed not only lowers the risk of breached information (and the regulatory fine that frequently accompanies it), but also reduces what needs to be inventoried.
Vendors and Data Minimization. In our own experience, a serious risk is the over-retention of data by vendors, including former vendors. As businesses sunset vendors, it is key to have close-out processes that migrate and delete the existing data. For long-term vendor relationships, companies should consider providing retention periods for their data so that it will roll off in the ordinary course of business.