Issue link: https://nebusinessmedia.uberflip.com/i/1540922
HARTFORDBUSINESS.COM | NOVEMBER 3, 2025 23 FOCUS | CYBERSECURIT Y EXPERT'S CORNER How to protect your organization from emerging AI threats By Bill Becker L ast year, an employee in a financial role at multinational engineering firm Arup's Hong Kong office received a suspicious email requesting a financial transaction. The employee was then summoned into a video call with the CFO and other employees. Through this meeting, they convinced him to carry out the request received in the email. The only problem was that what the employee saw and heard on screen wasn't real. It was a deepfake. The image and voices of the CFO and others that attended the call were cloned using artificial intelligence. The result of the deepfake attack: $25 million gone in an instant. What can a business do to deal with something like this? One thing is to set up a process that requires multiple people to approve financial transactions. You should also have a protocol for verifying that the person, even the CEO, is legitimate. Lastly, invest in detection tools and employee training. This is just the tip of the iceberg with things to consider when it comes to AI. For example, how is your organiza- tion dealing with shadow AI? With shadow AI, employees are using unapproved/unauthorized AI tools and platforms. While an employee may think the AI solution they're using helps them get the job done faster, there are several things going on that introduce risk into your organization. Here are just a few to consider: • Protected data being entered into an unapproved large language model: This jeopardizes both company and client data. It's both a privacy and compliance issue. • Vulnerabilities: Because these tools aren't vetted, they expand your organization's attack surface and may contain critical security flaws. That makes your systems more exposed and increases the risk of a cyberattack. • Lack of visibility: Not being able to account for your AI assets puts your company in an uncomfortable situation. Should an incident occur, lack of visibility increases the time it takes to respond to something that goes wrong. So, what can you do to reduce risk in this domain? Here are some ideas to get you started: • Have clear policies that spell out which tools and systems employees are allowed to use, and how they should use them. • Have a vetting process for AI tools and platforms. This process should include understanding how data flows, seeing if output introduces harmful bias that impacts decisions and customers, and if there are any security or privacy concerns. • To enhance visibility, establish a system that documents approved AI platforms and monitors any unapproved ones in use within your organization. Lastly, let's briefly talk about the threat landscape that's driven by AI. One of the growing trends is indirect prompt injection attacks. This is where malformed prompts are hidden in places like websites controlled by a bad actor, or they're added to social media posts. Then, when some unsuspecting person visits the site and asks their web browser AI assistant to summarize the page, the hidden malformed prompts are executed. This can lead to things like credential theft or data exfiltration. What does this mean for your business? This goes back to vetting tools, visibility and policies for using and securing browsers with AI assistants. As your business begins to integrate AI to help with various functions, it's also important to reduce the risk new technology brings. The rapid evolution of AI brings both opportunities and challenges. It is our responsibility as business leaders to navigate these complexities thought- fully, ensuring that innovation does not come at the cost of security. Bill Becker is the owner of Connecti- cut-based information security and intelligence firm Bsquared Intel. EXPERT'S CORNER CT is cracking down on how businesses use personal information By Alan M. Winchester C onnecticut is joining forces with six other states to ensure their privacy laws are followed and the personal information of individuals is protected. This could have major consequences for businesses that do not comply. For those new to privacy law, the United States lacks a compre- hensive and exclusive set of laws governing how businesses may use personal information. Instead, each state crafts its own set of laws to apply to its residents, regardless of where the company holding that information is incorporated, or where the information is located. The attorneys general of Connecticut, California, Colorado, Delaware, Indiana, New Jersey and Oregon have created the Consortium of Privacy Regulators and promise aggressive enforcement of privacy laws. It is expected many companies will adopt a practice that satisfies the strictest of privacy laws, thereby ensuring compliance in all states. The consortium's first area of focus is on how companies sell or share personally identifiable information (PII) with other businesses. Most states require an opt-out provision governing the sale or sharing of personal infor- mation. Data subjects who don't want to have their PII sold are directed to communicate that via email. Practically, few do this because it is a lot of work. Colorado, Connecticut and Delaware, however, have an opt-in requirement, where data subjects must affirmatively consent to the selling or sharing of sensitive personal information. Many companies rely on their privacy policies to obtain this consent when their customer first signs up for their services. They might also include an explanation banner at the time they collect the information. Companies face challenges with PII laws But many organizations do not initially intend to sell or share personal information, so they do not seek consent at the time of collection. If they eventually realize the value of the information and change their mind, it is impermissible to sell or share. And, many smaller or midsize organizations rely on service providers to manage their websites and custom- er-facing applications. This could lead to a disconnect between how the organization understands its website to be configured and how the service providers believe it should be config- ured, especially if vendors are located in states with less strict laws. This leads to noncompliance. Still, other companies sell or share PII without realizing it. For example, the marketing depart- ments of many organizations use Meta pixels or Google Analytics to track website traffic. This service is free, but in exchange, it passes information back to Google and Meta about the data subjects and their website interactions. Totally unaware, many companies share sensitive information because their pixels or beacons aren't prop- erly configured. And, the user hasn't consented to, nor knows, PII is being collected and shared. To address this, some organizations are creating a universal opt-out mecha- nism that allows data subjects to signal or communicate privacy preferences. One common type of opt-out mecha- nism is the global privacy control, which is embedded in some browsers or available as an extension for browsers such as Chrome. Some states, including Connecticut, require businesses to listen to the global privacy control signal from customers' browsers and obey their PII wishes, or face enforcement. If your business collects PII from U.S. residents, there is a good chance some of that information is protected by a state's universal opt-out mecha- nism law. Carefully review what PII you collect and how you honor the requests of data subjects regarding their PII. Also, discuss with your technology vendors what information is being auto- matically captured by your websites or applications and shared with providers such as Google or Meta. The cost of noncompliance can be millions of dollars. Alan M. Winchester is the leader of Harris Beach Murtha's cyber- security protection and response practice group. Bill Becker Alan M. Winchester

