AI Platform Compliance

Your team adopted AI tools. Your compliance program hasn't caught up.

The same obligation that applies to WhatsApp now applies to Copilot.

When a registered rep uses ChatGPT Enterprise to draft a client recommendation, or a compliance officer uses Microsoft Copilot to summarize a flagged message thread, that activity may constitute a business communication subject to SEC and FINRA retention requirements.

Most compliance programs weren't designed for this. Most vendors haven't addressed it. That gap is where the next wave of enforcement will land.

This page covers why existing compliance infrastructure doesn't address AI tool activity, what regulators have already signaled, how Arc is built to close the gap, and what your firm should be doing now.

What is AI platform compliance?

AI platform compliance is the practice of capturing, retaining, and supervising business activity that occurs through or with AI tools (ChatGPT Enterprise, Microsoft Copilot, Google Gemini for Workspace, Claude for Work, and others) in a manner that satisfies the recordkeeping obligations that apply to regulated firms.

The core question regulators will ask is not whether a firm used AI. It is whether the firm kept records of the business communications that resulted from that use.

That question has no clean answer yet. The absence of explicit AI-specific guidance does not create a compliance safe harbor. The existing framework applies: any communication related to client activity, investment recommendations, or firm business is a business record. The channel is irrelevant.

Why existing compliance infrastructure doesn’t cover AI tools

Compliance platforms were designed around communication channels: email, messaging apps, social media. The mental model is: a person sends a message; the message is captured; the archive stores it.

AI tools don’t work that way. An employee doesn’t send a message to ChatGPT. They prompt it. The model generates a response. That output may then flow into a client email, a recommendation memo, a compliance review. At each step, the question of “what was captured and when” becomes more complex.

The gaps in current infrastructure fall into three categories:

No interception point. Email compliance works because mail flows through servers. AI tool activity flows through browser sessions, API calls, or embedded interfaces — none of which existing compliance middleware was designed to intercept.

No consistent record format. A captured iMessage is a message. A captured AI session may include a prompt, a response, tool calls, context injected from internal systems, and agent-to-agent communication. None of that fits the data model of a traditional compliance archive.

No vendor coverage. As of 2026, no major compliance vendor has shipped a production solution for AI tool retention. The market is open. Firms relying on their existing vendor for this coverage are relying on something that doesn’t exist yet.
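The record-format gap is the easiest to make concrete. A minimal sketch of what a single captured AI interaction would need to carry — the field names below are illustrative, not Arc's actual schema — already looks nothing like the one-sender-one-message model of a traditional archive:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a captured AI session record. Field names are
# illustrative only; they show why a prompt/response exchange does not
# fit a traditional "message" data model.
@dataclass
class AISessionRecord:
    session_id: str
    user: str                # the employee who issued the prompt
    prompt: str              # what the employee typed
    response: str            # what the model returned
    tool_calls: list = field(default_factory=list)        # tools the model invoked
    injected_context: list = field(default_factory=list)  # internal data pulled into the prompt
    agent_messages: list = field(default_factory=list)    # agent-to-agent traffic, if any

record = AISessionRecord(
    session_id="sess-001",
    user="rep-42",
    prompt="Summarize this client's portfolio drift.",
    response="The portfolio has drifted 6% from its target allocation.",
    tool_calls=[{"name": "get_portfolio", "args": {"client_id": "C-9"}}],
)
```

Every one of those fields can be material in an exam, and none of them maps cleanly onto an archive built for email.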

What regulators have said about AI

The SEC’s 2026 examination priorities explicitly name AI governance as a focus area. Examiners have been directed to assess whether firms have policies governing employee use of AI tools, whether those policies are enforced, and whether records of AI-assisted activity are being retained.

FINRA has not issued AI-specific recordkeeping guidance as of this writing, but has signaled that existing rules apply to AI-generated communications in the same way they apply to any other business communication.

The practical implication: firms that cannot produce records of AI-assisted client activity during a routine exam will face the same scrutiny as firms that couldn’t produce WhatsApp records in 2022.

The window to get ahead of this is now. No competitor has meaningful depth in this category. Firms that build AI compliance infrastructure this year will be in a structurally different position from those that wait for regulatory clarity.

How Comma is approaching AI platform compliance

Comma is building Arc: a compliance platform designed from the ground up for both human communication channels and AI agent activity.

The premise is that the division between “communications compliance” and “AI governance” is artificial. A registered rep sending a WhatsApp message and a registered rep prompting Copilot for a client recommendation are engaged in the same category of regulated activity. The compliance infrastructure should treat them consistently.

Arc is designed as a set of composable components that together form a compliance fabric:


Arc Relay — Available now

Arc Relay is the first production component of the Arc platform. It captures communications from AI-assisted and relay-dependent channels and routes them through Comma’s compliance pipeline: retention, supervision, legal hold, and export.

Arc Relay is self-hosted and open for download today.


Arc Bridge — Coming soon

Arc Bridge enables capture from self-hosted AI infrastructure — Ollama, vLLM, LiteLLM, Open WebUI, and similar deployments. For firms running models behind their own firewall, Arc Bridge integrates without requiring changes to how the team works. Capture happens at the infrastructure layer, not the application layer.
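The infrastructure-layer idea can be sketched in a few lines. This is a simplified illustration, not Arc Bridge's implementation: a shim records both sides of a chat request before handing the response back, and the stub `fake_model_server` stands in for a local Ollama or vLLM endpoint. All names here are made up for the example.

```python
import json
import time

archive = []  # stand-in for the compliance store

def capture_and_forward(payload, forward):
    """Record a request/response pair, then return the response unchanged."""
    response = forward(payload)        # call the real model server
    archive.append({
        "ts": time.time(),
        "request": json.dumps(payload),
        "response": json.dumps(response),
    })
    return response                    # the application sees no difference

def fake_model_server(payload):
    # Stub standing in for a self-hosted chat-completions endpoint.
    return {"choices": [{"message": {"role": "assistant", "content": "ok"}}]}

reply = capture_and_forward(
    {"model": "llama3", "messages": [{"role": "user", "content": "hi"}]},
    fake_model_server,
)
```

The point of the sketch: capture happens below the application, so the team's tools and workflows don't change.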


Compliance MCP Servers — Coming soon

Comma is building Compliance MCP Servers that expose the full compliance workflow as MCP-native tools: retention policy, legal hold, discovery, review, and export.

This means an AI agent can run the full compliance lifecycle. A legal hold doesn’t require a compliance officer to log in and click through a workflow. An agent can invoke it directly, with a full audit trail, through a standardized protocol.

Compliance becomes an agentic workflow, not a manual one.
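Concretely, MCP messages are JSON-RPC 2.0, and a tool invocation uses the tools/call method. The tool name and arguments below are hypothetical — a sketch of what an agent-initiated legal hold could look like on the wire, not a shipped Arc API:

```python
import json

# Hypothetical MCP tools/call request. "apply_legal_hold" and its
# arguments are illustrative, not an actual Comma tool definition.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "apply_legal_hold",
        "arguments": {
            "custodian": "rep-42",
            "matter_id": "LIT-2026-007",
            "scope": ["email", "ai_sessions"],
        },
    },
}

wire = json.dumps(request)  # what travels over the MCP transport
```

Because the request is structured and standardized, every invocation is auditable by construction: who called what, with which arguments, and when.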


Arc Gate — Coming soon

Not every AI workflow runs through MCP. Some agents send raw API calls directly to OpenAI, Anthropic, or Google, with no tool layer and no relay. Arc Gate sits in front of those calls. Same compliance pipeline, applied at the API level.

One platform. Human channels and agent channels. Whatever path the data takes.
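The gateway pattern reduces to a single routing step. This sketch is illustrative only — the function names are invented, and `fake_send` stands in for the actual HTTP call — but it shows the shape: archive first, then forward unchanged, regardless of which provider the call targets.

```python
# Hypothetical gateway sketch: one capture step applied to raw API calls
# before they reach any upstream provider. Not Arc Gate's implementation.
UPSTREAMS = {
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "google": "https://generativelanguage.googleapis.com",
}

captured = []  # stand-in for the compliance store

def route(provider, path, body, send):
    url = UPSTREAMS[provider] + path
    captured.append({"provider": provider, "url": url, "body": body})  # archive first
    return send(url, body)  # then forward unchanged

def fake_send(url, body):
    # Stub standing in for the real HTTP request to the provider.
    return {"status": 200}

resp = route("anthropic", "/v1/messages", {"model": "claude"}, fake_send)
```

From the client's perspective, only the endpoint changes; the compliance pipeline is identical to the one applied to every other channel.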


What firms should do now

Regulatory clarity on AI is coming, but it hasn’t arrived. In the meantime, there are practical steps every regulated firm should be taking, regardless of what any vendor, including Comma, supports today.

1. Start with an honest inventory. Which AI tools are actually in use across your front office, compliance team, and operations? Don’t limit this to IT-approved deployments. Shadow AI adoption is real, and the tools your team adopted informally are just as subject to regulatory scrutiny as the ones procurement signed off on.

2. Assess what those tools are producing. Prompts, responses, summaries, client recommendations: if the output touches client activity or firm business, your existing retention obligations almost certainly apply. The question isn’t whether the tool is on an approved list. It’s whether the output is a business record.

3. Find out what your tools can actually retain. Most AI platforms have limited native export functionality, and some have none. Know what each tool can and cannot produce before a regulator asks you to produce it.

4. Ask your compliance vendor the hard question. Not whether AI compliance is on their roadmap, but whether they’ve shipped something. A roadmap slide is not a compliance solution. Most vendors haven’t shipped one. You should know whether yours has.

5. Document where you stand. A written AI governance policy covering which tools are permitted, how they should be used, and what records management looks like won’t close every gap, but it demonstrates good-faith effort. In an exam, that matters.

Ready to get ahead of it?

Book a 20-minute walkthrough to see where Comma is heading with AI compliance.

Frequently asked questions

Are AI tool conversations subject to SEC and FINRA retention requirements?
If an AI-generated or AI-assisted output relates to a client, a transaction, a recommendation, or firm business, it is likely a business communication under existing rules. SEC Rule 17a-4 and FINRA Rule 4511 do not carve out AI tools. The channel doesn't determine the obligation; the content does. Regulators have not issued formal AI-specific recordkeeping guidance as of 2026, but the SEC's examination priorities explicitly name AI governance as a focus area.
What is Arc?
Arc is Comma's compliance platform for AI and agent channels. It is designed as a set of composable components (Arc Relay, Arc Bridge, Compliance MCP Servers, and Arc Gate) that together form a compliance fabric covering both human messaging and AI-generated activity. Arc Relay is available today. The remaining components are in active development.
What is Arc Relay?
Arc Relay is a self-hosted relay that captures communications from AI-assisted channels and routes them through Comma's compliance pipeline: retention, supervision, legal hold, and export. It is the first production component of the Arc platform and is available for download today.
What is an MCP server and why does it matter for compliance?
MCP (Model Context Protocol) is a standard for connecting AI agents to external tools and data sources. Comma is building Compliance MCP Servers that expose the full compliance workflow as MCP-native tools: retention policy, legal hold, discovery, review, and export. This means an AI agent can execute the full compliance lifecycle without a human having to manage each step manually. Compliance becomes an agentic workflow, not a manual one.
What AI compliance steps should regulated firms take now?
Firms should: (1) inventory every AI tool in active use across front-office and compliance functions, including informally adopted ones; (2) assess whether outputs from those tools constitute business communications under existing recordkeeping rules; (3) confirm whether those tools have any native retention or export capability; (4) ask whether their current compliance vendor has shipped AI channel coverage, not merely put it on a roadmap — most have not; and (5) document a written AI governance policy covering permitted tools and records management.