NY AI Transparency Act Advances — What It Means

March 16, 2026 · Martin Bowling

New York wants to know what your AI was trained on

New York’s AI Training Data Transparency Act (S 6955) has cleared the Senate Internet and Technology Committee and is heading to a floor vote. If passed, the bill would require developers of generative AI models to publicly disclose what data they used to train their systems.

For small businesses that build AI, this probably does not apply to you. But for businesses that use AI tools — chatbots, content generators, scheduling assistants, intake widgets — this bill signals a shift in how AI vendors will need to operate. And New York rarely acts alone.

What happened

Senator Andrew Gounardes introduced S 6955 in March 2025. After passing through the Senate Internet and Technology Committee, it’s now queued for a full floor vote.

The bill is straightforward: if you develop a generative AI model or service, you must post on your website specific information about the data used to train it.

What developers would need to disclose

  • Whether training datasets include copyrighted, trademarked, or patented material
  • Whether datasets were purchased, licensed, or scraped from public sources
  • Whether personal information or identifying data is included
  • Whether aggregate consumer information was used
  • What cleaning, processing, or modifications were applied to the data
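
The disclosure categories above amount to a small structured record. As a hypothetical sketch of how a vendor (or a buyer evaluating vendors) might track them — field names are ours, not the bill's statutory language:

```python
from dataclasses import dataclass, field

@dataclass
class TrainingDataDisclosure:
    """Hypothetical record of the S 6955 disclosure categories.
    Field names are illustrative, not taken from the bill text."""
    contains_copyrighted_material: bool
    contains_trademarked_or_patented: bool
    acquisition_methods: list[str]          # e.g. ["purchased", "licensed", "scraped"]
    contains_personal_information: bool
    uses_aggregate_consumer_data: bool
    processing_steps: list[str] = field(default_factory=list)  # cleaning/modifications

# Example entry for a hypothetical vendor
disclosure = TrainingDataDisclosure(
    contains_copyrighted_material=True,
    contains_trademarked_or_patented=False,
    acquisition_methods=["licensed", "scraped"],
    contains_personal_information=False,
    uses_aggregate_consumer_data=True,
    processing_steps=["deduplication", "PII filtering"],
)
```

A record like this is also a reasonable checklist to hand an AI vendor before any law requires it.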

The companion Assembly bill (A 6578) has already been amended on third reading, suggesting active refinement and momentum in both chambers.

This is not the only New York AI bill moving

S 6955 is part of a broader push. New York currently has several AI bills advancing simultaneously:

  • RAISE Act (S 8828): Already signed into law in December 2025. Takes effect January 1, 2027. Requires safety protocols for frontier AI models built by companies with over $500 million in revenue.
  • Stop Deepfakes Act (S 6954): Requires AI content creation tools to embed provenance data. Passed committee and awaiting floor vote.
  • Chatbot impersonation liability (S 7263): Imposes liability when a chatbot impersonates a licensed professional. Also awaiting a floor vote.

New York is building a layered regulatory framework — training data transparency, safety protocols, content provenance, and professional liability — that could become a model for other states.

Why this matters for small businesses

You are not the target, but you are in the blast radius

S 6955 and the RAISE Act target AI developers, not the businesses that use their products. The RAISE Act specifically applies only to companies with over $500 million in annual revenue building frontier models. A restaurant using an AI answering service or a contractor using AI dispatch is not in the crosshairs.

But the downstream effects are real. When AI vendors face new disclosure and compliance requirements, those costs get passed through. Smaller AI providers may exit markets where compliance overhead erodes their margins. And the tools you rely on could change — in pricing, in capability, or in availability.

The patchwork problem is getting worse

New York is not acting in isolation. As of early 2026, legislators in 45 states have introduced 1,561 AI-related bills. Last year, 145 state AI laws were enacted. This year’s pace is faster.

We have already seen 78 chatbot safety bills hit 27 states, seven states targeting AI pricing tools, and Colorado’s comprehensive AI Act set to take effect June 30, 2026. At the federal level, the Small Business AI Training Act is trying to help small businesses navigate this exact complexity.

For businesses operating across state lines — even digitally — the compliance landscape is becoming fragmented in ways that make it harder to choose and deploy AI tools confidently.

Training data transparency changes the trust equation

This is the less obvious but more important implication. If S 6955 passes, AI vendors operating in New York will need to tell you what’s in their training data. That creates a new baseline for vendor evaluation.

Right now, most small businesses pick AI tools based on price, features, and ease of use. Training data transparency adds a fourth dimension: provenance. Was the model trained on licensed data or scraped content? Does it include personal information that could create liability for your business? These are questions that most small business owners have never needed to ask — but may soon need to.

Our take

New York’s approach is more practical than most state AI regulation we have seen this year. Rather than trying to regulate how AI is used (which gets complicated fast), S 6955 focuses on disclosure. It says: if you build AI, tell people what went into it.

The bottom line: Training data transparency is a good thing for small businesses. It gives you more information to make better vendor decisions — and it costs you nothing to comply because you are not the one who has to disclose.

What is missing from the conversation

  • Enforcement specifics: The bill does not yet detail penalties for non-compliance or how disclosures would be audited
  • Downstream accountability: If an AI vendor trains on problematic data and your business uses that tool to make decisions, who is liable? S 6955 does not answer that question

Questions that remain

  • Will the Assembly version align with the Senate version, or will differences require reconciliation?
  • Could federal preemption efforts — like the Trump administration’s executive order targeting state AI laws — block enforcement?
  • Will other states adopt similar training data transparency requirements?

What you should do

Immediate actions

  1. Ask your AI vendors about training data — even before this becomes law, it is a reasonable question. If a vendor cannot or will not answer, that tells you something.
  2. Document which AI tools you use and how — if regulation reaches businesses that deploy AI (not just build it), you want a clear picture of your AI footprint.
  3. Stay informed on your state’s AI bills — the IAPP state AI legislation tracker is the best free resource for monitoring what is moving in your state.
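
The "document your AI footprint" step above can be as simple as a structured list you keep current. A minimal sketch — the tool names and fields here are hypothetical examples, not recommendations:

```python
# Minimal AI-tool inventory: what you use, what it does, and what the
# vendor has told you about its training data. All entries are hypothetical.
ai_footprint = [
    {
        "tool": "ExampleChat",              # hypothetical chatbot vendor
        "use_case": "customer intake chatbot",
        "handles_customer_data": True,
        "vendor_training_disclosure": "none provided",
    },
    {
        "tool": "ExampleScheduler",         # hypothetical scheduling assistant
        "use_case": "appointment scheduling",
        "handles_customer_data": True,
        "vendor_training_disclosure": "licensed data only, per vendor FAQ",
    },
]

# Flag tools whose vendor has not answered the training-data question.
needs_followup = [
    entry["tool"]
    for entry in ai_footprint
    if entry["vendor_training_disclosure"] == "none provided"
]
print(needs_followup)  # ['ExampleChat']
```

Even a spreadsheet with the same columns does the job; the point is having one place that answers "what AI do we use, and what do we know about it?"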

Watch for

  • Whether S 6955 passes the full Senate and gets signed into law
  • How AI vendors respond — transparent vendors will use compliance as a selling point
  • Whether West Virginia or neighboring Appalachian states introduce similar measures

The regulatory picture is getting clearer

State AI regulation is not slowing down. New York’s training data transparency bill is one piece of a much larger puzzle — but it is a piece that directly affects the tools small businesses depend on.

The good news: transparency benefits buyers. The more you know about how your AI tools were built, the better decisions you can make. You do not need to become a compliance expert, but you do need to pay attention.

If navigating AI tools and regulation feels overwhelming, we can help. Appalach.AI works with small businesses across the region to choose, deploy, and manage AI tools that fit their needs — and we stay on top of the regulatory landscape so you do not have to.

AI Tools Industry News Small Business