Perplexity AI Sued Over Secret Data Sharing
Your AI search tool may be sharing your business data
A class-action lawsuit filed today in San Francisco federal court accuses Perplexity AI of embedding hidden tracking technology that shares users’ private conversations — including financial and tax data — with Meta and Google. The suit names all three companies as defendants.
If you have used Perplexity to research suppliers, draft financial projections, or look up tax questions, the lawsuit alleges that data may have been sent to two of the largest advertising platforms on earth — even when you used Perplexity’s “Incognito” mode.
What the lawsuit claims
The complaint centers on three allegations.
Hidden tracking software. Perplexity allegedly embedded “undetectable” tracking technology in its search product. Unlike standard cookies or tracking pixels that users can see and block, this technology operated below the surface. Users had no way to know it was there and no way to opt out.
Data sharing with Meta and Google. The tracking software allegedly transmitted user conversations and search queries to Meta and Google. For small business owners, that means competitive research, pricing questions, vendor comparisons, and financial data could have been routed to advertising networks that sell targeting data to your competitors.
Incognito mode did not work. Perplexity marketed an “Incognito” mode that implied private browsing. The lawsuit alleges this mode still leaked user data to third parties — making the privacy promise misleading at best and deceptive at worst.
Key facts
- Filed April 1, 2026 in the U.S. District Court for the Northern District of California
- Class-action status sought on behalf of all affected Perplexity users
- Meta and Google named as co-defendants for receiving and using the data
- Alleges violations of federal wiretapping laws and California privacy statutes
Why this matters for small businesses
The privacy promise that sold you
Many small business owners adopted Perplexity specifically because it positioned itself as a smarter, more private alternative to Google Search. Unlike Google, which builds an advertising profile from everything you search, Perplexity marketed itself as an AI-powered answer engine focused on accuracy rather than ad targeting.
That pitch resonated with business owners who wanted to research sensitive topics — employee costs, legal questions, competitive pricing — without feeding that data into an advertising ecosystem. If the lawsuit’s allegations hold, that entire value proposition was hollow.
Your business data in advertising networks
Think about what you have searched for using AI tools in the past year. Vendor pricing. Employee benefits costs. Legal questions about contracts or compliance. Tax strategies. Expansion plans.
Now imagine that data sitting in Meta’s and Google’s advertising systems, available to inform the ads your competitors see — or worse, the ads they run targeting your customers. That is the scenario the lawsuit describes.
This is not a hypothetical risk. The FTC's 2024 surveillance pricing study examined how major tech platforms use behavioral data to enable individualized pricing, where businesses and consumers see different prices based on their inferred willingness to pay. Your AI search history could feed exactly that kind of system.
The trust problem keeps growing
Public trust in AI tools is already fragile. Only 26 percent of Americans say they trust AI, and incidents like this explain why. Every time a company markets privacy it does not deliver, the entire AI industry loses credibility — and businesses that rely on AI tools to serve customers absorb that reputational cost.
The FTC has made clear that companies deploying AI tools are responsible for how those tools handle customer data. If your business uses an AI tool that mishandles data, regulators may hold you accountable even if you did not know about the problem.
Our take
What we think
This lawsuit highlights a fundamental problem with how AI tools handle business data: most users have no idea where their queries go after they hit enter. Privacy policies are long, vague, and written to protect the company, not the user.
Perplexity is not the only AI tool that routes data through third-party systems. Many AI search tools, chatbots, and assistants send queries to external APIs, cloud services, or analytics platforms as part of normal operation. The difference the lawsuit alleges is that Perplexity did this secretly, through hidden tracking, while marketing itself as private.
The bottom line: If a free AI tool seems too good to be true, your data is probably the product. This lawsuit is a reminder that “private” and “AI-powered” do not automatically go together.
What is missing from the conversation
- Business-specific impact. Most coverage focuses on consumer privacy. But small businesses routinely put sensitive operational data into AI search tools — data that has direct competitive value. The business exposure here may be larger than the consumer exposure.
- Incognito theater. The Incognito mode allegation is especially damaging. It suggests Perplexity knew users wanted privacy and built a feature that appeared to deliver it while allegedly doing the opposite. That crosses the line from negligence to potential deception.
Questions that remain
- How long was the tracking active before discovery?
- Did Meta and Google know where the data was coming from, or was it anonymized before transmission?
- Are other AI search tools using similar hidden tracking methods?
What you should do
Immediate actions
- Audit your AI tool usage. Make a list of every AI tool your business uses — search, chat, content generation, analytics. For each one, check the privacy policy for language about data sharing with third parties.
- Assume your queries are not private. Until you can verify otherwise, treat every AI search query as potentially visible to the tool’s partners. Do not enter sensitive financial data, trade secrets, or customer information into tools you have not vetted.
- Check for state AI disclosure laws in your area. Twenty-seven states now have some form of AI transparency requirement. If you use AI tools that interact with customers, make sure you understand your disclosure obligations — especially if those tools are sharing data you did not authorize.
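The "assume your queries are not private" advice above can be turned into a simple pre-flight habit: before a query leaves your machine, flag or redact anything that looks like sensitive data. Here is a minimal Python sketch of that idea. The pattern names and regular expressions are illustrative assumptions, not a vetted ruleset; tune them to the kinds of data your business actually handles.

```python
import re

# Illustrative patterns for common kinds of sensitive business data.
# These are assumptions for the sketch, not an exhaustive or vetted list.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn_or_ein": re.compile(r"\b\d{2,3}-\d{2,7}(-\d{4})?\b"),
    "dollar_amount": re.compile(r"\$\s?\d[\d,]*(\.\d{2})?"),
}

def flag_sensitive(query: str) -> list[str]:
    """Return the names of sensitive-data patterns found in a query."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(query)]

def redact(query: str) -> str:
    """Replace each matched span with a [REDACTED:<name>] placeholder."""
    for name, pat in SENSITIVE_PATTERNS.items():
        query = pat.sub(f"[REDACTED:{name}]", query)
    return query
```

A check like this will not catch everything, and it is no substitute for vetting the tool itself, but it makes the habit concrete: if `flag_sensitive` returns anything, rewrite the query or run it through `redact` before sending it to a tool you have not verified.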
Watch for
- Court rulings on class certification. If the class is certified, this becomes a much larger case with broader implications for all AI tools.
- FTC enforcement action. The FTC has been increasingly aggressive on AI privacy violations. This lawsuit could trigger a regulatory investigation.
- Perplexity’s response. How Perplexity responds — denial, settlement, or product changes — will signal how seriously AI companies take data privacy obligations.
Resources
- FTC AI policy and what it means for your business
- AI accountability legislation tracker
- How to evaluate AI tool security after supply chain attacks
The bigger picture
This lawsuit lands in a year when AI accountability legislation is accelerating at both the state and federal level. Lawmakers are watching cases like this closely. If Perplexity’s tracking allegations prove true, expect faster movement on mandatory AI transparency requirements — which ultimately protects businesses that use AI tools responsibly.
For small businesses in Appalachia and beyond, the lesson is straightforward: vet your AI tools the same way you vet any vendor that handles sensitive business data. Ask what data they collect, where it goes, and who else sees it. If the answers are vague, that is your answer.
Have questions about which AI tools are safe for your business? Get in touch — we help businesses adopt AI without compromising their data.