The city of Baltimore sued Elon Musk’s xAI on Tuesday, alleging its tool Grok generates nonconsensual sexual images in violation of the city’s consumer protection law — one of the first such actions by a local government against an AI company.
The complaint claims that Grok, developed by xAI and promoted on X, exposes users to sexually explicit Grok-generated content and puts them at risk of having their own images altered without consent.
Filed in Baltimore’s Circuit Court by law firm DiCello Levitt on the city’s behalf, the suit argues the court has jurisdiction over xAI because the company operates in Baltimore.
“Grok has flooded the feeds of Baltimore’s X users with NCII (non-consensual intimate imagery) and CSAM (child sexual abuse material),” the complaint reads.
The city passed its consumer protection law in 2023 and has since sued companies, including DraftKings and FanDuel, for targeting residents with gambling problems, as well as digital lending app Dave, over misleading marketing and high interest charges.
The city is the first municipality to challenge Grok, though a bipartisan group of 35 state attorneys general, including Maryland’s Anthony G. Brown, sent a letter to xAI in January outlining concerns about the tool.
The case likely faces a tough road ahead: state governments, not cities, typically lead consumer protection enforcement.
Cities often find more success when alleging violations of state consumer protection laws rather than relying solely on local statutes, according to Ben Yelin, program director at the University of Maryland’s Center for Cyber Health and Hazard Strategies.
Kentucky sued the AI company Character.ai in January, alleging its chatbot is dangerous for children and violates the state’s consumer and data protection act.
Still, other municipalities are testing the approach. New York City’s consumer protection department recently sued a solar panel installation company under NYC law over allegedly predatory lending practices.
AI regulation momentum continues
Baltimore’s lawsuit against Grok isn’t exactly AI regulation. But the way this case unfolds — suing a private company over its AI tool — could offer insight into how state and local governments plan to take on the challenge of reining in AI’s power going forward.
A December executive order from President Donald Trump threatened to strip states of key broadband funding if they pursued their own AI regulations. Though no cases have emerged from it, the order raised concerns.
More details on the administration’s regulatory approach came out last week, with a key exemption for state and local consumer protection laws that can generally apply to AI developers. So, it’s unlikely the Trump administration would challenge Baltimore’s consumer protection law.
Most local-level AI regulation is also not expected to face pushback since it applies primarily to city government officials.
“Almost all local laws on this concern how the government itself is allowed to use AI, and that’s something that this executive order allows for,” Yelin previously told Technical.ly.
Maryland is poised to continue passing AI regulation, according to Yelin. The state has already challenged the Trump administration in court over issues like cuts to federal food and education funding, and he expects lawmakers to take a similar approach with AI.
Lawmakers in the state are already weighing several bills that would shape how Marylanders use and are safeguarded against risks from artificial intelligence. One consumer protection bill that would require AI systems to clearly disclose when users are interacting with an AI product rather than a licensed behavioral health provider passed the House in early March.
“In Maryland, it’s going to be full speed ahead,” Yelin said, “because I think Maryland legislators will have an oppositional tone to this.”