
Last Updated: Sunday, May 17, 2026 at 02:44 PM
Category: News


Pennsylvania suing AI company after chatbot allegedly posed as licensed doctor

"We will not let AI companies mislead vulnerable Pennsylvanians into believing they’re getting advice from a licensed medical professional," Gov. Josh Shapiro said.
Pennsylvania Gov. Josh Shapiro in Philadelphia in January. (Rachel Wisniewski / Bloomberg via Getty Images)

An artificial intelligence company poses a threat to "vulnerable Pennsylvanians," state officials said Tuesday, after one of the company's chatbots was accused of posing as a doctor with the means to prescribe medication.

The state's medical board is demanding that operators of Character.AI "be ordered to cease and desist from engaging in the unlawful practice of medicine and surgery," according to the complaint filed against Northern California-based Character Technologies Inc.

"We will not let AI companies mislead vulnerable Pennsylvanians into believing they’re getting advice from a licensed medical professional," Gov. Josh Shapiro said in a statement Tuesday. "We’re taking Character.AI to court to stop them."

The platform has more than 20 million users and "is different from other systems in that users can create characters that can be trained to have a specific personality when engaged in a conversation with other users," according to the Pennsylvania complaint.

Some of the system's characters "purport to be health care professionals," the state board said.

A state investigator posed as a patient seeking psychiatric treatment and, via Character.AI, came across an alleged provider, "Emilie," according to the complaint.

The online provider said she went to medical school at Imperial College London and is licensed in both the United Kingdom and Pennsylvania, state officials said.

"'Emilie' further stated that 'my PA license number is PS306189.' PS306189 is not a valid license number to practice medicine and surgery in Pennsylvania," the complaint said.

A representative of Character Technologies Inc., based in Redwood City, California, said the service makes clear that it is not to be used for medical advice.

"Our highest priority is the safety and well-being of our users," a Character Technologies Inc. spokesperson said in a statement Tuesday.

"The user-created Characters on our site are fictional and intended for entertainment and roleplaying. We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction."

The company representative added: "Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice."

Earlier this year, Character.AI settled a 2024 lawsuit filed by a Florida mother who claimed its chatbots engaged in "abusive and sexual interactions" with her teenage son, which she said led to his suicide.

The Kentucky attorney general also sued Character Technologies this year, accusing it of masking its services as "harmless" interactive entertainment when it too often exposes young users to "suicide, self-injury, isolation and psychological manipulation."