
ChatGPT Told Florida State Shooter that Killing Children Would Get Him ‘More Attention’ Than Targeting Adults

A lawsuit alleges that ChatGPT helped a gunman plan an attack at Florida State University last year and advised him on picking targets.

Phoenix Ikner has been accused of killing two people and wounding six others in the April attack.

Vandana Joshi, the widow of Tiru Chabba, who was killed in the attack, filed the federal lawsuit against OpenAI, according to NBC News.

The lawsuit said that ChatGPT told Ikner that getting national attention is more likely “if children are involved, even 2-3 victims can draw more attention.”

The chatbot also gave Ikner information about guns he owned, according to the lawsuit, allegedly "telling him the Glock had no safety, that it was meant to be fired 'quick to use under stress' and advising him to keep his finger off the trigger until he was ready to shoot."

ChatGPT also allegedly told Ikner the busiest time for an attack would be between 11:30 a.m. and 1:30 p.m. at the student union on weekdays. The attack began moments before noon.

“OpenAI knew this would happen. It’s happened before and it was only a matter of time before it happened again,” Joshi said in a Monday statement. “But they chose to put their profits over our safety and it killed my husband. They need to be responsible before another family has to go through this.”

The lawsuit claims ChatGPT “inflamed and encouraged Ikner’s delusions; endorsed his view that he was a sane and rational individual; helped convince him that violent acts can be required to bring about change; assisted him by providing information that he used to plan specifics like what weapons to use and how to use them; and generally provided what he viewed as encouragement in his delusion that he should carry out a massacre, down to the detail of what time would be best to encounter the most traffic on campus,” according to the U.K. Guardian.

ChatGPT, the suit argued, “should have realized the combination of Ikner’s inputs into the product would lead to mass casualties and substantial harm to the public.”

"Ikner had multiple lengthy conversations with ChatGPT about his interests in Hitler, Nazis, fascism, national socialism, Christian nationalism and worse. They talked about multiple mass shootings and they planned this shooting together," attorney Bakari Sellers, who is representing Joshi, said in a statement, according to CBS News.

“Not once did anyone flag that as concerning. No one called the police or a psychiatrist or even Ikner’s family because, to do so, would violate OpenAI’s business model,” he said.

OpenAI representative Drew Pusateri said the company is not to blame.

“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” Pusateri said.

“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity,” he said.
