Football player’s apparent suicide fuels discussion about warning signs

Meanwhile, debate persists about AI chatbots for mental health support.

After a Dallas Cowboys player died by apparent suicide, social media users expressed concern that suicide warning signs could go unnoticed in successful people. An article about another man whose parents say his suicide was instigated by his conversations with ChatGPT prompted further debate about the role of AI in mental health support.

In response, communicators may share suicide prevention resources, reiterate the risks of using AI for mental health support, and recirculate alternative free or low-cost resources.


Insights brought to you by the reporters and science writers of Public Good News (PGN), a nonprofit newsroom dedicated to improving community health.


On November 6, Dallas Cowboys player Marshawn Kneeland died by apparent suicide. Articles and social media posts about his death have garnered thousands of engagements. Many commenters said that men need more mental health support and worried that mental health struggles may go unnoticed in people with successful careers. One comment read, “This man scored his first NFL touchdown Monday, showing even after such a happy moment people are battling within.” Some also shared personal stories about suicidal ideation or losing loved ones to suicide, with a few expressing concern that they had missed warning signs.

Online conversations about the risks of using AI chatbots for mental health support continued in recent weeks. On November 6, CNN reported that in July, a 23-year-old died by suicide, which his parents say ChatGPT encouraged. His parents are now suing OpenAI. Social media posts expressed frustration and hopelessness about the wave of suicides allegedly connected to AI use, with one X post garnering more than 1 million views. Many commenters were particularly worried about how children and teens interact with chatbots. In response to a November 13 Facebook post from the American Psychological Association warning that AI “is not a replacement for a qualified mental health care provider,” some commenters said that AI chatbots may provide support for people who are uncomfortable seeking help from a therapist. Others suggested that therapists are unable to help people cope with serious trauma.


Recommendations brought to you by the health communication experts behind Infodemiology.com.

Recommendations for public health professionals

Each week, the Infodemiology.com team will provide messaging recommendations in response to some of the trending narratives outlined above. These helpful tips can be used when creating content, updating web and FAQ pages, and developing strategy for messaging about mental health.

Conversations about suicide provide an opportunity to recirculate suicide warning signs and stress that anyone can struggle with mental health, no matter their gender or how successful they are. Communicators may direct people to the 988 Suicide & Crisis Lifeline and share general mental health resources, including local mental health centers. Communicators may also want to highlight resources tailored to men, such as therapist directories for finding local therapists who specialize in men’s issues, local support groups, and the ManKind Project, which connects men to peer support. Sharing support groups and other resources for those who have lost loved ones to suicide is also recommended.

As concerns about AI and mental health persist, communicators may continue to educate people about the risks of relying on AI for mental health support. Messaging may reiterate that AI is not a replacement for human mental health care. Since research shows that some young adults may feel more comfortable talking to a chatbot than speaking in person, communicators may share human-led mental health support that people can access via text, online chat, or telehealth.