By Aaron Chowdhury, Chattanooga, Tenn.
In an effort to help prevent suicide, social media giant Facebook has rolled out a new self-harm and suicide prevention reporting tool.
Facebook has partnered with the National Suicide Prevention Lifeline and several other mental health organizations.
If a user scrolls through their feed and notices a harmful or suicidal post, the user can report the post to Facebook. Once a post is reported, Facebook's staff reviews it and determines whether it is harmful.
If the post is considered harmful, the user who made it is greeted with a message the next time they log into Facebook.
“The idea is good,” said UTC student counselor Kristen Vaughn. “It gets resources out there and it gets students one step closer to help. But then you’re bringing in a corporation into something very personal, and I would hope the friends wouldn’t contact Facebook, the friends would contact the friend.”
Some students on campus also expressed the same concerns as Vaughn.
“I use Facebook quite often and I can see how some could benefit from this, but I could see many people getting offended or refusing to get help,” said Kayla Wilson, a junior from Knoxville. “Friends and family could also misinterpret a status update and think it may be harmful when in reality it is not.”
Facebook is a platform where users can share and post links to other websites, videos, photos and so on.
To avoid having a post reported or mistaken for being harmful, students should be careful about the content they post to Facebook.
Vaughn also urges students to be careful with, and accountable for, what they post on social media.
“Be cautious of how you use social media, and take responsibility,” she said.
To report a post to Facebook, a user clicks the small arrow above the post and selects “Report.” Facebook users can also report spam and offensive or inappropriate content.