Los Angeles -- Amid the ongoing push for increased transparency and shared responsibility, social media giant Facebook is not only beefing up security this week, it is also becoming a bit more transparent. Today the company is rolling out a new Support Dashboard, allowing users to see the exact status of filed reports.
Facebook today announced initial testing of the Support Dashboard, a feature that allows you to see what happens after you click the “Report” button. To that end, the social networking giant says the feature is designed to help people better understand reporting and educate them about how Facebook goes about resolving issues sent its way. The Support Dashboard will be directly accessible from your account settings (facebook.com/settings) once it becomes available.
“The Support Dashboard is a portal designed to help you track the progress of the reports you make on Facebook,” Facebook explained in a blog post. “From your Support Dashboard, you can see if your report has been reviewed by Facebook employees who assess reports 24 hours a day, seven days a week, in dozens of languages.”
Here is what the dashboard looks like for users:
As Facebook tightens the security of its vast community, when you report a photo or someone's account--one that you believe violates Facebook's community standards--you will now be able to go into the Support Dashboard and see whether that report has been received and reviewed, and what action the Facebook User Operations team ultimately took. The dashboard will also alert you when that decision has been made.
“We are always looking for ways to make reporting easier and more transparent for the people who use Facebook. Through these vital reports, people help us effectively take down content that violates our policies,” Facebook's Product Manager for Support Engineering and Site Integrity, Terry Guo, told ABC News. “The hope is that the Support Dashboard will provide greater clarity and accountability to our processes.”
This is a significant move for the company, as it promises to update the status of every submission, giving users the ability to see why an action was taken, or why it was not.
Once you click the “Report” button on a piece of content, you will be taken to this new dashboard, which the company is currently testing.
Here is what Facebook had to say about it on their safety blog:
At Facebook, we believe that safety is a shared responsibility. We encourage the more than 900 million people who use our service to report content to us that violates our Community Standards. These reports enable our team of review professionals to quickly and effectively remove abusive content from Facebook. However, we have consistently received feedback that once people report something to us, they do not know where it goes or whether it has been handled. Today, we are excited to announce initial testing of a feature that allows you to see what happens after you click “Report.”
As soon as the status of a submitted issue changes, the user is notified and can review the outcome. For a company widely regarded as engineering-driven, bringing this level of communication and support to the table is a welcome addition.
Facebook is certainly making an effort to be more transparent. Now serving more than 900 million users, Facebook is clearly interested in retaining those users by reaching out to them in ways that not only make them feel heard, but also make them feel more secure. By shifting to a policy of increased transparency, the social media giant is hoping that users will be more satisfied with the way it addresses--or does not address--user-reported issues.
While the feature by no means resolves Facebook's many problems when it comes to reporting content, it is definitely a step in the right direction. That being said, let us hope there is more on the way, because this alone is not enough. Facebook has a long way to go in the transparency department.