Facebook today apologized for suddenly disabling certain apps last month, saying that it “over-weighted certain types of user feedback, causing us to erroneously disable some apps”.
It has also taken several steps to help avoid issues like this in the future. New feedback metrics and a benchmark for how much negative feedback is unacceptable have been added to Application Insights. A new granular enforcement system has been instituted such that only an app’s social channels that are drawing negative feedback will be blocked. Finally, rather than being effectively deleted, apps subject to suspension are now put in Disabled Mode so developers can still “test the app, edit settings, and view Insights.”
These changes should increase developer confidence in the Platform and allow them to test new communication and viral features without risking that their entire app might be deleted.
During the last week of June, Facebook changed how negative feedback for apps was weighted in its automatic app spam-prevention enforcement system. For instance, an app’s wall posts being marked as spam were more likely to trigger enforcement. This caused some apps to suddenly be deleted, infuriating developers.
Facebook allowed affected developers to appeal the enforcement, and began reinstating some of the disabled apps, though others such as Game of Truth are still disabled. Being disabled, even for only a short time, negatively impacts monetization as well as user growth and retention. Some developers said they had been treated unfairly, and that there was no way of telling how much negative feedback was too much. Others wanted the ability to modify and test their apps instead of being locked out while they were suspended.
A statement from Facebook today noted that “we realize that any downtime has a significant impact on both our developers and users. Many of our developers have chosen to build their businesses on top of Facebook, and we take that responsibility very seriously.” In an effort to make its enforcement system more predictable and rebuild trust with developers, it has now answered many of the requests of the affected developers with policy changes and data that should be available to all developers soon.
New Feedback Metrics in Application Insights
Developers will now see a News Feed tab in their Application Insights that displays positive and negative feedback. A spam reports per story published graph includes a benchmark in the form of green and red zones that indicates whether an app is receiving enough negative feedback to warrant enforcement. In the screenshot Facebook provides, the threshold appears to be 0.0023 spam reports per story, though it may vary from app to app.
The new metrics will let developers test viral mechanisms and accurately assess whether they are causing too many spam reports. This transparency should increase developer trust in the Platform. However, it might degrade the user experience by encouraging developers to be as spammy as possible while still remaining under the threshold.
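The benchmark described above amounts to a simple ratio check. Here is a minimal sketch of that logic, assuming the 0.0023 figure from Facebook’s screenshot; the function names and the per-app threshold are illustrative, not part of any Facebook API.

```python
# Hypothetical illustration of the Insights benchmark: spam reports
# divided by stories published, compared against a red-zone threshold.
# The 0.0023 value comes from Facebook's screenshot and may differ per app.
SPAM_THRESHOLD = 0.0023  # spam reports per story published (assumed)


def spam_rate(spam_reports, stories_published):
    """Return spam reports per story published."""
    if stories_published == 0:
        return 0.0
    return spam_reports / stories_published


def in_red_zone(spam_reports, stories_published, threshold=SPAM_THRESHOLD):
    """True if the app's negative-feedback rate would warrant enforcement."""
    return spam_rate(spam_reports, stories_published) > threshold


# An app publishing 100,000 stories that draws 150 spam reports
# (rate 0.0015) stays in the green zone; 300 reports (0.0030) does not.
print(in_red_zone(150, 100_000))  # False
print(in_red_zone(300, 100_000))  # True
```

The point of exposing the threshold is that a developer can now run exactly this kind of check against their own Insights numbers before a new viral feature triggers enforcement.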
Granular Enforcement

Previously, too much negative feedback on a single social channel would cause an entire app to be disabled. This meant the unforeseen consequences of a new viral mechanism, or a negative response to a tertiary social channel, could bring down an entire app. It also made it less clear what a developer needed to change to return to good standing with Facebook.
Now Facebook will use a granular enforcement system whereby only the social channel causing the negative feedback will be disabled. Facebook explains that “for example, if an app is generating a lot of negative feedback via chat messages, we will take action only on that app’s ability to publish to chat but otherwise leave the app intact.” This will make it much more obvious what mechanism must be modified for an app to be reinstated.
Developers can also appeal granular enforcement rather than the suspension of their entire app. Apps drawing negative feedback across channels are still subject to disablement.
Disabled Mode

Before today, if an app was disabled, it was effectively deleted, becoming completely unavailable to both users and its developers. This prevented developers from checking Insights logs, testing their apps, and making changes to their settings or the app itself. Combined with enforcement emails that don’t always include enough detail for developers to learn what they were doing wrong, this kept developers from quickly fixing their apps and appealing the enforcement.
Now disabled apps are placed in Disabled Mode, which makes them unavailable to users, but developers can still access them.
Incidents like what happened at the end of June can send ripples through the development community, leading some to consider switching to making apps for other platforms such as iOS or Android. With Facebook looking to compete with these mobile operating systems in the near future, it needs both top app makers and the long tail of developers behind it.
It’s somewhat surprising that Facebook let its auto-enforcement system get too aggressive considering its turbulent history with developers, especially since it has spent the past months focusing on improving developer relations with its “Operation: Developer Love”. The Facebook developer community might not be quick to forget, but these changes might make it more willing to forgive.