- Google has removed bad apps and reviews from the Play Store
- For the clean-up process, it uses a machine learning-backed anti-spam system
- Google asks people to avoid writing profane and hateful reviews
Google has announced that it has removed millions of reviews and ratings it detected were not trustworthy or genuine, and deleted thousands of bad apps from the Play Store that were identified through suspicious review and rating activity. The company says it uses a machine learning-powered anti-spam system to carry out this combing process and keep the Play Store safe.
“Google Play ratings and reviews are extremely important in helping users decide which apps to install. Unfortunately, fake and misleading reviews can undermine users' trust in those ratings. User trust is a top priority for us at Google Play, and we are continuously working to make sure that the ratings and reviews shown in our store are not being manipulated,” Fei Ye, Software Engineer, and Kazushi Nagayama, Ninja Spamologist, wrote in a blog post.
But what are the parameters Google uses to identify bad reviews? Google has listed various ways in which ratings and reviews may violate its developer guidelines. The first is bad content: reviews that use profane or hateful language, or that are off-topic, are removed from the “Reviews” section below an app's listing. Incentivized ratings and reviews, given by real humans in exchange for money or valuable items, are also removed.
Fake ratings and reviews meant to manipulate an app's average rating or top reviews also violate Google's policy. The company says it has seen different approaches to manipulating the average rating: from 5-star taps that boost an app's average rating, to 1-star attacks that drag it down. “When we see these, we take action on the app itself, as well as the review or rating in question,” the company said.
Earlier this year, the Google Play Trust & Safety teams deployed a system that combines human intelligence with machine learning to detect and enforce policy violations in ratings and reviews. “A team of engineers and analysts closely monitor and study suspicious activities in Play's ratings and reviews, and improve the model's precision and recall on a regular basis,” the company noted. Google claims that it regularly asks its skilled reviewers to check the decisions made by the machine learning models for quality assurance.
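Google has not published details of its model, but the precision and recall the engineers say they tune are standard classification metrics. A minimal illustrative sketch, using entirely made-up labels for a hypothetical review-spam classifier:

```python
# Illustrative only: computing precision and recall for a hypothetical
# review-spam classifier. Labels: 1 = spam review, 0 = genuine review.
# All data below is invented for the example.
y_true = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]  # analyst ground truth
y_pred = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]  # model's verdicts

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives

precision = tp / (tp + fp)  # of reviews flagged as spam, how many truly were
recall = tp / (tp + fn)     # of all spam reviews, how many were caught

print(f"precision={precision:.2f} recall={recall:.2f}")
# → precision=0.80 recall=0.80
```

Improving precision reduces genuine reviews wrongly removed, while improving recall catches more spam; the human quality-assurance checks Google describes are one way to keep both honest.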
The Mountain View-based company has also asked developers and users to help by following certain guidelines. For developers, Google says they should not buy fake or incentivized ratings, or run campaigns like “Give us 5 stars and we'll give you this in-app item!”. Such campaigns, whether run in-app or elsewhere, count as incentivized ratings and are prohibited by policy.
For users, Google suggests that they should not accept money or goods (even virtual ones) in exchange for reviews and ratings. They are also advised not to use gibberish, or hateful, sexual, or profane language when criticising an app or game.