Google is tightening supervision of its Play Store by forming a team that will screen new apps for malware and sexually explicit material.
While Google is not adopting Apple’s approval process, its new team will screen each app submitted to the Play Store to catch anything that runs afoul of its rules earlier. According to a company blog post, Google began reviewing apps several months ago “to better protect the community and improve the app catalog”.
Eunice Kim, product manager for Google Play, said: “This new process involves a team of experts who are responsible for identifying violations of our developer policies earlier in the app lifecycle. We value the rapid innovation and iteration that is unique to Google Play, and will continue to help developers get their products to market within a matter of hours after submission, rather than days or weeks. In fact, there has been no noticeable change for developers during the rollout.”
As well as the app review process, Google Play has rolled out improvements to the way it tracks publishing status, so developers now have more insight into why apps are rejected or suspended and can easily fix and resubmit their apps after minor policy violations.
“Over the past year, we’ve paid more than $7 billion to developers and are excited to see the ecosystem grow and innovate,” Kim said. “We’ll continue to build tools and services that foster this growth and help the developer community build successful businesses.”
Google Play also added an age-based rating system for apps and games to help developers better label their apps for the right audience. Consistent with industry best practices, the change gives developers an easy way to communicate familiar, locally relevant content ratings to their users, and should improve app discovery and engagement by letting people choose content that is right for them. Starting in May, all new apps and updates to existing apps will require a completed rating questionnaire before they can be published on Google Play.