In its mission to fight bogus live streams, Facebook has updated its developer policies so that the platform cannot be used merely for attention-grabbing broadcasts that flood people's news feeds with irrelevant content.
The move also explicitly forbids live videos that are "only images" (including animated images) as well as polls tied to largely static material, Engadget reported on Monday.
Just as it does not want people broadcasting crimes on Facebook, the company does not want News Feeds cluttered with live videos that are merely attempts to stand out from the crowd, the report said.
Facebook wants truly live video, whether it is a professional news broadcast or an impromptu feed from a friend's party.
“If you (developers) enable people to publish Live Video to Facebook, remind them of their obligation to not include third party ads in their video content and to clearly distinguish any pre-recorded content from live content,” Facebook said in a post.
Facebook came under scrutiny after several murders and deaths were broadcast live on the platform without a prompt response, and it has since moved to curb misuse of the feature.
In March, the social media giant expanded its portfolio of suicide-prevention tools, which use Artificial Intelligence (AI) and pattern recognition to help troubled users.
The new tools build on those Facebook launched in 2015, which allowed users' friends to flag a troubling image or status post.