Instagram is becoming a safer platform for young users. The Facebook-owned photo-sharing platform has banned adults from direct messaging teenagers who don’t follow them and is introducing “safety prompts” that will be shown to teens when they message adults who have been “exhibiting potentially suspicious behaviour.” Safety prompts will give teenagers the option of reporting or blocking the adults who are messaging them. These prompts will also remind young users that they don’t have to respond to messages, and that they should be careful about sharing media or information with someone they don’t know.
With the new changes, notices will appear when Instagram‘s moderation systems spot suspicious behaviour from adult users. Instagram is not sharing details on how these systems operate, but it has said that suspicious behaviour could include sending “a large amount of friend or message requests to people under 18.” The new feature will be available in some countries this month and will be rolled out globally soon, though Instagram has not said which countries will get it first.
Instagram also said that it is developing new AI and machine learning technology to try to detect a person’s age when they sign up for an account. Officially, the app requires users to be aged 13 or above, but it is easy to lie about one’s age. Instagram said that it wants to do more to stop this from happening, but didn’t go into detail on how the new systems will help.
Instagram also said that it will encourage new teenage users to make their profiles private. If they choose to create a public account, Instagram will send them a notification that highlights the benefits of a private account and reminds them to check their settings.