Meta Adds New Safeguards for Kids and Teenagers on All Platforms


Meta unveiled new safety features for teen Instagram accounts on Wednesday. The Menlo Park-based social media giant announced that it is extending its Teen Account protection and safety features to give teens additional tools when they communicate with other users through the platform’s direct messaging (DM) feature. With this update, teens using Meta apps will receive additional context about the people they are DMing. Meanwhile, certain adult-managed accounts will now receive some of the same protections as teen accounts.

Meta Introduces New Instagram Safety Features for Teens

According to Meta, new safety features have been added to direct messages (DMs) in teen accounts to help users spot potential scammers and give them more context about the accounts they are dealing with. The update gives teens more options for blocking accounts along with in-app safety tips. Teens will also see the month and year the other account joined Instagram, displayed prominently at the top of new chat windows. A new combined block-and-report option will let users do both at once.

Both of these new safety measures appear in Instagram direct messages. The first shows teens safety tips before they message another user, even if the two accounts follow each other. These tips advise teens to carefully review the other person’s profile and remind them that they “don’t have to chat with them if something doesn’t feel right.” They also serve as a reminder to minors to exercise caution when sharing information with others.

The second shows teens the month and year the other account joined Instagram at the top of the chat screen when they message someone for the first time. This makes it easier to identify potential scammers and gives teens additional context about the accounts they are messaging.

Additionally, Meta is extending Teen Account safety features to adult-run accounts that primarily feature children. These accounts, which often have a child as the profile picture, are typically managed by parents or child talent managers on behalf of children under the age of 13 and frequently post pictures and videos of those children.

Notably, Meta removes accounts created by children themselves, though adults are permitted to manage accounts on a child’s behalf as long as they state this in the account’s bio.

Furthermore, Meta is expanding several Teen Account safeguards to these adult-managed accounts featuring children. These include enabling Hidden Words, which filters offensive comments, and defaulting to the strictest messaging settings to prevent unwanted messages. These updates will roll out in the coming months.

Sanchita Patil
