Instagram, under fire from safety advocates to keep children off the app and prevent teens from seeing harmful content, is testing new ways to verify users’ age. Among them: running users’ video selfies through artificial intelligence software that can estimate whether they are adults.
The Meta Platforms Inc.-owned app recently started requiring users to submit their birth date to verify that they are over 13 and eligible to use Instagram. The company has also introduced new privacy settings for 13- to 18-year-olds, including parental controls. Now, if someone tries to change their profile to say they’re an adult, Instagram has a few options beyond submitting a personal identification card.
Starting in the US, Instagram will accept video selfies, which Meta will submit to the identity verification company Yoti. “Yoti’s technology estimates your age based on your facial features and shares that estimate with us,” Instagram said in a statement. “Meta and Yoti then delete the image.”
Instagram is making the changes as part of a commitment to raise its standards around protecting teenagers. That promise came after a whistle-blower testified in October that Facebook had prioritized profit over the wellbeing of users, especially teens.
Yoti said it trained the AI through “anonymous images of diverse people from around the world who have transparently allowed Yoti to use their data.” It has knowledge of what under-13s look like because of images obtained with parental consent, it added.
If users don’t want to submit a video or an ID, they can instead ask three adult users to vouch for them. Those users will receive a request to confirm the person’s age, must respond within three days, and can’t be vouching for anyone else at the same time.