Antigone Davis, the global head of safety policy at Meta, released a series of legislative recommendations Tuesday morning in a blog post on Medium. Titled "A framework for legislation to support parents and protect teens online," its proposals include a federal mandate that would centralize parental consent, parental controls and user age verification for apps within a mobile device's app store or operating system. It also encourages the development of national standards for age-appropriate content and for ads targeting minors.
Davis said the goal is not to absolve social media companies of responsibility but to create and enforce industrywide standards that will hold platforms to account.
Blumenthal described Meta’s framework as a blatant deflection.
“Meta’s adamant attempts to deflect responsibility for its own products are beyond the pale,” Blumenthal said Tuesday in a joint statement with Sen. Marsha Blackburn of Tennessee. “The company’s proposals push the responsibility of safety onto parents without making the necessary changes to toxic black box algorithms or Big Tech’s harmful business model.”
Blumenthal and Blackburn are the lead sponsors of the bipartisan Kids Online Safety Act. The legislation, which has been criticized by technology companies, digital privacy activists and LGBTQ+ rights groups over concerns about censorship and other potential unintended consequences, would allow underage users and their parents to opt out of addictive app features and algorithmic recommendations on social media platforms.
On Tuesday Blumenthal and Blackburn said the proposal is “necessary to ensure Big Tech accountability and real, lasting reforms that actually protect and empower young people online.”
Under Meta’s proposed framework, app stores would share user age information with social media platforms. Davis said operating systems already collect this information when parents register and set up a child’s mobile device. If a child under the age of 16 tries to download an app, the app store would immediately notify the parents with a request for approval.
Social media platforms would use the app store-verified age to automatically funnel users into an age-appropriate experience and offer parental supervision tools for children under age 16.
The framework also calls for a federal mandate requiring certain apps to offer accessible parental controls. Additionally, it advocates the creation of national, age-appropriate standards for social media content, similar to the ratings used for movies and video games, and for advertisements, such as restricting ad personalization to just age and location for users under 16.
© 2024 Hartford Courant. Distributed by Tribune Content Agency, LLC.