Meta, the social media giant, has been asked to reverse its decision to lower the age limit for using WhatsApp.
On Thursday, Meta implemented a change in the UK and EU that reduced the minimum age from 16 years to 13 years.
The campaign group Smartphone Free Childhood called the move by Meta, which owns Facebook, Instagram and other social media sites, “tone-deaf”.
The Times quoted Daisy Greenwell, co-founder of WhatsApp: “WhatsApp puts shareholder profits before children’s safety.”
Reducing the age for using the app from 16 to 13 is tone deaf, she said, and ignores alarms being raised by many experts, including doctors, scientists, educators, child safety specialists, parents, and mental health professionals.
WhatsApp, she continued, is seen by parents as the safest social media application, “because, after all, it’s only messaging.”
In this way, it acts as a gateway to other social media applications. Why not send a Snapchat message to your WhatsApp friends?
WhatsApp is not risk-free. The app is often the first place where kids are exposed to extremist content. Bullying and harassment are common. It is also the preferred messaging application for predators because of its end-to-end encryption.
Conservative MP Vicky Ford of the Education Select Committee said Meta’s decision to lower the recommended age without consulting parents is “highly irresponsible”.
The Prime Minister, Rishi Sunak, told the BBC that the Online Safety Act will give regulators the power to make sure social media firms protect children from harmful content.
He added: “They should not be seeing things, especially self-harm. If they do not comply with the regulations that are set down by the regulator, there could be very serious fines. Because like all parents, we want to see our children grow up in safety, whether it’s on the fields, or in the online world.”
WhatsApp said that the new age limit was in line with that in most countries, and that protections were in place.
Meta has announced a new set of safety features to help protect its users, especially young people, against “sextortion”.
The company confirmed that it would begin testing a nudity-protection filter for direct messages on Instagram. The filter will be on by default for users under the age of 18 and will automatically blur any images that Instagram detects as containing nudity.
Users will see an alert when they receive nude pictures, and they can choose to report or block the sender.