Meta criticised after lowering WhatsApp minimum age from 16 to 13

Meta has lowered the minimum age to use the popular messaging platform WhatsApp.

The move, which came into effect on Thursday, reduces the age limit from 16 to 13 in the UK and EU.

It has been criticised by a number of campaign groups, which have urged the company to reverse the decision.

Smartphone Free Childhood told Sky News that it was an example of “a tech giant putting shareholder profits first and children’s safety second”.

A spokesperson for the group said: “Reducing their age of use from 16 to 13 sends a message to parents that WhatsApp is safe for children, but the stories we’re hearing from our community of parents paint a very different picture.”

Conservative MP Vicky Ford, a member of the education select committee, said that Meta’s decision to reduce the age recommendation without consulting parents was “highly irresponsible”.

Meta has defended the move, with a spokesperson saying: “We give all users options to control who can add them to groups, and the first time you receive a message from an unknown number we give you the option to block and report the account.”

However, Smartphone Free Childhood questioned the effectiveness of WhatsApp’s safety features and said the app, like other social media platforms, could prove disruptive for students at school.

They also raised the spectre of young people having “unrestricted internet access in their pockets” and the wider effects this can have on mental health, social lives, and development.

WhatsApp already has a number of security and safety features, including making users contactable only by people who have their number, as well as controls over who can see their profile.

Meta says the move brings the UK and EU age limit into line with that in the majority of other countries.

Other popular social media platforms like Snapchat, TikTok, and X, formerly Twitter, use 13 as a minimum age too.

This week, Meta also unveiled a range of new safety features intended to protect users, particularly young people, from intimate image abuse and “sextortion”.

It confirmed it will begin testing a filter in direct messages, called Nudity Protection, which will be turned on by default for those aged under 18.

The tool will automatically blur images thought to contain nudity and users will also see a message urging them not to feel pressured to respond, as well as an option to block the sender and report the chat.

A number of recent studies have shown that parents are concerned about children’s social media and smartphone use.

A recent poll from charity Parentkind found that more than four in five parents (83%) said they felt smartphones were “harmful” to children and young people, and 58% believed the government should introduce a ban on smartphones for under-16s.

In February, the Department for Education published guidance instructing teachers how they could completely ban phones in schools.

Daniel Kebede, the head of the National Education Union, the largest education union in the UK, called on the government to hold an inquiry into “dangerous” online content young people could access on their smartphones.