Emily started using Instagram when she was in her mid-teens and found it helpful at first. She used the photo-sharing app to follow fitness influencers, but what began as a constructive relationship with the platform spiralled into a crisis centred on body image. At 19 she was diagnosed with an eating disorder.
“I felt like my body wasn’t good enough, because even though I did go to the gym a lot, my body still never looked like the bodies of these influencers,” says Emily, now a 20-year-old student who is in recovery.
Emily, who preferred not to use her real name, uses Instagram sparingly now. She is one of many Instagram users whose suffering came to prominence this week with revelations that the platform’s owner, Facebook, seemed to know it was damaging teenage girls’ mental health.
According to internal research leaked to the Wall Street Journal (WSJ), the app has made body image issues worse for one in three girls. In one Facebook study of teenagers in the UK and the US, more than 40% of Instagram users who said they felt “unattractive” said the feeling began while using the app.
Instagram has more than 1 billion users worldwide and an estimated 30 million in the UK, with Kim Kardashian, Selena Gomez and Ariana Grande among the accounts with hundreds of millions of followers between them. In the UK, the Love Island couple Liam Reardon and Millie Court have already raced to a combined following of nearly 3 million since winning the 2021 title.
Two in five girls (40%) aged 11 to 16 in the UK say they have seen images online that have made them feel insecure or less confident about themselves. This increases to half (50%) in girls aged 17 to 21, according to research by Girlguiding in its annual girls’ attitudes survey.
Sonia Livingstone, professor of social psychology at the department of media and communications, LSE, describes adolescence for teenage girls as an “arc” that tends to begin with staple interests such as pets, painting or playing with younger siblings, and ends with a more confident young woman ready to face the world. But it is the experience in the middle of that parabola that represents a particular challenge, and where Instagram can be most troubling.
“It is at that point where they are assailed with many answers to their dilemmas and a prominent answer at the moment is that it might be what they look like, that it matters what they bought,” says Livingstone, who next week is due to give evidence to MPs and peers scrutinising the draft UK online safety bill, which imposes a duty of care on social media companies to protect users from harmful content.
Facebook’s in-depth research into the photo-sharing app stated that Instagram had a deeper effect on teenage girls because it focused more on the body and lifestyle, compared with TikTok’s emphasis on performance videos such as dancing, and Snapchat’s jokey face features. “Social comparison is worse on Instagram,” said the Facebook study. The leaked research pointed to the app’s Explore page, where an algorithm tailors the photos and videos that a user sees, potentially creating a spiral of harmful content.
“Aspects of Instagram exacerbate each other to create a perfect storm,” said the research.
Livingstone says a key feature of the online safety bill will be its provisions on regulating the algorithms that constantly tailor and tweak what you view according to your perceived needs and tastes – and can push teenage girls into that vortex of esteem-damaging content. “There is a lot to be done about algorithms and AI [artificial intelligence].”
Beeban Kidron, the crossbench peer who sits on the joint committee into the online safety bill and was behind the recent introduction of a children’s privacy code, says Ofcom, the UK communications watchdog, will have a vital role in scrutinising algorithms.
“The value in algorithmic oversight for regulators is that the decisions that tech companies make will become transparent, including decisions like FB took to allow Instagram to target teenage girls with images and features that ended in anxiety, depression and suicidal thoughts. Algorithmic oversight is the key to society wrestling back some control.”
A spokesperson for the Department for Digital, Culture, Media and Sport says the bill will address those concerns. “As part of their duty of care, companies will need to mitigate the risks of their algorithms promoting illegal or harmful content, particularly to children. Ofcom will have a range of powers to ensure they do this, including the ability to request information and enter companies’ premises to access data and equipment.”
For others, there is a wider issue of educating the young how to navigate a world dominated by social media. Deana Puccio, co-founder of the Rap project, which visits schools across the UK and abroad to discuss issues such as consent, online and offline safety and building confidence in body image and self-esteem, says the bill should be accompanied by a wider education drive.
“We, parents, educators, politicians need to equip our young people with the tools, the analytical skills to make healthy choices for themselves. Because they will get access to whatever they want to. They are better at navigating the online world than we are.”
Puccio adds that teenagers should be encouraged to make their social media posts reflect a more realistic vision of the world. “We need to start building up people’s confidence to post real-life ups and downs.”
The head of Instagram risked fanning criticism of the app on Thursday with comments that compared social media’s impact on society to that of cars. “We know that more people die than would otherwise because of car accidents, but by and large, cars create way more value in the world than they destroy. And I think social media is similar,” said Adam Mosseri.
Facebook referred the Guardian to a blogpost by Karina Newton, the head of public policy at Instagram, who said the internal research showed “our commitment to understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues”.
Responding to the algorithm and drug cartel allegations, Facebook said divisions had existed in society long before its platform appeared and that it had a “comprehensive strategy” for keeping people safe in countries where there was a risk of conflict and violence.
The UK’s Competition and Markets Authority (CMA) has unveiled compliance principles to curb locally some of the sharper auto-renewal practices of antivirus software firms.
The move follows the watchdog baring its teeth at McAfee and Norton over the issue of automatically renewing contracts.
The CMA took exception to auto-renewal contracts for antivirus software that customers in the UK signed up for and found difficult to cancel. Refunds and clearer pricing information (including making sure consumers were aware that year two could well end up considerably costlier than the first) were the order of the day.
Today’s principles build on that work, and are aimed at helping antivirus companies toe the line where UK consumer law is concerned. They are a bit more detailed than a simple “stop being horrid.”
The focus remains on auto-renewing contracts, where a customer signs up for a fixed period, then is charged again for subsequent periods. The CMA acknowledges that such arrangements are convenient, but they risk locking the consumer into an agreement they no longer want, or stinging them with higher fees at renewal time.
While the principles are intended to be helpful, lurking in the background is consumer law and the threat of a potential trip to court for vendors stepping out of line.
First up comes a requirement to make sure customers are informed about auto-renewal, rather than hiding the detail in an End User Licence Agreement (EULA) or burying it in hard-to-read text through which a user must scroll.
Price claims must be “accurate” and “not mislead your customers” – so only show discounts against the normal price. It must also be possible to turn off auto-renew easily and keep it off once disabled; if it is on, customers must be reminded in good time that an auto-renewal will happen.
Getting a refund must be easier and customers should be able to change their mind when auto-renewal happens. If the customer has stopped using the product, safeguards are needed around auto-renewal.
The last principle could pose a few challenges – how does a vendor become aware that a customer is not using its product? The suggestion from the CMA is to check if software updates are being received rather than simply charging users year after year.
The Register contacted McAfee and Norton for their thoughts on the principles, and will update should the companies respond. ®
Just a few months after hitting unicorn status, Gorillas has raised another major round of funding from big-name investors.
German start-up Gorillas has raised nearly $1bn to expand its on-demand grocery delivery business.
The Series C funding round was led by Delivery Hero, the German food and grocery delivery giant that recently took a stake in Deliveroo.
Gorillas also received backing from existing investors including Coatue Management, DST Global and Tencent, as well as new investors G Squared, Alanda Capital, Macquarie Capital, MSA Capital and Thrive Capital.
The fresh funding comes just a few months after the company’s $290m Series B, which brought its valuation to more than $1bn.
Gorillas was founded in Berlin in 2020 by Kağan Sümer and Jörg Kattner, promising grocery deliveries in as little as 10 minutes.
It now operates more than 180 warehouses and has expanded to more than 55 cities in nine countries, including Amsterdam, London, Paris, Madrid, New York and Munich.
The company plans to use the latest funding for its next phase of development. This includes reinforcing its footprint in existing markets and investing in operations, technology and marketing.
“The size of today’s funding round by an extraordinary investment consortium underscores the tremendous market potential that lies ahead of us,” said Sümer, who is CEO of the start-up.
“With Delivery Hero, we have chosen a strong strategic support that is deeply rooted in the global delivery market, and is renowned for having unique experience in sustainably scaling a German company internationally.”
On-demand grocery delivery is a growing area in Europe that’s attracting investor attention.
The Information Commissioner’s Office is to intervene over concerns about the use of facial recognition technology on pupils queueing for lunch in school canteens in the UK.
Nine schools in North Ayrshire began taking payments for school lunches this week by scanning the faces of their pupils, according to a report in the Financial Times. More schools are expected to follow.
The ICO, an independent body set up to uphold information rights in the UK, said it would be contacting North Ayrshire council about the move and urged a “less intrusive” approach where possible.
An ICO spokesperson said organisations using facial recognition technology must comply with data protection law before, during and after its use, adding: “Data protection law provides additional protections for children, and organisations need to carefully consider the necessity and proportionality of collecting biometric data before they do so.
“Organisations should consider using a different approach if the same goal can be achieved in a less intrusive manner. We are aware of the introduction, and will be making inquiries with North Ayrshire council.”
The company supplying the technology claimed it was more Covid-secure than other systems, as it was cashless and contactless, and sped up the lunch queue, cutting the time spent on each transaction to five seconds.
Other types of biometric systems, principally fingerprint scanners, have been used in schools in the UK for years, but campaigners say the use of facial recognition technology is unnecessary.
Silkie Carlo, the director of Big Brother Watch, told the Guardian the campaign group had written to schools using facial recognition systems, setting out their concerns and urging them to stop immediately.
“No child should have to go through border-style identity checks just to get a school meal,” she said. “We are supposed to live in a democracy, not a security state.
“This is highly sensitive, personal data that children should be taught to protect, not to give away on a whim. This biometrics company has refused to disclose who else children’s personal information could be shared with and there are some red flags here for us.”
The technology is being installed in schools in the UK by a company called CRB Cunninghams. David Swanston, its managing director, told the FT: “It’s the fastest way of recognising someone at the till. In a secondary school you have around about a 25-minute period to serve potentially 1,000 pupils. So we need fast throughput at the point of sale.”
Live facial recognition, technology that scans crowds to identify faces, has been challenged by civil rights campaigners because of concerns about consent. CRB Cunninghams said the system being installed in UK schools was different – parents had to give explicit consent, and cameras checked against encrypted faceprint templates stored on school servers.
A spokesperson for North Ayrshire council said its catering system contracts were coming to a natural end, allowing the introduction of new IT “which makes our service more efficient and enhances the pupil experience using innovative technology”.
They added: “Given the ongoing risks associated with Covid-19, the council is keen to have contactless identification as this provides a safer environment for both pupils and staff. Facial recognition has been assessed as the optimal solution that will meet all our requirements.”
The council said 97% of children or their parents had given consent for the new system.
A Scottish government spokesperson said that local authorities, as data controllers, had a duty to comply with general data protection regulations and that schools must by law adhere to strict guidelines on how they collect, store, record and share personal data.
Hayley Dunn, a business leadership specialist at the Association of School and College Leaders, said: “There would need to be strict privacy and data protection controls on any companies offering this technology.
“Leaders would also have legitimate concerns about the potential for cyber ransomware attacks and the importance of storing information securely, which they would need reassurances around before implementing any new technology.”