At first glance, Demis Hassabis is an unusual figure for Dominic Cummings to have turned to for guidance in March 2020 about the threat of the novel coronavirus bearing down on the UK.
The co-founder of Google subsidiary DeepMind, which is dedicated to high-level AI research, has a varied CV, but is no epidemiologist. A child chess prodigy, he hit the rank of master at 13 and was for a brief time the second-highest-rated player in the world in his age category.
After completing his A-levels two years early, he joined video game studio Bullfrog, where he co-designed the hit classic Theme Park at just 17 years old, before leaving to study computer science at Cambridge. He returned to video game development for another decade, and, after switching back to academia and a PhD in cognitive neuroscience, founded DeepMind in 2011.
Another branch of the company, the one Hassabis was more directly involved in, focused on theoretical breakthroughs first, with applications coming later: the company succeeded in building AIs with world-beating performance at games including Go, chess and the video game StarCraft II. Last year, those advances were also applied to tackling health problems, as DeepMind announced a stunning breakthrough “in the tricky problem of protein folding”.
But, DeepMind says, Hassabis wasn’t invited to talk about his work. “Demis was one of several scientists who contributed his thoughts on the government’s response to Covid-19,” a DeepMind spokesperson told the Guardian. “He attended one Sage meeting in person on 18 March. The views he shared with fellow Sage members, officials, and government advisers, including Dominic Cummings, were based purely on reviewing publicly available international data.
“Along with many other scientists at the time, Demis was a proponent of fast and decisive lockdown measures based on public evidence drawn from what was happening in other countries. He was acting in a personal capacity as a leading data scientist in the public interest.”
Cummings has a well-documented fascination with the world of science. In the same early stages of the UK’s response, Timothy Gowers, a Fields-medal-winning mathematician, was called on to set up the Data Evaluation and Learning for Viral Epidemics, or Delve, group, which reported in favour of face mask use in May 2020. But, as well as being a generally respected scientist, Hassabis is linked to the rationalist movement, which has guided much of Cummings’ thinking.
“We know that Dom is rationalist-influenced from his own blogroll and comments,” says Tom Chivers, author of a book on the movement, The AI Does Not Hate You. While Hassabis is not himself a member of the community, his involvement in advanced AI research brings him into the same circles.
“What rationalism implies from a policy perspective is a big question,” Chivers says, “but you can see something like it in the effective altruist mode of thinking: trying to separate emotional responses from outcomes. And, by extension, it can lead to serious thought about long-term existential risks, AI and bio-terror, because they have the potential to crush human flourishing in the long term.”
When she first began talking to her peers in the House of Lords about the rights of children on the internet, Baroness Kidron says she looked like “a naysayer”, like someone who was “trying to talk about wooden toys” or, in her husband’s words, like “one middle-aged woman against Silicon Valley”. It was 2012 and the film-maker and recently appointed life peer was working on her documentary InRealLife, spending “hundreds of hours in the bedrooms of children” to discover how the internet affects young lives. What she saw disturbed her.
“I did what they were doing – gaming, falling in love, watching pornography, going to meet-ups, making music – you name it, it happened,” Beeban Kidron says. The film explored everything from children’s exposure to porn, to rampant online bullying, to the way privacy is compromised online. But Kidron noticed that one thing underpinned it all: on the internet, nobody knows you’re a kid. “Digital services and products were treating them as if they were equal,” she says. “The outcome of treating everyone equally is you treat a kid like an adult.”
Almost a decade later, Kidron has pushed through a Children’s Code that hopes to change this landscape for ever. The Age Appropriate Design Code, an amendment to the 2018 Data Protection Act, came into effect this month. It requires online services to “put the best interests of the child first” when designing apps, games, websites and internet-connected toys that are “likely” to be used by kids.
In total, there are 15 standards that companies need to adhere to in order to avoid being fined up to 4% of their global turnover. These include offering “bite-size” terms and conditions for children; giving them “high privacy” by default; turning off geolocation and profiling; and avoiding “nudge techniques” that encourage children to turn off privacy settings. The code, which will be enforced by the Information Commissioner’s Office (ICO), also advises against “using personal data in a way that incentivises children to stay engaged”, such as feeding children a long string of auto-playing videos one after the other.
The code was introduced in September 2020, but offered companies a 12-month transition period. In that time, the world’s tech giants have seemingly begun responding to the sting of Kidron’s sling. Instagram now prevents adults from messaging children who don’t follow them on the app, while anyone under 16 who creates an account will have it set to private by default. TikTok has implemented a bedtime for notifications; teens aged 13-15 will no longer be pinged after 9pm. Meanwhile, YouTube has turned off autoplay for users aged 13-17, while Google has blocked the targeted advertising of under-18s.
But hang on, why does TikTok’s bedtime only apply to those 13 and over? Are toddlers OK to use the app until 2am? You’ve just spotted the first flaw in the plan. While social media sites require users to be at least 13 to sign up for their services (in line with America’s 21-year-old Children’s Online Privacy Protection Act), a quick glance at reality shows that kids lie about their age in order to snap, share and status-update. Creating a system in which children can’t lie, by, for example, necessitating that they provide ID to access an online service, ironically risks compromising their privacy further.
“There is nothing that stops us having a very sophisticated age-check mechanism in which you don’t even know the identity of the person, you just know that they’re 12,” Kidron argues, pointing to a report on age verification that she recently worked on with her organisation 5Rights Foundation, entitled But how do they know it is a child?. Third-party providers, for example, could confirm someone’s identity without passing on the data to tech giants, or capacity testing could allow websites to estimate someone’s age based on whether they can solve a puzzle (no prizes for figuring out the numerous ways that could go wrong).
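The third-party model Kidron describes can be sketched in miniature: a verifier checks a child's ID privately, then hands the online service only a signed age bracket, never the identity. The sketch below is purely illustrative and not drawn from the 5Rights report; the function names are invented, and a real deployment would use public-key signatures or zero-knowledge proofs rather than a shared HMAC secret.

```python
import hmac
import hashlib
import json

# Placeholder key shared between the hypothetical verifier and the
# online service; real systems would use asymmetric cryptography.
SHARED_SECRET = b"verifier-and-service-shared-key"

def issue_attestation(age: int) -> dict:
    """The third-party verifier sees the ID, but emits only an age bracket."""
    bracket = "under_13" if age < 13 else "13_to_17" if age < 18 else "18_plus"
    payload = json.dumps({"age_bracket": bracket}).encode()
    tag = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def service_checks(attestation: dict) -> str:
    """The service verifies the signature and learns the bracket, nothing else."""
    payload = attestation["payload"].encode()
    expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, attestation["tag"]):
        raise ValueError("attestation rejected")
    return json.loads(payload)["age_bracket"]

print(service_checks(issue_attestation(12)))  # -> under_13
```

The point of the design is the narrow interface: the service can trust the bracket without ever holding a name, a date of birth or a scan of a passport.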
Whatever the solution, Kidron is currently working on a private member’s bill that sets minimum standards of age assurance, thereby preventing companies from choosing their own “intrusive, heavy-handed or just terrible, lousy, and ineffective” techniques.
How did Kidron go from looking like a “naysayer” to changing the landscape so drastically? Kidron began making documentaries in the 80s before working in Hollywood (most notably directing the Bridget Jones sequel The Edge of Reason). After becoming a baroness, she founded the 5Rights Foundation to fight for children’s digital rights. She says she had her “early adopters” in parliament, including the archbishop of York, Stephen Cottrell, Conservative peer Dido Harding and Liberal Democrat peer Timothy Clement-Jones. “That was my gang,” Kidron says, but others remained sceptical for years. “The final set of people only came on board this summer, once they saw what the tech companies were doing.”
The Children’s Code as a whole defines a child as anyone under 18, in line with the United Nations Convention on the Rights of the Child (UNCRC). For Kidron, it’s about much more than privacy – “a child’s right to unfettered access to different points of view is actually taken away by an algorithmic push for a particular point of view,” she argues, also noting that the right to the best possible health is removed when companies store and sell data about children’s mental health. “It’s nothing short of a generational injustice,” she says. “Here was this technology that was purporting to be progressive, but in relation to children it was regressive – it was taking away the existing rights and protections.”
How did these claims go down in Silicon Valley? Conversations with executives were surprisingly “very good and productive”, according to Kidron, but she ultimately realised that change would have to be forced upon tech companies. “They have an awful lot of money to have an awful lot of very clever people say an awful lot of things in an awful lot of spaces. And then nothing happens,” she says. “Anyone who thinks that the talk itself is going to make the change is simply wrong.”
And yet while companies must now comply with the code, even Kidron admits, “they have to comply in ways that they determine”. TikTok’s bedtime, for example, seems both arbitrary and easy to get around (children are well versed in changing the date and time on their devices to proceed in video games). Yet Kidron says the exact o’clock is irrelevant – the policy is about targeting sleeplessness in children, which in turn enables them to succeed at school. “These things seem tiny… but they’re not. They’re about the culture and they’re about how children live.”
As for children working their way around barriers, Kidron notes that transgression is part of childhood, but “you have to allow kids to transgress, you can’t just tell them it’s really normal”. “The problem we have is kids who are eight are looking at hardcore, violent, misogynistic porn and there’s no friction in the system to say, ‘Actually, that’s not yours.’”
Yet problems also arise when we allow tech companies, not parents, to set boundaries for our children. In 2017, YouTube came under fire after its parental controls blocked children from seeing content made by LGBTQ+ creators (YouTube initially apologised for the “confusion” and said only videos that “discuss more sensitive issues” would be restricted in the future). Kidron says she’s “not a big takedown freak” and is “committed to the idea that children have rights to participate”, but can the same be said of companies hoping to avoid fines? Numerous American websites remain inaccessible in Europe after the implementation of General Data Protection Regulation (GDPR) laws in 2018, with companies preferring to restrict access rather than adapt.
For now, it remains to be seen how the Children’s Code will be enforced in practice; Kidron says it’s “the biggest redesign of tech since GDPR”, but in December 2020 a freedom of information request revealed that more than half of GDPR fines issued by the ICO remain unpaid.
Still, Kidron is certain of one thing: that tech companies are “disordering the world” with their algorithms – “making differences of their terms for people who are popular and have a lot of followers versus those who are not” and “labelling things that get attention without really thinking about what that attention is about”. These are prescient remarks: a day after we spoke, the Wall Street Journal reported that Facebook has a program that exempts high-profile users from its rules, and published internal Facebook research suggesting that Instagram is harmful to teens. One internal presentation slide read: “We make body image issues worse for one in three teen girls.” Instagram’s head of public policy responded to the report in a blog post, writing: “The story focuses on a limited set of findings and casts them in a negative light.”
Whether or not Kidron was once “one middle-aged woman against Silicon Valley”, today she has global support. The recent changes implemented by social media companies are not just UK-based, but have been rolled out worldwide. Kidron says her code is a Trojan horse, “starting the conversation that says, you can regulate this environment”.
But this Trojan horse is only beginning to open up. “We had 14 Factory Acts in the 19th century on child labour alone,” Kidron says, adding that the code is likely to be the first of many more regulations to come. “I think today we air punch,” she says, when asked how it feels to have led the charge for change. “Tomorrow, we go back to work.”
Sir Clive Sinclair’s contributions to computing and business are well known, and we’ve done our best to celebrate his life in our obituary of the electronics pioneer, who passed last week aged 81.
To mark his life we felt it appropriate to also consider his impact on Reg readers.
Like many others, your correspondent’s first computer was a ZX Spectrum. The machine led to my presence in these pages, because I eventually joined the Australian ZX Users’ Association (AZUA), which published its own magazine and invited contributions.
A die was cast.
I tracked down AZUA co-founder David Vernon who told us, by email: “We all loved Clive. We loved his foresight, his eccentricity and his desire to bring computing to the masses.”
Referring to the ZX80 and ZX81, Vernon added: “But we found him frustrating in equal measure. Why to save a few pounds did he give us such a crappy touch keyboard? Why not give us 4K of RAM and not 1K?”
“But even these irritations had a silver lining. They showed us that we didn’t have to put up with whatever a manufacturer gave us but we could improve on it. And this is perhaps Clive’s legacy to my generation — we could do stuff that we never imagined. Clive gave us confidence that we could do clever stuff too. And we did.”
“Honestly, it’s thanks to Clive that I now run my own publishing business. Without my early experience of writing and publishing computer programs and help pages I’d not be doing what I do today.”
Similar stories poured in from around the world as comments on our appreciation of the great man’s life.
“For me it was the ZX 48K,” wrote commenter Mozzie.
“It got upgraded with a Saga 1, had an astonishing Saisho cassette player that never failed to load anything except LoTR. Chuckie Egg, Dizzy, Wriggler, Harrier Attack, Target Renegade and even HiSoft Pascal… thanks Clive for giving me the means to feed my family the last 16 years.”
“So, so many aspects of my life are directly or only slightly indirectly related to my love for coding and electronics and tech in general, and that all stems back to those heady days of the early 80s, sat in my bedroom in front of my Speccy,” wrote another forum member, ChrisC.
Linus Torvalds was a Sinclair user: Among those influenced by Sir Clive was Linus Torvalds, creator of the Linux kernel, who worked on a Sinclair QL before he turned to his most famous work.
A commenter named Allwallgbr shared his experience working for Sir Clive for two years in the 1970s and described the time as “amazing hard working, fun packed years servicing audio products and demonstrating at Hi-Fi exhibitions.
“I enjoyed the work and the buzz of the company so much, no other employer came close in my entire working life.”
Readers also remembered Sir Clive as possessing a strange charisma.
“An absolute genius with just the right amount of barminess to be a proper British boffin,” opined a commenter with the handle John Brown (no body). “He even had the proper boffin’s bald patch and glasses.”
Others shared their experiences putting Sinclair kit to work.
“I wrote a text-graphic based D&D game, and fed in the entire D&D stats to help automate games,” wrote a Reg forums member named Danny 2.
Next, he tackled something harder. “I tried and failed to write a conversation simulator to pass the Turing test: more difficult than I expected.”
“I think I freaked my mum out when she heard noises at 6am. It was just my seven-year-old self who was desperate to find out whether SIN and COS would let me PLOT a circle on my birthday present ZX81,” wrote another commenter, who goes by the strangely apt handle 0x80004005. (We’re guessing it’s a Windows error code.)
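That birthday-morning experiment translates neatly out of Sinclair BASIC: sweep an angle from 0 to 2π and PLOT the rounded SIN/COS coordinates. The sketch below is an illustrative reconstruction in Python, not the commenter's actual program, using the ZX81's 64×44 PLOT grid for flavour.

```python
import math

# The ZX81's PLOT command addressed a coarse 64x44 pixel grid.
WIDTH, HEIGHT = 64, 44

def circle_points(cx, cy, r, steps=120):
    """Return the grid points a SIN/COS sweep would PLOT for a circle."""
    pts = set()
    for i in range(steps):
        a = 2 * math.pi * i / steps
        x = round(cx + r * math.cos(a))   # LET X = CX + R * COS A
        y = round(cy + r * math.sin(a))   # LET Y = CY + R * SIN A
        if 0 <= x < WIDTH and 0 <= y < HEIGHT:
            pts.add((x, y))               # PLOT X, Y
    return pts

# Render the grid as text; ZX81 PLOT coordinates count y upward,
# so the rows are printed in reverse.
grid = [[" "] * WIDTH for _ in range(HEIGHT)]
for x, y in circle_points(32, 22, 15):
    grid[y][x] = "#"
print("\n".join("".join(row) for row in reversed(grid)))
```

The answer to the seven-year-old's question, happily, was yes: SIN and COS do let you PLOT a circle.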
“I’m currently crying like a baby here. This has hit me a lot harder than I thought it would. This marks the end of the line for the largest chunk of my formative years, and possibly the greatest influence in my entering the career I have,” wrote commenter Stumpy.
“RIP to a massively flawed genius,” our reader added. “A man with ideas often far ahead of their time. I’ll be setting a glass of decent malt aside for you tonight.”
A few readers even offered snippets of Sinclair BASIC as tribute.
The legacy lives on: One measure of Sir Clive’s contribution was that emulators for his computers remain available to this day, even if some homages such as the Spectrum Vega+ went awry.
Classic games developed for the ZX Spectrum remain available in many forms, not just as image files for emulators. Manic Miner is now an app, as is Lords Of Midnight. Some other Spectrum classics have even been ported to Microsoft’s Xbox.
One of the folks we reached out to for a Sinclair memory was Shane Muller, an Australian tech entrepreneur who in 2019 threw a very good party to celebrate his thirty years in the tech business. At that event he brandished the ZX81 that started it all.
Shane’s response to news of Sir Clive’s death was to write him a letter.
Taara’s wireless optical technology was able to beam nearly 700TB of data in 20 days across the Congo River.
Alphabet’s internet balloon project Loon may be a closed chapter, but the company is finding new ways to use some of this technology to bring high-speed internet to remote and underserved areas.
Project Taara is Alphabet’s attempt to harness wireless optical tech to make fast internet accessible and affordable. In a blog post yesterday (16 September), the project’s director of engineering, Baris Erkmen, said that its wireless optical communications links are now beaming light-speed connectivity across the Congo River.
“I’m delighted to share that working with Liquid Intelligent Technologies, we recently helped bridge a particularly stubborn connectivity gap between Brazzaville in the Republic of the Congo and Kinshasa in the Democratic Republic of Congo,” he wrote.
Brazzaville and Kinshasa are only 4.8km apart, but because of the speed and depth of the Congo River, it wasn’t possible to establish a fibre connection between the two cities. Instead, cables have to travel more than 400km to loop around the river.
Erkmen said that after installing links on both sides of the river, Taara’s technology was able to beam across nearly 700TB of data in 20 days with almost 100pc availability. While the connectivity won’t always be reliable in all weather conditions, he said he was confident it will “play a key role in bringing faster, more affordable connectivity” to the 17m people living in the two cities.
“Being able to deliver high-speed internet (up to 20Gbps) most of the time is a vastly better option than having millions of people miss out on the benefits of connectivity because the economics of laying hundreds of kilometres of cable in the ground simply don’t stack up.”
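The two figures in the post are consistent with each other: 700TB over 20 days is an average rate of a few gigabits per second, comfortably below the 20Gbps peak, as a quick back-of-the-envelope check shows (assuming decimal terabytes).

```python
# Average throughput implied by 700TB moved in 20 days.
terabytes = 700
seconds = 20 * 24 * 60 * 60          # 20 days in seconds
bits = terabytes * 1e12 * 8          # decimal TB -> bits
average_gbps = bits / seconds / 1e9
print(f"{average_gbps:.2f} Gbps average")  # roughly 3.2 Gbps
```

In other words, the link's 20Gbps figure is a peak capability, while sustained traffic over the period averaged around a sixth of that.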
Project Taara’s predecessor Loon brought helium balloon-based internet to Kenya and delivered communications services to Puerto Rico and Peru following natural disasters in those countries.
Like Project Taara, it was part of Alphabet’s X research division that invests in ambitious but costly projects. However, Loon was shut down in January because it was unable to make a business case for the project and its path to commercial viability was “much longer and riskier than hoped”, according to X lab head Astro Teller.