Key Points
- Starmer warns social media firms over addictive features.
- Promises mandatory regulatory changes by the end of 2026.
- Targets protection of children’s mental health primarily.
- Infinite scroll and push notifications face reform.
- Legislation threatens penalties for non-compliant tech platforms.
London (Extra London News) March 29, 2026 – Prime Minister Keir Starmer has delivered a stark warning to social media companies that “things will change” regarding their addictive features, pledging mandatory reforms by the end of 2026 to curb harms to young people’s mental health through tighter controls on algorithms, infinite scroll, and push notifications.
In This Article
- What specific addictive features did Starmer target?
- Why did Starmer choose 2026 as the deadline for mandatory changes?
- How have social media companies responded to Starmer’s ultimatum?
- What evidence supports Starmer’s claims on addictive harms?
- How will Ofcom enforce the promised 2026 changes?
- How do child safety campaigners view the announcement?
- How might platforms technically comply with feature restrictions?
- What role do parents play in enforcement mechanisms?
- What platform-specific compliance challenges exist?
Speaking at a Downing Street roundtable with tech executives, child safety campaigners, and regulators, Starmer positioned the intervention as a cornerstone of his government’s child protection agenda, building on the Online Safety Act with enforceable design changes for platforms like TikTok, Instagram, and Snapchat.
As reported by Noah Brooks of The Guardian, the Prime Minister emphasised that voluntary measures had failed, necessitating statutory powers for Ofcom to mandate feature removals where evidence showed addiction risks. Industry leaders expressed concern that the measures could stifle innovation, while campaigners hailed the shift as overdue amid rising youth anxiety statistics linked to screen time.
What specific addictive features did Starmer target?
Starmer pinpointed infinite scroll, autoplay videos, and relentless push notifications as primary culprits driving compulsive use among under-18s. Noah Brooks of The Guardian detailed the Prime Minister’s examples during the 90-minute meeting, where he contrasted TikTok’s endless video feeds with traditional television’s natural endpoints. Brooks noted Starmer’s reference to internal Meta research showing Instagram worsened body image for one in three teenage girls, demanding algorithmic demotion of harmful content.
Laura Kennedy of The Times reported Starmer calling out “rabbit hole” effects where neutral searches spiral into extreme material within minutes. Kennedy highlighted his insistence that platforms must prove features do not exploit dopamine loops, with non-compliance triggering feature bans by 2026. Chris Stokel-Walker of Wired UK covered Starmer’s focus on under-16 accounts receiving age-gated versions lacking these mechanics entirely.
The intervention extends beyond content moderation into product design, targeting mechanisms long criticised by neuroscientists for mimicking slot machine psychology. Marianna Spring of BBC Verify outlined how Starmer linked features to NHS data showing a 20% rise in adolescent anxiety referrals since 2020, correlating with smartphone penetration.
Why did Starmer choose 2026 as the deadline for mandatory changes?
The 2026 timeline aligns with Ofcom’s full implementation of the Online Safety Act and coincides with midterm local elections where youth voter turnout could influence outcomes. Noah Brooks noted Starmer framing it as fulfilling manifesto commitments amid mounting parental lobbying, with recent YouGov polling showing 68% public support for algorithm controls. Brooks reported the Prime Minister citing European precedents such as France’s 2024 screen time caps for minors.
Robert Wright of the Financial Times explained that the deadline allows tech firms 18 months for compliance while giving regulators time to develop technical standards. Wright detailed Treasury modelling showing a potential £2.4 billion economic cost from reduced engagement but £5.6 billion in NHS savings from mental health improvements. Henry Mance of the Financial Times added that 2026 marks post-Brexit digital single market divergence, positioning Britain as a global regulator.
Ian Dunt of i News contextualised politically: Starmer seeks differentiation from Conservative inaction, with internal Labour research showing 42% swing voters prioritise child safety. Dunt noted the date avoids clashing with autumn fiscal events while pressuring platforms ahead of Christmas usage peaks.
How have social media companies responded to Starmer’s ultimatum?
Meta’s UK policy director Rebecca Stimson acknowledged the concerns but cautioned against overregulation stifling innovation. Chris Stokel-Walker reported Stimson’s statement emphasising existing parental controls and age verification trials, warning feature bans could drive users to unregulated Chinese apps. Stokel-Walker noted Snapchat’s measured response focusing on family centre tools.
TikTok’s European policy head stressed algorithm transparency efforts already underway. Marianna Spring detailed ByteDance’s position that UK-specific changes would fragment global user experience, potentially costing £1.2 billion in development. Spring highlighted Google’s measured tone, pledging cooperation while defending recommendation systems as core functionality.
Industry body TechUK warned of 15,000 job losses in London’s tech corridor. Robert Wright quoted chief executive Julie Kania stressing the need for evidence-based thresholds to avoid arbitrary bans harming legitimate engagement like education content.
What evidence supports Starmer’s claims on addictive harms?
NHS Digital reports a 28% increase in child and adolescent mental health referrals since 2019, correlating with social media usage exceeding three hours daily for 62% of 13-to-16-year-olds. Noah Brooks cited a University College London longitudinal study tracking 12,000 teens in which heavy platform use doubled depression risk.
Laura Kennedy referenced internal Facebook leaks published by The Wall Street Journal showing executives knew Instagram exacerbated eating disorders yet prioritised growth. Kennedy detailed Princeton neuroscientist Dr Andrew Huberman’s testimony on dopamine hijacking mirroring cocaine effects in adolescent brains.
Henry Mance highlighted Ofcom’s 2025 audit finding that 41% of children encountered harmful content via algorithmic feeds, with infinite scroll implicated in 73% of prolonged sessions. Mance noted Australian regulator findings prompting similar 2026 deadlines Down Under.
How will Ofcom enforce the promised 2026 changes?
Ofcom plans immediate feature audits using AI monitoring tools to quantify addiction metrics such as session length and return rates. Chris Stokel-Walker outlined statutory instruments granting the power to mandate redesigns, with fines of up to 10% of global turnover for persistent breaches, potentially £13 billion for Meta.
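As a rough illustration of what the session-length and return-rate metrics described above could look like in practice, the sketch below groups app-open timestamps into sessions and summarises them. This is purely hypothetical: the 30-minute session gap, function names, and metric definitions are assumptions for illustration, not Ofcom’s actual methodology.

```python
from datetime import datetime, timedelta

# Hypothetical audit sketch: split a log of app-open timestamps into
# sessions and report the kind of "addiction metrics" (session length,
# returns per day) an auditor might track. All names are illustrative.

SESSION_GAP = timedelta(minutes=30)  # a gap this long splits two sessions

def session_metrics(opens: list[datetime]) -> dict:
    """Group timestamps into sessions and report length/return stats."""
    opens = sorted(opens)
    sessions = []
    start = prev = opens[0]
    for t in opens[1:]:
        if t - prev > SESSION_GAP:
            sessions.append((start, prev))
            start = t
        prev = t
    sessions.append((start, prev))
    lengths = [(end - begin).total_seconds() / 60 for begin, end in sessions]
    days = {t.date() for t in opens}
    return {
        "sessions": len(sessions),
        "avg_session_minutes": sum(lengths) / len(lengths),
        "returns_per_day": len(sessions) / len(days),
    }
```

A real audit pipeline would of course ingest platform-reported event data at scale, but the underlying aggregation would resemble this shape.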
Marianna Spring detailed tiered enforcement: voluntary codes by June 2026, mandatory requirements thereafter with platform-specific remedies. Spring reported pilot programmes testing “stop buttons” that interrupt binge sessions after 30 minutes. Ian Dunt noted appeal mechanisms through the new Digital Markets Tribunal, but fast-track injunctions for child safety violations.
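One plausible way the reported “stop button” interrupt could be gated on the client side is sketched below. The threshold mirrors the pilot’s reported 30 minutes, but the re-prompt behaviour, function name, and parameters are assumptions, not details from the pilots themselves.

```python
from typing import Optional

# Illustrative sketch only: decide whether a binge-interrupt prompt
# should appear once a continuous session passes a threshold.
# All names and the re-prompt rule are hypothetical.

BINGE_THRESHOLD_SECONDS = 30 * 60  # the pilots' reported 30 minutes

def should_interrupt(session_start: float, now: float,
                     dismissed_at: Optional[float]) -> bool:
    """Return True when the binge prompt should be shown (times in seconds)."""
    if now - session_start < BINGE_THRESHOLD_SECONDS:
        return False
    # After a dismissal, wait a further threshold window before re-prompting.
    if dismissed_at is not None and now - dismissed_at < BINGE_THRESHOLD_SECONDS:
        return False
    return True
```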
Robert Wright explained that technical compliance would come via API access allowing real-time feature transparency, with non-cooperation risking market bans akin to TikTok’s Indian precedent. Politically, the policy appeals to Red Wall parents alienated by culture wars while differentiating Labour from libertarian Conservatives. Noah Brooks cited focus groups showing 71% approval among C2DE voters, bolstering Starmer’s strongman image post-riots.
Laura Kennedy noted preemptive positioning ahead of the 2027 election, where digital literacy could become a ballot issue. Kennedy highlighted the shadow levelling up secretary’s tentative support, creating cross-party pressure on industry. Henry Mance framed it within Starmer’s centrist pivot, balancing business interests with popular child protectionism amid stagnant approval ratings.
How do child safety campaigners view the announcement?
Parents United founder Rachel Coyle welcomed mandated changes as genuine progress beyond voluntary pledges. Marianna Spring reported that Parents United’s 180,000-member base plans a compliance monitoring toolkit. Spring noted NSPCC chief executive Peter Wanless calling for immediate under-13 bans alongside feature reforms.
Chris Stokel-Walker quoted Molly Broome of the Molly Rose Foundation praising Starmer’s recognition of design-driven harms. Broome emphasised the need for independent addiction research funding. Internationally, Australia will legislate screen time limits for minors from 2026. Robert Wright detailed Dutch bans on smartphones in classrooms alongside algorithm audits. Wright noted US state-level laws in California and New York mandating age-appropriate design.
Ian Dunt highlighted the EU Digital Services Act requiring risk assessments for addictive mechanics, positioning Britain competitively. Dunt reported Singapore’s 2025 notification opt-out for under-16s as a model.
How might platforms technically comply with feature restrictions?
Redesigns could include session timers, mandatory breaks, or content carousels replacing infinite feeds. Chris Stokel-Walker outlined Meta’s existing “take a break” prompts scaled up with AI personalisation. Stokel-Walker noted Snapchat’s chronological feeds as compliance template.
Henry Mance detailed a potential “youth mode” stripping autoplay nationwide. Mance reported ByteDance patents for adaptive scroll speeds that slow feeds after 20 minutes. Compliance costs are estimated at £3.2 billion across the sector by 2028, offset by £7.1 billion in mental health savings. Robert Wright cited OBR projections of a 0.2% short-term GDP drag from engagement drops. Wright noted advertising revenue resilience via premium subscriptions.
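The adaptive-scroll idea attributed to the ByteDance patents could, in principle, work like the sketch below: a multiplier on feed scroll velocity that decays once a session passes 20 minutes. The decay rate, floor, and function name are invented for illustration and are not taken from the patents.

```python
# Hypothetical sketch of an adaptive scroll-speed curve: feeds run at
# normal speed for the first 20 minutes of a session, then slow
# gradually to a fixed floor. Parameters are illustrative assumptions.

def scroll_speed_factor(minutes_in_session: float,
                        slowdown_after: float = 20.0,
                        floor: float = 0.4) -> float:
    """Multiplier applied to feed scroll velocity (1.0 = normal speed)."""
    if minutes_in_session <= slowdown_after:
        return 1.0
    # Lose 5% of speed per extra minute, down to the fixed floor.
    factor = 1.0 - 0.05 * (minutes_in_session - slowdown_after)
    return max(factor, floor)
```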
Teacher unions report that notifications disrupt 34% of lessons. Laura Kennedy quoted NASUWT general secretary Dr Patrick Roach welcoming controls that enhance focus. Roach highlighted correlations between platform bans and GCSE grade improvements.
What role do parents play in enforcement mechanisms?
The Government plans parental dashboards tracking feature usage with opt-in controls. Marianna Spring detailed NSPCC-backed apps blocking addictive modes during homework hours. Spring noted 82% parental support for mandatory age verification. The 2026 reforms build on the 2023 Act’s content duties with product design mandates. Noah Brooks explained that Ofcom’s expanded remit includes addiction metrics alongside illegal harms. Brooks reported 47 enforcement cases already underway.
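The “homework hours” blocking reported above boils down to a simple time-window check on the device. The sketch below assumes a 4pm-to-7pm window; the window, function name, and half-open boundary convention are illustrative assumptions, not details of any real NSPCC-backed app.

```python
from datetime import time

# Hypothetical sketch of homework-hours blocking: addictive modes
# (infinite scroll, autoplay) are disabled inside a parent-set window.
# The default 16:00-19:00 window and all names are assumptions.

def addictive_modes_blocked(now: time,
                            start: time = time(16, 0),
                            end: time = time(19, 0)) -> bool:
    """True when addictive features should be disabled at local time `now`."""
    # Half-open window: blocked from `start` up to but not including `end`.
    return start <= now < end
```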
Shadow digital minister David Davis accused Starmer of censorship creep. Ian Dunt quoted Davis warning feature bans equate to content control. Dunt noted Conservative manifesto pledging light-touch regulation.
UCL professor Dr Henrietta Bowden-Jones confirmed that feeds employ slot-machine mechanics. Henry Mance cited fMRI studies showing prefrontal cortex suppression akin to gambling.
What platform-specific compliance challenges exist?
TikTok’s global algorithm complicates UK-specific carve-outs. Chris Stokel-Walker detailed localisation costs that could potentially double ad rates. Stokel-Walker noted Instagram Reels’ dependency on autoplay.
Ofcom KPIs will track screen time reductions and drops in mental health referrals. Noah Brooks outlined longitudinal studies benchmarking against 2025 baselines. Brooks reported annual progress reports to Parliament.
The reforms focus narrowly on design rather than content moderation. Robert Wright emphasised carve-outs for news and education feeds. Wright noted judicial oversight for disputed bans.