[Photo caption: Undated family handout photo of Molly Russell, whose family's five-year wait for answers is set to end as an inquest will examine whether algorithms used by social media firms to keep users hooked contributed to her death. Molly, from Harrow, north-west London, is known to have viewed material linked to anxiety, depression, self-harm and suicide before ending her life in November 2017, prompting her family to campaign for better internet safety. Issue date: Tuesday September 20, 2022.]

Boss admits Pinterest ‘not safe’ when Molly Russell used site before ending life

A senior executive from social media giant Pinterest has apologised as he admitted the site was “not safe” when schoolgirl Molly Russell used it. The company’s head of community operations, Judson Hoffman, told North London Coroner’s Court that self-harm or suicide content that violates its policies “still likely exists on our platform” and conceded it […]


Robot ‘taught to laugh at jokes’

A robot has been taught to laugh at jokes in a bid to make it more human. Researchers at Kyoto University in Japan are using artificial intelligence (AI) to train robots about appropriate laughter – and to differentiate between chuckles and rip-roaring squeals. Writing in the journal Frontiers in Robotics and AI, they describe working […]


In-depth competition probe launched into Microsoft’s Activision Blizzard deal

Microsoft’s proposed 68.7 billion US dollar (£59.7bn) takeover of games publisher Activision Blizzard is to face an in-depth investigation amid concerns the deal is anti-competitive, the UK’s competition watchdog has confirmed. The Competition and Markets Authority (CMA) said it has referred the takeover for a so-called phase two probe after Microsoft said it would not […]
