MPs to summon Elon Musk to testify about X’s role in UK summer riots
The House of Commons inquiry into the spread of harmful content on social media is also expected to summon executives from Meta and TikTok.
Members of Parliament (MPs) are planning to call Elon Musk to testify about how X (formerly Twitter) may have helped spread false information, as part of an inquiry into the UK riots and the growth of harmful AI-generated content, the Guardian reports.
Top executives from Meta (the company behind Facebook and Instagram) and TikTok are also expected to be questioned by MPs as part of a social media inquiry by the Commons science and technology committee.
The first hearings are expected in the new year, amid growing concern that the UK’s online safety regime is failing to keep pace with fast-moving technology and the political influence wielded through platforms like X.
MPs will look into the effects of generative AI, which was used to create misleading images on Facebook and X encouraging people to take part in Islamophobic protests after three schoolgirls were killed in Southport in July. They will also examine how Silicon Valley business models promote content that can mislead or harm people.
Chi Onwurah, the Labour chair of the select committee, said, “[Musk] has strong opinions on many issues.” She added, “I’d like the chance to question him and see how he justifies supporting freedom of speech while also promoting false information.”
Musk, who owns X, reacted angrily after he was not invited to the UK government’s international investment summit in October. Onwurah told the Guardian: “I want to make up for that by inviting him to come.”
Peter Mandelson, a former Labour minister who is tipped to become the next UK ambassador to Washington, recently called for an end to the “feud” between Musk and the UK government. Mandelson told the How to Win an Election podcast: “Musk is a major figure in technology and business, and I think it would be a mistake for the UK to ignore him. We shouldn’t keep fighting with him.”
X did not respond when asked whether Musk would testify in the UK, though an appearance seems unlikely. The world’s richest man is preparing for a senior role in the Trump White House and has been sharply critical of the Labour government, saying of UK inheritance tax changes that “Britain is going full Stalin” and declaring during the riots that followed the Southport attack that “civil war is inevitable”.
The Commons inquiry comes at a turbulent time for social media, with many users abandoning X for the rival platform Bluesky. Some cite the spread of false information on X, the reinstatement of previously banned accounts such as Tommy Robinson and Andrew Tate, and new terms allowing the platform to use their data to train its AI models.
Keir Starmer said on Tuesday that he does not plan to join Bluesky, and neither do government departments. The Prime Minister said at the G20 summit in Brazil, “What matters for the government is that we can reach and communicate with as many people as possible, and that’s the only thing that matters for me.”
Responding to his exclusion from the investment summit, Musk said: “I don’t think anyone should go to the UK when they are releasing convicted criminals and imprisoning people for social media posts.”
Lucy Connolly was jailed over a post on X in which she wrote: “Mass deportation now, set fire to all the fucking hotels full of the bastards for all I care.” She was convicted of posting material that could stir up racial hatred, but X decided her post did not break its rules against violent threats.
Onwurah said the inquiry will examine the link between social media algorithms, generative AI and the spread of harmful or false content.
The inquiry will also consider the impact of AI on search engines such as Google, whose AI-generated search results recently repeated false and racist claims about people in African countries. Google said the results were an error and removed the content.
After the Southport killings on 29 July, false information spread rapidly on social media, with accounts with more than 100,000 followers wrongly identifying the attacker as a Muslim asylum seeker.
Ofcom, the UK communications regulator, has already found that some platforms were used to spread hatred, provoke violence and encourage attacks on mosques and asylum seeker accommodation.
Next month, Ofcom will set out new rules under the Online Safety Act requiring social media companies to curb the spread of illegal content and reduce safety risks, including incitement to violence, hate speech and false information intended to cause harm. Companies will have to remove illegal material once they become aware of it and build safety measures into their products.
Published: 20th November 2024