With Elon Musk’s takeover of the online platform Twitter, attention has again focused on the evolution of social media and its impact on users.
I am keen to hear the views of young people in East Hampshire about ‘life online’ and what they’d like to see in legislation, regulation and self-regulation, so I’m holding a series of workshop discussions in local secondary schools.
I have been very impressed by the points raised and the practical, balanced approach the students advocate, and I am very grateful to them for taking part, and to the schools for facilitating the sessions.
Online harms is a subject Parliament did not debate even a single generation ago, but it is one we will return to many times in future.
Coincidentally, last week I led a cross-party backbench Westminster Hall debate on the issue. Ensuring people, especially children, can stay safe online is an important and complex area of public policy with many technical, legal and moral challenges.
For the most part, the online world is a fantastic thing – it helps us keep in touch, find information quickly and run our lives more efficiently.
Social media is a significant part of our online experience, even more so for young people.
Readers of this column are most likely to use Facebook and WhatsApp; for a younger cohort it is more likely to be TikTok, Instagram and Snapchat.
And this may surprise you – Instagram, TikTok and YouTube are now teens’ top three go-to places for news as well as messaging and entertainment.
Much of social media is positive, especially how it lets us stay in touch easily. But it can also be negative through the distribution, facilitation and magnification of harmful content, often reaching the most vulnerable and often relating to issues such as mental health, eating disorders and self-harm.
Legislation is important, but there is much more too, especially what the platforms themselves can do.
At the very acute end is maximising efforts to tackle child abuse. Then there is the proliferation of online fraud, where the victim and perpetrator are often far apart, sometimes in different countries.
That means we need a different way to address it, with a premium on stopping it at source.
Then there is so-called ‘co-ordinated inauthentic behaviour’ online and state-sponsored disinformation, which spread untruths and sow intolerance and division throughout society.
This is an area, though, that requires great care as we must protect freedom of expression.
It is right to be cautious about removing legal material, but in this context there is not always a hard line between what is legal and what is not.
An example is the use of emojis in racist abuse; no one considered emojis when the current legislation was written.
A balance is needed as we must protect free speech, and people have to be able to make choices.
But the top priority is ensuring children can be safe online.