Brief:
- Choose a social media platform that you use. Based on today’s discussion of the ethical issues, what would you change or add to the platform to make it more ethical? Use some of the links below to dig into this subject, or find some of your own (a simple Google search will produce tons of results on this subject).
- Create some basic ideas. You can do this on paper, or any way you choose. Focus on how it would work, not what it looks like.
Claudia Pagliari, of the University of Edinburgh’s Usher Institute, said (The Times, 28/10/2017): “Many people don’t think there is an ethical issue. However, if you’re a social media user you might feel slightly differently. You have some expectations that you are in a trust community …”
The attitude described by Claudia Pagliari certainly applied to me, as I used to believe that the internet and social media are, in general, a force for good. My own first experience of what we now call social media came in 2000, when I became aware of ‘Friends Reunited’; through this medium I was able to reconnect with one or two friends from school whom I hadn’t seen since leaving education in 1984. It wasn’t long before I heard of childhood friends whose relationships had foundered, or ended, as a result of the contact they had made with former partners from their own school days.
On 1st September 2004 The Guardian reported that, “Internet sites such as Friends Reunited are unwittingly fuelling a surge in marital break-up as bored husbands and wives contact old flames, relationship counsellors warned yesterday as official figures showed divorce has reached a seven-year high.”

https://www.theguardian.com/uk/2004/sep/01/johncarvel?CMP=share_btn_url
Perhaps this was an early indication. ‘The Facebook’ launched that same year, was renamed Facebook in 2005, and began the universal social media age, which has lasted for the past 20 years and shows no sign of retreat.
Over the years, I have used social media on and off. First Facebook, then Twitter, were my favourites, and more recently – since the Covid-19 outbreak in 2020 – my go-to social media app has been TikTok; rarely a day goes by when I don’t scroll through it for at least a few minutes.
While there is certainly content which could generously be described as in poor taste, there is also a lot of fun and quality content to be found on the app. My own recent appearance on the ‘Theirapist’ podcast (see a previous post on this blog here: https://theoldtimes.co.uk/2025/09/30/theirapist-podcast/ ) came as a direct result of following a well-qualified content creator in the therapy space and I follow several interesting, factual and talented creatives.

That said, there can be no doubt that – just as ‘Friends Reunited’ affected the relationships of early social media users – there is now clear evidence of the negative uses to which this new media can be put. Twitter has become well known as a home of fake news, hatred and misinformation, and this has led many users to quit the app – particularly since the platform came into the ownership of Elon Musk. The Guardian reported on this a year ago:

Similarly, Facebook has provided a platform to both groups and individuals who wish to express radical and often unpleasant personal and political views, as reported in The Independent as long ago as 19th April 2019:

While I have a generally positive view of TikTok, I am aware that its algorithm keeps me within my own ‘bubble’, populating my ‘For You Page’ (FYP) with videos that play into my own interests, preferences and likes. As a result, it is possible to be insulated both from content which is genuinely offensive and from content which simply does not appeal to me or offends my sensibilities, even though it may not be extreme or unpleasant.
The app therefore provides content which responds to the signals I give it based on what I like, share and comment upon. In addition, it records my ‘watch time’, so that I continue to see content that may be of interest to me even if I do not explicitly confirm that interest through any action I take.
Occasionally, this can lead to extended time on a video that does not meet with the user’s approval – perhaps one that takes an alternative political, social or moral viewpoint – skewing the algorithm to serve more content of this nature, which leads to calls from users to “Comment, like and share” to help them back onto what the user believes to be the “right side” of TikTok.
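The signal-weighted ranking described above can be illustrated with a minimal sketch. To be clear, TikTok’s real ranking system is not public: the function, field names and weights below are all my own assumptions, chosen only to show how explicit actions (likes, shares, comments) and implicit ones (watch time) might combine into a single score.

```python
# Hypothetical sketch of signal-weighted feed scoring.
# All weights and signal names are illustrative assumptions,
# not TikTok's actual (non-public) algorithm.

def score_video(signals: dict) -> float:
    """Combine engagement signals into a single relevance score."""
    weights = {
        "liked": 3.0,        # explicit positive action
        "shared": 4.0,       # strongest explicit signal
        "commented": 3.5,
        "watch_ratio": 5.0,  # implicit signal: fraction of the video watched
    }
    return sum(weights[k] * float(signals.get(k, 0)) for k in weights)

# Note how a video watched to the end can outrank one that was
# merely liked -- watch time counts even without any explicit action.
fully_watched = score_video({"watch_ratio": 1.0})
only_liked = score_video({"liked": 1})
```

Under these assumed weights, simply lingering on a video moves the algorithm as much as, or more than, deliberately liking one, which is exactly why an accidentally extended watch can skew the FYP.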
Consequently, it is easy for users of social media to fall down the so-called ‘rabbit hole’ of negative and harmful content, and this has become a concern across society – particularly for young and vulnerable people. It has led to greater social and political focus on the negative side of easy access to social media as the first quarter of the 21st century ends. This article in The Times from 7th March 2025 is typical of these recently expressed concerns:

Similar concerns about the potential negative impacts of social media were expressed in the group during the classroom session:

Kazu said, “Age ranges from 10 – 14 had a higher rate of suicide and depression” while Ivanusa made the point that “social media has power control over people”. “The media is filled with false news and information” added Tony while Kennedy made the point that “If you are not paying for the product, you are the product”.
As new students of Digital Media, with many of those on the course looking forward to a long career in the medium – reaching into the middle of the century and beyond – we have an opportunity to think about these issues: to embrace the positive aspects of social media, to reconsider the ethics of the field, and to seek to influence the big companies and platforms to do a better job. So, what should we be asking for?
I think there are a few simple actions TikTok could take to improve the experience and these include:
- Greater monitoring of accounts promoting extreme political and social views – particularly those creators who encourage views which are contrary to established Equality Act 2010 characteristics, and which would be illegal in analogue settings.
- Zero tolerance of any creators offering support of violence or civil disturbance.
- A prompt to take a break after a continuous scrolling session.
- A button which enables the user to find out ‘why I am seeing this’ so that it becomes easier to question the source and veracity of particular content.
- Promote Diversity: this feature would enable users to find content from those creators less well represented on the platform – this could include LGBTQ+, ethnic minority and local creators – so that they appear more regularly on the FYP.
- ‘Sensitive content’ warning – this could be applied to content which has the potential to be upsetting or triggering and without censoring what the user might see, it would give the user the opportunity to see or avoid particular content according to the content of the warning.
- An FYP lock – this would enable the user to decide how much scrolling time they’d like each day (or other time period) in advance so that we can get our ‘fix’ while having a brake on excess use.
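Two of the ideas above – the break prompt and the FYP lock – could share one mechanism: the user sets a scrolling budget in advance, and the app nudges or stops them as it is spent. The sketch below is purely illustrative; the class name, thresholds and return values are my own inventions, not any real TikTok feature.

```python
# Hypothetical sketch of the break-prompt and 'FYP lock' ideas.
# All names, thresholds and behaviours here are illustrative assumptions.

class ScrollBudget:
    def __init__(self, daily_limit_minutes: int, prompt_after_minutes: int = 20):
        self.daily_limit = daily_limit_minutes      # FYP lock threshold
        self.prompt_after = prompt_after_minutes    # break-prompt threshold
        self.used = 0

    def record_session(self, minutes: int) -> str:
        """Log scrolling time and return the app's response."""
        self.used += minutes
        if self.used >= self.daily_limit:
            return "locked"          # FYP lock: today's budget is spent
        if self.used >= self.prompt_after:
            return "take_a_break"    # gentle prompt after continuous scrolling
        return "ok"

budget = ScrollBudget(daily_limit_minutes=60)
budget.record_session(15)   # well under budget
budget.record_session(10)   # past the break-prompt threshold
```

The point of setting the limit in advance is that the decision is made by the rested user, not the one mid-scroll – we get our ‘fix’ while keeping a brake on excess use.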
In my opinion, there is an obligation on TikTok – and other social media platforms – to look after their users in the same way that retailers, restaurants and others take care of their customers. They should:
- Prioritise user wellbeing, not just engagement.
- Make the algorithm visible and negotiable.
- Reduce echo chambers by improving exposure to diverse stories.
- Reduce accidental harm by setting a brake on intense and disturbing content.
- Give greater autonomy, especially to younger users and those prone to excessive screen time.