Filter Bubbles (Social Media Part 3)

I recently had a chat with Natalie about helping to develop a module on the dark side of social media for the new intake of the Digital Teacher SSC in May. There is a TED Talk by Eli Pariser about the idea of Filter Bubbles: the term describes Facebook and other social media recognizing the things you like and dislike and filtering your information so that you mostly see people with views similar to your own.

But this may mean you get little or no exposure to people whose religious, political or ideological ideas differ from your own, and where does that leave us? My personal opinion: intolerant. If you believe that everyone agrees with you, believes the same things as you and respects the same values as you, how do you learn to appreciate other points of view? I like to use the analogy of school children. If you have never failed at anything in your life and have always been told, regardless of how well or poorly you have done, that you are all winners, how does that teach you to cope with failure when you are rejected from a job interview for the first time or your exam results aren’t quite as high as you would’ve liked?

We learn from our mistakes, and we learn from interactions with people who hold different opinions and beliefs; it is this that teaches us how to form a rational argument. If you are ignorant of other people’s viewpoints, you are more likely to become frustrated with them and to put your view across in a way that comes over as offensive or “pushy.”

As doctors-to-be we are instructed that we should advocate for the patient but that we should not influence them in any way; it is important to lay out the options and exactly what each option entails, without appearing judgemental of their beliefs, opinions or decisions, regardless of our own views on those beliefs and decisions. If we have never been exposed to people with differing opinions, in person or online, we will never learn to do this in a sensitive, appropriate way.

Students like me need to be taught to think more critically about where we get our information from. In this world of ever-changing information sources, online entities and social media track your likes and dislikes, so you might only be seeing the side of the story that the internet thinks will resonate with you best, so that you click on more of their links and earn them more in advertising revenue. This is a cynical view, I am aware, but taking 30 seconds to fact-check something is well worth the time; it may also give you another side to an argument that, in the end, you actually agree with more.

The key point I’m making here is that, because of the highly personalised feeds we get from some websites and social media, you have to question the bias of the information presented to you: you have to decide whether you are really getting the full picture or whether you are being fed what a machine thinks you want to know. And how well can a machine, using a mathematical algorithm, know the inner workings of your mind? Well, considering that I sometimes don’t really understand my own thoughts, likes and dislikes, I imagine no computer truly knows them either; it simply carries out pattern recognition to identify themes…
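
To make that pattern-recognition idea concrete, here is a toy sketch in Python. It is purely illustrative and not how any real platform works: the post titles, topics, scoring rule and function names are all made up. The idea is simply to count the themes in posts you have liked before and then rank new posts by how well they match those counts, which is enough to produce a filter bubble.

```python
from collections import Counter

def rank_feed(liked_posts, candidate_posts, feed_size=2):
    """Toy filter-bubble ranker: favour posts whose topics match what was liked before."""
    # Count how often each topic appears in posts the user has already liked.
    topic_counts = Counter(topic for post in liked_posts for topic in post["topics"])

    # Score each candidate by the total "familiarity" of its topics.
    def score(post):
        return sum(topic_counts[topic] for topic in post["topics"])

    # Show only the most familiar posts; everything else quietly disappears.
    return sorted(candidate_posts, key=score, reverse=True)[:feed_size]

# Hypothetical example: the user has only ever liked posts about one viewpoint...
liked = [
    {"title": "Why policy A is great", "topics": ["policy-a", "politics"]},
    {"title": "More praise for policy A", "topics": ["policy-a"]},
]
candidates = [
    {"title": "Policy A success story", "topics": ["policy-a", "politics"]},
    {"title": "Another policy A win", "topics": ["policy-a"]},
    {"title": "The case against policy A", "topics": ["policy-b", "politics"]},
    {"title": "Cat pictures", "topics": ["cats"]},
]

for post in rank_feed(liked, candidates):
    print(post["title"])
# ...so the opposing view never makes the top of the feed.
```

Even this crude sketch shows the mechanism: nothing in it understands your mind, it just rewards more of whatever you clicked on last time.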

Fearghal