The Social Media Bubble - Computerphile

The Impact of Social Media on Human Relationships

In today's digital age, social media has become an integral part of our lives. We use it to connect with friends and family, share our thoughts and opinions, and stay updated on current events. Yet as we spend more time online, many of us struggle to maintain meaningful relationships in the physical world. This is particularly evident on Facebook, where we can chatter with people and catch up, but then simply carry on with normal life.

As our social media networks grow, so do our expectations for how often we should interact with our online friends. We might have 300 or even 500 friends on Facebook, and celebrities have thousands of followers. But despite having a large network of online acquaintances, we often find it difficult to maintain genuine relationships. Our online interactions can be shallow and lacking in depth, making it challenging to develop strong bonds with others.

Research by Robin Dunbar suggests that social media does not necessarily help us manage more real-life connections; if anything, it can leave us feeling isolated and disconnected from the world around us. A separate study of Facebook itself, by Bernstein et al. (2013), found that people dramatically underestimate how many people see their posts, guessing on average only about 27% of the actual audience: if you think 27 people saw a post, roughly 100 did.
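To make that ratio concrete, here is a toy sketch of the implied arithmetic. Only the roughly 27% figure comes from the Bernstein et al. (2013) study; the function name and the example guesses are invented for illustration.

```python
# Toy illustration of the audience-underestimation finding discussed above.
# Only the ~27% figure is from Bernstein et al. (2013); everything else is invented.

PERCEIVED_FRACTION = 0.27  # users guessed about 27% of their real audience on average

def implied_actual_audience(perceived_viewers: int) -> int:
    """Scale a user's guess up to the audience size the study's ratio implies."""
    return round(perceived_viewers / PERCEIVED_FRACTION)

for guess in (27, 50, 200):
    print(f"guessed {guess} viewers -> roughly {implied_actual_audience(guess)} actually saw it")
```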

This underestimation of our audience can have significant consequences for how we engage with each other on social media. We think we are speaking interpersonally to like-minded friends, but we are in fact broadcasting to a far larger audience, which can inadvertently create division and polarization. Our words can be misinterpreted or taken out of context, leading to misunderstandings and conflicts with others. This is particularly problematic in the context of politics, where nuanced discussions can quickly devolve into heated debates.

The impact of social media on our filter bubbles cannot be overstated. A filter bubble is the narrowed view of the world that results when our feeds, shaped both by whom we befriend or unfollow and by personalization algorithms, show us mostly content that aligns with our interests and views. While this makes it easier to stay informed about topics that matter to us, it also creates echo chambers in which we are only exposed to opinions that reinforce our existing biases.

This phenomenon is exacerbated by the algorithms that power our social media feeds. These algorithms are designed to show us content that we're likely to engage with, based on our past behavior and preferences. While this may make it easier for us to find information that resonates with us, it can also create a sense of tunnel vision where we're only exposed to one side of the argument.
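As a rough sketch of the pattern described here, and not any real platform's ranking system, a feed ranker of this kind scores each post by predicted engagement and freshness and sorts on that score. The Post fields and the weights below are assumptions made purely for illustration.

```python
# Minimal sketch of engagement-based feed ranking as described above.
# Not any platform's actual algorithm; fields and weights are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    predicted_like: float  # model's estimate that this user will engage (0..1)
    recency: float         # 1.0 = just posted, decaying toward 0.0

def engagement_score(post: Post) -> float:
    # Rank purely by predicted engagement plus a little freshness: content the
    # user is unlikely to "like" sinks, which is how a filter bubble tightens.
    return 0.8 * post.predicted_like + 0.2 * post.recency

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)
```

Because nothing in such a score rewards unfamiliar viewpoints, a feed ranked this way naturally converges on content the user already agrees with.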

The consequences of these filter bubbles are far-reaching. They can contribute to the spread of misinformation, the amplification of extremist views, and the erosion of civil discourse. In order to mitigate these effects, researchers are exploring new algorithms that prioritize diversity and nuance in our online interactions.

One promising approach is to use machine learning algorithms that can detect when we're engaging with content that's likely to be misleading or biased. These algorithms can then take steps to balance our feeds and expose us to a wider range of perspectives. By doing so, they may help us develop more nuanced views of the world around us.
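One way to picture the re-balancing half of that idea (setting aside the harder problem of detecting misleading content) is to blend the engagement score from the sketch above with a bonus for viewpoints the user rarely engages with. The topic labels, the novelty measure, and the blending weight here are all assumptions for illustration, not a published method.

```python
# Hedged sketch of diversity-aware re-ranking, reusing Post and engagement_score
# from the previous example. The novelty bonus and blending weight are assumptions.
from collections import Counter

def diversity_rerank(posts: list[Post], engaged_topics: list[str],
                     diversity_weight: float = 0.3) -> list[Post]:
    seen = Counter(engaged_topics)   # topics the user has engaged with before
    total = sum(seen.values()) or 1

    def score(post: Post) -> float:
        familiarity = seen[post.topic] / total  # share of past engagement on this topic
        novelty_bonus = 1.0 - familiarity       # reward under-represented viewpoints
        return ((1 - diversity_weight) * engagement_score(post)
                + diversity_weight * novelty_bonus)

    return sorted(posts, key=score, reverse=True)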

However, there are also significant challenges to designing social media platforms that promote healthy online interactions. For one thing, we need to address the issue of algorithmic bias, which can lead to discriminatory or unfair treatment of certain groups or individuals.

Furthermore, our social media platforms are vulnerable to manipulation and exploitation by malicious actors, who may use them to spread misinformation and propaganda in order to sway public opinion or influence elections.

In order to build more secure and responsible social media platforms, we need to invest in research and development that prioritizes transparency, accountability, and user well-being. This may involve new technologies that can detect and prevent manipulation, as well as ranking algorithms that deliberately surface a broader range of perspectives.

Ultimately, the impact of social media on human relationships is a complex and multifaceted issue. While these platforms have the potential to connect us with others across the globe, they also pose significant challenges for how we communicate and interact with each other. By acknowledging these challenges and working to address them, we can build more responsible and effective social media platforms that promote healthy online interactions and nuanced discussions.

"WEBVTTKind: captionsLanguage: enif I'm looking at my news feed how come the outcome of a vote or a referendum is so different to what I'm expecting it to be all right so it's because all the people that post their opinions of what's going to happen uh of people that you've made friends with uh and they're producing a kind of filter bubble which is showing you all the same information that you think just shared by other people who think the same thing of you and it's a phenomenon of uh who you've made friends with and thus what information that filters towards you through Facebook this is an interesting social media phenomenon that we we've seen research about for a while now and that people are investigating for how to deal with because what you've experienced there is called a social media filter bubble uh and it's the way that technology is affecting the information we see on a day-to-day basis and who we see it from and who we talk to uh and it's based on several theories of how we develop friendships and things like that um but it's really a kind of increasingly well established phenomenon that we essentially um make friends with people that we know and like are similar to us and then uh all the information we see on a daily basis is coming from those people who have the same opinions as we do so it's exactly one of the two interesting things that we've been seeing lately on referendums and voting is that uh our Facebook feed feels very very different to the National outcome so without discussing you know sides and outcomes and votes it's just interesting to see that uh the votes have come out differently to what a lot of people expect or it's just been a lot closer we expect uh so either side of how you see what happened um it was really close and you don't see the same balance likely in your Facebook feed let's talk about who we make friends with because this is an interesting set of theories as to what what affects who we choose to talk to and what's like there's several things like attractiveness or uh reciprocity of U sharing information with each other or just uh helping each other with things these are all fine but two key ones are location so who we're likely to run into uh and also similarity now the similarity one is different is interesting and what effect it has there was a nice quote that I found um so similarity includes having similar attitudes values interests and beliefs while also being of similar age gender social economic status education and attractiveness has come up again uh so we're good friends because we're both very attractive um but that similarity statement is quite interesting so social economic status uh interests values these are all the things that effect you know maybe who we would vote for which or what things we vote for um but it means that if we're friends with lots of people who are similar to us then we see those same values coming out now over and over again um and so this is what's affecting the the social media fil above all is are we getting what we want is that what it is well part of it is self-satisfaction and getting what we want from our friends and social networks um but to understand how we can design technology to uh affect this or change this or to give us a more balanced view of what's happening in the world there's several things we might want to look at several theories that will help us look at it um one of them is just the theories of communication and the kind of major types of communication we have with people there's 
interpersonal communication U which is two or more people talking to each other in a kind of continuous varying dialogue which is fine and then the other is mass broadcast um which is kind of the opposite perspective where one person is talking and everyone else is sort of receiving um and so when a politician goes on TV and says something this is broadcast and when we talk to each other in our homes this is uh interpersonal communication and one of these challenges of what's happening on social media is that we're sort of trying to be be interpersonal but we are doing broadcast so let's bear that in mind of one thing the second is uh based on common ground Theory so whenever we talk to someone we have this kind of concept of what the other person knows and if we say something how they'll receive it so uh when I talk to some people about technology I take it at quite high level because I expect them not to understand the detail but to be interested in the reasons and the concepts behind it with then when you talk to another nerd you start talking about lowlevel things that only you two would find interesting but it means that when we say something uh in a group with friends we would first maybe uh say something to kind of estimate how they might respond to an issue so if you say something bold about your opinion of politics then uh will they think the same thing is you and then once you have a good estimate of that you would then choose what you say uh so we always have this kind of continuous gradient of estimating how much to say about what we think and uh whether you can make a casual joke about it and they take that or not this is fine if we're in an interpersonal real life setting but challenging when we are online on Facebook or anything like that and we're not quite sure who's going to see it so then there's the size of our friendship networks is the interesting thing um there's something called the Dumar number which is a social science uh well-known number or social psychology well-known number um which estimates that we can manage about 150 friendships at most some they say between 100 and 250 but people tend to zero in on kind of 150 uh and this isn't strong strong relationships but relationships of people that we can sort of chatter and catch up and then carry on with normal uh and then as we get more more people people start sliding off that 150 and going elsewhere but what we have on Facebook is a lot more people now dbar has done some more research more recently saying that it doesn't help us manage more real life connections but he acknowledges that we have mostly a lot more people listening to us you might have 300 friends on Facebook uh you might have 500 celebrities have thousands whatever um but it just means that whatever we say uh we are not only saying to more people than we can normally have a normal friendship with we're saying it to a broader audience which encourages this mass communication there's also an interesting Facebook study done by Bernstein atown 2013 where they uh analyzed how many people you think see your Facebook posts this is a Facebook study and they found that we estimate about 27% of the actual number of people who see it so if we think 27 people saw it 100 people saw it so we have this huge underestimation of who will see whatever we say online as well as who the audience is and trying to estimate it so what does this mean for us this means when we're on Facebook and we're talking about uh politics we are trying to say interpersonal things with 
somebody who we think shares our same opinion but we're broadcasting it to a much bigger audience than we're normal than we used to and we're in this very binary status of share not share and we don't have this kind of gradient of share certain things certain people the other effect this has had then on things like the referendum or elections is that it's created a lot more difference between people or it's divided people more because we've said things in a very broadcast politic type way that we might only say normally to someone who's a friend at home um or we would choose how to say it differently to a friend to another friend um so the impact this then has on our filter bubbles is that it exaggerates our filter bubbles we might unfollow someone on Facebook not necessarily friend them but just turn off their posts so that we don't see something we disagree with or something we don't find interesting uh and so we specialize our filter bubble to even more what we want to see and who we want to hear it from uh and it creates The Divide even more as to what's happening and we have a less realistic view of the whole world then so what we have to do with technology is figure out how to how and when we should affect that which means there's a lot of research at the moment looking at how do you manipulate filter BS how often should you bring in different types of content it's the kind of thing that set off the Facebook emotion study that got lots of negative press and they were trying to say if we slightly manipulate the Facebook feed to have different types of content positive and negative or different political views then what affs this have on people um so there's a lot of research going into how can we build an algorithm which samples different people's points of view but while still being something you find comfortable and interesting um because of course we get a lot of social personal value from Facebook that makes us happy it's a lot to do with well-being than it is to do about political Communications does that mean that maybe someone thinks they're being an activist but the computer science side of things is is causing that perhaps just not to go to who they're hoping it will go to yeah so the the algorithm is optimized to show you things that you're likely to like and it's uh optimized to kind of later on show things if you continue to look and you want to see more and more to then bring in other stuff that you might not normally press the like button on so that sort of means that if we say something then the majority of people in your Facebook filter bubble who are already on your side anyway probably just see a small statement and maybe like it and then the people who wouldn't have liked it probably wouldn't see it so it's it all exaggerates the filter bubble and so the question is can we design algorithms which are more varied or sample variation the right rates uh or bring in different types of opinions at the right sort of times give you balanced views or know when to give you balanced views it's a huge opportunity because we're in this massive social technology space where it's mediating the the vast aity of our communication but yet it's optimized one way not kind of in a balanced way the problem is that if I obtain a cookie off you which is supposed to be secure then I can send that to let's say Amazon or to a shop and say I'm Sean please you know what's in his shopping basket what's his address what's his credit card detailsif I'm looking at my news feed how come the outcome 