
Taha Yasseri, Oxford Internet Institute: “We can have value judgement only when we are isolated from social information”

Posted on Nov 4, 2019 by FEED Staff

Dr Taha Yasseri from the Oxford Internet Institute discusses the influencer economy and how ‘groupthink’ affects us more than we like to admit

FEED: Let’s start by hearing about your background and your involvement with the Oxford Internet Institute.

Taha Yasseri: My training and background is in theoretical physics and the physics of complex systems, and then in network science. I joined the Oxford Internet Institute about seven years ago. I’m a senior research fellow in computational social science at the moment.

I do a lot of network analysis to answer different questions, ranging from how information diffuses in social networks, all the way to how online dating is being revolutionised by mobile apps. Most of my work is based on large-scale data analysis and mathematical modelling.

FEED: Can you talk a bit about the power of social media influence and what its real value is? What is the difference between the real value of social media and its perceived value?

Taha Yasseri: Well, the fact we are influenced by others and, to a great extent, make our decisions based on what other people do (friends, family or colleagues) is not new. It’s not an internet phenomenon.

In 1969, 50 years ago, an American psychologist, Stanley Milgram, and his colleagues did some experiments. In one, they went to the streets of New York and started staring at a window – a random window of a random building – across the street, even though there was nothing going on there. Then they counted how many other people looked as they walked by, or stopped and kept looking. They counted this number and then changed the size of their own initial group to see how much more influence they could have as their group grew bigger. So, as I said, this has nothing to do with the internet. The phenomenon has been observed, and even experimented on and measured.

The information we receive about what other people are doing – we call it social information – used to be limited by physical proximity. We could physically see what people did. Or, after we invented the telegraph and telephone, we could know what our relatives or friends did, and then be influenced by them. But today on social media and online platforms, we receive information about what other people have watched, shared and done on a very large scale. We get information about thousands or millions of people that we’ve never met and are not going to meet. When I see a video on YouTube has been watched four billion times, I have no idea who these people are, but this number is very intimidating. I would feel left out if I didn’t join that huge group of people who have shared this experience.

FEED: How do online social networks amplify that already existing tendency?

Taha Yasseri: Online, we are bombarded with social information and are therefore much more vulnerable to making our decisions based on other people’s choices. There have been multiple experiments reporting and quantifying this effect. It is there in our nature – it’s not something created by online platforms.

The first question to ask is: what exactly is being spread on our social networks, beyond the spreading phenomenon itself? The spreading phenomenon is just a tool, but what is being spread? It could be fake news, a rumour, destructive information or it could be good habits, new fashions, trends or a catchy song. You can’t really criticise the phenomenon of social influence just on its own.

There have been amazing experiments recently done by Sinan Aral from MIT, who showed social contagion between people who use Nike running apps. They showed that if my friends run on a given day, I’m much more likely to go out and run as well. This is significant social influence I’m getting from my friends. But it’s for good, because we have seen that running and exercise are positive things. Now if, instead of jogging, we talk about a piece of fake news or a wrong belief, obviously that could be negative.

FEED: Do numbers create influence? If I look at a Twitter account with 50 followers, am I less likely to pay attention to it than if it’s an account with 500 followers? 

Taha Yasseri: Absolutely. It has been reported on many different occasions. We ran a real-world experiment in one of the museums in Oxford. We asked visitors to the museum to choose their favourite picture from nine pictures on the same wall. We gave them an iPad showing thumbnails of the same pictures and they could choose their own favourite. To half of them, we gave an iPad that showed the live stats for each picture up to that moment, so they could see what other people had chosen as their favourite pictures.

In both groups, the same picture was the winner of the competition, but in the group that could see the number of votes cast for each picture, the winner won with a much bigger margin – there was no chance for the second and third pictures to ever catch up with it. True, it was a nicer, more attractive picture, but there was this feedback effect. People, in the moment they were about to cast their vote, could see “Oh, this is popular”, and they were much more likely to choose the one other people had already selected.

When people left the room, we showed them these results. People in the art world think they have strong opinions when it comes to art. They will say, “No, I have chosen based on my tastes. I wasn’t influenced.” And it was hard for us to pinpoint individuals and say, “No, you were influenced by others,” but we could see collectively there was a significant difference between people choosing on their own and people who received the information about other people’s choices.

That’s why every now and then, we see something on social media that goes madly viral, like the Instagram egg a few months ago. There is no content, nothing there you could say people really liked. It’s just a picture of an egg. But through the same processes, through copying other people’s behaviour and the feeling we are part of the group, there is a strong force that can, to a great extent, affect our behaviour and what we do.
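
A minimal simulation sketch can make this feedback loop concrete. The code below is purely illustrative – the appeal weights, copy probability and number of voters are invented, not taken from the Oxford museum study – but it reproduces the qualitative pattern he describes: the same picture tends to win in both conditions, yet the margin is far larger when voters can see the running tally.

```python
import random

# Illustrative parameters only: nine pictures, one slightly more attractive
# than the rest. None of these numbers come from the actual study.
APPEAL = [1.3] + [1.0] * 8
N_VOTERS = 1000
COPY_PROB = 0.5  # chance that a voter simply follows the visible tally

def vote_independent():
    """Every voter chooses by intrinsic appeal alone (no social information)."""
    counts = [0] * len(APPEAL)
    for _ in range(N_VOTERS):
        pick = random.choices(range(len(APPEAL)), weights=APPEAL)[0]
        counts[pick] += 1
    return counts

def vote_with_tally():
    """Voters see the running tally; some copy it in proportion to popularity."""
    counts = [0] * len(APPEAL)
    for _ in range(N_VOTERS):
        if random.random() < COPY_PROB:
            # Rich-get-richer step: weight by current votes (+1 smoothing).
            weights = [c + 1 for c in counts]
        else:
            weights = APPEAL
        pick = random.choices(range(len(APPEAL)), weights=weights)[0]
        counts[pick] += 1
    return counts

random.seed(1)
print("independent:", sorted(vote_independent(), reverse=True))
print("with tally: ", sorted(vote_with_tally(), reverse=True))
# Typically the same picture tops both lists, but the gap between first and
# second place is much larger when the tally is visible.
```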

FEED: Is the idea ‘people will look at something and make their own critical value judgement’ fading? Is there a shift now where facts matter less? 

Taha Yasseri: I think it’s very optimistic to think humans make their decisions based on a true judgment process. And it has never been like that. I don’t think our judgement is fading due to the technology. I think the technology could just amplify what has always existed in all societies. And, of course, the technology allows us to observe it in a better way. 

How come, 200 years ago, a large number of people ended up cutting their hair in a ridiculous fashion? This is the same process. At the time, there was no Instagram. People just saw other people on the street with a certain haircut and thought they would copy it. They wanted to have the same identity or fashion taste.

Social media can amplify it by providing this information on a whole different scale. We do not need to walk down the street to see how people cut their hair. We can sit in our home and look at hundreds of thousands of pictures on Instagram, Twitter and Facebook. It’s much more visible today.

FEED: What other science has been done around studying influence by social networks?

Taha Yasseri: My favourite experiment is one that Matthew Salganik and Duncan Watts of Princeton University have done. They asked people to listen to different songs and choose their favourite, ranking the songs according to quality. Then, to a second group of subjects, they showed a preranked list of the songs based on what other subjects had selected. Many of the people in the second group ended up copying that ranked list.

 

To a third group, they gave wrong information, with basically the worst song at the top of the list, and still people followed the list. People said, “Yeah, I really like that music,” when that was music that the majority of people on their own had said was not a great piece. But, because it was put on top of the list and because they were told this was the choice of the majority of the other people in the group, people followed it. That completely shows there’s very little value judgment.

We can have value judgment only when we are completely isolated from social information. As soon as we have even some information about what other people would choose, or what other people have chosen or what decisions other people have made, we are very much prone to following it, particularly when it comes to things we have very little personal opinion about, or if we are thirsty for information and need some sort of guidance toward one or another option. We do not really use our own judgment.

FEED: In the September issue of FEED, we looked at the online economy of likes and subscribers and how easy it is to manipulate. Do you think the practice of buying likes is common? How does it affect the credibility of an influencer and their audience?

Taha Yasseri: I don’t have hard data to say how many likes are bought, or how many fake reviews are out there. But I would be surprised if someone claimed there is no artificial social boosting happening on social media. Of course, different platforms have their own strategies to suppress or control this, and they might be successful to a certain extent. But as the algorithms they use to detect and suppress artificially boosted content become more sophisticated, so do the algorithms and methods on the other side of the field – that is, the ones marketing companies use. Most of us, of course, just see the numbers – we see this content has received this many likes and that one has received that many.

What social media platforms could do is to look at the network behind these numbers. Seemingly similar numbers could originate from different structures. They could come from isolated individuals without many followers and without much content they have created on their own. Or the same number could come from users who are embedded in a social structure. The first case is much more likely to be an ‘astroturf’ number – an artificial boost that does not come as a result of social spreading.

These are signs that social media platforms could use to detect and suppress artificial boosts, but it’s very difficult for normal users like us to just look at the number and say, “Oh, this seems to be a fake number.” Well, the number is real, but whether it means more than just an algorithm liking content… that’s hard for regular users to find out.
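
As a rough illustration of the structural signal described above, here is a sketch using the networkx library with an invented follow graph and invented liker lists. The ‘embeddedness’ score it computes is just one plausible proxy for whether likes came through a social structure, not any platform’s actual detection method.

```python
import networkx as nx

def embeddedness(follow_graph, likers):
    """Fraction of liker pairs connected by a follow edge.

    A crude proxy: likes that spread socially should come from accounts
    that tend to follow one another, while purchased likes tend to come
    from unconnected accounts."""
    likers = [u for u in likers if u in follow_graph]
    sub = follow_graph.subgraph(likers)
    n = len(likers)
    possible_pairs = n * (n - 1) / 2
    return sub.number_of_edges() / possible_pairs if possible_pairs else 0.0

# Hypothetical follow network: accounts 0-19 form a loosely connected
# community; accounts 100-119 follow nobody and have no followers.
G = nx.erdos_renyi_graph(20, 0.3)
G.add_nodes_from(range(100, 120))

organic_likers = list(range(20))          # likes from inside the community
astroturf_likers = list(range(100, 120))  # likes from isolated accounts

print("organic   :", round(embeddedness(G, organic_likers), 3))   # well above 0
print("astroturf :", round(embeddedness(G, astroturf_likers), 3)) # 0.0
```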

FEED: Are there any good tools people can use to verify what they are seeing online? 

Taha Yasseri: Yes, there are a number of different tools. One that I use when I see something going viral on Twitter and I feel a bit suspicious about it, particularly when it comes to political content, is Botometer. It’s a tool produced by my friends at Indiana University in the US and it uses multiple pieces of information. It goes through the network of followership and tweets and retweets of individual accounts, and then gives you a measure – a likelihood – of that Twitter account being a bot instead of an actual human being.

Of course, there is an understandable level of error in this detection and the outcome of the algorithm is not ‘yes’ or ‘no’; it’s a probability. But then you can look further into that. If an account has a very low chance of being a human, you can see why and which parameters suggest this account is probably a robot. It gives a very good overview of a given hashtag or keyword, or of the followers of a given account. You can see what percentage of them might be, or most likely are, robots.

But as I said, using the same insight that we gain from these tools, you can reverse-engineer the process and produce even more sophisticated bots or more sophisticated algorithms.
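
For readers who want to try this programmatically, Botometer is also exposed through an API with an official Python client, the botometer package. The outline below follows the package’s documented usage pattern; the credentials are placeholders, access requires registration and is rate limited, and the exact fields returned (such as the ‘complete automation probability’ scores) vary between API versions, so treat it as an assumption-laden sketch rather than a tested recipe.

```python
import botometer

# Placeholder credentials only: Botometer is served through RapidAPI and,
# in the versions documented, also needs Twitter app credentials.
rapidapi_key = "YOUR_RAPIDAPI_KEY"
twitter_app_auth = {
    "consumer_key": "YOUR_CONSUMER_KEY",
    "consumer_secret": "YOUR_CONSUMER_SECRET",
}

bom = botometer.Botometer(
    wait_on_ratelimit=True,
    rapidapi_key=rapidapi_key,
    **twitter_app_auth,
)

# Score a single account. The response is a set of probabilistic scores,
# not a yes/no verdict, in line with what is described above. Exact fields
# depend on the API version.
result = bom.check_account("@some_suspicious_account")
print(result.get("cap"))             # "complete automation probability"
print(result.get("display_scores"))  # per-category scores, if present

# Score several accounts at once, e.g. followers of a hashtag campaign.
for screen_name, res in bom.check_accounts_in(["@account_one", "@account_two"]):
    print(screen_name, res.get("cap"))
```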

FEED: What do you think is the most effective artificial means of boosting signal or a message? Is it bots and likes? What are some of the more effective methods people are using?

Taha Yasseri: It has been shown that when we receive the same information – or the same item is being promoted to us – from multiple channels, it’s much more effective on our decision-making.

If I see ten of my friends have bought the same mobile phone, I may or may not follow them and buy the same phone. But if I see two of my friends, two of my colleagues and two of my family members have bought the same mobile phone, I’m much more likely to. 

In the second case, there are only six people instead of ten, but because they are from different communities and from different channels, the overall effect is much stronger on my decision.

Based on that, the most effective strategy could be producing content in different corners of the network through social media bots. It’s very difficult to measure and to quantify the influence, but I think receiving the same content from four different accounts is much more effective than receiving the same content from a single account, but with many more likes. But I am just speculating, based on the research that showed receiving information from multiple sources is much more effective.
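
A toy model, entirely of my own construction with invented weights, captures the intuition behind the ten-versus-six example: if the first exposure from each distinct community counts for more than a repeat exposure from the same one, six exposures spread over three communities can beat ten from a single community.

```python
def adoption_probability(exposures_by_community,
                         first_exposure_weight=0.15,
                         repeat_exposure_weight=0.03):
    """Toy rule: the first exposure from each distinct community counts for
    more than repeat exposures from the same community. All weights are
    invented purely for illustration."""
    p = 0.0
    for n in exposures_by_community:
        if n > 0:
            p += first_exposure_weight + (n - 1) * repeat_exposure_weight
    return min(p, 1.0)

# Ten exposures, all from one community (ten friends):
print(adoption_probability([10]))       # ~0.42
# Six exposures spread over three communities (friends, colleagues, family):
print(adoption_probability([2, 2, 2]))  # ~0.54
```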

FEED: Is it true there’s only one thing worse than being talked about, and that’s not being talked about? Would you agree that negative press is just as good as positive press?

Taha Yasseri: It’s true. We had a project about seven years ago in which we tried to predict the box office revenues of movies based on how much social buzz there was about them before they were released.

When I was working on that project, many people told me, “Sometimes, you talk about a movie just to say it’s a bad movie. It doesn’t necessarily translate into tickets or box office revenue.” And I couldn’t argue with them at the time. 

But then, when we finished the project, we realised there is no such thing as bad publicity. There was a very clear relationship between the social buzz and success. We didn’t measure the sentiments. We didn’t measure if people were talking about a movie in a positive way or a negative way. It was beyond our method. We just measured the volume, and the sheer volume was a very good predictor of the box office revenue. That was the first time I actually believed there is no such thing as bad publicity.

Sometimes, we go and watch a movie just because we have heard it’s bad, but often we go and watch a movie just because we have heard about it. We do not remember what was said about the movie. As long as the name sounds familiar to us, we choose that over other options we haven’t heard anything about.
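
The shape of that analysis is easy to sketch: regress revenue on pre-release buzz volume alone and check how much of the variance it explains. The numbers below are made up and the model is deliberately simple; it is not the project’s data or method, only an illustration of predicting from volume without any sentiment analysis.

```python
import numpy as np

# Invented numbers, only to show the shape of the analysis: pre-release
# mention counts ("buzz") and opening revenue in millions.
buzz = np.array([1200, 5400, 800, 15000, 3000, 22000, 600, 9000])
revenue = np.array([4.1, 18.0, 2.5, 52.0, 11.0, 70.0, 1.8, 30.0])

# Ordinary least squares: revenue ~ a * buzz + b.
a, b = np.polyfit(buzz, revenue, 1)

predicted = a * buzz + b
ss_res = np.sum((revenue - predicted) ** 2)
ss_tot = np.sum((revenue - revenue.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"revenue ~ {a:.4f} * buzz + {b:.2f},  R^2 = {r_squared:.2f}")
# If volume alone already explains most of the variance, sentiment is not
# needed for a rough prediction -- the point made in the interview.
```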

FEED: What effect do you see online influence culture having in the future?

Taha Yasseri: I might have sounded a little bit negative, but I want to say that this is not new. I hear people say that social media is destroying our democracy and that we are in a post-fact era.

Some of this might be true, but what I’m saying is that social media is not to blame. It might amplify some of the intrinsic features of human societies, but so have many other technologies. When printing became cheap and affordable books appeared in Victorian times, the elites were very worried the ‘commoners’ would read books and be diverted away from sane thinking!

For any new communication technology, we fear it might destroy our society. I don’t think we need to blame or should blame the technology. Of course, with any new technology, there are challenges, and we need to learn about them. We need to regulate them and we need to control them. But the challenges are mostly and fundamentally about human nature and society – not just the technology.
