
Genius Interview: Daisy Soderberg-Rivkin

Posted on Jun 6, 2023 by FEED Staff

Misinformation, terrorism, child sexual abuse material – you name the online danger, Daisy Soderberg-Rivkin has battled it. From responsible gaming manager at Lego to member safety policy lead at Bumble, she has seen it all. FEED sat down with her just before she embarked on her latest adventure at TikTok

FEED: You have been on quite the journey when it comes to the world of internet safety. Why not start with what motivated you to fight for a safer online world?

DAISY SODERBERG-RIVKIN: People tend to classify my role as a trust and safety professional. What that means is I’ve written, analysed and implemented online safety policies for many different companies, organisations and think tanks – and the government.

My whole career has been about keeping people safe online – and it all began with being a content moderator at Google.

Starting out with that job meant I began in the trenches of online safety, looking at anything and everything you can imagine. My role there was within web search, which is probably the biggest product at Google you can work on. I was involved with topics that could be considered ‘not as bad’, like copyright and trademark infringement. But then that ranged all the way up to child sexual abuse material (CSAM), terrorist material, defamation, revenge porn – ultimately the biggest dangers we see on the internet. So really, I started with content moderation, which has several routes you can take afterwards, career-wise.

You could go in a completely different direction, or you can do what I did and find what inspires you to continue that work, knowing all the bad that is still out there.

I started to specialise more and more in the child safety and terrorism areas, but particularly the former. The reasons are obvious: children are one of the most vulnerable populations we have, and the future of our online and offline worlds. It’s important we keep them safe, but also make sure we are shaping good citizens for the future because, one day, they will be the ones managing all these tools and mechanisms. They are going to be the government officials looking down and trying to regulate these things. That’s why it’s so important to keep them safe – and it’s what inspired me to continue this line of work.


FEED: What did that lead to?

DAISY SODERBERG-RIVKIN: A wide range of platforms. I’ve worked in the dating area at Bumble as a content policy lead, and consulted for the US government and Congress on the regulations they put forth. In theory, those regulations sound great, but in practice… they don’t work out so well. My role was to bring a private-sector perspective and translate those concerns.

Then I worked with Lego – sort of the ultimate golden circle of child safety – within the gaming area. Right now, I’m about to start a new job at TikTok as a policy manager covering the EMEA market.

FEED: You mentioned policies vs practice. Can you explain why it is so challenging to put laws into practice in this field?

DAISY SODERBERG-RIVKIN: Online safety is an ever-changing and nuanced area. The reason is, it’s easy to draw a red line around things that are completely illegal – like CSAM, for example, where there isn’t really much need for debate over whether it’s good or bad. It’s something that society deems wrong.

Where it gets difficult is in the grey areas – things that are not necessarily illegal, but that some deem harmful. That’s where things get complicated, because it’s subjective. What it comes down to is how the company you’re working for looks at that type of issue.

I break things down into three groups – the company, the public and the government – which means three different perspectives.

From a company perspective, private enterprises can allow or disallow whatever they want – short of anything illegal. Governments then come in and say, ‘but these things are harmful to kids. We’ve seen studies, we’ve seen this – we’ve seen that’, trying to pressurise those companies into doing more. They then try to come up with regulations that reflect the view that children’s online safety is important.

But what they don’t think about so much is how it will go in practice. The conversation often falls short at ‘what do you think is enough?’ because the answer will be different from one regulator to another, right? Because of personal preferences. Someone who might be religious may have a different idea of what should be shown, versus someone who isn’t.

FEED: What steps can the public take to protect their children?

DAISY SODERBERG-RIVKIN: What we want is a standard for keeping platforms safe that still gives consumers choice.

That’s where transparency comes in – setting expectations for children about what they might see. Things are still inevitably going to fall through the cracks, right? You can’t protect against everything.

It’s not much different from what you experience in the real world. You send your child out into the world and protect them as much as possible, but inevitably they are going to see something that might disturb them. They are going to have a friend with a phone without parental controls, for example.

People ask me, ‘so how do you protect them from that?’ and I always say it’s not a question of protection. It’s a question of preparation. Are you doing as much offline as you’re doing online? Are you having the conversations with them? Most people simply don’t realise that half the battle is offline. 

In summary, it’s the idea that when you’re writing regulations and creating policies, you make sure you are consulting with the people who are actually going to enforce them – to see whether it’s realistic and how it might impact the people they’re trying to protect.

I’ve often said when answering questions like this that it is a village effort. It’s the parents, the company, the government – society as a whole.


FEED: How well are governments handling this?

DAISY SODERBERG-RIVKIN: You see scary examples, like in India, where they are trying to increase censorship of OTT platforms, stopping them from showing anything that expresses disagreement with the government. That’s when things become murky and basic human rights come into play. The position of governments is always up for debate.

FEED: What are the biggest dangers we are seeing online right now?

DAISY SODERBERG-RIVKIN: There are the obvious ones that we’re always talking about, like CSAM and privacy.

But what really concerns me – and I don’t think it’s talked about as much – is the risk of online harms translating into offline behaviour, in turn presenting offline risks. I’m talking about recruiting for terrorism; cyberbullying that translates into bullying at school; racism. Or misinformation in targeted advertising – relating to smoking, drinking, drugs and dieting – the latter having a massive impact on teenage mental health.

This problem of the online transferring into the offline has real consequences – including deaths. You see kids playing first-person shooters that often contain extremely violent material, telling them it’s okay to pull out a gun and shoot someone if they’re annoying you. There have been studies on serial killers who were influenced by video games.

These are just a slice of the dangers we face online today. And with the metaverse coming, it’s never been more important to remember that the online world is shaping a future generation – one that will live in and manage our societies, and our online and offline presences.

FEED: How do you overcome the age-old debate on first-person shooters?

DAISY SODERBERG-RIVKIN: If we have learnt anything from Prohibition, it’s that getting rid of something or outlawing it isn’t going to stop people from doing it. It usually makes things worse. Also, kids are far more tech-savvy than we give them credit for. If you don’t let them do something, they will find it somewhere else.

So then we come back to this idea of preparation in the offline world: making sure they understand that although what they’re playing might be fun, it’s fictional; making it clear that whipping out a gun and shooting someone is not okay – and is illegal.

There are also games getting more creative – picking up on what it is that kids love so much about Fortnite, and mimicking those elements without the violent component.

I’m not saying that anyone has found the solution. There’s never going to be a silver bullet for this. But balance is a good thing – making sure there are non-violent competitors in the market alongside the big players like Call of Duty and Fortnite.

FEED: What key regulations are currently in place? Are they helping?

DAISY SODERBERG-RIVKIN: Most regulations throughout the world cover a variety of platforms. In the UK, the big one is the Online Safety Bill. In the US, there isn’t really a central law that addresses online safety; instead, there are multiple laws covering different online sectors. There’s the Children’s Online Privacy Protection Act (COPPA), which focuses on keeping children’s data safe – very similar to the measures you see under GDPR in the EU.

For example, when I was working at Lego, we would consult with product teams and make sure we didn’t use kids’ data at all, period. That can be a challenge, because you’re trying to determine how well you’re performing – and without any data, that’s really difficult.

In Australia, there’s the Online Safety Act, which creates online safety expectations for companies. Again, these are regulations that are very good in theory. But when you are on the other side, trying to implement those things, you realise how subjective safety is.

FEED: There’s a lot of focus on the negatives online. What are the positives?

DAISY SODERBERG-RIVKIN: Most of what we hear about are the negative impacts of social media and gaming. I think the media has thrown a lot of information at people about how the internet is going to ruin their kids. The scare tactics have been intense. That’s not to say I don’t agree with many of those concerns. There are definitely risks – just as there are risks in the real world.

Sometimes, it’s nice to look at the positives for a change. Great things also come from using the internet.

From an educational perspective, you have easy access to academic material online. From a social perspective, you have access to people who are from every corner of the world. The ability for children to speak to someone from another country and get a different cultural perspective is incredible.

Think of Covid-19! Kids were able to continue going to school, despite schools closing physically on a global scale. There are so many positive impacts.

That’s not to overshadow the negative side of things, of course. But it goes back to that question: what is possible, and what is enough?

When I think of how the internet was created and social media companies followed suit, I think of the phrase ‘you can’t put the toothpaste back into the tube’. It’s out there. Parents often say, ‘why don’t you just ask social media companies to remove all of the danger?’ and I turn back to them and ask, ‘how do you propose we do that?’

The government will come to you and say: ‘remove all nudity, remove all bad words’ and so on. Then I say, ‘well, there will be nothing left’. You have to think: all these efforts are being carried out by many different mechanisms. As I said, it takes a village – content moderators, AI age-verification mechanisms, all sorts of things. But these are still being worked on. They haven’t been perfected – and they never will be perfect, because our definition of what’s good and bad will always change. There will always be a new issue.

FEED: Should we have safety concerns when it comes to the metaverse?

DAISY SODERBERG-RIVKIN: With the metaverse, the first thing that came to mind was a sense of PTSD from the time when we first dealt with livestreamed video – that lack of control when something is happening in real time, and you might not be able to stop it. That’s where the safety concern lies. It’s also scary in the sense that no one has actually defined what the metaverse is. There are many definitions out there, none of which is unanimous.

FEED: Any advice for someone who wants to enter your profession?

DAISY SODERBERG-RIVKIN: I would start by saying it’s not for the faint-hearted. It’s a job for people who can operate under pressure and make quick decisions, but who also keep in touch with what’s happening around the world.

A lot of people ask what they need to study to go into trust and safety. I honestly think anyone can work in this field. Trust and safety is full of lawyers, engineers, policy officers, economists, doctors – the list goes on.

The internet isn’t going away. We need the same institutions we have in the real world online, while making sure we can regulate and move forward as a society within the online world as best we can.

Originally featured in the Summer 2023 issue of FEED.
