
3.05 | 7.24.19

Earlier this year Soonish took on social media in an episode called A Future Without Facebook. In that show I explained my own decision to quit the troubled platform and talked with friends and colleagues about their reasons for staying or going.

But the story of how these platforms are confounding early hopes for social media—and are instead blowing up democracies—was never just about Facebook. In today’s special follow-up episode, I speak with national security expert Juliette Kayyem and former Twitter engineer Raffi Krikorian about the challenges spanning all of our social media platforms—Twitter, YouTube, Instagram, Facebook, Reddit, and many others.

Algorithms designed to serve personalized content and targeted ads, for instance, have ended up fueling political polarization, aggravating radical-fringe resentments, and accelerating the spread of misinformation and disinformation. “The aspect that's different now is…the extent to which the guy sitting alone, who has these horrible thoughts, is able to find a community or a network to radicalize him and give a sense of community for that anger,” Kayyem observes. YouTube’s autoplay feature, which can lead viewers down rabbit holes full of conspiracy-theory videos, “might be one of the most dangerous features on the planet,” Krikorian comments.

How can we fix it? Both Krikorian and Kayyem say what’s needed is a combination of citizen pressure, technical and business-model changes, education for individuals (so they’ll know how to judge what they see on social platforms), and legislation to make information sources more transparent and hold platforms liable for the harassment they facilitate.

My chat with Kayyem and Krikorian was recorded at Net@50, a celebration of the 50th birthday of the ARPANET (the precursor to today’s Internet) organized by the World Frontiers Forum and Xconomy. Thank you to both organizations for permission to share the session and use the photo above.



Mentioned in this Episode

The Constant: A History of Getting Things Wrong

Soonish Episode 3.03: A Future Without Facebook

Net@50

Xconomy

World Frontiers Forum

Juliette Kayyem

Raffi Krikorian

Plymouth Rock: A Pageant from Iconography

Soonish Episode 3.04: The Art that Launched a Thousand Rockets

Soonish on Patreon




Chapter Guide

00:00 Hub & Spoke Sonic ID

00:08 Special Announcement: The Constant Joins Hub & Spoke

01:59 Soonish Opening

02:15 Audio Montage: Social Media in the News

03:43 The Problem Is Bigger than Facebook

05:29 Meet Guests Juliette Kayyem and Raffi Krikorian

06:04 Question 1: How Did You Get Interested in the Problem of Social Media?

12:39 Question 2: Shouldn’t We Have Noticed This Earlier?

16:22 Question 3: Micro or Macro Solutions?

22:54 Question 4: Can Individuals Make a Difference?

24:42 Audience Question: What’s Really New Here?

27:59 Audience Question: Should We Eliminate Anonymity on the Internet?

29:17 Audience Question: Making Us Smarter

31:21 Final Credits

32:14 Check Out the “Plymouth Rock” Episode of Iconography

33:35 Thank You to Our Patreon Supporters




Notes

The Soonish opening theme is by Graham Gordon Ramsay.

All additional music is by Titlecard Music and Sound.

If you like the show, please rate and review Soonish on Apple Podcasts / iTunes! The more ratings we get, the more people will find the show.

You can also support the show with a per-episode donation at patreon.com/soonish. Listener contributions are the rocket fuel that keeps this whole ship going!




Full Transcript

Wade: Before we dive into the show today, I've got some exciting news to share. Hub & Spoke is growing. This month we brought a new show into the collective. And if you like Soonish, I know you're going to like The Constant.

It's a really fun and engaging show about the human side of science and how common it is for scientists to get things wrong. The show is made by Mark Chrisler, who lives in Chicago and is also a playwright. And Mark's goal isn't to make fun of scientists. Not at all. Instead, I think he's trying to show how scientists are just people who make mistakes and get led astray by their own beliefs and biases, just like everyone else. And how when you think about it, it's actually kind of amazing that science still works as well as it does, and that in the long run we do learn how to ask better questions and get less wrong answers.

In his latest episode, Mark argues that if there's going to be a patron saint of wrongness, it should be Douglas Corrigan, better known as Wrong Way Corrigan. He's the pilot who gave new meaning to air quotes by "accidentally" flying solo across the Atlantic in 1938.

Mark Chrisler: What makes July 17th so holy? In The Constant calendar, it's the anniversary of either one of the greatest adventures or misadventures of all time. It's hard to say for sure, which is as I like it. It's been 81 years since a single-engine Curtiss Robin took off from Floyd Bennett Field in Brooklyn on an amazing journey that captured the attention of people throughout the world. An amazing journey that was never supposed to happen.

Wade: We couldn't be more thrilled to welcome Mark into Hub & Spoke, and I hope you'll check out that episode and the entire Constant archive at constantpodcast.com. OK, on with the show.

Audio Montage: The future is shaped by technology, but technology is shaped by us.

Wade: You're listening to Soonish. I'm Wade Roush.

Audio Montage: President Trump used Twitter to blast several Democratic congresswomen of color for criticizing the United States and being what he called un-American.

Audio Montage: He wrote it down on his Twitter feed. He said, the countries you came from.

Audio Montage: Facebook and Instagram banned conspiracy theorist Alex Jones. Jones' talk show has a Facebook page where he's repeatedly said the Newtown school shooting was staged.

Audio Montage: The problem lies with YouTube's algorithm. Is YouTube basically recommending videos to all of us just to keep us hooked?

Audio Montage: This comes after a gunman who killed 50 people at two mosques in New Zealand live-streamed the shootings on Facebook.

Audio Montage: Facebook took it down only after being contacted by officials in New Zealand. But then other users kept reposting that video on Facebook as well as on YouTube and Twitter.

Audio Montage: It took just seconds for this Instagram livestream to go from dangerous to deadly.

Audio Montage: In a series of tweets, Darla Schein claimed without evidence that childhood diseases such as measles keep you healthy.

Audio Montage: The company has faced intense criticism in recent years for not doing enough to curb hate speech and misinformation. After last Saturday's synagogue shooting near San Diego, white supremacists gathered on a Facebook page linked to the alleged shooter to express support.

Wade: It's one of the biggest and, in a way, one of the most surprising technology stories of the last three years. Social media, a set of tools that was supposed to give everyone a voice and bring us together around shared ideas, has turned out to be way more effective at doing just the opposite: sowing division, doubt, discord and distrust.

If you listened to the March 2019 episode of Soonish, you heard me place a lot of the blame for this development on one company and one CEO, namely Facebook and Mark Zuckerberg.

And there is plenty to criticize about Facebook, especially the way its surveillance capitalism business model gives everyone from political campaigns to state-sponsored hackers a way to mess with our democratic process by targeting specific groups with inflammatory ads and posts. I may even have called Facebook the Ford Pinto of the Internet, in a reference to the 1970s hatchback with a fatal engineering flaw that caused it to explode in rear-end collisions.

But the story of how social media technology is backfiring on us was never just about Facebook.

And today, I want to bring you a conversation that I taped on stage recently with two remarkable people who've been longtime observers of the whole social media sphere and who have some concrete ideas about what's most broken and what we can do to fix it.

This interview happened in July 2019 at an event at the MIT Media Lab called Net@50. It was a celebration of the fiftieth anniversary of the creation of the ARPANET, the precursor to today's Internet. And it was co-organized by the World Frontiers Forum and the technology news site Xconomy, where I used to work as an editor.

The theme of the segment was how to fix social media. And the first speaker you'll hear is Juliette Kayyem. She's a national security expert who teaches at Harvard's Kennedy School of Government and who served as assistant secretary for intergovernmental affairs in the Department of Homeland Security under President Obama.

The second speaker is Raffi Krikorian. Raffi helped invent the idea of the Internet of Things as a grad student at the MIT Media Lab. And he went on to work as an engineer at Twitter in its early days. Now Raffi is managing director at Emerson Collective, the philanthropic and social change organization started by Steve Jobs's widow, Laurene Powell Jobs.

Wade: Juliette and then Raffi, I'd like to ask you first to tell us a bit more about what you're up to now. And maybe as you fill us in on your professional biographies, weave in a little bit about how you came to be interested in this big question we're asking: What happened to social media, and what can we do to fix it?

Juliette Kayyem: Well, thank you. We were saying that we feel like we're the buzzkill panel of the day, of this visionary day. But I never thought that Peter Gabriel would be the opening act. So I'm very excited to be here.

I mean, I come to this from the perspective of a background in counterterrorism and national security. I served on the National Commission on Terrorism, have spent a career in counterterrorism, and then more generally, homeland security, basically risk reduction, building defenses. I'm on the faculty, a professor at Harvard's Kennedy School of Government. That's where I actually started thinking about policy work around these issues. Essentially, security is actually the easy part. Security is, you know, build a wall, don't have a Boston Marathon. That's the easy part. The challenge is: how do you promote the secure flow of people, goods, ideas, and networks? That's a sophisticated understanding of what security is today, because you have to promote that flow, but it has to be secure. I'm interested in this issue in terms of fixing social media. While the privacy concerns are important, while the fake news and fake information are important, I'm really concerned about the radicalization issue, especially coming from counterterrorism. And a lot less so on the ISIS side, but [on] the white supremacist side.

So what you're seeing today is different than, say, when I got into counterterrorism in '95. Right? So Al Qaeda was a very, very small, targeted group of men. All of them had, you know, fought in the Afghan war. They all had vowed directly to bin Laden. And it was a limited number of people, and they didn't trust anyone else.

The radicalization that you're seeing now has three components, all aided by social media. One is, of course, this idea that it's a displacement hatred. It doesn't have to do with, you know, "Oh, I don't like the African-American couple that moved in next door." It's actually a sense, and you see it in a lot of this stuff today, that the fact that the African-American couple moved in next door displaces my presence. Right? This otherness that we're seeing play out on Twitter now. The second aspect that's different in radicalization now is, of course, social media and the extent to which the guy sitting alone who has these horrible thoughts is able to find a community or a network to radicalize him and, you know, to give a sense of community for that anger and therefore the violence. And then the third, of course, is a public discourse. And I don't—why am I apologizing for being political?—a public discourse that does not condemn this racism, that in fact condones it and promotes it, so that there is no shamefulness. All right? I'm not so naive to believe that racism is going to die in my lifetime, that anti-Semitism is going to die in my lifetime. But I would like it to be shamed. Right? I want these people to go back inside. And that combination, all aided by social media, is why I've gotten into this space. That was a long answer. But just basically, it's all horrible. No, I'm joking.

Raffi Krikorian: That's a lot to follow. I mean, for me, like I mentioned, when I left the Boston area, I moved to San Francisco and joined what was then a pretty small company called Twitter, working on building out their infrastructure. So at Twitter, my job really was to get rid of the Fail Whale: do whatever we could to make sure Twitter could scale, be reliable, and go everywhere in the world. And the things that we liked to talk about at that time were the Arab Spring, the revolutions that were coming forth because people could just talk to each other. The metaphor we really liked to use for this was a kind of flattening of the landscape. You could see everyone. You could talk to everyone. Access was being enabled. In fact, a lot of the side projects my team would work on involved being in communication with Chinese dissidents, trying to figure out how to still get them access to Twitter through the firewall. So whenever someone found a hole, we would try to figure out how to make Twitter accessible via that hole for people who wanted to use it.

But then, you know, I went off and did other stuff along the way. And then came Inauguration Day 2017, when I actually had a visceral understanding of how social media was being used against a lot of people. I networked my way into the Democratic National Committee to really go hand-to-hand combat on how to change the discourse, how to change the way we use social media and the way we talk about it. And, you know, in the political sense, well, I apologize, but in the political sense, you're facing two aspects of it. On one hand, you want to use this new mechanism to reach and talk to people. I ran a massive study in the Alabama special election for the Senate to really understand the impact of social media on getting out the vote. It turns out you actually have to go do it. We ran a randomized controlled trial against the entire state of Alabama. We excluded a certain portion of Alabama and never talked to them on social media. And we took about two thirds of the rest of the Democratic-leaning population and tried to get them to go out and vote for now-Senator Doug Jones. And it turns out a decent percentage of them actually went out to vote because they saw messages on social media. So on one end, you're trying to use this to our advantage. But on the other side, you're dealing with the fact that, you know, a lot of people went to the polls on the wrong day because they saw an advertisement saying that the wrong Tuesday was Election Day. And it was an advertisement purposely put out there.

But, you know, people don't have any mechanisms to understand what's true and what's not true when it's coming through this accelerated media. I think one of the things I like about person-to-person communication in a room is that there's built-in friction involved. Right? There's only a certain number of people I can talk to at one time. And, you know, technologies, of course, lessen that friction. But you can almost think of social media as an accelerant, in a way that we as people don't have the built-in mechanisms for. Evolution hasn't taught us how to deal with this type of social accelerant, you know, in a lot of ways.

Wade: Thanks. Well, to get political—why not? Let's apply a little self-examination. I'm feeling like I'm as worked up about these issues as you are. And I also date that back to Inauguration Day 2017, or maybe November 8th, 2016. But let's wind the clock back a little farther. Weren't a lot of these problems quite evident before that? Raffi, you've been dealing with bad actors and potential misuse of platforms like Twitter since you started in this field, right? Juliette, you've been looking at national security issues and how radicalization is abetted by the platforms. This all started a long time ago, and it's not a product of the Trump era. Shouldn't we have noticed this? Shouldn't "we liberals" have noticed this sooner?

Juliette Kayyem: Well, I mean, to a certain extent we did. I mean, the radicalization process that we were looking at with ISIS certainly was that. There were major attacks by ISIS in this country. They weren't of 9/11 magnitude, but you certainly had the Pulse nightclub and other incidents where men are alone in rooms getting radicalized online. They've never been to Afghanistan or Syria or any of these places. So we certainly knew how the platforms were being utilized. I think the accelerant aspect of it has changed even in just five years. The understanding by those of us, I don't work in government anymore, but those of us who study this, is that the platforms know how to say the right things but are not changing their algorithms in response to the right things. And if you want a better example, it's New Zealand. That it takes 17 or 18 minutes to bring down a live feed on Facebook of a man who is killing people. I'm not a technology person, but I've got a couple of data points to help. You know: the sound of bullets, people standing who are now falling, people screaming. That seems to me like a series of data points that should alert someone that something is going terribly wrong.

So I do think it has changed. And then just back to the point of the public space, I mean, think of social media as just reflective of a public space. And our tolerance in the public domain has changed. Right? I mean, it's just acceptable to say certain things. And when it's acceptable to say it in the public sphere, it is certainly going to be harder to bring it down in social media. And it's going to be harder to, as I said, there's something about shame. I mean, no one's embarrassed to have these horrible thoughts anymore. I'd like them to be embarrassed again.

Raffi Krikorian: But I think also, you know, one of the things that a free society values is the fact that you have to be confronted with oppositional opinion. That's one thing that you should go do. And one of the things that's changed really rapidly in the social media landscape is, you know, autoplay on YouTube might be one of the most dangerous features on the planet in the grand scheme of things. Right? Because what autoplay on YouTube does, and what a lot of the social media platforms also do, is send you down a rabbit hole of things that are driven by your particular engagement. And because we've created these systems that are driven by your engagement and trying to figure out how to juice that engagement, you've created these potential radicalization pathways through our networks. So we're no longer in an environment, at least online, where you're forced to see oppositional opinion. And when you do, you're immediately set up to absolutely hate it, so therefore you go back to your own camp. To put a finer point on it, you're no longer forced to see moderate oppositional opinion that you can then have a discourse about, because that's just not what these platforms are set up to do.

Wade: Right. Okay. Well, we've gotten a lot out there already in these last eight minutes about the toxicity of these platforms. You just touched on something that I think is fascinating. There's a kind of tension right now, when we start thinking about solutions to these problems, between "micro" or technical solutions—like, for example, the idea some people have raised of just turning off the YouTube autoplay recommendation algorithm in certain cases to tamp down that kind of radicalization—and then there are examples like Pinterest, which figured out that the easiest way to get rid of anti-vaxxer misinformation on Pinterest was to just make it impossible to search for that topic on Pinterest. Simplest thing in the world. That sounds encouraging from one perspective. But I also wonder whether those kinds of solutions can really scale and help us to fix the deepest problems on these platforms, which have to do with the deliberate way the platforms rile us up in order to get us to spend more time on them and see more ads. Is that a fixable problem?

Raffi Krikorian: Yeah. I mean, look at your Pinterest example. It's exactly my point on adding friction. Right? So, you know, I made this comment at some conference a few months ago that one of the biggest changes a social media company can make is just to figure out how to dampen virality: basically slow it down, make it more comprehensible to people, give them a chance to process it. If you just look at some pure statistics online, you know, a vast majority of people don't post content. They only read content. But a lot of those who do have found these frictionless ways to engage, like sharing or retweeting or something that doesn't actually require you to add information. You're just, again, juicing some algorithm in the process.

Raffi Krikorian: I think this is a multi-fold problem. Obviously there is no one silver bullet. So on the one hand you have an economic incentive. Even when Facebook is fined for potentially messing up our democracy, their stock price goes up. Right? So the economic incentives are clearly broken here. But at the same time, in government, we don't have people who can have intelligent conversations about it. I'll go back to Facebook again, because I actually attempted to help prep people for these conversations with Mark Zuckerberg. It's clear our legislators do not understand the technology we're dealing with anymore, so they can't provide a check and balance. And finally, there are local voices that are really important that are not being represented in the governance of these platforms. These platforms are governed from where I am now, in a Silicon Valley office somewhere, granted with a thousand or so people, but they're not a diverse group of people. They're not representing everyone's opinions. They don't represent all the opinions internationally. They don't even represent all of the opinions domestically, inside just Palo Alto. So there are so many things here that are set up to fail. And figuring out the fix to one of them, that's not the answer. You have to actually fix the entire system.

Wade: Juliette, where do you suspect the answers are? At what level?

Juliette Kayyem: I think it's a combination. I mean, I think the market will drive certain behavior. People getting off these platforms will drive some behavior, so that the companies actually become responsive. But I do think there's a role for legislation. And I think the failure to legislate is giving these social media platforms their own legislative ability. I mean, I think one of the geniuses of what Facebook and Mark Zuckerberg were able to do for many years is to convince us that they had no agency. Right? But as we realized, that convincing is agency. Because they actually do have the capacity to adjust the algorithms to slow the pace down.

So legislation would make sense. It's not going to be perfect. Think about it a little bit like gun legislation, right? You're not going to end all gun violence, but your goal is to have legislation around, say, gun control. You want to minimize the risk and you want to maximize defenses. It's as simple as that. This is Security 101. So you could have rules of disclosure for fake accounts, for bots. You could have rules of disclosure about what standards these social media platforms are using in terms of what kind of speech is allowed. You could have the Honest Ads Act or something like that, in which you actually have to certify who is, in fact, buying this ad. Privacy. And then there is a role for liability. Right? Because if you look at the extremism out there, and in particular the people of color and women and people with more diverse backgrounds who are suffering, the fact that there seems to be no responsibility for that kind of activity occurring on these social media platforms seems to me ridiculous from a governance perspective. You're starting to see that change. There was actually a woman who just got major money from a court, at least from a website, because of harassment on online platforms.

So there are a number of solutions. None of them are going to solve the problem, but they will create a narrative in which this stuff can be regulated. And I think in the absence of starting that narrative, our hands are thrown up. Can I do one more quickly?

Wade: Yep.

Juliette Kayyem: Which is 2020. So if you ask me, as a counterterrorism and national security person, what would I do if I were the Russians? How do I win? If I want Trump to win, I suppress 10,000 African-American votes in Michigan, and I've won. How do I do this? I don't do it the same way I did it before. I'm going to bring down a couple of critical infrastructures. I'm going to do fake news that there is an active shooter. And maybe one other thing. The social media platforms have got to get their act together so their platforms are not used to create fake news that impacts the ability of people to get to vote, to actually do the physical act of voting. And that's where government can come into play, state governments come into play, to get them to focus and get that stuff done, because that's what I'm worried about.

Raffi Krikorian: I mean, what I would do for 2020 is something very similar. But, you know, I'm obviously a computer scientist, and I don't have a diverse educational background. We just need more historians. We need more history-of-science people working on these platforms, because I feel like we're repeating a bunch of lessons, just at a very accelerated pace right now. When it comes to 2020, it's not like the Russians aren't playing from a playbook they've been using forever. They're just doing it in a very fast way, on a new platform and technology. And we're just, for whatever reason, not looking at the historical context and figuring out how to combat that.

Wade: Okay, so speaking as a historian of technology and podcaster who did an episode about how Facebook is one of the biggest threats to democracy—Juliette, you were talking about agency. Does it ever come back to the individual? How much of a difference can it make for individuals to stop using Facebook and Twitter or other platforms, or use them less, or use them more wisely?

Juliette Kayyem: I think there is. As I said, we talk about layered defense in security. So think about airport security. We hate it. But the truth is, there's a lot going on there; it's layered defenses, so that you don't have what we call a single point of failure. That's how we set up security in the physical world. And I think because we call it technology and cyber and it's fancy, we think that security on social media is different. It's not. It's minimizing risk, layered defenses, all the same keywords. So one area where I do think the layered defenses are a part of it is educating people, so that when they see things that look too good to be true, like "the Pope endorsed Donald Trump," the social media platform has responsibility, but so do individuals, for recognizing that popes tend not to do presidential endorsements.

Wade: Raffi, do you think individuals can make a difference?

Raffi Krikorian: Oh, obviously, individuals can make a difference. However, I also want to say that we shouldn't blame the victim. I think that a lot of this should fall onto platform providers. A lot of it should fall onto some form of smart regulation around all this. You know, we have the smartest people in the world working on ad-targeting platforms. Why aren't we building better UIs for people to make sense of what's going on, to make sense of the information they're seeing?

Wade: So we've got about five minutes left. So if there are any Slido questions or if there's anybody in the audience who has questions, this is the time. So I see your hand raised right here, sir.

Audience Question: Just a simple observation, more than a question, but looking for a response. Many of these activities are not particularly new. Yellow journalism, which took us to war at least once, maybe twice. Talk radio, which was endemic in the United States, probably still is. With some trepidation I will also include churches, which also get into a political role. It's a very complicated problem. I don't think people are very educatable. Maybe we can educate ourselves, but that's not what we're talking about. It's a combination of entertainment and fact-passing. So I'm being a little pessimistic. Any comment?

Raffi Krikorian: I mean, I want to be optimistic.

Juliette Kayyem: Yeah, we're going to be optimistic now.

Raffi Krikorian: In the grand scheme of things, I want to believe that people are educatable. I think that if you think about local news, local news sometimes is the most informed about local issues. If we're only seeing what The New York Times covers, that's a very particular lens, whether it be national commentary or politics. But if you want the real facts, go talk to that beat reporter who actually went there and spoke to the people on the scene. And the Internet provides that ability to connect down at that level and find out everything that's going on. I remember in one of the elections, when I was still a graduate student in this building, or rather this building didn't exist, it was the building next door, we had the debate going on on the screen. And this was revolutionary at the time: live fact-checking happening in the moment, scrolling in one of the columns, you know, in ANSI type or whatever. That's the vision. We need to get back to that world. We need to be able to connect diverse voices in a way that allows you to have a really good diet of information. And the world we live in right now is not that. But I'm optimistic and believe we can get there.

Juliette Kayyem: And I think there are ways also that social ... I mean, I'm optimistic, too, that social media platforms can amplify, and I know it's a hated word now, but experts. In other words, think about the anti-vaccine [movement]. And this is really interesting today. There's always been a latent anti-vaccine movement, but it's actually being pushed by the Russians right now. That disinformation in health security is being pushed by the Russians, because their goal is to unify the base and disperse the opposition, however Russia is going to define those at any given stage. So, just like what we were talking about with Pinterest, and, you know, Facebook: most anti-vaxxers get their news from Facebook. Most non-anti-vaxxers get their news from their social network. Their real social network: doctors, friends, or whatever. Where can we figure out a way in which those experts—not a bad word, people who believe two plus two equals four and not five—can be amplified as well? So that we have a common baseline that vaccination is good and the plague is bad.

Wade: So to keep things moving along, thanks, I'm going to go to a question that's coming in on Slido. That bottom question. How much of the problem with social media would be solved if we could eliminate anonymity on the Internet?

Raffi Krikorian: I mean, I believe that a lot could be fixed. I don't think everything would be fixed. Eliminating anonymity doesn't necessarily solve the security issue. A lot of what we had to deal with at the Democratic Party was actually real accounts that were built up over time and then hijacked and used. So again, there isn't a silver bullet for these things. However, you know, there is a place for being able to stand up behind your beliefs and talk about them. I'll draw just a very quick analogy: the caucus system. If you go to an Iowa caucus, everyone has to stand up and argue for which candidate you want to support, or which delegate you want to support, or whatever it is. That turns off a lot of people from participating. That's a hard thing for a lot of people to go do. So some form of anonymity is actually good. Could you make an argument that the pendulum swinging away from anonymity to true identity might temporarily fix the problem? Sure. But again, that also isn't how the world really works. So we should try to figure out the right balance of things.

Audience Question: So I don't know if this is a question. But I listen to a lot of folks who talk about this. It's one of my interests. And Raffi, I think I hear you saying something that's a little different than what most people are saying. I hear a lot of talk about, you know, regulation in terms of monitoring what's said or viewpoints or things like that. But I think that you are actually bringing out the idea of rejiggering the networks to universally make us smarter, instead of just limiting what's being said and so on. I wonder if you could just expand on that a little bit?

Juliette Kayyem: Absolutely.

Raffi Krikorian: I'd love your take on it, too. But for me, the situation is that a lot of these networks have—this is an overused phrase in media these days—made us the product. Right? In a lot of ways. As opposed to making a service that makes humanity better or makes society better. What we need is people who are building things in the public interest, as opposed to building things that are specifically set up to make a lot of money for these particular corporations.

Juliette Kayyem: Yeah, I mean, I agree. This is not binary, either it's on the platform or it's off the platform, right? It's whether you can set up a system that slows it down. In my world, you know, New Zealand was one of those moments where you're like, nothing has been fixed. You just cannot believe that that stuff remained online for as long as it did. And the reason why is because the more people from his network that were watching it, the more it got played out. That would seem to be correctable from a technology perspective. So there are a number of things that you can do to either promote the two-plus-two-equals-four people, promote the experts, and demote the guys with a gun in a mosque.

Wade: Well, that was not a total buzzkill. Thank you very much. Can we have a hand for our panelists? Thanks.

Wade: Soonish is written and produced by me, Wade Roush. Our theme music is by Graham Gordon Ramsay. All of our other music is from Titlecard Music and Sound in Boston. A big thank you to Xconomy and the World Frontiers Forum for letting me share the tape of my conversation with Juliette and Raffi. The World Frontiers Forum is a non-profit based in Cambridge, Massachusetts, that brings together young technology pioneers to tackle sustainable development goals. You can learn more about their work at worldfrontiersforum.org.

Wade: Soonish is a proud founding member of Hub & Spoke, a Boston-based collective of smart, idea-driven podcasts. As I said at the top of the show, we're thrilled to be joined by Mark Chrisler of The Constant. You can find the latest episode of that show at hubspokeaudio.org, and you can browse the whole archive at Mark's website, constantpodcast.com.

I also want to tell you about the latest episode of the Hub & Spoke show Iconography. Host Charles Gustine wraps up his season-long exploration of the mythology of the Pilgrims with a visit to, of course, Plymouth Rock. Now, one new thing I learned from this episode was that, just like the Pilgrims who allegedly touched down on it, Plymouth Rock is itself a kind of immigrant. It's a big chunk of granite and diorite that formed 630 million years ago as part of the Dedham Formation in what is now central Massachusetts. During the last Ice Age, it got scraped off by glaciers and carried to its date with destiny in Plymouth Harbor.

Charles reconstructs Plymouth Rock's strange biography literally piece by piece. I've actually got a cameo voice acting role in the drama, as does Lonely Palette host Tamar Avishai. And in the Department of Bizarre Coincidences, Charles talks a bit in the episode about the Greek-style portico that was built in 1920 to protect Plymouth Rock. Guess the name of the architect who designed that portico. It was none other than Chesley Bonestell, the space artist I talked about in the previous episode of Soonish. Before Bonestell started painting his hyper-realistic magazine illustrations of spaceships and distant planets, he was an architect, and the Plymouth Rock portico was one of his most important commissions. You can take an audio journey to Plymouth right now at iconographypodcast.com.

Special thanks to my top supporters on Patreon, including Kent Rasmussen, Celia Ramsay, and Paul and Patricia Roush. Hi, Mom and Dad! Thanks also to Jamie Roush, Lucia and Warren Prosperi, Victor McElheny, Andy Hrycyna, Steve Marantz, Elizabeth Blanch, Chuck and Gail Mandeville, Ellen Leanse, Mark Pelofsky, Graham Ramsay, and everyone who pitches in to support the show. You can join this visionary crowd by signing up to make a per-episode donation at whatever level feels comfortable for you. Just go to patreon.com/soonish. Thanks for listening, and I'll be back with a new episode...soonish.