S05 E04: What are our Civil Rights on the Internet?
Vanessa: Welcome to “Pod for the Cause,” the official podcast of the Leadership Conference on Civil and Human Rights and the Leadership Conference Education Fund, where we expand the conversation on the critical civil and human rights issues of our day. I’m your host, Vanessa Gonzalez, coming to you from beautiful Washington, D.C. And like we start off every show, we’ve got the Pod Squad.
Vanessa: That’s pretty good, where we discuss pop culture, social justice, and everything in between. We have got some amazing folks on the Pod Squad today. First up, we have Laura Murphy, President of Laura Murphy & Associates, LLC, and author of “The Rationale for and Key Elements of a Business Civil Rights Audit.” Hi, Laura.
Vanessa: Thanks for joining us today.
Laura: Happy to be here.
Vanessa: And next up, we have Sofia Ongele, founder of ReDawn, and creative advisor intern at Kode With Klossy. Thanks for being here, Sofia, appreciate it.
Sofia: Thank you for having me.
Vanessa: In this episode, we’re talking about the intersection of technology and civil rights. While we all know about the dangers of social media, and I know you know them, there’s a larger issue in addition to your Facebook feed getting hacked. We know that technology typically races ahead of the law when it comes to ensuring our civil rights, privacy, and safety. But what happens when technology is actually designed in a way that is hurting us? Before we jump in, I just wanna set the stage with some quick definitions and statistics. Today you’re gonna hear about a civil rights audit. A civil rights audit is an outside, independent report that evaluates civil rights and racial equity issues in a company, and provides a guide to solving them. There has been an increase in the use of artificial intelligence, or AI, by businesses and government that has life-changing implications while deepening harmful discriminatory practices. Unfortunately, those from low socioeconomic statuses are disproportionately impacted by automated decision-making, digital profiling, and AI. So for example, Facebook, can I still call it Facebook, or am I supposed to call it Meta?
Laura: I think you can still call it Facebook.
Sofia: I think you call it Facebook. I feel like calling it Meta legitimizes the fact that they’re trying to rebrand themselves, which they’re Facebook. They’re Facebook.
Vanessa: Yeah, thank you. Facebook’s ad platform still offers multiple ways to discriminate by race and ethnicity, despite backlash and boycotts, and this can be exploited by bad actors. As we know, 72% of American adults are on social media, and Facebook remains the second most used social media platform, just below YouTube. And the use of social media increases as age decreases, making younger people the majority of social media users. So Laura, in your report, “The Rationale For and Key Elements of a Business Civil Rights Audit,” you explain how you can’t wish, pledge, talk, or tweet solutions to structural racism, inequity, and discrimination, you really have to roll up your sleeves and do the work. So how do you do that? How do we even create an audit? How do you start that process? And then, what does the work look like?
Laura: Well, the audits in the case of Airbnb, Facebook, and Starbucks, which I cover in my report, were generated because of consumer complaints and user complaints. In the case of Starbucks, there was a famous arrest of a Black man who was just sitting there waiting for another person to join him, and the manager called the police on him. And in the case of Airbnb, there was concern about how Black guests were treated by hosts, whether they were getting the same bookings, or whether they were being rejected more often. In the case of Facebook, there were numerous complaints about hate speech, civil rights activists being kicked off the platform, and voter suppression. Remember when Cambridge Analytica got hold of the Facebook data, and the Russians used it to suppress voter turnout in 2016? Some people think that contributed to the election of Donald Trump by giving people misinformation and encouraging them to stay home.
So in all of these cases, the audits came about as a result of issues. But an audit can also be voluntary. And we’ve seen of late companies voluntarily taking on racial justice or civil rights audits. It’s when you hire a civil rights expert to come in and look at the company’s products, policies, practices, and services. And this is different from asking a company to hire more minorities. This is like, are you hurting communities by locating a toxic waste site in their neighborhoods? Are you hurting communities by treating Black people differently than white people? Are you hurting communities by not making your products accessible to people with disabilities? You really go in and you talk to stakeholder groups that have lodged complaints. You look at the lawsuits that have been filed against the company. You ask, are your advertising practices, for example with Facebook, discriminatory? Are they preventing people of color from seeing jobs, or getting credit, or seeing housing? So these are real-life harms that an auditor comes in and assesses, and works with the company to come up with solutions so that people are not hurt.
Now, people say, “Well, you know, the companies aren’t doing enough through these audits, is this enough?” Well, it’s never enough. You know, civil rights laws are not self-enforcing. So how do you get them to be enforced? You either file a lawsuit, you get an Act of Congress, or you get voluntary compliance. And we’re saying, to all the corporations who put Black Lives Matter on their masthead in the aftermath of George Floyd’s murder, where is the beef? Talk is cheap. Sure, you can give contributions to Unidos and the NAACP, that’s all very important. But are your products hurting our communities, and how can we get you to stop hurting our communities? And it also means, are you hiring people of color, and women, and people from the LGBTQIA community in positions of power so you don’t keep making the same mistakes?
But you can have Black, Asian, Latino faces in high places, and still sell messed-up products. Amazon sells facial recognition technology that can’t adequately tell one brown person from another, and they sell it to police forces. And so, you know, you see more and more local communities putting a moratorium on facial recognition technology, and I think this is where Sofia and her work are so important, because those technologies hurt our people. And so, that’s the kind of thing a civil rights audit gets to. And I think, while it is not a silver bullet, it’s not a panacea, it is a method of getting to these issues voluntarily. It involves the CEOs of the companies. You can’t give this to your diversity and inclusion person, you have to give it to a senior executive to work with an outside auditor to come up with a public report that details what the problems are, and how the company is going to solve them.
Vanessa: Thank you so much for that. Thank you for putting it in real terms so we can really see, okay, this is how this impacts our lives daily. I think a lot of times when people think about technology, one, you don’t have to know the ins and outs of technology to know it’s hurting you, or that a program has been written to exclude you specifically, right? And so I really appreciate you putting it in the context of our day-to-day, it is also kind of a downer because I think when you go online sometimes, you go online as an escape, right? You want to kind of just get out of your bubble or you go for help.
Laura: But wait, people aren’t buying newspapers to find jobs. They’re finding jobs online. They are often going to Facebook to find jobs, they’re going to Google, they’re going to Monster, they’re going to all of these online platforms to find jobs. So while we often use social media and digital platforms for escape, increasingly, if you don’t have a Facebook account, you can’t find out when your child’s school is closing. Or you can’t find out when there’s a community meeting.
Vanessa: It’s very true.
Laura: And so, they have become monopolies, and they’ve become almost like utilities. Like we expect running water and electricity, we expect that they will be up and running, and we use them not just for pleasure, but for core life activities. Like, finding apartments, how many people do you know go out and buy a newspaper to find an apartment?
Vanessa: Right. Right. And I remember back in the day they used to have those giant books, and now, as someone who is myself looking for a new place to live, it’s all online. And I kind of put my trust in that what they’re showing me is what I need and everything that’s available to me.
Laura: But suppose the algorithm is constructed in a way that excludes you from seeing it because you fit some socioeconomic criteria, and that was the issue with Facebook. There was a lawsuit saying that people within certain ZIP codes who had certain characteristics could be excluded by advertisers from seeing an apartment, and you know where you get an apartment can be life-changing. Like, if you’re a single mom, you’re looking for an apartment in a more upscale community and you can’t see it, that affects your child’s mobility for life because maybe they will get into a better public school system. So these are very much life and death issues that affect our success.
Vanessa: Yeah, thank you so much. Thank you for that. It definitely does shape your life, and I think it is definitely stuff we take for granted, right? Sofia, I want you to get in here. So when we talk about all of the amazing things Laura has lifted up and hit on, and we’re gonna talk about the voluntary companies and what that really looks like. But I want to hear from you, Sofia, a bit about why the accountability piece on this is so important. And really, what does that look like? Because, when we think about some of the dangers, in particular, that girls, and young girls, and young girls of color face on these platforms, you know, what are you asking folks to do to improve the safety?
Sofia: Yeah, I think those dangers really need to be zeroed in on, because I feel like a lot of people don’t necessarily realize what they do. Like, speaking personally, for me and my sisters last summer, just seeing violence against Black bodies on our feeds constantly is an incredibly traumatizing thing, and you don’t really realize it until the aftermath, when you get in this state of being literally clinically depressed. I mean, Haugen’s data really spoke for itself. There’s negative comparison among teenagers. There’s FOMO, fear of missing out, when someone’s doing something and you’re not. It increases the risk of eating disorders, of suicidal ideation. All of these issues that genuinely…when you’re someone who’s ages 13 to 18, even younger…I started on Instagram when I was 11. You’re not supposed to do that, but we learn arithmetic before the age of 11. So I knew what year I had to be born in, as most of my peers did, as well.
But we get on these platforms at such a young age, and it changes the way that we think, both about ourselves and the systems around us. And I think that machine learning-based algorithms don’t put people first, they put profits first. There isn’t some humanity filter behind this system that’s creating these algorithms over and over based on the content that you interact with. And if you look at, you know, Senator Blumenthal’s experiment, I don’t know if you had seen it, but his office created this account as if it was a teenage girl, interacted with things that suggested she was going through some kind of body dysmorphia, and then the account was served information and content that exacerbated that, more eating disorder-related content that would make the issue worse. So it’s very, very clear that Facebook isn’t actually caring about their users in this process.
Evidently, Facebook cannot do this on their own. They’ve proven time and again that without some kind of outside intervention, they’re not going to work to make their platforms safer. I really hope that it comes at a legislative level so that there is at least a basis across other social media platforms because while Instagram is doing this, there’s other platforms we’re getting on at increasingly young ages like TikTok. I mean, I personally use TikTok multiple hours a day. And while I love it, I know that there’s always a flip side to that with the way that they’re creating these algorithms.
Vanessa: No, thank you for that. Because I think, one, 11 seems so young. I’m much older than you, and when I look at TikTok and I look at Instagram, you know, I can laugh at some of this stuff. And I can actually discern like, all right, I know I’m getting this because I searched for something else, right, because I didn’t grow up with this. And so to your point, how that can change the mind and the way that children process information, and even how you expect to receive information, I would assume would kind of just shift and change, and then it’s all being left up to these companies. And to your point, it’s not about people, it’s about profit, right?
Sofia: It really isn’t. I’m sure that you’ve seen this as well, but they were going to make an Instagram for kids. Kids younger than the age of 11. Children. Children in elementary school should be playing outside. I mean, if we’re being real with ourselves, I don’t think that a company making an app for kids is a sign of a company that’s doing very well in the first place. They’re trying to reel people in at younger and younger ages so that we create this mental dependency on this product that we don’t actually need to function in our day-to-day lives.
Vanessa: It’s pretty gross.
Sofia: It is. It is very gross.
Vanessa: Thank you for that. I really appreciate that. So let’s link these two things together. So, Laura, we have companies like Airbnb and Starbucks which, as you mentioned, were in the news after a couple of incidents blew up. You know, they’ve gone through some of these audits, they’re going through examining, really, I think, what the structure and the bones are, right, as a company. And like you said, you can’t give this to your diversity officer. This is bigger. Companies that try to do that, it’s not sincere, right? It’s like, “All right, you’re just trying to say you did this and you didn’t really do this.” Can you talk about the impact that you have seen? Have you seen changes once companies undergo some of these audits?
Laura: Absolutely, absolutely. Like in the case of Airbnb, I remember a conversation with the CEO Brian Chesky saying, like, “What can we do to discourage people from engaging in discriminatory behavior?” You know, I said, “Okay, we’re working on changing your anti-discrimination policy,” so those have been improved. He said, “I want something more than that. I want people to attest to the fact that they won’t discriminate as a condition of signing onto the platform.” At his encouragement, we came up with something called the Community Commitment, which is an agreement that you have to sign as a condition of getting on Airbnb to treat others with respect, without judgment or bias based on race, religion, national origin, ethnicity, disability, sex, gender identity, sexual orientation, or age. And you know that cost Airbnb 1.4 million users who refused to sign that.
Vanessa: People who refuse to sign this?
Laura: Yeah. When he came up with this idea, I said, “Well, I think it’s fantastic, but it’s gonna cost you.” And he said, “I’m okay with that.” And you know what? The company is hugely profitable. They just went out for public offering last December, and they are just plowing away. And speaking of the public offering, we got 12 women-owned, minority-owned, and veteran-owned firms to be part of the public offering, so those firms helped offer the stock to the public, and we got them to create diverse candidate slates. So the numbers of people of color in positions of power at Airbnb have shot up. They’ve come up with a groundbreaking way to measure discrimination on their platform so that they can fix it. It’s called Project Lighthouse, and it’s gotten awards and recognition, and it’s being used as a model by other companies like LinkedIn. Airbnb doesn’t collect your race. So they came up with a system to guess what your perceived race is, so that they could measure whether people of color were actually getting fewer listings than they requested. And now they are in the process of reducing any gaps in the booking. So there’s real work being done.
They have a steering committee that meets very regularly. The senior managers are in charge of the anti-discrimination effort, it’s not the diversity and inclusion person. At Starbucks, I didn’t work on that. That was done by former Attorney General Eric Holder. They hired a chief inclusion and diversity officer, which they didn’t have. They achieved pay equity for all of their employees, they resolved the outstanding EEOC allegations, they revised their anti-discrimination and harassment policy. So both those companies came up with very, very specific and tangible responses to eliminating…well, we’ll never be able to eliminate all discrimination but to reduce discrimination severely on their platform.
None of these companies that I talk about in my report have lost an ounce of profitability as a result of this effort. And quite the opposite, I think, if you’re a forward-looking business, and you look at the upcoming workforce, it’s going to be millennials and Gen Zs. Gen Zs are people born in 1996 and afterward. Those are the most diverse, well-educated populations. They’re more educated than the baby boomers, and they’re more opinionated. They’re demanding more of the corporations that they use, and that they work for. And so if a business is being smart, and they’re looking at how their consumer base is changing, how their workforce is changing, they are going to make these changes now to position themselves to treat all of their customers better so that they will build customer loyalty. I mean, this is in their enlightened self-interest. You either fix these problems now, or you let them fester, you become the target of boycotts or social media campaigns, fix it now. You’re not gonna hurt your profitability, you’re going to make yourselves a better company, and you’re gonna stop hurting people. Don’t you wanna stop hurting people, corporate America?
Vanessa: You’d think so. You’d think so.
Vanessa: No, and I appreciate it. Like, we have to be real, right? At the end of the day, you know, there are some good people, obviously, and there’s people that wanna take steps forward, but they are also in corporate America, right? So it does come down to the bottom line. However, if keeping your bottom line in mind also makes you do better for the community and for the people that you serve, great. Like, let’s do that. I think that’s a really interesting kind of case study, that 1.4 million Airbnb users decided not to sign this simple pledge.
Vanessa: That is so scary and telling.
Laura: To me, it’s surprising it’s not more because we’re in a racially polarized moment like I’ve never seen in my lifetime. And I have a pretty long lifetime. And I have not seen it this bad where we’re not talking to each other. We’re not speaking to each other. We’re demonizing. There’s so much hate floating around the digital universe. If it’s only 1.4 million, I think that that’s a win.
Vanessa: I love that. See, that’s such a good way to think about it. When you think about everybody, right? The digital universe as you said and everybody, 1.4 is kind of nothing.
Laura: Well, you know, you got over 300 million people in the United States alone, and this is something that applies to Airbnb worldwide. So that’s pretty darn good to me.
Vanessa: That’s amazing. Whenever I’m on social media, or when I’m booking Airbnbs, I often think that the only way they know my race and ethnicity as a Latina is because, you know, I have a Z in my last name, right? My last name is Gonzalez. And I’m always curious about how that works. I have a daughter who is also Latina, and her father is white, so she has both of our names. We do get different things when she uses her dad’s name versus my name. And I’m always like, “Ah, I see what they’re doing. That’s not okay. That’s pretty shady.” We’ve been talking about all of that great stuff, and how they’re really putting in some of the structures, which will hopefully make a long-term impact and change, right, because it’ll change the way that people do business. That’s hopefully what will happen. Sofia, when you think about your work, how will you know these things have worked?
Sofia: I feel like the way that you know that it’s worked…there are so many new technologies that have been introduced, like surveillance capitalism and facial recognition, all of these things that have huge real-world consequences, specifically for people of color. And I think the day that we see companies across the board really seeing these technologies for what they are, recognizing that they’re used to discriminate, that that’s really what they were made to do, that’s when they’ll have diversity at the table, people actively making the decisions, including the engineers. I feel like this is a very huge problem with big technology. All of the engineers kind of look the same. So it’s like, when I walked into my first computer science class, I obviously did not see anyone who looked like me, because you don’t see Black girls in technology.
But it’s incredibly important to have people who look like you in those spaces because, of course, it convinces younger generations, like, “Oh, this is kind of cool, maybe I should look into this.” But also having those people, they’re able to rationalize with the people on the other side better because they have the ethos, they have the skillset, and they also have the life experience. You know, being in their body that the other people wouldn’t. So I think it’s really diversifying our spaces, particularly in engineering, and making sure that the technologies serve everyone, not just a select few.
Vanessa: Yeah, thank you for that. That is such a good point. We talk a lot on this show about Dr. Kimberlé Crenshaw, and the intersectionalism that is needed when you’re tackling these things, right? We can’t say technology will get better if we don’t think about what is the workforce that’s coming in in technology, and what are the schools looking like that are feeding these engineers into these giant tech firms? So, absolutely right. This goes all the way back to some of the foundations that we’ve been talking about, right? And we know that those foundations within themselves are broken. So how can we start looking at this more holistically, and building really from the ground up? So thank you so much for bringing that into the conversation.
Laura: And you know what? I just wanna say how important intersectionality is. I remember talking to a group of Black Facebook users, and some of them spoke up and said, you know, “We feel like we have problems on Facebook because we’re Black, but we really feel like we have problems because we’re Muslim.” If I had just gone into my civil rights analysis just looking at race alone, I would not have come across the way Muslims feel like they’re treated on the platform. Look at ourselves, we are not just one thing. You’re Latina and you’re a woman. And how do you know when you go into a restaurant, or into a work situation, are people giving you different treatment because of your ethnicity or race, or because of your gender?
I advocate in my report for an intersectional approach that we ought to look at national origin, we ought to look at race, we ought to look at gender identity, sexual orientation, age, disability because often we go in thinking a company has one problem, and sometimes they have multiple problems. And why not use civil rights to lift up all boats? I wanna make civil rights sexy again, you know? I want to get people talking about all of these isms, and how blind exclusion and blind discrimination is self-defeating for our nation, is self-defeating for our economy, is hurting us around the world. And until we embrace and address all of these isms, you know, none of us is going to be really free.
Vanessa: One hundred percent, yes. I love that. I think it is beyond true, right? It is the reality of what we’re living. And until we actually start to express that, and feel like we can freely express that, and live in the world as our true authentic selves, all of ourselves, we’re gonna have these problems. And I do think, to your point, when you look at things through a civil rights lens, people are typically like, “Okay, so we’re talking about race and ethnicity. Okay. And…” right? We have to start expanding that. And I think, you know, we’re all trying to get better about it. But I think when you think about technology, which is supposed to be the unifier, which is supposed to help people come together from different backgrounds, different abilities, you name it, you start to understand it’s actually not. It’s actually set up to exclude some of these folks, and it’s silencing others, and muting some voices. And so I think it has done some good in the world, I don’t think we can say there has not been some good done. Mr. Floyd’s murder, obviously, we all saw those videos, and I know I saw them via social media first, and then that kind of cued me in to what was happening. I think there have been some positive things, but then, I think you kind of have to look at it again, holistically, right? We still need to fix a lot of stuff there.
Laura: We do. And I very much relate to Sofia’s talking about the trauma she experienced in the aftermath of all of this. Seeing these videos made me have a different relationship with the news. And I’ve been working at a kick-butt organization for most of my career, the ACLU, where controversy is oftentimes our friend. This saddened me and made me hurt in a way. I don’t think we can afford this loss of hope, you know? We’ve got to find a way to rejuvenate ourselves and heal ourselves. The world is an ugly place. As leaders, I think we have to create a path of optimism, because how do you get people to follow you if you’re spewing negativity? It works for some hate groups, I won’t deny that. But you have to give people a vision of the future. We have to encourage each other, and we are going to endure setbacks. But we have to move forward, and we have to find allies in different communities in order to do that. I just think we need to talk about civil rights, and we need to talk about how we all benefit from civil rights.
Vanessa: I wanna dig into one piece of what you just said. Sofia, when we talk about the harm that some of these videos can do, right? But I just mentioned as well, unfortunately seeing some of these videos, and we don’t want to replay them over and over again, right? Because folks are humans, and they should not have their lives played out like that on social media right next to advertisements of whatever the latest fitness craze is, right? But I think for many people, this is the first spark where they’ve been able to see really how bad it is for some communities. It has been through those videos. So when you think about the level of awareness that some of this can bring, but also thinking about the trauma that it can also inflict on people, how do we have that conversation about some of that balance?
Sofia: Part of the issue stems from folks needing to see people being brutalized to believe that it’s happening. Activists and civil rights leaders have been saying for the longest time…We saw Rodney King, I mean, I didn’t, I wasn’t alive yet. But we saw Rodney King.
Vanessa: Ooh, she wasn’t alive yet.
Laura: I’m okay with it, Sofia. I don’t discriminate based on age.
Vanessa: Just throw it in there. Yeah.
Sofia: My point being, though, that people have seen this happen. That we’ve learned, and we know, and the stats speak for themselves, police brutality is something that is happening. I don’t necessarily understand the reasoning behind needing to see the brutalization of people, like seeing people actively being shot, being knelt on, being beat up, truthfully. It feels like a bit of…I don’t want to say that people are sadistic or anything, because I feel like people have the purest of intentions at heart when they’re trying to learn about these things. However, people need to be listening to the people of color that this is affecting. Seeing doesn’t have to be believing. I think that hearing people’s own testimony should be just as powerful.
Vanessa: Ooh, I love that. You’re right. Believe people when they tell you their truth, believe communities, there’s no reason you have to see it in front of your face. Thank you for that. I wanna talk about something that shook a lot of people up, myself included, because I communicate with my people on WhatsApp. But when that outage happened, oh, you would have thought, oh my god, people just started spiraling. I think all of you know what I’m talking about. So Facebook, WhatsApp, and Instagram all faced outages. It had a couple of different impacts, right? To some, this was a minor inconvenience, like a nice break from social media. It also, I think, showed people like, “Oh my God, how many hours do I really spend on this site?” But I also think for those people it was a luxury. For others, particularly immigrants and folks from other communities, they use WhatsApp and some of these systems to communicate with their family, to keep ahead of late-breaking news, those types of things, for their own safety. Let’s talk a little bit about what happens when these systems go down, whether it’s for better or for worse. I mean, Laura, what do you think when these systems crash?
Laura: Well, it just tells you what a monopoly power Meta/Facebook is. They have control over so many systems of communication that we deem to be fundamental. I really think, unless there is more competition, we are going to be vulnerable. And I think Congress is really talking about this. I think it didn’t just shake users, it shook legislators, you know, about how dependent we are on a private corporation. I think the question is fairly asked, should some of these companies be treated like utilities? And should they be regulated? Every trauma that Facebook reveals, every scandal, every whistleblower, every outage keeps inviting that question: how do we protect the public when the public is so dependent on one privately owned corporation?
Vanessa: Pretty scary. It’s very scary. Sofia, what do you think about when these outages happen? Are they helpful for people to take that mental health break?
Sofia: I feel like it depends on your positionality. Personally speaking, WhatsApp is the only reason that there are no longer calling cards in my house. Truly, you’re able to reach people abroad without needing to pay extra. However, you look at the number of small businesses in countries in the Global South that rely on WhatsApp and Facebook exclusively to run their businesses, and from what I understand, their economies took a huge hit that day. It depends on which areas were awake, of course, because I think it was a six-hour outage. So, of course, some people were probably asleep. But even so, you look at the sheer number of people who were so deeply affected, whether it be communicating with their loved ones, or political organizations, or otherwise, small businesses that keep their countries running. It’s exactly what Laura was saying, it’s incredibly unfair that it is one corporation that everyone is dependent on to really keep the wheels turning. And I think that we need to look at, both as a nation and also as a people across the entire world, because so many of us are using these platforms, what needs to be done to make sure that they don’t have nearly as much power, nearly as much stake as they do in running legitimate systems all across the world.
Like, I remember, because a good amount of my family is in Uganda, they had to turn off the internet for the day when they were having elections there, because misinformation festers like crazy on these platforms, especially, I’m sure as you’ve seen, where they don’t have content moderators who speak the language that people speak there. Hate festers on the platform, and misinformation festers on the platform. Like with the genocide in, I believe it was Myanmar, that was fueled by what was happening on Facebook, and they hadn’t done anything about it because what could they do? They didn’t speak the language. And I think that we need to be much more critical of how these systems affect not only our day-to-day lives but the world.
I feel like there are so many grave, grave, grave consequences on a humanity level that really affect our ability to continue as a people. You look at the climate misinformation on these platforms as well, and the nations that are going to cease to exist as a result. Like, in Bangladesh, as of 2025, 1 in 5 residents will no longer have a home. In an area that is so densely populated, you have to ask what is causing this to continue. And in a lot of ways, it’s these platforms that we are so reliant on. So we really have to reevaluate our relationships with them for sure.
Vanessa: Well, we are coming to the end of our time. And so I want to ask you all the big question that we end every episode with. In your work…and you deal with a lot of heavy things, and so I appreciate that you both are trying to keep that light going at the end of the tunnel. But Laura, can you tell us, what gives you hope?
Laura: Well, I’m talking to two people who give me hope, especially want to shout out to Sofia because as a Black woman, as a woman of color, having a co-major in political science and computer engineering, I just think that’s a fabulous and much-needed combination. So I think she’s gonna bring her A-game to the problems that we’re facing as a society. And Vanessa, your work, communicating these issues thoughtfully for an organization that I greatly admire, The Leadership Conference on Civil and Human Rights, that is the epitome of advancing civil rights for all people, that gives me hope. I just think hope is intrinsic to my faith. And as a person of faith, I have to believe that we each have the capacity to do good in society. I just see so many people who want things to change for the better that I am encouraged by that. And I don’t get my hope from cable television, and I don’t get my hope from the newspaper headlines, I get my hope from talking to people who are actively on the battlefield. So, I’m encouraged, and I think we all ought to be encouraged, and we need to give financial and emotional support to people who are fighting these battles.
Vanessa: Uh, I love that. Yes, thank you, thank you, thank you for that. All right, Sofia, who we’re all gonna work for one day.
Sofia: Maybe not. The thing that gives me the most hope is probably people, not only people who are angry but people who translate anger into positive action. Because I feel like it’s very easy, I mean, personally speaking, it’s very easy to become very desensitized to a lot of things. I mean, like, Donald Trump became president when I was 16. Like, it was disheartening, but it felt normal. And I think that these things a lot of younger generations are taking as normal are not very healthy for our development to see: people being brutalized, or the planet being burnt to the ground, all of these things. But whenever I see people, whether it be like Greta Thunberg or Vanessa Nakate, who are actively speaking out and translating their anger into action, it makes me…I don’t know, I feel like without hope, nihilism becomes a very good option. But I think that hope keeps us alive. And if people are able to maintain their passion throughout all that goes wrong in the world, then I think that we’re on the right track.
Vanessa: Uh, that is fantastic. Thank you for such an amazing conversation today. I think we really wanna underline, you know, when it comes to Facebook, and Instagram, and WhatsApp, and TikTok, and man, the list keeps going, some people think they’re just silly little websites, or that it’s something that kids do. And I think people really, really need to start understanding that these platforms have an impact on our day-to-day life, on our future, as you’ve lifted up, on our Earth and how we lead on climate. Oh, and the disinformation and the misinformation impacted, as we saw without question, our last election cycle. It’s scary to think about what could happen this coming election cycle.
The Leadership Conference, we’re always trying to push out, you know, real sources of information to try to combat that, but it’s really hard when you have so many bad actors. And so, these systems are not just websites, right? They’re not just apps on my phone. These are very real influential systems. And I really hope that people start to take them for what they are, and call them into place, and demand action and changes. So, thank you all for being the vanguards in this, and in pushing people forward, and making them do more. We definitely appreciate it. Again, thank you for an amazing conversation. And thank you for joining us on “Pod for the Cause.”
Laura: Thank you.
Vanessa: Thank you for listening to “Pod for the Cause,” the official podcast of The Leadership Conference on Civil and Human Rights, and The Leadership Conference Education Fund. For more information, please visit civilrights.org. And to connect with us, hit us up on Instagram and Twitter @podforthecause. And also, you can now text us. Text “civil rights,” that’s two words, “civil rights” to 40649, to keep up with our latest episodes. Be sure to subscribe to our show on your favorite podcast app and leave a five-star review. Until next time, I’m Vanessa Gonzalez. Thanks for listening to “Pod for the Cause.”