May 26, 2022

Dear Mrs. Nikolova,

Thank you for your letter. Keeping our users well informed and fighting disinformation, influence operations, and misinformation is a key priority for Meta. We agree that a well-informed society is the backbone of a well-functioning democracy. We fully acknowledge our role in keeping our users safe and well informed, be it in Serbia, the Western Balkans or elsewhere in the world. We know that people want to see accurate information on our platforms – and so do we. We also know that misinformation and influence operations can be harmful to our community and can make the world less informed, particularly in the context of elections and other important social and political events.

First, let me share with you our approach to tackling misinformation, which we define as content that is false or misleading. To reduce the spread of misinformation on our services, we have implemented a three-pronged strategy, which we have been honing since 2016.

1. When we become aware of it, we remove misinformation that violates our global set of rules called our Community Standards, which are guided by our values on expression, safety, dignity, authenticity and privacy. These rules publicly explain what is and isn't allowed on our services, including strict rules on hate speech, voter suppression, harassment and inciting violence. This includes misinformation about the election process which could prevent people from voting – such as false news related to the dates, location, time, and methods of voting. We also remove misinformation which could contribute to the risk of imminent physical harm or violence, as well as manipulated videos (“deepfakes”) which are created by artificial intelligence and misleadingly depict someone saying something they did not say. In determining what constitutes misinformation in these categories, we partner with independent experts who possess the knowledge and expertise to assess the truth of the content and whether it is likely to directly contribute to the risk of imminent harm. This includes, for instance, partnering with human rights organizations with a presence on the ground in a country to determine the truth of a rumor about civil conflict, and partnering with health organizations during the global COVID-19 pandemic.

2. For all other misinformation, we focus on reducing its prevalence and creating an environment that fosters a productive dialogue. We focus on slowing the spread of hoaxes and viral misinformation, and on directing users to authoritative information. As part of that effort, we partner with more than 80 independent third-party fact-checking organizations (including partners in the Balkans) to review and rate the accuracy of the most viral content on our platforms. When a fact-checker rates a piece of content as false, we take a number of enforcement actions, including adding strong warning labels and significantly reducing its distribution so that fewer people see it.

3. Once fact-checkers rate something as false, not only do we reduce its distribution, we also inform people with more context and notify users who try to share it. Informing our users also raises their awareness. To this end, we are conducting educational campaigns on our platforms to reduce the spread of misinformation. We are currently working on launching a media literacy campaign in the context of the Russian invasion of Ukraine.
The campaign will be launched in partnership with local safety partners in Albania, Bosnia and Herzegovina, North Macedonia, Montenegro, Kosovo and Serbia. Its main aim is to show our users, through an informative video, how to detect and challenge potential false news and misinformation.

Let me also go into more detail regarding our third-party fact-checking program. To ensure high standards, all of our third-party fact-checking partners are certified through the non-partisan International Fact-Checking Network (IFCN) and follow IFCN's Code of Principles. These Principles include a series of commitments that organizations must adhere to in order to promote excellence in fact-checking: nonpartisanship and fairness; transparency of sources; transparency of funding and organization; transparency of methodology; and an open and honest corrections policy. This certification process is rigorous, supervised by independent assessors and verified by IFCN's board of advisors. Currently in the Balkans region, we partner with the fact-checking partners listed below and plan to expand the program further in the region this year:

● Serbia: Istinomer, AFP
● Bosnia and Herzegovina: Raskrinkavanje.ba, AFP
● Montenegro: Raskrinkavanje.me (Center for Democratic Transition), AFP
● North Macedonia: Vistinomer (Metamorphosis)
● Croatia: AFP, Faktograf

Through these partners, we cover six Balkan languages: Bosnian, Croatian, Serbian, Montenegrin, Macedonian and Albanian.

Here's how Meta's fact-checking program uses a combination of technology and human review to detect and demote false news stories:

● We use signals, including feedback from people on Facebook and Instagram, to predict potentially false stories for fact-checkers to review.
● When fact-checkers rate a story as false, we significantly reduce its distribution in Feed.
● Pages, groups, accounts and domains that repeatedly share false news also see their distribution reduced and their ability to monetize and advertise removed.
● We also want to empower people to decide for themselves what to read, trust, and share. When third-party fact-checkers write articles about a news story, we show them in Related Articles immediately below the story in Feed. We also send notifications to people and Page admins if they try to share a story, or have shared one in the past, that has been determined to be false.
● We use the information from fact-checkers to train our machine learning models, so that we can catch more potentially false news stories and do so faster.

In your letter, you also mention state-controlled or state-affiliated outlets. A critical part of our approach to misinformation is ensuring people are informed and understand the content they see on Facebook and Instagram, and who is behind it. That's why, since 2020, we have applied labels to state-controlled media on Facebook globally, so people know where this information comes from. In response to the war in Ukraine, we are taking additional steps related to Russian state-controlled media outlets. We are globally demoting content from Facebook Pages and Instagram accounts of Russian state-controlled media outlets, prohibiting ads from Russian state-controlled media, demonetizing their accounts, and making them harder to find across our platforms. We have also begun to demote posts that contain links to Russian state-controlled media websites on Facebook.
We label these links and provide more information to people before they share or click on them, to let them know that they lead to Russian state-controlled media websites. We have similar measures in place on Instagram. By providing this additional transparency, we aim to give people more context if they want to share direct links to Russian state-controlled media websites, or when others see someone's post that contains such a link. In line with European Union regulation, we have blocked RT and Sputnik across the EU given the exceptional circumstances, as well as in the UK following a request from the UK government.

On another note, it is also our goal to support media outlets with high journalistic and ethical standards. We aim to support them in terms of audience development and to help develop the business models on which they base their core business. We support publishers in various fields and help them develop their audiences by giving them access to free tools and services and by offering a wide range of free trainings and workshops that are open to all publishers and journalists. We also offer products and programs that help publishers further develop their business models. We have recently supported media outlets such as Ringier Serbia (www.ringier.rs), Radio Free Europe / Radio Liberty (www.rferl.org), and Antena 1 (www.a1.ro).

When it comes to disinformation (understood as the deliberate intent to mislead or manipulate) and influence operations, we are always looking for coordinated inauthentic behavior by violating actors, and we are actively working to find and stop coordinated campaigns that seek to manipulate public debate across our platforms. We have grown our team focused on disrupting influence operations to over 200 experts across the company, with backgrounds in law enforcement, national security, investigative journalism, cybersecurity, law, and engineering. We continue to expand our technical teams to build scaled solutions that help detect and prevent these behaviors, and we are partnering with civil society organizations, researchers, and governments to strengthen our defenses. We have also improved our detection systems to more effectively identify and block fake accounts, which are the source of much of this inauthentic activity. We recently released our pilot quarterly adversarial threat report, which provides a broad view into the risks we see worldwide and across multiple policy violations. Our public security reporting began over four years ago, when we first shared our findings about coordinated inauthentic behavior (CIB) by the Russian Internet Research Agency. Since 2017, we have reported on over 150 influence operations, with details on each network takedown, so that people know about the threats we see – whether they come from nation states, commercial firms or unattributed groups.

Finally, I want to address the topic of elections specifically. Protecting the integrity of elections, while preserving freedom of expression, is a top priority for Meta. We have made very substantial investments in safety and security, with more than 40,000 people working on these issues, and we spent approximately $5 billion on safety and security in 2021 alone. We have built new products and developed stronger policies to stay ahead of emerging threats.
Through improved technology, we are more effective at finding and removing people abusing our platforms, blocking fake accounts, and limiting the spread of false news and misinformation. For the Serbian elections held on April 3, Meta:

● Established an Elections Operation Center, bringing together subject matter experts from across the company – including native Serbian speakers and experts from our threat intelligence, data science, engineering, research, operations, policy and legal teams – so we could respond in real time to potential problems and challenges. This allowed us to remove content that violates our Community Standards or Community Guidelines faster, and served as another line of defense against misinformation.
● Supported voter turnout in Serbia by showing an Election Day Reminder on April 3, an informational pop-up message that encouraged people over 18 to go and vote and connected them with authoritative information from the Serbian Election Commission.
● Together with the International Foundation for Electoral Systems (IFES), a global civic NGO, hosted a webinar for representatives of Serbian political parties running in the 2022 elections, focused on how to use Meta's Family of Apps responsibly during an election campaign.

Our teams constantly monitor the situation on the ground, including in the Balkans, to respond to the challenges we may face. But fighting misinformation is an ever-evolving problem, and we cannot do it alone. It is by working hand in hand with experts on the ground that we keep our users well informed. I would therefore like to thank you again for your letter. Please be assured that our teams are open to further cooperation with you.

Sincerely,

Nick Clegg
President, Global Affairs