--- title: "Meta has suffered consecutive defeats related to failing to disclose known product hazards to the public" type: "News" locale: "en" url: "https://longbridge.com/en/news/280913696.md" description: "Meta lost in two court hearings, both involving allegations of failing to disclose the dangers of its products to the public. The rulings indicate that internal research conducted by companies may lead to legal liability. Former executives pointed out that Meta's internal research contradicts its external image, and the jury found that it inadequately regulated the platform, endangering minors. This case reflects the dilemma faced by tech companies between research and legal responsibility" datetime: "2026-03-29T14:20:38.000Z" locales: - [zh-CN](https://longbridge.com/zh-CN/news/280913696.md) - [en](https://longbridge.com/en/news/280913696.md) - [zh-HK](https://longbridge.com/zh-HK/news/280913696.md) --- > Supported Languages: [简体中文](https://longbridge.com/zh-CN/news/280913696.md) | [繁體中文](https://longbridge.com/zh-HK/news/280913696.md) # Meta has suffered consecutive defeats related to failing to disclose known product hazards to the public **Key Points** - Meta's losses in two court hearings, although involving different cases, both point to allegations that the company **was aware of the dangers of its products**. - The rulings indicate that companies hiring researchers to analyze the impact of their products on users may face potential legal liabilities. - In the current context of significant investment in artificial intelligence in the tech industry, experts worry that internal research will continue to be restricted, affecting consumer safety. More than a decade ago, when it was still called Facebook, Meta hired researchers from the social sciences to analyze the impact of social networking services on users. This move was intended to show that the company and its peers were taking the benefits and potential risks of innovative products seriously. 
However, this week's verdicts against Meta indicate that those research findings can instead become damaging legal evidence.

Brian Boland, a former Facebook executive who testified in both trials (one in New Mexico and one in Los Angeles), said the negative conclusions in Meta's internal research and documents stood in stark contrast to the public image the company sought to project. Juries in both trials found that Meta failed to adequately police its platform, putting minors in danger.

Years ago, after Facebook product manager Frances Haugen became a well-known whistleblower, Mark Zuckerberg's company began tightening control over its research teams. Since then, newer tech companies such as OpenAI and Anthropic have invested heavily in researchers, tasking them with studying the impact of next-generation artificial intelligence on users and publicly releasing the results. As concern grows about the negative effects of artificial intelligence on some users, these companies must weigh whether continuing to fund such research, or suppressing it, better serves their interests.

Boland said in an interview: "There was a time when the company had teams that could do this kind of research. During that brief window, a group of very talented researchers had relatively loose permission to examine the problems occurring with these products, but to my knowledge they no longer have that space."

Although Meta's two losses this week involved different cases, the core issue was the same: the company did not disclose to the public the known dangers of its products.

**Evidence: Internal Documents and Research**

The juries reviewed millions of company documents, including executive emails, presentations, and internal research conducted by Meta employees.
In one internal study, a significant proportion of teenage users reported experiencing non-consensual sexual advances on Instagram; another, which Meta ultimately halted, found that users who reduced their Facebook usage reported lower levels of depression and anxiety.

The plaintiffs' lawyers did not rely solely on internal research to make their case, but the research significantly strengthened their claims of wrongdoing by Meta. Meta's defense team argued that some of the research was outdated, taken out of context, and misleading, distorting the company's operating model and safety philosophy.

**"Both sides have presented their statements"**

"The jury heard the statements from both sides and a fair presentation of the facts, and on that basis they made their ruling," Boland said. "In two trials with entirely different cases, the juries delivered clear verdicts."

Meta and YouTube, which is also a defendant in the Los Angeles case, both said they would appeal.

Psychologist and lawyer Lisa Strohman, who served as an internal expert consultant in the New Mexico lawsuit, said that Meta's management, and the tech industry as a whole, had believed they could use internal research to burnish their image and win public favor. "I think they didn't realize that researchers are also parents and regular people," Strohman said, "and they didn't anticipate that these people wouldn't be easily bought."

After the research was leaked, the public-relations effect executives had hoped for backfired completely. The most damaging event for Meta came in 2021, when former Facebook product manager turned whistleblower Frances Haugen leaked a large number of documents showing that the company was aware of the potential harms of its products.
On December 1, 2021, former Facebook employee Frances Haugen testified before the House Energy and Commerce Committee's Communications and Technology Subcommittee on Capitol Hill in Washington.

Kate Block, research and project director at the nonprofit "Children and Screens: Institute of Digital Media and Child Development," said: "Haugen's disclosures became a significant turning point globally, not just for the company itself, but for researchers, policymakers, and the general public as well."

The leak also prompted Meta and the rest of the tech industry to make significant adjustments, including eliminating research that could be detrimental to the company. According to previous reporting by CNBC, several teams researching product harms and related issues were disbanded. Some companies have also begun removing tools and features that third-party researchers used to analyze their platforms.

"Companies may now view ongoing research as a legal risk, but independent third-party research must continue to be supported," Block said.

Sasha Horowitz, executive director of the Tech Oversight Project, said that most of the internal research used in this week's trials was not new; many documents had previously been made public by other whistleblowers. What the trials added was "real emails, original quotes, screenshots, internal marketing presentations, and memos," which provided necessary context.

The tech industry is now making a major push into artificial intelligence, with companies like Meta, OpenAI, and Google prioritizing product development over research and safety. Block expressed deep concern: "As with early social media, the public's visibility into the research AI companies conduct on their own products remains very limited."
"Most artificial intelligence companies are currently only researching the models themselves—model behavior, interpretability, and alignment, but there is a significant gap in research on the impact of chatbots and digital assistants on children's development," Brock said. "AI companies have the opportunity to avoid repeating past mistakes—we urgently need to establish transparent and verifiable mechanisms that require companies to disclose their platform information to the public and support more independent evaluations." ### Related Stocks - [First Trust Dow Jones Internet ETF (FDN.US)](https://longbridge.com/en/quote/FDN.US.md) - [Meta Platforms, Inc. (META.US)](https://longbridge.com/en/quote/META.US.md) - [GraniteShares 2x Long META Daily ETF (FBL.US)](https://longbridge.com/en/quote/FBL.US.md) - [ProShares Big Data Refiners ETF (DAT.US)](https://longbridge.com/en/quote/DAT.US.md) - [State StreetSPDRS&PSftwr&SvcsETF (XSW.US)](https://longbridge.com/en/quote/XSW.US.md) - [Vanguard Communication Services ETF (VOX.US)](https://longbridge.com/en/quote/VOX.US.md) - [Pacer Benchmark Data&Infras RE SCTR ETF (SRVR.US)](https://longbridge.com/en/quote/SRVR.US.md) - [Global X Cloud Computing ETF (CLOU.US)](https://longbridge.com/en/quote/CLOU.US.md) - [Direxion Daily META Bull 2X ETF (METU.US)](https://longbridge.com/en/quote/METU.US.md) - [Roundhill META WeeklyPay ETF (METW.US)](https://longbridge.com/en/quote/METW.US.md) - [Direxion Daily META Bear 1X ETF (METD.US)](https://longbridge.com/en/quote/METD.US.md) - [Fidelity MSCI Communication ServicesETF (FCOM.US)](https://longbridge.com/en/quote/FCOM.US.md) - [Franklin Exponential Data ETF (XDAT.US)](https://longbridge.com/en/quote/XDAT.US.md) - [iShares U.S. 
Digital Infras & RE ETF (IDGT.US)](https://longbridge.com/en/quote/IDGT.US.md) - [iShares Expanded Tech-Software Sect ETF (IGV.US)](https://longbridge.com/en/quote/IGV.US.md) - [State Street® CommServSelSectSPDR®ETF (XLC.US)](https://longbridge.com/en/quote/XLC.US.md) - [Global X Data Center & Dgtl Infrs ETF (DTCR.US)](https://longbridge.com/en/quote/DTCR.US.md) ## Related News & Research - [Tech Wrap March 30: OPPO Find X9 Ultra, WhatsApp on CarPlay, Blaupunkt](https://longbridge.com/en/news/281041030.md) - [Meta integrated ai into the core of its risk review program - website](https://longbridge.com/en/news/281190495.md) - [Draft IT regulations may widen govt oversight on social media content](https://longbridge.com/en/news/281133616.md) - [Govt proposes to bring independent news creators under MIB purview](https://longbridge.com/en/news/281049326.md) - [Meta Introducing Our First Ai Glasses Built For Prescriptions](https://longbridge.com/en/news/281195955.md)