🗓️ 2 June 2022
⏱️ 57 minutes
As transparency reporting about content moderation enforcement has become standard across the platform industry, there have been growing questions about the reliability and accuracy of the reports the platforms are producing. With all reporting entirely voluntary and the content moderation industry in general being very opaque, it's hard to know how much to trust the figures that companies report in their quarterly or biannual enforcement reports. As a result, there have been growing calls for independent audits of these figures, and last month, Meta released its first-ever independent audit of its content moderation reporting systems.
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with someone who actually knows something about auditing: Colleen Honigsberg, an associate professor of law at Stanford Law School, whose research focuses on the empirical study of corporate and securities law. They talked about how auditors work, the promises and pitfalls of auditing in other contexts and what those might teach us about auditing in the content moderation context, and whether this is going to be a useful regulatory tool.
Support this show http://supporter.acast.com/lawfare.
Hosted on Acast. See acast.com/privacy for more information.
0:00.0 | The following podcast contains advertising. To access an ad-free version of the Lawfare
0:07.2 | Podcast, become a material supporter of Lawfare at patreon.com slash lawfare.
0:14.7 | That's patreon.com slash lawfare.
0:18.2 | Also, check out Lawfare's other podcast offerings: Rational Security, Chatter, Lawfare
0:25.6 | No Bull, and The Aftermath.
0:32.6 | To sign off on an audit report, you already have to be a specific type of CPA with |
0:39.8 | particular experiences and qualifications that have taken the exams and passed them. |
0:45.0 | For this one, I assume that the partners don't need those types of qualifications, so you're |
0:51.7 | already removing that, plus presumably there's going to be variation in the quality of the |
0:56.4 | partner because there's always variation in quality of people. |
1:00.5 | So it seems like, if anything, it would be more important to know who the partner is here
1:05.2 | than it would be in the financial context. |
1:08.5 | I'm Evelyn Douek, and this is the Lawfare Podcast, June 2nd, 2022.
1:14.9 | Today we're bringing you another episode of our Arbiters of Truth series on the Online Information |
1:19.3 | Ecosystem. |
1:20.8 | This episode is about auditing. |
1:22.3 | Wait, wait, don't switch off, I promise it's interesting and important. |
1:27.2 | As transparency reporting about content moderation enforcement has become standard across the |
1:31.3 | platform industry, there's been growing questions about the reliability and accuracy of |
1:36.0 | the reports the platforms are producing. |
1:38.8 | With all reporting being entirely voluntary and the content moderation industry in general |
1:43.0 | being very opaque, it's hard to know how much to trust the figures that companies report |
... |
Disclaimer: The podcast and artwork embedded on this page are from The Lawfare Institute, and are the property of its owner and not affiliated with or endorsed by Tapesearch.
Generated transcripts are the property of The Lawfare Institute and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.
Copyright © Tapesearch 2025.