Online Safety Bill: How Global Platforms Use MLOps to Keep People Safe
The UK government’s communications regulator, Ofcom, commissioned Winder.AI to produce a report to improve its understanding of the end-to-end AI governance processes that support the creation and deployment of the automated content classifiers used to moderate online content. Together we interviewed social media platforms and moderation technology vendors about the tools, technologies, and processes often referred to collectively as machine learning operations (MLOps).
In this presentation, I will distil the findings of the report and share some of the insights we gained from the interviews. With a particular emphasis on online safety, you will learn how MLOps and AI governance are used within these companies to help ensure that their AI solutions are safe and effective. You will also learn about the challenges they face in operating their AI solutions at scale. Finally, the presentation will underline the importance of online safety and briefly review the UK government’s Online Safety Bill and the impact it may have.
This presentation focuses on the importance of online safety and is therefore relevant to anyone interested in keeping people safe online. On a technical level, it delves into the use of AI, AI governance, and MLOps, and will appeal to anyone with an interest in AI.
- Failure Is Always An Option (Dylan Beattie) - Thursday Jun 29, 15:40
- Flow. The Worst Software Development Approach in History (Sander Hoogendoorn & Kim van Wilgen) - Thursday Jun 29, 13:30
- Programming's Greatest Mistakes (Mark Rendle) - Wednesday Jun 28, 16:40
- Demystifying Blockchain - From Infrastructures Via Smart Contracts to Applications (Olivier Rikken) - Tuesday Jun 27, 14:20
- Security (Bert Hubert) - Tuesday Jun 27, 14:20 & Tuesday Jun 27, 16:40
- TBA (Anita Sengupta) - Tuesday Jun 27, 09:10
- TBA (Erik Scherder) - Wednesday Jun 28, 09:10
- One Rule to Rule Them All (Dave Thomas) - Thursday Jun 29, 09:10
- How The Hack? (Ben Sadeghipour) - Wednesday Jun 28, 13:20