Secure Machine Learning: The major security flaws in the ML lifecycle (and how to avoid them)

In this webinar

Operating and maintaining large-scale production machine learning systems has uncovered new challenges that require fundamentally different approaches from those of traditional software. Security in data and machine learning infrastructure has seen growing attention due to the critical risks being identified as ML expands into more demanding real-world use cases.

In this talk we will introduce the motivation for and importance of security in data and machine learning infrastructure through a set of practical examples showcasing “Flawed Machine Learning Security.” These “Flawed ML Security” examples are analogous to the annual “OWASP Top 10” report, which lists the top vulnerabilities in the web space, and will highlight common high-risk touch points in ML systems.

Throughout this session we will walk through a practical example showcasing how cloud-native tooling can be leveraged to mitigate these critical security vulnerabilities. We will cover concepts such as role-based access control for ML system artifacts and resources, encryption and access restrictions for data in transit and at rest, best practices for supply chain vulnerability mitigation, tools for vulnerability scanning, and templates that practitioners can introduce to enforce best practices.
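To make the role-based access control idea concrete, here is a minimal sketch using the official `kubernetes` Python client to create a namespaced Role and RoleBinding that grant read-only access to SeldonDeployment custom resources. The namespace, role, and service account names are hypothetical placeholders, and the exact setup shown in the session may differ.

```python
# Minimal sketch: read-only RBAC for SeldonDeployment resources in one namespace.
# Assumes a reachable Kubernetes cluster running Seldon Core and the official
# `kubernetes` Python client. Names ("ml-team", "model-viewer",
# "inference-client") are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() inside a pod
rbac = client.RbacAuthorizationV1Api()

namespace = "ml-team"

# Role: allow only get/list/watch on SeldonDeployment custom resources.
role = {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "Role",
    "metadata": {"name": "model-viewer", "namespace": namespace},
    "rules": [
        {
            "apiGroups": ["machinelearning.seldon.io"],
            "resources": ["seldondeployments"],
            "verbs": ["get", "list", "watch"],
        }
    ],
}

# RoleBinding: grant that role to a single service account only.
binding = {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "RoleBinding",
    "metadata": {"name": "model-viewer-binding", "namespace": namespace},
    "roleRef": {
        "apiGroup": "rbac.authorization.k8s.io",
        "kind": "Role",
        "name": "model-viewer",
    },
    "subjects": [
        {"kind": "ServiceAccount", "name": "inference-client", "namespace": namespace}
    ],
}

rbac.create_namespaced_role(namespace, role)
rbac.create_namespaced_role_binding(namespace, binding)
```

Scoping permissions to a namespace and a single service account in this way limits the blast radius if inference-client credentials are ever compromised.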

Key points we are going to discuss:

  • The importance of security in data and ML infrastructure
  • Common high-risk touch points and vulnerabilities in ML systems
  • How to leverage cloud-native tools to mitigate these critical security vulnerabilities (see the scanning sketch after this list)
  • Templates to ensure best practices
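As one hedged example of the tooling point above, the sketch below gates a build on a container image scan by invoking the Trivy CLI from Python and failing when high or critical findings are reported. It assumes Trivy is installed; the image name is a placeholder, and the JSON layout may vary slightly between Trivy versions.

```python
# Minimal sketch: fail a CI step if a model-serving image has HIGH/CRITICAL CVEs.
# Assumes the Trivy CLI is installed; the image reference is hypothetical.
import json
import subprocess
import sys

IMAGE = "registry.example.com/ml/income-classifier:0.1.0"  # placeholder image

# Run the scan and capture the JSON report from stdout.
result = subprocess.run(
    ["trivy", "image", "--format", "json", "--severity", "HIGH,CRITICAL", IMAGE],
    capture_output=True,
    text=True,
    check=True,
)

report = json.loads(result.stdout)
findings = [
    vuln
    for target in report.get("Results", [])
    for vuln in target.get("Vulnerabilities") or []
]

for vuln in findings:
    print(f'{vuln["VulnerabilityID"]} ({vuln["Severity"]}) in {vuln["PkgName"]}')

# Non-zero exit blocks the pipeline until the image is patched or the CVE triaged.
if findings:
    sys.exit(1)
```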

Speakers

Alejandro Saucedo

Engineering Director, Seldon
