– Notebook: https://bit.ly/tf-cv-whylabs
– Free WhyLabs Signup: https://whylabs.ai/free
– whylogs GitHub (give us a star!): https://github.com/whylabs/whylogs/
– Join The AI Slack group: https://bit.ly/r2ai-slack
Join this hands-on workshop to learn ML monitoring for computer vision models in production with TensorFlow and WhyLabs.
If you want reliable computer vision pipelines, trustworthy data, and responsible ML models, you’ll need to monitor your models and data.
In this workshop, we’ll cover how to use ML monitoring techniques to implement your own AI observability solution for computer vision classification applications.
Once you complete the workshop, you’ll also receive a certificate!
This workshop will cover (a short Python sketch of these steps follows the list):
Reading image data for TensorFlow models
Training a computer vision classification model with TensorFlow/Keras
Detecting image data quality issues
Detecting data drift for computer vision
Measuring for potential concept drift
Monitoring ML model performance
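
To give a flavor of these steps, here is a minimal sketch, assuming TensorFlow 2.x and whylogs v1. The "data/train" directory layout (one subfolder per class) and the mean-brightness feature are illustrative assumptions, not the exact workshop code:

import pandas as pd
import tensorflow as tf
import whylogs as why

# Read image data for a TensorFlow model (one subfolder per class).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(128, 128), batch_size=32)

# Train a small Keras classification model.
num_classes = len(train_ds.class_names)
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])
model.fit(train_ds, epochs=3)

# Profile simple per-image statistics with whylogs so new batches can later
# be compared against this baseline for drift and data quality issues.
rows = []
for images, labels in train_ds:
    for img, lbl in zip(images.numpy(), labels.numpy()):
        rows.append({"mean_brightness": float(img.mean()), "label": int(lbl)})
features = pd.DataFrame(rows)
results = why.log(features)            # whylogs profiles the feature columns
print(results.view().to_pandas())      # summary statistics per column

Profiling lightweight image statistics like this gives you a baseline to compare incoming data against, which is the basis of the drift and data quality checks covered in the workshop.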
What you’ll need:
A modern web browser
A Google account (for saving a Google Colab)
A free WhyLabs account (sign up at https://whylabs.ai/free)
Who should attend:
Anyone interested in AI observability, model monitoring, MLOps, and DataOps! This workshop is designed to be approachable for most skill levels. Familiarity with machine learning and Python will be useful, but it’s not required.
By the end of this workshop, you’ll be able to integrate data and AI observability into your own pipelines (Kafka, Airflow, Flyte, etc.) and ML applications to catch deviations and biases in data or ML model behavior.
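
For example, a pipeline step can log a whylogs profile of each batch it processes and ship it to WhyLabs for monitoring. This is a minimal sketch, assuming whylogs v1 and its WhyLabs writer; the org ID, dataset ID, API key, and feature values are placeholders to replace with your own:

import os
import pandas as pd
import whylogs as why

# Placeholder credentials for the WhyLabs writer (set these to your own values).
os.environ["WHYLABS_DEFAULT_ORG_ID"] = "org-XXXX"
os.environ["WHYLABS_DEFAULT_DATASET_ID"] = "model-1"
os.environ["WHYLABS_API_KEY"] = "YOUR_API_KEY"

# Profile the batch of features/predictions your pipeline step just handled.
batch = pd.DataFrame({
    "mean_brightness": [101.2, 98.7, 143.1],
    "predicted_class": ["cat", "dog", "cat"],
})
results = why.log(batch)
results.writer("whylabs").write()   # the profile appears in your WhyLabs dashboard

Because whylogs profiles are statistical summaries rather than raw records, the monitoring step adds little overhead to the pipeline itself.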
About the instructor:
Sage Elliott enjoys breaking down barriers to AI observability, talking to amazing people in the Robust & Responsible AI community, and teaching workshops on machine learning. Sage has worked in hardware and software engineering roles at various startups for over a decade.
Connect with Sage on LinkedIn: https://www.linkedin.com/in/sageelliott/
About WhyLabs:
WhyLabs.ai is an AI observability platform that prevents data & model performance degradation by allowing you to monitor your data and machine learning models in production.
Do you want to connect with the team, learn about WhyLabs, or get support? Join the WhyLabs + Robust & Responsible AI community Slack: http://join.slack.whylabs.ai/