
Contact
Thiago de Faria
Presenting
AI with a DevOps mindset – experimentation, sharing and easy deployment of ML components
In this talk, I’ll look at ML & AI from a devops engineer’s perspective, emphasizing how devops practices can help build ML products and shorten their time-to-market. Although the field may be new, paradigms familiar to the devops world translate to the ML world: reduced batch sizes, CI/CD, sharing, observability, …
Experienced engineers hear about Machine Learning and AI all the time now. They may even work in a place that has a Data Team building a prototype ML model that needs to be deployed in production ASAP – and that is just the tip of the iceberg.
When do we build an ML component? When we need to find patterns without explicitly programming machines to do so. Although ML components are not easy to test, and Data Scientists do not usually have a software engineering background, ML components can still drink from the same source as the devops movement.
This talk will introduce and offer guidance on:
- Avoid the AI hype train
- CI/CD for ML? Yes, please, but we need to talk about Continuous Evaluation (see the sketch after this list)!
- ML bugs – no, you cannot catch them with console.log()
- Create a safe environment for Data Scientists
- Packaging, deploying and serving ML models
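
To give a flavour of the Continuous Evaluation point above, here is a minimal sketch of one possible quality gate in a CI/CD pipeline: evaluate a candidate model on held-out data and fail the build when the metric drops below an agreed bar. The dataset, model and the 0.9 threshold are placeholders for illustration, not material from the talk itself.

```python
# Minimal sketch of a continuous-evaluation gate for a CI/CD job.
# The dataset, model and threshold below are illustrative placeholders.
import sys

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

ACCURACY_THRESHOLD = 0.9  # assumed acceptance bar agreed with the team


def main() -> None:
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_eval, y_train, y_eval = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # In a real pipeline this would be the candidate model produced upstream,
    # not one trained on the spot.
    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

    score = accuracy_score(y_eval, model.predict(X_eval))
    print(f"candidate accuracy on held-out data: {score:.3f}")

    if score < ACCURACY_THRESHOLD:
        # A non-zero exit code makes the CI/CD job fail, blocking deployment.
        sys.exit(1)


if __name__ == "__main__":
    main()
```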
By the end of this talk, engineers will better understand the ML lifecycle and the AI hype, and feel more comfortable supporting these types of applications.