2022-11-19, 13:30–14:30 (Europe/Athens), Hall 2
Training and reproducing ML models from scratch has always been a bitter pill for data science teams, researchers, and practitioners, to say nothing of the data collection and engineering that precedes it.
The Transformers library has come a long way toward making our lives a little easier, bringing the latest SoTA AI models for CV, NLP, audio, and multimodal processing within reach at the push of a button.
In this workshop we will walk through the Hugging Face platform, play around with the transformers library, load models to process text, detect objects in images, and, why not, create new 🤗 accounts!
Project Repository
https://github.com/sniafas/fosscomm22-democratize-ml
The purpose of the workshop is to demonstrate HF capabilities to the machine learning and open-source community of FOSSCOMM.
Transformers is a powerful, opinionated library that became popular by bringing the implementation of, and access to, pretrained transformer models to the community through a rich API supporting PyTorch, TensorFlow, and other ML frameworks.
The HF platform lets users share models, datasets, and applications in a self-contained ecosystem, and even perform real-time inference and build demo applications.
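As a taste of how little code that rich API requires, here is a minimal sketch of a transformers pipeline call. It assumes `transformers` and a backend such as PyTorch are installed; the default sentiment checkpoint is downloaded from the Hub on first use.

```python
# Minimal transformers pipeline sketch: text classification.
# Assumes `pip install transformers torch`; the default sentiment
# checkpoint is fetched from the Hub on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Hugging Face makes sharing models easy!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The same one-liner pattern works for other tasks, e.g. `pipeline("object-detection")` for images.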
The workshop will demonstrate the basic Hugging Face API and transformers functionality, covering the following topics:
- Explore the 🤗 platform
- Set up the development environment
- Utilise transformers pipelines for NLP & CV
- Hugging Face CLI
- Load datasets, upload & reuse models
- Create and push a new model to the Hub
Open up your Jupyter notebooks and let's have fun with Hugging Face 🤗
Prerequisites: knowledge of Python, general ML concepts, familiarity with Linux
Stavros Niafas is a machine learning engineer working on machine/deep learning, active learning, MLOps, and computer vision. He is also actively engaged in systems engineering, FLOSS, and photography.
Lately he has been on a journey to democratize ML and bridge the gap between research and production.