End-to-End NLP Project with Hugging Face, FastAPI, and Docker

This tutorial explains how to build a containerized sentiment analysis API using Hugging Face, FastAPI, and Docker.

Many AI projects fail, according to various reports (e.g., Harvard Business Review). I speculate that part of the barrier to AI project success is the technical step from having built a model to making it widely available to others in your organization.

So how do you make your model easily available for consumption? One way is to wrap it in an API and containerize it so that your model can be exposed on any server with Docker installed. And that's exactly what we'll do in this tutorial.

We will take a sentiment analysis model from Hugging Face (an arbitrary choice, just to have a model that's easy to show as an example), write an API endpoint that exposes the model using FastAPI, and then we'll containerize our sentiment analysis app with Docker. I'll provide code examples and explanations all the way.
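To preview where we are headed, here is a minimal sketch of what the finished service might look like. The route name, request schema, and file layout are my own illustrative choices, not fixed requirements; the rest of the tutorial builds this up piece by piece.

```python
# main.py -- a minimal sketch; route and schema names are illustrative choices.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Load the sentiment analysis pipeline once at startup, not per request.
pipe = pipeline("sentiment-analysis")

class TextInput(BaseModel):
    text: str

@app.post("/predict")
def predict(payload: TextInput):
    # The pipeline returns a list of dicts like {"label": ..., "score": ...}.
    result = pipe(payload.text)[0]
    return {"label": result["label"], "score": result["score"]}
```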

The tutorial code has been tested on Linux, and should work on Windows too.
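That portability is exactly what the container buys us. As a rough sketch (the base image, file layout, and port are assumptions for illustration, not the tutorial's prescribed setup), the Dockerfile for an app like this could look as follows:

```dockerfile
# A minimal sketch; base image, file layout, and port are illustrative assumptions.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer across code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code (assumed to live in main.py).
COPY main.py .

# Serve the FastAPI app with uvicorn on port 8000.
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

With such a file in place, building and running would be the standard `docker build -t sentiment-api .` followed by `docker run -p 8000:8000 sentiment-api`.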

We will use the Pipeline class from Hugging Face's transformers library. See Hugging Face's tutorial for an introduction to the Pipeline if you're unfamiliar with it.

The pipeline makes it very easy to use models such as sentiment analysis models. Check out Hugging Face's sentiment analysis tutorial for a thorough introduction to the concept.

You can instantiate the pipeline with several different constructor arguments. One way is to pass in a type of task:
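A minimal sketch, using the sentiment analysis task (the sample sentence is just for illustration):

```python
from transformers import pipeline

# Instantiate a pipeline from a task name alone; transformers selects
# a default model for that task.
pipe = pipeline("sentiment-analysis")

print(pipe("I love this tutorial!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.9998}]
```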

This will use Hugging Face's default model for the provided task.

Another way is to pass the model argument, specifying which model you want to use. You don't have to pass the task argument in that case; the pipeline infers it from the model's configuration.
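A minimal sketch, here using the model that transformers picks by default for sentiment analysis (any sentiment model id from the Hugging Face Hub would work the same way):

```python
from transformers import pipeline

# Instantiate a pipeline from an explicit model id; the task is
# inferred from the model's configuration.
pipe = pipeline(model="distilbert-base-uncased-finetuned-sst-2-english")

print(pipe("This movie was disappointing."))
# e.g. [{'label': 'NEGATIVE', 'score': 0.9997}]
```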
