March 21, 2021

An Overview of Different Text Embeddings

Text embedding is a method for capturing the meaning and context of text and converting those meanings into numerical representations that a computer can process. This is necessary for any machine learning task that takes text as its input, such as question answering, text generation, and text summarisation.
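To make the idea concrete, here is a minimal sketch of turning words into vectors with Word2Vec, one of the models covered in the post. It assumes gensim (4.x) is installed and uses a tiny toy corpus purely for illustration, so the resulting vectors are not meaningful beyond demonstrating the workflow.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (real use needs far more text).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "chased", "the", "cat"],
    ["dogs", "and", "cats", "are", "pets"],
]

# Train a small Word2Vec model; vector_size controls the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50)

# Each word now maps to a 50-dimensional numerical vector.
cat_vector = model.wv["cat"]
print(cat_vector.shape)  # (50,)

# Similar words (by cosine similarity of their vectors).
print(model.wv.most_similar("cat", topn=3))
```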

I wrote a blog post as part of the Ezra Tech Blog on different embedding models, namely Word2Vec, GloVe, FastText, and ELMo. Please visit the post for details on each method, how the models compare to one another, and how to use them in Python.