Glossary

Word embedding

A computational technique in natural language processing (NLP) in which words or phrases are represented as dense numerical vectors in a high-dimensional space, such that words with similar meanings lie close together in that space. Word embeddings (e.g., Word2Vec, GloVe, fastText) enable machine learning models to work with text numerically and to capture semantic relationships between words.
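The sketch below illustrates the idea, assuming the gensim library is installed; the toy corpus, parameters, and words are illustrative only, and similarities learned from such a tiny corpus are not meaningful.

```python
# A minimal sketch of training word embeddings with gensim (pip install gensim).
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative only).
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chases", "the", "cat"],
    ["the", "cat", "sleeps", "on", "the", "mat"],
]

# Train a small Word2Vec model: each word becomes a 50-dimensional dense vector.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=1, seed=42)

# Look up the dense vector for a word (a NumPy array of shape (50,)).
king_vector = model.wv["king"]

# Words used in similar contexts end up close together (cosine similarity).
print(model.wv.similarity("king", "queen"))
print(model.wv.most_similar("king", topn=3))
```

In practice, embeddings are either trained on large corpora or loaded as pretrained vectors, and the resulting vectors serve as numerical input features for downstream NLP models.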
