are a way to represent words, phrases, and images as vectors in a high-dimensional space
if you embed "cat" and "dog", the two vectors end up close to each other in that vector space
this vector space is extremely high-dimensional, which gives the model room to encode this kind of "semantic" closeness
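as a minimal sketch of what this looks like in practice, here's how you might generate embeddings with the sentence-transformers package; the model name is just a common example, not a recommendation:

```python
# minimal sketch, assuming the sentence-transformers package is installed;
# "all-MiniLM-L6-v2" is just an illustrative model choice
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
vectors = model.encode(["cat", "dog", "car"])

# each word becomes one row: a dense vector with a few hundred dimensions
print(vectors.shape)  # e.g. (3, 384) for this particular model
```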
the most common way to measure how similar two vectors are is cosine similarity, where a value of 1 indicates high similarity (the vectors point in the same direction), 0 indicates no relationship, and -1 indicates strong opposition (opposite directions)
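a small sketch of computing cosine similarity by hand with numpy, i.e. the dot product of the two vectors divided by the product of their lengths:

```python
import numpy as np

def cosine_similarity(a, b):
    # dot product divided by the product of the vector norms
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity([1, 0], [1, 0]))   #  1.0 -> same direction, very similar
print(cosine_similarity([1, 0], [0, 1]))   #  0.0 -> orthogonal, unrelated
print(cosine_similarity([1, 0], [-1, 0]))  # -1.0 -> opposite directions
```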
embeddings are stored in vector stores; many databases support this, e.g. Postgres via the pgvector extension
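a rough sketch of storing and querying embeddings with pgvector from Python, assuming a local Postgres database with the extension available and the psycopg driver; the connection string, table name, and vector size are placeholders:

```python
# hypothetical setup: adjust the connection string and vector dimension to your data
import psycopg

vec = [0.1, 0.2, 0.3]
vec_literal = "[" + ",".join(str(x) for x in vec) + "]"  # pgvector's text format

with psycopg.connect("dbname=demo") as conn:
    with conn.cursor() as cur:
        cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
        cur.execute(
            "CREATE TABLE IF NOT EXISTS items "
            "(id bigserial PRIMARY KEY, embedding vector(3))"
        )
        cur.execute(
            "INSERT INTO items (embedding) VALUES (%s::vector)", (vec_literal,)
        )
        # nearest neighbours by cosine distance (pgvector's <=> operator)
        cur.execute(
            "SELECT id, embedding <=> %s::vector AS cosine_distance "
            "FROM items ORDER BY cosine_distance LIMIT 5",
            (vec_literal,),
        )
        print(cur.fetchall())
```

note that pgvector's `<=>` operator returns cosine *distance* (1 minus cosine similarity), so smaller values mean more similar vectors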