  • Where are these default definitions provided? You might just be drawing a distinction (the embedding is not part of the LLM training) when you can use LLMs to build embeddings; they might just not be the most efficient way to do it.
    – Yakk
    Commented Apr 16 at 19:23
  • @Yakk Ultimately, embeddings come from a learned one-to-one mapping between words and embeddings. Sometimes the mapping is learned at the same time as the rest of the network; sometimes it uses a pretrained model like Word2Vec. Importantly, though, the "mapping" between words and embeddings only ever takes in the word as input, never the context.
    Commented Apr 17 at 15:21
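A minimal sketch of the point in the second comment, assuming PyTorch and a made-up toy vocabulary: a static embedding is just a lookup table indexed by the word alone, so the same word maps to the identical vector no matter what surrounds it.

```python
# Sketch (assumes PyTorch): a static, context-free embedding lookup.
# The vector for a word depends only on its vocabulary index, never on
# the surrounding words. Vocabulary and sentences are illustrative only.
import torch
import torch.nn as nn

vocab = {"the": 0, "bank": 1, "river": 2, "money": 3}
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

def embed(sentence: str) -> torch.Tensor:
    ids = torch.tensor([vocab[w] for w in sentence.split()])
    return embedding(ids)  # one row per word, looked up by index alone

# "bank" gets the identical vector in both sentences, because the lookup
# never sees the context -- only the word's index.
a = embed("the river bank")
b = embed("the money bank")
print(torch.equal(a[2], b[2]))  # True
```

A contextual model such as an LLM, by contrast, would produce different vectors for "bank" in those two sentences, since its representations are a function of the whole input.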