Is relational embedding possible between two sentences using an independent LLM embedding of each? Or does every sentence pair need its own relation-embedding pass through an LLM?
The underlying assumption, that embedding differences can carry relational information (king - man + woman = queen), requires more dimensions than the original embedding space can offer. For example, there are countless ways to break a car, any of which could relate two sentences.
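To make the vector-offset assumption concrete, here is a minimal sketch using hand-made toy vectors (not real LLM embeddings; the 4 dimensions and their meanings are invented for illustration):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-d "embeddings"; dimensions loosely mean (royalty, male, female, vehicle).
vocab = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.1, 0.8, 0.0]),
    "man":   np.array([0.1, 0.9, 0.1, 0.0]),
    "woman": np.array([0.1, 0.1, 0.9, 0.0]),
    "car":   np.array([0.0, 0.0, 0.0, 1.0]),
}

# The classic offset analogy: king - man + woman should land near queen.
target = vocab["king"] - vocab["man"] + vocab["woman"]
best = max(vocab, key=lambda w: cosine(vocab[w], target))
print(best)  # → "queen" for these toy vectors
```

The point of the objection above is that a single offset vector can only encode one relation per direction; with many possible relations between any two sentences, the fixed dimensionality of the embedding space becomes the bottleneck.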
For a database of N sentences, if differentiation were possible it would reduce the problem to N LLM embedding calls, with the rest being vector comparisons. If differentiation is not possible, N(N-1)/2 pairwise LLM calls are required to derive the relations between all sentences in the database.
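The cost contrast can be sketched as follows, with random vectors standing in for LLM embeddings: N embedding calls up front, after which every pairwise relation is a cheap vector comparison.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
N = 5

# One "LLM embedding call" per sentence, then L2-normalise.
embeddings = rng.normal(size=(N, 8))
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

# All pairwise relations are now plain dot products — no further model calls.
pairs = list(combinations(range(N), 2))
similarities = {(i, j): float(embeddings[i] @ embeddings[j]) for i, j in pairs}

print(N, len(pairs))  # 5 embedding calls cover 10 pairwise comparisons
```

Without independent embeddings, each of those N(N-1)/2 pairs would instead need its own model pass, which is what makes the relational-differentiation question decisive for scaling.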
Relational Sentence Embedding
It is possible to use ML to learn the meanings of embedding differences, but supervised learning cannot take advantage of the generic capabilities of LLMs.
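A minimal sketch of that supervised approach, assuming labelled sentence pairs: learn what each difference vector "means" via a nearest-centroid classifier. The relation names, offsets, and synthetic embeddings below are all illustrative assumptions, not real model outputs.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 16

# Pretend each relation shifts an embedding by a fixed (unknown) offset.
offsets = {"cause": rng.normal(size=dim), "contrast": rng.normal(size=dim)}

def make_pair(relation):
    """Synthesise a (sentence_a, sentence_b) embedding pair with noise."""
    base = rng.normal(size=dim)
    return base, base + offsets[relation] + 0.1 * rng.normal(size=dim)

# Supervised training data: difference vectors labelled with their relation.
train = [(b - a, r) for r in offsets for _ in range(20)
         for a, b in [make_pair(r)]]

# Learn one centroid per relation label.
centroids = {r: np.mean([d for d, lab in train if lab == r], axis=0)
             for r in offsets}

def predict(a, b):
    """Classify the relation of a pair by its nearest difference centroid."""
    d = b - a
    return min(centroids, key=lambda r: np.linalg.norm(d - centroids[r]))

a, b = make_pair("cause")
print(predict(a, b))  # → "cause" for these seeded toy vectors
```

This works only for the closed set of relations in the training labels, which is the limitation the sentence above points at: the supervised model cannot generalise to relations an LLM could describe zero-shot.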
Massive Text Embedding Benchmark