This repository was archived by the owner on Mar 14, 2024. It is now read-only.
I am reading the documentation on using the generated embeddings for downstream tasks at https://torchbiggraph.readthedocs.io/en/latest/downstream_tasks.html. The link prediction example applies the same operations as in training, ComplexDiagonalDynamicOperator followed by DotComparator, and the ranking example does the same.
But for nearest-neighbor search, the example just uses IndexFlatL2. Correct me if I am wrong, but these are not the same scoring function and will produce different results: the model is trained to maximize an inner-product score, while IndexFlatL2 ranks by Euclidean distance. If so, shouldn't the example use FAISS with METRIC_INNER_PRODUCT instead (https://github.com/facebookresearch/faiss/wiki/MetricType-and-distances)?