This repository has been archived by the owner on Aug 1, 2024. It is now read-only.
Replies: 1 comment
- I don't know BertViz and
Hi, I was trying to visualize the attentions of ESMv2 with BertViz, but I had a couple of questions. When I encode two inputs with encode_plus and the model adds the special tokens to the sequences, should I consider the separator token between the two inputs as part of sequence 1 or sequence 2? Also, can I expect to find any meaningful connections between the two sequences if I visualize the model's attention and look for seq1 -> seq2 connections?
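For what it's worth, in the BERT-family convention that encode_plus follows, the first separator is counted as part of sequence 1: the layout is `[CLS] A [SEP] B [SEP]`, and the first `[SEP]` gets token_type_id 0. Below is a minimal hand-rolled sketch of that layout (not ESM-2's actual tokenizer, which uses its own special tokens and may not emit token_type_ids at all, so check the encode_plus output for your model); `pair_encoding` and the example tokens are made up for illustration:

```python
# Hedged sketch of the BERT-style sequence-pair layout that
# encode_plus builds: [CLS] A... [SEP] B... [SEP].
# The first [SEP] closes sequence A, so it takes token_type_id 0,
# i.e. it counts as part of sequence 1.

def pair_encoding(tokens_a, tokens_b):
    """Mimic the pair layout; helper name is hypothetical."""
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"]
    # [CLS], seq A, and the first [SEP] -> type 0; seq B and last [SEP] -> type 1.
    token_type_ids = [0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1)
    return tokens, token_type_ids

tokens, type_ids = pair_encoding(["M", "K", "V"], ["L", "A"])
print(tokens)    # ['[CLS]', 'M', 'K', 'V', '[SEP]', 'L', 'A', '[SEP]']
print(type_ids)  # [0, 0, 0, 0, 0, 1, 1, 1]
```

The index where the type ids flip from 0 to 1 is also what you would pass to BertViz to mark where sequence 2 starts, so that seq1 -> seq2 attention lines are drawn between the right token groups.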