The sentence_tokenizer() and paragraph_tokenizer() should add sentence and paragraph attributes to the Token objects directly instead of creating a new list of Tokens embedded in tuples.
One idea is to use the _ attribute on Token objects to store two k/v pairs: sent/word_num and par/word_num.
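A minimal sketch of that idea using spaCy's custom extension attributes. The extension names ("sent", "par"), the tokenizer signature, and the interpretation of word_num as the token's index within its sentence/paragraph are assumptions for illustration, not the project's actual API:

```python
import spacy
from spacy.tokens import Token

# Register the two k/v pairs once on the Token class.
if not Token.has_extension("sent"):
    Token.set_extension("sent", default=None)  # word_num within its sentence (assumed semantics)
if not Token.has_extension("par"):
    Token.set_extension("par", default=None)   # word_num within its paragraph (assumed semantics)


def sentence_tokenizer(doc):
    """Annotate each Token in place instead of returning a new list of
    (sentence, Token) tuples."""
    for sent in doc.sents:
        for word_num, token in enumerate(sent):
            token._.sent = word_num
    return doc


# Usage sketch:
nlp = spacy.load("en_core_web_sm")
doc = sentence_tokenizer(nlp("First sentence. Second sentence."))
print([(t.text, t._.sent) for t in doc])
```

paragraph_tokenizer() could follow the same pattern, setting token._.par after splitting the text into paragraphs, so downstream code keeps working with a plain Doc of annotated Tokens rather than nested tuples.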