
Commit

comment
maxjakob committed Jan 25, 2024
1 parent 3b2bf1d commit a8aad2d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion notebooks/search/tokenization.ipynb
@@ -255,7 +255,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Next we tokenize the long text, create chunks of size 510 tokens and map the tokens back to text. Notice that the BERT tokenizer itself is warning us about the 512 tokens limitation."
"Next we tokenize the long text, create chunks of size 510 tokens and map the tokens back to text. Notice that on the first run the BERT tokenizer itself is warning us about the 512 tokens limitation."
]
},
{
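The changed sentence describes chunking a long text into 510-token pieces with a BERT tokenizer and decoding the chunks back to text. The notebook's own code is not part of this diff; below is a minimal sketch of that step, assuming the Hugging Face `transformers` `BertTokenizer` and the `bert-base-uncased` checkpoint (both assumptions, not taken from the commit).

```python
from transformers import BertTokenizer

# Assumed checkpoint; the notebook may load a different model.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

long_text = "..."  # placeholder for the long input document

# Encode without truncation or special tokens; for a genuinely long input the
# tokenizer logs a warning (once per tokenizer instance, hence "on the first
# run") that the sequence exceeds the model's 512-token limit
# (512 = 510 content tokens + [CLS] and [SEP]).
token_ids = tokenizer.encode(long_text, add_special_tokens=False)

# Create chunks of at most 510 tokens and map each chunk back to text.
chunk_size = 510
chunks = [token_ids[i : i + chunk_size] for i in range(0, len(token_ids), chunk_size)]
chunk_texts = [tokenizer.decode(chunk) for chunk in chunks]
```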

