diff --git a/notebooks/search/tokenization.ipynb b/notebooks/search/tokenization.ipynb
index 0b10f402..db0067e7 100644
--- a/notebooks/search/tokenization.ipynb
+++ b/notebooks/search/tokenization.ipynb
@@ -255,7 +255,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Next we tokenize the long text, create chunks of size 510 tokens and map the tokens back to text. Notice that the BERT tokenizer itself is warning us about the 512 tokens limitation."
+    "Next we tokenize the long text, create chunks of 510 tokens, and map the tokens back to text. Notice that on the first run, the BERT tokenizer itself warns us about the 512-token limit."
    ]
   },
   {
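For context, here is a minimal sketch of the chunking this cell describes, assuming the Hugging Face `transformers` BertTokenizer. The notebook's actual implementation lives in a code cell not shown in this hunk, so the function and parameter names below are illustrative:

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def chunk_text(long_text: str, chunk_size: int = 510) -> list[str]:
    """Illustrative sketch: split a long text into <=510-token chunks."""
    # Encoding a text longer than 512 tokens triggers the tokenizer's
    # "sequence length is longer than the specified maximum" warning,
    # which transformers emits only once per tokenizer instance --
    # hence the warning appears only on the first run.
    token_ids = tokenizer.encode(long_text, add_special_tokens=False)
    # Chunks of 510 leave room for the [CLS] and [SEP] special tokens
    # that the model adds, keeping each chunk within BERT's 512-token limit.
    id_chunks = [
        token_ids[i : i + chunk_size]
        for i in range(0, len(token_ids), chunk_size)
    ]
    # Map the token ids of each chunk back to text.
    return [tokenizer.decode(ids) for ids in id_chunks]
```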