From a8aad2d6f66304e5a683ca7a51d16dd3497b28f5 Mon Sep 17 00:00:00 2001
From: Max Jakob
Date: Thu, 25 Jan 2024 12:00:03 +0100
Subject: [PATCH] comment

---
 notebooks/search/tokenization.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/notebooks/search/tokenization.ipynb b/notebooks/search/tokenization.ipynb
index 0b10f402..db0067e7 100644
--- a/notebooks/search/tokenization.ipynb
+++ b/notebooks/search/tokenization.ipynb
@@ -255,7 +255,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Next we tokenize the long text, create chunks of size 510 tokens and map the tokens back to text. Notice that the BERT tokenizer itself is warning us about the 512 tokens limitation."
+    "Next we tokenize the long text, create chunks of size 510 tokens and map the tokens back to text. Notice that on the first run the BERT tokenizer itself is warning us about the 512 tokens limitation."
    ]
   },
  {
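
For reference, a minimal sketch of the chunking step the edited cell describes, assuming the notebook uses the Hugging Face `transformers` BERT tokenizer; the model name `bert-base-uncased` and the `long_text` variable are assumptions, not taken from the patch:

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

long_text = "..."  # placeholder: the long input text from the notebook

# Tokenize without special tokens so each chunk can hold 510 tokens,
# leaving room for [CLS] and [SEP] within BERT's 512-token limit.
# On the first call with a long input, the tokenizer logs a warning
# that the sequence exceeds the model's maximum length.
token_ids = tokenizer.encode(long_text, add_special_tokens=False)

# Split into chunks of at most 510 tokens and map each back to text.
chunks = [
    tokenizer.decode(token_ids[i : i + 510])
    for i in range(0, len(token_ids), 510)
]
```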