extend max_prompt_length and input text for 128k evaluation (#891)
* extend max_prompt_length and input text for 128k evaluation

* Extend max_prompt_length and input text for 128k evaluation

---------

Co-authored-by: Logan Adams <114770087+loadams@users.noreply.github.com>
HeyangQin and loadams committed Sep 3, 2024
1 parent 957ae31 commit 1293d45
Showing 1 changed file with 8 additions and 0 deletions: benchmarks/inference/mii/src/client.py
@@ -347,6 +347,14 @@ def run_client(args):
p.start()

tokenizer = AutoTokenizer.from_pretrained(args.model)

# make sure max_prompt_length is at least 3x the target (mean) prompt length
args.max_prompt_length = max(args.max_prompt_length, int(args.mean_prompt_length * 3))
# if all_text is shorter than the max prompt length, double it until it is long enough
global all_text
while len(tokenizer.tokenize(all_text)) < args.max_prompt_length:
all_text += all_text

query_generator = RandomQueryGenerator(all_text, tokenizer, seed=42)
request_text = query_generator.get_random_request_text(
args.mean_prompt_length,
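The added lines above grow `all_text` by repeated doubling until its token count reaches `max_prompt_length`. A minimal standalone sketch of that expansion step is below; the `ensure_min_token_length` helper name is hypothetical, and whitespace splitting stands in for the real Hugging Face tokenizer used in `client.py`:

```python
# Sketch of the diff's text-expansion loop. `tokenize` defaults to
# whitespace splitting as a stand-in for tokenizer.tokenize.
def ensure_min_token_length(text: str, max_prompt_length: int, tokenize=str.split) -> str:
    """Double `text` until it tokenizes to at least max_prompt_length tokens."""
    while len(tokenize(text)) < max_prompt_length:
        text += text  # doubling gives O(log n) iterations
    return text

# A 4-token seed must double 4 times to reach 64 tokens.
expanded = ensure_min_token_length("the quick brown fox ", 64)
print(len(expanded.split()) >= 64)  # True
```

Doubling rather than appending a fixed chunk keeps the number of (potentially slow) tokenizer calls logarithmic in the target length, which matters at 128k-token prompt sizes.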
