"""
Starting from v4.46, the `logits` model output will have the same type as the model (except at train time, where it will always be FP32)
"""
This warning is being injected into the generated response, for example:
"""
<|begin_of_text|>Explain: Do Not Go Gentle into That Good Starting from v4.46, the logits model output will have the same type as the model (except at train time, where it will always be FP32)
Night
"""
when running:
python -m vptq --model=VPTQ-community/Meta-Llama-3.1-70B-Instruct-v8-k65536-0-woft --prompt="Explain: Do Not Go Gentle into That Good Night"
Given my not-amazing coding skills, the best fix I can think of is changing eval_prompt to suppress the UserWarning:
import warnings

import transformers

def eval_prompt(model, tokenizer, args):
    inputs = tokenizer(args.prompt, return_tensors="pt").to(model.device)
    streamer = transformers.TextStreamer(tokenizer)
    with warnings.catch_warnings():
        warnings.filterwarnings(
            "ignore",
            message="Starting from v4.46, the `logits` model output will have the same type as the model",
            category=UserWarning,
        )
        model.generate(**inputs, streamer=streamer, max_new_tokens=100, pad_token_id=2)
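As a sanity check (stdlib only; the message string is copied from the warning quoted above), the same filter does match the full warning text when it is raised via warnings.warn — which suggests the text reaching the streamed output may not be going through the warnings machinery at all:

```python
import warnings

# Full warning text as quoted above.
msg = ("Starting from v4.46, the `logits` model output will have the same "
       "type as the model (except at train time, where it will always be FP32)")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # record everything by default
    warnings.filterwarnings(         # same filter as in eval_prompt
        "ignore",
        message="Starting from v4.46, the `logits` model output will have the same type as the model",
        category=UserWarning,
    )
    warnings.warn(msg, UserWarning)

print(len(caught))  # -> 0: the filter matches, so the warning is suppressed
```

So the filter pattern itself is fine; if the text still appears, it is likely emitted through a different channel than warnings.warn.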
Thanks for bringing this to our attention, and sorry you encountered this warning. I also noticed it and will push a fix shortly; please wait a moment.
Alternatively, you are welcome to submit a pull request directly and join as a contributor.
I've tried a few things but haven't found a way to suppress it without globally setting transformers logging to error. My changes to eval_prompt didn't work when I did a pip install -e . of my fork. If I find something, I'll open a pull request.
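One possible explanation for why the warnings.catch_warnings approach has no effect: the message may be emitted through the stdlib logging module (under the "transformers" logger hierarchy) rather than warnings.warn. If that assumption holds, a logging.Filter on the library's handlers can drop just this one message without globally silencing transformers. A minimal stdlib-only sketch (the logger name "transformers" and the propagation behavior are assumptions about the library's setup):

```python
import io
import logging

class DropLogitsWarning(logging.Filter):
    """Pass every log record except the v4.46 logits-dtype message."""
    def filter(self, record):
        return "Starting from v4.46" not in record.getMessage()

# Assumption: the message goes through the "transformers" logger hierarchy.
# Attaching the filter to a handler (rather than the logger) also catches
# records that propagate up from child loggers.
hf_logger = logging.getLogger("transformers")
hf_logger.propagate = False  # transformers normally configures this itself

# Stdlib-only check that the filter drops only the targeted message.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.addFilter(DropLogitsWarning())
hf_logger.addHandler(handler)

hf_logger.warning("Starting from v4.46, the `logits` model output ...")
hf_logger.warning("some other warning")
print(buf.getvalue().strip())  # -> some other warning
```

Unlike setting the verbosity to error, this keeps all other transformers warnings visible.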
"""
Starting from v4.46, the
logits
model output will have the same type as the model (except at train time, where it will always be FP32)"""
Is being injected into the response such as:
"""
<|begin_of_text|>Explain: Do Not Go Gentle into That Good Starting from v4.46, the
logits
model output will have the same type as the model (except at train time, where it will always be FP32)Night
"""
When running
Which, given my not amazing coding skills best I can think to fix is by changing eval_prompt to supress the UserWarning.