
Does not work in Oobabooga #53

Open
Kaszebe opened this issue Oct 8, 2024 · 1 comment
Labels
inference inference related questions question Further information is requested

Comments


Kaszebe commented Oct 8, 2024

Hello,

I downloaded this last night:

VPTQ-community_Meta-Llama-3.1-405B-Instruct-v16-k65536-1024-woft

I ran it in Oobabooga and it loaded fine, but when I tried to talk to the model (chat-instruct mode), nothing happened. I ran nvidia-smi, and it looked like the model was loaded, but no inferencing was going on.

I tried other V8 and V16 models the same way, and they did not work in Oobabooga either.

I don't have to use Oobabooga; I can use other programs such as Kobold if they work better.

@YangWang92 (Contributor) commented

Please hold on, I'll take a closer look. Thank you.

@YangWang92 YangWang92 added question Further information is requested investigate labels Oct 9, 2024
@YangWang92 YangWang92 added inference inference related questions and removed investigate labels Oct 20, 2024