I've noticed on Reddit, Twitter, etc. that a lot of people have started running local LLMs in LM Studio and similar tools. I have a PR ready that changes the file structure for how we save and load locally stored models, so one can point localpilot directly at models already downloaded through LM Studio instead of re-downloading them (a rough sketch of the idea is below).
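To make the idea concrete, here is a minimal Python sketch of the kind of model discovery this would enable. It assumes LM Studio's default models folder layout (`~/.cache/lm-studio/models/<publisher>/<repo>/<file>.gguf` on macOS/Linux); the helper name is mine, not anything in localpilot today, and the actual PR may organize this differently:

```python
from pathlib import Path

# Hypothetical helper: LM Studio's default download location on macOS/Linux.
# Adjust the root if your install keeps models somewhere else.
LMSTUDIO_MODELS = Path.home() / ".cache" / "lm-studio" / "models"

def find_lmstudio_models(root: Path = LMSTUDIO_MODELS) -> list[Path]:
    """Return every .gguf file found under the LM Studio models folder."""
    if not root.exists():
        return []
    return sorted(root.rglob("*.gguf"))

if __name__ == "__main__":
    for model in find_lmstudio_models():
        # e.g. TheBloke/deepseek-coder-6.7B-base-GGUF/deepseek-coder-6.7b-base.Q3_K_S.gguf
        print(model.relative_to(LMSTUDIO_MODELS))
```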
Would appreciate it if someone could grant me write access so I can open a PR in this repo.
Appreciate your work on localpilot!
LM Studio being supported would be nuts! I'm actually also interested in this...
Who knows, maybe I'll do it, but no guarantees. I'm quite busy already...
EDIT: The debug options for setting a local proxy on Copilot are outdated and don't work anymore. On that note, I tried out "Continue" (the VS Code extension from TogetherAI), downloaded LM Studio, loaded the model deepseek-coder-6.7b-base.Q3_K_S, and it's been impressive so far. Basically Copilot speeds on my RTX 2080 (notebook GPU).
Have fun! Just make sure you have at least 8 GB of VRAM to offload the whole model (GPU offload must be adjusted in LM Studio!) to get similar speeds.
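For anyone wanting to test their setup before wiring up an editor extension: LM Studio can expose an OpenAI-compatible local server (enable it in the app's server tab). Here's a minimal stdlib-only Python sketch, assuming the default port 1234; the model name is just the one from this thread and is illustrative:

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# Port 1234 is the default; change it if you configured something else.
URL = "http://localhost:1234/v1/chat/completions"

payload = {
    # Whatever model you have loaded in LM Studio; this is only an example.
    "model": "deepseek-coder-6.7b-base",
    "messages": [
        {"role": "user", "content": "Write a Python one-liner that reverses a string."}
    ],
    "temperature": 0.2,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```

If it replies at local-GPU speed here, the bottleneck in your editor is the extension, not the model.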
EDIT2: Upon further testing in an actual work environment, the small 6.7B models aren't exactly intelligent, to say the least... When working with libraries, for example, the model has absolutely no clue what to do and just suggests "stuff" that doesn't even make sense in that particular context, even with a 1024-token context and a Q5 quant. That's down to the model, though; a bigger or newer one may do better. Only time will tell.