Conversation
|
wow, that's super great, will get time on the weekend to collab on that! thank you for the contribution <3
|
@di-sukharev - any progress on the PR?
|
well, a month has passed and I still don't have enough time to manage this, it's a beautiful contribution! @jaroslaw-weber are there any specific parts you are concerned about?
src/engine/ollama.ts (outdated diff under review):

    // hotfix: local models are not so clever...
    prompt += 'Summarize above git diff in 10 words or less'

    // console.log('prompt length ', prompt.length)
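To make the snippet above concrete, here is a minimal sketch (not the PR's exact code) of appending that hint and sending the prompt to a locally running Ollama server; it assumes Node 18+ for the built-in fetch and Ollama's default port 11434:

```ts
// Minimal sketch, not the PR's actual implementation: append a terse
// instruction for less capable local models, then ask the local Ollama
// server (default port 11434) to complete it. Assumes `ollama run mistral`
// has already pulled the model.
async function summarizeDiff(diff: string): Promise<string> {
  let prompt = diff;
  // hotfix: local models are not so clever...
  prompt += '\nSummarize above git diff in 10 words or less';

  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'mistral', prompt, stream: false }),
  });
  const data = await res.json();
  return (data.response as string).trim();
}
```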
|
@di-sukharev thank you for checking the PR :) my concerns:
I think if those two are working then we can maybe merge a v1.0 version of the local model feature? what we could improve in next PRs:
|
|
I can help with testing and feedback (if you advise how I can build and install this package "locally")
|
@senovr I'm pretty sure it's quite straightforward to run this. try this:
|
My issue is not with installing ollama itself, but with building the npm oco package from source.
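For anyone in the same spot, a hedged sketch of one common way to build and link an npm CLI from source; the repository URL and script names here are assumptions, so check the project's package.json for the actual commands:

```sh
# Assumed workflow for trying the package locally (repo URL and script
# names are guesses -- verify against the project's package.json):
git clone https://github.com/di-sukharev/opencommit.git
cd opencommit
npm install
npm run build   # if the project defines a build script
npm link        # makes the locally built CLI available on PATH
```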
|
@di-sukharev do you think we can deploy soon?
|
Will be merging on Thursday
|
|
hi @jaroslaw-weber, still haven't got a free minute to merge this properly, will be back asap
|
No problem, don't rush :) it's open source, not your job
|
@jaroslaw-weber, @di-sukharev - sorry for being silent for a while. A bit longer answer is: Here is an example of an actual commit message:
|
|
<3 looking into it
|
@di-sukharev, thank you very much for merging this into main.
When I run git add and then git commit, I get a commit message, but without (fix) (chore) (feature). My question to @jaroslaw-weber - if I change the model to something else, will it work?
|
@senovr it may work but those models are usually less powerful than GPT-4. But you can definitely try
|
@senovr I've only tested and merged it, shoutout to @jaroslaw-weber who implemented the feature!
|
@jaroslaw-weber, sorry for bothering you again ) In Dmitry's code, I see this row: Probably in ollama.ts we can do it similarly... My second question probably goes to both of you @di-sukharev and @jaroslaw-weber: probably this is the reason for not having (chore) (fix) (feat) in the commit message... I cannot locate a similar prompt when calling the ChatGPT API, can you tell me if such a prompt exists? BTW, let me know if we need to move this conversation to a new issue )
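For illustration only (the prompt actually used by the ChatGPT engine is not quoted in this thread), a hypothetical instruction that ollama.ts could prepend so the local model emits conventional-commit prefixes might look like this:

```ts
// Hypothetical sketch, not the project's real prompt: prepend a
// conventional-commits instruction so the local model is nudged to output
// a fix/feat/chore prefix.
function buildOllamaPrompt(diff: string): string {
  const instruction =
    'Write a one-line commit message for the git diff below. ' +
    'Use the conventional commits format, e.g. "fix: ...", "feat: ..." or "chore: ...".';
  return `${instruction}\n\n${diff}`;
}
```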
|
@senovr haha lol, let's please create a new issue, could you please create the PR, I will merge once done, please target it to the
|
I have a PR from a little while ago that forks this PR to add support for changing the ollama model. I can update it and reopen it here if anyone is interested.
|
Yeah, forgot to mention that. I had some issues generating a commit message with the original prompt, so I changed it. But feel free to modify that part :) Great that you guys are following up on this
|
As for the hardcoded model, I was going to work on that after my PR got merged, but I got busy by the time it eventually was merged :). Feel free to improve this part.
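A tiny sketch of how the hardcoded model name could become configurable; the variable name OCO_OLLAMA_MODEL is invented here purely for illustration, and a follow-up PR may use a different config key:

```ts
// Hypothetical sketch: read the Ollama model name from the environment
// instead of hardcoding it, falling back to 'mistral'. OCO_OLLAMA_MODEL is
// an invented name used only for this illustration.
export const OLLAMA_MODEL: string = process.env.OCO_OLLAMA_MODEL ?? 'mistral';
```

The engine would then pass OLLAMA_MODEL instead of the literal 'mistral' in the request body it sends to the local server.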
adding ollama support (local model)

how to setup:
`ollama run mistral` to fetch the model (2gb of data)

it's not perfect but can be a start to incorporating other models that are available through ollama.
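As an optional sanity check (not part of the PR itself), one can confirm the local server is reachable on Ollama's default port before pointing the tool at it:

```sh
# Quick check that the local Ollama server responds (default port assumed);
# it returns a JSON object with a "response" field once the model is ready.
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "say hi", "stream": false}'
```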