Jack Romano
I am concerned about data privacy and don't want to send my data to the OpenAI model. I would like to use a local model, but the local model provided by Curiosity is not sufficient because it runs on the CPU. I would like to see Ollama implemented as a choice of AI assistant. I tried to use the OpenAI compatibility feature of Ollama, but it didn't work. Please consider adding this as a feature.
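
For reference, this is roughly what I attempted via Ollama's OpenAI-compatible endpoint; it's only a minimal sketch, assuming the default local Ollama server at `http://localhost:11434/v1`, and the model name `llama3` is just whatever I had pulled locally:

```python
# Minimal sketch: pointing an OpenAI-compatible client at a local Ollama server.
# Assumes Ollama is running on its default port; "llama3" is a placeholder model.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # Ollama ignores the key, but the client library requires one
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Hello from a local model"}],
)
print(response.choices[0].message.content)
```

Something along these lines works when calling Ollama directly, so I was hoping the same endpoint could be used from within the app, but I couldn't get it configured.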