# Local Models

## Ollama
Ollama is an easy way to get LLMs running on your computer through a CLI.
To run Ollama with NeoGPT:
- Download Ollama for your OS from here.
- Open the installed Ollama application and go through the setup process.
- Now you are ready to download a model. You can see the list of available models here.
- To download a model, run the following command in your terminal:
- It will likely take a few minutes to download the model. Once it's done, you can use it with NeoGPT.
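The download step above uses Ollama's `pull` command; a sketch (the model name `mistral` is just an example, substitute any model from Ollama's library):

```shell
# Download the model weights from the Ollama registry
ollama pull mistral

# List the models available locally to confirm the download
ollama list
```

You can also start an interactive session with `ollama run mistral` to sanity-check the model before pointing NeoGPT at it.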
You can also export the model name as an environment variable and use it with NeoGPT.
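A minimal sketch of exporting the model name, assuming NeoGPT reads it from an environment variable named `MODEL_NAME` (check NeoGPT's configuration docs for the exact variable name it expects):

```shell
# Hypothetical variable name; set it to the model you pulled with Ollama
export MODEL_NAME="mistral"

# Confirm the variable is set in the current shell
echo "$MODEL_NAME"
```

Note that `export` only affects the current shell session; add the line to your shell profile (e.g. `~/.bashrc`) to make it persistent.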