Introduction to LLMs
NeoGPT works with both local and hosted language models.
Local models are stored on your device and can be used offline, but they are limited by the resources your device provides.
Hosted models offer greater speed and capability, usually in exchange for usage fees. They are accessible from anywhere and are not constrained by local hardware.
Local Models
NeoGPT currently supports Hugging Face, LlamaCpp, LM Studio, and Ollama as local model backends. Local models let you chat with local files and YouTube videos offline, in any environment. You can choose any of the available models based on your preference and use case.
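As a rough illustration of the local path, the sketch below talks to an Ollama model directly through the `ollama` Python package. The model name `llama3` and the prompt are placeholders, and this is not NeoGPT's internal code, just the kind of call a local backend makes under the hood.

```python
# Minimal sketch: querying a locally served Ollama model.
# Assumes the Ollama daemon is running and the "llama3" model has already
# been pulled -- both are assumptions for this example, not NeoGPT requirements.
import ollama

response = ollama.chat(
    model="llama3",  # placeholder; any locally pulled model works here
    messages=[{"role": "user", "content": "Summarise this paragraph for me."}],
)

# The reply text is found under message.content in the response.
print(response["message"]["content"])
```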
Hosted Models
NeoGPT also supports OpenAI and TogetherAI as hosted model providers. You can use OpenAI's GPT models to perform customised tasks on PDFs and YouTube videos, while TogetherAI offers real-time interaction with NeoGPT, bringing dynamic conversations to the table. 🚀
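For the hosted path, the sketch below uses the official `openai` Python client directly. The model name and the `OPENAI_API_KEY` environment variable are assumptions for illustration rather than NeoGPT's own configuration.

```python
# Minimal sketch: sending a chat request to a hosted OpenAI model.
# Assumes OPENAI_API_KEY is set in the environment (an assumption for this example).
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; choose whichever hosted model you prefer
    messages=[{"role": "user", "content": "Give me three bullet points about this PDF's topic."}],
)

print(completion.choices[0].message.content)
```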