**Last Update**: 25.05.2024
***
- [Ollama](https://ollama.com) lets you run LLMs on your local machine
- [[Ollama - Models | Many models available]]
- Most popular is Llama 3 (from Meta)
- Installers available for macOS, Linux & Windows
- Windows considered *experimental*, but seems okay
- Download the model(s) you want, e.g. `ollama pull llama3`
- Downloaded models can be quite large (often several GB each)
- It is *free*
- In effect you are still paying, but what you pay for is your own hardware
- **Ollama exposes a REST API at `localhost:11434`** (see the sketch after this list)
- Avoids sending data offsite
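A minimal sketch of hitting that local endpoint from Python's standard library, assuming the default port and that a `llama3` model has already been pulled; the `/api/generate` path and its `model` / `prompt` / `stream` fields follow Ollama's documented REST API:

```python
import json
import urllib.request

# Ollama's REST API listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(prompt: str, model: str = "llama3") -> str:
    """Send a single, non-streaming completion request to the local Ollama server."""
    payload = json.dumps({
        "model": model,    # assumes `ollama pull llama3` has already been run
        "prompt": prompt,
        "stream": False,   # one JSON object back instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```

Setting `stream` to `False` trades the token-by-token stream for a single JSON response, which keeps the sketch short; since everything goes to `localhost`, no prompt data leaves the machine.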
***
**References**:
- [Running Uncensored and Open Source LLMs on Your Local Machine](https://www.youtube.com/watch?v=Q0toFxwB_Is)