Ollama: Run Large Language Models Locally
Ollama is an open-source tool for running large language models on local hardware on Mac, Linux, and Windows. It handles model downloading, quantisation, and serving through a simple CLI and an API compatible with the OpenAI API format. With Ollama, a practitioner can run Llama 3.3, Mistral, Phi-4, or dozens of other models locally without cloud API costs or data leaving the device. AICI recommends Ollama as the most accessible way for AI governance practitioners to develop firsthand understanding of how language models behave, including their failure modes, their sensitivity to prompting, and the gap between benchmark performance and real-world utility. You cannot govern what you have not used.
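As a minimal sketch of what the OpenAI-compatible API looks like in practice: Ollama's local server listens on port 11434 by default and accepts chat requests at `/v1/chat/completions`. The snippet below builds such a request; the model name `llama3.3` is an example and must first be fetched with `ollama pull llama3.3`. The actual network call is left commented out, since it assumes a running Ollama server.

```python
import json

# Ollama's default local endpoint, in the OpenAI chat-completions format.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

# Example payload; "llama3.3" is an assumed, already-pulled model.
payload = {
    "model": "llama3.3",
    "messages": [
        {"role": "user", "content": "In one sentence, what is quantisation?"}
    ],
    "stream": False,  # return a single response instead of a token stream
}

body = json.dumps(payload)
print(body)

# To actually send the request (requires `ollama serve` to be running):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```

Because the request format matches the OpenAI API, existing client libraries and tools can usually be pointed at the local server simply by changing the base URL.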