Nvidia
Ollama Environment Configuration
·66 words·1 min
Key environment settings for running Ollama efficiently on Windows and Linux.
Force Ollama to use only a single Nvidia GPU
·163 words·1 min
Selecting which GPUs are visible to Ollama on Windows.
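As a quick illustration of the kind of setting that post covers, a minimal sketch of pinning Ollama to one GPU via the standard `CUDA_VISIBLE_DEVICES` mechanism (which Ollama respects); the GPU index `0` is an example:

```shell
# Linux: expose only the first NVIDIA GPU to Ollama, then start the server.
export CUDA_VISIBLE_DEVICES=0
ollama serve

# Windows (PowerShell equivalent):
#   $env:CUDA_VISIBLE_DEVICES = "0"
#   ollama serve
```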
The Base Mac Mini M4: An Alternative to Low-End NVIDIA Hardware for Inference
·621 words·3 mins
How the Mac Mini M4 enables affordable local LLM inference, making VRAM-starved NVIDIA cards obsolete for low-power, low-cost setups.