You pointed a web app at localhost:11434 and got nothing back. No error in Ollama's logs, no crash, no useful message. The request died silently in your browser's console with a CORS policy error.

Ollama does this on purpose. Out of the box it only trusts a short list of local origins (localhost and 127.0.0.1), and every other cross-origin request gets blocked. The fix is one environment variable, but the docs bury it where you won't find it when you're scanning for answers mid-frustration.

How do you fix Ollama CORS on Mac?

Set OLLAMA_ORIGINS to allow cross-origin requests. The fastest path:

launchctl setenv OLLAMA_ORIGINS "*" && pkill Ollama; open -a Ollama

That tells macOS to accept requests from any origin and restarts the Ollama process. You're unblocked in about five seconds.
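To confirm the variable actually landed in the launchd environment, query it back:

```shell
# Prints the value launchd hands to newly launched apps;
# empty output means the setenv command didn't take effect
launchctl getenv OLLAMA_ORIGINS
```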

The command persists until your next reboot, but not beyond it. To make the setting stick for ollama serve sessions you start from a terminal, add this line to your ~/.zshrc:

export OLLAMA_ORIGINS="*"

Restart your terminal. Ollama picks up the variable the next time you run ollama serve from that shell. Note that the menu-bar app doesn't read your shell config; for it, launchctl setenv is the mechanism that works.

Want to scope it to specific origins instead of the wildcard?

launchctl setenv OLLAMA_ORIGINS "http://localhost:4200,http://localhost:3000"

Comma-separated, no spaces. Each origin is an exact match, so include the port number.

How do you verify the CORS fix worked?

Open your browser's developer console and run:

fetch('http://localhost:11434/api/tags').then(r => r.json()).then(console.log)

Your model list should print to the console. If you still get a CORS error, Ollama hasn't picked up the new variable. Kill and restart: pkill Ollama; open -a Ollama.
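If you'd rather verify from a terminal than a browser console, you can simulate the browser's preflight with curl (this assumes Ollama is running on its default port):

```shell
# Send the same preflight OPTIONS request a browser would issue before a JSON POST.
# With OLLAMA_ORIGINS applied, the response should include an
# Access-Control-Allow-Origin header; no match means CORS is still closed.
curl -is -X OPTIONS http://localhost:11434/api/chat \
  -H "Origin: http://example.com" \
  -H "Access-Control-Request-Method: POST" \
  | grep -i "access-control-allow-origin"
```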

Why does Ollama block browser requests by default?

Ollama's API server listens on localhost:11434 and only sends CORS headers for origins on its allow-list. When a browser makes a cross-origin request - any request from a web page running on a different origin - it enforces CORS. For the JSON POSTs a typical client sends to /api/chat or /api/generate, the browser first sends a preflight OPTIONS request. Ollama responds without the required Access-Control-Allow-Origin header, and the browser kills the actual request before it fires.

This is a security decision, not a bug. Ollama's API can load and unload models, which is a destructive operation you probably don't want any random webpage triggering. Shipping without CORS means every browser-based client has to explicitly opt in. That includes any Ollama frontend running in a browser tab - ModelPiper, Open WebUI on localhost, or a custom dashboard you built yourself.

The tradeoff is real. Security by default is the right call for an API server. But it means every new user hits the same wall on their first day, often without understanding why their request silently failed.

Common gotchas after applying the fix

The setting disappears after a reboot. launchctl setenv doesn't persist across restarts on macOS. If you only used that command, you'll hit the same CORS wall after your next reboot. Adding export OLLAMA_ORIGINS="*" to ~/.zshrc covers ollama serve sessions started from a terminal; the menu-bar app needs launchctl setenv re-run after each login.
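One way to re-apply the variable automatically at login is a small LaunchAgent. This is a sketch; the label com.user.ollama-origins is an arbitrary name of my choosing, not an Ollama convention:

```shell
# Sketch: a LaunchAgent that re-runs launchctl setenv at every login,
# so the variable survives reboots without touching shell config.
cat > ~/Library/LaunchAgents/com.user.ollama-origins.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key><string>com.user.ollama-origins</string>
  <key>ProgramArguments</key>
  <array>
    <string>/bin/launchctl</string>
    <string>setenv</string>
    <string>OLLAMA_ORIGINS</string>
    <string>*</string>
  </array>
  <key>RunAtLoad</key><true/>
</dict>
</plist>
EOF
launchctl load ~/Library/LaunchAgents/com.user.ollama-origins.plist
```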

Homebrew-managed Ollama ignores shell variables. If you installed Ollama with brew services start ollama, it runs as a launchd service that doesn't inherit your shell environment. You need to edit the Homebrew plist or use launchctl setenv to set the variable at the system level, not just in your ~/.zshrc.
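Concretely, for a brew-managed install the launchctl route looks like this (assuming the default formula name ollama):

```shell
# Set the variable in the user's launchd domain, where the brew-managed
# launchd service can see it, then restart the service to pick it up
launchctl setenv OLLAMA_ORIGINS "*"
brew services restart ollama
```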

The setting works but requests still hang. If CORS is fixed (no browser error) but requests take forever, Ollama might be loading a model on first request. Large models (7B+) take 3-5 seconds to load from disk. Wait for the first response, and subsequent requests will be fast.
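You can also take that first-load hit on your own schedule by warming the model before the browser's first request. The model name llama3 below is a placeholder; substitute whatever you've pulled:

```shell
# Per the Ollama API docs, a generate call with an empty prompt loads the
# model into memory without generating any text
curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": ""}'
```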

The alternative: skip CORS entirely

ModelPiper bundles llama.cpp as its own inference engine - same models, same GGUF format, same Metal GPU speed. Because the browser talks directly to ModelPiper's HTTP server, which sends the right CORS headers natively, there's no environment variable to configure and nothing to reset after a reboot.

It also connects to Ollama as an external provider, so your existing models appear alongside ModelPiper's built-in engine. You don't have to choose one or the other.

Download ModelPiper at modelpiper.com, or apply the fix above and keep using Ollama directly.

This is part of a series on Ollama frontends for Mac. Next: Ollama Chat Without Docker on Mac - native alternatives to Open WebUI.