ModelPiper
A browser-based UI for building AI pipelines, chatting with models, processing images, and orchestrating workflows. Works with cloud APIs, local models, or both.
Getting Started
ModelPiper connects to AI providers to run inference. You can get started in minutes with any of these options:
ToolPiper
macOS companion app that bundles local inference (llama.cpp, speech-to-text, text-to-speech, image upscaling), proxies cloud APIs securely, and manages model downloads.
OpenRouter
Access 300+ models (GPT-4o, Claude, Gemini, Llama, Mistral, and more) through a single API key. Works directly in the browser — no install required.
Ollama
Run open-source models entirely on your machine. Free, private, no API key needed.
Pull a model (e.g. ollama pull llama3.2), then start the server with browser access enabled: OLLAMA_ORIGINS=* ollama serve
Connecting AI Providers
Open My Connections from the sidebar to manage providers. Each provider becomes a configuration you can use across Pipelines, Chat, and other features.
Cloud Providers
OpenRouter works directly from the browser. Other cloud providers (OpenAI, Anthropic, Google Gemini) require the ToolPiper companion app to proxy requests securely.
Local Providers
Ollama and LM Studio run on your machine and connect over localhost. Models stay on your hardware — nothing leaves your network.
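A quick way to confirm a local Ollama server is reachable before adding it as a connection is to query its /api/tags endpoint, which lists the models you have pulled. A minimal stdlib sketch (the default port 11434 is Ollama's standard; adjust if your setup differs):

```python
# Reachability check for a local Ollama server (default port 11434).
# /api/tags is Ollama's endpoint listing locally pulled models.
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/tags"

def check_ollama(url: str = OLLAMA_URL) -> str:
    """Return a short status string instead of raising on connection errors."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            names = [m["name"] for m in json.load(resp).get("models", [])]
            return f"Ollama is up; pulled models: {names}"
    except (urllib.error.URLError, OSError):
        return "Ollama not reachable; start it with: OLLAMA_ORIGINS=* ollama serve"

print(check_ollama())
```

If the check fails, make sure the server was started with OLLAMA_ORIGINS set, since browsers block cross-origin requests to a server that doesn't allow them.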
Custom Endpoints
Any OpenAI-compatible API can be added as a custom provider. Set the host, port, path, and auth pattern to match your server.
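The four settings above map directly onto the request a custom provider receives. The sketch below composes an OpenAI-style chat completions request from them; the host, port, model name, and key are placeholders, and the Bearer-token header is the common OpenAI auth convention, so match all of them to your server:

```python
# Sketch: how host/port/path/auth settings map to an OpenAI-compatible request.
# All concrete values here are placeholders for illustration.
import json

host, port, path = "localhost", 8080, "/v1/chat/completions"
api_key = "sk-example"  # Bearer-token auth, the usual OpenAI-style pattern

url = f"http://{host}:{port}{path}"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
payload = {
    "model": "my-local-model",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,
}
body = json.dumps(payload)
print(url)
```

Servers that skip authentication (common for localhost-only setups) simply ignore or omit the Authorization header.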
Features
Visual block-based editor. Chain LLM, audio, image, and RAG blocks together with automatic data flow between them.
Multi-model chat interface. Streaming responses, system prompts, conversation history, and provider switching mid-conversation.
CoreML-powered image upscaling with side-by-side comparison. Requires ToolPiper for the upscaling backend.
Save pipeline configurations as reusable templates. Import and export your workspace including all settings and connections.
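The block-chaining idea behind the pipeline editor can be sketched in a few lines. This is a conceptual illustration, not ModelPiper's actual API: each block transforms its input, and the output flows automatically into the next block.

```python
# Conceptual sketch of block chaining: each block's output feeds the next.
# Blocks here are plain functions standing in for LLM/audio/image/RAG stages.
def chain(blocks, value):
    for block in blocks:
        value = block(value)
    return value

uppercase = str.upper
exclaim = lambda s: s + "!"
print(chain([uppercase, exclaim], "hello"))  # -> HELLO!
```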
API Keys & Privacy
Where are my keys stored?
API keys are encrypted with AES-GCM and stored in your browser's localStorage. When ToolPiper is connected, keys are additionally stored in the macOS Keychain. Keys never leave your device.
What data is sent externally?
Only the prompts and parameters you send to your chosen AI provider. ModelPiper has no backend server, no telemetry, and no analytics. Your data flows directly between your browser and the provider.
Companion Apps
ModelPiper works standalone, but companion apps unlock additional capabilities:
ToolPiper
Local AI gateway — inference engines, cloud proxy, model management, speech & audio
VisionPiper
Screen capture, recording, GIF conversion, and live streaming
AudioPiper
Multi-source audio mixer — mic, system audio, and per-app capture
MediaPiper
Browser extension — image & video hover preview, intelligent discovery, on-device AI upscaling