Press & Media
Brand assets, company information, and press resources
Logo & Brand Assets
Use the logo on a dark or light background. Don't alter the colors, and don't stretch or rotate the logo. Leave clear space around the logo equal to the height of the "M" mark.
About ModelPiper
ModelPiper is a local-first AI platform for macOS. It lets developers and power users build, run, and chain AI tools entirely on their own hardware — no cloud accounts, no per-token billing, no data leaving the device. The platform includes a visual pipeline builder, a native macOS inference engine (ToolPiper) with 147 MCP tools, browser automation, self-healing test recording, voice AI, screen capture, and audio mixing.
The product suite runs on Apple Silicon and ships as free Mac App Store apps plus a free web interface. All AI processing — LLM inference, speech-to-text, text-to-speech, OCR, pose detection, and image upscaling — runs on-device using llama.cpp, Apple Intelligence, CoreML, and Metal GPU.
ModelPiper is headquartered in the United States and is an independent, bootstrapped company.
Copy-paste friendly. No approval needed for factual press coverage.
Founder
Ben Racicot
Founder & Lead Engineer
Ben Racicot is the founder and lead engineer of ModelPiper. He designs and builds every layer of the platform — from the Swift macOS apps and inference backends to the Angular web interface and MCP server protocol. Before ModelPiper, Ben spent over a decade building web and native applications, with a focus on real-time systems, developer tooling, and performance engineering.
ModelPiper grew out of a conviction that useful AI tools shouldn't require cloud subscriptions or giving up your data. Ben built it because the tools he wanted didn't exist: a local inference engine that works like a platform, not a chatbot.
LinkedIn
Product Suite
ToolPiper
Mac App Store
Native macOS AI engine. Bundles local inference via llama.cpp and Apple Intelligence, 147 MCP tools, browser automation, self-healing test recording (PiperTest), RAG, voice AI, video upscaling, and an OpenAI-compatible API. One install replaces Ollama + Open WebUI + multiple MCP servers.
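Because the API is OpenAI-compatible, any standard chat-completions client should work against it. A minimal TypeScript sketch — the base URL, port, and model name below are illustrative placeholders, not documented ToolPiper defaults:

```typescript
// Assumption: ToolPiper exposes the standard /v1/chat/completions
// route; adjust baseUrl to your local install.
const baseUrl = "http://localhost:8080/v1";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the standard chat-completions request body.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Send the request; any OpenAI-compatible server accepts this shape.
async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  const data = await res.json();
  return data.choices[0].message.content as string;
}

// Example request body (no network call made here):
const body = buildChatRequest("local-model", [
  { role: "user", content: "Hello" },
]);
console.log(JSON.stringify(body));
```

Since the wire format matches OpenAI's, existing SDKs and tools that accept a custom base URL should also point at it unchanged.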
Mac App Store →
ModelPiper Web App
Free
Visual AI pipeline builder in the browser. Drag-and-drop blocks for LLMs, vision models, TTS engines, and custom logic. Streams data between steps as it arrives. Supports Ollama, OpenAI, Anthropic, Groq, and any OpenAI-compatible endpoint.
modelpiper.com →
System Actions
Built into ToolPiper
macOS desktop automation via voice. Push-to-talk dictation, natural language commands, clipboard intelligence, and AI-powered snippets. 26 action domains covering audio, display, Bluetooth, Finder, windows, and more — 142 actions total, all built into ToolPiper.
VisionPiper
Mac App Store
Screen capture with live AI streaming. Capture any region, record H.264 video, export GIFs, and stream frames to vision models at 30fps over WebSocket. A steerable camera for your screen.
Mac App Store →
AudioPiper
Mac App Store
Multi-source audio capture and mixing. Per-app audio via Core Audio Taps — no virtual drivers, no kernel extensions. Record from any app and stream to ToolPiper for real-time transcription.
Mac App Store →
MediaPiper
Browser Extension
Browser extension for Chrome, Firefox, and Safari. Hover any image or video for instant full-size preview with intelligent discovery, on-device AI upscaling, video controls, and 40+ keyboard shortcuts.
Chrome Web Store →
Press Releases
Key Facts
Social & Community
Media Contact
For press inquiries, interview requests, or additional assets:
press@modelpiper.com