ServiceHub

AI-Native Service Management for macOS

⌘

Menu Bar App

One-click start/stop for all your services from the macOS menu bar. No terminal needed.

🔀

AI Proxy Hub

Route Claude Code to GPT-5.4, Llama 3.1 70B, or any OpenAI-compatible backend in one command.

🛠

Universal Service Manager

Homebrew, launchd, Docker, and custom binaries: all managed in one place.

Built for AI Developers

ServiceHub manages your AI inference stack. Proxy Claude Code to any model, with no vendor lock-in.

Claude Code

→ GPT-5.4

via sub2api

Claude Code

→ Llama 3.1 70B

via NVIDIA API

one-api

OpenAI Proxy

Web UI included

quectoGPT

Full AI Platform

Web · Coordinator · Static

Terminal
# Route Claude Code to GPT-5.4 via sub2api
$ servicehub start anthropic-proxy
# Switch to NVIDIA API (Llama 3.1 70B)
$ servicehub start anthropic-proxy-nvidia
# Launch your full AI stack at once
$ servicehub start one-api quecto-web quecto-coordinator
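Under the hood, repointing Claude Code at a local proxy typically comes down to two environment variables that Claude Code reads at startup. This is a minimal sketch of that mechanism; the listen address and token below are illustrative assumptions, not values ServiceHub guarantees, and the actual address depends on how the proxy service is configured.

```shell
# Sketch: redirect Claude Code's API traffic to a locally running proxy.
# The URL and token values here are illustrative assumptions.
export ANTHROPIC_BASE_URL="http://localhost:8080"   # proxy's listen address
export ANTHROPIC_AUTH_TOKEN="local-proxy-key"       # key the proxy expects
# With these set, launching Claude Code ("claude") sends every request to
# the proxy, which forwards it to the configured backend model.
```

Unsetting both variables returns Claude Code to its default Anthropic endpoint, so switching backends is just a matter of swapping the proxy address.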