By default, the Claude Code CLI authenticates with Anthropic and routes every request to their cloud API. You can override this behavior with a single configuration file that points the CLI at your LM Studio server instead. Once the redirect is in place, the CLI works exactly as it normally does: tools, MCP integrations, and context handling all behave identically, except that the model runs on your own hardware and no requests leave your network.
Prerequisites
- Node.js 18 or later installed on your local machine
- LM Studio running on your GPU server with a model loaded and the local server started on port 1234 (see Set up a local LLM with LM Studio)
- Network connectivity to the server: either `localhost` (same machine) or a Tailscale IP for a remote server
Install the Claude Code CLI
Install the official Anthropic Claude Code package globally from npm:
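```bash
npm install -g @anthropic-ai/claude-code
```

Confirm the installation succeeded:

```bash
claude --version
```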
Create the Claude configuration directory
The CLI reads its configuration from `~/.claude/config.json`. Create the directory if it does not already exist:
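```bash
mkdir -p ~/.claude
```

Write the config.json file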
Create or edit `~/.claude/config.json` with the following content. Set `ANTHROPIC_BASE_URL` to the address of your LM Studio server.
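A minimal sketch of the file, assuming the `env` map sits at the top level of the object (the layout the Troubleshooting section below checks for) and using the example Tailscale address from this guide:

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://100.82.56.40:1234",
    "ANTHROPIC_API_KEY": "local"
  }
}
```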
The Tailscale IP `100.82.56.40` shown above is an example. Replace it with the actual Tailscale address of your GPU server. You can find it by running `tailscale ip -4` on the server.

How this works: Setting `ANTHROPIC_BASE_URL` redirects all API calls away from `api.anthropic.com` to your local server. The CLI never contacts Anthropic's OAuth service, so no cloud authentication occurs. The `ANTHROPIC_API_KEY` field is required by the CLI's schema, but its value is not validated by LM Studio; `"local"` is a safe placeholder.

Verify the connection
Run a simple prompt to confirm the CLI is talking to your local model:
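One quick check, using the CLI's non-interactive print flag:

```bash
claude -p "Reply with one word: hello"
```

The model should respond. If LM Studio is the active backend, no Anthropic sign-in prompt appears, and the reply comes from whichever model is loaded in LM Studio.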
Troubleshooting
The CLI hangs or returns a connection error
Confirm that LM Studio's local server is running and bound to `0.0.0.0` on port 1234. Test connectivity directly with `nc -vz <SERVER_IP> 1234`. If using Tailscale, verify both devices are connected to the same tailnet with `tailscale status`.
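Beyond a raw port test, you can also ask the server for its model list, assuming LM Studio's OpenAI-compatible `/v1/models` endpoint is enabled (the default for its local server):

```bash
# Is the port reachable at all?
nc -vz <SERVER_IP> 1234

# Is the server answering? A healthy server returns a JSON list of loaded models.
curl http://<SERVER_IP>:1234/v1/models
```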
The CLI prompts me to sign in to Anthropic
This means `ANTHROPIC_BASE_URL` is not being picked up. Check that `~/.claude/config.json` is valid JSON and that the `env` key is at the top level of the object. You can validate the file with `cat ~/.claude/config.json | python3 -m json.tool`.
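If the file parses, `json.tool` pretty-prints it; if not, the traceback points at the offending line:

```bash
python3 -m json.tool ~/.claude/config.json
```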
Responses are very slow
The model loaded in LM Studio may be too large for your hardware. See Choose the right AI model for guidance on selecting a model that matches your server’s memory capacity.