# OpenClaw + GitHub Copilot GPT-5.4 Technical Fix Guide
Date: 2026-03-07
## Overview

This guide documents how to make `github-copilot/gpt-5.4` work inside OpenClaw when the model already works in OpenCode but fails in OpenClaw.

The final solution requires both:

- a config fix in `~/.openclaw/openclaw.json`
- a runtime patch in the installed OpenClaw bundle

Both are needed because the problem is not only model registration; it is also an OpenClaw transport-routing issue for GitHub Copilot Responses API traffic.
## Symptoms

The following errors may appear during debugging:

**Symptom 1: model rejected**

```
github-copilot/gpt-5.4 ... not allowed
```

**Symptom 2: IDE auth header missing**

```
HTTP 400: bad request: missing Editor-Version header for IDE auth
```

**Symptom 3: unsupported provider mode**

```
No API provider registered for api: github-copilot
```

**Symptom 4: wrong endpoint**

```
HTTP 400: model "gpt-5.4" is not accessible via the /chat/completions endpoint
```

**Symptom 5: gateway instability**

```
gateway disconnected: closed | idle
```

## Root Cause Analysis
There are four distinct problems.

### 1. Model config and allowlist mismatch

OpenClaw could see the provider, but `github-copilot/gpt-5.4` was not fully wired into the active model config path used by the agent defaults.

### 2. Missing GitHub Copilot IDE headers

GitHub Copilot requires IDE-style headers for auth. OpenClaw was sending requests through a generic OpenAI-compatible path, so the required headers were not included.
Required headers:

```
User-Agent: GitHubCopilotChat/0.35.0
Editor-Version: vscode/1.107.0
Editor-Plugin-Version: copilot-chat/0.35.0
Copilot-Integration-Id: vscode-chat
```

Without them, Copilot returns:

```
missing Editor-Version header for IDE auth
```

### 3. GPT-5.4 is not a Chat Completions model
`gpt-5.4` must use the Responses API, not `/chat/completions`.

So this is wrong for gpt-5.4:

```json
"api": "openai-completions"
```

This is required instead:

```json
"api": "openai-responses"
```

### 4. OpenClaw transport routing only handled openai, not github-copilot
Even after switching gpt-5.4 to `openai-responses`, OpenClaw still fell back to the generic stream path, because its embedded runner only activated the Responses transport for the provider `openai`.

As a result, OpenClaw kept hitting `/chat/completions` for GitHub Copilot GPT-5.4.
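The routing defect is easiest to see with the decision extracted into a standalone function. This is an illustrative sketch, not OpenClaw's actual code; the function name and return values are invented for clarity:

```javascript
// Illustrative sketch of the pre-patch transport decision described above.
// Names ("pickTransport", "responses", "simple") are invented; OpenClaw's
// real branch lives inline in the bundled runner.
function pickTransport(api, provider) {
  // Pre-patch behavior: only provider "openai" reaches the Responses transport.
  if (api === "openai-responses" && provider === "openai") {
    return "responses";
  }
  // Everything else falls back to the generic stream path (/chat/completions).
  return "simple";
}

// github-copilot/gpt-5.4 asks for openai-responses but still falls back:
console.log(pickTransport("openai-responses", "github-copilot")); // "simple"
```

The provider check, not the model's API mode, is what decides the transport, which is why fixing the model config alone was not enough.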
## Files Involved

Config file:

```
~/.openclaw/openclaw.json
```

Installed OpenClaw runtime bundle:

```
/home/water/.nvm/versions/node/v22.22.0/lib/node_modules/openclaw/dist/reply-DhtejUNZ.js
```

Reapply script:

```
~/.openclaw/workspace/ken-patchs/reapply-openclaw-copilot-gpt54-patches.mjs
```
## Step 1: Fix the OpenClaw config

Update the GitHub Copilot provider block in `~/.openclaw/openclaw.json`.

### Provider-level requirements

Use:

```json
"baseUrl": "https://api.individual.githubcopilot.com",
"api": "openai-completions"
```

Why keep the provider-level API as `openai-completions`?

- the OpenClaw runtime expects the provider to stay on a supported generic adapter path
- switching the entire provider to `github-copilot` caused runtime/provider registration failures
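Assembled, the provider-level portion looks roughly like this. This is a hedged sketch: the placement of the `github-copilot` key inside openclaw.json depends on your existing config layout, and only the two fields shown are confirmed by this guide:

```json
"github-copilot": {
  "baseUrl": "https://api.individual.githubcopilot.com",
  "api": "openai-completions"
}
```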
### Model-level requirements for GPT-5.4

Set the gpt-5.4 model entry to:

```json
{
  "id": "gpt-5.4",
  "name": "GPT-5.4",
  "api": "openai-responses",
  "reasoning": true,
  "input": ["text", "image"],
  "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
  "contextWindow": 128000,
  "maxTokens": 64000
}
```

### Agent model registration

Make sure this exists:

```json
"agents": { "defaults": { "models": { "github-copilot/gpt-5.4": {} } } }
```

## Step 2: Patch OpenClaw to inject Copilot IDE headers
OpenClaw needs to attach the Copilot IDE headers before sending provider requests.

In `/home/water/.nvm/versions/node/v22.22.0/lib/node_modules/openclaw/dist/reply-DhtejUNZ.js`, add a wrapper like this near the other provider wrappers:

```js
const GITHUB_COPILOT_IDE_HEADERS = {
  "User-Agent": "GitHubCopilotChat/0.35.0",
  "Editor-Version": "vscode/1.107.0",
  "Editor-Plugin-Version": "copilot-chat/0.35.0",
  "Copilot-Integration-Id": "vscode-chat"
};

function createGitHubCopilotHeadersWrapper(baseStreamFn) {
  const underlying = baseStreamFn ?? streamSimple;
  return (model, context, options) => {
    return underlying(model, context, {
      ...options,
      headers: { ...GITHUB_COPILOT_IDE_HEADERS, ...options?.headers }
    });
  };
}
```

Then apply it inside the provider wrapper logic:

```js
if (provider === "github-copilot") agent.streamFn = createGitHubCopilotHeadersWrapper(agent.streamFn);
```

## Step 3: Patch OpenClaw to route GitHub Copilot Responses correctly
Find the branch that decides which stream transport to use.

Original behavior:

```js
} else if (params.model.api === "openai-responses" && params.provider === "openai") {
```

Replace it with:

```js
} else if (params.model.api === "openai-responses" && (params.provider === "openai" || params.provider === "github-copilot")) {
```

Why this matters:

- before the patch, `github-copilot` never entered the Responses transport branch
- OpenClaw fell back to `streamSimple`
- `streamSimple` hit `/chat/completions`
- GPT-5.4 rejected that endpoint

After this patch:

- `github-copilot` + `openai-responses` uses the correct Responses transport
- GPT-5.4 no longer falls back to Chat Completions
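The header merge from the Step 2 wrapper can be sanity-checked in isolation. The sketch below re-declares the wrapper and drives it with a fake stream function standing in for the agent's real `streamFn`; the `streamSimple` fallback is omitted, and `fakeStream` is illustrative test scaffolding, not OpenClaw code:

```javascript
// Re-declaration of the Step 2 wrapper, exercised against a fake stream
// function so the header merge can be observed directly.
const GITHUB_COPILOT_IDE_HEADERS = {
  "User-Agent": "GitHubCopilotChat/0.35.0",
  "Editor-Version": "vscode/1.107.0",
  "Editor-Plugin-Version": "copilot-chat/0.35.0",
  "Copilot-Integration-Id": "vscode-chat"
};

function createGitHubCopilotHeadersWrapper(baseStreamFn) {
  return (model, context, options) =>
    baseStreamFn(model, context, {
      ...options,
      headers: { ...GITHUB_COPILOT_IDE_HEADERS, ...options?.headers }
    });
}

// fakeStream just records the options it receives.
let seen;
const fakeStream = (model, context, options) => { seen = options; };

const wrapped = createGitHubCopilotHeadersWrapper(fakeStream);

// IDE headers are injected, and caller-supplied headers still win on conflict
// because options?.headers is spread last.
wrapped("gpt-5.4", {}, { headers: { "Editor-Version": "vscode/9.9.9" } });
console.log(seen.headers["Copilot-Integration-Id"]); // "vscode-chat"
console.log(seen.headers["Editor-Version"]);         // "vscode/9.9.9"
```

Spreading the defaults first is a deliberate choice: it guarantees the Copilot auth headers are always present while leaving per-call overrides possible.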
## Step 4: Validate and restart

Validate the config JSON:

```bash
node -e "JSON.parse(require('fs').readFileSync('/home/water/.openclaw/openclaw.json','utf8')); console.log('OK')"
```

Validate the patched bundle syntax:

```bash
node --check "/home/water/.nvm/versions/node/v22.22.0/lib/node_modules/openclaw/dist/reply-DhtejUNZ.js"
```

Restart the gateway:

```bash
openclaw gateway restart
```

## Verification Procedure
- Set the model to `github-copilot/gpt-5.4`
- Send a simple prompt like `hi`
- Confirm the gateway stays connected
- Confirm none of these errors return:

```
missing Editor-Version header for IDE auth
model "gpt-5.4" is not accessible via the /chat/completions endpoint
No API provider registered for api: github-copilot
```

## Reapply After OpenClaw Updates
Because the runtime fix patches the installed OpenClaw bundle, upgrades or reinstalls may overwrite it.

Use the reapply script:

```bash
node ~/.openclaw/workspace/ken-patchs/reapply-openclaw-copilot-gpt54-patches.mjs
openclaw gateway restart
```

## Design Notes
### Why not switch the whole provider to `api: "github-copilot"`?

That looked tempting, but in this setup OpenClaw's runtime path had no compatible registered streaming provider for that mode, which caused runtime/provider registration failures.

### Why not keep GPT-5.4 on `openai-completions`?

Because GitHub Copilot GPT-5.4 is not accessible on `/chat/completions`; it must go through the Responses API.

### Why did OpenCode work earlier?

OpenCode already handled the GitHub Copilot transport path correctly, including the required Copilot headers and the proper API mode, while OpenClaw needed both config and runtime fixes.
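One property worth preserving in the reapply script: its transforms should be idempotent, so rerunning it against an already-patched bundle is a no-op. A minimal sketch of the Step 3 routing rewrite as a pure string transform (`applyRoutingPatch` is an illustrative name; the actual reapply-openclaw-copilot-gpt54-patches.mjs also handles the header wrapper and the file I/O):

```javascript
// Idempotent sketch of the Step 3 routing rewrite as a pure string
// transform. ORIGINAL/PATCHED mirror the minified bundle text quoted in
// Step 3; the function name is illustrative.
const ORIGINAL =
  'params.model.api ==="openai-responses"&& params.provider ==="openai"';
const PATCHED =
  'params.model.api ==="openai-responses"&&(params.provider ==="openai"|| params.provider ==="github-copilot")';

function applyRoutingPatch(source) {
  if (source.includes(PATCHED)) return source; // already patched: no-op
  if (!source.includes(ORIGINAL)) {
    throw new Error("routing branch not found; bundle layout may have changed");
  }
  return source.replace(ORIGINAL, PATCHED);
}

const once = applyRoutingPatch("}else if(" + ORIGINAL + "){");
const twice = applyRoutingPatch(once);
console.log(once === twice); // true: the second run changes nothing
```

Failing loudly when neither marker is found is intentional: it surfaces a changed bundle layout instead of silently leaving the transport unpatched.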
## Recommended Maintenance Notes

- Keep this guide alongside the reapply script, with the script path documented nearby
- After any OpenClaw upgrade, rerun the patch script
- If OpenClaw changes its bundle file name, update the script's patch target accordingly
- If GitHub Copilot changes the required IDE header versions, update both the runtime patch and the reapply script
Quick Recovery Commands
node ~/.openclaw/workspace/ken-patchs/reapply-openclaw-copilot-gpt54-patches.mjs openclaw gateway restart openclaw status Final State
With the config fix and runtime patches in place, `github-copilot/gpt-5.4` works in OpenClaw and the gateway remains stable.