Unity MCP CI Test Improvements (#452)
* Update github-repo-stats.yml
* Server: refine shutdown logic per bot feedback
  - Parameterize _force_exit(code) and use timers with args
  - Consistent behavior on BrokenPipeError (no immediate exit)
  - Exit code 1 on unexpected exceptions
  - Tests: restore telemetry module after disabling to avoid bleed-over
* Revert "Server: refine shutdown logic per bot feedback" (reverts commit 74d35d371a28b2d86cb7722e28017b29be053efd)
* Add fork-only Unity tests workflow and guard upstream run
* Move fork Unity tests workflow to root
* Fix MCP server install step in NL suite workflow
* Harden NL suite prompts for deterministic anchors
* update claude haiku version for NL/T tests
* Fix CI: share unity-mcp status dir
* update yaml
* Add Unity bridge debug step in CI
* Fail fast when Unity MCP status file missing
* Allow Unity local share writable for MCP status
* Mount Unity cache rw and dump Editor log for MCP debug
* Allow Unity config dir writable for MCP heartbeat/logs
* Write Unity logs to file and list config dir in debug
* Use available Anthropic models for T pass
* Use latest claude sonnet/haiku models in workflow
* Fix YAML indentation for MCP preflight step
* Point MCP server to src/server.py and fix preflight
* another try
* Add MCP preflight workflow and update NL suite
* Fixes to improve CI testing
* Cleanup
* fixes
* diag
* fix yaml
* fix status dir
* Fix YAML / printing to stdout --> stderr
* find_in_file fixes
* fixes to find_in_file and CI report format error
* Only run the stats on the CoPlay main repo, not forks.
* Coderabbit fixes.
parent a9878622ea
commit d06eaefa8a
@@ -103,25 +103,27 @@ STRICT OP GUARDRAILS
 **Goal**: Demonstrate method replacement operations
 **Actions**:
 - Replace `HasTarget()` method body: `public bool HasTarget() { return currentTarget != null; }`
-- Insert `PrintSeries()` method after `GetCurrentTarget()`: `public void PrintSeries() { Debug.Log("1,2,3"); }`
-- Verify both methods exist and are properly formatted
+- Validate.
+- Insert `PrintSeries()` method after a unique anchor method. Prefer `GetCurrentTarget()` if unique; otherwise use another unique method such as `ApplyBlend`. Insert: `public void PrintSeries() { Debug.Log("1,2,3"); }`
+- Validate that both methods exist and are properly formatted.
 - Delete `PrintSeries()` method (cleanup for next test)
 - **Expected final state**: `HasTarget()` modified, file structure intact, no temporary methods
 
 ### NL-2. Anchor Comment Insertion (Additive State B)
 **Goal**: Demonstrate anchor-based insertions above methods
 **Actions**:
-- Use `find_in_file` to locate current position of `Update()` method
+- Use `find_in_file` with a tolerant anchor to locate the `Update()` method, e.g. `(?m)^\\s*(?:public|private|protected|internal)?\\s*void\\s+Update\\s*\(\s*\)`
+- Expect exactly one match; if multiple, fail clearly rather than guessing.
 - Insert `// Build marker OK` comment line above `Update()` method
 - Verify comment exists and `Update()` still functions
 - **Expected final state**: State A + build marker comment above `Update()`
 
 ### NL-3. End-of-Class Content (Additive State C)
-**Goal**: Demonstrate end-of-class insertions with smart brace matching
+**Goal**: Demonstrate end-of-class insertions without ambiguous anchors
 **Actions**:
-- Match the final class-closing brace by scanning from EOF (e.g., last `^\s*}\s*$`)
-  or compute via `find_in_file` + ranges; insert immediately before it.
-- Insert three comment lines before final class brace:
+- Use `find_in_file` to locate brace-only lines (e.g., `(?m)^\\s*}\\s*$`). Select the **last** such line (preferably indentation 0 if multiples).
+- Compute an exact insertion point immediately before that last brace using `apply_text_edits` (do not use `anchor_insert` for this step).
+- Insert three comment lines before the final class brace:
 ```
 // Tail test A
 // Tail test B
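The tolerant anchor introduced in NL-2 can be sanity-checked outside Unity. This is a minimal sketch: the C# fragment below is a made-up stand-in for the real test script, not the file from the repo.

```python
import re

# Tolerant anchor from the NL-2 prompt: optional access modifier, flexible whitespace.
ANCHOR = r"(?m)^\s*(?:public|private|protected|internal)?\s*void\s+Update\s*\(\s*\)"

# Hypothetical C# fragment standing in for the actual Unity test script.
source = """
public class LongUnityScriptClaudeTest : MonoBehaviour {
    private void Update() {
    }
    public bool HasTarget() { return currentTarget != null; }
}
"""

matches = re.findall(ANCHOR, source)
# The prompt requires exactly one match; anything else should fail loudly.
assert len(matches) == 1, f"expected 1 match, got {len(matches)}"
print(len(matches))  # → 1
```

The non-capturing group keeps `findall` returning whole matches, so counting matches is enough to enforce the "exactly one anchor" rule before attempting an insert.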
@@ -159,7 +161,7 @@ find_in_file(pattern: "public bool HasTarget\\(\\)")
 **Anchor-based insertions:**
 ```json
-{"op": "anchor_insert", "anchor": "private void Update\\(\\)", "position": "before", "text": "// comment"}
+{"op": "anchor_insert", "anchor": "(?m)^\\s*(?:public|private|protected|internal)?\\s*void\\s+Update\\s*\\(\\s*\\)", "position": "before", "text": "// comment"}
 ```
 
 ---
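The NL-3 flow (find the last brace-only line, insert immediately before it) can also be sketched in plain Python. The class body here is hypothetical, and the edit shown is only an illustration of what an exact-position text edit would compute:

```python
import re

# Hypothetical class body; the real target is the Unity test script in the repo.
source = (
    "public class Demo {\n"
    "    void Update() {\n"
    "    }\n"
    "}\n"
)

# Find every brace-only line, as NL-3 suggests, then keep the last one (the class-closing brace).
matches = list(re.finditer(r"(?m)^\s*}\s*$", source))
last = matches[-1]
line_no = source.count("\n", 0, last.start()) + 1  # 1-based line of the final brace

# Insert tail comments immediately before that line, as an exact-range edit would.
insertion = "    // Tail test A\n    // Tail test B\n"
patched = source[: last.start()] + insertion + source[last.start():]
print(line_no)                              # → 4
print(patched.splitlines()[-3].strip())     # → // Tail test A
```

Selecting the last match by offset avoids the ambiguity that makes regex-anchored insertion unreliable for closing braces: every method body ends with a line that also matches `^\s*}\s*$`.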
@@ -0,0 +1,55 @@
+name: Claude MCP Preflight (no Unity)
+
+on: [workflow_dispatch]
+
+permissions:
+  contents: read
+
+jobs:
+  mcp-preflight:
+    runs-on: ubuntu-latest
+    timeout-minutes: 15
+
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
+      - uses: astral-sh/setup-uv@v4
+        with:
+          python-version: "3.11"
+
+      - name: Install MCP server deps
+        run: |
+          set -eux
+          uv venv
+          echo "VIRTUAL_ENV=$GITHUB_WORKSPACE/.venv" >> "$GITHUB_ENV"
+          echo "$GITHUB_WORKSPACE/.venv/bin" >> "$GITHUB_PATH"
+          if [ -f Server/pyproject.toml ]; then
+            uv pip install -e Server
+          elif [ -f Server/requirements.txt ]; then
+            uv pip install -r Server/requirements.txt
+          else
+            echo "No MCP Python deps found" >&2
+            exit 1
+          fi
+
+      - name: Preflight MCP server (stdio)
+        env:
+          PYTHONUNBUFFERED: "1"
+          MCP_LOG_LEVEL: debug
+          UNITY_PROJECT_ROOT: ${{ github.workspace }}/TestProjects/UnityMCPTests
+          UNITY_MCP_STATUS_DIR: ${{ github.workspace }}/.unity-mcp-dummy
+          UNITY_MCP_HOST: 127.0.0.1
+        run: |
+          set -euxo pipefail
+          mkdir -p "$UNITY_MCP_STATUS_DIR"
+          # Create a dummy status file with an unreachable port; help should not require it
+          cat > "$UNITY_MCP_STATUS_DIR/unity-mcp-status-dummy.json" <<JSON
+          { "unity_port": 0, "reason": "dummy", "reloading": false, "project_path": "$UNITY_PROJECT_ROOT/Assets" }
+          JSON
+          uv run --active --directory Server mcp-for-unity --transport stdio --help \
+            > /tmp/mcp-preflight.log 2>&1 || { cat /tmp/mcp-preflight.log; exit 1; }
+          cat /tmp/mcp-preflight.log
@@ -15,7 +15,7 @@ env:
 jobs:
   nl-suite:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-24.04
     timeout-minutes: 60
     env:
       JUNIT_OUT: reports/junit-nl-suite.xml
@@ -62,9 +62,6 @@ jobs:
           else
            echo "No MCP Python deps found (skipping)"
           fi
-          else
-           echo "No MCP Python deps found (skipping)"
-          fi
 
       # --- Licensing: allow both ULF and EBL when available ---
       - name: Decide license sources
@@ -123,7 +120,7 @@ jobs:
           UNITY_PASSWORD: ${{ secrets.UNITY_PASSWORD }}
           UNITY_SERIAL: ${{ secrets.UNITY_SERIAL }}
         run: |
-          set -euxo pipefail
+          set -euo pipefail
           # host dirs to receive the full Unity config and local-share
           mkdir -p "$RUNNER_TEMP/unity-config" "$RUNNER_TEMP/unity-local"
@@ -159,7 +156,7 @@ jobs:
       # ---------- Warm up project (import Library once) ----------
       - name: Warm up project (import Library once)
-        if: steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true'
+        if: steps.detect.outputs.anthropic_ok == 'true' && (steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true')
         shell: bash
         env:
           UNITY_IMAGE: ${{ env.UNITY_IMAGE }}
@@ -172,11 +169,12 @@ jobs:
           fi
           docker run --rm --network host \
             -e HOME=/root \
-            -v "${{ github.workspace }}:/workspace" -w /workspace \
+            -v "${{ github.workspace }}:${{ github.workspace }}" -w "${{ github.workspace }}" \
             -v "$RUNNER_TEMP/unity-config:/root/.config/unity3d" \
             -v "$RUNNER_TEMP/unity-local:/root/.local/share/unity3d" \
+            -v "$RUNNER_TEMP/unity-cache:/root/.cache/unity3d" \
             "$UNITY_IMAGE" /opt/unity/Editor/Unity -batchmode -nographics -logFile - \
-            -projectPath /workspace/TestProjects/UnityMCPTests \
+            -projectPath "${{ github.workspace }}/TestProjects/UnityMCPTests" \
             "${manual_args[@]}" \
             -quit
@@ -184,12 +182,12 @@ jobs:
       - name: Clean old MCP status
         run: |
           set -eux
-          mkdir -p "$HOME/.unity-mcp"
-          rm -f "$HOME/.unity-mcp"/unity-mcp-status-*.json || true
+          mkdir -p "$GITHUB_WORKSPACE/.unity-mcp"
+          rm -f "$GITHUB_WORKSPACE/.unity-mcp"/unity-mcp-status-*.json || true
 
       # ---------- Start headless Unity (persistent bridge) ----------
       - name: Start Unity (persistent bridge)
-        if: steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true'
+        if: steps.detect.outputs.anthropic_ok == 'true' && (steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true')
         shell: bash
         env:
           UNITY_IMAGE: ${{ env.UNITY_IMAGE }}
@@ -201,29 +199,30 @@ jobs:
            manual_args=(-manualLicenseFile "/root/.local/share/unity3d/Unity/Unity_lic.ulf")
           fi
 
-          mkdir -p "$RUNNER_TEMP/unity-status"
+          mkdir -p "$GITHUB_WORKSPACE/.unity-mcp"
           docker rm -f unity-mcp >/dev/null 2>&1 || true
           docker run -d --name unity-mcp --network host \
             -e HOME=/root \
             -e UNITY_MCP_ALLOW_BATCH=1 \
-            -e UNITY_MCP_STATUS_DIR=/root/.unity-mcp \
+            -e UNITY_MCP_STATUS_DIR="${{ github.workspace }}/.unity-mcp" \
             -e UNITY_MCP_BIND_HOST=127.0.0.1 \
-            -v "${{ github.workspace }}:/workspace" -w /workspace \
-            -v "$RUNNER_TEMP/unity-status:/root/.unity-mcp" \
-            -v "$RUNNER_TEMP/unity-config:/root/.config/unity3d:ro" \
-            -v "$RUNNER_TEMP/unity-local:/root/.local/share/unity3d:ro" \
-            "$UNITY_IMAGE" /opt/unity/Editor/Unity -batchmode -nographics -logFile - \
+            -v "${{ github.workspace }}:${{ github.workspace }}" -w "${{ github.workspace }}" \
+            -v "$RUNNER_TEMP/unity-config:/root/.config/unity3d" \
+            -v "$RUNNER_TEMP/unity-local:/root/.local/share/unity3d" \
+            -v "$RUNNER_TEMP/unity-cache:/root/.cache/unity3d" \
+            "$UNITY_IMAGE" /opt/unity/Editor/Unity -batchmode -nographics -logFile /root/.config/unity3d/Editor.log \
             -stackTraceLogType Full \
-            -projectPath /workspace/TestProjects/UnityMCPTests \
+            -projectPath "${{ github.workspace }}/TestProjects/UnityMCPTests" \
             "${manual_args[@]}" \
-            -executeMethod MCPForUnity.Editor.Services.Transport.Transports.StdioBridgeHost.StartAutoConnect
+            -executeMethod MCPForUnity.Editor.McpCiBoot.StartStdioForCi
 
       # ---------- Wait for Unity bridge ----------
       - name: Wait for Unity bridge (robust)
+        if: steps.detect.outputs.anthropic_ok == 'true' && (steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true')
         shell: bash
         run: |
           set -euo pipefail
-          deadline=$((SECONDS+900)) # 15 min max
+          deadline=$((SECONDS+600)) # 10 min max
           fatal_after=$((SECONDS+120)) # give licensing 2 min to settle
 
           # Fail fast only if container actually died
@@ -239,15 +238,18 @@ jobs:
           logs="$(docker logs unity-mcp 2>&1 || true)"
 
           # 1) Primary: status JSON exposes TCP port
-          port="$(jq -r '.unity_port // empty' "$RUNNER_TEMP"/unity-status/unity-mcp-status-*.json 2>/dev/null | head -n1 || true)"
+          port="$(jq -r '.unity_port // empty' "$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json 2>/dev/null | head -n1 || true)"
           if [[ -n "${port:-}" ]] && timeout 1 bash -lc "exec 3<>/dev/tcp/127.0.0.1/$port"; then
             echo "Bridge ready on port $port"
+            # Ensure status file is readable by all (Claude container might run as different user)
+            docker exec unity-mcp chmod -R a+rwx "$GITHUB_WORKSPACE/.unity-mcp" || chmod -R a+rwx "$GITHUB_WORKSPACE/.unity-mcp" || true
             exit 0
           fi
 
           # 2) Secondary: log markers
           if echo "$logs" | grep -qiE "$ok_pat"; then
             echo "Bridge ready (log markers)"
+            docker exec unity-mcp chmod -R a+rwx "$GITHUB_WORKSPACE/.unity-mcp" || chmod -R a+rwx "$GITHUB_WORKSPACE/.unity-mcp" || true
             exit 0
           fi
@@ -272,31 +274,43 @@ jobs:
           docker logs unity-mcp --tail 200 | sed -E 's/((email|serial|license|password|token)[^[:space:]]*)/[REDACTED]/Ig'
           exit 1
 
-      # (moved) — return license after Unity is stopped
-
-      # ---------- MCP client config ----------
-      - name: Write MCP config (.claude/mcp.json)
+      # ---------- Debug Unity bridge status ----------
+      - name: Debug Unity bridge status
+        if: always() && (steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true')
+        shell: bash
         run: |
-          set -eux
-          mkdir -p .claude
-          cat > .claude/mcp.json <<JSON
-          {
-            "mcpServers": {
-              "unity": {
-                "command": "uv",
-                "args": ["run","--active","--directory","Server","python","server.py"],
-                "transport": { "type": "stdio" },
-                "env": {
-                  "PYTHONUNBUFFERED": "1",
-                  "MCP_LOG_LEVEL": "debug",
-                  "UNITY_PROJECT_ROOT": "$GITHUB_WORKSPACE/TestProjects/UnityMCPTests",
-                  "UNITY_MCP_STATUS_DIR": "$RUNNER_TEMP/unity-status",
-                  "UNITY_MCP_HOST": "127.0.0.1"
-                }
-              }
-            }
-          }
-          JSON
+          set -euxo pipefail
+          echo "--- Unity container state ---"
+          docker inspect -f '{{.State.Status}} {{.State.ExitCode}}' unity-mcp || true
+          echo "--- Unity container logs (tail 200) ---"
+          docker logs unity-mcp --tail 200 | sed -E 's/((email|serial|license|password|token)[^[:space:]]*)/[REDACTED]/Ig' || true
+          echo "--- Container status dir ---"
+          docker exec unity-mcp ls -la "${{ github.workspace }}/.unity-mcp" || true
+          echo "--- Host status dir ---"
+          ls -la "$GITHUB_WORKSPACE/.unity-mcp" || true
+          echo "--- Host status file (first 120 lines) ---"
+          jq -r . "$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json | sed -n '1,120p' || true
+          echo "--- Port probe from host ---"
+          port="$(jq -r '.unity_port // empty' "$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json 2>/dev/null | head -n1 || true)"
+          echo "unity_port=${port:-}"
+          if [[ -n "${port:-}" ]]; then
+            timeout 1 bash -lc "exec 3<>/dev/tcp/127.0.0.1/$port" && echo "TCP OK" || echo "TCP probe failed"
+          else
+            echo "No unity_port in status file"
+          fi
+          echo "--- Config dir listing ---"
+          docker exec unity-mcp ls -la /root/.config/unity3d || true
+          echo "--- Editor log tail ---"
+          docker exec unity-mcp tail -n 200 /root/.config/unity3d/Editor.log || true
+          # Fail fast if no status file was written
+          shopt -s nullglob
+          status_files=("$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json)
+          if ((${#status_files[@]} == 0)); then
+            echo "::error::No Unity MCP status file found; failing fast."
+            exit 1
+          fi
 
+      # (moved) — return license after Unity is stopped
 
       - name: Pin Claude tool permissions (.claude/settings.json)
         run: |
@@ -307,11 +321,11 @@ jobs:
           "permissions": {
             "allow": [
               "mcp__unity",
-              "Edit(reports/**)"
+              "Edit(reports/**)",
+              "MultiEdit(reports/**)"
             ],
             "deny": [
               "Bash",
-              "MultiEdit",
               "WebFetch",
               "WebSearch",
               "Task",
@@ -346,11 +360,11 @@ jobs:
       - name: Verify Unity bridge status/port
         run: |
           set -euxo pipefail
-          ls -la "$RUNNER_TEMP/unity-status" || true
-          jq -r . "$RUNNER_TEMP"/unity-status/unity-mcp-status-*.json | sed -n '1,80p' || true
+          ls -la "$GITHUB_WORKSPACE/.unity-mcp" || true
+          jq -r . "$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json | sed -n '1,80p' || true
 
           shopt -s nullglob
-          status_files=("$RUNNER_TEMP"/unity-status/unity-mcp-status-*.json)
+          status_files=("$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json)
           if ((${#status_files[@]})); then
             port="$(grep -hEo '"unity_port"[[:space:]]*:[[:space:]]*[0-9]+' "${status_files[@]}" \
               | sed -E 's/.*: *([0-9]+).*/\1/' | head -n1 || true)"
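The workflow derives a default instance id of the form `project@hash` from the status-file name `unity-mcp-status-<hash>.json`. A tested Python equivalent of that bash parameter expansion, using a made-up filename and project name:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical status file, shaped like unity-mcp-status-<hash>.json.
tmp = Path(tempfile.mkdtemp())
status = tmp / "unity-mcp-status-abc123.json"
status.write_text(json.dumps({"project_name": "UnityMCPTests", "unity_port": 6400}))

# Mirror the bash: strip the .json suffix, then the unity-mcp-status- prefix.
hash_part = status.name.removesuffix(".json").removeprefix("unity-mcp-status-")
proj = json.loads(status.read_text()).get("project_name", "")
default_instance = f"{proj}@{hash_part}" if proj and hash_part else ""
print(default_instance)  # → UnityMCPTests@abc123
```

Guarding on both parts being non-empty matches the workflow's behavior of only exporting `UNITY_MCP_DEFAULT_INSTANCE` when the id can actually be formed.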
@@ -363,13 +377,266 @@ jobs:
             timeout 1 bash -lc "exec 3<>/dev/tcp/127.0.0.1/$port" && echo "TCP OK"
           fi
 
-          # (removed) Revert helper and baseline snapshot are no longer used
+          if ((${#status_files[@]})); then
+            first_status="${status_files[0]}"
+            fname="$(basename "$first_status")"
+            hash_part="${fname%.json}"; hash_part="${hash_part#unity-mcp-status-}"
+            proj="$(jq -r '.project_name // empty' "$first_status" || true)"
+            if [[ -n "${proj:-}" && -n "${hash_part:-}" ]]; then
+              echo "UNITY_MCP_DEFAULT_INSTANCE=${proj}@${hash_part}" >> "$GITHUB_ENV"
+              echo "Default instance set to ${proj}@${hash_part}"
+            fi
+          fi
+
+      # ---------- MCP client config ----------
+      - name: Write MCP config (.claude/mcp.json)
+        run: |
+          set -eux
+          mkdir -p .claude
+          python3 - <<'PY'
+          import json
+          import os
+          import textwrap
+          from pathlib import Path
+
+          workspace = os.environ["GITHUB_WORKSPACE"]
+          default_inst = os.environ.get("UNITY_MCP_DEFAULT_INSTANCE", "").strip()
+
+          cfg = {
+              "mcpServers": {
+                  "unity": {
+                      "args": [
+                          "run",
+                          "--active",
+                          "--directory",
+                          "Server",
+                          "mcp-for-unity",
+                          "--transport",
+                          "stdio",
+                      ],
+                      "transport": {"type": "stdio"},
+                      "env": {
+                          "PYTHONUNBUFFERED": "1",
+                          "MCP_LOG_LEVEL": "debug",
+                          "UNITY_PROJECT_ROOT": f"{workspace}/TestProjects/UnityMCPTests",
+                          "UNITY_MCP_STATUS_DIR": f"{workspace}/.unity-mcp",
+                          "UNITY_MCP_HOST": "127.0.0.1",
+                      },
+                  }
+              }
+          }
+
+          unity = cfg["mcpServers"]["unity"]
+          if default_inst:
+              unity["env"]["UNITY_MCP_DEFAULT_INSTANCE"] = default_inst
+              if "--default-instance" not in unity["args"]:
+                  unity["args"] += ["--default-instance", default_inst]
+
+          runner_script = Path(".claude/run-unity-mcp.sh")
+          workspace_path = Path(workspace)
+          uv_candidate = workspace_path / ".venv" / "bin" / "uv"
+          uv_cmd = uv_candidate.as_posix() if uv_candidate.exists() else "uv"
+          script = textwrap.dedent(f"""\
+          #!/usr/bin/env bash
+          set -euo pipefail
+          LOG="{workspace}/.unity-mcp/mcp-server-startup-debug.log"
+          mkdir -p "$(dirname "$LOG")"
+          echo "" >> "$LOG"
+          echo "[ $(date -Iseconds) ] Starting unity MCP server" >> "$LOG"
+          # Redirect stderr to log, keep stdout for MCP communication
+          exec {uv_cmd} "$@" 2>> "$LOG"
+          """)
+          runner_script.write_text(script)
+          runner_script.chmod(0o755)
+
+          unity["command"] = runner_script.resolve().as_posix()
+
+          path = Path(".claude/mcp.json")
+          path.write_text(json.dumps(cfg, indent=2) + "\n")
+          print(f"Wrote {path} and {runner_script} (UNITY_MCP_DEFAULT_INSTANCE={default_inst or 'unset'})")
+          PY
+
+      - name: Debug MCP config
+        run: |
+          set -eux
+          echo "=== .claude/mcp.json ==="
+          cat .claude/mcp.json
+          echo ""
+          echo "=== Status dir contents ==="
+          ls -la "$GITHUB_WORKSPACE/.unity-mcp" || true
+          echo ""
+          echo "=== Status file content ==="
+          cat "$GITHUB_WORKSPACE"/.unity-mcp/unity-mcp-status-*.json 2>/dev/null || echo "(no status files)"
+
+      - name: Preflight MCP server (with retries)
+        env:
+          UNITY_MCP_DEFAULT_INSTANCE: ${{ env.UNITY_MCP_DEFAULT_INSTANCE }}
+        run: |
+          set -euxo pipefail
+          export PYTHONUNBUFFERED=1
+          export MCP_LOG_LEVEL=debug
+          export UNITY_PROJECT_ROOT="$GITHUB_WORKSPACE/TestProjects/UnityMCPTests"
+          export UNITY_MCP_STATUS_DIR="$GITHUB_WORKSPACE/.unity-mcp"
+          export UNITY_MCP_HOST=127.0.0.1
+          if [[ -n "${UNITY_MCP_DEFAULT_INSTANCE:-}" ]]; then
+            export UNITY_MCP_DEFAULT_INSTANCE
+          fi
+
+          # Debug: probe Unity's actual ping/pong response
+          echo "--- Unity ping/pong probe ---"
+          python3 <<'PY'
+          import socket, struct, sys
+          port = 6400
+          try:
+              s = socket.create_connection(("127.0.0.1", port), timeout=2)
+              s.settimeout(2)
+              hs = s.recv(512)
+              print(f"handshake: {hs!r}")
+              hs_ok = b"FRAMING=1" in hs
+              print(f"FRAMING=1 present: {hs_ok}")
+              if hs_ok:
+                  s.sendall(struct.pack(">Q", 4) + b"ping")
+                  hdr = s.recv(8)
+                  print(f"response header len: {len(hdr)}")
+                  if len(hdr) == 8:
+                      length = struct.unpack(">Q", hdr)[0]
+                      resp = s.recv(length)
+                      print(f"response payload: {resp!r}")
+                      pong_check = b'"message":"pong"'
+                      print(f"contains pong_check: {pong_check in resp}")
+              s.close()
+          except Exception as e:
+              print(f"probe error: {e}")
+          PY
+
+          attempt=0
+          while true; do
+            attempt=$((attempt+1))
+            if uv run --active --directory Server mcp-for-unity --transport stdio --help > /tmp/mcp-preflight.log 2>&1; then
+              cat /tmp/mcp-preflight.log
+              break
+            fi
+            if [ "$attempt" -ge 5 ]; then
+              echo "::error::MCP server did not settle after $attempt attempts"
+              cat /tmp/mcp-preflight.log || true
+              exit 1
+            fi
+            sleep 2
+          done
+
+      - name: Verify MCP Unity instance and Claude args
+        env:
+          UNITY_MCP_DEFAULT_INSTANCE: ${{ env.UNITY_MCP_DEFAULT_INSTANCE }}
+        run: |
+          set -euxo pipefail
+          export PYTHONUNBUFFERED=1 MCP_LOG_LEVEL=debug
+          export UNITY_PROJECT_ROOT="$GITHUB_WORKSPACE/TestProjects/UnityMCPTests"
+          export UNITY_MCP_STATUS_DIR="$GITHUB_WORKSPACE/.unity-mcp"
+          export UNITY_MCP_HOST=127.0.0.1
+          if [[ -n "${UNITY_MCP_DEFAULT_INSTANCE:-}" ]]; then
+            export UNITY_MCP_DEFAULT_INSTANCE
+          fi
+
+          # Debug: check what PortDiscovery sees
+          echo "--- PortDiscovery debug ---"
+          python3 - <<'PY'
+          import sys
+          sys.path.insert(0, "Server/src")
+          from transport.legacy.port_discovery import PortDiscovery
+          import json
+
+          print(f"status_dir: {PortDiscovery.get_registry_dir()}")
+          instances = PortDiscovery.discover_all_unity_instances()
+          print(f"discover_all_unity_instances: {[{'id': i.id, 'port': i.port} for i in instances]}")
+          print(f"try_probe_direct(6400): {PortDiscovery._try_probe_unity_mcp(6400)}")
+          print(f"discover_unity_port: {PortDiscovery.discover_unity_port()}")
+          PY
+
+          python3 - <<'PY'
+          import json
+          import subprocess
+          cmd = [
+              "uv", "run", "--active", "--directory", "Server", "python", "-c",
+              "from transport.legacy.stdio_port_registry import stdio_port_registry; "
+              "inst = stdio_port_registry.get_instances(force_refresh=True); "
+              "import json; print(json.dumps([{'id': i.id, 'port': i.port} for i in inst]))"
+          ]
+          result = subprocess.run(cmd, capture_output=True, text=True)
+          print(result.stdout.strip())
+          if result.returncode != 0:
+              print(result.stderr)
+              raise SystemExit(1)
+          try:
+              data = json.loads(result.stdout.strip() or "[]")
+              if not data:
+                  print("::error::No Unity instances discovered by MCP registry")
+                  raise SystemExit(1)
+          except Exception as e:
+              print(f"::error::Failed to parse instances: {e}")
+              raise SystemExit(1)
+          PY
+
+          echo "=== Testing MCP server startup with --status-dir flag ==="
+          uv run --active --directory Server python <<'PYTEST'
+          import os
+          import sys
+          import glob
+          sys.path.insert(0, 'src')
+          from transport.legacy.port_discovery import PortDiscovery
+          status_dir = PortDiscovery.get_registry_dir()
+          print('Status dir:', status_dir)
+          print('Exists:', status_dir.exists())
+          pattern = str(status_dir / 'unity-mcp-status-*.json')
+          files = glob.glob(pattern)
+          print('Files:', files)
+          instances = PortDiscovery.discover_all_unity_instances()
+          print('Instances:', [i.id for i in instances])
+          if not instances:
+              print('::error::Discovery returned empty list!')
+              sys.exit(1)
+          PYTEST
+
+      # ---------- Final Unity check before Claude ----------
+      - name: Verify Unity IMMEDIATELY before Claude
+        run: |
+          set -euxo pipefail
+          echo "=== Unity container status ==="
+          docker inspect -f '{{.State.Status}} {{.State.Running}}' unity-mcp || echo "Container not found!"
+
+          echo "=== Raw socket probe to Unity ==="
+          # Try raw TCP connect without Python overhead
+          for host in 127.0.0.1 localhost; do
+            echo "Probing $host:6400..."
+            if timeout 2 bash -c "exec 3<>/dev/tcp/$host/6400" 2>/dev/null; then
+              echo "$host:6400 - SUCCESS"
+            else
+              echo "$host:6400 - FAILED"
+            fi
+          done
+
+          echo "=== Netstat for port 6400 ==="
+          docker exec unity-mcp netstat -tlnp 2>/dev/null | grep 6400 || ss -tlnp | grep 6400 || echo "No listener found on 6400"
+
+          echo "=== Python probe with timing ==="
+          python3 <<'PY'
+          import socket, time
+          start = time.time()
+          for host in ['127.0.0.1', 'localhost']:
+              try:
+                  s = socket.create_connection((host, 6400), timeout=2)
+                  s.close()
+                  print(f"{host}:6400 OK ({time.time()-start:.2f}s)")
+              except Exception as e:
+                  print(f"{host}:6400 FAILED: {e} ({time.time()-start:.2f}s)")
+          PY
+
       # ---------- Run suite in two passes ----------
       - name: Run Claude NL pass
         uses: anthropics/claude-code-base-action@beta
         if: steps.detect.outputs.anthropic_ok == 'true' && steps.detect.outputs.unity_ok == 'true'
         continue-on-error: true
+        env:
+          UNITY_MCP_DEFAULT_INSTANCE: ${{ env.UNITY_MCP_DEFAULT_INSTANCE }}
         with:
           use_node_cache: false
           prompt_file: .claude/prompts/nl-unity-suite-nl.md
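The ping/pong probe in the preflight step assumes the bridge's framed protocol: a handshake line advertising `FRAMING=1`, then messages prefixed with an 8-byte big-endian length. A self-contained loopback sketch of that exchange (the port is ephemeral and the server here is a stand-in, not Unity):

```python
import socket
import struct
import threading

def serve(sock):
    """Toy stand-in for the Unity bridge: handshake, then framed ping/pong."""
    conn, _ = sock.accept()
    conn.sendall(b"WELCOME FRAMING=1\n")       # handshake advertising framed mode
    hdr = conn.recv(8)
    length = struct.unpack(">Q", hdr)[0]       # 8-byte big-endian length prefix
    assert conn.recv(length) == b"ping"
    payload = b'{"message":"pong"}'
    conn.sendall(struct.pack(">Q", len(payload)) + payload)
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))                     # ephemeral port, not Unity's 6400
srv.listen(1)
t = threading.Thread(target=serve, args=(srv,))
t.start()

s = socket.create_connection(srv.getsockname(), timeout=2)
hs = s.recv(512)
assert b"FRAMING=1" in hs
s.sendall(struct.pack(">Q", 4) + b"ping")      # frame: length header, then body
length = struct.unpack(">Q", s.recv(8))[0]
resp = s.recv(length)
print(resp.decode())
t.join()
s.close()
```

This mirrors the CI probe's logic; the real bridge may of course send a larger handshake or a richer pong payload.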
@@ -377,7 +644,8 @@ jobs:
           settings: .claude/settings.json
           allowed_tools: "mcp__unity,Edit(reports/**),MultiEdit(reports/**)"
           disallowed_tools: "Bash,WebFetch,WebSearch,Task,TodoWrite,NotebookEdit,NotebookRead"
-          model: claude-3-7-sonnet-20250219
+          model: claude-haiku-4-5-20251001
+          fallback_model: claude-sonnet-4-5-20250929
           append_system_prompt: |
            You are running the NL pass only.
            - Emit exactly NL-0, NL-1, NL-2, NL-3, NL-4.
@ -387,10 +655,22 @@ jobs:
|
||||||
timeout_minutes: "30"
|
timeout_minutes: "30"
|
||||||
anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
|
anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
|
||||||
|
|
||||||
|
- name: Debug MCP server startup (after NL pass)
|
||||||
|
if: always()
|
||||||
|
run: |
|
||||||
|
set -eux
|
||||||
|
echo "=== MCP Server Startup Debug Log ==="
|
||||||
|
cat "$GITHUB_WORKSPACE/.unity-mcp/mcp-server-startup-debug.log" 2>/dev/null || echo "(no debug log found - MCP server may not have started)"
|
||||||
|
echo ""
|
||||||
|
echo "=== Status dir after Claude ==="
|
||||||
|
ls -la "$GITHUB_WORKSPACE/.unity-mcp" || true
|
||||||
|
|
||||||
- name: Run Claude T pass A-J
|
- name: Run Claude T pass A-J
|
||||||
uses: anthropics/claude-code-base-action@beta
|
uses: anthropics/claude-code-base-action@beta
|
||||||
if: steps.detect.outputs.anthropic_ok == 'true' && steps.detect.outputs.unity_ok == 'true'
|
if: steps.detect.outputs.anthropic_ok == 'true' && steps.detect.outputs.unity_ok == 'true'
|
||||||
continue-on-error: true
|
continue-on-error: true
|
||||||
|
env:
|
||||||
|
UNITY_MCP_DEFAULT_INSTANCE: ${{ env.UNITY_MCP_DEFAULT_INSTANCE }}
|
||||||
with:
|
with:
|
||||||
use_node_cache: false
|
use_node_cache: false
|
||||||
prompt_file: .claude/prompts/nl-unity-suite-t.md
|
prompt_file: .claude/prompts/nl-unity-suite-t.md
|
||||||
|
|
@ -398,7 +678,8 @@ jobs:
|
||||||
settings: .claude/settings.json
|
settings: .claude/settings.json
|
||||||
allowed_tools: "mcp__unity,Edit(reports/**),MultiEdit(reports/**)"
|
allowed_tools: "mcp__unity,Edit(reports/**),MultiEdit(reports/**)"
|
||||||
disallowed_tools: "Bash,WebFetch,WebSearch,Task,TodoWrite,NotebookEdit,NotebookRead"
|
disallowed_tools: "Bash,WebFetch,WebSearch,Task,TodoWrite,NotebookEdit,NotebookRead"
|
||||||
model: claude-3-5-haiku-20241022
|
model: claude-haiku-4-5-20251001
|
||||||
|
fallback_model: claude-sonnet-4-5-20250929
|
||||||
append_system_prompt: |
|
append_system_prompt: |
|
||||||
You are running the T pass (A–J) only.
|
You are running the T pass (A–J) only.
|
||||||
Output requirements:
|
Output requirements:
|
||||||
|
|
@ -441,8 +722,8 @@ jobs:
|
||||||
settings: .claude/settings.json
|
settings: .claude/settings.json
|
||||||
allowed_tools: "mcp__unity,Edit(reports/**),MultiEdit(reports/**)"
|
allowed_tools: "mcp__unity,Edit(reports/**),MultiEdit(reports/**)"
|
||||||
disallowed_tools: "Bash,MultiEdit(/!(reports/**)),WebFetch,WebSearch,Task,TodoWrite,NotebookEdit,NotebookRead"
|
disallowed_tools: "Bash,MultiEdit(/!(reports/**)),WebFetch,WebSearch,Task,TodoWrite,NotebookEdit,NotebookRead"
|
||||||
model: claude-3-7-sonnet-20250219
|
model: claude-sonnet-4-5-20250929
|
||||||
fallback_model: claude-3-5-haiku-20241022
|
fallback_model: claude-haiku-4-5-20251001
|
||||||
append_system_prompt: |
|
append_system_prompt: |
|
||||||
You are running the T pass only.
|
You are running the T pass only.
|
||||||
Output requirements:
|
Output requirements:
|
||||||
|
|
@ -535,10 +816,10 @@ jobs:
|
||||||
|
|
||||||
def id_from_filename(p: Path):
|
def id_from_filename(p: Path):
|
||||||
n = p.name
|
n = p.name
|
||||||
m = re.match(r'NL(\d+)_results\.xml$', n, re.I)
|
m = re.match(r'NL-?(\d+)_results\.xml$', n, re.I)
|
||||||
if m:
|
if m:
|
||||||
return f"NL-{int(m.group(1))}"
|
return f"NL-{int(m.group(1))}"
|
||||||
m = re.match(r'T([A-J])_results\.xml$', n, re.I)
|
m = re.match(r'T-?([A-J])_results\.xml$', n, re.I)
|
||||||
if m:
|
if m:
|
||||||
return f"T-{m.group(1).upper()}"
|
return f"T-{m.group(1).upper()}"
|
||||||
return None
|
return None
|
||||||
|
|
@ -582,10 +863,10 @@ jobs:
|
||||||
seen = set()
|
seen = set()
|
||||||
def id_from_filename(p: Path):
|
def id_from_filename(p: Path):
|
||||||
n = p.name
|
n = p.name
|
||||||
m = re.match(r'NL(\d+)_results\.xml$', n, re.I)
|
m = re.match(r'NL-?(\d+)_results\.xml$', n, re.I)
|
||||||
if m:
|
if m:
|
||||||
return f"NL-{int(m.group(1))}"
|
return f"NL-{int(m.group(1))}"
|
||||||
m = re.match(r'T([A-J])_results\.xml$', n, re.I)
|
m = re.match(r'T-?([A-J])_results\.xml$', n, re.I)
|
||||||
if m:
|
if m:
|
||||||
return f"T-{m.group(1).upper()}"
|
return f"T-{m.group(1).upper()}"
|
||||||
return None
|
return None
|
||||||
|
|
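The widened patterns (`NL-?(\d+)` and `T-?([A-J])`) accept report filenames with or without a dash between the prefix and the ID. A standalone sketch of the mapping above, runnable outside the workflow, shows both spellings normalize to the same canonical ID:

```python
import re
from pathlib import Path

def id_from_filename(p: Path):
    # Map a results filename to its canonical test ID (NL-<n> or T-<letter>).
    n = p.name
    m = re.match(r'NL-?(\d+)_results\.xml$', n, re.I)
    if m:
        return f"NL-{int(m.group(1))}"
    m = re.match(r'T-?([A-J])_results\.xml$', n, re.I)
    if m:
        return f"T-{m.group(1).upper()}"
    return None

names = ["NL3_results.xml", "NL-3_results.xml", "Tb_results.xml", "T-B_results.xml", "other.xml"]
print([id_from_filename(Path(n)) for n in names])
# ['NL-3', 'NL-3', 'T-B', 'T-B', None]
```

The `re.I` flag also makes the prefix and letter case-insensitive, so `Tb_results.xml` still resolves to `T-B`.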
@@ -855,18 +1136,6 @@ jobs:
          md_out.write_text('\n'.join(lines), encoding='utf-8')
          PY

-     - name: "Debug: list report files"
-       if: always()
-       shell: bash
-       run: |
-         set -eux
-         ls -la reports || true
-         shopt -s nullglob
-         for f in reports/*.xml; do
-           echo "===== $f ====="
-           head -n 40 "$f" || true
-         done

      # ---------- Collect execution transcript (if present) ----------
      - name: Collect action execution transcript
        if: always()

@@ -936,7 +1205,7 @@ jobs:
          require_tests: false
          fail_on_parse_error: true

-     - name: Upload artifacts (reports + fragments + transcript)
+     - name: Upload artifacts (reports + fragments + transcript + debug)
        if: always()
        uses: actions/upload-artifact@v4
        with:

@@ -946,6 +1215,7 @@ jobs:
            ${{ env.MD_OUT }}
            reports/*_results.xml
            reports/claude-execution-output.json
+           ${{ github.workspace }}/.unity-mcp/mcp-server-startup-debug.log
          retention-days: 7

      # ---------- Always stop Unity ----------
@@ -1,14 +1,15 @@
name: github-repo-stats

on:
-  schedule:
+  # schedule:
    # Run this once per day, towards the end of the day for keeping the most
    # recent data point most meaningful (hours are interpreted in UTC).
-    - cron: "0 23 * * *"
+    #- cron: "0 23 * * *"
  workflow_dispatch: # Allow for running this manually.

jobs:
  j1:
+   if: github.repository == 'CoplayDev/unity-mcp'
    name: github-repo-stats
    runs-on: ubuntu-latest
    steps:
@@ -0,0 +1,199 @@
name: Unity Tests (fork)

on:
  workflow_dispatch: {}

permissions:
  contents: read
  checks: write

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

env:
  UNITY_IMAGE: unityci/editor:ubuntu-2021.3.45f2-linux-il2cpp-3

jobs:
  test-editmode:
    # Guard: run only on the fork owner's repo
    if: github.repository_owner == 'dsarno'
    name: Test in editmode (fork)
    runs-on: ubuntu-latest
    timeout-minutes: 90

    steps:
      # ---------- Secrets check ----------
      - name: Detect Unity credentials (outputs)
        id: detect
        env:
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE }}
          UNITY_EMAIL: ${{ secrets.UNITY_EMAIL }}
          UNITY_PASSWORD: ${{ secrets.UNITY_PASSWORD }}
          UNITY_SERIAL: ${{ secrets.UNITY_SERIAL }}
        run: |
          set -e
          if [ -n "$UNITY_LICENSE" ]; then echo "unity_ok=true" >> "$GITHUB_OUTPUT"; else echo "unity_ok=false" >> "$GITHUB_OUTPUT"; fi
          if [ -n "$UNITY_EMAIL" ] && [ -n "$UNITY_PASSWORD" ]; then echo "ebl_ok=true" >> "$GITHUB_OUTPUT"; else echo "ebl_ok=false" >> "$GITHUB_OUTPUT"; fi
          if [ -n "$UNITY_SERIAL" ]; then echo "has_serial=true" >> "$GITHUB_OUTPUT"; else echo "has_serial=false" >> "$GITHUB_OUTPUT"; fi

      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Prepare reports
        run: |
          set -eux
          rm -f reports/*.xml || true
          mkdir -p reports

      # ---------- Licensing: allow both ULF and EBL ----------
      - name: Decide license sources
        id: lic
        shell: bash
        env:
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE }}
          UNITY_EMAIL: ${{ secrets.UNITY_EMAIL }}
          UNITY_PASSWORD: ${{ secrets.UNITY_PASSWORD }}
          UNITY_SERIAL: ${{ secrets.UNITY_SERIAL }}
        run: |
          set -eu
          use_ulf=false; use_ebl=false
          [[ -n "${UNITY_LICENSE:-}" ]] && use_ulf=true
          [[ -n "${UNITY_EMAIL:-}" && -n "${UNITY_PASSWORD:-}" ]] && use_ebl=true
          echo "use_ulf=$use_ulf" >> "$GITHUB_OUTPUT"
          echo "use_ebl=$use_ebl" >> "$GITHUB_OUTPUT"
          echo "has_serial=$([[ -n "${UNITY_SERIAL:-}" ]] && echo true || echo false)" >> "$GITHUB_OUTPUT"

      - name: Stage Unity .ulf license (from secret)
        if: steps.lic.outputs.use_ulf == 'true'
        id: ulf
        env:
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE }}
        shell: bash
        run: |
          set -eu
          mkdir -p "$RUNNER_TEMP/unity-license-ulf" "$RUNNER_TEMP/unity-local/Unity"
          f="$RUNNER_TEMP/unity-license-ulf/Unity_lic.ulf"
          if printf "%s" "$UNITY_LICENSE" | base64 -d - >/dev/null 2>&1; then
            printf "%s" "$UNITY_LICENSE" | base64 -d - > "$f"
          else
            printf "%s" "$UNITY_LICENSE" > "$f"
          fi
          chmod 600 "$f" || true
          # If someone pasted an entitlement XML into UNITY_LICENSE by mistake, re-home it:
          if head -c 100 "$f" | grep -qi '<\?xml'; then
            mkdir -p "$RUNNER_TEMP/unity-config/Unity/licenses"
            mv "$f" "$RUNNER_TEMP/unity-config/Unity/licenses/UnityEntitlementLicense.xml"
            echo "ok=false" >> "$GITHUB_OUTPUT"
          elif grep -qi '<Signature>' "$f"; then
            # provide it in the standard local-share path too
            cp -f "$f" "$RUNNER_TEMP/unity-local/Unity/Unity_lic.ulf"
            echo "ok=true" >> "$GITHUB_OUTPUT"
          else
            echo "ok=false" >> "$GITHUB_OUTPUT"
          fi

      - name: Activate Unity (EBL via container - host-mount)
        if: steps.lic.outputs.use_ebl == 'true'
        shell: bash
        env:
          UNITY_IMAGE: ${{ env.UNITY_IMAGE }}
          UNITY_EMAIL: ${{ secrets.UNITY_EMAIL }}
          UNITY_PASSWORD: ${{ secrets.UNITY_PASSWORD }}
          UNITY_SERIAL: ${{ secrets.UNITY_SERIAL }}
        run: |
          set -euxo pipefail
          mkdir -p "$RUNNER_TEMP/unity-config" "$RUNNER_TEMP/unity-local"

          # Try Pro first if serial is present, otherwise named-user EBL.
          docker run --rm --network host \
            -e HOME=/root \
            -e UNITY_EMAIL -e UNITY_PASSWORD -e UNITY_SERIAL \
            -v "$RUNNER_TEMP/unity-config:/root/.config/unity3d" \
            -v "$RUNNER_TEMP/unity-local:/root/.local/share/unity3d" \
            "$UNITY_IMAGE" bash -lc '
              set -euxo pipefail
              if [[ -n "${UNITY_SERIAL:-}" ]]; then
                /opt/unity/Editor/Unity -batchmode -nographics -logFile - \
                  -username "$UNITY_EMAIL" -password "$UNITY_PASSWORD" -serial "$UNITY_SERIAL" -quit || true
              else
                /opt/unity/Editor/Unity -batchmode -nographics -logFile - \
                  -username "$UNITY_EMAIL" -password "$UNITY_PASSWORD" -quit || true
              fi
              ls -la /root/.config/unity3d/Unity/licenses || true
            '

          # Verify entitlement written to host mount; allow ULF-only runs to proceed
          if ! find "$RUNNER_TEMP/unity-config" -type f -iname "*.xml" | grep -q .; then
            if [[ "${{ steps.ulf.outputs.ok }}" == "true" ]]; then
              echo "EBL entitlement not found; proceeding with ULF-only (ok=true)."
            else
              echo "No entitlement produced and no valid ULF; cannot continue." >&2
              exit 1
            fi
          fi

      # ---------- Warm up project (import Library once) ----------
      - name: Warm up project (import Library once)
        if: steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true'
        shell: bash
        env:
          UNITY_IMAGE: ${{ env.UNITY_IMAGE }}
          ULF_OK: ${{ steps.ulf.outputs.ok }}
        run: |
          set -euxo pipefail
          manual_args=()
          if [[ "${ULF_OK:-false}" == "true" ]]; then
            manual_args=(-manualLicenseFile "/root/.local/share/unity3d/Unity/Unity_lic.ulf")
          fi
          docker run --rm --network host \
            -e HOME=/root \
            -v "${{ github.workspace }}:/workspace" -w /workspace \
            -v "$RUNNER_TEMP/unity-config:/root/.config/unity3d" \
            -v "$RUNNER_TEMP/unity-local:/root/.local/share/unity3d" \
            "$UNITY_IMAGE" /opt/unity/Editor/Unity -batchmode -nographics -logFile - \
              -projectPath /workspace/TestProjects/UnityMCPTests \
              "${manual_args[@]}" \
              -quit

      # ---------- Run editmode tests ----------
      - name: Run editmode tests (Unity CLI)
        if: steps.lic.outputs.use_ulf == 'true' || steps.lic.outputs.use_ebl == 'true'
        shell: bash
        env:
          UNITY_IMAGE: ${{ env.UNITY_IMAGE }}
          ULF_OK: ${{ steps.ulf.outputs.ok }}
        run: |
          set -euxo pipefail
          manual_args=()
          if [[ "${ULF_OK:-false}" == "true" ]]; then
            manual_args=(-manualLicenseFile "/root/.local/share/unity3d/Unity/Unity_lic.ulf")
          fi
          docker run --rm --network host \
            -e HOME=/root \
            -v "${{ github.workspace }}:/workspace" -w /workspace \
            -v "$RUNNER_TEMP/unity-config:/root/.config/unity3d" \
            -v "$RUNNER_TEMP/unity-local:/root/.local/share/unity3d" \
            "$UNITY_IMAGE" /opt/unity/Editor/Unity -batchmode -nographics -logFile - \
              -projectPath /workspace/TestProjects/UnityMCPTests \
              -runTests \
              -testPlatform editmode \
              -testResults /workspace/reports/editmode-results.xml \
              -testResultsFormatter NUnit \
              "${manual_args[@]}" \
              -quit

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: unity-editmode-results
          path: reports

      - name: License diagnostics when missing
        if: steps.lic.outputs.use_ulf != 'true' && steps.lic.outputs.use_ebl != 'true'
        run: |
          echo "::error::No Unity credentials were supplied. Set UNITY_LICENSE or UNITY_EMAIL/UNITY_PASSWORD (and optionally UNITY_SERIAL) secrets in this fork."
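The ULF staging step accepts the `UNITY_LICENSE` secret either base64-encoded or as raw license XML, trying a strict decode first and falling back to the raw text. The same decode-or-raw fallback can be sketched in Python (hypothetical helper name, not part of the repo):

```python
import base64
import binascii

def materialize_license(secret: str) -> bytes:
    """Mirror of the shell step: strict base64 decode first, else treat as raw text."""
    try:
        # validate=True rejects any non-alphabet character (e.g. '<' in raw XML),
        # so a raw ULF/XML payload falls through to the except branch.
        return base64.b64decode(secret, validate=True)
    except (binascii.Error, ValueError):
        return secret.encode("utf-8")

encoded = base64.b64encode(b"<root/>").decode()
print(materialize_license(encoded))    # b'<root/>'
print(materialize_license("<root/>"))  # b'<root/>'
```

Both forms yield the same bytes on disk, which is what lets fork maintainers paste the secret in either shape.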
@@ -11,6 +11,8 @@ on:

jobs:
  testAllModes:
+   # Guard: only run on upstream repo; skip on forks
+   if: github.repository_owner == 'CoplayDev'
    name: Test in ${{ matrix.testMode }}
    runs-on: ubuntu-latest
    strategy:
@@ -0,0 +1,21 @@
using System;
using MCPForUnity.Editor.Constants;
using MCPForUnity.Editor.Services.Transport.Transports;
using UnityEditor;

namespace MCPForUnity.Editor
{
    public static class McpCiBoot
    {
        public static void StartStdioForCi()
        {
            try
            {
                EditorPrefs.SetBool(EditorPrefKeys.UseHttpTransport, false);
            }
            catch { /* ignore */ }

            StdioBridgeHost.StartAutoConnect();
        }
    }
}
@@ -0,0 +1,12 @@
fileFormatVersion: 2
guid: ef9dca277ab34ba1b136d8dcd45de948
MonoImporter:
  externalObjects: {}
  serializedVersion: 2
  defaultReferences: []
  executionOrder: 0
  icon: {instanceID: 0}
  userData:
  assetBundleName:
  assetBundleVariant:
@@ -25,7 +25,10 @@ namespace MCPForUnity.Editor.Services
{
    // Only persist resume intent when stdio is the active transport and the bridge is running.
    bool useHttp = EditorPrefs.GetBool(EditorPrefKeys.UseHttpTransport, true);
-   bool isRunning = MCPServiceLocator.TransportManager.IsRunning(TransportMode.Stdio);
+   // Check both TransportManager AND StdioBridgeHost directly, because CI starts via StdioBridgeHost
+   // bypassing TransportManager state.
+   bool isRunning = MCPServiceLocator.TransportManager.IsRunning(TransportMode.Stdio)
+       || StdioBridgeHost.IsRunning;
    bool shouldResume = !useHttp && isRunning;

    if (shouldResume)

@@ -34,13 +37,12 @@ namespace MCPForUnity.Editor.Services

        // Stop only the stdio bridge; leave HTTP untouched if it is running concurrently.
        var stopTask = MCPServiceLocator.TransportManager.StopAsync(TransportMode.Stdio);
-       stopTask.ContinueWith(t =>
-       {
-           if (t.IsFaulted && t.Exception != null)
-           {
-               McpLog.Warn($"Error stopping stdio bridge before reload: {t.Exception.GetBaseException()?.Message}");
-           }
-       }, System.Threading.Tasks.TaskScheduler.Default);
+       // Wait for stop to complete (which deletes the status file)
+       try { stopTask.Wait(500); } catch { }
+
+       // Write reloading status so clients don't think we vanished
+       StdioBridgeHost.WriteHeartbeat(true, "reloading");
    }
    else
    {
@@ -463,8 +463,12 @@ namespace MCPForUnity.Editor.Services.Transport.Transports

    try
    {
-       string statusDir = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.UserProfile), ".unity-mcp");
-       string statusFile = Path.Combine(statusDir, $"unity-mcp-status-{ComputeProjectHash(Application.dataPath)}.json");
+       string dir = Environment.GetEnvironmentVariable("UNITY_MCP_STATUS_DIR");
+       if (string.IsNullOrWhiteSpace(dir))
+       {
+           dir = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.UserProfile), ".unity-mcp");
+       }
+       string statusFile = Path.Combine(dir, $"unity-mcp-status-{ComputeProjectHash(Application.dataPath)}.json");
        if (File.Exists(statusFile))
        {
            File.Delete(statusFile);

@@ -1011,7 +1015,7 @@ namespace MCPForUnity.Editor.Services.Transport.Transports
    }

-   private static void WriteHeartbeat(bool reloading, string reason = null)
+   public static void WriteHeartbeat(bool reloading, string reason = null)
    {
        try
        {
@@ -0,0 +1,174 @@
import base64
import os
import re
from typing import Annotated, Any
from urllib.parse import unquote, urlparse

from fastmcp import Context

from services.registry import mcp_for_unity_tool
from services.tools import get_unity_instance_from_context
from transport.unity_transport import send_with_unity_instance
from transport.legacy.unity_connection import async_send_command_with_retry


def _split_uri(uri: str) -> tuple[str, str]:
    """Split an incoming URI or path into (name, directory) suitable for Unity.

    Rules:
    - unity://path/Assets/... → keep as Assets-relative (after decode/normalize)
    - file://... → percent-decode, normalize, strip host and leading slashes,
      then, if any 'Assets' segment exists, return path relative to that 'Assets' root.
      Otherwise, fall back to original name/dir behavior.
    - plain paths → decode/normalize separators; if they contain an 'Assets' segment,
      return relative to 'Assets'.
    """
    raw_path: str
    if uri.startswith("unity://path/"):
        raw_path = uri[len("unity://path/"):]
    elif uri.startswith("file://"):
        parsed = urlparse(uri)
        host = (parsed.netloc or "").strip()
        p = parsed.path or ""
        # UNC: file://server/share/... -> //server/share/...
        if host and host.lower() != "localhost":
            p = f"//{host}{p}"
        # Use percent-decoded path, preserving leading slashes
        raw_path = unquote(p)
    else:
        raw_path = uri

    # Percent-decode any residual encodings and normalize separators
    raw_path = unquote(raw_path).replace("\\", "/")
    # Strip leading slash only for Windows drive-letter forms like "/C:/..."
    if os.name == "nt" and len(raw_path) >= 3 and raw_path[0] == "/" and raw_path[2] == ":":
        raw_path = raw_path[1:]

    # Normalize path (collapse ../, ./)
    norm = os.path.normpath(raw_path).replace("\\", "/")

    # If an 'Assets' segment exists, compute path relative to it (case-insensitive)
    parts = [p for p in norm.split("/") if p not in ("", ".")]
    idx = next((i for i, seg in enumerate(parts)
                if seg.lower() == "assets"), None)
    assets_rel = "/".join(parts[idx:]) if idx is not None else None

    effective_path = assets_rel if assets_rel else norm
    # For POSIX absolute paths outside Assets, drop the leading '/'
    # to return a clean relative-like directory (e.g., '/tmp' -> 'tmp').
    if effective_path.startswith("/"):
        effective_path = effective_path[1:]

    name = os.path.splitext(os.path.basename(effective_path))[0]
    directory = os.path.dirname(effective_path)
    return name, directory
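A quick sanity check of the splitting rules above. This is a standalone re-implementation (POSIX-only, without the Windows drive-letter and UNC branches) so it runs without the repo's imports; the full function in the diff remains the reference:

```python
import os
from urllib.parse import unquote, urlparse

def split_uri(uri: str) -> tuple[str, str]:
    # Standalone mirror of the rules: decode, normalize, prefer Assets-relative paths.
    if uri.startswith("unity://path/"):
        raw = uri[len("unity://path/"):]
    elif uri.startswith("file://"):
        raw = unquote(urlparse(uri).path)
    else:
        raw = uri
    norm = os.path.normpath(unquote(raw).replace("\\", "/")).replace("\\", "/")
    parts = [p for p in norm.split("/") if p not in ("", ".")]
    idx = next((i for i, s in enumerate(parts) if s.lower() == "assets"), None)
    eff = "/".join(parts[idx:]) if idx is not None else norm.lstrip("/")
    name = os.path.splitext(os.path.basename(eff))[0]
    return name, os.path.dirname(eff)

print(split_uri("unity://path/Assets/Scripts/Player.cs"))   # ('Player', 'Assets/Scripts')
print(split_uri("file:///home/me/Proj/Assets/AI/Enemy.cs"))  # ('Enemy', 'Assets/AI')
print(split_uri("/tmp/notes.txt"))                           # ('notes', 'tmp')
```

Note how the absolute `file://` path is rebased to its `Assets` segment, which is what lets Unity resolve the script regardless of where the project lives on disk.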
@mcp_for_unity_tool(description="Searches a file with a regex pattern and returns line numbers and excerpts.")
async def find_in_file(
    ctx: Context,
    uri: Annotated[str, "The resource URI to search under Assets/ or file path form supported by read_resource"],
    pattern: Annotated[str, "The regex pattern to search for"],
    project_root: Annotated[str | None, "Optional project root path"] = None,
    max_results: Annotated[int, "Cap results to avoid huge payloads"] = 200,
    ignore_case: Annotated[bool | str | None, "Case insensitive search"] = True,
) -> dict[str, Any]:
    # project_root is currently unused but kept for interface consistency
    unity_instance = get_unity_instance_from_context(ctx)
    await ctx.info(
        f"Processing find_in_file: {uri} (unity_instance={unity_instance or 'default'})")

    name, directory = _split_uri(uri)

    # 1. Read file content via Unity
    read_resp = await send_with_unity_instance(
        async_send_command_with_retry,
        unity_instance,
        "manage_script",
        {
            "action": "read",
            "name": name,
            "path": directory,
        },
    )

    if not isinstance(read_resp, dict) or not read_resp.get("success"):
        return read_resp if isinstance(read_resp, dict) else {"success": False, "message": str(read_resp)}

    data = read_resp.get("data", {})
    contents = data.get("contents")
    if not contents and data.get("contentsEncoded") and data.get("encodedContents"):
        try:
            contents = base64.b64decode(data.get("encodedContents", "").encode(
                "utf-8")).decode("utf-8", "replace")
        except (ValueError, TypeError, base64.binascii.Error):
            contents = contents or ""

    if contents is None:
        return {"success": False, "message": "Could not read file content."}

    # 2. Perform regex search
    flags = re.MULTILINE
    # Handle ignore_case, which can arrive as a boolean or a string from some clients
    ic = ignore_case
    if isinstance(ic, str):
        ic = ic.lower() in ("true", "1", "yes")
    if ic:
        flags |= re.IGNORECASE

    try:
        regex = re.compile(pattern, flags)
    except re.error as e:
        return {"success": False, "message": f"Invalid regex pattern: {e}"}

    # Users may supply multiline regexes, so search the whole content
    # and map match offsets back to line numbers.
    found = list(regex.finditer(contents))

    results = []
    count = 0

    for m in found:
        if count >= max_results:
            break

        start_idx = m.start()
        end_idx = m.end()

        # Line number = newlines before the match start, plus one
        line_num = contents.count('\n', 0, start_idx) + 1

        # Extract the containing line as the excerpt
        line_start = contents.rfind('\n', 0, start_idx) + 1
        line_end = contents.find('\n', start_idx)
        if line_end == -1:
            line_end = len(contents)

        line_content = contents[line_start:line_end]

        results.append({
            "line": line_num,
            "content": line_content.strip(),
            "match": m.group(0),
            "start": start_idx,
            "end": end_idx
        })
        count += 1

    return {
        "success": True,
        "data": {
            "matches": results,
            "count": len(results),
            "total_matches": len(found)
        }
    }
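The offset-to-line mapping in the loop above (count newlines before the match, then slice out the containing line) can be checked in isolation with a minimal sketch:

```python
import re

text = "alpha\nbeta\ngamma beta\n"

def line_of(offset: int) -> tuple[int, str]:
    # Line number = newlines before the offset, plus one.
    line_num = text.count('\n', 0, offset) + 1
    # Containing line = from the previous newline to the next one.
    line_start = text.rfind('\n', 0, offset) + 1
    line_end = text.find('\n', offset)
    if line_end == -1:
        line_end = len(text)
    return line_num, text[line_start:line_end]

matches = [line_of(m.start()) for m in re.finditer(r"beta", text)]
print(matches)  # [(2, 'beta'), (3, 'gamma beta')]
```

Because only offsets are tracked, the same mapping works unchanged for multiline regexes, which is why the tool searches the whole content rather than iterating line by line.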
@@ -121,7 +121,7 @@ async def apply_text_edits(
        return read_resp if isinstance(read_resp, dict) else {"success": False, "message": str(read_resp)}
    data = read_resp.get("data", {})
    contents = data.get("contents")
-   if not contents and data.get("contentsEncoded"):
+   if not contents and data.get("contentsEncoded") and data.get("encodedContents"):
        try:
            contents = base64.b64decode(data.get("encodedContents", "").encode(
                "utf-8")).decode("utf-8", "replace")
@@ -34,10 +34,13 @@ class PortDiscovery:
    @staticmethod
    def get_registry_path() -> Path:
        """Get the path to the port registry file"""
-       return Path.home() / ".unity-mcp" / PortDiscovery.REGISTRY_FILE
+       return PortDiscovery.get_registry_dir() / PortDiscovery.REGISTRY_FILE

    @staticmethod
    def get_registry_dir() -> Path:
+       env_dir = os.environ.get("UNITY_MCP_STATUS_DIR")
+       if env_dir:
+           return Path(env_dir)
        return Path.home() / ".unity-mcp"

    @staticmethod

@@ -270,6 +273,18 @@ class PortDiscovery:
                port) if isinstance(port, int) else False

            if not is_alive:
+               # If Unity says it's reloading and the status is fresh, don't drop the instance.
+               freshness = last_heartbeat or file_mtime
+               now = datetime.now()
+               if freshness.tzinfo:
+                   from datetime import timezone
+                   now = datetime.now(timezone.utc)
+
+               age_s = (now - freshness).total_seconds()
+
+               if is_reloading and age_s < 60:
+                   pass  # keep it, status="reloading"
+               else:
                    logger.debug(
                        f"Instance {project_name}@{hash_value} has heartbeat but port {port} not responding")
                    continue
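The freshness check above compares a possibly timezone-aware heartbeat timestamp against "now", switching to UTC so the subtraction never mixes naive and aware datetimes (which would raise `TypeError`). A standalone sketch of the keep/drop decision (the 60-second window comes from the diff; the helper name is hypothetical):

```python
from datetime import datetime, timedelta, timezone

def keep_instance(is_reloading: bool, freshness: datetime, window_s: float = 60.0) -> bool:
    # Match naive vs aware datetimes so the subtraction is always valid.
    now = datetime.now(timezone.utc) if freshness.tzinfo else datetime.now()
    age_s = (now - freshness).total_seconds()
    # Keep a non-responding instance only while a recent reload is in flight.
    return is_reloading and age_s < window_s

recent = datetime.now(timezone.utc) - timedelta(seconds=5)
stale = datetime.now(timezone.utc) - timedelta(seconds=300)
print(keep_instance(True, recent))   # True  (reloading, fresh)
print(keep_instance(True, stale))    # False (reloading, but stale)
print(keep_instance(False, recent))  # False (not reloading)
```

This is what keeps a domain-reloading Unity instance listed for CI clients instead of being dropped the moment its port stops answering.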
@@ -1,8 +0,0 @@
-fileFormatVersion: 2
-guid: d6cd845e48d9e4d558d50f7a50149682
-folderAsset: yes
-DefaultImporter:
-  externalObjects: {}
-  userData:
-  assetBundleName:
-  assetBundleVariant: