Restore to commit 74e578279624c6045ca440a3459ebfa1f8d54191
192  scripts/README.md  Normal file
@@ -0,0 +1,192 @@
# Utility Scripts

This directory contains utility scripts for managing the Shopify AI App Builder.

## Environment File Scripts

### validate-env.sh

Validates `.env` files for common issues, including invisible Unicode characters.

**Usage:**
```bash
./scripts/validate-env.sh [env-file-path]

# Examples:
./scripts/validate-env.sh .env
./scripts/validate-env.sh /path/to/custom.env
```

**What it checks:**
- U+200E (Left-to-Right Mark) - the most common issue
- U+200F (Right-to-Left Mark)
- U+200B (Zero Width Space)
- U+FEFF (Byte Order Mark / BOM)
- Other directional formatting characters (U+202A-U+202E)
- Windows line endings (CRLF)
- Trailing spaces in variable definitions
- Spaces in variable names

**Exit codes:**
- `0` - File is clean
- `1` - Issues found
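The checks above can be sketched in a few lines of Python. This is an illustrative approximation of what `validate-env.sh` looks for, not the script itself; the function name and message wording are hypothetical:

```python
# Minimal sketch of the checks validate-env.sh performs (illustrative only).
INVISIBLE = {
    "\u200e": "U+200E Left-to-Right Mark",
    "\u200f": "U+200F Right-to-Left Mark",
    "\u200b": "U+200B Zero Width Space",
    "\ufeff": "U+FEFF BOM",
    # Directional formatting characters U+202A..U+202E
    **{chr(c): f"U+{c:04X} directional formatting" for c in range(0x202A, 0x202F)},
}

def check_env_text(text: str) -> list[str]:
    """Return a list of human-readable issues found in .env file text."""
    issues = []
    for lineno, line in enumerate(text.splitlines(keepends=True), 1):
        for ch, name in INVISIBLE.items():
            if ch in line:
                issues.append(f"line {lineno}: {name}")
        if line.endswith("\r\n"):
            issues.append(f"line {lineno}: CRLF line ending")
        body = line.rstrip("\r\n")
        if "=" in body:
            name_part = body.split("=", 1)[0]
            # Leading/trailing or embedded spaces in the variable name
            if name_part != name_part.strip() or " " in name_part.strip():
                issues.append(f"line {lineno}: space in or around variable name")
        if body != body.rstrip(" "):
            issues.append(f"line {lineno}: trailing space")
    return issues
```

Note that `str.strip()` does not remove U+200E (it is not whitespace), which is exactly why these characters survive casual editing and break Portainer.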
### clean-env.sh

Removes invisible Unicode characters from `.env` files.

**Usage:**
```bash
./scripts/clean-env.sh [env-file-path]

# Examples:
./scripts/clean-env.sh .env
./scripts/clean-env.sh /path/to/custom.env
```

**What it does:**
- Creates a backup of the original file (`.backup` extension)
- Removes all invisible Unicode characters
- Preserves all valid content

**Always creates a backup** before modifying the file, so you can restore it if needed:
```bash
mv .env.backup .env
```
## Typical Workflow

When you encounter Portainer deployment errors caused by invisible characters:

```bash
# 1. Validate your env file
./scripts/validate-env.sh .env

# 2. If issues are found, clean it
./scripts/clean-env.sh .env

# 3. Verify it's fixed
./scripts/validate-env.sh .env

# 4. Deploy to Portainer
```
## Other Scripts

### check-duplicate-classes.php

A comprehensive PHP/WordPress plugin static analyzer that detects runtime errors, duplicate declarations, missing dependencies, and other code issues. It is optimized for minimal memory usage via streaming token parsing.

**Usage:**
```bash
php scripts/check-duplicate-classes.php /path/to/plugin
php scripts/check-duplicate-classes.php /path/to/plugin --verbose
php scripts/check-duplicate-classes.php /path/to/plugin --quiet
php scripts/check-duplicate-classes.php /path/to/plugin --strict
```

**Options:**
- `--verbose` - Show detailed output and statistics
- `--strict` - Enable stricter checks (may produce false positives)
- `--quiet` - Suppress informational messages
- `--no-cache` - Skip caching (useful for CI/CD)
**What it detects:**
- Duplicate class, interface, trait, function, and constant declarations (prevents "Cannot redeclare" errors)
- Missing include/require files
- Usage of undefined classes (`new`, `instanceof`, `extends`, `implements`, `catch`)
- **Undefined functions** - Calls to functions that exist neither in the codebase nor in PHP/WordPress core
- **Early function calls** - Functions called before they are defined in the same file
- **Potential undefined arrays** - Array accesses on variables that may not be initialized
- **CSS overlap risks** - CSS patterns that may cause UI elements to overlap
- Namespace-aware detection (prevents false positives across namespaces)

**CSS Overlap Detection:**
The script analyzes CSS files for patterns that commonly cause overlapping UI elements:
- Absolutely positioned elements at z-index 0
- Elements anchored to both top/bottom or left/right
- Negative margins that can pull elements into overlapping positions

**Exit codes:**
- `0` - No issues found
- `1` - Issues detected
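The duplicate-declaration idea can be illustrated with a toy sketch. This is not the analyzer's actual implementation (the real script streams PHP tokens and is namespace-aware; a regex is neither), and the function name is hypothetical:

```python
import re
from collections import defaultdict

# Naive match for top-level PHP declarations; ignores namespaces,
# strings, comments, and conditional declarations, unlike the real script.
DECL_RE = re.compile(r"\b(class|interface|trait|function)\s+([A-Za-z_]\w*)")

def find_duplicate_declarations(files: dict[str, str]) -> dict[str, list[str]]:
    """Map declaration name -> 'path:line' locations when declared more than once."""
    seen = defaultdict(list)
    for path, src in files.items():
        for m in DECL_RE.finditer(src):
            line = src.count("\n", 0, m.start()) + 1
            seen[m.group(2)].append(f"{path}:{line}")
    return {name: locs for name, locs in seen.items() if len(locs) > 1}
```

Two files declaring `class Foo` would be reported with both locations, which is exactly the situation that triggers PHP's "Cannot redeclare" fatal error at runtime.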
**Example output:**
```
Checking for undefined functions...
✓ No undefined functions found

Checking for early function calls...
EARLY FUNCTION CALL: 'my_function' called at line 15 but defined at line 45 in includes/helper.php
...
✓ No early function calls found

Checking for potential undefined arrays and array keys...
POTENTIAL UNDEFINED ARRAY: '$options' accessed as array at functions.php:23
  This array may not be initialized before use.
  Consider using isset() or !empty() check before access.
...

Checking for potential CSS overlaps...
POTENTIAL CSS OVERLAP in admin/css/admin.css:156
  Reason: Negative margin (may cause overlap)
  Context: margin-left: -15px;...
...

DUPLICATE CLASS: 'PCFB_Field_Icons' declared in 2 locations:
  admin/class-admin-helper.php:167
  admin/page-form-builder.php:206

MISSING CLASS/TYPE: 'Undefined_Class' used but not declared:
  admin/some-file.php:42 (new)

================================================================================
VALIDATION SUMMARY
================================================================================
Runtime issues (duplicates, missing classes): FOUND
Undefined functions: 0 found
Early function calls: 1 found
Potential undefined arrays: 1 found
Potential CSS overlaps: 1 found

FAIL: Issues detected that should be reviewed
```
### entrypoint.sh

Container entrypoint script that:
- Sets up the environment
- Starts the web server (port 4000)
- Starts the terminal service (port 4001)

This script runs automatically when the Docker container starts. You don't need to run it manually.

### healthcheck.sh

Docker health check script that verifies:
- The web server is responding on port 4000
- The service is healthy

This is used by Docker to monitor container health. You don't need to run it manually.
## Troubleshooting

### Permission denied

If you get "Permission denied" errors, make the scripts executable:
```bash
chmod +x scripts/*.sh
```

### Command not found

Make sure you're running from the project root:
```bash
cd /path/to/shopify-ai
./scripts/validate-env.sh .env
```

Or use full paths:
```bash
/path/to/shopify-ai/scripts/validate-env.sh /path/to/.env
```
24  scripts/_balance_check.js  Normal file
@@ -0,0 +1,24 @@
// Quick heuristic bracket-balance check for builder.js. It ignores strings and
// comments, and toggles on every backtick, so template literals with ${...}
// interpolation are not fully handled - good enough for a first pass.
const fs = require('fs');
const s = fs.readFileSync('chat/public/builder.js', 'utf8');
let paren = 0, brack = 0, brace = 0, backtick = 0;
for (let i = 0; i < s.length; i++) {
  const c = s[i];
  if (c === '`') backtick ^= 1;
  if (!backtick) {
    if (c === '(') paren++;
    if (c === ')') paren--;
    if (c === '[') brack++;
    if (c === ']') brack--;
    if (c === '{') brace++;
    if (c === '}') brace--;
  }
  if (paren < 0 || brack < 0 || brace < 0) {
    const lines = s.slice(0, i + 1).split('\n');
    console.log('Mismatch at char', i, 'line', lines.length);
    console.log('char:', c);
    console.log('paren,brack,brace', paren, brack, brace);
    console.log(lines.slice(-6).join('\n'));
    process.exit(1);
  }
}
console.log('final counts paren,brack,brace,backtick', paren, brack, brace, backtick);
20  scripts/_balance_check_py.py  Normal file
@@ -0,0 +1,20 @@
# Quick heuristic bracket-balance check for builder.js (Python port of
# _balance_check.js); same caveat: strings, comments, and template-literal
# interpolation are not handled.
with open('chat/public/builder.js', 'r', encoding='utf8') as f:
    s = f.read()
paren = brack = brace = 0
backtick = False
for i, c in enumerate(s):
    if c == '`':
        backtick = not backtick
    if not backtick:
        if c == '(': paren += 1
        if c == ')': paren -= 1
        if c == '[': brack += 1
        if c == ']': brack -= 1
        if c == '{': brace += 1
        if c == '}': brace -= 1
    if paren < 0 or brack < 0 or brace < 0:
        print('mismatch at', i, 'char', c, 'counts', paren, brack, brace)
        # print context
        start = max(0, i - 100); end = min(len(s), i + 100)
        print(s[start:end])
        raise SystemExit(1)
print('final', paren, brack, brace, backtick)
14  scripts/_find_js_error.js  Normal file
@@ -0,0 +1,14 @@
// Bisect builder.js to approximate the first character index where
// `new Function` stops compiling the prefix - a rough way to localize a
// syntax error (prefix validity is not strictly monotone, hence "approx").
const fs = require('fs');
const path = 'chat/public/builder.js';
const src = fs.readFileSync(path, 'utf8');
let lo = 0, hi = src.length, mid;
let lastGood = 0;
while (lo < hi) {
  mid = Math.floor((lo + hi) / 2);
  try { new Function(src.slice(0, mid)); lastGood = mid; lo = mid + 1; } catch (e) { hi = mid; }
}
const start = Math.max(0, lastGood - 200), end = Math.min(src.length, lastGood + 200);
const context = src.slice(start, end);
const before = src.slice(0, lastGood);
const linesBefore = before.split('\n').length;
console.log('approx failure at char index', lastGood, 'around line', linesBefore);
console.log('---context---');
console.log(context);
26  scripts/_test_opencode_config.js  Normal file
@@ -0,0 +1,26 @@
// Standalone test of the opencode.json generation logic used by entrypoint.sh,
// with the environment values hard-coded for a quick local check.
const OPENCODE_OLLAMA_PROVIDER = 'ollama';
const OPENCODE_OLLAMA_MODEL = 'qwen3:0.6b';
const OPENCODE_OLLAMA_BASE_URL = 'https://ollama.plugincompass.com';
const OPENCODE_OLLAMA_API_KEY = 'abc123';
const baseUrl = (OPENCODE_OLLAMA_BASE_URL || 'https://ollama.plugincompass.com').replace(/\/+$/, '');
const providerCfg = {
  options: { baseURL: baseUrl },
  models: {
    [OPENCODE_OLLAMA_MODEL]: {
      id: OPENCODE_OLLAMA_MODEL,
      name: OPENCODE_OLLAMA_MODEL,
      tool_call: true,
      temperature: true
    }
  }
};
if (OPENCODE_OLLAMA_API_KEY) providerCfg.options.apiKey = OPENCODE_OLLAMA_API_KEY;
const cfg = {
  "$schema": "https://opencode.ai/config.json",
  model: `${OPENCODE_OLLAMA_PROVIDER}/${OPENCODE_OLLAMA_MODEL}`,
  small_model: `${OPENCODE_OLLAMA_PROVIDER}/${OPENCODE_OLLAMA_MODEL}`,
  provider: {
    [OPENCODE_OLLAMA_PROVIDER]: providerCfg
  }
};
console.log(JSON.stringify(cfg, null, 2));
38  scripts/build-and-verify.ps1  Normal file
@@ -0,0 +1,38 @@
# PowerShell helper script to build and verify the WordPress Plugin AI Builder container

param(
    [switch]$NoCache
)

$buildArgs = @('compose', 'build')
if ($NoCache) { $buildArgs += '--no-cache' }

Write-Host "Building WordPress Plugin AI Builder container..."
docker @buildArgs

if ($LASTEXITCODE -ne 0) {
    Write-Error "Build failed"
    exit 1
}

Write-Host "Starting container..."
docker compose up -d

Write-Host "Waiting for container readiness (healthcheck)..."
Start-Sleep -Seconds 5

Write-Host "Verifying tools inside container..."

docker compose exec wordpress-plugin-ai-builder pwsh -NoProfile -Command 'pwsh --version; opencode --version; node --version; if (Get-Command opencode -ErrorAction SilentlyContinue) { Write-Host "opencode available" } else { Write-Error "opencode not found"; exit 1 }'

if ($LASTEXITCODE -ne 0) {
    Write-Error "Verification failed"
    exit 1
}

Write-Host "All checks passed." -ForegroundColor Green
Write-Host "Open http://localhost:4000 for the web interface." -ForegroundColor Cyan
Write-Host "Open http://localhost:4001 for the terminal." -ForegroundColor Cyan
# Single quotes avoid backtick-escape surprises inside double-quoted strings
Write-Host "Verifying 'github' helper is available in profile (sourcing profile)..."
docker compose exec wordpress-plugin-ai-builder pwsh -NoProfile -Command ". /root/.config/powershell/Microsoft.PowerShell_profile.ps1; Get-Command github"
if ($LASTEXITCODE -ne 0) { Write-Warning "'github' helper not detected. Ensure you built the image with the new profile, or update the volume via README instructions." }
1313  scripts/check-duplicate-classes.php  Normal file
File diff suppressed because it is too large
25  scripts/check-opencode.ps1  Normal file
@@ -0,0 +1,25 @@
# Simple script to validate opencode and pwsh inside a running container or the local environment
param(
    [string]$ContainerName = 'wordpress-plugin-ai-builder'
)

# If docker is available, run inside the container
if (Get-Command docker -ErrorAction SilentlyContinue) {
    Write-Host "Running checks inside container: $ContainerName"
    docker compose exec $ContainerName pwsh -NoProfile -Command @"
pwsh --version
node --version
opencode --version
opencode help | Out-Null
. /root/.config/powershell/Microsoft.PowerShell_profile.ps1
Get-Command github | Out-Null
exit 0
"@
} else {
    Write-Host "Docker is not available - trying local checks"
    pwsh --version
    node --version
    if (Get-Command opencode -ErrorAction SilentlyContinue) { opencode --version } else { Write-Error "opencode not found"; exit 1 }
}

Write-Host "Checks completed." -ForegroundColor Green
19  scripts/check-opencode.sh  Executable file
@@ -0,0 +1,19 @@
#!/usr/bin/env bash
# Simple shell script to validate opencode and pwsh inside a running container or the local environment
CONTAINER_NAME=${1:-wordpress-plugin-ai-builder}

if command -v docker > /dev/null 2>&1; then
  echo "Running checks inside container: $CONTAINER_NAME"
  docker compose exec "$CONTAINER_NAME" pwsh -NoProfile -Command "pwsh --version; node --version; opencode --version; . /root/.config/powershell/Microsoft.PowerShell_profile.ps1; Get-Command github"
else
  echo "Docker not found - running local checks"
  pwsh --version
  node --version
  if command -v opencode > /dev/null 2>&1; then
    opencode --version
  else
    echo "opencode not found"; exit 1
  fi
fi

echo "Checks completed."
39  scripts/clean-env.sh  Executable file
@@ -0,0 +1,39 @@
#!/bin/bash
# Clean invisible Unicode characters from .env files
# This removes U+200E (Left-to-Right Mark) and other common invisible characters

set -e

ENV_FILE="${1:-.env}"

if [ ! -f "$ENV_FILE" ]; then
    echo "Error: File $ENV_FILE not found"
    echo "Usage: $0 [env-file-path]"
    exit 1
fi

echo "Cleaning $ENV_FILE..."

# Create a backup
cp "$ENV_FILE" "${ENV_FILE}.backup"
echo "Created backup: ${ENV_FILE}.backup"

# Remove common invisible Unicode characters (UTF-8 byte sequences):
# - U+200E (Left-to-Right Mark)              - E2 80 8E
# - U+200F (Right-to-Left Mark)              - E2 80 8F
# - U+200B (Zero Width Space)                - E2 80 8B
# - U+FEFF (Zero Width No-Break Space / BOM) - EF BB BF
# - U+202A-U+202E (directional formatting characters)
# Note: the \xNN escapes and in-place -i flag below assume GNU sed.

sed -i 's/\xE2\x80\x8E//g' "$ENV_FILE"  # Remove U+200E
sed -i 's/\xE2\x80\x8F//g' "$ENV_FILE"  # Remove U+200F
sed -i 's/\xE2\x80\x8B//g' "$ENV_FILE"  # Remove U+200B
sed -i 's/\xEF\xBB\xBF//g' "$ENV_FILE"  # Remove BOM
sed -i 's/\xE2\x80\xAA//g' "$ENV_FILE"  # Remove U+202A
sed -i 's/\xE2\x80\xAB//g' "$ENV_FILE"  # Remove U+202B
sed -i 's/\xE2\x80\xAC//g' "$ENV_FILE"  # Remove U+202C
sed -i 's/\xE2\x80\xAD//g' "$ENV_FILE"  # Remove U+202D
sed -i 's/\xE2\x80\xAE//g' "$ENV_FILE"  # Remove U+202E

echo "Cleaned successfully!"
echo "To restore the backup: mv ${ENV_FILE}.backup $ENV_FILE"
224  scripts/diagnostic-logger.sh  Normal file
@@ -0,0 +1,224 @@
#!/bin/bash
# Diagnostic logging utility for Shopify AI App Builder container
# Provides comprehensive system and application diagnostics

set -e

# Colors for log output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Diagnostic log location
DIAG_LOG_DIR="/var/log/shopify-ai"
DIAG_LOG_FILE="${DIAG_LOG_DIR}/diagnostics.log"
mkdir -p "$DIAG_LOG_DIR"

# Logging function with timestamps and levels
diag_log() {
    local level="$1"
    shift
    local message="$*"
    local timestamp=$(date '+%Y-%m-%d %H:%M:%S.%3N')
    local pid=$$

    # Log to file
    echo "[${timestamp}] [${level}] [PID:${pid}] ${message}" >> "$DIAG_LOG_FILE"

    # Log to stderr with colors
    case "$level" in
        ERROR)
            echo -e "${RED}[${timestamp}] [ERROR] ${message}${NC}" >&2
            ;;
        WARN)
            echo -e "${YELLOW}[${timestamp}] [WARN] ${message}${NC}" >&2
            ;;
        INFO)
            echo -e "${GREEN}[${timestamp}] [INFO] ${message}${NC}" >&2
            ;;
        DEBUG)
            echo -e "${BLUE}[${timestamp}] [DEBUG] ${message}${NC}" >&2
            ;;
    esac
}
# System information gathering
log_system_info() {
    diag_log "INFO" "========== SYSTEM DIAGNOSTIC START =========="

    # OS Information
    diag_log "INFO" "=== OS Information ==="
    diag_log "INFO" "Kernel: $(uname -r)"
    diag_log "INFO" "Hostname: $(hostname)"
    diag_log "INFO" "Uptime: $(uptime -p 2>/dev/null || uptime)"

    # CPU Information
    diag_log "INFO" "=== CPU Information ==="
    if [ -f /proc/cpuinfo ]; then
        local cpu_count=$(nproc)
        local cpu_model=$(grep -m1 "model name" /proc/cpuinfo | cut -d':' -f2 | xargs)
        diag_log "INFO" "CPUs: ${cpu_count}"
        diag_log "INFO" "Model: ${cpu_model}"
    fi

    # Memory Information
    diag_log "INFO" "=== Memory Information ==="
    if [ -f /proc/meminfo ]; then
        local total_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
        local avail_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)
        local total_mem=$(awk -v kb="$total_kb" 'BEGIN{printf "%.2f GB", kb/1024/1024}')
        local free_mem=$(awk -v kb="$avail_kb" 'BEGIN{printf "%.2f GB", kb/1024/1024}')
        # Compute used memory from the raw kB values; subtracting the
        # formatted "N.NN GB" strings would not be valid awk arithmetic.
        local used_mem=$(awk -v t="$total_kb" -v a="$avail_kb" 'BEGIN{printf "%.2f GB", (t-a)/1024/1024}')
        diag_log "INFO" "Total: ${total_mem}"
        diag_log "INFO" "Used: ${used_mem}"
        diag_log "INFO" "Available: ${free_mem}"
    fi

    # Disk Information
    diag_log "INFO" "=== Disk Information ==="
    df -h / | while read line; do
        diag_log "INFO" "$line"
    done

    # Network Information
    diag_log "INFO" "=== Network Information ==="
    ip -4 addr show | grep -oP '(?<=inet\s)\d+(\.\d+){3}' | while read ip; do
        diag_log "INFO" "IPv4: ${ip}"
    done
}
# Service status checking
check_service_status() {
    local service_name="$1"
    local port="$2"
    local pid="$3"

    diag_log "INFO" "=== Service: ${service_name} ==="

    # Check if process is running
    if [ -n "$pid" ] && kill -0 "$pid" 2>/dev/null; then
        diag_log "INFO" "Process running (PID: ${pid})"

        # Check memory usage
        if [ -f "/proc/${pid}/status" ]; then
            local mem_usage=$(awk '/VmRSS/ {printf "%.2f MB", $2/1024}' "/proc/${pid}/status")
            # CPU time lives in /proc/<pid>/stat (fields 14 and 15, in clock
            # ticks), not in /proc/<pid>/status; 100 ticks/second is assumed.
            local cpu_usage=$(awk '{printf "%.2f seconds", ($14+$15)/100}' "/proc/${pid}/stat")
            diag_log "INFO" "Memory: ${mem_usage}"
            diag_log "INFO" "CPU Time: ${cpu_usage}"
        fi
    else
        diag_log "WARN" "Process not running (PID: ${pid})"
    fi

    # Check if port is listening
    if ss -tuln 2>/dev/null | grep -q ":${port}"; then
        diag_log "INFO" "Port ${port} listening"
    else
        diag_log "ERROR" "Port ${port} NOT listening"
    fi

    # Check if the service responds to HTTP requests
    if [ "$port" = "4000" ]; then
        if timeout 3 curl -s http://localhost:${port}/api/health > /dev/null 2>&1; then
            diag_log "INFO" "HTTP endpoint responding"
        else
            diag_log "ERROR" "HTTP endpoint NOT responding"
        fi
    fi
}
# Resource monitoring (can be called periodically)
monitor_resources() {
    # CPU usage: top reports the idle percentage, so subtract it from 100
    local cpu_idle=$(top -bn1 | grep "Cpu(s)" | sed "s/.*, *\([0-9.]*\)%* id.*/\1/" | awk '{print $1}')
    local cpu_usage=$(awk -v idle="$cpu_idle" 'BEGIN{printf "%.1f", 100 - idle}')

    # Memory usage
    local mem_total=$(free -m | awk '/Mem:/ {print $2}')
    local mem_used=$(free -m | awk '/Mem:/ {print $3}')
    local mem_percent=$(( (mem_used * 100) / mem_total ))

    # Disk usage
    local disk_usage=$(df / | tail -1 | awk '{print $5}')

    # Load average
    local load_avg=$(uptime | awk -F'load average:' '{print $2}')

    diag_log "INFO" "[MONITOR] CPU: ${cpu_usage}% | MEM: ${mem_percent}% (${mem_used}MB/${mem_total}MB) | DISK: ${disk_usage} | LOAD: ${load_avg}"
}
# Environment variable validation
validate_environment() {
    diag_log "INFO" "=== Environment Validation ==="

    local critical_vars=(
        "OPENCODE_API_KEY"
        "SESSION_SECRET"
        "ACCESS_PASSWORD"
    )

    local optional_vars=(
        "OPENROUTER_API_KEY"
        "MISTRAL_API_KEY"
        "GROQ_API_KEY"
        "GOOGLE_API_KEY"
        "DODO_PAYMENTS_API_KEY"
        "ADMIN_USER"
        "ADMIN_PASSWORD"
        "REPO_URL"
        "REPO_BRANCH"
    )

    # Check critical variables
    for var in "${critical_vars[@]}"; do
        if [ -z "${!var}" ]; then
            diag_log "ERROR" "Missing critical variable: ${var}"
        else
            diag_log "INFO" "✓ ${var}: SET"
        fi
    done

    # Check optional variables
    for var in "${optional_vars[@]}"; do
        if [ -n "${!var}" ]; then
            diag_log "INFO" "✓ ${var}: SET"
        else
            diag_log "DEBUG" "${var}: NOT SET (optional)"
        fi
    done

    # Check filesystem permissions
    diag_log "INFO" "=== Filesystem Permissions ==="
    local data_dir="/home/web/data"
    if [ -d "$data_dir" ]; then
        local perms=$(stat -c "%a" "$data_dir")
        local owner=$(stat -c "%U:%G" "$data_dir")
        diag_log "INFO" "${data_dir}: ${perms} (${owner})"
    else
        diag_log "WARN" "${data_dir}: NOT FOUND"
    fi
}
# Log rotation
rotate_logs() {
    local max_size=$(( 10 * 1024 * 1024 )) # 10 MB
    if [ -f "$DIAG_LOG_FILE" ]; then
        local file_size=$(stat -f%z "$DIAG_LOG_FILE" 2>/dev/null || stat -c%s "$DIAG_LOG_FILE" 2>/dev/null || echo 0)
        if [ "$file_size" -gt "$max_size" ]; then
            local backup_file="${DIAG_LOG_FILE}.$(date '+%Y%m%d_%H%M%S').bak"
            mv "$DIAG_LOG_FILE" "$backup_file"
            diag_log "INFO" "Log rotated to ${backup_file}"
        fi
    fi
}

# Export functions for use in other scripts
export -f diag_log
export -f log_system_info
export -f check_service_status
export -f monitor_resources
export -f validate_environment
export -f rotate_logs
export DIAG_LOG_DIR
export DIAG_LOG_FILE
381  scripts/entrypoint.sh  Executable file
@@ -0,0 +1,381 @@
#!/bin/bash
set -e

# Source diagnostic logger if available
if [ -f "/usr/local/bin/diagnostic-logger.sh" ]; then
    . /usr/local/bin/diagnostic-logger.sh
fi

# Helper function to log messages
log() {
    local timestamp=$(date '+%Y-%m-%d %H:%M:%S')
    echo "[${timestamp}] $*" >&2
}

# Sanitize environment variables by removing invisible Unicode characters.
# This fixes the Portainer deployment issue where U+200E and similar characters
# cause "unexpected character" errors in variable names.
sanitize_env_vars() {
    log "Sanitizing environment variables..."

    # Create a secure temporary file
    local temp_env
    temp_env=$(mktemp /tmp/sanitized_env.XXXXXX)

    # Export the current environment to a file, then clean it
    export -p > "$temp_env"

    # Remove common invisible Unicode characters in a single sed command:
    # - U+200E (Left-to-Right Mark)              - E2 80 8E
    # - U+200F (Right-to-Left Mark)              - E2 80 8F
    # - U+200B (Zero Width Space)                - E2 80 8B
    # - U+FEFF (Zero Width No-Break Space / BOM) - EF BB BF
    # - U+202A-U+202E (directional formatting characters)
    sed -i \
        -e 's/\xE2\x80\x8E//g' \
        -e 's/\xE2\x80\x8F//g' \
        -e 's/\xE2\x80\x8B//g' \
        -e 's/\xEF\xBB\xBF//g' \
        -e 's/\xE2\x80\xAA//g' \
        -e 's/\xE2\x80\xAB//g' \
        -e 's/\xE2\x80\xAC//g' \
        -e 's/\xE2\x80\xAD//g' \
        -e 's/\xE2\x80\xAE//g' \
        "$temp_env" 2>/dev/null

    # Source the sanitized environment.
    # If sourcing fails, log a warning but continue (the environment may be partially set).
    if ! source "$temp_env" 2>/dev/null; then
        log "WARNING: Failed to source sanitized environment. Some variables may not be set correctly."
    fi

    # Clean up the temporary file
    rm -f "$temp_env"

    log "Environment variables sanitized successfully"
}
# Sanitize environment variables on startup
log "=== STARTUP INITIALIZATION ==="
log "Container ID: $(hostname 2>/dev/null || echo 'unknown')"
log "Timestamp: $(date '+%Y-%m-%d %H:%M:%S %Z')"

# Initialize diagnostic logging only if the script exists and we're in debug mode
if [ -f "/usr/local/bin/diagnostic-logger.sh" ] && [ "${DEBUG_LOGGING:-false}" = "true" ]; then
    . /usr/local/bin/diagnostic-logger.sh
    log_system_info
    validate_environment
    rotate_logs
fi

sanitize_env_vars

log "=== ENVIRONMENT SANITIZATION COMPLETE ==="
# Repository configuration
REPO_URL="${REPO_URL:-}"
REPO_DIR="/home/web/data"
REPO_BRANCH="${REPO_BRANCH:-main}"
GITHUB_USERNAME="${GITHUB_USERNAME:-}"
GITHUB_PAT="${GITHUB_PAT:-}"

# Helper function to build an authenticated repository URL
get_auth_url() {
    local url="$1"
    if [ -n "$GITHUB_USERNAME" ] && [ -n "$GITHUB_PAT" ]; then
        # Extract the repository path from the URL
        local repo_path=$(echo "$url" | sed 's|https://github.com/||')
        echo "https://${GITHUB_USERNAME}:${GITHUB_PAT}@github.com/${repo_path}"
    else
        echo "$url"
    fi
}
log "Initializing Shopify AI App Builder..."

# Log repository configuration details
log "Repository Configuration:"
log "  URL: ${REPO_URL:-'NOT SET'}"
log "  Branch: ${REPO_BRANCH:-'NOT SET'}"
log "  Directory: ${REPO_DIR}"
log "  GitHub Username: ${GITHUB_USERNAME:-'NOT SET'}"
log "  GitHub PAT: ${GITHUB_PAT:+SET (hidden)}${GITHUB_PAT:-NOT SET}"
# Only clone/pull if REPO_URL is set
if [ -n "$REPO_URL" ]; then
    log "Repository URL: $REPO_URL"
    log "Repository directory: $REPO_DIR"
    log "Default branch: $REPO_BRANCH"

    # Check if authentication credentials are available
    if [ -n "$GITHUB_USERNAME" ] && [ -n "$GITHUB_PAT" ]; then
        log "GitHub authentication credentials found for user: $GITHUB_USERNAME"
    else
        log "WARNING: No GitHub authentication credentials found. Private repository access may fail."
    fi

    # Check if git is available
    if ! command -v git &> /dev/null; then
        log "ERROR: git is not available in the container"
        exit 1
    fi

    # Check if the repository directory is empty or doesn't have .git
    if [ ! -d "$REPO_DIR/.git" ]; then
        # Directory doesn't exist or .git is missing - need to clone
        if [ -d "$REPO_DIR" ] && [ "$(ls -A "$REPO_DIR")" ]; then
            # Directory exists but is not a git repo - back it up
            log "WARNING: $REPO_DIR exists but is not a git repository. Backing up to ${REPO_DIR}.backup"
            mv "$REPO_DIR" "${REPO_DIR}.backup"
        fi

        log "Repository not found. Cloning $REPO_URL into $REPO_DIR..."
        auth_url=$(get_auth_url "$REPO_URL")
        git clone "$auth_url" "$REPO_DIR"
        cd "$REPO_DIR"
        log "Successfully cloned repository"
    else
        # Repository exists, pull latest changes
        cd "$REPO_DIR"
        log "Repository found at $REPO_DIR. Pulling latest changes from $REPO_BRANCH..."

        # Update the remote URL to use authentication if credentials are available
        if [ -n "$GITHUB_USERNAME" ] && [ -n "$GITHUB_PAT" ]; then
            auth_url=$(get_auth_url "$REPO_URL")
            log "Updating remote URL with authentication credentials"
            git remote set-url origin "$auth_url"
        fi

        # Only pull when there are no uncommitted changes (staged or unstaged)
        if git diff --quiet && git diff --cached --quiet; then
            git fetch origin
            # Use plain git pull per policy (do not force origin/branch explicitly here)
            git pull || {
                log "WARNING: Failed to pull from $REPO_BRANCH, attempting to pull from any available branch"
                git pull || log "WARNING: Pull operation failed, continuing anyway"
            }
            log "Successfully pulled latest changes"
        else
            log "WARNING: Repository has uncommitted changes. Skipping pull to avoid conflicts."
            log "Run 'git status' to see changes, then 'git pull' manually if desired"
        fi
    fi
    log "Repository is ready at $REPO_DIR"
else
    log "No REPO_URL set - starting with empty workspace"
    mkdir -p "$REPO_DIR"
fi
log "Starting Shopify AI App Builder service and ttyd..."

# Use /opt/webchat directly as it contains the actual server.js with node_modules.
# /opt/webchat_v2 is just a wrapper and causes module loading failures.
CHAT_APP_DIR="${CHAT_APP_DIR:-/opt/webchat}"
CHAT_APP_FALLBACK="/opt/webchat_v2"
CHAT_PORT="${CHAT_PORT:-4000}"
CHAT_HOST="${CHAT_HOST:-0.0.0.0}"
ACCESS_PASSWORD="${ACCESS_PASSWORD:-}"
# Persist the opencode installation & user config.
# Move/copy /root/.opencode to $REPO_DIR/.opencode if not already present,
# then symlink /root/.opencode -> $REPO_DIR/.opencode so opencode state
# (connections, providers) persists across container restarts while keeping
# the install accessible at runtime.
PERSISTED_OPENCODE_DIR="$REPO_DIR/.opencode"
OPENCODE_INSTALL_DIR="/root/.opencode"
if [ -d "$OPENCODE_INSTALL_DIR" ]; then
    # If the persisted dir does not exist, copy the initial files so the app continues to work
    if [ ! -d "$PERSISTED_OPENCODE_DIR" ] || [ -z "$(ls -A "$PERSISTED_OPENCODE_DIR" 2>/dev/null)" ]; then
        log "Persisting opencode into $PERSISTED_OPENCODE_DIR"
        mkdir -p "$PERSISTED_OPENCODE_DIR"
        # Copy installed files to the persisted folder to preserve both the binary and state
        cp -a "$OPENCODE_INSTALL_DIR/." "$PERSISTED_OPENCODE_DIR/" || true
        chown -R root:root "$PERSISTED_OPENCODE_DIR" || true
    fi
    # Replace the install dir with a symlink to the persisted directory (only if not a symlink already)
    if [ -e "$OPENCODE_INSTALL_DIR" ] && [ ! -L "$OPENCODE_INSTALL_DIR" ]; then
        log "Symlinking $OPENCODE_INSTALL_DIR -> $PERSISTED_OPENCODE_DIR"
        rm -rf "$OPENCODE_INSTALL_DIR"
        ln -s "$PERSISTED_OPENCODE_DIR" "$OPENCODE_INSTALL_DIR"
    elif [ -L "$OPENCODE_INSTALL_DIR" ]; then
        log "$OPENCODE_INSTALL_DIR already symlinked; skipping"
    fi
fi
# Only ensure opencode command exists - qwen and gemini commands should not be aliased to opencode
|
||||
ensure_cli_wrappers() {
|
||||
local bin_dir="$PERSISTED_OPENCODE_DIR/bin"
|
||||
mkdir -p "$bin_dir"
|
||||
# Only create symlink for opencode command itself
|
||||
if [ ! -L "/usr/local/bin/opencode" ]; then
|
||||
ln -sf "$bin_dir/opencode" "/usr/local/bin/opencode"
|
||||
fi
|
||||
}
|
||||
ensure_cli_wrappers
|
||||
|
||||
# Ensure a root-level opencode.json exists so OpenCode can discover the configured
|
||||
# Ollama/OpenAI-compatible model. This file lives in the persisted storage root
|
||||
# (/home/web/data) and is overwritten on every container start.
|
||||
ensure_root_opencode_config() {
|
||||
local config_path="$REPO_DIR/opencode.json"
|
||||
|
||||
# Allow overrides while defaulting to the PluginCompass Ollama gateway + qwen3 model
|
||||
export OPENCODE_OLLAMA_BASE_URL="${OPENCODE_OLLAMA_BASE_URL:-https://ollama.plugincompass.com}"
|
||||
export OPENCODE_OLLAMA_MODEL="${OPENCODE_OLLAMA_MODEL:-qwen3:0.6b}"
|
||||
|
||||
# Prefer an explicit Ollama key, fall back to OPENCODE_API_KEY (existing env var)
|
||||
export OPENCODE_OLLAMA_API_KEY="${OPENCODE_OLLAMA_API_KEY:-${OPENCODE_API_KEY:-}}"
|
||||
|
||||
mkdir -p "$(dirname "$config_path")"
|
||||
|
||||
log "Writing OpenCode config: ${config_path} (baseURL=${OPENCODE_OLLAMA_BASE_URL}, model=${OPENCODE_OLLAMA_MODEL})"
|
||||
|
||||
python3 - <<'PY' > "$config_path"
|
||||
import json, os
|
||||
|
||||
base_url = (os.environ.get("OPENCODE_OLLAMA_BASE_URL") or "https://ollama.plugincompass.com").rstrip("/")
|
||||
|
||||
model_id = os.environ.get("OPENCODE_OLLAMA_MODEL") or "qwen3:0.6b"
|
||||
api_key = (os.environ.get("OPENCODE_OLLAMA_API_KEY") or "").strip()
|
||||
provider_name = os.environ.get("OPENCODE_OLLAMA_PROVIDER") or "ollama"
|
||||
|
||||
provider_cfg = {
|
||||
"options": {
|
||||
"baseURL": base_url,
|
||||
},
|
||||
"models": {
|
||||
model_id: {
|
||||
"id": model_id,
|
||||
"name": model_id,
|
||||
"tool_call": True,
|
||||
"temperature": True,
|
||||
}
|
||||
},
|
||||
}
|
||||
|
||||
if api_key:
|
||||
provider_cfg["options"]["apiKey"] = api_key
|
||||
|
||||
cfg = {
|
||||
"$schema": "https://opencode.ai/config.json",
|
||||
"model": f"{provider_name}/{model_id}",
|
||||
"small_model": f"{provider_name}/{model_id}",
|
||||
"provider": {
|
||||
provider_name: provider_cfg,
|
||||
},
|
||||
}
|
||||
|
||||
print(json.dumps(cfg, indent=2))
|
||||
PY
|
||||
|
||||
chmod 600 "$config_path" 2>/dev/null || true
|
||||
}
|
||||
ensure_root_opencode_config
|
||||
|
||||
# Set up signal handlers to properly clean up background processes
|
||||
cleanup() {
|
||||
local signal="$1"
|
||||
log "=== SIGNAL RECEIVED: ${signal} ==="
|
||||
log "Initiating graceful shutdown..."
|
||||
|
||||
# Log final resource snapshot before shutdown
|
||||
if type monitor_resources &>/dev/null; then
|
||||
log "Final resource snapshot:"
|
||||
monitor_resources
|
||||
fi
|
||||
|
||||
if [ -n "$MONITOR_PID" ] && kill -0 "$MONITOR_PID" 2>/dev/null; then
|
||||
log "Terminating monitor process (PID: $MONITOR_PID)"
|
||||
kill "$MONITOR_PID" 2>/dev/null || true
|
||||
wait "$MONITOR_PID" 2>/dev/null || true
|
||||
log "Monitor process terminated"
|
||||
fi
|
||||
|
||||
if [ -n "$CHAT_PID" ] && kill -0 "$CHAT_PID" 2>/dev/null; then
|
||||
log "Terminating chat service (PID: $CHAT_PID) - giving it time for graceful shutdown"
|
||||
kill "$CHAT_PID" 2>/dev/null || true
|
||||
|
||||
# Wait up to 25 seconds for graceful shutdown (Docker stop_grace_period is 30s)
|
||||
for i in $(seq 1 25); do
|
||||
if ! kill -0 "$CHAT_PID" 2>/dev/null; then
|
||||
log "Chat service terminated gracefully (${i} seconds)"
|
||||
if type diag_log &>/dev/null; then
|
||||
diag_log "INFO" "Chat service shutdown complete"
|
||||
fi
|
||||
exit 0
|
||||
fi
|
||||
sleep 1
|
||||
done
|
||||
log "WARNING: Chat service did not terminate gracefully, forcing exit"
|
||||
kill -9 "$CHAT_PID" 2>/dev/null || true
|
||||
fi
|
||||
|
||||
log "=== SHUTDOWN COMPLETE ==="
|
||||
exit 0
|
||||
}
|
||||
|
||||
# Set up traps for common signals
|
||||
trap cleanup SIGTERM SIGINT SIGQUIT SIGHUP
|
||||
|
||||
if [ -f "$CHAT_APP_DIR/server.js" ]; then
|
||||
log "Launching chat service on ${CHAT_HOST}:${CHAT_PORT} from $CHAT_APP_DIR"
|
||||
log "Environment: CHAT_PORT=${CHAT_PORT} CHAT_HOST=${CHAT_HOST} CHAT_DATA_ROOT=${REPO_DIR} CHAT_REPO_ROOT=${REPO_DIR}"
|
||||
CHAT_PORT=$CHAT_PORT CHAT_HOST=$CHAT_HOST CHAT_DATA_ROOT=$REPO_DIR CHAT_REPO_ROOT=$REPO_DIR node "$CHAT_APP_DIR/server.js" 2>&1 &
|
||||
CHAT_PID=$!
|
||||
log "Chat service started with PID: $CHAT_PID"
|
||||
elif [ -f "$CHAT_APP_FALLBACK/server.js" ]; then
|
||||
log "Primary chat service not found at $CHAT_APP_DIR, trying fallback at $CHAT_APP_FALLBACK"
|
||||
log "Launching chat service on ${CHAT_HOST}:${CHAT_PORT} from $CHAT_APP_FALLBACK"
|
||||
log "Environment: CHAT_PORT=${CHAT_PORT} CHAT_HOST=${CHAT_HOST} CHAT_DATA_ROOT=${REPO_DIR} CHAT_REPO_ROOT=${REPO_DIR}"
|
||||
CHAT_PORT=$CHAT_PORT CHAT_HOST=$CHAT_HOST CHAT_DATA_ROOT=$REPO_DIR CHAT_REPO_ROOT=$REPO_DIR node "$CHAT_APP_FALLBACK/server.js" 2>&1 &
|
||||
CHAT_PID=$!
|
||||
log "Chat service started with PID: $CHAT_PID"
|
||||
else
|
||||
log "ERROR: Chat service not found at $CHAT_APP_DIR or $CHAT_APP_FALLBACK; skipping startup"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Log initial service status after startup
|
||||
sleep 2
|
||||
if type check_service_status &>/dev/null; then
|
||||
check_service_status "chat" "$CHAT_PORT" "$CHAT_PID"
|
||||
fi
|
||||
|
||||
# Monitor chat service health
|
||||
if [ -n "$CHAT_PID" ]; then
|
||||
(
|
||||
# Initial check
|
||||
sleep 5
|
||||
if type check_service_status &>/dev/null; then
|
||||
check_service_status "chat" "$CHAT_PORT" "$CHAT_PID"
|
||||
fi
|
||||
|
||||
# Periodic monitoring
|
||||
check_count=0
|
||||
while kill -0 "$CHAT_PID" 2>/dev/null; do
|
||||
sleep 30
|
||||
check_count=$((check_count + 1))
|
||||
|
||||
# Every 2 minutes (4 checks), do resource monitoring
|
||||
if [ $((check_count % 4)) -eq 0 ] && type monitor_resources &>/dev/null; then
|
||||
monitor_resources
|
||||
fi
|
||||
|
||||
# Every 5 minutes (10 checks), do service status check
|
||||
if [ $((check_count % 10)) -eq 0 ] && type check_service_status &>/dev/null; then
|
||||
check_service_status "chat" "$CHAT_PORT" "$CHAT_PID"
|
||||
fi
|
||||
done
|
||||
log "ERROR: Chat service (PID: $CHAT_PID) has exited unexpectedly"
|
||||
if type diag_log &>/dev/null; then
|
||||
diag_log "ERROR" "Chat service exited unexpectedly after ${check_count} checks"
|
||||
fi
|
||||
) &
|
||||
MONITOR_PID=$!
|
||||
log "Health monitor started with PID: $MONITOR_PID"
|
||||
fi
|
||||
|
||||
# Start ttyd proxy instead of ttyd directly (on-demand activation)
|
||||
log "Starting ttyd proxy (on-demand mode)"
|
||||
log "ttyd will only run when port 4001 is accessed"
|
||||
log "Idle timeout: 5 minutes of inactivity"
|
||||
exec node /usr/local/bin/ttyd-proxy.js
|
||||
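The `ensure_root_opencode_config` function above drives a Python heredoc from environment defaults. A condensed, standalone sketch of the same pattern (variable and provider names taken from the script; the output shape is illustrative, not the full config):

```shell
# Env-driven JSON generation, as in ensure_root_opencode_config (condensed).
export OPENCODE_OLLAMA_MODEL="${OPENCODE_OLLAMA_MODEL:-qwen3:0.6b}"
export OPENCODE_OLLAMA_PROVIDER="${OPENCODE_OLLAMA_PROVIDER:-ollama}"
python3 - <<'PY'
import json, os

model = os.environ["OPENCODE_OLLAMA_MODEL"]
provider = os.environ["OPENCODE_OLLAMA_PROVIDER"]
cfg = {
    "model": f"{provider}/{model}",
    "provider": {provider: {"models": {model: {"id": model}}}},
}
print(json.dumps(cfg, indent=2))
PY
```

Redirecting the heredoc's stdout to `"$config_path"`, as the script does, turns the printed JSON into the persisted config file.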
206
scripts/healthcheck.sh
Normal file
@@ -0,0 +1,206 @@
#!/bin/bash
# Enhanced health check script for Shopify AI App Builder container
# Checks both ttyd (port 4001) and chat service (port 4000)
# Provides detailed diagnostics for debugging

set -e

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

# Diagnostic log location
DIAG_LOG_DIR="/var/log/shopify-ai"
DIAG_LOG_FILE="${DIAG_LOG_DIR}/healthcheck.log"
mkdir -p "$DIAG_LOG_DIR"

# Health check logging
health_log() {
    local level="$1"
    shift
    local message="$*"
    local timestamp
    timestamp=$(date '+%Y-%m-%d %H:%M:%S')

    # Log to file
    echo "[${timestamp}] [${level}] ${message}" >> "$DIAG_LOG_FILE"

    # Log to stdout for the Docker health check
    echo "${message}"
}

# Port checking function
check_port() {
    local port="$1"
    local service="$2"

    health_log "INFO" "Checking ${service} on port ${port}..."

    # Check using ss (modern alternative to netstat)
    if command -v ss &>/dev/null; then
        if ss -tuln 2>/dev/null | grep -q ":${port} "; then
            health_log "INFO" "✓ ${service} is listening on port ${port}"
        else
            health_log "ERROR" "✗ ${service} is NOT listening on port ${port}"
            return 1
        fi
    # Fall back to netstat if ss is not available
    elif command -v netstat &>/dev/null; then
        if netstat -tuln 2>/dev/null | grep -q ":${port} "; then
            health_log "INFO" "✓ ${service} is listening on port ${port}"
        else
            health_log "ERROR" "✗ ${service} is NOT listening on port ${port}"
            return 1
        fi
    else
        health_log "WARN" "Neither ss nor netstat available for port checking"
        return 1
    fi
}

# HTTP endpoint checking function
check_http() {
    local url="$1"
    local service="$2"
    local timeout="${3:-3}"

    health_log "INFO" "Checking ${service} HTTP endpoint: ${url}"

    if command -v timeout &>/dev/null; then
        if timeout "${timeout}" curl -s -o /dev/null -w "%{http_code}" "${url}" 2>&1 | grep -q "200\|302"; then
            health_log "INFO" "✓ ${service} HTTP endpoint responding (HTTP 200/302)"
            return 0
        else
            health_log "ERROR" "✗ ${service} HTTP endpoint NOT responding (timeout: ${timeout}s)"
            return 1
        fi
    else
        health_log "WARN" "timeout command not available for HTTP check"
        return 1
    fi
}

# Process checking function
check_process() {
    local service="$1"
    local port="$2"

    health_log "INFO" "Checking ${service} process..."

    # Find the process listening on the port
    local pid=""
    if command -v ss &>/dev/null; then
        # ss prints e.g. users:(("node",pid=123,fd=20)); extract just the pid number
        pid=$(ss -tulnp 2>/dev/null | grep ":${port} " | grep -oE 'pid=[0-9]+' | head -n1 | cut -d'=' -f2)
    elif command -v lsof &>/dev/null; then
        pid=$(lsof -ti ":${port}" 2>/dev/null | head -n1)
    fi

    if [ -n "$pid" ]; then
        health_log "INFO" "✓ ${service} process running (PID: ${pid})"

        # Check process memory usage
        if [ -f "/proc/${pid}/status" ]; then
            local mem_mb
            mem_mb=$(awk '/VmRSS/ {printf "%.2f MB", $2/1024}' "/proc/${pid}/status")
            health_log "INFO" "  Memory usage: ${mem_mb}"

            # utime/stime live in /proc/<pid>/stat (fields 14 and 15, in clock ticks;
            # the usual 100 Hz USER_HZ is assumed)
            local cpu_time
            cpu_time=$(awk '{printf "%.2f seconds", ($14 + $15) / 100}' "/proc/${pid}/stat" 2>/dev/null)
            health_log "INFO" "  CPU time: ${cpu_time}"
        fi
        return 0
    else
        health_log "ERROR" "✗ ${service} process NOT found"
        return 1
    fi
}

# System resource check
check_resources() {
    health_log "INFO" "=== System Resources ==="

    # Memory
    if command -v free &>/dev/null; then
        local mem_total mem_used mem_percent
        mem_total=$(free -m | awk '/Mem:/ {print $2}')
        mem_used=$(free -m | awk '/Mem:/ {print $3}')
        mem_percent=$(( (mem_used * 100) / mem_total ))
        health_log "INFO" "Memory: ${mem_used}MB / ${mem_total}MB (${mem_percent}%)"

        if [ $mem_percent -gt 90 ]; then
            health_log "WARN" "⚠ High memory usage: ${mem_percent}%"
        fi
    fi

    # Disk
    if command -v df &>/dev/null; then
        local disk_usage
        disk_usage=$(df / | tail -1 | awk '{print $5}' | sed 's/%//')
        health_log "INFO" "Disk: ${disk_usage}% used"

        if [ "$disk_usage" -gt 80 ]; then
            health_log "WARN" "⚠ High disk usage: ${disk_usage}%"
        fi
    fi

    # Load average
    if command -v uptime &>/dev/null; then
        local load_avg
        load_avg=$(uptime | awk -F'load average:' '{print $2}' | xargs)
        health_log "INFO" "Load average: ${load_avg}"
    fi
}

# Main health check sequence
main() {
    local exit_code=0

    health_log "INFO" "========== HEALTH CHECK START =========="
    health_log "INFO" "Timestamp: $(date '+%Y-%m-%d %H:%M:%S %Z')"

    # Check system resources
    check_resources

    # Check chat service (port 4000)
    health_log "INFO" "=== Chat Service (port 4000) ==="
    if ! check_port 4000 "chat service"; then
        exit_code=1
    fi

    if ! check_http "http://localhost:4000/api/health" "chat service" 3; then
        exit_code=1
    fi

    if ! check_process "chat service" 4000; then
        exit_code=1
    fi

    # Check ttyd service (port 4001) - the proxy runs; ttyd itself starts on demand
    health_log "INFO" "=== TTYD Proxy Service (port 4001) ==="
    if ! check_port 4001 "ttyd-proxy"; then
        exit_code=1
    fi

    # Check if the proxy responds (ttyd may not be running yet - that's OK)
    if ! check_http "http://localhost:4001/" "ttyd-proxy" 10; then
        exit_code=1
    fi

    # Check the proxy process (not ttyd - ttyd starts on demand)
    if ! check_process "ttyd-proxy" 4001; then
        exit_code=1
    fi

    # Note that ttyd itself starts on demand
    health_log "INFO" "ttyd-proxy active (ttyd starts on-demand when visited)"

    health_log "INFO" "========== HEALTH CHECK END ==========="

    if [ $exit_code -eq 0 ]; then
        health_log "INFO" "✓ Health check PASSED"
    else
        health_log "ERROR" "✗ Health check FAILED"
    fi

    return $exit_code
}

# Run main function
main "$@"
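`check_port` falls back from `ss` to `netstat`. In minimal images that ship neither, bash itself can probe a TCP port through its `/dev/tcp` pseudo-device; a dependency-free sketch (bash-only, loopback assumed, not part of the script above):

```shell
#!/bin/bash
# Probe a local TCP port via bash's /dev/tcp redirection (no ss/netstat needed).
# Succeeds (exit 0) only if something accepts the connection.
port_open() {
    local port="$1"
    # Open fd 3 in a subshell so it is closed automatically on return
    (exec 3<>"/dev/tcp/127.0.0.1/${port}") 2>/dev/null
}

if port_open 65535; then
    echo "port 65535: open"
else
    echo "port 65535: closed"
fi
```

This only tells you the port accepts connections, not which process owns it, so it complements rather than replaces the `check_process` lookup.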
24
scripts/run-smoke-tests.ps1
Normal file
@@ -0,0 +1,24 @@
# Runs a set of smoke tests for the opencode CLI inside the running container
param(
    [string]$ContainerName = 'wordpress-plugin-ai-builder'
)

$cmds = @(
    'pwsh --version',
    'node --version',
    'opencode --version',
    'opencode help'
    # Verify our helper alias/function is loaded (source the profile file manually to simulate login)
    '. /root/.config/powershell/Microsoft.PowerShell_profile.ps1; Get-Command github'
)

foreach ($c in $cmds) {
    Write-Host "Running: $c"
    docker compose exec $ContainerName pwsh -NoProfile -Command $c
    if ($LASTEXITCODE -ne 0) {
        Write-Error "Command failed: $c"
        exit 1
    }
}

Write-Host "Smoke tests passed." -ForegroundColor Green
137
scripts/test-email.js
Normal file
@@ -0,0 +1,137 @@
#!/usr/bin/env node
/**
 * Email Configuration Test Script
 * Tests the email system to ensure password reset emails will work correctly
 */

const http = require('http');

const PORT = process.env.CHAT_PORT || 4000;
const HOST = process.env.CHAT_HOST || 'localhost';

console.log('🧪 Testing Email Configuration...\n');

// Test 1: Check if the server is running
function testServerRunning() {
  return new Promise((resolve) => {
    const req = http.request({
      hostname: HOST,
      port: PORT,
      path: '/api/health',
      method: 'GET',
      timeout: 5000
    }, (res) => {
      let data = '';
      res.on('data', chunk => data += chunk);
      res.on('end', () => {
        try {
          const json = JSON.parse(data);
          resolve({ success: json.ok, message: 'Server is running' });
        } catch (e) {
          resolve({ success: false, message: 'Invalid response from server' });
        }
      });
    });
    req.on('error', () => resolve({ success: false, message: 'Server not responding' }));
    req.on('timeout', () => {
      req.destroy();
      resolve({ success: false, message: 'Connection timed out' });
    });
    req.end();
  });
}

// Test 2: Preview the password reset email template
function testEmailPreview() {
  return new Promise((resolve) => {
    const req = http.request({
      hostname: HOST,
      port: PORT,
      path: '/debug/email/preview?type=reset&email=test@example.com&token=test-token-123',
      method: 'GET',
      timeout: 10000
    }, (res) => {
      let data = '';
      res.on('data', chunk => data += chunk);
      res.on('end', () => {
        // The rendered HTML must carry the branding (name and brand color) without
        // leaking the raw template function name into the output
        const hasBrandedHtml = data.includes('Plugin Compass') &&
          data.includes('renderBrandedEmail') === false &&
          data.includes('#004225');
        const hasResetLink = data.includes('reset-password');
        const hasButton = data.includes('Reset password');

        resolve({
          success: hasBrandedHtml && hasResetLink && hasButton,
          message: hasBrandedHtml && hasResetLink && hasButton
            ? 'Email template renders correctly with branding'
            : 'Email template missing required elements',
          details: {
            hasBranding: hasBrandedHtml,
            hasResetLink: hasResetLink,
            hasButton: hasButton
          }
        });
      });
    });
    req.on('error', () => resolve({ success: false, message: 'Failed to connect to preview endpoint' }));
    req.on('timeout', () => {
      req.destroy();
      resolve({ success: false, message: 'Preview request timed out' });
    });
    req.end();
  });
}

async function runTests() {
  console.log('Test 1: Server Status');
  console.log('─'.repeat(50));
  const serverTest = await testServerRunning();
  console.log(`  ${serverTest.success ? '✓' : '✗'} ${serverTest.message}`);
  console.log('');

  if (!serverTest.success) {
    console.log('❌ Server is not running. Start it with: node chat/server.js');
    process.exit(1);
  }

  console.log('Test 2: Email Template Preview');
  console.log('─'.repeat(50));
  const previewTest = await testEmailPreview();
  console.log(`  ${previewTest.success ? '✓' : '✗'} ${previewTest.message}`);
  if (previewTest.details) {
    console.log(`    - Has Plugin Compass branding: ${previewTest.details.hasBranding ? '✓' : '✗'}`);
    console.log(`    - Has reset password link: ${previewTest.details.hasResetLink ? '✓' : '✗'}`);
    console.log(`    - Has CTA button: ${previewTest.details.hasButton ? '✓' : '✗'}`);
  }
  console.log('');

  console.log('─'.repeat(50));
  console.log('📧 Email System Status:');
  console.log('');
  console.log('  The email system is ready. To send actual emails:');
  console.log('');
  console.log('  1. Edit the .env file and configure SMTP settings:');
  console.log('     SMTP_HOST=smtp.gmail.com');
  console.log('     SMTP_PORT=587');
  console.log('     SMTP_USER=your-email@gmail.com');
  console.log('     SMTP_PASS=your-app-password');
  console.log('     SMTP_FROM=noreply@yourdomain.com');
  console.log('');
  console.log('  2. For Gmail, create an app password at:');
  console.log('     https://myaccount.google.com/apppasswords');
  console.log('');
  console.log('  3. Restart the server:');
  console.log('     node chat/server.js');
  console.log('');
  console.log('  4. Test with: node scripts/test-email.js');
  console.log('');
  console.log('💡 Password reset emails include:');
  console.log('   - Professional Plugin Compass branding');
  console.log('   - Green gradient CTA button');
  console.log('   - Mobile-responsive design');
  console.log('   - One-click reset link');
  console.log('');
}

runTests().catch(console.error);
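The Node-based health probe above can be approximated from the shell when debugging; a hypothetical curl equivalent hitting the same `/api/health` endpoint (host and port defaults mirror the script):

```shell
#!/bin/sh
# Manual probe of the chat service health endpoint (defaults from test-email.js).
HOST="${CHAT_HOST:-localhost}"
PORT="${CHAT_PORT:-4000}"
if curl -fsS --max-time 5 "http://${HOST}:${PORT}/api/health" >/dev/null 2>&1; then
    echo "server is running"
else
    echo "server not responding"
fi
```

`-f` makes curl exit non-zero on HTTP errors, so a 500 from the endpoint reports "server not responding" just like a refused connection.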
164
scripts/test-entrypoint-integration.sh
Executable file
@@ -0,0 +1,164 @@
#!/bin/bash
# Integration test for entrypoint.sh sanitization
# This simulates what happens when Portainer passes environment variables with Unicode characters

set -e

# Define Unicode character constants for testing
readonly U200E_HEX=$'\xE2\x80\x8E' # U+200E Left-to-Right Mark
readonly U200B_HEX=$'\xE2\x80\x8B' # U+200B Zero Width Space

echo "=========================================="
echo "Entrypoint.sh Sanitization Integration Test"
echo "=========================================="
echo ""

# Create a test directory
TEST_DIR=$(mktemp -d /tmp/entrypoint_test.XXXXXX)
cd "$TEST_DIR"

echo "Test 1: Simulating environment with invisible Unicode characters..."

# Create a script that exports variables with invisible Unicode characters
# This simulates what Portainer would do when env vars contain U+200E
cat > "${TEST_DIR}/set_env_with_unicode.sh" << 'EOF'
#!/bin/bash
# Simulate a Portainer environment with invisible Unicode characters
export ADMIN_USER="testuser"
export ADMIN_PASSWORD="testpass"
export OPENROUTER_API_KEY="test-key-123"
EOF

# Add an invisible U+200E character after variable names (simulating the Portainer bug)
printf "export TEST_VAR_U200E%s=\"value1\"\n" "$U200E_HEX" >> "${TEST_DIR}/set_env_with_unicode.sh"
printf "export TEST_VAR_U200B%s=\"value2\"\n" "$U200B_HEX" >> "${TEST_DIR}/set_env_with_unicode.sh"

echo "  Created test environment with Unicode characters"
echo ""

# Verify the test file has Unicode characters
echo "Test 2: Verifying test environment has invisible characters..."
if grep -q "$U200E_HEX" "${TEST_DIR}/set_env_with_unicode.sh" 2>/dev/null; then
    echo "  ✓ U+200E detected in test file"
else
    echo "  ✗ Failed to create test file with U+200E"
    exit 1
fi
echo ""

# Source the environment with Unicode characters
echo "Test 3: Loading environment with Unicode characters..."
source "${TEST_DIR}/set_env_with_unicode.sh" 2>/dev/null || true
echo "  Environment loaded"
echo ""

# Extract and test just the sanitization function from entrypoint.sh
echo "Test 4: Testing sanitization function..."
RESULT_FILE=$(mktemp /tmp/sanitized_result.XXXXXX)
cat > "${TEST_DIR}/test_sanitize.sh" << SANITIZE_EOF
#!/bin/bash
sanitize_env_vars() {
    echo "Sanitizing environment variables..."

    # Create a secure temporary file
    local temp_env
    temp_env=\$(mktemp /tmp/sanitized_env.XXXXXX)

    # Export the current environment to a file, then clean it
    export -p > "\$temp_env"

    # Remove common invisible Unicode characters
    sed -i \\
        -e 's/\\xE2\\x80\\x8E//g' \\
        -e 's/\\xE2\\x80\\x8F//g' \\
        -e 's/\\xE2\\x80\\x8B//g' \\
        -e 's/\\xEF\\xBB\\xBF//g' \\
        -e 's/\\xE2\\x80\\xAA//g' \\
        -e 's/\\xE2\\x80\\xAB//g' \\
        -e 's/\\xE2\\x80\\xAC//g' \\
        -e 's/\\xE2\\x80\\xAD//g' \\
        -e 's/\\xE2\\x80\\xAE//g' \\
        "\$temp_env" 2>/dev/null

    # Source the sanitized environment
    if ! source "\$temp_env" 2>/dev/null; then
        echo "WARNING: Failed to source sanitized environment"
    fi

    # Clean up the temporary file
    rm -f "\$temp_env"

    echo "Environment variables sanitized successfully"
}

# Run the sanitization
sanitize_env_vars

# Verify variables are still accessible after sanitization
echo ""
echo "Verifying sanitized environment variables:"
echo "  ADMIN_USER=\$ADMIN_USER"
echo "  ADMIN_PASSWORD=\$ADMIN_PASSWORD"
echo "  OPENROUTER_API_KEY=\$OPENROUTER_API_KEY"

# Export the sanitized environment for verification
export -p > "$RESULT_FILE"
SANITIZE_EOF

chmod +x "${TEST_DIR}/test_sanitize.sh"

# Run the sanitization test
bash "${TEST_DIR}/test_sanitize.sh"
echo ""

# Verify the sanitized output doesn't have Unicode characters
echo "Test 5: Verifying Unicode characters are removed after sanitization..."
if [ ! -f "$RESULT_FILE" ]; then
    echo "  ✗ FAILED: Could not find sanitized result file"
    exit 1
fi

if grep -q "$U200E_HEX" "$RESULT_FILE" 2>/dev/null; then
    echo "  ✗ FAILED: U+200E still present after sanitization"
    exit 1
else
    echo "  ✓ U+200E successfully removed"
fi

if grep -q "$U200B_HEX" "$RESULT_FILE" 2>/dev/null; then
    echo "  ✗ FAILED: U+200B still present after sanitization"
    exit 1
else
    echo "  ✓ U+200B successfully removed"
fi
echo ""

# Verify environment variables are preserved
echo "Test 6: Verifying environment variables are preserved..."
source "$RESULT_FILE"

if [ "$ADMIN_USER" = "testuser" ]; then
    echo "  ✓ ADMIN_USER preserved correctly"
else
    echo "  ✗ FAILED: ADMIN_USER=$ADMIN_USER (expected: testuser)"
    exit 1
fi

if [ "$ADMIN_PASSWORD" = "testpass" ]; then
    echo "  ✓ ADMIN_PASSWORD preserved correctly"
else
    echo "  ✗ FAILED: ADMIN_PASSWORD=$ADMIN_PASSWORD (expected: testpass)"
    exit 1
fi
echo ""

# Cleanup
rm -rf "$TEST_DIR"
rm -f "$RESULT_FILE"

echo "=========================================="
echo "All integration tests PASSED! ✓"
echo "=========================================="
echo ""
echo "The entrypoint.sh sanitization will automatically fix"
echo "the Portainer U+200E error on container startup."
134
scripts/test-env-sanitization.sh
Executable file
@@ -0,0 +1,134 @@
#!/bin/bash
# Test script to verify environment variable sanitization
# This tests that invisible Unicode characters are properly removed

set -e

# Define Unicode character constants for testing
readonly U200E_HEX=$'\xE2\x80\x8E' # U+200E Left-to-Right Mark
readonly U200B_HEX=$'\xE2\x80\x8B' # U+200B Zero Width Space

echo "Testing environment variable sanitization..."
echo ""

# Create a test file with problematic Unicode characters
TEST_FILE=$(mktemp /tmp/test_env_with_unicode.XXXXXX.sh)

# Create a test environment with U+200E (Left-to-Right Mark) after variable names
cat > "$TEST_FILE" << 'EOF'
# Test environment variables with invisible Unicode characters
export ADMIN_USER="testuser"
export ADMIN_PASSWORD="testpass123"
export OPENROUTER_API_KEY="sk-test-key-12345"
export NORMAL_VAR="normalvalue"
EOF

# Add an actual invisible U+200E character (E2 80 8E in UTF-8) to the file
# This simulates what happens when users copy-paste from web browsers
printf "export TEST_VAR_WITH_U200E%s=\"value_with_unicode\"\n" "$U200E_HEX" >> "$TEST_FILE"
printf "export TEST_VAR_WITH_U200B%s=\"value_with_zwsp\"\n" "$U200B_HEX" >> "$TEST_FILE"

echo "Original test file (with invisible characters):"
hexdump -C "$TEST_FILE" | grep -E "e2 80" || echo "  (invisible characters present but not shown)"
echo ""

# Test 1: Verify the original file has Unicode characters
echo "Test 1: Checking for invisible Unicode characters in original file..."
if grep -q "$U200E_HEX" "$TEST_FILE" 2>/dev/null; then
    echo "  ✓ U+200E (Left-to-Right Mark) detected"
else
    echo "  ✗ U+200E not found (expected to find it)"
fi

if grep -q "$U200B_HEX" "$TEST_FILE" 2>/dev/null; then
    echo "  ✓ U+200B (Zero Width Space) detected"
else
    echo "  ✗ U+200B not found (expected to find it)"
fi
echo ""

# Test 2: Apply the same sanitization logic from entrypoint.sh
echo "Test 2: Applying sanitization..."
SANITIZED_FILE=$(mktemp /tmp/test_env_sanitized.XXXXXX.sh)
cp "$TEST_FILE" "$SANITIZED_FILE"

# Remove common invisible Unicode characters (same logic as entrypoint.sh)
sed -i \
    -e 's/\xE2\x80\x8E//g' \
    -e 's/\xE2\x80\x8F//g' \
    -e 's/\xE2\x80\x8B//g' \
    -e 's/\xEF\xBB\xBF//g' \
    -e 's/\xE2\x80\xAA//g' \
    -e 's/\xE2\x80\xAB//g' \
    -e 's/\xE2\x80\xAC//g' \
    -e 's/\xE2\x80\xAD//g' \
    -e 's/\xE2\x80\xAE//g' \
    "$SANITIZED_FILE" 2>/dev/null

echo "  Sanitization complete"
echo ""

# Test 3: Verify Unicode characters are removed
echo "Test 3: Verifying invisible characters are removed..."
if grep -q "$U200E_HEX" "$SANITIZED_FILE" 2>/dev/null; then
    echo "  ✗ FAILED: U+200E still present after sanitization"
    exit 1
else
    echo "  ✓ U+200E successfully removed"
fi

if grep -q "$U200B_HEX" "$SANITIZED_FILE" 2>/dev/null; then
    echo "  ✗ FAILED: U+200B still present after sanitization"
    exit 1
else
    echo "  ✓ U+200B successfully removed"
fi
echo ""

# Test 4: Verify the sanitized file is valid bash and can be sourced
echo "Test 4: Testing if sanitized file is valid bash..."
if bash -n "$SANITIZED_FILE" 2>/dev/null; then
    echo "  ✓ Sanitized file has valid bash syntax"
else
    echo "  ✗ FAILED: Sanitized file has syntax errors"
    cat "$SANITIZED_FILE"
    exit 1
fi
echo ""

# Test 5: Try sourcing the sanitized environment
echo "Test 5: Testing if sanitized environment can be sourced..."
(
    source "$SANITIZED_FILE"
    if [ "$ADMIN_USER" = "testuser" ] && [ "$ADMIN_PASSWORD" = "testpass123" ]; then
        echo "  ✓ Environment variables loaded correctly"
    else
        echo "  ✗ FAILED: Environment variables not loaded correctly"
        echo "    ADMIN_USER=$ADMIN_USER (expected: testuser)"
        echo "    ADMIN_PASSWORD=$ADMIN_PASSWORD (expected: testpass123)"
        exit 1
    fi
)
echo ""

# Test 6: Compare file sizes (the sanitized file should be smaller)
ORIGINAL_SIZE=$(wc -c < "$TEST_FILE")
SANITIZED_SIZE=$(wc -c < "$SANITIZED_FILE")
REMOVED_BYTES=$((ORIGINAL_SIZE - SANITIZED_SIZE))

echo "Test 6: Verifying bytes were removed..."
if [ $REMOVED_BYTES -gt 0 ]; then
    echo "  ✓ Removed $REMOVED_BYTES bytes of invisible Unicode characters"
else
    echo "  ✗ WARNING: No bytes removed (original: $ORIGINAL_SIZE, sanitized: $SANITIZED_SIZE)"
fi
echo ""

# Cleanup
rm -f "$TEST_FILE" "$SANITIZED_FILE"

echo "=================================="
echo "All sanitization tests PASSED! ✓"
echo "=================================="
echo ""
echo "The entrypoint.sh sanitization logic will prevent the Portainer U+200E error."
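The detect-then-strip cycle these tests exercise fits in a few lines; a minimal round-trip sketch using the same UTF-8 byte sequences and GNU `sed` `\xHH` escapes as the scripts above:

```shell
#!/bin/bash
# Plant a U+200E after a variable name, detect it, strip it, and re-check.
f=$(mktemp)
printf 'export ADMIN_USER\xE2\x80\x8E="testuser"\n' > "$f"

grep -q $'\xE2\x80\x8E' "$f" && echo "U+200E present"

# Same sed invocation style as entrypoint.sh / clean-env.sh
sed -i 's/\xE2\x80\x8E//g' "$f"

grep -q $'\xE2\x80\x8E' "$f" || echo "U+200E removed"
rm -f "$f"
```

Matching on the raw byte sequence (`$'\xE2\x80\x8E'`) keeps the check locale-independent, which is why the scripts prefer it over Unicode-aware regex classes.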
281
scripts/ttyd-proxy.js
Normal file
@@ -0,0 +1,281 @@
#!/usr/bin/env node

const { spawn } = require('child_process');
const http = require('http');
const net = require('net');
const fs = require('fs');
const path = require('path');

const TTYD_PORT = 4001;
const TTYD_HOST = '0.0.0.0';
const IDLE_TIMEOUT_MS = 5 * 60 * 1000; // 5 minutes of inactivity
const STARTUP_TIMEOUT_MS = 10000; // 10 seconds to start ttyd

let ttydProcess = null;
let lastActivityTime = Date.now();
let clientCount = 0;
let startupPromise = null;

const logFile = '/var/log/ttyd-proxy.log';

function log(message) {
  const timestamp = new Date().toISOString();
  const logLine = `[${timestamp}] ${message}\n`;
  fs.appendFileSync(logFile, logLine);
  console.log(logLine.trim());
}

function startTtyd() {
  if (startupPromise) {
    return startupPromise;
  }

  if (ttydProcess) {
    return Promise.resolve();
  }

  log('Starting ttyd process...');

  startupPromise = new Promise((resolve, reject) => {
    const args = [
      '-W',
      '-p', '4002', // Use port 4002 for the actual ttyd instance
      '-i', '0.0.0.0',
      '/usr/bin/pwsh'
    ];

    // Add password protection if ACCESS_PASSWORD is set
    if (process.env.ACCESS_PASSWORD) {
      args.splice(0, 0, '-c', `user:${process.env.ACCESS_PASSWORD}`);
    }

    ttydProcess = spawn('/usr/local/bin/ttyd', args, {
      env: process.env,
      stdio: ['ignore', 'pipe', 'pipe']
    });

    const startupTimer = setTimeout(() => {
      if (ttydProcess) {
        log('ttyd startup timeout, killing process');
        ttydProcess.kill('SIGKILL');
        ttydProcess = null;
        startupPromise = null;
        reject(new Error('ttyd startup timeout'));
      }
    }, STARTUP_TIMEOUT_MS);

    ttydProcess.stdout.on('data', (data) => {
      log(`ttyd stdout: ${data.slice(0, 200)}`);
    });

    ttydProcess.stderr.on('data', (data) => {
      const stderr = data.toString();
      if (!stderr.includes('client disconnected') && !stderr.includes('new client')) {
        log(`ttyd stderr: ${stderr.slice(0, 200)}`);
      }
    });

    ttydProcess.on('exit', (code, signal) => {
      log(`ttyd exited: code=${code}, signal=${signal}`);
      ttydProcess = null;
      startupPromise = null;
    });

    // Wait for ttyd to start listening on port 4002
    let attempts = 0;
    const checkReady = () => {
      attempts++;
      // Connect to loopback explicitly; 0.0.0.0 is a bind address, not a
      // reliable connect target on all platforms.
      const client = net.connect(4002, '127.0.0.1', () => {
        clearTimeout(startupTimer);
        client.destroy();
        log('ttyd is ready on port 4002');
        resolve();
      });

      client.on('error', () => {
        client.destroy();
        if (attempts < 50) {
          setTimeout(checkReady, 200);
        } else {
          clearTimeout(startupTimer);
          if (ttydProcess) {
            ttydProcess.kill('SIGKILL');
            ttydProcess = null;
          }
          startupPromise = null;
          reject(new Error('ttyd failed to start'));
        }
      });
    };

    checkReady();
  });

  return startupPromise;
}
function stopTtyd() {
  if (ttydProcess) {
    log('Stopping idle ttyd process...');
    ttydProcess.kill('SIGTERM');
    setTimeout(() => {
      if (ttydProcess) {
        ttydProcess.kill('SIGKILL');
        ttydProcess = null;
      }
    }, 5000);
  }
}

function checkIdleTimeout() {
  const now = Date.now();
  const idleTime = now - lastActivityTime;

  if (clientCount === 0 && idleTime > IDLE_TIMEOUT_MS) {
    log(`ttyd idle for ${idleTime}ms, stopping...`);
    stopTtyd();
  }
}

setInterval(checkIdleTimeout, 60000); // Check every minute

const proxy = http.createServer((clientReq, clientRes) => {
  lastActivityTime = Date.now();

  // Update client count based on request type
  if (clientReq.url === '/' && clientReq.method === 'GET') {
    clientCount++;
  }

  const handleProxy = async () => {
    try {
      await startTtyd();

      const options = {
        hostname: 'localhost',
        port: 4002,
        path: clientReq.url,
        method: clientReq.method,
        headers: {
          ...clientReq.headers,
          host: 'localhost:4002'
        }
      };

      const proxyReq = http.request(options, (proxyRes) => {
        clientRes.writeHead(proxyRes.statusCode, proxyRes.headers);
        proxyRes.pipe(clientRes);

        proxyRes.on('end', () => {
          clientCount = Math.max(0, clientCount - 1);
          lastActivityTime = Date.now();
        });
      });

      proxyReq.on('error', (err) => {
        log(`Proxy request error: ${err.message}`);
        clientCount = Math.max(0, clientCount - 1);
        if (!clientRes.headersSent) {
          clientRes.writeHead(502, { 'Content-Type': 'text/plain' });
          clientRes.end('Bad Gateway: ttyd not available');
        }
      });

      clientReq.pipe(proxyReq);
    } catch (err) {
      log(`Proxy error: ${err.message}`);
      clientCount = Math.max(0, clientCount - 1);
      if (!clientRes.headersSent) {
        clientRes.writeHead(503, { 'Content-Type': 'text/plain' });
        clientRes.end('Service Unavailable: ttyd failed to start');
      }
    }
  };

  handleProxy();
});
proxy.on('upgrade', (clientReq, clientSocket, head) => {
  lastActivityTime = Date.now();
  clientCount++;

  const handleUpgrade = async () => {
    try {
      await startTtyd();

      const proxyReq = http.request({
        hostname: 'localhost',
        port: 4002,
        path: clientReq.url,
        method: clientReq.method,
        headers: {
          ...clientReq.headers,
          host: 'localhost:4002'
        }
      });

      proxyReq.on('upgrade', (proxyRes, proxySocket, proxyHead) => {
        clientSocket.write(
          'HTTP/1.1 101 Switching Protocols\r\n' +
          `Upgrade: ${proxyRes.headers.upgrade}\r\n` +
          `Connection: Upgrade\r\n` +
          Object.entries(proxyRes.headers)
            .filter(([k]) => k !== 'upgrade' && k !== 'connection')
            .map(([k, v]) => `${k}: ${v}`)
            .join('\r\n') +
          '\r\n\r\n'
        );

        proxySocket.pipe(clientSocket).pipe(proxySocket);

        proxySocket.on('close', () => {
          clientCount = Math.max(0, clientCount - 1);
          lastActivityTime = Date.now();
        });

        clientSocket.on('close', () => {
          clientCount = Math.max(0, clientCount - 1);
          lastActivityTime = Date.now();
          proxySocket.end();
        });
      });

      proxyReq.on('error', (err) => {
        log(`WebSocket proxy error: ${err.message}`);
        clientCount = Math.max(0, clientCount - 1);
        clientSocket.end();
      });

      proxyReq.end();
    } catch (err) {
      log(`WebSocket upgrade error: ${err.message}`);
      clientCount = Math.max(0, clientCount - 1);
      clientSocket.end();
    }
  };

  handleUpgrade();
});

proxy.listen(TTYD_PORT, TTYD_HOST, () => {
  log(`ttyd proxy listening on ${TTYD_HOST}:${TTYD_PORT}`);
  log(`ttyd starts on demand and stops after ${IDLE_TIMEOUT_MS / 60000} minutes of inactivity`);
});

process.on('SIGTERM', () => {
  log('Received SIGTERM, shutting down...');
  stopTtyd();
  proxy.close(() => {
    log('Proxy server closed');
    process.exit(0);
  });
});

process.on('SIGINT', () => {
  log('Received SIGINT, shutting down...');
  stopTtyd();
  proxy.close(() => {
    log('Proxy server closed');
    process.exit(0);
  });
});
92
scripts/validate-env.sh
Executable file
@@ -0,0 +1,92 @@
#!/bin/bash
# Validate .env files for common issues including invisible Unicode characters.
# The file path argument is optional and defaults to .env.
# This script can be run before deployment to catch potential problems.

set -e

ENV_FILE="${1:-.env}"

if [ ! -f "$ENV_FILE" ]; then
    echo "✓ No $ENV_FILE file found (using environment variables or defaults)"
    exit 0
fi

echo "Validating $ENV_FILE..."
echo ""

# Check for invisible Unicode characters
FOUND_ISSUES=0

# Check for U+200E (Left-to-Right Mark) - E2 80 8E
if grep -q $'\xE2\x80\x8E' "$ENV_FILE" 2>/dev/null; then
    echo "✗ Found U+200E (Left-to-Right Mark) invisible character"
    grep -n $'\xE2\x80\x8E' "$ENV_FILE" | head -5
    FOUND_ISSUES=1
fi

# Check for U+200F (Right-to-Left Mark) - E2 80 8F
if grep -q $'\xE2\x80\x8F' "$ENV_FILE" 2>/dev/null; then
    echo "✗ Found U+200F (Right-to-Left Mark) invisible character"
    grep -n $'\xE2\x80\x8F' "$ENV_FILE" | head -5
    FOUND_ISSUES=1
fi

# Check for U+200B (Zero Width Space) - E2 80 8B
if grep -q $'\xE2\x80\x8B' "$ENV_FILE" 2>/dev/null; then
    echo "✗ Found U+200B (Zero Width Space) invisible character"
    grep -n $'\xE2\x80\x8B' "$ENV_FILE" | head -5
    FOUND_ISSUES=1
fi

# Check for BOM (Byte Order Mark) - EF BB BF
if grep -q $'\xEF\xBB\xBF' "$ENV_FILE" 2>/dev/null; then
    echo "✗ Found BOM (Byte Order Mark) at start of file"
    FOUND_ISSUES=1
fi

# Check for other directional formatting characters (U+202A-202E)
for code in $'\xE2\x80\xAA' $'\xE2\x80\xAB' $'\xE2\x80\xAC' $'\xE2\x80\xAD' $'\xE2\x80\xAE'; do
    if grep -q "$code" "$ENV_FILE" 2>/dev/null; then
        echo "✗ Found directional formatting invisible character"
        FOUND_ISSUES=1
        break
    fi
done

# Check for Windows line endings (CRLF) if the file command is available
if command -v file >/dev/null 2>&1; then
    if file "$ENV_FILE" | grep -q CRLF 2>/dev/null; then
        echo "⚠ Warning: File has Windows line endings (CRLF)"
        echo "  This may cause issues on Linux. Consider converting to LF."
    fi
fi

# Check for spaces before '=' in variable definitions
if grep -E '^[A-Z_]+[[:space:]]+=' "$ENV_FILE" >/dev/null 2>&1; then
    echo "⚠ Warning: Found spaces before '=' in variable definitions"
    grep -n -E '^[A-Z_]+[[:space:]]+=' "$ENV_FILE"
fi

# Check for spaces in variable names
if grep -E '^[A-Z_]+[[:space:]]+[^=]*=' "$ENV_FILE" >/dev/null 2>&1; then
    echo "⚠ Warning: Possible spaces in variable names"
fi

if [ $FOUND_ISSUES -eq 0 ]; then
    echo "✓ No invisible Unicode characters found"
    echo "✓ File looks clean!"
    exit 0
else
    echo ""
    echo "----------------------------------------"
    echo "Issues found! Run the following to fix:"
    echo "  ./scripts/clean-env.sh $ENV_FILE"
    echo "----------------------------------------"
    exit 1
fi
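The detection trick validate-env.sh relies on is worth seeing in isolation: bash's `$'...'` quoting turns `\xE2\x80\x8E` into the raw UTF-8 bytes of U+200E, so plain `grep` can match the invisible character (a standalone demo; `/tmp/check.env` is a throwaway file):

```shell
# Write one clean variable and one tainted by U+200E (bytes E2 80 8E).
printf 'GOOD=1\nBAD=x\xe2\x80\x8e\n' > /tmp/check.env
# $'...' expands the hex escapes to raw bytes before grep ever sees them.
if grep -q $'\xE2\x80\x8E' /tmp/check.env; then
    echo "found U+200E"
fi
```

Running this prints `found U+200E`; the same three-byte pattern drives the `grep -n` calls that show offending line numbers in the script above.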
762
scripts/validate-woocommerce.sh
Normal file
@@ -0,0 +1,762 @@
#!/usr/bin/env bash
set -euo pipefail

# ==============================================================================
# STRICT WooCommerce Plugin Compatibility Validator
# ==============================================================================
# This script performs aggressive static analysis to ensure WooCommerce plugins
# meet modern standards, specifically HPOS compatibility and security.
# ==============================================================================

PLUGIN_DIR="${1:-}"

# Color codes
RED='\033[0;31m'
YELLOW='\033[1;33m'
GREEN='\033[0;32m'
BLUE='\033[0;34m'
CYAN='\033[0;36m'
MAGENTA='\033[0;35m'
NC='\033[0m' # No Color

# Counters
CRITICAL_COUNT=0
WARNING_COUNT=0

# Usage check
if [[ -z "${PLUGIN_DIR}" ]]; then
    echo "Usage: $(basename "$0") /path/to/plugin" >&2
    exit 1
fi

if [[ ! -d "${PLUGIN_DIR}" ]]; then
    echo "Plugin directory not found: ${PLUGIN_DIR}" >&2
    exit 1
fi

echo -e "${BLUE}========================================================${NC}"
echo -e "${BLUE}     STRICT WOOCOMMERCE COMPATIBILITY VALIDATION        ${NC}"
echo -e "${BLUE}========================================================${NC}"
echo "Target: ${PLUGIN_DIR}"
echo ""

# Gather files
mapfile -t php_files < <(find "${PLUGIN_DIR}" -type f -name "*.php" ! -path "*/vendor/*" ! -path "*/node_modules/*" 2>/dev/null | sort || true)

if [[ "${#php_files[@]}" -eq 0 ]]; then
    echo -e "${RED}✗ FATAL: No PHP files found.${NC}"
    exit 1
fi
# ============================================
# 1. HPOS STRICT MODE (CRITICAL)
# ============================================
echo -e "${MAGENTA}[1/10] HPOS (High-Performance Order Storage) Strict Analysis...${NC}"

hpos_declared=false
hpos_compatible=false

# 1.1 Check Declaration
for file in "${php_files[@]}"; do
    if grep -q "FeaturesUtil::declare_compatibility" "${file}"; then
        hpos_declared=true
        if grep -q "'custom_order_tables'" "${file}" || grep -q '"custom_order_tables"' "${file}"; then
            hpos_compatible=true
        fi
    fi
done

if ! $hpos_declared; then
    echo -e "${RED}  ✗ FAILURE: HPOS compatibility not declared.${NC}"
    echo "    Must use: \Automattic\WooCommerce\Utilities\FeaturesUtil::declare_compatibility"
    CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
elif ! $hpos_compatible; then
    echo -e "${RED}  ✗ FAILURE: HPOS declared but 'custom_order_tables' missing.${NC}"
    CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
else
    echo -e "${GREEN}  ✓ HPOS compatibility declared.${NC}"
fi

# 1.2 Check for Legacy Direct DB Access (Strict)
# Flag any direct SQL queries that mention the posts table AND shop_order in the same file
for file in "${php_files[@]}"; do
    # Check for SQL queries joining posts/postmeta looking for orders
    if grep -E "wp_posts|wp_postmeta" "${file}" | grep -qE "shop_order|_order_"; then
        # Exclude if comments explicitly say it's a legacy fallback
        if ! grep -q "HPOS usage" "${file}"; then
            echo -e "${RED}  ✗ HPOS VIOLATION in ${file}:${NC}"
            echo "    Detected direct 'wp_posts'/'wp_postmeta' access potentially for orders."
            echo "    HPOS requires using WC_Order CRUD methods or specific HPOS tables."
            grep -nE "wp_posts|wp_postmeta" "${file}" | head -3
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    fi

    # Check for legacy metadata functions on orders.
    # We flag usage of get_post_meta where the variable name implies an order.
    # Single quotes here: a double-quoted "\$(order|...)" would be parsed by
    # bash as a command substitution.
    if grep -nE '(get|update|add|delete)_post_meta\s*\(\s*\$(order|wc_order|item)' "${file}"; then
        echo -e "${RED}  ✗ HPOS VIOLATION in ${file}:${NC}"
        echo "    Detected legacy meta function on order variable."
        echo "    Use \$order->get_meta(), \$order->update_meta_data(), etc."
        CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
    fi

    # Check for direct ID access (e.g., $order->id)
    if grep -nE '\$[a-zA-Z0-9_]*order->id' "${file}" | grep -v "get_id()"; then
        echo -e "${RED}  ✗ DEPRECATED ACCESS in ${file}:${NC}"
        echo "    Detected direct access to \$order->id. Use \$order->get_id()."
        CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
    fi
done
echo ""
# ============================================
# 2. WOOCOMMERCE VERSION HEADERS
# ============================================
echo -e "${MAGENTA}[2/10] Validating Version Headers...${NC}"

# Find main plugin file
main_file=""
for file in "${php_files[@]}"; do
    if grep -q "Plugin Name:" "${file}"; then
        main_file="${file}"
        break
    fi
done

if [[ -z "${main_file}" ]]; then
    echo -e "${RED}  ✗ FAILURE: Main plugin file not found.${NC}"
    CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
else
    # '|| true' keeps 'set -e' from aborting when a header line is absent
    wc_tested=$(grep "WC tested up to:" "${main_file}" | sed 's/.*://' | tr -d '[:space:]' || true)
    wc_required=$(grep "WC requires at least:" "${main_file}" | sed 's/.*://' | tr -d '[:space:]' || true)

    if [[ -z "$wc_tested" ]]; then
        echo -e "${RED}  ✗ FAILURE: Missing 'WC tested up to' header.${NC}"
        CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
    else
        # Simple check if tested version is reasonably recent (starts with 8 or 9)
        if [[ ! "$wc_tested" =~ ^(8|9)\. ]]; then
            echo -e "${YELLOW}  ⚠ WARNING: 'WC tested up to' ($wc_tested) seems old (pre-8.0).${NC}"
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi

    if [[ -z "$wc_required" ]]; then
        echo -e "${RED}  ✗ FAILURE: Missing 'WC requires at least' header.${NC}"
        CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
    fi
fi
echo ""
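The header-extraction pipeline from step 2 is easy to test in isolation: grep the header line, strip everything up to the colon, then strip whitespace (a standalone demo; `/tmp/plugin-demo.php` is a throwaway stand-in for the main plugin file):

```shell
# A minimal plugin header block.
printf '<?php\n/*\nPlugin Name: Demo\nWC tested up to: 9.3\n*/\n' > /tmp/plugin-demo.php
# grep picks the header line, sed drops everything through the colon,
# tr removes the remaining whitespace; '|| true' mirrors the set -e guard.
wc_tested=$(grep "WC tested up to:" /tmp/plugin-demo.php | sed 's/.*://' | tr -d '[:space:]' || true)
echo "$wc_tested"
```

This prints `9.3`, which the script then matches against `^(8|9)\.` to decide whether the declared version is recent.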
# ============================================
# 3. DEPRECATED FUNCTIONS & HOOKS (Strict)
# ============================================
echo -e "${MAGENTA}[3/10] Scanning for Deprecated WooCommerce Code...${NC}"

DEPRECATED_PATTERNS=(
    "woocommerce_get_page_id:Use wc_get_page_id()"
    "woocommerce_get_page_permalink:Use wc_get_page_permalink()"
    "woocommerce_show_messages:Use wc_print_notices()"
    "wc_current_theme_is_fse_theme:Use wp_is_block_theme() (WC 9.9+)"
    "global \$woocommerce:Use WC()"
    "\$woocommerce->cart:Use WC()->cart"
    "\$woocommerce->customer:Use WC()->customer"
    "store_api_validate_add_to_cart:Use woocommerce_store_api_validate_add_to_cart"
    "jquery-blockui:Use wc-jquery-blockui"
    "woocommerce_coupon_error:Use wc_add_notice()"
    "woocommerce_add_notice:Use wc_add_notice()"
    "woocommerce_get_template:Use wc_get_template()"
    "woocommerce_get_template_part:Use wc_get_template_part()"
    "woocommerce_locate_template:Use wc_locate_template()"
    "WC()->cart->get_cart_from_session:Removed in WC 8.2+"
    "woocommerce_calculate_totals:Use recalculate_totals()"
    "woocommerce_get_product_from_item:Use \$item->get_product()"
    "woocommerce_get_order_item_meta:Use \$item->get_meta()"
    "woocommerce_downloadable_file_permission:Use wc_downloadable_file_permission()"
    "woocommerce_sanitize_taxonomy_name:Use wc_sanitize_taxonomy_name()"
    "woocommerce_clean:Use wc_clean()"
    "woocommerce_array_overlay:Use wc_array_overlay()"
)

for file in "${php_files[@]}"; do
    for pattern_set in "${DEPRECATED_PATTERNS[@]}"; do
        pattern="${pattern_set%%:*}"
        advice="${pattern_set##*:}"

        if grep -q "$pattern" "${file}"; then
            echo -e "${RED}  ✗ DEPRECATED CODE in ${file}:${NC}"
            echo "    Found: '$pattern'"
            echo "    Fix: $advice"
            grep -n "$pattern" "${file}" | head -1
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    done
done
echo ""
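The `pattern:advice` entries in DEPRECATED_PATTERNS are split with bash parameter expansion: `%%:*` trims from the first colon to the end, `##*:` trims everything up to the last colon. A standalone demo:

```shell
pattern_set="woocommerce_get_page_id:Use wc_get_page_id()"
echo "${pattern_set%%:*}"   # the grep pattern before the colon
echo "${pattern_set##*:}"   # the remediation advice after the colon
```

This prints `woocommerce_get_page_id` and then `Use wc_get_page_id()`. One design constraint follows from the expansion rules: because `##*:` trims through the *last* colon, the advice text must never itself contain a colon, or the pattern and advice halves would disagree.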
# ============================================
# 4. DATABASE & SCHEMA SAFETY
# ============================================
echo -e "${MAGENTA}[4/10] Validating Database Usage...${NC}"

for file in "${php_files[@]}"; do
    # Check for creating tables without dbDelta
    if grep -q "CREATE TABLE" "${file}"; then
        if ! grep -q "dbDelta" "${file}"; then
            echo -e "${RED}  ✗ DB SAFETY in ${file}:${NC}"
            echo "    'CREATE TABLE' found without 'dbDelta'. This is a WP standard violation."
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    fi

    # Strict check: Using standard WP tables for order data
    if grep -E "INSERT INTO.*wp_options" "${file}" | grep -q "woocommerce"; then
        echo -e "${YELLOW}  ⚠ PERFORMANCE: Inserting WooCommerce data directly into wp_options in ${file}.${NC}"
        echo "    Avoid cluttering the autoloaded options table."
        WARNING_COUNT=$((WARNING_COUNT + 1))
    fi
done
echo ""
# ============================================
# 5. TEMPLATE OVERRIDES
# ============================================
echo -e "${MAGENTA}[5/10] Checking Template Structure...${NC}"

if [[ -d "${PLUGIN_DIR}/templates" ]]; then
    echo -e "${GREEN}  ✓ '/templates' directory found.${NC}"

    # Check for outdated template versions in comments.
    # Process substitution (not 'find | while') so WARNING_COUNT updates
    # survive; a piped while loop runs in a subshell.
    while read -r tpl; do
        if ! grep -q "@version" "${tpl}"; then
            echo -e "${YELLOW}  ⚠ WARNING: Template ${tpl} missing @version tag.${NC}"
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    done < <(find "${PLUGIN_DIR}/templates" -name "*.php")
fi
echo ""
# ============================================
# 6. CART/CHECKOUT BLOCKS
# ============================================
echo -e "${MAGENTA}[6/10] Checking Blocks Compatibility...${NC}"

has_blocks_integration=false
for file in "${php_files[@]}"; do
    if grep -qE "woocommerce_blocks|StoreApi|IntegrationInterface" "${file}"; then
        has_blocks_integration=true
        break
    fi
done

if ! $has_blocks_integration; then
    echo -e "${YELLOW}  ⚠ WARNING: No Cart/Checkout Blocks integration detected.${NC}"
    echo "    Your plugin may break in Block-based checkouts."
    WARNING_COUNT=$((WARNING_COUNT + 1))
else
    echo -e "${GREEN}  ✓ Blocks integration detected.${NC}"
fi
echo ""
# ============================================
# 7. ORDER NOTES & LOGGING
# ============================================
echo -e "${MAGENTA}[7/10] Checking Logging Standards...${NC}"

for file in "${php_files[@]}"; do
    # Check for error_log
    if grep -q "error_log" "${file}"; then
        echo -e "${YELLOW}  ⚠ WARNING: 'error_log' found in ${file}. Use 'WC_Logger' instead.${NC}"
        WARNING_COUNT=$((WARNING_COUNT + 1))
    fi

    # Check for adding order notes directly via comments.
    # Both greps must test the file; piping 'grep -q' produces no output
    # for the second grep to match.
    if grep -q "wp_insert_comment" "${file}" && grep -q "order" "${file}"; then
        echo -e "${RED}  ✗ FAILURE: inserting comments directly for orders in ${file}.${NC}"
        echo "    Use \$order->add_order_note() instead."
        CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
    fi
done
echo ""
# ============================================
# 8. AJAX ERROR DETECTION
# ============================================
echo -e "${MAGENTA}[8/10] Detecting AJAX Errors & Security Issues...${NC}"

for file in "${php_files[@]}"; do
    # Check for front-end (nopriv) AJAX actions without nonce verification
    if grep -qE "add_action.*wp_ajax_nopriv_" "${file}"; then
        if ! grep -q "check_ajax_referer\|wp_verify_nonce" "${file}"; then
            echo -e "${RED}  ✗ AJAX SECURITY VIOLATION in ${file}:${NC}"
            echo "    AJAX action found without nonce verification."
            grep -nE "add_action.*wp_ajax_" "${file}" | head -1
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    fi

    # Check for wp_send_json_error without proper status codes
    if grep -q "wp_send_json_error" "${file}"; then
        if ! grep -q "wp_send_json_error.*[45][0-9][0-9]" "${file}"; then
            echo -e "${YELLOW}  ⚠ AJAX ERROR HANDLING in ${file}:${NC}"
            echo "    wp_send_json_error() should include an HTTP status code (400, 403, 500)."
            grep -n "wp_send_json_error" "${file}" | head -1
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi

    # Check for AJAX handlers using wp_die() or die()
    if grep -qE "wp_die\(|die\(" "${file}"; then
        if grep -qE "wp_ajax_" "${file}"; then
            echo -e "${RED}  ✗ AJAX ERROR in ${file}:${NC}"
            echo "    Direct wp_die() or die() in an AJAX handler breaks proper error handling."
            grep -nE "(wp_die|die)\(" "${file}" | head -1
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    fi

    # Check for AJAX actions missing capability checks
    if grep -qE "add_action.*wp_ajax_" "${file}"; then
        if ! grep -qE "current_user_can|check_ajax_referer" "${file}"; then
            echo -e "${RED}  ✗ AJAX SECURITY in ${file}:${NC}"
            echo "    AJAX action missing capability or nonce checks."
            grep -nE "add_action.*wp_ajax_" "${file}" | head -1
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    fi

    # Check for AJAX handlers returning data without proper JSON encoding
    if grep -qE "wp_ajax_" "${file}"; then
        if grep -q "echo.*json_encode" "${file}"; then
            echo -e "${YELLOW}  ⚠ AJAX ERROR HANDLING in ${file}:${NC}"
            echo "    Manual JSON encoding in AJAX. Use wp_send_json_success/error()."
            grep -n "echo.*json_encode" "${file}" | head -1
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi

    # Check for AJAX handlers catching errors without proper logging
    if grep -qE "wp_ajax_.*try.*catch" "${file}"; then
        if ! grep -qE "wc_get_logger|wc_log_error|error_log.*AJAX" "${file}"; then
            echo -e "${YELLOW}  ⚠ AJAX ERROR HANDLING in ${file}:${NC}"
            echo "    AJAX try-catch block may not be logging errors properly."
            grep -nE "wp_ajax_.*try" "${file}" | head -1
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi
done
echo ""
# ============================================
# 9. WOOCOMMERCE API USAGE PATTERNS
# ============================================
echo -e "${MAGENTA}[9/10] Checking WooCommerce API Usage Patterns...${NC}"

for file in "${php_files[@]}"; do
    # Check for deprecated order properties
    if grep -qE '\$order->(order_type|customer_id|customer_note|post_status)' "${file}"; then
        echo -e "${RED}  ✗ DEPRECATED ORDER PROPERTY in ${file}${NC}"
        echo "    Direct property access removed. Use getter methods."
        grep -nE '\$order->(order_type|customer_id|customer_note|post_status)' "${file}" | head -1
        CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
    fi

    # Check for legacy meta handling on products
    if grep -E "(get|update|add|delete)_post_meta.*product_id" "${file}"; then
        if ! grep -q "HPOS" "${file}"; then
            echo -e "${YELLOW}  ⚠ POTENTIAL LEGACY META HANDLING in ${file}${NC}"
            echo "    Verify proper CRUD methods are used for products."
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi

    # Check for proper webhook implementation
    if grep -q "wc_get_webhook" "${file}" || grep -q "WC_Webhook" "${file}"; then
        echo -e "${GREEN}  ✓ Webhook implementation detected${NC}"
    fi

    # Check for payment gateway implementation
    if grep -q "WC_Payment_Gateway" "${file}"; then
        if ! grep -qE "process_payment|transaction_result" "${file}"; then
            echo -e "${YELLOW}  ⚠ Incomplete payment gateway in ${file}${NC}"
            echo "    Missing process_payment() method."
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi

    # Check for shipping method implementation
    if grep -q "WC_Shipping_Method" "${file}"; then
        if ! grep -qE "calculate_shipping" "${file}"; then
            echo -e "${YELLOW}  ⚠ Incomplete shipping method in ${file}${NC}"
            echo "    Missing calculate_shipping() method."
            WARNING_COUNT=$((WARNING_COUNT + 1))
        fi
    fi

    # Check for proper cart handling ('WC\(\)' escapes the parens so the
    # ERE matches the literal 'WC()' rather than an empty group)
    if grep -qE '\$woocommerce->cart|WC\(\)->cart->add_to_cart' "${file}"; then
        if grep -qE 'add_to_cart\(\s*\$_POST|add_to_cart\(\s*\$_GET' "${file}"; then
            echo -e "${RED}  ✗ SECURITY: Direct user input in add_to_cart in ${file}${NC}"
            echo "    Must validate and sanitize product data."
            CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
        fi
    fi

    # Check for proper product query usage
    if grep -q "wp_get_post_terms.*product" "${file}"; then
        echo -e "${YELLOW}  ⚠ Use wc_get_product_terms() instead of wp_get_post_terms() in ${file}${NC}"
        WARNING_COUNT=$((WARNING_COUNT + 1))
    fi

    # Check for proper price handling (should use wc_get_price_including_tax/excluding_tax)
    if grep -qE "get_post_meta.*_price" "${file}" && ! grep -q "wc_get_price" "${file}"; then
        echo -e "${YELLOW}  ⚠ Use WC price functions in ${file}${NC}"
        echo "    Prefer wc_get_price_including_tax() over direct meta access."
        WARNING_COUNT=$((WARNING_COUNT + 1))
    fi

    # Check for proper action/filter prefixes
    if grep -qE "add_action.*'woocommerce_|add_filter.*'woocommerce_" "${file}"; then
        echo -e "${GREEN}  ✓ Proper WooCommerce hook usage${NC}"
    fi

    # Check for Block-based checkout integration
    if grep -qE "woocommerce_store_api|CheckoutBlocks|StoreApi" "${file}"; then
        echo -e "${GREEN}  ✓ Block-based checkout support detected${NC}"
    fi

    # Check for product attribute handling
    if grep -qE "get_the_terms.*product" "${file}" && grep -qE "pa_" "${file}"; then
        echo -e "${GREEN}  ✓ Product attribute handling detected${NC}"
    fi

    # Check for tax handling
    if grep -qE "WC_Tax|calculate_tax|get_tax_rate" "${file}"; then
        echo -e "${GREEN}  ✓ Tax calculation handling detected${NC}"
    fi
done
echo ""
# ============================================
# 10. CLASS LOADING ERRORS & MISSING CLASSES
# ============================================
echo -e "${MAGENTA}[10/10] Detecting Class Loading Errors and Missing Classes...${NC}"

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
CLASS_ERRORS=0
CLASS_WARNINGS=0

if command -v php >/dev/null 2>&1; then
    if [[ -f "${SCRIPT_DIR}/check-duplicate-classes.php" ]]; then
        # Process substitution (not a pipe) so the counter increments
        # survive: a piped 'while' runs in a subshell and loses them.
        while IFS= read -r line; do
            if [[ "$line" == *"MISSING CLASS"* || "$line" == *"DUPLICATE CLASS"* || "$line" == *"DUPLICATE INTERFACE"* || "$line" == *"DUPLICATE TRAIT"* || "$line" == *"class_not_loaded"* ]]; then
                echo -e "${RED}  ✗ CLASS LOADING ERROR: ${line}${NC}"
                CLASS_ERRORS=$((CLASS_ERRORS + 1))
                CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
            elif [[ "$line" == *"MISSING INCLUDE"* ]]; then
                echo -e "${RED}  ✗ MISSING INCLUDE: ${line}${NC}"
                CLASS_ERRORS=$((CLASS_ERRORS + 1))
                CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
            elif [[ "$line" == *"ACTIVATION HOOK ISSUE"* ]]; then
                echo -e "${RED}  ✗ ACTIVATION HOOK CLASS ERROR: ${line}${NC}"
                CLASS_ERRORS=$((CLASS_ERRORS + 1))
                CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
            elif [[ "$line" == *"UNDEFINED FUNCTION"* ]]; then
                echo -e "${RED}  ✗ UNDEFINED FUNCTION: ${line}${NC}"
                CLASS_ERRORS=$((CLASS_ERRORS + 1))
                CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
            elif [[ "$line" == *"EARLY FUNCTION CALL"* ]]; then
                echo -e "${YELLOW}  ⚠ CLASS LOADING WARNING: ${line}${NC}"
                CLASS_WARNINGS=$((CLASS_WARNINGS + 1))
                WARNING_COUNT=$((WARNING_COUNT + 1))
            elif [[ "$line" == *"Checking for"* || "$line" == *"No"* || "$line" == *"✓"* || "$line" == *"PASS"* || "$line" == *"FAIL"* || "$line" == *"Scanned"* || "$line" == *"found"* ]]; then
                :  # progress/summary noise from the checker - ignore
            else
                echo "  ${line}"
            fi
        done < <(php "${SCRIPT_DIR}/check-duplicate-classes.php" "${PLUGIN_DIR}" 2>&1)
    else
        echo -e "${YELLOW}  ⚠ check-duplicate-classes.php not found - using static analysis${NC}"
    fi

    echo ""
    echo -e "${MAGENTA}[10.1/10] WooCommerce Class Loading Pattern Analysis...${NC}"

    for file in "${php_files[@]}"; do
        if grep -qE "(WC_[A-Z][a-zA-Z0-9_]*|woocommerce_[a-zA-Z0-9_]+)" "${file}"; then
            wc_classes=$(grep -oE "(WC_[A-Z][a-zA-Z0-9_]*|woocommerce_[a-zA-Z0-9_]+)" "${file}" | sort -u)
            for wc_class in $wc_classes; do
                if [[ "$wc_class" == "WC_"* ]] || [[ "$wc_class" == "woocommerce_"* ]]; then
                    if ! grep -qE "(class|interface|trait)\s+${wc_class}" "${file}"; then
|
||||
if ! grep -qE "(function_exists|class_exists|interface_exists).*${wc_class}" "${file}"; then
|
||||
echo -e "${YELLOW} ⚠ WooCommerce Class Reference in ${file}:${NC}"
|
||||
echo " '${wc_class}' referenced but may require WooCommerce dependency check."
|
||||
CLASS_WARNINGS=$((CLASS_WARNINGS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
if grep -qE "(class_exists|interface_exists|trait_exists)\s*\(" "${file}"; then
|
||||
if ! grep -qE "(class_exists|interface_exists|trait_exists)\s*\([^)]+\)\s*;" "${file}"; then
|
||||
echo -e "${YELLOW} ⚠ Dynamic Class Check in ${file}:${NC}"
|
||||
echo " Dynamic class loading detected - ensure WooCommerce dependency check."
|
||||
CLASS_WARNINGS=$((CLASS_WARNINGS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
if grep -qE "new\s+[A-Z][a-zA-Z0-9_]*\s*\(" "${file}"; then
|
||||
class_refs=$(grep -oE "new\s+[A-Z][a-zA-Z0-9_]*\s*\(" "${file}" | grep -oE "[A-Z][a-zA-Z0-9_]*" | sort -u)
|
||||
for class_ref in $class_refs; do
|
||||
if ! grep -qE "(class|interface|trait)\s+${class_ref}" "${file}"; then
|
||||
include_check=$(grep -nE "(require|include).*${class_ref}" "${file}" | head -1 || true)
|
||||
if [[ -z "$include_check" ]]; then
|
||||
if [[ "$class_ref" != "WC_"* ]] && [[ "$class_ref" != "WP_"* ]]; then
|
||||
echo -e "${RED} ✗ POTENTIAL MISSING CLASS in ${file}:${NC}"
|
||||
echo " Class '${class_ref}' instantiated but may not be loaded."
|
||||
CLASS_ERRORS=$((CLASS_ERRORS + 1))
|
||||
CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
if grep -qE "[A-Z][a-zA-Z0-9_]*::[a-z]" "${file}"; then
|
||||
static_refs=$(grep -oE "[A-Z][a-zA-Z0-9_]*::[a-z][a-zA-Z0-9_]*" "${file}" | cut -d':' -f1 | sort -u)
|
||||
for static_ref in $static_refs; do
|
||||
if ! grep -qE "(class|interface|trait)\s+${static_ref}" "${file}"; then
|
||||
include_check=$(grep -nE "(require|include|use).*${static_ref}" "${file}" | head -1 || true)
|
||||
if [[ -z "$include_check" ]]; then
|
||||
if [[ "$static_ref" != "WC_"* ]] && [[ "$static_ref" != "WP_"* ]]; then
|
||||
echo -e "${YELLOW} ⚠ POTENTIAL MISSING STATIC CLASS in ${file}:${NC}"
|
||||
echo " Static class '${static_ref}' referenced but may not be loaded."
|
||||
CLASS_WARNINGS=$((CLASS_WARNINGS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
if grep -qE "add_action\s*\(\s*['\"]woocommerce_" "${file}"; then
|
||||
if ! grep -qE "(class_exists|function_exists).*woocommerce" "${file}"; then
|
||||
echo -e "${YELLOW} ⚠ WooCommerce Hook Without Dependency Check in ${file}:${NC}"
|
||||
echo " WooCommerce hooks detected without dependency check."
|
||||
CLASS_WARNINGS=$((CLASS_WARNINGS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
if grep -qE "add_filter\s*\(\s*['\"]woocommerce_" "${file}"; then
|
||||
if ! grep -qE "(class_exists|function_exists).*woocommerce" "${file}"; then
|
||||
echo -e "${YELLOW} ⚠ WooCommerce Filter Without Dependency Check in ${file}:${NC}"
|
||||
echo " WooCommerce filters detected without dependency check."
|
||||
CLASS_WARNINGS=$((CLASS_WARNINGS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
done
|
||||
|
||||
if [[ $CLASS_ERRORS -eq 0 ]] && [[ $CLASS_WARNINGS -eq 0 ]]; then
|
||||
echo -e "${GREEN} ✓ No class loading errors detected${NC}"
|
||||
elif [[ $CLASS_ERRORS -gt 0 ]]; then
|
||||
echo -e "${RED} ✗ ${CLASS_ERRORS} class loading error(s) found${NC}"
|
||||
else
|
||||
echo -e "${YELLOW} ⚠ ${CLASS_WARNINGS} class loading warning(s) found${NC}"
|
||||
fi
|
||||
else
|
||||
echo -e "${YELLOW} ⚠ PHP not available - skipping class loading validation${NC}"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# ============================================
|
||||
# 11. TOO FEW ARGUMENTS ERROR DETECTION
|
||||
# ============================================
|
||||
echo -e "${MAGENTA}[11/11] Checking for Too Few Arguments Errors...${NC}"
|
||||
|
||||
ARG_ERRORS=0
|
||||
|
||||
for file in "${php_files[@]}"; do
|
||||
# Check for array_key_exists with single argument (requires 2)
|
||||
if grep -nE "array_key_exists\s*\(\s*['\"][^'\"]+['\"]\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
if ! grep -nE "array_key_exists\s*\(\s*['\"][^'\"]+['\"]\s*,\s*\\$" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "array_key_exists\s*\(\s*['\"][^'\"]+['\"]\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]] && ! grep -E "array_key_exists\s*\(\s*['\"][^'\"]+['\"]\s*,\s*[a-zA-Z_]" "${file}" >/dev/null 2>&1; then
|
||||
echo -e "${RED} ✗ TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " array_key_exists() called with only 1 argument (requires 2)"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for function calls with missing required arguments
|
||||
# str_replace requires 3 arguments
|
||||
if grep -nE "str_replace\s*\(\s*[^)]+,\s*[^)]+\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
if ! grep -E "str_replace\s*\(\s*[^)]+,\s*[^)]+,\s*[^)]+\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "str_replace\s*\(\s*[^)]+,\s*[^)]+\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${RED} ✗ TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " str_replace() called with only 2 arguments (requires 3: search, replace, subject)"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for sprintf/printf with format but no arguments
|
||||
if grep -nE "(sprintf|printf)\s*\(\s*['\"][%].*['\"]\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "(sprintf|printf)\s*\(\s*['\"][%].*['\"]\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${YELLOW} ⚠ POSSIBLE TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " sprintf/printf has format specifiers but may be missing arguments"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for array_slice with only array argument (requires at least 2)
|
||||
if grep -nE "array_slice\s*\(\s*\\\$[a-zA-Z_]+\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "array_slice\s*\(\s*\\\$[a-zA-Z_]+\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${RED} ✗ TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " array_slice() called with only 1 argument (requires at least 2: array, offset)"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for in_array with only haystack argument (requires 2)
|
||||
if grep -nE "in_array\s*\(\s*\\\$[a-zA-Z_]+\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "in_array\s*\(\s*\\\$[a-zA-Z_]+\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${RED} ✗ TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " in_array() called with only 1 argument (requires 2: needle, haystack)"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for explode with only delimiter argument (requires 2)
|
||||
if grep -nE "explode\s*\(\s*['\"][^'\"]+['\"]\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
if ! grep -E "explode\s*\(\s*['\"][^'\"]+['\"]\s*,\s*\\$" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "explode\s*\(\s*['\"][^'\"]+['\"]\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]] && ! grep -E "explode\s*\(\s*['\"][^'\"]+['\"]\s*,\s*[^)]+\)" "${file}" >/dev/null 2>&1; then
|
||||
echo -e "${RED} ✗ TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " explode() called with only 1 argument (requires 2: delimiter, string)"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
CRITICAL_COUNT=$((CRITICAL_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for implode with only array argument (requires 1, but checking for wrong order)
|
||||
if grep -nE "implode\s*\(\s*\\\$[a-zA-Z_]+\s*,\s*['\"][^'\"]+['\"]\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "implode\s*\(\s*\\\$[a-zA-Z_]+\s*,\s*['\"][^'\"]+['\"]\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${YELLOW} ⚠ POSSIBLE ARGUMENT ORDER ERROR in ${file}:${NC}"
|
||||
echo " implode() arguments may be in wrong order (expected glue, array)"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for array_merge with single argument (unusual pattern)
|
||||
if grep -nE "array_merge\s*\(\s*\\\$[a-zA-Z_]+\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "array_merge\s*\(\s*\\\$[a-zA-Z_]+\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${YELLOW} ⚠ POSSIBLE TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " array_merge() with single argument may indicate missing array parameter"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
# Check for WooCommerce-specific functions with missing arguments
|
||||
if grep -nE "wc_get_order\s*\(\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "wc_get_order\s*\(\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${YELLOW} ⚠ POSSIBLE TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " wc_get_order() called without order ID argument"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
|
||||
if grep -nE "wc_get_product\s*\(\s*\)" "${file}" >/dev/null 2>&1; then
|
||||
matches=$(grep -nE "wc_get_product\s*\(\s*\)" "${file}" | head -1)
|
||||
if [[ -n "$matches" ]]; then
|
||||
echo -e "${YELLOW} ⚠ POSSIBLE TOO FEW ARGUMENTS in ${file}:${NC}"
|
||||
echo " wc_get_product() called without product ID argument"
|
||||
echo "$matches"
|
||||
ARG_ERRORS=$((ARG_ERRORS + 1))
|
||||
WARNING_COUNT=$((WARNING_COUNT + 1))
|
||||
fi
|
||||
fi
|
||||
done
|
||||
|
||||
if [[ $ARG_ERRORS -eq 0 ]]; then
|
||||
echo -e "${GREEN} ✓ No too few arguments errors detected${NC}"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# ============================================
|
||||
# 12. FINAL SUMMARY
|
||||
# ============================================
|
||||
|
||||
echo -e "${BLUE}========================================================${NC}"
|
||||
echo -e "${BLUE} VALIDATION SUMMARY ${NC}"
|
||||
echo -e "${BLUE}========================================================${NC}"
|
||||
|
||||
if [[ "${CRITICAL_COUNT}" -gt 0 ]]; then
|
||||
echo -e "${RED}FAILED: ${CRITICAL_COUNT} critical issues found.${NC}"
|
||||
echo "This plugin is NOT safely compatible with modern WooCommerce."
|
||||
echo "Fix all Critical failures before usage."
|
||||
exit 1
|
||||
elif [[ "${WARNING_COUNT}" -gt 0 ]]; then
|
||||
echo -e "${YELLOW}PASSED WITH WARNINGS: 0 critical, ${WARNING_COUNT} warnings.${NC}"
|
||||
echo "Review warnings for future-proofing."
|
||||
exit 0
|
||||
else
|
||||
echo -e "${GREEN}PASSED: 0 critical, 0 warnings.${NC}"
|
||||
echo "Plugin appears fully compatible with strict standards."
|
||||
exit 0
|
||||
fi
|
||||
1021	scripts/validate-wordpress-plugin.sh	Executable file
File diff suppressed because it is too large
275	scripts/verify-builder-message-flow.js	Executable file
@@ -0,0 +1,275 @@
#!/usr/bin/env node

/**
 * Verification script for builder message sending to OpenCode
 * This script validates that all components for message sending are correct
 */

const fs = require('fs');
const path = require('path');

console.log('╔══════════════════════════════════════════════════════════════╗');
console.log('║        Builder → OpenCode Message Flow Verification          ║');
console.log('╚══════════════════════════════════════════════════════════════╝\n');

const rootDir = path.join(__dirname, '..');
let allChecks = true;
let checkCount = 0;
let passCount = 0;

function check(condition, description) {
  checkCount++;
  if (condition) {
    passCount++;
    console.log(`✓ ${description}`);
    return true;
  } else {
    allChecks = false;
    console.log(`✗ ${description}`);
    return false;
  }
}

// Load files
console.log('Loading files...\n');
const builderJsPath = path.join(rootDir, 'chat/public/builder.js');
const serverJsPath = path.join(rootDir, 'chat/server.js');
const builderHtmlPath = path.join(rootDir, 'chat/public/builder.html');

let builderJs, serverJs, builderHtml;

try {
  builderJs = fs.readFileSync(builderJsPath, 'utf8');
  check(true, 'Loaded builder.js');
} catch (e) {
  check(false, `Failed to load builder.js: ${e.message}`);
  process.exit(1);
}

try {
  serverJs = fs.readFileSync(serverJsPath, 'utf8');
  check(true, 'Loaded server.js');
} catch (e) {
  check(false, `Failed to load server.js: ${e.message}`);
  process.exit(1);
}

try {
  builderHtml = fs.readFileSync(builderHtmlPath, 'utf8');
  check(true, 'Loaded builder.html');
} catch (e) {
  check(false, `Failed to load builder.html: ${e.message}`);
  process.exit(1);
}

console.log('\n--- File Syntax Validation ---\n');

// Check syntax of JS files
try {
  new (require('vm').Script)(builderJs, { filename: 'builder.js' });
  check(true, 'builder.js syntax is valid');
} catch (e) {
  check(false, `builder.js has syntax error: ${e.message}`);
}

try {
  new (require('vm').Script)(serverJs, { filename: 'server.js' });
  check(true, 'server.js syntax is valid');
} catch (e) {
  check(false, `server.js has syntax error: ${e.message}`);
}

console.log('\n--- Builder.js Message Sending ---\n');

// Check executeBuild function
check(
  builderJs.includes('async function executeBuild(planContent)'),
  'executeBuild function exists'
);

check(
  builderJs.includes('await api(`/api/sessions/${state.currentSessionId}/messages`'),
  'executeBuild sends to correct API endpoint'
);

check(
  builderJs.includes("cli: 'opencode'"),
  'executeBuild sets cli to "opencode"'
);

check(
  builderJs.includes('isProceedWithBuild: true'),
  'executeBuild sets isProceedWithBuild flag'
);

check(
  builderJs.includes('planContent: planContent'),
  'executeBuild includes planContent in payload'
);

check(
  builderJs.includes('streamMessage(state.currentSessionId, response.message.id)'),
  'executeBuild starts streaming after message creation'
);

// Check redoProceedWithBuild function
check(
  builderJs.includes('async function redoProceedWithBuild(planContent, model)'),
  'redoProceedWithBuild function exists'
);

// Check sendMessage function
check(
  builderJs.includes('async function sendMessage()'),
  'sendMessage function exists'
);

check(
  builderJs.includes('await api(`/api/sessions/${state.currentSessionId}/messages`'),
  'sendMessage sends to correct API endpoint'
);

console.log('\n--- Server.js Message Handling ---\n');

// Check route matching
check(
  serverJs.includes('const messageMatch = pathname.match') &&
  serverJs.includes('api') && serverJs.includes('sessions') && serverJs.includes('messages'),
  'Server has message route matcher'
);

check(
  serverJs.includes('return handleNewMessage(req, res, messageMatch[1], userId)'),
  'Server routes to handleNewMessage'
);

// Check handleNewMessage function
check(
  serverJs.includes('async function handleNewMessage(req, res, sessionId, userId)'),
  'handleNewMessage function exists'
);

check(
  serverJs.includes('const content = sanitizeMessage(body.content'),
  'handleNewMessage extracts and sanitizes content'
);

check(
  serverJs.includes('const cli = normalizeCli(body.cli || session.cli)'),
  'handleNewMessage extracts CLI'
);

check(
  serverJs.includes('session.messages.push(message)'),
  'handleNewMessage adds message to session'
);

check(
  serverJs.includes('queueMessage(sessionId, message)'),
  'handleNewMessage queues message'
);

console.log('\n--- Message Processing ---\n');

// Check processMessage function
check(
  serverJs.includes('async function processMessage(sessionId, message)'),
  'processMessage function exists'
);

check(
  serverJs.includes('const opencodeResult = await sendToOpencodeWithFallback'),
  'processMessage calls sendToOpencodeWithFallback'
);

// Check sendToOpencodeWithFallback function
check(
  serverJs.includes('async function sendToOpencodeWithFallback'),
  'sendToOpencodeWithFallback function exists'
);

check(
  serverJs.includes('const result = await sendToOpencode'),
  'sendToOpencodeWithFallback calls sendToOpencode'
);

console.log('\n--- OpenCode Integration ---\n');

// Check sendToOpencode function
check(
  serverJs.includes('async function sendToOpencode({ session, model, content, message, cli, streamCallback, opencodeSessionId })'),
  'sendToOpencode function exists'
);

check(
  serverJs.includes('const clean = sanitizeMessage(content)'),
  'sendToOpencode sanitizes content'
);

check(
  serverJs.includes("const args = ['run', '--model', resolvedModel]"),
  'sendToOpencode prepares CLI arguments'
);

check(
  serverJs.includes('args.push(clean)'),
  'sendToOpencode adds content as argument'
);

check(
  serverJs.includes('const { stdout, stderr } = await runCommand(cliCommand, args'),
  'sendToOpencode executes OpenCode CLI'
);

console.log('\n--- Streaming Support ---\n');

// Check streaming functionality
check(
  builderJs.includes('function streamMessage(sessionId, messageId)'),
  'streamMessage function exists in builder'
);

check(
  builderJs.includes('const url = `/api/sessions/${sessionId}/messages/${messageId}/stream`'),
  'streamMessage connects to correct endpoint'
);

check(
  builderJs.includes('const eventSource = new EventSource(url)'),
  'streamMessage uses EventSource for SSE'
);

check(
  serverJs.includes('async function handleMessageStream(req, res, sessionId, messageId, userId)'),
  'handleMessageStream function exists in server'
);

check(
  serverJs.includes("'Content-Type': 'text/event-stream'"),
  'Server sets correct content type for SSE'
);

console.log('\n--- Summary ---\n');

const percentage = ((passCount / checkCount) * 100).toFixed(1);
console.log(`Checks passed: ${passCount}/${checkCount} (${percentage}%)`);

if (allChecks) {
  console.log('\n✅ ALL CHECKS PASSED');
  console.log('\nThe builder message sending flow is correctly implemented.');
  console.log('All files are valid and components are properly connected.');
  console.log('Messages should successfully flow: Builder → Server → OpenCode\n');

  console.log('If messages are not being sent, check:');
  console.log('  1. OpenCode CLI is installed and accessible');
  console.log('  2. Server is running and accessible');
  console.log('  3. User has a valid session');
  console.log('  4. Model is properly configured');
  console.log('  5. Browser console for runtime errors');
  console.log('  6. Server logs for processing errors');
  process.exit(0);
} else {
  console.log('\n❌ SOME CHECKS FAILED');
  console.log('\nReview the failed checks above to identify issues.\n');
  process.exit(1);
}