# Quick Test Guide
## 🚀 Quick Deploy & Test
### 1. Rebuild Container (Required!)
```bash
# Stop and remove old container
docker-compose down
# Rebuild with no cache to pick up changes
docker-compose build --no-cache
# Start container
docker-compose up -d
```
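Optionally, confirm the container actually came up before moving on. This is a small sketch that assumes the container name `shopify-ai-builder` used throughout this guide:
```bash
# The container should show an "Up" status; adjust the name filter if your setup differs
docker ps --filter "name=shopify-ai-builder" --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"
```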
### 2. Verify Container Logs Work ✅
```bash
# Tail logs - you should now see output!
docker logs -f shopify-ai-builder
# Look for lines like:
# [2024-01-11T...] Server started on http://0.0.0.0:4000
# [CONFIG] Mistral: { configured: true, ... }
# [CONFIG] Planning Settings: { ... }
```
**Before the fix:** no logs were visible; everything went to `/var/log/chat-service.log` inside the container.
**After the fix:** all output appears in `docker logs` ✅
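If you want to double-check that output is no longer being redirected to the old file path, you can look inside the container (a sketch; the old file may simply not exist anymore after the rebuild):
```bash
# The old file-based log should be absent or no longer growing
docker exec shopify-ai-builder sh -c 'ls -l /var/log/chat-service.log 2>/dev/null || echo "no file-based log (expected after the fix)"'
```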
### 3. Test Groq Planning ✅
**Setup:**
```env
# Add to .env or docker-compose.yml
GROQ_API_KEY=your_groq_api_key_here
```
**Test Steps:**
1. Restart after adding env var: `docker-compose restart`
2. Go to: `http://localhost:4000/admin/login`
3. Navigate to "Plan models"
4. Click "Add planning model"
5. Select Provider: `groq`, Model: (leave empty for default)
6. Save
7. Go to builder: `http://localhost:4000/apps`
8. Create a new project and enter a plan request
9. Check logs: `docker logs shopify-ai-builder | grep "\[GROQ\]"`
**What to expect:**
- Planning request returns a response
- Logs show: `[GROQ] Starting API request`
- Logs show: `[GROQ] Response received { status: 200, ok: true }`
- Logs show: `[GROQ] Extracted reply: { replyLength: ... }`
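If planning fails, it helps to verify the Groq key outside the app first. A minimal sketch using the default endpoint from the environment-variables section below and the model name that appears in the log examples (adjust the model if your account uses a different default):
```bash
# Direct API check - an HTTP 200 with a "choices" array means the key works
curl -s https://api.groq.com/openai/v1/chat/completions \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.3-70b-versatile",
    "messages": [{"role": "user", "content": "Reply with OK"}]
  }'
```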
### 4. Test Mistral Planning ✅
**Setup:**
```env
# Add to .env or docker-compose.yml
MISTRAL_API_KEY=your_mistral_api_key_here
```
**Test Steps:**
1. Restart after adding env var: `docker-compose restart`
2. Go to admin panel → Plan models
2. Add Mistral as a planning provider
3. Go to the builder and create a plan request
5. Check logs: `docker logs shopify-ai-builder | grep "\[MISTRAL\]"`
**What to expect:**
- Planning request returns a response
- Logs show Mistral API request/response cycle
- Model information tracked properly
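As with Groq, you can verify the Mistral key directly against the default endpoint listed in the environment-variables section below (a minimal sketch; swap the model if you configured a different one):
```bash
# Direct API check - expect HTTP 200 and a "choices" array in the response
curl -s https://api.mistral.ai/v1/chat/completions \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral-large-latest",
    "messages": [{"role": "user", "content": "Reply with OK"}]
  }'
```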
### 5. Verify Provider Fallback ✅
**Test automatic fallback:**
1. Configure multiple providers in the planning chain (e.g., Groq → Mistral → OpenRouter)
2. Make a plan request
3. If the first provider fails, the request should automatically fall through to the next one
4. Check the logs to see the fallback chain (a combined grep is sketched below): `docker logs shopify-ai-builder | grep "\[PLAN\]"`
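To watch the whole chain in one place, you can combine the log prefixes in a single grep. This is a sketch using the `[PLAN]`, `[GROQ]`, and `[MISTRAL]` prefixes shown in this guide; other providers presumably log with similar bracketed prefixes. You can also force a fallback by temporarily setting an invalid `GROQ_API_KEY` and confirming the request still succeeds via the next provider.
```bash
# Follow planning activity across providers in real time
docker logs -f shopify-ai-builder 2>&1 | grep -E "\[(PLAN|GROQ|MISTRAL)\]"
```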
## 🎯 Success Criteria
- ✅ Container logs are visible and updating in real-time
- ✅ Groq planning returns responses
- ✅ Mistral planning returns responses
- ✅ Provider fallback works automatically
- ✅ Admin panel shows all providers (openrouter, mistral, google, groq, nvidia)
## 🐛 Troubleshooting
### Problem: Still no logs visible
**Solution:** Make sure you rebuilt with `--no-cache`
```bash
docker-compose down
docker-compose build --no-cache
docker-compose up -d
```
### Problem: "API key not configured"
**Solution:** Add environment variables and restart
```bash
# Add to .env file
GROQ_API_KEY=...
MISTRAL_API_KEY=...
# Restart container
docker-compose restart
```
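To confirm the variables actually made it into the running container, check from inside it. Note that if the key is defined in `docker-compose.yml` or `.env`, `docker-compose up -d` (rather than `restart`) may be needed for the container to pick up the new value.
```bash
# The keys should be listed here (don't paste the values anywhere public)
docker exec shopify-ai-builder env | grep -E "GROQ|MISTRAL"
```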
### Problem: Groq returns error
**Solution:** Check logs for specific error message
```bash
docker logs shopify-ai-builder 2>&1 | grep -A 5 "\[GROQ\].*error"
```
### Problem: Planning returns no response
**Solution:**
1. Check logs: `docker logs -f shopify-ai-builder`
2. Look for error messages with provider prefixes (a combined grep is sketched below)
3. Verify API key is correct
4. Try a different provider as fallback
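A quick way to surface provider errors without scrolling the full log (a sketch; the prefixes match the log examples below):
```bash
# Show recent provider-tagged errors with a little trailing context
docker logs --tail 500 shopify-ai-builder 2>&1 | grep -iE -A 3 "\[(PLAN|GROQ|MISTRAL)\].*(error|fail)"
```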
## 📊 Log Examples
**Good Groq Response:**
```
[GROQ] Starting API request { url: '...', model: 'llama-3.3-70b-versatile', messageCount: 3 }
[GROQ] Response received { status: 200, ok: true }
[GROQ] Response data: { hasChoices: true, choicesLength: 1, model: '...' }
[GROQ] Extracted reply: { replyLength: 523, replyPreview: 'Here is a WordPress plugin...' }
[PLAN] Provider succeeded { provider: 'groq', model: '...', replyLength: 523 }
```
**Good Mistral Response:**
```
[MISTRAL] Starting API request { url: '...', model: 'mistral-large-latest', messageCount: 3 }
[MISTRAL] Response received { status: 200, ok: true }
[MISTRAL] Successfully extracted reply { replyLength: 1024, replyPreview: "I'll help you create..." }
[PLAN] Provider succeeded { provider: 'mistral', model: 'mistral-large-latest', replyLength: 1024 }
```
**Fallback Chain Working:**
```
[PLAN] Trying provider { provider: 'groq', model: '' }
[GROQ] Request failed { status: 429, detail: 'Rate limit exceeded' }
[PLAN] Trying provider { provider: 'mistral', model: '' }
[MISTRAL] Successfully extracted reply { replyLength: 856 }
[PLAN] Provider succeeded { provider: 'mistral', model: 'mistral-large-latest' }
```
## 📝 Environment Variables
```env
# Required for Groq
GROQ_API_KEY=your_groq_key_here
# Required for Mistral
MISTRAL_API_KEY=your_mistral_key_here
# Optional - use defaults if not set
GROQ_API_URL=https://api.groq.com/openai/v1/chat/completions
MISTRAL_API_URL=https://api.mistral.ai/v1/chat/completions
```
## 🔗 Quick Links
- Admin Panel: `http://localhost:4000/admin/login`
- Plan Models Config: `http://localhost:4000/admin/plan`
- Builder: `http://localhost:4000/apps`
- Full Documentation: See `FIXES_SUMMARY.md`