## 📋 Description
StillMe currently supports only DeepSeek (cloud) and OpenAI (cloud). The goal is to add support for local Ollama AI (llama3.1:8b) in order to:
- Reduce cost by using a local model
- Get faster responses for simple queries
- Offer a privacy-first option (no data sent to the cloud)
## 🎯 Current Status
- ✅ Code already exists in the legacy folder
- ✅ Ollama is installed and the llama3.1:8b model has been pulled
- ❌ Not yet integrated into the backend API
- ❌ Smart Router does not yet support local vs cloud routing
## 🛠️ Technical Requirements
- Integrate the Ollama client into `backend/api/main.py`
- Implement the Smart Router logic:
  - Simple queries → Ollama local (llama3.1:8b)
  - Complex queries → DeepSeek cloud
  - Fallback: Ollama → DeepSeek if Ollama fails
- Add an Ollama health check endpoint
- Update `generate_ai_response()` to support Ollama
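The routing and fallback requirements above could be sketched roughly as follows. All names here (`route_query`, `generate_with_fallback`, the keyword set, and the word-count threshold) are illustrative assumptions, not existing StillMe API:

```python
import re

# Assumed heuristics for "complex" queries — tune against real traffic.
COMPLEX_KEYWORDS = {"analyze", "compare", "summarize", "explain", "refactor"}
SIMPLE_MAX_WORDS = 30  # assumed cutoff for a "simple" query

def route_query(query: str) -> str:
    """Pick a backend based on rough query complexity (length + keywords)."""
    words = re.findall(r"\w+", query.lower())
    if len(words) <= SIMPLE_MAX_WORDS and not COMPLEX_KEYWORDS & set(words):
        return "ollama"    # simple → local llama3.1:8b
    return "deepseek"      # complex → cloud

def generate_with_fallback(query: str, ollama_fn, deepseek_fn) -> str:
    """Try the routed backend; fall back from Ollama to DeepSeek on failure."""
    if route_query(query) == "ollama":
        try:
            return ollama_fn(query)
        except Exception:
            pass  # Ollama down or errored → fall through to the cloud backend
    return deepseek_fn(query)
```

Passing the two client functions in as callables keeps the router testable without a running Ollama server.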
## 📁 Files to Modify
- `backend/api/main.py` - Add Ollama integration
- `backend/services/` - Create Ollama client service (optional)
- `dashboard.py` - Update model selection
- `.env.example` - Add `OLLAMA_BASE_URL` config
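The `.env.example` addition might look like this. `OLLAMA_BASE_URL` is named in this issue; `OLLAMA_MODEL` is an assumed extra variable for convenience:

```
# Ollama (local) configuration
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.1:8b
```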
## 🧪 Testing Requirements
- Test the Ollama connection
- Test the Smart Router logic
- Test the fallback mechanism
- Test with simple vs complex queries
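A minimal test sketch for the fallback requirement, using stub client functions so no server is needed. `fake_generate` is a stand-in for the real `generate_ai_response()`, which does not yet exist in this form:

```python
# Stand-in for generate_ai_response(): prefer local, fall back to cloud.
def fake_generate(query, ollama_fn, deepseek_fn):
    try:
        return ollama_fn(query)       # prefer the local model
    except ConnectionError:
        return deepseek_fn(query)     # Ollama unreachable → cloud fallback

def test_fallback_when_ollama_down():
    def ollama_down(q):
        raise ConnectionError("Ollama unreachable")
    assert fake_generate("hello", ollama_down, lambda q: "cloud-answer") == "cloud-answer"

def test_local_preferred_when_up():
    assert fake_generate("hello", lambda q: "local-answer", lambda q: "cloud-answer") == "local-answer"
```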
## 💡 Implementation Ideas
- Use `requests` or `httpx` to call the Ollama API
- Ollama endpoint: `http://localhost:11434/api/generate`
- Model: `llama3.1:8b`
- Smart routing based on:
  - Query complexity (length, keywords)
  - Estimated response time
  - Ollama availability
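A sketch of the Ollama call itself, assuming the non-streaming `/api/generate` endpoint (which returns the generated text under the `"response"` key). Function names here are illustrative:

```python
OLLAMA_BASE_URL = "http://localhost:11434"  # would come from .env in the real app

def build_generate_payload(prompt: str, model: str = "llama3.1:8b") -> dict:
    """Build the JSON body for Ollama's /api/generate (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ollama_generate(prompt: str, timeout: float = 30.0) -> str:
    """POST to the local Ollama server; raises on HTTP/connection errors."""
    import requests  # httpx.post(...) would work the same way
    resp = requests.post(
        f"{OLLAMA_BASE_URL}/api/generate",
        json=build_generate_payload(prompt),
        timeout=timeout,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```

Letting `ollama_generate` raise (rather than swallowing errors) is what allows the Smart Router's fallback path to catch the failure and retry against DeepSeek.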
## ✅ Acceptance Criteria
## 🤝 Willing to Contribute?
## 📚 References
- `legacy/stillme_core/adapters/ollama_client.py` (reference)