Remove all Ollama remnants and complete migration to llama.cpp

- Remove Ollama-specific files (Dockerfile.ollama, entrypoint.sh)
- Replace all query_ollama imports and calls with query_llama
- Remove langchain-ollama dependency from requirements.txt
- Update all utility files (autonomous, kindness, image_generation, etc.)
- Update README.md documentation references
- Maintain backward compatibility alias in llm.py
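The backward-compatibility alias mentioned above could look like the following minimal sketch (the `query_llama` body here is a stand-in stub, not the real llama.cpp call from this repo):

```python
import asyncio


async def query_llama(prompt: str) -> str:
    """Stub standing in for the real llama.cpp-backed query function."""
    return f"echo: {prompt}"


# Backward-compatibility alias: modules still importing query_ollama
# keep working without any code changes on their side.
query_ollama = query_llama


if __name__ == "__main__":
    # Old call sites resolve to the same coroutine function.
    print(asyncio.run(query_ollama("hello")))
```

Because the alias is a plain module-level assignment, `from utils.llm import query_ollama` continues to resolve, which is why stragglers like the import fixed in the diff below would not have crashed immediately.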
2025-12-07 17:50:08 +02:00
parent a6da4c0c2e
commit d58be3b33e
15 changed files with 39 additions and 286 deletions

@@ -1,8 +1,8 @@
-from utils.llm import query_ollama
+from utils.llm import query_llama
 
 async def analyze_sentiment(messages: list) -> tuple[str, float]:
     """
-    Analyze the sentiment of a conversation using Ollama
+    Analyze the sentiment of a conversation using llama.cpp
     Returns a tuple of (sentiment description, positivity score from 0-1)
     """
     # Combine the last few messages for context (up to 5)
@@ -29,7 +29,7 @@ Score: 0.85
 Response:"""
     try:
-        response = await query_ollama(prompt)
+        response = await query_llama(prompt)
         if not response or 'Score:' not in response:
             return "Could not analyze sentiment", 0.5
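The score-extraction step that follows the fallback shown in this hunk is not included in the diff; a hypothetical helper consistent with the `'Score:'` check and the neutral `0.5` fallback might look like this:

```python
def parse_sentiment(response: str) -> tuple[str, float]:
    """Split a model reply of the form '<description>\nScore: <0-1>'.

    Hypothetical illustration: mirrors the fallback visible in the diff,
    returning a neutral 0.5 when no 'Score:' line is present.
    """
    if not response or 'Score:' not in response:
        return "Could not analyze sentiment", 0.5
    description, _, score_part = response.partition('Score:')
    try:
        # Clamp to the documented 0-1 range in case the model drifts.
        score = max(0.0, min(1.0, float(score_part.strip().split()[0])))
    except (ValueError, IndexError):
        score = 0.5
    return description.strip(), score
```

For example, `parse_sentiment("Warm, supportive tone\nScore: 0.85")` yields the description and `0.85`, while an empty reply falls back to the neutral pair.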