koko210Serve
d58be3b33e
Remove all Ollama remnants and complete migration to llama.cpp
- Remove Ollama-specific files (Dockerfile.ollama, entrypoint.sh)
- Replace all query_ollama imports and calls with query_llama
- Remove langchain-ollama dependency from requirements.txt
- Update all utility files (autonomous, kindness, image_generation, etc.)
- Update README.md documentation references
- Maintain a backward-compatibility alias in llm.py (see the sketch after this entry)
2025-12-07 17:50:28 +02:00
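A minimal sketch of what the backward-compatibility alias in llm.py could look like. The function body, the llama.cpp server URL, and the parameter names here are assumptions for illustration, not taken from the actual repository; only the idea of aliasing query_ollama to query_llama comes from the commit message.

```python
import requests

# Assumed llama.cpp HTTP server endpoint; the real project may configure this differently.
LLAMA_SERVER_URL = "http://localhost:8080/completion"

def query_llama(prompt: str, n_predict: int = 256) -> str:
    """Send a prompt to a running llama.cpp server and return the completion text."""
    response = requests.post(
        LLAMA_SERVER_URL,
        json={"prompt": prompt, "n_predict": n_predict},
        timeout=120,
    )
    response.raise_for_status()
    return response.json().get("content", "")

# Backward-compatibility alias so existing `from llm import query_ollama` imports keep working.
query_ollama = query_llama
```

Keeping the old name as a plain alias lets callers migrate gradually while all requests already go through llama.cpp.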