
fastapi-ollama-orchestrator

FastAPI agent orchestration server with OpenAI-compatible chat endpoints that runs a short multi-turn debate on Ollama. This example uses gemma3:4b; change the model in docker-compose.yml.

# Automatic build
./build_containers.sh

# Follow logs
sudo docker compose logs -f agent-orchestrator

# Optional: open dashboard in browser
# http://localhost:11435/dashboard
# Dashboard "Run start_1" shows live orchestrator logs in the HTML console.

# Run the experiment directly inside the container (no HTTP)
sudo docker compose exec agent-orchestrator python main.py --run-trigger start_1
# CLI trigger logs appear only in that terminal session (not in the dashboard stream).

# Optional: open a shell in the container
sudo docker compose exec agent-orchestrator bash
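Once the stack is running, the OpenAI-compatible chat endpoint can be exercised with curl. This is a sketch: the `/v1/chat/completions` path and port 11435 are assumptions based on the OpenAI API convention and the dashboard URL above; adjust them to the server's actual routes.

```shell
# Hypothetical smoke test for the OpenAI-compatible chat endpoint.
# Path and port are assumed, not confirmed by the repo docs.
PAYLOAD='{"model":"gemma3:4b","messages":[{"role":"user","content":"Say hi in one sentence."}]}'
curl -s http://localhost:11435/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || echo "server not reachable (start the stack first)"
```

The response, if the endpoint follows the OpenAI schema, is a JSON object with a `choices` array containing the assistant message.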

