LLM Platform External Services

Overview

This document lists the external services used across the LLM platform ecosystem, along with the ports and URLs each one runs on.

Core AI Services

Ollama AI Server

  • URL: http://localhost:11434
  • API Endpoint: http://localhost:11434/api
  • Health Check: http://localhost:11434/api/tags
  • Models Endpoint: http://localhost:11434/api/tags
  • Generate Endpoint: http://localhost:11434/api/generate
  • Embeddings Endpoint: http://localhost:11434/api/embeddings
  • Description: Local AI model server for text generation and embeddings
  • Used By: Core LLM module, AI providers, Government compliance
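
For example, the generate endpoint above can be exercised directly with curl; the model name here is a placeholder for whatever the tags endpoint reports as installed:

# List installed models, then request a non-streaming completion (llama3 is a placeholder)
curl http://localhost:11434/api/tags
curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Hello", "stream": false}'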

Apple Foundation Models

  • URL: http://localhost:11435
  • WebSocket: ws://localhost:8080/apple-fm
  • HTTP Bridge: http://localhost:3000
  • Description: Apple's Foundation Models integration
  • Used By: Apple provider module

LangChain Server

  • URL: http://localhost:1234/v1
  • API Endpoint: http://localhost:8000
  • Description: LangChain-compatible API server
  • Used By: LangChain provider module

Database Services

PostgreSQL

  • Host: localhost
  • Port: 5432
  • Protocol: TCP
  • Description: Primary database for LLM platform
  • Used By: Recipe onboarding, service discovery
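
A quick connectivity check with psql; the user and database names below are placeholders, not the platform's actual credentials:

# Verify the server accepts connections (user/database names are illustrative)
psql -h localhost -p 5432 -U llm_user -d llm_platform -c 'SELECT version();'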

MySQL/MariaDB

  • Host: localhost
  • Port: 3306
  • Protocol: TCP
  • Description: Alternative database option
  • Used By: Recipe onboarding, service discovery

Cache & Session Services

Redis Cache

  • Host: localhost
  • Port: 6379
  • Protocol: TCP
  • Description: High-performance caching layer
  • Used By: LLM module, API normalizer, alternative services
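
A minimal cache round trip with redis-cli; the key name is illustrative:

# Write a value with a 300-second TTL, then read it back
redis-cli -h localhost -p 6379 SET demo:key "hello" EX 300
redis-cli -h localhost -p 6379 GET demo:key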

Memcached

  • Host: localhost
  • Port: 11211
  • Protocol: TCP
  • Description: Alternative caching service
  • Used By: Alternative services module

Vector Database Services

Qdrant Vector Database

  • Host: localhost
  • Port: 6333
  • HTTP API: http://localhost:6333
  • Description: Vector database for embeddings and similarity search
  • Used By: LLM module, vector search services
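
The HTTP API above can be probed directly, for example by listing collections (the result may be empty if nothing has been indexed yet):

# List existing collections via the Qdrant HTTP API
curl http://localhost:6333/collections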

Milvus Vector Database

  • Host: localhost
  • Port: 19530
  • HTTP API: http://localhost:9091
  • Description: Alternative vector database
  • Used By: LLM module, vector search services
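
Assuming the HTTP port above is exposed, Milvus answers a basic health probe there:

# Liveness check on the Milvus HTTP/metrics port
curl http://localhost:9091/healthz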

Message Queue Services

RabbitMQ

  • Host: localhost
  • Port: 5672
  • Management UI: http://localhost:15672
  • Description: Message queue for background processing
  • Used By: LLM module, agent orchestration
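
The management UI also exposes an HTTP API; this example assumes the default guest/guest credentials, which may differ in your deployment:

# Query broker overview via the management API (default guest/guest credentials assumed)
curl -u guest:guest http://localhost:15672/api/overview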

Redis Queue

  • Host: localhost
  • Port: 6379
  • Description: Redis-based queue processing
  • Used By: LLM module, background tasks

Monitoring & Observability

Prometheus

  • Host: localhost
  • Port: 9090
  • Description: Metrics collection and monitoring
  • Used By: Platform monitoring, performance tracking
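
Prometheus exposes a readiness endpoint and a query API on this port; the query below simply reports which scrape targets are up:

# Readiness probe and a basic instant query
curl http://localhost:9090/-/healthy
curl 'http://localhost:9090/api/v1/query?query=up'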

Grafana

  • Host: localhost
  • Port: 3000
  • Description: Metrics visualization and dashboards
  • Used By: Platform monitoring, performance tracking
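
Grafana's health endpoint offers a quick availability check (note that port 3000 is also listed for the Apple Foundation Models HTTP bridge above, so only one of the two can bind to it at a time):

# Check Grafana availability and database status
curl http://localhost:3000/api/health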

Jaeger

  • Host: localhost
  • Port: 16686
  • Description: Distributed tracing
  • Used By: Request tracing, performance analysis

Development Services

Node.js Development Server

  • Host: localhost
  • Port: 3001
  • Description: Development server for frontend assets
  • Used By: Theme development, asset compilation

Webpack Dev Server

  • Host: localhost
  • Port: 8080
  • Description: Hot module replacement for development
  • Used By: Theme development, JavaScript development

Storybook

  • Host: localhost
  • Port: 6006
  • Description: Component development environment
  • Used By: UI component development

Security Services

Vault

  • Host: localhost
  • Port: 8200
  • Description: Secrets management
  • Used By: Government compliance, API key storage
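
Vault's unauthenticated health endpoint confirms the server is reachable and reports its seal status:

# Returns initialization and seal status without requiring a token
curl http://localhost:8200/v1/sys/health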

OpenLDAP

  • Host: localhost
  • Port: 389
  • Description: Directory services
  • Used By: Government compliance, authentication

Docker & Container Services

Docker Registry

  • Host: localhost
  • Port: 5000
  • Description: Private container registry
  • Used By: Container deployment, CI/CD
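
The registry's HTTP API can be queried to confirm it is serving; the catalog may be empty on a fresh install:

# List repositories via the Docker Registry v2 API
curl http://localhost:5000/v2/_catalog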

Portainer

  • Host: localhost
  • Port: 9000
  • Description: Docker management UI
  • Used By: Container management, monitoring

Testing Services

Selenium Hub

  • Host: localhost
  • Port: 4444
  • Description: Browser automation for testing
  • Used By: Playwright tests, e2e testing
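
On Selenium Grid 4 the hub reports readiness over HTTP (older grids use /wd/hub/status instead):

# Check hub readiness and registered nodes
curl http://localhost:4444/status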

Mailhog

  • Host: localhost
  • Port: 8025
  • Description: Email testing service
  • Used By: Email testing, development
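
Mailhog's web port also serves a JSON API, which is handy for asserting on captured mail in tests:

# List captured messages via the Mailhog API
curl http://localhost:8025/api/v2/messages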

Service Discovery

Consul

  • Host: localhost
  • Port: 8500
  • Description: Service discovery and configuration
  • Used By: Alternative services module
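
A quick way to confirm the agent is up is to ask for the current cluster leader:

# Returns the leader address if Consul is healthy
curl http://localhost:8500/v1/status/leader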

Eureka

  • Host: localhost
  • Port: 8761
  • Description: Service registry
  • Used By: Alternative services module

API Gateway Services

Kong

  • Host: localhost
  • Port: 8001
  • API Gateway: http://localhost:8000
  • Description: API gateway and management
  • Used By: API normalization, traffic management
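
The port above is Kong's admin API; a basic status call confirms the node is running (proxy traffic goes through the gateway URL instead):

# Check node status via the Kong admin API
curl http://localhost:8001/status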

Zuul

  • Host: localhost
  • Port: 8080
  • Description: API gateway and load balancer
  • Used By: API routing, load balancing

Development Tools

phpMyAdmin

  • Host: localhost
  • Port: 8081
  • Description: MySQL database management
  • Used By: Database administration

Adminer

  • Host: localhost
  • Port: 8082
  • Description: Database management tool
  • Used By: Database administration

Configuration

Environment Variables

Most services can be configured via environment variables:

# AI Services
OLLAMA_HOST=localhost:11434
LANGCHAIN_API_URL=http://localhost:8000

# Database
DATABASE_HOST=localhost
DATABASE_PORT=5432
REDIS_HOST=localhost
REDIS_PORT=6379

# Vector Database
QDRANT_HOST=localhost
QDRANT_PORT=6333

# Monitoring
PROMETHEUS_HOST=localhost
PROMETHEUS_PORT=9090

Docker Compose

Services can be managed via Docker Compose:

# Start all services
docker-compose up -d

# Start specific services
docker-compose up -d ollama redis qdrant

# Stop services
docker-compose down
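
To see what is actually running and follow a single service's output (service names are assumed to match the examples above):

# List running services and tail the logs of one of them
docker-compose ps
docker-compose logs -f ollama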

Port Conflicts

If ports are already in use, configure alternative ports:

# Check port usage
netstat -an | grep :11434

# Use alternative ports
OLLAMA_PORT=11435
REDIS_PORT=6380
QDRANT_PORT=6334
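
lsof can also identify which process currently holds a port before you remap it:

# Show the process listening on the Ollama port
lsof -nP -iTCP:11434 -sTCP:LISTEN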

Health Checks

Monitor service health:

# Ollama
curl http://localhost:11434/api/tags

# Redis
redis-cli ping

# Qdrant
curl http://localhost:6333/health

# PostgreSQL
pg_isready -h localhost -p 5432