Getting Started¶
Quick Start Guide¶
This guide will help you set up the Orbital Manager Backend development environment.
Prerequisites¶
- Python: 3.11 or higher
- PostgreSQL: 12+ (for orders, batch items, prep inventory)
- RabbitMQ: 3.8+ (message broker)
- Snowflake: Account access (for reference data sync)
- Git: For version control
Installation¶
1. Clone the Repository¶
```
git clone https://github.com/Orbital-Kitchens/orbital-manager-backend.git
cd orbital-manager-backend
```
2. Install Python Dependencies¶
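The repository's exact dependency workflow isn't shown here; assuming a standard `requirements.txt` at the project root, a typical setup looks like:

```shell
# Create and activate a virtual environment (optional but recommended)
python -m venv .venv
source .venv/bin/activate

# Install dependencies (assumes a requirements.txt at the repo root)
pip install -r requirements.txt
```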
3. Set Up Environment Variables¶
Create a .env file in the project root:
```
# PostgreSQL (Supabase)
POSTGRES_SUPABASE_URL=postgresql://user:password@localhost:5432/orbital_kitchen

# RabbitMQ
RABBITMQ_URL=amqp://guest:guest@localhost:5672/

# Snowflake
SNOWFLAKE_ACCOUNT=your_account
SNOWFLAKE_USER=your_user
SNOWFLAKE_WAREHOUSE=your_warehouse
SNOWFLAKE_DATABASE=your_database
SNOWFLAKE_SCHEMA=your_schema

# Sentry (optional)
SENTRY_DSN=https://your-sentry-dsn

# Azure Key Vault (for production secrets)
AZURE_KEY_VAULT_URL=https://your-keyvault.vault.azure.net/
```
4. Set Up Databases¶
PostgreSQL:
```
# Create database
createdb orbital_kitchen

# Run migrations for Kitchen Batch Tool Service
cd kitchen_batch_tool_service
alembic upgrade head
```
RabbitMQ:
```
# Using Docker
docker run -d --name rabbitmq -p 5672:5672 -p 15672:15672 rabbitmq:3-management

# Access management UI at http://localhost:15672
# Default credentials: guest/guest
```
Running Services¶
Option 1: Run Individual Services with Uvicorn¶
Terminal 1 - Kitchen Batch Tool Service (port 8000):
Terminal 2 - Order Management Service (port 8001):
Terminal 3 - Kitchen Prep Tool Service (port 8002):
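The uvicorn invocations themselves are not shown above; assuming each service exposes its FastAPI app at `app.main:app` inside its service directory (an assumption based on the repository layout), the three terminals would run something like:

```shell
# Terminal 1 - Kitchen Batch Tool Service
cd kitchen_batch_tool_service
uvicorn app.main:app --reload --port 8000

# Terminal 2 - Order Management Service
cd order_management_service
uvicorn app.main:app --reload --port 8001

# Terminal 3 - Kitchen Prep Tool Service
cd kitchen_prep_tool_service
uvicorn app.main:app --reload --port 8002
```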
Option 2: Run All Services with Docker Compose¶
Run the complete stack with all services containerized:

- Ensure Docker is running
- Configure environment: copy `.env.example` to `.env` at the project root and edit with your local settings
- Run all services
- Verify all services are working:
    - Check health endpoints
    - Test functionality
- Review logs:

    ```
    docker compose -f docker-compose.yml logs -f
    ```

- Stop all services
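The compose commands themselves are elided above; assuming the stack is defined in `docker-compose.yml` at the repo root, the usual lifecycle is:

```shell
# Build and start every service in the background
docker compose -f docker-compose.yml up -d --build

# Follow logs for the whole stack
docker compose -f docker-compose.yml logs -f

# Stop and remove all containers
docker compose -f docker-compose.yml down
```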
Verify Setup¶
Health Checks¶
```
# Check Kitchen Batch Tool Service
curl http://localhost:8000/ping

# Check Order Management Service
curl http://localhost:8001/ping

# Check Kitchen Prep Tool Service
curl http://localhost:8002/ping
```
Test Order Flow¶
Orders are received via Otter webhooks directly to the Order Management Service HTTP endpoint.
Expected Flow:
- Otter sends webhook to the `/webhooks/otter` endpoint
- Webhook payload published to RabbitMQ (`otter_events` exchange)
- Otter event consumer worker processes the event from the queue
- New order saved to PostgreSQL (`otter.orders` schema)
- Structured order published to RabbitMQ (`structured_order_data` exchange)
- Kitchen Batch Tool Service consumes from RabbitMQ
- SSE stream (if connected) receives the event
Testing Webhooks Locally:
Use a tool like ngrok to expose your local service to the internet, then configure the Otter webhook URL to point at the tunnel. Alternatively, simulate a webhook with curl:
```
curl -X POST http://localhost:8001/webhooks/otter \
  -H "Content-Type: application/json" \
  -d @test_webhook_payload.json
```
Test SSE Stream¶
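The SSE endpoint path isn't documented here; assuming the Kitchen Batch Tool Service exposes a hypothetical `/events` stream on port 8000, you can watch it with curl (`-N` disables buffering so events print as they arrive):

```shell
curl -N -H "Accept: text/event-stream" http://localhost:8000/events
```

Leave this running in one terminal while posting a test webhook in another; each processed order should appear as a new event.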
Running Tests¶
```
# Run all tests
pytest

# Run tests for a specific service
pytest order_management_service/tests/
pytest kitchen_batch_tool_service/tests/
pytest kitchen_prep_tool_service/tests/

# Run with coverage
pytest --cov=app --cov-report=html

# Run a specific test file
pytest kitchen_batch_tool_service/tests/test_calculate_used_items.py

# Run with verbose output
pytest -v
```
Development Tools¶
Linting¶
```
# Check all code
ruff check .

# Auto-fix issues
ruff check . --fix

# Format code
ruff format .

# Check formatting
ruff format --check .
```
Type Checking¶
```
# Run mypy (if configured)
mypy order_management_service/app/
mypy kitchen_batch_tool_service/app/
mypy kitchen_prep_tool_service/app/
```
Troubleshooting¶
RabbitMQ Connection Issues¶
Problem: RabbitMQ connection failed
Solution:
- Ensure RabbitMQ is running: `docker ps | grep rabbitmq`
- Check `RABBITMQ_URL` in `.env`
- Verify credentials
- Check the RabbitMQ management UI: http://localhost:15672
PostgreSQL Connection Issues¶
Problem: PostgreSQL connection failed
Solution:
- Verify PostgreSQL is running: `pg_isready`
- Check `POSTGRES_SUPABASE_URL` in `.env`
- Verify the database exists
- Check user permissions
Port Already in Use¶
Problem: `Address already in use`
Solution:

```
# Find process using the port
lsof -i :8000
lsof -i :8001
lsof -i :8002

# Kill the process
kill -9 <PID>
```
Import Errors¶
Problem: `ModuleNotFoundError: No module named 'shared'`
Solution:
- Ensure you're running from the project root
- Services add the project root to `sys.path` in `main.py`, so shared imports resolve:

```
from shared.utils.logger import get_logger
```
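As a sketch of what that `main.py` bootstrap typically looks like (the exact directory layout is an assumption; in each service the project root sits two directories above `app/main.py`):

```python
import sys
from pathlib import Path

here = Path(__file__).resolve()
# In <service>/app/main.py the project root is here.parents[2];
# the guard keeps this sketch safe for shallower paths.
project_root = here.parents[2] if len(here.parents) > 2 else here.parent
sys.path.insert(0, str(project_root))

# After this, `from shared.utils.logger import get_logger` resolves
# no matter which service directory the process was started from.
```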
Next Steps¶
Tips¶
- Use the `--reload` flag during development for auto-restart
- Check logs for detailed error messages
- Monitor all service consoles when testing the order flow
- Use structured logging fields for better debugging
- Run tests before committing changes
- Use Docker Compose for integration testing