# Onboarding a New Location - Data Engineering

## Step-by-step Guide
### 1. Verify Otter Data Integration

Check how the new location appears in Otter data:

- **Confirm Data Flow**
  - Confirm `facility_id` and `facility_name` values
  - Ensure data is flowing correctly from the Otter API
- Update the facility ID to facility name mapping in the Airflow script `dw-orders-insert.sql`
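The facility mapping referenced above might look like the following in the Airflow ingestion script. This is a minimal sketch; the IDs, names, and function are placeholders, not the script's actual contents.

```python
# Hypothetical facility_id -> facility_name mapping kept in the Airflow
# ingestion script; all IDs and names below are placeholders.
FACILITY_NAME_BY_ID = {
    "fac-001": "Existing Kitchen",
    "fac-002": "New Location",  # add the new location's Otter facility_id here
}

def facility_name(facility_id: str) -> str:
    """Resolve an Otter facility_id, failing loudly on unmapped IDs so a
    missing onboarding entry is caught early rather than silently dropped."""
    try:
        return FACILITY_NAME_BY_ID[facility_id]
    except KeyError:
        raise KeyError(
            f"Unmapped Otter facility_id: {facility_id}; "
            "update the mapping when onboarding a location"
        )
```

Failing on unmapped IDs (rather than defaulting to `NULL`) makes a forgotten mapping entry visible in the pipeline logs on day one.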
### 2. Update Snowflake SQL Views

- Update `ORBITAL_KITCHENS_DW.DW.V_BRAND_ACTIVE`
  - Update the facility ID to location abbreviation mapping in the CASE statements
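The CASE-statement edit has the following general shape. This sketch renders the expression from a mapping so the example is concrete; the IDs, abbreviations, and column name are placeholders, not the actual view definition.

```python
# Hypothetical facility_id -> location abbreviation mapping; every CASE
# statement in the view needs a branch for the new location.
ABBREV_BY_FACILITY_ID = {
    "fac-001": "USQ",
    "fac-002": "NEW",  # new location's abbreviation
}

def location_case_sql(column: str = "FACILITY_ID") -> str:
    """Render the CASE expression used to map facility IDs to abbreviations."""
    branches = "\n".join(
        f"    WHEN '{fid}' THEN '{abbrev}'"
        for fid, abbrev in ABBREV_BY_FACILITY_ID.items()
    )
    return f"CASE {column}\n{branches}\n    ELSE NULL\nEND AS LOCATION"
```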
### 3. Update Sigma Dashboards and Configuration

- **Update Location Custom Function**
  - Navigate to the Sigma admin page
  - Update the location custom function at the bottom of the page
- **Audit All Dashboards**
  - Review all existing dashboards to identify which ones are missing data from the new location
  - Update manual location mapping formulas where applicable: Food Cost Summary & Efficiency orders table, TU Velocity waste log table
- **Store Level Ops Metrics Dashboard**
  - Update the location mapping formula in the 7shifts tables
  - Add the location to the sort field in all tables
- **Kitchen Screen Metrics Dashboard**
  - Add English and Spanish tabs for the new location
### 4. Update Snowflake Notebooks

- **Add to Supplemental Ratings Snowflake Notebook**
  - Include the new location in the ratings analysis
- **Create TU Sales Snowflake Notebook**
  - Set up a new notebook for Transfer Unit (TU) sales tracking
  - Update the TU Velocity Dashboard to include the new location
- **Filter Bakery Forecast (Temporary)**
  - Filter out the new location from the existing Snowflake notebook until its forecast is ready
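The temporary bakery-forecast filter amounts to dropping the new location's rows until it has its own forecast. A minimal sketch, assuming the rows carry a `location` abbreviation field (the column name and abbreviation are placeholders):

```python
# Placeholder abbreviation for the location being onboarded.
NEW_LOCATION = "NEW"

def exclude_new_location(rows: list[dict]) -> list[dict]:
    """Keep only rows from locations that already have a forecast; remove
    this filter once the new location's forecast is created in step 11."""
    return [row for row in rows if row["location"] != NEW_LOCATION]
```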
5. Set Up Daily and Weekly Reports¶
-
Daily Sales Report
- Add new location to the daily sales report
-
Daily Kitchen Performance Report
- Create new dashboard for the location
- Configure automated notification
-
Weekly Kitchen Announcements
- Create dashboard for the location
- Configure automated notification
### 6. Configure Slack Notifications

- **Weekly Food Quality Report**
  - Add an automated Slack notification to the location's daily channel
- **Purchase Order PDFs**
  - Set up Airflow automation to post PO PDFs to the daily Slack channel
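The Slack posting step in the Airflow task might be sketched as below, using a standard Slack incoming webhook. The webhook URL and message text are placeholders; the actual channel URL comes from the location's daily channel configuration.

```python
import json
import urllib.request

def build_report_message(location: str, report_name: str) -> dict:
    """Build the Slack incoming-webhook payload for a report notification."""
    return {"text": f"{report_name} for {location} is ready."}

def post_to_slack(webhook_url: str, payload: dict) -> None:
    """POST the payload to a Slack incoming webhook (placeholder URL)."""
    request = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
```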
### 7. Update 7shifts Integration

- **Payroll Ingestion**
  - Update the Airflow script to include the new location
  - Add the location to the weekly payroll dashboard
- **Weekly Labor Report**
  - USQ Summary Tables → Bakery Cost & Sales: add the new location
  - Make sure to change the total labor formula (in the Labor Cost table) for the new location
### 8. Set Up Odoo Transfer Units

Coordinate with Linh to have a VA complete:

- **Transfer Unit Configuration**
  - Add a new warehouse for transfer units in Odoo
  - Create new TUs or TU containers if necessary
### 9. Create Prep & Inventory Management System

Note: steps 1 and 2 are not needed with the release of the prep & inventory tool.

- **Google Sheet Dashboard**
  - Create a new Prep & Inventory Dashboard Google Sheet for the location
  - Set up historical data extraction
  - Create an Airflow script for data ingestion
  - Create the corresponding Snowflake table
- **Scanning Apps**
  - Set up new scanning apps for the location
  - Update the formulas in the sheets
  - Configure the Apps Script with:
    - Spreadsheet ID
    - Webhook URLs
    - Location abbreviations
    - PDF generation settings
- **Initial Inventory Setup**
  - Create the initial inventory delivery list with pars
- **Dashboard Integration**
  - Add a tab for the new location in the Prep Production Dashboard (once scans start coming through)
  - Add a tab for the new location in the Inventory Scanning Dashboard (once scans start coming through)
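The scanning-app configuration above covers four settings per location. A hypothetical sketch of that shape, with a small completeness check; every value below is a placeholder, not a real spreadsheet ID or webhook.

```python
# Hypothetical per-location scanning-app configuration mirroring the four
# settings the Apps Script needs; all values are placeholders.
SCANNING_APP_CONFIG = {
    "spreadsheet_id": "placeholder-spreadsheet-id",
    "webhook_urls": {
        "prep": "https://hooks.example.com/prep",
        "inventory": "https://hooks.example.com/inventory",
    },
    "location_abbreviation": "NEW",
    "pdf_generation": {"enabled": True, "folder_id": "placeholder-folder-id"},
}

def validate_config(config: dict) -> bool:
    """Check all four required settings are present before deploying the app."""
    required = {
        "spreadsheet_id",
        "webhook_urls",
        "location_abbreviation",
        "pdf_generation",
    }
    return required <= config.keys()
```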
### 10. Set Up Waste Tracking

- Create a new waste log app for the location
- Configure the data ingestion pipeline
### 11. Forecasting Setup (After 3-4 Weeks of Data)

**Important:** Wait for 3-4 weeks of operational data before creating forecasts.

- **Order Forecast**
  - Create an order forecast for the new location (including hours) via the `Order Forecasting` Snowflake Notebook
  - Update the Sigma dashboard to include the new location
- **Bakery Item Forecast**
  - Create a bakery item forecast for the new location via the `Bakery Item Forecast` Snowflake Notebook
- **Automated Inventory Pars**
  - Create an automated inventory pars Snowflake Notebook based on historical data (copy from another location)
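The automated-pars notebook might derive par levels along these lines: average daily usage times a safety buffer. This is a hedged sketch, not the notebook's actual logic; the safety factor and data shape are assumptions.

```python
from statistics import mean

# Assumed safety buffer over average daily usage; tune per location.
SAFETY_FACTOR = 1.2

def compute_pars(daily_usage: dict[str, list[float]]) -> dict[str, float]:
    """Map each inventory item to a par level from its historical daily
    usage: par = mean daily usage * safety factor, rounded to one decimal."""
    return {
        item: round(mean(usage) * SAFETY_FACTOR, 1)
        for item, usage in daily_usage.items()
        if usage  # skip items with no history yet
    }
```

Copying the notebook from another location, as the step suggests, then swapping in the new location's 3-4 weeks of history keeps the par logic consistent across sites.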