AI Bot Router Import Documentation

Overview

This document explains the conditional import logic in main.py for the AI bot router and how to troubleshoot router import failures.

Router Import Logic

The AI bot uses a conditional import pattern in main.py (lines 80-159):
# Import AI router with error handling
try:
    import sys
    sys.path.insert(0, '/app')
    logger.info("🔍 Attempting to import AI query router...")
    from api.routes.ai_query import router as ai_query_router
    AI_ROUTER_AVAILABLE = True
    logger.info("✅ AI query router imported successfully")
except ImportError as e:
    logger.error(f"❌ AI query router import failed: {e}")
    AI_ROUTER_AVAILABLE = False
    ai_query_router = None

# Conditional router inclusion
if AI_ROUTER_AVAILABLE and ai_query_router:
    app.include_router(ai_query_router)
    logger.info("🚀 ai_query router successfully imported and active")
else:
    logger.error("❌ CRITICAL: AI router import failed - service will not function properly")
    # No fallback endpoint - service will fail explicitly

Why This Pattern Exists

  1. Graceful Degradation: If the router import fails, the service can still start without crashing
  2. Explicit Failure: Without a fallback endpoint, import failures are immediately apparent
  3. Debugging: Clear logging shows whether the router loaded successfully

Common Import Failure Causes

1. Missing Dependencies

  • PyJWT: Required by api/auth.py for JWT token handling
  • Vertex AI: Required for Gemini model integration
  • BigQuery: Required for data queries
Fix: Ensure all dependencies are in requirements.txt:
PyJWT>=2.8.0
google-cloud-aiplatform>=1.38.0
google-cloud-bigquery>=3.11.0

2. Import Path Issues

The router uses fallback imports for Cloud Run environment:
# In api/routes/ai_query.py
try:
    from api.dependencies import get_current_user
    from api.bigquery.queries import ...
except ImportError:
    # Fallback imports for Cloud Run environment
    import sys
    sys.path.insert(0, '/app')
    from api.dependencies import get_current_user
    from api.bigquery.queries import ...
Fix: Ensure sys.path.insert(0, '/app') is used in fallback imports.
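The retry above can also be kept idempotent with a small helper, so repeated fallbacks don't stack duplicate entries on sys.path. A sketch (ensure_app_on_path is a hypothetical name, not something in the codebase):

```python
import sys

def ensure_app_on_path(app_root="/app"):
    """Prepend the container's app root so `api.*` absolute imports resolve.

    Safe to call more than once: the path is only inserted if missing.
    """
    if app_root not in sys.path:
        sys.path.insert(0, app_root)
```

Calling this at the top of the except branch, before retrying the imports, keeps sys.path clean even if several modules fall back independently.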

3. Environment Variables

Missing environment variables cause silent failures:
  • GCP_PROJECT_ID (should be "payroll-bi-gauntlet")
  • VERTEX_MODEL (the Gemini model name)
  • GCP_REGION (should be "us-central1")
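A startup check can surface these before the import is ever attempted. A minimal sketch, with the expected values taken from the list above (the check_env helper is illustrative, not part of the codebase):

```python
import os

# Expected values per this document; None means any non-empty value is accepted
REQUIRED_ENV_VARS = {
    "GCP_PROJECT_ID": "payroll-bi-gauntlet",
    "GCP_REGION": "us-central1",
    "VERTEX_MODEL": None,
}

def check_env(env=None):
    """Return a list of human-readable problems with required env vars."""
    env = os.environ if env is None else env
    problems = []
    for name, expected in REQUIRED_ENV_VARS.items():
        value = env.get(name)
        if not value:
            problems.append(f"{name} is not set")
        elif expected is not None and value != expected:
            problems.append(f"{name}={value!r}, expected {expected!r}")
    return problems
```

Logging each problem at startup puts misconfiguration right next to the router import messages instead of letting it fail silently.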

Troubleshooting Steps

1. Check Cloud Run Logs

Look for these log messages:
Success Indicators:
  • "🔍 Attempting to import AI query router..."
  • "✅ AI query router imported successfully"
  • "🚀 ai_query router successfully imported and active"
Failure Indicators:
  • "❌ AI query router import failed"
  • "❌ CRITICAL: AI router import failed"
  • ModuleNotFoundError or ImportError tracebacks

2. Test Endpoint Directly

curl -X POST https://your-service-url/api/v1/vertex/query \
  -H "Content-Type: application/json" \
  -d '{"question": "how many employees did nug have in aug"}'
Expected Success Response:
{
  "answer": "Nug has 139 employees in August 2025."
}
Failure Response (if router not loaded):
  • 404 Not Found (no endpoint available)
  • 500 Internal Server Error (import failure)
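The curl test above can also be scripted with the standard library; a 404 here points straight at the router import without needing to read logs first. A sketch (the service URL is a placeholder you supply):

```python
import json
import urllib.error
import urllib.request

def probe(base_url, question="how many employees did nug have in aug"):
    """POST a business question and return (status, parsed JSON or None)."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/vertex/query",
        data=json.dumps({"question": question}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.status, json.loads(resp.read())
    except urllib.error.HTTPError as e:
        # 404: router never registered; 500: import or runtime failure
        return e.code, None
```

Usage: status, body = probe("https://your-service-url") — then check that body contains business data rather than a generic answer.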

3. Force Container Rebuild

If dependencies were added but the container wasn't rebuilt:
gcloud run deploy fastapi-backend \
  --source . \
  --no-cache \
  --region us-central1 \
  --project payroll-bi-gauntlet
The --no-cache flag ensures the container is rebuilt with the latest requirements.txt.

Deployment Checklist

Before deploying AI bot changes:
  1. Dependencies: Verify all required packages are in requirements.txt
  2. Import Paths: Check fallback imports use sys.path.insert(0, '/app')
  3. Environment Variables: Confirm all required env vars are set in Cloud Run
  4. No Fallback Endpoint: Ensure inline fallback endpoint is removed
  5. Force Rebuild: Use --no-cache flag if dependencies changed
  6. Test Endpoint: Verify /api/v1/vertex/query returns business data, not generic responses
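Checklist item 1 can be automated with a quick pre-deploy check. A sketch (the missing_requirements helper is illustrative; it only parses simple version-pinned lines):

```python
def missing_requirements(requirements_text,
                         required=("PyJWT", "google-cloud-aiplatform",
                                   "google-cloud-bigquery")):
    """Return required packages that are absent from a requirements.txt body."""
    listed = set()
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Keep only the package name, dropping any version specifier
        for sep in (">=", "==", "<=", "~=", ">", "<"):
            line = line.split(sep)[0]
        listed.add(line.strip().lower())
    return [pkg for pkg in required if pkg.lower() not in listed]
```

Running this against requirements.txt in CI before gcloud run deploy catches a missing PyJWT before it becomes a router import failure in production.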

Historical Context

The Fallback Endpoint Problem

Previously, when the router import failed, main.py would create an inline fallback endpoint:
@app.post("/api/v1/vertex/query")
async def ai_query_inline(request: AIQueryRequest):
    # Simplified logic returning generic responses
    return {"answer": "Agent data for 2025-08-01:"}
Problem: This created a route conflict where:
  1. Router defines /api/v1/vertex/query (with full business logic)
  2. Fallback also defines /api/v1/vertex/query (with generic responses)
  3. FastAPI resolves routes in registration order, so whichever was registered first silently shadows the other
Solution: Remove the fallback endpoint entirely. If the router import fails, the service should fail explicitly so we know there's a problem.

Related Files

  • main.py (lines 80-159): Router import logic
  • api/routes/ai_query.py: The AI router implementation
  • requirements.txt: Dependencies including PyJWT
  • api/bigquery/business_queries.py: Business-level query functions
  • api/bigquery/business_matcher.py: Business name fuzzy matching

Monitoring

To monitor router health in production:
  1. Startup Logs: Check for "🚀 ai_query router successfully imported and active"
  2. Endpoint Testing: Regular tests of business queries like "how many employees did nug have in aug"
  3. Error Alerts: Monitor for "❌ CRITICAL: AI router import failed" messages
This ensures the AI bot is functioning with full business query capabilities rather than falling back to generic responses.
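
For the error alert, a log-based metric or alerting policy can key on the critical message. An example Cloud Logging filter (the service name is assumed from the deploy command above):

```
resource.type="cloud_run_revision"
resource.labels.service_name="fastapi-backend"
textPayload:"CRITICAL: AI router import failed"
```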