# Location Management System

## Overview

The location management system is the foundation of Changemaker Lite's field organizing capabilities. It provides building-level and unit-level voter/supporter tracking with comprehensive address management, geocoding integration, and Canadian electoral data (NAR) import support.

**Key Capabilities:**

- **Building + Unit Architecture**: Location (building) has 1:N Address (units) for multi-unit buildings
- **NAR Integration**: Import Canadian electoral data (LOC_GUID, ADDR_GUID from Elections Canada)
- **Multi-Provider Geocoding**: Automatically geocode addresses with confidence scoring
- **CSV Import/Export**: Bulk operations for campaign data management
- **Support Level Tracking**: LEVEL_1 (Strong) → LEVEL_4 (Opposed) classification
- **Spatial Filtering**: Filter locations by polygon cuts or bounding box
- **History Tracking**: Complete audit trail of location changes
- **Field Data**: Sign tracking, building notes, federal district assignment

**Use Cases:**

- Voter file management for electoral campaigns
- Door-to-door canvassing organization
- Sign placement tracking (lawn signs, window signs)
- Multi-unit building canvassing (apartments, condos)
- Federal electoral district mapping
- NAR 2025 import for Canadian campaigns
- Walk sheet generation for field teams

## Architecture

```mermaid
graph TD
    A[Admin User] -->|Manages Locations| B[LocationsPage]
    B -->|CRUD Operations| C[Locations API]
    C -->|Save/Query| D[(Location Model)]
    C -->|Geocode Address| E[Geocoding Service]
    E -->|Try Providers| F[Multi-Provider Chain]
    F -->|Cache Result| G[(Redis Cache)]

    H[CSV Import] -->|Parse File| C
    C -->|Validate| I[Location Service]
    I -->|Auto-Geocode| E
    I -->|Create Records| D

    J[NAR Import] -->|Server Stream| K[NAR Import Service]
    K -->|Join Address+Location| L[Location Files]
    K -->|Convert Coords| M[proj4 Lambert→WGS84]
    K -->|Filter| N[Cut/City/Postal]
    K -->|Bulk Insert| D

    D -->|1:N| O[(Address Model)]
    D -->|Assigned To| P[(Cut Model)]

    Q[Public Map] -->|GET /api/public/map/locations| C
    C -->|Filter by Bounds| D

    R[Canvass Session] -->|Load Addresses| C
    C -->|Point-in-Polygon| S[Spatial Utils]

    style D fill:#e1f5ff
    style O fill:#e1f5ff
    style P fill:#e1f5ff
    style G fill:#fff4e1
```

**Flow Description:**

1. **Admin creates location** → Location service validates the address and optionally geocodes it
2. **CSV import** → Service parses the file, detects the format (standard/NAR), geocodes if needed, creates records
3. **NAR server import** → Streams large files, joins Address+Location CSVs, converts Lambert coordinates, filters, bulk inserts
4. **Public map loads** → Location service queries by bounds and returns color-coded markers
5. **Canvass session starts** → Service loads addresses within the cut polygon using a ray-casting algorithm
6. **Geocoding** → Multi-provider chain tries providers in order and caches successful results

## Database Models

### Location Model

See [Location Model Documentation](../../database/models/map.md#location-model) for the full schema.

**Key Fields:**

- `latitude` / `longitude`: WGS84 coordinates (Decimal type for precision)
- `address`: Street address (building level, not including unit numbers)
- `postalCode`: Canadian postal code (A1A 1A1 format)
- `province`: Province code (ON, QC, AB, etc.)
- `federalDistrict`: Federal electoral district name
- `buildingType`: SINGLE_FAMILY | MULTI_UNIT | MIXED_USE | COMMERCIAL
- `totalUnits`: Number of units in the building (for multi-unit buildings)
- `geocodeConfidence`: Confidence score 0-100 from the geocoding service
- `geocodeProvider`: Google, Mapbox, Nominatim, Photon, LocationIQ, ArcGIS
- `narLocGuid`: NAR LOC_GUID identifier (Canadian electoral data)
- `buildingNotes`: Free-text notes about building access, parking, etc.

**NAR-Specific Fields:**

- `narLocGuid`: Location GUID from the NAR dataset
- `buildingUse`: Building use code (1=Residential, 2=Commercial, etc.)
- `postalCode`: Extracted from NAR MAIL_POSTAL_CODE
- `province`: Extracted from NAR PROV_CODE
- `federalDistrict`: Extracted from NAR FED_ENG_NAME

**Geocoding Fields:**

- `geocodeConfidence`: 0-100 score (>90=high, 70-90=medium, <70=low)
- `geocodeProvider`: Which provider successfully geocoded the address
- `geocodeAttempts`: Number of failed geocoding attempts
- `lastGeocodeAttempt`: Timestamp of the last geocoding attempt

### Address Model

See [Address Model Documentation](../../database/models/map.md#address-model) for the full schema.

**Key Fields:**

- `locationId`: Foreign key to Location (building)
- `unitNumber`: Unit/apartment/suite number (optional for single-family)
- `firstName` / `lastName`: Resident name
- `email` / `phone`: Contact information
- `supportLevel`: LEVEL_1 (Strong) | LEVEL_2 (Leaning) | LEVEL_3 (Undecided) | LEVEL_4 (Opposed)
- `sign`: Boolean — has lawn/window sign
- `signSize`: Sign size description (e.g., "24x18 lawn", "window")
- `notes`: Free-text notes from canvassing
- `narAddrGuid`: NAR ADDR_GUID identifier

**NAR-Specific Fields:**

- `narAddrGuid`: Address GUID from the NAR dataset
- `unitNumber`: Extracted from NAR APT_NO_LABEL

**Related Models:**

- [Cut](../../database/models/map.md#cut-model) — Polygon overlays for organizing
- [CanvassVisit](../../database/models/canvass.md#canvassvisit-model) — Door-knock records
- [LocationHistory](../../database/models/map.md#locationhistory-model) — Audit trail

## API Endpoints

See [Locations Backend Module Documentation](../../backend/modules/map/locations.md) for the full API reference.
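The confidence bands documented above (>90 high, 70-90 medium, <70 low) can be expressed as a tiny helper. This is an illustrative sketch, not a function from the codebase; `confidenceBand` and its return type are assumptions:

```typescript
type ConfidenceBand = 'high' | 'medium' | 'low' | 'none';

// Illustrative only: maps a geocodeConfidence score (0-100, or null when the
// location was never geocoded) to the bands used throughout these docs.
function confidenceBand(confidence: number | null): ConfidenceBand {
  if (confidence === null) return 'none';
  if (confidence > 90) return 'high';
  if (confidence >= 70) return 'medium';
  return 'low';
}
```

Note that a score of exactly 90 falls in the medium band, since the documented bands are >90 and 70-90.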
**Admin Endpoints:**

| Method | Endpoint | Auth | Description |
|--------|----------|------|-------------|
| GET | `/api/map/locations` | MAP_ADMIN | List locations with pagination, search, filters |
| GET | `/api/map/locations/stats` | MAP_ADMIN | Get location statistics (total, geocoded, by confidence) |
| GET | `/api/map/locations/:id` | MAP_ADMIN | Get location details with addresses |
| POST | `/api/map/locations` | MAP_ADMIN | Create new location |
| PATCH | `/api/map/locations/:id` | MAP_ADMIN | Update location |
| DELETE | `/api/map/locations/:id` | MAP_ADMIN | Delete location (and cascade addresses) |
| POST | `/api/map/locations/geocode` | MAP_ADMIN | Geocode a single address |
| POST | `/api/map/locations/reverse-geocode` | MAP_ADMIN | Reverse geocode lat/lng to an address |
| POST | `/api/map/locations/import` | MAP_ADMIN | Import CSV file (standard or NAR format) |
| GET | `/api/map/locations/export` | MAP_ADMIN | Export locations to CSV |
| GET | `/api/map/locations/:id/history` | MAP_ADMIN | Get location change history |

**Bulk Operations:**

| Method | Endpoint | Auth | Description |
|--------|----------|------|-------------|
| POST | `/api/map/locations/bulk-geocode/start` | MAP_ADMIN | Start bulk geocoding job (BullMQ) |
| GET | `/api/map/locations/bulk-geocode/status` | MAP_ADMIN | Check bulk geocoding job status |
| POST | `/api/map/locations/bulk-geocode/cancel` | MAP_ADMIN | Cancel running bulk geocoding job |

**NAR Import Endpoints:**

| Method | Endpoint | Auth | Description |
|--------|----------|------|-------------|
| GET | `/api/map/locations/nar/datasets` | MAP_ADMIN | List available NAR datasets from the `/data` directory |
| POST | `/api/map/locations/nar/import` | MAP_ADMIN | Server-side streaming NAR import with filters |
| GET | `/api/map/locations/nar/import/progress` | MAP_ADMIN | Get NAR import progress (polling endpoint) |

**Public Endpoints:**

| Method | Endpoint | Auth | Description |
|--------|----------|------|-------------|
| GET | `/api/public/map/locations` | None | List locations by bounds (for the public map) |

**Volunteer Endpoints:**

| Method | Endpoint | Auth | Description |
|--------|----------|------|-------------|
| PATCH | `/api/map/canvass/volunteer/locations/:id` | Any logged-in user | Update location from a canvass session |

## Configuration

### Environment Variables

| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `GEOCODING_ENABLED` | boolean | `true` | Enable geocoding services |
| `GEOCODING_CACHE_ENABLED` | boolean | `true` | Cache geocoding results in Redis |
| `GEOCODING_CACHE_TTL_HOURS` | number | `168` | Cache TTL (7 days) |
| `GEOCODING_PROVIDERS` | string[] | See geocoding.md | Comma-separated provider list |
| `GOOGLE_MAPS_API_KEY` | string | - | Google Geocoding API key |
| `MAPBOX_ACCESS_TOKEN` | string | - | Mapbox API token |
| `LOCATIONIQ_API_KEY` | string | - | LocationIQ API key |
| `NAR_DATA_DIR` | string | `/data` | Directory containing NAR CSV files |

### Database Indexes

Key indexes for performance:

```sql
-- Location queries
CREATE INDEX idx_locations_lat_lng ON "Location" (latitude, longitude);
CREATE INDEX idx_locations_postal_code ON "Location" ("postalCode");
CREATE INDEX idx_locations_province ON "Location" (province);
CREATE INDEX idx_locations_federal_district ON "Location" ("federalDistrict");
CREATE INDEX idx_locations_geocode_confidence ON "Location" ("geocodeConfidence");
CREATE INDEX idx_locations_nar_loc_guid ON "Location" ("narLocGuid");

-- Address queries
CREATE INDEX idx_addresses_location_id ON "Address" ("locationId");
CREATE INDEX idx_addresses_support_level ON "Address" ("supportLevel");
CREATE INDEX idx_addresses_nar_addr_guid ON "Address" ("narAddrGuid");

-- Spatial queries (cut assignment)
CREATE INDEX idx_locations_lat ON "Location" (latitude);
CREATE INDEX idx_locations_lng ON "Location" (longitude);
```

## Admin Workflow

### Creating a Location

**Step 1: Navigate to Locations Page**

Navigate to
**Map → Locations** in the admin sidebar.

![LocationsPage Screenshot Placeholder]

**Step 2: Click "Add Location"**

Click the **+ Add Location** button in the top-right corner.

**Step 3: Enter Address Information**

Fill in the location form:

- **Address**: Street address (e.g., "123 Main Street")
- **Postal Code**: Canadian postal code (e.g., "K1A 0B1")
- **Building Type**: Single Family / Multi-Unit / Mixed Use / Commercial
- **Total Units**: Number of units (for multi-unit buildings)
- **Building Notes**: Access codes, parking info, etc.

**Step 4: Auto-Geocode (Optional)**

Click the **Geocode** button to automatically fetch latitude/longitude coordinates. The system will:

1. Try geocoding providers in order (Google → Mapbox → Nominatim → Photon → LocationIQ → ArcGIS)
2. Return a confidence score (0-100)
3. Display the formatted address from the provider
4. Cache the result in Redis for 7 days

**Step 5: Add Addresses (Units)**

For multi-unit buildings, click **Add Address** to create unit records:

- **Unit Number**: Apartment/suite number
- **First Name / Last Name**: Resident name
- **Support Level**: LEVEL_1 (Strong) → LEVEL_4 (Opposed)
- **Sign**: Check if the resident has a lawn/window sign
- **Notes**: Canvassing notes

**Step 6: Save Location**

Click **Create** to save the location and addresses.
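The postal code field in Step 3 expects the A1A 1A1 format. A minimal normalizer/validator can be sketched as follows; `normalizePostalCode` is illustrative and not a function from the codebase:

```typescript
// Illustrative sketch (assumption, not the app's validator): normalize a
// Canadian postal code to the "A1A 1A1" format used by the location form.
// Returns null when the input is not a plausible postal code.
function normalizePostalCode(raw: string): string | null {
  const compact = raw.toUpperCase().replace(/[^A-Z0-9]/g, '');
  // Letter-digit-letter digit-letter-digit; D, F, I, O, Q, U never appear,
  // and W, Z never appear as the first letter.
  const valid = /^[ABCEGHJ-NPRSTVXY]\d[ABCEGHJ-NPRSTV-Z]\d[ABCEGHJ-NPRSTV-Z]\d$/.test(compact);
  return valid ? `${compact.slice(0, 3)} ${compact.slice(3)}` : null;
}
```

Normalizing before save keeps lookups by `postalCode` consistent regardless of how the admin typed the value.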
### CSV Import Workflow

**Step 1: Prepare CSV File**

Prepare a CSV file with the following columns (flexible header names):

**Standard Format:**

```csv
address,firstName,lastName,email,phone,unitNumber,supportLevel,sign,notes,latitude,longitude
123 Main St,John,Doe,john@example.com,555-1234,101,LEVEL_1,true,Friendly contact,,
124 Main St,Jane,Smith,jane@example.com,555-5678,,LEVEL_2,false,Ask about lawn sign,45.4215,-75.6972
```

**NAR Format** (auto-detected if 3+ NAR columns are present):

```csv
CIVIC_NO,OFFICIAL_STREET_NAME,OFFICIAL_STREET_TYPE,APT_NO_LABEL,MAIL_POSTAL_CODE,BG_LATITUDE,BG_LONGITUDE,FED_ENG_NAME
123,Main,Street,101,K1A 0B1,45.4215,-75.6972,Ottawa Centre
124,Main,Street,,K1A 0B2,45.4220,-75.6975,Ottawa Centre
```

**Step 2: Open Import Modal**

Click the **Import CSV** button on LocationsPage.

**Step 3: Select Import Format**

Choose a format:

- **Standard**: General campaign CSV (address, firstName, lastName, supportLevel, etc.)
- **NAR**: National Address Register format (auto-detected)
- **Server**: Server-side NAR streaming import (for large files >100MB)

**Step 4: Configure Filters (Optional)**

Filter imported locations:

- **Cut**: Import only locations within a polygon
- **Map Area**: Import only locations within the current map bounds
- **City**: Filter by city name
- **Province**: Filter by province code (ON, QC, AB, etc.)
- **Residential Only**: Keep only residential buildings (BU_USE = 1), excluding commercial and other uses

**Step 5: Upload File**

Drag-and-drop or click to select the CSV file.
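Step 1 mentions "flexible header names" for the standard format. One way to sketch that idea is a small alias table mapped onto the canonical field names; the alias list and `canonicalizeHeader` below are illustrative assumptions, not the importer's actual implementation:

```typescript
// Illustrative alias table (assumption): common CSV header variants mapped to
// the canonical field names used by the standard import format.
const HEADER_ALIASES: Record<string, string> = {
  'address': 'address',
  'street address': 'address',
  'first name': 'firstName',
  'firstname': 'firstName',
  'last name': 'lastName',
  'lastname': 'lastName',
  'unit': 'unitNumber',
  'unit number': 'unitNumber',
  'unitnumber': 'unitNumber',
  'support level': 'supportLevel',
  'supportlevel': 'supportLevel',
  'lat': 'latitude',
  'latitude': 'latitude',
  'lng': 'longitude',
  'longitude': 'longitude',
};

// Lowercase, trim, and collapse underscores/hyphens before the alias lookup,
// so "First_Name", "first-name", and "First Name" all resolve the same way.
function canonicalizeHeader(header: string): string | null {
  const key = header.trim().toLowerCase().replace(/[_-]+/g, ' ');
  return HEADER_ALIASES[key] ?? null;
}
```

Headers that resolve to `null` would simply be ignored by the importer rather than causing the whole file to fail.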
**Step 6: Configure Geocoding**

Toggle **Geocode Missing Coordinates**:

- **Enabled**: Automatically geocode addresses without lat/lng (slower; uses geocoding API quota)
- **Disabled**: Import only records that already have coordinates (faster; suitable for NAR imports)

**Step 7: Review Import Results**

After the import completes, view the results:

- **Created**: Number of new locations created
- **Skipped**: Number of duplicate addresses skipped
- **Failed**: Number of errors (invalid addresses, geocoding failures)
- **Geocoded**: Number of addresses successfully geocoded

### NAR Server Import Workflow

**For large NAR datasets (>100MB), use the server-side streaming import:**

**Step 1: Upload NAR Files to Server**

Copy the NAR CSV files to the server's `/data` directory:

```bash
# Example NAR files for Ontario (province code 35)
/data/Address_35_part_1.csv
/data/Address_35_part_2.csv
/data/Location_35.csv
```

**Step 2: Open NAR Import Tab**

Click the **NAR Import** tab on LocationsPage.

**Step 3: Scan for Datasets**

Click **Scan NAR Directory** to detect available datasets. The system will:

- Scan the `/data` directory for Address_*.csv and Location_*.csv files
- Group files by province code (10=NL, 24=QC, 35=ON, 48=AB, etc.)
- Display file sizes and counts

**Step 4: Select Province**

Choose a province from the dropdown (e.g., "35 - Ontario (10.5 GB, 45 files)").

**Step 5: Configure Filters**

Apply optional filters:

- **City**: Filter by MAIL_MUN_NAME or CSD_ENG_NAME
- **Postal Code Prefix**: Filter by the first 3 characters (e.g., "K1A")
- **Cut**: Import only addresses within a polygon
- **Residential Only**: Exclude non-residential buildings (drop rows where BU_USE != 1)

**Step 6: Start Import**

Click **Start Import**. The system will:

1. Stream Address CSV files (multi-part files processed sequentially)
2. Join with the Location CSV on LOC_GUID
3. Convert BG_X/BG_Y (Lambert projection) to lat/lng (WGS84) using proj4
4. Apply filters (city, postal, cut, residential)
5. Bulk insert locations + addresses (transaction batches of 500)
6. Update progress every 5 seconds

**Step 7: Monitor Progress**

View real-time progress:

- **Records Processed**: Current/total count
- **Progress Percentage**: Visual progress bar
- **ETA**: Estimated time remaining
- **Current File**: Which multi-part file is being processed

**Step 8: Review Results**

After the import completes:

- **Total Created**: Number of locations + addresses created
- **Duration**: Total import time
- **Skipped**: Duplicate or filtered records

### Bulk Re-Geocoding

**For locations with missing or low-confidence coordinates:**

**Step 1: Open Bulk Geocode Modal**

Click the **Bulk Re-Geocode** button on LocationsPage.

**Step 2: Configure Job Parameters**

Set the parameters:

- **Confidence Filter**: Re-geocode locations below a threshold (e.g., <70)
- **Missing Only**: Only geocode locations without coordinates
- **Provider**: Choose a preferred geocoding provider
- **Batch Size**: Number of locations per batch (default: 50)

**Step 3: Start Job**

Click **Start Job** to queue the bulk geocoding job in BullMQ.

**Step 4: Monitor Progress**

Poll the job status:

- **Completed**: Number of successfully geocoded locations
- **Failed**: Number of geocoding failures
- **Progress**: Percentage complete
- **ETA**: Estimated time remaining

**Step 5: Cancel Job (Optional)**

Click **Cancel Job** to stop bulk geocoding.

### Exporting Locations

**Step 1: Configure Export Filters**

Apply filters on LocationsPage:

- **Search**: Filter by address or notes
- **Confidence Level**: High / Medium / Low / None
- **Cut**: Export locations within a specific polygon

**Step 2: Click Export CSV**

Click the **Export CSV** button. The system will:

1. Export locations matching the current filters
2. Include all address records (one row per address)
3. Download the CSV file with a timestamp

**Export Format:**

```csv
locationId,address,latitude,longitude,postalCode,province,federalDistrict,buildingType,totalUnits,geocodeConfidence,geocodeProvider,unitNumber,firstName,lastName,email,phone,supportLevel,sign,signSize,notes
uuid-1,123 Main St,45.4215,-75.6972,K1A 0B1,ON,Ottawa Centre,MULTI_UNIT,12,95,GOOGLE,101,John,Doe,john@example.com,555-1234,LEVEL_1,true,24x18 lawn,Friendly contact
```

## Public Workflow

**Public users can view locations on the interactive map.**

**Step 1: Navigate to Public Map**

Visit `/map` (public route, no authentication required).

**Step 2: Browse Map**

Interact with the Leaflet map:

- **Zoom/Pan**: Use mouse or touch gestures
- **Markers**: Locations are displayed as color-coded circle markers:
  - **Green**: LEVEL_1 (Strong support)
  - **Yellow**: LEVEL_2 (Leaning support)
  - **Gray**: LEVEL_3 (Undecided)
  - **Red**: LEVEL_4 (Opposed)
  - **Blue**: No support level assigned

**Step 3: View Cut Overlays**

Toggle cut overlays using the **Cuts** control panel:

- **Show/Hide**: Toggle cut visibility
- **Opacity**: Adjust polygon transparency
- **Legend**: View the cut color legend

**Step 4: Geolocate**

Click the **Geolocate** button to center the map on your current location (requires browser geolocation permission).

**Step 5: Fullscreen Mode**

Click the **Fullscreen** button to expand the map to full screen.

## Volunteer Workflow

**Volunteers can update location data during canvassing sessions.**

**Step 1: Start Canvass Session**

See [Canvassing Documentation](./canvassing.md) for the full workflow.

**Step 2: Record Visit**

When visiting a location, update the fields:

- **Support Level**: Update based on the conversation
- **Sign**: Check if the resident wants a lawn/window sign
- **Notes**: Add canvassing notes

**Step 3: Update Location**

Click **Save Visit** to record the changes. The system will:

1. Create a CanvassVisit record with the outcome
2. Update the Address with the new supportLevel/sign/notes
3. Update the Location.lastUpdated timestamp
4. Create a LocationHistory audit record

## Code Examples

### Creating a Location (Frontend)

```typescript
// admin/src/pages/LocationsPage.tsx
const handleCreate = async (values: any) => {
  try {
    await api.post('/map/locations', {
      address: values.address,
      postalCode: values.postalCode,
      buildingType: values.buildingType,
      totalUnits: values.totalUnits,
      buildingNotes: values.buildingNotes,
      latitude: values.latitude,
      longitude: values.longitude,
      geocodeConfidence: values.geocodeConfidence,
      geocodeProvider: values.geocodeProvider,
    });
    message.success('Location created');
    setCreateModalOpen(false);
    createForm.resetFields();
    fetchLocations();
  } catch (error) {
    message.error('Failed to create location');
  }
};
```

### Geocoding an Address (Frontend)

```typescript
// admin/src/pages/LocationsPage.tsx
const handleGeocode = async () => {
  const address = createForm.getFieldValue('address');
  const postalCode = createForm.getFieldValue('postalCode');

  if (!address) {
    message.warning('Please enter an address first');
    return;
  }

  setGeocoding(true);
  try {
    const fullAddress = postalCode ? `${address}, ${postalCode}` : address;
    const { data } = await api.post('/map/locations/geocode', {
      address: fullAddress,
    });

    createForm.setFieldsValue({
      latitude: data.latitude,
      longitude: data.longitude,
      geocodeConfidence: data.confidence,
      geocodeProvider: data.provider,
    });

    message.success(
      `Geocoded with ${data.provider} (confidence: ${data.confidence}%)`
    );
  } catch (error) {
    message.error('Geocoding failed');
  } finally {
    setGeocoding(false);
  }
};
```

### Location Service Create (Backend)

```typescript
// api/src/modules/map/locations/locations.service.ts
async create(data: CreateLocationInput, userId: string) {
  // Auto-geocode if an address is provided but no coordinates
  if (data.address && !data.latitude && !data.longitude) {
    try {
      const fullAddress = data.postalCode ?
        `${data.address}, ${data.postalCode}`
        : data.address;
      const geocodeResult = await geocodingService.geocode(fullAddress);
      data.latitude = geocodeResult.latitude;
      data.longitude = geocodeResult.longitude;
      data.geocodeConfidence = geocodeResult.confidence;
      data.geocodeProvider = geocodeResult.provider;
      logger.info('Auto-geocoded location', {
        address: fullAddress,
        provider: geocodeResult.provider,
        confidence: geocodeResult.confidence,
      });
    } catch (err) {
      logger.warn('Auto-geocoding failed, creating location without coordinates', err);
    }
  }

  const location = await prisma.location.create({
    data: {
      address: data.address,
      latitude: data.latitude,
      longitude: data.longitude,
      postalCode: data.postalCode,
      province: data.province,
      federalDistrict: data.federalDistrict,
      buildingType: data.buildingType,
      totalUnits: data.totalUnits,
      buildingNotes: data.buildingNotes,
      geocodeConfidence: data.geocodeConfidence,
      geocodeProvider: data.geocodeProvider,
      createdByUserId: userId,
    },
  });

  // Create history record
  await prisma.locationHistory.create({
    data: {
      locationId: location.id,
      action: LocationHistoryAction.CREATED,
      changedByUserId: userId,
      changes: JSON.stringify({ created: true }),
    },
  });

  recordLocationQuery('create');
  return location;
}
```

### CSV Import Detection (Backend)

```typescript
// api/src/modules/map/locations/locations.service.ts
function detectNarFormat(headers: string[]): boolean {
  const normalizedHeaders = headers.map((h) => h.trim().toUpperCase());

  // NAR columns to detect (need 3+ matches)
  const NAR_DETECT_COLUMNS = [
    'CIVIC_NO',
    'OFFICIAL_STREET_NAME',
    'OFFICIAL_STREET_TYPE',
    'BG_X',
    'BG_Y',
    'MAIL_POSTAL_CODE',
    'MAIL_PROV_ABVN',
    'BG_LATITUDE',
    'BG_LONGITUDE',
  ];

  // Count how many known NAR columns appear in the header row
  const matchCount = NAR_DETECT_COLUMNS.filter((col) =>
    normalizedHeaders.includes(col)
  ).length;

  return matchCount >= 3;
}
```

### NAR Lambert Coordinate Conversion (Backend)
```typescript
// api/src/modules/map/locations/locations.service.ts
import proj4 from 'proj4';

// Statistics Canada Lambert Conformal Conic (EPSG:3347) → WGS84 (EPSG:4326)
proj4.defs(
  'EPSG:3347',
  '+proj=lcc +lat_1=49 +lat_2=77 +lat_0=63.390675 +lon_0=-91.86666666666666 ' +
    '+x_0=6200000 +y_0=3000000 +ellps=GRS80 +units=m +no_defs'
);

/** Convert BG_X/BG_Y (EPSG:3347 Lambert) to [lat, lng] (WGS84) */
function lambertToLatLng(bgX: number, bgY: number): [number, number] {
  const [lng, lat] = proj4('EPSG:3347', 'EPSG:4326', [bgX, bgY]);
  return [lat, lng];
}

// Usage in NAR import
const [lat, lng] = lambertToLatLng(row.BG_X, row.BG_Y);
```

### Spatial Filtering by Cut (Backend)

```typescript
// api/src/modules/map/locations/locations.service.ts
async findByBounds(filters: BoundsQuery) {
  const where: Prisma.LocationWhereInput = {
    latitude: {
      gte: new Prisma.Decimal(filters.minLat),
      lte: new Prisma.Decimal(filters.maxLat),
    },
    longitude: {
      gte: new Prisma.Decimal(filters.minLng),
      lte: new Prisma.Decimal(filters.maxLng),
    },
  };

  const locations = await prisma.location.findMany({
    where,
    select: {
      id: true,
      latitude: true,
      longitude: true,
      address: true,
      addresses: {
        select: {
          supportLevel: true,
        },
      },
    },
  });

  // If a cut filter is provided, apply point-in-polygon filtering
  if (filters.cutId) {
    const cut = await prisma.cut.findUnique({
      where: { id: filters.cutId },
      select: { geojson: true },
    });

    if (cut?.geojson) {
      const polygons = parseGeoJsonPolygon(cut.geojson);
      return locations.filter((loc) => {
        const lat = Number(loc.latitude);
        const lng = Number(loc.longitude);
        return polygons.some((poly) => isPointInPolygon(lat, lng, poly));
      });
    }
  }

  return locations;
}
```

## Troubleshooting

### Issue: Geocoding Fails for Valid Address

**Symptoms:**

- "Geocoding failed" error message
- Location created without coordinates
- Low geocode confidence score (<50)

**Causes:**

- Invalid API key for the geocoding provider
- Provider quota exceeded
- Address format not recognized by the provider
- Provider service down

**Solutions:**
1. **Check API keys**:

   ```bash
   # Verify API keys are set in .env
   grep "GOOGLE_MAPS_API_KEY\|MAPBOX_ACCESS_TOKEN\|LOCATIONIQ_API_KEY" .env
   ```

2. **Test the geocoding endpoint directly**:

   ```bash
   curl -X POST http://localhost:4000/api/map/locations/geocode \
     -H "Authorization: Bearer YOUR_TOKEN" \
     -H "Content-Type: application/json" \
     -d '{"address":"123 Main Street, Ottawa, ON K1A 0B1"}'
   ```

3. **Check the provider order in env**:

   ```bash
   # Try a different provider order
   GEOCODING_PROVIDERS=GOOGLE,NOMINATIM,PHOTON,MAPBOX,LOCATIONIQ,ARCGIS
   ```

4. **View API logs**:

   ```bash
   docker compose logs -f api | grep geocode
   ```

### Issue: NAR Import Fails or Hangs

**Symptoms:**

- NAR import progress stuck at 0%
- Import fails with a "File not found" error
- Import fails with an "Invalid coordinates" error
- Memory errors during large imports

**Causes:**

- NAR files not in the `/data` directory
- Multi-part files missing (e.g., Address_35_part_2.csv)
- Incorrect province code
- Invalid BG_X/BG_Y coordinates
- Cut polygon filter too complex

**Solutions:**

1. **Verify NAR files exist**:

   ```bash
   # Check the /data directory in the container
   docker compose exec api ls -lh /data

   # Verify file naming matches the NAR format
   # Address_{PROV_CODE}_part_{N}.csv
   # Location_{PROV_CODE}.csv
   ```

2. **Check the province code mapping**:

   ```
   10 = Newfoundland and Labrador
   24 = Quebec
   35 = Ontario
   48 = Alberta
   59 = British Columbia
   62 = Nunavut
   ```

3. **Test the coordinate conversion**:

   ```bash
   # Verify proj4 is installed
   docker compose exec api node -e "const proj4 = require('proj4'); console.log(proj4.version);"
   ```

4. **Monitor import progress**:

   ```bash
   # Watch API logs during the import
   docker compose logs -f api | grep "NAR import"

   # Check Redis for the progress key
   docker compose exec redis redis-cli GET "NAR_IMPORT_PROGRESS"
   ```
5. **Use smaller filters for testing**:

   - Start with a single postal code prefix (e.g., "K1A")
   - Use a small cut polygon
   - Enable the residential-only filter (reduces records by ~50%)

### Issue: Duplicate Locations Created on Import

**Symptoms:**

- The same address appears multiple times in the table
- The export CSV has duplicate rows
- The location count doesn't match the expected NAR count

**Causes:**

- Re-importing the same CSV file without checking for duplicates
- NAR Address multi-part files have overlapping records
- Different LOC_GUID for the same physical address (NAR data issue)

**Solutions:**

1. **Use NAR GUID fields for deduplication**:

   The system deduplicates by `narLocGuid` and `narAddrGuid`:

   ```typescript
   // Check for an existing location before creating
   const existing = await prisma.location.findFirst({
     where: { narLocGuid: row.LOC_GUID },
   });
   if (existing) {
     skipped++;
     continue;
   }
   ```

2. **Delete duplicates manually**:

   ```sql
   -- Find duplicate locations by address
   SELECT address, COUNT(*) AS count
   FROM "Location"
   GROUP BY address
   HAVING COUNT(*) > 1;

   -- Keep the first (lowest id, compared as text because Postgres has no
   -- built-in MIN() aggregate for uuid), delete the rest
   DELETE FROM "Location"
   WHERE id::text NOT IN (
     SELECT MIN(id::text) FROM "Location" GROUP BY address
   );
   ```

3. **Use the server-side NAR import** (better deduplication):

   The server-side import joins the Address + Location files on LOC_GUID before inserting, preventing duplicates.

### Issue: Low Geocode Confidence for NAR Data

**Symptoms:**

- NAR locations have geocodeConfidence < 70
- Locations appear in the wrong place on the map
- "Low confidence" warnings in the admin

**Causes:**

- BG_X/BG_Y coordinates missing in the NAR Location file
- BG_LATITUDE/BG_LONGITUDE used instead of converted Lambert coordinates
- proj4 conversion error

**Solutions:**

1. **Verify the coordinate source**:

   NAR Location files have TWO coordinate fields:

   - `BG_LATITUDE` / `BG_LONGITUDE`: Direct WGS84 (use these if available)
   - `BG_X` / `BG_Y`: Lambert Conformal Conic EPSG:3347 (requires conversion)
2. **Use BG_LATITUDE/BG_LONGITUDE if available**:

   ```typescript
   // Priority: use direct WGS84 coordinates when available
   const lat = row.BG_LATITUDE
     ? parseFloat(row.BG_LATITUDE)
     : (row.BG_X && row.BG_Y ? lambertToLatLng(row.BG_X, row.BG_Y)[0] : null);
   ```

3. **Re-geocode low-confidence locations**:

   Use the bulk re-geocoding feature with a confidence filter of <70.

## Performance Considerations

### Query Optimization

**Bounding Box Queries:**

Always use indexed lat/lng queries for map bounds:

```sql
-- Efficient: uses the idx_locations_lat_lng index
SELECT * FROM "Location"
WHERE latitude BETWEEN 45.0 AND 46.0
  AND longitude BETWEEN -76.0 AND -75.0;

-- Inefficient: no index
SELECT * FROM "Location"
WHERE ST_Contains(polygon, point); -- PostGIS not used
```

**Point-in-Polygon:**

For small result sets (<1000 locations), use application-level ray-casting:

```typescript
// api/src/utils/spatial.ts
export function isPointInPolygon(
  lat: number,
  lng: number,
  polygonCoords: number[][]
): boolean {
  let inside = false;
  for (let i = 0, j = polygonCoords.length - 1; i < polygonCoords.length; j = i++) {
    const xi = polygonCoords[i]![1]!; // lat
    const yi = polygonCoords[i]![0]!; // lng
    const xj = polygonCoords[j]![1]!;
    const yj = polygonCoords[j]![0]!;

    const intersect =
      ((yi > lng) !== (yj > lng)) &&
      (lat < ((xj - xi) * (lng - yi)) / (yj - yi) + xi);
    if (intersect) inside = !inside;
  }
  return inside;
}
```

For large result sets (>10,000 locations), consider the PostGIS extension.

### Geocoding Rate Limits

**Provider Limits:**

| Provider | Free Tier | Rate Limit |
|----------|-----------|------------|
| Google | $200/month credit | 50 req/sec |
| Mapbox | 100,000/month | 600 req/min |
| Nominatim | Unlimited | 1 req/sec |
| Photon | Unlimited | No limit (self-hosted recommended) |
| LocationIQ | 5,000/day | 2 req/sec |
| ArcGIS | 20,000/month | 50 req/sec |

**Best Practices:**

1. **Enable Redis caching** (default: 7-day TTL)
2. **Use bulk geocoding jobs** (BullMQ queue with rate limiting)
3. **Prefer NAR imports** (coordinates included, no geocoding needed)
4. **Batch geocoding requests** (50 locations per batch)

### NAR Import Performance

**Large File Streaming:**

NAR Address files can be 10+ GB. Use server-side streaming to avoid memory issues:

```typescript
// api/src/modules/map/locations/nar-import.service.ts
import { createReadStream } from 'fs';
import { parse } from 'csv-parse';

async function streamNarFile(filePath: string) {
  const parser = createReadStream(filePath).pipe(
    parse({ columns: true, skip_empty_lines: true })
  );

  const BATCH_SIZE = 500;
  let batch: any[] = [];

  // Async iteration applies backpressure automatically: the stream is not
  // read while insertBatch is awaited, and insert errors propagate normally
  // (unlike an async 'data' event handler, which is not awaited).
  for await (const row of parser) {
    batch.push(row);
    if (batch.length >= BATCH_SIZE) {
      await insertBatch(batch);
      batch = [];
    }
  }
  if (batch.length > 0) await insertBatch(batch);
}
```

**Transaction Batching:**

Insert locations in transaction batches to improve performance:

```typescript
async function insertBatch(rows: any[]) {
  await prisma.$transaction(
    rows.map((row) =>
      prisma.location.create({
        data: {
          address: row.address,
          latitude: row.latitude,
          longitude: row.longitude,
          // ... other fields
        },
      })
    ),
    { timeout: 30000 } // 30s timeout for large batches
  );
}
```

### Map Rendering Performance

**Marker Clustering:**

For maps with >1000 locations, use marker clustering to improve render performance. A sketch (the marker component and its props here are illustrative):

```typescript
// admin/src/components/map/AdminMapView.tsx
import MarkerClusterGroup from 'react-leaflet-cluster';

// Illustrative usage: wrap the per-location markers in a cluster group
<MarkerClusterGroup chunkedLoading>
  {locations.map((loc) => (
    <CircleMarker
      key={loc.id}
      center={[Number(loc.latitude), Number(loc.longitude)]}
    />
  ))}
</MarkerClusterGroup>
```

**Viewport Filtering:**

Only load locations within the map bounds plus a buffer:

```typescript
// admin/src/pages/public/MapPage.tsx
const handleMapMove = useCallback(
  debounce(() => {
    if (!mapRef.current) return;
    const bounds = mapRef.current.getBounds();
    const buffer = 0.1; // ~0.1° buffer around the viewport
    fetchLocations({
      minLat: bounds.getSouth() - buffer,
      maxLat: bounds.getNorth() + buffer,
      minLng: bounds.getWest() - buffer,
      maxLng: bounds.getEast() + buffer,
    });
  }, 500),
  []
);
```

## Related Documentation

**Backend Modules:**

- [Locations Backend Module](../../backend/modules/map/locations.md) — API implementation
- [Geocoding Service](../../backend/modules/map/geocoding.md) — Multi-provider geocoding
- [Spatial Utils](../../backend/modules/map/spatial.md) — Point-in-polygon algorithms

**Frontend Pages:**

- [LocationsPage](../../frontend/pages/admin/locations-page.md) — Admin CRUD interface
- [AdminMapView](../../frontend/pages/admin/map-view.md) — Interactive map component
- [Public MapPage](../../frontend/pages/public/map-page.md) — Public map view

**Database:**

- [Map Models](../../database/models/map.md) — Location, Address, Cut schemas
- [Location History](../../database/models/map.md#locationhistory-model) — Audit trail
- [Spatial Queries](../../database/queries.md#spatial-queries) — Optimization tips

**Features:**

- [Geocoding](./geocoding.md) — Multi-provider geocoding system
- [Cuts](./cuts.md) — Geographic polygon overlays
- [Canvassing](./canvassing.md) — Field organizing workflow
- [NAR Import](./nar-import.md) — Canadian electoral data import
- [Data Quality Dashboard](./data-quality.md) — Geocoding quality metrics