# Data Migration Procedures
This guide provides step-by-step procedures for migrating data from Changemaker Lite V1 to V2, including export scripts, transformation logic, import procedures, and validation steps.
## Overview
V2 data migration involves four phases:

- **Export** - Extract data from V1 NocoDB tables
- **Transform** - Convert V1 schema to V2 Prisma models
- **Import** - Load transformed data into V2 PostgreSQL
- **Validate** - Verify data integrity and completeness
!!! danger "Production Migration Warning"
    ALWAYS perform a test migration on a staging environment before production. Data loss is possible if scripts contain errors.
## Prerequisites
Before beginning data migration:
- V1 backup completed (PostgreSQL dump + uploads)
- V2 environment running (`docker compose up -d v2-postgres redis api`)
- Prisma migrations applied (`npx prisma migrate deploy`)
- Node.js 20+ installed (for transformation scripts)
- Sufficient disk space (3x current database size recommended)
- Network access (V1 NocoDB API, V2 database)
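Several of these checks can be scripted. A minimal preflight sketch, assuming the env var names and default URL used throughout this guide:

```javascript
#!/usr/bin/env node
// Preflight sketch — env var names and the default URL follow this guide's
// examples and are assumptions; adjust for your environment.
const axios = require('axios');

const main = async () => {
  // The transformation scripts require Node.js 20+
  const major = parseInt(process.versions.node.split('.')[0], 10);
  console.log(`Node.js ${process.version}: ${major >= 20 ? '✓' : '✗ (need 20+)'}`);

  // Reachability check only: any HTTP response means V1 NocoDB is up
  const nocodbUrl = process.env.V1_NOCODB_URL || 'http://localhost:8080';
  try {
    await axios.get(nocodbUrl, { timeout: 5000, validateStatus: () => true });
    console.log(`V1 NocoDB at ${nocodbUrl}: ✓ reachable`);
  } catch (error) {
    console.log(`V1 NocoDB at ${nocodbUrl}: ✗ (${error.message})`);
  }
};

main().catch(console.error);
```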
## Data Mapping

### V1 Tables → V2 Prisma Models
| V1 NocoDB Table | V2 Prisma Model | Notes |
|---|---|---|
| `influence_users` | `User` | Merge with `login` table |
| `login` | `User` | Merge with `influence_users` |
| `campaigns` | `Campaign` | Add `createdByUserId` relation |
| `representatives` | `Representative` | Direct migration |
| `responses` | `RepresentativeResponse` | Add verification fields |
| `response_upvotes` | `ResponseUpvote` | Add IP dedup field |
| `postal_code_cache` | `PostalCodeCache` | Direct migration |
| `locations` | `Location` | Split address, add geocoding fields |
| `shifts` | `Shift` | Extract signups to `ShiftSignup` |
| `shift_signups` | `ShiftSignup` | Add status enum |
| `cuts` | `Cut` | Parse GeoJSON coordinates |
| (none) | `RefreshToken` | New in V2 (generated on first login) |
| (none) | `SiteSettings` | New in V2 (seed with defaults) |
| (none) | `MapSettings` | New in V2 (seed with defaults) |
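`RefreshToken` needs no seeding, but `SiteSettings` and `MapSettings` have no V1 source data. A minimal seed sketch, assuming both models define schema-level defaults for their fields (adjust to your actual Prisma schema):

```javascript
// Seed sketch — assumes SiteSettings and MapSettings fields all have
// schema defaults, so an empty create() is enough; adjust as needed.
const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();

const seedSettings = async () => {
  // Create a single settings row only if none exists (idempotent)
  if (!(await prisma.siteSettings.findFirst())) {
    await prisma.siteSettings.create({ data: {} });
  }
  if (!(await prisma.mapSettings.findFirst())) {
    await prisma.mapSettings.create({ data: {} });
  }
  console.log('✓ Settings seeded');
};

seedSettings()
  .catch(console.error)
  .finally(() => prisma.$disconnect());
```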
### Field Mapping Tables

#### Users

| V1 Field (`influence_users`) | V1 Field (`login`) | V2 Field | Transformation |
|---|---|---|---|
| `Id` | `Id` | - | Discard (V2 uses CUID) |
| `Email` | `Email` | `email` | Merge by email, enforce unique |
| `Password` | `Password` | `password` | Bcrypt hash (direct copy) |
| - | `Name` | `name` | From `login.Name` |
| - | - | `phone` | NULL (not in V1) |
| `Role` | - | `role` | Map: 'admin' → 'SUPER_ADMIN', 'user' → 'USER' |
| - | - | `status` | Default: 'ACTIVE' |
| - | - | `createdVia` | Default: 'STANDARD' |
| - | - | `expiresAt` | NULL |
| - | - | `emailVerified` | Default: false |
| `Created` | `Created` | `createdAt` | ISO 8601 timestamp |
| - | - | `updatedAt` | Use `createdAt` or current time |
**Merge Logic:**

```javascript
// Pseudocode
const mergeUsers = (influenceUsers, loginUsers) => {
  const merged = new Map();

  // Add all login users first (has name field)
  loginUsers.forEach(user => {
    merged.set(user.Email.toLowerCase(), {
      email: user.Email,
      password: user.Password,
      name: user.Name,
      role: 'USER', // Default, may be overridden
      createdAt: user.Created || new Date()
    });
  });

  // Override with influence_users (has role field)
  influenceUsers.forEach(user => {
    const existing = merged.get(user.Email.toLowerCase());
    if (existing) {
      existing.role = mapRole(user.Role);
    } else {
      merged.set(user.Email.toLowerCase(), {
        email: user.Email,
        password: user.Password,
        name: null,
        role: mapRole(user.Role),
        createdAt: user.Created || new Date()
      });
    }
  });

  return Array.from(merged.values());
};

const mapRole = (v1Role) => {
  const roleMap = {
    'admin': 'SUPER_ADMIN',
    'moderator': 'INFLUENCE_ADMIN',
    'user': 'USER'
  };
  return roleMap[v1Role] || 'USER';
};
```
#### Campaigns

| V1 Field | V2 Field | Transformation |
|---|---|---|
| `Id` | - | Discard (use CUID) |
| `Title` | `title` | Direct copy |
| `Description` | `description` | Direct copy |
| `Slug` | `slug` | Direct copy |
| `IsActive` | `active` | Boolean conversion |
| - | `highlighted` | Default: false |
| `TargetLevel` | `targetLevel` | Direct copy or NULL |
| `TargetPosition` | `targetPosition` | Direct copy or NULL |
| - | `targetName` | NULL (not in V1) |
| - | `targetEmail` | NULL |
| - | `targetPostalCode` | NULL |
| - | `customSubject` | NULL |
| - | `customBody` | NULL |
| - | `responseWallEnabled` | Default: true |
| `Created` | `createdAt` | ISO 8601 timestamp |
| - | `updatedAt` | Use `createdAt` |
| - | `createdByUserId` | Requires user lookup |
**CreatedBy Mapping:**

```javascript
// V1 campaigns may not have a createdBy field.
// Options:
//   1. Assign all to the first SUPER_ADMIN user
//   2. Use a separate mapping table if V1 tracked creators
//   3. Create a placeholder "System" user
const assignCreator = async (campaign) => {
  // Find first SUPER_ADMIN user
  const admin = await prisma.user.findFirst({
    where: { role: 'SUPER_ADMIN' }
  });
  if (!admin) {
    throw new Error('No SUPER_ADMIN user found. Create admin user first.');
  }
  return admin.id;
};
```
#### Locations

| V1 Field | V2 Field | Transformation |
|---|---|---|
| `Id` | - | Discard (use CUID) |
| `Address` | `address`, `city`, `province`, `postalCode` | Parse address string |
| - | `addressLine2` | NULL |
| - | `country` | Default: 'Canada' |
| `Latitude` | `latitude` | Float conversion |
| `Longitude` | `longitude` | Float conversion |
| - | `geocoded` | `latitude != NULL && longitude != NULL` |
| - | `geocodedAt` | Use `createdAt` if geocoded |
| - | `geocodeProvider` | 'Legacy V1' or NULL |
| - | `geocodeQuality` | NULL (unknown) |
| `SupportLevel` | `supportLevel` | Map string to enum |
| `Notes` | `notes` | Direct copy |
| - | `contactName` | NULL |
| - | `contactPhone` | NULL |
| - | `contactEmail` | NULL |
| - | `cutId` | NULL (assign later if needed) |
| `Created` | `createdAt` | ISO 8601 timestamp |
| - | `updatedAt` | Use `createdAt` |
| - | `createdByUserId` | First MAP_ADMIN or SUPER_ADMIN |
**Address Parsing:**

```javascript
// V1 stored the full address as a single string; V2 requires structured fields.
const parseAddress = (addressString) => {
  // Example V1 address: "123 Main St, Toronto, ON M5V 1A1"
  // Basic parsing (may need refinement for edge cases)
  const parts = addressString.split(',').map(s => s.trim());
  if (parts.length === 1) {
    // Only street address
    return {
      address: parts[0],
      city: null,
      province: null,
      postalCode: null
    };
  }

  // Extract postal code (last part if it matches the Canadian pattern)
  const postalRegex = /([A-Z]\d[A-Z]\s?\d[A-Z]\d)/i;
  let postalCode = null;
  let province = null;
  let city = null;

  if (parts.length >= 3) {
    const lastPart = parts[parts.length - 1];
    const postalMatch = lastPart.match(postalRegex);
    if (postalMatch) {
      postalCode = postalMatch[1].replace(/\s/, '').toUpperCase();
      // Province usually precedes the postal code
      const provincePart = lastPart.replace(postalMatch[0], '').trim();
      if (provincePart) {
        province = provincePart;
      } else if (parts.length >= 4) {
        province = parts[parts.length - 2];
      }
    }
    // City is second-to-last or third-to-last
    if (parts.length >= 4 && province) {
      city = parts[parts.length - 3];
    } else if (parts.length >= 3) {
      city = parts[parts.length - 2];
    }
  }

  return {
    address: parts[0],
    city: city || null,
    province: province || null,
    postalCode: postalCode || null
  };
};

// Example usage:
parseAddress("123 Main St, Toronto, ON M5V 1A1");
// → { address: "123 Main St", city: "Toronto", province: "ON", postalCode: "M5V1A1" }
```
**SupportLevel Enum Mapping:**

```javascript
const mapSupportLevel = (v1Level) => {
  // V1 used inconsistent strings
  const levelMap = {
    'strong support': 'STRONG_SUPPORT',
    'support': 'SUPPORT',
    'undecided': 'UNDECIDED',
    'oppose': 'OPPOSED',
    'strong oppose': 'STRONG_OPPOSED',
    'unknown': 'UNKNOWN',
    'not home': 'NOT_HOME',
    'moved': 'MOVED',
    'deceased': 'DECEASED',
    '': 'UNKNOWN'
  };
  return levelMap[v1Level?.toLowerCase()] || 'UNKNOWN';
};
```
## Export V1 Data

### Option 1: NocoDB API Export

**Script:** `scripts/export-v1-nocodb.js`
```javascript
#!/usr/bin/env node
const axios = require('axios');
const fs = require('fs').promises;
const path = require('path');

const NOCODB_URL = process.env.V1_NOCODB_URL || 'http://localhost:8080';
const NOCODB_TOKEN = process.env.V1_NOCODB_TOKEN;
const OUTPUT_DIR = process.env.OUTPUT_DIR || './v1-export';

const tables = [
  'influence_users',
  'login',
  'campaigns',
  'representatives',
  'responses',
  'response_upvotes',
  'postal_code_cache',
  'locations',
  'shifts',
  'shift_signups',
  'cuts'
];

const exportTable = async (tableName) => {
  console.log(`Exporting ${tableName}...`);
  let allRecords = [];
  let offset = 0;
  const limit = 100;
  let hasMore = true;

  while (hasMore) {
    const response = await axios.get(
      `${NOCODB_URL}/api/v1/db/data/v1/${tableName}`,
      {
        headers: { 'xc-token': NOCODB_TOKEN },
        params: { limit, offset }
      }
    );
    const records = response.data.list || [];
    allRecords = allRecords.concat(records);
    console.log(`  Fetched ${records.length} records (total: ${allRecords.length})`);
    if (records.length < limit) {
      hasMore = false;
    } else {
      offset += limit;
    }
  }

  await fs.writeFile(
    path.join(OUTPUT_DIR, `${tableName}.json`),
    JSON.stringify(allRecords, null, 2)
  );
  console.log(`✓ Exported ${allRecords.length} records from ${tableName}`);
  return allRecords.length;
};

const main = async () => {
  await fs.mkdir(OUTPUT_DIR, { recursive: true });
  const counts = {};
  for (const table of tables) {
    try {
      counts[table] = await exportTable(table);
    } catch (error) {
      console.error(`✗ Failed to export ${table}:`, error.message);
      counts[table] = 0;
    }
  }
  // Write summary
  await fs.writeFile(
    path.join(OUTPUT_DIR, 'export-summary.json'),
    JSON.stringify({ exportedAt: new Date(), counts }, null, 2)
  );
  console.log('\nExport Summary:');
  console.table(counts);
};

main().catch(console.error);
```
**Usage:**

```bash
cd /home/bunker-admin/changemaker.lite
mkdir -p v1-export

# Export from running V1 instance
V1_NOCODB_URL=http://localhost:8080 \
V1_NOCODB_TOKEN=your-token \
OUTPUT_DIR=./v1-export \
node scripts/export-v1-nocodb.js
```
### Option 2: PostgreSQL Direct Export

If you have direct access to the V1 PostgreSQL database:
```bash
# Export each table as CSV
docker compose -f docker-compose.v1.yml exec v1-postgres \
  psql -U nocodb -d nocodb -c "\COPY influence_users TO STDOUT CSV HEADER" > v1-export/influence_users.csv

docker compose -f docker-compose.v1.yml exec v1-postgres \
  psql -U nocodb -d nocodb -c "\COPY login TO STDOUT CSV HEADER" > v1-export/login.csv

docker compose -f docker-compose.v1.yml exec v1-postgres \
  psql -U nocodb -d nocodb -c "\COPY campaigns TO STDOUT CSV HEADER" > v1-export/campaigns.csv

# Repeat for all tables...
```
### Backup File Uploads

```bash
# V1 uploads directory
tar -czf v1-uploads-backup.tar.gz ./uploads/

# Verify archive
tar -tzf v1-uploads-backup.tar.gz | head -20
```
## Transform Data

### User Transformation

**Script:** `scripts/transform-users.js`
```javascript
#!/usr/bin/env node
const fs = require('fs').promises;
const path = require('path');

const INPUT_DIR = process.env.INPUT_DIR || './v1-export';
const OUTPUT_DIR = process.env.OUTPUT_DIR || './v2-import';

const mapRole = (v1Role) => {
  const roleMap = {
    'admin': 'SUPER_ADMIN',
    'moderator': 'INFLUENCE_ADMIN',
    'user': 'USER'
  };
  return roleMap[v1Role] || 'USER';
};

const transformUsers = async () => {
  const influenceUsers = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'influence_users.json'), 'utf-8')
  );
  const loginUsers = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'login.json'), 'utf-8')
  );

  const merged = new Map();

  // Add login users (has name field)
  loginUsers.forEach(user => {
    merged.set(user.Email.toLowerCase(), {
      email: user.Email,
      password: user.Password,
      name: user.Name || null,
      role: 'USER',
      status: 'ACTIVE',
      createdVia: 'STANDARD',
      emailVerified: false,
      createdAt: user.Created || new Date().toISOString(),
      updatedAt: user.Created || new Date().toISOString()
    });
  });

  // Override with influence_users (has role field)
  influenceUsers.forEach(user => {
    const existing = merged.get(user.Email.toLowerCase());
    if (existing) {
      existing.role = mapRole(user.Role);
    } else {
      merged.set(user.Email.toLowerCase(), {
        email: user.Email,
        password: user.Password,
        name: null,
        role: mapRole(user.Role),
        status: 'ACTIVE',
        createdVia: 'STANDARD',
        emailVerified: false,
        createdAt: user.Created || new Date().toISOString(),
        updatedAt: user.Created || new Date().toISOString()
      });
    }
  });

  const users = Array.from(merged.values());

  await fs.writeFile(
    path.join(OUTPUT_DIR, 'users.json'),
    JSON.stringify(users, null, 2)
  );

  console.log(`✓ Transformed ${users.length} users`);
  console.log(`  influence_users: ${influenceUsers.length}`);
  console.log(`  login: ${loginUsers.length}`);
  console.log(`  merged: ${users.length}`);
  return users;
};

const main = async () => {
  await fs.mkdir(OUTPUT_DIR, { recursive: true });
  await transformUsers();
};

main().catch(console.error);
```
### Campaign Transformation

**Script:** `scripts/transform-campaigns.js`
```javascript
#!/usr/bin/env node
const fs = require('fs').promises;
const path = require('path');

const INPUT_DIR = process.env.INPUT_DIR || './v1-export';
const OUTPUT_DIR = process.env.OUTPUT_DIR || './v2-import';

const transformCampaigns = async () => {
  const v1Campaigns = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'campaigns.json'), 'utf-8')
  );

  // Note: createdByUserId must be populated after users are imported.
  // This transformation keeps the V1 ID as a placeholder field.
  const campaigns = v1Campaigns.map(campaign => ({
    title: campaign.Title,
    description: campaign.Description || null,
    slug: campaign.Slug,
    active: Boolean(campaign.IsActive),
    highlighted: false,
    targetLevel: campaign.TargetLevel || null,
    targetPosition: campaign.TargetPosition || null,
    targetName: null,
    targetEmail: null,
    targetPostalCode: null,
    customSubject: null,
    customBody: null,
    responseWallEnabled: true,
    createdAt: campaign.Created || new Date().toISOString(),
    updatedAt: campaign.Created || new Date().toISOString(),
    _v1Id: campaign.Id // Keep for reference in import script
  }));

  await fs.writeFile(
    path.join(OUTPUT_DIR, 'campaigns.json'),
    JSON.stringify(campaigns, null, 2)
  );

  console.log(`✓ Transformed ${campaigns.length} campaigns`);
  return campaigns;
};

const main = async () => {
  await fs.mkdir(OUTPUT_DIR, { recursive: true });
  await transformCampaigns();
};

main().catch(console.error);
```
### Location Transformation

**Script:** `scripts/transform-locations.js`
```javascript
#!/usr/bin/env node
const fs = require('fs').promises;
const path = require('path');

const INPUT_DIR = process.env.INPUT_DIR || './v1-export';
const OUTPUT_DIR = process.env.OUTPUT_DIR || './v2-import';

const parseAddress = (addressString) => {
  if (!addressString) {
    return { address: '', city: null, province: null, postalCode: null };
  }
  const parts = addressString.split(',').map(s => s.trim());
  if (parts.length === 1) {
    return {
      address: parts[0],
      city: null,
      province: null,
      postalCode: null
    };
  }

  const postalRegex = /([A-Z]\d[A-Z]\s?\d[A-Z]\d)/i;
  let postalCode = null;
  let province = null;
  let city = null;

  if (parts.length >= 3) {
    const lastPart = parts[parts.length - 1];
    const postalMatch = lastPart.match(postalRegex);
    if (postalMatch) {
      postalCode = postalMatch[1].replace(/\s/, '').toUpperCase();
      const provincePart = lastPart.replace(postalMatch[0], '').trim();
      if (provincePart) {
        province = provincePart;
      } else if (parts.length >= 4) {
        province = parts[parts.length - 2];
      }
    }
    if (parts.length >= 4 && province) {
      city = parts[parts.length - 3];
    } else if (parts.length >= 3) {
      city = parts[parts.length - 2];
    }
  }

  return {
    address: parts[0],
    city: city || null,
    province: province || null,
    postalCode: postalCode || null
  };
};

const mapSupportLevel = (v1Level) => {
  const levelMap = {
    'strong support': 'STRONG_SUPPORT',
    'support': 'SUPPORT',
    'undecided': 'UNDECIDED',
    'oppose': 'OPPOSED',
    'strong oppose': 'STRONG_OPPOSED',
    'unknown': 'UNKNOWN',
    'not home': 'NOT_HOME',
    'moved': 'MOVED',
    'deceased': 'DECEASED',
    '': 'UNKNOWN'
  };
  return levelMap[v1Level?.toLowerCase()] || 'UNKNOWN';
};

const transformLocations = async () => {
  const v1Locations = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'locations.json'), 'utf-8')
  );

  const locations = v1Locations.map(loc => {
    // Parse the address once and reuse the result
    const parsed = parseAddress(loc.Address);
    const hasCoordinates = loc.Latitude != null && loc.Longitude != null;
    return {
      ...parsed,
      country: 'Canada',
      latitude: loc.Latitude != null ? parseFloat(loc.Latitude) : null,
      longitude: loc.Longitude != null ? parseFloat(loc.Longitude) : null,
      geocoded: hasCoordinates,
      geocodedAt: hasCoordinates ? (loc.Created || new Date().toISOString()) : null,
      geocodeProvider: hasCoordinates ? 'Legacy V1' : null,
      geocodeQuality: null,
      supportLevel: mapSupportLevel(loc.SupportLevel),
      notes: loc.Notes || null,
      contactName: null,
      contactPhone: null,
      contactEmail: null,
      createdAt: loc.Created || new Date().toISOString(),
      updatedAt: loc.Created || new Date().toISOString(),
      _v1Id: loc.Id
    };
  });

  await fs.writeFile(
    path.join(OUTPUT_DIR, 'locations.json'),
    JSON.stringify(locations, null, 2)
  );

  console.log(`✓ Transformed ${locations.length} locations`);
  const geocodedCount = locations.filter(l => l.geocoded).length;
  console.log(`  Geocoded: ${geocodedCount} (${(geocodedCount / locations.length * 100).toFixed(1)}%)`);
  return locations;
};

const main = async () => {
  await fs.mkdir(OUTPUT_DIR, { recursive: true });
  await transformLocations();
};

main().catch(console.error);
```
## Import V2 Data

### Import Script

**Script:** `scripts/import-v2-data.js`
```javascript
#!/usr/bin/env node
const { PrismaClient } = require('@prisma/client');
const fs = require('fs').promises;
const path = require('path');

const prisma = new PrismaClient();
const INPUT_DIR = process.env.INPUT_DIR || './v2-import';

const importUsers = async () => {
  const users = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'users.json'), 'utf-8')
  );
  console.log(`Importing ${users.length} users...`);
  const created = [];
  for (const user of users) {
    try {
      const newUser = await prisma.user.create({ data: user });
      created.push(newUser);
    } catch (error) {
      if (error.code === 'P2002') {
        console.warn(`  ⚠ User ${user.email} already exists, skipping`);
      } else {
        console.error(`  ✗ Failed to import user ${user.email}:`, error.message);
      }
    }
  }
  console.log(`✓ Imported ${created.length}/${users.length} users`);
  return created;
};

const importCampaigns = async () => {
  const campaigns = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'campaigns.json'), 'utf-8')
  );

  // Find first SUPER_ADMIN user
  const admin = await prisma.user.findFirst({
    where: { role: 'SUPER_ADMIN' }
  });
  if (!admin) {
    throw new Error('No SUPER_ADMIN user found. Import users first.');
  }

  console.log(`Importing ${campaigns.length} campaigns (creator: ${admin.email})...`);
  const created = [];
  for (const campaign of campaigns) {
    try {
      const { _v1Id, ...data } = campaign;
      const newCampaign = await prisma.campaign.create({
        data: {
          ...data,
          createdByUserId: admin.id
        }
      });
      created.push(newCampaign);
    } catch (error) {
      if (error.code === 'P2002') {
        console.warn(`  ⚠ Campaign ${campaign.slug} already exists, skipping`);
      } else {
        console.error(`  ✗ Failed to import campaign ${campaign.title}:`, error.message);
      }
    }
  }
  console.log(`✓ Imported ${created.length}/${campaigns.length} campaigns`);
  return created;
};

const importLocations = async () => {
  const locations = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'locations.json'), 'utf-8')
  );

  // Find first MAP_ADMIN or SUPER_ADMIN user
  const admin = await prisma.user.findFirst({
    where: { OR: [{ role: 'MAP_ADMIN' }, { role: 'SUPER_ADMIN' }] }
  });
  if (!admin) {
    throw new Error('No MAP_ADMIN or SUPER_ADMIN user found. Import users first.');
  }

  console.log(`Importing ${locations.length} locations (creator: ${admin.email})...`);
  const created = [];
  for (const location of locations) {
    try {
      const { _v1Id, ...data } = location;
      const newLocation = await prisma.location.create({
        data: {
          ...data,
          createdByUserId: admin.id
        }
      });
      created.push(newLocation);
    } catch (error) {
      console.error(`  ✗ Failed to import location ${location.address}:`, error.message);
    }
  }
  console.log(`✓ Imported ${created.length}/${locations.length} locations`);
  return created;
};

const main = async () => {
  try {
    console.log('Starting V2 data import...\n');
    await importUsers();
    console.log();
    await importCampaigns();
    console.log();
    await importLocations();
    console.log();
    console.log('✓ Import complete!');
  } catch (error) {
    console.error('Import failed:', error);
    process.exit(1);
  } finally {
    await prisma.$disconnect();
  }
};

main();
```
**Usage:**

```bash
cd /home/bunker-admin/changemaker.lite

# Ensure V2 database is running and migrated
docker compose up -d v2-postgres
docker compose exec api npx prisma migrate deploy

# Run import
INPUT_DIR=./v2-import node scripts/import-v2-data.js
```
## Validate Migration

### Validation Script

**Script:** `scripts/validate-migration.js`
```javascript
#!/usr/bin/env node
const { PrismaClient } = require('@prisma/client');
const fs = require('fs').promises;
const path = require('path');

const prisma = new PrismaClient();
const V1_EXPORT_DIR = './v1-export';

const validateCounts = async () => {
  console.log('Validating record counts...\n');

  const v1Summary = JSON.parse(
    await fs.readFile(path.join(V1_EXPORT_DIR, 'export-summary.json'), 'utf-8')
  );

  const v2Counts = {
    users: await prisma.user.count(),
    campaigns: await prisma.campaign.count(),
    locations: await prisma.location.count(),
    shifts: await prisma.shift.count(),
    representatives: await prisma.representative.count()
  };

  const comparison = [
    {
      Table: 'Users',
      V1: v1Summary.counts.influence_users + v1Summary.counts.login,
      V2: v2Counts.users,
      Match: '≈' // Approximate due to deduplication
    },
    {
      Table: 'Campaigns',
      V1: v1Summary.counts.campaigns,
      V2: v2Counts.campaigns,
      Match: v1Summary.counts.campaigns === v2Counts.campaigns ? '✓' : '✗'
    },
    {
      Table: 'Locations',
      V1: v1Summary.counts.locations,
      V2: v2Counts.locations,
      Match: v1Summary.counts.locations === v2Counts.locations ? '✓' : '✗'
    },
    {
      Table: 'Shifts',
      V1: v1Summary.counts.shifts,
      V2: v2Counts.shifts,
      Match: v1Summary.counts.shifts === v2Counts.shifts ? '✓' : '✗'
    },
    {
      Table: 'Representatives',
      V1: v1Summary.counts.representatives,
      V2: v2Counts.representatives,
      Match: v1Summary.counts.representatives === v2Counts.representatives ? '✓' : '✗'
    }
  ];

  console.table(comparison);
};

const validateSampleData = async () => {
  console.log('\nValidating sample data integrity...\n');

  // Check first user
  const firstUser = await prisma.user.findFirst({
    orderBy: { createdAt: 'asc' }
  });
  console.log('First User:', {
    email: firstUser.email,
    role: firstUser.role,
    hasPassword: firstUser.password?.startsWith('$2b$') ? 'Yes (bcrypt)' : 'No'
  });

  // Check first campaign
  const firstCampaign = await prisma.campaign.findFirst({
    include: { createdBy: { select: { email: true } } },
    orderBy: { createdAt: 'asc' }
  });
  console.log('First Campaign:', {
    title: firstCampaign.title,
    slug: firstCampaign.slug,
    creator: firstCampaign.createdBy.email
  });

  // Check first location
  const firstLocation = await prisma.location.findFirst({
    orderBy: { createdAt: 'asc' }
  });
  console.log('First Location:', {
    address: firstLocation.address,
    city: firstLocation.city,
    geocoded: firstLocation.geocoded,
    supportLevel: firstLocation.supportLevel
  });

  // Geocoding statistics
  const totalLocations = await prisma.location.count();
  const geocodedLocations = await prisma.location.count({
    where: { geocoded: true }
  });
  console.log('\nGeocoding Stats:', {
    total: totalLocations,
    geocoded: geocodedLocations,
    percentage: `${(geocodedLocations / totalLocations * 100).toFixed(1)}%`
  });
};

const main = async () => {
  try {
    await validateCounts();
    await validateSampleData();
    console.log('\n✓ Validation complete');
  } catch (error) {
    console.error('Validation failed:', error);
    process.exit(1);
  } finally {
    await prisma.$disconnect();
  }
};

main();
```
## Special Cases

### Handling Duplicate Emails

During the user merge, you may encounter duplicate emails:
```javascript
// Option 1: Keep first occurrence, log duplicates
const handleDuplicates = (users) => {
  const seen = new Set();
  const duplicates = [];
  const unique = users.filter(user => {
    if (seen.has(user.email.toLowerCase())) {
      duplicates.push(user);
      return false;
    }
    seen.add(user.email.toLowerCase());
    return true;
  });
  if (duplicates.length > 0) {
    console.warn(`Found ${duplicates.length} duplicate emails:`);
    duplicates.forEach(d => console.warn(`  - ${d.email}`));
  }
  return unique;
};

// Option 2: Append suffix to duplicates
const handleDuplicatesWithSuffix = (users) => {
  const counts = new Map();
  return users.map(user => {
    const email = user.email.toLowerCase();
    const count = counts.get(email) || 0;
    counts.set(email, count + 1);
    if (count > 0) {
      const [local, domain] = email.split('@');
      return {
        ...user,
        email: `${local}+v1dup${count}@${domain}`
      };
    }
    return user;
  });
};
```
### Migrating Representative Cache

The representative cache can be rebuilt from the Represent API, but to preserve the existing cache:
```javascript
const transformRepresentatives = async () => {
  const v1Reps = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'representatives.json'), 'utf-8')
  );
  const reps = v1Reps.map(rep => ({
    name: rep.Name,
    email: rep.Email,
    district: rep.District,
    party: rep.Party,
    level: rep.Level,
    photoUrl: rep.PhotoUrl || null,
    postalCodes: rep.PostalCodes ? JSON.parse(rep.PostalCodes) : [],
    createdAt: rep.Created || new Date().toISOString(),
    updatedAt: rep.Updated || new Date().toISOString()
  }));
  await fs.writeFile(
    path.join(OUTPUT_DIR, 'representatives.json'),
    JSON.stringify(reps, null, 2)
  );
  return reps;
};
```
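The import script above does not cover representatives, although the validation script counts them. A sketch following the same pattern as `importUsers`, assuming the transform output above matches the V2 `Representative` model:

```javascript
// Sketch: import transformed representatives (same pattern as importUsers).
// Assumes the fields produced by transformRepresentatives match the schema.
const importRepresentatives = async () => {
  const reps = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'representatives.json'), 'utf-8')
  );
  console.log(`Importing ${reps.length} representatives...`);
  let imported = 0;
  for (const rep of reps) {
    try {
      await prisma.representative.create({ data: rep });
      imported++;
    } catch (error) {
      console.error(`  ✗ Failed to import ${rep.name}:`, error.message);
    }
  }
  console.log(`✓ Imported ${imported}/${reps.length} representatives`);
};
```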
### Migrating Shift Signups

V1 may embed signups in the shift record; V2 uses a separate `ShiftSignup` table:
```javascript
const transformShiftSignups = async () => {
  const v1Shifts = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'shifts.json'), 'utf-8')
  );
  const signups = [];
  v1Shifts.forEach(shift => {
    if (shift.Signups && Array.isArray(shift.Signups)) {
      shift.Signups.forEach(signup => {
        signups.push({
          shiftId: shift.Id, // V1 ID, will need mapping in import
          userId: signup.UserId, // V1 ID, will need mapping
          status: 'CONFIRMED',
          notes: signup.Notes || null,
          confirmedAt: signup.CreatedAt || new Date().toISOString(),
          createdAt: signup.CreatedAt || new Date().toISOString()
        });
      });
    }
  });
  await fs.writeFile(
    path.join(OUTPUT_DIR, 'shift-signups.json'),
    JSON.stringify(signups, null, 2)
  );
  return signups;
};

// Import with ID mapping
const importShiftSignups = async (idMappings) => {
  const signups = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'shift-signups.json'), 'utf-8')
  );
  for (const signup of signups) {
    const v2ShiftId = idMappings.shifts[signup.shiftId];
    const v2UserId = idMappings.users[signup.userId];
    if (!v2ShiftId || !v2UserId) {
      console.warn(`Skipping signup: shift ${signup.shiftId} or user ${signup.userId} not found`);
      continue;
    }
    await prisma.shiftSignup.create({
      data: {
        shiftId: v2ShiftId,
        userId: v2UserId,
        status: signup.status,
        notes: signup.notes,
        confirmedAt: signup.confirmedAt,
        createdAt: signup.createdAt
      }
    });
  }
};
```
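`idMappings` is not built anywhere in this guide. One approach, sketched under two assumptions: the transformed `shifts.json` records keep a `_v1Id` field (as the other transform scripts here do), and the V1 exports are still on disk for resolving user IDs:

```javascript
// Sketch: build V1 → V2 ID mappings during import.
const buildIdMappings = async () => {
  const idMappings = { shifts: {}, users: {} };

  // Import shifts, recording each new V2 ID against its V1 ID
  const shifts = JSON.parse(
    await fs.readFile(path.join(INPUT_DIR, 'shifts.json'), 'utf-8')
  );
  for (const shift of shifts) {
    const { _v1Id, ...data } = shift;
    const created = await prisma.shift.create({ data });
    idMappings.shifts[_v1Id] = created.id;
  }

  // Users were merged by email, so resolve V1 user IDs via email lookup
  // (assumes ./v1-export/login.json is still available)
  const v1Users = JSON.parse(
    await fs.readFile('./v1-export/login.json', 'utf-8')
  );
  for (const v1User of v1Users) {
    const v2User = await prisma.user.findFirst({
      where: { email: { equals: v1User.Email, mode: 'insensitive' } }
    });
    if (v2User) idMappings.users[v1User.Id] = v2User.id;
  }

  return idMappings;
};

// Usage:
//   const idMappings = await buildIdMappings();
//   await importShiftSignups(idMappings);
```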
## Testing Migration

### Pre-Production Test Migration

Before the production migration, perform a full test run on staging:
```bash
# 1. Clone production V1 data to staging
./scripts/backup.sh
scp backups/latest.tar.gz staging-server:/tmp/

# 2. Restore V1 on staging
ssh staging-server
cd /opt/changemaker-lite
tar -xzf /tmp/latest.tar.gz -C ./
docker compose -f docker-compose.v1.yml up -d

# 3. Export V1 data
docker compose -f docker-compose.v1.yml exec influence-app node /app/scripts/export-data.js

# 4. Set up V2 on staging
git checkout v2
docker compose up -d v2-postgres redis
docker compose exec api npx prisma migrate deploy

# 5. Transform and import
node scripts/transform-users.js
node scripts/transform-campaigns.js
node scripts/transform-locations.js
node scripts/import-v2-data.js

# 6. Validate
node scripts/validate-migration.js

# 7. Test critical workflows
./scripts/test-v2-workflows.sh
```
### Test Critical Workflows

**Script:** `scripts/test-v2-workflows.sh`
```bash
#!/bin/bash
set -e

API_URL="http://localhost:4000"
ADMIN_TOKEN=""

echo "Testing V2 Critical Workflows"
echo "=============================="

# 1. Admin Login
echo -n "1. Admin login... "
LOGIN_RESPONSE=$(curl -s -X POST "$API_URL/api/auth/login" \
  -H "Content-Type: application/json" \
  -d '{"email":"admin@example.com","password":"Admin123!"}')
ADMIN_TOKEN=$(echo "$LOGIN_RESPONSE" | jq -r '.data.accessToken')
if [ "$ADMIN_TOKEN" != "null" ] && [ -n "$ADMIN_TOKEN" ]; then
  echo "✓"
else
  echo "✗ Failed"
  exit 1
fi

# 2. List Campaigns
echo -n "2. List campaigns... "
CAMPAIGNS=$(curl -s "$API_URL/api/influence/campaigns" \
  -H "Authorization: Bearer $ADMIN_TOKEN")
CAMPAIGN_COUNT=$(echo "$CAMPAIGNS" | jq '.data | length')
echo "✓ ($CAMPAIGN_COUNT campaigns)"

# 3. Representative Lookup
echo -n "3. Representative lookup (M5V 1A1)... "
REPS=$(curl -s -X POST "$API_URL/api/influence/representatives/lookup" \
  -H "Content-Type: application/json" \
  -d '{"postalCode":"M5V1A1"}')
REP_COUNT=$(echo "$REPS" | jq '.data | length')
echo "✓ ($REP_COUNT representatives)"

# 4. List Locations
echo -n "4. List locations... "
LOCATIONS=$(curl -s "$API_URL/api/map/locations" \
  -H "Authorization: Bearer $ADMIN_TOKEN")
LOCATION_COUNT=$(echo "$LOCATIONS" | jq '.data | length')
echo "✓ ($LOCATION_COUNT locations)"

# 5. Send Test Email
echo -n "5. Queue test email... "
CAMPAIGN_ID=$(echo "$CAMPAIGNS" | jq -r '.data[0].id')
EMAIL_RESPONSE=$(curl -s -X POST "$API_URL/api/influence/campaign-emails/send-email" \
  -H "Content-Type: application/json" \
  -d '{
    "campaignId":"'"$CAMPAIGN_ID"'",
    "postalCode":"M5V1A1",
    "senderName":"Test User",
    "senderEmail":"test@example.com"
  }')
if echo "$EMAIL_RESPONSE" | jq -e '.success' > /dev/null; then
  echo "✓"
else
  echo "✗ Failed"
fi

echo
echo "All critical workflows passed ✓"
```
## Production Migration

### Step-by-Step Procedure

#### Phase 1: Preparation (1-2 days before)
1. **Announce Downtime Window**

    ```text
    Subject: Scheduled Maintenance - System Upgrade

    We will be performing a major system upgrade on [DATE] at [TIME].
    Expected downtime: 15-30 minutes

    What to expect:
    - All users will be logged out
    - You will need to re-login after the upgrade
    - Your data and passwords remain unchanged

    Please save any unsaved work before [TIME].
    ```

2. **Backup V1**

    ```bash
    ./scripts/backup.sh --include-uploads

    # Verify backup
    tar -tzf backups/changemaker-v1-$(date +%Y%m%d).tar.gz | head -20
    ```

3. **Test V2 on Staging** (use the procedure above)
#### Phase 2: Export (T-60min)

1. **Enable V1 Read-Only Mode**

    ```bash
    # Stop V1 write services
    docker compose -f docker-compose.v1.yml stop influence-app map-app

    # Keep database running for export
    ```

2. **Export V1 Data**

    ```bash
    V1_NOCODB_URL=http://localhost:8080 \
    V1_NOCODB_TOKEN=$(cat .env | grep NOCODB_API_TOKEN | cut -d= -f2) \
    node scripts/export-v1-nocodb.js

    # Verify export
    ls -lh v1-export/
    ```
#### Phase 3: Transform (T-30min)

1. **Transform Data**

    ```bash
    node scripts/transform-users.js
    node scripts/transform-campaigns.js
    node scripts/transform-locations.js
    node scripts/transform-shifts.js

    # Verify transformed data
    ls -lh v2-import/
    ```
#### Phase 4: Import (T-15min)

1. **Stop V1 Completely**

    ```bash
    docker compose -f docker-compose.v1.yml down
    ```

2. **Start V2 Database**

    ```bash
    docker compose up -d v2-postgres redis
    docker compose exec api npx prisma migrate deploy
    ```

3. **Import Data**

    ```bash
    node scripts/import-v2-data.js | tee migration.log
    ```

4. **Validate Import**

    ```bash
    node scripts/validate-migration.js
    ```
#### Phase 5: Launch V2 (T+0min)

1. **Start All V2 Services**

    ```bash
    docker compose up -d

    # Wait for health checks
    sleep 30

    # Verify all healthy
    docker compose ps
    ```

2. **Smoke Test**

    ```bash
    ./scripts/test-v2-workflows.sh
    ```

3. **Update DNS/Tunnel**

    - Pangolin: Update endpoint in admin
    - Cloudflare: Update tunnel configuration
    - Manual DNS: Update A/CNAME records
#### Phase 6: Monitor (T+15min to T+24hr)

1. **Watch Logs**

    ```bash
    docker compose logs -f api admin
    ```

2. **Monitor Metrics**

    - Open Grafana: http://localhost:3001
    - Check API Performance dashboard
    - Watch for error spikes

3. **Test User Logins**

    - Admin login
    - Regular user login
    - Temp user creation (shift signup)

4. **Announce Migration Complete**

    ```text
    Subject: System Upgrade Complete

    Our system upgrade is complete! You can now log in at:
    https://app.cmlite.org

    Your username and password remain unchanged.

    New features available:
    - [List new V2 features]

    If you experience any issues, please contact support@cmlite.org.
    ```
## Rollback Procedures

If the migration fails, follow these steps:

### Emergency Rollback (T+0 to T+2hr)
```bash
# 1. Stop V2 services
docker compose down

# 2. Restore V1 services
docker compose -f docker-compose.v1.yml up -d

# 3. Restore V1 database from backup (if modified)
docker compose -f docker-compose.v1.yml exec -T v1-postgres \
  psql -U nocodb nocodb < backups/v1-postgres-backup.sql

# 4. Verify V1 operational
curl -I http://localhost:3333/health

# 5. Revert DNS/tunnel

# 6. Announce rollback
echo "Migration has been rolled back. V1 is operational." | \
  mail -s "Migration Rollback" admin@cmlite.org
```
### Post-Rollback Analysis

1. **Review Migration Logs**

    ```bash
    cat migration.log | grep ERROR
    ```

2. **Identify Root Cause**

    - Data transformation errors?
    - Database constraint violations?
    - Application bugs?

3. **Fix Issues on Staging**

    - Update transformation scripts
    - Test again on staging
    - Validate thoroughly

4. **Reschedule Migration**

    - New downtime window
    - Communicate lessons learned
## Troubleshooting

### Issue: Prisma Unique Constraint Violation

**Error:** `P2002: Unique constraint failed on the constraint: unique_email`

**Cause:** Duplicate emails in the merged user data.

**Solution:**
```javascript
// Before import, deduplicate (handleDuplicates is defined under Special Cases)
const users = JSON.parse(await fs.readFile('v2-import/users.json', 'utf-8'));
const unique = handleDuplicates(users);
await fs.writeFile('v2-import/users.json', JSON.stringify(unique, null, 2));
```
### Issue: Foreign Key Constraint Violation

**Error:** `P2003: Foreign key constraint failed on the field: createdByUserId`

**Cause:** A campaign references a user that doesn't exist (records were imported in the wrong order).

**Solution:** Always import in dependency order:

1. Users first
2. Campaigns (references users)
3. Locations (references users)
4. Shifts, responses, etc.
### Issue: Bcrypt Hashes Not Working

**Symptoms:** Users can't log in after migration despite entering the correct password.

**Cause:** Password field truncated or corrupted.

**Diagnosis:**
```sql
-- Check password hash format
SELECT email, LEFT(password, 10), LENGTH(password) FROM "User" LIMIT 5;
-- Expected: hashes starting with "$2b$10", length 60
```
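If the hash format looks correct, a quick check with the `bcrypt` package confirms whether a known password still validates (the credentials below are placeholders):

```javascript
// Sanity check: does a known password validate against the migrated hash?
const bcrypt = require('bcrypt');
const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();

const checkHash = async (email, knownPassword) => {
  const user = await prisma.user.findUnique({ where: { email } });
  if (!user) return console.log(`User ${email} not found`);
  const ok = await bcrypt.compare(knownPassword, user.password);
  console.log(`Hash for ${email}: ${ok ? '✓ valid' : '✗ does not match'}`);
};

// Placeholder credentials — use a real account whose V1 password you know
checkHash('admin@example.com', 'Admin123!')
  .catch(console.error)
  .finally(() => prisma.$disconnect());
```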
**Solution:**

```bash
# Re-import users, ensuring the password column is a text type.
# Or batch-reset passwords:
docker compose exec api node scripts/reset-all-passwords.js
```
## Related Documentation
- Migration Overview - Migration planning guide
- Breaking Changes - V1→V2 differences
- API Changes - Endpoint mapping
- Feature Parity - Feature comparison
## Next Steps

After successful migration:

1. **Configure V2 Settings**
2. **Train Administrators**
3. **Enable New Features**
4. **Set Up Monitoring**
!!! success "Migration Complete"
    Congratulations on completing your V2 migration! Welcome to the modern Changemaker Lite platform.