Data Migration Procedure
From Jupiter (172.16.3.20) to RMM (172.16.3.30)
Date: 2026-01-17
Data to Migrate: 68 conversation contexts + any credentials/other data
Estimated Time: 5 minutes
Step 1: Export Data from Jupiter
Open PuTTY and connect to Jupiter (172.16.3.20)
# Export all data (structure already exists on RMM, just need INSERT statements)
docker exec mariadb mysqldump \
-u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
--no-create-info \
--skip-add-drop-table \
--insert-ignore \
--complete-insert \
claudetools > /tmp/claudetools_data_export.sql
# Check what was exported
echo "=== Export Summary ==="
wc -l /tmp/claudetools_data_export.sql
grep "^INSERT INTO" /tmp/claudetools_data_export.sql | sed 's/INSERT INTO `\([^`]*\)`.*/\1/' | sort | uniq -c
Expected output:
68 conversation_contexts
(and possibly credentials, clients, machines, etc.)
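To get a baseline to compare against after the import, you can also record an exact row count on Jupiter before copying anything. A minimal sketch, assuming the mysql client is available inside the same mariadb container used above (the output file path is arbitrary):
# Record the exact source count for later comparison
docker exec mariadb mysql \
-u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
-D claudetools \
-e "SELECT COUNT(*) AS source_contexts FROM conversation_contexts;" | tee /tmp/claudetools_source_count.txt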
Step 2: Copy to RMM Server
Still on Jupiter:
# Copy export file to RMM server
scp /tmp/claudetools_data_export.sql guru@172.16.3.30:/tmp/
# Verify copy
ssh guru@172.16.3.30 "ls -lh /tmp/claudetools_data_export.sql"
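A matching file size is a reasonable sanity check, but a checksum is stronger evidence the transfer is intact. A quick sketch, assuming sha256sum is installed on both hosts:
# On Jupiter - checksum before transfer
sha256sum /tmp/claudetools_data_export.sql
# On RMM - checksum after transfer (the two hashes should match)
ssh guru@172.16.3.30 "sha256sum /tmp/claudetools_data_export.sql"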
Step 3: Import into RMM Database
Open another PuTTY session and connect to RMM (172.16.3.30)
# Import the data
mysql -u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
-D claudetools < /tmp/claudetools_data_export.sql
# Check for errors
echo $?
# If output is 0, import was successful
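If you want a record of any SQL errors or warnings rather than just the exit code, redirect stderr to a log file. A sketch with an arbitrary log path (note that a password-on-command-line warning, if the client emits one, will also land in this log):
# Import while keeping a log of anything reported on stderr
mysql -u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
-D claudetools < /tmp/claudetools_data_export.sql 2> /tmp/claudetools_import_errors.log
# Review whatever was reported
cat /tmp/claudetools_import_errors.log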
Step 4: Verify Migration
Still on RMM (172.16.3.30):
# Check record counts
mysql -u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
-D claudetools \
-e "SELECT TABLE_NAME, TABLE_ROWS
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'claudetools'
AND TABLE_ROWS > 0
ORDER BY TABLE_ROWS DESC;"
Expected output:
TABLE_NAME TABLE_ROWS
conversation_contexts 68
credentials (if any)
clients (if any)
machines (if any)
... etc ...
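TABLE_ROWS in information_schema is only an estimate for InnoDB tables, so for the table that matters most it is worth checking an exact count as well:
# Exact count of migrated contexts (TABLE_ROWS is approximate for InnoDB)
mysql -u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
-D claudetools \
-e "SELECT COUNT(*) AS conversation_contexts FROM conversation_contexts;"
# Expected: 68 (should match the count recorded on Jupiter in Step 1)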
Step 5: Test API Access
From Windows:
# Test context recall
curl -s "http://172.16.3.30:8001/api/conversation-contexts?limit=5" | python -m json.tool
# Expected: Should return 5 conversation contexts
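For a pass/fail check rather than eyeballing JSON, a small sketch that counts the returned items; it assumes the endpoint returns a bare JSON array and that Python 3 is on PATH:
# Count the contexts returned by the API (assumes the response is a JSON array)
curl -s "http://172.16.3.30:8001/api/conversation-contexts?limit=5" | python -c "import json,sys; print(len(json.load(sys.stdin)), 'contexts returned')"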
Step 6: Cleanup
On Jupiter (172.16.3.20):
# Remove temporary export file
rm /tmp/claudetools_data_export.sql
On RMM (172.16.3.30):
# Remove temporary import file
rm /tmp/claudetools_data_export.sql
Quick Single-Command Version
If you want to do it all in one go, run this from Jupiter:
# On Jupiter - Export, copy, and import in one command
docker exec mariadb mysqldump \
-u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
--no-create-info \
--skip-add-drop-table \
--insert-ignore \
--complete-insert \
claudetools | \
ssh guru@172.16.3.30 "mysql -u claudetools -pCT_e8fcd5a3952030a79ed6debae6c954ed -D claudetools"
Then verify on RMM:
mysql -u claudetools -pCT_e8fcd5a3952030a79ed6debae6c954ed -D claudetools \
-e "SELECT COUNT(*) FROM conversation_contexts;"
Troubleshooting
Issue: "Table doesn't exist"
Solution: Schema wasn't created on RMM - run schema creation first
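A quick way to confirm whether the schema is actually present on RMM before re-running the import:
# On RMM - list the tables the import expects to find
mysql -u claudetools -pCT_e8fcd5a3952030a79ed6debae6c954ed -D claudetools -e "SHOW TABLES;"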
Issue: Duplicate key errors
Solution: Using --insert-ignore should skip duplicates automatically
Issue: Foreign key constraint errors
Solution: Temporarily disable foreign key checks:
SET FOREIGN_KEY_CHECKS=0;
-- import data
SET FOREIGN_KEY_CHECKS=1;
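To apply this without editing the dump file, wrap the import so the checks are disabled only for that one client session. A sketch on RMM:
# Disable FK checks for this import only, then re-enable them
( echo "SET FOREIGN_KEY_CHECKS=0;"; \
cat /tmp/claudetools_data_export.sql; \
echo "SET FOREIGN_KEY_CHECKS=1;" ) | \
mysql -u claudetools -pCT_e8fcd5a3952030a79ed6debae6c954ed -D claudetools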
Issue: Character encoding errors
Solution: Database should already be utf8mb4, but if needed:
mysqldump --default-character-set=utf8mb4 ...
mysql --default-character-set=utf8mb4 ...
After Migration
- Update documentation - Note that 172.16.3.30 is now the primary database
- Test context recall - Verify hooks can read the migrated contexts
- Backup old database - Keep Jupiter database as backup for now (see the sketch after this list)
- Monitor new database - Watch for any issues with migrated data
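For the backup item above, a minimal sketch that takes a full, compressed dump on Jupiter; the filename and destination path are arbitrary:
# On Jupiter - full backup of the original database before decommissioning anything
docker exec mariadb mysqldump \
-u claudetools \
-pCT_e8fcd5a3952030a79ed6debae6c954ed \
claudetools | gzip > /tmp/claudetools_full_backup_$(date +%Y%m%d).sql.gz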
Verification Checklist
- Exported data from Jupiter (172.16.3.20)
- Copied export to RMM (172.16.3.30)
- Imported into RMM database
- Verified record counts match
- Tested API can access data
- Tested context recall works
- Cleaned up temporary files
Status: Ready to execute
Risk Level: Low (original data remains on Jupiter)
Rollback: If issues occur, just point clients back to 172.16.3.20