Files
claudetools/Sync-FromNAS-retrieved.ps1
Mike Swanson e4392afce9 docs: Document Dataforth test database system and troubleshooting
Investigation and Documentation:
- Discovered and documented test database system on AD2 server
- Created comprehensive TEST_DATABASE_ARCHITECTURE.md with full system details
- Retrieved all key database files from AD2 (import.js, schema.sql, server configs)
- Documented data flow: DOS machines → NAS → AD2 → SQLite → Web interface
- Verified database health: 1,027,517 records, 1075 MB, dates back to 1990

Database System Architecture:
- SQLite database with Node.js/Express.js web server (port 3000)
- Automated import via Sync-FromNAS.ps1 (runs every 15 minutes)
- 8 log types supported: DSCLOG, 5BLOG, 7BLOG, 8BLOG, PWRLOG, SCTLOG, VASLOG, SHT
- FTS5 full-text search, comprehensive indexes for performance
- API endpoints: search, stats, export, datasheet generation

Troubleshooting Scripts Created:
- Database diagnostics: check-db-simple.ps1, test-db-directly.ps1
- Server status checks: check-node-running.ps1, check-db-server.ps1
- Performance analysis: check-db-performance.ps1, check-wal-files.ps1
- API testing: test-api-endpoint.ps1, test-query.js
- Import monitoring: check-new-records.ps1
- Database optimization attempts: api-js-optimized.js, api-js-fixed.js
- Deployment scripts: deploy-db-optimization.ps1, deploy-db-fix.ps1, restore-original.ps1

Key Findings:
- Database file healthy and queryable (verified with test-query.js)
- Node.js server not running (port 3000 closed) - root cause of web interface issues
- Database last updated 8 days ago (01/13/2026) - automated sync may be broken
- Attempted performance optimizations (WAL mode) incompatible with readonly connections
- Original api.js restored from backup after optimization conflicts

Retrieved Documentation:
- QUICKSTART-retrieved.md: Quick start guide for database server
- SESSION_NOTES-retrieved.md: Complete session notes from database creation
- Sync-FromNAS-retrieved.ps1: Full sync script with database import logic
- import-js-retrieved.js: Node.js import script (12,774 bytes)
- schema-retrieved.sql: SQLite schema with FTS5 triggers
- server-js-retrieved.js: Express.js server configuration
- api-js-retrieved.js: API routes and endpoints
- package-retrieved.json: Node.js dependencies

Action Items Identified:
1. Start Node.js server on AD2 to restore web interface functionality
2. Investigate why automated sync hasn't updated database in 8 days
3. Check Windows Task Scheduler for Sync-FromNAS.ps1 scheduled task
4. Run manual import to catch up on 8 days of test data if needed
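The staleness check behind action item 2 is simple date arithmetic; a sketch (ISO date strings assumed for illustration, the sync log's actual timestamp format may differ):

```javascript
// Days elapsed between two dates, e.g. last database update vs. today.
function daysBetween(fromIso, toIso) {
  const MS_PER_DAY = 24 * 60 * 60 * 1000;
  return Math.round((new Date(toIso) - new Date(fromIso)) / MS_PER_DAY);
}

console.log(daysBetween('2026-01-13', '2026-01-21')); // 8, matching the finding above
```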

Technical Details:
- Database path: C:\Shares\testdatadb\database\testdata.db
- Web interface: http://192.168.0.6:3000 (when running)
- Database size: 1075.14 MB (1,127,362,560 bytes)
- Total records: 1,027,517 (slight variance from original 1,030,940)
- Pass rate: 99.82% (1,029,046 passed, 1,888 failed)
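For reference, the two size figures above are mutually consistent if the megabytes are binary (MiB, 1,048,576 bytes each):

```javascript
// 1,127,362,560 bytes expressed in binary megabytes (MiB)
const bytes = 1127362560;
const mib = bytes / (1024 * 1024);
console.log(mib.toFixed(2)); // "1075.14"
```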

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-21 16:38:54 -07:00


# Sync-AD2-NAS.ps1 (formerly Sync-FromNAS.ps1)
# Bidirectional sync between AD2 and NAS (D2TESTNAS)
#
# PULL (NAS → AD2): Test results (LOGS/*.DAT, Reports/*.TXT) → Database import
# PUSH (AD2 → NAS): Software updates (ProdSW/*, TODO.BAT) → DOS machines
#
# Run: powershell -ExecutionPolicy Bypass -File C:\Shares\test\scripts\Sync-FromNAS.ps1
# Scheduled: Every 15 minutes via Windows Task Scheduler
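# Note: a sketch of how the 15-minute schedule might have been registered
# (task name and trigger details below are assumptions, not taken from AD2):
#   $action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
#       -Argument '-ExecutionPolicy Bypass -File C:\Shares\test\scripts\Sync-FromNAS.ps1'
#   $trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 15)
#   Register-ScheduledTask -TaskName 'Sync-FromNAS' -Action $action -Trigger $trigger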
param(
    [switch]$DryRun,            # Show what would be done without doing it
    [switch]$Verbose,           # Extra output
    [int]$MaxAgeMinutes = 1440  # Default: files from last 24 hours (was 60 min, too aggressive)
)
# ============================================================================
# Configuration
# ============================================================================
$NAS_IP = "192.168.0.9"
$NAS_USER = "root"
$NAS_PASSWORD = "Paper123!@#-nas"
$NAS_HOSTKEY = "SHA256:5CVIPlqjLPxO8n48PKLAP99nE6XkEBAjTkaYmJAeOdA"
$NAS_DATA_PATH = "/data/test"
$AD2_TEST_PATH = "C:\Shares\test"
$AD2_HISTLOGS_PATH = "C:\Shares\test\Ate\HISTLOGS"
$SSH = "C:\Program Files\OpenSSH\ssh.exe" # Changed from PLINK to OpenSSH
$SCP = "C:\Program Files\OpenSSH\scp.exe" # Changed from PSCP to OpenSSH
$LOG_FILE = "C:\Shares\test\scripts\sync-from-nas.log"
$STATUS_FILE = "C:\Shares\test\_SYNC_STATUS.txt"
$LOG_TYPES = @("5BLOG", "7BLOG", "8BLOG", "DSCLOG", "SCTLOG", "VASLOG", "PWRLOG", "HVLOG")
# Database import configuration
$IMPORT_SCRIPT = "C:\Shares\testdatadb\database\import.js"
$NODE_PATH = "node"
# ============================================================================
# Functions
# ============================================================================
function Write-Log {
    param([string]$Message)
    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $logLine = "$timestamp : $Message"
    Add-Content -Path $LOG_FILE -Value $logLine
    if ($Verbose) { Write-Host $logLine }
}
function Invoke-NASCommand {
    param([string]$Command)
    # Target host was missing from the ssh invocation; without it the command cannot run
    $result = & $SSH -i "C:\Users\sysadmin\.ssh\id_ed25519" -o BatchMode=yes -o ConnectTimeout=10 -o StrictHostKeyChecking=accept-new "${NAS_USER}@${NAS_IP}" $Command 2>&1
    return $result
}
function Copy-FromNAS {
    param(
        [string]$RemotePath,
        [string]$LocalPath
    )
    # Ensure local directory exists
    $localDir = Split-Path -Parent $LocalPath
    if (-not (Test-Path $localDir)) {
        New-Item -ItemType Directory -Path $localDir -Force | Out-Null
    }
    $result = & $SCP -O -o StrictHostKeyChecking=accept-new -o UserKnownHostsFile="C:\Shares\test\scripts\.ssh\known_hosts" "${NAS_USER}@${NAS_IP}:$RemotePath" $LocalPath 2>&1
    if ($LASTEXITCODE -ne 0) {
        $errorMsg = $result | Out-String
        Write-Log " SCP PULL ERROR (exit $LASTEXITCODE): $errorMsg"
    }
    return $LASTEXITCODE -eq 0
}
function Remove-FromNAS {
    param([string]$RemotePath)
    Invoke-NASCommand "rm -f '$RemotePath'" | Out-Null
}
function Copy-ToNAS {
    param(
        [string]$LocalPath,
        [string]$RemotePath
    )
    # Ensure remote directory exists
    $remoteDir = Split-Path -Parent $RemotePath
    Invoke-NASCommand "mkdir -p '$remoteDir'" | Out-Null
    $result = & $SCP -O -o StrictHostKeyChecking=accept-new -o UserKnownHostsFile="C:\Shares\test\scripts\.ssh\known_hosts" $LocalPath "${NAS_USER}@${NAS_IP}:$RemotePath" 2>&1
    if ($LASTEXITCODE -ne 0) {
        $errorMsg = $result | Out-String
        Write-Log " SCP PUSH ERROR (exit $LASTEXITCODE): $errorMsg"
    }
    return $LASTEXITCODE -eq 0
}
function Get-FileHash256 {
    param([string]$FilePath)
    if (Test-Path $FilePath) {
        return (Get-FileHash -Path $FilePath -Algorithm SHA256).Hash
    }
    return $null
}
function Import-ToDatabase {
    param([string[]]$FilePaths)
    if ($FilePaths.Count -eq 0) { return }
    Write-Log "Importing $($FilePaths.Count) file(s) to database..."
    # Build argument list ($args is an automatic variable in PowerShell, so use a different name)
    $nodeArgs = @("$IMPORT_SCRIPT", "--file") + $FilePaths
    try {
        $output = & $NODE_PATH $nodeArgs 2>&1
        foreach ($line in $output) {
            Write-Log " [DB] $line"
        }
        Write-Log "Database import complete"
    } catch {
        Write-Log "ERROR: Database import failed: $_"
    }
}
# ============================================================================
# Main Script
# ============================================================================
Write-Log "=========================================="
Write-Log "Starting sync from NAS"
Write-Log "Max age: $MaxAgeMinutes minutes"
if ($DryRun) { Write-Log "DRY RUN - no changes will be made" }
$errorCount = 0
$syncedFiles = 0
$skippedFiles = 0
$syncedDatFiles = @() # Track DAT files for database import
# Find all DAT files on NAS modified within the time window
Write-Log "Finding DAT files on NAS..."
$findCommand = "find $NAS_DATA_PATH/TS-*/LOGS -name '*.DAT' -type f -mmin -$MaxAgeMinutes 2>/dev/null"
$datFiles = Invoke-NASCommand $findCommand
if (-not $datFiles -or $datFiles.Count -eq 0) {
    Write-Log "No new DAT files found on NAS"
} else {
    Write-Log "Found $($datFiles.Count) DAT file(s) to process"
    foreach ($remoteFile in $datFiles) {
        $remoteFile = $remoteFile.Trim()
        if ([string]::IsNullOrWhiteSpace($remoteFile)) { continue }
        # Parse the path: /data/test/TS-XX/LOGS/7BLOG/file.DAT
        if ($remoteFile -match "/data/test/(TS-[^/]+)/LOGS/([^/]+)/(.+\.DAT)$") {
            $station = $Matches[1]
            $logType = $Matches[2]
            $fileName = $Matches[3]
            Write-Log "Processing: $station/$logType/$fileName"
            # Destination 1: Per-station folder (preserves structure)
            $stationDest = Join-Path $AD2_TEST_PATH "$station\LOGS\$logType\$fileName"
            # Destination 2: Aggregated HISTLOGS folder
            $histlogsDest = Join-Path $AD2_HISTLOGS_PATH "$logType\$fileName"
            if ($DryRun) {
                Write-Log " [DRY RUN] Would copy to: $stationDest"
                $syncedFiles++
            } else {
                # Copy to station folder only (skip HISTLOGS to avoid duplicates)
                $success1 = Copy-FromNAS -RemotePath $remoteFile -LocalPath $stationDest
                if ($success1) {
                    Write-Log " Copied to station folder"
                    # Remove from NAS after successful sync
                    Remove-FromNAS -RemotePath $remoteFile
                    Write-Log " Removed from NAS"
                    # Track for database import
                    $syncedDatFiles += $stationDest
                    $syncedFiles++
                } else {
                    Write-Log " ERROR: Failed to copy from NAS"
                    $errorCount++
                }
            }
        } else {
            Write-Log " Skipping (unexpected path format): $remoteFile"
            $skippedFiles++
        }
    }
}
# Find and sync TXT report files
Write-Log "Finding TXT reports on NAS..."
$findReportsCommand = "find $NAS_DATA_PATH/TS-*/Reports -name '*.TXT' -type f -mmin -$MaxAgeMinutes 2>/dev/null"
$txtFiles = Invoke-NASCommand $findReportsCommand
if ($txtFiles -and $txtFiles.Count -gt 0) {
    Write-Log "Found $($txtFiles.Count) TXT report(s) to process"
    foreach ($remoteFile in $txtFiles) {
        $remoteFile = $remoteFile.Trim()
        if ([string]::IsNullOrWhiteSpace($remoteFile)) { continue }
        if ($remoteFile -match "/data/test/(TS-[^/]+)/Reports/(.+\.TXT)$") {
            $station = $Matches[1]
            $fileName = $Matches[2]
            Write-Log "Processing report: $station/$fileName"
            # Destination: Per-station Reports folder
            $reportDest = Join-Path $AD2_TEST_PATH "$station\Reports\$fileName"
            if ($DryRun) {
                Write-Log " [DRY RUN] Would copy to: $reportDest"
                $syncedFiles++
            } else {
                $success = Copy-FromNAS -RemotePath $remoteFile -LocalPath $reportDest
                if ($success) {
                    Write-Log " Copied report"
                    Remove-FromNAS -RemotePath $remoteFile
                    Write-Log " Removed from NAS"
                    $syncedFiles++
                } else {
                    Write-Log " ERROR: Failed to copy report"
                    $errorCount++
                }
            }
        }
    }
}
# ============================================================================
# Import synced DAT files to database
# ============================================================================
if (-not $DryRun -and $syncedDatFiles.Count -gt 0) {
    Import-ToDatabase -FilePaths $syncedDatFiles
}
# ============================================================================
# PUSH: AD2 → NAS (Software Updates for DOS Machines)
# ============================================================================
Write-Log "--- AD2 to NAS Sync (Software Updates) ---"
$pushedFiles = 0
# Sync COMMON/ProdSW (batch files for all stations)
# AD2 uses _COMMON, NAS uses COMMON - handle both
$commonSources = @(
    @{ Local = "$AD2_TEST_PATH\_COMMON\ProdSW"; Remote = "$NAS_DATA_PATH/COMMON/ProdSW" },
    @{ Local = "$AD2_TEST_PATH\COMMON\ProdSW";  Remote = "$NAS_DATA_PATH/COMMON/ProdSW" }
)
foreach ($source in $commonSources) {
    if (Test-Path $source.Local) {
        Write-Log "Syncing COMMON ProdSW from: $($source.Local)"
        $commonFiles = Get-ChildItem -Path $source.Local -File -ErrorAction SilentlyContinue
        foreach ($file in $commonFiles) {
            $remotePath = "$($source.Remote)/$($file.Name)"
            if ($DryRun) {
                Write-Log " [DRY RUN] Would push: $($file.Name) -> $remotePath"
                $pushedFiles++
            } else {
                $success = Copy-ToNAS -LocalPath $file.FullName -RemotePath $remotePath
                if ($success) {
                    Write-Log " Pushed: $($file.Name)"
                    $pushedFiles++
                } else {
                    Write-Log " ERROR: Failed to push $($file.Name)"
                    $errorCount++
                }
            }
        }
    }
}
# Sync UPDATE.BAT (root level utility)
Write-Log "Syncing UPDATE.BAT..."
$updateBatLocal = "$AD2_TEST_PATH\UPDATE.BAT"
if (Test-Path $updateBatLocal) {
    $updateBatRemote = "$NAS_DATA_PATH/UPDATE.BAT"
    if ($DryRun) {
        Write-Log " [DRY RUN] Would push: UPDATE.BAT -> $updateBatRemote"
        $pushedFiles++
    } else {
        $success = Copy-ToNAS -LocalPath $updateBatLocal -RemotePath $updateBatRemote
        if ($success) {
            Write-Log " Pushed: UPDATE.BAT"
            $pushedFiles++
        } else {
            Write-Log " ERROR: Failed to push UPDATE.BAT"
            $errorCount++
        }
    }
} else {
    Write-Log " WARNING: UPDATE.BAT not found at $updateBatLocal"
}
# Sync DEPLOY.BAT (root level utility)
Write-Log "Syncing DEPLOY.BAT..."
$deployBatLocal = "$AD2_TEST_PATH\DEPLOY.BAT"
if (Test-Path $deployBatLocal) {
    $deployBatRemote = "$NAS_DATA_PATH/DEPLOY.BAT"
    if ($DryRun) {
        Write-Log " [DRY RUN] Would push: DEPLOY.BAT -> $deployBatRemote"
        $pushedFiles++
    } else {
        $success = Copy-ToNAS -LocalPath $deployBatLocal -RemotePath $deployBatRemote
        if ($success) {
            Write-Log " Pushed: DEPLOY.BAT"
            $pushedFiles++
        } else {
            Write-Log " ERROR: Failed to push DEPLOY.BAT"
            $errorCount++
        }
    }
} else {
    Write-Log " WARNING: DEPLOY.BAT not found at $deployBatLocal"
}
# Sync per-station ProdSW folders
Write-Log "Syncing station-specific ProdSW folders..."
$stationFolders = Get-ChildItem -Path $AD2_TEST_PATH -Directory -Filter "TS-*" -ErrorAction SilentlyContinue
foreach ($station in $stationFolders) {
    $prodSwPath = Join-Path $station.FullName "ProdSW"
    if (Test-Path $prodSwPath) {
        # Get all files in ProdSW (including subdirectories)
        $prodSwFiles = Get-ChildItem -Path $prodSwPath -File -Recurse -ErrorAction SilentlyContinue
        foreach ($file in $prodSwFiles) {
            # Calculate relative path from ProdSW folder
            $relativePath = $file.FullName.Substring($prodSwPath.Length + 1).Replace('\', '/')
            $remotePath = "$NAS_DATA_PATH/$($station.Name)/ProdSW/$relativePath"
            if ($DryRun) {
                Write-Log " [DRY RUN] Would push: $($station.Name)/ProdSW/$relativePath"
                $pushedFiles++
            } else {
                $success = Copy-ToNAS -LocalPath $file.FullName -RemotePath $remotePath
                if ($success) {
                    Write-Log " Pushed: $($station.Name)/ProdSW/$relativePath"
                    $pushedFiles++
                } else {
                    Write-Log " ERROR: Failed to push $($station.Name)/ProdSW/$relativePath"
                    $errorCount++
                }
            }
        }
    }
    # Check for TODO.BAT (one-time task file)
    $todoBatPath = Join-Path $station.FullName "TODO.BAT"
    if (Test-Path $todoBatPath) {
        $remoteTodoPath = "$NAS_DATA_PATH/$($station.Name)/TODO.BAT"
        Write-Log "Found TODO.BAT for $($station.Name)"
        if ($DryRun) {
            Write-Log " [DRY RUN] Would push TODO.BAT -> $remoteTodoPath"
            $pushedFiles++
        } else {
            $success = Copy-ToNAS -LocalPath $todoBatPath -RemotePath $remoteTodoPath
            if ($success) {
                Write-Log " Pushed TODO.BAT to NAS"
                # Remove from AD2 after successful push (one-shot mechanism)
                Remove-Item -Path $todoBatPath -Force
                Write-Log " Removed TODO.BAT from AD2 (pushed to NAS)"
                $pushedFiles++
            } else {
                Write-Log " ERROR: Failed to push TODO.BAT"
                $errorCount++
            }
        }
    }
}
Write-Log "AD2 to NAS sync: $pushedFiles file(s) pushed"
# ============================================================================
# Update Status File
# ============================================================================
$status = if ($errorCount -eq 0) { "OK" } else { "ERRORS" }
$statusContent = @"
AD2 <-> NAS Bidirectional Sync Status
======================================
Timestamp: $(Get-Date -Format "yyyy-MM-dd HH:mm:ss")
Status: $status
PULL (NAS -> AD2 - Test Results):
Files Pulled: $syncedFiles
Files Skipped: $skippedFiles
DAT Files Imported to DB: $($syncedDatFiles.Count)
PUSH (AD2 -> NAS - Software Updates):
Files Pushed: $pushedFiles
Errors: $errorCount
"@
Set-Content -Path $STATUS_FILE -Value $statusContent
Write-Log "=========================================="
Write-Log "Sync complete: PULL=$syncedFiles, PUSH=$pushedFiles, Errors=$errorCount"
Write-Log "=========================================="
# Exit with error code if there were failures
if ($errorCount -gt 0) {
    exit 1
} else {
    exit 0
}