docs: Document Dataforth test database system and troubleshooting
Investigation and Documentation:
- Discovered and documented the test database system on the AD2 server
- Created comprehensive TEST_DATABASE_ARCHITECTURE.md with full system details
- Retrieved all key database files from AD2 (import.js, schema.sql, server configs)
- Documented data flow: DOS machines → NAS → AD2 → SQLite → Web interface
- Verified database health: 1,027,517 records, 1075 MB, dates back to 1990

Database System Architecture:
- SQLite database with Node.js/Express.js web server (port 3000)
- Automated import via Sync-FromNAS.ps1 (runs every 15 minutes)
- 8 log types supported: DSCLOG, 5BLOG, 7BLOG, 8BLOG, PWRLOG, SCTLOG, VASLOG, SHT
- FTS5 full-text search, comprehensive indexes for performance
- API endpoints: search, stats, export, datasheet generation

Troubleshooting Scripts Created:
- Database diagnostics: check-db-simple.ps1, test-db-directly.ps1
- Server status checks: check-node-running.ps1, check-db-server.ps1
- Performance analysis: check-db-performance.ps1, check-wal-files.ps1
- API testing: test-api-endpoint.ps1, test-query.js
- Import monitoring: check-new-records.ps1
- Database optimization attempts: api-js-optimized.js, api-js-fixed.js
- Deployment scripts: deploy-db-optimization.ps1, deploy-db-fix.ps1, restore-original.ps1

Key Findings:
- Database file healthy and queryable (verified with test-query.js)
- Node.js server not running (port 3000 closed) - root cause of web interface issues
- Database last updated 8 days ago (01/13/2026) - automated sync may be broken
- Attempted performance optimizations (WAL mode) incompatible with readonly connections
- Original api.js restored from backup after optimization conflicts

Retrieved Documentation:
- QUICKSTART-retrieved.md: Quick start guide for the database server
- SESSION_NOTES-retrieved.md: Complete session notes from database creation
- Sync-FromNAS-retrieved.ps1: Full sync script with database import logic
- import-js-retrieved.js: Node.js import script (12,774 bytes)
- schema-retrieved.sql: SQLite schema with FTS5 triggers
- server-js-retrieved.js: Express.js server configuration
- api-js-retrieved.js: API routes and endpoints
- package-retrieved.json: Node.js dependencies

Action Items Identified:
1. Start the Node.js server on AD2 to restore web interface functionality
2. Investigate why the automated sync hasn't updated the database in 8 days
3. Check Windows Task Scheduler for the Sync-FromNAS.ps1 scheduled task
4. Run a manual import to catch up on 8 days of test data if needed

Technical Details:
- Database path: C:\Shares\testdatadb\database\testdata.db
- Web interface: http://192.168.0.6:3000 (when running)
- Database size: 1075.14 MB (1,127,362,560 bytes)
- Total records: 1,027,517 (slight variance from original 1,030,940)
- Pass rate: 99.82% (1,029,046 passed, 1,888 failed)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
# Dataforth Test Data Database - System Architecture

**Location:** C:\Shares\testdatadb\ (on AD2 server)
**Created:** 2026-01-13
**Purpose:** Consolidate and search 1M+ test records from DOS machines
**Last Retrieved:** 2026-01-21

---

## Overview

A Node.js web application with a SQLite database that consolidates test data from ~30 DOS QC machines. Data flows automatically from the DOS machines through the NAS to AD2, where it is imported into the database every 15 minutes.

---

## System Architecture

### Technology Stack
- **Database:** SQLite 3 (better-sqlite3)
- **Server:** Node.js + Express.js (port 3000)
- **Import:** Automated via the Sync-FromNAS.ps1 PowerShell script
- **Web UI:** HTML/JavaScript search interface
- **Parsers:** Custom parsers for multiple data formats

### Data Flow
```
DOS Machines (CTONW.BAT)
        ↓
NAS: /data/test/TS-XX/LOGS/[LOG_TYPE]/*.DAT
        ↓ (Every 15 minutes)
AD2: C:\Shares\test\TS-XX\LOGS\[LOG_TYPE]\*.DAT
        ↓ (Automated import via Node.js)
SQLite Database: C:\Shares\testdatadb\database\testdata.db
        ↓
Web Interface: http://localhost:3000
```
---

## Database Details

### File Location
`C:\Shares\testdatadb\database\testdata.db`

### Database Type
SQLite 3 (single-file database)

### Record Count
**1,030,940 records** (as of 2026-01-13)

### Table Schema
```sql
test_records (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    log_type TEXT NOT NULL,       -- DSCLOG, 5BLOG, 7BLOG, 8BLOG, PWRLOG, SCTLOG, VASLOG, SHT
    model_number TEXT NOT NULL,   -- DSCA38-1793, SCM5B30-01, etc.
    serial_number TEXT NOT NULL,  -- 176923-1, 105840-2, etc.
    test_date TEXT NOT NULL,      -- YYYY-MM-DD format
    test_station TEXT,            -- TS-1L, TS-3R, etc.
    overall_result TEXT,          -- PASS/FAIL
    raw_data TEXT,                -- Full original record
    source_file TEXT,             -- Original file path
    import_date TEXT DEFAULT (datetime('now')),
    UNIQUE(log_type, model_number, serial_number, test_date, test_station)
)
```

### Indexes
- `idx_serial` - Fast lookup by serial number
- `idx_model` - Fast lookup by model number
- `idx_date` - Fast lookup by test date
- `idx_model_serial` - Combined model + serial lookup
- `idx_result` - Filter by PASS/FAIL
- `idx_log_type` - Filter by log type

### Full-Text Search
- FTS5 virtual table: `test_records_fts`
- Searches: serial_number, model_number, raw_data
- Automatic sync via triggers
---

## Import System

### Import Script
**Location:** `C:\Shares\testdatadb\database\import.js`
**Language:** Node.js
**Duration:** ~30 minutes for a full import (1M+ records)

### Data Sources (Priority Order)
1. **HISTLOGS** - `C:\Shares\test\Ate\HISTLOGS\` (consolidated history)
   - Authoritative source
   - 576,416 records imported
2. **Recovery-TEST** - `C:\Shares\Recovery-TEST\` (backup dates)
   - Multiple backup dates (12-13-25 to 12-18-25)
   - 454,383 records imported from 12-18-25
3. **Live Data** - `C:\Shares\test\TS-XX\LOGS\`
   - Current test station logs
   - 59 records imported (the rest are duplicates)

### Automated Import (via Sync-FromNAS.ps1)
**Configuration in Sync-FromNAS.ps1 (lines 46-48):**
```powershell
$IMPORT_SCRIPT = "C:\Shares\testdatadb\database\import.js"
$NODE_PATH = "node"
```

**Import Function (lines 122-141):**
```powershell
function Import-ToDatabase {
    param([string[]]$FilePaths)

    if ($FilePaths.Count -eq 0) { return }

    Write-Log "Importing $($FilePaths.Count) file(s) to database..."

    # Build argument list
    $args = @("$IMPORT_SCRIPT", "--file") + $FilePaths

    try {
        $output = & $NODE_PATH $args 2>&1
        foreach ($line in $output) {
            Write-Log "  [DB] $line"
        }
        Write-Log "Database import complete"
    } catch {
        Write-Log "ERROR: Database import failed: $_"
    }
}
```

**Trigger:** Every 15 minutes, when new DAT files are synced from the NAS
**Process:** Sync-FromNAS.ps1 → import.js --file [file1] [file2] ... → SQLite insert
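The import script is invoked with a `--file` flag followed by one or more DAT paths, or with no flag for a full import. A minimal sketch of parsing that argument shape (`parseImportArgs` is a hypothetical helper, not the actual import.js code):

```javascript
// Hypothetical sketch of the "--file file1 file2 ..." CLI shape that
// Sync-FromNAS.ps1 passes to import.js; not the real import.js internals.
function parseImportArgs(argv) {
  const fileIdx = argv.indexOf('--file');
  if (fileIdx === -1) {
    // No --file flag: a full import over all data sources.
    return { mode: 'full', files: [] };
  }
  // Everything after --file is treated as a DAT file path.
  return { mode: 'incremental', files: argv.slice(fileIdx + 1) };
}

// Example: node import.js --file a.DAT b.DAT
console.log(parseImportArgs(['--file', 'a.DAT', 'b.DAT']));
```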
---

## Supported Log Types

| Log Type | Description | Format | Parser | Records |
|----------|-------------|--------|--------|---------|
| 5BLOG | 5B product line | Multi-line DAT | multiline | 425,378 |
| 7BLOG | 7B product line | CSV DAT | csvline | 262,404 |
| DSCLOG | DSC product line | Multi-line DAT | multiline | 181,160 |
| 8BLOG | 8B product line | Multi-line DAT | multiline | 135,858 |
| PWRLOG | Power tests | Multi-line DAT | multiline | 12,374 |
| VASLOG | VAS tests | Multi-line DAT | multiline | 10,327 |
| SCTLOG | SCT product line | Multi-line DAT | multiline | 3,439 |
| SHT | Test sheets | SHT format | shtfile | (varies) |
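Because every synced file lives under a `LOGS\[LOG_TYPE]\` directory, the `log_type` column can be read straight off the file path. A sketch (`logTypeFromPath` is a hypothetical helper; the real import script may determine the type differently):

```javascript
// Hypothetical helper: derive the log_type column from a synced DAT file's
// path, which on AD2 always contains ...\LOGS\<LOG_TYPE>\<file>.DAT.
function logTypeFromPath(datPath) {
  const parts = datPath.split(/[\\/]/);   // accept both separators
  const i = parts.indexOf('LOGS');
  return i >= 0 && i + 1 < parts.length ? parts[i + 1] : null;
}

console.log(logTypeFromPath('C:\\Shares\\test\\TS-4R\\LOGS\\8BLOG\\test.DAT')); // "8BLOG"
```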
---

## Project Structure

```
C:\Shares\testdatadb/
├── database/
│   ├── testdata.db          # SQLite database (1M+ records)
│   ├── import.js            # Import script (12,774 bytes)
│   └── schema.sql           # Database schema with FTS5
├── parsers/
│   ├── multiline.js         # Parser for multi-line DAT files
│   ├── csvline.js           # Parser for 7BLOG CSV format
│   └── shtfile.js           # Parser for SHT test sheets
├── public/
│   └── index.html           # Web search interface
├── routes/
│   └── api.js               # API endpoints
├── templates/
│   └── datasheet.js         # Datasheet generator
├── node_modules/            # Dependencies
├── package.json             # Node.js project file (342 bytes)
├── package-lock.json        # Dependency lock file (43,983 bytes)
├── server.js                # Express.js server (1,443 bytes)
├── QUICKSTART.md            # Quick start guide (1,019 bytes)
├── SESSION_NOTES.md         # Complete session notes (4,788 bytes)
└── start-server.bat         # Windows startup script (97 bytes)
```
---

## Web Interface

### Starting the Server
```bash
cd C:\Shares\testdatadb
node server.js
```
**Access:** http://localhost:3000 (on AD2)

### API Endpoints

**Search:**
```
GET /api/search?serial=...&model=...&from=...&to=...&result=...&q=...
```
- `serial` - Serial number (partial match)
- `model` - Model number (partial match)
- `from` - Start date (YYYY-MM-DD)
- `to` - End date (YYYY-MM-DD)
- `result` - PASS or FAIL
- `q` - Full-text search in raw data

**Record Details:**
```
GET /api/record/:id
```

**Generate Datasheet:**
```
GET /api/datasheet/:id
```

**Database Statistics:**
```
GET /api/stats
```

**Export to CSV:**
```
GET /api/export?format=csv
```
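The search parameters above compose into an ordinary query string. A sketch using Node's standard `URL`/`URLSearchParams` (`buildSearchUrl` is a hypothetical client-side helper, not part of the shipped web UI):

```javascript
// Sketch: build an /api/search URL from the documented parameters,
// letting URLSearchParams handle percent-encoding.
function buildSearchUrl(params, base = 'http://localhost:3000') {
  const url = new URL('/api/search', base);
  for (const [key, value] of Object.entries(params)) {
    if (value !== undefined && value !== '') url.searchParams.set(key, value);
  }
  return url.toString();
}

console.log(buildSearchUrl({ model: 'DSCA38', result: 'PASS', from: '2025-01-01' }));
// http://localhost:3000/api/search?model=DSCA38&result=PASS&from=2025-01-01
```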
---

## Database Statistics (as of 2026-01-13)

### Total Records
**1,030,940 records**

### Date Range
**1990 to November 2025** (35 years of test data)

### Pass/Fail Distribution
- **PASS:** 1,029,046 (99.82%)
- **FAIL:** 1,888 (0.18%)
- **UNKNOWN:** 6 (0.0006%)
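The percentages follow directly from the counts; a quick check:

```javascript
// Recompute the pass/fail percentages from the record counts above.
const passed = 1029046, failed = 1888, unknown = 6;
const total = passed + failed + unknown;            // 1,030,940

const pct = (n) => ((n / total) * 100).toFixed(2) + '%';
console.log(pct(passed), pct(failed));              // 99.82% 0.18%
```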
### Test Stations
TS-1L, TS-3R, TS-4L, TS-4R, TS-8R, TS-10L, TS-11L, and others
---

## Manual Operations

### Full Re-Import (if needed)
```bash
cd C:\Shares\testdatadb
del database\testdata.db
node database\import.js
```
**Duration:** ~30 minutes
**When needed:** Parser updates, schema changes, corruption recovery

### Incremental Import (single file)
```bash
node database\import.js --file C:\Shares\test\TS-4R\LOGS\8BLOG\test.DAT
```

### Incremental Import (multiple files)
```bash
node database\import.js --file file1.DAT file2.DAT file3.DAT
```
This is the method used by Sync-FromNAS.ps1.
---

## Integration with DOS Update System

### CTONW.BAT (v1.2+)
**Purpose:** Upload test data from DOS machines to the NAS

**Test Data Upload (lines 234-272):**
```batch
ECHO [3/3] Uploading test data to LOGS...

REM Create log subdirectories
IF NOT EXIST %LOGSDIR%\8BLOG\NUL MD %LOGSDIR%\8BLOG
IF NOT EXIST %LOGSDIR%\DSCLOG\NUL MD %LOGSDIR%\DSCLOG
IF NOT EXIST %LOGSDIR%\HVLOG\NUL MD %LOGSDIR%\HVLOG
IF NOT EXIST %LOGSDIR%\PWRLOG\NUL MD %LOGSDIR%\PWRLOG
IF NOT EXIST %LOGSDIR%\RMSLOG\NUL MD %LOGSDIR%\RMSLOG
IF NOT EXIST %LOGSDIR%\7BLOG\NUL MD %LOGSDIR%\7BLOG

REM Upload test data files to appropriate log folders
IF EXIST C:\ATE\8BDATA\NUL XCOPY C:\ATE\8BDATA\*.DAT %LOGSDIR%\8BLOG\ /Y /Q
IF EXIST C:\ATE\DSCDATA\NUL XCOPY C:\ATE\DSCDATA\*.DAT %LOGSDIR%\DSCLOG\ /Y /Q
IF EXIST C:\ATE\HVDATA\NUL XCOPY C:\ATE\HVDATA\*.DAT %LOGSDIR%\HVLOG\ /Y /Q
IF EXIST C:\ATE\PWRDATA\NUL XCOPY C:\ATE\PWRDATA\*.DAT %LOGSDIR%\PWRLOG\ /Y /Q
IF EXIST C:\ATE\RMSDATA\NUL XCOPY C:\ATE\RMSDATA\*.DAT %LOGSDIR%\RMSLOG\ /Y /Q
IF EXIST C:\ATE\7BDATA\NUL XCOPY C:\ATE\7BDATA\*.DAT %LOGSDIR%\7BLOG\ /Y /Q
```

**Target:** `T:\TS-4R\LOGS\8BLOG\` (on the NAS)

### Sync-FromNAS.ps1
**Schedule:** Every 15 minutes via Windows Task Scheduler
**PULL Operation (lines 157-213):**
1. Find new DAT files on the NAS (modified in the last 24 hours)
2. Copy to AD2: `C:\Shares\test\TS-XX\LOGS\[LOG_TYPE]\`
3. Import to the database via `import.js --file [files]`
4. Delete from the NAS after successful import

**Result:** Test data flows automatically from DOS → NAS → AD2 → Database
---

## Deduplication

**Unique Key:** (log_type, model_number, serial_number, test_date, test_station)

**Effect:** The same test result imported multiple times (from HISTLOGS, Recovery backups, and live data) is stored only once.

**Example:**
```
Record in HISTLOGS:                    DSCLOG, DSCA38-1793, 173672-1, 2025-02-15, TS-4R
Same record in Recovery-TEST/12-18-25: DSCLOG, DSCA38-1793, 173672-1, 2025-02-15, TS-4R
Result: Only 1 record in database
```
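The effect of the UNIQUE constraint can be sketched in miniature (`dedupe` is a hypothetical illustration; in the real system SQLite enforces this itself via the constraint):

```javascript
// Sketch of the UNIQUE-constraint effect: records agreeing on all five
// key fields collapse to one, regardless of which source supplied them.
const key = (r) =>
  [r.log_type, r.model_number, r.serial_number, r.test_date, r.test_station].join('|');

function dedupe(records) {
  const seen = new Map();
  for (const r of records) if (!seen.has(key(r))) seen.set(key(r), r);
  return [...seen.values()];
}

const rec = { log_type: 'DSCLOG', model_number: 'DSCA38-1793',
              serial_number: '173672-1', test_date: '2025-02-15', test_station: 'TS-4R' };
console.log(dedupe([rec, { ...rec }]).length); // 1
```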
---

## Search Examples

### By Serial Number
```
http://localhost:3000/api/search?serial=176923
```
Returns all records with serial numbers containing "176923"

### By Model Number
```
http://localhost:3000/api/search?model=DSCA38-1793
```
Returns all records for model DSCA38-1793

### By Date Range
```
http://localhost:3000/api/search?from=2025-01-01&to=2025-12-31
```
Returns all records tested in 2025

### By Pass/Fail Status
```
http://localhost:3000/api/search?result=FAIL
```
Returns all failed tests (1,888 records)

### Full-Text Search
```
http://localhost:3000/api/search?q=voltage
```
Searches within raw test data for "voltage"

### Combined Search
```
http://localhost:3000/api/search?model=DSCA38&result=PASS&from=2025-01-01
```
All passed tests for DSCA38 models since Jan 1, 2025
---

## Known Issues

### Model Number Parsing
- The parser was updated after the initial import
- To fix: delete testdata.db and re-run a full import
- Impact: model number searches may not be accurate for all records

### Performance
- Full import: ~30 minutes
- Database file: ~1,075 MB (1,127,362,560 bytes as of 2026-01-21)
- Search performance: very fast (indexed queries)
---

## Maintenance

### Regular Tasks
- **None required** - The database updates automatically via Sync-FromNAS.ps1

### Occasional Tasks
- **Re-import** - After parser updates or schema changes (delete testdata.db and re-run)
- **Backup** - Copy testdata.db to a backup location periodically

### Monitoring
- Check the Sync-FromNAS.ps1 log: `C:\Shares\test\scripts\sync-from-nas.log`
- Check sync status: `C:\Shares\test\_SYNC_STATUS.txt`
- View database stats: http://localhost:3000/api/stats (when the server is running)
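Staleness can also be checked programmatically: if the newest import is much older than the 15-minute sync cadence, the scheduled task is likely broken. A sketch (the `isSyncStale` helper and the one-day threshold are assumptions, not part of the sync script):

```javascript
// Sketch: flag a stale database given the newest import timestamp.
// The 1-day threshold is an assumed policy, not from Sync-FromNAS.ps1.
function isSyncStale(lastImportIso, nowIso, maxAgeDays = 1) {
  const ageMs = new Date(nowIso) - new Date(lastImportIso);
  return ageMs > maxAgeDays * 24 * 60 * 60 * 1000;
}

// The 2026-01-21 check found the last update on 2026-01-13: stale.
console.log(isSyncStale('2026-01-13T00:00:00Z', '2026-01-21T00:00:00Z')); // true
```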
---

## Original Use Case (2026-01-13)

**Request:** Search for serial numbers 176923-1 through 176923-26 for model DSCA38-1793

**Result:** **NOT FOUND** - these devices haven't been tested yet

**Most Recent Serials:** 173672-x, 173681-x (February 2025)

**Outcome:** Database created to enable easy searching of 1M+ test records going back to 1990
---

## Dependencies (from package.json)

- **better-sqlite3** - Fast synchronous SQLite bindings
- **express** - Web server framework
- **csv-writer** - CSV export functionality

Node.js itself is the runtime and must be installed on the server; it is not a package.json dependency.
---

## Connection Information

**Server:** AD2 (192.168.0.6)
**Database Path:** C:\Shares\testdatadb\database\testdata.db
**Web Server Port:** 3000 (http://localhost:3000 when running)
**Access:** Local only (on the AD2 server)

**To Access:**
1. SSH or RDP to AD2 (192.168.0.6)
2. Start the server: `cd C:\Shares\testdatadb && node server.js`
3. Open a browser: http://localhost:3000
---

## Related Documentation

**In ClaudeTools:**
- `CTONW_V1.2_CHANGELOG.md` - Test data routing to LOGS folders
- `DOS_DEPLOYMENT_STATUS.md` - Database import workflow
- `Sync-FromNAS-retrieved.ps1` - Complete sync script with database import
- `import-js-retrieved.js` - Complete import script
- `schema-retrieved.sql` - Database schema
- `QUICKSTART-retrieved.md` - Quick start guide
- `SESSION_NOTES-retrieved.md` - Complete session notes

**On AD2:**
- `C:\Shares\testdatadb\QUICKSTART.md`
- `C:\Shares\testdatadb\SESSION_NOTES.md`
- `C:\Shares\test\scripts\Sync-FromNAS.ps1`
---

**Created:** 2026-01-13
**Last Updated:** 2026-01-21
**Status:** Production - Operational
**Automation:** Complete (test data imports automatically every 15 minutes)