CNL-TN-2026-006: Macroscope Virtual Field Explorer
Technical Specification for an Immersive Ecological Research Platform
Document Type: Technical Note
Document ID: CNL-TN-2026-006
Version: 3.0
Date: January 18, 2026
Author: Dr. Michael P. Hamilton
Institution: Canemah Nature Laboratory
Status: Implementation Specification
AI Collaboration Disclosure: This technical specification was developed collaboratively with Claude (Anthropic, Opus 4.5), with the AI contributing to system architecture design, workflow specification, and technical documentation. Dr. Hamilton provided the conceptual vision, domain expertise, and integration requirements based on forty years of ecological sensor deployment and viewmap research.
Abstract
This document specifies the Macroscope Virtual Field Explorer (MVFE), an immersive web-based platform enabling students, researchers, and citizen scientists to conduct quantitative ecological investigations within 360° spherical environments captured at field stations worldwide. The system integrates existing archives of seasonal panoramic video from The Virtual Field network with on-demand 3D Gaussian splat generation via Apple's SHARP model, creating "measurement terrariums" that transform passive viewing into active spatial analysis.
The platform is built on a LAMP stack (Linux, Apache, MySQL, PHP) with vanilla JavaScript for frontend interactions and Python for computational backends. Users navigate a global constellation of field stations, immerse themselves in seasonal habitat documentation, deploy virtual instruments to extract quantitative measurements, and compare findings across ecosystems and seasons. The system supports screen-based interaction with future WebXR extension planned.
Key Design Principle: Pattern recognition remains with the human observer. The system provides measurement capability and contextual data; meaning emerges from the investigator's interpretive framework shaped by experience.
1. Introduction
1.1 Background
The Virtual Field project, funded by the National Science Foundation during the COVID-19 pandemic, assembled a network of field stations collecting standardized 360° video documentation at solstices and equinoxes. This archive now contains over 280 five-minute spherical videos spanning 70+ locations across 25+ field stations, from Costa Rican tropical forests to Montana conifer stands to Belgian heathlands.
These videos were conceived as passive educational resources—windows into ecosystems for remote learners. This specification describes a transformation: converting that passive archive into an active research instrument through integration with rapid 3D scene reconstruction technology.
1.2 The SHARP Capability
Apple's SHARP model (Single-image to High-fidelity 3D with Accurate shape and Realistic aPpearance; see reference 5) generates metric 3D Gaussian splat representations from single photographs in under one second. Previous work at Canemah Nature Laboratory (CNL-TN-2026-005) demonstrated that perspective views extracted from equirectangular 360° imagery can be processed through SHARP to create bounded 3D "terrariums"—measurement windows preserving depth relationships within each viewing frustum.
1.3 Design Philosophy
The Macroscope Virtual Field Explorer embodies four decades of viewmap research, from the original 1986 videodisc-based EXPLORER mode ("scan right, scan left, scan up, scan down, zoom to ecosystem") through contemporary neural radiance fields. The fundamental insight remains constant: ecosystems become comprehensible when made navigable. MVFE extends navigability from viewing to measurement, from observation to quantification.
1.4 Human-Centered Investigation
As articulated in "Observatories of Complexity: Two Macroscopes for the Biosphere and Noosphere" (Hamilton, 2026), the instruments surface data; meaning emerges from the observer's predictive model shaped by experience. A trained ecologist looking at a 2024 terrarium beside a 2021 terrarium will recognize a windthrow event. A student may need guidance to see it. Neither requires the system to flag it automatically—the visual evidence is present in the data; the interpretation is a human act.
The system computes; the human interprets.
2. System Architecture
2.1 Architectural Overview
┌─────────────────────────────────────────────────────────────────────┐
│ MACROSCOPE VIRTUAL FIELD EXPLORER │
│ Human-Driven Investigation Platform │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ USER-DRIVEN INVESTIGATION WORKFLOW │
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ GLOBE │───▶│ STATION │───▶│ IMMERSION │ │
│ │ INTERFACE │ │ PORTAL │ │ CHAMBER │ │
│ │ (Mapbox) │ │ (PHP) │ │ (Pannellum) │ │
│ └──────────────┘ └──────────────┘ └──────────────┘ │
│ │ │
│ ▼ │
│ ┌──────────────┐ │
│ │ TRICORDER │ │
│ │ Gaze Capture │ │
│ └──────────────┘ │
│ │ │
│ ┌──────────────────┼──────────────┐ │
│ ▼ ▼ ▼ │
│ ┌──────────┐ ┌──────────┐ ┌────────┐│
│ │ SINGLE │ │ TEMPORAL │ │ FIELD ││
│ │TERRARIUM │ │ COMPARE │ │NOTEBOOK││
│ └──────────┘ └──────────┘ └────────┘│
│ │ │ │ │
│ └────────┬─────────┘ │ │
│ ▼ │ │
│ ┌──────────────┐ │ │
│ │ MEASUREMENT │◀─────────────────┘ │
│ │ TOOLS │ │
│ └──────────────┘ │
│ │ │
│ ▼ │
│ ┌──────────────┐ │
│ │ EXPORT │ │
│ │ (CSV/JSON) │ │
│ └──────────────┘ │
│ │
├─────────────────────────────────────────────────────────────────────┤
│ BACKEND SERVICES │
│ ┌─────────────┐ ┌─────────────┐ ┌─────────────┐ ┌────────────┐│
│ │ SHARP │ │ iNaturalist│ │ Session │ │ Media ││
│ │ Engine │ │ API │ │ Store │ │ Archive ││
│ │ (Python) │ │ (Public) │ │ (MySQL) │ │ (Local) ││
│ └─────────────┘ └─────────────┘ └─────────────┘ └────────────┘│
└─────────────────────────────────────────────────────────────────────┘
2.2 Technology Stack
| Component | Technology | Function |
|---|---|---|
| Globe Interface | Mapbox GL JS | Global field station visualization and navigation |
| Station Portal | PHP + Mapbox GL JS | Site-level viewmap locations with metadata |
| Immersion Chamber | Pannellum | 360° spherical viewing |
| Field Notebook | Vanilla JS overlay | Habitat metadata, user notes, iNaturalist context |
| Tricorder | Vanilla JS + Canvas | Gaze-directed sampling and perspective extraction |
| SHARP Engine | PHP → Python pipeline | On-demand Gaussian splat generation |
| Terrarium View | WebGL2 splat renderer | Interactive 3D measurement environment |
| Session Store | MySQL + localStorage | User data persistence |
| Media Archive | Local filesystem | Equirectangular frames extracted from video |
2.3 Data Flow
User gaze in Pannellum
│
▼
┌───────────────────┐
│ Capture gaze: │
│ yaw, pitch, fov │
└─────────┬─────────┘
│
▼
┌───────────────────┐ ┌───────────────────┐
│ Extract perspec- │────▶│ SHARP inference │
│ tive from equi- │ │ (~1 second) │
│ rectangular │ └─────────┬─────────┘
└───────────────────┘ │
▼
┌───────────────────┐
│ PLY Gaussian │
│ splat file │
└─────────┬─────────┘
│
▼
┌───────────────────┐
│ Splat viewer │
│ (SAME viewpoint) │
└───────────────────┘
Critical Design Decision: The splat viewer initializes at the identical perspective (yaw, pitch, field of view) that the user was viewing when they triggered SCAN. The user should not have to reorient—they are looking at the same scene, now rendered in navigable 3D.
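The shape of that round trip, as a minimal sketch (assuming the CONFIG global and the openSplatViewer() helper described in Section 3.3; error handling omitted):

// Sketch of the scan round-trip; names follow Sections 3.3.4 and 3.3.6.
async function triggerScan(gaze) {
  const response = await fetch(CONFIG.sharpEndpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      source: CONFIG.panorama, // current equirectangular frame
      yaw: gaze.yaw,
      pitch: gaze.pitch,
      fov: gaze.hfov           // the endpoint expects "fov" (Section 5.5)
    })
  });
  const result = await response.json();
  if (result.success) {
    // The splat was generated from this exact gaze, so the viewer's
    // default front-facing camera (Section 5.6) shows the same scene.
    openSplatViewer(result.ply_url, result.metadata);
  }
}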
3. Modular Frontend Architecture
3.1 File Organization
The frontend has been refactored into modular components for maintainability:
Macroscope-Virtual-Field-Explorer/
├── index.php # HTML template + PHP config (~100 lines)
├── css/
│ └── explorer.css # All application styles (~530 lines)
├── js/
│ ├── splat-viewer.js # WebGL2 Gaussian splat renderer (~530 lines)
│ ├── pannellum.js # 360° viewer init & gaze capture (~60 lines)
│ ├── selection.js # Rectangle drawing & gaze conversion (~140 lines)
│ ├── panel.js # iPad panel drag/resize/minimize (~130 lines)
│ ├── measurement.js # Measure mode & calibration (~200 lines)
│ └── app.js # Main controller & scan API (~70 lines)
├── generate-terrarium.php # API endpoint for SHARP pipeline
├── run_sharp.py # Python pipeline wrapper
├── extract_perspective.py # Equirectangular projection
├── config.php # Configuration settings
├── 360_panos/ # Source equirectangular images
├── data/
│ └── splats/ # Generated PLY files
└── temp/ # Intermediate perspective images
3.2 Module Dependencies
index.php
│
├── css/explorer.css
│
└── JavaScript (load order matters)
│
├── splat-viewer.js # No dependencies (standalone WebGL2 class)
│
├── pannellum.js # Depends on: Pannellum CDN, CONFIG global
│
├── selection.js # Depends on: pannellum.js (viewer, currentGaze)
│ # app.js (triggerScan callback)
│
├── panel.js # Depends on: splat-viewer.js (SplatViewer class)
│
├── measurement.js # Depends on: splat-viewer.js (getPointAt method)
│ # panel.js (splatViewer instance)
│
└── app.js # Depends on: All above modules
# CONFIG global from PHP
3.3 Module Specifications
3.3.1 splat-viewer.js
WebGL2 Gaussian splat renderer. Standalone class with no external dependencies.
Class: SplatViewer
class SplatViewer {
constructor(canvasId, plyUrl) // Initialize and optionally load PLY
loadPLY(url) // Async load and parse PLY file
parsePLY(buffer) // Parse binary PLY to typed arrays
uploadData() // Upload geometry to GPU
sortSplats() // Back-to-front depth sort
render() // Main render loop
resize() // Handle canvas resize
resetOrbit() // Reset camera to default position
getPointAt(screenX, screenY) // Raycast to find Gaussian at screen point
start() / stop() // Control render loop
}
Exports: window.SplatViewer
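Typical usage, sketched under the assumption of a canvas element with id terrarium-canvas (the actual element id is defined in index.php):

const splatViewer = new SplatViewer('terrarium-canvas', 'data/splats/example.ply');
splatViewer.start();                                   // begin render loop
window.addEventListener('resize', () => splatViewer.resize());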
3.3.2 pannellum.js
Pannellum 360° viewer initialization and gaze state management.
Functions:
- initPannellum() — Initialize viewer with CONFIG.panorama
- updateGazeDisplay() — Update UI with current yaw/pitch/fov
- formatAngle(angle) — Normalize angle to 0-360 range
- resetPannellumView() — Reset to default orientation
Exports: viewer, currentGaze, initPannellum, updateGazeDisplay, formatAngle, resetPannellumView
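For illustration, formatAngle() can be as simple as the following sketch (assuming Pannellum's -180 to 180 yaw range and a 0-360 compass readout):

function formatAngle(angle) {
  // Map any angle into the 0-360 range for the compass display
  return ((angle % 360) + 360) % 360;
}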
3.3.3 selection.js
Rectangle selection for region-based terrarium generation.
Functions:
- enterSelectionMode() / exitSelectionMode() — Mode management
- handleSelectionMouseDown/Move/Up(e) — Mouse event handlers
- updateSelectionRect() — Update visual rectangle
- selectionToGaze(start, end, width, height) — Convert screen rect to gaze params
- initSelection() — Attach event listeners
Exports: selectionMode, enterSelectionMode, exitSelectionMode, selectionToGaze, initSelection
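A plausible sketch of selectionToGaze(), assuming a linear angular mapping across the viewport (adequate for modest rectangles near screen center; the real module may correct for rectilinear distortion):

function selectionToGaze(start, end, width, height) {
  const hfov = currentGaze.hfov;               // from pannellum.js
  const vfov = hfov * (height / width);        // approximate vertical FOV
  const cx = (start.x + end.x) / 2 / width;    // rect center, 0..1
  const cy = (start.y + end.y) / 2 / height;
  return {
    yaw:   currentGaze.yaw   + (cx - 0.5) * hfov,
    pitch: currentGaze.pitch - (cy - 0.5) * vfov,
    hfov:  (Math.abs(end.x - start.x) / width) * hfov
  };
}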
3.3.4 panel.js
Floating iPad panel management for the terrarium viewer.
Functions:
- openSplatViewer(plyUrl, metadata) — Show panel and load PLY
- closeSplatViewer() — Hide panel and stop rendering
- toggleMinimize() — Minimize/restore panel
- initPanelDrag() / initPanelResize() — Drag and resize setup
- initPanel() — Initialize all panel functionality
Exports: splatViewer, openSplatViewer, closeSplatViewer, toggleMinimize, initPanel
3.3.5 measurement.js
Point-to-point measurement tools with calibration.
State:
- measureMode — Boolean, measurement active
- measurePoints — Array of clicked points
- scaleFactor — Inches per SHARP unit
- scaleCalibrated — Boolean, calibration set
Functions:
- enterMeasureMode() / exitMeasureMode() — Mode management
- addMeasurePoint(point3D, screenX, screenY) — Add clicked point
- drawMeasureLine(p1, p2) — Draw visual line between points
- clearMeasureMarkers() — Remove all markers
- calibrateScale() — Set scale from known distance
- resetCalibration() — Clear calibration
- initMeasurement() — Attach event listeners
Exports: All state variables and functions
3.3.6 app.js
Main application controller. Initializes all modules and handles API communication.
Functions:
- triggerScan(gaze) — Send gaze to API, open result in viewer
- initApp() — Initialize all modules, wire event handlers
Expects: CONFIG global with {panorama, sharpEndpoint}
4. User Experience Workflow
4.1 Navigation Hierarchy
GLOBAL ──▶ STATION ──▶ SPHERE ──▶ GAZE ──▶ TERRARIUM ──▶ MEASURE
4.2 Detailed Workflow
Phase 1: Global Orientation
User arrives at a 3D globe with field station markers at geographic coordinates. Clicking a marker or searching by name flies to station location.
Phase 2: Station Portal
The station view presents:
- Station header with name, institution, links
- Site map with viewmap markers
- Viewmap table with ecosystem metadata
- Seasonal navigator showing available captures (typically 4 per location)
User selects a viewmap location and season.
Phase 3: Immersion Chamber
User enters the 360° sphere via Pannellum viewer:
- Full equirectangular panorama with drag navigation
- Compass rose for orientation
- Time badge showing capture date/season
- Tool palette: Tricorder, Notebook (minimized by default)
Phase 4: Gaze Capture (Tricorder)
User looks at a feature of interest and activates SCAN:
1. System captures current gaze state:
   - yaw: azimuth (-180° to +180°)
   - pitch: elevation (-90° to +90°)
   - hfov: horizontal field of view (user's current zoom)
2. Visual indicator shows extraction bounds (rectangular overlay)
3. User confirms: "Generate Terrarium"
Phase 5: Terrarium Generation
Backend processing (roughly 12-30 seconds end to end, depending on acceleration; see Section 7.3):
- PHP endpoint receives gaze parameters
- Python script extracts perspective image from equirectangular
- SHARP inference generates Gaussian splat (~10s with MPS on M4 Max; ~27s CPU-only under Apache)
- PLY file saved to static directory
- Response returns PLY URL and metadata
Phase 6: Terrarium Exploration
Critical: Splat viewer opens at the SAME viewpoint:
- Camera position matches original gaze direction
- User sees the same scene, now in 3D
- No reorientation required
From this starting point, user can:
- Orbit around the scene
- Zoom in/out
- Pan to reframe
- Activate measurement tools
Phase 7: Measurement
Available tools:
- Point-to-point distance: Click two locations, see distance
- Height estimation: Click base and top, see vertical extent
- Depth profile: Histogram of point depths in view
All measurements logged to Field Notebook with:
- Value and units
- Gaze direction at capture
- Timestamp
- Optional user annotation
Phase 8: Temporal Comparison
User selects a different season at same location:
- Season selector shows available captures
- Previous gaze direction is preserved
- User generates terrarium at identical direction
- Split-view comparison: Season A | Season B
- Differential metrics computed:
- Canopy closure change
- Sky view factor change
- Point density by height band
The human recognizes patterns. The system provides quantified evidence.
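As one example of a differential metric, point density by height band reduces to a histogram over splat heights; a sketch, assuming positions have already been transformed to a y-up frame on a common scale:

function heightHistogram(positions, bandSize = 0.25, maxHeight = 10) {
  // positions: Float32Array of xyz triples from the parsed PLY
  const bins = new Array(Math.ceil(maxHeight / bandSize)).fill(0);
  for (let i = 0; i < positions.length; i += 3) {
    const h = positions[i + 1];                  // height above ground
    if (h >= 0 && h < maxHeight) bins[Math.floor(h / bandSize)]++;
  }
  return bins; // compare bin-by-bin between Season A and Season B
}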
Phase 9: Export
User exports findings:
- CSV: raw measurements with metadata
- JSON: structured session data
- Screenshot: current terrarium view with annotations
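CSV assembly is straightforward given the measurement schema in Section 5.8; a minimal sketch:

function measurementsToCSV(measurements) {
  const header = 'id,type,value,unit,yaw,pitch,timestamp,annotation';
  const rows = measurements.map(m => [
    m.id, m.type, m.value, m.unit,
    m.gaze.yaw, m.gaze.pitch, m.timestamp,
    JSON.stringify(m.annotation ?? '')  // quote free-text annotations
  ].join(','));
  return [header, ...rows].join('\n');
}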
5. Technical Specifications
5.1 Globe Interface
Technology: Mapbox GL JS v3+
Implementation: globe.php with embedded JavaScript
mapboxgl.accessToken = 'your-token';
const map = new mapboxgl.Map({
container: 'map',
style: 'mapbox://styles/mapbox/satellite-streets-v12',
projection: 'globe',
center: [-122.6, 45.5],
zoom: 2
});
// Load station markers from PHP-generated GeoJSON
map.on('load', () => {
map.addSource('stations', {
type: 'geojson',
data: '/api/stations.php'
});
// ... marker layer configuration
});
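The elided marker layer might look like the following (styling values are illustrative assumptions, placed inside the 'load' handler above):

map.addLayer({
  id: 'station-markers',
  type: 'circle',
  source: 'stations',
  paint: {
    'circle-radius': 6,
    'circle-color': '#2e8b57',
    'circle-stroke-width': 1.5,
    'circle-stroke-color': '#ffffff'
  }
});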
5.2 Station Portal
Technology: PHP with Mapbox embed
Files:
- station.php?id=X — Station detail page
- api/station-data.php?id=X — JSON endpoint for station metadata
- api/viewmaps.php?station=X — JSON endpoint for viewmap list
5.3 Immersion Chamber
Technology: Pannellum (vanilla JS library)
Configuration (from pannellum.js):
const viewer = pannellum.viewer('panorama', {
type: 'equirectangular',
panorama: CONFIG.panorama,
autoLoad: true,
compass: true,
hfov: 90,
minHfov: 50,
maxHfov: 120,
showControls: false
});
Gaze Extraction:
function updateGazeDisplay() {
currentGaze = {
yaw: viewer.getYaw(),
pitch: viewer.getPitch(),
hfov: viewer.getHfov()
};
}
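Pannellum exposes no dedicated gaze event, so one workable wiring (an assumption, not the module's documented behavior) is to refresh the gaze state on interaction events:

viewer.on('mouseup', updateGazeDisplay);    // after drag navigation
viewer.on('touchend', updateGazeDisplay);   // after touch navigation
viewer.on('zoomchange', updateGazeDisplay); // after hfov changes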
5.4 Perspective Extraction
Technology: Python (called by PHP via shell_exec)
The perspective extraction uses equirectangular-to-rectilinear projection. Unlike cubemap extraction (fixed 90° faces at cardinal directions), this extracts an arbitrary perspective at the user's exact gaze direction and field of view.
Key function (extract_perspective.py):
def extract_perspective(equirect_path, yaw, pitch, hfov, output_size=1024):
"""
Extract perspective view from equirectangular image.
Args:
equirect_path: Path to equirectangular source image
yaw: Horizontal angle in degrees (-180 to 180)
pitch: Vertical angle in degrees (-90 to 90)
hfov: Horizontal field of view in degrees
output_size: Output image dimension (square)
Returns:
Perspective image as numpy array
"""
5.5 SHARP Engine
Technology: PHP endpoint → Python → SHARP model
Endpoint: generate-terrarium.php
Request:
POST /generate-terrarium.php
Content-Type: application/json
{
"source": "360_panos/360_CCCELC001_fresh_20220922-01.jpg",
"yaw": 45.2,
"pitch": 12.5,
"fov": 90
}
Response:
{
"success": true,
"ply_url": "data/splats/360_CCCELC001_fresh_20220922-01_45.2_12.5_1705592400.ply",
"metadata": {
"source": "360_panos/360_CCCELC001_fresh_20220922-01.jpg",
"gaze": {
"yaw": 45.2,
"pitch": 12.5,
"fov": 90
},
"generation_time": 27.51,
"file_size": 66061086
}
}
5.6 Terrarium Viewer
Technology: WebGL2 Gaussian Splat Renderer (splat-viewer.js)
Critical Feature: Camera initialization matches source gaze.
// From SplatViewer.loadPLY(): the default orbit faces the splat's
// native forward axis, which is the extraction gaze by construction.
this.orbit.theta = Math.PI;   // azimuth: face the scene front-on
this.orbit.phi = Math.PI / 2; // elevation: level with the horizon
this.orbit.radius = 1.5;      // pull back slightly from the origin
5.7 Measurement Tools
Point-to-point distance (from measurement.js):
function addMeasurePoint(point3D, screenX, screenY) {
  measurePoints.push({ point3D, screenX, screenY });
  if (measurePoints.length === 2) {
    const p1 = measurePoints[0].point3D;
    const p2 = measurePoints[1].point3D;
    const dx = p2.x - p1.x;
    const dy = p2.y - p1.y;
    const dz = p2.z - p1.z;
    const rawDistance = Math.sqrt(dx*dx + dy*dy + dz*dz);
    // Report in SHARP units unless a calibration has been set;
    // displayDist and unit feed the measurement readout UI
    let displayDist = rawDistance.toFixed(2);
    let unit = 'SHARP units';
    if (scaleCalibrated) {
      displayDist = (rawDistance * scaleFactor).toFixed(2);
      unit = 'inches';
    }
  }
}
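The calibration half of the workflow is equally small; a sketch, assuming the user has just measured an object of known real-world size:

function calibrateScale(rawDistance, knownInches) {
  // rawDistance: the two-point measurement in SHARP units
  // knownInches: real-world size entered by the user
  scaleFactor = knownInches / rawDistance; // inches per SHARP unit
  scaleCalibrated = true;
}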
5.8 Session Persistence
Technology: localStorage (browser) + MySQL (optional server-side)
localStorage schema:
{
sessionId: "uuid",
currentStation: "elc",
currentViewmap: "forest-pond",
measurements: [
{
id: "m1",
type: "distance",
value: 12.3,
unit: "m",
gaze: { yaw: 45.2, pitch: 12.5 },
timestamp: "2026-01-18T10:30:00Z",
annotation: "Windthrow trunk length"
}
],
notes: [
{
viewmap: "forest-pond",
season: "winter-2024",
text: "Canopy gap visible in NNW quadrant"
}
]
}
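Persistence against this schema needs only a thin wrapper; a sketch (the 'mvfe-session' storage key is an assumption, not specified elsewhere):

function saveSession(session) {
  localStorage.setItem('mvfe-session', JSON.stringify(session));
}

function loadSession() {
  const raw = localStorage.getItem('mvfe-session');
  return raw ? JSON.parse(raw) : null;
}

function logMeasurement(measurement) {
  const session = loadSession() ??
    { sessionId: crypto.randomUUID(), measurements: [], notes: [] };
  session.measurements.push(measurement);
  saveSession(session);
}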
6. Data Architecture
6.1 Data Assets
The Virtual Field Archive:
- 280+ YouTube videos (public domain)
- 5 minutes duration each, equirectangular 360°
- 4 seasons × ~70 locations
- Primarily 2021-2022 captures
ELC Local Archive:
- Original MP4 files from Dr. Hamilton's contributions
- Full resolution, immediate access
- 4 seasons: Winter 2021, Spring 2022, Summer 2022, Fall 2022
Frame Extraction: Frames extracted from videos using ffmpeg:
ffmpeg -ss 00:02:30 -i video.mp4 -frames:v 1 -q:v 2 frame.jpg
The 2:30 timestamp was chosen for consistent conditions: after the camera has settled, well before the end of the five-minute clip.
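For the batch job anticipated in Phase 4, the same command wraps cleanly in a small Node.js loop (the file list and naming scheme below are illustrative):

// Sketch: batch frame extraction wrapping the ffmpeg command above.
const { execFileSync } = require('node:child_process');
const videos = ['winter.mp4', 'spring.mp4', 'summer.mp4', 'fall.mp4'];
for (const video of videos) {
  execFileSync('ffmpeg', [
    '-ss', '00:02:30', '-i', video,
    '-frames:v', '1', '-q:v', '2', '-y',
    video.replace('.mp4', '.jpg')
  ]);
}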
6.2 Database Schema
-- Field stations
CREATE TABLE stations (
id INT AUTO_INCREMENT PRIMARY KEY,
name VARCHAR(255) NOT NULL,
institution VARCHAR(255),
latitude DECIMAL(10, 7),
longitude DECIMAL(10, 7),
website VARCHAR(255),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- Viewmap locations within stations
CREATE TABLE viewmaps (
id INT AUTO_INCREMENT PRIMARY KEY,
station_id INT NOT NULL,
name VARCHAR(255) NOT NULL,
description TEXT,
ecosystem VARCHAR(100),
aquatic_features BOOLEAN DEFAULT FALSE,
latitude DECIMAL(10, 7),
longitude DECIMAL(10, 7),
FOREIGN KEY (station_id) REFERENCES stations(id)
);
-- Seasonal captures
CREATE TABLE captures (
id INT AUTO_INCREMENT PRIMARY KEY,
viewmap_id INT NOT NULL,
season ENUM('winter', 'spring', 'summer', 'fall') NOT NULL,
capture_date DATE,
video_url VARCHAR(500),
frame_path VARCHAR(255),
FOREIGN KEY (viewmap_id) REFERENCES viewmaps(id)
);
-- User sessions (optional server-side persistence)
CREATE TABLE sessions (
id VARCHAR(36) PRIMARY KEY,
data JSON,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);
-- Generated terrariums (cache)
CREATE TABLE terrariums (
id INT AUTO_INCREMENT PRIMARY KEY,
capture_id INT NOT NULL,
yaw DECIMAL(6, 2),
pitch DECIMAL(6, 2),
hfov DECIMAL(5, 2),
ply_path VARCHAR(255),
num_gaussians INT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (capture_id) REFERENCES captures(id),
UNIQUE KEY (capture_id, yaw, pitch, hfov)
);
6.3 External APIs
iNaturalist:
GET https://api.inaturalist.org/v1/observations
?lat={latitude}
&lng={longitude}
&radius=1
&quality_grade=research
&per_page=200
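From the Field Notebook overlay, that query is a single fetch; a sketch:

async function fetchNearbyObservations(lat, lng) {
  const url = new URL('https://api.inaturalist.org/v1/observations');
  url.search = new URLSearchParams({
    lat, lng, radius: 1, quality_grade: 'research', per_page: 200
  });
  const response = await fetch(url);
  const data = await response.json();
  return data.results; // observation records for the Notebook context
}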
Mapbox:
- Globe rendering
- Satellite imagery
- Marker clustering
7. Implementation Status
7.1 Operational Components
| Component | File(s) | Status | Notes |
|---|---|---|---|
| Perspective Extraction | extract_perspective.py | ✓ Working | Arbitrary yaw/pitch/fov from equirectangular |
| SHARP Pipeline Wrapper | run_sharp.py | ✓ Working | CPU mode, ~27s inference time |
| PHP API Endpoint | generate-terrarium.php | ✓ Working | Proper error handling, JSON responses |
| Pannellum Integration | js/pannellum.js | ✓ Working | 360° navigation, gaze capture functional |
| Region Selection UI | js/selection.js | ✓ Working | Rectangle drawing, gaze-to-FOV conversion |
| iPad Panel | js/panel.js | ✓ Working | Drag, resize, minimize |
| Measurement Tools | js/measurement.js | ✓ Working | Point-to-point, calibration, recalibration |
| Modular CSS | css/explorer.css | ✓ Working | All styles extracted |
| Application Controller | js/app.js | ✓ Working | Module wiring, scan API |
7.2 Components Requiring Development
| Component | Status | Issue |
|---|---|---|
| Custom WebGL2 Viewer | Rendering errors | Coordinate system mismatch, depth sorting |
| GaussianSplats3D Integration | No scene rendered | Library initialization failure |
| Temporal Comparison | Not started | Requires working viewer first |
| Multi-Station Navigation | Not started | Phase 4 of roadmap |
7.3 Technical Findings
SHARP Output Characteristics:
- Coordinate System: OpenCV convention (x right, y down, z forward)
- Camera Position: Origin, looking down +Z axis
- Output Format: PLY with positions, scales, rotations, spherical harmonics
- Typical Output: ~1.18 million Gaussians, ~63 MB PLY file
- Inference Time: ~10s on M4 Max with MPS, ~27s on CPU
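A likely first step in resolving the Section 7.2 coordinate mismatch is converting SHARP's OpenCV frame to the WebGL convention (x right, y up, z toward the viewer); a sketch of the position transform (the corresponding adjustment to Gaussian rotations is not shown):

function opencvToWebGL(positions) {
  // positions: Float32Array of xyz triples in OpenCV convention
  for (let i = 0; i < positions.length; i += 3) {
    positions[i + 1] = -positions[i + 1]; // y down    -> y up
    positions[i + 2] = -positions[i + 2]; // z forward -> z toward viewer
  }
  return positions;
}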
Critical Constraint: Single-image splats provide reliable depth only within a bounded viewing frustum. Sky and distant elements lack depth information and produce artifacts when viewed from significantly different angles.
Apache Environment Configuration:
env['PYTHONPATH'] = '/path/to/ml-sharp/src'
env['MPLCONFIGDIR'] = '/tmp/matplotlib'
env['TORCH_HOME'] = '/path/to/project/cache/torch'
env['HOME'] = '/path/to/project/cache'
GPU Access: MPS (Metal Performance Shaders) acceleration is not available from Apache's sandboxed context. CPU inference is required for web-triggered generation.
7.4 Known Issues and Workarounds
Viewer Rendering:
- Issue: Custom WebGL2 viewer produces distorted or black output
- Workaround: Generated PLY files can be verified using SuperSplat (https://playcanvas.com/supersplat/editor)
CPU-Only Inference:
- Issue: Apache cannot access MPS acceleration
- Impact: Generation time ~27s vs ~10s with MPS
- Acceptable for proof of concept; production optimization possible via dedicated inference service
8. Pattern Recognition Philosophy
8.1 What the System Does
The Macroscope Virtual Field Explorer provides:
- Navigation to archived 360° captures
- On-demand 3D terrarium generation
- Measurement tools for quantification
- Temporal comparison capability
- Data export for analysis
8.2 What the System Does Not Do
The system does NOT provide:
- Automated anomaly detection
- Pattern flagging without user prompting
- Interpretation of structural changes
- AI-generated ecological assessments
8.3 Why This Matters
The Virtual Field archive provides discrete seasonal snapshots—four captures per year per location. This temporal resolution enables seasonal comparison and interannual comparison, but not the continuous monitoring that would support real-time anomaly detection.
More fundamentally, ecological interpretation requires the predictive model that a trained observer brings. A 29% drop in canopy closure means something to someone who has seen windthrow events before. The numbers alone are not the insight.
8.4 Future Extension: Visual Pattern Detection
Current experiments at Canemah Nature Laboratory with YOLOv8 are developing automated visual pattern detection for live 360° camera feeds. When mature, this capability could extend to:
- Real-time species detection in panoramic streams
- Motion and activity pattern recognition
- Integration with STRATA context generation
This remains experimental and is not included in the current implementation scope.
8.5 Connection to MEO Patterns
The existing MEO patterns infrastructure (macroscope.earth/?page=patterns) demonstrates automated detection for continuous time-series data:
- Environmental correlations from Tempest weather station
- Biological rhythms from BirdWeather detections
- Anomaly flagging based on statistical deviation
The architectural pattern is established. Extending to visual/structural data from 360° imagery would require batch processing the archive to compute metrics—feasible but outside current scope.
9. Implementation Roadmap
Phase 1: Core Loop Proof of Concept (COMPLETE)
Objective: Demonstrate gaze-to-terrarium workflow with single ELC capture
Deliverables:
- [x] index.php: Pannellum viewer with ELC winter frame
- [x] Gaze capture UI (vanilla JS, display yaw/pitch/fov)
- [x] SCAN button triggers perspective extraction
- [x] generate-terrarium.php: PHP endpoint calling Python
- [x] extract_perspective.py: Arbitrary gaze extraction
- [x] run_sharp.py: SHARP inference wrapper
- [x] Splat viewer integration with gaze-matched camera init
- [x] Round-trip: look → scan → terrarium (same view) → return
- [x] Modular refactoring into separate JS/CSS files
Test Data: ELC equirectangular frames (all four seasons available)
Phase 2: Measurement Tools (IN PROGRESS)
Objective: Point-to-point distance measurement in terrarium
Deliverables:
- [x] Click-to-3D raycasting in WebGL viewer
- [x] Two-point distance tool with display
- [x] Calibration from known distance
- [x] Recalibration capability
- [x] Measurement logging to UI
- [ ] Height estimation tool (vertical component)
- [ ] Measurement logging to localStorage
Success Criteria: User can measure distance between two clicked points with real-world units
Phase 3: Temporal Comparison (2 weeks)
Objective: Side-by-side seasonal terrariums with differential metrics
Data Required:
- ELC Winter 2021 and Winter 2024 frames (or available years)
Deliverables:
- [ ] Season selector UI in sphere view
- [ ] Gaze direction preservation across season switch
- [ ] Dual terrarium display (split view)
- [ ] Differential metrics computation:
- Sky view factor change
- Point density by height band
- [ ] Visual comparison overlay
Success Criteria: User compares two seasons at same gaze direction, sees quantified change
Phase 4: Multi-Station Navigation (2-3 weeks)
Objective: Globe → Station → Sphere navigation for Virtual Field archive
Data Required:
- Station metadata (coordinates, names, institutions)
- Viewmap inventory per station
- Frame extraction from YouTube archive (batch ffmpeg job)
Deliverables:
- [ ] globe.php: Mapbox globe with station markers
- [ ] station.php: Station portal with viewmap list
- [ ] MySQL tables populated with Virtual Field metadata
- [ ] Frame extraction pipeline for 5-10 stations
Success Criteria: User navigates from globe to any implemented station to sphere
Phase 5: Field Notebook + Export (2 weeks)
Objective: Session continuity and data export
Deliverables:
- [ ] Notebook UI overlay (habitat metadata, species context)
- [ ] iNaturalist API integration for location-based observations
- [ ] Measurement logging with annotation
- [ ] User notes per viewmap/season
- [ ] CSV export of measurements
- [ ] JSON export of session data
Success Criteria: Measurements persist across page reloads, can be exported
Phase 6: WebXR Integration (Optional, 3-4 weeks)
Objective: VR support for Quest 3, Vision Pro
Deliverables:
- [ ] WebXR session management
- [ ] Spatial UI adaptations for Pannellum
- [ ] Controller input handling
- [ ] Splat viewer in VR mode
Success Criteria: Full workflow functional in VR headset
10. Educational Integration
10.1 Alignment with Virtual Field Explorer Guides
The Virtual Field project includes Explorer Guides for skill development:
- Write Field Notes
- Sketch What You See
- Learn to Ask Questions
MVFE extends these with quantitative capabilities:
- Measure What You See: Distance, height, structural metrics
- Compare Across Time: Seasonal change quantification
- Compare Across Space: Cross-ecosystem structural analysis
10.2 Curriculum Pathways
K-8:
- Guided exploration of single ecosystem
- Simple observation logging in Field Notebook
- "What do you notice?" prompts
High School:
- Multi-ecosystem comparison
- Basic measurement collection
- Seasonal change documentation
- Data export for analysis
University:
- Full quantitative workflow
- Hypothesis-driven investigation
- Statistical comparison across sites
- Research report generation
Citizen Science:
- Contribution to phenology monitoring
- Distributed measurement campaigns
- Community science integration
11. Deployment Architecture
11.1 Development Environment
┌─────────────────────────────────────────────────────────────┐
│ DEVELOPMENT (Data - M4 Max) │
├─────────────────────────────────────────────────────────────┤
│ Web Server: Apache (WebMon managed) │
│ PHP: 8.3+ via /opt/homebrew/bin/php │
│ Database: MySQL 8.4+ (phpMyAdmin) │
│ Python: 3.x with SHARP dependencies │
│ SHARP: MPS acceleration (Apple Silicon) │
│ Storage: Local filesystem │
│ URL: localhost or local network │
└─────────────────────────────────────────────────────────────┘
11.2 Production Environment
┌─────────────────────────────────────────────────────────────┐
│ PRODUCTION (Galatea - M4 Pro) │
├─────────────────────────────────────────────────────────────┤
│ Web Server: Apache │
│ PHP: 8.3+ │
│ Database: MySQL 8.4+ │
│ Python: 3.x with SHARP (or offload to Sauron) │
│ Storage: Local NAS │
│ Network: 1Gb fiber │
│ URL: macroscope.earth or subdomain │
└─────────────────────────────────────────────────────────────┘
11.3 Computation Offload (Optional)
┌─────────────────────────────────────────────────────────────┐
│ COMPUTATION (Sauron - Intel i9) │
├─────────────────────────────────────────────────────────────┤
│ SHARP inference with GPU acceleration │
│ Batch processing of archive │
│ API endpoint for Galatea to call │
└─────────────────────────────────────────────────────────────┘
12. Connection to Macroscope Philosophy
12.1 Two Macroscopes
As described in "Observatories of Complexity" (Hamilton, 2026), the Macroscope Virtual Field Explorer represents the biosphere observatory—extending human perception to scales beyond unaided capacity. The archive of 280+ seasonal captures across 25+ field stations provides a view into ecological complexity that no individual field naturalist could survey.
The companion observatory—the noosphere Macroscope—is represented by the collaborative intelligence between human observer and AI system. The language model can surface patterns in how humanity has collectively understood ecosystems; the human observer brings the predictive model that recognizes what those patterns mean.
12.2 The Instruments Serve the Observation
The sensor feeds, the pattern recognition, the 3D reconstructions—all produce calibration signals. The controlled hallucination remains with the scientist. The Macroscope doesn't see ecosystems; it provides data the ecologist's brain uses to update its model.
A forest canopy can tear open. The terrarium shows the gap. The trained observer recognizes windthrow. The instruments surface the data; meaning emerges from the observer.
13. References
1. Hamilton, M.P. & Lassoie, J.P. (1986). The Macroscope: A Videodisc Atlas for Ecology Education. Cornell University.
2. Hamilton, M.P. (2026). "Virtual Terrariums: When a Failed Hypothesis Becomes a Better Instrument." Coffee with Claude. CNL-TN-2026-005.
3. Hamilton, M.P. (2026). "Observatories of Complexity: Two Macroscopes for the Biosphere and Noosphere." Coffee with Claude. https://coffeewithclaude.com/post.php?slug=observatories-of-complexity-two-macroscopes-for-the-biosphere-and-noosphere
4. The Virtual Field Project. (2021-present). 360-Degree Seasonal Videos. https://thevirtualfield.org/360-degree-seasonal-videos/
5. Apple Machine Learning Research. (2025). SHARP: Single-image to High-fidelity 3D with Accurate shape and Realistic aPpearance. https://github.com/apple/ml-sharp
6. Hamilton, M.P. (2025). MEO Spatial Intelligence Framework v2.0. MacroNexus Ecological Observatory.
7. Pannellum. (2024). Lightweight Panorama Viewer for the Web. https://pannellum.org/
8. Kerbl, B. et al. (2023). 3D Gaussian Splatting for Real-Time Radiance Field Rendering. SIGGRAPH 2023.
9. Seth, A.K. (2021). Being You: A New Science of Consciousness. Dutton.
Document History
| Version | Date | Author | Changes |
|---|---|---|---|
| 1.0 | 2026-01-18 | M. Hamilton / Claude | Initial draft specification |
| 2.0 | 2026-01-18 | M. Hamilton / Claude | Revised for LAMP stack, honest scope, human-centered investigation, gaze-matched viewer |
| 3.0 | 2026-01-18 | M. Hamilton / Claude | Consolidated with Addendum A; documented modular frontend architecture (css/, js/ separation); updated implementation status |
This document is part of the Canemah Nature Laboratory Technical Note series.
Permanent URL: https://canemah.org/archive/document.php?id=CNL-TN-2026-009