CNL-TN-2026-043 Technical Note

STRATA 2.0: Distributed Intelligence Architecture



From Monitoring Engine to Distributed Research Service

Canemah Nature Laboratory Technical Note Series

Document ID: CNL-TN-2026-043
Version: 1.0
Date: April 7, 2026
Author: Michael P. Hamilton, Ph.D.
Affiliation: Canemah Nature Laboratory, Oregon City, Oregon
Supersedes: CNL-TN-2026-043 v0.2 (Draft, April 6, 2026)


AI Assistance Disclosure: This document was developed collaboratively with Claude (Anthropic, claude-opus-4-6) via Cowork. Claude contributed to architectural analysis, codebase study, capability mapping, distributed systems design, and document drafting. The author takes full responsibility for the content, accuracy, and conclusions.


Abstract

STRATA 2.0 evolves the existing STRATA intelligence engine into a distributed research service spanning the Canemah Nature Laboratory's Tailscale mesh network. The current STRATA system on Galatea (Mac Mini M4 Pro) provides 13 temporal micro-agents, 7 context builders, 25 tool-calling endpoints, a 4-domain privacy model, and a personality system for narrative interpretation of Macroscope sensor streams. Analysis of this codebase reveals a clean four-layer architecture with minimal coupling to the web application, making extraction into a standalone service architecturally straightforward.

The distributed architecture uses a hybrid approach: a shared PHP library on Galatea for co-located projects (MNG, production services), wrapped in a thin HTTP API for remote access by Data and other nodes over Tailscale. A plugin dispatch layer extends this across the mesh network, routing intelligence requests to specialized nodes: Galatea for data authority and cloud AI, Data for development and local inference via Ollama, Sauron for GPU compute (3DGS, ML training), and Hogwarts for vision processing (YOLO, camera services).

STRATA_Bench, the structured investigation lab bench built on a 14-table strata_db schema with a seven-phase scientific workflow, becomes the primary consumer of this distributed service. The lab notebook captures every operation as a publishable primary dataset, with investigations flowing to Science with Claude (SWC) for public presentation. Together, these components form a complete observation-to-publication pipeline operating across federated databases and distributed compute nodes.


1. Introduction

1.1 Motivation

The Macroscope project has accumulated substantial infrastructure over four decades of development: 57 sensor tables streaming environmental and biological data, three trained anomaly-detection meshes (SOMA), 13 temporal micro-agents (STRATA), a curated place-based observatory (MNG), and a growing archive of technical documents. Two things have been missing: a structured environment for conducting investigations that leverage this infrastructure, and a way to make STRATA's intelligence available as a service to multiple projects rather than embedding it in a single web application.

STRATA 1.0 proved the concept: modular agents can generate meaningful temporal context from heterogeneous sensor streams. The v0.2 draft of this document described extending STRATA from monitoring intelligence into a lab bench. This v1.0 revision captures two major developments: the STRATA_Bench application scaffold and database are now built and operational, and the architectural direction has evolved from a monolithic application on Data to a distributed service across the laboratory's Tailscale mesh network.

1.2 Three Evolutions

STRATA is undergoing three simultaneous evolutions:

  • Monitoring to Investigation. STRATA 1.0 answers "what is happening now?" STRATA_Bench adds "what does this pattern mean, and how do we test that hypothesis?" through a seven-phase scientific workflow with a comprehensive lab notebook.
  • Monolith to Service. The STRATA intelligence layer -- currently embedded in the Galatea web application -- is being extracted into a standalone service with a shared library and HTTP API, consumable by MNG, STRATA_Bench, and future projects.
  • Single Node to Distributed Mesh. Intelligence processing expands from Galatea alone to a four-node Tailscale mesh where each machine contributes its specialized capabilities: data authority, local inference, GPU compute, and vision processing.

1.3 Design Principles

  • Human-guided, AI-assisted. The investigator defines questions, evaluates results, and makes all scientific judgments. AI handles data retrieval, pattern detection, computation, and draft generation.
  • Cost-aware. Tiered AI strategy routes tasks to the cheapest sufficient model. Every API call is logged with cost, latency, and model attribution.
  • Phase-structured. The seven-phase workflow provides scientific rigor and auditability. Each phase has defined inputs, outputs, and completion criteria.
  • Distributed by design. STRATA intelligence routes to the right node for each task. The calling code does not need to know where work happens.
  • Pattern-first. SOMA's RBM meshes, transfer entropy, and micro-agent temporal windows feed into the lab bench as detection infrastructure.

2. STRATA 1.0 Codebase Analysis

A detailed study of the production STRATA codebase at Projects/Live/Galatea/Macroscope-Galatea/ (mirrored from /Library/WebServer/Documents/Macroscope/ on Galatea) reveals a four-layer architecture with clean separation of concerns.

2.1 Layer 1: Sensor Data Access (Tools)

The StrataTools class in api/strata/tools.php coordinates 25 named tool calls mapped to 12 platform-specific PHP files under api/strata/tools/. Each platform file provides sensor query and temporal history functions with aggregation awareness (instantaneous, cumulative_daily, categorical, event). Approximately 5,800 lines of pure data access code. All functions talk directly to the macroscope database via mysqli, take parameter arrays as input, and return structured arrays. No web UI coupling whatsoever.

Platforms covered: Tempest, Ecowitt, AirLink, Airthings, AmbientWeather, BirdWeather detections, BirdWeather environmental, iNaturalist, Apple Health activity, Apple Health clinical, Apple Health vitals, Apple Health workouts, documents.
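The tool-map dispatch described above can be sketched in a few lines. The production StrataTools class is PHP; this Python sketch is illustrative only, and the tool names, handler functions, and return shapes are hypothetical stand-ins for the real 25-entry map.

```python
def query_tempest(params):
    """Hypothetical handler: current readings for a Tempest station."""
    return {"platform": "tempest", "station": params.get("station")}

def query_birdweather(params):
    """Hypothetical handler: recent BirdWeather detections."""
    return {"platform": "birdweather", "detections": []}

# Tool names map to handler functions, mirroring StrataTools::$toolMap.
TOOL_MAP = {
    "get_tempest_current": query_tempest,
    "get_bird_detections": query_birdweather,
}

def execute_tool(name, params):
    """Look up the named tool and invoke it with a parameter array."""
    handler = TOOL_MAP.get(name)
    if handler is None:
        return {"error": f"unknown tool: {name}"}
    return handler(params)
```

Because every handler takes a parameter array in and returns a structured array out, the same map can later be served over HTTP without touching the handlers themselves.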

2.2 Layer 2: Context Building (Intelligence)

Seven context builder files under includes/context/ (birds, birdweather_env, tempest, ecowitt, ambient, airthings, airlink) plus the orchestrator includes/strata_context.php. These read pre-computed JSON state files and assemble natural language system prompts for Claude. The orchestrator function buildSystemPrompt() takes temporal state, spatial state, site name, personality type, and user name as parameters and returns a complete system prompt string. Approximately 1,200 lines. Includes the adaptive query interpretation framework with five query types: casual, specific source, comparative, temporal comparison, and species queries.
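The orchestration pattern of buildSystemPrompt() can be sketched as follows. The real function is PHP and reads pre-computed JSON state files; this Python version paraphrases the parameter names and section layout, which are assumptions rather than the exact production format.

```python
def build_system_prompt(temporal_state, spatial_state, site_name,
                        personality, user_name):
    """Assemble a natural-language system prompt from pre-computed state.

    temporal_state: dict of platform -> state summary (from JSON state files)
    spatial_state:  dict with location/species context, or None
    """
    sections = [
        f"You are the {personality} for {site_name}, speaking with {user_name}.",
    ]
    # Each context builder contributes one block from its platform's state.
    for platform, state in temporal_state.items():
        sections.append(f"[{platform}] {state.get('summary', 'no recent data')}")
    if spatial_state:
        sections.append(f"Location context: {spatial_state.get('summary', '')}")
    return "\n\n".join(sections)
```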

2.3 Layer 3: Web Application (Shell)

chat.php is the main endpoint -- handles HTTP POST, reads PHP session auth, loads user preferences, loads JSON state files from disk, calls buildSystemPrompt(), then makes a curl call to the Anthropic API. The api/index.php router handles site switching, platform queries, chart data, and user preferences. includes/auth.php provides the four-domain privacy model: EARTH and LIFE always visible, HOME and SELF gated by user_domain_access and user_platform_access tables, admin god mode. Multi-site support via user_site_access table.

2.4 Layer 4: State Generation (Agents)

agents/generate_temporal_state.php orchestrates 13 platform summary micro-agents (agents/summaries/*.php) that query the database and write JSON state files to context/temporal/. agents/spatial_agent.php builds location, species, and platform distribution state. These run via CLI (cron or manual) and write state files that Layer 2 reads. Approximately 2,000 lines. Pure CLI scripts with no web coupling.

2.5 Coupling Points

The coupling between STRATA intelligence and the web application is confined to four points:

  • Credential path. Every file hardcodes /Library/WebServer/secure/macroscope/database.conf.php. This is the single largest coupling point -- a literal filesystem path rather than an environment-aware configuration.
  • Session/auth. chat.php and api/index.php depend on PHP sessions ($_SESSION['user_id'], $_SESSION['selected_site_id']). The privacy model in auth.php is tightly session-bound.
  • State files on disk. The agent layer writes JSON to context/temporal/ and agents/, and the context layer reads from those paths. This is effectively a file-based message bus.
  • Anthropic API call. The actual Claude call is embedded in chat.php rather than existing as a separate service function.

Layers 1, 2, and 4 are already nearly decoupled from the web. They are pure PHP functions that take data in and return data out. The main surgery required is replacing the hardcoded credential path with environment-aware configuration and extracting the Claude API call from chat.php into the service layer.
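The credential-path fix can be sketched as a small environment-aware lookup. This is a Python illustration of the intended pattern, not the production PHP; the environment variable name STRATA_DB_CONF and the per-host table are assumptions, though the two paths themselves come from this document.

```python
import os
import socket

# Hypothetical lookup table: short hostname -> credential file location.
CREDENTIAL_PATHS = {
    "galatea": "/Library/WebServer/secure/macroscope/database.conf.php",
    "data": os.path.expanduser("~/Sites-secure/strata/database.php"),
}

def credential_path():
    """Resolve the database credential file for the current node."""
    # An env var overrides everything, so tests and new nodes need no code change.
    override = os.environ.get("STRATA_DB_CONF")
    if override:
        return override
    host = socket.gethostname().split(".")[0].lower()
    try:
        return CREDENTIAL_PATHS[host]
    except KeyError:
        raise RuntimeError(f"no credential path configured for host {host!r}")
```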


3. Distributed Service Architecture

3.1 Shared Library + HTTP Wrapper (Option C)

The STRATA service uses a hybrid approach designated "Option C" during the design process. The core intelligence lives as a shared PHP library on Galatea. Projects running on Galatea (MNG, production web services) call the library directly via require_once -- no HTTP overhead, no API authentication needed. A thin HTTP endpoint wraps the same library for remote callers. Data, Sauron, and future nodes reach STRATA over Tailscale via this HTTP API.

This mirrors the existing data architecture: Galatea is the authoritative source, Data reaches it over Tailscale. The sync agent pulls database rows for bulk data; the HTTP service handles real-time intelligence requests that need Galatea's full context (current state files, real-time sensor readings, privacy model, narrative synthesis).

Three alternative approaches were evaluated:

  • Option A (HTTP API only): All consumers call via HTTP. Simple but adds unnecessary overhead for Galatea-local projects like MNG.
  • Option B (Shared library only): All consumers require_once the library. Efficient but requires all consumers to run on Galatea.
  • Option C (Hybrid): Shared library for co-located projects, HTTP wrapper for remote access. Best of both approaches.
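The essence of Option C is one core function with two access paths. The sketch below is a minimal Python illustration under stated assumptions: build_context, the request shape, and the API-key check are hypothetical placeholders for the shared library and its HTTP wrapper.

```python
def build_context(site_id, query):
    """The shared-library core: co-located callers invoke this directly."""
    return {"site_id": site_id, "prompt": f"Context for {query!r} at site {site_id}"}

def handle_http_request(request):
    """Thin HTTP wrapper for remote callers over Tailscale.

    `request` stands in for a parsed POST body plus an API-key header;
    the key value here is a placeholder for real credential checking.
    """
    if request.get("api_key") != "expected-key":
        return {"status": 401, "error": "unauthorized"}
    body = build_context(request["site_id"], request["query"])
    return {"status": 200, "body": body}
```

Galatea-local projects like MNG call build_context() directly with no HTTP or auth overhead; Data and other mesh nodes go through handle_http_request(), and both paths exercise identical library code.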

3.2 Plugin Dispatch Layer

The STRATA service includes a dispatch layer that routes intelligence requests across the Tailscale mesh based on capability. Each node registers what it can do, and STRATA routes accordingly. The calling code -- whether MNG, STRATA_Bench, or a future project -- does not need to know which machine handles a given request.

This extends the pattern already established in StrataTools::$toolMap, where tool names map to handler functions. The extension is that handlers may live on different machines. A narrative interpretation request routes to Galatea (cloud API with full sensor context). A quick classification routes to Data (Ollama local inference). A vision model inference routes to Sauron or Hogwarts (GPU).
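The capability-based routing described above reduces to a registry lookup. This is a hedged sketch: the capability names and the idea of passing an `online` tuple are illustrative simplifications of what a real dispatcher with health checks would do.

```python
# Hypothetical capability registry: node -> capabilities it advertises.
NODE_CAPABILITIES = {
    "galatea": {"narrative", "cloud_ai", "sensor_context"},
    "data": {"classification", "local_inference"},
    "sauron": {"gpu_compute", "vision"},
    "hogwarts": {"vision", "camera"},
}

def route(capability, online=("galatea", "data")):
    """Return the first online node advertising the capability."""
    for node in online:
        if capability in NODE_CAPABILITIES.get(node, set()):
            return node
    raise LookupError(f"no online node provides {capability!r}")
```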

3.3 Network Topology

Node | Hardware | STRATA Role | Capabilities | Status
Galatea | Mac Mini M4 Pro, 1Gb fiber | Core: data authority, shared library, HTTP API, cloud AI | Collectors, production DB, MNG, SWC, STRATA 1.0 | Active
Data | MacBook Pro M4 Max | Development, local inference, STRATA_Bench workbench | Ollama/MLX, Cowork, strata_db, MySQL replica | Active
Sauron | Intel NUC i9, 2x RTX 3090 | GPU compute plugin | 3DGS, YOLO training, ML training, CUDA | Pending setup
Hogwarts | Mac Mini M1 | Vision processing plugin | YOLO inference, camera services, bird feeder monitoring | Offline

All nodes connect via the Tailscale mesh network (tailnet: broichan). Galatea is reachable at 100.81.28.21 / galatea.tail1babf0.ts.net. Data at 100.98.33.38 / data-3.tail1babf0.ts.net. Sauron and Hogwarts will join the mesh when brought online.

3.4 Two Channels to Galatea

Data maintains two channels to Galatea, each serving a different need:

  • Data synchronization (existing). The com.macroscope.sync launchd agent pulls the macroscope database every 5 minutes via high-water-mark replication. CNL admin provides bidirectional sync for cnl_archive. This handles bulk data that STRATA_Bench needs locally for browsing and analysis.
  • Service calls (new). The STRATA HTTP API handles real-time intelligence requests: build a system prompt, execute a tool call against the sensor network, query current state. Requests that need Galatea's full context -- current state files, real-time readings, the privacy model, narrative synthesis -- use this channel rather than replicating the data.
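High-water-mark replication, as used by the sync channel above, can be sketched in miniature: each cycle asks the source only for rows newer than the highest id already copied. Lists of dicts stand in for MySQL tables here; the real agent works against the macroscope database and may key on timestamps rather than ids.

```python
def pull_since(source_rows, high_water_mark):
    """Return source rows with id above the stored high-water mark."""
    return [r for r in source_rows if r["id"] > high_water_mark]

def sync_cycle(source_rows, replica_rows):
    """One replication pass: find the mark, pull newer rows, apply them."""
    mark = max((r["id"] for r in replica_rows), default=0)
    new_rows = pull_since(source_rows, mark)
    replica_rows.extend(new_rows)
    return len(new_rows)
```

A second cycle against an unchanged source pulls nothing, which is what makes a 5-minute polling interval cheap.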

4. STRATA_Bench: The Investigation Lab Bench

4.1 Application Scaffold (Built)

The STRATA_Bench application scaffold is operational at Projects/Workbench/STRATA/ on Data, served at http://localhost/Projects/Workbench/STRATA/. The scaffold follows established Macroscope LAMP conventions:

  • lib/env.php -- Environment bootstrap with hardcoded paths per hostname, defined() guard for safe multiple includes. Data credentials at ~/Sites-secure/strata/database.php.
  • lib/config.php -- Three database connections (getStrataDb, getMacroscopeDb, getNexusDb) using mysqli with static connection caching. Core recording functions: logNotebookEntry() with auto-incrementing sequence numbers and running cost/count updates, logAiInteraction() with automatic notebook entry creation, createInvestigation() with stored procedure call.
  • index.php -- Dashboard with status cards (DB connection, investigation count, notebook entries, total cost), research queue with phase progress dots and cost tracking.
  • admin/index.php -- Admin dashboard with new investigation form (domain, hypothesis, priority, seed source) and investigation detail view with per-phase status, confidence, entry count, and cost.
  • notebook.php -- Lab notebook viewer with entries organized by phase, 15 color-coded entry type badges, investigation selector, and AI model/tier/cost display.
  • patterns.php, costs.php -- Placeholder pages for SOMA 2.0 integration and AI cost visualization.
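The recording behavior of logNotebookEntry() and logAiInteraction() can be sketched as follows. The production functions are PHP against strata_db; this in-memory Python version only illustrates the auto-incrementing sequence and running-cost updates, with the investigation dict standing in for database rows.

```python
def log_notebook_entry(investigation, entry_type, content, cost=0.0):
    """Append a notebook entry, advancing the sequence and running cost."""
    investigation["sequence"] += 1
    investigation["total_cost"] += cost
    investigation["entries"].append({
        "seq": investigation["sequence"],
        "type": entry_type,
        "content": content,
        "cost": cost,
    })

def log_ai_interaction(investigation, model, prompt, response, cost):
    """Record an AI call; a notebook entry is created automatically."""
    log_notebook_entry(investigation, "ai_request", prompt, cost=cost)
    investigation["ai_log"].append(
        {"model": model, "response": response, "cost": cost})
```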

4.2 The Factory Metaphor

STRATA_Bench treats each investigation as a production line. Each of the seven phases is a station on the factory floor with starting conditions (inputs from the prior phase), tools to deploy (AI models, analysis scripts, visualization libraries), processing cycles that run until output parameters are met, and defined outputs that become inputs for the next station.

The lab notebook is the flight recorder of this factory -- capturing the exhaust trail of every operation automatically (the Joule layer: timestamps, methods, costs, AI interactions) plus human annotations (the Faraday layer: interpretive judgments, field observations, corrections). The AI disclosure record -- every prompt, response, model, cost, and token count -- is itself a publishable primary dataset, not merely metadata. This is the "study of the study": the investigation process becomes as scientifically valuable as the investigation results.

4.3 Seven-Phase Investigation Workflow

# | Phase | Purpose | AI Role | Human Role
1 | Seed | Capture originating observation | Suggest related patterns from sensor data | Define question from field experience
2 | Priors | Literature review, contextual framing | Search literature, summarize relevant work | Evaluate relevance, add domain knowledge
3 | Proposal | Testable hypothesis with success criteria | Draft hypothesis, suggest metrics | Refine hypothesis, approve criteria
4 | Workflow | Methods, tools, AI role documentation | Generate scripts, configure pipelines | Review methods, ensure ecological validity
5 | Testing | Execute analysis, collect results | Run computations, generate visualizations | Interpret results, identify artifacts vs signal
6 | Conclusions | Synthesize findings, state limitations | Draft conclusions, flag logical gaps | Final scientific judgment, approve findings
7 | Reflections | Meta-analysis of investigation process | Assess collaboration quality | Evaluate AI contribution, plan follow-up

Confidence scores (0.0-1.0) are tracked per phase, reflecting the investigator's assessment of completeness and reliability.


5. Database Architecture

5.1 The strata_db Schema (Built)

The strata_db database is operational on Data with 14 tables, 19 foreign keys, 22 indexes, and 1 stored procedure (seed_investigation_phases). The schema was designed around the factory/notebook metaphor:

Category | Tables | Purpose | Key Design
Core | investigations, investigation_phases | Investigation state | Phases auto-seeded by stored procedure
Lab Notebook | notebook_entries, ai_interactions, notebook_artifacts | Flight recorder | Auto-incrementing sequence, running cost updates
Registry | tools, data_sources, strata_agents | Capability tracking | Version, type, domain classification
Relationships | investigation_data_sources, investigation_tools | Per-investigation binding | M:N with usage notes
Discovery | patterns, tags, investigation_tags | Pattern detection, classification | SOMA mesh linkage, confidence scoring
Export | swc_exports | Publication pipeline | Maps to SWC phases and tabs
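The effect of the seed_investigation_phases stored procedure can be sketched as follows: creating an investigation inserts one phase row per workflow station up front. SQL is approximated with Python dicts here; the phase names come from Section 4.3, while the column names are illustrative.

```python
# The seven stations of the workflow, in order (Section 4.3).
PHASES = ["Seed", "Priors", "Proposal", "Workflow",
          "Testing", "Conclusions", "Reflections"]

def create_investigation(db, hypothesis):
    """Insert an investigation and auto-seed its seven phase rows."""
    inv_id = len(db["investigations"]) + 1
    db["investigations"].append({"id": inv_id, "hypothesis": hypothesis})
    for n, name in enumerate(PHASES, start=1):
        db["investigation_phases"].append({
            "investigation_id": inv_id,
            "phase": n,
            "name": name,
            "status": "pending",
            "confidence": 0.0,
        })
    return inv_id
```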

5.2 Cross-Database Architecture

STRATA_Bench joins three federated databases at the application layer. The strata_db handles investigation state. The macroscope database (57 tables, synced from Galatea every 5 minutes) provides time-series sensor data. The macroscope_nexus database (shared with MNG) provides curated places, categories, media, and species. The existing bridge pattern (macroscope_nexus.monitoring_sources.macroscope_platform_id linking to macroscope.sensor_platforms.id) provides the architectural seam.
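The application-layer join across the bridge can be sketched as follows. The bridge column macroscope_platform_id is from this document; the place_id column and the row shapes are assumptions, and lists of dicts stand in for mysqli result sets from the two databases.

```python
def platforms_for_place(place_id, monitoring_sources, sensor_platforms):
    """Join curated places (nexus) to sensor platforms (macroscope)
    through the monitoring_sources bridge table."""
    platform_ids = {
        s["macroscope_platform_id"]
        for s in monitoring_sources
        if s["place_id"] == place_id and s["macroscope_platform_id"] is not None
    }
    return [p for p in sensor_platforms if p["id"] in platform_ids]
```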

5.3 The Lab Notebook as Primary Dataset

The notebook subsystem consists of three tables serving different audiences. notebook_entries captures every operation (15 entry types: observation, query, analysis, computation, literature, visualization, code, annotation, ai_request, ai_response, discussion, decision, error, phase_transition, export) with timestamps, methods, and investigation linkage. ai_interactions captures the full AI disclosure record: prompts, responses, model identifiers, token counts, costs, and processing times. notebook_artifacts links generated files, visualizations, and exported data to their originating notebook entries.

This three-table design separates the Joule layer (automatic operational recording) from the Faraday layer (human interpretive annotations) while maintaining the AI disclosure record as a first-class entity suitable for independent publication.


6. Inherited Capabilities from STRATA 1.0

The STRATA service inherits and makes available everything from the Galatea production system:

  • 13 temporal micro-agents: Platform-specific state generators for Tempest, Ecowitt, AirLink, Airthings, AmbientWeather, BirdWeather (detections + environmental), iNaturalist, Apple Health (vitals, activity, workouts, clinical), and documents.
  • 7 context builders: Modular prompt generators assembling temporal, spatial, document, and domain context. Orchestrated by buildSystemPrompt().
  • 25 tool-calling endpoints: Real-time sensor queries, document search, observation analysis, statistical summaries -- mapped through StrataTools::$toolMap.
  • 4-domain privacy model: EARTH and LIFE (public), HOME and SELF (private) with per-user, per-domain, per-platform granular access control. Admin god mode.
  • Multi-temporal windows: 9 temporal windows per sensor (last_hour through rain_year) plus 7 bird-specific windows including dawn_chorus.
  • Personality system: 4 AI personas (field_naturalist, technical_analyst, conversational_guide, research_assistant) configurable per user.
  • Sensor registry: Modular platform definitions with aggregation-type awareness, location metadata, and privacy classification.

7. Tiered AI Cost Strategy

Tier | Provider | Node | Use Cases | Cost
Local | Ollama, MLX | Data | Classification, embedding, repetitive agentic loops, draft generation | $0 (electricity)
Standard | Claude Sonnet | Galatea | Routine analysis, literature summaries, code generation, data interpretation | ~$0.01-0.05/call
Synthesis | Claude Opus | Galatea | Complex reasoning, narrative synthesis, cross-domain integration | ~$0.05-0.50/call
Vision | YOLO, LLaVA | Hogwarts/Sauron | Species classification, camera trap analysis, real-time detection | $0 (local GPU)
Compute | Custom models | Sauron | 3DGS rendering, ML training, SOMA mesh training, CUDA workloads | $0 (local GPU)

Every AI call is logged in the ai_interactions table with model identifier, token counts, processing time, and cost. The STRATA_Bench dashboard displays running session and investigation costs in real time. The dispatch layer routes to the cheapest sufficient tier automatically.
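"Cheapest sufficient tier" routing can be sketched as a walk down a cost-ordered list. The tier order and rough per-call costs follow the table above; which task kinds each tier can satisfy is expressed here as a hypothetical capability tag, not a production rule set.

```python
# Tiers ordered cheapest-first; "handles" tags are illustrative.
TIERS = [
    {"name": "local",     "cost": 0.00, "handles": {"classification", "embedding", "draft"}},
    {"name": "standard",  "cost": 0.03, "handles": {"analysis", "summary", "code"}},
    {"name": "synthesis", "cost": 0.25, "handles": {"narrative", "cross_domain"}},
]

def route_task(task_kind):
    """Return the cheapest tier whose capabilities cover the task."""
    for tier in TIERS:
        if task_kind in tier["handles"]:
            return tier["name"]
    raise LookupError(f"no tier handles {task_kind!r}")
```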


8. The STRATA-SWC Pipeline

STRATA_Bench and Science with Claude (SWC) form a complete investigation-to-publication pipeline. STRATA_Bench conducts investigations; SWC publishes them.

Aspect | STRATA_Bench | SWC
Role | Conduct investigations | Publish investigations
Machine | Data (development) | Galatea (production)
Audience | Investigator (private) | Public readers
AI Access | Full: distributed service | None (static content)
Database | strata_db + macroscope + macroscope_nexus | sciencewithclaude_db
Workflow | Active: experiments, analysis, notebook | Archival: display formatted results

The swc_exports table in strata_db tracks the publication pipeline. SWC's four-tab viewer (Story, Technical Specs, Workbench, Publication) maps to STRATA investigation phases, with the lab notebook AI disclosure record providing the Workbench tab's transparent attribution content.


9. Initial Research Queue

STR-001: STRATA 2.0 Distributed Service Build-Out

Status: Active (Phase 3: Proposal)
Hypothesis: Extracting STRATA's intelligence layer into a shared library with HTTP wrapper and plugin dispatch will enable distributed AI-assisted research across the Tailscale mesh while maintaining the 4-domain privacy model and cost-aware routing.
Data Sources: macroscope (57 tables), macroscope_nexus, cnl_archive, Macroscope-Galatea codebase
Tools: Claude Opus, Ollama/Llama3, Python

This self-referential investigation builds the instrument that powers all subsequent investigations. Key deliverables: shared library extraction, HTTP API, plugin registry, dispatch layer.

STR-002: SOMA 2.0 -- Domain Mesh Expansion and Temporal Depth

Status: Queued (Phase 1: Seed)
Hypothesis: Adding temporal trajectory encoding and relational cross-domain tensioning to existing RBM meshes will enable detection of anomaly propagation across environmental, acoustic, and biological streams.
Data Sources: macroscope.tempest_observations, birdweather_detections, ecowitt_readings
Tools: Python/NumPy, MLX, Claude Sonnet

Follows Whittaker's gradient analysis: domain meshes maintain independent internal logic, connected by a relational layer that learns how tension propagates between streams.

STR-003: Tempest vs. Open-Meteo Microclimate Divergence

Status: Queued (Phase 1: Seed)
Hypothesis: Physical weather station measurements at Canemah will show systematic divergence from Open-Meteo grid estimates, revealing microclimate signatures driven by topography, canopy cover, and riparian proximity to the Willamette.
Data Sources: macroscope.tempest_observations, macroscope.openmeteo_hourly
Tools: Python/Pandas, Chart.js, Claude Opus

The most immediately tractable investigation. Initial seed: a 3.2 °C divergence during a morning temperature inversion.

STR-004: ecoSPLAT 2.0 -- Panoramic to 3D Gaussian Terrarium

Status: Queued
Hypothesis: 3D Gaussian Splatting can transform panoramic field photography into navigable ecological terrariums suitable for species-in-context visualization and phenological time-series comparison.
Data Sources: ecoSPLAT filesystem (894 GB), Insta360 panoramas
Tools: Sauron GPU, Three.js, Python, 3DGS toolkit

First investigation to exercise the Sauron compute plugin via the STRATA dispatch layer.

STR-005: Real-time Video Stream Classification

Status: Blocked (Hogwarts offline, Sauron Linux/CUDA pending)
Hypothesis: Real-time YOLO classification of camera trap video combined with vision-enabled local AI can achieve >85% species identification accuracy for mammals and birds at the Canemah study site.
Data Sources: RTSP camera streams, macroscope.birdweather_detections (training reference)
Tools: Sauron GPU/CUDA, YOLOv8, Python, Ollama/LLaVA

First investigation to exercise the Hogwarts vision plugin. Multi-modal species confirmation via acoustic + visual cross-reference.


10. Implementation Roadmap

With the STRATA_Bench scaffold and strata_db built, the implementation focus shifts to the distributed service architecture:

  1. Extract shared library. Move Layers 1, 2, and 4 from Macroscope-Galatea into a standalone library directory on Galatea. Replace hardcoded credential paths with environment-aware configuration. Verify MNG can require_once the library directly.
  2. Build HTTP wrapper. Thin PHP endpoint on Galatea that wraps the shared library for remote callers. API key authentication. Endpoints: buildContext, executeTool, getTemporalState, getSpatialState.
  3. Design plugin registry. Capability advertisement and discovery for mesh nodes. Each node registers its available tools, models, and compute resources. The dispatch layer routes requests based on capability, cost tier, and availability.
  4. Connect STRATA_Bench to service. Wire the lab bench investigation workflow to the STRATA HTTP API for intelligence queries. Local Ollama for development-tier calls, Galatea service for production-tier calls.
  5. Bring Hogwarts online. Join Tailscale mesh, register vision capabilities, configure YOLO inference for bird feeder cameras. First test of the plugin dispatch pattern.
  6. Register in CNL Projects. Add STRATA_Bench and STRATA Service as project entries in cnl_archive.projects with appropriate deployment targets, paths, and Macroscope domain classifications.

11. References

[1] Hamilton, M. P. (2026). "Macroscope/STRATA and MNG Convergence Plan." CNL-TN-2026-027, Canemah Nature Laboratory.

[2] Hamilton, M. P. (2026). "Organelle Convergence Architecture." CNL-FN-2026-026, Canemah Nature Laboratory.

[3] Hamilton, M. P. (2025). "CNL Technical Note Style Guide." CNL-SG-2025-002 v1.1, Canemah Nature Laboratory.

[4] Whittaker, R. H. (1967). "Gradient analysis of vegetation." Biological Reviews, 42(2), 207-264.

[5] Anthropic (2026). "Claude API Documentation." https://docs.anthropic.com (accessed April 7, 2026).

[6] Ollama (2026). "Run Large Language Models Locally." https://ollama.ai (accessed April 7, 2026).

[7] Tailscale (2026). "Tailscale: Secure Networking." https://tailscale.com (accessed April 7, 2026).


Document History

Version | Date | Changes
0.1 | 2026-04-06 | Initial draft as "IRIS" planning document.
0.2 | 2026-04-06 | Renamed to STRATA 2.0. Reframed from separate system to evolution of existing STRATA engine. Added 1.0-to-2.0 comparison, pattern detection, STR-nnn investigation IDs.
1.0 | 2026-04-07 | Major revision. Added codebase analysis of STRATA 1.0 (four-layer architecture, coupling points). Introduced distributed service architecture (Option C + plugin dispatch). Documented operational strata_db (14 tables) and STRATA_Bench scaffold. Added Hogwarts as vision node. Updated network topology to four-node Tailscale mesh. Expanded AI cost strategy to five tiers with per-node routing. Revised implementation roadmap for service extraction.

Cite This Document

Hamilton, M. P. (2026). "STRATA 2.0: Distributed Intelligence Architecture." Canemah Nature Laboratory Technical Note CNL-TN-2026-043. https://canemah.org/archive/CNL-TN-2026-043

BibTeX

@techreport{cnl2026strata,
  author      = {Hamilton, Michael P.},
  title       = {STRATA 2.0: Distributed Intelligence Architecture},
  institution = {Canemah Nature Laboratory},
  year        = {2026},
  number      = {CNL-TN-2026-043},
  month       = apr,
  url         = {https://canemah.org/archive/document.php?id=CNL-TN-2026-043}
}

Permanent URL: https://canemah.org/archive/document.php?id=CNL-TN-2026-043