📡 Signals

Atomic units of lived data. The foundation of pattern recognition and synthesis.

What is a Signal?

A signal is the atomic unit of lived data in Autonomy. It represents a single moment of documented reality — raw capture that forms the input layer for AI-powered pattern detection and synthesis.

Signals are not content. They're not posts. They're timestamped documentation, optionally geolocated, preserved with full structural fidelity.

Every signal belongs to a realm and has a type (the medium) and an optional context (the intent).

Signal Types (Medium)

DOCUMENT

Text in any form — writing, notes, code, references. The most common signal type.

PHOTO

Visual capture. Images preserved with technical metadata when available.

TRANSMISSION

Audio or video recordings. YouTube videos. Podcast episodes. Processed via transcript.

CONVERSATION

Dialogue logs. AI chat transcripts. Co-created content from back-and-forth exchange.

Signal Context (Intent)

Context describes why a signal was created. It's optional but provides valuable metadata for synthesis and pattern recognition.

CAPTURE

Default — generic documentation, intent to be determined

NOTE

Quick capture, ephemeral thought

JOURNAL

Reflective writing, daily log

CODE

Technical artifact, implementation

REFERENCE

External source, citation

OBSERVATION

Field note, documented reality

ARTIFACT

Created work, finished piece

Signal Structure

Core Fields

  • signal_id - Unique identifier (ULID format)
  • realm_id - Which realm this signal belongs to
  • signal_type - Medium: DOCUMENT, PHOTO, TRANSMISSION, CONVERSATION
  • signal_context - Intent: CAPTURE, NOTE, JOURNAL, CODE, REFERENCE, OBSERVATION, ARTIFACT
  • signal_title - Brief title
  • signal_summary - Longer summary or description
  • signal_author - Who created/captured this signal
  • signal_temperature - Importance (-1.0 to 1.0, default 0.0)
  • signal_status - ACTIVE, PENDING, REJECTED, FAILED, or ARCHIVED
  • signal_visibility - PUBLIC, PRIVATE, SANCTUM, or SHARED
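
The core fields and their enumerations can be sketched as a plain record with a small validation helper. This is a hypothetical illustration, not the storage schema; `validate_core` and all field values are invented for the example.

```python
# Enumerations taken from the field descriptions above.
SIGNAL_TYPES = {"DOCUMENT", "PHOTO", "TRANSMISSION", "CONVERSATION"}
SIGNAL_CONTEXTS = {"CAPTURE", "NOTE", "JOURNAL", "CODE", "REFERENCE",
                   "OBSERVATION", "ARTIFACT"}
SIGNAL_STATUSES = {"ACTIVE", "PENDING", "REJECTED", "FAILED", "ARCHIVED"}
SIGNAL_VISIBILITIES = {"PUBLIC", "PRIVATE", "SANCTUM", "SHARED"}

# Hypothetical signal record; IDs and values are illustrative only.
signal = {
    "signal_id": "01J9ZK3W8M2Q4R6T8V0X2Y4Z6A",   # ULID (illustrative)
    "realm_id": "01J9ZK3W8M2Q4R6T8V0X2Y4Z6B",
    "signal_type": "DOCUMENT",
    "signal_context": "JOURNAL",
    "signal_title": "Morning log",
    "signal_summary": "Reflection on the week's infrastructure work.",
    "signal_author": "user_01",
    "signal_temperature": 0.0,                    # -1.0 to 1.0, default 0.0
    "signal_status": "ACTIVE",
    "signal_visibility": "PRIVATE",
}

def validate_core(s: dict) -> bool:
    """Check the enum fields and the temperature range."""
    return (
        s["signal_type"] in SIGNAL_TYPES
        and s["signal_context"] in SIGNAL_CONTEXTS
        and s["signal_status"] in SIGNAL_STATUSES
        and s["signal_visibility"] in SIGNAL_VISIBILITIES
        and -1.0 <= s["signal_temperature"] <= 1.0
    )
```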

Analysis Fields

Surface and structural metadata extracted through analysis.

Surface Layer

  • signal_actions - Array of visible actions in the signal
  • signal_environment - Context at time of signal creation
  • signal_entities - Categorized entities: people, places, infrastructure, organizations, concepts, media
  • signal_density - Recursion/complexity metric (-1.0 to 1.0)

Structural Layer

  • signal_energy - Energetic state (e.g., methodical, resolute, exhausted)
  • signal_state - Life/project state (e.g., infrastructure-building, crisis, integration)
  • signal_orientation - Directional facing (e.g., toward sovereignty, toward extraction)
  • signal_substrate - Structural/conceptual foundation underlying the signal
  • signal_ontological_states - Array of being-states (e.g., sovereign, embedded, coherent)
  • signal_symbolic_elements - Array of recurring motifs/symbols (e.g., mirror, archive, trail)
  • signal_subsystems - Array of engaged subsystems (e.g., cognitive, infrastructural, relational)
  • signal_dominant_language - Array of key semantic patterns shaping the signal

Temporal Data

  • stamp_created - When the original content was captured/created
  • stamp_imported - When the signal was ingested into the system
  • stamp_updated - Last modification timestamp (auto-updated)

Geospatial Data (Optional)

  • signal_location - PostGIS Point (PostgreSQL), stored as GeoJSON
  • signal_latitude / signal_longitude - Decimal coordinates (MySQL)
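
As a hypothetical sketch, here is the same location in both storage representations. The coordinate values are illustrative; note that GeoJSON orders coordinates as [longitude, latitude].

```python
# PostGIS Point as GeoJSON (PostgreSQL). GeoJSON order is [lon, lat].
signal_location = {
    "type": "Point",
    "coordinates": [-122.4194, 37.7749],   # illustrative coordinates
}

# Decimal coordinate columns (MySQL).
signal_latitude = 37.7749
signal_longitude = -122.4194

# The two representations should describe the same point.
assert signal_location["coordinates"] == [signal_longitude, signal_latitude]
```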

Type-Specific Data

  • signal_metadata - Technical/immutable facts about the signal
  • signal_payload - The actual content data

DOCUMENT - Metadata & Payload

Metadata:

  • word_count - Number of words
  • character_count - Number of characters
  • language - Language code (e.g., 'en', 'es')
  • file_extension - File extension (e.g., '.md', '.txt')
  • encoding - Character encoding (e.g., 'utf-8')
  • mime_type - MIME type (e.g., 'text/plain')

Payload:

  • content - The actual text content
  • format - Rendering format: 'plain', 'markdown', or 'html'

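
As a hypothetical sketch, the counting fields in DOCUMENT metadata can be derived directly from the payload; the remaining values shown are illustrative, not computed.

```python
# Hypothetical sketch: deriving DOCUMENT metadata from its payload.
payload = {"content": "Signals are timestamped documentation.", "format": "plain"}

metadata = {
    "word_count": len(payload["content"].split()),   # 4
    "character_count": len(payload["content"]),      # 38
    "language": "en",            # illustrative values from here down
    "file_extension": ".txt",
    "encoding": "utf-8",
    "mime_type": "text/plain",
}
```
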
PHOTO - Metadata & Payload

Metadata (EXIF & Properties):

  • camera - Camera model
  • lens - Lens information
  • iso - ISO sensitivity
  • aperture - Aperture value (e.g., 'f/1.5')
  • shutter_speed - Shutter speed (e.g., '1/120')
  • focal_length - Focal length in mm
  • width - Image width in pixels
  • height - Image height in pixels
  • file_size - File size in bytes
  • mime_type - MIME type (e.g., 'image/jpeg')
  • color_space - Color space (e.g., 'sRGB')
  • timestamp_original - Original capture timestamp from EXIF
  • gps_altitude - GPS altitude in meters

Payload:

  • file_path - Local path or cloud storage URL
  • thumbnail_path - Optimized thumbnail path
  • original_filename - Original filename when uploaded

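
A hypothetical PHOTO metadata record might look like the following; all values are invented, and the megapixel figure is simply derived from the width and height fields described above.

```python
# Hypothetical PHOTO metadata (EXIF-style values are illustrative).
metadata = {
    "camera": "iPhone 13 Pro",
    "aperture": "f/1.5",
    "shutter_speed": "1/120",
    "focal_length": 26,          # mm
    "width": 4032,               # pixels
    "height": 3024,              # pixels
    "mime_type": "image/jpeg",
    "color_space": "sRGB",
}

# Derived: resolution in megapixels.
megapixels = metadata["width"] * metadata["height"] / 1_000_000
```
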
TRANSMISSION - Metadata & Payload

Metadata:

  • source_type - 'audio', 'video', or 'other'
  • source_url - YouTube URL, file path, etc.
  • youtube_id - YouTube video ID (if applicable)
  • youtube_channel - YouTube channel name
  • youtube_published_at - YouTube publish timestamp
  • youtube_thumbnail - YouTube thumbnail URL
  • timestamps - Array of topic markers: [{topic, timestamp}]
  • duration - Duration in seconds
  • bitrate - Bitrate in kbps
  • sample_rate - Sample rate in Hz
  • channels - Audio channels (1=mono, 2=stereo)
  • codec - Codec (e.g., 'h264', 'aac')
  • file_size - File size in bytes
  • mime_type - MIME type (e.g., 'video/mp4')
  • width - Video width in pixels
  • height - Video height in pixels
  • framerate - Framerate in fps
  • has_transcript - Boolean
  • transcript_method - 'ai', 'manual', or 'imported'

Payload:

  • file_path - Local file or cloud storage URL
  • transcript - Plain text transcript
  • timed_transcript - Array of timestamped segments: [{text, start, end}]

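
As a hypothetical sketch, the plain `transcript` can be derived by joining the `timed_transcript` segments, matching the payload shape described above. The file path and segment contents are invented for illustration.

```python
# Hypothetical timed transcript: [{text, start, end}] segments.
timed_transcript = [
    {"text": "Welcome back.", "start": 0.0, "end": 1.8},
    {"text": "Today we cover signal ingestion.", "start": 1.8, "end": 5.2},
]

# Derive the plain-text transcript from the timed segments.
transcript = " ".join(seg["text"] for seg in timed_transcript)

payload = {
    "file_path": "transmissions/episode-12.mp4",   # illustrative path
    "transcript": transcript,
    "timed_transcript": timed_transcript,
}
```
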
CONVERSATION - Metadata & Payload

Metadata:

  • platform - 'claude', 'chatgpt', 'gemini', 'remnant', or 'other'
  • model - Model identifier (e.g., 'claude-sonnet-4')
  • message_count - Total number of messages
  • turn_count - Number of back-and-forth exchanges
  • duration_minutes - Estimated conversation duration
  • total_tokens - Total token count (if available)
  • started_at - First message timestamp
  • ended_at - Last message timestamp

Payload:

  • messages - Array of messages: [{role, content, timestamp, metadata}]
  • summary - Generated conversation summary
  • key_points - Array of extracted key insights
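
Several CONVERSATION metadata fields can be derived from the messages array. This is a hypothetical sketch: counting user messages as `turn_count` is an assumption about how "back-and-forth exchanges" are tallied, and the messages themselves are invented.

```python
# Hypothetical messages array: [{role, content, timestamp, ...}].
messages = [
    {"role": "user", "content": "Summarize my week.",
     "timestamp": "2025-01-06T09:00:00Z"},
    {"role": "assistant", "content": "You focused on infrastructure.",
     "timestamp": "2025-01-06T09:00:05Z"},
    {"role": "user", "content": "What patterns stand out?",
     "timestamp": "2025-01-06T09:01:00Z"},
    {"role": "assistant", "content": "A recurring focus on sovereignty.",
     "timestamp": "2025-01-06T09:01:07Z"},
]

metadata = {
    "platform": "claude",
    "message_count": len(messages),
    # Assumption: one turn per user message.
    "turn_count": sum(1 for m in messages if m["role"] == "user"),
    "started_at": messages[0]["timestamp"],
    "ended_at": messages[-1]["timestamp"],
}
```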

Additional Fields

  • signal_tags - Array of tags for categorization
  • signal_embedding - Vector embedding (1536 dimensions) for semantic search
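
Semantic search over embeddings typically ranks signals by cosine similarity to a query vector. The sketch below is a hypothetical illustration: real `signal_embedding` vectors are 1536-dimensional, but 3 dimensions keep the example readable.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical query and stored embeddings (toy 3-dimensional vectors).
query = [1.0, 0.0, 0.0]
signal_embeddings = {
    "sig_a": [0.9, 0.1, 0.0],
    "sig_b": [0.0, 1.0, 0.0],
}

# The closest signal in embedding space.
best = max(signal_embeddings,
           key=lambda k: cosine_similarity(query, signal_embeddings[k]))
```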

History & Annotations

  • signal_history - Audit trail: [{timestamp, fields_changed, previous_values, trigger, user_id}]
  • signal_annotations - User notes and synthesis feedback: {user_notes[], synthesis_feedback[]}
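
Recording an edit in `signal_history` can be sketched as below, using the entry shape described above. The `edit_field` helper is hypothetical, not part of the system's API.

```python
from datetime import datetime, timezone

# Hypothetical signal with an empty audit trail.
signal = {"signal_energy": "exhausted", "signal_history": []}

def edit_field(signal: dict, field: str, value, trigger: str, user_id: str) -> None:
    """Record the previous value in signal_history, then apply the edit."""
    signal["signal_history"].append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "fields_changed": [field],
        "previous_values": {field: signal[field]},
        "trigger": trigger,            # e.g. 'user_edit', 're_synthesis'
        "user_id": user_id,
    })
    signal[field] = value

edit_field(signal, "signal_energy", "methodical", "user_edit", "user_01")
```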

Visibility Levels

PUBLIC

Anyone can view this signal (if they have access to the realm).

SANCTUM

Only users with the SANCTUM role or higher can view.

PRIVATE

Only the realm owner can view this signal.

SHARED

Reserved for future multi-user realm features.
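
The four levels can be sketched as a simple access check. This is a hypothetical illustration: the role names, their ordering, and the deny-by-default handling of SHARED are assumptions drawn from the level descriptions above, not the system's actual authorization logic.

```python
# Assumed role hierarchy for illustration only.
ROLE_RANK = {"VIEWER": 0, "SANCTUM": 1, "OWNER": 2}

def can_view(visibility: str, role: str) -> bool:
    """Hypothetical visibility check for a user who has realm access."""
    if visibility == "PUBLIC":
        return True                                    # anyone in the realm
    if visibility == "SANCTUM":
        return ROLE_RANK[role] >= ROLE_RANK["SANCTUM"]  # SANCTUM or higher
    if visibility == "PRIVATE":
        return role == "OWNER"                          # realm owner only
    return False                                        # SHARED: reserved
```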

Analysis Layers

Signals capture two distinct layers of analysis: surface and structural.

Surface Layer

Observable, concrete elements of the signal. What's directly visible.

  • Actions - What's happening in the signal
  • Environment - Context at time of creation
  • Entities - People, places, infrastructure, concepts mentioned
  • Density - Measure of recursion and conceptual complexity

Structural Layer

Underlying patterns, states, and orientations. What the signal is, not just what it contains.

  • Energy - Energetic state (methodical, resolute, exhausted)
  • State - Life/project state (infrastructure-building, crisis, integration)
  • Orientation - Directional facing (toward sovereignty, toward manifestation)
  • Substrate - Foundational structure underlying the signal
  • Ontological States - Being-states (sovereign, embedded, coherent)
  • Symbolic Elements - Recurring motifs and symbols (mirror, trail, extraction)
  • Subsystems - Which subsystems are engaged (cognitive, relational, infrastructural)
  • Dominant Language - Semantic field shaping the signal

Signal Lifecycle

  1. Capture — Signal is created with minimal data: type, context, raw payload
  2. Analysis — Surface and structural layers are extracted and stored
  3. Clustering — Can be grouped with related signals in clusters
  4. Pattern Detection — Cross-signal analysis identifies longitudinal patterns
  5. Synthesis — Higher-order insights generated at cluster level
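
The first two stages can be sketched as a pipeline in which each step enriches the signal. This is a hypothetical illustration: the PENDING-to-ACTIVE transition and the stubbed analysis are assumptions, not the documented state machine.

```python
def capture(raw: dict) -> dict:
    """Stage 1: create the signal with minimal data (type, context, payload)."""
    return {
        "signal_type": raw["type"],
        "signal_context": raw["context"],
        "signal_payload": raw["payload"],
        "signal_status": "PENDING",        # assumed pre-analysis status
    }

def analyze(signal: dict) -> dict:
    """Stage 2: attach surface and structural layers (stubbed here)."""
    signal["signal_entities"] = []           # surface layer placeholder
    signal["signal_energy"] = "methodical"   # structural layer placeholder
    signal["signal_status"] = "ACTIVE"       # assumed post-analysis status
    return signal

signal = analyze(capture({
    "type": "DOCUMENT",
    "context": "NOTE",
    "payload": {"content": "Raw note.", "format": "plain"},
}))
```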

Key Principles

Signals are input for pattern recognition

Not content for consumption. Signals are raw documentation of lived reality, processed by analysis to identify patterns and generate insights.

Every signal belongs to a realm

Signals don't exist in isolation. They're always part of a realm, ensuring clear ownership and access control.

Analysis data can be manually edited

All analysis fields are editable. Changes are tracked in signal_history with trigger type (user_edit, re_synthesis, etc.).

Location is metadata, not a signal type

Geographic coordinates attach to any signal. Places are clustering context, not signals themselves.

Related Concepts