📡 Signals
Atomic units of lived data. The foundation of pattern recognition and synthesis.
What is a Signal?
A signal is the atomic unit of lived data in Autonomy. It represents a single moment of documented reality — raw capture that forms the input layer for AI-powered pattern detection and synthesis.
Signals are not content. They're not posts. They're timestamped documentation, optionally geolocated, preserved with full structural fidelity.
Every signal belongs to a realm and has a type (the medium) and optional context (the intent).
Signal Types (Medium)
DOCUMENT
Text in any form — writing, notes, code, references. The most common signal type.
PHOTO
Visual capture. Images preserved with technical metadata when available.
TRANSMISSION
Audio or video recordings. YouTube videos. Podcast episodes. Processed via transcript.
CONVERSATION
Dialogue logs. AI chat transcripts. Co-created content from back-and-forth exchange.
Signal Context (Intent)
Context describes why a signal was created. It's optional but provides valuable metadata for AI synthesis and pattern recognition. Both taxonomies are sketched as code after this list.
CAPTURE
Default — generic documentation, intent to be determined
NOTE
Quick capture, ephemeral thought
JOURNAL
Reflective writing, daily log
CODE
Technical artifact, implementation
REFERENCE
External source, citation
OBSERVATION
Field note, documented reality
ARTIFACT
Created work, finished piece
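Taken together, the medium and intent taxonomies form two small closed sets. A minimal TypeScript sketch (the type names are illustrative, not Autonomy's published schema):

```typescript
// Medium: how the signal was captured.
type SignalType = "DOCUMENT" | "PHOTO" | "TRANSMISSION" | "CONVERSATION";

// Intent: why the signal was captured. Optional; CAPTURE is the default.
type SignalContext =
  | "CAPTURE"      // generic documentation, intent to be determined
  | "NOTE"         // quick capture, ephemeral thought
  | "JOURNAL"      // reflective writing, daily log
  | "CODE"         // technical artifact, implementation
  | "REFERENCE"    // external source, citation
  | "OBSERVATION"  // field note, documented reality
  | "ARTIFACT";    // created work, finished piece
```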
Signal Structure
Core Fields
- signal_id - Unique identifier (ULID format)
- realm_id - Which realm this signal belongs to
- signal_type - Medium: DOCUMENT, PHOTO, TRANSMISSION, CONVERSATION
- signal_context - Intent: CAPTURE, NOTE, JOURNAL, CODE, REFERENCE, OBSERVATION, ARTIFACT
- signal_title - Brief title (initially from synthesis)
- signal_description - Longer description (initially from synthesis)
- signal_author - Who created/captured this signal
- signal_temperature - Importance (-1.0 to 1.0, default 0.0)
- signal_status - ACTIVE, PENDING, REJECTED, FAILED, or ARCHIVED
- signal_visibility - PUBLIC, PRIVATE, SANCTUM, or SHARED
Temporal Data
- stamp_created - When the original content was captured/created
- stamp_imported - When the signal was ingested into the system
- stamp_updated - Last modification timestamp (auto-updated)
Geospatial Data (Optional)
- signal_location - PostGIS Point (PostgreSQL), GeoJSON format
- signal_latitude / signal_longitude - Decimal coordinates (MySQL)
Type-Specific Data
- signal_metadata - Technical/immutable facts about the signal
- signal_payload - The actual content data
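Assembling the field lists above into one record, a signal might be typed as follows. This is a hedged sketch reusing the SignalType and SignalContext unions from the earlier block; the TypeScript shape itself is an assumption, not the actual schema:

```typescript
type SignalStatus = "ACTIVE" | "PENDING" | "REJECTED" | "FAILED" | "ARCHIVED";
type SignalVisibility = "PUBLIC" | "PRIVATE" | "SANCTUM" | "SHARED";

interface Signal {
  // Core fields
  signal_id: string;                 // ULID
  realm_id: string;                  // owning realm
  signal_type: SignalType;           // medium (see taxonomy sketch above)
  signal_context?: SignalContext;    // intent; optional, defaults to CAPTURE
  signal_title?: string;             // initially from synthesis
  signal_description?: string;       // initially from synthesis
  signal_author: string;
  signal_temperature: number;        // importance, -1.0 to 1.0 (default 0.0)
  signal_status: SignalStatus;
  signal_visibility: SignalVisibility;

  // Temporal data (ISO 8601 strings assumed)
  stamp_created: string;             // when the original content was captured
  stamp_imported: string;            // when the signal was ingested
  stamp_updated: string;             // auto-updated on modification

  // Geospatial data (optional); GeoJSON coordinate order is [longitude, latitude]
  signal_location?: { type: "Point"; coordinates: [number, number] };
  signal_latitude?: number;          // decimal coordinates (MySQL)
  signal_longitude?: number;

  // Type-specific data, shaped per signal_type (see sections below)
  signal_metadata: Record<string, unknown>;  // technical/immutable facts
  signal_payload: Record<string, unknown>;   // the actual content
}
```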
DOCUMENT - Metadata & Payload
Metadata:
- word_count - Number of words
- character_count - Number of characters
- language - Language code (e.g., 'en', 'es')
- file_extension - File extension (e.g., '.md', '.txt')
- encoding - Character encoding (e.g., 'utf-8')
- mime_type - MIME type (e.g., 'text/plain')
Payload:
- content - The actual text content
- format - Rendering format: 'plain', 'markdown', or 'html'
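As a concrete illustration, a short markdown note's type-specific data might look like this (all values invented for the example):

```typescript
const documentSignal = {
  signal_metadata: {
    word_count: 42,
    character_count: 256,
    language: "en",
    file_extension: ".md",
    encoding: "utf-8",
    mime_type: "text/plain",
  },
  signal_payload: {
    content: "# Morning pages\n\nWoke early, walked the ridge before work...",
    format: "markdown",
  },
};
```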
PHOTO - Metadata & Payload
Metadata (EXIF & Properties):
- camera - Camera model
- lens - Lens information
- iso - ISO sensitivity
- aperture - Aperture value (e.g., 'f/1.5')
- shutter_speed - Shutter speed (e.g., '1/120')
- focal_length - Focal length in mm
- width - Image width in pixels
- height - Image height in pixels
- file_size - File size in bytes
- mime_type - MIME type (e.g., 'image/jpeg')
- color_space - Color space (e.g., 'sRGB')
- timestamp_original - Original capture timestamp from EXIF
- gps_altitude - GPS altitude in meters
Payload:
- file_path - Local path or cloud storage URL
- thumbnail_path - Optimized thumbnail path
- original_filename - Original filename when uploaded
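A hypothetical PHOTO signal's type-specific data, with invented EXIF values and storage paths:

```typescript
const photoSignal = {
  signal_metadata: {
    camera: "Pixel 8 Pro",
    lens: "6.9mm f/1.7",
    iso: 100,
    aperture: "f/1.7",
    shutter_speed: "1/640",
    focal_length: 6.9,
    width: 4080,
    height: 3072,
    file_size: 3481600,
    mime_type: "image/jpeg",
    color_space: "sRGB",
    timestamp_original: "2025-06-14T08:12:33Z",
    gps_altitude: 812.4,
  },
  signal_payload: {
    file_path: "s3://example-bucket/realms/example-realm/IMG_2041.jpg",
    thumbnail_path: "s3://example-bucket/realms/example-realm/IMG_2041_thumb.jpg",
    original_filename: "IMG_2041.jpg",
  },
};
```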
TRANSMISSION - Metadata & Payload
Metadata:
- source_type - 'audio', 'video', or 'other'
- source_url - YouTube URL, file path, etc.
- youtube_id - YouTube video ID (if applicable)
- youtube_channel - YouTube channel name
- youtube_published_at - YouTube publish timestamp
- youtube_thumbnail - YouTube thumbnail URL
- timestamps - Array of topic markers: [{topic, timestamp}]
- duration - Duration in seconds
- bitrate - Bitrate in kbps
- sample_rate - Sample rate in Hz
- channels - Audio channels (1=mono, 2=stereo)
- codec - Codec (e.g., 'h264', 'aac')
- file_size - File size in bytes
- mime_type - MIME type (e.g., 'video/mp4')
- width - Video width in pixels
- height - Video height in pixels
- framerate - Framerate in fps
- has_transcript - Boolean
- transcript_method - 'ai', 'manual', or 'imported'
Payload:
- file_path - Local file or cloud storage URL
- transcript - Plain text transcript
- timed_transcript - Array of timestamped segments: [{text, start, end}]
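A hypothetical TRANSMISSION signal for a YouTube video, showing the timestamps and timed_transcript shapes (VIDEO_ID and all other values are placeholders):

```typescript
const transmissionSignal = {
  signal_metadata: {
    source_type: "video",
    source_url: "https://www.youtube.com/watch?v=VIDEO_ID",
    youtube_id: "VIDEO_ID",
    youtube_channel: "Example Channel",
    duration: 1845,          // seconds
    has_transcript: true,
    transcript_method: "ai",
    timestamps: [
      { topic: "Introduction", timestamp: 0 },
      { topic: "Main argument", timestamp: 312 },
    ],
  },
  signal_payload: {
    file_path: "https://www.youtube.com/watch?v=VIDEO_ID",
    transcript: "Welcome back. Today we are talking about...",
    timed_transcript: [
      { text: "Welcome back.", start: 0.0, end: 1.8 },
      { text: "Today we are talking about...", start: 1.8, end: 4.6 },
    ],
  },
};
```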
CONVERSATION - Metadata & Payload
Metadata:
- platform - 'claude', 'chatgpt', 'gemini', 'remnant', or 'other'
- model - Model identifier (e.g., 'claude-sonnet-4')
- message_count - Total number of messages
- turn_count - Number of back-and-forth exchanges
- duration_minutes - Estimated conversation duration
- total_tokens - Total token count (if available)
- started_at - First message timestamp
- ended_at - Last message timestamp
Payload:
- messages - Array of messages: [{role, content, timestamp, metadata}]
- summary - AI-generated conversation summary
- key_points - Array of extracted key insights
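A hypothetical CONVERSATION signal's type-specific data (messages truncated to two for brevity; all values invented):

```typescript
const conversationSignal = {
  signal_metadata: {
    platform: "claude",
    model: "claude-sonnet-4",
    message_count: 14,
    turn_count: 7,
    duration_minutes: 22,
    started_at: "2025-06-14T19:02:11Z",
    ended_at: "2025-06-14T19:24:40Z",
  },
  signal_payload: {
    messages: [
      { role: "user", content: "Help me outline the essay.", timestamp: "2025-06-14T19:02:11Z" },
      { role: "assistant", content: "Here is a possible structure...", timestamp: "2025-06-14T19:02:30Z" },
    ],
    summary: "Outlined a personal essay; settled on a three-part structure.",
    key_points: ["Three-part structure", "Open with the May field note"],
  },
};
```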
Additional Fields
- signal_tags - Array of tags (initially from synthesis)
- signal_embedding - Vector embedding (1536 dimensions) for semantic search
History & Annotations
- signal_history - Audit trail: [{timestamp, action, field, user_id}]
- signal_annotations - User notes and synthesis feedback: {user_notes[], synthesis_feedback[]}
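A small invented example of the audit-trail and annotation shapes:

```typescript
const historyAndAnnotations = {
  signal_history: [
    // One entry per tracked change, e.g. a user renaming a synthesized title.
    { timestamp: "2025-06-15T10:01:00Z", action: "update", field: "signal_title", user_id: "example-user-id" },
  ],
  signal_annotations: {
    user_notes: ["Revisit this one when drafting the June review."],
    synthesis_feedback: ["Title was too generic; user renamed it."],
  },
};
```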
Visibility Levels
PUBLIC
Anyone can view this signal (if they have access to the realm).
SANCTUM
Only users with SANCTUM role or higher can view.
PRIVATE
Only the realm owner can view this signal.
SHARED
Reserved for future multi-user realm features.
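The four levels reduce to a simple read-access check. A minimal sketch, assuming a realm-standing object; the role helpers are assumptions, since the docs only name the SANCTUM role and the realm owner:

```typescript
type Visibility = "PUBLIC" | "PRIVATE" | "SANCTUM" | "SHARED";

// Viewer's standing in the signal's realm (shape assumed for this sketch).
interface RealmStanding {
  hasAccess: boolean;         // can the viewer see the realm at all?
  isOwner: boolean;           // is the viewer the realm owner?
  isSanctumOrHigher: boolean; // SANCTUM role or higher?
}

function canView(visibility: Visibility, standing: RealmStanding): boolean {
  switch (visibility) {
    case "PUBLIC":
      return standing.hasAccess;          // anyone with realm access
    case "SANCTUM":
      return standing.isSanctumOrHigher;  // SANCTUM role or higher
    case "PRIVATE":
      return standing.isOwner;            // realm owner only
    case "SHARED":
      return false;                       // reserved for future multi-user realms
  }
}
```

Treating SHARED as deny-by-default matches its reserved status; an implementation would revisit that branch when multi-user realms land.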
Signal Lifecycle
- Capture — The signal is created with minimal data: type, context, and raw payload
- Synthesis — AI processes the signal and generates METADATA/SURFACE (title, description, tags)
- Enrichment — Title, description, and tags are copied to the signal table for display (see the sketch after this list)
- Clustering — The signal can be grouped with related signals into clusters
- Deep Synthesis — STRUCTURE and PATTERNS analysis for cross-signal insights
- Reflection — MIRROR, MYTH, and NARRATIVE generation at the cluster level
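A hedged sketch of steps 1 through 3, with shapes assumed from the field tables above: capture stores only the essentials, synthesis produces surface fields, and enrichment copies them onto the signal for display.

```typescript
// Step 1: Capture. The signal is created with minimal data.
const draft = {
  signal_type: "DOCUMENT" as const,
  signal_context: "NOTE" as const,
  signal_payload: {
    content: "Idea: treat signals like sensor readings for a life.",
    format: "plain",
  },
};

// Step 2: Synthesis. AI returns surface fields (shape assumed).
const surface = {
  title: "Signals as sensor readings",
  description: "A quick note framing raw captures as sensor data for later analysis.",
  tags: ["metaphor", "signals"],
};

// Step 3: Enrichment. Surface fields are copied onto the signal for display.
const enriched = {
  ...draft,
  signal_title: surface.title,
  signal_description: surface.description,
  signal_tags: surface.tags,
};
```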
Key Principles
Signals are input for pattern recognition
Not content for consumption. Raw documentation of lived reality that AI synthesis processes to identify patterns and generate insights.
Every signal belongs to a realm
Signals don't exist in isolation. They're always part of a realm, ensuring clear ownership and access control.
Title/description/tags come from synthesis
These display fields are initially AI-generated, then user-editable. Changes are tracked in signal_history.
Location is metadata, not a signal type
Geographic coordinates attach to any signal. Places are clustering context, not signals themselves.