unified-rendering-pipeline
🎯 The Case of the Unified Rendering Pipeline (Rendering Unto Caesar)
Case Details
Case ID: task-b268053f-029f-4921-b66a-adc82234b26f
Status: parked
Created: 2025-11-12T13:54:32.010Z
Updated: 2025-11-12T13:54:32.010Z
Priority: 2
Investigation Details
Case Type: investigation
Urgency Level: routine
Notes
Vision
Unify Stuffy's rendering pipeline so that Oculus nodes, Stuffy channels, and all markdown content flow through ONE rendering engine with consistent enhancements.
Goal: Render markdown from any source (Oculus, channels, uploads) through a single, enhanced pipeline that supports:
- Data-path attributes on headings (header navigation)
- Auto-wrapped enhanced-table components
- Web component protection and rendering
- Castle web components (Oculus-specific UI overlays)
Status
✅ Phase 1 COMPLETE - Unified renderMarkdown() usage
- All Oculus node rendering now uses `renderMarkdown()` instead of plain `marked.parse()`
- Oculus nodes get the same enhancements as channels
- Enhanced tables, header navigation, and web components all work
- Committed: e4dd6c75
Phase 2: Migrate to markdown-it.js (OUTSTANDING)
Problem:
- Oculus backend uses `markdown-it-py` (Python) → generates markdown-it tokens
- Stuffy frontend uses `marked.js` (JavaScript) → incompatible token format
- Can't inject AST directly (formats don't match)
Solution: Migrate Stuffy from marked.js → markdown-it.js
Tasks:
- Install markdown-it.js in the Stuffy frontend
- Rewrite `renderMarkdown()` to use the markdown-it API
- Port custom renderer enhancements:
  - Heading renderer with data-path attributes
  - Table renderer with enhanced-table wrapping
  - Web component protection logic
- Test all existing features (tables, headings, components)
- Remove the marked.js dependency
Touchpoints:
- `/Users/graemefawcett/org/stuffy-poc/public/index.html` (lines 8, 770-862)
- `renderMarkdown()` function - complete rewrite
- May need markdown-it plugins for GFM tables, etc.
Effort: 4-6 hours
Risk: Medium (need to preserve all existing features)
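The Phase 2 renderer enhancements can be sketched against markdown-it's renderer-rule API. This is a minimal sketch, not the actual Stuffy code: the `headingPath` slug scheme and the `<enhanced-table>` wrapper tag are illustrative assumptions; the rule hooks (`heading_open`, `table_open`, `attrSet`, `renderToken`) are real markdown-it APIs.

```javascript
// Minimal sketch of Phase 2's custom renderer rules for markdown-it.
// headingPath() and <enhanced-table> are hypothetical names.

// Derive a slug-style path from heading text (assumed scheme).
function headingPath(text) {
  return text
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-+|-+$/g, '');
}

function installEnhancements(md) {
  // Heading rule: attach a data-path attribute for header navigation.
  md.renderer.rules.heading_open = (tokens, idx, options, env, self) => {
    const inline = tokens[idx + 1]; // heading text lives in the next token
    const text = inline && inline.type === 'inline' ? inline.content : '';
    tokens[idx].attrSet('data-path', headingPath(text));
    return self.renderToken(tokens, idx, options);
  };

  // Table rules: wrap every table in an enhanced-table web component.
  md.renderer.rules.table_open = () => '<enhanced-table><table>';
  md.renderer.rules.table_close = () => '</table></enhanced-table>';

  return md;
}
```

Porting the marked.js renderer this way keeps each enhancement as an isolated rule, which should make the feature-parity testing step mostly mechanical.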
Phase 3: AST Injection & Backend Integration (OUTSTANDING)
Goal: Enable Stuffy to consume AST directly from Oculus instead of markdown strings
Tasks:
- Add `?format=ast` support to Oculus API endpoints
- Add a `renderFromAST()` function in Stuffy that accepts markdown-it tokens
- Add source detection (Oculus vs channel) in the rendering pipeline
- Layer Oculus-specific overlays when the source is Oculus:
  - Cardinal navigator overlay
  - Fence toolbar overlay (future)
  - Slot panel overlay
- Keep a markdown fallback for backwards compatibility
Touchpoints:
Oculus API: `/Users/graemefawcett/org/services/oculus-api/oculus/api.py`
- Add a `format` parameter to `GET /api/oculus/node/{slug}`
- Return the serialized token tree when `format=ast`
Stuffy: `/Users/graemefawcett/org/stuffy-poc/public/index.html`
- Add a `renderFromAST(tokens)` function
- Detect whether a response is AST or markdown
- Route accordingly
Effort: 6-8 hours
Risk: Low (additive, doesn't break existing)
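The Phase 3 routing can be sketched on the Stuffy side. Assumptions for illustration: `md` is the shared markdown-it instance from Phase 2, and the Oculus payload carries a `content` field that is either a token array (when `format=ast`) or a markdown string; `md.renderer.render(tokens, options, env)` is markdown-it's real token-stream renderer.

```javascript
// Sketch of Phase 3 consumption in Stuffy. The payload shape
// ({ content: tokens | string }) is an assumption, not the real API.

// Render directly from markdown-it tokens delivered by Oculus.
function renderFromAST(md, tokens) {
  return md.renderer.render(tokens, md.options, {});
}

// Route a node payload: token array -> AST path, string -> markdown fallback.
function renderNode(md, payload) {
  if (Array.isArray(payload.content)) {
    return renderFromAST(md, payload.content); // AST from format=ast
  }
  return md.render(String(payload.content));   // backwards-compatible markdown
}
```

Because the fallback branch is the existing markdown path, shipping this detection first keeps the change additive, matching the low-risk assessment above.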
Phase 4: Channel Migration to Oculus Backend (FUTURE)
Goal: Port regular Stuffy channels to use Oculus graph nodes as backend
Concept:
- Channels become special Oculus nodes (or stay as ephemeral content)
- Content stored in graph = permanent, queryable
- Content in channels = ephemeral, streaming
- Both render through same pipeline
Tasks (deferred):
- Design channel → node mapping
- Implement channel storage in graph
- Update Stuffy server to emit from graph
- Maintain backwards compatibility
Dependencies
BLOCKED BY: Token Tree Cache case (linked below)
- Cache is important for rendering optimization
- But NOT required to complete pipeline unification
- Can complete Phase 2-3 without cache
- Cache enables Phase 4 + live updates
Success Criteria
✅ Phase 1: Unified renderMarkdown() usage
⬜ Phase 2: Stuffy uses markdown-it.js
⬜ Phase 3: AST injection working from Oculus
⬜ Phase 4: Channels backed by graph (future)
Result: ONE rendering engine, any content source, consistent behavior.
Investigation Timeline
2025-11-12T13:54:32.011Z - System - case_created
Case opened from CLI
Auto-Detected Keywords
investigation
This case file is automatically updated. For investigation logs, see the corresponding log channel.
📋 Case To-Do List
This case has an integrated to-do list system that syncs with the Oculus knowledge graph. The to-do list uses the virtual:todo-list fence which auto-detects GitHub-style checkbox markdown.
How the To-Do System Works
- Auto-Detection: Checkbox lists are automatically detected as `virtual:todo-list` fences
- Alice Integration: Display in the Alice dashboard using `:::wonderland-todo-list slug="${current_case}"`
- ISA Operations: Use fence exec for add/check/update operations
- Metadata Support: Add `[assignee:name]` `[priority:level]` tags to tasks
Case To-Do Operations
- View state: `oculus fence list ${slug}` then `oculus fence view ${slug} <fence-index>`
- Add task: `oculus fence exec ${slug} <fence-index> add "New task"`
- Check task: `oculus fence exec ${slug} <fence-index> check 0`
- Update task: `oculus fence exec ${slug} <fence-index> update 0 "Updated content"`
- Reference: See virtual-fence-todo for full documentation
Current Case Tasks
- 🎯 Solve the case
- 📝 Document findings in investigation notes
- 🔗 Link relevant evidence and consciousness resources
- ✅ Update case status when complete
Next Steps
Add investigation notes and evidence tags as you progress. The to-do list will evolve with your investigation. Tasks can be managed via Oculus fence operations or edited directly in the node.