Building Workflows
Learn how to create, manage, and optimize workflows in Circuitry.
Workflow Basics
A workflow is a sequence of connected nodes that process data from start to finish. Workflows can be linear, branched, or parallel, depending on your automation needs.
Creating a New Workflow
Starting from Scratch
- Open the Circuitry Editor
- Your canvas starts with a Start Node already in place - this is your workflow's entry point
- Add additional nodes by dragging from the sidebar
- Connect nodes to define data flow
- Configure each node's settings
- Save your workflow with a descriptive name
Using Templates
Coming soon: Pre-built workflow templates for common use cases.
Workflow Components
Canvas Controls
- Pan: Click and drag on empty canvas space
- Zoom: Use mouse wheel or trackpad
- Fit View: Double-click empty space
- Multi-select: Hold Shift and click nodes
- Delete: Select node(s) and press Delete key
Connection Rules
- One-to-One: Most nodes have single input/output
- One-to-Many: Fork nodes split to multiple outputs
- Many-to-One: Join nodes merge multiple inputs
- Type Safety: Connections validate data compatibility
Execution Flow
Sequential Execution
By default, nodes execute in sequence:
Start → Node A → Node B → Node C → End
Each node waits for the previous node to complete before starting.
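The sequence above can be sketched as an async chain, where each step awaits its predecessor's output. The `runNode*` functions below are illustrative stand-ins, not Circuitry APIs:

```javascript
// Hypothetical node functions; each receives the previous node's output.
async function runNodeA(input) { return { ...input, a: true }; }
async function runNodeB(input) { return { ...input, b: true }; }
async function runNodeC(input) { return { ...input, c: true }; }

// Sequential execution: each node starts only after its predecessor resolves.
async function runWorkflow(startInput) {
  const a = await runNodeA(startInput);
  const b = await runNodeB(a);
  const c = await runNodeC(b);
  return c;
}
```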
Parallel Execution
Use Fork and Join nodes for parallel processing:
             ┌→ Branch A →┐
Start → Fork → Branch B → Join → End
             └→ Branch C →┘
All branches execute simultaneously, which can cut total run time when the branches are independent.
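Conceptually, Fork/Join behaves like running the branches concurrently and merging their results. A minimal sketch, assuming illustrative branch functions (not Circuitry's internals):

```javascript
// Hypothetical branch functions; Fork runs them concurrently.
async function branchA(input) { return { a: input.value * 2 }; }
async function branchB(input) { return { b: input.value + 1 }; }
async function branchC(input) { return { c: `id-${input.value}` }; }

// Fork/Join: start all branches at once, wait for all, then merge outputs.
async function forkJoin(input) {
  const results = await Promise.all([branchA(input), branchB(input), branchC(input)]);
  return Object.assign({}, ...results); // Join merges the branch outputs
}
```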
Conditional Execution
Use Condition nodes to create branching logic:
Start → Condition ─┬→ [if true]  → Process A → End
                   └→ [if false] → Process B → End
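The branching above amounts to routing the input to one of two downstream nodes. A sketch with hypothetical process functions (the condition itself would be configured in the editor):

```javascript
// Hypothetical downstream nodes for each branch.
async function processA(input) { return { route: "A", input }; }
async function processB(input) { return { route: "B", input }; }

// Condition node: evaluates a predicate and routes to one branch.
// The `score > 0.5` check is purely illustrative.
async function conditionNode(input) {
  return input.score > 0.5 ? processA(input) : processB(input);
}
```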
Data Flow
Understanding Data Passing
- Each node receives input from its connected predecessor(s)
- It processes that input according to its configuration
- It produces output for its successor node(s)
- Data is passed between nodes as JSON
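Passing data as JSON means the successor gets a serialized copy, not a shared reference. The payload below is illustrative; its field names are assumptions, not Circuitry's actual schema:

```javascript
// Illustrative payload: the JSON one node might hand to the next.
const nodeOutput = {
  text: "Summary of the source document",
  tokens: 128,
  metadata: { model: "gpt-5", durationMs: 840 }
};

// The successor receives a JSON copy, so mutating it cannot
// affect the producing node's output.
const successorInput = JSON.parse(JSON.stringify(nodeOutput));
```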
Data Transformation
Nodes can transform data in various ways:
- Agent Nodes: Generate new content based on prompts
- Action Nodes: Modify, filter, or enrich data
- Loop Nodes: Process arrays element by element
- Join Nodes: Combine multiple data streams
Variable References
Reference data from previous nodes using template syntax:
{{nodeName.output.field}}
{{input.propertyName}}
{{previousNode.result}}
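A minimal resolver for this template syntax could look like the following. This is a sketch of the idea, not Circuitry's actual template engine:

```javascript
// Resolve {{path.to.field}} placeholders against a context object.
// Unresolvable paths are left untouched so errors are visible.
function resolveTemplates(template, context) {
  return template.replace(/\{\{([^}]+)\}\}/g, (match, path) => {
    const value = path.trim().split(".").reduce(
      (obj, key) => (obj == null ? undefined : obj[key]),
      context
    );
    return value === undefined ? match : String(value);
  });
}
```

For example, `resolveTemplates("Hello {{input.name}}", { input: { name: "Ada" } })` yields `"Hello Ada"`.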
Workflow Management
Saving Workflows
Workflows auto-save as you work. Manual save options:
- Ctrl/Cmd + S: Quick save
- Save As: Create a copy with new name
- Version History: Track changes over time
Organizing Workflows
Best practices for workflow organization:
- Descriptive Names: Use clear, action-oriented names
- Folders: Group related workflows (coming soon)
- Tags: Add tags for easy filtering
- Documentation: Add descriptions to complex workflows
Sharing Workflows
Options for collaboration:
- Export: Download workflow as JSON
- Import: Load workflow from JSON file
- Share Link: Generate shareable URL (coming soon)
- Team Workspaces: Collaborate with team members
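An exported workflow is a single JSON document describing nodes and their connections. The shape below is illustrative only; the field names are assumptions, not Circuitry's documented export format:

```javascript
// Illustrative shape of an exported workflow.
const exportedWorkflow = {
  name: "summarize-and-notify",
  nodes: [
    { id: "start", type: "start" },
    { id: "agent1", type: "agent", config: { prompt: "Summarize {{input.text}}" } },
    { id: "end", type: "end" }
  ],
  edges: [
    { from: "start", to: "agent1" },
    { from: "agent1", to: "end" }
  ]
};

// Export/import round-trips through plain JSON.
const serialized = JSON.stringify(exportedWorkflow, null, 2);
const imported = JSON.parse(serialized);
```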
Execution Monitoring
Real-time Feedback
During execution, observe:
- Node States: Color-coded execution status
- Progress Indicators: Loading spinners on active nodes
- Data Preview: Hover nodes to see input/output
- Execution Time: Duration for each node
- Error Messages: Detailed error information
Execution Logs
Access detailed logs showing:
- Timestamp for each operation
- Input/output data for debugging
- Error stack traces
- Performance metrics
Performance Optimization
Best Practices
- Minimize API Calls: Batch operations when possible
- Use Caching: Enable caching for repeated operations
- Parallel Processing: Use Fork/Join for independent tasks
- Efficient Prompts: Keep AI prompts concise and focused
- Data Filtering: Process only necessary data
Resource Limits
Be aware of system limits:
- Execution Timeout: 5 minutes per workflow
- Memory Limit: 512MB per execution
- API Rate Limits: Varies by service
- Parallel Branches: Maximum 10 concurrent
Error Handling
Common Issues
- Connection Errors: Ensure nodes are properly connected
- Configuration Errors: Validate node settings
- Data Format Errors: Check JSON structure
- API Failures: Handle rate limits and timeouts
- Model Errors: Verify AI model availability
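For API failures in particular, a common mitigation inside an Action node is retrying with exponential backoff. A sketch (the helper name and defaults are illustrative):

```javascript
// Retry a flaky async call with exponential backoff.
// Useful for rate-limited or intermittently failing APIs.
async function withRetry(fn, attempts = 3, baseDelayMs = 500) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err; // out of retries, surface the error
      await new Promise(r => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
}
```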
Debugging Strategies
- Test Incrementally: Execute partial workflows
- Use Sample Data: Test with known inputs
- Check Logs: Review execution logs for details
- Add Logging: Use Action nodes to log intermediate data
- Error Boundaries: Use conditions to handle failures
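The "Add Logging" strategy can be as simple as an Action node that prints its input and passes it through unchanged. A sketch of such a node's body (the function name is illustrative):

```javascript
// A pass-through logging node: logs the incoming payload for debugging,
// then returns it unchanged so the rest of the workflow is unaffected.
function loggingAction(input) {
  console.log("[debug] intermediate payload:", JSON.stringify(input, null, 2));
  return input;
}
```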
Advanced Techniques
Nested Workflows
Call one workflow from another (coming soon):
Main Workflow → Execute Sub-Workflow → Continue
Dynamic Node Configuration
Configure nodes based on runtime data:
// In an Action node: build configuration from runtime input
const config = {
  model: input.preferredModel || 'gpt-5',
  temperature: input.creativity ?? 0.7  // ?? preserves an explicit 0
};
Custom Functions
Write JavaScript in Action nodes:
// Data transformation: stamp each item with a timestamp and status
const processed = input.items.map(item => ({
  ...item,
  timestamp: Date.now(),
  status: 'processed'
}));
return { processed };