Behaviors
Every component can have a behavior that controls how it processes requests at runtime. Behaviors turn static diagrams into executable simulations.
Behavior modes
| Mode | What it does |
|---|---|
| Passthrough | Passes requests through unchanged (default) |
| Transform | Changes the shape of data using an expression |
| Filter | Drops a percentage of requests |
| Queue | Holds requests in line and releases them at a set rate |
| Split | Sends requests across outputs using round-robin or weighted distribution |
| Delay | Adds processing time before passing requests on |
| Condition | Routes requests by expression (if/else branching) |
| Retry | Retries failed requests with backoff |
| Rate Limit | Caps how many requests can pass through per step |
| Circuit Breaker | Shuts down after too many failures, then slowly recovers |
| Batch | Collects requests into a group before releasing them all at once |
| Replicate | Copies every request to all outputs (fanout) |
Each component gets exactly one of the 12 behavior modes. If you need combined patterns (e.g. rate limiting + retry), compose them as separate connected components on the canvas.
New to behaviors? Load one of the Lesson templates to see each mode in action. Start with Pipeline Basics for the fundamentals.
Setting behaviors
Via the editor
Section titled “Via the editor”- Double-click a component to open the editor
- The Behavior section is in the Basic tab
- Select a mode and configure parameters
Via AI chat
Ask the AI to set behaviors:
“Make the API filter requests where status !== ‘active’”
The AI will assign the appropriate behavior mode and configure its parameters.
Visual indicator
Components with active behaviors show a lightning bolt badge on the canvas. Hover any component to see its behavior mode, parameters, and active expressions in the tooltip.
Passthrough
The default mode. Requests pass through unchanged with no extra processing.
When to use: Components that just forward data without modification, such as load balancers, proxies, or hubs.
Example: A “Router” component that receives requests and sends them downstream. No configuration needed.
Transform
Changes the shape of data using a JavaScript-style expression.
| Parameter | Range | Default |
|---|---|---|
| Transform expression | Any valid expression | data.transformed = true |
The expression is evaluated safely without eval() or new Function(). Supported operations include simple arithmetic and dotted path assignment.
When to use: Enriching payloads, computing derived values, reformatting data between services.
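To make the dotted-path assignment concrete, here is a minimal, hypothetical sketch of how an expression like `data.total = data.price * data.qty` could be applied without `eval()`, by resolving the paths by hand (this illustrates the idea, not the tool's actual evaluator):

```javascript
// Hypothetical sketch: apply "data.total = data.price * data.qty"
// without eval() by resolving dotted paths manually.
function getPath(obj, path) {
  return path.split(".").reduce((o, key) => (o == null ? undefined : o[key]), obj);
}

function setPath(obj, path, value) {
  const keys = path.split(".");
  const last = keys.pop();
  // Walk (and create) intermediate objects, then assign the leaf.
  const target = keys.reduce((o, key) => (o[key] ??= {}), obj);
  target[last] = value;
}

// Evaluate a single "lhs = a * b" style expression against a request.
const request = { data: { price: 25, qty: 4 } };
setPath(request, "data.total",
  getPath(request, "data.price") * getPath(request, "data.qty"));
// request.data.total is now 100
```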
Examples:
// Calculate order total
data.total = data.price * data.qty

// Add a timestamp
data.processedAt = "2026-03-23"
// Normalize a field
data.email = data.email_raw
Filter
Drops a configurable percentage of incoming requests.
| Parameter | Range | Default |
|---|---|---|
| Drop rate | 0-100% | 0% |
When to use: Simulating packet loss, modeling unreliable networks, testing how downstream components handle missing data.
Examples:
- Set drop rate to 10% to simulate a flaky network connection
- Set drop rate to 50% to model a validation layer that rejects half of incoming requests
- Set drop rate to 100% to fully block a path (useful for A/B testing with Split)
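The drop decision can be sketched as a single probabilistic check per request (a hypothetical simplification, assuming a uniform random draw against the configured rate):

```javascript
// Hypothetical sketch: drop a configurable percentage of requests.
function makeFilter(dropRatePercent, rng = Math.random) {
  return function shouldDrop() {
    // rng() is uniform in [0, 1); compare against the drop rate.
    return rng() * 100 < dropRatePercent;
  };
}

// With a 100% drop rate every request is dropped; with 0% none are.
const blockAll = makeFilter(100);
const passAll = makeFilter(0);
```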
Queue
Holds requests in line and releases them at a controlled rate.
| Parameter | Range | Default |
|---|---|---|
| Capacity | 1+ | 10 |
| Service rate | 1-20/step | 1 |
When the queue is full, the oldest request is dropped to make room; otherwise requests are served in FIFO order. The service rate controls how many requests leave per simulation step, creating natural bottlenecks.
When to use: Modeling message queues, task buffers, or any component where requests wait to be processed.
Examples:
- Message broker: Capacity 100, service rate 5. Buffers bursts and drains steadily.
- Single-threaded worker: Capacity 20, service rate 1. Processes one at a time, builds a backlog under load.
- Fast consumer: Capacity 50, service rate 10. High throughput with overflow protection.
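A minimal sketch of this mechanic, assuming a bounded FIFO buffer that evicts the oldest item when full and releases up to the service rate each step (hypothetical, for illustration only):

```javascript
// Hypothetical sketch of the Queue behavior: bounded FIFO buffer that
// drops the oldest item when full and releases `serviceRate` items per step.
class SimQueue {
  constructor(capacity, serviceRate) {
    this.capacity = capacity;
    this.serviceRate = serviceRate;
    this.items = [];
  }
  enqueue(request) {
    if (this.items.length >= this.capacity) {
      this.items.shift(); // full: drop the oldest to make room
    }
    this.items.push(request);
  }
  step() {
    // Release up to serviceRate items this simulation step.
    return this.items.splice(0, this.serviceRate);
  }
}

const q = new SimQueue(3, 1);
["a", "b", "c", "d"].forEach((r) => q.enqueue(r));
// capacity 3, so "a" was dropped; one item leaves per step
```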
Split
Distributes requests across multiple outgoing connections.
Routing modes:
- Round-robin (default): Sends each request to the next output in sequence
- Weighted: If any outgoing connection has a weight set, requests are distributed probabilistically based on those weights
When to use: Load balancing, sharding, distributing work across parallel workers.
Examples:
- Connect a “Load Balancer” to three “Server” components. Split sends requests 1, 2, 3 to servers A, B, C in order, then repeats.
- Set connection weights (e.g. 70/30) to route more traffic to a primary server than a secondary.
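The two routing modes can be sketched as follows (a hypothetical illustration of round-robin and weighted selection, not the tool's internals):

```javascript
// Hypothetical sketch of Split routing: round-robin by default,
// weighted when connections carry weights.
function makeRoundRobin(outputs) {
  let i = 0;
  return () => outputs[i++ % outputs.length];
}

function makeWeighted(outputs, weights, rng = Math.random) {
  const total = weights.reduce((a, b) => a + b, 0);
  return () => {
    // Draw a point in [0, total) and find which weight bucket it lands in.
    let r = rng() * total;
    for (let i = 0; i < outputs.length; i++) {
      r -= weights[i];
      if (r < 0) return outputs[i];
    }
    return outputs[outputs.length - 1];
  };
}

const rr = makeRoundRobin(["A", "B", "C"]);
// rr() yields A, B, C, A, ... in sequence
```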
Delay
Adds processing time before forwarding requests.
| Parameter | Range | Default |
|---|---|---|
| Delay (ms) | 10+ ms | 100ms |
When to use: Simulating slow services, database queries, external API calls, or any component with meaningful processing time.
Examples:
- Database query: Set delay to 200ms to model a typical DB round-trip
- External API: Set delay to 500ms to model a third-party service call
- CPU-intensive task: Set delay to 1000ms to model heavy computation
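Conceptually, Delay holds each request until its release time has passed. A hypothetical sketch:

```javascript
// Hypothetical sketch of Delay: hold each request until enough
// simulated time has elapsed, then forward it.
class DelayLine {
  constructor(delayMs) {
    this.delayMs = delayMs;
    this.pending = []; // [{ releaseAt, request }]
  }
  accept(request, nowMs) {
    this.pending.push({ releaseAt: nowMs + this.delayMs, request });
  }
  tick(nowMs) {
    // Forward everything whose delay has expired.
    const ready = this.pending.filter((p) => p.releaseAt <= nowMs);
    this.pending = this.pending.filter((p) => p.releaseAt > nowMs);
    return ready.map((p) => p.request);
  }
}

const d = new DelayLine(200); // e.g. a simulated DB round-trip
d.accept("query", 0);
```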
Condition
Routes requests by evaluating a JavaScript expression against the request data.
| Parameter | Range | Default |
|---|---|---|
| Condition expression | Any valid expression | (none) |
- True path goes to the first outgoing connection
- False path goes to the second outgoing connection
When to use: If/else branching, content-based routing, validation gates.
Examples:
// Route premium users to fast lane
data.tier === "premium"

// Only allow authenticated requests
data.token !== undefined

// Route large orders to bulk processing
data.quantity > 100

// Combine conditions
data.status === "active" && data.verified === true
See the Branching & Merging lesson template for a working example.
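The true/false routing rule amounts to picking the first or second connection based on a predicate. A hypothetical sketch:

```javascript
// Hypothetical sketch of Condition routing: a predicate over the
// request data picks the first (true) or second (false) connection.
function route(request, predicate, outputs) {
  return predicate(request.data) ? outputs[0] : outputs[1];
}

// Example predicate mirroring `data.tier === "premium"`.
const isPremium = (data) => data.tier === "premium";
const chosen = route({ data: { tier: "premium" } }, isPremium, ["fast", "slow"]);
// chosen === "fast"
```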
Retry
Re-sends requests to the downstream component on simulated failure.
| Parameter | Range | Default |
|---|---|---|
| Max retries | 1-10 | 3 |
| Backoff steps | 1-10 | 1 |
| Failure rate | 0-100% | 30% |
The failure rate controls how often the simulated call fails. On each failure, the request waits backoff steps × attempt number simulation steps before trying again. After exhausting all retries, the request is dropped.
When to use: Modeling resilient API calls, transient failure recovery, testing retry storms.
Examples:
- Flaky microservice: Failure rate 20%, max retries 3, backoff 2. Retries with increasing delays (2, 4, 6 steps).
- Aggressive retry: Failure rate 50%, max retries 5, backoff 1. Many quick retries to stress-test downstream.
- Conservative retry: Failure rate 10%, max retries 2, backoff 5. Few retries with long pauses.
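The linear backoff schedule (2, 4, 6 steps in the first example above) can be sketched like this; the function and its `failsFirstN` parameter are hypothetical simplifications that replace the random failure rate with a deterministic count:

```javascript
// Hypothetical sketch of Retry with linear backoff: on each failure
// the request waits (backoffSteps * attemptNumber) steps before retrying,
// and is dropped once maxRetries is exhausted.
function simulateRetry(maxRetries, backoffSteps, failsFirstN) {
  let failures = 0;
  const waits = [];
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    const failed = failures < failsFirstN;
    if (!failed) return { delivered: true, waits };
    failures++;
    waits.push(backoffSteps * attempt); // linear backoff: 2, 4, 6, ...
  }
  return { delivered: false, waits };
}

// A call that fails twice then succeeds, with backoff 2:
const result = simulateRetry(3, 2, 2);
// result.waits === [2, 4] and the request is eventually delivered
```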
Rate Limit
Caps how many requests pass through per simulation step.
| Parameter | Range | Default |
|---|---|---|
| Rate limit | 1-100 req/step | 5 |
Any requests above the limit are dropped. Useful for modeling API rate limits or throughput caps.
When to use: API gateways, throttling layers, protecting downstream services from burst traffic.
Examples:
- API gateway: Rate limit 10/step. Allows steady traffic, drops excess during spikes.
- Strict throttle: Rate limit 1/step. Only one request gets through at a time.
See the Resilience Patterns lesson template for rate limit + retry + circuit breaker working together.
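The per-step cap can be sketched as a counter that resets each simulation step (a hypothetical illustration):

```javascript
// Hypothetical sketch of Rate Limit: a per-step counter that admits
// the first N requests each simulation step and drops the rest.
function makeRateLimiter(limitPerStep) {
  let step = -1;
  let count = 0;
  return function admit(currentStep) {
    if (currentStep !== step) {
      step = currentStep; // new step: reset the counter
      count = 0;
    }
    return ++count <= limitPerStep;
  };
}

const admit = makeRateLimiter(2);
// In step 0, the first two requests pass and the third is dropped.
```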
Circuit Breaker
Protects against cascading failure by tripping open after too many errors.
| Parameter | Range | Default |
|---|---|---|
| Failure threshold | 1-20 | 3 |
| Cooldown steps | 1-50 | 5 |
| Failure rate | 0-100% | 20% |
State machine:
- Closed (normal): Requests pass through. Failures are counted.
- Open (tripped): All requests are dropped immediately. Waits for cooldown.
- Half-Open (probing): Exactly one test request is allowed through. If it succeeds, the breaker closes and normal traffic resumes. If it fails, it re-trips to open. Remaining queued items wait during the probe (they are not dropped).
When to use: Protecting services from cascading failures, modeling real circuit breaker patterns (e.g. Netflix Hystrix, Resilience4j).
Examples:
- Standard breaker: Threshold 3, cooldown 5, failure rate 20%. Trips after 3 failures, waits 5 steps, then probes.
- Sensitive breaker: Threshold 1, cooldown 10, failure rate 30%. Trips on first failure, long recovery.
- Tolerant breaker: Threshold 10, cooldown 3, failure rate 10%. Absorbs many errors before tripping.
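The closed/open/half-open cycle described above can be sketched as a small state machine (hypothetical; `callFailed` stands in for the random failure rate, and the cooldown is counted in requests rather than steps for brevity):

```javascript
// Hypothetical sketch of the Circuit Breaker state machine.
class Breaker {
  constructor(threshold, cooldownSteps) {
    this.threshold = threshold;
    this.cooldown = cooldownSteps;
    this.failures = 0;
    this.state = "closed";
    this.waited = 0;
  }
  // Called once per request; `callFailed` is the simulated outcome.
  request(callFailed) {
    if (this.state === "open") {
      if (++this.waited < this.cooldown) return "dropped";
      this.state = "half-open"; // cooldown over: allow one probe
    }
    if (this.state === "half-open") {
      if (callFailed) { this.trip(); return "dropped"; } // probe failed: re-trip
      this.state = "closed"; // probe succeeded: resume normal traffic
      this.failures = 0;
      return "passed";
    }
    if (callFailed && ++this.failures >= this.threshold) this.trip();
    return callFailed ? "failed" : "passed";
  }
  trip() { this.state = "open"; this.waited = 0; }
}

const b = new Breaker(2, 2); // trips after 2 failures, cools down for 2
```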
Batch
Collects requests into a group before releasing them all at once.
| Parameter | Range | Default |
|---|---|---|
| Batch size | 2-100 | 3 |
Requests accumulate until the batch size is reached, then all are released simultaneously. After the batch component finishes processing, the merged group is expanded back into individual items for downstream routing. This means a batch of 3 items arrives at the next component as 3 separate packets, not 1 merged packet.
When to use: Modeling batch processing pipelines, bulk API calls, aggregation layers.
Examples:
- Batch writer: Batch size 10. Collects 10 records, then writes them all to the database at once. Downstream components receive 10 individual records.
- Aggregator: Batch size 5. Groups incoming events before sending a summary downstream.
See the Batch & Scale lesson template for batching + replicate fan-out.
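The accumulate-then-release cycle can be sketched as follows (hypothetical; note the released items come back as individual packets, matching the expansion behavior described above):

```javascript
// Hypothetical sketch of Batch: accumulate until batchSize is reached,
// then release every collected item at once as individual packets.
function makeBatcher(batchSize) {
  let buffer = [];
  return function accept(request) {
    buffer.push(request);
    if (buffer.length < batchSize) return []; // still collecting
    const released = buffer;
    buffer = [];
    return released; // downstream sees N separate items, not 1 merged packet
  };
}

const accept = makeBatcher(3);
```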
Replicate
Copies every incoming request to ALL outgoing connections simultaneously.
Unlike Split (one output at a time), Replicate is a full broadcast/fanout.
When to use: Event broadcasting, pub/sub patterns, sending the same data to multiple consumers.
Examples:
- Connect a “Message Bus” to three subscribers. Every message reaches all three.
- Model a CDC (Change Data Capture) pipeline that fans out database changes to search, cache, and analytics services.
Composing behaviors
Since each component gets one behavior, combine patterns by chaining components:
[Rate Limit] -> [Retry] -> [Circuit Breaker] -> [Service]
This models a real resilience stack: throttle incoming traffic, retry transient failures, and trip the breaker if the service is down. Each component handles one concern.
Other useful compositions:
[Filter] -> [Transform] -> [Batch] -> [Database]
Validate requests, enrich the payload, batch them up, then write to storage.
[Condition] -> [Split] -> [Worker A] -> [Queue] -> [Worker B]
Route by expression, then distribute or buffer depending on the path.