Templates

Chinilla includes 16 templates across three categories: Interview for system design interview practice, Lessons for learning individual simulation behaviors, and Examples for real-world workflows.

Classic system design interview problems, fully wired with realistic behaviors, metrics, and costs. Load one, simulate traffic, find the bottleneck, and iterate.

Design a TinyURL-style service. Users submit long URLs, get short links back. Reads heavily outnumber writes. Watch the cache absorb traffic and the DB stay calm.

Design a real-time messaging system like WhatsApp or Slack. Messages flow through WebSocket connections, get persisted, and fan out to recipients.

Design a multi-channel notification service (email, SMS, push). Events come in, get prioritized, routed to the right channel, and tracked for delivery.

Design a distributed rate limiter that protects APIs from abuse. Watch what happens when traffic exceeds the limit: requests get throttled, the backend stays healthy.
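One common way to implement this throttling is a token bucket. The sketch below is illustrative, not Chinilla's implementation:

```python
# Token-bucket rate limiter: requests spend tokens; tokens refill over time.
class TokenBucket:
    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity
        self.tokens = capacity          # start full
        self.refill_per_sec = refill_per_sec
        self.last = 0.0

    def allow(self, now):
        # Refill proportionally to elapsed time, capped at capacity.
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False                    # throttled: the backend never sees this request
```

Bursts up to `capacity` pass immediately; sustained traffic above `refill_per_sec` gets rejected, which is exactly the "backend stays healthy" behavior the template visualizes.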

Design a file storage service like S3 or Google Drive. Handle large uploads with chunking, metadata indexing, and CDN distribution.

Design a distributed event streaming platform like Kafka. Producers publish to topics, partitions handle parallelism, consumers process at their own pace.
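Partitions give parallelism while keeping each key's events ordered, because a given key always hashes to the same partition. A minimal sketch (Kafka's actual partitioner differs):

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Stable hash of the key modulo partition count: the same key always
    # lands on the same partition, so its events stay in order there.
    return zlib.crc32(key.encode()) % num_partitions
```

Consumers can then be assigned disjoint sets of partitions and process them independently, each at its own pace.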

Design a CDN like CloudFront or Akamai. Static content cached at edge nodes close to users. Watch cache hits keep the origin server relaxed.

Design a social media feed like Twitter or Instagram. The classic fan-out problem: when someone posts, who computes the feed and when?
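One answer to that question is fan-out on write: compute each follower's feed at post time so reads are cheap. A minimal sketch (names and data hypothetical):

```python
# Fan-out on write: when a user posts, push the post id into every
# follower's precomputed feed, so reading a feed is a simple lookup.
followers = {"alice": ["bob", "carol"]}
feeds = {"bob": [], "carol": []}

def post(author, post_id):
    for follower in followers.get(author, []):
        feeds[follower].insert(0, post_id)   # newest first
```

The trade-off the template lets you explore: write-time fan-out makes reads fast but is expensive for accounts with millions of followers, where fan-out on read (or a hybrid) wins.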

Focused, minimal templates that teach specific behaviors. Load one, run the simulation, and watch how packets flow.

Linear chain: transform, filter, queue, delay. Learn how packets move through a sequence and where bottlenecks form.

If/else routing with condition mode, fan-out with split, and synchronization with merge. Packets take different paths based on data.

Rate limiting, retry with backoff, and circuit breaker working together. Learn how systems survive traffic floods and flaky dependencies. Connections include network latency (200ms API call round-trips) so you can see how delay compounds with retries.
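The compounding is easy to estimate: every attempt pays the round-trip, and every failed attempt also pays a backoff delay that doubles. A back-of-envelope sketch (illustrative numbers, not Chinilla's defaults):

```python
def total_wait_ms(attempts, base_ms=100, rtt_ms=200):
    # Each attempt costs one round-trip; between attempts we wait an
    # exponentially growing backoff: base, 2*base, 4*base, ...
    backoff = sum(base_ms * (2 ** i) for i in range(attempts - 1))
    return attempts * rtt_ms + backoff
```

With a 200ms round-trip, a single attempt costs 200ms, but three attempts cost 900ms (600ms of round-trips plus 300ms of backoff), which is the delay compounding the lesson makes visible.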

Queue buffering, batch grouping, and replicate broadcasting. See how batching reduces downstream load and replicate fans out to all consumers.
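The load reduction from batching is simple arithmetic: N items grouped into batches of size B become ceil(N/B) downstream calls. A minimal sketch:

```python
def batch(items, size):
    # Split items into consecutive groups of at most `size`; the last
    # batch may be smaller. 10 items at size 4 -> 3 downstream calls.
    return [items[i:i + size] for i in range(0, len(items), size)]
```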

Combines rate limit, condition routing, queue, batch, retry, and replicate in one mini-system. The graduation exercise.

5 components wired to isolate and observe circuit breaker behavior: Steady Traffic (trigger) sends requests through a Gateway (passthrough, 5ms) to a Payment Service (circuit breaker with 40% failure rate, threshold of 2 failures, 3-step cooldown, 200ms processing). Surviving requests flow into a Result Queue (capacity 20, service rate 3) and land in the Order Ledger (storage). The Gateway to Payment connection has 50ms latency. Run at different seed counts (N=1 vs N=5) to see the breaker trip, enter half-open, and probe before recovering.
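The trip, cooldown, and half-open probe sequence described above can be sketched as a small state machine using the template's stated settings (threshold 2, cooldown 3); this is a sketch, not Chinilla's actual code:

```python
class CircuitBreaker:
    def __init__(self, threshold=2, cooldown=3):
        self.threshold, self.cooldown = threshold, cooldown
        self.failures = 0
        self.open_steps = 0
        self.state = "closed"

    def call(self, succeeds):
        if self.state == "open":
            self.open_steps -= 1
            if self.open_steps > 0:
                return "rejected"        # still cooling down: fail fast
            self.state = "half-open"     # cooldown over: this call is the probe
        if succeeds:
            self.failures = 0
            self.state = "closed"        # probe (or normal call) succeeded
            return "ok"
        self.failures += 1
        if self.state == "half-open" or self.failures >= self.threshold:
            self.state = "open"          # trip: start (or restart) the cooldown
            self.open_steps = self.cooldown
        return "failed"
```

Two consecutive failures trip the breaker, subsequent calls are rejected without touching the Payment Service until the cooldown elapses, and then a single probe decides whether to close again or re-open.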

A typical online store: client, API gateway, product service, cart service, order service, payment gateway, database, cache, and event bus. Includes realistic connection latencies (80ms client round-trip, 200ms payment, 1ms cache reads).

End-to-end pottery production: online orders, potting, drying rack, glazing, kiln firing in batches, packing, and shipping with inventory tracking. Includes physical transit latencies between production stages.

  1. From the launcher, click Template
  2. Use category filters (Interview / Lessons / Examples) to narrow options
  3. Click a template card to load it
  4. Modify, validate, and simulate as needed

Start with the Interview category if you’re prepping for system design interviews. Each template is a complete architecture with realistic behaviors, metrics, and costs. The Lessons category teaches specific simulation behaviors step by step.