A step-by-step guide to designing activities, workflows, workers, and clients
using the drag-and-drop visual builder — then generating production-ready Python code.
Tip: On first visit, a sample Order Processing System loads automatically with 6 types, 5 activities, 2 workflows, 3 workers, and 2 clients. Explore it or click Clear All to start fresh.
The builder runs at localhost:3000. Its sidebar provides draggable primitive types (string, int, float, bool) and components (⚡ Activity, 🔄 Workflow, ⚙ Worker), plus the project metadata fields — name (OrderSystem), namespace (orders-prod), task queue (order-processing), and language (Python). Tabs across the top — Types, Activities, Workflows, Workers, Clients, Results — organize your workspace; content appears there as you add items.
1
Define Your Types
Types are the data structures your activities and workflows pass around. You can create structs (like dataclasses), enums, and aliases.
STRUCT
Building a struct
01 Click + Add Type on the Types tab
02 Name it in PascalCase — e.g. OrderInput
03 Drag a string from the sidebar onto the drop zone — a field appears
04 Name the field order_id (snake_case) and check required
05 Repeat — drag int, bool, or your own types as fields
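Following the steps above, the generated types.py would contain roughly this dataclass — a sketch, where the fields beyond order_id are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class OrderInput:
    # required field from step 04
    order_id: str
    # additional illustrative fields (assumed, not from the guide)
    item_count: int = 0
    gift_wrap: bool = False
```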
ENUM
Building an enum
01 Add a type, then change the kind dropdown to enum
02 Click + Add Value for each variant
03 Enter PENDING / pending as name/wire value
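The PENDING / pending pair from step 03 maps a Python variant name to its wire value. The generated enum would look roughly like this (a sketch; the second variant is an assumption for illustration):

```python
from enum import Enum

class OrderStatus(str, Enum):
    # variant name (UPPER_SNAKE) -> wire value sent over the network
    PENDING = "pending"
    PAYMENT_CAPTURED = "payment_captured"  # assumed extra variant
```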
Naming rules: Type names must be PascalCase. Field names must be snake_case. Enum variants should be UPPER_SNAKE. The builder validates these on generate.
2
Define Activities
Activities are the individual units of work — each one does a specific task like validating data, calling an API, or sending an email.
Sync Activities — BLOCKING
The worker executes the function and returns the result directly. Use for fast operations.
→ Database lookups
→ Input validation
→ Simple API calls
Async Activities — HEARTBEAT
Long-running operations that must heartbeat periodically to prove liveness. Can complete out-of-band.
→ Payment processing
→ File processing / ETL
→ Warehouse fulfillment
Configuring an activity:
01 Switch to the Activities tab and click + Add Activity
02 Choose sync or async from the mode dropdown
03 Name it in snake_case — e.g. capture_payment
04 Set Input/Output Types — your custom types appear in the dropdown
05 Set the Timeout (e.g. 30s, 5m, 2h) and Max Retries
06 For async: configure the Heartbeat Timeout and optional Task Queue Override
Heartbeat timeout: If your async activity doesn't heartbeat within this window, Temporal considers it failed and may retry. Set it to something reasonable — e.g. 30s for payment, 5m for warehouse ops.
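In TIDS terms, the async payment activity configured above might look like this — a hedged sketch, where the exact key names (timeout, heartbeat_timeout, max_retries) are assumptions based on the schema reference later in this guide:

```yaml
activities:
  - name: capture_payment
    mode: async                        # long-running, must heartbeat
    input:  { type: PaymentRequest }   # assumed type names
    output: { type: PaymentResult }
    timeout: 5m
    heartbeat_timeout: 30s             # fail fast if the worker stops reporting
    max_retries: 3
```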
3
Build Workflows
Workflows orchestrate your activities into a reliable process. Add steps that reference the activities you defined — they run in sequence.
sync — Caller blocks until complete
async — Fire-and-forget, returns a handle
cron — Runs on a schedule (e.g. 0 3 * * *)
Adding workflow steps:
01 Click + Add Step inside the workflow card
02 Give the step an ID (e.g. validate)
03 Select the step kind — usually activity
04 Pick the activity from the dropdown (only your defined activities appear)
05 Name the output variable (e.g. validated_order) — used by later steps
validate_order
reserve_inventory
capture_payment
fulfill_order
send_notification
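The five-step chain above, expressed as TIDS steps — a sketch, with the step activity key and output variable names assumed:

```yaml
workflows:
  - name: ProcessOrder
    mode: sync
    steps:
      - { id: validate, kind: activity, activity: validate_order, output: validated_order }
      - { id: reserve,  kind: activity, activity: reserve_inventory, depends_on: [validate] }
      - { id: payment,  kind: activity, activity: capture_payment,   depends_on: [reserve] }
      - { id: fulfill,  kind: activity, activity: fulfill_order,     depends_on: [payment] }
      - { id: notify,   kind: activity, activity: send_notification, depends_on: [fulfill] }
```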
Tip: Other step kinds include child_workflow (call another workflow), timer (durable sleep), and condition (branching). Advanced kinds like parallel and continue_as_new can be configured in the generated YAML.
4
Configure Workers
Workers are the running processes that poll Temporal for work. Each worker hosts a set of activities and workflows on a specific task queue.
01 Switch to the Workers tab and click + Add Worker
02 Name it in kebab-case — e.g. order-worker
03 Set the Task Queue it polls (e.g. order-processing)
04 Check the boxes for which activities and workflows this worker hosts
05 Tune concurrency (max activities, max workflow tasks) and resources (replicas, CPU, memory)
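Put together, a worker configured this way might serialize to YAML like the following sketch (the concurrency and resource key names are assumptions):

```yaml
workers:
  - name: order-worker
    task_queue: order-processing
    activities: [validate_order, reserve_inventory, fulfill_order]
    workflows: [ProcessOrder]
    max_concurrent_activities: 20
    replicas: 3
    resources: { cpu: "500m", memory: "512Mi" }
```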
Why multiple workers? Separate workers let you scale independently — e.g. 3 replicas for the order-worker but only 1 for notifications. Activities on different task queues must be on separate workers.
Validation check: The builder warns you if any activity or workflow isn't assigned to a worker — unassigned items won't execute. Each worker must host at least one activity or workflow.
5
Set Up Clients
Clients are how your application code starts workflows, sends signals, and queries state. Each client gets type-safe generated stubs.
01 Switch to Clients and click + Add Client
02 Name it api-client (kebab-case)
03 Set the Target — the Temporal server address (e.g. localhost:7233)
04 Choose default mode: async (returns handle) or sync (blocks)
05 Check the workflows this client is allowed to invoke — this is a type-safe allow-list
06 Optionally enable TLS for production connections
The generator produces typed methods for each allowed workflow:
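A sketch of what such a generated stub might look like. The class name, method names, and task queue here are assumptions; in the real generated code the wrapped client would be a connected Temporal client:

```python
class ApiClient:
    """Typed wrapper over a Temporal client (sketch of generated code)."""

    def __init__(self, temporal_client, task_queue="order-processing"):
        self._client = temporal_client
        self._task_queue = task_queue

    async def execute_process_order(self, order, *, workflow_id):
        # sync default mode: blocks until the workflow completes
        return await self._client.execute_workflow(
            "ProcessOrder", order, id=workflow_id, task_queue=self._task_queue
        )

    async def start_process_order(self, order, *, workflow_id):
        # async mode: fire-and-forget, returns a handle
        return await self._client.start_workflow(
            "ProcessOrder", order, id=workflow_id, task_queue=self._task_queue
        )
```

Because only allow-listed workflows get methods, calling anything else is a compile-time (or at least lint-time) error rather than a runtime surprise.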
You're ready! Preview your YAML, then hit Generate to produce the full project.
Before generating
01 Click Preview YAML to inspect the config
02 Check for any red dots on tabs — they indicate validation errors
03 Click Generate — errors show in a panel with clickable links
04 Fix any issues and regenerate
Generated files
📦 types.py — Dataclasses and enums
⚡ activities.py — Activity stubs with retry policies
🔄 workflows.py — Workflow classes with steps
⚙ worker_*.py — Runnable worker entry points
🔌 client_*.py — Typed client classes
📄 Dockerfile — Container image
📄 docker-compose.yml — Worker services
📄 requirements.txt — Python deps
📄 config.yaml — Your source config
Running your generated project:
# Download the ZIP from the Results tab, then:
cd your-project
pip install -r requirements.txt

# Start a worker
python worker_order_worker.py

# Or use the generated docker-compose
docker compose up --build
Next step: The generated activities have NotImplementedError stubs. Open each activities.py function and replace the stub with your actual business logic — the Temporal wiring, types, retry policies, and heartbeat scaffolding are already in place.
Results Tab
OrderProcessingSystem
20260405_143022_a1b2c3d4
Download ZIP
📦 types.py
⚡ activities.py
🔄 workflows.py
⚙ worker_order.py
⚙ worker_pay.py
🔌 client_api.py
Quick Reference
Naming Conventions
PascalCase
Types, Workflows
OrderInput, ProcessOrder
snake_case
Activities, Fields
validate_order, order_id
kebab-case
Workers, Clients
order-worker, api-client
UPPER_SNAKE
Enum variants
PENDING, PAYMENT_CAPTURED
Duration Format
500ms
Milliseconds
30s
Seconds
5m
Minutes
2h
Hours
1d
Days
Keyboard Shortcuts
Load Sample
Pre-fill with an Order Processing System
Preview YAML
Inspect the generated config before submitting
Clear All
Reset everything to blank
Generate
Validate + send to API + show results
API Endpoints
POST
/api/generate
Submit YAML config
GET
/api/projects
List all projects
GET
/api/projects/{id}/download
Download ZIP
GET
/api/health
Health check
TIDS v1.0
Schema Reference
The Temporal Interface Definition Schema (TIDS) is the YAML format that defines your entire Temporal application. Here's every top-level key and what it controls.
Required Sections
metadata— Project name, namespace, default task queue, target language
activities— Activity definitions with sync/async mode, I/O types, timeouts, retries
workflows— Workflow orchestrations with steps, signals, queries, updates
workers— Worker processes with task queues, concurrency, resource limits
clients— Client configs with TLS, allowed workflows, connection targets
Optional Sections
types— Reusable structs, enums, and aliases for I/O contracts
retry_policies— Named retry presets (backoff, max attempts, non-retryable errors)
signals— Signal definitions reusable across workflows
queries— Query definitions for reading workflow state
updates— Update handlers with optional validators
pipelines— Multi-workflow orchestration with stages and fan-out
observability— Metrics, tracing, and logging configuration
Key Schema Patterns
Activity I/O Contract
# Each activity declares its
# typed input and output
input:
  type: DocumentInput
output:
  type: ExtractedText
Workflow Step Graph
# Steps form a DAG with
# depends_on edges
steps:
  - id: extract
    kind: activity
    depends_on: [upload]
Types used in input.type, output.type, and fields[].type follow this grammar:
type_expr = primitive | reference | generic
primitive = "string" | "int" | "float" | "bool" | "datetime" | "duration"
reference = PascalCaseName   # refers to a types entry
generic   = "list<" type ">" | "map<" type "," type ">" | "optional<" type ">"
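A few field declarations that are valid under this grammar (OrderItem here stands in for any of your own struct types):

```yaml
fields:
  - { name: order_id,   type: string, required: true }
  - { name: items,      type: list<OrderItem> }       # reference inside a generic
  - { name: prices,     type: map<string, float> }
  - { name: shipped_at, type: optional<datetime> }
```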
EXAMPLE
Manual Schema: AI Document Pipeline
Let's write a complete YAML schema by hand for a real-world AI workflow: a document-processing pipeline that uploads documents, extracts text, stores embeddings in a vector database, and serves an LLM-powered chat agent — with Temporal orchestrating every step.
Two workflows in this example: (1) A document ingestion pipeline that processes uploaded files into a searchable context database, and (2) a chat agent session where each user message triggers a Temporal workflow that retrieves context and calls the LLM.
Architecture Overview
Upload Doc
PDF / DOCX
Extract Text
sync activity
Chunk + Embed
sync activity
Store in VectorDB
async activity
User Message
chat input
Retrieve Context
vector search
Call LLM
async + heartbeat
Response
streamed back
A. Metadata & Types
Start every schema with the version, project metadata, and the data structures your activities will pass around.
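A sketch of that opening section for this pipeline. The project name, namespace, and exact field lists are assumptions; DocumentInput, ExtractedText, and LLMResponse are the types referenced elsewhere in this example:

```yaml
version: "1.0"                 # TIDS version

metadata:
  name: DocPipeline            # assumed project name
  namespace: ai-docs
  task_queue: ai-pipeline      # queue used by the chat workflows below
  language: python

types:
  - name: DocumentInput
    kind: struct
    fields:
      - { name: doc_id, type: string, required: true }
      - { name: uri,    type: string, required: true }
  - name: ExtractedText
    kind: struct
    fields:
      - { name: doc_id, type: string, required: true }
      - { name: chunks, type: list<string> }
  - name: LLMResponse
    kind: struct
    fields:
      - { name: text,   type: string, required: true }
```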
Why async for LLM calls? LLM inference can take 10-60+ seconds. The async mode tells the generator to wire heartbeat scaffolding — the activity reports activity.heartbeat("streaming...") as tokens arrive. If the worker crashes mid-stream, Temporal detects the missing heartbeat and retries on another worker.
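The activities section itself (which would be section B here) might read as follows — a hedged sketch with assumed key names, where call_llm is marked async so the generator wires in that heartbeat scaffolding:

```yaml
activities:
  - name: extract_text
    mode: sync
    input:  { type: DocumentInput }
    output: { type: ExtractedText }
    timeout: 2m
  - name: chunk_and_embed
    mode: sync
    input:  { type: ExtractedText }
    output: { type: EmbeddedChunks }   # assumed intermediate type
    timeout: 5m
  - name: store_vectors
    mode: async
    input:  { type: EmbeddedChunks }
    timeout: 10m
    heartbeat_timeout: 1m
  - name: retrieve_context
    mode: sync
    timeout: 10s
  - name: call_llm
    mode: async                        # long-running: heartbeats as tokens stream
    output: { type: LLMResponse }
    timeout: 2m
    heartbeat_timeout: 30s
    task_queue: llm-inference          # routed to the GPU workers
```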
C. Workflows — Orchestrating the Pipeline
Two workflows: one for ingestion (async, fire-and-forget), one for each chat turn (sync, blocks until response).
Chat turn as sync workflow: The API gateway calls execute_workflow(ChatTurn, ...) which blocks until the LLM responds. This gives you Temporal's full retry/timeout guarantees on every single chat message — if the LLM worker crashes mid-response, it retries automatically. The user just sees a slightly delayed reply, never a broken connection.
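The two workflows might be declared like this (a sketch; step IDs and key names assumed):

```yaml
workflows:
  - name: IngestDocument
    mode: async                  # fire-and-forget ingestion
    steps:
      - { id: extract, kind: activity, activity: extract_text }
      - { id: embed,   kind: activity, activity: chunk_and_embed, depends_on: [extract] }
      - { id: store,   kind: activity, activity: store_vectors,   depends_on: [embed] }
  - name: ChatTurn
    mode: sync                   # caller blocks until the reply is ready
    steps:
      - { id: retrieve, kind: activity, activity: retrieve_context }
      - { id: llm,      kind: activity, activity: call_llm, depends_on: [retrieve] }
```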
D. Workers & Clients
Separate workers for CPU-bound text extraction, GPU-bound LLM inference, and the API client.
Why three workers? The llm-worker runs on GPU machines with max_concurrent_activities: 4 (limited by VRAM). The ingestion-worker runs on CPU instances for text parsing. The chat-worker handles fast vector lookups at high concurrency. Temporal routes work to the right queue automatically.
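A sketch of the workers and clients sections reflecting that split (replica counts and concurrency limits are illustrative assumptions):

```yaml
workers:
  - name: ingestion-worker
    task_queue: ai-pipeline
    activities: [extract_text, chunk_and_embed, store_vectors]
    workflows: [IngestDocument]
    replicas: 2
  - name: chat-worker
    task_queue: ai-pipeline
    activities: [retrieve_context]
    workflows: [ChatTurn]
    max_concurrent_activities: 100
  - name: llm-worker
    task_queue: llm-inference
    activities: [call_llm]
    max_concurrent_activities: 4   # limited by GPU VRAM
    replicas: 2

clients:
  - name: api-client
    target: localhost:7233
    mode: sync
    workflows: [IngestDocument, ChatTurn]
```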
E. Chat Session Flow — Temporal in Every Interaction
Here's how every single chat message flows through Temporal. The API gateway is just a thin client — all reliability lives in the workflow.
1
User sends a message
The frontend POSTs to /api/chat. The API gateway uses the generated client to call client.execute_chat_turn(message, workflow_id) — this call blocks until the workflow completes.
2
ChatTurn workflow starts
Temporal schedules the workflow on the ai-pipeline queue. Step 1: retrieve_context runs as a sync activity — queries the vector DB with the user's message, returns the top-K matching document chunks.
3
LLM call with context
Step 2: The workflow assembles a prompt (user message + retrieved context) and dispatches call_llm to the llm-inference queue. The LLM worker picks it up, starts streaming tokens, and heartbeats every few seconds. If it crashes, Temporal retries on another GPU worker.
4
Response returned
The LLM activity completes with an LLMResponse. The workflow finishes. The execute_workflow call unblocks and the API gateway returns the response to the user. Total time: typically 2-10 seconds, with full retry guarantees on every step.
Agent pattern: For multi-turn agents that use tools (search, code execution, API calls), each tool invocation becomes an activity. The agent's "reasoning loop" is the workflow — it calls call_llm, checks if the LLM wants to use a tool, executes the tool activity, feeds the result back, and loops via continue_as_new if the conversation exceeds history limits. Temporal makes the entire agent loop durable and retryable.
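That reasoning loop can be sketched in plain Python. In the generated project this loop would live inside a workflow and each call below (llm, tools[...]) would be an activity invocation; the function and tool names here are illustrative assumptions, not generated code:

```python
# Plain-Python sketch of the agent reasoning loop described above.
def agent_loop(llm, tools, user_message, max_turns=10):
    history = [{"role": "user", "content": user_message}]
    for _ in range(max_turns):
        reply = llm(history)                           # -> call_llm activity
        if reply.get("tool") is None:
            return reply["content"]                    # final answer: workflow completes
        result = tools[reply["tool"]](reply["args"])   # -> tool activity
        history.append({"role": "tool", "content": result})
        # in a real workflow, an overlong history would trigger continue_as_new
    raise RuntimeError("agent exceeded max turns")
```

Because each iteration's activity results are recorded in workflow history, a crashed worker resumes the loop exactly where it left off instead of replaying tool side effects.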
Try it yourself
Copy any of the YAML blocks above into a file called config.yaml, then POST it to the builder API:
curl -X POST http://localhost:8000/api/generate \
  -H "Content-Type: text/yaml" \
  -d @config.yaml
Or paste the complete YAML into the builder UI's Preview YAML modal (coming soon: YAML import), configure any remaining fields visually, and hit Generate.