Multi-Step Workflows
Chain LLM operations into readable, testable pipelines with Orka AI workflows.
# Why Workflows?
Workflows let you chain multiple LLM operations into a single, readable pipeline. Each step transforms the context and passes it to the next. Built-in steps handle common patterns like planning, retrieval, generation, and verification — but you can also create custom steps for any logic.
```typescript
import { createOrka } from '@orka-js/core';
import { OpenAIAdapter } from '@orka-js/openai';
import { plan, retrieve, generate, verify, improve } from '@orka-js/workflow';

const orka = createOrka({
  llm: new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY! }),
  vectorDB: myVectorDB,
});

const workflow = orka.workflow({
  name: 'support-response',
  steps: [
    plan(),                                                 // Analyze input and create action plan
    retrieve('knowledge-base', { topK: 5 }),                // Semantic search in knowledge base
    generate({ temperature: 0.7 }),                         // Generate response using LLM
    verify({ criteria: ['relevant', 'no hallucination'] }), // Evaluate quality
    improve({ maxIterations: 1 }),                          // Fix issues if verify fails
  ],
});

const result = await workflow.run('How do I reset my password?');

console.log(result.output);         // Final response
console.log(result.steps);          // Results from each step
console.log(result.totalLatencyMs); // Total execution time
console.log(result.totalTokens);    // Total tokens consumed
```

# Built-in Steps
Orka provides six built-in workflow steps that cover the most common LLM patterns:
- **`plan()`** — Decomposes complex goals into a structured execution plan.
- **`retrieve()`** — High-precision vector retrieval with reranking.
- **`generate()`** — Runs LLM inference with the accumulated context injected.
- **`verify()`** — Evaluates hallucinations and grounding via LLM-as-judge.
- **`improve()`** — Corrects failures automatically based on verification feedback.
- **`custom()`** — Injects domain-specific business logic into the pipeline.
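Conceptually, each step is an async function from context to context, and the workflow threads the shared context through the steps in order. The following self-contained sketch illustrates that chaining model; all names here (`Ctx`, `runSteps`, the toy steps) are illustrative stand-ins, not part of the Orka API:

```typescript
// Minimal sketch of sequential step chaining: each step receives the
// shared context, transforms it, and passes it to the next step.
type Ctx = { input: string; output: string };
type Step = (ctx: Ctx) => Promise<Ctx>;

async function runSteps(steps: Step[], input: string): Promise<Ctx> {
  let ctx: Ctx = { input, output: '' };
  for (const step of steps) {
    ctx = await step(ctx); // each step sees the previous step's result
  }
  return ctx;
}

// Two toy steps standing in for retrieve() and generate()
const fakeRetrieve: Step = async (ctx) => ({ ...ctx, output: `docs for "${ctx.input}"` });
const fakeGenerate: Step = async (ctx) => ({ ...ctx, output: `answer based on ${ctx.output}` });

runSteps([fakeRetrieve, fakeGenerate], 'reset password').then((ctx) => {
  console.log(ctx.output); // answer based on docs for "reset password"
});
```

This is the essential contract behind both built-in and custom steps: take the context, transform it, return it.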
# Step Details
plan()
Analyzes the input and creates an action plan. Useful for complex queries that require multiple steps.
```typescript
plan()
// Output:
// "1. Search for password reset documentation
//  2. Extract relevant steps
//  3. Format as user-friendly instructions"
```

retrieve(name, options)
Performs semantic search in a knowledge base and adds the results to the context.
```typescript
retrieve('documentation', {
  topK: 5,                        // Number of results to retrieve
  minScore: 0.7,                  // Minimum similarity score
  filter: { category: 'guides' }, // Metadata filter
})
```

generate(options)
Generates a response using the LLM with the current context (input + retrieved documents).
```typescript
generate({
  temperature: 0.7,                             // Creativity level (0-1)
  maxTokens: 1000,                              // Maximum response length
  systemPrompt: 'You are a helpful assistant.', // Custom system prompt
})
```

verify(options)
Evaluates the generated output against a list of criteria using LLM-as-judge, and sets `ctx.verified` to `true` or `false`.
```typescript
verify({
  criteria: [
    'relevant',          // Is the answer relevant to the question?
    'no hallucination',  // Is the answer grounded in the context?
    'complete',          // Does it fully answer the question?
    'professional tone', // Is the tone appropriate?
  ],
})
```

improve(options)
If `verify()` fails, attempts to fix the reported issues, running up to `maxIterations` times.
```typescript
improve({
  maxIterations: 2, // Maximum improvement attempts
  // The verification feedback is used automatically to improve the output
})
```

# Custom Steps
Create custom steps for any logic that isn't covered by built-in steps. Custom steps receive the workflow context and must return it.
```typescript
import { custom } from '@orka-js/workflow';

// Custom step to translate the output
const translateStep = custom('translate', async (ctx) => {
  const result = await ctx.llm.generate(
    `Translate to French: ${ctx.output}`,
    { temperature: 0.3 }
  );

  ctx.output = result.content;
  ctx.metadata.translatedTo = 'fr';

  // Record step in history
  ctx.history.push({
    stepName: 'translate',
    output: result.content,
    latencyMs: result.latencyMs,
    tokens: result.usage?.totalTokens,
  });

  return ctx;
});

// Use in a workflow
const workflow = orka.workflow({
  name: 'multilingual-support',
  steps: [
    retrieve('docs'),
    generate(),
    translateStep, // Your custom step
  ],
});
```

# Workflow Context (ctx)
Every step receives and returns a single shared context object that persists state across the pipeline:

| Field | Type | Description |
|---|---|---|
| `ctx.input` | `string` | The original input query (immutable) |
| `ctx.output` | `string` | The current output, updated by each step |
| `ctx.retrievedDocs` | `ContextDocument[]` | Documents retrieved for grounding |
| `ctx.llm` | `LLMAdapter` | Unified interface to the configured model |
| `ctx.metadata` | `Record<string, any>` | Persistent store for custom values |

In addition, `verify()` sets `ctx.verified`, and steps can append their results to `ctx.history` (as shown in the custom step example above).
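As a rough illustration of how a step interacts with this shared state, here is a self-contained sketch; the `WorkflowContext` and `ContextDocument` shapes below are simplified stand-ins for the library's actual types:

```typescript
// Simplified stand-ins for the documented context fields
interface ContextDocument { content: string; score: number }
interface WorkflowContext {
  input: string;                    // immutable original query
  output: string;                   // current step output
  retrievedDocs: ContextDocument[]; // grounding documents
  metadata: Record<string, any>;    // persistent custom store
}

// A custom-style step can read any field and extend metadata freely;
// values written here remain visible to all later steps.
const tagStep = async (ctx: WorkflowContext): Promise<WorkflowContext> => {
  ctx.metadata.language = 'en';
  return ctx;
};

const ctx: WorkflowContext = { input: 'hi', output: '', retrievedDocs: [], metadata: {} };
tagStep(ctx).then((c) => console.log(c.metadata.language)); // en
```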
# Workflow Result
The `workflow.run()` method returns a `WorkflowResult` object with the final output and detailed execution information:
| Field | Type | Description |
|---|---|---|
| `output` | `string` | The synthesized final answer |
| `steps` | `WorkflowStepResult[]` | Full execution trace (inputs, outputs, logs) |
| `totalLatencyMs` | `number` | End-to-end execution time in milliseconds |
| `totalTokens` | `number` | Cumulative token consumption across steps |
| `verified` | `boolean` | Whether the output passed verification |
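The aggregate fields are cumulative over the per-step trace. A simplified sketch of that relationship — the `WorkflowStepResult` shape and `totals` helper here are illustrative, not the library's implementation:

```typescript
// Illustrative per-step result shape (simplified)
interface WorkflowStepResult { stepName: string; latencyMs: number; tokens: number }

// Aggregate metrics are sums over the execution trace
function totals(steps: WorkflowStepResult[]) {
  return {
    totalLatencyMs: steps.reduce((sum, s) => sum + s.latencyMs, 0),
    totalTokens: steps.reduce((sum, s) => sum + s.tokens, 0),
  };
}

const trace: WorkflowStepResult[] = [
  { stepName: 'retrieve', latencyMs: 120, tokens: 0 },
  { stepName: 'generate', latencyMs: 900, tokens: 450 },
  { stepName: 'verify', latencyMs: 300, tokens: 200 },
];
console.log(totals(trace)); // { totalLatencyMs: 1320, totalTokens: 650 }
```

This is useful to keep in mind when budgeting latency or cost: every extra step (including `verify()` and `improve()` retries) adds to both totals.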
# Complete Example
```typescript
import { createOrka } from '@orka-js/core';
import { OpenAIAdapter } from '@orka-js/openai';
import { PineconeAdapter } from '@orka-js/pinecone';
import { plan, retrieve, generate, verify, improve, custom } from '@orka-js/workflow';

const orka = createOrka({
  llm: new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY! }),
  vectorDB: new PineconeAdapter({ apiKey: process.env.PINECONE_API_KEY! }),
});

// Custom logging step
const logStep = custom('log', async (ctx) => {
  console.log('[Workflow] Current output length:', ctx.output.length);
  return ctx;
});

const supportWorkflow = orka.workflow({
  name: 'customer-support',
  steps: [
    plan(),
    retrieve('support-docs', { topK: 5 }),
    generate({ temperature: 0.5, systemPrompt: 'You are a helpful support agent.' }),
    logStep,
    verify({ criteria: ['helpful', 'accurate', 'professional'] }),
    improve({ maxIterations: 1 }),
  ],
});

async function handleSupportQuery(query: string) {
  const result = await supportWorkflow.run(query);

  console.log('Answer:', result.output);
  console.log('Verified:', result.verified);
  console.log('Steps:', result.steps.map(s => s.stepName));
  console.log('Total time:', result.totalLatencyMs, 'ms');
  console.log('Total tokens:', result.totalTokens);

  return result.output;
}

await handleSupportQuery('How do I cancel my subscription?');
```

# Tree-shaking Imports
```typescript
// ✅ Import only what you need
import { plan, retrieve, generate, verify, improve, custom } from '@orka-js/workflow';

// ✅ Or import from the main package
import { plan, retrieve, generate } from 'orkajs';
```