OrkaJS

# Multi-Step Workflows

Chain LLM operations into readable, testable pipelines with OrkaJS workflows.

# Why Workflows?

Workflows let you chain multiple LLM operations into a single, readable pipeline. Each step transforms the context and passes it to the next. Built-in steps handle common patterns like planning, retrieval, generation, and verification — but you can also create custom steps for any logic.

```typescript
import { createOrka } from '@orka-js/core';
import { OpenAIAdapter } from '@orka-js/openai';
import { plan, retrieve, generate, verify, improve } from '@orka-js/workflow';

const orka = createOrka({
  llm: new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY! }),
  vectorDB: myVectorDB,
});

const workflow = orka.workflow({
  name: 'support-response',
  steps: [
    plan(),                                                 // Analyze input and create an action plan
    retrieve('knowledge-base', { topK: 5 }),                // Semantic search in the knowledge base
    generate({ temperature: 0.7 }),                         // Generate a response using the LLM
    verify({ criteria: ['relevant', 'no hallucination'] }), // Evaluate quality
    improve({ maxIterations: 1 }),                          // Fix issues if verify fails
  ],
});

const result = await workflow.run('How do I reset my password?');

console.log(result.output);         // Final response
console.log(result.steps);          // Results from each step
console.log(result.totalLatencyMs); // Total execution time
console.log(result.totalTokens);    // Total tokens consumed
```

# Built-in Steps

Orka provides six built-in workflow steps that cover the most common LLM patterns:

- `plan()` (Cognitive Strategy): decomposes complex goals into a structured execution roadmap.
- `retrieve()` (Knowledge Access): high-precision vector retrieval with advanced reranking.
- `generate()` (Creative Synthesis): orchestrates LLM inference with dynamic context injection.
- `verify()` (Quality Assurance): evaluates hallucinations and grounding via LLM-as-judge.
- `improve()` (Iterative Refinement): corrects failures automatically based on verification feedback.
- `custom()` (Logic Extensibility): injects domain-specific business logic into the pipeline.

# Step Details

plan()

Analyzes the input and creates an action plan. Useful for complex queries that require multiple steps.

```typescript
plan()
// Example plan output:
// "1. Search for password reset documentation
//  2. Extract the relevant steps
//  3. Format as user-friendly instructions"
```

retrieve(name, options)

Performs semantic search in a knowledge base and adds the results to the context.

```typescript
retrieve('documentation', {
  topK: 5,                       // Number of results to retrieve
  minScore: 0.7,                 // Minimum similarity score
  filter: { category: 'guides' } // Metadata filter
})
```
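Conceptually, the `topK`, `minScore`, and `filter` options compose like the following toy in-memory ranker. This is a sketch only, with hypothetical names (`topKFilter`, `Doc`); the real step queries the configured vector DB:

```typescript
interface Doc { text: string; score: number; meta: Record<string, string> }

// Toy model of retrieve()'s options: keep documents matching the metadata
// filter, drop low-similarity hits, then return the topK best matches.
function topKFilter(docs: Doc[], topK: number, minScore: number, filter: Record<string, string>): Doc[] {
  return docs
    .filter(d => Object.entries(filter).every(([k, v]) => d.meta[k] === v))
    .filter(d => d.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}

const corpus: Doc[] = [
  { text: 'reset guide', score: 0.9, meta: { category: 'guides' } },
  { text: 'old FAQ', score: 0.4, meta: { category: 'guides' } },
  { text: 'API spec', score: 0.95, meta: { category: 'api' } },
];
console.log(topKFilter(corpus, 2, 0.7, { category: 'guides' }).map(d => d.text)); // ['reset guide']
```

Filtering happens before the score cutoff and the `topK` cap, which is why a metadata filter can return fewer than `topK` results.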

generate(options)

Generates a response using the LLM with the current context (input + retrieved documents).

```typescript
generate({
  temperature: 0.7,                            // Creativity level (0-1)
  maxTokens: 1000,                             // Maximum response length
  systemPrompt: 'You are a helpful assistant.' // Custom system prompt
})
```

verify(options)

Evaluates the generated output against the given criteria using LLM-as-judge, then sets `ctx.verified` to `true` or `false`.

```typescript
verify({
  criteria: [
    'relevant',          // Is the answer relevant to the question?
    'no hallucination',  // Is the answer grounded in the context?
    'complete',          // Does it fully answer the question?
    'professional tone'  // Is the tone appropriate?
  ]
})
```
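Because `verify()` leaves its verdict on `ctx.verified`, later steps can branch on it. Here is a minimal sketch with a mock context and a hypothetical fallback step, written synchronously for brevity (the real `custom()` callback is async):

```typescript
// Mock of the context fields this sketch touches; the real context
// object comes from @orka-js/workflow.
interface Ctx {
  output: string;
  verified?: boolean;
  metadata: Record<string, any>;
}

// Hypothetical fallback step: if verification failed, replace the answer
// with a safe escalation message instead of shipping a dubious response.
function fallbackStep(ctx: Ctx): Ctx {
  if (ctx.verified === false) {
    ctx.output = 'I could not verify this answer; routing you to a human agent.';
    ctx.metadata.escalated = true;
  }
  return ctx;
}

const draft: Ctx = { output: 'Unverified draft answer', verified: false, metadata: {} };
fallbackStep(draft);
console.log(draft.metadata.escalated); // true
```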

improve(options)

If `verify()` failed, attempts to fix the reported issues, running up to `maxIterations` times.

```typescript
improve({
  maxIterations: 2, // Maximum improvement attempts
  // Automatically uses the verification feedback to improve the output
})
```
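Conceptually, the `verify()`/`improve()` pair behaves like a bounded retry loop: revise until the judge passes or the iteration budget runs out. A self-contained sketch with a toy judge and reviser (not the library's internals; `refine` is a hypothetical name):

```typescript
type Judge = (output: string) => boolean;
type Reviser = (output: string) => string;

// Bounded refinement loop, mirroring verify() + improve({ maxIterations }):
// re-run the reviser until the judge passes or attempts are exhausted.
function refine(output: string, judge: Judge, revise: Reviser, maxIterations: number) {
  let current = output;
  let verified = judge(current);
  for (let i = 0; i < maxIterations && !verified; i++) {
    current = revise(current);
    verified = judge(current);
  }
  return { output: current, verified };
}

// Toy judge: passes once the draft cites a source.
const judge: Judge = (o) => o.includes('[source]');
const revise: Reviser = (o) => o + ' [source]';
console.log(refine('Draft answer', judge, revise, 2)); // { output: 'Draft answer [source]', verified: true }
```

Note that `verified` can still be `false` after the loop, which is why the workflow result exposes it rather than guaranteeing success.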

# Custom Steps

Create custom steps for any logic not covered by the built-ins. A custom step receives the workflow context and must return it.

```typescript
import { custom } from '@orka-js/workflow';

// Custom step to translate the output
const translateStep = custom('translate', async (ctx) => {
  const result = await ctx.llm.generate(
    `Translate to French: ${ctx.output}`,
    { temperature: 0.3 }
  );

  ctx.output = result.content;
  ctx.metadata.translatedTo = 'fr';

  // Record the step in the history
  ctx.history.push({
    stepName: 'translate',
    output: result.content,
    latencyMs: result.latencyMs,
    tokens: result.usage?.totalTokens,
  });

  return ctx;
});

// Use in a workflow
const workflow = orka.workflow({
  name: 'multilingual-support',
  steps: [
    retrieve('docs'),
    generate(),
    translateStep, // Your custom step
  ],
});
```

# Workflow Context (ctx)

Every step receives the same shared context object, which persists state across the pipeline:

| Property | Type | Description |
| --- | --- | --- |
| `ctx.input` | `string` | Original query (immutable) |
| `ctx.output` | `string` | Current output, updated by each step |
| `ctx.retrievedDocs` | `Document[]` | Documents retrieved for grounding |
| `ctx.llm` | `LLMAdapter` | Unified model interface |
| `ctx.metadata` | `Record<string, any>` | Persistent custom store |
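To make the context concrete, here is a toy custom-step-style function against a mock of this shape. Field names follow the table, but the interfaces (`Doc`, `Ctx`) and the step itself are illustrative only, and `ctx.llm` is omitted:

```typescript
// Mock of the workflow context fields this sketch touches.
interface Doc { text: string; score?: number }
interface Ctx {
  input: string;
  output: string;
  retrievedDocs: Doc[];
  metadata: Record<string, any>;
}

// Hypothetical guardrail step: drop low-score documents before generation
// and record how many were removed, exercising retrievedDocs and metadata.
function filterDocsStep(ctx: Ctx, minScore = 0.5): Ctx {
  const before = ctx.retrievedDocs.length;
  ctx.retrievedDocs = ctx.retrievedDocs.filter(d => (d.score ?? 0) >= minScore);
  ctx.metadata.docsDropped = before - ctx.retrievedDocs.length;
  return ctx;
}

const mockCtx: Ctx = {
  input: 'How do I reset my password?',
  output: '',
  retrievedDocs: [{ text: 'reset guide', score: 0.9 }, { text: 'unrelated', score: 0.2 }],
  metadata: {},
};
filterDocsStep(mockCtx);
console.log(mockCtx.retrievedDocs.length, mockCtx.metadata.docsDropped); // 1 1
```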

# Workflow Result

The `workflow.run()` method returns a `WorkflowResult` object with the final output and detailed execution information:

| Field | Type | Description |
| --- | --- | --- |
| `output` | `string` | The synthesized final answer |
| `steps` | `WorkflowStepResult[]` | Full execution trace (inputs, outputs, logs) |
| `totalLatencyMs` | `number` | End-to-end execution time |
| `totalTokens` | `number` | Cumulative token consumption across steps |
| `verified` | `boolean` | Whether the final output passed verification |
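The `steps` array makes simple post-hoc analysis easy. For example, a per-step latency breakdown over a mock result shaped like the table above (`latencyBreakdown` and the interfaces are hypothetical helpers, not part of the library):

```typescript
// Mock of the result shape described in the table above.
interface StepResult { stepName: string; latencyMs: number; tokens?: number }
interface Result {
  output: string;
  steps: StepResult[];
  totalLatencyMs: number;
  totalTokens: number;
  verified: boolean;
}

// Report each step's share of end-to-end latency as a percentage.
function latencyBreakdown(result: Result): Record<string, string> {
  const report: Record<string, string> = {};
  for (const s of result.steps) {
    report[s.stepName] = ((s.latencyMs / result.totalLatencyMs) * 100).toFixed(1) + '%';
  }
  return report;
}

const mockResult: Result = {
  output: 'done',
  steps: [
    { stepName: 'retrieve', latencyMs: 120 },
    { stepName: 'generate', latencyMs: 880 },
  ],
  totalLatencyMs: 1000,
  totalTokens: 500,
  verified: true,
};
console.log(latencyBreakdown(mockResult)); // { retrieve: '12.0%', generate: '88.0%' }
```

This kind of breakdown is useful for spotting which step dominates latency before tuning `topK`, `maxTokens`, or `maxIterations`.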

# Complete Example

workflow-step-example.ts
```typescript
import { createOrka } from '@orka-js/core';
import { OpenAIAdapter } from '@orka-js/openai';
import { PineconeAdapter } from '@orka-js/pinecone';
import { plan, retrieve, generate, verify, improve, custom } from '@orka-js/workflow';

const orka = createOrka({
  llm: new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY! }),
  vectorDB: new PineconeAdapter({ apiKey: process.env.PINECONE_API_KEY! }),
});

// Custom logging step
const logStep = custom('log', async (ctx) => {
  console.log('[Workflow] Current output length:', ctx.output.length);
  return ctx;
});

const supportWorkflow = orka.workflow({
  name: 'customer-support',
  steps: [
    plan(),
    retrieve('support-docs', { topK: 5 }),
    generate({ temperature: 0.5, systemPrompt: 'You are a helpful support agent.' }),
    logStep,
    verify({ criteria: ['helpful', 'accurate', 'professional'] }),
    improve({ maxIterations: 1 }),
  ],
});

async function handleSupportQuery(query: string) {
  const result = await supportWorkflow.run(query);

  console.log('Answer:', result.output);
  console.log('Verified:', result.verified);
  console.log('Steps:', result.steps.map(s => s.stepName));
  console.log('Total time:', result.totalLatencyMs, 'ms');
  console.log('Total tokens:', result.totalTokens);

  return result.output;
}

await handleSupportQuery('How do I cancel my subscription?');
```

# Tree-shaking Imports

```typescript
// ✅ Import only what you need
import { plan, retrieve, generate, verify, improve, custom } from '@orka-js/workflow';

// ✅ Or import from the main package
import { plan, retrieve, generate } from 'orkajs';
```