# Non-LLM Participants
Beach's session manager invokes a Participant once per turn-advancement step. LLMs are one Participant kind; deterministic logic is another. The framework treats them identically — same lifecycle, same `RespondCall` output, same observability hooks. Choosing one over the other is a question of what kind of work this turn-step does, not a question of how Beach wraps it.
This guide is the wiring story for non-LLM Participants. For the broader question of when to advance a turn (vs. handle an event without advancing), see Adding a Participant.
## When `HandlerParticipant` is the right shape
`HandlerParticipant` is the canonical wrapper for deterministic non-LLM turn-advancers. Use it when the work this turn-step does is:
- Triage classification. Inbound shape (length, language, predicate-match, regex) decides which downstream specialist runs next. The decision is mechanical; an LLM would just paraphrase the rule.
- Rules-engine evaluation. Pre-declared rules over the mailbox state produce a typed decision. The rules engine is the load-bearing logic; the LLM would only restate its output.
- Database-backed responses. A FAQ lookup, account-state query, or knowledge-base hit answers the user directly without LLM paraphrasing. (LLMs are still useful for composing the response; the lookup itself is deterministic.)
- Queue runners. A queue runner takes an inbound request, decides which specialists to dispatch, and emits a `RespondCall` declaring the turn-state shape. No LLM call at this layer.
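The rules-engine case above can be sketched as plain data plus a pure evaluator. The `Rule` shape, `TriageDecision` values, and `evaluateRules` name here are illustrative, not Beach types:

```typescript
// Illustrative rule shape: a predicate over inbound text plus a typed decision.
type Rule<D extends string> = {
  when: (text: string) => boolean;
  decision: D;
};

type TriageDecision = 'too-short' | 'needs-human' | 'route-to-specialist';

// Pre-declared rules; first match wins.
const rules: Rule<TriageDecision>[] = [
  { when: (t) => t.trim().length < 20, decision: 'too-short' },
  { when: (t) => /refund|complaint/i.test(t), decision: 'needs-human' },
];

function evaluateRules(text: string): TriageDecision {
  return rules.find((r) => r.when(text))?.decision ?? 'route-to-specialist';
}
```

Because the evaluator is a pure function, it can be unit-tested with no session machinery at all; the wrapping Participant only adapts its output into a `RespondCall`.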
## When `HandlerParticipant` is the wrong shape
- Event-router handlers that consume and emit events without advancing a turn. The Participant axis and the EventRouter axis are orthogonal — the same logical unit can register on both with two registrations. Use a router handler when you're routing or transforming events; use a `HandlerParticipant` when you're advancing a turn.
- Free-text generation, multi-step reasoning, paraphrasing. Use `LLMActor` — that's the LLM's strength.
- Anything that needs the LLM's tool-loop machinery. `HandlerParticipant` doesn't run a tool loop; it produces one `RespondCall` per invocation. For multi-step LLM tool-calling, `LLMActor` is the right wrapper.
## Three steps to ship a `HandlerParticipant`
1. Define the handler function. It takes `ParticipantInvokeOptions` and returns either a bare `RespondCall` (the common case — no mailbox change) or a full `ParticipantInvokeResult` (when you want to append a synthetic decision-message to the mailbox thread for audit replay).
2. Wrap it in a `HandlerParticipant`. One line; pick a stable `id` for telemetry/audit.
3. Pass it to `runParticipantTurn`. The session manager's lifecycle (heartbeat, timeout, cancellation, manifest cleanup) wraps the call identically to `LLMActor`.
```typescript
import { HandlerParticipant, SessionTurnManager } from '@cool-ai/beach-session';

// 1. Define the handler.
const triage = new HandlerParticipant({
  id: 'task-triage',
  async handle(opts) {
    const last = opts.messages[opts.messages.length - 1];
    const text = typeof last?.content === 'string' ? last.content : '';
    const decision = text.length < 20 ? 'too-short' : 'route-to-specialist';
    return {
      parts: [{ partType: 'response', text: `Triage: ${decision}` }],
      turnState: 'complete',
    };
  },
});

// 2. Drive it through the session manager.
const result = await sessionManager.runParticipantTurn({
  participant: triage,
  sessionId, slotKey: 'triage', inboundMessage,
});
```
## Worked example: a classifier with three downstream specialists
```typescript
import { HandlerParticipant, type Participant } from '@cool-ai/beach-session';

type Decision = 'flights' | 'hotels' | 'general';

function classify(text: string): Decision {
  const lower = text.toLowerCase();
  if (lower.includes('flight')) return 'flights';
  if (lower.includes('hotel') || lower.includes('booking')) return 'hotels';
  return 'general';
}

const triage: Participant = new HandlerParticipant({
  id: 'triage-classifier',
  async handle(opts) {
    const last = opts.messages[opts.messages.length - 1];
    const text = typeof last?.content === 'string' ? last.content : '';
    const decision = classify(text);
    return {
      parts: [{ partType: 'response', text: `Triage: route to ${decision}-specialist` }],
      turnState: 'complete',
    };
  },
});
```
The decision is downstream-routable: a routing rule on the assistant's `reply_ready` event can match against the `parts[0].text` shape (or, more robustly, the `partType` and `data` fields if you emit a typed-decision part) and dispatch the correct specialist. See Adding a Participant § "Adding an actor" for the full pattern with classified routing rules.
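A minimal sketch of such a routing predicate, assuming a hypothetical typed-decision part shape (`partType: 'decision'` with a `data.route` field — not a confirmed Beach type), with a fallback that parses the text shape from the worked example:

```typescript
// Hypothetical part shapes for illustration; Beach's real part types may differ.
type Part =
  | { partType: 'response'; text: string }
  | { partType: 'decision'; data: { route: string } };

// Prefer the typed part; fall back to parsing the known text shape.
function extractRoute(parts: Part[]): string | undefined {
  for (const p of parts) {
    if (p.partType === 'decision') return p.data.route;
  }
  const first = parts[0];
  if (first?.partType === 'response') {
    return /^Triage: route to (\S+)-specialist$/.exec(first.text)?.[1];
  }
  return undefined;
}
```

The typed part is the more robust branch: it survives copy changes to the response text.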
## Multi-step deterministic work with `inject()`
A `HandlerParticipant` can return `turnState: 'awaiting'` to wait for a specialist's result, identical to how an `LLMActor` awaits a background tool call. The session manager's `inject()` method drives the second invocation when the result arrives:
```typescript
const queueRunner = new HandlerParticipant({
  id: 'research-queue',
  async handle(opts) {
    const last = opts.messages[opts.messages.length - 1];
    const text = typeof last?.content === 'string' ? last.content : '';
    if (text.startsWith('[research-result]')) {
      // Second invocation — the specialist returned. Compose the final reply.
      return {
        parts: [{ partType: 'response', text: `Researched: ${text}` }],
        turnState: 'complete',
      };
    }
    // First invocation — dispatch the specialist + wait.
    void dispatchResearchSpecialist(text); // your async dispatch
    return {
      parts: [{ partType: 'response', text: 'Researching…' }],
      turnState: 'awaiting',
    };
  },
});

const turn = await sessionManager.runParticipantTurn({
  participant: queueRunner,
  sessionId, slotKey: 'research', inboundMessage,
  turnId: 'research-1',
});
// turn.turnState === 'awaiting'

// Specialist returns later:
await sessionManager.inject({
  turnId: 'research-1',
  message: { role: 'user', content: '[research-result] 3 flights found' },
});
// queueRunner.handle is invoked again with the inject message appended;
// returns turnState: 'complete' this time.
```
This is the canonical pattern for queue-runner migrations. Earlier versions of Beach made queue runners drive their own inject-flow turn-advancement manually, outside the session manager. With `HandlerParticipant`, the runner is constructed once and passed to `runParticipantTurn`; the manager handles heartbeat, timeout, and manifest cleanup uniformly.
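The two-phase branch in the handler above is easy to unit-test once the sentinel check is pulled into a pure helper. A sketch using the same `[research-result]` prefix (stripping the prefix from the final reply is a small variation on the example):

```typescript
const SENTINEL = '[research-result]';

type Step = { turnState: 'awaiting' | 'complete'; text: string };

// Pure two-phase decision: first invocation awaits, second completes.
function step(lastMessage: string): Step {
  if (lastMessage.startsWith(SENTINEL)) {
    const payload = lastMessage.slice(SENTINEL.length).trim();
    return { turnState: 'complete', text: `Researched: ${payload}` };
  }
  return { turnState: 'awaiting', text: 'Researching…' };
}
```

The handler then reduces to calling `step(text)` and wrapping the result in a `parts` array, so the state logic can be exercised without a session manager.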
## Cooperative cancellation
Handlers doing I/O (a database lookup, a cache fetch, a peer-call) MUST wire `opts.signal` into their I/O so `cancelTurn` propagates cleanly. The canonical pattern matches what `fetch()` and `callActor()` do internally:
```typescript
const dbHandler = new HandlerParticipant({
  id: 'db-lookup',
  async handle(opts) {
    const result = await db.query('SELECT …', { signal: opts.signal });
    return {
      parts: [{ partType: 'response', text: result.summary }],
      turnState: 'complete',
    };
  },
});
```
Handlers without I/O can ignore `opts.signal` — the work is synchronous from Beach's perspective and finishes before cancellation can interrupt.
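For I/O clients that don't accept a signal natively, one option is a small wrapper that rejects as soon as the signal aborts. This is a generic sketch, not a Beach API:

```typescript
// Race a promise against an AbortSignal so handler I/O stops promptly
// when the session manager cancels the turn.
function abortable<T>(work: Promise<T>, signal?: AbortSignal): Promise<T> {
  if (!signal) return work;
  if (signal.aborted) return Promise.reject(new Error('aborted'));
  return new Promise<T>((resolve, reject) => {
    const onAbort = () => reject(new Error('aborted'));
    signal.addEventListener('abort', onAbort, { once: true });
    work.then(
      (value) => { signal.removeEventListener('abort', onAbort); resolve(value); },
      (err) => { signal.removeEventListener('abort', onAbort); reject(err); },
    );
  });
}
```

Inside a handler this wraps any signal-unaware call, e.g. `await abortable(legacyClient.lookup(key), opts.signal)` (where `legacyClient` is a placeholder for your own client). Note the underlying work keeps running; the wrapper only unblocks the turn.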
## What `HandlerParticipant` does NOT change
- Routing rules. The post-turn `assistant:reply_ready` event still flows through the router exactly as it does for `LLMActor`. A `HandlerParticipant` just produces the `RespondCall`; the routing layer doesn't know or care which Participant kind generated it.
- Filtering. `FilterAndDistribute` doesn't distinguish between `LLMActor` output and `HandlerParticipant` output — both are `RespondCall`-shaped.
- Telemetry. `onTurnStarted` / `onTurnSettled` / `onTurnTimeout` / `onTurnCancelled` fire identically; `actorId` is `Participant.id` regardless of kind.
- Authorisation. `Principal` propagation reaches `HandlerParticipant.handle()` via `opts`, indirectly: the session manager's `HandlerContext` carries the principal through router-dispatched events, and within a Participant the principal is on the triggering event the orchestrator was responding to. This pairs cleanly with the forthcoming `requiresCapability` checks on the Authoriser interface.
- Durability. `DurableExecutor` checkpoints capture the same `started → llm-complete → settled` lifecycle for a `HandlerParticipant` turn as for an `LLMActor` turn. The phase name `llm-complete` is a misnomer for non-LLM Participants, but the contract is "one iteration of the Participant's `invoke` completed", independent of LLM involvement. The phase taxonomy is open (`CheckpointPhaseRegistry`); consumers wanting a custom phase like `handler-complete` can register one at startup.
## Migration from EventRouter handler → `HandlerParticipant`
If you have a working EventRouter handler today that drives turn-advancement manually, migrating to `HandlerParticipant` is mechanical:
Before — handler drives turn manually:
```typescript
router.register('legacy-queue-runner', async (event, ctx) => {
  const result = await sessionManager.runTurn({
    sessionId, actorId: 'queue-runner', /* … LLM-specific fields … */
  });
  // … manual inject / re-invoke logic …
});
```
After — the handler constructs a Participant and delegates the lifecycle to the session manager:
```typescript
const queueRunner = new HandlerParticipant({
  id: 'queue-runner',
  async handle(opts) {
    // Your turn-advancement logic, returning a RespondCall.
    return { parts: […], turnState: '…' };
  },
});

router.register('queue-runner-handler', async (event, ctx) => {
  await sessionManager.runParticipantTurn({
    participant: queueRunner,
    sessionId, slotKey: 'queue', inboundMessage: event.data.message,
  });
});
```
The handler still exists (it's the orchestration entry-point Beach's router invokes). What's new is that the turn-advancement logic now lives inside a Participant rather than being driven manually. The session manager handles heartbeat, timeout, cancellation signal cascade, manifest cleanup, and observability hooks for you.
## Related
- Adding a Participant — colloquial-sense participant guide (handlers + actors).
- `@cool-ai/beach-session` README — Participants section, `runParticipantTurn` API.
- Actors vs Handlers — when to choose which.