Workflows

A workflow is a reusable process that combines an LLM, OMD data, and a delivery channel to bring useful information directly to your workforce. Workflows can be scheduled, event-triggered, or started manually through the HTTP API.


Common parameters

Every workflow shares the following parameters when triggered via the API:

| Parameter | Type | Description |
| --- | --- | --- |
| `workflow_id` | string | ID of the workflow to start (see pages below) |
| `user_ids` | array | List of user IDs to deliver the workflow output to |
| `channel` | string | Delivery channel: `matrix`, `teams`, `email`, `whatsapp`, `api` |
| `config_id` | string | OMD configuration ID (identifies the OMD instance) |
| `instance` | string | OMD instance name (e.g. `www`, `sandbox`) |
| `language` | string | Language for the initial messages (e.g. `en`, `de`) |
| `model` | string | (Optional) LLM to use, in `provider:model:reasoning` format, e.g. `openai:gpt-4.1:none` |
| `new_room` | boolean | If `true`, always start a new conversation room/thread; if `false`, reuse an existing one |
| `workflow_data` | object | Workflow-specific input data (see the individual workflow pages) |
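The common parameters above form the body of a trigger request. A minimal sketch of such a payload, assuming JSON over HTTP; the concrete values (user IDs, config ID) are illustrative, not real identifiers:

```python
import json

# Illustrative trigger payload; every value below is an assumption except
# the field names, which are the documented common parameters.
payload = {
    "workflow_id": "weekly_update",
    "user_ids": ["user-123", "user-456"],
    "channel": "matrix",
    "config_id": "acme-prod",
    "instance": "www",
    "language": "en",
    "model": "openai:gpt-4.1:none",  # optional; provider:model:reasoning
    "new_room": True,                # always open a fresh room/thread
    "workflow_data": {},             # workflow-specific, see the pages below
}

print(json.dumps(payload, indent=2))
```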

Available workflows

| Workflow | ID | Description |
| --- | --- | --- |
| Weekly Update | `weekly_update` | Trip summary for an upcoming week, delivered to a mobile worker |
| Start of Day Summary | `sod_summary` | Daily briefing for the upcoming workday |
| End of Day Summary | `eod_summary` | Summary of the completed workday, including notes |
| Territory Summary | `territory_summary` | Summary of all trips done in a territory on a given day |
| Document Search | `document_search` | Interactive semantic search through the Document Hub |
| Documentation Search | `documentation_search` | Search through docs.optimizemyday.com |
| SQL Bot | `sql_bot` | Natural-language interface to the OMD Powerhouse StarRocks database |
| Default Chat | `default` | General-purpose chat without domain-specific tools |
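Since an unknown `workflow_id` can only fail server-side, a client may want to validate it against the table above before sending a request. A small sketch; the helper name and the idea of client-side validation are assumptions, only the IDs themselves come from the table:

```python
# Workflow IDs from the table above, kept as a set for O(1) lookup.
KNOWN_WORKFLOW_IDS = {
    "weekly_update", "sod_summary", "eod_summary", "territory_summary",
    "document_search", "documentation_search", "sql_bot", "default",
}

def validate_workflow_id(workflow_id: str) -> str:
    """Fail fast on unknown IDs instead of waiting for an API error."""
    if workflow_id not in KNOWN_WORKFLOW_IDS:
        raise ValueError(f"unknown workflow_id: {workflow_id!r}")
    return workflow_id
```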

How workflows are delivered

When a workflow is triggered, the agent:

  1. Creates or reuses a conversation room/thread on the target channel.
  2. Invites the specified users to the room (if required by the channel).
  3. Assembles the system prompt and injects the workflow's initial context.
  4. Calls the LLM, executes any tool calls, and iterates until a final response is ready.
  5. Delivers the final message(s) to the room. Intermediate tool messages are suppressed on channels like Email where they would create noise.
  6. Keeps the conversation open for follow-up questions from the user (reactive mode).
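Step 5's channel-dependent filtering can be sketched as a simple predicate over the message list. This is a minimal illustration, not the agent's actual implementation; the message shape (`role`/`content` dicts) and the set of quiet channels are assumptions based on the behavior described above:

```python
# Channels where intermediate tool messages would create noise (the docs
# name Email; other channels could be added to this set).
QUIET_CHANNELS = {"email"}

def messages_to_deliver(messages: list[dict], channel: str) -> list[dict]:
    """Return the messages that should reach the room on this channel.

    On quiet channels, intermediate tool messages are suppressed and only
    the assistant's output is delivered; chat-style channels get everything.
    """
    if channel in QUIET_CHANNELS:
        return [m for m in messages if m["role"] != "tool"]
    return messages
```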