
core-chat-service

Lightweight chat service implementing a multi-agent system with tool calling using Workers AI. Provides chat session management, message history via D1, and an agentic loop that can search products, query interactions, and retrieve patterns. Also exposes A2A (Agent-to-Agent) protocol endpoints and an agent card.

Worker name: crow-core-chat-service
Domain (prod): internal.chat.crowai.dev
Domain (dev): dev.internal.chat.crowai.dev

Schema

chat_session

| Column | Type | Notes |
| --- | --- | --- |
| id | text | PK |
| organization_id | text | |
| user_id | text | |
| created_at | integer | epoch ms |
| updated_at | integer | epoch ms |

chat_message

| Column | Type | Notes |
| --- | --- | --- |
| id | text | PK |
| session_id | text | FK, references chat_session.id |
| role | text | user, assistant, tool |
| content | text | |
| created_at | integer | epoch ms |
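The two tables above map to a D1 migration along these lines (a sketch: the NOT NULL and CHECK constraints are assumptions, only the column names and types come from the schema tables):

```sql
CREATE TABLE IF NOT EXISTS chat_session (
  id TEXT PRIMARY KEY,
  organization_id TEXT NOT NULL,
  user_id TEXT NOT NULL,
  created_at INTEGER NOT NULL,  -- epoch ms
  updated_at INTEGER NOT NULL   -- epoch ms
);

CREATE TABLE IF NOT EXISTS chat_message (
  id TEXT PRIMARY KEY,
  session_id TEXT NOT NULL REFERENCES chat_session(id),
  role TEXT NOT NULL CHECK (role IN ('user', 'assistant', 'tool')),
  content TEXT NOT NULL,
  created_at INTEGER NOT NULL   -- epoch ms
);
```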

Routes

| Method | Path | Description |
| --- | --- | --- |
| POST | /api/v1/chat/sessions | Create a new chat session |
| POST | /api/v1/chat/sessions/{sessionId}/messages | Send a message (triggers agentic loop) |
| GET | /api/v1/chat/sessions/{sessionId}/messages | Get message history for a session |
| GET | /api/v1/chat/sessions/organization/{orgId} | List sessions for an org |
| DELETE | /api/v1/chat/sessions/{sessionId} | Delete a session and its messages |
| GET | /.well-known/agent.json | A2A agent card |
| POST | /a2a | A2A protocol endpoint |
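A caller (e.g. the gateway) might build the send-message request like this. This is a sketch: the header names (`X-Internal-Key`, `X-Organization-Id`) and the `{ content }` body shape are assumptions inferred from the Key Behaviors section, not confirmed wire formats.

```typescript
// Builds a request for POST /api/v1/chat/sessions/{sessionId}/messages.
interface SendMessageRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

function buildSendMessage(
  baseUrl: string,
  sessionId: string,
  orgId: string,
  internalKey: string,
  content: string,
): SendMessageRequest {
  return {
    url: `${baseUrl}/api/v1/chat/sessions/${sessionId}/messages`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "X-Internal-Key": internalKey, // assumed header name for the gateway key
        "X-Organization-Id": orgId,    // checked against the session's organization_id
      },
      body: JSON.stringify({ content }), // assumed body shape
    },
  };
}
```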

Agentic Loop

The service runs an agentic loop (max 5 iterations) using @cf/meta/llama-3.3-70b-instruct-fp8-fast with three tools:

| Tool | Description | Calls |
| --- | --- | --- |
| search_products | Semantic product search | Gateway /api/v1/products/search |
| get_interactions | Query interactions by org | Gateway /api/v1/interactions/organization/{orgId} |
| get_patterns | Retrieve pattern analysis | Gateway /api/v1/patterns/organization/{orgId} |
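The tool definitions handed to the model each iteration might look like the following sketch. Only the tool names and target routes come from the table above; the descriptions and parameter schemas are assumptions.

```typescript
// Tool definitions in a JSON-schema style commonly used for
// LLM function calling (exact wire format not confirmed here).
const tools = [
  {
    name: "search_products",
    description: "Semantic product search via the gateway",
    parameters: {
      type: "object",
      properties: { query: { type: "string", description: "Search text" } },
      required: ["query"],
    },
  },
  {
    name: "get_interactions",
    description: "Query interactions for an organization",
    parameters: {
      type: "object",
      properties: { orgId: { type: "string" } },
      required: ["orgId"],
    },
  },
  {
    name: "get_patterns",
    description: "Retrieve pattern analysis for an organization",
    parameters: {
      type: "object",
      properties: { orgId: { type: "string" } },
      required: ["orgId"],
    },
  },
];
```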

Each iteration sends the conversation plus the tool definitions to the LLM, checks whether the LLM wants to call a tool, executes the tool, and feeds the result back into the conversation. The loop terminates when the LLM produces a final response without tool calls, or after 5 iterations.
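The control flow above can be sketched as follows. The model-call and tool-execution functions are injected here so the loop itself stays visible; the actual service would call `env.AI.run(...)` with the model named above and hit the gateway routes. The message and result shapes are simplified assumptions.

```typescript
type ToolCall = { name: string; arguments: Record<string, unknown> };
type ModelResult = { response: string; tool_calls?: ToolCall[] };
type Message = { role: "user" | "assistant" | "tool"; content: string };

const MAX_ITERATIONS = 5;

async function runAgenticLoop(
  messages: Message[],
  callModel: (messages: Message[]) => Promise<ModelResult>,
  executeTool: (call: ToolCall) => Promise<string>,
): Promise<string> {
  for (let i = 0; i < MAX_ITERATIONS; i++) {
    const result = await callModel(messages);
    // No tool calls: the model produced its final answer.
    if (!result.tool_calls || result.tool_calls.length === 0) {
      return result.response;
    }
    // Execute each requested tool and feed the result back.
    for (const call of result.tool_calls) {
      const output = await executeTool(call);
      messages.push({ role: "tool", content: output });
    }
  }
  // Iteration cap reached without a final answer (fallback assumed).
  return "Max iterations reached";
}
```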

Environment Variables

| Variable | Example |
| --- | --- |
| ENVIRONMENT | dev |
| API_GATEWAY_URL | https://dev.api.crowai.dev |

Secrets

| Secret | Purpose |
| --- | --- |
| INTERNAL_GATEWAY_KEY | Gateway trust validation |

Bindings

| Binding | Type | Name / Purpose |
| --- | --- | --- |
| DB | D1 | crow-core-chat-service-db |
| AI | Workers AI | LLM inference (@cf/meta/llama-3.3-70b-instruct-fp8-fast) |
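A sketch of the corresponding wrangler configuration for these bindings (the database_id is a placeholder; the stanza layout follows standard Workers conventions and is not taken from this service's actual config):

```toml
name = "crow-core-chat-service"

# Workers AI binding, exposed to the worker as env.AI
[ai]
binding = "AI"

# D1 binding, exposed as env.DB
[[d1_databases]]
binding = "DB"
database_name = "crow-core-chat-service-db"
database_id = "<your-database-id>"
```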

Dependencies

  • Inbound: gateway (dashboard chat interface)
  • Outbound: gateway API (products, interactions, patterns via tool calls)

Key Behaviors

  • INTERNAL_GATEWAY_KEY guard: All /api/v1/* routes require the shared internal key
  • BOLA guard (Broken Object Level Authorization): Session operations check the X-Organization-Id header against the session's organization_id
  • A2A protocol: Implements the Agent-to-Agent protocol for inter-agent communication
  • Tool calling: LLM decides which tools to invoke based on the user's question context
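The BOLA check can be sketched as a pure guard function. This is an illustration, not the service's actual code; the session shape mirrors the chat_session table and the header name comes from the bullet above.

```typescript
interface ChatSession {
  id: string;
  organization_id: string;
  user_id: string;
}

// Returns true only when the caller's X-Organization-Id header value
// matches the organization that owns the session.
function authorizeSessionAccess(
  session: ChatSession,
  requestOrgId: string | null, // value of the X-Organization-Id header
): boolean {
  return requestOrgId !== null && requestOrgId === session.organization_id;
}
```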