# Architecture: System Overview

## High-level overview
The Telegram dialogs subsystem has two main execution paths:
- Command path: WebSocket action -> messaging envelope -> owner runtime -> TDLib service -> live event
- Update path: TDLib update -> update normalizer/runtime service -> cache projection -> live event
## System diagram

```mermaid
flowchart LR
    UI[Browser / WebSocket client] --> WS[TDLibListChatsConsumer]
    WS -->|publish_command| MSG[Messaging API / CommandEnvelope]
    MSG --> JS[JetStream command streams]
    JS --> OWNER[AccountOwnerWorker / AccountSessionRuntime]
    OWNER --> REG[commands.registry]
    REG --> CTX[CommandHandlerContext]
    CTX --> FACADE[TdlibFacade]
    FACADE --> API[tdlib/api/*]
    FACADE --> SVC[tdlib/services/*]
    SVC --> NORM[tdlib/normalizers/*]
    NORM --> DOMAIN[dialogs/domain/*]
    API --> TDLIB[Telegram TDLib]
    TDLIB --> UPD[UpdateRuntimeService]
    UPD --> CACHE[DialogCache / Projections]
    UPD --> LIVE[publish_live_payload_sync]
    LIVE --> NATS[Core NATS live subjects]
    NATS --> WS
    CACHE --> REPLAY[ReplayService]
    CACHE --> SNAP[CacheProjectionUpdater]
    SNAP --> WS
    REPLAY --> WS
```
## Entry points

### WebSocket entry point

- File: `backend/tg_client/dialogs/ws/list_chats/consumers.py`
- Class: `TDLibListChatsConsumer`
- Responsibilities:
  - accept WebSocket messages
  - handle `open_client` directly for live subscriptions + snapshot/replay
  - forward all other `action` values via `publish_command()`
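The consumer's split between the locally handled `open_client` action and everything else can be sketched as follows; the class and helper names here are illustrative stand-ins, not the real `TDLibListChatsConsumer` API:

```python
# Sketch of the WebSocket consumer's action dispatch; method and callback
# names are assumptions for illustration only.
import json


class ListChatsConsumerSketch:
    def __init__(self, publish, open_client):
        self._publish = publish          # stands in for publish_command()
        self._open_client = open_client  # local snapshot/replay + live subscribe

    async def receive(self, text_data: str) -> None:
        msg = json.loads(text_data)
        action = msg.get("action")
        if action == "open_client":
            # Handled in-process: live subscription plus snapshot/replay.
            await self._open_client(msg)
        else:
            # Every other action is forwarded to the owner runtime.
            await self._publish(action, msg.get("payload", {}))
```

Keeping `open_client` in-process avoids a JetStream round trip for the one action that must attach the live subscription before any replay is delivered.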
### Owner runtime entry point

- File: `backend/tg_client/account_runtime/owner_worker.py`
- Class: `AccountSessionRuntime`
- Responsibilities:
  - own the TDLib session lifecycle for one Telegram account on one shard owner
  - receive JetStream commands
  - dispatch command payloads through `commands.registry`
  - attach TDLib update handlers and run `UpdateRuntimeService`
## Component catalog

| Component | Role | Inputs | Outputs | Depends on | Depended on by | Interaction mechanism |
|---|---|---|---|---|---|---|
| `TDLibListChatsConsumer` | WebSocket gateway | client JSON actions | live chat_update frames | messaging API, projection updater, replay service, cloud upload helper | browser clients | WebSocket + Core NATS subscription |
| `publish_command()` | command envelope publisher | command name + payload | CommandEnvelope, JetStream message | ownership coordinator, messaging publisher | WS consumer | async function call |
| `AccountOwnerWorker` / `AccountSessionRuntime` | shard/account executor | JetStream messages, lease state | TDLib calls, live events, DLQ on failure | SessionManager, command registry, TdlibFacade | messaging topology | JetStream durable consumers |
| `commands.registry` | command dispatch | normalized payload + context | handler invocation | command context, imported handlers | owner runtime | in-process registry map |
| `CommandHandlerContext` | per-command composition | userbot, tg, payload | service objects, send helpers | TdlibFacade, Channel, Message | handlers | dataclass assembly |
| `TdlibFacade` | composition root | TDLib transport object | raw APIs, services, shared builders | TdlibBaseClient, services, normalizers, domain builders | command context, update runtime | in-process object graph |
| `tdlib/api/*` | thin TDLib wrappers | normalized method parameters | `{ok, result, error}` raw transport result | TdlibBaseClient | services | synchronous method call |
| `tdlib/services/*` | business logic | handler payload fields | normalized service result | raw APIs, normalizers, sometimes Channel / Message | handlers | direct method call |
| `tdlib/normalizers/*` | typed result shaping | raw TDLib/service data | stable `payload.result` shapes | domain builders, helper normalizers | services, update runtime | direct method call |
| `dialogs/domain/*` | shared typed payload builders | chat/message/profile raw objects | typed fragments | context resolvers, ws components | normalizers, services | direct method call |
| `UpdateRuntimeService` | TDLib update orchestration | TDLib update dicts | cached projections + live events | UpdateNormalizer, DialogCache, publish_live_payload_sync | owner runtime | update callback |
| `DialogCache` | short-horizon projection store | normalized chat/profile/message events | snapshots, recent events, file cache | Django cache backend | projection updater, replay service, media loader | cache API |
| `ReplayService` | reconnect recovery | account snapshot + sequence | recent replay events | DialogCache | WS consumer | async call |
| `media_loader.py` | async media resolution | file id / remote id jobs | media_ready payloads | FileService, cloud storage, DialogCache | media command service | background threads / async tasks |
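The `DialogCache` + `ReplayService` pair described above amounts to sequence-based replay over a short-horizon event buffer. A sketch under assumed names (the real classes and methods may look quite different):

```python
# Illustrative sequence-based replay over a bounded event buffer; class and
# method names are assumptions, not the real DialogCache/ReplayService API.
from collections import deque


class DialogCacheSketch:
    """Short-horizon store: keeps only the most recent `horizon` events."""

    def __init__(self, horizon: int = 100):
        self._events: deque[tuple[int, dict]] = deque(maxlen=horizon)
        self._seq = 0

    def append(self, event: dict) -> int:
        self._seq += 1
        self._events.append((self._seq, event))
        return self._seq

    def events_after(self, seq: int) -> list[tuple[int, dict]]:
        return [(s, e) for s, e in self._events if s > seq]


def replay(cache: DialogCacheSketch, last_seen_seq: int) -> list[dict]:
    # A reconnecting client supplies the last sequence number it processed
    # and receives only the missed events still inside the cache horizon.
    return [e for _, e in cache.events_after(last_seen_seq)]
```

The bounded deque is what makes the store "short-horizon": a client that reconnects after too long a gap cannot be caught up from the buffer alone and must fall back to a fresh snapshot.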
## Layering

```mermaid
flowchart TB
    subgraph Gateway
        C[WebSocket consumer]
        R[commands.registry]
    end
    subgraph Runtime
        O[owner_worker]
        U[UpdateRuntimeService]
    end
    subgraph Composition
        F[TdlibFacade]
    end
    subgraph Application
        S[tdlib/services/*]
        N[tdlib/normalizers/*]
    end
    subgraph Domain
        D[dialogs/domain/*]
    end
    subgraph Infrastructure
        A[tdlib/api/*]
        B[TdlibBaseClient]
        Cache[DialogCache]
        Msg[Messaging]
        Cloud[Cloud storage]
    end
    C --> Msg --> O --> R --> F
    F --> S --> N --> D
    S --> A --> B
    O --> U --> N
    U --> Cache
    U --> Msg
    S --> Cloud
```
## Not implemented / requires verification

- there is no formal Pydantic or dataclass schema layer covering every command result
- some result shapes are stabilized by normalizers; others are service-assembled dicts
- the docs tree below marks those places explicitly instead of inventing schemas