Architecture: Dependency Map
Layer dependency graph
```mermaid
flowchart TB
    H[commands/handlers/*] --> C[commands/context.py]
    C --> F[tdlib/facade.py]
    F --> A[tdlib/api/*]
    F --> S[tdlib/services/*]
    F --> D[domain/channel.py + domain/message.py]
    S --> A
    S --> N[tdlib/normalizers/*]
    N --> D
    D --> R[context resolvers]
    R --> A
    U[services/update_runtime_service.py] --> N
    U --> Cache[cache/dialog_cache.py]
    U --> Msg[messaging API]
    WS[ws/list_chats/consumers.py] --> Msg
    WS --> Cache
    WS --> Replay[replay/service.py]
    OWNER[account_runtime/owner_worker.py] --> C
    OWNER --> U
    OWNER --> F
```
Concrete dependency observations
- handlers do not call TDLib transport directly; they always go through the context services
- services depend on raw API modules and, where needed, normalizers
- normalizers and domain builders depend on context resolvers rather than on TDLib transport objects
- update runtime depends on normalizers plus cache/event side effects
- WebSocket consumer is a gateway/orchestration layer, not the business logic owner
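The first observation, that handlers reach TDLib only through the context services, can be sketched as follows. All names here are illustrative stand-ins, not the project's actual signatures:

```python
# Hypothetical sketch of the layering described above. The class and
# method names are assumptions; only the dependency direction
# (handler -> context -> facade, never handler -> transport) mirrors
# the observations in this document.

class TdlibFacade:
    """Stands in for tdlib/facade.py: the single entry into TDLib APIs."""

    def get_chats(self, limit: int) -> list[dict]:
        # Real code would delegate to tdlib/api/* through the transport;
        # here we return canned data to keep the sketch runnable.
        return [{"id": i, "title": f"chat {i}"} for i in range(limit)]


class CommandContext:
    """Stands in for commands/context.py: wires handlers to services."""

    def __init__(self, facade: TdlibFacade) -> None:
        self.facade = facade


def list_chats_handler(ctx: CommandContext, limit: int = 3) -> list[str]:
    """A commands/handlers/* style function: it asks the context for
    what it needs and never touches the transport directly."""
    return [chat["title"] for chat in ctx.facade.get_chats(limit)]


titles = list_chats_handler(CommandContext(TdlibFacade()))
```

The point of the shape is testability: a handler can be exercised with a fake `CommandContext`, with no TDLib transport in sight.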
No longer present in the active runtime path
The current dialogs runtime path no longer depends on the old monolithic `TdlibClient` adapter layer; the active composition path is `TdlibFacade` + `TdlibBaseClient` plus the domain-specific APIs and services.
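A minimal sketch of that composition path, with assumed constructor shapes: the facade owns one base client and hands it to domain-specific services, rather than routing everything through one monolithic adapter.

```python
# Hedged sketch: class names follow the document, but the constructor
# signatures and the "searchPublicChats" method name are assumptions
# made for illustration.

class TdlibBaseClient:
    """Stands in for the single transport wrapper."""

    def call(self, method: str, **params) -> dict:
        # Real code would forward to tg.call_method(); echoed here
        # so the sketch runs without TDLib.
        return {"method": method, "params": params}


class SearchService:
    """Stands in for a tdlib/services/* module."""

    def __init__(self, client: TdlibBaseClient) -> None:
        self._client = client

    def search(self, query: str) -> dict:
        return self._client.call("searchPublicChats", query=query)


class TdlibFacade:
    """Composes the base client with domain-specific services."""

    def __init__(self) -> None:
        client = TdlibBaseClient()
        self.search_service = SearchService(client)


facade = TdlibFacade()
result = facade.search_service.search("python")
```

Services share one client instance but stay independent of each other, which is what lets the old all-in-one adapter drop out of the runtime path.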
Architectural constraints visible in code
- `TdlibBaseClient` is the only transport wrapper around `tg.call_method()`
- handler groups are resolved only by `commands.registry`
- normalized live payloads are emitted through `send_ws()`/`publish_live_event()`
- `Message` and `Channel` are shared typed builders reused by search, dialog, chat list, and update flows
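The shared-builder constraint can be sketched with dataclasses. The exact fields of `Message` and `Channel` are assumptions; the idea being illustrated is one typed shape built by normalizers and reused across the search, dialog, chat list, and update flows:

```python
# Hypothetical sketch of the shared typed builders. Field names and
# the raw-payload layout are assumptions, not the project's schema.

from dataclasses import dataclass


@dataclass(frozen=True)
class Channel:
    id: int
    title: str


@dataclass(frozen=True)
class Message:
    id: int
    chat_id: int
    text: str


def build_message(raw: dict) -> Message:
    """A normalizer-style builder: raw TDLib payload -> typed Message.

    Every flow that needs a message goes through this one builder,
    so field handling is fixed in a single place.
    """
    return Message(
        id=raw["id"],
        chat_id=raw["chat_id"],
        text=raw.get("content", {}).get("text", ""),
    )


msg = build_message({"id": 7, "chat_id": 42, "content": {"text": "hi"}})
```

Because the builders are frozen dataclasses, downstream flows get value semantics and cannot mutate a payload another flow is also holding.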