fix(security): add rate limiting to Telegram bridge (#865)
Conversation
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info · Configuration used: `.coderabbit.yaml` · Review profile: CHILL · Plan: Pro
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)

📝 Walkthrough

The Telegram bridge now enforces per-chat rate limiting (5s cooldown) and per-chat serialization to prevent concurrent processing. It tracks per-chat state in `lastMessageTime` and `busyChats`.
Sequence diagram

```mermaid
sequenceDiagram
    actor Telegram
    participant PollingLoop as "Polling Loop"
    participant ChatState as "Chat State\n(lastMessageTime, busyChats)"
    participant Agent
    Telegram->>PollingLoop: Deliver message (chatId)
    PollingLoop->>ChatState: Is chatId in busyChats?
    alt chatId busy
        PollingLoop->>Telegram: Send "Still processing your previous message."
    else chatId not busy
        PollingLoop->>ChatState: Check lastMessageTime cooldown
        alt cooldown active
            PollingLoop->>Telegram: Send "Please wait Ns..."
        else cooldown expired
            PollingLoop->>ChatState: Mark chatId as busy
            PollingLoop->>Agent: Execute agent for message
            Agent->>Agent: Process message and generate reply
            Agent->>ChatState: Update lastMessageTime
            PollingLoop->>ChatState: Clear chatId from busyChats (finally)
            Agent->>Telegram: Send reply
        end
    end
```
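The gating the diagram describes can be sketched as a small standalone check. This is illustrative only: `COOLDOWN_MS`, `lastMessageTime`, and `busyChats` are the names used in the PR, but `checkChat` is a hypothetical helper, not the bridge's actual code.

```javascript
const COOLDOWN_MS = 5000;
const lastMessageTime = new Map(); // chatId -> ms timestamp of last accepted message
const busyChats = new Set();       // chatIds with an agent run in flight

// Returns the rejection reply to send immediately, or null if the
// message may proceed (in which case the chat is marked busy).
function checkChat(chatId, now = Date.now()) {
  if (busyChats.has(chatId)) {
    return "Still processing your previous message.";
  }
  const last = lastMessageTime.get(chatId) || 0;
  if (now - last < COOLDOWN_MS) {
    const wait = Math.ceil((COOLDOWN_MS - (now - last)) / 1000);
    return `Please wait ${wait}s before sending another message.`;
  }
  lastMessageTime.set(chatId, now);
  busyChats.add(chatId);
  return null;
}
```

Note the ordering: the busy check runs before the cooldown check, which matches the review's point that a busy chat should get "Still processing" rather than a cooldown countdown.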
Estimated code review effort: 🎯 3 (Moderate) · ⏱️ ~22 minutes

🚥 Pre-merge checks: ✅ 3 passed
Caution: Some comments are outside the diff and can't be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
scripts/telegram-bridge.js (1)
Lines 206-239: ⚠️ Potential issue | 🔴 Critical: Don't await the agent run inside `poll()`.

`busyChats` only works if `poll()` keeps fetching updates while a chat is in flight. Here the agent call is still awaited inline, so follow-up messages are not examined until after `finally` clears the busy flag; once a run lasts longer than 5s, the queued message can start a second session instead of getting "Still processing", and unrelated chats are blocked behind the same await. Dispatch the per-chat work without awaiting it in `poll()`, and keep the busy check ahead of the cooldown check.

Suggested direction
```diff
-      // Rate limiting: per-chat cooldown
-      const now = Date.now();
-      const lastTime = lastMessageTime.get(chatId) || 0;
-      if (now - lastTime < COOLDOWN_MS) {
-        const wait = Math.ceil((COOLDOWN_MS - (now - lastTime)) / 1000);
-        await sendMessage(chatId, `Please wait ${wait}s before sending another message.`, msg.message_id);
-        continue;
-      }
-
-      // Per-chat serialization: reject if this chat already has an active session
-      if (busyChats.has(chatId)) {
-        await sendMessage(chatId, "Still processing your previous message.", msg.message_id);
-        continue;
-      }
-
-      lastMessageTime.set(chatId, now);
-      busyChats.add(chatId);
-
-      // Send typing indicator
-      await sendTyping(chatId);
-
-      // Keep a typing indicator going while agent runs
-      const typingInterval = setInterval(() => sendTyping(chatId), 4000);
-
-      try {
-        const response = await runAgentInSandbox(msg.text, chatId);
-        clearInterval(typingInterval);
-        console.log(`[${chatId}] agent: ${response.slice(0, 100)}...`);
-        await sendMessage(chatId, response, msg.message_id);
-      } catch (err) {
-        clearInterval(typingInterval);
-        await sendMessage(chatId, `Error: ${err.message}`, msg.message_id);
-      } finally {
-        busyChats.delete(chatId);
-      }
+      // Per-chat serialization: reject if this chat already has an active session
+      if (busyChats.has(chatId)) {
+        await sendMessage(chatId, "Still processing your previous message.", msg.message_id);
+        continue;
+      }
+
+      // Rate limiting: per-chat cooldown
+      const now = Date.now();
+      const lastTime = lastMessageTime.get(chatId) || 0;
+      if (now - lastTime < COOLDOWN_MS) {
+        const wait = Math.ceil((COOLDOWN_MS - (now - lastTime)) / 1000);
+        await sendMessage(chatId, `Please wait ${wait}s before sending another message.`, msg.message_id);
+        continue;
+      }
+
+      lastMessageTime.set(chatId, now);
+      busyChats.add(chatId);
+
+      const handler = (async () => {
+        await sendTyping(chatId);
+        const typingInterval = setInterval(() => sendTyping(chatId), 4000);
+
+        try {
+          const response = await runAgentInSandbox(msg.text, chatId);
+          console.log(`[${chatId}] agent: ${response.slice(0, 100)}...`);
+          await sendMessage(chatId, response, msg.message_id);
+        } catch (err) {
+          await sendMessage(chatId, `Error: ${err.message}`, msg.message_id).catch(() => {});
+        } finally {
+          clearInterval(typingInterval);
+          busyChats.delete(chatId);
+        }
+      })();
+
+      handler.catch((err) => {
+        console.error(`[${chatId}] handler error:`, err.message);
+      });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/telegram-bridge.js` around lines 206 - 239, The polling loop currently awaits runAgentInSandbox inside poll(), which blocks further update processing and makes busyChats ineffective; change poll() to (1) check busyChats before cooldown (move busyChats.has(chatId) above the COOLDOWN_MS check), (2) dispatch the per-chat work as an unawaited async task so poll() can continue fetching updates — create an async helper (or inline async IIFE) that sets lastMessageTime, busyChats.add(chatId), starts sendTyping and the typingInterval, calls runAgentInSandbox, handles sendMessage on success/error, clears the typingInterval and finally removes busyChats; call that helper without awaiting it from poll() so other updates are still processed.
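The non-blocking dispatch the comment asks for can be demonstrated in isolation. A minimal sketch, assuming a fake 50ms agent (`fakeAgent`, `dispatch`, and the timings are illustrative stand-ins, not the bridge's `runAgentInSandbox`):

```javascript
const busyChats = new Set();
const log = [];

// Stand-in for the real agent call: resolves after 50ms.
const fakeAgent = (chatId) =>
  new Promise((resolve) => setTimeout(() => resolve(`reply for ${chatId}`), 50));

function dispatch(chatId) {
  if (busyChats.has(chatId)) {
    log.push(`${chatId}: rejected (busy)`);
    return;
  }
  busyChats.add(chatId);
  // Fire-and-forget: the caller does NOT await this promise,
  // so the polling loop stays free to examine further updates.
  (async () => {
    try {
      const reply = await fakeAgent(chatId);
      log.push(`${chatId}: ${reply}`);
    } finally {
      busyChats.delete(chatId); // always clear the busy flag
    }
  })().catch((err) => log.push(`${chatId}: error ${err.message}`));
}

async function main() {
  dispatch("a"); // starts a 50ms run
  dispatch("a"); // rejected immediately; the caller was not blocked
  dispatch("b"); // a different chat proceeds concurrently
  await new Promise((r) => setTimeout(r, 100)); // let both runs finish
  console.log(log.join("\n"));
}
main();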
ℹ️ Review info · Configuration used: `.coderabbit.yaml` · Review profile: CHILL · Plan: Pro · Run ID: afab6a82-6eef-47a6-b42f-ef513c2fcce7
📒 Files selected for processing (1): scripts/telegram-bridge.js
The bridge spawns an SSH + agent session per message with zero throttling — a flood of messages causes resource exhaustion and cost amplification via cloud inference (CWE-770, NVBUG 6002809). Add per-chat 5s cooldown, per-chat busy guard to reject messages while a session is active, and raise the poll interval from 100ms to 1000ms. Made-with: Cursor
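The interval change in the commit amounts to a standard fetch-then-sleep loop. A sketch under assumed names (`getUpdates` is a stand-in for the Telegram API call; only the `POLL_INTERVAL_MS = 1000` value comes from the commit message):

```javascript
const POLL_INTERVAL_MS = 1000; // raised from 100ms: 10x fewer API calls when idle

// Stand-in for Telegram's getUpdates; serves pre-canned batches for the demo.
const batches = [[{ chatId: 1, text: "hi" }], []];
async function getUpdates() {
  return batches.shift() || [];
}

const seen = [];
async function pollOnce() {
  for (const update of await getUpdates()) {
    seen.push(update.chatId);
  }
}

// Each iteration fetches a batch, then sleeps before fetching again.
async function pollLoop(iterations, intervalMs = POLL_INTERVAL_MS) {
  for (let i = 0; i < iterations; i++) {
    await pollOnce();
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

Because the sleep happens between fetches rather than per message, raising it throttles only the polling cadence, not the handling of messages already received.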
Force-pushed from 2e4de4f to 1195812.
The bridge spawns an SSH + agent session per message with zero throttling — a flood of messages causes resource exhaustion and cost amplification via cloud inference (CWE-770, NVBUG 6002809). Add per-chat 5s cooldown, per-chat busy guard to reject messages while a session is active, and raise the poll interval from 100ms to 1000ms.

Made-with: Cursor
Co-authored-by: Facundo Fernandez <facundofernandez@Facundos-MacBook-Pro.local>
Co-authored-by: Carlos Villela <cvillela@nvidia.com>
Summary

- `/start` and `/reset` commands are unaffected (checked before rate limiter)

Test plan

- `/start` and `/reset` still work instantly regardless of cooldown
- `busyChats` is cleaned up in `finally` even if agent call throws

Summary by CodeRabbit
New Features
Performance
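The last test-plan item (the busy flag is cleared even when the agent call throws) can be exercised with a tiny harness. `handle` and the throwing stand-in agent below are hypothetical, not the bridge's actual code:

```javascript
const busyChats = new Set();

async function handle(chatId, agentFn) {
  busyChats.add(chatId);
  try {
    return await agentFn();
  } catch (err) {
    return `Error: ${err.message}`;
  } finally {
    busyChats.delete(chatId); // runs on success and on throw alike
  }
}

async function demo() {
  const ok = await handle(1, async () => "done");
  const failed = await handle(2, async () => { throw new Error("boom"); });
  console.log(ok, failed, busyChats.size); // done Error: boom 0
}
demo();
```

Putting the cleanup in `finally` rather than in both the success and error paths is what guarantees the chat can never be stuck permanently busy.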