Design a Real-time Chat UI
Interview answer templates and key talking points.
Step 1 — Clarify the scope
A chat UI swaps the mailbox for a walkie-talkie. Old chat apps used polling — the mailbox: you walk out every minute to check whether mail arrived. WebSocket is the walkie-talkie — the connection stays open, and you hear the message the moment it's sent. The challenge: what do you do when the walkie-talkie connection drops? You need to reconnect automatically and re-sync any messages you missed.
Step 2 — Component architecture
- ChatApp
  - ConversationList — left sidebar, list of chats
    - ConversationSearch
    - ConversationItem (×N) — avatar, last message preview, unread count
Step 3 — WebSocket connection management
Connection lifecycle
```tsx
function useChatSocket(conversationId: string) {
  const [socket, setSocket] = useState<WebSocket | null>(null);
  const [connected, setConnected] = useState(false);
  const socketRef = useRef<WebSocket | null>(null);
  const reconnectAttempts = useRef(0);
  const reconnectTimer = useRef<ReturnType<typeof setTimeout>>();

  const connect = useCallback(() => {
    const ws = new WebSocket(`wss://api.example.com/ws/chat/${conversationId}/`);

    ws.onopen = () => {
      setConnected(true);
      reconnectAttempts.current = 0;
    };

    ws.onmessage = (event) => {
      const message = JSON.parse(event.data);
      handleIncomingMessage(message); // update React Query cache
    };

    ws.onclose = () => {
      setConnected(false);
      scheduleReconnect();
    };

    socketRef.current = ws;
    setSocket(ws);
  }, [conversationId]);

  const scheduleReconnect = () => {
    // Exponential backoff: 1s, 2s, 4s, 8s, capped at 30s
    const delay = Math.min(1000 * 2 ** reconnectAttempts.current, 30_000);
    reconnectAttempts.current += 1;
    reconnectTimer.current = setTimeout(connect, delay);
  };

  useEffect(() => {
    connect();
    return () => {
      // Close via the ref: the `socket` state value captured in this
      // cleanup closure would be stale (null on the first render)
      socketRef.current?.close();
      clearTimeout(reconnectTimer.current);
    };
  }, [connect]);

  return { socket, connected };
}
```
Exponential backoff is critical. If the server goes down and 10,000 users retry every second, you create a thundering herd that makes the outage worse. Spreading retries over 1s → 2s → 4s gives the server breathing room.
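The backoff above is deterministic, so all clients that disconnected at the same moment still retry at the same moments. Adding random jitter spreads them out. A minimal sketch, assuming the same attempt counter as the hook (the function name is illustrative):

```typescript
// Full jitter: pick a uniform delay in [0, min(base * 2^attempt, cap))
// so clients that disconnected together do not reconnect together.
function backoffDelay(attempt: number, baseMs = 1000, capMs = 30_000): number {
  const exp = Math.min(baseMs * 2 ** attempt, capMs);
  return Math.random() * exp;
}
```

The scheduler would call `setTimeout(connect, backoffDelay(reconnectAttempts.current))` instead of using the fixed delay.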
What to do when the socket reconnects
When the connection drops and reconnects, messages sent during the gap are missing. You need to re-sync:
```tsx
ws.onopen = async () => {
  setConnected(true);
  // Fetch any messages we missed while disconnected
  const since = getLastReceivedMessageTimestamp();
  const missed = await fetch(`/api/conversations/${conversationId}/messages?since=${since}`);
  const { messages } = await missed.json();
  if (messages.length) addMessagesToCache(messages);
};
```
Step 4 — Message list (scroll behavior)
Chat scroll is backwards compared to a normal feed. New messages go at the bottom, history loads at the top.
Auto-scroll to bottom for new messages
```tsx
function MessageList({ messages }: { messages: Message[] }) {
  const bottomRef = useRef<HTMLDivElement>(null);
  const containerRef = useRef<HTMLDivElement>(null);
  const [userScrolledUp, setUserScrolledUp] = useState(false);

  // Auto-scroll to bottom only if the user hasn't scrolled up
  useEffect(() => {
    if (!userScrolledUp) {
      bottomRef.current?.scrollIntoView({ behavior: 'smooth' });
    }
  }, [messages, userScrolledUp]);

  const handleScroll = () => {
    const el = containerRef.current;
    if (!el) return;
    // "scrolled up" = more than 50px from the bottom
    const atBottom = el.scrollHeight - el.scrollTop - el.clientHeight < 50;
    setUserScrolledUp(!atBottom);
  };

  return (
    <div ref={containerRef} onScroll={handleScroll} className="overflow-y-auto flex-1">
      {messages.map(m => <MessageBubble key={m.id} message={m} />)}
      <div ref={bottomRef} />
    </div>
  );
}
```
Load older messages (scroll to top)
```tsx
function useMessageHistory(conversationId: string) {
  return useInfiniteQuery({
    queryKey: ['messages', conversationId],
    queryFn: ({ pageParam }) =>
      fetch(`/api/conversations/${conversationId}/messages?before=${pageParam}&limit=30`)
        .then(r => r.json()),
    // React Query passes the most recently fetched page here; its oldest
    // message id becomes the `before` cursor for the next (older) page
    getNextPageParam: (lastPage) => lastPage.oldest_message_id,
    select: (data) => ({
      ...data,
      pages: [...data.pages].reverse(), // newest page last
      pageParams: [...data.pageParams].reverse(),
    }),
  });
}

// Detect scroll to top to load more history
const handleScroll = () => {
  const el = containerRef.current;
  if (el && el.scrollTop < 100 && hasNextPage && !isFetchingNextPage) {
    const prevScrollHeight = el.scrollHeight;
    fetchNextPage().then(() => {
      // Maintain scroll position — don't jump to top after loading
      el.scrollTop = el.scrollHeight - prevScrollHeight;
    });
  }
};
```
The scroll position fix (scrollHeight - prevScrollHeight) is the tricky part — when you prepend older messages, the scrollHeight increases, which would push the current view upward. You offset by the difference.
Step 5 — Optimistic message sending
```tsx
const sendMessage = useMutation({
  mutationFn: (text: string) =>
    fetch(`/api/conversations/${conversationId}/messages`, {
      method: 'POST',
      body: JSON.stringify({ text }),
    }).then(r => r.json()),

  onMutate: async (text) => {
    const tempId = `temp-${Date.now()}`;
    const optimisticMessage: Message = {
      id: tempId,
      text,
      sender_id: currentUser.id,
      created_at: new Date().toISOString(),
      status: 'sending', // show a clock icon
    };
    addMessageToCache(optimisticMessage);
    return { tempId };
  },

  onSuccess: (serverMessage, _text, context) => {
    // Replace temp message with server-confirmed message
    replaceMessageInCache(context.tempId, { ...serverMessage, status: 'sent' });
  },

  onError: (_err, _text, context) => {
    // Mark as failed — show retry button
    updateMessageStatus(context.tempId, 'failed');
  },
});
```
Status flow: sending (clock icon) → sent (single tick) → delivered (double tick) → read (blue tick). Each transition comes from the WebSocket.
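Those WebSocket events can arrive out of order, so a status update should only ever move forward: a late `delivered` event must not undo `read`. A minimal sketch of that guard (the helper name and the forward-only rule are illustrative, not a specific library's API):

```typescript
type MessageStatus = 'sending' | 'sent' | 'delivered' | 'read' | 'failed';

// Forward-only ordering; 'failed' is set by the mutation's onError, not here
const STATUS_ORDER: MessageStatus[] = ['sending', 'sent', 'delivered', 'read'];

function nextStatus(current: MessageStatus, incoming: MessageStatus): MessageStatus {
  if (current === 'failed') return incoming; // a successful retry recovers
  return STATUS_ORDER.indexOf(incoming) > STATUS_ORDER.indexOf(current)
    ? incoming
    : current; // ignore stale events that would move the status backwards
}
```

The WebSocket handler would call this before writing the new status into the cache.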
Step 6 — Typing indicator
```tsx
function useTypingIndicator(socket: WebSocket | null, conversationId: string) {
  const typingTimer = useRef<ReturnType<typeof setTimeout>>();
  const isTyping = useRef(false);

  const onInput = () => {
    if (!isTyping.current) {
      isTyping.current = true;
      socket?.send(JSON.stringify({ type: 'typing_start', conversation_id: conversationId }));
    }
    // Reset the stop timer on every keystroke
    clearTimeout(typingTimer.current);
    typingTimer.current = setTimeout(() => {
      isTyping.current = false;
      socket?.send(JSON.stringify({ type: 'typing_stop', conversation_id: conversationId }));
    }, 1500); // stop event fires 1.5s after last keystroke
  };

  return { onInput };
}
```
Never send a typing event on every keystroke. Send typing_start once when typing begins, and typing_stop 1.5s after the last keystroke. This reduces WebSocket traffic by ~95%.
Step 7 — Read receipts
```tsx
// Mark messages as read when they enter the viewport
function MessageBubble({ message }: { message: Message }) {
  const ref = useRef<HTMLDivElement>(null);
  const { markRead } = useReadReceipts();

  useEffect(() => {
    if (message.read || message.sender_id === currentUser.id) return;
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          markRead(message.id);
          observer.disconnect();
        }
      },
      { threshold: 0.5 } // message must be 50% visible to count as read
    );
    if (ref.current) observer.observe(ref.current);
    return () => observer.disconnect();
  }, [message.id]);

  return <div ref={ref}>...</div>;
}
```
Don't send a read-receipt API call for every individual message as it scrolls into view — batch them. Collect message IDs for 500ms, then send one request: PATCH /api/read-receipts { message_ids: [...] }. During fast scrolling this collapses dozens of per-message calls into one request per 500ms window — roughly a 10–20x reduction.
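A minimal sketch of that 500ms collector, which `markRead` in `useReadReceipts` could delegate to. The factory shape and names are illustrative; `send` would wrap the PATCH call:

```typescript
function createReadReceiptBatcher(send: (ids: string[]) => void, windowMs = 500) {
  let pending = new Set<string>();
  let timer: ReturnType<typeof setTimeout> | undefined;

  return function markRead(messageId: string) {
    pending.add(messageId); // a Set dedupes repeated IDs for free
    if (timer !== undefined) return; // a flush is already scheduled
    timer = setTimeout(() => {
      send([...pending]); // e.g. PATCH /api/read-receipts { message_ids }
      pending = new Set();
      timer = undefined;
    }, windowMs);
  };
}
```

The first ID to arrive starts the timer; everything collected before it fires goes out in one request.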
Step 8 — Connection status UI
Always show the user whether they're connected. Silent failures in chat are unacceptable.
```tsx
function ConnectionBanner({ connected }: { connected: boolean }) {
  if (connected) return null;
  return (
    <div
      role="alert"
      aria-live="assertive"
      className="bg-amber-500/10 border-b border-amber-500/30 text-amber-400 text-xs text-center py-1"
    >
      Reconnecting… Messages will be sent when connection is restored.
    </div>
  );
}
```
While disconnected, messages added to the optimistic cache are queued locally. On reconnect, flush the queue.
Step 9 — Performance considerations
| Problem | Solution |
|---|---|
| 1000+ messages in DOM | Virtual scrolling with @tanstack/virtual |
| Image attachments blocking render | Lazy load, aspect-ratio placeholder |
| Typing events flooding the socket | Debounce — send start/stop, not per-keystroke |
| Read receipts — one request per message | Batch with 500ms collector |
| Messages rendering on every socket event | React.memo on MessageBubble, stable keys |
| Reconnect thundering herd | Exponential backoff with jitter |
Step 10 — Accessibility
- Landmarks: `<main>` for the message list, `<nav>` for the conversation list
- New incoming messages: an `aria-live="polite"` container so screen readers announce them
- Typing indicator: `aria-live="polite"` so screen readers say "Alice is typing"
- Message input: `aria-label="Message"`, `role="textbox"`, `aria-multiline="true"`
- Images: meaningful `alt` text, or `alt="Attachment sent by Alice"`
- Focus management: after sending a message, keep focus on the input
Step 11 — Data model
```ts
interface Message {
  id: string;
  conversation_id: string;
  sender_id: string;
  content: string;
  type: 'text' | 'image' | 'file' | 'system';
  status: 'sending' | 'sent' | 'delivered' | 'read' | 'failed';
  sequence_number: number; // monotonically increasing per conversation
  created_at: string;
  edited_at?: string;
  reply_to_id?: string; // thread support
  reactions?: Record<string, string[]>; // emoji → [userId, ...]
  attachments?: Attachment[];
}

interface Conversation {
  id: string;
  participants: Participant[];
  last_message?: Message;
  unread_count: number;
  is_group: boolean;
  name?: string; // group name
}

interface Attachment {
  id: string;
  url: string;
  type: 'image' | 'video' | 'file';
  filename: string;
  size_bytes: number;
  width?: number;
  height?: number;
}
```
The sequence_number is the key to ordering — don't rely on created_at alone since clocks can drift between clients and servers.
Step 12 — Message ordering and deduplication
In distributed systems, messages can arrive out of order over WebSocket. Use sequence numbers to detect and handle gaps.
```tsx
function useMessageOrdering(conversationId: string) {
  const [messages, setMessages] = useState<Message[]>([]);
  const expectedSeq = useRef(0);
  const buffer = useRef<Message[]>([]);

  const addMessage = (msg: Message) => {
    if (msg.sequence_number === expectedSeq.current) {
      setMessages(prev => [...prev, msg]);
      expectedSeq.current++;
      // Drain the buffer: append any buffered messages that now fit in sequence
      while (buffer.current[0]?.sequence_number === expectedSeq.current) {
        const next = buffer.current.shift()!;
        setMessages(prev => [...prev, next]);
        expectedSeq.current++;
      }
    } else if (msg.sequence_number > expectedSeq.current) {
      // Out of order: buffer it and request the missing range
      buffer.current.push(msg);
      buffer.current.sort((a, b) => a.sequence_number - b.sequence_number);
      requestMissingMessages(conversationId, expectedSeq.current, msg.sequence_number);
    }
    // If sequence_number < expected, it's a duplicate — discard
  };

  return { messages, addMessage };
}
```
Step 13 — Message reactions
```tsx
// Takes conversationId as well, since the messages cache is keyed by it
function useReactions(conversationId: string, messageId: string) {
  const queryClient = useQueryClient();

  const toggleReaction = useMutation({
    mutationFn: (emoji: string) =>
      fetch(`/api/messages/${messageId}/reactions`, {
        method: 'POST',
        body: JSON.stringify({ emoji }),
      }),

    onMutate: async (emoji) => {
      // Optimistic update: toggle the current user in the emoji's reactor list
      queryClient.setQueryData(['messages', conversationId], (old: Message[]) =>
        old.map(m => {
          if (m.id !== messageId) return m;
          const existing = m.reactions?.[emoji] ?? [];
          const hasReacted = existing.includes(currentUser.id);
          return {
            ...m,
            reactions: {
              ...m.reactions,
              [emoji]: hasReacted
                ? existing.filter(id => id !== currentUser.id)
                : [...existing, currentUser.id],
            },
          };
        })
      );
    },
  });

  return { toggleReaction };
}
```
Step 14 — Threaded replies
```tsx
function MessageThread({ parentId }: { parentId: string }) {
  const [open, setOpen] = useState(false);

  const { data: replies } = useQuery({
    queryKey: ['thread', parentId],
    queryFn: () => fetch(`/api/messages/${parentId}/replies`).then(r => r.json()),
    // Don't load the thread until the user explicitly opens it
    enabled: open,
  });

  return (
    <div>
      <button onClick={() => setOpen(o => !o)}>
        {replies?.length ?? 0} replies
      </button>
      {open && (
        <div className="ms-8 border-s-2 border-slate-700 ps-4 mt-2 space-y-2">
          {replies?.map(r => <MessageBubble key={r.id} message={r} />)}
          <ReplyComposer parentId={parentId} />
        </div>
      )}
    </div>
  );
}
```
Step 15 — File and image sharing
```ts
async function sendFile(file: File, conversationId: string) {
  // 1. Get a pre-signed upload URL from the server
  const { upload_url, file_id } = await fetch('/api/attachments/presign', {
    method: 'POST',
    body: JSON.stringify({ filename: file.name, content_type: file.type, size: file.size }),
  }).then(r => r.json());

  // 2. Upload directly to S3/GCS — server never touches the bytes
  await fetch(upload_url, { method: 'PUT', body: file, headers: { 'Content-Type': file.type } });

  // 3. Send the message referencing the file_id
  await sendMessage(conversationId, { type: 'file', attachment_id: file_id });
}
```
Upload flow:

1. User selects a file → `sendFile(file, conversationId)`
2. `POST /api/attachments/presign` — server generates a pre-signed S3 URL (expires in 5 min)
3. `PUT` to the pre-signed URL — direct browser → S3 upload; no server involved, which avoids server bandwidth cost
4. `POST /api/messages { attachment_id }`
5. WebSocket broadcast to conversation participants
6. Receivers get the message with a CDN URL and lazy-load the image/file
Step 16 — Push notifications (Web Push API)
When the user is not on the page, deliver messages via browser push notifications.
```ts
async function subscribeToPush() {
  const registration = await navigator.serviceWorker.ready;
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true,
    applicationServerKey: urlBase64ToUint8Array(process.env.NEXT_PUBLIC_VAPID_KEY!),
  });
  // Send subscription to server — server will use it to push messages
  await fetch('/api/push/subscribe', {
    method: 'POST',
    body: JSON.stringify(subscription),
  });
}

// In service worker — handle incoming push
self.addEventListener('push', event => {
  const data = event.data?.json();
  event.waitUntil(
    self.registration.showNotification(data.sender_name, {
      body: data.message_preview,
      icon: data.sender_avatar,
      data: { conversation_id: data.conversation_id },
    })
  );
});

// Clicking a notification opens the correct conversation
self.addEventListener('notificationclick', event => {
  event.notification.close();
  event.waitUntil(clients.openWindow(`/chat/${event.notification.data.conversation_id}`));
});
```
Step 17 — Offline outbox
Messages composed while offline should not be lost.
```ts
const OUTBOX_KEY = 'chat-outbox';

function useOfflineOutbox() {
  const sendWithOutbox = async (message: Partial<Message>) => {
    if (!navigator.onLine) {
      const queue = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
      queue.push({ ...message, id: crypto.randomUUID(), status: 'sending' });
      localStorage.setItem(OUTBOX_KEY, JSON.stringify(queue));
      // Show optimistically in the UI with 'pending' status
      return;
    }
    await sendMessageToServer(message);
  };

  useEffect(() => {
    const flush = async () => {
      const queue = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
      if (queue.length === 0) return;
      for (const msg of queue) {
        await sendMessageToServer(msg);
      }
      localStorage.removeItem(OUTBOX_KEY);
    };
    window.addEventListener('online', flush);
    return () => window.removeEventListener('online', flush);
  }, []);

  return { sendWithOutbox };
}
```
Step 18 — i18n and message timestamps
```tsx
function MessageTimestamp({ createdAt }: { createdAt: string }) {
  const date = new Date(createdAt);
  const now = new Date();
  const diffMs = now.getTime() - date.getTime();

  // Today: show time only
  if (diffMs < 86_400_000 && date.getDate() === now.getDate()) {
    return <time dateTime={createdAt}>
      {new Intl.DateTimeFormat(userLocale, { hour: 'numeric', minute: '2-digit' }).format(date)}
    </time>;
  }

  // This week: show day name + time
  if (diffMs < 7 * 86_400_000) {
    return <time dateTime={createdAt}>
      {new Intl.DateTimeFormat(userLocale, { weekday: 'short', hour: 'numeric', minute: '2-digit' }).format(date)}
    </time>;
  }

  // Older: show full date
  return <time dateTime={createdAt}>
    {new Intl.DateTimeFormat(userLocale, { dateStyle: 'medium', timeStyle: 'short' }).format(date)}
  </time>;
}
```
Always use <time dateTime={isoString}> for semantic HTML — screen readers and search engines understand it.
Step 19 — Telemetry
| Metric | Target | Alert when |
|---|---|---|
| Message delivery P95 | < 500ms | > 1s (WebSocket latency) |
| WebSocket reconnect rate | < 1% | > 5% (network instability) |
| Message send failure rate | < 0.1% | > 1% (server or queue issue) |
| Push notification delivery | > 90% | < 70% (VAPID key or service worker issue) |
| Unread badge accuracy | 100% | Any discrepancy (sequence number bug) |
"The core of a real-time chat UI is WebSocket connection management — connect on mount, reconnect with exponential backoff, re-sync missed messages using sequence numbers on reconnect. For message ordering I'd use monotonic sequence numbers per conversation so out-of-order delivery is handled gracefully. Optimistic updates make sends feel instant. Typing indicators are debounced — send start/stop events only. Read receipts are batched. For file sharing I'd use pre-signed S3 URLs so the browser uploads directly to the CDN without going through the server. Push notifications keep users engaged when the tab is closed."