Design a Real-time Chat UI

Step 1 — Clarify the scope

Candidate: Is this a one-on-one chat or group chat, or both?
Interviewer: Both — 1:1 and group conversations.
Candidate: Do messages support rich content — images, files, reactions?
Interviewer: Text messages and image attachments. Emoji reactions.
Candidate: Is this real-time or near-real-time? Should I see a message within 1 second?
Interviewer: Yes, real-time. Less than 1 second delivery is expected.
Candidate: Do we need read receipts and typing indicators?
Interviewer: Yes to both.
Candidate: Should message history be paginated (load older messages on scroll up)?
Interviewer: Yes — load history in batches when the user scrolls to the top.
Candidate: Any offline support — can users send messages while offline?
Interviewer: Not required for now.
Simple Analogy

A chat UI is like a physical letter mailbox that was replaced by a walkie-talkie. Old chat apps (polling) were the mailbox — you walk out every minute to check if there's mail. WebSocket is the walkie-talkie — it stays open, and you hear the message the moment it's sent. The challenge is: what do you do if the walkie-talkie connection drops? You need to reconnect automatically and re-sync any messages you missed.


Step 2 — Component architecture

Chat UI — Component Tree
Visual component flow from page shell to reusable UI blocks

ChatApp
├── ConnectionBanner — reconnect/offline status (Step 8)
├── ConversationList — left sidebar, list of chats
│   ├── ConversationSearch
│   └── ConversationItem (×N) — avatar, last message preview, unread count
└── MessageList — active conversation, scrollable (Step 4)
    └── MessageBubble (×N) — text, status ticks, reactions


Step 3 — WebSocket connection management

Connection lifecycle

TSX
function useChatSocket(conversationId: string) {
  const [connected, setConnected] = useState(false);
  const socketRef = useRef<WebSocket | null>(null);
  const reconnectAttempts = useRef(0);
  const reconnectTimer = useRef<ReturnType<typeof setTimeout>>();
  const closedByClient = useRef(false); // distinguishes unmount from a dropped connection

  const connect = useCallback(() => {
    const ws = new WebSocket(`wss://api.example.com/ws/chat/${conversationId}/`);

    ws.onopen = () => {
      setConnected(true);
      reconnectAttempts.current = 0; // reset backoff after a successful connect
    };

    ws.onmessage = (event) => {
      const message = JSON.parse(event.data);
      handleIncomingMessage(message); // update React Query cache
    };

    ws.onclose = () => {
      setConnected(false);
      if (!closedByClient.current) scheduleReconnect(); // don't reconnect after unmount
    };

    socketRef.current = ws;
  }, [conversationId]);

  const scheduleReconnect = () => {
    // Exponential backoff with jitter: ~1s, 2s, 4s, 8s … capped at 30s
    const base = Math.min(1000 * 2 ** reconnectAttempts.current, 30_000);
    reconnectAttempts.current += 1;
    reconnectTimer.current = setTimeout(connect, base + Math.random() * 1000);
  };

  useEffect(() => {
    closedByClient.current = false;
    connect();
    return () => {
      closedByClient.current = true;
      socketRef.current?.close(); // close via the ref — state would be a stale closure here
      clearTimeout(reconnectTimer.current);
    };
  }, [connect]);

  return { socket: socketRef.current, connected };
}

Exponential backoff is critical. If the server goes down and 10,000 users retry every second, you create a thundering herd that makes the outage worse. Spreading retries over 1s → 2s → 4s gives the server breathing room.

What to do when the socket reconnects

When the connection drops and reconnects, messages sent during the gap are missing. You need to re-sync:

TSX
ws.onopen = async () => {
  setConnected(true);
  // Fetch any messages we missed while disconnected
  const since = getLastReceivedMessageTimestamp();
  const missed = await fetch(`/api/conversations/${conversationId}/messages?since=${since}`);
  const { messages } = await missed.json();
  if (messages.length) addMessagesToCache(messages);
};

Step 4 — Message list (scroll behavior)

Chat scroll is backwards compared to a normal feed. New messages go at the bottom, history loads at the top.

Auto-scroll to bottom for new messages

TSX
function MessageList({ messages }: { messages: Message[] }) {
  const bottomRef = useRef<HTMLDivElement>(null);
  const containerRef = useRef<HTMLDivElement>(null);
  const [userScrolledUp, setUserScrolledUp] = useState(false);

  // Auto-scroll to bottom only if user hasn't scrolled up
  useEffect(() => {
    if (!userScrolledUp) {
      bottomRef.current?.scrollIntoView({ behavior: 'smooth' });
    }
  }, [messages, userScrolledUp]);

  const handleScroll = () => {
    const el = containerRef.current;
    if (!el) return;
    // "scrolled up" = not at the bottom
    const atBottom = el.scrollHeight - el.scrollTop - el.clientHeight < 50;
    setUserScrolledUp(!atBottom);
  };

  return (
    <div ref={containerRef} onScroll={handleScroll} className="overflow-y-auto flex-1">
      {messages.map(m => <MessageBubble key={m.id} message={m} />)}
      <div ref={bottomRef} />
    </div>
  );
}

Load older messages (scroll to top)

TSX
function useMessageHistory(conversationId: string) {
  return useInfiniteQuery({
    queryKey: ['messages', conversationId],
    queryFn: ({ pageParam }) =>
      fetch(`/api/conversations/${conversationId}/messages?before=${pageParam}&limit=30`)
        .then(r => r.json()),
    // React Query passes the *last fetched* page here; its oldest message id
    // becomes the `before` cursor for the next (older) page
    getNextPageParam: (lastPage) => lastPage.oldest_message_id,
    select: (data) => ({
      ...data,
      pages: [...data.pages].reverse(), // newest page last
      pageParams: [...data.pageParams].reverse(),
    }),
  });
}

// Detect scroll to top to load more history
const handleScroll = () => {
  const el = containerRef.current;
  if (el && el.scrollTop < 100 && hasNextPage && !isFetchingNextPage) {
    const prevScrollHeight = el.scrollHeight;
    fetchNextPage().then(() => {
      // Maintain scroll position — don't jump to top after loading
      el.scrollTop = el.scrollHeight - prevScrollHeight;
    });
  }
};

The scroll position fix (scrollHeight - prevScrollHeight) is the tricky part — when you prepend older messages, the scrollHeight increases, which would push the current view upward. You offset by the difference.


Step 5 — Optimistic message sending

TSX
const sendMessage = useMutation({
  mutationFn: (text: string) =>
    fetch(`/api/conversations/${conversationId}/messages`, {
      method: 'POST',
      body: JSON.stringify({ text }),
    }).then(r => r.json()),

  onMutate: async (text) => {
    const tempId = `temp-${Date.now()}`;
    const optimisticMessage: Message = {
      id: tempId,
      text,
      sender_id: currentUser.id,
      created_at: new Date().toISOString(),
      status: 'sending', // show a clock icon
    };
    addMessageToCache(optimisticMessage);
    return { tempId };
  },

  onSuccess: (serverMessage, _text, context) => {
    // Replace temp message with server-confirmed message
    replaceMessageInCache(context.tempId, { ...serverMessage, status: 'sent' });
  },

  onError: (_err, _text, context) => {
    // Mark as failed — show retry button (context can be undefined if onMutate threw)
    if (context) updateMessageStatus(context.tempId, 'failed');
  },
});

Status flow: sending (clock icon) → sent (single tick) → delivered (double tick) → read (blue tick). Each transition comes from the WebSocket.
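One subtlety worth naming: status events can arrive out of order (a late "delivered" after a "read"). A small pure helper can enforce forward-only transitions — a sketch, where the event shape, the rank table, and the names `ChatMessage`/`applyStatusEvent` are assumptions, not part of the app above:

```typescript
type MessageStatus = 'sending' | 'sent' | 'delivered' | 'read' | 'failed';

// Hypothetical shape of the status events the server pushes over the socket
interface StatusEvent {
  type: 'status_update';
  message_id: string;
  status: MessageStatus;
}

// Trimmed view of the Message model — only what this helper needs
interface ChatMessage {
  id: string;
  status: MessageStatus;
}

// Statuses only move forward: a late-arriving 'delivered' must not
// downgrade a message that is already 'read'
const STATUS_RANK: Record<MessageStatus, number> = {
  sending: 0,
  sent: 1,
  delivered: 2,
  read: 3,
  failed: 3, // can replace 'sending', but never overrides 'read'
};

function applyStatusEvent(messages: ChatMessage[], event: StatusEvent): ChatMessage[] {
  return messages.map(m =>
    m.id === event.message_id && STATUS_RANK[event.status] > STATUS_RANK[m.status]
      ? { ...m, status: event.status }
      : m,
  );
}
```

In the app this would run inside the socket's message handler, against the React Query cache.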


Step 6 — Typing indicator

TSX
function useTypingIndicator(socket: WebSocket | null, conversationId: string) {
  const typingTimer = useRef<ReturnType<typeof setTimeout>>();
  const isTyping = useRef(false);

  const onInput = () => {
    if (!isTyping.current) {
      isTyping.current = true;
      socket?.send(JSON.stringify({ type: 'typing_start', conversation_id: conversationId }));
    }
    // Reset the stop timer on every keystroke
    clearTimeout(typingTimer.current);
    typingTimer.current = setTimeout(() => {
      isTyping.current = false;
      socket?.send(JSON.stringify({ type: 'typing_stop', conversation_id: conversationId }));
    }, 1500); // stop event fires 1.5s after last keystroke
  };

  return { onInput };
}

Never send a typing event on every keystroke. Send typing_start once when typing begins, and typing_stop 1.5s after the last keystroke. This reduces WebSocket traffic by ~95%.


Step 7 — Read receipts

TSX
// Mark messages as read when they enter the viewport
function MessageBubble({ message }: { message: Message }) {
  const ref = useRef<HTMLDivElement>(null);
  const { markRead } = useReadReceipts();

  useEffect(() => {
    if (message.read || message.sender_id === currentUser.id) return;
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          markRead(message.id);
          observer.disconnect();
        }
      },
      { threshold: 0.5 } // message must be 50% visible to count as read
    );
    if (ref.current) observer.observe(ref.current);
    return () => observer.disconnect();
  }, [message.id]);

  return <div ref={ref}>...</div>;
}
Senior Perspective

Don't send a read receipt API call for every individual message as it scrolls into view — batch them. Collect message IDs for 500ms, then send one request: PATCH /api/read-receipts { message_ids: [...] }. This reduces API calls from O(messages visible) to O(scroll events) — a 10–20x reduction.
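A minimal sketch of that 500ms collector — the factory name and callback signature are assumptions; in the app it would back the `markRead` used by MessageBubble:

```typescript
// Collects message ids for flushDelay ms, then fires one callback with the batch
function createReadReceiptBatcher(
  send: (messageIds: string[]) => void,
  flushDelay = 500,
) {
  let pending = new Set<string>(); // Set collapses duplicate ids for free
  let timer: ReturnType<typeof setTimeout> | undefined;

  return {
    markRead(messageId: string) {
      pending.add(messageId);
      if (timer) return; // a flush is already scheduled for this window
      timer = setTimeout(() => {
        send([...pending]); // one request for the whole window
        pending = new Set();
        timer = undefined;
      }, flushDelay);
    },
  };
}

// Usage — the component's markRead() feeds the batcher instead of fetch():
// const batcher = createReadReceiptBatcher(ids =>
//   fetch('/api/read-receipts', { method: 'PATCH', body: JSON.stringify({ message_ids: ids }) }));
```

The 500ms window adds worst-case latency to the receipt, which the recipient never notices, in exchange for an order-of-magnitude fewer requests while scrolling.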


Step 8 — Connection status UI

Always show the user whether they're connected. Silent failures in chat are unacceptable.

TSX
function ConnectionBanner({ connected }: { connected: boolean }) {
  if (connected) return null;
  return (
    <div
      role="alert"
      aria-live="assertive"
      className="bg-amber-500/10 border-b border-amber-500/30 text-amber-400 text-xs text-center py-1"
    >
      Reconnecting… Messages will be sent when connection is restored.
    </div>
  );
}

While disconnected, messages added to the optimistic cache are queued locally. On reconnect, flush the queue.
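A minimal in-memory sketch of that queue — the names are hypothetical, and the real version would hook into the useChatSocket lifecycle (`setOnline(true)` in `onopen`, `setOnline(false)` in `onclose`):

```typescript
// Parks outgoing payloads while the socket is down; drains them on reconnect
function createSendQueue(transmit: (payload: string) => void) {
  const pending: string[] = [];
  let online = false;

  return {
    setOnline(value: boolean) {
      online = value;
      if (online) {
        // Flush in FIFO order so messages keep the order they were composed in
        while (pending.length) transmit(pending.shift()!);
      }
    },
    send(payload: string) {
      if (online) transmit(payload);
      else pending.push(payload); // parked until reconnect
    },
  };
}
```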


Step 9 — Performance considerations

| Problem | Solution |
| --- | --- |
| 1000+ messages in DOM | Virtual scrolling with @tanstack/react-virtual |
| Image attachments blocking render | Lazy load, aspect-ratio placeholder |
| Typing events flooding the socket | Debounce — send start/stop, not per-keystroke |
| Read receipts — one request per message | Batch with 500ms collector |
| Messages rendering on every socket event | React.memo on MessageBubble, stable keys |
| Reconnect thundering herd | Exponential backoff with jitter |
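The first row — virtualization — boils down to windowing arithmetic: render only the rows intersecting the viewport plus an overscan margin, inside a spacer tall enough to keep the scrollbar honest. A sketch with a fixed row height (a simplifying assumption; libraries like @tanstack/react-virtual also handle variable heights via measurement):

```typescript
// Which rows [first..last] should be in the DOM for the current scroll position?
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  rowCount: number,
  overscan = 5, // extra rows above/below so fast scrolls don't flash blank
) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    rowCount - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan,
  );
  // Render rows [first..last] absolutely positioned inside a
  // rowCount * rowHeight px container
  return { first, last };
}
```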

Step 10 — Accessibility

  • <main> for the message list, <nav> for conversation list
  • New incoming messages: aria-live="polite" container so screen readers announce them
  • Typing indicator: aria-live="polite" so screen readers say "Alice is typing"
  • Message input: aria-label="Message", role="textbox", aria-multiline="true"
  • Images: meaningful alt text or alt="Attachment sent by Alice"
  • Focus management: after sending a message, keep focus on the input

Step 11 — Data model

TS
interface Message {
  id: string;
  conversation_id: string;
  sender_id: string;
  content: string;
  type: 'text' | 'image' | 'file' | 'system';
  status: 'sending' | 'sent' | 'delivered' | 'read' | 'failed';
  sequence_number: number; // monotonically increasing per conversation
  created_at: string;
  edited_at?: string;
  reply_to_id?: string; // thread support
  reactions?: Record<string, string[]>; // emoji → [userId, ...]
  attachments?: Attachment[];
}

interface Conversation {
  id: string;
  participants: Participant[];
  last_message?: Message;
  unread_count: number;
  is_group: boolean;
  name?: string; // group name
}

interface Attachment {
  id: string;
  url: string;
  type: 'image' | 'video' | 'file';
  filename: string;
  size_bytes: number;
  width?: number;
  height?: number;
}

The sequence_number is the key to ordering — don't rely on created_at alone since clocks can drift between clients and servers.


Step 12 — Message ordering and deduplication

In distributed systems, messages can arrive out of order over WebSocket. Use sequence numbers to detect and handle gaps.

TSX
function useMessageOrdering(conversationId: string) {
  const [messages, setMessages] = useState<Message[]>([]);
  // In practice, seed this from the sequence number of the newest message
  // loaded via history, not from 0
  const expectedSeq = useRef(0);
  const buffer = useRef<Message[]>([]);

  const addMessage = (msg: Message) => {
    if (msg.sequence_number === expectedSeq.current) {
      setMessages(prev => [...prev, msg]);
      expectedSeq.current++;
      // Drain the buffer — any buffered messages that now fit in sequence
      while (buffer.current[0]?.sequence_number === expectedSeq.current) {
        const next = buffer.current.shift()!;
        setMessages(prev => [...prev, next]);
        expectedSeq.current++;
      }
    } else if (msg.sequence_number > expectedSeq.current) {
      // Out of order — buffer it and request the missing range
      buffer.current.push(msg);
      buffer.current.sort((a, b) => a.sequence_number - b.sequence_number);
      requestMissingMessages(conversationId, expectedSeq.current, msg.sequence_number);
    }
    // If sequence_number < expected, it's a duplicate — discard
  };

  return { messages, addMessage };
}

Step 13 — Message reactions

TSX
function useReactions(conversationId: string, messageId: string) {
  const queryClient = useQueryClient();

  const toggleReaction = useMutation({
    mutationFn: (emoji: string) =>
      fetch(`/api/messages/${messageId}/reactions`, {
        method: 'POST',
        body: JSON.stringify({ emoji }),
      }),

    onMutate: async (emoji) => {
      // Optimistic update — add or remove the current user's reaction
      queryClient.setQueryData(['messages', conversationId], (old: Message[]) =>
        old.map(m => {
          if (m.id !== messageId) return m;
          const existing = m.reactions?.[emoji] ?? [];
          const hasReacted = existing.includes(currentUser.id);
          return {
            ...m,
            reactions: {
              ...m.reactions,
              [emoji]: hasReacted
                ? existing.filter(id => id !== currentUser.id)
                : [...existing, currentUser.id],
            },
          };
        })
      );
    },
  });

  return { toggleReaction };
}

Step 14 — Threaded replies

TSX
function MessageThread({ parentId }: { parentId: string }) {
  const [open, setOpen] = useState(false);

  const { data: replies } = useQuery({
    queryKey: ['thread', parentId],
    queryFn: () => fetch(`/api/messages/${parentId}/replies`).then(r => r.json()),
    // Don't load the thread until the user explicitly opens it
    enabled: open,
  });

  return (
    <div>
      <button onClick={() => setOpen(o => !o)}>
        {replies?.length ?? 0} replies
      </button>
      {open && (
        <div className="ms-8 border-s-2 border-slate-700 ps-4 mt-2 space-y-2">
          {replies?.map((r: Message) => <MessageBubble key={r.id} message={r} />)}
          <ReplyComposer parentId={parentId} />
        </div>
      )}
    </div>
  );
}

Step 15 — File and image sharing

TSX
async function sendFile(file: File, conversationId: string) {
  // 1. Get a pre-signed upload URL from the server
  const { upload_url, file_id } = await fetch('/api/attachments/presign', {
    method: 'POST',
    body: JSON.stringify({ filename: file.name, content_type: file.type, size: file.size }),
  }).then(r => r.json());

  // 2. Upload directly to S3/GCS — server never touches the bytes
  await fetch(upload_url, { method: 'PUT', body: file, headers: { 'Content-Type': file.type } });

  // 3. Send the message referencing the file_id
  await sendMessage(conversationId, { type: 'file', attachment_id: file_id });
}
File Upload Flow in Chat

1. User selects a file
  • sendFile(file, conversationId)
  • POST /api/attachments/presign
  • Server generates a pre-signed S3 URL (expires in 5 min)
  • PUT [pre-signed S3 URL] — the browser uploads directly to storage
  • Upload bytes never pass through the app server — no server bandwidth cost
  • POST /api/messages { attachment_id }
  • WebSocket broadcast to conversation participants
2. Receivers
  • Receive the message with a CDN URL — lazy-load the image/file

Step 16 — Push notifications (Web Push API)

When the user is not on the page, deliver messages via browser push notifications.

TSX
async function subscribeToPush() {
  const registration = await navigator.serviceWorker.ready;
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true,
    // urlBase64ToUint8Array is the standard helper that decodes the
    // base64url-encoded VAPID public key into a Uint8Array
    applicationServerKey: urlBase64ToUint8Array(process.env.NEXT_PUBLIC_VAPID_KEY!),
  });
  // Send subscription to server — server will use it to push messages
  await fetch('/api/push/subscribe', {
    method: 'POST',
    body: JSON.stringify(subscription),
  });
}

// In the service worker — handle incoming push
self.addEventListener('push', event => {
  const data = event.data?.json();
  event.waitUntil(
    self.registration.showNotification(data.sender_name, {
      body: data.message_preview,
      icon: data.sender_avatar,
      data: { conversation_id: data.conversation_id },
    })
  );
});

// Clicking the notification opens the correct conversation
self.addEventListener('notificationclick', event => {
  event.notification.close();
  event.waitUntil(clients.openWindow(`/chat/${event.notification.data.conversation_id}`));
});

Step 17 — Offline outbox

Messages composed while offline should not be lost.

TSX
const OUTBOX_KEY = 'chat-outbox';

function useOfflineOutbox() {
  const sendWithOutbox = async (message: Partial<Message>) => {
    if (!navigator.onLine) {
      const queue = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
      queue.push({ ...message, id: crypto.randomUUID(), status: 'sending' });
      localStorage.setItem(OUTBOX_KEY, JSON.stringify(queue));
      // Show optimistically in UI with 'pending' status
      return;
    }
    await sendMessageToServer(message);
  };

  useEffect(() => {
    const flush = async () => {
      const queue = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
      if (queue.length === 0) return;
      for (const msg of queue) {
        await sendMessageToServer(msg);
      }
      localStorage.removeItem(OUTBOX_KEY);
    };
    flush(); // also flush on mount — messages may have been queued in a previous session
    window.addEventListener('online', flush);
    return () => window.removeEventListener('online', flush);
  }, []);

  return { sendWithOutbox };
}

Step 18 — i18n and message timestamps

TSX
function MessageTimestamp({ createdAt }: { createdAt: string }) {
  const date = new Date(createdAt);
  const now = new Date();
  const diffMs = now.getTime() - date.getTime();

  // Today: show time only
  if (diffMs < 86400000 && date.getDate() === now.getDate()) {
    return <time dateTime={createdAt}>
      {new Intl.DateTimeFormat(userLocale, { hour: 'numeric', minute: '2-digit' }).format(date)}
    </time>;
  }

  // This week: show day name + time
  if (diffMs < 7 * 86400000) {
    return <time dateTime={createdAt}>
      {new Intl.DateTimeFormat(userLocale, { weekday: 'short', hour: 'numeric', minute: '2-digit' }).format(date)}
    </time>;
  }

  // Older: show full date
  return <time dateTime={createdAt}>
    {new Intl.DateTimeFormat(userLocale, { dateStyle: 'medium', timeStyle: 'short' }).format(date)}
  </time>;
}

Always use <time dateTime={isoString}> for semantic HTML — screen readers and search engines understand it.


Step 19 — Telemetry

| Metric | Target | Alert when |
| --- | --- | --- |
| Message delivery P95 | < 500ms | > 1s (WebSocket latency) |
| WebSocket reconnect rate | < 1% | > 5% (network instability) |
| Message send failure rate | < 0.1% | > 1% (server or queue issue) |
| Push notification delivery | > 90% | < 70% (VAPID key or service worker issue) |
| Unread badge accuracy | 100% | Any discrepancy (sequence number bug) |
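Client-side, the delivery metric can be collected by timestamping each send and resolving it on the server's ack over the socket. A sketch under that assumption — the tracker and the percentile helper are illustrative names, not a library API:

```typescript
// Nearest-rank percentile over a sample array (p in 0..100)
function percentile(samples: number[], p: number): number {
  if (samples.length === 0) return 0;
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.min(sorted.length - 1, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[idx];
}

// Pairs onSend/onAck per message id and records the round-trip latency
function createLatencyTracker() {
  const inFlight = new Map<string, number>(); // message id → send timestamp
  const samples: number[] = [];

  return {
    onSend(messageId: string) {
      inFlight.set(messageId, Date.now());
    },
    onAck(messageId: string) {
      const started = inFlight.get(messageId);
      if (started !== undefined) {
        samples.push(Date.now() - started);
        inFlight.delete(messageId);
      }
    },
    p95: () => percentile(samples, 95),
  };
}
```

Periodically flushing `p95()` to an analytics endpoint is what feeds the "Message delivery P95" row above.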
Say This in Your Interview

"The core of a real-time chat UI is WebSocket connection management — connect on mount, reconnect with exponential backoff, re-sync missed messages using sequence numbers on reconnect. For message ordering I'd use monotonic sequence numbers per conversation so out-of-order delivery is handled gracefully. Optimistic updates make sends feel instant. Typing indicators are debounced — send start/stop events only. Read receipts are batched. For file sharing I'd use pre-signed S3 URLs so the browser uploads directly to the CDN without going through the server. Push notifications keep users engaged when the tab is closed."
