Design a News Feed UI
Interview answer templates and key talking points.
Step 1 — Clarify the scope
The first thing you do in a frontend system design interview is ask questions. Never jump straight to components.
Think of a news feed like a stack of printed newspapers being loaded onto a truck. You don't load all of them at once — you load the first few, drive to the next stop, pick up more. The truck (your browser) only carries what it needs right now. Carrying the full warehouse (rendering 1000 posts at once) makes the truck too slow to move.
Step 2 — High-level component architecture
Start with a box diagram of the UI before talking about any implementation detail.
```
App
└─ FeedPage
   ├─ FeedHeader         (sticky top bar: logo, notifications, avatar)
   ├─ NewPostsBanner     ("5 new posts" pill, appears on new data)
   └─ FeedList           (the virtualized scroll container)
      ├─ FeedItem (×N)   (individual post card)
      ├─ FeedSkeleton    (shown while loading next page)
      └─ ScrollSentinel  (invisible div at the bottom; triggers load)
```
Why this structure?
- `FeedList` owns the data and scroll state — children just receive props
- `FeedItem` is a pure presentational component — easy to test and reuse
- `ScrollSentinel` decouples scroll detection from rendering logic
Step 3 — Data fetching strategy
Cursor-based pagination (not offset)
Never use ?page=2&limit=20 offset pagination for a live feed. Here is why:
Timeline:
1. User opens feed → posts [1, 2, 3, 4, 5, …20] loaded (page 1)
2. Post #3 gets deleted
3. User scrolls → loads page 2 → posts [21, …40]
4. But now the offset shifts — post #21 becomes post #20
5. The user sees a duplicate or misses a post
Use a cursor instead:
```
GET /api/feed?cursor=eyJpZCI6MTIzfQ==&limit=20

Response: { items: [...], next_cursor: "eyJpZCI6MTAzfQ==" }
```
The cursor encodes the last-seen post ID (usually base64). The server returns posts strictly before that ID. No gaps, no duplicates, even as new posts arrive.
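A minimal sketch of how such a cursor could be built — the `encodeCursor`/`decodeCursor` helpers are illustrative, not part of any library, but the encoding round-trips to the exact cursor string shown in the request above:

```typescript
// Illustrative cursor helpers: a cursor is just an opaque base64-encoded
// JSON blob carrying the last-seen post ID. The client never inspects it.
interface FeedCursor {
  id: number; // last post ID the client has seen
}

function encodeCursor(cursor: FeedCursor): string {
  return Buffer.from(JSON.stringify(cursor)).toString('base64');
}

function decodeCursor(raw: string): FeedCursor {
  return JSON.parse(Buffer.from(raw, 'base64').toString('utf8'));
}

// encodeCursor({ id: 123 }) → "eyJpZCI6MTIzfQ=="
// The server decodes this and queries e.g.
// WHERE id < 123 ORDER BY id DESC LIMIT 20,
// then emits a new cursor from the last row it returned.
```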
React Query implementation
```tsx
import { useInfiniteQuery } from '@tanstack/react-query';

function useFeed() {
  return useInfiniteQuery({
    queryKey: ['feed'],
    queryFn: ({ pageParam }) =>
      fetch(`/api/feed?cursor=${pageParam ?? ''}&limit=20`).then(r => r.json()),
    getNextPageParam: (lastPage) => lastPage.next_cursor ?? undefined,
    staleTime: 30_000, // treat data as fresh for 30s — don't refetch on tab focus
  });
}
```
useInfiniteQuery accumulates all pages in data.pages[]. You flatten them for rendering:
```tsx
const allPosts = data?.pages.flatMap(page => page.items) ?? [];
```
Scroll sentinel with IntersectionObserver
```tsx
function ScrollSentinel({ onIntersect }: { onIntersect: () => void }) {
  const ref = useRef<HTMLDivElement>(null);

  useEffect(() => {
    const observer = new IntersectionObserver(
      ([entry]) => { if (entry.isIntersecting) onIntersect(); },
      { rootMargin: '200px' } // trigger 200px before the user hits the bottom
    );
    if (ref.current) observer.observe(ref.current);
    return () => observer.disconnect();
  }, [onIntersect]);

  return <div ref={ref} />;
}

// Usage in FeedList:
<ScrollSentinel onIntersect={() => {
  if (hasNextPage && !isFetchingNextPage) fetchNextPage();
}} />
```
rootMargin: '200px' gives you a 200px look-ahead, so posts load before the user reaches the bottom — they never see a spinner mid-scroll.
Step 4 — Virtual scrolling (performance)
Rendering 500 post cards simultaneously (easily thousands of DOM nodes) destroys scroll performance. A 60fps scroll requires each frame to complete in under ~16ms. With 500 complex post cards in the DOM, that's impossible.
Virtual scrolling (also called "windowing") keeps only the visible rows in the DOM — usually 5–10 rows above and below the viewport. As you scroll down, elements at the top are recycled and re-populated with new data. The DOM node count stays constant at ~20–30 regardless of list size. Libraries: react-window (lightweight) and @tanstack/react-virtual (headless, hooks-based; the successor to react-virtual).
When to apply virtual scrolling
| List size | Approach |
|---|---|
| < 100 items | Plain map — no virtualization needed |
| 100–500 items | Consider it if items are complex (media, interactions) |
| 500+ items | Always virtualize |
Basic react-window setup
```tsx
import { VariableSizeList } from 'react-window';

// Posts have variable heights (text-only vs image vs video)
// so we use VariableSizeList, not FixedSizeList
function VirtualFeed({ posts }: { posts: Post[] }) {
  const listRef = useRef<VariableSizeList>(null);
  const heightCache = useRef<Record<string, number>>({});

  const getHeight = (index: number) =>
    heightCache.current[posts[index].id] ?? 300; // default estimate

  const onItemRendered = (id: string, height: number) => {
    if (heightCache.current[id] !== height) {
      heightCache.current[id] = height;
      listRef.current?.resetAfterIndex(0); // tell react-window to recalculate
    }
  };

  return (
    <VariableSizeList
      ref={listRef}
      height={window.innerHeight}
      itemCount={posts.length}
      itemSize={getHeight}
      width="100%"
    >
      {({ index, style }) => (
        <div style={style}>
          <FeedItem
            post={posts[index]}
            onHeightChange={(h) => onItemRendered(posts[index].id, h)}
          />
        </div>
      )}
    </VariableSizeList>
  );
}
```
The tricky part with variable-height virtualization: you need to measure each item after it renders (use ResizeObserver inside FeedItem) and report back.
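That measurement cache can be sketched as a plain module (the `HeightCache` name and the 300px default are assumptions, mirroring the estimate used above):

```typescript
// A small height cache for variable-size virtualization. FeedItem would
// observe its own DOM node with a ResizeObserver and call `set` on every
// resize; the list reads heights back via `get`, falling back to an
// estimate for rows that haven't been measured yet.
class HeightCache {
  private heights = new Map<string, number>();

  constructor(private defaultEstimate = 300) {}

  get(id: string): number {
    return this.heights.get(id) ?? this.defaultEstimate;
  }

  // Returns true only when the stored height actually changed, i.e. when
  // the virtual list needs to recompute offsets (resetAfterIndex).
  set(id: string, height: number): boolean {
    if (this.heights.get(id) === height) return false;
    this.heights.set(id, height);
    return true;
  }
}
```

Inside `FeedItem`, a `ResizeObserver` on the root element would call `cache.set(post.id, entry.contentRect.height)` and trigger `resetAfterIndex` only when `set` returns `true`, avoiding needless recalculation.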
Step 5 — Real-time new posts banner
You don't auto-inject new posts into the feed — that would jump the user's scroll position. Instead, show a banner.
```tsx
function useNewPostsPolling(latestKnownId: string) {
  const [newCount, setNewCount] = useState(0);

  useEffect(() => {
    const interval = setInterval(async () => {
      const res = await fetch(`/api/feed/new-count?since=${latestKnownId}`);
      const { count } = await res.json();
      setNewCount(count);
    }, 15_000); // poll every 15s — cheap, no WebSocket needed

    return () => clearInterval(interval);
  }, [latestKnownId]);

  return newCount;
}

// In FeedPage:
{newCount > 0 && (
  <button
    onClick={() => { window.scrollTo({ top: 0, behavior: 'smooth' }); refetch(); }}
    className="fixed top-16 left-1/2 -translate-x-1/2 z-50 px-4 py-2 bg-teal-500 rounded-full text-white text-sm font-semibold shadow-lg"
  >
    ↑ {newCount} new posts
  </button>
)}
```
"For real-time new post detection, I'd use polling over WebSocket here because polling at 15-second intervals is cheap — we're not doing full refetch, just fetching a count. WebSocket would add connection overhead for something the user checks rarely. If the product needed sub-second real-time (like a live trading feed), I'd switch to SSE or WebSocket."
Step 6 — Optimistic updates for likes
Never wait for the server to confirm a like before updating the UI. Users expect instant feedback.
```tsx
const queryClient = useQueryClient();

const likeMutation = useMutation({
  mutationFn: (postId: string) => fetch(`/api/posts/${postId}/like`, { method: 'POST' }),

  onMutate: async (postId) => {
    await queryClient.cancelQueries({ queryKey: ['feed'] });
    const snapshot = queryClient.getQueryData(['feed']);

    // Optimistically update the cache
    queryClient.setQueryData(['feed'], (old: InfiniteData<FeedPage>) => ({
      ...old,
      pages: old.pages.map(page => ({
        ...page,
        items: page.items.map(post =>
          post.id === postId
            ? { ...post, liked: !post.liked, likes_count: post.liked ? post.likes_count - 1 : post.likes_count + 1 }
            : post
        ),
      })),
    }));

    return { snapshot }; // return snapshot for rollback
  },

  onError: (_err, _postId, context) => {
    // Roll back on failure
    queryClient.setQueryData(['feed'], context?.snapshot);
  },
});
```
The pattern: snapshot → optimistic update → on error rollback. No spinner, no delay.
Step 7 — Image and video loading
Lazy load below-the-fold media
```tsx
// loading="lazy": native browser lazy loading
// decoding="async": non-blocking decode
function PostImage({ src, alt }: { src: string; alt: string }) {
  return (
    <img
      src={src}
      alt={alt}
      loading="lazy"
      decoding="async"
      className="w-full rounded-lg object-cover"
    />
  );
}
```
Skeleton while loading
```tsx
function PostMedia({ src }: { src: string }) {
  const [loaded, setLoaded] = useState(false);
  return (
    <div className="relative aspect-video bg-slate-800 rounded-lg overflow-hidden">
      {!loaded && <div className="absolute inset-0 animate-pulse bg-slate-700" />}
      <img src={src} onLoad={() => setLoaded(true)} className={loaded ? 'opacity-100' : 'opacity-0'} />
    </div>
  );
}
```
Always reserve space for media with aspect-video or a fixed height before load — this prevents CLS (Cumulative Layout Shift), which hurts Core Web Vitals and scroll position jumps.
Step 8 — State management breakdown
For a news feed, avoid putting everything in global state. Follow this ownership model:
- Server state (posts, likes, comments): React Query — it handles caching, background refetch, deduplication
- UI state (is this post expanded? is the comment box open?): local `useState` inside `FeedItem`
- User state (logged-in user, preferences): Zustand or Context — changes rarely, needs to be globally accessible
- Transient state (hover, focus): CSS `:hover`/`:focus` — never React state
The mistake junior devs make is putting everything in a global Redux store. That means a like button toggle triggers a re-render of every component subscribed to the store.
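To make the "rarely changing, globally accessible" tier concrete, here is a hand-rolled external store in roughly the shape Zustand provides out of the box (a sketch; the `UserState` shape is an assumption):

```typescript
// Minimal external store for user state: components subscribe and re-render
// only when *this* store changes — a like toggle elsewhere never touches it.
interface UserState {
  userId: string | null;
  theme: 'light' | 'dark';
}

function createStore<T>(initial: T) {
  let state = initial;
  const listeners = new Set<(s: T) => void>();
  return {
    getState: () => state,
    setState: (partial: Partial<T>) => {
      state = { ...state, ...partial };
      listeners.forEach(l => l(state));
    },
    subscribe: (l: (s: T) => void) => {
      listeners.add(l);
      return () => listeners.delete(l); // returns an unsubscribe function
    },
  };
}

const userStore = createStore<UserState>({ userId: null, theme: 'dark' });
// In React this would be consumed via useSyncExternalStore —
// which is essentially what Zustand does internally.
```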
Step 9 — Accessibility checklist
- Feed list: a `role="feed"` container — e.g. `<section role="feed" aria-busy={isFetching}>` inside the `<main>` landmark
- Each post: an `<article>` element
- Like button: `<button aria-label="Like post" aria-pressed={post.liked}>`
- Images: meaningful `alt` text — never empty for content images
- "N new posts" banner: `role="status" aria-live="polite"` so screen readers announce it
- Keyboard: Spacebar should scroll the feed (not activate buttons) — test this
Step 10 — Data model and normalized store
Before writing any component code, define your entities. Interviewers notice when candidates jump straight to JSX without thinking about data shape.
Core entities
```ts
// Post — what the server returns
interface Post {
  id: string;
  author_id: string;
  body: string; // raw text — may contain @mentions and #hashtags
  media?: { type: 'image' | 'video'; url: string; width: number; height: number }[];
  likes_count: number;
  comments_count: number;
  liked_by_me: boolean;
  created_at: string; // ISO 8601
}

// User — minimal shape for feed display
interface FeedUser {
  id: string;
  username: string;
  display_name: string;
  avatar_url: string;
  is_verified: boolean;
}

// Feed page — cursor-paginated response
interface FeedPage {
  items: Post[];
  next_cursor: string | null;
  user_map: Record<string, FeedUser>; // embed authors inline
}
```
Normalized store structure
A flat, normalized store prevents duplication and makes cache updates (likes, edits) cheap — you update one place instead of scanning every page.
React Query doesn't normalize automatically — you need to do it manually in select. In the queryFn, store individual posts with queryClient.setQueryData(['posts', post.id], post) for each post returned. This way if a user opens a post detail page, it's already cached. Likes update the single ['posts', postId] entry and React Query invalidates all queries that derive from it.
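A sketch of that write-through step, using a `Map` as a stand-in for the per-post query cache (with React Query you'd call `queryClient.setQueryData(['posts', post.id], post)` in the same loop; the interfaces here are trimmed versions of the entities above):

```typescript
// Write each post from a feed page into a per-post cache so detail views
// and mutations can address posts individually instead of scanning pages.
interface Post {
  id: string;
  body: string;
  likes_count: number;
}

interface FeedPage {
  items: Post[];
  next_cursor: string | null;
}

type PostCache = Map<string, Post>; // stand-in for the ['posts', id] entries

function normalizeFeedPage(page: FeedPage, cache: PostCache): void {
  for (const post of page.items) {
    cache.set(post.id, post); // ≈ queryClient.setQueryData(['posts', post.id], post)
  }
}
```

A like mutation then updates the single `['posts', postId]` entry, and any view derived from it re-renders without touching the paginated feed data.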
Step 11 — HTTP caching, deduplication, and idempotency
HTTP cache headers
The feed API response should be uncacheable (always fresh), but static assets and user avatars should be cached aggressively.
```
# Feed endpoint — no browser caching
Cache-Control: no-store

# Avatar images — cache for 1 year (CDN-served with content hash URL)
Cache-Control: public, max-age=31536000, immutable

# Post images — cache for 7 days
Cache-Control: public, max-age=604800
```
Request deduplication
React Query deduplicates identical in-flight requests automatically. If two components both call useQuery(['posts', id]), only one network request fires. The second subscriber shares the same promise.
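The idea behind that deduplication can be sketched with a map of in-flight promises (illustrative only — React Query's real implementation is more involved; `fakeFetch` is a stand-in for the network call):

```typescript
// Deduplicate concurrent requests by key: a second caller for the same key
// gets the same in-flight promise instead of firing a second request.
const inflight = new Map<string, Promise<string>>();
let fetchCount = 0; // instrumentation for the example

async function fakeFetch(key: string): Promise<string> {
  fetchCount++;
  return `data for ${key}`;
}

function dedupedFetch(key: string): Promise<string> {
  const existing = inflight.get(key);
  if (existing) return existing; // share the in-flight promise

  const p = fakeFetch(key).finally(() => inflight.delete(key));
  inflight.set(key, p);
  return p;
}
```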
Idempotent like/unlike
A like must be safe to retry. If the network fails mid-request, retrying should not double-like. The server should treat a like as an upsert, not an append:
```
POST /api/posts/:id/like
→ Server: INSERT INTO likes (user_id, post_id) ON CONFLICT DO NOTHING
```
From the frontend, the mutation key ensures only one in-flight like per post:
```ts
useMutation({
  mutationKey: ['like', postId], // deduplicate: only one concurrent like per post
  mutationFn: () => fetch(`/api/posts/${postId}/like`, { method: 'POST' }),
})
```
Step 12 — Rendering rich text safely
Feed posts can contain @mentions and #hashtags. Rendering them naively as innerHTML is an XSS vulnerability.
Safe mention/hashtag rendering
```tsx
function RichPostBody({ body }: { body: string }) {
  // Parse into segments — never use dangerouslySetInnerHTML
  const segments = parseRichText(body);

  return (
    <p>
      {segments.map((seg, i) => {
        if (seg.type === 'mention') return (
          <a key={i} href={`/u/${seg.value}`} className="text-teal-400 hover:underline">
            @{seg.value}
          </a>
        );
        if (seg.type === 'hashtag') return (
          <a key={i} href={`/explore/${seg.value}`} className="text-indigo-400 hover:underline">
            #{seg.value}
          </a>
        );
        return <span key={i}>{seg.value}</span>;
      })}
    </p>
  );
}

function parseRichText(text: string) {
  // Split on @mention and #hashtag patterns, preserve tokens
  const parts = text.split(/(@\w+|#\w+)/g);
  return parts.map(part => {
    if (part.startsWith('@')) return { type: 'mention' as const, value: part.slice(1) };
    if (part.startsWith('#')) return { type: 'hashtag' as const, value: part.slice(1) };
    return { type: 'text' as const, value: part };
  });
}
```
Post truncation
Long posts should be truncated with a "See more" toggle — show 3 lines by default.
```tsx
function TruncatedBody({ body }: { body: string }) {
  const [expanded, setExpanded] = useState(false);
  const isLong = body.length > 280;

  return (
    <div>
      <div className={!expanded && isLong ? 'line-clamp-3' : ''}>
        <RichPostBody body={body} />
      </div>
      {isLong && (
        <button onClick={() => setExpanded(e => !e)} className="text-teal-400 text-sm mt-1">
          {expanded ? 'See less' : 'See more'}
        </button>
      )}
    </div>
  );
}
```
"I never use dangerouslySetInnerHTML for user-generated content. Instead I parse the post body into typed segments (text, mention, hashtag) and render each as a React element. This gives full XSS protection and lets me attach click handlers or prefetch user profiles on mention hover."
Step 13 — Scroll position and stale feeds
Preserving scroll position on remounting
When a user navigates to a post detail page and comes back, the feed should restore scroll position — not start from the top.
```tsx
// Store scroll position in sessionStorage before navigating away
function FeedList() {
  const scrollRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    const saved = sessionStorage.getItem('feed-scroll');
    if (saved && scrollRef.current) {
      scrollRef.current.scrollTop = parseInt(saved, 10);
    }
    return () => {
      if (scrollRef.current) {
        sessionStorage.setItem('feed-scroll', String(scrollRef.current.scrollTop));
      }
    };
  }, []);

  return <div ref={scrollRef} className="overflow-y-auto h-screen">...</div>;
}
```
With React Query's staleTime, the data is still cached when the user comes back — no refetch, instant render, scroll restored.
Stale feed detection
If the user leaves the tab for 10+ minutes and comes back, silently refetch in the background and show the "N new posts" banner — don't yank the feed out from under them.
```ts
// In React Query config — refetch on window focus only if data is older than 10 min
useInfiniteQuery({
  queryKey: ['feed'],
  staleTime: 10 * 60 * 1000,  // 10 minutes
  refetchOnWindowFocus: true, // triggers when tab becomes active
  refetchOnMount: false,      // don't refetch if data is still fresh
})
```
Step 14 — Offline support
Reading offline (Service Worker cache)
Use a service worker to serve cached feed data when the network is unavailable:
```js
// sw.js — network-first strategy for feed API
self.addEventListener('fetch', event => {
  if (event.request.url.includes('/api/feed')) {
    event.respondWith(
      fetch(event.request)
        .then(response => {
          const clone = response.clone();
          caches.open('feed-cache-v1').then(cache => cache.put(event.request, clone));
          return response;
        })
        .catch(() => caches.match(event.request)) // serve stale on network failure
    );
  }
});
```
Show an offline banner so users know the feed is stale:
```ts
function useOnlineStatus() {
  const [online, setOnline] = useState(navigator.onLine);
  useEffect(() => {
    const on = () => setOnline(true);
    const off = () => setOnline(false);
    window.addEventListener('online', on);
    window.addEventListener('offline', off);
    return () => {
      window.removeEventListener('online', on);
      window.removeEventListener('offline', off);
    };
  }, []);
  return online;
}
```
Writing offline — the outbox pattern
If a user likes a post while offline, don't silently drop the action.
User taps Like (offline):
1. Optimistic UI update (immediate)
2. `OutboxQueue` persists the action — `{ action: 'like', postId, timestamp }` — to IndexedDB (the sketch below uses localStorage for simplicity)

Network restored (`online` event):
1. `OutboxQueue.flush()` dequeues the oldest action
2. `POST /api/posts/:id/like`
3. On success: remove the action from storage

On failure:
- Retry with exponential backoff (up to 3 attempts)
```ts
const OUTBOX_KEY = 'feed-outbox';

function enqueueAction(action: { type: string; postId: string }) {
  const queue = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
  queue.push({ ...action, id: crypto.randomUUID(), createdAt: Date.now() });
  localStorage.setItem(OUTBOX_KEY, JSON.stringify(queue));
}

async function flushOutbox() {
  const queue = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
  let remaining = [...queue];
  for (const item of queue) {
    try {
      await fetch(`/api/posts/${item.postId}/like`, { method: 'POST' });
      // Remove from queue on success — filter the *remaining* list, not the
      // original snapshot, so earlier successes aren't re-added
      remaining = remaining.filter((q: { id: string }) => q.id !== item.id);
      localStorage.setItem(OUTBOX_KEY, JSON.stringify(remaining));
    } catch {
      break; // stop on failure — retry on next flush
    }
  }
}

// Flush when network comes back
window.addEventListener('online', flushOutbox);
```
Step 15 — Retry strategy
Not every failure is worth retrying. Classify errors before deciding.
```ts
useMutation({
  mutationFn: submitPost,
  retry: (failureCount, error) => {
    // Don't retry client errors (400, 401, 403, 422)
    if (error.status >= 400 && error.status < 500) return false;
    // Retry server errors up to 3 times
    return failureCount < 3;
  },
  retryDelay: (attempt) => Math.min(1000 * 2 ** attempt, 30_000), // exponential backoff, max 30s
})
```
| Status | Retry? | Reason |
|---|---|---|
| 429 Too Many Requests | Yes, after delay | Rate limited, back off |
| 500 Server Error | Yes, up to 3x | Transient failure |
| 503 Unavailable | Yes | Server overloaded |
| 400 Bad Request | No | Fix the payload, not retryable |
| 401 Unauthorized | No | Refresh token first, then resubmit |
| 422 Validation | No | User must correct input |
Show a user-facing error state with a retry button for persistent failures — never fail silently.
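The classification logic can be extracted as a pure helper (the function names are mine; note that, per the table above, 429 is retried even though it is a 4xx, which the snippet earlier doesn't special-case):

```typescript
// Decide whether a failed request is worth retrying, based on HTTP status.
// 4xx means the request itself is wrong (except 429, which asks us to back
// off and try again); 5xx is transient and worth a bounded number of retries.
function shouldRetry(status: number, failureCount: number, maxRetries = 3): boolean {
  if (status === 429) return failureCount < maxRetries; // rate limited: back off, retry
  if (status >= 400 && status < 500) return false;      // client error: fix, don't retry
  if (status >= 500) return failureCount < maxRetries;  // server error: transient
  return false;
}

function retryDelayMs(attempt: number): number {
  return Math.min(1000 * 2 ** attempt, 30_000); // exponential backoff, capped at 30s
}
```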
Step 16 — Comments and live updates
Comment thread architecture
```tsx
function CommentSection({ postId, expanded }: { postId: string; expanded: boolean }) {
  const { data: comments } = useQuery({
    queryKey: ['comments', postId],
    queryFn: () => fetch(`/api/posts/${postId}/comments`).then(r => r.json()),
    enabled: expanded, // lazy: only fetch once the user opens the section
    staleTime: 10_000, // comments can be slightly stale
  });

  return (
    <div>
      {comments?.map(c => <CommentRow key={c.id} comment={c} />)}
      <CommentComposer postId={postId} />
    </div>
  );
}
```
Load comments lazily — only fetch when the user expands the comment section. This avoids an extra network request per visible post on initial page load.
Live comment updates via SSE
For real-time comments, Server-Sent Events (SSE) are better than WebSocket — they're unidirectional (server → client), work over plain HTTP/2, and don't need a persistent bidirectional connection.
```tsx
function useLiveComments(postId: string) {
  const queryClient = useQueryClient();

  useEffect(() => {
    const es = new EventSource(`/api/posts/${postId}/comments/stream`);

    es.addEventListener('new_comment', (event) => {
      const newComment = JSON.parse(event.data);
      queryClient.setQueryData(['comments', postId], (old: Comment[] = []) => [
        ...old,
        newComment,
      ]);
    });

    return () => es.close();
  }, [postId, queryClient]);
}
```
User opens comment section:
- `useLiveComments(postId)` runs
- `new EventSource('/api/posts/:id/comments/stream')`
- long-lived HTTP/2 connection established

Another user posts a comment:
- Server pushes an SSE event: `{ type: 'new_comment', data: {...} }`
- Client receives it — `queryClient.setQueryData` update
- React re-renders the comment list (no refetch needed)

User closes the comment section:
- `es.close()` — connection terminated
"For live comments I'd use SSE over WebSocket. SSE is simpler — it works over a standard HTTP connection, auto-reconnects, and is supported in all modern browsers. WebSocket makes sense if we needed bidirectional streaming, like a cursor-collaboration feature. For comment counts on the feed itself, polling every 30s is sufficient — no need for a persistent connection just for a number."
Step 17 — JavaScript loading strategy
The feed should be interactive immediately. Every kilobyte of JS delays that.
Code split by route
```tsx
import dynamic from 'next/dynamic';

// Next.js does this automatically per page — each route is a separate JS chunk
// For heavy components inside the feed, use dynamic imports:
const VideoPlayer = dynamic(() => import('./VideoPlayer'), {
  loading: () => <div className="aspect-video bg-slate-800 animate-pulse rounded-lg" />,
  ssr: false, // video player doesn't need SSR
});

// Only load video player code when a video post is visible
function PostBody({ post }: { post: Post }) {
  const media = post.media?.[0]; // media is an array in the Post interface
  if (!media) return null;
  if (media.type === 'video') return <VideoPlayer src={media.url} />;
  return <PostImage src={media.url} alt="Post media" />;
}
```
What to lazy-load in a feed
| Component | Strategy | Why |
|---|---|---|
| Video player | dynamic() on demand | Heavy library, rarely needed |
| Emoji picker (comment composer) | dynamic() on click | Only needed when user opens composer |
| Share modal | dynamic() on click | Infrequent, don't pay upfront |
| Feed item | Do NOT lazy-load | Visible immediately on paint |
| Comment section | Lazy fetch data, not code | Code is small, data is expensive |
Core Web Vitals targets
| Metric | Target | What breaks it in a feed |
|---|---|---|
| LCP (Largest Contentful Paint) | < 2.5s | First post image loading slowly |
| CLS (Cumulative Layout Shift) | < 0.1 | Images without reserved dimensions |
| INP (Interaction to Next Paint) | < 200ms | Like button handler blocking main thread |
| FID (First Input Delay, superseded by INP) | < 100ms | Heavy JS parsing on load |
```tsx
// Prevent CLS: always reserve image dimensions (media is an array — take the first entry)
const media = post.media[0];

<img
  src={media.url}
  width={media.width}
  height={media.height}
  style={{ aspectRatio: `${media.width}/${media.height}` }}
  loading="lazy"
/>
```
Step 18 — Accessibility deep dive
Feed list semantics
```tsx
<main>
  <section role="feed" aria-label="News feed" aria-busy={isFetchingNextPage}>
    {posts.map(post => (
      <article key={post.id} aria-label={`Post by ${post.author.display_name}`}>
        <PostHeader post={post} />
        <PostBody post={post} />
        <PostActions post={post} />
      </article>
    ))}
  </section>
</main>
```
Keyboard navigation
The `role="feed"` pattern comes with specific keyboard expectations:
- Page Down / Page Up — move focus to the next / previous post (the ARIA feed pattern expects focus movement between articles, not just scrolling)
- Tab — move focus between interactive elements (like buttons, links)
- Users should be able to navigate through posts without a mouse
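A sketch of the focus-movement logic behind those bindings (the `nextFeedIndex` helper is illustrative; in the component, a keydown handler would call it and then focus the matching `<article>` ref):

```typescript
// Map a key press inside a role="feed" container to the index of the post
// that should receive focus next. Returns the current index unchanged for
// keys we don't handle, so the browser keeps its default behavior.
function nextFeedIndex(key: string, current: number, count: number): number {
  switch (key) {
    case 'PageDown':
      return Math.min(current + 1, count - 1); // focus next post, clamp at end
    case 'PageUp':
      return Math.max(current - 1, 0);         // focus previous post, clamp at start
    default:
      return current;                          // not ours — don't interfere
  }
}
```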
Motion and reduced motion
If your feed has animated transitions (post appearing, like animation), respect prefers-reduced-motion:
```tsx
const shouldAnimate = !window.matchMedia('(prefers-reduced-motion: reduce)').matches;

// In Framer Motion:
<motion.div
  initial={shouldAnimate ? { opacity: 0, y: 8 } : false}
  animate={{ opacity: 1, y: 0 }}
>
```
Dynamic content announcements
When new posts load at the top (after the user clicks the "N new posts" banner), announce it to screen readers:
```tsx
<div role="status" aria-live="polite" aria-atomic="true" className="sr-only">
  {newCount > 0 ? `${newCount} new posts loaded` : ''}
</div>
```
Step 19 — Internationalization (i18n)
Timestamps
Never display raw ISO strings. Use Intl.RelativeTimeFormat for human-readable relative times, and Intl.DateTimeFormat for absolute dates:
```ts
function formatTimestamp(isoString: string, locale: string): string {
  const date = new Date(isoString);
  const secondsAgo = (Date.now() - date.getTime()) / 1000;

  if (secondsAgo < 60) return 'just now'; // should come from the translation catalog, not hardcoded English
  if (secondsAgo < 3600) {
    const mins = Math.floor(secondsAgo / 60);
    return new Intl.RelativeTimeFormat(locale, { numeric: 'auto' }).format(-mins, 'minute');
  }
  if (secondsAgo < 86400) {
    const hrs = Math.floor(secondsAgo / 3600);
    return new Intl.RelativeTimeFormat(locale, { numeric: 'auto' }).format(-hrs, 'hour');
  }
  return new Intl.DateTimeFormat(locale, { month: 'short', day: 'numeric' }).format(date);
}
```
RTL (right-to-left) layout
Arabic and Hebrew users read RTL. Use logical CSS properties that flip automatically:
```tsx
// ✅ Correct — flips for RTL
<div className="ms-4">  {/* margin-inline-start (= margin-left in LTR, margin-right in RTL) */}
<div className="ps-3">  {/* padding-inline-start */}

// ❌ Wrong — hardcoded direction
<div className="ml-4">  {/* always left, breaks RTL */}
```
Set dir="rtl" on <html> based on the user's locale — Tailwind's rtl: variant handles the rest.
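A small sketch for deriving `dir` from the locale — the RTL language list here is a hand-maintained assumption (where supported, `Intl.Locale`'s text-info API can replace it):

```typescript
// Derive the document direction from a BCP 47 locale tag.
// NOTE: hand-maintained subset of RTL languages — an assumption for the sketch.
const RTL_LANGS = new Set(['ar', 'he', 'fa', 'ur']); // Arabic, Hebrew, Persian, Urdu

function dirForLocale(locale: string): 'rtl' | 'ltr' {
  const lang = locale.toLowerCase().split('-')[0]; // "ar-EG" → "ar"
  return RTL_LANGS.has(lang) ? 'rtl' : 'ltr';
}

// At app bootstrap:
// document.documentElement.dir = dirForLocale(userLocale);
```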
Step 20 — Telemetry and observability
Shipping a feed without metrics means you're flying blind. Define what to measure.
Key metrics to instrument
```ts
// 1. Feed load time (time from page mount to first post visible)
performance.mark('feed-start');
// ... after first posts render:
performance.measure('feed-load', 'feed-start');

// 2. Scroll depth — how far users read
function trackScrollDepth(scrollPct: number) {
  if ([25, 50, 75, 90].includes(Math.floor(scrollPct))) {
    analytics.track('feed_scroll_depth', { percent: scrollPct });
  }
}

// 3. Interaction rate — likes and comments per session
analytics.track('post_liked', { post_id: postId, position_in_feed: index });

// 4. Error rate
window.addEventListener('unhandledrejection', event => {
  analytics.track('feed_error', { message: event.reason?.message });
});
```
What to alert on
| Signal | Threshold | Action |
|---|---|---|
| Feed load P95 | > 3s | Investigate API or client bundle |
| Error rate | > 1% of sessions | Page on-call |
| Like success rate | < 99% | Mutation or server issue |
| Scroll depth median | < 25% | Content quality or pagination bug |
| INP regression | > 200ms | JavaScript main-thread contention |
"I'd instrument three things from day one: feed load time (P50/P95), scroll depth distribution, and error rate. Load time tells me if the pagination strategy is working. Scroll depth tells me if the content ranking is good — if median depth drops from 50% to 20% overnight, something broke. Error rate catches silent failures in mutations like likes."
Step 21 — Final performance checklist
| Concern | Technique |
|---|---|
| Large list render | Virtual scrolling (react-window / @tanstack/virtual) |
| Images below fold | loading="lazy" + explicit width/height (prevent CLS) |
| Layout shift | Reserve media dimensions before load |
| Excessive re-renders | React.memo on FeedItem, stable callback refs |
| Too many fetches | staleTime: 30s in React Query, cursor pagination |
| New post detection | 15s polling of count endpoint (not full refetch) |
| Like latency | Optimistic update with rollback on error |
| Offline likes | Outbox pattern — persist to localStorage, flush on reconnect |
| Video/emoji JS cost | dynamic() imports — load only when needed |
| Timestamp format | Intl.RelativeTimeFormat — locale-aware, no library needed |
| RTL support | Tailwind logical properties (ms-, ps-, rtl:) |
| Screen reader UX | role="feed", <article> per post, aria-live for banners |
"The three biggest frontend challenges in a news feed are: first, performance with a large DOM — I'd solve with virtual scrolling and lazy media loading. Second, real-time without breaking scroll position — I'd use a 'N new posts' banner instead of auto-injecting. Third, data consistency — I'd use optimistic updates for interactions like likes so the UI feels instant, with rollback on error. For resilience I'd add an offline outbox so likes aren't silently dropped on flaky connections. The architecture: React Query for server state, cursor pagination, SSE for live comments, service worker for offline reads, and Intl APIs for i18n."