SEO & Growth · October 19, 2025 · 25 min read

Technical SEO: The Developers' Role

SEO is no longer just keywords and backlinks. It is Render Performance, Core Web Vitals, Structured Data, and Server Architecture. A comprehensive guide for engineers.

Google is a Robot: An Engineering Perspective on Search

For the past two decades, SEO (Search Engine Optimization) has been the domain of the Marketing Department. It was viewed as "Soft Magic"—writing blog posts, stuffing keywords, and begging for backlinks. Developers were often left out of the conversation, or worse, treated as obstacles ("Can you change the H1 tag? Can you fix the URL structure?").

In 2025, that paradigm is dead. SEO is now a Product Engineering discipline.

Why? Because Google's algorithm has evolved from a simple "Keyword Counter" to a sophisticated "User Experience Auditor." Google now judges your code quality, your render performance, your server architecture, and your accessibility compliance. Marketing cannot fix Largest Contentful Paint (LCP). Marketing cannot implement Edge Caching. Marketing cannot write JSON-LD Schema. Only Engineers can do this.

This whitepaper is a manifesto for the "Technical SEO Engineer." It bridges the gap between the robots.txt and the React Component.


Part 1: The Rendering Problem (Client-Side vs. Server-Side)

To understand Technical SEO, you must understand how Googlebot (the crawler) works. Googlebot is essentially a headless Chrome browser running in a massive distributed data center. But it has limits.

The JavaScript Trap

In the early days of SPAs (Single Page Applications) like React, Angular, and Vue, developers shipped empty HTML shells:

<html>
<body>
  <div id="root"></div>
  <script src="bundle.js"></script>
</body>
</html>

The content was fetched via API after the page loaded. When Googlebot arrived, it saw an empty <div>. It indexed nothing. While Googlebot is now "Evergreen" (it can execute JS), it does so in a second wave called the Render Queue.

  1. Crawl: Googlebot grabs HTML. (Fast, Instant Indexing).
  2. Queue: If HTML is empty, page goes to Render Queue.
  3. Render: Days or weeks later, as resources permit, Googlebot executes the JS to see the content.

This delay can be fatal for a business. If you launch a product on Monday and Google doesn't render it until Friday, you missed the news cycle.

The Solution: SSR and SSG (Next.js)

We solve this with Server-Side Rendering (SSR) or Static Site Generation (SSG). Frameworks like Next.js execute the React code on the server or at build time and send fully populated HTML to the client.

  • View Source: Googlebot sees <h1>My Product</h1> instantly.
  • Result: Instant Indexing. Better rankings.

Rule #1 of Tech SEO: If it's not in the initial HTML response body, assume it doesn't exist for the crawler.
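To make Rule #1 concrete, here is a minimal sketch of the server-rendering idea (not Next.js itself): the server builds the full HTML before responding, so the crawler's very first response already contains the content. The `Product` shape and `renderProductPage` helper are hypothetical.

```typescript
interface Product {
  name: string;
  description: string;
}

// Build the complete HTML on the server; no client-side fetch is required
// for the content to exist in the initial response body.
function renderProductPage(product: Product): string {
  return [
    "<!doctype html>",
    "<html>",
    "<body>",
    `  <h1>${product.name}</h1>`,
    `  <p>${product.description}</p>`,
    '  <script src="bundle.js"></script>', // JS still hydrates, but content is already there
    "</body>",
    "</html>",
  ].join("\n");
}

const html = renderProductPage({
  name: "My Product",
  description: "Ships fully rendered HTML to every crawler.",
});
console.log(html.includes("<h1>My Product</h1>")); // true
```

Frameworks like Next.js do exactly this for you, at request time (SSR) or at build time (SSG).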


Part 2: Core Web Vitals (The Performance Ranking Factor)

In 2021, Google introduced the "Page Experience Update." Speed became a direct Ranking Factor. They introduced three metrics called Core Web Vitals (CWV).

1. Largest Contentful Paint (LCP) - Loading Performance

  • What: How long until the biggest thing on screen (Hero Image, Headline) is visible?
  • Target: < 2.5 seconds.
  • Engineering Fixes:
    • Image Optimization: Use WebP/AVIF formats. Provide srcset for responsive sizing.
    • Preloading: <link rel="preload" as="image" href="hero.jpg">.
    • Server Speed: Optimize database queries. Use edge caching (Vercel/Cloudflare).
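The first two fixes can be combined: serve the hero in a modern format with a `srcset`, and preload the largest candidate. A sketch, where the `buildHeroImage` helper, the query-string resizing API, and the width list are all assumptions standing in for your image pipeline:

```typescript
// Generate a responsive <img> tag plus a matching preload hint for the LCP hero.
function buildHeroImage(src: string, widths: number[], alt: string): string {
  // srcset lets the browser pick the smallest file that fills the viewport.
  const srcset = widths
    .map((w) => `${src}?w=${w}&format=avif ${w}w`)
    .join(", ");
  const largest = widths[widths.length - 1];
  const preload = `<link rel="preload" as="image" href="${src}?w=${largest}&format=avif">`;
  // Explicit width/height also reserves layout space (helps CLS, see below).
  const img = `<img src="${src}" srcset="${srcset}" sizes="100vw" alt="${alt}" width="1600" height="900">`;
  return `${preload}\n${img}`;
}

console.log(buildHeroImage("/hero.jpg", [640, 1024, 1600], "Product hero"));
```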

2. Cumulative Layout Shift (CLS) - Visual Stability

  • What: Do elements jump around while loading? (e.g., An ad loads and pushes the text down).
  • Target: < 0.1 score.
  • Engineering Fixes:
    • Aspect Ratios: Always define width and height attributes on <img> tags so the browser reserves space.
    • Font Loading: Use font-display: swap to prevent FOUT (Flash of Unstyled Text) or FOIT (Flash of Invisible Text).

3. Interaction to Next Paint (INP) - Responsiveness

  • What: When a user clicks a button, how long until the browser paints the next frame? (INP replaced FID in March 2024.)
  • Target: < 200 milliseconds.
  • Engineering Fixes:
    • Hydration: Minimize the JavaScript bundle size. Break up "Long Tasks" on the Main Thread.
    • React: Use useTransition or Suspense to prioritize UI updates.
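Outside React, the core INP technique is the same: break one long task into chunks and yield to the event loop between them so input events can be handled. A framework-agnostic sketch; the ~5 ms budget and `processItems` helper are assumptions:

```typescript
// Hand control back to the event loop so the browser can process input and paint.
function yieldToMain(): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Process a large list without ever blocking the main thread for long.
async function processItems<T>(
  items: T[],
  work: (item: T) => void,
): Promise<number> {
  let processed = 0;
  let sliceStart = Date.now();
  for (const item of items) {
    work(item);
    processed++;
    // If this chunk has run longer than ~5 ms, yield before continuing.
    if (Date.now() - sliceStart > 5) {
      await yieldToMain();
      sliceStart = Date.now();
    }
  }
  return processed;
}

processItems([1, 2, 3], (n) => n * 2).then((count) => console.log(count)); // 3
```

Newer browsers offer `scheduler.yield()` for the same purpose, but the `setTimeout(0)` fallback works everywhere.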

The Business Impact: Google Search Console gives you a "Pass" or "Fail" grade per metric. Failing won't delist you, but against equally relevant competitors it puts you at a real ranking disadvantage. No amount of keywords can save a slow site.


Part 3: Crawl Budget and Architecture

Google does not have infinite resources. It assigns a Crawl Budget to every domain: roughly, "I will crawl 1,000 pages from site X today." If you waste this budget, Google leaves before reaching your profitable pages.

Budget Killers

  1. Duplicate Content: example.com/product vs example.com/product?ref=twitter.
    • Google sees 2 pages. Wastes budget.
    • Fix: Canonical Tags. <link rel="canonical" href="https://example.com/product" />. This tells Google: "Ignore the parameters, this is the master copy."
  2. Infinite Spaces: Calendar pages (Next Month, Next Month...), Filter combinations (Red + Blue + Size 10...).
    • Fix: robots.txt. Disallow infinite paths. Use <meta name="robots" content="noindex"> on low-value facet pages.
  3. Soft 404s: Pages that say "Product Not Found" but return a 200 OK status code.
    • Fix: Ensure your server returns accurate HTTP status codes (404 Not Found, 410 Gone, 301 Moved Permanently).
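The duplicate-content fix can be automated: map every URL variant to one canonical form by stripping tracking parameters, and emit that as the canonical tag. A sketch; the parameter blocklist is an assumption you would tune per site:

```typescript
// Tracking parameters that create duplicate URLs without changing content.
const TRACKING_PARAMS = new Set([
  "ref",
  "utm_source",
  "utm_medium",
  "utm_campaign",
  "fbclid",
  "gclid",
]);

// Returns the canonical URL for <link rel="canonical" href="...">.
function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  // Copy keys first so we can delete while iterating safely.
  for (const key of [...url.searchParams.keys()]) {
    if (TRACKING_PARAMS.has(key)) url.searchParams.delete(key);
  }
  const query = url.searchParams.toString();
  return url.origin + url.pathname + (query ? `?${query}` : "");
}

console.log(canonicalUrl("https://example.com/product?ref=twitter"));
// → https://example.com/product
```

Meaningful parameters (pagination, search terms) survive; pure tracking noise collapses into the master copy.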

Sitemaps

The Sitemap (sitemap.xml) is the map you give to the Robot. It should be Dynamic and Automated.

  • Don't edit it manually.
  • Your build pipeline should generate it based on your database contents.
  • Exclude non-canonical pages.
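The generation step is a small pure function over your database rows. A sketch; the `PageRow` shape and the `isCanonical` flag are assumptions standing in for your real schema:

```typescript
interface PageRow {
  path: string;
  updatedAt: string; // ISO date
  isCanonical: boolean;
}

// Build sitemap.xml at build time from database contents.
function buildSitemap(origin: string, rows: PageRow[]): string {
  const urls = rows
    .filter((row) => row.isCanonical) // never list non-canonical pages
    .map(
      (row) =>
        `  <url><loc>${origin}${row.path}</loc><lastmod>${row.updatedAt}</lastmod></url>`,
    )
    .join("\n");
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    urls,
    "</urlset>",
  ].join("\n");
}

console.log(
  buildSitemap("https://example.com", [
    { path: "/product", updatedAt: "2025-10-19", isCanonical: true },
    { path: "/old-product", updatedAt: "2024-01-01", isCanonical: false },
  ]),
);
```

In Next.js projects, `next-sitemap` or the built-in `sitemap.ts` route convention does this same job.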

Part 4: Structured Data (The Semantic Web)

Google is trying to move from "Strings" to "Things." It wants to understand that "Deniz Berke" is a Person, who works for an Organization, offering a Service. We teach Google this using Structured Data (Schema.org) via JSON-LD.

The Invisible Code

Inside the <head> of your page, we inject a JSON script that humans never see.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "DenizBerke Dashboard",
  "operatingSystem": "Web",
  "applicationCategory": "BusinessApplication",
  "offers": {
    "@type": "Offer",
    "price": "5000",
    "priceCurrency": "USD"
  }
}
</script>

The Return: Rich Snippets

When you provide this code, Google rewards you with Rich Snippets (Rich Results):

  • Star Ratings: 5 yellow stars next to your link.
  • Price: "$5000" explicitly shown.
  • FAQ: Expanding questions right in the search results.
  • Sitelinks Search Box: A search bar for your site within Google.

CTR Explosion: Rich Snippets occupy more vertical pixel space and draw the eye. They can increase Click-Through Rate (CTR) by 30%. This is purely an engineering task. At DENIZBERKE, we build React Components that automatically inject the correct Schema based on the content type.
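A sketch of that "schema from content type" pattern: a typed builder that returns the JSON-LD object for a given content shape, which a component then serializes into the script tag. The `ArticleContent` shape is hypothetical; the Schema.org property names are real.

```typescript
interface ArticleContent {
  title: string;
  author: string;
  datePublished: string;
}

// Map one content type to its Schema.org JSON-LD object.
function articleSchema(content: ArticleContent): object {
  return {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: content.title,
    author: { "@type": "Person", name: content.author },
    datePublished: content.datePublished,
  };
}

// In React you would render it as:
// <script type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: JSON.stringify(articleSchema(content)) }} />
const json = JSON.stringify(
  articleSchema({
    title: "Technical SEO",
    author: "Deniz Berke",
    datePublished: "2025-10-19",
  }),
);
console.log(json.includes('"@type":"Article"')); // true
```

One builder per content type (Article, Product, FAQPage) keeps the schema in sync with the data instead of hand-edited JSON.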


Part 5: Internationalization (Hreflang)

For global enterprises, "Which Google?" matters. If I search from Berlin, I want the German version. If I search from London, I want the English version. If you mess this up, the wrong language version ranks in the wrong country, and Google may treat the variants as duplicate content.

The Solution: Hreflang Tags (note that href values must be absolute URLs):

<link rel="alternate" hreflang="de" href="https://example.com/de/page" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/page" />

This map tells Google exactly which version to serve to which user. Implementing this dynamically across thousands of pages is a complex routing challenge (Middleware), but essential for global SEO.
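That dynamic implementation can be sketched as a generator over a locale-to-path map. The map shape is an assumption; the two real requirements shown are that every version lists every other version, and that an `x-default` entry covers unmatched locales:

```typescript
// Generate the full hreflang cluster for one page.
function hreflangTags(
  origin: string,
  variants: Record<string, string>, // locale → path, e.g. { de: "/de/page" }
  defaultLocale: string,
): string[] {
  const tags = Object.entries(variants).map(
    ([locale, path]) =>
      `<link rel="alternate" hreflang="${locale}" href="${origin}${path}" />`,
  );
  // x-default tells Google which version to serve when no locale matches.
  tags.push(
    `<link rel="alternate" hreflang="x-default" href="${origin}${variants[defaultLocale]}" />`,
  );
  return tags;
}

hreflangTags(
  "https://example.com",
  { de: "/de/page", "en-gb": "/uk/page" },
  "en-gb",
).forEach((tag) => console.log(tag));
```

The same map drives both the head tags and the middleware that routes users, so the two can never drift apart.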


Part 5.5: Next.js 14 Metadata API Deep Dive

In the React world, we used to use react-helmet to manage <head> tags. It was buggy and awkward to wire up for server rendering. Next.js 13+ introduced the Metadata API. This acts as the "SEO Control Center."

Static Metadata (layout.tsx):

import type { Metadata } from 'next'

export const metadata: Metadata = {
  title: 'Deniz Berke | Digital Architect',
  description: 'Building the future of finance.',
  openGraph: {
    images: ['/og-image.png'],
  },
}

Dynamic Metadata (page.tsx): For blog posts, content is dynamic. We use generateMetadata.

import type { Metadata } from 'next'

export async function generateMetadata(
  { params }: { params: { slug: string } }
): Promise<Metadata> {
  const post = await getPost(params.slug);
  return {
    title: post.title,
    description: post.excerpt,
    openGraph: {
      images: [post.coverImage],
    },
  }
}

Why this changes the game: This runs on the Server. The HTML that arrives at Googlebot already contains the perfect Title, Description, and OG Tags. Zero hydration needed. Zero flicker. 100% Indexability.


Part 6: The Technical SEO Checklist

Before you launch, check these 10 items.

  1. [ ] Robots.txt Validation: Does it block /admin? Does it allow /api/og? Test in Search Console.
  2. [ ] Canonical Self-Reference: Does every page point to itself as canonical? (Prevents scraping issues).
  3. [ ] Sitemap Automation: Is sitemap.xml generated at build time? Check next-sitemap.
  4. [ ] Trailing Slashes: Pick one strategy (Slash or No-Slash) and enforce it with global 301 Redirects.
  5. [ ] 404 Hierarchy: Does a bad URL return a true 404 status? (Not a soft 200).
  6. [ ] Image LCP: Is the Hero Image using priority={true} in Next/Image?
  7. [ ] Font Layout Shift: Are you using next/font? (Zero Layout Shift).
  8. [ ] Heading Structure: Is there exactly one <h1>? Are <h2> and <h3> nested logically?
  9. [ ] Mobile Viewport: Is the clickable area of buttons >48px? (Fat Finger rule).
  10. [ ] SSL/TLS: Is HTTP redirecting to HTTPS? Is HSTS enabled?
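Items 4 and 10 share one mechanism: normalize every request to a single canonical form (HTTPS, no trailing slash) with a single 301. A framework-agnostic sketch; the `Redirect` shape and the no-slash policy are assumptions:

```typescript
interface Redirect {
  status: 301;
  location: string;
}

// Return a 301 if the URL needs normalizing, or null if it is already canonical.
function normalizeRequest(rawUrl: string): Redirect | null {
  const url = new URL(rawUrl);
  let changed = false;
  if (url.protocol === "http:") {
    url.protocol = "https:";
    changed = true;
  }
  // "No-slash" strategy: strip the trailing slash everywhere except the root.
  if (url.pathname.length > 1 && url.pathname.endsWith("/")) {
    url.pathname = url.pathname.slice(0, -1);
    changed = true;
  }
  return changed ? { status: 301, location: url.toString() } : null;
}

console.log(normalizeRequest("http://example.com/blog/"));
// 301 → https://example.com/blog
console.log(normalizeRequest("https://example.com/blog")); // null (already canonical)
```

Doing both fixes in one hop matters: chained redirects (http→https, then slash→no-slash) waste crawl budget and dilute link equity.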

Part 7: Frequently Asked Questions (FAQ)

Q: Does Lighthouse score matter? A: Yes and No. A score of 100 is great, but "Field Data" (Real User Metrics) matters more. Use CrUX (Chrome User Experience Report) to see what real users are experiencing.

Q: SPA vs MPA for SEO? A: With Next.js (SSR), the line is blurred. You get the SEO benefits of an MPA (Server-rendered HTML) with the UX benefits of an SPA (Client-side routing). It is the best of both worlds.

Q: How often should I update the sitemap? A: Immediately. Regenerate it within an hour of publishing a new post. Note that Google retired the sitemap ping endpoint in 2023, so rely on Search Console submission and accurate lastmod values instead.


Part 8: The AI Search Era (SGE)

Search Generative Experience (SGE), now shipping as AI Overviews, is already here: Google answers the user directly with AI, pushing organic links down. How do you survive? You must be the Source of Truth.

  • Be the citational source.
  • Provide unique data tables.
  • Provide distinct, authoritative opinions (branding). If you just summarize others, AI will replace you. If you Create, AI will cite you.

Conclusion: Growth Engineering

SEO is not a "Marketing Task" anymore. It is a "Full-Stack Constraint." Just as you engineer for Security and for Scalability, you must engineer for Discoverability.

A beautiful website that Google cannot read is a billboard in the desert. By treating the Search Engine as a primary user persona (The Robot User), developers can unlock compounding organic growth. Code is the lever. Content is the fuel. SEO is the engine.

#SEO #Engineering #CoreWebVitals #NextJS #Growth #Architecture