Most SEO advice is written for marketers. This is written for developers.
Technical SEO is the subset of search optimization that deals with how your site is built, not what's written on it. It's the difference between Google being able to crawl and index your site efficiently versus struggling with it.
Rendering and Crawl Budget
Googlebot has a finite amount of resources to spend on your site. This is your crawl budget. Waste it and new pages take longer to appear in search results.
Server-side rendering matters. If your page requires JavaScript execution to display content, Google can render it — but it takes longer and uses more of your crawl budget. Pre-rendered or server-rendered HTML is crawled faster and more reliably.
Return proper HTTP status codes. Soft 404s (pages that return 200 but display "not found" content) waste crawl budget. If a page doesn't exist, return 404. If it moved, return 301. Don't return 200 for everything.
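The status-code rule can be sketched as a route resolver. This is a minimal illustration, not a framework recipe; the route table, paths, and `resolveRoute` name are all hypothetical.

```typescript
// Hypothetical route tables: live pages and pages that have moved.
const livePages = new Set(["/", "/services", "/contact"]);
const movedPages = new Map([["/old-services", "/services"]]);

type RouteResult =
  | { status: 200 }
  | { status: 301; location: string }
  | { status: 404 };

// Resolve a path to the status code Googlebot should see.
// Never return 200 with "not found" content: that's a soft 404.
function resolveRoute(path: string): RouteResult {
  if (livePages.has(path)) return { status: 200 };
  const target = movedPages.get(path);
  if (target !== undefined) return { status: 301, location: target };
  return { status: 404 };
}
```

Whatever server you use, the invariant is the same: every URL resolves to exactly one of 200, 301, or 404 before any HTML is rendered.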
robots.txt controls crawling, not indexing. Disallowing a URL in robots.txt prevents crawling but doesn't prevent indexing if other pages link to it. Use noindex meta tags or X-Robots-Tag headers to prevent indexing; note that noindex only works if the page is crawlable, so don't also disallow it in robots.txt, or Google will never see the directive.
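Here's a sketch of the header-based approach. The `X-Robots-Tag` header and the robots meta tag are real mechanisms; the helper name and path list are assumptions for illustration.

```typescript
// Hypothetical list of pages that must stay crawlable but unindexed.
const unindexedPaths = new Set(["/internal-search", "/thank-you"]);

// Build response headers for a path. For pages that shouldn't be
// indexed, add X-Robots-Tag: noindex, which is equivalent to
// <meta name="robots" content="noindex"> in the page head.
function indexingHeaders(path: string): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "text/html" };
  if (unindexedPaths.has(path)) {
    headers["X-Robots-Tag"] = "noindex";
  }
  return headers;
}
```

The header form is useful for non-HTML resources (PDFs, images) where a meta tag isn't possible.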
Core Web Vitals
These are measurable performance metrics that affect rankings. They're also just good engineering.
Largest Contentful Paint (LCP): How fast does the main content appear? Target: under 2.5 seconds. Fix: optimize images, preload critical resources, use a CDN, reduce server response time.
Cumulative Layout Shift (CLS): Does content jump around while loading? Target: under 0.1. Fix: set explicit dimensions on images and embeds, avoid injecting content above the fold after initial render, use font-display: swap or preload fonts.
Interaction to Next Paint (INP): How fast does the page respond to user input? Target: under 200ms. Fix: break up long JavaScript tasks, reduce main thread blocking, defer non-critical scripts.
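The "break up long JavaScript tasks" fix for INP amounts to splitting one big batch of work into small chunks so the main thread can handle input between them. A minimal sketch; `chunkWork` and the chunk size are hypothetical names, not a browser API.

```typescript
// Split a batch of work items into fixed-size chunks. Each chunk
// can then run in its own task, leaving gaps for input handling.
function chunkWork<T>(items: T[], chunkSize: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    chunks.push(items.slice(i, i + chunkSize));
  }
  return chunks;
}

// In the browser you would yield between chunks, e.g.:
// for (const batch of chunkWork(rows, 50)) {
//   await new Promise((r) => setTimeout(r, 0)); // yield to input events
//   batch.forEach(render);
// }
```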
Structured Data
JSON-LD is the preferred format. Add it to your pages as a script tag in the head or body.
At minimum, implement:
- Organization: Your company name, logo, contact info
- WebSite: Your site's search URL structure (if applicable)
- BreadcrumbList: Navigation hierarchy for inner pages
- Service: Individual service offerings with descriptions
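The BreadcrumbList type above can be generated rather than hand-written. A sketch, assuming a simple crumb shape; the schema.org types and properties are real, but the `Crumb` interface and URLs are placeholders.

```typescript
interface Crumb {
  name: string;
  url: string;
}

// Serialize a breadcrumb trail as a JSON-LD script tag.
function breadcrumbJsonLd(crumbs: Crumb[]): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: crumbs.map((c, i) => ({
      "@type": "ListItem",
      position: i + 1, // positions are 1-based
      name: c.name,
      item: c.url,
    })),
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

Generating the markup from your routing data keeps it in sync with the actual navigation hierarchy, which hand-written JSON-LD rarely stays.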
URL Architecture
Use descriptive, hierarchical URLs. /services/automated-vulnerability-assessment is better than /services?id=3.
Trailing slashes: Pick one convention and redirect the other. Having both creates duplicate content.
Parameters: Avoid URL parameters for content. If you must use them, implement canonical tags pointing to the preferred URL.
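Both rules, one trailing-slash convention and parameter-free canonical URLs, can be enforced in a single normalizer. A sketch, assuming a no-trailing-slash convention and an allowlist of parameters that actually change content; both choices are illustrative.

```typescript
// Query parameters that affect page content (assumption for the sketch).
const allowedParams = new Set(["page"]);

// Normalize a URL to its canonical form.
function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  // Drop every parameter that doesn't change the content (tracking, etc.).
  for (const key of [...url.searchParams.keys()]) {
    if (!allowedParams.has(key)) url.searchParams.delete(key);
  }
  // Enforce one trailing-slash convention (here: no trailing slash,
  // except for the root path).
  if (url.pathname.length > 1 && url.pathname.endsWith("/")) {
    url.pathname = url.pathname.slice(0, -1);
  }
  return url.toString();
}
```

The output of this function is what belongs in the page's `<link rel="canonical">` tag and in the sitemap.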
Sitemaps
Your sitemap should be dynamically generated and only include canonical, indexable URLs. Don't include pages with noindex, redirected URLs, or error pages.
Submit your sitemap through Google Search Console and reference it in robots.txt. Keep it under 50,000 URLs per file — split into multiple sitemaps if needed.
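The generation rules above (canonical and indexable URLs only, at most 50,000 per file) can be sketched as a pure function. The `Page` shape is an assumption; the 50,000 limit comes from the sitemap protocol.

```typescript
interface Page {
  url: string;
  indexable: boolean; // no noindex, not an error page
  canonical: boolean; // not a redirect or parameter variant
}

const URL_LIMIT = 50_000; // sitemap protocol limit per file

// Build one or more sitemap XML documents from a page inventory.
function buildSitemaps(pages: Page[]): string[] {
  const urls = pages.filter((p) => p.indexable && p.canonical);
  const files: string[] = [];
  for (let i = 0; i < urls.length; i += URL_LIMIT) {
    const body = urls
      .slice(i, i + URL_LIMIT)
      .map((p) => `  <url><loc>${p.url}</loc></url>`)
      .join("\n");
    files.push(
      `<?xml version="1.0" encoding="UTF-8"?>\n` +
        `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
        `${body}\n</urlset>`
    );
  }
  return files;
}
```

Wiring this to your actual page inventory (CMS, database, or route manifest) is what makes the sitemap "dynamic": it can never drift from what's really deployed.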
Internal Linking
Every important page should be reachable within 3 clicks from the homepage. Google distributes ranking authority through internal links, and deeply buried pages receive less of it.
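One way to audit the 3-click rule is a breadth-first search over your internal link graph. A sketch: the adjacency-map representation is an assumption, and in practice you'd build it by crawling your own site.

```typescript
// Compute each page's click depth from the homepage via BFS.
function clickDepths(
  links: Record<string, string[]>,
  home: string
): Map<string, number> {
  const depth = new Map<string, number>([[home, 0]]);
  const queue = [home];
  while (queue.length > 0) {
    const page = queue.shift()!;
    for (const target of links[page] ?? []) {
      if (!depth.has(target)) {
        // First visit is the shortest click path, by BFS order.
        depth.set(target, depth.get(page)! + 1);
        queue.push(target);
      }
    }
  }
  return depth; // pages missing from the map are unreachable
}
```

Any important page with depth greater than 3, or missing from the map entirely, is a candidate for a new internal link.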
Use descriptive anchor text. "Learn more" tells Google nothing. "Website security audit process" tells Google exactly what the linked page is about.
The Minimum Viable Checklist
If you do nothing else:
1. Server-render your HTML
2. Implement proper HTTP status codes
3. Add structured data (Organization + BreadcrumbList at minimum)
4. Generate a dynamic sitemap
5. Set canonical URLs on every page
6. Measure and optimize Core Web Vitals
Everything else is optimization on top of a solid foundation. Get these right first.