SEO · 04 Apr 2026 · 10 min read

What is Technical SEO — And Why Your Small Business Website is Probably Getting It Wrong

Most small business websites have technical SEO problems hiding in plain sight. Here's what technical SEO actually covers, why it matters more than keywords, and why a custom .NET developer can audit and fix everything in a single deployment.


Most small businesses focus their SEO budget on two things: writing content and building backlinks. Both matter. But there's a third category that quietly determines whether any of that work actually shows up in Google — and it's the one most businesses never look at.

Technical SEO is the infrastructure beneath your content. It's the set of signals that tells Google what your website is, what each page does, whether the site is trustworthy, and how well it performs for the people who land on it. Get it wrong and your content doesn't rank — even if it's excellent. Get it right and every other piece of your SEO strategy works harder.

For small businesses, this is the most common and most expensive mistake in digital marketing: spending money on content and links while the technical foundation is silently working against you.

Technical SEO vs. On-Page SEO vs. Off-Page SEO

There are three branches of SEO, and they're not interchangeable:

  • Technical SEO — Crawlability, Core Web Vitals, schema markup, canonical URLs, sitemaps, indexing signals, redirect chains, structured data
  • On-Page SEO — Content quality, keyword targeting, headings, internal linking, page structure
  • Off-Page SEO — Backlinks, Google Business Profile, local citations, domain authority

The analogy that works: on-page SEO is what you say, off-page SEO is who vouches for you, and technical SEO is whether Google can hear you at all.

Most agencies lead with content and links because that's what's visible. Technical SEO lives in code, which means it gets ignored until something goes visibly wrong — and by then, rankings have already slipped.

What Technical SEO Actually Covers

A full technical SEO audit for a small business covers four areas.

Indexing and Crawlability

Google's crawler — Googlebot — visits your website and follows links to build a map of its content. If it can't reach a page, that page doesn't rank. Common problems:

  • robots.txt misconfiguration — A single character in the wrong place can accidentally block Google from crawling your entire site. This happens more than you'd think, particularly after a developer makes a change without understanding the SEO implications.
  • Orphaned pages — Pages with no internal links pointing to them. Google finds pages by following links; if nothing links to a page, Google may never discover it.
  • Sitemap errors — A sitemap listing pages that return 404 errors wastes Google's crawl budget and signals a poorly maintained site.
  • Accidental noindex tags — A stray noindex meta tag on a service page will remove it from Google's index entirely. You'd never know it was there unless you checked the page source.
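To see how little it takes, here is an illustrative robots.txt (the paths are hypothetical). The difference between blocking one directory and blocking the entire site is a single character:

```text
# Intended: block crawlers from the admin area only
User-agent: *
Disallow: /admin/

# One character wrong: this blocks crawling of the ENTIRE site
User-agent: *
Disallow: /
```

`Disallow: /` matches every URL on the domain, which is why a stray rule left over from a staging environment can quietly remove a whole site from consideration.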

On-Page Technical Signals

  • Duplicate meta titles — If every service page shares the same meta title, Google can't distinguish between them. They compete with each other instead of each targeting a distinct search query.
  • Missing or duplicate meta descriptions — Not a direct ranking factor, but they drive click-through rates from search results, and click-through rate determines how much traffic your existing rankings actually deliver.
  • Canonical URL errors — A canonical tag tells Google which version of a URL is the definitive one. If it points to the wrong URL, you're telling Google to rank a page that doesn't exist and ignoring the one that does.
  • Heading structure — One H1 per page, in logical hierarchy. Multiple H1s confuse both Google and screen readers.
  • Missing image alt text — Alt text is how Google understands images. It also matters for accessibility, which feeds into the overall page experience Google evaluates.
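All of these signals live in a handful of tags in the page's head. For illustration (every value below is a placeholder, not a real page):

```html
<head>
  <!-- Unique per page: distinguishes this page from every other on the site -->
  <title>Emergency Boiler Repair in Manchester | Acme Heating</title>
  <meta name="description"
        content="Same-day boiler repairs across Greater Manchester. Gas Safe registered engineers, fixed-price quotes.">
  <!-- Canonical: tells Google this URL is the definitive version of the page -->
  <link rel="canonical" href="https://www.example.com/services/boiler-repair">
</head>
```

A duplicate title, a missing description, or a canonical pointing at the wrong URL is invisible in the browser, which is exactly why these errors survive for years unnoticed.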

Structured Data (Schema Markup)

Schema markup is code added to a page that tells Google — in unambiguous terms — what the page is. Not just that it's about a web development service, but that it is a Service offered by a ProfessionalService organisation with a specific address, price range, and opening hours.

Google uses this to generate rich results: FAQ dropdowns, local business panels, breadcrumbs in search results. For small businesses, schema is one of the most under-used technical SEO tools available — and one of the most impactful for organic click-through rates.

The types that matter most:

  • Organization / ProfessionalService — Tells Google what your business is and what it does
  • LocalBusiness — Ties your site to a physical location for local search results
  • Service — Describes individual services with names, descriptions, and providers
  • FAQPage — Enables FAQ rich results directly in search — free extra visibility without needing to rank in position one
  • BreadcrumbList — Communicates the hierarchy of your site to Google
  • Article — Marks up blog posts with publication dates and authorship signals for E-E-A-T
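As an illustration, a minimal LocalBusiness block in JSON-LD might look like this. Every value here is a placeholder; a real implementation would use the business's actual details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Heating Ltd",
  "url": "https://www.example.com",
  "telephone": "+44 161 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Manchester",
    "postalCode": "M1 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
```

The point of the unambiguous `@type` and property names is that Google doesn't have to infer anything: the address, phone number, and opening hours are machine-readable facts.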

Core Web Vitals and Page Performance

In 2021, Google formalised what had been implicit for years: page experience is a ranking factor. The measurement system is Core Web Vitals — three metrics that reflect real-world user experience on real devices and real connections.

We've covered Core Web Vitals in depth in our article on website speed and SEO, but here's what every small business needs to know:

  • LCP (Largest Contentful Paint) — How long before the main content appears on screen. Target: under 2.5 seconds. Above 4 seconds is rated Poor by Google.
  • INP (Interaction to Next Paint) — How quickly the page responds to a user click, tap, or keystroke. Target: under 200ms. This replaced FID in March 2024 and is now Google's official responsiveness metric.
  • CLS (Cumulative Layout Shift) — Whether content jumps around as the page loads. Target: under 0.1. The classic example is a button that shifts just as you're about to tap it.
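As a concrete example of a CLS fix: the shifting-button problem usually comes from images that load without reserved space. Declaring dimensions up front lets the browser hold the layout steady (attribute values here are illustrative):

```html
<!-- Without width/height, the page reflows when the image arrives,
     pushing the button a visitor was about to tap -->
<img src="/images/team.webp" alt="The Acme Heating team"
     width="800" height="450" loading="lazy">
```

The browser computes the aspect ratio from the declared dimensions and reserves the space before the file downloads, so nothing jumps.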

Industry studies have repeatedly found that a 1-second delay in load time can cut conversions by around 7%. A small business with failing Core Web Vitals doesn't just rank lower — it loses the visitors it does attract before they read a single word.

The Problem With Technical SEO on WordPress and Page Builders

Here's the honest assessment: on WordPress, Wix, and Squarespace, the platform architecture works against technical SEO precision. WordPress SEO is controlled through plugins — typically Yoast or RankMath — that are doing their best inside a system not designed for it. The result:

  • Meta titles and descriptions are set page by page through an admin panel. Fifty pages means fifty chances to accidentally duplicate, skip, or misconfigure them.
  • Schema markup is auto-generated by plugins that frequently produce incorrect or incomplete output — particularly for service businesses that don't fit neatly into pre-defined templates.
  • Canonical URLs can be silently overridden by themes, WooCommerce, or conflicting plugins without the site owner ever knowing.
  • Core Web Vitals are structurally compromised. WordPress loads plugin CSS and JavaScript regardless of whether the current page needs them. Caching plugins reduce the damage but can't eliminate it.

There is no single view of the technical SEO state of a WordPress site. A proper audit means clicking through dozens of admin screens, running external crawl tools, and cross-referencing results manually. As we've covered before, this is one of the hidden costs of CMS-based development that rarely gets discussed upfront.

The Custom .NET Advantage: The Entire Site in One View

When a small business website is built on ASP.NET Core and version-controlled in Git, the entire technical SEO picture exists in one place: the codebase.

Every meta tag, every canonical URL, every schema block, every sitemap entry, every robots.txt rule — it's all code. A developer can open the repository and read the complete technical SEO state of the entire site in minutes. Not by clicking through admin screens. Not by running an external tool. By reading the files.

This changes the economics of technical SEO for a small business in three concrete ways.

1. Git Versioning: A Complete Audit Trail

Version control with Git means every change to every file in your website is logged with a timestamp, the author, and an exact record of what changed — including every SEO-critical element.

  • If rankings drop overnight, a developer can run a diff and see exactly what changed and when — no guesswork, no trawling through a database audit log
  • If a canonical URL was accidentally removed during a content update, you can identify the exact commit, restore the correct value, and deploy the fix in minutes
  • If you want to audit every meta description on the site for duplicates, a developer opens one structured data file and reads every single one — because they're stored together in code, not scattered across fifty database records
  • If Google Search Console flags a new crawl error, you can trace it to the precise commit that caused it and know immediately whether it was intentional

This is a fundamentally different model from WordPress, where content changes are logged in a database that's difficult to query and impossible to diff meaningfully. The Git history of a code-based small business site is a complete, human-readable technical SEO audit trail from the first line of code.

2. The Whole-Site Audit in One Pass

In a bespoke .NET website, SEO-critical elements live in structured, predictable locations:

  • All page metadata — titles, descriptions, canonical logic — is set in page model classes, one class per page type
  • All schema markup lives in a shared partial that applies automatically to every page type — changes propagate immediately on the next build
  • All content data for services, locations, and blog articles is stored in structured JSON files that a developer can read, diff, and validate in a text editor without loading an admin panel
  • The sitemap is generated in code — adding a new page automatically includes it in the sitemap without any manual intervention
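To make that concrete, here is a minimal sketch of the page model pattern. Every class, property, and URL below is hypothetical — a sketch of the approach, not code from any real project:

```csharp
using Microsoft.AspNetCore.Mvc.RazorPages;

// Hypothetical base class: every page type inherits its SEO contract from here.
public abstract class SeoPageModel : PageModel
{
    // Each page type supplies these once; there is no per-page admin
    // field that can drift out of sync or be accidentally left blank.
    public abstract string MetaTitle { get; }
    public abstract string MetaDescription { get; }

    // Canonical URL derived from the request path: one implementation,
    // applied identically to every page that inherits this class.
    public string CanonicalUrl =>
        $"https://www.example.com{Request.Path.Value?.TrimEnd('/')}";
}

// A concrete page type: the shared layout reads these properties to render
// the <title>, meta description, and canonical <link> tags.
public class ServicesModel : SeoPageModel
{
    public override string MetaTitle =>
        "Bespoke ASP.NET Core Websites | Example Ltd";
    public override string MetaDescription =>
        "Custom .NET websites built for speed, technical SEO, and maintainability.";
}
```

Because the canonical logic lives in one base class, a canonical bug is a single fix that propagates to every page on the next deployment.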

A developer can perform a complete technical SEO audit of an entire small business site by reviewing a handful of files. No plugin-driven CMS offers a comparable workflow.

3. Site-Wide Fixes in a Single Deployment

This is where the difference becomes most commercially significant for a small business.

Imagine an audit finds 25 technical issues across 40 pages: incorrect schema types on service pages, missing alt attributes on a shared image component, a canonical logic error affecting every location page, and a sitemap including five pages that should be excluded.

On a WordPress site, fixing those issues means logging into the admin panel and working through each page individually — changing meta fields, updating plugin settings, editing pages one at a time. That's a half-day of manual work with a high risk of inconsistency between pages.

On a custom .NET site:

  • The schema fix is one change to a shared partial — every page inherits it on deploy
  • The alt attribute fix is one change to a shared component — every instance is corrected simultaneously
  • The canonical logic fix is one change to the base page model — every page in the site gets the correct canonical at once
  • The sitemap fix is one change to the generator — the corrected sitemap is live in the next deployment

Four changes. One deployment. Every affected page fixed at the same time.
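The sitemap fix can be sketched the same way. This is an illustrative generator, not production code — the page list would come from whatever structure the site already uses to register its routes:

```csharp
using System.Collections.Generic;
using System.Text;

public static class SitemapGenerator
{
    // Builds sitemap XML from the same list of pages the site renders,
    // so a new page is included automatically and a page flagged as
    // non-indexable simply never appears.
    public static string Build(string baseUrl, IEnumerable<string> indexablePaths)
    {
        var sb = new StringBuilder();
        sb.AppendLine("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
        sb.AppendLine("<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">");
        foreach (var path in indexablePaths)
        {
            sb.AppendLine($"  <url><loc>{baseUrl}{path}</loc></url>");
        }
        sb.AppendLine("</urlset>");
        return sb.ToString();
    }
}
```

Excluding the five unwanted pages means removing them from the indexable list in one place; the corrected sitemap ships with the next build.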

For a small business paying for SEO work, the difference between "we can fix all of this today" and "we'll need three weeks to work through this" is not an abstract technical preference — it's how quickly you start recovering rankings.

What a Technical SEO Audit Covers in Practice

When we audit a small business site, we're working through four areas in a single pass through the codebase:

Indexing and crawlability

  • Is robots.txt correctly configured — blocking only what should be blocked?
  • Is the XML sitemap accurate, listing only indexable pages, and free of 404 errors?
  • Are there orphaned pages with no internal links?
  • Are there pages indexed that should be excluded — thank-you pages, internal search results, admin routes?

On-page technical signals

  • Are meta titles unique, within 60 characters, and keyword-relevant for each page type?
  • Are meta descriptions unique, compelling, and within 160 characters?
  • Is there one H1 per page with headings in logical hierarchy?
  • Do all images have descriptive, relevant alt text?

Schema and structured data

  • Is ProfessionalService or Organization schema present on the homepage?
  • Are service pages marked up with Service schema?
  • Is LocalBusiness schema on location pages with correct address and coordinates?
  • Are FAQs marked up with FAQPage schema to trigger rich result expansion in search?
  • Is Article schema on blog posts with publication date and author?
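For example, a single-question FAQPage block looks like this (the question and answer are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How long does a website rebuild take?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Most small business rebuilds take four to eight weeks, depending on scope."
    }
  }]
}
</script>
```

Each additional question is another entry in the `mainEntity` array, and the marked-up content must match what's visible on the page.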

Performance and Core Web Vitals

  • What are the LCP, INP, and CLS scores for mobile and desktop?
  • Are images in modern formats (WebP or AVIF)?
  • Is a CDN in place for static asset delivery?
  • Are there redirect chains bleeding PageRank with every hop?
  • Is HTTPS enforced with no mixed-content warnings on any page?
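A redirect chain, for illustration, looks like this in practice (URLs hypothetical):

```text
http://example.com/services        → 301 → https://example.com/services
https://example.com/services       → 301 → https://www.example.com/services
https://www.example.com/services   → 200
```

Each hop adds latency and dilutes the signal passed to the final page. The fix is to redirect straight to the final URL in a single step.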

Real-World Example: ASSET North West

When we rebuilt the ASSET North West website, technical SEO was a core deliverable — not an afterthought bolted on after launch.

Their previous WordPress installation had accumulated years of technical debt: crawling issues, failing Core Web Vitals, thin and duplicated service page content, and no structured data. The SEO plugin was configured, but it couldn't compensate for a platform executing some 50,000 lines of plugin and theme code on every single page request.

The new bespoke .NET site resolved every technical issue simultaneously at launch — not as a separate post-build SEO pass, but as a natural consequence of how the site was architected:

  • Schema markup built into a shared partial — every page type automatically receives the correct schema without any manual per-page configuration
  • Canonical URLs generated programmatically from the request path — no admin field for a content editor to accidentally misconfigure
  • Sitemap generated in code, listing only indexable pages — no manual re-submission required when new content is published
  • Core Web Vitals score of 98% on PageSpeed Insights — not because a caching plugin is papering over a slow platform, but because there is no bloat to work around

The result was increased organic visibility, improved conversion rates from search traffic, and a site that maintains its technical health because the architecture enforces it — not because someone set a reminder to check a settings panel.

How Code360 Handles Technical SEO

Technical SEO isn't a service we add after a project is finished. It's built into every bespoke .NET website we deliver, because the way we build makes it the default — not the exception.

  • Schema markup — ProfessionalService, LocalBusiness, Service, FAQPage, Article, and BreadcrumbList in shared partials, applied automatically across all relevant page types
  • Canonical URLs — Generated programmatically. No content editor can accidentally misconfigure them.
  • Metadata — Set in structured data files. Every page title and description is visible and auditable by a developer in a single file.
  • Sitemaps — Generated in code, submitted to Search Console, listing only what should be indexed
  • Core Web Vitals — Tested against PageSpeed Insights, Lighthouse, and GTmetrix before every launch. Our sites consistently score 90+.
  • Git version control — Every change to every SEO-relevant element is committed, timestamped, and diffable. A complete audit trail from day one.

Our website management and support service includes ongoing technical health checks post-launch — so issues are caught and fixed before they affect rankings, with full Git history showing exactly what changed and when.

The Bottom Line

Technical SEO is the foundation everything else sits on. For a small business, it's the difference between a site that works as a lead generation tool and one that sits in Google's index, effectively invisible.

The core advantages of a custom .NET website for technical SEO come down to visibility and control:

  • A developer can see the complete technical SEO state of the entire site in one view
  • Git versioning means every change is tracked, diffable, and reversible — a complete audit trail
  • Site-wide issues are fixed in a single deployment, not page by page through an admin panel
  • Core Web Vitals are built in from the start — performance that protects rankings rather than erodes them
  • Schema markup is implemented once in shared code and applied everywhere automatically

If your small business website is on WordPress, Wix, or a page builder, there are almost certainly technical SEO problems you don't know about. If it's built on ASP.NET Core, those problems are visible, auditable, and fixable all at once.

Get in touch — we'll audit your current site's technical SEO, tell you exactly what's wrong, and show you what it would take to fix it.

Get a Quote