Your Guide to Technical SEO for Ecommerce: Optimize Your Site for Success

Technical SEO for Ecommerce

Many people think search engine optimization is about stuffing keywords into badly written articles.

Don’t get me wrong, many brands make good use of ecommerce content marketing. But a lot of content is low effort – or straight from ChatGPT. It doesn’t exactly help expand the world’s knowledge base.

I want to talk about something that does make the Internet a better place — technical SEO.

You see, nobody likes functionally bad websites. You know, the kind of websites that:

  • Take forever to load. It’s 2024, I’m not waiting 30 seconds to see your products. 
  • Don’t have a mobile version. Are we even living in the same century? 
  • Have security problems. I’m out before you steal my crypto.
  • Make it hard to access and use pages for their intended purpose. Doesn’t matter if it’s purchasing a product, browsing your store, or simply reading a blog post.

Bad websites make it hard for Google too. A weak website structure and poor navigation will prevent search engines from effectively crawling your content and discerning pages from one another. 

This is where technical SEO comes in — it’s the list of technical requirements your website needs to function correctly.

Think of it like passing a tech inspection on your car. You might have a pumped-up engine, nice leather seats, and a fat subwoofer in the back. But if your car doesn’t pass tech, you won’t be able to bless the streets with your crew’s latest mixtape.

It’s the same with your blog or ecommerce store. If your website is falling apart as it’s trying to load, you’re probably not getting into the SERPs.

There’s lots more to it, though, so let’s get technical.

Sell more products with less effort.

If you want a steady flow of engaged customers primed to buy, we’ve got a proven process for driving organic traffic and converting it into sales.

Technical SEO in 30 Seconds

Technical SEO usually works on a sitewide level and helps you fix problems like crawlability, indexability, navigation, accessibility, responsive design, page speed, security, and more.

Crawlability and indexability are the most important. Here’s why:

To get organic traffic (and subsequently sales), you need your pages to rank on the search engine results page, realistically in the top 20 results. 

But a page cannot rank if it’s not indexed. And with some exceptions, pages cannot be indexed if bots can’t crawl your site.

So, What Is Crawling?

Search engines use software called crawlers, spiders, or bots (the terms are used interchangeably) to discover content on the Internet. 

Crawling process and crawl depth

Spiders crawl websites in a tree-like fashion. They:

  1. Read the page they’re on (assume the homepage).
  2. Evaluate the page for inclusion in the search engine index.
  3. Follow each link to discover new pages and content.
  4. Repeat until the entire website is read and mapped out.

What’s an Index, Then?

The search engine index is simply the gigantic pool of Internet resources Google maintains. When you type a keyword into the search tab, the ranking algorithm checks what’s available in the index and ranks the best 100 or so pages according to various factors.

What is technical SEO search query in Google

Getting your pages indexed is passing the tech inspection we mentioned earlier. It doesn’t make you Vin Diesel, it simply puts your car on the road and gives it a non-zero chance in the competition.

Vin Diesel, Paul Walker in Fast and Furious
Running around with noindex will get your site killed, Brian… I’m all about the SEO game, but this is family!

Poor Technical SEO Will Break Your Website. No, Really.

Ecommerce sites frequently work with large catalogs of products, which must be managed and organized to make them accessible to crawlers and visitors alike.

When the number of pages exceeds what you can feasibly manage by hand, the effects of bad technical SEO can snowball and turn your whole store into a roadside wreck.

For example, a bad site structure might prevent crawlers and users from finding ALL of your products. If Google can’t find them, they won’t rank. And if users don’t find them, they won’t sell. 

Nik says it pretty clearly in his article on sudden drops in traffic:

“I can’t stress this enough: HAVING CRAWLING ISSUES IS A HUGE PROBLEM AND IT MIGHT KILL ALL YOUR RANKINGS.”

Nik Trifonov SEO Specialist

Nik Trifonov

SEO Specialist, DCP

Here’s another fact:

“A site that loads in 1 second has a conversion rate 3x higher than a site that loads in 5 seconds.”
Portent Research 2022

Modern ecommerce platforms like Shopify, WooCommerce, and Squarespace come with the promise of solving technical SEO as part of the service. 

This is somewhat true. These platforms work right out of the box but rarely achieve their full potential without some elbow grease.

That said, here are the cornerstones of technical SEO that you need to care about for your ecommerce store.

Improve the Crawlability of Your Ecommerce Store

Crawlability measures how easy or hard it is to discover content on your site via crawling. 

The following items influence how search engines crawl your site: 

The robots.txt file

The robots.txt file instructs crawlers where to go and where not to after landing on your site. A badly configured robots.txt file can block search engines from crawling the site and discovering your content. 

Go ahead and check yours. It should be located directly under the root domain, like so: https://digitalcommerce.com/robots.txt

Good. All bots are allowed to crawl everything on your site:

User-agent: *
Disallow:

Very, very bad. Bots are not allowed to crawl anything on your site:

User-agent: *
Disallow: /

One slash makes all the difference. We’ve seen it happen during bad migrations.

The robots.txt file can wipe your entire site out of Google search with long-lasting consequences. 
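You don’t have to eyeball the rules to know what they do. Python’s standard library ships a robots.txt parser, so you can verify the effect of a given rules file directly. A minimal sketch (the URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Check whether a given robots.txt body blocks a crawler from a URL.
# We parse the rules locally, so no network call is needed.
def is_crawlable(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

open_rules = "User-agent: *\nDisallow:"      # nothing blocked
closed_rules = "User-agent: *\nDisallow: /"  # everything blocked

print(is_crawlable(open_rules, "https://example.com/products/"))    # True
print(is_crawlable(closed_rules, "https://example.com/products/"))  # False
```

Running a check like this as part of a deployment or migration can catch that one fatal slash before Google does.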

The meta robots tag

You should find this code in any one of your pages:

<meta name="robots" content="index,follow">

These are instructions for crawlers after they have arrived on the page:

  • Index, follow should be the default instruction for the vast majority of pages. 
  • Noindex, follow can be used for keeping pages with no SEO value out of the index — duplicate content, offer pages, lead generation, cart, checkout, etc.
  • Nofollow will instruct robots to NOT follow any links on the page.

Since we always want spiders to crawl our website and refresh the index, there’s hardly ever a reason to apply nofollow to an entire page. You can use the rel="nofollow" attribute to disable specific links when necessary (e.g. affiliate or paid links).
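If you want to audit these tags at scale, a quick script can pull them out of raw HTML. Here’s a minimal sketch using only Python’s standard library (the sample HTML is made up):

```python
from html.parser import HTMLParser

# Extract the meta robots directives from a page's HTML.
class MetaRobotsParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives = [d.strip().lower() for d in a.get("content", "").split(",")]

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
p = MetaRobotsParser()
p.feed(html)
print(p.directives)  # ['noindex', 'follow']
```

Feed it the HTML of each crawled page and flag anything important that comes back as noindex.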

Navigation and Internal linking

Crawling relies almost exclusively on internal linking. Your website structure needs to ensure that all content is accessible and crawl depth is minimized where possible — this is especially true for menus.

Every category page should be accessible via the navigation, and every product should be listed in at least one category. The same is true for the blog section of your site and any other content sections.

Failure to do so can result in orphaned pages — pages that are severed from the rest of the site, and not discoverable via crawling.

Crawl depth is the distance of a page from the homepage (or the number of clicks it takes to reach it via the navigation). 

The relative importance of a page within a website structure is inversely proportional to its crawl depth and directly proportional to the number of internal links it receives.

The lower the crawl depth, the more prominent a page becomes. The more internal links it receives, the stronger its authority compared to the rest of the website. 

You should aim for a maximum crawl depth of 2 for all of your categories and a max of 3 for your products. Be wary if you see numbers higher than 5, and try to improve the internal linking of these pages.
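Crawl depth is just the shortest-path distance in your internal link graph, so you can compute it with a breadth-first search. A toy sketch with a made-up link map:

```python
from collections import deque

# Compute crawl depth (clicks from the homepage) with a breadth-first
# search over a site's internal link graph. The link map is illustrative.
def crawl_depths(links: dict[str, list[str]], home: str = "/") -> dict[str, int]:
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first discovery = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/shop", "/blog"],
    "/shop": ["/shop/pajamas"],
    "/shop/pajamas": ["/product/silk-pajama-set"],
}
print(crawl_depths(site))
# {'/': 0, '/shop': 1, '/blog': 1, '/shop/pajamas': 2, '/product/silk-pajama-set': 3}
```

Screaming Frog reports this number for you, but the mechanic is exactly this: every extra click from the homepage is one more level of depth.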

For ecommerce, the menu structure usually follows this hierarchical pattern, where each of the pages features navigation elements to browse and discover content further down the tree. 

Home → Store → Categories → Subcategories → Products → Product variations

Website URL Architecture
Keep in mind that navigation structure and URL structure are not necessarily the same. More about URLs further down.

For stores with large product catalogs, mega menus are often utilized to flatten out the structure and reduce crawl depth. 

Secondary navigation is also important.

Link widgets help to create additional internal links to better interconnect different pages and sections:

  • Related products/posts
  • Highlighted categories
  • Breadcrumbs
  • HTML pagination (for categories)

Diversity is key, but remember that users don’t want to browse a page with endless menus and widgets that dilute the focus from the page’s main purpose and content. 

Editorial internal linking

Links from the main body content of a page signal authority and trust. These links are highly valued, compared to links from widgets and even the main menu.

They can help fill in the cracks in your internal linking, strategically reduce crawl depth, and also pass SEO value to your main targets.

Accessibility of content 

404 errors (broken links) and 500 errors create dead ends for crawlers, while too many redirects (301, 302, etc) add extra steps in the journey, consuming unnecessary resources and slowing the website down.

Keep all of the above in check.
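Redirect chains in particular are easy to audit programmatically. A toy sketch, where the redirect map is made up (in practice you’d export it from a crawl or your server config):

```python
# Follow a chain of redirects, represented as a simple source -> target dict,
# and fail loudly on loops or overly long chains.
def redirect_chain(redirects: dict[str, str], url: str, limit: int = 10) -> list[str]:
    chain = [url]
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in chain:
            raise ValueError(f"Redirect loop detected at {nxt}")
        if len(chain) >= limit:
            raise ValueError("Too many redirects")
        chain.append(nxt)
    return chain

hops = {"/old-product": "/old-category/product", "/old-category/product": "/shop/product"}
print(redirect_chain(hops, "/old-product"))
# ['/old-product', '/old-category/product', '/shop/product']
```

Any chain longer than one hop is a candidate for collapsing into a single 301 straight to the final destination.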

Screaming Frog is the number one tool for diagnosing crawling on your site. Screaming Frog will crawl a website and provide you with a plethora of information regarding: 

  • Website structure — crawl depth, internal and external linking.
  • Server response — 200, 3xx, 4xx, 5xx codes.
  • Meta robots tags.
  • Canonicals.
  • Onpage elements.
  • And much more.
Screaming Frog Technical SEO Crawler
Screaming Frog is the one tool to get when starting with technical SEO!

The free version works up to 500 URLs, and the paid version is well worth the investment if your site is bigger.

Maximizing the Indexability of Your Collections and Products

Indexability is all about making sure the right content is known to search engines and considered for relevant searches. 

You want your products and categories to rank for ecommerce searches, your blog posts for informational searches, and your home or about page for searches about your brand. 

But you don’t want four product pages fighting for the same keyword. You’ll never rank all four on the first page — not for any competitive keywords anyway. More likely, this internal competition will result in none of the pages performing to their full potential.

Indexability is as much about getting your content in as it is about keeping all the junk out of the index. You want the right URLs for the right keywords. And that starts with…

The sitemap.xml file

Linked at the bottom of every properly configured robots.txt file is the sitemap.xml – a dynamically-generated XML file that contains all URLs on your website that you want to be indexed and ranked. 

Once discovered, Google will visit the sitemap periodically to re-crawl each page and update its index. The sitemap is the only way for search engines to discover your content besides crawling.

Make sure your sitemap.xml is properly configured. To do this:

Check if all your important content is added to the sitemap. On most ecommerce stores, the sitemap will be organized into smaller maps dedicated to specific sections of your website, like: 

  • Homepage, Contact, About, and other top-level pages
  • Store categories, Products
  • Blog categories, Articles

However, you don’t want all pages of your site in the sitemap. Keep the following out:

  • Dead or inaccessible content – 404 Not Found and 500 Internal Server Error
  • Pages set to “noindex” in the meta robots tag
  • Pages canonicalized or redirected to another page
  • HTTP pages

Go ahead and check your sitemap. It should be located at a similar URL (sitemap name can vary): https://digitalcommerce.com/sitemap_index.xml

XML Sitemap
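If your sitemap is large, skimming it by eye gets old fast. A small sketch that lists the URLs in a sitemap file using Python’s standard library (the sample XML is a minimal made-up sitemap):

```python
import xml.etree.ElementTree as ET

# List the <loc> URLs in a sitemap.xml document.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/shop/pajamas</loc></url>
</urlset>"""
print(sitemap_urls(sample))
# ['https://example.com/', 'https://example.com/shop/pajamas']
```

Diff that list against a crawl export and you’ll quickly spot pages that are in the sitemap but shouldn’t be, and vice versa.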

Do a site search for your domain and see what content shows up.

For example, type into Google: site:digitalcommerce.com/

Make sure your sitemap is submitted in Google Search Console. You’ll get reports for the performance of your sitemaps and their listed URLs. You can also inspect individual URLs to diagnose problems.

Google Search Console - sitemap report

The meta robots tag

We already talked about the meta robots tags in the crawlability section. It’s equally important for indexability as well. 

<meta name="robots" content="index,follow">

Again, make sure your important SEO targets like products, categories, content articles and landing pages (for search) are set to “index, follow”. The incorrect directive can drop the page out of the index as soon as the next crawl.

Beyond that, meta robots are an important tool for keeping non-SEO pages out of the index. 

You should consider setting the following pages to noindex:

  • Lead generation pages — signup forms, thank you pages
  • Functional pages — cart, checkout
  • Temporary sales and offer pages
  • Various landing pages for ads, email marketing, affiliates, and other forms of referrals or direct traffic

The canonical tag

Just like the meta robots tag, you should also find the canonical link tag on every one of your pages. It looks like this:

<link rel="canonical" href="https://digitalcommerce.com/case-studies/"/>

The canonical tag points out the main URL you want Google to consider for indexing and ranking. 

Normally, for many pages on your site like the homepage, about us, contact, blog posts, and info pages, there is only one URL. So by extension, it’s the main one, and life is good. 

Ecommerce stores, however, have the potential to create hundreds if not thousands of URLs that have duplicate content and contribute zero additional value for search engines.

Here are some examples:

  1. Product versions:
  • https://aedadvantage.ca/products/lifepak-cr2-defibrillator?variant=29528570593383
  • https://aedadvantage.ca/products/lifepak-cr2-defibrillator?variant=41856370835627
  2. Dynamic category URLs based on tags and filtering:
  • https://printfresh.com/collections/blouses?filter.p.tag=Apparel&filter.p.tag=Siberian+Stripes&filter.v.option.size=M&filter.v.availability=1
  • https://printfresh.com/collections/blouses?filter.p.tag=High+Horse&filter.v.option.size=L&filter.v.availability=1&sort_by=title-ascending
  3. Paginated URLs:
  • https://printfresh.com/collections/pajamas?page=2
  • https://printfresh.com/collections/pajamas?page=3
  4. URL versions with and without a forward slash at the end:
  • https://propercloth.com/tailored-clothing/burgundy-italian-flannel-bedford-jacket-192721.html
  • https://propercloth.com/tailored-clothing/burgundy-italian-flannel-bedford-jacket-192721.html/

Of course, duplicate content is a problem because we cannot predict how the algorithm will react. Most of the time, Google will figure it out, but it will on occasion index the wrong URL or index both URLs and get confused about which one to rank for what keywords.

For these and every other case of duplicate content, we use the canonical tag to instruct search engines which URL we want to index and rank. 
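To get a feel for how much variation the canonical has to absorb, here’s a toy sketch that derives a canonical candidate by stripping query strings and trailing-slash duplicates. This is a simplification for illustration only; real canonical logic depends on your platform and some parameters (like pagination) deserve their own handling.

```python
from urllib.parse import urlsplit, urlunsplit

# Derive a canonical candidate by dropping query parameters (variants,
# filters, sorting) and normalizing the trailing slash.
def canonical_candidate(url: str) -> str:
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

print(canonical_candidate(
    "https://example.com/products/lifepak-cr2?variant=29528570593383"
))
# https://example.com/products/lifepak-cr2
```

All four example groups above collapse to a single URL under a rule like this, which is exactly the job the canonical tag performs for search engines.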

Keep in mind that the canonical link is a suggestion, not a directive. Search engines can and will ignore the canonical if other factors tip the scales.

For more strict control over indexing, use the meta robots tag and 301 redirects.

Google Search Console is the number one tool for diagnosing indexing. GSC will show you, amongst many other things:

  • The pages Google has crawled and discovered.
  • Whether or not they are indexed, and why.

It will also offer holistic reports updated with each crawl of your site.

GSC Indexing report

Guide Crawlers and Users With an Efficient URL Structure

The URL structure is a map of how pages are organized on your website. The better the map the easier it is for crawlers and visitors to get around the site and find the content they’re looking for. 

Keyword-optimized URLs matter for ranking, while human-readable URLs enhance user experience. It’s not difficult to achieve both with just a little bit of planning and strategic keyword targeting.
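The keyword-friendly part of a URL (the slug) is usually generated from the page title. Platforms like Shopify and WooCommerce do this for you, but the convention itself is simple. A sketch:

```python
import re

# Turn a product or category name into a clean, keyword-friendly URL slug.
# These rules are a common convention, not a platform requirement.
def slugify(name: str) -> str:
    slug = name.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
    return slug.strip("-")

print(slugify("Burgundy Italian Flannel Bedford Jacket"))
# burgundy-italian-flannel-bedford-jacket
```

Lowercase, hyphens, no special characters: that combination keeps URLs readable for humans and unambiguous for crawlers.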

For ecommerce stores, it is especially important to map out your URL structure from the start.

As your business scales up and you add more products to the catalog, the number of URLs can increase greatly. The more URLs you have, the harder they become to manage, and the more difficult (and risky) it is to reorganize an already developed store.

If you find yourself five years into your business without having done this, don’t worry, it’s fixable. It’ll involve auditing your content, redesigning your URL structure, implementing 301 redirects, and migrating internal links.

URL migration is not impossible, but it’s difficult to get right and you only have one chance. This is why you should hire a team like ours to smooth out any ecommerce SEO issues you might have.

The typical URL structure for ecommerce stores is to use hierarchical category URLs, but independent product URLs, like so: 

  • domain.com
  • domain.com/shop
  • domain.com/shop/category
  • domain.com/shop/category/subcategory
  • domain.com/product
  • domain.com/product?variant=2

This allows you to be flexible with your products, assigning them in and out of categories as your needs arise. 

This structure is used by most popular ecommerce platforms like WooCommerce and Shopify. For the most part, these platforms will sort out your URL structure, but you can’t leave it to chance. Plus, optimizing individual URLs is down to you. 

I’ve written a dedicated article about ecommerce URL structures. I recommend reading it next if you’d like to know more about how these structures work.

Use Structured Data To Make Your Site Stand Out in Search

Structured data or schema is a way to describe the contents of your page in a way that search engines can comprehend without any doubt. 

For ecommerce searches, structured data will help you get visual enhancements in the search engine results page, like:

  • Stars
  • Prices
  • Product images
  • Shipping cost, and delivery estimates
Rich snippets in Google's Search Engine Results Page
Stars and reviews, images, pricing and delivery information help you stand out in the SERP and drive more qualified traffic!

Even better, Google can include your products in various shopping panels at the top of the SERP. 

These Merchant Listings offer greatly improved visibility and click-through rates, compared to normal search snippets. 

Merchant Listing Experiences in Google Search
You can’t even see the regular results without scrolling past one or two of these panels!

If you’re running an ecommerce store, you just can’t afford to not implement structured data on your product pages.

This is how structured data looks in JSON format when you manually implement it on a product page:

<script type="application/ld+json">
{
	"@context": "http://schema.org/",
	"@type": "Product",
	"name": "SEO Competition Blocker",
    "brand":"Digital Commerce Partners",
    "description":"The SEO Competition Blocker effectively removes your top 3 competitors from the search engine results page. Requires a daily offering to the Google God to maintain effectiveness.",
     "image": "https://digitalcommerce.com/wp-content/uploads/2024/04/competition-blocker.jpg",
     "gtin14":"1845678901001",
	"aggregateRating": {
		"@type": "AggregateRating",
		"ratingValue" : "4.3",
		"ratingCount" : "17",
		"worstRating" : "1",
		"bestRating" : "5"
	},
  	"offers": {
		"@type": "Offer",
		"priceSpecification": {
          "@type": "UnitPriceSpecification",
          "price": 18.10,
          "priceCurrency": "USD",
          "referenceQuantity": {
            "@type": "QuantitativeValue",
            "value": "42",
            "unitCode": "LB",
            "valueReference": {
              "@type": "QuantitativeValue",
              "value": "1",
              "unitCode": "LB"
            }
          }
        },
		"url": "https://digitalcommercepartners.com/seo-competition-blocker/",
		"itemCondition": "https://schema.org/NewCondition",
        "availability": "https://schema.org/InStock",
        "shippingDetails": {
          "@type": "OfferShippingDetails",
          "shippingRate": {
            "@type": "MonetaryAmount",
            "value": 4.99,
            "currency": "USD"
          },
          "shippingDestination": {
            "@type": "DefinedRegion",
            "addressCountry": "US"
          },
          "deliveryTime": {
            "@type": "ShippingDeliveryTime",
            "handlingTime": {
              "@type": "QuantitativeValue",
              "minValue": 0,
              "maxValue": 1,
              "unitCode": "DAY"
            },
            "transitTime": {
              "@type": "QuantitativeValue",
              "minValue": 1,
              "maxValue": 3,
              "unitCode": "DAY"
            }
          }
        },
        "hasMerchantReturnPolicy": {
          "@type": "MerchantReturnPolicy",
          "applicableCountry": "US",
          "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow",
          "merchantReturnDays": 30,
          "returnMethod": "https://schema.org/ReturnByMail",
          "returnFees": "https://schema.org/FreeReturn"
         }
	}
}
</script>

But when talking about an entire ecommerce store, you would be looking for an automatic solution. 

Most ecommerce platforms have plugins or extensions available to generate product schema automatically. The better tools are often paid ones, but the investment is well worth it.
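Whether the schema is hand-written or plugin-generated, it pays to sanity-check it. A sketch that checks a Product JSON-LD blob for some commonly expected fields; the REQUIRED set here is illustrative, not Google’s official requirements list:

```python
import json

# Illustrative set of fields product rich results generally rely on.
REQUIRED = {"name", "image", "offers"}

def missing_product_fields(jsonld: str) -> list[str]:
    data = json.loads(jsonld)
    if data.get("@type") != "Product":
        raise ValueError("Not a Product schema")
    return sorted(REQUIRED - data.keys())

snippet = '{"@context": "http://schema.org/", "@type": "Product", "name": "SEO Competition Blocker"}'
print(missing_product_fields(snippet))  # ['image', 'offers']
```

For the authoritative verdict, always run the page through the Rich Results Tester; a check like this is just a cheap first pass you can automate across thousands of products.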

You can use Google’s Rich Results Tester to analyze any page and diagnose schema problems. 

Rich Results Tester - Auditing a Printfresh product page

Google Search Console will also give you reports about your structured data and merchant listings. 

If you’re new to ecommerce schema, I recommend reading my article, where you’ll learn all the ins and outs of effective structured data implementation. 

HTTPS or Die. Make No Compromise with Site Security.

Website security is critical for all websites, but especially ecommerce.

Customers are making transactions and handing over sensitive payment details. A good way to ruin the sale, ensure the customer never returns, and tell all their friends not to shop with you, is to leave gaps in your security. 

HTTPS is a confirmed ranking factor and most modern browsers will throw warnings if your website loads without it. 

Thankfully, it’s largely a solved problem. Pretty much every modern platform and hosting company offers HTTPS as a standard feature. 

That said, you should be vigilant and monitor for any odd HTTP URLs appearing in Google Search Console or a crawl of your site.
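One cheap way to stay vigilant is to filter your crawl’s URL list for the plain http:// scheme. A trivial sketch with made-up URLs:

```python
# Flag plain-HTTP URLs in a crawl export (e.g. a Screaming Frog URL list)
# so insecure stragglers don't go unnoticed.
def http_urls(urls: list[str]) -> list[str]:
    return [u for u in urls if u.startswith("http://")]

crawl = [
    "https://example.com/",
    "http://example.com/old-landing-page",  # should be redirected to HTTPS
    "https://example.com/shop",
]
print(http_urls(crawl))  # ['http://example.com/old-landing-page']
```

Anything this turns up should be 301-redirected to its HTTPS counterpart, and the internal links pointing at it updated.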

Mobile Is the Norm. Make Sure Your Site Adapts.

Mobile traffic can comprise upwards of 80% of your entire traffic. Your Analytics graph probably looks something like this. 

Analytics traffic breakdown by device type

Here’s another fact: Since July 5, 2024, the primary crawler used for indexing and ranking is Googlebot Smartphone.

With so many shoppers browsing your store on their phones, you simply cannot afford a website design that doesn’t adapt to different screens and devices. This means:

  • Layouts that rearrange content to eliminate horizontal scrolling or clipping of content
  • Font sizes and spacings that maximize real estate on small screens but remain readable
  • Images, videos, and tables that resize to fit on the screen
  • Buttons and other clickable elements that remain prominent and easy to use
  • Popups and CTAs that scale correctly and allow users to get back to reading the page
AED Advantage product page mobile
Proper cloth Product page mobile

Responsive design has been the norm for quite a while. Most modern ecommerce platforms like Shopify, WooCommerce, Wix, Squarespace, etc. are already designed to be responsive and accessible on all devices.

Themes often need some fine-tuning to achieve the best results, but they’re still all responsive for the most part.

What’s still often lacking is true performance (read: page speed) optimization for mobile. Websites frequently load twice as slowly on mobile devices (and connections) as they do on desktops.

Which brings me to the next point:

Optimize Your Site Speed or Lose Money!

Slow sites offer poor experience and Google’s entire empire revolves around providing accurate and FAST search experiences. Thus, page speed is a direct ranking factor. 

Even more important, fast sites result in a conversion rate 3x higher than slow sites.

Google provides a comprehensive and free testing tool to analyze your website and get recommendations for improving page speed and user experience — Page Speed Insights.

Page Speed Insights Audit

Page Speed Insights is also incorporated into Google Search Console, providing holistic Core Web Vitals and page experience reports. 

Page Experience Report in Google Search Console

What Influences Page Speed?

The HTML and CSS, which comprise most of the visible content, usually load in an instant. So what else influences page speed?

  • Hosting setup. How fast the server responds to browser requests, and how it provides information and resources is the foundation for good page load speed. Technologies like caching, CDN, and the general performance of a hosting service can make or break your site.
  • Content management system. Site builders are not equal. Some platforms have better architecture and code quality than others, allowing for more streamlined, faster-loading websites. 
  • External resources. Tracking tools, email marketing software, site functions — these modern features require a connection to 3rd party sites, as well as processing time to complete various functions.
  • Images and videos. These are the largest files on the page. Their file size defines the overall payload transferred across the network. Smaller file size = faster download.

What Can You Do About the Loading Speed of Your Ecommerce Store?

Not a whole lot if your store is running on Shopify, Squarespace, or one of the other full-suite ecommerce platforms. These platforms handle the entire technical stack — hosting, content management system, scripts, and technology. You can choose your theme, which does matter, but the underlying architecture is still the same.

Quite a bit if your store is running on WooCommerce/WordPress. As an open-source system, it lets you control the hosting, theme, plugins, and which 3rd party scripts are integrated. For the technically adept, that provides opportunity. For everyone else, it can be a significant challenge to get right, with the risk of performing even worse.

Regardless of the platform, you can always ensure that images and videos are properly optimized before uploading them to your site. 

This involves picking the right resolution for the application and using modern image formats like WebP that offer superior file sizes with comparable quality to PNG and JPEG. 
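To get a feel for why file size dominates mobile load times, here’s a back-of-the-envelope sketch. The file sizes and connection speed are illustrative, not measured data:

```python
# Estimate raw transfer time for a media payload at a given connection speed.
def transfer_seconds(total_kb: float, mbps: float) -> float:
    bits = total_kb * 1024 * 8
    return bits / (mbps * 1_000_000)

hero_jpeg_kb, hero_webp_kb = 850, 320  # hypothetical savings from WebP conversion
print(round(transfer_seconds(hero_jpeg_kb, 4), 2))  # 1.74 s on a 4 Mbps mobile link
print(round(transfer_seconds(hero_webp_kb, 4), 2))  # 0.66 s
```

Over a product page with a dozen images, savings like that add up to the difference between the 1-second and 5-second sites from the Portent study above.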

Tools like CloudConvert and TinyPNG are your friends. Most platforms usually have plugins/apps that can optimize your existing image library automatically. 

Beyond that, accept that you probably won’t get a perfect score. Try to pass Core Web Vitals if possible, and make sure you keep up with your competitors. But don’t take it too hard if Page Speed Insights throws a bunch of red flags at you.

The boom of SaaS platforms, website builders, plugins, and 3rd party software allows everyone to build a stunning ecommerce store with all the bells and whistles you can wish for. 

However, that inevitably makes websites heavy and slow. 

Don’t Get Left on the Side of the Road. Get Your Technical SEO in Check.

Well, this has been some ride, huh? We’ve only scratched the surface of technical SEO, but I’m afraid this is the end of the road, at least for this article. 

Let’s recap all of the ways technical SEO can kill your organic performance: 

  • Crawling is the main method search engines browse the web and discover content. If they can’t crawl your site, they can’t index your pages.
  • Indexing is the collection and management of all web resources available for the ranking algorithm. If your site is not indexed, it can’t compete in search.
  • Site security, mobile responsiveness, and page load speed are all major factors that move the needle on the ranking algorithm, but also directly impact user experience — therefore sales.

Here is your arsenal for technical SEO. Use these tools to stay on top of your site health and diagnose problems:

  • Screaming Frog. For diagnosing the technical state of your website and onpage optimization.
  • Google Search Console. An all-in-one dashboard for your search performance and website health.
  • Rich Results Tester. For analyzing and working with structured data.
  • Page Speed Insights. For diagnosing page speed issues.
  • Semrush or Ahrefs. Both marketing SaaS platforms offer a comprehensive site audit feature that crawls your site and provides insight into its technical health and how to improve.

And remember – conduct regular SEO audits! Between our weekly SEO checks, monthly reports, and specialized audits we keep a close eye on our clients’ websites and our own.

For every new client, we perform specialized audits like our Page Mapping Audit and Website Health Audit to set a baseline for the site. We prioritize and work through the most severe technical issues to unlock maximum performance as soon as possible. 

If you’re interested in our technical SEO services, explore our full range of SEO services or contact us.

Aleksandar Stoyanov

Get more leads with less effort.

If you want a steady flow of targeted leads, we’ve got a proven process for driving organic traffic and converting it into qualified leads.