.htaccess File
A configuration file for Apache web servers that controls URL redirects, access permissions, and other server behaviors. The .htaccess file is commonly used for implementing 301 redirects and managing URL rewriting rules.
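A minimal sketch of what such rules can look like (the paths and domain are hypothetical, and `mod_rewrite` must be enabled for the second rule):

```apache
# Permanently redirect a single moved page (301)
Redirect 301 /old-page/ https://www.example.com/new-page/

# Rewrite-based redirect for an entire directory (requires mod_rewrite)
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```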
2xx Status Codes
HTTP response codes in the 200 range indicating successful requests. The most common is 200 OK, confirming the server delivered the requested page successfully.
301 Redirect
A permanent server-side redirect that passes nearly all link equity from the original URL to the destination. Essential for preserving SEO value during site migrations, URL changes, and domain consolidations.
302 Redirect
A temporary redirect indicating a page has moved temporarily. Unlike 301 redirects, search engines may continue indexing the original URL and may not transfer full link equity to the destination.
404 Error
An HTTP status code indicating the requested page cannot be found on the server. Excessive 404 errors can waste crawl budget and create poor user experiences if not properly managed with redirects or custom error pages.
410 Gone
An HTTP status code indicating a page has been permanently removed with no forwarding address. Unlike 404, it explicitly signals to search engines that the content will not return, prompting faster de-indexing.
4xx Status Codes
HTTP response codes in the 400 range indicating client-side errors. Common examples include 401 Unauthorized, 403 Forbidden, and 404 Not Found. Monitoring these codes helps identify broken links and access issues.
5xx Status Codes
HTTP response codes in the 500 range indicating server-side errors. These codes signal that the server failed to fulfill a valid request, potentially blocking crawlers from accessing and indexing content.
AJAX
Asynchronous JavaScript and XML — a technique for loading content dynamically without full page reloads. AJAX-heavy sites can create crawling challenges if search engines cannot execute the JavaScript needed to render content.
AMP (Accelerated Mobile Pages)
A Google-backed framework designed to create fast-loading mobile web pages using stripped-down HTML. While AMP adoption has declined as Core Web Vitals became the primary speed benchmark, some publishers still use it.
API
Application Programming Interface — a set of protocols enabling different software systems to communicate. SEO tools use APIs to pull data from search engines, analytics platforms, and other services programmatically.
Caching
Storing copies of web content for faster delivery on subsequent requests. Effective caching strategies improve page load speeds, reduce server load, and contribute to better Core Web Vitals scores.
Canonical URL
The preferred URL that search engines should index when multiple URLs serve the same or similar content. Setting canonical URLs correctly prevents dilution of ranking signals across duplicate pages.
Click Depth
The number of clicks required to reach a specific page from the homepage. Pages with shallow click depth are crawled more frequently and tend to receive more link equity, making site architecture a critical SEO factor.
Client-Side and Server-Side Rendering
Two approaches to generating web page HTML. Server-side rendering produces complete HTML on the server for easy crawling, while client-side rendering builds pages in the browser with JavaScript, which can create indexing challenges.
Core Web Vitals
Google's set of user experience metrics measuring loading performance (LCP), interactivity (INP), and visual stability (CLS). Core Web Vitals are confirmed ranking signals and essential benchmarks for technical SEO.
Crawl Budget
The number of pages a search engine crawler will visit on a site within a given timeframe. Managing crawl budget is critical for large sites to ensure important pages are discovered and indexed efficiently.
Crawler Traps
Website structures that cause search engine crawlers to get stuck in infinite loops or waste crawl budget on low-value pages. Common traps include infinite calendars, faceted navigation, and session-based URLs.
Critical Rendering Path
The sequence of steps a browser takes to convert HTML, CSS, and JavaScript into rendered pixels on screen. Optimizing the critical rendering path reduces time to first meaningful paint and improves page speed metrics.
CSS
Cascading Style Sheets — the language used to control the visual presentation of web pages. CSS optimization impacts page load speed, and render-blocking CSS can delay content visibility to both users and search engines.
De-Index
The removal of a page or site from a search engine's index, making it no longer appear in search results. De-indexing can occur through manual penalties, noindex tags, or technical misconfigurations.
DNS
Domain Name System — the internet's phonebook that translates domain names into IP addresses. DNS configuration affects site accessibility, page load speed, and can impact crawling if resolution is slow or misconfigured.
Do-Follow
The default state of a hyperlink that allows search engines to follow it and pass link equity to the destination page. Do-follow links are the primary mechanism through which PageRank and authority flow between websites.
DOM
Document Object Model — a programming interface representing HTML documents as a tree structure. Search engines interact with the DOM to understand page content, making DOM rendering critical for JavaScript-heavy websites.
Dwell Time
The amount of time a user spends on a page before returning to search results. Longer dwell times can indicate content that effectively satisfies search intent, though Google has not confirmed it as a direct ranking factor.
Dynamic URL
A URL generated dynamically based on database queries, typically containing parameters like question marks and ampersands. Dynamic URLs can create crawling challenges and duplicate content issues if not properly managed.
Edge SEO
Implementing SEO changes at the CDN or edge server level rather than modifying the origin server. Edge SEO enables rapid deployment of redirects, header modifications, and rendering optimizations without backend development cycles.
File Compression
Reducing file sizes through encoding techniques like Gzip or Brotli to speed up data transfer between servers and browsers. Compressing text assets such as HTML, CSS, and JavaScript typically shrinks them by roughly 60-80%, directly improving load times and Core Web Vitals.
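In practice compression is configured at the server or CDN level, but the effect is easy to demonstrate with Python's standard `gzip` module on a repetitive, HTML-like payload (the payload here is purely illustrative):

```python
import gzip

# A repetitive text payload, standing in for an HTML page (illustrative only)
html = b"<p>Hello, technical SEO!</p>" * 200

compressed = gzip.compress(html)
ratio = 1 - len(compressed) / len(html)

print(f"Original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
print(f"Reduction: {ratio:.0%}")
```

Repetitive markup compresses especially well; real-world pages vary, which is why the typical figure is a range rather than a constant.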
Findability
How easily users and search engines can discover content on a website. Findability depends on site architecture, internal linking, navigation design, and proper indexation of important pages.
Follow
The default directive for links, indicating search engines should crawl the linked page and pass link equity. A followed link transfers ranking signals from the source page to the destination.
Google Business Profile
Google's free tool for managing how a business appears in Google Search and Maps. An optimized Google Business Profile is essential for local SEO, influencing local pack rankings and providing rich business information directly in SERPs.
Hreflang
An HTML attribute that tells search engines which language and geographic region a page is intended for. Hreflang tags prevent duplicate content issues across multilingual sites and ensure users see the correct regional version.
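A sketch of hreflang annotations in a page's `<head>`, with hypothetical URLs; each language version should list all alternates, including itself, and `x-default` marks the fallback:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```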
HTTP
HyperText Transfer Protocol — the foundational protocol for data transfer on the web. HTTP defines how messages are formatted and transmitted between web servers and browsers.
HTTPS
The secure version of HTTP that encrypts data between browser and server using SSL/TLS certificates. HTTPS is a confirmed Google ranking signal, and sites without it may display browser security warnings that deter visitors.
Index
The database where search engines store information about web pages they have crawled and processed. Only pages included in the index can appear in search results, making indexation a prerequisite for organic visibility.
Index Bloat
When a search engine indexes far more pages than a site intends, including low-value or duplicate pages. Index bloat dilutes crawl budget and overall site quality signals, potentially depressing rankings for important pages.
Index Coverage Report
A Google Search Console report showing which pages are indexed, excluded, or experiencing errors. This report is essential for diagnosing indexation issues and ensuring important content appears in search results.
Indexability
Whether a page meets the technical requirements for search engines to include it in their index. Factors affecting indexability include noindex tags, canonical signals, crawl accessibility, and content quality thresholds.
Indexed Page
A web page that has been crawled, processed, and added to a search engine's database. Only indexed pages can appear in search results, and the site: search operator can verify a page's index status.
Indexing
The process by which search engines analyze crawled pages and store them in their database for retrieval. Indexing involves parsing content, evaluating quality, and organizing information for efficient search result generation.
Information Architecture
The structural organization of a website's content, including hierarchy, navigation, and URL patterns. Strong information architecture improves crawlability, distributes link equity efficiently, and helps users find content intuitively.
Information Retrieval
The science of searching for and extracting relevant information from large datasets. Search engines are fundamentally information retrieval systems, using algorithms to match queries with the most relevant documents.
Interaction to Next Paint
A Core Web Vitals metric measuring page responsiveness as the latency from a user interaction to the next frame painted, reported as close to the worst interaction observed during the visit. INP replaced First Input Delay as the primary interactivity metric in March 2024.
IP Address
A unique numerical identifier assigned to every device connected to the internet. IP addresses are relevant to SEO for server location signals, CDN configuration, and identifying potentially manipulative link networks.
JavaScript
A programming language that enables dynamic, interactive web content. JavaScript-heavy sites can face SEO challenges because search engine crawlers may not fully render JS content, potentially leaving important information unindexed.
JSON-LD
JavaScript Object Notation for Linked Data — Google's recommended format for implementing structured data markup. JSON-LD allows you to add rich data about entities, products, and content without modifying the visible HTML.
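A minimal JSON-LD block, placed anywhere in the HTML; the schema type and values below are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15"
}
</script>
```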
Latent Semantic Indexing
An information retrieval method that uses statistical patterns to identify the relationships between terms and concepts. While Google doesn't use LSI directly, the concept influenced modern semantic search capabilities.
Lazy Loading
A technique that defers loading of images and other resources until they are needed — typically when they enter the viewport. Lazy loading improves initial page load performance but must be implemented carefully to ensure search engines can access all content.
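Modern browsers support native lazy loading via a single attribute (the image path is hypothetical):

```html
<!-- The browser defers the fetch until the image nears the viewport -->
<img src="/images/chart.png" alt="Traffic growth chart"
     loading="lazy" width="800" height="450" />
```

As a rule of thumb, avoid lazy loading above-the-fold images, since deferring the largest visible image can worsen LCP.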
Link Profile
The complete collection of backlinks pointing to a website, including their sources, anchor text, link attributes, and quality distribution. A healthy, diverse link profile signals genuine authority to search engines.
Linked Unstructured Citations
Mentions of a business's name, address, or phone number on web pages that include a link back to the business website. These citations combine the trust signals of NAP consistency with the ranking value of backlinks.
Local Business Schema
Structured data markup specifically designed for local businesses that provides search engines with details like address, hours, phone number, and service areas. Local business schema enhances visibility in local search results and map packs.
Log File
A server-generated record of all requests made to a website, including those from search engine crawlers. Log files provide the most accurate data on how search engines actually interact with your site.
Log File Analysis
The process of examining server log files to understand how search engine bots crawl a website. Log file analysis reveals crawl frequency, crawl budget allocation, and potential issues that aren't visible in standard SEO tools.
Minification
Removing unnecessary characters from code files — whitespace, comments, and line breaks — without changing functionality. Minifying HTML, CSS, and JavaScript reduces file sizes and improves page load performance.
Mobile-First Indexing
Google's approach of using the mobile version of a page's content for indexing and ranking. Since Google predominantly crawls with a mobile user agent, sites must ensure their mobile experience contains all critical content and functionality.
Noopener
A link attribute that prevents a newly opened page from accessing the original page's window object. Noopener improves security and performance for links that open in new tabs.
Noreferrer
A link attribute that prevents the browser from sending the referring page's URL to the destination site. Noreferrer provides privacy but also means the destination won't see referral traffic data in their analytics.
PHP
A server-side programming language widely used for web development. PHP powers platforms like WordPress and can impact SEO through server response times, URL generation, and how content is dynamically rendered.
Redirect
A server instruction that automatically sends users and search engines from one URL to another. Proper redirect implementation preserves link equity, prevents broken experiences, and is essential during site migrations and URL changes.
Redirection
The process of forwarding one URL to another. Redirection strategies are critical during site redesigns, domain changes, and content consolidation to maintain search equity and prevent traffic loss.
Referrer
The URL of the page that linked to the current page, sent as an HTTP header when users follow links. Referrer data helps track traffic sources but can be restricted by noreferrer attributes and browser privacy settings.
Rel Canonical
An HTML link element specifying the preferred URL for a page when duplicate versions exist. The rel=canonical tag consolidates ranking signals to a single URL and is one of the most important technical SEO implementations.
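The element goes in the `<head>` of each duplicate variant and points at the preferred URL (hypothetical here):

```html
<link rel="canonical" href="https://example.com/preferred-page/" />
```

Canonical tags generally work best with absolute URLs and a single, self-consistent canonical per page.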
Relative URL
A URL path that doesn't include the full domain, relying on the current page's context to resolve. While relative URLs work for internal links, absolute URLs are generally preferred for canonical tags and structured data.
Render-Blocking Scripts
JavaScript and CSS files that must be loaded and processed before a page can render, delaying visual content display. Eliminating or deferring render-blocking resources is a key performance optimization for improving Core Web Vitals.
Rendering
The process of converting HTML, CSS, and JavaScript code into the visual page that users see. Search engines must render pages to understand JavaScript-generated content, creating a second wave of processing beyond initial crawling.
Robots.txt
A text file in a website's root directory that instructs search engine crawlers which pages or sections to crawl or avoid. Robots.txt is a critical tool for managing crawl budget and preventing indexation of low-value pages.
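A small robots.txt sketch with hypothetical paths; note that robots.txt controls crawling, not indexing, so blocked URLs can still be indexed if they are linked elsewhere:

```
User-agent: *
Disallow: /cart/
Disallow: /search

Sitemap: https://example.com/sitemap.xml
```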
Schema Markup
A standardized vocabulary of tags added to HTML that helps search engines understand content meaning. Schema markup enables rich results, knowledge panels, and other enhanced SERP features that improve click-through rates.
Scrape
Extracting data from websites using automated tools. While scraping has legitimate uses in SEO research and competitive analysis, unauthorized scraping can violate terms of service and copyright protections.
Secure Sockets Layer
A security protocol (now largely replaced by TLS) that encrypts data between web servers and browsers. SSL/TLS encryption, indicated by HTTPS, is a confirmed Google ranking signal and a web security standard.
Server Log Analysis
Examining server access logs to understand how search engine crawlers interact with a website. Server log analysis reveals actual crawl behavior, crawl frequency patterns, and technical issues not visible through standard SEO tools.
Sitemap
A file that lists all important pages on a website to help search engines discover and crawl content efficiently. XML sitemaps are submitted through search console platforms and are especially valuable for large or complex sites.
Srcset
An HTML attribute that specifies multiple image sources for different screen sizes and resolutions. Using srcset enables responsive images that load optimally sized files, improving performance on mobile devices.
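A sketch of a responsive image using srcset and sizes (file paths are hypothetical); the browser picks the smallest candidate that satisfies the layout width:

```html
<img src="/img/hero-800.jpg"
     srcset="/img/hero-400.jpg 400w,
             /img/hero-800.jpg 800w,
             /img/hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hero image" />
```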
SSL Certificate
A digital certificate that authenticates a website's identity and enables encrypted connections. SSL certificates are required for HTTPS, a confirmed Google ranking factor, and modern browsers flag sites without them as insecure.
Status Codes
Three-digit HTTP response codes indicating the outcome of a server request. Understanding status codes (2xx success, 3xx redirects, 4xx client errors, 5xx server errors) is fundamental to technical SEO troubleshooting.
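The class of a status code is simply its leading digit, which makes triage easy to automate; a minimal Python sketch:

```python
def status_class(code: int) -> str:
    """Map an HTTP status code to its broad class for triage."""
    classes = {2: "success", 3: "redirect", 4: "client error", 5: "server error"}
    return classes.get(code // 100, "other")

for code in (200, 301, 404, 503):
    print(code, status_class(code))
```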
Structured Data
Standardized code formats that help search engines understand and categorize page content. Implementing structured data through schema markup enables rich results, knowledge panels, and enhanced SERP features.
Taxonomy
The classification system used to organize website content into categories, tags, and hierarchical groupings. Well-structured taxonomies improve navigation, internal linking, and help search engines understand content relationships.
Transport Layer Security
The modern successor to SSL, providing encrypted communication between web servers and browsers. TLS powers HTTPS connections and is essential for website security, user trust, and maintaining ranking eligibility.
URL Folders
Subdirectory segments within a URL path that organize content hierarchically, such as /blog/category/post-title. URL folder structure communicates site architecture to search engines and affects how authority flows between content sections.
URL Parameter
Query strings appended to URLs using ? and & characters that modify page content or tracking. URL parameters can create duplicate content and crawl waste if search engines index multiple parameter combinations of the same content.
URL Slug
The human-readable portion of a URL that identifies a specific page, typically appearing after the domain and folder path. Optimized URL slugs are concise, descriptive, include target keywords, and use hyphens to separate words.
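A simple slug generator following those conventions can be sketched in a few lines of Python (the `slugify` helper is illustrative, not a standard library function):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a URL slug: lowercase, hyphen-separated."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
    return slug.strip("-")

print(slugify("10 Technical SEO Tips (2024 Edition)"))
# → 10-technical-seo-tips-2024-edition
```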
UTM Code
Urchin Tracking Module parameters appended to URLs that track campaign performance in analytics platforms. UTM codes help attribute traffic to specific marketing channels and campaigns without affecting SEO.
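UTM parameters are ordinary query-string parameters, so they can be extracted with Python's standard `urllib.parse` (the URL below is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

url = ("https://example.com/landing/"
       "?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale")

params = parse_qs(urlparse(url).query)
utm = {k: v[0] for k, v in params.items() if k.startswith("utm_")}
print(utm)
# → {'utm_source': 'newsletter', 'utm_medium': 'email', 'utm_campaign': 'spring-sale'}
```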
XML
Extensible Markup Language — a format for structuring and transporting data. In SEO, XML is primarily used for sitemaps that help search engines discover and understand a website's page inventory.
XML Sitemap
An XML file listing all important URLs on a website that search engines should crawl and index. XML sitemaps can include metadata about each URL, such as last modification date, change frequency, and priority level.
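A minimal XML sitemap sketch with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Of the optional fields, `lastmod` is the most useful in practice, and only when it reflects genuine content changes.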