SEO Best Practices

Search engine optimization (SEO) refers to the strategies you employ to push your site to the top of organic (free) search engine results. There is a lot of information already written on the subject, and nowadays you can easily find a multitude of websites, books, videos, and even entire careers devoted to SEO.

Many SEO best practices center on placing keywords in your titles, headers, and body tags, while at the same time creating quality content that naturally attracts visitors and creates external links to your site. You should definitely learn about and employ these content-driven strategies when designing your site, but you should also take advantage of the performance-driven SEO benefits that the platform provides. This topic details design guidelines that you can easily implement to help you achieve the best possible SEO for your site, such as minimizing page load times, creating beneficial URL structures, and using content-rendering techniques that are friendly to search engines.

Minimize Page Load Times

Over 50% of shoppers leave a website if it takes longer than four seconds to load. Unified Commerce storefronts load faster than those that run on other platforms, translating to fewer bounces and higher sales.

Faster websites also enjoy higher placement in search results. Lucky for you, the platform handles many aspects of page load performance. However, you should still benchmark the pages on your site to make sure there are no bottlenecks. When designing your theme, refer to the theme performance guidelines to create a fast frontend experience.

Render SEO Content Through Hypr Instead of JavaScript

For the most part, search engine crawlers do not execute JavaScript. Therefore, if a page on your site renders content mostly through JavaScript, it is safe to assume it is not optimized for SEO. This might be a perfectly acceptable strategy if you are not targeting a particular page's content for SEO. However, if the page is rendering high-value SEO content such as product and category information or important navigational links, you need search engine crawlers to be able to index this information.

An easy way to determine what search engine crawlers can discover on your site is to turn JavaScript off or use your browser's developer tools to enable a feature with a similar effect. If any of your pages are using JavaScript to render SEO content, refactor the pages to use Hypr instead. Hypr includes many tags and filters that provide powerful rendering functionality without sacrificing SEO performance. Using Hypr, you can avoid nested calls to the API, complicated data transformation in JavaScript, and unnecessary view updates. For example, Hypr gives you access to the:

  • include_products tag, which retrieves a list of products from the catalog using inline API calls. The include_documents and include_entities tags are also available for use.
  • find_where filter, which allows you to use logic to search for items in lists or objects.
  • dictsort filter, which allows you to sort a list of objects based on a common property.
  • for and if tags, which allow you to use conditional and looping control flow in your templates.
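To illustrate how these pieces fit together (tag and parameter syntax here is approximate; the Hypr Templating System topic documents the exact usage), a template might render a sorted product list server-side so that crawlers can index it without executing any JavaScript:

```html
{# Sketch only: the include_products parameters and property names are assumptions. #}
{% include_products products %}

<ul>
  {% for product in products|dictsort("price") %}
    {% if product.price %}
      <li>
        <a href="/{{ product.slug }}/p/{{ product.productCode }}">{{ product.name }}</a>
      </li>
    {% endif %}
  {% endfor %}
</ul>
```

Because the markup above is rendered on the server, the product names and links appear in the initial HTML response that search engine crawlers receive.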

To learn more about Hypr, refer to the Hypr Templating System topic.

Specify SEO Data in Admin Pages

Within Admin, you can specify the following SEO data on certain pages:

  • Meta Title: Maps to the HTML meta title tag. While most search engines place little value on this tag, most themes inject the value of the meta title tag into the HTML title tag. The HTML title tag is used by search engines and browsers to display the title of the page, and is critical to SEO. Search engines place very high importance on the correlation between a page's title tag and its content.
  • Slug: The default URL structure for products is {slug}/p/{productCode} and for categories is {slug}/c/{categoryId}. The slug (or SEO-friendly URL) gives you the ability to add a meaningful component to the URL structure in order to boost search results.
  • Meta Description: Maps to the HTML meta description tag, which search engines use to summarize the content of the page.
  • Meta Keywords: Maps to the HTML meta keywords tag, which is used to tell search engines what the page is about. From an SEO perspective, search engines place little value on this tag, but the search implementation uses these keywords to help construct search results for pages on your storefront.
  • H1 / H2 Headers: Search engine crawlers use HTML headers to help index your pages. You can specify most headers on your pages through the Site Builder editor page settings or through available header or HTML widgets.
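Assuming typical theme markup (all values below are hypothetical), these Admin fields surface in the rendered page's head roughly as follows:

```html
<head>
  <!-- Meta Title: most themes inject this value into the HTML title tag -->
  <title>Women's Designer Tops | Example Store</title>
  <!-- Meta Description: used by search engines to summarize the page -->
  <meta name="description" content="Shop our collection of designer tops for women.">
  <!-- Meta Keywords: low SEO value, but used by the storefront search implementation -->
  <meta name="keywords" content="womens tops, designer tops">
</head>
```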

You can specify this SEO data on several Admin pages, including product and category pages.

Customize your Theme to Expose Additional SEO Settings

If you want to expose additional SEO settings to Admin users beyond those available by default, you can customize theme.json and theme-ui.json.
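As a sketch of the idea (the field names and schema below are assumptions, not the documented shape; consult the theme.json and theme-ui.json references for the exact structure), you might define an editor control for a custom SEO setting so that Admin users can set it without editing theme files:

```json
{
  "settings": [
    {
      "id": "metaRobotsDefault",
      "type": "textbox",
      "label": "Default meta robots value"
    }
  ]
}
```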

Create Custom Routes

Custom routing allows you to display SEO-friendly URLs on your site that map behind-the-scenes to conventional resources such as a product page or a search results page.

Custom URL Structures for Better SEO

With custom routing, you gain advanced control over the URL structures on your site and can more visibly highlight the products or categories your shoppers are interested in purchasing.

For example, a category page for women’s tops from a certain designer might have a long, parameter-driven URL by default. For SEO reasons, you may prefer that the category page use a shorter, more descriptive URL instead. With custom routing, you can use the SEO-friendly URL and let Unified Commerce map it to the correct category page.
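As an illustration (both URLs below are hypothetical), such a mapping might look like:

```text
Conventional URL:   https://www.example.com/c/1234?designer=acme
SEO-friendly URL:   https://www.example.com/acme-womens-tops
```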

Apply the Canonical Tag

Custom routes let you apply the canonical tag to mark a URL structure as the preferred one for search engine crawlers to index and for shoppers to see in the URL bar.
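For instance, if the same category page is reachable under both URL structures, the preferred one (hypothetical here) can be declared in the page's head:

```html
<link rel="canonical" href="https://www.example.com/acme-womens-tops">
```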

Refer to Custom Route Settings to learn how to configure custom routes.

Upload a Robots.txt File

In some cases, you can boost your SEO by preventing search engine crawlers from indexing certain directories on your site. To accomplish this, Unified Commerce allows you to upload a robots.txt file through Admin General settings.

Examples of pages that you'd want to disallow search engine crawlers from indexing include pages that display duplicate content (multiple versions of one page negatively impact SEO) or pages that include sensitive content or content irrelevant to your shoppers (which has the added benefit of reducing bandwidth requirements). However, when editing your robots.txt file, make sure you do not commit syntax mistakes that cause search engine crawlers to ignore important pages on your site. You should also familiarize yourself with alternatives to the robots.txt file, such as the meta robot, noindex, nofollow, and canonical tags, and decide which implementation is better suited for your specific use case.
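Because a single syntax mistake can cause crawlers to ignore important rules, it can help to sanity-check a robots.txt draft before uploading it. The sketch below uses Python's standard urllib.robotparser to verify what a compliant crawler would be allowed to fetch (the disallowed paths are hypothetical examples):

```python
from urllib import robotparser

# A draft robots.txt that blocks cart and checkout pages (low-value,
# shopper-specific content) while leaving product pages crawlable.
# The paths here are hypothetical.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Product pages stay crawlable; cart pages are blocked.
print(rp.can_fetch("*", "https://example.com/shirts/p/SH-001"))  # True
print(rp.can_fetch("*", "https://example.com/cart/"))            # False
```

Running a check like this against each important URL pattern on your site helps confirm that your rules block only the pages you intend to hide.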

Upload a Custom Sitemap

In some cases, you can boost your SEO by uploading a custom sitemap different from the one generated by Unified Commerce. Sitemaps provide instructions to search engines on how to crawl your site. With a custom sitemap you can specify:

  • Which pages should be crawled.
  • The priority of content.
  • The date when the content was last updated.
  • A more scannable and keyword-heavy organization for content.
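Custom sitemaps follow the standard sitemaps.org protocol. A minimal entry (the URL and date below are hypothetical) that specifies priority and last-modified date looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/womens-tops/c/1234</loc>
    <lastmod>2024-01-15</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```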

You can view the sitemap that Unified Commerce currently generates for your site. To upload a custom sitemap, refer to the File and Image Management topic.