I love Squarespace. That may sound strange given the title of this post, but it's true. Of all the website builders out there (SITE123, Ucraft, Webflow, Webnode, Wix, and so on), Squarespace is my favorite to build a site on.
Alas, it just sucks when it comes to search engine optimization. And it breaks my heart.
While it has come a long way since its inception, there are still a few major areas where Squarespace falls short, and there don't seem to be any plans to address them any time soon.
Let’s explore what they are:
1. Canonicalization
When two or more versions of a site exist, it's not ideal for search engines, websites, or their users: crawlers waste their budget on duplicates, and any link equity a page earns gets split between its copies. This faux pas tends to emerge when canonicalization isn't established.
For example, the URL of a page can exist in many forms:
- https://seothesis.com/resources/
- https://www.seothesis.com/resources/
- https://seothesis.com/resources
- https://www.seothesis.com/resources
Whether there's a WWW or a trailing slash, all variants should 301-redirect to a single version and nothing more. Unfortunately, this isn't the case with Squarespace. A case can be made for relying on the rel=canonical tag instead (which is Squarespace's approach), but sadly, many SEO tools don't recognize these tags and will continually alert site owners to duplicate content issues that don't actually exist.
At the very least, it would be great if Squarespace gave users the option to decide which version gets canonicalized, rather than imposing a default.
In this instance, while it won’t cause major visibility issues, the lack of canonical control leads to frustration for analysts and webmasters alike.
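If you want to see how a given site handles this, here is a minimal Python sketch (it assumes the third-party requests library is installed, and uses the seothesis.com URLs above purely as placeholders) that reports whether each variant redirects permanently or resolves as a live duplicate:

```python
import requests

# URL variants from the example above; swap in your own domain.
VARIANTS = [
    "https://seothesis.com/resources/",
    "https://www.seothesis.com/resources/",
    "https://seothesis.com/resources",
    "https://www.seothesis.com/resources",
]

for url in VARIANTS:
    # allow_redirects=False so we see the raw response for each variant,
    # not the page it eventually lands on.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(f"{url} -> permanent redirect to {resp.headers.get('Location')}")
    elif resp.status_code in (302, 307):
        print(f"{url} -> TEMPORARY redirect to {resp.headers.get('Location')} (should be 301)")
    elif resp.status_code == 200:
        print(f"{url} -> 200 OK (a live duplicate unless rel=canonical covers it)")
    else:
        print(f"{url} -> {resp.status_code}")
```

On a fully canonicalized site, every variant but one should print a permanent redirect.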
2. Robots.txt + Sitemap.xml
Robots.txt is a file used to control which parts of a website search engine robots and other third-party crawlers are allowed to request.
Sitemap.xml is a file that lists the pages of a website (optionally with metadata such as last-modified dates) so search engine robots can discover and crawl them.
Both files help search engines understand and process a website with greater ease and accuracy. Common sense says a website owner should have full control over what they do and do not want search engines to crawl and index.
Regrettably, not with Squarespace.
It is not possible to edit your site's robots.txt or your sitemap.xml file. This is a tragedy, considering the sheer number of scenarios where alterations are absolutely necessary (keeping crawlers out of thin utility pages, for example, or pruning the sitemap to the URLs you actually want indexed). Both files are generated automatically, and while there is no clear explanation as to why, it quickly becomes problematic for those of us who want to optimize our sites for full visibility and crawl budget.
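Since the files can't be edited, the next best thing is auditing what gets generated for you. Here is a small sketch using only the Python standard library (seothesis.com again stands in for your own domain) that prints the live robots.txt and spot-checks the sitemap:

```python
import re
import urllib.request

DOMAIN = "https://seothesis.com"  # stand-in; use your own site

def fetch(path: str) -> str:
    with urllib.request.urlopen(DOMAIN + path, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Print the auto-generated robots.txt so you know exactly what
# crawlers are being told, since you can't change it yourself.
print(fetch("/robots.txt"))

# Count the URLs included in the sitemap. Note: on larger sites this
# file may be a sitemap index, in which case each <loc> points to a
# child sitemap rather than a page.
sitemap = fetch("/sitemap.xml")
urls = re.findall(r"<loc>(.*?)</loc>", sitemap)
print(f"{len(urls)} URLs in sitemap.xml")
for u in urls[:10]:  # first few, as a spot check
    print(" ", u)
```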
3. Structured Data
Structured data is simply a standardized format for providing information about a page and classifying its content, helping search engines better understand what's on the page. Schema.org is a collaborative vocabulary for structured data, and many website owners use it to mark up their site's information for search engines.
Unfortunately, Squarespace squanders much of the opportunity structured data presents. Pre-built schema is inserted into the code of every site, whether you opted for it or not. That by itself is not an issue; the problem is that the injected markup doesn't make use of the full range of types and properties available.
If you generate your own schema markup and include it on your site, Squarespace will not replace the pre-existing schema with yours; it only adds to it, which tends to make things more convoluted (especially if it's the same schema type, since search engines are then handed two competing blocks describing the same page).
In most cases on Squarespace, it's best not to add any structured data at all in order to avoid these complications.
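To check whether a page has ended up with competing markup, you can extract every JSON-LD block and tally the @type values. A rough sketch, standard library only (the regex-based extraction is a simplification, and the URL is a placeholder):

```python
import json
import re
import urllib.request
from collections import Counter

PAGE = "https://seothesis.com/resources/"  # placeholder URL

with urllib.request.urlopen(PAGE, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# Grab the contents of each <script type="application/ld+json"> block.
blocks = re.findall(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    html,
    flags=re.DOTALL | re.IGNORECASE,
)

types = Counter()
for block in blocks:
    try:
        data = json.loads(block)
    except json.JSONDecodeError:
        continue  # skip malformed blocks
    # Normalize the three common shapes: @graph wrapper, list, single object.
    if isinstance(data, dict) and "@graph" in data:
        items = data["@graph"]
    elif isinstance(data, list):
        items = data
    else:
        items = [data]
    for item in items:
        if not isinstance(item, dict):
            continue
        t = item.get("@type", "unknown")
        for name in (t if isinstance(t, list) else [t]):
            types[name] += 1

for name, count in types.items():
    flag = "  <-- duplicate type" if count > 1 else ""
    print(f"{name}: {count}{flag}")
```

Any type showing a count above one is a candidate for the duplication described above.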
4. .htaccess
.htaccess is an Apache server configuration file that can be used to password-protect directories, rewrite URLs, or redirect requests. When it comes to SEO, it is extremely useful when you need to rename pages en masse or set rules for redirects.
A common scenario for .htaccess rules is when external links point to a page that no longer exists. Instead of letting search engines de-index the page and render its inbound link benefits useless, the .htaccess file can redirect that page to a relevant alternative and reclaim the lost link authority.
Unfortunately, .htaccess is not accessible with a Squarespace account. While they have attempted to accommodate this with their "URL Mappings" feature, it is nowhere near as scalable when dealing with large numbers of pages and their respective rules.
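If you're stuck with URL Mappings, you can at least script their creation. Below is a minimal sketch assuming a redirects.csv file with old_path and new_path columns (the file name and its columns are hypothetical); it prints rules in Squarespace's one-per-line /old -> /new 301 mapping format, ready to paste into the URL Mappings panel:

```python
import csv

# Hypothetical input file with two columns: old_path,new_path
# e.g.  /blog/old-post,/blog/new-post
with open("redirects.csv", newline="") as f:
    for row in csv.DictReader(f):
        old = row["old_path"].strip()
        new = row["new_path"].strip()
        # Squarespace URL Mappings expect one rule per line:
        #   /old-url -> /new-url 301
        print(f"{old} -> {new} 301")
```

This doesn't recover the regex-level power of .htaccess, but it does take the manual labor out of bulk redirects.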
Final Thoughts
While we hold out hope that Squarespace will eventually evolve to a point where these obstacles are resolved and full control is native to the platform, for now we have to resort to alternative content management systems and live to optimize another day.