How to eliminate SEO problems that prevent you from achieving your goals
August 01, 2023
At this year's SMX Report conference, I presented an overview of the SEO challenges that keep us from achieving our goals. I took a comprehensive look at the resources, communications and mental constructs around SEO that often hinder progress.
We often look for quick fixes that produce significant ranking improvements. Those still exist, but the relationships that connect us to our clients and the site to its users are what bring the most value.
Here are a few questions to ask before you even begin troubleshooting:
Is the company ready?
At Locomotive, we work with a wide range of clients. One of the key benefits is that we get to see a wide range of issues, as well as gain insight into how different companies handle SEO from an implementation perspective. The three key factors shared by the clients who deliver the biggest year-over-year increases in organic sales are:
- Sufficient resources for implementation,
- Recognition of the value of SEO by various stakeholders,
- A willingness to test and to accept setbacks.
Technical SEO recommendations affect a wide variety of teams in your organization, from developers to content teams and beyond. If those teams are already barely managing a two-year backlog of issues, any new technical SEO recommendations will likely never see the light of day.
I once had to prove to the head of IT that the SEO team should have access to Search Console and Google Analytics. The level of distrust of SEO in the IT team was so high that they actively tried to suppress any requests from our team. This happened at the very beginning of our engagement. If you have a team actively working against your SEO priorities, nothing will work.
If it takes you six months to build an acceptable business case for adding two lines to a website's robots.txt file to exclude paths with bad content, you will be very limited in what you can achieve with SEO.
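For context, the "two lines" in question are often as small as this hypothetical robots.txt addition, which blocks an internal search path and a session-ID parameter (the paths are illustrative only):

```
User-agent: *
# Block a thin internal-search path (hypothetical example)
Disallow: /internal-search/
# Block any URI carrying a session-ID parameter (Google supports the * wildcard)
Disallow: /*?sid=
```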
Is the team of SEO specialists ready?
The bottleneck in SEO development is not always the clients. It is often the SEO agency team. Every SEO agency team should focus on the following three areas:
- Clear communication of problems,
- Proper prioritization of projects,
- Testing and reporting of results.
If you've ever sent a client a raw Screaming Frog export in CSV format and asked them to fix 32,000 URIs returning 301s, you're doing it wrong. This tells their teams:
- I don't value your time.
- I don't understand what it takes to fix these problems.
It is up to the SEO consultant to review this list, find the site-wide 301s in the footer, clean up the parameters (e.g. https://example.com/?sid=12345) and provide the client with a clear, concise list of 301s that should be handled in themes or components and 301s that should be addressed in content.
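A minimal sketch of that kind of clean-up is below, assuming an "All Inlinks"-style export; the file name and column names ("Source", "Destination", "Status Code") are assumptions to adjust to your own crawl:

```python
# Sketch: distil a raw Screaming Frog inlinks export into an actionable list.
from urllib.parse import urlsplit, urlunsplit

import pandas as pd

inlinks = pd.read_csv("all_inlinks.csv")  # hypothetical export file
redirects = inlinks[inlinks["Status Code"] == 301].copy()

def strip_params(url: str) -> str:
    """Drop query strings such as ?sid=12345 so parameter noise collapses."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

redirects["Destination"] = redirects["Destination"].map(strip_params)

# A 301 linked from thousands of sources almost always lives in a template
# (header, footer, navigation) and is a single fix; a 301 linked from a
# handful of pages is a content edit.
counts = (
    redirects.groupby("Destination")["Source"]
    .nunique()
    .sort_values(ascending=False)
)
print("Likely template-level 301s:")
print(counts[counts > 100])
print("Likely in-content 301s:")
print(counts[counts <= 100])
```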
We use a tool called Notion to prioritize issues from technical audits. Notion lets us rank every issue for a client, create views filtered for the relevant teams, and finally attach tickets with very clear information about the what, why and how of fixing each problem.
In addition to technical audits, we use the ICE method (Impact, Confidence, Ease) to qualify and prioritize recommended development projects.
This will allow you and the client to quickly prioritize between "quick wins" and projects that will require significant resources.
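As a rough illustration, scoring each project on Impact, Confidence and Ease and multiplying the three gives you a single number to sort by. The projects and 1-10 scores below are hypothetical:

```python
# A minimal ICE scoring sketch: Impact x Confidence x Ease collapses each
# project to one number, which makes quick wins easy to spot.
projects = [
    {"name": "Fix faceted-navigation crawl traps", "impact": 8, "confidence": 7, "ease": 3},
    {"name": "Add self-referencing canonicals", "impact": 5, "confidence": 8, "ease": 9},
    {"name": "Rebuild XML sitemap generation", "impact": 7, "confidence": 9, "ease": 6},
]

for project in projects:
    project["ice"] = project["impact"] * project["confidence"] * project["ease"]

# Highest ICE score first: the top of the list is usually the quick win.
for project in sorted(projects, key=lambda p: p["ice"], reverse=True):
    print(f"{project['ice']:>4}  {project['name']}")
```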
Finally, demonstrating the value of the projects you've worked on is critical to gaining the trust of internal teams as well as the resources to take on larger projects.
Using Google Data Studio makes this simple and efficient. Creating reusable templates and regular expressions for URI segments, and sharing the reports with the right stakeholders, makes it quick and easy to demonstrate the value of the work and win support for larger projects.
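The regular expressions in question are usually simple path patterns used to segment landing pages in a Data Studio filter; the patterns below are hypothetical examples:

```
^/blog/.*           # all blog landing pages
^/products/[^/]+$   # product detail pages one level below /products/
\?sid=              # any URI still carrying a session-ID parameter
```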
SEO issues
Most SEO problems can be broken down into a few categories. I like to talk about these categories rather than using technical jargon because it helps to get back to things that are meaningful and understandable to non-SEOs.
Links
Links, particularly <a> elements, are votes indicating the importance of another URI. If you posted something on Reddit, you wouldn't expect the post to get widespread attention with just one upvote. Links to pages on your site are similar in nature.
Links are also discovery mechanisms for search engines. They help them find both good and bad URIs. Our job as SEOs is to help them find the good ones, but at the same time prevent them from detecting anything bad. In this case, "bad" can refer to a URI with no content or a page specifically targeted to logged-in users. Essentially, "bad" pages are pages you don't want users to find.
Having an up-to-date dynamic XML sitemap is the first step to solving the problem of "good" URIs. An XML sitemap helps search engines find the content you want to show your users. The XML sitemap should include ALL the URIs you want users to find on your site, and nothing else.
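For reference, a minimal XML sitemap looks like this; the URLs and dates are placeholders:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-07-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/technical-seo/</loc>
    <lastmod>2023-06-02</lastmod>
  </url>
</urlset>
```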
Google provides site owners with the Coverage report in Search Console, which shows indexable URIs that are not in your sitemap. If all of the "good" URIs are present in your XML sitemap, this is where you can see why other URIs are being indexed and decide whether they should be.
The coverage report will also show URIs that are present in the sitemap but that Google has chosen not to include in search results. Often this is due to a directive, such as a robots.txt rule or a meta robots tag, telling search engines not to show that URI. In other cases, the URI is either not the best URI on your site for that topic, or the topic isn't one searchers actually have demand for.
You can use Google Analytics to get reliable information about the pages users actually find. Again, using your XML sitemap as the baseline for "good" URIs, comparing the pages users land on from organic search results with the pages in your sitemap is a good exercise for finding URIs that should be added to, or excluded from, the sitemap.
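A small Python sketch of that comparison might look like the following, assuming an organic landing page CSV export and a reachable sitemap; the file name, column name and domain are placeholders:

```python
# Sketch: compare organic landing pages (exported from your analytics tool)
# with the URIs listed in the XML sitemap.
import xml.etree.ElementTree as ET

import pandas as pd
import requests

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
sitemap_uris = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

ga = pd.read_csv("organic_landing_pages.csv")  # placeholder export
landing_uris = {"https://example.com" + path for path in ga["Landing Page"]}

print("Landing pages missing from the sitemap:")
print(sorted(landing_uris - sitemap_uris))

print("Sitemap URIs that never receive organic landings:")
print(sorted(sitemap_uris - landing_uris))
```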
If search engines are finding URIs they shouldn't be finding, consider:
- Removing internal links to the URI.
- Blocking search engines from the path or URI in robots.txt.
- Asking search engines not to index the URI via an HTTP header or a meta robots noindex.
- Blocking access to the URI at the server level (e.g. 403 Forbidden).
- Removing the URI using Search Console's removal tool.
It is worth noting that site owners should think carefully before combining options 2 and 3 above: blocking a URI in robots.txt prevents search engines from crawling it, and therefore from reading and processing a meta robots or header noindex directive.
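For reference, option 3 can be expressed either in the HTML or at the HTTP level; both of these snippets ask search engines not to index the URI:

```
# A meta robots tag, placed in the page's HTML <head>:
<meta name="robots" content="noindex">

# The equivalent HTTP response header (useful for non-HTML files such as PDFs):
X-Robots-Tag: noindex
```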
If search engines aren't finding the URI you want users to find, consider:
- Adding additional links from other pages to the URI.
- Asking other sites to link to the URI.
- Including the URI in your XML sitemap.
Content
Content is the most commonly used word in SEO. Most posts from Google and other SEO experts treat content as an inappropriately broad noun: "Just make your content better." What if you viewed it as a verb? To content is to satisfy. Content is not text. In fact, millions of ranking pages now have very little written content. Improving content by adding a few LSI entities or keywords is not far removed from the keyword stuffing of yesteryear.
One of our biggest wins over the last two years was simply adding a downloadable PDF to some pages where the PDF was closely tied to the user experience on the page. Content is about listening and creating an experience that satisfies what the user wanted to learn or do, as clearly and effectively as possible.
From a technical standpoint, there are things that can help us measure user satisfaction with pages.
Parameters
Many companies use URL parameters to track website usage or perform other functions such as sorting content or determining user status. This can lead to a situation where reporting tools track multiple URIs representing the same page.
In the example above, Google directed traffic to two different versions of the same web page based on the site's use of the sid parameter in internal links. This complicates our lives as marketers because instead of seeing that this page has 880 user sessions and is an important page, the data is fragmented across multiple URIs.
To get out of these situations, we have several tools:
- Exclude certain parameters in Google Analytics.
- Include a canonical link element pointing to the parameter-free version in the page's HTML.
- Update internal references to remove unnecessary parameters.
It is important to note that internal links will almost always be a stronger signal to search engines than a canonical link element. Canonical link elements are a hint about the correct version of a URI. If Google finds forty internal links to https://example.com/page.html?sid=1234, then even if the canonical points to https://example.com/page.html, the parameterized version may well be treated as the correct URI.
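For the example above, the canonical link element sits in the <head> of both versions of the page and points at the parameter-free URI:

```
<!-- Served on https://example.com/page.html and https://example.com/page.html?sid=1234 -->
<link rel="canonical" href="https://example.com/page.html">
```

Pairing the canonical with cleaned-up internal links gives search engines one consistent signal instead of two competing ones.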
Feedback
Include feedback mechanisms in your pages that send their results to your analytics tools.
Using this feedback will help you sort pages that have the following issues:
- Outdated content,
- Didn't answer the user's question,
- The navigation shows a link to the wrong page,
- Content is confusing or the wrong medium is being used.
Custom metrics
Consider using custom metrics such as read time, persona, tasks completed, or logged-in versus logged-out status to improve reporting on your pages in your analytics tools.
Site Search
Make sure you're tracking site search queries in your analytics tools. Site search has proven time and time again to be a great diagnostic tool for finding:
- Important pages that should be in the navigation.
- Content you should be covering but aren't.
- Seasonal trends or unexpected spikes.
Content cannibalization
Content cannibalization is problematic because you can lose control of the experience you've created for users, and search engines can get confused and swap the URIs they show for specific search queries. Consolidating very similar pages is a great strategy for both users and search engines.
If you click on a single search query in Search Console, Google will show you all the pages on your website that competed for that query over a specified period of time.
This can sometimes be confusing, because in many cases Google displays sitelinks in search results, which means multiple URIs from the same site appear for a query.
If we focus on non-branded queries (search queries that don't contain a specific brand or product name), it's often more fruitful to look for pages that genuinely cannibalize each other. If you're comfortable with Python, this data can be pulled from the Search Console API and turned into a spreadsheet that counts the number of URIs receiving clicks for the same query.
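Here is a minimal sketch of that count, assuming a query/page export (from the Search Console API or UI) with "query", "page" and "clicks" columns; the file and column names are assumptions:

```python
# Sketch: count how many URIs earned clicks for the same query.
import pandas as pd

gsc = pd.read_csv("gsc_query_page_export.csv")  # placeholder export
gsc = gsc[gsc["clicks"] > 0]

pages_per_query = (
    gsc.groupby("query")["page"]
    .nunique()
    .sort_values(ascending=False)
    .rename("pages_with_clicks")
)

# Queries answered by more than one URI are cannibalization candidates.
print(pages_per_query[pages_per_query > 1].head(25))
```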
Screaming Frog now has a duplicate content report that allows you to review your site and quickly analyze content that is an exact or near duplicate.
Finally, giving SEM teams dedicated URL paths for paid landing pages is a good way to prevent disparate teams from accidentally creating "cannibalistic" content.
Experience
User experience on websites can affect both visibility and revenue. In many cases, if changes are favorable for revenue, they will also be favorable for visibility as search engines increasingly incorporate experience into their understanding of the metrics that quantify user satisfaction.
Page speed
The two best ways to make page speed a priority for a company are to tie it to lost revenue, or to frame it as an area where a competitor can get ahead.
Google Analytics has limited page speed metrics, and for smaller sites it can produce heavily skewed averages from small samples. But if you extend the sampling window and work to connect metrics like time to document interactive with lost revenue, it can give you the information you need to prioritize your speed work.
One of my favorite reports to share with developers is the Measure report from web.dev. Not only does it provide an overview with prioritized problems and guidelines for solving them, but it also links to the Lighthouse report to give developers more detailed information about individual problems.
Web.dev also provides a link to a handy CrUX Data Studio dashboard that will make it easy to see improvements and celebrate them with a wider range of internal stakeholders.
Microsoft Clarity
Clarity is a great free tool from Microsoft that connects to Bing Webmaster Tools and provides a rich set of experience metrics as well as individual session recordings. In my opinion, there's no better way to understand user experience than by looking at session recordings. You can see when people are reading, when they have to close 15 pop-ups, when the hamburger menu closes unexpectedly, and see if other things are interfering with what you want them to do.
Understanding Intentions
Looking at the second page and exit page in the landing page reports in your analytics tool can give you really good information about what users want and whether they're finding it. Does the landing page contain links to the information they were looking for? Are users navigating to another page to find an answer they should have gotten on the landing page?
Hidden issues
Getting in the habit of opening the Developer Tools Console in Chrome when visiting pages is a good way to discover hidden bugs that can affect users or metrics.
Errors surfaced here can point to:
- Incomplete tracking information.
- Missing content.
- Insecure pages.
- Low page performance.
Relevance
Relevance, in my opinion, is how well a page matches what the user was looking for. It's not a matter of keyword matching, but of whether the page provides an answer or a solution to the core intent behind the user's search query.
Google Data Studio provides quick insights into user searches and landing pages, as well as other informative metrics such as clicks and impressions.
Exporting this data to CSV and using a simple pivot table in Excel or Google Sheets gives you a high-level view of what the best search engine on the planet, Google, thinks your page is about.
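If you prefer to stay in Python, the same pivot can be built in pandas; the file name, column names and page path below are placeholders:

```python
# Sketch: which queries a single landing page earns impressions and clicks for.
import pandas as pd

data = pd.read_csv("data_studio_export.csv")  # placeholder export
page = data[data["Landing Page"] == "/services/technical-seo/"]

pivot = page.pivot_table(
    index="Query",
    values=["Clicks", "Impressions"],
    aggfunc="sum",
).sort_values("Impressions", ascending=False)

# The queries with the most impressions are what Google currently thinks
# this page is about.
print(pivot.head(20))
```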
Since this page on Locomotive's website is designed to sell technical SEO services, we can quickly see where it is being treated as relevant for queries it probably shouldn't be.
This is an opportunity for us to update the page by adding text describing the types of analysis we offer, talking about the benefits of technical SEO, and covering our credentials as an agency. The queries highlighted in green in the report correspond to the purpose of the page.
The queries highlighted in red give us an opportunity to produce more in-depth educational content that takes a closer look at the details and mechanics of technical SEO.
Also, understanding your authority and expertise as the search engine sees them is very important to understanding what you can be relevant for. Around 2019, Google started boosting the rankings of some absurdly unoptimized websites. Many of these were local government websites that had never seen an SEO specialist, and rarely a developer or designer. Google had become more attentive to the authority attributed to websites.
Search engines can also use the totality of a site's textual content, authors, links and so on to gauge how competent a site is in a particular topic area. New content that matches your site's topical expertise or established authority will always fare better than content that does not. It also works as an ebb and flow: the more you demonstrate your expertise in new content over time, the more it lifts all of your content in that subject area.
Finally, the last two areas related to relevancy include knowing what you might be relevant for and understanding when Google adds relevancy for you.
If I worked for an energy company and we were asked to name a new product plan "Unlimited Utilities," then unless significant investment were made to draw attention to the name, it is very unlikely that users would ever find our landing pages through a Google search, because Google understands that phrase as a navigational term for a specific company.
I like to think that Google simply folds what it knows about me into the search query. Google knows that I am in Raleigh, for example, so it effectively appends +raleigh to my search.
I'm sure it's much more complicated than that, but from a mental design perspective it's helpful to consider that Google takes your location, search history, etc. into account when processing your search to provide results more tailored to you.
Summary
Effective SEO requires a comprehensive approach. Here are the key elements that allow you to approach things from all angles:
- Companies need the commitment and resources of a team to succeed in SEO.
- SEO teams should focus on clarity of communication and effective prioritization.
- The key areas to consider in SEO strategy are links, content (page satisfaction), experience and relevancy.
- GIGO (garbage in, garbage out) is real. Taking the time to maintain accurate XML sitemaps, user metrics, user feedback mechanisms and the like makes your life easier and gives you the data you need to guide development.
- Spend some time looking at user sessions. You'll thank me.
- Work hard to make sure your pages solve a problem or give the right answer.
- See how your page content aligns with user search queries provided by Google.
- Write content that supports and reinforces your site's topical expertise. Credibility is a key factor.