Site positions have dropped: causes, analysis, and how to recover

August 10, 2023

The position of a site in search engine results is the main metric of the quality of an optimizer's work. A sharp drop, or even a small decline, in positions for queries is a problem both for the SEO specialist and for the resource owner. Reduced visibility across the semantic core leads to negative traffic dynamics, in other words, sales decrease.

How to determine that positions have dropped

In order to quickly track changes in positions for selected key queries, many specialized tools are used. Among the most popular are AllPositions, Topvisor, and SE Ranking.

All of them, in addition to collecting positions for the specified queries, also collect query frequency, indicate the relevant page, build a graph of site visibility, identify competitors in the search results, and so on. In each of them you can configure a check frequency convenient for you.

For the most efficient work, positions should be checked as often as possible (optimally twice a week). If the budget for checks is limited, you can collect positions on demand and choose the update date yourself, depending on the situation.

After each update completes, a graph of site visibility is displayed above the positions for clarity (overall, for all specified search engines, as well as detailed for each selected one).

The graph makes a general decrease in site visibility noticeable, while the numerical indicators under the graph provide more detail: for example, that positions worsened for almost half of the queries.

Visibility is a relative indicator that depends on the frequency of queries and positions of the site on them. It is calculated using a special formula.

In other words, a drop for a high-frequency query will affect the visibility percentage much more than a drop for a medium-frequency one, while worse positions for a low-frequency query (for example, with a frequency of 1) may not affect the visibility graph at all.
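The exact formula differs from service to service and is not published, but the frequency-weighting idea can be sketched as follows (a minimal illustration assuming a simple linear weight by position; the queries and frequencies are hypothetical):

```python
def visibility(positions: dict[str, int], frequency: dict[str, int],
               max_position: int = 10) -> float:
    """Frequency-weighted visibility: the share of query demand the site covers.

    A query contributes its full frequency at position 1, linearly less down
    to `max_position`, and nothing outside the top. Real services use their
    own (unpublished) weights; this is only the general principle.
    """
    total = sum(frequency.values())
    if total == 0:
        return 0.0
    covered = 0.0
    for query, pos in positions.items():
        if 1 <= pos <= max_position:
            weight = (max_position - pos + 1) / max_position
            covered += frequency.get(query, 0) * weight
    return 100.0 * covered / total

# A high-frequency query dropping out hurts far more than a rare one:
freq = {"buy sofa": 900, "sofa repair diy": 1}
before = visibility({"buy sofa": 3, "sofa repair diy": 5}, freq)
after = visibility({"buy sofa": 15, "sofa repair diy": 5}, freq)
print(round(before, 1), round(after, 1))
```

Running the toy example shows visibility collapsing when only the high-frequency query leaves the top 10, even though the rare query kept its position.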

By conducting regular checks and analyzing visibility data, you can catch critical drops as quickly as possible.

Has there really been a drop

Having discovered negative dynamics, especially when positions have fallen sharply, first determine whether the drop has really occurred or whether the checking service malfunctioned. You can selectively verify positions manually by typing queries into the search bar. It is better to do this in incognito mode (with personalized results disabled), and do not forget to specify the region being promoted.

Suppose the check showed a number of queries dropping out of the top 10, but the positions in the live results remained the same. This may indicate a service error. In that case, try collecting the positions again a little later; everything will probably fall into place. However, this situation is not very common.

Site positions have dropped: what to do first

There are a number of questions to answer before analyzing the decline:

  • How big is the drop: how many search engines were affected; did the site drop as a whole or only individual pages/queries?
  • Has any work been carried out on the site? Could it have had an impact?
  • Has the top 10 as a whole changed?

The answers to these questions can give an overall picture of the decline:

  • positions dropped in all search engines - this may indicate technical problems (falling out of the index, inaccessible pages, rolled-back edits, etc.);
  • positions dropped in one search engine - this may be due to a search engine filter or a change in the ranking algorithm;
  • positions dropped on one page - check the content of the page, as well as its HTTP response;
  • individual queries dropped - a detailed query-by-query analysis is required.

A decrease in positions during large-scale work on the site (for example, changing the structure or moving the site to a new CMS) may indicate errors in the work, or that the changes were indexed at an intermediate stage (for example, 301 redirects were not yet set up when the structure was adjusted). In any case, verify the correctness of the changes made; this will speed up re-indexing.

Once you have obtained and analyzed the initial information about the drop, you can move on to investigating the causes and looking for possible remediation options.

Conventionally, we can distinguish a number of reasons that occur regardless of what changes were made to the site.

Adjustment of ranking algorithms

As you know, search engine ranking algorithms are constantly being refined. These changes are mostly aimed at improving the quality of search results, but there are also unsuccessful experiments that are eventually rolled back. The appearance of new algorithms, or updates to old ones, is often the cause of a drop in a site's positions. To understand whether a decline in positions was caused by algorithm adjustments, you need to stay aware of current events. As a rule, all significant updates are announced by search engines in advance, so following professional groups and SEO communities will help you prepare for upcoming changes. It is also useful to keep in touch with other specialists: talking to colleagues or reading SEO forums helps to understand the situation.

The most striking example of a significant impact on positions is Google's "Medic Update" in the summer of 2018. To gauge the "strength" of an update, you can use special services that track the degree of SERP volatility by date.

Changes in search results

In addition to algorithm updates, search results may change because the intent of a query changes: for example, it was commercial for a long time but became informational. Here is an illustrative example: a site had long ranked in the top 3 for the query "pig", and during a routine position check a noticeable decrease in Google visibility was detected.

Movie theater sites, video hosting platforms, and similar resources appeared in the results instead. The reason was quite simple: in 2021 a new movie, "Pig", was released, and information about it took the first positions in the Google search results.

In this case, the decrease in positions occurred regardless of the site's quality indicators, and little can be done in the short term. You can consider adjusting the semantic core.


Appearance of new competitors

Even a perfectly optimized site may not stay on top when a powerful competitor appears on the market. Therefore, it is important to regularly analyze the competitive environment. Keep in mind that in SEO the main role is played by competitors in the search results, not business competitors; they may differ. Monitoring the appearance of new sites in the top can be done manually, but it is faster and more effective to use special tools.

Main causes of falls

The other causes are directly related to what work was or was not done on the site. They can be divided into seven large blocks. Let's consider each in more detail.

Block 1 - technical problems

A large number of technical errors is generally evaluated negatively by search engines. You can detect them in Google Search Console. They can be very diverse: growth of broken links, duplicate pages, duplicate meta tags, problems with loading speed, missing sitemaps, etc.

These errors are often caused by peculiarities of the CMS or its plugins, so it is recommended to conduct a technical analysis of the resource regularly (especially after making edits) to eliminate problems as quickly as possible.

In addition to the general negative impact on ranking, some technical errors can lead to more noticeable consequences.

Error in robots.txt


The site, section, or page is blocked from indexing. To detect the problem, check whether a particular URL (or the site as a whole) is in the index. This can be done using search operators such as "site:...".

What to do

Remove the erroneous directive that prohibits indexing and submit the page for re-checking. To determine which directive caused the drop from the index, check the file with a robots.txt testing tool. It verifies the syntax and also lets you check which URLs are allowed to be indexed and which are not.
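As a sketch, Python's standard-library robots.txt parser can quickly show which promoted URLs a given set of rules blocks (the rules and URLs below are hypothetical; this does not pinpoint the exact directive, but it narrows the search):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, load the live file instead.
rules = """\
User-agent: *
Disallow: /search/
Disallow: /catalog/sofas/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Hypothetical promoted landing pages to audit:
promoted = [
    "https://example.com/catalog/sofas/",
    "https://example.com/catalog/chairs/",
]
for url in promoted:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}: {url}")
```

Any URL reported as BLOCKED points you at the group of directives that needs to be corrected.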

How to avoid

To learn about indexing problems and robots.txt changes in time, it is recommended to set up email notifications from Google Search Console.

Deleting/unavailable page


The landing page returned a response other than 200 at the time of indexing. There can be several variants:

  • 404 response - the page was removed or unpublished (deliberately or accidentally);
  • 301/302 response - a permanent or temporary redirect to another page is configured.

The main symptom is falling positions for one landing page: either they dropped out completely, or the relevant page changed with a simultaneous decrease in positions. This can lead to a drop in positions in all search engines, though not necessarily simultaneously (the timing depends on how quickly the robots of different search engines index the site). Detecting it is quite simple: first, check the actual response of the page; second, check whether the URL is in the index.
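The first check is easy to automate. The sketch below (hypothetical URL; standard library only) fetches a page without silently following redirects, so a 301/302 on the landing page stays visible, and maps the status code to a likely cause:

```python
import urllib.request
from urllib.error import HTTPError

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow 3xx, so the original status code surfaces."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # unhandled redirect -> raised as HTTPError below

def page_status(url: str, timeout: float = 10.0) -> int:
    """Actual HTTP status of `url` (redirects are NOT followed)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:  # 3xx/4xx/5xx all land here
        return e.code

def verdict(status: int) -> str:
    if status == 200:
        return "OK: the page responds normally"
    if status in (301, 302):
        return "redirect: the landing page sends robots elsewhere"
    if status == 404:
        return "removed or unpublished: restore the page"
    return "unexpected response: investigate further"

# Usage (hypothetical URL):
# print(verdict(page_status("https://example.com/catalog/sofas/")))
```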

What to do

Restore a 200 response for the promoted page and submit it for re-crawling.

How to avoid

Set up tracking of the status of promoted pages in Google Search Console. In addition to responses, it also shows the history of title and description changes, which is very useful when analyzing position changes.

Site inaccessibility


The resource falls out of the index because it was unavailable at the time of the robot's crawl (it returned a response other than 200). In some cases the site remains in the index but is still demoted by the search engine due to prolonged outages.

What to do

One common reason for inaccessibility is a problem on the hosting side, so contact your hosting provider's technical support to resolve it.

How to avoid

Track the correct operation of the site, for example using Google Search Console.
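As a complement, a minimal self-hosted availability check might look like this (a sketch; the retry counts and the injectable `_fetch` hook are assumptions for illustration and testing, not part of any service's API):

```python
import time
import urllib.request
from urllib.error import HTTPError, URLError

def is_available(url: str, attempts: int = 3, delay: float = 5.0,
                 timeout: float = 10.0, _fetch=None) -> bool:
    """True if `url` answers 200 within `attempts` tries.

    Transient hosting hiccups are retried after a pause, because one failed
    request does not yet mean the site is down for search robots.
    `_fetch` lets tests substitute the real network call.
    """
    def default_fetch(u):
        try:
            with urllib.request.urlopen(u, timeout=timeout) as resp:
                return resp.status
        except HTTPError as e:
            return e.code
        except URLError:
            return None  # DNS failure, refused connection, timeout, ...
    fetch = _fetch or default_fetch
    for attempt in range(attempts):
        if fetch(url) == 200:
            return True
        if attempt < attempts - 1:
            time.sleep(delay)
    return False

# Usage (hypothetical URL):
# if not is_available("https://example.com/"):
#     print("site down - contact hosting support")
```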

Appearance of a duplicate page


A technical or semantic duplicate of a landing page has appeared in the index. A duplicate can be exact, formed, for example, due to CMS peculiarities. Such problems are often found on Joomla (duplicates created when a new section is added), Bitrix (due to various GET parameters being appended), or WordPress (due to its plugins). With a technical duplicate present, site positions can fall out of the top 10 (without a change of the relevant URL). Pages can also duplicate each other in meaning; in this case the problem is in the content. In this situation the pages will begin to compete with each other in the search results.

What to do

It is recommended to eliminate exact duplicates from the index using robots.txt (Disallow or Clean-param directives) or by setting up 301 redirects (if technically possible). To solve the problem of semantic duplicates, either differentiate the pages by semantics or keep only one of them, the highest-priority one.
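For example, a robots.txt fragment hiding URL variants produced by GET parameters might look like this (the parameter names are hypothetical; Clean-param is a Yandex-specific directive, Google honors only the Disallow patterns):

```
User-agent: *
# Block duplicate URL variants created by sort/tracking GET parameters
Disallow: /*?sort=
Disallow: /*?utm_

# Yandex-specific: strip the parameters instead of blocking the URLs
# Clean-param: sort&utm_source&utm_medium
```

Before deploying such rules, test them against your real promoted URLs so you do not accidentally block a landing page.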

How to avoid

Regularly perform technical analysis of the site. Avoid overlapping queries when clustering the semantic core and creating new pages.

Display problems on mobile devices


A correct adaptive or mobile version of the website is of great importance. In 2018, Google announced the start of mobile-first indexing, i.e., priority ranking based on mobile content. Thus, without a correct adaptive or mobile version, or with a large number of display errors on smartphones, there is a high probability of a decrease in positions even for the most authoritative resources.

Search engines also penalize violations related to the mobile version, for example, hidden redirects targeting mobile devices. This is considered a violation of webmaster guidelines, and action is taken against such resources, such as removing the URL from the index.

What to do

Correctly adapt the resource for different devices and resolutions. Eliminate any problems that hinder interaction with the site.

How to avoid

To check the adaptability of your website, you can use special tools in Google Search Console.

Low loading speed


An equally important ranking factor is loading speed. In the spring of 2021, Google launched an algorithm that evaluates the convenience and safety of working with a page. Page Experience includes the Core Web Vitals factors, whose values reflect the user's experience with the resource. Google considers a resource high quality when:

  • the main content renders quickly (LCP);
  • the wait before the first interaction with the content is low (FID);
  • content elements are visually stable and do not hinder interaction (CLS).

Since February 2022, unsatisfactory speed and page-loading quality indicators can directly affect the visibility of the resource in search results.
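Google publishes concrete "good"/"poor" thresholds for these metrics, so field values can be classified with a small sketch (the sample page values are hypothetical):

```python
# Google's published "good"/"poor" thresholds for Core Web Vitals
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "FID": (100, 300),    # First Input Delay, milliseconds
    "CLS": (0.10, 0.25),  # Cumulative Layout Shift, unitless
}

def rate(metric: str, value: float) -> str:
    """Classify a field value as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

# Hypothetical field values for one page:
page = {"LCP": 3.1, "FID": 80, "CLS": 0.31}
for metric, value in page.items():
    print(f"{metric}: {value} -> {rate(metric, value)}")
```

Services such as PageSpeed Insights report these same metrics, so the thresholds tell you which pages need work first.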

What to do

Fix all errors that affect loading speed and follow the recommendations from search engines.

How to avoid

Analyze your loading speed regularly using special tools, such as PageSpeed Insights.

Block 2 - content

Content on a website is one of the key ranking factors. Therefore, errors related to it are bound to lead to lower positions.

Google imposes penalties for content problems such as over-optimized (spammy), non-unique, or low-quality text.

What to do

Problematic text should be either completely rewritten or corrected in places: eliminate unnecessary keyword occurrences, errors, and clumsy constructions, fix the formatting, and get rid of filler ("water"). After making corrections, send the corrected pages for re-checking.

How to avoid

The text on a page should match the topic and be high quality, unique, not spammy or over-optimized, well formatted, and free of punctuation and spelling errors. Writing should be done by a professional or a carefully chosen copywriter.

Block 3 - regional factors

One of the key roles in ranking commercial queries in Google is played by the regional factor. Therefore, if a resource has problems with geo-referencing, its positions may deteriorate significantly. Check the data in the Google Search Console panel.

Currently, Google closely monitors the reality and relevance of a company's contact information through assessors and crowdsourced workers. To confirm the information, you may be asked to record a video, send a photo of the company's signage, etc.

What to do

To geo-locate a site, place a real, unique address in the promoted region. For sites without a physical address, there is currently an option to add them to Google Business Profile as online businesses; however, such organizations are not shown on Google Maps.

How to avoid

To avoid regionalization issues, you should avoid using fake addresses for geolocation.

Block 4 - external links

Although the declining importance of links in SEO has been discussed for a long time, external optimization still affects positions. Search engines evaluate link mass carefully: a sharp rise or fall in the number of backlinks can indicate artificial link building and cause a drop in positions. You can track and analyze your site's link mass using Google Search Console.

You should also keep an eye on the quality of your links.

Examples of "bad" links:

  • from questionable, low-traffic resources;
  • from donors on the same subnet;
  • hidden links;
  • from pages that also link to "spammy" sites.
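Some of these checks are easy to automate. Below is a sketch of the "same subnet" heuristic (the backlink export format and domains are hypothetical; donor IPs clustering in one /24 block is a rough signal worth reviewing, not proof of a link network):

```python
from collections import Counter
from ipaddress import ip_network

def same_subnet_donors(backlinks: list[dict], prefix: int = 24) -> list[str]:
    """Flag donor domains whose IPs cluster in one /`prefix` subnet.

    Heuristic only: shared hosting can also produce such clusters,
    so flagged domains should be reviewed manually.
    """
    nets = Counter()
    for link in backlinks:
        nets[ip_network(f"{link['ip']}/{prefix}", strict=False)] += 1
    suspicious = {net for net, count in nets.items() if count > 1}
    return sorted(
        link["domain"] for link in backlinks
        if ip_network(f"{link['ip']}/{prefix}", strict=False) in suspicious
    )

# Hypothetical backlink export:
links = [
    {"domain": "blog-a.example", "ip": "203.0.113.10"},
    {"domain": "blog-b.example", "ip": "203.0.113.55"},
    {"domain": "news.example",   "ip": "198.51.100.7"},
]
print(same_subnet_donors(links))  # blog-a and blog-b share 203.0.113.0/24
```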

For the most serious violations, search engines impose penalties. There are also penalties for placing paid links.

What to do

If problems with backlinks are detected, analyze the link mass and remove low-quality links. If you cannot remove the links yourself, Google allows you to disavow them using a special tool.

How to avoid

When building link mass, do not use "artificial" methods; give preference to high-quality natural links. To catch growth of spammy links (for example, placed by malicious actors) in time, analyze the link profile regularly.

Block 5 - behavioral factors

Behavioral factors have a significant impact on positions. These are the user actions on the site and in search results that are taken into account when ranking sites in Google: CTR, bounce rate, time on site, browsing depth, etc. Deteriorating values of these indicators can lead to a drop in positions. The negative dynamics can be natural (for example, the site became inconvenient for users, or the page's snippet became less attractive) or artificial (for example, the bounce rate suddenly increased, but the visits come from dubious sources).

There are also many services that sell artificial behavioral signals (paid clicks and visits). This technique is not recommended and can lead to serious penalties from search engines.

What to do

If suspicious behavioral dynamics are detected, contact the search engine's technical support: describe the problem and clarify whether the site is subject to sanctions. In the case of a natural decline in behavioral factors, work on the site: check for errors, loading speed, and usability.

How to avoid

Do not use artificial methods of promotion.

Block 6 - violations and security

In addition to the reasons listed above, there are a number of other problems for which search engines can demote a domain in search results. Manual sanctions are applied to resources that try to deceive the search engine; each search engine has its own list of violations.

These include the actions of attackers: website hacking and malware placement. Google Safe Browsing helps detect malware. If Google considers a site potentially dangerous, it notifies users and webmasters.

What to do

If you find information about violations, proceed to fix the problem promptly. After the work is completed, it is recommended to click the "Problem solved" button in Google Search Console.

How to avoid

In order to notice unauthorized actions in time, it is necessary:

  • check the cache of search engines regularly;
  • install antivirus on the site;
  • monitor notifications in Google Search Console.

Block 7 - affiliation

Affiliates are sites with one owner and one topic, created to monopolize the search results. Search engines do not officially confirm the existence of such a filter, but long-term SEO practice proves that sanctions for this violation exist.

Search engines can identify affiliates by many signs:

  • similar design and layout;
  • matching contact information: company address, postal code, phone number;
  • similar product assortment and prices;
  • identical descriptions of catalog sections, product cards, and images;
  • IP addresses of the sites on the same hosting;
  • cross-linking between the sites.

However, at present even differences in these parameters do not always protect against the filter. Search engine representatives may act on a competitor's complaint by conducting a detailed inspection of the organizations.

The main penalty is a significant decrease in positions, and it can affect both resources. The mechanics of the penalty have varied: earlier we observed a sharp alternation of positions for groups of queries (one resource or the other was shown in the results); later, the punishment became a complete demotion of the site, with positions remaining permanently low.

What to do

If the affiliate is known and you have access to it, eliminate one of the sites, for example by setting up 301 redirects. You can also differentiate the topics of the resources and eliminate all semantic overlaps.

Making the data on the promoted resource unique can also help:

  • change of contact details and legal entity;
  • correction of data in the catalog: assortment, prices, descriptions;
  • changing texts (increasing their uniqueness).

You should also contact technical support to find out the circumstances of the filter. Sometimes Google representatives can help solve the problem, going beyond boilerplate answers.

How to avoid

Don't create affiliate sites; use real, unique contact information.


In order to maintain and improve the position of the site, you need to:

  • Regularly monitor the positions of your site and competitors' resources.
  • Keep up with search engine algorithm updates.
  • Verify information in the Google Search Console panel.
  • Conduct technical analysis of the resource.
  • Create unique, quality content for the pages of the site.
  • Analyze the quality of the link mass and the dynamics of its growth.
  • Do not use "black hat" methods of promotion.
  • Constantly improve the site, keep an eye on SERP trends, and look for additional points of growth.

Following these recommendations will help you avoid a sharp and unexpected drop in positions, as well as sanctions from search engines.
