How to deal with an SEO problem?

When faced with an SEO problem, we often want to solve it at any cost, without taking the time to understand its origin. Yet it is by identifying the cause of a problem that you can best deal with it and prevent it from happening again. This is why it is important to follow a methodology when tackling any difficulty you encounter, whether it concerns technical aspects, content, or popularity (links).

This article does not aim to list every data point to analyze, nor to explain what each one means and how to interpret it; that has already been done here and elsewhere. It can, however, help you work more efficiently with some methodological advice.

Rule #1: An inventory you will take

Before undertaking anything, remember that acting in a hurry is never good. This is especially true in SEO, an activity that plays out over the long term and consequently requires patience. Rather than rushing headlong into applying optimizations, take the time to take stock of the situation by asking yourself the right questions:

  • When was the anomaly detected? = You narrow down the period to study.
  • How has it evolved since then? = You measure the degree of severity.
  • What are the consequences? = You identify the levers likely to be affected.

Also, imagine that you are a doctor receiving a patient. Before examining them, you want to know more about them, their symptoms, and what prompted them to come and consult you. This inventory will serve as a baseline for measuring the effectiveness (or not) of your future actions.

Rule #2: Properly equip yourself you will

Since your site cannot speak for itself, you will have to ask your favorite tools for the answers to the questions raised in the previous point.

Without further ado, head over to Google Search Console, where you can check that your site is not under a manual action for spam and monitor how search engine robots crawl it. While the former Webmaster Tools are useful for examining Googlebot's behavior, the statistics they return only reflect a trend. You will need to go further and use the advanced (but paid) features of solutions such as Screaming Frog, which can go as far as log analysis.
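To make this concrete, here is a minimal sketch of the kind of question log analysis answers, in this case how often Googlebot hits the site each day. It assumes a combined-format (Apache/Nginx) access log; the file name and the regex are illustrative assumptions, and this is not how Screaming Frog itself works:

```python
import re
from collections import Counter

# Illustrative only: count Googlebot hits per day in a combined-format
# (Apache/Nginx) access log. Adjust LOG_PATH and the regex to your setup.
LOG_PATH = "access.log"
LINE_RE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] '  # timestamp, e.g. 10/Oct/2023
    r'"[^"]*" \d{3} \S+ '               # request line, status, bytes
    r'"[^"]*" "([^"]*)"'                # referer, user-agent
)

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        # Note: the user-agent can be spoofed; a serious audit would also
        # verify the hit really comes from Google (reverse DNS lookup).
        if match and "Googlebot" in match.group(2):
            hits_per_day[match.group(1)] += 1

for day, hits in sorted(hits_per_day.items()):
    print(day, hits)
```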

As far as links are concerned, you have the choice between Majestic and Ahrefs; feel free to pick whichever suits you best. They will be especially useful for discovering your latest referring domains as well as the anchor texts of the backlinks coming from them.
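For illustration, a short sketch that summarizes such an export. The file name and the column headers (Referring Domain, Anchor) are assumptions made for this example, since each tool labels its CSV exports differently:

```python
import csv
from collections import Counter

# Illustrative: summarize a backlink export (CSV) by referring domain and
# anchor text. Column names vary by tool; the ones below are assumed.
EXPORT_PATH = "backlinks_export.csv"

domains, anchors = Counter(), Counter()
with open(EXPORT_PATH, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domains[row["Referring Domain"]] += 1
        anchors[row["Anchor"]] += 1

print("Top referring domains:", domains.most_common(10))
print("Top anchor texts:", anchors.most_common(10))
```

A sudden spike in new domains or in over-optimized anchors is precisely the kind of signal the next paragraph warns about.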

It is common, especially in competitive industries like the one I work in, for malicious players to exploit security vulnerabilities on third-party sites to plant phishing files. Through these files, they can generate "optimized" dummy pages built from duplicate content, either to harm their competitors (negative SEO, with the duplicated content embedding backlinks) or simply to redirect traffic to their own site.

Once the undesirable referring domains are listed, it is recommended to contact them one by one and to record the responses you get in a follow-up sheet. This will save you precious time on any reminders and let you track the progress of operations properly.
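As a starting point, here is a minimal sketch that bootstraps such a follow-up sheet as a CSV; the columns are just one possible structure, to adapt to your workflow:

```python
import csv

# Illustrative: bootstrap a simple outreach follow-up sheet as a CSV.
# The column set is one possible structure, not a standard.
FIELDS = ["domain", "first_contact", "last_reminder", "response", "status"]

with open("link_cleanup_followup.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow({
        "domain": "example.com",       # hypothetical entry
        "first_contact": "2017-03-01",
        "last_reminder": "",
        "response": "",
        "status": "awaiting reply",
    })
```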

Finally, do not underestimate duplicate content, internal as well as external. Again, tools exist for this; I recently tested a very good one myself: Killduplicate.
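To give an idea of the underlying principle (and not of how Killduplicate itself works), here is a classic way to flag near-duplicate texts, using word shingles and Jaccard similarity:

```python
# Illustrative only: flag near-duplicate texts with word shingles and
# Jaccard similarity. This is NOT Killduplicate's internal method.

def shingles(text, size=5):
    # Break the text into overlapping sequences of `size` words.
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    # Ratio of shared shingles to total shingles; 1.0 = identical.
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# A score close to 1.0 means the two pages are near-duplicates.
with open("page_a.txt") as fa, open("page_b.txt") as fb:
    print(f"similarity: {jaccard(fa.read(), fb.read()):.2f}")
```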

Rule #3: The necessary step back you will take

Well, here you are, now in possession of a volume of data whose analysis would give any consultant a migraine. But since you are organized and have followed rule #1, you are not afraid. On the contrary, you will even take a sly pleasure in making the data you gathered speak, guided by the answers to the earlier questions and keeping in mind that before your problem appeared, everything was fine.

This may seem trivial, but we often forget that the source of an SEO problem can be identified by comparing the periods before and after it was spotted. To help you, here are some of the events that can have a more or less serious impact on SEO:

  • An update to the robots.txt file;
  • A Google algorithm update;
  • Changes to your server;
  • A new version of the site going live;
  • Publication of a new section on the site;
  • Arrival of a new competitor in the market;
  • An increase in undesirable referring domains;
  • Intensified efforts by one or more of your competitors…

Be careful: you may be facing two (or more) problems simultaneously, hence the interest, once again, of proceeding methodically. Compare your KPIs before and after these events; the first leads should logically emerge…
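By way of illustration, here is a minimal sketch of such a before/after comparison, assuming a daily export of organic sessions to a CSV whose file name and columns (date, sessions) are assumptions made for this example:

```python
import csv
from datetime import date

# Illustrative: compare average daily organic sessions before and after a
# suspected event. The CSV columns ("date", "sessions") are assumed.
EVENT = date(2017, 3, 15)  # hypothetical event date, e.g. a site release
WINDOW = 14                # days compared on each side of the event

before, after = [], []
with open("organic_sessions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        delta = (date.fromisoformat(row["date"]) - EVENT).days
        if -WINDOW <= delta < 0:
            before.append(int(row["sessions"]))
        elif 0 <= delta < WINDOW:
            after.append(int(row["sessions"]))

def avg(values):
    return sum(values) / len(values) if values else 0.0

print(f"avg before: {avg(before):.1f}, avg after: {avg(after):.1f}")
```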

Note that if you are lucky enough to work in a team of specialists, whether agency-side or advertiser-side, do not hesitate to ask your colleagues for help, if only by talking things through with them. Taking a step back can only do you good, and it sometimes brings up hypotheses you would never have thought of alone.

Also, think about questioning your client or your collaborators. The source of a significant increase in crawl errors (404, 500 …), for example, can often be identified by listing the latest actions that may have affected the site structure or the server's performance.
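Staying with server logs, here is an illustrative way to see where crawl errors concentrate, using the same assumed combined log format as earlier:

```python
import re
from collections import Counter

# Illustrative: list the URLs returning the most 4xx/5xx responses,
# assuming the same combined-format access log as above.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+)[^"]*" (\d{3}) ')

errors = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group(2)[0] in "45":
            errors[(match.group(2), match.group(1))] += 1

for (status, url), count in errors.most_common(20):
    print(status, count, url)
```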

Rule #4: Step by step you will advance

Now, on to action! Or rather, actions. If you feel you have several things to do to get the situation back in order, move forward carefully, step by step. Otherwise, you will be unable to identify which practices bore fruit and which did not. Ideally, keep a timetable noting the date each fix went into production as well as the results observed over at least two periods, generally short and medium term. Only then should you move on to the next optimization.

In conclusion

Remember that even in an emergency, you must not rush when your organic search performance is going through a rough patch. As is often the case in SEO, it is recommended to define and apply a suitable strategy that includes both an analysis phase and an optimization phase. If you need an SEO hero -> contact us!