Why Your Web Traffic Is Going Down: 5 Possible Reasons

Is your website suffering a drop in web traffic lately? Here are five possible reasons for a sudden drop in website traffic.

Your Pages Are Timing Out

Some servers have bandwidth restrictions because higher bandwidth comes at a higher cost; these servers may need to be upgraded. Sometimes the issue is hardware related and can be resolved by upgrading your processing power or memory. Some sites also block IP addresses when a visitor requests too many pages at a certain rate. This setting is a strict way to avoid DDoS attacks, but it can also have a negative impact on your site. The limit is typically set as a pages-per-second threshold, and if that threshold is too low, normal search engine bot crawling may hit it and the bots cannot crawl the site properly.
Solution - If it's a server bandwidth limitation, then it might be an appropriate time to upgrade your service.
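
As a quick first check, a minimal script like the sketch below can flag pages that respond slowly or time out. The URL list and the 10-second budget are placeholders, and it assumes the requests library is installed:

```python
# Minimal sketch: flag pages that respond slowly or time out.
# The URLs and the 10-second budget are placeholders; adjust them for your site.
import requests

URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/",
]
TIMEOUT_SECONDS = 10  # anything slower than this is flagged

for url in URLS:
    try:
        response = requests.get(url, timeout=TIMEOUT_SECONDS)
        elapsed = response.elapsed.total_seconds()
        print(f"{url} -> {response.status_code} in {elapsed:.2f}s")
    except requests.exceptions.Timeout:
        print(f"{url} -> timed out after {TIMEOUT_SECONDS}s")
    except requests.exceptions.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```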

Pages Not Loading Properly

Make sure your pages return the proper 200 HTTP Header Status. Did the server experience frequent or long downtime? Did the domain recently expire and get renewed late?
Solution - You can use a free HTTP Header Status checking tool to determine whether the proper status is returned. For massive sites, typical crawling tools like Xenu, DeepCrawl, Screaming Frog, or Botify can test these at scale.
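
If you just want a quick spot check, here is a minimal sketch (the URL list is a placeholder) that reports the status code of each page so non-200 responses stand out:

```python
# Minimal sketch: report the HTTP status of each URL so non-200 pages stand out.
# The URL list is a placeholder; in practice you would feed in URLs from a crawl export.
import requests

URLS = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for url in URLS:
    try:
        # HEAD keeps the check lightweight; allow_redirects=False exposes 301/302s.
        response = requests.head(url, allow_redirects=False, timeout=10)
        note = "" if response.status_code == 200 else "  <-- check this URL"
        print(f"{response.status_code}  {url}{note}")
    except requests.exceptions.RequestException as exc:
        print(f"ERR  {url}  ({exc})")
```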
 

Did You Fix Duplicate Content Issues?

Fixing duplicate content often involves implementing canonical tags, 301 redirects, noindex meta tags, or disallows in robots.txt, all of which can result in a decrease in indexed URLs. This is one example where a decrease in indexed pages might be a good thing.
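
To confirm which of these signals a given page is actually sending, a rough sketch like the one below can help. The URL is a placeholder, and simple pattern matching is used here where a real audit would use an HTML parser:

```python
# Minimal sketch: check which duplicate-content signals a page is sending.
# The URL is a placeholder; the regexes are a rough check, not a full HTML parse.
import re
import requests

url = "https://www.example.com/some-page/"
html = requests.get(url, timeout=10).text

canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex[^>]*>', html, re.IGNORECASE)

print("Canonical tag:", canonical.group(0) if canonical else "none found")
print("Noindex tag:  ", noindex.group(0) if noindex else "none found")
```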
 

Recent URL Change

Sometimes a change in CMS, backend programming, or server settings results in a change in domain, subdomain, or folder structure, which in turn changes the URLs of a site. Search engines may remember the old URLs, but if they don't redirect properly, a lot of pages can become de-indexed.
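
If you keep a list of old-to-new URL pairs from the migration, a small script like this sketch (the mapping below is a placeholder) can verify that each old URL 301-redirects to the right destination:

```python
# Minimal sketch: verify that old URLs 301-redirect to their new locations.
# The mapping below is a placeholder; use your real old-to-new URL list.
import requests

REDIRECT_MAP = {
    "https://www.example.com/old-blog/post-1": "https://www.example.com/blog/post-1",
    "https://www.example.com/old-blog/post-2": "https://www.example.com/blog/post-2",
}

for old_url, expected_url in REDIRECT_MAP.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code == 301 and location == expected_url:
        print(f"OK   {old_url} -> {location}")
    else:
        print(f"BAD  {old_url} -> {response.status_code} {location or '(no redirect)'}")
```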
 

Search Engine Bots View Your Website Differently

Sometimes what search engine spiders see is different from what we see. Some developers build sites in their preferred way without knowing the SEO implications. Occasionally, an out-of-the-box CMS will be used without checking whether it is search engine friendly. The worst situation is a page infected with some type of malware, which Google deindexes automatically as soon as it is detected.

Solution - Using Google Search Console's Fetch and Render feature is the best way to see whether Googlebot is seeing the same content as you are.
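
For a quick, rough comparison outside Search Console, the sketch below requests the same page with a Googlebot-style user agent and with a browser-style user agent and compares the raw HTML. Note that this does not render JavaScript the way Fetch and Render does, and the URL and user-agent strings are placeholders:

```python
# Minimal sketch: compare the raw HTML served to a Googlebot-like request with
# the HTML served to a browser-like request. This does not execute JavaScript,
# so it only catches server-side differences; the URL is a placeholder.
import requests

url = "https://www.example.com/"

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

as_bot = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
as_user = requests.get(url, headers={"User-Agent": BROWSER_UA}, timeout=10)

print(f"Googlebot UA: status {as_bot.status_code}, {len(as_bot.text)} characters of HTML")
print(f"Browser UA:   status {as_user.status_code}, {len(as_user.text)} characters of HTML")
if abs(len(as_bot.text) - len(as_user.text)) > 1000:
    print("Responses differ noticeably; the bot may be getting served different content.")
```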