According to Google, one of the most important things you should focus on is Crawlability:

  • Your Site should be “clean” and easy to crawl.


Think about it: Google needs to evaluate your Site every few days, and the fewer resources it needs, the better.


That is why we were worried about our “Members” directory.

  • It generates dozens of “artificial” and empty-of-content URLs.

And all these URLs are wasting Google’s resources and time.


Today, we’ll share with you how we have solved this issue in a simple way.


In case you don’t know, in this section we talk about How we are building this Website: our SEO strategy, Plugins we use, our Traffic Results…


As usual, this Post will be divided into 2 different parts:

1. How to improve the crawlability of your Site with “robots.txt” file.

2. Our Google Analytics report for today: Users and Pageviews we have.


Let’s begin:

1. How to improve the crawlability of your Site with “robots.txt” file.

Pages in the “Members” Folder generated for an “inactive” user.


We have few registered users.

Also, many of them are “ghost users”: they are registered but not active.

  • Some of them use the registration process to promote their business, leave a URL to their Site…


The problem is: as you can see in the image above, for each “inactive” user, BuddyPress (the Social-network Plugin we use) generates 6 empty URLs:

  • The default folder.
  • The Profile.
  • Forums.
  • Replies.
  • Favorites.
  • Activity.

We have no problem having “inactive” users but… We don’t want these URLs to make things difficult for Google.


Moreover, if the user is “active”, 2 additional URLs are created (at least):

Pages in the “Members” Folder generated for an “active” user.


Our Google Search Console is full of these empty URLs and, we think, this can’t be good for our Crawlability:

This is a small part of all the empty URLs in our “Members” directory.


What have we done?

We have used the Robots.txt file to solve this issue.


* If you don’t know what this file is, please, visit first our “What is robots.txt file and how to create one” Post.


How have we improved our Crawlability with Robots.txt?


Disallowing our “Members” folder.

If you look carefully, all these empty-of-content URLs we have just shown you are under the “Members” folder:



By Disallowing this folder, we are telling Google that it should not crawl all these pages because they are not important.
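In case it helps, this is what such a rule looks like in a robots.txt file (a minimal sketch; the “User-agent: *” line applies the rule to all crawlers, and your own file may contain other rules as well):

```
User-agent: *
Disallow: /members/
```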


If you check our Robots.txt file now, you’ll find this:


As you can see, we have added the “/members/” folder.

  • Now, if Google wants to crawl any page inside this folder, our robots.txt file will tell it not to.
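If you want to double-check that a Disallow rule behaves as expected before relying on it, Python’s built-in urllib.robotparser can simulate how a crawler reads the file (the robots.txt lines and the example.com URLs below are assumptions for illustration, not our exact file):

```python
# Quick sanity check of a "Disallow: /members/" rule using Python's
# standard-library robots.txt parser. The rules and URLs here are
# illustrative assumptions, not the real Consuunt robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /members/",
])

# Pages under /members/ should be blocked for Googlebot...
print(parser.can_fetch("Googlebot", "https://example.com/members/john/profile/"))  # False

# ...while normal content stays crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/blog/some-post/"))  # True
```

Note that robots.txt only asks well-behaved crawlers not to fetch those pages; it does not remove URLs that Google has already discovered.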


We hope this makes things easier for Google … And that we also get rid of those annoying pages in Google Search Console.


2. Our Google Analytics report for today: Users and Pageviews we have.

Our traffic keeps improving.

  • However, we still need a few days before we have the traffic we had in June.



As you can see in the Post’s top image, we have reached 1,200 users per week again.



We currently receive 6,595 page views per month.

  • Our goal was to reach 10,000 pageviews this summer but… We’ll have to wait.


As always, thanks for being there!
