Filtering Referral Spam in Google Analytics

Want to know how to filter referral spam in Google Analytics? You might find an answer here…

You can minimize or even eliminate the negative effects of fake traffic in Google Analytics. If you are a beginner, start with the Google Analytics Solutions Gallery and search for resources that deal with referrer spam or bot blocking. You will find some great material in the gallery, but filtering referral spam thoroughly requires a lot more.

Given below are some effective steps that can help you filter spam and block bots in Google Analytics:

Identify which bots are harmful: Believe it or not, not all bots are bad! There are plenty of bots that make our search world go round, including GoogleBot, BingBot, SpyFu, Deep Crawl and Screaming Frog, among various others. These bots are not dangerous for sites, or even for their visitors.

The bots that can turn out dangerous for your site include the ones that hijack your traffic, exploit loopholes in your CMS for hacking, and scrape your content. Depending on your industry, some forms of bot traffic are more harmful than others, so it is important to figure out which bots need to be tackled.

In fact, it’s not always bots that are harmful; there are many other referral sources that can send enormous volumes of junk traffic and ruin your data.
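If you have access to your raw server logs, you can see exactly which crawlers are hitting your site before deciding what to tackle. Below is a minimal Python sketch, assuming a standard combined log format where the user agent is the last quoted field; the access.log path and the bot substrings are placeholders to adapt for your own setup.

import re
from collections import Counter

# Placeholder path and user-agent substrings -- adjust for your own server
LOG_PATH = "access.log"
KNOWN_BOTS = ["googlebot", "bingbot", "spyfu", "deepcrawl", "screaming frog"]

# In the combined log format, the user agent is the final quoted field
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = UA_PATTERN.search(line)
        if not match:
            continue
        ua = match.group(1).lower()
        # Bucket recognized crawlers by name; everything else is "other"
        label = next((bot for bot in KNOWN_BOTS if bot in ua), "other")
        hits[label] += 1

for name, count in hits.most_common():
    print(f"{name}: {count}")

Anything that shows up here in suspicious volume and brings you no business value is a candidate for the filters described below.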

Carefully filter your Analytics traffic: Before you start to filter out bot traffic, you must know what is being taken out of your data set, which means you need a baseline to compare against. To do this, create a separate view in Analytics and name it something like “bot traffic filtered.” You can set it up in a few simple steps (or script it, as sketched after the list):

  • Click on “Admin”
  • Click on the drop-down menu, in the right column under “View”
  • Select “Create new view”
  • In the next step, be sure to set your time zone, which defaults to Pacific Time
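If you manage a lot of properties, the same view can also be created programmatically. Here is a hedged sketch using the (Universal Analytics era) Management API v3 with the google-api-python-client library; the service account file, account ID and property ID are placeholders, and the time zone is set explicitly so it doesn’t default to Pacific Time.

from googleapiclient.discovery import build
from google.oauth2 import service_account

# Placeholder credentials file and IDs -- substitute your own
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
analytics = build("analytics", "v3", credentials=creds)

# A "view" is called a "profile" in the Management API
view = analytics.management().profiles().insert(
    accountId="12345678",           # placeholder account ID
    webPropertyId="UA-12345678-1",  # placeholder property ID
    body={"name": "bot traffic filtered", "timezone": "America/New_York"},
).execute()
print("Created view", view["id"])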

Block bots in Google Analytics: Instead of making you block known bots manually, Google provides an “easy button” that can cut your work by 75 to 80 percent. The list of bots is updated regularly as Google keeps finding new ones.

For your new view, open “View Settings” and tick the checkbox to exclude all hits from known bots and spiders. This shows you what will happen to your traffic as soon as you turn on bot filtering. Before doing this, make sure that none of your important traffic sources are part of Google’s list of known bots. When you decide to roll it out in the main profile, add an annotation to explain the change and prepare the people who view your analytics: just click the little arrow under the chart in Google Analytics and follow the instructions.
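That checkbox maps to the botFilteringEnabled field on the view, so if you scripted the view creation above, you can flip it in the same session. A continuation of the earlier sketch, reusing its analytics client, view and placeholder IDs:

# Continues the earlier sketch: "analytics" and "view" come from the
# view-creation example, and the IDs are still placeholders
analytics.management().profiles().patch(
    accountId="12345678",
    webPropertyId="UA-12345678-1",
    profileId=view["id"],
    body={"botFilteringEnabled": True},
).execute()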

Create a custom referrer filter: Google’s bot filtering system will only get you so far; there will certainly be other referrers sending volumes of low-quality traffic to your website. To identify them, open the referrer list in Google Analytics and sort the data by bounce rate in descending order so that the 100% bounce rates rise to the top.

Using an advanced filter, you can narrow the data to sessions above a specific threshold, which will vary with your traffic volume. Once that is done, scroll down the list and pick the sites you want to add to your referral exclusion list, then cut each entry down to its main domain.

In the process, you may come across some foul-looking websites; resist the urge to visit them to check them out, as doing so could leave you with unwanted malware or spyware. After your list is fully vetted, you can finally create a custom referrer filter.
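A short script can do the thresholding and domain-trimming for you. The sketch below assumes you have exported the referrer report to a CSV with Source, Sessions and Bounce Rate columns (the file name and column names are assumptions about your export), and it uses the third-party tldextract package to cut each source down to its registrable domain.

import csv

import tldextract  # third-party: pip install tldextract

SESSION_THRESHOLD = 10  # tune this to your own traffic volume

suspects = set()
with open("referrers.csv", newline="") as f:  # hypothetical GA export
    for row in csv.DictReader(f):
        sessions = int(row["Sessions"].replace(",", ""))
        bounce = float(row["Bounce Rate"].rstrip("%"))
        if sessions >= SESSION_THRESHOLD and bounce == 100.0:
            # Reduce e.g. "spam.example.xyz/landing" to "example.xyz"
            suspects.add(tldextract.extract(row["Source"]).registered_domain)

for domain in sorted(suspects):
    print(domain)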

Set up a bad referrer filter: Once you have the list of bad referrers you want to block, create a new filter in the view you set up earlier. Remember not to do this at the account level, but only at the view level. To set up the filter, follow these steps:

  • Select “Admin” option
  • Select “Filter” under “View”
  • Click on “Add Filter” and give a desired name to the filter
  • Then click on “Custom” and “Exclude”
  • Select “Campaign Source” as your “Filter Field” and add all the domains that you want to exclude in the box

Be careful when entering multiple domains, as a single malformed pattern can make a mess of your data; the filter field accepts a regular expression, so separate domains with the pipe character and escape the dots. Also, make it a point to test your filter before saving it, and update it regularly, since new domains to exclude will turn up now and then.
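Here is a small sketch (with placeholder domains) that builds such a pattern and sanity-checks it before you paste it into the filter:

import re

# Placeholder spam domains -- use your own vetted list from the previous step
bad_referrers = ["free-share-buttons.xyz", "best-seo-offer.example"]

# Escape each domain so the dots match literally, then join with regex "or"
pattern = re.compile("|".join(re.escape(d) for d in bad_referrers))

# Sanity check: spam sources should match, legitimate ones should not
assert pattern.search("free-share-buttons.xyz")
assert not pattern.search("news.ycombinator.com")
print(pattern.pattern)  # paste this into the filter's Campaign Source box

Keep in mind that Analytics limits how long a single filter pattern can be, so a long list may need to be split across several filters.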

Block bot traffic at the server: Now this step requires real technical expertise. It involves editing the .htaccess file on Apache (or web.config on IIS), which acts as a backbone for the entire website and must be handled cautiously: one wrong character can bring down your site completely. Make a backup copy first, and make sure you have direct access to your server.

The .htaccess file is a powerful tool at your disposal, as it lets you block high volumes of bot traffic from ever reaching your server. To block a spammy IP address, integrate code like this into your existing .htaccess file:

RewriteEngine On
Options +FollowSymlinks

# Apache 2.2 syntax: allow everyone, then deny the spam IP
# (123.45.67.89 is a placeholder; on Apache 2.4+ use "Require not ip" instead)
Order Allow,Deny
Allow from all
Deny from 123.45.67.89

Following the above steps is an effective way to block bot traffic, whatever its volume, that would otherwise place a heavy load on your server. Keep in mind that as the deny list grows, checking it adds its own overhead and can slow your site down, so keep it lean. Blocking bots at the server reduces the load and also keeps these visits out of Google Analytics altogether. In case of a serious security issue, contact your web host or system administrator to resolve it.

A Better Insight on Durable URLs for SEO

Great content can bring you better search engine rankings for competitive queries, but URLs can make a big difference too! It is a durable URL that accumulates links and ranking signals over time while continuing to serve fresh content, and it is what keeps you at the top of search results. “Durable URLs” may be all you need to ensure long-term SEO success.

What is a durable URL?

Unlike a disposable URL, a durable URL is one that lasts and can be updated with fresh content later. All the while, it continues to rank for its keyword targets and retains its ranking signals. That said, plenty of URLs are updated frequently without qualifying as durable; durability is really a way of thinking about URLs that ties future success to the accumulation of signals such as inbound links.

Having a durable URL turns the content into a living document rather than a static proclamation. The best part is that the content matching the query can be updated at any time, all year round.

Durable and disposable URLs

A durable URL, particularly the resource name or slug, must not contain dates, opinions, the author’s name, adjectives or other post-specific details. It should simply match the crux of the target search terms or keywords, and it must be simple, short, easy to read and free from clutter. For instance, for a write-up about digital marketing, a sound URL slug could be digital-marketing or business-promotion-digital-marketing. Here, you can update the content without ever bothering to change the URL every month or year.

On the other hand, a disposable URL slug looks like digital-marketing-trends-2018 or promote-business-with-digital-marketing-trends-2018. In such cases, the old URL must be 301-redirected to the new location for its ranking signals to carry over; otherwise no accumulation occurs and the old versions linger. Each year’s new version needs a new URL created to match the content, and a new URL means the webpage starts from the beginning, with no likes, shares, tweets, bookmarks or pre-existing links.
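One quick way to audit an existing site for disposable slugs is to flag anything containing a year. A minimal Python sketch using the example slugs from above:

import re

# A four-digit year in the slug is a common marker of a disposable URL
DATED = re.compile(r"(?:19|20)\d{2}")

slugs = [
    "digital-marketing",              # durable
    "digital-marketing-trends-2018",  # disposable
    "promote-business-with-digital-marketing-trends-2018",
]

for slug in slugs:
    kind = "disposable" if DATED.search(slug) else "durable"
    print(f"{slug}: {kind}")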

Making a choice…

The big question is: how do you choose between a durable and a disposable URL?

There is no compulsion to always choose one particular type of URL; different content and keywords demand different URLs.

Certain types of content work best with a durable URL, including:

  • Critical content and other content targeting vital short-tail keywords
  • Time-sensitive content, mostly based on annual reports or other annual listings
  • Content where core topics remain the same
  • Content which needs to be updated on a regular basis
  • Content that targets high search volume target keywords
  • Content targeting keywords whose context might shift over time

On the other hand, the types of content that might benefit from a disposable URL include:

  • Content that can be used as a reference for future use
  • Content with a specific point of view
  • Author’s content which needs no modification in future
  • Content targeting keywords with no search volume
  • Content with longer-tail keyword targets

While redirects can pass along many ranking signals, a strong initial URL that stands the test of time is irreplaceable!

How do evergreen URLs work?

Evergreen URLs are the need of the hour. Content may need updates, but the underlying URL and basic topic usually remain the same. A better understanding of URLs allows your website to keep its content fresh without worrying about losing pre-existing ranking, traffic or social signals.

Pick your original URL wisely, and you will always have the chance to make necessary changes to the title, images, meta description or body text, even after publishing a new piece of content!