Ever wondered what impact user-generated content (UGC) has on your SEO strategy? You might have noticed that the popularity of UGC has grown significantly in recent years. Top websites are more than eager to share user-generated content on their platforms. The most widely shared forms of UGC include digital videos, blogs, podcasts, review sites, social networks, mobile photography, and wikis.
Believe it or not, consumers act as your best marketers. Potential customers frequently trust reviews written by other customers when making a buying decision. This happens because we are all more open to accepting the opinions and suggestions of our peers. You can always flaunt your product, but sentiment created by someone outside the organization is even more impactful. By simply adding customer reviews to your website pages, you can greatly increase your organic page visits.
UGC, as a form of crowdsourcing, can bring immense value to your website and SEO in the following ways:
Provides SEO with quality content: Through user-generated content, you can continually generate fresh and authentic content, which is key to keeping your rankings on top. Brands do not require a lot of resources to create new content targeted for SEO, as the content is constantly generated by the users themselves. This also means that the brands are not talking about their happy customers; instead, the customers themselves are talking about the brand.
Some sites, especially e-commerce websites, often lack fresh content. In this case, UGC is a great way to offer content to other users looking for information to evaluate a product or service. Having reviews also suggests that your site is active and credible.
Acts as a great source of long-tail keywords: Long-tail keywords are very effective for ranking high. UGC helps brands discover the phrases users actually type into search engines, phrases that brands may not think to include in their own marketing. Through UGC, you can learn which trending topics your visitors are most interested in. With comments and reviews on hand, you also have an opportunity to excel in your PPC campaigns, as you can choose ideal phrases and keywords to optimize your overall SEO strategy.
When users post what they love about your product or service, they give you insight into what your audience is looking for. Knowing which long-tail keywords to use can make a huge difference in your ranking!
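One simple way to mine long-tail candidates from reviews is to count frequent multi-word phrases. The sketch below is illustrative only; the function name, sample reviews, and thresholds are my own assumptions, and real keyword research would still run the candidates through a proper keyword tool.

```python
from collections import Counter
from itertools import islice

def ngrams(words, n):
    """Yield consecutive word n-grams from a list of words."""
    return zip(*(islice(words, i, None) for i in range(n)))

def long_tail_candidates(reviews, n=3, min_count=2):
    """Count 3-word phrases across reviews; phrases that recur are
    candidate long-tail keywords worth validating in a keyword tool."""
    counts = Counter()
    for review in reviews:
        words = review.lower().split()
        counts.update(" ".join(g) for g in ngrams(words, n))
    return [(p, c) for p, c in counts.most_common() if c >= min_count]

# Hypothetical review snippets for demonstration
reviews = [
    "great running shoes for flat feet",
    "best running shoes for flat feet i have owned",
    "comfortable shoes for wide flat feet",
]
print(long_tail_candidates(reviews))
```

Phrases like "for flat feet" surface here even though a brand writing its own copy might never have targeted them.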
Enriches social media optimization: A well-executed content curation strategy is necessary to boost your SEO. Social media marketing and SEO are closely associated: sharing content on social media platforms engages users and brings them to your website. This drives significant traffic to your website, which amplifies your SEO. That, in turn, improves your site's visibility and indirectly pushes your ranking higher in search results.
Sharing your customers' reviews or images builds greater trust with your followers on Facebook or Twitter, which increases likes, comments, and shares. The more you engage your followers on social media, the more organic traffic you attract to your website!
Offers great results for search engine queries: To evaluate a website, automated search engine algorithms use spiders that look closely at the ratings and reviews provided by customers. In addition, to evaluate a site's reputation, search quality raters manually rate websites based on content originality, topical expertise, author authority, and various other criteria. Thus, content provided by your customers helps you earn SEO points from quality rating systems and delivers great results for search engine queries.
Just as top brands and companies are harnessing user-generated content, you too can embrace UGC as an effective tactic to improve your overall organic search performance. The majority of customers today rely on Google for their buying journey, so it is important for you to grasp UGC as one of your strongest marketing tools to strengthen SEO.
Once you leverage UGC, make sure to monitor your website regularly. One negative review can badly damage your brand image if you take too long to respond to it. You will also need to select the right spam filter to prevent comment spam, as it can harm your rankings, authority, and traffic. Managed appropriately, however, user-generated content is one of the best ways to boost your SEO campaign.
Last week, webmasters saw sudden fluctuations in their website rankings. Even premium SEO tools recorded a huge drop in the rankings of key websites. This sparked speculation about a new algorithm update. Google has now officially announced this broad core algorithm update.
The news is now confirmed: Google rolled out another broad core algorithm update this week. The update was released on Monday, 16th April, as confirmed by Google's official Twitter handle.
Google writes in its tweet that it regularly makes broad changes to its algorithms to improve search results, and that it does this routinely throughout the year. Google added that some of its changes focus on specific improvements while others are broad changes.
Many SEOs thought that the update targeted low-quality pages; however, it was later confirmed that the update was closely related to content relevance. Google is adjusting rankings to ensure that searchers obtain the best results for their queries.
Google released a similar algorithm update earlier this month, explaining that a “broad core” algorithm update is something that takes place routinely several times throughout the year.
However, if your ranking is affected by this update, that does not necessarily indicate low-quality content. More likely, your content needs to be more relevant to the users who are looking for the information.
Want to know how to filter referral spam in Google Analytics? You might find your answer here…
You can minimize or even eliminate the negative effects of fake traffic in Google Analytics. If you are a beginner, check out the Google Analytics Solutions Gallery and look for referrer-spam and bot-blocking solutions. You will find some great resources in the gallery, but filtering referral spam thoroughly requires a lot more.
Given below are some effective steps that can help you with the process of filtering spam and blocking bots in Google Analytics:
Identify which bots are harmful: Believe it or not, not all bots are bad! There are plenty of bots that make our search world go round, for example GoogleBot, BingBot, SpyFu, DeepCrawl, and Screaming Frog, among various others. These bots are not dangerous for sites, or even for a site's visitors.
The bots that can turn out to be dangerous for your site include the ones that hijack your traffic, exploit loopholes in your CMS for hacking, or scrape your content. Depending on your industry, certain forms of bot traffic are more harmful than others. Thus, it is important to figure out which bots need to be tackled.
In fact, it's not always bots that are harmful; many other referral sources can send enormous volumes of junk traffic that ruin your data.
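A first triage pass can simply sort User-Agent strings into known-good crawlers and obviously suspicious tools. The lists below are illustrative examples, not an authoritative classification, and the function name is my own; real bots also spoof user agents, so this is only a starting point.

```python
# Illustrative allow/deny substrings -- examples only, not a definitive list.
KNOWN_GOOD = ("googlebot", "bingbot", "spyfu", "deepcrawl", "screaming frog")
SUSPICIOUS = ("scraper", "curl", "python-requests", "wget")

def classify_user_agent(ua: str) -> str:
    """Roughly sort a User-Agent string into good / suspicious / unknown."""
    ua = ua.lower()
    if any(bot in ua for bot in KNOWN_GOOD):
        return "good"
    if any(tool in ua for tool in SUSPICIOUS):
        return "suspicious"
    return "unknown"

print(classify_user_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(classify_user_agent("python-requests/2.31"))
```

Anything that lands in "unknown" deserves a closer look in your server logs before you decide to block it.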
Carefully filter your Analytics traffic: Before you start to filter out bot traffic, you must know what is being removed from your data set, which requires comparing data. To do this, create a separate view in Analytics and name it something like “bot traffic filtered,” following these simple steps:
- Click on “Admin”
- Click on the drop-down menu, in the right column under “View”
- Select “Create new view”
- In the next step, be sure to set your time zone, which defaults to Pacific Time
Block bots in Google Analytics: Google provides an “easy button” for blocking known bots instead of doing it manually, which can cut your work by 75 to 80 percent. The list of known bots is updated regularly as Google keeps finding new ones.
In your new view, open “View Settings” and tick the checkbox to exclude all hits from known bots. This will show you what happens to your traffic once you turn on bot filtering. Before relying on it, though, make sure that none of your important traffic sources are part of Google's list of known bots. When you decide to roll it out in the main profile, add an annotation to explain the change and prepare the people who view your analytics: click the little arrow under the chart in Google Analytics and follow the instructions.
Create a custom referrer filter: Google's bot filtering may work for a while, but there will certainly be other referrers sending volumes of low-quality traffic to your website. To identify them, open the referrer list in Google Analytics and sort the data by bounce rate in descending order so the 100% bounce rates come to the top.
Using an advanced filter, you can narrow the data to sessions above a specific threshold; the right threshold varies with your traffic volume. Once done, scroll down the list and pick the sites you want to add to your referral exclusion list. When you have your list, trim each entry down to its root domain.
In the process, you may also come across some foul websites; make sure you don't visit them to check them out. Visiting them might leave you with unwanted malware or spyware. After your list is fully vetted, you can finally create a custom referrer filter.
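The "high sessions, ~100% bounce rate" pattern described above is easy to automate once you export the referrer report. This is a sketch under my own assumptions: the function name, thresholds, and row layout are hypothetical stand-ins for whatever your export actually looks like.

```python
def suspect_referrers(rows, min_sessions=10, min_bounce=0.99):
    """Flag referrers with many sessions and a near-100% bounce rate,
    the typical signature of referral spam. `rows` mimics a GA export."""
    return sorted(
        r["referrer"] for r in rows
        if r["sessions"] >= min_sessions and r["bounce_rate"] >= min_bounce
    )

# Hypothetical export rows for demonstration
rows = [
    {"referrer": "free-seo-traffic.example", "sessions": 240, "bounce_rate": 1.0},
    {"referrer": "news.ycombinator.com",     "sessions": 180, "bounce_rate": 0.55},
    {"referrer": "buy-likes.example",        "sessions": 35,  "bounce_rate": 1.0},
    {"referrer": "tiny-blog.example",        "sessions": 2,   "bounce_rate": 1.0},
]
print(suspect_referrers(rows))
```

Low-session, high-bounce sources fall below the threshold and are skipped, which keeps the candidate list short enough to vet by hand.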
Set up a bad-referrer filter: Once you have the list of bad referrers you want to block, create a new filter in the view you set up earlier. Remember not to do this at the account level, but only in the view. To set the filter, follow these steps:
- Select “Admin” option
- Select “Filters” under “View”
- Click on “Add Filter” and give a desired name to the filter
- Then click “Custom” and “Exclude”
- Select “Campaign Source” as your “Filter Field” and add all the domains that you want to exclude in the box
Be careful when entering multiple domains, as a malformed pattern can create a mess. Also, make it a point to test your filter and update it regularly, as you will come across new domains to exclude now and then.
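The "Campaign Source" filter field accepts a regular expression, so multiple domains are usually joined with `|`, with the dots escaped so `.` doesn't match any character. A small sketch (function name and sample domains are my own) builds and sanity-checks such a pattern before you paste it into the filter; note that Google Analytics' regex flavor and length limits may differ slightly from Python's, so test inside the GA UI as well.

```python
import re

def build_exclude_pattern(domains):
    """Join spam domains into one alternation for a GA exclude filter.
    re.escape keeps '.' and '-' literal instead of regex-special."""
    return "|".join(re.escape(d) for d in domains)

spam = ["free-seo-traffic.example", "buy-likes.example"]
pattern = build_exclude_pattern(spam)
print(pattern)

# Quick sanity check before pasting the pattern into the filter:
assert re.search(pattern, "free-seo-traffic.example")
assert not re.search(pattern, "news.ycombinator.com")
```

Checking the pattern locally first avoids the "mess" mentioned above, where one bad character silently excludes legitimate traffic.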
Block bot traffic: This step requires real expertise. It involves editing .htaccess (Apache) or web.config (IIS), files that act as a backbone for the entire website. Handle them cautiously, as one wrong character can bring down your site completely. Don't forget to make a backup copy, and make sure that you have direct access to your server.
The .htaccess file is a powerful tool at your disposal, as you can block high volumes of bot traffic from ever reaching your site. Add rules like the following to your existing .htaccess file:
# Allow everyone, then deny the listed addresses
Order Allow,Deny
Allow from all
Deny from 126.96.36.199
Following the above steps is an effective way to block high or low volumes of bot traffic that would otherwise place a heavy load on your server. Note that as the block list gets longer, checking it puts more load on your server and can slow your site down, so keep it lean. Blocking the bots reduces the load on your server and also keeps these visits out of Google Analytics. In case of a serious security issue, contact your web host or system administrator to resolve it.
Great content can bring you better search engine rankings for competitive queries, but URLs can make a big difference too! It is a durable URL that accumulates links and signals over time while continuing to serve fresh content, and it is what keeps you at the top of search results. “Durable URLs” are all you need to ensure long-term SEO success.
What is a durable URL?
Unlike a disposable URL, a durable URL is one that lasts and can be updated with new or fresh content later. In the process, it continues to rank for various keyword targets and maintains its ranking signals. Plenty of URLs are updated very often yet are not durable URLs. Durability is a way of thinking about URLs that determines future success through the accumulation of signals such as inbound links.
A durable URL ensures that the content becomes a living document rather than a static proclamation. The best part is that the content matching the query can be updated at any time, throughout the year.
Durable and disposable URLs
A durable URL, particularly the resource name or slug, must not contain dates, opinions, the author's name, adjectives, or other post-specific information. It should simply match the crux of the target search terms or keywords, and it must be simple, short, easy to read, and free of clutter. For instance, for a write-up about digital marketing, a sound URL slug could be digital-marketing or business-promotion-digital-marketing. Here you can simply update the content without worrying about changing the URL every year or month.
On the other hand, a disposable URL slug would look like digital-marketing-trends-2018 or promote-business-with-digital-marketing-trends-2018. In such cases, the old URL must be redirected to the new location for ranking signals to carry over; otherwise, no accumulation occurs and stale versions linger. To publish a version for the next year, a new URL has to be created to match the content, which means the webpage must start from scratch with no likes, shares, tweets, bookmarks, or pre-existing links.
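The difference between the two slug styles mostly comes down to stripping time-bound tokens like years from the title. A minimal sketch, assuming a simple title-to-slug pipeline (the function name is my own invention):

```python
import re

def durable_slug(title: str) -> str:
    """Turn a post title into a durable slug: lowercase, hyphenate,
    and drop year tokens so the URL survives annual content updates."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if not re.fullmatch(r"(19|20)\d{2}", w)]
    return "-".join(kept)

print(durable_slug("Digital Marketing Trends 2018"))  # digital-marketing-trends
```

Running the same title through this function next year yields the same slug, which is exactly the point: updates land on the URL that already holds the links.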
Making a choice…
The big question is: how do you choose between a durable and a disposable URL?
It is not always necessary to commit to one type of URL. Different content and keywords demand different URLs.
Certain types of content work best with a durable URL, including:
- Critical content and other content targeting vital short-tail keywords
- Time-sensitive content, mostly based on annual reports or other annual listings
- Content where core topics remain the same
- Content which needs to be updated on a regular basis
- Content that targets high search volume target keywords
- Content targeting keywords whose context might shift over time
On the other hand, types of content that might benefit from a disposable URL include:
- Content that can be used as a reference for future use
- Content with a specific point of view
- Author’s content which needs no modification in future
- Content with no search volume target keywords
- Content with longer-tail keyword targets
Redirects can help pass many ranking signals, yet a strong initial URL that stands the test of time is irreplaceable!
How do evergreen URLs work?
Evergreen URLs are the need of the hour. Content may need updates, but the underlying URL and basic topic usually remain the same. A better understanding of URLs allows your website to keep its content fresh without worrying about losing pre-existing rankings, traffic, or social signals.
Make sure to pick your original URL wisely; then you always have the chance to change the title, images, meta description, or body text, even after publishing a new piece of content!