
5 Reasons You Lose Traffic After a Website Migration & How You Can Prevent It

There comes a time in most websites’ lifetime when they will go through the dreaded website migration.

Website migrations are one of the most difficult technical processes to go through, regardless of how skilled you are in digital marketing or development. You are taking a website that (hopefully) has stability, and you are making a huge change. This can cause a host of issues and impact the business objectives regardless of how thorough your planning and implementation have been.

In this post I am going to talk through five of the most common reasons that you may lose traffic to your website during your migration, and the steps you can take to try and prevent it.

1. Lack of respect for redirects

301 redirects are standard practice when it comes to website migrations. You find all of the pages on the current website, and you redirect them to the new location with a single hop (okay, there is a bit more to it than just that). Sounds simple, right?

Well…

What happens when your development team decides to skip that part, ignore your recommendations and hard work, and just put up the new website?

You get Google (other search engines are available) spitting its dummy out about the number of pages returning 404s, and your SEO having a heart attack as the 404 count rises by the thousands every day!

[Image: crawl errors report]

Inevitably, if this issue is not turned around quickly, you start to lose visibility within the search engine. This leads to a decrease in organic traffic and the potential loss of conversion. I don’t need to tell you that this is not a good position to be in.

So how do you ensure that this does not happen?

Firstly, you need to have or build a good relationship with the development team working on the project. Go and buy them coffee, help them out, make friends. This will stand you in good stead, not only for the migration but for other technical changes you require.

Secondly, you need to ensure that you have conducted a thorough crawl of the website using all the tools available to you. I tend to use a combination of the following:

These URLs then need to be mapped correctly to the new location using a single 301 redirect. I would suggest that you use rules where possible to reduce the number of individual redirect calls being made.
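
To illustrate the rule-based approach, here is a minimal Python sketch; the URL patterns and the fallback map are hypothetical, and in practice the rules would live in your web server or CMS redirect configuration:

```python
import re

# Hypothetical pattern rules: one rule covers an entire section,
# avoiding thousands of individual redirect entries.
RULES = [
    (re.compile(r"^/old-blog/(.+)$"), r"/blog/\1"),
    (re.compile(r"^/products/\d+-(.+)$"), r"/shop/\1"),
]

# One-to-one fallbacks for pages that do not fit a pattern.
ONE_TO_ONE = {"/about-us.html": "/about/"}

def map_url(path):
    """Return the new location for an old path, or None if unmapped."""
    for pattern, replacement in RULES:
        if pattern.match(path):
            return pattern.sub(replacement, path)
    return ONE_TO_ONE.get(path)

print(map_url("/old-blog/site-migrations"))  # -> /blog/site-migrations
```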

Thirdly – and here is the important part – test that these redirects work on the staging environment. That way you can check to ensure they have been implemented correctly and that they're behaving how you would expect them to. Once you are happy with these, double-check them on the launch of the new website to ensure they have been moved across, and continue to monitor them over the next few months.
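
For the testing step, a small script can save hours of clicking. This is a minimal sketch using the `requests` library; `redirect_map.csv` is a hypothetical file containing old_url,new_url pairs:

```python
import csv
import requests

with open("redirect_map.csv") as f:  # hypothetical old_url,new_url mapping
    for old_url, expected in csv.reader(f):
        # Don't follow redirects automatically, so each hop can be inspected.
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        if resp.status_code != 301:
            print(f"{old_url}: expected 301, got {resp.status_code}")
        elif location != expected:
            print(f"{old_url}: points to {location}, expected {expected}")
```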

2. Google taking time to recognise redirects

Recent experience has indicated that Google is taking longer than it used to in recognising redirects and changes made during a site migration, meaning the changes are slow to be reflected in the index.

The chart below shows how Google indexed the new and old versions of a website over a two-month period. Although I would expect to see fluctuation over a period of time, previous migrations have seen a much quicker change, with Google quickly reflecting the new URLs within the index.

[Image: indexation of new vs old URLs]

There are a number of reasons why your new website may have a lower indexation figure than your previous one, but it is essential that you figure out which applies.

At this stage, most people will just refer to visibility tools as a measure of progress, such as the one shown below. Although it is good to see how you compare to the previous state of affairs, you need to keep an eye on your internal data.

[Image: search visibility graph]

Tip: Don't take the visibility graph at face value; dig in to see if you have retained similar rankings. It is great to have a similar or better-looking graph, but absolutely pointless if all the terms have dropped to page 2 or beyond.

So how do you help speed up the indexing process?

This is one of those times where you are in Google’s hands, waiting for them to recrawl the website and reflect that in their index. I would, however, suggest that you do the following to help as much as possible:

  • Use the GSC website address change tool (if applicable).
  • Upload new sitemaps to GSC – I would also upload the new XML sitemap to the old GSC account.
  • Regularly review the new XML sitemaps and the pages/sections within GSC that are not being indexed. Identify the key areas and use the Fetch as Google feature to submit them to Google – a quick status check like the sketch below helps here.
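
On that last point, a scripted check of the new XML sitemap can flag problems before you start submitting pages. A minimal Python sketch, assuming the `requests` library and a placeholder sitemap URL:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder URL

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

# A sitemap URL that does not return a straight 200 wastes crawl budget
# and will struggle to be indexed.
for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status} {url}")
```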

3. Removal of pages

It is common during a website migration for the information architecture of the website to change. Moving to a new website/domain provides the perfect opportunity to improve the way users and search engines can get around.

It is at this stage, and before the pages have been removed, that you need to understand the impact those changes will have on the business objectives.

Take a look at this somewhat fictitious exchange:

Client/Stakeholder: “I am going to remove X pages during the migration as they are not converting.”

You: “By doing so you will lose X% of traffic across all channels with the likelihood of losing organic visibility, which in turn will affect conversion.”

Client/Stakeholder: “That’s fine, as they are not converting directly and therefore the traffic is not qualified.”

You: “But this will also have an impact on your assisted conversions, I would suggest that we combine these pages where possible.”

Client/Stakeholder: “I understand, but I am going ahead.”

Website launches:

[Image: traffic drop after removal of pages]

Client/Stakeholder: “We have lost lots of traffic and the board are going nuts!”

You: “Face palm! – How are the conversions?”

Client/Stakeholder: “Down! WTF!”

So how do you reduce the potential of this happening?

Do research! And do it thoroughly. If you and/or the client want to remove pages, then you need to really understand the impact that it will have. Information that you want to be able to present back to the client / key stakeholders includes:

  • Impact on key metrics such as conversion / traffic.
  • Potential impact on search engine visibility. Losing pages will mean the potential loss of keyword rankings.
  • Alternative solutions if relevant. Can you combine some of the pages to make them more relevant? Can the pages be improved to help improve conversion?

4. Crawlers being blocked through Robots.txt & NoIndex tags

As standard practice, you should ensure that any new website is not visible to users or search engines whilst it is going through the development stages. As you can see below, this is not always the case.

[Image: blocked robots]

You could conduct a number of searches in Google right now and find an array of websites with their development or staging environments indexed. Go take a look and try the following:

  • site:[INSERT DOMAIN] inurl:staging.
  • site:[INSERT DOMAIN] inurl:dev.
  • site:[INSERT DOMAIN] inurl:uat.

How did you get on? Find many?

More importantly, how does this cause you to lose traffic? Well, if standard practice has been followed you should not see any of the above, as your development team would have added both a Disallow: / rule to the robots.txt file and the meta NoIndex tag to every single page BEFORE a search engine could crawl it.

Some people might say that this is overkill, but I would want to ensure that nobody outside the confines of the business and any external partners knows what is coming. I would even suggest that the website is placed behind a login and is IP restricted to those trusted few.

Anyhow, I digress. The issue of traffic loss arises when you move the website from development to a live environment. It is at this stage that small details are often missed, notably the removal of the NoIndex tags and the Disallow: / command in the robots.txt.

If these tags are not removed from the website on launch, then you are going to be in a bit of trouble. Your meta descriptions in the search results will start to indicate that the pages are being blocked by robots.txt, and after a while (if not resolved) your pages will start to drop from the index.

So how do you stop this from happening?

This one is easy, or at least I would hope so. On launch of the website, check the robots.txt for a Disallow: / command blocking all robots. I would also recommend that you run a crawl of the website and pay special attention to the NoIndex tag.
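
If you want to script that launch-day check, here is a minimal Python sketch; the domain and the paths are placeholders, and the meta tag check is a simple string match rather than a full parser:

```python
from urllib import robotparser
import requests

SITE = "https://www.example.com"  # placeholder: your newly launched domain

# 1. Is the whole site still disallowed in robots.txt?
rp = robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()
if not rp.can_fetch("Googlebot", SITE + "/"):
    print("WARNING: robots.txt is still blocking Googlebot!")

# 2. Are key templates still carrying a noindex directive?
for path in ["/", "/category/", "/product/"]:  # placeholder key templates
    html = requests.get(SITE + path, timeout=10).text.lower()
    if "noindex" in html:
        print(f"WARNING: possible NoIndex tag on {path}")
```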

5. Losing ALL traffic

One basic mistake that can be made is not moving across or adding in your analytics. I recently came across a website that had gone through a migration and lost ALL of their traffic.

[Image: traffic loss]

As you can imagine they were in despair, so when I pointed out that they did not have any tracking code on the entire website they were very annoyed, but also happy that they had not lost everything.

But why does this happen? Surely you would expect tracking to be added as a matter of course.

Well, in my experience that has not always been the case. Depending on the migration type, and whether you are having a new website built, you need to specifically request that the tracking is moved across.

How can I prevent this from happening?

I would suggest that you use Google Tag Manager and have this implemented on the website throughout the development process.

From here you can do things in two ways depending on how comfortable you are with GA and GTM.

The first option, and probably the simplest, is to ensure your GA code has been implemented within Google Tag Manager but hasn't been published. Then on launch, all you need to do is publish the tag to ensure you are tracking continuously.

The second option, and the one I would generally plump for, is a little more involved. I am keen that all my tracking is in place before the website is launched, and therefore I want to test events, goals, eCommerce (if applicable) and so on, but I don't want that testing to skew any live data. So I would do the following:

  1. Create a new GA account specifically for the staging environment, or use an existing view and filters.
  2. Publish the tag containing the test profile and begin testing.
  3. Once you are happy, and on launch, remove the test tag and implement the tag with the live account details.
  4. Create an annotation in GA to highlight the change of website.

But that's just me. :)

There you have it: 5 reasons you could lose traffic during your site migration, and how you can prevent them from happening. You may think that these are very basic issues, and I would agree. However, these mistakes are made time and time again, because they are small details that people forget during such a large and data-intensive process.

I would love to hear about your migration in the comments below, and whether you came across any of the things I mentioned.

This post was originally published on State of Digital.


11 Browser Plugins I Use For SEO Every Day!

This post was originally published on State of Digital.

Tools come in all different types of formats, including desktop applications, online software, Excel tools, or even browser plugins.

In my last post I covered 17 tools that would help you with Technical SEO. Within the comments a few people mentioned some plugins that they felt warranted being in the list. As I was concentrating on desktop applications at the time I didn't add any plugins, but today is the day for them!

I am an avid user of Chrome. I know there are a large number of people that don't like Chrome, as they feel we are providing Google with more and more information about who we are. But I like it, as do 50% of those using tablet and console browsers, according to StatCounter.

In this post I have provided you with the plugins that I use on a daily basis to ensure that I remain focussed, efficient and productive.

I have split the plugins into the following categories, so feel free to navigate through.

Project Management

1. Procrastinator

When I am trying to get in the zone this plugin comes in handy. Allowing me to block access to websites such as Twitter, BBC Sport and of course State of Digital during certain times of the day reminds me that I have a job to do and a deadline to hit.

Within the settings you can list the websites that you want to block, and the times that you want to restrict access. You are able to do this either individually or as a blanket restriction across the entire list.

I generally go for specific time periods before and after lunch, but you need to find out what works for you.

Download here

2. Worklife

A tool that I have been using more and more recently to ensure that all my meetings are as efficient and productive as possible.

The tool itself allows you to create and share a meeting agenda and write public and private notes during the meeting, whilst keeping a record of the actions and open items. You can also share the agenda beforehand so that all of those in the meeting can add to the notes.

Once you sign up for Worklife, you can use the plugin to automatically open your next meeting in a new tab. Although a simple feature, it ensures that you use the tool in the correct way.

Download here

General SEO

3. Moz Bar

The updated Moz bar is far better than the original, but there is still a way to go with usability in my opinion.

Moz bar

Moz provides you with a very quick snapshot of the website that you are looking at, including:

  • On-page Elements
  • General Attributes
  • Link Metrics
  • Markup
  • HTTP Status

It continues to show you Page and Domain Authority within the bar, and it now also shows DA when the bar itself is hidden, which can be helpful.

Download here

4. SEO Serp

When you want to check a handful of rankings very quickly, this is the plugin for you.

SEO Serp

From anywhere, you can select the search engine, add your keyword and website and SEO Serp will go and find the current ranking. It will also provide you with a list of the current top 10.

This will not replace your ranking software, but if you are in a meeting and looking for a specific keyword ranking position, then SEO Serp will provide it very quickly.

Download here

5. Scraper

Scrape Similar has saved me so much time over the years by allowing me to scrape specific content on the fly.

Whether it be the listings for a search term, every H1 on a specific page or a list of products from a competitor site, Scrape Similar can do it all.

The only slight downside is that you need to know XPath to get the most out of it, but we are all technical here, right? Alternatively, you can right click on the element that you are looking to gather and copy the XPath into Scrape Similar.
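
If you want a feel for the XPath side without the plugin, here is a minimal Python sketch using `requests` and `lxml` (the target URL is a placeholder); the XPath expressions are the same kind you would paste into Scrape Similar:

```python
import requests
from lxml import html

page = html.fromstring(
    requests.get("https://www.example.com", timeout=10).content  # placeholder URL
)

h1s = page.xpath("//h1/text()")        # every H1 on the page
links = page.xpath("//a/@href")        # every link target
title = page.xpath("//title/text()")   # the page title

print(title, h1s, len(links))
```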

All the data that you gather can be exported to a Google Docs spreadsheet before you manipulate it as you see fit.

Download here

Technical SEO

6. Ayima Redirect Path

Before this tool came out, I was struggling to identify redirect loops and hops on the fly. Since then, this plugin has been one of the main plugins within my toolbar.

Ayima Redirect Path

Each time you go to a website, the redirect tool will determine the status code of the page, but more importantly whether it has gone through any redirects. If it has identified a number of redirects, it will list the path that has been taken, allowing you to investigate further.

Download here

7. Web Developer Toolbar

I have been using this tool for years, and it is one that I find extremely useful.

Providing the ability to check so many different elements, including disabling JavaScript and cookies, outlining external links, iFrames, forms and broken images, it has become integral to what I do.

Web Developer Toolbar

One thing to note: if you do use the plugin and change some of the settings, ensure that you reset them when you are finished, otherwise you may find that every website you visit appears broken in some way.

Download here

8. Check My Links

This tool is a bit like Ronseal, it does exactly what it says on the tin.

Check My Links

When clicking on the plugin it will start to analyse all of the links on the current page and provide you with a green or red light based on the number of complete or broken links.

Download here

9. Wappalyzer

Similar to BuiltWith, Wappalyzer will analyse the website you are looking at and provide you with some core information on the go.

Whether you are trying to determine the CMS, the server type, what analytics package they are using or if they are using marketing automation, Wappalyzer can do this and quickly.

I mainly use this to determine technical items, such as what CMS is being used, what server type is running (so I can better understand the configuration) and whether some type of framework is in place.

Download here

10. aHrefs

I'm a big fan of aHrefs and what they are doing with their product, so bringing it into a plugin is great.

AHrefs

The aHrefs toolbar is one that I open specifically when I am analysing link data for a specific website. At a quick glance the toolbar provides both page and root domain link data alongside some social metrics.

If you are logged into aHrefs then you can get more data about the specific domain, such as:

  • Referring IPs
  • Domain overview
  • On-page Elements

Download here

Analytics & Tag Manager

11. Tag Assistant

For anyone using GA and/or Tag Manager, this is a great plugin. It shows you at a glance any issues that you may have with your implementation, as well as which tags are being fired through GTM.

Google Tag Assistant

When analysing GA or Tag Manager, I use this plugin alongside Google Analytics Debug to enable me to pinpoint any issues.

Download here

Lots of plugins, so little time…

These are the ones that I use daily, and I am sure that there are more that you can share with me below. However, if you can only have five from the above I recommend the following:

  • Procrastinator
  • MozBar
  • Ayima Redirect Path
  • aHrefs
  • Tag Assistant

Well that’s it! What do you think, any that you would add? Look forward to reading your comments below or over on twitter @danielbianchini.

[Featured image: Sean MacEntee]


17 Tools to help with Technical SEO

This post was originally published on State of Digital.

Technical SEO continues to be one of the most valuable stages in any SEO campaign. Ensuring that the technical foundations are laid provides you with the ability to become more creative with content.

In this post, I have provided 17 tools that you can use during the different areas of technical SEO.

You, Pen & Paper

Tools are great, but you are better! Tools allow you to get an understanding of any technical issues quickly but it still requires brain power to analyse what has been identified. Therefore, whilst you are running a crawl of the website using your preferred crawling tool, you should also give it a visual inspection. This part of the process is one that is often missed as we rely on tools to do all of the heavy lifting for us.

During your visual review you should be manually checking each template and the source code for the following items making notes as you go:

  • Title Tags
  • Meta Tags including description and directives such as the rel=canonical and robots tag.
  • Heading structure
  • How layered navigation is managed
  • Pagination management
  • www vs non-www.
  • Checking canonicalisation issues
  • Robots.txt
  • HTAccess

Once you have conducted your review, you can verify your findings through the use of tools.

Crawler of choice

Running a crawl of the website is one of, if not the, most important parts of any technical SEO review. Using tools such as those highlighted below will provide you with lots of information about the current state of the website.

Once the crawl has completed, the first step is to export the data into a spreadsheet so that it can be analysed fully. It is at this point that I generally export by section such as response codes, Images, Directives, Protocol etc. This allows me to dive into any issues with a specific set of data rather than having to filter the entire crawl.

Example Crawlers:

Screaming Frog crawler

Google Search Console (formerly Google Webmaster Tools)

Recently renamed Google Search Console, this tool provides you with the information that Google can see and is willing to show you. To conduct thorough technical SEO checks it is essential that you get access to Google Search Console, and if it is not set up then make sure that you implement it.

Once you have access, it is key to identify any significant issues that have been highlighted, and from a technical point of view they are likely to be found in the following three areas:

  • Google Index

Under the Google Index section of Google Search Console, you will find a number of options including Index Status and Remove URLs. These two options provide you with the current indexation figures and which URLs have been removed. If you compare these stats against the number of pages on your site, or the number of pages within your XML sitemap, you can start to identify whether you are suffering from duplicate content issues; a quick count like the sketch below gives you the sitemap side of that comparison.

[Image: Google Webmaster Tools Index Status report]
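
To get the sitemap side of that comparison quickly, here is a minimal Python sketch (placeholder sitemap URL, `requests` assumed):

```python
import requests
import xml.etree.ElementTree as ET

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap = requests.get("https://www.example.com/sitemap.xml", timeout=10).content
page_count = len(ET.fromstring(sitemap).findall(".//sm:loc", ns))

# Compare against the Index Status figure in Search Console:
# far more indexed than this suggests duplicate content (parameters, facets);
# far fewer suggests crawl or quality problems.
print(f"URLs in sitemap: {page_count}")
```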

  • Crawl

Here is where Google gives you insight into the current state of the website in terms of errors identified, how often your website is crawled, how your sitemap is performing and whether it contains any errors, and where you can handle your URL parameters.

  • Search Appearance

Within Search Appearance you will be able to compare the number of missing and duplicate titles and meta descriptions with what you found within your crawl. You can also identify any issues with the structured data that may be present on the current website.

Page Speed Tool

As consumers are constantly switching between devices, page speed has become more important not only from a rankings perspective but also from a usability point of view.

Google currently states that if your website/page does not fully load within 1 – 2 seconds then it is below average. This is supported by users hitting the back button if the website is not visible almost instantly; the same is true on mobile devices, where users expect the website to load quickly even on a 3G connection.

There are many ways in which you can speed up your website including image optimisation, minifying code (JS, CSS, HTML) and enabling compression. These issues can be identified using one of the following tools:

Speed tools:

Google PageSpeed Insights
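
Tools like PageSpeed Insights should be your first stop, but you can also sanity-check the basics yourself. A rough Python sketch (placeholder URL) that looks at download time, compression and page weight; note that this measures only the HTML download, not the full render:

```python
import time
import requests

URL = "https://www.example.com/"  # placeholder: the page under test

start = time.perf_counter()
resp = requests.get(URL, headers={"Accept-Encoding": "gzip, deflate"}, timeout=30)
elapsed = time.perf_counter() - start

print(f"HTML download time: {elapsed:.2f}s (full render will take longer)")
print("Compression:", resp.headers.get("Content-Encoding", "none"))
print(f"Page weight: {len(resp.content) / 1024:.0f} KB")
```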

Change Log

This doesn't happen very often for one reason or another, but it can be an important part of conducting a technical SEO audit. If your website has taken a hit in visibility or traffic/conversions, you may be able to trace it back to a technical change.

One way to keep on top of the technical changes is to add an annotation to your analytics package. This is a very simple process when using Google Analytics and can be shared with everyone that has access to the project. Further to adding information about technical changes, annotations can, and in my opinion should, be used to keep a record of any marketing activities (PR, email, campaigns) as well as to track any confirmed algorithm updates.

By tracking these activities it will be easier to identify what has either helped or hindered your website over a period of time.

Markup Checker

Structured data has become a larger part of the technical process over the past few years; however, there are still a large number of websites that have not implemented any markup.

Those that are early adopters of structured data are seeing the benefits of increased click-through rates and conversions. Implementing the correct markup for your website doesn't have to be difficult, with the following tools allowing you to identify, create and test your specific markup.

Markup tools:

Google Structured Data Testing Tool
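
As a flavour of what the markup looks like, here is a minimal Python sketch that builds schema.org Product markup as JSON-LD; the product details are made up, and the output should be validated with the Google Structured Data Testing Tool:

```python
import json

# Hypothetical product data for a schema.org/Product snippet.
product = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
    "offers": {"@type": "Offer", "price": "24.99", "priceCurrency": "GBP"},
}

# Place the output in the page head inside a JSON-LD script tag.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```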

XML Sitemaps

Surprisingly, missing XML sitemaps are a common theme in technical SEO audits, yet they are one of the most basic features to implement.

At the most basic level you should implement a manual XML sitemap that you have created and uploaded to the server yourself. If you can, and it is advised, implement a feature that automates the creation of the XML sitemap and its publication to the root so that search engines are able to access it.

Two of the following tools will allow you to create either a manual or an automated sitemap, whilst the other two will allow you to validate the XML sitemap that you currently have.
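
If you want a feel for the automated route mentioned above, here is a minimal Python sketch that writes a valid sitemap; the URL list is a placeholder, and in practice it would be pulled from your CMS or database so the file regenerates itself:

```python
from datetime import date

urls = [  # placeholder: in practice, pulled from your CMS or database
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append(f"  <url><loc>{url}</loc><lastmod>{date.today()}</lastmod></url>")
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))
```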

Sitemap Tools:

These are just some of the tools that are available to be used during the technical SEO phase. What tools do you use for technical SEO? I'd love to hear your thoughts in the comments below or over on twitter @danielbianchini.

[Image Credit: Flickr – OZinOH]


4 Technical SEO Issues That Often Go Unnoticed

This post was originally published on State of Digital.

Focusing on great content for your website, but failing on technical SEO, is like putting Fernando Alonso in the 2015 McLaren F1 car. You have a great asset, but are being held back by technical issues!

In this post, I discuss four technical SEO issues that go unnoticed by most companies.

Redirect Chains

Redirects are part and parcel of having an evolving website. You want to ensure that both search engines and users do not have a bad experience, and therefore you add redirects to the most relevant page, and quite right too.

But what occurs more often than some people realise is that the page you are redirecting to has itself already been redirected, thus causing a redirect chain. This is common within both eCommerce and editorial content, but can be solved relatively easily.

The problem you have is you are potentially losing any link authority that you may have gained from pages you redirected two or three iterations ago. I appreciate Matt Cutts has said all link value is passed through redirects, but I am a big believer that the more redirects they go through the more value is lost.

To see if you have any redirect chains on your website, all you need to do is fire up Screaming Frog and run a crawl. On completion of the crawl, go to the menu and select Reports > Redirect Chains.

This will provide you with an XLS of all the redirects and redirect chains that are currently live on the website. The next step will be to start cleaning these up. I have seen some good gains in traffic by changing a redirect chain into a one-to-one redirect.
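
You can also spot chains from the command line. A minimal Python sketch using `requests` (the URL is a placeholder):

```python
import requests

def redirect_chain(url):
    """Return every hop a URL passes through before it resolves."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    return [r.url for r in resp.history] + [resp.url]

# Anything longer than two entries (origin plus destination) is a chain
# worth flattening into a single one-to-one redirect.
chain = redirect_chain("http://example.com/old-page")  # placeholder URL
if len(chain) > 2:
    print(" -> ".join(chain))
```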

Layered Navigation

I come across this issue ALL of the time, yet nobody seems to be solving it. It is not that difficult to plan for when you are creating an eCommerce website, or to change once it has been built, but people are still not dealing with layered navigation.

For those that are not sure what I mean by layered navigation, I am talking about the filtering system you see on most, if not all, eCommerce product listings. It is the navigation that allows you to filter down by brand, size, colour, reviews, etc.

This, alongside product pages, is one of the most common causes of duplicate content on eCommerce websites. If you run an eCommerce store, 9 times out of 10 a site: search in Google will show far more pages indexed than you would expect. This is likely to be down to issues with layered navigation.

Providing the user with the flexibility to be granular with their filtering is great from a user perspective, and one that I fully support. However, the filters need to be handled correctly.

Here are three examples of issues you will find with layered navigation and how they could be solved.

Product listing pages:

If you provide the user with the functionality to change the number of products that are being viewed within the listing, then you need to ensure that only a single URL is being indexed.

The most common way of handling this is by adding in the rel=canonical tag. The only question you need to ask yourself is which page do you want to be indexed? On most eCommerce solutions you have the following options:

  • 12 (default view)
  • 24
  • 48
  • View All

Depending on the speed of your website I would either rel=canonical to the default view or the view all page, but I would definitely have one. If you do not include a rel=canonical tag then all of these pages will be indexed for every single variation of filter you can imagine for your website. That is a lot of extra pages!

Filters:

You do not want and/or need all of your filter options to be dynamic. You would expect brand terms to be static URLs rather than dynamic URLs. There are likely to be other filter options and this does depend on the website that you are working on, but keyword research can help you with this.

However when allowing users to filter by items such as colour, size, price and review, you are likely to want to have these dynamic, with a rel=canonical tag added.

Examples below.

  • www.domain.com/product/brand/ – This is fine to be kept as it is.
  • www.domain.com/product/brand/?=colour – This should have the following canonical tag added to it: <link rel="canonical" href="http://www.domain.com/product/brand/" />
  • www.domain.com/product/brand/?=colour&?=size – This should have the following canonical tag added to it: <link rel="canonical" href="http://www.domain.com/product/brand/" />
  • www.domain.com/product/brand/?=colour&?=size&?=review – This should have the following canonical tag added to it: <link rel="canonical" href="http://www.domain.com/product/brand/" />

*Note: All eCommerce sites are different, and keyword research should be carried out to determine the type of pages that are delivered by static and dynamic URLs.

Pagination:

This can be handled in two ways, either canonicalising all pages to a single page, usually the View All, or using the rel=next/prev feature that is available.

The option that you take here very much depends on the speed of your website and the number of products you have available. Google prefers to surface the View All page, and if there are fewer than ten pages I like to rel=canonical to that page. However, if there are consistently more than ten pages, I implement the rel=next/prev tags to indicate to the search engines that the pages belong to a single paginated series.

You can find more on the Google Webmaster Central blog.

Robots.txt

When was the last time you honestly looked at your robots.txt? Have you ever looked at it? You are not alone; a lot of people have not. The robots.txt file provides you with the ideal way to restrict search engines from accessing content or elements they do not need to see.

It is important that the robots.txt file is understood and utilised as much as possible. Adding in rogue folders and files can have a serious impact on the way that your website is being crawled.

If you are looking for more information on how to use the robots.txt file, then Google has provided a resource for you – https://support.google.com/webmasters/answer/6062608?rd=1

Schema Mark-Up

I attended a conference recently where the presenter asked how many of us were using schema markup; only four people raised their hands. Four people out of a room of nearly 200. I was astonished.

For eCommerce it is essential, and I cannot recommend it enough to any of my clients. Not just because we have entered the world of structured data and we need to provide the search engines with context about what we are trying to say, but at present it still differentiates your website in the SERPs.

There are a range of schema markups that are available, so you do not have the excuse of saying ‘I don’t work on an eCommerce store’. To find out more information then take a look here – http://www.schema.org/ and if you are looking for help to create your schema then here is another handy tool – http://schema-creator.org/.

If you only take a couple of recommendations away from this post, I would strongly recommend you solve your layered navigation issues and implement schema where possible.

Do you often miss these four technical SEO features? Are there others that you feel get missed when auditing your website from a technical perspective? I would love to hear your feedback in the comments below or on twitter @danielbianchini.

[Image credit: The Guardian]


Content inspiration: One tool, 1000s of sparks!

This post was originally published on State of Digital.

Content creation is becoming harder and harder every day, as everyone online becomes a publisher. Whether you are working in publishing, electronics, white goods or fashion, everyone is publishing huge amounts of content.

Previously I have written about 6 tools that can help inspire your content, using social media, Q&A websites and keyword tools. One of the tools that I mentioned was Ahrefs Content Explorer and how, similarly to BuzzSumo, it provides information on the most shared content on the web.

In this post, I am going to show you how to use Content Explorer to provide you with 1000s of URLs to help further inspire your content creation by either providing new ideas, improving content, or combining to provide a more valuable resource.

Note: I am not affiliated with Ahrefs, I am just an advocate of the tool suite that they have put together. To get the most out of this post you will need a subscription.

Now that is out of the way, let's get started.

Content ideas from across the web

When opening the Ahrefs Content Explorer tool, you need to start by entering a keyword topic. As a first step to finding your content inspiration, it is key to start with something as broad as possible, so that you can get a clear understanding of what content is succeeding within your market.

Once you have entered your topic, it is essential that you exclude your own content from the mix so that you do not skew any results. To do this, check the exclude radio button on the left-hand side under the Domain Name feature and add your own URL; in this instance I have removed our domain White.net and hit the search button.

[Image: Ahrefs Content Explorer results]

This returned over 200k results, from websites such as Forbes, Mashable, Moz and the Telegraph. It goes to show that lots of people are talking about SEO, and not just on the websites you may consider a competitor.

As a starting point, export the data so that you can start to collate a list of URLs to analyse for inspiration. When exporting the data you will be asked whether you want a fast export of 1,000 rows or a full export; which to choose depends on your subscription level and how many credits you have or want to use.

Once you have downloaded the data, you will need to create two extra columns: one for the topic, and the other for the domain. This will help later in the process when filtering to identify opportunities. Using an Excel formula, or a few lines of code, you can automate filling in the domain column; a sketch follows below.

[Image: exported spreadsheet with topic and domain columns]
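
If you prefer code to spreadsheet formulae, here is a minimal pandas sketch; the file name and the "Content url" column are assumptions about what your export looks like:

```python
import pandas as pd
from urllib.parse import urlparse

# Assumed export from Content Explorer with a "Content url" column.
df = pd.read_csv("content-explorer-export.csv")

df["topic"] = "SEO"  # tag the export with the search topic you used
df["domain"] = df["Content url"].map(lambda u: urlparse(u).netloc)

# After appending several exports, deduplicate on the URL column.
df = df.drop_duplicates(subset="Content url")
df.to_csv("content-ideas.csv", index=False)
```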

Now that you have your first set of data, if possible you should run more searches, this time a little more targeted. With the example that I am using, I would then perform the following searches each time downloading and adding to my spreadsheet:

  • Keyword research
  • Technical SEO
  • Content marketing
  • Website audits

Once you have completed a number of searches, dedupe the URLs gathered so that you only have unique entries. This step allows you to see what is being created across the web for your search topics, but what about those sites that you class as competitors?

What works for your direct competitors?

The broad searches that you created previously are likely to have already provided you with lots of inspiration, but now it is time to be a bit more targeted.

Understanding what is working for your competitors is a key aspect to any search marketing campaign, so determining what is most popular is key.

This step is very similar to the first, but you are going to include your competitor domains only. Remove any search term that may be in the search bar, and instead include your competitor's domain in the search box on the left-hand side, similar to when you excluded your own domain earlier. Hitting search now will present you with content that is only available on the domain that you entered.

[Image: Ahrefs Content Explorer filtered to a single domain]

The example above shows the most popular content on the State of Digital website ordered by the median number of shares in descending order. What you may notice is that just because you generate lots of shares, it does not automatically mean you get lots of links.

It is at this stage that you have a choice on how you want to proceed. You can either download the report as is, with every topic shown, or you can use the same or similar terms you used earlier to help with the targeting. Personally I would choose to download every topic to be as thorough as possible, but each situation is different.

Once you have decided what to do, you need to download either one or multiple spreadsheets, and add them to your existing set of data. Do not forget that you need to tag each URL with a topic for later.

Now repeat this step with as many competitors as you feel necessary.

Note: Not every domain will be in the Ahrefs database, but it is growing daily.

Inspiration, recreation and combination

By conducting the above process I have managed to create a spreadsheet of over 8,000 pieces of content with social and linking metrics, which I can use to inspire the content that I will create going forward.

[Image: combined spreadsheet of content ideas]

Inspiration:

By browsing through the list of URLs filtered by a specific topic, I managed to come up with a large number of content titles and ideas that will allow me to create fresh content over the coming months.

Recreation:

Looking at somebody else’s content and recreating it is not stealing, as long as you are providing value by creating a better resource.

When looking for content that could be recreated and improved, I filter the data by topic and date. This allows me to see any content that has been created within the past 12 – 24 months. If the content was well shared and linked to, and has not been covered in depth in the last six months, I add it to my list.

Within a few minutes I have a list of potential ideas, including this piece that was created in 2012 over on Search Engine Journal: The Definitive Guide to Local SEO. This piece generated a large number of shares and links, but more importantly it can be updated in light of the latest changes implemented by Google.

Combination:

Sometimes you come across a number of posts that are very good, but just need that bit extra added to them. This is when combining content ideas to create a more in-depth and valuable resource is beneficial.

Further analysis of the data that I have gathered has led me to identify 4 different blog posts that fit very well together, and if combined would provide a valuable resource. It could even be turned into a presentation and whitepaper, increasing the potential for shares and links.

Once you have a list of content ideas, it is time to prioritise your efforts on what you feel will deliver the biggest return.

Using the right tools, having the right process and a little bit of time, you can find thousands of pieces of content that will inspire you and your campaigns. It has helped me, so I hope it will help you.

Are you taking these steps or something similar to inspire your content creation? I would love to hear your thoughts in the comments below or over on twitter @danielbianchini.

[Photo Credit: miscellaneaarts via Compfight cc]


5 Techniques to Ensure Your Content is User Focused

This post was originally published on State of Digital.

It is all about the user; it always has been and always will be. It is just that we focus on the user in slightly different ways.

With digital marketing, we usually concentrate on keyword and competitor research to determine the type of content that we develop. It becomes all too easy to get wrapped up in our digital world, and forget the needs, wants and desires of those that we are marketing to.

But digital marketing, just like offline marketing, should be putting your target audience at the heart of everything you do. During this post, I will discuss five techniques to help you further understand your audience without using keyword research, ensuring that your content is user focused.

Become an expert, taught by experts

[Image: experts]

[Image Credit: Flickr]

Stakeholder meetings have become a key part of any online marketing campaign, regardless of the channel that you are working in.

The aim of the meeting is simple: you need to know everything that your internal contact and their colleagues know about the product/service. Immersing yourself within the business is key not only to cementing a client/agency relationship, but to allowing you to market appropriately.

Some things that have worked well for us at White.net are attending product demonstrations, going on courses, and using the product/service that we are promoting. These things allow us to put ourselves in the consumer’s shoes, which means that we can provide the user with a better experience.

Stakeholder meetings could span several sessions, but by the end you should be well versed in the service/product and know at least:

  • What it is
  • How it works
  • How it benefits the user
  • Whether any changes are happening within the sector
  • What pain points it addresses
  • What the USP is
  • Why it is better than the competition
  • Any technical specifications.

These are just a few of the learnings that you should have gathered from these meetings. This is the only stage during the campaign when the people you are working for will be working harder than you.

Become a member of the customer service team

On some occasions a series of meetings is just not enough; you need to see how people are interacting with the product and the questions that they are asking.

We often forget that shopping online is not always as useful as touching and using the product. Therefore it is essential to understand the questions that users are asking the business. This can be done easily by shadowing a member of customer services, or by sifting through all the questions that have been sent in.

As you are listening to the customer service representative or searching through all the questions, you need to keep a lookout for common themes. Once you have identified these themes, it becomes easier to create content that answers those questions and saves the customer services team some time.

This can be quite time consuming and is not always necessary, but on those occasions that it is, you will find some really useful insight into what content your user is looking for.

Surveys

Cheap, easy to create and not hugely intrusive, surveys are a great way to get information on what your users think of the product/service that you offer.

However, in my opinion you need to split your survey audience into two groups: existing customers and potential customers. This way you are more likely to get reliable results, as the potential customers are not biased.

When surveying your existing customer base, the best way is to run an email campaign with an incentive to improve participation. By doing it this way you are able to be more specific in the questions that you are asking your customers. Although you may be providing an incentive to answer questions, don’t overdo the number that you ask, keep it to a minimum to get the best possible response.

For new customers, it is likely that you will want to target them when they are leaving the website and not whilst they are browsing it. Again, make sure you ask limited questions, and in my experience a single question for this type of survey will ensure that the user does not get annoyed.

When it comes to setting up a survey there are a lot of tools to use, but I would recommend the following:

Understand your audience through personas

Most businesses have created, or are in the process of creating, personas for their marketing activity. If you have the ability to do so, then ensure that you read through the persona documents that the business has put together.

[Image: YouGov Profiler]

Although some persona documents can provide you with limited information, the majority will provide you with a few snippets that can be used to influence your marketing approach.

Some of the key pieces of information that you should look for are:

  • Who is your target audience (age/gender)?
  • What channels do they consume content through?
  • What influences a purchase?
  • Do they use social media?
  • Do they use multiple devices when browsing online?

Knowing this information will allow you to tailor some of the marketing content that you would create, to ensure that your target audience and user is taken into consideration. There is no point in creating something that your audience will not be interested in.

If your business does not have, or is not in the process of putting together, a persona document, then use the YouGov Profiler tool as a quick way to understand your users. When using this tool, you may need to find a brand that is a close match if your own brand is not within the database.

Focus Groups

Although considered an old tactic within offline marketing and product development, focus groups are becoming increasingly popular within digital marketing.

The aim of the focus group is to understand how people feel about the product/service in an open and honest forum.

You will be able to ask those that use your product/service exactly what information they would like to see on the website, what questions they were asking and whether the website provided the answers. During this session you can get feedback on the product/service and discover how it compares to your competition. When you are running a focus group, ensure that you are prepared so that everything runs as smoothly as possible. You are talking directly to your customers, so you want to ensure they continue to get the best possible service.

Once the session has been completed, you should have a lot of useful notes that will filter into the strategy that you apply when marketing to your users.

And this helps how?

It is a fair question, but how many of you know this type of detail about the product/service that you are promoting? By going through as many of these steps as possible at the start of the campaign, you will gather a large amount of information about your users that keyword and competitor research will not provide.

Some of the information that you will have gathered will allow you to start to develop content that, alongside some keyword research, will be user focused.

If you are not confident in running these sessions yourself, then see if the business you are working for is running any internally. I promise you it will be beneficial to your relationship with the business, and to how you approach the campaign.

Are you using any of these techniques already? If not, I would love to hear the alternative ways you do go about ensuring that you are focused on the user, either in the comments below or on twitter @danielbianchini.

[Featured Image Credit: Flickr]


6 Tools You Should be Using to Inspire Your Content

This post was originally published on State of Digital.

Content, content, content. It is the most talked about subject in our industry, and rightly so. However, it is becoming increasingly difficult to create new, fresh ideas for your niche, as everyone online has become a publisher.

Some create content just for the sake of it; others really go into depth and try to create the very best resource for that subject. No matter which of these two categories you fall into, the six tools I have provided below will help you to continue creating the content you need.

Content Explorer – aHrefs

The Content Explorer tool within the aHrefs suite is relatively new. Similar to BuzzSumo, you search for a topic and it will return a list of the most shared and linked-to content within the database.

[Image: Ahrefs Content Explorer]

The feature I really like is the advanced search. This allows you to be more specific in your search through the use of boolean operators, groupings, and even the domain. This tool means you can easily spot content that has been successful both for your own website and for your competitors.

Just because it has been done before, does not necessarily mean it cannot be done better!

Uber Suggest

Commonly known for being awesome at helping with keyword research, Uber Suggest can also be used for content inspiration. By providing you with the search terms from Google Suggest, the tool throws up some great ideas.

[Image: Ubersuggest]

If you can, then use multiple country locations to get the most out of the tool and enhance the topic that you have identified. Combining multiple searches will also ensure that the topic you are writing about is extremely well covered.

Quora/Yahoo Answers

This one is an obvious choice, but one that seems to be easily forgotten. Consumers often use Q&A sites to get answers to the questions they need answering.

Spending just 30 minutes a day looking around these sites will provide you with enough content ideas for a month! The key here is to be broad with your original search and then, once you have identified a topic, get more specific. This will allow you to really drill down into the detail of that topic.

Topsy

Creating content is not always about what people have been looking for in the past, but also what they are sharing in the present. Being proactive is key: understand what people are sharing right now, and see if you can create something better.

[Image: Topsy]

Using tools such as Topsy will allow you to stay on top of the different types of content being shared by your audience right now.

Hubspot Blog Topic Generator

Can't think of what to write? Then you are in need of blog title inspiration. Hubspot have created a very simple, easy to use tool that helps create blog titles for you, based on the keywords that you provide.

[Image: HubSpot Blog Topic Generator]

They openly admit that the algorithm is not highly sophisticated and may require some amending, but it will definitely give you something to work with.

Your Own Analytics – Looking at old posts.

One tool that is often overlooked when generating content ideas is your own analytics package. Although not an obvious choice, there is no better place to see how your old content performed and whether it is still performing well.

Take a look at the content that was created 12-18 months ago: did it receive much traffic? Does it still? If you answered yes to one or both of those questions, then how much social media attention did it get? If it generated a significant amount, and I do not necessarily mean thousands of shares, the piece may be worth rewriting and republishing.

If you can find 5 – 10 posts that are worth rewriting, then you save yourself a lot of time. A tried and tested piece of content can be updated regularly and kept current.

These are just some of the tools that I use to help during my content idea generation process. What tools do you use? I would love to hear your thoughts on the tools I mentioned above, either in the comments or on twitter @danielbianchini.

[Flickr Credit – Michael Phillips]


5 Steps to Spotting Keyword Opportunities

This post was originally published on State of Digital.

It's a bit late to say it's a new year with new opportunities, but I am going to say it anyway! We have all been busy building out new campaigns and trying to find opportunities for the clients or brands that we work with, as we get into full flow in 2015.

One thing that I have been working on a lot over the past few months is spotting opportunities, whether that is for existing clients or for potential clients through pitch meetings, and I wanted to share my ideas with you.

The process is a simple and easy one to follow. The start requires time and patience, but the outcome could be amazing!

Before we start, some housekeeping: I am using dummy data, and I have no affiliation to any of the brands mentioned in my examples. :) I am using Linkdex (our software provider) as the tool, but all of this could be done within a spreadsheet, although that requires a little bit of work.

Now that is out of the way, let’s begin!

Gather the right data, not ALL the data!

There are two very important but time-consuming steps to identifying the right opportunities, and the first is keyword research.

We are all capable of doing keyword research, so I won’t go into too much detail here, but just make sure that the terms that you are targeting are highly relevant.

To help gather this information, I use a combination of the following tools:

Conducting your keyword research using these tools should help you come up with a comprehensive list of highly relevant terms.

One thing that I would recommend, and that I think is overlooked far too often, is looking at the search terms that you are already visible for. These terms may be ranking on page 2+, but their presence is a clear sign that the search engine finds your website relevant for them.

A quick way of finding the search terms you are already visible for is running your website through SEMrush. They will provide all the terms that the website ranks for within the top 20.

Whilst conducting your keyword research, make sure that for every keyphrase you get the following data based on an exact match query:

  • Monthly Search Volumes
  • Average CPC
  • and get the keyphrase ranking whilst you are at it

This will become helpful when identifying the opportunity. Hopefully, by now you will have a large number of keywords in your spreadsheet ready to start tagging.

[Image: keyword list]

Tagging by Topic

No, we are not running a PPC campaign here, but we should be using some of the methods that PPC practitioners have been using for years.

Tagging keywords into specific groups/topics allows you to be more granular with your analysis to spot the opportunity.

Some of the keyword research that I have come across has, on most occasions, tags associated only at the very top level. Keywords have been tagged simply as high, medium or low, which is OK, but doesn't really provide you with the information you need.

If you are keen on using the high, medium and low tags, then I would suggest that you use them in combination with more specific tags.

For tagging keywords, I like to keep it simple and related to the website that you are working on, so I use the navigation structure.

As an example, a tagging structure could look as follows:

Category
Category – Sub-category
Category – Sub-category – Product

You are also likely to want to include a modifier tag as part of your process, such as location, price, style, etc.

Location
Location – London
Location – Edinburgh

For my example, and since I am looking at the cruise industry, I used the following tag method.

Brand (Everything related to a specific brand).
Brand – Cruise Liner (Everything related to a specific brand's ships).
Brand – Cruise Liner – Ship Name (Everything related to a specific ship for a specific brand).
Brand – Cruise Liner – Ship Name – Details (Everything related to a specific ship's specification, for a specific brand).

[Image: keyword tagging]

There were others, but this should give you the idea.

Now that you understand the tagging process, you need to go through each keyphrase, giving it a tag or multiple tags if required.

For the search term “Balmoral deck plan” I would give the following tags:

Brand (Fred Olsen)
Brand – Cruise Liner (Because it is a cruise liner)
Brand – Cruise Liner – Balmoral (Keyword targeting a specific ship)
Brand – Cruise Liner – Balmoral – Detail (Keyword targeting specific details of a ship)

This may look like overkill, but when you come to analyse the data it becomes key to seeing where the opportunities really lie.

This process is very time-consuming, and you could automate it to a certain extent, but it is a very important step. Adding a keyword to the wrong tag could mean missing a huge opportunity, so make time to do this properly.

Once you have completed this, you can either upload it to your SEO software (Linkdex in my case), or you can continue the analysis within a spreadsheet.

Analysing opportunities by tag

Once uploaded to Linkdex, the software will do a lot of the hard work for you. It will provide you with estimated traffic based on ranking positions and search volume, as well as how much the current level of traffic would cost through PPC activity.

If you are using a spreadsheet, you can do the maths through formulae, using estimated click-through rates and the search volume figures you have, along the lines of the sketch below.
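
A minimal Python sketch of that spreadsheet maths; the click-through rates, conversion rate and order value are illustrative assumptions, so swap in whichever figures you trust:

```python
# Illustrative organic CTR assumptions by ranking position.
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_visits(search_volume, position):
    return search_volume * CTR.get(position, 0.02)  # 2% beyond position 5

# e.g. 4,400 monthly searches, moving from position 8 to position 2.
uplift = estimated_visits(4400, 2) - estimated_visits(4400, 8)

# Layer on a conversion rate and average order value for the revenue case.
conversion_rate, avg_order_value = 0.02, 150  # illustrative assumptions
print(f"Extra visits: {uplift:.0f}/month, "
      f"extra revenue: £{uplift * conversion_rate * avg_order_value:,.2f}")
```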

Now it is time to start analysing your keywords at a tag level (Group Analysis in Linkdex) to identify any noticeable opportunities.

When looking at the data provided by Linkdex, or filtered in your spreadsheet, it is key not to be too focused on the search volume column.

Take a look at all the data provided, paying particular attention to the number of keyphrases ranking within the top 20. There is no point in targeting terms that have lots of search volume but no current rankings. Start by looking at the keywords that already rank and also have good search volumes, as these are realistic opportunities; a quick filter like the sketch below will surface them.
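
Here is what that filter might look like over the spreadsheet data; the thresholds and example rows are arbitrary, so tune them to your own market.

```python
# A sketch of the opportunity filter: keyphrases already in the top 20
# with a reasonable search volume. Rows and thresholds are examples only.
keywords = [
    {"keyphrase": "balmoral deck plan", "volume": 1300,  "position": 14},
    {"keyphrase": "cruise holidays",    "volume": 27000, "position": 55},
    {"keyphrase": "fred olsen cruises", "volume": 6600,  "position": 8},
]

opportunities = [k for k in keywords
                 if k["position"] <= 20 and k["volume"] >= 500]

for k in opportunities:
    print(k["keyphrase"], k["position"], k["volume"])
# balmoral deck plan 14 1300
# fred olsen cruises 8 6600
```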

Fred-Olsen-Opportunity

Once you have identified an opportunity, such as the Fred Olsen tag highlighted in the image above, it is time to look at the tags at a more granular level.

Fred-Olsen-Opportunity-Granular

Looking at the more specific tags identified within the Fred Olsen branded tag, it is clear that they all have very good potential but are currently driving very limited estimated traffic.

At first glance it looks like we have an opportunity, but we need to see what the forecasted return would be.

What is the potential? Forecasting

Forecasting is a dubious subject within the SEO industry, as there are so many variables and so little accurate data available, but C-levels want to see what their return would be.

Forecasting within Linkdex is easy using the forecast tool. Select the keyword tag that you want, set the time window over which you will make improvements, and estimate where you will get the rankings to. The forecast tool will then estimate the uplift in traffic.

Fred-Olsen-Opportunity-Forecasting

This can also be done using a spreadsheet (we have one internally), comparing current rankings with estimated potential rankings to produce a new traffic figure.

But this data alone will not necessarily swing the decision for you or the C-level that you will be reporting to.

The next step would be to add in a conversion rate and an average order value to provide you with a return.
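
Pulling the forecast and return steps together, the arithmetic looks something like the sketch below. Every input is a hypothetical example, not the actual Fred Olsen data.

```python
# A hedged sketch of the forecast-to-return maths, reusing the CTR idea
# from earlier. All inputs are hypothetical examples.
def traffic_uplift(volume: int, current_ctr: float, target_ctr: float) -> float:
    """Extra monthly visits from moving between two CTR levels."""
    return volume * (target_ctr - current_ctr)

uplift = traffic_uplift(volume=20000, current_ctr=0.01, target_ctr=0.10)
conversion_rate = 0.02        # assumed site conversion rate
average_order_value = 1200.0  # assumed £ per booking

monthly_return = uplift * conversion_rate * average_order_value
print(f"Uplift: {uplift:.0f} visits -> £{monthly_return:,.2f} per month")
# Uplift: 1800 visits -> £43,200.00 per month
```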

In this case it was £108,884.34 per month, and yes, that is just for one tag!

But before we get carried away, let’s make sure the competition isn’t so strong that we would get nowhere near our targets.

Identify, analyse and judge

Before we go to the C-Level with a great opportunity, we need to do our due diligence on the competition. Who are they? What terms are they ranking for and with what content? Can we realistically compete with them for these terms?

A quick competitor analysis will allow you to find that information and provide the final piece of the jigsaw.

With all this information, you can decide if you have an opportunity that is worth taking to the C-Level to invest in.

These are my steps to identifying an opportunity, and they can be used for any brand or business. The time-consuming parts of the process are at the start, but once you have the keywords and they are tagged properly, identifying the opportunity is the easy bit!

I hope this is useful. I’d love to know how you spot keyword opportunities, and whether you agree or disagree with my approach, either in the comments or on Twitter: @danielbianchini.

[Flickr Image Credit]


Decisions, Decisions! The Marketing Software Dilemma

This post was originally published on State of Digital

Software is big business! Marketing software, and online marketing software in particular, is big business. Only recently, HubSpot floated on the New York Stock Exchange with a “closing price [which] reflects an overall valuation of $913 million”, according to WBUR. That’s nearly a billion dollars for a marketing software company!

Although we already knew it (or we should have), it goes to show the potential that is out there for digital.

Although HubSpot is one of the more successful examples of online marketing software available, there are lots of tools on offer to us as online marketers. There are many examples I could have chosen, but staying close to search and SEO specifically, I have chosen link analysis products. This is a competitive niche that has become an essential part of the marketing mix, especially with the algorithm changes and Penguin updates.

Below is a small list of different software providers or companies that offer a link analysis service:

  • Majestic
  • Ahrefs
  • Open Site Explorer
  • Searchmetrics
  • Linkdex
  • BrightEdge
  • WebMeUp
  • OpenLinkProfiler

All of the above provide some kind of link data for analysis. Some use the Majestic API and add an extra layer of filtering on top; others have their own indexes that may or may not produce different results. But for the majority, the main difference between them is how they present that data to us.

How many of the above do you use? Four? Five?

It’s great to have these options, and the list is unlikely to shrink – in fact it is likely to grow – but how many do you need? How much will they all cost? Do they all do something different? And this is just for one small niche within the SEO industry.

There is lots of content about which tools you should have and why, but how do you evaluate whether you need them? The next part of this post takes you through how we evaluate tools at White.net, and how we make that oh-so-difficult decision.

Try before you buy

We trial everything. Well, when I say everything, I mean the tools that we are interested in. This is THE most important step when deciding whether a tool is going to be an asset to your wider toolset.

Most tools provide some kind of trial, whether it is 30-day access, a demo or a free version. If there doesn’t seem to be one available, get in touch with the provider, as they will often give you limited access.

Once you have access to a trial, make sure that you actually test it. I have lost count of how many times in the past we have had a software trial only to let it run out without putting it through its paces. To stop this from happening, we now give one of our teams the task of testing the tool. They use it for client work, as well as for any internal projects we are running. Once the free trial has expired, the team provide their feedback on whether they feel it is a valuable addition to the marketing mix.

This feedback is key. If you force tools onto your teams that they don’t need, or don’t feel add any value, you will find yourself spending a lot of money and not getting much back.

Nice to have or must have?

During the trial, the team needs to consider very carefully whether the tool is a must-have, or just a nice-to-have.

The reason for this is simple: there are many tools available that do exactly the same job. To understand whether a tool is going to be valuable, you need to create bespoke criteria to judge it against. This gives you a more analytical approach to determining whether the tool will add value.
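
One way to make those criteria concrete is a simple weighted scorecard for the trial team to fill in. The criteria, weights and scores below are invented purely for illustration; define your own.

```python
# A simple weighted scorecard for trial feedback. Criteria, weights and
# scores are all invented for illustration.
criteria_weights = {
    "solves a problem no existing tool does": 0.4,
    "quality of data": 0.3,
    "ease of use": 0.2,
    "support and documentation": 0.1,
}

team_scores = {  # each criterion scored out of 10 by the trial team
    "solves a problem no existing tool does": 8,
    "quality of data": 7,
    "ease of use": 6,
    "support and documentation": 5,
}

total = sum(weight * team_scores[criterion]
            for criterion, weight in criteria_weights.items())
print(f"Weighted score: {total:.1f}/10")  # Weighted score: 7.0/10
```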

Try not to fall into the trap of getting a tool just because you see people on social media raving about it. Make sure the tool will add value to what you are doing. Just because a tool is right for them doesn’t necessarily mean it’s right for you!

Cost

No matter whether you are agency-side, in-house or a freelance consultant, cost can be a sticking point and needs to be a consideration throughout the process. Tools range from free to credit-based pricing, with variable monthly or annual costs.

So there you have it, three areas that we analyse before determining whether we need to add any extra software to our marketing mix.

How do you determine whether you need a tool? Do you have a process of evaluation or is it more of a purchase first, cancel later? I’d love to hear your thoughts in the comments below, or on Twitter @danielbianchini.