Blog SEO

#SEONOW2015: What does SEO reporting look like in 2015?

This post was originally published on


In 2015, brands will move further towards shared online marketing targets, with each channel playing its part in the overall goal.

With SEO changing dramatically over the last few years, our reports have to reflect that. These changes have resulted in a move away from ranking positions and the number of links that we have built, to a more content-focussed approach.

This has led to agency and in-house teams providing training to key stakeholders within businesses, many of whom still prefer to see the clear deliverables and results that links and rankings appear to provide.

Although there will be a move to a more integrated marketing report, there has always been a constant in each report: how your activity compares to the KPIs that you have been set.


Which on-site metrics should really matter to brands? Why?

The on-site metrics that matter most to your brand are generally specific to your business, and should be reverse-engineered from the targets you have been set.

The metrics that you need to use will differ depending on the campaign that you are running. A campaign that is aimed at generating brand awareness (content marketing) is going to have different metrics to a campaign aimed at the conversion end of the funnel.

With that said, there will always be some metrics that need to be used across any campaign, with the majority leading back to content performance and device.

Looking at it from an SEO point of view, landing page performance is crucial. What content is generating the most visits from search engines? What is the conversion rate of those pages, or how many of those pages play a part in the conversion funnel?

If these pages are not playing a part, why not? Do they have the highest exit percentages? Is time on page too short? Is the bounce rate too high? By reporting on these figures you will be able to determine how effective these pages are, and whether a campaign needs to focus on improving them.
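As a rough illustration, these figures can be pulled together from a landing-page export. The column names and thresholds below are assumptions for the sake of the sketch, not any specific analytics tool's format:

```python
# Sketch: flag underperforming landing pages from a hypothetical
# analytics export. Column names and thresholds are assumptions.
pages = [
    {"page": "/guide", "sessions": 1200, "conversions": 36, "bounces": 480, "exits": 300},
    {"page": "/blog/tips", "sessions": 800, "conversions": 4, "bounces": 640, "exits": 560},
]

def landing_page_report(rows, min_conv_rate=0.01, max_bounce_rate=0.6):
    flagged = []
    for r in rows:
        conv_rate = r["conversions"] / r["sessions"]
        bounce_rate = r["bounces"] / r["sessions"]
        exit_rate = r["exits"] / r["sessions"]
        # flag pages that convert poorly or bounce heavily
        if conv_rate < min_conv_rate or bounce_rate > max_bounce_rate:
            flagged.append((r["page"], round(conv_rate, 3),
                            round(bounce_rate, 3), round(exit_rate, 3)))
    return flagged

print(landing_page_report(pages))
```

In practice you would swap the hard-coded list for a CSV export and tune the thresholds to your own KPIs.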

If there is an off-page element to your campaign, then you need to track brand mentions, whilst cross-referencing any citations/links to your referral traffic. This will provide you with a good understanding of whether the placements that you have generated are working and worth further investment.

As mentioned above, we are moving to a more mobile-focussed environment, which needs to be reflected in your report. Therefore, understanding what each device category is contributing to your campaign can often provide great insight and help determine how future campaigns are crafted.

Which off-site metrics should really matter to brands? Why?

The days of reporting on the sheer number of links that you have generated are gone! You should no longer be building links for quantity but instead generating quality links, and this should be reflected in your report.

If through your campaign, you have generated coverage within high-quality publications, then you should report it. This also needs to be supported by referral figures that the link/coverage has generated to support the cost of activity.

Alongside referral traffic from your campaign, you should monitor brand mentions. This is important as it could lead to an increase in direct or social traffic.


How to tell the right story?

Each report that you create needs to be focussed on the KPIs that you have been given. Whether it is an individual campaign that will contribute or on-going work, how does it affect your target?

The story that you tell will be determined by the campaign you are running. Choosing the right metrics will allow you to bring the campaign alive, by showing the successes and failures of the campaign.

An important factor when telling your story is being honest. If your campaign has not worked, then do not cover it up with meaningless metrics. Explain what happened, the learnings that you have taken away from it and how they will be implemented into the next campaign.

Linkdex SEONOW 2015
Daniel Bianchini contributes to Linkdex’s SEONOW 2015

When creating your report, make it visual, easy to understand and straight to the point. Creating complicated reports to tell the story will not only confuse those that are reading it, but will likely bring further questions.

Finally, more than anything else, make sure that it always relates back to the KPIs.

If you are interested in reading the full eBook, you can download it from the Linkdex website. I’d be interested in your thoughts on how reporting has changed, and what represents a good SEO report, in the comments below or over on Twitter @danielbianchini.

Blog Events

SMX London – Chris Sherman Interview

This post was originally published on

This week I had the opportunity to interview Chris Sherman, Founding Editor, about the upcoming SMX London event. During the interview, Chris gives his insight on the event, the agenda and his thoughts on the search industry in 2015.

If you are interested in going to SMX London and you want a discount, then you are in luck. The lovely people at SMX have provided us with a 15% discount code: simply add WHITESMX at the registration page when signing up.

The below is a transcript of the interview recording.

Daniel Bianchini (DB): Hello Chris, thanks for taking the time to speak to me.

Chris Sherman: Sure.

DB: Really looking forward to SMX coming up in May. So I just really wanted to get some of your thoughts about a couple of key points that I sent across, if that’s okay?

CS: Sure. It sounds great.

DB: So the first one is, obviously, SMX is back in London. What can we expect from this year’s conference? Are there any surprises you can let the audience in on?

CS: I don’t think we’re gonna have any surprises. The real advantage that I think we have in running SMX in London in May is that it comes right between our SMX West and our SMX Advanced show.

During the process of planning the show, we get to not only see what’s new, what’s happening and so on in San Jose at SMX West, but we’re also thinking forward to what we’re gonna run at SMX Advanced in Seattle.

So in my mind, SMX London gets the best of both worlds. We kind of distill the content that we think is really, really useful from both of those shows, and of course it’s not entirely U.S. based approach. We try to look at what’s happening in the U.K., what’s happening in Europe and so on. But we really do get a lot of advantages from having the timing right when it is.

DB: I was at SMX Advanced in Seattle last year. I was lucky enough to manage to come across and I really enjoyed it.

So how do you think the content there differs with the content that you have over in the U.K.? Like you said, you try and get the best of both worlds. But is there a difference?

CS: Yeah. There is a difference. I mean, with SMX London, as with most of the rest of our SMX shows, we try to have a balance between advanced and intermediate content. We really don’t do a whole lot of basic content anymore because we feel that most people who are coming to the shows do have the basics already down.

So it’s gonna be a nice broad range of topics that we’re covering. We do have some of the same sessions that we are running at SMX Advanced running in London. So people who are experienced will have the advantage of getting that content.

There will be different speakers. That’s the nice thing about London is we tend to have more U.K. and European based speakers, so we get a different perspective, but we still get that great content that we really get at SMX events.

DB: That’s interesting. Obviously you have been coming to the U.K. for quite a while now. How do you think the content has changed over time?

CS: Oh, it has been fascinating. I’ve actually been involved in programming search conferences for over 15 years now. When we first started doing them it was just so, so basic. We were only interested in what’s your search engine ranking and how many hits does a page get, and so on.

It was simple stuff. Over time it has evolved into this incredibly complex, very rich sort of process. I mean, when we first started doing it there were no ads, for example. I mean, it was all just organic search. Now we have ads. We’ve got mobile. We have so many things. Social media.

It’s just become this incredibly rich and fascinating sort of process that we have to get involved with and it changes constantly. So to keep up with it, it’s really a full time job.

DB: Do you think there have been any trends that have stayed consistent, or has it completely flipped?

CS: Well, it’s interesting, because when people ask this question I always have consistently, for the past 15 years had pretty much the same response. The content and the way that you actually get stuff out to people is very consistent. You still have to have great content. You have compelling ads. All that kind of stuff.

Yet many people say, “Well, wait a minute. Isn’t there some secret sauce? Isn’t there some brand new formula that we have to be paying attention to and so on?”

Yes, there is, there are. Things change constantly. But in that constant change if you don’t have the good content, if you don’t have the compelling advertising and so on, you don’t have any chance. That’s something that we continue to focus on at the conferences.

DB: We’ve all seen the change in the content over the years. Agencies generally tend to come to these types of events a lot more than in-house teams, at least the way I see things. Why should those in-house people come along to SMX London, rather than just follow it on blogs and Twitter?

CS: Well, I think you’ve seen live blogging. You have seen tweets and so on. That kind of stuff is great. It’s really good to kind of dip your foot, so to speak, into the flow of what’s going on. But you really don’t get the context that you get when you’re at the event itself. You don’t get the nuances of actually watching speakers, seeing their body language, in many cases. Getting the subtle nuances that come through when you hear an actual full conversation or presentation.

There are little things like that. But then also at the event itself, a huge part of it is the networking with the other attendees so that you can see a presentation, you have a break, blog and actually discuss what you’ve just heard with other people who are there.

The opportunity to exchange tips, exchange ideas, and that whole learning process, you can’t get that if you just do it by reading live blogs or on Twitter. So I think that’s a very, very compelling reason for people to actually attend the event itself.

DB: I’m very much an advocate of that, just to network and make sure they speak to people after the show, or straight after a presentation, to clarify any points they were unsure about.

So the next section I’m taking a look at the agenda and I wanted to just talk about that, if that’s okay.

CS: Sure.

DB: Keyword research is a huge part of search marketing, both organic and paid. With the removal of keyword data, obviously, and the focus starting to move towards the user, how do you think this changes the way we look at keyword research?

CS: Well, I think keyword research is and will always be very important. I mean, that’s fundamentally the way search engines work regardless of the improvements and the algorithms and so on. You’re basically taking a string of words and trying to match that to content on the web.

Until we really get full AI based voice search, that’s not gonna change very much. What’s changed is that the search engines are going beyond just the simple strings of keywords and they’re actually looking at content. They are looking at semantics. Also intent. That’s the key word. What is the person really trying to accomplish when they’re using a search engine?

So I think the search engines are looking at much more of a behavior that’s going on and trying to help people understand beyond what the simple words that are being typed in. What is it that we can do to really solve this person’s problem or satisfy their information need?

So I think from the standpoint of somebody doing SEO, keyword research has to become much more about what are the different personas? What are the needs, and how can we create content that’s gonna be really rich that will help satisfy these needs?

So the keywords, again, I think they’re fundamentally important, but it’s much more than that, the way that search engines work these days.

DB: With keyword research comes content, and that’s become a huge topic since the changes Google made over the last few years. With content moving on to be a bit more creative and a bit more visual, what’s the best piece of content you have come across, and why? Or several?

CS: It will probably sound trite, but I’ll say Shakespeare. It goes back hundreds of years, but there’s a reason that Shakespeare was relevant in his time and still is now. I mean, it’s just absolutely amazing stuff that this guy wrote.

I think to be more contemporary, there’s stuff that I see being created every day. To your point, it really is all about creativity. But again, going back to what we were talking about earlier, I think again, compelling content is something that satisfies a need. I think if people really wrap their heads around that concept, creating that kind of compelling content becomes much easier.

It doesn’t have to be Shakespeare. It could be something as simple as, where can I find the best tool for this task that I’m trying to complete? Even with mundane things like that, you can create very, very compelling stuff if you get creative and really try to help the person that has that need.

DB: Grand. So, just taking another step in a different direction, there’s also a panel this year, which is great. Past panels involving employees of the search engines have generally become quite heated, which is obviously great. I really enjoyed the SMX Advanced one last year with Matt Cutts.

What are you hoping to hear from the search engine representatives this year during the Meet a Search Engine session?

CS: Well it’s gonna be very interesting. The search engines, we’re really lucky to have them at the event. It’s great to have representation from Google, from Bing. In past years we’ve actually had Baidu, we’ve had Yandex, we’ve had other search engines.

It’s always difficult to know because they have an interest in, obviously, engaging with their customers, which is really what the SEO community is in a large degree. But they also have competitive issues. So it’s almost like a dance trying to get them to reveal interesting things and sometimes it does get heated, as you said. Other times it’s more, “Well, come on. We’re really trying to draw you out.”

I don’t honestly know what we’re gonna hear this year. Sometimes it’s very, very interesting. Very deep and, “Wow, this is really gonna change my life as a search marketer.” Other times it’s, “Well, things are just sort of business as usual.”

But it’s my job as moderator of that session to really do the probing and really trying to get at them. Quite honestly, I don’t know yet how I’m going to approach that. But it’s definitely something that I’m gonna have fun with and really try to see what we can come up with in that session.

DB: I’m really looking forward to that. I’m quite interested to see how you’re gonna get some extra information out of them, shall we say. When you talk about search, social media is never far from the discussion.

CS: Right.

DB: How do you feel that search and social is best used together, and whether it is essential to every business?

CS: Well, it’s a good question. We’re actually running an entire track on search and social. One of the things that’s actually emerging as a theme is that search and social, I think until quite recently, have been thought of as very, very different species, almost, in marketing. But, in fact, they’re [inaudible 00:11:36] and they can amplify each other’s effects.

In the past we’ve kind of focused on that as well. Do this on Facebook or Twitter. Then you might have this kind of impact on your search results and drive traffic and so on. But that was always kind of done in an almost ad hoc way. I think what we’re seeing emerging now is people really analytically looking at how do we make these two things work together?

What are the tools that we have? What are the sort of reports, the metrics, the things that we can measure and conclusively prove that, hey, this is the way you should really be approaching this. Then how can we take that amplification effect and really magnify it so that we’re not just having a really effective search campaign, we’re not just having a really effective social campaign, but we’re seeing ripple effects based on what we’re doing with both that really, really compound the overall effect of both sides of the equation, if you will.

So I think we’re gonna see a lot of people talking about, here are the real techniques. Here are the real analytics. It’s gonna be very tangible and real information rather than just a gut feeling that this is gonna work.

DB: Okay. Thank you very much for that. Just take a little step away from SMX and the agenda. I wanted to just talk about the industry a little bit. So in 2014 it continued to be a year of change in the search industry. Any reports on what Google specifically has up their sleeve this year, and whether we’ll see any other animals begin with the name P?

CS: It’s a really good question because again, as I said I’ve been watching the search space for well over 15 years. Every year, in my mind, has been a year of significant change. It continues to just astonish me how things keep evolving and keep progressing.

I think we’re not gonna see the kind of dramatic changes that we have in the past few years with the various algorithms, with the knowledge graph and so on. But what we’re starting to see, and I notice this literally almost every day when I’m using Google now, is I think that AI within Google has become so good that the AI itself is actually starting to make changes to how Google works.

Google will not talk about this. They will not publicly admit it. They acknowledge that they’re using AI and so on, but I think, as much as what Google is planning in terms of the change to their algorithms, the improvements that they’re making and so on, I think there’s a subtler change going on. The software itself has actually reached a stage where it’s improving on its own.

I have no proof of this. I have no way to substantiate it. It’s just more that I have been using Google since it started in 1998. My sense is that I’m seeing changes that I haven’t seen before, and I don’t really know what to make of them other than, well, maybe the AI really is here. So I think that’s gonna be an interesting thing to watch going forward. Maybe I’m crazy. Maybe I’m wrong. But that’s my sense.

DB: It’d be very interesting to keep an eye on that, like you said, and see how that goes. So we’re just going to talk about this year specifically. What do you think will be the biggest trend in 2015, and what do you think practitioners such as myself will be doing a lot more of in their search marketing campaigns during 2015?

CS: Well, I hate to say it because it’s kind of a cliche but I think mobile. I mean, this is the year, if not already, where people are active more on mobile devices than they are on desktop, and not just mobile devices itself, but voice. I think voice is becoming a huge thing. It’s gonna be very interesting to see. I’m not entirely clear yet what the implications are in terms of how you have to alter your SEO efforts. Obviously, the ads are different. There’s in app stuff that is starting to emerge that is very interesting, that’s very different. Both Google and Bing are surfacing things from within apps. So I think mobile is gonna be probably the thing that most people are gonna be focusing a lot more on this year than they have in the past.

DB: That’s very interesting. I think, obviously a lot has come out over the recent weeks. Google’s recent announcement of mobile and the ranking factors. So, mobile is definitely something I would agree is gonna be huge this year, if not last year as well.

If you had to choose one, and I know this is putting you on the spot a little bit: what do you think is currently the most influential ranking signal?

CS: That’s a really good question, too. I think we’ve actually moved beyond what’s the most important one? Obviously in the past it was links. I mean, you could argue a good case right now that it’s content, but there’s no good definition of what is quality content, and so on.

I don’t know. Again, depending on whose studies you look at, there are anywhere from several hundred to several thousand ranking signals that come into play. My sense with that is that depends entirely on the type of site that you have, the content that you’re providing, or what you’re trying to accomplish with your marketing.

So the ranking factors are gonna vary. If you’re local, obviously it’s gonna be some local signals that are gonna come into play. If you’re a retailer it’s gonna be process and all that kind of stuff. So I don’t think there is any one overall signal anymore, the way that links were in the past. I think it’s probably also gonna become much more diluted, if you will. We’re gonna have many, many more signals, and they are gonna start to be used in ever more complex and nuanced ways.

So I think that’s good for searchers. It’s gonna make our jobs as search marketers quite a bit more challenging to try and understand. Okay again, it’s getting back into the head of the searcher, because I think that’s what the search engines are trying to do themselves in terms of understanding, what can we do to make the best possible experience for the people that are using this?

DB: And finally. How would you compare the strength of search marketers in the U.K. to those offered out of the U.S.A.?

CS: I would put them on a par. This is interesting because in the past, quite honestly, when we would program the conferences I would kind of gauge it: features that we rolled out in the U.S. wouldn’t come to the U.K. for a year, or two, or even three years later.

But I think that’s changed. Everything is pretty much equal now around the world. Maybe a little less so in some other parts of the world. But definitely in the U.K. it’s absolute parity with the U.S. We program our shows with that in mind. We don’t think there’s any difference at all. So we’re trying to bring absolutely the best quality programming, the best speakers that we possibly can to the U.K. shows just like we do in the United States.

DB: Great. Well, thank you for taking the time to speak to me. I’m really looking forward to SMX in May.

CS: That’s great.

DB: I look forward to seeing you then.

CS: Sounds good. Thank you, Dan.

DB: Thank you very much, Chris.

Blog Presentations

4 steps to building a data-driven strategy [Presentation]

This post was originally published on

On Monday 24th November, I gave a presentation on building a data-driven strategy at White Exchange.

Blog SEO

Competitor Analysis: Identifying your online competitors

This post was originally published on

Do you conduct any competitor research for the industry that you work in? If the answer is yes, then great. If it is no, then you are not the only one!

In my opinion, competitor research is one of the most underrated pieces of work completed. Often even if it is done, it is not used, and gets left on the desk, put in a drawer or, if sent electronically, not even read.

Why dammit, why?

Why dammit, why - Jackie Chan

This piece of research is essential to your online marketing plan, your strategy, and your business! It is key to understanding what is going on and what is required for your business to succeed or, at the very least, keep it afloat. Yet, so many people just don’t seem to care, or view it as a pointless task.

Well, over the next four posts I am hoping to change your mind. I want to show you what you can uncover with competitor research, and how it can all come together to influence your search marketing plan.

In these posts I am going to be discussing:

  • Identifying competitors based on search terms
  • Finding keyword & content opportunities
  • Understanding what content performs well
  • What coverage your competitors are getting, and why

But first I am going to start with identifying your online competitors.

Online is different to Offline!

If you have got this far, then you either don’t normally conduct competitor analysis or you want to know how and why to do it.

To start with, you need to ask yourself a few questions. Who are your competitors? What search terms are they visible for? Are those search terms of value to you? What are your competitors ranking for, and should you be ranking for it too? How much would it cost to buy that traffic through paid search?

Luckily, there are tools available to help you do this. Some are paid, as you would expect, but they are worth the money if you are going to be constantly monitoring the landscape – which you should be!

So how do you understand who your online competitors are within search and what their visibility is? Well here is what I do in 10 steps…

*To complete these steps you will need paid access to both SEMrush & Linkdex.

  1. Firstly, head over to SEMrush, type in your domain, choose the country that you want to analyse (SEMrush currently has 22 countries), and hit search. This will return a lot of data, but at this point you are purely focusing on the organic keywords and competitors, which you will find if you scroll down the page.
    SEMrush - Competitor Analysis
  2. What you need to do now is to download all the organic keywords from the top 10-20 organic competitors. You can obviously choose more or less depending on the market that you are researching. To do this, simply click on the ‘Organic Competitors’ full report, then click on the competitor of choice. This will provide you with a list of keywords that you can simply download into Excel format. Go ahead and do this for your chosen number of competitors.
    SEMrush - Competitor Keywords
  3. Now you have all the keywords, you need to merge them into a single spreadsheet, keeping all the data, and de-dupe them.
  4. Now that you have a single list, you will need to spend some time going through the keywords and removing any that are unnecessary. Terms that include brand, jobs, recruitment, sales and anything else that isn’t relevant to your business and market, need to be removed. This will give you a much more accurate list of terms.
  5. Once you have completed your list in Excel, you will need to import this data into Linkdex, keeping the Term, Search Volume and CPC data found in SEMrush. To do this, simply go to the keyword rankings function within Linkdex and bulk upload using their import tool. Choose the correct headings and let it gather ranking data for those terms.
  6. Whilst that is happening, head over to the new ‘visibility’ feature that has recently been released by Linkdex. This feature is similar to that of SEMrush in that it tracks millions of keywords, but it also allows you to do some of your analysis side-by-side.
  7. Once you are in the new feature, you need to start entering the competitors that you identified in SEMrush. Once complete you will start to see the table populate with terms that each domain is visible for.
    Linkdex - Competitor Visibility
  8. The next step is pretty time-consuming, but is required. You will need to go through each competitor and add any keywords that are not currently in your list, but that are relevant to you. You may have to go and get the search volume and CPC data for these extra terms. This can be done by heading over to the keyword planner and adding in the terms as exact match and returning the data.
  9. By the time you have done this, you should have a very comprehensive list of search terms that you and your competitors are competing for.
  10. Still in Linkdex, head over to the dashboards and create a ‘Competitor Detective Pro’ widget that looks at all of the keywords that you have added into Linkdex for checking. Once you have set this up and clicked OK, wait for the data to load and voila! Here are your competitors based on all the terms within the market, along with rankings by position, estimated traffic volume and how much that traffic is worth if you paid for it through PPC.
    Linkdex - Competitor Detective
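Steps 3 and 4 above (merging the exported keyword lists, de-duping them and stripping irrelevant terms) can be sketched in Python. The file names, the `Keyword` column and the exclusion terms here are assumptions based on a typical export, not SEMrush's actual format:

```python
import csv
import glob

# Sketch of steps 3-4: merge per-competitor keyword exports into one
# de-duplicated list and drop irrelevant terms. Add your own brand
# terms to the exclusion list.
EXCLUDE = ("jobs", "recruitment", "careers")

def merge_keywords(paths, exclude=EXCLUDE):
    seen = {}
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                kw = row["Keyword"].strip().lower()
                if any(term in kw for term in exclude):
                    continue  # drop jobs/recruitment/brand noise
                seen.setdefault(kw, row)  # de-dupe, keeping first occurrence's data
    return list(seen.values())

merged = merge_keywords(glob.glob("competitor_*.csv"))
```

The merged rows can then be written back out to a single spreadsheet, keeping the Search Volume and CPC columns, ready for the Linkdex bulk upload in step 5.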

So there you have it, a list of your online competitors who are targeting the key phrases within your industry, along with ranking data, estimated volumes and how much it would cost. This data can be useful to understand where you currently sit in the search landscape vs your newfound competitors. It will also likely throw up some competitors that you may not have thought were competing on similar terms. All this data can form part of your strategy going forward and inform the next steps.

In my next post I will talk about how you take this data and find new opportunities that your competitors are already taking advantage of.

Are you conducting any competitor analysis for your clients? Do you follow a similar process, or are you doing something completely different? I’d really like to hear your comments on my thought process and what you would do differently in the comments below or over on Twitter @danielbianchini.

Flickr Image Credit.

Blog Events

SMX Advanced

This post was originally published on

What a great week this has been! Why you may ask? Well this week I attended SMX Advanced over in Seattle, one of the most talked about and recommended search marketing conferences in the world.

The week started off with a trip to the Moz office for a look around, followed by the SMX Advanced conference and networking sessions, ending on Thursday with me and Stuart beating Matt Cutts and his team mate in a game of pool.

Beating Matt Cutts from Google in Pool at SMX Advanced 2014

But it was the conference that I was here for, and I wasn’t disappointed. The content that was available was great and came from some very experienced marketers. The calibre of the attendees was also extremely high, with some interesting questions being asked and some great conversations had during the networking sessions.

In this post I have provided my thoughts and some key points from a selected number of sessions that I thought you would be most interested in, and that would provide the most value.

The Periodic Table of SEO Ranking Factors in 2014.

The very first session of the conference was extremely interesting with some great points coming out of it. In it, Marcus Tober, Marianne Sweeny and Matthew Brown provided some very valuable insights, both from their own experiences and survey data.

One of the biggest take-aways from the session was to make sure that whatever you do, whether it is from a marketing or UX point of view, make sure it is done for the user. All three of the speakers spoke about the importance of site speed, and ensuring that both the user and the search engine were happy with load times.

Here are some of the key points from this session:

  • SEO has to be concerned with UX and UX has to be concerned about SEO.
  • Understand what Google are going to use when there are not enough links to rank a page but the page is of value.
  • Mobile usage is continuing to increase at a rapid rate, but how do you link to a page on a mobile?
  • 77% of URLs are different in search when comparing Mobile vs Desktop usage.
  • Don’t make the image too overpowering, as users prefer the content.
  • Content is important, but it is more important to have quality content. To understand whether you have quality content you need to conduct a content audit/inventory.
  • Update your content on a regular basis where possible, without changing the URL. This is especially true for those evergreen pieces that can be updated and shared regularly.
  • Create content for where the user is, not for where you think the user is. You can understand this by looking at pageviews and page/folder level within your analytics.
  • You really need to be utilising structured data, but you need to have the right markup implemented, as the wrong markup could lead to you losing clicks.
  • Focus on entity optimisation! This is a big focus for Google, so you need to get up to scratch, and quickly.

Keyword research on Roids!

For me, this session was mainly about the second talk with Rae Hoffman (@sugarrae) talking about how she uses SEMrush alongside other tools to find opportunities for optimisation and content. Now I am not saying the other talks weren’t great, I just felt that they were more talks about what you could do rather than providing actionable data.

Here are some of the key points from this session:

  • Conduct keyword research to identify topics that relate to content opportunities.
  • Utilise Analytics to identify the top 50-100 landing pages. Once you have these, put them into SEMrush to identify the keywords that they are ranking for.
  • Segment your keyword research into categories. These categories could include topics such as content marketing, category pages, related terms, and opportunities.
  • Do you know your position value? Use SEMrush’s total traffic, and remove the top 10 traffic-driving terms to understand what portion of the traffic they are driving.
  • Utilise the domain analysis tool within SEMrush to identify any keywords that you may have missed but your competitors are targeting.
  • You need to have a realistic understanding of whether you are really going to rank for a target term or phrase. Within SEMrush, they provide a keyword relevance score that will help you make the right choices.

During the Q&A of the session, there were two questions that drew quite a bit of discussion. The first was “what is more important in keyword research, search volume or conversions?” In my opinion this was relatively straightforward: conversions come first, using conversion rates to understand the potential revenue, and only if that isn’t possible should you fall back on potential traffic.

The second question was how to utilise keyword data within GWT. The panel agreed that utilising the data was important, but that it is regularly lost because GWT only shows 90 days’ worth.

Set an alert for every 90 days to download ALL of your GWT data or you will just lose it. – Rae Hoffman
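If you want to automate Rae’s tip once you have the downloads, here is a minimal sketch. It assumes each 90-day export is a CSV with `date`, `query` and `clicks` columns (hypothetical names — adjust to whatever your export actually contains) and folds each fresh download into a cumulative archive without duplicating rows:

```python
import csv
from pathlib import Path

def merge_export(export_path, archive_path):
    """Append rows from a fresh 90-day export to a cumulative archive,
    skipping (date, query) pairs that are already archived."""
    archive = Path(archive_path)
    seen = set()
    if archive.exists():
        with archive.open(newline="") as f:
            seen = {(r["date"], r["query"]) for r in csv.DictReader(f)}
    with open(export_path, newline="") as f:
        new_rows = [r for r in csv.DictReader(f)
                    if (r["date"], r["query"]) not in seen]
    write_header = not archive.exists()
    with archive.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["date", "query", "clicks"])
        if write_header:
            writer.writeheader()
        writer.writerows(new_rows)
    return len(new_rows)  # how many rows were actually added
```

Run it against each new export and the archive keeps growing well beyond the 90-day window, even when exports overlap.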

and here are the slides…

What SEMs should do for Mobile.

A theme in a lot of sessions at SMX Advanced was that mobile is coming quickly, and the majority are not ready. Go and have a look at the Apple website on a mobile. Did you know that they don’t have a mobile website? Do they actually need one? Well, that is another question, but it goes to show that if one of the biggest companies in the world doesn’t have it nailed down, then what hope do others have?

For this session, the main focus was on strategy and advertising, but one point that stuck with me is that people are having issues with tracking. We use many different mobile devices before we make a purchase, which makes attributing a purchase extremely difficult. This problem will continue for now, but we hope it will be solved in the very near future.

Here are some of the key points from this session:

  • The question is not whether mobile is an important part of the process. It’s about understanding how each device influences the consumer differently.
  • A critical step for mobile planning is to consider your entire strategy, not just the mobile part of it. It needs to be driven by business goals.
  • Mobile search drives multiple types of conversions – set KPIs that are realistic for your campaign objectives:
    • Enable click-to-call / call tracking.
    • Mobile coupon codes.
    • Enable location extensions.
    • In-store mobile offers.
  • Mobile search drives multiple conversions and can’t be measured in isolation.
  • Utilise mobile ad extensions within advertising to take up more real-estate.
  • Track calls using different numbers across devices & desktop.
  • There are still people who have flip phones and use WAP. This means that you need to advertise to them using WAP text ads.
  • Use localised numbers rather than 0800 numbers. People like to see real phone numbers.
  • There are 3 major problems with mobile – low conversions / the SEM auction is broken / there are tracking issues.
  • It is very difficult to understand cross-device attribution as there are so many touch points across them all before a purchase/action is made.

One bit of information that shocked me the most was how sharply the click-through rate drops from position one to position two compared with the current desktop studies.

Click through rate dropped by 45% from position 1 to 2 on mobile devices – Jaclyn Jordan

Creating blockbuster content

As you would expect with the changes over recent times, content has been a big part of all the sessions that I attended. There was a constant message throughout that content that goes viral will not happen each and every time. It takes time to understand what works for each audience, what piece of content is relevant to the consumer at the right time, and what it is that makes it different or better than anything else that is available.

Here are some of the key points from this session:

  • Quality of content is far more important than quantity.
  • Content going forward needs to be mobile-friendly, load quickly, have authorship markup, include social signals, and allow engagement.
  • Long form content tends to get more interaction, better engagement, and is shared and linked to more.
  • Having great social signals gives a much better indication of the quality of content, alongside authorship.
  • Use keyword research to help inform content, but don’t let it be the main focus of your content.
  • Determine what works using social signals, engagement, and links.
  • Ensure that you break up paragraphs for easy reading and use images to summarise concepts.
  • Images are meant to serve a purpose not just fill a space.
  • Use internal search for content ideas.
  • Ask your staff one question to gather as much information as possible: “What is the one question you get asked all the time?”
  • Use research tools such as Google Instant, Uber Suggest and Yahoo Answers to see what people are looking for, and then create it better.

Content takes time. It can’t be done on a trial basis. It needs to be created, tested and updated to make it work.

and here are the slides…

Executing a Flawless Content Marketing Strategy

Here are some of the key points from this session:

  • Mobile apps are the future of content marketing, but you need HTML 5 responsive versions.
  • You need to have a strong call to action, otherwise what is the point?
  • Make your content interactive and engaging and it will help you achieve your goals!
  • Smaller newspapers generally have a more engaged audience, so look to these for coverage of your content.
  • Use Facebook Graph search to find journalists and gather as much information as possible. Use LinkedIn and send InMail for pitching.
  • Pitch as early as possible to have a better chance of getting interest and coverage.
  • Once you have a relationship, nurture it. It’s not a one night stand! Say thank you, use social and email to stay in touch, and be available.

and here are the slides…

You&A with Matt Cutts & Danny Sullivan

As you would expect this was by far the most popular session of the conference with so many attendees wanting to quiz Matt about what is going on at Google, and what is coming next. Once this session got going it was extremely fast-paced, so I am afraid I haven’t got many notes to share, however Rae Hoffman did an amazing job of tweeting and then fleshing out the content, so I would recommend going to have a read here.


Was it worth it? Yes! Most definitely. I have learnt and picked up a lot of useful information whilst also meeting new people within the industry on the other side of the pond. Now that I have experienced it, I am hoping that I can go back in the future, possibly as a speaker…

Blog SEO

Operational SEO: Insights into running an SEO department [Presentation]

This post was originally published on

Operational SEO is a huge topic with so many areas to talk about. When Linkdex asked me to speak at their ThinkTank on the subject, it took me a while to identify the key area that I wanted to discuss, but I finally narrowed it down to the people who deliver the projects: the team! Without the very best people it doesn’t matter what tools you have or what brand you are working for; with the right people, the possibilities are limitless.

In the presentation that I gave on the 8th May, I talked about how to keep on top of the talent that is available, to stay ahead of the digital merry-go-round that we sit within. I also discussed how we changed our structure three times during my time here to ensure that it was the best for us. I went into how we deal with tools, processes & templates internally, and one of the biggest things for me: communication. I finished by talking about the need to constantly push on, ensuring that not only the product evolves, but that the team continues to develop through training, conference attendance and constant up-skilling.

All of the things I talk about, I have strong feelings on. If you attended the session then I hope you enjoyed it. If you didn’t then take a look at the slides below, and let me know your thoughts on what is required to deliver market leading SEO in the comments below or over on twitter @danielbianchini.

To deliver market-leading SEO, you need three key ingredients:

Brand: – You need to be working with or for the right brand. I don’t necessarily mean the biggest brand, but a brand that understands what is required to deliver market-leading SEO, including the ability to react quickly to the ever-changing landscape.

Tools: – Alongside the right brand, the second ingredient is using the right tools. There are a huge number of tools in our industry, with the majority doing very similar things. What is key is that you choose the right tools for what you are trying to achieve.

Team: – Most importantly, you need the right team. This, in my opinion, is the main ingredient to delivering market-leading SEO, and what my presentation at the Operational SEO Think Tank focused on.

I feel that there are five points that are important to any team to ensure they continue to deliver the very best SEO product. They are:

  • Recruitment – Building the right team
  • Structure
  • Tools, Processes & Templates
  • Internal Communication
  • Team Development

Talent spotting

The digital merry go-round is one that I think happens faster than in any other industry. The average stay in a digital position is two years, with people moving on to either progress their career or to enhance their pay packet. This means that you need to be constantly aware of what talent is available in the local area, as well as those a bit further afield.

Utilising tools such as LinkedIn and twitter is a great way of finding those who are of interest to you. They may not be the right people for now, but you can monitor their progress in private twitter lists, and start to make soft contact by retweeting or replying to their posts. Going to meet-ups is another really good way of seeing what talent is available, and putting a face to a name is always the first stage of recruitment.

As well as using online tools, look to the local education establishments. We have two very good universities in Oxford, as well as a highly-rated college that we have established ties with. Building relationships like this allows you to help develop the next generation of digital marketers, whilst ensuring that you can identify the cream of the talent. To help partner with the education authorities, provide your assistance by guest lecturing, sponsoring modules and/or providing internships.


However, having the right team is only one part of the process; how the team is structured is another key element. Since being at White, we have gone through three different structures. The first was more of a flat hierarchy, with each member having their own clients. The issue with this was that everyone was working in a silo, few were brainstorming ideas, and the lack of collaboration meant that projects could go very stale very quickly.

We moved away from a flat structure because we were moving to a more content-led approach, and felt that a more departmentalised approach would be a better way of working. This meant that we had teams of specialists in content, outreach and technical that were used by a number of consultants running the projects. We quickly identified that there was an issue with this model, with the specialists being isolated in terms of feedback on the work, what results were being gained, and a general feeling of not being involved, so we failed it and fast!

From here we moved to a team approach, combining specialists and consultants together in teams that were dedicated to specific clients. This provided the required feedback flow that was missing from the department structure, but also allowed the client to have more than one contact point to discuss the deliverables with.

There is no ‘correct’ structure for an SEO team. Just make sure that you choose the one that works for you!


Process, Templates & Tools, Oh My!

Processes are not a requirement, but they are something that I feel can really help with the delivery of certain projects within an SEO campaign. We use processes internally, but as a guide rather than a must-follow. We also encourage all members of the team to challenge the processes that have been put together, in order to further enhance them and thus make the projects better.

It is a similar thing with our toolset. We want to be using the best tools for the job to make us more efficient, and help us to make intelligent decisions based on data and analysis when required. That is why we use a baseline toolset to maintain quality levels, but also encourage teams to test all tools that are available. If they feel that they are better than a current tool, or that they complement others, then we are happy to add them.

Use tools, process & templates as guides, but allow your staff to challenge & develop them to enhance projects.

Internal Communication



Communication, regardless of whether it is with the client or purely internal, is a huge factor in delivering market-leading SEO. The biggest issue with internal communication is that the majority of it happens over email or IM, both of which are personal to the individual. Communicating this way can mean that conversations get lost, that other members of the team aren’t privy to certain information, and that conversation threads are difficult to retrieve when the individual is on holiday or has left the business. For this reason, and to assist with planning work, we use project management tools that allow communication both internally and externally.

As well as project management tools, regular team meetings are encouraged to discuss and debate a range of topics including internal work, processes, and industry news. This type of open forum provides a great arena for teams to share knowledge across the agency. These can be as regular as you like, but they need to be a minimum of once a month to make them work effectively.

Team Development

Ensuring that the team is constantly developing and evolving to be ahead of the curve requires an investment from the business. If you want the best, then you need to allow them to become the best by providing them with the opportunity to learn. This can be done in a number of ways, and some of the ones that we have integrated include:

  • 80:20 week – Each week the team have a day’s worth of time for them to up-skill themselves through internal and/or external training.
  • Conference / Meet-up Attendance – Allow your teams to attend as many UK conferences as possible. This can be in the form of one team member who then passes their learnings on to the rest of the team, or full office attendance.
  • Allow side projects – Testing is key to understanding what works and what doesn’t, where the boundaries can be pushed and where there are clear lines in the sand. Allowing teams to work on side projects within their time allocation means that any testing and learning is done on churn-and-burn sites rather than on client sites.

“Train people well enough so they can leave, treat them well enough so they don’t want to.” – cc @richardbranson

For me these are five key areas that are essential to delivering market-leading SEO, but by no means are they the only areas. I would be very interested in hearing your thoughts on what you think is required to deliver market-leading SEO, so get in touch through twitter @danielbianchini or leave a comment below!

Blog SEO

The Great Content Cull: The What, Why and How?

This post was originally published on

Over the past 15 years, content published on the web has drastically changed. From content creation, right through to the way that we consume pieces of content, and the devices that we consume them on. Content is changing, and quickly!

This drastic change means that we constantly need to be reviewing, editing, and deleting the content that we once thought was up to scratch. Over the past few months, I have been conducting a number of content reviews and wanted to let you in on what I have been doing.

Before we get started on that, I guess the question you are already asking is why? Why do I need to review my content? There are many ways to answer this question, and from many different angles. One reason is to ensure that you provide the very best information to your users and to those who come across the website. If you provide out-of-date information, or a post whose statistics are incorrect, then you are going to lose the trust of the user, and they may not return in the future.

From a search perspective, you want to ensure that your content targets keyword topics based on user intent. You also want to ensure that any content you have produced in the past will not be deemed useless, or more importantly spammy, by the search engines. As I am sure you are aware from the well-publicised algorithm updates by Google, they are now looking even more at the content that is provided as a ranking factor, and if you don’t comply then you are likely to either be given a penalty or fall considerably behind your competition.

Now that you know why, the next step is to understand what you have.

What does your content landscape look like?

Many of us don’t know what content we have on our websites, let alone what different types, and that is where the trouble begins.

If you are not aware what content you have on your website, then how do you know what is working, what needs to be improved, and what is no longer required?

If you are aware of what content you have then you are in better shape than most. If you are not, then this is the first place to start.

Understanding the content you have

The first step to understanding what content has been published on your website is to create a content inventory. This can be as simple or as detailed as you like, but there are certain elements that it must include:

  • URL
  • Page Title
  • Media Type (list multiple if required – PDF, Video, Slideshow, etc)
  • Content category (product, blog post, category page, etc)

These basic elements provide you with the initial structure of what content is currently available on your site. A large part of gathering this information can be automated using crawlers. In a previous post I talked about finding all the URLs on your website, so this may help.

Before you get started, if you are not sure what a content inventory should look like or include, go and have a read of Andrew Kaufman’s post Discovery: Content Audits, Inventories and Interviews Oh My! This will provide you with a good background to content inventories.
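Once you are crawling, assembling each inventory row is mechanical. Here is a rough sketch in Python that turns a URL and its fetched HTML into one row with the elements above; the media-type mapping is illustrative and far from complete, so extend it for your own site:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Grab the contents of the first <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Illustrative mapping only — add whatever media types your site uses.
MEDIA_TYPES = {".pdf": "PDF", ".mp4": "Video", ".jpg": "Image"}

def inventory_row(url, html=""):
    """Build one content-inventory row: URL, page title, media type."""
    last_segment = url.rsplit("/", 1)[-1]
    ext = "." + url.rsplit(".", 1)[-1].lower() if "." in last_segment else ""
    media = MEDIA_TYPES.get(ext, "HTML")
    title = ""
    if media == "HTML" and html:
        parser = TitleParser()
        parser.feed(html)
        title = parser.title.strip()
    return {"url": url, "page_title": title, "media_type": media}
```

Feed each crawled page through `inventory_row` and write the dicts straight into your spreadsheet or CSV, then add the content category by hand or from your CMS.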

Gathering the data

Now that you have the basics of the content published on your website, you need to start gathering the data that will help you make decisions.

The metrics that you need will be determined by the type of website that you are reviewing, but they are likely to include the following:

  • Visits (including Uniques) – minimum of 12 months.
  • Conversion Data – minimum of 12 months.
  • Bounce Rate – minimum of 12 months.
  • Social Interactions (Use SocialCrawlytics)
  • Reviews
  • Comments
  • Content Length
  • Linking Root Domains

As I said, there are many more that you are likely to need, but these are some of the main ones.
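However you gather these metrics, each one ends up keyed by URL, so joining them onto the inventory is straightforward. A small illustrative sketch (the metric names are simply whatever you pass in):

```python
def attach_metrics(inventory, **metric_sources):
    """Attach named per-URL metric lookups (e.g. visits={url: n}) to each
    inventory row, defaulting to None where a URL has no data."""
    enriched = []
    for row in inventory:
        merged = dict(row)  # don't mutate the original inventory
        for name, lookup in metric_sources.items():
            merged[name] = lookup.get(row["url"])
        enriched.append(merged)
    return enriched
```

A `None` value is itself useful information: a page with no visits data at all is often the first candidate for review.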

If you think there are others that should be included in this list then let me know in the comments below.

Now, your spreadsheet should be starting to fill out nicely with data, but there is still a very important step missing!

Brains over algorithms every day!

So, you now have all the data required to make a decision on what content looks like it is working, and what isn’t, but we still need to conduct a manual check. Data gives us some great insight, but it is no substitute for a human looking at each page.

This task is likely to take a considerable amount of time, so you may want to break it down into smaller, more manageable chunks.

For the manual check, you will need to go through each and every URL individually, asking yourself a number of questions that could include:

  • What is the objective of the page?
  • Does the page provide me with the information that I require?
  • Has this page been created for a user or for a search engine?
  • Is the information supplied still valid or does it require updating?
  • Are the comments/reviews useful?
  • Would this page make me buy the product/service?
  • Would I share this page to my social community?

Whilst asking yourself these questions, you need to be taking notes and entering them into the spreadsheet. I see this as one of the most important stages of the content review. Data gives you a lot of information, but it won’t answer all the questions that a user is looking for.

Take your time! Getting this right is essential, as you will be making some big decisions based on these comments.

The decision!

Whilst you are making your way through the content inventory, you will also need to be thinking about what you feel is the best possible outcome for the page.

If it was your content, would you keep it, edit it, or completely remove it? This is an important step, and the data you’ve gathered should help determine your decision.

For me, I look at a range of metrics depending on the page type, but the most important aspect is the content itself.

If the content doesn’t answer the questions that I have mentioned above, or others that come out of reviewing the content, then it is going to require some work no matter what the data tells me. First and foremost, the content should always be about providing the best information to the user.
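To keep the keep/edit/remove call consistent across hundreds of pages, it can help to encode your rules of thumb. This is a deliberately simplistic sketch: the thresholds are arbitrary assumptions, the field names are the ones from the inventory above, and the flag set during your manual review always wins:

```python
def triage(row, min_visits=100, max_bounce=0.85):
    """Suggest keep / update / remove for one inventory row.
    Thresholds are illustrative — tune them to your own site."""
    if row.get("needs_rewrite"):  # flag set during the manual check
        return "update"
    visits = row.get("visits") or 0
    bounce = row.get("bounce_rate") or 0.0
    if visits >= min_visits and bounce <= max_bounce:
        return "keep"
    if visits == 0 and not row.get("linking_root_domains"):
        return "remove"  # no traffic, no links: little to lose
    return "update"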

Below are a couple of examples that require a different analysis.

Product Pages

What I am looking for from a product page is persuasion. How will the product change/improve my life? This may sound dramatic, but if the copy is written in a way that I can see it improving my life then I am more likely to purchase it. See the example below.

“The oven’s innovative MoisturePlus feature will inject a fine burst of steam into the oven cavity during cooking to prevent food from drying out, with delicious results. Rapid heat-up times mean there’s no hanging around waiting for it to heat up, while even temperature distribution throughout the interior helps you achieve professional-level cooking in your own home.”

I am much more likely to purchase the above compared to:

“The freestanding Zanussi ZCV661MXC Electric Cooker has been equipped with a double oven, four highly responsive ceramic hobs and an easy-clean interior.”

Combining that insight with how many people converted on the page, reviewed the product, and shared it with their social community will allow you to determine the required action.

Blog Post

If it’s a blog post, you will be looking at different types of analysis compared to the product pages above.

You very quickly need to understand whether the post is giving the user value, and whether it is up to date and relevant. Although blogging has changed over the years, there are still many pages that were used for micro-blogging – pages that provide a paragraph of information, but nothing in-depth. These pages may have been relevant in the past, but in the new world of content creation, and with those pesky Pandas, this is the type of content that is likely to require an update.

Other content that is covered on a regular basis, especially when products or tools are updated, may require redirecting to the latest versions. A perfect example for our industry would be blog posts on “Tips to pass the GA Exam”. As the exam gets updated on a regular basis, so must these posts. They will either require rewriting or redirecting; the choice will be yours.

I could go on and on here, but I hope that you will now be able to go and make a decision based on the data that you have gathered and the manual analysis that you have conducted. It will always be a scary decision to make, but you need to trust your instincts and make an educated decision.

So there you have it, some of the steps I take when reviewing content. How do you go about reviewing your content? How often do you conduct an inventory of your website? I would love to hear your comments below or on twitter @danielbianchini.

Blog SEO

Link Audits: What should you be looking for?

This post was originally published on

Link auditing has been part of SEO for a considerable time, but it has never been as important as it is in today’s changing landscape. Since the Penguin algorithm, there have been SEOs up and down the country conducting link audits, not just to identify penalties but to ensure there isn’t a possibility of getting into trouble.

Due to the algorithm updates, the way people conduct audits has changed, as have the metrics and the types of information and/or websites they look at. We are constantly looking to improve what we do internally, and I wanted to share with you some of the metrics that we look at and the reasons behind them.

I have even managed to rope in a few others to give their thoughts on what they look at when conducting a link audit.

What’s Changed?

It’s long been known that the quantity of links is no longer a factor in the algorithm. Rather, it is the quality of the linking domain that provides the value. Therefore, there is little point, except for benchmarking, in looking at the total number of links pointing to your domain, especially if you have identified that the majority of those are low-quality links such as directories, website comments and articles.

What to look for?

Although most of the industry has changed the way that they are acquiring links, there are still some who continue to acquire links unnaturally, or who have used this technique in the past. This all means that you need to be extra vigilant when looking through your link profile. It’s important that you can spot any trends and issues to ensure that you action something instantly.

Below are a number of factors (not all) that we look at when conducting a link audit:

Authority/Influence Score
The majority of us use some kind of score when looking at links, whether that be Domain Authority from Moz or Influence Score from MajesticSEO/Linkdex, as it gives you a very quick and visual look at the state of your link profile. As with anything, this is a top-level view and you also need to manually assess those links.

This is generally one of the first things that I do when conducting a link audit. I can quickly see if there is a major issue, especially if the chart is showing more links to the lower side of the authority/influence score. Once I have done this, I can then dig deeper into those links that are below where I would hope them to be, and this provides a good starting point for analysis.

Linking Influence Score

The chart above is also a very good way to represent it when showing clients. It allows them to easily understand the state of their link profile without going into too much detail.

Where are your links from? (Link types)
So all your links are from directories and guest posts! That looks natural.

By looking at the link type, you get more information on how natural your profile is, and what you need to be considering moving forward. There are a number of tools available for you to see quickly what link types you have, such as Link Detective and Linkdex.

The ideal scenario is that you will have links from everywhere: directories, blog posts, news sites, image banners, text links, followed and nofollowed links, all with varied, natural anchor text. This will be classed as a much more natural link profile than if they all come from a single tactic.

If you do spot areas of concern, especially if the majority of your links are coming from the same area, then you need to think carefully about what value they are providing, and how this will affect your strategy going forward.

Things you should be looking for in more detail are:

– Site types (blog, news, directory, forums, etc.)
– Link type (images, text)
– Followed vs NoFollowed

Anchor text distribution
Brand variations should always be the main terms you find when having an initial look at your link profile. If your brand doesn’t occupy the top 5 to 10 anchors, then you are likely to be in some serious trouble. People don’t naturally link using your core terms! People link with your brand, URLs, “click here”, etc. This is natural!

If you do spot that your anchor text distribution is predominantly non-brand, then you need to consider whether they are good links, or if they should be removed. If they are good links, then you need to be thinking about a brand building campaign that will drive natural brand links to your website with the aim of having a more natural profile.

Anchor Text Distribution Chart
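A quick way to sanity-check the distribution is to rank the anchors by frequency and see whether natural variants (brand, URLs, “click here” and the like) dominate the top of the list. An illustrative sketch, with a hypothetical brand list you would replace with your own:

```python
from collections import Counter

# Hypothetical brand variants — substitute your own.
NATURAL_ANCHORS = {"acme", "acme.com", "www.acme.com", "click here"}

def anchor_report(anchors, top_n=10):
    """Rank anchor texts by frequency and check whether natural
    variants make up at least half of the top anchors."""
    counts = Counter(a.strip().lower() for a in anchors)
    top = [anchor for anchor, _ in counts.most_common(top_n)]
    natural = sum(1 for a in top if a in NATURAL_ANCHORS)
    return {"top": top, "looks_natural": natural >= len(top) / 2}
```

If `looks_natural` comes back false, that is your cue to dig into the non-brand anchors and decide whether they are good links or removal candidates, as described above.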

Links from the same IP
If you have links from the same IP, then this could be seen as a clear signal of a network. Building lots of links across a network of websites that sit on the same IP can be seen as a manipulative technique and can possibly lead to penalties, whether manual or algorithmic.

Being able to spot IPs can be somewhat difficult if you use standard tools, but you can pull these in with either the Excel for SEOs tool or by using other paid tools such as LinkRisk.

Once you have this information you should sort the list and go through each of the links. You are likely to see a pattern, whether it be a type of site, such as directories (most probable), or blogs that are all on the same topic. Either way, they are likely to need removing, but this is something that you need to consider carefully before starting out.
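Once each linking domain has been resolved to an IP (with whichever of the tools above you use), flagging shared IPs is a simple grouping exercise. A small sketch, taking pre-resolved (domain, ip) pairs as input:

```python
from collections import defaultdict

def group_by_ip(links):
    """Group linking domains by IP address and surface any IP hosting
    more than one linking domain — a possible network footprint.
    `links` is an iterable of (domain, ip) pairs you have already resolved."""
    by_ip = defaultdict(set)
    for domain, ip in links:
        by_ip[ip].add(domain)
    return {ip: sorted(domains)
            for ip, domains in by_ip.items() if len(domains) > 1}
```

Anything this returns is only a candidate for review, not proof of a network: shared hosting puts plenty of unrelated sites on one IP.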

LinkRisk Profile
As previously mentioned, Penguin has caused many people to be looking at their link profile for all types of reasons. But now no matter what you are doing an audit for, you need to be looking at it with penalties in mind, and how you can prevent your client/website from getting one.

We have been using LinkRisk as the first stage of risk identification, with clear scoring patterns that allow you to quickly see the health of the website. However, once a report has been created, we also go through manually checking to ensure that a human eye has been cast over every domain. Time-consuming, I hear you say? Well yes, but algorithms can make mistakes no matter how complex they are, just ask Google.

During this process we look at the links from a human perspective, asking ourselves some questions:

– Will this link drive traffic?
– Has this link been built for SEO purposes?
– Was this link built via manipulative/paid means?

If our answers aren’t “Yes”, “No” and “No”, then we add the site to a list of website owners to contact for removal, or to a disavow file. This is a key part of our link audits, and one I highly recommend including in yours.
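As a rough sketch, the three questions translate into a simple triage filter. The URLs and the answers recorded against them below are hypothetical; in practice the answers come from the manual review itself.

```python
def triage_link(drives_traffic: bool, built_for_seo: bool,
                manipulative: bool) -> str:
    """Apply the three audit questions; anything other than
    Yes / No / No gets flagged for follow-up."""
    if drives_traffic and not built_for_seo and not manipulative:
        return "keep"
    return "flag"  # candidate for removal outreach or the disavow file

# Hypothetical audit notes: URL -> (drives traffic, built for SEO, manipulative).
links = {
    "news-site.example/article": (True, False, False),
    "spammy-directory.example/listing": (False, True, True),
}
flagged = [url for url, answers in links.items()
           if triage_link(*answers) == "flag"]
print(flagged)  # ['spammy-directory.example/listing']
```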

LinkRisk Score

But these are just my thoughts, what do others think?

To help me with this, I asked Paul Rogers and Paul Madden (all the Pauls) for their thoughts on what to look for when auditing a website.
Paul Rogers

“For me, link auditing is one of the most important skills for an SEO professional to have these days – as unnatural link patterns represent a huge risk for websites and they can often be difficult to detect.

I start by outputting links (from multiple data sources) into a spreadsheet and applying formulas to identify low-quality links (e.g. targeted anchor text).

There are lots of things to look out for when auditing a link profile, such as:

– High number of similar links (group of sites, c-blocks, similar format etc.)
– Suspicious-looking individual links (sitewide, directory, comment etc.)
– Suspicious-looking trends (directories, networks etc.)
– Obvious footprints (guest post, sponsored post etc.)
– Links from sites that clearly sell links

These are just a few to go alongside what you have mentioned, but there are hundreds of signals to be looking out for.

These days, even websites that have never been impacted by the penguin update should be looking to optimise their link profile, in order to future-proof their organic visibility.”

Paul Madden

“Doing a link audit is a skilled task and shouldn’t be left to someone without the correct level of experience.

Some things to consider when you are doing a link audit include:

Try to use as many data sources as possible for the initial audit. Our experience suggests that the best sources, in descending order of importance, are:

  • Google Webmaster Tools
  • Majestic SEO Historic and Fresh
  • Ahrefs
  • Existing client SEO link building reports
  • Moz
  • Bing Webmaster Tools

If the site is under penalty then you will have to confess to and reverse nearly all the SEO activity within the profile before you can expect to escape the penalty. This includes removing or disavowing all commercial anchor text links and all links from site types that have been used to gather links over the years. This would include directories, articles, sponsored links, widgets, site wide (sidebar, footer and widget links) and nowadays you’ll have to look more carefully at things like detectable guest posts and press releases.

Think about the links you see in the profile in terms of what is algorithmically detectable and what is not. This will include thinking about obvious footprints, as well as whether the linking site is likely to have appeared on Google’s radar already in other penalties and disavow data.

We now have literally thousands of disavow files uploaded to LinkRisk, and it’s been very interesting to see what the quality of that data tells us about the thinking of most SEOs when doing disavow and cleanup work. Our most disavowed domain has been disavowed at “domain:” level over 180 times so far (I can’t name it, I’m afraid), and people have even been disavowing the highest-quality domains in an attempt to wipe the slate clean ( 52 times / 12 times!). This also tells us something about Google’s expected ability to use the disavow data against us: as a source for web spam detection, its quality is patchy at best.

When you submit your disavow, you have two choices on timing. You can wait for the Google cache to update on the sites where you have had links removed (typically a few weeks) before submitting the reinclusion request, so Google can confirm that they have gone (this is our preferred method). You could also submit the reinclusion request earlier, but this will sometimes mean disavowing many of the domains that have actually agreed to remove your links, as otherwise Google will be unable to confirm that the removals have taken place.

Whilst it’s tempting to get the reinclusion request in quickly, I would strongly suggest, on moral grounds, that disavowing should only be done as a last resort; it’s unfair to disavow if the site owner is working with you to remove the links.”
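If you do end up disavowing, the file itself is plain text: comment lines start with `#`, and domain-level entries use the `domain:` prefix Madden mentions. A small helper to render one (the flagged domains are placeholders):

```python
def build_disavow_file(flagged_domains, note="Links flagged in audit"):
    """Render a Google disavow file: '#' comment lines,
    then one sorted, de-duplicated 'domain:' entry per line."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(flagged_domains))]
    return "\n".join(lines) + "\n"

print(build_disavow_file(["spammy-directory.example", "link-network.example"]))
```

The output is what you would upload via Google's disavow tool; individual URLs can also be listed one per line without the `domain:` prefix.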

What’s Next?

As links change and, in some people’s view, become less important, what else should you be looking at?

Well, this is something I have been thinking about for a while, and I think there will be two additions to those metrics above.

Social counts at page level. These matter because sharing your content reaches a wider audience, allowing you to increase authority and traffic and, of course, build links. This metric will let you see how well social sharing correlates with the pages that have attracted the most natural links.

The other area I feel we will be looking into going forward is the person who placed the link. Are they an authority within your industry or niche? Do they link to competitors? Which websites are they linking from? Do they have a good following?

I think these two metrics will be looked at increasingly over the coming months and years as link audits evolve.

So, those are some of the metrics we look at when conducting a link audit, but don’t think that’s the job done! At the end of the audit you need to produce some actionable takeaways for the client. Do they need to remove links? Is there a gap that needs to be filled? Should they continue doing what they are doing? These are the types of questions you should be able to answer, feeding into your strategy moving forward.

What do you look for when auditing a website? Do you disagree with any of the above? What other items would you suggest adding to an audit document? I would love to hear your comments below and, of course, on Twitter @danielbianchini.

6 Steps to Finding all Your Website URLs

Removing pages from your website? Going through a site redesign or migration, but not sure you have all the URLs on your website? It’s an issue we will all run into at some point in our SEO lives. I have been involved in a lot of projects recently where I’ve needed to find all the URLs on a website, and it can be a pain to do! However, I’ve now managed to get it down to six easy steps, and I wanted to share them with you.

Step 1. Crawl your website

This is an obvious one in my opinion. Looking at your website and gathering all the URLs you can find should be easy, right? Well, maybe if you only have tens of pages, but if you are at enterprise level this isn’t so easy.

Most of you will already know that you can use tools such as Xenu and ScreamingFrog, but if your site is at enterprise level, these may not be robust enough. In this case you could turn to DeepCrawl, which specialises in crawling large websites.

Once you have run the crawl, place those URLs into a spreadsheet on a tab labelled ‘Website Crawl’. You will also want to start a ‘Master List’ so that you have a single URL list, so go ahead and create that too. We will be constantly adding to this list as we go through the steps.
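If you are curious what a crawler is doing under the hood, a stripped-down same-domain crawl can be sketched in a few lines of Python. This is a toy: it ignores robots.txt, rate limiting and JavaScript, all of which the tools above handle for you.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    """Breadth-first crawl of one domain; returns the set of URLs seen."""
    domain = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except OSError:
            continue  # skip pages that error out
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == domain:
                queue.append(absolute.split("#")[0])  # drop fragments
    return seen

# The parser on its own, without any network access:
parser = LinkParser()
parser.feed('<a href="/about">About</a>')
print(parser.links)  # ['/about']
```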

Step 2. Visit your Analytics Package

Reviewing your analytics data is extremely important for all levels of marketing, but many people don’t realise that this is a good place to see what pages you have on your website.

We are looking for all the pages on the website, so head over to Content or Behaviour in your analytics package, set a date range of at least 18 months, and hit the download button. If you have too many URLs for the download, or it is taking a considerable amount of time, you may want to investigate using the API.

Once you have these URLs, create a new tab called ‘Analytics Data’ and place them there, then add a copy to the bottom of the list in the ‘Master List’ tab. Now to step 3.

Step 3. XML Sitemaps

This is another place that is commonly forgotten when looking for URLs. The XML sitemap should, ideally, be the place to find the most up to date version of all URLs for your website. After all, it is the place you are asking the search engines to look to help improve your visibility.

Luckily, an XML file can be opened straight into Excel, so do that now. You will need to do some formatting to remove the unnecessary tags that accompany the XML sitemap, but once you’ve done this you will be left with a list of URLs. Copy this list into your original spreadsheet on a worksheet called ‘Sitemaps’, and then copy it into the ‘Master List’ tab.
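If you would rather skip the manual formatting, a sitemap can also be parsed programmatically. This sketch pulls every <loc> value out of a standard sitemap.xml; the sample sitemap here is invented.

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> value from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/blog/</loc></url>
</urlset>"""
print(sitemap_urls(sample))
# ['https://www.example.com/', 'https://www.example.com/blog/']
```

For a sitemap index file you would run the same extraction on each child sitemap it lists.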

Step 4. Get a List of Your Most Linked Pages

Everyone likes a good link, right? So you wouldn’t want to do anything that might cost you an influential link. The next step is to download a list of your most linked-to pages. You will probably need to do this from multiple tools if you have access to them, but at the very minimum you should download the list of URLs from GWT.

Tools to download most linked to pages from include:

  • Google Webmaster Tools
  • Bing Webmaster Tools
  • Open Site Explorer Top Pages
  • MajesticSEO

Once you have these URLs, collate them and add them to a new tab called ‘Most Linked Pages’, and add a copy to the bottom of the URL list on the ‘Master List’ tab. Now move on to the fifth step.

Step 5. Scraping the SERPs

So far, we have used numerous tools to get all our URLs, but we haven’t checked the search engines! So let’s do that now. You will need a scraping extension for your browser; I use Scrape Similar for Chrome.

Go to your search engine of choice (you will want to check at least Google and Bing in the UK) and type in a site: query for your domain, e.g. site:example.com. Now change the SERP settings to show 100 results (we want to do this as quickly as possible!). This can be done by going to your account settings and changing the view from 10 to 100 results. You may also need to untick Instant Search.

Now you should be able to see 100 results from your domain. Hover over the first result’s title, right click, and select “scrape similar”. This should bring up a dialog box listing the URLs from the first 100 results, with the option to send them straight to Excel or Google Drive. Either option is fine at this point. You will need to go through all the listings that the search engines return; this could take a bit of time! There might be a quicker way to do this, and if you know one I would be happy to hear about it in the comments below.

Once you have gone through the results and collated the URLs, put them in a new tab called ‘SERP Scraped URLs’ and add the list to the bottom of the URLs you have gathered from Steps 1-4 in the ‘Master List’ tab.

Step 6. De-dupe & Check

Wow, you have come a long way and more than likely have a lot of URLs in your spreadsheet. Most of them are likely to be duplicates; at least we hope they are, as that means you have done a thorough job. Excel has a feature that removes all duplicates and leaves you with a unique list of URLs, found under Data > Remove Duplicates. Go ahead and use it.

Hopefully this will leave you with a good number of URLs. Now for the final step: copy the list of URLs and run them through a crawler (I’d use ScreamingFrog) to check the HTTP status of each URL. Once you have the status codes, copy the list back into your spreadsheet, leaving you with as complete a list of URLs as possible, each with a status code. Now you are done!
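The de-dupe step can also be scripted if your master list lives outside Excel. This sketch collapses the list to unique URLs while preserving first-seen order; the normalisation rules (lower-casing, stripping fragments and trailing slashes) are assumptions you should adapt to your site's canonical URL format.

```python
def dedupe_urls(urls):
    """Return unique URLs in first-seen order, comparing on a
    normalised key: fragment stripped, trailing slash dropped, lower-cased."""
    seen, unique = set(), []
    for url in urls:
        key = url.split("#")[0].rstrip("/").lower()
        if key not in seen:
            seen.add(key)
            unique.append(url)
    return unique

# Hypothetical master list collated from Steps 1-5.
master = ["https://www.example.com/", "https://www.example.com",
          "https://www.example.com/blog/", "https://www.example.com/blog/#top"]
print(dedupe_urls(master))
# ['https://www.example.com/', 'https://www.example.com/blog/']
```

The surviving list is what you would feed back through a crawler for the status-code check.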

If you have completed all six steps, then you should have a pretty thorough list of the URLs on your website. I hope this was helpful and provides some structure to finding all the URLs you need. Have I missed anything out? Is there a quicker, more reliable way of getting all the URLs? I would love to hear your thoughts in the comments below or over on Twitter @danielbianchini.