Categories
Blog SEO

5 Reasons You Lose Traffic After a Website Migration & How You Can Prevent It

There comes a time in most websites’ lifetimes when they go through the dreaded website migration.

Website migrations are among the most difficult technical processes to go through, regardless of how skilled you are in digital marketing or development. You are taking a website that (hopefully) has stability, and you are making a huge change. This can cause a host of issues and impact business objectives regardless of how thorough your planning and implementation have been.

In this post I am going to talk through five of the most common reasons that you may lose traffic to your website during your migration, and the steps you can take to try and prevent it.

1. Lack of respect for redirects

301 redirects are standard practice when it comes to website migrations. You find all of the pages on the current website, and you redirect them to the new location with a single hop (okay, there is a bit more to it than just that). Sounds simple right?

Well…

What happens when your development team decides to skip that part, ignores your recommendations and hard work, and just puts up the new website?

You get Google (other search engines are available) spitting its dummy out about the number of pages causing 404s, and your SEO having a heart attack as the 404 count rises by the thousands every day!

[Image: crawl errors reported in Google Search Console]

Inevitably, if this issue is not turned around quickly, you start to lose visibility within the search engine. This leads to a decrease in organic traffic and the potential loss of conversion. I don’t need to tell you that this is not a good position to be in.

So how do you ensure that this does not happen?

Firstly, you need to have or build a good relationship with the development team working on the project. Go and buy them coffee, help them out, make friends. This will stand you in good stead, not only for the migration but for other technical changes you require.

Secondly, you need to ensure that you have conducted a thorough crawl of the website, using a combination of the crawling tools available to you, to capture every URL.

These URLs then need to be mapped correctly to the new location using a single 301 redirect. I would suggest that you use rules where possible to reduce the number of individual redirect calls being made.
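To illustrate the rule-based approach, here is a minimal Python sketch. The URL patterns are hypothetical, not taken from any real site; the point is that a handful of rules plus a small manual map can replace thousands of one-to-one redirect entries.

```python
import re

# Ordered rewrite rules: (compiled pattern, replacement). Each rule covers a
# whole section of the old site, so thousands of URLs collapse into a few
# entries. All paths here are hypothetical examples.
REDIRECT_RULES = [
    (re.compile(r"^/blog/\d{4}/\d{2}/(?P<slug>[\w-]+)/?$"), r"/news/\g<slug>"),
    (re.compile(r"^/products/(?P<sku>[\w-]+)\.html$"), r"/shop/\g<sku>"),
]

# One-to-one map for the stragglers that fit no rule.
MANUAL_MAP = {"/about-us.html": "/about"}

def map_url(old_path: str):
    """Return the new location for an old path, or None if it is unmapped."""
    for pattern, replacement in REDIRECT_RULES:
        if pattern.match(old_path):
            return pattern.sub(replacement, old_path)
    return MANUAL_MAP.get(old_path)

print(map_url("/blog/2016/05/site-migrations/"))  # /news/site-migrations
```

Anything that returns None from a script like this is a URL you have not yet accounted for in the redirect plan.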

Thirdly – and here is the important part – test that these redirects work on the staging environment. That way you can check that they have been implemented correctly and are behaving how you would expect. Once you are happy with them, double-check them on launch of the new website to ensure they have been carried across, and continue to monitor them over the following months.
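The staging check can be scripted. The sketch below (assuming the third-party `requests` library; all URLs are placeholders) fetches each old URL without following redirects and flags anything that is not a clean, single-hop 301 to the mapped location.

```python
def fetch_no_follow(url: str):
    """Fetch a URL without following redirects; returns (status, location).
    Assumes the third-party `requests` library is installed."""
    import requests  # deferred so the audit logic below is testable offline
    resp = requests.get(url, allow_redirects=False, timeout=10)
    return resp.status_code, resp.headers.get("Location")

def audit_redirects(mapping, fetch=fetch_no_follow):
    """mapping: {old_url: expected_new_url}. Returns a list of problems found.
    A clean result is a 301 pointing directly at the mapped URL (single hop)."""
    problems = []
    for old, expected in mapping.items():
        status, location = fetch(old)
        if status != 301:
            problems.append(f"{old}: expected 301, got {status}")
        elif location != expected:
            problems.append(f"{old}: redirects to {location}, expected {expected}")
    return problems
```

Run it against the staging hostnames from your crawl, then again immediately after launch.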

2. Google taking time to recognise redirects

Recent experience has indicated that Google is taking longer than it once did to recognise redirects and changes made during a site migration, which is then not reflected in the index.

The chart below shows how Google indexed the new and old versions of a website over a two-month period. Although I would expect to see fluctuation over time, previous migrations have seen a much quicker change, with Google quickly reflecting the new URLs within the index.

[Image: indexation of new vs old URLs over time]

There are a number of reasons why your new website may have a lower indexation count than your previous one, but it is essential that you figure out why.

At this stage, most people will just refer to visibility tools as a measure of progress, such as the one shown below. Although it is good to see how you compare to the previous state of affairs, you need to keep an eye on your internal data.

[Image: search visibility graph]

Tip: Don’t look at the visibility graph and take it at face value, dig in to see if you have retained similar rankings. It is great to have a similar or better looking graph, but absolutely pointless if all the terms have dropped to page 2 or beyond.

So how do you help speed up the indexing process?

This is one of those times where you are in Google’s hands, waiting for them to recrawl the website and reflect that in their index. I would, however, suggest that you do the following to help as much as possible:

  • Use the GSC website address change tool (if applicable).
  • Upload new XML sitemaps to GSC – I would also upload the new XML sitemap to the old GSC account.
  • Regularly review the new XML sitemaps and the pages/sections within GSC that are not being indexed. Identify the key areas and use the Fetch as Google feature to submit them to Google.

3. Removal of pages

It is common during a website migration for the information architecture of the website to change. Moving to a new website/domain provides the perfect opportunity to improve the way users and search engines can get around.

It is at this stage, and before the pages have been removed, that you need to understand the impact those changes will have on the business objectives.

Take a look at this somewhat fictitious exchange:

Client/Stakeholder: “I am going to remove X pages during the migration as they are not converting.”

You: “By doing so you will lose X% of traffic across all channels with the likelihood of losing organic visibility, which in turn will affect conversion.”

Client/Stakeholder: “That’s fine, as they are not converting directly and therefore the traffic is not qualified.”

You: “But this will also have an impact on your assisted conversions, I would suggest that we combine these pages where possible.”

Client/Stakeholder: “I understand, but I am going ahead.”

Website launches:

[Image: traffic drop after removal of pages]

Client/Stakeholder: “We have lost lots of traffic and the board are going nuts!”

You: “Face palm! – How are the conversions?”

Client/Stakeholder: “Down! WTF!”

So how do you reduce the potential of this happening?

Do research! And do it thoroughly. If you and/or the client want to remove pages, then you need to really understand the impact that this will have. Information that you want to be able to present back to the client and key stakeholders includes:

  • Impact on key metrics such as conversion / traffic.
  • Potential impact on search engine visibility. Losing pages will mean the potential loss of keyword rankings.
  • Alternative solutions if relevant. Can you combine some of the pages to make them more relevant? Can the pages be improved to help improve conversion?

4. Crawlers being blocked through Robots.txt & NoIndex tags

As standard practice, you should ensure that any new website is not visible to users or search engines whilst it is going through the development stages. As you can see below, this is not always the case.

[Image: staging sites appearing in Google's index]

You could conduct a number of searches in Google right now and find an array of websites with their development or staging environments indexed. Go take a look and try the following:

  • site:[INSERT DOMAIN] inurl:staging.
  • site:[INSERT DOMAIN] inurl:dev.
  • site:[INSERT DOMAIN] inurl:uat.

How did you get on? Find many?

More importantly, how does this mean that you lose traffic? Well, if standard practice has been followed you should not see any of the above, as your development team would have added both Disallow: / to the robots.txt file and the meta NoIndex tag to every single page BEFORE a search engine could crawl it.

Some people might say that this is overkill, but I would want to ensure that nobody outside the confines of the business and any external partners knows what is coming. I would even suggest that the website is placed behind a gated wall and IP-restricted to those trusted few.

Anyhow, I digress. The issue of traffic loss arises when you move the website from development to a live environment. It is at this stage that small details are often missed, notably the removal of the NoIndex tags and the Disallow: / command in the robots.txt.

If these are not removed from the website on launch, then you are going to be in a bit of trouble. Your meta descriptions will start to indicate that the pages are being blocked by the robots.txt and, after a while (if not resolved), your pages will start to drop from the index.

So how do you stop this from happening?

This one is easy, or at least I would hope so. On launch of the website, check the robots.txt for the Disallow: / command blocking all robots. I would also recommend that you run a crawl of the website and pay special attention to the NoIndex tag.
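Both checks are easy to automate. Here is a minimal Python sketch operating on the fetched robots.txt and page HTML as strings (fetching is left out for brevity, and the simple regex assumes the meta tag's name attribute appears before its content attribute):

```python
import re

def robots_blocks_all(robots_txt: str) -> bool:
    """True if robots.txt still contains a blanket 'Disallow: /' directive."""
    for line in robots_txt.splitlines():
        directive = line.split("#")[0].strip().lower().replace(" ", "")
        if directive == "disallow:/":
            return True
    return False

# Matches <meta name="robots" ... content="...noindex...">; assumes the
# name attribute comes before content, which is the common case.
NOINDEX = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """True if the page carries a robots meta tag containing 'noindex'."""
    return bool(NOINDEX.search(html))

print(robots_blocks_all("User-agent: *\nDisallow: /"))  # True
```

Wire these into a post-launch crawl and you catch the problem within minutes rather than after pages have dropped from the index.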

5. Lost ALL traffic

One basic mistake is not moving across or adding in your analytics. I recently came across a website that had gone through a migration and lost ALL of its traffic.

[Image: traffic loss]

As you can imagine, they were in despair, so when I pointed out that they did not have any tracking code on the entire website they were very annoyed, but also happy that they had not lost everything.

But why does this happen? Surely you would expect tracking to be added as a matter of course.

Well, in my experience that has not always been the case. Depending on the migration type, and whether you are having a new website built, you need to specifically request that the tracking is moved across.

How can I prevent this from happening?

I would suggest that you use Google Tag Manager and have this implemented on the website throughout the development process.

From here you can do things in two ways depending on how comfortable you are with GA and GTM.

The first option, and probably the simplest, is to ensure your GA code has been implemented within Google Tag Manager but hasn’t been published. Then on launch, all you need to do is publish the tag to ensure you are tracking continuously.

The second option, and the one I would generally plump for, is a little more involved. I am keen that all my tracking is in place before the website is launched, and therefore I would want to test events, goals, eCommerce if applicable, etc, but I don’t want that skewing any live data. Therefore, I would do the following:

  1. Create a new GA account specifically for the staging environment, or use an existing view and filters.
  2. Publish the tag containing the test profile and begin testing.
  3. Once happy, and on launch, remove the test tag and implement the tag with the live account details.
  4. Create an annotation in GA to highlight the change of website.

But that’s just me.

There you have it: five reasons you could lose traffic during your site migration, and how you can prevent them from happening. You may think that these are very basic issues, and I would agree. However, they are made time and time again because they are small details that people forget during such a large and data-intensive process.

I would love to hear about your migration, and whether you came across any of the issues I mentioned, in the comments below.

This post was originally published on State of Digital.

Categories
Blog SEO Tools

SEOmonitor Software Review: A Top SEO Tool for Ranking in Google

It’s no secret that Google searches are a gold mine for anyone who can successfully achieve those coveted top positions in Google’s results. 

But the reality is, Google processes over 40,000 search queries per second, so you won’t be able to get to the top without a good strategy. What’s more, since most internet users don’t venture beyond the first page of the SERPs, you need every SEO tool at your disposal to compete for those valuable slots.

Ultimately, your success with search engine marketing depends on the quality of the SEO software you use. SEO tools can help enhance the overall performance of a given website – from blogs to e-commerce stores – in the search engine results. 

Some of these tools can work together for even better performance, forming what we in the industry call an SEO toolkit. I’ve reviewed some SEO tools before in previous blog posts, but today, I’m going to focus on an excellent SEO tool called SEOmonitor.


 
Ready to try SEOmonitor? If you’re interested in giving SEOmonitor a go, the guys there have allowed me to offer you a special extended 30-day trial, rather than the standard 14 days. All you have to do is use this promo code: tRWlHJ 


SEO software review: SEOmonitor

SEOmonitor aims to help marketers get a better glimpse at the overall landscape of organic reach and traffic for their industry. That’s why it covers quite a lot of ground in just one tool, with several groundbreaking features: 

    • Organic Traffic
    • SEO Campaign
    • Keyword Research
    • Competition Insights
    • Content Performance
    • Issues
    • Search Reputation
    • Business Case Builder

One of the greatest things I’ve found about SEOmonitor is that it can be integrated with Adobe SiteCatalyst, Google Search Console, SEMrush, Majestic, and of course, Google Analytics. 

With these native integrations, SEOmonitor is able to gather all of your website’s organic data, work around the (not provided) keyword problem, and provide insights on keyword data and competitors!

But I’m getting ahead of myself. Let’s talk about what kind of tool SEOmonitor is.

What is SEOmonitor?

SEOmonitor is primarily a reporting tool, providing a full set of data on organic keywords – which can also be used to research keywords and topics. It gives you a full breakdown of the keywords providing traffic to a website, complete with metrics such as:

    • Search volume
    • Clickthrough rate
    • Average position
    • Number of search visits from each keyword
    • Bounce rate
    • Conversion rate

I’ll go over a lot of the best features in SEOmonitor in more detail, but first, I want to highlight a top feature in SEOmonitor: the Topic Explorer. SEOmonitor’s Topic Explorer focuses more on “topics” as opposed to individual keywords, which is a big deal in today’s search environment, because Google has been moving away from simplistic individual keyword ranking in favor of latent semantic indexing (LSI).

The Topic Explorer tool links different keywords to a topic to find semantically-related keywords, so if you input a topic as broad as “car insurance,” for example, you’ll get a huge list of keywords associated with that topic – not just keywords with “car” or “insurance” in them.

The team at SEOmonitor have also developed their own metric known as Visibility Score, which is a more intuitive way to assess keywords rather than the arbitrary “Keyword Difficulty” metrics provided by most other tools. Visibility Score blends rankings and search volumes in a way that makes it more relevant, insightful, and easier to understand than other SEO performance tracking approaches.

Pretty helpful, right?

The Origin Story of SEOmonitor

The team behind SEOmonitor were originally part of an SEO agency – they were marketers providing SEO work for clients. Then, in 2013, they were hit with the same problem that pretty much every single SEO agency and professional had to deal with: Google’s encrypted organic searches. Suddenly, there was much less information available for predicting or measuring SEO performance.

To solve the problem, these business-minded marketing pros created a tool and delivered it to a market full of SEO brands and agencies that needed it. SEOmonitor was officially launched in 2014 as a tool to measure and predict SEO performance. 

Today, the company supplies over 2,000 brands with its services and continues to grow. In fact, SEOmonitor even won the EU Search Awards in 2016!

SEOmonitor Features

Winning the EU Search Awards is obviously a big deal, but SEOmonitor didn’t get there by accident. 

So, what is it about SEOmonitor that has won over professionals all over the continent? Let’s take a look at the tool’s main features!

Automatic Keyword Research

In order to earn top results for your keywords, you have to start by understanding their competitiveness and relevance.

SEOmonitor helps by sifting through thousands of keywords that are relevant to you, revealing how challenging it would be to rank in the top 10 on the SERPs, and suggesting the value you can expect if you successfully rank in one of those top 10 spots! 

It’s particularly helpful to have all of this keyword data concentrated in a single tool.

Opportunity Indicator

What good does it do you to rank first in Google for the keyword “dancing shoes” if you sell gluten-free pizza doughs? You may be highly ranked, but it’s a waste of an opportunity to rank for irrelevant keywords – and if you run an ecommerce or content website, it’s a waste to rank for too many keywords that lack buyer-intent. 

Fortunately, SEOmonitor provides a feature called the Opportunity Indicator. By measuring the search volume, the difficulty, and the rank of each tracked keyword, SEOmonitor is able to prioritise the keywords that are likely to have the biggest impact on visits in the shortest amount of time. This is the kind of data that’s worth every penny, especially when you have limited bandwidth or resources to put toward content creation, because it lets you focus your attention on the activities that will make the biggest impact. 
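As a rough illustration of the idea only (this is not SEOmonitor's actual formula, and the click-through rates are generic industry-style averages I've assumed), an opportunity calculation can be sketched like this:

```python
# Illustrative click-through rates by organic position; generic assumed
# averages, not SEOmonitor's internal figures.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def opportunity(volume: int, current_pos: int, target_pos: int,
                conv_rate: float, order_value: float) -> float:
    """Estimated extra monthly revenue from moving current_pos to target_pos."""
    current_ctr = CTR_BY_POSITION.get(current_pos, 0.02)  # beyond top 5: ~2%
    target_ctr = CTR_BY_POSITION.get(target_pos, 0.02)
    extra_visits = volume * (target_ctr - current_ctr)
    return extra_visits * conv_rate * order_value

# 10,000 searches/month, position 5 to 1, 2% conversion, 40.00 order value:
print(round(opportunity(10_000, 5, 1, 0.02, 40.0), 2))  # 2000.0
```

Ranking the estimate across your tracked keywords is what lets a tool surface the quickest wins first.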

Visibility Score

A professional SEO tool isn’t complete without the famous “Visibility Score Metric.” The Visibility Score is a core SEO performance metric, allowing you to see an accurate measurement of the overall visibility of a group of keywords in Google at a single glance. The score is expressed as a percentage, representing your impression share in the organic results: how many times a user saw your website in the results page, from the total number of searches on your keywords. 
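As a simplified, assumed illustration of an impression-share style metric (not SEOmonitor's actual calculation, which is more nuanced than a flat page-one cutoff):

```python
def visibility_score(keywords) -> float:
    """keywords: list of (monthly_search_volume, rank or None).
    Returns the share of total searches (as a %) where the site appeared on
    page one - a simplified impression-share calculation."""
    total = sum(volume for volume, _ in keywords)
    on_page_one = sum(v for v, rank in keywords if rank is not None and rank <= 10)
    return 100 * on_page_one / total if total else 0.0

# 1,000 searches at rank 3 count; 500 at rank 12 and 500 unranked do not:
print(visibility_score([(1000, 3), (500, 12), (500, None)]))  # 50.0
```

The key property is that a high-volume keyword moves the score far more than a long-tail one, which is exactly what a flat average position hides.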

SEOmonitor does the trick by giving you an overview of the overall performance in Google. It helps you to identify changes and anomalies in your SEO performance, compares them with non-brand traffic, and shows which keywords (or keyword groups) influenced overall visibility the most.

Business Forecast

Every agency insider today is aware that most people want you to forecast future results based on the work that you have suggested. This is a universal trend, although the accuracy of these predictions is really a whole other conversation. 

SEOmonitor, on the other hand, is able to provide solid insights on potential future results. All you have to do is review the data shown, choose some of your keyword groups, select the average position that you expect to achieve by that time, and then hit the forecast button. SEOmonitor then produces a graph like the image above. 

This Business Forecast feature makes it easier to build SEO proposals based around projected results, such as the number of non-brand organic visits the client will have on a monthly basis and how many conversions this new traffic will generate. An excellent addition to this tool is that it takes into account the seasonality of the industry and the current rankings and traffic values. It’s a pretty comprehensive approach!

For the price of the tool and the features that are available, SEOmonitor really can be used by companies of all sizes, on both the agency and client side.

Content Performance Review

The Content Performance Review provides an analysis of external content pointing to your site, giving you a fantastic overview of how your outreach campaigns are working. And yes, you’ll see all the normal metrics of any backlink analysis, such as:

    • Domain rating
    • Social media shares
    • Visits and conversions to your website
    • Link status
    • Anchor text

But the beauty of SEOmonitor is that it goes above and beyond, showing the effect on the Visibility Score for any particular landing page.

And to make sure you don’t miss anything in the daily routine, this analysis is updated every day, so that you have all the current information and are up-to-date on any significant changes, anomalies, and particularities.

Organic Traffic

I have found that SEOmonitor allows you to connect your website profile to Google Analytics and Adobe SiteCatalyst, which broadens the potential uses of this already fantastic tool to even more people. The majority of SEO software suites will only allow you to link a single analytics package, most likely Google Analytics. 

But we’re not talking about just any SEO tool, are we? 

SEOmonitor allows you to hook different analytics tools up to your website without charging extra for it. And once you have connected your analytics package to SEOmonitor, you’ll get more than just organic traffic metrics, including: 

    • Organic conversion rate
    • Organic conversions
    • Organic revenue

All this data wrapped around your campaign tracking makes for more precise, informed decisions. By integrating Google Search Console query data, you will start to get a feel for which keywords are actually generating traffic, as well as pinpoint topics for you to optimise.

Another cool thing is that during the setup process you add your brand terms. These terms allow SEOmonitor to split out brand and non-brand traffic! 

Would you like to know how? Well, a magician tends not to reveal his secrets, but here it is: with the integration of Google Search Console, some clever algorithms running in the background are able to match your organic traffic to landing pages – plus the keywords from both your currently active campaign and Google Search Console – to give a good indication of the split. 

It’s not always 100% accurate, but it provides you with valuable insights that would’ve remained unknown otherwise. This could be the competitive advantage you’re looking for!

Competition Insights

Want to see exactly how you compare to the competition?

On the Competition Insights page, you’ll be able to compare the Visibility Score trend of your top competitors, as well as their current Visibility Score for desktop and mobile. You’ll also see their top keywords, how many keywords you have in common and their domain score.

Below, you’ll see a list of all the keywords that your competitors rank for. Detailed metrics show you exactly how these competitors rank, what kind of changes they’ve seen, and more! You can hover over each element for even more details of what kind of content is working for the competition.

SEO Timeline: Events Correlated to Rank Fluctuations

When looking at the keywords in your SEO campaign, you can click on the little calendar icon to see a complete timeline for that particular keyword. This timeline shows you events such as landing page changes, HTTPS migrations, new backlinks, Google algorithm updates, and more. It then shows you exactly how your rank for that keyword changed as a result of those events.

This fantastic timeline tool gives you a special insight into how your actions affect your ranking. Want to pass along what you’ve discovered to your client or teammates? Just click the “Share Insight” button at the top of the event timeline. This creates a special link to a dedicated page that shows this timeline, and allows for comments and discussion at the bottom.

Flexible Pricing

In an effort to make the pricing policy a bit more fair and accessible, SEOmonitor uses a different pricing system from most tools: The price that you pay for SEOmonitor depends on the number of websites and keywords that you track. 

One website and up to 300 keywords – the cheapest option – is €49 per month. With every website that you add, while keeping the same number of keywords, the price increases by €10. For every extra 100 or 1,000 keywords that you add, the price increase varies. 
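In other words, under the per-website rule described above (keyword-based increases vary, so they are left out of this sketch), the monthly price at the base keyword tier works out as:

```python
def estimate_monthly_price(websites: int) -> float:
    """Price in EUR for `websites` sites at the base 300-keyword tier:
    49 for the first website plus 10 for each additional one. Extra-keyword
    pricing varies by volume, so it is deliberately left out of this sketch."""
    return 49.0 + 10.0 * (websites - 1)

print(estimate_monthly_price(3))  # three sites, 300 keywords: 69.0
```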

If you’re running a larger business or agency and tracking more than 100,000 keywords, a custom price is designed exclusively for you. The cool thing is, whichever pricing plan you decide on, you will always have unlimited access to all the features, so it’s not one of those pay-to-win kinds of tools that we see in almost every corner these days. 

When paying more than €300 per month for the tool, you will also have unrestricted access to pitching resources, which comes in handy for any agency or professional looking to scale up.

Who is SEOmonitor for?

At this point, it’s fair to say that SEOmonitor offers an incredible number of features. We’ve already seen a number of them! 

But the truth is, the list above hardly even scratches the surface of this comprehensive tool.

Take a look:

So, with so many features, it’s a fair question to ask who would benefit most from using this tool. Is it for small business owners, marketing professionals, agencies, or somewhere in between? 

Well, let’s consider what a reasonably fair cost-to-benefit ratio might be.

Because the price is customisable, it can easily scale for small or large teams, whether in-house or agency. And if you are a freelancer with quite a lot on your plate right now, SEOmonitor might be a way to free up your time and start delivering even better results, which may greatly benefit your career. 

In more general terms, almost anyone can benefit greatly from this tool with a little adaptation. To take full advantage of all the features that SEOmonitor offers, though, it’s probably best suited to agencies that handle SEO for many different clients.

That being said, any SEO content marketing experts or teams will have a blast exploring and using the technology within SEOmonitor. This tool will absolutely bring new life and excitement to your search campaigns.

Pros & Cons

I’ve been working with SEOmonitor for quite some time now, and one great way I’ve found to communicate the value of a product is with a quick list summarising its pros and cons. There are some fantastic features here in SEOmonitor, but there are also some things that could use some improvement. Let’s see what they are!

SEOmonitor Pros

Daily search engine updates

Before SEOmonitor, I had to use other ranking tools with weekly or bi-weekly updates. Given that SEO is a long-term process and organic changes take time to show, you might think daily search engine updates wouldn’t be necessary. 

However, when I work with larger websites, their content changes frequently, so I soon realised how useful a daily update feature could be in uncovering SEO issues quickly. With SEOmonitor, I am now able to diagnose important SEO problems as soon as they crop up!

Easy to group and smart-group keywords into folders

As a digital marketer, you know that daily marketing tasks take a lot of time. This has a direct impact on how much you can bill and how efficient you are with your work. SEOmonitor helps with its keyword smart-grouping feature, where the system suggests keyword groups based on the keywords you are tracking. Anything that can save your expensive, precious time is worthwhile in my book!

Keyword cannibalisation warnings

Keyword cannibalisation is something to take seriously, as it can damage your rankings for multiple reasons. It happens when a website’s information architecture targets a single keyword or phrase on multiple parts of the website. While this can occur unintentionally, having a bunch of pages that target the same keyword can cause real problems. 

SEOmonitor has a filter that shows how many times Google has swapped the landing page it shows in the SERPs for the same keyword. If SEOmonitor finds too much change, it means that Google can’t decide which page relates to which keyword. By looking into your cannibalisation list, you’ll be able to create new pages or update the content on the existing ones.

Forecasting

I know I mentioned Forecasting before, as it is a really helpful feature for setting up goals in your SEO campaigns, but the tool is just so fantastic it deserves its own spot as a pro of SEOmonitor. 

When focusing on a set of keyword groups, SEOmonitor will forecast the amount of traffic and PPC costs, and predict a what-if scenario. The forecast is based on the Google Analytics data your account is linked to. For this, SEOmonitor has developed an algorithm that predicts the revenue you are generating from your organic traffic and the grouped keywords you are tracking. 

This feature is quite handy as a way to estimate the monetary value of your SEO efforts. In practical terms, you can share these numbers with a client or your executive team – or anyone who doesn’t understand the SEO process in detail, but always cares about the bottom line. 

As I mentioned before, forecasting isn’t always accurate, but having some real-world numbers at your disposal gives you a sizeable advantage. 

Opportunity indicators

Arguably one of the most interesting features in SEOmonitor is the Opportunity Indicator. At its core, it is a calculation of how much revenue you could earn if you nabbed the top result. By looking at these indicators, you can adjust your keyword strategy to capture more revenue with less effort. It’s very helpful to be able to set revenue and sales goals for your SEO efforts by defining position-based KPIs. 

Easy way to learn competitor keywords

Admit it: you’ve always wanted to take a quick peek into your competitors’ backyard to see how they handle things. The good news is, SEOmonitor lets you easily investigate your competitors’ keywords and see which ones are frequently used. This will help you find new opportunities while thinking of strategies to outrank your competitors on the search results page. 

Yes, many other SEO tools have similar functions, but SEOmonitor shows competitor insights for specific keywords or keyword groups that you are focusing on as well. Plus, you can see which keywords are generating clicks for the landing pages of your competitors. This is a handy way to try and reverse engineer your competitor’s success!

Download desktop and mobile SERP snapshots as HTML pages

Remember what I said about saving time? Well, this feature gives you an idea of how your pages appear in the real-world SERPs. Once again, you’ll save time by not needing to switch back and forth to Google to check SERPs and meta tags. This is as practical as it gets!

SEOmonitor Cons

Confusing user interface

SEOmonitor is a tool that provides a bunch of useful information and insights across many features. That may be why its interface is so crowded and occasionally confusing. There are a lot of icons and buttons you can click on within the tool, but they are not exactly marked or named in any way. Unless you get in the habit of hovering your mouse pointer over every element on the page, you may miss out on valuable features! 

It’s also worth mentioning that when working with a lot of keywords at once, the UI refreshes and recomputes the data, which slows down the browser window and the overall speed of your work. 

Forecasting

Although I included forecasting as a pro for SEOmonitor, it’s still not a perfect feature. The forecast number that SEOmonitor gives you is very broad. My bottom line here is, if you rely TOO much on the traffic or revenue number provided, you might compromise your business. Remember, these numbers aren’t set in stone, so be cautious about basing your company’s future on these kinds of forecasts.

Steep learning curve

There is good documentation with SEOmonitor, but getting used to the tool and finding the best workflow can be challenging.

It’s also somewhat expensive because of its daily search engine updates. But as mentioned before, if you have websites with lots of pages that are frequently changed, daily updates can be useful, which makes the price worth it.

SEOmonitor FAQs

How many languages does SEOmonitor cater to?

Currently, SEOmonitor supports 20 languages, including English, French, Dutch, Italian, Japanese, Arabic, Brazilian Portuguese, Chinese, and Romanian.

Does SEOmonitor work on WordPress.com as well?

Yes. Users will just have to disable the default SEO options in their WordPress.com accounts.

What does SEOmonitor integrate with? 

Here are just a few of SEOmonitor’s popular integrations:

Conclusion

Honestly, there is just so much to appreciate about SEOmonitor. One of the tool’s key strengths is its effortless keyword tracking across desktop and mobile. 

Adding hundreds of high-value keywords is child’s play – within minutes, you have the ability to review performance, monitor competitors, and uncover the most promising keyword opportunities available for your website.

Whether you’re monitoring 100 keywords or 5,000, it’s always easy to understand how an SEO project is going via SEOmonitor’s Visibility Score metric. The score measures the actual impact of ranking changes, taking into account the various features that each query can trigger in the search results. Combined with the app’s ability to connect keywords to conversions and revenue, it makes it easy for both you and your clients to monitor and understand your SEO efforts and results.

SEOmonitor also makes it easy to access the data you need to make recommendations, confirm insights, and pull together reports. The ability to separate branded traffic provides for a greater level of transparency. Meanwhile, the team is ALWAYS open to improvement – iterating constantly and not only listening to feedback, but implementing it. 

The interface can be challenging at times and powerful features are often hidden away. Luckily, the in-app live chat is always available if you need help. Another thing I’ve noticed is that, whenever new features are being added to the app, this can result in some functionality/usability issues. While the support team are awesome at responding right away, it does sometimes take up to a day or two for those issues to get fixed, which is something to be aware of.

Ultimately, SEOmonitor takes care of everything when it comes to creating and monitoring an SEO campaign. If you don’t have such a tool, or you have one but aren’t satisfied, give this one a try. 

If you aren’t a current customer and would be interested in testing SEOmonitor for yourself, just know they are very helpful. They also offer a migration service to assist in any migrations away from other software. But if you don’t want to put your money on it yet, then consider signing up for a free trial and request a demo with the team. 

Again, the guys at SEOmonitor have allowed me to offer you an extended 30-day trial instead of the standard 14 days. Just use the following promo code: tRWlHJ

So, are you interested in giving it a try, or are you already an SEOmonitor user? What are your impressions about the tool? Share your opinion in a comment below. I would love to hear it!

Categories
Blog SEO

Google using New Labels within Mobile Search Results!

I was researching the sad news of the passing of the artist formerly known as Prince (RIP) when I came across what look to be new icons within the mobile search results. It has been quite a newsworthy day in the UK, so I conducted a few more searches and started to see these icons on more and more queries.

Google have been adding visual aids in the form of icons and labels to the mobile search results for quite a while, and whilst I have seen “mobile-friendly”, “slow to load” and multiple PPC labels, I had not come across brand logos within the top stories or the “Live” icon in mobile search (as shown below).

HRM_Queen_90th_Birthday

 

I tried several other queries related to the Queen’s birthday, Prince’s passing and Ched Evans’ conviction being quashed, and this is what I was presented with.

Prince with Brand Logos in Top Stories

This was the first time that I noticed the News Outlet brand logos within Google’s mobile search.

Prince with Live Label in Google Mobile Search

This is for the same search query but showing the term “Live” in the UK search results. Conducting more searches, I noticed a similar pattern, as shown in the screenshot below.

Ched Evans with Brand Logos

 

These visual aids within the SERPs took me by surprise, as I am used to Top Stories being in a similar format to the rest of the SERP as shown below.

 

HRM Queen 90th Standard Top Stories
Being curious, I checked on a desktop computer for the same search queries to see if these brand logos or the live text were showing within the SERPs.

Prince dead desktop google search

 

queen birthday celebrations desktop

These queries returned the result pages looking as they have done for a while. This begs the question, why add these labels and brand icons to the mobile search results only? Are mobile users more susceptible to visual aids than those on a desktop?

Now, I don’t generally search for news-related results, so this could have been live for a while, but I have not seen much coverage. Barry Schwartz reported back in October 2015 that Google was testing the Live icon within the SERPs; however, today has been the first time that I have seen this being used within the UK.

Have you seen any of the above before, whether it be in the UK or elsewhere?  Do you think they will add value to the user experience? I would love to hear your thoughts in the comments below.

Categories
Blog Tools

Tool Review: Linkdex New Platform

Has anyone noticed that Linkdex has been looking a little bit different recently?
I’ve been using the new platform for a month or so now, so I’ve had time to explore all the features of the new interface. It’s no secret that in the past there have been some flaws with the platform: great data, but a lack of user-friendly features meant that it often felt easier to use a combination of other tools instead. But with these latest changes and the relaunch of the platform earlier this year, it’s time to put the new Linkdex to the test…

The Platform – What’s Changed?

Navigation

This was the most noticeable change for me to begin with. All the navigation used to run across the top of the page, but there’s been a bit of a shift around, so the side navigation menus have become a lot more important. When Linkdex said they were changing this, I didn’t think it would make a huge amount of difference to the functionality and assumed it would just feel like a purely cosmetic change.

But in reality, this has been a massive help. One of my previous complaints about Linkdex was how difficult it was to find certain features if you weren’t used to using the tool every day, but this change has made it much simpler.

 

16.03 - Linkdex Visibility

 

This also means they’ve made the filters clearer, so you can quickly check the data of the page you’re on without getting lost in the old maze of filters. The applied filters are always visible at the top of the data pages, instead of requiring a click to check what’s been applied to the page you’re on.

Show Me

As part of making the filtering system better, there’s now the option to apply ‘Show Me’s’ to the data.

 

16.03 - Linkdex Show Me

 

I hope the next step is to be able to save your own ‘Show Me’ options across your entire account. This would be really helpful for ensuring that everyone working on our clients is reporting on the exact same data and seeing opportunities for strategy in the same places, rather than having to issue instructions on how to mimic filters I’ve applied previously. This is a good step in the right direction in terms of the filtering functionality so I’ll be watching out for further developments.

Reporting & Dashboards

Up until now I was really impressed with the changes that Linkdex has made, but it seems a bit like they might have forgotten the reporting section (or are hoping it’s a phase 2). There have been some tweaks, but nothing drastic enough to change my opinion on this section of the tool, though I do think reporting is one of the hardest things for a platform to get right.

Now, I don’t use the reporting features in the platform that much anyway, so this isn’t really a dealbreaker for me. I much prefer to download the graphs separately and build my own custom reports using other data sources too, which is still possible, so that works. The best change here is how the dashboards work. They no longer double as a hard-to-construct version of the report; instead, the graphs you select display side by side every time you log in, which means you instantly get a great top-level view of each project’s performance. This is ideal when I’m running quick checks across multiple projects: I can flick between dashboards and understand which clients have seen the biggest changes over the past month.

Away from the Platform…

The platform isn’t the only thing that’s been changing at Linkdex though by the looks of it.

Grown Up Website

Those who have been in the industry for a while might remember this:
16.03 - Linkdex Website

 

A few iterations later, and with a shift in the brand messaging to ‘The SEO Platform of choice for Professional Marketers’, what you see now on a visit to the Linkdex site is a sleeker, more grown-up experience.

 

16.03 - Linkdex Professional Marketer

 

Ordinarily, I wouldn’t comment on the design of a tool’s site or even factor it into the quality of the software, but for Linkdex there are special circumstances. As an agency, White.net prides itself on transparency with our clients, which means we often give clients access to their Linkdex projects so they can see the raw data for themselves. If you’re doing this with a client, you need the platform’s own site to reflect the quality of the data it’s collecting, so it looks like a reliable source.

Industry Thought Leadership

Linkdex has always worked to promote knowledge-sharing across the industry through its SEO Insights pieces and events, but recently the efforts to promote these and attract great speakers have really increased. I spoke on one of their SEONow webinars just a few months ago, alongside some excellent speakers, with a tonne of engagement following the event. The Customer Experience Think Tank earlier this month also had some great presentations and a good mix of actionable talks and big-sky thinking that quickly made the SEONow events ones I’ll be looking to attend in future. Not to mention they’re sponsoring the bar at BrightonSEO too… This commitment to the industry is really positive to see. Doing good across the SEO world with these free events makes it seem a lot more like they care and understand what we’re all working towards as agencies using the platform.

Linkdex isn’t perfect. But what tool is? My main takeaway from all these changes, though, is that they’re taking things seriously now. It feels like they’ve listened to the feedback over the past few years from people like myself and finally done something about it. I’m sure we’ll see many more changes over the upcoming year, hopefully with even more positive improvements. Have you tried using the new platform? I’d love to hear what you think about it in the comments below.

Categories
Blog Presentations

3 Tactics to Futureproof SEO in 2016 & beyond

This evening I had the pleasure of presenting on a Linkdex SEONow webinar alongside Chris Hart and Danny Goodwin.

The topic of the session was SEO insights for 2016, following on from the blog post that I contributed to back in late December.

Below are the slides that I presented to a great audience, followed by some good discussion with Chris and Danny. I have also included the video of the webinar at the top of the page.


On upload to Slideshare it seems as if some of the slides have become blurred. If you would like them, get in touch direct.

FULL WRITE UP TO COME SOON

Audience Questions:

Q: Personal assistants – is it just relevant for B2C or also for B2B?

Dan: I think this very much depends on your audience. However, unless there is considerable cost associated with your implementation, why wouldn’t you do it? As more people use personal assistants (usage is growing) and search using mobile, the smarter these applications will become, and they’ll start to automate what you see. As I mentioned in the presentation, I never actually set the app to bring in any of the content shown; it was based on my search preferences across all my devices. This does require you to be signed into Google (or the equivalent), but when are you not?

Q: Is Google clever enough to know my site is responsive via a fluid layout (i.e., I don’t have a dedicated mobile site)? Is that OK?

Dan: Yes. There are three ways that Google has indicated that they see mobile websites, and I would suggest that this fits within the responsive category although I am speculating based on the question. The three formats that Google have provided are shown below:

  • Responsive design (Google recommended)
  • Dynamic serving
  • Separate URLs (m.)
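Of the three setups, dynamic serving is the easiest to get subtly wrong, because the same URL returns different HTML and the server must signal this to crawlers and caches. Here is a minimal, hypothetical sketch of the idea (not code from the webinar; the user-agent tokens and page bodies are illustrative only):

```python
# Minimal WSGI sketch of "dynamic serving": one URL, different HTML per
# user agent, with a Vary header so crawlers know to fetch both versions.
# Illustrative only - real user-agent detection is far more involved.

MOBILE_TOKENS = ("Mobile", "Android", "iPhone")

def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "")
    is_mobile = any(token in ua for token in MOBILE_TOKENS)
    body = b"<p>mobile page</p>" if is_mobile else b"<p>desktop page</p>"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Without this hint, caches and crawlers may only ever see one version.
        ("Vary", "User-Agent"),
    ]
    start_response("200 OK", headers)
    return [body]
```

Responsive design sidesteps all of this, which is presumably part of why Google recommends it: there is only one version of the HTML to crawl.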

To help determine whether Google classes your website as mobile-friendly, you can check the following tools:

Q: What about keywords, what are the changes around keyword usage and optimization? How to look at keyword organic traffic in the semantic search era? How to optimize for semantic search?

Dan: You should be optimizing around topics and not just individual keywords, similar to how in AdWords you would build a list of keywords for a particular ad group. This set of keywords then allows you to create content around a topic that provides your user with more detailed information. These topics should then be used as part of your content strategy, which will identify which content should be used at each stage of the buying cycle.

Q: Backlinks – which are the best practices about link building and using anchor text in 2016?

Dan: You want to increase the number of backlinks your website gets? Build great content or digital assets that are worth linking to.

(For a more detailed discussion about links, listen to the webinar.)

Q: You mentioned schemas for search engines. Can you explain a little further?

Dan: Search engines read the content on your pages, but that doesn’t necessarily give them any context as to what it is. The Hummingbird update and schema markup have helped search engines get more clarity about what is being displayed. Schemas for search engines can be implemented in multiple different ways, with HTML5 and JSON-LD being just two that I mentioned, and they are snippets of code that surround specific parts of your website content.

A very simple example would be pricing. You’d wrap the price of your product in schema, which will be picked up by search engines and likely displayed within the search engine result pages.
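To make that pricing example concrete, here is a sketch of the JSON-LD a product page might emit. The product name, price and currency are made up for illustration; the structure follows the schema.org Product/Offer vocabulary:

```python
import json

# Hypothetical product data; in practice this would come from your CMS.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "GBP",
    },
}

# JSON-LD lives inside a <script type="application/ld+json"> tag in the page,
# so the price is machine-readable without touching the visible HTML.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(product, indent=2)
)
print(snippet)
```

The appeal of JSON-LD over inline markup is visible here: the structured data sits in one self-contained block rather than being woven through your templates.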

There are many resources on schema, but the two that I always point to are:

If you’re looking for information or testing tools on implementation, then I would recommend the following:

Q: Does Google use it as a ranking signal when visitors keep coming back to your website?

Dan: On an ongoing basis? I would say no. However, if a sudden surge of people were searching for a specific website using a certain query, then I’d expect (and have seen) a short-term increase in rankings for that website.

I have seen some experiments that Rand over at Moz has conducted, where he sent a lot of social traffic to search a certain phrase and then click on a specific website. This saw an increase in position for that term, although on a very temporary basis.

Q: About AMP pages: we have LTE as a standard offer, so we have fast connections here in Vienna, and connections are getting faster and faster, so why do we need AMP? As I understand it, AMP will look like a page from the ’90s.

Dan: Google will continue to provide resources to improve websites as part of their mission to improve the web. This doesn’t, however, mean that it is something you need to implement if you think your website speed is good enough.

Michael King has written a very good post on improving site speed by using one piece of code.

Q: Do you have any optimization tips for industries that are very competitive, but quite conservative and regulated when it comes to their content?

Dan: If you do decide to go down the route of building a content platform for your brand, understand that it takes time to gain traction. Put everything you have into it and keep going so that it is a success. It’s easy to get down about the traction you are gaining, but just keep going!

This Econsultancy post provides some information on ideas for the finance industry.

Q: One of my clients has a blog post that’s gotten more traffic than others, and I want to reuse it. Is it accurate that Google doesn’t like content that looks like a copy/repeat of something else? So if I make a few changes to update it, does that mean I should delete the previous (original) post so it doesn’t look like a copy?

Dan: Many different ways that this can be handled. I’d suggest that you check out the webinar for more in-depth responses but in short:

  • Republishing isn’t an issue if it’s valid and useful.
  • If you copy it to another URL then you’d want to implement canonical tags.
  • When creating content, think about whether it can be used again: will it be evergreen? Then you can really think about the URLs that you are using.
  • Check the webinar. 🙂
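For the canonical-tag point above, the idea is that the republished copy’s head declares which URL you want search engines to treat as the original. A small illustrative helper (the URLs are made up, and in practice your CMS would render this tag):

```python
from html import escape

def canonical_tag(preferred_url: str) -> str:
    """Build the <link rel="canonical"> element for a page's <head>."""
    return '<link rel="canonical" href="{}">'.format(escape(preferred_url, quote=True))

# The updated copy at a new URL declares the preferred version as canonical,
# so search engines consolidate ranking signals instead of seeing a duplicate.
tag = canonical_tag("https://example.com/blog/post/")
print(tag)
```

Note that the canonical is a hint, not a directive: search engines usually honour it, but they are not obliged to.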

I’d love to hear your thoughts on the presentation in the comments below or, as usual, over on Twitter @danielbianchini.

Categories
Blog Events

New search adventures for 2016 – The 2nd Optimise Oxford

Tonight was the second Optimise Oxford down at the St Aldates tavern with three excellent talks from Stephen Kenwright, Sean Butcher and Katie Bennett.

Unfortunately I was unable to make the event this evening but followed on Twitter, and was lucky enough to see the slides prior to the night.

Stephen Kenwright – Link Metrics that Matter

Stephen hasn’t put his slides up online yet, but wrote a post about Link Metrics that Matter over on the Branded3 website. Here he talks about the current use of Moz and Majestic metrics, and how TrustRank may be a better way of measuring links.

1. Trust is as important as PageRank
2. Multiple links from the same website shows trust
3. I want more links from people who already link.

Below is Stephen delivering his talk captured by White.net.

 

Sean Butcher – Google rich answers, why they can no longer be ignored

Sean’s presentation talks about Google’s Rich Answers and shows data from a study performed by Eric Enge of Stone Temple. He also created his own study of 200 How, What, Where and Why queries which is really interesting. Take a look at his slides below.

 

As with Stephen above, the team over at White.net managed to capture Sean in action and posted it on Twitter below.

 

Katie Bennett – Why your digital strategy needs user personas

The White.net team managed to catch Katie rocking the stage in the video below!

 

Following online was difficult, but it seemed to be another great event and one that will continue to flourish here in Oxford.

Some images from Twitter

There was plenty of conversation on Twitter, and I managed to borrow some images taken by @UELukeT, @OptimsieMeetup and @whitedotnet.

Hopefully I can make the next one, and see you all there too! To stay up to date on when the next event is, head over to Twitter and follow @optimisemeetup, sign up on the meetup page or sign up for my newsletter below!

Categories
Blog Presentations

Outreach Digital – The Changing World of SEO

Last night I was invited to present at the Outreach Digital event held at WeWork in Soho. During my talk I discussed the changing world of SEO, how recent algorithm changes have had a major effect on the landscape, and how we conduct our work. From there I provided 7 tips to help those looking to stay out of harm’s way and build long-term organic growth for their business.

It was a lot of fun, with a great audience. I hope to be able to speak there in the future. If you couldn’t make it, then please see the slides below.

If you liked the presentation and want to stay in touch, please sign-up for my newsletter!

If anyone has any more photos from the event I would love for you to send them to me so that I can feature them below. You can send them to me[@]danielbianchini[.]co[.]uk

Categories
Blog Tools

6 Steps to turn BuzzStream into a CRM System

Tools that have dual purposes are becoming extremely popular in businesses, especially those that are keeping an eye on costs.

As with most agencies, sales is a major part of growing your business and ensuring that company growth continues, meaning lots of money is spent on CRM software. Most will go with recognised software such as Salesforce, which costs thousands of pounds due to its reputation, but there are other options available, especially if you:

  • need to keep costs down,
  • are not a salesperson by trade,
  • are spending too much of your income on tools for delivery.

It is times like these where you need to be inventive with your toolset, and use tools for dual purposes.

In the rest of this post, I am going to talk you through using one of the best tools to generate natural coverage available to SEO and PR professionals for Business Development!

The tool that I am referring to is of course Buzzstream!

 

Before we get started, I am going to assume that you have a Buzzstream account and you know how to use it. I am also going to assume that you have connected your preferred email account to the tool so that emails are captured. If you have neither of the above, then please visit the resource section here.

Right, let’s get started!

 

Create your sales project

As with any Buzzstream outreach campaign, you first need to create a project specific to your needs. To do this, head up to the top left and select the dropdown. Within here you will see the New Project button, as shown below.

buzzstream-create-project

Once you have clicked the New Project button, you need to name your project. For this example, I have used Business Development. Now you need to assign the project to yourself and untick the send backlinks checkbox.

buzzstream-project

Now the project has been created, it’s time to focus on the pipeline.

 

Create your pipeline stages

Within the project you have created, head over to the settings and select Customise Fields. Under Custom fields for Link Partners, select Add new custom field.

buzzstream-custom-fields

This will allow you to start adding your pipeline stages; however, you first need to give the field a name and change the filter to Dropdown.

creare-pipeline

Once you have changed the filter, you will see a new set of fields appear. This is where you need to start entering your stages.

In my example I have used the following:

  • Prospect
  • Cold call – contacted
  • Warm lead
  • Warm lead – response
  • Brief received
  • Proposal sent
  • Won
  • Lost

 

These may not be exactly what you require, so you will need to change these to suit your requirements. Once you have finished adding your pipeline stages, you need to select the project that you want the fields to be enabled in. Since you are already within the correct project, the checkbox should already be selected.

Click save, and head back to your project.

 

Import your prospects

If you have not been using a CRM to date, the details of your targets are likely to be in a little bit of a mess. Luckily, Buzzstream offers you multiple ways to upload your information. Click on the Add link partners button and select one of the three options that is suitable for you.

buzzstream-add-contacts

Now that your prospects are within your project it is time to start adding them to the relevant stage of the pipeline.

 

Updating the pipeline status

The next stage could be time-consuming if you have not uploaded your contacts via XLS. However, it is a necessary step, and it will mean you know what type of contact is required for each prospect.

The quickest way of editing the pipeline stage is to select it within the main dashboard, as shown below. On selecting the prospect, you will need to choose the pipeline stage from the dropdown and then hit save. You will need to complete this process for all of your prospects.

buzzstream-change-pipeline-status

 

Confirm contact details

Buzzstream automatically finds all potential contact methods for each specific website/prospect that you have entered.

It is worth going through each prospect to review the contact details. This will mean editing email, phone and address details and accepting any social media accounts the tool finds, ensuring that you have as much information about your prospect as possible.

Be aware that the details found by Buzzstream may not necessarily be correct. Therefore, you need to cross-reference them against the prospecting details that you already have.

 

Making contact and keeping details up to date

As with any CRM system, you will need to ensure that any information taken is added to the contact. Buzzstream provides a notes feature which will allow you to add any information that you take down during meetings or phone calls with your contact.

 

buzzstream-notes

 

If you have email communication with your prospect and your email is connected to Buzzstream, all your correspondence will be imported to the contact information. This will allow you to see any previous conversations that you have had, whilst also being able to easily review previous agreements made between the two parties. If you are unable to connect your email (I do not seem to be able to), then include your Buzzbox email address in the Bcc field and all your communication will be added as if it were connected.

As you progress the prospect, you will need to change the pipeline stages through to conclusion by editing them as shown above.

 

Extra tips

The more you use this method for your business development, the more inventive you can be with the features that are already available. Below are just a few extras that you may want to use.

  • Assign tasks to others: If there is more than one person working on the lead, you can assign them tasks.
  • Using tags: Although you are keeping all your data online, you may have some offline outbound activity happening. Using tags, you can add them to contacts to easily show what marketing campaign they have been part of, to provide a more personalised approach.
  • Create template responses: To speed up the outreach process, you can create templates that provide the outline of the communication that you want to send. These templates can be edited on each send to ensure that the approach is personalised.

 

And there you have it, using Buzzstream for business development!

This is a perfect example of a tool that can be used for multiple tasks, and why during your toolkit review it is essential to understand what each tool can do.

Are you using Buzzstream for sales or for a task other than link building? Do you think Buzzstream could be the answer to your CRM issues, even as a temporary solution? Are you using another tool that is not a dedicated CRM system for business development? I would love to hear from you in the comments below or over on Twitter @danielbianchini.

 

Image credit: Sean MacEntee (Flickr)

Categories
Blog SEO

SEO Trends 2016: 44 Experts On The Future Of Organic Search Success

I was asked to contribute alongside 43 experts within the organic search field for a post originally published on Momentology.

 

The only constant in the world of SEO is change. In our epic post on SEO trends for 2015, our panel of experts overwhelmingly (and correctly) predicted mobile would be one of the biggest areas to watch. While mobile got the most attention last year, what should be the top focuses for brands and businesses in 2016?

If 2015 was the year of becoming mobile-friendly, then 2016 is shaping up to be the year of user experience.

In early 2015, Google introduced an assortment of changes and a new “mobile-friendly” algorithm designed to improve the mobile experience for consumers. Then in October, we learned about an artificial intelligence algorithm called RankBrain, which is Google’s latest attempt to improve search results for its users.

Just as Google has always been focused on providing its users with the best results, now it’s time to put your focus on what’s most important for your audience.

Understanding your audience and the entire consumer journey so you can be visible at the moments when it matters most is mission critical now.

Content strategy also remains a top focus, but with a greater eye toward intent, context, and usefulness.

You can’t ignore mobile SEO or apps in 2016, either. Consumers live in a mobile world and rely on devices of all sizes – so your mobile strategy must put users first.

Through it all, fundamental, technical SEO will remain critical. And technology and data will be crucial to help optimize and measure the success of your organic efforts in 2016 and beyond.

This year, Momentology has collected insights from 44 experienced SEO experts:

What’s the future of SEO? Here’s your forecast for 2016.

Barry Adams, Founder, Polemic Digital

The search space will continue to narrow in focus in 2016, as mobile-first browsing habits will siphon traffic from search engines towards mobile apps – specifically YouTube, Facebook, and news apps. I suspect 2016 might be the first year to see a stagnation, if not decline, in search volumes on some of the major search engines.

As a result of this narrowing search space, a brand’s total share of voice will become even more important. I expect a proliferation of brand-owned content channels – such as Momentology – in a wide range of industries, from DIY to retail, manufacturing, and medical technology.

Brands will create and promote self-owned publication channels to build their own audiences, rather than rely on third-party platforms to deliver visitors to their commercial sites. Some of these brand-owned channels will be indistinguishable from independent channels. A few independent online publications will be bought by brands who can’t be bothered building their own audience from scratch.

Altogether, 2016 will be the year where the fight for audience attention will reach a new peak, as organic search evolves into a zero-sum game and social media becomes exclusively pay-to-play for corporate accounts. The limitations of our industry will start to materialize as consumer behavior changes, and the fight for consumer attention will be fiercer than ever.

In order for SEO to survive and thrive in such an environment, SEO providers will need to focus more on highlighting their clients’ competitive edge and find increasingly provocative and attention-grabbing content angles.

The future of online success will not be dependent on organic search. Instead, I see an online brand’s growth relying on how successfully it can integrate with existing dominant platforms. News hosted on Facebook and Google, and ecommerce through Twitter and YouTube: those will be the trends that pave the way for online success in the coming years.

Adam Audette, Senior Vice President of SEO, Merkle

There are three key areas our teams are working in that I believe represent the future of organic search:

1. Mobile, Apps & New Technical Work

Apps, apps, and more apps. Deep linking, app indexing, and now app streaming are all key areas for SEOs in 2016. App store optimization, mobile article formats from Google, Apple and Facebook, and the increasing importance of app content in organic search all mean we will be focused on a mobile world more and more.

A second related piece is technical SEO work. Pagination, faceted navigation, and international SEO are just table stakes. The future is about http/2 as well as https, JavaScript, single page applications (SPAs), the DOM, and dynamic websites. Not to mention site latency, structured data, and even voice search.

2. Content Strategy & Content Marketing

Moving toward a deeper understanding of audience cohorts and personas, and how searcher behavior changes based on the segments. This directly informs site architecture, taxonomy, content plans and task completion metrics.

Here, too, we need to understand future technology such as voice search and how it changes behavior, resulting in new types of content experiences.

All of this must be informed by user testing.

Finally, an understanding of entity search and structured data and how the Knowledge Graph can best represent a brand’s identity and visibility in organic search.

3. Personalized Experiences & Marketing Addressability at Scale

We will be increasingly leveraging first-party data to improve content experiences, while making that content perform well in organic search. We’ll also be leveraging data to understand the cross-channel performance and strategies between display and organic search and paid and organic search.

The future is data and leveraging SEO as a critical piece of the attribution funnel, and its relationship to a holistic digital marketing strategy.

Loren Baker, Co-Founder & VP, Foundation Digital

All in all, integrated marketing that delivers targeted users to topically relevant content will be a key focus in 2016.

What do I mean by this? Make sure that your content funnel is populated with targeted traffic, not only from influencer outreach, but from smart ad buys, persona targeting, social sharing integration, and any other marketing disciplines.

Deliver the right person to the right type of content. Enrich their life/goals from an information and UX perspective enough for that traffic to become your brand advocate – through sharing and interacting with that content, making a purchase (triggering an event), or adding to the storyline in some way.

Daniel Bianchini, Director of Services, White.net

As we move into 2016, there will be two significant changes in the way SEOs approach campaigns, both of which have started to be discussed more openly.

  • There will be a continued but much more in-depth approach to understanding the audience. During 2015, we started to move toward the idea of creating content that is more specific, and focuses on the need of the user. However, during 2016 we will start to use the vast data that is now easily available to us to be more targeted. While doing this, we will look more into the experience the user has when they land on our content regardless of the device they are using.
  • There is going to be more of a focus on search via mobile devices. I don’t necessarily mean through search engines, but how applications like Google Now are learning about our needs. Thus, the need to increase the use of structured data will intensify.

Chris Boggs, Founder, Web Traffic Advisors

In 2016, SEO will continue to command a greater share of executive attention from SMBs to enterprise. Last year I predicted deconstruction would be an important part of the SEO world in 2015, and I certainly did my share of that with clients and networking friends. For 2016, I will borrow from Google’s John Mueller, who was quoted on Twitter as using one word to define what SEOs should be focused on: consistency.

Of course, many in the industry lambasted this typically coy recommendation – truly falling into step with some of the Google riddles the SEO industry was long blessed with by Matt Cutts.

What the hell does that even mean?

The great thing about using the word “consistency” to encapsulate SEO strategic thinking is that the word can be mapped to each of the typical work streams associated with organic optimization: technical attention to detail, establishing and maintaining relevance, and growing authority.

For technical SEO, consistency can be pretty simple for SEOs and marketers to envision and “strategize,” but it often translates into impossible demands for IT teams, depending on the site technology in place. One example is a blog on a different subdomain set up to support a separate CMS. This is something SEOs – at least partially because of an unspoken goal of consistency – will sometimes target as an opportunity, but end up compromising or abandoning the idea of moving the blog to a directory because it simply won’t work.

Content seems like the easiest consistency goal to tackle. Just publish on-topic content that is useful to your target audiences, right? I feel there is more to it. I believe consistency in this case has to do with your style/tone, such as how The Onion uses an irreverent 404 Error page, keeping in theme with its satirical self.

Also, how consistent is your content with that of the competition? I often recommend creating a content matrix to truly understand what pages your competitors are using, and where the gaps exist that may cause your site to look inconsistent when it comes to true relevance. This is also a rewording of a classic SEO strategy: creating pages to fill gaps found during keyword research.

The concept of link and authoritative citation consistency is easier than it can be made out to be. SEOs sometimes act like children in the way they think they can pull the wool over Google’s eyes when it comes to some link acquisition tactics. Penguin and manual penalties have reformed many, but not those oblivious or stuck about 5 years behind evolving “legitimate” SEO.

Co-citation and the understanding of relationships between sites and industries is probably Google’s greatest strength, and it should be played into. The old school link “building” roux of discovering who links to your competitors and trying to get those links still holds true.

In this last area, some are luckier than others because their “nature” is inconsistency when it comes to inbound links. Broader news and information publishers have an eclectic link pattern, both inbound and outbound. Leverage this understanding to be consistent in getting “off topic” links from places that link to a wide variety of subjects, but also maintain consistency with your peers by getting the relevant links that help to identify your topic.

Michael Bonfils, CEO & President, International Media Management Corp.

I’d like to say it’s going to be more of the same, but it won’t.

Google’s business model of a desktop-based search engine advertising platform faces some real challenges as different means of communication, and the devices that support them, continue to evolve. As a global SEO provider, when we explain to clients and their partner agencies that PPC, social, and display all need to talk and work with us, they scratch their heads, ask us why, and tell us to stay in our own corner of the room.

Surviving as an SEO in 2016 means expanding our knowledge and extending this keyword-based world of ours into social, display, and paid search in a mobile world. Our job is no longer just driving Google searchers to top rankings; it’s also driving them to the platform experience – be it mobile, social, or apps – and having these visitors hopefully use our keywords and clients to advocate for that experience.

Google’s algo has no other way to go but to evolve into AI that evaluates “chatter” rather than rules. In 2016, our job will be creating cross-channel keyword/client “chatter” more than just links.

Brent Csutoras, Founder & CEO, Pixel Road Designs

For too long, we have looked at our online marketing campaigns as checklists, where we focus more on getting all the checks rather than the quality of the items we are checking off. In 2016, we have to start focusing on the quality of everything we do, from the strategy, to the creation, to the implementation, the engagement, and ultimately measuring the return.

What is the point of spending a lot of time, energy, and resources on a campaign that research would have shown will not get the reaction you hoped for? Why create multiple infographics if they are average quality and won’t stand out from the thousands of others, and why run a marketing campaign in a way that doesn’t speak to the audience you’re targeting?

In 2016, the engines will continue to focus on user experience, quality, and personalizing results for each user.

So you have to research what your audience would like and what topics they actually become vocal and take action behind, create your marketing campaigns directly for the audience you are targeting, and make sure the quality of what you are producing and presenting is high enough that it elicits the action and response you need.

Dave Davies, CEO, Beanstalk Internet Marketing Inc.

I strongly believe that 2016 is going to be the year that any semblance of using purely old-school SEO tactics is culled, with the focus of conversations being around “web presence” and less around “where do I rank today.”

With the release of Google’s Search Quality Rating Guidelines there is a clear message: usability and user experience first with a focus on how clear and accessible the main content of a specific page is. What we’ll be further witness to through 2016 is the human ratings on actual experience (and not just content) being integrated into the algorithm through human algorithm adjustments and the further utilization of AI to understand what the user would be or is experiencing and adjusting page scores based on that.

Further, we’ll see marketers (those that aren’t culled) broadening their thoughts regarding Google organic presence beyond the ranking and sometimes even beyond the click. This will come in the form of increased attention being paid to featured snippets and mobile where the click is often replaced with instant access to knowledge. In this area I highly recommend following Eric Enge of Stone Temple Consulting who is performing some interesting tests.

Finally, in 2016 we’ll see an increased awareness by large brands on how to better utilize their data. We saw in 2015 Black Friday sales fall in the retail sector by more than 10 percent with online sales picking up the slack with a jump of 14.3 percent. Interestingly, email drove 25 percent more sales than in 2014, illustrating a far more effective use of user data.

Anecdotally, I noticed a significant step forward in how I was marketed to during this peak sales period with outstanding use of both email and remarketing. Big brands are stepping up their game. The question for 2016 is whether they will stay ahead or whether smaller companies will make use of the tools and principles to take some of those sales back.

Summing up, in 2016 businesses need to:

  • Focus on global presence and not just rankings.
  • Focus on making the main content of your page quickly accessible, engaging and useful.
  • Think about the various ways content is being displayed in search results and question how you can be present there.
  • Think about ways of using your current visitors and the data you hold to remarket to. Be smart and give them what they want.

Stoney deGeyter, CEO & Project Manager, Pole Position Marketing

I wouldn’t be surprised to see a greater influence of the user experience in the search results. Google has already announced that artificial intelligence is now the third most influential ranking factor. While this is described more as a query processing algorithm, it’s not a far leap to see “RankBrain“, as they are calling it, begin to take over some of the other ranking factors that Google deems important.

That’s not to say that other “old school” algorithm factors will no longer apply. What is AI other than an algorithm that learns to write itself? The Google search results, within just a few years, could primarily be fueled by this learning algorithm that takes all the old school signals into account.

Currently, engineers look at the data and tweak the algorithm accordingly. RankBrain could essentially cut out the engineers to allow the algorithm to adapt on the fly, even minute by minute based on the data being produced. With essentially 40-50 or more algorithms being used, RankBrain could replace them all.

So where does user experience come into play?

A learning algorithm can, theoretically, do a better job at analyzing searcher intent, and adjust according to mass behavior. Actually, with enough computing power, it could adjust according to individual behavior. Every click, every bounce, every time you scroll the search results, all of this data can be immediately translated.

Yes, RankBrain is still looking at the data that the engineers are looking at, but it can more easily interpret that data and rewrite, test and release the algorithm accordingly.

Google has already invested a great amount of resources in creating predictive assistants. It won’t be too long before each of us feeds Google enough data for RankBrain to use our specific behavior to deliver a completely custom set of search results crafted just for us.

Eric Enge, CEO, Stone Temple Consulting Corp.

There are two major trends to follow during the course of 2016. These are:

1. A New Era of ‘Content Effectiveness Optimization’

This is the notion of measuring overall user satisfaction with the pages of your site, and striving to increase that to higher levels.

In Google’s Search Quality Rating Guidelines, for the first time ever, Google introduced the concept of “Needs Met”. This is an evaluation of whether pages returned in the search results actually address the needs of the searchers. It doesn’t take much insight to realize that Google would not be collecting this data (at great expense) if they weren’t making active use of it.

Panda was only the first major algorithm that attempted to measure content quality. Google’s journey down this path is continuing, and I believe they have many ways they try to assess content quality today.

The bottom line? If you can tune your web pages so that more people are satisfied with the experience of your site, chances are good that this will lead to rankings increases for you over time.

2. The Rise of Machine Learning

Google recently announced an algorithm called Rankbrain. This is a machine learning algorithm that they said had become the third most important factor in rankings.

I’ve been able to have some conversations with a Google spokesperson, and what I learned is that contrary to what some others in the trade press have said, the Rankbrain algorithm is applicable to all search queries, not just the long tail or more unusual ones. Here is what the spokesperson said to me:

“These sorts of signals usually aren’t restricted to a specific portion of queries; it’s more that the effects are noticeable more for some queries than others.”

You can think of search engine algorithms as traditionally having three components:

  • Discoverability (can they find the content).
  • Relevance (what is the content about).
  • Importance (how does its value compare to other pages discovered on the same topic).

Rankbrain improves search in a fourth area, that of better understanding the intent of the user’s query.

The fact that Google believes that this is working really well for them is a big deal. They will continue to tune Rankbrain, as well as find other ways to make use of machine learning in their algos. You will also hear more and more about how other companies such as Facebook are using it.

We are entering the first stage of the machine learning revolution. It has a long way to go before it peaks, but you will be hearing about it constantly from here on out.

Erin Everhart, Lead Manager, Digital Marketing – SEO, Home Depot

We’ve seen some pretty big changes in 2015 with organic search – side note: I feel like we say that every year, so there’s even an underlying trend here: to just always expect things to change – that will carry into 2016, but the one that’s keeping me up at night is how drastically the SERP landscape has changed.

Most notably are two things:

  • Google moved from a 7 Map Pack to a 3 Map Pack in August, which limits placement opportunities, but the even more troubling sign is that the Google Snack Pack is appearing for more queries, on both desktop and mobile, than the 7-Pack was. Google isn’t serving local results for just city-specific or “near-me” queries; it’s appearing for key head terms even if there’s no local intent (e.g., “picture frames”).
  • Google is now showing three paid ads instead of two on mobile.

Both of those changes are pushing your traditional 10 “free” organic listings further down the page, and you better believe that’s going to have an impact on your CTR. I wouldn’t be surprised if you start seeing less traffic YOY even if you’ve maintained a No. 1 ranking.

Maybe Google is just testing things out? Or maybe they’ll move to a completely paid model for all Map listings? Who knows at this point, but I think we’re going to start seeing that a No. 1 ranking in 2016 isn’t nearly as valuable as a No. 1 ranking was in 2015. And frankly that’s a scary world to live in.

Duane Forrester, VP, Organic Search Optimization, Bruce Clay Inc.

In a single word, “usefulness.”

The engines have been focused on this for a while now, the concept of getting closer to the root of what a searcher means, intends or desires.

“Relevancy” was the watchword for the last few years. The problem is that relevancy is too narrow, too easily met today and doesn’t dive into the intent behind a query. If you can understand the intent, you’re orders of magnitude closer to solving for whatever the searcher actually needs in the big picture.

Businesses need to take the much-talked-about yet often overlooked step of truly integrating programs in 2016: cross-sharing data between search and social, both paid and organic. They need to develop programs for data capture, research, and insight derivation. That will allow the production of useful content and experiences, aligned with searcher needs, that the engines will eat up!

The engines want, first and foremost, to have highly satisfied customers. Your content can be their answer. And when the engines win, you can win too.

Glenn Gabe, President, G-Squared Interactive

For important trends in 2016 SEO-wise, I’ll focus on the topic from a Google algorithm update standpoint. In 2016, we expect the migration of major algorithms like Panda and Penguin to Google’s core ranking algorithm, which can have a significant impact on many sites across the web.

In the past, Google would unleash these algorithms in one fell swoop (and the impact could typically be seen on one specific day). With Google’s move to migrate these major algorithms to its core ranking algo, websites could theoretically see impact at any time. This is a huge shift for Google, and can have a significant impact on websites once the migration is completed.

Algorithms

Between Penguin and Panda, it looks like Penguin will be the first to go real-time. Note, this was supposed to happen in 2015, but Google recently announced that the launch has been delayed (due to the holidays). Needless to say, many webmasters and business owners impacted by previous Penguin updates are eagerly awaiting the real-time Penguin.

Google’s Penguin algorithm, which heavily targets unnatural links, has been devastating for certain websites. And it doesn’t help that Penguin has been a disaster of an algorithm recently. There hasn’t been an update in close to a year, the last one (Penguin 3.0) was underwhelming to say the least, and websites are still being filtered after performing a lot of remediation work.

We can only hope that the real-time Penguin has more of an impact than Penguin 3.0. Time will tell.

From a Panda standpoint, the update targeting low quality content has turned into a once-per-year update (when it used to roll out near-monthly). The last official update was Panda 4.2 released on July 18, 2015. And similar to Penguin 3.0, it was underwhelming.

Google had technical problems with the update, so there had to be an extended rollout (over months). Based on the slow rollout, it was difficult to see the impact (if any) across the web. Many sites that had been previously impacted by Panda updates, especially Panda 4.1 and the 10/24/14 update, saw little impact from Panda 4.2.

If Panda does go real-time, sites could technically see impact at any time with regard to low quality content. And “low quality content” can mean several things. But, and this is important, it will be near-impossible to know that Panda is impacting your site. I’ve written about this problem over the years (starting in 2013 when Google first hinted that Panda would go real-time at some point).

Why is this a problem? Well, when you know you have been impacted by Panda, you could analyze your site through the lens of the algorithm update. Then you can form a solid plan of attack for rectifying content quality problems. But if you don’t know Panda is impacting the site, you won’t know that content quality is a problem holding back your site.

So, you are left with decreasing Google organic traffic and little clue about what the problem is. Could it be technical, content-related, links-related, etc.?

Basically, both Panda and Penguin having the ability to impact your site at any time without any sign they are impacting the site will leave many questions about what to tackle SEO-wise. This will undoubtedly lead to a lot of confusion and frustration.

That said, the good news with baking major algorithms into Google’s core ranking algo is that websites can see impact more frequently based on the changes they are implementing. The downside is you won’t know which factors are causing drops or gains in Google organic traffic. I think we’re all eager to see how this plays out.

One thing is for sure, 2016 will be an interesting ride algo-wise.

Megan Geiss, Marketing Strategy Director, Merkle | RKG

As we look at 2016 to see what’s on tap for SEO, we must first realize those things from this past year that will continue: mobile expansion and personalization.

The push for mobile to continue to dominate is obvious, with the app space getting more and more attention. Mobile apps are taking over the SERP landscape and to be successful in mobile search, companies are going to need to have optimized mobile apps with deep linking.

The other area that will continue to be important is personalization. Although not as present in organic search today, it will likely take on a bigger role and become an important factor.

Google wants us to deliver the best content possible for the unique user’s queries and will reward those companies that do. Is this where RankBrain comes into play? Perhaps. There is a lot left to be seen and learned from Google’s machine-learning and I think we are going to see more of that for 2016.

Casie Gillette, Director of Online Marketing, KoMarketing

A couple trends you need to watch are featured snippets and brand equity.

At this point we all understand the goal of Google is to give people the answer to their query as fast as possible. But over the past couple of years, I think we’ve also come to understand a bit more about what that really means for websites.

Yes, Google wants to give people the answers but they want to give them the answers directly within the search results – not by sending them to your website. In turn, we are seeing a lot more featured snippets in SERPs.

We are also seeing brands become more powerful. Google wants to show their users trusted sources and that often means well-known brands.

There are a couple things that come from this:

  • Brands have to refocus on how they can reach those top of the funnel consumers in search results when Google is trying to directly provide them the information they need.
  • SEOs have to think beyond the website, beyond links, and really consider the customer. People are using other channels to discover products and companies. They are asking for recommendations on social channels and in forums. They are looking for reviews and they are gathering as much data as possible before they even start the conversation on your website. We have to make sure we (or our clients) are monitoring social media for the right keyword phrases, answering questions in forums or in discussion groups, and engaging our customers. We need our customers, potential customers, and the general public talking about us because that’s how we become a trusted brand.

Mike Grehan, CMO & Managing Director, Acronym Media

Content gap analysis will be important in 2016 – more precisely, developing content around intent to provide an experience and not just a result.

We’ve known for a long time in search that the notion of relevance has always been central to providing accurate results. But now, we need to think not just about “is the result relevant” but also “is the result useful in the moment?”

And that means putting a lot more thought into content, not just because you think it may rank, but because it genuinely does satisfy an information need, at that precise moment, for the end user.

  • Should the result for a specific moment be a web page?
  • Or would a video result or local map result be better?
  • Would a simple image be useful?
  • Should it be a short and concise result, or does it need a multi-page PDF document?

2016 will be much more about “content aware” strategies and “content mapping” solutions.

Jenny Halasz, President & Founder, JLH Marketing

In 2016, we will see a significant trend toward technical SEO. People will always have their fun proclaiming SEO dead, but as Google loses control of their algorithm and elements like RankBrain become more pervasive, the technical elements of SEO will become ever more important.

Sites that fail with complex technical implementations will suffer at the hands of the algorithm. Google will have a take-no-prisoners attitude when hreflang tags and schema are implemented incorrectly.

John Mueller was just recently quoted saying that if you fail to provide the handshake command for an hreflang tag, Google will just disregard it entirely. When Google does what they think is best, it usually works out OK, but there are enough cases of them getting it 100 percent wrong that brands and marketers will need to monitor this to make sure they’re getting it right.
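The “handshake” Mueller describes means an hreflang annotation only counts when both pages point at each other. As a rough illustration only – the URLs and helper function here are invented, not any official tool – a reciprocity check can be sketched like this:

```python
# Minimal sketch (hypothetical URLs and helper name): check that hreflang
# annotations are reciprocal. Google disregards an hreflang pair unless
# each page links back to the other -- the "handshake."

def hreflang_is_reciprocal(annotations, page, alternate):
    """annotations maps a page URL to its declared {lang: url} hreflang set."""
    points_out = alternate in annotations.get(page, {}).values()
    points_back = page in annotations.get(alternate, {}).values()
    return points_out and points_back

annotations = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/",
                                "en": "https://example.com/en/"},
    "https://example.com/fr/": {"fr": "https://example.com/fr/",
                                "en": "https://example.com/en/"},
}

# en <-> de is a complete handshake; /en/ never links back to /fr/,
# so the fr annotation would be ignored.
print(hreflang_is_reciprocal(annotations, "https://example.com/en/", "https://example.com/de/"))  # True
print(hreflang_is_reciprocal(annotations, "https://example.com/fr/", "https://example.com/en/"))  # False
```

Running a check like this across a crawl of your alternate URLs is one way for brands to monitor that they’re getting it right, rather than trusting that Google will do what’s best.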

As speed, mobile delivery, apps, and voice search functions become more important, we’re going to see an even further widening of the gap between rich and poor. The large brands that can afford the significant budgets associated with implementing these technical requirements will continue to dominate the SERPs, both locally and nationally. The small businesses, unable to keep pace, will fall by the wayside, and we’ll see a lot of great ideas go unnoticed.

Finally, 2016 will be the year that we see links begin to get degraded in the Google algorithm. The culture of fear and paranoia that surrounds links (both inbound and outbound) has infected the mainstream.

Where once the panic was confined only to SEOs and our ilk, it has now spread to major brands, major media, and primary sources that Google has traditionally used in the algorithm to determine value of a website. Fear and paranoia of a Google penalty without actual knowledge of how they work has broken links as a rating and ranking method.

It’s significant that the Penguin algorithm isn’t rolling out in 2015, and that it took more than 12 months to roll out last time. I believe the signals Google used in that algorithm aren’t valid anymore, and there is an increasing level of noise and confusion surrounding links.

Instead, I believe that Google will use a sort of “authority rank” to determine the value of a site as their primary factor rather than links. It may still use links in the computation, but I think it will encompass mentions, shares, buzz, and associate primary individuals with companies as subject matter experts. How exactly they’ll do this remains to be seen, but I think we can count on links not mattering as much in the near future.

Christopher Hart, Head of Client Development, U.S., Linkdex

We’re rapidly moving toward a digital world where multiple environments will simply become the environment in which users experience your stuff. So understanding your user will be critical.

Four top trends to watch now:

1. User Experience

Users expect your site to work, whether they access your site through a browser or an app. Google (and search engines in general) are working to find ways that allow them to make judgments that aren’t gameable. So, expect user engagement metrics to start causing websites or pages to rank more favorably.

If a site is more useful, it’s more valuable. Great content that is useful and engaging to an audience is valuable. Usefulness is the rewarding factor – without it, it doesn’t matter whether you have the greatest content in the world.

2. Mobilization & Apps Coming Together

Desktop. Mobile. App. It’s all merging.

Google and Apple have made huge strides in making apps more indexable. Google app usage can be a streaming experience from the cloud, not from your site.

Mobile environments will be the norm and users will continue to engage with content and mediums on the fly. Trying to reach your audience while they’re sitting still in one location in front of a device is a losing strategy.

3. Big Data Will Become Normal Data

More organizations are getting their arms around this and big data will do more to break down corporate silos.

One example you should take notice of: Publicis Groupe. The agency made a serious move, reorganizing itself to remove silos in a bid to become customer-centric and ensure all their clients are serviced with the same data and technology.

4. Schema & Markup

People who don’t take notice and mark up their sites properly will find less engagement happening. It’s an easy win to provide your information in a way that search engines understand how to present.

There are now hundreds of SERP variations and the organic click curve is changing fast. People are consuming more info at a glance. One of your jobs is to put the most “glanceable” information in front of people.

Bill Hartzer, Senior Strategist, Globe Runner

For 2016, content will remain king, so to speak, and publishing content on a regular basis will continue to be important. Since search engines are now much more aware of actual user engagement metrics (especially on the social side), creating content that really resonates with people will be key. Not only do you need to create good content, but that content must be something that users, real people, engage with.

Your social strategy must be tightly integrated and coordinated with your search strategy, because in 2016, it’s going to be even more connected. Your website just won’t rank well if it’s not liked by real people.

For 2016, it’s important to have a good editorial calendar that not only targets your main keywords, but those keywords need to be integrated into larger topics that you cover. Those topics should then be covered in depth, especially in a way that is useful for your ideal website visitors. Then, use social media to help promote your content in a way that not only drives them to your website but also encourages engagement.

Kristjan Hauksson, COO & Partner, SMFB Engine

Facebook has moved closer to the search experience I was hoping for and will go further toward that in 2016. This might create the Social Search Engine Optimizer, or the SSEO. This is not only because of Facebook but also Twitter and other more organic social media networks crawled by Google.

SEO gained a bit of momentum in 2015 and will keep on doing so in 2016, with still more emphasis on social media signals and the mobile experience. This was underlined mid-year 2015 when Google’s John Mueller stated that sites without a dedicated desktop version would not suffer any ranking penalties.

I am also seeing indications that video will start to play an even bigger role in 2016.

Fundamentally, 2015 versus 2016 SEO will not change. It’s still about good content and user experience. And remember that title tags still matter.

Christina Hecht, Senior SEO Strategist, Vertical Measures

Heading into 2016, one trend we’ve seen with big brands and smaller clients alike is the need to get more ROI from existing content, which involves circling back to older content, improving it, and republishing it.

We know that Google responds well to fresh content, so for years, brands and website owners have gotten the message that they need to consistently publish new content. But creating brand new content isn’t always possible and it’s certainly not the only strategy to pursue.

In fact, it’s a mistake to overlook the opportunity to leverage existing content that you’ve already invested in and has proven successful. Things change daily, so take another look at your existing content to ensure that your pre-2016 content still makes sense in the new year.

2016 will bring even more frequent change than ever before, with real-time updates to the Google Panda and Penguin algorithms happening now or in early 2016, respectively. Plus, a brand’s user personas and competition have likely changed over time, which warrants giving existing content another look and a freshening-up.

My best advice? Follow this five-step process to squeeze more juice from older content with minimal effort:

  1. Prioritize content that once performed well but has recently become less effective.
  2. Check to see if the content still aligns with the types of searches your current user persona(s) might make and their expectations of your page. If it doesn’t, tweak it to meet your searcher’s intent.
  3. Improve the content with semantic keywords (synonyms, close-variants and tangentially-related phrases), Hummingbird-style queries and easy-to-consume formatting (bullets, tables, images).
  4. Republish the content and amplify it as you would a new piece of content: Share it, tweet it, promote it, post it, reach out to influencers, and earn new links.
  5. Measure the results and learn from them; KPIs include rankings, traffic, SERP CTR, engagement, bounce, conversions, etc.

Jim Hedger, Creative Partner, Digital Always Media Inc.

Search doesn’t change with the precision or planning we might expect from the smartest technologists in the world. It evolves with its environment, albeit often in staggering leaps.

The search engines and social media tools tend to go where the users are, or where they expect their users to be in the future. Google’s mission is still based on making the world’s information free and delivering it by producing the best search result sets based on query and user behavior.

The key phrase for SEOs: “user behavior.” Everything is about adapting to or promoting behaviors. (A conversion is a favorable behavior.) After all, it is no good to make changes or promote evolution if the users are not going to appreciate them or, worse yet, if the users are migrating to another ecosystem.

Mobile Everything

In a data-driven world, everything is mobile. I can’t do the simplest things such as raking leaves in the backyard, walking to the corner store, or even doing laundry in the basement without first being sure whichever mobile device I’m using is in my pocket.

We consult mobile devices to confirm long known surface routes, subway, and bus schedules. We use them when shopping to see if something less expensive can be found elsewhere. While watching a hockey game, my girlfriend and I will often be on one of four mobile devices strewn across the living room table. I have one cradled in my ear as I type this sentence.

Google has been pushing mobile-friendly design for the last few years and that will continue in 2016 with support for Accelerated Mobile Pages. AMP strips unessential JavaScript from source code to build a faster-loading page that is less reliant on multiple third-party servers.

Google’s support is another in a series of signals suggesting Google expects more users to generate more queries using mobile devices. (Mobile queries passed the 50 percent mark sometime in the summer of 2015.)

User Experience

As the Internet Advertising Bureau (IAB) belatedly discovered, creating a positive user experience is critical to actually benefiting from user acceptance. In a rush to sell ad space on any possible high-traffic online property, publishers and marketers made some major mistakes which Google moved to correct.

Interstitial ads can make the mobile experience exasperating, especially if they appear every time the same URL is loaded. Worse than that, auto-run video ads are annoying, deplete batteries rapidly, and eat a lot of bandwidth.

Google has been clear about its dislike of degraded user experiences. SEOs should clearly avoid site elements that degrade or defeat a mobile session. Expect Google to demand higher quality user experience, especially in the mobile environment.

Quality Content

Good content isn’t going to be good enough if it is riddled with inaccuracies. Google is likely to begin some form of algorithmic fact checking, especially around websites offering critical health, financial, political, real estate, news and business information, or products.

Google already performs fact-checking operations on local business listings, so “begin” might be the wrong word; “extend” might be closer. In the near future, Google will be capable of looking inside itself to verify the accuracy of information found on any given page.

Deep Thoughts

I want to write about the impact of Google and IBM open sourcing their first generations of artificial intelligence but that would require far too many words. Suffice it to say the advent of what are essentially super-computers being opened to the general public will have profound effects on how marketers learn to understand the oceans of consumer data we have at our disposal.

Expect new analytic metrics to emerge as user behavior and intent are better interpreted by an already highly analytic advertising sector. Analytic marketers have found a myriad of ways to predict a user’s life experience.

Remember the Target direct mail campaign that accidentally announced a teen daughter’s pregnancy to her family? That trick was accomplished because one of Target’s marketers got the idea to personally examine the shopping habits of women who had signed up for Target’s baby registry. Once that marketer compared the items regularly purchased by several self-reported pregnant women he was able to extrapolate that most women who purchased X, Y, and, Z were very likely pregnant themselves.

Imagine what we’ll be able to learn about consumers with the algorithmic aid of rudimentary A.I. in the coming months and years.

Penguinitis

The slow march of the Penguins might stop in 2016. I’m in the camp that thinks Google finally broke itself sometime back in 2013 or 2014 and has been desperately trying to patch massive holes rather than rebuilding itself.

For what it’s worth, this situation was predicted way back in 2003 when Google links became commodities that were being bought and sold based on what Google showed as PageRank. The only surprise is how long it’s taken to scale into the algorithmic disaster it has become.

I don’t expect the Penguin link evaluation ever-flux project to ever be completed, even though, like every other SEO, I’ve been pushing my clients to pay strict attention to their link footprints, how links are phrased on their pages, and to whom they choose to link.

Yahaol!

A quick and dirty prediction to finish on… AOL will buy the assets of Yahoo for $4.2 billion and the promise of free email accounts for all Yahoo employees for the rest of their natural born lives. If those employees’ brain patterning is ever transferred to any A.I. other than the one Marissa Mayer keeps tucked away in her secondary shoe closet, those email accounts will be closed faster than the last Yahoo board meeting will be.

For the record, predicting the evolution of environments is as difficult as predicting the weather. Many if not all of these ideas might be wrong. It’s not like Google or Bing have followed the most logical paths to get to this point.

Jon Henshaw, Co-Founder & President, Raven Internet Marketing Tools

Two trends to watch:

Device Responsiveness and Speed

As most search marketers already know, mobile searches have surpassed desktop searches. That means if you don’t have a mobile-friendly site, you could be missing out on a lot of traffic right now.

In 2016, I think Google will step up their efforts to display results that are more fully optimized for the device that’s accessing their search engine. In particular, they will focus on speed and start to give special attention to tablets.

SEOs will need to do whatever they can to speed up and improve the UX of their sites. Implementing SRCSET for their images and improving the UX for tablets is a good place to start.
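As a hedged sketch of what a srcset implementation can look like (the file names and breakpoints here are invented), the browser is given several image candidates and picks the smallest one that satisfies the viewport and pixel density:

```html
<!-- Hypothetical image paths; the browser selects the best candidate
     based on the sizes hint and the device's pixel density. -->
<img src="/img/hero-800.jpg"
     srcset="/img/hero-400.jpg 400w,
             /img/hero-800.jpg 800w,
             /img/hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Product hero image">
```

The `src` attribute remains as a fallback for browsers that don't support srcset, so nothing breaks for older devices.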

Bifurcation of Sites and Apps in Search

It’s becoming clear that both websites and apps are here to stay. Google has figured out that people may want either one (or both) based on their interests, needs and usage patterns. If businesses have an app or are planning to create one, they should take full advantage of Google’s App Indexing.

Bill Hunt, President, Back Azimuth Consulting

2016 will be the year of searcher interest alignment.

Successful search marketers will need to create new keyword-phrase-to-content maps that focus on the “why behind the query” and then ensure the paired content satisfies the needs of searchers. You’ll need to go beyond quality content and think about the best format that amplifies the alignment.

Device and location-centric content will be more important. It will be essential for search marketers to create a holistic approach considering the different situations for the same set of keywords.

Brands will need to broaden the lens and focus on overall findability, especially outside traditional search engines, by integrating video, social networks, and specifically apps.

Mark Jackson, President & CEO of Vizion Interactive

I think more and more people are buying into the concept of SEO (budgeting for “SEO”), but they’re calling it “content marketing”. Because of this, the root of our practice is going underserved/unnoticed and I think it’s time to revisit the technical and architectural foundation.

SEO has evolved a bunch over the years, but the core of our practice should always be ensuring that a website is properly architected and can be correctly crawled, indexed, and cached. Too often, I am seeing people make a mess out of their robots.txt, on-page meta and/or sitemaps.
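A robots.txt mess is also one of the cheapest things to sanity-check. As a minimal sketch (the rules and URLs here are made up), Python's standard library can confirm which paths a bot is allowed to fetch before the file ever ships:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for an ecommerce site.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Confirm money pages stay crawlable and private paths stay blocked.
for path in ("/products/widget", "/checkout/cart", "/search?q=widget"):
    url = "https://example.com" + path
    status = "crawlable" if rp.can_fetch("Googlebot", url) else "blocked"
    print(path, "->", status)
```

Running a list of your most important URLs through a check like this after every robots.txt change catches the classic "accidentally disallowed the whole product catalog" mistake early.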

For example, people hastily move to https but forget that they need to follow certain steps to ensure a smooth transition. Or ecommerce websites fail to follow proper procedures with pagination and canonicalization. It is this foundation that allows the rest of it to be successful. Without this foundation, you’re building a stilt house made of toothpicks.
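On the pagination/canonicalization point, a hedged sketch of what page 2 of a paginated category listing might carry in its head (URLs are placeholders; rel="next"/"prev" annotations were Google's recommended approach at the time):

```html
<!-- Page 2 of a hypothetical category listing: the canonical is
     self-referential, not pointed back at page 1. -->
<link rel="canonical" href="https://www.example.com/shoes/?page=2">
<link rel="prev" href="https://www.example.com/shoes/">
<link rel="next" href="https://www.example.com/shoes/?page=3">
```

The common mistake is canonicalizing every paginated page to page 1, which tells the engines to ignore the products listed deeper in the series.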

Fundamental things, from my personal experience this past year, seem to be taking a back seat to “doing content”. Maybe they’ll dress up the content by paying for some social sharing of the content, or even do some actual outreach to promote the content to the right people.

Don’t get me wrong: it’s all important.

But, if I were to predict “the big thing” for 2016, it would be getting clients to remember that there is still a technical side to SEO and it’s a vital one.

Some great tools that I’ve been using more recently include Deep Crawl and Visual SEO. They give you a real sense of how a site is crawled and how efficient that crawl is, and let you verify indexation with a view of Google’s cache and Google’s PageSpeed tests. These are fundamental elements that all SEOs should pay close attention to, especially for large ecommerce websites.

Ammon Johns, Internet Marketing Consultant

2016 marks a necessary shift towards playing hardball. There are two ways this will happen:

Reevaluate Your Position on Apps

Apps are a great way to gain additional data, since customers using the app are logged in, or otherwise identifiable by default if your app is built correctly. That means that you want to be thinking of an app for your business as the new form of store card or loyalty card.

You should be incentivizing use of the app over use of the website with discounts or reward points so that you collect more of that valuable data, and let less of it be given away to third party services and trackers.

Once users have your app and know it gives them points or discounts, they are more likely to use the app rather than search, which would make you fight competitors for position all over again every time.

Realize Google is Your Rival

If you haven’t already done this, then 2016 must be the year that you realize that Google is not your friend, nor even your ally. Yes, there are times that Google will do you a good turn, but only when it suits their objectives.

Ideally as a business, once you have served a customer, you want them to come directly to you the next time. Google is absolutely your rival in this.

Google has its own business and brand to promote, way above yours. Google would happily be telling people in your store, via apps or Google Glass, “Hey, that item you are looking at is cheaper in another store.”

You need to remember that Google is a rival, and seek to cut them out of the deal in future wherever you can.

Dixon Jones, Marketing Director, Majestic

Last year I predicted a move towards integrated online, offline, and multi-touch analytics. It’s hardly a trend, but I wasn’t the only one to suggest it (Ammon Johns and Erin Everhart to name just two). It is still coming – in many ways it’s already here, with the inevitable backlash of Apple volunteering ad blocking software.

But now SEOs have bigger problems. Machine learning and Google’s Knowledge Graph development will make rank checking much less reliable and will reduce the correlation between visibility in search results and traffic to and through your website.

Truthfully, websites are becoming a smaller part of the digital marketing mix. It isn’t that visitors to the website are declining, but savvy digital marketers are trying to win hearts and minds of users before they reach the website and, in many cases, the user never even needs to go to the website.

This simply makes the need for multichannel, multi-touch attribution more acute than ever if we are to understand our users’ journey and mindset.

In terms of pure search, this will mean more emphasis on optimizing for Facebook, for Amazon, or eBay, depending on your business model. Oh, and Apple Search? Let’s see…

Ryan Jones, Manager Search Strategy & Analytics, SapientNitro

One trend will be moving from big data back to small data. As SEO matures into “real marketing,” brands will focus less on vanity metrics and more on actionable SEO data that helps drive conversions.

SEO will continue to be less about the algorithm and more about understanding what the user is trying to accomplish. The concept of accomplishing tasks will be big.

If your site is based on showing facts or public domain information, it will continue to lose traffic. How people search is changing, it’s no longer about words on a page – it’s more about actions or “verbs.”

Sites that understand that user intent and help them “do something” will be clear winners while sites that are merely information with ads will continue to lose out.

Julie Joyce, Owner, LinkFish Media

The biggest trend we’ll see in 2016, especially for big brands, is that they’ll start being everywhere. They won’t just publish content on their own sites, for example. They’ll create amazing content that is hosted only on an app or as a guest piece on another site.

As people start to consume content from sources other than just the SERPs, it’s going to be critical to be able to bring users to you through various platforms, like Facebook and Instagram.

Big brands need as many avenues of traffic as possible. They need to interact with consumers all over the place and pay attention to where their audience goes.

If you’re pushing recipes, for example, you’re probably going to have greater success with Pinterest, Instagram, or Facebook than you would if you published them on LinkedIn. It might seem obvious to say that, but you’d be surprised at how many people aren’t really thinking about that.

Krista LaRiviere, Co-Founder & CEO, gShift

SEO now stands for Strategies for Earned and Owned with a focus on long-term discoverability and smarter content marketing. With this in mind, marketers in 2016 will need to continue with all of the traditional SEO best practices, plus be aware of two trends impacting the execution, measurement and ultimate success of a web presence or individual content campaigns: influencer marketing and off-site analytics.

On the execution side, an optimized content marketing strategy designed to beat the competition will focus on content distribution, amplification, and audience development through influencer marketing. From finding the best influencers who share a target audience, to managing content creation and amplification, to tracking which influencer in which channel is the most impactful to your brand – digital influencer marketing is key.

On the measurement side, it’s off-site analytics. With 67 percent of a prospect’s journey occurring off-site, brands require, but are lacking, insight into engagement and interaction of their external content assets.

A recent Google patent filing further confirms the importance of off-site content for the relevancy and trust of a web presence, as well as the behavioral trend prospects are demonstrating by continuing to inform themselves in social and other off-site presence points. This trend makes it difficult for standard analytics packages to represent the full dataset.

Influencer marketing (a significant driver of off-site content) requires influencer analytics. Platforms and technologies are starting to address this gap.

Martin Macdonald, Head of SEO, Orbitz Worldwide

Since 2007, Google has declared every year the “year of mobile”.

In April 2015 we also went through the “mobilegeddon” update, which proved to be anything but the massive shakeup Google told us it would be.

Despite all this, we’re for the first time facing a new reality that must be dealt with, alongside desktop search. Almost every traffic report we’ve seen towards the end of the year, particularly around Black Friday/Cyber Monday has shown mobile and desktop traffic to be at near parity.

This alone should be enough to re-prioritize mobile SEO alongside desktop, but it doesn’t end there.

Big brands typically have native Android and iOS apps, and the recent moves by Google to live-stream uninstalled apps directly to consumers’ phones, in a pseudo-VPN environment, alongside full app indexing, open up a new frontier.

Typical websites have always struggled to match conversion rates and underlying usability on small-screened devices. These restrictions have never presented as big a problem in native apps, where we can tailor the user experience with much greater finesse.

Therefore, for any brand that relies on a certain amount of organic search and (like most) has conversion issues with that traffic, getting your app streamed directly could be a massive game changer, and will re-focus SEOs beyond the web and into native apps in a way that hasn’t happened to date.

Roger Montti, Owner, martinibuster.com

The obvious answer is mobile. But it’s not mobile. Mobile is part of a larger and more important trend.

The important trend for 2016 is focusing on user experience. User experience is increasingly determining whether a site is going to be ranked for mobile search. Focusing on user experience can lead to remarkable improvements in ranking, conversion rates, social shares, traffic, and sales.

The phrase “user experience” is referenced 23 times in Google’s Quality Rating Guidelines. These guidelines are used by human quality raters to create a reference set of quality judgments that can be used to train algorithms to scale the job of rating the quality of a website, like the Panda algorithm.

Human quality raters are also used to evaluate the success of algorithms. At the heart of these classifiers, which are used to classify the success of the algorithms, is the phrase “user experience.”

User experience is one of the key quality signals used by the human quality raters and by the algorithms. That’s what the page layout algorithm was all about. It’s at the heart of Panda.

To underline the algorithm’s focus on user experience, the new mobile section of Google’s quality rater guide is called, “Understanding Mobile User Needs.”

It’s not about fast downloads, asynchronous scripts, or removing render-blocking scripts. Those play a role, but only as one part of a larger whole.

When you align your web and mobile strategy to user experience models you will be a good deal ahead toward scoring better on the quality signals that algorithms today are looking for. User experience is deeply embedded in the algorithms of today and no doubt more so in 2016.

Lee Odden, CEO, TopRank Online Marketing

Marketers can hypothesize year after year about what companies should do in order to win at SEO, but the obfuscation of actual cause and effect by Google of any singularly useful tactic is hardly fuel for a reliable prediction. From RankBrain to Panda penalizations, SEO can seem a lot more like a game of algorithm whack-a-mole than marketing that generates revenue.

So, what should companies do to win at SEO in 2016? They should start by focusing on the thing that tactic-agnostic marketers have always relied upon: customer centricity.

Think about it: If Google’s pursuit of delivering the best answers in search results is primarily based on consumer behavior and preferences, then why shouldn’t your marketing and SEO?

Of course there’s tremendous importance on the technical side of SEO, topic demand and signals that point to content. Before all of that, winning with SEO requires customer insight and planning. It means understanding customer behaviors in terms of:

  • Information discovery (desktop, tablet, mobile, search, social, influencers, subscriptions)
  • Content topic and media type preferences (yes – keywords, text, images, video, long or short form, facts, stories)
  • Triggers that inspire taking action (trial, demo, subscribe, share, inquire, test, consultation, purchase, refer, advocate)

Know your customers and optimize accordingly. Google is after the same customers with their ads, so make sure your content is optimized for click as much as it is for placement.

Does it still make sense to use search data to inspire content, site architecture, promotion and link attraction? Of course it does. What about technical SEO to solve site performance, duplicate content and optimal crawling? Yes again.

But at the same time, focus on what makes customers tick. Use SEO to make it easy for customers to do what they want to do and you’ll find yourself less distracted by shiny SEO tactics of the past and focused instead on customer acquisition that drives marketing performance in the future.

Chuck Price, Founder, Measurable SEO

Penguin 4.0 is coming in 2016 and it’s going to rock the world of organic search. More than a year has passed since the last Penguin refresh. Webmaster trends analyst Gary Illyes was quoted as saying “The new Penguin update will make webmaster’s life easier a bit and for most people it will make it a delight.”

In November, it was discovered the algorithm was still unfinished. A few weeks later, on December 3, it was reported “With the holidays upon us, it looks like the penguins won’t march until next year.” The new algorithm could make life easier for “most” people, but that still leaves another 49.9 percent at risk.

Google now has reams of user-generated data concerning spammy links. You and I refer to this data as a disavow file. There’s no question that good links get disavowed every day, but the sheer volume of spam link data that is shared with Google would place these links as outliers. Factor in machine learning, like RankBrain, and there’s a whole lot of links that could potentially be rendered impotent or toxic overnight.
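For readers who haven't handled one, the disavow file Google accepts is plain text with one directive per line: lines starting with "#" are comments, a "domain:" prefix disavows every link from that domain, and a bare URL disavows a single page. An illustrative example (all names invented):

```
# Spammy directory links; outreach for removal failed 2015-11-02
domain:spam-directory-example.com

# One bad URL rather than the whole domain
http://blog-network-example.net/post-123
```

It is exactly this kind of crowd-labeled spam data, aggregated across thousands of submitted files, that could feed the machine learning described above.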

Google has been discounting the value of spammy links for years, and upped the ante with Penguin making classes of links “toxic.” I suspect the new algorithm will take the evolution of link value to the next level – placing a premium on semantically related links. Relevance will replace PageRank. I believe this will become particularly evident in 2016.

The biggest Penguin 4.0 winners will be major brands that attract scores of editorial links naturally. Smaller companies that use content marketing to attract related, editorial links will also be “delighted.”

The biggest Penguin 4.0 losers will be brands that continue to abuse guest posting as a way to manipulate rankings, along with companies that market themselves as SEOs but sell nothing but spam, and black hat SEOs (and their unwitting clients) that rely on private blog networks.

Matt Roberts, Chief Strategy Officer, Linkdex

The recipe for SEO 101 remains simple and unchanged. Create content that matches keyword intent. Put the content inside relevant information architecture, complemented with smart technical SEO, then know why people would link and find ways of making it happen.

Given the simplicity of this, why aren’t more websites more optimized? For example, why don’t more websites have an optimal amount of content?

When you explore this you discover that there are lots of reasons. Most of them have less to do with know-how and more to do with politics, resources, and effort.

I see technology helping with this a lot going forward. I predict that 2016 is going to be characterized by data, machine learning, and software coming together to take more of the strain out of “best practice” SEO, allowing SEOs to spend more of their time on creative and team-oriented marketing tasks.

Dave Rohrer, Founder at NorthSide Metrics

Like everyone that makes “predictions” on what the trends will be, I had to go and read what I said last year just to see how far wrong or on the money I was. Verdict: not too shabby.

I spent the first four months of 2015 at an agency where I worked with very large brands, and while some were looking to the future, I can’t say they all were. Since April, I have been running my own small one-man shop. In working with companies of different sizes, I noticed some similarities in how they approach content marketing.

The thing is, no matter the size of the company or the quality of the content produced, you still need to do one thing: market your content marketing content. I have started to say this more and more, and hopefully people will do it more in 2016, but we shall see.

So to all those creating great content, do remember that this is not your “Field of Dreams”. If you build it, the traffic, links, and sales won’t just come without you doing some work to promote it. You need to market your content marketing content!

Kristine Schachinger, CEO & Founder, The Vetters Agency

It seems like we say the same things most years, but sadly sites are becoming increasingly poor at delivering the results they could be providing their site owners.

While spam was once Google’s largest focus, user experience has become almost an equal cross for sites to bear. As time goes on, Google only continues to increase that focus.

Yet even with the new algorithms that strictly address usability, sites are getting less usable. We commonly see sites becoming more a platform for advertising revenue than for user engagement, and we regularly see page sizes ranging from 5MB to 20MB, with technical SEO all but overlooked.

Google is clear on all these items. They want you to provide a better user experience. Sites that do this well will be rewarded. Sites that don’t risk being devalued.

So if you want to get ahead in 2016? Make sure your ads don’t violate Google’s page layout specifications, create pages that download at under a MB or two, and get a technical site audit to see what else is holding your site back.

Next, make sure you add schema to your site. With the continued expansion of the Knowledge Graph, Google is going to be putting more emphasis on your schema mark-up, and if your competitor is doing it better than you, they will win the organic race. This is no longer a “what if”; it is a site must.

Finally, be careful of overlooking your organic search traffic in favor of social media referrers such as Facebook. Facebook and other social media traffic come from closed ecosystems that want your money. Although they hold a large place in the expanding realm of organic visibility, that traffic can also all but disappear at a moment’s notice with no chance at recovery, leaving you with less revenue and in need of a much larger advertising budget.

So don’t overlook your organic search traffic. It may seem that, with the addition of Google AdWords spaces and the increasing breadth of the Knowledge Graph, Google organic visibility has decreased to the point of being futile, but this is just not the case. Google organic is still a wide-open playing field across many long- and mid-tail terms and differing verticals.

Don’t forsake your organic visitors, who stay longer and visit more pages, for the short-term satisfaction of a site that relies on a predominance of say, Facebook visitors. You need both, but you need organic search as much if not more.

So many sites fail in these three basic areas. If you start 2016 early with improved user experience, schema implementation, and better organic optimization, you will have a very happy year end. Maybe even a bonus or two.

Additional note: Google Penguin is coming out in early 2016. If you see a large drop in traffic after it is released, get to an experienced site recovery auditor quickly. Do not wait, do not pass go; the longer you wait the worse it will be.

Grant Simmons, VP of Search Marketing, Homes.com

I’m a massive believer in the “next screen,” whether that’s a kiosk, a vehicle, or wearable tech that empowers users on the go with relevant and valuable information. SEO marketers need to treat the user experience on these screens as seriously as they treated mobile after the “mobile-friendly” update earlier this year.

“Mobile” is no longer just a smartphone or tablet; it’s a component of user context, like location, prior behavior, device, current behavior, interaction method, and/or a host of other elements the search engine will use to improve its search results. The engines leverage the feedback loop of user SERP interaction to improve the algorithm, utilizing the machine learning of RankBrain to help decipher user intent from user input and context.

What does this mean to search marketers? Content without expertise, intent consideration, and understanding of context will find it increasingly difficult to show up in results, because more focused content will better satisfy the questions users are asking. Keyword research in the traditional sense will be useless, usurped by crowdsourced “real-people” research and site usage data.

Brand – i.e., an entity (person, organization, or company) that is a focused subject matter or topic expert (defined by its social, link, and online/offline graph) – will be everything and will further affect ranking algorithms and click-through rates.

So SEO practitioners will partner with or take on digital PR, branding and/or marketing agencies to build a topic-focused digital footprint. We’ll be the conductor of the marketing orchestra, the captain of the digital ship.


Bill Slawski, Director of Search Marketing, Go Fish Digital

One of the biggest trends for 2016 will involve the search engines relying more upon structured data found on websites. Many sites have adopted such markup, but many others haven’t yet, and it could be a competitive advantage to those that have figured out how to do it well. It could make a difference in how sites are displayed in search results on both mobile devices and desktop computers.

Using the right markup can:

  • Give the search engines more precise information about the goods and services that you offer.
  • Help make ratings and reviews stand out.
  • Draw attention to events.
  • Make contact information for your business show up in search results.
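The markup described above is typically added as schema.org JSON-LD in the page source. A minimal sketch for a local business with an aggregate rating follows; the business name, address, and rating values are all placeholders for illustration:

```html
<!-- Hypothetical example: schema.org JSON-LD for a local business.
     Name, address, phone, and rating values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "postalCode": "12345"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```

Markup like this is what allows ratings, contact details, and similar attributes to surface directly in result snippets and knowledge panels.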

Appearing strongly in knowledge panels and in search results can determine whether someone visits your site, is impacted by your brand, and chooses to do business with you.

Nichola Stott, Owner, theMediaFlow SEO Agency

One of the areas growing most in strategic importance in 2016 is UX as it dovetails into SEO, particularly in terms of performance optimization. Factors like site speed, UX by device, and content delivery are becoming more important to search engines because of technological advances.

Plus, users are becoming more sophisticated and more discerning about the sites we frequent and the experience we expect to be delivered across platforms. For these reasons, tasks traditionally seen as UX developer or performance auditing work are increasingly coming under the remit of the technical SEO audit process.

Secondly (and somewhat hopefully), we suspect 2016 may be the year that businesses spend more strategically on content activities. While there’s always a place for interesting blog content produced by topic and product experts, particularly in interesting niches, there’s also a lot of dross that has been churned out by “volume” agencies in recent years.

We prefer to work with larger budgets on higher quality, smaller volume content pieces to generate more impactful results. Finding great ideas that match the brand values and USP can be so much more engaging than churning out content for content’s sake.

Kaspar Szymanski, SEO Consultant, SearchBrothers

Right now an online business’s success in organic Google search is hugely determined by page speed. While this is not news, page speed’s weight as a ranking factor seems to have grown decisively as backlinks have lost importance in recent years. This trend will grow and magnify in 2016.

While minification and gzipping tend to be widely accepted and mostly applied already, developers will now focus on squeezing out the last few percent in order to make sites load uber-fast. HTTP/2, resource preloading, and preconnecting are some of the methods that will be embraced by the SEO industry and e-commerce community in 2016.
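Preloading and preconnecting of the kind mentioned above are declared with `<link>` resource hints in the page head. A minimal sketch follows; the hostnames and file paths are placeholders:

```html
<!-- Hypothetical example: resource hints in the document head.
     The CDN hostname and asset paths are placeholders. -->
<head>
  <!-- Open the connection to a third-party origin early,
       before any resource from it is actually requested -->
  <link rel="preconnect" href="https://cdn.example.com">
  <!-- Fetch critical resources early, before the parser discovers them -->
  <link rel="preload" href="/fonts/brand.woff2" as="font" type="font/woff2" crossorigin>
  <link rel="preload" href="/css/critical.css" as="style">
</head>
```

The `as` attribute tells the browser what kind of resource is coming so it can apply the right priority and caching behavior.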

Having said that, SEO remains complex and there are no silver bullets. Speed will be a decisive winning factor in organic search for overall well optimized sites. That inevitably includes great on-page optimization.

A lot of businesses will direct their resources in 2016 to auditing, in order to get their on-page SEO in order and gain a head start in the race.


Jose Truchado, CEO, Loud Voice Digital

2015 has been one of the fastest-changing years in the SEO sector. Although we try to predict search trends on a yearly basis, the search behaviors that dictate how search engines evolve may prove that yearly strategies are no longer viable. Instead, every business needs to be able to adapt rapidly, or even better, to predict how people will be using search engines in the near future.

2015 was the year where mobile claimed the crown of searches, surpassing those made on desktops. 2016 will be the year where search assistants such as Google Now, Siri, and Cortana will start dictating how we interact with search engines.

Question words such as Who, What, Where, Why, and How are a common denominator in many of the searches we perform today, and that is because search engines are closer than ever to emulating how we interact with other humans. So formatting your information to answer those questions whenever possible will be paramount in any SEO strategy.

Wearable technology, such as smartwatches, is one of the factors accelerating this type of search, and in 2016, as these devices become more popular, seeing people ask questions of their watches will feel more and more natural.

I talked about this two years ago at a conference in Brighton. At the time I had the first version of Samsung’s smartwatch, and I remember being too embarrassed to use its voice capabilities; today I often find myself asking my watch “Ok Google, what is the weather going to be like this week in London?” or “What’s the exchange rate between USD and GBP?” because I’m too lazy to get my phone out (I’m still a geek for doing that, but less so than two years ago). Talking to wearable technology for more complex tasks, such as booking a restaurant or a hotel, will be commonplace in the next couple of years as browsing on wearables develops further.

If having a mobile-ready site was not one of your 2015 objectives, it should be your number one priority. What will be different in 2016 is that it will no longer be just about having a mobile-ready site; having the information on your site formatted for how people interact with mobile devices will also be pivotal for any successful SEO strategy.

Optimizing for local search, using micro-formats, and indexing mobile app pages will also grow in importance. With so many types of screens and devices to serve, search engines will give preference to sites that serve snippets of information they can understand and manage better.

Bas van den Beld, Marketing Strategy Consultant

To be successful in SEO in 2016, marketers should first of all stop chasing Google. Google is changing and is moving toward becoming more of a personal assistant than a search engine. As SEOs, we need to deal with that.

Too many SEOs are focusing on when the next Google update will roll out. Instead they should be working toward getting ready where it matters.

Part of an SEO’s job will still be “old-fashioned” SEO, making sure everything works as it should. It just won’t be about the more generic terms; it needs to be on a level that is much more personal to the searcher.

With competition growing and search getting more personal, SEOs need to get closer to those they are targeting: getting a grip on the consumer journey their audience is on, and optimizing to be present at the right stage of that journey. In SEO in 2016, you need to be there when it matters.

Martin Woods, SEO Consultant, SALT

This time last year I made several predictions about the direction of SEO in 2015, and several of those came true, based on the SEO campaigns my agency has been involved with.

Perhaps the most significant has been a shift towards project work for larger brands, for more precise deliverables from SEO companies. While ongoing base SEO retainers make sense for regular repeat monthly sprints of work, such as technical audits, we are more commonly being asked to provide ‘layered’ projects on top of this, according to marketing calendars. This has been the case for supporting in-house SEO teams, as well as when we are sole SEO consultants involved on projects. I see this trend continuing into 2016.

Secondly, the scope of what is considered to be within the remit of “an SEO” is growing every year, as search engine algorithms become more and more complex. Toward the end of 2015 news emerged that Google was using a machine-learning artificial intelligence system called RankBrain to help sort through billions of search results – probably to deliver ever more relevant results. This suggests that 2016 will see an increasing shift toward personalised results, based on real AI, rather than generic metrics.

2016 is also flagged as the year when two notorious spam algorithms – Google Penguin and Google Panda – will be incorporated into the rolling core algorithm. Businesses should certainly put more focus on staying on the right side of these algorithms, through regular ongoing auditing of link profiles followed by updates to disavow files. They should also pay close attention to index bloat and to low-quality pages in the search index.
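A disavow file of the kind mentioned above is a plain-text list uploaded via Google’s disavow links tool, one domain or URL per line. A minimal sketch follows; the domains are placeholders:

```text
# Hypothetical disavow file (placeholder domains).
# Lines starting with "#" are comments.
# A "domain:" line disavows every link from that domain.
domain:spammy-directory.example
domain:paid-links.example
# Individual URLs can also be disavowed:
http://low-quality-blog.example/page-with-link.html
```

Disavowing at the domain level is usually safer than listing individual URLs, since spam links often move around within a site.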

Lastly, many of our clients are moving toward implementing edge network technology for various reasons – specifically around increased security (SSL, DNSSEC, and DDoS protection) – all of which has SEO implications, as well as using powerful CDNs for optimizing delivery. We recently looked at the risks and rewards of this type of technology in a case study on Cloudflare’s impact on SEO. I predict that technical SEO in 2016 will increasingly champion this technology as website security continues to impact organic search.

The experts have spoken. Your turn. What do you think will be the biggest SEO trend in 2016?