The top 6 SEO trends from SMX Munich conference
Lots of changes have affected search in 2017, and now, as we've entered 2018, all eyes are on the newly emerging trends and the updates that Google has in store for us.
"Evolve or decay" — that's what many call the future of SEO. And if you find it hard to keep track of all the changes you must adjust to and evolve with… No worries! Just back from the largest and most trusted SEO conference in Europe — SMX Munich — we've carefully documented all the most important takeaways!
We've been all eyes and ears for you, and here's what we brought back — the top 6 SEO trends for 2018 as seen by the SEO PowerSuite team:
1. Mobile-first indexing: how to get your site prepared
Google's been working on mobile-first indexing for quite some time now, and, finally, quite a big change (at least for Google itself) is about to roll out.
To put it simply, instead of indexing and ranking your desktop-version content (and then pushing it down in rankings on smartphones if the website wasn't mobile-friendly), Google will now index and rank the mobile version of the site.
- How's that achieved technically?
So, basically, if you have no mobile version (and serve Googlebot the desktop version of your site), it simply crawls your desktop version. As John Mueller points out, Google's tests showed that a properly set up desktop version of your site might work a lot better in mobile-first indexing than a badly set up mobile version.
- What websites are being switched to mobile-first?
At the moment, before switching to mobile-first, Google checks if your site is ready.
- What happens to desktop versions?
While indexing the mobile version, Google will still be able to track its connection to the desktop version. And if a query is made on desktop, the result will be served from your desktop site.
- What are the criteria to check if the site is ready?
Here are the key criteria Google currently looks at:
- The content on your mobile version is equivalent to that indexed for your desktop version (you aren't ready for the switch if you have a very simplified mobile version with only part of your content available there).
- Similar internal links (the mobile version of your site has all the internal links Google needs to be able to crawl your entire website).
- Images and alt-texts are available (to make sure all your current images can get indexed for image search).
- Structured data and other annotations (AMP, hreflang, etc.). When switching to mobile, you need to make sure all of that is present on your mobile version.
- If you have a separate m-dot site, you need to have a strong enough server (able to cope with the load of crawling).
- Are all types of mobile-friendly versions OK?
Yes. It doesn't matter if you have a separate m-dot mobile website, serve mobile content dynamically on the same URLs, or simply use responsive design. All of them work fine. However, with m-dot versions you will definitely have a lot more work to do to get the site prepared.
- Is there a way to see if your website has already been switched?
You can do that by analyzing your server logs, or by simply adding a comment into your source code. Sample PHP code looks like this:
<!-- User-agent: <?php echo $_SERVER["HTTP_USER_AGENT"]; ?> -->
You'll then be able to see which of Google's crawlers visited a particular page by checking the page's source code in Google's cache (cache:yoursite.com).
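If you'd rather go the server-log route, here's a minimal sketch of the idea in Python (the log format is an assumption: Apache/Nginx "combined" logs, where the user agent is the last quoted field on each line):

```python
import re
from collections import Counter

def count_googlebot_hits(log_lines):
    """Count hits from Google's desktop vs. smartphone crawlers.

    Assumes Apache/Nginx "combined" log format, where the user-agent
    string is the last double-quoted field on each line.
    """
    counts = Counter()
    for line in log_lines:
        quoted = re.findall(r'"([^"]*)"', line)
        ua = quoted[-1] if quoted else ""
        if "Googlebot" in ua:
            # The smartphone crawler includes "Mobile" in its user agent
            if "Mobile" in ua:
                counts["googlebot-smartphone"] += 1
            else:
                counts["googlebot-desktop"] += 1
    return counts
```

Feed it the lines of your access log; a rising share of smartphone-crawler hits is a good sign the switch has happened for you.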
- Will structured data set with Data Highlighter still be working?
If you've been using Data Highlighter to mark up structured data on your pages, you will probably need to *retrain* it (which might be a pretty time-consuming task for some sites).
Getting your mobile version ready for mobile-first indexing surely requires a lot of auditing. You'd need to make sure all your content is in place, get rid of all the broken images and resources, take care of your internal link structure, and so on.
To do that, and to compare your mobile version against the desktop one, the best idea would be to create two separate projects in WebSite Auditor: one crawled with a general user agent, and another with Google's mobile bot.
To do that, when creating your project, tick the Enable expert options checkbox on the first step of the wizard, and pick the corresponding user agent and robots.txt instructions:
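If you'd then like to script a quick parity check between the two crawls yourself, a rough sketch could look like this (it assumes you've exported each crawl as a simple URL-to-word-count mapping; the 50% "thin content" threshold is arbitrary):

```python
def compare_crawls(desktop, mobile, thin_ratio=0.5):
    """Compare a desktop and a mobile crawl of the same site.

    Each argument is a dict mapping URL -> word count on that page.
    Flags pages missing from the mobile crawl, and pages whose mobile
    content is much thinner than the desktop version.
    """
    missing = [url for url in desktop if url not in mobile]
    thin = [url for url in desktop
            if url in mobile and mobile[url] < thin_ratio * desktop[url]]
    return {"missing_on_mobile": missing, "thin_on_mobile": thin}
```

Anything in either list is a candidate for the "equivalent content" fix described above.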
2. JavaScript-based websites: will search engines see your content?
So you build an awesome website — modern, beautiful, with all the visual effects you've always dreamt of. But… you find it NOWHERE in search engines. And so do your customers.
So, to understand your website the way users do, search engines need to first render the JavaScript and build a DOM model.
Can they do that? Well, Google claims they can. But only if you're very scrupulous about your website's setup.
So, here are the top things for you to remember:
- 1 unique clean URL per unique piece of content
Each piece of your content must be located "somewhere" for a search engine to index it. So if you dynamically change your content without changing a URL, you're preventing search engines from accessing it. Also, try sticking to clean URL format: example.com/url.
- Use <a href> links instead of onclick="window.location="
Keep in mind that search engines do not treat onclick="window.location=" as ordinary links. Which means that in most cases they won't follow this type of navigation. And definitely won't treat them as internal link signals.
- Load content automatically, not based on user interaction (click, mouseover, scroll)
Googlebot is not a user; it's not going to click or scroll down, so it simply won't see the content you load upon these actions.
- Load the content quickly
Google obviously won't wait forever, so content that you want to be crawled and indexed must load very quickly.
- Stick to server-side (as opposed to client-side) rendering
- Avoid duplicate <head> section elements (title, meta description, etc.)
With some JS frameworks you might face this type of content duplication, which you should fix.
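As a quick illustration, here's a small stdlib-based Python sketch (not a feature of any SEO tool, just an example check) that flags repeated <title> and meta description tags in a page's rendered HTML:

```python
from html.parser import HTMLParser
from collections import Counter

class HeadTagCounter(HTMLParser):
    """Counts <title> and <meta name="description"> tags inside <head>."""

    def __init__(self):
        super().__init__()
        self.in_head = False
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "head":
            self.in_head = True
        elif self.in_head:
            if tag == "title":
                self.counts["title"] += 1
            elif tag == "meta" and dict(attrs).get("name") == "description":
                self.counts["meta description"] += 1

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

def duplicate_head_tags(html):
    """Return the head tags that appear more than once."""
    parser = HeadTagCounter()
    parser.feed(html)
    return [tag for tag, n in parser.counts.items() if n > 1]
```

Run it against the *rendered* source (what the browser sees after JS executes), not the raw server response, or you'll miss duplicates injected by the framework.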
3. The rise of voice search: a threat or an opportunity?
Google Assistant, Siri, Alexa, Cortana – what we see today is a true explosion in the voice assistants industry. And more of them are sure to come.
With the rise of voice assistants, voice search is growing rapidly. But what is voice search, and what does it mean for you and your website?
Essentially, it's the same old search, but definitely with a twist: voice search makes the input truly conversational. Here are a few things you might focus your strategy on to get ready for voice search (and, surely, benefit in traditional search as well).
- Get featured snippets.
Around 80% of Google Home answers come from featured snippets. When Google Home answers the searcher's question, it cites the source of the info (the site's name) and sends a link to the searcher's Google Home app. And this is a great opportunity you cannot miss.
So, find the queries that return a featured snippet and write a better answer to them, seasoned with Schema markup (note: check our recent guide to featured snippet optimization).
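For instance, a question-and-answer block could be marked up with schema.org's FAQPage type. Here's a sketch that builds the JSON-LD with Python's json module (the question and answer text are placeholders):

```python
import json

# A minimal FAQPage markup sketch; question and answer text are placeholders.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What is mobile-first indexing?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "Google indexes and ranks the mobile version of your site.",
        },
    }],
}

json_ld = json.dumps(faq_markup, indent=2)
# Embed json_ld in a <script type="application/ld+json"> tag on the page
print(json_ld)
```

You could of course write the JSON-LD by hand; generating it from a template just makes it easier to keep markup consistent across many Q&A pages.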
- Re-visit your keyword strategy.
It turns out that for voice searches, there's a very specific way searchers are likely to look for a product depending on its nature:
- Low cost/risk items are typically searched by store (loyalty to the store brand)
- Medium cost/risk items are typically searched by brand (loyalty to the product brand)
- High cost/risk items are typically searched by features (loyalty to prices/features)
So be there, where your customers are looking for you.
- Adjust your language.
Speak the language your audience speaks. Think of all the questions that come up at different stages of the customer journey, and of all the modifiers they use:
How do you get question ideas for voice search? Rank Tracker's Common Questions research method would be a great help.
In your project go to Keyword Research, click the Suggest Keywords button, and choose the Common Questions research method. Enter your keywords and click Next. When your research is done, your dashboard will populate with ideas for your possible questions.
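For a quick, tool-free starting point, you could also combine your seed keywords with common question templates. A tiny sketch (the template list is purely illustrative; extend it with the modifiers your audience actually uses):

```python
# Illustrative question templates; {kw} is replaced with the seed keyword.
TEMPLATES = [
    "what is the best {kw}",
    "where to buy {kw}",
    "how to choose {kw}",
    "which {kw} should I get",
]

def question_ideas(seed_keyword):
    """Turn a seed keyword into question-style keyword ideas."""
    return [t.format(kw=seed_keyword) for t in TEMPLATES]
```

The output is only a brainstorming aid; you'd still validate the ideas against real search volume data.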
4. Content marketing: why ~97% of your efforts don't bring results
For a very long time SEO was all about quantity. We needed backlinks, we needed keywords, and we needed more. And even though times have changed, we still have one "quantity" concept in mind: the more content you create (and the more keywords you target), the better.
Sounds fair, doesn't it?
Well, keeping your content marketing strategy in line with this theory might mean you're pouring your efforts down the drain. In most cases, it's only the top 3% of your content ("content unicorns," as Larry Kim calls them) that brings most of your traffic:
And the reasons for that are pretty obvious from the way search engine and social media algos work. It is the best performing content (with the highest engagement rates) that gets pushed higher in SERPs and in news feeds, so the mediocre pieces of content have absolutely no chance to catch up.
How can you adjust your content strategy for that? In two simple ways:
- Measure and compare user engagement rates (be that social shares, conversions, CTRs or any relevant metric) for your current content and identify top-performers. And focus maximum attention on them, because with a little extra budget and promotion they could bring you even better results.
- Focus your efforts on creating "unicorn babies": that is, reuse a topic that has already performed great for you. Repurpose your content in various formats (videos, webinars, infographics and what not), reuse it from time to time, and create follow-ups to repeat your success.
Measuring "engagement" might sound a bit vague. However, in WebSite Auditor, there are a number of metrics that could help you identify your best-performing content:
- InLink Rank for pages' backlink popularity
- Google Analytics Pageviews and Bounce rates
- Social media share counts
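Once you've pulled those numbers, identifying your "unicorns" is just a top-percentile cut on whatever engagement metric you track. A sketch (the 3% default follows Larry Kim's figure; swap in your own metric and threshold):

```python
def find_unicorns(pages, top_share=0.03):
    """Return the top ``top_share`` fraction of pages by engagement score.

    ``pages`` maps page URL -> engagement score (shares, conversions,
    pageviews, whichever metric you track). Always returns at least one page.
    """
    ranked = sorted(pages, key=pages.get, reverse=True)
    n = max(1, round(len(pages) * top_share))
    return ranked[:n]
```

The pages this returns are the ones worth extra promotion budget and follow-up content.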
5. What are SEO A/B tests & how to use them to maximize traffic?
Don't trust your gut, run a test! Test your landing pages for conversion; test your ads for CTR. Test, test and test. In most types of marketing activities, we heavily rely on A/B testing.
However, not too many of us have ever considered A/B testing their SEO. And if you think about it, that seems quite understandable: unlike human beings, search engines used to be predictable.
But let's face it: machine learning has changed that once and for all. The algos, having absorbed such a great number of human behavior signals, have turned into something you simply cannot predict. So, it's high time you started testing.
And the only difference from traditional split testing here would be that you cannot simply create two versions of a page and see which one ranks better. Even ignoring the problem of duplicate content, the test would be muddied by the age of the page, its current performance, and its appearance in internal linking structures. So we take sets of similar pages to split:
- Identify the set of pages you want to improve;
- Choose the test to run across those pages;
- Randomly group the pages into the control and variant groups;
- Measure the resulting changes and declare a test a success if the variant group outperforms its forecast while the control group does not.
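The grouping and measurement steps above can be sketched like this (a toy example; a real SEO split test would compare each group against its traffic forecast, which is omitted here):

```python
import random

def split_pages(pages, seed=42):
    """Randomly split a list of similar pages into control and variant groups."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(pages)
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # (control, variant)

def uplift_percent(traffic_before, traffic_after):
    """Percent change in total organic traffic for a group of pages.

    Each argument maps page URL -> organic sessions over the period.
    """
    before = sum(traffic_before.values())
    after = sum(traffic_after.values())
    return (after - before) / before * 100
```

You'd apply the change (say, a new title template) only to the variant group, then compare the two groups' uplift after a few weeks.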
"But what should I test?" you might ask. Testing for the sake of testing is senseless. Before starting out, you need to clearly state your hypothesis.
Like Pinterest did:
The most reliable way to measure the performance of your A/B-tested page groups is the organic traffic they get. However, you definitely want to keep a close eye on your particular keyword rankings (since it's not only about the amount of traffic, but also about keeping rankings for your top money keywords safe).
To do that, you might tag the keywords whose ranking pages are involved in split testing:
You might also create reports based on these tags to analyze the performance of your tested keywords:
6. Accelerated Mobile Pages — Should we AMP-ify?
If you somehow missed the buzz, Accelerated Mobile Pages (AMP) is an initiative launched by Google in 2015. It is a new "standard" for building web content for mobile devices, which is based on AMP HTML (regular HTML with some restrictions and extensions), AMP JS (a library that ensures the fast rendering of AMP pages), and AMP Cache (Google's cloud cache intended to reduce the time it takes for content to load on a user's mobile device).
When it first made its appearance, the project seemed to be the future of mobile content. And while it does have its advantages, Bastian Grimm from Peak Ace reminds us AMP is not a silver bullet that lets you ignore speed optimization of your main website.
The experience publishers had with AMP over the past years proves:
- AMP requires a lot of effort.
Converting an existing site to AMP almost never works; you need to rebuild the entire HTML & CSS from scratch (which takes time & resources).
- AMP incurs extra costs.
Extending your CMS's capabilities to manage AMP content is expensive, and additional maintenance (IT, editorial, etc.) increases costs further.
- AMP content lacks branding.
If you scroll the page down past the logo, can you guess which newspaper is which? AMP versions of different websites essentially all look pretty much the same. What's more, your monetization & marketing automation options are very limited due to JS restrictions.
Well, perhaps that's something you could sacrifice for the sake of user experience? Wait a sec.
It is actually possible to make your regular website load FASTER than AMP:
This will surely require some technical effort too (for the tech-savvy, I recommend looking through Bastian's presentation here, starting from slide #22). But it will give your whole website a certain SEO boost and, most importantly, a better user experience, without harming your brand and conversions.
Before you make your decision, it's important to understand where your site stands in terms of mobile loading speed.
You can run a comprehensive mobile-friendliness test in WebSite Auditor under Content Analysis. Just switch to the Technical factors tab and look at the Page Usability (Mobile) section of the technical factors for any errors or warnings.
In summary, we can totally see how quickly SEO is changing — often in ways we could not have predicted. This way, search engines are keeping the game hard, fair, and — if you think about it — interesting.
Which of the factors above do you think will affect SEO most in the coming months? As always, I'm looking forward to hearing your thoughts and questions in the comments!