Wednesday, February 27, 2019

14 SEO Predictions for 2019 & Beyond, as Told by Mozzers

Posted by TheMozTeam

With the new year in full swing and an already busy first quarter, our 2019 predictions for SEO in the new year are hopping onto the scene a little late — but fashionably so, we hope. From an explosion of SERP features to increased monetization to the key drivers of search this year, our SEO experts have consulted their crystal balls (read: access to mountains of data and in-depth analyses) and made their predictions. Read on for an exhaustive list of fourteen things to watch out for in search from our very own Dr. Pete, Britney Muller, Rob Bucci, Russ Jones, and Miriam Ellis!

1. Answers will drive search

People Also Ask boxes exploded in 2018, and featured snippets have expanded into both multifaceted and multi-snippet versions. Google wants to answer questions, it wants to answer them across as many devices as possible, and it will reward sites with succinct, well-structured answers. Focus on answers that naturally leave visitors wanting more and establish your brand and credibility. [Dr. Peter J. Meyers]


2. Voice search will continue to be utterly useless for optimization

Optimizing for voice search will still be no more than optimizing for featured snippets, and conversions from voice will remain a black box. [Russ Jones]


3. Mobile is table stakes

This is barely a prediction. If your 2019 plan is to finally figure out mobile, you're already too late. Almost all Google features are designed with mobile-first in mind, and the mobile-first index has expanded rapidly in the past few months. Get your mobile house (not to be confused with your mobile home) in order as soon as you can. [Dr. Peter J. Meyers]


4. Further SERP feature intrusions in organic search

Expect Google to find more and more ways to replace organic with solutions that keep users on Google’s property. This includes interactive SERP features that replace, slowly but surely, many website offerings in the same way that live scores, weather, and flights have. [Russ Jones]


5. Video will dominate niches

Featured Videos, Video Carousels, and Suggested Clips (where Google targets specific content in a video) are taking over the how-to spaces. As Google tests search appliances with screens, including Home Hub, expect video to dominate instructional and DIY niches. [Dr. Peter J. Meyers]


6. SERPs will become more interactive

We’ve seen the start of interactive SERPs with People Also Ask boxes. Depending on which question you expand, two to three new questions that directly pertain to it will be generated below. This real-time engagement keeps people on the SERP longer and helps Google better understand what a user is seeking. [Britney Muller]


7. Local SEO: Google will continue getting up in your business — literally

Google will continue asking more and more intimate questions about your business to your customers. Does this business have gender-neutral bathrooms? Is this business accessible? What is the atmosphere like? How clean is it? What kind of lighting do they have? And so on. If Google can acquire accurate, real-world information about your business (your percentage of repeat customers via geolocation data, prices via transaction history, etc.) they can rely less heavily on website signals and provide more accurate results to searchers. [Britney Muller]


8. Business proximity-to-searcher will remain a top local ranking factor

In Moz’s recent State of Local SEO report, the majority of respondents agreed that Google’s focus on the proximity of a searcher to local businesses frequently emphasizes distance over quality in the local SERPs. I predict that we’ll continue to see this heavily weighting the results in 2019. On the one hand, hyper-localized results can be positive, as they allow a diversity of businesses to shine for a given search. On the other hand, with the exception of urgent situations, most people would prefer to see the best options rather than just the closest ones. [Miriam Ellis]


9. Local SEO: Google is going to increase monetization

Look for more of the local and maps space to be monetized in unique ways by Google, both through AdWords and potentially through new lead-gen models. This space will become more and more competitive. [Russ Jones]


10. Monetization tests for voice

Google and Amazon have been moving towards voice-supported displays in hopes of better monetizing voice. It will be interesting to see their efforts to get displays into homes and how they integrate display advertising. Bold prediction: Amazon will serve sleep-mode display ads, similar to how the Kindle displays them today. [Britney Muller]

11. Marketers will place a greater focus on the SERPs

I expect we’ll see a greater focus on the analysis of SERPs as Google does more to give people answers without them having to leave the search results. We’re seeing more and more vertical search engines like Google Jobs, Google Flights, Google Hotels, Google Shopping. We’re also seeing more in-depth content make it onto the SERP than ever in the form of featured snippets, People Also Ask boxes, and more. With these new developments, marketers are increasingly going to want to report on their general brand visibility within the SERPs, not just their website ranking. It’s going to be more important than ever for people to be measuring all the elements within a SERP, not just their own ranking. [Rob Bucci]


12. Targeting topics will be more productive than targeting queries

2019 is going to be another year in which we see the emphasis on individual search queries start to decline, as people focus more on clusters of queries around topics. People Also Ask queries have made the importance of topics much more obvious to the SEO industry. With PAAs, Google is clearly illustrating that they think about searcher experience in terms of a searcher’s satisfaction across an entire topic, not just a specific search query. With this in mind, we can expect SEOs to increasingly want their search queries clustered into topics so they can measure visibility and the competitive landscape across those clusters. [Rob Bucci]


13. Linked unstructured citations will receive increasing focus

I recently conducted a small study in which there was a 75% correlation between organic and local pack rank. Linked unstructured citations (the mention of partial or complete business information + a link on any type of relevant website) are a means of improving organic rankings which underpin local rankings. They can also serve as a non-Google dependent means of driving traffic and leads. Anything you’re not having to pay Google for will become increasingly precious. Structured citations on key local business listing platforms will remain table stakes, but competitive local businesses will need to focus on unstructured data to move the needle. [Miriam Ellis]


14. Reviews will remain a competitive difference-maker

A Google rep recently stated that about one-third of local searches are made with the intent of reading reviews. This is huge. Local businesses that acquire and maintain a good and interactive reputation on the web will have a critical advantage over brands that ignore reviews as fundamental to customer service. Competitive local businesses will earn, monitor, respond to, and analyze the sentiment of their review corpus. [Miriam Ellis]


We’ve heard from Mozzers, and now we want to hear from you. What have you seen so far in 2019 that’s got your SEO Spidey senses tingling? What trends are you capitalizing on and planning for? Let us know in the comments below (and brag to friends and colleagues when your prediction comes true in the next 6–10 months). ;-)



Thursday, February 21, 2019

The Influence of Voice Search on Featured Snippets

Posted by TheMozTeam

This post was originally published on the STAT blog.


We all know that featured snippets provide easy-to-read, authoritative answers and that digital assistants love to say them out loud when asked questions.

This means that featured snippets have an impact on voice search — bad snippets, or no snippets at all, and digital assistants struggle. By that logic: Create a lot of awesome snippets and win the voice search race. Right?

Right, but there’s actually a far more interesting angle to examine — one that will help you nab more snippets and optimize for voice search at the same time. In order to explore this, we need to make like Doctor Who and go back in time.

From typing to talking

Back when dinosaurs roamed the earth and queries were typed into search engines via keyboards, people adapted to search engines by adjusting how they performed queries. We pulled out unnecessary words and phrases, like “the,” “of,” and, well, “and,” which created truncated requests — robotic-sounding searches for a robotic search engine.

The first ever dinosaur to use Google.

Of course, as search engines have evolved, so too has their ability to understand natural language patterns and the intent behind queries. Google’s 2013 Hummingbird update helped pave the way for such evolution. This algorithm rejigging allowed Google’s search engine to better understand the whole of a query, moving it away from keyword matching to conversation having.

This is good news if you’re a human person: We have a harder time changing the way we speak than the way we write. It’s even greater news for digital assistants, because voice search only works if search engines can interpret human speech and engage in chitchat.

Digital assistants and machine learning

By looking at how digital assistants do their voice search thing (what we say versus what they search), we can see just how far machine learning has come with natural language processing and how far it still has to go (robots, they’re just like us!). We can also get a sense of the kinds of queries we need to be tracking if voice search is on the SEO agenda.

For example, when we asked our Google Assistant, “What are the best headphones for $100,” it queried [best headphones for $100]. We followed that by asking, “What about wireless,” and it searched [best wireless headphones for $100]. And then we remembered that we’re in Canada, so we followed that with, “I meant $100 Canadian,” and it performed a search for [best wireless headphones for $100 Canadian].

We can learn two things from this successful tête-à-tête: Not only does our Google Assistant manage to construct mostly full-sentence queries out of our mostly full-sentence asks, but it’s able to accurately link together topical queries. Despite us dropping our subject altogether by the end, Google Assistant still knows what we’re talking about.

Of course, we’re not above pointing out the fumbles. In the string of: “How to bake a Bundt cake,” “What kind of pan does it take,” and then “How much do those cost,” the actual query Google Assistant searched for the last question was [how much does bundt cake cost].

Just after we finished praising our Assistant for being able to maintain the same subject all the way through our inquiry, we needed it to be able to switch tracks. And it couldn’t. It associated the “those” with our initial Bundt cake subject instead of the most recent noun mentioned (Bundt cake pans).

In another important line of questioning about Bundt cake-baking, “How long will it take” produced the query [how long does it take to take a Bundt cake], while “How long does that take” produced [how long does a Bundt cake take to bake].

They’re the same ask, but our Google Assistant had a harder time parsing which definition of “take” our first sentence was using, spitting out a rather awkward query. Unless we really did want to know how long it’s going to take us to run off with someone’s freshly baked Bundt cake? (Don’t judge us.)

Since Google is likely paying out the wazoo to up the machine learning ante, we expect fewer awkward failures over time. Which is a good thing, because when we asked about Bundt cake ingredients (“Does it take butter”), we found ourselves looking at a SERP for [how do I bake a butter].

Not that that doesn’t sound delicious.

Snippets are appearing for different kinds of queries

So, what are we to make of all of this? That we’re essentially in the midst of a natural language renaissance. And that voice search is helping spearhead the charge.

As for what this means for snippets specifically? They’re going to have to show up for human speak-type queries. And wouldn’t you know it, Google is already moving forward with this strategy, and not simply creating more snippets for the same types of queries. We’ve even got proof.

Over the last two years, we’ve seen an increase in the number of words in a query that surfaces a featured snippet. Long-tail queries may be a nuisance and a half, but snippet-having queries are getting longer by the minute.

When we bucket and weight the terms found in those long-tail queries by TF-IDF, we get further proof of voice search’s sway over snippets. The term “how” appears more than any other word and is followed closely by “does,” “to,” “much,” “what,” and “is” — all words that typically compose full sentences and are easier to remove from our typed searches than our spoken ones.
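If you'd like to run a similar gut-check on your own keyword set, here's a minimal sketch of that kind of term weighting using scikit-learn. The file name is hypothetical, and the approach is only a rough approximation of the analysis described above, not STAT's exact methodology.

```python
# Rough sketch: weight the terms in snippet-winning queries by TF-IDF and
# see which words carry the most weight overall. Assumes those queries were
# exported to a plain-text file, one query per line (file name is hypothetical).
from sklearn.feature_extraction.text import TfidfVectorizer

with open("snippet_queries.txt") as f:
    queries = [line.strip() for line in f if line.strip()]

# Keep short function words like "to" and "is" in play -- they're the point here.
vectorizer = TfidfVectorizer(token_pattern=r"(?u)\b\w+\b", lowercase=True)
matrix = vectorizer.fit_transform(queries)

# Sum each term's TF-IDF weight across all queries, then rank the terms.
totals = matrix.sum(axis=0).A1  # .A1 flattens the 1 x n matrix to a 1-D array
terms = vectorizer.get_feature_names_out()

for term, weight in sorted(zip(terms, totals), key=lambda t: t[1], reverse=True)[:20]:
    print(f"{term:<15}{weight:.2f}")
```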

This means that if we want to snag more snippets and help searchers using digital assistants, we need to build out long-tail, natural-sounding keyword lists to track and optimize for.

Format your snippet content to match

When it’s finally time to optimize, one of the best ways to get your content into the ears of a searcher is through the right snippet formatting, which is a lesson we can learn from Google.

Taking our TF-IDF-weighted terms, we found that the words “best” and “how to” brought in the most list snippets of the bunch. We certainly don’t have to think too hard about why Google decided they benefit from list formatting — it provides a quick comparative snapshot or a handy step-by-step.

From this, we may be inclined to format all of our “best” and “how to” keyword content into lists. But, as you can see in the chart above, paragraphs and tables are still appearing here, and we could be leaving snippets on the table by ignoring them. If we have time, we’ll dig into which keywords those formats are a better fit for and why.

Get tracking

You could be the Wonder Woman of meta descriptions, but if you aren’t optimizing for the right kind of snippets, then your content’s going to have a harder time getting heard. Building out a voice search-friendly keyword list to track is the first step to lassoing those snippets.

Want to learn how you can do that in STAT? Say hello and request a tailored demo.

Need more snippets in your life? We dug into Google’s double-snippet SERPs for you — double the snippets, double the fun.



SEO Channel Context: An Analysis of Growth Opportunities

Posted by BrankoK

Too often, you see SEO analyses and decisions being made without considering the context of the marketing channel mix. Just as often, you see large budgets poured into paid ads in ways that seem to forget there's a whole lot to gain from catering to popular search demand.

Both instances can lead to leaky conversion funnels and missed opportunities for long-term traffic flows. This article walks you through an SEO context analysis we used to determine the importance and role of SEO.

This analysis was one of our deliverables for a marketing agency client who hired us to inform their SEO decisions. We then turned it into a report template for you to get inspired by and duplicate.

Case description

The included charts show real, live data. You can see the whole SEO channel context analysis in this Data Studio SEO report template.

The traffic analyzed is that of a monetizing blog whose marketing team also happens to be one of the most fun to work with. For the sake of this case study, we're giving them a spectacular undercover name — "The Broze Fellaz."

For context, this blog started off with content for the first two years before they launched their flagship product. Now, they sell a catalogue of products highly relevant to their content and, thanks to one of the most entertaining Shark Tank episodes ever aired, they have acquired investments and a highly engaged niche community.

As you’ll see below, organic search is their biggest channel in many ways. Facebook runs as both organic and paid, and the team spends many an hour inside the platform. Email has elaborate automated flows that strive to leverage subscribers who come from the stellar content on the website. We therefore chose these three — organic search, Facebook, and email — as a combination that would yield a comprehensive analysis with insights we can easily act on.

Ingredients for the SEO analysis

This analysis is a result of a long-term retainer relationship with "The Broze Fellaz" as our ongoing analytics client. A great deal was required in order for data-driven action to happen, but we assure you, it's all doable.

From the analysis best practice drawer, we used:

  • 2 cups of relevant channels for context and analysis via comparison.
  • 3 cups of different touch points to identify channel roles — bringing in traffic, generating opt-ins, closing sales, etc.
  • 5 heads of open-minded lettuce and readiness to change current status quo, for a team that can execute.
  • 457 oz of focus on finding what is going on with organic search, why it is going on, and what we can do about it (otherwise, we’d end up with another scorecard export).
  • Imperial units used in arbitrary numbers that are hard to imagine and thus feel very large.
  • 1 to 2 heads of your analyst brain, baked into the analysis. You're not making an automated report — even a HubSpot intern can do that. You're being a human and you're analyzing. You're making human analysis. This helps avoid having your job stolen by a robot.
  • Full tray of Data Studio visualizations that appeal to the eye.
  • Sprinkles of benchmarks, for highlighting significance of performance differences.

From the measurement setup and stack toolbox, we used:

  • Google Analytics with tailored channel definitions, enhanced e-commerce and Search Console integration.
  • Event tracking for opt-ins and adjusted bounce rate via MashMetrics GTM setup framework.
  • UTM routine for social and email traffic implemented via Google Sheets & UTM.io (a quick sketch of the tagging pattern follows this list).
  • Google Data Studio. This is my favorite visualization tool. Despite its flaws and gaps (as it’s still in beta) I say it is better than its paid counterparts, and it keeps getting better. For data sources, we used the native connectors for Google Analytics and Google Sheets, then Facebook community connectors by Supermetrics.
  • Keyword Hero. Thanks to semantic algorithms and data aggregation, you are indeed able to see 95 percent of your organic search queries (check out Onpage Hero, too, you'll be amazed).
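To give a concrete sense of what a UTM routine enforces, here's a minimal sketch of a tagging helper. The campaign names and URLs are hypothetical; the point is simply that every social and email link gets utm_source, utm_medium, and utm_campaign appended the same way, every time.

```python
# Minimal sketch of a consistent UTM-tagging helper for social and email links.
# The base URL and campaign names below are hypothetical.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM parameters without clobbering any existing query string."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/blog/post/", "facebook", "paid_social", "spring_launch"))
print(add_utm("https://example.com/blog/post/", "newsletter", "email", "weekly_digest"))
```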

Inspiration for my approach comes from Lea Pica, Avinash, the Google Data Studio newsletter, and Chris Penn, along with our dear clients and the questions they have us answer for them.

Ready? Let's dive in.

Analysis of the client's SEO on the context of their channel mix

1) Insight: Before the visit

What's going on and why is it happening?

Organic search traffic volume blows the other channels out of the water. This is normal for sites with quality regular content; yet, the difference is stark considering the active effort that goes into Facebook and email campaigns.

The CTR of organic search is on par with Facebook's. That's a lot to say when comparing an organic channel to a channel with a high level of targeting control.

It looks like email flows are the clear winner in terms of CTR to the website, which makes sense: the site has a highly engaged community of users who return fairly often and advocate passionately, as well as products and content that are incredibly relevant to those users, which few other companies appear to be good at.

A high CTR on search engine results pages often indicates that organic search may support funnel stages beyond just the top.

As well, email flows are sent to a very warm audience — interested users who went through a double opt-in. It is to be expected for this CTR to be high.

What's been done already?

There's an active effort and budget allocation being put towards Facebook Ads and email automation. A content plan has been put in place and is being executed diligently.

What we recommend next

  1. Approach SEO as systematically as you approach Facebook and email flows.
  2. Optimize meta titles and descriptions via testing tools such as Sanity Check. The organic search CTR may become consistently higher than that of Facebook ads.
  3. Assuming you've worked on improving CTR for Facebook ads, have the same person work on the meta text and titles. Most likely, there'll be patterns you can replicate from social to SEO.
  4. Run a technical audit and optimize accordingly. Knowing that you haven’t done that in a long time, and seeing how much traffic you get anyway, there’ll be quick, big wins to enjoy.

Results we expect

You can easily increase the organic CTR by at least 5 percent. You could also clean up the technical state of your site in the eyes of crawlers; you’ll then see faster indexing by search engines when you publish new content and increased impressions for existing content. As a result, you may enjoy a major spike within a month.

2) Insight: Engagement and options during the visit

With over 70 percent of traffic coming to this website from organic search, the metrics in this analysis will be heavily skewed towards organic search. So, comparing a rate for organic search to the site-wide rate is sometimes conclusive and sometimes not.

Adjusted bounce rate: via GTM events in the measurement framework we use, a visit is not counted as a bounce if it lasts 45 seconds or longer. We prefer this approach because such an adjusted bounce rate is much more actionable for content sites. Users who find what they were searching for often read the page they land on for several minutes without clicking to another page, yet that is still a memorable visit for the user. Further, staying on the landing page for a while, or keeping the page open in a browser tab, are both good indicators for distinguishing quality, interested traffic from the rest.
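To make the adjusted-bounce logic concrete, here's an illustrative sketch that recomputes bounce rate with the 45-second rule. In reality this is handled by a GTM timer event firing into Google Analytics; the session records and field names below are hypothetical.

```python
# Illustrative only: recompute bounce rate using the 45-second rule described
# above. The session export and its field names are hypothetical.
ADJUSTED_THRESHOLD_SECONDS = 45  # the GTM timer fires at 45 seconds

sessions = [
    {"pageviews": 1, "duration_seconds": 12},   # classic bounce, adjusted bounce
    {"pageviews": 1, "duration_seconds": 180},  # single page, but clearly engaged
    {"pageviews": 3, "duration_seconds": 240},  # multi-page visit
]

def is_classic_bounce(session):
    return session["pageviews"] == 1

def is_adjusted_bounce(session):
    # Still a bounce only if the visitor also left before the engagement timer fired.
    return session["pageviews"] == 1 and session["duration_seconds"] < ADJUSTED_THRESHOLD_SECONDS

classic = sum(map(is_classic_bounce, sessions)) / len(sessions)
adjusted = sum(map(is_adjusted_bounce, sessions)) / len(sessions)
print(f"Classic bounce rate:  {classic:.0%}")   # 67%
print(f"Adjusted bounce rate: {adjusted:.0%}")  # 33%
```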

We included all Facebook traffic here, not just paid. We know from the client’s data that the majority is from paid content; they have a solid UTM routine in place. But due to boosted posts, we’ve experienced big inaccuracies when splitting paid and organic Facebook for the purposes of channel attribution.

What's going on and why is it happening?

It looks like organic search has a bounce rate worse than the email flows — that's to be expected and not actionable, considering that the emails are only sent to recent visitors who have gone through a double opt-in. What is meaningful, however, is that organic has a better bounce rate than Facebook. It is safe to say that organic search visitors will be more likely to remember the website than the Facebook visitors.

Opt-in rates for Facebook are right above site average, and those for organic search are right below, while organic is bringing in a majority of email opt-ins despite its lower opt-in rate.

Google's algorithms and the draw of the content on this website are doing better at winning users' attention than the detailed targeting applied on Facebook. The organic traffic will have a higher likelihood of remembering the website and coming back. Across all of our clients, we find that organic search can be a great retargeting channel, particularly if you consider that the site will come up higher in search results for its recent visitors.

What's been done already?

The Facebook ad campaigns of "The Broze Fellaz" have been built and optimized for driving content opt-ins. The site content that ranks in organic search hasn't been shaped with that intention to the same degree.

Opt-in placements have been tested on some of the biggest organic traffic magnets.

Thorough, creative and consistent content calendars have been in place as a foundation for all channels.

What we recommend next

  1. It's great to keep using organic search as a way to introduce new users to the site. Now, you can try to be more intentional about using it for driving opt-ins. It’s already serving both of the stages of the funnel.
  2. Test and optimize opt-in placements on more traffic magnets.
  3. Test and optimize opt-in copy for top 10 traffic magnets.
  4. Once your opt-in rates have improved, focus on growing the channel. Add to the content work with a 3-month sprint of an extensive SEO project.
  5. Assign Google Analytics goal values to non-e-commerce actions on your site. The current opt-ins have different roles and levels of importance and there’s also a handful of other actions people can take that lead to marketing results down the road. Analyzing goal values will help you create better flows toward pre-purchase actions.
  6. Facebook campaigns seem to be at a point where you can pour more budget into them and expect proportionate increase in opt-in count.

Results we expect

Growth in your opt-ins from Facebook should be proportionate to increase in budget, with a near-immediate effect. At the same time, it’s fairly realistic to bring the opt-in rate of organic search closer to site average.

3) Insight: Closing the deal

For channel attribution with money involved, you want to make sure that your Google Analytics channel definitions, view filters, and UTMs are in top shape.

What's going on and why is it happening?

Transaction rate, as well as per session value, is higher for organic search than it is for Facebook (paid and organic combined).

Organic search contributes far more last-click revenue than Facebook and email combined. For their relatively low volume of traffic, email flows bring in an outstanding amount of revenue.

Thanks to the integration of Keyword Hero with Google Analytics for this client, we can see that about 30 percent of organic search visits are from branded keywords, which tends to drive the transaction rate up.

So, why is this happening? Most of the product on the site is highly relevant to the information people search for on Google.

Multi-channel reports in Google Analytics also show that people often discover the site in organic search, then come back by typing in the URL or clicking a bookmark. That makes organic a source of conversions where, very often, no other channels are even needed.

We can conclude that Facebook posts and campaigns of this client are built to drive content opt-ins, not e-commerce transactions. Email flows are built specifically to close sales.

What’s been done already?

There is dedicated staff for Facebook campaigns and posts, as well as a thorough system dedicated to automated email flows.

A consistent content routine is in place, with experienced staff at the helm. A piece has been published every week for the last few years, and the content calendar is filled with ready-to-publish content for the next few months. The community is highly engaged, reading times are high, comment counts are soaring, and the usefulness of the content is outstanding. This, along with partnerships with influencers, helps "The Broze Fellaz" take up half of the first page of the SERP for several lucrative topics. They’ve been achieving this even without a comprehensive SEO project. Content seems to be king indeed.

Google Shopping has been tried. The campaign looked promising but didn't yield incremental sales. There’s much more search demand for informational queries than there is for product queries.

What we recommend next

  1. Organic traffic is ready to grow. If there is no budget left, resource allocation should be considered. In paid search, you can often simply increase budgets. Here, with stellar content already performing well, a comprehensive SEO project is begging for your attention. Focus can be put into structure and technical aspects, as well as content that better caters to search demand. Think optimizing the site’s information architecture, interlinking content for a cornerstone structure, log analysis and technical cleanup, meta text testing for CTR gains that would also lead to ranking gains, strategic ranking for long-tail topics, and intentional growth of the backlink profile.
  2. Three- or six-month intensive sprint of comprehensive SEO work would be appropriate.

Results we expect

Increasing last click revenue from organic search and direct by 25 percent would lead to a gain as high as all of the current revenue from automated email flows. Considering how large the growth has been already, this gain is more than achievable in 3–6 months.

Wrapping it up

The organic search presence of "The Broze Fellaz" should continue to play the number-one role in bringing new people to the site and bringing people back. Doing so supports sales that happen with the contribution of other channels, e.g. email flows. The analysis also points out that organic search is effective at playing the role of the last-click channel for transactions, oftentimes without the help of other channels.

We’ve worked with this client for a few years, and, based on our knowledge of their marketing focus, this analysis points us to a confident conclusion that a dedicated, comprehensive SEO project will lead to high incremental growth.

Your turn

In drawing analytical conclusions and acting on them, there’s always more than one way to shoe a horse. Let us know what conclusions you would’ve drawn instead. Copy the layout of our SEO Channel Context Comparison analysis template and show us what it helped you do for your SEO efforts — create a similar analysis for a paid or owned channel in your mix. Whether it’s comments below, tweeting our way, or sending a smoke signal, we’ll be all ears. And eyes.



Friday, February 15, 2019

4 Ways to Improve Your Data Hygiene - Whiteboard Friday

Posted by DiTomaso

We base so much of our livelihood on good data, but managing that data properly is a task in and of itself. In this week's Whiteboard Friday, Dana DiTomaso shares why you need to keep your data clean and some of the top things to watch out for.


Video Transcription

Hi. My name is Dana DiTomaso. I am President and partner at Kick Point. We're a digital marketing agency, based in the frozen north of Edmonton, Alberta. So today I'm going to be talking to you about data hygiene.

What I mean by that is the stuff that we see every single time we start working with a new client: this stuff is always messed up. Sometimes it's one of these four things, sometimes it's all four, and sometimes there are extra things. So I'm going to cover this stuff today in the hopes that perhaps the next time we get a profile from someone it's not quite as bad, or that, if you look at these things and see how bad they are, you sit down and start cleaning this stuff up.

1. Filters

So what we're going to start with first are filters. By filters, I'm talking about analytics here, specifically Google Analytics. When you go into the admin of Google Analytics, there's a section called Filters. There's a section on the left, which holds all the filters for everything in that account, and then there's a filters section for each view. Filters help you exclude or include specific traffic based on a set of parameters.

Filter out office, home office, and agency traffic

So usually what we'll find is one Analytics property for your website with one view, All Website Data, which is the default that Analytics gives you, but with no filters. That means you're not excluding things like office traffic (your internal people visiting the website) or home-office traffic. If you have a bunch of people who work from home, get their IP addresses and exclude them too, because you don't necessarily want your internal traffic mucking up things like conversions, especially if you're doing stuff like checking your own forms.

You haven't had a lead in a while and maybe you fill out the form to make sure it's working. You don't want that coming in as a conversion and then screwing up your data, especially if you're a low-volume website. If you have a million hits a day, then maybe this isn't a problem for you. But if you're like the rest of us and don't necessarily have that much traffic, something like this can be a big problem in terms of the volume of traffic you see. Then agency traffic as well.

So agencies, please make sure that you're filtering out your own traffic. Again things like your web developer, some contractor you worked with briefly, really make sure you're filtering out all that stuff because you don't want that polluting your main profile.

Create a test and staging view

The other thing that I recommend is creating what we call a test and staging view. Usually in our Analytics profiles, we'll have three different views. One we call master, and that's the view that has all these filters applied to it.

So you're only seeing the traffic that isn't you. It's the customers, people visiting your website, the real people, not your office people. Then the second view we call test and staging. So this is just your staging server, which is really nice. For example, if you have a different URL for your staging server, which you should, then you can just include that traffic. Then if you're making enhancements to the site or you upgraded your WordPress instance and you want to make sure that your goals are still firing correctly, you can do all that and see that it's working in the test and staging view without polluting your main view.

Test on a second property

That's really helpful. Then the third thing is to make sure to test on a second property. This is easy to do with Google Tag Manager. What we'll have set up in most of our Google Tag Manager accounts is our usual analytics, and most of the stuff goes there. But then if we're testing something new, like say the content consumption metric we started putting out this summer, we want to make sure we set up a second Analytics property and send the new stuff that we're trying over to that second property, not to a view.

So you have two different Analytics properties. One is your main property. This is where all the regular stuff goes. Then you have a second property, which is where you test things out, and this is really helpful to make sure that you're not going to screw something up accidentally when you're trying out some crazy new thing like content consumption, which can totally happen and has definitely happened as we were testing the product. You don't want to pollute your main data with something different that you're trying out.

So send something to a second property. You do this for websites. You always have a staging and a live. So why wouldn't you do this for your analytics, where you have a staging and a live? So definitely consider setting up a second property.

2. Time zones

The next thing that we have a lot of problems with are time zones. Here's what happens.

Let's say your website is a basic install of WordPress and you didn't change the time zone, so it's set to UTC, which is the default in WordPress unless you change it. So now your website's data is recorded in UTC. Then let's say your marketing team is on the East Coast, so they've got all of their tools set to Eastern time. And your sales team is on the West Coast, so all of their tools are set to Pacific time.

So you can end up with a situation where, let's say, you've got a website using a form plugin for WordPress. When someone submits a form, it's recorded on your website, but that data also gets pushed over to your sales CRM. So now your website is saying that this number of leads came in on this day, because it's on UTC time. Well, in Eastern time that day has already ended, or it hasn't started yet, and Eastern is when your analytics tools are recording the number of leads.

But then the third wrinkle is then you have Salesforce or HubSpot or whatever your CRM is now recording Pacific time. So that means that you've got this huge gap of who knows when this stuff happened, and your data will never line up. This is incredibly frustrating, especially if you're trying to diagnose why, for example, I'm submitting a form, but I'm not seeing the lead, or if you've got other data hygiene issues, you can't match up the data and that's because you have different time zones.

So definitely check the time zones of every product you use: website, CRM, analytics, ads, all of it. If it has a time zone, pick one and stick with it. That's your canonical time zone. It will save you so many headaches down the road, trust me.
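If you're ever stitching exports from several tools together, normalizing every timestamp to that canonical time zone before comparing them saves a lot of head-scratching. Here's a small sketch; the tool-to-time-zone mapping is hypothetical and stands in for whatever your own stack actually reports in.

```python
# Normalize timestamps from different tools to one canonical time zone (UTC here)
# before comparing lead counts. The tool/time-zone mapping is hypothetical.
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

TOOL_TIMEZONES = {
    "website":   "UTC",                  # WordPress default
    "analytics": "America/New_York",     # marketing team's setting
    "crm":       "America/Los_Angeles",  # sales team's setting
}

def to_canonical(tool: str, local_timestamp: str) -> datetime:
    """Parse a tool's local timestamp and convert it to the canonical zone (UTC)."""
    naive = datetime.strptime(local_timestamp, "%Y-%m-%d %H:%M")
    aware = naive.replace(tzinfo=ZoneInfo(TOOL_TIMEZONES[tool]))
    return aware.astimezone(ZoneInfo("UTC"))

# The same form submission, as each tool would log it locally:
print(to_canonical("analytics", "2019-02-15 19:30"))  # 2019-02-16 00:30:00+00:00
print(to_canonical("crm", "2019-02-15 16:30"))        # 2019-02-16 00:30:00+00:00
```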

3. Attribution

The next thing is attribution. Attribution is a whole other lecture in and of itself, beyond what I'm talking about here today.

Different tools have different ways of showing attribution

But what I find frustrating about attribution is that every tool has its own little special way of doing it. Analytics is like the last non-direct click. That's great. Ads says, well, maybe we'll attribute it, maybe we won't. If you went to the site a week ago, maybe we'll call it a view-through conversion. Who knows what they're going to call it? Then Facebook has a completely different attribution window.

You can use a tool, such as Supermetrics, to change the attribution window. But if you don't understand what the default attribution window is in the first place, you're just going to make things harder for yourself. Then there's HubSpot, which says the very first touch is what matters, and so, of course, HubSpot will never agree with Analytics and so on. Every tool has its own little special sauce and how they do attribution. So pick a source of truth.

Pick your source of truth

The best thing to do is just say, "You know what? I trust this tool the most." Then that is your source of truth. Do not try to get this source of truth to match up with that source of truth. You will go insane. You do have to make sure that, at the very least, things like your time zones are consistent, so that part is settled.

Be honest about limitations

But then after that, really it's just making sure that you're being honest about your limitations.

Know where things are necessarily going to fall down, and that's okay, but at least you've got this source of truth that you at least can trust. That's the most important thing with attribution. Make sure to spend the time and read how each tool handles attribution so when someone comes to you and says, "Well, I see that we got 300 visits from this ad campaign, but in Facebook it says we got 6,000.

"Why is that?" You have an answer. That might be a little bit of an extreme example, but I mean I've seen weirder things with Facebook attribution versus Analytics attribution. I've even talked about stuff like Mixpanel and Kissmetrics. Every tool has its own little special way of recording attribution. It's never the same as anyone else's. We don't have a standard in the industry for how this stuff works, so make sure you understand these pieces.

4. Interactions

Then the last thing is what I call interactions. The biggest thing that I find people do wrong here is in Google Tag Manager: it gives you a lot of rope, which you can hang yourself with if you're not careful.

GTM interactive hits

One of the biggest things is what we call an interactive hit versus a non-interactive hit. So let's say in Google Tag Manager you have a scroll depth.

You want to see how far down the page people scroll. At 25%, 50%, 75%, and 100%, it will send off an alert and say this is how far down they scrolled on the page. Well, the thing is that you can also make that interactive. So if somebody scrolls down the page 25%, you can say, well, that's an interactive hit, which means that person is no longer bounced, because it's counting an interaction, which for your setup might be great.

Gaming bounce rate

But what I've seen are unscrupulous agencies who come in and say if the person scrolls 2% of the way down the page, now that's an interactive hit. Suddenly the client's bounce rate goes down from say 80% to 3%, and they think, "Wow, this agency is amazing." They're not amazing. They're lying. This is where Google Tag Manager can really manipulate your bounce rate. So be careful when you're using interactive hits.

Absolutely, maybe it's totally fair that if someone is reading your content, they might just read that one page and then hit the back button and go back out. It's totally fair to use something like scroll depth or a certain piece of the content entering the user's view port, that that would be interactive. But that doesn't mean that everything should be interactive. So just dial it back on the interactions that you're using, or at least make smart decisions about the interactions that you choose to use. So you can game your bounce rate for that.

Goal setup

Then goal setup as well, that's a big problem. A lot of people have destination goals set up in Analytics by default, maybe because they don't know how to set up event-based goals. By destination goal, I mean you filled out the form, you got to a thank you page, and you're recording views of that thank you page as goals, which yes, that's one way to do it.

But the problem is that a lot of people, who aren't super great at interneting, will bookmark that page or they'll keep coming back to it again and again because maybe you put some really useful information on your thank you page, which is what you should do, except that means that people keep visiting it again and again without actually filling out the form. So now your conversion rate is all messed up because you're basing it on destination, not on the actual action of the form being submitted.
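Here's an illustrative sketch of why that inflates things, using a made-up hit log: a destination goal counts every session that views the thank-you page, while an event-based goal only counts sessions where the form was actually submitted.

```python
# Hypothetical hit log: one real form submission, plus two later sessions where
# the same person revisits the bookmarked thank-you page without converting.
hits = [
    {"session": "A", "type": "event",    "name": "form_submit"},
    {"session": "A", "type": "pageview", "page": "/thank-you"},
    {"session": "B", "type": "pageview", "page": "/thank-you"},  # bookmarked revisit
    {"session": "C", "type": "pageview", "page": "/thank-you"},  # bookmarked revisit
]

destination_conversions = len({
    h["session"] for h in hits
    if h["type"] == "pageview" and h.get("page") == "/thank-you"
})
event_conversions = len({
    h["session"] for h in hits
    if h["type"] == "event" and h.get("name") == "form_submit"
})

print(f"Destination-goal conversions: {destination_conversions}")  # 3 -- inflated
print(f"Event-goal conversions:       {event_conversions}")        # 1 -- the real count
```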

So be careful on how you set up goals, because that can also really game the way you're looking at your data.

Ad blockers

Ad blockers could account for anywhere from 2% to 10% of your audience, depending upon how technically sophisticated your visitors are. So you'll end up in situations where you have a form fill but no corresponding visit to match it with.

It just goes into an attribution black hole. But they did fill out the form, so at least you got their data, but you have no idea where they came from. Again, that's going to be okay. So definitely think about the percentage of your visitors, based on you and your audience, who probably have an ad blocker installed and make sure you're comfortable with that level of error in your data. That's just the internet, and ad blockers are getting more and more popular.

Stuff like Apple is changing the way that they do tracking. So definitely make sure that you understand these pieces and you're really thinking about that when you're looking at your data. Again, these numbers may never 100% match up. That's okay. You can't measure everything. Sorry.

Bonus: Audit!

Then the last thing I really want you to think about — this is the bonus tip — audit regularly.

So at least once a year, go through all the different stuff that I've covered in this video and make sure that nothing has changed or been updated, and that you don't have some secret, exciting new tracking code that somebody added and then forgot about because you were trying out a trial of a product, tossed it on, and it's been running for a year even though the trial expired nine months ago. So definitely make sure that you're running the stuff that you should be running and doing an audit at least on a yearly basis.

If you're busy and you have a lot of different visitors to your website, it's a pretty high-volume property, maybe monthly or quarterly would be a better interval, but at least once a year go through and make sure that everything that's there is supposed to be there, because that will save you headaches when you look at trying to compare year-over-year and realize that something horrible has been going on for the last nine months and all of your data is trash. We really don't want to have that happen.

So I hope these tips are helpful. Get to know your data a little bit better. It will like you for it. Thanks.

Video transcription by Speechpad.com



Wednesday, February 13, 2019

A guide to setting up your very own search intent projects

Posted by TheMozTeam

This post was originally published on the STAT blog.


Whether you’re tracking thousands or millions of keywords, if you expect to extract deep insights and trends just by looking at your keywords from a high level, you’re not getting the full story.

Smart segmentation is key to making sense of your data. And you’re probably already applying this outside of STAT. So now, we’re going to show you how to do it in STAT to uncover boatloads of insights that will help you make super data-driven decisions.

To show you what we mean, let’s take a look at a few ways we can set up a search intent project to uncover the kinds of insights we shared in our whitepaper, Using search intent to connect with consumers.

Before we jump in, there are a few things you should have down pat:

1. Picking a search intent that works for you

Search intent is the motivating force behind search and it can be:

  • Informational: The searcher has identified a need and is looking for information on the best solution, e.g. [blender], [food processor]
  • Commercial: The searcher has zeroed in on a solution and wants to compare options, e.g. [blender reviews], [best blenders]
  • Transactional: The searcher has narrowed their hunt down to a few best options and is on the precipice of purchase, e.g. [affordable blenders], [blender cost]
    • Local (sub-category of transactional): The searcher plans to do or buy something locally, e.g. [blenders in dallas]
    • Navigational (sub-category of transactional): The searcher wants to locate a specific website, e.g. [Blendtec]

We left navigational intent out of our study because it’s brand-specific and we didn’t want it to bias our data.

Our keyword set was a big list of retail products — from kitty pooper-scoopers to pricey speakers. We needed a straightforward way to imply search intent, so we added keyword modifiers to characterize each type of intent.
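As a rough illustration of what "modifiers imply intent" can look like in practice, here's a small sketch. The modifier lists are examples based on the retail keywords above, not the exact set used in the study.

```python
# Tag keywords with an implied search intent based on their modifiers.
# These modifier lists are illustrative, not the exact ones from the study.
INTENT_MODIFIERS = {
    "commercial":    ["best", "top", "reviews", "compare"],
    "transactional": ["affordable", "cheap", "cost", "price", "buy"],
    "local":         [" in ", " near me"],
}

def classify_intent(keyword: str) -> str:
    # Naive substring matching; a real routine would match whole words.
    kw = f" {keyword.lower()} "
    for intent, modifiers in INTENT_MODIFIERS.items():
        if any(modifier in kw for modifier in modifiers):
            return intent
    # No modifier at all: treat the bare product term as informational.
    return "informational"

for kw in ["blender", "best blenders", "blender cost", "blenders in dallas"]:
    print(f"{kw:<22}{classify_intent(kw)}")
```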

As always, different strokes for different folks: The modifiers you choose and the intent categories you look at may differ, but it’s important to map that all out before you get started.

2. Identifying the SERP features you really want

For our whitepaper research, we pretty much tracked every feature under the sun, but you certainly don’t have to.

You might already know which features you want to target, the ones you want to keep an eye on, or questions you want to answer. For example, are shopping boxes taking up enough space to warrant a PPC strategy?

In this blog post, we’re going to really focus in on our most beloved SERP feature: featured snippets (called “answers” in STAT). And we’ll be using a sample project where we’re tracking 25,692 keywords against Amazon.com.

3. Using STAT’s segmentation tools

Setting up projects in STAT means making use of the segmentation tools. Here’s a quick rundown of what we used:

  • Standard tag: Best used to group your keywords into static themes — search intent, brand, product type, or modifier.
  • Dynamic tag: Like a smart playlist, automatically returns keywords that match certain criteria, like a given search volume, rank, or SERP feature appearance.
  • Data view: House any number of tags and show how those tags perform as a group.

Learn more about tags and data views in the STAT Knowledge Base.

Now, on to the main event…

1. Use top-level search intent to find SERP feature opportunities

To kick things off, we’ll identify the SERP features that appear at each level of search intent by creating tags.

Our first step is to filter our keywords and create standard tags for our search intent keywords (read more about filtering keywords). Second, we create dynamic tags to track the appearance of specific SERP features within each search intent group. And our final step, to keep everything organized, is to place our tags in tidy little data views, according to search intent.

Here’s a peek at what that looks like in STAT:

What can we uncover?

Our standard tags (the blue tags) show how many keywords are in each search intent bucket: 2,940 commercial keywords. And our dynamic tags (the sunny yellow stars) show how many of those keywords return a SERP feature: 547 commercial keywords with a snippet.

This means we can quickly spot how much opportunity exists for each SERP feature by simply glancing at the tags. Boom!

By quickly crunching some numbers, we can see that snippets appear on 5 percent of our informational SERPs (27 out of 521), 19 percent of our commercial SERPs (547 out of 2,940), and 12 percent of our transactional SERPs (253 out of 2,058).

From this, we might conclude that optimizing our commercial intent keywords for featured snippets is the way to go since they appear to present the biggest opportunity. To confirm, let’s click on the commercial intent featured snippet tag to view the tag dashboard…

Voilà! There are loads of opportunities to gain a featured snippet.

Though, we should note that most of our keywords rank below where Google typically pulls the answer from. So, what we can see right away is that we need to make some serious ranking gains in order to stand a chance at grabbing those snippets.


2. Find SERP feature opportunities with intent modifiers

Now, let’s take a look at which SERP features appear most often for our different keyword modifiers.

To do this, we group our keywords by modifier and create a standard tag for each group. Then, we set up dynamic tags for our desired SERP features. Again, to keep track of all the things, we contained the tags in handy data views, grouped by search intent.

What can we uncover?

Because we saw that featured snippets appear most often for our commercial intent keywords, it’s time to drill on down and figure out precisely which modifiers within our commercial bucket are driving this trend.

Glancing quickly at the numbers in the tag titles in the image above, we can see that “best,” “reviews,” and “top” are responsible for the majority of the keywords that return a featured snippet:

  • 212 out of 294 of our “best” keywords (72%)
  • 109 out of 294 of our “reviews” keywords (37%)
  • 170 out of 294 of our “top” keywords (58%)

This shows us where our efforts are best spent optimizing.

By clicking on the “best — featured snippets” tag, we’re magically transported into the dashboard. Here, we see that our average ranking could use some TLC.


There is a lot of opportunity to snag a snippet here, but we (actually, Amazon, who we’re tracking these keywords against) don’t seem to be capitalizing on that potential as much as we could. Let’s drill down further to see which snippets we already own.

We know we’ve got content that has won snippets, so we can use that as a guideline for the other keywords that we want to target.


3. See which pages are ranking best by search intent

In our blog post How Google dishes out content by search intent, we looked at what type of pages — category pages, product pages, reviews — appear most frequently at each stage of a searcher’s intent.

What we found was that Google loves category pages, which are the engine’s top choice for retail keywords across all levels of search intent. Product pages weren’t far behind.

By creating dynamic tags for URL markers, or portions of your URL that identify product pages versus category pages, and segmenting those by intent, you too can get all this glorious data. That’s exactly what we did for our retail keywords.
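To make "URL markers" concrete, here's a small sketch that buckets ranking URLs into page types by path pattern. The /product/ and /category/ markers and the URLs are hypothetical; substitute whatever patterns identify page types on the site you're tracking.

```python
# Sketch: bucket ranking URLs into page types using URL markers.
# The markers and example URLs below are hypothetical.
from urllib.parse import urlparse

URL_MARKERS = {
    "product page":  "/product/",
    "category page": "/category/",
}

def page_type(url: str) -> str:
    path = urlparse(url).path
    for label, marker in URL_MARKERS.items():
        if marker in path:
            return label
    return "other"

ranking_urls = [
    "https://www.example.com/product/quiet-blender-3000",
    "https://www.example.com/category/kitchen/blenders",
    "https://www.example.com/blog/best-blenders-2019",
]

for url in ranking_urls:
    print(f"{page_type(url):<15}{url}")
```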

What can we uncover?

Looking at the tags in the transactional page types data view, we can see that product pages are appearing far more frequently (526) than category pages (151).

When we glanced at the dashboard, we found that slightly more than half of the product pages were ranking on the first page (sah-weet!). That said, more than thirty percent appeared on page three and beyond. So despite the initial visual of “doing well”, there’s a lot of opportunity that Amazon could be capitalizing on.

We can also see this in the Daily Snapshot. In the image above, we compare category pages (left) to product pages (right), and we see that while there are fewer category pages ranking, their rank is significantly better. Amazon could take some of the lessons they’ve applied to their category pages to help their product pages out.

Wrapping it up

So what did we learn today?

  1. Smart segmentation starts with a well-crafted list of keywords, grouped into tags, and housed in data views.
  2. The more you segment, the more insights you’re gonna uncover.
  3. Rely on the dashboards in STAT to flag opportunities and tell you what’s good, yo!

Want to see it all in action? Get a tailored walkthrough of STAT, here.

Or get your mitts on even more intent-based insights in our full whitepaper: Using search intent to connect with consumers.

Read on, readers!




Tuesday, February 12, 2019

People Ask Their Most Pressing SEO Questions — Our Experts Answer

Posted by TheMozTeam

We teamed up with our friends at Duda, a website design scaling platform service, who asked their agency customers to divulge their most pressing SEO questions, quandaries, and concerns. Our in-house SEO experts, always down for a challenge, hunkered down to collaborate on providing them with answers. From Schema.org to voice search to local targeting, we're tackling real-world questions about organic search. Read on for digestible insights and further resources!


How do you optimize for international markets?

International sites can be multi-regional, multilingual, or both. The website setup will differ depending on that classification.

  • Multi-regional sites are those that target audiences from multiple countries. For example: a site that targets users in the U.S. and the U.K.
  • Multilingual sites are those that target speakers of multiple languages. For example, a site that targets both English and Spanish-speakers.

To geo-target sections of your site to different countries, you can use a country-specific domain (ccTLD) such as “.de” for Germany or subdomains/subdirectories on generic TLDs such as “example.com/de.”

For different language versions of your content, Google recommends using different URLs rather than using cookies to change the language of the content on the page. If you do this, make use of the hreflang tag to tell Google about alternate language versions of the page.
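For a sense of what those hreflang annotations look like in practice, here's a small sketch that generates the link elements for a page's alternate versions. The URLs are hypothetical; each tag set would go in the <head> of every listed version (hreflang can also be declared in an XML sitemap or in HTTP headers).

```python
# Sketch: generate hreflang link elements for a page's language/region versions.
# The alternate URLs below are hypothetical.
ALTERNATES = {
    "en-us": "https://example.com/widgets/",
    "en-gb": "https://example.com/uk/widgets/",
    "de": "https://example.com/de/widgets/",
    "x-default": "https://example.com/widgets/",
}

def hreflang_tags(alternates: dict) -> str:
    """Build the <link> elements to place in the <head> of every listed version."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(ALTERNATES))
```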

For more information on internationalization, visit Google’s “Managing multi-regional and multilingual sites” or Moz’s guide to international SEO.


How do we communicate to clients that SEO projects need ongoing maintenance work?

If your client is having difficulty understanding SEO as a continuous effort, rather than a one-and-done task, it can be helpful to highlight the changing nature of the web.

Say you created enough quality content and earned enough links to that content to secure a spot at the top of page one. Because organic placement is earned and not paid for, you don’t have to keep paying to maintain that placement on page one. However, what happens when a competitor comes along with better content that has earned more links than yours? Because Google wants to surface the highest quality content, your page’s rankings will likely suffer in favor of this better page.

Maybe it’s not a competitor that erodes your site’s rankings. Maybe new technology comes along and now your page is outdated or even broken in some areas.

Or how about pages that are ranking highly in search results, only to get crowded out by a featured snippet, a Knowledge Panel, Google Ads, or whatever the latest SERP feature is?

Set-it-and-forget-it is not an option. Your competitors are always on your heels, technology is always changing, and Google is constantly changing the search experience.

SEO specialists are here to ensure you stay at the forefront of all these changes because the cost of inaction is often the loss of previously earned organic visibility.


How do I see what subpages Google delivers on a search? (Such as when the main page shows an assortment of subpages below the result, via an indent.)

Sometimes, as part of a URL’s result snippet, Google will list additional subpages from that domain beneath the main title-URL-description. These are called organic sitelinks. Site owners have no control over when sitelinks appear or which URLs Google chooses to show, aside from noindexing a page or removing it from the site.

If you’re tracking keywords in a Moz Pro Campaign, you have the ability to see which SERP features (including sitelinks) your pages appear in.

The Moz Keyword Explorer research tool also allows you to view SERP features by keyword:


What are the best techniques for analyzing competitors?

One of the best ways to begin a competitor analysis is by identifying the URLs on your competitor’s site that you’re directly competing with. The idea of analyzing an entire website against your own can be overwhelming, so start with the areas of direct competition.

For example, if you’re targeting the keyword “best apple pie recipes,” identify the top ranking URL(s) for that particular query and evaluate them against your apple pie recipe page.

You should consider comparing qualities such as:

Moz also created the metrics Domain Authority (DA) and Page Authority (PA) to help website owners better understand their ranking ability compared to their competitors. For example, if your URL has a PA of 35 and your competitor’s URL has a PA of 40, it’s likely that their URL will rank more favorably in search results.

Competitor analysis is a great benchmarking tool and can give you great ideas for your own strategies, but remember, if your only strategy is emulation, the best you’ll ever be is the second-best version of your competitors!


As an SEO agency, can you put a backlink to your website on clients’ pages without getting a Google penalty? (Think the Google Penguin update.)

Many website design and digital marketing agencies add a link to their website in the footer of all their clients’ websites (usually via their logo or brand name). Google says in their quality guidelines that “creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines” and they use the example of “widely distributed links in the footers or templates of various sites.” This does not mean that all such footer links are a violation of Google’s guidelines. What it does mean is that these links have to be vouched for by the site’s owner. For example, an agency cannot require this type of link on their clients’ websites as part of their terms of service or contract. You must allow your client the choice of using nofollow or removing the link.

The fourth update of the Google Penguin algorithm was rolled into Google’s core algorithm in September of 2016. This new “gentler” algorithm, described in the Google Algorithm Change History, devalues unnatural links, rather than penalizing sites, but link schemes that violate Google’s quality guidelines should still be avoided.


We’re working on a new website. How do we communicate the value of SEO to our customers?

When someone searches a word or phrase related to a business, good SEO ensures that the business’s website shows up prominently in the organic (non-ad) search results, that their result is informative and enticing enough to prompt searchers to click, and that the visitor has a positive experience with the website. In other words, good SEO helps a website get found, get chosen, and convert new business.

That’s done through activities that fall into three main categories:

  • Content: Website content should be written to address your audience’s needs at all stages of their purchase journey: from top-of-funnel, informational content to bottom-of-funnel, I-want-to-buy content. Search engine optimized content is really just content that is written around the topics your audience wants and in the formats they want it, with the purpose of converting or assisting conversions.
  • Links: Earning links to your web content from high-quality, relevant websites not only helps Google find your content, it signals that your site is trustworthy.
  • Accessibility: Ensuring that your website and its content can be found and understood by both search engines and people. A strong technical foundation also increases the likelihood that visitors to the website have a positive experience on any device.

Why is SEO valuable? Simply put, it’s one more place to get in front of people who need the products or services you offer. With 40–60 billion Google searches in the US every month, and more than 41% / 62% (mobile / desktop) of clicks going to organic, it’s an investment you can’t afford to ignore.


How do you optimize for voice search? Where do you find phrases used via tools like Google Analytics?

Google doesn’t yet separate out voice query data from text query data, but many queries don’t change drastically with the medium (speaking vs. typing the question), so the keyword data we already have can still be a valuable way to target voice searchers. It’s important here to draw the distinction between voice search (“Hey Google, where is the Space Needle?”) and voice commands (“Hey Google, tell me about my day”) — the latter aren’t search queries at all, but rather spoken tasks that voice assistant devices respond to.

Voice assistant devices typically pull their answers to informational queries from their Knowledge Graph or from the top of organic search results, which is often a featured snippet. That’s why one of the best ways to go after voice queries is to capture featured snippets.

If you’re a local business, it’s also important to have your Google My Business (GMB) data completely and accurately filled out, as this can influence the results Google surfaces for voice queries like, “Hey Google, find me a pizza place near me that’s open now.”


Should my clients use a service such as Yext? Do they work? Is it worth it?

Automated listings management can be hugely helpful, but there are some genuine pain points with Yext, in particular. These include pricing (very expensive) and the fact that Yext charges customers to push their data to many directories that see little, if any, human use. Most importantly, local business owners need to understand that Yext is basically putting a paid layer of good data over the top of bad data — sweeping dirt under the carpet, you might say. Once you stop paying Yext, they pull up the carpet and there’s all your dirt again. By contrast, services like Moz Local (automated citation management) and Whitespark (manual citation management) correct your bad data at the source, rather than just putting a temporary paid Band-Aid over it. So, investigate all options and choose wisely.


How do I best target specific towns and cities my clients want to be found in outside of their physical location?

If you market a service area business (like a plumber), create a great website landing page with consumer-centric, helpful, unique content for each of your major service cities. Also of interest to service area businesses: Google recently changed how the service radius is set in the Google My Business dashboard so that it reflects your true service area instead of your physical address. If you market a brick-and-mortar business that customers come to from other areas, it’s typically not useful to create content saying, “People drive to us from X!” Rather, build relationships with neighboring communities in the real world, reflect them in your social outreach, and, if they’re really of interest, reflect them on your website. Both service area and brick-and-mortar businesses may need to invest in PPC to increase visibility in all desired locations.


How often should I change page titles and meta descriptions to help local SEO?

While it’s good to experiment, don’t change your major tags just for the sake of busy work. Rather, if some societal trend changes the way people talk about something you offer, consider editing your titles and descriptions. For example, an auto dealership could realize that its consumers have started searching for “EVs” more than “electric vehicles” because society has become comfortable enough with these products to refer to them in shorthand. If keyword research and trend analysis indicate a shift like this, then it may be time to re-optimize elements of your website. Changing any part of your optimization is only going to help you rank better if it reflects how customers are searching.

Read more about title tags and metas:


Should you service clients within the same niche, since there can only be one #1?

If your keywords have no local intent, then taking on two clients competing for the same terms nationally could certainly be unethical. But this is a great question, because it presents the opportunity to absorb the fact that for any keyword for which Google perceives a local intent, there is no longer only one #1. For these search terms, both local and many organic results are personalized to the location of the searcher.

Your Mexican restaurant client in downtown isn’t really competing with your Mexican restaurant client uptown when a user searches for “best tacos.” Searchers’ results will change depending on where they are in the city when they search. So unless you’ve got two identical businesses within the same couple of blocks in a city, you can serve them both, working hard to find the USP of each client to help them shine bright in their particular setting for searchers in close proximity.


Is it better to have a one-page format or break it into 3–5 pages for a local service company that does not have lengthy content?

This question is looking for an easy way out of publishing when you’ve become a publisher. Every business with a website is a publisher, and there’s no good excuse for not having adequate content to create a landing page for each of your services, and a landing page for each of the cities you serve. I believe this question (and it’s a common one!) arises from businesses not being sure what to write about to differentiate their services in one location from their services in another. The services are the same, but what’s different is the location!

Publish text and video reviews from customers there, showcase your best projects there, offer tips specific to the geography and regulations there, interview service people, interview experts, sponsor teams and events in those service locations, etc. These things require an investment of time, but you’re in the publishing business now, so invest the time and get publishing! All a one-page website shows is a lack of commitment to customer service. For more on this, read Overcoming Your Fear of Local Landing Pages.


How much content do you need for SEO?

Intent, intent, intent! Google’s ranking signals are going to vary depending on the intent behind the query, and thank goodness for that! This is why you don’t need a 3,000-word article for your product page to rank, for example.

The answer to “how much content does my page need?” is “enough content for it to be complete and comprehensive,” which is a subjective factor that is going to differ from query to query.

Whether you write 300 words or 3,000 words isn’t the issue. It’s whether you completely and thoroughly addressed the page topic.

Check out these Whiteboard Fridays around content for SEO:



The Basics of Building an Intent-based Keyword List

Posted by TheMozTeam

This week, we're taking a deep dive into search intent.

The STAT whitepaper looked at how SERP features respond to intent, and the bonus blog posts broke things down even further and examined how individual intent modifiers impact SERP features, the kind of content that Google serves at each stage of intent, and how you can set up your very own search intent projects. And look out for an upcoming post from Seer's very own Scott Taft on how to use STAT and Power BI to create your very own search intent dashboard.

Search intent is the new demographics, so it only made sense to get up close and personal with it. Of course, in order to bag all those juicy search intent tidbits, we needed a great intent-based keyword list. Here’s how you can get your hands on one of those.

Gather your core keywords

First, before you can even think about intent, you need to have a solid foundation of core keywords in place. These are the products, features, and/or services that you’ll build your search intent funnel around.

But goodness knows that keyword list-building is more of an art than a science, and even the greatest writers (hi, Homer) needed to invoke the muses (hey, Calliope) for inspiration, so if staring at your website isn’t getting the creative juices flowing, you can look to a few different places for help.

Snag some good suggestions from keyword research tools

Lots of folks like to use the Google Keyword Planner to help them get started. Ubersuggest and Yoast’s Google Suggest Expander will also help add keywords to your arsenal. And Answer The Public gives you all of that, and beautifully visualized to boot.

Simply plunk in a keyword and watch the suggestions pour in. Just remember to be critical of these auto-generated lists, as odd choices sometimes slip into the mix. For example, apparently we should add [free phones] to our list of [rank tracking] keywords. Huh.
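
If you'd rather script this step, many of these tools sit on top of autocomplete data you can also pull yourself. The sketch below, for example, hits Google's unofficial suggest endpoint; it's undocumented, so treat the URL, parameters, and response shape as assumptions that could change or be rate-limited at any time:

```python
# Sketch: fetch autocomplete suggestions for a seed keyword from Google's
# unofficial suggest endpoint (undocumented -- may change without notice).
import json
import urllib.parse
import urllib.request

def google_suggestions(seed, lang="en"):
    params = urllib.parse.urlencode({"client": "firefox", "hl": lang, "q": seed})
    url = f"https://suggestqueries.google.com/complete/search?{params}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        # Response is a JSON array: [seed, [suggestion, suggestion, ...], ...]
        data = json.loads(resp.read().decode(charset, errors="replace"))
    return data[1]

if __name__ == "__main__":
    for suggestion in google_suggestions("rank tracking"):
        print(suggestion)
```

However you gather them, the same be-critical rule applies: odd suggestions still need pruning by a human.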

Spot inspiration on the SERPs

Two straight-from-the-SERP resources that we love for keyword research are the “People also ask” box and related searches. These queries are Google-vetted and plentiful, and also give you some insight into how the search engine giant links topics.

If you’re a STAT client, you can generate reports that will give you every question in a PAA box (before it gets infinite), as well as each of the eight related searches at the bottom of a SERP. Run the reports for a couple of days and you’ll get a quick sense of which questions and queries Google favours for your existing keyword set.

A quick note about language & location

When you’re in the UK, you push a pram, not a stroller; you don’t wear a sweater, you wear a jumper. This is all to say that if you’re in the business of global tracking, it’s important to keep different countries’ word choices in mind. Even if you’re not creating content with them, it’s good to see if you’re appearing for the terms your global searchers are using.

Add your intent modifiers

Now it’s time to tackle the intent bit of your keyword list. And this bit is going to require drawing some lines in the sand, because the modifiers that occupy each intent category can be highly subjective — does “best” signal transactional intent or commercial?

We’ve put together a loose guideline below, but the bottom line is that intent should be structured and classified in a way that makes sense to your business. And if you’re stuck for modifiers to marry to your core keywords, here’s a list of 50+ to help with the coupling.

Informational intent

The searcher has identified a need and is looking for the best solution. These keywords are the core keywords from your earlier hard work, plus every question you think your searchers might have if they’re unfamiliar with your product or services.

Your informational queries might look something like:

  • [product name]
  • what is [product name]
  • how does [product name] work
  • how do I use [product name]

Commercial intent

At this stage, the searcher has zeroed in on a solution and is looking into all the different options available to them. They’re doing comparative research and are interested in specific requirements and features.

For our research, we used best, compare, deals, new, online, refurbished, reviews, shop, top, and used.

Your commercial queries might look something like:

  • best [product name]
  • [product name] reviews
  • compare [product name]
  • what is the top [product name]
  • [colour/style/size] [product name]

Transactional intent (including local and navigational intent)

Transactional queries are the most likely to convert and generally include terms that revolve around price, brand, and location, which is why navigational and local intent are nestled within this stage of the intent funnel.

For our research, we used affordable, buy, cheap, cost, coupon, free shipping, and price.

Your transactional queries might look something like:

  • how much does [product name] cost
  • [product name] in [location]
  • order [product name] online
  • [product name] near me
  • affordable [brand name] [product name]

A tip if you want to speed things up

A super quick way to add modifiers to your keywords and save your typing fingers is by using a keyword mixer like this one. Just don’t forget that using computer programs for human-speak means you’ll have to give them the ol’ once-over to make sure they still make sense.
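
If you'd rather keep things in-house, the mixing itself is only a few lines of Python. This sketch reuses the commercial and transactional modifiers from this post; the core keywords are just placeholder examples:

```python
# Sketch: combine core keywords with intent modifiers, the same way a
# keyword mixer would. The core keywords are placeholder examples.
from itertools import product

CORE_KEYWORDS = ["rank tracker", "keyword research tool"]

MODIFIERS = {
    "commercial": ["best", "compare", "deals", "new", "online",
                   "refurbished", "reviews", "shop", "top", "used"],
    "transactional": ["affordable", "buy", "cheap", "cost", "coupon",
                      "free shipping", "price"],
}

def mix(cores, modifiers):
    """Yield (intent, keyword) pairs for every modifier + core keyword combo."""
    for intent, mods in modifiers.items():
        for mod, core in product(mods, cores):
            yield intent, f"{mod} {core}"

if __name__ == "__main__":
    for intent, keyword in mix(CORE_KEYWORDS, MODIFIERS):
        print(f"{intent}\t{keyword}")
```

Machine-mixed or not, the output still needs that human once-over; "free shipping rank tracker" probably isn't a phrase anyone actually types.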

Audit your list

Now that you’ve reached for the stars and got yourself a huge list of keywords, it’s time to bring things back down to reality and see which ones you’ll actually want to keep around.

No two audits are going to look the same, but here are a few considerations you’ll want to keep in mind when whittling your keywords down to the best of the bunch (if you’d rather script part of the whittling, there’s a quick filtering sketch right after the list).

  1. Relevance. Are your keywords represented on your site? Do they point to optimized pages?
  2. Search volume. Are you after highly searched terms or looking to build an audience? You can get the SV goods from the Google Keyword Planner.
  3. Opportunity. How many clicks and impressions are your keywords raking in? While not comprehensive (thanks, Not Provided), you can gather some of this info by digging into Google Search Console.
  4. Competition. What other websites are ranking for your keywords? Are you up against SERP monsters like Amazon? What about paid advertising like shopping boxes? How much SERP space are they taking up? Your friendly SERP analytics platform with share of voice capabilities (hi!) can help you understand your search landscape.
  5. Difficulty. How easy is your keyword going to be to win? Search volume can give you a rough idea — the higher the search volume, the stiffer the competition is likely to be — but for a different approach, Moz’s Keyword Explorer has a Difficulty score that takes Page Authority, Domain Authority, and projected click-through rate into account.
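
For those who want to script that first pass, here's a rough sketch of filtering a keyword export against a couple of these considerations. The CSV column names and threshold values are hypothetical, so swap in whatever your keyword tool actually exports and whatever cutoffs make sense for your niche:

```python
# Sketch: whittle a keyword list down using search volume and difficulty
# cutoffs. Column names ("keyword", "search_volume", "difficulty") and the
# threshold values are hypothetical examples.
import csv

MIN_SEARCH_VOLUME = 100   # drop terms almost nobody searches for
MAX_DIFFICULTY = 60       # drop terms that look too hard to win right now

def audit(path):
    keepers = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if (int(row["search_volume"]) >= MIN_SEARCH_VOLUME
                    and int(row["difficulty"]) <= MAX_DIFFICULTY):
                keepers.append(row)
    return keepers

if __name__ == "__main__":
    for row in audit("keywords.csv"):
        print(row["keyword"], row["search_volume"], row["difficulty"])
```

Relevance, opportunity, and competition are harder to boil down to a single number, so keep those as human judgment calls layered on top of a filter like this.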

By now, you should have a pretty solid plan of attack to create an intent-based keyword list of your very own to love, nurture, and cherish.

If, before you jump headlong into it, you’re curious what a good chunk of this is going to look like in practice, give this excellent article by Russ Jones a read, or drop us a line. We’re always keen to show folks why tracking keywords at scale is the best way to uncover intent-based insights.

Read on, readers!

More in our search intent series:

This post was originally published on the STAT blog.

