Google Algorithm Updates—What They Mean For Publishers
By Michael Cottam, SEO Consultant
Google is continuously improving their algorithms, striving to surface the best content and the best user experience for searchers amid the flood of new content published every day.
Some of what Google is adding into their algorithms is designed to recognize great, original and popular content, while other metrics are designed to catch sites that are trying to “game the system” (especially with links) or are simply republishing material that originated elsewhere.
Today, Google uses three kinds of signals for ranking web pages: content, links and user engagement metrics. Publishers need to understand what Google is looking at (both good and bad signals) in order to create and tune their sites to rank well in Google and drive traffic.
CONTENT SIGNALS

The most famous part of Google’s algorithms that has to do with content is Panda. Launched in February 2011, it was designed to reward pages with things like big, original images; plenty of well-written text; and rich content elements like videos, maps, etc. It was also designed to penalize pages with tons of ads, too much whitespace and forms above the fold, thin content, etc.—things that make the user experience less satisfying. Google has continued to tweak Panda over the years, and with each iteration Google has become better at recognizing truly good, original, useful content, and at catching poor-quality content that earlier iterations of the algorithm had undeservedly treated as high quality.
What can/should publishers do with respect to their content to benefit from content-related updates? Or, at least, not be penalized by them?
- Create a great user experience: Make the page load quickly; don’t cover the content with popup dialogs; and make it easy for users to get to the content they’re looking for without excessive scrolling.
- Cover the topic thoroughly: Check out pages from other publishers who have covered the same topic and are ranking for the target term—are they talking about subtopics or referring to related terms that you aren’t? It’s not about the word count—it’s about covering the subject matter thoroughly on a single page. And don’t split the content across multiple pages—Google is going to pick just one of your pages to rank for that topic and ignore the content on the others.
- Use original images and videos: If you have the same image that was provided by the company you’re writing about, or are using stock photography for a destination article, then you’ve got nothing better to offer the user than the other publishers covering this topic. Take your own photos and videos whenever possible.
- Use original text: Don’t copy overview material from people’s biographies, company backgrounders, or tourist bureau sites.
LINK SIGNALS

The most famous part of Google’s algorithms related to links is Penguin. Prior to Penguin (launched in April 2012), Google had (and still has today) manual link penalties. If you think of links like “votes” for your page, link penalties are what you get when you’re caught engaging in voter fraud. Google wants to count links to your site that represent a vote for the content on that page. Anything you do to fake this can get you in trouble. When you have a penalty, typically your page will suddenly rank 40 or more places lower than it did before the penalty…or not at all.
With manual link penalties, a Google Search Quality engineer has actually looked at your links, and can see what you’ve done to get links you didn’t really earn. The engineer manually registers a penalty against specific pages, or your entire site, and you get a notice in Google Search Console to that effect.
Penguin penalties were algorithmic: Google automatically detected link patterns known to indicate bad or paid links and applied a penalty to your site, with no notification—you simply stopped ranking for certain terms, or for anything at all. With the advent of Penguin 4.0 in October 2016, Google claimed that there was no longer a Penguin penalty—the types of links that Penguin was penalizing are now simply ignored by the PageRank calculations. However, it’s very important to note that there are still algorithmic penalties in Google outside of Penguin, and you CAN get penalized for certain kinds of link patterns.
What can/should publishers do with respect to links to protect against link-related updates?
- Don’t get links from pages that are going to be syndicated across many sites—this is known as “article marketing.” That includes e-press releases.
- Avoid site-wide links on other sites. It’s fine to sponsor a charity, for example, but ask for a single link from a page related to your sponsorship, or a blog post, or their About page—not a site wide footer or sidebar link.
- Don’t get links from sites that humans clearly don’t visit: directories you’ve never heard of, blogs where the content is fluff, sites that don’t get shared on social media or linked to by many other sites.
- Do review your latest backlinks in Google Search Console periodically; if you find really bad sites linking to you, submit a disavow file listing those domains.
- Do traditional marketing and PR, and make that the reason you get links. Be a resource for reporters to interview/quote on your industry; contribute to blogs and journals in your space; support charities and your community and get mentioned in the newspaper because of that.
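The disavow step above uses a plain-text file format that Google documents: one entry per line, either a full URL or a whole domain prefixed with `domain:`, with `#` marking comment lines. A minimal sketch (all domain names here are hypothetical placeholders):

```
# Low-quality directories linking to our site, reviewed during a backlink audit
domain:spammy-directory.example.com
domain:fluff-blog.example.net
# A single bad URL can also be disavowed on its own line
http://link-farm.example.org/widgets-page.html
```

The file is uploaded through the Disavow Links tool in Google Search Console; disavowing a whole domain is generally safer than chasing individual URLs when the entire site is junk.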
USER ENGAGEMENT SIGNALS
What is Google measuring when it comes to user engagement? The two most likely signals are click-through rate and bounce rate.
Each position from 1 to 10 on page 1 has an average click-through rate (CTR). For example, about 20% of searchers will click on the first organic search result; about 13% will click on the second result, and so on. If your page is the fourth result for a given search, the average CTR on result #4 is 9%, and over the last 100 searches for that term Google sees 12% of people click on your result, that’s a positive signal to Google that your headline and snippet match what searchers are looking for. On the other hand, if your CTR is lower than average, it suggests searchers aren’t drawn to what your headline and snippet promise.
Your bounce rate is the percentage of searchers who click on your page in the search results, then click the back button AND click a different result. Presumably, this indicates that your page didn’t answer their question—at least, not completely—and they had to go to another page to complete their task. A high bounce rate thus indicates to Google that your page’s content is either low quality or not very relevant or helpful for that particular search query. Conversely, a low bounce rate indicates your page is a great answer to that searcher’s question.
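The CTR comparison above is simple arithmetic, and can be sketched in a few lines of Python. The average-CTR figures and the `ctr_signal` helper are illustrative assumptions, not anything Google publishes as an API:

```python
# Illustrative position-average CTRs, roughly matching the figures in the text.
AVERAGE_CTR = {1: 0.20, 2: 0.13, 3: 0.11, 4: 0.09}

def ctr_signal(position, impressions, clicks, averages=AVERAGE_CTR):
    """Compare a page's observed CTR against the average for its ranking position.

    Returns (observed CTR, expected CTR, True if observed beats expected).
    """
    observed = clicks / impressions
    expected = averages[position]
    return observed, expected, observed > expected

# The worked example from the text: result #4, 12 clicks over the last 100 searches.
observed, expected, positive = ctr_signal(position=4, impressions=100, clicks=12)
print(f"Observed CTR {observed:.0%} vs. average {expected:.0%}: "
      f"{'positive' if positive else 'negative'} signal")
```

Running this with the numbers from the text prints a positive signal, since 12% observed beats the 9% positional average.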
What can/should publishers do with respect to user engagement signals to protect against changes in this part of the algorithm?
- Be sure your content thoroughly covers the page’s topic, so that a searcher probably won’t have to go to your competitor’s page to get the rest of the information they’re looking for.
- Structure your page so the user can see that the subtopic they might be looking for is on the page, even if it’s not initially visible. Use tabs, or use in-page anchors to scroll to sections.
- Do the searches yourself, and look at what Google is showing for your page’s headline and snippet (which generally come from the page title and the meta description). Ask yourself if your page in the search result looks more compelling than the other nine on page 1. Does your result look credible (mention reviews, BBB A+ rating, years in business, etc.)? Does it look spammy or legit (don’t use a page title of “Purple Widgets – Widgets that are Purple – Purple Colored Widgets,” for example)?
- Use rich content like videos, maps, virtual tours that engage the visitor and keep them on your page.
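The structural advice above can be sketched in HTML. This is a minimal illustration with placeholder names and text, not a template: in-page anchors let the visitor see and jump to subtopics at a glance, while the `<title>` and meta description are what Google generally draws on for your headline and snippet.

```html
<head>
  <!-- Typically the source of your search-result headline and snippet -->
  <title>Purple Widgets: Reviews, Pricing &amp; Buying Guide</title>
  <meta name="description"
        content="Compare purple widgets by price and durability. BBB A+ rated, 20 years in business.">
</head>
<body>
  <!-- In-page anchors make subtopics visible without scrolling -->
  <nav>
    <a href="#pricing">Pricing</a>
    <a href="#durability">Durability</a>
  </nav>

  <h2 id="pricing">Pricing</h2>
  <p>…</p>

  <h2 id="durability">Durability</h2>
  <p>…</p>
</body>
```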
Google is continually refining their algorithm, making it better at recognizing great content and at recognizing “buzz”—positive mentions from real people, especially authorities. If you design your site content for a great user experience, giving the user the best and most complete resource for the topics they’re searching, then as long as you’re not doing crazy coding tricks that prevent Google from seeing your content clearly and correctly, you should be in good shape. When it comes to links, don’t think about links: think about marketing, getting exposure in places on the web that real people visit regularly. The links that come from this kind of exposure are the kind you want—links Google will see as “editorial votes” for your content and brand, and that will keep you out of Google penalties.
Magazines Canada Hotsheets deliver current information on a single topic, each written by an expert in the field.