Possible 2015 Google Algorithms Changes You Should Be Thinking About Now
What does 2015 mean for Google’s algorithm? And what does it mean for your SEO? If 2014 was any indication, we can expect Google to continue to tweak and adjust its search algorithms, likely on an increasingly frequent, rolling basis.
In 2014, we saw several major updates to Google’s Penguin and Panda algorithms and the rollout of the Pigeon algorithm. The slow, late-fall rollout of Penguin 3.0 changed SERP rankings for some sites—including for some retailers just before Black Friday and Cyber Monday—while two Panda updates tried to address redundant, irrelevant, and low-quality content in search results. And the summer release of Pigeon gave a boost to localized search query results.
As Google seeks to serve its users better quality and more relevant search results, businesses can’t afford to wait around to see what changes are being made. We have to be on the lookout for trends and new opportunities.
So what should you bet on in 2015? Below are four probable areas we could see Google address with algorithm changes. These are areas where current attention will help your overall marketing now and, should Google decide to weigh in, your SEO in the future, as well.
Precision Keywords and Long-Tail Keywords
Keyword stuffing and other black hat SEO tactics have been virtually eradicated by more robust search algorithms, but legitimate keyword strategies continue to be an important part of SEO.
Still, the conversation has begun to shift. While many sites continue to compete for first-page SERP rankings with broad, top-of-industry keywords, some marketers are focusing on more precise keywords and long-tail keywords. Jonathan Long recommends both in order to get your site’s search result in front of ready-to-buy leads.
This makes sense. Geolocation, custom options, and specific services and products have a big impact on a search user’s decision to go with a specific solution. Web users are getting savvier. They can tell whether or not they’re clicking on a high-quality search result that’s going to be accurate, trustworthy, and helpful.
Longtime web users are also getting more accurate with their search terms. This trend is only going to continue as digital natives become the prime marketing audience for most products and services. Caroline Stroud thinks long-tail keywords will be increasingly important in 2015, as Google’s Pigeon algorithm already makes it possible for search users to find “hyper-local” results.
Page Load Speeds, Bounce Rates, and Time on Site
The recent trend for streamlined sites—in everything from minimal design to backend development—is a sign of our need for speed. As we continue to expect more of our web experience, page load speeds could become a factor for Google. As Evan Bailyn recently noted:
Google spends tens of millions of dollars so that you can receive search results hundredths of a second faster. A company that cares that much about speed certainly cares that your site serves its pages quickly as well.
Page load speeds are already important from a human standpoint. Nearly half of web users expect sites to load in two seconds or less. It’s a good bet that, at some point soon, Google will start factoring page load times into its algorithms. Likewise, bounce rates and time-on-site data could also enter the algorithm picture. As Marcus Sheridan points out, both can be signs of the quality of your site’s content.
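If you want a rough baseline before Google weighs in, you can time your own pages. Here is a minimal sketch in Python using only the standard library; the URL is a placeholder, the two-second threshold comes from the user-expectation figure above, and a raw fetch is only a proxy—it ignores rendering, scripts, and asset downloads.

```python
import time
from urllib.request import urlopen

def page_load_seconds(url: str) -> float:
    """Time a full fetch of a page body. A rough proxy for load speed
    that ignores rendering, scripts, and asset downloads."""
    start = time.perf_counter()
    with urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def meets_two_second_expectation(seconds: float) -> bool:
    """Nearly half of web users expect a page in two seconds or less."""
    return seconds <= 2.0

# Illustrative usage (placeholder URL):
#   elapsed = page_load_seconds("https://example.com")
#   print("OK" if meets_two_second_expectation(elapsed) else "too slow")
```

Tools like browser developer consoles give a fuller picture, but even a crude timer like this will flag a page that is far outside user expectations.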
All of these speed and time-related metrics can be boiled down to one overall concept, according to Kim Speler and Rand Fishkin: user engagement. Your site ranking could become increasingly linked to these human, satisfaction-of-use indicators by Google algorithms.
Above-the-Fold Content
Google’s algorithms are our primary focus here, but never underestimate a competitor’s power to influence strategy. When Yahoo won a five-year default search contract for Mozilla’s Firefox browser last fall, it helped Microsoft’s Bing algorithm capture 22.9% of the search engine market. (Yahoo relies on Bing for its search results.)
Bing doesn’t index entire web pages the way Google does; it crawls only the first 100k of each page. That means your above-the-fold content, as it’s sometimes called, becomes the determining factor for reaching roughly one out of every five search engine users.
While it’s unlikely that Google would abandon the huge investment it’s made in indexing entire pages beyond just the first 100k, placing quality content up-front could still pay off by feeding back into user experience. High-quality, above-the-fold content could improve your bounce rate and time-on-site metrics by giving users more of what they want upfront.
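One practical way to act on this, whichever engine is crawling you, is to check that your key content actually lands within those first 100 KB of markup. A minimal sketch in Python; the marker string and the 100 KB limit are illustrative assumptions, not a crawler specification.

```python
def within_first_100k(html: bytes, marker: bytes, limit: int = 100 * 1024) -> bool:
    """Return True if `marker` first appears within the first `limit` bytes
    of the page, i.e. inside the slice a 100 KB crawler would actually see."""
    index = html.find(marker)
    return 0 <= index < limit

# Illustrative check: a headline buried behind ~140 KB of markup is
# invisible to a crawler that reads only the first 100 KB.
page = b"<html>" + b"<!-- bloat -->" * 10000 + b"<h1>Key offer</h1></html>"
print(within_first_100k(page, b"<h1>Key offer</h1>"))  # → False
```

In practice this means trimming inline bloat (oversized scripts, comments, tracking markup) so your headline, value proposition, and core copy sit as early in the HTML as possible.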
Thorough, High-Quality Content
The whole point of Google’s algorithm is to optimize web results for real flesh-and-blood humans. Real people who want high-quality content and the best product and service solutions they can find.
These algorithm changes are Google’s way of trying to deliver that to its customers, who in turn will be your customers. As the algorithms change and update, they get closer to knowing exactly what web users want and where it can be found.
We can continue to tweak our pages’ SEO strategies with keywords and optimized code to address the “where.” But it will take a continued commitment to creating high-quality content to satisfy the “what,” for your customers as well as for the algorithms.
A safe bet for web marketers is to continue to focus on this content. Thorough, high-quality content meets the needs of search engine users and your users and never goes out of style.
We can’t look into a crystal ball and see exactly what Google plans to do next, but we can look to the current environment for clues about where things are headed and what seems most important right now. The algorithm updates and refreshes of the past few years didn’t come out of the blue; they were grounded in the web environment of the time and the direction web search was moving. None of us can predict exactly how the future will turn out, but we can all weigh how current SEO trends and topics could add value for users. And those educated guesses could have a big payoff in future algorithm updates.